36 posts • joined Monday 20th November 2006 09:40 GMT
Marx pointed out that neat flaw with capitalism too.
The optimistic view is that once there is nowhere cheaper to relocate factories to, the world will actually be a much better place.
Of course, it will be one where the purchasing power of a Western salary will be about the same as the purchasing power of an Eastern one - otherwise it would still be competitive to move.
At which point, you'll just be competing on talent and infrastructure. Well, competing on talent while still trying to pay off a Western mortgage (due to our lack of housing supply) and a Western student loan, and maintaining an ageing infrastructure.
(Another interesting question is whether disintermediating the supply chain will reduce the impact of rising labour costs - i.e. right now a $4 increase in manufacturing costs has a large impact on prices, because it's multiplied all the way through the chain.)
Re: "no changes need to be made"
Very much what I was going to say.
It's a close enough size that things would indeed pretty much look the same, but I think many people forget that with touch-based devices, you really need to consider the physical aspects of the design.
This is less of an issue with mouse-based systems, as you're always operating on an abstraction between your hand movement and pointer movement, which makes it easier to hit tiny targets. That said, old fixed-layout 800x600 Windows applications can be almost unreadably small on a modern high-resolution laptop screen.
Flattery will get you . . .
It struck me as basically a load of flattery - appealing to the British idea that we're good at inventing things - both artistically and scientifically - but not very good at business. Which is flattering because we all hate salesmen.
But the real payload was the request for less regulation of Internet based businesses, particularly advertising. Which would obviously be to the benefit of everyone else, not the incumbent winner of the online advertising race.
As for his comment about not being in the content business - I can only presume that's because Schmidt doesn't really believe there is any significant money to be had; otherwise they would be on a buying spree of content producers, and funding original content to sell adverts against.
They don't even do anything like a YouTube Young Filmmakers contest, or some competition where the best Blogger hosted blog gets a salary for a year - things that would be trivial to fund, compared to the millions spent on donations to academics.
Much as I like the device, I'm loath to start buying Kindle books, for much the same reason that I never bought anything from iTunes while there was DRM - I'm happy to carry on buying Apple's shiny devices, but I don't want to be locked into them.
('Apps' are a different matter, in that platform is still materially important).
Interesting patent the other week on combining e-ink and LCD into a single layered display, which could answer a lot of technical reasons for keeping your ereader and tablet separate.
Google may grok the web, but they don't grok selling digital content.
I think they are starting to - having invested heavily in funding various think tanks and lawyers fighting for copyright reform to push everyone over to an ad-funded service model, they have woken up to the fact that consumers evidently are paying for digital content, so they want a slice of the pie . . . . but it is also evident they are always late entrants to any digital market (music, books, apps) - and they're still pretty poor at building systems that work for the creators (i.e. the many complaints about the Android App Store from people actually trying to sell software rather than advertising).
Ah . . .
The banality of 'evil'. Soon we'll have to find a new word to describe, I don't know, people who kill their opponents to silence them, rather than just irritating geeks.
It's not always users tweaking things - I had no problems on one machine, but on my laptop the UI was locking up about 45 seconds after boot. Luckily that was enough time to fire up Terminal and Console, and the problem went away when I disabled some Akamai download/P2P software (which in turn seemed connected to downloading something from Adobe at some point). Specifically, it seemed to be an issue with a network service it wanted to spawn via launchd.
This is the error a lot of people make - i.e. thinking that the App Store, or direct music/ebook sales, means there is no room for publishers / labels / etc.
The mistake is to think their role was something to do with manufacturing and distributing goods, when in reality most of it is sales and marketing - and that counts for a lot in a store with 200,000 apps.
The publisher behind Angry Birds has proven they can do just that - yes, it's a good product, but there are plenty more good games that have flopped, because the developers didn't know how to sell, and didn't see the need to employ someone who does.
I make rather than sell software myself, and like most people resent the fact that salesmen make far more than the people who do the real work - but there's no getting away from the fact that someone has to do the sales job.
That said - I hate EA as much as everyone else here. They'd never have picked up or commissioned an Angry Birds in the first place (their mobile efforts so far being ports of console titles rather than original titles).
To be fair, Oracle's information about the future and roadmap for their own products is equally vague.
Every time I think they've quietly killed Oracle support for OS X, I find they slip out something like 10g, or a 64-bit Intel client library - with no announcement.
They don't even bother addressing their own forums when people ask the straight, blunt question - are you or are you not still supporting OS X?
Of course, we're already used to getting that from Apple with regard to Java - or indeed with regard to answering any question ever, about anything trivial.
No wonder Larry and Jobs are friends.
Yes, perhaps he'll have to hire a developer for each platform he wants to 'support'. Firms seemed to be able to do this back in the 80s, when computers were far less common.
I'm reminded a little of Bernard Matthews and his threat - ages ago - to leave the country if a minimum wage was introduced - 'I can't afford to pay it'. Really, Mr Matthews? Can't, or don't want to?
And the same thing strikes me with most cross-platform development - investors like it, because it saves them money. As a consumer, the results mostly suck, because they tend to be done with pretty much contempt for the end user. Huawei's Mac app, for instance, is written in Java, but doesn't even bother with the one-line switch required to put the menu at the top of the screen like a native application. That is one line of code - provided you knew you should and could do it, which most Java developers don't. I.e. you still actually need developers/designers who understand your target platforms, as well as your cross-platform tool of choice.
Instead, what's more typical is firms want to write on one platform and for it to 'work exactly the same' on others. (How many people here who work in software development even have a single Mac or Linux client within the organisation?).
As it happens, I think this story may already be old anyway - Apple have reviewed and accepted at least one of these toolkits as OK (I just can't recall which one, but I thought it was Appcelerator).
I wonder if toolkits that convert down to C code, which is then compiled by the Xcode toolchain, are allowable, while the issue with the Flash approach is that rather than converting Flash to C and then compiling the C, it's doing its own compilation and linking in a translation library?
Which would also give Adobe a route out - converting byte-code back to unreadable C code, and providing the source of the library.
That would make a lot of sense - if Apple want to reserve the right to bring in a higher powered Intel based iPad model, or a custom CPU, or go 64-bit - then having their development community dependent on third-party binary blobs would be a bad idea. (Particularly given Adobe's history with OS X and even 64-bit Windows and Linux support).
But if those libraries are also compilable portable source, then there shouldn't be an issue.
Piracy in the developing world
MS - and equally Adobe with Photoshop - know full well that piracy helps them 'lock in' customers - imagine the outcry if people had all had to pay for Word when it was effectively the default format people used for documents on the web (it was such a throwback to encounter a restaurant menu as a Word .doc the other week).
Far better to get a business economy locked into your ecosystem, and then pursue the legal route against businesses (who tend to be more compliant than individuals) than risk pushing a developing economy into using something else.
(It's only now that MS are seriously pushing for license revenue from China, that the Chinese govt is seriously looking at Linux).
As for whether people fear change - it's difficult to tell - they have fewer problems replacing Nintendos with PlayStations with Xbox systems, even though there is no software compatibility.
And Apple's fortunes changed when they stopped trying to get people to 'switch' (what I've got works for me, so why change?) and started marketing the Mac as desirable in itself.
The key for any Chrome products is to do the same - not to present themselves as 'PCs with limitations'.
Batteries and that
I'm sure the iPod counts for more than 1% of portable consumer devices, and despite the criticism, it seems most people don't actually care about sealed batteries. The number of people who carry a spare battery for their phone is probably minuscule (although probably higher in the CrackBerry crowd, who need all-day usage without recharging).
For the rest of us - being able to recharge by plugging into the USB socket kindly provided by work on our computers is great.
As for our Anonymous Coward - I love the idea that if you ignore the user interface (because as all El Reg readers know, that's the least important thing in any piece of engineering) and focus on the iPhone's weakest feature (camera) it's not that great. I don't own an iPhone, but I do own both a Nokia, and an iPod Touch, and I've played with both the N95 and N96, having considered them both as my next phone.
There is just no way the N95/N96 can run anything like the applications available on the iPod (particularly some of the audio apps and games) as the platform lacks an equivalent SDK, and both devices are also significantly slower to respond (the N95 because it has far less RAM than the iPhone).
But I have noticed that owners of Nokia N-series phones don't actually know this until you thrust something like iShred or technoBox in front of them - even people who should be able to grasp the notion of a software platform.
That said, cameras do seem to be a very significant sales factor in European phones, and I'd say that it's actually Nokia who have a better grasp of marketing in the EU, as reflected by their sales.
(The fact that we're having this discussion at all shows how well they have marketed the N-series to people who want a Swiss Army knife).
What a lot of the comments here are ignoring, with all their 'better than a record deal' rhetoric, is that the artist is now also bearing all the expenses (from rehearsal to recording to promotion and marketing).
As with the Kindle and newspaper story, it seems to me that far from cutting out the middle-man, we've simply replaced many competing middle-men with few (Amazon, Google, Apple) who are using their weight to gain deals Walmart could only dream of, while not offering anything other than access to the marketplace.
Take a step back and ask what Amazon are actually providing for that 60% - an entry in their stock database and a credit-card payment system. Amazon's risk? About 0%. That doesn't seem to justify 60% of the revenue to me.
The whole essence of why a record label can get an artist to sign up to a deal where they see 10% of revenue, is that the label offers to take most of the risk, in return for most of the reward. It's not actually a lot different to selling 90% of your firm to investors (then complaining when they take all the profits or sell you down the river).
As per the last similar story, if you do the maths, the lost productivity in booting up / shutting down each day (at least 2 minutes a day, even on a well set-up network) far outweighs the cost of any potential power saving over a year.
It may be good for the environment, but the economic argument for shutting down daily is bogus.
Better power management, on the other hand, may make sense, but it can also mean buying slightly more expensive machines (i.e. laptops that support S3 level sleep).
Ignoring the whole debate over whether CO2 is or is not a problem, if we reflect back on the first Industrial Revolution, or look at pollution in China today, it's evident that in both cases, at least part of what made production more efficient, was that businesses had no responsibility for the cost of pollution.
Basically, uncosted externalities. Now the central global warming argument is that CO2 production is as much an uncosted externality as more visible forms of pollution - which we now (in the West) charge companies for . . . which of course means they are less competitive.
And let's face it, much of China's industrial dynamism comes from the fact that its labour is very cheap, rather than any revolution in improved productivity.
It's also evident that regulation has driven progress at times - pollution in cars drastically improved following legislation, simply because without it, no one had an individual motivation to drive a less polluting vehicle.
From a raw economic point of view, there was no 'net' benefit there - we wasted resources in developing and manufacturing less polluting cars that could have been placed elsewhere - but I think most people do accept that they want less smoggy cities.
£17 vs productivity cost
Putting aside the ecological argument, let's just see if the numbers make sense from a business point of view.
Let's presume that the time taken to boot up to usefulness and shut down each day is 2 minutes, and that your staff work at least 230 days per year. That's about 7.7 hours of boot time per year - a whole day of work - against just £17 a year of savings.
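The arithmetic is easy to sanity-check - a rough sketch in Python, where the 2-minute overhead and the £17 saving are the assumptions from the article, not measured figures:

```python
MINUTES_LOST_PER_DAY = 2      # assumed boot-up + shutdown overhead per machine
WORKING_DAYS_PER_YEAR = 230   # assumed working days
POWER_SAVING_GBP = 17         # claimed yearly electricity saving per machine

# Total staff time lost per machine per year
hours_lost = MINUTES_LOST_PER_DAY * WORKING_DAYS_PER_YEAR / 60
print(f"{hours_lost:.1f} hours lost per year")

# Break-even point: shutting down only pays if an hour of staff
# time is worth less than this.
break_even = POWER_SAVING_GBP / hours_lost
print(f"break-even staff cost: £{break_even:.2f}/hour")
```

On those assumptions the policy only pays off if staff time is worth a couple of pounds an hour, which is rather below any Western salary.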
It doesn't even make sense to pay more for machines that can do proper S3 sleep.
Who gives them the right???
It's interesting to note that if you follow the money trail for the EFF and similar anti-copyright groups and lobbyists, you will frequently find Google in there somewhere. So when the Google-funded EFF threatens to sue Warners for withdrawing their music from the Google-owned YouTube, is it just me that feels there is a slight conflict of interest??
In response to Mark's comment - that's beside the point. We'll almost certainly still have music, as people will create it, just as people write crap books regardless of whether anyone wants to read them.
What's been great about the copyright system is that it's allowed us to have talented people become full-time authors, say, rather than books largely being written as the hobby of the already independently wealthy.
Every suggestion I've seen as to how we can move to a 'free' world has sucked, majorly - they largely pre-suppose a rock-band-based model, and ignore the fact that the things they are proposing are not NEW business models - performers already play live and sell merchandise. At a cynical level I just have a picture of a load of people who think Phish or the Grateful Dead represent the epitome of how culture should work.
I don't buy Android as an explanation, in that Android's browser uses the same WebKit engine as the iPhone's, and should be able to render the page in exactly the same way. It would have been easy enough to serve this version to any modern standards-compliant mobile browser where the screen size was right, rather than it being iPhone-specific.
As for customised content - while newer smartphones are capable of displaying real websites as standards intended, there is still a difference between designing for a 3" screen with a vertical orientation, a 17" screen with a 4:3 ratio, and indeed a 30" HD widescreen. I thought that this was one of the intentions in separating out CSS?
(i.e. that it allowed layout to be customised, rather than needing to develop customised sites).
Given that Photoshop is probably THE major application that could benefit from this, it's a shame not to see Adobe on the list - I know they're interested in GPU acceleration, and something like this would probably be the logical thing to develop filters in . . . although they already have PixelBender to abstract filter development from x86.
For Apple, this is a no-brainer - Final Cut Studio is again an app that does precisely the sort of tasks that are suitable for both highly parallel computing and true 64-bit computing (i.e. if you're working on HD video editing, the ability to address > 4GB of RAM in a single program might actually be useful) - basically, just look at any task where users still cannot get enough power on their desktop.
As for whether this gives OS X a general edge over Windows - I presume that OpenCL will be supported on Windows in the same way OpenGL is - by the graphics card vendors, so Windows developers will be able to use it. The key question is what cards it will be supported on (will it only work with new hardware, or retrospectively?).
For what it's worth . . .
. . . Apple do this annually - it's the only time they discount non-reconditioned kit. Although typically the discount is so insignificant it's not worth bothering with.
May be some mileage in it, as some UK resellers are offering over 10% discount this weekend.
I'm sat here using an Apple keyboard (and Microsoft Mouse) on a Dell desktop, which must put me in a very small subset of people who think the new Apple keyboards are fantastic - who needs clacky full-travel keyboards, other than people who can't get used to the fact a computer is not a typewriter?
(Caveat - I effectively learnt to type on a Sinclair Spectrum, so perhaps the fact my initial typing experience was on a flat keyboard makes them feel more natural to me).
The Register seems to love posting Global Warming sceptic stories. I like the 'bad science' angle, which is valid, but I'm dubious about the motives.
I'd have more faith in Nigel Lawson's and other economists' views if they were in their 20s, and therefore likely to live with the consequences of what they are suggesting now.
Lest we forget, Lawson also presided over a previous economic bust in the UK - so why should we trust his economic views?
As for 'everything may be sorted out by an unknown technological advance' - and the housing boom was going to end with a soft landing too.
The one thing that is supposed to distinguish us from animals is foresight, yet it seems that people believe the 'invisible hand' of the market is somehow better still. I think recent events show how dumb that line of thinking is - correction can be catastrophic.
Although of course it's always fine when you're a member of the international plutocracy.
Yes - Safe Sleep is basically the same as Hibernate.
One problem with it is that a lot of Mac users coming from iBooks are used to 'unsafe sleep' - i.e. close the lid and go - something replicated on few Windows machines.
Once in that mode, with a reasonable battery charge, you can leave it for days on standby before the power finally runs out.
It strikes me that safe sleep breaks more than it fixes - unless disabled, it means having to remember to shut the lid a minute before you need to move.
Most modern apps autosave work as you go along anyway - even browsers are getting into the idea of restoring the last set of tabs.
All it really saves is the need to do a cold boot and open your apps/documents after a complete power down, against a huge loss in flexibility.
However - my understanding is that moving the machine should always trigger the motion sensors, same as dropping it.
If you do that while it is going into safe sleep, the worst should be a corrupted safe sleep file.
>Who's to blame?
Dodgy hardware batches are nothing new - I can think of cases going back to the 80s, way before manufacturing moved to China.
Without knowing the percentages it's hard to know if there should be a recall either - what if it's only 1% of the batch that are affected?
What's the exact point at which a recall should be announced?
Just because no one has mentioned it above - what about Apple's OS X, which will be an actual certified Unix come 10.5, rather than Unix-compatible as now? (No idea what that means for the underlying Darwin BSD.) That's a growing Unix distribution, although it's not exactly growing the market for Unix '03 software.
And to concur with the above - still plenty of AIX, HP-UX and Sun boxes out in the Enterprise. Expensive, but typically peanuts compared to the software running on them.
Belated comment as didn't have time to post at work.
I concur with Joe Cincotta's comment - if you look at what's going on with the development side at Microsoft they are definitely trying to pull developers towards a more O/S independent world (with Silverlight there's almost some truth in that too -although only a tenth of the APIs are there).
On the other hand, there's definitely a lot of resistance (i.e. they have had to back-pedal over VB into supporting 'classic' VB under .NET). Once .NET is established as the de facto standard for development, and only then, I expect they may start changing the underlying systems (they do have an OS lab, that's produced a non-Windows experimental microkernel system). I'd actually welcome something that was both new and challenged the dominance of Unix variants.
That said - as other commentators have suggested, there was surely nothing to stop them doing the same as Apple did with Classic (i.e. the old OS is there, running in a sandbox). The compelling reason is surely having a cut-off point for having to support 25 years of legacy code which must be costing millions - going forward you only need to maintain the virtualisation of a legacy machine (which can even be kept invisible to the end user a la Coherence mode in Parallels).
Warren - without wishing this to turn into the usual Mac/PC debate, the whole essence of the Mac was indeed what you described - walling you off from the hardware, and a computer as an appliance (hence lack of upgradability). I recall complaints about this at the time, and it was one reason I never owned a Mac myself, as at the time I was definitely a computer hobbyist.
But I could 100% see why they appealed to Douglas Adams et al - it was 'a computer for the rest of them', to turn the slogan on its head.
What you're saying reminds me very much of relatives who are mechanics and wouldn't own cars that they can't repair themselves. My uncle, in particular, hates electronic engine management systems, as all he can do is replace the whole thing - it's a box he doesn't understand.
But if you're not interested in cars beyond getting from A to B, you don't care - you want one that is cheap and reliable rather than cheap and easy to repair. It may actually be instructive to look at the motoring market - in the 70s a lot of people maintained their own cars, and the cheap end of the market was still generally unreliable. Dare I also say that 30 years ago cars were a lot more of a male preserve. These days we have the Ford Ka - which most people into cars loathe.
Shiny interfaces/eye-candy - what year exactly did geeks stop being excited about graphics? Back in the day, I remember when people used to fantasise about working at Xerox Parc or the MIT Media Lab, or got excited by developments in 3D graphics.
Read only documents
>And god bless 'em! Thanks to Microsoft and Office nobody has to worry about the recipient being able to read the document we sent them.
>Most businesses and workers only care about:
>1. Does my software work properly?
>2. Are people able to read what I sent them?
One acronym: PDF. It's a far more sensible format than Word for the 90% of documents we send that are read-only - and certainly for archiving. It even renders with full fidelity regardless of what fonts the user has installed - a major problem if you want to do anything typographically interesting in Word.
There's also the small fact that the reader is free - so if you're a government organisation preparing documents for use by the public, it's a far more acceptable format than something that dictates the end user install an office suite just to open a read-only document (which could have been a web page 90% of the time anyway).
It also doesn't imply platform choice - a fair number of domestic users have Macs - and while MS Office is available for OS X, and Open Office on most OSes, it's still a pretty heavyweight app to run just to read something.
And it is quite, quite wrong for taxpayers' money to be used to help reinforce a proprietary monopoly, even if businesses wish to do so.
But then this is exactly the kind of dumb logic that's got us into the mess we're in now.
And of course, Adobe are also guilty here, in that they got pretty protective of their open standard once MS threatened their dominance of the PDF authoring market.
Note : this is an entirely separate debate from having an open standard for editable documents, for which I pretty much agree with the article's sentiments.
I don't buy the idea of Apple allowing you to buy anything directly onto an iPod or iPhone - for starters you'd get cases of people losing their iPod, and therefore everything they've bought, whereas forcing them through iTunes - which nags you to back up - removes that issue.
Also Apple's DRM system is reasonably simple - iTunes manages what goes onto the iPod - the iPod does relatively little. This is far simpler than the various Windows schemes where the player itself has to do a lot of authorisation.
The most logical thing is for it to be part of a wider policy to allow Apple's products to all interconnect without cabling - i.e. go direct from iPod to AppleTV or AirportExpress speakers. Would be a good solution in cars too.
I also suspect we'll see an iPod that is somewhat like an iPhone without the phone bit - you could implement web connected widgets without a touchscreen - with a touchscreen you could have the browser, Google maps, etc, which would make a very neat portable device.
Working with windows?
>Who works with Windows anyway?? It's an OS, nothing more.
Actually, it's not - like 'OS X' it's more a brand that covers an OS and a set of development frameworks that are exclusive to that OS (Win32 and WinFX/.NET 3/whatever they're calling it this week in the case of Windows; Carbon and Cocoa on the Mac).
Hence why your applications are tied to a specific operating system in a far more restrictive way than typical Unix apps.
Right now, of course, the number of applications built on top of Vista's API, other than those shipped by MS, is somewhere around zero. But it should allow the rapid development of some nice applications, and once the consumer PC upgrade cycle has got it widely deployed (36 months) you can expect to see them.
As for the rest of you, wittering about wasteful eye-candy - I bet you were all complaining about how much RAM colour graphics mode used to take up, and rue the day you were forced to move to a machine with a mouse attached.
It's amazing, that with the Internet at your fingertips, it was much easier to type 'last time I checked' than actually do a quick check over at webkit.org.
Sorry but Apollo are using Webkit, which is a fork from the KHTML code. The code is, of course, appalling as it comes from a proprietary company, and that is, of course, why Adobe and Nokia selected it over KHTML.
Mind your language
Graham - there have been CPUs designed with increased security in mind, but the main issue is backward compatibility, which is the issue that continually dogs Windows. It's no good having trusted computing when 80% of the software you use needs to be run in untrusted mode (and half your hardware drivers are by 'unknown' and unsigned - look at your Windows services to see what I mean).
Steven - application development would be a lot quicker, and safer, if we could trust our programming languages to do what they appear to say. The choice of C++ as the major application - rather than system - programming language is a problem in itself. The programming community has continually rejected safer languages (e.g. Ada) in favour of something powerful but unsafe.
There were - of course - other pragmatic reasons - runtime safety-checking every variable assignment against its type declaration is a performance hit. Which is why C++ slaughtered higher-level OO languages.
It's worth reading Wirth's paper on a history of good / bad ideas in programming.
Lastly, however, some blame does still have to go to Windows itself - an application may open a back door, but it should have been a lot harder to download and execute an application without the user's consent, and near impossible to modify the system directories. Vista thankfully takes us closer to this point.
I saw Hugh Darwen give a presentation last year on the history of SQL, the problem with NULL, and Tutorial-D. It was fairly enlightening, even if I found myself sceptical as to its practicality.
As I understood it, one of the problems they see with NULL is that it fails to distinguish between 'information unknown' and 'does not have a' - or rather, that the semantic meaning of a NULL value is held in the code that deals with it, rather than being understandable from the schema.
The solution proposed wouldn't work with modern RDBMSs - it seemed to involve denormalising every NULLable column off into its own table - but it's only our experience of RDBMS performance that makes us think this is such a bad idea.
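As I understood the idea, the decomposition looks roughly like this - a toy sqlite3 sketch, with table and column names entirely invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Conventional schema: one table, with NULL doing double duty for
# "spouse unknown" and "has no spouse".
conn.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT, spouse TEXT)")

# Decomposed schema: the NULLable column moves into its own table,
# so the *absence of a row* means absence of a spouse - no NULLs stored.
conn.execute("CREATE TABLE person2 (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE person_spouse (
    id INTEGER PRIMARY KEY REFERENCES person2(id),
    spouse TEXT NOT NULL)""")

conn.execute("INSERT INTO person2 (id, name) VALUES (1, 'Alice'), (2, 'Bob')")
conn.execute("INSERT INTO person_spouse (id, spouse) VALUES (1, 'Carol')")

# Querying means joining the tables back together - and note the LEFT
# JOIN promptly reintroduces NULL into the result, which is part of why
# this doesn't sit comfortably on a SQL-based RDBMS.
rows = conn.execute("""
    SELECT p.name, s.spouse
    FROM person2 p LEFT JOIN person_spouse s ON p.id = s.id
    ORDER BY p.id""").fetchall()
print(rows)
```

The win is that the schema itself now documents the distinction; the cost is a join per formerly-NULLable column.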
I think a lot of the concepts in Tutorial-D would help close the supposed 'Object-Relational' mismatch - it seems to have a closer fit to the notion of inheritance.
But overall, I share your pessimism - the reality is that there is a generation of 'database hostile' programmers out there, who would rather pull back thousands of rows up into an OO language, modify them using an iterator, and push them all back to the database layer, than use a set based update statement.
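The contrast between the two styles can be sketched in a few lines - again a toy sqlite3 example with invented table names; on a real database over a network, the first pattern also pays a round-trip per row:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, price REAL)")
conn.executemany("INSERT INTO orders (price) VALUES (?)",
                 [(p,) for p in (10.0, 20.0, 30.0)])

# The 'database hostile' pattern: pull every row into the application,
# modify each in a loop, and push them back one UPDATE at a time.
for row_id, price in conn.execute("SELECT id, price FROM orders").fetchall():
    conn.execute("UPDATE orders SET price = ? WHERE id = ?",
                 (price * 1.1, row_id))

# The set-based alternative: one statement, executed inside the engine.
conn.execute("UPDATE orders SET price = price * 1.1")
```

Both approaches apply a 10% increase; the set-based one does it in a single statement, and lets the engine plan the work.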
(Equally, I think it is little appreciated how much relational theory stands on top of very solid mathematical foundations, far more so than the heuristic approach of object modelling. Then again, consider the success of mathematically sound formal languages against the programming languages that have been successful.)
Thanks for the corrections
Thanks for the further info on QT. Perhaps the issue I have is more with 'cross-platform developers' then, than the toolsets they are using.
(For instance, it is trivial to make a Java app show its menu bar at the top of the window on Windows and the top of the screen on OS X. The fact that many do not do so is indicative of what I see as the issue.)
For what it's worth, I'd agree with you on the verbosity and obscurity of some of the Obj-C/Cocoa libraries. I didn't find Obj-C that weird - probably because I have no C++ to unlearn, whereas I do have historical Smalltalk experience.
And there are some powerful features in there, like late binding, posing, and dynamic typing / runtime type inspection. (The same stuff that made Smalltalk a great idea but a performance nightmare in the 80s.)
I would imagine that the combination of the NextStep framework and Interface Builder was pretty revolutionary back in 1995 - a lot of the structure seems, to me, similar to QT, right down to the separation of the interface into an XML file read at runtime (admittedly NIBs are not XML, but can be converted to/from it quite easily, as they are hierarchical).
However, I'd agree that some of those method names are so damn long, and IDEs have come a LONG way since then.
I understand there was an attempt, at some point, to modernise the syntax, which is what is needed (and what's with the whole idea of having NS at the beginning of classnames/methods to specify them as NextStep rather than a namespace qualified notation?).
Of course it fell on stony ground - existing developers not wanting to change, and any new developers coming in learning from existing developers.
I've not taken a look at the Ruby and Python bridges, which may be an easier road in (no messing around at the C or C-derived level at all). F-Script looks like another interesting way to play with Cocoa.
I remain unconvinced about the notion of cross-platform development, in that there is a lot more than separates Mac, Windows, KDE and Gnome than their widget set.
For instance, on OS X you've got AppleScript and Automator as front ends onto OSA, and as far as I understand it there is no bridge from Qt4 to the OSA events.
And of course there is the issue when you drag a file into a window and find that it doesn't accept it, even though the program can edit it if you open via the File menu. (To be fair, that can happen on native applications too).
Looks indistinguishable is one thing, feels indistinguishable is another.