Allow me: OS X contains a layer derived from BSD. It also ships with a terminal and a bunch of the open source components that you commonly see included in Linux distributions. WebKit is notably a fork of KHTML, and KDE is generally closely associated with Linux distributions. There's even a rootless X11 server if you want to use it (though I don't think it's a default install).
So, fine, technically it's not Linux-derived because its original development predates that of Linux and Linux is just a kernel, whereas OS X explicitly uses a completely distinct kernel. But it's quite accurate to say that it shares a large code footprint with what people idiomatically call 'Linux' and that at least some components were part of idiomatic Linux before they were part of OS X.
I'm sure that you could find a bunch of BSD, Linux or OS X people that would be angered by the statement, but hopefully not at as irreverent a site as this.
As above, my guess is...
... downloading to another folder is achieved by supplying an archive with an absolute path, and one of the built-in extractors failing to validate that properly. bsdtar is safe, so I'll guess it's a zip problem. The default setup doesn't let users write absolutely anywhere on the system, but it does let them write to /Applications, so whatever they're doing probably can't write to arbitrary locations, just to the places the user can already reach.
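To make the guess concrete, here's a minimal sketch (Python, not from the original post; `is_safe_member` is a hypothetical name) of the kind of check a safe extractor like bsdtar applies and a naive one might skip — rejecting archive entries whose paths are absolute or escape the destination via `..`:

```python
import os

def is_safe_member(name, dest):
    """Reject archive entry names that would extract outside dest.

    Catches both absolute paths ('/Applications/Evil.app') and
    parent-directory traversal ('../../etc/passwd'). Assumes POSIX paths.
    """
    # Resolve where the entry would actually land if joined onto dest.
    # Note os.path.join discards dest entirely when name is absolute,
    # which is exactly the bug being guessed at above.
    target = os.path.realpath(os.path.join(dest, name))
    root = os.path.realpath(dest)
    return not os.path.isabs(name) and target.startswith(root + os.sep)

print(is_safe_member("docs/readme.txt", "/tmp/out"))       # ordinary relative path: allowed
print(is_safe_member("/Applications/Evil.app", "/tmp/out"))  # absolute path: rejected
print(is_safe_member("../../etc/passwd", "/tmp/out"))      # traversal: rejected
```

An extractor that just joins the entry name onto the download folder, without a check like this, will happily write `/Applications/Evil.app` wherever the archive says.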
Yes, though, it's a big gaping hole.
They were confused
"Administrator privileges" tends to be synonymous with unfettered access to anything on a computer. A default install of OS X will require a password be entered for a bunch of tasks, such as viewing things stored on the keychain, making changes to certain system preferences and some other things.
However, you're quite right because on a default install, and I'll wager on 99.9% of machines out there, the single user has a tick against 'Allow user to administer this computer' and can write whatever they want to /Applications, whenever they want. Combine that with Safari shipping with 'Open "safe" files after downloading' ticked by default and it's easy to see how this program installs itself, given that archives are considered safe and I guess one of the archive formats doesn't properly guard against absolute paths.
All of the proper, internal paths should be properly locked down by default, so in theory this program shouldn't be able to do anything to stop you from just dragging it to the trash and hence uninstalling it. That said, it should still be a major embarrassment that it can install itself in the first place.
I agree and disagree
Agreed: the important asynchronous fetch parts behind AJAX originated at Microsoft; IE6 didn't hit standards very well, but it was an era before anybody did and before most of the standards it ended up living alongside; Microsoft's enterprise software — whatever else you may say — sets a high bar of entry for competitors; and Gates was preaching the tablet before anyone actually wanted one and before the form factor really worked for technological reasons.
Disagreed (but not refuted as these are just opinions): much of the Apple stuff. Innovation just means to make changes in something established, especially by introducing new methods, ideas, or products. So it's actually a really easy test to satisfy — taking a good idea from one field and transplanting it into another so as to change perceptions of the market would seem to be enough, so multitouch on phones will do. They also have some genuinely novel manufacturing processes that create very robust enclosures, and Thunderbolt is very interesting.
On CPUs they use the latest Intel parts in the computers and design their own ARM-based silicon for the tablets, phones and MP3 players. Pixel densities are lagging on computers, but if pixel density were the test then you'd have to give them innovation for the latest iPhone, having jumped at least 50% ahead of the competition at launch and still being ahead almost a year later.
The thing with consumer products is that it's artificial to separate hardware and software when trying to pinpoint innovation. Which is why comparing Apple to Dell and Microsoft separately is a little silly and I think that's part of the point the article was making.
But on the other hand...
... the conclusion was drawn by comparing the proportion of this movie's viewers who chose to watch it in 3D with the equivalent proportion for those other movies: amongst films with a large enough release and sufficient success, the proportion of people opting for 3D was lower than previously.
There are then a bunch of possible counterarguments about why the specific nature of the film didn't cause the drop, albeit that they're speculative.
So I don't think it's franchise sickness, since the comparison isn't to other films in the same franchise or to any numbers that may have been affected by the popularity or quality of other films in the franchise.
You've probably got your DVD player or Freeview player or Sky+ box or whatever set to output a 4:3 picture, so it's letterboxing the 16:9 then putting subtitles on at the bottom. Probably things would improve if you set the box to 16:9, adjusting your TV's picture stretching setting accordingly.
You could post the counterexamples if you think evidence is important.
I think I generally agree with you though, as it stands to reason. Some manufacturers lower costs by accepting money to install the Norton Tools or whatever trial versions on their machines. Apple don't. So even if Apple and those manufacturers spent exactly as much on production and applied the same markup, the Apple machine would be more expensive. You don't have to allege that Apple are charging higher margins or in any other way pumping up prices to get to the conclusion that the Apple machine should cost more.
No viruses yet...
... but a growing list of trojans. Platform security obviously helps prevent viruses (I'll bet the number for Windows 7 is tiny compared to Windows 95 when it was at the same level of adoption), but platform unpopularity is the only way to safeguard against trojans. I guess someone is dipping their toe in the water to test the viability of this sort of scam given Mac market share.
Technically he's making that allegation only if you think the Linux box is a cheap imitation of an Apple box. And I don't even agree with him that markets prefer things that are cheaper and less efficient; generally they prefer cheaper and more efficient.
I'm of the opinion that desktop computers long ago became pretty generic, though I tend to buy Macs still because they have a small physical footprint, a tiny electrical footprint, operate silently and usually last a decent amount of time. I'm also quite familiar with the software stack. However, I accept that I'm putting myself into a straitjacket in terms of customisation and I'm not under the illusion that I couldn't get better benchmark results for less money, or that because I like the OS it must be objectively better.
The best computer is the one you like the most, and the competition is what keeps all the vendors on their toes.
Based on Microsoft's denial and the port to ARM, I'm optimistic that they're deprecating some legacy stuff by relegating it to a Windows 7 compatibility mode. So they wouldn't be re-jigging the underlying architecture, just trying to push everyone more forcibly towards the re-jigged stuff.
Would an all .NET Windows with all or most of Win16/32/64 in a sandboxed, legacy support environment really be a bad thing?
Not quite true; if I recall correctly then a PowerPC version of Windows NT 4 shipped for the PReP platform in a few extremely obscure ThinkPads that could run Intel binaries through emulation. Or my memory might be fooling me, and the emulator may have been an add-on, though I'm pretty sure it worked at the system level, to emulate the binaries but forward the relevant system calls directly to the native NT implementations. Or I'm just very confused indeed.
I completely agree; binary compatibility for the Mac goes back a decade at the most and requires the installation of optional components to do so. Obviously you can argue some virtue in that from the perspective of bloat and support, but the benefits of full backward compatibility are so obvious as not to need arguing. Microsoft aren't always 100% on the nail, but it says a lot that I can remember the only two times I've had problems, and the first of those was running a Windows 2 version of PageMaker on 3.1...
Same thing here; I think part of it is that there are no gaps between the pixels, and they've selected fonts that are relatively aggressively fitted to pixel boundaries so there's limited need for anti-aliasing anyway. http://www.bit-101.com/blog/?p=2722 makes the point quite well, especially when you get to the 400x versus 375x zooms.
Given that the proposal is being submitted to an industry-controlled standards body for the scrutiny of the normal industry-controlled standardisation processes, it's probably safe to assume that if Apple don't know what a standard is then someone will tell them soon.
The Bluetooth criticisms are valid; the video criticism isn't. Both the QuickTime container and H.264 video are industry standards, being written into the Blu-ray spec amongst other things. My Android phone can play them, most £20 DVD players with USB slots can play them, and VLC can play them.
I think they already get a share of your subscription costs — that was the deal originally, at any rate. Hopefully competition from Android and others has eaten into that by emboldening networks.
I think tablets may displace laptops for users that buy a laptop to use in their own home, primarily for the web and email, tending to keep it in the lounge or some other socially oriented room. A tablet usually betters a smartphone for the same reason that A4 is what most people put content on rather than till receipts — reading a web page at approximately the size of a full piece of paper is just easier and more comfortable.
I'm aware they also act as media centres, but I don't see that being a big use. For music you want something that'll fit in your pocket and video content tends to want as big a screen as possible, with the average TV now being probably about 30".
I'll bet that within ten years you get a tablet with your broadband just like you currently get a wireless router. People will plug the router into the wall, put the tablet in the lounge and for 90% that's the home Internet sorted.
The iPhone supports the latest HTML5 database stuff for local persistent storage, as does Android. BlackBerry doesn't.
To my knowledge, Jobs has spoken out publicly only against Flash. The arguments he made that it offers a very poor user experience on mobile have been backed up by every objective review I've seen of the Android client. I doubt I'm alone in having decided never to download the thing onto my Android phone. His conclusions — to ban Flash from his platform — betray his control tendencies but the initial observations were valid.
History has also shown Apple to be a beneficial contributor to the web ecosystem, being the driving force behind the vast majority of WebKit development (it was forked from KHTML when that project was four years old, which was nine years ago) and the originator of the canvas element and 2d/3d CSS transforms amongst others. As far as I can think, they've done nothing at any time to hurt the development and propagation of web standards.
Very exciting if true
With Honeycomb being closed source, I guess this'll be a fork in many respects — so there's a sense in which it'll add fragmentation to the Android world, but I'd expect that Amazon won't market these as 'Android' devices and won't officially support anything other than their own store so hopefully the issue will be moot. I expect they won't have done anything dramatic with the underlying APIs either, so falling back into line at some point shouldn't be hard.
Amazon are the only company I can think of that can largely match Apple's content collection (probably the same amount of music and movies, more books and magazines, fewer apps, but the latter feels easier to fix if the devices are successful), while having a good direct-to-the-customer relationship and a fantastic retail infrastructure. If you're looking for someone other than Apple to launch a single prominent consumer device, then Amazon is your best bet. If the news is true, my money's on tablets being a two horse race, not between Apple and the field of Android players but between Apple and Amazon.
If the Kindle experience is anything to go by...
... Amazon adverts are served only if you've accepted a subsidised device. And given the work Amazon will have had to do in isolation due to Honeycomb being temporarily closed source, I'd expect them to have cut out the Google ads even if it means sending all the normal Google apps with them.
@The Fuzzy Wotnot
While I agree with your point — which I take to be that Apple shouldn't be condemned for making a computer as an appliance given that there's a market for computers as appliances and lots of people want them only as appliances — I think possibly the offence here is that Apple are adding and removing features that some people want without any sort of notice and with no regard to that particular audience.
So it's characteristic of their control tendencies and it further evidences which segments they're actually interested in selling to. It's also a sign that they don't mind deviating from industry standards if they think something is to be gained for their target audience. So I think it plays both ways. I can see why it offends a lot of El Reg's readership but I don't agree with a lot of the motives that are assumed to be behind it, or that it imputes much upon Apple's customers.
That's not quite the issue here; Apple have built a temperature sensor and the firmware necessary to report back from the temperature sensor directly onto the drive, have created a proprietary connector to allow drive + sensor to talk back to the rest of the system and have set things up so that any failure by the drive to say that it's safe results in the fans spinning up to the maximum extent possible.
This hinders third party upgrades, so is a negative step in the eyes of a group of their potential customers. Any individual who would have swapped out the base build drive for something larger after purchase and who now instead opts to pay for a build-to-order upgrade will have to pay more than they did previously per the industry-wide rule that build-to-order upgrades cost quite a lot more than buying the better part and performing the same task yourself (often even if, hypothetically, you were then unable to realise any value from whatever bit you remove).
Conversely, it possibly shrinks the total size of the sensor + the drive (or, more probably, the cost of the two together), and iMacs haven't been designed to contain user serviceable hard drives at any point that I remember. Most iMac purchasers already treat it as an appliance and attempts to upgrade are rare — the RAM is user serviceable and very trivial to access, everything else is hidden. Upgrading some non-user-serviceable parts of the iMac is easier in this model than the last (including the CPU, notably), but that doesn't make for an interesting story.
So: Apple have taken a step that upsets some of its customers, but not most. It's news but it's not really the end of the world and it doesn't say much about Apple beyond reinforcing whatever you already thought about them.
On the other hand...
... some who use Doctor Who primarily as something to talk about in advance, with theories and leaked plot elements, seem to get many hours more enjoyment out of it than they would merely by watching the television programme. And they're probably still buying the merchandising, while those of us that prefer it just as a television programme still didn't know what was going to happen.
No, I think it's to protect at the other end...
i.e. Motorola and Samsung versus the no-name, very low specification, resistive tablets that are threatening to give Android an unfair bad name. I also think that's maybe why, with 'Honeycomb', they've picked a codename that sounds good and is being pushed as part of the branding, and seem to be retaining it for the next minor version.
Either that, or it really is just that the code doesn't look very nice. Not everything is a conspiracy and companies do sometimes tell the truth.
@AC "pad size"
I thought it was more that 10" screens are quite close to the size that both the international community and the Americans separately have settled on as being good for a piece of paper, so the thing ends up a natural size for web pages, PDFs, magazines, etc. The 7" screen is conveniently like a paperback, but less suited to the web. And that's before you throw in the media centre component.
I think you're right; with Clang now fully capable of C++ and Objective-C++, they've switched to a Clang/LLVM pair for Xcode 4, to power not just the compilation stage but the static analyser, the as-you-type error highlighting, and a bunch of other workspace things.
At present they're pushing all the way to a native binary, but it feels like it'd be a relatively trivial step from here to build LLVM into the OS and move towards LLVM bytecode generation being the default in the developer tools.
No, no good guy, but I'd rate Google as the better guy. They've done quite a lot to lead various markets in great improvements, be it in search, maps, email or whatever. They tend to be a bit lax on copyright in places like YouTube and Google Books but I doubt they'd predicate an entire avenue of their business on copyright theft.
Conversely, Oracle's actions with all of the Sun acquisitions are almost a study in trying to use legalistic means to quickly consume whatever value is left as a precursor to dumping the lot.
Could be more telltale than you think
Even if this were an official announcement from Apple, it'd have a little of the 'me too' to it, given that Microsoft has already announced an ARM port of Windows 8. Obviously the difference is that if Apple decide they want ARM then you stop being able to buy an Intel Mac anywhere, but supposing Apple were to switch and to demonstrate gains in doing so then the door will be completely open for companies that ship Windows machines to introduce competing devices into their ranges.
So: Apple's move could start a trend, or at least have more of an impact than just on the tiny OS X audience. Though you'd have to buy into the version of events where Apple are highly influential in everything they do rather than just occasionally influential in some areas; assuming genuine benefits do appear from ARM laptops then I'd expect Windows manufacturers to offer devices anyway, and quite possibly sooner.
While the Mac App Store almost surprises by being quick and snappy (though it shouldn't; this is just comparing it to iTunes), it is as the name suggests Mac only. So the Windows people, for whom iTunes performs about a thousand times worse still, would be left out.
That said, the handset itself has a pretty good client — maybe leave iTunes to back up and possibly to organise, but otherwise keep it out of the software side of any of Apple's network-connected devices? If ever an application could do with having its functionality pruned, it's iTunes.
Your comment is disproved by trivial research. From the Apple-specific websites that have reported the rumour:
9to5mac, from where the story originates: "Apple has long used the proxy of iTunes to push updates to its iOS devices ... Smartphone competitors have long offered a different, more direct method for software updates that happens over-the-air."
Mac Rumors headline: "iOS 5 to Finally Deliver Over-The-Air Updates?"
MacWorld: "Other smartphone operating systems such as Android can be updated over-the-air,"
GigaOm: "Smaller, incremental updates like those served to Android might be the way to go, but that would require a significant change in the way Apple approaches updates "
Barnes & Noble operate in the US only, whereas Amazon operate internationally and (finally) launched the Kindle outside of the US last year. So, while it was an oversight not to mention the Nook, it was probably understandable from an article written in a territory where the thing isn't available.
On the contrary
You're assuming it wasn't a bug, even though it (i) was a 'feature' that explicitly wasn't used — the problematic data wasn't posted to anybody or harvested in any way; and (ii) turned up for the first time in a major OS version revision.
Fine, don't trust Apple, but the evidence that this was deliberate is relatively flimsy and the harm it will have done to others will have involved malicious third parties. I'd therefore rate it alongside any other security bug, such as the dozens that crop up in all of the major operating systems and typically lead to malicious code execution, privilege escalation, etc. So it's something that should have been caught, and it may affect your opinion of competence, but it was quite probably just an oversight.
It's what the companies that have locations reported back to them across the network in real time (I think all three of Apple, Google and Microsoft?) are doing with that data that's more scary.
Proposed Kindle changes undesirable
If the new Kindle screen is literally like the Sony ones then it'll be a step backwards because the Sony screens have a noticeably lower contrast, which people tend to attribute to the way the touch stuff is bonded on top. If it's a tablet-style form factor with no buttons whatsoever then it'll be a step backward in usability since you'll no longer be able to hold it in one hand and turn pages, e.g. while using the other hand to grip some part of the tube/bus/train that you're commuting on but which has no free seats.
On the concept of a separate tablet, I see Amazon not just as the only company with a serious chance to challenge Apple at their own integrated market game, but quite possibly the only company that could displace Apple and become the dominant single player — by simultaneously having a ready-made broad retail audience, a strong content offering and, if Android based, being able to get the bloggers on board from day one.
It has become rather diffuse, hasn't it?
At least on Windows, it didn't perform very well even back when it just managed music but now that it also does movies, apps, photos, podcasts, etc, etc, etc, it really does feel a lot worse. I'm always optimistic that the problem is just the app's heritage (it was an OS 9 app originally) and that the encouraging signs Apple have made by rewriting QuickTime and fixing the Windows port of Safari (for performance and system integration; forget what it does versus your favourite browser) mean that the iTunes problem could be fixed, possibly even in the not-too-distant future. But there's no reason to believe it'll actually happen.
To quote LBJ:
"You do not wipe away the scars of centuries by saying: Now you are free to go where you want, and do as you desire, and choose the leaders you please. You do not take a person who, for years, has been hobbled by chains and liberate him, bring him up to the starting line of a race and then say, 'you are free to compete with all the others,' and still justly believe that you have been completely fair"
It's a question of measure and degree. The iPhone libraries meet probably 80% of the OpenStep spec, which was a pure API effort and was explicitly meant to be vendor neutral. So it's very closely OpenStep related. And OpenStep being explicitly for multiple-vendor implementation, provenance isn't relevant while you're willing to conflate OpenStep and NextStep. Which also nullifies your FIAT/Ferrari comment. OpenStep is a framework, not a company.
Android phones, like iPhones and others, implement probably 40% of SGI's OpenGL (since ES 1.0 cuts a very large amount of extraneous stuff and 2.0 culls almost the entire fixed pipeline) and almost none of the rest of the old SGI APIs, along with none of the design patterns.
If you look inside OpenStep source, you'll see NSArrays, NSDictionaries, NSNumbers, target/action patterns, delegation, key-value observing, the same protocols (in the NSCoding, NSObject sense, albeit largely informal), notification centres, a run loop, selectors and fully dynamic dispatch. If you look inside iPhone source, you'll see all of those same things. So it's the same fundamental base objects, the same fundamental design patterns and mostly the same higher up objects.
Summary: your "some parts" is a massive understatement; I don't consider it so incorrect to suggest a single lineage as to maintain the article's author was wrong. It's not just that the odd API has survived and it's nothing to do with the legal name of the company involved.
Suggest you do your research. Or, possibly, stop using 'shill' as a generic smear when you've nothing intellectual to say.
Me, on "Kindle beats Apple's closed book on choice": "I have a Kindle and an iPad, and tend to buy a few books a month on the Kindle but have never bought any at all on the iPad"
Me, on "The Microsoft mobile reboot needs rebooting": "Google would argue they're only temporarily closed source, but even if the Android source code is never published again, it'd still be the only one of the offerings from Microsoft, Apple and Google to allow anyone to install any application from any source."
Me, on "Android, Steve Jobs, and Apple's '90%' tablet share": "Android dominates the phone market ... by being on a lot of different handsets, relatively cheaply. It ticks all the boxes that a large proportion of the market care about, which is enough."
Me, on "Steve Jobs vindicated: Android is not open": "Whether an OS is open source is a completely unrelated issue to whether it has an open market in applications, Microsoft Windows being the obvious evidence — it has the most diverse market possible and not one jot of it is open."
Me on "Tesco heralds 2011 as YEAR OF ANDROID": "I was under the impression that 2010 had been the year of Android and that, across the whole market, Android phones were outselling iPhones already. So it's a bit surprising that Tesco have only just caught up."
Me on "Apple 'greed' tax spreads beyond music, movies, magazines", when in a particularly bad mood: "Apple have now made it impossible for others to defend them convincingly. The new charge on subscription services means that the costs of being in the ecosystem now outweigh the benefits for many major companies that you've heard of."
"Fanboys all a-quiver"
Possibly I'm being dense; does this article's subtitle contain the usual attempt at humour or is it just plain trolling?
Re: the actual story; I can't imagine what took so long, and agree with Ben that the device doesn't even look very nice.
@Davidoff re: "not a 'NeXT handheld computer'"
It has an updated version of the language runtime (per the move towards formal protocols and the addition of closures), all the old Foundation classes (with additions), much the same conventions and patterns (target/action, delegation), and the kernel is a much updated version of that which was part of NextStep. Only the user interface library is all new, per the new user interface paradigm — multiple touches and direct manipulation are in; at the C level Display PostScript is out due to licensing costs and a PDF-derived alternative (so, same primitives but no language) is in.
I'd say it is quite closely related to NextStep.
I'm with both mucksie2676 and the other posters making the point that correlation is not causation and that, even if it were, taking the one difference in the abstract is massively overreaching.
From a completely subjective point of view, I live in London, in a typical London sized property. When I bought books, I tended to read them once or twice, maybe keep them on a shelf for a while but eventually give them away — either to a friend or via those handy Oxfam collection bins. I don't have the space to keep every book I've ever read and I don't really have the desire to. I have a bunch of reference books that I'll probably keep forever but they're in the minority.
Hence, I really don't care about portability of my e-book purchases. I have a Kindle and an iPad, and tend to buy a few books a month on the Kindle but have never bought any at all on the iPad despite downloading the iBooks app and trying the sample copy of Alice in Wonderland. I have the Kindle app for the iPad too, but doubt I've opened it more than about twice.
It's simply that the Kindle is by far the better reading device. It's a much more convenient size for public transport, you quickly forget that it's a screen you're looking at because the type looks a lot better (ie, you can't see the pixels), and the screen is visible in direct sunlight and doesn't attempt to blind me when I'm reading at night. So, it's more convenient: (i) on public transport; (ii) in the park; and (iii) in bed. Which is a clean sweep of my normal reading environments.
I guess it's nice to know that my purchases will someday be portable should a better device come along, but as I've yet to read any of them more than once I'm not really that bothered. It's actually much more bothersome that I can't pass them on having now finished with them.
On the contrary
The primary thing that dictates the best screen size is the size of an average human hand. The bigger necessarily equals better crowd are primarily those that have decided they want to criticise and have worked backwards to find any distinguishing feature. I'd rate it about equal with "the Android user interface is worse because it doesn't have CoverFlow" on the scale of valid criticisms.
Of course, that doesn't mean Apple won't ship a bigger screen. At this point, distinction from the previous year's model seems to be getting ever more slender across the industry, so giving people anything on which they can try to rationalise an upgrade is advantageous.
According to my maths...
1080p has a diagonal of almost 2203 pixels, so on a screen with a 50 inch (or 127cm) diagonal that's 44 PPI (or, the other way around, each pixel is about 0.058cm across).
So if you sit 2m away, each pixel subtends an angle of about 0.017 degrees on the retina, or very very close to 1 arc minute. A human eye in optimal illumination conditions can distinguish two lines if they're 0.6 arc minutes apart, so technically a higher resolution could be beneficial. But I imagine not really in any of the scenes you're likely to see on TV, which tend to be moving scenes featuring large objects rather than perfectly static shots of typography or, ummm, rakes at a distance. And probably not at all if, like most of the people here, you've spent most of your life staring at a screen that's maybe 45cm from your face.
It's probably worth someone else checking my maths before you take this as a definitive answer...
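In that spirit, here's a short check of the figures (Python, not part of the original post; variable names are mine):

```python
import math

# 1080p on a 50-inch (127 cm) diagonal, viewed from 2 m: the post's scenario.
w_px, h_px = 1920, 1080
diag_px = math.hypot(w_px, h_px)       # diagonal in pixels, ~2203
diag_in = 50
ppi = diag_px / diag_in                # ~44 pixels per inch
px_cm = (diag_in * 2.54) / diag_px     # pixel pitch in cm, ~0.058

# Angle one pixel subtends at 2 m viewing distance.
deg = math.degrees(math.atan2(px_cm / 100, 2.0))
arcmin = deg * 60                      # just under 1 arc minute

print(round(diag_px), round(ppi), round(px_cm, 3), round(arcmin, 2))
```

Running this gives a diagonal of ~2203 pixels, ~44 PPI, a ~0.058 cm pixel and a subtended angle of roughly 0.99 arc minutes, so the post's numbers hold up.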
There's some sort of refund mechanism, but I've never been clear exactly how it works. There was a minor uproar amongst developers when it was first announced since Apple were taking 30% of the original sale, but then supplying 100% of the refund from the relevant developer's account. I suspect they may just have decided to make the feature a little obscure rather than deal with the matter properly.
@revisionists both ways
What the iPhone had was a multitouch, direct manipulation API. So, in the browser you didn't use a little joystick to step down the page one notch at a time and you didn't go to a little submenu somewhere to decide whether you wanted 50%, 75% or 100% zoom. And it shipped with industry standard fonts like Times and Helvetica, using Apple's preferred typographically-accurate rendering, on a screen with a then high pixel density. It was also the first phone to ship with a GPU and with an OS built around the presumption of one.
When you bought an iPhone you also got an unlimited data plan — at the time almost unheard of — and no carrier interference.
Most competitors' phones of the time rarely shipped without being crippled by the carrier, required a BSc to operate fully and were proud to include exactly one font. I was at a Nokia presentation shortly before the Microsoft announcement and one of the invited speakers opined that it was a big joke that anyone should care how text on screen looks, as long as it's there.
So, limited innovations if you're going to boil them down to numbers. But if you're the sort of person that genuinely thinks the correct way to compare devices is a simple present-or-absent feature checklist, then direct manipulation and physical navigation metaphors will only ever count as "a pretty UI".
The Asus Eee Pad Transformer doesn't have the same specs as the iPad. It has a slower CPU and GPU (see, e.g. http://www.glbenchmark.com/compare.jsp?benchmark=glpro20&showhide=true&certified_only=1&D1=Apple%20iPad%202&D2=Motorola%20Xoom&D3=Apple%20iPad&D4=Samsung%20GT-P1000%20Galaxy%20Tab), being based on an older chip, which it counterbalances with a better camera and slightly larger, slightly higher resolution screen. And it's just £20 cheaper, so all it really evidences is that other manufacturers are able to hit much the same price points for much the same technology.
No innovation required
Android dominates the phone market not by being on any innovative devices — in truth nobody has innovated for years — just by being on a lot of different handsets, relatively cheaply. It ticks all the boxes that a large proportion of the market care about, which is enough.