Re: yeah yeah, it was a bug...
Of course it was a bug, and it's just really, ummm, unfortunate that the same bug managed to manifest on both Android and iOS.
My reaction is to assume the latter; iOS devices are much more likely to come on a contract with free 'unlimited' data than Android phones. It's a natural result of Apple not being competitive at low price points.
So what conclusions to draw? Nothing we didn't already know: failing to support either iOS or Android is going to cost you a large chunk of your potential market.
There's a precedent of sorts; Edison managed to collect sufficiently many patents for film equipment that he was able to force a ridiculous array of rules on the nascent industry, including permissible running times, subjects that could be covered and a bunch more. The net effect? American filmmakers moved to Hollywood, about as far as they could get from Edison, and ignored him.
I guess skipping the US would be analogous here.
The issues related to outsourcing, and highlighted in this article, tend to be about communication and oversight — stemming from the physical distance between one set of staff and the other, and the way that decisions about how to distribute work are often money-oriented and made by people that can largely ignore the consequences.
Conversely, you seem just to dislike Indian people.
It's a nation with a population of 1.3bn so you'd expect a few bad eggs, and if everything else were equal you'd expect failed companies to be significantly more likely to have CEOs and other employees from India than from almost any other individual country. Similarly individual wrongdoers should be much more likely to be Indian, just like people who scratched their left ear yesterday are much more likely to be Indian.
What doesn't follow is that Indians are much more likely to be individual wrongdoers. To argue that is to fall for a trivial logical fallacy: a conditional does not imply its converse. "Thieves are likely to earn less than £1m a year" does not mean that "those that earn less than £1m a year are likely to be thieves".
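To put numbers on that fallacy, here's a toy Python sketch — every figure below is invented purely for illustration:

```python
# Made-up numbers illustrating why "thieves are likely to earn under £1m"
# does not imply "those earning under £1m are likely to be thieves".
population = 60_000_000
thieves = 100_000
thieves_under_1m = 99_000        # almost all thieves earn under £1m...
earners_under_1m = 59_000_000    # ...but so does almost everyone else

p_under_1m_given_thief = thieves_under_1m / thieves
p_thief_given_under_1m = thieves_under_1m / earners_under_1m

print(f"P(earns < £1m | thief) = {p_under_1m_given_thief:.2f}")    # 0.99
print(f"P(thief | earns < £1m) = {p_thief_given_under_1m:.4f}")    # 0.0017
```

The first probability being near 1 tells you nothing about the second, which is tiny because its denominator is the entire low-earning population.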
No; my memory may be at fault but I think I'm right to say that Android 1.x and 2.x essentially rendered everything in software, including real-time interaction responses like scrolling. No GPU caching at all.
3.x introduced the first version of offloading to the GPU and 4.x has consolidated and updated that so that basically everything happens on the GPU — it's not just a cache for a compositing window manager, it's actually doing the drawing. So it's like QuartzGL or WPF on the desktop.
Based on released benchmarks, it's really good stuff.
I'm a big fan of Atari joysticks; gripped so that your right hand lies nearly flat on the top of the base, with the stick at the base of the join between your thumb and your palm and the fire button under your left thumb, I think they're quite comfortable.
Starting with the N95 — released more than a year before Apple had an app store — Nokia instigated the 'Symbian Signed' programme, which involved locking those and subsequent handsets down so that they could install signed applications only. By Feb 2008, still several months before Apple had an app store, Nokia announced it would now sign apps only for those paying Nokia a subscription of US$200/year and fulfilling certain other criteria.
That's probably what's being referred to here as a walled garden. It's not as strict as Apple's rules but it's exclusive centralised control nevertheless.
Nokia's documentation on the scheme remains available at http://www.developer.nokia.com/Community/Wiki/User_guide:_Symbian_Signed and a bunch of blogs and news reports to confirm dates are easily found with Google.
My bank have put contactless payment into my debit card already, despite my complete lack of interest. That said, it's actually proven useful quite a few times — a bunch of lunch places around central London take it, and even a few pubs.
But do I want that in my phone? Not really. It doesn't feel like it would add anything I don't already have. It will also probably be completely unclear who I'm meant to ring if the thing is lost or stolen, and who is responsible for any resulting charges that amount to theft, and on top of that there's the risk of malware.
I guess it's an easier sell in places like the US where they haven't even quite evolved to putting chips in cards and cheques are still routinely used for all manner of transactions.
On the plus side, at least his claim that "[In BRIC countries] it's all about Android" followed by a chart showing Android at 29% share versus iOS at 24% in South America and shares of 38% versus 30% in Asia gave the game away relatively early. By his reasoning it's all about iOS in Europe and America (spoiler: it most certainly isn't).
I would have cited the cost, the lack of consumer knowledge and the speed at which online services are being integrated directly into TVs, but arrived at the same conclusion. Even amongst those who want this sort of set-top box, at twice the price of the AppleTV and the Roku they seem unlikely to win over many consumers.
Check out the sales figures — there are a lot more people than the tiresome Internet flame brigade buying Apple products.
You're absolutely right — specifically the port carried both Firewire and USB, the Firewire pins being one source of power that iPods could charge from. The Firewire functionality was removed so those chargers that supplied only via that connection ceased to function.
If they swap connectors, expect adaptors to flourish.
Every iOS device released so far has enjoyed at least one major new OS release. Every device manufactured in the last two years is compatible with iOS 6.
WP7 phones continue to be manufactured now, but not one of them will ever receive the major OS update expected this year.
Are you actually incapable of seeing why the latter deal is significantly worse than the former?
I think its main problems are that the hardware isn't as cheap as 'free with a mobile phone contract', the games don't cost 69p, and the perception is that many of the gimmicky third-party titles aren't even worth that.
It's Android and iOS that Nintendo should view as competition, not Sony.
No need to imagine it — just check out a third generation iPod Shuffle (the intermediate one, before they all but reintroduced the previous model).
If you reread my post, you'll see I argue that there's an insufficient distinction between an iPad 2 and Microsoft's ARM tablet. I'm going to take it as given that you're not trying to say the same thing about the ZX81.
Specifically, the two devices are about the same size, have about the same screen resolution, and since the iPad 2 is approximately the same speed as the latest model we're probably talking the same sort of order of performance.
It's not even roughly the same. The iPad 2 is currently available for $399 — two thirds of the cost of Microsoft's ARM tablet — and is a device on which Apple make a profit. And since it's not the flagship product I doubt Apple would have any qualms about dropping the price if Microsoft became an actual threat. Apart from ports there also seems to be no sufficiently obvious way in which Microsoft's ARM tablet is a better pick.
So, yes, I agree completely. Trying to usurp the existing dominant player with a product that isn't obviously better and isn't significantly cheaper is lunacy. Amongst others, Amazon's approach is much smarter, and I bet they'll have managed at least a European release before Microsoft manages to get any hardware out.
GIMP isn't an acceptable replacement for Photoshop for most professional purposes. The big killers are that they're still working on getting support for more than 8 bits per channel into the main code base, and seem to have no announced plans with respect to CMYK. For many of the professionals who use Photoshop, the extraordinarily lazy official OS X port (ie, it's X11 based*) and the out-of-date third-party builds (eg, Gimp.app is still at 2.6.0) are an issue too. And that's without even getting started on the interface, which isn't just different from Photoshop but often in direct contradiction to the norms on both Windows and OS X.
That being said, I personally consider it invaluable — entirely as a non-professional, you understand.
(*) which means it takes forever to run, uses widgets with different behaviour from the rest of the OS, fills your hard disk with hundreds of megabytes of 'font caches' as it duplicates functionality built into the OS, etc. It's the way X11 apps run on the Mac I'm complaining about, and nothing more.
... but somewhere between them? Tablets are hot but Microsoft's core computing competency is 'things with keyboards' so what they've done is probably the smartest move they could have made, albeit that moving a lot earlier would have been a lot smarter.
As noted by almost everybody else, success will probably depend on pricing, screen quality and battery life, none of which we yet seem to know anything about.
Since 2006 Microsoft have launched Windows Vista and Windows 7. In the same period Apple have launched 10.5, 10.6 and 10.7. Both have their next release planned for this year.
So, totals to the end of this year will be: Microsoft: three releases, Apple: four.
In your estimation, that's three times as many paid-for releases as Microsoft?
I'd rather we didn't use pointless jargon like WUXGA too. It's about as useful as kibibyte.
geejayoh — "I know the rights for the Thunderbolt trademark have been transferred to Intel now. But there still has to be something in it for Apple."
To summarise your argument: if we take it as given that Apple is evil then that leads inexorably to the conclusion that Apple is evil.
Third party apps definitely were able to make use of the data, though I think not directly on the device. Applications running on a Mac or Windows PC could slurp it out of iTunes device backups (unless the user had chosen to encrypt them, but unencrypted is the default and there's no click-through prompt or anything like that).
They seem to conflate new methods with complete APIs — so if they had added, say, the method arrayByAddingObjectsFromArray:range: to the class NSArray (which they wouldn't, because it'd be redundant) then that counts as a new API.
To add briefly to those that have commented above on why a camera's 5 megapixels usually don't look very good: the Bayer filter used by 99.99% of cameras samples each colour channel at well below the full sensor resolution and hence introduces significant aliasing. Almost every camera with a Bayer filter therefore performs lowpass filtering as part of the demosaicing process (the main exception being Leica, Leica owners generally being aware of the issues and able to do what they think is intelligent in software after the fact).
So cameras explicitly go out of their way to ensure the 5mp contains less information than it could, even if sensor size, quality of lens, etc, weren't problems.
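As a rough illustration, here's a NumPy sketch (assuming the common RGGB layout) of how sparsely each colour is actually sampled on a Bayer mosaic:

```python
import numpy as np

# On an RGGB Bayer mosaic each photosite records only one colour:
# red and blue are sampled at 1/4 of the sites, green at 1/2.
# Sampling below the full resolution is what invites aliasing.
h, w = 8, 8
rows, cols = np.mgrid[0:h, 0:w]
red_sites   = (rows % 2 == 0) & (cols % 2 == 0)
green_sites = (rows % 2) != (cols % 2)
blue_sites  = (rows % 2 == 1) & (cols % 2 == 1)

print(red_sites.sum() / (h * w))    # 0.25
print(green_sites.sum() / (h * w))  # 0.5
print(blue_sites.sum() / (h * w))   # 0.25
```

So even before demosaicing, no single colour channel ever sees more than half the advertised pixel count.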
I still find the assertion that it's impossible to program with a two-year old GPU laughable (though possibly my memory is at fault; did they have programming before 2010?) but it seems Apple have sided with the critics for once. The 'new' text has been removed from the Mac Pro on the official storefront.
On the contrary, Apple launched new Mac Pros today.
Despite your assertions to the contrary, I think that most developers, like myself, will be insufficiently interested even to be able to tell you what's changed.
Shush! You'll get yourself banned from reading Apple news!
So that the industry can move on from its screen resolution plateau. I don't particularly care that Apple have moved first — it really says more about their category of device (ie, the expensive end of the market) than anything else.
900,000 isn't unrealistic. According to the best figures I can find quickly, during Q1 2011 Apple sold about 378,000 iPhones a day and 157,000 iPads. I've no idea about iPods, but if we assume that in Google's terms each device counts as a single activation then we can conclude that Apple (then already behind Android in shipment totals) were doing at least 535,000 per day a year ago.
So for the OS that outsells Apple in a rapidly expanding market to be doing 900,000 a day a year later seems entirely realistic.
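For what it's worth, the arithmetic behind that back-of-the-envelope figure:

```python
# Per-day sales quoted above for Q1 2011.
iphones_per_day = 378_000
ipads_per_day   = 157_000

# Treating each device as one activation, and ignoring iPods entirely,
# gives a lower bound on Apple's daily "activations".
apple_per_day = iphones_per_day + ipads_per_day
print(apple_per_day)  # 535000
```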
Apple's figures have so far always matched between public announcements and those required by law to be accurate (earnings calls, audits, etc).
You mean other than the kernel, the drivers, the windowing system, the binary format, the filing system and the system- and user-level libraries?
It sounds like it may be so that the computer and the display device can set up a little ad hoc network of their own, meaning that nobody else ends up suffering because of the high bandwidth stuff going on elsewhere.
A search on Amazon for the Playbook reveals the price difference between the 16gb and 32gb Playbooks to be $85.69. On Apple.com the difference between the 16gb and 32gb iPads is $100. I'd suggest that the step up is therefore a very comparable size. Though you'd be an idiot to take it because the 64gb Playbook is then a mere $21.39 more than the 32gb (versus another $100 between the respective iPads), which I assume is the sort of difference you were referring to.
Those are all at the actual street prices, of course. The Playbook's RRPs appear exactly to mirror Apple's.
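Putting those quoted street prices side by side:

```python
# Storage-tier price steps as quoted above (US$; Playbook figures are
# Amazon street prices, iPad figures are from Apple.com).
playbook = {"16GB to 32GB": 85.69, "32GB to 64GB": 21.39}
ipad     = {"16GB to 32GB": 100.00, "32GB to 64GB": 100.00}

for step in playbook:
    print(f"{step}: Playbook ${playbook[step]:.2f} vs iPad ${ipad[step]:.2f}")
```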
The Classic OS didn't preemptively multitask, so it used CPU cycles only when you let it. The routes it exposed to the video and audio hardware also weren't all that much of an issue.
More of a problem was the Mac's high resolution display. As per the article, the small portion of the screen given over to the 3d display in the original Marathon was 320x240 — already larger than Doom's entire screen, at a time when pixel painting was definitely the bottleneck.
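The pixel arithmetic behind that comparison:

```python
# Pixel counts for the two viewports mentioned above.
marathon_view = 320 * 240   # Marathon's 3D window alone
doom_screen   = 320 * 200   # Doom's entire screen
print(marathon_view)                 # 76800
print(doom_screen)                   # 64000
print(marathon_view / doom_screen)   # 1.2, i.e. 20% more pixels to paint
```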
The Mac versions of classic games are usually worth checking out though. You're not going to find 640x480 versions of Prince of Persia, Chuck Yeager's Air Combat, etc, anywhere else.
The sad thing is, the new models probably won't even be more expensive. They'll probably be indistinguishable but for incremental component improvements.
That being the case, I can now see that El Reg has done basically everything it can to turn what would otherwise be a single sentence into an entire article. But when's the last time the release of a new computer really excited anybody?
The game: try to spot the story amongst the tedious playground bile.
I think you're living in the realm of the straw men; this document does not claim that iOS is 100% unhackable. It merely documents that Apple has taken many of the steps that the industry advocates in order to secure their OS.
The only person you'll hear an unhackable claim from is an inveterate troll. You can point them at the latest jailbreaking tool if you want instantly to win that argument.
Could that have been one of Europress's Fun Schools?
Nah, he was probably just joking. 2003 is also the year in which he promised us 3.0GHz G5s.
But in addition to killing Ping can we please have a disentanglement of the music player from the thing that archives and synchronises apps? Even on a Mac where it performs reasonably, iTunes is a usability mess.
Well then I stand corrected. I've used Zombies Run (ie, this one) on Android and was aware it had been in the public eye quite a while before release because of the Kickstarter; the degree of funding attracted further suggested a novel product.
Despite being El Reg's boards, I'm glad we could both be civil about my error.
I think you're possibly confused; it's the same app from the same company for both iOS and Android, and it got Kickstarter funding last year (see http://www.kickstarter.com/projects/sixtostart/zombies-run-a-running-game-and-audio-adventure-for ), rather than having been around for "a couple of years".
Considered responses to known facts will probably be saved for occasions when there are known facts to respond to. You'll likely mostly get the usual internet-dwelling angries, responding to your transparent attempt to anger them.
Are you sure you didn't mean to write that Steve Jobs' death had opened the way for Samsung to attack Apple's witnesses?
It's usage though, not installations — StatCounter is one of those that collects usage data through web page accesses. So people aren't merely installing Chrome, they're installing it and using it.
However they've done it, I say congratulations to Google. The market is now divided enough to make the standards successful, which aids innovation. Can you imagine the smartphones having done as well if rendering everything like IE6 was still the consumer expectation?
More than that, it's an article on web browser progress that requires Flash (specifically to draw the charts). I can almost taste the controversy.
I think the AC was referring to Jobs' vision of simple, stratified product lines. Sure, they keep the older models around, but I think the point is that the initial presentation is straightforward: there are no mystifying tables of tick boxes for people to traverse in PC World, Carphone Warehouse or wherever, and the thing itself can become a desirable object in the group consciousness of consumers without caveats.
That's a policy he implemented almost immediately upon his return to Apple, starting with the Mac, and is probably a large part of what saved the company. I think they'll stick to it. If the iPhone gets larger, I'd expect the iPod Touch to get larger too and for no current generation smaller variants to remain available. No doubt they'll brand it a transition and cite the moves to PowerPC, to OS X and to Intel as evidence that they're not diluting, just advancing. Apply your own pinch of salt.
Obviously there are a bunch of ways that an iPhone screen size change and an iPad Mini would directly contradict previous Jobs keynotes. He used to do that himself, but the observation is nevertheless valid.
Sure, it's marketing speak, with Apple coming up with whatever figure suits the keynote, but as a general rule it's double whatever it replaces in each dimension, so old art scales up to an integral size. That would also fit the art resources discovered in recent OS X applications.
So it should be double one of the existing or reasonably recent screen resolutions. So most guesses have been 2560x1600 (which is a doubling in each dimension of the 13" MacBook resolution) or 2880x1800 (which relates to the 15" resolution).
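The doubling arithmetic, applied to those two panel resolutions:

```python
# "Retina" doubling: each dimension doubles, so old art scales by an
# exact integer factor. Applying it to the then-current MacBook panels:
panels = {"13-inch": (1280, 800), "15-inch": (1440, 900)}
doubled = {name: (2 * w, 2 * h) for name, (w, h) in panels.items()}

for name, res in doubled.items():
    print(name, "->", res)
# 13-inch -> (2560, 1600)
# 15-inch -> (2880, 1800)
```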
Just guesses though. Mind you, so is the claim that the next resolution increase is going to be dramatic rather than merely evolutionary or non-existent as previously.