No performance boost?
The iPad 3 does indeed contain 1GB of memory and sports a processor clocked at 1GHz, it has been claimed. So says one Vietnamese lad who managed not only to get hold of the device but to install the Geekbench performance testing app on it. And, according to Geekbench, the iPad3,3 - the new tablet's internal moniker - has a gig …
What about the GPU?
Does that do any more than just keep up with the increased pixel count?
You'd defo have to up the GPU for a 4x pixel-count increase; they'd all get returned for a refund if you didn't.
No, it does less than keep up (4x the pixels, 2x the GPU).
But for existing iPad 2 apps running at iPad 2 resolution, it's twice as fast, which is a "performance boost" by any measure.
If the screen res has increased by 4x then any GPU boost could be swallowed up just maintaining that.
"But for existing iPad 2 apps running at iPad 2 resolution, it's twice as fast, which is a "performance boost" by any measure."
Only if you want to run iPad 2 apps using 1/4 of the screen!
If the app runs full screen then 4x the iPad 2 pixels must still be drawn by the GPU...
The story so far: a 4 core GPU, versus the 2 core GPU on the iPad 2. No claims of this being faster clocked or a different GPU architecture. Given that, plus the same old dual core ARM Cortex A9 at 1GHz, it sure sounds like the high-resolution screen results in a net performance downgrade versus the iPad 2.
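To put rough numbers on that argument (assuming per-core throughput simply scales with core count, with identical clocks and architecture, as above — a simplification, since real GPUs rarely scale perfectly linearly):

```python
# Pixel count vs GPU core count, per the argument above.
# Assumes per-core throughput scales linearly and clocks are unchanged.

ipad2_pixels = 1024 * 768      # iPad 2 display
ipad3_pixels = 2048 * 1536     # new Retina display

pixel_ratio = ipad3_pixels / ipad2_pixels   # 4.0
gpu_ratio = 4 / 2                           # quad-core vs dual-core GPU

# GPU grunt available per pixel, relative to the iPad 2:
per_pixel_budget = gpu_ratio / pixel_ratio
print(per_pixel_budget)  # 0.5 -> half the per-pixel horsepower
```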
Or the nVidia Tegra 3... which seems to be clocked as high as 1.6GHz, and offers of course four cores, pretty much the emerging standard for non-Apple tablets this year. Heck, my smartphone doesn't match the iPad 2 on GPU performance, but it beats it on CPU speed.
Here's another big question. Google selected the TI OMAP 44xx series as their ICS flagship chip for one reason: memory. Like most PCs and unlike every other ARM to date, the OMAP 44xx has a dual memory bus (sure, desktop i7s have a triple memory bus). So here's the thing... Apple's GPU and dual core CPU are already stressing the memory performance of the A5's single 32-bit path. Did they up the memory bus width and/or double it for the A5X? If not, there's some serious doom here.
Think about it... regular dual core processors with single core PowerVR GPUs can already benefit from a dual memory bus. Apple's certainly going to hit the memory performance wall on GPU-intensive applications, unless they've doubled the memory bus width and/or speed. But it's actually worse than that: graphics-intensive games and other applications also hit the CPU pretty hard... and the dual-core CPU won't be able to keep the four-core GPU as well fed as it did the two-core on the iPad 2. Not only that, but a 4x display area increase with only a 2x GPU upgrade... tab vs. tab, the iPad 2 will be faster. OR developers will be able to run games at 1/4 resolution... the same as the iPad 2, faster than the iPad 2 where the GPU is the limiting factor, but at the same resolution. If memory doesn't become an issue.
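A rough framebuffer-bandwidth sketch for that memory-bus worry, assuming 32-bit colour and 60 fps full-screen redraws (this ignores texture traffic, overdraw and compression, so treat the figures as illustrative lower bounds, not measurements):

```python
# Framebuffer write traffic alone at the two resolutions.
# Assumptions: 4 bytes per pixel (32-bit colour), 60 full redraws/sec.

BYTES_PER_PIXEL = 4
FPS = 60

def framebuffer_mb_per_s(width, height):
    return width * height * BYTES_PER_PIXEL * FPS / 1e6

ipad2 = framebuffer_mb_per_s(1024, 768)   # ~189 MB/s
ipad3 = framebuffer_mb_per_s(2048, 1536)  # ~755 MB/s, 4x the traffic

print(round(ipad2), round(ipad3))
```

Even on these bare-minimum numbers, the Retina display quadruples the framebuffer traffic over the same memory path, before a single texture is fetched.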
Apple is the top mobile gaming platform right now, and they're increasingly making decisions to support that, more than other things. A quad core CPU would benefit nearly every application, even save battery power on lightly loaded systems, versus a quad core GPU, which is only useful on high-end games.... this is the same quad core GPU used on the Sony Vita. The Android GPUs don't entirely stink, but they have not kept up... and I'm not entirely sure I care...
I think, photoshopped image or no, that there's going to be a lot of buyer's remorse in the not too distant future. 'Course, they'd die rather than admit it publicly, but if I was gonna buy an iPad, I'd go for No. 2 personally. It felt like the entire raison d'être for this iteration was to up the display, so yeah, faster GPU to move pixels, slightly better camera, but that's it.
And I think the main problem is now battery life, they can't put out something with less battery life than the first one, so faster processor, more memory (keeping more working apps in the background), plus better graphics would just cane it.
I suspect the technical chaps over at Apple probably considered battery life. Do we really think they're going to ship this thing and then go "you're using it wrong" when people say the battery only lasts 3 hours?
They've done it before.
Google employee
As in treated their customers like chattel...
Fucked antenna design.. oh it's YOUR fault, your nasty hands are holding it wrong.
So the precedent is firmly set. Cite that.
Someone didn't take their irony pills this morning did they.
The attenuation issue was blown out of all proportion and demonstrated to be present on other phones; it affected the iPhone 4 so badly it's still selling like hot cakes.
The battery problems aren't a "You're holding it wrong". They're more of a "We've put out a software update that shows you more bars so everything's good now."
Personally, the only reason I'm getting an iPad 3 is because of the display.
I think there will be many people like me.
Has anyone been complaining that the iPad 2 isn't fast enough? At the very least, it keeps the original iPad relevant and usable.
Were you in a coma during the iPhone 4 antenna-gate fiasco, or what?
According to Apple they increased the battery from 25Wh (iPad2) to 42.5Wh on iPad3, which is a 70% increase just to get the same quoted battery life
As I mentioned earlier - heat issues anyone?
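For what it's worth, the arithmetic on that quoted capacity jump checks out:

```python
# Quoted battery capacities: 25Wh (iPad 2) -> 42.5Wh (iPad 3).
ipad2_wh = 25.0
ipad3_wh = 42.5

increase = (ipad3_wh - ipad2_wh) / ipad2_wh
print(f"{increase:.0%}")  # 70%
```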
You must've been in a coma whilst reading my reply. It was a fiasco - entirely created by the media and haters believed the hype
"You must've been in a coma whilst reading my reply. It was a fiasco - entirely created by the media and haters believed the hype"
No, I read it. But unlike you I don't have an RDF that is dialed to 11+.
"Personally, the only reason I'm getting an iPad 3 is because of the display.
I think there will be many people like me."
+1 - I'm a photographer so this should be awesome for showing shots.
But if I wasn't the screen would definitely be a luxury...
You're better than that man, C'mon. Or do you have an RDF set at 11 by Google? Do you really believe the nonsensical media hype?
How strange. I've had my iPhone 4 since a few days after its launch in Italy and, er, it not only spanked the crap out of my old Nokia 3310 in the reception stakes, picking up signals where the Nokia couldn't even find anything at all. No matter how I hold it, it still completely and utterly fails to drop a call.
Yes, cupping it in my hands and surrounding it in flesh does attenuate the signal slightly, but my Nokia 3310's signal is also attenuated in the same way! As was every other mobile phone I've ever owned! Short of rewriting the laws of physics, there's no way to get around this effect: the human body isn't fully transparent to electromagnetic waves. If it was, X-rays and CT scanners would be utterly useless!
This is first-hand experience with a phone I have absolutely no desire to replace any time soon. It is, quite literally, the best mobile phone I have ever owned, bar none. And this is from someone who's owned a Nokia Communicator 9500, a Nokia 3310, and a SonyEricsson P900.
And I'm no Apple fanatic: I've used Windows and (*BSD) UNIX too, and maintain the family's Windows machines. I even quite like Windows 7 and keep it as a pet in a small VM. My favourite OS is the combination of TOS and GEM on the late, unlamented Atari Falcon. It was a dream to code for.
The media, lacking the will to do any research at all into how phones and radio waves actually work, simply believed whatever they were told by that paragon of truth and veracity, "the internet". So, no change there then.
I have an original iPad and really wasn't going to bother. Then I saw the demo of iPhoto.
As a leisure photographer (rather than gamer or benchmark junkie) that was a killer for me. So rather than carrying a laptop with me when I go out for a days shooting I'll be carrying the iPad alongside my SLR instead.
I'll go further: for those that say tablets can't be used in any way to produce content rather than consume it, go watch the iPhoto demo and use a little imagination. You'll never write a novel or thesis on a tablet, but iPhoto simply renders a keyboard and mouse redundant. So for multimedia content I think tablets in general, and the iPad in particular, are more than capable of delivering a better tool than more traditional computing.
Most of that's probably just the display. The extra GPUs aren't much of an issue unless you're running high-end games... at which point, you don't necessarily expect all-day battery life. DRAM power is pretty minimal, and with the same CPU clock, that too could be the same.
One interesting question... is the A5X still a 45nm chip, or did they move to a 32nm design? At 45nm, that's going to be a gigantic chip, and they could definitely see power and heat issues with all four GPUs going wild. At 32nm, the chip would be about the size of the A5, and the CPUs ought to actually draw less power. That big laptop-scale battery suggests they didn't go for the smaller part... but they could be working on it. At their volumes, releasing a shrunk-down A5X might make sense even six months later, particularly if, as usual, this goes into the iPhone-Next.
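The ideal scaling for such a shrink is easy to estimate (die area scales with the square of the feature-size ratio; real shrinks fall short of the ideal, so treat this as an upper bound):

```python
# Ideal die-area scaling for a 45nm -> 32nm process shrink.
# Real-world shrinks fall short of the square law, so this is an
# optimistic bound, not a prediction for the actual A5X.

area_scale = (32 / 45) ** 2
print(round(area_scale, 2))  # ~0.51, i.e. roughly half the area
```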
I didn't actually realize how primitive iOS is... many apps are very poorly multithreaded, and you can't really multitask in the sense of, well, real multitasking. This is certainly why they didn't bother with four CPU cores -- little benefit. They're going to have to fix this at some point, or Android will find more success on the tablet, as tablets increasingly replace PCs for PC-type work... not just web browsing and games.
>According to Apple they increased the battery from 25Wh (iPad2) to 42.5Wh on iPad3, which is a 70% increase just to get the same quoted battery life<
Wow, if the iPad was just that, and maybe an SD & HDMI slot / 1GB RAM - I'd have upgraded. A tablet lasting for two or three days' hard use between charges, sweet!
DISCLAIMER: Not a photographer or iPad games player.
Depending on your SLR camera you may be disappointed, as there is a megapixel limit on the size of photo that can be imported. Can't find the original story but believe the figure to be 19MP, so full frame users are not invited. Similar details here (http://forums.macrumors.com/showthread.php?t=1335301)
"You're better than that man, C'mon. Or do you have an RDF set at 11 by Google? Do you really believe the nonsensical media hype?"
Well, if it really was a non-issue and just 'media hype' why did Consumer Reports, et al., slam the antenna design? Why did Steve tell people they were 'holding it wrong', and why did Apple hand out free cases?
They say that history is written by the victors, and these claims of a 'media stitch-up' from Apple's army of fanboys reeks of revisionism.
I'd say this is genuine... check out this benchmark on Geekbench also:
Giving fans too many hardware upgrades at once only hurts the profit pipeline, incremental is the apple way no? How else could they bash out a new model every 6 months to keep the frenzy they seem to like creating alive....
Count me unconvinced that they really care about being at the bleeding edge of tech, they only 'really' care about the $.
"Count me unconvinced that they really care about being at the bleeding edge of tech, they only 'really' care about the $."
Well, duh! Did you only just work this out today?
Last time I checked, caring about the "$" is how you run a successful business! It's the whole damned point of it all! Apple have never claimed to be a charity, or into donating money to good causes to improve their reputation. (Google seem to like doing that shit, but it doesn't change the fact that they're an advertising company with an increasingly terrible search engine.)
Apple have never claimed to be into bleeding-edge technology either. Their not-so-secret sauce is design. It's all about the integration of technology to provide a high quality user experience. That's it. That's all they've ever done. They couldn't give a shit about the technology inside their products, because their customers don't care about it either. All they care about is that the stuff works and is easy to use. Check out their customer service sometime as well; Apple tend to top the charts for customer satisfaction and customer service, because it's all part of their holistic approach to user experience design.
But, yes, it's fundamentally all about making pots of cash too. Oddly enough, it seems to be working rather well.
Also, what about the new camera? What about the improved GSM support, which now handles more of the HSPA range of protocols? Don't those count as "better" too?
You're buying a performance downgrade in fact, thanks to the GPU being twice as powerful but having 4 times the pixels to deal with.
Could be a squared relationship... benefit of the doubt? ;o)
Where is this "twice as fast" statement coming from? I've seen a quote from Apple that it's a "graphics powerhouse", and the Reg's own articles have said different things. What are the facts (backed by links, please)?
The OS hands anything screen related over to the GPU to render so the only thing that should be slower is loading/manipulating bitmaps in code.
Every 6 months? Really? Odd that, I thought the iPad 2 came out a year ago; must be my mistake. And as to incremental being the "Apple" way, care to tell us all a company who doesn't do that? Samsung, who currently have NINETEEN different Android phones? HTC, who have EIGHTEEN Droid phones? All just ever so slightly different. If you're going to bash Apple, at least try and focus on something they do actually suck at.
There's also the slight matter that the iPad 2 was the best tablet around; the iPad 3 is a slight improvement making it by far the best tablet. Apple haters, bash all you like, but if what they do is so easy then why is it that no other company can compete? Where's the iPad beater?
Slow down there, chap. As a wise man once said:
Gadget Rage is BAD.
Hoist by my own petard for not quantifying my post there.
The 6 month promo cycle thing - I was slinging all of the iDevices together. It's like they're living on an infinite loop or something....
So what if AN Other OEM makes a ton of models? let the people decide what's important to them in a device. Symbian Nokia & Asus still punting kit well above apple tech levels for your example. I bet they look nicer inside too, not that I needed another bugbear..
Sent from my Tablet S using Opera Mobile & Swype, listening to raw web audio not via itunes.
"#winning" - wtf?
Take a walk outside, see the trees, feel the wind on your face, enjoy life.
If you think your choice of tablet device, fruity or not is a win or a lose....
Well, there's an old saying that if you're sat at a poker table and haven't worked out inside 30 minutes who the mug punter is, chances are it's you.
It's highly unlikely that an improved, revised chip has the same performance as the previous one I would have thought. I suspect that these so-called benchmarks are fake. We'll find out for real soon enough anyway.
The 2x/4x graphics debate is very limited thinking. We've long passed the point where bitmap processing is the be all and end all of graphics processing.
Vector processing is king and the number of pixels is just a sideshow. Name a credible graphics technology in the past 5 years that hasn't laughed at HD+ resolutions.
I'm betting that the iPad 3 is not only easily capable of pushing the extra pixels but also more capable of juggling all the overheads associated with gaming jiggery-pokery.
Sorry but as someone who works in graphics and gaming and so on, it IS a big deal. An iPad is a low-power device compared to a gaming PC with proper dedicated GPU, and driving a 2048x1536 display on a low-power GPU could certainly lead to fill-rate problems when using shaders.
For regular apps it's not a big deal, but for games it could be... saying it's all about vector processing doesn't change the fact your pixel shading units have 4X the work to do.
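The shader-cost point can be put in rough numbers. The overdraw figure here is a hypothetical average for a 3D scene with some depth complexity, not a measured one; the resolutions are the published ones:

```python
# Fragment-shading load at the two resolutions, at a target 60 fps.
# OVERDRAW is an assumed average of shaded fragments per output pixel
# for a typical 3D scene - illustrative, not measured.

OVERDRAW = 2.5
FPS = 60

def fragments_per_second(width, height):
    return width * height * OVERDRAW * FPS

ipad2 = fragments_per_second(1024, 768)
ipad3 = fragments_per_second(2048, 1536)

print(ipad3 / ipad2)  # 4.0 -> four times the shading work
```

Whatever overdraw factor you assume, it cancels out: the pixel-shading workload scales straight with resolution, which is exactly the fill-rate concern above.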
You are missing the point. These "low power" GPUs are benefiting from 10+ years of GPU and HPC arms races, and because of their highly parallel nature they are almost pre-optimised for low power simply by cutting down the number of units from their mainstream cousins.
They were *made* for this application baby!
The ipad 2 gpu capabilities were widely acknowledged as some of if not the best in class, with plenty of headroom for the future - even if there has been a drop in headroom I expect it to be marginal.
The iPad 2 was the same 1 gig as the iPad 1, but was noticeably faster. I also feel that much of the underhanded side of Apple comes from their shareholders; Al Gore comes to mind. The shareholders are driving this company. I think that the way they are handling their current shortfall of iPad 3s is a good example. The people that preordered have already paid, so they should get their product before the store customers... which have not paid as of yet. I have not ordered an iPad 3 yet, so don't go there. I'm not sure I am yet, until I'm sure the 1 and 2 are going to become obsolete, which I expect they will.
That's overclocked. Stable production clock for that tablet is 1.3GHz. Come on, haters, don't cheat.
Btw, wanna see it get a good spanking elsewhere?
But does it have a volume control that goes up to eleven?
...this is probably the most comprehensive article you're going to find on the subject prior to anyone actually getting their official hands on a unit: