Nvidia's move to go public with its four-core processor plans comes, some readers will recall, just a month or so after it was claimed Apple's second-generation iPad would sport just such a chip, and weeks after it was suggested the iPad 3 will be out as early as the autumn. Nonsense, said some. Apple wouldn't release iPad 2 this …
Seems like I will never buy an iPhone or fondle slab - not because I don't like them, but because you must always wait for model n+1. I just can't be arsed any more.
Nothing new there....
Please consider essentially the entire history of computers.... notably CPU speeds and the repeated reinvention of the motherboard to plug them into. Also the iPhone, if you want something more recent or less geeky.
All you can do is keep an eye on the future and buy in the present when there is nothing (affordable/appealing) in the future.
Personally I pity all the parents who bought their kids the "latest" DSi XL last Christmas, only to pick up the spring Argos catalogue, see the 3DS ads and hype, and know they'd be bugged for one come Xmas 2011.
Seems more likely the iPad 3 is the iPad 2, and so the iPad 2 won't be out until much later in the year.
I don't want an iPad
But why can't the other tablet manufacturers use decent aspect-ratio screens, instead of the stupid 16:9 or 16:10 jobs they all use now? I want 4:3 in 7", which would be good for eBook reading as well as webby stuff.
Yes yes yes yes yes ... [ad nauseam]
Why do they all think that a 16:9 ratio is suitable for anything other than a TV?!?
Gotta make excuses for Cupertino...
> Why do they all think that a 16:9 ratio is suitable for anything other than a TV?!?
Actually, it works just dandy for a normal desktop, even web browsing. I don't know how anyone could go back to 4:3 considering that standard paper sizes are much closer to 16:9.
Back when dinosaurs roamed the earth they actually had such "titled widescreen" monitors sold as "full screen" monitors.
I don't really get the problem with this - the average paperback book is 8" x 5", which is exactly 16:10.
As far as the web goes, I tend to find high resolutions cause more problems than the ratio itself. I'm using a 2560x1440 screen currently, and there is a rather embarrassing amount of space at the edges of most websites, and page zoom is... temperamental.
Personally, I couldn't read a whole book on a backlit tablet anyway. I tried, I got a headache, I gave up. I'm tempted to get a Kindle, but it seems like quite a lot for a device purely for reading books. I already carry enough gadgets most places without adding another.
I guess I'll just stick to stained trees for now.
And while they're at it...
I'm not attractive enough to look at my face in the mirror for hours...
...Paris can still use the front facing camera
Standard paper sizes
Standard paper sizes are sqrt(2):1. That's a lot closer to 4:3 than to 16:9.
Of course, most films worth watching are 4:3, and the only things worth watching on TV are the occasional film, so I'd stick with 4:3 for TV, too.
To watch "The Good, the Bad and the Ugly" at 2.35:1 you probably need to go to a cinema ...
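For anyone who wants to check the paper-ratio arithmetic, a quick sketch (plain Python, nothing thread-specific assumed):

```python
import math

# ISO 216 paper (A4 and friends) has a sqrt(2):1 aspect ratio.
paper = math.sqrt(2)        # ~1.414
four_three = 4 / 3          # ~1.333
sixteen_nine = 16 / 9       # ~1.778

# How far is each screen ratio from the paper ratio?
print(abs(paper - four_three))    # ~0.081
print(abs(paper - sixteen_nine))  # ~0.364
```

sqrt(2) sits about 0.08 away from 4:3 but about 0.36 away from 16:9, so ISO paper really is much nearer 4:3.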
Wonderful bit of FUD
"Do you really want to get your iPad 2 pre-order in now?"
Reminds me of the old days, when Microsoft would announce an EVEN BETTER product than the one actually being presented, only to never make good on their words. Except in this case it's a supposed collaborator sticking in the knife. So, let's all chant together: "No thanks, Steve, we won't buy your iPad 2, because we're told something better is coming along real soon!"
I'll just file this under 'hyperpunditry'. Where 'hyper' stands for 'hypothetical' or 'hypocritical' I leave for you to decide.
You do know that "hyper" (above) and "hypo" (below) mean exactly the opposite of each other, don't you?
I thought that there were dual-core and maybe even quad-core ARM chips mentioned as upcoming from companies other than NVIDIA. I would have thought that a part from a conventional ARM chip supplier would be much more likely to be used by Apple than one from NVIDIA.
Of course, if they're planning to increase resolution, and only the Nvidia part would do the job... and, of course, since Apple didn't rebrand the Motorola 68K, PowerPC or Intel processors in its Macs, I don't see that they have to rebrand the CPU in the next iPad just because they did with the last one. Especially if new anti-jailbreak tech means that knowing what the CPU is no longer helps.
They branded the A4 because it is a custom chip with iPad-specific circuitry added. ARM is a core design, not an off-the-shelf chip like the 68K, PPC and x86 parts are.
Nvidia graphics unlikely
Apple's current processors use PowerVR graphics, and I think a transition to an Nvidia GPU is unlikely. I've coded for both the PowerVR and the Nvidia parts (both exist in various Android devices) and there are some significant differences, despite both claiming to support the same versions of OpenGL. This is not the sort of thing Apple would do without a really strong driving force, and I don't see one; the not-yet-available Nvidia four-core chip might be fastest on paper this week, but something else (OMAP5, etc etc) will leapfrog it next week.
The rumoured iPad "4 core" chip was not rumoured to be quad-core at all. It was supposed to be a dual-core CPU and a dual-core GPU - four in total.
And a GPU powerful enough to drive an iPad with a Retina display doesn't exist? It was rumoured to be an Imagination SGX543MP2 (MP2 meaning dual-core). The 543 has 2x the pixel-pushing power of the 535 according to the spec sheet, and scales pretty linearly with multiple cores. That would give it 4x the performance of the iPad's GPU, driving 4x the number of pixels. This chip has been around for a while.
Of course, Apple may not actually move to the rumoured 2048 x 1536 display at all. The most compelling evidence is:
"iOS coders can add two sets of graphics to their apps: one set plainly named for the 'old' resolution and a second, double-size set with "@2x" suffixed to the filename"
However, it's far more likely they'll move to a 16:9 or 16:10 ratio, since that's the current fad and they'll want to push it as a multimedia/video device for "Full HD" viewing. Granted, a 1920 x 1200 panel (16:10), the ratio used on Mac desktops (a possible key-in for portability of desktop apps?), is 2,304,000 pixels against the current iPad's 786,432 - nearly 3x the pixel count, and an awkward fractional scale per axis. Still, some liberal application of marketing rounding, consideration of "Gen 2", and the programming consideration that "@2x is simpler than an odd fractional scale" let one start to string together some compelling considerations.
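The pixel arithmetic in this thread is easy to verify; a quick sketch, taking the original iPad's 1024 x 768 panel as the baseline (the only assumption here):

```python
ipad1 = 1024 * 768     # 786,432 pixels on the current iPad
retina = 2048 * 1536   # 3,145,728 pixels: both axes doubled
wuxga = 1920 * 1200    # 2,304,000 pixels (16:10, as on Mac desktops)

# Doubling each axis quadruples the pixel count, which is exactly why
# a single double-size "@2x" asset set would cover the rumoured display.
print(retina // ipad1)          # 4
# A 16:10 switch would be a messy, non-integer scale factor instead.
print(round(wuxga / ipad1, 2))  # 2.93
```

The clean 4x (2x per axis) is what makes the double-size asset scheme plausible; a 16:9 or 16:10 move would leave developers scaling existing artwork by an awkward fraction.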
Do you really want to get your iPad 2 pre-order in now?
Nah, I'll wait until the iPad 3 is available and the rumours for the iPad 4 are doing the rounds.
With tech advancing so quickly these days, it is more true than ever that whatever you buy is instantly out of date. If you keep waiting for the most up-to-date version, you'll never buy anything.
I'm sceptical that Apple would go with this; first, it would almost certainly drive up the price, which is precisely what Apple doesn't want to be doing right now; second, the Tegra 2 and 3 lack NEON (ARM gives implementors of A9s a choice between NEON or a fancy FPU), which is present in current iDevice chips, is used by developers, and will be present in OMAP's dual-core part and probably Samsung's.
On the matter of rebadging, by the way, the situation with Samsung is a bit special. Intrinsity, which is now owned by Apple, collaborated with Samsung on the 1GHz A8 core used in the Hummingbird and A4 (the only apparent difference between these two is the GPU). At the time this work was done, normal A8s were around the 600MHz mark; the Qualcomm Snapdragon does _not_ use an A8, but rather a custom ARMv7 design of Qualcomm's own devising. As Intrinsity was part of Apple by the time the iPad, the first Hummingbird/A4 user, launched, it's not at all surprising they could swing this branding. There's no way they could do the same with Tegra, but they mightn't necessarily care; previous iPhones used Samsung and TI-branded SoCs.
"I'm sceptical that Apple would go with this; first, it would almost certainly <STRIKE>drive up the price</STRIKE> cut into their margins, which is precisely what Apple doesn't want"
Fixed the first sentence for you :)
Despite my snark above, I find your comment elucidating. Thanks for the explanation.
Quad-core iPad not likely in 2011 unless they want the Freescale i.MX 6 quad A9 in there
As usual, the Mac heads need to find more news to keep them relevant after MWC
and the Honeycomb launch on the Samsung Galaxy Tab 10.1 etc.
A quad-core iPad is simply not likely in 2011 unless they want the Freescale i.MX 6 in there,
and they won't OC.
Freescale is the only ARM A9/NEON quad-core vendor that has gone on record to say they are sampling these any day now, and they will probably have products on shelves by Q3 2011.
The quad ARM A9/NEON will apparently be going for just over $20 a pop.
By Eric Brown 2011-01-03
"The i.MX 6 series uses 40nm fabrication and provides low power draw and advanced power management capabilities, says Freescale. The SoC is claimed to enable 1080p video (single stream) with only 350mW consumption. As a result, the i.MX 6 series can deliver up to 24 hours of HD video playback and 30-plus days of device standby time, claims the company." etc.
"by launching new single, dual, and quad-core chips at the same time, Freescale will be among the first companies to make quad core ARM-based chips available. All three new chips will begin sampling in the second quarter of 2011, and the company expects devices using the new chips to hit the market before the end of the year."
Also, regarding the display: Texas Instruments' Brian Carlson put the OMAP5 spec as ready for Christmas 2012, at least 18 months away.
He is the only one I could find who stated anything like those super-HD resolution capabilities, and that was 2K, not 4K.
http://armdevices.net/2011/02/14/texas-instruments-talks-omap5-omap4 around 2 minutes in
Are we all 'tag whores' then?
"Nvidia wants to be a key player in the mobile gadget chip market, and it's up against some stiff competition from the likes of Samsung and Qualcomm. Selling lots of chips to Apple is good, but if Apple pretends it designed them, that won't persuade other manufacturers to buy Nvidia's technology."
I'm sorry, but if I know that Apple's ChipFoo (insert whatever silly brand name you want) is really Nvidia, I don't think I would be any more hesitant to buy Nvidia chips. I believe that those who make chip purchases know what they are getting.
Paris because she is a tag whore ... :-P
Wonder what the battery life would be like? Or would it weigh a ton due to having a massive battery?
Samsung is backing ARM/NEON and especially Mali T604 at 850 milliwatts
Actually, Ian etc., Samsung is backing ARM/NEON and especially the Mali T604 GPU at 850 milliwatts, and has been officially since November 2010.
Mali is designed to work with ARM's latest CPU core, the Cortex-A15, which targets smartphones, tablets and even servers. Up to 16 2.5GHz cores can work together for these larger systems.
Mali T604 will be compatible with Microsoft’s DirectX 11 and with OpenCL 1.1, both programming frameworks for parallel processing over multiple cores.
The inclusion of DirectX 11 aroused speculation that this programming technology would soon be supported fully in Windows Phone 7."
... of the article
I want a Psion-sized machine I can run make -j4 on, with CUDA and a 20-hour battery life, thank you please.
What I want to know is...
When is someone going to build a mini-ITX board with a quad-core ARM and an Nvidia GPU on it?
Sounds perfect for an ultra-low-power MythTV backend/frontend.....
Ipad3 to be powered by mystery power source from alien asteroid
I'll wait for the "iPad X", it'll be worth it! 32 core chip, full immersive 4D graphics with pure surround sound and tactile suit attachment!
Jeez, when is this tech willy-waving going to slow down? I can fully understand why people simply wait two generations, then get whatever it is second-hand off eBay for 20% of the RRP it had when it came out.
With so many Retina/iPad-res apps already calling themselves HD, what's a person to call his/her iPad 3 or, heaven forbid, iPhone 10 application?
For those who've never thought about it too deeply, choosing the correct suffix is all important in the business of wooing customers to click on your app icon.
I'm all in a fluster now.
I am no fan of the fruit company but they are not.......
....idiots. At least I don't think so. The iPad is a premium product - well, the prices are premium at any rate. Were Apple to release iPad 3 only six months after their "geniuses" led large queues of customers in community dancing and singing outside fruit emporia all over the known world on iPad 2's launch day, those customers - who had that day parted with a reasonably large amount of their hard-earned, only to discover six months later that their coolness rating had been sent south by the aforementioned greengrocer - would be (we shall say cautiously) fucking pissed off. I do not think The Man From Cupertino can be that stupid. What is Apple's usual sort of release cycle? I think we are far more likely to see something along those lines.
@ Fuzzy Wotnot
You forgot to mention the iPad X direct neural interface, which, due to insufficient pre-release testing, will require early adopters to retrofit themselves with a black polo-neck to hide the socket...
"They can make a great iPad3 Kal-El, they wish to. They only lack Steve Jobs to show the way. For this reason above all, their capacity for closed marketplaces, I have sent them you, my only GPU"
Let's return to reality, please
Let me make this very clear - a 2048 x 1536 screen-resolution tablet Will. Not. Happen. Ever. There is absolutely no reason for it and about 1,000 against it. To name a few:
1. There is no media content that can make use of it, so processing power would have to be used to upscale everything.
2. Nobody, bar the extremely anally retentive, complains that a 24" monitor running 1920x1080 is blocky or poor resolution, so how would a screen less than a quarter of that size be improved by pushing the resolution beyond 1080? 99.9% of people wouldn't be able to tell the difference, so why waste money on it?
3. Such a display would be way beyond the cutting edge of display technology, and Apple have never even strayed into the cutting edge with their products, let alone beyond it. Apple take well-designed mainstream tech and package & market it well. Why should they suddenly change their whole method of working?
4. Running any kind of 3D game (portable gaming is an area Apple are clearly moving into) at that resolution would require a 400W+ power-sucking full-size desktop GPU.
5. There are much better & cheaper technologies on the market that can improve the image quality of a screen than just ramping up the resolution, so why waste resources on that approach?
Now I could see a 10-12" screen maybe going as high as 1920x1080 in the future, but not for the next couple of gens.
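Point 2 above can be put in numbers. Pixel density is the diagonal pixel count divided by the diagonal size in inches; a quick sketch, assuming the current iPad's 9.7in panel size carries over:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 24)))   # ~92 ppi: the 24" desktop monitor
print(round(ppi(1024, 768, 9.7)))   # ~132 ppi: the current iPad
print(round(ppi(2048, 1536, 9.7)))  # ~264 ppi: 2048 x 1536 at 9.7"
```

Whether ~264ppi is distinguishable from ~132ppi at arm's length is the real argument; the density itself would be nearly triple that of the desktop monitor the commenter cites.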
....does a (mostly) non-multitasking OS need multiple cores?
Look how quickly my iPad does nothing.... oooooh aaaah mmmmm
Multi-tasking <> multi-threaded...
Back to earth please.
Apple don't use Tegra in the iPhone or the iPad; they bought a company so they could at least look like they design their own chips.
A quad-core ARM chip still won't drive games at that kind of resolution - even the latest (and massively power-hungry) Nvidia graphics cards have trouble driving a game at that resolution. Time somebody learned at least a small amount about computer hardware before writing an article about it!
Doesn't matter: duo, quad, hexa or octa cores
Duo core, quad core, hexa core or octa core - it's pointless if the power consumption problem isn't solved and heat dissipation is still an issue; it will be a failure. That's why Windows-based devices spent two decades failing to popularise a palm-sized PC, until the iPad re-ignited the trend.