AMD has demoed its next-generation "Trinity" processor, promising 10-teraflop notebooks based on follow-ons to its new A-series "Llano" APUs by 2020. "Two and a half years or so ago ... I brought up a bold promise: that in 2011 AMD would deliver a supercomputer in a notebook," AMD senior vice president and products-group general …
Where's the software
And what do you do with 10 TeraFlops? Email?
Seriously, the amount of compute power is increasing so rapidly that the mundane can be done with waste cycles. So, unless these super-books are only going to be used by 10 PhDs at the National labs, we need to cover some other bases.
When do we see some innovations that need that power? Perhaps some of the software people chasing YouTube with the 'next great social network' might do something more creative.
How about speech I/O (No more finger-tapping on my TomTom); bio-sensing packages for paramedics and home use; auto-diagnostic software to go with DNA read-out chips; auto-drivers for our cars; and don't forget voice-based query with real irrelevance filters for Google - I'd strangle any system that read me 200 results before I found the right one!
Of course, cloud computing and ubiquitous wireless may make all that irrelevant in my laptop. Perhaps Google should spend a few dollars and give us a voice platform and APIs for any app. That could be a windfall for them, and resolve the technology issues in one fell swoop.
"...And what do you do with 10 TeraFlops? Email?
Seriously, the amount of compute power is increasing so rapidly, that the mundane can be done with waste cycles. So, unless these super-books are only going to be used by 10 PhDs at the National labs, we need to cover some other bases..."
Computers are becoming faster, which lets us run more complex software. Compare MS-DOS plus the best word processor of its day with Win7 + Word 2010: the difference is vast. In the future there will be more than 1080p, and 3D GUIs, and you'll talk to your computer, etc. etc., all of which draws computing power. The faster the computer, the more complex the software, and the easier life will be for us.
It almost sounds like "who needs 640KB?".
unlike 640k memory
CPU raw power isn't fungible. The 640k was 20 times more than the 32k you might have found in similar-era microcomputers, and this meant the PC really could scale applications by a factor of 20.
Now the top-end i7 cores are around 100 GFlops, so this AMD chip is six times better, right? Wrong. For starters, the i7's 100 GFlops is double precision. Second, much less of the i7's performance depends on vectorizing your code.
Vectorizing your code takes significant effort and for many applications in science and finance single precision is simply not enough.
Massive single precision vector performance is great but has a very limited number of applications.
The obvious immediate use is playing games. Generating 60+ frames/second of high-res virtual reality can absorb almost limitless quantities of CPU power.
Build it and they'll come? I don't know if there are any mainstream business uses that require moving 3D synthetic imagery, but if there are and the hardware is out there, someone will write the software. I do know that there are many mainstream science and engineering applications.
RE: Where's the software...
...ummm hate to break it to you but people actually use notebooks for all sorts of things beyond email - y'know, editing video, audio, playing games.
The world is big and very interesting, you might want to get out there sometimes....
WTF because it is.
"used to test nuclear explosions " ....... "And now you're going to be able to get that type of capability in your notebook."
I don't want the capability to test nuclear explosions in my notebook - now, creating nuclear explosions might be useful!
you couldn't handle it
Gotta love almost endless CPUs in a small box. AMD is doing it right and Intel will be the ones playing catch-up.
...will it run Crysis?
Must be sending shivers down the spine of the spooks
With supercomputers like these (good luck getting a US export licence, btw) in a notebook, even 512-bit crypto keys won't be strong enough.
Oh, you had better be good at remembering passwords with at least 40 characters.
Mine's the one with a difference engine in the pocket.
Heck, you won't need a license. The chips will be made in China, stuffed into Chinese boards, and put into Chinese cases!
The spooks really need to worry that China won't export them to us!
Re Export License
Well, AMD's fab spin-off (GlobalFoundries) has a big fab in Dresden, Germany.
But yes, it does sound a bit silly to need an export licence, though many parts of the US Gov still think the Cold War is alive and kicking.
Black Choppers for obvious reasons...
Interesting. However, one wonders whether this is (likely) theoretical peak performance rather than realistically sustained real-world performance. At the moment more people seem able to harness Nvidia/CUDA than AMD/ATI - possibly more from familiarity and entrenchment - but even then, as far as I recall, achieving theoretical peak on these GPU-type setups is quite difficult.
Frankly, I would sooner see someone figure out an easier way to program today's increasingly (highly) multicore systems. CUDA is definitely not easy, and although I love C and everything, I've said it before: concurrency in it is a real bitch.
But I suspect, there will be no one 'magic bullet' solution for everything, as there are, to my mind, just so many compromises to consider in any approach.
It's actually rather clear from Anandtech's numbers that the moment AMD decides to upgrade the memory path, this little GPU will immediately enter the mainstream performance league (it already mops the floor with Intel's integrated garbage in Sandy Bridge).
Take a look here: http://www.anandtech.com/show/4444/amd-llano-notebook-review-a-series-fusion-apu-a8-3500m/3
"With 400 shader processors behind a shared 128-bit DDR3 memory interface, the upper bound for Sumo performance is the Radeon HD 5570. In practice, you should expect performance to be noticeably lower since the GPU does have to share its precious memory bandwidth with up to four x86 CPU cores.
The mobile version of Llano supports up to DDR3-1600 while the desktop parts can run at up to DDR3-1866. Maximum memory capacities are 32GB and 64GB for notebooks and desktops, respectively."
In other words, AMD is working on adding another controller/faster path, and simultaneously on shrinking a faster 6xxx-class GPU core into the same package while also upgrading the CPU cores, I bet.
If I were Intel (remember the Larrabee fiasco?) I would seriously consider buying Nvidia now or soon - NVDA will fall further as next-gen APU releases draw closer, but Intel would need 2-3 years to integrate the products - or else watch their mobile and desktop business shrink, thanks to APUs, ARM cores, etc.
That $5B AMD spent on ATI doesn't seem that much anymore, huh? :)
PS: speaking of programming GPUs vs C, have you seen the news about C++ AMP in Visual Studio?
Jim O'Reilly is right - in a world migrating to web based services this is madness.
Nonesuch is right too. Woot!
I'm going with Woot. Imagine, being able to crack MSoffice in a few seconds. And, perhaps, being able to run MSoffice and have it display letters as quickly as I can type them!
YOU, dream big...
but don't you worry. MS will have Office packed with enough bloat to keep that 10-teraflop/200-gigs-of-RAM system just a little behind your typing speed.
head in the clouds...
A lot of people would LIKE us to migrate to web based services.
But a lot of people aren't interested in going that route...
Migrating to web-based services means being at the mercy of a load of intrinsically greedy vendors, of the net being up at all times, and of some provider firm not fouling up and losing your stuff either by failure or by cyber attack. You're at the mercy of prices going way up once you've all been converted into thin clients.
Then there's the whole big brother thing: what will you still be allowed to use those online computing resources for? Surely there will be more spectacular cases of someone using some outfit's rented computing resources for some news grabbing hack, and soon there will be regulations of all kind on what you can and can't do.
And then the cloud is the harbinger of the worst scenario of all: rented software. We know vendors love it - a continual tollbooth. I prefer the current approach: I buy a package and use it as often and as long as I like, without getting nickel-and-dimed to death.
For corporations, renting software might make sense in a lot of cases, but vendors are only encouraged to offer reasonable value while there's still competition from other options like local, purchased software.
It's possible today:
Bloat is relative... I had the pleasure of installing Office 2000 onto a brand new Windows 7 laptop for an elderly friend recently. To say that Word was like greased lightning was an understatement. There was no delay or splash screen - it was instantaneous.
If it has to be M$ and if you don't intend to use the bloat of the more recent versions of Office, and most users don't, I'd strongly recommend it as the way forward.
I wonder what the battery life will be like? I guess we will all use Plutonium to give the required 1.21 Gigawatts.
the way it's going now, if we have 10 TF in notebooks, we'll have 8 TF in our mobile phones
dual-CPU + dual-GPU phones are already here
The Samsung SGS2 already outdoes a lot of lower end / small form factor notebooks for 3D power and easily equals them for video decoding AND video encoding.
And before Xmas, Nvidia has promised the next generation Tegra to have quad cores - for phones and tablets!
And Samsung's current chips have been outperforming Nvidia's current Tegra generation, so Nvidia isn't the only one doing it.
It might not be too long before you simply plug a screen into your cellphone and use a wireless keyboard, eliminating the need for tablets and netbooks both...
Motorola already built an early and slightly cumbersome version of that. I think in the future we'll just leave our mobile in our pocket, and the larger screen, with or without keyboard, will simply hook up wirelessly - you don't bother duplicating the brains of your cellphone in a tablet; it's just a screen with its own battery.
RE: the way its going now...
.... before Xmas Nvidia will announce an even more inefficient, battery-killer Tegra which, based on their track record, will lose out even bigger to PowerVR's latest v6 chip.
Nvidia makes great but crazy power-hungry desktop products and as a result they have a lot to learn about energy efficient architectures to be successful in the mobile space, I think.
Sit one of these on your lap and poof. Each of these will be powered by a black hole stuffed inside of a larger black hole.
Notebooks are already supercomputers
Just depends on what era you compare them with. A smartphone is more powerful than the first Crays.
In 2000, the world's fastest machine was 7 teraflops. My £150 graphics card is 2 teraflops...
The vast majority of the world's computing capacity is already unused. We're actually basing a business on that very premise: http://www.youtube.com/watch?v=DTKnXelDkco
Seem likely 10TF is app dependent.
Seem likely 10TF is app dependent. I doubt the CPU part will provide the 10TF, more likely the GPU part will. So apps that exploit embarrassingly parallel algorithms which themselves mostly involve ADD/SUB/MUL/DIV/MADD will do well on this promised platform.
We do not need 10TF to run Word, Powerpoint, IE, Firefox, Windows Media Player, etc. Some people may get the benefit in their Excel spreadsheets though.
10TF might enable full realism in gore spatters in first person shooters running on notebooks... and that's not a bad thing IMSNHO!
Seriously though, two points:
(1) I work with boffins. Boffins collectively are a small market segment, but every one I know would like to run simulations on their notebook at Starbucks if they could;
(2) Such computing power could enable apps that currently are impractical. What if you could just speak to your computer rather than typing or using a touch screen - that would probably need a lot of cycles yes? Who does not want a Star Trek computer?
Tux because Linux offers a bogomips measurement right in the boot up log messages!
It's a relative term
" I work with boffins. Boffins collectively are a small market segment, but every one I know would like to run simulations on their notebook at Starbucks if they could;"
I too once worked with boffins, but I think the thing is that "supercomputer" is a relative term. Sure, if you could take said notebook back to the '80s (or, as some people have pointed out, even to 2000), then yeah, boffins would get that glazed look in their eyes as they imagine running huge simulations on their home laptop.
But by 2020, the boffins will just say "10 teraflops? Why would I run my sim on that, when I can run it at 1000 times the resolution in half the time on the department's 100-petaflop cluster?"
So the term supercomputer really depends on what you're comparing it to. Even if they invented an exaflop laptop, some smartass boffin would lash 1000 of them together and create a zettaflop cluster...
that would be cool *eyes glaze...*
"Not quite a boffin"
But I would say the boffin wanting to hang out at Starbuck's is questionable at best :P
Still, if there is no real significant demand for having the grunt on one's lap (i.e. responsive, somewhat real-time data visualisation - graphics grunt), one could, as you say, "run it on the department's 100 petaflop cluster" remotely, still in the comfort (?) of Starbucks.
I have a high-powered lappie (desktop replacement), and by no account is it small, quiet or cool.
Personally, there is much to be said for a small lappie/netbook with a good network connection. You don't even have to have all your data with you personally - for the average user, just make sure your server is up. That way your netbook can have a superfast SSD and not be saddled with too much pr0^H^H^H data.
(By now you've probably figured out I do not have great fondness for Starbucks!)
Must surely be the new unit of computing power.
not only that it's currency in some nations
I'd suggest it's not quite the supercomputer-in-your-pocket that AMD would like us to think, although it doesn't look all that shabby.
Each elephant is worth....
0.58 teraflops. in that case...
should just about boot Windows 10, then
There's always meaningless icons to animate.....