Nvidia CEO outs GPU roadmap

Nvidia president and CEO Jen-Hsun Huang has outed the company's GPU roadmap. "For the very first time in the history of our company," Huang said during his Tuesday keynote at the company's GPU Technology Conference in San Jose, California, "we are going to tell you the code names and the progression of our next several generations …


This topic is closed for new posts.

questions for author

At many dozens of gigaflops, how do they expect to keep the processor fed with data? What's the memory bandwidth, and is it enough? How are they going to keep it fed when it's 3 or 4 times faster, then 16 times faster? Or do they expect it to work only with very restricted data sets (ones that can be cached locally per processor), which frankly seems unhelpful for the large data sets that, as I understand it, most HPC problems involve.

A Crysis-worthy[*] superdupercomputer starved of bandwidth slumps into being just a PC.

[*] Just thought I'd get that one out of the way, clear the air a bit.
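The bandwidth worry above can be put as back-of-envelope roofline arithmetic: a kernel's achievable flops are capped by memory bandwidth times its flops-per-byte ratio. The peak-flops and bandwidth figures below are illustrative assumptions, not Nvidia specs:

```python
# Roofline-style sketch: can memory bandwidth keep the ALUs fed?
# All figures are made-up round numbers for illustration only.

def required_bandwidth_gbs(gflops, flops_per_byte):
    """GB/s needed to sustain `gflops` at a given arithmetic intensity."""
    return gflops / flops_per_byte

peak_gflops = 500.0   # hypothetical double-precision peak
intensity = 0.5       # flops per byte for a streaming, SAXPY-like kernel
have = 150.0          # hypothetical memory bandwidth, GB/s

need = required_bandwidth_gbs(peak_gflops, intensity)

# A low-intensity kernel is bandwidth-bound: it delivers bandwidth * intensity
# flops, not the advertised peak.
achievable = min(peak_gflops, have * intensity)

print(f"need {need:.0f} GB/s to hit peak, have {have:.0f} GB/s")
print(f"achievable: {achievable:.0f} GFLOPS of a {peak_gflops:.0f} GFLOPS peak")
```

With these assumed numbers the streaming kernel reaches only 75 of 500 GFLOPS, which is the commenter's point: quadrupling compute without a matching bandwidth increase leaves the extra ALUs idle unless the working set fits in local cache.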

Anonymous Coward

When a CPU would be good and power matters

Given the speculation about the usefulness of adding a CPU/Linux combo to a GPU chip, and the clear focus on performance per watt, I'd bet money on the likelihood of an Nvidia GPU with an ARM core on it.

Since they have a significant presence in HPC, Nvidia have a chance to define the standard with such a chip.

This would have the added benefit of derailing Intel and AMD, whose competing GPU HPC products will be x86-centric.

Further ahead, when phones, PDAs and HPC clusters are all ARM-based, how long until people ask why on earth our desktops are still power-guzzling x86 gadgets?

I am sure Microsoft wouldn't have a problem bringing out Windows for ARM if they could finally get a significant piece of the mobile market for Windows too.


I wonder

How long it will take them to sort the drivers out this time :)


Aww man...

Don't care about all this gigawotsits and HPC stuff... just tell me how many more frames per second I'm gonna get in my favorite games already!


Won't find that here...

Most places reporting on this story as 'Kepler is the next architecture after Fermi' have missed the point that this roadmap covers only the compute-card space. The graph shows that before Fermi came Tesla; it's just that in this generation the GeForce gamer cards share a code name with the GPGPU cards of Fermi. Last year the Fermi GPGPU cards were likewise unveiled long before we knew anything about what the next GeForce cards would be like.

You will have to be patient to hear about the cards and architecture Nvidia will release for the mainstream, game-playing part of the market. I think the journos are too partied up at the Nvidia event to explain this properly at the moment.


living in la la land

I can remember when there was no competition for Nvidia; back then they could dictate terms to the market, and now they seem to think they still can. I changed to AMD for my graphics cards and found them a better product than the second-rate offerings from Nvidia. Get a grip, Nvidia: you're only in second place in the graphics market and have a lot of catching up to do. Your company released so many copies of the same damn graphics card under so many different model numbers that we couldn't keep track, and we the consumers remember those rip-offs. The last time you really made an innovation in a graphics card was the 8800 GT, and now you want to dictate to the market again? Try winning back some trust first.


Winter games and work amortization?

It sounds like they don't expect to optimize for a 7W desktop mode (or 1/2W, say).

I guess I plan on a 200W card running other people's algorithms to amortize its cost, so I can play games (Dwarf Fortress being OpenGL-based) on it at some point. The announcement has me watching keynotes just to get the chart, whatever it is. Not enough data points to simply put log₂(GFlop64) up as an axis, but clearly where they are, Moore's law isn't running into frequency loss, equipment complexity kerchunks, or other issues. Perhaps it will run on waste heat at some point...


