Nvidia fires off Fermi, pledges radical new GPUs

Nvidia last night introduced the new GPU design that will feed into its next-generation GeForce graphics chips and Tesla GPGPU offerings but which the company also hopes will drive it ever deeper into general number crunching. While the new chip is dubbed 'Fermi', so is the architecture that connects a multitude of what Nvidia …

COMMENTS

This topic is closed for new posts.
  1. Pascal Monett Silver badge

    One question

    Is all this marvelous architecture going to be once again put on a chip that cracks under stress? Or has the Green Goblin solved that issue without blaming anyone else this time?

  2. TeeCee Gold badge
    Coat

    Can't........resist.......

    "......Special Function Units (SFUs) which handle complex maths...."

    I take it that they do this, er, very quietly then?

  3. Frumious Bandersnatch
    Coat

    Love the terminology

    I can just imagine the new marketing literature...

    Announcing the new digital loom from the leader in clacking technology. Based on innovative new floor-space layouts and inter-loom conveyancing techniques, our new loom achieves clacking rates equivalent to 1 million Jacquard-Acres. Each warp is capable of simultaneously handling over 700 wefts, so even your most complex designs can be programmed without the need for hand weaving.

    etc...

    They may want to consider dropping that new-fangled "Fermi" name, though. Maybe call it the GargantuRood or something more in keeping.

  4. K. Adams
    Joke

    Three billion transistors...

    ... and it still won't be able to run Crysis, natch.

  5. Anonymous Coward
    Paris Hilton

    Disappointingly, I can run Crysis maxed out...

    ...on my GeForce 280 (GTX? Maybe, I forget). The model numbers aren't too memorable any more.

    I've been looking for something to stress my Windows box, which is only used for games. I think I'm gonna have to get a bigger monitor to stress it. In any case, it's in good shape for Left 4 Dead 2.

    Err, what was the point of this post again? Oh yeah... These are the days of miracles and wonders, ladies and gentlemen. We're seeing evolutionary rather than revolutionary change in these rasteriser-based GPUs as graphics beasts. As soon as Intel can get Larrabee running decent-quality DX-capable raytracing libraries (DX12? 13?), the rasterising GPU is as good as dead.

    They may not be there yet. NVidia still have the hot hand (AMD's Radeon drivers mar otherwise nice hardware). However, Intel are big, greedy and very, very rich. They'll eat NVidia's lunch when they're ready.

    Meantime, I don't feel nearly as much of an urge to track GPUs and upgrade, since it's not a huge improvement like going from TNT2U to GeForce, or GeForce 2 to 3...

  6. Nexox Enigma

    Re: Disappointingly, I can run Crysis maxed out...

    """Meantime, I don't feel nearly as much an urge to track GPUs and upgrade, as it's not a huge improvement like going TNT2U to Geforce, or Geforce 2 to 3"""

    Hmm yeah, this wasn't really an article about a GPU so much as a serious number crunching chip. And I imagine that the differences here will make it well worth upgrading from a Tesla to this Fermi deal.

    Now if they could just slap a decent FPGA or two on each card, these things would be 100% insane.

