IBM Goes 'GPU-riffic' with new blade

IBM made big news on the first day of last week's GPU Technology Conference by revealing that it'll roll out an Nvidia Fermi–based expansion blade. While the product hasn't been formally announced yet (the plan is to do so in Q4), IBM had one at the show and walked me through it for the video below. It isn't a standalone server; it's a …

COMMENTS

This topic is closed for new posts.
  1. Ken 16 Silver badge
    WTF?

    I'm slow

    what's a GPU?

    1. Charles 9

      Graphics Processing Unit

      Essentially a processor built originally around the specialized computational needs of 3D graphics. Thanks to advances from nVidia and ATI, they've become more general-purpose and extremely useful outside the graphics realm. The things that give them a boost in rendering (mainly a high capacity for parallel processing) are also of use in various related mathematical problems: particularly those that, like graphics rendering, involve doing a lot of independent but similar calculations quickly.
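
To make the "lots of independent but similar calculations" point concrete, here is a minimal CUDA sketch of the idea; it is not from the article or the comment and has nothing IBM- or Fermi-specific in it. One GPU thread handles one array element, and every thread does the same tiny sum. The kernel name and the use of cudaMallocManaged (which needs CUDA 6 or later) are illustrative choices, not part of anything described above.

// Minimal sketch: one thread per element, all running the same calculation.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void vector_add(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // each thread picks one element
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                       // a million independent additions
    const size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);                // unified memory keeps the sketch short
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(a, b, c, n); // thousands of threads in flight at once
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);                 // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}

The point of the pattern is simply that no element's result depends on any other's, which is why the hardware can throw so many threads at it.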

      1. Ken 16 Silver badge
        Stop

        that's what I thought it meant

        but it just confuses me more - why would a server blade need better graphics?

        I can understand someone bumping up the graphics card to play games.

        1. Charles 9

          nVidia and ATI thought outside the box.

          What they found was that parallel computation has plenty of uses outside gaming (look at Folding@home; its strongest contributions come from GPUs). Where GPUs excel is what is called "stream computing": essentially, repetitive independent calculations that are ripe for parallelization (if you think the 4-way multitasking of modern quad-core CPUs is hot, GPUs can divide tasks 32 ways or more). Plus, in terms of raw computational horsepower, GPUs have CPUs licked: the top-end cards are reaching teraFLOP levels in single precision, and a double-precision teraFLOP GPU device is only a matter of time.

          BTW, there are serious uses for graphical computation, too. Think climate and weather modeling, physics simulations, and other forms of "what if" modeling. GPU computation can even help with professional ray tracing and similar forms of advanced 3D modeling (not to the real-time level yet, but it can still seriously cut down rendering times).
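
As a sanity check on figures like "32 ways or more" for any particular card, the CUDA runtime will report what it sees. The little query program below is an illustration added here, not something from the thread; the field names are standard cudaDeviceProp members, and the double-precision note reflects the general CUDA rule that compute capability 1.3 and above supports it.

// Sketch: ask the CUDA runtime what parallel hardware is installed.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int d = 0; d < count; ++d) {
        cudaDeviceProp p;
        cudaGetDeviceProperties(&p, d);
        printf("Device %d: %s\n", d, p.name);
        printf("  Multiprocessors: %d, clock: %.0f MHz\n",
               p.multiProcessorCount, p.clockRate / 1000.0);   // clockRate is reported in kHz
        printf("  Compute capability %d.%d (%s double precision)\n",
               p.major, p.minor,
               (p.major > 1 || (p.major == 1 && p.minor >= 3)) ? "has" : "no");
    }
    return 0;
}

Each multiprocessor schedules its threads in groups of 32 (a "warp"), which is where figures like the 32-way division mentioned above come from.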

        2. Mike007 Bronze badge

          server GPU

          I can see these being of great use to major operators, providing the computing power for their custom data-processing applications to run on. However, I have to agree that I'll certainly see no benefit from plugging a GPU into my server, which runs headless with standard workloads, because I have no software that can make any use of a GPU at all.

          This is probably the case for the vast majority of people. However, now that products like this are coming to market, I can see the potential for database servers to start supporting GPU offloading for their enterprise users, and for that to trickle down and become another standard feature. This is a software problem: cheap, high-powered hardware is everywhere as a result of the gaming industry; now we just need software to take advantage of it.

          The real question is: why can I still not use those 64 400MHz cores in my bottom-end graphics card to run a pretty beefy database server?
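
Purely as a hypothetical sketch of what "GPU offloading" for a database might look like, and not a description of any real DBMS feature, the kernel below evaluates a WHERE-style predicate over a column with one thread per row. The filter_gt name and the stand-in column data are inventions for illustration only.

// Hypothetical sketch: flag every row whose value passes a predicate, one thread per row.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void filter_gt(const int *column, int threshold, int *match, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) match[i] = (column[i] > threshold) ? 1 : 0;
}

int main() {
    const int rows = 1 << 20;
    int *column, *match;
    cudaMallocManaged(&column, rows * sizeof(int));
    cudaMallocManaged(&match, rows * sizeof(int));
    for (int i = 0; i < rows; ++i) column[i] = i % 1000;        // stand-in column data

    const int threads = 256;
    const int blocks = (rows + threads - 1) / threads;
    filter_gt<<<blocks, threads>>>(column, 500, match, rows);   // roughly "WHERE value > 500"
    cudaDeviceSynchronize();

    long hits = 0;
    for (int i = 0; i < rows; ++i) hits += match[i];            // summed on the CPU for brevity
    printf("%ld of %d rows matched\n", hits, rows);

    cudaFree(column); cudaFree(match);
    return 0;
}

Whether scanning a table this way actually beats a CPU depends on getting the data onto the card and using the result afterwards, which is exactly the software problem the comment describes.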

