AMD claims 'fastest graphics supercomputer ever'

IT vendors seem to believe that if you say the word "cloud" enough, a marginal business idea will yield up revenues and profits. Much the same thing happened with grid computing - remember that? - before it morphed with utility computing into the even more nebulous cloud moniker. So it is with the AMD Fusion Render Cloud, …

COMMENTS

This topic is closed for new posts.
  1. Xander Duffy

    wow.. in more ways than one...

    Would it be possible for World of Warcraft to purchase this type of server and deliver higher-quality graphics to their customers without taxing their users' computers further, or am I missing the point? If not, will more be seduced into the cult that is WOW? May El Reg save us all...

  2. Paul Murphy

    Mobile gaming anyone?

    Given a fast enough link from a mobile device to the 'gaming server farm', could mobile FPSs be the next big thing?

    Would it be possible to have LAN parties on the daily commute (on the trains, anyway; it wouldn't be sensible when driving)?

    Hmm, I wonder what other possibilities will surface?

    ttfn

  3. Eddie Edwards

    Reality check, guys

    You can't deliver high-quality graphics in real time over the internet. Nothing you can compress and deliver will be remotely comparable to the output of even the shittest mobile phone GPU.

  4. ben
    Thumb Down

    Great,

    Now, does anyone have a pipe fat enough to deal with the huge amount of upstream and downstream goodness?

    Might be good for gaming cafes; other than that, I can't imagine it will have much appeal.

  5. Justin Clift
    Happy

    Folding@Home

    Sweet. If the unused GPU capacity is put towards something like Folding@Home, they'd almost double the present number of ATI GPUs in use (1,007 at the time of posting):

    http://fah-web.stanford.edu/cgi-bin/main.py?qtype=osstats

  6. E

    Jargon Alert

    Timothy:

    It is not "rendering" anymore. It is now "visualization". More syllables and a 'z' make it better!

  7. Justin Clift
    Happy

    Oops...

    Oops, I read the wrong column at the Folding@Home website. They've already got over 9,000 ATI GPUs in use (no idea of the models, though). Oh, and 16,000 NVIDIA ones too. Still, an extra thousand Very High End ones probably wouldn't hurt. :)

  8. Morten Bjoernsvik

    Remote display rendering works, but needs some bandwidth

    SGI used to have a product called VizServer, which linked a laptop or desktop to an Onyx that did all the OpenGL computation and rendering (a remote OpenGL client). The finished rendered images were then pushed back to the client's framebuffer. This worked OK over local gigabit Ethernet.

    There was no way of getting uncompressed image quality for reasonably sized displays:

    1280 x 1024 x 24 bpp x 60 Hz / 8 (bits per byte) / 1024 / 1024 = 225 MByte/s

    Adding compression created some latency and artefacts; usually it looked like watching an MPEG-2-compressed movie (720p was OK). (MPEG-4 was too heavy for the old MIPS processors.)
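
    For anyone who wants to redo the sum, here it is as a quick Python sketch (the 1080p line is just an extra example for comparison, not something VizServer drove):

        # Uncompressed framebuffer bandwidth, back-of-the-envelope.
        def framebuffer_mbyte_per_s(width, height, bits_per_pixel, refresh_hz):
            bits_per_second = width * height * bits_per_pixel * refresh_hz
            return bits_per_second / 8 / 1024 / 1024  # bits -> bytes -> MByte

        print(framebuffer_mbyte_per_s(1280, 1024, 24, 60))  # 225.0 MByte/s
        print(framebuffer_mbyte_per_s(1920, 1080, 24, 60))  # ~356 MByte/s at 1080p60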

  9. Anonymous Coward
    Happy

    Re: wow.. in more ways than one...

    Even if you had a big enough pipe, the latency would kill you. If you've ever played WoW you know how frustrating it is when latency alters the position of your colleagues or the monsters. WoW tries to cope with that by handling most calculations (for fights etc.) locally and syncing with the server. That way it doesn't matter that it takes 100ms to talk to the server: your 0.5s cooldown isn't actually delayed at all. With rendering done remotely, though, that local benefit disappears.
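
    Roughly, as a toy Python sketch (invented names; nothing to do with Blizzard's real code), that local-first trick looks like this:

        import time

        class PredictedAbility:
            """Apply the cooldown locally at once; reconcile with the server later."""

            def __init__(self, cooldown_s=0.5):
                self.cooldown_s = cooldown_s
                self.ready_at = 0.0  # local estimate of when the ability is usable again

            def use(self, send_to_server):
                now = time.monotonic()
                if now < self.ready_at:
                    return False  # still cooling down locally, no server round trip needed
                self.ready_at = now + self.cooldown_s  # predict: start the cooldown immediately
                send_to_server({"action": "use_ability", "client_time": now})
                return True

            def on_server_ack(self, server_ready_at):
                # The server stays authoritative; correct the prediction if it disagrees.
                self.ready_at = max(self.ready_at, server_ready_at)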

    WoW doesn't even really hammer good current graphics cards, and the style of graphics doesn't really need it anyway (I remember killing Vaelastrasz in the original WoW on my old computer as the main tank with 3fps - but my newer cards all do 60fps without a problem on every fight).

    This is all about rendering movies or cut sequences for games - basically pre-rendering. Presumably playing it back to a single box is so that the designer/editor can actually see what they are doing in near real time, which is a big goal for modern film editors. Peter Jackson got a render farm for Lord of the Rings big enough that he could review overnight changes to the sequences they were working on. Doing that with a few seconds' latency, but effectively in real time, would be a big benefit.

  10. Rob Dobs
    Stop

    Misunderstanding

    I think there is a general misunderstanding. This is a rendering farm, not a gaming farm. Graphics developers and film producers like Pixar could rent time on these machines instead of upgrading or buying graphics workstations. This is for 3D modelling and creating effects like fire or fur in an image or film. It's not a bad idea, but I wouldn't expect them to get too much revenue out of it; like other cloud concepts, and almost everything else, it's cheaper to do it yourself than to pay someone else to do it (plus their profit/overhead).

  11. Steven Raith

    @Rob Dobs

    I think you are overthinking this - the real benefit would be to corporates and local government who want to pre-render presentations and tech drawings [product design etc.] in near real time without splashing out on a quad-processor, twin-GPU box costing £5000 every two years. With the right pricing model, there's a pretty decent market there, I reckon.

    Think about it - you have a product you want to show off and you are short on funds. You can either pay £3000 for a mid-level rendering station that will take a whole weekend to create a [possibly crap, and requiring another week of reworking] video, or you could pay £500 to rent CPU time for a week and have the results back in hours... once you have paid the AutoCAD guru, natch.

    For example, I built a machine to render municipal developments, including traffic calming and road layouts, because they wanted to show a video of it to get funding for the physical work. The machine cost over £2500 to build. Instead, they could have done the CAD work on a boggo workstation with a consumer GPU in it, uploaded the animation details to a server farm for [one would expect] a lot less outlay, and downloaded the resulting film at low res to test it within an hour - rather than rendering the scene overnight, realising it had glitches/wrong angles, and doing it again - wasting time and money.

    Which is not to say that Pixar etc. wouldn't be interested, of course, but I think the mid-to-low-end market would be a better one to aim at!

    Interesting stuff.

    Steven "misses building massively specced dual processor and monster GPU workstations for local authorities so he could test them with Battlefield 2" R
