Nvidia drops veil on game-changing might of VGX

What’s a “holy crap” moment? For me, it’s when I see or hear (or do) something that has far-reaching and previously unforeseen consequences. I’ve had at least two of these moments (so far) at the GTC 2012 conference. The first was when Jen-Hsun Huang, in his keynote presentation, tossed up a slide about Kepler and this new thing …

COMMENTS

This topic is closed for new posts.
  1. zanshin

    I think it's a game changer

    It's an early but important step. I think it's likely to be the first real step onto a slope that leads not only to enterprise VDI but also to consumer VDI, where people who want more than a browser-enabled media device (like a tablet) can *still* use just such a device to access remote compute resources that run in remote data centers. More than any other tech we've seen yet, it allows gaming and other locally resource-intensive programs to become remote apps.

    That has the potential to accelerate the existing trend of consumers shifting to lower-power, more portable devices, which has implications for the costs of heavier-duty kit. If almost no one is buying full-on desktop PCs, they'll become expensive niche products, assuming anyone sells them at all. (I do assume there will be some legitimate need for them, and someone will meet it.)

  2. Anonymous Coward
    Thumb Up

    Virtual desktops...

    We've started implementing them at work in fairly large numbers. The reasoning initially was that we have a LOT of very old PCs that need replacement, but not that much money to do it. Instead, we spend the money on a serious beast of a server with a big chunk of SSD storage, and we keep those old PCs and run a virtual desktop on them.

    I have to say, I was a bit sceptical at first. I run VMware on my Mac for the few Windows tools we still need, and that (on a pretty powerful computer) is great. But 100 virtual PCs running on a single server, streamed to some pretty ancient equipment over the network?

    Having now used it for a bit - I'm seriously impressed, and I hope we roll this out pretty much everywhere, and fast. It works brilliantly! We set up one desktop image, tell it to provision 20 PCs with it, and assign it to some desktops or users. They log in, and get a nice, fast Windows 7 desktop on an 8-year-old PC. Need another desktop with different apps? Just clone it, install software, send out. Ballsed-up a software update? Roll back to the previous snapshot, send it out - users get a "PC will restart in 5 minutes" warning, and five minutes after that the whole network of PCs is fixed.

    On the downside, I think the techies should start fearing this. It's going to wipe a lot of their jobs out.

    Oh, and the VGX thing? Not seeing the extra benefit, aside from gaming + video work and the like. Or maybe it'll be better for iPads etc. We don't run windows on those, as a rule :)

  3. Anonymous Coward
    Anonymous Coward

    Hmm..

    I've never been able to make the costs of virtual desktops work out - it's fine if you use Presentation Server / XenApp but VDI / XenDesktop type setups have always worked out more expensive (TCO) than new PCs.

    Part of that has been down to the exceptions - CAD users, multimedia users, senior management twits and the like. As long as you still have to support them you have to retain expensive parts of the support and infrastructure.

    If VGX meant that all the exceptions could be eliminated it really might be a game-changer and could have knock-on benefits for data security too.

    Would need to see some licence costings and indications of the back-end hardware requirements (data centre space / power / cooling costs are a killer for this sort of setup).

    I'm sure printing will find some way to stuff it all up if it does look like the ideal solution.

  4. twelvebore
    Coffee/keyboard

    From thin to thick and back again

    21st century X-terminals then.

    Didn't SGI (Silicon Graphics back then) push this sort of stuff a couple of decades ago?

  5. Sir Sham Cad

    Online Gaming

    For any game that's hosted online (MMOs especially) this will have just made them completely platform-agnostic. No segregation of console/PC users, which will extend your potential player base and market. No minimum spec required to run the latest game. No need for me to buy the latest GPU in order to handle raid lag.

    The flipside? A lot of these games have recently gone Free To Play with micropayment options for extra content/in-game items/fewer account restrictions in order to monetise a shrinking player base. A big, fat VGX-capable server farm hosting your games so I can experience the game at its best on my old PC at home is going to cost a boatload more to run than you can get from micropayments. The subscription model will have to return and, unless you forgo the main benefit of no platform segregation or limitation, it'll have to be a single-tier "pay up or piss off" subscription which, in the current state of the market, is a step backwards.

    I'd pay, though.

    1. mark 63 Silver badge
      Thumb Up

      Re: Online Gaming

      With that in mind I can't wait to show those console-using joypad jockeys how an FPS is meant to be played - with a mouse and keyboard!

  6. Jim_aka_Jim

    What happens when Diablo IX comes out? I have to worry about two different companies underestimating server load.

    So for gaming, no thanks.

    Sorry, correction:

    So for Gaming, no thanks.

  7. daz disley
    Coat

    160ms is an age ...

    160ms latency is under ideal conditions, right? So... once latency is below ~40ms (i.e. a single frame at 25fps) it'll be a lovely thing... until then, I'm not so sure. Lag is a pita, even when backed up with a ship-load of grunt...
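
    Rough arithmetic behind those figures (purely illustrative):

    ```python
    # How many frames a 160 ms round trip spans at common frame rates.
    LATENCY_MS = 160

    for fps in (25, 30, 60):
        frame_ms = 1000 / fps
        print(f"{fps} fps -> {frame_ms:.1f} ms per frame; "
              f"{LATENCY_MS} ms latency spans ~{LATENCY_MS / frame_ms:.1f} frames")
    ```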

  8. JustNiz

    This is how it will go

    This demo was obviously running on a LAN, which will not be the case in real-world use.

    By the time you factor in the fact that your ISP only thinks in terms of download speed and not latency, and that internet connections are inherently bursty, and that the image itself will have been lossily compressed to hell so the fine detail will inevitably be lost, the whole experience will never be as good as running the game locally. Think Netflix vs local Blu-ray, plus jitter.

    Most consumers are too low-brow to notice quality though, and I fully expect to see a whole new breed of consoles that are effectively just thin terminals fully take over and become the mainstream gaming platform.

    Manufacturers will love the relatively cheap cost of parts compared to making a fully featured console but almost certainly won't pass the savings on to the end-user, as we are already conditioned to pay $399 for a console.

    Software houses will love the fact that end users never get an actual copy of the software (so no pirating). I wonder what they will blame low sales on next.

    Distributors will love the fact that they can charge users again and again to play the same game.

    These 3 groups will drive this to replace all current gaming regardless of the fact that it's totally worse for the end-user. The populace will just buy this en masse because they've been told to by the advertising.

  9. Adam T

    Anything's better than nothing

    I'm writing this on my MacBook via my iPad via Splashtop. I can switch to my PC and play any of my PC games, MMO or otherwise, and the image quality is pretty good. This is over local wifi of course; when I try doing the same from the office, it's a different story.

    If someone can improve image quality and responsiveness without relying entirely on faster connections (the reality is, they will NEVER put cable where I live, 20 meters from the beach in a stuck-up village), then it can only be a good thing.

  10. zanto
    Coat

    yes yes! but......

    Can it run Crysis or Windows Vista?

    (Mine's the one from the last century)

  11. Anonymous Coward
    Anonymous Coward

    Don't get it.

    It's just swapping the X server (or RDP for that matter) for a video stream, isn't it?

    The end result is that you need a network that can support as many video streams as machines.

    Yeah, typical desktop apps might work, but then X Windows etc. already works quite well for that stuff.

    1. mangobrain

      Re: Don't get it.

      I don't get it either. They've reduced the latency to "only" 160 milliseconds - anyone here remember the bad old days when your ping would rise to that sort of level in Quake 2 and you'd be left dead in the water?

      Sure, there might be some clever trickery in the drivers to push the image stream out directly rather than have to screen-scrape it after the fact, but it's nothing you couldn't replicate at various levels of the stack. In fact, it's already been done on Linux, in a far more generic way via "virtual CRTCs": http://www.phoronix.com/scan.php?page=news_item&px=MTAxMDk

      Basically, instead of hooking a GPU up to a physical display, you tell it "there's one over 'ere, guv, honest", then scrape the framebuffer off into whatever pipeline you so desire. For example, a video compressor and a streaming server. "Real" hardware accelerated 3D for VMs is also already covered by the open-source stack (http://wiki.xensource.com/xenwiki/GPU_pass-through_API_support), though presumably NVidia have managed to lift the "one VM per physical GPU" limitation.
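
      Conceptually, that pipeline is just the following loop (every name here is a hypothetical stand-in, not a real driver or library API):

      ```python
      # Conceptual sketch of the "scrape the framebuffer, compress, stream" loop.
      # All component names are hypothetical stand-ins, not a real driver/library API.
      import time

      TARGET_FPS = 30

      def grab_framebuffer() -> bytes:
          """Stand-in for reading the virtual CRTC's framebuffer."""
          return b"\x00" * (1920 * 1080 * 4)  # dummy 1080p RGBA frame

      class DummyEncoder:
          """Stand-in for an H.264 (or similar) video compressor."""
          def encode(self, frame: bytes) -> bytes:
              return frame[:1024]  # pretend it compressed down to 1 KB

      class DummySender:
          """Stand-in for a streaming server pushing packets to the client."""
          def send(self, packet: bytes) -> None:
              pass

      def stream_desktop(frames: int = 90) -> None:
          encoder, sender = DummyEncoder(), DummySender()
          interval = 1.0 / TARGET_FPS
          for _ in range(frames):
              start = time.monotonic()
              sender.send(encoder.encode(grab_framebuffer()))
              # Pace the loop so we don't exceed the target frame rate.
              time.sleep(max(0.0, interval - (time.monotonic() - start)))

      if __name__ == "__main__":
          stream_desktop()
      ```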

      Kudos to them for bringing it all together into a finished product, but give it a year or two, and you'll be doing this with KVM or Xen out of the box on Fedora, with open-source all round.

  12. Ammaross Danan
    Boffin

    Oh fun

    "but it’s not as much as, say, thousands of users downloading and uploading documents all day long."

    Just as a correction for you: thousands of users sporadically downloading/uploading ~100KB documents or even 3MB files is not even close to the same network usage as those same thousands of users streaming H.264 at 1080p (or at least 1280x1024 if you're still using old square-ish monitors). You'd be hard-pressed to have 100 users eat 5Mbps continuously (using fat PCs), whereas an H.264 stream of their desktop could easily run 500Kbps/user (a total of ~50Mbps aggregate in the 100-user scenario). In the best case, you'll have people typing a document or idling at a desktop (reading), thus near-zero traffic, but scrolling a webpage or flipping windows would burst their streaming usage. Just imagine what would happen if a company-wide email was sent and everyone clicked to open it and your network wasn't designed to handle the max-conceivable load....
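
    A back-of-the-envelope version of that aggregate sum, using the per-user figures above (they're assumptions, not measurements):

    ```python
    # Aggregate bandwidth: streamed desktops vs. fat clients.
    # Per-user figures are the assumptions quoted above, not measurements.
    USERS = 100
    STREAM_KBPS_PER_USER = 500      # steady H.264 desktop stream per user
    FAT_CLIENT_MBPS_TOTAL = 5       # rough ceiling for the same 100 users on fat PCs

    stream_total_mbps = USERS * STREAM_KBPS_PER_USER / 1000
    print(f"Streamed desktops: ~{stream_total_mbps:.0f} Mbps aggregate")
    print(f"Fat clients:       ~{FAT_CLIENT_MBPS_TOTAL} Mbps aggregate")
    print(f"Ratio:             ~{stream_total_mbps / FAT_CLIENT_MBPS_TOTAL:.0f}x")
    ```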

  13. Anonymous Coward
    Anonymous Coward

    How much bandwidth does this use?

    Lots of BS 'cloudy' hype on this, but no talk of online compression or latency effects for the end user. The virtualisation is between VM nodes and GPUs? Good - I'm up for that. I'd guess the design assumes ultrafast networks in a gigabit-plus LAN environment? Our RemoteFX client boxes regularly pull 80 Mbps plus and don't suffer latency gladly. Other than allowing XenServer and VMware to play catch-up with Hyper-V, will this offering be better in terms of delivery to the end user? If so, how? I'm not clear much has been said on that front, as that part all still seems to need Citrix / VMware View / TS etc.

  14. Neoc

    A step in the right direction...

    ...but not quite yet where I'd like to see.

    I have a few PCs at home, each with a different CPU, or memory configuration, or add-on cards, or number of HDDs. Ideally, I'd like an OS (or hypervisor system) that I can install on my network so that ALL of my PCs look like one enormous PC with a shitload of CPUs, memory, add-on cards and hard drives. And then I could run a series of VMs on this "single" hardware. *That* would be nice.

  15. Robert Heffernan

    Enterprise yes, Games no

    I can see this being useful in an enterprise space. Actually, the incredibly small company where I work could use this to great effect by having all its engineering work done on the server and streamed to the piece-of-shit, malware-ridden old desktop PCs.

    As for gaming and game service providers, it, like the rest of the cloud gaming industry, will never work because of the amount of data that needs to be streamed and the latency involved. Most users would love to have wicked crazy cloud-spec machines, but not at the cost of burning their data allowance for a few hours of CoD, and then there is the double hit of lag: User <--> Cloud Game Client <--> Server.
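
    To put rough numbers on the data-allowance point (the stream bitrate is an assumed figure, not one Nvidia has quoted):

    ```python
    # Rough data-allowance cost of streamed gaming at an assumed bitrate.
    STREAM_MBPS = 5        # assumed bitrate for a 1080p game stream
    HOURS_PLAYED = 2

    gigabytes = STREAM_MBPS / 8 * 3600 * HOURS_PLAYED / 1000
    print(f"{HOURS_PLAYED} h at {STREAM_MBPS} Mbps is roughly {gigabytes:.1f} GB of data allowance")
    ```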

  16. Davidoff
    FAIL

    Not really new - RemoteFX does that already

    This is hardly new. Virtualizing a GPU is already possible under Windows Server 2008 R2 and Hyper-V Server 2008 R2 with RemoteFX. I think it was HP who put up a demo where someone was playing Crysis on a low-end thin client.

    I also remember that when El Reg posted an article about RemoteFX, the majority of commenters didn't get it.

    But now Nvidia does it and suddenly it's the best thing since sliced bread.

    1. Davidoff

      Yes, it was HP

      http://www.youtube.com/watch?v=udlwgjOrlqc

    2. JulianCA

      Re: Not really new - RemoteFX does that already

      I think you don't understand what NVIDIA is bringing -- *multiple* virtual GPUs per physical GPU (and each fairly capable at that). Of course people have understood virtual GPUs forever, but now the horsepower per watt is pretty compelling thanks to NVIDIA, and at the same time there's an emerging, somewhat realistic possibility of sufficient bandwidth to remote broadband devices.

      1. Anonymous Coward
        Anonymous Coward

        Re: Not really new - RemoteFX does that already

        "I think you don't understand what NVIDIA is bringing -- *multiple* virtual GPUs per physical GPU"

        But what is the advantage of 4 relatively low-powered GPUs on a single card versus a higher-powered GPU, other than more memory? A GTX 670-based board would appear to provide significantly more performance in a similar power envelope.

        On the other hand, if virtualizing the GPU is such a big win, AMD's and Intel's APUs would seem to be the logical method, given the use of low-end GPUs in the Nvidia card.

        As for the latency reductions shown in PowerPoint slides - assuming they are achievable in real life, everything looks to be achievable with competing solutions (i.e. the big latency reductions are in game pipeline and network latency, which don't sound like things the GPU will improve).

        So - there's some interesting evolutionary things happening, but nothing revolutionary.

  17. B7LL

    Game Changer

    This should be built into the next generation of game consoles, so your mates can come round and log in on their laptops, iPads etc. - no more split screen. Also, you could be playing in the lounge and "move" the game to your iPad if you want to play in the garden.

    1. JulianCA

      Re: Game Changer

      I think the point is you wouldn't even need a console anymore. Even your TV alone could play these games.

  18. Phil Dalbeck
    Boffin

    This is very different from the virtualised hardware GPU offered under RemoteFX or the software 3D GPU offered in VMware View 5.

    Essentially, VGX is a low-level instruction path and API that allows a vertical slice of the physical graphics card's resources to be routed through to a VM - by a method similar to VMware's DirectIO, for those who want a read. Basically, the VM has direct, non-abstracted access to the physical GPU, together with all that GPU's native abilities and driver calls - i.e. DirectX 11, OpenGL, OpenCL, CUDA... the lot.
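
    For instance, a guest with that kind of direct slice can run an ordinary CUDA device query and see a real GPU - something an abstracted, DirectX 9-only virtual GPU can't offer. A minimal guest-side sketch, assuming the pycuda package and an Nvidia driver are installed in the VM (illustrative only, not part of any VGX API):

    ```python
    # Minimal check that the guest really sees a CUDA-capable GPU.
    # Assumes pycuda and an Nvidia driver are present in the VM; illustrative only.
    import pycuda.driver as cuda

    cuda.init()
    for i in range(cuda.Device.count()):
        dev = cuda.Device(i)
        major, minor = dev.compute_capability()
        print(f"GPU {i}: {dev.name()}, compute capability {major}.{minor}, "
              f"{dev.total_memory() // (1024 ** 2)} MiB")
    ```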

    The virtualised GPU in RemoteFX is an abstraction layer that presents a virtual GPU to the VM with a very limited set of capabilities (DirectX 9-level calls, no hardware OpenGL, no general-purpose compute) - not only does this not fully leverage the capabilities of the GPU, but it is less efficient due to having to translate all virtual-to-physical GPU calls at the hypervisor level.

    Contrary to some comments above, VGX is a real game changer for MANY industries - my only hope is that Nvidia don't strangle the market by A) vastly overcharging for a card that is essentially a £200 consumer GPU, or B) restricting competition by tying virtualisation vendors to a proprietary API for interfacing with the GPU, thus locking AMD out of the market, which would be to the longer-term detriment of end users (e.g. CUDA vs OpenCL).

    1. theblackhand

      Very different?

      While I agree with most of your post, is vGPU that different from all the other hardware virtualization that non-GPU hardware vendors have done?

      Yes nVidia are the first to do it, but it is a natural evolution as vendors started to offer solutions requiring more GPU power.

      I'll change my mind if I see benchmarks showing VGX beating a current GPU with a similar power budget by a large margin (i.e. >50%).

    2. RonWheeler

      Not quite

      RemoteFX uses the horsepower of the client device to do the rendering on the destination monitor, in some ways bypassing the Hyper-V machine in the DirectX rendering process. Hence the requirement for a Win7 SP1 client with a 32-bit Terminal Services client, meaning it can't be used on many traditional, really low-end WinCE/Linux thin clients. The upside of RemoteFX is that it is no longer the slightly blurry compressed stream of screengrabs that you get with most remote session software, but rather true local DirectX rendering - i.e. no blockiness or compression.
