Over the past few weeks we have considered many facets of virtualization, and now we take a look at an area that has the potential to subtly alter the way in which the majority of users interact with IT services: the desktop. Research carried out by Freeform Dynamics last year with the help of readers of The Register …
I use desktop virtualisation extensively...
... for the following purposes:
1) Having multiple development configurations that I can snapshot & roll back without having to uninstall & reinstall various bits of software / versions all the time.
2) Having a reference baseline build that I can quickly replicate to build new clean development instances without starting from scratch each time. Just by copying a few files.
3) Running a corporate standard desktop on a non-standard laptop build - i.e. running XP on Linux
4) Having a throw-away Windows sandbox machine that can be used for dodgy software / media downloads and then totally erased from disk afterwards to eliminate viruses etc.
I've used both VMware Workstation and Sun VirtualBox for this and saved myself days of time & effort.
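The snapshot, rollback and clone workflow described above can be sketched with VirtualBox's VBoxManage CLI (one of the two tools the commenter mentions); the VM name and snapshot label here are purely illustrative:

```shell
# Minimal sketch of the snapshot / rollback / clone workflow.
# Assumes VirtualBox is installed; "dev-baseline" is a hypothetical VM name.
VM="dev-baseline"

take_clean_snapshot() {
  # Record a known-good state before installing anything risky.
  VBoxManage snapshot "$VM" take "clean-baseline"
}

roll_back() {
  # Throw away everything since the snapshot (VM must be powered off).
  VBoxManage snapshot "$VM" restore "clean-baseline"
}

clone_baseline() {
  # Stamp out a fresh dev instance from the reference build.
  VBoxManage clonevm "$VM" --name "dev-$1" --register
}
```

`roll_back` followed by a restart also gives the throw-away sandbox in point 4: anything picked up from a dodgy download disappears along with the discarded disk state.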
I have done it...
Maybe not quite as you're talking about, but here's my real world tale:
We had an older application that was dependent on Windows 95. Wouldn't run on anything newer, $500,000+ for an upgrade, and a new corporate solution was coming in about three years. Old PCs were failing, and Win95 was very unstable when loaded up with newer versions of Office, Outlook, Acrobat, etc. needed to support more modern needs. Each user had to reboot 2 or 3 times in an average day, quite a problem when they often were on the phone with a customer.
Rather than spend $500K for 30 users just to extend the life of the application a few years, I used VMware ACE. Bought the users new XP computers, dual monitors, ran all the office applications on the XP side, and used ACE to host Win95 with their order entry software. They could seamlessly copy-n-paste between the two systems. Even let them boot up two ACE instances of Win95, so they could be writing up a big order in one screen, and if the phones got busy they could take another quick phone order without having to close the original.
The new PCs were deliberately spec'd heavy so they would still be adequate a few years down the road when the new corporate wide solution was rolled out to make training & transition easier.
Data Governance Advantages
The problem with desktop and laptop computing, from a corporate point of view, is security: knowing where your data is, and having access to it. The more sensitive the data, the more you need to know about who is looking at it, extracting it and so on. PCs, even when encrypted, are still a major hole; ultimately they are very expensive and difficult to secure, and easy to hack.
Government seems to be keen on the virtual desktop, delivered through a variety of technologies. Sun Ray is popular from a security perspective because it isn't based on PC technology, it's low energy, and it's very secure in a way that diskless PCs and embedded Windows clients just aren't.
The green card is also important: 4-8 watt devices with integrated IP phones and follow-me technology are cheaper and more energy efficient than PCs, and even with the additional power used by the servers it still works out better than the number of PCs replaced - better still if located in a well managed eco-friendly data centre, though there still aren't many of those.
You can even have a thin client laptop, but that does rather rely on good mobile comms, battery life is excellent though.
Ultimately, though, a good thin client deployment to virtualised desktops does ensure that you always know where your data is, and that it's secure. We are moving to thin client virtual developments and deployments because it's easier and cheaper to deploy that kind of environment securely, which for us is important. Yes, we lose the flexibility of PCs, but we gain in ease of accreditation, segregation of duty, and governance. It's very difficult to lose a laptop full of payroll/medical details and so on when you don't have one.
But old timers like myself pointed out long ago that client server would give all kinds of issues for secure data. Bring back mainframes and dumb terminals, oh we are ;-)
The future though has to be a mixed economy model; there are things that thin client really can't do very well, and the main thing will be to separate the hype from the reality. Using corporate systems, doing a bit of Orifice, lightweight development - then thin client's fine. Doing live media editing, or heavy duty design and development work - then probably not.
I use a virtual desktop, and I hate it
The performance is terrible - sometimes it can take 20 seconds just to repaint the screen. I have a VM which is configured with 512MB, which is pathetic (and my *real* machine has 2GB!).
I cannot understand why anyone wants to foist this charade on users (I write programs on IBM mainframes - I know that TSO is conceptually similar to virtualization, but it works infinitely better).
I have a real PC, which is reasonably powerful, but I am condemned to use it as a thin client to an underpowered server, giving me much less performance than I had before.
Never looked back.
"...some feel that providing contractors and other outside parties access to select corporate systems would be another hot area for VDI deployment. In the US a number of organisations have begun to experiment with variations on this theme: instead of supplying users with corporate PCs they provide them with an allowance with which they can purchase a machine of their choice. This machine then has a desktop virtualization system laid upon it and corporate access is delivered in an isolated virtual machine on the desktop thus in theory completely separating corporate and personal use."
My current client is rolling out VDI. I am one of the lucky few who stepped up and wanted 'in' as soon as it was available. I've not looked back. As a contractor, the ability to do the job using my own equipment is a really nice bonus. No longer do I have to carry around a laptop owned by the client, pre-loaded with M$ shiteware and a whole load of stupid instrumentation / monitoring software / USB lockdown etc etc. Now I take my Ltd Co owned laptop, with its proper Kubuntu OS, home-converted to a Dvorak keyboard etc etc with me. At work I sit it on the desk and slap in my 3G dongle. I avoid the corporate LAN and therefore having to sit behind the stupidly prohibitive proxy. I can pull up a terminal and ssh to my boxes at home, I can spend 10 mins on Facebook, I can read my webmail or POP-connect to pull it into KMail. At the same time I can do my work, efficiently and professionally, all in a window on that machine. The point is, from a digital perspective I'm both at home and at work all the time. No need to dial in over some modified VPN client; just point Firefox at the client's gateway, key in the # on the RSA fob and I'm at work. At home I do this over wireless; at work, over 3G.
Of course, I can use any machine I want, too. Recently I was in a datacenter owned by the client. I had taken my laptop in to plug into a serial port on a piece of kit to give it its first-ever IP address. Normal task for someone like me. I realised I needed some stuff from my VDI session. Previously this would mean leaving the datacenter, taking the old company lappy with me, plugging it into the network somewhere where there's a desk (ops bridge etc) - so long as there was a spare eth port / power etc - then logging in, blah blah. This time I left the lappy where it was, wandered along a couple of racks, popped out the keyboard/screen from a SAN management server owned by our team, logged into that and then mstsc'd to my session. And there it all was: email open right where I left it, IM chat with colleagues still up. I probably only reboot the actual VM once a month; no daily routine of log on, open apps, do work, close apps, log off - just connect from wherever, whenever, do some work, then drop off again. In a traffic jam I've done 'in car VDI' with a 3G dongle and an Asus Eee. I didn't have much screen, but I was productive (read 'billing') and not sat in a jam losing money. Good for the client, good for me. If you work an hour's drive away from home and you're on call, you can pull in anywhere and connect so long as there's a PC. Internet cafe I've not yet done, and I realise I'm bigging up 3G as well as VDI. But the combination is awesome. I can take the kids to the zoo AND do that 10 minute failover test that no one not on VDI would volunteer for on a Saturday afternoon.
Ok, so this reads a bit one-sided; time to balance it. It is rubbish for that 20MB Visio diagram with 80 switches, 400 servers and all the cables shown on it. Running anything graphics intensive (and I even mean looking at Google Maps etc in the VM's browser) is horrible. But then for a lot of stuff you could do that in your native OS, if you're not on a thin client. Also, for obvious reasons, you generally can't get stuff to move between your VM and your native OS; even copy-paste of text is locked out. This clearly makes complete sense, but is a touch annoying at times. It really depends on your role/contract/needs. I'm not a web designer, I'm a SAN/NAS/Backup architect / consultant. 90% of what I do is Unix or CLI based; if there's a GUI and a CLI, I'm using the CLI, but that's because it's more efficient and I'd do that anyway, without the slow refresh of VDI. In short, this works for me and I'll never look back. But if 90% of my job was spent in Photoshop then it wouldn't make the job harder, it would make it impossible.
If all that's on a machine is a bare metal hypervisor, then the support and compatibility issues drop significantly.
All builds will then be the same (and it's as simple as copying down the latest image, then off you go). Nothing should really be written to local hard disk for users (backups anyone?) and profiles should be network based. With all this in place, upgrading a machine to a new version should be nearly as simple as copying the image to the machine, and off you go. It gets round a myriad of driver update issues for diverse machines (as long as they run the hypervisor), and any 'machine corruption' is a case of drop a fresh image in, and all's good.
On machines that end up more 'custom' with applications, there's always snapshots. If something goes awry over time, revert to the last snapshot (which you take after every successful install).
With workstations being (in the main) less worked than a server, there's more resource per machine to soak up any hypervisor overheads without problem.
Also, it allows machines to have a rapid role change (need a set of UNIX apps to complete a task? Fire up the virtual) as long as you have the supporting virtuals on the host.
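The reimage-and-go workflow described above might look something like this; the image server, paths and names are all hypothetical, and rsync is just one of several copy mechanisms that would do:

```shell
# Sketch of pulling the latest golden image down to a hypervisor host.
# Server name, module and image filenames are hypothetical.
IMAGE_SOURCE="rsync://images.example.com/golden/desktop-latest.img"
IMAGE_DIR="/var/lib/images"

fetch_latest_image() {
  # Copy the current standard image; -a preserves permissions/timestamps.
  rsync -a "$IMAGE_SOURCE" "$IMAGE_DIR/"
}

reimage() {
  # 'Machine corruption' recovery: drop a fresh image in, and all's good.
  fetch_latest_image && echo "Reimaged from $IMAGE_SOURCE"
}
```

With network-based profiles, nothing user-specific lives on the machine, so this really can be the whole recovery procedure.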
Wholesale VDI? What a bunch of crock.
Once again a solution to a problem that wouldn't exist if you hadn't used Windows desktops.
Why in the hell would you want to bring the worst of thin and fat clients to the same system? I'm sure VMWare and Citrix all love this idea, but for a company? If anything this adds one MORE system to administrate not less.
I mean, come on! Think about this: 'server based computing' is all the rage now because it's supposedly easier to admin. Yet, on Windows, it isn't, because a lot of apps can't coexist on the same box - a severe design flaw no matter how you look at it. So, server-based computing can't really work but, hey! buzzword!
So, instead of just giving up and moving to an OS that's, you know, sane, we go and put SEPARATE virtual machines on a central server that are, all in all, no easier to manage than separate desktops and are almost certainly MUCH more expensive (if only because of the insane licensing fees associated with this 'VDI').
And because applications STILL can't coexist we will also use 'Application Virtualization' because, well, virtualisation is all the rage too, isn't it? Spending another 60-100 dollars a desktop per year to solve a problem that shouldn't have been here in the first place.
It's amazing, really...
Especially since Novell ZENworks already solves this problem neatly with remote imaging and application deployment if you MUST use a broken desktop OS which makes managing 'real' desktops not any worse than 'virtual' ones...
Wake up and smell the marketing ploys! THIS IS POINTLESS and it's not 'technology' it's a way to make the already insane TCO of Windows desktops go EVEN HIGHER.
fast forward to bring your own?
The laptop from my company (>100000 worldwide) forces an environment and doesn't allow me to add applications, link to my mp3 player, etc. That would be an obvious use: allowing more flexibility, but keeping a validated environment.
The other way round is challenging but more interesting - can I bring my own home laptop, and connect safely to the company environment? It would be similar to using my own pen and paper or car to get a more pleasant writing/driving experience. Think of the cost savings! Just a few old PCs and a fat drive to run the grid server.
Excuse me bringing a serious suggestion for once, did I mention that the company is downsizing IT?
It makes doing remote support SO much easier.
For family members who are clueless about technology (or CIOs - the situation is the same), you can't get a rational explanation of what's gone wrong, so being able to attach remotely to the underlying O/S and then go into their VM'd environment is worth every penny that the freeware costs.
The biggest benefit comes when you can couple this with VNC: "show me what you mean" is worth half an hour of questions and answers and then being able to reboot their VM while still staying connected to the Linux system underneath is very convenient.
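The VNC-over-the-wire support session described above needs only a couple of commands; the hostname here is invented, and it assumes a VNC server is already listening on the family member's Linux box (default port 5900):

```shell
# Sketch of a remote "show me what you mean" support session.
# Hostname is hypothetical; ports are the VNC defaults.
REMOTE="family-pc.example.com"

watch_their_screen() {
  # Forward the VNC port through SSH so the session doesn't cross
  # the wire in the clear, then attach a viewer to the local end.
  ssh -f -N -L 5901:localhost:5900 "support@$REMOTE"
  vncviewer localhost:5901
}
```

Because the SSH connection is to the host OS underneath, it survives a reboot of the Windows VM running on top, which is exactly the convenience the comment describes.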
So long as you've been able to set up the system to boot up and start their windows O/S automatically, the users never have to be aware of what's going on. Since they aren't likely to want to play games, the need for speed just doesn't arise. (Aside: most office users don't even need the performance that a 1GHz box supplies - provided they stick to a decent windows O/S like XP. Further, software developers should be MADE to develop their applications on the minimum spec. box they claim will run it. That way they'll take care about writing efficient code.)
It does have one drawback, however: the ease with which intra-family support (i.e. me) can sort things out means that, instead of trying to fix faults themselves, the call goes out for every little problem.
Many reasons for Desktop Virtualization
I've blogged extensively on desktop virtualization. It's great for BYOC (bring your own computer to work), for allowing companies to build and maintain one and only one image for all PCs in the company, and finally for preventing data leakage. You can read my blog here:
Desktop virtualization is critically essential!
I use desktop virtualization extensively. I consider it essential in a development environment, without which I would need access to several PCs at the same time.
It is also a life-saver when supporting multi-platform and multi-configuration environments, which would otherwise mean several PCs littering the office.
As it is, I have just one PC and simply boot into whichever VM in whatever configuration I need. Brilliant stuff.
@I use a virtual desktop, and I hate it
You're doing something that's a poor use case then. I've got VMs of Solaris x86, XP, and Vista open now on this PC, and they're all fine for web browsing, compiling C, editing code etc.
Essential to get the best of all worlds
I've used desktop virtualisation for years.
I have Linux as my base OS, and run Windows XP in a VMware Workstation VM.
This gives me the power, flexibility and reliability of Linux as the host OS, while allowing me to continue to use the Windows apps I want - Microsoft Office, Toad for Oracle, Photoshop and more. Plus it allows me to use all those USB peripherals I have that don't work properly under Linux. My mobile phone, for example.
I've always found it pretty fast and reliable. Having lots of RAM is important of course. I have 4GB on my host, and I allocate 1.5GB to Windows.
What I've done most recently is to install Windows on a separate partition on my hard disk. I now have it set up so that I can boot directly into either Linux or Windows; then, once booted into Linux, I can also boot the same Windows install under VMware.
This gives me maximum flexibility - if I want to play a game, or I know I'm going to be doing Windows-only work for a while (like a long Photoshop session), I can boot into Windows and be using the same install/configuration that I normally use from Linux.
First off, define desktop virtualisation
Now ask Arthur Hitomi for a definition in context.
@ AC VM Hater with 512MB RAM
You or an admin should be able to up the amount of RAM assigned to the VM. IIRC even the VMware Player allows the user to change the amount of RAM allocated to the VM if necessary...
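If the hypervisor were VirtualBox rather than VMware, the same fix would be a one-liner with its CLI; the VM name below is hypothetical, and the VM has to be powered off first:

```shell
VM="xp-desktop"   # hypothetical VM name

bump_ram() {
  # Raise the guest's allocation from 512MB to 1.5GB.
  # The VM must be powered off before its memory can be changed.
  VBoxManage modifyvm "$VM" --memory 1536
}
```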
Standard Operating Environment
Maintaining a SOE desktop environment in an organisation of any size is a huge challenge. With virtualisation, an absolutely standard environment can be deployed on every desktop. Where staff need to do something different -- such as running non-standard software -- a virtual machine can be created to support their needs. If a whole group has a particular need, then a virtual image can be deployed for that group.
In both cases, the standard desktop environment is maintained.
If the virtual machine (or machines) are given appropriate resources, primarily RAM, then the performance impact is not noticeable.
A virtualisation product will cost money to introduce and support. The payback is an absolutely standard SOE desktop and a lower cost of supporting additional software or just letting selected users do their own thing in a virtual environment.
I would also imagine that a virtual machine would make a great sandbox for browsing or using applications that make external connections. Such a machine could be locked down strongly, or running a different OS such as Linux.
Some free resources: pre-VDI assessment and knowledge ... VDI.com
For free tips and tricks before and during a VDI implementation, consider logging on to http://www.VDI.com.
For web testing it's almost essential
For anyone who has to do testing and validation of multiple browsers, VMs are about the only reliable solution. For the company I am working for, the site has to work in IE6/7/8, Firefox 3/3.5, Safari 3/4 and Opera 9. It is simply not possible to do that on one machine. Only Opera seems to install side-by-side without screwing up, so I use VMs with various combinations of browsers. It's a life (and sanity) saver!
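A browser matrix like that is easy to script once each combination lives in its own VM. A sketch assuming VirtualBox, with one VM per browser/OS pairing (the VM names are made up):

```shell
# One VM per browser combination; names are hypothetical.
BROWSER_VMS="xp-ie6 xp-ie7 vista-ie8 xp-ff3 xp-ff35 xp-safari3"

start_browser_matrix() {
  for vm in $BROWSER_VMS; do
    # Headless start: no GUI window per VM; connect over the
    # network to drive each browser against the test site.
    VBoxManage startvm "$vm" --type headless
  done
}
```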
Similar thing applies for web app testing; you can throw together a basic VM of your prod OS and keep re-using it, to test out new stuff or even to try out different installation methods, for example installing Fedora on to a software RAID. I'd not done that before and wanted to know what it was like to install before actually doing it on real hardware - VM!
Lastly, when I was consulting a while back, I had a VM with everything on just for that work - keeping my PC nice and clean and free of the crap I needed just for that work. When done - delete the image with no other clean-up required.
I Use It To Be Hip!
Now I can run a Mac without paying the price for gold inlaid Intel hardware w/ the cool Apple logo on it.
Oh, I also use it to run Linux distros (Ub, Red), Server 2008 RC, SBS 2008 RC, XP for pron, and 7 RC environs... all on 1 box.
Any of them gets a bug, becomes unstable or I load an app I decide not to keep, forget about uninstalling/re-installing, just delete and copy a fresh image from the master file.
[Porn Browsing Environment]
Having been involved with technical support for 15 years, we/I have seen a lot in IT. Clients ranging from corporates to home users. This has led to supporting everything from the Amstrad PCW, all versions of DOS, most versions of NetWare, etc. etc.
Fortunately the cheapness of hardware, and in some cases software, means I really don't see a DOS 3.3 backup on 5 1/4" DD floppies (even I'm not old enough to remember 8") landing on my desk anymore. Even so, we still have a variety of OS and software versions to support.
Previously we have used a variety of techniques to cope. We had machines with removable hard drives with different OS versions. We upgraded the machines and started using System Commander for multiboot keeping the older machines for older OS's.
Last set of upgrades the support desk got new production workstations and the previous models were used in a variety of different ways mentioned previously to cope with various situations.
One of the products we look after released a new version but was going to run it in parallel with the older version. After a discussion with their head programmer, he suggested I look at VMware to help with support, since he uses it all the time to help with development. I took a quick look, downloaded it, and had a new OS set up in less than an hour. Since then the old machines have all been farmed off to a recycling scheme and everyone is finally back to one machine per desk. Support for Windows and Linux has never been so easy, and being able to trash an environment and rebuild it in minutes is the way forward.
May not be for everyone but for developers and support staff with the cost of hardware it really is a must.
Confusing two different things
People are confusing two different things here. Lots of people posting here are interpreting "virtual desktop" as "virtual machines running on my desktop", and that's not what the article was aiming at.
Running virtual machines on your local desktop using VMware Workstation is indeed very useful for many people, especially software developers. No contest about this.
But what we're really talking about here is people not having a corporate desktop at all. Their virtual desktop is sitting in a datacentre somewhere, and they connect to it some other way; RDP, some sort of thin client (either in hardware or software), VNC, whatever. The VM is remote, not on your local machine.
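Connecting to such a remote desktop is just an RDP (or VNC) session from whatever client happens to be handy. A sketch using rdesktop, one of the clients of that era; the hostname is hypothetical:

```shell
VDI_HOST="vdi.example.com"   # hypothetical connection broker / VM host

connect_to_desktop() {
  # Open an RDP session to the remote virtual desktop.
  # -u sets the username, -g the window geometry.
  rdesktop -u "$USER" -g 1280x800 "$VDI_HOST"
}
```

The point of the model is that this same command works identically from any machine with a network connection, which is what makes the "connect from wherever, whenever" experience described earlier possible.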
I like the idea quite a lot, especially for Windows desktops. I agree with the poster who said this mainly solves problems for Windows desktops; UNIX systems have had a thin client operation mode for decades; it's called ssh/the X11 protocol. But this is the real world, and I have to deal with a user base 70% of whom are using Windows, not Linux or UNIX.
Virtual desktops are a concept I am about to start seriously investigating for work - they may make remote access for Windows users much easier to secure and manage, if the promises are true. My principal concern is whether the performance hit will be too much, even for non-graphical applications.
From what I've seen so far there's a lot of hype and talk, but as yet very few implementations beyond pilot projects.
Used, implemented, supported desktop virtualization for decades
Desktop Virtualization has been a rewarding topic for decades in Managed Services Device Management environments.
Desktop users often need to access equipment on customer sites for management in a secure way where activity is logged and where desktop users can not have direct routable access to the customer environment.
The first phase was to deploy a couple of highly secured systems, offering virtual desktops, and using the X11 protocol to provide the desktops back to the users' PCs, desktop workstations, or X terminals. Since the highly secured system could be VERY controlled, Microsoft worm/virus proof, highly logged, and very auditable, it was a success in allowing the business to be conducted. We saw this model replicated to dozens of operation centers world-wide during a process referred to as "regionalization". Hummingbird Exceed was a very reliable high performer. The occasional remote user would use Hummingbird Exceed on Demand. Both products were exceptional in comparison to the competition.
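The X11 half of that setup needs nothing exotic: SSH's built-in display forwarding gives the same run-remotely-render-locally pattern. A sketch, with the gateway name and tool path invented for illustration:

```shell
GATEWAY="secure-gw.example.com"   # hypothetical hardened access host

run_remote_console() {
  # -X forwards the X11 display: the management tool runs on the
  # locked-down gateway (where activity can be logged and audited),
  # while its windows render on the user's local desktop.
  ssh -X "operator@$GATEWAY" /opt/mgmt/console
}
```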
The second phase was to centralize the systems into a highly redundant, clustered central data center, in order to save costs, provide better redundancy, and consolidate budgets to fund additional software. With people located around the world trying to get access to tools in a single data center, the challenge was significant. The original Hummingbird Exceed licenses could not be leveraged, since the WAN introduced too much latency. The client-server version of the tool (to replace X11) performed like a slug in comparison to the snappy local X clients running on the same server as the management platforms. The newer versions of the Hummingbird Exceed on Demand thin clients were the snappiest virtual desktop clients we tested but did not refresh some newer software correctly (tech support could not resolve it at the time). A combination of Windows Terminal Server and Hummingbird Exceed on Demand could be used, but it was WAY TOO EXPENSIVE a solution to deploy. In the end, a deployment of VNC consoles for domestic operation centers and GoGlobal consoles for international users proved to be an excellent solution, although the VNC consoles are a little slow and force users to use a framed virtual desktop window, while GoGlobal offers a nicer user experience where the framed virtual desktop window is optional and virtualized desktop clients run snappily all around the world. We support 200 remote and virtual users.
When client software needs to be upgraded, it is very easy to go on the central servers, redirect icons/shortcuts from old software to newer released software, without having to disrupt the user community with downloading software. One software installation has to be done for hundreds of users instead of debugging hundreds of software installs. Correcting a problem revealed by a single help desk ticket instantly propagates to all desktops.
The Sun Secure Global Desktop solution is beautiful, but the investment in the current infrastructure is such that it is done and people just take the existing virtual infrastructure for granted. The phrase "virtual desktop" does not really even get considered - it exists, has always worked, has always been reliable in production, meets the business needs, and makes software maintenance a very nice thing.
Same problem, different tech
"In the US a number of organisations have begun to experiment with variations on this theme: instead of supplying users with corporate PCs they provide them with an allowance with which they can purchase a machine of their choice."
This sounds like an awful idea. Who gets to support all of those PCs that the random idiots buy?