For an IT manager, desktop virtualization is no bad thing, if only because it limits how badly a user can mess up his or her own settings. So if you are thinking that you could slim down your desktop hardware requirements and maybe keep track of everyone's software upgrades more simply, are there any downside consequences?
Good dedupe can solve the IOPS problem...
of all your VDs booting at the same time, as well as cutting down on the space you need for them. You just need a very capable CPU doing the deduplication. And store user files and profiles somewhere else, so the bulk of the system disk is common across all VDs, even if they have different patches and applications installed.
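The idea is simple enough to sketch: chop each image into fixed-size blocks, hash them, and store each unique block once. This is a toy illustration of block-level dedup in general, not Starwind's actual implementation; the class and block size are made up for the example.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed block size; real products vary

class DedupStore:
    """Toy content-addressed store: identical blocks across VD images
    are kept once and referenced by their SHA-256 digest."""

    def __init__(self):
        self.blocks = {}   # digest -> block data (one physical copy each)
        self.images = {}   # image name -> ordered list of digests

    def write_image(self, name, data):
        digests = []
        for off in range(0, len(data), BLOCK_SIZE):
            block = data[off:off + BLOCK_SIZE]
            d = hashlib.sha256(block).digest()
            self.blocks.setdefault(d, block)  # store only if new
            digests.append(d)
        self.images[name] = digests

    def read_image(self, name):
        # Reassemble the logical image from shared physical blocks
        return b"".join(self.blocks[d] for d in self.images[name])

    def physical_size(self):
        return sum(len(b) for b in self.blocks.values())
```

Write ten near-identical Windows system disks into something like this and the physical footprint is roughly one disk plus the per-VD differences, which is also why the hot common blocks fit in a RAM cache.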
Here's one way to do it...
1) You need a fast network for iSCSI between hosts and storage. Let's say 10GbE to the hosts and multiple 40GbE to the storage server
2) Storage server needs plenty of PCIe bandwidth, fast CPUs, and lots of fast RAM
3) Storage server runs an iSCSI target that supports inline deduplication - e.g. Starwind. This uses system RAM as a cache, so your most important 40GB (say) of common data is in RAM
4) Your disks are a RAID of SLC SSDs, or maybe use them as cache for hard drives (e.g. LSI CacheCade)
In my small tests my network (just 1 x 10GbE) ran out of road before I could stress the CPU. I was booting up to 10 Windows VMs in the same time it took to boot just one off a non-deduped target - about 20 secs. Starwind's dedupe is still an experimental feature but should RTM around Q1.
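A back-of-envelope calculation makes the "network ran out of road" observation plausible. The per-boot read volume here is an assumption I've picked for illustration, not a measured figure:

```python
# Rough bandwidth arithmetic (assumed figures, not measurements)
link_gbps = 10
link_bytes_per_s = link_gbps * 1e9 / 8   # ~1.25 GB/s raw, before protocol overhead
boot_read_gb = 2                          # assumed data read per Windows boot
vms = 10

total_bytes = vms * boot_read_gb * 1e9
seconds = total_bytes / link_bytes_per_s
print(f"~{seconds:.0f} s to stream {vms} boots over one {link_gbps}GbE link")
```

With those assumptions, shipping ten boots' worth of data down a single 10GbE link takes on the order of 16 seconds, in the same ballpark as the ~20s observed: the dedupe cache serves the common blocks from RAM almost for free, so the wire saturates long before the CPU does.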
A word to the wise regarding improving login times - sort out users' local and roaming profiles before migrating them to a VDI environment.
A place I've been working at has just migrated more than 5,000 users into a virtual desktop environment, and there are still applications loading in the background while the machine is trying to establish the virtual environment. You can make the VDI load as fast as you like, but if login is preceded by loading stacks of useless apps and drivers before the desktop becomes available, it's all 'VDI' to the user, and that can create massive bottlenecks at the support helpdesk.
Still, loads of opportunity to make yourself popular by streamlining your favourite users' desktop for them :)
Fully with you on the profiles (although the article's focus was on storage and networking); it's an important aspect that shouldn't be overlooked when doing desktop virtualisation.
Different profile solutions (roaming, hybrid, streaming, mandatory etc) can also have a big impact on the VM storage I/O generated, and doing crazy things like redirecting AppData folders onto the network can kill many an application dead in its tracks performance-wise.
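The AppData point is easy to quantify with rough numbers. Both latency figures below are assumptions chosen for illustration; the effect comes from apps doing thousands of tiny, serial reads of config and cache files:

```python
# Why redirecting AppData to the network hurts (assumed figures)
ops = 10_000        # assumed small serial reads an app makes at startup
local_ms = 0.1      # assumed local SSD small-read latency
network_ms = 5.0    # assumed round trip to a loaded file server

print(f"local:   {ops * local_ms / 1000:.0f} s")
print(f"network: {ops * network_ms / 1000:.0f} s")
```

A 50x difference in per-operation latency turns a one-second startup into nearly a minute, and no amount of server-side horsepower fixes serial round trips.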
So, needs serious storage and networking, but also somehow saves money?
Reduces the need for a technology at the client, but achieves decent performance by offloading things onto the client?
This has Bad Idea written all over it, and looks very much like a sales pitch where the aim of the exercise is to sell new storage, servers, networking etc.
Manage the PC estate properly in the first place ("properly" = automated app delivery, automated installation, etc), and that way, you take advantage of the local hardware, offline use is trivial, hardware can be swapped out easily if faulty, no single points of failure, and the user can have whatever degree of flexibility you choose to give them.
VDI rarely if ever saves money in the enterprise (SMEs are a different matter, but this is the Enterprise IT section).
If you've been sold a VDI solution on some cost-saving premise, I'd cancel your PO immediately.
"VDI rarely if ever saves money in the enterprise"
You might want to try telling the reg that - from their own "infographic" of a few days ago, also in the Enterprise IT section, they say "Desktop Virtualisation: Yes, it's cheaper".
We didn't go with VDI, partly because it clearly wasn't cheaper, but also because it didn't seem to stack up in other ways either - green IT, manageability, etc. The tools for managing a PC estate are far more mature, currently, and generally speaking you can do a lot more, and a lot better, if you make good use of computing power and storage local to the user.
The short version
One master image, fed out to a number of local sub-servers, with lots of RAM on the desktop and apps designed to cope with intermittent/slow network I/O.