Desktop virtualisation has its benefits, but it is also an important structural shift. This is compounded by other changes likely to take place at the same time, such as a move to Windows 7 or an office move. When the Co-operative Group decided to migrate some of its 18,000 desktops to virtual desktops and thin clients, it also …
May as well use the desktop PC
Seeing as you still need something with a glowy panel and a set of those pushbutton-key-things on your desk anyway, you may as well retain the PC to perform that duty.
The main cost of supporting a PC lies in AD domain setup, user profiles, email accounts, troublesome apps, licensing, malware and so on. A PC used as a thin client needs none of these, in principle making it no more costly to maintain than a specialist terminal. And keeping all desktops free of user personalisation means computers can be swapped out as required.
Also a good opening for Linux: if the cloud software will work with a Mozilla browser then there's no need to pay for a licensed desktop OS. Although if you've already got XP COAs, just use them; a limited XP user with software policies enforced is reasonably secure against malware.
Not that I'm advocating cloud working: the people I've seen switch to it have had their fingers burned. In-house virtualization, yes, where suitable; relying on some datacenter in another country... NO.
Parents deficient in imagination...
Nice story, I like it. But is the ending missing, or is it meant to stop like that?
“There has been a massive decrease in support calls,” he says. “When a machine does die we just drop in a new unit and users are up and running within minutes. We used to spend a day or two just setting up applications but we’ve completely eliminated that now.”
Thousands of users and they never had standardised images for quick installation? It's not hard to make improvements when you're doing things rather badly...
If they were moving to thin clients...
If they were moving to thin clients did they continue with Windows, move to Linux or something proprietary?
Back to thin client eh?
Had to happen. Microsoft must have cost the world economy trillions moving us away from it in the first place. Users unable to install their own shiznit? Nirvana.
Re: Back to thin client eh?
It has been, what, 10 years? Something like that... since the last time thin clients were all the rage. We'll be back to autonomous workstations again by 2017. Early part of the decade: terminals/thin clients/web apps. Late part of the decade: autonomous computing/client-server architecture.
We've gone back to a time-sharing system, and couldn't be much happier.
Granted, it has caused a dustup or two - as when one user's elbow was jostled in the corridor and he dropped his stack of cards which he hadn't striped on the edges to make their re-stacking possible.
Still, the ease of administration and the most effective lock-out of rogue users have more than made up for the general lack of utility.
What goes around...
I'm finding it difficult to be impressed by "the cloud". Over twenty years ago we were using Unix workstations connected to central file servers. The standard set of software was on the workstations but the user data were on the servers, so you could log in on any workstation and work on your own projects. Not quite the same thing (it was 10BASE5 Ethernet, for starters) but the principle is identical.
re: What goes around...
UNIX and X Window based systems have always had the advantages of remote file systems (NFS, for example) and remote application runtimes, so I agree, this is mostly a yawn. But what has always been awesome about UNIX/X is that the same application could run either remote or local depending on how it is started. It's also mind-boggling how much has to be hacked together to make a Windows environment even close to manageable: from creating virtual machines to run different OS images in, to remote display capability built on top of the OS instead of into the display and application framework.
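The "remote or local depending on how it is started" point comes down to where `DISPLAY` points. A minimal sketch of the mechanism (`remotehost` and `myworkstation` are placeholder names, and X11 forwarding is assumed to be enabled on the remote sshd):

```shell
# Local: the client connects to whatever X server $DISPLAY names.
xterm &

# Same binary, run on another machine but displayed here.
# ssh -X tunnels the X11 connection and sets DISPLAY on the far end:
ssh -X user@remotehost xterm

# The pre-ssh way: point DISPLAY straight back at your workstation
# (requires `xhost +remotehost` locally; insecure by modern standards):
ssh user@remotehost 'DISPLAY=myworkstation:0 xterm'
```

No change to the application is needed in any of these cases, which is exactly the point being made.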
Same here: I was running remote applications and using networked file systems and logins over 15 years ago, and it worked great.
"We used to spend a day or two just setting up applications"
This tells me that the fundamental problems in that organisation seem to have been of a whole different nature than they might think right now, and probably will not be solved by going virtual. Not in the long run, at least.
There, free advice :-)
People who say that kind of thing have usually worked in large cubicle farms. The small or medium business has a much wider diversity of software requirements, and furthermore cannot afford to buy fifty identical PCs at once, so the usefulness of cloning methods is far more limited.
Plus, the enormous complexity of software push-delivery systems simply makes them uneconomic to deploy on small sites. Two hours to install software on each of five computers equals ten hours... or, at a wild guess, two hundred hours to configure and debug a scripted install system that sets up each computer in half an hour. I'd love to charge for the latter, as it'd be far more fun to deploy... but I somehow don't think I'd get paid!
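Taking the commenter's figures at face value (120 minutes per PC by hand, 30 minutes per PC once scripted, 200 hours to build the scripted system; all guesses, not measured numbers), the break-even point works out like this:

```shell
manual_per_pc=120        # minutes to install software by hand on one PC
scripted_per_pc=30       # minutes per PC once a push system exists
setup_cost=$((200 * 60)) # guessed minutes to build and debug the scripts

# Scripting wins once the per-PC saving has repaid the setup cost:
breakeven=$(( setup_cost / (manual_per_pc - scripted_per_pc) ))
echo "Scripting pays off beyond roughly $breakeven machines"
```

That comes out at roughly 133 machines, which is why push-delivery is a non-starter for a five-PC office yet obvious for an estate of thousands of desktops.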
Trying to keep personalization off small-business desktops as far as possible really makes things simpler. Most small-business computers have an identified role, and different roles require different software. In most cases the role of the computer remains the same regardless of who is using it. The only thing which needs to be personalized is email, and here it would be preferable if the data and settings could roam freely with logon. Unfortunately Outlook, the usual preference, is very bad at this. This is an area where Thunderbird does better, and web-based clients excel.
Not a Small Business
They've got at least 6,000 PCs, which at a conservative turn-around means 1,200 new PCs a year to configure, plus any repairs and upgrades. Spending two days installing software on a case-by-case basis is absurd: define standard images and use them. Vendors like Dell will even pre-install custom software for orders above what, 30 machines?
Can someone explain to me ....
... if an enterprise has a DR site, how can desktop virtualisation save money?
You go from one PC per user to two sets of desktop virtualisation infrastructure - one in each data centre.
And how does one PC per user provide DR anyway, except for home workers, assuming the house doesn't burn down with the PC inside it?
Does he realise....
What he said: “It was only once we started the migration that we discovered just how many rogue applications there were dotted around the organisation. We then either had to virtualise them or take them away from people.”
Quite! If you want rigid control of simple repetitive tasks, all performed the same way across the organisation, and if you're prepared to risk it all across vulnerable links, then the system described may make sense.
What beats me is why they bother employing humans at all.
Of course there are "rogue" applications in the real world -- I suppose he means applications which "the organisation" will eventually discover actually did a specific task that the officially-prescribed applications couldn't accomplish.
IT departments tend to be populated by people who have little understanding or experience of the world beyond IT departments. They have little capability of grasping either the true significance of failures of communications links, or the real-world inadequacies of their own chosen applications, which may well meet the (inadequate) specifications conjured up by managers who aren't at the sharp end of things and can't actually perform the tasks required in the real world.
Paris, because even she's not that dumb!