The case for DV

Ever since the PC landed on desktops in the early 80s, it has been a mixed blessing for IT administrators. On the one hand it has empowered employees; on the other, hardware refreshes, maintenance and support have entailed high capital and operational expenditure. Most organisations are virtualising the desktop to regain …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward
    WTF?

    A couple of points

    "Most organisations are virtualising the desktop" - errr.... no they're not.

    As for the actual example of desktop virtualisation, what's the difference between this and a good old fashioned client-server model? Apart from the flashy name, that is? I'm struggling to see the difference. I mean, all this says is basically "rather than have a PC sat on a desk, one per bod, we have a server farm and the same bods connect into it over the [cue, Dr. Evil] 'network' from the comfort of their sofas, via their laptops". Wooo-hooo - who'd have thought, eh?

    So back to my original comment (re: "no they're not"), well, yes, maybe they are. It's just that it's not got the whizzy "DV" moniker.

    1. Mike Shepherd

      "Most organisations are virtualising the desktop"

      I think you were right the first time: no, they're not.

      Most organisations have a few PCs and don't "virtualise" anything. What's more, that's exactly right for them.

      The author omits any figures to justify "most organisations" and (I would suggest) doesn't have any. It recalls Kelvin's remark: "...when you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot...your knowledge is of a meagre and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely in your thoughts advanced to the state of Science...".

    2. Fuzz

      It's not very well written, but I think what he means is

      "Most organisations are virtualising the desktop to regain some of the efficiencies of the mainframe era while retaining the productivity of the desktop environment"

      By this the author means most organisations who are virtualising their desktops are doing so to regain ...

      Although the sentence is slightly ambiguous, I think it's fair to assume the author doesn't think that most organisations are virtualising desktops.

    3. DannyB

      No, they're not, and that wasn't the point.

      That's not what I meant. Most organisations *that* are virtualising the desktop are doing so because they yearn for some of the old mainframe efficiencies. We're still at a relatively early stage in the game, and I'm not suggesting that everyone has jumped on the bandwagon yet. But yes, as it happens, I do think that's coming.

      There most certainly are some significant differences between desktop virtualisation and the old client/server software model. You're oversimplifying what was commissioned as an introductory piece to start with. As I recall, client/server as it pertained to computing in the early 90s was about applications distributed between the client and the server. It was about taking some of the more compute-intensive and data-intensive work and backing it off to the server, while delivering the results back to a supporting application on the client. It didn't replicate an entire desktop, per user, at the server level. VDI does.

  2. jake Silver badge

    Huh? How does that work?

    "Data never leaves the data centre, so it can’t be accidentally left somewhere,”

    So in Chris Knowles'[1] mind, all of the folks out in the RealWorld[tm], using "virtual desktops" for their computing needs actually have to be physically present in the data center?

    Who the heck are these clowns, anyway?

    [1] head of solutions at Dimension Data, or so it says here.

    1. DannyB

      Data isn't stored on the PC

      What he's saying is that the data is stored centrally. In a VDI session you're accessing a screen image delivered over an efficient protocol. The data itself isn't beamed to the PC, and isn't stored there (which is why, for example, you can get away with using a zero client with no operating system to access an entirely virtualised desktop).
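
      To make that concrete, here's a toy sketch - not PCoIP, ICA, RDP or any real remoting protocol, just an illustration of the principle - in which the client end only ever sends input events and receives a rendering, while the document itself stays server-side:

      ```python
      import socket
      import threading

      HOST, PORT = "127.0.0.1", 5900        # loopback stand-in, not a real broker address

      srv = socket.socket()
      srv.bind((HOST, PORT))
      srv.listen(1)

      def serve():
          # The "desktop" and its data live here, on the server side only.
          document = "Payroll figures that never leave the server."
          conn, _ = srv.accept()
          with conn:
              key = conn.recv(16).decode()              # input event travels up
              frame = f"<screen image: {len(document)}-char document after key {key}>"
              conn.sendall(frame.encode())              # only a rendering travels down

      threading.Thread(target=serve, daemon=True).start()

      with socket.socket() as cli:                      # the zero/thin client end
          cli.connect((HOST, PORT))
          cli.sendall(b"F5")                            # user presses a key
          print(cli.recv(1024).decode())                # client sees pixels, never the file
      srv.close()
      ```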

      1. jake Silver badge

        Yeah, sure, right.

        "In a VDI session you're accessing a screen image delivered over an efficient protocol. The data itself isn't beamed to the PC, and isn't stored there (which is why, for example, you can get away with using a zero client with no operating system to access an entirely virtualised desktop)."

        Do you really think that nobody will develop a terminal emulator that'll allow a general purpose computer to access that bit-stream, and then do whatever with that data?

        Yeah, sure, right. See Wall Data's "RUMBA". And others.

        This is snake oil. At best. Outright fraud, at worst.

        1. DannyB

          An attacker could unleash a squadron of rabbits, with lasers, too.

          Let's not also forget that someone could subvert the entire system by gaining physical access to the server room and recovering the data in person! Horrors!

          You can attempt to invalidate any article by simply saying that someone can step outside the operating parameters to subvert the system. We shouldn't install SAP, because, someone might, you know, hack it. So let's not have any articles about ERP in future without mentioning that it isn't actually built on mathematically verified code. Actually, we should all really be using abacuses, because they're more secure.

          If you're going to raise this argument, which *wasn't* the subject of a short introduction to DV, then you need to properly explore the ramifications, including ways to *prevent* someone spoofing a terminal. You're taking one of a series of about 15 articles exploring the subject and attacking me for not covering every single, possible, eventual outcome in 750 words.

          This is disingenuous. At best. Outright sophistry, at worst.

          1. jake Silver badge

            Disingenuous? Sophistry? Nah.

            I'm a jaded old sysadmin. People have been trying to bring back centralized computing for decades, but that bird flew thru' the open barn door when IBM legitimized the personal computer with the release of the original IBM PC. Can't herd those cats back into the worm can ... Mainframe computing has its place, don't get me wrong (I used to speak SNA like a native), but GP computing makes far more sense for the vast majority of us ... regardless of how many bells & whistles you graft onto it.

            My grand-daughter ERPed on me while we were watching the other all-singing, all dancing, brightly colo(u)red dinosaur, Barney.

            RUMBA wasn't spoofing, it was a 3270 (etc.) emulator. Bit streams are bit streams. I can do with them what I want at my end, which is my point, and a point that DV advocates gloss over. Not attacking you, attacking a concept.

            I still use an abacus (it's in the feed barn, I use it to calculate critter chow & nutrient needs ... calculators only last a couple months in that environment). I use sliderules regularly, too. There is one in each of the aircraft, and I still use the old Sun model that got me my first engineering degree to calculate fencing needs, roofing, road base, concrete, DG, beam loads, etc.

  3. Anonymous Coward
    Anonymous Coward

    No need for security eh?

    >> "Data never leaves the data centre, so it can’t be accidentally left somewhere,” he says. “That means you no longer need costly and complex disk encryption.” However, for this advantage to be properly realised, administrators may have to set up policies that stop files being copied to a local desktop. <<

    and also policies that mean users can't use any email. At all. Which means blocking webmail too, and in fact any kind of web access, as uploading to 'the cloud' is nowadays a trivial matter. So I think he's talking a bit of 'poo poo' there.

    Love the idea of DV though, developers will of course need much meatier VMs than everyone else ;)

    The only practical everyday application I can think of that it seriously falls down on is watching video (in whatever form that takes).

    1. Sarev
      FAIL

      hmm

      Or absolutely anything which involves getting data into the system in the first place. Because not every home is connected to their work server (farm) via a fibre link...

      I take it I'm not allowed to mention "network computer" here?

      1. jonathanb Silver badge

        Re: hmm

        and even if they are, the upload speed is usually not that good.

  4. Adam Jones
    FAIL

    BullSh*t Bingo

    "These days, as companies adopt follow-the-sun IT support practices, business continuity is baked directly into everyday operations. Virtualisation is a lynchpin technology here."

    BINGO!

    1. DannyB

      Care to elaborate?

      I see two models. Classic DR, where everything stops breathing and falls over, and then you have to get your playbook out to get it up and running again. Your physical facility goes down, and you have to move to a hot site, transfer all your data to a new set of PCs, and get it rolling once more (that is, if you can afford to have a hot site, rather than a warm or cold facility). Seems to me that in this model, operations and recovery are two separate processes.

      The other model is business continuity, which brings the idea of operations and recovery together. So, instead of having your desktops running locally, you have them virtualised centrally, which makes them more manageable. Maybe 40% of your employees are in the office at any one time, with the others working from home or on the road. Or maybe in telecottages (which I still think are a great idea). Your site goes down. Only some on-site desktop clients are affected, but none of the data is stored locally anyway. Your desktop VMs are on a central server, which is replicated, and the data is stored in a SAN which is also replicated. So your recovery period is far shorter.

      I don't think it's fair to call that bullshit. I think it deserves at least a reasoned discussion, because it could offer some benefits, if done right.

  5. Anonymous Coward
    Grenade

    Desktop Virtualisation is Ridiculous

    Having been involved in a bunch of projects with DV where it has been forced on me from on high (against my advice), I can say with some confidence that there are very few environments where it actually works.

    The biggest problem is that most people's networks aren't robust enough. With "1 PC per desk" setups, if the network fails, people can often carry on working. With virtualisation solutions a single network outage affects everyone. In many organisations with 1 PC per desk, if someone's machine fails at a critical time (say accounts running the payroll), they can often walk to another PC near to them and carry on working - not the same in a DV world. Most organisations don't have twin data centres with synchronised data between them, and robust networks running out from them.

    The next problem is that you still need terminals. This article seems to sidestep that issue completely. Either you expect people to use their own machines - which has its own issues - or you provide machines - in which case you are back at square one. I've seen precisely one organisation that actually put in dumb terminals connected direct to the data centre - and that was a bank on a trading floor. They actually had the robust network and data centre necessary. But it certainly wasn't a cheap option - the main reason they did it was that the building they were in didn't have powerful enough air-con and power for the trading floor to be stacked with computers.

    Next, PCs on desks are easy to get support staff for; complex virtualisation solutions need lots of network and back-end staff. And a network guy probably costs you 2-4 times more than the PC support tech.

    Finally, the experience just isn't the same. For users doing basic computing (like word processing, etc.), they might as well be using the low-spec machines you are using as terminals. For high-end users the DV solution is appalling. Video is poor, and graphics-intensive applications (like CAD systems) just don't work properly.

    What DV gives you is overkill for the bottom end, underpowered for the top end, and less robust than a distributed solution. And, at least in my experience, it never costs less in the end.

    1. Anonymous Coward
      Thumb Up

      So it seems

      ...not too much has changed since the Proof of Concept I worked on in 1997 with Citrix and NT 3.51 (Wyse Terminals, "[hey look I'm running NT in xterm on a Sun Sparc workstation]", etc).

      Good to know :)

    2. DannyB

      Good points

      They've commissioned me for an article that discusses what the network needs to actually make desktop virtualisation work. That point about single points of network failure is a good one. I'll address that, thanks.

      I think your point about people not being able to support virtualised desktops has legs, but the parameters are moveable. It depends how mature your operations are at the back end, and how well you've automated everything. I do think it's fair to say that unless you've achieved a certain level of competency in back-end management, you're going to run into trouble.

      No, I've not run across an environment where you'd want to do video editing in a virtualised environment. There are other types of applications, too - particularly high-end visualisation apps, for example - where you'd want to keep the logic local. I wouldn't want to see CAD apps running virtually, because they often need some chunky dedicated processing at the client end. But that's why some of the users I've spoken to have kept a small number of dedicated local PCs operating, for specialised tasks.

      But your language gives something away for me, AC:

      >involved in a bunch of projects with DV where it has been forced on me from on high

      Having an attitude like that is never a good way to begin a project, and will pretty much doom it to failure. If you don't buy in, and do your best to lend your support and enthusiasm to a project, then you're going to end up as part of the problem.

  6. juice

    Thin clients redux...

    And so we're back to the thin client model. This week. Until further notice.

    (in truth, I'm struggling a bit to figure out what "DV" is actually meant to mean in this context. Are they slapping linux on all the machines together with a copy of VMWare and a network-mounted virtual hard drive, or something more like a straightforward VNC thin-client?)

    The article was impressively light on technical details - what are they going to be running on the thin clients? How long do they take to boot up and gain access to the server? What is the response time like? Has there been any significant impact to their network traffic (either positive or negative)? How do the thin clients handle media streaming (e.g. course materials)? How well does the system work when you're offsite (e.g. running over ADSL), or in an area with heavy wifi congestion/poor reception? How well does the system handle user customisation (e.g. large fonts for the partially sighted)? What happens when multiple users need to use a resource-intensive app? Have any metrics for assessing the change to the user experience been defined?

    Most importantly: do the long-term costs outweigh the short-term benefits? Considering the constraints on a virtualised system (network overheads/throughput, thin-client startup time, reduced CPU and memory resources), it's not hard to see this sort of solution adding 10-15 minutes of overhead per user, per day. If we shamelessly round this down to an hour per week, assume an average per-employee cost of £40/hour (including things like wages, NI, office space, power bills, etc) and work on a 40-week "working" year, then the VD model is effectively costing the business £1600 per year, per employee...
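
    Spelling out that back-of-envelope sum (the inputs are my assumptions above, not measurements from any real deployment):

    ```python
    hours_lost_per_week = 1          # 10-15 min/day, rounded to roughly an hour a week
    cost_per_hour_gbp = 40           # fully loaded: wages, NI, office space, power
    working_weeks_per_year = 40

    annual_cost = hours_lost_per_week * cost_per_hour_gbp * working_weeks_per_year
    print(f"£{annual_cost} per employee per year")      # £1600
    ```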

    1. Anonymous Coward
      Anonymous Coward

      "£40 per desktop per week"

      Some good thoughts there, but given that my desktop has three unusable hours a week when the corporate Anti-Norton Virus clobbers the disk to the exclusion of anything else, three hours during which a Network Computer (er, sorry, "DV-based solution") would hopefully allow me to be productive again, I can't help thinking that your account has missed some savings.

      So my three hours of lost working time becomes one. Multiply that by a few dozen people, and with the money you save you can buy a lot of server-based computing and network support.
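
      Rough sums, reusing juice's £40/hour figure and an invented headcount, purely for illustration:

      ```python
      hours_back_per_week = 3 - 1      # three lost hours become one
      headcount = 36                   # "a few dozen people" - made-up number
      cost_per_hour_gbp = 40           # juice's fully loaded rate
      print(f"£{hours_back_per_week * headcount * cost_per_hour_gbp} a week recovered")   # £2880
      ```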

      1. Anonymous Coward
        Happy

        Or you could...

        ...dump Windows and get even MORE time back!

      2. DannyB

        Some possible solutions

        1) Why are you using Norton AV? It's a notorious resource hog.

        2) Make use of offline security scanning and patching of VMs overnight on the server, when no-one's using them. There are a couple of products that manage that now. I think VMware does one. I can probably find the details if it would help.

        I guess there's also the option of making the VMs non-persistent and just instantiating a new one from a central image whenever the user logs on. Then you could pull in their data from a redirected folder.
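
        For what it's worth, a rough sketch of that non-persistent pattern - it uses qemu-img copy-on-write overlays purely as an illustration, it isn't the VMware product I mentioned, and the paths are invented:

        ```python
        # Each logon gets a throwaway copy-on-write overlay of a central golden
        # image; user files live in a redirected folder, not on the VM disk.
        # Assumes qemu-img is installed and the directories exist.
        import os
        import subprocess
        import uuid

        GOLDEN_IMAGE = "/srv/images/desktop-golden.qcow2"   # hypothetical read-only master
        SESSION_DIR = "/srv/sessions"                        # hypothetical scratch area

        def new_session_disk(user: str) -> str:
            """Create a fresh overlay for this logon; the golden image stays pristine."""
            overlay = os.path.join(SESSION_DIR, f"{user}-{uuid.uuid4().hex}.qcow2")
            subprocess.run(
                ["qemu-img", "create", "-f", "qcow2", "-F", "qcow2",
                 "-b", GOLDEN_IMAGE, overlay],
                check=True,
            )
            return overlay

        def end_session(overlay: str) -> None:
            """Non-persistent: discard the overlay; user data sits in the redirected folder."""
            os.remove(overlay)
        ```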

    2. DannyB

      Excellent questions, juice

      This was commissioned as an introductory article, and it's part of a series. Sit tight, there are more coming. I love getting comments like this. Helps me to work out what I should be asking as I research the other pieces.

  7. Anonymous Coward
    Anonymous Coward

    "Most organisations are virtualising the desktop...."

    Ummmm, that wasn't the whole sentence. Did you just get to that point and stop reading?

    The whole quote was 'Most organisations are virtualising the desktop to regain some of the efficiencies of the mainframe era while retaining the productivity of the desktop environment.'

    It's not stating that most organisations are virtualising desktops, it's stating the main goals of the companies that are going down this route.

    Yeah, this article is a bit thin on statistics, but at the end it does say that this was an introductory article.

    1. Anonymous Coward
      Anonymous Coward

      A title is required

      If you're going to be a pedant it's best to be correct. The statement as written suggests most organisations are virtualising the desktop. To achieve your suggested alternate meaning something similar to the following would be required: "Most organisations that are virtualising the desktop are doing so to regain some of the efficiencies of the mainframe era while retaining the productivity of the desktop environment."

      And yes, the article is, disappointingly, a bit thin all round. But it seems to have that in common with a lot of El Reg content these days.

  8. Anonymous Coward
    WTF?

    Desktop Virtualization

    Is that what we old fogeys used to call 'timesharing', but with a better class of VT100?

    1. DannyB

      On steroids

      In my overmatter file (the bits that I chopped out of the article), I called zero clients "VT100s with big biceps. And tattoos." :-)

      But there is a difference. The VT100s just sucked down character streams. These are providing you with access to an entire desktop operating system at the back-end. I sometimes think people gloss over this. Sure, what's old is new - that's the industry we live in. But what's new is not the same as the old. The basic concepts have developed a lot in 30 years.

      It's a bitch trying to play Minesweeper on a System 360, ain't it?

  9. Anonymous Coward
    FAIL

    Cost savings?

    not likely!

    Pay for licenses, over and over again. Pay for honkin' big servers in a data centre. Pay for more comms. You'll be wanting to double that cost - for redundant data centres.

    Oh yeah, now your laptops don't work offline. Sorry about that. I know, I know, that's why you bought a laptop in the first place, but you've got to suffer to get onto the bleeding edge of technology, don't you know.

    I once worked at a place where the boss *really* wanted to do this. The business case fell over at the first hurdle - a cheap PC was actually cheaper than the thin client hardware alone.

  10. Anonymous Coward
    Happy

    "better class of VT100?"

    No no no no not that at all.

    You mean a standards-compliant (ANSI X3.64), low-cost, high-efficiency green desktop client.

    As long as it's not a C.Itoh with the wrong connector gender on the back. Madness.

  11. Anonymous Coward
    Happy

    "VT100s just sucked down character streams."

    And then a few years later (maybe in the early 1990s) Xterminals just sucked down X11 protocol streams.

    For younger readers and readers certified Microsoft dependent, X11 is a thing of beauty (or pos, depending on viewpoint) where the graphics hardware (screen, kbd, mouse) is virtualised [1] from the application's point of view, and there is potentially a network layer between the application and its user's hardware. Ideally you have a fast low latency LAN between the datacentre and the desktop. A bit like you might for Citrix, except unlike Citrix this X11 stuff wasn't a proprietary protocol brute-forced on top of an OS which was never designed with multi-user access in mind.

    Thus, as Danny wrote "providing users with access to an entire desktop operating system at the back-end." The OS could be anything that supported X, typically a UNIX, but also VMS and maybe other real OSes too (QNX, VAXeln, and other RT kernels were also possible, iirc).
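
    A tiny illustration of that split - the application runs wherever the script runs, and draws on whatever DISPLAY points at (the hostname is invented, xterm is assumed to be installed, and the remote X server would have to accept the connection):

    ```python
    import os
    import subprocess

    env = dict(os.environ, DISPLAY="thinclient.example.com:0")   # made-up thin-client display
    subprocess.run(["xterm", "-e", "top"], env=env)   # app in the machine room, pixels on the desk
    ```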

    Then along came PCs putting power (and anarchy) on everybody's desktop. People liked the power, no one (except a few far sighted people) worried about the anarchy. DV is tomorrow's answer to the anarchy.

    [1] virtualisation is what we're talking about, right, and virtualisation is trendy, therefore X11 is good, right?

    1. jake Silver badge

      @AC 22:20

      Other way around. Personal computers (including the late-comer IBM PC) were around long before X or X11. Dad & I built a PDP-11 based Heath H11A in 1978. I first used an IBM PC in early 1981 (was a pilot-build, running PC-DOS 0.98). X first appeared in '84, X11 in '87 ...

      BackInTheDay[tm], we used serial muxes to connect to UNIX[tm] (or BSD, usually, in my case) systems. The first thing I do when installing a new personal system, even today, is hang an IBM 3152 and a model M keyboard off a serial port ...

      PCs as GP computers are only anarchical if the IT staff is incompetent ... which is ALWAYS a management issue. Placing bandaids on broken-by-design systems isn't fixing the problem ... it's a symptom of the problem.

  12. Highlander

    *Sigh* Another round of thin client roulette?

    Does this idea come around every 10-15 years?

    Diskless workstations suck. End of story.

    To provide enough compute power on the desk to handle the presentation layer of Windows, you have to essentially put a PC on the desk, regardless of how you cut it. Whatever is on the desk has keyboard, mouse, monitor, networking, graphics processing and some local CPU to run the thin client on. The only difference between that and a PC is the HDD and perhaps the amount of memory. The user training is the same, the hardware costs are not significantly different, the software costs are not significantly different. The flip side is that you now depend on those centralized servers. OK, some will say - cloud, whatever - it's a cluster of servers, and whether distributed or not, it's the same thing conceptually, because the user's client connects via a network to the server - for everything. Just like VT100s and VAX systems. Just like 3270 and IBM AS/400 or mainframe systems. It's the same old crap again. Except now that all the application processing power and data storage has been centralized, you need some big-assed servers to handle the load. Not only that, but now that your enterprise runs on a virtual desktop, your network and server cloud have to be far more resilient, because now your entire operation depends on them. So you need hot standby servers, much more expensive SAN storage, ridiculous backup requirements and a damn good disaster recovery plan. All of that costs $$$ and has to be managed, administered and supported by a larger team than just your ordinary app servers require.

    All this to save perhaps $200 per desk in hardware costs? Total and complete BS. That doesn't even begin to cover the issues that this kind of centralization brings. Pretty soon you have disk quotas because that SAN storage is fantastically more expensive than the 1TB drives shipping in desktop PCs today. So people start getting pissed that they can't have everything they want on 'their' desktop. Organizations soon find that many, many virtual desktops all running WeatherBug and all the other innumerable task bar trash soak up CPU time, as does Farmville. So those are summarily banned, causing more user unrest. Then the mainframe cycle is repeated when end user groups get tired of the lack of freedom and flexibility and decide to get a few real workstations for their own use, and pretty soon, you have lost control all over again as departments invest in more special workstations and users migrate to the Personal Workstations instead of the shared desktop.

    I've been through this three times now, once transitioning away from mainframe, once experimenting with diskless workstations in a pre-Windows environment, and once dealing with Windows Terminal Server in a predominantly XP environment. I also dabbled with this with Windows NT, but fortunately sanity prevailed and we went with PCs on the desktop. It's the same schtick every time. Overblown reports about TCO of PCs, overly optimistic estimates of TCO for the cloud/virtual desktop/diskless workstation solution. No one ever considers the additional costs on the server side, nor the lack of any real savings on the client side. It all comes down to a bid for control by the centralized IT admin. Which is a poor reason to make a fiscal decision.

This topic is closed for new posts.