12 posts • joined Tuesday 5th April 2011 18:24 GMT
Re: Desktop 'dumped'?
@El Andy: "doesn't starting with the application launcher open make rather more sense than starting with it closed and basically requiring a redundant mouse click just to open it?"
Only if you have nothing pinned to the taskbar, no desktop icons, haven't turned quicklaunch back on, don't run applications on startup, etc.
It might arguably make some sense for the user's first login, but after that (like TIFKAM generally), it's an exercise in getting in the way.
Someone wake me when decent 4:3 or 5:4 monitors are available again. Something running 2560x2048 at around 22" would do very nicely, thanks.
16:9 or 16:10 is just too wide for a proper multi-screen setup. ("proper" = more than 2...)
Re: No portrait-mode?
Well, our field of view is wider than it is tall, so portrait is a little odd in that respect. Cinema and TV were landscape long before computer monitors went in that direction.
The one downside to turning an LCD through 90 degrees is that things like ClearType no longer work properly - subpixel rendering assumes a horizontal RGB stripe, and rotating the panel turns that stripe vertical. You'll generally get better font rendering in landscape.
Ok, so increased cloud stuff (private cloud, presumably) requires closer working between the people with specialised knowledge, whether of storage, servers, networking, whatever.
Fine. But what does that have to do with them being in the same team? If the people involved can work well together, my experience is that they will do so regardless of whether they happen to be in the same team in some management hierarchy somewhere. Likewise, if they don't/can't work well together, putting them in the same team is unlikely to help and stands some chance of making things worse.
Seems to me that what's needed is good working relationships between people, and the selection of the right people to work on a given project. The somewhat artificial division into teams is a side issue. If those working relationships are there, does it make any difference who approves your holiday?
Something that does everything tends to be rubbish at a large fraction of it and tends to include a lot of stuff you don't need. There's a lot to be said for separate tools that do less, but do it well. It's also much easier to mix third-party and in-house that way.
With you up to here:
"Perhaps the long-term answer is to move the data into the cloud,"
Are you not then tied to your cloud provider instead? Even tighter, in fact - moving data between big arrays in your own data centre is at least possible to do yourself, even if it takes a couple of weeks.
Depends on what the data is, but the usual approach of an initial bulk copy, followed by a final sync of the differences, works quite well in my experience. The bulk copy can take as long as it likes as it isn't final (and you can throttle it if needed to avoid affecting live services), and the diff is very fast as it has relatively little to do.
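The two-pass approach can be sketched roughly as below. In practice you'd use rsync or robocopy rather than rolling your own; this is just a minimal Python illustration, and `sync_tree` is a hypothetical helper, not any real tool's API. The same function does both passes: the first call is the bulk copy, and a later call only transfers files that changed in the meantime.

```python
import os
import shutil

def sync_tree(src, dst):
    """Copy files from src to dst, skipping any file whose size and
    mtime already match the copy at dst. Re-running it therefore only
    transfers the diff - the 'final sync' pass. (Hypothetical helper,
    standing in for rsync/robocopy.)"""
    copied = 0
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = dst if rel == "." else os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            st = os.stat(s)
            if os.path.exists(d):
                dt = os.stat(d)
                if dt.st_size == st.st_size and int(dt.st_mtime) == int(st.st_mtime):
                    continue  # unchanged since the bulk pass
            # copy2 preserves the mtime, so the skip check works next pass
            shutil.copy2(s, d)
            copied += 1
    return copied
```

The bulk pass can be interrupted or throttled freely, since the skip check makes it resumable; only the final diff pass needs a service outage, and it has little left to do.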
"Give me one good reason to switch?"
In my case, there were two. Firstly, using Win7 at work, I find it easier to use the same at home. After a while using Win7, XP feels really clunky. (At first, it's the other way around. And, yes, I use Classic Shell.)
Secondly, I moved to an SSD. Win7 provides TRIM support, XP doesn't.
Then there are new features such as BitLocker, newer DirectX versions, etc.
But bear in mind, I never suggested moving for the sake of a newer version. I mentioned that, in my view, a re-install every so often is beneficial. That's a convenient time to upgrade, if you're so inclined, but not essential.
And, as mentioned, I have sometimes kept the OS intact when changing machines (via sysprep). Changing OS and changing hardware are already separate questions, without bringing VMs into the picture.
"There are zero apps I have encountered that require anything newer than XP and which offer even a remotely compelling reason to switch."
We're starting to see some apps where 32bit is treated as legacy, and the expectation is for 64bit. Rare, so far, and specialist stuff, but I expect that will become more common.
"Al you’ve presented is fearmongering..."
Nope, just personal experience after having dealt with automating windows installation, one way or another, since the NT4 days, in an environment with 1000s of PCs and a wide range of hardware types (including departments that can buy their own kit and expect to put our image on it). I find, in practice, that a clean install every 18 months or so has significant benefits. This isn't an idle or untested view, although it obviously relies to some extent on what typically happens to that machine over that period.
I think you'll find that 'Windows needs a re-install every so often' is quite a widely held view, but YMMV.
"I am using a 2004-ish XP vm."
Wow. Well, each to their own and all that, but I don't think keeping an install around for that sort of time is an approach to recommend. The accumulated cruft of hotfixes, app updates, app version changes (e.g. office - or are you still on 2003 there too?) means it will be slower and clunkier than a clean install. I generally consider windows installs to have a lifetime of maybe 18 months, two years at most, before the benefits of a clean install become very significant.
In short, you don't solve the problems of accumulated junk in windows by moving it into a VM.
Do you plan on sticking with XP indefinitely, or are you going to move to something newer at some point? When you do move, aren't you going to run into exactly the problems you're trying to avoid?
I'm also interested in how multi-screen RDP works in practice for this sort of thing. Is it two separate windows, one for each monitor, or is it one really wide window? If the latter, then it's not a very good solution - it's impossible to maximise to one monitor, for example.
I have two primary systems (work and home). One has two screens, the other three. Each has various tweaks to ensure that particular apps go to particular screens, and as the number of screens is different, these are different tweaks on each system.
This is a specific example of the more general case of sometimes needing app and OS configs specific to the device you're trying to use, and having the same settings everywhere is actually a problem, not a benefit.
"Also: you aren't "creating other systems that you need to manage." ..."
Of course you're creating other systems - you've gone from one system, to three (the endpoint, the vm host and the vm guest). All of which need patching, updating, maintaining in various ways.
"This has worked for 7 years. Life is easy."
So you're still using a 2004-era WinXP install as your VM - with all the apps of a similar vintage? That must be a really clean and fast install by now...
On the other hand, if you're rebuilding it every once in a while, how is this different from having to do the same at the PC?
"I fail to understand how this is remotely harder or more scary than reinstalling and reconfiguring every tiny thing each time you get a new endpoint..."
Personally, I don't change "endpoints" that often. And I've either sysprepped and cloned the earlier machine, thereby keeping all the settings and migrating the OS to the new hardware, or I've changed OS and app versions in a fairly major way.
I find most config tweaking is the result of new app versions, not changing "endpoints".
What you describe isn't "scary" - it's an overcomplicated solution to a problem that doesn't really exist in the first place.
If you manage the desktops properly, and have appropriate policies in place, data can be stored centrally even if the OS and apps are local. Likewise, OS and app deployment etc can become solved problems (App-V has its place, but tends to only help with the apps that are easy enough to deploy by other means - try it with ArcGIS sometime).
You then have the benefits of interchangeable client devices, but also have the benefits of local processing power (including local GPU if appropriate), local storage of the app itself, and less dependence on network bandwidth and latency. A room full of users all starting ProEngineer at the same time thrashes only the local disks in each machine, not the network, the central storage, or any central VDI servers. Laptop users can use offline files and the like. Other routes (Citrix, etc) can be provided for remote users to access their apps and data. (This is a far more plausible home-working scenario than VDI.)
At which point, just what is the problem that VDI is trying to solve? The overheads, costs and risks are huge, but where's the benefit? Seems to me the benefit is to those people trying to sell new kit, not to the IT admins, or to the users of the services being provided.
Why oh Why 4
But the article still asks things like "How can IT departments build a business case for desktop virtualisation?", which assumes that they should be doing this, i.e. that VDI is something IT departments should want. The Reg has been relentlessly plugging VDI for quite a while now, for no obvious reason.
Having looked into it, there seem to be no major problems that it "solves" that are not better dealt with by managing the desktop PCs properly in the first place (thereby avoiding the storage and networking headaches that VDI causes). It may have a niche role to play somewhere, but it certainly isn't the general-case answer The Reg keeps trying to make it out to be. It's a solution looking for a problem.