Re: Personally if i was spreading a virus
"Easy prey". A bit on the nose considering the recent arrest of someone who got teenage girls to cavort on webcams by pretending to be the Idiot Noisebox they call Beiber.
Cisco says that 3% of their certifications are at the CCIE level. What kind of percentage were the defunct MS ones?
MS used to publish the numbers of various certifications, but stopped a few years ago. I think they should bring that back.
Whilst I work almost exclusively in the MS stack, I don't recommend their stuff without good reason. And in this case, I just can't.
The main problem is US law rather than any technical point. Bob Muglia shortly before his departure confirmed that a UK company, using Office 365 and/or Azure in the MS datacentre in Ireland, could potentially have their data transferred to a US datacentre for probing by the US government. The UK company would not notice (which is kind of a testament to the technical side, if true) and would not be told. The US government would not need a warrant to do this, but as MS is a US company, they would be compelled to assist.
From a business point, I can see all cloud businesses using the low cost to draw people in, and then slowly escalating fees once you are there. You will be very aware of the shaft, but it will cost more to migrate out. So year after year, the price goes up beyond reasonable cost, and you just accept it.
B: Very true. I tend to use qualifications to paper over the gap in the CV at that point. If I take six months off to go somewhere, I usually spend a month or two getting the latest MS exams done. Any question as to why I have a six month gap "I took the latest round of MCSEs". I just neglect to mention that some of the study time was done on a beach with cocktails being replaced as fast as they were drunk.
More fool you for putting your DOB on there. Why do that in the first place? They don't need your DOB, passport number, address or "Personal Interests", so you are wasting space with it existing anyway.
I have done CV vetting for several places, either putting together a contracting team for a project or hiring permanent replacements (including my own: I'm not staying for a year to babysit a system, I prefer building them) and the truth is that few people, permanent or contract, are aware of how to put a good CV together.
It used to be that your address was there as contact info, to send out an acceptance/rejection letter. That isn't done any more, so take it off there. DOB baffles me. Is it there so we know when to send you a birthday card?
But the worst offender is "Personal Interests". This waste of CV space isn't going to impress or intrigue anyone, even if you put (I swear these are true) Bear Wrestling, and Spaghetti.
Why can't a single politician just take a stand and speak the (Chair Leg of) truth:
"Everyone's looking for someone to blame. Society. Culture. Hollywood. Predators. Looking everywhere but the right place. Children are very simple. Very easy devices to break, or assemble wrong. You want to know who did this to these kids? Only their parents. That's the thing no one wants to hear. Every time you stop thinking about how you're treating your kid, you make one of these. It really is as simple as that. It's got nothing to do with the failure of the society or any of that. It's got everything to do with the responsibility of making a human. Why are your kids looking at porn on the internet? Because you fucked up the job of raising them. That's what no one wants to hear. That we can't blame anything outside our houses." [With apologies to Mr Ellis for the slight bastardisation]
There are many technological stops to this specific problem, be they blacklists, whitelists, etc. But ultimately, the single biggest solution to any of these problems be they internet porn, violent video games, violent movies or "suggestive" music, is for the parents to monitor and control what media their children have access to and consume.
"But I can't watch them 24 hours a day", "They don't listen to me", "I don't have the time" are just bullshit excuses. You chose to have that kid, you do the job of raising them. If you can't, then give them up. Only at that point does their media consumption become the problem of the state: when they are in state care. Otherwise, do your damn job.
I remember having such problems with that little bastard. No matter how well we measured, he would always run short, or not quite make the turn.
After a little testing, we worked out that the board he was moving around on was too shiny for the wheels. Cue the caretaker with some sandpaper, and suddenly all our previous "wrong" programs work fine.
Kind of sad that nowadays he would be seen as too primitive, when really it is just the kind of introduction to IT that is needed around 6 or 7.
That maybe only seven percent of people are QUALIFIED to be considered professionals in anything at that age?
I was lucky enough to be started on my IT career at the age of 6 programming my own games on the ZX Spectrum (a fairly hefty tome full of BASIC-coded games was my first tutor), which gave me a slight headstart.
Most people getting into IT now seem to start the proper work at college or university. Before that, IT lessons consist of "How to use Word" and "Spreadsheets for Beginners". The extra-curricular computer activity is almost non-existent.
If you want younger experts, you need to teach them better skills from an earlier age. When I went through the GCSE IT programme if I hadn't had my previous experience, I would have been put off completely. Most of my classmates were. From a class of 30, I am the only one with a career in IT.
I am paying my cable bill in the US, but I'm over in the UK. Torrents allow me to watch something I paid for without weekly international flights.
No, I'm not having Sky installed, or installing iTunes so I can pay twice for the same show.
As a worker, I want several things to convince me that BYOD is worth it. I want compensation on my device, because if I have to buy my own tools for a job it shouldn't come out of my own pocket. I want to keep complete control over it, I'm not having anyone but me authorised to access it. I'm not having it subject to any corporate policies regarding content, searches or deletions.
As an admin, I need assurances that any device connected to the network isn't crawling with viruses, spyware, gypsy curses and the like. I need to know that data remains under the control of the company as much as possible. That means the ability to prevent copying it anywhere it shouldn't go, and removing it when they leave the company.
The only way I can see to marry these two positions is a combination of BYOD with VDI. Remote sessions that keep the data within the company bounds whilst allowing BYOD to be used. Security holes that come to mind are things like screenshots, but they are fairly minimal. It just depends a lot on the ability to control the VDI context.
Both BYOD and VDI are far from maturity. I'll watch what happens, but I won't be an early adopter of either technology for the next 2-5 years.
This explains why I couldn't pay for the petrol this morning. Luckily I had enough Yanqui play-money scattered around my pockets to convince the guy that I would come back when I found a cashpoint, and wasn't trying to do a runner.
This is the second time I have regretted staying with RBS/Natwest. Time to move, I think. Question is, where?
Outsource to India? Not really. There is the same problem with initiative. If you want a bank of graduates to stumble their way through a script, fine. But proper 3rd line admin, SME and project work? That requires experience, and people who aren't afraid to poke around to find the cause of an issue (and whilst poking, know which parts to avoid prodding too hard.)
I don't care what the added benefits are, they lost me at "updates are automatically applied" because you just know that either:
a) Users will happily ignore any prompts to restart or shutdown the application or machine when required, so critical patches won't get applied
b) A patch will go in and mess up a beancounter's lovingly crafted macros, causing much whining from the Legume-reckoning pit. Cue an outcry, someone pops up with an LO/OO suggestion, a switchover is suggested, more teeth-gnashing, this time from everyone in general when they find old documents are no longer correctly formatted, and then a switch back to MS Office, costing everyone time, money, the will to live and yardage on their hairline.
Had I not seen it happen myself, I would chalk it up to a fevered BOFH dream.
The majority of offices use MS Office. They send documents to other people and those people need to be able to see them correctly. LO and OO don't do that. For a simple "home user" test on a document or spreadsheet, they can handle it. But an accounting spreadsheet with macros, or a long complex document? Forget it.
MS have created a monopoly. Is it the best software? No. Does it do the job? Yes. Their format won. And so the first thing LO and OO should do to have any chance of widespread corporate acceptance is to match the MS format exactly, in every case.
Everyone always gives MS shit about not following the XML or HTML format when the majority of people use them as written. And that is absolutely fine. MS are the minority there, they should follow the spec.
But in the corporate world, the MS document specs are the de facto standard. And so LO and OO should follow them. Because every time I see a document formatting issue crop up, it's someone opening an MS document wondering why LO or OO has a problem with it.
Get the formatting right first, then add proper useful features. That will make LO/OO viable. Until then, they aren't going to be accepted in the business world in the numbers you want.
Mandate that instead of a pint of beer, it's a litre of beer. For the same price.
Guaranteed overnight conversion, courtesy of well-lubricated new SI zealots.
But yet again, it's a straw man.
See, if you knew anything about how Exchange or Outlook worked you wouldn't have said this. And if you had any kind of corporate experience, you definitely wouldn't have said this.
The version of Exchange that created the mailbox is irrelevant. You can migrate all your old mailboxes up to the latest version with little hassle, and Microsoft will support you for this. There's even provision in the licensing agreements for it. So if you have to stand up a 2000 Exchange and a 2003 to recover an old mailbox from tape, not a problem. I still work migration projects where this is the case. Tapes are stored for a decade or more, since nobody ever throws things away.
As for Outlook, do you think corporate bosses care that much about support? I'm on site where Office 2003 is still in use. So that's your argument blown out of the water.
As for "some suppliers don't support their software running in virtual environments (other than their own...)" MS were supporting their stuff in VMWare and Citrix for years. So your "subtle" dig here is without merit also.
If you want to argue against proprietary standards, you are better off aiming at "Document Management" systems, which usually lock customers in for life since there is no easy way to export information for use in another competing system.
What I have encountered as a contractor is that the entrenched decision-making staff (sometimes an IT Manager, sometimes a Project Manager) don't want to pay for the right thing to be done in the right way.
It could take £n and x weeks to do the job properly, however for £2/3n and 3/4x weeks you can have a kludge of a system running that users hate because it only does half the job, and does it badly.
And the Manager responsible will pick the kludge, every time. "It can't be as bad as you say." "I'm sure it will be fine." "My design doesn't break Swiss law."
Yes it was; no it wasn't; and oh yes indeed it does, as the auditors are now confirming.
'Do more with less' is the mantra of someone who will accept sub-standard work, right up to the point it fails, when you will be blamed for it.
And if the IT person refuses to do sub-standard work? They are replaced with someone who will. Because the Manager fails to see the problem: good IT costs more than most are willing to spend on it. It requires time spent thinking, so that whatever money the IT budget does get isn't spent on an incompatible solution. It needs a testing facility, something that I rarely see in any workplace, large or small.
IT requires policies that help it rather than hamper it; "Don't patch anything; it might break it" (NHS client, obviously) is the opposite. It needs a Manager who understands that even if the rest of the business is 9-5, the IT department isn't, so maybe clockwatching isn't how to measure performance. The hours I keep on a contract will rarely start before 11am, because the out-of-hours window for implementation doesn't start until 6pm. Some days I won't start until then, and be finished just as the sun is coming up. But I can still get the "Where are you?" call at 9:30 from the Manager who believes that Everybody Should Be In At Nine, Because That Is The Way Things Are. And not because a meeting is scheduled, not because anything is broken, just because there is an empty desk.
My first thought was that a policy of certificate expiry on the first of the following month would have avoided all this.
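As a minimal sketch of what such a policy might look like, assuming a certificate's natural lifetime is simply rounded forward to the first of the month after it would otherwise lapse (the function name and default lifetime are my own illustration, not anyone's real policy):

```python
from datetime import date, timedelta

def policy_expiry(issued: date, lifetime_days: int = 365) -> date:
    """Round the certificate's natural expiry forward to the 1st of
    the following month, so renewals always land on a predictable day."""
    natural = issued + timedelta(days=lifetime_days)
    if natural.month == 12:
        return date(natural.year + 1, 1, 1)
    return date(natural.year, natural.month + 1, 1)

# A cert issued 15 March 2011 with a one-year lifetime then expires
# on 1 April 2012 rather than quietly lapsing mid-month.
print(policy_expiry(date(2011, 3, 15)))  # 2012-04-01
```

The point is that every renewal in the estate then falls on a known day of the month, which is much harder to miss than an arbitrary anniversary.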
Companies should change the way they operate? How about these "me me me" children grow the hell up and learn that life doesn't revolve around them?
What company will really open up a network to a swathe of potentially compromised devices just so employees can use their new shiny lookit instead of a company controlled, trusted machine?
Continuous partial attention = easily distracted. Only 2% of people are capable of true multitasking, the rest just believe they are. Tabbing out of youtube every five minutes to check for updates on facebook isn't multitasking.
Automatic tagging of data? Yes, because AI is going to be able to accurately identify complex information in a wide variety of formats, some of them bespoke to the company. If CAPTCHA can defeat it, I doubt it will be able to cope with a scan of a bank statement or clinical record.
Users can't be trusted to tag data themselves? Not even when it is their job? People who can't be trusted to do their job get fired, simple as that. Or is that some kind of discrimination against the lazy and incompetent?
Cloud audit trails. Like who in the cloud provider has the ability to access data placed there, and where in the world the data is physically located at any one time? How do multi-nationals prevent breaking the law when German mailboxes get moved to a datacentre in Ireland? Who has liability and/or culpability in a case where a user claims actionable messages were not sent by them, but possibly an administrator?
Shut down access to information quickly. As in, the data stored on a personally owned device that was joined to a company network. A personal device that you may not have technical control over. That you may even find you can't legally touch once the employee has been terminated.
This "new generation" might have radically different approaches to the Internet, information and security. Doesn't make them correct, worthy of emulation or the new arbiters of company policy.
Most people will use MMYY or DDMM, so those are the first two I would try.
I have a personal UK debit card, 2 personal UK credit cards, a UK online banking access card, a Swiss debit card, a Swiss credit card and a Swiss online banking access card. All have PIN numbers, and I don't want to use the same number on them all. But I can't memorise 7 PINs plus all the other passwords I use every day. I use two cards frequently, and the others rarely. I can't be forgetting them when I'm abroad.
So I use an algorithm for the cards that lets me carry the numbers in plain sight.
4929 7014 5583 4826 <- Not my card
The simplest algorithm could be to choose a block of four, but in common practice that is too vulnerable. Here are a few different ones that I could use:
One from each group = 4754
First from first, second from second, etc = 4086
Fourth from first, third from second, etc = 9154
Two from the first group, two from the back = 12 combinations in each group, for 144 possible
And so on. You can even use different algorithms for different types of cards, so in my example I can use one for debit cards and another for credit cards. Or one for UK and one for Swiss.
The important thing is that you can simply remember the rules, and look at the card every time you use it. One rule for 7 cards means 7 PINs that are as random as anything the bank will generate, it is right in front of you and yet no-one will see it.
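For illustration only, the pick-patterns above can be expressed as a tiny function (`pin_from_card` is my own name for it, and the card number is the made-up example from the post, not a real card):

```python
def pin_from_card(card_number: str, picks: list) -> str:
    """Derive a PIN from a card number printed in four-digit groups.

    picks is a list of (group, digit) pairs, zero-based: the rule
    "one from each group" (first digit of each) is
    [(0, 0), (1, 0), (2, 0), (3, 0)].
    """
    groups = card_number.split()
    return "".join(groups[g][d] for g, d in picks)

card = "4929 7014 5583 4826"  # the example above, not a real card
print(pin_from_card(card, [(0, 0), (1, 0), (2, 0), (3, 0)]))  # 4754
print(pin_from_card(card, [(0, 0), (1, 1), (2, 2), (3, 3)]))  # 4086
print(pin_from_card(card, [(0, 3), (1, 2), (2, 1), (3, 0)]))  # 9154
```

You never write the rule down, of course; the whole trick is that only the pick-pattern lives in your head while the digits sit in plain sight.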
We should be the first country to use UTC permanently. It would be a minimal disruption for us, and might lead the way to every country using it and the abolition of time zones.
But the obstacle is convincing people that 9am doesn't need to be the time to start work every single day of the year. And it really shouldn't be that difficult. Imagine you work 9-5 every day. Then your boss says "From next week, we need you to start at 8, and finish at 4." What would you do? Set your alarm an hour earlier. Everyone would be able to cope with this a lot easier than the current system of turning every clock in the place forward or back (and missing one, or going the wrong way, or forgetting entirely) and which could be staggered by individual companies, schools, etc.
Benefits include less of a peak during rush-hour traffic, no need for clock reconfiguration twice a year, and finally ending the debate over "putting the clocks back makes it dark on the school run. So start school later in the day, you damn idiot!"
From what I understand (don't do much storage myself, I usually have someone else handle that) the IBM Storwize does this kind of thing too. You throw in a mix of SSD and HDD, and it will put the most frequently accessed data on the SSD. When it gets old, it moves it off the SSD and puts the newer, more frequently accessed stuff on there.
You can't mix 2.5" and 3.5" disks in an enclosure, but you can mix enclosures in a system. So you can (theoretically) use the 3.5" large slower disks as your cold disks in the "hot, warm, cold" strategy. Whether it is intelligent enough to recognise that ability, or it only knows "SSD fast, HDD slow" I don't know, but its worth a look.
Small company that relies on e-commerce and e-mail "downsizes" their sysadmin to replace with a cheaper outsourcing company.
Three months later, their site and e-mail stops working. Numerous phone calls to the outsourcing company yield nothing. I am called in to troubleshoot a week later. One WHOIS trawl, and I ask "so who is John xxx?" "He was our old sysadmin" "Well you may want to call and ask him for your domain back."
The sysadmin had been paying the bills through his limited company, and effectively "owned" the domain. When the renewal came up, it was forwarded to a parking site. Not sure whether the company bought the domain back, went through the arbitration, or some other solution. But at every company since, I have been interested to see that a lot of sysadmins do this as a form of "insurance", ostensibly because "it's easier to have them contact me."
IS part of the core business. An accountancy company that eliminated IT spending would have to spend just as much (almost certainly more) on staff, ledgers, notebooks, pens, pencils and ink. To say that IT isn't the core business is either a deliberate attempt to push the Cloud case, or a fundamental misunderstanding of exactly why IT is used in a business in the first place.
The ONLY reason to make a change in how IT is used by a company is to increase security, increase functionality or decrease costs, and none of these should be done at the expense of the other two.
Cloud (really just hosted services) might decrease costs due to scale, but does it do so without impacting the other two? We have already seen Cloud outages that have severely impacted businesses. As more people move to the platform, the more critical an outage will be, to the point that a single outage will have a measurable impact on the GDP.
And don't argue that companies have outages with their in-house IT. A Cloud outage eliminates all services at once, for multiple companies. Few companies hosting their own servers will have ever suffered a complete outage. One application might go down, or one network segment. But never the entire thing at once, short of a power outage (which would take the desktops with it as well) and that usually stops all companies in their tracks.
Cloud is a way to put all your eggs in someone else's very flimsy, untested basket in order to save a few quid. And that saving is eliminated with the first outage, because the stress and downtime will cost many times over.
"He also told the representatives he planned to disclose vulnerability details publicly **once a patch was released.**"
To me, that says he was willing to wait for them to fix the problem before telling people. So he simply wanted the credit for finding the hole, and wasn't making any threat of any kind.
He told the company first before going to the press, offered help to fix the problem he found (yes he would have wanted paying for doing work, what a concept) and either way would keep the problem quiet until it was fixed.
And this is the response he gets.
it probably wasn't a consultant who designed the network. In fact, the network was likely never designed at all. A room had computers put in, and so cables were run to it. As a switch fills up, another one is added on to expand capacity. If anyone mentions redundancy, it probably goes something like this:
"We should probably get a second switch for resilience."
"What do you mean?"
"Well if this switch breaks..."
<irate>"Why would it break? Have you recommended the wrong thing? We have paid enough for it, why would it fail?"
<techie mentally weighs the likelihood of this ending well> "...no, it's fine. Forget it"
Management is obsessed with avoiding blame. If they hear that there is any risk *at all* in doing something, they simply won't do it. If it will cost money, they won't do it for fear of blame over the budgets.
Again, IT is seen as a cost-centre, rather than a system that enables people to do their jobs. So everything is done cheap, crap and quiet.
...the building society didn't want the source code for its financial systems to be public knowledge, even in part? Either as a security measure (and feel free to parrot the old "security by obscurity" line, but if that is one of say eight measures taken it is more effective than simply taking the other seven) or because the formulae and methodology they use are proprietary in and of themselves.
Or maybe they are a completely MS shop, and don't want to have to deal with anything else. Whether you are MS or Linux, you can't deny that having all the boxes running the same thing makes your job easier from an administrator's point of view. I have worked in places that were all MS, and a mix of MS, Linux, ESX and crusty mainframe. The all-MS shop ran a lot smoother, not because it was MS, but because there were no interoperability problems and the daily overheads for patching and troubleshooting were lower.
As to the story, it isn't the OS you run that determines your innovation. First you need ideas, then you need funding. We have ideas, but trying to get funding in this economy is a nightmare.
Thanks for that. A real kick in the teeth first thing in the morning for everyone hired through an agency. Which is probably 90% of people.
Or it would be, if the article didn't read like an OSS masturbatory piece. "Look! Look! OSS is how Jesus would work! We have unicorns!"
You mean she kept forgetting her password.
And the council went with the cheap option of issuing encrypted sticks (+2 points) and trusting users to make sure they used only those (-20 points).
The more expensive option would be to install software that prevents anything other than an encrypted stick from working. But it's still cheaper to pay the toothless ICO fine.
Until the ICO can mandate that the offenders take specific steps to remedy the situation, these data breaches will continue.
Why was an .exe allowed to go through the mail system in the first place?
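Blocking by attachment name is the bare minimum a mail gateway should do. A hedged sketch of that rule (the extension list and function name are my own illustration, not any specific product's filter):

```python
import os

BLOCKED = {".exe", ".scr", ".bat", ".com", ".pif", ".vbs"}

def should_quarantine(attachment_names):
    """True if any attachment name carries an executable extension.

    Name-only checks are easily defeated (rename evil.exe to
    evil.txt), which is why real gateways also sniff file content.
    """
    return any(os.path.splitext(name.lower())[1] in BLOCKED
               for name in attachment_names)

print(should_quarantine(["report.pdf", "update.EXE"]))  # True
print(should_quarantine(["minutes.docx"]))              # False
```

Even this crude check would have stopped the .exe in question at the gateway rather than in the user's inbox.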
calling it compensation, isn't it?
When moving from one product to another, it is going to cost time and money to scope out the migration, train staff, etc. MS are simply removing one obstacle on the path. I often hear a variation of the phrase "We'd love to switch to x/y/z, but just don't have the budget."
but it's likely this was just a larger version of the same practices done by workers at petrol stations, supermarkets and other places with such loyalty schemes: swiping their own card when a customer doesn't participate in the scheme.
I have seen this done hundreds of times, but on the smaller scale. One cashier picking up an extra 200-400 points per shift can net around 70,000 points a year. If a point is worth 1p (common supermarket exchange rate) it's a small increase on top of their minimum wage.
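The arithmetic behind that ballpark, with my own assumptions for the shift pattern filled in:

```python
points_per_shift = 300   # midpoint of the 200-400 range quoted above
shifts_per_week = 5      # assumed full-time rota
weeks_per_year = 47      # assumed: 52 weeks minus holiday

points = points_per_shift * shifts_per_week * weeks_per_year
print(points)            # 70500 -- the "around 70,000 points" figure
print(points * 0.01)     # 705.0 pounds a year at 1p per point
```

Roughly £700 a year of untraceable skim per till, which is why the schemes' terms ban staff from swiping their own cards for customer purchases.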
It seems you are saying that only the most damaging secrets should be classified. But what would happen then?
If there is a chance of communications being intercepted, best practice is to encrypt everything, so that an attacker doesn't know what to target. It appears that this has been taken to heart. Office gossip, bad-mouthing and the like might be embarrassing, but there is nothing truly revelatory here. And that might just be the point.
Little of column A, little of column B.
SBO as the only security implementation is doomed to fail. But, if StrongType means that implementing SBO *in addition* to other security practices is a good idea, well, it kind of is.
There is a reason safe doors are opaque and not transparent. It's a lot easier to crack the thing if you can see all the moving parts.
just turn the tile upside down?
It's not mold or fungus or somesuch, it's just the pattern of the tile. Flip it over and you likely won't recognise a face in it.
Now if the face stays the same way, THAT'S the devil at work...
I think the last numbers I saw (2008/2009) were Windows 33.6%, Unix 33.3%, Linux 17% and 16.1% "other". I'll try and find a link, but I don't want to start "my graph beats your graph".
My numbers were close. I'll leave the
The growth in linux is down to things like blades, virtualisation kit, etc.
But servers do not sit in isolation. Users access them through the desktop environment, and as long as there exists the combination of multiple "incompatible" distros, lack of a unified GUI for regular business users, and the holier-than-thou attitude amongst Linux aficionados, Windows is still going to be dominant.
(Note: Yes, I'm an MS user. But I think Vista sucked, and I find IIS unwieldy. Nothing MS makes is perfect. But most is good enough to get the job done, and there are some hidden gems. That's all I expect Linux to be as well. If only the zealots would calm down and admit it, we could all move on.)
but the "know where each setting is" is not quite right. For example, in IE if I want to increase the text size in a page, but not the pictures, how can I do it? Is it even possible?
A quick poke around in the menu bar, under View, says yes it is. And I don't risk changing anything else unless I am a bit heavy-handed with the mouse. A simplified example, but true. You can do the same in Exchange (although some stuff is in the Powershell), OCS, SCOM.
And whilst there are some GUIs for linux, that is the minority. Most of the work is done in the command line (don't think I am solely GUI. I am having great fun with Powershell) whilst in MS products the reverse is true: the command line and Powershell are for rare tasks that need them.
Command line linux is a place where mistakes can cause great damage. I wouldn't want to experiment in there without a hazmat suit and a box of rabbits' feet.
Since the first place most admins learn their job is on the desktop (remember the days of the helldesk?) most will be entering the world of Windows and MS apps. So they proceed on to third-line and Admin as Windows people. And until there is a user-friendly linux desktop for the business, Linux is going to continue to creep in via the backdoor.
Linux is usually installed because there is one specific application the business is looking to use. And so they end up with say, RHEL. Then a second app is bought, because we already have some linux. Except this one will only run on Suse. The difference between the command lines can mean someone unfamiliar with Linux can make mistakes.
The way I can see for Linux to progress is for it to have a single GUI that works for all distros. A desktop user doesn't care if they have Suse or Ubuntu. They want to just do their job. The admin doesn't want to have to know thirteen different commands to delete a folder. They just want to do their job.
"Oh but the command line is so powerful." But power that sits unused is wasted.
I just dislike the attitude of the Linux zealots who see any mention of their beloved OS as an excuse to start banging on about how much better it is than Windows.
The psychological price they force people to pay is just one reason amongst many that people stick with Windows, however good, bad, ugly or indifferent it is.
If they put their time to better use, such as making Linux friendly and easy for new users, they might actually gain a measurable increase in their userbase. But as long as they continue to act as they do, Windows will dominate. Not because of marketing, not because of legacy systems issues, not because people are too stupid to try anything else, not because of familiarity. Because of these zealots.
Why still using 2003?
...Is not a helpful or friendly response to those taking their first steps towards trying a Linux OS.
I don't care if Linux sneezes double rainbows and craps gold, as long as this attitude prevails amongst its users, I won't be among them.
Bill, because a GUI's learning curve is less steep, making it easier to just get on with my job instead of memorising "rm /usr/src/linux ; ln -s /usr/src/linux-*.*.* ; cd /usr/src/linux". I have actual work to do, thanks.
You can wipe that smug grin off your face, because Linux market share has the same percentages as the survey rounding errors.
I don't care about how hard it is to learn a new OS. I just don't want to have to deal with people like you.
And there is no apostrophe in Windows.
It is deemed illegal to impersonate someone online.
The FBI are impersonating people online.
Regardless of their intentions, they are doing exactly what they say is illegal.
Reading Comprehension Fail.
in order to prevent these script readers and cable monkeys being classed as IT professionals.
Doctors have their boards of certification, lawyers have the Bar exams. And you don't get orderlies performing operations "because we are all health workers."
So many problems in IT setups that I am called in to fix stem from people not knowing what they are doing, yet being able to call themselves "technical architect" or "Chief Information Officer" (he was the only IT worker in the place) because there is no industry standard to prohibit someone woefully unqualified from using those titles.
Microsoft have their Master and Architect qualifications, Cisco has the CCIE. These are rigorous, in-depth qualifications that are close or identical to the peer-reviewed systems of doctors and lawyers. They are a start, but without protection of the titles such as Systems Engineer, Technical Architect, Developer et al. the real professionals in the IT industry will continue to be lumped together with the ignorant call centre script readers that now pass for 1st line support.
Get the ok to perform "a typical office desktop crisis" from one of the higher-ups. Then come in late at night, unplug all the machines and pile them in an IT storeroom (or white van at the back entrance).
Your line the following morning is "oh God, we've been robbed!" and then you proceed to enact your DR plan of handing out stock machines to key people (you have a DR plan, right?)
Once you have wide-eyed managers screaming "just buy us some more desktops now!" you get to tell them the 2-3 day delivery times and watch their heads explode.
Then you can bring the desktops out of the storeroom/van and tell them you were just illustrating the point of how important desktops are to the business.
UK Sale of Goods Act means that the warranty should last for as long as is "reasonable." For a cheap wall clock, then one year is appropriate.
For a top-of-the-line console, being charged more than £100 for a replacement after 18 months is not only insulting, but a breach of the Act.