Wait.... Is it Thursday or Friday?
I'm baffled! But a cheap UPS should fix it....
"What the fuck just happened?" the Boss garbles, crashing around Mission Control like a madman after dashing down two flights of stairs from the 4th floor boardroom. "Uh.... UPS failure," the PFY says calmly, glancing up from his monitor briefly. "Well aren't you going to do anything about it?" "I am," he responds. "I have …
".....the art of keeping a straight face...." I always find it helps to think of Bambi's mum getting shot at those points. Right before you pass on the email trail from the cretin in question in which you laid out the risks and they replied completely ignoring them. It then helps to say bumph like; "But pointing fingers isn't going to help here, so let's look at what we need to do to fix it." What you actually mean is "Let's look at how much extra consulting I'm going to charge you to fix it," which of course you will have - coincidentally, just as a contingency, of course - already have prepared in advance. Ah, good times, good times....
> "We're using the kit that you spec'd ..."
I *never* say that.
I always use something like "The kit specifically required by document <blah>". Then, when pushed on the question of "well, who wrote that document?", I have to mention that my memory is not what it once was, and I'd have to look that up.
All the time, the git in the corner who actually specified the crap is turning puce. It's about this point where he finally owns up (as he's going to get found out anyway...)
Vic.
Heh. Many moons ago - the '90s actually - I and my wingman were told to build some servers, since it was cheaper than buying them. We went shopping and came back with all the parts, the highest quality we could grab within the budget set. The owner nearly fainted though when he saw the cases we got. He insisted we wait while he dashed off to trade in the cases for far less expensive stamped pieces that were vaguely prismatic in form and which lacked right angles in any plane at any corner. The covers were also oddly non-Euclidean. Tightening down a cover caused the MB to torque, popping out cards and memory like a Las Vegas slot machine hitting a jackpot. Just sliding the cover on was OK, but of course left the innards exposed to a fair bit of rapid cruft accumulation. The boss liked to bring in his Golden Retriever to work. The cases lasted until shedding season led to a strong odor of burning hair.
When asked what the hell WE did, we simply explained that 1) those cheap cases could not be properly closed up without causing equipment failure, so the covers didn't serve much purpose except to keep errant paper planes from shorting something critical, and 2) the hair that had caught fire and ruined the video card was not off OUR dog. Sadly, we had to rebuild everything.
That's the idiots who actually keep reading this drivel, right?
Honestly, the entire BOFH thing was tired & derivative of itself before it left Usenet. Stop it, already. You've been embarrassing yourselves for over a decade.
(Yes, I've been waiting for a suitable headline. No, I didn't read the bilge.)
Not really, and I don't bother downvoting them either. I'm of the belief that he does it deliberately in an attempt to get as many negs (read: "as much attention") as possible. Being the most unpopular person on a message board still makes you the most of something, after all. Far be it from me to help stroke his ego.
The BOFH will remain topical and enjoyable for longer than everything! Yahoo! will! continue!, then+ Google+ comments+ will+ endure+, longer+ than+ facebook...bitch will...bitch be...bitch annoying...bitch and until after the term freetard is no longer used for people with a difference of opinion.
I must ask my boss if she would start bringing in some every morning for the AM break.
<shock>
You said pastie, shit, I thought you said pastry
</shock>
OOPS!!! I bet that didn't go over too well.
Paris icon, because, well, she knows about pasties, I guess.
It's been too long since BOFH played with any serious electrical hardware (bring back the pinch-that-wasn't!). The only thing better would be a small fire. Or a bloody big fire.
They put a defibrillator in our office recently. I only had to look at it before the first-aider told me very firmly "No".
They actually trained the PFY and me on using it here. I think they wanted to show that the failsafes mean that no voltage can be delivered unless the heart is in V-fib....
And I am sure that there is no way the safety measures installed couldn't be bypassed in any way.... And the paddles definitely couldn't get connected to the metal chair.....
Anon as we don't want those HSE lot poking around, some spots in the local woods are already 2ft higher than they used to be...
In my experience, UPSs are, to the IT establishment, essentially a Cinderella technology--they've no idea how the technology works but they take it for granted until the midnight of its failure.
Few IT-ers have the vaguest notion about the chemistry, operation and charging of Planté's lead-acid cell, and probably only a handful across the whole country would have ever heard of the Nernst equation with respect to the charging of such a cell and that it's integral to the reliable charging and longevity thereof. Thus it's little wonder that this black-box is simply ignored until the morning after the disaster.
Methinks UPSs should be taken away from IT in the same way that incoming mains power is left to electricians and power distribution authorities. It's not that IT shouldn't care about UPSs; of course they should. Rather, others more suited to their management should be responsible for them, and IT should be told and *know* key parameters such as incoming power reliability statistics, UPS supply reliability parameters etc.; in addition, they should have immediately to hand emergency numbers for those responsible for both incoming mains power and UPS maintenance. Such info should be well integrated into well-established emergency procedures for dealing with power outages.
All too often, IT just buck-passes UPS issues. Procedures and policies thus should ensure that this cannot happen.
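For the curious, the Nernst equation mentioned above is easy to sketch numerically. This is a minimal illustration only: the nominal ~2.05 V cell potential and the reaction quotients below are illustrative round numbers, not measured lead-acid values.

```python
import math

R = 8.314    # gas constant, J/(mol.K)
F = 96485.0  # Faraday constant, C/mol

def nernst_potential(e0, n, q, temp_k=298.15):
    """Cell potential via the Nernst equation: E = E0 - (RT/nF) * ln(Q)."""
    return e0 - (R * temp_k / (n * F)) * math.log(q)

# Illustrative numbers for a nominal ~2.05 V lead-acid cell (n = 2 electrons):
# as acid is consumed during discharge, Q rises and the potential falls.
print(round(nernst_potential(2.05, 2, 1.0), 3))    # at Q = 1, E equals E0
print(round(nernst_potential(2.05, 2, 100.0), 3))  # higher Q, lower potential
```

The point, for charger design, is that the cell's equilibrium potential shifts with electrolyte concentration, which is exactly the parameter a dumb constant-voltage charger ignores.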
IT don't control the UPS systems where I am working. It's been given to the Site services team (e.g. electricians).
So at present, there are two water sprinklers from the fire control system in the server room; one above the industrial-sized UPS, one above the mainframe-sized telephone system. (I asked if the water was active and got a blank look; I still don't know.)
The power input cables need to be replaced as they were under-specced; it's been highlighted for 2 years and there have been 2 separate occasions in the past 4 months when they were going to replace them, but the wrong stuff was delivered, so the project was put on hold.
Add to that, there are 5 racks around the factory with UPS protecting them; 3 of these have flashing red lights indicating a fault. These have been showing the same fault since I have been here. I'm told that a request for replacement batteries was submitted and they are waiting for details of when this will happen.
I'm not saying that Graham is wrong; it is just down to the individuals concerned. Some do a good job, some don't; that is a fact of life. IT staff are generally no better and no worse than any others.
I saw what happens if you leave a battery in a UPS for ~3 years after it first complains that the battery is failing.
First, the battery gradually swells. Initially past the size of the aperture through which you are supposed to replace it. Good built-in obsolescence, that. So they needed a new UPS not a new battery. Then it swells past the size of the welded steel compartment it rests in, with sufficient pressure to bend the steel. By the time they'd procured a whole new UPS the old one was immovably wedged in the rack. But you couldn't actually see why it was stuck, until finally, one night ...
It exploded.
Not on my ship. Someone sent me the pictures.
Been there (nearly)
Luckily, I got ours out before it exploded.
But it was in the *top* of the rack, which is a tricky place to put 40+kg of batteries in the first place. When they've pinned themselves into their hidey-hole (humuhumunukunukuapua'a-style) it becomes a real mission...
It's been given to the Site services team (e.g. electricians).
It sounds like either your electricians are lazy, or they're held up by some fucking beancounter who is blocking those needed repairs.
Case in point: The complex where I work was supposed to have the main wires from the utility transformers to the main disconnects wired with COPPER WIRE, according to the architect's specs. Some stupid beancounter decided to replace the copper with ALUMINUM WIRE of the same size. An aluminum conductor has about 70% of the current-carrying capacity of an identically sized copper conductor. So, guess what happened on a hot summer day!!!
All I can say is that the lawyers got richer.
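The upsizing implied by that 70% figure is easy to put numbers on. A back-of-envelope sketch (the 400 A rating below is a made-up example, and real conductor sizing follows code ampacity tables, not a single ratio):

```python
def aluminum_size_factor(al_ampacity_ratio=0.70):
    """If aluminum carries ~70% of the current of same-size copper
    (the figure quoted above), the aluminum conductor must be upsized
    by roughly 1/0.70 to match the copper's ampacity."""
    return 1.0 / al_ampacity_ratio

# A copper run rated 400 A, swapped like-for-like to aluminum:
cu_rating = 400
al_rating = cu_rating * 0.70
print(al_rating)                        # 280.0 -- overloaded on a hot day
print(round(aluminum_size_factor(), 2)) # ~1.43x the cross-section needed
```

Which is why a "same size" substitution saves the beancounter money right up until the insulation starts smoking.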
*shudder*
The Nernst equation: just thinking of it gives me a cold sweat and takes me back to the horrible electrochemistry stuff I had to learn at uni.
Of course, the end user shouldn't have to worry about, or indeed know about, the internal workings of their UPS. What they do need to know is how much juice it can hold, how much power it can supply, and how many charge cycles it is good for before the battery dies. After all, it is just a glorified car battery, and you don't expect the average driver to go round fiddling with the insides of one of them.
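The "how much juice" question is, to a first approximation, just usable watt-hours divided by the load. A rough sketch; the efficiency and usable-fraction figures are ballpark assumptions, not vendor data:

```python
def runtime_minutes(battery_wh, load_w, inverter_eff=0.9, usable_fraction=0.8):
    """Very rough UPS runtime estimate: usable watt-hours over the load.
    inverter_eff and usable_fraction are assumed round numbers, since
    you can't run lead-acid flat and inverters aren't lossless."""
    return battery_wh * usable_fraction * inverter_eff / load_w * 60

# e.g. a 12 V, 9 Ah battery (108 Wh) feeding a 200 W load:
print(round(runtime_minutes(12 * 9, 200), 1))  # roughly 23 minutes
```

Enough time to shut down cleanly, which is all a small UPS is really for.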
Few IT-ers have the vaguest notion about the chemistry, operation and charging of Planté's lead-acid cell
Hmm, chemistry. H2 exits and reunites with O2 with an attractive amount of usable violence. You may just have written part of the next BOFH..
"... essentially a Cinderella technology--they've no idea how the technology works but they take it for granted until the midnight of its failure."
Your post has little to do with a BOFH article, but I do like this line, and I think I may be using it.
A now-defunct defence contractor I once worked for set up a temporary project office in a rented warehouse to allow it to get started on a project they had just won whilst more permanent accommodation was being sorted. They'd put in a raised floor and cubicle 'offices' along the walls, but it was essentially open plan. A couple of mid-range servers (under-desk type models) served out shared drives and had been put into their own cubicle, out of the way at one side. These each had their own UPS - again an under-desk model.
We came in one morning to find no power to the building, and discovered that the servers had crashed hard because the UPS's had been turned off.
It was only later that the project accountant (who always came in really early in the morning and occupied the cubicle next to the servers) overheard our conversations and fessed up to turning them off because 'the buzzing noise was annoying me'.
[And, no, I can't remember why they weren't set to shut themselves down - long time ago now and I wasn't the person who set them up!]
I can't remember why they weren't set to shut themselves down
Mine are set to shut the servers down when the battery power reserve is down to 30%, about 5 minutes later. Usually the mains is back on before then so the server stays up 24x7.
Except for the time the cleaner unplugged the UPS to plug in her vacuum (in a room she shouldn't even have been in), and then found the UPS "Off" button on the UPS within the five minutes, and even worked out she had to hold it in for five seconds, all because she didn't like the bleeping noise while she was hoovering.
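A shutdown policy like the one above boils down to a couple of thresholds. A minimal sketch of the decision logic only (the 30% floor and five-minute limit mirror the post; the function and parameter names are made up, and real monitoring daemons handle this for you):

```python
def should_shut_down(on_battery, charge_pct, seconds_on_battery,
                     min_charge=30, max_seconds=300):
    """Trigger a graceful shutdown once mains has been gone long enough
    that either the charge floor or the time limit is hit."""
    if not on_battery:
        return False  # mains is back, stay up
    return charge_pct <= min_charge or seconds_on_battery >= max_seconds

print(should_shut_down(True, 80, 60))    # short blip: keep running
print(should_shut_down(True, 25, 60))    # battery floor hit: shut down
print(should_shut_down(False, 25, 999))  # mains restored: stay up
```

None of which helps, of course, when the cleaner holds the Off button in for five seconds.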
One company I used to work for, under each desk we had 2 power supplies connected to the UPS, 2 connected to normal power, clearly colour coded. The day we had a power failure we realised that in the sales department the only thing connected to a UPS-backed socket was the Christmas tree.
We have one of the ultimate UPSes here: A bank of six 100 kW-class diesel generators. You know that there's something seriously wrong with the commercial electric supply when the six-inch diameter exhaust stacks are blowing a column of black soot.
Oh, yeah, best not be staring into one of those exhaust ports when the commercial power fails. Those things start incredibly quickly. ;-)
Dave
has the uncanny ring of truth about it
At least from any poor bloody tech who's had to argue with accounting/planning/management about the value of buying decent gear or enough of it or more usually both.
I had to deal with one set of idiots who decided to cut down the order of 15 tools we put in to 5. We knew we'd consume one tool per 20 parts, and where to send the customer when he only got 33% of his job done.
A beancounter once decided to save 0.01 cents per unit by substituting a glue, on an order that said "Non-Parametrically specified, DO NOT SUBSTITUTE".
The units were disk drives. About 100K units had to be replaced under a manufacturer recall, because they would all have failed in service within a year. (Something outgassed from the glue settled on the platters and built up in lumps until it crashed the heads).
My company just moved into a new facility. I work in the AV department, and one of the items I put on my wish list was a UPS system. The company hired a consultant to build the AV system, and he denied the request, saying, "The power from the utility is reliable."
Um, part of my work is digital recordings. I have to buy my own UPS units. :(
I'm not sure what digital recording has to do with this (is this a vague general term with a very specific meaning in the UK?), but this is where acquisition experience comes in handy. You don't specify the solution, you specify the capability requirements. In this case, it sounds like your requirement should have been something like 99.999% availability of mains power at a specified quality level (that works out to about 5 minutes of outages per year). If the utility guarantees you that, great. If the facility has backup power that can meet that requirement, great. If not, you would get some sort of UPS system. Now if they claimed that the utility could meet that requirement and the power goes out for more than 10 minutes, you sue them for breach of contract. If you/your company didn't properly specify your requirements or manage the contract, you're SOL.
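The "about 5 minutes of outages per year" figure for five nines is easy to check:

```python
def allowed_downtime_minutes_per_year(availability):
    """Downtime budget implied by an availability target,
    per average (365.25-day) year."""
    return (1.0 - availability) * 365.25 * 24 * 60

print(round(allowed_downtime_minutes_per_year(0.99999), 2))  # five nines: ~5.26 min
print(round(allowed_downtime_minutes_per_year(0.999), 1))    # three nines: ~526 min
```

Handy for contract negotiations: each extra nine cuts the allowable outage by a factor of ten, and the price goes the other way.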
For those who have fought a battle with beancounters over the need for a UPS, I now propose a test for beancounters who block requests for a new UPS, or replacement batteries for an existing one.
That test shall consist of the beancounter being strapped into a chair with a loaded 10-gauge shotgun aimed at their head. The trigger is actuated by a solenoid that is powered by a battery, with a line-powered relay keeping the contacts OPEN.
As long as the utility power is maintained the beancounter has nothing to worry about. But, if for some fraction of a second, the power drops, then the contacts close, and the solenoid gets actuated, pulling the trigger. Then IT should ask the beancounter: Feeling lucky, punk?
"That test shall consist of the beancounter being strapped into a chair with a loaded 10 gauge shotgun aimed at their head."
Unwise. Extremely dangerous. If the gun goes off, the resulting implosion would devastate a very large area.
I'd suggest a combination of electrodes and testicles, but electron microscopes are still rather pricey.