Give me POWER: How to keep working when the lights go out

In the summer of 2001, I began consulting for a travel company in North Yorkshire. A very innovative company, but nothing all that unusual about it in a mechanical sense: a 70-or-so-seat call centre, finance department, sales and marketing people, and the IT department hidden nicely away where people couldn't just casually drop …

  1. TRT Silver badge

    We had one of those Watts-in-a-box a while back...

    Some workings of a fountain/stream at More London burst and flooded an underground substation. It took weeks to dry out. They parked the genny under my window, cue diesel fumes 24/7. Eventually, after a couple of days of shaking, the Victorian-era underground tunnel between the hospital buildings started to collapse and the genny dropped a foot or two into the pavement, whilst the dry route to the secondary x-ray unit was barricaded off with more yellow and black tape than I've ever seen in my life before.

    Nothing to add to the article really, just that there's more to planning a power backup strategy than how many watts you need and where to plug it in.

  2. Captain Scarlet
    Coffee/keyboard

    Hang on a moment

    "It's not really much more expensive to give a user a company laptop than a company desktop"

    Notebook computers for businesses are normally 1/3 more, as you also have to provide docking stations, a posh bag because the old one smells, a two-factor device such as an RSA keyfob, etc. You then need to provide a bigger connection to the outside world to support them, plus the headache of "Oh, I accidentally stamped on the keyboard and then threw it out of a window". So in my experience desktops are still a much cheaper option overall.

    "Productivity is usually impacted as a result of the reduced intra-company communication" Hmm except the people who refuse to talk to anyone they sit next to and have to emailed instead whilst ccing everyone in the building for everything.

    1. Sandtitz Silver badge
      Facepalm

      Re: Hang on a moment

      Notebook computers for businesses are normally 1/3 more, as you also have to provide docking stations

      If a company sees value in users' ability to work from home, then with desktops it would have to provide another computer for home use anyway. If the desktop is replaced with a laptop without any intention of working mobile, then a dock may not be necessary at all.

      a posh bag because the old one smells,

      Basic laptop bags start at something like 10 pounds/bucks/euros.

      a two-factor device such as an RSA keyfob, etc. You then need to provide a bigger connection to the outside world to support them

      So if a company buys laptops all the users need remote connections to the workplace, but if the company buys desktops then the users don't really need remote connections? That's... Confucius!

      If you're talking about routing all internet traffic through the company firewalls, general internet usage can simply be denied at external locations, so there's no need for fatter pipes.

      1. Anonymous Coward
        Anonymous Coward

        Re: Hang on a moment

        "If a company sees value in users' ability to work at home then the company would have to provide another computer for homes."

        Still applies even if they have a laptop. From a business continuity perspective you plan for the worst case, but on a site-by-site basis. Obviously if staff are dead or injured, their job won't get done with or without IT, so the plan focuses on a scenario where the staff are intact but have no kit and nowhere to work. If the fire alarm goes, you're supposed to leave the building without your stuff, so the planning has to assume, site by site, that the staff are out but all the desktops, laptops and mobile phones are inside the flaming/bombed/contaminated/collapsed building.

        Realistically not all staff would be on site, so you'd have some laptops available, but you then have to have a plan where the recovery manager takes possession of these and allocates them on the basis of need. If you work in marketing, strategy, forward planning, etc. then you can be offline for a few weeks before the effects show. If you're in hot seats in procurement, trading or sales, or AR/AP, then you need to be back on the job in hours. So that means that all the marketing peeps on £60k a year might need to hand in their computers to keep the £20k a year accounts payable staff working. That is the sort of thinking that business continuity is about.

        A proper business continuity plan is a right PITA, has costs and zero value until the day things go wrong.

      2. omnicent

        Re: Hang on a moment

        How about using some technology....?

        Give thin clients to people in the office (cheaper than both laptops and desktops), or use existing/old kit as thin clients, and move all the clever stuff to a real DC; then people can use whatever kit they have at home to remote onto their corporate desktop via a technology of your choice (Citrix/VMware etc.).

  3. a cynic writes...

    And when the lights come back on...

    ...you realise that your dedicated line to the DC runs down the same tunnel.

    Interesting times...

    1. JerseyDaveC

      Re: And when the lights come back on...

      Indeed; I've written elsewhere on The Reg about that; many people are surprised that: (a) one often finds that office buildings have diverse entry points; and (b) the telco can often diversely route your connectivity through the street. There's a cost, of course, but it's often possible. Had it with a former employer's London office a while back: one WAN circuit from COLT and one from BT, going different ways down the road, so we didn't go down when someone put a JCB through a 200+-pair fibre.

  4. Alex Threlfall

    Clever planning...

    ...means you can integrate a generator without losing too much space. I've done work on buildings where they've had gensets on the roof and in underground plant rooms. I've even brought generators into these sites as a backup while work was undertaken on the primary set, but normally N+1 is chosen where possible!

    1. This post has been deleted by its author

  5. Spaceman Spiff

    Sensible

    Good, solid advice.

  6. Anonymous Coward
    Anonymous Coward

    Hurricane Sandy

    A good place to start for lessons learned would be Hurricane Sandy in the US. Look for the live blogs of the ISPs and server farms in the area. One company did everything that they thought was right - generator on an upper floor, plenty of diesel fuel, etc. Except they never thought that the basement might be flooded, and guess where the fuel tanks were located? They ended up having to run a 5-gallon bucket brigade up over a dozen flights of stairs to keep the daily-use fuel tank topped off. Then there were the traffic jams and damage that prevented the fuel trucks from getting in to keep the generators going at other sites.

    1. Yet Another Anonymous coward Silver badge

      Re: Hurricane Sandy

      Similar problems in London: "an incident" closed off access to our office - no problem, as we had a backup site half a mile away.

      Trailed down to the street to find that the dear old bobbies of the Met (Total Policing (tm)) wouldn't let us into the street.

  7. Anonymous Coward
    Anonymous Coward

    There's always a common mode failure point

    There was a case several years ago where a boat's anchor managed to drag far enough to sever two well separated cables on the sea bed. Main route and diversity route both cut.

    BT had a national route failure when a JCB managed to break two cables that in theory were well separated under a field.

    Separated data centres on an airport flight approach path can look vulnerable - like pins in a bowling alley.

    1. JerseyDaveC

      Re: There's always a common mode failure point

      There comes a point where you just can't legislate for everything. In the mid-2000s I seem to recall two major undersea cables getting anchor-dragged within a few days, which actually had a tangible effect on the global bandwidth available, since such cables are generally shared by providers. With the example I gave in Yorkshire they could keep trading normally but at a cost (the generator hire), which was a big deal for them because the vast majority of their sales were skewed into Jan and Feb. Other companies accept the risk in return for not lashing out vast sums on protection.

  8. Little Mouse

    Here's two more words for you:

    Auckland. 1998.

    A whole city centre running on temporary generators for a few days is a noisy, smelly place. I was a jobbing contractor living there at the time. Happy days.

    Four separate underground power lines fed the whole city IIRC, and the long hot dry summer parched the ground so much that the cables all packed in one by one.

    1. Yet Another Anonymous coward Silver badge

      Re: Here's two more words for you:

      I think it was actually the giant spider that was to blame.

  9. rhydian

    Nowt much to add, but...

    The one simple thing that every small/medium office can do to keep things going when the lights go out is to keep a couple of standard, regular, run-of-the-mill corded phones in a cupboard. Then, if you lose power, at least you still have something to plug into the direct-exchange ADSL or fax line, just in case. In fact, I've set up two old 1980s corded phones as permanent standby on our ADSL line.

    They're pretty much bombproof, and come in a fetching shade of red.

    As for generators, just be sure that your switchover gear is in good condition. Multiple short connections/disconnections can cause more damage than the original power failure. Also, make sure that you've connected all of your vital equipment to the generator. There's no point having a load of servers spun up and ready to go if you've not supplied power to the external network termination gear.

    1. Haku
      Happy

      Re: Nowt much to add, but...

      "a fetching shade of red."

      I hereby dub these red emergency corded phones "Batphones"

      1. JerseyDaveC

        Re: Nowt much to add, but...

        I once went on the Cisco Call Manager (ICUCM) course and the (Egyptian) tutor was trying to explain the concept of a phone that would automatically call a set number when you lifted the receiver. "Ah, you mean a Batphone!" we cried ... then had to spend the next ten minutes explaining what a Batphone was. :-)

      2. rhydian

        @Haku

        >I hereby dub these red emergency corded phones "Batphones"

        Waaay ahead of you...

    2. JerseyDaveC

      Re: Nowt much to add, but...

      Yup - analogue lines are good. Mobile phones are generally pretty handy these days too - cell masts tend to be protected pretty well. A few years back we had a complete power cut here in Jersey for a few hours, and we kept in touch via BlackBerry email and mobile calls, since the data centre had top-end protection and the cell masts were battery-backed. The office went offline but the DC-based servers stayed up (including the secondary PBX) and happily when the power came back the LAN switches and primary PBX started up without argument. My kind of power cut.

      1. rhydian

        @JerseyDaveC

        "Yup - analogue lines are good. Mobile phones are generally pretty handy these days too - cell masts tend to be protected pretty well."

        During our last "weather event" (the Feb 2014 storms) we lost all mobile coverage for about 18 hours, thanks to the only mobile mast in the area suffering serious wind-related damage. Even when it does work, indoor signal strength is patchy at best. In a more populated area with better coverage I agree that mobiles do have a place, but I'd still prefer a fixed landline, as mobiles suffer contention issues.

    3. SImon Hobson Bronze badge

      Re: Nowt much to add, but...

      > There's no point having a load of servers spun up and ready to go if you've not supplied power to the external network termination gear

      Ha ha, yes I've seen that.

      In fact, I recall getting called to a site of another company in the group to assist in replacing a failed bit of kit. Went into their comms room, and there was the big expensive PBX with a big expensive UPS - but the NTE for the phone lines was on the opposite wall, plugged into a 13A socket.

      More recently, and on a more mundane level, at work we manage some connectivity for a science park. One of their buildings went offline recently because a microwave oven in the kitchen failed and tripped the breaker in the distribution board. The comms stuff was an afterthought and shared the same ring circuit as the kitchen!

  10. Dave 32
    Coat

    Complex

    Oh, providing backup power is a VERY complex topic. It's a LOT more complex than simply buying/renting a generator.

    Consider that, during a widespread power outage, generators disappear at Warp 10 speed. Thus, if you're going to be using one, it had better either be bolted in place at your building well before the event, or it needs to be provided for in a long-planned rental contract with a reliable company (probably with clauses which make it too painful for them to renege). And, for anything under about 10 tons, it needs to be locked in place, else it may wander off in the middle of the night.

    Of course, simply having a generator won't do much good unless the building wiring is set up with an appropriate transfer switch, which will need to be installed by a licensed electrician in most jurisdictions.

    One probably doesn't want the entire building to be powered from the generator, well, unless one wants a really huge generator. So, usually only essential services are powered by the generator. However, the definition of essential varies considerably. Does the room lighting need to be powered, or will the employees be in the dark (literally)? What about the HVAC system? Will the employees be sitting in a tropical jungle environment, or will they be shivering in arctic-like conditions? Will the vending machines be powered, or will the staff be forced to do without coffee? How about workstations? What about networking? What about the phone system, both inside and outside the building? Of course, all of these decisions will impact the electrical system in the building, and may require substantial rewiring.
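    As a rough illustration of that "what counts as essential" sizing exercise, here is a minimal back-of-envelope sketch in Python. Every load figure, the power factor and the headroom margin are hypothetical assumptions for illustration, not numbers from the post or from any real site:

```python
# Back-of-envelope generator sizing: sum the "essential" loads and add headroom.
# Every figure below is a hypothetical example, not a measurement from a real site.

essential_loads_watts = {
    "server room": 12_000,
    "comms/network gear": 1_500,
    "emergency lighting": 800,
    "server room HVAC": 6_000,
    "skeleton-crew workstations": 2_500,
    "phones/PBX": 400,
}

total_w = sum(essential_loads_watts.values())

# Gensets are rated in kVA; assume a typical power factor of 0.8 and leave
# ~25% headroom so motor start-up surges (HVAC, pumps) don't stall the set.
power_factor = 0.8
headroom = 1.25

required_kva = total_w / power_factor / 1000 * headroom
print(f"Essential load: {total_w / 1000:.1f} kW -> suggest roughly a {required_kva:.0f} kVA genset")
```

    Run it with your own measured loads; the point is simply that the "essential" list, not the generator catalogue, is where the sizing argument starts.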

    I've seen some installations which have only powered the machine room. Of course, the first black-out that hits leaves the programmers and support people in the dark, with their workstations shutdown, and no way to support/administer the servers. Whoopsie. I've seen other installations where some bleary-eyed fool plugging in a coffee pot brings the entire generator to its knees and shuts the entire building down. Whoopsie.

    There are also safety concerns with where the generator is placed. If it's on the top of the building, can the building withstand the vibratory load it will produce, or will the building come down like a pile of sticks? I'm aware of one roof-mounted generator which had fuel supply problems, since the fuel tank was in the basement, and the lack of commercial power meant that the fuel pump couldn't be run to get the fuel to the generator. Whoopsie. Do you really want your employees carting 55-gallon drums of fuel up the stairs?

    Also, consider where the generators are vented. Are they vented into the air conditioning intakes for the building? How long before the staff keel over from carbon monoxide poisoning? Note that some of those generators have six-inch (or larger) stacks on them. That's a LOT of exhaust.

    Back to the subject of fuel: will there be fuel available, and of the correct type, for the generator in the event of a widespread power outage? Note that most gas stations won't be able to pump fuel during a widespread power outage (their pumps run on electricity, too). But if storing fuel on-site, can it be done safely? 1000 gallons of gasoline (or even diesel) will make a VERY impressive column of flame if it gets ignited accidentally. Plus, there are legal restrictions on where fuel can be stored in some jurisdictions. Some make it illegal to store fuel in an occupied building. Others may have rules about tank leakage safety. There's the whole insurance thing, too.

    Also, can the fuel be stored for long periods of time without jellifying (polymerizing)? I've seen 5-gallon cans of gasoline that have been stored for a year look more like jelly than a liquid (hint: generators don't like trying to burn jelly!). There are products which supposedly keep gasoline/diesel from jellifying, but they have to be bought and added to the fuel.

    Oh, yeah, that reminds me. Don't forget to periodically test the installation. It's no good having a backup generator if the thing won't start. Or if the transfer switch is fused/stuck. Of course, such testing should be done in a manner which won't take the building down if something doesn't work. There are also regular maintenance issues with the generator, such as changing the engine oil. Or, for that matter, even checking that the thing has oil in it. There aren't bird nests in the exhaust stack, are there?

    I could go on and on (and probably write a book about the things I've seen done right and done wrong), but I think it's clear that providing backup power isn't a simple, easy or cheap task. It can be done correctly, but only with the right people planning and maintaining the installation, and enough resources to do the job properly.

    Dave

    P.S. I'll get my coat; it's the one with the package of AA batteries in the pocket.

    1. Anonymous Coward
      Anonymous Coward

      Re: Complex

      "Will the vending machines be powered, or will the staff be forced to do without coffee? "

      One data centre I worked at had an electric pump for the ladies' toilets. It was found to be the cause of EMI noise spikes that crashed the mainframe once in a while. The penny dropped when one of the women remarked that the mainframe was often down when she came back from a toilet break.

      1. JerseyDaveC

        Re: Complex

        Former colleague of mine had a similar problem: railway lineside equipment that kept crashing. So he rocks up to the lineside cabinet with an engineering gang in tow to literally sit and wait for the kit to crash. In his words: "A bloody great electric thing came whooshing past" and *bam*, the kit rebooted itself. A bit of shielding later, no more problem.

        1. TRT Silver badge

          Re: Complex

          Never a problem with analogue TV, crystal clear in fact, but as soon as I installed terrestrial digital... sparkles, break-ups, EPG went nuts... and every time a train went by. I did live only 100 yards from the West Coast Mainline. Those things throw out a lot of interference.

    2. phil dude
      Thumb Up

      Re: Complex

      @Dave 32: Excellent post!

      It really makes you appreciate the civilising and stabilising effect of the utilities - water, power, heat, food, internet...

      With the exception of medicine, what else (apart from freedom from oppression) is needed for basic human rights?

      There is a James Burke "Connections" episode (The Trigger Effect) in which he talks about the power cut in New York in 1965 and its knock-on effects.

      Classic stuff.

      P.

      1. Anonymous Coward
        Anonymous Coward

        Re: Complex

        "a James Burke "Connections" episode (The Trigger Effect) when he talks about the powercut in New York 1965 and its knock-on effects."

        It would be interesting to see that remade taking into account the modern trend for reduced (preferably zero) stocks of everything everywhere, courtesy of the Just In Time and LEAN zealots having far more influence than is actually sensible in the bigger picture.

        It's only a couple of years since the weekend in December when there was a flash rumour of heavy snow in my bit of the UK (the Midlands), leading to every shop in a large part of the Midlands being without fresh food (bread, milk, bottled water, etc.).

  11. Anonymous Coward
    Anonymous Coward

    Home/office setup here, with a stack of four "development" servers.

    Backup systems are a fairly hefty UPS powering the servers, all the networking gear and the phone system.

    Individual UPSes power the desktop PCs and their primary monitors (secondary and tertiary monitors lose power on UPS).

    Auto-start and switchover to a 20 kVA generator (provides enough power for the entire building).

    A backup 6 kVA diesel generator can provide enough power to run the essential items plus some extras.

    Diesel is kept on hand to run the main generator for three days (it can run for weeks if needed by using heating oil as fuel, but generator output would be degraded by around 15%).
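    As a sanity check on "three days of diesel" figures like the one above, here is a minimal Python sketch. The consumption rate and the stored-fuel quantity are assumed ballpark values for illustration, not figures taken from the comment:

```python
# Rough fuel-runtime estimate for a small diesel genset.
# Both figures below are assumptions: roughly 3-4 litres/hour is a typical
# ballpark for a 20 kVA set at partial load, and 250 litres is a guessed stock.

litres_per_hour = 3.5        # assumed consumption at partial load
stored_diesel_litres = 250   # hypothetical on-site stock

runtime_hours = stored_diesel_litres / litres_per_hour
print(f"~{runtime_hours:.0f} hours of runtime, i.e. about {runtime_hours / 24:.1f} days")

# Running on heating oil instead, with the ~15% output derating mentioned above:
derated_output_kva = 20 * 0.85
print(f"Derated output on heating oil: ~{derated_output_kva:.1f} kVA")
```

    The useful habit is doing the division before the outage rather than during it, and remembering that the burn rate rises with load.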

  12. Anonymous Coward
    Anonymous Coward

    There are many options

    Depending on the biz there are many options - as this story points out. Finding the options that best fit a specific application is the challenge. I recently looked at natural-gas-powered standby generators, which can be a nice convenience as long as you don't live in an area where earthquakes might sever the supply lines from the gas company. You could also have on-site propane or natural gas storage tanks that can supply the generator engine for quite a long period in the event of a lengthy power outage.

    It's unfortunate that the obvious first step in backup power - UPSes - is overlooked by many businesses. Until businesses suffer serious data loss or disrupted operations, they rarely take a proactive view on backup power to maintain their ops.

  13. Terry 6 Silver badge

    Avoid the bureaucrats getting involved

    I used to have a business continuity plan - on two sides of A4 - that covered just about every scenario that we could think of.

    E.g. staff working from either home or a set of partner sites that would be prepared to lend us a bit of space (quid pro quo).

    Access to essential files and confidentiality arrangements. And so on. All planned for.

    Then the corporate people got in on the act.

    They bought a massive web-based business continuity database system (anyone see the weakness there?). It was so fiendishly complicated that not only was it just about impossible to populate all the fields and sub-fields, it was totally impossible to actually get any action plan out of it.

    There were sections and sub-sections, all of which had incomprehensible names and no recognisable logic to the structure. Compulsory fields that made no sense in our context. Nowhere to put other genuinely essential actions that were core to our continuity (like those partner spaces), because the designers hadn't thought about distributed working - at least not anywhere we could find.

    In the end the trainers who had been hired to help us set up our plans had to go round and set them up for all the (council) departments. And while they were doing it they told us that in a real crisis we should just have available.......... can you see this coming? A plan on two sides of A4.

  14. Sgt_Oddball

    this takes me back

    My last job had occasional (so every 5-6 months, natch) power outages, and while a backup genny would have been great, it wouldn't have saved us from aged UPS units going pop one after the other (two UPSes to a server, but once the first one died the extra load cascaded pretty quickly), resulting in the final insult of the unit that had just had new batteries fitted going bang. Cue a server room full of smoke while we frantically tried to get the servers to shut down safely before they got fried or burned (since we weren't quite sure where the smoke was coming from, and if there's smoke... well, you understand our concern).

    Strangely I don't miss the place.

  15. Matthew 3

    Best lesson is experience.

    My old employer had an office in Manchester's Arndale Centre at the time of the IRA bomb. All the backup tapes were in a fireproof safe, inside the sealed-off, powerless building.

    Fortunately an unaffected ISDN2 line lasted just long enough to take a full backup...
