my new favourite quote:
"... they've ignored the physics of how things actually work," Hines says.
"This can lead you grossly astray."
There have been a lot of scare stories in the media about electrical power grids in recent times, suggesting that it would be a simple matter to bring down a national transmission system by way of a minor cyber attack or physical sabotage - thereby bringing that nation's infrastructure to a grinding halt. There's just one …
that such things follow a power law - i.e. the common failures of smaller components are unlikely to cause problems, whereas the rarer failures of larger components are more likely to. Because the smaller components fail more often, sooner or later the failure of ONE of them might cause a cascading problem, but most of the time the failures won't have a severe effect. Deliberately causing a cascading failure by attacking a minor component such as a substation therefore has a low probability of success; an attacker would have to hit a large number of substations to bring the whole network down, whereas attacking a large component (such as a power station) is more likely to screw things up.
That goes both ways, of course. It's still bloody hard to pick just the right component to sabotage so that it fails with the maximum possible effect on everything else. Which, in fact, was the point the boffins were making.
Now, something like Stuxnet might be able to provide some miscreant with enough information to pick the right target, but even so it would require a lot of study to pick just the right places to attack. Simply shutting down everything you can isn't going to be very effective compared to just the right sequence of commands to cause an actual "HCF" in the grid.
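The power-law argument above can be sketched with a toy Monte Carlo simulation. Everything here is a made-up illustration - the exponent, the cascade probabilities and the size cut-off are my assumptions, not figures from the paper:

```python
import random

random.seed(42)

# Toy model: component "size" drawn from a power law, with the chance
# that a failure cascades scaling with size. Small components dominate
# the failure count, but each is individually unlikely to cascade.
ALPHA = 2.5          # assumed power-law exponent
N_FAILURES = 100_000

def sample_size():
    # Inverse-transform sampling for a Pareto-type distribution, x >= 1
    u = random.random()
    return (1 - u) ** (-1 / (ALPHA - 1))

small_failures = cascades_from_small = 0
large_failures = cascades_from_large = 0

for _ in range(N_FAILURES):
    size = sample_size()
    p_cascade = min(1.0, 0.001 * size)   # assumed: bigger kit, bigger risk
    cascaded = random.random() < p_cascade
    if size < 2:                          # "substation-sized" component
        small_failures += 1
        cascades_from_small += cascaded
    else:                                 # "power-station-sized" component
        large_failures += 1
        cascades_from_large += cascaded

print(f"small: {small_failures} failures, "
      f"{cascades_from_small / small_failures:.4%} cascade")
print(f"large: {large_failures} failures, "
      f"{cascades_from_large / large_failures:.4%} cascade")
```

Run it and the small components account for most failures but a much lower per-failure cascade rate - which is why whacking one random substation is a poor bet for an attacker.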
http://www.newscientist.com/article/mg20327255.900-how-to-shortcircuit-the-us-power-grid.html?full=true&print=true
but that article (some registration required) just specifies the best place to cause a cascade failure in the US; OK, Ian Fells says (briefly) how to do it in the UK.
whilst this article here
http://books.google.co.uk/books?id=Gfh9AnIDxS8C&pg=PA295&dq=ninfield+new+scientist
says how it **actually** happened on "Black Wednesday 1981" (New Scientist, 29 Oct 1981, p. 295), and while I agree it might be hard to find two trees in the UK, I could go on....
Tell someone that everything's fine and they'll ignore you. Tell them there might be a possible threat (note: 3 levels of uncertainty) and you have their attention. You don't even have to mention that the "might" is a one-in-a-million eventuality, the "possible" is almost infinitely improbable and the "threat" is so non-specific, in degree, importance and eventuality, that it becomes meaningless.
Until someone is able to quantify some potential badness - attributing a real, numerical chance to it AND describing the extent of the effects said badness would have (if it did come to pass) - there's no information available to base a response on. So when a low-level manager starts running around, waving their arms in the air and claiming that "there's a potential security lapse that would allow someone to steal our data", without solid facts about what data, what could be done with it, how many times this has happened (to other organisations) and how many attempts have been made to steal ours - all you have is some unfocussed paranoia. Sadly, these days that seems to be all you need to trigger all sorts of draconian limitations, huge inconvenience and massive costs, simply because an impressionable individual watched too much TV the night before.
Back to the case in point. It *does* seem that some naughty people somewhere created a worm targeted specifically at an Iranian institution they didn't particularly like - and it's perfectly possible for some other bad people to do the same, again. The fix is simple: KEEP YOUR INFRASTRUCTURE OFF THE INTERNET. The facilities have fences, security guards and locks on the doors; the control systems can go one better and isolate themselves completely. There are almost no circumstances where workers doing their jobs in such plants need any sort of internet access - or to plug in thumb-drives, CDs or any other media. Prevent them from doing this and the threat (if it was ever really there in the first place) just goes away. In the small number of cases where access is needed, use the same level of security and scrutiny that is used for anything else entering or leaving the establishment.
Once you have sewn things up tight, sit back, breathe deeply, hire a team of penetration experts to keep your security up to scratch, and focus on the things that could actually go wrong rather than the hysteria from unqualified commentators who think Die Hard or The Matrix is real life.
But I didn't have time to finish reading it because the current terror threat level is Severe:
http://www.homeoffice.gov.uk/counter-terrorism/current-threat-level/
So I'm too busy panicking to even go to the shops and buy more tin foil.
I find the best way to panic and act hysterical is to pat my head, rub my belly, and scream the tune of Ode to Joy at passers by while I run around the bus station wearing nothing but a used condom.
LA LA LA LA LA LA LA LA
Their risk analysis is outdated post-Stuxnet, but they weren't to know that when they started. Their theoretical analysis also doesn't wholly tie in with observed events.
Anyway, it depends what you call a large outage. Large parts of greater London were blacked out not many years ago when the wrong size fuse (too small) was installed in a significant distribution station (as part of routine maintenance, iirc) but not noticed at the time because it wasn't in a circuit active at that time. A distribution network reconfiguration which ought to have been perfectly routine brought that fuse into active use and, not surprisingly, blew the under-rated fuse. The consequence was a somewhat inconvenient chain of events. Anybody got a reference for more details?
A lot of Britain's electricity is used in, but not generated in, the South East, and is shipped in via a small number of 400kV overhead cables. Sometimes one of the 400kV lines is out of service for maintenance, leaving less headroom than usual. See where I'm headed?
And didn't large parts of southern Europe have a widespread outage not all that long ago? Might that class as a large outage?
That was a wrongly specified relay, not a fuse, the reconfiguration was as a result of a fault. Unfortunately where you are headed is nowhere. The system has surprising resilience left in it even in the event of a fault during a planned outage.
As for Europe, well, poor planning and cost-cutting. Terrorist accountants, now there IS an idea.
There are a handful of pinch points in the system, but they only let you near a small portion of the network. And even then, no guarantees that it will achieve much. Take out windfarms for political gain and no-one would notice...
Having a bit of industry knowledge, these so-called boffins are funny. It is not terrorists or disgruntled 4chan nublets we have to worry about, it is capacity: the UK power infrastructure is balanced on a knife edge, and a surge in demand could do more damage than any outside attack.
Most European power grids are ropey at best, each helping the other in times of need; it would not take much more than a prolonged nasty winter of 3-4 weeks for demand to outpace supply, or to sever agreements between member states.
God help us if we lost a whole power station.
Maybe so, but that's nothing that cannot be fixed by some infrastructure work.
OTOH, the latest "cybergeddon" alert level pushed among others by Terrorexpert Richard Clarke (see http://www.wired.com/threatlevel/2010/04/cyberwar-richard-clarke/ and http://www.computer.org/portal/web/computingnow/silverbullet ), which is being used among other things to shovel tax monies to dubious companies and to hand the Nobel Peace Laureate an Internet Kill Switch, is NOT about accidents...
Yup - using an angle grinder on a few pylons is going to cause no problem at all - I mean, just like the internet, they've invested so much in redundancy that it's easy to route all the power through the other lines!
As for garnering the information required, no-one can afford the high tech required to drive round the M25 and look up.
We're not doomed! We're not doomed!
Don't think it would need much C4, or whatever it's called, to take down a pylon. Could even be done with a decent angle grinder. Take down 10 pylons around the country - would that create a big problem? Or is the grid resilient enough to cope with entire lines taken down in that way? It could never be guarded against, as these pylons are in fields in the middle of nowhere.
In an RTA a few years ago, one leg of a pylon (an L4M no less, so not at all beefy) was completely taken out; the pylon remained upright.
These things are designed to meet pretty severe weather conditions (9.5mm of ice and a 380 N/m wind) and still maintain a factor of safety of 2.5.
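A quick back-of-envelope on those design figures. The 380 N/m wind load and the 2.5 factor of safety are from the comment above; the span length, and applying the factor this simply, are my assumptions for illustration, not actual transmission-tower design practice:

```python
# Rough check of the quoted design figures (span length assumed).
WIND_LOAD_PER_M = 380.0   # N/m, from the design spec quoted above
SPAN_M = 365.0            # assumed typical span between pylons
SAFETY_FACTOR = 2.5       # factor of safety quoted above

working_load = WIND_LOAD_PER_M * SPAN_M        # wind load on one span
design_load = working_load * SAFETY_FACTOR     # what the pylon must survive

print(f"working wind load per span: {working_load / 1000:.1f} kN")
print(f"design load with FoS 2.5:   {design_load / 1000:.1f} kN")
```

Even on these crude numbers, a pylon that must shrug off hundreds of kilonewtons of iced-up gale isn't going to be felled by a light breeze - or, as the RTA anecdote suggests, by losing a single leg.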
I live in the mid-western US and between the tornadoes, lightning strikes, and the occasional blizzard, the US would never have electricity if the doom and gloomers were right. We lose substations, switching stations, power lines and various other bits of infrastructure on a fairly regular basis. If splatting one of these took the entire grid down, there would be no grid.
Once again reality gets in the way of a good funding, er, scare tactic.
This also helps explain why the mid-west is so sparsely populated compared to the coasts. Of course you get the earthquakes on the west coast and hurricanes on the east coast. Come to think of it, it is amazing the US has survived at all. According to all these "scientists" running around, the US should have been knocked back to the Stone Age and everyone killed decades ago.
Our local telephone exchange is on the other side of a very deep canal. It seems that all the cables come across at a single point - a few years ago, someone poured petrol down a single manhole and killed every telephone in a sizeable portion of the city. (We had a fair few customers needing computers replacing that week thanks to the sudden isolation of every alarm system....)
It's the reason for pumped storage hydro stations.
As soon as the break in Corrie comes on, the power requirements spike from all the kettles turning on for a cuppa.
The guys at the National Grid know what to expect, though. They have it down to a fine art when the footie or the soaps cause a nationwide timed surge.
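The "TV pickup" response described above can be sketched as a simple dispatch: pre-positioned fast reserve (the pumped storage) covers the predicted surge first, then running plant's spinning margin mops up the rest. All the megawatt figures here are invented for illustration, not real National Grid numbers:

```python
# Toy sketch of covering a predicted kettle surge (all MW figures assumed).
KETTLE_SURGE_MW = 2_000      # assumed synchronised surge at the ad break
PUMPED_STORAGE_MW = 1_700    # assumed fast reserve on standby
SPINNING_MARGIN_MW = 500     # assumed headroom on already-running plant

def cover_surge(surge_mw):
    # Dispatch the fast reserve first, then eat into spinning margin
    from_storage = min(surge_mw, PUMPED_STORAGE_MW)
    remainder = surge_mw - from_storage
    from_margin = min(remainder, SPINNING_MARGIN_MW)
    shortfall = remainder - from_margin
    return from_storage, from_margin, shortfall

storage, margin, short = cover_surge(KETTLE_SURGE_MW)
print(f"pumped storage: {storage} MW, "
      f"spinning margin: {margin} MW, shortfall: {short} MW")
```

The point being that a *predicted* surge is easy to cover; it's the unpredicted one, with no reserve pre-positioned, that bites.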
More than once we had cases where the local underage hoods-in-training would lob some suitable metal object like a bicycle into the local substation for a laugh (failing, as such idiots generally do, to realize they were shutting off the power to their own neighbourhoods; shitting on their own doorsteps is something they were always star performers at).
Yet the only "ZOMG the gridz are asplode!" moment I recall was during a huge storm one night when Norn Iron's only major generating plant at the time got a good old soaking from the salt spray and shut down for several hours plunging a large portion of the province into darkness. Which is what you get when you really don't have enough redundant generating capacity.
Speaking of which, all the terr'ists have to do is wait while the increase in energy use outstrips the greedy power companies' willingness to invest in infrastructure. Then all the bad guys will need to do is launch "Operation Time For A Cuppa" and kaboom!!
[Not again. Where's The Other Steve, please?]
Get a clue. Or keep off the keyboard till you've got a clue.
Stuxnet did not need the Internet.
Sneakernet (USB sticks etc, or a machine which connects alternately between "secure" and "outside" network) works just fine for virus propagation, and has done for several years pre Stuxnet, in case you weren't aware (see e.g. Conficker).
Stuxnet's successors will not need the Internet.
Stuxnet's successors might need some inside info on the software and systems in use in specific places e.g. switching centres (they're not the same as substations) such as the one in Gloucester (?) which the fire brigade and army tried desperately to protect from floods not many years ago because it supplied much of the local area (including GCHQ).
Sell it off, then allow someone else to sell it off again to a company that's going to cut back on even bare-bones maintenance. Then wait a few years. Add a lack of generating capacity near where the main loads are and wait for a nice icy spell. Voila. Now where are the candles?
Didn't BT have a trial run for that a few months back with the Paddington exchange fire+flood (or vice versa) which hit over four hundred exchanges and hundreds of thousands of customers?
http://www.theregister.co.uk/2010/03/31/burne_house_burns/
That theory has already been tested. Where I live, AT&T was about to go on strike. Somehow a key fiber ring was cut and about 50,000 people lost phone and internet. Even though the fiber was owned by AT&T, it also took out about 8-12 Verizon cell phone towers in addition to AT&T's towers in the area. 911 in the area was shut down too.
Fair comment, it was a relay not a fuse, but do you think customers with no electricity, or even Register readers with power, care about the difference between a fuse and an overcurrent relay?
Anyway, the first incident I was trying to remember occurred on 28 August 2003 and was followed a few days later (5 Sept) by a similar but less widely reported incident with a similar cause (misconfigured protection relay) affecting large parts of the West Midlands. There had also been a major blackout in the north-east USA shortly before (August 14th).
The London incident directly affected 400,000 customers and resulted in the closure of (half of) the Tube and although power was only off for half an hour, chaos lasted for the remainder of the day. The Midlands incident disconnected 300MW of demand (200,000 customers including the NEC and airport). The US incident affected 20GW+ of demand.
The two UK incidents in close succession (and following shortly after a major US outage) were sufficient to trigger an Ofgem inquiry, report URL below.
Everything's OK now though, right? Courtesy of "market forces" planning for our future, UK plc is so short of generating capacity and has so little replacement capacity being built that the nearly-forty-year-old Magnox station at Wylfa on Anglesey yesterday got a two-year licence extension till 2012. Sadly the aluminium smelter next door (which relied on 250MW of its electricity) closed a year ago with the loss of 500+ much-needed jobs.
http://www.ofgem.gov.uk/Pages/MoreInformation.aspx?docid=18&refer=About%20us/enforcement/Investigations/ClosedInvest
http://static.london.gov.uk/assembly/reports/pubserv/powercut.pdf
http://en.wikipedia.org/wiki/Northeast_Blackout_of_2003
I remember a few years ago chatting with a supposed top consultant who suggested that all generators on the grid must remain in phase. The 'pull' of the majority of the grid is enough to automatically bring new generators into phase, however - once a proportion of the grid is disrupted / knocked out of phase it can be incredibly difficult to sync things back up. I remember him suggesting the amount of damage/disruption required to bring the whole thing crashing down could be surprisingly little. Due to this never having previously happened there's little understanding / confidence in how the entire grid could be restored from a down or unsynchronised state. I remember him expressing his concern at what he considered an obvious weakness in the system. Anyone know if this is a genuine concern?
Um, as far as I know it's impossible, or at least very hard, to sync the generators in two different locations to the same phase. The solution is DC interties.
http://en.wikipedia.org/wiki/Pacific_DC_Intertie.
High voltage direct current (HVDC) is used to transmit large amounts of power over long distances or for interconnections between asynchronous grids. When electrical energy is required to be transmitted over very long distances, it is more economical to transmit using direct current instead of alternating current. For a long transmission line, the lower losses and reduced construction cost of a DC line can offset the additional cost of converter stations at each end. Also, at high AC voltages, significant (although economically acceptable) amounts of energy are lost due to corona discharge, the capacitance between phases or, in the case of buried cables, between phases and the soil or water in which the cable is buried.
HVDC is also used for long submarine cables, because beyond about 30 km the cable's capacitance makes AC transmission impractical. In that case special high voltage cables for DC are built. Many submarine cable connections - up to 600 km in length - are in use nowadays.
HVDC links are sometimes used to stabilize against control problems with the AC electricity flow. In other words, to transmit AC power as AC when needed in either direction between Seattle and Boston would require the (highly challenging) continuous real-time adjustment of the relative phase of the two electrical grids. With HVDC instead the interconnection would: (1) Convert AC in Seattle into HVDC. (2) Use HVDC for the three thousand miles of cross country transmission. Then (3) convert the HVDC to locally synchronized AC in Boston, and optionally in other cooperating cities along the transmission route. One prominent example of such a transmission line is the Pacific DC Intertie located in the Western United States.
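The converter-station trade-off described in that Wikipedia excerpt boils down to simple line economics: DC lines are cheaper per km but carry a large fixed cost for the converters at each end. A sketch with purely illustrative relative costs (these are not real project figures):

```python
# Illustrative AC-vs-HVDC cost crossover (all cost units are assumed).
AC_COST_PER_KM = 1.0     # relative cost per km of AC line
DC_COST_PER_KM = 0.7     # DC line assumed cheaper per km (fewer conductors, lower losses)
CONVERTER_COST = 180.0   # fixed cost of the converter stations at each end

def ac_cost(km):
    return AC_COST_PER_KM * km

def dc_cost(km):
    return CONVERTER_COST + DC_COST_PER_KM * km

# Breakeven distance: converter cost divided by the per-km saving
breakeven_km = CONVERTER_COST / (AC_COST_PER_KM - DC_COST_PER_KM)
print(f"DC becomes cheaper beyond ~{breakeven_km:.0f} km")
```

Below the breakeven distance AC wins; above it the per-km savings pay off the converters - which is why HVDC shows up on very long hauls like the Pacific DC Intertie, and on interconnects between asynchronous grids regardless of distance.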
"Um far as I know its impossible to sync the generators in two different location to the same phase or is hard."
It's far from impossible. Any "national grid" type setup does this inherently, across many miles, whether it be from North of Scotland to South of England, or some other long distance (US East Coast to US West Coast?).
However, managing grid frequency is an important part of managing grid operation (you mention this in your post). If you've got two separately managed grids and want a power interconnect (eg UK and France, UK and Norway), it has to be DC. If it's AC, you can't manage the two frequencies separately, therefore you can't manage the two grids separately, therefore there's no interconnect.
Did you know there is a (~2GW?) HVDC link between France and England? And that one is being discussed between UK and Norway (along a similar route as the existing gas link)?
Wrt transmission losses: according to sources used by Professor David MacKay, the HV transmission losses are small in comparison with the losses on the LV side of things. It's less than 10% in total for both LV and HV, he says, so it's not a huge loss anyway, and LV improvements wouldn't be easy.
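The phase "pull" mentioned earlier in this thread - the majority of the grid dragging a new generator into sync - behaves like a set of coupled oscillators. Here's a toy Kuramoto-style sketch; the coupling strength, step size and generator count are arbitrary assumptions for illustration, not a real grid model:

```python
import math
import random

random.seed(1)

N = 10       # toy number of generators
K = 0.5      # assumed coupling strength ("pull" of the rest of the grid)
DT = 0.1     # time step

# Start the generators at random phases
phases = [random.uniform(-math.pi, math.pi) for _ in range(N)]

def coherence(ph):
    # |mean of e^{i*theta}|: 1.0 = perfectly in phase, ~0 = scattered
    re = sum(math.cos(t) for t in ph) / len(ph)
    im = sum(math.sin(t) for t in ph) / len(ph)
    return math.hypot(re, im)

r_start = coherence(phases)
for _ in range(2000):
    # Each generator is nudged toward the average phase of the others
    phases = [t + DT * K * sum(math.sin(u - t) for u in phases) / N
              for t in phases]
r_end = coherence(phases)

print(f"coherence before: {r_start:.2f}, after: {r_end:.2f}")
```

With everyone running at the same frequency, the coupling drags the whole set into phase - and, as the earlier comment noted, the worry runs the other way too: re-synchronising a grid that has fallen apart into out-of-phase islands is far harder than this toy makes it look.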
I really think you need to see this post:
http://www.lightbluetouchpaper.org/2010/07/26/who-controls-the-off-switch/
"We have a new paper on the strategic vulnerability created by the plan to replace Britain’s 47 million meters with smart meters that can be turned off remotely. The energy companies are demanding this facility so that customers who don’t pay their bills can be switched to prepayment tariffs without the hassle of getting court orders against them. If the Government buys this argument – and I’m not convinced it should – then the off switch had better be closely guarded. You don’t want the nation’s enemies to be able to turn off the lights remotely, and eliminating that risk could just conceivably be a little bit more complicated than you might at first think. (This paper follows on from our earlier paper On the security economics of electricity metering at WEIS 2010.)"
Smart meters add a whole new attack method.
Some folks are already well aware of that, although the real motivation for smart meters is to ready the UK for when electricity demand exceeds electricity supply (as it will in 5-10 years). When that happens, smart meters will allow Honest Joe Public to be disconnected selectively but en masse, thus ensuring that supply matches demand, whilst ensuring also that critical facilities (police, hospitals, shopping centres, etc) do not have their operations disrupted.
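That selective-disconnection scheme is, at heart, a load-shedding algorithm: drop non-critical metered load, biggest first, until demand fits supply, while protected sites stay connected. A sketch of the idea - the load figures, names and "critical" flags are all invented for illustration:

```python
# Toy smart-meter load shedding (all loads and flags are invented).
loads = [
    ("hospital",        12.0, True),    # (name, MW, critical?)
    ("police station",   3.0, True),
    ("shopping centre",  8.0, True),
    ("estate A",         9.0, False),
    ("estate B",         6.0, False),
    ("estate C",         4.0, False),
]

def shed(loads, supply_mw):
    demand = sum(mw for _, mw, _ in loads)
    shed_list = []
    # Disconnect the biggest non-critical loads first
    for name, mw, critical in sorted(loads, key=lambda l: -l[1]):
        if demand <= supply_mw:
            break
        if not critical:
            shed_list.append(name)
            demand -= mw
    return shed_list, demand

disconnected, remaining = shed(loads, supply_mw=30.0)
print("disconnected:", disconnected)
print(f"remaining demand: {remaining} MW")
```

Which is exactly why the Light Blue Touchpaper crowd are nervous: whoever controls that off switch, controls who goes dark.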