BT is still working to restore access to thousands of people and businesses in east London left offline when a tunnel borer cut through fibre cables and copper wire. The problem is also preventing Transport for London from managing its traffic lights. Contractors working on the Olympic site in Stratford sent a large thrust borer …
wait a minute...
has no one else noticed something really odd about this... contractors working at the weekend?!
blame the accountant trolls..
A couple of decades ago, when I was a telecoms network designer, the creed was "it must not fail", and we designed in backups on backups.
Nowadays the mantra is "cost down", and huge chunks of the country rely on single exchanges.
That'd be the result..
This just proves...
That the ultimate survival tool is a 5-metre length of fibre cable: no matter where you are, you just have to lay the cable down, and a work crew will be there within the hour to dig it up.
Aah, this might explain...
...why London City Airport was in a chaotic mess this morning, with huge queues at check-in, and a huge wait at the gate for my hugely delayed plane.
They mentioned their "main IT feed" being down. The whole airport seemed to be being run with pen and paper.
BT have (or certainly had) a dedicated office in Stoke that deals with works planning. Surely any competent burrower would check with all utilities before digging on such a scale. I can't fault BT here. Unfortunately.
Your broadband connection is only piggybacked on your phone line on the local loop, and jumps off it at your local exchange. From there the two signal paths are separate.
As you have broadband, but no analogue phone, either only the phone line goes via the broken tunnel and the IP data goes a different route, or the IP data is on one of the other links in the tunnel which survived...
So give it a day or two and I'm sure they'll manage to damage the IP datapath too :-)
"since the cut is half way through a link, you can't expect to see any 'colour coding' as you suggest. And to claim 1. you would route it overland and 2. 'you would have it done in half a day given BT's resources'? Are you serious?"
You have NEVER seen a proper fibre "cable", have you? No, really, you haven't. How the FU$K do you think they terminate main trunk fibres? Do you think they identify fibres one at a time, or are you used to BT installations in your office, where they might "blow" a fibre in from time to time? Do you think that when laying a transatlantic submarine cable they lay a tube and then blow fibres down it one at a time, or just make a cable 3,500 miles long? You have to "joint" fibre optic cables every so often.
For your information, if you sliced a main trunk fibre (anything from 2 to XXX fibres) you would find the following colour codes:
And then guess what: if there are more than 12 fibres, they have these things called "tubes", which are also colour coded. So yes, identifying which fibre goes to which fibre is EASY. That's why we have these things called "Standards". The above colour code allows for 144 fibres (12 fibres x 12 tubes), which is probably one of the larger cables in use, because with modern DWDM technology you can get many more channels per fibre, so you don't need to sink as many cables into the ground (or under the sea).
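For illustration, the tube/fibre colour lookup the poster describes can be sketched in a few lines. This is a minimal sketch assuming the common TIA/EIA-598 12-colour sequence; BT's actual colour scheme may differ, so treat the names as placeholders:

```python
# Hypothetical sketch: identify a fibre in a 144-fibre cable (12 tubes x
# 12 fibres) from its tube colour and fibre colour, assuming the common
# TIA/EIA-598 12-colour sequence. BT's own scheme may differ.
COLOURS = ["blue", "orange", "green", "brown", "slate", "white",
           "red", "black", "yellow", "violet", "rose", "aqua"]

def fibre_number(tube_colour: str, fibre_colour: str) -> int:
    """Return the 1-based fibre number (1-144) for a tube/fibre colour pair."""
    tube = COLOURS.index(tube_colour)    # 0-11: which tube in the cable
    fibre = COLOURS.index(fibre_colour)  # 0-11: which fibre within that tube
    return tube * 12 + fibre + 1

print(fibre_number("blue", "blue"))  # first fibre of the first tube -> 1
print(fibre_number("aqua", "aqua"))  # last fibre of the last tube -> 144
```

Because both ends of a cut see the same colour pairs, a jointer can match fibre to fibre without tracing anything, which is the poster's point.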
As for doing it in half a day and routing fibre overland: I am sure that BT, like the huge international telco I work for, would have some mobile disaster recovery vehicles which they could park, say, 500 m apart and run some temporary armoured fibre between, so that they could put a joint at each end (because they would have access points into the tunnel at regular intervals), plug in, and off you go. Yes, I am being serious. The MAJOR headache would be the COPPER cables (also colour coded), which carry less traffic per pair and need THOUSANDS of pairs to be jointed.
"You really expect BT to put 2 lines to every sodding building in the UK?"
Er, you do know the standard BT consumer cable can support 3 separate lines as standard? And sometimes has to, if one of the pairs gets damaged.
Historically BT has multiple routes in its trunk network between exchanges. Partly to allow for growth (they did think telephone use would grow) and partly for resilience as at one time the telephone unions were apt to down tools regularly. Electromechanical re-routing. Who'd have thought it possible?
A tunnel 30 m down sounds like a trunk line, as it's a hell of a way to burrow just to hook up some subscribers.
You need to understand the concept of resilience.
If you pay for a non-resilient connection, this is what happens when the link is broken. The fault lies with the companies who have gone for the cheap option when implementing their comms infrastructure. I'm sick of listening to moronic comments from people who have zero understanding of telecoms networks.
Notice any banks in the list of affected companies, many of whom use BT? Thought not, because they paid for the extra connection.
"You do have to wonder if all these professional building contractors and companies can actually read maps"
If you've seen some of BT's plans you'd know why they dig the cables up.
We saw the ones for our building with a BT project leader. They were utter rubbish: main feeds shown coming into the wrong building, feeding up on the wrong side of the road, etc. etc....
BT STILL managed to chop through a feed, despite us telling them they were digging in the wrong place....
re: blame the accountant trolls
It is a perfectly reasonable thing to do a cost/benefit analysis on the amount of money that you are prepared to spend on resilience. Putting in 100% fully resilient, redundant systems for everything costs money (in the case of IT systems, often more than double as all those resilience processes have to be tested, rehearsed, documented etc.). In the case of some systems, like aircraft flight control systems, the consequences of any single failure would be too large to bear. In others, then the costs of a very rare failure could well be much less than the amount of money required to protect against it.
Now, this is not to say that the accountants always get this right and don't sometimes take shortcuts without due regard to the consequences, but the principle holds. Spending disproportionate amounts of money protecting against very rare, albeit expensive, incidents like this is often not cost-justified. It's cheaper to take the hit and pay compensation (or, in this case, presumably the contractor's insurers will do that) than to increase the total cost base.
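The trade-off described above comes down to simple expected-value arithmetic. Here's a toy sketch with entirely made-up numbers (the costs and failure rate are illustrative assumptions, not figures from the article):

```python
# Toy cost/benefit sketch: compare the annual cost of full resilience
# against the expected annual loss from a rare outage.
# All figures below are hypothetical, for illustration only.
resilience_cost_per_year = 2_000_000  # assumed extra annual spend on redundancy
outage_probability = 1 / 25           # assume one major cable cut every 25 years
outage_cost = 10_000_000              # assumed compensation + repair bill per incident

expected_annual_loss = outage_probability * outage_cost  # 400,000 per year

if expected_annual_loss < resilience_cost_per_year:
    print("cheaper to take the hit and pay compensation")
else:
    print("resilience pays for itself")
```

With these numbers the expected loss is a fifth of the resilience cost, so the "cheap option" wins on paper; flip the assumptions (as for a bank or an air traffic system) and the answer reverses.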
There is also, as others have pointed out, a difference between a localised failure (albeit over a fairly large area) and a systemic failure that brings the entire network down.
Actually they do, they just happen to be in the same 4-core cable.
Makes me glad I'm nowhere near London.
Why does everything have to be centralised there?
Oh, I forgot, the place is filled with people who don't believe/care that there is anything outside the city.
The only separate network in London is Urban Wimax's. Shame it doesn't reach as far as Ilford - it only covers from the City to Hammersmith as yet.
BT doesn't actually know where all of its cables and ducts go. Before it was privatised record-keeping was shocking. I once asked how many Megastream circuits we had and they hadn't got a clue!
You may laugh, but it's actually a verifiable fact that there is nothing and no one outside of London that is not computer-generated. You might *think* you exist, you may even see 'evidence' that you exist, but... oh, I couldn't go into detail, it'd blow your mind, man.
But how do you know you aren't a brain in a jar connected to a simulation of London, which in turn runs a simulation of the rest of the world?
I left my ISP
"We lost an entire BT Central DSL platform... you need to take a look at your ISP as much as BT for this little disaster..."
I did. I'm no longer with Namesco - I was out for nearly three days.
I wonder why the Olympic site needs tunnels bored at the deep level of BT's bomb-proof comms tunnels? Conventional utilities are nowhere near that deep, and how many athletics events take place 100 feet underground?
The poor Contractor
will be liable for all the costs of this outage...
...if the tunnel that got perforated was accurately shown on the maps/plans/records.
It may well be that the person doing the recording of the tunnel location made an error. It does happen.
Maybe it was a computer error. That's what gets the blame a lot of the time.
These DLTN tunnels are SECRET !!!!!!
Careless talk costs Bandwidth.
So do big knob bell end thrusters.
Multiple ISPs and multiple data connections don't always solve the problem of downtime. All the fixed-line providers out there usually end up in the same ducts or exchanges anyway. Go wireless: have two truly diverse connections, one in the ground, one over the airwaves. It's the only way to get rid of local and common points of failure, or at least not be impacted by them too much. At the end of the day most businesses have internet connections - why? To help them make money, of course!
If the strategic decision makers don't realise the risks they take for their business by putting all their eggs in one basket, then they need to be educated on how to mitigate them. This type of article helps, a lot!