Reply to post: Re: Differing cultures

'Do not tell Elon': Ex-SpaceX man claims firm cut corners on NASA part tests

Anonymous Coward

Re: Differing cultures

@Destroy All Monsters,

"The only item missing is Risk Management."

Quite right. Risk management, at least the management of really big risks, seems to be something that tech companies are doing really badly, in my opinion. For example, TEPCO in Japan ignored the advice of its own engineers and the Government regulator and kept Fukushima running for a small profit. How they must wish they'd taken the risk seriously...

They had grudgingly spent the money on pressure relief valves just prior to the earthquake; without those, it would have turned into 3 x Chernobyl (exploded containment, cores venting freely to the atmosphere, bye bye large parts of Japan, probably including Tokyo), so in a way they were incredibly lucky.

Take Google and their self-driving car effort. Surely they know that if they build a "self-driving car" then, like it or not, they inherit all liability for the actions that car and its software perform, irrespective of what the EULA says. That is a phenomenally massive risk to the existence of Google / Alphabet. They'd be one software bug away from millions of simultaneous accidents. OK, so their testing might eventually suggest that such a bug is unlikely, but the risk is non-zero.

Given that, one wonders why they're bothering. Let's see...

Risk likelihood: low (or at least that's the idea).
Risk impact: lots of dead people; Google ceases to exist as a company under the burden of mass litigation.
Risk mitigation: none, it cannot be eliminated; lots of testing needed to minimise it.
Risk complexity: high (driving a car is a complicated, unpredictable thing).
Risk perception evolution: as time passes, Google will be perceived as more liable as people forget how to drive.
Current risk mitigation activity: sticking our heads in the sand and pretending that the risk won't exist when we're done.

Personally I think the State of California is doing Google (and everyone else doing self-driving cars) a massive favour by publishing the statistics from their on-road testing. It means they're not being allowed to fully ignore the risk they're running...

As for SpaceX, and I suppose Tesla etc., these are organisations that haven't had any real experience of dealing with a death for which they bear the responsibility. The closest they've come is Tesla drivers taking the phrase "Auto Pilot" too literally (and, given the nature of the phrase, it's understandable why they did, even if it was crazy to do so).

If a company builds something that ends up killing someone, the company, from the top down, has to show that it took all possible measures to guard against a fatal outcome. If it cannot, and if it's shown that it deliberately did not take those measures, then someone goes to jail.

So with SpaceX and manned flight, Elon Musk personally is running the risk of winding up in jail if an astronaut is killed and the subsequent inquiry reveals that he ordered short cuts in QC, or instilled a culture of fear, or the like.

I wonder how he's personally managing that risk? Has anyone actually told him of that risk?
