Re: Even if it was down to "a couple of software engineers"...
[No apologies for length. Share and enjoy.]
It would be very surprising if the VAG incident comes down to "rogue engineers".
Analysis of similar "engineering vs commercial pressure" precedents typically shows that occurrences like this come down to "management culture" (not even "rogue management", just the routine way an organisation does its business).
Even if an individual engineer does something stupid, the *organisation* is supposed to have a set of processes (and in some cases even "corporate ethics") to stop the stupidity reaching the outside world. Realistic system testing, proper QA processes, and the like tend to be helpful, but have an unfortunate tendency to be seen by HQ as "non value added".
When the company loses tens of billions of dollars because such "NVA" tasks are skipped, maybe it's time for HQ to rethink that viewpoint.
I mention "corporate ethics" because large companies I am familiar with claim to have formal corporate ethics programmes, whistleblower hotlines, and so on. I'm afraid that experience in many organisations (private and public sector alike) suggests that these ethics hotlines exist for cosmetic reasons only.
One organisation I knew had such an "ethics programme", but it also had an informal, widely understood and very clear "don't rock the boat, or else" policy: senior engineers/managers do not make mistakes or misjudgements. Even to think aloud about the possibility (at a point where a well-informed independent outsider would likely be asking "why is this approach a good idea? Show me the documented arguments in its favour") would be a career-limiting move.
Below is a very brief extract from a presentation given to an oil and gas industry conference on some cultural aspects of the 'modern' large-organisation engineering process. The presenter is not an engineer but a man with a history of investigating how disasters happen. Do you think his thoughts might be relevant in this picture?
SEVEN THEMES OF NIMROD
11. The following seven themes struck me forcibly as I began to investigate the Nimrod story:
(1) Complexity. The Byzantine complexity of the organisation, the rules, the standards, the safety processes in the MOD was remarkable. Complexity and change had become the altar at which a lot of senior people worshipped, but had become the problem rather than the solution.
(2) Dilution. The immediate casualty of this complexity was a dilution of responsibility and accountability – it was difficult to divine who was responsible or accountable for what. Accountability is the ‘reciprocal’ of Responsibility.
(3) Management by committee and consensus. There were more committees, sub-committees, working parties etc dealing with safety-related matters than the UN.
(4) Lack of challenge. There was a culture which rewarded conformity rather than the asking of awkward questions.
(5) Migration. There was a migration of decision-making and budgetary power away from those with most direct working knowledge to those sitting in warm offices back home.
(6) Triumph of generalists over specialists. There was too little appreciation of engineering specialist skills, too great a reverence for the young MBA.
(7) Paper safety. Safety was increasingly a paper, coloured diagram and PowerPoints exercise, rather than a people, process and cultural matter.
From Charles Haddon-Cave's presentation on "LEADERSHIP & CULTURE, PRINCIPLES & PROFESSIONALISM, SIMPLICITY & SAFETY" at a 2013 conference marking the 25th anniversary of the Piper Alpha oil rig disaster.