Initial investigations into the Qantas Airbus A330 mishap have concluded that it was due to incorrect information fed into the flight control system and not interference from passengers' gadgets. A report by ABC News states that the Australian Transport Safety Bureau (ATSB) said incorrect information from a faulty air data …
Not passenger gadgets, eh. What a fricking surprise.
The way of the world now is to first point fingers and blame everything/everyone else while the headlines are being made. Then figure out what actually went wrong -- but that gets reported way down on page 23 under the Viagra adverts.
No wonder credibility is dead.
just as well for Airbus
If the "laptops were responsible" story HAD been true then it would be curtains for Airbus. Absolutely NO-ONE would fly on their planes again.
I can believe that various devices might cause interference with navigation (not for sure, but some could). But a mid-air upset? No way! I can't imagine that any certification authority (FAA, etc) would allow an aircraft to fly if the flight controls were subject to laptop interference.
And it's totally irresponsible of Qantas to have made such a suggestion.
So the evil computer
Did exactly what it was told to do?
Of course it is now a 'fact' that a laptop caused a rash of plane crashes with all on board killed horribly
Qantas never said it was a passenger's computer
"A computer onboard the aircraft"
Which journos took to mean a passenger's laptop; it wasn't, as the story above confirms.
So much for triple redundancy
What were the other 2 computers doing when one of them decided to nosedive?? Good thing this didn't happen at landing or takeoff or 200 people would be dead.
..but flight computers probably were
According to the reports in comp.risks (digest 25.38) this was due to a faulty Air Data Inertial Reference Unit (ADIRU). However, an A330 carries THREE of these and only one was faulty. It was sending "erroneous, spike" data to the Flight Control Primary Computers, which disconnected the autopilot. A short while later the ADIRU sent "very high, random, incorrect" values to the computer, causing a pitch-down command.
It seems to me that the flight control computers contain bugs because (1) I'd expect the computers to ignore or smooth spikes and (2) the whole point of having three sensors is so that errors can be spotted by comparing the sensor outputs. Evidently the computers didn't do this correctly or they'd have decided that the spiking ADIRU was faulty and turned it off.
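The comparison the poster describes is essentially a majority vote. A minimal sketch (my own illustration, with made-up numbers and tolerance, not Airbus's actual logic): with three redundant sensors, taking the median masks a single spiking channel, and any channel that disagrees with the median beyond a tolerance can be flagged as faulty.

```python
# Illustrative only: median voting across three redundant sensor channels.
def vote(readings, tolerance=2.0):
    """Return the median of three readings plus the indices of any
    channel disagreeing with the median by more than `tolerance`."""
    median = sorted(readings)[1]
    faulty = [i for i, r in enumerate(readings)
              if abs(r - median) > tolerance]
    return median, faulty

# ADIRU 1 spikes while units 2 and 3 agree: the vote still returns a
# sane value and identifies channel 0 as the suspect to switch off.
value, suspects = vote([50.6, 2.1, 2.3])
print(value, suspects)  # 2.3 [0]
```

With only two sensors you can detect a disagreement but not resolve it; three is the minimum for this kind of outvoting, which is presumably why the A330 carries three ADIRUs in the first place.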
The full ATSB report is here:
Little wonder. Everyone wants everything...*yesterday*. That includes answers.
I don't think it's passenger gadgets either
But shurely it would be relevant to know *why* the inertial reference wotsit data was either wrong or misinterpreted, before ruling out various possible root causes? And *then* we can reasonably conclusively say what the possible cause might (or might not) be?
'random and incorrect' ....Huh?
"...About two minutes after the initial fault, the [ADIRS] generated very high, random and incorrect values for the aircraft's angle of attack...."
I have worked on safety-critical and flight control systems before and this sounds completely off the wall.
Firstly, there should be software filtering which can detect 'random and incorrect' values outside of normal flight parameters, discount them and mark a fault condition. Secondly, there should be multiple sources of data, so that the failure of one can be discounted by comparison to the other sources.
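The first kind of filtering the poster describes might look something like this sketch (the plausibility range and behaviour are my assumptions for illustration, not the real ADIRU interface): values outside plausible flight parameters are discounted and a fault condition latched, rather than being passed downstream as real data.

```python
# Illustrative plausibility filter: out-of-range values are rejected,
# the last good value is held, and a fault condition is latched.
AOA_MIN, AOA_MAX = -10.0, 30.0  # assumed plausible angle-of-attack range, degrees

class AoaFilter:
    def __init__(self):
        self.fault = False
        self.last_good = 0.0

    def accept(self, value):
        if not AOA_MIN <= value <= AOA_MAX:
            self.fault = True       # mark the fault condition...
            return self.last_good   # ...and discount the reading
        self.last_good = value
        return value

f = AoaFilter()
f.accept(2.2)
out = f.accept(50.6)   # a "very high, random, incorrect" spike
print(out, f.fault)    # 2.2 True
```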
So what went wrong here?
Why the ADIRU presented wrong data....
...appears to be because a failure mode was considered unlikely and the software was not written to cope with it.
Essentially, it was expected that should a given circuit fail the output voltage would fall to zero, and this would be detected by software and subsequently this input would be ignored. It appears that the failure mode that was discounted, where the output voltage would stick at its maximum, actually occurred but because this was not checked by the software in the autopilot it was misinterpreted as real data with the results we now know.
So, someone's Failure Mode Effects Analysis did not include this failure, and so no one coded protection against it.
Thumbs down, because they were lucky the A330 didn't turn upside down....
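The FMEA gap described above can be sketched in a few lines (the voltage figures and function shape are assumptions for illustration, not the real ADIRU code): the software checked for an output stuck at zero, but not for one stuck at the maximum, so a full-scale output was treated as genuine data.

```python
# Illustrative stuck-at check: a channel pinned at either rail should be
# rejected, not just one pinned at zero (the only case that was coded for).
V_MAX = 5.0  # assumed full-scale sensor output, volts

def channel_ok(samples, eps=0.01):
    """Reject a channel stuck at either failure rail."""
    stuck_low  = all(v < eps for v in samples)          # the anticipated failure
    stuck_high = all(v > V_MAX - eps for v in samples)  # the missed failure mode
    return not (stuck_low or stuck_high)

print(channel_ok([0.0, 0.0, 0.0]))    # False: failure the FMEA covered
print(channel_ok([5.0, 5.0, 5.0]))    # False: failure the FMEA missed
print(channel_ok([2.1, 2.2, 2.15]))   # True: normal variation
```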
I'm even more surprised now
Apparently "an Inertial Reference System fault occurred within the Number-1 Air Data Inertial Reference Unit (ADIRU 1), which resulted in the Autopilot automatically disconnecting."
So the system knew that ADIRU 1 was broken.
Yet for some reason "The faulty Air Data Inertial Reference Unit continued to feed erroneous and spike values for various aircraft parameters to the aircraft's Flight Control Primary Computers which led to several consequences including:" (the jet plunge)
I'm confused - if the system knew that said ADIRU was broken, why the heck does it continue to pay any attention to it? Surely once it's dead, you kill it and ignore it.
At the very most, you shut it down, reboot it, and check if it's still playing up.
In the fail-over backup systems I install and configure, that's the procedure! And our 'worst case scenario' is a live TV broadcast going flashy - which isn't going to actually hurt anybody.
If the system was getting two sets of information that were at odds with each other, then disconnecting autopilot _might_ be a sensible thing to do. You know _something_ is wrong but not _what_ is wrong.
So it didn't know the ADIRU #1 was broken. It knew the ADIRU data was screwed.
And as Brian suggests, nobody had taught it what to do next in that situation.
Where's the Stanley Kubrick icon? Daisy, daisy...
has anyone considered
a plot from our evil computer overlords?
The latest thinking is
that high powered transmissions from a nearby naval base may have been responsible.
"No way! I can't imagine that any certification authority (FAA, etc) would allow an aircraft to fly if the flight controls were subject to laptop interference."
FAA are a bunch of idiots: see for example TSB findings on Swissair Flight 111.
resp to Kevin
"If the system was getting two sets of information that were at odds with each other, then disconnecting autopilot _might_ be a sensible thing to do. You know _something_ is wrong but not _what_ is wrong."
No, the safest thing to do would be nothing: simply ignore the input and carry on just as before.
Why? Think about it. The autopilot was engaged and the aircraft was in a safe situation, probably flying straight and level. When errant input appears, ignore the new inputs: you know something is wrong, but not what. Sound an alarm to get the attention of the pilot, issue a voice warning to state the problem, and suggest disengaging the autopilot.
If the system disengages the autopilot automatically, the flight control system will read the demand inputs from the flight controls (stick, throttle lever), which can be in any position, and will regard these as genuine inputs from the pilot, which they are not.
I would argue it is safer not to disengage the autopilot. Having said that, if the aircraft was losing height when the autopilot was engaged and then the errant input from the ADIRS occurred, you might decide that leaving the autopilot engaged is unsafe (depends how high you are and how high the mountains are!). But you're still at risk of the flight controls being in an unsuitable state, and of the flight control system taking those inputs and moving the control surfaces to achieve what it thinks the pilot demanded.
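The two policies being compared can be boiled down to a toy function (my own sketch of the argument, with invented numbers; real flight-control law is nothing this simple): auto-disengagement hands control to wherever the stick happens to rest, while hold-and-alert keeps the last trusted autopilot demand.

```python
# Toy comparison of the two fault-handling policies described above.
def pitch_command(fault, autopilot_pitch, stick_pitch, disengage_on_fault):
    """On a sensor fault, either hand control to the stick (auto-disengage)
    or hold the last autopilot pitch demand (hold-and-alert)."""
    if fault and disengage_on_fault:
        return stick_pitch          # stick may be in any position
    return autopilot_pitch          # hold the straight-and-level demand

# Fault occurs while the stick happens to rest nose-down (-8 degrees):
print(pitch_command(True, 0.0, -8.0, disengage_on_fault=True))   # -8.0 (upset)
print(pitch_command(True, 0.0, -8.0, disengage_on_fault=False))  # 0.0 (holds level)
```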
RE: ..but flight computers probably were
"... an A330 carries THREE of these and only one was faulty."
WHERE'S MY MINORITY REPORT?!
Did you say 'overlords'?
You meant 'protectors'...
Reminds me of...
Why does this remind me of "What do you mean kilometers, we thought it was miles," and the ever infamous "Unsigned Values? No, we always passed _SIGNED_ values. Why would you assume otherwise."
One of the funniest, and scariest, presentations I ever went to was at an Embedded Systems Conference a few years back. The presenter was talking about failures that should have been caught, but weren't, and the results of them in some safety-critical systems. (One was a radiation therapy machine, one was a Mars probe, several were missiles.)