Oddly enough, when a Tesla accelerates at a barrier, someone dies: Autopilot report lands

A Tesla with Autopilot engaged accelerated toward a barrier in the final seconds before a deadly crash, an official report into the crash has revealed. Apple engineer Walter Huang was driving his Model X P100D on a Silicon Valley freeway on the morning of March 23 when the car, under computer control, moved into the triangular …

      1. Anonymous Coward

        Re: Nothing is right first time

        They are only following the illustrious example of Microsoft, which has been treating its users as unpaid beta (and sometimes alpha) testers for decades now.

        But at least Windows doesn't usually kill you.

      2. rg287

        Re: Nothing is right first time

        Or did I miss it and those early jets were taking passengers and crashing into airports killing people while they worked the bugs out?

        Well yeah actually, three De Havilland Comets broke up in 12 months before they grounded the fleet and discovered this new thing called "Metal Fatigue".

        1. Lee D Silver badge

          Re: Nothing is right first time

          Gosh, if only you could trial them at slow speed on things that are lesser risk, in areas where it's safer to go wrong.

          Everything from golf carts ("Drive me to hole 9") to milk floats, theme park transport to warehouses.

          No, nobody did that. Nobody bothered. It was straight into "self-driving car, but it's not really self-driving, but everyone thinks it's self-driving, smack bang on public highways with all the other drivers at 80mph".

          There's a little train for the kiddies, that's actually just a wheeled vehicle, that drives around my local shopping centre. Pay £1. Stick the kids in. You loop around the upper level and come back. There are low-speed pedestrians everywhere, the train makes a noise to let you know it's coming, it travels at about 5mph with some minimum-wage employee on the controls, past other shoppers, in a controlled environment, on a fixed route (that isn't railed or anything, just drives through the shoppers).

          That would be INFINITELY safer to test on, especially as regards "What if someone just steps out in front of us". Worst case, you'd catch a pedestrian on the ankle. I see no evidence that Tesla's etc. code has been tested significantly in such environments. Hell, a self-driving shopping cart! Genius! And a sub-product you can actually sell even if you can't scale it.

          But these things are still making stupid mistakes and are entirely unpredictable.

          1. Charles 9

            Re: Nothing is right first time

            As the saying goes, "Ain't nothin' like the real thing, baby." There's just no substitute for actual, on-the-road testing, just as the final phase of clinical trials always involves actual people.

      3. JohnG

        Re: Nothing is right first time

        "They can have their teething problems OFF THE PUBLIC ROADS!"

        Then the systems will never be ready for public roads, because they will not have been tested in the real world and will have insufficient data/"experience" of the variations in real world road markings, signage and driver behaviour.

        "Or did I miss it and those early jets were taking passengers and crashing into airports killing people while they worked the bugs out?"

        That is precisely what happened with the Comet and numerous other aircraft types. Of course, manufacturers and safety regulators attempt to address all the bugs before the aircraft enters service, but numerous accidents have resulted in recalls and retrospective changes. This is pretty much the story of every accident investigation programme on TV.

  1. Anonymous Coward

    Still a bit of uncertainty

    The report gives a lot of detail, but there are still unanswered questions, particularly about what the driver did or didn't do.

    This leaves a bit of wriggle-room for those who would rather blame the driver than consider that the vehicle was primarily at fault. For my part I think that in effect the 'autopilot' committed suicide for reasons unknown.

    Why it did so requires a detailed technical investigation, but in the meantime I think it is a gross mis-representation, leading to a false sense of security, to call the driver-assist function an 'Autopilot'.

    The end result is a system that can fail catastrophically, and that should be sorted out before the public are allowed to drive these things on public roads.

    1. Anonymous Coward

      Re: Still a bit of uncertainty

      "For my part I think that in effect the 'autopilot' committed suicide for reasons unknown".

      Ever read Frank Herbert's "Destination: Void"?

      1. Anonymous Coward

        @Archtech - Re: Still a bit of uncertainty

        No, that story passed me by. Either that or I read it so long ago (~50 years) that I forgot it.

        Perhaps people who develop AI for safety-critical uses should spend some time reading SF back to the 50s, to get an idea of the fuck-ups that they should be watching out for.

  2. Chris G

    All or Nothing

    Autonomous vehicles will be fine when they work, but IMHO at present Autopilot and its relatives from other companies are on a par with human-based drug trials. The drivers are guinea pigs who are part of the process of developing the tech.

    I don't think any system that gives a false sense of security can be allowed on the roads, because it will end in accidents like this one: a tired driver who, like most of us, thinks he can pay attention to the road, relax, text or whatever, puts his trust in these systems and then sometimes pays the price for his inattention. If the system had not been on, or not installed at all, the accident would probably not have happened.

    At the current state of development, anyone in control of an autonomous or semi-autonomous vehicle is ultimately responsible for it. If even the testers working for the likes of Uber and Google are allowing their attention to lapse and subsequently having accidents, then the systems should not be on public roads.

    When they work reliably, with adequate redundancy in their sensory and analytical systems, maybe they will be usable.

    Though not something I would want, the simplest way to be able to get in a car, give a destination and sit back and relax is to have either a driver, or control from a central source that oversees and choreographs all of the vehicles in a given zone and hands over to the control for the next zone when passing into it (the next horror: 'The Internet of Vehicles').

    In large cities and conurbations autonomy doesn't really make sense, it's a vanity.

    To some extent governments and authorities are partly to blame in encouraging manufacturers to roll this stuff out early before development is sufficiently advanced.

    1. Charles 9

      Catch-22

      What you propose, however, is a Catch-22.

      Because the ONLY way to get them considered trustworthy on public roads is to TEST them. But the ONLY way to test them reliably is on public roads. There is NO substitute.

  3. bish

    Fire Department

    I realise that everyone is far more interested in attacking or defending Tesla's flakey autopilot, but can I ask: what were the fire department doing, pouring water on a burning battery? Electric and hybrid cars are pretty common now (more so in the Valley, I'd guess), so either no one has bothered to tool up the fire fighters with suitable extinguishing materials, or they haven't yet realised that pouring water on a chemical battery is probably the second worst thing you can do, behind setting fire to it in the first place.

    1. Ben Tasker

      Re: Fire Department

      The water is used to cool the packs. They actually used foam to try and extinguish the fire.

      1. YetAnotherLocksmith Silver badge

        Re: Fire Department

        Indeed.

        There's not a lot you can put on a few hundred kilos of burning lithium to put it out.

      2. Charles 9

        Re: Fire Department

        The lithium in battery packs as I recall isn't raw metallic lithium but rather in a compound. The result is that the material is not nearly as water-sensitive. That's why airliner guidance for a phone battery on fire is to douse it; for something as sensitive to fire as an airliner, they wouldn't be saying this if they didn't consider the ramifications carefully. In this case, cooling down the battery to prevent further thermal runaway is clearly more of a benefit than the risk of a lithium reaction.

    2. Anonymous Coward

      Re: Fire Department

      The thing about water and lithium is a bit overstated. Lithium is less reactive than the other alkali metals. Yes it produces hydrogen when wet, but the rate of production of hydrogen depends on temperature and surface area. Apply lots of water to keep the temperature down, because simply denying it air won't work terribly well once it's over 85C.
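
      (For reference, the reaction being alluded to is 2Li + 2H2O → 2LiOH + H2, plus a good deal of heat, so the rate really is the thing to manage.)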

      1. Lee D Silver badge

        Re: Fire Department

        The problem is the cell-compartmentalisation. It takes only one cell to go into thermal runaway and you have flames. But cooling them involves pouring water on the middle of a battery that's almost entirely solid metal contained in a metal box. It's hardly accessible.

        It's not going to go "boom" on contact with water, but it's going to expand, release gas, put pressure on nearby cells, all contained in a fixed size box with thousands of other cells that were - until then - untouched.

        And as shown, even days later the tiniest sliver of stray/punctured/scored metal from the accident can short out a cell and start the fire again.

        I have seen a MacBook keyboard physically expand upwards and then ping keys off it, over the course of just minutes, because a battery got wet. The battery was literally 2-3 times its original size and punched/warped the aluminium casing in a matter of seconds. That's not helpful.

  4. RobertLongshaft

    Why the hatred for Tesla and Musk?

    Is it because he told you the truth about your pantomime profession?

    Did the driver have his hands on the wheel like he was supposed to? No. Case closed.

    1. Anonymous Coward

      Dear "Robbed of your longshaft",

      Did the valid criticism of Your Beloved Elon cause your tech erection to flag?

      1. DropBear

        While I don't find Elon's reaction particularly tasteful, I don't think he did anything out of the ordinary either. Tesla didn't lie; they were simply quick to point out anything and everything that may have contributed to the crash besides their own role. Hardly surprising, that. Every single person and company I can think of does exactly that immediately whenever blamed for something. It may not be the reaction you're looking for, but it's certainly human nature...

    2. sabroni Silver badge

      re: Is it because he told you the truth about your pantomime profession?

      Oh no he didn't!!

  5. Paul Hargreaves

    There are a lot of non-Tesla drivers in these comments.

    Anyone who actually owns one, having spent a few minutes getting to know Autopilot, quickly learns its limitations. It's pretty damn good on motorways, but you know as soon as you come to junctions you have to actively tell the car what to do.

    Anyone who is stupid enough to just let the car drive, without paying any attention at all, would probably be the sort of person who would have done the same thing in a car without the feature.

    1. Baldrickk

      Non tesla driver here

      I test drove a Model X and got the salesperson to turn on Autopilot.

      The car immediately accelerated hard and pulled sharply to the left, presumably to find the line, only it was sharp enough to have taken me off the road had I not resisted the turn, which stopped it. Autopilot was immediately turned off again.

      It's not fully autonomous, and I wouldn't be happy to leave it trying to drive without my guidance/overwatch if I were to get one.

      1. JohnG

        Re: Non tesla driver here

        "It's not fully autonomous, and I wouldn't be happy to leave it trying to drive without my guidance/overwatch if I were to get one."

        Which is exactly what the user manual says you should do. The autopilot systems are in beta and full self driving is not yet available (FSD probably won't be available for a long time, probably eons or elons)

      2. DryBones

        Re: Non tesla driver here

        It seems to me that the machine vision is being done wrong, and completely backward, and needs to go back to first principles.

        How do I stay on the road?

        - First, find the edges of it. Edge detection is key.

        - Lanes have a mostly standardized width, so it is pretty easy to figure out how many there should be. If the number is sufficiently fractional a lane is probably merging.

        - Next, look at the motions of other cars, they are likely to give a good indication of pathing.

        - Last AND least, look at lane markings, because 101 has too many bloody places where they didn't paint over the old markings so they cross over each other and run straight into barricades.

        How do I navigate unexpected obstacles?

        - My vehicle can be described as a rectangular prism of "Will bend and break if intersected".

        - Around it there is another slightly larger area of "Keep Out Zone" that I want to try to protect.

        - I should choose a path that will allow me to pass without allowing any intersections of my "Keep out zone" with the current and projected paths of objects. It does not matter if it is a ball, a child, a bicycle, or a car: it is not desirable to hit it.

        - It is easier to identify things like wind-blown paper, bags, etc which are not a problem than the myriad things which are, so train for the smaller set and treat the rest kinematically.
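
        A minimal sketch of that last point, purely as illustration: project everything forward at constant velocity and reject any candidate path whose keep-out zone gets intersected. All the names, box sizes and numbers here are made up for the example, not anyone's real code:

        ```python
        from dataclasses import dataclass

        @dataclass
        class Obstacle:
            x: float       # centre position, metres
            y: float
            vx: float      # velocity, metres per second
            vy: float
            half_w: float  # half-width of its bounding box, metres
            half_l: float  # half-length of its bounding box, metres

        def keep_out_violated(ego_x, ego_y, ego_vx, ego_vy, obstacles,
                              margin=1.0, horizon=3.0, step=0.1):
            """Project the ego vehicle and every obstacle forward at constant
            velocity; report True if anything enters the ego keep-out zone
            (the ego bounding box inflated by `margin` on every side)."""
            ego_half_w = 1.0 + margin  # assumed ego half-width, metres
            ego_half_l = 2.5 + margin  # assumed ego half-length, metres
            t = 0.0
            while t <= horizon:
                ex, ey = ego_x + ego_vx * t, ego_y + ego_vy * t
                for ob in obstacles:
                    ox, oy = ob.x + ob.vx * t, ob.y + ob.vy * t
                    # Axis-aligned overlap test: the object's identity (ball,
                    # child, bicycle, car) is irrelevant; only its box and
                    # motion matter.
                    if (abs(ex - ox) <= ego_half_w + ob.half_w and
                            abs(ey - oy) <= ego_half_l + ob.half_l):
                        return True  # projected intersection: reject this path
                t += step
            return False
        ```

        A planner would run this over each candidate path and keep only the ones that never trip it, whatever the obstacle happens to be.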

        1. Ken Hagan Gold badge

          Re: Non tesla driver here

          - First, find the edges of it. Edge detection is key.

          Edge not found for unspecified reason. Now what?

          - Lanes have a mostly standardized width,...

          Not on this road. Now what?

          - Next, look at the motions of other cars,

          Road full of nutters who left it too late to be in the correct lane. Now what?

          - Last AND least, look at lane markings

          Computer vision is rubbish and delivers a *clear* identification of a lane marking that doesn't actually exist. Now what?

          Human beings suffer all of these problems, but get around them by understanding the road and the other users at a far higher level, so when they receive implausible information they can reject it and try harder to find some better sources. We find this process so easy that we usually don't even realise we are doing it. Sadly, we've no idea how we do it. The workaround, so far, for autonomous vehicles is to spend shedloads of cash on numerous and exotic sensors that far outstrip the capabilities of our eyes and ears.

          1. DryBones

            Re: Non tesla driver here

            Rubbish. Think about how you stay on the road sometime. If all those things fail, YOU are going off as well. What if you suddenly go blind or have a stroke? Hey, same result.

            The processes I listed were my understanding of how I stay on the road through less than ideal conditions. There are likely more, but they build into a hierarchical process that gives different weights to different types of data and rejects or adjusts if there are contradictions.

            My point was that the behavior I see reported from self driving vehicles seems like it relies most on things like lane markers that go totally awry when the highway department gets involved, so the way the vehicle determines position and navigation may need a rethink.

            1. Charles 9

              Re: Non tesla driver here

              That's PRECISELY the problem. We DON'T think about it. Not consciously, at least. It happens all SUBconsciously in our autonomous mind, and one of the things we've learned through machine learning is that it's bloody hard to teach intuition, because most of the time we don't know how OUR OWN intuitions work. You can't teach something you don't understand. And before you disregard the idea, consider how much conscious thought we put into walking, which we typically learn as babies when our capacity for reasoned, conscious thought was limited to begin with, yet nigh everyone from schoolchildren to adults handle the process with hardly a second thought. If you want to see just how much goes into a basic walking gait, try playing QWOP (look it up with your favorite search engine).

              1. Alan Brown Silver badge

                Re: Non tesla driver here

                "If you want to see just how much goes into a basic walking gait, try playing QWOP"

                or watch someone learning to walk again.

        2. Alan Brown Silver badge

          Re: Non tesla driver here

          - First, find the edges of it. Edge detection is key.

          The first self-driving tech attempts tried that; they found the car freaked out when the edges changed suddenly (like when the roadway became a bridge) and couldn't cope well if the edge wasn't strongly defined (a lot of places don't paint white lines on the edge).

          What people have found is that everything you think you know about driving isn't actually what you know about driving. It's all the edge cases which make it hard.

    2. Anonymous Coward

      Anyone who actually owns one, having spent a few minutes getting to know Autopilot, quickly learns its limitations. It's pretty damn good on motorways, but you know as soon as you come to junctions you have to actively tell the car what to do.

      So in other words it's fucking useless and no better than cruise control just with a misleading name.

      Mine's the one with a manual gearbox and no flaky ELON9000 trying to murder me, open the pod bay doors Elon.

  6. Anonymous Coward

    Why???

    Why would an engineer - of all people - do such a thing? Riding at the front of a 2-ton lump of metal and plastic travelling at high speed towards other such lumps approaching at equally high speeds, with a head-on crash averted only by some buggy software?

    Even seen as a method of committing suicide, it is excessively complicated.

    1. YetAnotherLocksmith Silver badge

      Re: Why???

      Of course, we don't know which Apple product this guy was working on, do we? If he were the real Miles Dyson, and Elon figured it out... Or, more likely* it was just neutering the competition.

      *unlikely!

  7. 0laf

    Self-driving cars will always kill people. The only question will be: do they statistically kill fewer people than human-driven cars?

    The headlines are never going to go away.

  8. Anonymous Coward

    Everything makes mistakes

    100 people die on US roads per day, likely due to stupid human mistakes, and there's little mention of that.

    One person dies due to one stupid mistake by an autonomous system, and the news won't stop going on about it.

    1. Anonymous Coward

      Re: Everything makes mistakes

      Weird. Humans crash cars all the time and it's not news. Computers hardly ever crash cars and it's news. How does that work? I don't see the difference.

      Oh wait, is it to do with how common the things are?

    2. Wayland

      Re: Everything makes mistakes

      Computer driven cars are in the minority. The crashes are ones that people would not have. A human driver would have no problem with leaving the 101 for the 85 yet should not have allowed the car to attempt this. So it was human error to have allowed the car to do this.

      All car crashes should be regarded as human error. Any time a Tesla crashes on autopilot it's the error of the driver who allowed autopilot to be in control. Alternatively someone hacked the car and murdered the driver.

      1. Baldrickk

        Re: A human driver would have no problem with leaving the 101 for the 85

        As the barrier was damaged from a previous crash, and we haven't heard about a Tesla crashing there before, I think it is safe to assume that a person had indeed crashed there previously.

        So much for human drivers having no problems.

      2. Jediben

        Re: Everything makes mistakes

        "A human driver would have no problem with leaving the 101 for the 85..."

        The damaged barrier present BEFORE this event suggests otherwise...

      3. JohnG

        Re: Everything makes mistakes

        "A human driver would have no problem with leaving the 101 for the 85 yet should not have allowed the car to attempt this."

        The fact that the crash barrier had not been repaired since being damaged in a previous accident indicates that at least one human driver had a problem leaving the 101 for the 85.

      4. Alan Brown Silver badge

        Re: Everything makes mistakes

        "The crashes are ones that people would not have."

        Are they? Are you really sure about that?

        "A human driver would have no problem with leaving the 101 for the 85"

        And yet, the crash attenuator on the gore was removed, because someone had crashed into it in virtually the exact same manner as the Tesla did. Had it been there the crash would have been perfectly survivable.

        More to the point, video of this piece of road clearly shows a misleading lane marker trivially capable of pulling unwary drivers directly into the gore - and I'll point out _again_ that someone had already crashed into the gore, which is why it was in the dangerous state it was in.

      5. Malcolm Weir Silver badge

        Re: Everything makes mistakes

        Very late, but why oh why do people insist on commenting without bothering to look at the data?

        "A human driver would have no problem..."

        EXCEPT ONE DID. In a Toyota Prius. Which resulted in the crash barrier being damaged. Which resulted in the Tesla not benefiting from that barrier. Etc...

        1. MachDiamond Silver badge

          Re: Everything makes mistakes

          "EXCEPT ONE DID. In a Toyota Prius. Which resulted in the crash barrier being damaged. Which resulted in the Tesla not benefiting from that barrier. Etc..."

          Yes, somebody hit the barrier in a non-Tesla vehicle previously. The big "but" is there isn't any information on what might have caused the crash. I can't count how many times I've seen somebody dart across a few lanes to make an exit they weren't in line for. I expect that some of the time those nimrods don't make it successfully and crash.

    3. Baldrickk

      Re:1 person dies due to 1 stupid mistake of an autonomous system

      Tesla autopilot isn't autonomous.

      1. tom dial Silver badge

        Re: Re:1 person dies due to 1 stupid mistake of an autonomous system

        The fact is that the driver of the Tesla apparently chose to abandon control of the car to the autopilot, so it was autonomous in fact even though it was not Tesla's intention that it should be allowed to operate that way.

  9. DrXym

    When will people learn

    It's not the normal events which confound automated driving systems, it's the abnormal ones.

    The reality is that unless a vehicle is capable of handling all situations on the road safely, the driver must be compelled to pay attention. An alert driver combined with an autonomous vehicle is far safer than an autonomous vehicle by itself.

    1. Wayland

      Re: When will people learn

      "An alert driver combined with an autonomous vehicle is far safer than an autonomous vehicle by itself."

      But there's the rub, how can a driver remain alert if he's not doing anything most of the time?

      In order for computers and humans to drive cars together, the human must be involved all the time whilst the computer assists to make the job easier and more precise.

      For instance power steering makes steering easier and more precise so the car is driven better with less effort. The 'autopilot' should be a co-pilot.

      1. DrXym

        Re: When will people learn

        Well that's the point I was making. If you don't keep the driver engaged and the car does something dumb, then there is no human intervention when the car piles into a truck / tree or whatever. An alert, attentive driver can hit the brakes even when the car is doing something dumb.

        And if necessary that means the car has to force the driver to be alert. Force them to hold the wheel, monitor their face, reaction times, issue activities to perform, keep them engaged with the drive. And start bleeping and slow down if they don't react.

        The problem is Tesla didn't bother with any of that in the first instance and has only begrudgingly implemented it now.

        They're not alone in this - all autonomous cars have the same issue.

        1. JohnG

          Re: When will people learn

          "Force them to hold the wheel, monitor their face, reaction times, issue activities to perform, keep them engaged with the drive. And start bleeping and slow down if they don't react.

          The problem is Tesla didn't bother with any of that in the first instance and has only begrudgingly implemented it now."

          This is incorrect - the Tesla Autopilot does (and did at the time of the accident) monitor whether the driver is holding the steering wheel, and will first warn the driver but will ultimately disengage. If it believes the driver is still not responding, it will engage hazard flashers, pull the car over and stop.
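
          As a rough sketch, that escalation sequence is essentially a small state machine. Something like the following, with entirely made-up thresholds and state names; nothing here is Tesla's actual logic:

          ```python
          from enum import Enum, auto

          class AttentionState(Enum):
              NORMAL = auto()           # recent steering input detected
              VISUAL_WARNING = auto()   # flash a "hold the wheel" nag
              AUDIBLE_WARNING = auto()  # chime and start bleeping
              PULL_OVER = auto()        # hazards on, slow down, stop the car

          def next_state(hands_on_wheel: bool, seconds_without_input: float) -> AttentionState:
              """Escalate the longer the driver gives no steering input;
              reset as soon as torque on the wheel is detected."""
              if hands_on_wheel:
                  return AttentionState.NORMAL
              if seconds_without_input > 60:
                  return AttentionState.PULL_OVER
              if seconds_without_input > 30:
                  return AttentionState.AUDIBLE_WARNING
              if seconds_without_input > 15:
                  return AttentionState.VISUAL_WARNING
              return AttentionState.NORMAL
          ```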

          1. Charles 9

            Re: When will people learn

            "If it believes the driver is still not responding, it will engage hazard flashers, pull the car over and stop."

            Is it just me, or am I picturing one of these going into the ditch when it tries to do this on a road with no shoulders?

      2. MachDiamond Silver badge

        Re: When will people learn

        "The 'autopilot' should be a co-pilot."

        A problem happens when you take too much activity away from the person behind the wheel. There has to be a balance between relieving a driver of some tasks while requiring them to perform others to stay involved.

        I use cruise control whenever I can. I get better mileage and I don't skip a heartbeat when I spot a cop and find I've been speeding up and I'm too far over the limit. On the highway in light traffic, I wouldn't be constantly changing speed anyway so letting the car handle staying at the same speed isn't a big deal. I still have to steer and keep an eye on what's ahead and around me. I'm also ready to tap the brakes to disengage the CC if I see traffic slowing ahead or if I'm going to be exiting the motorway. Take the task of steering away and I'm not really doing anything at that point. If I don't need to stay aware of the surroundings constantly to do the job of steering, it's likely my attention will start to wander worse than normal.
