Uber robo-ride's deadly crash: Self-driving car had emergency braking switched off by design

One of Uber’s self-driving cars killed a pedestrian crossing the road at night because its emergency braking systems were turned off, according to an investigation by the US government's National Transportation Safety Board. The watchdog's four-page report released on Thursday is a grave reminder that today's autonomous vehicles …

  1. Martin-73 Silver badge

The driver and Uber both need to be prosecuted for causing death by dangerous driving (or whatever the leftpondian equivalent is). Uber needs to go away now.

    1. JLV Silver badge

See, this little jewel was left out of the article:

      https://arstechnica.com/cars/2018/05/emergency-brakes-were-disabled-by-ubers-self-driving-software-ntsb-says/

      the driver was distracted by having to review/flag an Uber touchscreen, which logs significant trip contextual info. This is part of her job, _per Uber design_.

      Un-fucking-believable. I would love to see a $10M+ fine, and jail time for the brakes-off / driver-does-other-stuff decision makers @ Uber.

      It's just so callous, it seems like a caricature of shitty corporate profit-seeking behavior by a retarded script writer.

      1. DougS Silver badge

If the safety driver is required to take their attention away from that very important job to type crap into a computer, they need a second person in the car to handle the book work.

        All the executives at Uber who were responsible for creating and approving this asinine policy should be held responsible for that woman's death. As well as whoever had the brilliant idea to turn off the emergency braking function!

        1. Chz

          It was also noted that Uber used to have two people in the car - one for entering data, and one for emergencies. Then they decided to save money and merge the roles.

      2. John Robson Silver badge

        "the driver was distracted by having to review/flag an Uber touchscreen, which logs significant trip contextual info. This is part of her job, _per Uber design_."

        At that point the vehicular manslaughter charges against Uber's management should be upgraded to negligent, or even deliberate.

        If you need someone reading a screen during a drive then that should be a different person from the safety driver...

        1. Anonymous Coward
          Anonymous Coward

As others have said (and I originally made the same mistake), I thought the driver was doing other things. In fact they were doing what they were told to do. They may not have been made aware of how the autonomous mode works, or whether they needed to perform emergency braking themselves.

          Which puts a lot of responsibility on Uber.

    2. Anonymous Coward
      Anonymous Coward

      A normal human driven car would still have run over the cyclist/pedestrian/lemming so I don't see how they could have caused the death.

      1. John H Woods Silver badge

        Should have gone to SpecSavers

        You might want to stop driving until you do.

Any reasonably alert and competent driver could (and should) have been able to reduce the speed of the vehicle from 43mph to at least under 20mph in that time, even going by the (possibly artificially darkened) YouTube footage. If, as it seems, the area is more brightly lit, they would have had time to stop the car.

      2. Tikimon Silver badge
        Angel

        Humans see intentions, cars only react afterward

        "A normal human driven car would still have run over the cyclist/pedestrian/lemming..."

        Speak for yourself, and please stop driving! I posted about this yesterday. Humans excel at something the algorithms utterly cannot do: determining INTENTIONS. I would see a woman with a bike and keep an eye on her, prepared to react if she moved into the road. I don't wait until she's IN the road so I'm ready to react in time to dodge. No self-driving car can do that, nor will they in the foreseeable future.

        Humans can also interpret events that cars cannot. A ball rolls into the street - you know a child might follow it, the car would not expect that. There's a dog beside the road - is it preparing to cross the road, eating a french fry, or sniffing a signpost? You can interpret its behavior pretty closely and be prepared.

        Game, set and MATCH to humans. Sometimes we're stupid or distracted, but our greater understanding of the world means we're still safer.

        1. Dr. Mouse Silver badge

          Re: Humans see intentions, cars only react afterward

          Humans excel at something the algorithms utterly cannot do: determining INTENTIONS. I would see a woman with a bike and keep an eye on her, prepared to react if she moved into the road.... Humans can also interpret events that cars cannot. A ball rolls into the street - you know a child might follow it, the car would not expect that.

          While I agree with your point that the original poster should stop driving if he thinks no human operator could have avoided this crash, I disagree that algorithms can't do the things you say they can't.

          With the correct design, all of the things you mention should be well within an autonomous vehicle's reach. In fact, they should be better at them. They should be more likely to see the woman at the side of the road, and be prepared for that tiny movement which could indicate she is about to step into the road. They should be able to react more quickly to the ball rolling out into the road, take more appropriate action, and then react more quickly to the child appearing behind the ball.

With the machine learning which is going into this, they should be able to pick all this up very quickly. The default position should be "I don't recognise this situation, I'll take some precautions and be ready in case it turns bad". Once enough data has been provided back to the machine learning algorithm about this kind of event, it will start to have a good idea of how it will turn out and be able to decide for itself the appropriate action to take. When this is pooled from all the cars on the road, it should learn very quickly and be much better than a human driver (or as good as a hypothetical human driver who has driven as much as the thousands of cars on the road have in total, anyway).
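That "default to caution" fallback could be sketched, purely illustratively (none of these names, thresholds, or actions come from any real AV stack; they're hypothetical):

```python
# Illustrative sketch of a "default to caution" policy: when the
# perception system's confidence in its classification of a situation
# is low, slow down and widen margins rather than ignoring it or
# silently handing off to the human. All names here are made up.

def plan_action(recognition_confidence: float, threshold: float = 0.8) -> str:
    """Pick a driving action based on how well the current situation
    is recognised (confidence in [0, 1])."""
    if recognition_confidence >= threshold:
        return "proceed"  # situation well understood, carry on normally
    # Unrecognised situation: take precautions and stay ready to stop.
    return "slow_and_increase_margins"

print(plan_action(0.95))  # proceed
print(plan_action(0.30))  # slow_and_increase_margins
```

The point of the sketch is the shape of the policy, not the numbers: the unrecognised case defaults to doing *something* cautious, which is the opposite of what Uber's system reportedly did.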

However, the caveat here is "with the correct design". What Uber seem to have done is definitely not this. They have gone for convenience over safety. Their default position is "I don't recognise this situation, I'll just ignore it and let the meatbag deal with it (and not tell them about it)". When combined with the fact that the meatbag must enter data during the journey, taking their concentration completely off the road to do so, it was only a matter of time before this happened.

          1. Roland6 Silver badge

            Re: Humans see intentions, cars only react afterward

            With the machine learning which is going into this, they should be able to pick all this up very quickly.

            What machine learning?

From what I've seen these are constraint-based algorithms, not neural networks, that get tweaked by developers - developers who are too scared to sit in the cars that use their algorithms...

            I suspect if the developers were compelled to sit in the cars during these trials then perhaps they might develop self-driving cars that were safe, or find another job...

            1. Anonymous Coward
              Anonymous Coward

              Re: Humans see intentions, cars only react afterward

              "I suspect if the developers were compelled to sit in the cars"

              Roller coasters do it right. First they test with sandbags, then the software engineers ride.

        2. Alan Brown Silver badge

          Re: Humans see intentions, cars only react afterward

          "A ball rolls into the street - you know a child might follow it, the car would not expect that. "

This happened to my mother. She stopped. The driver coming the other way didn't. The kid got minor injuries and the driver got prosecuted for careless driving.

          SOME humans would expect it, _with_ training. Once programmed to expect it, ALL cars with the same programming will anticipate the child.

          The problem is the programming, not the computer. Google gets it right. It's perfectly possible to predict where a human (or animal) will walk based on where they're looking and where they're currently heading. Further, a robot is good at continuously looking for stuff a human might miss, like movement/feet visible in the gap under the parked cars ahead.

The fault in this case lies in USA car-centric laws that almost always (it's a state-level thing) say that pedestrians WILL give way to cars and/or WILL NOT cross the road except at designated locations, sometimes only when permitted to. When you program based on those assumptions being canonical, you have a robotised killing machine mowing down everything in its path.

        3. Anonymous Coward
          Anonymous Coward

          Re: Humans see intentions, cars only react afterward

          "Speak for yourself, and please stop driving! I posted about this yesterday."

One of the problems we face with drivers is that they drastically overestimate their abilities. In this instance you would not have seen the woman with the bike and been prepared to react if she moved into the road. Whilst the LIDAR saw the cyclist 6 seconds before impact, looking at the video frame by frame a human driver would have had 1.5 seconds from the first glimpse of the lower part of the wheels, and a little under that to realize it was a cyclist. The very best a prepared, alert driver can react and brake is 0.7 seconds. In accident reconstruction it's normal to assume 1.5 seconds. A controlled study found that the average is 2.3 seconds.
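At 43 mph those reaction times translate into a lot of tarmac covered before the brakes even touch. A quick back-of-envelope check:

```python
# Distance covered during driver reaction time at 43 mph,
# for the three reaction-time figures quoted above.
MPH_TO_MS = 0.44704
speed = 43 * MPH_TO_MS  # ~19.2 m/s

for label, t in [("best-case alert driver", 0.7),
                 ("accident-reconstruction norm", 1.5),
                 ("measured average", 2.3)]:
    print(f"{label}: {t} s -> {speed * t:.1f} m before braking starts")
# best-case alert driver: 0.7 s -> 13.5 m before braking starts
# accident-reconstruction norm: 1.5 s -> 28.8 m before braking starts
# measured average: 2.3 s -> 44.2 m before braking starts
```

So an average driver covers over 44 metres before doing anything at all, and that's before any actual braking distance is added on.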

          You "might" have been able to avoid that cyclist, but real world data suggests that the vast majority of drivers would not have done so.

          1. Roland6 Silver badge

            Re: Humans see intentions, cars only react afterward

            > looking at the video frame by frame an human driver would have had 1.5 seconds from the first glimpse

You've not been keeping up with the discussion: the video, being from a typical cheap digital camera, is not representative of the actual lighting level on that road at that time of night. Additionally, even the videos that show a more accurate lighting level don't take into account what a night-adjusted human eye can see.

            However, in saying this, it is clear that for the Uber 'driver'/passenger to be doing work, they would need additional illumination inside the car, which would mean their night vision would be at best on a par with the official video footage. [Aside: It amazes me how many people in the UK drive at speed with their interior lights on and/or a large bright (not light level adjusted) SatNav screen in their normal field of vision, significantly impairing their night vision, particularly in areas of no streetlights - like motorways and dual carriageways; I suspect they are drastically overestimating their abilities...]

            >Whilst the LIDAR saw the cyclist 6 seconds before impact

This should be raising serious questions! Given where on the road the accident happened, and the speed and direction in which the pedestrian was walking, there is no way the person could have moved from a pavement/sidewalk without being seen much, much earlier! So immediately we know that either the LIDAR was very poorly set up or a p*ss-poor LIDAR detector was being used - I suspect both, particularly when you compare the official video with third-party videos and overlay the information to resolve the blurred lighting - the LIDAR was most probably getting confused resolving light sources and distances.

            Interestingly, this case also shows that the algorithms being used by Uber do not track 'objects of interest' and determine whether their projected trajectory might result in a collision. Because then it would have seen the person step off the sidewalk...

      3. 2Nick3 Bronze badge

        "A normal human driven car would still have run over the cyclist/pedestrian/lemming so I don't see how they could have caused the death."

        A normal Volvo would have engaged the emergency braking feature, which was disabled by Uber. They turned off the feature that was designed to help in this exact scenario - distracted driver, low visibility, object/person moving into the path of the car.

I'm not a big supporter of all of these "take responsibility off of the driver" technologies, as they aren't common across all cars, and drivers don't adjust well when they don't have them. I've been in a rental car with a driver who, because there wasn't a light on the side mirror telling him there was someone in his blind spot, changed lanes without looking. The other driver's reaction saved us. He was used to the feature in his own car, and the rental didn't have it. Having that "don't change lanes" light created a new behaviour in him, so when the light wasn't there he didn't react properly.

        1. Alan Brown Silver badge

          "Having that "don't change lanes" light created a new behavior in him"

          Why? That light is an AID and is not a legal substitute for turning your head and looking.

          If you change lanes without checking during a driving test in most countries (mirror AND head turn) it's an automatic fail. If you get caught doing it after you get your license, it's a prosecution for careless driving regardless of the existence of a blind spot warning light.

      4. 404 Silver badge

        "A normal human driven car would still have run over the cyclist/pedestrian/lemming so I don't see how they could have caused the death."

No. That road is brightly lit, video notwithstanding; I've driven down there myself at night.

        If you've ever been to Scottsdale or Mesa, they have money there, lots of it, and they light up the roads.

    3. anothercynic Silver badge

@Martin-73, I'd second this, although with the caveat that the programme director who chose not to include a second person in the car (hence burdening the driver with the touchscreen entry) should be prosecuted rather than the driver.

  2. Brian Miller Silver badge

    State of pedestrian irrelevant

    The pedestrian does not control the car. It is up to the driver to do so. Since the pedestrian was detected six seconds before impact, the car should have been slowing, if not actively applying the brakes.

    1. This post has been deleted by its author

    2. John H Woods Silver badge

      Six seconds at 43mph (18m/s) ...

... is plenty of time to slow to a dead stop at 0.3G; that's hardly emergency braking, and how the hell would it be 'erratic behaviour' ... slowing down, even gradually, to avoid impact? Are they really prioritizing a "smooth ride" over not actually bumping into stuff?

If Uber want to spend 4 seconds waiting for the driver to do something (why?), they could still have dead-stopped at 0.9G in the last 2 seconds which, though uncomfortable, was still easily doable (especially in the prevailing road conditions). Even a single second of full braking would probably have avoided the fatality, and anyone with more than half a brain must know that once you are in the last second there is ZERO chance of effective back-up driver intervention, and the system might as well do the very best it can from that point on.
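Those figures check out. Using the comment's own rounding of 43 mph to 18 m/s and assuming constant deceleration:

```python
# Constant deceleration (in g) needed to stop from ~43 mph
# in a given number of seconds; sanity-checks the 0.3G / 0.9G
# figures quoted above.
G = 9.81   # m/s^2
v = 18.0   # 43 mph, rounded as in the comment (m/s)

def decel_in_g(time_to_stop: float) -> float:
    """Deceleration needed to stop from v in the given time."""
    return (v / time_to_stop) / G

print(f"stop in 6 s: {decel_in_g(6):.2f} g")  # 0.31 g - gentle braking
print(f"stop in 2 s: {decel_in_g(2):.2f} g")  # 0.92 g - hard but achievable
```

0.3G is roughly the deceleration of a firm stop at traffic lights; 0.9G is near the grip limit of ordinary tyres on dry tarmac, but still within what a modern car's ABS can deliver.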

      The rental car I'm currently driving can already stay in lane; follow the vehicle in front at a set distance if that is less than the set cruising speed; and still brake the car hard if there's an object in front of it (while displaying a BRAKE NOW message and sounding a shrill alarm). This vehicle, in that mode, is doing more "self driving" than Uber was doing on that day, and if I hit a pedestrian because I was, say, responding to a new route suggestion on the Sat Nav, nobody would accept the excuse "but this car is supposed to be self-driving"

      Uber basically put a person in a car that doesn't really qualify as self driving, told them it was self driving, and crossed their fingers. Absolutely disgusting behaviour, even by their own bottom of the barrel standards.

      1. SiFly

        Re: Six seconds at 43mph (18m/s) ...

        Also, you know, the horn could have been used to signal to the cyclist ... why didn't the system activate the horn ....

        1. Anonymous Coward
          Anonymous Coward

          Re: Six seconds at 43mph (18m/s) ...

          "why didn't the system activate the horn "....

          I see. So you are from the school of driving who sounds the horn if they see something in the road, then mow it down six seconds later when it hasn't obligingly moved out of the way.

This was not an emergency situation where the cyclist needed to be informed about an approaching car they were unaware of. It was a situation where the car needed to stop, avoid the cyclist, or inform the human "driver" in adequate time. There are better ways of alerting the human driver than sounding the horn.

      2. SiFly

        Re: Six seconds at 43mph (18m/s) ...

It occurs to me that maybe the Uber software was asking the driver what was in front of it on the road?

      3. John Brown (no body) Silver badge

        Re: Six seconds at 43mph (18m/s) ...

        "Uber basically put a person in a car that doesn't really qualify as self driving, told them it was self driving, and crossed their fingers. Absolutely disgusting behaviour, even by their own bottom of the barrel standards."

It also raises the point that nothing less than a fully autonomous car is going to be safe on the roads. If an inattentive "driver" is supposed to take over in an emergency in anything less than a Level 5 system, and this incident demonstrates clearly that a properly trained person employed to monitor the car's driving can't react in time, what likelihood has Joe Public got of being better? IIRC it's already been demonstrated that keeping a human's attention on the road while not in control is difficult, and the delay between not being in control and taking control is significantly longer than for someone driving manually.

        1. Alan Brown Silver badge

          Re: Six seconds at 43mph (18m/s) ...

          "It also raises the point that nothing less than a fully autonomous car is going to be safe on the roads."

          That's the point that Google's been making for quite a while.

        2. Dr. Mouse Silver badge

          Re: Six seconds at 43mph (18m/s) ...

If an inattentive "driver" is supposed to take over in an emergency in anything less than a Level 5 system, and this incident demonstrates clearly that a properly trained person who is employed to monitor the car's driving can't react in time, what likelihood has Joe Public got of being better?

But, as has been pointed out by others, she wasn't being attentive at the time. She was filling in data which Uber had requested. Her attention was completely off the road at that point, something which shouldn't be happening without a fully autonomous system.

    3. Tigra 07 Silver badge

      Re: Brian Miller

I'd like (probably being the first) to point out how well this system appears to work. I couldn't spot that cyclist until probably half a second before impact. If the emergency braking hadn't been turned off (by some idiot) then this would have been a good example of how well driverless cars can save lives.

      1. Anonymous Coward
        Anonymous Coward

        Re: Brian Miller

        @Tigra 07

        The footage shows what was captured by the front-facing camera, which is effectively blind in the night time conditions at the time of the incident (think taking night time shots on a cheap/few years old smartphone). If this had been the actual sensor info that the car was using to drive it would have had to be banned from night-time driving (or more likely any type of driving).

        According to most reports a human driver with normal vision would easily have spotted the cyclist in time to avoid an incident, as indeed the self-driving car's LiDAR did. The only difference between the two is that the human driver would most likely have done something to avoid colliding with the cyclist, whereas the self-driving car was programmed not to, so it just mowed her down. You're going to have to look elsewhere to find an example of a self-driving car even potentially saving a life.

      2. steelpillow Silver badge
        FAIL

        Re: Brian Miller

        "I'd like to (probably being the first) to point out how well this system appears to work. I couldn't spot that cyclist until probably half a second before impact."

        YouTube is a poor substitute for being there. Having nearly hit several cyclists during my many decades of driving in all conditions, and having been actually hit by a car and narrowly avoided others during my cycling years, I can vouch that cyclists are a darn sight more visible than YouTube would suggest. I once stopped dead in a one-way street because a dozy kid stepped out right in front of me without looking. But I was alert and watching the pedestrians, I stopped literally with my front wheel over his toes but not quite touching. The first thing he saw was my front wing where his legs ought to be. Oh, the comical expression of surprise on his face! Had he been a cyclist high on drugs, I would still have been ready. OTOH I confess I once nearly killed a cyclist, but that was because I was in a hurry and broke the rules to (literally) cut a corner. Luckily the cyclist was alert and managed to avoid me. So I have been there from all sides.

        A vehicle driver must always drive so that they can stop in the distance they know to be clear. If the conditions are too dark to see clearly then you must slow down until you can stop within your own narrow puddle of light. If you do not slow down in poor visibility then you are driving dangerously.

        OK not everybody is that competent a driver, but that is just the kind of human failing that auto systems are intended to fix. But Uber baked the shitty drivers in by design. That is a FAIL in any language.

        1. Will Godfrey Silver badge
          FAIL

          Re: Brian Miller

          There are several youtube vids people took of that same route under the same conditions. You can see the road very clearly, and would have to be blind not to see a pedestrian with a bicycle.

      3. Alan Brown Silver badge

        Re: Brian Miller

        " I couldn't spot that cyclist until probably half a second before impact"

        As others have pointed out - that camera footage is doctored and the road is well lit. You would have seen the pedestrian the moment she stepped onto the road.

        Nonetheless, even if the road was pitch black (as the roads I drive at night are) and the camera was accurate, you would still see the pedestrian _at least_ 2-3 seconds out.

        Most dashcams at night have utterly abysmal sensitivity when there's no external lighting. The Uber footage is on par with my cameras on the roads I drive, which have no lighting whatsoever except the car's headlights. Even then I can see at least 2-3 seconds ahead. As a number of commentators have pointed out the road is required to be lit ahead by headlights for _at least_ the vehicle's stopping distance.

        I believe Uber was attempting to fool people into believing the road was unlit/poorly lit.

It fooled me until I saw other footage, because what I saw was on par with what my cameras would show - but even then I know from experience that the pedestrian would have been driver-visible long before the camera picked her up (she would have already been clearly visible to a driver in that frame where her shoes _begin_ to show up in the camera), and as such I was wondering why the car or safety driver hadn't reacted, especially as it was clear she'd already crossed at least one traffic lane and as such had been clearly visible for some time.

What the footage showed immediately was the lie of the "the pedestrian jumped out of the bushes" narrative being put out by the local sheriff's department, which makes you wonder _why_ they made that statement and why no further investigation of it has happened.

    4. Paul 195
      Flame

      Re: State of pedestrian irrelevant

      I couldn't agree more with you.

      This: "Toxicology results showed she [the pedestrian] tested positive for methamphetamine and marijuana." seems completely irrelevant in view of the fact that she was run over by a vehicle that had plenty of time to stop. But the cynic in me says Uber will at some point try to spin this information to deflect public approbation away from them.

      They have form in this area: https://gizmodo.com/uber-settles-lawsuit-alleging-it-obtained-rape-victim-i-1821156541

  3. SVV Silver badge

    It's not a bug, it's a feature

Nice to hear that old chestnut again. A whole 19 minutes before it killed someone. Care to inform us what you consider an acceptable MTBF (mean time between fatalities) to be before you continue testing your invention that you've invested heavily in to make you all rich?

    1. John Smith 19 Gold badge
      Unhappy

      Re: It's not a bug, it's a feature

      A feature that kills pedestrians.

      Which (inside the head of Uber executives) means "Not anyone who pays us any money."

      That about right?

      I'd say that makes the Uber C-suite a bunch of Unters

      1. Stoneshop Silver badge

        that makes the Uber C-suite a bunch of Unters

        I see what you did there

  4. 404 Silver badge

    Sooo.... wait...

    Uber turned off the emergency braking portion of the 'AI', and of all the options it probably should have been programmed with (avoidance, *some* braking, big fucking light/sound screaming at the driver, etc)..

    This van digitally went 'KIIIIIIYAAAA!' and ran the human over... Uber Fuck It Option. Damn.

    0_o

    1. DougS Silver badge

      Re: Sooo.... wait...

A warning for the driver wouldn't have helped: since she was distracted by having to type crap into her computer (per Uber job requirements - she wasn't playing around on Facebook), there's no way her reaction time would have been sufficient to stop the car in only 1.3 seconds. Humans can't context-switch from a completely different task that quickly.

    2. Anonymous Coward
      Anonymous Coward

      Re: Uber turned off the emergency braking portion of the 'AI'

No, worse. They left that on but disconnected the wires. The car "knew" it needed to break. Uber prevented it from doing so. If they had disabled that part of the AI, they would have been able to hide their negligence, but here it is made obvious.

      1. Alan Brown Silver badge

        Re: Uber turned off the emergency braking portion of the 'AI'

        "The car "knew" it needed to break."

        It did break - the pedestrian. Because it failed to brake, thanks to that being disabled.

    3. John Smith 19 Gold badge
      Unhappy

      "This van digitally went 'KIIIIIIYAAAA!' and ran the human over.."

      Ahh. The light dawns.

      They plan to offer this as an option for military vehicles in foreign war zones.

      Where "Put the foot down and run the furriner over" is not so much a crime as SOP.

    4. PickledAardvark

      Re: Sooo.... wait...

      When I first read The Reg's report, I thought there had been some misunderstanding or that the NTSB had fumbled the wording. I read a few other reports which clarify the NTSB report slightly but confirm that the Uber had no automatic emergency braking system.

      I can understand why Uber disabled the manufacturer's safety systems in the Volvo XC90. It wouldn't have been a good idea for the car to have two independent systems driving the car; it's not comparable to multiple linked systems which are used to fly aircraft. Everything else is inexplicable to me.

      1. Cuddles Silver badge

        Re: Sooo.... wait...

        "I can understand why Uber disabled the manufacturer's safety systems in the Volvo XC90. It wouldn't have been a good idea for the car to have two independent systems driving the car"

        I don't really see what the problem would be if the systems have entirely separate functions anyway. You have Uber's system doing the actual driving part, plus an emergency override that only comes into play to stop the car if the main system screws up. The excuse of "avoiding potential erratic behaviour" simply doesn't add up - automatic braking is installed in commercially available cars, if it was erratically hitting the brakes all the time, they'd be off the road in no time and Volvo would be facing massive fines for putting such dangerous cars on the road to start with. As it is, I'm not aware of a single incident caused by auto-braking misfiring, and you can guarantee the media would be all over it if it actually happened.

        Long story short - Uber are lying. Volvo's braking system is not an erratic mess that would interfere with their own system; the only way it could cause problems is if they've actively programmed their cars to drive in a dangerous manner that would trigger emergency stops. As with so many of Uber's baffling decisions, the only possible conclusion seems to be that they're either terminally stupid or actively malicious, and quite possibly both.

        1. PickledAardvark

          Re: Sooo.... wait...

          "I don't really see what the problem would be if the systems have entirely separate functions anyway. You have Uber's system doing the actual driving part, plus an emergency override that only comes into play to stop the car if the main system screws up."

          I think you misunderstand the concept of "testing". Uber's cars are on the road to test their AV functionality. One test of Uber's AV functionality is to avoid an accident. Not Volvo's.

          Separate functions? The Volvo emergency braking system might have used its sensors to detect a potential collision and used modulated braking in a straight-ish line to stop. An AV system might detect a hazard and move right or left to a different lane, perhaps braking earlier and approaching a hazard at lower speed. Let's try both at once, with two systems on the brake pedal and steering wheel -- or three if the human operator gets involved.

          1. Stoneshop Silver badge
            Facepalm

            Re: Sooo.... wait...

            An AV system might detect a hazard and move right or left to a different lane, perhaps braking earlier and approaching a hazard at lower speed.

            If the AV system was already braking and/or moving to change lanes, Volvo's system would not have needed to act. Unless the braking/swerving was insufficient, and Volvo's system was still detecting the pedestrian as an obstacle in the path of the vehicle at the moment it should start to act to avoid hitting them.

            It's a backup system designed to act in case other actions look to be insufficient. Normally those actions are by the driver, but as the AV is taking that place it should be a backup for that too. Unless the AV has proven collision avoidance routines, and even then I don't see why such a backup should be disabled. You can prove the collision avoidance on a test track, randomly shoving all kinds of objects to be avoided as well as objects you can just run over with impunity, out into the path of the cars being tested. Under all lighting circumstances, all kinds of weather, all kinds of road conditions. And then you go out on the road in the real world with an attentive driver. If there's a need to enter manual logging during the test drives, then a second person should do that.

          2. Cuddles Silver badge

            Re: Sooo.... wait...

            "Separate functions? The Volvo emergency braking system might have used its sensors to detect a potential collision and used modulated braking in a straight-ish line to stop. An AV system might detect a hazard and move right or left to a different lane, perhaps braking earlier and approaching a hazard at lower speed. Let's try both at once, with two systems on the brake pedal and steering wheel -- or three if the human operator gets involved."

            I think you've rather missed the point. The entire problem is that there is no AV system; Uber have not designed or installed any way for the car to make emergency stops, and they require the human "operator" to focus on tasks that prevent them taking any action themselves. It's not a question of having two or three separate systems all trying to do the same thing, they've simply disconnected the existing system and left it at that.

  5. imanidiot Silver badge

    Just for this:

    Instead, according to Uber, “the vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator,”

Uber should be banned from ever developing a self-driving vehicle again. The sheer stupidity of this decision is mind-boggling.

    1. Notas Badoff Silver badge

      Re: Self-driving, not self-stopping

      They developed a self-driving car, and it was. They disabled the self-stopping feature. *That* was plain bonkers.

  6. Z80

    If I'm reading that report correctly, the woman was hit by the right side of the front of the vehicle after the human driver made a steering input to the right. I guess she didn't have enough time to assess which direction the pedestrian was moving after finally noticing her on account of not paying enough attention.


Biting the hand that feeds IT © 1998–2019