Uber won't face criminal charges after its robo-car killed woman crossing street

This month last year, one of Uber's self-driving cars operating in autonomous mode hit and killed Elaine Herzberg as she walked a bicycle across a road at night in Tempe, Arizona. The deadly crash is believed to be the first pedestrian death attributable to an autonomous vehicle. This week, the Yavapai County Attorney’s Office in …

  1. PC Paul

    What? The car can't do emergency braking on it's own?

    I find it hard to believe the system is set up so that the car cannot emergency brake on its own: instead it waits until it decides braking is needed, 1.3 seconds before impact, *then* tells the 'safety driver' 'Hey mate, you need to mash the brakes' - 'Oh. Too late.'

    There's no way that was ever going to work in real life, even with an attentive driver.

    1. Alister Silver badge

      Re: What? The car can't do emergency braking on it's own?

      *then* tells the 'safety driver' 'Hey mate, you need to mash the brakes'

      Not even that. From the article:

      the car's systems weren't set up to alert the driver of the need for intervention

      So the computer identified it needed to brake, logged it, then carried on.

      1. Chris G Silver badge

        Re: What? The car can't do emergency braking on it's own?

        I seem to remember that Uber had disabled the emergency braking for some reason so both they and the driver ought to be liable.

        1. vtcodger Silver badge

          Re: What? The car can't do emergency braking on it's own?

          I don't think Uber got a pass on liability. Liability is a civil issue, not a criminal one. That is to say, Uber can presumably still be sued for a zillion dollars or so.

          That said, I have trouble seeing why, from the information presented, Uber is not guilty of negligent homicide or involuntary manslaughter or some such. They killed someone dammit. In an entirely foreseeable fashion. In what way are they not criminally responsible?

          1. werdsmith Silver badge

            Re: What? The car can't do emergency braking on it's own?

            That said, I have trouble seeing why, from the information presented, Uber is not guilty of negligent homicide or involuntary manslaughter or some such

            The reason that they are not guilty is because they are not called "Huawei".

            1. Archtech Silver badge

              Re: What? The car can't do emergency braking on it's own?

              There does seem to be an overriding principle that corporations can't be found responsible, so the search is on for the nearest individual citizen to dump on.

              1. Anonymous Coward
                Anonymous Coward

                Re: What? The car can't do emergency braking on it's own?

                ...and this is why I don't want self driving cars on any streets where I live or might travel.

                Even if a case were brought against Uber they'd just throw up a wall of lawyers, and perhaps after 15 years of appeals, and appeals of those appeals, a deal would be struck where they pay a small fine.

                Someone was killed here due to the incompetence of both Uber and the "overseer", and both need to be held to account.

        2. I ain't Spartacus Gold badge

          Re: What? The car can't do emergency braking on it's own?

          I seem to remember that Uber had disabled the emergency braking for some reason so both they and the driver ought to be liable.

          As I understand it, that Volvo model has emergency braking as one of its normal safety features. Which Uber had disabled as they wanted their computer in control. Which makes sense - you don't want two separate computers in charge of the brakes - plus you're testing your computer.

          Of course the separate issue is that Uber had also disabled emergency braking in their own system, because it was basically shit and didn't work properly.

          For some reason they'd also turned off notification to the safety driver - that could be a design issue, incompetence, a setting that got turned off accidentally, or some other set of complicated reasons. Maybe as simple as the warning system was going off so often that people were just turning it off.

          The reason that Uber should be prosecuted in my opinion is that their system is a pile of shit. Plus they'd set up housekeeping stuff for their "safety driver" to do on the computer screen while they were supposed to be driving the fucking car. Thus deliberately distracting them from the already dangerously boring job of watching the road and waiting to intervene. And for that negligent design alone, someone should be seriously punished.

        3. LucreLout Silver badge

          Re: What? The car can't do emergency braking on it's own?

          both they and the driver ought to be liable

          Agreed. It's obvious from the in car video that the driver simply isn't paying attention. It's impossible to be ready to take control if you're not paying attention to what is happening outside the vehicle and are dedicating almost all your attention to operating a computer.

          I don't see this any differently in terms of liability whether the driver was using the laptop for "teaching" the self-driving car or for playing Pac-Man. It's clearly dangerous driving and, in UK law terms, this seems to me an obvious case of causing death by dangerous driving, had the incident occurred here.

      2. Mark 85 Silver badge

        Re: What? The car can't do emergency braking on it's own?

        In all the articles on this, none have stated that the driver was aware that the "panic"/"emergency" brake system wasn't operational. If the driver wasn't aware, then there's grounds for Uber to be sued I'd think.

        1. macjules Silver badge

          Re: What? The car can't do emergency braking on it's own?

          But are we saying that the "driver" was not able to operate the vehicle in any way? I would have thought that it would be a common human reaction to think, "the car is going to hit that lady, I had better take over control".

          1. Jimmy2Cows Silver badge

            Re: What? The car can't do emergency braking on it's own?

            Indications are the driver was watching some streaming thing from Hulu at the time of impact, instead of watching the road.

            This at least means she wasn't doing her job properly. Negligence leading to death suggests she is personally liable. If Uber expected her to do housekeeping tasks instead of watching the road, Uber should reasonably be considered liable too.

      3. StephenH

        Re: What? The car can't do emergency braking on it's own?

        "So the computer identified it needed to brake, logged it, then carried on."

        As long as the documentation is complete and accurate, it will probably pass the QA audit

        1. Frank Bitterlich
          FAIL

          Re: What? The car can't do emergency braking on it's own?

          function impact_pending() {
              // apply_emergency_brakes();
              // 2014-03-17/JD - Disabled because it activates too often.
              ///@todo IMPORTANT - Fix before public release!
          }

      4. Steve 53

        Re: New???

        Well, there is a clear reason for this. If you've not got your algorithm right yet and KNOW that it's skittish with the emergency braking, then that IS a good reason to disable said system. If the system incorrectly performs emergency braking then you're very likely to end up with a car in the back of you (or, if it brakes correctly, a car in the back of you is at least the lesser of two evils).

        So you set it to log, you put cars on the road and you gather data. Once you have data you can refine your algorithm and get more data. Once you get it right you actually turn on the system.
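The log-don't-act approach described above is essentially "shadow mode" testing. A minimal sketch of the idea in Python - `ShadowBrakeLogger`, `on_frame` and the 1.5 s threshold are all made up for illustration, not anything from Uber's actual stack:

```python
from dataclasses import dataclass, field

@dataclass
class ShadowBrakeLogger:
    """Run the emergency-brake decision in shadow mode: record every
    would-have-braked event instead of actuating the brakes."""
    threshold_s: float = 1.5          # time-to-collision that would trigger braking
    events: list = field(default_factory=list)

    def on_frame(self, t: float, time_to_collision: float, driver_braked: bool):
        if time_to_collision < self.threshold_s:
            # In production this branch would apply the brakes;
            # in shadow mode we only log the decision for offline review.
            self.events.append({"t": t, "ttc": time_to_collision,
                                "driver_braked": driver_braked})

    def false_positive_rate(self) -> float:
        """Fraction of triggers where the human saw no need to brake."""
        if not self.events:
            return 0.0
        fp = sum(1 for e in self.events if not e["driver_braked"])
        return fp / len(self.events)
```

You drive the fleet around, compare the logged triggers against what the safety driver actually did, and only enable actuation once the false-positive rate is acceptable.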

        The problem here is us meatbags - the safety driver is there to deal with these situations, but if a machine only needs intervention on rare occasions then the job is boring as hell and you're likely to piss about with your phone. And of course the squashed meatbag put themselves in danger by crossing the road without checking said road was actually clear. I could do that on the main road 200m away from my house and end up squashed even with no robodrivers involved.

        If Uber didn't tell the safety drivers quite how critical their role is, then that is of course an error on their part - but the safety drivers know they're ultimately responsible for the car's safety...

        1. Doctor Syntax Silver badge

          Re: New???

          "If the system incorrectly performs emergency braking then you're very likely to end up with a car in the back of you"

          Only if the car behind is too close in which case that's the car behind or its driver's responsibility (depending on whether the car behind was also "autonomous").

          1. Steve 53

            Re: New???

            Legally, yes. Practically people don't leave enough space for a completely unjustified emergency stop

            1. Jimmy2Cows Silver badge

              Re: New???

              Speak for yourself much?

              1. NotBob
                Paris Hilton

                Re: New???

                Ever driven in city traffic? Get out away from those cows and you'll see the other person is speaking for a lot of drivers. Even Paris could see that much.

              2. Anonymous Coward
                Anonymous Coward

                Stopping distance and other tangents to original story

                Drivers leaving insufficient stopping distance from the vehicle in front is sadly very common - and also very dangerous.

                If I find myself being tailgated, I very gently ease off the throttle until the vehicle behind wakes up and drops back enough for it to be safe to go faster, or it overtakes. If they do neither I indicate, pull in to allow them to pass, then continue.

                Taking a little longer to get where you're going is way better than being hit at speed from behind because you've had to do an emergency stop (personal experience) or hitting a pedestrian by being shunted into them (avoided so far, though have seen it happen).

                Incidentally, if you want to know how to really stop a vehicle on a sixpence, check out cadence braking, then be very careful how you use it. Be warned though that some ABS brakes may stop you using this. Some of these systems are worse than useless, including those fitted to at least some Renault Meganes.

                Experience here was in a borrowed car, having to brake hard to avoid an idiot turning right in front of me. The road was dry and the tyres good, but the ABS defeated the brakes way short of wheel lock and I missed the idiot by only inches, where I'd have had yards without the ABS.

                1. Anonymous Coward
                  Anonymous Coward

                  Re: Stopping distance and other tangents to original story

                  I don't, I touch the brake pedal enough to activate the lights but not slow the vehicle. Had someone lock up their wheels on a motorway once because I was (correctly) in a segregated T2 lane and they were trying to use it to overtake, tailgating me like an Audi driver right at the speed limit. I couldn't legally move over nor speed up, so I touched the pedal and watched them shit themselves as smoke came off their tyres. They learned their lesson and didn't repeat it.

                  Tailgating is one of the most dangerous acts carried out almost systematically by the driving public. In some cases you don't even need to be doing what you consider tailgating - if you're driving behind a brand new 911 in an entry-level car, you should understand the big difference in stopping distance between the two vehicles. The road rules generally state the gap should be sufficient to safely stop your vehicle without battering the one in front.

                2. Mark 65

                  Re: Stopping distance and other tangents to original story

                  I'd imagine EBD and EBA on most vehicles would make up for your perceived shortfall in ABS capability. Generally if the vehicle didn't have ABS it wouldn't have the other two. Remember ABS is there to allow you to manoeuvre the vehicle under heavy braking and to prevent the lock-up and skidding that actually extend the braking distance. From Wikipedia:

                  "ABS is an automated system that uses the principles of threshold braking and cadence braking, techniques which were once practised by skilful drivers before ABS braking systems were widespread. ABS operates at a much faster rate and more effectively than most drivers could manage. Although ABS generally offers improved vehicle control and decreases stopping distances on dry and some slippery surfaces, on loose gravel or snow-covered surfaces ABS may significantly increase braking distance, while still improving steering control"

                  https://en.wikipedia.org/wiki/Anti-lock_braking_system

                  On your dry road, the ABS likely did better than you'd otherwise have done.

        2. Archtech Silver badge

          Re: New???

          "If you've not got your algorithm right yet and KNOW that it's skittish with the emergency breaking, then that IS a good reason to disable said system".

          No. It's a good reason to withdraw the car from public roads, while you FIX the "skittish" braking problem.

    2. Steve Channell
      Facepalm

      Re: What? The car can't do emergency braking on it's own?

      It's amazing how utterly misguided the whole autonomous car business currently is.. driven by completely the wrong use-cases

      Instead of trying to be just like a regular car, actual autonomous vehicles will drive much slower on normal roads, with the focus being on productive use of time rather than reduced journey time. Smooth but slower vehicles will allow people to work, watch a film, or sleep whilst in the pod... much like a commuter train.

      1. Warm Braw Silver badge

        Re: What? The car can't do emergency braking on it's own?

        sleep whilst in the pod.. much like a commuter train

        When did you last commute by train?

        1. Stoneshop Silver badge

          Re: What? The car can't do emergency braking on it's own?

          When did you last commute by train?

          I did, yesterday. And on Monday. And the entire week before. And the one before that, except Thursday. And so on for at least the past fifteen years.

          Most days I can get half an hour of reading or work done at both ends of the day.

          1. OGShakes

            Re: What? The car can't do emergency braking on it's own?

            I have commuted to London daily for a long time and get 10 min on the train before someone else's bum is in my face. The return journey is just as bad, with the bonus that by the time the train is empty enough for me to get my book out I have to get off. It depends on your train. On the plus side, getting to central London by car takes even longer, so any idiot who thinks they are better off driving always arrives in a stressed state. I guess self driving cars would make them not stressed from other drivers, but they would still be another hour on their way in.

            1. MrXavia

              Re: What? The car can't do emergency braking on it's own?

              The crazy thing is I bet it would be cheaper to use a self-driving car to get to work than the train!

              I worked out it would cost me about 1/3 of the price to drive to London daily compared with taking the train! When self-driving cars hit the road fully, they had better build large park & ride car parks at London tube stations!

    3. katrinab Silver badge

      Re: What? The car can't do emergency braking on it's own?

      If you look at stopping distances from back when you were studying for your driving test, the "thinking distance" is 1 foot per mph. Do the sums on that and the thinking time works out to about 0.7 seconds. That's for someone who has situational awareness, which you don't necessarily have in a self-driving car.
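Working that conversion through (1 mph is 5280/3600 ≈ 1.467 ft/s, so a thinking distance of 1 ft per mph corresponds to the same thinking time at any speed):

```python
# Thinking distance rule of thumb: 1 foot per mph of speed.
# Thinking time = thinking distance / speed, which comes out the same at any speed.
FT_PER_MPH = 5280 / 3600          # 1 mph in feet per second, ~1.467

def thinking_time_s(speed_mph: float) -> float:
    thinking_distance_ft = speed_mph * 1.0      # 1 ft per mph
    speed_ft_s = speed_mph * FT_PER_MPH
    return thinking_distance_ft / speed_ft_s

print(round(thinking_time_s(38), 2))   # 0.68 - same for any speed
```

So the rule of thumb encodes a reaction time of roughly 0.68 s for an alert driver; a distracted one will take considerably longer.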

      1. Jimmy2Cows Silver badge

        Re: What? The car can't do emergency braking on it's own?

        Right, so even if the car had alerted the driver at T minus 1.3 seconds, there was barely time to react, never mind stop.

        Suggests the collision detection algorithm is seriously flawed if it can't classify a high-probability collision earlier than the human thinking time required to take action.

        1. Anonymous Coward
          Anonymous Coward

          Re: What? The car can't do emergency braking on it's own?

          Right so even if the car had alerted the driver at T minus 1.3 seconds, there was no time to react, never mind stop.

          Even if the "driver" had insufficient time to stop fully, being alerted and applying the brakes may have reduced the impact speed, possibly turning a fatality into an injury. Not optimal, but a better outcome. If impact is unavoidable, I'd rather be hit by a car going fifteen mph than forty.
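A back-of-the-envelope sketch of that point, using assumed figures (~0.7 s driver reaction, ~0.7 g braking) rather than anything from the investigation:

```python
G = 9.81           # m/s^2
MPH_TO_MS = 0.44704

def impact_speed_mph(initial_mph, warning_s, reaction_s=0.7, decel_g=0.7):
    """Speed remaining at impact if braking starts reaction_s after a
    warning given warning_s before impact."""
    braking_time = max(0.0, warning_s - reaction_s)
    v0 = initial_mph * MPH_TO_MS
    v = max(0.0, v0 - decel_g * G * braking_time)
    return v / MPH_TO_MS

# A 1.3 s warning from 40 mph leaves only ~0.6 s of actual braking:
print(round(impact_speed_mph(40, 1.3), 1))   # ~30.8
```

Even a late warning takes roughly 9 mph off a 40 mph impact, and since kinetic energy scales with the square of speed, the difference in injury severity is larger still.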

  2. Tom Chiverton 1
    FAIL

    " emergency braking maneuvers are not enabled while the vehicle is under computer control"

  3. Anonymous Coward
    Anonymous Coward

    I'm not sure how Uber gets a pass on this. It's their code that caused this woman's death. With insufficient warning for the driver and emergency braking too erratic to use, this software was not ready for real-world road tests.

    1. iburl

      Agree that Uber shouldn't get a pass, but it was less the code, and more the fact that they allowed the code to speed tons of metal and plastic through the city that was the problem.

      1. aks Bronze badge

        I believe the car wasn't speeding.

    2. cd

      Yavapai is a large and relatively backward county of retirees, mostly conservative. They likely do not have the facilities to audit the code. Maricopa County might, but they got off the hook from doing so because of the flimsy conflict-of-interest charade.

      1. Doctor Syntax Silver badge

        "They likely do not have the facilities to audit the code."

        Why should they need to? Car did X. How it did X is irrelevant. The only thing to decide is whether doing X is illegal. What's more, if the driver/minder was an Uber employee, Uber should share responsibility for any actions or inactions on their part.

    3. wolfetone Silver badge

      "I'm not sure how Uber gets a pass on this."

      I could hazard a guess as to why they get a pass on this. I think a lot of people could. But to say why would probably get this comment deleted. Palms were greased, blind eyes have been turned. Move along now, let the poor schmuck "driving" the vehicle carry the can.

      1. Archtech Silver badge

        A land fit for corporations

        The USA is now a country whose only first-class citizens are (American) corporations.

        Human beings - except the very rich and influential - count for much less.

    4. TeeCee Gold badge
      Devil

      That would be Uber's business model.

      Outsource any tax / regulatory / legal liabilities to the grunt on the ground.

      While busy doing this, they also pretend to not run a taxi service as a sideshow.

  4. Anonymous Coward
    Anonymous Coward

    I wouldn't want to be the one to explain this to her family - or the next victim's.

    This is not a video game, people.

  5. DCFusor Silver badge

    The reg wanted to see a police review?

    Of the code? Does anyone honestly think the police can review complex code with any better accuracy than...$many_colorful_expressions_possible?

    They could evaluate the car with the code, put it to some tests, perhaps...at whose expense?

    Liability for issues caused by poor code is an ongoing problem - this merely throws it into sharper relief.

    The corporate veil needs to be pierced in many cases. A fine is often a small cost of doing business.

    1. Irongut

      Re: The reg wanted to see a police review?

      Generally the police get a subject expert to do that for them. You know, like in counterfeit cases, financial cases, etc, etc, etc. They don't actually analyse things like that themselves.

      1. Archtech Silver badge

        Re: The reg wanted to see a police review?

        Yeah - and who evaluates the "subject expert"? Is (s)he chosen on the tried and tested principle of "the bigger the fee, the greater the expertise"?

      2. Ghostman

        Re: The reg wanted to see a police review?

        In a case like this, it would go to the state DOT (Department of Transportation) and jurisdiction would then be handed off to the NTSB (National Transportation Safety Board), the ones who investigate things like complex traffic accidents, airplane crashes, and so forth.

    2. Stoneshop Silver badge
      Boffin

      Re: The reg wanted to see a police review?

      They could evaluate the car with the code, put it to some tests, perhaps...at whose expense?

      Who else but Uber?

      If someone builds or significantly modifies a vehicle (and turning it into an autonomously moving lump of metal is, IMNSHO, at the significant end of significant), it needs to be tested for roadworthiness at the builder's expense, and if it fails and needs to be re-tested after fixes are applied, again at the builder's expense.

      So if whatever department is responsible for the roadworthiness of Uber's flimflam decides to test it, it's Uber who'll have to foot the bill. At most they can say that the bill would be excessive and withdraw the vehicle from the test, but then it won't have a roadworthiness certificate and Uber can't go out and mow down pedestrians with it.

  6. YourNameHere

    Safety driver?

    The interlock was disabled because the safety driver was supposed to be watching and then brake if needed. However, the safety driver was watching The Voice on her cell! If you are going to have a backup, make sure the backup is up and running...

    1. Flocke Kroes Silver badge

      Re: Safety driver?

      It is worse than that. The driver was required to keep a log of what was going on, on a touch screen device. I could tolerate making a verbal record with a hands-free microphone, but not something that requires both hands and eyes.

      The emergency stop detector only looked ahead and did not get any input from the route planning part of the software that knows where the road curves and when to turn. Left to itself, the car would stop before any obstruction on the outside edge of a curved road. Uber's solution was to prevent the emergency stop system from making emergency stops.

      Uber should have kept these vehicles off the road until the emergency stop system could ignore stationary objects beside the planned route while still considering objects moving towards the planned route. Uber should not have instructed their drivers to operate a touch screen device while they are supposed to be watching the road. The driver should have refused to drive and keep a log at the same time.
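The fix described above - giving the braking system the planner's intended path instead of switching it off - can be sketched as filtering detections by their distance to the planned route. A toy model, with made-up names and a made-up 1.5 m corridor, not Uber's real architecture:

```python
import math

def min_distance_to_path(obstacle, path):
    """Smallest distance from an (x, y) obstacle to any planned waypoint."""
    return min(math.dist(obstacle, p) for p in path)

def should_emergency_brake(obstacles, planned_path, corridor_m=1.5):
    """Brake only for obstacles inside the vehicle's planned corridor,
    so a stationary object on the outside of a curve is ignored."""
    return any(min_distance_to_path(o, planned_path) < corridor_m
               for o in obstacles)

# Planned path curves left; an object dead ahead but off the path is ignored.
path = [(0, 0), (5, 0.5), (10, 2), (15, 5)]
assert not should_emergency_brake([(12, -1)], path)   # outside the curve
assert should_emergency_brake([(10, 2.3)], path)      # on the path
```

With something like this, the detector stops crying wolf at roadside furniture while still reacting to anything actually in, or moving into, the car's corridor.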

      Not a legal requirement, but I strongly recommend cyclists fit reflectors between the spokes. I suspect Elaine Herzberg, a pedestrian pushing a bicycle, was hard to see because it was dark and she was not wearing high-visibility clothing. The video of the accident gives some idea of how invisible pedestrians and cyclists are in the dark without plenty of reflectors and lights.

      1. AZump
        WTF?

        Re: Safety driver?

        There is one factor nobody has brought up. The woman was NOT crossing at a crosswalk. from the previous article on this: "The accident happened on Sunday outside of the crosswalk near the intersection of Mill Avenue and Curry Road". That in itself puts all liability on the pedestrian. ...at least in my state it does. Kill someone in a crosswalk, your ass belongs to the prison system. Hit and kill someone crossing outside a crosswalk, sue the family/estate for property damage and mental distress, and *win*.

        Uber should at the very least be required to pull all autonomous cars off the road until they can apply the brakes themselves without human intervention. Hell, damn near every car manufacturer has automatic emergency braking either as standard equipment or as an option. NO EXCUSE for Uber on that one.

        This woman was too lazy to cross the street lawfully and paid the ultimate price. Happens all the time.

        1. Joe W Silver badge

          Re: Safety driver?

          What.

          Broken system at your place, that is. Yes, if you are stupid/lazy (dark clothes, no lights on the bike, crossing where you should not) then part of the liability is yours. I have a hard time imagining all of the liability being attributed to the "weaker" party, i.e. pedestrians or cyclists. Especially if that person gets killed by an inattentive driver.

          1. Fursty Ferret

            Re: Safety driver?

            Different attitude to road safety in the USA. You only have to look at the number of people using their phones while driving to realise that big changes are needed. "Pedestrian at fault" sounds about right for most of the USA, and the rest of it is still fraught, as you try to cross legally against traffic turning right on red and expecting you to jump out of the way.

            1. Archtech Silver badge

              Re: Safety driver?

              Hotly as it may be denied, it seems likely that many drivers unconsciously think of pedestrians as an inferior order of being. The "logic" (if you can call it that) would be that if they can't afford a car, they must be poor - and of course the poor deserve to be poor because they are stupid, lazy, socialist, etc.

              The road - and particularly the area directly in front of a driver's vehicle - is treated emotionally like a wild animal's personal territory. For an inferior being to trespass on it merits the most severe punishment.

              1. John Brown (no body) Silver badge

                Re: Safety driver?

                "Hotly as it may be denied, it seems likely that many drivers unconsciously think of pedestrians as an inferior order of being. The "logic" (if you can call it that) would be that if they can't afford a car, they must be poor - and of course the poor deserve to be poor because they are stupid, lazy, socialist, etc."

                You're right, but I would say it's a blend of selfishness and arrogance. Just look at how some drivers behave in car parks. As drivers, they expect right of way and often go too fast, considering that, by definition, a car park is going to have pedestrians in it. Once they've parked, they demand their pedestrian right of way and suddenly become highly indignant when other drivers treat them in the same way.

            2. OGShakes

              Re: Safety driver?

              Weirdly, in the UK there is a level of liability on the pedestrian: they still need to stop, look and listen before they step out into traffic. If they don't do that, drivers are not at fault outside of a marked crossing! If they are on a zebra crossing then the pedestrian should pause to make sure they are seen by the driver so they can stop, but legally the driver will be considered at fault for hitting them without additional evidence.

              I have been involved in some events where the car insurance industry has started looking at the 'who is liable' side of things for self-driving cars; it is a nightmare and will cause problems for years to come.

              1. DavCrav Silver badge

                Re: Safety driver?

                "Weirdly in the UK there is a level of liability on the Pedestrian, they still need to stop look and listen before they step out in to traffic. If they don't do that drivers are not at fault outside of a marked crossing!"

                False. A driver is at fault if they do not make every effort to stop.

                1. Anonymous Coward
                  Anonymous Coward

                  Re: Safety driver?

                  Also, if I remember correctly the vehicle was travelling a little above the speed limit, which could be the reason why they misjudged the timing of their crossing. If this had been an incident with a human driver I'm sure the police would have been all over that as a reason to assign blame.

        2. DavCrav Silver badge

          Re: Safety driver?

          "Hit and kill someone crossing outside a crosswalk, sue the family/estate for property damage and mental distress, and *win*."

          If that's actually true, your state is a dystopian hellscape.

        3. Baldrickk Silver badge

          Re: Safety driver?

          Hey AZump, you know all those sensors on automated cars. Guess what they're for?

          They don't just turn them on when coming up to pedestrian crossings.

      2. Francis Boyle Silver badge

        Re: Safety driver?

        Spoke reflectors are useless* if you are actually riding the bike. You'd be better off advising pedestrians to wear reflectors.

        *Reflectors and headlights are both strongly directional. If a driver can see the reflectors it's going to be too late to do anything about it.

        1. Stoneshop Silver badge

          Re: Safety driver?

          Reflectors and headlights are both strongly directional.

          Reflectors cast back most of the light in the direction from which they are illuminated. Spoke reflectors and reflective tyre sidewalls happen to be very good at pointing out there's a bicycle ahead of you if you're not approaching it from straight on or behind. Especially since the 'two wheels' shape should immediately clue you in on what that thing ahead of you is.

          And I can see those reflectors well before they're in the actual headlight beams.

          1. Steve 53

            Re: Safety driver?

            Agreed. The wheels are by definition going to be low on the bike and you can see them quite some distance in advance. Reflective sidewalls / reflectors on spokes are a very good way to be seen from sideways, for example at a T junction - bikes don't tend to have sideways lights.

            Unfortunately wearing dark clothes at night with no lights seems rather common around here, and the built-in reflectors are normally pretty high up the bike, or removed because they don't look cool :/ It's led to a few "Where the hell did they come from" moments.

            I do a 50/50 cycle/drive to work, but my bike has 5 lights (2 flash, 2 constant, 1 wheel light - n+1 redundancy!), 2 reflectors and spoke reflectors. You can pick up lights for a couple of quid from Amazon; I don't understand why people are allowed to get away without them tbh. (Take the bikes and crush them)

            1. Francis Boyle Silver badge

              Re: Safety driver?

              Spoke reflectors and reflective tyre sidewalls happen to be very good at pointing out there's a bicycle ahead of you if you're not approaching it from straight on or behind.

              That's the problem. If the bicycle is on the same road as you it's very unlikely you won't be approaching it from behind. And if it's on a cross road you won't see it in time. Just because reflectors are very visible in some circumstances doesn't mean they're actually helping when you need help.

              If you want to use reflectors put them on the pedals. (I wouldn't buy pedals without them.) Put them back and front (legally required where I live). Put them on your wheels if you want. But (1) don't assume they're a substitute for lights and (2) don't think that because you're lit up like the Titanic you're protected (especially at intersections).

        2. aks Bronze badge

          Re: Safety driver?

          She was walking the bike across the road, straight into the path of the vehicle. I can't see how a human driver could have stopped in time.

          1. The First Dave Silver badge

            Re: Safety driver?

            That is pretty much the definition of "too fast" - if you can't stop within the distance you can see, then you are driving too fast, even if that is below the speed limit.
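
            The "stop within the distance you can see" rule can be put into rough numbers. A minimal sketch, assuming a typical 1.5 s driver reaction time and ~7 m/s² of braking on dry tarmac (illustrative figures, not values from the NTSB report):

            ```python
            # Total stopping distance = reaction distance + braking distance.
            # "Too fast" means this total exceeds the distance you can see.
            # The reaction time and deceleration below are assumed typical
            # values, not figures from the Uber/NTSB investigation.

            def stopping_distance_m(speed_kmh, reaction_s=1.5, decel_ms2=7.0):
                """Distance needed to come to a complete stop, in metres."""
                v = speed_kmh / 3.6                  # convert km/h to m/s
                reaction = v * reaction_s            # covered before braking starts
                braking = v * v / (2 * decel_ms2)    # v^2 / 2a, constant deceleration
                return reaction + braking

            for kmh in (30, 50, 70):
                print(f"{kmh} km/h -> about {stopping_distance_m(kmh):.0f} m to stop")
            ```

            Even at urban speeds the total is tens of metres, which is why headlight range and street lighting matter so much to the "driving too fast" question.
            
            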

        3. John Brown (no body) Silver badge

          Re: Safety driver?

          "*Reflectors and headlights are both strongly directional. If a driver can see the reflectors it's going to be too late to do anything about it.

          Only partially true. The OP was talking about side reflectors, which will show from quite some distance if the bike is travelling across the car's direction of travel - which is the point of side reflectors. If the side reflectors don't show until the car is too close to do anything about it, then either the driver didn't notice the bike's lights or the cyclist had no working lights.

      3. Baldrickk Silver badge

        Re: The video

        Is completely non-representative of the road itself - it's actually a well lit area, as shown by numerous people who took their own dash-cam footage immediately following the incident.

      4. MrXavia

        Re: Safety driver?

        "The emergency stop detector only looked ahead and did not get any input from the route planning part of the software that knows where the road curves and when to turn. Left to itself, the car would stop before any obstruction on the outside edge of a curved road. Uber's solution was to prevent the emergency stop system from making emergency stops."

        That makes no sense to me. My car is 6 years old; it has adaptive cruise control, and it also has emergency braking. It won't stop you from bumping into something, but it will alert you and begin to break heavily if it thinks you will hit something, and it is smart enough to not stop you on a curved road...

        1. Stoneshop Silver badge
          Headmaster

          Re: Safety driver?

          but it will alert you and begin to break heavily if it thinks you will hit something,

          No, it might break heavily if you actually hit something because of insufficient braking.

  7. adnim Silver badge
    Joke

    The EULA

    has taught us that no one is responsible but the end user, or victim.

    Joke icon, but I am serious.

  8. Deltics
    FAIL

    Liability could (and perhaps should) ...

    ... pass UP to the agency/body that licensed Uber to operate these vehicles on public roads without first ensuring that adequate safety protocols and controls were in effect to protect the public.

    Superficially it seems like an absurd flaw in the system as described.

    "Yes sir, our vehicles are perfectly able to recognise when emergency braking is required to avoid a potentially fatal collision when the vehicle is operating autonomously"

    "Good, so there is no possibility that such a collision could occur ?"

    "Well, not quite, because you see although the vehicle can determine that braking is needed, it won't actually apply the brakes when operating autonomously, but it will alert the safety driver who is then required to perform the emergency braking maneouvre, so it really depends on the safety driver being alert and their reaction time"

    That should have been the end of the license hearing right there. Relying on a "safety driver", let alone one expected to have lightning fast reaction times, even if paying attention, in this situation is patently absurd since the circumstances under which the vehicle is going to NEED to alert the driver to the need for emergency braking are most likely to occur when the driver is NOT paying attention! For if they WERE paying attention then the need for emergency braking would almost certainly have been avoided long before it became necessary!

    The fact they were allowed to operate without even this obviously ineffectual backstop operational only highlights further the negligence of the issuer of the license.

    But, the license issuer being a state body and the operator being a Golden Boy "tech startup" means that it of course will have to be Joe Blow that takes the fall.

    1. DavCrav Silver badge

      Re: Liability could (and perhaps should) ...

      "Well, not quite, because you see although the vehicle can determine that braking is needed, it won't actually apply the brakes when operating autonomously, but it will alert the safety driver who is then required to perform the emergency braking maneouvre, so it really depends on the safety driver being alert and their reaction time"

      It's worse than that. From the article, the safety driver isn't actually alerted.

    2. John Brown (no body) Silver badge

      Re: Liability could (and perhaps should) ...

      "... pass UP to the agency/body that licensed Uber to operate these vehicles on public roads without first ensuring that adequate safety protocols and controls were in effect to protect the public."

      They were invited, and probably got fast tracked permits because the AZ governor saw a chance to poke California in the eye when Cali caught Uber operating with no permits and stopped them. So yes, it's arguable that the AZ governor is also partly liable.

  9. Big Al 23

    Bad call I'd say

    This vehicle does not belong on public roadways if it can't automatically stop when a pedestrian walks or a cyclist pulls out in front of it. How on earth can Uber get away with this failed safety for an AV?

    1. Anonymous Coward
      Anonymous Coward

      Re: Bad call I'd say

      It would seem to me that this would be the prime function of an autonomous vehicle. It may be very difficult to code effectively, but without that the rest is pointless. I think this accident highlights that a "safety driver" is just about pointless in an autonomous vehicle. By the time there's an incident they should attend to, they've been lulled into inattentiveness and inaction. This is a very bad idea until there are huge advances that can be demonstrated on tracks. If these companies won't build large difficult tracks with lots of realistic obstacles for testing then they can shove this idea up their own backsides. It is not ready for live road work.

      1. John Brown (no body) Silver badge

        Re: Bad call I'd say

        "This is a very bad idea until there are huge advances that can be demonstrated on tracks. If these companies won't build large difficult tracks with lots of realistic obstacles for testing then they can shove this idea their own backsides. It is not ready for live road work."

        ISTR Mythbusters did some driving myths using an abandoned military base, complete with its own suburbia-like town, specifically because it was very close to real driving around residential areas. They even used cardboard cutout "people" who were pulled into the road to simulate potential accidents for the drivers to avoid. All done in as near as dammit complete safety.

  10. Menasco
    Devil

    Justice

    Unbelievable FAIL. I wonder if, say, a faulty accelerator module on a conventional car would have met with the same judgement in the same circumstances? I would bet on a prosecution. It seems that whilst Uber vehicles don't know when to stop, the firm itself is able to prevent "fatal" collisions with the justice system. I wonder how they do that... oh look, a bag of money to help prevent drink driving incidents, in a neighbouring state. "Conflict of interests"? WTF!!!!! Excuse me... PERSON DEAD. Self-driving cars guided by GPS etc, are you fucking mad? Oh, just look the other way, it wasn't anyone I cared for, it's all good, all hail Elon et al. The technological infocracy is running the justice system, narcissistic cretins are running the government, and the mainstream media is a haze of coke n' crack liberal elite neurotics, and, and... we're just... HUMAN SLAVES, IN AN INSECT NATION (apologies to Bill Bailey). Question: how many commentards does it take to push an Uber off a cliff? Answer: more than this. Bolleaux, I'm going for a cuppa... Wait until the twitterati find out it was Phil The Greek driving (eeeeeeshg, just making a few extra quid on the side, blasted pedestrian cyclists, probably a vegan I shouldn't wonder, AND it was DARK, bloody typical, nnnnhg, Liz is going to go absolutely mental)

  11. Windows8

    Bullshit. Bullshit. Bullshit. Uber is liable; this is the candy-ass court system protecting another big company and new market. The driver should have been aware this system was deactivated, yet Uber isn't liable and now they want to go after the employee? Wtf is going on here... this industry needs to be regulated before more citizens take flight when the autonomous wheels go whack whack at your knees.

  12. Anonymous Coward
    Anonymous Coward

    The very first thing...

    I thought when I first saw the video of the driver when it was released was:

    "It looks as if they are looking down at a cellphone."

    I hate the fact that I have so little faith in humankind but then this came out....

    https://www.nydailynews.com/news/national/ny-news-uber-driver-watching-reality-show-during-fatal-crash-20180622-story.html

    (sigh)

  13. Pascal Monett Silver badge

    DISGUSTING

    That's really all there is to say. I don't care what the excuse is : it was Uber's test vehicle running on automatic. Uber is liable.

    I hope somebody is going to appeal this appalling decision.

    1. Version 1.0 Silver badge

      Re: DISGUSTING

      Uber is a corporation, and in the US corporations are very rarely charged with causing deaths, because corporations aren't people: "Corporations don't kill people, people kill people." There were no people involved; it was software/hardware that caused the homicide, and the Uber driver just failed to stop it. That's an accident, not a crime, in the corporate world.

      If Uber get away with this then there's no reason for them to worry; they can disable the braking systems completely and just speed through town like a corporate bowling ball, delivering their passengers to their destination in minutes.

      That's the "anti" view anyway - but the real issue here is that our society has stopped caring about the fate and well-being of everyone except the wealthy. Do you think we'd even be having these discussions if the Uber had T-boned a Bugatti and killed the driver?

  14. Keith 20

    Humans

    This is not about the code or the autonomous vehicle. There was a human supposedly in charge of the car, who didn't see the pedestrian in the dark.

    This should be seen as a positive thing for autonomous vehicles because at least it would have applied brakes. If this were a normal car the woman would still have died, but there would be no news story, as happens every day around the world.

    1. Anonymous Coward
      Anonymous Coward

      Re: Humans

      If this had been a normal car the human driver would have seen the cyclist and stopped (or more likely just slowed down or changed lanes so they wouldn't hit her). The autonomous vehicle just carried on regardless and mowed her down without taking any action (solely creating a log item that says a pedestrian is in the roadway ahead and emergency braking was advisable doesn't count as action).

      The safety driver's attention was elsewhere - whether it was watching a reality tv show or filling in a journey log is irrelevant, what matters is their attention wasn't where it needed to be to avoid the collision.

      This should be seen as a huge negative for autonomous vehicles, specifically for Uber because they've designed the system to not bother to avoid killing pedestrians and cyclists, and more generally for the entire industry because the legal framework for assigning accountability for any autonomous driving incidents is entirely absent.

      1. John Brown (no body) Silver badge

        Re: Humans

        "This should be seen as a huge negative for autonomous vehicles, specifically for Uber because they've designed the system to not bother to avoid killing pedestrians and cyclists, and more generally for the entire industry because the legal framwework for assigning accountability for any autonomous driving incidents is entirely absent."

        Even worse, part of the "mitigation" was that the system was dithering over what the object was and kept changing its "mind". I'm not sure why that should matter. There was an object in the road so the car systems should have tried to avoid it either by stopping or changing direction or both. It doesn't matter whether it was a human being or not, it should still have tried to avoid the collision.

        1. DavCrav Silver badge

          Re: Humans

          "Even worse, part of the "mitigation" was that the system was dithering over what the object was and kept changing its "mind". I'm not sure why that should matter."

          Indeed. If I think "I'm not sure if that's a person or something else" at night I don't then think "well, I'll just carry on and find out when I get there". So human 1, AV 0.

          1. Anonymous Coward
            Anonymous Coward

            Re: Humans

            To be fair, the indecision made it difficult for the software to predict the future trajectory of the unknown object, and hence whether the car was on a collision course at all. Not that it matters - a human driver* who could see something up ahead that they couldn't identify would back off the throttle to buy time until they'd figured out what the hell it was. The uber car just carried on (and through) regardless of the obstruction, presumably because the software spec didn't want the car to suffer from "unnecessary" delays (c.f. the reason why the built-in automatic braking system was turned off).

            This suggests that the Uber development model prioritises shorter travel times above safety, which makes this story even more alarming than it already was. That philosophy doesn't make sense unless we're talking about a morally bankrupt corporation that's trying to minimise staff costs and to hell with everything else...

            ...oh right, yeah, that makes perfect sense actually.

            * the 99% of human drivers who aren't complete psychopaths, at any rate

    2. Doctor Syntax Silver badge

      Re: Humans

      "If this were a normal car the woman would still have died"

      Prove it.

    3. Adam 1 Silver badge

      Re: Humans

      A human who was also tasked with capturing information about the vehicle's performance on a device as it drove. If they had her in the car solely as "your job is to monitor the decisions being made by the car and intervene if necessary", your comment would be reasonable. But her job required her to also be a data entry clerk. As such, it was perfectly foreseeable that her attention would from time to time be diverted. If the car cannot operate safely without a human supervisor, then they were negligent in not having a human supervising it at all times.

      1. Baldrickk Silver badge

        Re: Humans

        So then UBER is liable for overloading the driver with tasks, to the point that they cannot perform the task that they are there for.

        Not to mention that, apparently, they were watching a TV show instead of doing their damn job.

        1. John Brown (no body) Silver badge

          Re: Humans

          Yep, under UK law, the driver and Uber would almost certainly both be culpable: the driver for not paying attention, and Uber for not only giving the "supervisor" tasks which required a lack of attention on the road, but possibly also for expecting 100% attention on a task in relatively poor conditions for too long a period.

          Keeping a high degree of attention on a task which basically just involves sitting there doing nothing is one of the most difficult things to do. Having spent some time as a swimming pool lifeguard many, many years ago, I can speak with some experience of that. Just standing around doing nothing is quite tiring, which is why all the staff moved around and swapped posts every 10-15 mins or so. Like supervising an autonomous car, lives are potentially at risk, so you need to keep a close eye on your patch as well as a general situational awareness of the rest of the pool area, and that is noticeably easier when you break off to move around. I would suggest that supervising an autonomous car should probably only ever be for an hour or so at a time, especially since these are likely to be minimum wage posts rather than well paid technical jobs.

  15. Tom Melly

    Still don't get it

    How is a system where you are encouraged to both relax and stay on a trigger-finger alert ever meant to work? It seems like the worst, most dangerous, combo possible.

  16. Steve 114

    Why?

    Why would anyone push a loaded bicycle into the path of an oncoming car?

    1. Fursty Ferret

      Re: Why?

      I would guess that if you're tired, you've been walking for a mile or more just to find somewhere to cross the road, your eyesight maybe isn't so good, and you're not a driver so you've not got much experience in judging oncoming traffic speed purely from headlights, you might make a mistake like this?

      Think about the number of people who'll pull out of a side road in front of you when driving and force you to brake because they can't interpret your distance from them.

      1. Anonymous Coward
        Anonymous Coward

        Re: Why?

        Add in that the autonomous car was speeding, and the cyclist may have misjudged the timing because of that.

  17. steviebuk Silver badge

    Jesus!

    ""At 1.3 seconds before impact, the self-driving system determined that emergency braking was needed to mitigate a collision," the NTSB says in its report summary. "According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control to reduce the potential for erratic vehicle behavior."

    The safety driver is supposed to handle emergency braking, but the car's systems weren't set up to alert the driver of the need for intervention, according to the NTSB."

    And they aren't being hit with a massive fine for that MASSIVE flaw!!!! How the fuck have they gotten away with that!?!?!

    Did Uber say "Don't blame us, blame the safety driver. Even though, if the system had alerted them to hit the brakes, 1.3 seconds before impact was never going to be enough for them to react, blame them anyway."?

    Were brown envelopes involved?

    1. Archtech Silver badge

      Re: Jesus!

      Actually I see some similarity with the problem of automated airliner control systems. In both cases there is a (presumably skilled and experienced) human operator ready to intervene at a moment's notice. But then the automated system does more and more of the work, and the human operator just sits for hour after hour after hour...

      And then, with no warning at all, a disastrous emergency arises and the operator is unready, shocked, disoriented, and more or less incapable of responding appropriately.

      Of course in the Uber case the operator was actually distracted by doing something extraneous. But even if she had been sitting staring ahead into the darkness, she might not have reacted as quickly as a normal driver would - or as she would have, if driving a normal car.

      1. StewartR

        Re: Jesus!

        "Actually I see some similarity with the problem of automated airliner control systems. In both cases there is a (presumably skilled and experienced) human operator ready to intervene at a moment's notice. But then the automated system does more and more of the work, and the human operator just sits for hour after hour after hour...

        And then, with no warning at all, a disastrous emergency arises and the operator is unready, shocked, disoriented, and more or less incapable of responding appropriately."

        I'm not an airline pilot, but I would imagine that the timescale for responding to sudden emergencies is somewhat different. If one of your engines fails at 39,000 feet, you're going to have quite a lot longer than 1.3 seconds to deal with it. That's a very significant and important difference. Sure, if there's a major problem in the take-off or landing phase then that's more urgent, but you'd expect the pilots to be at their most attentive at those times.

        1. Archtech Silver badge

          Re: Jesus!

          I was thinking of https://en.wikipedia.org/wiki/Air_France_Flight_447 and https://www.telegraph.co.uk/travel/news/kevin-sullivan-qantas-flight-72-computer-failutre/

  18. aks Bronze badge

    Automatic emergency braking was disabled because of erratic behaviour. I translate this to mean that if enabled, the car would slam on the brakes for many invalid reasons and thereby risk causing accidents.

    The human driver was there and would not have responded to a sudden alarm by braking hard but would have started looking for the cause. With 1.3 seconds before impact, neither the automated system nor an attentive driver would have reacted in time to avoid the crash.
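
    For what it's worth, some back-of-envelope arithmetic on that 1.3 second window, assuming roughly 70 km/h (close to the reported ~40 mph) and ~7 m/s² of hard braking - illustrative assumptions, not NTSB figures. A full stop was indeed impossible, but braking within the window would still have shed a lot of speed:

    ```python
    # How far the car travels in the 1.3 s between "emergency braking needed"
    # and impact, and what instant hard braking could have achieved.
    # Speed and deceleration below are assumed round numbers, not the
    # exact figures from the investigation.

    speed_ms = 70 / 3.6     # ~19.4 m/s, roughly the reported speed
    window_s = 1.3          # detection-to-impact window from the NTSB summary
    decel = 7.0             # firm braking on dry tarmac (assumed)

    distance_to_impact = speed_ms * window_s        # covered if nothing brakes
    speed_if_braked = speed_ms - decel * window_s   # speed after 1.3 s of braking

    print(f"car covers about {distance_to_impact:.0f} m in {window_s} s")
    print(f"impact speed with instant braking: about {speed_if_braked * 3.6:.0f} km/h")
    ```

    So even a perfect, instantaneous response could not have stopped the car, but it would have roughly halved the impact speed - which is presumably what the NTSB meant by braking to "mitigate" rather than avoid the collision.
    
    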

    From what I read elsewhere, the woman was wheeling her bicycle into the path of the car on a dark night. I don't know if she even had lights on.

    If this were a non-automated car I doubt if the driver would be found at fault.

    1. Anonymous Coward
      Anonymous Coward

      There's dashcam footage proving that absolutely no action was taken to avoid or mitigate the force of the collision, even though the pedestrian was clearly visible before the impact even with the very poor night vision capability of that dashcam. Whilst perhaps not being found ultimately responsible for causing the accident, a human driver would be held accountable for not trying to at least brake before the impact, so would still be on the hook for very serious charges. They may still get away with it if they swear that they didn't see the pedestrian until it was too late to take action, but that would really only be lessening the charge from vehicular manslaughter* to driving without due care and attention*.

      In this specific case, we have logged computer evidence that the autonomous car saw the pedestrian and carried on regardless. This is akin to a human driver saying "yeah, I saw her, but couldn't be arsed to do anything about it". I doubt that would go down well in a court.

      It now looks like Uber have got away with the criminal aspects of the case, which seems like a failure of justice to me. As this incident occurred within the land of expensive civil litigation hopefully the relatives of the victim will be able to take Uber for enough cash to make them reconsider testing their autonomous vehicles in "Death Race 2000" mode.

      * AIUI, that's what they would call these things in the UK, but most parts of the world would have similar options for pressing charges.

    2. Alister Silver badge

      From what I read elsewhere, the woman was wheeling her bicycle into the path of the car on a dark night. I don't know if she even had lights on.

      What you read elsewhere is incorrect. She didn't wheel the bicycle into the path of the car, she was crossing a three-lane carriageway, and had almost got to the far side before the car hit her.

      Contrary to the impression given by the released video, the road was well lit, and she would have been in plain sight of the driver, if the driver had been looking.

      A human driver could have avoided the accident by just backing off the throttle when she was in view, giving her time to complete the crossing, the driver wouldn't have even had to move out of their lane to avoid her.

    3. theDeathOfRats
      Devil

      Automatic emergency braking was disabled because of erratic behaviour. I translate this to mean that if enabled, the car would slam on the brakes for many invalid reasons and thereby risk causing accidents.

      And I think most of us think the same:

      IT WAS NOT READY FOR PRODUCTION!

      If the supposedly "Autonomous Vehicle" can't operate as a meatbag should (that is, trying to anticipate whatever could happen in your path), IT SHOULD NOT BE ALLOWED TO BE IN A PUBLIC ROAD.

      Icon: because it's the closest I've seen here to Sally (the AV I've loved since I was a kid).

  19. Zack Mollusc

    timing

    Judging by how long it took the pedestrian to cross one lane in the dashcam footage, it seems that they began crossing the road before the car was in sight. When the car spotted that there was _something_ in the road, it lacked the resolution/accuracy/intelligence to predict the course and speed of the something and realise that it was on a collision course.

  20. Fonant
    Unhappy

    We're going to need to trade speed with safety.

    The elephant in the room: motor vehicle danger.

    Do we want AVs to be as safe as, say, railways and aeroplanes? In which case they're going to drive very carefully, with plenty of just-in-case braking. That will mean slower car journeys compared to risk-taking human drivers (who do not meet anything like the safety levels of railways or planes).

    Or do we want AVs to drive like humans do, and accept that AVs will kill and seriously injure at a similar rate?

    Presumably somewhere in between, but that means that both (a) AVs will kill people and (b) AVs will be slower than human drivers.

  21. Anonymous Coward
    Anonymous Coward

    The human was only there as a backup to a test system... to stop this exact scenario from happening - system makes a mistake - human steps in to correct/avoid/recover/stop.

    They didn't react and somebody died. They should be held fully accountable as ultimately, system or no system, they had responsibility for the vehicle at the time.

    Interesting argument for the future though... once autonomous vehicles are "the norm" and humans no longer have to pay attention - who is responsible when they make a mistake?

  22. MrXavia

    I watched the video, I timed my own reaction from when I first saw the bike on the video and how long until it hit...

    Purely based on the video, if I was driving my car, I would have braked before I fully registered what was in the road and would have stopped just in time, maybe hitting the biker at a slower speed as it is hard to judge the distance exactly.

    But then again my car would have warned me about the bike before I could see it on the video and I would have braked earlier.

    And I suspect I would have seen it earlier because lighting is better there than the video footage suggests.

    The thing is, the car should have seen the bike and stopped, so Uber should be liable, so should the safety driver.

  23. Chris Evans

    But who was the driver?

    Still not clear who the 'driver' was and what the human's responsibilities were.

    If the human was told the automatic system would be driving the car and he was only to monitor the system and should never take control, then Uber is guilty and the human driver is not.

    Was the human told he should always monitor the road ahead and take control if necessary?

    Or what?

    Not enough information to decide culpability.

  24. SNAFUology
    Stop

    Death by technological cop out

    In Australia we have no death penalty, cops are trained to shoot to kill, and self-drive cars will (in an accident situation) wipe you out to save their negligent driver (who is texting) and their family, while you are driving safely alone nearby.

    Driver Assist - that's what they *should* call it, and clearly state that the driver must maintain control of the vehicle at all times.

    The longer a driver goes without having to respond to the road, the more they will suffer from boredom and distractions, until they are occupied sufficiently not to be able to respond in time.

    A red area to the left (or right) down the side of the windscreen could be displayed when the radar/lidar system detects movement near that side of the road, alerting the driver something is near in the distance.

    It seems inevitable that the worst will happen, so in future we will see auto/self-drive vehicles driving down the footpath, ploughing through school crossings, as well as running into parked cars & trucks.

    IF it takes 10 onboard computers to deal with peripheral objects and movement detection (or lack thereof) then do it.

  25. M.V. Lipvig

    What I don't get...

    The computer identified an object in its path. What did it matter if it was confused as to what said object was? An object in the car's path should have triggered a warning to the driver, then a stop if no intervention by the driver, whether it was a person or a box.
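
    The fallback the comment describes can be sketched in a few lines. This is a hypothetical illustration - the function, field names and the 3-second threshold are all assumptions, not Uber's actual control logic:

    ```python
    # A minimal sketch of "object in path => warn, then brake": any detected
    # object triggers a driver warning, and the car brakes itself if the
    # driver doesn't intervene in time. The threshold below is an assumed
    # value for illustration only.

    WARN_LEAD_S = 3.0  # assumed minimum lead time for a useful driver warning

    def respond_to_obstacle(time_to_impact_s: float, driver_intervened: bool) -> str:
        """Decide an action for *any* detected object, classified or not."""
        if driver_intervened:
            return "driver_in_control"   # human has taken over
        if time_to_impact_s > WARN_LEAD_S:
            return "warn_driver"         # still time for a human response
        return "emergency_brake"         # never just log it and carry on

    # At 1.3 seconds out with no driver input, the car should brake itself:
    print(respond_to_obstacle(1.3, False))  # emergency_brake
    ```

    The point is that the decision never depends on what the object *is* - only on whether something is in the path and whether anyone is doing anything about it.
    
    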

    All I know is, if I or mine are ever injured by one of these things I'm going after whoever it was that made the car, not the person in the driver's seat. If you built a car capable of driving itself, you are the driver and you are responsible for what the car does in autonomous mode. The judge ruled wrong here.
