Dear Tesla, stop calling it autopilot – and drivers are not your guinea pigs

Tesla is misleading drivers about the efficacy of its Autopilot feature and is putting lives at risk, according to Consumer Reports. The automaker's autopilot system, when engaged, is supposed to control the speed of the vehicle and its distance from other rides and objects – but it's more of a super-cruise-control than a …

  1. Mark 85 Silver badge
    Holmes

    In Beta and customers are the testers. Never ends well*. I daresay there will be lawsuits aplenty in the offing.

    *Yeah... MS has been doing this for years but the computer isn't driving down the road.

    1. NotBob
      Holmes

      Pretty sure that's a Google standard, labelling all things beta...

      1. allthecoolshortnamesweretaken Silver badge
        Coat

        Pretty sure that's a Google standard, labelling all things beta... / MS banana ware*

        What are you, under forty? Get off of my lawn, pronto.

        * That's what we used to call it for a while. Because you buy it in an unripe state and have to wait for it to mature a bit before it's of any use to you. You know what? It's Friday, let's go to the pub and have a few. I'll get my coat.

    2. a_yank_lurker Silver badge

      @Mark 85 - Others are showing the correct way to develop autonomous automobiles: slowly and carefully, in specially designed test vehicles. These vehicles are operated by trained staff under controlled conditions, even on public streets. They are not operated by untrained owners under any and all conditions, which is what Tesla is doing.

      The technology is several years away from being deployable safely to the masses. Driver assist systems that handle emergency braking, etc. are deployable because the driver is still actually driving the vehicle.

      1. John Robson Silver badge

        "The technology is several years away from being deployable safely to the masses. Driver assist systems that handle emergency braking, etc. are deployable because the driver is still actually driving the vehicle."

        Clearly having a collision rate lower than that of humans is too dangerous - so we should actually ban all human drivers...

        Of course not *you*, you're one of the 90% of drivers who consider themselves to be a 'good' driver.

        The technology is very good at what it does - and its capabilities are improving all the time, unlike human drivers, who are generally careless and whose abilities/habits tend to degrade over time (after those first couple of years).

        Given a choice - I'd have an autopilot enabled car now, and use it as well. I would be significantly safer as a result of doing so.

        1. BebopWeBop Silver badge

          As someone who once had a little more money than was good for him (post divorce so no restraining influence) I bought my Tesla about 12 months ago, it's my primary vehicle and has clocked up about 28k since. The 'auto pilot' is excellent as a driving aid - enhanced cruise control. The manual makes this very clear - but I suspect far too many idiots don't RTFM.

          As for the car, lovely to drive and a very comfortable large car for long distances. A little planning is needed occasionally, but frequently doing a Scotland-Midlands-Bristol trip, no big deal. Stupidly expensive, but some mugs have to be earlyish adopters or things don't progress.

        2. ST Silver badge
          Mushroom

          Darwin Award

          > Given a choice - I'd have an autopilot enabled car now, and use it as well. I would be significantly safer as a result of doing so.

          Are you applying for a Darwin Award?

          1. DropBear Silver badge
            Facepalm

            Re: Darwin Award

            He clearly is. First pointer: they're never aware of actually doing it...

          2. John Robson Silver badge

            Re: Darwin Award

            "> Given a choice - I'd have an autopilot enabled car now, and use it as well. I would be significantly safer as a result of doing so.

            Are you applying for a Darwin Award?"

            No - I'm not.

            I've looked at the state of play, I know people who work at Tesla, I know people who have them.

            I have read the user guide.

            The addition of autopilot is a net safety enhancement. It is not a license to kip, nor to watch a film/read a book.

        3. Eddy Ito Silver badge

          Clearly having a collision rate lower than that of humans is too dangerous - so we should actually ban all human drivers...

          Are you referring to the autonomous vehicles that get to pick and choose the driving conditions under which they are tested? It's pretty easy when the typical route doesn't change much and the road conditions are almost always near perfect. While humans do worse in inclement weather the autonomous vehicles haven't really been tested in rain, snow, or other storms now have they? Hell, how many miles have they logged in the dark?

          If you're referring to Tesla, their autopilot has logged only about 130 million miles and they've already had their first fatality. That gives them a rate of 0.77 per 100 million miles which just about ties them with the drivers in the state of Maryland who logged 56,432 million miles in 2014 alone.
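          The arithmetic above is easy to sanity-check. A minimal sketch in Python, using only the figures quoted in this thread (~130 million Autopilot miles, one fatality - none of the inputs independently verified):

```python
# Back-of-envelope check of the fatality-rate figures quoted above.
# Inputs are the numbers from this thread, not independently verified.

def rate_per_100m_miles(fatalities: int, miles_millions: float) -> float:
    """Fatalities per 100 million vehicle miles travelled."""
    return fatalities / (miles_millions / 100.0)

# ~130 million Autopilot miles, 1 fatality so far
tesla_rate = rate_per_100m_miles(1, 130)
print(f"Autopilot: {tesla_rate:.2f} fatalities per 100M miles")  # ~0.77

# With counts this small the statistic is fragile: a single additional
# fatality over the same mileage doubles it.
tesla_rate_2 = rate_per_100m_miles(2, 130)
print(f"With a second fatality: {tesla_rate_2:.2f} per 100M miles")  # ~1.54
```

          With one event in the numerator, the rate carries almost no statistical weight - which is the lumpiness being pointed out.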

          1. Alan Brown Silver badge

            "their autopilot has logged only about 130 million miles and they've already had their first fatality"

            In a situation where a truck driver turned across oncoming traffic, and it's not even known whether there would have been sufficient time for a human to react and avoid it if he had been 100% in control.

            Also, where the evidence seems to have been tampered with (the driver was known to use a dashcam at all times, yet the dashcam is missing) and the truck driver claims to have _heard_ a portable DVD player over the noise of his own engine from 20 feet away - which is pretty suspect to say the least.

            Not to mention your cherrypicking of statistics. You can't compare long-established rates with 1 incident in this mileage. Give it another 3-6 sampling periods and you might be right, but right now there's insufficient data to draw any conclusion.

            1. Eddy Ito Silver badge

              @Alan Brown

              I'm not picking any cherries. I'm merely looking in the basket of cherries that has been pre-picked by the many already declaring that the technology is safe. My whole point was to show that there has been nowhere near adequate testing so saying it's safe is a bit premature considering how many miles are driven by actual people. Consider that if there is another Tesla fatality on autopilot in the next week it will change the statistic from being one of the better numbers to being tied with Mississippi which is one of the worst. While everyone may think 130 million miles is a lot, it's a mere drop in the bucket compared to annual road miles.

              1. Anonymous Coward
                Anonymous Coward

                Statistics of small lumpy numbers

                "if there is another Tesla fatality on autopilot in the next week it will change the statistic from being one of the better numbers to being tied with Mississippi which is one of the worst."

                Exactly. It's the statistics of small lumpy numbers. They can be used, or abused, even more than most other statistics can.

        4. John Brown (no body) Silver badge
          Coat

          "Of course not *you*, you're one of the 90% of drivers who consider themselves to be a 'good' driver."

          As someone who's driven over a million miles without an accident, I do consider myself to be quite a decent driver. Although some of the sights I've seen in me rear view mirror have been quite harrowing.

          1. Rich 11 Silver badge

            Although some of the sights I've seen in me rear view mirror have been quite harrowing.

            That reminds me of Jasper Carrott speaking about his mother-in-law: "She's never had an accident. But she's seen thousands."

          2. Anonymous Coward
            Anonymous Coward

            Although some of the sights I've seen in me rear view mirror have been quite harrowing.

            Maybe point it at the road instead of yourself? :)

            (sorry, too good to ignore - carry on :) ).

    3. John Brown (no body) Silver badge

      "Yeah... MS has been doing this for years but the computer isn't driving down the road."

      And that is the crucial point. How come the software that can control the steering isn't a fully tested and approved bit of code? How come it can be updated without being re-submitted for new approval? I can't see that happening in the aviation or pharmaceutical industries, where even the slightest update to a product needs official testing and approval, possibly lasting for years.

      Is it just that so many people die on US roads that an extra few don't matter? DaaS, Death as a Service, ooooohh it's internetty and cloudy so it must be good.

      1. Mike Shepherd
        Happy

        DaaS LOL

        Just sprayed my monitor with tea.

  2. inmypjs Silver badge

    That Tesla are full of shit...

    is no surprise to anyone who has listened to them.

  3. Anonymous Coward
    Anonymous Coward

    Also

    Please stop calling it Tesla. Super cheesy, borrowing the name of a genuinely great inventor.

    1. JeffyPoooh Silver badge
      Pint

      Re: Also

      Nikola Tesla was a genius, he basically invented everything to do with Alternating Current.

      But he was also completely bonkers. Mad as a hatter. Fell in love with a pigeon. He is quoted saying things that were just wrong. 'Power the world with six towers...', no. 'Knock down a building with a small clockwork mechanism...', no.

      Somehow he lived in a hotel, but died penniless. I assume that the hotelier was extremely generous.

      But yes, a great inventor. A genius. A half-mad genius.

      1. This post has been deleted by its author

      2. wolfetone Silver badge

        Re: Also

        "But he was also completely bonkers. Mad as a hatter. Fell in love with a pigeon. He is quoted saying things that were just wrong. 'Power the world with six towers...', no. 'Knock down a building with a small clockwork mechanism...', no.

        Somehow he lived in a hotel, but died penniless. I assume that the hotelier was extremely generous.

        But yes, a great inventor. A genius. A half-mad genius."

        You make it sound like his achievements are worth less than those of others because he lived his life differently to what you would consider normal.

        Einstein treated his wife like shite, but everyone kisses his arse.

      3. Ogi

        Re: Also

        "But he was also completely bonkers. Mad as a hatter."

        “No great mind has ever existed without a touch of madness.” -- Aristotle

        "Power the world with six towers..., no."

        Perfectly possible, but would be hugely wasteful of energy. Maybe one day when we have abundant energy, but I suspect even then we would like efficiency, except for a few things where convenience is more important. For now we use the technology to transfer power between sealed sections of UK submarines, and a previous gen of the technology is used in those "wireless charging" mats.

        "'Knock down a building with a small clockwork mechanism...', no."

        The theory of resonance is quite well understood now, and a powerful enough mechanism could knock down a building if the correct resonant frequency was found.

        Building a small compact mechanism that can do it is tricky, but I don't think there has been much study into it. Primarily because if you want to knock down a building, you might as well skip the intricate clockwork mechanism, and brute force it with a seismic bomb.

        1. Anonymous Coward
          Anonymous Coward

          Re: Also

          "The theory of resonance is quite well understood now, and a powerful enough mechanism could knock down a building if the correct resonant frequency was found."

          Rubbish. You're clearly not a physicist.

          Let's assume for argument's sake that we find a building with a nicely tuned resonant frequency with near-negligible losses.

          The device doing the exciting has to provide the energy into the resonant system. Energy doesn't come from nowhere.

          If the exciter doesn't provide enough energy (or, if you will, power) to overcome losses in the resonant system, there will be no visible resonance. Just damped oscillation, maybe.

          A small clockwork motor will not provide enough power to overcome losses in a real building.

          Next.
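          The energy argument can be illustrated numerically. A toy sketch (a generic driven, damped oscillator with made-up parameters - emphatically not a model of any real building): at resonance, the steady-state amplitude saturates at roughly F0/(c·w0), so a weak exciter produces a small wobble no matter how perfectly it is tuned.

```python
# Toy illustration: a driven, damped oscillator's steady-state amplitude
# at resonance is capped at about F0 / (c * w0), so a feeble exciter
# cannot pump up an unbounded resonance however well tuned it is.
# All parameters are invented for illustration.

import math

def steady_amplitude(F0: float, c: float, m: float = 1.0,
                     w0: float = 2 * math.pi) -> float:
    """Integrate m*x'' + c*x' + m*w0^2*x = F0*cos(w0*t); return peak |x|."""
    x, v, dt = 0.0, 0.0, 0.001
    peak = 0.0
    for i in range(200_000):                     # t runs to 200 s
        t = i * dt
        a = (F0 * math.cos(w0 * t) - c * v) / m - w0 ** 2 * x
        v += a * dt                              # semi-implicit Euler step
        x += v * dt
        if t > 100:                              # skip the transient build-up
            peak = max(peak, abs(x))
    return peak

weak = steady_amplitude(F0=0.01, c=0.5)          # tiny "clockwork" drive
strong = steady_amplitude(F0=1.0, c=0.5)         # 100x the drive force
print(f"weak drive amplitude:   {weak:.4f}")     # ~ 0.01 / (0.5 * 2*pi)
print(f"strong drive amplitude: {strong:.4f}")   # ~ 1.0  / (0.5 * 2*pi)
```

          The amplitude scales linearly with drive force: to shake a real building you need real power, which is exactly the point about losses above.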

  4. chekri

    Do we want to advance or not?

    First of all, as already stated by Tesla, a car running in Autopilot is far less likely than a car not driven in Autopilot to be involved in a fatal collision.

    The fact is that, when used properly, this technology saves lives, but like many other safety technologies it can cause serious injury or death when used incorrectly. E.g. an airbag is more likely to cause injury to a passenger who is not wearing a seat belt in a low-speed collision than had it not been there. Do we then take that one isolated fact and extrapolate it to conclude that airbags are bad and should be banned? No we don't; we expect people to adhere to the road rules and not behave in an unsafe manner, so that the majority of people can benefit from the net safety benefits of airbags.

    Another factor with any safety technology is that people risk compensate, for example this excerpt from the British Medical Journal:

    "Compulsion to wear a seatbelt cut deaths among drivers and front seat passengers by 25% in 1983. But in the subsequent years, the long established trend of declining deaths in car accidents reversed, and by 1989 death rates among car drivers were higher than they had been in 1983. Evidently the driving population risk compensated away the substantial benefits of seatbelts by taking extra risks, putting others in more danger. This period saw a jump in deaths of cyclists "

    So we need to decide: will we all become Luddites in the face of advanced safety technology, or will we embrace it despite the fact that humans will initially try to find a level of risk they are comfortable with, usually overshooting the mark and then coming back to some sort of acceptable level?

    1. P. Lee Silver badge

      Re: Do we want to advance or not?

      The issue is not, "is it a good idea," but, "is Tesla misleading drivers with regard to its abilities."

      My snake-oil detector goes off whenever I see "intelligence" applied to IT stuff. "Autopilot" might work well on an aeroplane in an obstacle-clear sky with radar, objects on relatively predictable trajectories, coordinated air-traffic control and two pilots at the ready, but in a cluttered ground environment even speed maintenance alone is dodgy around town.

      1. chekri

        Re: Do we want to advance or not?

        First of all the name Autopilot, obviously borrowed from aviation, does not imply that you can have a snooze, hop in the back seat or read the paper. Do you see the pilot and co-pilot just wandering around the plane whilst letting the autopilot on a plane do its thing?

        Let's say Tesla caves in and calls it Driver Assist. Do you think that the man watching the DVD would have done anything different? That the lady who accelerated her husband's Tesla into a brick wall wouldn't have? That the man who crashed into a field because he refused to place his hands on the wheel despite the car asking him to do so repeatedly would have put his hands on the wheel?

        Look I'm no aviation expert or behavioural expert but I think common sense dictates that the answer to the above questions is no, no and no.

        1. allthecoolshortnamesweretaken Silver badge

          Re: Do we want to advance or not?

          "First of all the name Autopilot, obviously borrowed from aviation, does not imply that you can have a snooze, hop in the back seat or read the paper. Do you see the pilot and co-pilot just wandering around the plane whilst letting the autopilot on a plane do its thing?"

          If you like air travel, I might have bad news for you...

          1. chekri

            Re: Do we want to advance or not?

            I refer you to the FAA, Advanced Avionics Handbook - Chapter 4:

            "An autopilot can be capable of many very time intensive tasks, helping the pilot focus on the overall status of the aircraft and flight. Good use of an autopilot helps automate the process of guiding and controlling the aircraft."

            I don't read anywhere that a pilot can just go wander around and leave it to George. I see emphasis that the Autopilot is there to help the pilot "focus on the overall status of the aircraft and flight"

            So although you may find some case where an asshat pilot may have done so illegally, this does not detract from the overall case that the name Autopilot does not imply that you can divert your attention from the driving task.

          2. Anonymous Coward
            Anonymous Coward

            Re: Do we want to advance or not?

            There are generally TWO people qualified to fly the metal tube at the front of the plane. One of them can go for a leak/wander around the plane leaving the other one in control

            I have been on a scheduled flight where there was only one pilot. Granted, this was in a twin engined Cessna flying from Fort Myers to MIA. This was operated by Air Florida.

            As someone who had a job a long time ago designing Autopilots, it rankles me when Tesla calls their thing an Autopilot when it isn't. Driver Assist is a far more accurate description.

        2. Dan 55 Silver badge

          Re: Do we want to advance or not?

          First of all the name Autopilot, obviously borrowed from aviation, does not imply that you can have a snooze, hop in the back seat or read the paper. Do you see the pilot and co-pilot just wandering around the plane whilst letting the autopilot on a plane do its thing?

          They are fully trained as to what an autopilot actually does, and they will get fired if they wander off. Not the same thing as being alone in your own car.

          Let's say Tesla caves in and calls it Driver Assist. Do you think that the man watching the DVD would have done anything different? That the lady who accelerated her husbands Tesla into a brick wall wouldn't have? That the man who crashed into a field because he refused to place his hands on the wheel despite the car asking him to do so repeatedly would have put his hands on the wheel?

          Look I'm no aviation expert or behavioural expert but I think common sense dictates that the answer to the above questions is no, no and no.

          Indeed, but common sense has to compete against a childhood of watching Knight Rider. Autopilot implies more hands off. Nobody is going to read x pages of EULA to check what the Autopilot can or can't do, they're going to go "yeah, it's got Autopilot, sweet". This is also how Apple's phones are sold, by marketing to people who have no idea what it really means, but they know it's cool. Have you read iOS's EULA? Has anyone? Thought not.

          The other difference between Teslas and the rest is you can 'drive' with your hands off the wheel for longer.

          It all gives a sensation of autonomy that the rest don't have, but it doesn't mean that Teslas actually have it either. Tesla can't handwave away problems and say it's "Beta" because that's Silly Valley culture, they're in meatspace now. You'd never get a Rainbow Road screen on a Volvo to distract the driver with.

          By the way, Consumer Reports copied me.

          1. LDS Silver badge

            "They are fully trained as to what an autopilot actually does" - and what it doesn't.

            You're right. In most people's imaginations, "autopilot" means some kind of sci-fi AI fully able to control a vehicle flawlessly. Tesla cars don't have an R2D2 in the trunk taking control.

            While in aviation, pilots are trained to understand their hardware's capabilities and limitations. There are also limitations on when and how an autopilot can be used and when not - some rules made after crashes related to too much confidence in autopilots (i.e. the Turkish Airlines 737 crash at Schiphol - the pilots also took too long to react, but the root cause was a faulty sensor).

            Moreover, planes fly in a highly controlled environment with far fewer obstacles (and fewer morons). I wonder whether car makers have studied what happened in aviation. Anyway, maybe cars too need a special license for "instrument driving rules"...

            1. Crisp Silver badge

              Re: "Tesla cars don't have an R2D2 in the trunk taking control."

              How long until I can have an R2D2 in the trunk taking control? (because that would be so cool)

            2. Tom 38 Silver badge

              Re: "They are fully trained as to what an autopilot actually does" - and what it doesn't.

              In most people's imaginations, "autopilot" means some kind of sci-fi AI fully able to control a vehicle flawlessly.

              I think of Otto from "Airplane!"

          2. Jez Burns
            Pint

            Re: Do we want to advance or not?

            "Common sense has to compete against a childhood of watching Knight Rider."

            You absolutely nailed it - have a beer.

      2. JeffyPoooh Silver badge
        Pint

        Re: Do we want to advance or not?

        'A.I. is hard.'

        Might be an understatement.

        In fact, it might be impossible (except in the sense of neural nets). Coder drones typing in zillions of lines of code? Forget it, not a chance.

        A.I. is as allergic to the infinite complexities of the real world as those alien microbes in 'The Andromeda Strain' were to rain.

        "Do we want to advance or not?" is apologistic rubbish.

        Remember: The stupid thing drove into the side of a truck.

        The NHTSA or DOT should order Tesla to turn it off.

        Not only does it need to be several times safer than humans, it must be certified not to make stupid mistakes.

        They're very far from being done.

        1. Matthew Taylor

          Re: Do we want to advance or not?

          "A.I. is as allergic to the infinite complexities of the real world as those alien microbes in 'The Andromeda Strain' were to rain."

          Balls. Neural networks (which all modern AI systems use) are the best paradigm we have for dealing with real-world data.

    2. Anonymous Coward
      Anonymous Coward

      @chekri - Re: Do we want to advance or not?

      This is not about safety, it is about control - it is a power grab - so I will choose to become a Luddite, thank you very much.

    3. DougS Silver badge
      Mushroom

      What a stupid fucking statistic

      Of COURSE cars in autopilot are less likely to be involved in collisions. Because it is basically a super cruise control! The per-mile accident rate for cruise control (the old-school kind that just sets a speed and doesn't keep following distances or brake) is far lower than the regular accident rate for the same reasons - not because using cruise control makes you safer, but because you only use it in situations where you are already safer.

      Man, I'm really losing a lot of respect for Musk over this. That guy is willing to twist statistics and say anything instead of admitting to the real problems with autopilot - chief among them the name and the fact that it doesn't enforce any attention from the driver. Hell, it can detect when you take your hands off the steering wheel but DOESN'T DISENGAGE. I hope the NTSB investigation puts some heavy penalties on Tesla, they deserve it.

      Musk has probably singlehandedly set back autonomous driving by five years, because the regulations his ineptitude is going to cause will make it tougher for everyone. Even those who are responsible and not trying to push the envelope by using humans as guinea pigs for beta software in the name of publicity.

      1. petur
        Thumb Down

        Re: What a stupid fucking statistic

        Your argument makes absolutely no sense.

        If you have seen any video of how users are using AutoPilot, you would see that it is being used a lot in circumstances Tesla does not condone. Maybe get informed before blurting out your venom.

        I find it amazing how few accidents have happened *despite* how users use it, far away from the easy, safe highway with simple lanes, etc...

        That said, Tesla should invest even more time and energy in informing the users on the limits of the system since there are plenty of daredevils who want to find the outer edge of its limitations...

      2. AIBailey

        Re: What a stupid fucking statistic

        "Hell, it can detect when you take your hands off the steering wheel but DOESN'T DISENGAGE."

        So, just to be clear here, you'd prefer Tesla to design a system that stops trying to drive your car at the point that you remove your hands from the wheel (rendering the vehicle technically out of control at that point). Surely that's the exact point in time at which you want something to be in control of the vehicle?

        From what I can tell, Tesla have a series of warnings and alerts that go off if it detects no user input. Actually disengaging the system would be one of the stupidest actions they could possibly take.

    4. Stevie Silver badge

      Re: Do we want to advance or not? 4 checkri

      Tesla's safety claim is a soundbite answer to a question involving dozens of complex variables. It completely evades the driver's risk of adopting a more relaxed attitude to situational awareness because "the autopilot has it covered".

      The comment connecting seat belt usage with increased accident rates is also overly simplistic, and a clear example of correlation rather than causation. I could, for example, point out that in the same period cars began shipping with all-round disc brakes as standard, resulting in drivers braking more aggressively in traffic, and turbocharging became economically viable and mechanically reliable, resulting in drivers accelerating more aggressively. I know this because I Was There.

      And there is a difference between a Luddite and someone who would rather their kid, raised carefully and dutifully to all the proper standards espoused by SINKS and DINKS posting on El Reg forums these days, not be killed as collateral damage when some fucktard with more money than sense doesn't RTFM or does and fails to recognize the gap between designer hyperbole and engineering reality.

      The Tesla is being developed in this case as one would develop software in a post-www world when it should be being developed more traditionally with the knowledge that even fancy expensive cars can kill people, just like the conventional auto industry does. A bad driver for a flatbed scanner will ruin your day. A bad driver for the Tesla "autopilot" will ruin someone else's life.

      Also: naming the feature "autopilot" invites stupid people to act stupidly. We live in a world where I dig about a car a year out of my front lawn because young people equate cruise control and automatic gearbox with "I can fiddle with center console gui stereo controls while exceeding the speed limit in perfect safety".

  5. moiety

    17 seconds? Do they insist on finishing the chapter, or what?

    1. Anonymous Coward
      Anonymous Coward

      17 seconds? Do they insist on finishing the chapter, or what?

      It's the reality that the likes of Tesla are choosing not to acknowledge. A driver who's been lulled into a false sense of security because "the car's driving itself" not only isn't holding the steering wheel, and doesn't have feet anywhere near the brake, but will likely have let their attention drift and have no situational awareness. When the bing-bongs go off first the driver will just be startled, then they'll have to get their hands and feet into the controls, scan their instrument panel to work out what the hell caused that, and then start looking around outside their vehicle to start the process of working out what their vehicle's doing and what's happening around them. Only then can they decide what action to take and effectively take control (assuming they didn't do something daft when first startled). It doesn't at all surprise me that this might take 17 seconds.

      1. LDS Silver badge

        scan their instrument panel to work out what ... and then start looking around

        And that's exactly the wrong sequence. You should *first* look around to ensure you don't crash into anything or anyone, ensure you have control, and only *then* you can start to assess what's wrong.

        That's for example, what pilots are trained to do, while drivers are not.

        1. sgp

          Re: scan their instrument panel to work out what ... and then start looking around

          Pilots will often need time to figure out the situation too. In many cases they have the time, as there is not much to crash into in the air...

          1. moiety

            Re: scan their instrument panel to work out what ... and then start looking around

            You're in charge of 3 tons or so of metal; travelling at motorway speeds; being controlled by beta software. You should bloody well know what's going on around you. 17 seconds response time sounds like criminal negligence to me...and as an ex-lorry driver, I am aware that you zone out a bit on long motorway stints; but even so.

            It also sounds like the alarm should be a little more strident...a discreet dinging bell is fine to call you to yoga lessons; less useful to alert you to a situation that may well involve your fiery death.

            1. Richard 12 Silver badge

              Re: scan their instrument panel to work out what ... and then start looking around

              Alarms can be dangerous though.

              A loud alarm is likely to make the driver look at the source of the sound - and not at the dangerous situation developing outside the vehicle.

              1. moiety

                Re: scan their instrument panel to work out what ... and then start looking around

                Good point...I wasn't saying there should be klaxons though, just something a little more strident than soothing bong noises, which clearly aren't doing the job. Can't use voice because that takes too long. Maybe a buzzing sound with a hint of urgency. Buzz/hiss/white noise combo maybe...I'm no sound engineer. Rapid high-pitched beeps?

                If it's too alarming you could maybe give people heart attacks and it would definitely be distracting and also people would disable it if it really annoyed them.

                It should be low-key; but it should also command attention. More "You need to be paying attention right now matey" and less "There is something you may care to attend to when you've finished your latte".

  6. bazza Silver badge

    "We will continue to develop, validate, and release those enhancements as the technology grows. While we appreciate well-meaning advice from any individual or group, we make our decisions on the basis of real-world data, not speculation by media."

    That's a brave statement considering that regulators are seemingly unimpressed by the performance of things like Autopilot. With statements like that it seems Tesla are wilfully ignoring the Human Factors aspects of such a thing.

    What I don't get is why on earth Tesla are risking all with Autopilot. Their main thing is half-decent practical electric cars, yet they're willing to take a huge commercial risk on Autopilot, something that their main technology doesn't need or benefit from at all.

    Google are nearly as bad, saved by the fact that they're not openly selling cars to the public. "Woohoo, self driving car" they say in demos, ads, papers, trials, and as much publicity as they can generate, yet in the small print they say "you have to be paying attention and will have to take control at short notice"... So not self driving at all then. Most people believe and respond to the publicity, but have no idea about the actual constraints on the technology. If it wasn't for the strict rules imposed by the State of California (CA published the trials data) we'd not be told that actually it's pretty unreliable at the moment.

    The only company doing it properly is Volvo, who at the outset of their development programme said they were aiming for a system where Volvo accepts the liability, i.e. a true self driving car. Good for them.

  7. nematoad Silver badge

    It looks to me as if the people trusting in this sort of stuff have just put themselves on the short list for the Darwin Award.

    1. LDS Silver badge

      Actually, in the "Darwin Awards" movie, one of the episodes was exactly about someone thinking that the cruise control was a full AI autopilot, and they could enjoy the trip in a different way...

    2. Triggerfish

      Just call it Darwin mode.

      Solves all problems.

  8. Anonymous Coward
    Anonymous Coward

    Perhaps Tesla is really run by a GLADoS prototype

    But there's no sense crying

    over every mistake.

    You just keep on trying

    'til you run out of cake.

    And the science gets done.

    And you make a neat gun (or electric car...)

    for the people who are

    still alive.

    Although I am sure the article was triggered by an A/C commentard on el-reg who also pointed out that the feature should not be called autopilot.

  9. David M

    Degree of control

    I will only let go of my steering wheel when there isn't one, i.e. when the Autopilot functions like a chauffeur and is so reliable that the car design simply doesn't include the option of manual control. In the meantime, if I have to be alert enough to take over at a moment's notice, I might as well retain manual control all the time. Adaptive cruise control is fine, but if I could let go of the steering wheel I would inevitably stop paying proper attention to the road.

  10. jzl

    Evidence & numbers

    The number of deaths per mile with Autopilot enabled is lower than the average number of deaths per mile already, meaning that Autopilot is likely to be saving lives.

    Also, Musk's just tweeted that they've finally got physical access to the latest crash vehicle (Pennsylvania) and the Autopilot was turned off at the time of the crash. More to the point, he says, the crash would not have happened if it had been turned on.

    1. LDS Silver badge

      "was turned off at the time of the crash"

      And in the instants before? You have to reconstruct all the events leading to a crash. Was the autopilot a factor? Did the driver disable it in an attempt to avoid the crash? Was the autopilot enabled before? Did it somehow deceive the driver, causing him to choose the wrong response?

      Moreover, the statistics about relatively few Teslas, all new cars driven by wealthy, relatively young people, may not be directly comparable to the statistics for all cars. I'm quite sure the number of deaths per mile for Rolls-Royces and other luxury cars is lower than the average, even without an autopilot, while cheap, old cars, especially in the hands of very young or much older drivers, probably have a higher one.

      Beware of statistics: they don't always tell the real "truth", especially when there are many "dimensions" to take into account. Tesla's data should be compared *only* to similar cars and drivers to give a meaningful statistic.
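      [Ed: the cohort point above is essentially Simpson's paradox. A toy calculation (all figures invented for illustration) shows how a fleet driven mostly on the safest roads can post a better *aggregate* death rate than the national average while being worse in every single road category:]

      ```python
      # Hypothetical illustration (all numbers invented): a fleet driven mostly
      # on motorways can look safer than the national average in aggregate even
      # if it is worse than comparable cars on every road type.

      # (deaths per 100 million miles, miles driven in millions) by road type
      national = {"motorway": (0.5, 300_000), "urban": (1.5, 700_000)}
      fleet    = {"motorway": (0.7, 95),      "urban": (2.0, 5)}

      def rate_per_100m(strata):
          """Aggregate deaths per 100 million miles across all strata."""
          deaths = sum(rate * miles / 100 for rate, miles in strata.values())
          miles = sum(miles for _, miles in strata.values())
          return 100 * deaths / miles

      # Worse in *both* strata (0.7 > 0.5 and 2.0 > 1.5)...
      print(rate_per_100m(national))  # ≈1.2
      # ...yet better in aggregate, purely because of where the miles are driven:
      print(rate_per_100m(fleet))     # ≈0.765
      ```

      The aggregate number flips simply because the fleet's miles are concentrated where accidents are rare, which is exactly why an unstratified comparison is meaningless.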

      1. Matthew Taylor

        Re: "was turned off at the time of the crash"

        "And in the instants before? You have to reconstruct all the events leading to a crash. Was the autopilot a factor? Did the driver disable it in an attempt to avoid the crash? Was the autopilot enabled before? Did it somehow deceive the driver, causing him to choose the wrong response?"

        Or did the Autopilot call the driver a bumface? Or... or... none of those things. You seem to be fishing.

        1. LDS Silver badge

          Re: "was turned off at the time of the crash"

          It looks like you've never done a "root cause analysis". A snapshot of a single instant in an incident may not tell you enough, and may deceive you. Was the autopilot a factor? The fact it wasn't enabled at the instant of the crash doesn't mean it wasn't a factor, if it was enabled before. When it's all about safety, you must be very careful and check and understand everything that led to an accident. You may be free to rely on unsafe systems just because they make you feel trendy, and kill yourself, but you may also kill other people, and that's unacceptable.

    2. DougS Silver badge

      Re: Evidence & numbers

      Bullshit. The deaths per mile with old-school cruise control are also lower. Are you going to argue that cruise control saves lives, or acknowledge that it has more to do with people only using such features when they already feel safer - and generally traveling at highway speeds, where the per-mile death rates are lower anyway?

      This is similar to arguing that not showering makes you safer in your home (because you eliminate the 'slipping in the shower' accidents)

      1. Vic
        Joke

        Re: Evidence & numbers

        This is similar to arguing that not showering makes you safer in your home

        And so it does.

        Eschew showering for long enough, and you're far less likely to contract an STD...

        Vic.

    3. Anonymous Coward
      Anonymous Coward

      Re: Evidence & numbers

      The number of deaths per mile with Autopilot enabled is lower than the average number of deaths per mile already, meaning that Autopilot is likely to be saving lives

      AI vehicle: 500,000 miles in 2 years, 1 death

      Piloted vehicles: 600,000,000,000,000 miles in 120 years, with a million deaths

      still doesn't add up to me, until they cover as many miles as normal vehicles

      1. DougS Silver badge

        More bullshit numbers

        Listing all traffic deaths for the last 120 years to compare against the last two years? How about at least using the last two years, since the death rate was a lot higher in the old days before seat belts and crash testing?

        The trends show about 1 death per 100 million miles driven these days. That includes ALL roads not just the nice highways where autopilot is far more likely to be engaged, and all vehicles including poorly maintained 20 year old cars which Teslas aren't. AND it includes the 1/3 of deaths that are alcohol related and can be thrown out unless you are going to claim that the reason autopilot makes you safer is because it saves people who are drunk, texting while driving or doing other stupid stuff. That's hardly the makings of a great ad campaign: "Are you a terrible driver who does stupid things and doesn't pay attention, buy a Tesla and use autopilot and you're less likely to die!"
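        [Ed: the sample-size worry in this subthread can be made precise. With roughly one fatality observed over Autopilot's mileage to date (~130 million miles is an assumed figure here, not an established one), a simple Poisson model shows the data are consistent with a very wide range of true death rates:]

        ```python
        import math

        # If deaths are a Poisson process, how plausible is "at most one death"
        # over the exposure so far, for various hypothetical true death rates?
        # The ~130 million mile exposure figure is an assumption for illustration.

        def p_at_most_one(rate_per_100m_miles, exposure_100m_miles):
            lam = rate_per_100m_miles * exposure_100m_miles  # expected deaths
            return math.exp(-lam) * (1 + lam)                # P(X <= 1), X ~ Poisson(lam)

        exposure = 1.3  # 130 million miles, in units of 100 million
        for rate in (0.1, 1.0, 2.5, 3.6):
            print(rate, round(p_at_most_one(rate, exposure), 3))

        # Every one of those rates - spanning a factor of 36 - is consistent
        # with the observation at the usual 5% level, so one death tells you
        # almost nothing about whether the system beats the ~1 death per
        # 100 million mile baseline.
        ```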

  11. DrXym Silver badge

    A sign of things to come

    Tesla's "autopilot" is actually quite modest and it's easy to see how you might break the problem down and model it - multiple lanes of cars all going the same way, sensors that model the car's surroundings / lane markings, algorithms that maintain speed & distance, algorithms that mark opportunities to overtake, algorithms to avoid / brake for hazards based on proximity, and control of steering / brakes / lights. It's complex, no doubt, but it can be modeled.

    But it requires:

    a) The computer is able to see all hazards, act in a predictable way and additionally only engage when the road and conditions are suitable. This is clearly not the case.

    b) The car forces the driver's attention. Force the driver to hold the wheel with both hands. Force them to touch a pedal in a certain way. Monitor their head and posture. This is clearly not the case either.

    It is the failure of a) and b) which causes accidents. A failure of a) is bad enough, but without an attentive human it's a guaranteed accident. This is a foreseeable consequence of not forcing attention, i.e. bad design.

    The funny part is that Tesla's self-drive solution is quite modest. The problems facing mostly or fully automated cars are harder by multiple factors. Perhaps reports of accidents might do something to let a little reality creep into the hype about self-drive vehicles.
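    [Ed: a minimal sketch of the kind of attention enforcement described in b). This is an invented state machine with made-up thresholds, not Tesla's actual logic:]

    ```python
    import enum

    class Alert(enum.Enum):
        NONE = 0          # hands on wheel, or only just removed
        CHIME = 1         # gentle reminder
        LOUD_ALARM = 2    # strident, can't-ignore warning
        SLOW_TO_STOP = 3  # controlled stop with hazards on, not an abrupt cut-out

    def alert_level(hands_off_seconds: float) -> Alert:
        """Escalate the longer the wheel goes untouched (thresholds invented)."""
        if hands_off_seconds < 5:
            return Alert.NONE
        if hands_off_seconds < 15:
            return Alert.CHIME
        if hands_off_seconds < 30:
            return Alert.LOUD_ALARM
        return Alert.SLOW_TO_STOP
    ```

    The design point is the final state: the system ends in a *safe* degraded mode rather than either carrying on unattended or handing control back to someone who isn't paying attention.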

  12. Anonymous Coward
    Anonymous Coward

    This smells like Darwin award time. =( The question in my mind is: did Tesla loudly and clearly communicate to its customers that they are not to read, watch TV, etc. while engaging the autopilot? If they clearly communicated what to expect, I see a clear case of the Darwins.

    If they did not communicate, then I see a big law suit, or several, coming up.

    Personally, I would never trust the autopilot enough in its current state to remove my hands from the wheel. On the other hand I'm paranoid, so I don't trust my computer and rely on backups, I don't trust the state, so I encrypt etc. etc. ;)

    1. DougS Silver badge

      Doesn't matter if they communicated it if they don't enforce it. They have sensors that can detect whether you are holding the wheel, but don't disengage autopilot no matter how long you keep your hands off it. There's no excuse for such stupidity, and I imagine a jury will agree when the inevitable lawsuits begin.

      1. Anonymous Coward
        Anonymous Coward

        Hmm, I guess you are referring to US-style law, which assumes that a human being is incapable of breathing unaided, and which gave rise to warnings such as "do not let children play inside laundry machine".

        I actually do not like that legal style at all; however, your point is noted. My preference is for a legal system that assumes a minimum level of intelligence and common sense, like in the far north of Europe.

        On the other hand, that legal system can be a bit toothless when the penalty is so soft that you clearly come out ahead if you break the law. I guess something between the US and northern Europe might be ideal.

      2. parperback parper

        Not necessarily useful

        Sounds like a bad idea to turn OFF the self driving when there isn't anyone holding the wheel.

        Turning it ON, with some kind of alarm would be a better option.

        Anyway, this is all bass-ackwards.

        If this was about safety, the human should have to drive all the time with the Autopilot kicking in only if they do something blatantly stupid (with the ability to do a conscious override, a la stability control).

    2. DrXym Silver badge

      It doesn't matter what they communicate. A system which allows a driver to be inattentive will cause accidents. A system which is in itself imperfect will cause accidents. Both need to be addressed for the system to be safer than a driver by themselves. So this is a foreseeable consequence of bad design.

      An analogy might be a factory with a dangerous hydraulic machine. You could put warnings all over the machine saying not to do certain things while it's running, and someone still will, through stupidity, inattentiveness or whatever. That is why factories are required to install things like safety gates, two-handed controls, sensors, etc., that automatically shut down the machine if the operator does something that puts them at risk. A car hurtling down the road at 70 mph is a dangerous machine, and its safety should be treated as importantly as it is in a factory.

      1. Alan Brown Silver badge

        " That is why factories are required to install things safety gates, two handed controls, sensors etc."

        And why employees bypass the things, then find they have no recourse when the machine amputates body parts (except that these days they do, because the factory is generally found to be negligent in allowing someone to bypass the safety mechanisms.)

  13. TRT Silver badge

    I'm surprised they called it "Autopilot"

    Because from the description it sounds like:

    1) In-lane-guidance assist, which is something Toyota have had for about 3-4 years, and uses their electrically driven power steering to artificially "profile" the road (changing lanes feels like you're steering over a 6 inch high ridge where the lane dividers are - if you don't actively steer, it feels like the road is pushing the car into the bend).

    2) Collision-avoidance, which Volvo and others have had for around 5-6 years as a front-facing feature, and the side/rear-collision detection has been on high end cars for around 2-3 years.

    3) Adaptive cruise-control, which Toyota have again had for 3-4 years, which is supposed to maintain distance to the vehicle ahead whilst respecting an upper speed limit.

    You can't just keep piling driver assistance features on top of each other and expecting it to one day magically start working as something that can drive the car.

  14. Anonymous Coward
    Anonymous Coward

    Hands off, foot off.

    Why doesn't the car start to decelerate if the hands are off the wheel? That would still be safer than many other drivers.

    I watched a vlog, timelapse in car and the driver was overtaking other vehicles (multi-lane highway) with a phone in one hand (+ charging lead) and a lollipop in the other, steering with her knee.

    I respect the vlogger and I bet his girlfriend is nice too but FFS, that is the sort of thing that would have me walk.

    Also been back seat in a Jeep when the driver dropped his fag (UK = cigarette); both he and his GF (front seat) bent down to look for it so it didn't burn the carpet, leaving just me watching the road and wondering how to word something.

    Automation is far behind the human mind, it would never think to do those things ;-)

  15. hoola
    WTF?

    Beta

    Just how have they managed to get a car on the road running beta software that is directly related to the safety of the occupants and, more importantly, others? This is either some very good lobbying, or simply that the regulations have not caught up. The other possibility is that because it is electric and seen as "IT" and techie, like everything else, current regulations do not apply.

    Can you really see Ford, Honda, Toyota, VW etc doing this?

    1. Anonymous Coward
      Anonymous Coward

      Re: Beta

      "Can you really see Ford, Honda, Toyota, VW etc doing this?"

      Toyota don't seem to have had a problem shipping control systems that aren't fit for purpose. There have been deaths, and court cases leading to billion dollar penalties, in the USA.

      VW's ability to do software and systems wrong may not have killed drivers or passengers (yet) but their ECU-fiddling has now become relatively visible.

      Who knows what we'll find out about Ford, Honda, and others.

      For probably the best documented example to date (Toyota), look up (e.g.) "uncommanded acceleration".

      Places to start include:

      http://www.eetimes.com/document.asp?doc_id=1319903 25 Oct 2013 [1]

      http://www.eetimes.com/document.asp?doc_id=1319966 31 Oct 2013

      http://www.eetimes.com/document.asp?doc_id=1321734 1 Apr 2014

      https://users.ece.cmu.edu/~koopman/pubs/koopman14_toyota_ua_slides.pdf 28 Sep 2014, Prof Phil Koopman (expert witness at the Toyota trial)

      [1]

      "Could bad code kill a person? It could, and it apparently did.

      The Bookout v Toyota Motor Corp. case, which blamed sudden acceleration in a Toyota Camry for a wrongful death, touches the issue directly.

      This case -- one of several hundred contending that Toyota's vehicles inadvertently accelerated -- was the first in which a jury heard the plaintiffs' attorneys supporting their argument with extensive testimony from embedded systems experts. That testimony focused on Toyota's electronic throttle control system -- specifically, its source code.

      The plaintiffs' attorneys closed their argument by saying that the electronics throttle control system caused the sudden acceleration of a 2005 Camry in a September 2007 accident that killed one woman and seriously injured another on an Oklahoma highway off-ramp. It wasn't loose floor mats, a sticky pedal, or driver error.

      An Oklahoma judge announced that a settlement to avoid punitive damages had been reached Thursday evening. This was announced shortly after an Oklahoma County jury found Toyota liable for the crash and awarded $1.5 million of compensation to Jean Bookout, the driver, who was injured in the crash, and $1.5 million to the family of Barbara Schwarz, who died.

      During the trial, embedded systems experts who reviewed Toyota's electronic throttle source code testified that they found Toyota's source code defective, and that it contains bugs -- including bugs that can cause unintended acceleration.

      "We've demonstrated how as little as a single bit flip can cause the driver to lose control of the engine speed in real cars due to software malfunction that is not reliably detected by any fail-safe," Michael Barr, CTO and co-founder of Barr Group, told us in an exclusive interview. Barr served as an expert witness in this case.

      A core group of seven experts, including four from Barr Group, analyzed the Toyota case. Their analysis ultimately resulted in Barr's 800-plus-page report. [continues]"
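      [Ed: a classic embedded-safety mitigation for exactly the single-bit-flip failure Barr describes is to store a critical variable together with its bitwise complement and fail safe on any mismatch. A toy sketch follows - real ECU code would be C, with hardware watchdogs and redundant CPUs on top:]

      ```python
      # Toy mirrored-variable check: any single bit flip in either the value or
      # its inverted copy breaks the invariant (value XOR mirror == all ones),
      # so corruption is detected on read instead of silently acted upon.

      MASK = 0xFFFFFFFF  # treat values as 32-bit words

      class MirroredValue:
          def __init__(self, value: int):
              self.set(value)

          def set(self, value: int):
              self._value = value & MASK
              self._mirror = ~value & MASK  # inverted copy

          def get(self) -> int:
              if (self._value ^ self._mirror) != MASK:
                  raise RuntimeError("memory corruption detected - enter fail-safe")
              return self._value

      throttle = MirroredValue(42)
      assert throttle.get() == 42
      throttle._value ^= 0x04  # simulate a single bit flip in RAM
      # throttle.get() would now raise instead of silently returning a wrong value
      ```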

      1. JetSetJim Silver badge
        Joke

        Re: Beta

        > For probably the best documented example to date (Toyota), look up (e.g.) "uncommanded acceleration".

        Their marketing department did a good job of it, though - where else could the slogan "The car in front is a Toyota" have come from?

      2. Stevie Silver badge

        Re: Beta

        Interesting cases, the Toyota "acceleration" ones.

        Ever note how many elderly drivers were involved?

        I have no skin in Toyota's game, and I am regarded as elderly by some whippersnappers hereabouts, but it occurred to me many years ago that maybe Toyota valued the public image more than the quest for public truth in these cases.

        But then again, maybe not. I Was Not There.

  16. Bertie D'astard

    I LOVE my Tesla Model S, and I DO use the auto-pilot feature regularly as an intelligent assist function, but do not abdicate the responsibility of having to be in control. This is the best and smartest car I have ever owned and driven and absolutely love it, and look forward to continuous software updates that bring new and advanced features all the time. This is the first car I have owned that improves the longer I own it.

    But like all technology, mis-use or abuse, and you're headed for trouble. This is very sophisticated technology, and it requires the user to be smart enough to work with it.

  17. This post has been deleted by its author

    1. Stevie Silver badge

      Re: About the naming...

      As for beta - that doesn't mean it is an untested feature that is not of sufficient quality to be rolled out. It is an indicator to people to not entirely rely on the feature.

      Tell you what: you beta test your Tesla on a private road away from me, my family and friends until it isn't in Beta any more and we are good.

      Until then, what is called for is a siren about as loud as those used at football matches to sound inside the Tesla's cabin whenever some twat takes his or her hands off the fucking wheel in traffic so he/she can take a selfie.

      Anyone caught photoblogging from the backseat of their Tesla while beta-not-ready-for-prime-time-autopilot is in charge should be recycled for organ donation.

      1. 9Rune5

        Re: About the naming...

        Stevie, I think the point here is that the term "autopilot" in itself does not promise much. Look up the definition in Merriam-Webster and you will find no implied promise of any intelligence whatsoever. Basically it is a device that can be implemented with a piece of rope. Clearly there is a huge span in what the various implementations do, and different complexities involved. In an airplane you have the luxury of integrating with a collision avoidance system installed in all other aircraft. No such thing in a car, so you end up having to syphon similar information from a camera. I suspect that it is actually harder to implement autonomous operation (which is still <> autopilot) in a car than in an airplane. (But that is probably not relevant to the discussion at hand.)

        The article's author seems to have a different interpretation of what "autopilot" means, but I think it is very relevant to question that interpretation.

        OTOH I found the discussion of statistics interesting. It makes sense that people would activate e.g. cruise control in places where the traffic situation is predictable. At the same time I am worried that we (even on an IT website) tend to act like luddites. I see arguments against this feature similar to those made back when people still used to discuss ABS (also an oft-misunderstood technology: ABS will not reduce braking distance on slippery surfaces, but it might help you steer while slowing down).

        1. Vic

          Re: About the naming...

          In an airplane you have the luxury of integrating with a collision avoidance system installed into all other aircrafts

          • Collision avoidance is rarely integrated with the autopilot
          • Many, many aircraft have no collision avoidance mechanism but the pilot's eyes

          Vic.

        2. Stevie Silver badge

          Re: About the naming... 4 9Rune5

          The name "autopilot" carries with it many ideas and suppositions, including the idea of a safe method of stepping away from the controls.

          I could wave a dictionary back at you pointing to the term "pilot" but that would not satisfy anyone since you seemingly believe cars should have the same testing and release philosophy as the software that made the wealth of the man behind the Tesla - cram in Teh Awsum, shove it out the door and fix it in the mix as problems arise - whereas I don't think that thinking belongs anywhere involving large chunks of metal and/or space age composites hurtling along the highways.

          Yes, that includes "smart" traffic signs.

          Aeroplanes have many advantages, not the least being that when autopilot is engaged the actual pilot is in clear airspace guaranteed by all sorts of backup mechanisms and laws to which all but a few insane types adhere for the public good.

          But fly an aeroplane on autopilot into an area where some of that is not true and it all ends rather badly in short order. It has happened many times, once in recent years when a small jet of the Lear type (but not necessarily that marque) suffered what has been publicized as a failure of a door seal that caused everyone to pass out.

          I'll pause while everyone inserts their personal favorite conspiracy theory.

          The aircraft swanned across the American skies in excellent order while increasingly anxious ATC personnel attempted to contact someone without transistors in their brain. Eventually it intersected a mountain range, whereupon the need for a real pilot was suddenly proved beyond a shadow of a doubt.

        3. Alan Brown Silver badge

          Re: About the naming...

          "ABS will not reduce braking distance on slippery surfaces"

          It does in the typical scenario (wheels lock and driver doesn't lift foot from brake), and by quite a bit.

          It can pull a car up slightly better than an expert driver in a non-ABS car, but it's far more important that it can pull the car up far better than most drivers can achieve AND won't result in the car spinning if the surfaces under the left/right sides of the vehicle differ in their levels of grip (e.g. side of the road, or one set of wheels on paint) - this is one that even expert drivers have trouble avoiding.

          https://www.youtube.com/watch?v=mKiTAcXK6M4 - 3:41

          It's probably saved more lives under these kinds of circumstances than anything else.

          The steering part is a bonus, but no matter how you try to spin it, it does extend the stopping distance.

  18. Kinetic

    Fundamentally flawed.

    Either it needs to be able to deal with any and all situations that might arise, or it's not ready. This idea that people are going to drive for hours on a motorway with their hands and feet hovering near the controls, ready to take over at any stage... and not fall asleep / get totally distracted, is nonsense. It's a manufacturer cop-out to try and sidestep responsibility.

    I can see how some people would go for full automation, but this half-assed sort-of automation is just asking for problems.

    To the people who keep parroting the line about them already being safer - let's see the stats breakdown on that one before buying the marketing. But in addition, that's almost not the point. When you have a crash whilst driving your car, you had some skin in the game - quite literally your life. The guy who wrote the bad update code that caused the autonomous car you were riding in to crash would doubtless feel terrible about you being killed, then he would go on with his life. There may not even be a fine.

    The distinction here is that by putting your life on the line, you buy in a certain amount of trust from the other road users. You have as much to lose as them. The guy 9-5ing it in a software house on the other side of the world who messed up and killed you and other road users hasn't "bought-in". He needs to be held to a higher standard with rigorous testing.

    Here's another interesting thought experiment. Say we assume that the "AutoPilot" software is good enough. Then say we pass a law stipulating that if the deaths per mile with AutoPilot exceed those of regular drivers, the entire development and testing team is executed. Now they have some skin in the game. Do you think the testing regime is going to stay the same, or get much more thorough? If you think it will get more thorough - I thought the software was good enough? Good enough for strangers maybe, but not for them!

  19. steward
    Facepalm

    Planes have autopilot - doesn't mean they don't have pilots.

    Commercial jetliners have had autopilot for decades. That doesn't mean they don't have a pilot and a co-pilot as well.

    From CNBC: "The autopilot system relies on a series of sensors around the aircraft that pick up information like speed, altitude and turbulence. That data are ingested into the computer, which then makes the necessary changes. Basically, it can do almost everything a pilot can do. Key phrase: almost everything."

    Anyone who drives a car with "autopilot" and expects it to do everything is an ignoramus and a fool.

    1. JeffyPoooh Silver badge
      Pint

      Re: Planes have autopilot - doesn't mean they don't have pilots.

      At the outset, Airbus took a different approach with the entire Human Factors thing. Over the years, quite a few Airbus have been in perfect mechanical condition in the millisecond before impact.

      Somebody should make a plot of the rate of 'perfect mechanical condition' versus 'seriously broken or on fire' (both: a millisecond before impact) for the various brands of aircraft, plotted against year. Based on my observation of the crashes over the years, there would be something of note in the historical data.

      Tesla is following in Airbus' footsteps. The 7pm news shows are going to have a regular 'Self Driving Car Crash of the Week' segment.

      Hopefully the NHTSA or DOT will shut down this 'experiment' until the regulations mature.

      1. Alan Brown Silver badge

        Re: Planes have autopilot - doesn't mean they don't have pilots.

        "Over the years, quite a few Airbus have been in perfect mechanical condition in the millisecond before impact."

        Not just Airbus. The same accusation can be levelled at Boeing and McD.

        Pilot error (CFIT) has been the prime contributory factor in almost all air crashes in the last 40 years. The few where it hasn't been have been all the more newsworthy because it wasn't pilot error.

        (This is one of the reasons why airlines don't hire ex-military pilots anymore. They tend to keep trying to push on regardless when everyone else goes around or gives up and diverts.)

  20. JeffyPoooh Silver badge
    Pint

    Gives new meaning to the term 'Crash Report'...

    Dear Tesla,

    It drove into the side of a truck.

    It drove into the side of a truck.

    It drove into the side of a truck.

    ...

    Cheers.

    P.S. 'A.I. is hard.' (<- never forget that.)

    1. Anonymous Coward
      Anonymous Coward

      Re: Gives new meaning to the term 'Crash Report'...

      The truck cut the driver up, and the AI wasn't even switched on at the time.

  21. JeffyPoooh Silver badge
    Pint

    Fermat's Last Theorem (repost)

    Remember when Andrew Wiles finally proved Fermat's Last Theorem? The final proof turned out to be just 'a bit' (LOL) more difficult than had originally been imagined by Fermat. Orders of magnitude more complicated.

    Self-Driving Cars are probably going to be quite similar. When they finally get one actually finished, one that successfully stays out of the headlines and avoids contributing to 'interesting' tragedies, they'll look back over the intervening decade(s) and then laugh at the vast ratio between their final fully debugged system, and the 2016-era trivial kits that some had imagined would be sufficient.

  22. Inachu

    Customers who use auto pilot for more than 2 minutes are complete idiots anyway.

  23. Mike Shepherd
    Meh

    "Drivers are not your guinea pigs"

    It may be uncomfortable to hear it, but it's worth killing a few hundred, maybe a few thousand people if it brings forward a few years the enormous economic advantage of self-driving vehicles. Of course, you may need to do your testing in countries where life is cheaper.

  24. Anonymous Coward
    Anonymous Coward

    I use this feature every day and I think Autopilot is a perfect description of its features. Just like autopilot on a plane, it doesn't mean the pilot can go to sleep or do something else. You still have to pay attention to the road. Is it perfect? No. I see it drift out of its lane and I see it has problems cresting a hill, but if you are paying attention, it does make driving easier. I'm only correcting the car when these things happen. I'm certainly not going to go to sleep or start reading a book. By far it works best in stop-and-go traffic. By the time I get home, I'm not feeling tired from driving. Long trips don't drain me. I love it, flaws and all.
