back to article 'Autopilot' Tesla crashed into our parked patrol car, say SoCal cops

Police in Laguna Beach, California, have said a Tesla car – which the driver claimed had been operating in "autopilot" mode – crashed into one of the force's stationary cop cars. Photos of the incident were tweeted by Laguna Beach Police Department Public Information Sergeant Jim Cota on Tuesday (this morning UK time). The …

  1. AMBxx Silver badge
    FAIL

    Hmm

    >> they must accept a dialogue box which states that ‘Autopilot is designed for use on highways that have a center divider and clear lane markings

    If the car can't find the centre divider and clear lane markings, perhaps it should refuse to allow Autopilot?

    1. Boothy

      Re: Hmm

      Indeed, or even add geo-location, so it can only be activated on known 'good' locations.

      "You're not on a highway, computer says no!"
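[Ed: the "computer says no" gate suggested above could be sketched as a simple geofence check. This is a minimal illustration, not how Tesla actually does it; the zone list, coordinates and radius are invented for the example.]

```python
import math

# Hypothetical whitelist of known-good highway segments, each a
# (latitude, longitude, radius_metres) circle. A real system would use
# proper map data; these values are invented for illustration.
APPROVED_ZONES = [
    (33.6846, -117.8265, 5000.0),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def autopilot_permitted(lat, lon):
    """Computer says no unless the car is inside a known-good zone."""
    return any(haversine_m(lat, lon, zlat, zlon) <= zr
               for zlat, zlon, zr in APPROVED_ZONES)
```

Inside the listed zone the function returns True; a few miles outside it, False.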

      1. Ledswinger Silver badge

        Re: Hmm

        Indeed, or even add geo-location, so it can only be activated on known 'good' locations.

        How about the arrogant tossers of the US tech sector stop releasing unreliable software for safety critical systems, and treating the resultant incidents as some form of acceptable cost of using the public as beta testers and crash test dummies?

        1. CrazyOldCatMan Silver badge

          Re: Hmm

          How about the arrogant tossers of the US tech sector stop releasing unreliable software for safety critical systems

          Indeed. "Break things and move fast" is not a good mantra for car software..

          1. Stoneshop Silver badge
            Coat

            Re: Hmm

            Indeed. "Break things and move fast" is not a good mantra for car software..

            How about "Breakfast and move things" ?

    2. Dwarf Silver badge

      Re: Hmm

      Isn't this the fundamental problem of autonomous vehicles ?

      They need things in particular places and do unexpected things when those are not present or are not sensed correctly by the array of sensors.

      Compare that to a driver (of any skill level), they are trained to know the rules and if they get stuck, they will figure out what to do in that situation, acting on instinct, training, common sense or a self-preservation instinct. The computers in the car don't have any of those capabilities. I wonder if they can even determine if a sensor is returning questionable or inaccurate data.

      I also think that calling something "autopilot" is a bit misleading, since we've all seen aeroplane films where the pop-up autopilot handles everything perfectly and nothing ever goes wrong.

      1. Martin Summers Silver badge

        Re: Hmm

        "I also think that calling something "autopilot" is a bit misleading, since we've all seen aeroplane films where the pop-up autopilot handles everything perfectly and nothing ever goes wrong."

        Apart from when it deflates?

        1. Dwarf Silver badge

          Re: Hmm

          Apart from when it deflates?

          There was an operational backup for that too if I recall correctly.

          BTW - glad you got the intended link to Otto / Airplane!

      2. Duncan Macdonald Silver badge

        Re: Hmm - Autopilot

        The Tesla Autopilot seems to be about the same stage in car autopilots as the WW2 autopilots (eg the Sperry A-5 autopilot) were in aviation autopilots.

        (The Sperry A-5 could fly a plane on a straight and level path - a modern autopilot can be set to do an entire flight from takeoff to landing - even some of the better consumer drones can now do this.)

        1. Anonymous Coward
          Anonymous Coward

          Re: Hmm - Autopilot

          Actually, the self-drive is way more advanced than you're giving it credit for. What you're describing is more like a three-year-old Volvo or Subaru EyeSight system, with basic lane keeping, adaptive cruise control and emergency braking. Tesla's self-drive tracks both the location and the roadway, and will refuse to engage on many secondary roads. It also tracks the lanes and the cars in traffic, and according to my roommate it worked even on fresh tarmac that didn't have the lane lines painted on yet. Autopilots on planes aren't contending with flying nap-of-the-earth, or through complicated airspace, so even the comparison to current aircraft autopilots is inaccurate.

          While the safety issue is a valid question, this is turning into another overblown moral panic. People who have no idea about the technology, and have never even driven one of the cars for more than a day, are projecting their imagined fears onto something that may not actually be as dangerous as the idiot behind the wheel (at least in the US, where driver training is imaginary and personal responsibility is less than fashionable). The reality is that less capable systems than this are already saving more lives than they endanger. That's by the hard standard of fatalities prevented compared to injuries or fatalities sustained in self-drive.

          I ride a motorcycle, and a self-driving Tesla of this generation is probably less likely to kill me than its owner is. This is being re-framed in the press and public opinion as a public menace, when the net effect of the safety features is actually a big positive.

          1. Alan Brown Silver badge

            Re: Hmm - Autopilot

            "I ride a motorcycle, and a self-driving Tesla of this generation is probably less likely to kill me than its owner."

            This, in spades. In the wake of Joshua Brown's death the USA's NHTSA stated that according to their stats Teslas are 40% less likely to be involved in a crash vs any comparable vehicle. The system isn't perfect by a long shot, but it's clearly already better than a lot of human drivers.

            'Speed humps are put on our roads primarily to try and prevent children walking to school from being killed by vehicles containing those being driven to school'

        2. Alan Brown Silver badge

          Re: Hmm - Autopilot

          "A modern autopilot can be set to do an entire flight from takeoff to landing"

          Some modern autopilots. The one on the Cessna 302 I used to fly was a "straight and level" jobbie - and it wasn't that straight, thanks to gyro precession. There are a lot like this still in service.

        3. Captain Scarlet Silver badge

          Re: Hmm - Autopilot

          "WW2 autopilots"

          Yes, but the pilot is likely to still keep awareness of what is going on around them. I can't see why they don't call it a driving assistant: it's there to assist driving, and it does what it's designed to do very well from what I can tell. I think Tesla would get less flak if it wasn't named Autopilot; the name implies it can do nearly everything (as autopilot can in most airliners manufactured in the past few years).

      3. Thoguht Silver badge

        Re: Hmm

        I'm not sure I'd want a computer in my car that had a self-preservation instinct; I'd rather it had a people-preservation instinct.

        And by the way, my car has lane-following, adaptive cruise control and emergency braking controlled by front-facing Lidar, but Ford had the good sense not to give the impression that these things make the car self-driving.

        1. Alan Brown Silver badge

          Re: Hmm

          "Ford had the good sense not to give the impression that these things make the car self-driving."

          There's the apocryphal story of the Winnebago driver who engaged the vehicle's "autopilot" back in the 1970s on some dead straight road in Kansas and went aft to make coffee. You can guess what happened next.

          The problem is partly that Tesla calls it autopilot and mostly that humans do really stupid things.

          One of the more interesting things I learned watching one of the endless "motorway cops" repeats is that when attending anything by the roadside on a motorway, it's common practice to have someone stationed on lookout watching oncoming traffic, because some drivers get fixated on the flashing lights and unconsciously drive directly towards them.

          1. DropBear Silver badge

            Re: Hmm

            "There's the apocryphal story"

            My one rule concerning any "haw-haw funny" story is that, unless credible proof to the contrary exists, not a single one of them ever happened - all of them being the invention of some moron going "wouldn't it be a hilarious story if some guy... and then that happened... and they made it much worse by..."

            1. Francis Boyle Silver badge

              Well

              I've heard these stories about vicious animals called drop bears.

            2. Alan Brown Silver badge

              Re: Hmm

              "unless credible proof to the contrary exists not a single one of them ever happened"

              Look up "apocryphal" - there's a reason I used that word.

              That said, I've been a passenger in cars where the driver is clearly _NOT_ paying attention to the road ahead whilst on cruise control (rummaging around on the back seat, etc) and that was well before the advent of collision avoidance systems. On one occasion I needed to grab the wheel to prevent the car crashing off the road because the driver had apparently drifted off to sleep.

              The reason stories like the Winnebago one are believable is that there really are people like that, and everyone knows at least one person who would do it.

      4. Anonymous Coward
        Anonymous Coward

        Re: Hmm

        It's Cruise Control. It has zero features of a full Auto Pilot that we associate with aircraft.

      5. Stoneshop Silver badge

        Re: Hmm

        Compare that to a driver (of any skill level), they are trained to know the rules and if they get stuck, they will figure out what to do in that situation, acting on instinct, training, common sense or a self-preservation instinct. The computers in the car don't have any of those capabilities.

        On the other hand, drivers may be interpreting the situation ahead of them wrongly due to having only one set of eyes that only work with a sufficient level of visible light, and their figuring out what to do may take more time than is available to them (better known as "getting into a situation that's over their head"). Common sense often turns out not to be sufficiently common either. And while I grant you the self preservation instinct, it's not rare to see that overridden by target fixation: one drives into what one's looking at even when having consciously noticed it being an obstacle to avoid, like a tree, a large rock or a fire engine. Consider it some kind of brain lock-up.

        Driver assist technologies have been around long enough now[0] that they reliably work as they should; in the 14 years that the Volvo XC90 has been on the market in the UK it has not been in the type of accident that its collision avoidance system[1] was designed for. It's the driver replacement stuff that's not up to scratch, especially if it's (wrongly) interpreted by the driver as being something that it's evidently not ("Autopilot").

        I wonder if they can even determine if a sensor is returning questionable or inaccurate data.

        The traffic authority's report into the crash where that Tesla went under a truck that was crossing the lane it was in contained parts of the vehicle logs. It shows that the system is aware at least of broken and missing sensors and actuators.

        [0] unfortunately to the point that people have come to rely on them in a way that affects their traffic awareness.

        [1] yes, that same system that Uber saw fit to disable, as they considered it might interfere with their own AV setup.
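[Ed: on the question of whether the car can tell a sensor is returning questionable data, a basic plausibility gate of the kind those vehicle logs imply might look like the sketch below. The function name and thresholds are invented for illustration, not taken from any real vehicle.]

```python
def sensor_plausible(readings, lo, hi, max_step):
    """
    Flag a sensor stream as questionable if any sample falls outside
    its physical range [lo, hi], or jumps between consecutive samples
    faster than the physics of what it measures allows (max_step).
    """
    for i, r in enumerate(readings):
        if not (lo <= r <= hi):
            return False                      # physically impossible value
        if i > 0 and abs(r - readings[i - 1]) > max_step:
            return False                      # implausibly fast change
    return True
```

A smooth stream passes; a sudden jump or an out-of-range value is flagged as suspect, at which point the system could fall back to other sensors.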

        1. Dr Dan Holdsworth Silver badge
          Boffin

          Re: Hmm

          It may come as something of a shock to many here, but Google have done extensive research (to add to the already extensive research done on aircraft pilots) and have concluded that when humans are involved with operating complex heavy machinery, there are really only two varieties of automation that are useful.

          Firstly, there is the technology that is being steadily fitted to all cars now, which is stuff which improves how a human controls a car. This is stuff like blind spot warnings, braking assistance systems, anti-collision radar and various lane-keeping aids, together with ABS braking, improved suspension and so on. All of this requires the human to do the driving, and the tech just tries to help the human out whenever it can.

          Secondly, there is fully autonomous driving. This is where the car and its systems are doing all the driving work, and the human merely has a big red button to hit in times of panicked emergency, together with some sort of very low-speed movement control for parking the vehicle somewhere that the autonomous systems cannot.

          Tesla seem to be trying to extend the assistance technology into the driverless technology niche. The problem here is that humans are really bad at not driving but remaining alert; not just slightly bad but truly, dangerously terrible at it, to the extent that this system is actually more dangerous than a person driving an unmodified, unaided car by themselves. Until Tesla realise or admit that their systems don't sit in either of the sweet spots for this sort of technology, they are going to carry on having problems.

          1. Alan Brown Silver badge

            Re: Hmm

            "To add to the already extensive research done on aircraft pilot"

            Studying human factors has done more to reduce aircraft crash rates since the early 1970s than all other changes combined. It's one of the reasons why most airlines no longer directly recruit ex-military pilots (wrong mindset for safe bus driving - a military pilot will push on regardless in marginal conditions instead of diverting. There might be a 90% success rate when doing so, but that 10% is a bitch - and there are far too many cases of "hero pilot abilities saved the day" when "hero pilot's" lack of judgement and fixation on target was directly responsible for putting the flight in danger in the first place.)

            Google is absolutely right to be studying human factors and the NTSB's crash studies always find a chain of events. Human factors start right back at the level of road design/layout and ensuring that markings are done sensibly.

            The problem is that the vast majority of "road engineers" have no actual training in road safety other than the mechanical stuff and a huge number of "no brainer" assumptions(*) about what's safe or not safe result in decisions being made which actually make roads _less_ safe.

            In particular there's a major tendency to look at XYZ "problem" on a road and not realise that it's a symptom of a larger problem. You see this in towns, where panic about people being able to cross the road safely results in extra pedestrian crossings, lights, parking restrictions and fencing being installed, which perversely have the opposite result to what's intended(**). The REAL problem is usually "why is there so much traffic on _this_ road, why is it travelling so fast and why aren't drivers paying attention to conditions?" and the proper answers are usually "Find a better route for that traffic/encourage use of XYZ existing better route and/or slow it down dramatically along with making sure drivers are paying attention to surroundings instead of focussing on the traffic light ahead"

            (*) Assumption is the mother of all fuck ups.

            (**) to wit:

            Crossings usually result in slightly higher levels of car vs pedestrian incidents and they cluster around the crossings.

            Fences, parking restrictions, lights, etc _ALL_ result in traffic going faster and drivers getting tunnel vision (ie, paying LESS attention to what's on the footpath, etc) - and fences are particularly lethal if a cyclist/motorcyclist/pedestrian is pinned between it and a car - which usually results in victim blaming by people who think that fences are there for safety (they're not, they're there to guide pedestrians to a "safer" crossing point and if they're being jumped or bypassed, or a documented hazard to other road users they must be removed immediately - almost all of the 2000-2016 "HGV vs bike" deaths in London were a direct result of cyclists being crushed against fencing as an example, vs being able to fall/escape onto the footpath when the HGV turned across their path.)

    3. Anonymous Coward
      Anonymous Coward

      Re: Hmm

      If the car can't find the centre divider and clear lane markings, perhaps it should refuse to allow Autopilot?

      Nope. It will lock out the user and home in on the nearest car of The Blue Guys.

      Musk's putinesque/gerasimovisque Plan Of Evil to create chaos and mayhem while watching from one's lair has been revealed early.

    4. Captain Scarlet Silver badge
      Facepalm

      Re: Hmm

      Rename the feature; the name of this feature is the issue.

    5. ridley

      Re: Hmm

      I believe it does.

      Well, sort of: it warns the driver that it does not understand what is going on, buzzes and shakes the wheel, instructing the driver to take over. Very similar to a commercial plane's autopilot. I believe it will not allow itself to be engaged until it is "happy" it can track the surroundings.

      In a similar way to a plane's autopilot the driver should be aware of what is going on around them at all times and they should be prepared to take over at any time.

      Would you be OK with the Pilot and Co-Pilot of a plane both coming back into the cabin for a meal* at the same time, or would you expect them to be monitoring the situation at all times?

      *esp if fish is on the menu.
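[Ed: the buzz-then-shake-then-disengage behaviour described above amounts to an alert-escalation ladder. A minimal sketch of that idea, assuming entirely made-up timing thresholds and level names (real systems tune these against human reaction-time data):

```python
from enum import Enum, auto

class AlertLevel(Enum):
    NONE = auto()
    CHIME = auto()          # audible warning
    WHEEL_SHAKE = auto()    # haptic warning through the wheel
    DISENGAGE = auto()      # hand control back / bring car to safe stop

def takeover_alert(seconds_hands_off, tracking_confident):
    """
    Escalate warnings the longer the driver ignores a takeover request.
    Thresholds are illustrative only.
    """
    if tracking_confident and seconds_hands_off < 10:
        return AlertLevel.NONE
    if seconds_hands_off < 10:
        return AlertLevel.CHIME
    if seconds_hands_off < 25:
        return AlertLevel.WHEEL_SHAKE
    return AlertLevel.DISENGAGE
```

The point of the ladder is exactly the one made above: the system must assume the driver may not be looking, so each stage demands attention through a different channel before giving up.]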

      1. Jellied Eel Silver badge

        Re: Hmm

        Well, sort of: it warns the driver that it does not understand what is going on, buzzes and shakes the wheel, instructing the driver to take over.

        There are a couple of potential problems: having a recognisable alarm tone, and expecting the driver to look up from their phone and notice the steering wheel wibbling quietly. And then assuming the driver can process the alarms and react appropriately to whatever the car was trying to warn them about.

        But a modest proposal. AC shocks can create spasms; DC shocks kinda paralyse the muscles. The driver is sitting only a few inches away from a decent-sized battery, so a couple of spikes up through the seat may be a more effective way to get the driver's attention. A few more spikes and you could probably lock the driver into a braced position.

        It's unfortunate that cars have chosen public vehicles to euthanise themselves or their drivers against, though. Perhaps, as part of US law enforcement's driver-ed programs, they could point out that drivers are sitting on more juice than Old Sparky used in the US prison service?

        1. Alan Brown Silver badge

          Re: Hmm

          "expecting the driver to look up from their phone"

          What part of "the driver isn't supposed to be looking at their fucking phone in the first place" isn't sinking in?

          I'm quite glad that the UK prosecuted a Tesla owner for dangerous driving. He wasn't "in control" of his vehicle and the passenger in an adjacent car was correct to film the antics.

          In the UK the problem with people doing "other things" whilst driving is bad enough that there are dedicated motorway/highway patrols which specifically look into cabins to spot that kind of dangerous activity - and the problem isn't new: it existed long before the rise of computer-augmented driving aids, and thankfully the incidence of this kind of sociopathic behaviour doesn't seem to be rising.

          When I was much younger, court reports used to regularly feature judgements requiring that driving offenders resit their license tests for offences that didn't result in actual bans - particularly for older (as in more than 10 years licensed) drivers. Perhaps this should be a more common occurrence.

          1. Jellied Eel Silver badge

            Re: Hmm

            "expecting the driver to look up from their phone"

            What part of "the driver isn't supposed to be looking at their fucking phone in the first place" isn't sinking in?

            Yup. In this case, the driver was apparently happily texting while the Tesla drove into the back of the police vehicle. So an easy ticket for the officers involved, and hopefully the driver will be prosecuted for texting and the US version of driving without due care and attention. And the US police take a dim view of being driven at, so could have shot the driver and/or car.

            But this is the danger of perceived 'autonomous' vehicles, when driving is probably the most hazardous activity that people do regularly. I totally agree with the comments that driving aids are good, but drivers still need to pay attention to their driving.

      2. Anonymous Coward
        Anonymous Coward

        Re: Hmm

        "Pilot and Co-Pilot of a plane"

        If I may indulge in a minor pedantic moment.

        That terminology is not entirely correct.

        At any one point in time, there is one and only one PIC (Pilot-In-Command, or "Captain" in old-school terminology). They are the person where the buck stops. They are the person in which the law vests ultimate authority over the operation of the flight.

        For the majority of operations it's PF/PNF (Pilot Flying / Pilot Not Flying), whereby PF is responsible for "Aviate" and PNF is responsible for "Navigate", "Communicate", etc. The roles switch round in flight as necessary.

        So for the most part (and in order to foster good multi crew cooperation), the two people at the pointy end are considered equals (hence PF/PNF instead of pilot and co-pilot).

        The seniority aspect of the Pilot-In-Command only comes into play in limited circumstances. For example, where exercising of their legal authority is required, or where a "Captain's only" flight task is required (e.g. landings at certain tricky airfields or certain weather conditions are deemed "Captain's only" by many airlines).

        Finally, I would urge you not to compare a Tesla so-called "autopilot" to the more complex environment of a large commercial aircraft, of which the actual "autopilot" bit is just the tip of the iceberg. Lots of safety and redundancy measures built in at every step that you will never find in a Tesla.

        1. TRT Silver badge

          ... a computer in my car a self-preservation instinct ... a people-preservation instinct.

          KITT versus KARR?

        2. IHateWearingATie
          Thumb Up

          Re: Hmm

          'pedantic moments' are never minor on El Reg's comment boards. They are the sine qua non of a thread.

          Have an upvote :)

      3. Anonymous Coward
        Anonymous Coward

        @ridley - Re: Hmm

        Have an upvote for mentioning the fish on the menu...

      4. kwhitefoot

        Re: Hmm

        I have a 2015 S 70D and I use Autosteer a lot. It frequently refuses to engage because of inadequate lane markings, and it disengages promptly when they disappear, while giving audible and visual alarms.

    6. Eddy Ito Silver badge

      Re: Hmm

      I don't see where a center divider would have changed anything. It's pretty clear from the damage and the black streaks coming from the tires that it was a small-overlap impact which resulted in the Tesla being spun sideways and the cop car being pushed up onto the sidewalk.

      In this instance I'd say that if Autopilot was engaged it likely got confused by the road markings, which had just added a split for the right turn lane. I can see where the computer would be confused as to which lane to choose, and if it chooses a bit too soon there wouldn't be enough clearance for the parked cars - and a nickel says the driver was probably looking at their phone, thinking Autopilot had it under control.

      1. Corporate Scum

        I think you're on the right track

        There are only a couple of sections of the outbound leg that have those side-of-the-road markings for parking, and most of the road HAS a center divider, unlike the photo, so that narrows the location down to just a couple of spots. (You may have seen a more specific location; I'm working off the photo and being fairly local.) It looks like the area by the dog park, but another section does something similar by Broadway.

        This town is a great example of near-total disregard for normal road markings. The road lines don't use the normal width, the dashed yellow lines for the center don't use the normal spacing, most of the lane lines are solid white instead of dashed, and there are huge arrows everywhere. I'm not sure, but I think the parking spaces where the police car was parked are limited to off-peak hours, and as the Twitter thread points out, that section is wackadoodle from a road-planning standpoint. It basically routes the right-hand lane into parked cars.

        So if we find out this is a blind spot for the Tesla's neural net, no one should be surprised. It does not play to the strengths of those systems if they are dealing with a bunch of needlessly unique visual cues that exist in only one place or one set of training data.

        Speaking of which, it would be a great idea if the NTSB built up a shared pile of data from all the crashes of cars equipped with autonomy, whether full self-drive or safety features. Every manufacturer should be using these incidents in their training data and testing. That data should include cases where the driver was in control as well, as the car should be learning to help us humans avoid the mistakes we are prone to.

        1. Anonymous Coward
          Anonymous Coward

          @Corporate scum - Re: I think you're on the right track

          Yeah, but even a child could figure out where the road markings should be, irrespective of line color or width or position. Here in Canada you have several months when road markings are indistinguishable, and we still don't hit firetrucks or police cars.

          So give me sensors, night vision, navigation aids, heads-up display and other assisting tech, and let me drive my car. Come on, it's ridiculous to keep your hands on the wheel and pretend you're driving when it's your autopilot that's doing it.

          1. Alan Brown Silver badge

            Re: @Corporate scum - I think you're on the right track

            "Here in Canada you have several months when road markings are indistinguishable and we still don't hit firetrucks or police cars."

            That's because you have training in such driving conditions.

            It's extremely likely that the kinds of people who let their automated car drive into a parked car would be playing road pinball if exposed to those conditions and forced to drive manually.

      2. John Brown (no body) Silver badge

        Re: Hmm

        "In this instance I'd say that if autopilot was engaged it likely got confused by the road markings which has just added a split for the right turn lane."

        This is one of the reasons I can't see fully autonomous cars working reliably on the roads. There are many, many examples of junctions where, especially in queuing traffic, it's easy to get in the wrong lane. More specifically, where there are two lanes approaching a roundabout (UK, in case it's not obvious) and when you get almost there you finally get to see the road marking, in the gap between you and the car in front, telling you the left lane is for left turners only and you should have been in the right lane to go straight over. Once you solve that problem, you get to the next roundabout, move to the right lane, then discover at the last minute that THIS time the right lane is for right turners and the left lane is for left and straight ahead.

        York outer ring road was notorious for this until they eventually put actual road signs up where the road goes from one to two lanes on the approach. If I, as a driver who does 80,000+ miles per year can get that wrong, there's little hope for an autonomous car.

        Here in the UK, road signage and road markings are pretty much standardised across the whole country. I can only imagine the difficulties and potential confusion in the US, where signage and road markings seem to be at the mercy of local road planners in a far less regulated way, per city, per state, and federal on the main highways.

      3. Stoneshop Silver badge

        Re: Hmm

        if it chooses a bit too soon there wouldn't be enough clearance for the parked cars

        Never mind choosing a lane too soon, at any moment it should detect an object that the Tesla will hit if it doesn't change course, and take action. If possible move over as far as necessary, brake, or both. Which would take precedence over just barging ahead.

        Given that one of the objects a Tesla should reliably detect is cars, in all positions relative to itself, the Tesla clearly miscalculated the police car's position and size. Not detecting a lane divider that starts at a certain point between the main lanes and a slip ramp, and which doesn't have clear markings or sufficient contrast with the road surface around it, I can comprehend. Maybe that one could have been avoided if it had weighed other positional sensor inputs more heavily, maybe not; I don't know. But not judging the correct position of something it will have detected? That's bad.
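[Ed: the precedence argument above - collision avoidance must outrank lane selection - can be sketched with a time-to-collision check. The function, action names and thresholds here are invented for illustration:

```python
def plan_action(closing_speed_ms, gap_m, lane_choice):
    """
    Whatever lane the planner wants, an object we are closing on within
    the braking envelope forces a brake/evade decision first.
    Thresholds are illustrative, not from any real vehicle.
    """
    TTC_BRAKE = 2.0   # seconds-to-collision below which we brake hard
    TTC_WARN = 4.0    # below this, slow down and prepare to evade
    if closing_speed_ms <= 0:      # not closing on the object at all
        return lane_choice
    ttc = gap_m / closing_speed_ms
    if ttc < TTC_BRAKE:
        return "EMERGENCY_BRAKE"
    if ttc < TTC_WARN:
        return "SLOW_AND_EVADE"
    return lane_choice
```

With any structure like this, "barging ahead" into a detected stationary car should be impossible - which is why the failure above is so puzzling.]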

    7. Fungus Bob Silver badge

      Re: Hmm

      "If the car can't find the centre divider and clear lane markings, perhaps it should refuse to allow Autopilot?"

      This is all part of Musk's cunning plan to sell more cars.

  2. vtcodger Silver badge
    Devil

    Teslas don't LIKE firetrucks

    In what seems to be an entirely separate incident from the Culver City crash, a Tesla in South Jordan, Utah, ran into a stopped fire truck a few weeks ago: http://www.newsweek.com/tesla-model-s-crash-car-autopilot-sped-just-utah-firetruck-944251. The only possible conclusion is that Teslae have a deep, instinctual hatred of fire trucks.

    1. Stoneshop Silver badge
      Devil

      Re: Teslas don't LIKE firetrucks

      EV crashes are often attended to by fire engines because of the inherent fire risk of damaged battery cells. So it's fair enough that Teslae develop negative feelings towards fire trucks because of the association with accidents, but the tendency to want to destroy them should really be suppressed. Maybe Tesla can convert an abandoned race track or something into a rehab center where frustrated vehicles can smash up Tonka Toy fire vehicles, or cardboard mockups, with impunity.

    2. BebopWeBop Silver badge
      Happy

      Re: Teslas don't LIKE firetrucks

      Was the fire engine painted yellow?

    3. jmarked

      Re: Teslas don't LIKE firetrucks

      This will be another reason for my uncle not to like EVs. I was helping him install the new skid plate and Smittybilt bumper on his Wrangler diesel when he mentioned how boring these new electric vehicles are to him. Well, he used to work at a fire station.

  3. Lt.Kije

    So, in crashes per mile driven, does Tesla Autopilot have a better record than meatbags?

    The data is out there fer sure.

    1. Brangdon

      The data isn't publicly available. We know that Tesla cars as a whole got safer around the time that autopilot was introduced, but it was introduced at about the same time that automated emergency braking was added, and it may be the latter which is wholly responsible for the improvement and not autopilot. That Tesla have the data and don't release it is regarded as suspicious by some people.

  4. Mark 85 Silver badge

    We have asked Tesla for comment.

    After Musk's rant, it's probable that El Reg will start receiving the "Apple Treatment". But what else should Tesla expect? This is new tech and too many drivers are idiots what with playing with cellphones, etc.

  5. Anonymous Coward
    Anonymous Coward

    It looks like it mistook a big white marking in the road for the central lane.

  6. Boohoo4u

    Drugs

    Laguna Beach streets are a freaking madhouse.

    There are pedestrians and jaywalkers everywhere. It has some of the worst and most confusing streets in the nation. You’d have to be on drugs to enable autopilot there...

    It’s also an extremely wealthy area. Probably a parent bought a Tesla for their stupid kid...

    1. Alan Brown Silver badge

      Re: Drugs

      "There are pedestrians and jaywalkers everywhere"

      In other words it's just like a normal road in most parts of the world.

      "Jaywalking" is a uniquely american concept, from a universe where a century of lobbying by GM and friends has resulted in a situation where cars have been legislated to have more rights on the road than people and pedestrians are legally only allowed to cross at designated points, when signalled to do so.

      It's programming assumptions derived from those road rules (UBER!) that result in pedestrians crossing the road being run over by mechanised killing machines - but the same laws also result in US drivers running over pedestrians at an alarming rate, and frequently treating anyone on the road as a "target".

      1. DropBear Silver badge
        WTF?

        Re: Drugs

        There's nothing "uniquely American" about the concept of pedestrians not simply walking across the street wherever they feel like it; try that shit in Eastern Europe, get fined all the way to oblivion if there's a cop around - as you should be*. There's a good reason the international "pedestrian crossing" traffic sign exists. Sane places use them.

        * of course we still do it when the road is effectively empty far to the left and right - but everyone knows it's unmistakably our ass on the line if we failed to spot a cop** and our fault if anything goes wrong.

        1. Alan Brown Silver badge

          Re: Drugs

          "try that shit in Eastern Europe, get fined all the way to oblivion if there's a cop around"

          Next time that happens to you: Watch where that "fine" actually goes.

          Hint: Not into the authorities' coffers.

          Also: watch who it gets enforced against

          Hint: Not the locals.

          ' The forgotten history of how automakers invented the crime of "jaywalking" '

          https://www.vox.com/2015/1/15/7551873/jaywalking-history

          https://en.wikipedia.org/wiki/Jaywalking

          1. David Nash Silver badge

            Re: Drugs

            The problem with the concept of jaywalking is that it seems to criminalise crossing the road even when it's perfectly safe to do so.

            If you are wandering in the road when there is traffic about, you are a danger to yourself and others, and there may be a case for prosecution. If, on the other hand, you cross the road at a sensible time and perfectly safely, it's a waste of time and money to treat that as wrongdoing - which I have heard stories of many times.

      2. Archivist

        Re: Drugs

        "'Jaywalking' is a uniquely american concept"

        No it's a concept used in many civilised countries to protect humans from machines.

        Heck, I'm sticking up for the US. I'll have to watch myself or next it'll be Microsoft.

  7. Neil Barnes Silver badge
    Holmes

    Think of it

    as evolution in action.

  8. Zog_but_not_the_first Silver badge
    Thumb Down

    Self driving cars?

    Bollocks!

    1. Anonymous Coward
      Anonymous Coward

      Re: Self driving cars?

      A Wild Bull would also be a danger if left unattended.

  9. Anonymous Coward
    Anonymous Coward

    What's the point of having a dog then barking yourself, autopilot seems as much use as an ashtray on a motorbike.

  10. Rebel Science

    The fundamental problem of self-driving cars is deep learning

    Deep learning sucks. Unlike the brain, a deep neural net can only see things it has been trained to detect. IOW, don't wear a Chewbacca costume in front of an autonomous car. Just saying.

    1. BebopWeBop Silver badge

      Re: The fundamental problem of self-driving cars is deep learning

      That surely depends on whether the designer was a Star Wars fan or hater?

    2. Francis Boyle Silver badge

      Re: The fundamental problem of self-driving cars is deep learning

      "Unlike the brain, a deep neural net can only see things it has been trained to detect."

      Nonsense. A neural net can only recognise things it has been trained to detect. Just like you. Can you distinguish a sonnet by Marlowe from one by Shakespeare? But since the detection of objects by autonomous vehicles has nothing to do with neural nets (and everything to do with things like LIDAR) Star Wars fans can safely roam the streets.

  11. JK63

    IMO, the important metrics to consider are:

    Accidents per whatever unit of distance compared to human drivers.

    Fatalities per whatever unit of distance compared to human drivers.

    That's a start of looking at this objectively rather than with an impossible to meet standard of perfection. Ample evidence exists to demonstrate humans are far from perfect as drivers.
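
    For what it's worth, the comparison proposed above is a one-liner once you have the raw counts. A minimal sketch - every figure below is made up purely for illustration, not real Tesla or NHTSA data:

    ```python
    # All numbers here are hypothetical, for illustrating the metric only.
    def rate_per_million_miles(incidents: int, miles: float) -> float:
        """Incidents per million miles driven."""
        return incidents / (miles / 1_000_000)

    # Assumed example inputs: 40 crashes over 200M assisted-driving miles,
    # against an assumed human baseline of 2 crashes per million miles.
    autopilot_rate = rate_per_million_miles(40, 200_000_000)   # 0.2
    human_rate = 2.0

    print(f"Assisted: {autopilot_rate:.2f} crashes per million miles")
    print(f"Relative to human baseline: {autopilot_rate / human_rate:.0%}")
    ```

    The point of normalising per million miles is that raw crash counts are meaningless until both fleets' exposure is accounted for.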

    1. Anonymous Coward
      Anonymous Coward

      I'd add to that line of thinking:

      It's fair to compare accidents/fatalities per mile, but you should also consider the number of times that the safety features engaged successfully and prevented or reduced the severity of an accident.

      People are looking at this from a very negative point of view, only counting the crashes and failures. The big picture looks very different. Musk's recent rants aren't helping people focus on that, but it's important, as it impacts every player, not just Tesla. The basic technology is never going to get to 100% safety. It shouldn't be expected to; in these early generations it just needs to be close to a human driver, and work in a complementary fashion with one.

      Auto accidents are one of the leading causes of death and serious injury, and the number of self-drive incidents is still tiny, even relative to the number of miles driven. So I wish people would stop acting like this is in any way an issue worthy of panic. I'd have bigger concerns if this were being used for unattended vehicles, but not with a driver whose hands are on the wheel.

      1. veti Silver badge

        Re: I'd add to that line of thinking:

        It's fair to compare accidents/fatalities per mile, but you should also consider the number of times that the safety features engaged successfully and prevented or reduced the severity of an accident.

        No, you shouldn't. The reason being, those numbers are already included in the headline "accidents/fatalities per million miles", or whatever number you're looking at.

        The trouble is that if you get a number for "times safety feature engaged", you have nothing to compare that number with. Human drivers don't, typically, make a systematic count of every time they have to brake to avoid crashing into the car in front - and if they did, the answer would be so subjective as to be meaningless anyway. So that number can only, at best, be a distraction.

        We need numbers that can actually be measured with a reasonable degree of certainty and consistency. Number of accidents, and especially number of fatalities, are the only metrics that come close to meeting that requirement.

      2. John Brown (no body) Silver badge

        Re: I'd add to that line of thinking:

        "It shouldn't be expected to, and in these early generations it just needs to be close to a human driver, and work in a complementary fashion with one."

        Except you need to take human nature into account. Although it's still devastating for the family of anyone killed on the roads, when a human is driving it's either an unavoidable accident or there's someone to blame and, hopefully, punish. When it's a "machine" that kills someone, how do we accept an "unavoidable accident" caused by the "infallible computer", or choose who gets the blame and the punishment?

  12. c1ue

    The real problem is likely that the 90% (or higher) of the time that Autopilot does work disarms people's ability to handle the remaining bits.

    This is a serious, architectural problem. If 90% (or even 99.5%) success is accompanied by 10% (or 0.5%) catastrophe, the technology is fundamentally unsafe.

    1. Kevin Johnston

      Fully agree, and people are sick of me banging on about this. Schools now bubble-wrap everything, so children do not learn to evaluate risks - which means they become adults and enter an environment where simple everyday events are trying to kill you (or have you kill yourself).

      The same thing is happening in healthcare. People are so germophobic that children are no longer building a proper immune system, and by overusing anti-bacterials we are weeding out all the weak bugs and just leaving the superbugs.

  13. mr.K
    Holmes

    Wise choice

    "...walk away from the crash uninjured and refused an offer of medical treatment."

    Pro life tip, always refuse medical treatment when uninjured and healthy.

  14. DougS Silver badge

    Not fit for purpose

    This is at least the third time a Tesla on "autopilot" hit something in its path without even slowing. Even the lesser "super cruise" modes available on many luxury cars that provide emergency stop do better.

    I wonder how many cases of this have to happen before the government requires Tesla to disable the autopilot feature in the US? Obviously the warnings they claim to display, to let people know it isn't what most people think of when they hear "autopilot", and their claimed efforts to get people to stay in control of the car when using it, aren't working.

    Will a Tesla on autopilot have to kill someone in another car or a pedestrian before they take action? It is ridiculous that Tesla is allowed to beta test an alpha quality product on public roads.

    1. elgarak1

      Re: Not fit for purpose

      Well, it's not that a car not equipped with "autopilot" or other driver assistance system has ever hit something in its path without slowing down before, now, has it?

      1. elgarak1

        Re: Not fit for purpose

        And I bet there are other car manufacturers out there who do have their own share of crashes with driver assistance systems turned on.

        You don't hear about them because they're not Tesla. They're not juicy enough. They didn't advertise their - sometimes better - systems as aggressively. They don't bundle all their assistance systems into one big "autopilot", whether in advertising, in sales, or in use (in fact, reading the sales brochures of the big German manufacturers, I have a hard time decoding what exactly each system does, or how it compares to Tesla's system, which I understand quite well. They douse the explanation in so much technobabble that it puts Star Trek dialogue to shame).

        That makes them less attractive to report. If the Tesla Autopilot fails, it is a story. If the VW Collision Avoidance System fails, it's a non-story. Because the former is seen by the PRESS (not the majority of Tesla drivers or technical minded people) as a revolutionary type of self-driving future, while the latter comes across to them as just another car part like brakes. Are failing brakes a story? There you go.

        1. Blank Reg

          Re: Not fit for purpose

          Driver aids such as emergency braking work fine in most cases. The key difference is that no one expects to take their hands off the wheel and stop paying attention just because they have these driver aids to help keep them safe.

          Autonomous vehicle makers are pushing exactly that, and that will lead to deaths that could have been avoided.

        2. tfewster Silver badge
          Devil

          Re: Not fit for purpose

          Fair point. The AAA did some testing on other cars and discovered they're far from perfect either, even in "avoidable" accident scenarios.

          But Tesla have brought the bad press on themselves by calling it "Autopilot" and lulling their users into a false sense of security.

          A car? Why, what do you see? --->

        3. Anonymous Coward
          Anonymous Coward

          @elgarak1 - Re: Not fit for purpose

          Do you really believe in collision avoidance systems ? I don't.

        4. Stoneshop Silver badge

          Re: Not fit for purpose

          Are failing brakes a story?

          If brakes on a particular make and model are failing in numbers greater than "just a few, and negligible compared to the total number on the road", then it may well become a story. Especially if those brakes are of a new design with several improved features.

      2. DougS Silver badge

        @elgarak1

        Or should I call you "Tesla apologist"?

        Please tell me which cars have "driver assistance systems" that allow drivers to keep their hands off the wheel and treat it as a self-driving car, with minimal warning (and apparently you can buy third-party devices intended to fool the steering wheel into thinking you are touching it, so you can drive without the annoying warnings).

        No one else is stupid enough to call their system "autopilot", knowing full well that in most people's minds the word autopilot means it can drive itself. And it actually does try, it just does a really shitty job at it and will continue getting in accidents until it kills an innocent bystander and Tesla is sued for $50 million.

        They knew exactly what calling it autopilot would connote in people's minds, and lied to owners that the cars would be upgradeable to level 5 automation when they aren't shipping with the hardware necessary to implement that. Heck, it may be short of the hardware required to even detect a vehicle right in front of it, given that it keeps ramming into stopped vehicles in its path without even slowing.

  15. Anonymous Coward
    Anonymous Coward

    That section of laguna canyon has "clever" road markings

    Laguna Canyon isn't a side road. It is a numbered highway, and parts of it are divided with a center barrier. Parts of it also have an oddly marked "suicide lane" that is a great example of the city planners getting creative with the road markings. Hint: human drivers get confused by non-standard road markings, not just computers. Stop being clever and use the same markings that the rest of the state uses.

    Looking at the photos, it looks like the crash was in a two-lane section, and the police car was a white SUV with blue lettering, not a full black-and-white. It also looks like it was parked by the side of the road, not in the middle of the street (which is a thing that they like to do sometimes). It will be interesting to see if the self-drive had kicked out and the driver didn't notice, or if the car just went wide in the turn and plowed into the corner of the police car.

    1. John Brown (no body) Silver badge

      Re: That section of laguna canyon has "clever" road markings

      "Stop being clever and use the same markings that the rest of the state uses."

      Would that not be undue governmental interference into the rights of the local government who did it? Standardised top down regulation in the USA seems to be seen as some sort of commie pinko plot by certain outspoken people.

  16. Gene Cash Silver badge

    > the ~40,000 people who died in US auto accidents alone in past year get almost no coverage

    He's right. This has always pissed me off personally.

    12 people get shot in a school and it's the end of the world (which it is) but 3,000+ die EACH MONTH and it's completely ignored.

    And the penalties are nil. An old geezer killed someone on a bicycle and got a whopping $80 fine - until the local community revolted and she got three months in jail. Three months, for killing someone!

    1. veti Silver badge

      That's just not true. That 40,000 figure got extensive coverage from, among others, Washington Post, Wall St Journal, USA Today, CNBC, AP, and just about every other mainstream outlet.

      In 2014, $416 billion was spent on maintaining US highways, and that's not including the cost of policing them, or the cost of building new highways, or vehicle inspections, driver education and licensing, or many other related costs. That's over $10 million per death, even excluding some of the largest costs. That's not my idea of "completely ignored".

    2. Anonymous Coward
      Anonymous Coward

      @Gene Cash

      Death of a person is a tragedy. Death of tens of thousands is just statistics.

    3. Chemist

      "12 people get shot in a school and it's the end of the world (which it is) but 3,000+ die EACH MONTH and it's completely ignored."

      Difference is 12 people are shot without any reason WHATSOEVER whilst deaths on the road are one of the risks of everyday life (which we should try and minimize) for which we derive a benefit

    4. ridley

      There could be many reasons why someone knocks over and kills a cyclist, not all of them the car driver's fault.

      So without more information it is impossible to say whether this was far too lenient or a travesty of justice that they received any sentence at all.

  17. doug_bostrom

    "...drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times.."

    Otherwise understood as the "notapilot" feature. Used as intended, it's really doing nothing at all.

  18. Anonymous Coward
    Anonymous Coward

    Unrealistic expectations?

    Does Tesla really believe that U.S. drivers will keep their hands on the steering wheel at all times and actually pay attention to their driving with a feature called: "autopilot"? If so then maybe Tesla is in the wrong business because it's never gonna happen. It's unlikely that Tesla can escape major lawsuits based on their expectations of U.S. drivers.

    1. Francis Boyle Silver badge

      Musk is a geek

      he thinks like a geek and is thus subject to "nerdview" (basically assuming that everyone sees a system as a system and not just as a thing that does something useful). I hate to say it but this is a case where a good marketing person would have come in handy.

    2. Jay Lenovo Silver badge

      Re: Unrealistic expectations?

      Using Autopilot seems to be like chaperoning a beginner driver.

      Maybe everything goes ok, but you constantly need to keep on edge to ensure no hiccups are encountered.

      I don't need that stress (because I care). I'd rather drive myself than take a chance on being lulled to sleep, "driving without driving" with the unlicensed driver that is "AutoPilot".

  19. This post has been deleted by its author

  20. Anonymous Coward
    Anonymous Coward

    Tesla Autopilot is not even capable of AEB ?

    Automatic Emergency Braking (AEB) is where a car's sensors see something coming up (for example, a parked Police Car or a barrier across the lane) and slams on the brakes. It's an increasingly common feature and doesn't appear to be all that difficult to implement.

    It seems clear that Tesla have somehow neglected to include a fully-functioning AEB within the Autopilot system.

    What a big bucket of FAIL.

    Clowns.
