Watchdog growls at Tesla for spilling death crash details: 'Autopilot on, hands off wheel'

The US National Transportation Safety Board (NTSB) has expressed displeasure with electric carmaker Tesla for releasing information relevant to a fatal Model X crash in California last month without alerting the agency beforehand. The NTSB began investigating the killer smash last week. "The NTSB is unhappy because parties to …

  1. stuff and nonesense

    Ok, so Tesla has a $3000 option on top of a $5000 enhancements package. What extra safety features does that package use?

    If none, why would I trust it?

    If extra safety features are programmed into the "upgrade", why are they not in the driver assistance pack?

    Self driving cars have got to have safety first and foremost, the highest levels of protection baked into the base packages. Extra functions to aid driving, which people richer than me can pay extra for, are fine, but safety must be paramount.

    1. Francis Boyle Silver badge

      I'm not sure

      what "extra safety features" have to do with anything. The $3000 option is for the self-driving pack which, unsurprisingly, enables self-driving. It's software that does something useful and you pay for it. Are you arguing that self-driving is so much safer that Tesla are ethically obligated to provide the software for free?

    2. smartermind

      If you can afford to waste money on an overpriced TESLA, you can afford the extra for the self drive option.

    3. Anonymous Coward
      Anonymous Coward

      Self driving cars have got to have safety first and foremost, the highest levels of protection baked into the base packages

      Why? Most other safety technologies started off as extra cost premium features - safety glass, seatbelts, crumple zones, airbags, assisted braking, stability control, antilock brakes - all of those things first came to market on premium vehicles, and were only mandated to ensure that the trickle down of technology was extended to all vehicles.

      Self driving cars will not be perfect, but if they already reduce casualties* compared to meatsacks, then delaying them further to insist on additional levels of safety will cost more lives than it saves, as a matter of simple maths.

      * I'm not convinced by the studies that Tesla trumpet on the 40% reduction in accidents, because they don't look to be properly controlled comparator groups. Which doesn't mean Teslas are not safer, it just requires a more sceptical, rigorous and scientific study. My personal guess is that like for like a Tesla is safer than a peer group car and peer group driver, but only by an unimpressive single digit percentage.

      1. Anonymous Coward
        Anonymous Coward

        Consider your family member is killed by a base model with "rudimentary" safety features.

        If your family member was killed by a car on autopilot that had base-level safety software, and you knew more sophisticated software was in use in a higher spec of car, you would be looking to sue the pants off Tesla or whoever.

        The grounds? Substandard equipment would be a start.

        Safety isn’t about the drivers it’s about the victims.

        1. Anonymous Coward
          Anonymous Coward

          .... you knew more sophisticated software was in use in a higher spec of car. ....

          Yeah, sure, why is an AI different from other things we can have Right Now? Like Better Drivers or well maintained vehicles?

          I'd like to first see someone manage to successfully sue one of the transport companies for running lorries with unqualified / drunk / overworked / drugged drivers, which, unlike AI drivers, are causing carnage at least every month or so when they rear-end some traffic jam because they are sleeping or on their mobile. If one can pin some corporate responsibility on the flesh-bots in their service, then perhaps one could do this with the AI once it exists at some remote point in the future. I would not hold my breath until this happens.

    4. Anonymous Coward
      Anonymous Coward

      What extra safety features does that package use?

      I think it simply bumps the value of a hidden "driver personal survival"-parameter, so that the MI in the autopilot is more likely to prioritise driver survival over that of the proverbial school bus full of nuns and children. Kinda like happened in "I Robot". Moar Dollar, biglyer Bump. But, I could be wrong.

    5. Neil Barnes Silver badge
      Coat

      I like driving myself

      But I'm damned if I'm going to pay three grand extra for the privilege.

    6. Mike Richards
      Joke

      In the event of an accident the $3000 goes towards the deployment of Tesla's autonomous rapid-response lawyers.

  2. The obvious

    Walter had complained to his Tesla dealer...

    "...about how his car swerved unexpectedly several times when passing the area where the accident eventually occurred."

    It never at any time occurred to him that maybe it would be a good idea to drive manually through that area?

    1. Adrian 4 Silver badge

      Re: Walter had complained to his Tesla dealer...

      It doesn't state that the car was wrong to swerve. Just that he didn't expect it to.

      1. Anonymous Coward
        FAIL

        Re: Walter had complained to his Tesla dealer...

        As an engineer, if my car swerves more than twice at the same location on Autopilot, I'd have my fscking hands on the steering wheel while passing through that location.* I really wonder about his state of mind.

        *- "Once is happenstance; twice is coincidence; thrice is enemy action."

      2. Anonymous Coward
        Anonymous Coward

        Re: Walter had complained to his Tesla dealer...

        It doesn't state that the car was wrong to swerve. Just that he didn't expect it to.

        He was an Apple techy. He just thought the AI was holding the wheel wrong.

      3. Anonymous Coward
        Anonymous Coward

        Re: Walter had complained to his Tesla dealer...

        Interesting about this ... when (I think) the Guardian did an article on driving a Tesla to a gîte in the middle of France (to see if it was really possible to use one for a holiday like this - the answer was yes, but with a bit of planning of the recharge stops needed), their article commented that the road markings around the exits on French autoroutes confused the autopilot feature and the car would often start to turn to exit the autoroute before the driver had to intervene and steer it back.

        1. boltar

          Re: Walter had complained to his Tesla dealer...

          "the road markings around the exits on French autoroutes confused the autopilot feature and the car would often start to turn to exit the autoroute before the driver had to intervene and steer it back."

          That's why true autodrive will have to wait until true AI, which can grok the whole world around it and not just markings on the road, comes about. At the moment these systems are little more than glorified line followers with (apparently poor) collision detection systems. Personally I don't see the point - if you have to keep your hands on the wheel most of the time anyway, then how is it any assistance other than for the congenitally lazy? You might as well just turn the wheel yourself. In fact I'd find it more stressful keeping an eye on the automation AND the road ahead than just doing the latter in manual mode.

          1. Neil Barnes Silver badge

            Re: Walter had complained to his Tesla dealer...

            It's certainly going to be interesting when these cars arrive and have to deal with urban areas that are deliberately bereft of markings, designed to cause fleshy drivers to slow down and think...

    2. Adam 1 Silver badge

      Re: Walter had complained to his Tesla dealer...

      What are you stating, the obvious?

    3. smartermind

      Re: Walter had complained to his Tesla dealer...

      That would involve engaging brain in gear.

      But to be fair, what if the same happens in other areas that Walter or other drivers are not aware of? In that context Walter is right to raise the alarm.

  3. Anonymous Coward
    Anonymous Coward

    Autopilot name

    I can't believe that Tesla still insist on using the misleading 'autopilot' name for its driver assist technology.

    1. This post has been deleted by its author

      1. DougS Silver badge

        Re: Autopilot name

        That was a stupid argument when people first made it, and it is a stupid argument today. What a pilot thinks "autopilot" means is 100% irrelevant, because Tesla is not selling cars only to licensed pilots. What the average person thinks autopilot is is the only thing that's relevant.

        It is kind of crazy that Tesla wants to keep a name that is becoming associated with "deathtrap" for the average person at this point.

  4. Nate Amsden

    Wonder why it swerved

    I read many posts in another forum that were harping on how the autopilot is little more than technologies available from other companies under different names, like lane assist, auto braking etc. Someone posted a link to a Google Maps satellite view of where the accident occurred, and the claim is the car somehow got confused, thinking there was another lane, and moved into that "lane" even though it wasn't a lane, which then ran it into a barrier.

    What doesn't make sense to me is why the car would do that; at least, I haven't read mentions of Tesla's (or other companies') tech behaving in a way where they change lanes by themselves (but maybe they do, I don't closely track this stuff). Anyway, if the car's basic function is to stay within the lane, how could it possibly get confused by lines to the left of the car? It should easily see that the line on the left side of the car is a solid white stripe marking the boundary, and on the right side is a dashed line marking the boundary. Sure, there is ANOTHER solid white line past the one closest to the car, and somehow the car thinks it should cross one solid white line to align itself with the next white line it sees. For me it all comes down to the logic involved in deciding to cross a solid white line on a highway, which in at least the last 20 years or so of my driving on the west coast I don't recall there ever being a situation where you can "legally" do that (breakdowns excepted of course).
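The solid-line logic being argued for here can be sketched as a toy rule (purely illustrative, not any vendor's actual code; the function and names are hypothetical):

```python
# Toy sketch of the rule being argued for (hypothetical, not any vendor's
# code): a lane-change planner that refuses to cross a solid marking.

def may_change_lane(boundary, emergency=False):
    """Allow a lane change only across dashed markings, with an exception
    for emergencies (e.g. pulling onto the shoulder during a breakdown)."""
    if emergency:
        return True
    return boundary == "dashed"

print(may_change_lane("dashed"))                 # True
print(may_change_lane("solid"))                  # False
print(may_change_lane("solid", emergency=True))  # True
```

Under a rule like this, the crash scenario (crossing a solid white line to chase a second line further left) would never be a legal plan in the first place.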

    I've driven that stretch of road many times myself, having lived in the bay area from 2011-2016 (and I travel back several times a year; I don't live far away, just far enough for lower cost housing).

    I could probably understand if the weather was really bad, or debris on the road or something to mask the lines, but have seen no claims of anything like that.

    The lane in question was quite straight as well; in the grand scheme of things (all the situations the car could face), it should be a simple situation for the car to stay within the lane on a mostly straight highway with good road conditions and clear weather.

    I recall one time, I think back in 2005, I was driving from Boston to Montreal on a Friday night in Feb or March, in light snow... I have very little experience driving in snow. Anyway, I was in a rental car with my friend and we were in Vermont at the time; they had salted the roads a lot or something, so the roads were not slick (at all) but they were almost completely white. As in, I could not see the lines in the road. Not many cars on the road. My friend said to stop driving like I was drunk, and I wasn't drunk, I'd had no drinks that day. I was driving and trying to follow whatever lines I saw in the road; sometimes I saw the line on the right side of the car, other times only the one on the left (so naturally I drifted to both sides of the lane many, many times as I tracked the lines). No accidents or near misses or anything, but it stuck in my head as a time when I really could not see the lines on the road. I think after we got into Canada it was fine; it was just a couple of stretches of road in Vermont that were particularly scary (I slowed down of course for those bits). One of those bits was directly before the border crossing. Fortunately the cops there had no issue with how I was driving.

    1. Nate Amsden

      Re: Wonder why it swerved

      forgot to mention, on that Montreal trip it was around 10:30-11pm, so light snow, late at night, and the road covered in white.

      1. Daniel 18

        Re: Wonder why it swerved

        Looking for lanes on snowy roads can be fun.

        Even more fun is driving on a flat expanse of white between two distant fences, knowing that somewhere under the snow is a road.

        1. Stoneshop Silver badge

          Re: Wonder why it swerved

          Even more fun is driving on a flat expanse of white between two distant fences, knowing that somewhere under the snow is a road.

          Late at night on a motorcycle, clear night with near full moon, and ground fog up to about half a meter high. On a local road with alternately ditches to the side(s), or fencing with barbed wire. Riding with only the parking light it was possible to make out the road when looking not more than a few meters ahead.

          I don't think I did over 10 km/h that stretch.

        2. TechnicalBen Silver badge

          Re: Wonder why it swerved

          "Looking for lanes on snowy roads can be fun.

          Even more fun is driving on a flat expanse of white between two distant fences, knowing that somewhere under the snow is a road."

          And somewhere is a lake/river/ditch/pond. ;)

        3. TrumpSlurp the Troll

          Re: Wonder why it swerved - white lines and snow

          Some roads in the Moors (Lancashire and Yorkshire) have these tall wooden poles spaced down both sides of the road. Strange, until you realise that in snow they are the only thing visible to show where the road is.

          A bit like the withies along tidal channels in rivers. They show you where the channel is at high tide, when you can't see the mud inches below the surface.

    2. Solmyr ibn Wali Barad

      Re: Wonder why it swerved

      "driving from Boston to Montreal /.../ trying to follow whatever lines I saw in the road"

      Below the 50th parallel then. Bloody southerners and their fancy lines. We have no such luxuries here around the 60th. :-P

  5. Sandtitz Silver badge
    Holmes

    Crash (almost) re-created by another driver

    https://electrek.co/2018/04/02/tesla-fatal-autopilot-crash-recreation/

    "Now another Tesla owner tried to film his Model S following the same lane change scenario on Autopilot in an almost identical section of road in Chicago and it might show exactly what happened during the accident:"

    "We can see the driver ignoring an alert to ‘hold the steering wheel’ sent out a few seconds before the barrier just like Tesla said in its report based on the logs – though that was likely a time-based alert.

    Then it seems like Autopilot’s Autosteer stayed locked on the left line even though it became the right line of the ramp. The system most likely got confused because the line was more clearly marked than the actual left line of the lane.

    That led the car directly into the barrier and it’s easy to see how a driver who is not paying attention couldn’t have been able to react in time since the driver who recreated it was barely able to apply the brake in time himself."

    1. veti Silver badge

      Re: Crash (almost) re-created by another driver

      Which is all well and good, and very interesting in itself...

      But what bothers me is why the autopilot didn't stop, or at least slow, the car when it perceived that it was rapidly approaching a solid obstacle, regardless of "lane markings". What if it had been completely right about the lane, but there had been a stationary car in it? Wouldn't it have stopped?

      Or did it not perceive the obstacle? Because that's a whole other can of worms, but no less wriggly.

      1. Davidcrockett

        Re: Crash (almost) re-created by another driver

        I'd imagine for the same reason that one went under the lorry - couldn't pick it out of the background. The concrete barrier was missing the crash thingybob on the front of it, so could have appeared as a concrete block against a concrete road.

        Oddly my small sprog did something much the same the other day. He crashed his bike into a concrete wall because he couldn't pick it out against a concrete path.

        1. Fred Dibnah

          Re: Crash (almost) re-created by another driver

          "Oddly my small sprog did something much the same the other day. He crashed his bike into a concrete wall because he couldn't pick it out against a concrete path."

          When a car in auto-pilot has about the same driving ability as a small sprog, I don't want it on the road anywhere near me, thanks.

      2. DropBear Silver badge

        Re: Crash (almost) re-created by another driver

        Holy shit, that section of road is murder materialized! I saw the setting of the accident before, so I knew what to look for, and yet watching the "recreation" the first time I realized anything at all was wrong was when the guy dropped the camera and hit the brakes. By the fourth re-watch I could see where the lane split markings were supposed to be, but they're dim as fuck; I would have probably completely missed them the first time around even driving personally. Yes, a fully self-driving vehicle would need to be able to detect that, but I'm not particularly surprised Tesla's glorified lane assist didn't. Crash barrier or no crash barrier, the absolute non-negotiable bare minimum that road needs there at all times is a long string of traffic cones in that "lane". And some actual fucking paint.

    2. Anonymous Coward
      Anonymous Coward

      Re: Crash (almost) re-created by another driver

      Sounds very similar to a comment I made above, taken from a Guardian article a year or so back, which commented that Tesla autopilot had often tried to turn onto exits rather than stay on the autoroute when they'd been driving in France. At least in that case it was just following the wrong bit of road, but it seemed clear that it was recognizing the wrong lines as the lane marking.

      1. NightFox

        Re: Crash (almost) re-created by another driver

        If the autopilot detects and warns for hands-off the wheel after 6 seconds, why doesn't it take further action if the situation isn't then rectified, e.g. by progressively reducing the speed by a safe rate?

        1. JohnG

          Re: Crash (almost) re-created by another driver

          "If the autopilot detects and warns for hands-off the wheel after 6 seconds, why doesn't it take further action if the situation isn't then rectified, e.g. by progressively reducing the speed by a safe rate?"

          It does - but it first issues a couple of visual warnings, followed by an audible warning. After that, it slows and looks to pull off the road and stop.
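The escalation sequence described here could be sketched roughly as follows (a toy model; the timings and stage names are invented for illustration, and Tesla's real thresholds are not public):

```python
# Toy sketch of a hands-off-wheel escalation ladder (timings and actions
# are made up; this is not Tesla's actual implementation).

def escalation_stage(seconds_hands_off):
    """Map time with hands off the wheel to a warning/response stage."""
    if seconds_hands_off < 6:
        return "normal driving"
    elif seconds_hands_off < 12:
        return "visual warning"
    elif seconds_hands_off < 18:
        return "audible warning"
    else:
        return "slow and pull over"

# Walk through an inattentive driver's timeline.
for t in (0, 7, 13, 25):
    print(t, escalation_stage(t))
```

The point of such a ladder is that ignoring a warning always leads to a stronger one, ending in the car taking itself out of harm's way rather than carrying on at speed.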

      2. tiggity Silver badge

        Re: Crash (almost) re-created by another driver

        Hope nobody uses them on some of the roundabouts near me - the lane markings are confusing enough for people, a Tesla would have no chance (at best would take the wrong exit or end up circling round and round forlornly)

        1. ridley

          Re: Crash (almost) re-created by another driver

          Don't be daft, Tesla is from Yank land, and they don't believe in roundabouts, much less know how to negotiate them.

          Who knows what a Tesla would make of the magic roundabout https://www.youtube.com/watch?v=6OGvj7GZSIo

    3. MrXavia

      Re: Crash (almost) re-created by another driver

      That video is scary!

      surely the car can see the barrier ahead and should know it can't drive through it?

      1. Aitor 1

        Re: Crash (almost) re-created by another driver

        That road is very badly signed, and is very dangerous.

        Still, the system should have picked it up... but c'mon, who designed that, and who does the maintenance on it? Put up some green/yellow plastic with reflectives!!

    4. Anonymous Coward
      Anonymous Coward

      Tesla's "autopilot" is broken by design: LIDAR missing

      The following videos show that Tesla's autopilot is deadly inadequate, merely an overhyped lane assistant:

      https://www.youtube.com/watch?v=VVJSjeHDvfY

      https://www.youtube.com/watch?v=KEaJW-Mk0Bo

      https://www.youtube.com/watch?v=kDgJnlr6Ak8

      Autonomous cars need a LIDAR. Tesla has none, and their stereo camera plus radar doesn't detect road conditions reliably even in the best weather. How would it ever cope with snowy weather in the Alps? Musk should be ordered to pay a hefty fine for doing very misleading PR work, constantly overstating the car's capabilities and intentionally naming it "autopilot" while it is merely a half-blind lane assistant. Musk is teasing us; stop him. His greed in not adding a LIDAR means a heavy road toll: real people die.

  6. EveryTime Silver badge

    Until recently Tesla was using the Mobileye vision system for their Autopilot software. This system has several features, such as sign recognition, obstruction/vehicle/pedestrian recognition and lane tracking. The one at issue here is the lane tracking.

    Lane tracking is a very treacherous capability. It's quite easy to develop a vision system that appears to work. It's easy with freshly painted lane markers and no exits or merges. Complexity and special cases quickly pile up with right hand exits and entrance merges. Quick fixes such as always following the left-side lane line come back to bite you when encountering left exits and lane merges. Roadways with faded and old ground-away lane markings that are confusing to human drivers are even worse for lane-following vision systems.

    Tesla mitigated this by requiring each section of highway to be successfully observed several times before enabling Autopilot. I suspect that here we are seeing the weakness of that approach: the roadway marking degraded below a threshold and the lane tracking made a bad decision. This has probably happened thousands of times before, with the harmless result of taking the wrong exit or driving in a breakdown lane. Here it had the fatally bad result of driving into a barrier centered in the 'false lane'.
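The failure mode described above, a lane tracker locking onto the most clearly painted line even when that line belongs to the exit ramp, can be sketched as a toy model (purely illustrative; Tesla's actual algorithm is not public and all numbers here are made up):

```python
# Toy illustration (NOT Tesla's algorithm): a lane follower that keeps a
# fixed offset from whichever painted line it detects most confidently.
# Offsets are lateral positions in metres relative to the car's centre.

def pick_reference_line(candidates):
    """Choose the candidate line with the highest detection confidence."""
    return max(candidates, key=lambda line: line["confidence"])

def steering_target(candidates, lane_half_width=2.0):
    """Aim for a point lane_half_width to the right of the chosen line."""
    ref = pick_reference_line(candidates)
    return ref["offset"] + lane_half_width

# Normal stretch: both lane lines are crisp, so the car stays centred.
normal = [
    {"offset": -2.0, "confidence": 0.9},  # left line of the through lane
    {"offset": +2.0, "confidence": 0.8},  # right line of the through lane
]
print(steering_target(normal))  # 0.0 -> centred in the lane

# Gore point: the faded left line of the through lane competes with the
# freshly painted right line of the exit ramp, two metres further left.
gore = [
    {"offset": -2.0, "confidence": 0.3},  # faded through-lane line
    {"offset": -4.0, "confidence": 0.9},  # crisp ramp line
]
print(steering_target(gore))  # -2.0 -> the car drifts left into the gore
```

In this sketch nothing is "broken" in the tracker itself; a perfectly reasonable heuristic (trust the clearest line) produces a lethal output the moment the paint maintenance degrades, which is exactly the threshold effect suggested above.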

    1. Anonymous Coward
      Anonymous Coward

      "Lane tracking is a very treacherous capability. "

      In my neck of the woods there are streets without lane markings. They are painted on, wear off and are never repainted.

      1. David 164

        I think the whole idea of relying on anything that needs to be maintained by local authorities simply won't work for automated cars.

        Because they are pretty much universally shite when it comes to maintaining any road that isn't outside their offices, or doesn't just happen to be a road a councillor lives on!

      2. Commswonk Silver badge

        In my neck of the woods there are streets without lane markings. They are painted on, wear off and are never repainted.

        At the other end of the spectrum I know several roundabouts (am I right in thinking these are unknown in the US?) where there are myriad white lines, sometimes crossing live lanes, and single lanes splitting into two and all sorts of confusion for a human driver, especially one with possibly limited knowledge of the area.

        And others where the designer appears to have settled on a bizarre layout just to see if he can get away with it.

        Just the sort of things to make a (semi) autonomous car think "I want to go home".

        1. IceC0ld Bronze badge

          At the other end of the spectrum I know several roundabouts (am I right in thinking these are unknown in the US?) where there are myriad white lines, sometimes crossing live lanes, and single lanes splitting into two and all sorts of confusion for a human driver, especially one with possibly limited knowledge of the area.

          not to mention THAT 'magic' roundabout in Hemel Hempstead, single roundabout, surrounded by six other roundabouts, even the locals have issues ffs

          1. Neil Barnes Silver badge

            even the locals have issues ffs (Hemel Hempstead)

            Only those seeing it for the first time. Once you look at it, it's obvious it's a series of short carriageways joined by roundabouts; two lanes on the anticlockwise direction and one on the clockwise. For every combination of in and out, there's a correct lane to enter, and a correct lane to approach a roundabout to exit.

            It's not difficult.

            Mind you, where the A404 meets the A1... that's got clearly marked lanes but people seem to be completely incapable of staying within them.

        2. Mark 85 Silver badge

          (am I right in thinking these are unknown in the US?)

          They're not unknown as such but there's not very many of them. Some people (most?) will never see one in their lifetime.

        3. Hans 1 Silver badge
          Paris Hilton

          In France, you have two types of roundabouts, some where any entering vehicle has right of way and others where entering vehicles need to give way. Some are a mixture of both where some entrances have right of way, others do not. The town where I took my driving test was full of roundabouts with priority to entering cars ...

          Of course, right of way means there are no line markings; cars coming from the right simply have right of way to enter as they see fit ... how are automated cars going to cope with that? Considering that giving entering cars right of way is brain-dead (a potential deadlock), most of these are now being converted to conventional roundabouts, but still ... how on earth is an automated car gonna get that right?

          Rush hour at the place de l'étoile in Paris: the automated car will wait until rush hour is over before entering the roundabout ...

          icon: Paris, coz ... place de l'étoile ...

        4. wx666z

          Roundabouts

          Roundabouts were unknown in my part of the U.S. until a couple of years ago. Then the fascist city council dropped one on University Ave. at an interchange with Arlington Expressway. Much confusion and traffic jams ensued. The confusion continues to this day. I prefer the old 4-way stop junctions.

          1. Dave 126 Silver badge

            Re: Roundabouts

            As a human driver I had no problem when I encountered the Magic Roundabout in Swindon for the first time (for the uninitiated: a circle orbited by five or six smaller circles), because the lane markings are clear and standard roundabout logic applies (which is actually just normal road logic). As long as you follow the logic, it's fine - so I don't get the fuss made about it!

            The roundabouts that cause confusion are the ones where the lanes disappear or cross each other, so staying in lane will bring you into the path of other vehicles. The St Paul's roundabout on the M32 is one such ratbag.

            1. TechnicalBen Silver badge

              Re: The St Paul's roundabout

              They revamped a really busy roundabout around here like that. :(

              Near enough half the time I either end up in the wrong lane, get blocked in a moving lane, or dangerously have to go across a lane...

              Who on earth thought it was a good idea to mix a crossing with traffic lights AND a roundabout? AND have lanes crossing each other with conflicting directions of traffic? AND have lanes that tell you to move into the wrong position for the wrong turning (for example, the St Paul's roundabout: try finding the correct lane for a 360 turn ;) )?

              I see no problem with a crossing, with lights, or with a roundabout. In fact, lights for access to the roundabout are fine, but lights on it? Wow.

              Yes, it can work for on ramps/motorway access, where it makes a flyover etc... however, my local example, unlike the St Paul's roundabout... is a mini roundabout too!!!

            2. I am the liquor

              Re: Roundabouts

              The main problem I had when I first encountered the Swindon magic roundabout was seeing where I was going through the tears of laughter. Surely the greatest practical joke ever.

            3. fajensen Silver badge
              Pint

              Re: Roundabouts

              In Abingdon there is a notorious double roundabout, where the round things are merged into some kind of peanut shaped blob.

              Notorious because nobody, even after many years of daily practice, can figure out how to navigate that thing correctly - and yet I know of no accidents there, precisely because nobody can figure it out and therefore everyone navigates it very carefully, looking in all directions.

              A simple tee-junction onto Culham road, however -> carnage about once every week!

    2. Boris the Cockroach Silver badge

      Watching

      the vid, it clearly tracks the solid line until the driver brakes...

      In real life on this side of the pond I've seen drivers go straight into the divider without help from an 'autopilot' simply because they've not been paying attention

      And that I suspect is the case.... and that is what Tesla should look at: rather than calling it an 'autopilot' they should call it a driver assist, and ensure that all drivers of cars equipped with it are reminded that it doesn't absolve you of the responsibility of actually driving the thing

      adding a warning bell with a loud voice saying "WARNING: DRIVE ASSIST WILL DISENGAGE" would help, along with a pair of dead man switches on the back of the steering wheel so you keep your hands on the wheel at all times....... mind you the last would help every other car driver too...

      1. Stoneshop Silver badge
        Facepalm

        Re: Watching

        In real life on this side of the pond I've seen drivers go straight into the divider

        Or, probably with less injury unless it's to someone's funny bone: http://media.dumpert.nl/foto/cb098dcd_Bmw_vangrail.JPG (the guardrail starts at ground level). I've seen cars 'parked' like that a few times myself.

        1. DougS Silver badge

          Roundabouts in the US

          Where I live they added the first one about 20 years ago and there are now probably a dozen in the metro area. They seem to like adding them near schools. The most recent is the largest one they've done; while the four roads feeding it are only single lane, one is fairly high volume and another will be growing fast, since there will be a lot of development nearby now that a new high school has just opened.

          Not exactly like the huge ones in the UK, but I'd hate to think what would happen if we tried to put a roundabout in the US with busy multi lane roads feeding it. They are mostly unknown to too many people for that to end well!

          1. The Indomitable Gall

            Re: Roundabouts in the US

            " They seem to like adding them near schools. "

            That's quite logical -- roundabouts force drivers to slow down, whereas you can fly over a crossroads at full pelt when the light's on green.

            Unless you're suggesting that the only solution to a bad man with a speeding car is a good man with a speeding car, and that all schools should have a NASCAR-trained marshal to nudge speeding drivers out of the way of kids, I think roundabouts near schools are eminently sensible.

    3. DougS Silver badge

      Most roads have no lane markings, so no autonomous car should rely on their presence. Often during construction the old lane markings are partially removed, but are still somewhat visible but if they were followed you'd be directed into a barrier or worse!

    4. Anonymous Coward
      Anonymous Coward

      "Roadways with faded and old ground-away lane markings that are confusing to human drivers"

      No they're not.

  7. fnusnu

    Known issue

    The aviation industry has years of experience with the autopilot handing back control and the humans being unable to cope with the situation (one so complex the computer couldn't cope either...)

    1. TechnicalBen Silver badge

      Re: Known issue

      https://en.wikipedia.org/wiki/Air_France_Flight_447

      In most cases, as far as I can tell, it stems from the pilots not actually following procedure, or not taking note of the actual data. It's not so much a failure of the autopilot, which in most cases is really a "heading hold" device of sorts.

      While arguably there was a minor fault in the above example, there is no reason it should have caused any difficulty for a pilot; in fact two of the three were taking the right action (though evidently, as said, not responding to the stall warning/data that was available).

  8. ST Silver badge
    Stop

    When is an Autopilot not an Autopilot?

    Tesla: Full Self-Driving Hardware on All Cars.

    Tesla: Full Self-Driving Capability.

    The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver's seat. [ ... ] All you will need to do is get in and tell your car where to go.

    Where does "but keep your hands on the wheel at all times and drive your car manually just like a Chevy" figure into Tesla's assertion about their Autopilot's capabilities?

    Either it's an Autopilot that can drive the car with no action required by the person in the driver's seat -- Tesla's own words -- or it isn't.

    They can't have it both ways.

    It's an Autopilot if you don't crash and die. If you crash and/or die, then it's not an Autopilot. You should have kept your hands on the wheel. Because we beeped you.

    1. eldakka Silver badge

      Re: When is an Autopilot not an Autopilot?

      > Either it's an Autopilot that can drive the car with no action required by the person in the driver's seat -- Tesla's own words -- or it isn't.

      "Autopilot" facility is exactly as described, it requires the driver to keep their hands on the wheel at all times.

      The Full Self-Driving Capability is not Autopilot. It is an upgrade to Autopilot, therefore it is in no way a conflict to then state that after upgrading Autopilot to the Full Self-Driving Capability then the text you quoted is applicable:

      The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver's seat. [ ... ] All you will need to do is get in and tell your car where to go.

      Therefore you are conflating the capabilities of two different systems (although one is built on top of the other, as an enhancement):

      1) Autopilot ($5k option) - requires the driver to keep hands on the wheel at all times; and

      2) Full Self-Driving Capability ($3k option on top of $5k Autopilot) - this is an add-on/upgrade to autopilot, that has more sophisticated capabilities, being "able to conduct short and long distance trips with no action required by the person in the driver's seat".

      1. ST Silver badge
        Mushroom

        Re: When is an Autopilot not an Autopilot?

        > "Autopilot" facility is exactly as described, it requires the driver to keep their hands on the wheel at all times.

        Thanks for the self-serving damage-control bullshit.

        Good luck in Court.

    2. Snorlax
      Facepalm

      Re: When is an Autopilot not an Autopilot?

      @ST:"Either it's an Autopilot that can drive the car with no action required by the person in the driver's seat -- Tesla's own words -- or it isn't."

      The Model S owner's manual states:

      "Warning: Autosteer is a hands-on feature. You must keep your hands on the steering wheel at all times.

      Warning: Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver. When using Autosteer, hold the steering wheel and be mindful of road conditions and surrounding traffic.

      Do not use Autosteer on city streets, in construction zones, or in areas where bicyclists or pedestrians may be present. Never depend on Autosteer to determine an appropriate driving path. Always be prepared to take immediate action. Failure to follow these instructions could cause damage, serious injury or death."

      ...

      "Autosteer is intended for use only by a fully attentive driver on freeways and highways where access is limited by entry and exit ramps"

  9. a well wisher

    Shows the need for that large spike on the steering wheel to be reconsidered to remind drivers whose bum is going to get bitten if things get a little bit out of shape

    1. Anonymous Coward
      Stop

      I see this argument all the time. Odd that now cars are safer and have things such as anti-lock brakes, seat belts, traction control, crumple zones, airbags and so on, death rates have fallen despite rising vehicle numbers; surely, by your logic, they should have risen.

      https://en.wikipedia.org/wiki/Reported_Road_Casualties_Great_Britain#/media/File:Killed_on_British_Roads.png

  10. Anonymous Coward
    Anonymous Coward

    Damage control mode

    Telling drivers they must keep their hands on the wheel of a vehicle with an "autopilot" mode is just wishful thinking; they won't buy the system if they must keep their hands on the wheel. What's of more concern is why the vehicle did not automatically brake well before hitting the concrete barrier. You certainly don't offer "autopilot" without automatic braking for obstacles that appear in front of the vehicle. Clearly this tech is not suitable for public use at this time. Tesla and friends should be held accountable for the defective operation and the fatality. How many other folks desire to be test dummies?

    1. John 209
      Unhappy

      Re: Damage control mode

      "How many other folks desire to be test dummies?" ...and how many of the rest of us want to be involuntarily put in harm's way?

      1. Ken Hagan Gold badge

        Re: Damage control mode

        "and how many of the rest of us want to be involuntarily put in harm's way?"

        I can't believe anyone has downvoted this point. If it was your loved one that got mown down by a computer system with a deliberately misleading name, would you just shrug your shoulders and say "C'est la vie", or "fin de la vie" in this case? (I see no reason to mince words here: the two syllables "auto" and "pilot" are (i) already used in the aviation industry to refer to a fully autonomous system with a good safety record, and (ii) clearly used in this instance for marketing reasons.)

        Tesla are selling something that sounds like a fully autonomous system but which will actually kill people if used that way. Adding some fucking small print in the instruction manual does not let them off the hook.

        1. DropBear Silver badge

          Re: Damage control mode

          What they should have done was reserve the name "Autopilot" for their fully autonomous driving tech (if they do indeed have that capability) and call their non-autonomous assist features something more reasonable and intuitive like "CO-pilot".

        2. Anonymous Coward
          Anonymous Coward

          Re: Damage control mode

          ""and how many of the rest of us want to be involuntarily put in harm's way?"

          I can't believe anyone has downvoted this point. If it was your loved one that got mown down by a computer system"

          As always corporate PR drones and naive die-hard fanboys.

          As you said, they would behave differently if their loved ones were harmed by such overhyped, overpriced, under-performing cars with mere lane assistants.

          Worst case: what would happen if one of Elon's young children got mown down by a Tesla car in "autopilot" mode? Possible outcomes: A) renaming of the "autopilot" feature to "lane assistant", B) a recall of all Tesla cars, C) shutting down the Tesla company in full mourning and grief. Or why not double down on PR bullshit bingo, or level up by adding a LIDAR sensor?

  11. Anonymous Coward
    Anonymous Coward

    How on Earth would it work in winter?

    That video...

    This AV technology, presently, is crap. AI = Artificial Imbecile. Following painted lines? Based on what? An op-amp and a 555 timer chip? Crikey.

    God help Tesla if they ever get their production cranked up. The US legal system is going to chew it up and spit it out. Too much concentrated liability.

    1. JeffyPoooh Silver badge
      Pint

      Selecting Lane 3 of 2

      Regarding the video of the Tesla being confused by painted lines...

      The car obviously has GPS navigation, which in turn has maps. Even an $89 GPS can know that the left-hand route option has two lanes. But the Autopilot seems to have selected the third (rightmost) of the *two* lanes.

      WTF? ARE YOU TELLING ME THAT IT IS MAP-UNAWARE? It's not integrated with basic maps? It doesn't keep track of lanes?

      Sellers on eBay sell cheap toy cars that will follow a line on the floor. I'd expect a more sophisticated system design, considering it's life and death.

      Am I missing something?

      1. bazza Silver badge

        Re: Selecting Lane 3 of 2

        @JeffyPoooh,

        Given that GPS isn't accurate enough to pinpoint which lane a car is in, map awareness isn't going to help much. Watch how a current SatNav does "road snapping", i.e. it adjusts the car's GPS fix to a position that aligns with a road. You can see this in action when you don't take a motorway exit slip road the SatNav is expecting you to: it can take quite a while for it to recognise that you're still on the motorway.
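
        As a toy illustration of that snapping behaviour (a simplified Python sketch, not any real SatNav's code; coordinates are treated as flat metres for simplicity):

```python
import math

def closest_point_on_segment(p, a, b):
    """Project point p onto segment a-b, clamped to the segment's ends."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0:
        return a
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return (ax + t * dx, ay + t * dy)

def snap_to_road(p, roads):
    """Snap a noisy GPS fix to the nearest point on any road segment."""
    best, best_dist = None, math.inf
    for a, b in roads:
        q = closest_point_on_segment(p, a, b)
        d = math.dist(p, q)
        if d < best_dist:
            best, best_dist = q, d
    return best

# Two parallel carriageways 10 m apart; a fix 4 m off the first one
roads = [((0, 0), (100, 0)), ((0, 10), (100, 10))]
print(snap_to_road((50, 4), roads))  # (50.0, 0.0)
```

        A GPS error of just a few metres is enough to snap the fix onto the wrong carriageway, which is why snapping can't tell you which lane (or even which parallel road) you're actually in.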

        It seems that a single radar sensor and video camera is also insufficient for lane tracking.

        Humans are very good at interpreting the scene in front of them, though some are worse at it than others (witness all those who blindly follow their sat nav into rivers, onto train tracks, etc, despite that obviously being a bad idea). All humans are very good at recognising immediate danger.

        I can't see AI systems matching that ability any time soon. For me it's level 5 autonomy, or nothing more elaborate than adaptive cruise control (and I worry about that). Levels 3 and 4 are going to be too dangerous for inattentive humans to be trusted with.

        1. wallyhall

          Re: Selecting Lane 3 of 2

          > I can't see AI systems matching that ability any time soon. For me it's level 5 autonomy, or nothing more elaborate than adaptive cruise control (and I worry about that). Levels 3 and 4 are going to be too dangerous for inattentive humans to be trusted with.

          (Emphasis added.)

          Absolutely agree. That's my feeling, and I believe several "big names" in the industry have stated it too (although I can't remember them to cite here). For as long as people are driving cars which are *not* fully autonomous (level 5, or whatever you want to call it) but which have features making the drivers "feel" like it's at all autonomous, it's deadly.

          Humans are very fickle and often very small-minded and short-sighted; we'll promptly switch off (get distracted) and let an incapable shadow of level 5 assume "full control", which will continue to result in deaths (which, in fairness to Tesla and the various other manufacturers, doesn't happen all that often, all things considered).

          I say all that fully admitting that I've had at least one "near miss" while using standard cruise control in my Mazda 3, where my brain switched off that slight bit "too much" while cruising along the M11 and someone tapped their brakes 8 cars ahead... I sincerely doubt I'd have reacted any differently to this guy in his situation, with only myself to blame.

      2. JeffyPoooh Silver badge
        Pint

        Re: Selecting Lane 3 of 2

        Please don't be distracted by the "GPS".

        I was referring to the ***MAPS*** (built into the car's plain old Navigation System).

        Maps that would show TWO LANES.

        So the Autopilot selected Lane #3.

        This seems to reveal a MASSIVE design flaw. It seems to be map-unaware, even though the maps are already onboard.

        WTF!!

        1. Deckard_C

          Re: Selecting Lane 3 of 2

          I don't think it was selecting lane 3 of the left-hand route but lane 1 of the right-hand route, and it was possibly not aware it had got to the point where the routes diverge, as GPS isn't that accurate.

  12. rmullen0

    Don't be naive

    Is anyone naive enough to think that these car companies are going to be able to create self-driving cars that work reliably? They run on software and hardware that is programmed. Programs have bugs in them. That is a fact. Look how many times you have to patch your computer. It is nothing but a constant stream of bug fixes. I don't trust these self-driving cars at all. And the IDIOTS in the government and at these corporations are perfectly fine with using unsuspecting bystanders as guinea pigs. How about the morons at the car companies first focus on building efficient vehicles and improving battery technology? Maybe they could take time away from lobbying the government to weaken environmental laws and emissions standards, and spend it making vehicles that are less damaging to the planet. Also, said IDIOTS are making these vehicles internet-enabled. I'm sure no one will hack them. Like what happened to Michael Hastings when he went up against the deep state.

    1. JeffyPoooh Silver badge
      Pint

      Re: Don't be naive

      RM0 asked, "Is anyone naive enough...?"

      Yes.

      1. bazza Silver badge

        Re: Don't be naive

        And when the investors who are funding all this work out that nothing is going to come from all this research, there are going to be repercussions. It will become difficult to get good ideas funded as the money men stop trusting the techies. Again.

        Couple that with Facebook and other social media companies likely losing profits as the full repercussions of Cambridge Analytica-gate sink in (regulations, laws, compensation, ad-boycott campaigns), and you can see a whole lot of investors getting badly burned over the next few years.

        There's going to be a recession in the tech business. The money is going to dry up.

        What investors want is steady, reliable returns on their investment. That's where true shareholder value comes from. The current crop of tech mega-giants don't look like offering that. The "boring" companies like Apple and Microsoft do, because they actually have something real to sell.

    2. Dave 126 Silver badge

      Re: Don't be naive

      > I don't trust these self driving cars at all. And the IDIOTS in the government and at these corporations

      So the OP doesn't trust self-driving cars, but he doesn't trust human beings either. Neither is an invalid view in itself, but it would seem the pragmatic approach is to work out which to mistrust the least, or not go near a road. :)

    3. wallyhall

      Re: Don't be naive

      > Is anyone naive enough to think that these car companies are going to be able to create self driving cars that work reliably?

      I can't comment (and am not commenting) on the rest of your post.

      Regarding "is anyone naive enough" - yes: I am.

      I'm a software / hardware engineer, I studied robotics and computer vision (and various AI including natural language engineering) at university - and in the space of a few months we had things working remarkably well in a very academic environment.

      As the various sayings go, 90% of the effort and duration is the first 90% of the project. The *other* 90% of the effort and duration is the last 10% of the project.

      I'm putting aside concerns (which I share) about the software getting hacked or having severe bugs (like it suddenly overflows an integer and the car takes an immediate left turn); I'm just talking about the capability of software driving. I can see it getting there. If Google can identify an image of a dog as a dog 9 times out of 10, and if my £20-a-month phone handset can overlay a 3D image of a frog's anatomy on the desk in front of me, I sure as heck expect a car to be able to safely navigate a road, with or without lane markings.

      Three things shake me up about this story:

      1. The car's sensors (or the interpretation of their data) didn't recognise a stationary block of concrete in front of it. That's a catastrophic failure, and they *must* resolve it. I don't care if it's called Autopilot or "smart brake support" (I believe that's Mazda's branding) or anything else; it terrifies me that the systems didn't see it. I am, however, willing to give them the benefit of the doubt and say that's just part of the elusive last 10%. Which brings me onto point 2:

      2. We're living in the most dangerous period of autonomous cars (IMHO). It's the period where cars are *almost* capable of doing something interesting (like driving me up the road as well as I can) but not capable of doing it without a human overseeing it (and being ready to take control immediately). For this, I cite Uber's recent news. Perhaps the problem is made worse by the naming of the technology (Autopilot in this case), but I'll readily admit I've had a near miss on the M11 using standard cruise control in my Mazda 3. A driver 8 cars ahead tapped his brakes; I had relaxed my concentration just that little too much, from looking 8 cars ahead to gazing at the car immediately in front, when I found myself with half the time to react.

      3. That crash barrier/crumple zone should have been repaired/replaced 10 days earlier, or a temporary speed limit should have been put in place (say 30 mph). Seriously, whoever is responsible for ensuring that a car hitting that barrier at the legal speed does not result in a death has to bear some of the responsibility here (IMHO).

      Anyway, rant over. I'm impressed by Tesla, I'm fascinated by SpaceX, I thoroughly enjoy Elon's enthusiasm. I'm not trying to be a fanboi. I hope I've been sufficiently objective in my words above.

      1. TechnicalBen Silver badge
        Facepalm

        Re: Don't be naive

        I studied art. I predicted they could never beat the uncanny valley in CGI. I'm not far off. ;)

        The 90% got better. The other 10% got smaller. But there is still a diminishing % of failure in the systems. There will always be errors or "tells" they cannot remove, but it can get "good enough" that we stop caring when watching a film. Is it the same with all progress?

        Brakes, engines, crumple zones, seat belts... none of these work 100% of the time. But it comes down to the intent and direction of the development: me making a hammer and not bothering to screw down the handle, versus me making a hammer where 0.01% have manufacturing defects and the handle snaps.

        1. rmullen0

          Re: Don't be naive

          I don't have a problem with the companies trying to get this to work. However, I do have a problem with them already deploying it to the public when it is not ready for prime time. They aren't just putting the drivers of the vehicles at risk; they are putting everyone else at risk as well. Like the lady walking her bike across the street who got run over.

      2. Adam 1 Silver badge

        Re: Don't be naive

        @wally, I agree with 90% of your post, it's just the other 90% where we differ ;p

        > Seriously, whoever is responsible for ensuring that a car hitting that barrier at the legal speed should not result in a death

        Newton has a thing or two to say about such a possibility. Kinetic energy follows a square relationship to velocity.

        i.e. K = 0.5 × m × v²

        What that means in practice is that a car doing 120 km/h must shed 4 times the energy it would have at 60 km/h (or 16 times the energy of a 30 km/h crash).

        At highway speeds, the barrier's main goal is to control the direction of the collision so you are less likely to be torpedoed into another vehicle (especially head on). With that much energy to absorb head on, the sheer force of your brain mass hitting your skull is likely to be fatal, even if the barrier, crumple zones, airbags, pretensioners, etc all perform perfectly. For perspective, the EuroNCAP frontal test is at 64 km/h. Take a look at one of the better performers in that test, then try to picture it with roughly 3.5 times the crash energy.
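
        The square law is easy to check with a few lines of Python (the 2,100 kg mass is just an illustrative figure; it cancels out of the ratios anyway):

```python
def kinetic_energy(mass_kg, speed_kmh):
    """Kinetic energy in joules: K = 0.5 * m * v^2, with v in m/s."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return 0.5 * mass_kg * v ** 2

print(kinetic_energy(2100, 120) / kinetic_energy(2100, 60))  # 4.0
print(kinetic_energy(2100, 120) / kinetic_energy(2100, 30))  # 16.0
print(kinetic_energy(2100, 120) / kinetic_energy(2100, 64))  # roughly 3.5
```

        Because the mass cancels, the ratios depend on speed alone: doubling the speed quadruples the energy the barrier and crumple zones have to absorb.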

        But I totally agree that replacing safety barriers after a collision must be a priority. I also share the big concern over why the sensors failed to detect the obstacle even if the system got confused by the lane markings, or, if it did see the obstacle, why it didn't appear to attempt to avoid it.

  13. Disk0
    Megaphone

    Here's an idea.

    1: Stop any music playing, turn on all interior and exterior lights (especially hazard lights), activate the horn and have the car pull over if the driver lets go of the steering wheel for more than, let's say, two seconds. This also applies to cruise control: no hands, no drive. It's simple really.

    Drivers will soon learn not to make themselves look like the tools they actually are.

    On an unrelated note, when hitchhiking on trucks back in the day, I learned that some truck drivers reverse the whole procedure: they wouldn't touch the steering wheel unless it was absolutely necessary.

    One of them just got out of the driver's seat in mid-traffic and crawled into the back of the cab to dig around for cigarettes, simply telling me to grab the wheel if we started to swerve. We didn't, but it was quite a long minute.

    1. Dave 126 Silver badge

      Re: Here's an idea.

      I'm not sure that turning on interior lights will aid the human driver's view of the road at night.

      Also, automatically turning on exterior lights can cause confusion for other road users (it looks like the headlight flash we use to let other drivers know we see them and they can pull out of a junction). This is one of the reasons that simple automatic headlights (i.e. if less than X amount of light hits a CCD then turn on the lights) aren't fitted any more.

      1. Fred Dibnah

        Re: Here's an idea.

        "it looks like the blinking we use to let other drivers know we see them and they can pull out of a junction"

        In the UK, flashing your headlights is the visual equivalent of sounding your horn. Both indicate a 'Warning: I'm here', not 'I'm letting you out'. People fail their driving tests making that error.

  14. atodd

    Experience of autopilot

    With regard to how autopilot would work in bad weather, it doesn't. Autopilot won't engage if the conditions are bad or it can't see road markings. It beeps a lot and warns the driver if conditions change and it needs to turn off autopilot.

    With regard to the suggestion that the autopilot turn off if the driver isn't holding the wheel, it does exactly that after beeping loudly and the instrument panel flashing red. It then refuses to turn on for the rest of the journey.

    With regard to the autopilot swerving, there's a fairly gentle bend on a brow on the A1 northbound heading towards Leeds that often catches it out so I can believe that. Whether it would actually crash, I can't tell you but it tries to turn the wheel in a way I don't want it to. At this point the fact that I'm holding it stops the wheel turn and autopilot turns off. It only seems to do it in the right hand lane and is certainly unnerving.

    1. Commswonk Silver badge

      Re: Experience of autopilot

      It really is unfortunate that the term "autopilot" has been allowed to enter the lexicon of motoring, however much or little control it is actually able to exert.

      Comparisons with aviation are completely misleading, IMHO. Certainly in commercial aviation the aircraft will be equipped with an "autopilot", and it will doubtless be engaged for some part of a flight. However, the same aircraft will also carry two fully qualified pilots, and the use of the "autopilot" does not allow both of them to do their crosswords of choice (or whatever!) at the same time. One or other of them is always keeping watch on what the aircraft is doing while the autopilot is flying it.

      Furthermore, aircraft occupy distinct corridors based on track and height, those corridors being specifically designed to keep aircraft well apart and thus minimise (or hopefully eliminate!) any chance of a mid-air collision. Yes, there are lanes to follow, but they do not have white lines within which both human and auto pilots are required to remain. On top of that, ground controllers keep an eye on what air traffic is doing; for example, any change of height does (or may) need clearance from ATC before being put into effect. Other than when on the ground, air travel is mercifully clear of fixed obstacles and other influences that might upset a pleasant journey.

      As if those weren't sufficient differences in themselves, any event that requires the intervention of a human pilot should come with a sufficient interval, in both time and space, between the intervention being required and something really nasty happening; the same cannot be said of cars more or less bumper to bumper (fender to fender if you really must!) on congested roads.

      In my view there is simply no sensible similarity between what happens in aviation and what happens in motoring; the differences between the two concepts of travel are so great that allowing them to share the word "autopilot" is profoundly, and dangerously wrong.

      Somehow (and no, I don't know how either) the word "autopilot" has to be expunged from land-based travel.

      1. handleoclast
        Coat

        Re: Experience of autopilot

        Somehow (and no, I don't know how either) the word "autopilot" has to be expunged from land-based travel.

        Planes have pilots; cars have drivers. Therefore planes have autopilots and cars should have autodrivers.

        Simples.

        1. Mike Richards

          Re: Experience of autopilot

          Autochauffer if you have a Bentley.

      2. Ken Hagan Gold badge

        Re: Experience of autopilot

        "Somehow (and no, I don't know how either) the word "autopilot" has to be expunged from land-based travel."

        Tesla apparently share your view and are slowly building the association between the word "autopilot" and "dead" in the land-based context.

  15. Pier Reviewer

    Wrong type of snow

    "The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced”

    He crashed into the wrong type of wall? It feels a little like blame-shifting, doesn't it?

    Tesla have a big interest in protecting their image, and an even larger bank balance with which to do it. Hopefully people who buy these things will heed this example and use their cars appropriately. You don't want to be on the wrong end of a large corporation (or a few hundred tons of concrete).

  16. Anonymous Coward
    Anonymous Coward

    This relative upstart car company Tesla, as large and as dominant in its niche as it has quickly become, seems to lack the depth and breadth of experience, in roads, driving, occupant safety and post-accident data, that many of its industry peers possess in spades.

    Regardless of the clever AI and software, and the big hot batteries driving eye-catching (and eye-watering) performance, when the likes of Mercedes-Benz have been crash-testing structures for 50+ years and have continually collected and compiled crash data from accidents involving their cars around Stuttgart, you have to ask why they are not rushing at this like Tesla. No prizes for being first.

    We are seduced by electronic event-log data, but there is still a plethora of analogue, physical, empirical data around crashes and crash scenarios: crumple zones, deformed structures, objects hit, etc. Tesla does not have that. It's learning as it goes, with its customers. AI and modelling are not enough. Neither is marketing.

    I guess the automotive giants, the MBs, Fords and VWs, have all been around long enough to see a few large, potentially crippling class actions over the decades. I wonder if Tesla really is big enough to take a couple of those 'on the nose' and survive.

    Clearly a long way to go to make autonomous driving safe enough.

  17. Snorlax
    Facepalm

    Stupid is as stupid does...

    "Drivers are supposed to keep their hands on the steering wheel even when Autopilot is engaged. Think of the technology as a super-cruise-control, rather than a self-driving brain."

    I wasn't totally surprised when I saw a YouTube video showing an Autopilot user defeating the 'hands-on-wheel' check by jamming an orange into the steering wheel.

    Doubly stupid, as an orange wedged next to an airbag is going to turn into a 300 km/h projectile if the airbag goes off...

    1. TechnicalBen Silver badge
      Trollface

      Re: Stupid is as stupid does...

      Now you'll give the Youtubes ideas on how to make orange juice!

      1. Stoneshop Silver badge
        Trollface

        Re: Stupid is as stupid does...

        Now you'll give the Youtubes ideas on how to make orange juice!

        That's where Juicero got their 'inspiration' from.

    2. handleoclast

      Re: Stupid is as stupid does...

      Doubly stupid as an orange wedged next to an airbag is going to turn into a 300km/h projectile if the airbag goes off...

      Think of it as evolution in action.

  18. MartyOhr

    where's the big data/machine learning?

    I thought Teslas were very clever. Wrong, it seems. I assumed that data from the vehicle sensors and the driver's actions in all their cars was federated back to update the algorithms and to add specific instructions for particular bits of road. Like a clever version of Waze.

    If I were writing an 'autopilot' I'd take all the times a driver has had to correct the driving line, use that to update any temporary hazards in real time, and feed back to me any locations where multiple vehicles have had issues following the road.

    I'm not suggesting that is easy, but surely such a feedback loop is the minimum a responsible developer would want?
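
    Such a feedback loop could be sketched like this in Python (all names and thresholds here are hypothetical illustrations; real fleet telemetry would obviously be far more involved):

```python
from collections import Counter

TILE_SIZE = 0.001  # degrees; roughly a 100 m grid, just for illustration

def tile_for(lat, lon):
    """Quantise a position onto a coarse grid tile."""
    return (round(lat / TILE_SIZE), round(lon / TILE_SIZE))

class HazardMap:
    """Aggregate driver-correction reports and flag hot spots."""
    def __init__(self, threshold=3):
        self.counts = Counter()
        self.threshold = threshold

    def report_correction(self, lat, lon):
        """A driver overrode the 'autopilot' at this location."""
        self.counts[tile_for(lat, lon)] += 1

    def is_hazard(self, lat, lon):
        """Have enough drivers corrected here to flag a hazard?"""
        return self.counts[tile_for(lat, lon)] >= self.threshold

fleet = HazardMap()
for _ in range(3):  # three cars correct the steering at the same spot
    fleet.report_correction(37.4100, -122.0750)
print(fleet.is_hazard(37.4100, -122.0750))  # True
```

    The aggregation is the easy part; the hard part is deciding when a cluster of corrections means "temporary hazard, disengage or slow down here" rather than noise, which is presumably part of why nobody ships exactly this.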

  19. Anonymous Coward
    Anonymous Coward

    Why didn't the guy just steer to the right?

  20. Ian Johnston Silver badge

    I love the way Tesla are offering to sell, for $3,000, a product they don't have and which, even if they had it, could not be used. Vapourware bullshitting at its best.

  21. cs9

    Did your Tesla kill you? You were "holding it wrong"

    "Your car just drove into a crash barrier at full speed while under autopilot and killed the driver"

    Tesla: "Other people have used Autopilot there and not died. So there!"

    Tesla: "(leaking to the press in continued damage-control mode): The guy didn't have his hands on the wheel! His fault! Autopilot? It's not auto or a pilot, where'd you get an idea like that???"

    The only thing we know with certainty about this incident is that Tesla doesn't give a shit about anything but their share price. The driver is just an idiot who was "holding it wrong". Their Autopilot is great, it 99.9999% probably won't kill you, and if it does, you shouldn't have been using it in the first place. Exceptionally crass behaviour from front to back.

  22. SVV Silver badge

    It was the road's fault, apparently

    "The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced," Tesla said.

    So, you're not actually looking to see whether or not it's there and intact: your software just ASSUMES it's there? Yep, that's going to be safe. It's not the fault of the inadequate software; it's the authorities, who by your line of argument now have to repair every single bit of minor damage everywhere instantly for the sake of your self-driving pipe dream. And who is Elon expecting to pay for this unprecedented public service? That's right, every taxpayer. Here's a hint, Mr. Tech Bro: it's not going to happen. You are going to have to use a lot more of your fortune to develop the software a lot further. And pay huge compensation when accident victims have the misfortune to be the human sacrifices your dream demands.

    1. Aitor 1

      Re: It was the road's fault, apparently

      This accident happened for several reasons as I see it:

      1.Badly designed road.

      2.Bad maintenance.

      3.Lack of due care after an accident.

      4.SW limitations in the autopilot.

      5.User not giving enough attention.

      I can see a tired driver crashing there. And as a previous car crashed in exactly the same spot, it is quite obviously dangerous.

  23. c1ue

    Frankly, this focus on the poor quality of the lane markings is severely misplaced.

    Yes, in an ideal world, the markings, lane and otherwise, would be prominent and fresh.

    However, in the real world, this is rarely the case.

    The real question is: how many non-"Autopilot" drivers have crashed into this poorly marked barrier?

    If that number is zero, or if there have been crashes but no deaths, it underscores the fundamental dangers of "automated driving", whether Level 2, 3 or 4. If humans aren't running into these barriers, then partial or full automation shouldn't be either; and if they are, they should be severely regulated and/or banned.

    Remember we aren't even talking about enemy action. If morons with laser pointers are blinding aircraft pilots, I can only imagine the "fun" with lidar spoofers or (in)appropriately placed markers/signs.

  24. Anonymous Coward
    Anonymous Coward

    This is quite misleading, the way the article reads. EAP (Enhanced Autopilot) is what was enabled in this case. As stated in the article, this is basically a superb version of cruise control... but you are still driving the car. That is a crucial point. The reason your hands are required is to try and ensure that you are there, actively driving the car. It will help with steering, but you must be ready to take over immediately at any point. To then drift into talking about FSD is confusing. FSD is not yet available, so it is irrelevant to this case.

  25. Anonymous Coward
    Anonymous Coward

    I like the $1000 delivery for the full product on top of the $3000.

    Almost as if you're paying for delivery of the complete car, not just a bit of software for the built-in cameras etc.

  26. Anonymous Coward
    Anonymous Coward

    I'd rather employ a badger from our local sett to drive me around than ever use this shit

  27. Phukov Andigh Bronze badge

    interesting

    Expensive car company finds, in its non-transparent data collection systems, information that of course totally exonerates it from any responsibility.

    If it weren't Tesla, Darling of the California Tech Community, no one would even give its self-defense a moment's notice before tearing it apart.

  28. Ken Y-N

    Why no braking though?

    The most "interesting" point for me is why there seems to have been no attempt by the car to brake. It has a depth-sensing stereo camera; otherwise not-really-an-Autopilot would plough through the car up ahead. However, the crash was at night, which suggests it couldn't pick out enough of the concrete block to tell that there really was something there and not just a camera artefact. The NTSB may be looking to see if Tesla have dialled down the object detection algorithm to ignore excessive misdetections at night.

    If I were buying (or just using) an autonomous vehicle, I'd choose one from the boring old car companies. "Fail fast" may work when delivering web services; it's not so good for vehicles.
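    [Ed: the trade-off the commenter speculates about, raising a detection-confidence threshold to suppress night-time false alarms at the cost of missing real obstacles, can be shown in miniature. The scores and thresholds below are invented for illustration and have nothing to do with Tesla's actual software.]

    ```python
    # Illustrative only: a higher confidence threshold trades false alarms
    # for missed obstacles. All scores and labels here are made up.
    detections = [
        {"label": "shadow",           "score": 0.35, "real": False},
        {"label": "glare artefact",   "score": 0.45, "real": False},
        {"label": "crash attenuator", "score": 0.55, "real": True},
        {"label": "car ahead",        "score": 0.90, "real": True},
    ]

    def brake_decisions(detections, threshold):
        """Return the detections the system would act on at this threshold."""
        return [d for d in detections if d["score"] >= threshold]

    for threshold in (0.3, 0.6):
        acted = brake_decisions(detections, threshold)
        false_alarms = sum(not d["real"] for d in acted)
        missed = sum(d["real"] for d in detections if d not in acted)
        print(f"threshold={threshold}: act on {len(acted)}, "
              f"false alarms={false_alarms}, missed real obstacles={missed}")
    ```

    At the low threshold everything triggers braking, including the shadow and the glare; at the high threshold the false alarms vanish, but so does the weakly scored attenuator.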
