Oddly enough, when a Tesla accelerates at a barrier, someone dies: Autopilot report lands

A Tesla with Autopilot engaged accelerated toward a barrier in the final seconds before a deadly crash, an official report into the crash has revealed. Apple engineer Walter Huang was driving his Model X P100D on a Silicon Valley freeway on the morning of March 23 when the car, under computer control, moved into the triangular …

  1. Stoneshop Silver badge

    Re: Aircraft autopilot ... terrain-following radar to avoid collisions.

    Are there any more recent inadvertent controlled flights into terrain that might prove useful for this discussion?

    Here you are. Looking at a few of the recent crashes classed under CFIT that involved modern airliners, they were all caused by crew ignoring warnings, or responding to them too slowly or incorrectly.

  2. CraPo

    Re: OlaM

    German Wings 9525 anyone?

  3. JimC Silver badge

    Re: OlaM

    I think there's probably an age thing going on here. I grew up in the 60s and 70s, and my father was in the flying business then. My default understanding of an autopilot is something that flies the plane straight and level when there's nothing difficult going on, and needs to be overruled for take-off, landing, evading Messerschmitts or MiGs, and anything else difficult. That fits the Tesla offering pretty accurately. I think of autoland and so on as extra capabilities above and beyond the autopilot.

    Clearly a younger generation is thinking differently, and we have a cultural gap. For them, it seems, autopilot is not the right word.

  4. Voyna i Mor Silver badge

    Re: OlaM

    "I thought he worked for Apple, that may explain why he was driving it wrong."

    I think you were downvoted for a variety of reasons, but I also think there's a sensible point to be made there. Apple have created an expectation of what electronic systems do, and I imagine it's shared by their engineers.

    At any time, people's expectations are related to the general state of the art. In a time when a lot of people had coal fires, it didn't seem odd that a steam locomotive needed someone to shovel coal into a firebox. In a world of central heating, it seems a bizarrely dangerous idea.

    Whatever Apple's faults as a company (I'm not going there), Apple stuff does pretty much what it says on the box. If an Apple engineer read "Autopilot" as "pilots the car automatically", it would be unsurprising. Transportation technology probably wasn't his thing, or perhaps he wouldn't have bought a Tesla. He wanted an Apple-type experience, i.e. pay a whole lot of money for something and then expect it to do what it seems to claim.

  5. This post has been deleted by its author

  6. Prst. V.Jeltz Silver badge

    "While we can assume Huang did not notice the Tesla was in between lanes and heading toward a crash barrier, neither did the Tesla"

    Very telling. You'd think that if you were even half paying attention, you'd notice your car had driven off the road, and maybe press the brake before some off-road obstacle appeared.

  7. juice Bronze badge

    "The way it is laid out (same as most similar places on a USA motorway) is criminal in its incompetence."

    That may well be true, but Tesla is an American company, and as other comments have pointed out, this is in an area which has been heavily used for self-drive testing. So they should have been fully aware that this is a potential scenario, and their software/hardware should be configured to address it.

    It's also worth noting that even in other countries, hatching and other visual indications may not always be there - on quieter roads the paint may have worn away, or if there are roadworks it may simply not have been repainted yet.

    Auto-pilot mechanisms need to deal with *all* scenarios, not just the best-case ones.

  8. Anonymous Coward

    Re: Aircraft autopilot ... terrain-following radar to avoid collisions.

    The Sukhoi scandal:

  9. dnicholas Bronze badge

    Re: OlaM

    Fuck it I laughed. Going to hell anyway

  10. Hans 1 Silver badge

    Re: OlaM

    Autopilot on planes is simple.

    You are joking, right? Do you have any idea of the sheer number of flight parameters there are on an aircraft? Ever heard of lift, roll, pitch, yaw, stall, altitude, air density, thrust...?

    On modern aircraft, an autopilot can land the bloody thing - OF COURSE, PILOT MUST BE READY TO TAKE OVER ANYTIME, because unexpected things can and will happen, most likely at the worst possible moment. It is called Murphy's law ... the most important law of nature for aviation engineers, pilots, and now Tesla drivers, so it seems.

    Tesla is 100% right: the driver was careless when the accident happened and was NOT ready to take control of his car. Yes, USian highway maintenance shares some of the blame too ... but, behind the wheel, you are responsible for your car. Yes, Tesla Autopilot is nice to have - on USian highways probably not as useful as in Europe ... certainly NOT a replacement for a driver, though.

  11. Bill Michaelson

    Re: OlaM

    AirLINER autopilot (flight management) systems typically do all that good stuff. Aircraft autopilot systems of earlier vintage and lower grades can be as simple as a wing-leveler or a heading or altitude hold device and can indeed direct the airplane into a mountain or other obstacle. It is the pilot's responsibility to know well the capabilities and limitations of the specific system and supervise the operation appropriately at all times. It is also worth noting that the more capable systems require the most complex and nuanced supervision, thus requiring the most training and experience to do safely.

    Yet even the simplest of such aviation mechanisms are called autopilots.

    There is a distinction between the level of training, experience and certification required to operate airplanes as compared to automobiles, owing to several factors. That is a core issue. The other core issue is the regulatory environment that has allowed the deployment of these tools to insufficiently prepared drivers.

  12. Oneman2Many

    Real-world barriers get damaged, road markings wear out, road signs get covered in gunk, etc. The system should be able to cope with that.

  13. Anonymous Coward

    Ah, so the road markings are at fault. Well, that's nothing a few billion dollars can't put right, so a few people can pretend their adaptive cruise control is an "auto pilot", whatever that is.

    The real world is never going to be perfect for autonomous vehicles; even if they kill fewer people than meatbags, they are still going to kill people. Wait until there are 15-year-old autonomous boxes buzzing about with faulty sensors and boss-eyed radar due to shit servicing by lazy owners (we know they are lazy, as they can't be bothered driving themselves).

    Now, where's my Sanatogen and a copy of the Daily Fail?

  14. David Nash Silver badge

    Re: OlaM

    The difference is that this is aimed at consumers, not at professional pilots.

    Flying takes a lot more training, and people take it a bit more seriously (rightly or wrongly) than learning all the controls of their car and what every bit does.

    So a large number of consumers don't know precisely what an airline autopilot does and does not do? And that makes it their fault if they therefore make a mistake about the capabilities of a system aimed at them for domestic cars?

    I don't think so.

  15. MachDiamond Silver badge

    Re: OlaM

    "I'm not sticking up for Tesla here, just defining what an "autopilot" actually is. Tesla need to sort that out. They really do."

    An aircraft autopilot is just a cruise control with lane-keeping assist. The difference is that somebody else (controllers) is watching where the plane is in relation to other aircraft and directing pilots to make course corrections when there are conflicts. There is also a lot more room in the sky lanes. Planes using autopilot are also more likely to be flying IFR, so they have had a course plotted that doesn't have them aimed at mountains they could accidentally crash into.

  16. katgod

    Re: Blame Jerry Brown and the idiots at CARB

    Get angry with the wrong people much?

  17. chrisf1

    Re: The right person to blame is the USA highways administration.

    "...real time integration of both car, bike, pedestrian and street level systems..."

    "What might these bike and pedestrian systems be? I hope you're not thinking that cyclists and peds will be told to carry a beacon to prevent AVs from hitting them."

    I'm saying I would have my doubts that they can safely (safely enough?) integrate into a mixed environment without such systems, and hence that incremental roll-out of open mixed systems is possibly harder than the public debate might suggest.

  18. JohnG Silver badge

    Re: OlaM

    Tesla repeatedly tells owners that Autopilot is in Beta, that they need to keep their hands on the steering wheel at all times and that they do not yet have "Full Self Driving". In the vehicle, there are two modes available: Traffic Aware Cruise Control and Automated Lane Keeping - but that doesn't sound as sexy as Autopilot or Full Self Driving - and some apparently intelligent drivers seem to ignore all the warnings and fixate on the marketing terminology.

  19. harmjschoonhoven

    Re: How is it done in Europe

    Dutch Rijkswaterstaat designed sophisticated crash cushions called RIMOBs in the 1980s. They are pre-deformed iron tubes which collapse like a harmonica on impact. At the time we were involved in making high-speed films of crash tests, and I remember that my manager had a compressed RIMOB tube on his window-ledge which looked like a piece of abstract art. These and similar infrastructure saved many lives.

  20. Uffish

    Re: autopilot age gap

    I'm of the old generation and autopilot means nothing to me in the context of cars, but I can quite believe that the Tesla Marketing Dept is very attached to the 'autopilot' designation - for all the wrong reasons.

    As an engineer, if I wanted a self driving car I'd take a taxi or if I was a plutocrat I'd have a chauffeur and a car. I would probably enjoy driving a Tesla but I would rip out the 'auto-da-fé' before I drove it.

    I would rather not have an accident in a car, but the hurt would be worse if the accident was caused by some sodding, half-baked, marketing-department-driven programmable controller with a handful of assorted transducers and a huge touch screen.

  21. Michael Wojcik Silver badge

    At least one HAL 9000 was even more antagonistic towards humans than the Tesla code is....

    Sure, but it was consistent. If Teslas always tried to kill their occupants, we wouldn't have stories like this one. The problem is they're unpredictable; you never know when they'll turn on you.

    Or won't turn, as the case may be.

  22. Michael Wojcik Silver badge

    Re: The right person to blame is the USA highways administration.

    What might these bike and pedestrian systems be? I hope you're not thinking that cyclists and peds will be told to carry a beacon to prevent AVs from hitting them.

    No, no, no. Automatic pants, so you don't have to direct your feet while you play with your phone.

  23. MonkeyCee Silver badge

    Re: The right person to blame is the USA highways administration.

    "This is why I'm very much a skeptic on autonomous cars , especially in cities like London"

    My stepdad learnt to drive in London. He drives like he's under fire, with little regard for what the roads are signed as, only for where you can fit a car through. My mum (now) drives much the same way, albeit with a steady stream of apologies for the assorted words of power being cast her way. Missed careers driving white vans or minicabs.

    They do this because it *works* very well off peak, and quite well on peak. Mildly terrifying in the passenger seat, hence why I tend to hide in the back.

    Since an autonomous vehicle will give way to someone driving like my parentals, who would want to be stuck in one when everyone notices they can just cut you up and your car will let them? Even if people don't normally drive like arseholes, if doing so will get them through faster, they'll start doing it.

  24. tim292stro

    Re: OlaM

    "...IMHO if you manually tell it to do a dangerous thing, it stops being an autopilot at that point. Aircraft autopilot follows routes, with set safe altitudes, and terrain-following radar to avoid collisions..."

    Autopilot is pilot automation - but only REDUCING the pilot workload. If someone or something is watching your heading and altitude, it can give your brain cycles to look into why your #3 engine is running a little warm. It helps you follow your flight plan more easily too, getting you from one waypoint to the next and lining you up for a nice landing on the runway you punch into the flight computer.

    Now realize that they still load navigation data onto those things with 8" floppies on some older variants, and you'll start to get an idea about the technical limitations of what an autopilot can do. Your average 737 doesn't have the latest Nvidia Volta GPU in its telemetry rack - it's a step up from vacuum tubes and loading programs by tape or punch card.

    Aviation autopilot also has the benefit of mandated air traffic control in Class A airspace, so it comes with an external nanny in the event something goes wrong. You may have also noticed the lack of erratic turns and road maintenance issues on your last flight (although the turbulent-air potholes can be a real pain).

    Getting a self-driving car to work without human oversight is a HUGE effort, and it has never even been attempted at commercial scale in the aviation or marine markets (there are recent attempts to get systems working, but nothing like Waymo is being deployed on the water or in the air).

    "...Tesla's tech shot off into a barrier..."

    Sure, that's one damning way to look at it, but as an engineer I also look pragmatically at the longer phrase that was used to describe this situation: "it was following a car, and then the car went through an interchange, and then the Tesla drove into a barrier". I also noted in the NTSB statement that the following distance was set to minimum...

    So with those two data points, I immediately apply my expertise in driving in the S.F. Bay Area, where our roads are crap, and so are our drivers (on balance).

    I can imagine a scenario where the Tesla was following a car that decided late to take the interchange a different way, and made a moronic move across the gore point, which was poorly maintained (we are lucky if Caltrans fixes a crash barrier or guard rail within a month, let alone a week - now take a guess how bad the paint markings are...). In my imagination I can see the Tesla following closely behind the idiot who jumped across the gore, with the lane markings partially obscured by the closely followed car in front. In that case, the Tesla was probably simply following the car in front into the poorly marked gore, and once the idiot completed his late freeway change across the gore to the other direction, the Tesla, realizing he was crossing a line, decided not to follow him. From there, what would have been the left solid line may have come back, and the car thought it was in a lane and tried to center itself (remember, it's narrow at the tip of the gore). Meanwhile the vision code, facing south and brightly lit head-on by the sun (because that's the direction that interchange faces), no longer saw a vehicle in front. Many reasons are possible - a Tesla has previously failed to see a truck trailer across lanes because of poor lighting, and I'd speculate that the camera-based vision system Tesla chose still can't see in adversarial lighting conditions. Then, with no one in front, the car attempted to return to the unobstructed cruise control set point (even though the speed limit is 65, people will do 90 because they feel like it - Tesla drivers out here love their electric torque and drive like jerks).

    So the steering would be trying to center the car between the two lines it detected, and the cruise control, not seeing a car in front, would speed up to the set point.

    To me, this looks like a failure of localization (GPS-only can easily be off by 3 meters, or about a full lane). Without a high-resolution map and LIDAR to compare your position on the road against known fixed things like concrete barriers, bridges, and road signs - and relying on radar within a few miles of Moffett NASA Ames Federal Airfield, which hosts the civil aviation surveillance radar for the S.F. Bay Area, isn't a good idea - that pretty much leaves you with visual navigation to keep you in lane.
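    The 3-meter figure is easy to put numbers on. A back-of-envelope sketch, using illustrative values only (~3 m GPS-only error, ~3.7 m as a typical US freeway lane width):

```python
# With GPS-only error comparable to a lane width, a lateral position estimate
# can straddle several lanes. All numbers here are illustrative assumptions.

LANE_WIDTH_M = 3.7  # typical US interstate lane

def lanes_possibly_occupied(offset_m, error_m=3.0):
    """Lane indices (0 = nearest the measured road edge) the true position
    could fall in, given a lateral offset and a symmetric +/- error bound."""
    lo = int((offset_m - error_m) // LANE_WIDTH_M)
    hi = int((offset_m + error_m) // LANE_WIDTH_M)
    return list(range(max(lo, 0), hi + 1))
```

    With a 5 m offset and 3 m of error this spans three candidate lanes - which is exactly why you want fixed landmarks (barriers, bridges, signs) to pin the estimate down.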

    See my previous comment about which direction the forward-facing camera was pointing relative to the sun, our terrible road maintenance practices in California, and the obscuring of oncoming obstructions by the car in front. If anything, I'd be surprised if Caltrans doesn't get a bit of a spanking on this for road conditions, with Tesla then being dragged over the coals for not having good enough sensory perception and geo-localization.

  25. tim292stro

    Re: OlaM

    "...Autopilot on planes is simple.

    You are joking, right? Do you have any idea of the sheer number of flight parameters there are on an aircraft? Ever heard of lift, roll, pitch, yaw, stall, altitude, air density, thrust...?..."

    I think he means the data entry for autopilot systems and what they are meant to control is simple... On something like an Airbus A310, the autopilot has only a few controls: heading, altitude, speed.

    All of the flight surface controls that affect the airframe's trajectory are manipulated by fairly simple optimized mathematical models and some simple tuned filters to control dynamics - but the user doesn't get a window into those on the display. They just get three settings and a few knobs to adjust "the big picture".
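    The "simple tuned models behind a few knobs" idea can be sketched in a few lines. This is a purely illustrative toy, not real avionics code - the gains, the turn-rate damping term, and the aileron-command interface are all invented for the example:

```python
# Toy heading-hold "autopilot": a proportional term on heading error plus a
# damping term on turn rate. Purely illustrative; the gains are made up.

def heading_hold(target_deg, current_deg, rate_deg_s, kp=0.04, kd=0.12):
    """Return a notional aileron command clamped to [-1, 1]."""
    # Wrap the error into [-180, 180) so a 350 -> 10 degree change is a
    # 20-degree right turn, not a 340-degree left one.
    error = (target_deg - current_deg + 180.0) % 360.0 - 180.0
    command = kp * error - kd * rate_deg_s  # damp the turn as it develops
    return max(-1.0, min(1.0, command))
```

    The user-facing surface is just the target setting; everything below it is the kind of tuned loop described above, with no window into the internals.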

    Hell, if you set your altimeter's barometric value (which tells the plane what altitude it's at) wrong, your automated landing system (ALS) can fly you right into terrain... The actual skill on a modern commercial plane still occupies a seat and has to RTFM to get the procedures right...

  26. Alan Brown Silver badge

    Re: OlaM

    " Aircraft autopilot follows routes"

    The Dunning-Kruger is strong in this one.

    The one on my old Cessna 302 simply held heading and altitude.

    Anything more than that is an added feature. Autopilots are very dumb pieces of kit. The more advanced ones can fly in a more-or-less straight line from waypoint to waypoint, loading the next waypoint as they reach each target. Even the altitude is set by pilots (unless directed otherwise by ATC, that tends to be "as high as you can fly without stalling, for best economy"). Collision avoidance is another system. Automated landing is another system too. NONE of it is a substitute for having someone at the pointy end ready to step in when the mechanisms can't cope - the systems are there because flying is mostly tedium(*), and otherwise people get bored/sleepy and do silly things.

    The moment anything unexpected happens, an autopilot goes "bong" and hands control back to the meatsack - and unlike a car, an aircraft with nothing controlling it will spend several minutes flying in a straight line before it needs a correcting hand on the controls. It's called "inherent stability", and all airliners are designed with lots of it built in(**). You'd be lucky to get 10 seconds hands-off in a car before you need to intervene, mostly because unlike an aircraft your car _can't_ travel in a dead straight line, roads seldom being straight (and even where they are, they have drainage crowns trying to push the car to the sides). On top of that, cars are in a challenging environment with a lot of other obstacles in close proximity(***) that are NOT being centrally monitored/directed, nor do you have comms with all the other drivers around you(****).

    (*) It's 30 mins to an hour of chaos at each end. The rest is incredibly boring. A _good_ transport pilot may have 1 or 2 incidents in his entire career. Pilots who save the day due to their incredible flying skills are most commonly the same pilots who overconfidently got themselves and their passengers in the shit in the first place.

    (**) Military fast jets and aerobatic aircraft are "unstable" as this is what allows them to be thrown around the sky. Some are so unstable that without a computer constantly correcting the desired flight path they'd be out of the sky in a few seconds or less. (F22 and F35 being prime examples)

    (***) In "crowded airspace", the nearest thing to you is routinely tens of miles away at the same altitude, or at least 1,000 feet above/below you. You have several tens of seconds to react to anything out of order. In a car, having somewhere between zero and 2-3 seconds is routine.

    (****) All aircraft in an area are using the same frequency, and it doesn't matter whether it's controlled or uncontrolled: all pilots hear the instructions and know what the other guys are doing, or they're telling each other what they're doing. When things start going pear-shaped, everyone knows to shut up, listen, and get out of the way. Compare and contrast with the average fumducker driving blindly into a crash zone or causing chain reactions by rubbernecking instead of looking where he's going.

  27. Alan Brown Silver badge

    "BUT, it won't do so quietly."

    The autopilot will.

    "Trust me, you'll have all sorts of alarms and flashing lights going off in the cockpit and extremely loud voices telling you to "PULL UP. PULL UP"."

    None of those are connected to the autopilot, nor will they pull up for you.

    The Germanwings aircraft that crashed into a French mountain a few years back had been programmed to do so by the suicidal pilot. It didn't try to avoid the obstacle; all it did was fly in the direction and at the height it was told to.

  28. Alan Brown Silver badge

    Re: OlaM

    "For them, it seems, autopilot is not the right word"

    It has nothing to do with age.

    _MOST_ non aviators/seamen think that an autopilot is some magical device that can handle everything thrown at it.

  29. Alan Brown Silver badge

    Re: Aircraft autopilot ... terrain-following radar to avoid collisions.

    "The Sukhoi scandal:"

    The captain of the jet was Alexander Yablontsev (57), a former Russian test pilot;

    Human factors to the fore again - and yet another example of why ex-military fliers are a poor choice for civil transportation. They tend to press on regardless when anyone sensible and cautious would have diverted. Being able to safely land 95% of the time is one thing, but the cleanup after the other 5% is problematic, and unlike in a military aircraft, the people sitting in the back didn't sign on for that risk.

  30. Alan Brown Silver badge

    Re: How is it done in Europe

    "Dutch Rijkswaterstaat designed sophisticated crash cushions called RIMOBs in the 1980s."

    Plastic 44-gallon barrels full of water (water attenuators) are much cheaper and just as effective for the most part. They don't require complex engineering works to set up and can be replaced in minutes when someone does drive into them.

    Fitch barriers (same principle but using sand) are equally effective and only slightly more expensive. And water-filled plastic Jersey barriers are attenuators too; their concrete brethren are not.

    Even where more permanent barriers are deployed and get destroyed, these kinds of attenuators are often dropped in temporarily whilst replacement works are arranged. The YouTube video of how dangerous the lead-up to that gore is has extra shock value in that there's no kind of attenuator at all - not having one and not physically coning off the gore would be a criminal matter in a lot of countries.

  31. Charles 9 Silver badge

    Re: OlaM

    You want my opinion on why people have misconceptions about auto-pilots? Blame the Abrams Brothers...and "Otto".

  32. astrodog

    Re: OlaM

    Sadly no, airliner autopilots will happily fly the plane directly into other aircraft, above its service ceiling, beyond its fuel range, or even directly into the ground:

    “Lubitz deliberately crashed the aircraft. He had set the autopilot to descend to 100 feet (30 m) and accelerated the speed of the descending aircraft several times thereafter.”

  33. Ledswinger Silver badge

    After the last childish outburst...

    ..I might be hoping for some entertainment from Musk, who seems to have been descending into an overly defensive and emotive mindset. Indeed, a "bunker mentality".

    Having said that, I temper my hopes for entertainment with the fact that somebody has died needlessly due (IMHO) to the ongoing over-promise and under-delivery of autonomous vehicles, and that even if Musk were to man up, accept responsibility and apologise, (1) I doubt he'd mean it, and (2) no amount of apologies will help the deceased, their family and friends.

    A very sobering thought for all of us: most of us don't work in roles where we potentially put human lives at risk. Park that for a moment, and put yourself in a situation where your job does. Now extend that to the idea that you messed up, and somebody is dead because of the inadequacy of your efforts. How do you make good from that? I've worked for somebody who, in a previous role, was underground safety manager for a large, deep coal mine. He had multiple experiences of miners being killed, and of having to tell the families that although he was the Underground Manager, their loved ones weren't coming home. His eyes were strangely soulless, almost like the shark in Jaws. I don't think that was the case before he did that job.

  34. Jemma Silver badge

    Re: After the last childish outburst...

    And he put himself willingly in that position in an industry that has a very poor safety record and a high chance of getting killed even *if* you do everything perfectly.

    I've very little sympathy for the guy. About the same amount as for a girl killed on the most dangerous road near my home, bar none (Birch Park) - you *do not* ride a pushbike down there at 3am, without lights or helmet, wearing black, and expect to live. I won't even take the Wolseley down there in broad daylight, because of the sociopathic soccer moms screaming flat out down the middle of the road.

    I do have sympathy for the driver, however, and personal experience of deranged cruise control. The Renault Safrane 2.0/2.2 had a very nasty cruise habit. If you were 25+mph off your cruise set speed, it'd purr up to speed gradually like a well-fed and happy Rolls-Royce. Anything under that and it'd take off in "boyracer mode" like a Siamese cat with a firework rammed up its butt. It could catch you out even if you expected it. I really, really hope they changed the cruise control logic for the Biturbo version - because the results would have been similar to what happened to the Tesla. Maximum splat.

  35. Anonymous Coward

    Re: After the last childish outburst...

    The reality is that these people convince themselves that the pros outweigh the cons. To the extent that the death of a handful of people is worth it for perfecting this technology and what it could do for the human race.

    (I am not saying this is my view, but how people justify it)

  36. Dabbb

    Re: After the last childish outburst...

    "Most of us don't work in roles where we potentially put human lives at risk."

    That's one of the most important points - how many developers working on self-driving cars have a background in developing medical applications or airline systems, or anything that might kill a human being for that matter, and how many of them were building websites and mobile apps before they started programming something that weighs a tonne and moves at 100mph?

  37. T. F. M. Reader Silver badge

    Re: After the last childish outburst...

    somebody has died needlessly

    And let's not forget the people in the other cars that were directly hit by the Tesla - the Audi and the Mazda. It is not clear to me whether they were hurt, but they could have been. Even if they didn't suffer so much as a scratch, there is a lot of damage to their property. And in this and other similar situations, quite a few other drivers may have to swerve, brake hard, and in general take extreme evasive action, putting themselves, their passengers/families, and yet other people at risk through no fault of their own.

    Of all the people involved, only one bought the new shiny Tesla, engaged the Autopilot, and trusted it to a degree, mistakenly or not... In a situation like this they are all innocent victims, whether or not the Tesla driver was partially responsible. It seems clear from the description that Tesla are at least partially responsible, and all those other people on the road should be a factor in their requirements, design, implementation, testing, marketing, etc. - as much as or more than the driver, probably, because the driver is supposed to have more information and act accordingly. Other drivers may not even realize the car in the next lane is a Tesla (just that it is blue), that it may be on Autopilot, that its driver may be out of control for many seconds, etc.

  38. Anonymous Coward

    automonous driving system developers

    These tend to be of two types:

    1. hardened embedded developers who have worked on other types of safety-critical system (think industrial control systems, military)

    2. algorithmic developers who work on the code that transforms raw sensor data into information the control system can (hopefully) act on.

    No mobile devs or web-monkeys. IMO the risk comes from the algorithmic side, as the algorithms are the product of recent research (IMO a partially solved problem) and the mindset required to write research code (= push the boundaries) is very different from that of safety-critical code (= protect the boundaries).

  39. rg287

    Re: After the last childish outburst...

    The reality is that these people convince themselves that the pros outweigh the cons. To the extent that the death of a handful of people is worth it for perfecting this technology and what it could do for the human race.

    (I am not saying this is my view, but how people justify it)

    In reality, if your deaths-per-million-miles is lower than the average for entirely manual cars, then that's good.

    You shouldn't treat it lightly, but it's also not black and white. There are grades of safe-safer-safest. How many fatal accidents were there that day involving conventional control vehicles - even when adjusted for number-of-cars in circulation? What about the day before, or the day after?
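    The per-mile comparison being described is simple arithmetic once you have the counts. A minimal sketch - every figure below is a placeholder, not a real statistic:

```python
# Compare fatality rates per million miles. Inputs are hypothetical
# placeholders; substitute real fleet and baseline figures to use this.

def deaths_per_million_miles(deaths, miles):
    return deaths / (miles / 1_000_000)

# Hypothetical: 3 deaths over 2 billion assisted miles, versus a made-up
# human baseline of 12,000 deaths over 1 trillion miles.
assisted = deaths_per_million_miles(3, 2_000_000_000)            # 0.0015
baseline = deaths_per_million_miles(12_000, 1_000_000_000_000)   # 0.012
```

    The point stands either way: a lower per-mile rate is an improvement in aggregate, even though individual failures still look damning.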

    As it currently stands, the biggest problem with Autopilot appears to be the marketing, and users refusing to RTFM and understand the system (which is entirely to be expected).

    Aircraft autopilots weren't perfect to start with either, but the pilots were generally better trained in their limitations. Air France 447 flew into the Atlantic precisely because the autopilot bombed out and the human crew, having lost all spatial awareness, weren't in a position to take effective control... seems familiar.

  40. DropBear Silver badge

    Re: After the last childish outburst...

    Now extend that to the idea that you messed up, and somebody is dead because of the inadequacy of your efforts. How do you make good from that?

    EASILY. I would say with a shrug, but your death or mine doesn't even warrant that much. No, I'm not talking about me - I'm talking about doctors. If you think any of them will have trouble sleeping at night because you died because of something they did (or more likely, failed to) do, think again; they'll do the exact same thing tomorrow. Those who would have felt responsible are either younglings who'll learn soon enough not to, or aren't doctors any more. If they'd actually care, they'd go mad. So they just don't. Those who are still there don't see you as a person. You're just more meat for the system. A lot of it dies. Plenty remains. See? Easy...

  41. Goldmember

    Re: After the last childish outburst...

    "the fact that somebody has died needlessly due (IMHO) to the ongoing over-promise and under-delivery of autonomous vehicles"

    This is not the case. Teslas are NOT autonomous vehicles, and they don't claim to be. The problem here is that it seems people treat them as though they are. Like the bell end in the UK a few weeks ago who was filmed climbing into the passenger seat of his car with the autopilot engaged. Or the other guy, killed a couple of years ago, who was too busy taking selfies and watching DVDs whilst driving in Autopilot mode to notice a bloody great truck ahead of him.

    Tesla has to change its attitude with regard to the "Autopilot" software; it should be renamed, and the point stressed that it's purely a driving aid. They really should market it differently, as you can't eradicate the inherent stupidity of humans.

    But for fuck's sake... if you're driving a car, YOU have a responsibility to give your full, undivided attention to the task at hand. It's a huge responsibility. A simple mistake made in a split second can permanently alter or even end lives. Ultimate culpability has to lie with the driver, unless the car is fully autonomous. Which these ones are not.

    Yes, the tech drove the car into a part of the road it should not have driven in. The driving aid failed in its task. But based on the information provided so far, it seems that the driver had transferred too much of his responsibility to the tech. Had he been paying attention he could have seen the trouble ahead and applied the brake, and things would have worked out very differently.

  42. Holtsmark

    Re: After the last childish outburst...

    "Those who are still there don't see you as a person. You're just more meat for the system."

    I feel sorry for you and the healthcare system that you are in.

    I know plenty of extremely experienced doctors who continue to care for their patients as people throughout their career. That they will have sleepless nights due to their work is clear, but they deal with it.

    At the same time, there are clearly some fields of medical work where patient death occurs more often (with or without the intervention of a doctor). Not everyone can continue to function in these fields; however, there are enough "safe" fields that one can move to if it gets too much.

  43. Paul Cooper

    Re: After the last childish outburst...

    " No, I'm not talking about me - I'm talking about doctors"

    I'm sorry, but you are WRONG! I know several doctors and nurses who work with terminally ill children - perhaps the hardest job in the medical world. Each and every one of them cares enormously when a child dies; so much so that some have breakdowns and have to leave that job. And that's when they know that the child is likely to die, have probably spent much time preparing parents and relatives for the death, and have worked hard to ensure that the child's passing is as peaceful as possible. Doctors and nurses do NOT become "immune" to the death of patients, unless they are totally unfit to be in their profession - or possibly, even to be members of the human race.

  44. Alan Brown Silver badge

    Re: After the last childish outburst...

    "Air France 447 flew into the Atlantic precisely because the autopilot bombed out and the human crew had lost all spatial awareness and weren't in a position to take effective control... seems familiar."

    It was worse than that: the pilots were so disoriented that _they_ flew the aircraft into the deck in their blind panic.

    If they'd let go of the sticks, the aircraft would have returned to level flight. It was only an iced-up pitot tube (one of three). They were so busy "trying to regain control" and fighting with each other that they didn't spend any time actually assessing the situation. You may as well have had Minions in the front seats.

  45. Alan Brown Silver badge

    Re: After the last childish outburst...

    "The problem here is that it seems people treat them as though they are. Like the bell end in the UK a few weeks ago who was filmed climbing into the passenger seat of his car with the autopilot engaged. "

    Climbing into the back or passenger seat has been a "thing" for quite a while - but it wouldn't be hard to prevent either. The cars have weight switches in the seats and can tell when someone's pulling this shit, but it takes someone to program the things to recognise "naughty driver" activities.
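    A minimal sketch of the seat-switch idea above. Everything here is hypothetical: the sensor class, threshold, and gating rule are invented for illustration and are not Tesla's actual logic.

```python
from dataclasses import dataclass

DRIVER_SEAT_MIN_KG = 25.0  # assumed threshold: below this, treat the seat as empty

@dataclass
class SeatSensor:
    """Hypothetical driver-seat reading: weight switch plus belt latch."""
    weight_kg: float
    belt_latched: bool

def autopilot_may_stay_engaged(driver_seat: SeatSensor) -> bool:
    """Keep the assist engaged only while the driver seat is occupied and belted."""
    return driver_seat.weight_kg >= DRIVER_SEAT_MIN_KG and driver_seat.belt_latched

# Driver present and belted: assist may remain on.
print(autopilot_may_stay_engaged(SeatSensor(weight_kg=80.0, belt_latched=True)))   # True
# Driver has climbed out: trigger a warn-and-stop handover instead.
print(autopilot_may_stay_engaged(SeatSensor(weight_kg=0.0, belt_latched=False)))   # False
```

    The weight switches already exist for seatbelt warnings, so the "someone to program the things" part really is the only missing piece.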

  46. Charles 9 Silver badge

    Re: After the last childish outburst...

    To Joe Stupid, autopilot = autonomous vehicle, and they're too dumb to know otherwise, so if you can't fix Stupid, you'll have to fix the perception.

  47. Anonymous Coward
    Anonymous Coward

    When I took some refresher lessons and accidentally reached for the wrong stalk on the wheel, the one with the cruise control on the now-modern car (hence the refresher lessons), my driving instructor would go spare. What reaction this autopilot would elicit from her, I'm scared to imagine..

  48. Prst. V.Jeltz Silver badge

    You can't rely on the stalks to be consistent from one car to the next. I think your instructor's in the wrong job!

  49. mikeyg

    My understanding of the Tesla Autopilot is that it's simply a combination of:

    1- Adaptive Cruise Control

    2- Collision Avoidance Control

    3- Lane Following Control

    4- Lane Change Control

    When all are turned on at the same time it becomes Autopilot. Can anyone say if I am right or not?

    It would be fun to have a Tesla to rip apart and play with, but they are still a bit expensive for that!
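    If the commenter's model is right, the bundle can be pictured as below. The four subsystem names and the "all on at once" rule come from the comment; the code itself is invented for illustration and bears no relation to Tesla's real architecture.

```python
# Hypothetical: Autopilot as several driver aids that only form "Autopilot"
# when all are enabled together, per the comment above.

SUBSYSTEMS = {
    "adaptive_cruise": True,
    "collision_avoidance": True,
    "lane_following": True,
    "lane_change": True,
}

def mode_name(enabled: dict) -> str:
    """All four assists active -> the bundle behaves as 'Autopilot'."""
    return "Autopilot" if all(enabled.values()) else "individual driver aids"

print(mode_name(SUBSYSTEMS))                            # Autopilot
print(mode_name({**SUBSYSTEMS, "lane_change": False}))  # individual driver aids
```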

  50. Oengus Silver badge

    1- Adaptive Cruise Control

    2- Collision Avoidance Control

    3- Lane Following Control

    4- Lane Change Control




Biting the hand that feeds IT © 1998–2018