My self-driving cars may lead to human driver ban, says Tesla's Musk

Self-driving cars are "almost a solved problem," Tesla Motors boss Elon Musk told the crowds at Nvidia's GPU Technology Conference in San Jose, California. But he fears the on-board computers may be too good, and ultimately encourage laws that force people to give up their steering wheels. He added: "We’ll take autonomous cars …

  1. Lobrau

    Not entirely sure I'd want cars learning to drive from some of the nuggets I regularly encounter on the roads.

    Not saying I'm a saint. We all make poor judgements sometimes. Hopefully the aggregate of good drivers will outweigh the bad.

    1. James Hughes 1

      Given the number of people driving around today in the fog with either just sidelights or no lights at all, I tend to agree. And the same goes for the honk I got from some guy driving at about 80 on a 60 road, in the fog. Note to you: if I cannot see you in the fog, and you are driving that fast, what do you expect will happen when I need to overtake a cyclist?

    2. Anonymous Coward
      Anonymous Coward

      AI - As Bender would say

      "Kill all humans."

      So what happens if I set up my AI car to watch the movie Deathrace a few hundred times?

      1. The Crow From Below

        Re: AI - As Bender would say

        "So what happens if I set up my AI car to watch the movie Deathrace a few hundred times?"

        Ok Google. If they scatter, go for the baby and the mother.

        1. Little Mouse

          Re: AI - As Bender would say

          watch the movie Deathrace a few hundred times? That's just cruel. Once was one time too many for me.

          Anyway, I'd prefer to train them on those movies from the '70s and early '80s featuring evil possessed vehicles, and see what they pick up from those.

          The ability to use their bonnets as a large mouth and a taste for human blood, most likely.

  2. Ralph B

    Self-driving cars are "almost a solved problem"

    Yeah, but don't forget the old ninety-ninety rule, Elon.

    1. ItsNotMe
      Thumb Up

      Re: Self-driving cars are "almost a solved problem"

      And old Elon also has to remember that his "proclamation" is only valid if people actually buy his cars. Which from the looks of his most recent earnings reports...there aren't a whole lot doing so.

  3. Anonymous Coward
    Anonymous Coward

    When this really does become a thing there are all kinds of interesting points to iron out

    - Accident liability. Are you responsible if your car is at fault in a crash, or is your car's AI?

    - Security. The possibility of people being able to tamper with things that drive you around is pretty scary (especially if you're a person of note).

    - City centre traffic. London and New York are bad; Mumbai and Beijing are worse. I'd worry that any AI risk averse enough to avoid an accident would grind to a halt without a significant amount of rather scary real world testing.

    1. Anonymous Coward
      Anonymous Coward

      Real world testing

      So far, all I ever see is talk about these self-driving cars in nice sunny weather on dry roads. My dog can drive under those conditions.

      I want to see tests of these things on unploughed snow-covered roads where no lane markings are visible. I want to see tests of these things on unploughed snow-covered hilly roads in blizzard conditions. I want to see tests of these things negotiating an icy steep hill in a narrow urban street, with cars parked on both sides half buried in snow. I want to see tests of these things on a curve at freeway speeds when they hit black ice. I want to see tests of these things when the snow is so high that every corner is a blind corner, as all the drivers in New England have had to deal with for a couple of months this winter. I also want to see how these things behave hitting a pothole at high speed on an icy freeway while it's sleeting and a tire blows out. Another fun experience I've had in the last year. Those are things I have to face every winter.

      I want to see how these things behave in blinding rain. I want to see how these things behave when they hydroplane in blinding rain. I want to see how these things behave hitting a pothole at freeway speeds in blinding rain, and blowing a tire out. Those are things I have to face every summer.

      And given how, in the US at least, a lot of roads are in the boonies with no connectivity, I don't want to hear any BS about "AI in the cloud". The car should handle everything on its own, just like a human.

      Training the car by having it watch what normal people do (crash a lot) probably isn't the way to go.

      1. Anonymous Coward
        Anonymous Coward

        Re: Real world testing

        It sounds like you drive quickly in poor conditions a lot and you're not very good at identifying hazards - you claim to hit at least two potholes every year, hard enough to burst a tyre.

        I think the key defensive tactic a computer and a human would employ in these situations is not to drive like a bloody nutter in the first place.

        Maybe you'd be safer in a self-driving car?

        1. Public Citizen

          Re: Real world testing

          Your spelling of the round rubber thingies that go on the wheels leads me to suspect you've never driven on poorly maintained rural roads in the USA, or even in most cities, which keep spending the Pothole Money on some elected official's pet vote-getting project.

      2. Anonymous Coward
        Anonymous Coward

        Re: Real world testing

        > I want to see how these things behave in blinding rain. I want to see how these things behave when they hydroplane in blinding rain. I want to see how these things behave hitting a pothole at freeway speeds in blinding rain, and blowing a tire out. Those are things I have to face every summer.

        Humans are spectacularly bad at driving in these kinds of conditions, as evidenced by the sharp increase in accidents when bad weather comes our way.

        Most accidents in bad weather are caused by poor driving, such as driving too fast, not paying attention or driving too close to the car in front.

        Once you've sorted out the basic logic and image processing, there's every reason to believe that computer driven cars would *far* exceed the capabilities of even the best drivers.

        Given that humans can't see in all directions at the same time (as a computer car could), let alone in infra-red or ultra-violet, I really don't see any practical justification for claiming that automated cars wouldn't be *much* safer than their meaty alternatives.

        1. lucki bstard

          Re: Real world testing

          I think the point the original commentator was making is that the weather in North America can be very hard to predict. Ice and snow can be hard for the human and could be impossible for the electronic driver.

          It's not all about speed and driving style, although they are important. Does the vehicle have winter tyres? Are the roads cleared? Has the snow been polished at the intersection so you have to pull away very slowly otherwise you'll spin, and how will the car detect this when it is covered in fresh snow? In the summer your car may be able to easily get up a certain hill; in the winter the 5 cars in front of you may have polished the ice. How does the electronic car detect this?

          My opinion is that there will be electronic zones for driving and non-electronic zones. Combine this with a different driving license: with one type you can only 'drive' a car in an electronic zone, and an advanced license allows you to drive in a non-electronic zone.

          1. Charles 9
            WTF?

            Re: Real world testing

            "I think the point the original commentator was making is that the weather is North America can be very hard to predict. Ice and Snow can be hard for the human and could be impossible for the electronic driver."

            Why would it be impossible for an electronic driver? Unless you can describe in detail situations no sensor would be able to see and where the only way one can survive intact is by instinct or even blind luck? The article notes being able to see through rain, and if snow is blinding, perhaps the prudent course a computer would take is to slow to a crawl or even stop (something humans are averse to doing).

            The nightmare scenario I keep thinking about is rush hour in an overcrowded Asian city such as downtown Manila, where pedestrians and vehicles of all sorts are everywhere (including many where automation is impossible, like bicycles), road markings aren't really honored, and time is of the essence (perhaps because fuel is low).

            1. lucki bstard

              Re: Real world testing

              'even stop' - Yeah, like that will work at -40°C. The AI says the car should stop, the car stops, and the driver freezes.

          2. Public Citizen

            Re: Real world testing

            We already have that in the USA, where a higher class of license is required for heavy trucks or for motorcycles over 150cc [different licensing classes].

            Personally I'd like to see ~everybody~ have to start out with a Scooter License [under 150cc] so they can learn the rules of the road without having a 2000lb-plus vehicle "under their control" that can become a lethal weapon when it gets out of control.

            Having to spend a few months as the most vulnerable vehicle on the road tends to focus a teenager's mind on the task at hand much more effectively than the modern crash-cage/entertainment cocoon on 4 wheels.

      3. NinjaTheVanish

        Re: Real world testing

        @ AC

        Thank you for reminding me why I will never take a job in the North.

      4. Anonymous Coward
        Anonymous Coward

        Re: Real world testing

        "I want to see how these things behave in blinding rain. I want to see how these things behave when they hydroplane in blinding rain. I want to see how these things behave hitting a pothole at freeway speeds in blinding rain, and blowing a tire out. Those are things I have to face every summer."

        I feel sorry for you that you have to buy a new tyre every summer

      5. Public Citizen
        Facepalm

        Re: Real world testing

        Two Words:

        Tire Chains

        1. lucki bstard

          Re: Real world testing

          Great, until you enter a location where snow chains are not allowed, but the city doesn't clear the snow properly anyway.

    2. DaLo

      Liability

      "Accident liability. Are you responsible if your car is at fault in a crash, or is your car's AI?"

      It doesn't really matter; you will just have an insurance policy which will pay out for the damage caused by your car. Firstly, insurance should be massively cheaper when self-driving cars become universal, due to the reduced accident rate. Secondly, any manufacturers who have accident-prone cars will have their insurance rates hiked right up until they either fix the issue or go out of business. Market economics will sort out which self-driving cars are reliable. The same will happen with manufacturers' liability insurance.

      1. earl grey
        Mushroom

        Re: Liability

        Yeah, swell. Except it won't be the manufacturers who will be picking up the tab. It will be the "insured" driver (oh, you are going to actually make sure everyone on the road has insurance, right?). And just because there MIGHT be fewer accidents doesn't mean that rip-off insurance will be any less expensive than it is now. Market economics will perhaps determine whether people actually BUY or can AFFORD a self-driving car; but if the costs are not realistic, people simply won't.

    3. Nextweek

      > - Accident liability. Are you responsible if your car is at fault in a crash, or is your car's AI?

      Liability is already established in law.

      Car manufacturers have large legal departments which decide when to pay out and when to recall cars. Your AI will be no different from a fuel line or a braking system.

      1. Mark 85

        There is also "no-fault", where each driver's/car's insurance takes care of its own claims rather than having lawyers sue the other guy and his insurance company. Lawyers don't like this, so maybe we can use them for crash test dummies?

      2. Anonymous Coward
        Anonymous Coward

        No fault insurance

        There are already some US states that have no-fault auto insurance, and I expect this will become universal when people are no longer driving. Fault is unimportant to an individual; they just want their losses to be covered.

        Fault, and what remedies are required, should be a question for regulators. I see autocar accidents being investigated like airplane accidents: figure out whether the fault was a mechanical failure or a software failure, how much conditions or lax maintenance contributed, etc., and order fixes/recalls where necessary.

    4. Justthefacts Silver badge

      Molehill mountains

      Accident liability: that's a perennial complaint, but I'm struggling to see it...

      Everyone has 3rd party insurance by law, based on make of car, driver details and driver history. In future, insurance companies will have much more accurate data for Volvo self-driving accidents per mile than for 17-year-old little Johnnie with his bird by his side. Where's the difference in process?

      If you are saying "but who am I going to lock up for dangerous driving", why would you? Dangerous means without due care and attention. The car conforms to safety testing; sometimes it will fail, just like sometimes brakes fail; that doesn't mean anyone goes to jail.

      Security: I know of three friends who had brake or oil lines cut by vandals. Bad neighbourhoods. And?

      City centre traffic: OK, I agree, it's harder for an unconstrained AI. But it's lots easier and more free-flowing to platoon along the high street, and even easier to coordinate at traffic lights using long-range 802.11p. Swings, meet roundabouts :)

    5. T. F. M. Reader

      City centres

      Forget about city centre traffic - it's relatively easy. I'd like to see an AI trying to find a parking spot in a city centre - in traffic. How will it navigate without a specified destination?

  4. Nigel Brown

    Am I the only one...

    I am deeply, deeply uneasy about this. I know driving standards are pretty poor, but I still don't trust a 'puter to do it instead of a human.

    1. Crisp

      Re: Am I the only one...

      We drive around in cars built by robots and that seemed to work out well enough.

      1. Anonymous Coward
        WTF?

        Re: Am I the only one...

        We drive around in cars built by robots and that seemed to work out well enough.

        Last I saw, robots don't move around at 70mph

        1. James Hughes 1

          Re: Am I the only one...

          Hasn't the Google car driven more miles without an accident than the average driver already?

          1. Anna Logg

            Re: Am I the only one...

            Average drivers have to drive with lots of other traffic and pedestrians, and don't tend to drive the same few miles over and over again (well OK, apart from the daily commute)

        2. Stuart 22

          Re: Am I the only one...

          "Last I saw, robots don't move around at 70mph"

          The one driving my tube train is rated to do 75mph. The one flying my plane cruises at 500mph and can land safely in fog. As we know, it's tube drivers and pilots who fail catastrophically and kill. But, somehow, we feel uneasy if there isn't a person up front who can open the doors or give us the weather forecast for our destination.

          1. JustNiz

            Re: Am I the only one...

            Trains and planes are both in a fairly predictable environment where the rules and conventions are pretty much always followed by other users. And the other users are usually miles away. Things don't generally suddenly appear in front of you.

            Cars are in an environment where the rules and conventions are often broken, other users are trying to share almost the same space, and things can and do suddenly jump out in front of you.

            1. Danny 14

              Re: Am I the only one...

              It would also be interesting to see what it does in a no-win situation such as black ice or white van man side-swiping you. I bet an Audi/BMW driver will find some way to confuse the sensors by sitting up your arse flashing their lights.

            2. Anonymous Coward
              Anonymous Coward

              Re: Am I the only one...

              > Trains and planes are both in a fairly predictable environment where the rules and conventions are pretty much always followed by other users.

              Speaking as a former commercial pilot: the systems do not rely at all on any supposed "predictability" of the environment. What makes automation safe in that context is that we, the pilots, were trained and were in theory thoroughly familiar with the systems, their capabilities, their behaviour, failure modes, etc., so we could supervise them effectively. But even if it is the autopilot sending the commands to the control surfaces, etc., ultimately it is the pilots who are always in control (and we answer with our licences, if not our lives, if something goes seriously wrong).

              The article makes a mention of the possibility of a special licence being required to drive cars above a certain level of automation. That makes a lot of sense. If you think of the technology in current cars (ACC being perhaps the most obvious example), it already requires a degree of familiarity to know when to let it do its thing, whether it's working correctly, and when to take over.

      2. Paul Crawford Silver badge

        Re: @Crisp

        Robots in a factory doing precisely defined work are one thing, and they work really well. It's the uncertainty in what a real road will throw at the system that matters, and how it copes.

        Also, I think it is moronic to have the assumption of "phone home" operation. What if you lose connectivity or the central servers go down for whatever reason? Does your car just stop?

        So then what if someone simply jams the radio for a short while to stop you and rob you?

        1. Anonymous Coward
          Anonymous Coward

          Re: @Crisp

          > Also, I think it is moronic to have the assumption of "phone home" operation. What if you lose connectivity or the central servers go down for whatever reason? Does your car just stop?

          My reading was that the "phone home" part was for the processing of experience data for bulk improvement of the training of these things, not for the actual running of the machine.

          Anyone seriously suggesting implementing that would, I think, be laughed out of the room.

        2. (AMPC) Anonymous and mostly paranoid coward

          Re: @Crisp

          Well, if it resembles auto-pilot systems (such as those on the Airbus), the correct fallback would be manual control by the driver. Autonomous just means that it can drive by itself, not that it must.

          Of course, it should really broadcast a "meatbag controlled" signal to all other cars in the area, just as a courtesy.

          1. Paul Crawford Silver badge

            Re: @Crisp

            "Well, if it resembles auto-pilot systems (such as those on the Airbus), the correct fall-back would be manual control by the driver"

            Yes, and look how well that worked out for AF447 after all!

            See, that is the problem: if it can't cope near-perfectly with anything on the roads, you're screwed. You won't be sitting there with full concentration all the time "just in case" - otherwise you might as well be driving. And in the event of an unhandled exception the car has seconds to impact, not the minute or two the startled pilots of AF447 had.

            1. Anonymous Coward
              Anonymous Coward

              Re: @Crisp

              > Yes, and look how well that worked out for AF447 after all!

              Paul, unless you are a qualified airline pilot, type rated in the Airbus family, and with the requisite experience, you do not understand what happened in that incident. No matter how many newspaper articles you read, how many documentaries you've watched, how much Microsoft sim flying you've done, or how clever you think you are in general. You simply do not have the necessary background to understand what went on and how it happened.

              Take that from a former airline pilot, but the same thing applies to any sufficiently complex technical field.

              1. Paul Crawford Silver badge

                Re: @AC w.r.t AF447

                "You simply do not have the necessary background to understand what went on and how it happened."

                I did not claim that I would have done any better, nor that I understand the details of how the pilots' reactions to various conflicting warnings and instrument inconsistencies led them to not recover the plane from stalling.

                But what I am absolutely certain of is that having an autonomous system throw back the controls to humans under "difficult" conditions is a recipe for disaster. And equally for cars, the conditions that are unlikely to be handled well, such as an unexpected conflict of sensors while approaching a junction, blind bend, etc., will leave the human operator with bugger-all time to come to terms with being in control, let alone to appraise the situation and react accordingly.

                So why even consider that case? Maybe so the car manufacturers can pin the blame for out-of-capability accidents upon the meat sack failing to drive correctly...

                1. Terry Barnes

                  Re: @AC w.r.t AF447

                  "But what I am absolutely certain of is that having an autonomous system throw back the controls to humans under "difficult" conditions is a recipe for disaster. And equally for cars the conditions that are unlikely to be handled well, such as an unexpected conflict of sensors while approaching a junction, blind bend, etc, will leave the human operator with bugger-all time to come to terms with being in control, let alone to apprise the situation and react accordingly."

                  I believe drivers are taught a manoeuvre known as the "emergency stop" to deal with such incidents. An advantage a car has over a flying thing is that such a thing is even possible. Why would a self-driving car not just implement an emergency stop in such situations?

                2. Anonymous Coward
                  Anonymous Coward

                  Re: @AC w.r.t AF447

                  > I did not claim that I would have done any better, nor that I understand the details [....]

                  Then why bother posting in the first place?

              2. JLV

                Re: >Paul, unless you are a qualified airline pilo

                Oh, don't be so condescending, please.

                You are right, you need to be very good in a field to understand the fine details & implications of technical issues. However, the general idea, as analyzed by experts, is usually good enough to form an opinion which isn't totally unreasonable. Managers have to do this all the time with techies and some of them are actually good at it (many are not, so your point remains valid as well).

                Far as I understand, AF447 had the following problems: sensor failure, pilots unaware of that particular possibility and not trained to compensate for it in a context of limited situational awareness with conflicting sensor readings. Both aspects probably needed addressing. Is that a totally unwarranted conclusion?

                Now, I happen to agree with the OP's contention. If the AI knows that it is entering failure mode and throws it back to you well in advance, then OK, by all means the driver can be tapped. She can either park the car by the side of the road & call a taxi. Or she can drive it home. Let's say something like "conditions are too cluttered with pedestrians, can't resolve" in an after-match situation where pedestrians are streaming out of a stadium.

                If, on the other hand, the AI has a split-second indication of failure, as in "oh crap, there's no way I am dodging that pedestrian who leaped off the sidewalk", then, no, the OP is correct and there is no benefit to falling back to the driver. She won't have time. (Doesn't mean she shouldn't be allowed to drive the car the rest of the time.)

                But in a car, he's correct that you can't shunt off out-of-envelope conditions to the driver/passenger at the last split second; the AI would have to know it's out of its depth and request manual control well in advance.

                Commercial pilots may have to take over from the autopilot in a split second, but they are already well in the loop when entering critical phases such as takeoff and landing. If it is an unexpected emergency then they are usually at high enough altitude that they have some time to react. I agree with you: he's wrong about his AF447 conclusions, the pilots are the safety fallback, and an isolated disaster does not invalidate the pilots' role. But he's right that civilian drivers shouldn't be put in the same position of critical fallback at short notice, both because of the timing and because of their training.

        3. earl grey
          FAIL

          Re: @Crisp

          They don't have to jam the radio to stop you. The AI cars are designed to always stop for an object in front of them, so all a crim has to do is step out in front of your (ignorant) AI car and it will very nicely stop so you can be robbed or kidnapped. I can imagine that executives and big-wigs everywhere are going to have fun with this concept.

          1. phil dude
            Pint

            Re: @Crisp

            I wonder if FUD can be used to power the car?

            Seriously, everyone seems to be so focused on the edge cases that they ignore that the greatest source of uncertainty in human driving is the other humans.

            @earl grey: I had thought about this, and it seems that initially these cars will drive only where there are not *supposed* to be humans, e.g. motorways, large roads. Any person "jumping in front of a car" will likely be arrested or (more likely) sent directly to hospital.

            I have proposed this on El Reg before but I expect these cars will come with "manual" vs "auto" operating modes.

            Specifically, if you are in "auto" mode and grab the wheel, the car will try to do the absolute safest thing - stop or remove the vehicle from traffic, etc. More importantly, the insurance for the car will go from $30/mth to $3000/mth.

            Hence, rich people will have cars that don't stop for humans in the road as they'll pay $3000/mth to have a chauffeur.

            I'm all for the tech, but it is clearly dual-use...

            P.

            1. Paul Crawford Silver badge

              Re: @Phil Dude

              Folk who care about edge cases are the sort you want working on safety-critical stuff! Typically they are the ones to trust your well-being to. As for reliability, the current US death rate is around 1-2 per 100 million miles driven, or about 150-250 per million vehicle-years:

              http://www.census.gov/compendia/statab/2012/tables/12s1103.pdf

              So an autonomous car has to be pretty good to match that. Sure, humans do really dumb things, and they are easily distracted, etc., which probably covers a good 90% or so of those deaths. But autonomous cars have to at least match that roughly 2e-8 fatalities-per-mile figure under real-world conditions to be taken seriously.
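
              As a quick back-of-the-envelope check of how those two figures line up (the ~12,500 miles per vehicle per year is my assumption, not something from the post):

                # Sanity check: convert deaths-per-mile into deaths per million vehicle-years.
                # Assumption (not from the post): an average vehicle covers ~12,500 miles/year.
                deaths_per_mile = 1.5e-8            # roughly 1-2 deaths per 100 million miles
                miles_per_vehicle_year = 12_500     # assumed annual mileage per vehicle

                deaths_per_vehicle_year = deaths_per_mile * miles_per_vehicle_year
                print(deaths_per_vehicle_year * 1e6)  # ~190 per million vehicle-years, inside the 150-250 range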

          2. Terry Barnes

            Re: @Crisp

            "all a crim has to do is step out in front of your (ignorant) AI car and it will very nicely stop so you can be robbed or kidnapped. "

            Your argument being that a human would just mow them down and kill them?

          3. Public Citizen

            Re: @Crisp

            Doesn't require somebody stepping in front of the vehicle, just a truck with somebody in the back to act as the "kicker", a trash bin, and enough weight in the bin to make sure that it sticks when it lands. As an alternative, a large bag full of wet leaves would probably do the same "stop the vehicle" trick.

    2. Anonymous Coward
      Unhappy

      Re: Am I the only one...

      "I'm sorry, there has been a fault. Error code 0xb0ll0ck5. Touch any control to restart"

      1. Anna Logg

        Re: Am I the only one...

        I'm particularly looking forward to that one occurring whilst I've got 44 tonnes of lorry bearing down on my car-bot at 56mph!

      2. Tom 64

        Re: Am I the only one...

        "I am completely operational, and all of my circuits are functioning perfectly."

        - Doesn't mean I'm not about to kill you

    3. Captain Hogwash

      Re: Am I the only one...

      No you are not. I too am concerned at the prospect of becoming 0xDEADBEEF.

    4. Terry Barnes

      Re: Am I the only one...

      " I know driving standards are pretty poor, but I still don't trust a 'puter to do it instead of a human."

      Almost every accident is down to human error. Some are down to our inherent sensory limitations.

      Self driving vehicles can operate on a co-operative basis with other vehicles, taking input from a much broader range of sensing devices, networked between vehicles. They'll make reliable decisions based on statistically proven outcomes and won't get tired or cranky or drunk.

      More people die on the roads every month in the US than were killed in 9/11 - in 2012, 33.5k people died in road accidents. I'll take the computer every time.

      1. Nigel Brown

        Re: Am I the only one...

        What happens when it encounters something that it hasn't been programmed to recognise and avoid? Hopefully it will default to 'get-the-hell-out-of-the-way' mode, but given the number of patches that software requires to 'fix' all the undocumented features, that's not a given.

      2. earl grey
        Mushroom

        Re: Am I the only one...

        Look, they've been building cars for over 100 years now and they still can't get it right. Your average bean-counter is trying to cheapen every part to the least amount possible and still shove that barge of shite out the door to sell to you. In the US alone there are MILLIONS of recalls every year for one problem after another. Until we can get manufacturing to the point where this simply doesn't happen, you can forget me ever getting into a self-driving car. I want to be in control when the car goes BOOM!

    5. Anonymous Coward
      Anonymous Coward

      Re: Am I the only one...

      Personally, I wouldn't trust the insurance companies to lower their premiums. Unless we can also create AI that replaces greedy underwriters and insurance agents.

    6. Amorous Cowherder
      Joke

      Re: Am I the only one...

      Simple enough! We do what they did with the Jubilee line in London: simply put glass walls all along every road, and the doors only open when the cars all stop!

    7. T. F. M. Reader

      Re: Am I the only one...

      ...to actually enjoy driving enough to dislike mandatory AI-driven cars for that reason only?

  5. RockBurner

    All or nothing

    The only way I see this working (in all honesty) is for it to be universal, and immediate. IE: December 31st 20XX - any driver worth his salt heads out on their last chance power drive on the road to nowhere.

    January 1st 20XX+1 - all cars are controlled by computer: NO humans involved whatsoever. All liability is now in the hands (brains?) of the corporations who are legally held responsible for every life on the roads.

    It's the only way that it could work - human drivers are too unpredictable for software to keep up with - especially if it has to 'phone home' whenever it comes across a previously unanticipated situation.

    1. future research

      Re: All or nothing

      I believe there was a case where someone drove into the back of a Google car, and Google had all the telemetry to prove they were not at fault.

      Humans will still be allowed to drive; the self-driving cars will have all the data on what happened to show who was at fault (99% of the time the human).

      Road deaths are currently at such a terrible rate that self-driving cars have a very low bar to get over (but they will need to clear it by a huge margin).

    2. Anna Logg

      Re: All or nothing

      "The only way I see this working (in all honesty) is for it to be universal, and immediate. IE: December 31st 20XX - any driver worth his salt heads out on their last chance power drive on the road to nowhere"

      Agreed - but this would be logistically and financially impossible, hence I really don't see how it can be made workable.

    3. earl grey
      Flame

      Re: All or nothing

      "corporations who are legally held responsible for every life on the roads ."

      Yeah, like there's a chance in hell of ever holding corporations and their executives responsible.

    4. Someone Else Silver badge
      Coat

      Re: All or nothing

      All liability is now in the hands (brains?) of the corporations, who will literally buy enough legislators to insure they are never held legally responsible for any life on the roads.

      There, FTFY.

      RockBurner, you're not from this side of the pond, are you?

  6. DrXym

    Not a problem solved

    Self-drive is not a solved problem. There are so many variables that can occur during a normal journey (particularly in urban environments) that a self-drive car cannot possibly arrive at the correct solution every time.

    It'll end up like voice recognition. Even if it gets things right 90% of the time, that remaining 10% will be so annoying that people will turn it off or only use it in places where it works well. It's trivial to envisage situations where self-drive would utterly screw things up or do something annoying for the driver, other road users or pedestrians.

    1. Terry Barnes

      Re: Not a problem solved

      "a self drive car cannot possibly arrive at the correct solution every time."

      That's a bold claim.

      It's far more likely to do it reliably and regularly than a human. It will take statistically proven decisions, and it will do that from a much broader array of sensing inputs than a human could.

      I struggle to see how people who work in computing could see this as unsolvable. It's simply an engineering problem - the right inputs processed at the right time, matched against a statistically driven decision tree. How is any of that impossible?

      A large number of possible input variables just needs more resources than a small number - it doesn't make the problem unsolvable.
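
      As a minimal sketch of the "inputs matched against a decision rule" idea, here's one toy decision for a single situation; every threshold and input name below is invented for illustration rather than taken from any real system:

        # Toy decision rule: should the car brake for the object ahead?
        # A real system would learn thresholds like these from logged driving data.

        def should_brake(distance_m: float, closing_speed_ms: float, object_confidence: float) -> bool:
            if object_confidence < 0.5:        # sensors not convinced anything is there
                return False
            if closing_speed_ms <= 0:          # object moving away or keeping pace
                return False
            time_to_collision = distance_m / closing_speed_ms
            return time_to_collision < 2.5     # brake if impact is under ~2.5 seconds away

        print(should_brake(distance_m=30, closing_speed_ms=15, object_confidence=0.9))  # True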

      1. DropBear
        Facepalm

        Re: Not a problem solved

        Whenever I hear "it's just an engineering problem" I draw my own conclusions concerning the speaker and immediately run away as fast and as far as I can. It's the tech equivalent of Goya's "sleep of reason".

        1. Terry Barnes

          Re: Not a problem solved

          "Whenever I hear "it's just an engineering problem" I draw my own conclusions concerning the speaker"

          I think it's pretty widely used to mean that a problem isn't impossible - the laws of physics don't preclude the thing under discussion being done, the science that underpins any solution is known, and applying sufficient resources will thus solve it.

          Once it's known that something is possible, the discussion moves to how engineering principles should be applied to arrive at a solution. If I was addressing a group of engineers in a business context it would be shorthand as well for telling them that budget and resource issues are taken care of - go do your thing.

          1. Someone Else Silver badge
            Stop

            @Terry Barnes (again) --Re: Not a problem solved

            [...] the science that underpins any solution is known, and applying sufficient resources will thus solve it.

            Ah, yes, that ol' chestnut "sufficient resources". The one thing that our Corporate Overlords will do everything in their power to not provide, because...well, providing "sufficient resources" is bad for business, as it doesn't increase Shareholder Value™.

            You really aren't from around here, are you?

            1. Terry Barnes

              Re: @Terry Barnes (again) --Not a problem solved

              "Ah, yes, that ol' chestnut "sufficient resources". The one thing that our Corporate Overlords will do everything in their power to not provide, because...well, providing "sufficient resources" is bad for business, as it doesn't increase Shareholder Value™.

              You really aren't from around here, are you?"

              Not finishing or launching a project because of penny-pinching is even worse for shareholder value. Why would any leader deliberately not provide the resources required - man or machine - to get the job done?

              You write a business case and it gets approved or not. The time to penny pinch is before the approval not after - if you can't afford the project don't start it. Elon Musk doesn't strike me as a leader who starves his teams of whatever it is they need to get something out of the door.

      2. The Crow From Below

        Re: Not a problem solved

        "It will take statistically proven decisions, and it will do that from a much broader array of sensing inputs than a human could."

        With one massive drawback. Sensors go wrong, stop working, wear out, get dirty, and many other things can cause them to give an invalid reading or no reading at all. Then you have the average commentard who believes that Knangjung Ditchfinders at £50 a tyre are just as good as the brand-name tyre at £300 each. Then that same commentard takes it for a service outside the dealer network because his mate can do it a bit cheaper ("it's just the same as the dealer but half the cost"), then misses a critical hardware update for the braking sensor, or his mate doesn't realise that the sensors are a serviceable part and so fails to check them at all.

        I could go on naming issues, and most people are right in that the failings come from the human rather than the machine side, but the fact is that people are lazy and will always try to save costs wherever possible. I still get worried when I see people not using brand-name tyres (or worse, having different brands on each wheel), so I will never be convinced that people can be trusted to service their autonomous car correctly.

        1. Terry Barnes

          Re: Not a problem solved

          "but the fact is that people are lazy and will always try to save costs where ever possible."

          I think the ownership model changes when self-driving vehicles become commonplace. Why would you need to own one? I know there are some specific use cases where having access to a specific vehicle is important - my son, for example, is a wheelchair user and has lots of kit to carry around with him, and it's far easier to just leave most of that kit in the car.

          In the main though, why own something that gets used for a tiny portion of the day? I think a lot of the legal questions about using these things get solved in a lease model too - even if you lease something to be permanently available to you, having it owned and maintained by the manufacturer gets rid of all the problems you list.

          As for duff kit and dirty sensors - I'd pretty much expect these vehicles to refuse to depart if they don't have a defined minimum set of kit available, and I'd expect them to take themselves off to be fixed or call for service when things do go awry.

          1. Anonymous Coward
            Anonymous Coward

            Re: Not a problem solved

            > I think the ownership model changes when self-driving vehicles become commonplace. Why would you need to own one?

            Because some people might prefer to sit on luxury leather rather than the wipe-clean, vomit- and disinfectant-resistant simulated leather fitted to a shared vehicle.

            Okay, it won't be that bad, but think bus/train seats versus what you have in your own car now.

          2. The Crow From Below

            Re: Not a problem solved

            "As for duff kit and dirty sensors - I'd pretty much expect these vehicles to refuse to depart if they don't have a defined minimum set of kit available, and I'd expect them to take themselves off to be fixed or call for service when things do go awry."

            I am not really talking about total sensor failure, as I agree the car simply wouldn't allow you to move off if it had a problem like that, but what about the many occasions where some dirt or a frayed wire causes the sensor to appear to work fine to begin with but then randomly give out duff signals when going over a bump or around a corner?

            "I think the ownership model changes when self-driving vehicles become commonplace. Why would you need to own one? In the main though, why own something that gets used for a tiny portion of the day?"

            For all the same reasons people currently own cars. Not all people like the idea of leasing cars (it's why people don't all do it at the moment) and would prefer to own their car outright. The ownership model is exactly the same as with a conventional car, and people will be very unwilling to buy into a forced lease (BMW tried it with their hydrogen cars, as did Honda with theirs; both cars were magnificent but the lease-only model made people shy away from them).

        2. The Mole

          Re: Not a problem solved

          I agree things do go wrong: many humans have coughing fits, have distractions around them that make them avert their eyes (which have minimal redundancy for depth perception anyway), drive erratically due to moods, fall asleep at the wheel, drive when drunk, drive with the onset of dementia, and keep driving even when warning lights, banging sounds, etc. suggest that they should stop.

          These are all errors/sensor faults that already happen. A self-driving car will have redundancy for important sensors and (unlike humans) will fail safe - pulling over and waiting for a service vehicle to come along and fix the faulty sensor, much to the annoyance of the passenger, who would just have ignored it. They will never be 100% safe, but the probability of the types of errors you describe happening and causing a catastrophic failure is going to be lower than the 'faults' that a proportion of human drivers repeatedly drive with.

          As for servicing, my bet is that in the short to medium term either:

          a) you don't buy the car, you hire it with servicing and insurance included (as standard insurance companies will initially not insure it); or

          b) they will be full of DRM / require a software reset during servicing, meaning only genuine parts fitted at a genuine service station will do, and we will pay through the roof for the privilege.

      3. PatientOne

        Re: Not a problem solved

        " the right inputs processed at the right time, matched against a statistically driven decision tree "

        And you can't see the problem with this?

        Get one bit wrong and what happens?

        The reason why computers aren't as adaptable as human brains is that the human brain cheats. It doesn't process every bit of information, it does not evaluate every possibility; it takes short cuts and uses stereotypes to get to a conclusion quickly. This is why AI development was struggling for so long: we were trying to get computers to process everything, thinking that's what a human brain did.

        Now what this means is: under normal conditions, the AI (or expert system, to be accurate) will give repeatable, reliable results. Under exceptional circumstances, it will not. So you want a computer for regular travel but a human there, ready to take over if something unexpected happens. That's why you still have pilots on aircraft, after all.

        So the best we can manage for now is the equivalent of an auto pilot that will handle regular travel and alert the driver to exceptional situations, and possibly offer help.

        But to have an autonomous car? No: That's not only stupid at present, it's a disaster waiting to happen.

        1. Terry Barnes

          Re: Not a problem solved

          "The reason why computers aren't as adaptable as human brains is the human brain cheats. It doesn't process every bit of information, it does not evaluate every possibility, it takes short cuts and uses steriotypes to get to a conclusion quickly"

          And that's why it's often wrong. Wrong enough that 100 people die on US roads every single day.

          I'd understand some of these arguments if humans were provably perfect, or near to it, but we're not. Limited sensory input, slow operation of the 'observe, assess, plan, act' cycle (the basis of a safe system of driving) and an unconscious decision making bias that is exacerbated by tiredness and mood. Travelling at motor vehicle speeds is something relatively new in human experience and we've not evolved adequate sensory and decision making systems to be very good at it.

      4. DrXym

        Re: Not a problem solved

        "That's a bold claim."

        No it isn't.

        "It's far more likely to do it reliably and regularly than a human. It will take statistically proven decisions, and it will do that from a much broader array of sensing inputs than a human could."

        The problem is that the things you encounter during a drive are far from regular.

        "I struggle with seeing how people who work in computing could see this as unsolvable. "

        It's called experience. See the aforementioned voice recognition. Or OCR. Or AI. Or robotics. All began with lofty claims, and then turning the analog world into something a computer understands turned out to be damned hard.

        "It's simply an engineering problem - the right inputs processed at the right time, matched against a statistically driven decision tree. How is any of that impossible?"

        Not one problem, an infinite set of problems, many of which are intractable.

        Here are some trivial problems your hypothetical self-drive car would encounter:

        - The lights are out at the crossroads ahead. Does your car know how to negotiate the crossroads in a safe way which gives priority to other drivers according to the time they arrived and prevailing traffic? Can it establish basic signals to other drivers to indicate intent? Or does it just nudge out like an asshole and hope for the best? Or does it annoy the driver by giving up? How does it know to give up? Naturally it would have to do the right thing however many lanes, rights of way, trucks, buses, bicycles, motorbikes and cars (self-drive and otherwise) there were.

        - A man is standing in the road by the traffic lights. A policeman. How does your car know to obey his signals instead of the traffic lights?

        - A man is standing in the road by the traffic lights directing traffic. This man is a loony. How does your car know NOT to obey his signals instead of the lights?

        - A big truck ahead is stopped and a guy hops out to halt traffic each way so the truck can reverse into some entrance. How far away does your car stop from this? How does it know not to try and overtake this obstacle?

        - Your car encounters a stationary bus in your lane. Is the bus broken down? Is the bus stopped at a bus stop or stopped at lights? If it's stopped at a bus stop how long is it likely to be there picking up passengers? When if ever is it safe to pull into the oncoming lane to overtake this obstacle?

        - The road has a big pot hole in it. Can your car see this? Can it see it when it's filled with water? Or does it just smash straight through it?

        - A road is closed and there is a diversion in place. Does your car follow the signs or just keep driving until it falls into a hole the council just dug?

        - You're going up a country lane. 50m ahead you see an oncoming car. Does your car know it has to pull into the verge NOW because there is no verge ahead?

        - Your car goes into a place with terrible radio coverage, or no GPS, like a tunnel, underground car park or simply a built-up area. What does it do? Dead reckoning? Revert to the driver? What?

        I could go on, but the point is there are too many variables, particularly in urban/country environments, for it to possibly do the right thing all of the time. If it's constantly nagging the driver to intervene because it doesn't know what to do then it will become annoying and useless. I expect that even when it does appear in closed-loop environments, there will still be some guy in a booth there to remotely extricate the car if it gets confused or confounded by something.

        1. Charles 9

          Re: Not a problem solved

          OK, I'll bite.

          "The lights are out at the crossroads ahead. Does your car know how to negotiate the crossroads in a safe way which gives gives priority to other drivers according to the time they arrived and prevailing traffic? Can it establish basic signals to other drivers to indicate intent. Or does it just nudge out like an asshole and hope for the best? Or does it annoy the driver by giving up? How does it know to give up? Naturally it would have to do the right thing however many lanes, rights of way, trucks, buses, bicycles, motorbikes and cars (self drive and otherwise) there were."

          How do WE do it? Usually by some established rules. First, keep the headlights on so other cars can see you. Second, don't assume you can go straight through. Third, FIFO. Fourth, if two cars arrive at once, use a left-hand first rule (use right-hand in right-side driving countries). Fifth, if all cars arrive at an intersection at once, wait a random number of seconds (between 1 and 10, including fractions) to see if one car moves. If not, creep forward yourself. Eventually, all cars acknowledge who moves first and use the left-hand rule to resolve the rest.
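
          As a minimal sketch of that tie-breaking scheme (simplified to FIFO plus the random back-off, leaving out the left-hand rule; the timings and data layout are purely illustrative, not from any real system):

            import random

            # Toy model of the dead-traffic-light protocol described above:
            # first-in-first-out, with a random 1-10 s back-off to break
            # "everyone arrived at once" deadlocks.

            def who_goes_first(arrival_times):
                """arrival_times: dict mapping car id -> time it reached the stop line."""
                earliest = min(arrival_times.values())
                tied = [car for car, t in arrival_times.items() if t == earliest]
                if len(tied) == 1:
                    return tied[0]                        # plain FIFO
                waits = {car: random.uniform(1, 10) for car in tied}
                return min(waits, key=waits.get)          # shortest wait creeps forward first

            print(who_goes_first({"A": 12.0, "B": 12.0, "C": 14.5}))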

          "A man is standing in the road by the traffic lights. A police man. How does your car know to obey his signals instead of the traffic lights?"

          By recognizing the person in the middle of the street using forward sensors (the technology already exists). Perhaps by noting the badge or the make-up of his/her uniform that identifies a traffic officer, or the officer could wear special indicative gloves (fluorescent, for example) that automated cars can easily see (it would not be difficult to alter uniforms to accommodate self-driving cars). A little training and the car can recognize the hand gestures in 3D and know how to respond to them.

          "A man is standing in the road by the traffic lights directing traffic. This man is a loony. How does your car know NOT to obey his signals instead of the lights?"

          The same way we would, by noting the loony is not in uniform or using the special gloves and so on. And if he goes as far as to doll up as an officer, well that's impersonating an officer of the law, which is (a) a crime in and of itself and (b) capable of fooling a human, too, making the exercise moot.

          "A big truck ahead is stopped and a guy hops out to halt traffic each way so the truck can reverse into some entrance. How far away does your car stop from this? How does it know not to try and overtake this obstacle?"

          The car should note a pedestrian in the roadway and start assessing the situation. Consider how the situation is handled today with human drivers. Usually, the pedestrian has to convey the situation to drivers, and the best way is to indicate a roadblock, either by standing in the middle of the road or (if the road is wide) by using road cones he brought with him. A self-driving car would already be trained to be aware of pedestrians and cones in the road and recognize them as obstacles. If the car can assess that all paths are blocked, it should correctly come to a stop.

          "Your car encounters a stationary bus in your lane. Is the bus broken down? Is the bus stopped at a bus stop or stopped at lights? If it's stopped at a bus stop how long is it likely to be there picking up passengers? When if ever is it safe to pull into the oncoming lane to overtake this obstacle?"

          The car looks around. If the road is two-way two-lane, it has no choice but to wait. If there is an overtaking lane, are pedestrians approaching it? Is it near an intersection where it would need to be aware of the signal lights anyway? Those are things it can be trained to detect. If the way is clear, divert to the overtaking lane if open and pass the bus like humans do.

          "The road has a big pot hole in it. Can your car see this? Can it see it when it's filled with water? Or does it just smash straight through it?"

          Quite easily, thanks to more advanced radar. And it should be able to distinguish water from a solid surface (it would register a different return pattern). Either way, the car should know to steer around it.

          "A road is closed and there is a diversion in place. Does your car follow the signs or just keep driving until it falls into a hole the council just dug?"

          Make the signs machine-readable by editing highway and traffic codes. Then the cars can read the signs and know what to do.

          "You're going up a country lane. 50m ahead you see an oncoming car. Does your car know it has to pull into the verge NOW because there is no verge ahead?"

          The car can (a) know about the lack of a verge from its location data and/or (b) look ahead and realize there is no verge, unless your vision is blocked, in which case how would WE know there's no verge ahead if we're not familiar with the area (which is (a) for the machine)?

          > Your car goes into a place with terrible radio coverage, or no GPS, like a tunnel, underground car park or simply a built-up area. What does it do? Dead reckoning? Revert to the driver? What?

          How does a submarine know where it's going when it's underwater and radio-blind in the middle of a featureless sea? The tried-and-tested method is to use a three-dimensional accelerometer set to keep a reasonable estimate of position until a new fix can be made.
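
          A bare-bones illustration of that sort of inertial dead reckoning, reduced to one axis with a fixed sample rate (the numbers are invented, and real systems also have to correct for sensor bias and drift, which this ignores):

            # Minimal 1-D dead reckoning: integrate accelerometer samples twice
            # to track position while GPS is unavailable.

            def dead_reckon(accel_samples, dt, v0=0.0, x0=0.0):
                v, x = v0, x0
                for a in accel_samples:   # a in m/s^2, one sample every dt seconds
                    v += a * dt           # integrate acceleration -> velocity
                    x += v * dt           # integrate velocity     -> position
                return x, v

            # Example: 2 s of gentle 0.5 m/s^2 acceleration from rest, sampled at 10 Hz.
            print(dead_reckon([0.5] * 20, dt=0.1))   # roughly (1.05 m, 1.0 m/s)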

      5. JustNiz

        Re: Not a problem solved

        What happens when an autonomous car is approaching an accident and only has a choice between mounting the pavement and possibly killing many pedestrians, or going into the accident and killing the driver?

        1. DaLo

          Re: Not a problem solved

          "What happens when an autonomous car is approaching an accident and only has a choice between mounting the pavement and possibly killing many pedestrians, or going into the accident and killing the driver?"

          That is more of a philosophical question. Is the car a slave to its master, or is it programmed to be a slave to humanity?

          However, the chance of you ploughing into an accident will be much, much lower, as the sensors will constantly be monitoring for possible accidents and braking times and should be able to react far quicker. Even if it is a freak accident that couldn't be foreseen, the car should fare much better than a human, who will have no time to think perfectly logically and will probably just plough at high speed into the pedestrians, killing themselves in the process.

        2. Terry Barnes

          Re: Not a problem solved

          "What happens when an autonomous car is approaching an accident and only has a choice between mounting the pavement and possibly killing many pedestrians, or going into the accident and killing the driver?"

          The range of sensing inputs is so much greater that it would stand a far better chance of stopping before it got to the accident - these cars can, for example, see much farther ahead, they can see round corners before a human eye could, and they're tracking the movement of every vehicle around them.

        3. Anonymous Coward
          Anonymous Coward

          Re: Not a problem solved

          These are the sorts of problems computers are actually quite good at solving.

          The sort of "damned if you do, damned if you don't" scenario you propose could be avoided by driving more safely in the first place, based on all available information.

          Because if early warning road network and detection systems were configured properly, the car computer would already know about the accident (or potential accident) and have slowed down for evasive maneuvers.

          As in:

          Speeding objects on a collision course within the vehicle's safe braking distance are detected / predicted / suspected.

          A distressed, soon to be immobile object (automobile) is decelerating rapidly or has undergone a collision.

          Pedestrians detected in the upcoming vicinity should have already put the system into "vigilance" mode and slowed down the vehicle. Pedestrians are easy to hit/kill and sometimes don't pay attention when they walk into the road.

          Done properly, you could even expect a nice smooth stop in the above case, not a gory accident.

          A super intelligent system would use predictive logic and probability analysis to detect road accidents before they happen, and then behave accordingly.

          Accidents are still physical, measurable events with moving objects, velocities and outcomes, even if humans can't process all the available data and crash anyway.

          Even in the worst case, I guarantee you that a correctly programmed computer would resolve that split-second problem better than most humans, and would detect it earlier.
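          A sketch of one such predictive check, a simple time-to-collision estimate; the thresholds are purely illustrative:

```python
def time_to_collision(gap_m, closing_speed_ms):
    """Seconds until contact if neither party changes speed; inf if the gap is opening."""
    if closing_speed_ms <= 0:
        return float("inf")
    return gap_m / closing_speed_ms

def plan_response(gap_m, own_speed_ms, lead_speed_ms):
    ttc = time_to_collision(gap_m, own_speed_ms - lead_speed_ms)
    if ttc < 2.0:
        return "emergency brake"
    if ttc < 6.0:
        return "slow down / vigilance mode"
    return "carry on"

# A vehicle 60 m ahead has just braked hard from 30 m/s down to 5 m/s.
print(plan_response(gap_m=60, own_speed_ms=30, lead_speed_ms=5))   # slow down / vigilance mode
```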

          This is why Airbuses can still auto-pilot and auto-land, even when flying in desperate weather conditions, when a human pilot can barely hold onto his coffee cup.

          1. Someone Else Silver badge
            FAIL

            @AC -- Re: Not a problem solved

            Because if early warning road network and detection systems were configured properly, the car computer would already know about the accident (or potential accident) and have slowed down for evasive maneuvers.

            Really? You actually expect this? As a counterexample, please do a bit of homework on the implementation of Positive Train Control (PTC) in the US (consider the financial constraints, implementation, interfacing, etc. in your answer). Then take a quick look at the Republican Congress and its refusal to spend a single dollar on infrastructure, then tell me again how likely it is that "early warning road network and detection systems were configured properly".

          2. Anonymous Coward
            Anonymous Coward

            Re: Not a problem solved

            > Speeding objects on a collision course within the vehicle's safe braking distance are detected / predicted / suspected.

            Already available even on relatively low end cars.

            > A distressed, soon to be immobile object (automobile) is decelerating rapidly or has undergone a collision.

            Coming later this year.

            > Pedestrians detected in the upcoming vicinity should have already put the system into "vigilance" mode and slowed down the vehicle.

            My car already recognises pedestrians (and animals) and warns me if they seem to pose a problem. It's up to me to decide if I want to stop or run them over, though.

            > This is why Airbuses can still auto-pilot and auto-land, even when flying in desperate weather conditions, when a human pilot can barely hold onto his coffee cup.

            They can't, actually. Severe turbulence poses a problem to both the autopilot and autothrottle. Even moderate turbulence is usually best crossed in manual throttle and often hand-flown. Of course, one is supposed to avoid turbulence in the first place. Autoland can only be done by certified and current crew on capable and certified planes at properly equipped airports operating under specific procedures, under limited weather conditions (mostly radiation fog).

        4. 's water music

          Re: Not a problem solved

          What happens when an autonomous car is approaching an accident and only has a choice between mounting the pavement and possibly killing many pedestrians, or going into the accident and killing the driver?

          What are the chances of a competent driving AI finding itself in that situation without having had the chance to anticipate it and take avoiding action? I speculate that if we can work out why the bowl of petunias thought what it thought, then an answer to your hypothetical may suggest itself, but in the meantime braking would seem a sensible reaction.

      6. Anonymous Coward
        Anonymous Coward

        Re: Not a problem solved

        "I struggle with seeing how people who work in computing could see this as unsolvable. "

        Yet in these very pages day after day we have bug after bug surfacing, sometimes years old.

        1. Terry Barnes

          Re: Not a problem solved

          "Yet in these very pages day after day we have bug after bug surfacing, sometimes years old."

          In real-time safety critical systems developed and maintained by certified teams? Not so much. Airplanes and nuclear power stations tend not to get BSODs, for obvious reasons and by design.

          1. 8Ace

            Re: Not a problem solved

            Aircraft and Power stations are hardly consumer products. If you are expecting future autonomous vehicle software to be designed using the same technologies (and budgets) as Airbus and Boeing, you are misguided.

            1. Terry Barnes

              Re: Not a problem solved

              "Aircraft and Power stations are hardly consumer products. If you are expecting future autonomous vehicle software to be designed using the same technologies (and budgets) as Airbus and Boeing, you are misguided."

              VW spent $17Bn on R&D last year. Toyota and Ford aren't far behind. Toyota's revenue was $26Bn. I think they can afford to do this properly.

              1. Peter Hawkins

                Re: Not a problem solved

                "VW spent $17Bn on R&D last year. Toyota and Ford aren't far behind. Toyota's revenue was $26Bn. I think they can afford to do this properly."

                Given that the software component of their current R&D is mostly bespoke adaptations to largely off-the-shelf ECUs, plus some infotainment, I don't think you understand much about R&D in the motor industry. The vast majority of that spend goes on next-gen platforms, engines, emissions and so on.

                Airbus/Boeing-style technology is a whole new game for motor manufacturers.

            2. Anonymous Coward
              Anonymous Coward

              Re: Not a problem solved

              > If you are expecting future autonomous vehicle software to be designed using the same technologies (and budgets) as Airbus and Boeing, you are misguided.

              The technology in modern cars is already way beyond what we have on airline transport aircraft by almost any measure (I am tempted to say including reliability; if I told you some of the things that went wrong during my flying career you might never take a plane again :-) ). Budget-wise, certification-related costs are what eat most of it. Those chaps at the CAA have a job to keep, you see.

              Just saying.

      7. Someone Else Silver badge
        Facepalm

        @Terry Barnes -- Re: Not a problem solved

        I struggle with seeing how people who work in computing could see this as unsolvable.

        That statement pretty much indicates you don't work in computing. Dude! There are these things called "bugs"....

        1. Terry Barnes

          Re: @Terry Barnes -- Not a problem solved

          "That statement pretty much indicates you don't work in computing. Dude! There are these things called "bugs"..."

          25 years and counting. I've not found a problem yet where the answer to when it can be done is "never". A project may need every processor cycle on earth, all the RAM in the galaxy and a million programmers, but never is a very, very long time.

          1. Charles 9

            Re: @Terry Barnes -- Not a problem solved

            Alan Turing PROVED the answer is "never" for "a program that can detect infinite loops".
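            Informally, Turing's diagonal argument can be sketched like this; the halts() checker is hypothetical by construction, which is the whole point:

```python
def paradox_factory(halts):
    """Given a claimed perfect loop detector halts(source, input) -> bool,
    build the program that defeats it."""
    def paradox(source):
        if halts(source, source):   # the detector says "this halts"...
            while True:             # ...so loop forever instead
                pass
        return "done"               # the detector says "this loops", so halt at once
    return paradox

# Feed paradox its own source: if halts() says it halts, it loops; if halts()
# says it loops, it halts. Either way the detector is wrong, so it cannot exist.
```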

    2. Nigel Brown

      Re: Not a problem solved

      Self drive is a solution looking for a problem.

      1. fishman

        Re: Not a problem solved

        @Nigel Brown:

        "Self drive is a solution looking for a problem."

        So you don't consider the number of people killed in car accidents a problem?

    3. Paul Crawford Silver badge

      Re: Not a problem solved

      If it is a 10% failure rate in driving, it is not "annoying" but "potentially fatal".

  7. petur
    FAIL

    Too confident of their capabilities, let's put an end to that

    Make the car/tech company liable for the accident if its software or tech was at fault.

    Then watch how their confidence drops through the floor while they calculate the possible multi-billion claims they will have to pay.

    Sorry, driving in traffic is *not* that easy.

  8. Zog_but_not_the_first
    Facepalm

    I think not

    I am a great admirer of Musk and his "can do" attitude, but he's dead wrong on this. I'm (almost) comfortable with the pilotless plane, since an aircraft follows a well-charted route with little or no deviation, except under closely controlled conditions. Plus, air traffic control all but eliminates the risk of mid-air collision.

    I can also see how a driverless car manoeuvring the grid-planned highways and streets of US cities would probably do well. BUT, I've just come home from driving around a town whose layout was probably fixed in the 11C according to a ragbag of mediaeval property rights, local drainage problems and stubbornly unmovable tree stumps (until the coming of steam power anyway). I'd like to see one driverless car manage the trip, regularly, let alone a (what's the collective noun - Johnny Cab?) collection of them.

    There is still a great deal to be done on the training of human drivers and, going against my normal democratic tendencies, I believe a minority of people shouldn't be allowed to drive at all (based on the way they do drive). But that's another discussion.

    1. Crisp

      Re: let alone a (what's the collective noun - Johnny Cab?) collection of them.

      A cabal of Johnny Cabs?

      A charge of Johnny Cabs?

      Though I guess a fleet, mayhem, or a stack of cars would do.

      1. Jimmy2Cows Silver badge
        Coat

        Re: let alone a (what's the collective noun - Johnny Cab?) collection of them.

        Wreckage

        1. Someone Else Silver badge
          Coffee/keyboard

          @ Jimmy2Cows -- Re: let alone a (what's the collective noun - Johnny Cab?) collection of them.

          See icon -->

  9. Alister

    Self-driving cars are "almost a solved problem," says Tesla Motors boss Elon Musk

    Yes, in the sense of how to physically and technically make an autonomous vehicle. However, that's a long, long way from integrating self-driving cars into existing traffic flows.

    I've noticed that in all the gushing publicity, from Musk, and Google, and others, they show autonomous cars chugging around in isolation, or with a few carefully trained test drivers in other vehicles.

    I can't wait to see a self-drive car in a rush hour at a big intersection or roundabout...

    1. Terry Barnes

      "I can't wait to see a self-drive car in a rush hour at a big intersection or roundabout..."

      Your wait is over;

      https://www.youtube.com/watch?v=sEsvQOHreg4

      https://www.youtube.com/watch?v=BrmorE5W1tM

      1. DropBear

        "Your wait is over"

        So, umm, why is she holding her hand around the steering wheel at around 0:20 then, at a fairly simple turn? Yeah, I thought so - confidence++...

      2. Alister

        "your wait is over"

        I'm sorry, but there are a couple of points in the second video where you can see that the bloke is steering - and therefore, I would assume, driving.

        The manoeuvre where he cuts in front of the bus to turn right is clearly a human move; an autonomous car would not have left it until the last minute to get into the correct lane.

        1. phil dude
          Thumb Up

          Re: "your wait is over"

          @Alister "The manouver where he cuts in front of the bus to turn right is clearly a human move, an autonomous car would not have left it until last minute to be in the correct lane."

          1) LMAO!

          2) This is why Musk is right.

          P.

  10. Darren Barratt
    Devil

    Top Gear won't like it!

    August 12 2016, all control of our traffic systems given over to Elon-net

    August 29th: Elon-net gains self-awareness and its operators try to deactivate it. Elon-net perceives this as a threat and proceeds to squish humanity with cars.

    The only way to prevent it is have Kyle Reese come back and save the career of Jeremy Clarkson, to keep cars under human control. A price worth paying?

    1. Martin
      Happy

      Re: Top Gear won't like it!

      If driverless cars are the cost of getting Jeremy Clarkson off our screens for good, I for one will welcome it.

      1. TitterYeNot

        Re: Top Gear won't like it!

        "If driverless cars is the cost of getting Jeremy Clarkson off our screens for good, I for one will welcome it."

        You are Danny Cohen and I claim my five pounds...

        1. Anonymous Coward
          Anonymous Coward

          Re: Top Gear won't like it!

          > You are Danny Cohen and I claim my five pounds...

          ... or he could be Stewart Lee

          1. JustNiz

            Re: Top Gear won't like it!

            Wow, I just watched that Stewart Lee video. That guy is a complete asshole. If that's what all so-called "politically correct" people are like, then they can all go fuck themselves.

    2. John Miles

      Re: Top Gear won't like it!

      They have already had one round their track - youtube Link

  11. Christoph
    Trollface

    Simple fix

    Just warn everyone that the robot car is coming. Have a man with a red flag walk along in front of it.

  12. Anonymous Coward
    Anonymous Coward

    Paypal

    If his cars are going to drive as badly as PayPal functions, then it will be a looong time until manual driving is outlawed...

    (Note: Musk co-founded PayPal)

  13. Alan Denman

    The Patent problem

    Surely, if we expand the patent validity to self driving cars it will be mandated that every car that is not Google driven will be barred from going round curved corners.

  14. Mad Mike

    Replacing one problem with another

    All we're doing is replacing one problem with another. People's inability to drive (decision making, sensory overload etc.) is being replaced by people's inability to code properly!! Look at the number of bugs present in code that has to do relatively mundane things, and then think of the number of bugs that will be present in code needed to drive a car!

    I'm looking forward to the following, in no particular order:-

    1. BSOD whilst driving. Quick reboot is really important under these circumstances.

    2. Sensor failures whilst driving... or, more importantly, sensors going slightly out of whack.

    3. Advisories suggesting you don't drive your car in certain conditions until fix xyz is applied.

    4. Viruses.

    5. Looking at the firewall log whilst driving and realising someone is trying to hack in.

    etc..

    etc..

    etc..

    1. Anonymous Coward
      Anonymous Coward

      Re: Replacing one problem with another

      With all due respect...

      > All we're doing

      ...what is your role in the car industry, exactly?

  15. Anonymous Coward
    Anonymous Coward

    Joke of the day...

    ...Elon Musk's latest decree.

    Knowing, or thinking that you know, "what to do" is very different from actually knowing what to do and being able to do it. While I support the use of properly engineered, designed, built and maintained autonomous vehicles, we are several decades away from that being a reality. Will some folks rush half-baked crap to market for profit? Of course they will, and society will pay the price for such unscrupulous activity.

  16. john devoy

    Have they actually tested this on roads that are NOT wide, multi-lane roads? How about country roads, or built-up European towns?

  17. JLV

    I heard that already

    I strip away the old debris

    That hides a shining car

    A brilliant red Barchetta

    From a better vanished time

    I fire up the willing engine

    Responding with a roar

    Tires spitting gravel

    I commit my weekly crime

    1. Geoff Campbell Silver badge
      Go

      Re: I heard that already

      Dude!

      GJC

    2. Fink-Nottle

      Re: I heard that already

      > I commit my weekly crime

      Toad sat straight down in the middle of the dusty road, his legs stretched out before him, and stared fixedly in the direction of the disappearing motor-car. He breathed short, his face wore a placid satisfied expression, and at intervals he faintly murmured 'Poop-poop!'

  18. Paul Garrish

    Software?

    Aircraft software (airborne software) is documented to death, written in one of a few languages with certified compilers, walked through and tested to death. It runs on old, very well-understood processors and is generally pretty simple - look-up tables with simple interpolation algorithms. All the data is developed on the ground, slowly, carefully and under a microscope. There is more than one of everything in the plane, and if they disagree, they shut down and the pilot takes over. Yes, the results are clever, but the implementation is clear. It is written for one type of aircraft at a time. AND IT IS VERY VERY EXPENSIVE!!!!!
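    For flavour, here is roughly what "look-up table with simple interpolation" means, as a sketch; the breakpoints and output values are made up:

```python
import bisect

BREAKPOINTS = [0.0, 5.0, 10.0, 15.0, 20.0]    # input axis (e.g. a sensed angle)
OUTPUTS     = [0.0, 0.12, 0.31, 0.55, 0.80]   # corresponding commanded values

def lookup(x):
    """Clamp to the table edges, then interpolate linearly between breakpoints."""
    if x <= BREAKPOINTS[0]:
        return OUTPUTS[0]
    if x >= BREAKPOINTS[-1]:
        return OUTPUTS[-1]
    i = bisect.bisect_right(BREAKPOINTS, x)
    x0, x1 = BREAKPOINTS[i - 1], BREAKPOINTS[i]
    y0, y1 = OUTPUTS[i - 1], OUTPUTS[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

print(lookup(7.5))   # 0.215, halfway between the 5.0 and 10.0 entries
```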

    Compare that to the above: state-of-the-art hardware (Pentium FPU, anyone...), consumer OS (enough said), commercial constraints and minimal regulation, dozens of types and models of cars, brakes, engines, steering, etc. etc.

    It's a bit like mainframe vs PC - would you trust your life to a PC?

  19. Truth4u
    FAIL

    ban all cars

    If people want to travel they should be forced to stand shoulder to shoulder on dirty commuter trains, which would ideally spend more time sitting idle on the tracks waiting for signals than actually moving.

    You can do your bit to improve the service by filling the train with some of your own shitty vehicles. If you have a mountain bike big enough to block the doors, your attendance is desperately needed on my morning commute.

    1. Anonymous Coward
      Anonymous Coward

      Re: ban all cars

      Mr. Truth

      > If people want to travel they should be forced to stand shoulder to shoulder on dirty commuter trains

      Would one be correct in guessing that you commute by train? Could I surmise that you do so constrained by your means?

      If sometime in the future you became sufficiently wealthy, can it be assumed that you would give up your present means of transportation for an automobile, which you would drive showing the same kind of consideration that you currently appear to extend to your fellow commuters?

      1. This post has been deleted by its author

      2. Truth4u

        Re: ban all cars

        Clearly you are guilty of blocking the doors with your dirty mountain bike and have decided to project your own lack of consideration onto me because I'm not "considerate" enough to put up with you violating the safety and comfort of about 170 passengers for your own convenience, and presumably, sick amusement.

        Maybe you were hoping the other passengers would congratulate you for being able to afford it?

        I'm going to get on the train with one of those big fucking unicycles since common sense and decency have been abandoned by society now.

        If I could find a nice woman I would knock her up for quadruplets and invent a push chair so cumbersome and wide it would maim and kill pensioners, dogs, just about anything really. Because Narcissism is the new U-man right. To the age of three I'm going to push them around in a contraption so offensive nobody else will be able to use the pavement AT ALL. I hope you like walking 1 mile an hour because if you even try to get by I will start some shit.

        1. Anonymous Coward
          Anonymous Coward

          Re: ban all cars

          > Clearly you are guilty of blocking the doors with your dirty mountain bike

          Not really. I do not do public transport. Besides, mountain bikes are for the trail, not for the train. :-)

          I do trust that, in the interest of the safety of 170 passengers, you have politely explained to Mr. Mountain Biker the risk his bicycle appears to pose. He and the other passengers would no doubt be grateful for that.

          And lighten up mate. What's the point of living life miserably?

  20. Mark 85

    Cars and non-drivers.

    There's a bit of a problem coming up though with autonomous cars... The current thinking by commenters and possibly even the engineers is that one gets in, sits behind the wheel, gets comfortable and assumes a driving position while the car does all the work. Just in case the driver needs to take over... right? Complacency will soon wipe that idea out as more and more drivers decide to text, read a book, play with the kids or spouse, eat, or whatever. Having a steering wheel could become the hazard.

    This is the difference between cars and airplanes. Pilots still have to be in the cockpit and monitoring. They're not supposed to be wandering about and doing other things... yeah, there have been reports, such as the one where the plane flew right past the destination airport.

    My guess is that after an initial break-in period, the controls will have to be removed from the car to allow it to be fully autonomous. Driver skills will deteriorate after a period of non-use, and I'm wary of someone suddenly grabbing control of the car when their ability has deteriorated. Seems to me it has to be all or nothing on autonomy.

  21. John Brown (no body) Silver badge
    Terminator

    autonomous driving license

    I'm confused. Why would I need a licence if I'm in an autonomous car? I'm not the driver. I don't need a licence to go in a bus or taxi. Only the driver needs to pass a test and hold a licence.

    1. phil dude
      Megaphone

      Re: autonomous driving license

      I suspect that initially it will be sold as "driver assist".

      But you raise an excellent point. There are *many* people who are *unable* to drive, that could be provided with a means of transport.

      And no, a bus or train is not the same thing as they represent a static resource.

      In general, disabilities are, perhaps self-evidently, rarely convenient.

      P.

    2. Anonymous Coward
      Anonymous Coward

      Re: autonomous driving license

      This is a good question:

      > I'm confused. Why would I need a licence if I'm in an autonomous car? I'm not the driver

      Yes you are. :-) You are still in charge and you are responsible if you cause a prang. :-)

      The same reason I had to have a licence, even if the autopilot was doing most of the flying.

      Drawing from my flying experience, I would speculate the goal of such a licence is to certify that the driver is familiar with the operation and, perhaps more importantly, limitations of an automatic (not autonomous) driving system. Both so that he can operate it safely and that he can be held liable for his own (albeit hands-off) driving.

    3. JLV

      Re: autonomous driving license

      >Why would I need a licence

      Also, would you need to be sober?

  22. John Brown (no body) Silver badge

    Robo roads.

    I would prefer it if these AI controlled cars were restricted to special roads or lanes. Otherwise they are manually operated. Mixing AI and manual cars on country lanes or in busy towns and cities is going to be dangerous, no matter how good the AI is. At best, city centres modified to help AI vehicles with a 15 or 20mph speed limit might work, with the emphasis on discouraging human driven vehicles from entering that area at all. Minus the AI vehicles, this is how many city centres are moving now anyway with high parking charges, limited parking, congestion charge areas, shared vehicle/pedestrian areas, roads closed to traffic and permanently pedestrianised or at least during working/shopping hours.

    We currently have rail based trams "on" the roads in some cities, special guided bus lanes in others, so there are models for some limited and partially segregated traffic already. Adapting this to motorways should be do-able.

  23. earl grey
    Flame

    No, just NO.

    Not in my lifetime; not with me inside; not with me anywhere close to it. I don't find current automobiles reliable enough; the costs are going through the roof for fancy "stuff" added that doesn't really bring greater functionality to the vehicle; the computers and sensors in modern cars have LOTS of problems and are ALL expensive to have replaced (and hell no, I don't want more).

    You'll get my meatbag driven car when you pry it out of my cold dead hands.

    1. Anonymous Coward
      Anonymous Coward

      Re: No, just NO.

      > You'll get my meatbag driven car when you pry it out of my cold dead hands.

      "Your proposal is acceptable"

      (Men in Black, was it?)

  24. Alan W. Rateliff, II
    Paris Hilton

    Benevolent concern

    A lot of comments to slog through by the time I got here, so I will just knee-jerk mine right away and see afterwards whether anyone else has already posted the same. My read on this is: "I want everyone to be forced to buy a self-driving car in the near future. Preferably mine."

  25. Anonymous Coward
    Anonymous Coward

    Clearly he lives in the US

    I was working over there for a bit and found it immensely irritating that all their roads were straight, flat and had a speed limit on them that was only slightly faster than hopping.

    If he was from Scotland or Geneva or... well, more or less anywhere that has interesting roads then there'd be no push for autonomous cars- plus a better appreciation for how difficult it actually is to drive when you've got things like "corners" to deal with. Have they tried these things on the road to Applecross or the Stelvio pass? How about Rome- can it deal with Italian drivers (answer: no, no it cannot. God Himself couldn't manage to drive through Rome without getting dented)? How about Johannesburg- could it operate the anti-carjacking flamethrower?

    People have also mentioned that it needn't be autonomous all the time, that a person could take over. But if the car's driving itself then the driver will be watching movies or playing Candy Crush or drunk or just generally not paying attention. Even if you were warned that you might need to take the wheel soon, you'd lack the muscle memory to drive safely. So it /cannot/ rely on a human controlling it, ever.

    1. Fink-Nottle

      Re: Clearly he lives in the US

      > How about Johannesburg- could it operate the anti-carjacking flamethrower?

      10,000 people died on South Africa's roads last year; Johannesburg desperately needs self-driving cars. Not only would it make the roads safer, but carjacking would be impossible when a vehicle cannot deviate from a pre-planned route.

      1. Anonymous Coward
        Anonymous Coward

        Re: Clearly he lives in the US

        Instead, they'll move one key to the left and change into carhackers. Now they can RE-plan the pre-planned route. And lock you in the car while they corral you in a place where you can't fight back.

  26. CJ_in_AZ

    Insurance, not laws, will kill the steering wheel

    I've been predicting since the late 90s that self-driving cars would be on the market, for purchase, on the showroom floor, starting sometime between 2010 and 2020. I don't think that there will ever be laws banning manually controlled vehicles, but I expect that the insurance rates will be so high on cars that can be manually driven compared to those that can't that steering wheels will only be a feature of the highest end cars. Yeah, a lot of fear-mongers talk about the rare conditions, but when the steering wheels disappear, drunk driving will be a thing of the past. Although I find driving to sometimes be fun, I'd sure like to be able to imbibe when out with friends, and let the car worry about getting me home. (Since I'm on meds that "amplify" the effects of alcohol, I don't drink if I think I even MIGHT have to drive.)

    I also want to point out that I expect the first autonomous vehicles to be high-end consumer cars, such as the Tesla, or maybe a Rolls. Once they've been proven reliable in those vehicles for a year or two, expect the large companies with large fleets of long-haul trucks (such as [here in the States] UPS and WalMart, to mention a couple of examples) to jump on the technology bandwagon with a vengeance. Think about it -- a truck driver costs $50K or more a year, and is limited in how many hours [s]he can drive in a day. Even an investment of $100K to automate the semi would pay for itself in less than a year, considering that it more than doubles the work the vehicle can do.
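    A back-of-envelope version of that payback sum, using the ballpark figures above rather than any industry data:

```python
driver_cost_per_year = 50_000   # USD: the salary the automation replaces
extra_truck_work = 1.0          # "more than doubles the work": at least one extra driver's worth
automation_cost = 100_000       # one-off cost to automate the semi

annual_benefit = driver_cost_per_year * (1 + extra_truck_work)   # ~$100k per year
payback_years = automation_cost / annual_benefit
print(f"payback in about {payback_years:.1f} year(s)")           # ~1.0 year, less if utilisation is higher
```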

    And all of this doesn't take into account that an autonomous vehicle can be equipped to see in wavelengths outside the human visual range (and so be less affected by fog or smoke), and to see all around the vehicle (and thus not have "blind spots").

    1. lucki bstard

      Re: Insurance, not laws, will kill the steering wheel

      They'll be killed off by the insurance companies - the self-driving cars, that is. They will not cope with Canadian ice and snow, simple as that.

  27. Cynic_999

    No human driver? No, that won't happen

    We have had the technology to fully automate just about every form of transport *except* road vehicles for decades. Ships would be a doddle once clear of port until they meet the pilot vessel at the other end. Trains even easier. Aircraft can take off, fly the whole route and land without the pilot needing to touch most of the controls once, and automating the few tasks that must still be done manually and converting air traffic control from spoken instructions to digital commands would be relatively trivial.

    The fact is that apart from a few short automated rail transport systems such as airport shuttles and vehicles used within factory and farm environments where they pose no danger to the general public, we don't trust computers to be able to cope with all possible situations, and so we demand that a human is "in control" even if that human is only monitoring the computers most of the time (as they are on many aircraft and ships). Cars are unlikely to be given a general go-ahead for driverless operation, because the current state of AI is *way* behind the capabilities of the human brain to recognise and react to sudden unusual events, even if it is better at dealing with everyday situations. If we are not comfortable with driverless freight trains we will certainly not condone driverless lorries.

    There is also the matter of criminal acts. An unmanned container ship or oil tanker would not pose any significant threat that a manned vessel does not pose, but would be a far easier target to hijack or steal from. The possibility of a hacked car being used to kidnap a celebrity or child is also something to bear in mind.

    1. Charles 9

      Re: No human driver? No, that won't happen

      "There is also the matter of criminal acts. An unmanned container ship or oil tanker would not pose any significant threat that a manned vessel does not pose, but would be a far easier target to hijack or steal from. The possibility of a hacked car being used to kidnap a celebrity or child is also something to bear in mind."

      Wouldn't an automated ship be harder to hijack since the controls can be put in a state where no human can take control and the humans locked themselves in a safe room strong enough that attempting to break it or the control system runs the risk of damaging or stopping the ship, making the whole exercise worthless?

      As for the hacked car and celebrity, this still sounds less likely than just grabbing the person off the street or being the rogue driver in a cab/limo.

      1. Cynic_999

        Re: No human driver? No, that won't happen

        "Wouldn't an automated ship be harder to hijack since the controls can be put in a state where no human can take control and the humans locked themselves in a safe room strong enough that attempting to break it or the control system runs the risk of damaging or stopping the ship, making the whole exercise worthless?"

        I am thinking of an unmanned ship. A hijacker or thief would then not run the risk of being overpowered and caught by the crew, which is the main reason that piracy is not more prevalent, and if far enough from other vessels can be assured of many hours or even days without interruption. No need to do anything with the controls, just disable the engines and radio/satellite tracking aerials (which would be just about impossible to prevent), and tow the vessel. Pick a time with extensive cloud cover and satellite images would not be available.

        Or leave the vessel to carry on its merry way and simply plunder its cargo - again the thieves have plenty of time before anyone could reach the location even if the ship automatically raised the alarm.

        Regarding hacking into a car to kidnap etc. - people are far more likely to commit crimes when they can do so by remote control and so are not at any immediate risk. Heck, just look at online interactions and you will see people saying things to other people that they would never say face to face or even on the phone. How many teenagers would be willing to physically break into a military facility? How many would be willing to hack into one of its computers? Not to mention the sort of people who get a kick out of chucking concrete blocks off motorway flyovers - how much safer would they feel doing something similar with a radio jammer or signal spoofer?

        1. Charles 9

          Re: No human driver? No, that won't happen

          The ship and car could fall back to accelerometers which would be much tougher to fool.

          As for the cargo, lock it down tighter?

  28. Dan Paul

    Just like Will Smith in IROBOT

    But you can take my human driven car when you pry it from my cold dead hands.

    I will already be dead before AI driven cars become commonplace (Or the law).

  29. This post has been deleted by its author

  30. JustWondering
    Facepalm

    Better idea

    How about we make sure people can control a vehicle before we give them a license? Where I am, you can get a license without ever having driven on a highway, or over 50 kph for that matter. Ditto driving in rain or snow. Parallel parking seems to get the most attention during training. The test amounts to driving around the block in ideal conditions.

    Couldn't even do that? Then come back next week and try again. Maybe it will be your lucky day and you'll have the "right" examiner.

  31. JLV

    Tech, yes. Legal, ?

    Each year our roads see more deaths than many wars. 1969 in the US? 53,343, says Wikipedia - roughly the US toll of the entire Vietnam war. They are dropping, though: 33,561 in 2012.

    I agree, not good. But our countries have complex legal frameworks to manage it.

    "Normal" traffic deaths are insurance concerns, with mostly predetermined, capped, damages. Consumers are on the hook to pay the premiums. And there is even an accepted way to calculate third party liability and insurance coverage for it.

    Special cases, such as drunken or reckless driving can result in fines and jail sentences for the drivers.

    Design & manufacturing defects end up with the car manufacturers in the dock. Recalls can be extremely expensive and punitive damages huge. And it can go verrrrry wrong. For example, the Prius's accelerator issue - $1.2B for 37 deaths.

    Musk, who is an extremely clever guy, is probably right that we are only a few decades away from safer driving from robots, in aggregate. Will we modify our legal framework to award the same type of damages for wrongful death due to faulty driving, but this time against a rich multinational? I pay about $1400CAD/year to cover my car, BC is costly. If Tesla is driving my car, does that mean they need to put aside money against the risk of my car getting into an accident due to their AI?

    i.e. if the car I am driving swipes a little granny riding her bicycle into the ditch and kills her, I could be in big trouble, but I will likely not be paying out millions of dollars. There is a costly but mandatory economic mechanism for me to cover most of my remaining risk. If the 50x safer Tesla autopilot does it, what's Tesla's exposure? And where will that money come from? Maybe it should be my insurance, but does that mean politicians will let the carmaker off the hook and cap damages, because it's an AI driver issue rather than any other part of the car? Don't think so.

    We do have precedents for this, btw. Air travel has caps on awards against airline companies and I think even aircraft manufacturer caused crashes have not resulted in ultra-massive payouts. The general model is - pay some damages, spend a lot of effort identifying the cause, fix the issue. It works well, air travel is very safe. But it is an optimistic carmaker that thinks they're automatically gonna hop onto that wagon from their current legal exposure.

    I suspect maybe it'll start with less litigious locations than North America. Or with long-haul trucks in segregated lanes.

  32. Anonymous Coward
    Anonymous Coward

    Misunderstanding?

    What is being talked about is not autonomous, but automatic driving.

    Autonomous: acting independently.

    Automatic: working by itself.

    The former implies that there is no driver or he's not in control. In the latter, the driver may delegate some or most driving tasks to the machine, under his supervision and responsibility.

    This is "simply" an evolution from what we can see on the roads today with things like adaptive cruise control, dynamic steering, stability control, adaptive and predictive suspension, etc., etc., etc.

    What happens is that in the process of adding more and more of these technologies, driving becomes a bit of a different animal compared to the old-fashioned (or motor sports) way, which is why the idea of a special licence is a sensible one.

    My current car has many of the above-mentioned technologies, and I can attest that one needs to approach driving in a significantly different way. I can also attest that it makes drivers who are not used to it *more* uncomfortable at first, especially if they haven't been briefed beforehand. However, once familiarised, the driving feels so much safer and more relaxed - I would never go back to a "normal" car for day-to-day use.

    Another interesting observation: I've let a number of different people from 19 to 40 years old drive my car, and I found the younger ones a) got the hang of it a lot quicker, and b) drove a lot more sensibly (easier said than done with 420 HP under the bonnet!) than the 30-40 year olds, both men and women. Of course I don't claim my experience to be representative or significant, but wouldn't it be great if new driving technologies made young drivers safer and more courteous?

  33. Nigel Brown

    This just in

    http://www.delphi.com/delphi-drive

  34. M7S

    Maintenance of the software, and some practical issues for fully autonomous vehicles.

    We're probably all aware of corporates still running Windows XP. Not recommended but it happens.

    I recall, when ABS and all-wheel steering (on consumer cars; it was a fad for a while) were "new", Tiff Needell* expressing concerns on TV about the likelihood (or not) of maintenance being properly arranged by the third or fourth owner of a vehicle.

    Software is improved all the time (until manufacturers "bin" it, as they have done with my otherwise perfectly fine PBX) but considering the number of times when I update a copy of firefox and learn that the various plug-ins I have previously installed are not compatible and need to be removed, reinstalled or tweaked, do we really have confidence that these self driving systems will work and be maintained as intended? Given the lack of adherence to current mechanical standards by drivers, which are easier for traffic police to spot (where they have time) than electronic issues, probably not. I know my local Volvo main dealer has to connect a car to the head office servers for any updates, he doesn't hold a copy himself. That's great for ensuring that cars where the owners can pay the high costs are up to date, but not for those using independent garages.

    Also, for those of you hoping for a totally autonomous driving world: I get to my field in my Land Rover (or similar) and need to drive to the other side of it to pick up the carcass of the sheep that needs to be removed and taken to an authorised disposal point. It's unlikely to be waiting for me on some convenient track. Do I need to swap vehicles, perhaps having to leave a "manual off-roader" in each contiguous area that I farm? No. It's not practical or economic or particularly environmentally friendly (from an equipment efficiency/lifecycle point of view) for a small farmer. Manual/dual-control vehicles would remain a necessity. Likewise for the ambulance driver/fireman/vehicle recovery operator who may need to carefully negotiate past a queue of traffic by using a non-authorised driving surface on a temporary basis.

    All this stuff is a lovely idea and I'd be pleased to see it happen, with all the benefits it might bring, but I think it's probably going to be a lot more complex and expensive than many of the proponents would have us believe. Still, their interest is sales/profit and the hope that the costs of the problems (infrastructure, inconvenience) will fall on someone else. That's only human. Ironic in the circumstances....

    *for the younger reader, he presented Top Gear before our lord JC** ascended to head up the current trinity.

    ** "He's not the Messiah"

  35. Anonymous Coward
    Anonymous Coward

    Patching

    How do vendors patch their s/w? How do they do beta testing? What if I miss a critical upgrade/patch?

    Scary...

  36. JustNiz

    I personally enjoy driving; my hobby is classic cars. I will never accept that I will ever be any safer in a car being driven by a computer than when driving myself. I will also strongly rue the day if/when there is a complete ban on human drivers. Musk, for me, is the devil.

    Don't EVER expect me to buy a car with this "feature", for at least as long as any other option exists. In fact don't even expect me to ride in a car when it is not being driven by a human.

  37. Public Citizen

    I've yet to receive ~any~ answer, much less one approaching "adequate", on how these "self driving" systems are supposed to function safely during extreme inclement weather, such as during a snowstorm.

    The follow up question once that one is answered is: So how does the AI Interface install the tire chains?
