Disengage, disengage! Cali DMV reports show how often human drivers override robot cars

Mercedes' driverless cars need human intervention approximately every 2.08km (1.3 miles), and other makes are totally reliant on frequent switching to manual, according to figures out this month from the California Department of Motor Vehicles. The "disengagement reports" (the times an autonomous car was taken over by the …

Silver badge

These figures relate to California, and a lot of the stuff that I've read about driverless cars relates to tests being conducted in relatively sunny climes. I wonder if any info is available (or even how much testing has been done) on how well the tech copes in rain/sleet/snow/fog/British weather conditions?

28
0
Bronze badge

The British weather could be problematic, but they must have an auto shutdown that engages if my Auntie Maud is detected closer than 10 miles.

12
0
Silver badge

I'm wondering more how these cars would fare in, say, New York City or even Los Angeles. My suspicion is that around Silly Valley, the traffic is pretty decent and well-mannered compared to those two places.

Disclaimer: I've driven in both those cities but haven't had the pleasure yet of visiting Silly Valley.

3
0
Silver badge

IMHO the ultimate test would be winter in Moscow. Good reactions won't be enough; you have to have some form of sixth sense to cope there.

3
0
Bronze badge

The ultimate test is possibly winter driving in Boston, which combines abundant snow and ice, an improbable road network, gigantic potholes, an attitude that traffic laws are advisory rather than mandatory, and drivers operating according to the prime commandment that the entity with the least to lose from a collision has the right of way.

0
0
Anonymous Coward

Right of way to the stupidest

"to the prime commandment that the entity with the least to lose from a collision has the right of way"

Sounds like North London food delivery scooter muppets and Uber drivers have trained there.

0
0
Silver badge

I still think that maybe we don't have the full information.

The article mentions the difference between manual and automatic switching of control back to the driver. How much of this is because of a different testing strategy, and how much is because of failure of the control system?

Seems like Waymo is doing all right.

I half suspect that once one company has cracked it, it'll become a de facto (if not enforced) standard. The marketing seems to write itself.

I also quite firmly think that anything beyond what we pretty much have already with Tesla "Autopilot", but falling short of fully autonomous, is doomed to fail as a product - purely because easily distracted bags of meat won't be ready to take over the controls in an emergency.

26
0
TRT
Silver badge

I think the figures are fairly impressive. It would be interesting to see how they correlate with the actual capability of the auto-drive itself. I mean, once every 5 miles on a motorway for a semi-autonomous lane-changer point-and-shoot system is good, but then less impressive than, say, once every 0.5 miles by a full A-to-B sat-nav-style autopilot operating in a dense urban-jungle scenario with traffic lights every 50 yards, Lycra-clad couri-kazi pilots cycling zigzag between the gridlock, phombies walking across your path in full-on Oxford Street style, and coping with a street scene having more visual clutter than the painted record of a fight between Jackson Pollock and Jean-Michel Basquiat.

43
0
Silver badge

up-vote for the Jack the Dripper imagery

8
0
Silver badge

Depending on the levels of transparency and disclosure, the proposed 200 mile trip around the UK ought to be interesting in terms of manual interventions and/or control systems "giving up".

You are right to point out that a manual intervention is not the same as the control system flinging its hands in the air and screaming for the meatsack to take over. Intervention implies the control system was about to do something bad and had to be stopped from doing it as opposed to "realising" it might not be able to handle an upcoming situation and requesting help. Both types need to be counted and assessed.

4
0
Silver badge

You are right to point out that a manual intervention is not the same as the control system flinging its hands in the air and screaming for the meatsack to take over. Intervention implies the control system was about to do something bad and had to be stopped from doing it as opposed to "realising" it might not be able to handle an upcoming situation and requesting help.

Both your cases point to the fact that when the going gets tough, the AI can't cope. As long as that remains the case, perhaps you should reverse your vocabulary and refer to "bag of spanners" and "capable human driver".

2
0
Silver badge
WTF?

Override Idiotic Wetware Drivers Option Please

Where I live it would be nice if some of the wetware drivers had an automatic switchover to something else. One road has a 30mph temporary limit and severe lane restrictions due to construction work just round a corner. It would be great if the drivers were overridden into obeying the No Overtaking sign (it is a single lane at this point) and the 30mph limit. My car has no problem automatically recognising both the limit and the overtaking-restriction signs and showing them on the dashboard display.

11
2
AS1

Re: Override Idiotic Wetware Drivers Option Please

At a very minimum, automatic speed limiters would improve compliance and reduce driver stress (especially if the proposal for prosecution at +1 mph is taken forward). Given the turnover of cars, within five years traffic would be self-regulating as regards speed.

It would be a first step towards full automation, along with lane following and dynamic cruise controls that are already available.

1
15
Silver badge

Re: Override Idiotic Wetware Drivers Option Please

At a very minimum, automatic speed limiters would improve compliance and reduce driver stress

Personally, I don't want higher limits in town; for most urban streets 30 is fine. However, higher limits would be inevitable with enforced speed limiters. The only reason the roads move as well as they do is that most people ignore most limits most of the time. Before jumping to dispute this, try driving on any national-speed-limit dual carriageway at any time of day and sticking rigidly to the limit. You'll notice you're constantly being overtaken by all manner of road users.

You'd also have to automate an "overtake mode" to eliminate the all-too-frequent situation where a vehicle restricted to a lower speed limit drags along 30+ cars in its wake because those at the front lack the skills to overtake safely, or the knowledge to pull back and increase stopping distances so that the rolling roadblock may be passed in sections rather than in one hit. I imagine that will scare the hell out of the first few passengers to experience it, especially when conditions change and the car has to retreat back into the stack after beginning an overtake. For this reason I imagine most bikers would resist having a limiter too.

13
0
TRT
Silver badge

Re: Override Idiotic Wetware Drivers Option Please

Automatic Speed Limiters should be a new class of vehicle on the driving licence, and anyone who loses their licence for speeding should, once their ban is spent, ONLY be allowed to make use of that class.

Discuss this, its extension to full self-driving vehicles, and the implications for driving licences etc.

2
3
Silver badge

Re: Override Idiotic Wetware Drivers Option Please

"At a very minimum, automatic speed limiters would improve compliance and reduce driver stress"

And cause accidents every time a driver needed to accelerate out of a situation.

1
2
TRT
Silver badge

Re: Override Idiotic Wetware Drivers Option Please

Hm. At least two people on here have accumulated enough points through speeding to have lost their licences, I see.

3
0

Failed Logic

The real issue with AV proposals is that full AV is still a distant dream, and one that many do not even want.

All the other partial solutions are based on the failed logic of having a human oversee a computer. Humans are incredibly poor at these kinds of tasks, and such systems were long ago ruled out as unsafe. Early aircraft autopilots are the clearest example of such failures, with pilots coming to trust the autopilot over looking out the window, only to see the ground approaching unexpectedly!

Until full AV systems are available that do not require human intervention, these systems should only be driving aids, for example warning or intervening as necessary to support the driver. Computer support of the driver has great potential, but a human MUST ultimately be in control.

16
0
Silver badge

Re: Failed Logic

Unfortunately there is a problem with the use of computers (AV or whatever the latest buzzword is) to control cars - what happens when someone installs a program during one of the routine service calls that turns the car into a killing machine when certain conditions are met?

There is an SF story I read many years ago based on that idea, with one of the triggering factors being a full moon.

0
0
Anonymous Coward

One intervention needed per 2km....

So slightly safer than the average driver in the UK then?

13
3

Re: One intervention needed per 2km....

You're kidding? We would all be dead if that were true. Without intervention, the self-driving cars would suffer catastrophic accidents.

6
1
Silver badge
Boffin

Re: One intervention needed per 2km....

We would all be dead if that were true.

Errr, no. For every driver killed, traffic density decreases, even if infinitesimally at first. With every such accident the traffic density, and with it the accident rate, will go down, the accident rate asymptotically approaching the background level of 'immovable solid object fails to yield to vehicle'.

7
0

Fully Autonomous Vehicles Will Be Science Fiction for the Foreseeable Future

Fully autonomous vehicles are way beyond what current AI technologies can handle. A major breakthrough in AGI must happen before we realise this dream. One thing is certain: it will not happen with Deep Learning. A deep neural net is really an expert system and, as such, suffers from the same fatal flaw: it fails catastrophically every time it encounters a situation for which it has not been trained. This is unsuitable for real-world applications where safety is a must.

To all big-time investors: do not sink money into any project using deep learning to achieve full driving autonomy. It's a waste of time and money. Invest in AGI research instead.

18
1
Anonymous Coward

I can't see full AV either

First problem: it assumes maps are entirely reliable. Second problem: it assumes meatbags will not accidentally, or accidentally-on-purpose, cause accidents.

Third problem is real life: potholes, collapses in the road/neighbouring trees/cliff sides, etc.

Fourth problem is destination parking. Do I expect to have my driveway (and any friends' driveways) mapped? What about the field I park in for the summer fete? Or the loading bay of the factory?

On main roads, yes, nice idea, but the idea of thumbing down a JohnnyCab still appears a bit far in the mists of future time.

Overall, though, measuring the number of driver interventions as an indicator of success/safety is also entirely unhelpful: local road conditions, weather, poor junction designs and indoor car parks in GPS shade can all be factors, as there are no consistent test conditions (for which I am grateful, as consistent conditions would reproduce the fuel-economy cheating effects).

Chasing the impossible dream... there probably should be a song about that.

14
0

Re: I can't see full AV either

Firstly, does it assume maps are entirely reliable? I don't think that's true at all: they read road signs and markings (not perfectly at present, but they're getting there).

https://www.citylab.com/transportation/2017/02/how-to-teach-a-car-a-traffic-sign/516030/

The second problem doesn't seem like a driverless-car problem to me. If a human decides they want to deliberately cause an accident by driving on to my side of the road immediately before we are about to pass in opposite directions, there's not a damn thing I can do about it, and there's nothing a driverless car can do about it either. I don't see how AVs make that worse.

Third problem is related to the first. That's only a problem if you assume that AVs navigate solely by GPS and maps, but that's not true: they navigate by GPS, maps, cameras and (maybe) lidar.

The fourth problem is a fairly minor UI issue: how do I indicate where I want the car to go if there is no map of the area? Well, either the car takes me to the nearest mapped road and then puts me into a semi-manual mode for the last 10m, where I direct the car with a joystick while it continues to handle all the collision avoidance and actual control of the car; or I indicate on a satellite map where I want it to go; or something similar. Even if the answer is that no, you can't park anywhere that's not on the map, that's still not fatal for AVs. There just needs to be some way for you to get your own drive added to the map, and you'll have to accept that otherwise you're going to have to park in car parks or at the roadside.

I can see some arguments that getting the last 20% of the way to AVs is going to need much, much more work than the first 80%, but I can't see anything which is a complete showstopper.

2
4
Silver badge

Re: I can't see full AV either

"they read road signs and markings"

Thanks to "austerity", road markings seem to be getting less and less visible these days. Re-painting costs money and councils don't seem to have any. I'm not sure I want to be in an AV on a dark rainy night. Not to mention when snow covers the markings up. Likewise, more and more road signs seem to be disappearing behind foliage. More "austerity" cut-backs.

8
0
Silver badge

Re: I'm not sure I want to be in an AV on a dark rainy night.

Darkness will not be an issue, at least not if the car is using Lidar - it doesn't rely on ambient light.

Rain will be a problem, but only insofar as it is to humans too - it's a physical obstruction.

0
0
Silver badge

And for comparison...

How often do passengers have to shout "Oi!" at a driver who needs to snap back into focus?

I know I'm not the only person to have gone 'fully autonomous' with no recollection at all of parts of the journey I have just undertaken. Or had to take 'late action' because I hadn't fully absorbed the situation earlier.

Why autonomous control is disengaged is important. If it's just because the system or meat sack wasn't confident about what was coming up, and it was an informed decision, that seems fair enough. If it's because disaster is imminent, that's more worrisome.

16
0

Re: And for comparison...

A common experience. In 1947, Robert M. Coates wrote a science fiction story about it in the New Yorker, called "The Hour After Westerly".

https://www.newyorker.com/magazine/1947/11/01/the-hour-after-westerly

When automated driverless cars can write imaginative short stories, THEN we will need to worry.

6
0

Optimistic

Read through these reports. In particular, Waymo and Cruise. They are logging the most miles and their trend is clear. The latest reported months have a lot more miles and a lot fewer disengagements.

Cruise notes why they drive in Frisco instead of other places: It's a harder environment than suburbia or highways, so they learn faster.

Remember, these guys *want* disengagements. Each disengagement can be gone over like an airliner crash. Replayed millions of times, varying the parameters. When you run out of disengagements, you have a problem learning, don't you?

14
0

Re: Optimistic

This could be one of the greatest relative strengths of AVs. Once they're out there you don't necessarily have to wait for a crash or disengagement to learn. Here's an interesting blog post from Tesla on this subject. It describes how they use fleetwide learning to whitelist particular radar returns in particular areas to avoid false positives.

https://www.tesla.com/en_GB/blog/upgrading-autopilot-seeing-world-radar

As well as recognising specific items, presumably it could be used more generally, even in cases where there was no collision and no false positive causing unnecessary emergency braking. For example: my car sees something which it assumes is a far-away truck; as we get closer, it realises that it is a nearby van. It can send to Tesla "image A was incorrectly classified as a truck", and if Tesla get lots of those they can tweak the image classifier.

Similarly on the control side it can say: I wanted to change course from A to B, so I applied control input X. I actually ended up on course C and therefore applied correction Y to end up on course B. As far as the passenger is concerned nothing bad happened, but my AV knows that its internal model of the car's dynamics must be wrong. It can send that to Tesla, who can either say "get your suspension / tracking / tyre pressures checked" or, if they see it from lots of cars, change the control model.
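As a toy sketch of that kind of fleet feedback (the report format and all the names here are hypothetical, nothing to do with Tesla's actual telemetry API):

```python
# Toy sketch: cars report cases where a later, closer observation
# contradicts an earlier classification, and the mothership aggregates
# them to find the classifier's weak spots.
from collections import Counter
from dataclasses import dataclass

@dataclass
class MisclassificationReport:
    image_id: str
    predicted: str  # what the car first thought it saw
    observed: str   # what it turned out to be at close range

def confusion_counts(reports: list[MisclassificationReport]) -> Counter:
    """Count (predicted, observed) pairs across the fleet; frequent
    pairs are candidates for retraining the image classifier."""
    return Counter((r.predicted, r.observed) for r in reports)

fleet_reports = [
    MisclassificationReport("img-001", "truck", "van"),
    MisclassificationReport("img-002", "truck", "van"),
    MisclassificationReport("img-003", "bridge", "overhead sign"),
]
print(confusion_counts(fleet_reports).most_common(1))
# -> [(('truck', 'van'), 2)]
```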

2
2

Re: Optimistic

From the Tesla report:

Additionally, because Tesla is the only participant in the program that has a fleet of hundreds of thousands of customer-owned vehicles that test autonomous technology in “shadow-mode” during their normal operation ..., Tesla is able to use billions of miles of real-world driving data to develop its autonomous technology. In “shadow mode,” features run in the background without actuating vehicle controls in order to provide data on how the features would perform in real world and real time conditions. This data allows Tesla to safely compare self-driving features not only to our existing Autopilot advanced driver assistance system, but also to how drivers actually drive in a wide variety of road conditions and situations.

Put another way, Tesla is Big-Brothering their cars and can conduct a Delphi Poll on what a good driver does in very, very many circumstances.

4
0
Silver badge

To take the devil's side

We are at the dawn of autonomous vehicles. I do believe that we need all the data now in order to be able to properly program the damn things for later.

So yeah, Big Brother it may be, but it should mean we need the Red Cross less later on.

1
0
Silver badge

Re: Optimistic

"Tesla is Big-Brothering their cars and can conduct a Delphi Poll on what a good driver does in very, very many circumstances."

What it can't do is record why the driver did it if the Tesla system didn't register the problem: e.g. the driver recognises from the behaviour of a pedestrian that they're about to side-step off the kerb, and brakes in anticipation. The system will record the pre-emptive braking followed by the entry of the pedestrian onto the roadway, but the actual movement of the pedestrian will only be recorded as a random action at the time it actually happened. The driver, being a sentient being like the pedestrian, can see that the pedestrian is unsteady, or is being confronted by another aggressive pedestrian, or whatever, and has sufficient understanding to realise what they, the driver, would do in that situation.

The critical word in the previous sentence is "understanding". That's the difference between man and machine.

2
0

Re: Optimistic

True, AVs are unlikely to manage that degree of reading pedestrians' intentions for an extremely long time. But how big a deal is that? If you're moving through at 30mph, an average human probably isn't going to spot that amount of detail either.

For a human driver to have the time to see evolving dynamics between pedestrians, and the spare capacity to be watching them in the first place, you've got to be doing, what, 10mph? Less? At that speed the stopping distance of an AV is just a few feet. So while the human driver might be better in the sense that they can spot it earlier and brake sooner, they're probably not better in the sense of actually preventing any accidents.
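A rough check of that "few feet" claim, using the basic kinematics v²/2a and assuming a hard emergency deceleration of about 8 m/s² (the deceleration figure is my assumption, not from the post, and reaction time is ignored):

```python
# Back-of-the-envelope stopping distance: d = v^2 / (2a),
# with an assumed emergency deceleration of ~8 m/s^2 (~0.8 g).
def stopping_distance_m(speed_mph: float, decel_ms2: float = 8.0) -> float:
    v = speed_mph * 1609.344 / 3600  # mph -> m/s
    return v ** 2 / (2 * decel_ms2)

print(f"10 mph: {stopping_distance_m(10):.1f} m")  # ~1.2 m, i.e. a few feet
print(f"30 mph: {stopping_distance_m(30):.1f} m")  # ~11.2 m
```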

0
0

So because Mercedes disengages every mile, Waymo's cars aren't ready?

That seems to be the gist of the article.

10
0
Childcatcher

Paradoxically, yes

Check this article about Air France flight 447 that crashed into the Atlantic:

http://www.slate.com/blogs/the_eye/2015/06/25/air_france_flight_447_and_the_safety_paradox_of_airline_automation_on_99.html

The gist is that because so much regular airtime is on autopilot, when the system disengages the pilots are less able to assess and take over the plane, because they are out of the loop and out of practice.

So with a car that disengages every mile or two, the driver will at least have retained most driving skills, and in fact will be pretty much waiting for the disengagement.

On the other hand, if it disengages every 5,500 miles, the chances that the driver will be able to react (and is even awake to do so! nap time, right?) would be pretty slim.

After reading the Slate article, I came to the conclusion that for self-driving cars it is a case of all or very little: either 0 disengagements, or so many that the driver is still pretty much engaged. Of course, I could just be pessimistic about this.

14
0

Re: Paradoxically, yes

Ignoring the fact that that has nothing to do with my point, I still disagree with the 0-disengagements policy.

People crash as well. So long as the rate of disengagements is less than the rate of crashes of the driver, the self-driving car is still safer.

1
1
Silver badge

Re: Paradoxically, yes

It's not so much that.

In a manually driven car, you know what your car is doing at any moment in time because you told it to do it, and you know why you told it to do it.

In a self-driving car, if you spot a problem, you have to figure out what the car is currently doing before you can take over, and that takes about 25 seconds. If you are driving at the speed limit on a British motorway, your car will travel about 780 metres in that time, which is way outside your current field of vision.
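For what it's worth, the arithmetic at the UK motorway limit of 70 mph, taking the 25-second takeover figure at face value (the 25 s is the commenter's assumption):

```python
# Distance covered while regaining situational awareness:
# speed (converted to m/s) times the assumed takeover time.
def takeover_distance_m(speed_mph: float, takeover_s: float) -> float:
    return speed_mph * 1609.344 / 3600 * takeover_s  # mph -> m/s, then * s

print(f"{takeover_distance_m(70, 25):.0f} m")  # ~782 m at 70 mph
```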

4
0
Silver badge

Re: Paradoxically, yes

"So long as the rate of disengagements is less than the rate of crashes of the driver, the self driving car is still safer."

That means we have a very long time to go before the self-driving car reaches the standard of an inexperienced driver.

3
0
TRT
Silver badge

Re: Paradoxically, yes

20 years on from the advent of the fully automatic self-driving car, how will the next generation of learner drivers acquire enough experience to be able to hold a driving licence?

0
0
Silver badge

Re: Paradoxically, yes

Why would they need one, if their car is fully self driving?

0
0
Silver badge

Disparity in distance

I wonder if the disparity in distance between interventions might be in part explainable by the time of day the vehicles operate (peak / off peak), and the type of roads and directions in which they're driven.

Head into London on the M1 at 8am on a Monday morning and you'll need more interventions per mile than heading along the local B-road at 3am on a Sunday morning.

4
0

Re: Disparity in distance

From experience in a Tesla Model S, this is actually the reverse of where we are now. I'd trust the car completely driving along the M1 in heavy traffic (I drove from Bristol to Sheffield on motorways around Christmas and only intervened to change motorway or pull into the supercharger). B-roads just don't have enough information for the car to decide what to do (no lane markings, no edge markings, corners you can't see around, huge puddles, cattle, horses, unexpected stationary farm vehicles and so on).

1
0
Silver badge
Headmaster

Correction

"While the above data shows that driverless tech is indeed still a while away "

ITYM miles away.

1
1

GPS Nightmares Made Worse?

My 2017 vehicle still has maps from 2009, and there are no updates listed. When updated maps are available they will cost me US$250. When I drive home my vehicle insists I'm off-roading at 120km/h, because the highway was realigned two years ago. Unless mapping updates become instant and automatic, I can foresee our autonomous future ending in tears...

18
0
Silver badge

Re: GPS Nightmares Made Worse?

Sounds like you bought from a manufacturer who cheaped out on a proprietary satnav instead of licensing one that gets updates.

2
0
TRT
Silver badge

Re: GPS Nightmares Made Worse?

Mine is in a similar state, and it wasn't that the manufacturer cheaped out, but that the system must accept inputs of wheel rotations, inertial sensors, steering angle etc. to supplement GPS. At the time of manufacture (2006) there wasn't an alternative other than to design it yourself. I'm not sure even today that there's a commercial system that accepts and balances alternative positional information.
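Something like this, presumably: a minimal, purely illustrative sketch of that kind of balancing along one axis (a real satnav would run a Kalman filter over a full vehicle state, but the blending idea is the same; the function and weight are my own invention):

```python
# Blend dead reckoning (wheel rotations / inertial sensors) with GPS.
from typing import Optional

def fuse_position(estimate_m: float, wheel_delta_m: float,
                  gps_fix_m: Optional[float],
                  gps_weight: float = 0.05) -> float:
    """Advance the estimate by dead reckoning, then nudge it toward the
    GPS fix when one is available (complementary-filter style)."""
    predicted = estimate_m + wheel_delta_m  # dead reckoning step
    if gps_fix_m is None:                   # e.g. tunnel or GPS shade
        return predicted
    return (1 - gps_weight) * predicted + gps_weight * gps_fix_m

# e.g. a 2 m wheel-odometry step with a GPS fix available:
print(fuse_position(100.0, 2.0, 103.5))  # 102.075
```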

1
0
Silver badge

Re: GPS Nightmares Made Worse?

"Sounds like you bought from a manufacturer who cheaped out on a proprietary satnav instead of licensing one that gets updates."

And if your car is useless without the frequent updates, the vendor of those updates will have you by the balls every time your subscription is due.

0
0
Silver badge
Paris Hilton

While the above data shows that driverless tech is indeed still a while away from being fully reliable, none of the autonomous vehicle makers that we can think of are claiming to be so, and certainly don't intend to be any time before 2020-ish.

I thought that Waymo was going to start a fully-automated service in Phoenix this year?

1
0
