'Autopilot' Tesla crashed into our parked patrol car, say SoCal cops

Police in Laguna Beach, California, have said a Tesla car – which the driver claimed had been operating in "autopilot" mode – has crashed into one of the force's stationary cop cars. Photos of the incident were tweeted by Laguna Beach Police Department Public Information Sergeant Jim Cota on Tuesday (this morning UK time). The …


Re: Drugs

There's nothing "uniquely American" about the concept of pedestrians not simply walking across the street wherever they feel like it; try that shit in Eastern Europe, get fined all the way to oblivion if there's a cop around - as you should be*. There's a good reason the international "pedestrian crossing" traffic sign exists. Sane places use them.

* of course we still do it when the road is effectively empty far to the left and right - but everyone knows it's unmistakably our ass on the line if we failed to spot a cop** and our fault if anything goes wrong.


Re: Drugs

"'Jaywalking' is a uniquely American concept"

No it's a concept used in many civilised countries to protect humans from machines.

Heck, I'm sticking up for the US. I'll have to watch myself or next it'll be Microsoft.


Re: Drugs

"try that shit in Eastern Europe, get fined all the way to oblivion if there's a cop around"

Next time that happens to you: Watch where that "fine" actually goes.

Hint: Not into the authorities' coffers.

Also: watch who it gets enforced against

Hint: Not the locals.

' The forgotten history of how automakers invented the crime of "jaywalking" '

https://www.vox.com/2015/1/15/7551873/jaywalking-history

https://en.wikipedia.org/wiki/Jaywalking


Re: Drugs

The problem with the concept of jaywalking is that it seems to criminalise harmlessly crossing the road when it's safe to do so.

If you are wandering in the road when there is traffic about, you are a danger to yourself and others, and there may be a case for prosecution. If, on the other hand, you cross the road at a sensible time and perfectly safely, it's a waste of time and money to treat that as wrongdoing, which I have heard stories of happening many times.


Think of it

as evolution in action.


Self driving cars?

Bollocks!

Anonymous Coward

Re: Self driving cars?

A Wild Bull would also be a danger if left unattended.

Anonymous Coward

What's the point of having a dog and then barking yourself? Autopilot seems about as much use as an ashtray on a motorbike.


The fundamental problem of self-driving cars is deep learning

Deep learning sucks. Unlike the brain, a deep neural net can only see things it has been trained to detect. IOW, don't wear a Chewbacca costume in front of an autonomous car. Just saying.


Re: The fundamental problem of self-driving cars is deep learning

That surely depends on whether the designer was a Star Wars fan or hater?


Re: The fundamental problem of self-driving cars is deep learning

"Unlike the brain, a deep neural net can only see things it has been trained to detect."

Nonsense. A neural net can only recognise things it has been trained to detect. Just like you. Can you distinguish a sonnet by Marlowe from one by Shakespeare? But since the detection of objects by autonomous vehicles has nothing to do with neural nets (and everything to do with things like LIDAR) Star Wars fans can safely roam the streets.
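The closed-set point being argued here can be made concrete: a softmax classifier must spread all of its probability mass over the labels it was trained on, so an unfamiliar input (the Chewbacca costume) still comes back as one of the known classes. A toy sketch in Python; the labels and logits are invented for illustration, not from any real perception stack:

```python
import math

# Hypothetical closed-set classifier: three trained labels, nothing else.
CLASSES = ["pedestrian", "car", "cyclist"]

def softmax(logits):
    """Normalise raw scores into probabilities that sum to 1."""
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Return the most probable trained label and its probability."""
    probs = softmax(logits)
    best = max(range(len(CLASSES)), key=lambda i: probs[i])
    return CLASSES[best], probs[best]

# An out-of-distribution input yields flat, low logits, yet the classifier
# still answers with a known label rather than "I don't know".
label, confidence = classify([0.20, 0.10, 0.15])
```

However odd the input, `label` is always one of the three trained classes; without an explicit "unknown" output, the model literally cannot say anything else.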


IMO, the important metrics to consider are:

Accidents per whatever unit of distance compared to human drivers.

Fatalities per whatever unit of distance compared to human drivers.

That's a start of looking at this objectively rather than with an impossible to meet standard of perfection. Ample evidence exists to demonstrate humans are far from perfect as drivers.
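The two metrics above are simple enough to sketch; a minimal Python illustration, where every event count and mileage figure is a placeholder rather than real Tesla or US fleet data:

```python
# Hedged sketch: accidents/fatalities per 100 million miles, a common unit
# for US road-safety statistics. All inputs here are invented placeholders.
def rate_per_100m_miles(events: int, miles: float) -> float:
    return events / miles * 100_000_000

human_fatality_rate = rate_per_100m_miles(events=34_000, miles=3.2e12)
assisted_fatality_rate = rate_per_100m_miles(events=2, miles=1.5e8)

# The comparison the comment asks for is then just a ratio:
relative_risk = assisted_fatality_rate / human_fatality_rate
```

Once both rates are in the same unit, "better or worse than a human driver" is a single ratio rather than an argument about anecdotes.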

Anonymous Coward

I'd add to that line of thinking:

It's fair to compare accidents/fatalities involved per mile, but you should also consider the number of times that the safety features engaged successfully and prevented or reduced the severity of an accident.

People are looking at this from a very negative point of view, only counting the crashes and failures. The big picture looks very different. Musk's recent rants aren't helping people focus on that, but it's important, as it impacts every player, not just Tesla. The basic technology is never going to get to 100% safety. It shouldn't be expected to; in these early generations it just needs to be close to a human driver and work in a complementary fashion with one.

Auto accidents are one of the leading causes of death and serious injury, and the number of self-drive incidents is still tiny, even relative to the number of miles driven. So I wish people would stop acting like this was in any way an issue worthy of panic. I'd have bigger concerns if this was being used for unattended vehicles, but not with a driver whose hands are on the wheel.


Re: I'd add to that line of thinking:

It's fair to compare accidents/fatalities involved per mile, but you should also consider the number of times that the safety features engaged successfully and prevented or reduced the severity of an accident.

No, you shouldn't. The reason being, those numbers are already included in the headline "accidents/fatalities per million miles", or whatever number you're looking at.

The trouble is that if you get a number for "times safety feature engaged", you have nothing to compare that number with. Human drivers don't, typically, make a systematic count of every time they have to brake to avoid crashing into the car in front - and if they did, the answer would be so subjective as to be meaningless anyway. So that number can only, at best, be a distraction.

We need numbers that can actually be measured with a reasonable degree of certainty and consistency. Number of accidents, and especially number of fatalities, are the only metrics that come close to meeting that requirement.


Re: I'd add to that line of thinking:

"It shouldn't be expected to, and in these early generations it just needs to be be close to a human driver, and work in a complementary fashion with one."

Except you need to take human nature into account. Although it's still devastating for the family of anyone killed on the roads, when a human is driving it's either an unavoidable accident or there's someone to blame and, hopefully, punish. When it's a "machine" that kills someone, how do we accept an "unavoidable accident" caused by the "infallible computer", or choose who gets blamed and punished?


The real problem is likely that the 90% (or higher) of the time that AutoPilot does work disarms people's ability to handle the remaining bits.

This is a serious, architectural problem. If 90% (or even 99.5%) success is accompanied by 10% (or 0.5%) catastrophe, the technology is fundamentally unsafe.
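To put rough numbers on why a high per-situation success rate is not enough: success has to compound across every hazard the system meets, so even 99.5% per event decays quickly. An arithmetic sketch with illustrative figures only:

```python
# Hedged arithmetic: probability of getting through N hazard situations
# with no failure, given an (invented) per-situation success rate.
p_success = 0.995        # 99.5% success per hazard situation (illustrative)
situations = 1_000       # hazards encountered over some period (illustrative)

p_no_failure = p_success ** situations   # chance of a completely clean record
```

With these placeholder numbers, `p_no_failure` is well under 1%: near-certain failure somewhere along the way.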


Fully agree, and people are sick of me banging on about this. Schools now bubble-wrap everything, so children do not learn to evaluate risks, which means they become adults and enter an environment where simple everyday events are trying to kill them (or have them kill themselves).

The same thing is happening in healthcare. People are so germophobic that children no longer build a proper immune system, and by overusing anti-bacterials we are weeding out all the weak bugs and leaving just the superbugs.


Wise choice

"...walk away from the crash uninjured and refused an offer of medical treatment."

Life pro tip: always refuse medical treatment when you are uninjured and healthy.


Not fit for purpose

This is at least the third time a Tesla on "autopilot" hit something in its path without even slowing. Even the lesser "super cruise" modes available on many luxury cars that provide emergency stop do better.

I wonder how many cases of this have to happen before the government requires Tesla to disable the autopilot feature in the US? Obviously the warnings they claim they are putting out to let people know it isn't what most people think of when they hear "autopilot" and the warnings they claim they are doing to try to get people to stay in control of the car when using it aren't working.

Will a Tesla on autopilot have to kill someone in another car or a pedestrian before they take action? It is ridiculous that Tesla is allowed to beta test an alpha quality product on public roads.


Re: Not fit for purpose

Well, it's not as if a car not equipped with "autopilot" or some other driver assistance system has ever hit something in its path without slowing down before, now, is it?


Re: Not fit for purpose

And I bet there are other car manufacturers out there who do have their own share of crashes with driver assistance systems turned on.

You don't hear about them because they're not Tesla. They're not juicy enough. They didn't advertise their – sometimes better – systems as aggressively. And they don't bundle all their assistance systems into one big "autopilot", in advertising, in sales or in use. (In fact, while reading the sales brochures of the big German manufacturers, I have a hard time decoding what exactly each of the systems does, or how it compares to Tesla's system, which I understand quite well. They drown the explanations in so much technobabble that it puts Star Trek dialogue to shame.)

That makes them less attractive to report. If the Tesla Autopilot fails, it is a story. If the VW Collision Avoidance System fails, it's a non-story. Because the former is seen by the PRESS (not the majority of Tesla drivers or technical minded people) as a revolutionary type of self-driving future, while the latter comes across to them as just another car part like brakes. Are failing brakes a story? There you go.


Re: Not fit for purpose

Driver aids such as emergency braking work fine in most cases. The key difference is that no one expects to take their hands off the wheel and stop paying attention just because they have these driver aids to help keep them safe.

Autonomous vehicle makers are pushing exactly that, and that will lead to deaths that could have been avoided.


Re: Not fit for purpose

Fair point. The AAA did some testing on other cars and discovered they're far from perfect either, even in "avoidable" accident scenarios.

But Tesla have brought the bad press on themselves by calling it "Autopilot" and lulling their users into a false sense of security.


Anonymous Coward

@elgarak1 - Re: Not fit for purpose

Do you really believe in collision avoidance systems? I don't.


@elgarak1

Or should I call you "Tesla apologist"?

Please tell me which cars have "driver assistance systems" that allow drivers to keep their hands off the wheel and treat it as a self-driving car, with minimal warning (and apparently you can buy third-party devices intended to fool the steering wheel into thinking you are touching it, so you can drive without the annoying warnings).

No one else is stupid enough to call their system "autopilot", knowing full well that in most people's minds the word means it can drive itself. And it does actually try; it just does a really shitty job of it, and will keep getting into accidents until it kills an innocent bystander and Tesla is sued for $50 million.

They knew exactly what calling it autopilot would connote, and they lied to owners that the cars would be upgradeable to level 5 automation when they aren't shipping with the hardware necessary to implement that. Heck, they may be short of the hardware required even to detect a vehicle right in front of them, given that they keep ramming into stopped vehicles in their path without even slowing.


Re: Not fit for purpose

Are failing brakes a story?

If brakes on a particular make and model are failing in numbers greater than "just a few, and negligible compared to the total number on the road", then it may well become a story. Especially if those brakes are of a new design with several improved features.

Anonymous Coward

That section of laguna canyon has "clever" road markings

Laguna Canyon isn't a side road. It is a numbered highway, and parts of it are divided with a center barrier. Parts of it also have an oddly marked "suicide lane" that is a great example of the city planners getting creative with the road markings. Hint: human drivers get confused by non-standard road markings, not just computers. Stop being clever and use the same markings as the rest of the state.

Looking at the photos, it appears the crash was in a two-lane section, and the police car was a white SUV with blue lettering, not a full black-and-white. It also looks like it was parked by the side of the road, not in the middle of the street (which is a thing they like to do sometimes). It will be interesting to see whether the self-drive had kicked out and the driver didn't notice, or the car just went wide in the turn and ploughed into the corner of the police car.


Re: That section of laguna canyon has "clever" road markings

"Stop being clever and use the same markings that the rest of the state uses."

Would that not be undue governmental interference in the rights of the local government that did it? Standardised, top-down regulation in the USA seems to be seen as some sort of commie pinko plot by certain outspoken people.


> the ~40,000 people who died in US auto accidents alone in past year get almost no coverage

He's right. This has always pissed me off personally.

12 people get shot in a school and it's the end of the world (which it is), but 3,000+ die on the roads EACH MONTH and it's completely ignored.

And the penalties are nil. An old geezer killed someone on a bicycle and got a whopping $80 fine; only when the local community revolted did she get three months in jail. Three months, for killing someone!


That's just not true. That 40,000 figure got extensive coverage from, among others, Washington Post, Wall St Journal, USA Today, CNBC, AP, and just about every other mainstream outlet.

In 2014, $416 billion was spent on maintaining US highways, and that's not including the cost of policing them, of building new highways, of vehicle inspections, driver education and licensing, or many other related costs. That's over $10 million per death, even excluding some of the largest costs. That's not my idea of "completely ignored".
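The per-death figure here is just the quoted spend divided by the quoted death toll:

```python
# Arithmetic behind the "over $10 million per death" figure,
# using the numbers quoted in the comment above.
maintenance_spend = 416e9   # 2014 US highway maintenance, dollars
road_deaths = 40_000        # approximate annual US road deaths

per_death = maintenance_spend / road_deaths   # dollars spent per death
```

which comes out at about $10.4 million per death.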

Anonymous Coward

@Gene Cash

The death of a person is a tragedy. The death of tens of thousands is just a statistic.


"12 people get shot in a school and it's the end of the world (which it is) but 3,000+ die EACH MONTH and it's completely ignored."

The difference is that the 12 people are shot without any reason WHATSOEVER, whilst deaths on the road are one of the risks of everyday life (which we should try to minimize) from which we derive a benefit.


There could be many reasons why someone knocks over and kills a cyclist, not all of them the car driver's fault.

So without more information it is impossible to say whether this was far too lenient, or a travesty of justice that they received any sentence at all.


"...drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times.."

Otherwise understood as the "notapilot" feature. Used as intended, it's really doing nothing at all.

Anonymous Coward

Unrealistic expectations?

Does Tesla really believe that U.S. drivers will keep their hands on the steering wheel at all times and actually pay attention to their driving with a feature called: "autopilot"? If so then maybe Tesla is in the wrong business because it's never gonna happen. It's unlikely that Tesla can escape major lawsuits based on their expectations of U.S. drivers.


Musk is a geek

he thinks like a geek and is thus subject to "nerdview" (basically assuming that everyone sees a system as a system and not just as a thing that does something useful). I hate to say it but this is a case where a good marketing person would have come in handy.


Re: Unrealistic expectations?

Using Autopilot seems to be like chaperoning a beginner driver.

Maybe everything goes OK, but you constantly need to stay on edge to ensure no hiccups are encountered.

I don't need that stress (because I care). I'd rather drive myself than take a chance on being lulled to sleep while driving with the unlicensed driver that is "AutoPilot".



Anonymous Coward

Tesla Autopilot is not even capable of AEB?

Automatic Emergency Braking (AEB) is where a car's sensors see something coming up (for example, a parked Police Car or a barrier across the lane) and slams on the brakes. It's an increasingly common feature and doesn't appear to be all that difficult to implement.
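For what it's worth, the simplest textbook form of AEB is a time-to-collision (TTC) trigger: when the estimated seconds to impact drop below a threshold, brake. A hedged sketch; the threshold and function names are illustrative assumptions, not any manufacturer's actual logic:

```python
# Illustrative time-to-collision (TTC) trigger for emergency braking.
# Threshold chosen arbitrarily for the sketch.
BRAKE_TTC_SECONDS = 1.5

def should_emergency_brake(gap_m: float, closing_speed_mps: float) -> bool:
    """Brake if the gap to the obstacle would close in under the TTC threshold."""
    if closing_speed_mps <= 0:          # not approaching the obstacle
        return False
    ttc = gap_m / closing_speed_mps     # seconds until impact
    return ttc < BRAKE_TTC_SECONDS

# A stationary car 20 m ahead, approached at ~29 m/s (65 mph):
# TTC is about 0.7 s, well inside the braking threshold.
```

Production systems layer sensor fusion and false-positive suppression on top, which is where the real difficulty lies.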

It seems clear that Tesla have somehow neglected to include a fully-functioning AEB within the Autopilot system.

What a big bucket of FAIL.

Clowns.



Biting the hand that feeds IT © 1998–2018