Oddly enough, when a Tesla accelerates at a barrier, someone dies: Autopilot report lands

A Tesla with Autopilot engaged accelerated toward a barrier in the final seconds before a deadly crash, an official report into the crash has revealed. Apple engineer Walter Huang was driving his Model X P100D on a Silicon Valley freeway on the morning of March 23 when the car, under computer control, moved into the triangular …

Silver badge

Re: Fire Department

The thing about water and lithium is a bit overstated. Lithium is less reactive than the other alkali metals. Yes, it produces hydrogen when wet (2Li + 2H₂O → 2LiOH + H₂), but the rate of hydrogen production depends on temperature and surface area. Apply lots of water to keep the temperature down, because simply denying it air won't work terribly well once it's over 85°C.

3
0
Silver badge

Re: Fire Department

The problem is the cell compartmentalisation. It takes only one cell to go into thermal runaway and you have flames. But cooling them involves pouring water on the middle of a battery that's almost entirely solid metal contained in a metal box. It's hardly accessible.

It's not going to go "boom" on contact with water, but it's going to expand, release gas and put pressure on nearby cells, all contained in a fixed-size box with thousands of other cells that were - until then - untouched.

And as shown, even days later the tiniest sliver of stray/punctured/scored metal from the accident can short out a cell and start a fire again.

I have seen a MacBook keyboard physically expand upwards and then ping keys off it, over the course of just minutes, because a battery got wet. The battery swelled to 2-3 times its original size and punched/warped the aluminium casing in a matter of seconds. That's not helpful.

3
0
Silver badge

Re: Fire Department

The lithium in battery packs, as I recall, isn't raw metallic lithium but lithium bound up in a compound. The result is that the material is not nearly as water-sensitive. That's why airline guidance for a phone battery on fire is to douse it; for something as sensitive to fire as an airliner, they wouldn't be saying this if they hadn't considered the ramifications carefully. In this case, cooling down the battery to prevent further thermal runaway is clearly more of a benefit than the risk of a lithium reaction.

0
0

Why the hatred for Tesla and Musk?

Is it because he told you the truth about your pantomime profession?

Did the driver have his hands on the wheel like he was supposed to? No. Case closed.

1
24
Anonymous Coward

Dear "Robbed of your longshaft",

Did the valid criticism of Your Beloved Elon cause your tech erection to flag?

10
2
Silver badge

Re: Is it because he told you the truth about your pantomime profession?

Oh no he didn't!!

8
0
Silver badge

While I don't find Elon's reaction particularly tasteful, I don't think he did anything out of the ordinary either. Tesla didn't lie; they were simply quick to point out anything and everything that may have contributed to the crash besides their own role. Hardly surprising, that. Every single person and company I can think of does exactly that immediately whenever blamed for something. It may not be the reaction you're looking for, but it's certainly human nature...

4
1

There are a lot of non-Tesla drivers in these comments.

Anyone who actually owns one, having spent a few minutes getting to know Autopilot, quickly learns its limitations. It's pretty damn good on motorways, but you know as soon as you come to junctions you have to actively tell the car what to do.

Anyone who is stupid enough to just let the car drive, without paying any attention at all, would probably be the sort of person who would have done the same thing in a car without the feature.

15
6
Silver badge

Non-Tesla driver here

I test drove a Model X and got the salesperson to turn on Autopilot.

The car immediately accelerated hard and pulled sharply to the left, presumably to find the line, only it was sharp enough to have taken me off the road had I not resisted the turn, which stopped it. Autopilot was immediately turned off again.

It's not fully autonomous, and I wouldn't be happy to leave it trying to drive without my guidance/overwatch if I were to get one.

4
0
Anonymous Coward

Anyone who actually owns one, having spent a few minutes getting to know Autopilot, quickly learns its limitations. It's pretty damn good on motorways, but you know as soon as you come to junctions you have to actively tell the car what to do.

So in other words, it's fucking useless and no better than cruise control, just with a misleading name.

Mine's the one with a manual gearbox and no flaky ELON9000 trying to murder me. Open the pod bay doors, Elon.

5
2
Silver badge

Re: Non-Tesla driver here

"It's not fully autonomous, and I wouldn't be happy to leave it trying to drive without my guidance/overwatch if I were to get one."

Which is exactly what the user manual says you should do. The Autopilot systems are in beta and full self-driving is not yet available (FSD probably won't be available for a long time, probably eons or elons).

0
0
Pint

Re: Non-Tesla driver here

It seems to me that the machine vision is being done wrong, and completely backward, and needs to go back to first principles.

How do I stay on the road?

- First, find the edges of it. Edge detection is key.

- Lanes have a mostly standardized width, so it is pretty easy to figure out how many there should be. If the number is sufficiently fractional, a lane is probably merging.

- Next, look at the motions of other cars; they are likely to give a good indication of pathing.

- Last AND least, look at lane markings, because the 101 has too many bloody places where they didn't paint over the old markings, so they cross over each other and run straight into barricades.

How do I navigate unexpected obstacles?

- My vehicle can be described as a rectangular prism of "Will bend and break if intersected".

- Around it there is another slightly larger area of "Keep Out Zone" that I want to try to protect.

- I should choose a path that will allow me to pass without any intersections of my "Keep Out Zone" with the current and projected paths of objects. It does not matter if it is a ball, a child, a bicycle, or a car; it is not desirable to hit it.

- It is easier to identify things like wind-blown paper, bags, etc., which are not a problem, than the myriad things which are, so train for the smaller set and treat the rest kinematically. (A toy sketch of the keep-out-zone check follows below.)
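
For illustration only, a minimal Python sketch of that keep-out-zone check. Everything here is invented (the names, the axis-aligned boxes, the constant-velocity projection); a real planner would carry uncertainty and proper motion models:

from dataclasses import dataclass

@dataclass
class Box:
    # Axis-aligned box: centre (x, y), full width/height, estimated velocity.
    x: float
    y: float
    w: float
    h: float
    vx: float = 0.0
    vy: float = 0.0

    def at(self, t):
        # Project the box forward t seconds, assuming constant velocity.
        return Box(self.x + self.vx * t, self.y + self.vy * t,
                   self.w, self.h, self.vx, self.vy)

    def intersects(self, other):
        return (abs(self.x - other.x) * 2 < self.w + other.w and
                abs(self.y - other.y) * 2 < self.h + other.h)

def path_is_clear(keep_out_zone, obstacles, horizon=3.0, dt=0.1):
    # True if no obstacle's projected path enters the keep-out zone within
    # the look-ahead horizon. Note we never ask what the obstacle is: ball,
    # child, bicycle or car, we just don't hit it.
    for i in range(int(horizon / dt) + 1):
        t = i * dt
        ego = keep_out_zone.at(t)
        if any(ego.intersects(ob.at(t)) for ob in obstacles):
            return False
    return True

The planner would then simply prefer candidate paths for which path_is_clear() holds; that's the "treat the rest kinematically" part.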

3
0
Gold badge

Re: Non-Tesla driver here

- First, find the edges of it. Edge detection is key.

Edge not found for unspecified reason. Now what?

- Lanes have a mostly standardized width,...

Not on this road. Now what?

- Next, look at the motions of other cars,

Road full of nutters who left it too late to be in the correct lane. Now what?

- Last AND least, look at lane markings

Computer vision is rubbish and delivers a *clear* identification of a lane marking that doesn't actually exist. Now what?

Human beings suffer all of these problems, but get around them by understanding the road and the other users at a far higher level, so when they receive implausible information they can reject it and try harder to find some better sources. We find this process so easy that we usually don't even realise we are doing it. Sadly, we've no idea how we do it. The workaround, so far, for autonomous vehicles is to spend shedloads of cash on numerous and exotic sensors that far outstrip the capabilities of our eyes and ears.

3
1

Re: Non-Tesla driver here

Rubbish. Think about how you stay on the road sometime. If all those things fail, YOU are going off as well. What if you suddenly go blind or have a stroke? Hey, same result.

The processes I listed were my understanding of how I stay on the road through less-than-ideal conditions. There are likely more, but they build up into a hierarchical process that gives different weights to different types of data and rejects or adjusts when there are contradictions (see the toy weighting sketch below).

My point was that the behavior I see reported from self-driving vehicles seems like it relies most on things like lane markers, which go totally awry when the highway department gets involved, so the way the vehicle determines position and navigation may need a rethink.
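
Purely illustrative Python, with made-up sources, weights and thresholds, of that "weight the data, reject the contradictions" idea:

def fuse_lane_estimates(estimates):
    # estimates: list of (source_name, lane_centre_offset_m, weight).
    # First pass: weighted consensus of where the lane centre is.
    consensus = (sum(e * w for _, e, w in estimates)
                 / sum(w for _, _, w in estimates))
    # Reject any source contradicting the consensus by more than 1 m,
    # then fuse again using only the survivors.
    kept = [(n, e, w) for n, e, w in estimates if abs(e - consensus) <= 1.0]
    if not kept:
        return None  # nothing trustworthy left: hand back to the driver
    return sum(e * w for _, e, w in kept) / sum(w for _, _, w in kept)

# Road edges and other traffic roughly agree; stale paint does not.
print(fuse_lane_estimates([
    ("road_edges", 0.1, 3.0),
    ("other_cars", 0.2, 2.0),
    ("lane_paint", 2.4, 1.0),  # say, old 101 markings aimed at the gore
]))  # -> 0.14: the paint is outvoted and discarded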

0
0
Silver badge

Re: Non-Tesla driver here

That's PRECISELY the problem. We DON'T think about it. Not consciously, at least. It all happens SUBconsciously, and one of the things we've learned through machine learning is that it's bloody hard to teach intuition, because most of the time we don't know how OUR OWN intuitions work. You can't teach something you don't understand. And before you disregard the idea, consider how much conscious thought we put into walking, which we typically learn as babies, when our capacity for reasoned, conscious thought is limited to begin with, yet nigh everyone from schoolchildren to adults handles the process with hardly a second thought. If you want to see just how much goes into a basic walking gait, try playing QWOP (look it up with your favorite search engine).

1
0
Silver badge

Re: Non-Tesla driver here

- First, find the edges of it. Edge detection is key.

The first self-driving tech attempts tried exactly that; they found the car freaked out when the edges changed suddenly (like when the roadway became a bridge) and couldn't cope well if the edge wasn't strongly defined (a lot of places don't paint white lines on the edge).

What people have found is that everything you think you know about driving isn't actually what you know about driving. It's all the edge cases which make it hard.

0
0
Silver badge

Re: Non-Tesla driver here

"If you want to see just how much goes into a basic walking gait, try playing QWOP"

or watch someone learning to walk again.

0
0
Silver badge

Why???

Why would an engineer - of all people - do such a thing? Riding at the front of a 2-ton lump of metal and plastic travelling at high speed towards other such lumps approaching at equally high speeds, with a head-on crash averted only by some buggy software?

Even seen as a method of committing suicide, it is excessively complicated.

4
1

Re: Why???

Of course, we don't know which Apple product this guy was working on, do we? If he were the real Miles Dyson, and Elon figured it out... Or, more likely* it was just neutering the competition.

*unlikely!

3
0
Silver badge
Holmes

Self-driving cars will always kill people. The only question is: do they statistically kill fewer people than human-driven cars?

The headlines are never going to go away.

15
0
Anonymous Coward

Everything makes mistakes

100 people die on US roads per day, likely due to stupid human mistakes, yet there's little mention of that.

One person dies due to one stupid mistake of an autonomous system, and the news won't stop going on about it.

7
11
Anonymous Coward

Re: Everything makes mistakes

Weird. Humans crash cars all the time and it's not news. Computers hardly ever crash cars and it's news. How does that work? I don't see the difference.

Oh wait, is it to do with how common the things are?

5
4
Bronze badge

Re: Everything makes mistakes

Computer-driven cars are in the minority. The crashes are ones that people would not have. A human driver would have no problem with leaving the 101 for the 85 yet should not have allowed the car to attempt this. So it was human error to have allowed the car to do this.

All car crashes should be regarded as human error. Any time a Tesla crashes on Autopilot, it's the error of the driver who allowed Autopilot to be in control. Alternatively, someone hacked the car and murdered the driver.

4
1
Silver badge

Re: 1 person dies due to 1 stupid mistake of an autonomous system

Tesla Autopilot isn't autonomous.

4
0
Silver badge

Re: A human driver would have no problem with leaving the 101 for the 85

As the barrier was damaged in a previous crash, and we haven't heard about a Tesla crashing there before, I think it is safe to assume that a person had indeed crashed there previously.

So much for human drivers having no problems.

11
3

Re: Everything makes mistakes

"A human driver would have no problem with leaving the 101 for the 85..."

The damaged barrier present BEFORE this event suggests otherwise...

8
1
Silver badge

Re: Re: 1 person dies due to 1 stupid mistake of an autonomous system

The fact is that the driver of the Tesla apparently chose to abandon control of the car to Autopilot, so it was autonomous in practice, even though it was not Tesla's intention that it be allowed to operate that way.

2
1
Silver badge

Re: Everything makes mistakes

"A human driver would have no problem with leaving the 101 for the 85 yet should not have allowed the car to attempt this."

The fact that the crash barrier had not been repaired since being damaged in a previous accident indicates that at least one human driver had a problem leaving the 101 for the 85.

2
1
Silver badge

Re: Everything makes mistakes

"The crashes are ones that people would not have."

Are they? Are you really sure about that?

"A human driver would have no problem with leaving the 101 for the 85"

And yet, the crash attenuator on the gore had been removed because someone had crashed into it in virtually the exact same manner as the Tesla did. Had it been there, the crash would have been perfectly survivable.

More to the point, video of this piece of road clearly shows a misleading lane marker trivially capable of pulling unwary drivers directly into the gore - and I'll point out _again_ that someone had already crashed into the gore, which is why it was in the dangerous state it was in.

0
0

Re: Everything makes mistakes

Very late, but why oh why do people insist on commenting without bothering to look at the data?

"A human drive would have no problem..."

EXCEPT ONE DID. In a Toyota Prius. Which resulted in the crash barrier being damaged. Which resulted in the Tesla not benefiting from that barrier. Etc...

0
0
Silver badge

When will people learn

It's not the normal events which confound automated driving systems; it's the abnormal ones.

The reality is that unless a vehicle is capable of handling all situations on the road safely, the driver must be compelled to pay attention. An alert driver combined with an autonomous vehicle is far safer than an autonomous vehicle by itself.

4
3
Bronze badge

Re: When will people learn

"An alert driver combined with an autonomous vehicle is far safer than an autonomous vehicle by itself."

But there's the rub: how can a driver remain alert if he's not doing anything most of the time?

In order for computers and humans to drive cars together, the human must be involved all the time while the computer assists, making the job easier and more precise.

For instance, power steering makes steering easier and more precise, so the car is driven better with less effort. The 'autopilot' should be a co-pilot.

6
0
Silver badge

Re: When will people learn

Well, that's the point I was making. If you don't keep the driver engaged and the car does something dumb, then there is no human intervention when the car piles into a truck/tree or whatever. An alert, attentive driver can hit the brakes even when the car is doing something dumb.

And if necessary that means the car has to force the driver to be alert. Force them to hold the wheel, monitor their face and reaction times, issue activities to perform, keep them engaged with the drive. And start bleeping and slow down if they don't react.

The problem is Tesla didn't bother with any of that in the first instance and has only begrudgingly implemented it now.

They're not alone in this - all autonomous cars have the same issue.

3
0
Silver badge

Re: When will people learn

"Force them to hold the wheel, monitor their face, reaction times, issue activities to perform, keep them engaged with drive. And start bleeping and slow down if they don't react.

The problem is Tesla didn't bother with any of that in the first instance and has only begrudgingly implemented it now."

This is incorrect - the Tesla Autopilot does (and did at the time of the accisent) monitor if the driver is holding the steeering wheel and will first warn the driver but will ultimattely disengage. If it believes the driver is still not responding, it will engage hazard flashers, pull the car over and stop.
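
Purely as illustration, a Python sketch of an escalation ladder like the one described; the thresholds and stage names are invented, not Tesla's actual logic:

import enum

class Escalation(enum.Enum):
    NONE = 0
    VISUAL_WARNING = 1      # flash a "hold the wheel" message
    AUDIBLE_WARNING = 2     # start bleeping
    PULL_OVER_AND_STOP = 3  # hazard flashers on, disengage, stop the car

def escalation_for(seconds_hands_off):
    # Invented thresholds, for shape only.
    if seconds_hands_off < 15:
        return Escalation.NONE
    if seconds_hands_off < 30:
        return Escalation.VISUAL_WARNING
    if seconds_hands_off < 45:
        return Escalation.AUDIBLE_WARNING
    return Escalation.PULL_OVER_AND_STOP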

0
1
Silver badge

Re: When will people learn

"If it believes the driver is still not responding, it will engage hazard flashers, pull the car over and stop."

Is it just me, or am I picturing one of these going into the ditch when it tries to do this on a road with no shoulders?

0
0
Silver badge

Tesla is scared of big lawsuits and of having its Autopilot, and maybe even the cars themselves, deemed dangerous to operate/drive. One big thing for the NTSB to look at is the time and resources it took to put out the battery fire, twice!

1
0
Anonymous Coward

NOT autopilot

Whoever decided to call it Autopilot is the idiot who deserves the blame.

You called it something that "normal*" people think means the car driving itself; no matter how much you tell them it isn't, they now won't believe you, especially when you keep promising the F*&king moon.

1 person is to blame, and it wasn't the non-driver...

Let's make it clear: Musk is NOT a fucking genius. He got lucky with PayPal; the rest is just having buckets of money!!!

3
8
Silver badge

Re: NOT autopilot

It should have been called advanced lane keeping or similar. Autopilot is such a vague term that people obviously misinterpret what it does and the limits of such a system.

Not just Tesla, however. No system is remotely close to full autonomy on the open road. It's not the normal that catches them out but the abnormal.

3
0
Anonymous Coward

Re: NOT autopilot

As we know, people are dumb; just make the computer a little less so. Something like:

import time

def driver_is_alert(car):
    # Alert means hands on wheel, eyes on road, and the driver reads
    # back the code shown on the heads-up display.
    return (car.hands_on_wheel
            and car.eyes_tracking_road
            and car.driver_says_correct_hud_code)

def wake_up_dozy_driver(car):
    start = time.monotonic()
    while not driver_is_alert(car):
        elapsed = time.monotonic() - start
        car.beep_volume = elapsed                      # beeping gets steadily louder
        if elapsed >= 5:
            car.electric_shock_voltage = elapsed * 12  # short, sharp, motivational
        if elapsed >= 10:
            # Assume the driver is dead.
            car.move_to_hard_shoulder()
            car.call_paramedics()
            car.points_on_driving_licence += 1
            return
        time.sleep(0.1)

def tick(car):
    # Autopilot is only available on a safe motorway; wake the dozy
    # driver one kilometre before every junction.
    car.autopilot_available = car.position.is_safe_motorway
    if car.position.distance_to_junction_km <= 1:
        wake_up_dozy_driver(car)

Sorted - Elon can just give me a ride on one of his rockets in lieu of royalties!

2
3
Silver badge

Re: NOT autopilot

How does the car know where it is at a road junction with no lines and otherwise poor markings? Especially at the bottom of a stack in stop-and-go traffic (which can throw off GPS and inertial sensors, respectively)?

Suppose the driver is in a fugue? Or sleepdriving? Both can result in false positives for driver awareness.

Put it this way. If WE can't always get it right, what chance does any technology we make have?

2
0
Silver badge

Re: NOT autopilot

Most of that is what it does (or attempts to do), aside from disabling autopilot at junctions.

0
0

Unfunnily Enough

Cheap jokes when someone is killed are really inappropriate, even when dishing out merited criticisms of Tesla's "trial & error in a live context" approach to progress.

0
13
Silver badge

Re: Unfunnily Enough

Fine, I promise not to tell any jokes at the funeral. I promise absolutely nothing about anywhere else. Go take your over-the-top piety somewhere people actually care for it.

14
0
Bronze badge

And in a final effort to pin the blame for the crash on Huang

Was Huang the driver or simply riding in the car?

The 101 is a very scruffy old road with a lot of fast-moving traffic through Silicon Valley. You need your wits about you and should not be expecting a computer to drive you. You definitely can't expect your car to follow another car off the 101 onto the 85.

What concerns me is whether the cruise control failed to release the car to the driver.

3
0
Silver badge

Re: And in a final effort to pin the blame for the crash on Huang

He was the driver.

Why would the cruise control release the car to the driver? By all accounts, he never tried to take control.

6
0
Silver badge

And this is why I will still prefer old-school cars without Autopilot or any such fancy gimmickry. I want to remain in control of the car at all times.

3
2
Anonymous Coward

Do Tesla use agile?

Maybe Tesla just isn't agile.

0
0
Anonymous Coward

Ghost train

How long before driving resembles a fairground ride where we are all passengers, scared witless by the unexpected, with no idea of whether we will get to journey's end? I'd rather play on the Dodgems.

1
0

It's all rubbish

To paraphrase James May on Top Gear years ago... it's all rubbish anyway. Self-driving cars were invented years ago... they're called taxis.

7
0
Silver badge

Re: It's all rubbish

I have had a few taxi rides in and around Slough where I would have felt safer in an autonomous vehicle in beta.

1
0
