there are some ****ing morons out there. And the fact that some of them are driving Teslas means there's no correlation between wealth and brains.
An update to Tesla's Autopilot software earlier this month has caused headaches for drivers of its electric cars – with one user alleging he was almost driven off the road by the robotic assistant. The patch, 2018.21.9, contained a number of tweaks to address safety concerns with the Autopilot software, which Tesla trumpeted …
There'll be worse to come if Musk is serious about the next software update being able to:
begin to enable full self-driving features.
WTF? You can't roll out autonomous driving piecemeal - the vehicle's either fully self-driving, or it isn't. Continuing this pretence that drivers have to always be prepared to take control, while progressively taking them out of the loop, is borderline criminal.
Continuing this pretence that drivers have to always be prepared to take control
The vast, vast majority of drivers in the UK are hopeless. Utterly woeful, and frankly, unsafe. That will, statistically, include most people reading this. The idea that such low-skilled drivers would be capable of correcting something the car has started doing, say swerving, is fanciful - they'll over-correct and end up in a hedge. And that presumes they're sober/awake/alert enough to realise something needs correcting.
It'd be nice if we had a realistic driving test that was difficult to pass, then regular retesting. Most people never progress beyond the very basic L test in terms of driving, so we really do need to make sure they haven't regressed too far from that point over the next, say, 50-60 years.
I think we have a winner for the Dunning–Kruger award for worst driver in the thread.
You sound like every other terrible driver I have ever had the misfortune to be a passenger with. The vast majority of drivers are, well, average - which means they drive a few hundred thousand miles over a few decades without a serious accident. Whereas every accident-prone driver I've ever known has the same attitude as the poster. A little patience, understanding and forbearance goes a long way towards making everyone's life a little less dangerous on the roads.
So, OP, how many miles and how many countries have you driven over the decades? And were they accident-free? Me, a good 350K-plus miles in 4 countries. Including LA freeways when they used to be completely Mad Max, central Paris on a Friday afternoon and some of the sketchier parts of Italy. Driving in the UK is a doddle compared with some places in the world. Which is reflected in its accident rate per mile driven.
The simple fact is most human car drivers are actually very, very good considering the complexity and variability of the task, whereas autonomous driving software is actually very, very bad at dealing with anything other than perfect driving conditions. Which don't happen very often.
Roll on the criminal product liability class action lawsuits.
Bravo. I'll humblebrag here. I recognize that I would have had more accidents if not for the many times other drivers have compensated for my occasional lapses in judgment or insufficient attention. I try to reciprocate with patience, tolerance and alertness while keeping my hand less on the horn and my feet more ready for the brake, clutch, whatever... And still, I succumb to annoyance and irritation that I try to suppress.
I have been accident-free for decades and hundreds of thousands of miles. Thanks to others' patience and yes, skill.
Yep. If you haven't passed your IAM or ROSPA test within 5 years, add a government surtax of 100% to your insurance costs as a nice financial incentive to make people get round to it.
The L test is just supposed to show that you are safe on the roads while you gain experience of driving. Why not make people prove they are both experienced & safe after a set time? We might get fewer lunatics on the roads if we did something like this.
Then....accident rates fall, NHS/emergency service costs/callouts fall, roads are safer, cheaper insurance (I am trying to find something negative for balance....nope).
If you are reading this, is there any reason you haven't taken an advanced driving test yet?
My excuse for not doing the IAM or ROSPA test is quite simply that virtually everyone you see with an IAM or ROSPA sticker on their vehicle is driving or riding like a total dick.
When they can't agree with each other over what is safe and what isn't I find it odd that they are rated so highly by their members.
But then, after a lot of years as a working rider and probably close to or just over 1 million miles in the seat, I don't hold even the Police - especially the motorcyclists - in anything approaching regard, although I do think lorry drivers and most delivery drivers are pretty good.
> Not quickly enough as for 1/4 second or so you are thinking "is it going to brake? Am I supposed to take over? Oh shit it's not going to stop"
Almost 3 seconds pass between the vehicle starting to move across, revealing a stationary vehicle in front of it, and the impact occurring. You may well be late, but any braking is better than no braking.
Good alert defensive drivers will get twitchy straight away; those with less experience (and, increasingly, those who come to rely on auto-pilot in the future) less so... YMMV :-D
I think even a good driver (although one has to ask why a good driver would be ignoring the road) would have to add to the normal 'thinking distance' the time needed to switch back to 'I'm driving a car' mode from the 'I'm on autopilot' mode. As earlier posters have pointed out, initial reactions would still be from a mindset that expected the car to do something about the situation, and realisation that the car was going to do nothing might well come far too late to take over and respond to the threat.
The halfway house of 'assisted cruise', and even fully autonomous cars, is fraught with pitfalls. The only way to introduce autonomous vehicles safely is to replace all vehicles with fully autonomous vehicles in one go. An autonomous vehicle can predict or query the behaviour of every other autonomous vehicle on the road, but it can't predict that the meatsack in lane 4 is going to cut across 3 lanes of traffic at the last possible moment and at very high speed, because he suddenly realised this is his exit. An autonomous vehicle would already have positioned for the exit some time back and alerted all other traffic to its intentions. You'd need to get rid of all human-directed vehicles for such a scenario to exist.
That just isn't going to happen
"Good alert defensive drivers will get twitchy straight away; "
I'd not even go that far. I'd say most drivers following at the same speed as the car in front would immediately be aware there must be a good reason for the leading car to be changing lanes on what in the video was an otherwise clear road, not a junction. The leading car, at best, was pulling out to pass a slower vehicle. A human driver would have known this and almost certainly have started reacting as soon as the indicator came on.
Not quickly enough as for 1/4 second or so you are thinking "is it going to brake? Am I supposed to take over? Oh shit it's not going to stop"
If that's your line of thinking, you shouldn't be driving in the first place.
Until such a time that there's no steering wheel or pedals in the car, *you* are always in control. You may allow the car to perform certain manoeuvres on your behalf, but that should never preclude you from reacting to developing situations around you before the car does.
> brake? Not quickly enough as for 1/4 second or so you are thinking "is it going to brake? Am I supposed to take over? Oh shit it's not going to stop"
No, hell no.
You should always be taking over - braking or whatever - immediately. You should never wait to see what autopilot (or any other 'super-cruise', or pretend autonomous, or even real autonomous system) will or will not do. Those systems are there in case the driver fucks up, they are a backstop. Not the primary system. You, as the driver, are the primary control system.
It is implicitly relying on the car in front to be driving at a safe speed, so it is maintaining braking distance to it as if that car will start stopping. That is quite normal - most human drivers drive this way. You need 75m braking distance at 70mph. Nobody does that. Everyone drives closer and pays attention to what the other vehicle does, as well as looking over it to what is AHEAD of it on the road.
1. The LIDAR and other hard instrumentation of the autopilot suite do not see past the car in front.
2. The software on the vision feed is nowhere near that level of analysis at present. You literally have to judge past whatever the other driver has piled up in their rear window and make judgements based on partially obscured views and the indicators/stop lights of cars way in front of you. We all do it instinctively after driving for a few months. Automated cars are not even there, as this test clearly demonstrates.
3. We make additional decisions based on the car in front. I actually expect an Audi driver to switch lanes after the first blink of an indicator (still better than a teenager in a pimped-up Saxo, who will not even bother). I give it an extra 10-15m just because the person in front is pretending to be in the cockpit of a Me 109. The autopilot in the BBC video did not do that.
By the way, the BBC video is nearly identical to the reconstruction of the recent incident with the crash barrier in California. The sole difference is that it was a cardboard car, not a concrete barrier. The other variables, including an Audi switching lanes at the last minute (coming out of a "non-lane"), are identical.
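As a sanity check on the 75m figure: braking distance follows from d = v²/2a. A minimal sketch, assuming a deceleration of roughly 6.5 m/s² (an illustrative figure chosen to match the Highway Code, not anything from Tesla):

```python
def braking_distance_m(speed_mph: float, decel_ms2: float = 6.5) -> float:
    """Braking distance d = v^2 / (2a), ignoring thinking time."""
    v_ms = speed_mph * 0.44704            # mph -> m/s
    return v_ms * v_ms / (2.0 * decel_ms2)

# braking_distance_m(70) comes out at roughly 75m, matching the
# Highway Code's braking-distance figure for 70mph.
```

Note that because the distance scales with the square of speed, 35mph needs far less than half the distance of 70mph.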
@Voland's right hand
"You need 75m braking distance for 70mph. Nobody does that. "
I do actually try and leave a nice safe gap on British motorways - usually without success.
.. Because there's always some driver who can't resist that stretch of empty space and so has to pull into it.
Only time I get to have a "safe" distance is when the MWay is very quiet.
.. I'm a realist (& keen on myself & other road users staying alive) & acknowledge my concentration will not always be 100% when suffering monotonous MWay driving, hence I like a nice safe distance, as I expect a sub-optimal response if I happen not to be fully focused, and so extra reaction time is needed.
You would be able to keep a safe braking distance from the car in front to start with, at which point the auto-pilot's reaction (full on braking the instant it sees the stationary car) would prevent the collision.
In this case the auto-pilot could have swerved, but in most cases in traffic that would be unsafe to do anyway. Creeping up close behind the car in front works only when you can communicate with that car so you can effectively see through it.
"You would be able to keep a safe braking distance from the car in front to start with, at which point the auto-pilot's reaction (full on braking the instant it sees the stationary car) would prevent the collision."
Errr no. The auto-pilot doesn't have any reaction to stationary objects. This is deliberately done to stop the many false alarms that would occur when roadside objects come into the field of view.
"The auto-pilot doesn't have any reaction to stationary objects. This is deliberately done to stop the many false alarms that would occur when roadside objects come into the field of view."
Yes, I've seen that explanation a few times. It basically says - "That's how it's supposed to work". Now, I agree that it would be dangerous if an autonomous car slammed on the brakes because of a stationary roadside item, e.g. a large road sign... BUT Tesla is basically admitting that its system is not capable of distinguishing between an object on the roadside and one on the road right in front of it.
That is pretty rubbish
STOP MEASURING DISTANCES. You can't judge them, you can't remember them, and you have to know your speed to use them anywhere near accurately.
Just keep two seconds between you and the car in front. Always. You're already looking forward. Wait for them to pass something, count to two. And, yes, drivers DO do that: anyone with a brain realises that if they can't come to a complete stop before the car in front does, then they are going to die.
That's two seconds between you and the stationary obstacle or the next car in the queue too. If you don't have two seconds and HE doesn't have two seconds, you're too close. Because he'll whack the object (as in the demo), and you still can't stop, because he stopped dead rather than braking normally.
The guy in the comment above counted three seconds between seeing the car and anything being done. So not only did you blow through your entire braking distance, but 50% over again, without even reacting, let alone pressing the brake.
The 2 seconds is a base minimum for a complete idiot with zero lookahead: enough to see and stop in if you're driving anywhere near sensibly, and anything "revealed" less than 2 seconds away means you weren't paying attention.
2 seconds at 70mph is 62.5m. That's the MINIMUM distance you should have between you and a car in front travelling at the same speed. Though it can take 3-4 seconds in a modern car to come to a complete stop even in ideal conditions (more like 2.5-3 in a well-maintained one), at worst, hitting a surprise stationary object, you'll reduce the impact speed to a fender-bender, not certain death. At best, you'll have time to move around the obstacle entirely. However, that's the MINIMUM. The utmost lowest limit. The least you can do and still be vaguely safe by the laws of physics, a well-maintained car and absolute attention on the road.
And magically the "2 second rule" pretty much works no matter what speed you're doing, as it scales with speed!
Seriously have we all just forgotten how to drive?
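For anyone who wants the arithmetic behind "it scales with speed": the gap in metres is just speed multiplied by the time gap. A minimal sketch:

```python
def time_gap_m(speed_mph: float, gap_s: float = 2.0) -> float:
    """Distance covered during the following-time gap at a given speed."""
    return speed_mph * 0.44704 * gap_s    # mph -> m/s, times seconds

# time_gap_m(70) is about 62.6m - the 62.5m quoted above - and the
# distance doubles automatically when the speed doubles.
```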
"Just have two seconds between you and the car in front."
When I learned to drive (and probably for a long time before that) the rule was a car length per 10 miles an hour. That turns out to be a reasonable approximation for 1 second. Given that brakes and tyres were less efficient than nowadays, it seems the advice then was a good deal more optimistic.
The danger here is assuming people knew how to drive to begin with. The number of times I have to take evasive action because someone isn't paying the slightest attention to their surroundings is ridiculous, and it's getting worse.
How the fuck these people passed their test is beyond me. They certainly don't develop beyond the test. Learn enough to pass it, then regress.
Best approach is to assume everyone around me doesn't have a clue and is always likely to do something stupid.
And don't get me started on parents getting / installing a baby seat from the traffic side. In a narrow road with parked cars both sides. Are you trying to get yourself and your kid killed? People really are getting dumber by the day.
> And don't get me started on parents getting / installing a baby seat from the traffic side. In a narrow road with parked cars both sides. Are you trying to get yourself and your kid killed? People really are getting dumber by the day.
And parents who push a pram/buggy out in front of them through a gap in parked cars till they can see what's coming, by which point the poor little darling has been pushed into the middle of the traffic flow.
"And don't get me started on parents getting / installing a baby seat from the traffic side. In a narrow road with parked cars both sides. Are you trying to get yourself and your kid killed? People really are getting dumber by the day."
You do realise some people have 2 or more kids, right? And that cars have only 2 sides, one of which will be on the 'traffic' side?
If it's a narrow road with cars parked on both sides, maybe you should be going slow enough not to hit any parents and/or kids, and maybe you can wait for the parents for the full 30 seconds to a minute it takes to strap a kid in?
"And parents who push a pram/buggy out in front of them through a gap in parked cars till they can see what's coming, by which point the poor little darling has been pushed into the middle of the traffic flow."
Awful practice, I agree. Reverse the buggy and go out first yourself. Or, God forbid, walk the extra 100 metres to a zebra crossing instead of crossing from every which where.
Drivers are too aggressive these days to adhere to the Three Second Rule. Even less than ONE second provides a gap of at least a car length, and the instant you leave a gap big enough to fit into (by Murphy's Law), someone WILL slip into it, removing your gap. And trying to re-establish the Three Second Rule behind the new car just invites another interloper, ad nauseam.
> BUT Tesla is basically admitting that its system is not capable of distinguishing between an object on the roadside and one on the road right in front of it.
That is sorta true.
I say sorta, because don't forget on a bend, or coming up to or out of a bend, objects that are on the side of the road are in front of the car.
Being able to determine if a stationary object is a collision hazard or not is trivial if you assume a straight road, where anything directly in front also means it is on the road, and anything not in front means it is not on the road.
But this gets rather more complex when you start throwing in bends, intersections, overhangs, and so on. Is that object right in front actually on the road, or is it on the side of the road but, because of a bend, directly in front of the car?
This is the reason that these systems, once you exceed a certain speed (reportedly in the 35-40mph range), start, effectively, assuming any stationary object is something that isn't on the road. After all, if you are travelling at 70mph, shouldn't it be obvious to the driver that a stationary object ahead is on the road, and therefore a hazard? Therefore, as the driver must be keeping an appropriate safe stopping distance (/rolleyes) and paying attention even though they are using a super-cruise type system (in the case of a Tesla it is branded as Autopilot - again /rolleyes), surely the driver should begin braking, at least to a low enough speed for the auto-braking functions to enable themselves and apply additional braking if necessary?
But yes, the fact that this is the situation does scream that the systems aren't capable of being able to identify objects visually like humans do - I can identify a sign or tree from how it looks, and conclude that it must be on the side of the road because I know what the object is. These systems aren't sophisticated enough to identify an object purely on shape/vision alone. They munge a whole heap of data together - location, speed, rough shape, to try and deduce what the object is. And that deductive process isn't good enough yet.
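The speed-threshold behaviour described above can be sketched as a Doppler-style filter: a stationary object returns a closing speed equal to the car's own speed, so its world-frame speed is near zero, and it gets dropped above the cutoff. This is purely an illustrative guess at the logic, not any vendor's actual code; every name and the 17 m/s (~38mph) cutoff are made up:

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float       # distance to the target
    closing_ms: float    # closing speed from Doppler (positive = approaching)

def is_tracked(ret: RadarReturn, ego_speed_ms: float,
               cutoff_ms: float = 17.0) -> bool:
    """Track a return only if it moves in the world frame, or if the car
    is below the cutoff speed where stationary braking stays enabled."""
    world_speed = ego_speed_ms - ret.closing_ms   # ~0 for a stationary object
    if abs(world_speed) > 1.0:                    # clearly moving: always track
        return True
    return ego_speed_ms < cutoff_ms               # stationary: low speed only
```

At motorway speed a stationary car ahead looks exactly like a gantry or a road sign to this filter, which is the failure mode in the BBC video.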
What you are leading to IMHO, is the conclusion that only roads built for autonomous cars (or accredited for use by) should be used by autonomous cars, and no other roads.
This may be the way forward.
However, if you're going to spend that much money providing a bulletproof playground for these cars, you might as well stick rails in the ground and build a Light Rail System (e.g., the Docklands Light Railway (DLR)) which is a damned sight safer and meets environmental agendas much more profoundly.
"I can identify a sign or tree from how it looks, and conclude that it must be on the side of the road because I know what the object is."
There were a few stationary trees on the roads of the UK yesterday. So even if an autonomous car can recognise a tree, can it also tell if it's lying in the road (partially or fully) and take the correct action or will it dismiss it until too late because trees are always at the side of the road?
"BUT Tesla is basically admitting that it's system is not capable of distinguishing between an object on the roadside and one on the road right in front of it.
That is pretty rubbish"
Normally I'd just upvote, but that point needs separating out, quoting and re-posting for extra emphasis.
The problem is that in this situation there are no sensors on the Tesla/Volvo etc that can "see" the stationary car, so it will not apply the brakes.
The radar is used to detect other moving objects, not stationary ones. The reason for this is that the radar cannot tell overhead obstacles from those in the road so if it stopped for every stationary object it would be stopping for overhanging trees, overhead street signs etc.
It might be different if they used LIDAR but they don't.
"The radar is used to detect other moving objects, not stationary ones. The reason for this is that the radar cannot tell overhead obstacles from those in the road so if it stopped for every stationary object it would be stopping for overhanging trees, overhead street signs etc."
... which is a valid comment at this point, but should be a load of hogwash.
The car doesn't just rely on radar. It has a ton of other sensors telling it what is happening. It should be perfectly capable, right now, of working out a) what is a moving object, b) what is a stationary object, and *most importantly* c) whether any of these objects is in, or is moving into, its projected 'flight' path.
A stationary overhead gantry, a lamp-post, or a tree branch hanging over the road more than 6ft off the ground should be easily detected and *not* be a collision suspect for a Tesla. Nor, technically, is a car stopped in another lane if your lane is clear and moving, at least until it starts moving again or a human gets out of it at the wrong moment (identify: nearby stationary or slow object in what should be an open lane; reaction: proceed with caution, check speed, etc).
Something detected *directly* in the predicted flight path - or otherwise calculated as likely to move into it - *is* a collision suspect... so take evasive action (whatever the car deems evasive at that moment: brake, slow down, swerve or whatever). This should not include street furniture.
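The identify/react split above boils down to a corridor test: keep anything low and inside the projected path, drop anything overhead or off to the side. A hypothetical sketch - the 1.83m clearance is just the 6ft from the comment, and the lane half-width is a guessed figure:

```python
def is_collision_suspect(lateral_offset_m: float, lowest_point_m: float,
                         lane_half_width_m: float = 1.8,
                         clearance_m: float = 1.83) -> bool:
    """lateral_offset_m: object's distance from the predicted path's
    centreline; lowest_point_m: height of the object's lowest point
    above the road surface (1.83m ~= 6ft)."""
    if lowest_point_m > clearance_m:        # gantry, sign, overhanging branch
        return False                        # street furniture: not a suspect
    return abs(lateral_offset_m) <= lane_half_width_m
```

The hard part, of course, is producing reliable lateral offsets and heights from the sensors in the first place - which is exactly what the radar-only approach dodges.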
What about an object crossing the road perpendicular to the line of travel? It's moving but not with a component in the direction the radar's looking.
Well, we could always test it out in the real world. You drive your Tesla on autopilot and I'll cross the road, and ... ...
Ah. Second thoughts, I'll drive your Tesla on autopilot and you cross the road in front of it.
You would be able to keep a safe braking distance from the car in front to start with, at which point the auto-pilot's reaction (full on braking the instant it sees the stationary car) would prevent the collision.
I thought the issue was that, in common with similar systems, the Tesla cruise control ignores stationary objects that it detects once the vehicle is doing a certain speed, to avoid the risk of emergency braking on high-speed roads - which is, perhaps, counter-intuitive to many people who have seen what the cruise control is capable of handling. I presume that the braking after hitting the inflatable car was either a result of the Tesla detecting the impact, or was done by the driver, having made his point and knowing that there was another real vehicle in front of the inflatable one, or both.
For many vehicles a wetware driver doesn't need any technology to "see through " the vehicle in front other than the front and rear windows already installed and their own ocular sensor array. (Translation: You can simply, literally see through the car in front).
Now, following a truck, SUV or family saloon loaded with crap on the parcel shelf, the wetware obviously has to make a suitable adjustment.
And that's the thing - for all the talk of "advanced" tech and AI and other brands of snake oil, the simple fact is that, as "impressive" as what has been achieved might be, it is still a long, LONG way from being able to match a human. Even one of only average intelligence.
I suspect a significant proportion of human drivers would crash in that situation, if not the majority; the black car indicates and changes lane normally, and there are no brake lights or signs of trouble on the white car, so it's going to take a moment before you clock that it's not moving. It's less than 2s to impact from the stationary car coming into view by my stopwatch, and the nominal stopping time from 40mph with ABS is 2.0 seconds, so do the math; it's pretty much a guaranteed crash situation.
In fact many human drivers would probably swerve, which on a real highway with lanes full of traffic would likely mean an even worse accident as a nice multi-lane pile-up ensues. The car in the video had slowed quite a bit, as it was only slightly more than a car length late in stopping, so if it was hitting a real car instead of a polystyrene model, it would likely have at least kept the accident in its own lane.
The issue of whether you'd have time to take over from autopilot is a bit moot if the autopilot actually did as well as, or better than, a human driver would have.
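The "do the math" in that post: under constant hard deceleration, stopping time is v/a, and 40mph with roughly 0.9g of ABS braking does come out at about 2 seconds - more than the sub-2-second window to impact. A quick sketch, with the deceleration an assumed figure:

```python
def stopping_time_s(speed_mph: float, decel_ms2: float = 8.9) -> float:
    """Time to stop from a given speed under constant deceleration."""
    return speed_mph * 0.44704 / decel_ms2   # mph -> m/s, then v / a

# From 40mph, roughly 2s to stop - so with under 2s between the obstacle
# appearing and impact, even an instant, perfect reaction can't avoid it.
```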
Stop trying to defend it.
The car does not try to significantly brake AT ALL until it's already hit.
All the sensors in the world didn't notice the stationary object in its path (which could have been a concrete lane divider just as easily as a car), and it whacked it at full speed.
The first car got around the obstacle; the car behind it certainly should have as well (I'd make an exception if there were something to the right of the moving car preventing an emergency lane change, but there's not).
These things are deathtraps, pure and simple. I'm sure they "work" some of the time but they can't see a stationary object. And if you have to keep your hand on the wheel and full-attention, and they don't work well enough to be left unsupervised, then you are really just burning money for something that can't do the job.
And when it can't do the job of NOT HITTING STATIONARY OBJECTS (which means your braking distance was sorely lacking... no matter WHAT speed you're doing or who was in front... you literally could not have stopped if the car in front had decided to stop BEHIND the stationary car), then suggesting a software upgrade could make it self-driving is ludicrous in the extreme.
Imagine the dummy car wasn't a dummy car, but a lane divider. Or a police vehicle. Or one of those crash-vehicles that steer you away from the coned-off-lanes. BAM. You're dead. And that's at 70mph. At 40mph, I would damn well expect it to stop.
It drove too close, it did not detect a problem, did not react in time, and it would have killed its driver and other people for the simple situation of "car in front swerves around stationary object in the road". That could be a pedestrian, a piece of debris, a car that pulled half-out, a blown tyre, a broken-down vehicle, a police roadblock, anything. And the Tesla was too dumb to distinguish and do anything about a perfectly ordinary driving situation that happens thousands of times a day and DOESN'T kill everyone.
It should be noted here that setting the following distance is a meatsack function. There's a control with a range of 1-7 (from memory) which sets the time interval at which to follow behind.
This one looks to me like it's set to closest follow distance, which is pretty damn stupid. I always use any intelligent cruise control on maximum separation, personally.
Now, if you want to debate why the meatsack is able to set the following distance, I'm right with you - prime evidence that the people designing these system don't quite grok safe driving, to me.
But then, the group mostly involved in cutting-edge programming are also the same group most over-represented in collision statistics - young males. Go figure...