Dear Tesla, stop calling it autopilot – and drivers are not your guinea pigs

Tesla is misleading drivers about the efficacy of its Autopilot feature and is putting lives at risk, according to Consumer Reports. The automaker's autopilot system, when engaged, is supposed to control the speed of the vehicle and its distance from other rides and objects – but it's more of a super-cruise-control than a …

Silver badge

"We will continue to develop, validate, and release those enhancements as the technology grows. While we appreciate well-meaning advice from any individual or group, we make our decisions on the basis of real-world data, not speculation by media."

That's a brave statement considering that regulators are seemingly unimpressed by the performance of things like Autopilot. With statements like that it seems Tesla are wilfully ignoring the Human Factors aspects of such a thing.

What I don't get is why on earth Tesla are risking all with Autopilot. Their main thing is half-decent practical electric cars, yet they're willing to take a huge commercial risk on Autopilot, something that their main technology doesn't need or benefit from at all.

Google are nearly as bad, saved by the fact that they're not openly selling cars to the public. "Woohoo, self driving car" they say in demos, ads, papers, trials, and as much publicity as they can generate, yet in the small print they say "you have to be paying attention and will have to take control at short notice"... So not self driving at all then. Most people believe and respond to the publicity, but have no idea about the actual constraints on the technology. If it wasn't for the strict rules imposed by the State of California (CA published the trials data) we'd not be told that actually it's pretty unreliable at the moment.

The only company doing it properly is Volvo, who said at the outset of their development programme that they are aiming for a system where Volvo carries the liability, i.e. a true self-driving car. Good for them.

14
1

It looks to me as if the people trusting in this sort of stuff have just put themselves on the short list for the Darwin Award.

3
0
LDS
Silver badge

Actually, in the "Darwin Awards" movie, one of the episodes was exactly about someone thinking that the cruise control was a full AI autopilot, and they could enjoy the trip in a different way...

1
0
Silver badge

Just call it Darwin mode.

Solves all problems.

4
0
Anonymous Coward

Perhaps Tesla is really run by a GLADoS prototype

But there's no sense crying

over every mistake.

You just keep on trying

'til you run out of cake.

And the science gets done.

And you make a neat gun (or electric car...)

for the people who are

still alive.

Although I am sure the article was triggered by an A/C commentard on el-reg who also pointed out that the feature should not be called autopilot.

4
0

Degree of control

I will only let go of my steering wheel when there isn't one, i.e. when the Autopilot functions like a chauffeur and is so reliable that the car design simply doesn't include the option of manual control. In the meantime, if I have to be alert enough to take over at a moment's notice, I might as well retain manual control all the time. Adaptive cruise control is fine, but if I could let go of the steering wheel I would inevitably stop paying proper attention to the road.

14
0
jzl

Evidence & numbers

The number of deaths per mile with Autopilot enabled is lower than the average number of deaths per mile already, meaning that Autopilot is likely to be saving lives.

Also, Musk's just tweeted that they've finally got physical access to the latest crash vehicle (Pennsylvania) and the Autopilot was turned off at the time of the crash. More to the point, he says, the crash would not have happened if it had been turned on.

6
6
LDS
Silver badge

"was turned off at the time of the crash"

And in the instants before? You have to reconstruct all the events leading to a crash. Was the autopilot a factor? Did the driver disable it in an attempt to avoid the crash? Was the autopilot enabled before? Did it somehow deceive the driver, causing him to choose the wrong response?

Moreover, the statistics about relatively few Teslas, all new cars driven by wealthy, relatively young people, may not be directly comparable to the statistics for all cars. I'm quite sure the number of deaths per mile for Rolls-Royces and other luxury cars is lower than the average, even without an autopilot, while cheap, old cars, especially in the hands of very young or very old drivers, probably have a higher one.

Beware of statistics: they don't always tell the real "truth", especially when there are many "dimensions" to take into account. Tesla data should be compared *only* to similar cars and drivers to give a meaningful statistic.
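
[Ed: the confounding this commenter describes can be illustrated with a small, entirely made-up example. All numbers below are invented; the point is purely statistical: a fleet can look safer in aggregate while being worse on every individual road type, simply because its miles are concentrated where driving is safer (Simpson's paradox).]

```python
# All numbers are invented purely to illustrate the statistical point:
# a fleet that is WORSE on every road type can still look BETTER in
# aggregate if its miles are concentrated on the safer road type.

def rate(deaths, miles):
    """Deaths per 100 million miles."""
    return deaths / miles * 100e6

# (deaths, miles) split by road type
luxury_fleet = {"highway": (2, 400e6), "urban": (2, 100e6)}    # mostly highway miles
all_cars     = {"highway": (4, 1000e6), "urban": (16, 1000e6)}

def aggregate_rate(fleet):
    deaths = sum(d for d, _ in fleet.values())
    miles = sum(m for _, m in fleet.values())
    return rate(deaths, miles)

print(rate(*luxury_fleet["highway"]), rate(*all_cars["highway"]))  # 0.5 vs 0.4
print(rate(*luxury_fleet["urban"]), rate(*all_cars["urban"]))      # 2.0 vs 1.6
print(aggregate_rate(luxury_fleet), aggregate_rate(all_cars))      # 0.8 vs 1.0
```

So a raw "deaths per mile with Autopilot vs overall average" comparison says very little until it is stratified by road type, vehicle age, and driver demographics, which is exactly the commenter's point.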

11
4
Silver badge

Re: Evidence & numbers

Bullshit. The deaths per mile with old school cruise control are also lower. Are you going to argue that cruise control saves lives, or acknowledge that it has more to do with people only using such features when they already feel safer - and also are generally traveling at highway speeds where the per mile death rates are lower anyway.

This is similar to arguing that not showering makes you safer in your home (because you eliminate the 'slipping in the shower' accidents)

8
4
Anonymous Coward

Re: Evidence & numbers

The number of deaths per mile with Autopilot enabled is lower than the average number of deaths per mile already, meaning that Autopilot is likely to be saving lives

AI vehicle: 500,000 miles in 2 years, 1 death

Piloted vehicles: 600,000,000,000,000 miles in 120 years, with a million deaths

Still doesn't add up to me, until they cover as many miles as normal vehicles.

3
2

Re: "was turned off at the time of the crash"

"And in the instants before? You have to reconstruct all the events leading to a crash. Was the autopilot a factor? Did the driver disabled it in an attempt to avoid the crash? If the autopilot was enabled before? Did it deceived somehow the pilot, causing him to choose the wrong response?"

Or did the Autopilot call the driver a bumface? Or... or... none of those things. You seem to be fishing.

3
2
Silver badge

More bullshit numbers

Listing all traffic deaths for the last 120 years to compare against the last two years? How about at least using the last two years, since the death rate was a lot higher in the old days before seat belts and crash testing?

The trends show about 1 death per 100 million miles driven these days. That includes ALL roads not just the nice highways where autopilot is far more likely to be engaged, and all vehicles including poorly maintained 20 year old cars which Teslas aren't. AND it includes the 1/3 of deaths that are alcohol related and can be thrown out unless you are going to claim that the reason autopilot makes you safer is because it saves people who are drunk, texting while driving or doing other stupid stuff. That's hardly the makings of a great ad campaign: "Are you a terrible driver who does stupid things and doesn't pay attention, buy a Tesla and use autopilot and you're less likely to die!"

5
0
Vic
Joke

Re: Evidence & numbers

This is similar to arguing that not showering makes you safer in your home

And so it does.

Eschew showering for long enough, and you're far less likely to contract an STD...

Vic.

2
0
LDS
Silver badge

Re: "was turned off at the time of the crash"

It looks like you never did a "root cause analysis". A snapshot of a single instant in an incident may not tell you enough, and may deceive you. Was the autopilot a factor? The fact that it wasn't enabled at the instant of the crash doesn't mean it wasn't a factor, if it was enabled before. When it's all about safety, you must be very careful and check and understand everything that led to an accident. You may be free to rely on unsafe systems just because they make you feel trendy, and kill yourself, but you may also kill other people, and that's unacceptable.

1
0

A sign of things to come

Tesla's "autopilot" is actually quite modest and it's easy to see how you might break the problem down and model it - multiple lanes of cars all going the same way, sensors that model the car's surroundings / lane markings, algorithms that maintain speed & distance, algorithms that mark opportunities to overtake, algorithms to avoid / brake hazards based on proximity, control steering / brakes / lights. It's complex no doubt but it can be modeled.

But it requires:

a) That the computer is able to see all hazards, act in a predictable way, and only engage when the road and conditions are suitable. This is clearly not the case.

b) That the car forces the driver's attention: force the driver to hold the wheel with both hands, force them to touch a pedal in a certain way, monitor their head and posture. This is clearly not the case either.

It is the failure of a) and b) which causes accidents. A failure of a) is bad enough, but without an attentive human it's a guaranteed accident. This is a foreseeable consequence of not forcing attention, i.e. bad design.

The funny part is that Tesla's self-drive solution is quite modest. The problems facing mostly or fully automated cars are orders of magnitude harder. Perhaps reports of accidents might allow a little bit of reality to creep into the hype about self-drive vehicles.
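
[Ed: the "maintain speed & distance" piece of the breakdown above can be sketched as a simple control loop. Everything here is hypothetical and grossly simplified – one dimension, invented gains and limits, no sensor fusion or fault handling – it only shows why this particular sub-problem is tractable to model, not how any real system works.]

```python
# Toy one-dimensional adaptive-cruise-control step. All gains, limits and
# the 2-second time gap are invented for illustration; a real system fuses
# radar/camera data and wraps everything in safety monitors.

def acc_command(own_speed, lead_distance, lead_speed,
                set_speed=30.0, time_gap=2.0, k_gap=0.5, k_speed=0.8):
    """Return an acceleration command in m/s^2, clamped to sane limits."""
    desired_gap = own_speed * time_gap         # keep a 2-second following gap
    gap_error = lead_distance - desired_gap    # positive = we're too far back
    closing_speed = lead_speed - own_speed     # negative = we're catching up
    accel = k_gap * gap_error / max(own_speed, 1.0) + k_speed * closing_speed
    if own_speed >= set_speed and accel > 0:   # never exceed the set speed
        accel = 0.0
    return max(-5.0, min(accel, 2.0))          # clamp: hard braking .. gentle accel

print(acc_command(30.0, 30.0, 25.0))   # too close and closing in -> brakes (-4.5)
print(acc_command(25.0, 100.0, 30.0))  # big gap, lead pulling away -> +2.0
```

The hard parts are everything around this loop – reliable perception, engagement decisions, and handling the cases the model never anticipated – which is exactly requirement a) above.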

6
0
Anonymous Coward

This smells like Darwin Award time. =( The question in my mind is: did Tesla loudly and clearly communicate to its customers that they are not to read, watch TV, etc. while the autopilot is engaged? If they clearly communicated what customers can expect, I see a clear case of the Darwins.

If they did not communicate, then I see a big law suit, or several, coming up.

Personally, I would never trust the autopilot enough in its current state to remove my hands from the wheel. On the other hand I'm paranoid, so I don't trust my computer and rely on backups, I don't trust the state, so I encrypt etc. etc. ;)

3
0
Silver badge

Doesn't matter if they communicated it if they don't enforce it. They have sensors that can detect whether you are holding the wheel, but don't disengage autopilot no matter how long you keep your hands off it. There's no excuse for such stupidity, and I imagine a jury will agree when the inevitable lawsuits begin.

8
1
Anonymous Coward

Hmm, I guess you are referring to US-style law, which assumes that a human being is incapable of breathing unaided, and which gave rise to warnings such as "do not let children play inside the laundry machine".

I actually do not like that legal style at all; however, your point is noted. My preference is for a legal system that assumes a minimum level of intelligence and common sense, like in the far north of Europe.

On the other hand, that legal system can be a bit pointless when the penalty is so soft that you clearly come out ahead if you break the law. I guess something between the US and northern Europe might be ideal.

2
0

Not necessarily useful

Sounds like a bad idea to turn OFF the self driving when there isn't anyone holding the wheel.

Turning it ON, with some kind of alarm would be a better option.

Anyway, this is all bass-ackwards.

If this was about safety, the human should have to drive all the time with the Autopilot kicking in only if they do something blatantly stupid (with the ability to do a conscious override, a la stability control).
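
[Ed: the enforcement several commenters are asking for – warn, then escalate, rather than letting the car carry on indefinitely hands-free – could be sketched as a trivial escalation policy. All thresholds below are invented; this is a sketch of the design idea, not any vendor's actual behaviour.]

```python
# Hypothetical hands-off-wheel escalation policy: warn, then alarm, then
# bring the car to a controlled stop. Every threshold here is invented.

def attention_action(seconds_hands_off: float) -> str:
    if seconds_hands_off < 5:
        return "none"
    if seconds_hands_off < 10:
        return "visual_warning"
    if seconds_hands_off < 15:
        return "audible_alarm"
    # Slow down, hazard lights on, pull over where possible -- never just
    # keep cruising with nobody in control.
    return "controlled_stop"

print(attention_action(3))    # none
print(attention_action(12))   # audible_alarm
print(attention_action(60))   # controlled_stop
```

The design point is the final branch: the safe fallback when attention is lost is a controlled stop, not silent continuation and not an abrupt hand-back to someone who isn't paying attention.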

4
0

It doesn't matter what they communicate. A system which allows a driver to be inattentive will cause accidents. A system which is in itself imperfect will cause accidents. Both need to be addressed for the system to be safer than a driver alone. So this is a foreseeable consequence of bad design.

An analogy might be a factory with a dangerous hydraulic machine. You could put warnings all over the machine saying not to do certain things while it's running, and someone still will, through stupidity, inattentiveness or whatever. That is why factories are required to install things like safety gates, two-handed controls, sensors etc. that automatically shut down the machine if the operator does something that puts them at risk. A car hurtling down the road at 70 mph is a dangerous machine, and safety should be treated as importantly as it is in a factory.

7
0
Silver badge

" That is why factories are required to install things safety gates, two handed controls, sensors etc."

And why employees bypass the things, then find they have no recourse when the machine amputates body parts (except that these days they do, because the factory is generally found to be negligent in allowing someone to bypass the safety mechanisms.)

0
0
TRT
Silver badge

I'm surprised they called it "Autopilot"

Because from the description it sounds like:

1) In-lane-guidance assist, which is something Toyota have had for about 3-4 years, and uses their electrically driven power steering to artificially "profile" the road (changing lanes feels like you're steering over a 6 inch high ridge where the lane dividers are - if you don't actively steer, it feels like the road is pushing the car into the bend).

2) Collision-avoidance, which Volvo and others have had for around 5-6 years as a front-facing feature, and the side/rear-collision detection has been on high-end cars for around 2-3 years.

3) Adaptive cruise-control, which Toyota have again had for 3-4 years, which is supposed to maintain distance to the vehicle ahead whilst respecting an upper speed limit.

You can't just keep piling driver assistance features on top of each other and expecting it to one day magically start working as something that can drive the car.

8
0
Anonymous Coward

Hands off, foot off.

Why doesn't the car start to decelerate if the hands are off the wheel? That would still be safer than many other drivers manage.

I watched a vlog, timelapse in car and the driver was overtaking other vehicles (multi-lane highway) with a phone in one hand (+ charging lead) and a lollipop in the other, steering with her knee.

I respect the vlogger and I bet his girlfriend is nice too but FFS, that is the sort of thing that would have me walk.

Also been in the back seat of a Jeep when the driver dropped his fag (UK = cigarette); both he and his GF (front seat) bent down to look for it so it didn't burn the carpet, leaving just me watching the road and wondering how to word something.

Automation is far behind the human mind, it would never think to do those things ;-)

5
0
WTF?

Beta

Just how have they managed to get a car on the road running beta software that directly relates to the safety of the occupants and, more importantly, others? This is either some very good lobbying, or simply the regulations have not caught up. The other possibility is that because it is electric and seen as "IT" and techie, like everything else, current regulations do not apply.

Can you really see Ford, Honda, Toyota, VW etc doing this?

4
0
Anonymous Coward

Re: Beta

"Can you really see Ford, Honda, Toyota, VW etc doing this?"

Toyota don't seem to have had a problem shipping control systems that aren't fit for purpose. There have been deaths, and court cases leading to billion dollar penalties, in the USA.

VW's ability to do software and systems wrong may not have killed drivers or passengers (yet) but their ECU-fiddling has now become relatively visible.

Who knows what we'll find out about Ford, Honda, and others.

For probably the best documented example to date (Toyota), look up (e.g.) "uncommanded acceleration".

Places to start include:

http://www.eetimes.com/document.asp?doc_id=1319903 25 Oct 2013 [1]

http://www.eetimes.com/document.asp?doc_id=1319966 31 Oct 2013

http://www.eetimes.com/document.asp?doc_id=1321734 1 Apr 2014

https://users.ece.cmu.edu/~koopman/pubs/koopman14_toyota_ua_slides.pdf 28 Sep 2014, Prof Phil Koopman (expert witness at the Toyota trial)

[1]

"Could bad code kill a person? It could, and it apparently did.

The Bookout v Toyota Motor Corp. case, which blamed sudden acceleration in a Toyota Camry for a wrongful death, touches the issue directly.

This case -- one of several hundred contending that Toyota's vehicles inadvertently accelerated -- was the first in which a jury heard the plaintiffs' attorneys supporting their argument with extensive testimony from embedded systems experts. That testimony focused on Toyota's electronic throttle control system -- specifically, its source code.

The plaintiffs' attorneys closed their argument by saying that the electronics throttle control system caused the sudden acceleration of a 2005 Camry in a September 2007 accident that killed one woman and seriously injured another on an Oklahoma highway off-ramp. It wasn't loose floor mats, a sticky pedal, or driver error.

An Oklahoma judge announced that a settlement to avoid punitive damages had been reached Thursday evening. This was announced shortly after an Oklahoma County jury found Toyota liable for the crash and awarded $1.5 million of compensation to Jean Bookout, the driver, who was injured in the crash, and $1.5 million to the family of Barbara Schwarz, who died.

During the trial, embedded systems experts who reviewed Toyota's electronic throttle source code testified that they found Toyota's source code defective, and that it contains bugs -- including bugs that can cause unintended acceleration.

"We've demonstrated how as little as a single bit flip can cause the driver to lose control of the engine speed in real cars due to software malfunction that is not reliably detected by any fail-safe," Michael Barr, CTO and co-founder of Barr Group, told us in an exclusive interview. Barr served as an expert witness in this case.

A core group of seven experts, including four from Barr Group, analyzed the Toyota case. Their analysis ultimately resulted in Barr's 800-plus-page report. [continues]"
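
[Ed: the "single bit flip" failure class Barr describes can be illustrated very crudely. One classic embedded-safety mitigation is to mirror a critical variable with its bitwise complement and cross-check on every read. The sketch below is illustrative only and has nothing whatsoever to do with Toyota's actual code.]

```python
# Illustrative sketch: a critical byte stored alongside its bitwise
# complement, so a single bit flip in either copy is detected on read
# instead of being silently acted upon. NOT Toyota's code.

class MirroredByte:
    def __init__(self, value: int):
        self.value = value & 0xFF
        self.mirror = ~value & 0xFF          # one's-complement shadow copy

    def read(self) -> int:
        if (self.value ^ self.mirror) != 0xFF:
            raise RuntimeError("memory corruption detected: fail safe")
        return self.value

throttle = MirroredByte(20)                  # 20% throttle demand
print(throttle.read())                       # 20

throttle.value ^= 0x40                       # simulate a single bit flip
# An unmirrored variable would now silently read 84; with the shadow copy
# the fault is detected and the system can fall back to a safe state.
try:
    throttle.read()
except RuntimeError as e:
    print(e)
```

The testimony's point was precisely the absence of this kind of reliably detecting fail-safe around safety-critical state.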

1
0
Joke

Re: Beta

> For probably the best documented example to date (Toyota), look up (e.g.) "uncommanded acceleration".

Their marketing department did a good job of it, though – where else could the slogan "The car in front is a Toyota" have come from?

0
0
Silver badge

Re: Beta

Interesting cases, the Toyota "acceleration" ones.

Ever note how many elderly drivers were involved?

I have no skin in Toyota's game, and I am regarded as elderly by some whippersnappers hereabouts, but it occurred to me many years ago that maybe Toyota valued the public image more than the quest for public truth in these cases.

But then again, maybe not. I Was Not There.

0
0

I LOVE my Tesla Model S, and I DO use the auto-pilot feature regularly as an intelligent assist function, but do not abdicate the responsibility of having to be in control. This is the best and smartest car I have ever owned and driven and absolutely love it, and look forward to continuous software updates that bring new and advanced features all the time. This is the first car I have owned that improves the longer I own it.

But like all technology, misuse or abuse it and you're headed for trouble. This is very sophisticated technology, and it requires the user to be smart enough to work with it.

8
0

About the naming...

So the name Autopilot should be dropped because it gives a false sense of security, and beta features should be removed...

... says someone that hasn't got a clue what they are talking about. It's a self-driving feature, how can you call it anything that describes what it is without giving a false sense of security? If anything, Autopilot is probably the *least* reassuring name they can give it - after all, we all say we are doing things "on autopilot", when acting without conscious thought. And we know how well that often turns out.

As for beta – that doesn't mean it is an untested feature that is not of sufficient quality to be rolled out. It is an indicator to people not to rely entirely on the feature.

In other words, giving the "beta" label is telling people exactly what some want from changing the name – something that changing the name alone couldn't do.

1
3
Silver badge

Re: About the naming...

As for beta - that doesn't mean it is an untested feature that is not of sufficient quality to be rolled out. It is an indicator to people to not entirely rely on the feature.

Tell you what: you beta test your Tesla on a private road away from me, my family and friends until it isn't in Beta any more and we are good.

Until then, what is called for is a siren about as loud as those used at football matches to sound inside the Tesla's cabin whenever some twat takes his or her hands off the fucking wheel in traffic so he/she can take a selfie.

Anyone caught photoblogging from the backseat of their Tesla while beta-not-ready-for-prime-time-autopilot is in charge should be recycled for organ donation.

8
0

Re: About the naming...

Stevie, I think the point here is that the term "autopilot" in itself does not promise much. Look up the definition in Merriam-Webster and you will find no implied promise of any intelligence whatsoever. Basically it is a device that can be implemented with a piece of rope. Clearly there is a huge span in what the various implementations do, and there are different complexities involved. In an airplane you have the luxury of integrating with a collision avoidance system installed in all other aircraft. No such thing in a car, so you end up having to syphon similar information from a camera. I suspect that it is actually harder to implement autonomous operation (which is still <> autopilot) in a car than in an airplane. (But that is probably not relevant to the discussion at hand.)

The article's author seems to have a different interpretation of what "autopilot" means, but I think it is very relevant to question that interpretation.

OTOH I found the discussion of statistics interesting. It makes sense that people would activate e.g. cruise control in places where the traffic situation is predictable. At the same time I am worried that we (even on an IT website) tend to act like Luddites. I see similar arguments against this feature as back when people still used to discuss ABS (also an oft-misunderstood technology: ABS will not reduce braking distance on slippery surfaces, but it might help you steer while slowing down).

0
0
Vic

Re: About the naming...

In an airplane you have the luxury of integrating with a collision avoidance system installed into all other aircrafts

  • Collision avoidance is rarely integrated with the autopilot
  • Many, many aircraft have no collision avoidance mechanism but the pilot's eyes

Vic.

1
0
Silver badge

Re: About the naming... @ 9Rune5

The name "autopilot" carries with it many ideas and suppositions, including the idea of a safe method of stepping away from the controls.

I could wave a dictionary back at you pointing to the term "pilot" but that would not satisfy anyone since you seemingly believe cars should have the same testing and release philosophy as the software that made the wealth of the man behind the Tesla - cram in Teh Awsum, shove it out the door and fix it in the mix as problems arise - whereas I don't think that thinking belongs anywhere involving large chunks of metal and/or space age composites hurtling along the highways.

Yes, that includes "smart" traffic signs.

Aeroplanes have many advantages, not the least being that when autopilot is engaged the actual pilot is in clear airspace guaranteed by all sorts of backup mechanisms and laws to which all but a few insane types adhere for the public good.

But fly an aeroplane on autopilot into an area where some of that is not true and it all ends rather badly in short order. It has happened many times, once in recent years when a small jet of the Lear type (but not necessarily that marque) suffered what has been publicized as a failure of a door seal that caused everyone to pass out.

I'll pause while everyone inserts their personal favorite conspiracy theory.

The aircraft swanned across the American skies in excellent order while increasingly anxious ATC personnel attempted to contact someone without transistors in their brain. Eventually it intersected a mountain range, whereupon the need for a real pilot was suddenly proved beyond a shadow of a doubt.

0
1
Silver badge

Re: About the naming...

"ABS will not reduce braking distance on slippery surfaces"

It does, in the typical scenario (wheels lock and the driver doesn't lift their foot from the brake), and by quite a bit.

It can pull a car up slightly better than an expert driver in a non-ABS car, but far more important is that it can pull the car up far better than most drivers can achieve AND won't result in the car spinning if the surfaces under the left/right sides of the vehicle differ in their levels of grip (eg, the side of the road, or one set of wheels on paint) – this is one that even expert drivers have trouble avoiding.

https://www.youtube.com/watch?v=mKiTAcXK6M4 - 3:41

It's probably saved more lives under these kinds of circumstances than anything else.

The steering part is a bonus, but no matter how you try to spin it, on slippery surfaces it does extend the stopping distance.
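
[Ed: the braking-distance claim follows from basic friction physics: a sliding (locked) tyre works at the lower kinetic friction coefficient, while a rolling tyre held near the limit works at the higher peak coefficient. A quick sketch with illustrative textbook values for dry asphalt; coefficients vary hugely by surface, which is exactly why ABS can lose out on gravel or snow.]

```python
# Stopping distance under constant deceleration: d = v^2 / (2 * mu * g).
# Friction coefficients are illustrative textbook values for dry asphalt.

G = 9.81  # m/s^2

def stopping_distance(speed_ms: float, mu: float) -> float:
    return speed_ms ** 2 / (2 * mu * G)

v = 30.0  # m/s, roughly 108 km/h
locked = stopping_distance(v, 0.7)    # wheels locked: kinetic friction, no steering
at_limit = stopping_distance(v, 0.9)  # held at the grip limit (what ABS approximates)

print(round(locked, 1), round(at_limit, 1))   # 65.5 51.0
```

On dry tarmac the locked-wheel stop is both longer and unsteerable; on loose surfaces the coefficients can reverse (a locked wheel digs in), which matches the commenter's caveat about ABS and stopping distance.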

0
0

Fundamentally flawed.

Either it needs to be able to deal with any and all situations that might arise, or it's not ready. This idea that people are going to drive for hours on a motorway with their hands and feet hovering near the controls, ready to take over at any stage, and not fall asleep or get totally distracted, is nonsense. It's a manufacturer cop-out to try and sidestep responsibility.

I can see how some people would go for full automation, but this half-assed sort-of automation is just asking for problems.

To the people who keep parroting the line about them already being safer: let's see the stats breakdown on that one before buying the marketing. But in addition, that's almost not the point. When you have a crash whilst driving your car, you had some skin in the game. Quite literally your life. The guy who wrote the bad update code that causes the autonomous car you were riding in to crash would doubtless feel terrible about you being killed, then he would go on with his life. There may not even be a fine.

The distinction here is that by putting your life on the line, you buy in a certain amount of trust from the other road users. You have as much to lose as them. The guy 9-5ing it in a software house on the other side of the world who messed up and killed you and other road users hasn't "bought-in". He needs to be held to a higher standard with rigorous testing.

Here's another interesting thought experiment. Say we assume that the "AutoPilot" software is good enough. Then say we pass a law stipulating that if the deaths per mile with AutoPilot exceed those of regular drivers, the entire development and testing team are executed. Now they have some skin in the game. Do you think the testing regime is going to stay the same, or get much more thorough? If you think it will get more thorough, then was the software really good enough? Good enough for strangers maybe, but not for them!

5
0
Facepalm

Planes have autopilot - doesn't mean they don't have pilots.

Commercial jetliners have had autopilot for decades. That doesn't mean they don't have a pilot and a co-pilot as well.

From CNBC: "The autopilot system relies on a series of sensors around the aircraft that pick up information like speed, altitude and turbulence. That data are ingested into the computer, which then makes the necessary changes. Basically, it can do almost everything a pilot can do. Key phrase: almost everything."

Anyone who drives a car with "autopilot" and expects it to do everything is an ignoramus and a fool.

1
1
Silver badge
Pint

Re: Planes have autopilot - doesn't mean they don't have pilots.

At the outset, Airbus took a different approach with the entire Human Factors thing. Over the years, quite a few Airbus have been in perfect mechanical condition in the millisecond before impact.

Somebody should make a plot of the rate of 'perfect mechanical condition' versus 'seriously broken or on fire' (both a millisecond before impact) for the various brands of aircraft, plotted against year. Based on my observation of crashes over the years, there would be something of note in the historical data.

Tesla is following in Airbus' footsteps. The 7pm news shows are going to have a regular 'Self Driving Car Crash of the Week' segment.

Hopefully the NHTSA or DOT will shut down this 'experiment' until the regulations mature.

1
1
Silver badge

Re: Planes have autopilot - doesn't mean they don't have pilots.

"Over the years, quite a few Airbus have been in perfect mechanical condition in the millisecond before impact."

Not just Airbus. The same accusation can be levelled at Boeing and McD.

Pilot error (CFIT) has been the prime contributory factor in almost all air crashes in the last 40 years. The few where it hasn't been have been all the more newsworthy because it wasn't pilot error.

(This is one of the reasons why airlines don't hire ex-military pilots anymore. They tend to keep trying to push on regardless when everyone else goes around or gives up and diverts.)

0
0
Silver badge
Pint

Gives new meaning to the term 'Crash Report'...

Dear Tesla,

It drove into the side of a truck.

It drove into the side of a truck.

It drove into the side of a truck.

...

Cheers.

P.S. 'A.I. is hard.' (<- never forget that.)

0
0
Anonymous Coward

Re: Gives new meaning to the term 'Crash Report'...

The truck cut the driver up, and the AI wasn't even switched on at the time.

0
0
Silver badge
Pint

Fermat's Last Theorem (repost)

Remember when Andrew Wiles finally proved Fermat's Last Theorem? The final proof turned out to be just 'a bit' (LOL) more difficult than had originally been imagined by Fermat. Orders of magnitude more complicated.

Self-Driving Cars are probably going to be quite similar. When they finally get one actually finished, one that successfully stays out of the headlines and avoids contributing to 'interesting' tragedies, they'll look back over the intervening decade(s) and then laugh at the vast ratio between their final fully debugged system, and the 2016-era trivial kits that some had imagined would be sufficient.

3
0

Customers who use autopilot for more than 2 minutes are complete idiots anyway.

0
1
Meh

"Drivers are not your guinea pigs"

It may be uncomfortable to hear it, but it's worth killing a few hundred, maybe a few thousand, people if it brings the enormous economic advantage of self-driving vehicles forward by a few years. Of course, you may need to do your testing in countries where life is cheaper.

0
0
Anonymous Coward

I use this feature every day and I think Autopilot is a perfect description of its features. Just like autopilot on a plane, it doesn't mean the pilot can go to sleep or do something else. You still have to pay attention to the road. Is it perfect? No. I see it drift out of its lane and I see it has problems cresting a hill, but if you are paying attention, it does make driving easier. I'm only correcting the car when these things happen. I'm certainly not going to go to sleep or start reading a book. By far it works best in stop-and-go traffic. By the time I get home, I'm not feeling tired from driving. Long trips don't drain me. I love it, flaws and all.

0
0
