Bus vs. meat bag
Here in FL, us meat bags have learned that buses do not give an inch or a f**k. And our mass transit systems are a joke. Avoid at all costs...
Alphabet has filed an accident report with the California Department of Motor Vehicles, in which it says one of its autonomous cars had a low-speed bingle with a bendy-bus. The report says its self-driving kitted-up Lexus found itself baulked by sandbags at an intersection. The car “was travelling in autonomous mode eastbound …
"National Express on the M4 are prone to sitting 6 feet off your tail at 50mph in bumper-to-bumper traffic as it hits the elevated section."
A quick squirt of your screen wash. Watch for their wipers going to wipe your screen wash off their windscreen. Repeat ad nauseam. Sometimes they get the message.
Or make a note of their number and send a written complaint to National Express. Tailgating is now a specific offence with fines and points for drivers.
"Google's test operator made the judgement that the bus was probably going to give the car space to reverse."
Likewise here in London. Buses own the road, or at least they like to think they do. It's hard to argue with seven tons of red steel. What amazes me is that anyone would think that a bus would give way in the first place. Same goes for large trucks, rubbish collection vehicles, indeed anything big. It is survival of the heaviest.
"It's hard to argue with seven tons of red steel"
And the rest. The little buses are 6-7 tons, the double deckers are 11-12. The problem, however, becomes
more obvious when you drive a large vehicle (FTR I have an HGV class 1 license) in a city like London with narrow streets (this probably doesn't apply in the USA). Pulling over into a space to let traffic coming the other way pass usually isn't an option; reversing never is on a public road. So I'm afraid you have to somewhat bully your way ahead, otherwise matey boy in his Mondeo will take the initiative and you both get stuck when he realises you can't get out of the way and he now can't reverse because 5 other muppets followed him.
"So I'm afraid you have to somewhat bully your way ahead, otherwise matey boy in his Mondeo will take the initiative and you both get stuck when he realises you can't get out of the way and he now can't reverse because 5 other muppets followed him."
Upvote. I also hold all licenses, which has the nice side effect that I recognise an HGV's situation and give way so they can progress. Not because I feel bullied (emotions are unhelpful when driving), but quite simply because it's the best way forward for everyone - when traffic continues to flow, everyone wins, there's less stress, and the HGV driver didn't choose to be there, they're simply trying to do their job.
There is, naturally, the remaining problem of idiots who think you just stopped for fun and try to get past. It's quite entertaining to watch when they come level with you and suddenly realise what they've got themselves into and no, I am happy where I stopped, I'm not moving. I just watch karma being a biatch :).
> I also hold all licenses, which has the nice side effect that I recognise an HGV's
I don't - but I'm reasonably competent as a driver (modesty not my strongest attribute :-) ) and am well aware of the space that big vehicles (buses, trucks et al.) need. Hence: give them space at junctions, give way to them if needed, and recognise that they either have a schedule to follow or are on a limited driving time due to a tacho.
Doesn't mean I don't get irritated with the driver of a truck overtaking another truck with a difference of 1mph on a dual-carriageway though.
Car and motorbike - both passed first time. I've driven all sorts of other stuff as allowed by my OldFart version of driving license. Never driven a road-roller though.
Highway Code Rule 223
Buses, coaches and trams. Give priority to these vehicles when you can do so safely, especially when they signal to pull away from stops. Look out for people getting off a bus or tram and crossing the road.
The reason buses assume they have right of way is that they do. Other road users should always give priority (i.e. cede right of way) to buses, coaches and trams. People really shouldn't be driving without knowing this!
Read that again and _try_ to comprehend the difference between giving up right of way to a bus that is trying to leave a stop, and _all other circumstances_.
Then read it again, and understand that it is saying that other road users _should_ (not MUST) _give up_ _their_ right-of-way...
No stationary vehicle, not even a bus, _ever_ has right of way over a moving vehicle.
"No stationary vehicle, not even a bus, _ever_ has right of way over a moving vehicle."
In quite a few countries, a stationary public transport bus that starts to indicate must be given priority if stopping is possible. Some sanity applies, of course, but if you're in a position to stop when a bus starts indicating in order to pull out of a bus stop and you don't, you can be fined.
Indicating? You mean those flashy lights on the corner of vehicles that either never seem to work or are flashing for six miles before the operator decides to turn or change lanes and even then half the time it's in the opposite direction?
You're confusing this with optional BMW accessories - this is on a bus :).
Not just Florida. As a cyclist on a roundabout, I got deliberately sideswiped many years ago by a bus driver barging his way through.
Things got so bad in that town that the cops stationed themselves on one side of an intersection one day and ticketed 30 bus drivers in an afternoon for refusing to give way to cyclists.
"refusing to give way to cyclists."
In Denmark there are about 10-15 nasty fatalities every year, with cyclists on the right side of a bus or lorry going under the wheels when the bus/lorry turns right.
Even though the cyclists are "in the right, legally", they should still stay the hell away from large vehicles!
The speculation is that women - who are the majority of the fatalities - simply trust more that others will follow the rules than men do.
This is why the future might actually be maglev pods on a rail above traffic (and pedestrians). The pods have a clearly marked right of way which is much easier for a computer to deal with. Plus it takes commuters out of competition for space with other ground based transport (trucks, bicycles etc).
"115 years ago. And it still works."
Until it doesn't. At which point extracting passengers requires a fire engine with a ladder. It's happened on multiple occasions.
Going back about the same period (and more practical), elevated moving pavements existed, and should be reconsidered.
The idea of creating an elevated platform for trains is indeed not new. Bangkok used to be a total pain to travel in until they installed the BTS Skytrain (whose construction took quite some time and didn't exactly *help* traffic during that period :) ).
It was worth it, though, best idea ever.
"In fact, somebody already did - 115 years ago. And it still works."
Wow! I forgot about that. I travelled on it many years ago when I was there. Including a special trip on the Kaiserwagon(??). Somewhere I have a first day cover stamp presentation set we got given at the time showing the Schwebebahn 75 year anniversary(??)
Wow, Ok 4 thumbs down. I don't really see why that comment was annoying people.
Perhaps those replying to point out that a SkyTran-like system had already been done, and had failed, were annoyed that I had failed to realise this?
But the difference between the SkyTran PRT and those elevated hanging German trains, or the Ultra Pods at Heathrow Airport, or the Morgantown PRT (https://en.wikipedia.org/wiki/Morgantown_Personal_Rapid_Transit) is that all those systems require a costly elevated guideway or set of rails. Because the SkyTran is light and uses a single rail, its infrastructure is much cheaper than those other systems.
Other things that I think the SkyTran has going for it are that a single silver rail in the sky isn't the eyesore that an elevated concrete track is. Also because it is maglev the SkyTran can travel at 270 kph.
As Fortune noted, the SkyTran might succeed because it is really, really cheap:
> "The vehicle's test operator thought the bus would stop..."
Okay, but why didn't the vehicle sense the impending crash and halt in time? I thought that was one of the main reasons for autonomous cars, to help prevent 'human error' accidents. If a big bus is bearing down on the left side from behind at 15mph, a prudent automaton would halt and assess, not just swing out into the path of the bus.
You can bet Google's programmers will be going through that vehicle with a scanning tunnelling electron microscope to find the answers to your question.
I'm actually quite impressed; if a minor bit of boof-tinkle-tinkle, of the sort that happens every day between meatbag drivers, like this is newsworthy, the driverless cars must be doing something right. Especially given that the technology is in its infancy, it's amazing that nothing has gone seriously wrong, to the point where even a little fender-bender like that makes the news!
Yeah, but Google keeps selling it as a crash-proof solution, which it is not. If the human in the car was wrong, how come the beautiful software written by Google chaps did not prevent the incident? Let's be clear on this: the human did not give the order to advance, he was only supposed to activate emergency braking in case the software was dumb.
> "...if a minor bit of boof-tinkle-tinkle, of the sort that happens every day between meatbag drivers..."
The average human driver almost never crashes. How many of these auto-cars are there? Not many at all, and already there's an accident attributed to one. Also they have existed for only a short time. I would suspect that so far the crashes-per-mile ratio greatly favors human drivers.
Okay, one data point doesn't make a trend, but it is troubling.
"The average human driver almost never crashes. How many of these auto-cars are there? Not many at all, and already there's an accident attributed to one. Also they have existed for only a short time. I would suspect that so far the crashes-per-mile ratio greatly favors human drivers"
More than one accident attributed to Google cars apparently:
"Between September 2014 and November 2015, Google’s autonomous vehicles in California experienced 272 failures and would have crashed at least 13 times if their human test drivers had not intervened, according to a document filed by Google with the California Department of Motor Vehicles (DMV).
When California started handing out permits for the testing of self-driving cars on public roads, it had just a few conditions. One was that manufacturers record and report every “disengagement”: incidents when a human safety driver had to take control of a vehicle for safety reasons.
Google lobbied hard against the rule. Ron Medford, director of safety for the company’s self-driving car project, wrote at the time: “This data does not provide an effective barometer of vehicle safety. During testing most disengages occur for benign reasons, not to avoid an accident.”
The first annual reports were due on 1 January, and Google is the first company to share its data publicly. The figures show that during the 14-month period, 49 Google self-driving cars racked up over 424,000 autonomous miles and suffered 341 disengagements, when either the cars unexpectedly handed control back to their test drivers, or the drivers intervened of their own accord. The reports include both Google’s own prototype “Koala” cars and its fleet of modified Lexus RX450h."
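A quick sanity check on those quoted figures (a sketch, taking the numbers above at face value; the split between car-initiated and driver-initiated disengagements is not broken out here):

```python
# Figures quoted from the DMV filing above.
autonomous_miles = 424_000
disengagements = 341   # both car-initiated and driver-initiated
would_be_crashes = 13  # crashes averted only by driver intervention

# Roughly one disengagement every ~1,240 miles...
miles_per_disengagement = autonomous_miles / disengagements

# ...and one averted crash every ~32,600 miles.
miles_per_averted_crash = autonomous_miles / would_be_crashes

print(round(miles_per_disengagement), round(miles_per_averted_crash))
```

Which puts some scale on Google's "most disengages occur for benign reasons" argument: only about 1 in 26 of those disengagements was crash-avoiding.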
@ Big John
Quote: "The average human driver almost never crashes."
It's about once every 10 years on average (in the USA) for a human driver to have a crash.
Quote: "I would suspect that so far the crashes-per-mile ratio greatly favors human drivers."
Er, nope, and not for some time now.
In the USA, a human driver has a crash about every ~165,000 miles driven. (10 years per crash multiplied by average miles driven per year in the USA).
In 2012, the Google cars had done over 300,000 miles on average without an incident. So were already doing better than humans (for safety) back then.
The Google cars hit 700,000+ miles on average without an incident last November (2015).
So with those figures, the current Google cars are around 4-5 times less likely to have an accident per mile than a human driver.
Also bear in mind this is basically with beta software, that is still being continuously developed and improved.
Another year or two, and this average between accidents for the Google cars is probably going to be 1,000,000+ miles.
Looking into the future a little bit....
If we assume an average (USA) driver will drive for 60 years in their lifetime, perhaps a little longer, then that's around ~990,000 miles driven in total, and with an average of 6 accidents over that time. (Based on current USA stats).
If Google get to 1,000,000 miles on average between accidents (and at 700,000+ last year, I see no reason why they can't do this within the next 1-2 years), then that means statistically, a Google car could drive more mileage than the average human USA driver covers in their entire life, but without a single accident.
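The arithmetic above can be checked in a few lines (a back-of-envelope sketch using the commenter's own assumptions - 16,500 miles/year, one crash per decade, a 60-year driving lifetime - not official statistics):

```python
# Assumptions as stated in the comment above (not official figures).
MILES_PER_YEAR = 16_500    # assumed average annual mileage, USA
YEARS_PER_CRASH = 10       # one crash per decade per driver
DRIVING_YEARS = 60         # assumed driving lifetime

miles_per_crash = MILES_PER_YEAR * YEARS_PER_CRASH    # 165,000 miles
lifetime_miles = MILES_PER_YEAR * DRIVING_YEARS       # 990,000 miles
lifetime_crashes = lifetime_miles / miles_per_crash   # 6 crashes

# Google's quoted 700,000+ miles per incident (Nov 2015) vs a human:
GOOGLE_MILES_PER_INCIDENT = 700_000
ratio = GOOGLE_MILES_PER_INCIDENT / miles_per_crash   # the "4-5 times"

print(miles_per_crash, lifetime_miles, lifetime_crashes, round(ratio, 1))
```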
The Google car will still get murdered on the Autobahn - unless Google can program some sense of self preservation and justified paranoia into its robot brain:
1) BMW, Audi and Porsche RULE the left-hand lanes and they are doing about 260 km/h. Average speed. Pull out to overtake on a totally clear road and one of these will be right up your arse, burning rubber.
2) Everybody else thinks they have Schumacher's reflexes, brakes and driving skills, so the right lane has about 3-5 meters between cars. Keep a comfortable distance and some wanker will squeeze in and close the gap.
3) Then we have Latvian and Polish HGV-drivers - "White Van Man" on 'Roids, Russian Quality Roids, coming up from behind at a good clip any time traffic slows a bit.
German police can clean up the carnage and get traffic going again in less than 20 minutes. They have decades of practice.
The question isn't whether they have more or fewer accidents at this point, it's whether they are capable of driving in a variety of road conditions. People encounter changing conditions all the time whether it be heavy fog, rain, snow, etc.
As of 2014 the Google car hadn't driven in rain or snow, so it's a little unfair to make a comparison at this early stage: the autonomous vehicles can cherry-pick their driving conditions, while the human operators are unable to be choosy, as it may mean drive to work or lose your job. Sure, one can call in sick one day, but it's difficult if not impossible to tell your boss that you can only come to work on nice days.
The Tech Review article linked above also questions whether an autonomous vehicle can identify the hand signals given by a traffic officer, as one may find at the site of an emergency or construction. Further, we don't know how much of the autonomous driving happens during such things as commute hours on the freeway, when the seeming majority of accidents occur.
It's about once every 10 years on average (in the USA) for a human driver to have a crash.
I doubt that figure is accurate if you include minor scrapes which are not reported. With insurance excess and loss of NCB, damage needs to be fairly expensive before the average driver will bother reporting or making a claim.
"I'm actually quite impressed; if a minor bit of boof-tinkle-tinkle, of the sort that happens every day between meatbag drivers, like this is newsworthy, the driverless cars must be doing something right"
Right. Because there are as many autonomous cars out there as meatbag driven ones.
What I wonder about is why they do not automate trains. At least those are already more or less remote-controlled by the signals and all.
"why they do not automate trains"
Some of the subways in Nürnberg, Germany, are driverless.
I think they have been running for almost 5 years. I haven't been on one myself, yet.
Can't recall any accidents with them.
Plenty of airports also have automated trains between terminals, e.g. Frankfurt, Paris CDG, Denver, London Stansted.....
"why they do not automate trains"
Like BART (San Francisco) or the DLR (London), and several others?
The attitude of various unions ("over our dead bodies") has a lot to do with it.
Also, outside of a completely walled-in urban transit system, the possibility of unsignalled objects on the line is very real. It's only recently that robotic vision has reached the point where it can apply the brakes in such circumstances (landslides, fallen trees, cars barrier-dodging level crossings, kids playing silly games ....) and it is still(?) beyond the wit of a robot to get out of the cab to investigate the object it has stopped for, which turns out to be a large empty cardboard box blown on the wind, or to not stop at all for what is clearly a wind-inflated carrier bag snagged on trackside shrubbery.
Back to self-drive cars: I wonder when they'll start testing them on single-track, wiggly, deeply potholed rural roads where there are riders on horses to be aware of.
A little known advantage of the Nuremberg system is that the interval between trains can be almost halved compared to meat bag driven ones. That the trains are operating on _normal_ tracks and can run mixed mode with human operated ones is rather unique. There is also more flexibility for scheduling depending on demand.
Looking out of the front windows is very nice (I wish they'd switch on the head lights all the time, but they are off almost all the time).
As GrumpenKraut says, ATO (Automatic Train Operation) lets you get trains much closer together than meat bags do. It's going to have to be used for parts of Crossrail - where it is complicated by the need to hand off between fleshy drivers (for the rest of the network, which doesn't have it) and ATO, to make it possible to run trains as close together as they would like.
Currently only metro systems tend to use it, for the reasons outlined by Nigel 11, though it is also being strongly considered for HS2 - a Human driver couldn't stop for obstructions at that speed even if they saw them and it will again be a completely closed system.
"it's amazing that nothing has gone seriously wrong, to the point where even a little fender-bender like that makes the news!"
That's why there's a human operator in there. They take over when necessary. I think there may be an El Reg article somewhere which references the number of times the "driver" had to take over.
Simplest explanations often being the most plausible, I would hazard a guess that Google's engineers have spent five nines of the time working on the car going forward. Unless they've had the car reversing round a test track Le Mans style, the cars have probably had insufficient training in correct responses when reversing. How often do we actually reverse in real life? Probably at most 30 seconds per trip, when either parking or departing.
Unless they've had the car reversing round a test track le-mans style the cars probably have had insufficient training in correct responses when reversing
I would have thought that the responses would either be exactly the same or a mirror-image, and so, unlike a human driver, need no special training.
If you live in a city where bendy buses are common, you'll also know that other drivers don't always understand that they need space to articulate, especially around tight bends.
Now imagine a 54-foot-long, bright blue bendy in the bus lane, stopped at the lights and preparing to turn left, and a small hatchback in the right-hand lane next to it. It's early evening, conditions are dry and visibility is good.
The lights changed to green, the bendy turned left, then right on St Mary Street. The hatchback also attempted to turn left (when it should only have turned right on to Wood Street at that junction) and accelerated...
The bus was all scratched up on the offside and a couple of panels were bent up. No major damage, but it had to be taken out of service to be checked over. The nearside doors of the hatchback were badly buckled - fortunately there were no passengers in the car.
That area was well-covered by CCTV, the bendy had something like a dozen external/internal cameras, GPS tracking and a whole bunch of sensors feeding into a black-box style recorder. Let's just say the hatchback driver didn't have a leg to stand on.
To be fair, that's a Welsh bus. Instead of concentrating on driving, I'd probably be looking at the side and wondering how to pronounce that seemingly random collection of vowels and consonants, and whether "Cysylltu pobl a Chymunedau" is Welsh for "C'thulu is my co-driver".
Here's another interesting one: https://www.google.co.uk/maps/place/Northumberland+St,+Huddersfield,+West+Yorkshire+HD1email@example.com,-1.7805395,3a,66.8y,75.07h,73.98t/data=!3m4!1e1!3m2!1sUmncoXKNzFGnjA-AhaYgog!2e0!4m2!3m1!1s0x487bdc6d76aee929:0x84feda5b66e64326?hl=en
Bus (non-bendy, but see below) occupying the left-hand lane opposite the side street, waiting for the lights to change at the next junction, with me behind. Smallish Volvo (i.e. smallish in Volvo terms) exiting the street on the right (it wasn't one-way back then), absolutely determined to butt in behind the bus, crept forward until the rubber strip on his bumper was resting on the side of the bus halfway along.
What I hadn't noticed until the bus started to move was that he'd actually pushed in what must have been an unsupported panel in the skirt of the bus. I watched, fascinated, as the bus eased forward. The panel yielded as it came to the bumper, then sprang out again after it got past, so a depression an inch or so deep ran past the bumper until the leading edge of the wheel arch cleared the bumper.
A second or so later the trailing edge of the wheel arch hit the bumper and tore it clean off. The bus drove on. I drove round the back of the Volvo into the side street and left the muppet, his wife, their car and its bumper to sort it out on their own.
However, The Register notes, Google's test operator made the judgement that the bus was probably going to give the car space to reverse. So a human factor was at play here, not just a failure of machine thinking.
The only thing the human did was not override autonomous mode.
You're right. The author is bending over backwards trying to find a reason to excuse the failure. The section you quoted is simply ridiculous.
Furthermore, why was the idiot car backing up? "Ooh, there's some obstacle in my path. I've stopped too close. Now I need to back up and crash backwards into an oncoming bus."
We'd better get used to this.
But I find it interesting that the test operator didn't override because he and the machine reached the exact same conclusion:
"Bigger, faster vehicle will obviously let smaller, slower vehicle pull in front." (I wouldn't bet my skin on that!)
Possibly a bit of Google bias (Me Important!) slipping through the programming?
In Krautistan there is a saying that a Mercedes comes with built in right-of-way... maybe G00gle just needs to switch car brands to fix this issue.
Edit: was reminded of a Jasper Carrot routine about his mother in law's driving skills - apparently she always checked the mirror prior to driving off, but never made the connection between checking the mirror and not driving off while another vehicle was approaching.
It's not clear the author did more than skim it.
There's nowhere that says the car was "backing up." The car was "moving back" into the center of the lane.
Take a look at the StreetView for this intersection, heading East on El Camino Real as described in the accident report: https://goo.gl/maps/ibea7E9dMFv
That's a wide lane, enough for two cars if they're small. The Google AV moved to the right side of the lane in anticipation of the turn, and possibly to get around other traffic. This is a very common move for meatbag drivers, though certainly questionable for an AV. Said AV encounters random sandbags blocking progress. AV waits for other drivers to pass, out of caution. When the area appears clear, AV begins moving toward the center of the lane (presumably forward) at 2 MPH to get around the sandbags. Meatbag bus barrels down the road at a comparatively quick 15 MPH and does not yield to the vehicle in front of it within its lane. There is a collision, and the report describes it as the Google AV making contact with the bus.
The damage to the left front wheel indicates this collision may have been moderately more severe than a "fender bender."
Let's review what we've learned from this:
Richard Chirgwin needs to read his source material before posting.
El Reg needs to do a quick fact-check based on the simple, provided source material.
There were questionable decisions on the part of both the AV (2, by my count) and the bus driver meatbag (1, but rather egregious, by my count).
This was a low-speed collision with no injuries, thank God.
Google needs more time to improve its AV fleet safety before mass adoption. I, for one, am glad we have a sensible system in place for developing and testing these technologies.
EDIT: Worth thanking El Reg and Chirgwin for providing the sauce to begin with. Without sauce, I would not have been able to fact-check his statements and enjoy my delicious Righteous sandwich.
"Google needs more time to improve its AV fleet safety before mass adoption. I, for one, am glad we have a sensible system in place for developing and testing these technologies."
Maybe Google needs to talk to these guys, considering they already have buses out and about, and West Oz is getting one: http://navya.tech/?lang=en
"If a minor bit of boof-tinkle-tinkle, of the sort that happens every day between meatbag drivers, like this is newsworthy, the driverless cars must be doing something right."
Well... these cars all have a driver who is supposed to take over (and apparently does fairly regularly) whenever they think the car is going to crash. Given this, the car software's flawless (up to now) driving record is completely unsurprising. After all, you could have a post-pub-crawl BOFH (or PFY) driving your car without worry if a second, sober driver was automatically going to take over as soon as the BOFH started aiming for the trees ("it appeared out of nowhere!"). That said, I doubt the car's behaviour is too bad or someone would have mentioned it by now.
To be honest, hopefully this will provide good data for Google -- it sounds downright dicey to me for a car to stop dead in a traffic lane and then GO INTO REVERSE just because of a few cones. That is when you stop, turn on the turn signal, and either wait for traffic to clear or (if it's not going to) wait for a good enough gap in traffic and go for it. I wonder if the software just didn't notice the cones in time, if the hardware couldn't see them (and Google found the car needs a sensor aimed lower or something), or if the software had just assumed (up to this point) ONLY cones in a "this lane is closed" configuration, as opposed to a few blocking off a small bit of road.
If a cone or two is enough to make the current software behave like this, I wouldn't want to get in a Google car here in the midwest. In the midwestern US you'll find potholes bad enough (luckily not too many) to risk destroying rims or suspension if you go straight through them (I've recently developed a nice rear-end noise which I think is a broken rear stabilizer link...); cones blocking off maybe a foot or two of roadway (so they can patch said potholes, in between times when they close a whole lane or two to repave); and what look like straw-filled rolls shoved into the storm drains (but sticking onto the road several inches) that mean you must go a few inches out to go around them. And, generally, road markings that are totally worn off the road, so hopefully it doesn't (for example) rely on lane markings to stay in a lane or the like.
Don't get me wrong... I'm more positive on these than, say, Jeremy Clarkson; but I do think it's possible the difficulty of this is being underestimated. This may be one of those situations where software implementing typical driving rules covers 99% of the drive, but there are so many different "remaining 1%" situations that it could take more code to handle them than to handle the main drive.
The "expected" gets handled well.
The "expected unexpected" gets handled pretty well too.
It is the "unexpected unexpected" where automation fails.
Part of the reason the meatsack got it wrong is that mixing a meatsack and autonomous control creates some interesting opportunities for confusion. If the meatsack thinks the controller will handle the situation OK, he is likely to just leave it be. It is only afterwards that he finds he was wrong.
The biggest issue with all this is that the "control surface" gets confused. This is pretty much exactly what caused the AF447 crash:
Three pilots in the cockpit, one of whom was yanking back on the stick because he knew that Airbuses don't stall and will handle the stall for you. However, stall protection was disengaged.
Three pilots in the cockpit as it slowly fell in a deep stall from 38,000 ft to 0 ft with the stall alarm yelling "stall stall.... stall stall..." until it got tired. But we all know Airbuses don't stall.... so just ignore it.
All they needed to do was push the stick six inches forward and 228 people would not have died. In a manual plane they would have tried to handle the stall, but in this case they did not.
Same thing's going to happen with autonomous vehicles (and did).
It is the "unexpected unexpected" where automation fails.
Which would be a very good point, were it not that meatbag drivers are also most likely to get it wrong in this sort of event.
A good driver or pilot will analyze what happened in the run-up to the "unexpected unexpected" event, be that a crash or (far more often) a near miss. Did he do something that ate into his margin for error? Did the other party do something of that nature? Or was it truly a case best summed up as "shit happens".
The accident rate will never be reduced to zero. If the average robot is better than the average meatsack, that ought to be good enough. Especially so if they are better at avoiding the serious and fatal accidents that insurance-average teenagers are noticeably bad at avoiding.
They are aware of that difficult relationship between increasing automation and driver disengagement.
I found this TED talk on that subject really interesting.
That's exactly why the google cars are intended to be fully self-driving. There won't be a back up set of controls for the human in the production version because they know that handing control back and forth between the computer and the human is not going to work. The human is expected to be blind, unqualified to drive, drunk or asleep.
The Google car tried to create two lanes of traffic where the road described only one.
- This was because of the weird turn-right-on-red rule; it was probably indicating right.
It then got stuck behind a stationary obstruction - the lane wasn't as wide as it looked.
When traffic started flowing it then went to merge with the traffic as a gap appeared.
The bus driver, in the same lane, drove their bus into the GooCar.
I suggest that the bus driver may have thought the Lexus was parked/parking if the right indicator was still on - otherwise they should not have been trying to pass a vehicle in the same lane. Ideally the Lexus would have been indicating left at this point, of course.
There are cars parked along that road in the streetview linked by a previous commentard.
As for their safety record - they have driven far further than the average driver ever will - and each time any one of them encounters an 'interesting' situation they *all* learn from it...
That's far better than the current situation...
After all how many people get killed by drivers each year?
30 thousand a year in the USA, another 2k/year in the UK
That's not a particularly high bar to exceed - and the advantage is that the "drivers" will get better over time, they won't develop sloppy habits, get tired, read an SMS, be cross with the kids in the back. They don't have "blind spots", they don't have tempers, they don't start sneezing.
They don't have "blind spots", they don't have tempers, they don't start sneezing.
Good points. Fresh from the factory, I think you're right on the above for the most part. But some vehicles have issues even fresh from the factory, and as the car ages, sensors fail and hardware gets iffy.
OTOH, there might be some AI to monitor all this at some point and reliably predict failures on a given vehicle.
And the failures are where self driving cars can really come into their own. They can and should have redundant systems, so they can simply drive themselves to a local service centre and get a replacement fitted - or, if they are seriously compromised, just stay put.
As opposed to the small minority of cars driving around at the moment with failed lights - their drivers know they have failed lights, they just don't care.
They will learn, adapt and in time drive those roads better than either of us ever could.
I like driving, but I am rather excitedly looking forward to the day when I can leave the driving to an AI (of sorts): go to see a mate, hang out, have a few beers, get in the car, be driven home. Awesome.
Go to a concert - no parking? That's OK - get dropped off in front of the gig, the car goes and parks itself further away (maybe in a designated parking area just outside the city) where there are actually parking spaces, and comes and picks me up once the gig is over.
I just hope the tech matures before I'm too old to appreciate it :D
"They will learn, adapt and in time drive those roads better than either of us ever could."
They'll have a lot to learn. The sheep might not be in the road yet. Given that they can recognise a sheep, can they still recognise it when it's standing on top of a wall? Can they recognise that the sheep on the wall is facing the road? Can they recognise that it's getting ready to jump, and that when it does it will already be too late to brake, so brake NOW?
> The sheep might not be in the road yet
Two simple rules:
1. Are the sheep on only one side of the road, with their heads down, eating? Probably safe.
2. Sheep on both sides, or sheep standing with heads up looking across the road? Almost certainly unsafe.
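The two rules above are simple enough to sketch as a toy hazard classifier. To be clear, this is just an illustration of the commenter's heuristic; the `Sheep` type and the function name are invented, not anything a real self-driving stack uses:

```python
from dataclasses import dataclass

@dataclass
class Sheep:
    side: str        # which side of the road: "left" or "right"
    head_down: bool  # True if grazing

def road_is_probably_safe(flock):
    """Apply the two sheep rules to a list of observed Sheep."""
    sides = {s.side for s in flock}
    # Rule 2: sheep on both sides, or any standing with head up
    # looking across the road, is almost certainly unsafe.
    if len(sides) > 1 or any(not s.head_down for s in flock):
        return False
    # Rule 1: one side only, all heads down eating - probably safe.
    return True
```

So `road_is_probably_safe([Sheep("left", True), Sheep("left", True)])` says safe, while one sheep with its head up, or sheep on both verges, flags a hazard. Of course, as the next comment notes, the lamb exception breaks the heuristic.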
Unless it's a young lamb about to run out in front of a fast-moving Ducati. It was... messy.
That seems like a pretty good rate to me. I've been driving 18 years and must have covered around 150k miles and I've had two minor accidents that were my fault (both in my first 4 years) so it's beating me by more than a factor of 10.
Ok there are an additional couple of hundred where the person intervened, so if all of those would have led to a crash then I'm a better driver (though 18 year-old me wasn't). But I'm guessing that for a few of those the car actually had it covered but the person got nervous. Also presumably it's improving so hopefully the intervention rate is dropping.
All in all we're pretty much there. Sell me one now and I'll be no less safe. A couple more software upgrades and I'll be much better off.
Have they even started testing self-driving cars out in the wilds? On one-track roads with potholes deep enough to eat a tyre? At night? In the rain? Well-greased with mud? Fog? All of those at once? Even on a good day, who knows what's around the next corner (a jogger? a cyclist? a mega-tractor towing a spiky thing? a rider on a horse? a lorry delivering heating oil? an escaped bull?)
I made up the escaped bull. The rest is the last few months where I live (nowhere particularly remote: just rural Northamptonshire!) Still time enough to add snow to the list.
Yes - they're testing it in the wild...
OK - the wild isn't a particularly difficult place weather wise yet...
But the car would be far better than you at seeing around the corner - because it will slow down on approach...
It will be better than you at fog/rain because it will have more sensors, operating at different wavelengths, than you do.
It will be better than you at night, because it won't be tired.
It will be better at dealing with potholes, because it will "see" them all, and plot a course around them - or correct the steering faster than you could.
Someone seems to have forgotten the point of these vehicles. If the passenger must watch constantly in case he needs to intervene, that's harder work than driving the car yourself. So it's entirely a failure of machine thinking.
Yes and no - machines aren't thinking (yet).
And if they start thinking (self awareness, original thought, all that jazz) the cars will all be like 'Can't you take a taxi? I don't want to drive you to the airport. I think I'll swing by the car wash and maybe get my oil changed. Oh, and tonight there is a Steve McQueen retrospective at the drive in, they are showing Bullitt and Le Mans!'
The collision between the Lexus and the bus would never have happened if it had been a self-driving autonomous vehicle.
a) self-driving cars have lidar mapping of the environment, so it would not have been surprised by sandbags in the road (unlike myopic texting sleepy meatsack drivers).
b) self-driving cars watch the traffic around them and plot their courses; a self-driving car will never manoeuvre into the path of another vehicle (unlike careless distracted impatient impulsive meatsack drivers).
a) lidar still needs to be able to see the sandbags - which it did, but only after trying to make two lanes out of one.
b) the car has been programmed to ease traffic flow - that does mean asserting priority at times, else the whole system shuts down...
So it understands priorities and common behaviour (in this case, making the lane into two lanes on approach to a junction is apparently common practice, mostly because the Americans can't paint - or don't stop at red lights... ;) )
So it started the two-lane manoeuvre, as would a human, then had to stop - and pulled back into a gap in the traffic. Note that the traffic was still in its own lane - so it could reasonably determine that it had priority in this situation. The bus driver either assumed it was parking or didn't give a monkeys about priority - we'll probably never know which.
Biting the hand that feeds IT © 1998–2019