Re: Hard decision but Mercedes are probably right
Hmm, have you read anything like the Highway Code, driving fitness requirements, etc?
#1: Pedestrian recognition isn't perfect
As a driver you are expected to be fit to drive (adequate eyesight, sober) and travelling at a speed appropriate to the conditions, so that you can reliably recognise pedestrians, cyclists and any other road hazards.
Why should a self driving car be somehow exempt from that just because the technology is hard?
If it is to be allowed on the roads fully autonomous, yet is less capable than humans are required to be, then the manufacturer should take the responsibility. After all, if a human driver underperforms and is blamed for an accident / injuries / death, they are held responsible. The machine and its maker cannot be given a let-off.
And if the technology is systematically less able than a human driver is required to be, then it shouldn't be allowed at all. We might as well start saying it's OK to be drunk behind the wheel.
#2: Take personal responsibility
Er, except that as a driver you are required to anticipate road hazards. Not every pedestrian is responsible for their actions. Ever seen a young kid run out into the road? Ever seen an elderly person fall off a slippery pavement? Ever seen someone pushed into the road by a mugger? Ever seen a pram roll away from a distracted parent? No? Well, lucky you. These things happen, and it's not their fault.
Your attitude is wrong; you should get it fixed. You're saying that the young kid, pensioner, crime victim or baby deserves to be run over.
#3: It's a hard decision.
No it's not - the pedestrian has no protection. The car occupant is surrounded by crumple zones and airbags. I say the car should take a chance and trust its own structural integrity.
Not that it's ever going to come to that. If a self driving car is coded to drive in a manner that would allow such a situation to develop (i.e. too fast) then the car maker is as guilty of causing death by reckless driving as a human driver would be.
Now if the law is written appropriately (the car manufacturer is liable for its behaviour and any accidents it causes), then the manufacturer would be held responsible for the accident / injuries / death caused by the car's inappropriate speed. So they're going to have to code their cars to drive like grannies in town.
If the law says otherwise, then I won't be getting in a self driving car. Trust my life and liberty to software written by some guys who face no comeback if it goes wrong? No thanks.