Problems with automated cars:
1) Denial of service attacks. Though possible with traditional cars, their drivers can at least call for help. Imagine being asleep for the journey to Scotland, only to find the car stuck 100 yards down the road because it couldn't progress. Everything from painting extra white lines on the road (there's a guy who puts salt lines on roads as an art project to mess with the cars' heads) to playing games with the sensors (stick some clear tape on the LIDAR and watch as your neighbour's self-driving car won't move because it thinks it's touching an object).
2) Technology immaturity. We just don't have cars that won't plough into the side of trucks - the Tesla case is a case in point... the car still hit the truck. An ENORMOUS truck. HUGE. At speed. Killing the driver. Whether or not the driver was dead in the passenger seat, it shouldn't have mattered. It shouldn't have been possible.
3) Liability. Because of the above, nobody has yet agreed whose fault it is if they go wrong. It's a bit I, Robot-esque to me. Either we have control AND responsibility, or neither. And that means ceding control to the car company. This could impact on everything from finance agreements (sorry, your payment is late, we won't take your wife to hospital) to social enforcement (sorry, you're all under 21, I detect three people in the car and it's past 10pm... you're not going anywhere, pal). Also... who has liability for the loading of the car? If someone doesn't put their kid in the child seat properly, how is the car going to know? But you'll still sue them into oblivion if it crashes. Presumably child seats would still be legally required - or are we claiming these cars are so safe we never need to use them?
4) Mixing of autonomous and manual traffic. It's stupid, dangerous, the biggest programming hazard, the cause of the Tesla accident, and easily solved by just... well, having a special lane, almost a straight line between destinations, that only authorised cars can drive on, where the hazards are lessened and decisions and markings are clear-cut, rather than negotiating the rush-hour traffic at the Hanger Lane Gyratory. (P.S. We have that. It's called a railway.)
I'd be quite happy with a special segregated lane, just for autonomous traffic, that is the only part they're allowed to drive on, with all the special gear in the road to signal junctions, other traffic, etc. Put the safety in the infrastructure, not the vehicle. Literally, a personal train. And then roll that out bit by bit until all roads are like that and we can get rid of humans (50+ years). The suggestion to just have these things co-exist is a nonsense.
5) Over-trust in humans. If you don't need a driving licence to use one, then you'll see the cars abused by people who aren't subject to bans etc. People will overload their autonomous car, let it pile through tiny backstreets late at night, leave it in the middle of nowhere like an abandoned shopping trolley, etc. And if the people who used them can't be traced / stopped / banned, what can you do about it? It needs a kind of registration system at minimum. You'll see them used as drug-runners, porn-peddlers, even automated motorway adverts, getaway vehicles, whatever they can be misused to do. People will be loading drunk friends into them and programming them for Glasgow, etc. Wanna have a laugh? Summon 1,000 automated Ubers to your mate's house and block the road. Who's responsible: the companies that made cars which block up the roads for hours for everyone else, or the guy who paid them to do it?
But the biggest deal... we just don't have automated cars. They don't exist. We have software junk in a normal car with a couple of sensors. They aren't fit for purpose. Test them as people-less cargo deliverers for 5 years before you license them to carry humans (thereby halving potential casualties). But we seem to be skipping that bit.