It's a real problem: is the human in control, or is the machine?
In the aviation industry, experience tells us that even with intensive, professional training on the human side, a semi-autonomous system failing over to a human will often produce a startle effect which the human cannot process in the time available. In the air that's usually tens of seconds, because you have altitude on your side; cars are going to have mere seconds, if that. There are mitigations: clear display of information to the human, forcing them to take over every now and then to keep them attentive, and training them to handle such situations. But these don't prevent the issue completely, and it's probably not feasible to give that kind of training to your average driver, as they aren't going to want to pay for it.
Creating a safe semi-autonomous system is probably a lot harder than creating a safe fully-autonomous one, and until that changes, I think we'll keep seeing accident reports concluding that the computer needs more control, that humans need to RTFM, and that wherever humans get involved, lots of holes appear in your layers of Swiss cheese because they never behave as expected. I note, however, that fundamentally this accident was caused by the errors of two humans, driving the truck and the Tesla respectively - it's not going to sound a death knell for autonomous driving aids.