Reply to post: Personal thoughts from the Dash-Cam Video

Uber's disturbing fatal self-driving car crash, a new common sense challenge for AI, and Facebook's evil algorithms

Nigel Sedgwick


My thoughts from the dash-cam video.

(i) As Roland6 notes above, it looks to me as if the car was on dipped headlights only, even though the external circumstances (low street illumination and no oncoming vehicles) should mandate full-beam headlights. As a potentially relevant supplementary question: does the 'autonomous' vehicle system control the headlight dipping, or not? If not, the 'supervisor' should be continually monitoring the headlights and adjusting them - in order to retain adequate visibility for his/her 'supervision' function.

(ii) I am concerned that the dash-cam footage does not adequately show the lighting contrast that would have been available to a human 'supervisor' whose eyes had adjusted to night-vision conditions. Thus the apparent lack of visibility for a human driver in those circumstances is not at all certain: otherwise, surely the 'supervisor' would have taken back manual control much earlier.

(iii) If the 'supervisor' had had his hands on the steering wheel then, given the road position of the pedestrian at the time of impact, swerving left would have been adequate to miss the pedestrian; braking alone would probably not have been. Accordingly, it seems to me that 'supervisors' should have their hands on the steering wheel at all times, or at least at night. Also, why did the 'autonomous' vehicle system not steer/swerve the car to the left? It looks to me as if it should have had time to do so.
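The swerve-versus-brake point can be sanity-checked with simple kinematics. A rough sketch follows; every figure in it is illustrative only (my own assumed values, not the actual crash parameters): at speed v with d metres of sighting distance, braking needs reaction distance plus v²/(2a) metres, while a swerve only needs enough time to shift w metres sideways.

```python
import math

# All figures below are illustrative assumptions -- NOT the actual crash parameters.
v = 17.0        # vehicle speed, m/s (~61 km/h)
d = 25.0        # distance at which the pedestrian is first seen/detected, m
t_r = 0.5       # reaction time before braking/steering begins, s
a_brake = 6.5   # emergency braking deceleration, m/s^2
a_lat = 7.0     # emergency lateral acceleration (near dry-tyre limit), m/s^2
w = 2.0         # sideways offset needed to clear the pedestrian, m

time_to_impact = d / v                            # seconds until reaching the pedestrian
brake_dist = v * t_r + v ** 2 / (2 * a_brake)     # reaction distance + braking distance, m
swerve_time = t_r + math.sqrt(2 * w / a_lat)      # reaction time + time to move w sideways, s

print(f"braking stops in {brake_dist:.1f} m "
      f"({'adequate' if brake_dist <= d else 'NOT adequate'} within {d:.0f} m)")
print(f"swerve takes {swerve_time:.2f} s "
      f"({'adequate' if swerve_time <= time_to_impact else 'NOT adequate'} "
      f"within {time_to_impact:.2f} s)")
```

Under these assumed numbers the braking distance exceeds the sighting distance while the swerve completes before impact, which is consistent with the intuition above; with different assumptions the comparison can of course come out differently.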

(iv) Given that LIDAR, as an active illumination system, cannot judge distance unless adequate light is reflected from every object on the road (including black clothing), there must surely be an overriding requirement for additional sensors and processing to be active in parallel with the LIDAR. On this, there needs to be a requirement on the 'autonomous' vehicle system (as for a prudent driver) to drive at a speed consistent with being able to make an emergency stop within the distance that it can 'see'.
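That 'stop within what you can see' rule can be made concrete. A minimal sketch, with assumed illustrative parameters (real values depend on the vehicle, tyres and road surface): total stopping distance is reaction distance plus braking distance, d = v·t_r + v²/(2a), so the maximum prudent speed for a given sensor or visual range is the positive root of that quadratic.

```python
import math

def max_speed_for_range(seeing_dist_m, reaction_s=1.0, decel_ms2=6.5):
    """Largest speed v (m/s) such that v*t_r + v^2/(2a) <= seeing distance.

    Illustrative assumptions: 1.0 s reaction time and 6.5 m/s^2 braking
    deceleration (roughly dry-road emergency braking); real values vary.
    """
    # Solve (1/(2a))*v^2 + t_r*v - d = 0 for the positive root.
    a = 1.0 / (2.0 * decel_ms2)
    b = reaction_s
    c = -seeing_dist_m
    return (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)

# e.g. dipped headlights illuminating roughly 40 m ahead:
v = max_speed_for_range(40.0)
print(f"max prudent speed ~ {v:.1f} m/s ({v * 3.6:.0f} km/h)")
```

Under these assumptions a 40 m sighting range caps prudent speed at around 60 km/h; a system that cannot confirm its effective sensing range ahead should arguably slow until it can.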

(v) The article states: "Internal documents also revealed that the company’s self-driving cars were struggling to meet its target of driving 13 miles without any human intervention during testing on roads in Arizona, ..." Surely the whole concept of requiring, instructing, seeking or hoping that 'supervisors' would avoid 'overriding' runs entirely against safety-critical requirements - there are no grounds for it other than accepting a reduction in safety below what the 'supervisor's' own personal driving style/requirements would provide.

(vi) Has it been established what the 'supervisor' was doing looking down? Was it at the speedometer or another dashboard display? Was it at a mobile phone? If the latter, this immediately implies full culpability: on grounds of lack of attention and, very probably, of too-bright illumination degrading the 'supervisor's' night-vision capability.

Best regards
