Complex issues will surface in the courtroom, like #HumanFactors, mode confusion, and #AutomatedDriving system internals, that, frankly, judges and juries will be hard-pressed to understand and appreciate.
But the #NHTSA, in its infinite wisdom, is just tickled pink to have private litigation deal with #Tesla's vast #Autopilot wrongdoing, because that way the agency can continue to do nothing, which it does well.
@eff 1/ This certainly could become a problem, but not at the current level of #automateddriving technology. I know the tech well, having worked as Chief Scientist for #AI at a major automotive company until I retired last year. Bottom line: it is far easier to collect vast amounts of data from vehicle sensors than to use, store, or transmit that data. The amount retained today in consumer vehicles, if any, is very small. Think #DashCam x 8 +/- (Continued)…
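To put that "#DashCam x 8" point in rough perspective, here is an illustrative back-of-envelope calculation. The per-camera bitrate is my assumption (a typical dashcam-grade 1080p H.264 stream), not a figure from any manufacturer:

```python
# Back-of-envelope: raw data volume from 8 dashcam-grade camera streams.
# All figures below are illustrative assumptions, not vehicle measurements.
cameras = 8
mbps_per_camera = 4            # assumed ~1080p H.264 dashcam-quality bitrate
seconds_per_hour = 3600

total_mbps = cameras * mbps_per_camera                   # aggregate Mbit/s
gb_per_hour = total_mbps * seconds_per_hour / 8 / 1000   # Mbit -> MB -> GB

print(f"{total_mbps} Mbit/s ≈ {gb_per_hour:.1f} GB per driving hour")
```

Even at these modest assumed bitrates, a single driving hour yields on the order of 14 GB per vehicle, which is why continuously retaining or uplinking full sensor streams from a consumer fleet is impractical.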
Let's talk a bit about #Mercedes vehicles equipped with #DrivePilot - a Level 3-capable system that has recently been "approved" in a handful of US states.
This article focuses almost entirely on the legal dynamics of consumer liability should this vehicle cause a direct (or, presumably, an indirect) incident.
But, as always, I want to talk about what I see as the #SystemsSafety realities at work here and the many foot-guns that come with them.
The legal expert cited, Professor William Widen, and Professor Phil Koopman have offered their thoughts on attributing liability (between the vehicle and the human driver), as linked in the article.
Such systems carry additional burdens that are foreign to more consumer/business-level #MachineLearning systems - in particular, the need to exhaustively quantify "the unseen" through objective analysis.
It is something that, most notably, #Tesla fails to recognize with respect to their #FSDBeta program, likely by design.