Uber self-driving car that struck woman may have guessed (wrongly) she wasn’t real
When Elaine Herzberg was struck and killed by a self-driving Uber in March, the leading media narrative blamed the tragedy on 44-year-old supervising driver Rafaela Vasquez, who had been convicted of driving with a suspended license.
Now that narrative doesn’t hold up quite as well: the accident appears to have been caused primarily by a failure in Uber’s software, according to a report from The Information on Monday. The Uber car “saw” that Herzberg was crossing the street, but its system decided she was a false positive in its detection system; as a result, it chose not to swerve or stop, according to The Information’s sources.
Theoretically, Vasquez could have manually directed the car out of the way. However, new research suggests that the design of self-driving cars encourages their operators to over-trust the vehicles. The March accident also wasn’t an isolated incident involving self-driving cars: just a few days ago, a Waymo self-driving car crashed during a test drive in Chandler, Arizona, leaving the driver with minor injuries.
Uber halted its Phoenix-based self-driving car testing program shortly after Herzberg’s death, but it largely avoided scrutiny of its policies as long as Vasquez shouldered a significant amount of the blame. Tempe police chief Sylvia Moir went so far as to say that Uber was “likely” not at fault for the death.
Uber also says it has started a “top-to-bottom” review of the safety of its vehicles and has hired Christopher Hart, former head of the National Transportation Safety Board, to lead the process. “Our review is looking at everything from the safety of our system to our training processes for vehicle operators, and we hope to have more to say soon,” an Uber representative told TechCrunch.