The federal agency released a document this week questioning how safe self-driving cars are and whether they should be allowed on public roads after last year’s fatal crash. The document revealed that the software of the car involved in the accident was not designed to detect pedestrians outside of a crosswalk.
The Uber crash, which occurred on March 18, 2018, in Tempe, Arizona, killed Elaine Herzberg, a 49-year-old woman, according to a National Transportation Safety Board report. The report found that the car’s autonomous driving system could not determine whether she was a pedestrian, a vehicle, or a bicycle, nor could it predict that she was jaywalking into the path of the moving SUV.
The tragic incident has fueled long-standing criticism from experts who accuse companies like Uber of deploying cars that are not ready for public roads. Critics are skeptical of these automakers’ eagerness to lead the industry in transformative technology.
In addition to the software failure, the company chose to turn off a built-in Volvo braking system that Uber later concluded might have dramatically reduced the speed at which the car hit Herzberg, or perhaps avoided the collision altogether. Experts said the decision to turn off the Volvo system while Uber’s software did its work made technical sense, as it would be unsafe and confusing for the vehicle to have two software masters.
According to Jamie Court, president of the non-profit group Consumer Watchdog, who has been critical of companies’ eagerness to deploy such technology, “robot cars would do well driving in a world of robots, but not on roads and crosswalks where human beings have the right of way.”
Alain Kornhauser, chair of autonomous vehicle engineering at Princeton University, said: “These have to be much better than that before they can go out there (on public roads). If you can’t do better than that, stay on your test tracks. Don’t come out in public.”
Bryant Walker Smith, a law professor at the University of South Carolina who studies autonomous vehicles, said that Uber’s system was not prepared to be tested on public roads. He stated: “Clearly there was a technological failure in the sense that this was not a mature system. The response to ‘I don’t know what is in front of me’ should absolutely be slow down rather than do nothing.”
According to Smith, as it stands now, there are no federal requirements for autonomous vehicles being tested on public streets, and NHTSA has taken an approach intended not to slow the technology or limit its life-saving potential.
He also said that although bills regulating such tests have not moved in Congress, some states have their own regulations.
He further suggested that before testing self-driving cars on public roads, auto companies should spell out exactly what they will be testing and how they will do it safely. He added: “They should publicly say this is what we’re doing, this is why we think it’s reasonably safe and this is why you can trust us. If a company so abuses the public trust that they put an immature technology on the road with an immature system, there should be real consequences when people are injured.”