It looks like Uber has found a probable cause of the fatal self-driving car accident that took place a few weeks ago. According to a report from The Information, the problem lay in the part of its self-driving software that decides how the car should react after detecting an object on the road.
Because of how the software had been configured to respond to detected objects, the car “decided” that it didn’t need to take any action right away.
People familiar with the matter report that the car in question, a Volvo XC90, likely flagged the woman crossing the road as a “false positive.”
In self-driving systems, the vehicle’s sensitivity to objects in its path can be tuned. In this case, the car’s software wasn’t sensitive enough to brake in time and save the woman’s life.
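To see why tuning matters, here is a highly simplified, hypothetical sketch of how a detection-confidence threshold can cause a perception system to dismiss a real object as a “false positive.” The names and numbers below are illustrative only and have nothing to do with Uber’s actual software.

```python
# Hypothetical sketch: a single confidence threshold decides whether
# the vehicle reacts to a detected object. Illustrative values only.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "pedestrian", "plastic bag"
    confidence: float  # 0.0-1.0, reported by the perception stack

def should_brake(detection: Detection, threshold: float) -> bool:
    """Detections below the threshold are treated as false positives."""
    return detection.confidence >= threshold

# The same detection produces opposite outcomes under different tunings:
# a conservative (low) threshold triggers braking, while a permissive
# (high) threshold dismisses the object entirely.
pedestrian = Detection("pedestrian", 0.55)
print(should_brake(pedestrian, threshold=0.40))  # True  -> car reacts
print(should_brake(pedestrian, threshold=0.70))  # False -> treated as a false positive
```

The trade-off is real even if the mechanism here is oversimplified: set the threshold too low and the car brakes constantly for harmless objects; set it too high and it may ignore something genuinely dangerous.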
If you’ve seen the video footage of the fatal accident, you’ll have noticed a human operator sitting behind the wheel. However, just before the accident took place, he reportedly took his eyes off the road. The company had also reduced the number of safety drivers in the car from two to one.
In the wake of this accident, widespread scrutiny of self-driving programs is being demanded. Uber has already suspended its testing efforts and is currently working with the National Transportation Safety Board. Nvidia, the supplier of the GPUs that power the tech, has also distanced itself from Uber’s autonomous driving efforts.
What are your views on this important issue? What kinds of regulations are the need of the hour? Share your thoughts and keep reading Fossbytes.