
Consumer Update on Driverless Car Fatal Crashes

As our readers know, we have been keeping an eye on developments in driverless car technology. As these vehicles come closer to reality, it is likely that we will all eventually encounter them on the road. But the technology is not yet perfected, and accidents have happened. Uber has been testing driverless vehicles in several cities in recent years. A safety driver is always behind the wheel, but the vehicle drives on its own. Uber halted the program after a tragic pedestrian crash in Arizona took the life of a woman as she walked across the street at night. She was not in a crosswalk when she was struck, and although a driver was behind the wheel of the vehicle, the driver apparently did not see her in time to stop. Because of this tragedy, Uber has been prohibited from autonomous testing in Arizona. This accident and others will likely lead to greater legal and regulatory intervention and to calls for stronger oversight of driving safety and of how these autonomous systems are put on the road.

The National Transportation Safety Board (NTSB) has issued a report on this accident and, based on interviews with Uber, determined that the Volvo SUV did in fact register the presence of something in the road six seconds before impact. However, the vehicle's automatic emergency braking system was not enabled at the time of the fatal crash, so it would have been up to the driver to stop the vehicle. Uber has said it disables the emergency braking system so that the car doesn't drive erratically. In this situation, that decision proved fatal.

Some tech experts have questioned the logic of this decision. They say the vehicle's emergency braking system should not be disabled while the self-driving system is engaged. They argue that computers can detect what is ahead on the road sooner than a human driver and that the system could have protected the pedestrian from being struck had it been left enabled. In the Arizona tragedy, the way the system was set up meant the driver could not be alerted, and the driver behind the wheel did not see the pedestrian until a little over one second before impact.

Uber is not the only company under fire for its partially autonomous driving system. Just recently, there was another incident involving Tesla's Autopilot system, this time when a Tesla hit a parked, empty police SUV. Tesla vehicles have been involved in several tragic crashes, some resulting in loss of life, while drivers had the Autopilot system engaged, a system that has been heavily touted by innovator Elon Musk. In one recent crash, a driver was fatally injured when the car struck a road divider while Autopilot was engaged; in another, a driver was injured while looking at her phone as the car drove in this mode.

The Massachusetts Institute of Technology has just released the results of a study on Tesla owners' use of the Autopilot system. The conclusions are a bit worrisome for those sharing the road with Teslas. Drivers apparently use the system often, mostly on the highway, and seem to believe they can relinquish their driving responsibility to it. The MIT study concluded that drivers need far more instruction on how these systems work and what they can and cannot do. Although Tesla continues to inform its owners that their cars are not intended to drive without the driver's support, improper use of the system remains an issue as drivers hand over their role to their vehicles.

If you have been injured or harmed in a pedestrian or motor vehicle accident, contact Scholle Law for more information about your legal rights and options for medical support. We are here to advise and support those who need help after they suffer an injury or are coping with the wrongful death of a loved one.