
Self-Driving Cars & St. Louis Car Accidents



Whether the cause is distracted driving, driving under the influence, road rage, substance abuse, drowsy driving, speeding, or simply a bad decision at the wrong moment, most serious car accidents come down to human error. According to an NHTSA study, up to 94% of crashes are caused by drivers, with the remainder attributed to the vehicle, the environment, or other conditions.


These statistics spark an exciting debate, especially now that self-driving cars are being tested in real-life conditions and driving alongside regular vehicles. If we are to expect a future with more autonomous cars, could that mean the statistics will change very soon?

Many people are still suspicious of the life-saving potential of self-driving cars, pointing to the fatalities that have involved this type of vehicle. However, even when autonomous cars do crash, human error is often to blame. Human operators have yet to hand full responsibility over to their machines, and self-driving car manufacturers consistently advise drivers to take over in critical situations.

One of the most recent accidents, and perhaps the most famous yet, is the Tesla crash. The car was operating on Autopilot, and the driver was killed at the scene. Self-driving systems are not equipped to handle every scenario, and the onboard computer instructs the human operator to take over when road conditions become too unpredictable. As a precaution, drivers should also keep their hands on the steering wheel and watch the road, even when the automated system is engaged.

However, the investigation showed that the driver did not keep his hands on the wheel and ignored the onboard computer's warnings instructing him to take over. Simply put, even this accident was a consequence of human error.

Tesla’s Safety Issues

The issue of safety and self-driving cars is not new. Governments around the world are struggling to keep up and establish legal frameworks that would uphold safety standards.

Back in April 2018, Tesla was removed from the official investigation into the fatal Autopilot crash in Mountain View, California, for violating its agreement with the NTSB by publicly commenting on the ongoing investigation. Tesla said it left the investigation voluntarily, but the NTSB released an official statement disputing that account.

The company has faced its fair share of public scrutiny as a result of the crash, which has prompted Tesla to make a number of statements about Autopilot and its functions. Under the Society of Automotive Engineers' six-level automation framework, Autopilot falls into the Level 2 category: the system can manage speed and steering under some conditions but still requires drivers to stay attentive and ready to take control.

Autopilot's behavior fits this description, as it alerts drivers whenever they take their hands off the steering wheel. In response to the crash, Tesla reinforced this point and repeatedly said that the system depends on the driver, who is responsible for everyone's safety, including their own. However, Tesla's advertisements make Autopilot appear to perform well above Level 2 automation, leaving many people wondering which side of the story is true.

Still, even if the current version of Autopilot does not allow for a fully automated driving experience, Tesla and other companies are working to make that happen. If they succeed, it will most likely create a very difficult task on the legislative end.

The Grey Area of Rising Technologies

We are living in exciting times, as technology is taking over and making our lives more comfortable than ever before in many ways. At the same time, this also requires new regulations and laws. It will likely be a while before autonomous cars are the only vehicles on the road, and until then, the rules we have now cannot cover every situation involving self-driving cars.

In the rare cases when the technology fails, who would be liable for the damage: the car manufacturer or the software creator? It is still unclear how to approach such a paradigm shift.

Many people wonder whether autonomous cars can detect the nuances of situations involving humans. For instance, will a self-driving vehicle be able to notice a pedestrian absorbed in their cell phone, or a child who is unaware of their surroundings?

These questions will likely dominate the new era of the autonomous car and shape the future of road safety and the driving experience.

Why Choose Us vs. TV Lawyers?

The Hoffmann Law Firm: Direct Attorney Access, Maximum Settlement Focus, Trial-Ready Cases

TV Lawyers: Paralegal Handling, Quick Settlement Push, Settlement-Only Focus

Free Consultation with a St. Louis Car Accident Lawyer

Don’t talk to an insurance claims adjuster before speaking with The Hoffmann Law Firm, L.L.C. We can help you avoid making statements that may affect the outcome of your case. The consultation is free; you don’t pay unless we get you money!

Updated: May 10, 2024