Tesla lawsuit possible after autopilot crash

Posted on October 3rd, 2016

Two recent crashes involving the much-hyped Tesla autopilot feature raise important legal questions about the future of driving automation. In May, the driver of a Tesla Model S electric vehicle died in what is believed to be the first fatality involving a car in autopilot mode.

Federal regulators are looking into the crash. It’s a serious development that could set an important precedent as more cars move toward automation.

The crash — and others involving similar technologies — has raised questions about the limitations of self-driving cars.

Better Than Humans?

Tesla Motors notes that its cars give clear safety warnings before drivers can use the autopilot feature, which is disabled by default.

But those warnings appear to conflict with statements made by Tesla founder Elon Musk — who has praised the technology as being “probably better than humans” for driving on the nation’s highways.

Statements like that may give the family of the dead driver legal grounds for a product liability case. Think this is another “set the cruise control and leave the front seat” lawsuit?

The evidence suggests otherwise.

Two Crashes in Two Months

In the May crash, driver Joshua Brown died when his Model S — in autopilot mode — crashed into the side of a truck on a Florida road. It’s important to note that the Tesla autopilot system doesn’t function fully on its own:

Tesla’s warnings describe it as “traffic-aware cruise control.” Drivers are also told to stay in control of the steering wheel.

After the crash, the truck driver reported hearing a Harry Potter movie playing inside the car, and police found a portable DVD player. Tesla said neither its software nor the driver appeared to react before the car hit the truck.

Not quite two months after the fatal wreck, another Tesla — this time a Model X — was involved in a serious crash.

The driver says the Tesla also was in autopilot mode at the time. The National Highway Traffic Safety Administration is looking into it.

A Case Despite Safety Warnings

In a blog post following the fatal crash, Tesla pointed to the safety warnings its cars give drivers before allowing use of autopilot.

By default, the functionality is disabled, the company says — and drivers must acknowledge that the technology is new and in a public beta state.

However, the family of Joshua Brown still may bring a product liability case against Tesla Motors. And they may win if they can show that Brown was led to believe the autopilot system had capabilities it did not.

Did Brown receive sufficient warnings about possible defects in Tesla’s autopilot system, or is the company to blame for the fatal crash? That may be determined in court.

Even the term “autopilot” itself may mean legal trouble for Tesla. For decades, most people have understood it to describe a vehicle that operates itself, and until recently that vehicle was almost always an airplane.

Driver Partly Responsible?

Tesla asserts that its autopilot system provides drivers with ample warnings before activating. In court, that won’t be the only argument:

The company also likely will scrutinize the driver’s actions behind the wheel.

Attorneys for Tesla are expected to argue that the driver accepted the risk of using the system, and that he therefore bears at least some responsibility for the accident.

In other words, they may argue that the driver’s own actions were the primary cause of the wreck. A jury will most likely decide whether that’s the case.

If you’ve been injured in a crash, it’s important to work with an experienced attorney who can help protect your rights. To find out more about your legal options, please contact Taos Injury Lawyers.

 
