Tesla’s driver assistance tools are being challenged yet again in court.
This time, a Florida driver is suing the electric car maker for negligence and breach of duty after a collision with a disabled vehicle on Florida’s Turnpike that destroyed the front end of his Tesla Model S.
The man says the collision, which happened while he was traveling at around 80 mph on Autopilot, left him with “severe permanent injuries.” He is seeking unspecified monetary damages.
The lawsuit also claims that Tesla misleads consumers into believing its Autopilot system can safely transport passengers at highway speeds.
At speeds above 50 mph, Tesla’s Autopilot has trouble detecting stationary objects and parked cars.
In May of this year, Tesla settled a class action lawsuit from drivers who had bought cars with Autopilot 2.0, a feature that cost an extra $5,000 per vehicle, and which the drivers said was dangerous and unusable. In the settlement, Tesla put $5 million in a fund for legal fees and to compensate buyers of the enhanced Autopilot package from 2016 and 2017 with payments of $20 to $280.
While Autopilot’s enhanced features are Tesla’s incremental steps towards developing a fully self-driving car, these vehicles are not self-driving yet.
Although Tesla’s Autopilot can handle a range of driving conditions, it is not designed to stop for parked cars or other stationary objects when traveling at highway speeds.
Tesla released a major Autopilot software update last week. The new ‘Navigate on Autopilot’ feature “guides a car from a highway’s on-ramp to off-ramp, including suggesting and making lane changes, navigating highway interchanges, and taking exits.” However, the release also carries a warning: “Until truly driverless cars are validated and approved by regulators, drivers are responsible for and must remain in control of their car at all times.”
A closer look at the Autopilot issue suggests the problem may lie in how these cars are marketed. The driver in this lawsuit claims that a Tesla sales representative reassured him he only had to “occasionally place his hand on the steering wheel and that the vehicle would ‘do everything else.’”
In September, a driver from Utah lodged a similar complaint after her Tesla hit a stationary fire truck at a red light while on Autopilot. The woman said Tesla salespeople told her she only had to occasionally touch the steering wheel of the Model S while using Autopilot mode.
In response to the Florida lawsuit, a Tesla spokesperson said that they are unable to review the vehicle’s data from the accident because “the car was incapable of transmitting log data to our servers.” The spokesperson went on to say, “However, we have no reason to believe that Autopilot malfunctioned or operated other than as designed.”
Tesla also stressed that driver vigilance remains paramount. “When using Autopilot, it is the driver’s responsibility to remain attentive to their surroundings and in control of the vehicle at all times. Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents, and Tesla goes to great lengths to provide clear instructions about what Autopilot is and is not.”
Tesla has the following information available on its website regarding Autopilot:
Autopilot is an advanced driver assistance system that enhances safety and convenience behind the wheel. When used properly, Autopilot reduces your overall workload as a driver. 8 external cameras, a radar, 12 ultrasonic sensors and a powerful onboard computer provide an additional layer of safety to guide you on your journey.
Autopilot is intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any time. While Autopilot is designed to become more capable over time, in its current form, it is not a self-driving system. There are five levels of automation and Autopilot is currently classified as a Level 2 automated system according to SAE J3016, which is endorsed by the National Highway Traffic Safety Administration.
Tesla’s Autopilot is clearly not infallible. Like any machine, the system is susceptible to defects. While vehicles equipped with Autopilot are advertised as being able to prevent or avoid accidents, this lawsuit shows that is not always true.
In personal injury claims stemming from auto accidents, the central question is who is at fault, and that can be a heated and complicated debate. In a typical car accident claim, the careless or negligent driver who caused the crash is usually found liable for damages. When a self-driving car is involved in a collision, however, there may be no driver to hold accountable. This raises questions of product liability, because a piece of machinery and its software are involved. The only way to hold a computer accountable is to sue the entity or company that designed or programmed it. Other legal issues can further complicate personal injury or wrongful death claims.