Tesla has been leading the autonomous car industry, and while its cars offer an alluring glimpse of the future, they also pose many dangers to road users. These dangers stem primarily from drivers' overreliance on the vehicle's technology to operate the car safely. A 2018 Florida Tesla Autopilot accident is a prime example of how the company's marketing claims can foster that overreliance.
In that case, a Florida driver was traveling in his Tesla in Autopilot mode when he bent down to look for his phone. Neither the driver nor the vehicle's technology recognized that the road was ending. The vehicle ran through a stop sign and a red light and slammed into a parked Chevrolet. The tragic accident took the life of a 22-year-old college student. The woman's estate filed a lawsuit against the company, arguing that its vehicles are "defective and unsafe," and separately settled a lawsuit against the Tesla driver. This incident was just one of several fatal accidents involving Tesla vehicles operating in Autopilot mode.
The company describes Autopilot as a system that allows the vehicle to accelerate, brake, and steer without driver input. While Tesla publicly touts its advanced technology and "driverless cars," its website states that Autopilot is designed to "assist" the driver with "burdensome parts" of driving. The website now adds that "current" Autopilot features require "active" driver supervision. Despite the name, the vehicles are not autonomous, and the vehicles' manuals warn operators not to use the feature on city streets.