Aerial view of a 3D-modeled scene: a small red self-driving car with a black roof, a blue radar beam projecting from its front, follows a tractor trailer down a street lined with parked cars.

Self-driving cars have been heralded as the next great invention, and experts have claimed they will be a normal part of our lives within a decade. However, the companies working to make these futuristic concepts a reality have struggled to put together the pieces necessary to make self-driving cars safe enough for ordinary people to use on a regular basis.

Now, a lawsuit in China against Tesla, a pioneer in self-driving technology, may slow things down even more.

Tesla’s Autopilot Function at Issue in Car Accident

Tesla’s Autopilot feature, introduced into some of its cars in October 2015, was among the features of the Tesla Model S sedan that 23-year-old Gao Yaning was driving in January 2016 when he rammed into the back of a street cleaning vehicle in the province of Hebei, in northern China. Yaning was killed in the crash, and his father, Gao Jubin, filed suit against Tesla and the dealership that sold his son the car he was driving at the time.

Video shows that Yaning’s car careened into the back of the street cleaning vehicle without braking.

Jubin’s lawsuit claims that the vehicle was in Autopilot mode when the crash occurred and that the Autopilot program failed to account for the road conditions. Tesla, however, disputes this claim, stating that there was no way of knowing whether the Autopilot program was on because the crash was so severe that “the car was physically incapable of transmitting log data to our servers.”

Tesla’s Autopilot System a Forerunner for Self-Driving Cars

Tesla’s Autopilot system is not the fully self-driving car that we have heard so much about. Instead, it is a program that “sees” other cars and objects while driving on the highway, allowing the car to steer itself to stay in its lane, manage lane changes, and brake for upcoming obstacles. However, Tesla’s Autopilot program is not designed to let drivers take their eyes off the road. Despite this, numerous crashes and fatalities have been reported in which the driver of a Tesla vehicle relied entirely on the program to drive the car, leading to an accident.


Tesla’s Autopilot program is a huge step toward self-driving cars. However, it is better understood as a halfway point between cars that must be driven and cars that can drive themselves. That halfway point has proven awkward: the system requires drivers to remain attentive, with their eyes on the road and their hands on the wheel, while assuring them that the car will do the work. These mixed messages are likely to blame for the wrecks in which the system has been involved.

If you or someone you love has been involved in such a car accident, contact Williams Elleby Howard & Easter for legal representation.
