Tesla Model S Driver Crashes Into Car, Blames Autopilot Feature
Two incidents in the past few weeks put Tesla on the spot
by Kirby
Tesla’s Autopilot system may be leaps and bounds ahead of anything other automakers can offer today, but it’s still far from a finished product. Two incidents in the past week have put the spotlight on Autopilot after two drivers were involved in accidents they believe were caused by malfunctions in the system. For its part, Tesla has denied any responsibility, instead shifting the blame to the drivers and their failure to understand the technology’s capabilities.
The case of Arianna Simpson is the more recent of the two occurrences. Speaking with Ars Technica, Simpson detailed the chain of events that led to her Model S rear-ending a car at 40 mph. According to Simpson, the Autopilot feature on her Model S didn’t brake like it was supposed to, forcing her to slam on the brakes herself. Unfortunately, her reaction came too late and the Model S crashed into the car. Simpson blames the tech for not responding in time, but according to Tesla and its data log, the blame rests on the shoulders of the driver, who it says hit the brake pedal and deactivated the car’s “autopilot and traffic aware cruise control,” instantly returning the car to manual control.
Fortunately, neither Simpson nor the driver and passengers of the car she rear-ended were hurt in the accident. Her Model S, though, appears to have suffered significant damage to the front and will likely need serious repairs before it can return to the road. It’s the second case in less than a week of a Model S owner crying foul over what they perceive to be serious flaws in the autonomous driving feature.
Just a few days before Simpson’s accident, a separate incident occurred in Utah, where Model S owner Jared Overton claims that his car started on its own, crashing into the back of a trailer. According to Overton, he was running errands on April 29 when he parked his Model S at one of his stops. Less than five minutes after he got out of the car, Overton and a worker from the business he was visiting saw that his Model S had driven under a parked trailer, smashing the car’s windshield. Overton’s complaint reached Tesla, which, as in the incident with Simpson, responded by reviewing the vehicle’s log and determining that the crash was caused by Overton’s own inattentiveness. According to Tesla, the car’s Summon feature, which among other functions allows the Model S to park by itself, was “initiated by a double-press of the gear selector stalk button, shifting from Drive to Park and requesting Summon activation.” This led to the car driving straight into the trailer.
Even if Tesla is right in both instances, these customer complaints still paint an unflattering picture of the company’s Autopilot feature. Right or wrong, the electric carmaker needs to understand that complaints like these will continue to happen, and it’s on the company to ensure that proper awareness of Autopilot’s functions is disseminated to everyone who can access the technology from their cars.
Why it matters
This is a very tricky situation for Tesla because even if it isn’t at fault in either case – its data logs seem to back up its claims – that doesn’t shield the company from public perception. The truth is, autopilot technology is still in its infancy, and those who think they can let their cars drive them to their destinations are kidding themselves.
Whatever ignorance the public may have of the Autopilot feature, it’s on owners to learn more about what the tech can and can’t do. But here’s the thing: I don’t think Tesla is in the clear here, no matter what its data logs say. Part of offering this technology is making sure that people know how to use it. It’s one thing to have owners agree to terms and conditions; it’s another to be genuinely proactive in communicating what Autopilot can and can’t do. Let’s face it, a lot of people skip through these terms and conditions and sign up regardless of what the fine print says. That’s a problem, because those people don’t know what they’re signing up for.
Tesla should know this, and it should go above and beyond merely offering a T&C to escape legal liability. If Overton was right about one thing in the aftermath of his Model S crashing into the trailer, it’s that it could’ve been worse. Whether Tesla believes it’s right or wrong, a more serious accident blamed on the Autopilot feature could have far-reaching ramifications, not just for Tesla and the tech itself, but for the industry as a whole.
I’m not saying that Tesla’s in the wrong here, because it has the data to back up its findings. This is a matter of understanding the human instinct to experiment and making sure that everyone who gets the Autopilot feature understands it isn’t something to be experimented with on busy roads.
Do something more than issue responses to these claims, because some people will be inclined to believe the people who were involved in the crashes. Raise awareness – the proper and comprehensive kind – of the abilities and limits of the Autopilot feature. Don’t settle for waivers and fine print; this is technology with the potential to shift the industry as a whole. It deserves more than that.
Read our full review on the Tesla Model S here.