It seems that, lately, Tesla just can’t catch a break. People are crawling out of the woodwork with frivolous lawsuits, piled on top of all the other negativity like the faults with the Model X. Now, a Tesla Model S has been involved in a fatal accident, and people all over the internet are debating whether the fault lies with the driver or with the car’s Autopilot system.

According to the Levy County Journal, the 2015 Tesla Model S was traveling east on U.S. 27A near Williston, Florida when a tractor-trailer that was traveling west turned left and passed in front of the car. The Model S struck the trailer, shearing off its roof, before it crashed through a couple of fences, struck a light pole, and came to a stop about 100 feet off the highway. The driver was pronounced dead at the scene.

According to a press release issued by Tesla, the company contacted the NHTSA immediately and was alerted on June 29 that the NHTSA was investigating the incident. According to Tesla, the car had Autopilot engaged at the time of the accident, and “neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.” This is the first known fatal crash that has occurred with Autopilot engaged and, according to Tesla, the first in over “130 million miles where Autopilot was activated.”

Be that as it may, the general public is divided on where to place fault for the accident. Keep reading to learn more about that.

Tesla’s Response

Before going into both sides of the argument, I think it’s only fair to point out some of what Tesla said in its press release. The company openly supplied information about the conditions of the crash and how the car was able to travel under the trailer. Tesla also issued what seems like a heartfelt tribute, calling the driver of the car “a friend to Tesla and the broader EV community” and offering its “deepest sympathies to his family and friends.” On top of that, the company openly admitted that the Autopilot system didn’t see the truck.

On the other hand, there was a lot of defense in the press release as well. Of course, this is to be expected to an extent, but Tesla went all out. It describes the various warnings and alerts that appear before the system activates, as well as other safety features like the reminders to keep your hands on the wheel and the “frequent” checks to ensure the driver is touching it. Furthermore, it says the vehicle will gradually slow down until it determines the driver is back at the helm.

So, while Tesla admits that Autopilot didn’t see the truck or trailer, it also spent a considerable amount of time emphasizing that various features are in place to keep the driver from getting too distracted. If you’re interested in reading the whole press release, you can find it on Tesla’s website here.

Public Opinion

With the news of this incident, there are naturally two types of people: those who don’t like the idea of self-driving cars and want to blame the technology and Tesla, and those who feel that the driver most likely wasn’t paying attention and the fault lies with him. Take some of these online comments, for example:

Naturally, someone will call for a lawsuit because, well, it’s Tesla:

Here is a logical thought on the matter:

Surprisingly, there don’t seem to be as many people blaming Tesla as you might think. Some are calling the human population in general “too dumb” for the technology, or saying that the driver was probably doing something he shouldn’t have been doing. At one point, someone even questioned whether the driver had workarounds in place to circumvent the built-in safety systems:

And some are even questioning why the driver of the tractor-trailer pulled out in front of the Tesla in the first place, or why the trailer didn’t have side guards that would keep cars from going under it. Then again, the most common comment questions why Tesla calls the system “Autopilot” when it’s still in beta and clearly not capable of being fully autonomous. In reality, most of these are pretty good points.

The Effects So Far

While the crash happened back in May, the news has only now been released to the public. Already, Tesla is taking hits – not only from those who want to blame Tesla for the accident, or at the very least for calling the system “Autopilot,” which could be confusing, but from a financial standpoint too. Take a look at this screenshot from a Google search:

You can pinpoint the exact moment the news of this accident hit the market. It isn’t a huge drop – the price only fell $6.03, a decrease of 2.84 percent – but it does suggest that some stockholders are worried Tesla could take the blame or, at the very least, feel some backlash from the whole incident. I guess you can’t really blame them, considering all the other drama involving Tesla lately.

Conclusion

So who is really to blame? Unfortunately, nobody can say for certain yet. But that doesn’t mean I can’t express my opinion. Before I do, I want to point out that I am a fan of Tesla, and I appreciate the advances the company has made in battery and autonomous technologies. But I also like to look at things logically... well, most of the time, anyway. That said, don’t mistake what I’m about to say for the words of someone who will unequivocally back Tesla.

If you’re familiar with any of my writing, you know that I can be pretty critical of manufacturers, so I’m going to touch on a couple of points here. First, I’ll put this right out in the open: I believe the driver is most likely at fault. Not because I knew the person – I didn’t – and I damn sure don’t know all of the circumstances surrounding the incident. But I will openly say that people are dumb; we all make stupid decisions from time to time. It’s how we learn.

There are more than a few videos out there of people trusting Tesla’s Autopilot system far more than they should. Most recently, there was the viral video of a man sleeping in stop-and-go traffic, and don’t forget the video of the couple jackassing around while the car handled the driving for an extended period of time. In my opinion, the driver probably wasn’t following all of the rules or was – like a majority of the people who still drive themselves – staring at his phone when the situation unfolded. I mean no offense to the driver or his family and friends, but you can’t deny there’s a possibility that this is the truth.

Now, on to what I have to say about Tesla. I don’t think Tesla should be blamed. The company has gone out of its way to warn people that the system is in beta testing and is not fully autonomous. That said, I have to agree that calling the system “Autopilot” probably isn’t the best move, considering it isn’t really full auto. As a population, we tend to use technology before we understand it, and I think that’s happening a lot with Tesla lately – take all the complaints about the automatic parking system, for instance. We also have a horrible habit of glossing over instructions and not reading terms and agreements. If you doubt that, did you read the end-user license agreement for Windows or macOS when you turned on your computer for the first time? I’d bet you a wooden nickel that you didn’t, and chances are most people gloss over Tesla’s warnings as well.

In short, I put the blame on the driver, who should have been able to take control of the car immediately, or on the truck driver for pulling out in front of him. But I think Tesla picked a horrible name for the system, and I think the technology has a long way to go before the general public can be trusted to use it. And, for the record, I love technology, but I hate the idea of not driving my own car – just to put that out there. So, with that said, what are your thoughts on the situation? Who is to blame, and why? Let me know in the comments below.