Tesla launched its brand-new Autopilot system with much fanfare, but the advanced system is causing the American company a great deal of trouble. Elon Musk's firm is not only dealing with three crashes in only a couple of months, but also a fatality that occurred in May after a Model S collided with an 18-wheeler semi. The case has been under investigation for quite some time, and the National Transportation Safety Board (NTSB) has issued its preliminary report into the crash. The agency claims that the car was speeding while using the Autopilot function.

According to system data downloaded from the Model S, the sedan was traveling at 74 mph in a 65 mph zone while on Autopilot. This is interesting, to say the least, as the system is supposed to prevent the car from speeding. On the other hand, the NTSB's preliminary report didn't explain why the collision occurred or why the automatic emergency braking system failed to apply the car's brakes to avoid the collision.

Granted, this is a preliminary report and a final verdict is far from being released, but the fact that the vehicle was speeding while in Autopilot contradicts Tesla's initial claims that it was the driver's fault. This is especially true given that the company openly admitted the autonomous system didn't "see" the truck.

As a brief reminder, the accident occurred when a semi truck drove across the highway perpendicular to the oncoming Model S, which struck and passed underneath the trailer. The sedan collided with a utility pole after coasting for some 300 feet before coming to a complete stop. A team of five investigators conducted the on-scene investigation and is still collecting data from the vehicle to further analyze the crash.


Why It Matters

As the Model S crash debate continues, it's becoming increasingly clear that autonomous driving is painfully difficult to implement. The scenario is that much more difficult to analyze since Tesla is the only automaker to offer features such as automatic braking and adaptive cruise control (among many others) in a single package. As it has since its inception, Musk's company is trying to revolutionize the industry, which usually comes at a big cost. In this case, Tesla has to overcome the shortcomings of a system that was bound to fail simply because the human behind the steering wheel is still responsible for controlling the vehicle. Tesla also has to deal with widespread criticism from entities that went as far as to label the company irresponsible for marketing unproven technology.

I'm not trying to defend Tesla here, but this is the first death involving Autopilot in more than 130 million miles of total owner use. Compare that with the number of fatalities coming from various faults in Toyota or GM vehicles...

The truth is that any technology will behave differently in the real world compared to factory testing conditions, and issues are always to be expected. Everything becomes a major problem, however, when fatalities start to happen. Things will probably settle down in due time and Tesla will bring several improvements to its Autopilot feature, but all of this will be heavily influenced by the final report on this unfortunate crash. All we can do right now is wait for the NTSB to finalize its investigation.