Just Another Excuse, or Did Tesla’s Autopilot Really Hit Three Cars, One of Which Was a Police Cruiser?
The man had the audacity to blame Tesla for it!

by Sidd Dhimaan
When you have a technology with a tainted reputation in your car, there’s a high chance it’ll be blamed for something it didn’t do. That is exactly what Tesla has been going through with Autopilot. A Model 3 owner crashed into cop cars and blamed Autopilot, saying he was checking on his dog on the rear bench. Even if the vehicle was on Autopilot, what happened to being ready to take over when needed?
Tesla Autopilot - What You Blame When You Don’t Want to Admit Fault
As it turns out, two police cruisers were on duty assisting a disabled car parked in the left lane of I-95. The officers parked their cars behind the stranded vehicle and had their emergency lights on to make sure they were visible. In fact, they even lit an additional flare behind them to be visible from a long distance. As they were waiting for a tow truck to arrive, a white Tesla Model 3 came crashing into the vehicle waiting to be towed and dragged it for several hundred feet before coming to a halt.
The Model 3 driver, Bruno Alves, stated that his vehicle was on Autopilot and explained that he was checking on his dog in the back seat prior to the collision.
Fortunately, no one was injured, but the driver was charged with “Reckless Driving and Reckless Endangerment.” It didn’t end there. A lawyer representing the driver is blaming the crash on Tesla’s misleading marketing. Mark Sherman, the Stamford criminal defense attorney representing the 33-year-old driver, said his client was relying on Tesla’s Autopilot function when the crash occurred.
I personally think the summons makes complete sense because, regardless of the technology involved, the driver must stay alert. Tesla itself says that until cars achieve Level 5 autonomy, there will always be a need for a human behind the wheel to make the important decisions and be ready to take over when needed. Yes, there are semi-autonomous systems out there, but they exist to help you, not to do the work for you. It’s best to treat them as safety assists, because that’s what they really are.
Autopilot and Controversies Go Hand-In-Hand
Earlier this year, the driver of a Model S and his wife, the front passenger, put too much faith in the car’s Autopilot system. Autopilot warned the driver of an object directly ahead, but he did not react. His wife noticed at the very last moment, but by then it was far too late to do anything about it. The result was the Model S ramming into the rear of a stationary Nissan SUV at around 45 mph.
And, then there are cases where Autopilot has saved lives.
Cops in California arrested a 45-year-old man who was asleep in his Tesla Model S while it cruised down Highway 101.

The driver was inebriated, and Autopilot appeared to keep the car going once he stopped giving inputs. Although it was unclear whether the car was actually running on Autopilot, it took police several miles and seven minutes to bring the Model S to a stop. The driver, Alexander Samek, was detained after police spotted his gray Model S traveling at 70 mph southbound on Highway 101 at 3:30 a.m. and gave chase to stop it. But news like this barely sees the light of day.
It’s high time people accept that Autopilot is just an assistance system and that you cannot fully rely on it yet.
Level 5 autonomy is far away, and until then, people need to exercise caution. This dog-loving driver apparently crashed into the troopers because he relied on the technology blindly. When Tesla clearly states that you need to keep your hands on the steering wheel at all times, how can you hold Autopilot responsible for your own recklessness? I hope people start using this technology the way it was intended to be used. What are your thoughts on this incident and Autopilot technology as a whole? Share them with us in the comments section below.