Cadillac, under the direction of General Motors, introduced the first enclosed car in 1910; Cadillac debuted the electric starter motor that replaced hand-cranking in 1912; Cadillac then built the first mass-produced V-8 in 1915; and in 1940, GM introduced the first viable automatic transmission. Now in 2017, GM might be making another massive industry advancement that will shape the automotive landscape for the next century, if not beyond. GM says it will take full responsibility if its vehicles crash during autonomous driving.

The news comes from GM’s head of innovation, Warwick Stirling, who spoke with CarAdvice about autonomous driving and the automaker’s progress on Cadillac’s “Super Cruise” system. On the question of liability, Stirling said, “If the driver is not driving, the driver is not liable. The car is driving.” That makes GM the first major automaker to publicly accept responsibility for crashes that occur while one of its vehicles is driving itself. Volvo has reportedly made similar statements, though the Chinese-owned automaker pales in comparison to GM in size, global reach, and influence. Stirling’s remarks bring a first measure of clarity to the convoluted subject of liability, and other automakers are likely to follow. However, he points out that GM cannot take responsibility for all self-driving vehicle crashes. Read his explanation below.


It Boils Down to Defining Responsibilities

Warwick Stirling’s comments on GM’s acceptance of liability for its self-driving vehicles’ mistakes come with a few caveats, but understandably so. According to Stirling, the automaker should accept responsibility for any wrongdoing only when a vehicle is operating at Level 3, 4, or 5 autonomy and has assumed complete control of the driving.

Cadillac’s new self-driving system, Super Cruise, is currently rated at Level 2, meaning it can simultaneously take over both steering and speed control, but the driver must remain alert and be ready to reassume control of the car. “In a Super Cruise situation, because the driver is still in the driver’s seat, and they are supposed to be driving, and the car is helping them, the driver is still liable,” he said in the interview. Tesla’s Autopilot system also falls into this category, along with a handful of other automakers’ advanced driving aids.

The debate over liability has been ongoing but has gained momentum as automakers like Tesla and Cadillac roll out semi-autonomous driving capabilities. The issue has even been the subject of an official investigation: in January 2017, the National Highway Traffic Safety Administration cleared Tesla after a 2016 crash in which the driver died after ignoring audible and visual warnings to reassume control of the vehicle.

However, the situation will inevitably become muddled when the driver completely transfers driving responsibilities to the vehicle in Level 3 and higher systems. At that point, the vehicle is supposed to have complete control, allowing the driver to disengage from situational awareness. The question arises: if a vehicle operating in a Level 3 or higher mode causes a crash, who is at fault? Should the vehicle’s autonomous system take the blame, or is it ultimately the driver’s responsibility to maintain the vehicle’s safe operation?

That’s the conundrum. Thankfully, Stirling has brought some clarity and logic to the divisive topic.

“In Level 4, there’s likely to be no steering wheel, no pedal, you’re not driving so you’re not liable,” he said. “A combination of the fleet owner, OEM, and the service provider will cover the insurance. It’s going to be a capital liability; it’s going to be a complex space.”

Logically, that makes perfect sense, right? If a vehicle is promised to be fully autonomous and allows the driver to disengage from actively paying attention to the road and surroundings, then it is the vehicle and its AI systems that should be held responsible for any driving infractions or crashes. As for current Level 2 and Level 3 systems, it is still the driver’s responsibility to ensure the car is driving properly and to maintain situational awareness.

But That’s Not the Only Problem

Yet another issue creeps up when considering self-learning software. Complex computer code with the ability to teach itself new skills has the potential to essentially learn bad habits. Will the vehicle’s manufacturer or software architect still be at fault, or will blame rest on the self-taught AI software itself? That opens a Pandora’s box of questions about whether self-learning computer systems are self-aware and how bad behavior can be corrected and/or punished. Needless to say, there are still debates to be had.

The Five Levels of Autonomous Vehicles

To better understand the roles and responsibilities human drivers have in autonomous vehicles, we can look to the Society of Automotive Engineers’ (SAE) definitions of each level, codified in the SAE J3016 standard.
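To make the split concrete, here is a minimal, hypothetical Python sketch (not GM or SAE code). The level names paraphrase the J3016 standard, and the liability mapping simply reflects the division Stirling describes above: the driver is responsible through Level 2, and the automaker, fleet owner, or service provider from Level 3 up.

```python
from enum import IntEnum


class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels referenced in the article.
    Names paraphrase the standard; Level 0 (no automation) is omitted
    to mirror the article's five-level framing."""
    DRIVER_ASSISTANCE = 1       # a single aid, e.g. adaptive cruise OR lane keeping
    PARTIAL_AUTOMATION = 2      # steering + speed together (Super Cruise, Autopilot); driver stays alert
    CONDITIONAL_AUTOMATION = 3  # car drives itself in defined conditions, may ask the driver to take over
    HIGH_AUTOMATION = 4         # no driver input needed within its operating domain
    FULL_AUTOMATION = 5         # no driver input needed anywhere


def liable_party(level: SAELevel) -> str:
    """Return who carries liability under the split Stirling describes:
    the driver up to Level 2, the automaker/fleet/service provider from Level 3 up."""
    if level >= SAELevel.CONDITIONAL_AUTOMATION:
        return "automaker / fleet owner / service provider"
    return "driver"


if __name__ == "__main__":
    for level in SAELevel:
        print(f"Level {level.value} ({level.name}): {liable_party(level)}")
```

Running the sketch prints the driver as the liable party for Levels 1 and 2 and shifts responsibility to the automaker, fleet owner, or service provider from Level 3 onward, which is the essence of GM’s position.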