As if the reported crashes involving Tesla’s Autopilot system in the U.S. weren’t bad enough, the American electric car brand has admitted that one of its cars in China got into an accident while in Autopilot mode, adding another layer of scrutiny to the controversial driver-assist feature.

The owner of the car, 33-year-old programmer Luo Zhen, spoke with Reuters about the incident, which happened when the Model S he was driving sideswiped a stopped car along the side of the road. According to Luo, Tesla’s sales staff was at fault for the incident because they overplayed the system’s capabilities, describing it with a Chinese phrase that translates to “self-driving.” On top of that, Luo added that when the system was being demonstrated to him, the salesperson took his hands off the steering wheel and his foot off the accelerator.

Reuters also reached out to four other Tesla owners in other parts of China, who corroborated Luo’s story that Tesla’s front-line sales staff described the function in Chinese as “self-driving” and that demonstrators likewise took their hands off the wheel while showing off the feature. The news agency added that the term “zidong jiashi,” which literally translates to “self-driving,” appears several times on Tesla’s Chinese portal.

Not surprisingly, a Tesla spokesperson disputed those claims, saying that the company has “never described Autopilot as an autonomous technology or a 'self-driving car,' and any third-party descriptions to this effect are not accurate.” The automaker also put the blame on Luo for not following the guidelines laid out by the system, despite Luo’s claims of being misinformed by the company’s sales staff.

Fortunately, the accident resulted in no injuries to any of the parties involved. That said, don’t expect this issue to die down anytime soon, especially in a tightly controlled market like China, which is in the process of drafting its own policy on the technology.


Why it matters

In the short term, this is another body blow to the reputation of Tesla’s Autopilot system. It’s bad enough that the previous cases happened, but now that a customer in the world’s biggest auto market has suffered the same fate, it could become more difficult for Tesla to convince the notoriously prickly Chinese customer base to buy its cars. It will turn into a real problem for Tesla if more owners in China get into crashes or, worse, get killed because of the Autopilot system. It won’t matter whose fault it is, because you can be sure the Chinese government will be looking very closely at the issue.

In some ways, it’s probably not a surprise that the Chinese government is still formulating its policy toward the technology. A lot can change between now and whenever those policies are passed, and if more incidents like the one involving Luo Zhen take place in China, they could push the government to impose stricter regulations on the technology.

Tesla needs to be wary of that because the Chinese market, as big as it is, has been a difficult one for the company to make headway in, first due to distribution issues and widespread concerns about charging, and now the looming threat of regulations that could even lead to the technology being banned outright. It sounds silly now, but it’s a possibility that Tesla should take very seriously.

In the long term, I think this incident could have positive effects for the company, at least in forcing it to recognize that these accidents and crashes aren’t solely the fault of the drivers, as Tesla repeatedly claims. The more incidents like this are reported, the more pressure is put on Tesla to improve Autopilot into what the company always intended it to be. If that’s not happening yet, it will at some point.


Read our full review of the Tesla Model S here.