People are idiots. That much we knew. But now Tesla is realizing that man’s reckless nature is getting in the way of the company’s advancements in self-maneuvering vehicles.
It is tweaking its “Autopilot” feature to make it harder for people to abuse it, putting themselves, their passengers, and other drivers in danger.
Tesla introduced Autopilot to its later Model S sedans and Model X SUVs in October. The feature enables the vehicle to speed up, slow down and change lanes all on its own. In cities, it can also find parking and ease itself into a spot. Drivers quickly started posting YouTube videos of themselves putting Autopilot to the test, too often with dangerous and scary results.
One man even turned on the feature and then climbed into the backseat of his car.
Tesla CEO Elon Musk has stressed that Autopilot is not the same as having a self-driving car and that lane and speed features are intended for use on highways.
In October, Tesla’s director of communications, Khobi Brooklyn, said, “We’ve been very clear with our customers what the intention of these features are, and we trust our customers and we expect them to be responsible.”
Apparently that trust was grossly misplaced. The “clear” communication hasn’t stopped people from using the feature on off-ramps, parkways and suburban streets, and broadcasting their sometimes harrowing experiences.
(Other videos have shown the feature working perfectly, in one case appearing to prevent a potential crash. Even in the best-case scenario, however, we can’t advocate ever taking your hands off the wheel to film.)
In an earnings call last week, Musk addressed the misuse and said the company is compelled to do something to prevent it.
“There’s been some fairly crazy videos on YouTube … this is not good. And we will be putting some additional constraints on when Autopilot can be activated to minimize the possibility of people doing crazy things with it,” he told analysts.
In an interview with CBS News Tuesday, Tesla vice president of regulatory affairs, James Chen, confirmed the company’s intentions.
“We are looking at additional ways that we can make Autopilot a little bit more user-friendly and perhaps a little less capable of being abused,” he told CBS Radio Detroit’s Jeff Gilbert.
“We have had some people do some pretty scary things, and we are looking at ways to improve the system so that we don’t see this type of abuse.”