
What’s in a Name? For Tesla’s Autopilot, Plenty

Photo credit: Car and Driver

From Car and Driver

Calling Tesla’s business practices “deceptive and unfair,” two prominent consumer organizations have asked the Federal Trade Commission to investigate Tesla Motors and its use of the Autopilot brand name for its advanced driver-assist feature.

The Center for Auto Safety and Consumer Watchdog say the name exaggerates the system’s capabilities and that resulting consumer confusion between Autopilot and an actual autopilot capable of assuming full responsibility for vehicle operations could result in deadly crashes.

“The marketing and advertising practices of Tesla, combined with Elon Musk’s public statements, have made it reasonable for Tesla owners to believe, and act on that belief, that a Tesla with Autopilot is an autonomous vehicle capable of ‘self-driving,’ ” executives from the two organizations wrote to the FTC on Wednesday.


Tesla’s Autopilot, of course, is not a self-driving system. Although drivers can remove their hands from the wheel for brief periods, they retain full responsibility for vehicle operation even when Autopilot is activated. Nonetheless, there is no shortage of evidence of drivers treating Autopilot as a self-driving system, from video stunts posted online to tragic fatal crashes on the highways.

An FTC spokesperson confirmed that the federal agency had received the letter but did not say whether it would initiate an investigation. As a matter of practice, the FTC doesn’t discuss potential or ongoing cases.

Advanced driver-assist features are becoming more widely available on vehicles, and self-driving systems may be available for consumer purchase in a matter of years. The new terms devised to describe these technologies (“autonomous,” “automated,” “self-driving,” “semi-autonomous,” “driverless”) often fail to capture the nuanced differences among them. And that’s before manufacturer-specific brands such as Tesla’s Autopilot or Cadillac’s Super Cruise are added to the jargon.

Tesla disagrees with the notion that consumers are misled or befuddled by Autopilot’s capabilities and limitations. “The feedback that we get from our customers shows that they have a very clear understanding of what Autopilot is, how to properly use it, and what features it consists of,” a company spokesperson said.

Others are less certain about that understanding. In its investigation of a fatal Florida crash in 2016 involving a Tesla Model S operating with Autopilot enabled, the National Transportation Safety Board found that the driver, Joshua Brown, had overrelied on the Autopilot system and lacked understanding of its limitations. The NTSB recommended manufacturers add system safeguards to limit the use of automated-vehicle control systems to the conditions for which they were designed.

Following that crash, Tesla made changes to the way Autopilot works, including shortening the time frame before audible and visual warnings are delivered to drivers who remove their hands from the wheel. If drivers repeatedly ignore those messages, they can be locked out from Autopilot during that trip.

While the NTSB is charged with recommending that manufacturers implement such protections, Consumer Watchdog and the Center for Auto Safety intend to ensure there’s similar scrutiny on the consumer-facing side of the government. They say that Tesla’s marketing efforts violate Section 5 of the Federal Trade Commission Act because they “are likely to deceive even diligent consumers, who would act reasonably in believing them, and are likely to use Autopilot differently than they would if Tesla employed more honest and transparent advertising strategies.”

“Tesla has repeatedly exaggerated the autonomous capabilities of its Autopilot technology, boosting sales at the expense of consumer safety,” said Jason Levine, executive director of the Center for Auto Safety. “The FTC must step in and expose this charade before more Americans are injured or killed.”

This is not the first criticism of the Autopilot name. In October 2016, Germany’s Federal Motor Transport Authority wrote to Tesla chief executive officer Elon Musk asking that the company no longer use the “misleading term” for its driver-assistance system.

Meanwhile, the NTSB is investigating another fatal crash, one that occurred in March involving a Tesla Model X operating with Autopilot enabled. That probe has grown contentious; the NTSB revoked Tesla’s status as a party to the investigation after the company released certain details about the crash.

These deadly crashes, along with one involving an Uber self-driving vehicle that struck and killed a pedestrian in March, have eroded some of the trust automakers had begun to build with consumers surrounding these fledgling technologies. It’s unclear whether consumers are conflating driver-assist features with self-driving ones, but 73 percent of those surveyed say they would be “too afraid” to ride in a fully self-driving vehicle, according to an AAA survey released this week.

That’s a sharp rise from the 63 percent who expressed the same fear when the survey was last conducted in late 2017. Greg Brannon, AAA’s director of automotive engineering and industry relations, suggests broad confusion over the naming conventions and different levels of automation could be contributing to consumers’ fears.

“There are sometimes dozens of different marketing names for today’s safety systems,” he said. “Learning how to operate a vehicle equipped with semi-autonomous technology is challenging enough without having to decipher the equipment list and the corresponding level of autonomy.”

With a nudge from two consumer organizations, we’ll soon learn if the FTC agrees, and if so, whether it will demand more clarity.
