A measurable percentage of Tesla drivers still believe Tesla models with Autopilot are self-driving cars. At least that’s what a survey of Tesla Owners in Germany late last year indicated. As reported by Forbes, nearly one in 10 Tesla owners there thought Autopilot meant fully autonomous.
Although 93 percent of the 675 Tesla owners surveyed said that Autopilot didn’t mean fully autonomous, 7 percent answered that it did. Seven percent may not seem like a lot, but it takes just one driver putting too much trust in a driver-assist system to ruin your day.
Owner overconfidence in the Autopilot system has been a recurring issue for Tesla. In May of last year, a Tesla Model S with Autopilot activated plowed into the side of an 18-wheeler in Florida. Investigators said the driver never made any effort to avoid the crash when Autopilot apparently couldn’t identify the truck’s white trailer against a searingly bright sky. The driver, Joshua Brown, had previously posted YouTube videos of his car’s Autopilot system avoiding accidents.
In its official report on the Florida incident released in June of last year, NHTSA (National Highway Traffic Safety Administration) found that despite its failure to identify the trailer, Autopilot performed as designed, as did the Tesla’s automatic emergency braking (AEB) system. How can that be?
Simple: NHTSA classifies Autopilot as an advanced driver-assistance system. As such, it requires the driver’s full attention to monitor surrounding traffic and assume control when necessary. Furthermore, through the end of the 2016 model year, NHTSA categorized AEB as a rear-end collision-avoidance technology. It isn’t engineered to perform reliably in any crash situation other than a rear-end collision.
Last March, the driver of a Tesla Model S hit a barrier on a Dallas, Texas, highway. Uninjured, his immediate response was to blame Autopilot. A dashcam video from the car immediately behind him captured the incident. A barrier in what appeared to be a construction zone separated opposing traffic, with two lanes on each side. An abrupt lane shift to the right included a shift in the barrier. Because its cameras were reading the original painted lane lines (which continued straight) to orient itself within its lane, the Tesla failed to make the adjustment and bounced off the wall. That it was raining certainly didn’t help.
Actually, we came away from watching the video convinced that Autopilot, for the most part, did its job. It was the driver who didn’t do his. Sure, the system followed the lane markers rather than the contour of the barrier, but that’s exactly where the human driver is supposed to come in. Once the car bounced off the barrier, the autonomous systems held it to the appropriate lane; it never came close to swerving into the vehicle on its right.
What it means to you: In some ways, the seven percent of Tesla owners who believe they have a fully autonomous car shouldn’t have to shoulder all the blame. Promises on the Tesla website are expansive, possibly helping to perpetuate the idea that self-driving cars are indeed here. If your luck holds, the Tesla models you encounter will be piloted by the other 93 percent.