Update: An owner who recently took delivery of a Tesla Model Y posted on social media on May 27 that his new car came with a driver monitoring camera enabled. In photos he shared of the software update release notes, text on the car's screen reads, "The cabin camera above your rearview mirror can now detect and alert driver inattentiveness while Autopilot is engaged. Camera data does not leave the car itself, which means the system cannot save or transmit information unless data sharing is enabled."

CR’s testing experts believe that, if confirmed, this would be a big step forward. "If this new system proves effective, it could be a major improvement for safety," says Jake Fisher, senior director of auto testing at CR. "This isn’t just about preventing abuse; it has the potential to save lives by preventing driver distraction. We hope Tesla rolls this out to other cars soon, and look forward to evaluating the update."

We will continue to report on this story as we learn more about the new driver monitoring system, and we plan to evaluate it as part of our ongoing tests of driving assistance systems.

This article was originally published April 22, 2021.


Consumer Reports engineers easily tricked our Tesla Model Y this week so that it could drive on Autopilot, the automaker’s driver assistance feature, without anyone in the driver’s seat—a scenario that would present extreme danger if it were repeated on public roads. Over several trips across our half-mile closed test track, our Model Y automatically steered along painted lane lines, but the system did not send out a warning or indicate in any way that the driver’s seat was empty.

"In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all," says Jake Fisher, CR’s senior director of auto testing, who conducted the experiment. "Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road."

Our demonstration comes as federal and local investigators continue to probe the cause of a fatal crash Saturday in Texas in which an apparently driverless 2019 Tesla Model S struck a tree, killing the vehicle's two occupants. Harris County Precinct 4 Constable Mark Herman, who was on the scene of the crash, told CR that he's almost certain that no one was in the driver's seat when the vehicle crashed. (The Model S in the crash and our Model Y are different models, but they both have Autopilot.)


We tried to reach Tesla to ask about the Texas crash but did not hear back. Tesla CEO Elon Musk tweeted Monday evening that data logs recovered from the crashed Model S "so far show Autopilot was not enabled," and he suggested that it would not be possible to activate Autopilot on the road where the crash took place because of the lack of painted lane lines. The National Highway Traffic Safety Administration and the National Transportation Safety Board are investigating the crash, which occurred on a winding road in Spring, Texas, outside of Houston.

CR wanted to see whether we could prompt our own Tesla to drive down the road without anyone in the driver's seat. So Fisher and Kelly Funkhouser, CR's program manager for vehicle interface testing, took our 2020 Tesla Model Y out on our test track. Funkhouser sat in the rear seat, and Fisher sat in the driver's seat on top of a buckled seat belt. (Autopilot will disengage if the driver's seat belt is unbuckled while the vehicle is in motion.)

How We Ran the Experiment

Fisher engaged Autopilot while the car was in motion on the track, then set the speed dial (on the right spoke of the steering wheel) to zero, which brought the car to a complete stop. Fisher next placed a small weighted chain on the steering wheel to simulate the weight of a driver's hand, and slid over into the front passenger seat without opening any of the vehicle's doors, because that would disengage Autopilot. Using the same steering wheel dial, which controls multiple functions in addition to Autopilot's speed, Fisher reached over and was able to accelerate the vehicle from a full stop. He stopped the vehicle by dialing the speed back down to zero.



"The car drove up and down the half-mile lane of our track, repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat," Fisher says. "It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient." 

Under no circumstances should anyone try this. "Let me be clear: Anyone who uses Autopilot on the road without someone in the driver seat is putting themselves and others in imminent danger," Fisher says.

We were able to perform this experiment because we have a private test track. We also had safety crews standing by, and at no time did the vehicle exceed 30 mph. Fisher and Funkhouser are trained test drivers who are extremely familiar with Autopilot. They each have evaluated multiple Tesla vehicles over tens of thousands of miles. The stretch of road they used on our track is specifically configured to evaluate driver assistance systems.

Making Sure Drivers Pay Attention

Activating Autopilot and the more advanced suite of driver assistance features that Tesla calls "Full Self-Driving" does not make the car self-driving. Truly self-driving cars don’t yet exist for consumers to buy.

Our evaluation does not provide specific insight into the Texas crash, but safety advocates and researchers at CR say that it does show that driver monitoring systems need to work harder to keep drivers from using the systems in foreseeably dangerous ways. CR and other safety advocates—including the Insurance Institute for Highway Safety—recommend that all vehicles that incorporate steering automation and adaptive cruise control also include systems to make sure drivers are present and looking at the road, like GM does with an infrared camera that’s part of its Super Cruise system.

"Autopilot makes mistakes, and when it encounters a situation that it cannot negotiate, it can immediately shut itself off," Fisher says. "If the driver isn’t ready to react quickly, it can end in a crash."

As is the case with many vehicles, the only way a Tesla determines whether a driver is present is by examining steering wheel inputs. If there is weight on the wheel, even if the driver’s hands are elsewhere, the vehicle assumes a driver is driving and paying attention.

Funkhouser says that it might be possible to abuse the active driving assistance systems of other manufacturers’ cars in the same way if they lack technology that monitors whether a driver is present and paying attention. "Even if the driver has a hand on the wheel, it doesn’t mean they are looking at the road," she says.

Possible Improvements

By comparison, BMW, Ford, GM, Subaru, and others use camera-based systems that can track the movements of a driver’s eyes and/or head position to ensure that they’re looking at the road. Some vehicles—including those equipped with GM’s Super Cruise—can automatically slow to a stop if they detect that drivers have ignored repeated warnings to look at the road.

At the very least, Fisher and Funkhouser say, Tesla could make Autopilot operation conditional on the weight sensor in the driver's seat confirming that a human is sitting behind the wheel. These sensors are already used for seat belt warnings and airbags, among other things, so it wouldn't be a major leap to program a vehicle to turn off features like cruise control if it senses that the driver's seat is empty, Funkhouser says.
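The interlock Funkhouser describes amounts to a simple gating rule: keep the driver assistance feature engaged only while every presence check passes. A minimal sketch of that logic in Python (the signal names and weight threshold below are illustrative assumptions, not Tesla's actual software, which is far more complex):

```python
# Hypothetical sketch of the seat-occupancy interlock described in the article.
# Signal names and the threshold are illustrative assumptions only.

OCCUPANT_WEIGHT_THRESHOLD_KG = 25.0  # assumed cutoff; the same seat sensor already feeds airbag logic

def autopilot_may_stay_engaged(seat_weight_kg: float,
                               seat_belt_buckled: bool,
                               wheel_input_detected: bool) -> bool:
    """Allow the assistance feature to remain active only if every presence check passes."""
    driver_seated = seat_weight_kg >= OCCUPANT_WEIGHT_THRESHOLD_KG
    # Per the article, today only wheel input and a buckled belt are checked;
    # adding the seat-weight condition closes the empty-seat loophole.
    return driver_seated and seat_belt_buckled and wheel_input_detected
```

Under this rule, the scenario from CR's test track (belt buckled, chain on the wheel, but an empty seat) would disengage the system, because the seat-weight check fails.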

Some Tesla vehicles do monitor drivers, but not in real time. Instead, there’s a driver-facing camera located above the rearview mirror in Model 3 and Model Y vehicles—which the automaker calls a "cabin camera." It can capture and share a video clip of the moments before a crash or automatic emergency braking (AEB) activation to help the automaker "develop future safety features and software enhancements," according to Tesla’s website. The Model S, Tesla’s first sedan, does not have an in-car camera.

Driver monitoring systems will be part of the requirements for Europe's Euro NCAP automotive safety program as of 2023. As active driving assistance systems become increasingly automated, NHTSA should take a similar step, says William Wallace, manager of safety policy at CR.

"These systems come with a real risk of people checking out from the driving task," he says. "Fortunately, the technology exists to make sure their eyes are on the road, and it’s improving and becoming more prevalent. NHTSA should ensure that every car with an active driving assistance system comes with this key safeguard standard."

Fisher says it’s unfortunate that Tesla hasn’t adopted more effective driver monitoring, as the vehicles perform well in many of our performance tests. "They have changed the EV market and made the idea of owning an EV far more attractive than ever before," he says. "But they seem to be using their customers as development engineers as they work on self-driving technologies, and they need to do a better job of keeping them safe."