This week in Massachusetts, a passing pedestrian filmed a driver in a Tesla dead asleep at the wheel while the car cruised down the highway. In response to the video, a Tesla spokesperson told Fox News, "Many of these videos appear to be dangerous pranks or hoaxes." The sleepyhead is suspected of using Tesla's driver-assistance technology, called Autopilot, but the driver has not been identified or interviewed.
Autopilot can steer, accelerate, and brake on its own, but it is not self-driving technology. Since Tesla first began rolling out the poorly named feature in 2014, drivers have repeatedly been caught napping while their cars drive. It happened in 2016, it happened in 2018, and now it has happened in 2019. YouTube is littered with other examples, including cases where the driver has literally climbed out of the seat and away from the controls. And that's to say nothing of the bagel-eating, coffee-sipping crashes.
Despite Tesla issuing numerous clarifications and warnings against such behavior, misinformed, stubborn, or simply Tesla-smitten customers continue to test the limits of the technology. As a reminder, Tesla specifically states that "current Autopilot features require active driver supervision and do not make the vehicle autonomous."
The most recent example of misuse comes from the Massachusetts Turnpike. Dakota Randall, a passerby, filmed the Tesla doing 50-60 mph at about 3 p.m. and posted the video to Twitter with the caption, "Some guy literally asleep at the wheel on the Mass Pike (great place for it). Teslas are sick, I guess?" Randall did not know the driver, follow the car, or see how the situation ended, so nothing further is known about the driver or the passenger, who was also asleep. Tesla caught wind of the video and issued this statement:
"Many of these videos appear to be dangerous pranks or hoaxes. Our driver-monitoring system repeatedly reminds drivers to remain engaged and prohibits the use of Autopilot when warnings are ignored. At highway speeds, drivers typically receive warnings every 30 seconds or less if their hands aren't detected on the wheel," the spokesperson said. "Tesla owners have driven billions of miles using Autopilot, and data from our quarterly Vehicle Safety Report indicates that drivers using Autopilot experience fewer accidents than those operating without assistance."
In fairness, the system is supposed to warn and eventually lock drivers out of Autopilot if it detects that safety warnings are being ignored. Still, it's possible this driver had only just fallen asleep and the system had not yet detected it, or the system could have been malfunctioning. It's impossible to say what was actually going on.
Watch the video below and decide for yourself.