A team of researchers from the University of South Carolina, Zhejiang University in China, and Chinese security firm Qihoo 360 has found a potential fault in Tesla's Autopilot semi-autonomous driving system. The team was able to exploit weaknesses in the system that lead researcher and USC Professor Wenyuan Xu said "highly motivated people" could use "to cause personal damage or property damage."

According to Wired, Autopilot relies on three separate systems to detect objects around the car – radar, ultrasonic sensors, and cameras. To start, Xu's team mounted a $90,000 signal generator on a cart in front of a stationary Model S to simulate following another vehicle. After switching the rig on, the "car" vanished from the Tesla's sensors without warning. Obviously, $90,000 jammers are neither easy to acquire nor very portable.
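The principle behind that disappearing act is straightforward: radar only registers a target when the reflected echo stands well above the noise floor, and a powerful jammer raises that floor until the echo is buried. A minimal sketch of the idea (the threshold and power figures here are illustrative assumptions, not values from the researchers' work):

```python
# Hypothetical illustration of radar jamming: a target is "detected" only
# when the echo's signal-to-noise ratio clears a threshold. A jammer that
# raises the noise floor makes a real target vanish from the sensor.

def target_detected(echo_power_dbm: float, noise_floor_dbm: float,
                    snr_threshold_db: float = 10.0) -> bool:
    """Return True if the echo stands far enough above the noise floor."""
    return echo_power_dbm - noise_floor_dbm >= snr_threshold_db

# Normal conditions: a -60 dBm echo against a -90 dBm noise floor.
print(target_detected(-60.0, -90.0))  # True: target seen

# Jammer active: noise floor driven up to -55 dBm.
print(target_detected(-60.0, -55.0))  # False: target gone
```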

But while jamming the radar is a pricey proposition, the ultrasonic sensors are a far easier target, albeit at lower speeds. The ultrasonic sensors manage the Model S' self-parking and Summon features, but using equipment that costs as little as $40 – a function generator, an Arduino board, and an ultrasonic transducer – Xu's team manipulated the sensors to either see an object that wasn't there or cloak one that was.
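Both tricks exploit the same weakness: an ultrasonic parking sensor infers distance purely from how long an echo takes to return. An attacker's transducer can fire an early pulse to conjure a phantom obstacle, or drown out the genuine echo to cloak a real one. A sketch of the time-of-flight math (the timing figures are invented for illustration and are not the team's code):

```python
# Illustrative sketch of ultrasonic spoofing: distance is inferred from the
# round-trip time of an echo travelling at the speed of sound (~343 m/s).
# An attacker who injects an earlier echo makes an object appear closer.

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def echo_to_distance(round_trip_s: float) -> float:
    """Distance implied by a round-trip echo time (halved for one way)."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

real_echo = 0.0117  # genuine obstacle ~2 m away
fake_echo = 0.0029  # attacker fires early, implying ~0.5 m

print(round(echo_to_distance(real_echo), 2))  # ~2.01 m
print(round(echo_to_distance(fake_echo), 2))  # ~0.5 m: phantom obstacle
```

The sensor has no way to verify that a pulse it hears is its own reflection, which is exactly the gap the $40 rig exploits.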

The Tesla's camera systems were the most resilient to attack – the researchers shone lasers and LEDs at the cameras to try to blind them. Xu's team managed to kill a few pixels on the camera sensors, but when they tried to jam the camera, Autopilot simply shut down and warned the driver to take the wheel.

If none of these hacks sound practical, it's because they aren't, nor are they meant to be. Based on Prof. Xu's comments, the goal is less to break the system than to push Tesla to add safeguards to Autopilot.

"I don't want to send out a signal that the sky is falling, or that you shouldn't use Autopilot. These attacks actually require some skills," Xu told Wired. "[Tesla] need to think about adding detection mechanisms as well. If the noise is extremely high, or there's something abnormal, the radar should warn the central data processing system and say 'I'm not sure I'm working properly.'"

Tesla has studied Prof. Xu's work, telling Wired in a statement that it appreciates the team's efforts and adding, "We have reviewed these results with Wenyuan's team and have thus far not been able to reproduce any real-world cases that pose risk to Tesla drivers." Prof. Xu and her team will officially detail their findings at the DEFCON hacker conference later this week.
