Business Insider/Corey Protin
Today's self-driving cars rely on spinning sensors called lidar that can cost more than $10,000 each. But it took Jonathan Petit just $43 and a laser pointer to confuse and defeat them.
"Anybody can go online and get access to this, buy it really quickly, and just assemble it, and there you go, you have a device that can spoof lidar," Petit, a cybersecurity expert, told Business Insider.
Google, Tesla, and major automakers are racing to build fully autonomous cars, promising a future where many people won't need to own a vehicle, where the young, old, and disabled can get around more easily, and where the way we live is transformed. One day these cars could dramatically reduce the roughly 30,000 annual deaths from crashes.
But until we get there, carmakers have to ensure clever hackers ― and those less benevolent than Petit ― can't cause the cars to go haywire.
Tricking the sensors
Two lidar systems on top of a Ford. Screenshot via YouTube
When Petit was growing up in France in the 1980s, cars were simpler machines, disconnected from the outside world.
Petit's parents owned a restaurant in France, and whenever they had a good season, his dad would use the opportunity to buy a car. One was a Citroen DS that bounced on its hydraulic suspension, a technologically advanced feature at the time.
"When you think about the old times, there's nostalgia about it," he said. "You always think about, 'aw yeah that was a cool time and you can feel the road.' Yeah, I think that was nice. I really loved that car."
Despite growing up around cars, it wasn't until much later that Petit started devoting all his time to making cars resistant to malicious actors.
Petit began extensively studying automotive cybersecurity in 2007 as a PhD student at Paul Sabatier University in France. But it was during his postdoctoral research at UC Berkeley, working with its Partners for Advanced Transportation Technology program, that he became more interested in the hacking risks specific to self-driving cars.
Google's self-driving car. Business Insider
Then, in 2015, two hackers ― Charlie Miller and Chris Valasek ― took control of a Jeep Cherokee's UConnect system ― an Internet-connected computer feature that controls everything from your ability to make calls to the car's navigation system.
From a couch 10 miles west of the highway, the pair were able to toy with the car's air conditioning, blast the radio, activate the windshield wipers, and ultimately cut its transmission. Fiat Chrysler Automobiles recalled 1.4 million vehicles to install anti-hacking software following the demo.
As Petit puts it, the demonstration highlighted the importance of automotive security: now hackers could gain access without even leaving the couch. Prior hacking demonstrations had required researchers to be connected directly to the car's dashboard.
"When they did the hack remotely, that was like, 'wow, that's interesting.' Now it's not just looking at having physical access [to the car]," he said. "It's scary when you start to have remote attacks."

When Petit performed his attack on the lidar, he became one of the first researchers to show how easy it is to hack self-driving cars' sensors. He was able to trick a sensor into thinking objects were there when they weren't, and vice versa.
"So here, you can think that the potential consequence of an attack like this could be, I tried to crash you into a vehicle ahead of you because I'm telling you there is no object here," he said. "So I'm making you blind and now your system thinks it's free."
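The principle behind this kind of spoofing is simple: a lidar measures distance by timing how long a laser pulse takes to bounce back, so an attacker who replays a recorded pulse after a chosen delay can fake an echo at any distance, or drown out a real one. The sketch below is purely illustrative of that timing math, not Petit's actual rig; all names are hypothetical.

```python
# Illustrative sketch of lidar echo timing: a spoofer that replays a
# pulse after a chosen delay makes the sensor infer a phantom object.
# This models only the round-trip arithmetic, not real hardware.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def echo_delay(distance_m: float) -> float:
    """Round-trip time a lidar expects for an object at distance_m."""
    return 2.0 * distance_m / SPEED_OF_LIGHT

def perceived_distance(delay_s: float) -> float:
    """Distance the lidar infers from an echo arriving after delay_s."""
    return SPEED_OF_LIGHT * delay_s / 2.0

# Replaying the captured pulse with the delay for an object 20 m ahead
# makes the sensor "see" an obstacle that isn't there (~20.0 m).
phantom_delay = echo_delay(20.0)
print(perceived_distance(phantom_delay))
```

Because the sensor trusts the timing of whatever light comes back, it has no built-in way to distinguish a genuine reflection from a well-timed replay.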
But that kind of hack can have other consequences too. The car could see an obstacle that isn't there and change lanes to get away from it. That maneuver, designed to keep passengers safe, could disrupt traffic. It could also cause the car to go off course.
"So now you've changed the path of the vehicle by doing this, that's also an impact, which means that then the risk could be I'm sending you to [a] small street to stop you and rob you or steal the car," he said.
A self-driving Uber used for the Pittsburgh pilot. Uber
Petit not only tricked the lidar system self-driving cars use, but was also able to blind the cameras they rely on using different LED lights. If the car determines it can no longer operate safely because its cameras have been disabled, it could stop entirely, leading to those same kinds of problems.
Now, it's important to take these scenarios with a grain of salt. As Petit said, self-driving cars are built with redundant sensor systems, meaning they have multiple cameras and sensors in case one were to fail.
For example, the self-driving cars Uber is using for its Pittsburgh pilot have 20 cameras and several radar sensors to provide 360-degree coverage.
That means even if a hacker compromises one or even a few sensors on a self-driving car, the car may still be able to pull enough information from the ones that are operating effectively to continue driving safely.
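One common way to get that resilience, sketched hypothetically below, is to fuse range estimates from independent sensors so a single spoofed or failed reading can't dominate; a median, for instance, is unmoved by one wild outlier. This is an illustration of the redundancy idea, not any manufacturer's actual fusion logic.

```python
# Hypothetical illustration of sensor redundancy: fuse range estimates
# from several independent sensors and tolerate one spoofed or failed
# sensor by taking the median, which a single bad reading can't drag far.
from statistics import median

def fused_range(readings):
    """Combine per-sensor range estimates (meters); None = sensor failed.
    Returns None when too few healthy sensors remain to trust a fusion,
    at which point the vehicle should fall back to a safe stop."""
    valid = [r for r in readings.values() if r is not None]
    if len(valid) < 2:
        return None
    return median(valid)

# Lidar spoofed to report a phantom object 5 m ahead; camera and radar
# still agree the road is clear out to ~80 m. Fused estimate: 78.0 m.
print(fused_range({"lidar": 5.0, "camera": 80.0, "radar": 78.0}))
```

The design choice here mirrors Petit's point: the system stays safe only while a majority of its sensors remain trustworthy.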
But the danger also depends on when an attack occurs. For example, a self-driving car might be programmed to rely more heavily on its lidar system at night, since cameras can't see as well in the dark, Petit said. If a hacker were then to spoof the lidar, the car wouldn't have as much data to fall back on, which could put it and its passengers in a dangerous situation.
"Even if you're thinking with just my sensors, I'm secure. This is not true," he said.
Exploiting communication channels
US DOT
Petit has conducted other research highlighting how vulnerable self-driving cars are to hacks even beyond sensor vulnerabilities.
In 2011, when Petit was a senior researcher at the University of Twente in the Netherlands, he set up equipment that could pick up the signals cars were sending one another and relay them to a laptop. These "sniffing stations" were able to locate a security vehicle within a given residential or business zone on campus with 78% accuracy. Petit could then narrow that down to individual roads with 40% accuracy.
Those sniffing stations were able to track cars by taking advantage of vehicle-to-vehicle (V2V) communication.
V2V communication is something automakers are already starting to use in cars today, such as the 2017 Mercedes E-Class. The channel allows cars to talk to other cars on the road to relay data on traffic flow, accidents ahead, or poor weather. That data can then be used to alert the driver so she can change course if things look bad up ahead.
Some automakers are exploring using V2V for self-driving cars because the cars could use that data to navigate more safely, rather than relying exclusively on their sensors to see obstacles like a traffic jam.
The cars won't send personally identifying information, but the data, like GPS locations, is sent to other vehicles unencrypted.
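That lack of encryption is what makes Petit-style sniffing stations possible: a passive listener needs no key, only to parse the plaintext beacons and group them by sender. The sketch below illustrates the idea with an invented message format; real V2V messages are more complex, and the sender IDs are pseudonyms rather than license plates.

```python
# Sketch of why unencrypted V2V beacons enable tracking: group plaintext
# position reports by sender ID to reconstruct a vehicle's path.
# The "id,lat,lon" message format here is invented for illustration.
from collections import defaultdict

def parse_beacon(raw):
    """Parse a hypothetical 'id,lat,lon' plaintext beacon."""
    sender, lat, lon = raw.split(",")
    return sender, float(lat), float(lon)

def build_tracks(beacons):
    """Map each sender ID to the ordered list of positions it broadcast."""
    tracks = defaultdict(list)
    for raw in beacons:
        sender, lat, lon = parse_beacon(raw)
        tracks[sender].append((lat, lon))
    return dict(tracks)

# Two beacons captured by a roadside sniffing station:
captured = ["car42,52.2215,6.8937", "car42,52.2217,6.8941"]
print(build_tracks(captured)["car42"])
```

Even with rotating pseudonyms, correlating positions over time can re-identify a vehicle, which is why the accuracy figures from Petit's campus experiment matter.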
But as Petit showed by setting up sniffing stations, hackers could track the data being sent to other vehicles.