Researchers from Ben-Gurion University of the Negev’s (BGU) Cyber Security Research Center have found that they can cause the autopilot on an autonomous vehicle to erroneously apply its brakes in response to “phantom” images projected on a road or billboard.

In a new paper, “Phantom of the ADAS,” the researchers demonstrated that autopilots and advanced driver-assistance systems (ADASs) in semi-autonomous or fully autonomous cars register depthless projections of objects (phantoms) as real objects. They show how attackers can exploit this perceptual flaw to manipulate the vehicle, potentially harming the driver or passengers, without any special expertise, using only a commercial drone and an inexpensive image projector.

While fully and semi-autonomous cars are already being deployed around the world, the vehicular communication systems that connect a car with other cars, pedestrians and surrounding infrastructure are lagging behind. According to the researchers, the lack of such systems creates a “validation gap” that prevents autonomous vehicles from validating their virtual perception with a third party, leaving them to rely solely on internal sensors.

In addition to causing the autopilot to apply the brakes, the researchers demonstrated that they can fool the ADAS into believing phantom traffic signs are real when the signs are projected for just 125 milliseconds inside advertisements on digital billboards.
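
To put that figure in perspective, a 125-millisecond flash spans only a few frames of a billboard’s video feed. The quick estimate below is illustrative only; the refresh rates are common display values, not figures from the paper.

```python
# Rough frame-count estimate for a 125 ms phantom flash at a few
# common display refresh rates (illustrative values, not from the paper).
FLASH_MS = 125

for fps in (24, 30, 60):
    frames = FLASH_MS / 1000 * fps
    print(f"{fps} fps: phantom visible for ~{frames:.1f} frames")
```

In other words, the phantom sign only needs to occupy a handful of consecutive frames of an ordinary advertisement.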

Lastly, they showed how fake lane markers projected onto a road by a projector-equipped drone can guide the autopilot into the opposite lane, and potentially into oncoming traffic.

“This type of attack is currently not being taken into consideration by the automobile industry. These are not bugs or poor coding errors, but fundamental flaws in object detectors that are not trained to distinguish between real and fake objects and use feature matching to detect visual objects,” says Ben Nassi, lead author and a Ph.D. student of Professor Yuval Elovici in BGU’s Department of Software and Information Systems Engineering and Cyber Security Research Center.

In practice, depthless objects projected onto a road are treated as real, even though the car’s depth sensors can distinguish 2D projections from 3D objects. The BGU researchers believe this is the result of a “better safe than sorry” policy that causes the car to treat any visually detected 2D object as real.
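
The contrast between the two policies can be made concrete with a small sketch. The code below is a hypothetical illustration, not any vendor’s actual logic; the `Detection` fields, function names and threshold are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "pedestrian" or "stop_sign"
    camera_conf: float  # object-detector confidence from the camera image
    has_depth: bool     # whether LIDAR/radar sees a matching 3D object

def should_brake(det: Detection, threshold: float = 0.7) -> bool:
    """'Better safe than sorry': a confident camera detection triggers
    braking even when the depth sensors report nothing at that spot."""
    return det.camera_conf >= threshold

def should_brake_with_depth_check(det: Detection, threshold: float = 0.7) -> bool:
    """Stricter policy: require the depth sensors to agree. This rejects
    a flat projection, but risks ignoring a real obstacle that the depth
    sensors missed."""
    return det.camera_conf >= threshold and det.has_depth

# A projected "phantom" pedestrian: high camera confidence, no depth return.
phantom = Detection("pedestrian", camera_conf=0.92, has_depth=False)
print(should_brake(phantom))                   # True  -> the car brakes
print(should_brake_with_depth_check(phantom))  # False -> phantom ignored
```

The first policy reflects the behavior the researchers describe: the camera detection alone is enough to trigger a response, which is exactly what a projected phantom exploits.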

To counter the attack, the researchers are developing a neural-network model that analyzes a detected object’s context, surface and reflected light, and is capable of detecting phantoms with high accuracy.
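
As a rough illustration of what such a model could look like, the sketch below builds a small multi-branch classifier, one branch per cue (context, surface, reflected light), fused into a single real-versus-phantom score. The framework (PyTorch), layer sizes and input shapes are assumptions for the sake of the example, not the researchers’ published architecture.

```python
import torch
import torch.nn as nn

class PhantomDetector(nn.Module):
    """Illustrative multi-branch classifier: one small CNN per cue
    (context, surface, reflected light), fused into a real/phantom score.
    Layer sizes are arbitrary, not the researchers' architecture."""

    def __init__(self):
        super().__init__()

        def branch() -> nn.Sequential:
            return nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )

        self.context = branch()  # wide crop around the detected object
        self.surface = branch()  # close crop of the object's surface
        self.light = branch()    # reflected-light crop of the same region
        self.head = nn.Sequential(
            nn.Linear(3 * 32, 64), nn.ReLU(),
            nn.Linear(64, 1), nn.Sigmoid(),  # P(object is real)
        )

    def forward(self, ctx, surf, light):
        feats = torch.cat(
            [self.context(ctx), self.surface(surf), self.light(light)], dim=1)
        return self.head(feats)

# Dummy forward pass on random 64x64 RGB crops.
model = PhantomDetector()
crop = torch.randn(1, 3, 64, 64)
print(model(crop, crop, crop))  # probability the detection is a real object
```

Each branch would receive a different view of the detected object, and the fused head scores how likely the detection is a real, physical object rather than a projection.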
