Invisible but Detected: Physical Adversarial Shadow Attack and Defense on LiDAR Object Detection

Authors: 

Ryunosuke Kobayashi, Waseda University; Kazuki Nomoto, Waseda University and Deloitte Tohmatsu Cyber LLC; Yuna Tanaka and Go Tsuruoka, Waseda University; Tatsuya Mori, Waseda University and NICT and RIKEN AIP

Abstract: 

This paper introduces "Shadow Hack," the first adversarial attack that exploits naturally occurring object shadows in LiDAR point clouds to target object detection models in autonomous vehicles. Although shadows are not part of the detection output, they implicitly influence object detection; Shadow Hack manipulates them by creating "Adversarial Shadows" with materials that are difficult for LiDAR to measure accurately, and optimizes the shadows' position and size to maximize misclassification by point-cloud-based object recognition models. In simulations, Shadow Hack achieves a 100% attack success rate at distances between 11 m and 21 m across multiple models. Our physical-world experiments validate these findings, demonstrating success rates of up to 100% at 10 m against PointPillars and 98% against SECOND-IoU, using mirror sheets that achieve a nearly 100% point cloud removal rate at distances of 1 to 14 m. We also propose "BB-Validator," a defense mechanism that achieves a 100% defense success rate while maintaining high object detection accuracy.
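The abstract describes shadowing as the removal of LiDAR returns inside a region, with the attack searching over shadow placement and size to maximize misclassification. A minimal sketch of that idea is shown below; the function names, the rectangular shadow model, and the `score_fn` stand-in for querying a detector (e.g., PointPillars) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def apply_adversarial_shadow(points, center, size):
    """Drop LiDAR returns inside a rectangular 'shadow' on the ground
    plane, mimicking a material that yields no measurable return.
    points: (N, 3) array of x, y, z coordinates."""
    dx = np.abs(points[:, 0] - center[0])
    dy = np.abs(points[:, 1] - center[1])
    inside = (dx < size[0] / 2) & (dy < size[1] / 2)
    return points[~inside]

def optimize_shadow(points, score_fn, centers, sizes):
    """Grid-search shadow position and size, keeping the configuration
    that maximizes score_fn -- a hypothetical stand-in for a model's
    misclassification score on the shadowed point cloud."""
    best_cfg, best_score = None, -np.inf
    for c in centers:
        for s in sizes:
            shadowed = apply_adversarial_shadow(points, c, s)
            score = score_fn(shadowed)
            if score > best_score:
                best_cfg, best_score = (c, s), score
    return best_cfg, best_score
```

In practice, the search would query a point-cloud detector with each shadowed scan and score how often the target object is misclassified; the grid search here is only one possible optimization strategy.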
