BadFusion: 2D-Oriented Backdoor Attacks against 3D Object Detection
arXiv (2024)
Abstract
3D object detection plays an important role in autonomous driving; however,
its vulnerability to backdoor attacks has become evident. By injecting
“triggers” to poison the training dataset, backdoor attacks manipulate the
detector's prediction for inputs containing these triggers. Existing backdoor
attacks against 3D object detection primarily poison 3D LiDAR signals, where
large-sized 3D triggers are injected to ensure their visibility within the
sparse 3D space, rendering them easy to detect and impractical in real-world
scenarios.
In this paper, we delve into the robustness of 3D object detection, exploring
a new backdoor attack surface through 2D cameras. Given the prevalent adoption
of camera and LiDAR signal fusion for high-fidelity 3D perception, we
investigate the latent potential of camera signals to disrupt the process.
Although the dense nature of camera signals enables the use of nearly
imperceptible small-sized triggers to mislead 2D object detection, realizing
2D-oriented backdoor attacks against 3D object detection is non-trivial. The
primary challenge emerges from the fusion process that transforms camera
signals into a 3D space, which weakens the association between the 2D trigger and
the target output. To tackle this issue, we propose an innovative 2D-oriented
backdoor attack against LiDAR-camera fusion methods for 3D object detection,
named BadFusion, which preserves trigger effectiveness throughout the entire
fusion process. The evaluation demonstrates the effectiveness of BadFusion,
achieving a significantly higher attack success rate compared to existing
2D-oriented attacks.
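To make the poisoning step concrete, the sketch below shows the generic form of 2D trigger injection that backdoor attacks rely on: a small pixel patch is stamped onto a training image and the sample is relabeled to the attacker's target class. This is a minimal illustration of the general technique, not the BadFusion method itself; the function name, patch size, and placement are assumptions for the example.

```python
import numpy as np

def inject_2d_trigger(image, trigger, target_label, corner=(0, 0)):
    """Stamp a small pixel patch (the 'trigger') onto an image and
    pair it with the attacker's target label, producing one poisoned
    training sample. (Illustrative sketch, not the paper's method.)"""
    poisoned = image.copy()
    y, x = corner
    h, w = trigger.shape[:2]
    poisoned[y:y + h, x:x + w] = trigger
    return poisoned, target_label

# Example: poison a 32x32 RGB image with a 3x3 white patch in the
# top-left corner and flip its label to class 1.
clean = np.zeros((32, 32, 3), dtype=np.uint8)
patch = np.full((3, 3, 3), 255, dtype=np.uint8)
poisoned, new_label = inject_2d_trigger(clean, patch, target_label=1)
```

Because camera images are dense, such a patch can be only a few pixels wide and still be learned as a trigger, which is the property the abstract contrasts with the large 3D triggers needed in sparse LiDAR point clouds.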