Hiding from thermal imaging pedestrian detectors in the physical world

Neurocomputing (2024)

Abstract
Thermal imaging detection has been applied in many scenarios. However, its security has not been fully explored. We propose a physical attack method with small bulbs on a board to fool thermal imaging pedestrian detectors. We first designed an optimized Gaussian-function-based patch that caused the average precision (AP) of YOLOv3 to drop by 64.12% in the digital world, while a patch with randomly placed Gaussian functions and a blank patch caused the AP to drop by only 33.01% and 29.69%, respectively. We then manufactured a physical board and attacked YOLOv3 in the real world. In recorded videos, an optimized physical board caused the AP of YOLOv3 to drop by 34.48%, while a board with randomly placed bulbs and a blank board caused the AP to drop by only 17.06% and 14.91%, respectively. With the ensemble attack techniques, the designed physical board attack had good transferability to unseen CNN-based detectors. Furthermore, we successfully evaded the pedestrian detectors working on both visible light images and thermal images by covering printed adversarial paper on the manufactured board. Finally, we tested five typical methods to defend our bulb-based attack but achieved limited success. The results indicated the effectiveness of the attack method.
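The abstract describes modeling each heat-emitting bulb on the board as a 2D Gaussian intensity blob, then optimizing the blob placement to degrade the detector. A minimal sketch of rendering such a patch as a superposition of Gaussians is shown below; the function name, parameters, and clipping behavior are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def render_gaussian_patch(centers, sigma=3.0, size=64, amplitude=1.0):
    """Render a thermal patch as a sum of 2D Gaussian 'bulb' blobs
    on a size x size grid (hypothetical sketch, not the paper's code)."""
    ys, xs = np.mgrid[0:size, 0:size]
    patch = np.zeros((size, size), dtype=np.float64)
    for cy, cx in centers:
        # Each bulb contributes an isotropic Gaussian intensity profile.
        patch += amplitude * np.exp(
            -((ys - cy) ** 2 + (xs - cx) ** 2) / (2.0 * sigma ** 2)
        )
    # Clip to simulate saturation of the simulated thermal intensity.
    return np.clip(patch, 0.0, 1.0)

# Example: a patch with three simulated bulbs.
patch = render_gaussian_patch([(16, 16), (32, 48), (50, 20)])
```

In the attack described, the bulb centers would be treated as optimization variables, updated to maximize the detector's loss; the random-placement and blank-board baselines in the abstract correspond to unoptimized choices of these centers.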
Keywords
Adversarial example, Physical adversarial attack, Object detection, Thermal imaging, Model ensemble, Adversarial defense