Adversarial Perturbations of Physical Signals
CoRR (2024)
Abstract
We investigate the vulnerability of computer-vision-based signal classifiers
to adversarial perturbations of their inputs, where the signals and
perturbations are subject to physical constraints. We consider a scenario in
which a source and interferer emit signals that propagate as waves to a
detector, which attempts to classify the source by analyzing the spectrogram of
the signal it receives using a pre-trained neural network. By solving
PDE-constrained optimization problems, we construct interfering signals that
cause the detector to misclassify the source even though the perturbations to
the spectrogram of the received signal are nearly imperceptible. Though such
problems can have millions of decision variables, we introduce methods to solve
them efficiently. Our experiments demonstrate that one can compute effective
and physically realizable adversarial perturbations for a variety of machine
learning models under various physical conditions.
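The abstract describes constructing small interfering perturbations that flip a classifier's decision on a received signal's spectrogram. The paper's actual method solves PDE-constrained optimization problems; as a much simpler illustration of the underlying adversarial idea, the hypothetical sketch below runs projected gradient ascent under an l-infinity budget against a toy linear classifier on flattened spectrogram features (all names, sizes, and the classifier itself are illustrative assumptions, not the authors' setup).

```python
import numpy as np

# Hypothetical toy setup: a linear "classifier" over a flattened spectrogram.
# This stands in for the paper's pre-trained neural network purely for illustration.
rng = np.random.default_rng(0)
n = 64                       # number of flattened spectrogram features (assumed)
W = rng.normal(size=(2, n))  # weights for a 2-class linear model (assumed)
x = rng.normal(size=n)       # clean spectrogram features (synthetic)

def logits(z):
    """Class scores of the toy linear model."""
    return W @ z

true_class = int(np.argmax(logits(x)))
other = 1 - true_class

# Untargeted attack: increase (logit_other - logit_true) by projected
# gradient ascent under an l_inf budget eps, keeping the perturbation small
# (the analogue of the paper's "nearly imperceptible" spectrogram change).
# For a linear model the gradient is constant: the row difference of W.
eps, step, iters = 0.1, 0.02, 50
delta = np.zeros(n)
for _ in range(iters):
    grad = W[other] - W[true_class]            # d/dx of the attack objective
    delta = np.clip(delta + step * np.sign(grad), -eps, eps)  # l_inf projection

# The margin of the true class shrinks, even though ||delta||_inf <= eps.
margin_clean = logits(x)[true_class] - logits(x)[other]
margin_adv = logits(x + delta)[true_class] - logits(x + delta)[other]
```

The paper's setting is far harder than this sketch: the perturbation there must be a physically realizable wave emitted by an interferer, which is what introduces the PDE constraints and the millions of decision variables.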