DNNGuard: An Elastic Heterogeneous DNN Accelerator Architecture against Adversarial Attacks
ASPLOS '20: Architectural Support for Programming Languages and Operating Systems, Lausanne, Switzerland, March 2020, pp. 19-34.
Abstract:
Recent studies show that Deep Neural Networks (DNNs) are vulnerable to adversarial samples, which are generated by perturbing correctly classified inputs so as to cause misclassification by DNN models. This can potentially lead to disastrous consequences, especially in security-sensitive applications such as unmanned vehicles, finance, and heal…