Vanilla Feature Distillation for Improving the Accuracy-Robustness Trade-Off in Adversarial Training.

IEEE Transactions on Dependable and Secure Computing (2025)

Cited 0 | Views 37
Keywords
Adversarial training, well-separable features, feature alignment, knowledge distillation
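
The listing gives only the title and keywords, not the method itself. As a loose illustration of the named ideas (adversarial training, feature alignment, knowledge distillation), the sketch below combines a standard PGD adversarial-training step with an added feature-alignment term that pulls the robust model's features toward those of a frozen, standard-trained ("vanilla") teacher on clean inputs. All names, the MSE alignment loss, and hyperparameters (eps, alpha, steps, lam) are assumptions for illustration, not the paper's actual formulation.

```python
# Illustrative sketch only (PyTorch): PGD adversarial training plus a
# clean-input feature-alignment (distillation) loss toward a frozen,
# standard-trained teacher. Not the paper's exact method.
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=10):
    """Craft L-infinity PGD adversarial examples around x."""
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0, 1).detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1)
    return x_adv.detach()

def train_step(student, feat_student, feat_teacher, x, y, optimizer, lam=1.0):
    """One step: adversarial CE loss + clean feature alignment to the teacher."""
    x_adv = pgd_attack(student, x, y)
    optimizer.zero_grad()
    ce_loss = F.cross_entropy(student(x_adv), y)   # robustness term on adversarial inputs
    with torch.no_grad():
        t_feat = feat_teacher(x)                   # frozen vanilla-teacher features (clean)
    s_feat = feat_student(x)                       # student features on the same clean inputs
    align_loss = F.mse_loss(s_feat, t_feat)        # feature-alignment (distillation) term
    loss = ce_loss + lam * align_loss
    loss.backward()
    optimizer.step()
    return loss.item()
```

Here `feat_student` and `feat_teacher` are assumed hooks returning penultimate-layer features of the student and teacher networks; the weight `lam` trades off clean-accuracy preservation against the adversarial objective.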