MAE-MACD: the Masked Adversarial Contrastive Distillation Algorithm Grounded in Masked Autoencoders
IEEE Transactions on Industrial Informatics (2024)
Keywords
Training, Neural networks, Robustness, Contrastive learning, Perturbation methods, Representation learning, Feature extraction, Adversarial samples, adversarial training, contrastive learning, knowledge distillation, masked autoencoder (MAE)