DeiT-LT: Distillation Strikes Back for Vision Transformer Training on Long-Tailed Datasets

CVPR 2024

Keywords: long-tail learning, vision transformers, ViT, distillation