Ensemble Knowledge Distillation from Speech SSL Models Considering Inter-Teacher Differences
International Symposium on Chinese Spoken Language Processing (2024)
Keywords
Self-Supervised Learning, Knowledge Distillation, Spoken Language Understanding, Emotion Recognition