A Comparison of Supervised and Unsupervised Pre-Training of End-to-end Models

INTERSPEECH 2021 (2021)

Cited: 14 | Views: 36
Key words: speech recognition, cross-domain, cross-lingual, low-resource, pre-training, self-supervised learning, supervised training, unsupervised training