Neural Semi-supervised Learning for Text Classification Under Large-Scale Pretraining

Zijun Sun
Chun Fan
Xiaofei Sun
Xiaofei Sun

Abstract:

The goal of semi-supervised learning is to utilize the unlabeled, in-domain dataset U to improve models trained on the labeled dataset D. In the context of large-scale language-model (LM) pretraining, how to make the best use of U is poorly understood: is semi-supervised learning still beneficial in the presence of large-scale pretraining? …
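
To make the setting concrete, below is a minimal sketch of one common way to exploit the unlabeled set U: self-training with pseudo-labels, where a teacher trained on D labels U and a student is retrained on the combined data. This is a generic illustration assuming scikit-learn and toy data (labeled_texts, unlabeled_texts, and the 0.6 confidence threshold are hypothetical); it is not necessarily the paper's method.

```python
# Sketch of self-training / pseudo-labeling: D = (labeled_texts, labels), U = unlabeled_texts.
# All data and names here are hypothetical illustrations.
import numpy as np
from scipy.sparse import vstack
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

labeled_texts = ["great movie", "terrible plot", "loved it", "awful acting"]
labels = np.array([1, 0, 1, 0])
unlabeled_texts = ["what a fantastic film", "boring and badly acted", "really enjoyable"]

vectorizer = TfidfVectorizer()
X_labeled = vectorizer.fit_transform(labeled_texts)
X_unlabeled = vectorizer.transform(unlabeled_texts)

# 1. Train a "teacher" classifier on the labeled set D only.
teacher = LogisticRegression().fit(X_labeled, labels)

# 2. Pseudo-label U, keeping only predictions above a confidence threshold.
probs = teacher.predict_proba(X_unlabeled)
confident = probs.max(axis=1) >= 0.6  # threshold is a hyperparameter
pseudo_labels = probs.argmax(axis=1)[confident]

# 3. Retrain a "student" on D plus the confidently pseudo-labeled part of U.
X_combined = vstack([X_labeled, X_unlabeled[confident]])
y_combined = np.concatenate([labels, pseudo_labels])
student = LogisticRegression().fit(X_combined, y_combined)

print(student.predict(vectorizer.transform(["an enjoyable, fantastic film"])))
```

The same loop applies when the teacher and student are large pretrained LMs fine-tuned on D; the question the abstract raises is whether such use of U still helps once large-scale pretraining is in place.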
