When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models

Benjamin Muller
Antonis Anastasopoulos

Abstract:

Transfer learning based on pretraining language models on a large amount of raw data has become a new norm to reach state-of-the-art performance in NLP. Still, it remains unclear how this approach should be applied for unseen languages that are not covered by any available large-scale multilingual language model and for which only a small amount of raw data is generally available.
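As a rough illustration of the setting the abstract describes (not the authors' released code), the sketch below continues masked-language-model pretraining of mBERT on a small corpus of raw text in a new, unseen language, using the Hugging Face transformers and datasets libraries. The file name unseen_language.txt, the output directory, and all hyperparameters are hypothetical placeholders chosen for the example.

```python
# Minimal sketch: adapt mBERT to an unseen language by continuing
# masked-language-model (MLM) pretraining on a small raw-text corpus.
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")

# Raw, unlabeled text in the target (unseen) language; path is a placeholder.
raw = load_dataset("text", data_files={"train": "unseen_language.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = raw["train"].map(tokenize, batched=True, remove_columns=["text"])

# Standard MLM objective: mask 15% of tokens and predict them.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="mbert-adapted",
    num_train_epochs=3,
    per_device_train_batch_size=16,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()
```

The adapted checkpoint in mbert-adapted can then be fine-tuned on a downstream task (e.g., POS tagging or NER) in the target language as with any pretrained encoder.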
