CamemBERT: a Tasty French Language Model
ACL, pp. 7203-7219, 2020.
We evaluated CamemBERT on four downstream tasks; our best model reached or improved the state of the art on all tasks considered, even when compared to strong multilingual models such as mBERT, XLM, and XLM-R, while having fewer parameters.
Pretrained language models are now ubiquitous in Natural Language Processing. Despite their success, most available models have either been trained on English data or on the concatenation of data in multiple languages. This makes practical use of such models, in all languages except English, very limited. Aiming to address this issue ...