A Few Thousand Translations Go a Long Way! Leveraging Pre-trained Models for African News Translation
David Adelani, Jesujoba Alabi, Angela Fan, Julia Kreutzer, Xiaoyu Shen, Machel Reid, Dana Ruiter, Dietrich Klakow, Peter Nabende, Ernie Chang, Tajuddeen Gwadabe, Freshia Sackey, Bonaventure F. P. Dossou, Chris Emezue, Colin Leong, Michael Beukman, Shamsuddeen Muhammad, Guyo Jarso, Oreen Yousuf, Andre Niyongabo Rubungo, Gilles Hacheme, Eric Peter Wairagala, Muhammad Umair Nasir, Benjamin Ajibade, Tunde Ajayi, Yvonne Gitau, Jade Abbott, Mohamed Ahmed, Millicent Ochieng, Anuoluwapo Aremu, Perez Ogayo, Jonathan Mukiibi, Fatoumata Ouoba Kabore, Godson Kalipe, Derguene Mbaye, Allahsera Auguste Tapo, Victoire Memdjokam Koagne, Edwin Munkoh-Buabeng, Valencia Wagner, Idris Abdulmumin, Ayodele Awokoya, Happy Buzaaba, Blessing Sibanda, Andiswa Bukula, Sam Manthalu
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2022)
Keywords
Language Modeling, Topic Modeling, Machine Translation, Syntax-based Translation Models, Pretrained Models