Does Syntactic Knowledge in Multilingual Language Models Transfer Across Languages?

BlackboxNLP@EMNLP (2018)

Abstract
Recent work has shown that neural models can be successfully trained on multiple languages simultaneously. We investigate whether such models learn to share and exploit common syntactic knowledge among the languages on which they are trained. This extended abstract presents our preliminary results.
Keywords
multilingual language models, transfer, syntactic knowledge, languages