Multichannel Generative Language Model: Learning All Possible Factorizations Within and Across Channels

EMNLP, pp. 4208-4220, 2020.

Other links: arxiv.org | dblp.uni-trier.de | academic.microsoft.com

Abstract:

A channel corresponds to a viewpoint or transformation of an underlying meaning. A pair of parallel sentences in English and French express the same underlying meaning, but through two separate channels corresponding to their languages. In this work, we present the Multichannel Generative Language Model (MGLM). MGLM is a generative joint distribution model over channels. …
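
The abstract sketches the core idea: a single joint model over all channels, trained so that tokens can be generated in any order, within or across channels. As a rough illustration of that training setup (not the paper's actual code; the function name and channel-tagging scheme below are invented for this sketch), the toy Python example below flattens a pair of parallel sentences into channel-tagged tokens, samples one random generation order, and emits the (observed context, next token) pairs that a model marginalizing over all factorizations would be trained on.

    import random

    def make_training_example(channels):
        """Toy sketch only: build training pairs for one sampled factorization
        over all tokens of all channels. Tokens are tagged with their channel
        id and position so a model could condition on, or generate, any
        subset of channels in any order."""
        tokens = [(ch, i, tok)
                  for ch, sent in channels.items()
                  for i, tok in enumerate(sent.split())]
        # One random generation order = one factorization of the joint distribution.
        order = random.sample(range(len(tokens)), len(tokens))
        examples, revealed = [], []
        for idx in order:
            # The model would be trained to predict tokens[idx] (token, channel,
            # and position) given only the tokens revealed so far.
            examples.append((list(revealed), tokens[idx]))
            revealed.append(tokens[idx])
        return examples

    if __name__ == "__main__":
        parallel = {"en": "the cat sat", "fr": "le chat est assis"}
        for context, target in make_training_example(parallel):
            print(context, "->", target)

Averaging the loss over many such sampled orders is one common way to approximate training on all possible factorizations; whether MGLM uses exactly this sampling scheme is not stated in the excerpt above.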
