The Curious Case of Neural Text Degeneration

Ari Holtzman
Jan Buys

ICLR, 2020.


Abstract:

Despite considerable advances in neural language modeling, it remains an open question what the best decoding strategy is for text generation from a language model (e.g., to generate a story). The counter-intuitive empirical observation is that even though the use of likelihood as training objective leads to high quality models for a broad…
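
For context, a decoding strategy maps a language model's next-token probabilities to concrete output tokens. Below is a minimal illustrative sketch, not taken from the paper, contrasting greedy (maximum-likelihood) decoding with ancestral sampling over a hypothetical next-token distribution; the token names and probabilities are invented for illustration.

    import random

    def greedy_decode(probs):
        # Maximum-likelihood decoding: always emit the single most probable token.
        return max(probs, key=probs.get)

    def sample_decode(probs, rng=random):
        # Ancestral sampling: draw a token in proportion to the model's probabilities.
        tokens, weights = zip(*probs.items())
        return rng.choices(tokens, weights=weights, k=1)[0]

    # Hypothetical next-token distribution; a real model would produce one per step.
    next_token_probs = {"the": 0.45, "a": 0.25, "this": 0.15, "that": 0.10, "an": 0.05}
    print("greedy: ", greedy_decode(next_token_probs))
    print("sampled:", sample_decode(next_token_probs))

Greedy decoding always returns "the" for this distribution, while sampling varies between runs; the paper's empirical question is which kind of strategy yields better generated text.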
