Reflective Decoding: Unsupervised Paraphrasing and Abductive Reasoning

Peter West
Ximing Lu
Ari Holtzman
Jena Hwang
Other Links: arxiv.org

Abstract:

Pretrained Language Models (LMs) generate text with remarkable quality, novelty, and coherence. Yet applying LMs to the problems of paraphrasing and infilling currently requires direct supervision, since these tasks break the left-to-right generation setup of pretrained LMs. We present Reflective Decoding, a novel unsupervised approach t…
