PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization

Jingqing Zhang, Yao Zhao, Mohammad Saleh

ICML 2020 (first released 2019).


Abstract:

Recent work pre-training Transformers with self-supervised objectives on large text corpora has shown great success when fine-tuned on downstream NLP tasks including text summarization. However, pre-training objectives tailored for abstractive text summarization have not been explored. Furthermore, there is a lack of systematic evaluation across diverse downstream tasks.
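The abstract above is truncated before it describes the objective itself; in the paper, PEGASUS pre-trains with Gap Sentences Generation (GSG): whole sentences judged important are masked out of the document and the model learns to generate them from the remaining text. The sketch below is a minimal, illustrative approximation rather than the authors' implementation. It assumes sentence importance is scored by ROUGE-1 F1 between each sentence and the rest of the document (one of the selection strategies described in the paper); the function names, the simplified unigram-overlap scorer, and the `[MASK1]` handling are my own simplifications for illustration.

```python
from collections import Counter

def rouge1_f1(candidate_tokens, reference_tokens):
    """Unigram-overlap F1 between two token lists (a simple stand-in for ROUGE-1)."""
    if not candidate_tokens or not reference_tokens:
        return 0.0
    cand, ref = Counter(candidate_tokens), Counter(reference_tokens)
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(candidate_tokens)
    recall = overlap / len(reference_tokens)
    return 2 * precision * recall / (precision + recall)

def make_gsg_example(sentences, gap_ratio=0.3, mask_token="[MASK1]"):
    """Select the highest-scoring sentences as gap sentences, mask them in the
    input, and concatenate them as the pre-training target (GSG-style)."""
    tokenized = [s.lower().split() for s in sentences]
    scores = []
    for i, toks in enumerate(tokenized):
        # Score each sentence independently against the rest of the document.
        rest = [t for j, s in enumerate(tokenized) if j != i for t in s]
        scores.append((rouge1_f1(toks, rest), i))
    n_gaps = max(1, int(len(sentences) * gap_ratio))
    gap_ids = {i for _, i in sorted(scores, reverse=True)[:n_gaps]}
    masked_input = " ".join(mask_token if i in gap_ids else s
                            for i, s in enumerate(sentences))
    target = " ".join(sentences[i] for i in sorted(gap_ids))
    return masked_input, target

if __name__ == "__main__":
    doc = [
        "PEGASUS pre-trains an encoder-decoder on large text corpora.",
        "Important sentences are removed from the input document.",
        "The model must generate the removed sentences from the rest.",
    ]
    src, tgt = make_gsg_example(doc, gap_ratio=0.34)
    print("input :", src)
    print("target:", tgt)
```

In the actual model the masked input feeds the Transformer encoder and the concatenated gap sentences serve as the decoder target, so the pre-training task resembles producing an extractive-summary-like output abstractively.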