PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization
ICML 2020 (preprint 2019).
Recent work pre-training Transformers with self-supervised objectives on large text corpora has shown great success when fine-tuned on downstream NLP tasks, including text summarization. However, pre-training objectives tailored for abstractive text summarization have not been explored. Furthermore, there is a lack of systematic evaluation across diverse domains.
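The pre-training objective named in the title, gap-sentence generation (GSG), removes whole "important" sentences from a document and trains the model to generate them from the remaining text. A minimal sketch of how such an input/target pair could be built, assuming a simple unigram-overlap proxy in place of the ROUGE-based sentence scoring used in the paper, and a hypothetical `[MASK1]` placeholder token:

```python
# Sketch of PEGASUS-style gap-sentence generation (GSG) data construction.
# Whole sentences are removed from the document and concatenated as the
# decoding target; the source keeps one [MASK1] token per removed sentence.
# Sentence "importance" is approximated here by unigram overlap with the
# rest of the document (a stand-in for the paper's ROUGE-based selection).

def select_gap_sentences(sentences, ratio=0.3):
    """Return indices of the top-scoring sentences by word overlap."""
    scores = []
    for i, s in enumerate(sentences):
        rest = set(w for j, t in enumerate(sentences) if j != i
                   for w in t.lower().split())
        words = set(s.lower().split())
        scores.append((len(words & rest) / max(len(words), 1), i))
    k = max(1, int(len(sentences) * ratio))
    return sorted(i for _, i in sorted(scores, reverse=True)[:k])

def make_gsg_example(sentences, ratio=0.3, mask="[MASK1]"):
    """Build (source, target): masked document and the removed sentences."""
    picked = set(select_gap_sentences(sentences, ratio))
    source = " ".join(mask if i in picked else s
                      for i, s in enumerate(sentences))
    target = " ".join(sentences[i] for i in sorted(picked))
    return source, target

doc = ["the cat sat on the mat", "dogs bark loudly",
       "the cat likes the mat", "birds fly"]
src, tgt = make_gsg_example(doc)
# The most redundant sentence is masked out and becomes the target.
```

This mirrors the intuition stated in the abstract: generating withheld, summary-like sentences from the rest of the document resembles the downstream summarization task more closely than token-level masking does.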