Controllable Abstractive Summarization Using Multilingual Pretrained Language Model
International Conference on Information and Communication Technology (2022)
Abstract
By leveraging a multilingual pretrained language model, we show that CTRLSum [1], an abstractive summarization approach that can be controlled by keywords, improves over baseline summarization systems in four languages (English, Indonesian, Spanish, and French) by 1.57 points in average ROUGE-1, with the Indonesian model achieving state-of-the-art results. We further provide a novel analysis of the importance of the keywords fed to CTRLSum, which (1) shows hypothetical upper-bound results that outperform the state of the art in all four languages by a large margin and (2) points to a natural direction for future work: improving CTRLSum by improving the keyword prediction model.
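CTRLSum conditions generation by prepending control keywords to the source document before it is passed to the summarization model. The sketch below illustrates that interface with the Hugging Face transformers library; the model name (google/mt5-small), the keyword separator, and the "keywords => document" input format are assumptions for illustration only, and in practice a multilingual checkpoint fine-tuned for keyword-conditioned summarization would be required.

```python
# Minimal, hypothetical sketch of CTRLSum-style keyword-controlled summarization
# with a multilingual pretrained seq2seq model. Model name, separator tokens,
# and input format are illustrative assumptions, not the paper's exact setup.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "google/mt5-small"  # placeholder multilingual model (assumption)

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def summarize_with_keywords(document: str, keywords: list[str]) -> str:
    """Prepend control keywords to the source document and generate a summary."""
    # Keywords are joined and concatenated with the source so the decoder can
    # attend to them during generation; the separators below are illustrative.
    control_prefix = " | ".join(keywords)
    model_input = f"{control_prefix} => {document}"

    inputs = tokenizer(model_input, return_tensors="pt",
                       truncation=True, max_length=1024)
    summary_ids = model.generate(
        **inputs,
        num_beams=4,
        max_length=128,
        early_stopping=True,
    )
    return tokenizer.decode(summary_ids[0], skip_special_tokens=True)

# Example usage (an Indonesian, Spanish, or French source works the same way):
# print(summarize_with_keywords(article_text, ["inflation", "central bank"]))
```

The keyword list itself would come from a keyword prediction model at inference time; the paper's upper-bound analysis corresponds to feeding oracle keywords instead.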
Keywords
keyword, controllable abstractive summarization, multilingual