Resurrecting Submodularity for Neural Text Generation

arXiv (Cornell University), 2019

Abstract
Submodularity is a desirable property for a variety of objectives in content selection where the current neural encoder-decoder framework is inadequate. We define a class of novel attention mechanisms with submodular functions and, in turn, prove the submodularity of the resulting neural coverage. The attention module offers an architecturally simple and empirically effective way to improve the coverage of neural text generation. We run experiments on three directed text generation tasks with different levels of recovery rate, across two modalities, three neural model architectures, and two training-strategy variations. The results and analyses demonstrate that our method generalizes well across these settings, produces texts of good quality, outperforms comparable baselines, and achieves state-of-the-art performance.
Keywords
neural text generation,submodularity
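
The core intuition behind submodular coverage is diminishing returns: attention mass added to a source token that is already well covered should contribute less than attention on an uncovered token. A minimal sketch of that intuition follows; it applies a concave function to accumulated attention, which is one standard way to obtain a submodular set-style objective. The function name `concave_coverage` and the choice of `sqrt` as the concave function are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def concave_coverage(attention_history, phi=np.sqrt):
    """Per-source-token coverage score as a concave function `phi` of the
    accumulated attention mass. With `phi` concave, non-decreasing, and
    phi(0) = 0, each extra unit of attention on an already-covered token
    yields a smaller gain -- the diminishing-returns (submodularity) idea.
    NOTE: illustrative sketch only, not the authors' attention module."""
    total = np.sum(attention_history, axis=0)  # attention mass per source token
    return phi(total)

# Toy example: 3 decoding steps over 4 source tokens.
att = np.array([
    [0.7, 0.1, 0.1, 0.1],
    [0.6, 0.2, 0.1, 0.1],
    [0.5, 0.3, 0.1, 0.1],
])

# Marginal coverage gain for token 0 shrinks as its accumulated mass grows.
g1 = concave_coverage(att[:1])[0]        # phi(0.7)
g2 = concave_coverage(att[:2])[0] - g1   # phi(1.3) - phi(0.7) < g1
```

Any concave, non-decreasing `phi` (e.g. `log1p` or a capped linear function) would exhibit the same diminishing-returns behavior.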