Large-Margin Learning of Submodular Summarization Models.

EACL '12: Proceedings of the 13th Conference of the European Chapter of the Association for Computational Linguistics (2012)

Abstract
In this paper, we present a supervised learning approach to training submodular scoring functions for extractive multi-document summarization. By taking a structured prediction approach, we provide a large-margin method that directly optimizes a convex relaxation of the desired performance measure. The learning method applies to all submodular summarization methods, and we demonstrate its effectiveness for both pairwise and coverage-based scoring functions on multiple datasets. Compared to state-of-the-art functions that were tuned manually, our method significantly improves performance and enables high-fidelity models with a number of parameters well beyond what could reasonably be tuned by hand.
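The coverage-based scoring functions mentioned in the abstract are monotone submodular, so summaries can be built with the classic greedy algorithm, which carries a (1 - 1/e) approximation guarantee. The following is an illustrative sketch only, not the paper's exact formulation: sentences are treated as bags of words, the score of a summary is the number of distinct words it covers, and the `greedy_summarize` helper (a name chosen here, not from the paper) selects sentences by marginal coverage gain under a sentence budget.

```python
# Illustrative sketch (not the paper's learned scoring function):
# greedy maximization of a simple coverage-based submodular objective
# for extractive summarization. Coverage of distinct words is monotone
# submodular, so greedy selection enjoys a (1 - 1/e) guarantee.

def greedy_summarize(sentences, budget):
    """Greedily pick up to `budget` sentences maximizing marginal word coverage.

    sentences: list of sets of words (one set per candidate sentence).
    Returns the indices of the selected sentences, in selection order.
    """
    selected, covered = [], set()
    while len(selected) < budget:
        best, best_gain = None, 0
        for i, words in enumerate(sentences):
            if i in selected:
                continue
            gain = len(words - covered)  # marginal gain of adding sentence i
            if gain > best_gain:
                best, best_gain = i, gain
        if best is None:  # no remaining sentence adds new coverage
            break
        selected.append(best)
        covered |= sentences[best]
    return selected

docs = [
    "submodular functions admit greedy maximization",
    "greedy maximization gives approximation guarantees",
    "extractive summarization selects sentences from documents",
]
sentences = [set(s.split()) for s in docs]
print(greedy_summarize(sentences, budget=2))
```

In the paper's setting, the hand-coded word-count objective above would be replaced by a learned submodular scoring function whose parameters are fit with the large-margin structured prediction method.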
Keywords
large-margin method,submodular summarization method,coverage-based scoring function,extractive multidocument summarization,performance measure,structured prediction approach,supervised learning approach,training submodular,convex relaxation,high-fidelity model,submodular summarization model