QASE Enhanced PLMs: Improved Control in Text Generation for MRC
arXiv (2024)

Abstract
To address the challenges of out-of-control generation in generative models
for machine reading comprehension (MRC), we introduce the Question-Attended
Span Extraction (QASE) module. Integrated during the fine-tuning of pre-trained
generative language models (PLMs), QASE enables these PLMs to match SOTA
extractive methods and outperform leading LLMs like GPT-4 in MRC tasks, without
significant increases in computational costs.
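The abstract describes QASE as a span extraction head, attended by the question, that is added while fine-tuning a generative PLM to keep its output anchored to the source text. As a rough illustration of the general idea (not the paper's actual architecture), the sketch below shows a toy span-extraction head in pure Python: each context token attends over the question tokens, the attended question summary is fused with the token representation, and start/end logits pick a span. All dimensions, the fusion-by-concatenation scheme, and the weight vectors `w_start`/`w_end` are assumptions for illustration.

```python
import math
import random

# Toy sketch of a question-attended span-extraction head.
# Dimensions, the attention scheme, and the fusion step are all
# illustrative assumptions, not the QASE architecture from the paper.

random.seed(0)
D = 8  # embedding size (assumed)

def rand_vec(d):
    return [random.uniform(-1.0, 1.0) for _ in range(d)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def qase_span(context, question, w_start, w_end):
    """Return (start, end) token indices of the highest-scoring span."""
    start_logits, end_logits = [], []
    for c in context:
        # Attend: weight each question token by similarity to this context token.
        att = softmax([dot(c, q) for q in question])
        q_sum = [sum(a * q[i] for a, q in zip(att, question)) for i in range(D)]
        fused = c + q_sum  # concatenate token and attended question summary (2*D)
        start_logits.append(dot(fused, w_start))
        end_logits.append(dot(fused, w_end))
    start = max(range(len(context)), key=lambda i: start_logits[i])
    # Constrain the end index to come at or after the start index.
    end = max(range(start, len(context)), key=lambda i: end_logits[i])
    return start, end

# Stand-in embeddings for 6 context tokens and 3 question tokens.
context = [rand_vec(D) for _ in range(6)]
question = [rand_vec(D) for _ in range(3)]
w_start, w_end = rand_vec(2 * D), rand_vec(2 * D)
start, end = qase_span(context, question, w_start, w_end)
print(start, end)
```

In a real fine-tuning setup the span logits would be trained jointly with the generation loss, so the generator is nudged toward answers supported by an extractable span; here the weights are random and only the control flow is meaningful.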