Learning to Reason and Memorize with Self-Notes

CoRR(2023)

Abstract
Large language models have been shown to struggle with limited context memory and multi-step reasoning. We propose a simple method for solving both of these problems by allowing the model to take Self-Notes. Unlike recent scratchpad approaches, the model can deviate from the input context at any time to explicitly think. This allows the model to recall information and perform reasoning on the fly as it reads the context, thus extending its memory and enabling multi-step reasoning. Our experiments on multiple tasks demonstrate that our method can successfully generalize to longer and more complicated instances than those seen during training by taking Self-Notes at inference time.
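The core idea of interleaving notes with the input, rather than appending a scratchpad after it, can be illustrated with a toy sketch. This is a hypothetical illustration, not the authors' implementation: instead of a language model, a rule-based reader derives a note whenever a newly read fact chains with an earlier one, writing the inference into the stream immediately.

```python
# Toy illustration of Self-Notes-style inference (hypothetical sketch).
# A real system would use a language model trained to emit note tokens;
# here a rule-based "model" chains simple "X is Y" facts on the fly.

def take_self_notes(context_sentences):
    """Read the context sentence by sentence, interleaving derived notes."""
    facts = {}    # subject -> object, e.g. "A" -> "B"
    stream = []   # the running stream: context sentences plus inserted notes
    for sentence in context_sentences:
        stream.append(sentence)
        subj, obj = sentence.split(" is ")  # assumes "X is Y" statements
        facts[subj] = obj
        # Self-Note: if the new fact chains with an earlier one,
        # write the two-hop inference into the stream right away,
        # so it is available as context for everything read later.
        if obj in facts:
            note = f"[Note: {subj} is {facts[obj]}]"
            stream.append(note)
            facts[subj] = facts[obj]
    return stream

print(take_self_notes(["B is C", "A is B"]))
# The note "[Note: A is C]" appears immediately after "A is B",
# in contrast to a scratchpad, which would reason only after the
# full context has been read.
```

The contrast with a scratchpad is the position of the reasoning tokens: a scratchpad performs all reasoning after the question, while Self-Notes place each inference at the point in the context where it becomes derivable.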