Revisiting the Compositional Generalization Abilities of Neural Sequence Models

Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), 2022

Cited 18 | Views 100
Key words
Sequence-to-Sequence Learning, Topic Modeling, Language Modeling, Sentence Simplification, Semantic Simplification