Dynamic Context Pruning for Efficient and Interpretable Autoregressive Transformers
NeurIPS 2023 (2023)
Keywords
Transformers, Context-pruning, Efficient Transformer