
The Shaped Transformer: Attention Models in the Infinite Depth-and-Width Limit

NeurIPS 2023

Cited: 39 | Views: 60
Keywords: Deep Learning Theory, Covariance SDE, Attention Mechanism, Infinite-Depth-and-Width, Scaling Limit