NeurIPS 2020.
Understanding the effects of model architecture on training and test performance is a longstanding goal in the deep learning community.
Modern neural network performance typically improves as model size increases. A recent line of research on the Neural Tangent Kernel (NTK) of over-parameterized networks indicates that the improvement with size increase is a product of a better conditioned loss landscape. In this work, we investigate a form of over-parameterization achi…