Superpolynomial Lower Bounds for Learning One-Layer Neural Networks using Gradient Descent
ICML, pp. 3587-3596, 2020.
We prove our lower bounds in the well-studied statistical query (SQ) model, which captures most learning algorithms used in practice.
We prove the first superpolynomial lower bounds for learning one-layer neural networks with respect to the Gaussian distribution using gradient descent. We show that any classifier trained using gradient descent with respect to square loss will fail to achieve small test error in polynomial time given access to samples labeled by a one-layer neural network.
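To make the learning setup concrete, here is a minimal illustrative sketch (not the paper's construction or its hard instance) of the regime the abstract describes: a one-hidden-layer ReLU network labels Gaussian inputs, and a student network of the same architecture is trained by gradient descent on the square loss. All names and parameters below are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed, not from the paper):
d, k, n = 10, 5, 2000
X = rng.standard_normal((n, d))          # inputs x ~ N(0, I_d)

# Planted "teacher" one-layer ReLU network that labels the samples.
W_true = rng.standard_normal((k, d))
y = np.maximum(X @ W_true.T, 0.0).sum(axis=1)

# Student network, same architecture, small random initialization.
W = rng.standard_normal((k, d)) * 0.1
mse0 = np.mean((np.maximum(X @ W.T, 0.0).sum(axis=1) - y) ** 2)

lr = 1e-3
for _ in range(500):
    H = np.maximum(X @ W.T, 0.0)         # hidden activations, shape (n, k)
    err = H.sum(axis=1) - y              # square-loss residual
    # (Sub)gradient of the empirical square loss w.r.t. W;
    # (H > 0) is the ReLU derivative, constant factor absorbed into lr.
    grad = ((err[:, None] * (H > 0)).T @ X) / n
    W -= lr * grad

mse = np.mean((np.maximum(X @ W.T, 0.0).sum(axis=1) - y) ** 2)
```

The training loss typically decreases on such small instances; the paper's point is that no gradient-descent-trained classifier can reach small *test* error in polynomial time on its hard family of one-layer networks, a statement this toy run does not (and cannot) demonstrate.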