The Computational Complexity of Training ReLU(s)
arXiv preprint arXiv:1810.04207, 2018.
We consider the computational complexity of training depth-2 neural networks composed of rectified linear units (ReLUs). We show that, even for a single ReLU, finding a set of weights that minimizes the squared error (even approximately) over a given training set is NP-hard. We also show that for a simple network consisting of …