The Computational Complexity of Training ReLU(s)

arXiv:1810.04207 [cs.CC], 2018.


Abstract:

We consider the computational complexity of training depth-2 neural networks composed of rectified linear units (ReLUs). We show that, even for the case of a single ReLU, finding a set of weights that minimizes the squared error (even approximately) for a given training set is NP-hard. We also show that for a simple network consisting of …
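The hardness result concerns minimizing the training loss sum_i (max(0, w·x_i) − y_i)^2 over the weight vector w. As a point of reference, here is a minimal NumPy sketch of that objective; the function name and the toy training set are illustrative, not from the paper:

```python
import numpy as np

def relu_squared_error(w, X, y):
    """Squared error of a single ReLU with weights w on a training set.

    X : (n, d) array of training inputs x_i
    y : (n,) array of training targets y_i
    """
    preds = np.maximum(0.0, X @ w)      # ReLU(w . x_i) for each example
    return np.sum((preds - y) ** 2)     # sum_i (ReLU(w . x_i) - y_i)^2

# Hypothetical toy data, just to make the objective concrete:
X = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 1.0]])
y = np.array([1.0, 0.0, 0.5])
print(relu_squared_error(np.zeros(2), X, y))  # loss at the all-zero weights
```

The paper's claim is that finding a w that (even approximately) minimizes this objective is NP-hard in general, so the function above is only a loss evaluator, not a training procedure.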
