Empirical Evaluation of Rectified Activations in Convolutional Network

CoRR abs/1505.00853, 2015.

Abstract:

In this paper we investigate the performance of different types of rectified activation functions in convolutional neural networks: the standard rectified linear unit (ReLU), the leaky rectified linear unit (Leaky ReLU), the parametric rectified linear unit (PReLU), and a new randomized leaky rectified linear unit (RReLU). We evaluate these activation functions on standard image classification tasks. Our experiments suggest that incorporating a non-zero slope for the negative part of rectified activation units consistently improves results, which contradicts the common belief that sparsity is the key to ReLU's good performance. Moreover, on small-scale datasets, using a deterministic negative slope or learning it are both prone to overfitting; they are not as effective as the randomized counterpart. Using RReLU, we achieve 75.68% accuracy on the CIFAR-100 test set without multiple tests or ensembles.
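For reference, the four variants differ only in how they treat the negative part of the input. Below is a minimal NumPy sketch of each; the function names are illustrative, and the U(3, 8) sampling range for RReLU follows the paper's experiments, not code released by the authors.

    import numpy as np

    def relu(x):
        # Standard ReLU: passes positive inputs, zeroes the negative part.
        return np.maximum(x, 0.0)

    def leaky_relu(x, slope=0.01):
        # Leaky ReLU: a small, fixed slope on the negative part.
        return np.where(x >= 0, x, slope * x)

    def prelu(x, slope):
        # PReLU: same form as Leaky ReLU, but the slope is a parameter
        # learned by backpropagation; here it is simply passed in.
        return np.where(x >= 0, x, slope * x)

    def rrelu(x, l=3.0, u=8.0, training=True, rng=None):
        # RReLU: during training the negative part is x / a with
        # a ~ U(l, u); at test time the deterministic average
        # a = (l + u) / 2 is used instead.
        if training:
            rng = rng or np.random.default_rng()
            a = rng.uniform(l, u, size=np.shape(x))
        else:
            a = (l + u) / 2.0
        return np.where(x >= 0, x, x / a)

In a real network these would be applied elementwise to convolution outputs, with PReLU's slope trained jointly with the other weights.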
