Empirical Evaluation of Rectified Activations in Convolutional Network
CoRR, abs/1505.00853, 2015.
Abstract:
In this paper we investigate the performance of different types of rectified activation functions in convolutional neural networks: the standard rectified linear unit (ReLU), the leaky rectified linear unit (Leaky ReLU), the parametric rectified linear unit (PReLU), and a new randomized leaky rectified linear unit (RReLU). We evaluate these activation functions …
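The four activations compared in the abstract differ only in how they treat negative inputs: ReLU zeroes them, Leaky ReLU applies a small fixed slope, PReLU learns that slope, and RReLU samples it at random during training and fixes it to the mean of the sampling range at test time. The sketch below is a minimal NumPy illustration of these definitions, not the paper's implementation; the sampling range (1/8, 1/3) is an assumed default for illustration only.

```python
import numpy as np

def relu(x):
    # Standard ReLU: max(0, x); negative inputs are zeroed out.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: a small fixed slope alpha on the negative part.
    return np.where(x > 0, x, alpha * x)

def prelu(x, alpha):
    # PReLU: same form as Leaky ReLU, but alpha is a learned parameter
    # (here it is simply passed in as a number or array).
    return np.where(x > 0, x, alpha * x)

def rrelu(x, lower=1.0 / 8, upper=1.0 / 3, training=True, rng=None):
    # RReLU: during training the negative slope is drawn uniformly from
    # [lower, upper]; at test time it is fixed to the mean of that range.
    # The (1/8, 1/3) range is an illustrative assumption, not from the abstract.
    if training:
        rng = rng if rng is not None else np.random.default_rng()
        alpha = rng.uniform(lower, upper, size=np.shape(x))
    else:
        alpha = (lower + upper) / 2.0
    return np.where(x > 0, x, alpha * x)

# Small usage example on a handful of values.
x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))
print(leaky_relu(x))
print(prelu(x, alpha=0.25))
print(rrelu(x, training=False))
```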