Rethinking ReLU to Train Better CNNs
ICPR, pp. 603-608, 2018.
Most convolutional neural networks share the same characteristic: each convolutional layer is followed by a nonlinear activation layer, where the Rectified Linear Unit (ReLU) is the most widely used. In this paper, we argue that the designed structure with an equal ratio between these two layers may not be the best choice, since it could re…
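To make the abstract's central idea concrete, here is a minimal NumPy sketch (not from the paper; all function names and the tiny 1-D example are hypothetical) contrasting a 1:1 conv:ReLU stack, where every convolution is followed by an activation, with an unequal-ratio stack, where only some layers are:

```python
import numpy as np

def conv1d(x, w):
    # valid 1-D convolution (cross-correlation), no padding
    n = len(x) - len(w) + 1
    return np.array([np.dot(x[i:i + len(w)], w) for i in range(n)])

def relu(x):
    return np.maximum(x, 0.0)

def stack(x, weights, relu_after):
    # apply a sequence of conv layers; ReLU is inserted only after
    # layers whose index appears in relu_after, so the conv:ReLU
    # ratio need not be 1:1
    for i, w in enumerate(weights):
        x = conv1d(x, w)
        if i in relu_after:
            x = relu(x)
    return x

x = np.array([1.0, -2.0, 3.0, -1.0, 2.0])
ws = [np.array([1.0, -1.0]), np.array([0.5, 0.5])]

y_all = stack(x, ws, relu_after={0, 1})  # ReLU after every conv (1:1)
y_some = stack(x, ws, relu_after={1})    # ReLU only after the last conv
```

Dropping the intermediate ReLU lets negative responses from the first layer reach the second, which changes the output; the paper's question is which ratio trains better, which this sketch does not settle.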