Rectified Exponential Units for Convolutional Neural Networks

IEEE Access (2019)

Abstract
The rectified linear unit (ReLU) plays an important role in today's convolutional neural networks (CNNs). In this paper, we propose a novel activation function called the Rectified Exponential Unit (REU). Inspired by two recently proposed activation functions, the Exponential Linear Unit (ELU) and Swish, the REU is designed to combine the advantages of a flexible exponent with a multiplicative function form. Moreover, we propose the Parametric REU (PREU) to increase the expressive power of the REU. Experiments with three classical CNN architectures (LeNet-5, Network in Network, and Residual Network (ResNet)) on benchmarks of various scales, including Fashion-MNIST, CIFAR-10, CIFAR-100, and Tiny ImageNet, demonstrate that REU and PREU achieve improvements over other activation functions. With the ResNet, REU achieves relative error improvements over ReLU of 7.74% and 6.08% on CIFAR-10 and CIFAR-100, respectively, and PREU achieves improvements of 9.24% and 9.32%. Finally, we use different PREU variants in the residual unit to achieve more stable results.
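The abstract does not state the activation's exact formula, so the following is only a minimal PyTorch sketch of one plausible reading of the stated design: ReLU's identity on positive inputs combined with an exponentially weighted negative branch, plus a learnable scale `alpha` for the parametric variant. The functional forms and the placement of `alpha` are assumptions for illustration, not the paper's definitions.

```python
import torch
import torch.nn as nn

class REU(nn.Module):
    """Rectified Exponential Unit (sketch).

    Assumed form, not quoted from the paper:
        f(x) = x            for x > 0
        f(x) = x * exp(x)   for x <= 0
    """
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Clamp before exp so the unselected branch cannot overflow for large positive x.
        return torch.where(x > 0, x, x * torch.exp(torch.clamp(x, max=0.0)))

class PREU(nn.Module):
    """Parametric REU (sketch): a learnable scale on the negative branch.

    Where the parameter enters is an assumption; the paper defines the
    exact parametrization of PREU.
    """
    def __init__(self, alpha: float = 1.0):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(alpha))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.where(x > 0, x, self.alpha * x * torch.exp(torch.clamp(x, max=0.0)))
```

Under these assumptions, either module is a drop-in replacement for `nn.ReLU` in a CNN block, e.g. `y = PREU()(torch.randn(8, 16, 32, 32))`, with `alpha` trained jointly with the network weights.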
Keywords
Activation function, convolutional neural network, rectified exponential unit, parametric rectified exponential unit