Optimization Method Of Residual Networks Of Residual Networks For Image Classification

Long Lin, Hao Yuan, Liru Guo, Yingqun Kuang, Ke Zhang

INTELLIGENT COMPUTING METHODOLOGIES, ICIC 2018, PT III(2018)

Abstract
Activation schemes in deep convolutional neural networks that overlook the diversity of datasets have restricted their progress in image classification. In this paper, we propose an optimization method for Residual Networks of Residual Networks (RoR). Firstly, three activation functions (ReLU, ELU, and PELU) are applied to RoR, providing more effective optimization choices for different datasets. Secondly, we add drop-path to avoid over-fitting and widen RoR by adding filters to mitigate vanishing gradients. Our networks achieve good classification accuracy on the CIFAR-10/100 datasets, with best test errors of 3.52% and 19.07% on CIFAR-10 and CIFAR-100, respectively. The experiments show that the RoR optimization method can improve network performance and effectively restrain vanishing/exploding gradients.
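For reference, the three activation functions named in the abstract can be sketched as below. This is a minimal NumPy illustration using the standard published definitions (ELU from Clevert et al., PELU from Trottier et al.); the parameter names `alpha`, `a`, and `b` are illustrative defaults, not values taken from this paper.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x) -- zero for negative inputs, identity otherwise
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # ELU: x for x >= 0, alpha * (exp(x) - 1) for x < 0
    # Smoothly saturates toward -alpha for large negative inputs.
    return np.where(x >= 0, x, alpha * np.expm1(x))

def pelu(x, a=1.0, b=1.0):
    # PELU (Parametric ELU): (a / b) * x for x >= 0,
    # a * (exp(x / b) - 1) for x < 0.
    # In the original formulation a and b are learnable positive parameters.
    return np.where(x >= 0, (a / b) * x, a * np.expm1(x / b))

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))   # negative inputs clipped to zero
print(elu(x))    # negative inputs smoothly saturate
print(pelu(x))   # with a = b = 1, matches ELU with alpha = 1
```

With `a = b = 1`, PELU reduces to ELU with `alpha = 1`, which is why the paper can treat the three activations as a family of increasingly flexible choices per dataset.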
Keywords
Image classification, ReLU, Parametric exponential linear unit, Exponential linear unit, Residual networks of residual networks, Activation function