Multiscale Multipath Ensemble Convolutional Neural Network

IEEE Transactions on Systems, Man, and Cybernetics: Systems (2021)

Abstract
Most convolutional neural network (CNN) models require large numbers of parameters, high computational cost, and deep layers to achieve good performance, which limits their further application. To address this, a lightweight multiscale multipath ensemble CNN (MSME-CNN) is proposed. First, the shallow-layer and deep-layer convolution output features are directly concatenated to form a multipath ensemble, which not only avoids gradient vanishing but also fuses multilayer features. In addition, each path of the network is composed of multiscale low-rank convolution kernels in parallel. This structure extracts multiscale features from the input and improves the feature extraction ability of the network, while the low-rank approximation of the convolution kernels effectively compresses the model. Furthermore, the proposed sparse connection mechanism for the convolution kernels reduces complexity, so that higher classification accuracy can be obtained with fewer parameters and less computational load. Finally, a linear sparse bottleneck structure is used to fuse multiscale features and compress the convolution channels, which further improves network performance. Experiments on four commonly used image recognition datasets verify the superiority of MSME-CNN over several baseline models.
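As a rough illustration of why the low-rank kernel approximation mentioned in the abstract compresses the model, the sketch below counts parameters for a standard k×k convolution versus a rank-1 factorization into a k×1 convolution followed by a 1×k convolution. The channel counts, kernel size, and rank-1 factorization here are illustrative assumptions, not details taken from the paper.

```python
def conv_params(c_in, c_out, kh, kw):
    # Parameter count of a 2-D convolution layer (bias terms ignored):
    # one kh x kw kernel per (input channel, output channel) pair.
    return c_in * c_out * kh * kw

# Hypothetical layer: 64 input channels, 64 output channels, 5x5 kernel.
full_rank = conv_params(64, 64, 5, 5)

# Rank-1 approximation of the same layer: a 5x1 convolution
# followed by a 1x5 convolution.
low_rank = conv_params(64, 64, 5, 1) + conv_params(64, 64, 1, 5)

print(full_rank)              # 102400
print(low_rank)               # 40960
print(low_rank / full_rank)   # 0.4
```

For a k×k kernel the rank-1 factorization scales the parameter count by 2/k, so the savings grow with kernel size; the paper's actual decomposition and its combination with the sparse connection mechanism may differ.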
Keywords
Computational load, convolutional neural network (CNN), ensemble, multipath, multiscale, parameter quantity