Improving Deep Neural Network Ensembles Using Reconstruction Error

2015 International Joint Conference on Neural Networks (IJCNN)

Cited 8 | Views 62

Abstract
Ensemble learning of neural networks is a paradigm in which an ensemble of several networks achieves better generalization than any single network. Ensemble learning remains applicable to deep, multi-layer neural networks, and the characteristics of deep networks offer additional opportunities to improve on traditional neural network ensembles. In this paper, we propose an ensemble criterion for deep neural networks based on the reconstruction error, and we present two strategies that address the two central issues in ensemble learning of neural networks: component dataset sampling and output averaging. Component training datasets are selected according to the reconstruction error rather than by random bootstrap sampling or re-weighting. Moreover, for each test instance, the reconstruction error produced by a sub-model can be computed simultaneously with its output, and this error serves as the weight in output averaging. From the perspectives of prediction intervals and confidence intervals, we demonstrate that a smaller reconstruction error ensures a smaller prediction interval. We also incorporate the well-known structural ensemble approach "Dropout" into the proposed method to achieve the best performance. Experiments on classification and regression datasets validate the effectiveness of our approach.
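The reconstruction-error-weighted output averaging described above can be sketched as follows. The abstract does not give the exact weighting formula, so the scheme here (weights inversely proportional to each sub-model's reconstruction error, normalized to sum to one) is a hypothetical illustration of the idea, not the paper's definitive method:

```python
import numpy as np

def reconstruction_weighted_average(outputs, recon_errors, eps=1e-8):
    """Combine ensemble sub-model outputs for one test instance,
    down-weighting sub-models with larger reconstruction error.

    outputs:      shape (n_members, n_outputs), one row per sub-model
    recon_errors: shape (n_members,), per-sub-model reconstruction error
                  computed on the same test instance
    """
    outputs = np.asarray(outputs, dtype=float)
    recon_errors = np.asarray(recon_errors, dtype=float)
    # Hypothetical weighting: inverse reconstruction error, normalized.
    weights = 1.0 / (recon_errors + eps)
    weights /= weights.sum()
    # Weighted average across ensemble members.
    return weights @ outputs
```

A sub-model that reconstructs the test instance poorly is presumed less reliable on that instance, so its prediction contributes less to the ensemble output; with equal reconstruction errors this reduces to plain averaging.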
Keywords
deep neural network ensemble, reconstruction error, ensemble learning, learning paradigm, generalization capability, deep learning, multilayer neural network, ensemble criterion, component dataset sampling, output averaging, component training dataset, random bootstrap sampling, testing instance, prediction interval, confidence interval