
A new initialization method for artificial neural networks: Laplacian.

Signal Processing and Communications Applications Conference (2018)

Abstract
The popularity of artificial neural networks in machine learning has grown steadily since 2006, the founding year of deep learning. One of the factors that strongly affects the success rate of deep neural networks is their initialization. In this article, new initialization methods based on the Laplacian distribution are proposed. These methods aim to assign appropriate initial values to the network parameters so that the network trains better. Results of our methods on the University of California, Irvine (UCI) Human Activity Recognition and CIFAR-10 datasets are compared with networks initialized with well-known methods, such as Gaussian and Uniform initialization, while the network architecture and layer structure are left unchanged. Based on this comparison, the advantages of Laplacian-based initialization methods over existing methods are discussed in terms of test accuracy.
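The abstract does not specify how the scale of the Laplacian distribution is chosen, so the sketch below is only an illustrative assumption, not the authors' reported method: it draws zero-mean Laplace weights with the scale set so that the variance matches the common Glorot/Xavier target 2 / (fan_in + fan_out), since a Laplace(0, b) variable has variance 2·b².

```python
import numpy as np

def laplace_init(fan_in, fan_out, rng=None):
    """Sample a (fan_in, fan_out) weight matrix from a zero-mean Laplace
    distribution.

    Assumption (not from the paper): the scale b is chosen so that the
    weight variance equals the Glorot/Xavier target 2 / (fan_in + fan_out).
    Since Var[Laplace(0, b)] = 2 * b**2, this gives
    b = sqrt(1 / (fan_in + fan_out)).
    """
    rng = np.random.default_rng() if rng is None else rng
    target_var = 2.0 / (fan_in + fan_out)
    b = np.sqrt(target_var / 2.0)
    return rng.laplace(loc=0.0, scale=b, size=(fan_in, fan_out))

# Example: initialize a 784 -> 256 fully connected layer.
W = laplace_init(784, 256)
```

Other scale choices (for example, matching a He-style variance of 2 / fan_in for ReLU layers) would follow the same pattern with a different target variance.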
Keywords
artificial neural networks, deep learning, classifier, initialization