BGADAM: Boosting based Genetic-Evolutionary ADAM for Neural Network Optimization

2021 International Joint Conference on Neural Networks (IJCNN)

Abstract
Among various optimization methods, gradient descent-based algorithms achieve outstanding performance and have been widely used across many tasks. Among these commonly used algorithms, ADAM has many advantages, such as fast convergence due to both its momentum term and its adaptive learning rate. However, since the loss functions of most deep neural networks are non-convex, ADAM also shares the dra...
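For context on the momentum term and adaptive learning rate the abstract mentions, below is a minimal NumPy sketch of the standard ADAM update (Kingma & Ba, 2015). This illustrates only the baseline optimizer the paper builds on, not the proposed BGADAM method; the function and variable names are illustrative.

    import numpy as np

    def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        """One standard ADAM update on parameters theta given gradient grad."""
        m = beta1 * m + (1 - beta1) * grad        # first moment: the momentum term
        v = beta2 * v + (1 - beta2) * grad ** 2   # second moment: drives the adaptive learning rate
        m_hat = m / (1 - beta1 ** t)              # bias correction for early steps
        v_hat = v / (1 - beta2 ** t)
        theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
        return theta, m, v

    # Usage: minimize f(x) = x^2 starting from x = 5
    theta, m, v = np.array([5.0]), np.zeros(1), np.zeros(1)
    for t in range(1, 2001):
        grad = 2 * theta                          # gradient of x^2
        theta, m, v = adam_step(theta, grad, m, v, t)
    print(theta)                                  # converges toward 0

The per-parameter division by sqrt(v_hat) is what makes the step size adaptive; on a non-convex loss this update can still settle into poor local minima, which is the drawback the paper addresses.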
Keywords
Training,Deep learning,Analytical models,Neural networks,Optimization methods,Boosting,Genetics