
Training Neural Networks from an Ergodic Perspective

W. Jung, C. A. Morales

Optimization (2023)

Abstract
In this research, we view the weights of a neural network as points in a metric space and propose that training the network can be regarded as an iterated function system acting on this space. We find that training is most effective when the initial weights are chosen sufficiently close to the minimum of the error, and we give an ergodic characterization of this efficient training. Our findings suggest potential for further advances in optimization through numerical experimentation and the study of dynamical systems theory.
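The viewpoint in the abstract can be illustrated with a minimal sketch (not taken from the paper): one gradient-descent update is a map on the weight space, and training is the repeated application of that map. The quadratic error, the learning rate lr, and the two starting points below are hypothetical choices made only for illustration.

import numpy as np

# Hypothetical quadratic error E(w) = 0.5 * ||w - w_star||^2, minimized at w_star.
w_star = np.array([1.0, -2.0])

def grad_E(w):
    # Gradient of the quadratic error at w.
    return w - w_star

def gd_step(w, lr=0.1):
    # One gradient-descent update, viewed as a map T(w) = w - lr * grad E(w) on weight space.
    return w - lr * grad_E(w)

def train(w0, steps=50, lr=0.1):
    # Training as the repeated application (iteration) of the map T to the initial weights w0.
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        w = gd_step(w, lr)
    return w

# Initial weights closer to the minimum leave a smaller residual error after the same number of steps.
for w0 in ([1.2, -1.8], [10.0, 10.0]):
    w = train(w0)
    print(w0, "->", w, "error:", 0.5 * np.sum((w - w_star) ** 2))

For this toy error and any learning rate in (0, 2), each step contracts distances to w_star, which is the simplest situation in which the iterated map has a global attractor, in the spirit of the keywords below.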
Keywords
Gradient descent, training neural network, global attractor, ergodic