On the Computational Power of Online Gradient Descent

Conference on Learning Theory (2019)

Abstract
We prove that the evolution of weight vectors in online gradient descent can encode arbitrary polynomial-space computations, even in very simple learning settings. Our results imply that, under weak complexity-theoretic assumptions, it is impossible to reason efficiently about the fine-grained behavior of online gradient descent.
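For context, below is a minimal sketch of the standard online gradient descent update whose iterates the paper studies. The quadratic losses and learning rate here are illustrative assumptions for the demo, not the paper's PSPACE-encoding construction.

```python
import numpy as np

def online_gradient_descent(grad_fns, w0, eta):
    """Run online gradient descent: at round t, receive the gradient of
    the adversary's loss and update w_{t+1} = w_t - eta * grad_t(w_t).

    Returns the full trajectory of weight vectors, since the paper's
    result concerns the evolution of these iterates over time."""
    trajectory = [np.asarray(w0, dtype=float)]
    for grad in grad_fns:
        w = trajectory[-1]
        trajectory.append(w - eta * grad(w))
    return trajectory

# Illustrative losses (an assumption, not from the paper):
# l_t(w) = 0.5 * ||w - c_t||^2, whose gradient is w - c_t.
targets = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
grads = [lambda w, c=c: w - c for c in targets]
for t, w in enumerate(online_gradient_descent(grads, w0=[0.0, 0.0], eta=0.5)):
    print(f"round {t}: w = {w}")
```

The paper's hardness result says that predicting fine-grained properties of such a trajectory can be as hard as simulating an arbitrary polynomial-space computation, even when each round's loss is very simple.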
Keywords
online gradient descent, computational power