Proving the Lottery Ticket Hypothesis for Convolutional Neural Networks

International Conference on Learning Representations (ICLR), 2022

Abstract
The lottery ticket hypothesis states that a randomly-initialized neural network contains a small subnetwork which, when trained in isolation, can compete with the performance of the original network. Recent theoretical works proved an even stronger version: every sufficiently overparameterized (dense) neural network contains a subnetwork that, even without training, achieves accuracy comparable to that of the trained large network. These works left as an open problem to extend the result to convolutional neural networks (CNNs). In this work we provide such a generalization by showing that, with high probability, it is possible to approximate any CNN by pruning a random CNN whose size is larger by a logarithmic factor.
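To make the statement concrete, the sketch below shows the basic object involved: a convolutional layer whose random weights are frozen and whose subnetwork is selected purely by a binary pruning mask. This is a minimal PyTorch illustration, not the paper's construction; the theorem asserts that a suitable mask exists for a sufficiently wide random CNN, whereas here the mask is simply drawn at random for demonstration, and all names are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedConv2d(nn.Module):
    """A conv layer with frozen random weights; a binary mask selects the subnetwork."""
    def __init__(self, in_ch, out_ch, kernel_size, keep_ratio=0.5):
        super().__init__()
        # Random weights are never trained; only the mask defines the "ticket".
        self.weight = nn.Parameter(
            torch.randn(out_ch, in_ch, kernel_size, kernel_size), requires_grad=False
        )
        # A random mask is used here purely for illustration: the theorem says a
        # suitable mask *exists*, not that a random one works.
        self.register_buffer("mask", (torch.rand_like(self.weight) < keep_ratio).float())
        self.padding = kernel_size // 2

    def forward(self, x):
        return F.conv2d(x, self.weight * self.mask, padding=self.padding)

layer = MaskedConv2d(3, 16, 3)
out = layer(torch.randn(1, 3, 32, 32))
print(out.shape)  # torch.Size([1, 16, 32, 32])
```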
Keywords
lottery ticket hypothesis,convolutional neural network,network pruning,random subset sum,random neural network
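The keyword "random subset sum" refers to the probabilistic tool commonly used in such existence proofs: with only logarithmically many i.i.d. random weights, some subset sums arbitrarily close to any given target with high probability. The toy script below (my own illustration, not taken from the paper) checks this numerically by brute force; the logarithmic number of samples is consistent with the logarithmic over-parameterization factor mentioned in the abstract.

```python
# Toy random subset sum demo: with roughly log2(1/eps) uniform samples,
# some subset usually sums to within eps of an arbitrary target in [-1, 1].
import itertools
import numpy as np

rng = np.random.default_rng(0)
eps = 0.01
target = 0.37                              # arbitrary target value in [-1, 1]
n = int(np.ceil(np.log2(1 / eps))) + 4     # logarithmically many random weights
samples = rng.uniform(-1, 1, size=n)

# Brute-force search over all 2^n subsets for the best-approximating subset sum.
best = min(
    (sum(combo) for r in range(n + 1) for combo in itertools.combinations(samples, r)),
    key=lambda s: abs(s - target),
)
print(f"n={n}, best subset sum {best:.4f} vs target {target}")
```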