SparseProp: Efficient Sparse Backpropagation for Faster Training of Neural Networks at the Edge

ICML 2023

Abstract
We provide an efficient implementation of the backpropagation algorithm, specialized to the case where the weights of the neural network being trained are sparse. Our algorithm is general, as it applies to arbitrary (unstructured) sparsity and common layer types (e.g., convolutional or linear). We provide a fast vectorized implementation on commodity CPUs, and show that it can yield speedups in end-to-end runtime experiments, both in transfer learning using already-sparsified networks, and in training sparse networks from scratch. Thus, our results provide the first support for sparse training on commodity hardware.
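To make the core idea concrete, below is a minimal Python/SciPy sketch of backpropagation through a single linear layer whose weights are stored in a sparse (CSR) format, so that both the forward and backward passes cost time proportional to the number of nonzero weights rather than the dense layer size. This is an illustrative sketch only, not the paper's vectorized CPU implementation; the function names and the CSR representation are assumptions for the example.

# Sketch: sparse backprop through a linear layer y = W x, with W in CSR form.
# Illustrative only; SparseProp itself is a vectorized CPU kernel.
import numpy as np
from scipy.sparse import csr_matrix

def sparse_linear_forward(W_sparse, x):
    # Forward pass: sparse matrix-vector product, cost ~ nnz(W).
    return W_sparse @ x

def sparse_linear_backward(W_sparse, x, grad_y):
    # Gradient w.r.t. the input: also a sparse product, cost ~ nnz(W).
    grad_x = W_sparse.T @ grad_y
    # Gradient w.r.t. the weights is only needed at nonzero positions,
    # since pruned (zero) weights stay zero during sparse training.
    # For each stored entry (i, j): dL/dW[i, j] = grad_y[i] * x[j].
    rows, cols = W_sparse.nonzero()
    grad_W_values = grad_y[rows] * x[cols]
    return grad_x, grad_W_values

# Usage: a 4x6 layer with roughly 75% of the weights pruned to zero.
rng = np.random.default_rng(0)
W_dense = rng.standard_normal((4, 6)) * (rng.random((4, 6)) < 0.25)
W = csr_matrix(W_dense)
x = rng.standard_normal(6)
y = sparse_linear_forward(W, x)
grad_x, grad_W_values = sparse_linear_backward(W, x, y - 1.0)

The key design point the sketch mirrors is that the weight gradient is computed only for the stored nonzeros, which is what lets sparse training avoid dense-sized work in the backward pass as well as the forward pass.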