Pruning Deep Convolutional Neural Networks via Gradient Support Pursuit

PRCV (3)(2020)

Abstract
In this paper, we propose a filter pruning method, Filter Pruning via Gradient Support Pursuit (FPGraSP), which accelerates and compresses very deep Convolutional Neural Networks effectively in an iterative way. Previous work has shown that Gradient Support Pursuit (GraSP) is well suited to sparsity-constrained optimization in machine learning. We develop a modification of GraSP that applies to structured pruning in deep CNNs. Specifically, we select the filters with the largest gradient magnitudes and merge their indices with the indices of the filters with the largest weights. We then update the parameters over this union. Finally, we perform filter selection dynamically to retain the filters of largest magnitude. Unlike previous methods that remove filters with small weights but neglect gradient information, we exploit the gradients directly. Our experimental results on MNIST, CIFAR-10 and CIFAR-100 clearly demonstrate the efficiency of our FPGraSP algorithm. For example, when pruning ResNet-56 on CIFAR-10, FPGraSP without fine-tuning incurs only a 0.04% accuracy drop while achieving a 52.63% FLOPs reduction.
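The iterative procedure described in the abstract (top-gradient selection, union with top-weight filters, restricted update, magnitude-based selection) can be sketched as follows. This is a minimal NumPy illustration of one pruning step, not the authors' implementation; the function name `fpgrasp_step`, the per-filter L2-norm criterion, and the plain gradient-descent update are all assumptions made for clarity.

```python
import numpy as np

def fpgrasp_step(W, G, k, lr=0.1):
    """One hypothetical FPGraSP-style iteration on a layer's filters.

    W: (n_filters, filter_size) flattened filter weights
    G: gradients of the loss w.r.t. W, same shape
    k: number of filters to keep
    Returns the pruned weights and the sorted indices of kept filters.
    """
    # 1) Select filters with the largest gradient magnitudes (L2 norm per filter).
    grad_norms = np.linalg.norm(G, axis=1)
    top_grad = np.argsort(grad_norms)[-k:]

    # 2) Merge their indices with those of the filters with the largest weights.
    weight_norms = np.linalg.norm(W, axis=1)
    top_weight = np.argsort(weight_norms)[-k:]
    union = np.union1d(top_grad, top_weight)

    # 3) Update parameters only over the union (restricted gradient step).
    W = W.copy()
    W[union] -= lr * G[union]

    # 4) Dynamic filter selection: keep the k filters of largest magnitude
    #    after the update and zero out the rest.
    kept = np.argsort(np.linalg.norm(W, axis=1))[-k:]
    mask = np.zeros(W.shape[0], dtype=bool)
    mask[kept] = True
    W[~mask] = 0.0
    return W, np.sort(kept)
```

In a full training loop this step would be interleaved with normal SGD epochs; the paper's scheme is iterative, so the support of retained filters can change between steps rather than being fixed once.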
Keywords
Filter pruning, Gradient support pursuit algorithm, Deep convolutional neural network