Structured Pruning for Efficient ConvNets via Incremental Regularization

2019 International Joint Conference on Neural Networks (IJCNN)

Abstract
Parameter pruning is a promising approach for CNN compression and acceleration: it eliminates redundant model parameters with tolerable performance loss. Despite their effectiveness, existing regularization-based pruning methods usually drive weights toward zero with large, constant regularization factors, neglecting the fact that the expressiveness of a CNN is fragile and calls for a gentler regularization scheme that lets the network adapt during pruning. To address this, we propose a new regularization-based pruning method (named IncReg) that incrementally assigns different regularization factors to different weight groups based on their relative importance. Its effectiveness is demonstrated on popular CNNs in comparison with state-of-the-art methods.
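The abstract describes IncReg only at a high level. Below is a minimal, hypothetical PyTorch sketch of the general idea of per-group incremental regularization: each output filter of a convolution is a weight group, less important groups have their regularization factor raised by a small increment over time, and the penalty gradient is added during training. The helper names (incremental_group_reg, apply_group_reg), the L1-norm importance proxy, and the hyperparameters (delta, prune_ratio, reg_ceiling) are illustrative assumptions, not the paper's actual formulation.

```python
import torch
import torch.nn as nn

def incremental_group_reg(conv: nn.Conv2d, reg: torch.Tensor,
                          delta: float = 1e-4, prune_ratio: float = 0.5,
                          reg_ceiling: float = 1.0) -> torch.Tensor:
    """One update of per-filter regularization factors (hypothetical helper).

    reg: tensor of shape (out_channels,) holding the current regularization
         factor for each filter (weight group).
    """
    with torch.no_grad():
        # Rank filters by a simple importance proxy: the L1 norm of each
        # output filter (an assumption; the paper's criterion may differ).
        importance = conv.weight.abs().sum(dim=(1, 2, 3))
        order = importance.argsort()  # least important filters first
        n_prune = int(prune_ratio * len(order))
        # Incrementally raise the factor for the least important groups,
        # rather than applying one large constant penalty to all of them.
        reg[order[:n_prune]] = (reg[order[:n_prune]] + delta).clamp(max=reg_ceiling)
    return reg

def apply_group_reg(conv: nn.Conv2d, reg: torch.Tensor) -> None:
    """Add the group L2 penalty gradient reg_g * w_g to the existing grads.

    Call after loss.backward() and before optimizer.step(), so that
    conv.weight.grad is already populated.
    """
    with torch.no_grad():
        conv.weight.grad += reg.view(-1, 1, 1, 1) * conv.weight
```

In a training loop, one would invoke incremental_group_reg every few iterations to update the per-group factors and call apply_group_reg after each backward pass; filters whose factor saturates at the ceiling are driven toward zero and can then be removed to obtain a structurally smaller network.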
Keywords
ConvNets,IncReg,CNN compression,regularization-based parameter pruning methods,empirical analysis,ImageNet datasets,redundant model parameter elimination,incremental regularization,structured pruning,CIFAR-10 dataset,gentle regularization scheme