Hierarchical Threshold Pruning Based on Uniform Response Criterion.

IEEE Transactions on Neural Networks and Learning Systems (2023)

Abstract
Convolutional neural networks (CNNs) have been successfully applied to various fields. However, their overparameterization demands more memory and training time, making them unsuitable for some resource-constrained devices. To address this issue, filter pruning has been proposed as one of the most efficient remedies. In this article, we propose a feature-discrimination-based filter importance criterion, the uniform response criterion (URC), as a key component of filter pruning. It converts the maximum activation responses into probabilities and then measures the importance of a filter through the distribution of these probabilities over classes. However, applying URC directly to global threshold pruning causes two problems. First, some layers may be pruned away completely under a global pruning setting. Second, global threshold pruning neglects that filters in different layers have different importance. To address these issues, we propose hierarchical threshold pruning (HTP) with URC. Rather than comparing filter importance across all layers, it confines each pruning step to a relatively redundant layer, which prevents important filters from being pruned. The effectiveness of our method stems from three techniques: 1) measuring filter importance by URC; 2) normalizing filter scores; and 3) pruning in relatively redundant layers. Extensive experiments on CIFAR-10/100 and ImageNet show that our method achieves state-of-the-art performance on multiple benchmarks.
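The criterion and the layer-wise thresholding step can be illustrated with a minimal NumPy sketch. Everything below is an illustrative assumption rather than the paper's implementation: urc_scores uses the normalized entropy of each filter's class-response distribution as a stand-in for URC, and hierarchical_threshold_prune simply normalizes scores within each layer and prunes below a threshold while never emptying a layer; the paper's exact scoring rule and its selection of relatively redundant layers are not reproduced here.

import numpy as np

def urc_scores(max_responses):
    """Illustrative URC-style filter scores (not the paper's exact formula).

    max_responses: (num_classes, num_filters) array whose entry (c, j) is the
    average maximum activation of filter j over samples of class c. Responses
    are converted to a per-filter probability distribution over classes; the
    normalized entropy of that distribution is used as the importance score,
    so a filter that responds uniformly across classes scores close to 1.
    """
    num_classes, _ = max_responses.shape
    probs = max_responses / (max_responses.sum(axis=0, keepdims=True) + 1e-12)
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=0)
    return entropy / np.log(num_classes)  # normalized to [0, 1]

def hierarchical_threshold_prune(layer_scores, threshold, min_keep=1):
    """Per-layer thresholding: normalize scores within each layer and prune
    filters below the threshold, always keeping at least `min_keep` filters
    so that no layer is removed entirely (the failure mode of global
    threshold pruning noted in the abstract)."""
    keep_masks = []
    for scores in layer_scores:
        span = scores.max() - scores.min()
        norm = (scores - scores.min()) / (span + 1e-12)  # per-layer normalization
        mask = norm >= threshold
        if mask.sum() < min_keep:  # guard: retain the top-scoring filters
            mask[np.argsort(norm)[-min_keep:]] = True
        keep_masks.append(mask)
    return keep_masks

# Hypothetical example: three convolutional layers with 16, 32, and 64
# filters on a 10-class task, using random responses as placeholder data.
rng = np.random.default_rng(0)
layer_scores = [urc_scores(rng.random((10, n))) for n in (16, 32, 64)]
masks = hierarchical_threshold_prune(layer_scores, threshold=0.3)
print([int(m.sum()) for m in masks])  # number of filters kept per layer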
Keywords
Filter pruning, hierarchical threshold pruning (HTP), model compression, uniform response criterion (URC)