BFConv: Improving Convolutional Neural Networks with Butterfly Convolution

ICONIP (2021)

Abstract
Convolutional neural networks (CNNs) are basic neural networks widely used in vision tasks. Many CNN designs reduce model complexity by alleviating redundancy in feature maps. Inspired by digital signal processing theory, this paper revisits the discrete Fourier transform (DFT) and identifies its similarities with standard convolution. In particular, the DFT has a fast algorithm, the FFT, which raises the question: can the idea behind the FFT be borrowed to build a more efficient convolution filter? Based on the butterfly operation of the FFT, we propose a novel butterfly convolution (BFConv). In addition, we show that group weight sharing convolution is the basic unit of BFConv. Compared with the traditional group convolution structure, BFConv constructs group residual-like connections and enlarges the receptive field of each sub-feature layer. Without changing the network architecture, we integrate BFConv into ResNet-50, ShuffleNet and VGG-16. Experimental results on CIFAR-10 and ImageNet demonstrate that these BFConv-equipped networks reduce parameters and computation while achieving similar or higher accuracy. Remarkably, when ResNet-50 with embedded BFConv reaches nearly half the compression ratio of the model, it performs favorably against its state-of-the-art competitors.
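The abstract does not spell out the exact BFConv wiring, but its description (a group weight sharing convolution as the basic unit, group residual-like connections, and an enlarged receptive field per sub-feature layer) suggests a structure along the lines of the sketch below. This is a hedged PyTorch illustration only: the module name `BFConvSketch`, the single shared 3x3 filter bank, and the previous-group addition are assumptions made for clarity, not the authors' implementation.

```python
import torch
import torch.nn as nn


class BFConvSketch(nn.Module):
    """Illustrative sketch of a butterfly-style convolution block.

    The input channels are split into `groups` sub-feature layers. One 3x3
    convolution is shared across all groups (group weight sharing), and each
    group also receives the previous group's output, forming residual-like
    connections that enlarge the receptive field of later groups. This wiring
    is an assumption for illustration, not the paper's exact BFConv definition.
    """

    def __init__(self, channels: int, groups: int = 4):
        super().__init__()
        assert channels % groups == 0
        self.groups = groups
        width = channels // groups
        # Single shared filter bank applied to every sub-feature group.
        self.shared_conv = nn.Conv2d(width, width, kernel_size=3,
                                     padding=1, bias=False)
        self.bn = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        splits = torch.chunk(x, self.groups, dim=1)
        outs, prev = [], None
        for s in splits:
            # Residual-like connection: mix in the previous group's output
            # before applying the shared convolution (butterfly-style crossing).
            y = s if prev is None else s + prev
            y = self.shared_conv(y)
            outs.append(y)
            prev = y
        out = torch.cat(outs, dim=1)
        return self.relu(self.bn(out))


if __name__ == "__main__":
    x = torch.randn(1, 64, 32, 32)
    block = BFConvSketch(channels=64, groups=4)
    print(block(x).shape)  # torch.Size([1, 64, 32, 32])
```

Because the 3x3 filter bank is reused across all groups, the parameter count of the sketch is roughly 1/groups of a standard convolution with the same channel width, which matches the abstract's claim of reduced parameters and computation without changing the surrounding network architecture.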
Keywords
Convolutional neural network, FFT, Butterfly convolution, Group weight sharing convolution