SP-GAN: Self-Growing and Pruning Generative Adversarial Networks.

IEEE Transactions on Neural Networks and Learning Systems (2021)

Abstract
This article presents a new Self-Growing and Pruning Generative Adversarial Network (SP-GAN) for realistic image generation. In contrast to traditional GAN models, SP-GAN can dynamically adjust the size and architecture of the network during training by using the proposed self-growing and pruning mechanisms. More specifically, we first train two seed networks as the generator and discriminator, each containing a small number of convolution kernels. Such small-scale networks are much easier and faster to train than large-capacity networks. Second, in the self-growing step, we replicate the convolution kernels of each seed network to enlarge the network, and then fine-tune the expanded network. More importantly, to prevent excessive growth of each seed network in the self-growing stage, we propose a pruning strategy that reduces the redundancy of an augmented network, yielding the optimal network scale. Finally, we design a new adaptive loss function for training the proposed SP-GAN model, formulated as a variable loss computation whose hyperparameters dynamically adapt to different training stages. Experimental results obtained on several data sets demonstrate the merits of the proposed method, especially in terms of the stability and efficiency of network training. The source code of the proposed SP-GAN method is publicly available at https://github.com/Lambert-chen/SPGAN.git.
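As a rough illustration of the self-growing step described in the abstract, the sketch below widens a convolution layer by replicating its kernels before fine-tuning. This is only the general mechanism implied by the abstract; the function name `grow_conv_layer`, the growth factor, and the replication rule are illustrative assumptions, not the authors' implementation (see the linked repository for the actual code).

```python
# Minimal sketch (not the authors' code) of "self-growing" by kernel replication.
import torch
import torch.nn as nn


def grow_conv_layer(conv: nn.Conv2d, growth_factor: int = 2) -> nn.Conv2d:
    """Return a wider Conv2d whose extra kernels are copies of the existing ones."""
    out_ch, in_ch, kh, kw = conv.weight.shape
    wider = nn.Conv2d(in_ch, out_ch * growth_factor, (kh, kw),
                      stride=conv.stride, padding=conv.padding,
                      bias=conv.bias is not None)
    with torch.no_grad():
        # Replicate the seed layer's kernels to fill the enlarged layer.
        wider.weight.copy_(conv.weight.repeat(growth_factor, 1, 1, 1))
        if conv.bias is not None:
            wider.bias.copy_(conv.bias.repeat(growth_factor))
    return wider


# Example: grow a seed layer with 16 kernels into one with 32 kernels,
# which would then be fine-tuned as part of the expanded generator/discriminator.
seed = nn.Conv2d(3, 16, 3, padding=1)
grown = grow_conv_layer(seed, growth_factor=2)
print(grown.weight.shape)  # torch.Size([32, 3, 3, 3])
```

In the same spirit, the pruning step would then remove redundant kernels from the grown layer before further fine-tuning, so that the network converges to an appropriate scale rather than growing without bound.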
Keywords
Training, Generative adversarial networks, Generators, Adaptation models, Convolution, Stability analysis