Increasing Depth of Neural Networks for Life-long Learning

arXiv (2022)

Abstract
Increasing neural network depth is a well-known method for improving performance, and modern deep architectures contain multiple mechanisms that allow hundreds or even thousands of layers to train. This work investigates whether extending neural network depth can also be beneficial in a life-long learning setting. In particular, we propose a novel method that adds new layers on top of existing ones to enable forward transfer of knowledge and to adapt previously learned representations to new tasks. We use a measure of task similarity to select the best location in the network at which to add new nodes with trainable parameters. This approach yields a tree-like model in which each node is a set of neural network parameters dedicated to a specific task. The proposed method is inspired by the Progressive Neural Network (PNN) concept: it is rehearsal-free and benefits from a dynamically changing network structure, yet it requires fewer parameters per task than PNN. Experiments on Permuted MNIST and SplitCIFAR show that the proposed algorithm is on par with other continual-learning methods. We also perform ablation studies to clarify the contribution of each part of the system.
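The abstract describes the mechanism only at a high level. The minimal PyTorch sketch below illustrates the tree-structured idea under stated assumptions: `TaskNode`, `add_task`, the block shapes, and `similarity_fn` are hypothetical stand-ins, since the abstract does not specify the paper's actual layer design or task-similarity measure.

```python
import torch
import torch.nn as nn

# Sketch only: block shapes, the similarity measure, and all names here
# are assumptions; the abstract does not specify the paper's exact design.

class TaskNode(nn.Module):
    """A set of layers dedicated to one task, stacked on a parent node."""

    def __init__(self, in_dim: int, out_dim: int, parent: "TaskNode | None" = None):
        super().__init__()
        self.parent = parent          # frozen path to the root, reused for forward transfer
        self.out_dim = out_dim
        self.block = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.parent is not None:
            with torch.no_grad():     # reuse frozen ancestor representations
                x = self.parent(x)
        return self.block(x)


def add_task(tree: list, task_embedding, similarity_fn, hidden_dim: int) -> TaskNode:
    """Attach a new trainable node under the most similar existing node.

    `similarity_fn(node, task_embedding)` is an assumed stand-in for the
    paper's task-similarity measure.
    """
    parent = max(tree, key=lambda n: similarity_fn(n, task_embedding))
    for p in parent.parameters():     # freeze the whole path to the root
        p.requires_grad = False
    child = TaskNode(parent.out_dim, hidden_dim, parent=parent)
    tree.append(child)
    return child
```

When training a newly added node, only its own block would be optimized, e.g. `torch.optim.Adam(child.block.parameters())`; each task thus adds one block of parameters rather than a full column per task as in PNN.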
Keywords
Life-long learning, Continual learning, Deep learning, Machine learning