Overcoming Catastrophic Forgetting in Continual Learning by Exploring Eigenvalues of Hessian Matrix

IEEE Transactions on Neural Networks and Learning Systems (2023)

Abstract
Neural networks tend to suffer performance deterioration on previous tasks when they are applied to multiple tasks sequentially without access to previous data. This problem, commonly known as catastrophic forgetting, is a significant challenge in continual learning (CL). To overcome catastrophic forgetting, regularization-based CL methods construct a regularization term, which can be viewed as an approximation of the loss function of previous tasks, to penalize the update of parameters. However, rigorous theoretical analysis of regularization-based methods is limited. Therefore, we theoretically analyze the forgetting and convergence properties of regularization-based methods. The theoretical results demonstrate that the upper bound on forgetting depends on the maximum eigenvalue of the Hessian matrix. Hence, to decrease this upper bound, we propose the eiGenvalues ExplorAtion Regularization-based (GEAR) method, which explores the geometric properties of the approximation loss of prior tasks with respect to the maximum eigenvalue. Extensive experimental results demonstrate that our method mitigates catastrophic forgetting and outperforms existing regularization-based methods.
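The claimed dependence on the maximum Hessian eigenvalue follows the standard second-order reasoning (a sketch of the usual argument, not the paper's exact theorem): for an update $\Delta\theta$ away from a previous-task minimum $\theta^{*}$, where the first-order term vanishes,

$$\mathcal{L}_{\mathrm{old}}(\theta^{*}+\Delta\theta)-\mathcal{L}_{\mathrm{old}}(\theta^{*})\;\approx\;\tfrac{1}{2}\,\Delta\theta^{\top}H\,\Delta\theta\;\le\;\tfrac{1}{2}\,\lambda_{\max}(H)\,\lVert\Delta\theta\rVert^{2},$$

with $H$ the Hessian of the previous-task loss at $\theta^{*}$, so a flatter loss landscape (smaller $\lambda_{\max}$) tightens the forgetting bound. As a minimal PyTorch sketch of the key quantity, $\lambda_{\max}$ can be estimated by power iteration on Hessian-vector products; the function name, iteration count, and usage below are illustrative assumptions, not the authors' released GEAR implementation:

```python
# Hypothetical sketch: estimate the largest Hessian eigenvalue of a
# previous-task loss via power iteration on Hessian-vector products,
# using PyTorch double backward. Illustrative only, not the GEAR code.

import torch

def max_hessian_eigenvalue(loss, params, iters=20):
    """Estimate lambda_max of the Hessian of `loss` w.r.t. `params`."""
    # First-order gradients built with create_graph=True so that a
    # second backward pass through them is possible.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    flat_grad = torch.cat([g.reshape(-1) for g in grads])

    # Random unit start vector for power iteration.
    v = torch.randn_like(flat_grad)
    v = v / v.norm()

    lam = flat_grad.new_zeros(())
    for _ in range(iters):
        # Hessian-vector product: H v = d(g^T v) / d(theta).
        hv = torch.autograd.grad(flat_grad @ v, params, retain_graph=True)
        hv = torch.cat([h.reshape(-1) for h in hv]).detach()
        lam = v @ hv                  # Rayleigh quotient v^T H v (v unit norm)
        v = hv / (hv.norm() + 1e-12)  # re-normalize for the next step
    return lam

# Usage (hypothetical): measure sharpness of the previous-task loss.
# loss_old = criterion(model(x_old), y_old)
# lam = max_hessian_eigenvalue(loss_old, list(model.parameters()))
```

Using such an estimate as a differentiable training penalty would additionally require create_graph=True in the inner grad call so that gradients flow through the eigenvalue estimate; the detached form above is suited to monitoring sharpness or weighting a fixed quadratic penalty.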
Keywords
Catastrophic forgetting, continual learning (CL), incremental learning, lifelong learning