Overcoming catastrophic forgetting in neural networks
Proceedings of the National Academy of Sciences of the United States of America, Volume 114, Issue 13, 2017.
artificial intelligence, continual learning, deep learning, stability plasticity, synaptic consolidation
The ability to learn tasks in a sequential fashion is crucial to the development of artificial intelligence. Until now neural networks have not been capable of this and it has been widely thought that catastrophic forgetting is an inevitable feature of connectionist models. We show that it is possible to overcome this limitation and train...