Alleviating catastrophic forgetting: context gating and synaptic stabilization (Masse et al 2018)


"Artificial neural networks can suffer from catastrophic forgetting, in which learning a new task causes the network to forget how to perform previous tasks. While previous studies have proposed various methods that can alleviate forgetting over small numbers (<10) of tasks, it is uncertain whether they can prevent forgetting across larger numbers of tasks. In this study, we propose a neuroscience-inspired scheme, called “context-dependent gating,” in which mostly nonoverlapping sets of units are active for any one task. Importantly, context-dependent gating has a straightforward implementation, requires little extra computational overhead, and when combined with previous methods to stabilize connection weights, can allow networks to maintain high performance across large numbers of sequentially presented tasks."

Model Type: Connectionist Network

Model Concept(s): Learning; Reinforcement Learning

Simulation Environment: Python (web link to model)

Implementer(s): Masse, Nicolas Y [masse at uchicago.edu]; Grant, Gregory D [dfreedman at uchicago.edu]

References:

Masse NY, Grant GD, Freedman DJ. (2018). Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization. Proceedings of the National Academy of Sciences of the United States of America. 115 [PubMed]