Learning in recurrent neural networks has long been fraught with difficulties. Here we report substantial progress in the unsupervised learning of recurrent networks that can keep track of an input signal. Specifically, we show how these networks can learn to efficiently represent their present and past inputs, using only local learning rules.
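
The sketch below is a minimal, generic illustration of this idea in Python (the entry's simulation environment); it is not the authors' algorithm. It shows a leaky recurrent rate network whose feedforward weights are updated with a Hebbian (Oja-style) rule and whose lateral weights with an anti-Hebbian rule, so that every weight change depends only on locally available pre- and postsynaptic activity. The network size, learning rates, and input statistics are illustrative assumptions.

import numpy as np

# Minimal sketch of local, unsupervised learning in a recurrent rate network.
# NOT the algorithm of Vertechi et al. (2014); a generic Hebbian/anti-Hebbian
# scheme is used for illustration. All parameter values are assumptions.
rng = np.random.default_rng(0)
n_in, n_rec = 1, 20            # one scalar input signal, 20 recurrent units
eta_ff, eta_lat = 1e-3, 1e-3   # local learning rates (assumed values)

F = rng.normal(scale=0.1, size=(n_rec, n_in))  # feedforward (input) weights
W = np.zeros((n_rec, n_rec))                   # lateral (recurrent) weights

x = np.zeros(n_in)             # slowly varying input signal
r = np.zeros(n_rec)            # recurrent firing rates

for t in range(20000):
    # Ornstein-Uhlenbeck-like input: the past leaks slowly into the present
    x = 0.99 * x + 0.1 * rng.normal(size=n_in)

    # leaky rate dynamics: feedforward drive minus learned lateral inhibition
    r = 0.9 * r + F @ x - W @ r

    # Hebbian (Oja-style) rule on the feedforward weights -- purely local:
    # each synapse sees only its presynaptic input and postsynaptic rate
    F += eta_ff * (np.outer(r, x) - (r ** 2)[:, None] * F)

    # anti-Hebbian rule on the lateral weights decorrelates the units,
    # again using only the activities of the two connected neurons
    W += eta_lat * np.outer(r, r)
    np.fill_diagonal(W, 0.0)

In the model described in the reference below, the local rules are instead derived from an efficient-coding objective, so that the learned recurrent connectivity implements a short-term memory of present and past inputs; see the paper for the derivation.
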
Model Type: Connectionist Network
Simulation Environment: Python
Implementer(s): Brendel, Wieland [wieland.brendel at bethgelab.org]
References:
Vertechi P, Brendel W, Machens CK (2014). Unsupervised learning of an efficient short-term memory network. Advances in Neural Information Processing Systems, 27.