Abarbanel HD, Creveling DR, Jeanne JM. (2008). Estimation of parameters in nonlinear systems using balanced synchronization. Physical review E. 77 [PubMed]
Amit DJ, Brunel N. (1997). Model of global spontaneous activity and local structured activity during delay periods in the cerebral cortex. Cerebral cortex. 7 [PubMed]
Atiya AF, Parlos AG. (2000). New results on recurrent network training: unifying the algorithms and accelerating convergence. IEEE transactions on neural networks. 11 [PubMed]
Bertschinger N, Natschläger T. (2004). Real-time computation at the edge of chaos in recurrent neural networks. Neural computation. 16 [PubMed]
Brunel N. (2000). Dynamics of networks of randomly connected excitatory and inhibitory spiking neurons. Journal of physiology, Paris. 94 [PubMed]
Buonomano DV, Maass W. (2009). State-dependent computations: spatiotemporal processing in cortical networks. Nature reviews. Neuroscience. 10 [PubMed]
Buonomano DV, Merzenich MM. (1995). Temporal information transformed into a spatial code by a neural network with realistic properties. Science. 267 [PubMed]
Churchland MM, Shenoy KV. (2007). Temporal complexity and heterogeneity of single-neuron activity in premotor and motor cortex. Journal of neurophysiology. 97 [PubMed]
Churchland MM, Shenoy KV. (2007). Delay of movement caused by disruption of cortical preparatory activity. Journal of neurophysiology. 97 [PubMed]
Churchland MM, Yu BM, Ryu SI, Santhanam G, Shenoy KV. (2006). Neural variability in premotor cortex provides a signature of motor preparation. The Journal of neuroscience. 26 [PubMed]
Doya K. (1992). Bifurcations in the learning of recurrent neural networks. IEEE International Symposium on Circuits and Systems. 6
Fetz EE. (1992). Are movement parameters recognizably coded in the activity of single neurons? Behavioral and Brain Sciences. 15
Ganguli S, Huh D, Sompolinsky H. (2008). Memory traces in dynamical systems. Proceedings of the National Academy of Sciences of the United States of America. 105 [PubMed]
Haykin S. (2002). Adaptive filter theory (4th ed).
Hinton GE, Osindero S, Teh YW. (2006). A fast learning algorithm for deep belief nets. Neural computation. 18 [PubMed]
Jaeger H. (2003). Adaptive nonlinear system identification with echo state networks. Advances in Neural Information Processing Systems. 15
Jaeger H, Haas H. (2004). Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication. Science. 304 [PubMed]
Maass W, Joshi P, Sontag ED. (2007). Computational aspects of feedback in neural circuits. PLoS computational biology. 3 [PubMed]
Maass W, Natschläger T, Markram H. (2002). Real-time computing without stable states: a new framework for neural computation based on perturbations. Neural computation. 14 [PubMed]
Miall RC, Weir DJ, Wolpert DM, Stein JF. (1993). Is the cerebellum a Smith predictor? Journal of motor behavior. 25 [PubMed]
Molgedey L, Schuchhardt J, Schuster HG. (1992). Suppressing chaos in neural networks by noise. Physical review letters. 69 [PubMed]
Pearlmutter B. (1989). Learning state space trajectories in recurrent neural networks. Neural Computation. 1
Robinson DA. (1992). Implications of neural networks for how we think about brain function. Behavioral and Brain Sciences. 15
Roweis S, Hinton GE, Taylor GW. (2006). Modeling human motion using binary latent variables. Advances in Neural Information Processing Systems. 19
Rumelhart D, McClelland J. (1986). Parallel Distributed Processing.
Rumelhart DE, Hinton GE, Williams RJ. (1986). Learning internal representations by error propagation. Parallel Distributed Processing. 1
Sompolinsky H, Crisanti A, Sommers HJ. (1988). Chaos in random neural networks. Physical review letters. 61 [PubMed]
Strogatz SH. (1994). Nonlinear Dynamics and Chaos.
Sussillo D. (2009). Learning in chaotic recurrent networks. PhD thesis.
Yuste R, MacLean JN, Smith J, Lansner A. (2005). The cortex as a central pattern generator. Nature reviews. Neuroscience. 6 [PubMed]
Zipser D, Williams RJ. (1989). A learning algorithm for continually running fully recurrent neural networks. Neural Computation. 1
van Vreeswijk C, Sompolinsky H. (1996). Chaos in neuronal networks with balanced excitatory and inhibitory activity. Science. 274 [PubMed]
Bayati M et al. (2018). Storage fidelity for sequence memory in the hippocampal circuit. PloS one. 13 [PubMed]
Cone I, Shouval HZ. (2021). Learning precise spatiotemporal sequences via biophysically realistic learning rules in a modular, spiking network. eLife. 10 [PubMed]
Dura-Bernal S et al. (2023). Data-driven multiscale model of macaque auditory thalamocortical circuits reproduces in vivo dynamics. Cell reports. 42 [PubMed]
Maes A, Barahona M, Clopath C. (2020). Learning spatiotemporal signals using a recurrent spiking network that discretizes time. PLoS computational biology. 16 [PubMed]
Muscinelli SP, Gerstner W, Schwalger T. (2019). How single neuron properties shape chaotic dynamics and signal transmission in random neural networks. PLoS computational biology. 15 [PubMed]
Rössert C, Dean P, Porrill J. (2015). At the Edge of Chaos: How Cerebellar Granular Layer Network Dynamics Can Provide the Basis for Temporal Filters. PLoS computational biology. 11 [PubMed]
Stroud JP, Porter MA, Hennequin G, Vogels TP. (2018). Motor primitives in space and time via targeted gain modulation in cortical networks. Nature neuroscience. 21 [PubMed]
Wilson CJ, Beverlin B, Netoff T. (2011). Chaotic desynchronization as the therapeutic mechanism of deep brain stimulation. Frontiers in systems neuroscience. 5 [PubMed]