Paninski L. (2003). Estimation of entropy and mutual information. Neural Computation. 15


References and models that cite this paper

Amigó JM, Szczepański J, Wajnryb E, Sanchez-Vives MV. (2004). Estimating the entropy rate of spike trains via Lempel-Ziv complexity. Neural computation. 16 [PubMed]

Chacron MJ. (2006). Nonlinear information processing in a model sensory system. Journal of neurophysiology. 95 [PubMed]

Kennel MB, Shlens J, Abarbanel HD, Chichilnisky EJ. (2005). Estimating entropy rates with Bayesian confidence intervals. Neural computation. 17 [PubMed]

Montemurro MA, Senatore R, Panzeri S. (2007). Tight data-robust bounds to mutual information combining shuffling and model selection techniques. Neural computation. 19 [PubMed]

Nelken I, Chechik G, Mrsic-Flogel TD, King AJ, Schnupp JW. (2005). Encoding stimulus information by spike numbers and mean response time in primary auditory cortex. Journal of computational neuroscience. 19 [PubMed]

Pola G, Petersen RS, Thiele A, Young MP, Panzeri S. (2005). Data-robust tight lower bounds to the information carried by spike times of a neuronal population. Neural computation. 17 [PubMed]

Sharpee T, Rust NC, Bialek W. (2004). Analyzing neural responses to natural signals: maximally informative dimensions. Neural computation. 16 [PubMed]

Shlens J, Kennel MB, Abarbanel HD, Chichilnisky EJ. (2007). Estimating information rates with confidence intervals in neural spike trains. Neural computation. 19 [PubMed]

Van Hulle MM. (2005). Edgeworth approximation of multivariate differential entropy. Neural computation. 17 [PubMed]
