Kennel MB, Shlens J, Abarbanel HD, Chichilnisky EJ. (2005). Estimating entropy rates with Bayesian confidence intervals. Neural computation. 17 [PubMed]

References and models cited by this paper

Balasubramanian V. (1997). Statistical inference, Occam's razor, and statistical mechanics on the space of probability distributions Neural Comput. 9

Bialek W, Nemenman I, Shafee F. (2002). Entropy and inference, revisited Advances in neural information processing systems.

Bialek W, Rieke F, de Ruyter van Steveninck RR, Warland D. (1991). Reading a neural code. Science (New York, N.Y.). 252 [PubMed]

Bialek W, Tishby N, Nemenman I. (2001). Predictability, complexity, and learning. Neural Comput. 13

Borst A, Theunissen FE. (1999). Information theory and neural coding. Nature neuroscience. 2 [PubMed]

Brenner N, Strong SP, Koberle R, Bialek W, de Ruyter van Steveninck RR. (2000). Synergy in a neural code. Neural computation. 12 [PubMed]

Bullock TH, Perkel DH. (1969). Neural coding Neurosciences research symposium summaries.

Buracas GT, Zador AM, DeWeese MR, Albright TD. (1998). Efficient discrimination of temporal patterns by motion-sensitive neurons in primate visual cortex. Neuron. 20 [PubMed]

Costa J, Hero A. (2004). Geodesic entropic graphs for dimension and entropy estimation in manifold learning IEEE Trans Signal Processing. 52

Cover TM, Thomas JA. (1991). Elements of Information Theory.

Duda RO, Hart PE, Stork DG. (2000). Pattern Classification (2nd edition).

Field DJ. (1987). Relations between the statistics of natural images and the response properties of cortical cells. Journal of the Optical Society of America. A, Optics and image science. 4 [PubMed]

Gamerman D. (1997). Markov chain Monte Carlo: Stochastic simulation for Bayesian inference.

Gilmore R, Lefranc M. (2002). The topology of chaos: Alice in stretch and squeeze land.

Hastings WK. (1970). Monte Carlo sampling methods using Markov chains and their applications Biometrika. 57

Hilborn R. (2000). Chaos and nonlinear dynamics: An introduction for scientists and engineers.

Kontoyiannis I, Algoet P, Suhov Y, Wyner A. (1998). Nonparametric entropy estimation for stationary processes and random fields with applications to English text IEEE Trans Inform Theory. 44

Krichevsky RE, Trofimov VK. (1981). The performance of universal encoding IEEE Trans Inform Theory. 27

Lehmann E, Casella G. (1998). Theory of Point Estimation.

Lempel A, Ziv J. (1976). On the complexity of finite sequences IEEE Trans Inform Theory. 22

Lempel A, Ziv J. (1977). A universal algorithm for sequential data compression IEEE Trans Inform Theory. 23

Lempel A, Ziv J. (1978). Compression of individual sequences via variable-rate coding IEEE Trans Inform Theory. 24

Lewen GD, Bialek W, de Ruyter van Steveninck RR. (2001). Neural coding of naturalistic motion stimuli. Network (Bristol, England). 12 [PubMed]

Lind D, Marcus B. (1996). Symbolic dynamics and coding.

Liu RC, Tzonev S, Rebrik S, Miller KD. (2001). Variability and information in a neural code of the cat lateral geniculate nucleus. Journal of neurophysiology. 86 [PubMed]

London M, Schreibman A, Häusser M, Larkum ME, Segev I. (2002). The information efficacy of a synapse. Nature neuroscience. 5 [PubMed]

Ma SK. (1981). Calculation of entropy from data of motion J Stat Phys. 26

MacKay DJ. (2003). Information Theory, Inference and Learning Algorithms.

MacKay D, McCulloch W. (1952). The limiting information capacity of a neuronal link Bull Math Biophys. 14

Martin A, Seroussi G, Weinberger M. (2004). Linear time universal coding and time reversal of tree sources via FSM closure IEEE Trans Inform Theory. 50

Metropolis N, Rosenbluth AW, Rosenbluth MN, Teller AH, Teller E. (1953). Equation of state calculations by fast computing machines J Chem Phys. 21

Miller G, Madow W. (1954). On the maximum likelihood estimate of the Shannon-Wiener measure of information Air Force Cambridge Research Center Technical Report. 75

Nemenman I, Bialek W, de Ruyter van Steveninck R. (2004). Entropy and information in neural spike trains: progress on the sampling problem. Physical review. E, Statistical, nonlinear, and soft matter physics. 69 [PubMed]

Ott E. (2002). Chaos in dynamical systems.

Paninski L. (2003). Estimation of entropy and mutual information Neural Comput. 15

Purpura K, Victor J. (1997). Metric-space analysis of spike trains: Theory, algorithms and application Network: Comput Neural Syst. 8

Reinagel P, Reid RC. (2000). Temporal coding of visual information in the thalamus. The Journal of neuroscience : the official journal of the Society for Neuroscience. 20 [PubMed]

Rieke F, Warland D, de Ruyter van Steveninck R, Bialek W. (1997). Spikes: Exploring the Neural Code.

Rissanen J. (1989). Stochastic Complexity in Statistical Inquiry.

Roulston M. (1999). Estimating the errors on measured entropy and mutual information Physica D. 125

Schneidman E, Bialek W, Berry MJ. (2003). Synergy, redundancy, and independence in population codes. The Journal of neuroscience : the official journal of the Society for Neuroscience. 23 [PubMed]

Shannon CE. (1948). A mathematical theory of communication Bell Syst Tech J. 27

Shlens J, Kennel M, Abarbanel H, Chichilnisky E. (2005). Estimating information rates with Bayesian confidence intervals in neural spike trains.

Simoncelli EP, Olshausen BA. (2001). Natural image statistics and neural representation. Annual review of neuroscience. 24 [PubMed]

Solomonoff R. (1964). A formal theory of inductive inference. Part I Information and Control. 7

Strong SP, Koberle R, de Ruyter van Steveninck R, Bialek W. (1997). Entropy and information in neuronal spike trains. Phys Rev Lett. 80

Tjalkens T, Shtarkov Y, Willems F. (1995). Multialphabet weighting: Universal coding of context tree sources Problems Info Trans. 33

Treves A, Panzeri S. (1995). The upward bias in measures of information derived from limited data samples. Neural Comput. 7

Warland DK, Reinagel P, Meister M. (1997). Decoding visual information from a population of retinal ganglion cells. Journal of neurophysiology. 78 [PubMed]

Willems F. (1998). Context tree weighting: Extensions IEEE Trans Inform Theory. 44

Willems FMJ, Shtarkov YM, Tjalkens T. (1995). The context-tree weighting method: Basic properties IEEE Trans Inform Theory. 41

Wolpert DH, Wolf DR. (1995). Estimating functions of probability distributions from a finite set of samples. Physical review. E, Statistical physics, plasmas, fluids, and related interdisciplinary topics. 52 [PubMed]

Zhang J, Orlitsky A, Santhanam N. (2004). Universal compression of memoryless sources over unknown alphabets IEEE Trans Inform Theory. 50

Ziv J, Wyner AD, Wyner AJ. (1998). On the role of pattern matching in information theory IEEE Trans Inform Theory. 44

de Ruyter van Steveninck RR, Lewen GD, Strong SP, Koberle R, Bialek W. (1997). Reproducibility and variability in neural spike trains. Science (New York, N.Y.). 275 [PubMed]

References and models that cite this paper

Edgerton JR, Hanson JE, Günay C, Jaeger D. (2010). Dendritic sodium channels regulate network integration in globus pallidus neurons: a modeling study. The Journal of neuroscience : the official journal of the Society for Neuroscience. 30 [PubMed]

Montemurro MA, Senatore R, Panzeri S. (2007). Tight data-robust bounds to mutual information combining shuffling and model selection techniques. Neural computation. 19 [PubMed]

Shlens J, Kennel MB, Abarbanel HD, Chichilnisky EJ. (2007). Estimating information rates with confidence intervals in neural spike trains. Neural computation. 19 [PubMed]