Amari S. (2007). Integration of stochastic models by minimizing alpha-divergence. Neural computation. 19 [PubMed]


References and models cited by this paper

Aihara K, Toyoizumi T. (2006). Generalization of the mean-field method for power-law distributions Intl J Bifurcation Chaos. 16

Amari S, Ikeda S, Shimokawa H. (2001). Information geometry and mean-field approximation: The α projection approach. Advanced mean field methods: Theory and practice.

Amari S, Nagaoka H. (2000). Methods of information geometry.

Chernoff H. (1952). A measure of asymptotic efficiency for tests of a hypothesis based on a sum of observations Ann Math Stat. 23

Corcuera JM, Giummole F. (1999). A generalized Bayes rule for prediction Scandinavian J Stat. 26

Csiszar I. (1975). I-divergence geometry of probability distributions and minimization problems Annals Of Probability. 3

Dayan P, Hinton GE, Neal RM, Zemel RS. (1995). The Helmholtz machine. Neural computation. 7 [PubMed]

Eguchi S. (1983). Second order efficiency of minimum contrast estimators in a curved exponential family Ann Stat. 11

Falmagne JC. (1985). Elements of psychophysical theory.

Hardy G, Littlewood JE, Polya G. (1952). Inequalities.

Heller J. (2006). Illumination-invariance of Plateau's midgray. J Math Psychol. 50

Hinton GE. (2002). Training products of experts by minimizing contrastive divergence. Neural computation. 14 [PubMed]

Ikeda S, Tanaka T, Amari S. (2004). Stochastic reasoning, free energy, and information geometry. Neural computation. 16 [PubMed]

Jacobs RA, Hinton GE, Jordan MI, Nowlan SJ. (1991). Adaptive mixtures of local experts Neural Comput. 3

Jacobs RA, Jordan MI. (1994). Hierarchical mixtures of experts and the EM algorithm Neural Comput. 6

Komaki F. (1996). On asymptotic properties of predictive distributions Biometrika. 83

Marriott P. (2002). On the local geometry of mixture models Biometrika. 89

Matsuyama Y. (2003). The alpha-EM algorithm: Surrogate likelihood maximization using alpha-logarithmic information measures. IEEE Trans Inform Theory. 49

Mihoko M, Eguchi S. (2002). Robust blind source separation by beta divergence. Neural computation. 14 [PubMed]

Minka T. (2005). Divergence measures and message passing MSR-TR-2005-173.

Murata N, Takenouchi T, Kanamori T, Eguchi S. (2004). Information geometry of U-Boost and Bregman divergence. Neural computation. 16 [PubMed]

Petz D, Temesi R. (2005). Means of positive numbers and matrices SIAM J Matrix Analysis and Applications. 27

Rényi A. (1961). On measures of entropy and information. Proc 4th Berkeley Symp Math Stat Prob.

Saito H, Hashimoto N, Amari S, Hida E, Ohno H. (2006). Neural representation for perception of wide-field visual flow in MST: Bidirectional transparent motion and its illusory aftereffect. Manuscript submitted for publication.

Tsallis C. (1988). Possible generalization of Boltzmann-Gibbs statistics J Stat Phys. 52

Wolpert DM, Kawato M. (1998). Multiple paired forward and inverse models for motor control. Neural networks : the official journal of the International Neural Network Society. 11 [PubMed]

Xu L. (2004). Advances on BYY harmony learning: information theoretic perspective, generalized projection geometry, and independent factor autodetermination. IEEE transactions on neural networks. 15 [PubMed]

Zhang J. (2004). Divergence function, duality, and convex analysis. Neural computation. 16 [PubMed]

Zhu H, Rohwer R. (1998). Information geometry, Bayesian inference, ideal estimates, and error decomposition. Unpublished manuscript. Available online at http://www.santafe.edu/research/publications/wpabstract/199806045.

References and models that cite this paper