Nakajima S, Watanabe S. (2007). Variational Bayes solution of linear neural networks and its generalization performance. Neural computation. 19 [PubMed]


References and models cited by this paper

Akaike H. (1974). A new look at the statistical model identification. IEEE Trans Automat Contr. 19

Akaike H. (1980). Likelihood and the Bayes procedure. Bayesian Statistics.

Amari S, Park H, Ozeki T. (2006). Singularities affect dynamics of learning in neuromanifolds. Neural computation. 18 [PubMed]

Attias H. (1999). Inferring parameters and structure of latent variable models by variational Bayes. Proceedings of the 15th Conference on Uncertainty in Artificial Intelligence.

Baldi PF, Hornik K. (1995). Learning in linear neural networks: a survey. IEEE transactions on neural networks. 6 [PubMed]

Bickel P, Chernoff H. (1993). Asymptotic distribution of the likelihood ratio statistic in a prototypical non-regular problem. Statistics and Probability: A Raghu Raj Bahadur Festschrift.

Cramer H. (1951). Mathematical methods of statistics.

Dacunha-Castelle D, Gassiat E. (1997). Testing in locally conic models, and application to mixture models. Probability and Statistics. 1

Dempster AP, Laird NM, Rubin DB. (1977). Maximum likelihood from incomplete data via the EM algorithm. J R Stat Soc B. 39

Efron B, Morris C. (1973). Stein's estimation rule and its competitors: an empirical Bayes approach. J Am Stat Assoc. 68

Fukumizu K. (1999). Generalization error of linear neural networks in unidentifiable cases. Algorithmic Learning Theory: Proceedings of the 10th International Conference on Algorithmic Learning Theory (ALT'99).

Fukumizu K. (2003). Likelihood ratio of unidentifiable models and multilayer neural networks. Ann Stat. 31

Geiger D, Rusakov D. (2002). Asymptotic model selection for naive Bayesian networks. Proc Conf Uncertainty Artif Intell.

Ghahramani Z, Beal MJ. (2001). Graphical models and variational methods. Advanced Mean Field Methods.

Hagiwara K. (2002). On the problem in model selection of neural network regression in overrealizable scenario. Neural computation. 14 [PubMed]

Hartigan JA. (1985). A failure of likelihood asymptotics for normal mixtures. Proc Berkeley Conf in Honor of J Neyman and J Kiefer. 2

Hinton GE, van Camp D. (1993). Keeping neural networks simple by minimizing the description length of the weights. Proc Conf Computational Learning Theory.

James W, Stein C. (1961). Estimation with quadratic loss. Proc 4th Berkeley Symposium on Mathematical Statistics and Probability. 1

Jordan MI, Jaakkola TS. (2000). Bayesian parameter estimation via variational methods. Statistics and Computing. 10

MacKay DJC. (1992). Bayesian interpolation. Neural Comput. 4

MacKay DJC. (1995). Developments in probabilistic modelling with neural networks: ensemble learning. Proc 3rd Ann Symp Neural Networks.

Reinsel GC, Velu RP. (1998). Multivariate reduced-rank regression.

Rissanen J. (1986). Stochastic complexity and modeling. Ann Stat. 14

Sato M. (2001). Online model selection based on the variational Bayes. Neural Comput. 13

Schwarz G. (1978). Estimating the dimension of a model. Ann Stat. 6

Stein C. (1956). Inadmissibility of the usual estimator for the mean of a multivariate normal distribution. Proceedings of the 3rd Berkeley Symposium on Mathematical Statistics and Probability. 1

Takemura A, Kuriki S. (1997). Weights of chi-bar-square distribution for smooth or piecewise smooth cone alternatives. Ann Stat. 25

Takemura A, Kuriki S. (2001). Tail probabilities of the maxima of multilinear forms and their applications. Ann Stat. 29

Tishby N, Solla SA, Levin E. (1990). A statistical approach to learning and generalization in layered neural networks. Proc IEEE. 78

Wang B, Titterington DM. (2004). Convergence and asymptotic normality of variational Bayesian approximations for exponential family models with missing values. Proc Conf Uncertainty Artif Intell.

Watanabe S. (1995). A generalized Bayesian framework for neural networks with singular Fisher information matrices. Proc Intl Symposium Nonlinear Theory and Its Applications. 2

Watanabe S. (2001). Algebraic analysis for nonidentifiable learning machines. Neural computation. 13 [PubMed]

Watanabe S. (2001). Algebraic information geometry for learning machines with singularities. Advances in Neural Information Processing Systems. 13

Watanabe S, Amari S. (2003). Learning coefficients of layered models when the true distribution mismatches the singularities. Neural Comput. 15

Watanabe S, Aoyagi M. (2004). Stochastic complexities of reduced rank regression in Bayesian estimation. Neural Networks. 18

Watanabe S, Nagata K, Yamazaki K. (2005). A new method of model selection based on learning coefficient. Proc Intl Symposium Nonlinear Theory and Its Applications.

Watanabe S, Nakajima S. (2005). Generalization error of linear neural networks in an empirical Bayes approach. Proc IJCAI.

Watanabe S, Nakajima S. (2006). Generalization performance of subspace Bayes approach in linear neural networks. IEICE Trans. 89

Watanabe S, Nakano N. (2005). Stochastic complexity of layered neural networks in mean field approximation. Proc ICONIP.

Watanabe S, Watanabe K. (2006). Stochastic complexities of Gaussian mixtures in variational Bayesian approximation. J Mach Learn Res. 7

Watanabe S, Watanabe K, Hosino T. (2005). Stochastic complexity of variational Bayesian hidden Markov models. Proc Intl Joint Conf Neural Networks.

Watanabe S, Yamazaki K. (2002). Resolution of singularities in mixture models and its stochastic complexity. Proc 9th Intl Conf Neural Information Process.

Watanabe S, Yamazaki K. (2003). Stochastic complexities of hidden Markov models. Proc Neural Networks Signal Process.

Watanabe S, Yamazaki K. (2003). Stochastic complexity of Bayesian networks. Proc 19th Conf Uncertainty Artif Intell.

Wachter KW. (1978). The strong limits of random matrix spectra for sample matrices of independent elements. Ann Probab. 6
