Takenouchi T, Eguchi S. (2004). Robustifying AdaBoost by adding the naive error rate. Neural computation. 16 [PubMed]


References and models cited by this paper

Baxter L. (1999). Boosting algorithms as gradient descent in function space. Advances in neural information processing systems. 11

Copas J. (1988). Binary regression models for contaminated data. J Royal Statist Soc B. 50

Csiszar I. (1984). Sanov property, generalized I-projection and a conditional limit theorem. Ann Probability. 12

Eguchi S, Copas J. (2001). Recent developments in discriminant analysis from an information geometric point of view. J Korean Statist Soc. 30

Eguchi S, Copas J. (2002). A class of logistic type discriminant functions. Biometrika. 89

Eguchi S, Murata N, Takenouchi T, Kanamori T. (2002). Information geometry of U-Boost and Bregman divergence. (ISM research memorandum 860).

Freund Y, Schapire R. (1997). A decision-theoretic generalization of on-line learning and an application to boosting. J Comput Sys Sci. 55

Hampel FR, Rousseeuw PJ, Ronchetti EM, Stahel WA. (1986). Robust statistics: The approach based on influence functions.

Huber P. (1981). Robust statistics.

Lafferty J, Lebanon G. (2002). Boosting and maximum likelihood for exponential models. Advances in neural information processing systems. 14

McLachlan G. (1992). Discriminant analysis and statistical pattern recognition.

Muller KR, Ratsch G, Onoda T. (2001). Soft margins for AdaBoost. Mach Learn. 42

Schapire R. (1999). Theoretical view of boosting. Proceedings of the 4th European Conference on Computational Learning Theory.

Schapire RE. (1990). The strength of weak learnability Machine Learning. 5

Tibshirani R, Hastie T, Friedman J. (2000). Additive logistic regression: A statistical view of boosting. Ann Stat. 28

Tibshirani R, Hastie T, Friedman J. (2001). The elements of statistical learning.

Watanabe O, Domingo C. (2000). MadaBoost: A modification of AdaBoost. Proceedings of the 13th Conference on Computational Learning Theory.

References and models that cite this paper

Kanamori T, Takenouchi T, Eguchi S, Murata N. (2007). Robust loss functions for boosting. Neural computation. 19 [PubMed]

Murata N, Takenouchi T, Kanamori T, Eguchi S. (2004). Information geometry of U-Boost and Bregman divergence. Neural computation. 16 [PubMed]
