Shrestha DL, Solomatine DP. (2006). Experiments with AdaBoost.RT, an improved boosting scheme for regression. Neural computation. 18 [PubMed]
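
The paper's subject, AdaBoost.RT, adapts AdaBoost to regression by thresholding each example's absolute relative error at a user-chosen value phi: predictions within the threshold count as "correct", and their weights are shrunk so that later weak learners concentrate on the hard examples. The Python sketch below follows the formulation in Solomatine & Shrestha (2004), cited further down; the decision-tree weak learner, the parameter defaults, and the names adaboost_rt, phi, and n are illustrative assumptions, not the authors' code.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def adaboost_rt(X, y, T=20, phi=0.1, n=2):
        """Sketch of AdaBoost.RT. Assumes all targets y are nonzero,
        since the absolute relative error is undefined at y = 0
        (a known caveat of the scheme)."""
        m = len(y)
        D = np.full(m, 1.0 / m)                   # weight distribution over examples
        learners, log_inv_betas = [], []
        for _ in range(T):
            f = DecisionTreeRegressor(max_depth=3)
            f.fit(X, y, sample_weight=D)          # weak learner trained on D_t
            are = np.abs((f.predict(X) - y) / y)  # absolute relative error
            miss = are > phi                      # "incorrect" under threshold phi
            eps = D[miss].sum()                   # weighted error rate epsilon_t
            if eps <= 0.0 or eps >= 1.0:          # degenerate round: stop early
                break                             # (sketch does not handle an empty ensemble)
            beta = eps ** n                       # beta_t = epsilon_t^n
            D = np.where(miss, D, D * beta)       # shrink weights of correct examples
            D /= D.sum()                          # renormalize to a distribution
            learners.append(f)
            log_inv_betas.append(np.log(1.0 / beta))
        w = np.array(log_inv_betas)
        def predict(Xq):
            preds = np.column_stack([f.predict(Xq) for f in learners])
            return preds @ (w / w.sum())          # log(1/beta_t)-weighted combination
        return predict

Raising the error rate to the power n (the paper considers linear, square, and cubic laws) controls how sharply the weight distribution concentrates on poorly predicted examples.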

References and models cited by this paper

Avnimelech R, Intrator N. (1999). Boosting regression estimators. Neural computation. 11 [PubMed]

Blake CL, Merz CJ. (1998). UCI Repository of Machine Learning Databases.

Breiman L. (1996). Stacked regressions. Mach Learn. 24

Breiman L. (1996). Bagging predictors. Mach Learn. 24

Breiman L. (1996). Bias, variance, and arcing classifiers. Tech. Rep. 460, Statistics Department, University of California, Berkeley.

Breiman L. (1999). Prediction games and arcing algorithms. Neural computation. 11 [PubMed]

Cherkassky V, Ma Y. (2004). Comparison of loss functions for linear regression. Proc. of the International Joint Conference on Neural Networks.

Drucker H. (1997). Improving regressors using boosting techniques. Proc. of the 14th International Conference on Machine Learning.

Drucker H. (1999). Boosting using neural networks. Combining Artificial Neural Nets.

Duffy N, Helmbold DP. (2000). Leveraging for regression. Proc. of the 13th Annual Conference on Computational Learning Theory.

Efron B, Tibshirani R. (1993). An Introduction to the Bootstrap.

Feely R. (2000). Predicting stock market volatility using neural networks. Unpublished B.A. (Mod.) dissertation.

Witten IH, Frank E. (2000). Data mining: Practical machine learning tools and techniques with Java implementations.

Freund Y, Schapire R. (1996). Experiments with a new boosting algorithm. Proc. of the 13th International Conference on Machine Learning.

Freund Y, Schapire R. (1997). A decision-theoretic generalization of on-line learning and an application to boosting. J Comput Sys Sci. 55

Friedman J. (2001). Greedy function approximation: A gradient boosting machine. Ann Stat. 29

Friedman JH. (1991). Multivariate adaptive regression splines. Ann Stat. 19

Kearns M, Vazirani UV. (1994). An Introduction to Computational Learning Theory.

Mac Namee B, Cunningham P, Byrne S, Corrigan OI. (2000). The problem of bias in training data in regression problems in medical decision support. Tech. Rep. TCD-CS-2000-58, Trinity College Dublin. Available online at http://www.cs.tcd.ie/Padraig.Cunningham/online-pubs.html.

Opitz D, Maclin R. (1999). Popular ensemble methods: An empirical study. J Artif Intell Res. 11

Quinlan JR. (1992). Learning with continuous classes. Proc. of the 5th Australian Joint Conference on AI.

Quinlan JR. (1996). Bagging, boosting, and C4.5. Proc. of the 13th National Conference on Artificial Intelligence.

Ridgeway G. (1999). The state of boosting. Computing Science and Statistics. 31

Ridgeway G, Madigan D, Richardson T. (1999). Boosting methodology for regression problems. Proc. of the 7th International Workshop on Artificial Intelligence and Statistics.

Schapire RE. (1990). The strength of weak learnability. Mach Learn. 5

Solomatine DP, Dulal KN. (2003). Model trees as an alternative to neural networks in rainfall-runoff modelling. Hydrol Sci J. 48

Solomatine DP, Shrestha DL. (2004). AdaBoost.RT: A boosting algorithm for regression problems. Proc. of the International Joint Conference on Neural Networks.

Friedman J, Hastie T, Tibshirani R. (2000). Additive logistic regression: A statistical view of boosting. Ann Stat. 28

Tresp V. (2001). Committee machines. Handbook of Neural Network Signal Processing.

Valiant LG. (1984). A theory of the learnable. Communications of the ACM. 27

Vapnik V. (1995). The Nature of Statistical Learning Theory.

Weigend AS, Gershenfeld NA. (1993). Time series prediction: Forecasting the future and understanding the past. Proceedings of the NATO Advanced Research Workshop on Comparative Time Series Analysis.

Zemel R, Pitassi T. (2001). A gradient-based boosting algorithm for regression problems. Advances in neural information processing systems. 13

References and models that cite this paper