
Variance reduction trends on ‘boosted’ classifiers. (English) Zbl 1213.62109

Summary: Ensemble classification techniques such as bagging [L. Breiman, Mach. Learn. 24, No. 2, 123–140 (1996; Zbl 0858.68080)], boosting [Y. Freund and R. E. Schapire, J. Comput. Syst. Sci. 55, No. 1, 119–139 (1997; Zbl 0880.68103)] and arcing algorithms [L. Breiman, Ann. Stat. 26, No. 3, 801–849 (1998; Zbl 0934.62064)] have received much attention in the recent literature. Such techniques have been shown to reduce classification error on unseen cases, and even when the ensemble is trained well beyond zero training-set error, its classification error on unseen cases continues to improve. Despite many studies and conjectures, explaining this improved performance and understanding the underlying probabilistic structures remain open and challenging problems. More recently, diagnostics such as the edge and the margin [R. E. Schapire and Y. Singer, Mach. Learn. 37, No. 3, 297–336 (1999; Zbl 0945.68194)] have been used to explain the improvements made when ensemble classifiers are built. This paper presents some interesting results from an empirical study performed on a set of representative data sets using the decision tree learner C4.5. An exponential-like decay in the variance of the edge is observed as the number of boosting trials is increased; i.e., boosting appears to ‘homogenise’ the edge. Some initial theory is presented which indicates that a lack of correlation between the errors of the individual classifiers is a key factor in this variance reduction.
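For concreteness: writing the base classifiers as $h_t(x) \in \{-1,+1\}$ with vote weights $\alpha_t$, the margin of a labelled example $(x,y)$ is the normalised weighted vote $\operatorname{margin}(x,y) = \sum_t \alpha_t y h_t(x) / \sum_t \alpha_t$, and the edge is its misclassification counterpart. The following is a minimal sketch, not the authors' experiment: it assumes scikit-learn's AdaBoostClassifier with shallow CART trees as a stand-in for boosted C4.5, a synthetic data set, and the discrete (SAMME) boosting variant in which estimator_weights_ holds the $\alpha_t$; it simply traces how the across-example variance of the margin behaves as boosting trials accumulate.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic two-class data as an illustrative stand-in for the paper's benchmarks.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
y_signed = 2 * y - 1  # recode labels to {-1, +1} for the margin formula

# Shallow CART trees stand in for C4.5; the discrete (SAMME) variant is assumed,
# so clf.estimator_weights_ holds the vote weights alpha_t.
clf = AdaBoostClassifier(
    DecisionTreeClassifier(max_depth=3), n_estimators=200, random_state=0
).fit(X, y)

# Accumulate the weighted vote sum_t alpha_t * h_t(x) one boosting trial at a
# time and track the across-example variance of the normalised margin.
vote, weight_sum = np.zeros(len(X)), 0.0
for T, (h, a) in enumerate(zip(clf.estimators_, clf.estimator_weights_), 1):
    vote += a * (2 * h.predict(X) - 1)  # h_t(x) in {-1, +1}
    weight_sum += a
    margins = y_signed * vote / weight_sum
    if T % 20 == 0:
        print(f"trials={T:3d}  var(margin)={margins.var():.4f}")
```

On typical runs the printed variance shrinks steadily as trials accumulate, mirroring in a toy setting the exponential-like decay of the edge variance that the paper reports.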

MSC:

62H30 Classification and discrimination; cluster analysis (statistical aspects)

Software:

C4.5