Using iterated bagging to debias regressions. (English) Zbl 1052.68109

Summary: The author [ibid. 24, 123–140 (1996; Zbl 0858.68080)] showed that bagging could effectively reduce the variance of regression predictors while leaving the bias relatively unchanged. A new form of bagging, which we call iterated bagging, is effective in reducing both bias and variance. The procedure works in stages: the first stage is bagging. Based on the outcomes of the first stage, the output values are altered, and a second stage of bagging is carried out using the altered output values. This is repeated until a simple rule stops the process. The method is tested using both trees and nearest neighbor regression methods. Accuracy on the Boston Housing data benchmark is comparable to the best of the results obtained using highly tuned and compute-intensive Support Vector Regression Machines. Some heuristic theory is given to clarify what is going on. Application to two-class classification data gives interesting results.
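The staged procedure described above can be sketched in code. The following is a minimal illustration, not Breiman's actual algorithm: it uses bagged 1-nearest-neighbor base predictors, alters the outputs at each stage by replacing them with in-sample residuals, and runs a fixed number of stages instead of the paper's stopping rule (Breiman's version uses out-of-bag estimates to form the altered outputs). All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def bagged_1nn(X_train, y_train, n_bags, rng):
    """Stage predictor: average of 1-nearest-neighbor regressors,
    each fit on a bootstrap resample of the training set (bagging)."""
    n = len(X_train)
    samples = []
    for _ in range(n_bags):
        idx = rng.integers(0, n, n)          # bootstrap sample with replacement
        samples.append((X_train[idx], y_train[idx]))
    def predict(X_eval):
        preds = []
        for Xb, yb in samples:
            # 1-NN lookup on 1-d inputs: nearest bootstrap point's output
            d = np.abs(X_eval[:, None] - Xb[None, :])
            preds.append(yb[d.argmin(axis=1)])
        return np.mean(preds, axis=0)        # bagged average
    return predict

def iterated_bagging(X, y, n_stages=3, n_bags=25, seed=0):
    """Iterated bagging sketch: bag, replace outputs by residuals,
    bag again; the final predictor sums the stage predictors.
    (Fixed n_stages stands in for the paper's stopping rule.)"""
    rng = np.random.default_rng(seed)
    residual = y.astype(float).copy()
    stages = []
    for _ in range(n_stages):
        pred = bagged_1nn(X, residual, n_bags, rng)
        stages.append(pred)
        residual = residual - pred(X)        # altered output values
    def predict(X_eval):
        return sum(p(X_eval) for p in stages)
    return predict
```

Each stage fits the part of the signal the previous stages missed, which is how the scheme attacks bias as well as variance.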


68T05 Learning and adaptive systems in artificial intelligence


Zbl 0858.68080
Full Text: DOI