## Stochastic gradient boosting. (English) Zbl 1072.65502

Summary: Gradient boosting constructs additive regression models by sequentially fitting a simple parameterized function (base learner) to the current “pseudo”-residuals by least squares at each iteration. The pseudo-residuals are the gradient of the loss functional being minimized with respect to the model values at each training data point, evaluated at the current step. It is shown that both the approximation accuracy and the execution speed of gradient boosting can be substantially improved by incorporating randomization into the procedure. Specifically, at each iteration a subsample of the training data is drawn at random (without replacement) from the full training data set, and this randomly selected subsample is used in place of the full sample to fit the base learner and compute the model update for that iteration. The randomized approach also increases robustness against overcapacity of the base learner.
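The procedure summarized above can be sketched in a few lines. This is a minimal illustrative sketch, not Friedman's implementation: it assumes squared-error loss (where the pseudo-residuals are simply `y - F`), one-dimensional inputs, regression stumps as the base learner, and made-up function names (`fit_stump`, `stochastic_gradient_boost`) and parameter values.

```python
import random

def fit_stump(X, r):
    # Base learner: depth-1 regression tree (stump) fit to residuals r
    # by least squares over all candidate split points of 1-D inputs X.
    best = None
    for s in sorted(set(X)):
        left = [r[i] for i in range(len(X)) if X[i] <= s]
        right = [r[i] for i in range(len(X)) if X[i] > s]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = sum((r[i] - (lm if X[i] <= s else rm)) ** 2
                  for i in range(len(X)))
        if best is None or sse < best[0]:
            best = (sse, s, lm, rm)
    _, s, lm, rm = best
    return lambda x: lm if x <= s else rm

def stochastic_gradient_boost(X, y, n_rounds=50, subsample=0.5,
                              lr=0.1, seed=0):
    rng = random.Random(seed)
    F = [0.0] * len(X)          # current model values at the training points
    learners = []
    n_sub = max(2, int(subsample * len(X)))
    for _ in range(n_rounds):
        # Pseudo-residuals for squared-error loss: negative gradient = y - F.
        resid = [y[i] - F[i] for i in range(len(X))]
        # Draw a random subsample WITHOUT replacement and fit the base
        # learner to it in place of the full training sample.
        idx = rng.sample(range(len(X)), n_sub)
        h = fit_stump([X[i] for i in idx], [resid[i] for i in idx])
        learners.append(h)
        # Model update for this iteration (shrunk by the learning rate).
        F = [F[i] + lr * h(X[i]) for i in range(len(X))]
    return lambda x: sum(lr * h(x) for h in learners)
```

For example, fitting a step function such as `y = [0,0,0,0,1,1,1,1]` over `X = [0,...,7]` yields a model whose predictions are near 0 on the left and near 1 on the right, even though each stump only ever sees half of the training points.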

### MSC:

- 65C60 Computational problems in statistics (MSC2010)
- 62J99 Linear inference, regression


\textit{J. H. Friedman}, Comput. Stat. Data Anal. 38, No. 4, 367--378 (2002; Zbl 1072.65502)

