Precise statements of convergence for AdaBoost and arc-gv. (English) Zbl 1141.68722
Verducci, Joseph Stephen (ed.) et al., Prediction and discovery. AMS-IMS-SIAM joint summer research conference on machine and statistical learning: prediction and discovery, Snowbird, UT, USA, June 25–29, 2006. Contemporary Mathematics 443 (ISBN 978-0-8218-4195-2/pbk). 131-145 (2007).
Summary: We present two main results, the first concerning Freund and Schapire’s AdaBoost algorithm, and the second concerning Breiman’s arc-gv algorithm. Our discussion of AdaBoost revolves around the case of “bounded edges”, in which AdaBoost’s convergence properties can be completely understood. Specifically, our first main result is that if AdaBoost’s “edge” values fall into a small interval, a corresponding interval can be found for the asymptotic margin. A special case of this result closes an open theoretical question of Rätsch and Warmuth. Our main result concerning arc-gv is a convergence rate to a maximum margin solution. Both of these results are derived from an important tool called the “smooth margin”, which is a differentiable approximation of the true margin for boosting algorithms.
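For orientation, below is a minimal illustrative sketch (not taken from the paper) of the quantities the summary refers to, under the standard boosting conventions: a matrix M with M_{ij} = y_i h_j(x_i), a nonnegative coefficient vector λ over weak classifiers, the true (minimum) margin min_i (Mλ)_i / ‖λ‖_1, and a smooth margin of the form g(λ) = −ln(Σ_i exp(−(Mλ)_i)) / Σ_j λ_j, which lower-bounds the true margin. The step-size formulas shown are the textbook AdaBoost and arc-gv updates; the function names and setup are assumptions for illustration, not code from the paper.

# Minimal sketch (assumptions as stated above, not the authors' code).
import numpy as np

def true_margin(M, lam):
    # Normalized minimum margin: min_i (M lam)_i / ||lam||_1, with lam >= 0.
    return (M @ lam).min() / lam.sum()

def smooth_margin(M, lam):
    # Smooth margin g(lam) = -ln(sum_i exp(-(M lam)_i)) / ||lam||_1,
    # a differentiable lower bound on true_margin(M, lam).
    s = -(M @ lam)
    m = s.max()
    log_sum_exp = m + np.log(np.exp(s - m).sum())  # numerically stable log-sum-exp
    return -log_sum_exp / lam.sum()

def step_sizes(edge, margin):
    # 'edge' is r = sum_i d_i y_i h(x_i) for the chosen weak classifier under
    # the current example weights d. AdaBoost's step is (1/2) ln((1+r)/(1-r));
    # Breiman's arc-gv subtracts the analogous term for the current margin.
    ada = 0.5 * np.log((1.0 + edge) / (1.0 - edge))
    arc_gv = ada - 0.5 * np.log((1.0 + margin) / (1.0 - margin))
    return ada, arc_gv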
For the entire collection see [Zbl 1123.62002].

MSC:
68W40 Analysis of algorithms
68Q25 Analysis of algorithms and problem complexity
68Q32 Computational learning theory