swMATH ID: 33192
Software Authors: Fan, W.; Stolfo, S. J.; Zhang, J.; Chan, P. K.
Description: AdaCost: Misclassification cost-sensitive boosting. AdaCost, a variant of AdaBoost, is a misclassification cost-sensitive boosting method. It uses the cost of misclassifications to update the training distribution on successive boosting rounds, with the aim of reducing the cumulative misclassification cost further than AdaBoost does. We formally show that AdaCost reduces the upper bound on the cumulative misclassification cost of the training set. Empirical evaluations have shown a significant reduction in cumulative misclassification cost over AdaBoost without consuming additional computing power.
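The cost-sensitive update described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the decision-stump weak learner, the toy hyperparameters, and the linear cost-adjustment functions (β₊ = −0.5c + 0.5 for correctly classified examples, β₋ = 0.5c + 0.5 for misclassified ones, one choice suggested in the paper) are assumptions made here for concreteness.

```python
import numpy as np

def stump_predict(X, feat, thresh, polarity):
    # A decision stump: predict +1/-1 by thresholding one feature.
    return polarity * np.where(X[:, feat] <= thresh, 1.0, -1.0)

def train_stump(X, y, D):
    # Exhaustive search for the stump minimizing weighted error under D.
    best, best_err = None, np.inf
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for polarity in (1.0, -1.0):
                pred = stump_predict(X, feat, thresh, polarity)
                err = D[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (feat, thresh, polarity)
    return best

def adacost_fit(X, y, costs, rounds=10):
    # costs[i] in [0, 1] is the relative misclassification cost of example i.
    n = len(y)
    D = np.full(n, 1.0 / n)  # initial training distribution
    ensemble = []
    for _ in range(rounds):
        stump = train_stump(X, y, D)
        pred = stump_predict(X, *stump)
        correct = pred == y
        # Cost-adjustment function beta: misclassified high-cost examples
        # get a larger multiplier, correct ones a smaller one.
        beta = np.where(correct, -0.5 * costs + 0.5, 0.5 * costs + 0.5)
        # r replaces AdaBoost's margin; alpha is the round's vote weight.
        r = np.clip(np.sum(D * y * pred * beta), -0.999, 0.999)
        alpha = 0.5 * np.log((1 + r) / (1 - r))
        ensemble.append((alpha, stump))
        # Cost-sensitive distribution update, then renormalize.
        D = D * np.exp(-alpha * y * pred * beta)
        D /= D.sum()
    return ensemble

def adacost_predict(ensemble, X):
    # Weighted vote of the stumps.
    score = sum(a * stump_predict(X, *s) for a, s in ensemble)
    return np.sign(score)
```

The effect of β is that a high-cost example's weight increases more steeply when it is misclassified and decreases less when it is classified correctly, so successive rounds concentrate on expensive examples; with uniform costs the update reduces to an AdaBoost-style scheme.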
Homepage: https://pdfs.semanticscholar.org/9ddf/bc2cc5c1b13b80a1a487b9caa57e80edd863.pdf
Source Code: https://github.com/joelprince25/adacost
Related Software: SMOTE; AdaBoost.MH; UCI-ml; SMOTEBoost; C4.5; ADASYN; R; WEKA; MWMOTE; LIBLINEAR; BoosTexter; XGBoost; AutoAugment; mixup; CSMES; STRING; iPPI-Esml; MICE; yaImpute; NetKit
Cited in: 23 Publications
