TY - GEN
T1 - Totally-corrective boosting using continuous-valued weak learners
AU - Sun, Chensheng
AU - Zhao, Sanyuan
AU - Hu, Jiwei
AU - Lam, Kin Man
PY - 2012
Y1 - 2012
N2 - The boosting algorithm has two main variants: gradient boosting and totally-corrective column-generation boosting. Recently, the latter has received increasing attention, since it exhibits better convergence properties and thus yields more efficient strong learners. In this work, we point out that totally-corrective column-generation boosting is equivalent to the gradient-descent method for gradient boosting in its weak-learner selection criterion, but uses additional totally-corrective updates for the weak-learner weights. Therefore, other techniques for gradient boosting that produce continuous-valued weak learners, e.g. step-wise direct minimization and Newton's method, may also be used in combination with the totally-corrective procedure. In this work, we take the well-known AdaBoost algorithm as an example, and show that employing continuous-valued weak learners improves performance when used with the totally-corrective weak-learner weight update.
AB - The boosting algorithm has two main variants: gradient boosting and totally-corrective column-generation boosting. Recently, the latter has received increasing attention, since it exhibits better convergence properties and thus yields more efficient strong learners. In this work, we point out that totally-corrective column-generation boosting is equivalent to the gradient-descent method for gradient boosting in its weak-learner selection criterion, but uses additional totally-corrective updates for the weak-learner weights. Therefore, other techniques for gradient boosting that produce continuous-valued weak learners, e.g. step-wise direct minimization and Newton's method, may also be used in combination with the totally-corrective procedure. In this work, we take the well-known AdaBoost algorithm as an example, and show that employing continuous-valued weak learners improves performance when used with the totally-corrective weak-learner weight update.
KW - Boosting
KW - column generation
KW - gradient
KW - totally corrective
UR - https://www.scopus.com/pages/publications/84867615426
U2 - 10.1109/ICASSP.2012.6288312
DO - 10.1109/ICASSP.2012.6288312
M3 - Conference contribution
AN - SCOPUS:84867615426
SN - 9781467300469
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
SP - 2049
EP - 2052
BT - 2012 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2012 - Proceedings
T2 - 2012 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2012
Y2 - 25 March 2012 through 30 March 2012
ER -