Abstract
Boosting is a widely used ensemble meta-algorithm owing to its excellent performance in combining weak learners into a strong learner. However, vanilla boosting methods have been shown to be sensitive to noise because they place no restriction on samples that are consistently misclassified across iterations. In this work, we present a new way to overcome this sensitivity by incorporating a self-sampling learning framework, which selects reliable samples and smooths the training process using a sample-reliability measure designed for the boosting procedure. Experimental results on synthetic data and several real-world datasets show that the self-sampling regime automatically selects an appropriate training subset under different noise conditions, and that the robust boosting algorithms we propose outperform state-of-the-art methods.
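To make the idea concrete, below is a minimal sketch of how a self-sampling step might be grafted onto a standard AdaBoost loop; it is not the paper's exact algorithm. The reliability measure here (the running ensemble's exponential loss) is a stand-in, since the abstract does not specify the paper's actual measure, and the `keep_frac` parameter and function names are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def self_sampling_boost(X, y, n_rounds=50, keep_frac=0.8):
    """AdaBoost-style loop that, in each round, trains the weak learner
    only on the 'reliable' subset: samples whose current ensemble loss
    is below a quantile threshold (a self-paced stand-in measure)."""
    n = len(y)                      # labels y assumed to be in {-1, +1}
    w = np.full(n, 1.0 / n)         # boosting sample weights
    F = np.zeros(n)                 # running ensemble scores
    learners, alphas = [], []
    for _ in range(n_rounds):
        # Reliability stand-in: small exponential loss exp(-y * F)
        loss = np.exp(-y * F)
        keep = loss <= np.quantile(loss, keep_frac)   # self-sampled subset
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X[keep], y[keep], sample_weight=w[keep])
        pred = stump.predict(X)
        # Weighted error and learner coefficient, as in standard AdaBoost
        err = np.clip(w[pred != y].sum() / w.sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        F += alpha * pred
        w *= np.exp(-alpha * y * pred)                # reweight all samples
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def predict(learners, alphas, X):
    """Sign of the alpha-weighted vote over the weak learners."""
    scores = sum(a * h.predict(X) for a, h in zip(alphas, learners))
    return np.sign(scores)
```

In this sketch, `keep_frac` trades robustness against how much of the data each round sees: a lower value drops more of the hardest (and, under label noise, most likely mislabeled) samples from each weak learner's training set.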
| Original language | English |
|---|---|
| Article number | 105424 |
| Journal | Knowledge-Based Systems |
| Volume | 193 |
| DOIs | |
| Publication status | Published - 6 Apr 2020 |
Keywords
- Boosting
- Loss function
- Robustness
- Self-sampling