Abstract
The support vector machine (SVM) is a fundamental technique in machine learning. Soft-margin SVM models offer stronger generalization performance than the hard-margin SVM. Most existing works use the hinge loss, which can be regarded as an upper bound of the 0–1 loss but cannot explicitly control the number of misclassified samples. In this paper, we build on the soft-margin idea and propose a new SVM model with a sparse constraint. Our model strictly limits the number of misclassified samples by expressing the soft-margin constraint as a sparse constraint. By constructing a majorization function, a majorization penalty method can be used to solve the resulting sparse-constrained optimization problem, and we apply the conjugate gradient (CG) method to solve the subproblem at each step. Extensive numerical results demonstrate the strong performance of the proposed majorization penalty method.
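The abstract's key distinction, that the hinge loss only upper-bounds the 0–1 loss while a sparse (cardinality) constraint caps misclassifications directly, can be illustrated with a minimal sketch. This is not the paper's formulation or solver; the function names and the toy data are illustrative assumptions.

```python
# Illustrative sketch (not the paper's model): contrast the per-sample
# hinge loss, which upper-bounds the 0-1 loss, with an explicit sparse
# constraint that caps the number of margin-violating samples.

def hinge_losses(w, b, X, y):
    """Per-sample hinge loss max(0, 1 - y_i * (w . x_i + b))."""
    return [max(0.0, 1.0 - yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b))
            for xi, yi in zip(X, y)]

def zero_one_count(w, b, X, y):
    """Number of margin violations: the l0 'norm' of the hinge-loss vector."""
    return sum(1 for loss in hinge_losses(w, b, X, y) if loss > 0.0)

def satisfies_sparse_constraint(w, b, X, y, s):
    """Sparse (cardinality) constraint: at most s margin violations."""
    return zero_one_count(w, b, X, y) <= s

# Toy 1-D data: three well-separated points and one margin violator.
X = [[2.0], [1.5], [-2.0], [0.1]]
y = [1, 1, -1, -1]
w, b = [1.0], 0.0

print(hinge_losses(w, b, X, y))           # [0.0, 0.0, 0.0, 1.1]
print(zero_one_count(w, b, X, y))         # 1
print(satisfies_sparse_constraint(w, b, X, y, 1))  # True
```

Minimizing the summed hinge losses could still trade one large violation for several small ones, whereas the constraint `zero_one_count(...) <= s` bounds the violation count outright, which is the behavior the sparse-constrained model enforces.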
| Original language | English |
|---|---|
| Pages (from-to) | 474-494 |
| Number of pages | 21 |
| Journal | Optimization Methods and Software |
| Volume | 38 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 2023 |
Keywords
- Support vector machine
- conjugate gradient method
- majorization penalty method
- sparse constraint