A majorization penalty method for SVM with sparse constraint

Sitong Lu, Qingna Li*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Support vector machine (SVM) is an important and fundamental technique in machine learning. Soft-margin SVM models have stronger generalization performance than the hard-margin SVM. Most existing works use the hinge-loss function, which can be regarded as an upper bound of the 0–1 loss function; however, it cannot explicitly control the number of misclassified samples. In this paper, we build on the idea of soft-margin SVM and propose a new SVM model with a sparse constraint. By expressing the soft-margin constraint as a sparse constraint, our model can strictly limit the number of misclassified samples. By constructing a majorization function, a majorization penalty method can be used to solve the resulting sparse-constrained optimization problem. We apply the Conjugate-Gradient (CG) method to solve the subproblem at each iteration. Extensive numerical results demonstrate the impressive performance of the proposed majorization penalty method.
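The abstract contrasts the hinge loss (which only upper-bounds the 0–1 loss) with a sparse constraint that directly counts margin violations. The following minimal sketch illustrates that distinction on toy data; it is not the paper's algorithm, and the data, weights, and the budget `s` are made up for illustration.

```python
# Illustrative sketch only: contrasts the hinge-loss objective with a
# sparsity-style count of margin violations, as motivated by the abstract.

def hinge_losses(w, b, X, y):
    """Per-sample hinge losses max(0, 1 - y_i (w^T x_i + b))."""
    return [max(0.0, 1.0 - yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b))
            for xi, yi in zip(X, y)]

# Toy 2-D data and a toy linear classifier (all values hypothetical).
X = [(2.0, 1.0), (1.5, -0.5), (-1.0, -2.0), (0.2, 0.1)]
y = [1, 1, -1, -1]
w, b = (1.0, 1.0), -0.5

losses = hinge_losses(w, b, X, y)
total_hinge = sum(losses)                   # hinge objective: an upper bound on the 0-1 count
n_violations = sum(l > 0 for l in losses)   # number of nonzero slacks (margin violations)

# A sparse constraint in the spirit of the paper would require
# n_violations <= s for a chosen budget s, rather than merely
# penalizing total_hinge.
s = 2
print(total_hinge, n_violations, n_violations <= s)  # → 1.3 2 True
```

Note that minimizing `total_hinge` can trade one large violation against several small ones, whereas the count `n_violations` is what the sparse constraint bounds directly.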

Original language: English
Pages (from-to): 474-494
Number of pages: 21
Journal: Optimization Methods and Software
Volume: 38
Issue number: 3
DOI
Publication status: Published - 2023

