A majorization penalty method for SVM with sparse constraint

Sitong Lu, Qingna Li*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Support vector machine (SVM) is an important and fundamental technique in machine learning. Soft-margin SVM models have stronger generalization performance than the hard-margin SVM. Most existing works use the hinge-loss function, which can be regarded as an upper bound of the 0–1 loss function; however, the hinge loss cannot explicitly control the number of misclassified samples. In this paper, we adopt the idea of the soft-margin SVM and propose a new SVM model with a sparse constraint. By expressing the soft-margin constraint as a sparse constraint, the model strictly limits the number of misclassified samples. By constructing a majorization function, we develop a majorization penalty method for the resulting sparse-constrained optimization problem, and we apply the conjugate gradient (CG) method to solve each subproblem. Extensive numerical results demonstrate the impressive performance of the proposed majorization penalty method.
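To make the model concrete, the following is a plausible formulation of the sparse-constrained SVM the abstract describes, written in standard soft-margin notation; the symbols (w for the normal vector, b for the bias, ξ for the slack variables, s for the sparsity level) are assumptions on our part, and the exact formulation in the paper may differ.

```latex
\begin{align*}
\min_{w,\,b,\,\xi}\quad & \tfrac{1}{2}\lVert w\rVert^2 \\
\text{s.t.}\quad & y_i\bigl(w^\top x_i + b\bigr) \ge 1 - \xi_i,\quad \xi_i \ge 0,\quad i = 1,\dots,n, \\
& \lVert \xi \rVert_0 \le s,
\end{align*}
```

Here \(\lVert \xi \rVert_0\) counts the nonzero slacks, so the constraint caps the number of training points allowed to violate the margin at s. This is the sense in which such a model can "strictly limit the number of misclassified samples," in contrast to the hinge loss, which only penalizes violations in aggregate.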
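As an illustration of the algorithmic pipeline the abstract outlines (a majorization-penalty outer loop with a conjugate-gradient inner solver), here is a minimal, hypothetical sketch in Python. It is not the authors' implementation: the penalty surrogate, the choice of which slacks to penalize, and all names and parameters (`majorization_penalty_svm`, `rho`, `s`, the use of SciPy's nonlinear CG) are assumptions chosen to mirror the abstract's description.

```python
import numpy as np
from scipy.optimize import minimize

def majorization_penalty_svm(X, y, s, rho=1.0, outer_iters=20, rho_growth=2.0):
    """Hypothetical majorization-penalty loop for a sparsity-constrained SVM.

    At each outer iteration, the s largest margin violations are left
    unpenalized (they occupy the sparse "budget"); the remaining slacks
    are driven toward zero by a smooth squared-hinge penalty, and the
    resulting smooth subproblem is solved with a conjugate-gradient method.
    """
    n, d = X.shape
    v = np.zeros(d + 1)  # stacked variables v = (w, b)

    def slacks(v):
        w, b = v[:d], v[d]
        return np.maximum(0.0, 1.0 - y * (X @ w + b))

    for _ in range(outer_iters):
        xi = slacks(v)
        # Indices of the n - s smallest slacks: these must be pushed to zero
        # so that at most s slacks stay nonzero.
        penalized = np.argsort(xi)[: max(n - s, 0)]

        def fun(v):
            w = v[:d]
            xi = slacks(v)
            return 0.5 * w @ w + rho * np.sum(xi[penalized] ** 2)

        def grad(v):
            w = v[:d]
            xi = slacks(v)
            g = np.zeros(d + 1)
            g[:d] = w
            active = penalized[xi[penalized] > 0]   # violated constraints only
            coef = -2.0 * rho * xi[active] * y[active]
            g[:d] += X[active].T @ coef
            g[d] += coef.sum()
            return g

        # Smooth convex subproblem solved by (nonlinear) conjugate gradient.
        v = minimize(fun, v, jac=grad, method="CG").x
        rho *= rho_growth  # tighten the penalty between outer iterations
    return v[:d], v[d]
```

A call like `w, b = majorization_penalty_svm(X, y, s=10)` would then allow at most around 10 samples to retain nonzero slack, echoing the hard cap on misclassified samples that the sparse constraint encodes.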

Original language: English
Pages (from-to): 474-494
Number of pages: 21
Journal: Optimization Methods and Software
Volume: 38
Issue number: 3
DOIs
Publication status: Published - 2023

Keywords

  • Support vector machine
  • conjugate gradient method
  • majorization penalty method
  • sparse constraint

