A semismooth Newton method for support vector classification and regression

Juan Yin, Qingna Li*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

17 Citations (Scopus)

Abstract

The support vector machine is an important and fundamental technique in machine learning. In this paper, we apply a semismooth Newton method to solve two typical SVM models: the L2-loss SVC model and the ϵ-L2-loss SVR model. The semismooth Newton method is widely used in the optimization community; a common view is that it enjoys a fast convergence rate but suffers from high computational complexity per iteration. Our contribution in this paper is that, by exploiting the sparse structure of the models, we significantly reduce the computational complexity while preserving the quadratic convergence rate. Extensive numerical experiments demonstrate the outstanding performance of the semismooth Newton method, especially on problems with very large sample sizes (on the news20.binary problem with 19,996 features and 1,355,191 samples, it takes only 3 s). In particular, for the ϵ-L2-loss SVR model, the semismooth Newton method significantly outperforms leading solvers, including DCD and TRON.
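To illustrate the kind of iteration the abstract describes, below is a minimal, hedged sketch of a semismooth Newton method for the primal L2-loss SVC problem min_w 0.5‖w‖² + C Σᵢ max(0, 1 − yᵢ xᵢᵀw)². The gradient is piecewise smooth, and a generalized Hessian I + 2C XᵢᵀXᵢ is formed from only the "active" samples whose loss is nonzero. This dense NumPy toy version omits the sparsity exploitation that is the paper's actual contribution; all names and parameters here are illustrative, not the authors' code.

```python
import numpy as np

def l2svm_semismooth_newton(X, y, C=1.0, tol=1e-6, max_iter=50):
    """Toy semismooth Newton iteration for the primal L2-loss SVC:

        min_w  0.5*||w||^2 + C * sum_i max(0, 1 - y_i * x_i^T w)^2

    X : (n, d) data matrix, y : (n,) labels in {-1, +1}.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(max_iter):
        margins = 1.0 - y * (X @ w)
        active = margins > 0                 # samples with nonzero loss
        XI = X[active]
        # gradient: w - 2C * sum_{i active} y_i * margin_i * x_i
        grad = w - 2.0 * C * XI.T @ (y[active] * margins[active])
        if np.linalg.norm(grad) < tol:
            break
        # generalized Hessian: I + 2C * X_I^T X_I (active rows only)
        H = np.eye(d) + 2.0 * C * XI.T @ XI
        w = w - np.linalg.solve(H, grad)     # full Newton step
    return w
```

Because the objective is piecewise quadratic, the full Newton step typically converges in a handful of iterations on well-conditioned data; the paper's analysis concerns how to keep each step cheap when n and d are huge.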

Original language: English
Pages (from-to): 477-508
Number of pages: 32
Journal: Computational Optimization and Applications
Volume: 73
Issue number: 2
DOIs
Publication status: Published - 1 Jun 2019

Keywords

  • Generalized Jacobian
  • Quadratic convergence
  • Semismooth Newton method
  • Support vector classification
  • Support vector regression
