TY - JOUR
T1 - An efficient augmented Lagrangian method for support vector machine
AU - Yan, Yinqiao
AU - Li, Qingna
N1 - Publisher Copyright:
© 2020 Informa UK Limited, trading as Taylor & Francis Group.
PY - 2020/7/3
Y1 - 2020/7/3
AB - The support vector machine (SVM) has proved to be a successful approach to machine learning. Two typical SVM models are the L1-loss model for support vector classification (SVC) and the ε-L1-loss model for support vector regression (SVR). Due to the non-smoothness of the L1-loss function in the two models, most traditional approaches focus on solving the dual problem. In this paper, we propose an augmented Lagrangian method for the L1-loss model that is designed to solve the primal problem. By tackling the non-smooth term in the model with Moreau–Yosida regularization and the proximal operator, the subproblem in the augmented Lagrangian method reduces to a non-smooth linear system, which can be solved via the quadratically convergent semismooth Newton method. Moreover, the high computational cost of the semismooth Newton method can be significantly reduced by exploiting the sparse structure of the generalized Jacobian. Numerical results on various datasets from LIBLINEAR show that the proposed method is competitive with the most popular solvers in both speed and accuracy.
KW - Support vector machine
KW - augmented Lagrangian method
KW - generalized Jacobian
KW - semismooth Newton's method
UR - http://www.scopus.com/inward/record.url?scp=85080918631&partnerID=8YFLogxK
DO - 10.1080/10556788.2020.1734002
M3 - Article
AN - SCOPUS:85080918631
SN - 1055-6788
VL - 35
SP - 855
EP - 883
JO - Optimization Methods and Software
JF - Optimization Methods and Software
IS - 4
ER -