TY - JOUR
T1 - A fast smoothing Newton method for bilevel hyperparameter optimization for SVC with logistic loss
AU - Wang, Yixin
AU - Li, Qingna
N1 - Publisher Copyright:
© 2024 Informa UK Limited, trading as Taylor & Francis Group.
PY - 2024
Y1 - 2024
N2 - Support vector classification (SVC) with logistic loss has excellent theoretical properties in classification problems where the label values are not continuous. In this paper, we reformulate hyperparameter selection for SVC with logistic loss as a bilevel optimization problem in which both the upper-level and the lower-level problems are based on logistic loss. The resulting bilevel optimization model is converted to a single-level nonlinear programming (NLP) problem based on the KKT conditions of the lower-level problem. This NLP contains a set of nonlinear equality constraints and a simple lower-bound constraint. The second-order sufficient condition is characterized, which guarantees that strict local optimizers are obtained. To solve this NLP, we apply the smoothing Newton method proposed in [Liang L, Sun D, Toh KC. A squared smoothing Newton method for semidefinite programming, 2023] to the KKT conditions, which contain one pair of complementarity constraints. We show that the smoothing Newton method has a superlinear convergence rate. Extensive numerical results verify the efficiency of the proposed approach and confirm that strict local minimizers are achieved both numerically and theoretically. In particular, our algorithm achieves competitive results while consuming less time than other methods.
AB - Support vector classification (SVC) with logistic loss has excellent theoretical properties in classification problems where the label values are not continuous. In this paper, we reformulate hyperparameter selection for SVC with logistic loss as a bilevel optimization problem in which both the upper-level and the lower-level problems are based on logistic loss. The resulting bilevel optimization model is converted to a single-level nonlinear programming (NLP) problem based on the KKT conditions of the lower-level problem. This NLP contains a set of nonlinear equality constraints and a simple lower-bound constraint. The second-order sufficient condition is characterized, which guarantees that strict local optimizers are obtained. To solve this NLP, we apply the smoothing Newton method proposed in [Liang L, Sun D, Toh KC. A squared smoothing Newton method for semidefinite programming, 2023] to the KKT conditions, which contain one pair of complementarity constraints. We show that the smoothing Newton method has a superlinear convergence rate. Extensive numerical results verify the efficiency of the proposed approach and confirm that strict local minimizers are achieved both numerically and theoretically. In particular, our algorithm achieves competitive results while consuming less time than other methods.
KW - bilevel optimization
KW - hyperparameter selection
KW - smoothing Newton method
KW - superlinear convergence
KW - Support vector classification with logistic loss
UR - http://www.scopus.com/inward/record.url?scp=85203250608&partnerID=8YFLogxK
U2 - 10.1080/02331934.2024.2394612
DO - 10.1080/02331934.2024.2394612
M3 - Article
AN - SCOPUS:85203250608
SN - 0233-1934
JO - Optimization
JF - Optimization
ER -