TY - JOUR
T1 - KMA-α
T2 - A kernel matrix approximation algorithm for support vector machines
AU - Ding, Lizhong
AU - Liao, Shizhong
PY - 2012/4
Y1 - 2012/4
N2 - The computation of kernel matrices is essential for solving support vector machines (SVM). Since the exact approach is difficult to apply to large-scale problems, there has been a great deal of recent interest in approximate approaches, and this paper proposes a new approximation algorithm for the computation of kernel matrices. First, we reformulate the quadratic optimization problems of SVM and multiple kernel SVM as second-order cone programming (SOCP) problems via convex quadratically constrained linear programming (QCLP). Then, combining Monte Carlo approximation with incomplete Cholesky factorization, we present a new kernel matrix approximation algorithm, KMA-α. KMA-α uses a Monte Carlo algorithm to randomly sample the kernel matrix. Rather than directly computing the singular value decomposition of the sample matrix, KMA-α applies incomplete Cholesky factorization with symmetric permutation to obtain a near-optimal low-rank approximation of the sample matrix. The approximate matrix produced by KMA-α can be used in the SOCP formulation to improve the efficiency of SVM. Further, we analyze the computational complexity of KMA-α and prove an error bound theorem for it. Finally, through comparative experiments on benchmark datasets, we verify the validity and efficiency of KMA-α. Theoretical and experimental results show that KMA-α is a valid and efficient kernel matrix approximation algorithm.
AB - The computation of kernel matrices is essential for solving support vector machines (SVM). Since the exact approach is difficult to apply to large-scale problems, there has been a great deal of recent interest in approximate approaches, and this paper proposes a new approximation algorithm for the computation of kernel matrices. First, we reformulate the quadratic optimization problems of SVM and multiple kernel SVM as second-order cone programming (SOCP) problems via convex quadratically constrained linear programming (QCLP). Then, combining Monte Carlo approximation with incomplete Cholesky factorization, we present a new kernel matrix approximation algorithm, KMA-α. KMA-α uses a Monte Carlo algorithm to randomly sample the kernel matrix. Rather than directly computing the singular value decomposition of the sample matrix, KMA-α applies incomplete Cholesky factorization with symmetric permutation to obtain a near-optimal low-rank approximation of the sample matrix. The approximate matrix produced by KMA-α can be used in the SOCP formulation to improve the efficiency of SVM. Further, we analyze the computational complexity of KMA-α and prove an error bound theorem for it. Finally, through comparative experiments on benchmark datasets, we verify the validity and efficiency of KMA-α. Theoretical and experimental results show that KMA-α is a valid and efficient kernel matrix approximation algorithm.
KW - Cholesky factorization
KW - Kernel method
KW - Matrix approximation
KW - Second-order cone programming
KW - Support vector machine
UR - http://www.scopus.com/inward/record.url?scp=84861438799&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:84861438799
SN - 1000-1239
VL - 49
SP - 746
EP - 753
JO - Jisuanji Yanjiu yu Fazhan/Computer Research and Development
JF - Jisuanji Yanjiu yu Fazhan/Computer Research and Development
IS - 4
ER -