Abstract
Parameter tuning is essential to the generalization performance of support vector machines (SVMs). Previous methods usually adopt a nested two-layer framework, where the inner layer solves a convex optimization problem and the outer layer selects the hyper-parameters by minimizing either the cross-validation error or other error bounds. In this paper, we propose a novel parameter tuning approach for SVM via kernel matrix approximation, based on the observation that approximate computation is sufficient for parameter tuning. We first develop a preliminary approximate computation theory of parameter tuning for SVM. We then present a kernel matrix approximation algorithm, MoCIC, and design an approximate parameter tuning algorithm, APT, which applies MoCIC to compute a low-dimensional, low-rank approximation of the kernel matrix, uses this approximate matrix to efficiently solve the quadratic programming problem of SVM, and then selects the optimal candidate parameters through the approximate cross-validation error (ACVE). Finally, we verify and compare the feasibility and efficiency of APT on 10 artificial and benchmark datasets. Experimental results show that the new algorithm dramatically reduces the time consumed by parameter tuning while guaranteeing the effectiveness of the selected parameters. We conclude that the approximate parameter tuning approach is sound, efficient, and promising.
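The abstract does not spell out MoCIC or ACVE, so the sketch below only illustrates the general idea under stated assumptions: a Nyström low-rank approximation of the RBF kernel stands in for MoCIC, and ordinary k-fold cross-validation accuracy on the approximate model stands in for ACVE. Dataset sizes, the rank (`n_components`), and the parameter grid are all illustrative, not the authors' implementation.

```python
# A minimal sketch of approximate parameter tuning with a low-rank kernel
# approximation. NOTE: Nystroem is used here as a stand-in for MoCIC, and
# plain k-fold CV accuracy as a stand-in for ACVE; both are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.kernel_approximation import Nystroem
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

best_params, best_score = None, -np.inf
for gamma in np.logspace(-3, 1, 5):      # RBF kernel width candidates
    for C in np.logspace(-2, 2, 5):      # SVM regularization candidates
        # Low-rank approximation of the kernel matrix (rank ~ n_components),
        # so the inner SVM problem is solved in a reduced feature space
        # instead of on the full n x n kernel matrix.
        model = make_pipeline(
            Nystroem(kernel="rbf", gamma=gamma, n_components=100, random_state=0),
            LinearSVC(C=C, max_iter=5000),
        )
        # Approximate cross-validation error: CV score of the approximate model.
        score = cross_val_score(model, X, y, cv=5).mean()
        if score > best_score:
            best_params, best_score = (gamma, C), score

print("selected (gamma, C):", best_params, "approx. CV accuracy: %.3f" % best_score)
```

Because the grid search runs entirely on the low-rank feature map, each candidate parameter is evaluated far more cheaply than with the exact kernel matrix, which is the efficiency gain the abstract attributes to APT.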
Original language | English |
---|---|
Pages (from-to) | 2047-2054 |
Number of pages | 8 |
Journal | Journal of Computers (Finland) |
Volume | 7 |
Issue number | 8 |
DOIs | |
Publication status | Published - 2012 |
Externally published | Yes |
Keywords
- Kernel methods
- Matrix approximation
- Parameter tuning
- Support vector machine