Abstract
Model selection is critical to least squares support vector machine (LSSVM). A major problem of existing model selection approaches for LSSVM is that the inverse of the kernel matrix needs to be computed with O(n³) complexity at each iteration, where n is the number of training examples. This is prohibitive for large-scale applications. In this paper, we propose an approximate approach to model selection of LSSVM. We use multilevel circulant matrices to approximate the kernel matrix so that the fast Fourier transform (FFT) can be applied to reduce the computational cost of the matrix inverse. With such approximation, we first design an efficient LSSVM algorithm with O(n log(n)) complexity and theoretically analyze the effect of kernel matrix approximation on the decision function of LSSVM. We further show that the approximate optimal model produced with the multilevel circulant matrix is consistent with the accurate one produced with the original kernel matrix. Under the guarantee of consistency, we present an approximate model selection scheme whose complexity is significantly lower than that of previous approaches. Experimental results on benchmark datasets demonstrate the effectiveness of approximate model selection.
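To illustrate the core computational idea, below is a minimal one-level sketch (the paper itself uses multilevel circulant matrices): a circulant matrix is diagonalized by the discrete Fourier transform, so a regularized LSSVM-style linear system (C + I/γ)α = y can be solved in O(n log(n)) with the FFT instead of O(n³) with a direct inverse. The function name `circulant_lssvm_solve`, the bias-free formulation, and the toy 1-D grid are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def circulant_lssvm_solve(c, y, gamma):
    """Solve (C + I/gamma) alpha = y, where C is the circulant matrix whose
    first column is c, via FFT diagonalization -- O(n log(n)) instead of O(n^3).
    Illustrative one-level sketch; the paper works with multilevel circulant matrices."""
    eig = np.fft.fft(c)               # eigenvalues of C are the DFT of its first column
    eig_reg = eig + 1.0 / gamma       # adding I/gamma shifts every eigenvalue by 1/gamma
    alpha = np.fft.ifft(np.fft.fft(y) / eig_reg)  # solve in the Fourier domain
    return alpha.real

# Toy usage: a stationary (RBF) kernel on an evenly spaced 1-D grid with
# wrap-around distances is exactly circulant, so the approximation is exact here.
n = 1024
x = np.linspace(0.0, 1.0, n, endpoint=False)
y = np.sin(2 * np.pi * x) + 0.1 * np.random.randn(n)
d = np.minimum(x, 1.0 - x)                 # circular distances to x[0]
c = np.exp(-d**2 / (2 * 0.1**2))           # first column of the kernel matrix
alpha = circulant_lssvm_solve(c, y, gamma=10.0)
```

Because every candidate hyperparameter setting only changes c and γ, each model selection iteration reuses this O(n log(n)) solve, which is what makes the approximate scheme cheap relative to repeatedly inverting the full kernel matrix.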
| Original language | English |
|---|---|
| Pages (from-to) | 165-180 |
| Number of pages | 16 |
| Journal | Journal of Machine Learning Research |
| Volume | 20 |
| Publication status | Published - 2011 |
| Externally published | Yes |
| Event | 3rd Asian Conference on Machine Learning, ACML 2011 - Taoyuan, Taiwan, Province of China |
| Duration | 13 Nov 2011 → 15 Nov 2011 |
Keywords
- Least squares support vector machine
- Matrix approximation
- Model selection
- Multilevel circulant matrices