Abstract
Kernel selection is a fundamental problem in kernel-based learning algorithms. In this paper, we propose an approximate approach to automatic kernel selection for regression from the perspective of kernel matrix approximation. We first introduce multilevel circulant matrices into automatic kernel selection and develop two approximate kernel selection algorithms that exploit the computational virtues of multilevel circulant matrices; the complexity of the proposed algorithms is quasi-linear in the number of data points. We then prove an approximation error bound that measures the effect of approximating kernel matrices by multilevel circulant matrices on the hypothesis, and further show that the approximate hypothesis produced with multilevel circulant matrices converges to the accurate hypothesis produced with kernel matrices. Experimental evaluations on benchmark datasets demonstrate the effectiveness of approximate kernel selection.
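The computational virtue the abstract alludes to is that circulant matrices are diagonalized by the discrete Fourier transform, so matrix-vector products (and, from the eigenvalues, quantities such as determinants and traces used in selection criteria) cost O(n log n) rather than O(n²) or O(n³). The sketch below illustrates this idea for the simplest one-level case, approximating a Gaussian kernel matrix on equispaced points by a symmetric circulant matrix built from its first row. It is a minimal illustration of the underlying FFT trick, not the paper's actual multilevel algorithm; the symmetrized first-row construction is an assumption made here for the example.

```python
import numpy as np

def gaussian_kernel_matrix(x, sigma=1.0):
    """Dense Gaussian kernel matrix K[i, j] = exp(-(x_i - x_j)^2 / (2 sigma^2))."""
    d = x[:, None] - x[None, :]
    return np.exp(-d**2 / (2.0 * sigma**2))

def circulant_approximation(K):
    """First column c of a symmetric circulant approximant C of K.

    C is defined by C[i, j] = c[(i - j) mod n]; averaging K[0, k] with
    K[0, (n - k) mod n] makes C symmetric (an illustrative choice).
    """
    n = K.shape[0]
    first = K[0]
    return np.array([(first[k] + first[(n - k) % n]) / 2.0 for k in range(n)])

def circulant_matvec(c, v):
    """Compute C @ v in O(n log n): the DFT diagonalizes C, with
    eigenvalues fft(c), so C @ v is a circular convolution of c and v."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(v)))

# Usage: approximate the kernel matrix and check the fast product
# against the explicitly materialized circulant matrix.
x = np.linspace(0.0, 1.0, 16)
K = gaussian_kernel_matrix(x, sigma=0.3)
c = circulant_approximation(K)
C = np.array([[c[(i - j) % 16] for j in range(16)] for i in range(16)])
v = np.arange(16.0)
print(np.allclose(circulant_matvec(c, v), C @ v))
```

The eigenvalues of the approximant are simply `np.fft.fft(c)`, which is what makes eigenvalue-based selection criteria computable in quasi-linear time; the multilevel case in the paper extends this to tensor-product (multidimensional) structure.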
| Original language | English |
| --- | --- |
| Article number | 7396940 |
| Pages (from-to) | 554-565 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Cybernetics |
| Volume | 47 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - Mar 2017 |
| Externally published | Yes |
Keywords
- Approximate algorithms
- kernel matrix approximation
- kernel selection
- model selection
- multilevel circulant matrices