Abstract
Kernel selection is a fundamental problem in kernel-based learning algorithms. In this paper, we propose an approximate approach to automatic kernel selection for regression from the perspective of kernel matrix approximation. We first introduce multilevel circulant matrices into automatic kernel selection and develop two approximate kernel selection algorithms that exploit the computational virtues of multilevel circulant matrices. The complexity of the proposed algorithms is quasi-linear in the number of data points. We then prove an approximation error bound that measures the effect of approximating kernel matrices with multilevel circulant matrices on the resulting hypothesis, and further show that the approximate hypothesis produced with multilevel circulant matrices converges to the accurate hypothesis produced with kernel matrices. Experimental evaluations on benchmark datasets demonstrate the effectiveness of approximate kernel selection.
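The quasi-linear complexity claimed above rests on a standard property of circulant matrices: they are diagonalized by the discrete Fourier transform, so a matrix-vector product costs O(n log n) via the FFT instead of O(n²). The following is a minimal illustrative sketch of this idea in the one-level, one-dimensional case, not the paper's actual algorithm: it builds a circulant approximation to a Gaussian kernel matrix on a uniform grid (the kernel bandwidth `sigma` and grid spacing `h` are hypothetical example values) and multiplies it by a vector through the FFT.

```python
import numpy as np

def circulant_kernel_column(n, h, sigma):
    # First column of a circulant approximation to the Gaussian kernel
    # matrix on a uniform 1-D grid with spacing h. Distances are wrapped
    # (min(j, n - j)) so the resulting matrix is exactly circulant.
    j = np.arange(n)
    d = np.minimum(j, n - j) * h
    return np.exp(-d**2 / (2.0 * sigma**2))

def circulant_matvec(c, v):
    # Multiply the circulant matrix C with first column c by v in
    # O(n log n), using the diagonalization C = F^* diag(fft(c)) F.
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(v)))

# Hypothetical example parameters for the sketch.
n, h, sigma = 256, 0.05, 0.5
c = circulant_kernel_column(n, h, sigma)
v = np.random.default_rng(0).standard_normal(n)

# Dense circulant matrix, built only to verify the FFT matvec:
# C[i, j] = c[(i - j) mod n].
C = np.stack([np.roll(c, j) for j in range(n)], axis=1)
```

Because the first column is symmetric under the wrap (`c[j] == c[n - j]`), the eigenvalues `np.fft.fft(c)` are real, which is also what makes determinant- and trace-type kernel selection criteria cheap to evaluate in this setting.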
| Original language | English |
| --- | --- |
| Article number | 7396940 |
| Pages (from-to) | 554-565 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Cybernetics |
| Volume | 47 |
| Issue number | 3 |
| DOI | |
| Publication status | Published - Mar 2017 |
| Externally published | Yes |