High dimensional Bayesian optimization via supervised dimension reduction

Miao Zhang, Huiqi Li*, Steven Su

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

19 Citations (Scopus)

Abstract

Bayesian optimization (BO) has been broadly applied to computationally expensive problems, but extending BO to high dimensions remains challenging. Existing works usually rely on the strict assumption of an additive or a linear-embedding structure in the objective function. This paper directly introduces a supervised dimension reduction method, Sliced Inverse Regression (SIR), into high dimensional Bayesian optimization, which can effectively learn the intrinsic low-dimensional structure of the objective function during optimization. Furthermore, a kernel trick is developed to reduce computational complexity and to learn a nonlinear subspace of the unknown function when applying SIR to extremely high dimensional BO. We present several computational benefits and derive theoretical regret bounds for our algorithm. Extensive experiments on synthetic examples and two real applications demonstrate the superiority of our algorithms for high dimensional Bayesian optimization.
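The supervised dimension reduction step used above, Sliced Inverse Regression, can be sketched generically. This is a minimal textbook SIR implementation, not the authors' code: it whitens the inputs, slices the samples by sorted response, and eigen-decomposes the weighted covariance of the per-slice means to recover effective dimension-reduction (e.d.r.) directions. All names and parameters here are illustrative.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_components=2):
    """Sliced Inverse Regression: estimate e.d.r. directions from (X, y)
    by eigen-decomposing the covariance of per-slice means of the
    standardized inputs."""
    n, p = X.shape
    # Whiten X with the empirical covariance (inverse square root
    # computed via an eigen-decomposition; ridge term for stability).
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + 1e-8 * np.eye(p)
    w, V = np.linalg.eigh(cov)
    cov_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    Z = (X - mu) @ cov_inv_sqrt
    # Slice the samples by sorted response and average Z within each slice.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M span the SIR subspace in whitened space;
    # map them back to the original X-space.
    evals, evecs = np.linalg.eigh(M)
    top = evecs[:, np.argsort(evals)[::-1][:n_components]]
    return cov_inv_sqrt @ top  # columns are e.d.r. directions
```

In a BO loop, the Gaussian process surrogate would then be fit on the projected inputs `X @ B` rather than on the full high-dimensional `X`; the paper's kernelized variant avoids forming the `p × p` matrices explicitly.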

Original language: English
Title of host publication: Proceedings of the 28th International Joint Conference on Artificial Intelligence, IJCAI 2019
Editor: Sarit Kraus
Publisher: International Joint Conferences on Artificial Intelligence
Pages: 4292-4298
Number of pages: 7
ISBN (electronic): 9780999241141
DOI
Publication status: Published - 2019
Event: 28th International Joint Conference on Artificial Intelligence, IJCAI 2019 - Macao, China
Duration: 10 Aug 2019 - 16 Aug 2019

Publication series

Name: IJCAI International Joint Conference on Artificial Intelligence
Volume: 2019-August
ISSN (print): 1045-0823

Conference

Conference: 28th International Joint Conference on Artificial Intelligence, IJCAI 2019
Country/Territory: China
City: Macao
Period: 10/08/19 - 16/08/19
