Learning to optimize on SPD manifolds

Zhi Gao, Yuwei Wu*, Yunde Jia, Mehrtash Harandi

*Corresponding author of this work

Research output: Contribution to journal › Conference article › peer-review

14 Citations (Scopus)

Abstract

Many tasks in computer vision and machine learning are modeled as optimization problems with constraints in the form of Symmetric Positive Definite (SPD) matrices. Solving such problems is challenging due to the non-linearity of the SPD manifold, so optimization with SPD constraints relies heavily on expert knowledge and human involvement. In this paper, we propose a meta-learning method to automatically learn an iterative optimizer on SPD manifolds. Specifically, we introduce a novel recurrent model that takes into account the structure of input gradients and identifies the updating scheme of optimization. We parameterize the optimizer by the recurrent model and utilize Riemannian operations to ensure that our method is faithful to the geometry of SPD manifolds. Compared with existing SPD optimizers, our optimizer effectively exploits the underlying data distribution and learns a better optimization trajectory in a data-driven manner. Extensive experiments on various computer vision tasks including metric nearness, clustering, and similarity learning demonstrate that our optimizer consistently outperforms existing state-of-the-art methods.
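The abstract's "Riemannian operations" refer to the standard machinery for keeping iterates on the SPD manifold: mapping a Euclidean gradient to a Riemannian one and updating via the exponential map. The snippet below is a minimal sketch of one such hand-designed baseline step (Riemannian gradient descent under the affine-invariant metric), not the paper's learned recurrent optimizer; the toy objective, learning rate, and function names are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm, sqrtm

def riemannian_sgd_step(X, egrad, lr=0.01):
    """One Riemannian gradient-descent step on the SPD manifold
    (affine-invariant metric). Illustrative baseline, not the
    learned optimizer from the paper."""
    # Tangent vectors at an SPD point are symmetric matrices,
    # so symmetrize the Euclidean gradient first.
    egrad = 0.5 * (egrad + egrad.T)
    # Riemannian gradient under the affine-invariant metric: X * egrad * X.
    rgrad = X @ egrad @ X
    # Exponential map: Exp_X(V) = X^{1/2} expm(X^{-1/2} V X^{-1/2}) X^{1/2}.
    # This congruence of a matrix exponential keeps the iterate SPD.
    Xh = sqrtm(X).real
    Xh_inv = np.linalg.inv(Xh)
    return Xh @ expm(Xh_inv @ (-lr * rgrad) @ Xh_inv) @ Xh

# Toy problem: pull X toward a target SPD matrix A by
# minimizing f(X) = ||X - A||_F^2, whose Euclidean gradient is 2(X - A).
rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))
A = B @ B.T + 3.0 * np.eye(3)   # a well-conditioned SPD target
X = np.eye(3)
for _ in range(100):
    X = riemannian_sgd_step(X, 2.0 * (X - A), lr=0.01)
# Every iterate stays SPD by construction of the exponential map.
```

The learned optimizer in the paper replaces the fixed `-lr * rgrad` direction with the output of a recurrent model, while still applying the update through the same manifold operations so that iterates remain valid SPD matrices.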

Original language: English
Article number: 9157412
Pages (from-to): 7697-7706
Number of pages: 10
Journal: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
DOI
Publication status: Published - 2020
Event: 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2020 - Virtual, Online, United States
Duration: 14 Jun 2020 → 19 Jun 2020
