Variance-Reduced Shuffling Gradient Descent with Momentum for Finite-Sum Minimization

Xia Jiang, Xianlin Zeng*, Lihua Xi, Jian Sun

*Corresponding author of this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Finite-sum minimization is a fundamental optimization problem in signal processing and machine learning. This letter proposes a variance-reduced shuffling gradient descent with Nesterov's momentum for smooth convex finite-sum optimization. We integrate an explicit variance-reduction step into the shuffling gradient descent to deal with the variance introduced by shuffled gradients. The proposed algorithm with a unified shuffling scheme converges at a rate of O(1/T), where T is the number of epochs. This convergence rate, which is independent of the gradient variance, improves on most existing shuffling gradient algorithms for convex optimization. Finally, numerical simulations demonstrate the convergence performance of the proposed algorithm.
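The abstract describes the algorithm only at a high level: shuffled epochs over the component gradients, an explicit variance-reduction correction, and Nesterov's momentum. The Python sketch below illustrates that general recipe on a small least-squares finite sum; the SVRG-style snapshot control variate, the particular momentum form, and all hyper-parameters (`eta`, `beta`, `T`) are illustrative assumptions, not the letter's exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite-sum least-squares problem: f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2
n, d = 50, 5
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star  # consistent system, so the optimal objective value is 0

def grad_i(x, i):
    # Gradient of the i-th component function f_i
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    # Full gradient of f, averaged over all n components
    return A.T @ (A @ x - b) / n

def objective(x):
    r = A @ x - b
    return 0.5 * np.mean(r ** 2)

# Hypothetical hyper-parameters chosen for this toy problem (not from the paper)
eta, beta, T = 0.01, 0.5, 50

x = np.zeros(d)
y = x.copy()  # momentum "lookahead" point

for epoch in range(T):
    snapshot = x.copy()
    mu = full_grad(snapshot)       # full gradient at the epoch snapshot
    for i in rng.permutation(n):   # reshuffled component order each epoch
        # SVRG-style variance-reduced gradient estimate at the lookahead point
        g = grad_i(y, i) - grad_i(snapshot, i) + mu
        x_new = y - eta * g
        # Nesterov-style momentum extrapolation
        y = x_new + beta * (x_new - x)
        x = x_new

print(objective(x))
```

Because the variance-reduced estimate `g` has vanishing variance as the iterates approach the minimizer, the sketch converges with a constant step size, whereas plain shuffled SGD would stall at a variance-dependent level.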

Original language: English
Pages (from-to): 1700-1705
Number of pages: 6
Journal: IEEE Control Systems Letters
Volume: 7
DOI
Publication status: Published - 2023
