Variance-Reduced Shuffling Gradient Descent with Momentum for Finite-Sum Minimization

Xia Jiang, Xianlin Zeng*, Lihua Xie, Jian Sun

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Finite-sum minimization is a fundamental optimization problem in signal processing and machine learning. This letter proposes a variance-reduced shuffling gradient descent with Nesterov's momentum for smooth convex finite-sum optimization. We integrate an explicit variance-reduction step into the shuffling gradient descent to handle the variance introduced by gradient shuffling. The proposed algorithm, under a unified shuffling scheme, converges at a rate of O(1/T), where T is the number of epochs. This rate, which is independent of the gradient variance, improves on most existing shuffling gradient algorithms for convex optimization. Finally, numerical simulations demonstrate the convergence performance of the proposed algorithm.
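
The abstract names three ingredients without giving pseudocode: per-epoch reshuffling of the component gradients, an explicit variance-reduction correction, and Nesterov's momentum. The Python sketch below illustrates one plausible way these combine, using an SVRG-style control variate as the variance-reduction step; the function vr_shuffling_momentum, the step size lr, and the momentum parameter beta are illustrative assumptions, not the algorithm or parameters from the letter.

    import numpy as np

    def vr_shuffling_momentum(grad_i, n, x0, epochs=50, lr=0.02, beta=0.5):
        # Sketch only (assumed structure, not the paper's exact update).
        # grad_i(i, x): gradient of the i-th component function at x.
        x, x_prev = x0.copy(), x0.copy()
        for _ in range(epochs):
            snapshot = x.copy()
            # Full gradient at the snapshot anchors the variance reduction.
            full_grad = sum(grad_i(i, snapshot) for i in range(n)) / n
            for i in np.random.permutation(n):  # reshuffle each epoch
                y = x + beta * (x - x_prev)  # Nesterov-style extrapolation
                # SVRG-style control variate: its variance shrinks as the
                # iterates approach the snapshot point.
                g = grad_i(i, y) - grad_i(i, snapshot) + full_grad
                x_prev, x = x, y - lr * g
        return x

    # Example: least squares, f_i(x) = 0.5 * (a_i @ x - b_i)**2
    rng = np.random.default_rng(0)
    A, b = rng.normal(size=(100, 5)), rng.normal(size=100)
    grad = lambda i, x: (A[i] @ x - b[i]) * A[i]
    x_hat = vr_shuffling_momentum(grad, n=100, x0=np.zeros(5))

The control variate matters here because, under shuffling, the per-step gradient is no longer an unbiased estimate of the full gradient; an explicit correction of this kind is one standard way to remove the resulting variance term from the rate. The letter itself should be consulted for the actual update and step-size schedule.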

Original language: English
Pages (from-to): 1700-1705
Number of pages: 6
Journal: IEEE Control Systems Letters
Volume: 7
DOIs
Publication status: Published - 2023

Keywords

  • Nesterov's momentum
  • Variance reduction
  • finite-sum minimization
  • shuffling gradient descent
