A Smooth Double Proximal Primal-Dual Algorithm for a Class of Distributed Nonsmooth Optimization Problems

Yue Wei, Hao Fang*, Xianlin Zeng, Jie Chen, Panos Pardalos

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

30 Citations (Scopus)

Abstract

This technical note studies a class of distributed nonsmooth convex consensus optimization problems. The cost function is a sum of local cost functions, each convex but nonsmooth. Each local cost function consists of a twice differentiable (smooth) convex function and two lower semicontinuous (nonsmooth) convex functions. We call these problems single-smooth plus double-nonsmooth (SSDN) problems. Under mild conditions, we propose a distributed double proximal primal-dual optimization algorithm. The double proximal splitting is designed to handle the difficulty caused by the unproximable property of the sum of the two nonsmooth functions, and it also guarantees that the proposed algorithm is locally Lipschitz continuous. An auxiliary variable in the double proximal splitting is introduced to estimate the subgradient of the second nonsmooth function. Theoretically, we conduct the convergence analysis by employing Lyapunov stability theory, showing that the proposed algorithm drives the states to consensus at the optimal point. Finally, nontrivial simulations are presented, and the results demonstrate the effectiveness of the proposed algorithm.
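The paper's algorithm is not reproduced on this page, but the proximal operator it builds on has a standard closed form for simple nonsmooth terms. As an illustrative sketch only (not the authors' method), the snippet below shows the proximal map of the ℓ1 norm (soft-thresholding) and a single proximal-gradient step combining a smooth gradient with a nonsmooth proximal update; all function names and the example objective are hypothetical choices for this illustration.

```python
import math

def prox_l1(v, lam):
    """Proximal operator of lam*|x| (soft-thresholding):
    prox(v) = argmin_x  lam*|x| + 0.5*(x - v)**2.
    """
    if v > lam:
        return v - lam
    if v < -lam:
        return v + lam
    return 0.0

def prox_gradient_step(x, grad_f, lam, step):
    """One proximal-gradient step: a gradient step on the smooth
    part f, followed by the proximal map of the nonsmooth part."""
    return prox_l1(x - step * grad_f(x), step * lam)

# Toy example: minimize 0.5*(x-3)^2 + |x|; the minimizer is x* = 2,
# since optimality requires (x - 3) + sign(x) = 0 with x > 0.
grad_f = lambda x: x - 3.0
x = 0.0
for _ in range(100):
    x = prox_gradient_step(x, grad_f, lam=1.0, step=0.5)
```

After the loop, `x` is within numerical tolerance of the minimizer 2.0. The "double proximal" idea in the paper addresses the harder case where the sum of two such nonsmooth terms has no closed-form proximal map, which this single-prox sketch does not capture.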

Original language: English
Article number: 8807129
Pages (from-to): 1800-1806
Number of pages: 7
Journal: IEEE Transactions on Automatic Control
Volume: 65
Issue number: 4
DOIs
Publication status: Published - Apr 2020

Keywords

  • Distributed optimization
  • nonsmooth convex optimization
  • primal dual
  • proximal operator
