TY - JOUR
T1 - Distributed Momentum-Based Frank-Wolfe Algorithm for Stochastic Optimization
AU - Hou, Jie
AU - Zeng, Xianlin
AU - Wang, Gang
AU - Sun, Jian
AU - Chen, Jie
N1 - Publisher Copyright:
© 2014 Chinese Association of Automation.
PY - 2023/3/1
Y1 - 2023/3/1
N2 - This paper considers distributed stochastic optimization, in which a number of agents cooperate to optimize a global objective function through local computations and information exchanges with neighbors over a network. Stochastic optimization problems are usually tackled by variants of projected stochastic gradient descent. However, projecting a point onto a feasible set is often expensive. The Frank-Wolfe (FW) method has well-documented merits in handling convex constraints, but existing stochastic FW algorithms are basically developed for centralized settings. In this context, the present work puts forth a distributed stochastic Frank-Wolfe solver, by judiciously combining Nesterov's momentum and gradient tracking techniques for stochastic convex and nonconvex optimization over networks. It is shown that the convergence rate of the proposed algorithm is O(k^(-1/2)) for convex optimization, and O(1/log₂(k)) for nonconvex optimization. The efficacy of the algorithm is demonstrated by numerical simulations against a number of competing alternatives.
AB - This paper considers distributed stochastic optimization, in which a number of agents cooperate to optimize a global objective function through local computations and information exchanges with neighbors over a network. Stochastic optimization problems are usually tackled by variants of projected stochastic gradient descent. However, projecting a point onto a feasible set is often expensive. The Frank-Wolfe (FW) method has well-documented merits in handling convex constraints, but existing stochastic FW algorithms are basically developed for centralized settings. In this context, the present work puts forth a distributed stochastic Frank-Wolfe solver, by judiciously combining Nesterov's momentum and gradient tracking techniques for stochastic convex and nonconvex optimization over networks. It is shown that the convergence rate of the proposed algorithm is O(k^(-1/2)) for convex optimization, and O(1/log₂(k)) for nonconvex optimization. The efficacy of the algorithm is demonstrated by numerical simulations against a number of competing alternatives.
KW - Distributed optimization
KW - Frank-Wolfe (FW) algorithms
KW - momentum-based method
KW - stochastic optimization
UR - http://www.scopus.com/inward/record.url?scp=85137868742&partnerID=8YFLogxK
U2 - 10.1109/JAS.2022.105923
DO - 10.1109/JAS.2022.105923
M3 - Article
AN - SCOPUS:85137868742
SN - 2329-9266
VL - 10
SP - 685
EP - 699
JO - IEEE/CAA Journal of Automatica Sinica
JF - IEEE/CAA Journal of Automatica Sinica
IS - 3
ER -