Campus Units
Electrical and Computer Engineering
Document Type
Article
Publication Version
Submitted Manuscript
Publication Date
2020
Journal or Book Title
arXiv
Abstract
Alternating gradient descent (AGD) is a simple but popular algorithm which has been applied to problems in optimization, machine learning, data mining, signal processing, and related areas. The algorithm updates two blocks of variables in an alternating manner: a gradient step is taken on one block while the other block is kept fixed. When the objective function is nonconvex, it is well known that AGD converges to a first-order stationary solution at a global sublinear rate.
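As a minimal sketch of the alternating update scheme described above (not the authors' exact formulation), the following Python/NumPy snippet uses hypothetical gradient oracles grad_x and grad_y and a placeholder step size eta:

```python
import numpy as np

def agd(grad_x, grad_y, x0, y0, eta=0.01, num_iters=1000):
    """Alternating gradient descent on two blocks of variables (generic sketch).

    grad_x(x, y): gradient of the objective with respect to the first block x
    grad_y(x, y): gradient of the objective with respect to the second block y
    Each block is updated by a gradient step while the other block is held fixed.
    """
    x, y = x0.copy(), y0.copy()
    for _ in range(num_iters):
        x = x - eta * grad_x(x, y)   # gradient step on block x, with y fixed
        y = y - eta * grad_y(x, y)   # gradient step on block y, using the updated x
    return x, y
```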
In this paper, we show that a variant of AGD-type algorithms will not be trapped by "bad" stationary solutions such as saddle points and local maxima. In particular, we consider a smooth unconstrained optimization problem and propose a perturbed AGD (PA-GD) which converges (with high probability) to the set of second-order stationary solutions (SS2) at a global sublinear rate. To the best of our knowledge, this is the first alternating-type algorithm that takes O(polylog(d)/ϵ^{7/3}) iterations to reach an SS2 with high probability, where polylog(d) denotes a polynomial of the logarithm of the problem dimension d.
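The generic idea of adding a random perturbation to escape saddle points can be illustrated as follows; the gradient threshold g_thresh, perturbation radius, and schedule below are placeholders for illustration only, not the constants or the precise mechanism analyzed in the paper:

```python
import numpy as np

def pa_gd(grad_x, grad_y, x0, y0, eta=0.01, num_iters=1000,
          g_thresh=1e-3, radius=1e-3, rng=np.random.default_rng(0)):
    """Perturbed alternating gradient descent (illustrative sketch).

    When the overall gradient norm is small, a small random perturbation is
    injected so the iterates can escape strict saddle points rather than
    stalling near a first-order stationary point that is not second-order
    stationary.
    """
    x, y = x0.copy(), y0.copy()
    for _ in range(num_iters):
        gx, gy = grad_x(x, y), grad_y(x, y)
        if np.sqrt(np.sum(gx**2) + np.sum(gy**2)) <= g_thresh:
            # near-stationary point: add a random perturbation of size `radius`
            x = x + radius * rng.standard_normal(x.shape)
            y = y + radius * rng.standard_normal(y.shape)
        # alternating gradient steps, as in plain AGD
        x = x - eta * grad_x(x, y)
        y = y - eta * grad_y(x, y)
    return x, y
```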
Copyright Owner
The Author(s)
Copyright Date
2018
Language
en
File Format
application/pdf
Recommended Citation
Lu, Songtao; Hong, Mingyi; and Wang, Zhengdao, "On the Sublinear Convergence of Randomly Perturbed Alternating Gradient Descent to Second Order Stationary Solutions" (2020). Electrical and Computer Engineering Publications. 277.
https://lib.dr.iastate.edu/ece_pubs/277
Comments
This is a pre-print of the article Lu, Songtao, Mingyi Hong, and Zhengdao Wang. "On the sublinear convergence of randomly perturbed alternating gradient descent to second order stationary solutions." arXiv preprint arXiv:1802.10418 (2018). Posted with permission.