Abstract
In this paper we consider a class of structured nonsmooth difference-of-convex (DC) minimization problems in which the first convex component is the sum of a smooth function and a nonsmooth function, while the second convex component is the supremum of finitely many smooth convex functions. Existing methods for this problem typically have weak convergence guarantees or exhibit slow convergence. Motivated by this, we propose two nonmonotone enhanced proximal DC algorithms for solving this problem. To accelerate convergence, one uses a nonmonotone line-search scheme in which the associated Lipschitz constant is adaptively approximated by local curvature information of the smooth function in the first convex component, while the other employs an extrapolation scheme. We show that every accumulation point of the solution sequence generated by either method is a D-stationary point of the problem. These methods may, however, become inefficient when the number of smooth convex functions defining the second convex component is large. To remedy this issue, we propose randomized counterparts of the two methods and show that every accumulation point of the generated solution sequence is almost surely a D-stationary point of the problem. Preliminary numerical experiments demonstrate the efficiency of the proposed algorithms.
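To make the algorithmic idea concrete, here is a minimal illustrative sketch of one extrapolated enhanced proximal DC step, not the authors' exact algorithms. It assumes the problem form min_x g(x) + h(x) - max_i psi_i(x) with g smooth convex, h convex with an inexpensive proximal map, and each psi_i smooth convex; the names (`grad_g`, `prox_h`, `psi_list`, `L`, `beta`, `eps`) and the fixed stepsize/extrapolation parameters are placeholders for this example, whereas the paper instead adapts the Lipschitz estimate via a nonmonotone line search.

```python
import numpy as np

def objective(x, g, h, psi_list):
    """Full DC objective: g(x) + h(x) - max_i psi_i(x)."""
    return g(x) + h(x) - max(psi(x) for psi in psi_list)

def enhanced_prox_dc_step(x, x_prev, g, grad_g, h, prox_h,
                          psi_list, grad_psi_list, L, beta, eps=1e-8):
    """One extrapolated enhanced proximal DC step: linearize the concave
    part at every eps-active index and keep the best candidate; comparing
    all active linearizations is what targets D-stationary points rather
    than merely critical points."""
    y = x + beta * (x - x_prev)                        # extrapolation point
    vals = np.array([psi(x) for psi in psi_list])
    active = np.flatnonzero(vals >= vals.max() - eps)  # eps-active indices

    best_x, best_val = x, np.inf
    for i in active:
        # Majorize g at y with modulus L, linearize psi_i at x,
        # then take one proximal step with respect to h.
        step = y - (grad_g(y) - grad_psi_list[i](x)) / L
        cand = prox_h(step, 1.0 / L)
        val = objective(cand, g, h, psi_list)
        if val < best_val:
            best_x, best_val = cand, val
    return best_x

# Toy 1-D instance: g(x) = x^2/2, h = l1-norm (soft-thresholding prox),
# psi_1(x) = sum(x), psi_2(x) = -sum(x), so max_i psi_i(x) = |sum(x)|.
g = lambda x: 0.5 * float(x @ x)
grad_g = lambda x: x
h = lambda x: float(np.abs(x).sum())
prox_h = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
psi_list = [lambda x: float(x.sum()), lambda x: -float(x.sum())]
grad_psi_list = [lambda x: np.ones_like(x), lambda x: -np.ones_like(x)]

x_prev, x = np.array([1.0]), np.array([0.8])
for _ in range(50):
    x, x_prev = enhanced_prox_dc_step(x, x_prev, g, grad_g, h, prox_h,
                                      psi_list, grad_psi_list,
                                      L=1.0, beta=0.3), x
print(x)  # approaches the minimizer x* = 0
```

The randomized counterparts mentioned in the abstract would, roughly speaking, evaluate only a random subset of the indices instead of the full eps-active set when the number of psi_i is large; that variant is omitted here for brevity.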
Original language | English |
---|---|
Pages (from-to) | 2725-2752 |
Number of pages | 28 |
Journal | SIAM Journal on Optimization |
Volume | 29 |
Issue number | 4 |
DOIs | |
Publication status | Published - 31 Oct 2019 |
Scopus Subject Areas
- Software
- Theoretical Computer Science
User-Defined Keywords
- D-stationary point
- Extrapolation
- Nonmonotone line search
- Nonsmooth DC programming
- Proximal DCA