TY - JOUR

T1 - Enhanced proximal DC algorithms with extrapolation for a class of structured nonsmooth DC minimization

AU - Lu, Zhaosong

AU - Zhou, Zirui

AU - Sun, Zhe

N1 - Funding Information:
Acknowledgements Zhaosong Lu was supported in part by an NSERC Discovery Grant. Zirui Zhou was supported by an NSERC Discovery Grant and the SFU Alan Mekler postdoctoral fellowship. Zhe Sun was supported by the National Natural Science Foundation of China (Grant Nos. 11761037 and 11501265), the Natural Science Foundation of Jiangxi Province (Grant No. 20181BAB201009), and a scholarship from the China Scholarship Council. This work was conducted while the second author was a postdoctoral fellow and the third author was a visiting scholar at Simon Fraser University, Canada.

PY - 2019/7/1

Y1 - 2019/7/1

N2 - In this paper we consider a class of structured nonsmooth difference-of-convex (DC) minimization problems in which the first convex component is the sum of a smooth function and a nonsmooth function, while the second convex component is the supremum of possibly infinitely many convex smooth functions. We first propose an inexact enhanced DC algorithm for solving this problem in the case where the second convex component is the supremum of finitely many convex smooth functions, and show that every accumulation point of the generated sequence is an (α, η)-D-stationary point of the problem, which is generally stronger than an ordinary D-stationary point. In addition, inspired by the recent work (Pang et al. in Math Oper Res 42(1):95–118, 2017; Wen et al. in Comput Optim Appl 69(2):297–324, 2018), we propose two proximal DC algorithms with extrapolation for solving this problem. We show that every accumulation point of the solution sequence generated by them is an (α, η)-D-stationary point of the problem, and establish the convergence of the entire sequence under suitable assumptions. We also introduce the concept of an approximate (α, η)-D-stationary point and derive the iteration complexity of the proposed algorithms for finding an approximate (α, η)-D-stationary point. In contrast with the DC algorithm (Pang et al. 2017), our proximal DC algorithms have much simpler subproblems and also incorporate extrapolation for possible acceleration. Moreover, one of our proximal DC algorithms is potentially applicable to the DC problem in which the second convex component is the supremum of infinitely many convex smooth functions. In addition, our algorithms have stronger convergence results than the proximal DC algorithm in Wen et al. (2018).

AB - In this paper we consider a class of structured nonsmooth difference-of-convex (DC) minimization problems in which the first convex component is the sum of a smooth function and a nonsmooth function, while the second convex component is the supremum of possibly infinitely many convex smooth functions. We first propose an inexact enhanced DC algorithm for solving this problem in the case where the second convex component is the supremum of finitely many convex smooth functions, and show that every accumulation point of the generated sequence is an (α, η)-D-stationary point of the problem, which is generally stronger than an ordinary D-stationary point. In addition, inspired by the recent work (Pang et al. in Math Oper Res 42(1):95–118, 2017; Wen et al. in Comput Optim Appl 69(2):297–324, 2018), we propose two proximal DC algorithms with extrapolation for solving this problem. We show that every accumulation point of the solution sequence generated by them is an (α, η)-D-stationary point of the problem, and establish the convergence of the entire sequence under suitable assumptions. We also introduce the concept of an approximate (α, η)-D-stationary point and derive the iteration complexity of the proposed algorithms for finding an approximate (α, η)-D-stationary point. In contrast with the DC algorithm (Pang et al. 2017), our proximal DC algorithms have much simpler subproblems and also incorporate extrapolation for possible acceleration. Moreover, one of our proximal DC algorithms is potentially applicable to the DC problem in which the second convex component is the supremum of infinitely many convex smooth functions. In addition, our algorithms have stronger convergence results than the proximal DC algorithm in Wen et al. (2018).

KW - Approximate D-stationary point

KW - D-stationary point

KW - Extrapolation

KW - Iteration complexity

KW - Nonsmooth DC program

KW - Proximal DCA

UR - http://www.scopus.com/inward/record.url?scp=85053525639&partnerID=8YFLogxK

U2 - 10.1007/s10107-018-1318-9

DO - 10.1007/s10107-018-1318-9

M3 - Article

AN - SCOPUS:85053525639

VL - 176

SP - 369

EP - 401

JO - Mathematical Programming

JF - Mathematical Programming

SN - 0025-5610

IS - 1-2

ER -