In recent years there has been ever-increasing interest in the optimization community in algorithms suited to optimization problems with separable structure. Such problems arise in fields such as machine learning, signal processing, and statistics. With the advent of the big data era, problem instances are typically of large or even huge scale. Numerous experiments have demonstrated that first-order methods such as the proximal gradient method (PGM) and the randomized block coordinate proximal gradient method (R-BCPGM) are very powerful for solving them. The numerical efficiency of PGM and R-BCPGM has long been witnessed in diverse areas, and some theoretical progress has been made, but research on their convergence rates is still in its infancy. In this project, we will focus on introducing the well-established error bound, calmness, and metric subregularity theories from variational analysis into convergence rate analysis. In particular, the convergence rates of PGM and R-BCPGM will be investigated intensively under error bound, calmness, and metric subregularity conditions. First, we will initiate a new framework for establishing algorithm-tailored subregularity conditions that suffice to ensure the desired linear convergence. Moreover, the calmness conditions tailored to the linear convergence of PGM will be verified to hold automatically for a wide class of structured convex models. In addition, the subregularity conditions required for the expected-value-type linear convergence of R-BCPGM will be proved to hold automatically for widely used statistical learning models such as the LASSO, fused LASSO, and group LASSO. For general problems without any underlying structure, sufficient conditions for the calmness tailored to the linear convergence of PGM will also be developed by using recent developments in variational analysis.
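To make the setting concrete, the following is a minimal sketch of the proximal gradient method applied to the LASSO, one of the structured convex models named above. All names (`soft_threshold`, `pgm_lasso`), the step-size rule, and the toy data are illustrative assumptions, not part of the project itself; the iteration is the standard one, x_{k+1} = prox_{t·λ‖·‖₁}(x_k − t·∇f(x_k)) with f(x) = ½‖Ax − b‖².

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (componentwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def pgm_lasso(A, b, lam, iters=500):
    """PGM sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Illustrative only: constant step 1/L with L = ||A||_2^2,
    the Lipschitz constant of the smooth part's gradient.
    """
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)          # gradient step on the smooth part
        x = soft_threshold(x - step * grad, step * lam)  # proximal step
    return x

# Hypothetical toy instance with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = pgm_lasso(A, b, lam=0.1)
```

Under metric subregularity of the residual mapping at a solution, iterations of this kind converge linearly rather than merely sublinearly, which is exactly the regime the project aims to characterize.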
Effective start/end date: 1/08/18 → 31/07/21