Block coordinate type methods for optimization and learning

Zhan Yu*

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

1 Citation (Scopus)


We study nonconvex (composite) optimization and learning problems in which the decision variables can be split into blocks. Random block projection is a popular technique for handling such problems because it markedly reduces the computational cost of the projection step. However, this powerful approach has not been well developed for the setting in which first-order information is unavailable and only zeroth-order information can be accessed. In this paper, we develop several classes of zeroth-order stochastic block coordinate type methods. A zeroth-order stochastic block coordinate descent (ZS-BCD) method is proposed for solving unconstrained nonconvex optimization problems. For composite optimization, we establish the zeroth-order stochastic block mirror descent (ZS-BMD) method and its associated two-phase method, which achieve complexity bounds for the composite optimization problem. Furthermore, we establish a zeroth-order stochastic block coordinate conditional gradient (ZS-BCCG) method for nonconvex (composite) optimization. In each iteration of the ZS-BCCG method, only an (approximate) linear programming subproblem on a random block needs to be solved, instead of the rather costly projection subproblem on the whole decision space required by existing stochastic approximation methods. We then propose an approximate ZS-BCCG method and a corresponding two-phase ZS-BCCG method. This is also the first time that a two-phase BCCG method has been developed to carry out the complexity analysis of nonconvex composite optimization problems.
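To illustrate the core idea behind the methods described above, the following is a minimal sketch of one step of zeroth-order block coordinate descent: a random block of coordinates is chosen, the partial gradient on that block is estimated purely from function evaluations (a two-point Gaussian-smoothed finite difference), and only that block is updated. This is a hedged, simplified illustration under generic assumptions (fixed step size, unconstrained quadratic objective), not the paper's exact ZS-BCD algorithm or its complexity-optimal parameter choices; all names (`zs_bcd_step`, `mu`, `step`) are illustrative.

```python
import numpy as np

def zs_bcd_step(f, x, blocks, mu=1e-4, step=0.1, rng=None):
    """One illustrative zeroth-order block coordinate descent step.

    Only two evaluations of f are used: the block-restricted gradient is
    estimated by a finite difference along a random Gaussian direction
    supported on the chosen block, and only that block is updated.
    """
    rng = np.random.default_rng() if rng is None else rng
    idx = blocks[rng.integers(len(blocks))]     # pick a random block of coordinates
    u = np.zeros_like(x)
    u[idx] = rng.standard_normal(len(idx))      # random direction supported on the block
    g = (f(x + mu * u) - f(x)) / mu             # two-point zeroth-order estimate of grad(f) . u
    x_new = x.copy()
    x_new[idx] -= step * g * u[idx]             # descend on the chosen block only
    return x_new

# Usage: minimize a simple separable quadratic split into two blocks.
f = lambda x: float(np.sum(x ** 2))
x = np.ones(4)
blocks = [np.array([0, 1]), np.array([2, 3])]
rng = np.random.default_rng(0)
for _ in range(500):
    x = zs_bcd_step(f, x, blocks, rng=rng)
```

The point of the sketch is the cost structure the abstract emphasizes: each iteration touches only one block of variables and needs no gradient of `f`, only function values.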

Original language: English
Pages (from-to): 777-817
Number of pages: 41
Journal: Analysis and Applications
Issue number: 3
Early online date: 28 Dec 2022
Publication status: Published - May 2023

Scopus Subject Areas

  • Analysis
  • Applied Mathematics

User-Defined Keywords

  • Coordinate descent
  • stochastic optimization
  • mirror descent
  • conditional gradient
  • regularized learning
  • zeroth-order method
