TY - JOUR
T1 - On the Proximal Jacobian Decomposition of ALM for Multiple-Block Separable Convex Minimization Problems and Its Relationship to ADMM
AU - He, Bingsheng
AU - Xu, Hong Kun
AU - Yuan, Xiaoming
N1 - Funding Information:
The authors were supported, respectively, by NSFC Grant 10471056, in part by NSC Grant 102-2115-M-110-001-MY3, and in part by the General Research Fund from the Hong Kong Research Grants Council (Grant 203613).
PY - 2016/3
Y1 - 2016/3
N2 - The augmented Lagrangian method (ALM) is a benchmark for solving convex minimization problems with linear constraints. When the objective function of the model under consideration is representable as the sum of several functions without coupled variables, a Jacobian or Gauss–Seidel decomposition is often applied to the ALM subproblems so that the functions’ properties can be exploited more effectively in algorithmic design. The Gauss–Seidel decomposition of ALM has resulted in the very popular alternating direction method of multipliers (ADMM) for two-block separable convex minimization models, and recently it was shown in He et al. (Optimization Online, 2013) that the Jacobian decomposition of ALM is not necessarily convergent. In this paper, we show that if each subproblem of the Jacobian decomposition of ALM is regularized by a proximal term and the proximal coefficient is sufficiently large, the resulting scheme, to be called the proximal Jacobian decomposition of ALM, is convergent. We also show that an interesting application of the ADMM in Wang et al. (Pac J Optim, to appear), which first reformulates a multiple-block separable convex minimization model as a two-block counterpart and then applies the original ADMM directly, is closely related to the proximal Jacobian decomposition of ALM. Our analysis is conducted in the variational inequality context and is rooted in a good understanding of the proximal point algorithm.
AB - The augmented Lagrangian method (ALM) is a benchmark for solving convex minimization problems with linear constraints. When the objective function of the model under consideration is representable as the sum of several functions without coupled variables, a Jacobian or Gauss–Seidel decomposition is often applied to the ALM subproblems so that the functions’ properties can be exploited more effectively in algorithmic design. The Gauss–Seidel decomposition of ALM has resulted in the very popular alternating direction method of multipliers (ADMM) for two-block separable convex minimization models, and recently it was shown in He et al. (Optimization Online, 2013) that the Jacobian decomposition of ALM is not necessarily convergent. In this paper, we show that if each subproblem of the Jacobian decomposition of ALM is regularized by a proximal term and the proximal coefficient is sufficiently large, the resulting scheme, to be called the proximal Jacobian decomposition of ALM, is convergent. We also show that an interesting application of the ADMM in Wang et al. (Pac J Optim, to appear), which first reformulates a multiple-block separable convex minimization model as a two-block counterpart and then applies the original ADMM directly, is closely related to the proximal Jacobian decomposition of ALM. Our analysis is conducted in the variational inequality context and is rooted in a good understanding of the proximal point algorithm.
KW - Alternating direction method of multipliers
KW - Augmented Lagrangian method
KW - Convex optimization
KW - Jacobian decomposition
KW - Parallel computation
KW - Proximal point algorithm
KW - Variational inequality problem
UR - http://www.scopus.com/inward/record.url?scp=84957441620&partnerID=8YFLogxK
U2 - 10.1007/s10915-015-0060-1
DO - 10.1007/s10915-015-0060-1
M3 - Journal article
AN - SCOPUS:84957441620
SN - 0885-7474
VL - 66
SP - 1204
EP - 1217
JO - Journal of Scientific Computing
JF - Journal of Scientific Computing
IS - 3
ER -