Abstract
Recently, we have proposed combining the alternating direction method of multipliers (ADMM) with a Gaussian back substitution procedure for solving the convex minimization model with linear constraints and a general separable objective function, i.e., an objective that is the sum of many functions without coupled variables. In this paper, we further study this topic and show that the decomposed subproblems in the ADMM procedure can be substantially simplified by linearizing the quadratic terms arising from the augmented Lagrangian penalty. When the resolvent operators of the separable functions in the objective have closed-form representations, this linearization is necessary to reduce the ADMM subproblems to resolvent evaluations and thus to obtain easy subproblems with closed-form solutions. We show theoretically that the blend of ADMM, Gaussian back substitution and linearization works effectively for the separable convex minimization model under consideration.
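To make the structure referred to in the abstract concrete, the following LaTeX sketch writes out a generic m-block separable model, its augmented Lagrangian, and the kind of linearized (proximal) subproblem that reduces each ADMM step to a single resolvent evaluation. The notation (θ_i, A_i, b, β, τ_i, c_i^k, r_i^k) and the step-size remark are illustrative assumptions, not taken verbatim from the paper; the paper itself specifies the precise updates, including the Gaussian back substitution correction.

```latex
% A minimal sketch of the problem structure and of a linearized (proximal) subproblem.
% The notation (theta_i, A_i, b, beta, tau_i, c_i^k) and the sign conventions are
% assumptions made for illustration; the exact step sizes, block ordering and the
% Gaussian back substitution correction are specified in the paper itself.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Separable convex model with m blocks and linear coupling constraints:
\[
  \min\Big\{ \sum_{i=1}^{m}\theta_i(x_i) \;\Big|\; \sum_{i=1}^{m}A_i x_i = b,\;
             x_i\in\mathcal{X}_i,\ i=1,\dots,m \Big\}.
\]

% Augmented Lagrangian with multiplier lambda and penalty parameter beta > 0:
\[
  \mathcal{L}_{\beta}(x_1,\dots,x_m,\lambda)
  = \sum_{i=1}^{m}\theta_i(x_i)
    - \lambda^{\top}\Big(\sum_{i=1}^{m}A_i x_i - b\Big)
    + \frac{\beta}{2}\Big\|\sum_{i=1}^{m}A_i x_i - b\Big\|^{2}.
\]

% In the ADMM sweep, block i minimizes theta_i(x_i) plus a quadratic in A_i x_i
% (the other blocks are fixed and collected in c_i^k). Linearizing that quadratic at
% x_i^k and adding the proximal term (tau_i/2)||x_i - x_i^k||^2 turns the subproblem
% into one evaluation of the resolvent (proximal mapping) of theta_i:
\[
  x_i^{k+1}
  = \operatorname*{arg\,min}_{x_i\in\mathcal{X}_i}
    \Big\{ \theta_i(x_i)
           + \beta\,\big(A_i^{\top} r_i^{k}\big)^{\top} x_i
           + \frac{\tau_i}{2}\,\|x_i - x_i^{k}\|^{2} \Big\}
  = \operatorname{prox}_{\theta_i/\tau_i}\!
    \Big( x_i^{k} - \frac{\beta}{\tau_i}\,A_i^{\top} r_i^{k} \Big),
\]
% where r_i^k := A_i x_i^k + c_i^k - lambda^k/beta is the shifted constraint residual,
% prox_{theta_i/tau_i}(v) := argmin over x_i in X_i of theta_i(x_i) + (tau_i/2)||x_i - v||^2,
% and tau_i is taken sufficiently large relative to beta ||A_i^T A_i|| so that the
% linearization majorizes the quadratic. The vector produced by this prediction sweep is
% then corrected by the Gaussian back substitution step (a triangular update), not shown here.

\end{document}
```

When θ_i is, say, the l1-norm or an indicator of a simple set, the proximal mapping above is available in closed form, which is exactly the situation in which the linearized subproblems become trivial to solve.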
| Original language | English |
| --- | --- |
| Pages (from-to) | 247-260 |
| Number of pages | 14 |
| Journal | Numerical Algebra, Control and Optimization |
| Volume | 3 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - Apr 2013 |
Scopus Subject Areas
- Algebra and Number Theory
- Control and Optimization
- Applied Mathematics
User-Defined Keywords
- Alternating direction method of multipliers
- Gaussian back substitution
- Linearization
- Resolvent operator
- Separable convex programming