Generalized alternating direction method of multipliers: new theoretical insights and applications

Ethan X. Fang, Bingsheng He, Han Liu, Xiaoming Yuan*

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

72 Citations (Scopus)

Abstract

Recently, the alternating direction method of multipliers (ADMM) has received intensive attention from a broad spectrum of areas. The generalized ADMM (GADMM) proposed by Eckstein and Bertsekas is a simple and efficient acceleration scheme for ADMM. In this paper, we take a deeper look at the linearized version of GADMM, in which one of its subproblems is approximated by a linearization strategy. This linearized version is particularly efficient for a number of applications arising from different areas. Theoretically, we establish the worst-case $\mathcal{O}(1/k)$ convergence rate measured by the iteration complexity ($k$ represents the iteration counter) in both the ergodic and nonergodic senses for the linearized version of GADMM. Numerically, we demonstrate the efficiency of this linearized version of GADMM on several recent and important applications in statistical learning. Code packages in Matlab for these applications are also developed.
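For readers unfamiliar with the scheme, the following is a minimal sketch of the Eckstein–Bertsekas generalized ADMM iteration for the generic two-block model $\min_{x,y}\{f(x)+g(y)\,:\,Ax+By=b\}$, written with penalty parameter $\beta>0$, multiplier $\lambda$, and relaxation factor $\alpha\in(0,2)$; the notation is illustrative and may differ from the exact formulation used in the paper.

\[
\begin{aligned}
x^{k+1} &\in \operatorname*{arg\,min}_{x}\Bigl\{\, f(x) + \tfrac{\beta}{2}\bigl\|Ax + By^{k} - b + \lambda^{k}/\beta \bigr\|^{2} \Bigr\},\\
y^{k+1} &\in \operatorname*{arg\,min}_{y}\Bigl\{\, g(y) + \tfrac{\beta}{2}\bigl\|\alpha Ax^{k+1} - (1-\alpha)\bigl(By^{k} - b\bigr) + By - b + \lambda^{k}/\beta \bigr\|^{2} \Bigr\},\\
\lambda^{k+1} &= \lambda^{k} + \beta\bigl(\alpha Ax^{k+1} - (1-\alpha)\bigl(By^{k} - b\bigr) + By^{k+1} - b\bigr).
\end{aligned}
\]

Setting $\alpha = 1$ recovers the standard ADMM. In the linearized variant studied in the paper, the quadratic coupling term in one subproblem is approximated so that the subproblem admits an inexpensive (often closed-form) solution; a common way to express this, assumed here for illustration, is to add a proximal term of the form $\tfrac{1}{2\tau}\|y - y^{k}\|^{2}$ with a suitably chosen proximal parameter $\tau$.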

Original language: English
Pages (from-to): 149-187
Number of pages: 39
Journal: Mathematical Programming Computation
Volume: 7
Issue number: 2
Early online date: 6 Feb 2015
DOIs
Publication status: Published - Jun 2015

Scopus Subject Areas

  • Theoretical Computer Science
  • Software

User-Defined Keywords

  • Alternating direction method of multipliers
  • Convergence rate
  • Convex optimization
  • Discriminant analysis
  • Statistical learning
  • Variable selection
