New parallel descent-like method for solving a class of variational inequalities

Z. K. Jiang, X. M. Yuan

Research output: Contribution to journal › Journal article › peer-review

19 Citations (Scopus)

Abstract

To solve a class of variational inequalities with separable structures, classical methods such as the augmented Lagrangian method and the alternating direction methods require solving two sub-variational inequalities at each iteration. A recent work (B. S. He in Comput. Optim. Appl. 42(2):195-212, 2009) improved these classical methods by allowing the sub-variational inequalities arising at each iteration to be solved in parallel, at the price of executing an additional descent step. This paper develops that strategy further by refining the descent directions used in the descent steps, while preserving the characteristics that make the method well suited to parallel computing. Convergence of the new parallel descent-like method is proved under the same mild assumptions on the problem data.
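For readers unfamiliar with the problem class, a variational inequality with separable structure is commonly stated in the form below. This is a sketch of the standard setting used in the cited 2009 paper by He; the symbols f, g, A, B, b, X and Y are illustrative notation and are not quoted from this article.

% Standard form of a structured variational inequality with a separable
% operator and linearly coupled variables (illustrative notation only).
\text{Find } u^{*}=(x^{*},y^{*})\in\Omega \text{ such that } (u-u^{*})^{\top}F(u^{*})\ \ge\ 0 \quad \forall\, u\in\Omega,
\quad\text{where}\quad
F(u)=\begin{pmatrix} f(x)\\ g(y)\end{pmatrix},
\qquad
\Omega=\bigl\{(x,y)\ \bigm|\ x\in\mathcal{X},\ y\in\mathcal{Y},\ Ax+By=b\bigr\}.

Under this separable structure, methods of the kind discussed in the abstract decompose the problem into one sub-variational inequality in x and one in y at each iteration; the parallel strategy solves these two subproblems simultaneously and then applies a descent (correction) step.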

Original language: English
Pages (from-to): 311-323
Number of pages: 13
Journal: Journal of Optimization Theory and Applications
Volume: 145
Issue number: 2
DOIs
Publication status: Published - May 2010

Scopus Subject Areas

  • Control and Optimization
  • Management Science and Operations Research
  • Applied Mathematics

User-Defined Keywords

  • Alternating direction methods
  • Augmented Lagrangian method
  • Descent-like methods
  • Parallel computing
  • Variational inequalities
