Steplengths in the extragradient type methods

Xiang Wang, Bingsheng He, Lizhi Liao*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

The extragradient type methods are a class of efficient direct methods for solving monotone variational inequalities. These methods require only function evaluations and are therefore widely applied to black-box models. In methods of this type, the distance between the iterate and a fixed solution point decreases with each iteration. Furthermore, in each iteration, the negative increment of this squared distance admits a differentiable concave lower-bound function whose formula involves no solution point. In this paper, we investigate some properties of this lower bound. Our study reveals that the lower bound yields a steplength domain that guarantees the convergence of the entire algorithm. Based on these results, we present two new steplengths. One involves the projection onto the tangent cone without line search, while the other can be computed by finding the positive root of a one-dimensional concave lower-bound function. Our preliminary numerical results confirm and illustrate the attractiveness of our contributions.
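For context, the classical extragradient scheme the paper builds on can be sketched as follows. This is a minimal illustration assuming a fixed steplength (the paper's contribution is precisely to choose better steplengths); the function names `F`, `project`, and all parameter values are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def extragradient(F, project, x0, step=0.1, tol=1e-8, max_iter=10000):
    """Classical (Korpelevich-type) extragradient method for the
    variational inequality VI(F, C): find x* in C such that
    <F(x*), x - x*> >= 0 for all x in C.
    Uses only evaluations of F and projections onto C, which is why
    such methods apply to black-box models."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Predictor: a projected step from the current iterate
        y = project(x - step * F(x))
        # Corrector: re-project using F evaluated at the predictor
        x_new = project(x - step * F(y))
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```

For example, with the monotone affine mapping F(x) = x - b and C the nonnegative orthant (projection = componentwise max with 0), the iterates converge to max(b, 0).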

Original language: English
Pages (from-to): 2925-2939
Number of pages: 15
Journal: Journal of Computational and Applied Mathematics
Volume: 233
Issue number: 11
Publication status: Published - 1 Apr 2010

Scopus Subject Areas

  • Computational Mathematics
  • Applied Mathematics

User-Defined Keywords

  • Black-box model
  • Extragradient type methods
  • Monotone variational inequalities
  • Projection
