Abstract
Extragradient-type methods are a class of efficient direct methods for solving monotone variational inequalities. Since they require only function evaluations, they are widely applied to black-box models. In methods of this type, the distance between the iterate and a fixed solution point decreases over the iterations. Moreover, in each iteration, the negative increment of this squared distance admits a differentiable concave lower bound function whose formula does not involve any solution. In this paper, we investigate some properties of this lower bound. Our study reveals that the lower bound yields a steplength domain that guarantees the convergence of the entire algorithm. Based on these results, we present two new steplengths. One involves the projection onto the tangent cone and requires no line search; the other can be computed by finding the positive root of a one-dimensional concave lower bound function. Our preliminary numerical results confirm and illustrate the attractiveness of our contributions.
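As a point of reference, the classical (Korpelevich) extragradient iteration the abstract builds on can be sketched as below. The toy operator `F`, the feasible set `C`, and the fixed steplength `tau` are illustrative assumptions for demonstration only; in particular, this sketch does not reproduce the two new steplength rules proposed in the paper.

```python
def F(x):
    """A toy monotone (skew-symmetric) operator F(x) = (x2, -x1).

    Its unique zero, and the unique solution of the variational
    inequality VI(F, C) below, is x* = (0, 0).
    """
    return (x[1], -x[0])


def project(x):
    """Euclidean projection onto the box C = [-1, 1]^2 (coordinatewise clamp)."""
    return (max(-1.0, min(1.0, x[0])),
            max(-1.0, min(1.0, x[1])))


def extragradient(x, tau=0.5, iters=400):
    """Two-projection extragradient iteration for VI(F, C):

        y_k     = P_C(x_k - tau * F(x_k))   (predictor step)
        x_{k+1} = P_C(x_k - tau * F(y_k))   (corrector step)

    For a monotone, L-Lipschitz F, a fixed tau < 1/L guarantees
    convergence; here L = 1, so tau = 0.5 is admissible.
    """
    for _ in range(iters):
        fx = F(x)
        y = project((x[0] - tau * fx[0], x[1] - tau * fx[1]))
        fy = F(y)
        x = project((x[0] - tau * fy[0], x[1] - tau * fy[1]))
    return x


x_final = extragradient((0.5, 0.5))
print(x_final)  # converges toward the solution (0, 0)
```

Note that only evaluations of `F` and projections onto `C` are used, which is what makes methods of this type suitable for black-box models.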
Original language | English |
---|---|
Pages (from-to) | 2925-2939 |
Number of pages | 15 |
Journal | Journal of Computational and Applied Mathematics |
Volume | 233 |
Issue number | 11 |
DOIs | |
Publication status | Published - 1 Apr 2010 |
Scopus Subject Areas
- Computational Mathematics
- Applied Mathematics
User-Defined Keywords
- Black-box model
- Extragradient type methods
- Monotone variational inequalities
- Projection