Abstract
Combined with non-monotone line search, the Barzilai and Borwein (BB) gradient method has been successfully extended to solving unconstrained optimization problems and is competitive with conjugate gradient methods. In this paper, we establish the R-linear convergence of the BB method for strongly convex quadratics of any dimension. One corollary of this result is that the BB method is also locally R-linearly convergent for general objective functions, and hence the stepsize in the BB method will always be accepted by the non-monotone line search when the iterate is close to the solution.
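For orientation, the BB iteration takes a plain gradient step x_{k+1} = x_k - alpha_k g_k, but with the stepsize alpha_k = s^T s / s^T y built from the previous displacement s and gradient change y. The following is a minimal Python sketch of that iteration on a strictly convex quadratic; the function name `bb_gradient`, the choice of first stepsize, and the test problem are illustrative assumptions, and the non-monotone line search used for general objectives is omitted.

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-10, max_iter=500):
    """Barzilai-Borwein gradient iteration for the strictly convex
    quadratic f(x) = 0.5 x^T A x - b^T x, with A symmetric positive definite.

    Uses the BB stepsize alpha_k = s^T s / s^T y, where
    s = x_k - x_{k-1} and y = g_k - g_{k-1}.
    """
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                      # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)    # simple first-step choice (an assumption)
    k = 0
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g          # plain gradient step with BB stepsize
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)      # BB stepsize for the next iteration
        x, g = x_new, g_new
    return x, k

# Example: a 50-dimensional strictly convex quadratic
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)          # symmetric positive definite matrix
b = rng.standard_normal(50)
x_star, iters = bb_gradient(A, b, np.zeros(50))
print(iters, np.linalg.norm(A @ x_star - b))
```

On quadratics of this kind the residual typically decreases non-monotonically but converges, which is the behaviour the R-linear convergence result in the paper describes.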
| Original language | English |
| --- | --- |
| Pages (from-to) | 1-10 |
| Number of pages | 10 |
| Journal | IMA Journal of Numerical Analysis |
| Volume | 22 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Jan 2002 |
Scopus Subject Areas
- Mathematics (all)
- Computational Mathematics
- Applied Mathematics
User-Defined Keywords
- Gradient method
- R-linear convergence
- Strictly convex
- Unconstrained optimization