Abstract
The conjugate gradient method is a powerful scheme for solving unconstrained optimization problems, especially large-scale ones. However, without restarts its convergence rate is only linear. In this paper, we consider an idea contained in [16] and present a new restart technique for the method. Given an arbitrary descent direction d_t and the gradient g_t, our key idea is to use the BFGS updating formula to construct a symmetric positive definite matrix P_t such that d_t = -P_t g_t, and then to define the conjugate gradient iteration in the transformed space. Two conjugate gradient algorithms are designed based on the new restart technique. Their global convergence is proved under mild assumptions on the objective function. Numerical experiments are also reported, which show that the two algorithms are comparable to the Beale-Powell restart algorithm.
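The abstract does not spell out the update, but one natural way to realize d_t = -P_t g_t with the BFGS formula is a single BFGS update of the identity with the choices s = d_t and y = -g_t, which is well defined whenever d_t is a descent direction (g_t^T d_t < 0, so that y^T s > 0 and the update preserves positive definiteness). The Python/NumPy sketch below checks this construction numerically; the function name `bfgs_transform` and the particular choice of s and y are illustrative assumptions, not necessarily the construction used in the paper.

```python
import numpy as np

def bfgs_transform(d, g):
    """Build a symmetric positive definite P with d = -P @ g by applying
    one BFGS (inverse-Hessian) update to the identity matrix.

    Minimal sketch of the idea in the abstract. The pairing s = d, y = -g
    is an assumption for illustration; it requires g @ d < 0, i.e. d is a
    descent direction, so that rho = 1 / (y @ s) > 0.
    """
    s, y = d, -g
    rho = 1.0 / (y @ s)                  # positive since g @ d < 0
    I = np.eye(len(d))
    V = I - rho * np.outer(s, y)
    # Standard BFGS update of H = I:  P = V H V^T + rho * s s^T
    return V @ V.T + rho * np.outer(s, s)

# Quick numerical check on random data with a forced descent direction.
rng = np.random.default_rng(0)
g = rng.standard_normal(5)
d = -g + 0.1 * rng.standard_normal(5)    # perturbed steepest descent
assert g @ d < 0                         # d is a descent direction
P = bfgs_transform(d, g)
print(np.allclose(-P @ g, d))            # True: d = -P g as required
print(np.all(np.linalg.eigvalsh(P) > 0)) # True: P is positive definite
```

With this pairing, a direct expansion gives (I - rho * y s^T) g = 0, so P g reduces to rho * d (d^T g) = -d; one can then run the conjugate gradient iteration in the space transformed by P, which is the setting the abstract describes.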
| Original language | English |
| --- | --- |
| Pages (from-to) | 249-260 |
| Number of pages | 12 |
| Journal | Numerical Algorithms |
| Volume | 35 |
| Issue number | 2-4 |
| DOIs | |
| Publication status | Published - Apr 2004 |
Scopus Subject Areas
- Applied Mathematics
User-Defined Keywords
- BFGS updating formula
- Conjugate gradient method
- Global convergence
- Restart
- Unconstrained optimization