On restart procedures for the conjugate gradient method

Yu Hong Dai*, Lizhi Liao, Duan Li

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

46 Citations (Scopus)


The conjugate gradient method is a powerful scheme for solving unconstrained optimization problems, especially large-scale ones. However, the convergence rate of the method without restart is only linear. In this paper, we build on an idea contained in [16] and present a new restart technique for this method. Given an arbitrary descent direction d_t and the gradient g_t, the key idea is to use the BFGS updating formula to provide a symmetric positive definite matrix P_t such that d_t = -P_t g_t, and then to define the conjugate gradient iteration in the transformed space. Two conjugate gradient algorithms are designed based on the new restart technique. Their global convergence is proved under mild assumptions on the objective function. Numerical experiments are also reported, showing that the two algorithms are comparable to the Beale-Powell restart algorithm.
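The paper's BFGS-based restart (building the matrix P_t and iterating in the transformed space) is not reproduced here, but the role of restarting in nonlinear conjugate gradient methods can be illustrated with a minimal sketch: Fletcher-Reeves CG that periodically resets the search direction to steepest descent. All function names and the Armijo backtracking line search below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def backtracking(f, grad, x, d, alpha=1.0, rho=0.5, c=1e-4):
    # Armijo backtracking line search (illustrative, not the paper's)
    fx, gd = f(x), grad(x) @ d
    while f(x + alpha * d) > fx + c * alpha * gd:
        alpha *= rho
    return alpha

def cg_with_restart(f, grad, x0, restart_every, max_iter=200, tol=1e-8):
    """Fletcher-Reeves nonlinear CG, restarted to steepest descent
    every `restart_every` iterations. A sketch of periodic restarting
    only; the paper instead restarts via a BFGS-derived matrix P_t."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        if (k + 1) % restart_every == 0:
            d = -g_new                      # restart: pure steepest descent
        else:
            beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves coefficient
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On a small strictly convex quadratic f(x) = 0.5 x^T A x - b^T x, the iterates converge to the solution of A x = b; without some restart rule, nonlinear CG on general objectives can lose conjugacy and degrade to linear convergence, which is the issue the paper's technique addresses.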

Original language: English
Pages (from-to): 249-260
Number of pages: 12
Journal: Numerical Algorithms
Issue number: 2-4
Publication status: Published - Apr 2004

Scopus Subject Areas

  • Applied Mathematics

User-Defined Keywords

  • BFGS updating formula
  • Conjugate gradient method
  • Global convergence
  • Restart
  • Unconstrained optimization
