Abstract
In this paper, we propose a continuous Newton-type method in the form of an ordinary differential equation (ODE) that combines the negative gradient and the Newton direction. We show that, for a general function f(x), the method converges globally to a connected subset of the stationary points of f(x) under mild conditions, and converges globally to a single stationary point when f(x) is real analytic. The method reduces to the exact continuous Newton method when the Hessian matrix of f(x) is uniformly positive definite. We report on the convergence of the new method on a set of standard test problems from the literature.
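The abstract's idea can be sketched numerically. The specific ODE is not given here, so the sketch below makes an assumption: it combines the negative gradient and the Newton direction through a hypothetical eigenvalue-shift regularization of the Hessian, which vanishes when the Hessian is positive definite, so the flow then reduces to the exact continuous Newton method dx/dt = -H(x)^{-1}∇f(x), as the abstract describes. This is an illustration of the general technique, not the paper's method.

```python
# Illustrative sketch only (assumed formulation, not the paper's exact ODE):
# forward-Euler integration of  dx/dt = -(H(x) + eps*I)^{-1} grad f(x),
# where eps >= 0 shifts the Hessian toward positive definiteness.
import numpy as np

def newton_flow_step(x, grad, hess, h=0.1, delta=1e-8):
    """One forward-Euler step of the regularized Newton flow.

    eps is chosen so that H + eps*I is positive definite; when H is
    already uniformly positive definite, eps = 0 and the step follows
    the exact continuous Newton direction.
    """
    H = hess(x)
    lam_min = np.linalg.eigvalsh(H).min()
    eps = max(0.0, delta - lam_min)            # zero when H is already PD
    d = np.linalg.solve(H + eps * np.eye(len(x)), grad(x))
    return x - h * d                            # combined descent direction

# Strictly convex quadratic test: f(x) = 0.5 (x - x*)^T A (x - x*),
# whose Hessian A is positive definite, so eps = 0 throughout.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
x_star = np.array([1.0, -2.0])
grad = lambda x: A @ (x - x_star)
hess = lambda x: A

x = np.array([5.0, 5.0])
for _ in range(300):
    x = newton_flow_step(x, grad, hess)
print(x)  # approaches the minimizer x* = (1, -2)
```

On this quadratic the Newton direction is exactly x - x*, so each Euler step contracts the error by the factor 1 - h, illustrating the global convergence to a single stationary point that the abstract asserts for well-behaved functions.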
| Original language | English |
| --- | --- |
| Pages (from-to) | 259-277 |
| Number of pages | 19 |
| Journal | Pacific Journal of Optimization |
| Volume | 4 |
| Issue number | 2 |
| Publication status | Published - May 2008 |
Scopus Subject Areas
- Control and Optimization
- Computational Mathematics
- Applied Mathematics
User-Defined Keywords
- Continuous method
- Global convergence
- ODE method
- Pseudotransient continuation
- Unconstrained optimization