Abstract
Self-scaling quasi-Newton methods for unconstrained optimization update the Hessian approximation by a formula that depends on two parameters (say, τ and θ), chosen so that τ = 1 yields the unscaled Broyden family, θ = 0 yields the BFGS update, and θ = 1 yields the DFP update. In previous work, conditions on these parameters were obtained that imply global and superlinear convergence of self-scaling methods on convex objective functions. This paper discusses the practical performance of several new algorithms designed to satisfy these conditions.
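For context, a minimal sketch of a two-parameter self-scaled update in the form commonly used in this literature; the symbols s_k, y_k, B_k and the exact placement of the scaling parameter τ_k are assumptions for illustration, not taken from the paper itself:

```latex
% A common form of the self-scaled Broyden family update for the Hessian
% approximation B_k, with step s_k = x_{k+1} - x_k and gradient change
% y_k = g_{k+1} - g_k (assumed notation). Setting \tau_k = 1 recovers the
% unscaled Broyden family; \theta_k = 0 gives BFGS and \theta_k = 1 gives DFP.
B_{k+1} = \tau_k \left[ B_k
        - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k}
        + \theta_k \, (s_k^{\top} B_k s_k) \, w_k w_k^{\top} \right]
        + \frac{y_k y_k^{\top}}{y_k^{\top} s_k},
\qquad
w_k = \frac{y_k}{y_k^{\top} s_k} - \frac{B_k s_k}{s_k^{\top} B_k s_k}.
```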
| Original language | English |
| --- | --- |
| Pages (from-to) | 533-553 |
| Number of pages | 21 |
| Journal | Journal of Optimization Theory and Applications |
| Volume | 96 |
| Issue number | 3 |
| Publication status | Published - Mar 1998 |
Keywords
- Broyden family
- Global and superlinear convergence
- Inexact line searches
- Quasi-Newton methods
- Self-scaling methods
- Unconstrained optimization
ASJC Scopus subject areas
- Management Science and Operations Research
- Applied Mathematics
- Control and Optimization