Numerical experience with a class of self-scaling quasi-Newton algorithms

Research output: Contribution to journal › Article

28 Citations (Scopus)

Abstract

Self-scaling quasi-Newton methods for unconstrained optimization update the Hessian approximation by a formula that depends on two parameters (say, τ and θ), such that τ = 1, θ = 0, and θ = 1 yield the unscaled Broyden family, the BFGS update, and the DFP update, respectively. In previous work, conditions were obtained on these parameters that imply global and superlinear convergence for self-scaling methods on convex objective functions. This paper discusses the practical performance of several new algorithms designed to satisfy these conditions.
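The two-parameter family the abstract refers to can be illustrated by a minimal sketch, assuming the standard self-scaling Broyden-family update of the direct Hessian approximation (the function name and test vectors are ours, not from the paper; τ scales the Broyden part while the yyᵀ/(yᵀs) term is left unscaled, so θ = 0 and θ = 1 recover BFGS and DFP as stated):

```python
import numpy as np

def self_scaling_update(B, s, y, tau=1.0, theta=0.0):
    """One self-scaling Broyden-family update of the Hessian approximation B.

    s = x_{k+1} - x_k (step), y = g_{k+1} - g_k (gradient change).
    tau = 1 gives the unscaled Broyden family; within that family,
    theta = 0 is the BFGS update and theta = 1 is the DFP update.
    Assumes the curvature condition s @ y > 0.
    """
    Bs = B @ s
    sBs = s @ Bs          # s' B s  (positive if B is positive definite)
    sy = s @ y            # s' y    (curvature; must be positive)
    w = y / sy - Bs / sBs # the Broyden-family vector
    broyden = B - np.outer(Bs, Bs) / sBs + theta * sBs * np.outer(w, w)
    # The y y'/(s'y) term is kept unscaled, so B_new @ s = y (secant
    # condition) holds for every choice of tau and theta.
    return tau * broyden + np.outer(y, y) / sy
```

A quick property check: since wᵀs = 0, the bracketed Broyden part annihilates s, so the secant condition B₊s = y holds for all τ and θ, which is why scaling only that part preserves the quasi-Newton equation.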

Original language: English
Pages (from-to): 533-553
Number of pages: 21
Journal: Journal of Optimization Theory and Applications
Volume: 96
Issue number: 3
Publication status: Published - Mar 1998

Keywords

  • Broyden family
  • Global and superlinear convergence
  • Inexact line searches
  • Quasi-Newton methods
  • Self-scaling methods
  • Unconstrained optimization

ASJC Scopus subject areas

  • Management Science and Operations Research
  • Applied Mathematics
  • Control and Optimization

