Numerical experience with a class of self-scaling quasi-Newton algorithms

M. Al-Baali*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

32 Citations (Scopus)


Self-scaling quasi-Newton methods for unconstrained optimization update the Hessian approximation by a formula involving two parameters (say, τ and θ), chosen so that τ = 1, θ = 0, and θ = 1 yield the unscaled Broyden family, the BFGS update, and the DFP update, respectively. In previous work, conditions were obtained on these parameters that imply global and superlinear convergence of self-scaling methods on convex objective functions. This paper discusses the practical performance of several new algorithms designed to satisfy these conditions.
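The two-parameter family described in the abstract can be sketched as follows. The minimal Python sketch below implements the standard scaled Broyden-family update of the Hessian approximation B, in which τ multiplies the part of the update that carries old curvature information; the Oren–Luenberger-type choice τ = yᵀs / (sᵀBs) used in the driver is one common scaling option, an assumption for illustration, and not necessarily the parameter selections tested in the paper.

```python
import numpy as np

def scaled_broyden_update(B, s, y, tau=1.0, theta=0.0):
    """One self-scaling Broyden-family update of the Hessian approximation B.

    tau = 1 gives the unscaled family; theta = 0 gives the (scaled) BFGS
    update and theta = 1 the (scaled) DFP update, matching the abstract.
    """
    Bs = B @ s
    sBs = s @ Bs           # curvature of the model along s
    ys = y @ s             # must be positive for a convex problem
    w = np.sqrt(sBs) * (y / ys - Bs / sBs)
    return tau * (B - np.outer(Bs, Bs) / sBs + theta * np.outer(w, w)) \
           + np.outer(y, y) / ys

def minimize_quadratic(A, x0, theta=0.0, scale=True, tol=1e-10, max_iter=50):
    """Quasi-Newton minimization of f(x) = 0.5 x^T A x (A symmetric
    positive definite) with an exact line search, starting from B = I."""
    x = x0.astype(float)
    B = np.eye(len(x0))
    for _ in range(max_iter):
        g = A @ x
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(B, -g)            # quasi-Newton search direction
        alpha = -(g @ d) / (d @ (A @ d))      # exact step for a quadratic
        s = alpha * d
        x_new = x + s
        y = A @ x_new - g                     # gradient difference
        # Oren-Luenberger-type scaling (an illustrative choice of tau only).
        tau = (y @ s) / (s @ (B @ s)) if scale else 1.0
        B = scaled_broyden_update(B, s, y, tau=tau, theta=theta)
        x = x_new
    return x
```

For example, `minimize_quadratic(np.diag([1.0, 10.0, 100.0]), np.ones(3))` drives the gradient to (numerical) zero in a handful of iterations, for both the scaled BFGS member (θ = 0) and scaled DFP (θ = 1).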

Original language: English
Pages (from-to): 533-553
Number of pages: 21
Journal: Journal of Optimization Theory and Applications
Issue number: 3
Publication status: Published - Mar 1998


Keywords

  • Broyden family
  • Global and superlinear convergence
  • Inexact line searches
  • Quasi-Newton methods
  • Self-scaling methods
  • Unconstrained optimization

ASJC Scopus subject areas

  • Management Science and Operations Research
  • Control and Optimization
  • Applied Mathematics


