Variational quasi-Newton methods for unconstrained optimization

M. Al-Baali*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)

Abstract

In this paper, we propose new members of the Broyden family of quasi-Newton methods. We develop, on the basis of well-known least-change results for the BFGS and DFP updates, a measure for the Broyden family which seeks to take into account the change in both the Hessian approximation and its inverse. The proposal is then to choose the formula which gives the least value of this measure in terms of the two parameters available, and hence to produce an update which is optimal in the sense of the given measure. Several approaches to the problem of minimizing the measure are considered, from which new updates are obtained. In particular, one approach yields a new variational result for the Davidon optimally conditioned method and another yields a reasonable modification to this method. The paper is also concerned with the possibility of estimating, in a certain sense, the size of the eigenvalues of the Hessian approximation on the basis of two available scalars. This allows one to derive further modifications to the above-mentioned methods. Comparisons with the BFGS and Davidon methods are made on a set of standard test problems and show promising results for certain new methods.
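The Broyden family referred to in the abstract is the standard one-parameter class of symmetric secant updates that contains BFGS and DFP as special cases. The sketch below, using conventional notation (it does not reproduce the paper's variational rule for choosing the parameter, which is here left as a free input `phi`), shows the family update of a Hessian approximation B from a step `s` and gradient change `y`:

```python
import numpy as np

def broyden_update(B, s, y, phi):
    """One-parameter Broyden-family update of a Hessian approximation B.

    phi = 0.0 gives the BFGS update and phi = 1.0 the DFP update;
    other values give further family members.  The paper's proposal is
    to pick the parameter(s) minimizing a least-change measure; that
    selection rule is not implemented here -- phi is a free parameter.
    Requires the curvature condition y^T s > 0.
    """
    Bs = B @ s
    sBs = s @ Bs                     # s^T B s
    ys = y @ s                       # y^T s
    v = y / ys - Bs / sBs            # scaled difference direction
    return (B
            - np.outer(Bs, Bs) / sBs     # remove old curvature along s
            + np.outer(y, y) / ys        # enforce secant information
            + phi * sBs * np.outer(v, v))

# Every member of the family satisfies the secant equation B_new s = y.
A = np.diag([1.0, 2.0, 3.0])         # model Hessian of a convex quadratic
s = np.array([0.5, -1.0, 2.0])
y = A @ s                            # gradient change for a quadratic
B = np.eye(3)
for phi in (0.0, 0.5, 1.0):
    B_new = broyden_update(B, s, y, phi)
    assert np.allclose(B_new @ s, y)
```

For a convex quadratic, `y = A s` exactly, so the check above verifies the secant condition that all family members share regardless of `phi`.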

Original language: English
Pages (from-to): 127-143
Number of pages: 17
Journal: Journal of Optimization Theory and Applications
Volume: 77
Issue number: 1
Publication status: Published - Apr 1993
Externally published: Yes

Keywords

  • Broyden family of updates
  • Davidon optimally conditioned update
  • Unconstrained optimization
  • least-change BFGS and DFP updates
  • quasi-Newton methods
  • symmetric rank-one update

ASJC Scopus subject areas

  • Control and Optimization
  • Management Science and Operations Research
  • Applied Mathematics
