Extra updates for the BFGS method

M. Al-Baali*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

15 Citations (Scopus)


This paper considers employing extra updates for the BFGS method for unconstrained optimization. The usual BFGS Hessian approximation is updated a number of times at each iteration, depending on first-order derivative information, to obtain a new Hessian approximation. Two approaches are proposed. One has the same global and superlinear convergence properties on convex functions as the BFGS method; the other has the same quadratic-termination property without exact line searches as the symmetric rank-one method. The new algorithms attempt to combine the best features of certain methods intended for either parallel computation or large-scale optimization. It is concluded that some of the new algorithms are competitive with the standard BFGS method.
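To make the idea of "extra updates" concrete, the following is a minimal sketch, not the paper's algorithm: it applies the standard BFGS inverse-Hessian update formula, then reapplies it with several recent step/gradient-change pairs before the next iteration. The function names (`bfgs_update`, `extra_updates`) and the choice of which pairs to reuse are illustrative assumptions; the paper's own selection and ordering of updates differ.

```python
import numpy as np

def bfgs_update(H, s, y):
    """One standard BFGS update of the inverse Hessian approximation H,
    using the step s = x_new - x_old and gradient change y = g_new - g_old.
    The updated H satisfies the secant condition H_new @ y == s."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def extra_updates(H, pairs):
    """Illustrative 'extra updates' step (an assumption, not the paper's
    exact scheme): reapply the BFGS formula with a list of recent (s, y)
    pairs, skipping pairs that violate the curvature condition y @ s > 0,
    which would destroy positive definiteness of H."""
    for s, y in pairs:
        if y @ s > 1e-12:
            H = bfgs_update(H, s, y)
    return H
```

For a quadratic objective with Hessian A, each pair satisfies y = A s, and after an update the secant condition H @ y == s holds for the most recent pair, which is a quick sanity check on the formula.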

Original language: English
Pages (from-to): 159-179
Number of pages: 21
Journal: Optimization Methods and Software
Issue number: 3
Publication status: Published - 2000

ASJC Scopus subject areas

  • Software
  • Control and Optimization
  • Applied Mathematics

