Extra updates for the BFGS method

Research output: Contribution to journal › Article

13 Citations (Scopus)

Abstract

This paper considers employing extra updates for the BFGS method for unconstrained optimization. The usual BFGS Hessian is updated a number of times, depending on the information of the first order derivatives, to obtain a new Hessian approximation at each iteration. Two approaches are proposed. One of them has the same properties of global and superlinear convergence on convex functions the BFGS method has, and another has the same property of quadratic termination without exact line searches that the symmetric rank-one method has. The new algorithms attempt to combine the best features of certain methods which are intended for either parallel computation or large scale optimization. It is concluded that some new algorithms are competitive with the standard BFGS method.
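The abstract refers to the standard BFGS update of the Hessian approximation, which the paper applies more than once per iteration. As a minimal illustrative sketch (the classical rank-two BFGS formula only, not the paper's extra-update scheme), one update step can be written as follows; the toy quadratic test function is an assumption added for demonstration:

```python
import numpy as np

def bfgs_update(B, s, y):
    """One standard BFGS update of the Hessian approximation B.

    s = x_{k+1} - x_k (the step), y = grad f(x_{k+1}) - grad f(x_k)
    (the gradient difference). Requires the curvature condition s^T y > 0,
    which holds for strictly convex functions.
    """
    Bs = B @ s
    # B_{k+1} = B - (B s s^T B)/(s^T B s) + (y y^T)/(y^T s)
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Toy check on a convex quadratic f(x) = 0.5 x^T A x, where y = A s exactly,
# so each update injects true curvature information from A into B.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
B = np.eye(2)
rng = np.random.default_rng(0)
for _ in range(5):
    s = rng.standard_normal(2)
    y = A @ s
    B = bfgs_update(B, s, y)
```

After each update the secant condition `B @ s == y` holds exactly for the most recent pair, which is the property the repeated ("extra") updates in the paper exploit to absorb additional first-order information at each iteration.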

Original language: English
Pages (from-to): 159-179
Number of pages: 21
Journal: Optimization Methods and Software
Volume: 13
Issue number: 3
Publication status: Published - 2000

Fingerprint

  • BFGS Method
  • Update
  • Large-scale Optimization
  • Superlinear Convergence
  • Line Search
  • Unconstrained Optimization
  • Parallel Computation
  • Derivatives
  • Global Convergence
  • Termination
  • Convex Function
  • First-order
  • Iteration
  • Approximation

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Software
  • Management Science and Operations Research
  • Applied Mathematics
  • Control and Optimization

Cite this

Extra updates for the BFGS method. / Al-Baali, M.

In: Optimization Methods and Software, Vol. 13, No. 3, 2000, p. 159-179.

@article{04c5eee9ea314e379dbe006149da1d27,
title = "Extra updates for the BFGS method",
abstract = "This paper considers employing extra updates for the BFGS method for unconstrained optimization. The usual BFGS Hessian is updated a number of times, depending on the information of the first order derivatives, to obtain a new Hessian approximation at each iteration. Two approaches are proposed. One of them has the same properties of global and superlinear convergence on convex functions the BFGS method has, and another has the same property of quadratic termination without exact line searches that the symmetric rank-one method has. The new algorithms attempt to combine the best features of certain methods which are intended for either parallel computation or large scale optimization. It is concluded that some new algorithms are competitive with the standard BFGS method.",
author = "M. Al-Baali",
year = "2000",
language = "English",
volume = "13",
pages = "159--179",
journal = "Optimization Methods and Software",
issn = "1055-6788",
publisher = "Taylor and Francis Ltd.",
number = "3",
}

TY - JOUR

T1 - Extra updates for the BFGS method

AU - Al-Baali, M.

PY - 2000

Y1 - 2000

N2 - This paper considers employing extra updates for the BFGS method for unconstrained optimization. The usual BFGS Hessian is updated a number of times, depending on the information of the first order derivatives, to obtain a new Hessian approximation at each iteration. Two approaches are proposed. One of them has the same properties of global and superlinear convergence on convex functions the BFGS method has, and another has the same property of quadratic termination without exact line searches that the symmetric rank-one method has. The new algorithms attempt to combine the best features of certain methods which are intended for either parallel computation or large scale optimization. It is concluded that some new algorithms are competitive with the standard BFGS method.

AB - This paper considers employing extra updates for the BFGS method for unconstrained optimization. The usual BFGS Hessian is updated a number of times, depending on the information of the first order derivatives, to obtain a new Hessian approximation at each iteration. Two approaches are proposed. One of them has the same properties of global and superlinear convergence on convex functions the BFGS method has, and another has the same property of quadratic termination without exact line searches that the symmetric rank-one method has. The new algorithms attempt to combine the best features of certain methods which are intended for either parallel computation or large scale optimization. It is concluded that some new algorithms are competitive with the standard BFGS method.

UR - http://www.scopus.com/inward/record.url?scp=0033658985&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0033658985&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:0033658985

VL - 13

SP - 159

EP - 179

JO - Optimization Methods and Software

JF - Optimization Methods and Software

SN - 1055-6788

IS - 3

ER -