### Abstract

This paper considers simple modifications of the limited-memory BFGS (L-BFGS) method for large-scale optimization. It outlines algorithms that re-use a given set of stored difference vectors in alternative ways. The proposed algorithms resemble the L-BFGS method, except that the initial Hessian approximation is defined implicitly, like the L-BFGS Hessian, in terms of some stored vectors rather than by the usual choice of a multiple of the unit matrix. Numerical experiments show that the new algorithms yield a desirable improvement over the L-BFGS method.
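For context, the baseline the paper modifies is the standard L-BFGS two-loop recursion, in which the initial inverse Hessian approximation is the usual multiple of the unit matrix, γI with γ = sᵀy / yᵀy. The sketch below shows that baseline only (not the paper's proposed implicit initial Hessian); the function and variable names are mine, not from the paper:

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: apply the implicit L-BFGS inverse Hessian to grad.

    s_list[i] and y_list[i] are the stored step and gradient difference
    vectors, ordered oldest to newest. The initial inverse Hessian is the
    usual scaled identity gamma * I, with gamma = s'y / y'y for the most
    recent pair -- the choice the paper replaces with a matrix defined
    implicitly in terms of the stored vectors.
    """
    q = grad.copy()
    alphas = []
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    # First loop: newest pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Standard initial inverse Hessian: a multiple of the unit matrix
    s, y = s_list[-1], y_list[-1]
    gamma = np.dot(s, y) / np.dot(y, y)
    r = gamma * q
    # Second loop: oldest pair to newest (alphas were collected newest-first)
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return r  # r = H_k @ grad; the search direction is -r
```

With one stored pair from a quadratic with Hessian 2I (so y = 2s), the recursion reproduces the exact inverse Hessian action, grad / 2, which is a quick sanity check on the update.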

Original language | English
---|---
Pages (from-to) | 99-112
Number of pages | 14
Journal | *Numerical Algorithms*
Volume | 22
Issue number | 1
ISSN | 1017-1398
Publication status | Published - 1999

### Keywords

- BFGS updating formula
- Large scale optimization
- Limited memory BFGS method
- Quasi-Newton methods

### ASJC Scopus subject areas

- Applied Mathematics

### Cite this

Al-Baali, M. (1999). Improved Hessian approximations for the limited memory BFGS method. *Numerical Algorithms*, *22*(1), 99-112.

Research output: Contribution to journal › Article

Scopus record: http://www.scopus.com/inward/record.url?scp=0033408125&partnerID=8YFLogxK
