Abstract
This paper considers employing extra updates for the BFGS method for unconstrained optimization. At each iteration, the usual BFGS Hessian approximation is updated a number of times, depending on information from the first-order derivatives, to obtain a new Hessian approximation. Two approaches are proposed. One has the same global and superlinear convergence properties on convex functions as the BFGS method; the other has the same quadratic-termination property without exact line searches as the symmetric rank-one method. The new algorithms attempt to combine the best features of certain methods intended for either parallel computation or large-scale optimization. It is concluded that some of the new algorithms are competitive with the standard BFGS method.
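Since the abstract turns on repeating the BFGS Hessian update within a single iteration, the following minimal Python sketch may help fix ideas. The `bfgs_update` function implements the standard BFGS formula for the Hessian approximation; the `bfgs_with_extra_updates` loop and the small driver are hypothetical illustrations of the extra-update idea using gradient-difference pairs, not the authors' actual scheme.

```python
import numpy as np

def bfgs_update(B, s, y):
    """One standard BFGS update of the Hessian approximation B,
    given a step s = x_new - x_old and the gradient difference
    y = g(x_new) - g(x_old).  The update is skipped when the
    curvature condition y^T s > 0 (nearly) fails, since the
    formula would then destroy positive definiteness of B."""
    sy = s @ y
    if sy <= 1e-10 * np.linalg.norm(s) * np.linalg.norm(y):
        return B  # skip: curvature condition violated
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy

def bfgs_with_extra_updates(B, pairs):
    """Hypothetical sketch of the extra-update idea: apply the BFGS
    formula once for each available (s, y) pair obtained from
    first-order (gradient) information, rather than only once
    per iteration."""
    for s, y in pairs:
        B = bfgs_update(B, s, y)
    return B

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = np.array([[3.0, 1.0], [1.0, 2.0]])  # true Hessian of a quadratic
    B = np.eye(2)
    # For f(x) = 0.5 x^T A x the gradient difference is exactly y = A s.
    pairs = [(s, A @ s) for s in rng.standard_normal((3, 2))]
    B = bfgs_with_extra_updates(B, pairs)
    s_last, y_last = pairs[-1]
    # The updated B always satisfies the secant condition B s = y
    # for the most recent pair.
    print(np.allclose(B @ s_last, y_last))  # True
```

The skip test guards the curvature condition $y^\top s > 0$, which is what keeps the BFGS approximation positive definite; how many extra pairs to use, and where they come from, is left unspecified here.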
| Original language | English |
|---|---|
| Pages (from-to) | 159-179 |
| Number of pages | 21 |
| Journal | Optimization Methods and Software |
| Volume | 13 |
| Issue number | 3 |
| Publication status | Published - 2000 |
ASJC Scopus subject areas
- Computer Graphics and Computer-Aided Design
- Software
- Management Science and Operations Research
- Applied Mathematics
- Control and Optimization