Numerical experience with a class of self-scaling quasi-Newton algorithms

Research output: Contribution to journal › Article

25 Citations (Scopus)

Abstract

Self-scaling quasi-Newton methods for unconstrained optimization update the Hessian approximation by a formula that depends on two parameters (say, τ and θ) such that τ = 1, θ = 0, and θ = 1 yield the unscaled Broyden family, the BFGS update, and the DFP update, respectively. In previous work, conditions were obtained on these parameters that imply global and superlinear convergence for self-scaling methods on convex objective functions. This paper discusses the practical performance of several new algorithms designed to satisfy these conditions.
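As a minimal sketch of the parameterized family the abstract describes (not the authors' code, and with the self-scaling factor τ applied in the common Oren–Luenberger form, which is an assumption here), one update step might look like:

```python
import numpy as np

def self_scaling_broyden_update(B, s, y, tau=1.0, theta=0.0):
    """One self-scaling Broyden-family update of the Hessian approximation B.

    s = step (x_{k+1} - x_k), y = gradient difference (g_{k+1} - g_k).
    With tau = 1: theta = 0 gives the unscaled BFGS update and theta = 1
    gives the unscaled DFP update, matching the abstract's parameterization.
    Hypothetical illustration only.
    """
    Bs = B @ s
    sBs = s @ Bs          # curvature of the current model along s
    ys = y @ s            # must be > 0 for positive definiteness
    v = y / ys - Bs / sBs
    # tau scales the "old information" terms; the y y^T term is left unscaled.
    return (tau * (B - np.outer(Bs, Bs) / sBs
                   + theta * sBs * np.outer(v, v))
            + np.outer(y, y) / ys)
```

For any choice of τ and θ the updated matrix satisfies the secant condition B⁺s = y, since vᵀs = 0; this is what makes the whole family quasi-Newton.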

Original language: English
Pages (from-to): 533-553
Number of pages: 21
Journal: Journal of Optimization Theory and Applications
Volume: 96
Issue number: 3
Publication status: Published - Mar 1998


Keywords

  • Broyden family
  • Global and superlinear convergence
  • Inexact line searches
  • Quasi-Newton methods
  • Self-scaling methods
  • Unconstrained optimization

ASJC Scopus subject areas

  • Management Science and Operations Research
  • Applied Mathematics
  • Control and Optimization

Cite this

Numerical experience with a class of self-scaling quasi-Newton algorithms. / Al-Baali, M.

In: Journal of Optimization Theory and Applications, Vol. 96, No. 3, 03.1998, p. 533-553.


@article{1833d581fc4e43a981ecfb00f8bcdf9c,
title = "Numerical experience with a class of self-scaling quasi-newton algorithms",
abstract = "Self-scaling quasi-Newton methods for unconstrained optimization depend upon updating the Hessian approximation by a formula which depends on two parameters (say, τ and θ) such that τ = 1, θ = 0, and θ = 1 yield the unscaled Broyden family, the BFGS update, and the DFP update, respectively. In previous work, conditions were obtained on these parameters that imply global and superlinear convergence for self-scaling methods on convex objective functions. This paper discusses the practical performance of several new algorithms designed to satisfy these conditions.",
keywords = "Broyden family, Global and superlinear convergence, Inexact line searches, Quasi-Newton methods, Self-scaling methods, Unconstrained optimization",
author = "M. Al-Baali",
year = "1998",
month = "3",
language = "English",
volume = "96",
pages = "533--553",
journal = "Journal of Optimization Theory and Applications",
issn = "0022-3239",
publisher = "Springer New York",
number = "3",

}

TY - JOUR

T1 - Numerical experience with a class of self-scaling quasi-newton algorithms

AU - Al-Baali, M.

PY - 1998/3

Y1 - 1998/3

N2 - Self-scaling quasi-Newton methods for unconstrained optimization depend upon updating the Hessian approximation by a formula which depends on two parameters (say, τ and θ) such that τ = 1, θ = 0, and θ = 1 yield the unscaled Broyden family, the BFGS update, and the DFP update, respectively. In previous work, conditions were obtained on these parameters that imply global and superlinear convergence for self-scaling methods on convex objective functions. This paper discusses the practical performance of several new algorithms designed to satisfy these conditions.

AB - Self-scaling quasi-Newton methods for unconstrained optimization depend upon updating the Hessian approximation by a formula which depends on two parameters (say, τ and θ) such that τ = 1, θ = 0, and θ = 1 yield the unscaled Broyden family, the BFGS update, and the DFP update, respectively. In previous work, conditions were obtained on these parameters that imply global and superlinear convergence for self-scaling methods on convex objective functions. This paper discusses the practical performance of several new algorithms designed to satisfy these conditions.

KW - Broyden family

KW - Global and superlinear convergence

KW - Inexact line searches

KW - Quasi-Newton methods

KW - Self-scaling methods

KW - Unconstrained optimization

UR - http://www.scopus.com/inward/record.url?scp=0032373590&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0032373590&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:0032373590

VL - 96

SP - 533

EP - 553

JO - Journal of Optimization Theory and Applications

JF - Journal of Optimization Theory and Applications

SN - 0022-3239

IS - 3

ER -