Global and superlinear convergence of a restricted class of self-scaling methods with inexact line searches, for convex functions

Research output: Contribution to journal › Article

17 Citations (Scopus)

Abstract

This paper studies the convergence properties of algorithms belonging to the class of self-scaling (SS) quasi-Newton methods for unconstrained optimization. This class depends on two parameters, say θ_k and τ_k, for which the choice τ_k = 1 gives the Broyden family of unscaled methods, where θ_k = 1 corresponds to the well-known DFP method. We propose simple conditions on these parameters that give rise to global convergence with inexact line searches, for convex objective functions. Q-superlinear convergence is achieved if further restrictions on the scaling parameter are introduced. These convergence results extend the known results for the unscaled methods. Because the scaling parameter is heavily restricted, we consider a subclass of SS methods which satisfies the required conditions. Although convergence for the unscaled methods with θ_k ≥ 1 is still an open question, we show that global and superlinear convergence for SS methods is possible and present, in particular, a new SS-DFP method.
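The two-parameter family the abstract describes can be sketched in code. Below is an illustrative NumPy sketch (our own naming, not the paper's notation) of a generic self-scaling Broyden-family update of a Hessian approximation B: the current B is scaled by τ before the Broyden update with parameter θ is applied, so τ = 1 recovers the unscaled family and θ = 1 with τ = 1 gives the classical DFP update. The parameter conditions derived in the paper are not enforced here.

```python
import numpy as np

def ss_broyden_update(B, s, y, theta, tau):
    """One self-scaling Broyden-family update of the Hessian approximation B.

    s = x_{k+1} - x_k (step), y = g_{k+1} - g_k (gradient difference).
    tau = 1 gives the unscaled Broyden family; theta = 1, tau = 1 gives DFP.
    Assumes the curvature condition y.s > 0 holds (guaranteed for convex
    functions with a Wolfe-type inexact line search).
    """
    Bs = B @ s
    sBs = s @ Bs           # s^T B s > 0 for positive-definite B
    ys = y @ s             # curvature y^T s, assumed positive
    v = y / ys - Bs / sBs  # the Broyden-family correction direction
    # Scale the B-dependent part by tau, then add the rank-one y-term.
    B_scaled = tau * (B - np.outer(Bs, Bs) / sBs
                      + theta * sBs * np.outer(v, v))
    return B_scaled + np.outer(y, y) / ys
```

Note that v^T s = 0, so the updated matrix satisfies the secant condition B_{k+1} s = y for every choice of θ and τ, which is why the scaling parameter can be restricted without losing this basic quasi-Newton property.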

Original language: English
Pages (from-to): 191-203
Number of pages: 13
Journal: Computational Optimization and Applications
Volume: 9
Issue number: 2
Publication status: Published - Feb 1998

ASJC Scopus subject areas

  • Management Science and Operations Research
  • Applied Mathematics
  • Computational Mathematics
  • Control and Optimization

Cite this

@article{73876c09913b4f8ebab9bd332012ee4a,
title = "Global and superlinear convergence of a restricted class of self-scaling methods with inexact line searches, for convex functions",
abstract = "This paper studies the convergence properties of algorithms belonging to the class of self-scaling (SS) quasi-Newton methods for unconstrained optimization. This class depends on two parameters, say θk and τk, for which the choice τk = 1 gives the Broyden family of unscaled methods, where θk = 1 corresponds to the well known DFP method. We propose simple conditions on these parameters that give rise to global convergence with inexact line searches, for convex objective functions. The q-superlinear convergence is achieved if further restrictions on the scaling parameter are introduced. These convergence results are an extension of the known results for the unscaled methods. Because the scaling parameter is heavily restricted, we consider a subclass of SS methods which satisfies the required conditions. Although convergence for the unscaled methods with θk≥1 is still an open question, we show that the global and superlinear convergence for SS methods is possible and present, in particular, a new SS-DFP method.",
author = "M. Al-Baali",
year = "1998",
month = feb,
language = "English",
volume = "9",
pages = "191--203",
journal = "Computational Optimization and Applications",
issn = "0926-6003",
publisher = "Springer Netherlands",
number = "2",

}

TY - JOUR

T1 - Global and superlinear convergence of a restricted class of self-scaling methods with inexact line searches, for convex functions

AU - Al-Baali, M.

PY - 1998/2

Y1 - 1998/2

AB - This paper studies the convergence properties of algorithms belonging to the class of self-scaling (SS) quasi-Newton methods for unconstrained optimization. This class depends on two parameters, say θk and τk, for which the choice τk = 1 gives the Broyden family of unscaled methods, where θk = 1 corresponds to the well known DFP method. We propose simple conditions on these parameters that give rise to global convergence with inexact line searches, for convex objective functions. The q-superlinear convergence is achieved if further restrictions on the scaling parameter are introduced. These convergence results are an extension of the known results for the unscaled methods. Because the scaling parameter is heavily restricted, we consider a subclass of SS methods which satisfies the required conditions. Although convergence for the unscaled methods with θk≥1 is still an open question, we show that the global and superlinear convergence for SS methods is possible and present, in particular, a new SS-DFP method.

UR - http://www.scopus.com/inward/record.url?scp=0031999308&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0031999308&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:0031999308

VL - 9

SP - 191

EP - 203

JO - Computational Optimization and Applications

JF - Computational Optimization and Applications

SN - 0926-6003

IS - 2

ER -