Abstract
The line search subproblem in unconstrained optimization is concerned with finding an acceptable steplength which satisfies certain standard conditions. Prototype algorithms are described which guarantee finding such a step in a finite number of operations. This is achieved by first bracketing an interval of acceptable values and then reducing this bracket uniformly by the repeated use of sectioning in a systematic way. Some new theorems about convergence and termination of the line search are presented. Use of these algorithms to solve the line search subproblem in methods for nonlinear least squares is considered. We show that substantial gains in efficiency can be made by making polynomial interpolations to the individual residual functions rather than the overall objective function. We also study modified schemes in which the Jacobian matrix is evaluated as infrequently as possible, and show that further worthwhile savings can be made. Numerical results are presented.
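The bracket-then-section scheme the abstract describes can be sketched as follows. This is a minimal illustration under assumed standard (Wolfe-type) acceptance conditions, not the authors' algorithm; the function names, constants `c1`/`c2`, and the simple bisection sectioning rule are all invented for this example.

```python
# Hypothetical sketch of a bracketing/sectioning line search (illustration
# only; names and constants are not taken from the paper).

def line_search(phi, dphi, c1=1e-4, c2=0.9, alpha=1.0, expand=2.0, max_iter=50):
    """Seek a steplength alpha satisfying standard acceptance conditions:
       phi(alpha) <= phi(0) + c1*alpha*dphi(0)   (sufficient decrease)
       dphi(alpha) >= c2*dphi(0)                 (curvature)
    where phi(alpha) = f(x + alpha*d) along a descent direction d."""
    phi0, dphi0 = phi(0.0), dphi(0.0)
    lo, hi = 0.0, None  # hi is None until a bracket is found
    for _ in range(max_iter):
        if phi(alpha) > phi0 + c1 * alpha * dphi0:
            hi = alpha              # overshot: an acceptable step lies in (lo, alpha)
        elif dphi(alpha) < c2 * dphi0:
            lo = alpha              # slope still too negative: move right
        else:
            return alpha            # both conditions hold: accept this step
        # Bracketing phase expands the trial step; once a bracket [lo, hi]
        # exists, the sectioning phase shrinks it (bisection here, where the
        # paper would use polynomial interpolation).
        alpha = alpha * expand if hi is None else 0.5 * (lo + hi)
    return alpha

# Usage: search along phi(a) = (a - 2)^2, which has its minimizer at a = 2.
step = line_search(lambda a: (a - 2.0) ** 2, lambda a: 2.0 * (a - 2.0))
```

Because the bracket is reduced by a fixed factor at each sectioning step, the search terminates in a finite number of operations, which is the finite-termination property the abstract refers to.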
| Original language | English |
| --- | --- |
| Pages (from-to) | 359-377 |
| Number of pages | 19 |
| Journal | Journal of Optimization Theory and Applications |
| Volume | 48 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - Mar 1986 |
Keywords
- Unconstrained optimization
- Line search
- Nonlinear least squares
- Sectioning
ASJC Scopus subject areas
- Control and Optimization
- Management Science and Operations Research
- Applied Mathematics