Abstract
The line search subproblem in unconstrained optimization is concerned with finding an acceptable steplength that satisfies certain standard conditions. Prototype algorithms are described that guarantee finding such a step in a finite number of operations. This is achieved by first bracketing an interval of acceptable values and then reducing this bracket systematically by repeated sectioning. Some new theorems about convergence and termination of the line search are presented. Use of these algorithms to solve the line search subproblem in methods for nonlinear least squares is considered. We show that substantial gains in efficiency can be made by making polynomial interpolations to the individual residual functions rather than to the overall objective function. We also study modified schemes in which the Jacobian matrix is evaluated as infrequently as possible, and show that further worthwhile savings can be made. Numerical results are presented.
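The bracket-then-section strategy the abstract describes can be illustrated with a minimal sketch. This is not the paper's actual algorithm: it takes the "standard conditions" to be the usual Wolfe conditions, expands the trial step until an interval of acceptable values is bracketed, and then reduces the bracket by plain bisection rather than the polynomial interpolation the paper employs.

```python
def wolfe_line_search(phi, dphi, c1=1e-4, c2=0.9, alpha=1.0, max_iter=100):
    """Find a step alpha satisfying the Wolfe conditions by bracketing
    and sectioning (bisection).

    phi(a)  = f(x + a*d) along the search direction d,
    dphi(a) = derivative of phi; dphi(0) must be negative (descent).
    """
    phi0, dphi0 = phi(0.0), dphi(0.0)
    if dphi0 >= 0.0:
        raise ValueError("not a descent direction")
    lo, hi = 0.0, float("inf")          # current bracket [lo, hi]
    for _ in range(max_iter):
        if phi(alpha) > phi0 + c1 * alpha * dphi0:
            hi = alpha                  # sufficient-decrease test fails: too long
        elif dphi(alpha) < c2 * dphi0:
            lo = alpha                  # curvature test fails: too short
        else:
            return alpha                # both Wolfe conditions hold
        # Bracketing phase: double the step while no upper bound exists;
        # sectioning phase: bisect the bracket once one does.
        alpha = 2.0 * alpha if hi == float("inf") else 0.5 * (lo + hi)
    return alpha                        # best effort after max_iter trials
```

For example, along the one-dimensional model phi(a) = a² − a (so dphi(a) = 2a − 1), the first trial step alpha = 1 fails the sufficient-decrease test, the bracket becomes [0, 1], and one bisection yields the acceptable step 0.5.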
Original language | English
---|---
Pages (from-to) | 359-377
Number of pages | 19
Journal | Journal of Optimization Theory and Applications
Volume | 48
Issue number | 3
DOIs | |
Publication status | Published - March 1986
Published externally | Yes
ASJC Scopus subject areas
- Control and Optimization
- Management Science and Operations Research
- Applied Mathematics