Posted on 2013-11-27, 15:01, authored by Djamal Touati-Ahmed
The main subject of the research in this thesis is the study
of conjugate gradient methods for optimization and the development
of improved algorithms. After an introductory first chapter,
Chapter 2 provides background on numerical methods for
optimization in general and on conjugate gradient-type algorithms
in particular. In Chapter 3 we study the convergence properties
of conjugate gradient methods and discuss Powell's (1983)
counterexample, which proves that there exist twice continuously
differentiable functions with bounded level sets for which the
Polak-Ribière method fails to achieve global convergence, whereas
the Fletcher-Reeves method is shown to be globally convergent,
even though in numerical computations the Polak-Ribière
method is far more efficient than the Fletcher-Reeves method.
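For context, both methods are instances of the standard nonlinear conjugate gradient iteration and differ only in the choice of the parameter $\beta_k$; the standard definitions are stated here for reference, with $g_k$ denoting the gradient at $x_k$ and $\alpha_k$ the step length obtained from the line search:
\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0,
\]
\[
\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad
\beta_k^{PR} = \frac{g_{k+1}^{T}\,(g_{k+1} - g_k)}{\|g_k\|^2}.
\]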
Chapters 4 and 5 deal with the development of a number of new
hybrid algorithms, three of which are shown to satisfy the descent
property at every iteration and achieve global convergence
regardless of whether exact or inexact line searches are used.
A new restarting procedure for conjugate gradient methods is also
given which ensures that the descent property holds and that global
convergence is achieved for any conjugate gradient method using a
non-negative update.
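As an example of a non-negative update of the kind referred to above (given purely for illustration, and not necessarily the procedure developed in the thesis), one widely used modification of the Polak-Ribière parameter is
\[
\beta_k = \max\{\beta_k^{PR},\, 0\},
\]
which keeps the update parameter non-negative at every iteration.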
The application of these hybrid algorithms and of the new
restarting procedure to a wide class of well-known test problems
is presented and discussed in the final chapter, "Discussions and
Conclusions". The results, given in the appendices, show that
these hybrids, and the methods using the new restarting procedure,
achieve a considerable improvement over existing conjugate
gradient methods and also over quasi-Newton methods.