Similar documents to “Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization”

An imperfect conjugate gradient algorithm

Fridrich Sloboda (1982)

Aplikace matematiky

Similarity:

A new biorthogonalization algorithm is defined which does not depend on the step size used. The algorithm is designed to minimize the total error after n steps when imperfect steps are used. The majority of conjugate gradient algorithms are sensitive to the exactness of the line searches, and this sensitivity may destroy the global efficiency of these algorithms.
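To make the role of the line search concrete, the following is a minimal sketch of a generic Fletcher–Reeves nonlinear conjugate gradient iteration with an inexact (backtracking) line search; the function names, constants, and restart safeguard are illustrative assumptions and do not reproduce the algorithm of the paper.

import numpy as np

def cg_fletcher_reeves(f, grad, x0, max_iter=100, tol=1e-8):
    """Generic Fletcher-Reeves nonlinear CG with a backtracking (inexact)
    line search; an illustrative sketch, not the paper's algorithm."""
    x = np.array(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search -- intentionally inexact.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d                # new search direction
        if g_new.dot(d) >= 0:                # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize a simple quadratic whose exact minimizer is (1, 2).
f = lambda x: (x[0] - 1.0)**2 + 4.0 * (x[1] - 2.0)**2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 8.0 * (x[1] - 2.0)])
print(cg_fletcher_reeves(f, grad, np.zeros(2)))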

Conjugate gradient algorithms for conic functions

Ladislav Lukšan (1986)

Aplikace matematiky

Similarity:

The paper contains a description and an analysis of two modifications of the conjugate gradient method for unconstrained minimization which find the minimum of a conic function after a finite number of steps. Moreover, a further extension of the conjugate gradient method is given, based on a more general class of model functions.
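For context, conic functions extend the quadratic model by a collinear scaling; one common parameterization (the notation below is an assumption, not taken from the paper) is:

% One common parameterization of a conic model function (notation illustrative):
%   s = x - x_0,  A symmetric positive definite,  a a fixed scaling vector.
\[
  \phi(x) \;=\; \phi_0
    \;+\; \frac{g^{\mathsf T} s}{1 + a^{\mathsf T} s}
    \;+\; \frac{1}{2}\,\frac{s^{\mathsf T} A\, s}{\bigl(1 + a^{\mathsf T} s\bigr)^{2}},
  \qquad s = x - x_0 .
\]
% For a = 0 this reduces to the usual quadratic model \phi_0 + g^T s + (1/2) s^T A s.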

Modifications of the limited-memory BFGS method based on the idea of conjugate directions

Jan Vlček, Ladislav Lukšan

Similarity:

Simple modifications of the limited-memory BFGS method (L-BFGS) for large-scale unconstrained optimization are considered, which consist of corrections to the difference vectors used (derived from the idea of conjugate directions), utilizing information from the preceding iteration. For quadratic objective functions, the improvement in convergence is, in a certain sense, the best possible, and all stored difference vectors are conjugate for unit step sizes. The algorithm is globally convergent for...
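For reference, the unmodified limited-memory update that such corrections act on is the standard L-BFGS two-loop recursion; the sketch below (generic names, no correction of the stored difference vectors) is an illustrative assumption rather than the authors' modified method.

import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion computing d = -H g from stored
    difference vectors s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i.
    The corrections of the difference vectors proposed in the paper
    are not reproduced here."""
    q = g.copy()
    alphas = []
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    # First loop: newest to oldest stored pair.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    # Initial Hessian scaling from the newest pair (standard gamma choice).
    gamma = s_list[-1].dot(y_list[-1]) / y_list[-1].dot(y_list[-1])
    r = gamma * q
    # Second loop: oldest to newest stored pair.
    for s, y, rho, a in zip(s_list, y_list, rhos, reversed(alphas)):
        b = rho * y.dot(r)
        r += (a - b) * s
    return -r   # quasi-Newton search direction

# Tiny usage check on the quadratic f(x) = 0.5 x^T diag(1, 10) x,
# storing one (s, y) pair and computing a direction at x1.
A = np.diag([1.0, 10.0])
x0, x1 = np.array([1.0, 1.0]), np.array([0.5, 0.2])
s, y = x1 - x0, A @ x1 - A @ x0          # y = grad(x1) - grad(x0)
print(lbfgs_direction(A @ x1, [s], [y]))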