Displaying similar documents to “A conjugate gradient method with sufficient descent and global convergence for unconstrained nonlinear optimization.”

Convex quadratic underestimation and Branch and Bound for univariate global optimization with one nonconvex constraint

Hoai An Le Thi, Mohand Ouanes (2006)

RAIRO - Operations Research

Similarity:

The purpose of this paper is to demonstrate that, to globally minimize one-dimensional nonconvex problems with both a twice differentiable function and constraint, we can propose an efficient algorithm based on Branch and Bound techniques. The method is first presented in the simple case with an interval constraint, and then extended to the general case with an additional nonconvex twice differentiable constraint. A quadratic bounding function which is better than...
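The underestimation-plus-branching scheme the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' algorithm: it uses the classical secant-minus-quadratic underestimator q(x) = L(x) - (K/2)(x - lo)(hi - x), valid whenever K bounds f'' from above, inside a plain interval branch and bound. The function name `bb_minimize`, the test function, and the bound K = 42 are all illustrative choices.

```python
# Hedged sketch of interval branch and bound with a convex quadratic
# underestimator; names and constants are illustrative, not from the paper.

def bb_minimize(f, a, b, K, tol=1e-6, max_nodes=10000):
    """Globally minimize f on [a, b], given K >= max f'' on [a, b].

    On a sub-interval [lo, hi] the convex quadratic
        q(x) = L(x) - (K/2)(x - lo)(hi - x),
    with L the secant through (lo, f(lo)) and (hi, f(hi)), underestimates f:
    f - L + (K/2)(x - lo)(hi - x) is concave and vanishes at both endpoints.
    Its minimum over [lo, hi] gives the lower bound used for pruning.
    """
    def lower_bound(lo, hi):
        s = (f(hi) - f(lo)) / (hi - lo)              # secant slope
        x = min(max((lo + hi) / 2 - s / K, lo), hi)  # argmin of q, clipped
        return f(lo) + s * (x - lo) - (K / 2) * (x - lo) * (hi - x)

    best_x, best_f = min(((x, f(x)) for x in (a, b, (a + b) / 2)),
                         key=lambda t: t[1])
    stack = [(a, b)]
    for _ in range(max_nodes):
        if not stack:
            break
        lo, hi = stack.pop()
        if lower_bound(lo, hi) >= best_f - tol:
            continue                      # prune: nothing better in here
        mid = (lo + hi) / 2
        if f(mid) < best_f:
            best_x, best_f = mid, f(mid)
        stack.extend([(lo, mid), (mid, hi)])
    return best_x, best_f
```

As the intervals shrink, q converges to f (the gap is at most K·w²/8 on an interval of width w), so the bounds tighten and intervals far from the global minimum are pruned early.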

A self-adaptive trust region method for the extended linear complementarity problems

Zhensheng Yu, Qiang Li (2009)

Applications of Mathematics

Similarity:

By using NCP functions, we reformulate the extended linear complementarity problem as a nonsmooth equation. We then propose a self-adaptive trust region algorithm for solving this nonsmooth equation. The novelty of the method is that the trust region radius is controlled by the objective function value and is adjusted automatically as the algorithm proceeds. Global convergence is obtained under mild conditions, and a local superlinear convergence rate is also established...
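The reformulation step can be sketched on the plain LCP (the paper concerns the extended problem; this is a simplified stand-in). The NCP function used here is the Fischer-Burmeister function, one common choice, and the damping tied to the merit value only loosely imitates the paper's self-adaptive trust region; it is not the authors' method.

```python
import numpy as np

def fischer_burmeister(a, b):
    # phi(a, b) = sqrt(a^2 + b^2) - a - b;  phi = 0  <=>  a >= 0, b >= 0, ab = 0
    return np.sqrt(a * a + b * b) - a - b

def solve_lcp(M, q, tol=1e-12, max_iter=100):
    """Find z >= 0 with w = Mz + q >= 0 and z'w = 0 (a plain LCP sketch).

    Each complementarity pair is folded into one equation Phi_i = phi(z_i, w_i),
    and the nonsmooth system Phi(z) = 0 is driven to zero by damped Newton
    steps whose damping is tied to the merit value 0.5*||Phi||^2 -- a loose
    imitation of the 'radius controlled by the objective value' idea.
    """
    n = len(q)
    z = np.zeros(n)
    for _ in range(max_iter):
        w = M @ z + q
        Phi = fischer_burmeister(z, w)
        merit = 0.5 * Phi @ Phi
        if merit < tol:
            break
        r = np.sqrt(z * z + w * w)
        r = np.where(r < 1e-12, 1e-12, r)     # guard the kink at (0, 0)
        Da, Db = z / r - 1.0, w / r - 1.0     # element derivatives of phi
        J = np.diag(Da) + np.diag(Db) @ M     # (generalized) Jacobian of Phi
        lam = min(merit, 1.0)                 # self-adaptive damping
        step = np.linalg.solve(J.T @ J + lam * np.eye(n), -J.T @ Phi)
        z = z + step
    return z
```

Far from a solution the large merit value keeps the steps conservative; near a solution the damping vanishes and the iteration behaves like a Newton method, which is where the fast local convergence comes from.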

An imperfect conjugate gradient algorithm

Fridrich Sloboda (1982)

Aplikace matematiky

Similarity:

A new biorthogonalization algorithm is defined which does not depend on the step size used. The algorithm is designed to minimize the total error after n steps when imperfect steps are used. Most conjugate gradient algorithms are sensitive to the exactness of the line searches, and this sensitivity can destroy their global efficiency.
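The line-search sensitivity mentioned at the end can be demonstrated with a standard Fletcher-Reeves-style conjugate gradient iteration on a quadratic, where the exact step length is optionally perturbed to imitate an imperfect line search. This sketch illustrates the phenomenon only; it is not the paper's step-size-independent biorthogonalization algorithm.

```python
import numpy as np

def cg_quadratic(A, b, steps, perturb=0.0, seed=0):
    """Fletcher-Reeves-style CG on f(x) = 0.5 x'Ax - b'x (A symmetric PD).

    The exact line-search step along p is alpha = (r'r)/(p'Ap); `perturb`
    multiplies it by a random factor in [1 - perturb, 1 + perturb] to mimic
    an imperfect line search. Returns the final residual norm ||b - Ax||.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros_like(b)
    r = b - A @ x                        # residual = -gradient
    p = r.copy()
    for _ in range(steps):
        alpha = (r @ r) / (p @ A @ p)    # exact minimizer along p
        alpha *= 1.0 + perturb * rng.uniform(-1.0, 1.0)
        x = x + alpha * p
        r_new = b - A @ x
        beta = (r_new @ r_new) / (r @ r)  # Fletcher-Reeves coefficient
        p = r_new + beta * p
        r = r_new
    return float(np.linalg.norm(b - A @ x))
```

With exact steps the search directions stay A-conjugate and the method terminates (in exact arithmetic) after n steps; even a few percent of step-length error breaks the conjugacy and visibly degrades the final residual.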