Displaying similar documents to “Limited-memory variable metric methods that use quantities from the preceding iteration”

A new diagonal quasi-Newton algorithm for unconstrained optimization problems

Mahsa Nosrati, Keyvan Amini (2024)

Applications of Mathematics

Similarity:

We present a new diagonal quasi-Newton method for solving unconstrained optimization problems, based on the weak secant equation. The method generates the Hessian approximation using new criteria that control its diagonal elements. We establish global convergence of the proposed method under the Armijo line search. Numerical results on a collection of standard test problems demonstrate that the proposed method outperforms several existing diagonal methods.
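
As context for the abstract above, the following is a minimal sketch of a diagonal quasi-Newton iteration with an Armijo backtracking line search, assuming the classical least-change diagonal update that satisfies the weak secant equation sᵀB₊s = sᵀy. The paper's specific criteria for controlling the diagonal elements are not reproduced; the positivity safeguard below is only a stand-in.

import numpy as np

def armijo(f, x, fx, g, d, beta=0.5, sigma=1e-4, max_backtracks=50):
    """Backtracking Armijo line search: accept t with f(x + t d) <= f(x) + sigma * t * g^T d."""
    t = 1.0
    gd = g @ d
    for _ in range(max_backtracks):
        if f(x + t * d) <= fx + sigma * t * gd:
            return t
        t *= beta
    return t

def diagonal_qn(f, grad, x0, tol=1e-6, max_iter=1000, eps=1e-8):
    x = np.asarray(x0, dtype=float)
    b = np.ones_like(x)              # diagonal Hessian approximation B = diag(b)
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        d = -g / b                   # quasi-Newton direction with a diagonal B
        t = armijo(f, x, f(x), g, d)
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Least-change diagonal update satisfying the weak secant equation:
        # b_new = b + lambda * s**2 with lambda = (s^T y - s^T B s) / sum(s^4)
        denom = np.sum(s ** 4)
        if denom > eps:
            lam = (s @ y - s @ (b * s)) / denom
            b = b + lam * s ** 2
        b = np.maximum(b, eps)       # stand-in safeguard: keep the diagonal positive
        x, g = x_new, g_new
    return x

With a diagonal approximation the search direction costs only O(n) per iteration, which is the main appeal of this class of methods.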

A modified limited-memory BNS method for unconstrained minimization derived from the conjugate directions idea

Vlček, Jan, Lukšan, Ladislav

Similarity:

A modification of the limited-memory variable metric BNS method for large-scale unconstrained optimization of the differentiable function f : ℝ^N → ℝ is considered, which consists in correcting the difference vectors (based on the idea of conjugate directions) for better satisfaction of the previous quasi-Newton conditions. In comparison with [11], more previous iterations can be utilized here. For quadratic objective functions, the improvement of convergence is the best one in some sense, all...
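
For reference, the sketch below shows the standard limited-memory two-loop recursion that applies the inverse Hessian approximation built from the m most recent difference pairs (sᵢ, yᵢ) to the gradient; the BNS compact matrix representation organizes the same BFGS information in matrix form. The conjugate-direction corrections of the difference vectors proposed in the paper are not reproduced here.

import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Return -H_k g, where H_k is the limited-memory BFGS inverse Hessian
    built from the stored pairs (s_i, y_i), oldest pair first in the lists."""
    q = g.copy()
    alphas = []
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    # first loop: newest pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # initial scaling H_0 = gamma * I with gamma = s^T y / y^T y from the newest pair
    s, y = s_list[-1], y_list[-1]
    q *= (s @ y) / (y @ y)
    # second loop: oldest pair to newest
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return -q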

A generalized limited-memory BNS method based on the block BFGS update

Vlček, Jan, Lukšan, Ladislav

Similarity:

A block version of the BFGS variable metric update formula is investigated. It satisfies the quasi-Newton conditions with all used difference vectors and gives the best improvement of convergence in some sense for quadratic objective functions, but it does not guarantee that the direction vectors are descent for general functions. To overcome this difficulty and utilize the advantageous properties of the block BFGS update, a block version of the limited-memory BNS method for large scale...
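
To illustrate the quasi-Newton conditions mentioned above, here is a minimal sketch assuming the classical multiple-secant (block) generalization of the BFGS update, B₊ = B − BS(SᵀBS)⁻¹SᵀB + Y(YᵀS)⁻¹Yᵀ, which satisfies B₊S = Y for all stored difference vectors. As the abstract notes, B₊ need not remain symmetric positive definite for general functions (YᵀS need not be symmetric then); no safeguard against that is included here.

import numpy as np

def block_bfgs_update(B, S, Y):
    """Block BFGS update from S = [s_1 ... s_m] and Y = [y_1 ... y_m]."""
    BS = B @ S
    term1 = BS @ np.linalg.solve(S.T @ BS, BS.T)   # B S (S^T B S)^{-1} S^T B
    term2 = Y @ np.linalg.solve(Y.T @ S, Y.T)      # Y (Y^T S)^{-1} Y^T
    return B - term1 + term2

# Quick check on a quadratic: the update reproduces all secant pairs at once.
rng = np.random.default_rng(0)
n, m = 8, 3
A = rng.standard_normal((n, n)); A = A @ A.T + n * np.eye(n)   # SPD "Hessian"
S = rng.standard_normal((n, m))
Y = A @ S                                                      # y_i = A s_i
B_plus = block_bfgs_update(np.eye(n), S, Y)
assert np.allclose(B_plus @ S, Y)                              # quasi-Newton conditions hold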