Page 1

Displaying 1 – 13 of 13

A modified limited-memory BNS method for unconstrained minimization derived from the conjugate directions idea

Jan Vlček, Ladislav Lukšan (2015)

Programs and Algorithms of Numerical Mathematics

A modification of the limited-memory variable metric BNS method for large scale unconstrained optimization of the differentiable function f : ℝ^N → ℝ is considered, which consists in corrections (based on the idea of conjugate directions) of difference vectors for better satisfaction of the previous quasi-Newton conditions. In comparison with [11], more previous iterations can be utilized here. For quadratic objective functions, the improvement of convergence is the best one in some sense, all stored corrected...
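The quasi-Newton conditions mentioned in this abstract require the updated inverse-Hessian approximation to map each gradient-difference vector onto the corresponding step. As a minimal illustration of that secant condition (a generic BFGS update sketch, not the BNS correction scheme of the paper):

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """One BFGS update of an inverse-Hessian approximation H.

    s = x_{k+1} - x_k (step), y = g_{k+1} - g_k (gradient difference).
    Requires the curvature condition y @ s > 0.
    """
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# The update reproduces the most recent secant condition H_new @ y == s.
rng = np.random.default_rng(0)
s = rng.standard_normal(5)
y = s + 0.1 * rng.standard_normal(5)   # y @ s > 0 for this construction
H_new = bfgs_inverse_update(np.eye(5), s, y)
print(np.allclose(H_new @ y, s))       # True
```

Limited-memory variants such as BNS never form H explicitly; they store only the recent (s, y) pairs, which is why "better satisfaction of the previous quasi-Newton conditions" is a meaningful correction target.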

A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization

Yongjin Kim, Yunchol Jong, Yong Kim (2024)

Applications of Mathematics

Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. Based on the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method, the conjugate gradient algorithms CG-DESCENT and CGOPT were proposed by W. Hager and H. Zhang (2005) and by Y. Dai and C. Kou (2013), respectively. It is noted that these two conjugate gradient methods perform more efficiently than the SSML-BFGS method. Therefore,...
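The matrix-free property named in the abstract can be seen in a generic nonlinear conjugate gradient iteration, which stores only a few vectors. The sketch below uses a Polak-Ribière+ direction with a backtracking Armijo line search; it is a hypothetical minimal example, not CG-DESCENT, CGOPT, or the method proposed in the paper:

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    """Generic nonlinear CG with PR+ beta and Armijo backtracking.

    Only vectors (x, g, d) are stored -- no n-by-n matrices.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search along d.
        alpha, fx, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere+ coefficient (restart to steepest descent if negative).
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Convex quadratic test problem: f(x) = 0.5 x^T A x - b^T x, minimizer A x = b.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, -2.0, 3.0])
x_star = nonlinear_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                      lambda x: A @ x - b, np.zeros(3))
print(np.allclose(A @ x_star, b, atol=1e-5))  # True
```

Self-scaling memoryless BFGS methods occupy the same storage regime: the inverse-Hessian approximation is rebuilt each iteration from the latest step and gradient difference instead of being stored.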

Adjustment of the scaling parameter of Dai-Kou type conjugate gradient methods with application to motion control

Mahbube Akbari, Saeed Nezhadhosein, Aghile Heydari (2024)

Applications of Mathematics

We introduce a new scaling parameter for the Dai-Kou family of conjugate gradient algorithms (2013), which is one of the most numerically efficient methods for unconstrained optimization. The suggested parameter is based on eigenvalue analysis of the search direction matrix and minimizing the measure function defined by Dennis and Wolkowicz (1993). The corresponding search direction of conjugate gradient method has the sufficient descent property and the extended conjugacy condition. The global...

An accurate active set Newton algorithm for large scale bound constrained optimization

Li Sun, Guoping He, Yongli Wang, Changyin Zhou (2011)

Applications of Mathematics

A new algorithm for solving large scale bound constrained minimization problems is proposed. The algorithm is based on an accurate identification technique of the active set proposed by Facchinei, Fischer and Kanzow in 1998. A further division of the active set yields the global convergence of the new algorithm. In particular, the convergence rate is superlinear without requiring the strict complementarity assumption. Numerical tests demonstrate the efficiency and performance of the present strategy...

An active set strategy based on the multiplier function or the gradient

Li Sun, Liang Fang, Guoping He (2010)

Applications of Mathematics

We employ the active set strategy proposed by Facchinei for solving large scale bound constrained optimization problems. Owing to the special structure of the bound constrained problem, a simple rule is used for updating the multipliers. Numerical results show that the active set identification strategy is practical and efficient.
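To illustrate the flavor of active set identification for bound constrained problems (a generic textbook-style estimate based on the KKT sign conditions, not the multiplier-function rule of these papers), one can flag a bound as active when the iterate sits at the bound and the gradient sign is consistent with optimality there:

```python
import numpy as np

def estimate_active_set(x, g, lower, upper, eps=1e-8):
    """Estimate active bounds for min f(x) s.t. lower <= x <= upper.

    At a KKT point: x_i = lower_i implies g_i >= 0, and
                    x_i = upper_i implies g_i <= 0.
    """
    active_lower = (x - lower <= eps) & (g > 0)
    active_upper = (upper - x <= eps) & (g < 0)
    return active_lower, active_upper

# Example: minimize ||x - c||^2 on the box [0, 1]^3 with c outside the box.
c = np.array([-1.0, 0.5, 2.0])
x = np.clip(c, 0.0, 1.0)          # projected minimizer: [0.0, 0.5, 1.0]
g = 2.0 * (x - c)                 # gradient at x: [2.0, 0.0, -2.0]
lo_act, up_act = estimate_active_set(x, g, np.zeros(3), np.ones(3))
print(lo_act.tolist(), up_act.tolist())
# [True, False, False] [False, False, True]
```

Once the active indices are estimated, the remaining free variables can be optimized as an unconstrained subproblem, which is the structural idea these active set methods exploit.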
