Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization.
Andrei, Neculai (2011)
Bulletin of the Malaysian Mathematical Sciences Society. Second Series
Similarity:
Yongjin Kim, Yunchol Jong, Yong Kim (2024)
Applications of Mathematics
Similarity:
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require matrix storage. Based on the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method, the new conjugate gradient algorithms CG-DESCENT and CGOPT were proposed by W. Hager and H. Zhang (2005) and by Y. Dai and C. Kou (2013), respectively. These two conjugate gradient methods are observed to perform more efficiently than the SSML-BFGS method....
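To illustrate the matrix-free character the abstract emphasizes, a generic nonlinear conjugate gradient iteration can be sketched as follows. This is only a minimal sketch using a Polak-Ribiere+ beta with Armijo backtracking; the actual beta formulas of CG-DESCENT and CGOPT are more involved, and all names here are illustrative:

```python
import numpy as np

def nonlinear_cg(f, grad, x0, iters=500, tol=1e-6):
    """Minimal nonlinear CG sketch: Polak-Ribiere+ beta, Armijo backtracking.
    Only vectors are stored -- no matrices."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:          # safeguard: restart with steepest descent
            d = -g
        t, fx, gTd = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * gTd:  # Armijo condition
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PR+ formula
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example on a separable quadratic with minimizer (1, -2)
f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
x_star = nonlinear_cg(f, grad, [0.0, 0.0])
```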
Zainab Hassan Ahmed, Mohamed Hbaib, Khalil K. Abbo (2024)
Applications of Mathematics
Similarity:
The Fletcher-Reeves (FR) method is widely recognized for its drawbacks, such as generating unfavorable directions and taking small steps, which can lead to further poor directions and steps. To address this issue, we propose a modification of the FR method and then develop it into a three-term conjugate gradient method in this paper. The suggested methods, named "HZF" and "THZF", preserve the descent property of the FR method while mitigating its drawbacks. The algorithms...
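The three-term idea can be made concrete with a known construction of Zhang-Zhou-Li type (a stand-in here, not the paper's HZF/THZF formulas, which the snippet does not give): a third term is added to the search direction so that the sufficient descent condition g^T d = -||g||^2 holds regardless of the line search. A small numerical check with random vectors standing in for the CG quantities:

```python
import numpy as np

# Illustrative three-term direction of Zhang-Zhou-Li (PRP-based) type;
# the paper's HZF/THZF formulas differ, this only shows the mechanism.
rng = np.random.default_rng(0)
g_prev = rng.normal(size=5)   # previous gradient g_k
g = rng.normal(size=5)        # current gradient g_{k+1}
d_prev = rng.normal(size=5)   # previous direction d_k
y = g - g_prev

beta = (g @ y) / (g_prev @ g_prev)       # PRP beta
theta = (g @ d_prev) / (g_prev @ g_prev)
d = -g + beta * d_prev - theta * y       # three-term direction

# The third term cancels the beta term exactly in g^T d, so the
# descent property g^T d = -||g||^2 holds for any line search.
assert np.isclose(g @ d, -(g @ g))
```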
Vlček, Jan, Lukšan, Ladislav
Similarity:
To improve the performance of the L-BFGS method for large-scale unconstrained optimization, repeating some BFGS updates was proposed, e.g., in [1]. Since this can be time consuming, the extra updates need to be selected carefully. We show that groups of these updates can be repeated infinitely many times under some conditions, without a noticeable increase in computational time; the limit update is a block BFGS update [17]. It can be obtained by solving a Lyapunov matrix equation...
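For context, the L-BFGS updates discussed above are usually applied implicitly through the standard two-loop recursion over the stored difference pairs (s_i, y_i). A minimal sketch of that recursion (illustrative only; it is not the block update of [17]):

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns -H*g, where H is the
    inverse-Hessian approximation built from the stored difference pairs
    s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i."""
    q = g.astype(float).copy()
    alphas, rhos = [], []
    for s, y in zip(reversed(s_list), reversed(y_list)):  # newest to oldest
        rho = 1.0 / (y @ s)
        alpha = rho * (s @ q)
        q -= alpha * y
        alphas.append(alpha)
        rhos.append(rho)
    if s_list:  # common initial scaling H0 = (s^T y / y^T y) * I
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), alpha, rho in zip(zip(s_list, y_list),
                                  reversed(alphas), reversed(rhos)):
        beta = rho * (y @ q)
        q += (alpha - beta) * s
    return -q

# For f(x) = x1^2 + 2.5*x2^2 (Hessian A = diag(2, 5)), unit-coordinate
# pairs recover the exact Newton direction -A^{-1} g
s_list = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
y_list = [np.array([2.0, 0.0]), np.array([0.0, 5.0])]
d = lbfgs_direction(np.array([2.0, 5.0]), s_list, y_list)
```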
Liu, Hailin, Cheng, Sui Sun, Li, Xiaoyong (2011)
Applied Mathematics E-Notes [electronic only]
Similarity:
Ahmad Kamandi, Keyvan Amini (2022)
Applications of Mathematics
Similarity:
We propose a new and efficient nonmonotone adaptive trust region algorithm for solving unconstrained optimization problems. The algorithm incorporates two novelties: it uses a radius-dependent shrinkage parameter for adjusting the trust region radius, which avoids undesirable directions, and it exploits a new strategy to prevent sudden increases of the objective function values in nonmonotone trust region techniques. Global convergence of this algorithm is investigated under some mild...
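The acceptance rule being modified in this line of work is the standard trust-region ratio test. A minimal monotone sketch with a Cauchy-point step, under the usual textbook constants (the paper's nonmonotone test and radius-dependent shrinkage parameter are not reproduced, and all names are illustrative):

```python
import numpy as np

def trust_region_step(f, g, B, x, delta, eta=0.1):
    """One standard (monotone) trust-region step using the Cauchy point.
    Sketch only: the nonmonotone test and radius-dependent shrinkage
    parameter of the abstract are not reproduced here."""
    gnorm = np.linalg.norm(g)
    gBg = g @ B @ g
    # Cauchy point: minimizer of the quadratic model along -g in the ball
    tau = min(1.0, gnorm ** 3 / (delta * gBg)) if gBg > 0 else 1.0
    p = -tau * (delta / gnorm) * g
    pred = -(g @ p + 0.5 * p @ B @ p)        # predicted reduction (> 0)
    rho = (f(x) - f(x + p)) / pred           # actual vs. predicted reduction
    if rho < 0.25:
        delta *= 0.25                        # poor model agreement: shrink
    elif rho > 0.75 and np.isclose(np.linalg.norm(p), delta):
        delta *= 2.0                         # very good agreement: expand
    x_new = x + p if rho > eta else x        # accept or reject the step
    return x_new, delta

# Example: exact quadratic model, so rho = 1 and the step is accepted
A = np.eye(2)
quad = lambda x: 0.5 * x @ A @ x
x0 = np.array([1.0, 0.0])
x1, new_delta = trust_region_step(quad, A @ x0, A, x0, delta=2.0)
```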
Yanqin Xue, Hongwei Liu, Zexian Liu (2019)
Applications of Mathematics
Similarity:
Trust region methods are a class of effective iterative schemes in numerical optimization. In this paper, a new improved nonmonotone adaptive trust region method for solving unconstrained optimization problems is proposed. We construct an approximate model in which the approximation to the Hessian matrix is updated by the scaled memoryless BFGS update formula, and we incorporate a nonmonotone technique with the newly proposed adaptive trust region radius. The new ratio for adjusting the next trust...
Yuan, Gonglin (2009)
International Journal of Mathematics and Mathematical Sciences
Similarity:
Zhang, Ming-Liang, Xiao, Yun-Hai, Zhou, Dangzhen (2010)
Mathematical Problems in Engineering
Similarity:
Vlček, Jan, Lukšan, Ladislav
Similarity:
Simple modifications of the limited-memory BFGS method (L-BFGS) for large-scale unconstrained optimization are considered, consisting of corrections to the difference vectors used (derived from the idea of conjugate directions) that utilize information from the preceding iteration. For quadratic objective functions, the improvement in convergence is in some sense the best possible, and all stored difference vectors are conjugate for unit stepsizes. The algorithm is globally convergent for...
Chleboun, Jan
Similarity: