Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization.
Andrei, Neculai (2011)
Bulletin of the Malaysian Mathematical Sciences Society. Second Series
Similarity:
Ahmad Kamandi, Keyvan Amini (2022)
Applications of Mathematics
Similarity:
We propose a new and efficient nonmonotone adaptive trust region algorithm for solving unconstrained optimization problems. The algorithm incorporates two novelties: a radius-dependent shrinkage parameter for adjusting the trust region radius, which avoids undesirable directions, and a new strategy that prevents sudden increases of the objective function values in nonmonotone trust region techniques. Global convergence of this algorithm is investigated under some mild...
Yongjin Kim, Yunchol Jong, Yong Kim (2024)
Applications of Mathematics
Similarity:
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. Based on the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method, the new conjugate gradient algorithms CG-DESCENT and CGOPT were proposed by W. Hager and H. Zhang (2005) and by Y. Dai and C. Kou (2013), respectively. It is noted that these two conjugate gradient methods perform more efficiently than the SSML-BFGS method....
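The storage advantage mentioned in this abstract comes from the fact that a nonlinear conjugate gradient iteration combines only a few vectors. The following is a minimal illustrative sketch of such an iteration using the Hager-Zhang (2005) conjugacy parameter with a simple Armijo backtracking line search; it is a reconstruction under our own naming, not the CG-DESCENT or CGOPT implementation, which use more sophisticated Wolfe-condition line searches.

```python
import numpy as np

def cg_descent(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the Hager-Zhang beta (illustrative sketch).

    Only a handful of vectors are stored -- no Hessian or matrix
    updates -- which is why CG methods suit large-scale problems.
    """
    x = np.asarray(x0, float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (a stand-in for the
        # Wolfe-condition searches used in practice).
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * g.dot(d) and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d.dot(y)
        # Hager-Zhang conjugacy parameter (2005).
        beta = (y - 2.0 * d * (y.dot(y) / dy)).dot(g_new) / dy
        d = -g_new + beta * d
        if d.dot(g_new) >= 0:   # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

As a usage sketch, minimizing the convex quadratic f(x) = x'Ax/2 - b'x with A = diag(1, 10) converges to A^{-1}b without ever forming or storing a matrix update.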
Liu, Hailin, Cheng, Sui Sun, Li, Xiaoyong (2011)
Applied Mathematics E-Notes [electronic only]
Similarity:
Yuan, Gonglin (2009)
International Journal of Mathematics and Mathematical Sciences
Similarity:
Mahbube Akbari, Saeed Nezhadhosein, Aghile Heydari (2024)
Applications of Mathematics
Similarity:
We introduce a new scaling parameter for the Dai-Kou family of conjugate gradient algorithms (2013), one of the most numerically efficient classes of methods for unconstrained optimization. The suggested parameter is based on an eigenvalue analysis of the search direction matrix and on minimizing the measure function defined by Dennis and Wolkowicz (1993). The corresponding search direction of the conjugate gradient method has the sufficient descent property and satisfies the extended conjugacy condition....
Morteza Kimiaei, Majid Rostami (2016)
Kybernetika
Similarity:
Image denoising is a fundamental problem in image processing. In this paper, we present a two-phase scheme for impulse noise removal. In the first phase, noise candidates are identified by the adaptive median filter (AMF) for salt-and-pepper noise. In the second phase, a new hybrid conjugate gradient method is used to minimize an edge-preserving regularization functional. The second phase of our algorithm inherits the advantages of both the Dai-Yuan (DY) and Hager-Zhang (HZ) conjugate...
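In two-phase schemes of this kind, phase one flags noise candidates before any optimization runs; only the flagged pixels enter the regularization functional minimized in phase two. The sketch below is a simplified stand-in for the detection phase (a fixed 3x3 median test for extreme pixel values, rather than a true adaptive median filter with a growing window; the function name is ours):

```python
import numpy as np

def noise_candidates(img, threshold=0.0):
    """Flag likely salt-and-pepper pixels (simplified AMF stand-in).

    A pixel is flagged when it holds an extreme value (0 or 255)
    and differs from the median of its 3x3 neighborhood by more
    than `threshold`.
    """
    padded = np.pad(img, 1, mode='edge')
    h, w = img.shape
    # Gather the 3x3 neighborhood of every pixel as a last axis.
    windows = np.stack([padded[i:i + h, j:j + w]
                        for i in range(3) for j in range(3)], axis=-1)
    med = np.median(windows, axis=-1)
    extreme = (img == 0) | (img == 255)
    return extreme & (np.abs(img.astype(float) - med) > threshold)
```

A true AMF enlarges the window around a pixel until the median itself is not an extreme value, which handles high noise densities better than this fixed-window version.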
Zhang, Ming-Liang, Xiao, Yun-Hai, Zhou, Dangzhen (2010)
Mathematical Problems in Engineering
Similarity:
Roman Knobloch, Jaroslav Mlýnek, Radek Srb (2017)
Applications of Mathematics
Similarity:
Differential evolution algorithms represent an up-to-date and efficient way of solving complicated optimization tasks. In this article we concentrate on the ability of differential evolution algorithms to attain the global minimum of the cost function. We demonstrate that, although often declared a global optimizer, the classic differential evolution algorithm does not in general guarantee convergence to the global minimum. To remedy this weakness we design a simple modification...
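For concreteness, the classic scheme being critiqued (DE/rand/1/bin) can be sketched as follows. This is the baseline algorithm only, not the authors' convergence-improving modification, and the parameter choices are illustrative defaults:

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Classic DE/rand/1/bin (illustrative sketch).

    As the article points out, this basic form does not guarantee
    convergence to the global minimum.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    dim = lo.size
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct other population members.
            idx = rng.choice([j for j in range(pop_size) if j != i],
                             3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover, forcing at least one mutant component.
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            # Greedy selection: keep the better of trial and parent.
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

On a unimodal test function such as the sphere, this scheme reliably finds the minimum; the failure mode the article analyzes arises on multimodal cost functions, where the population can stagnate in a local basin.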
Vlček, Jan, Lukšan, Ladislav
Similarity:
To improve the performance of the L-BFGS method for large-scale unconstrained optimization, repeating some BFGS updates was proposed, e.g., in [1]. Since this can be time consuming, the extra updates need to be selected carefully. We show that groups of these updates can be repeated infinitely many times under some conditions, without a noticeable increase in computational time; the limit update is a block BFGS update [17]. It can be obtained by solving a Lyapunov matrix equation...
Fridrich Sloboda (1982)
Aplikace matematiky
Similarity:
A new biorthogonalization algorithm is defined which does not depend on the step size used. The algorithm is designed to minimize the total error after steps when imperfect steps are used. The majority of conjugate gradient algorithms are sensitive to the exactness of the line searches, and this phenomenon may destroy the global efficiency of these algorithms.
Yanqin Xue, Hongwei Liu, Zexian Liu (2019)
Applications of Mathematics
Similarity:
Trust region methods are a class of effective iterative schemes in numerical optimization. In this paper, a new improved nonmonotone adaptive trust region method for solving unconstrained optimization problems is proposed. We construct an approximate model in which the approximation to the Hessian matrix is updated by the scaled memoryless BFGS update formula, and we incorporate a nonmonotone technique with the newly proposed adaptive trust region radius. The new ratio for adjusting the next trust...
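For context, the monotone textbook trust region loop that such methods extend can be sketched as below. This is a generic Cauchy-point version with the standard radius update; the article's method replaces the exact Hessian with a scaled memoryless BFGS model and uses a nonmonotone acceptance ratio, neither of which is reproduced here.

```python
import numpy as np

def trust_region(f, grad, hess, x0, delta=1.0, delta_max=10.0,
                 eta=0.1, tol=1e-6, max_iter=200):
    """Basic monotone trust region loop with a Cauchy-point step."""
    x = np.asarray(x0, float)
    for _ in range(max_iter):
        g, B = grad(x), hess(x)
        gn = np.linalg.norm(g)
        if gn < tol:
            break
        # Cauchy point: minimize the quadratic model along -g
        # inside the ball of radius delta.
        gBg = g.dot(B.dot(g))
        t = delta / gn if gBg <= 0 else min(gn ** 2 / gBg, delta / gn)
        s = -t * g
        pred = -(g.dot(s) + 0.5 * s.dot(B.dot(s)))  # predicted reduction
        rho = (f(x) - f(x + s)) / pred              # actual / predicted
        if rho < 0.25:                 # poor model agreement: shrink
            delta *= 0.25
        elif rho > 0.75 and np.isclose(np.linalg.norm(s), delta):
            delta = min(2 * delta, delta_max)       # good step: expand
        if rho > eta:                  # accept the trial step
            x = x + s
    return x
```

The acceptance test `rho > eta` is where nonmonotone variants differ: they compare the trial value against a maximum (or weighted average) of recent function values instead of f(x) alone, allowing occasional uphill steps.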
Nedeljko Ostojić, Dušan Starčević (2000)
The Yugoslav Journal of Operations Research
Similarity: