Displaying similar documents to “An application of the averaged gradient technique”

A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization

Yongjin Kim, Yunchol Jong, Yong Kim (2024)

Applications of Mathematics

Similarity:

Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require storing matrices. Based on the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method, the conjugate gradient algorithms CG-DESCENT and CGOPT were proposed by W. Hager, H. Zhang (2005) and Y. Dai, C. Kou (2013), respectively. It is noted that these two conjugate gradient methods perform more efficiently than the SSML-BFGS method....

A modified Fletcher-Reeves conjugate gradient method for unconstrained optimization with applications in image restoration

Zainab Hassan Ahmed, Mohamed Hbaib, Khalil K. Abbo (2024)

Applications of Mathematics

Similarity:

The Fletcher-Reeves (FR) method is widely recognized for its drawbacks, such as generating unfavorable directions and taking small steps, which can lead to subsequent poor directions and steps. To address this issue, we propose a modification to the FR method, and then we develop it into the three-term conjugate gradient method in this paper. The suggested methods, named "HZF" and "THZF", preserve the descent property of the FR method while mitigating the drawbacks. The algorithms...
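The classical Fletcher-Reeves iteration that the abstract above modifies can be sketched in a few lines. This is a minimal illustration, not the paper's "HZF"/"THZF" methods: it uses the standard FR update beta = ||g_{k+1}||^2 / ||g_k||^2 with a simple backtracking Armijo line search, and the test problem (a small convex quadratic) is an assumption chosen for the demo.

```python
import numpy as np

def fletcher_reeves(f, grad, x0, max_iter=200, tol=1e-8):
    """Minimal Fletcher-Reeves nonlinear conjugate gradient sketch.

    `f` and `grad` are user-supplied callables; the line search is a
    plain backtracking Armijo rule, much weaker than the Wolfe-type
    searches used in the literature.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking until the Armijo sufficient-decrease condition holds.
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves update: beta = ||g_{k+1}||^2 / ||g_k||^2.
        beta = (g_new @ g_new) / (g @ g)
        d = -g_new + beta * d  # no matrix storage needed
        x, g = x_new, g_new
    return x

# Demo: minimize f(x) = 0.5 x^T A x - b^T x (minimizer solves A x = b).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = fletcher_reeves(f, grad, np.zeros(2))
```

On this well-conditioned quadratic the iterates converge to the solution of A x = b; the small-step behavior criticized in the abstract shows up on badly scaled problems, which is what the proposed modifications target.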

Impulse noise removal based on new hybrid conjugate gradient approach

Morteza Kimiaei, Majid Rostami (2016)

Kybernetika

Similarity:

Image denoising is a fundamental problem in image processing. In this paper, we present a two-phase scheme for impulse noise removal. In the first phase, noise candidates are identified by the adaptive median filter (AMF) for salt-and-pepper noise. In the second phase, a new hybrid conjugate gradient method is used to minimize an edge-preserving regularization functional. The second phase of our algorithm inherits advantages of both Dai-Yuan (DY) and Hager-Zhang (HZ) conjugate...
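The two-phase structure described above can be illustrated with a small sketch. Phase 1 below flags salt-and-pepper candidates by their extreme intensities (a crude stand-in for the adaptive median filter), and phase 2 replaces each flagged pixel with the median of its unflagged neighbors; the paper's actual second phase minimizes an edge-preserving regularization functional with a hybrid DY/HZ conjugate gradient method, which this simplification does not reproduce.

```python
import numpy as np

def remove_impulse_noise(img, lo=0, hi=255):
    """Two-phase impulse-noise removal sketch.

    Phase 1: mark pixels at the intensity extremes as noise candidates.
    Phase 2: restore each candidate from the median of its unflagged
    3x3 neighbours (a simple substitute for the edge-preserving CG
    minimization used in the paper).
    """
    img = np.asarray(img, dtype=float)
    noisy = (img == lo) | (img == hi)        # phase 1: detection
    out = img.copy()
    pad = np.pad(out, 1, mode="edge")
    ok = np.pad(~noisy, 1, mode="edge")      # True where pixel is trusted
    for i, j in zip(*np.nonzero(noisy)):     # phase 2: restoration
        win = pad[i:i + 3, j:j + 3]
        trusted = ok[i:i + 3, j:j + 3]
        if trusted.any():
            out[i, j] = np.median(win[trusted])
    return out

# Demo: a flat 100-valued patch corrupted by a single "salt" pixel.
img = np.full((5, 5), 100.0)
img[2, 2] = 255
restored = remove_impulse_noise(img)
```

Only the flagged pixel is touched, so uncorrupted detail is preserved by construction, which is the main appeal of detection-then-restoration schemes over filtering the whole image.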

Adjustment of the scaling parameter of Dai-Kou type conjugate gradient methods with application to motion control

Mahbube Akbari, Saeed Nezhadhosein, Aghile Heydari (2024)

Applications of Mathematics

Similarity:

We introduce a new scaling parameter for the Dai-Kou family of conjugate gradient algorithms (2013), which is one of the most numerically efficient methods for unconstrained optimization. The suggested parameter is based on an eigenvalue analysis of the search direction matrix and on minimizing the measure function defined by Dennis and Wolkowicz (1993). The corresponding search direction of the conjugate gradient method has the sufficient descent property and satisfies the extended conjugacy condition....

Gradient-free and gradient-based methods for shape optimization of water turbine blade

Bohumír Bastl, Marek Brandner, Jiří Egermaier, Hana Horníková, Kristýna Michálková, Eva Turnerová

Similarity:

The purpose of our work is to develop an automatic shape optimization tool for runner wheel blades in reaction water turbines, especially in Kaplan turbines. The fluid flow is simulated using an in-house incompressible turbulent flow solver based on recently introduced isogeometric analysis (see e.g. J. A. Cottrell et al.: Isogeometric Analysis: Toward Integration of CAD and FEA, Wiley, 2009). The proposed automatic shape optimization approach is based on a so-called hybrid optimization...