
New hybrid conjugate gradient method for nonlinear optimization with application to image restoration problems

Youcef Elhamam Hemici, Samia Khelladi, Djamel Benterki (2024)

Kybernetika

The conjugate gradient method is one of the most effective algorithms for unconstrained nonlinear optimization problems, owing to its low memory requirements and simple structure. These properties motivate us to propose a new hybrid conjugate gradient method based on a convex combination of β_k^RMIL and β_k^HS. We compute the convex parameter θ_k using the Newton direction. Global convergence is established under the strong Wolfe conditions. Numerical experiments show the...
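The hybrid scheme sketched in the abstract can be illustrated with a minimal nonlinear CG loop. This is an assumption-laden sketch, not the paper's method: the mixing parameter theta is held fixed (the paper computes θ_k from the Newton direction), an Armijo backtracking line search stands in for the strong Wolfe conditions, and the quadratic test function is chosen only for illustration.

```python
import numpy as np

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=500):
    """Illustrative hybrid nonlinear CG: beta_k is the convex combination
    theta * beta_RMIL + (1 - theta) * beta_HS.  Fixed theta and Armijo
    backtracking are simplifying assumptions, not the paper's choices."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (sufficient decrease condition)
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta_rmil = (g_new @ y) / (d @ d)          # RMIL formula
        denom = d @ y
        beta_hs = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0  # HS formula
        beta = theta * beta_rmil + (1 - theta) * beta_hs
        d = -g_new + beta * d
        if g_new @ d >= 0:
            d = -g_new                              # restart if not a descent direction
        x, g = x_new, g_new
    return x

# Minimize a simple strongly convex quadratic with minimizer (1, -2)
x_min = hybrid_cg(lambda x: (x[0] - 1)**2 + 10 * (x[1] + 2)**2,
                  lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)]),
                  np.zeros(2))
```

The restart safeguard is a common practical device for hybrid β formulas: whenever the combined direction fails to be a descent direction, the iteration falls back to steepest descent.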

New regions of stability in input optimization

Sheng Huang, Sanjo Zlobec (1988)

Aplikace matematiky

Using point-to-set mappings, we identify two new regions of stability in input optimization. We then extend various results from the literature on optimality conditions, continuity of Lagrange multipliers, and the marginal value formula over the new and some old regions of stability.

New technique for solving univariate global optimization

Djamel Aaid, Amel Noui, Mohand Ouanes (2017)

Archivum Mathematicum

In this paper, a new global optimization method is proposed for problems with a twice differentiable objective function of a single variable subject to a box constraint. The method employs the difference of a linear interpolant of the objective and a concave function, which yields a continuous, piecewise convex quadratic underestimator. The main objective of this research is to compute a lower bound that does not require an iterative local optimizer. The proposed method...
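The idea of a closed-form lower bound from a convex quadratic underestimator can be sketched as follows. This is a generic illustration, not the paper's exact construction: it assumes an a priori bound f'' ≤ K (with K > 0) on the box [a, b], under which the linear interpolant shifted by (K/2)(x−a)(x−b) underestimates f, and its minimum is available in closed form with no iterative local optimizer.

```python
import math

def quadratic_lower_bound(f, a, b, K):
    """Lower bound of f on [a, b] via the convex quadratic underestimator
    q(x) = L(x) + (K/2)(x - a)(x - b), valid when f'' <= K with K > 0
    (L is the linear interpolant of f through the endpoints).
    Illustrative sketch, not the paper's exact scheme."""
    fa, fb = f(a), f(b)
    slope = (fb - fa) / (b - a)
    # q'(x) = slope + (K/2)(2x - a - b) = 0  =>  x* = (a + b)/2 - slope/K
    xs = (a + b) / 2.0 - slope / K
    x_star = min(max(xs, a), b)          # clip stationary point to the box
    return fa + slope * (x_star - a) + 0.5 * K * (x_star - a) * (x_star - b)

# sin'' = -sin <= 1 on [0, pi], so K = 1 is a valid curvature bound
lb = quadratic_lower_bound(math.sin, 0.0, math.pi, 1.0)
```

Because the bound is the exact minimum of a convex quadratic on an interval, evaluating it costs two function evaluations and a formula, which is what makes such underestimators attractive inside branch-and-bound loops.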

Newton methods for solving two classes of nonsmooth equations

Yan Gao (2001)

Applications of Mathematics

The paper is devoted to two systems of nonsmooth equations. One is the system of equations of max-type functions and the other is the system of equations of smooth compositions of max-type functions. The Newton and approximate Newton methods for these two systems are proposed. The Q-superlinear convergence of the Newton methods and the Q-linear convergence of the approximate Newton methods are established. The present methods can be more easily implemented than the previous ones, since they do not...
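For equations built from max-type functions, a Newton-type step can be formed by linearizing an active (maximizing) piece at the current iterate. The one-dimensional sketch below is an illustrative assumption, not the paper's algorithm: the paper treats systems of such equations (and smooth compositions of max-type functions), while this sketch shows only the active-piece linearization idea, with example functions chosen for demonstration.

```python
def maxtype_newton(funcs, derivs, x0, tol=1e-10, max_iter=50):
    """Newton-type method for the scalar max-type equation
    F(x) = max_i f_i(x) = 0: at each iterate, take a standard Newton
    step using the derivative of an active (maximizing) piece.
    Illustrative 1-D sketch only."""
    x = x0
    for _ in range(max_iter):
        vals = [f(x) for f in funcs]
        F = max(vals)
        if abs(F) < tol:
            break
        i = vals.index(F)            # pick an active index
        x = x - F / derivs[i](x)     # Newton step with the active derivative
    return x

# Example: F(x) = max(x^2 - 4, x - 1) has the root x = 1
root = maxtype_newton([lambda x: x**2 - 4, lambda x: x - 1],
                      [lambda x: 2 * x, lambda x: 1.0],
                      0.5)
```

Selecting any maximizing index yields an element of the generalized derivative at points where the pieces tie, which is why no smoothing or subgradient machinery is needed in this simple setting.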

Nonlinear conjugate gradient methods

Lukšan, Ladislav, Vlček, Jan (2015)

Programs and Algorithms of Numerical Mathematics

Modifications of the nonlinear conjugate gradient method are described and tested.

Nonlinear Rescaling Method and Self-concordant Functions

Richard Andrášik (2013)

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

Nonlinear rescaling is a tool for solving large-scale nonlinear programming problems. The primal-dual nonlinear rescaling method was used to solve two quadratic programming problems with quadratic constraints. Based on the performance of the primal-dual nonlinear rescaling method on these test problems, conclusions about the choice of parameters are drawn. Next, the connection between nonlinear rescaling methods and self-concordant functions is discussed, and a modified logarithmic barrier function is...

Nonsmooth equation method for nonlinear nonconvex optimization

Lukšan, Ladislav, Matonoha, Ctirad, Vlček, Jan (2025)

Programs and Algorithms of Numerical Mathematics

The contribution deals with the description of two nonsmooth equation methods for inequality constrained mathematical programming problems. Three algorithms are presented and their efficiency is demonstrated by numerical experiments.
