Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization.
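As a rough illustration of the title's topic, the following is a minimal sketch (not the paper's method) of a Polak-Ribière+ nonlinear conjugate gradient iteration with a sufficient descent safeguard: whenever the new direction d fails the test g·d ≤ -c‖g‖², the method restarts with steepest descent. The constant c, the Armijo backtracking line search, and the test problem are all assumptions for illustration.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Polak-Ribiere+ nonlinear CG with a sufficient-descent safeguard (sketch)."""
    c = 1e-4                      # sufficient descent constant (assumed)
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # backtracking Armijo line search along the descent direction d
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * g.dot(d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)   # PR+ formula
        d = -g_new + beta * d
        # enforce sufficient descent: restart with steepest descent if violated
        if g_new.dot(d) > -c * g_new.dot(g_new):
            d = -g_new
        x, g = x_new, g_new
    return x

# usage: minimize the convex quadratic 0.5 x'Ax - b'x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A @ x) - b.dot(x)
grad = lambda x: A @ x - b
x_star = nonlinear_cg(f, grad, np.zeros(2))
```

On this well-conditioned quadratic the iterates converge to the solution of Ax = b; the restart safeguard is what guarantees every direction used by the line search is a descent direction.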
Nonlinear rescaling is a tool for solving large-scale nonlinear programming problems. The primal-dual nonlinear rescaling method was used to solve two quadratic programming problems with quadratic constraints. Based on its performance on the test problems, conclusions about parameter settings are drawn. Next, the connection between nonlinear rescaling methods and self-concordant functions is discussed, and a modified logarithmic barrier function is...
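For background, a classical (unmodified) logarithmic barrier scheme of the kind the abstract builds on can be sketched in one dimension: the constrained problem min (x-2)² subject to x ≤ 1 is replaced by a sequence of unconstrained minimizations of (x-2)² - μ·log(1-x) with μ → 0. The problem, the shrink factor, and the Newton inner solver are illustrative assumptions, not the paper's modified barrier.

```python
def log_barrier_1d(mu0=1.0, shrink=0.2, n_outer=12):
    """Classical log-barrier sketch for min (x-2)^2 s.t. x <= 1.

    Each outer iteration minimizes the barrier function
    (x-2)^2 - mu*log(1-x) by Newton's method, then shrinks mu;
    the barrier minimizers approach the constrained solution x* = 1.
    """
    x, mu = 0.0, mu0
    for _ in range(n_outer):
        for _ in range(50):                 # inner Newton iterations
            g = 2 * (x - 2) + mu / (1 - x)  # barrier gradient
            h = 2 + mu / (1 - x) ** 2       # barrier Hessian (always > 0)
            step = g / h
            while x - step >= 1:            # damp to stay strictly feasible
                step *= 0.5
            x -= step
            if abs(g) < 1e-10:
                break
        mu *= shrink
    return x

x_approx = log_barrier_1d()
```

The barrier minimizer sits at roughly x ≈ 1 - μ/2, so after twelve reductions of μ the iterate is within about 10⁻⁹ of the constrained optimum while remaining strictly feasible throughout.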
This contribution describes two nonsmooth equation methods for inequality-constrained mathematical programming problems. Three algorithms are presented, and their efficiency is demonstrated by numerical experiments.
In this paper, multidimensional nonsmooth, nonconvex problems of the calculus of variations with a codifferentiable integrand are studied. Special classes of codifferentiable functions that play an important role in the calculus of variations are introduced and studied. The codifferentiability of the main functional of the calculus of variations is derived. Necessary conditions for an extremum of a codifferentiable function on a closed convex set and their applications to nonsmooth problems of...
We present a local and a semi-local convergence analysis of an iterative method for approximating zeros of derivatives, used to solve univariate unconstrained optimization problems. In the local case, the radius of convergence is obtained, whereas in the semi-local case, sufficient convergence criteria are presented. Numerical examples are also provided.
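The simplest instance of such an iteration (a sketch, not necessarily the method analyzed in the abstract) is Newton's method applied to the derivative: x_{k+1} = x_k - f'(x_k)/f''(x_k), which locates a critical point of f and converges quadratically when started close enough to a zero of f' at which f'' is nonzero. The test function below is an assumption for illustration.

```python
import math

def newton_on_derivative(fprime, fsecond, x0, tol=1e-12, max_iter=100):
    """Newton iteration on f' to approximate a critical point of f (sketch)."""
    x = x0
    for _ in range(max_iter):
        fp = fprime(x)
        if abs(fp) < tol:       # derivative small enough: accept x
            break
        x = x - fp / fsecond(x)
    return x

# usage: minimize f(x) = x^2/2 - sin(x), so f'(x) = x - cos(x)
fprime = lambda x: x - math.cos(x)
fsecond = lambda x: 1 + math.sin(x)
x_crit = newton_on_derivative(fprime, fsecond, x0=1.0)
```

Here f'' = 1 + sin(x) stays well away from zero near the solution, so the local quadratic convergence regime applies and a handful of iterations drives |f'(x)| below the tolerance.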