Modifications of the limited-memory BFGS method based on the idea of conjugate directions

Vlček, Jan, Lukšan, Ladislav (2013)

Programs and Algorithms of Numerical Mathematics

Simple modifications of the limited-memory BFGS method (L-BFGS) for large-scale unconstrained optimization are considered. They consist in corrections of the stored difference vectors, derived from the idea of conjugate directions and utilizing information from the preceding iteration. For quadratic objective functions, the improvement of convergence is the best possible in a certain sense, and all stored difference vectors are conjugate for unit stepsizes. The algorithm is globally convergent for convex sufficiently...
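As context, here is a minimal sketch of the standard L-BFGS two-loop recursion whose stored difference pairs the paper corrects; the conjugate-direction correction itself is not reproduced, and the function name, list layout, and initial scaling are illustrative choices.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Standard two-loop recursion: compute d = -H_k @ grad from stored
    pairs s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i (newest pair last)."""
    q = grad.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    # first loop: newest pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    # common initial scaling H_0 = (s^T y / y^T y) * I
    s, y = s_list[-1], y_list[-1]
    q *= s.dot(y) / y.dot(y)
    # second loop: oldest pair to newest
    for s, y, rho, a in zip(s_list, y_list, rhos, reversed(alphas)):
        b = rho * y.dot(q)
        q += (a - b) * s
    return -q
```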

Multi-objective geometric programming problem with Karush−Kuhn−Tucker condition using ϵ-constraint method

A. K. Ojha, Rashmi Ranjan Ota (2014)

RAIRO - Operations Research - Recherche Opérationnelle

Optimization is an important tool widely used in the formulation of mathematical models and in the design of various decision-making problems in science and engineering. Real-world problems generally occur in multi-criteria, multi-choice form with certain constraints, and no single optimal solution exists that optimizes all the objective functions simultaneously. In this paper, the ϵ-constraint method together with the Karush−Kuhn−Tucker (KKT) condition has been used...
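For illustration, here is the generic ϵ-constraint scalarization on a hypothetical toy bi-objective problem (the paper's KKT treatment of geometric programs is not reproduced): one objective is kept, the other is bounded by ϵ, and sweeping ϵ traces out (weakly) efficient points.

```python
import numpy as np
from scipy.optimize import minimize

f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2   # first objective
f2 = lambda x: x[0] ** 2 + (x[1] - 2.0) ** 2   # second objective

def eps_constraint(eps, x0=np.zeros(2)):
    # minimize f1 subject to f2(x) <= eps, written as eps - f2(x) >= 0
    cons = {"type": "ineq", "fun": lambda x: eps - f2(x)}
    return minimize(f1, x0, constraints=cons)

# each eps yields one (weakly) efficient point of the Pareto front
for eps in (4.0, 2.0, 1.0):
    res = eps_constraint(eps)
    print(eps, res.x, f1(res.x), f2(res.x))
```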

Multi-objective Optimization Problem with Bounded Parameters

Ajay Kumar Bhurjee, Geetanjali Panda (2014)

RAIRO - Operations Research - Recherche Opérationnelle

In this paper, we propose a nonlinear multi-objective optimization problem whose parameters in the objective functions and constraints vary between given lower and upper bounds. Existence of efficient solutions of this model is studied, and gradient-based as well as gradient-free optimality conditions are derived. The theoretical developments are illustrated through numerical examples.
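As a toy reading of bounded parameters (an illustrative assumption, not the paper's efficiency framework), one can minimize the worst case of an objective whose coefficient is only known to lie in an interval; for a linear dependence on the parameter, the worst case sits at an interval endpoint.

```python
from scipy.optimize import minimize_scalar

c_lo, c_hi = 1.0, 3.0   # hypothetical parameter bounds

def worst_case(x):
    # f(x, c) = c*x**2 - 4*x is linear in c, so the maximum over
    # c in [c_lo, c_hi] is attained at an endpoint
    return max(c * x ** 2 - 4.0 * x for c in (c_lo, c_hi))

res = minimize_scalar(worst_case, bounds=(-10.0, 10.0), method="bounded")
print(res.x, res.fun)   # about x = 2/3, value -4/3
```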

New hybrid conjugate gradient method for nonlinear optimization with application to image restoration problems

Youcef Elhamam Hemici, Samia Khelladi, Djamel Benterki (2024)

Kybernetika

The conjugate gradient method is one of the most effective algorithms for unconstrained nonlinear optimization problems, owing to its low storage requirements and simple structure. This motivates us to propose a new hybrid conjugate gradient method through a convex combination of $\beta_k^{RMIL}$ and $\beta_k^{HS}$. We compute the convex parameter $\theta_k$ using the Newton direction. Global convergence is established under the strong Wolfe conditions. Numerical experiments show the...
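A sketch of the hybrid update, assuming the standard textbook formulas for $\beta_k^{RMIL}$ and $\beta_k^{HS}$; the paper computes $\theta_k$ from the Newton direction, which is not reproduced here, so a fixed placeholder value is used.

```python
import numpy as np

def hybrid_beta(g_new, g_old, d_old, theta=0.5):
    # theta would come from the paper's Newton-direction formula;
    # a fixed value is a placeholder only
    y = g_new - g_old                                # y_{k-1}
    beta_rmil = g_new.dot(y) / d_old.dot(d_old)      # RMIL coefficient
    beta_hs = g_new.dot(y) / d_old.dot(y)            # Hestenes-Stiefel
    return (1.0 - theta) * beta_rmil + theta * beta_hs

def cg_direction(g_new, g_old, d_old, theta=0.5):
    # d_k = -g_k + beta_k * d_{k-1}
    return -g_new + hybrid_beta(g_new, g_old, d_old, theta) * d_old
```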

New regions of stability in input optimization

Sheng Huang, Sanjo Zlobec (1988)

Aplikace matematiky

Using point-to-set mappings, we identify two new regions of stability in input optimization. We then extend various results from the literature on optimality conditions, continuity of Lagrange multipliers, and the marginal value formula over the new and some old regions of stability.

New technique for solving univariate global optimization

Djamel Aaid, Amel Noui, Mohand Ouanes (2017)

Archivum Mathematicum

In this paper, a new global optimization method is proposed for problems with a twice differentiable objective function of a single variable under a box constraint. The method employs the difference between a linear interpolant of the objective and a concave function, this difference being a continuous piecewise convex quadratic underestimator. The main objective of this research is to determine a lower bound whose computation does not need an iterative local optimizer. The proposed method...
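A sketch of a lower bound of the kind the abstract describes, under the assumption that K overestimates |f''| on [a, b]: the chord of f minus a concave quadratic is then a convex quadratic underestimator whose minimum has a closed form, so no iterative local optimizer is needed.

```python
import math

def lower_bound(f, a, b, K):
    # q(x) = chord(x) - (K/2)*(x - a)*(b - x) underestimates f on [a, b]
    # whenever |f''| <= K there, and q is convex, so its clipped
    # stationary point gives the bound in closed form
    m = (f(b) - f(a)) / (b - a)                       # chord slope
    x_star = min(max((a + b) / 2.0 - m / K, a), b)    # minimizer of q
    return f(a) + m * (x_star - a) - (K / 2.0) * (x_star - a) * (b - x_star)

# example: f = sin on [0, 4], where |f''| <= 1
print(lower_bound(math.sin, 0.0, 4.0, 1.0))   # a valid bound below min f
```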

Newton methods for solving two classes of nonsmooth equations

Yan Gao (2001)

Applications of Mathematics

The paper is devoted to two systems of nonsmooth equations. One is the system of equations of max-type functions and the other is the system of equations of smooth compositions of max-type functions. The Newton and approximate Newton methods for these two systems are proposed. The Q-superlinear convergence of the Newton methods and the Q-linear convergence of the approximate Newton methods are established. The present methods can be more easily implemented than the previous ones, since they do not...
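For the first class (equations of max-type functions), here is a generic semismooth-Newton-style sketch: at each iterate, pick for every row an active branch and use its gradient as a Jacobian row. This is an illustrative stand-in under standard assumptions, not necessarily the paper's exact scheme.

```python
import numpy as np

def newton_max(x, branches, grads, tol=1e-10, max_iter=50):
    """Solve F_i(x) = max_j f_ij(x) = 0; branches[i] lists the f_ij,
    grads[i] the matching gradient callables."""
    for _ in range(max_iter):
        F = np.empty(len(branches))
        J = np.empty((len(branches), x.size))
        for i, (fs, gs) in enumerate(zip(branches, grads)):
            vals = [f(x) for f in fs]
            j = int(np.argmax(vals))     # one active (maximizing) branch
            F[i] = vals[j]
            J[i] = gs[j](x)              # its gradient as a Jacobian row
        if np.linalg.norm(F) < tol:
            break
        x = x - np.linalg.solve(J, F)    # generalized Newton step
    return x
```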

Nonlinear conjugate gradient methods

Lukšan, Ladislav, Vlček, Jan (2015)

Programs and Algorithms of Numerical Mathematics

Modifications of the nonlinear conjugate gradient method are described and tested.
