Currently displaying 1 – 3 of 3


A penalty approach for a box constrained variational inequality problem

Zahira Kebaili, Djamel Benterki — 2018

Applications of Mathematics

We propose a penalty approach for a box constrained variational inequality problem (BVIP). The problem is replaced by a sequence of nonlinear equations containing a penalty term. We show that as the penalty parameter tends to infinity, the solutions of this sequence converge to the solution of BVIP, provided the function F involved is continuous and strongly monotone and the box C contains the origin. We develop the algorithmic aspects with properly established theoretical arguments. The numerical results tested on...
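The penalty idea described in this abstract can be illustrated with a minimal one-dimensional sketch: replace VI(F, C) by the nonlinear equation F(x) + r·p(x) = 0, where p penalizes violation of the box C, and let the penalty parameter r grow. The map F, the box, and the bisection solver below are illustrative assumptions, not the authors' implementation.

```python
def F(x):
    # a strongly monotone, continuous map; its unconstrained zero (-3)
    # deliberately lies outside the box
    return 2.0 * x + 6.0

lo, hi = -1.0, 2.0   # the box C, chosen to contain the origin

def penalty(x):
    # pushes x back toward the box: positive above hi, negative below lo
    return max(x - hi, 0.0) - max(lo - x, 0.0)

def solve_penalized(r, a=-10.0, b=10.0, tol=1e-12):
    # bisection on the monotone penalized equation F(x) + r*penalty(x) = 0
    fa = F(a) + r * penalty(a)
    for _ in range(200):
        m = 0.5 * (a + b)
        fm = F(m) + r * penalty(m)
        if abs(fm) < tol or (b - a) < tol:
            return m
        if (fa > 0) == (fm > 0):
            a, fa = m, fm
        else:
            b = m
    return 0.5 * (a + b)

for r in (1e1, 1e3, 1e6):
    print(r, solve_penalized(r))
# the penalized solutions approach the VI solution x* = -1 as r grows
```

Here the solution of the penalized equation is -(6 + r)/(2 + r), which tends to the boundary point x* = -1 of the box as r → ∞, mirroring the convergence result stated in the abstract.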

A numerical feasible interior point method for linear semidefinite programs

Djamel Benterki, Jean-Pierre Crouzeix, Bachir Merikhi — 2007

RAIRO - Operations Research

This paper presents a feasible primal algorithm for linear semidefinite programming. The algorithm starts with a strictly feasible solution; when no such solution is known, applying the algorithm to an associated problem yields one. Finally, we present numerical experiments showing that the algorithm works properly.

New hybrid conjugate gradient method for nonlinear optimization with application to image restoration problems

The conjugate gradient method is one of the most effective algorithms for unconstrained nonlinear optimization problems, since it requires little storage and has a simple structure. These properties motivate us to propose a new hybrid conjugate gradient method based on a convex combination of β_k^RMIL and β_k^HS. We compute the convex parameter θ_k using the Newton direction. Global convergence is established under the strong Wolfe conditions. Numerical experiments show the...
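The hybrid rule β_k = θ_k β_k^RMIL + (1 − θ_k) β_k^HS can be sketched on a small quadratic. The paper derives θ_k from the Newton direction; that formula is not given in the abstract, so a fixed θ = 0.5 is used below as a placeholder, and the problem data and exact line search are likewise illustrative assumptions, not the authors' method.

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def matvec(A, v):
    return [dot(row, v) for row in A]

def hybrid_cg(A, b, x0, theta=0.5, iters=50):
    # minimize f(x) = 0.5 x^T A x - b^T x  (A symmetric positive definite)
    x = x0[:]
    g = [gi - bi for gi, bi in zip(matvec(A, x), b)]   # gradient A x - b
    d = [-gi for gi in g]
    for _ in range(iters):
        Ad = matvec(A, d)
        alpha = -dot(g, d) / dot(d, Ad)                # exact line search
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = [gi - bi for gi, bi in zip(matvec(A, x), b)]
        y = [gn - gi for gn, gi in zip(g_new, g)]      # y_k = g_{k+1} - g_k
        beta_rmil = dot(g_new, y) / dot(d, d)          # RMIL formula
        denom_hs = dot(d, y)
        beta_hs = dot(g_new, y) / denom_hs if denom_hs != 0 else 0.0
        beta = theta * beta_rmil + (1.0 - theta) * beta_hs  # convex combination
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
        if dot(g, g) < 1e-20:
            break
    return x

x = hybrid_cg([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0], [0.0, 0.0])
print(x)  # approaches the minimizer (1/11, 7/11)
```

With exact line search the hybrid direction remains a descent direction (g_{k+1}ᵀd_k = 0 implies g_{k+1}ᵀd_{k+1} = −‖g_{k+1}‖²), which is what the strong Wolfe conditions enforce in the general nonlinear setting.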
