We discuss the existence and multiplicity of positive solutions for a class of second order quasilinear equations. To obtain our results we will use the Ekeland variational principle and the Mountain Pass Theorem.
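For orientation, the Mountain Pass Theorem invoked here is, in its classical Ambrosetti–Rabinowitz form (stated generically, not in the paper's specific setting): if Φ ∈ C¹(X, ℝ) on a Banach space X satisfies the Palais–Smale condition, Φ(0) = 0, Φ(u) ≥ α > 0 whenever ‖u‖ = r, and Φ(e) ≤ 0 for some e with ‖e‖ > r, then c = inf_{γ∈Γ} max_{t∈[0,1]} Φ(γ(t)), with Γ = {γ ∈ C([0,1], X) : γ(0) = 0, γ(1) = e}, is a critical value of Φ and c ≥ α.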
We present necessary conditions for linear noncooperative N-player delta dynamic games on an arbitrary time scale. Necessary conditions for an open-loop Nash-equilibrium and for a memoryless perfect state Nash-equilibrium are proved.
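For reference (generic definition, in our notation rather than the paper's): given cost functionals J₁, …, J_N, an admissible profile u* = (u₁*, …, u_N*) is a Nash equilibrium if, for every player i and every admissible deviation uᵢ, Jᵢ(u₁*, …, uᵢ*, …, u_N*) ≤ Jᵢ(u₁*, …, uᵢ, …, u_N*); the open-loop and memoryless perfect state cases differ only in the information available to the players when forming their strategies.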
We present a characterization of weak sharp local minimizers of order one for a function f: ℝⁿ → ℝ defined by f(x) = max{fᵢ(x) : i ∈ I}, where the functions fᵢ are strictly differentiable. It is given in terms of the gradients of the fᵢ and the Mordukhovich normal cone to a given set on which f is constant. Then we apply this result to a smooth nonlinear programming problem with constraints.
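For context, the standard definition (stated generically; the paper's precise formulation may differ): a point x̄ of a set S on which f is constant is a weak sharp local minimizer of order one for f relative to S if there exist β > 0 and a neighbourhood U of x̄ such that f(x) ≥ f(x̄) + β dist(x, S) for all x ∈ U.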
In this paper, first we consider parametric control systems driven by nonlinear evolution equations defined on an evolution triple of spaces. The parameters are time-varying probability measures (Young measures) defined on a compact metric space. The appropriate optimization problem is a minimax control problem, in which the system analyst minimizes the maximum cost (risk). Under general hypotheses on the data, we establish the existence of optimal controls.
Then we pass to nonparametric...
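Schematically (our notation, not the paper's): with admissible controls u ∈ U_ad, parameters λ ranging over a set Λ of Young measures, and a cost functional J(u, λ) evaluated along the corresponding trajectory of the evolution equation, the minimax problem reads inf_{u ∈ U_ad} sup_{λ ∈ Λ} J(u, λ).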
We study first-order optimality systems for the control of a system governed by a variational inequality and deal with Lagrange multipliers: is it possible to associate to each pointwise constraint a multiplier so as to get a “good” optimality system? We give positive and negative answers for the finite and infinite dimensional cases. These results are compared with previous ones obtained by penalization or differentiation.
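A generic model of this setting (our notation, not the paper's formulation): minimize a cost J(y, u) over controls u, where the state y = y(u) solves an obstacle-type variational inequality, y ∈ K = {v : v ≥ ψ}, ⟨Ay − f − Bu, v − y⟩ ≥ 0 for all v ∈ K; the question is then whether the pointwise constraint y ≥ ψ can be assigned a Lagrange multiplier that yields a well-behaved optimality system.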
In this paper, we propose a primal interior-point method for large sparse generalized minimax optimization. After a short introduction, where the problem is stated, we introduce the basic equations of the Newton method applied to the KKT conditions and propose a primal interior-point method (i.e., an interior-point method that uses explicitly computed approximations of the Lagrange multipliers instead of their updates). Next we describe the basic algorithm and give more details concerning its implementation...
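To illustrate the kind of reformulation involved, consider the plain minimax case F(x) = max_{1≤i≤m} fᵢ(x) (a generic sketch; the paper's generalized minimax setting and exact barrier may differ): introduce an extra variable z > max_i fᵢ(x) and the logarithmic barrier B_μ(x, z) = z − μ Σ_{i=1}^{m} log(z − fᵢ(x)) with μ > 0, minimize B_μ for a sequence μ → 0, and apply Newton's method to the resulting stationarity (KKT) equations; the quantities uᵢ = μ/(z − fᵢ(x)) then serve as explicitly computed approximations of the Lagrange multipliers.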
In this paper, we propose a primal interior-point method for large sparse minimax optimization. After a short introduction, the complete algorithm is introduced and important implementation details are given. We prove that this algorithm is globally convergent under standard mild assumptions. Thus, large sparse nonconvex minimax optimization problems can be solved successfully. The results of extensive computational experiments given in this paper confirm the efficiency and robustness of the proposed...
In this report we propose a new recursive matrix formulation of limited memory variable metric methods. This approach can be used for an arbitrary update from the Broyden class (and some other updates) and also for the approximation of both the Hessian matrix and its inverse. The new recursive formulation requires approximately multiplications and additions per iteration, so it is comparable with other efficient limited memory variable metric methods. Numerical experiments concerning Algorithm...
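For comparison, the classical way to build a limited memory (inverse Hessian) approximation from the last few pairs sⱼ = x_{j+1} − xⱼ, yⱼ = g_{j+1} − gⱼ is the BFGS recursion H_{j+1} = Vⱼᵀ Hⱼ Vⱼ + ρⱼ sⱼ sⱼᵀ with Vⱼ = I − ρⱼ yⱼ sⱼᵀ and ρⱼ = 1/(yⱼᵀ sⱼ), started from a scaled identity matrix; the report's recursive matrix formulation and its extension to the Broyden class differ in detail and are stated there.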
The goal of this paper is to compute the shape Hessian for a generalized Oseen problem with nonhomogeneous Dirichlet boundary condition by the velocity method. The incompressibility will be treated by a penalty approach. The structure of the shape gradient and shape Hessian with respect to the shape of the variable domain for a given cost functional is established by an application of the Lagrangian method with function space embedding technique.
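Schematically, in standard velocity-method notation (not taken from the paper): a velocity field V deforms the domain through Ω_t = T_t(V)(Ω), and the shape gradient and shape Hessian of a cost functional J are dJ(Ω; V) = lim_{t→0} [J(Ω_t) − J(Ω)]/t and d²J(Ω; V, W) = d(dJ(·; V))(Ω; W); the paper identifies the explicit structure of these two quantities for the penalized generalized Oseen problem.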