On a convolution operation obtained by adding level sets: classical and new results
We present a local and a semi-local convergence analysis of an iterative method for approximating zeros of derivatives, which arise when solving univariate, unconstrained optimization problems. In the local case, the radius of convergence is obtained, whereas in the semi-local case, sufficient convergence criteria are presented. Numerical examples are also provided.
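The abstract does not name the iterative method. As an illustration of the underlying idea, a minimal sketch of the classical approach: apply Newton's method to f′ so that its zero, a critical point of f, is approximated (the function `newton_on_derivative` and the example objective are ours, not from the paper):

```python
def newton_on_derivative(df, d2f, x0, tol=1e-10, max_iter=50):
    """Approximate a zero of f' (a critical point of f) by Newton's method.

    df, d2f : first and second derivatives of the objective f
    x0      : initial guess inside the local convergence radius
    """
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)   # Newton step for the equation f'(x) = 0
        x -= step
        if abs(step) < tol:     # stop when the update is negligible
            break
    return x

# Example: f(x) = x**4 - 3*x**2 + x, so f'(x) = 4x**3 - 6x + 1
# and f''(x) = 12x**2 - 6; start near the right-hand local minimizer.
x_star = newton_on_derivative(lambda x: 4 * x**3 - 6 * x + 1,
                              lambda x: 12 * x**2 - 6,
                              x0=1.0)
```

The semi-local criteria of the paper would guarantee convergence from such an `x0` without knowing the solution in advance; the sketch simply iterates.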
We investigate the existence of a solution to the following problem: min φ(x) subject to G(x) = 0, where φ: X → ℝ, G: X → Y, and X, Y are Banach spaces. The question of existence is considered in a neighborhood of a point x₀ at which the Hessian of the Lagrange function is degenerate. An approximation for the distance from the solution x* to the initial point x₀ is obtained.
In this paper, we show how optimization methods can be used efficiently to determine the parameters of an oscillatory model of handwriting. Because these methods are intended for real-time applications, the optimization problems must be solved rapidly. Hence, we developed an original heuristic algorithm, named FHA. This code was validated by comparing it (accuracy/CPU times) with a multistart method based on the Trust Region Reflective algorithm.
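The FHA heuristic itself is not described in the abstract. As a hedged sketch of the baseline it was compared against, a generic multistart scheme: run a local solver from random starting points and keep the best local minimizer. Here a crude fixed-step descent stands in for the Trust Region Reflective local solver, and the multimodal test objective is ours, not the handwriting model:

```python
import math
import random

def descend(f, x, lo, hi, step=1e-3, iters=5000):
    """Crude local minimizer: fixed-step descent on a finite-difference
    gradient, clipped to [lo, hi]. A stand-in for the TRF local solver."""
    h = 1e-6
    for _ in range(iters):
        g = (f(x + h) - f(x - h)) / (2 * h)   # central-difference gradient
        x = min(hi, max(lo, x - step * g))    # descend, stay in bounds
    return x

def multistart(f, lo, hi, n_starts=20, seed=0):
    """Multistart scheme: local searches from random starts, keep the best."""
    rng = random.Random(seed)
    best_x = min((descend(f, lo + (hi - lo) * rng.random(), lo, hi)
                  for _ in range(n_starts)), key=f)
    return best_x, f(best_x)

# Illustrative multimodal objective (not the oscillatory handwriting model)
f = lambda x: math.sin(3 * x) + 0.1 * x * x
best_x, best_f = multistart(f, -3.0, 3.0)
```

Multistart is robust but expensive, each start pays the full cost of a local solve, which is exactly the motivation the paper gives for a faster dedicated heuristic in a real-time setting.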
Henrici’s transformation is a generalization of Aitken’s Δ²-process to the vector case. It has been used for accelerating vector sequences. We use a modified version of Henrici’s transformation for solving some unconstrained nonlinear optimization problems. A convergence acceleration result is established and numerical examples are given.
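For orientation, a minimal sketch of the scalar case that Henrici's transformation generalizes: Aitken's Δ²-process, which extrapolates a linearly convergent sequence from three consecutive terms (the function name and the cosine fixed-point example are ours):

```python
import math

def aitken_delta2(seq):
    """Aitken's Delta^2 transformation of a scalar sequence:
    t_n = x_{n+2} - (Delta x_{n+1})^2 / (Delta^2 x_n)."""
    out = []
    for n in range(len(seq) - 2):
        x0, x1, x2 = seq[n], seq[n + 1], seq[n + 2]
        denom = (x2 - x1) - (x1 - x0)          # second difference
        out.append(x2 - (x2 - x1) ** 2 / denom if denom != 0 else x2)
    return out

# Linearly convergent fixed-point iterates x_{n+1} = cos(x_n)
xs = [1.0]
for _ in range(10):
    xs.append(math.cos(xs[-1]))
accel = aitken_delta2(xs)   # noticeably closer to the fixed point 0.739085...
```

In the vector case, Henrici's transformation replaces the scalar divided difference with a suitable generalized inverse of a difference matrix; the paper's modification and its acceleration result are not reproduced here.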
In this paper, necessary optimality conditions are derived for the minimization of a locally Lipschitz objective subject to constraints defined by a closed set and a set-valued map. No convexity requirements are imposed on the map. The conditions are applied to a generalized mathematical programming problem and to an abstract finite-dimensional optimal control problem.
In this paper we present the motivation for using the Truncated Newton method in an algorithm that maximises a non-linear function with additional maximin-like arguments subject to a network-like linear system of constraints. The special structure of the network (a so-termed replicated quasi-arborescence) allows us to introduce the new concept of independent superbasic sets and then to use second-order information about the objective function without excessive computational effort or storage.
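The core of any truncated Newton method is solving the Newton system H p = -g only approximately, by a few conjugate-gradient iterations that need Hessian-vector products rather than the Hessian itself. A minimal sketch under that generic description (the helper names and the small quadratic demo are ours; the paper's network structure and superbasic sets are not modeled):

```python
def cg(Hv, b, max_iter=10, tol=1e-10):
    """Conjugate gradient on H p = b using only Hessian-vector products Hv.
    Capping max_iter gives the 'truncated' inner solve of truncated Newton."""
    n = len(b)
    p = [0.0] * n
    r = b[:]                 # residual b - H p (p starts at 0)
    d = b[:]                 # search direction
    rs = sum(x * x for x in r)
    for _ in range(max_iter):
        Hd = Hv(d)
        alpha = rs / sum(di * hi for di, hi in zip(d, Hd))
        p = [pi + alpha * di for pi, di in zip(p, d)]
        r = [ri - alpha * hi for ri, hi in zip(r, Hd)]
        rs_new = sum(x * x for x in r)
        if rs_new < tol:
            break
        d = [ri + (rs_new / rs) * di for ri, di in zip(r, d)]
        rs = rs_new
    return p

def truncated_newton_step(grad, Hv, x, cg_iters):
    """One truncated-Newton step: approximately solve H p = -g, then move."""
    g = grad(x)
    p = cg(Hv, [-gi for gi in g], max_iter=cg_iters)
    return [xi + pi for xi, pi in zip(x, p)]

# Demo on a 2-D quadratic f(x) = 0.5 x^T H x - b^T x, so grad = H x - b.
H = [[3.0, 1.0], [1.0, 2.0]]
b = [1.0, 1.0]
Hv = lambda v: [H[0][0] * v[0] + H[0][1] * v[1],
                H[1][0] * v[0] + H[1][1] * v[1]]
grad = lambda x: [Hv(x)[0] - b[0], Hv(x)[1] - b[1]]
x_new = truncated_newton_step(grad, Hv, [0.0, 0.0], cg_iters=2)
```

Because only products H·v are needed, the method can exploit sparse or structured Hessians, which is the storage-and-effort argument the abstract makes for the replicated quasi-arborescence structure.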