A Regularized Continuous Projection-Gradient Method of the Fourth Order
We study polyconvex envelopes of a class of functions related to the function introduced by Kohn and Strang. We present an example of a function in this class for which the polyconvex envelope can be computed explicitly, and we also point out some general features of the problem.
Ant Colony Optimization (ACO) is a recent metaheuristic inspired by the behavior of real ant colonies. In this paper, we review the underlying ideas of this approach, which lead from the biological inspiration to the ACO metaheuristic: a set of rules for applying ACO algorithms to challenging combinatorial problems. We present some of the algorithms developed under this framework, give an overview of current applications, and analyze the relationship between...
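The construct-evaporate-deposit loop that ACO algorithms share can be sketched on a small travelling salesman instance. This is a generic, textbook-style Ant System sketch, not any particular algorithm from the survey; the function name and the parameters `alpha`, `beta`, `rho`, `q` are illustrative choices:

```python
import math
import random

def aco_tsp(coords, n_ants=20, n_iters=50, alpha=1.0, beta=2.0, rho=0.5, q=1.0, seed=0):
    """Ants build tours guided by pheromone (tau) and inverse distance."""
    rng = random.Random(seed)
    n = len(coords)
    dist = [[math.dist(coords[i], coords[j]) for j in range(n)] for i in range(n)]
    tau = [[1.0] * n for _ in range(n)]          # pheromone trails, uniform start
    best_tour, best_len = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            start = rng.randrange(n)
            tour, unvisited = [start], set(range(n)) - {start}
            while unvisited:
                i = tour[-1]
                # attractiveness of city j: pheromone^alpha * (1/distance)^beta
                cand = list(unvisited)
                weights = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta for j in cand]
                j = rng.choices(cand, weights=weights)[0]
                tour.append(j)
                unvisited.remove(j)
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # evaporation, then deposit proportional to tour quality
        for i in range(n):
            for j in range(n):
                tau[i][j] *= 1.0 - rho
        for tour, length in tours:
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += q / length
                tau[j][i] += q / length
    return best_tour, best_len
```

The pheromone matrix is what couples the ants across iterations: good edges receive more deposit than they lose to evaporation, biasing later tour constructions.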
A new approach to obtaining second order sufficient conditions for nonlinear mathematical programming problems, making use of second order derivatives, is presented. In the so-called second order -approximation method, an optimization problem associated with the original nonlinear programming problem is constructed; it involves second order -approximations of both the objective function and the constraint functions constituting the original problem. The equivalence between the nonlinear...
In this paper, we introduce a new linear programming test of second-order stochastic dominance (SSD) portfolio efficiency for portfolios with a scenario approach to the distribution of outcomes, together with a new SSD portfolio inefficiency measure. The test exploits the relationship between CVaR and dual second-order stochastic dominance and, contrary to the tests of Post and Kuosmanen, detects a dominating portfolio that is SSD efficient. We also derive a necessary condition for SSD efficiency...
Using NCP functions, we reformulate the extended linear complementarity problem as a nonsmooth equation and propose a self-adaptive trust region algorithm for solving it. The novelty of this method is that the trust region radius is controlled by the objective function value, so it adjusts automatically as the algorithm proceeds. Global convergence is obtained under mild conditions, and a local superlinear convergence rate is also established under...
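The reformulation step can be illustrated with the classical Fischer-Burmeister NCP function on a standard (not extended) LCP. The plain semismooth Newton iteration below is a simplified stand-in for the self-adaptive trust region method of the abstract; function names and the test problem are illustrative:

```python
import numpy as np

def fischer_burmeister(a, b):
    # phi(a, b) = 0  <=>  a >= 0, b >= 0, a*b = 0
    return np.sqrt(a**2 + b**2) - a - b

def solve_lcp(M, q, x0, tol=1e-10, max_iter=50):
    """Semismooth Newton on Phi(x)_i = phi(x_i, (Mx+q)_i) for a standard LCP."""
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        w = M @ x + q
        phi = fischer_burmeister(x, w)
        if np.linalg.norm(phi) < tol:
            break
        r = np.maximum(np.sqrt(x**2 + w**2), 1e-12)   # guard the kink at (0, 0)
        da = x / r - 1.0                               # element of the generalized
        db = w / r - 1.0                               # Jacobian of phi
        J = np.diag(da) + np.diag(db) @ M
        x += np.linalg.solve(J, -phi)
    return x
```

Zeros of the Fischer-Burmeister function are exactly the complementarity solutions, which is what makes the nonsmooth-equation reformulation equivalent to the original problem.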
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. Based on the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method, the conjugate gradient algorithms CG-DESCENT and CGOPT were proposed by W. Hager and H. Zhang (2005) and by Y. Dai and C. Kou (2013), respectively. These two conjugate gradient methods perform more efficiently than the SSML-BFGS method. Therefore,...
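The flavor of such methods can be sketched with the Hager-Zhang beta formula underlying CG-DESCENT, specialized to a strictly convex quadratic so that the exact line search has a closed form. This is a toy sketch under that assumption, not the actual CG-DESCENT or CGOPT implementations (which use sophisticated inexact line searches):

```python
import numpy as np

def cg_hz_quadratic(A, b, x0, tol=1e-10, max_iter=100):
    """Nonlinear CG with the Hager-Zhang beta, applied to
    f(x) = 0.5 x'Ax - b'x, where the exact step length is closed-form."""
    x = x0.astype(float).copy()
    g = A @ x - b                          # gradient of f
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ (A @ d))   # exact minimizer along d
        x = x + alpha * d
        g_new = A @ x - b
        y = g_new - g
        dy = d @ y
        # Hager-Zhang update: beta = (y - 2 d ||y||^2 / (d'y))' g_new / (d'y)
        beta = ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy
        d = -g_new + beta * d
        g = g_new
    return x
```

With exact line searches on a quadratic this recursion terminates in at most n steps, matching linear CG; the practical appeal of the Hager-Zhang choice shows up for general nonlinear objectives and inexact line searches.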
Semi-smooth Newton methods for elliptic equations with gradient constraints are investigated. The one- and multi-dimensional cases are treated separately. Numerical examples illustrate the approach as well as structural features of the solution.
In this paper, we present a sensitivity result for quadratic second-order cone programming under the weak form of the second-order sufficient condition. Based on this result, we analyze the local convergence of an SQP-type method for nonlinear second-order cone programming. The subproblems of this method at each iteration are quadratic second-order cone programming problems. Compared with earlier local convergence analyses, we do not need the assumption that the Hessian matrix of the Lagrangian...
Sensitivity analysis (with respect to the regularization parameter) of the solution of a class of regularized state constrained optimal control problems is performed. The theoretical results are then used to establish an extrapolation-based numerical scheme for solving the regularized problem for vanishing regularization parameter. In this context, the extrapolation technique provides excellent initializations along the sequence of reducing regularization parameters. Finally, the favorable numerical behavior...
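The warm-start idea can be demonstrated on a toy scalar model rather than a state constrained optimal control problem: take the regularized solution map x(gamma) = 1/(1+gamma) (the minimizer of (x-1)^2 + gamma*x^2) and compare a linear extrapolation of the two previous solutions against simply reusing the last one. The model problem and names are illustrative assumptions:

```python
def warmstart_errors(gammas):
    """For each new gamma, compare the initialization error of a linear
    extrapolation in gamma with that of reusing the previous solution."""
    def x_exact(g):
        # closed-form solution of the toy regularized problem
        # minimize (x - 1)^2 + g * x^2  ->  x(g) = 1 / (1 + g)
        return 1.0 / (1.0 + g)

    errs = []
    sols = [x_exact(gammas[0]), x_exact(gammas[1])]
    for k in range(2, len(gammas)):
        g0, g1, g2 = gammas[k - 2], gammas[k - 1], gammas[k]
        # extrapolate the two previous solutions linearly in gamma
        guess = sols[-1] + (sols[-1] - sols[-2]) * (g2 - g1) / (g1 - g0)
        errs.append((abs(guess - x_exact(g2)),      # extrapolated warm start
                     abs(sols[-1] - x_exact(g2))))  # naive warm start
        sols.append(x_exact(g2))
    return errs
```

For a decreasing sequence such as `[0.4, 0.2, 0.1, 0.05]`, the extrapolated initialization is closer to the new solution at every step, which is the effect the abstract exploits to initialize the solver along the sequence of regularization parameters.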
We show how the use of a parallel between the ordinary (+, ×) and the (max, +) algebras, Maslov measures that exploit this parallel, and more specifically their specialization to probabilities and the corresponding cost measures of Quadrat, offer a completely parallel treatment of stochastic and minimax control of disturbed nonlinear discrete time systems with partial information. This paper is based upon, and improves, the discrete time part of the earlier paper [9].
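The parallel can be made concrete on a two-point space: a normalized cost measure replaces the probability measure, multiplication becomes addition, and summation becomes maximization, so the expectation functional has a (max, +) counterpart. A toy illustration (the numbers are arbitrary):

```python
# Ordinary probability: the expectation is a (+, x)-linear functional.
p = [0.5, 0.5]                  # probabilities, summing to 1
f = [1.0, 3.0]                  # values of a function on a two-point space
expectation = sum(pi * fi for pi, fi in zip(p, f))            # 0.5*1 + 0.5*3 = 2.0

# (max, +) analogue: a Maslov/Quadrat cost measure uses costs c_i <= 0
# normalized by max_i c_i = 0; "integration" becomes a max of sums,
# since x turns into + and summation turns into max.
c = [0.0, -1.0]                 # cost measure, maximum equals 0
cost_expectation = max(ci + fi for ci, fi in zip(c, f))       # max(0+1, -1+3) = 2.0
```

Every (+, ×) identity used in stochastic dynamic programming has a (max, +) twin obtained by this substitution, which is what yields the parallel treatment of stochastic and minimax control.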