In this article we study the use of dual methods in the design of hybrid algorithms for solving Set Partitioning (SP) problems. Dual techniques are of great interest for solving problems with combinatorial structure, not only because they generate lower bounds but also because, used together with heuristics and inequality-generation procedures in the design of hybrid algorithms, they make it possible to assess the quality of the upper bounds obtained. The...
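As an illustration of how such dual bounds certify heuristic solutions, the sketch below computes the relative optimality gap between a primal upper bound and a dual lower bound; the function and the sample numbers are ours, not taken from the article.

def relative_gap(upper_bound: float, lower_bound: float) -> float:
    """Relative optimality gap between a heuristic (primal) upper bound and a
    dual lower bound for a minimization problem such as Set Partitioning;
    a small gap certifies that the heuristic solution is near-optimal."""
    return (upper_bound - lower_bound) / max(abs(lower_bound), 1e-12)

# A heuristic solution of cost 105 against a dual bound of 100 is certified
# to be within 5% of the optimum.
print(relative_gap(105.0, 100.0))  # 0.05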
En este artículo presentamos y probamos numéricamente un nuevo algoritmo para la minimización global de un polinomio de grado par. El algoritmo está basado en la simple idea de trasladar verticalmente el grafo del polinomio hasta que el eje OX sea tangente al grafo del polinomio trasladado. En esta privilegiada posición, cualquier raíz real del polinomio trasladado es un mínimo global del polinomio original.
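A minimal numerical sketch of this idea, assuming an even-degree polynomial with positive leading coefficient: since p(x) - c has a real root exactly when c is at least the minimum value of p, one can bisect on the vertical shift c. The function names and tolerances below are ours, not the article's.

import numpy as np

def global_min_even_poly(coeffs, lo=None, hi=None, tol=1e-10):
    """Vertical-translation sketch: bisect on the shift c until the OX axis
    is (numerically) tangent to the graph of p - c.
    coeffs: polynomial coefficients, highest degree first (even degree,
    positive leading coefficient, so the polynomial is bounded below)."""
    p = np.poly1d(coeffs)

    def has_real_root(c):
        shifted = p - c                      # vertical translation of the graph
        roots = np.roots(shifted.coeffs)
        return np.any(np.abs(roots.imag) < 1e-6)

    # Bracket the minimum value: p(0) is an upper bound for min p;
    # decrease lo geometrically until p - lo has no real root.
    if hi is None:
        hi = p(0.0)
    if lo is None:
        lo = hi - 1.0
        while has_real_root(lo):
            lo = hi - 2.0 * (hi - lo)

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if has_real_root(mid):
            hi = mid                         # shifted graph still touches the OX axis
        else:
            lo = mid                         # shifted too far down
    roots = np.roots((p - hi).coeffs)
    minimizers = roots.real[np.abs(roots.imag) < 1e-4]
    return hi, minimizers

# Example: p(x) = (x^2 - 1)^2 = x^4 - 2x^2 + 1 has global minimum 0 at x = +/-1.
print(global_min_even_poly([1.0, 0.0, -2.0, 0.0, 1.0]))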
The numerical solution of granular dynamics problems with Coulomb friction leads to the problem of minimizing a convex quadratic function with a positive semidefinite Hessian subject to separable conical constraints. In this paper, we are interested in the numerical solution of this problem. We suggest a modification of an active-set optimal quadratic programming algorithm. The number of projection steps is decreased by using a projected Barzilai-Borwein method. In the numerical experiment, we compare our...
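A minimal sketch of the projected Barzilai-Borwein ingredient, assuming a generic convex feasible set given by its projection operator (nonnegativity is used below as a simple separable stand-in for the conical constraints); the full active-set algorithm of the paper is not reproduced, and all names are ours.

import numpy as np

def projected_bb(A, b, project, x0, max_iter=500, tol=1e-8):
    """Projected Barzilai-Borwein gradient method for min 0.5*x'Ax - b'x over
    a convex set given by its projection operator; A is symmetric positive
    semidefinite."""
    x = project(x0)
    g = A @ x - b
    alpha = 1.0                               # initial steplength
    for _ in range(max_iter):
        x_new = project(x - alpha * g)        # projected gradient step
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        if np.linalg.norm(s) < tol:
            break
        sy = s @ y
        alpha = (s @ s) / sy if sy > 1e-16 else 1.0   # BB1 steplength
        x, g = x_new, g_new
    return x

# Example: nonnegativity constraints as a simple separable cone.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([-1.0, 4.0])
print(projected_bb(A, b, project=lambda z: np.maximum(z, 0.0), x0=np.zeros(2)))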
Simple modifications of the limited-memory BFGS method (L-BFGS) for large-scale unconstrained optimization are considered. They consist of corrections to the stored difference vectors (derived from the idea of conjugate directions), utilizing information from the preceding iteration. For quadratic objective functions, the improvement in convergence is in some sense the best possible, and all stored difference vectors are conjugate for unit stepsizes. The algorithm is globally convergent for convex sufficiently...
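For context, the sketch below shows the standard L-BFGS two-loop recursion that such corrections would act on; the paper's specific correction of the difference vectors is not reproduced here, and the function name is ours.

import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion computing d = -H_k * grad from the
    stored difference vectors s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i."""
    q = grad.copy()
    history = []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        q -= a * y
        history.append((rho, a, s, y))
    if s_list:                                # initial Hessian scaling gamma_k * I
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for rho, a, s, y in reversed(history):
        beta = rho * (y @ q)
        q += (a - beta) * s
    return -q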
The conjugate gradient method is one of the most effective algorithms for unconstrained nonlinear optimization problems, owing to its low storage requirements and simple structure. These properties motivate us to propose a new hybrid conjugate gradient method built as a convex combination of … and …. We compute the convex parameter using the Newton direction. Global convergence is established under the strong Wolfe conditions. Numerical experiments show the...
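A hedged sketch of the hybrid direction update follows. The two conjugate gradient parameters being combined are lost in the text extraction above, so Hestenes-Stiefel and Dai-Yuan are used below purely as illustrative stand-ins, and the convex parameter (computed from the Newton direction in the paper) is replaced by a fixed value.

import numpy as np

def hybrid_beta(g_new, g_old, d_old, theta):
    """Convex combination of two standard CG parameters (Hestenes-Stiefel and
    Dai-Yuan, used here only as illustrative stand-ins)."""
    y = g_new - g_old
    beta_hs = (g_new @ y) / (d_old @ y)           # Hestenes-Stiefel
    beta_dy = (g_new @ g_new) / (d_old @ y)       # Dai-Yuan
    return (1.0 - theta) * beta_hs + theta * beta_dy

def hybrid_direction(g_new, g_old, d_old, theta=0.5):
    """New search direction d_{k+1} = -g_{k+1} + beta_k * d_k; in the paper
    theta comes from the Newton direction, here it is fixed for illustration."""
    return -g_new + hybrid_beta(g_new, g_old, d_old, theta) * d_old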
Modifications of the nonlinear conjugate gradient method are described and tested.
Nonlinear rescaling is a tool for solving large-scale nonlinear programming problems. The primal-dual nonlinear rescaling method was used to solve two quadratic programming problems with quadratic constraints. Based on the performance of the primal-dual nonlinear rescaling method on test problems, conclusions about parameter settings are drawn. Next, the connection between nonlinear rescaling methods and self-concordant functions is discussed, and a modified logarithmic barrier function is...
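For illustration, the sketch below implements one commonly used form of modified logarithmic barrier for inequality constraints c_i(x) >= 0; the exact modification analysed in the paper may differ, and all names are ours.

import numpy as np

def modified_log_barrier(f, constraints, mu):
    """Modified logarithmic barrier F(x, mu) = f(x) - mu * sum_i log(1 + c_i(x)/mu)
    for constraints c_i(x) >= 0. Unlike the classical barrier -mu * sum log(c_i(x)),
    it is defined on a neighbourhood of the boundary (c_i(x) > -mu)."""
    def F(x):
        c = np.array([ci(x) for ci in constraints])
        if np.any(c <= -mu):
            return np.inf                     # outside the extended domain
        return f(x) - mu * np.sum(np.log1p(c / mu))
    return F

# Example: minimize x^2 subject to x >= 1, i.e. c(x) = x - 1 >= 0.
F = modified_log_barrier(lambda x: x**2, [lambda x: x - 1.0], mu=0.1)
print(F(1.5), F(0.95))   # both finite; F(0.85) would be +inf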