A block version of the BFGS variable metric update formula is investigated. It satisfies the quasi-Newton conditions with all used difference vectors and gives the best improvement of convergence in some sense for quadratic objective functions, but it does not guarantee that the direction vectors are descent for general functions. To overcome this difficulty and utilize the advantageous properties of the block BFGS update, a block version of the limited-memory BNS method for large scale unconstrained optimization...
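As a point of reference (in notation assumed here, not taken from the abstract), the quasi-Newton conditions with all stored difference vectors can be written in block form: with steps $s_i = x_{i+1} - x_i$, gradient differences $y_i = g_{i+1} - g_i$ and the new inverse Hessian approximation $H_+$,
\[
  H_+ Y = S, \qquad S = [\, s_{k-m+1}, \dots, s_k \,], \quad Y = [\, y_{k-m+1}, \dots, y_k \,],
\]
i.e. $H_+ y_i = s_i$ holds simultaneously for every stored pair.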
Simple modifications of the limited-memory BFGS method (L-BFGS) for large scale unconstrained optimization are considered, which consist in corrections of the used difference vectors (derived from the idea of conjugate directions), utilizing information from the preceding iteration. For quadratic objective functions, the improvement of convergence is the best one in some sense and all stored difference vectors are conjugate for unit stepsizes. The algorithm is globally convergent for convex sufficiently smooth functions...
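To make the conjugacy claim concrete (again in assumed notation), for a quadratic objective $f(x) = \tfrac12 x^\top G x + b^\top x$ the stored difference vectors are mutually conjugate when
\[
  s_i^\top G\, s_j = 0, \qquad i \ne j,
\]
and, since $y_i = G s_i$ for a quadratic, the quasi-Newton conditions $H_+ y_i = s_i$ then involve exactly these conjugate directions.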
To improve the performance of the L-BFGS method for large scale unconstrained optimization, repeating some BFGS updates was proposed, e.g. in [1]. Since this can be time consuming, the extra updates need to be selected carefully. We show that groups of these updates can be repeated infinitely many times under some conditions, without a noticeable increase of the computational time; the limit update is a block BFGS update [17]. It can be obtained by solving a Lyapunov matrix equation whose...
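The particular equation is not reproduced in this summary, but for reference a Lyapunov matrix equation has the general form
\[
  A X + X A^\top = C,
\]
with a given right-hand side $C$ and an unknown matrix $X$.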
Modifications of the nonlinear conjugate gradient method are described and tested.
In this contribution, we propose a new hybrid method for minimization of nonlinear least squares. This method is based on quasi-Newton updates, applied to an approximation $A$ of the Jacobian matrix $J(x)$, such that $A^\top f(x) = J(x)^\top f(x)$. This property allows us to solve a linear least squares problem, minimizing $\|A d + f(x)\|$ instead of solving the normal equation $A^\top A\, d + J(x)^\top f(x) = 0$, where $d$ is the required direction vector. Computational experiments confirm the efficiency of the new method.
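As an illustration of the computational point only (not of the proposed hybrid updates themselves), the direction vector $d$ can be obtained from a linear least squares solve with $A$ and $f$ rather than by forming the normal equations; a minimal NumPy sketch under these assumptions:

import numpy as np

def direction_lstsq(A, f):
    # Minimize ||A d + f|| directly; avoids forming A^T A,
    # which would square the condition number of the problem.
    d, *_ = np.linalg.lstsq(A, -f, rcond=None)
    return d

def direction_normal_equation(A, f):
    # Equivalent in exact arithmetic: solve A^T A d = -A^T f.
    return np.linalg.solve(A.T @ A, -A.T @ f)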
A modification of the limited-memory variable metric BNS method for large scale unconstrained optimization of a differentiable function is considered, which consists in corrections (based on the idea of conjugate directions) of difference vectors for better satisfaction of the previous quasi-Newton conditions. In comparison with [11], more previous iterations can be utilized here. For quadratic objective functions, the improvement of convergence is the best one in some sense, all stored corrected difference vectors...
This contribution contains a description and comparison of two methods for exposure optimization applied to a moulding process in the automotive industry.
New positive definite preconditioners for the matrix-free truncated Newton method are given. Corresponding algorithms are described in detail. Results of numerical experiments that confirm the efficiency and robustness of the preconditioned truncated Newton method are reported.
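As a generic sketch (not the preconditioners proposed here), one inner solve of a matrix-free truncated Newton method applies preconditioned conjugate gradients to the Newton system, accessing the Hessian only through Hessian-vector products; the diagonal preconditioner below is an assumed, simple choice:

import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def truncated_newton_direction(grad, hess_vec, diag_estimate, maxiter=50):
    n = grad.size
    # Newton system H d = -g, with H available only through products v -> H v.
    H = LinearOperator((n, n), matvec=hess_vec)
    # Positive definite diagonal preconditioner from a Hessian diagonal estimate.
    M = LinearOperator((n, n), matvec=lambda v: v / np.maximum(diag_estimate, 1e-8))
    d, info = cg(H, -grad, M=M, maxiter=maxiter)  # truncated: few CG iterations
    return d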
The paper contains a description and an analysis of two modifications of the conjugate gradient method for unconstrained minimization which find the minimum of a conic function after a finite number of steps. Moreover, a further extension of the conjugate gradient method is given, which is based on a more general class of model functions.
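For orientation, a conic model function is commonly written (in standard notation, not quoted from the paper) as
\[
  c(x) = c_0 + \frac{g^\top x}{1 - b^\top x}
       + \frac{1}{2}\,\frac{x^\top A x}{(1 - b^\top x)^2},
\]
which reduces to the usual quadratic model when the horizon vector $b$ vanishes.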
The paper describes a dual method for solving a special quadratic programming problem that arises as a subproblem in nonlinear minimax approximation. Two cases, differing in the linear dependence of the gradients of the active functions, are analyzed in detail. The complete algorithm of the dual method is presented and its finite-step convergence is proved.
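For context (a standard formulation with assumed notation, not taken from the paper), the nonlinear minimax problem and a typical direction-finding quadratic programming subproblem read
\[
  \min_{x \in \mathbb{R}^n} \; \max_{1 \le i \le m} f_i(x),
\]
\[
  \min_{d,\,z} \; z + \tfrac{1}{2}\, d^\top G\, d
  \quad \text{s.t.} \quad f_i(x) + \nabla f_i(x)^\top d \le z, \quad i \in \mathcal{A},
\]
where $\mathcal{A}$ indexes the active functions; a dual method works with the Lagrange multipliers of these constraints.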