A modified limited-memory BNS method for unconstrained minimization derived from the conjugate directions idea
Programs and Algorithms of Numerical Mathematics. Publisher: Institute of Mathematics AS CR (Prague), pages 237-243
Abstract
A modification of the limited-memory variable metric BNS method for large-scale unconstrained optimization of a differentiable function $f:\mathcal{R}^N\rightarrow\mathcal{R}$ is considered. It consists in corrections (based on the idea of conjugate directions) of the difference vectors so that the previous quasi-Newton conditions are satisfied better. In comparison with [11], more previous iterations can be utilized here. For quadratic objective functions the improvement of convergence is in some sense the best possible: all stored corrected difference vectors are conjugate, and the quasi-Newton conditions with these vectors are satisfied. The algorithm is globally convergent for convex, sufficiently smooth functions, and our numerical experiments indicate its efficiency.
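For orientation, the quasi-Newton condition referred to in the abstract is the standard secant condition of variable metric methods; the sketch below states the general framework only and is not the specific correction formula of this paper. With difference vectors $s_k = x_{k+1}-x_k$ and $y_k = g_{k+1}-g_k$ (where $g_k$ denotes the gradient of $f$ at $x_k$), an updated inverse Hessian approximation $H_{k+1}$ is required to satisfy
$$H_{k+1}\, y_k = s_k .$$
For a quadratic objective $f(x)=\tfrac{1}{2}\,x^{T} G x - b^{T} x$ with $G$ symmetric positive definite, one has $y_i = G s_i$, and a set of stored difference vectors is conjugate when
$$s_i^{T} G s_j = 0, \qquad i \ne j .$$
The modification described in the abstract corrects the stored difference vectors so that the quasi-Newton conditions from previous iterations hold with these corrected vectors; the precise correction formulas are given in the paper itself.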
How to cite
Vlček, Jan, and Lukšan, Ladislav. "A modified limited-memory BNS method for unconstrained minimization derived from the conjugate directions idea." Programs and Algorithms of Numerical Mathematics. Prague: Institute of Mathematics AS CR, 2015. 237-243. <http://eudml.org/doc/269930>.
@inProceedings{Vlček2015,
abstract = {A modification of the limited-memory variable metric BNS method for large-scale unconstrained optimization of a differentiable function $f:\mathcal{R}^N\rightarrow\mathcal{R}$ is considered. It consists in corrections (based on the idea of conjugate directions) of the difference vectors so that the previous quasi-Newton conditions are satisfied better. In comparison with [11], more previous iterations can be utilized here. For quadratic objective functions the improvement of convergence is in some sense the best possible: all stored corrected difference vectors are conjugate, and the quasi-Newton conditions with these vectors are satisfied. The algorithm is globally convergent for convex, sufficiently smooth functions, and our numerical experiments indicate its efficiency.},
author = {Vlček, Jan and Lukšan, Ladislav},
booktitle = {Programs and Algorithms of Numerical Mathematics},
keywords = {large scale unconstrained optimization; limited-memory variable metric method; BNS method; quasi-Newton method; convergence analysis; numerical experiments},
location = {Prague},
pages = {237-243},
publisher = {Institute of Mathematics AS CR},
title = {A modified limited-memory BNS method for unconstrained minimization derived from the conjugate directions idea},
url = {http://eudml.org/doc/269930},
year = {2015},
}
TY - CLSWK
AU - Vlček, Jan
AU - Lukšan, Ladislav
TI - A modified limited-memory BNS method for unconstrained minimization derived from the conjugate directions idea
T2 - Programs and Algorithms of Numerical Mathematics
PY - 2015
CY - Prague
PB - Institute of Mathematics AS CR
SP - 237
EP - 243
AB - A modification of the limited-memory variable metric BNS method for large-scale unconstrained optimization of a differentiable function $f:\mathcal{R}^N\rightarrow\mathcal{R}$ is considered. It consists in corrections (based on the idea of conjugate directions) of the difference vectors so that the previous quasi-Newton conditions are satisfied better. In comparison with [11], more previous iterations can be utilized here. For quadratic objective functions the improvement of convergence is in some sense the best possible: all stored corrected difference vectors are conjugate, and the quasi-Newton conditions with these vectors are satisfied. The algorithm is globally convergent for convex, sufficiently smooth functions, and our numerical experiments indicate its efficiency.
KW - large scale unconstrained optimization; limited-memory variable metric method; BNS method; quasi-Newton method; convergence analysis; numerical experiments
UR - http://eudml.org/doc/269930
ER -