Modifications of the limited-memory BFGS method based on the idea of conjugate directions
- Programs and Algorithms of Numerical Mathematics, Publisher: Institute of Mathematics AS CR (Prague), pages 209–214
How to cite:
Vlček, Jan, and Lukšan, Ladislav. "Modifications of the limited-memory BFGS method based on the idea of conjugate directions." Programs and Algorithms of Numerical Mathematics. Prague: Institute of Mathematics AS CR, 2013. 209-214. <http://eudml.org/doc/271332>.
@inProceedings{Vlcek2013,
abstract = {Simple modifications of the limited-memory BFGS method (L-BFGS) for large-scale unconstrained optimization are considered. They consist in corrections of the stored difference vectors (derived from the idea of conjugate directions), utilizing information from the preceding iteration. For quadratic objective functions the improvement of convergence is, in a certain sense, optimal, and all stored difference vectors are mutually conjugate for unit step sizes. The algorithm is globally convergent for convex, sufficiently smooth functions. Numerical experiments indicate that the new method often improves on L-BFGS significantly.},
author = {Vlček, Jan and Lukšan, Ladislav},
booktitle = {Programs and Algorithms of Numerical Mathematics},
keywords = {limited-memory BFGS method; unconstrained optimization; quadratic objective function; convergence; performance},
location = {Prague},
pages = {209--214},
publisher = {Institute of Mathematics AS CR},
title = {Modifications of the limited-memory BFGS method based on the idea of conjugate directions},
url = {http://eudml.org/doc/271332},
year = {2013},
}
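For context, the abstract describes corrections to the difference vectors used by L-BFGS. A minimal sketch of the standard L-BFGS two-loop recursion that the paper modifies is given below; it is NOT the authors' conjugate-direction correction, only the baseline method, and all names (`lbfgs_direction`, `s_list`, `y_list`) are illustrative.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Return the search direction -H*grad, where H approximates the inverse
    Hessian from the stored difference pairs s_k = x_{k+1} - x_k and
    y_k = g_{k+1} - g_k (oldest pair first)."""
    if not s_list:
        return -grad  # steepest descent before any pairs are stored
    q = grad.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    # Backward pass over the stored pairs, newest first.
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    # Common initial scaling H0 = (s'y / y'y) * I, using the newest pair.
    s, y = s_list[-1], y_list[-1]
    q *= s.dot(y) / y.dot(y)
    # Forward pass, oldest first (alphas were stored newest-first).
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * y.dot(q)
        q += (a - b) * s
    return -q
```

On a quadratic objective with exact line searches, the directions produced this way are mutually conjugate, which is the property the paper's corrections aim to preserve for the stored difference vectors.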
TY - CPAPER
AU - Vlček, Jan
AU - Lukšan, Ladislav
TI - Modifications of the limited-memory BFGS method based on the idea of conjugate directions
T2 - Programs and Algorithms of Numerical Mathematics
PY - 2013
CY - Prague
PB - Institute of Mathematics AS CR
SP - 209
EP - 214
AB - Simple modifications of the limited-memory BFGS method (L-BFGS) for large-scale unconstrained optimization are considered. They consist in corrections of the stored difference vectors (derived from the idea of conjugate directions), utilizing information from the preceding iteration. For quadratic objective functions the improvement of convergence is, in a certain sense, optimal, and all stored difference vectors are mutually conjugate for unit step sizes. The algorithm is globally convergent for convex, sufficiently smooth functions. Numerical experiments indicate that the new method often improves on L-BFGS significantly.
KW - limited-memory BFGS method; unconstrained optimization; quadratic objective function; convergence; performance
UR - http://eudml.org/doc/271332
ER -