A generalized limited-memory BNS method based on the block BFGS update
- Programs and Algorithms of Numerical Mathematics, Publisher: Institute of Mathematics CAS (Prague), pages 164-171
How to cite
Vlček, Jan, and Lukšan, Ladislav. "A generalized limited-memory BNS method based on the block BFGS update." Programs and Algorithms of Numerical Mathematics. Prague: Institute of Mathematics CAS, 2017. 164-171. <http://eudml.org/doc/288155>.
@inProceedings{Vlček2017,
abstract = {A block version of the BFGS variable metric update formula is investigated. It satisfies the quasi-Newton conditions with all used difference vectors and gives the best improvement of convergence in some sense for quadratic objective functions, but it does not guarantee that the direction vectors are descent for general functions. To overcome this difficulty and utilize the advantageous properties of the block BFGS update, a block version of the limited-memory BNS method for large scale unconstrained optimization is proposed. The algorithm is globally convergent for convex sufficiently smooth functions and our numerical experiments indicate its efficiency.},
author = {Vlček, Jan and Lukšan, Ladislav},
booktitle = {Programs and Algorithms of Numerical Mathematics},
keywords = {unconstrained minimization; block variable metric methods; limited-memory methods; the BFGS update; global convergence; numerical results},
location = {Prague},
pages = {164--171},
publisher = {Institute of Mathematics CAS},
title = {A generalized limited-memory BNS method based on the block BFGS update},
url = {http://eudml.org/doc/288155},
year = {2017},
}
TY - CPAPER
AU - Vlček, Jan
AU - Lukšan, Ladislav
TI - A generalized limited-memory BNS method based on the block BFGS update
T2 - Programs and Algorithms of Numerical Mathematics
PY - 2017
CY - Prague
PB - Institute of Mathematics CAS
SP - 164
EP - 171
AB - A block version of the BFGS variable metric update formula is investigated. It satisfies the quasi-Newton conditions with all used difference vectors and gives the best improvement of convergence in some sense for quadratic objective functions, but it does not guarantee that the direction vectors are descent for general functions. To overcome this difficulty and utilize the advantageous properties of the block BFGS update, a block version of the limited-memory BNS method for large scale unconstrained optimization is proposed. The algorithm is globally convergent for convex sufficiently smooth functions and our numerical experiments indicate its efficiency.
KW - unconstrained minimization
KW - block variable metric methods
KW - limited-memory methods
KW - the BFGS update
KW - global convergence
KW - numerical results
UR - http://eudml.org/doc/288155
ER -
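
As a note on the abstract: the claim that the block update "satisfies the quasi-Newton conditions with all used difference vectors" can be stated compactly. The following is a sketch in standard variable-metric notation; the symbols $H_+$, $S_k$, $Y_k$, and the memory size $m$ are generic conventions, not taken from the paper itself.

```latex
% Standard BFGS imposes a single secant (quasi-Newton) condition per step:
%   H_{+} y_k = s_k,  where  s_k = x_{k+1} - x_k,  y_k = g_{k+1} - g_k.
% A block update instead imposes the condition for all m stored pairs at once:
\[
  H_{+} Y_k = S_k, \qquad
  S_k = [\, s_{k-m+1}, \dots, s_k \,], \quad
  Y_k = [\, y_{k-m+1}, \dots, y_k \,].
\]
```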