A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization

Yongjin Kim; Yunchol Jong; Yong Kim

Applications of Mathematics (2024)

  • Volume: 69, Issue: 6, pages 847-866
  • ISSN: 0862-7940

Abstract

Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. Based on the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method, new conjugate gradient algorithms CG-DESCENT and CGOPT were proposed by W. Hager, H. Zhang (2005) and Y. Dai, C. Kou (2013), respectively, and both have been observed to perform more efficiently than the SSML-BFGS method. Subsequently, C. Kou, Y. Dai (2015) proposed suitable modifications of the SSML-BFGS method such that the sufficient descent condition holds. To further improve the modified SSML-BFGS method, we present in this paper an efficient SSML-BFGS-type three-term conjugate gradient method for unconstrained minimization that uses the Ford-Moghrabi secant equation instead of the usual secant equations. The method is shown to be globally convergent under certain assumptions, and numerical results comparing it with methods based on the usual secant equations are reported.
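For orientation, the two ingredients named in the abstract can be sketched in standard notation (a minimal sketch following the SSML-BFGS literature, e.g. Perry (1977) and Shanno (1978); the parameter choices below are common defaults, not necessarily those of the paper):

% Self-scaling memoryless BFGS search direction, obtained by applying the
% BFGS update to H_k = tau_k I; Shanno's scaling is shown as one common
% choice of tau_k.
\begin{align*}
  d_{k+1} &= -\tau_k g_{k+1}
    + \tau_k\,\frac{(g_{k+1}^{\top} y_k)\, s_k + (g_{k+1}^{\top} s_k)\, y_k}{s_k^{\top} y_k}
    - \Bigl(1 + \tau_k\,\frac{y_k^{\top} y_k}{s_k^{\top} y_k}\Bigr)
      \frac{g_{k+1}^{\top} s_k}{s_k^{\top} y_k}\, s_k,
    \qquad \tau_k = \frac{s_k^{\top} y_k}{y_k^{\top} y_k}, \\
% Two-step (Ford-Moghrabi-type) secant condition: the usual pair
% (s_k, y_k) is replaced by accumulated vectors (r_k, w_k), where
% psi_k is an interpolation-based parameter.
  B_{k+1} r_k &= w_k, \qquad
    r_k = s_k - \psi_k\, s_{k-1}, \qquad
    w_k = y_k - \psi_k\, y_{k-1},
\end{align*}

with $s_k = x_{k+1} - x_k$ and $y_k = g(x_{k+1}) - g(x_k)$. The first line is the three-term direction produced by the memoryless BFGS update of a scaled identity; the second is the generic two-step secant condition of Ford and Moghrabi. Per the abstract, the paper's contribution is to combine a condition of the second kind with a direction of the first kind.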

How to cite

Kim, Yongjin, Jong, Yunchol, and Kim, Yong. "A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization." Applications of Mathematics 69.6 (2024): 847-866. <http://eudml.org/doc/299618>.

@article{Kim2024,
abstract = {Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. Based on the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method, new conjugate gradient algorithms CG-DESCENT and CGOPT were proposed by W. Hager, H. Zhang (2005) and Y. Dai, C. Kou (2013), respectively, and both have been observed to perform more efficiently than the SSML-BFGS method. Subsequently, C. Kou, Y. Dai (2015) proposed suitable modifications of the SSML-BFGS method such that the sufficient descent condition holds. To further improve the modified SSML-BFGS method, we present in this paper an efficient SSML-BFGS-type three-term conjugate gradient method for unconstrained minimization that uses the Ford-Moghrabi secant equation instead of the usual secant equations. The method is shown to be globally convergent under certain assumptions, and numerical results comparing it with methods based on the usual secant equations are reported.},
author = {Kim, Yongjin and Jong, Yunchol and Kim, Yong},
journal = {Applications of Mathematics},
keywords = {unconstrained optimization; conjugate gradient method; multi-step secant condition; self-scaling; improved Wolfe line search},
language = {eng},
number = {6},
pages = {847-866},
publisher = {Institute of Mathematics, Academy of Sciences of the Czech Republic},
title = {A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization},
url = {http://eudml.org/doc/299618},
volume = {69},
year = {2024},
}

TY - JOUR
AU - Kim, Yongjin
AU - Jong, Yunchol
AU - Kim, Yong
TI - A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization
JO - Applications of Mathematics
PY - 2024
PB - Institute of Mathematics, Academy of Sciences of the Czech Republic
VL - 69
IS - 6
SP - 847
EP - 866
AB - Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. Based on the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method, new conjugate gradient algorithms CG-DESCENT and CGOPT were proposed by W. Hager, H. Zhang (2005) and Y. Dai, C. Kou (2013), respectively, and both have been observed to perform more efficiently than the SSML-BFGS method. Subsequently, C. Kou, Y. Dai (2015) proposed suitable modifications of the SSML-BFGS method such that the sufficient descent condition holds. To further improve the modified SSML-BFGS method, we present in this paper an efficient SSML-BFGS-type three-term conjugate gradient method for unconstrained minimization that uses the Ford-Moghrabi secant equation instead of the usual secant equations. The method is shown to be globally convergent under certain assumptions, and numerical results comparing it with methods based on the usual secant equations are reported.
LA - eng
KW - unconstrained optimization; conjugate gradient method; multi-step secant condition; self-scaling; improved Wolfe line search
UR - http://eudml.org/doc/299618
ER -

References

  1. Chen, Y., Cao, M., Yang, Y.: J. Inequal. Appl. 2019 (2019), Article ID 300, 13 pages. Zbl 1499.90216, MR 4036512, DOI 10.1186/s13660-019-2238-9.
  2. Dai, Y.-H., Kou, C.-X.: SIAM J. Optim. 23 (2013), 296-320. Zbl 1266.49065, MR 3033109, DOI 10.1137/100813026.
  3. Dai, Y.-H., Liao, L.-Z.: Appl. Math. Optim. 43 (2001), 87-101. Zbl 0973.65050, MR 1804396, DOI 10.1007/s002450010019.
  4. Dolan, E. D., Moré, J. J.: Math. Program. 91 (2002), 201-213. Zbl 1049.90004, MR 1875515, DOI 10.1007/s101070100263.
  5. Dong, X.-L., Dai, Z.-F., Ghanbari, R., Li, X.-L.: J. Oper. Res. Soc. China 9 (2021), 411-425. Zbl 1488.49058, MR 4263215, DOI 10.1007/s40305-019-00257-w.
  6. Dong, X., Han, D., Dai, Z., Li, L., Zhu, J.: J. Optim. Theory Appl. 179 (2018), 944-961. Zbl 1402.90176, MR 3869490, DOI 10.1007/s10957-018-1377-3.
  7. Ford, J. A., Moghrabi, I. A.: Optim. Methods Softw. 2 (1993), 357-370. MR 1284271, DOI 10.1080/10556789308805550.
  8. Ford, J. A., Moghrabi, I. A.: J. Comput. Appl. Math. 50 (1994), 305-323. Zbl 0807.65062, MR 1284271, DOI 10.1016/0377-0427(94)90309-3.
  9. Ford, J. A., Narushima, Y., Yabe, H.: Comput. Optim. Appl. 40 (2008), 191-216. Zbl 1181.90221, MR 2390824, DOI 10.1007/s10589-007-9087-z.
  10. Hager, W. W., Zhang, H.: SIAM J. Optim. 16 (2005), 170-192. Zbl 1093.90085, MR 2177774, DOI 10.1137/030601880.
  11. Kou, C. X., Dai, Y. H.: J. Optim. Theory Appl. 165 (2015), 209-224. Zbl 1319.49042, MR 3327422, DOI 10.1007/s10957-014-0528-4.
  12. Li, D., Fukushima, M.: J. Comput. Appl. Math. 129 (2001), 15-35. Zbl 0984.65055, MR 1823208, DOI 10.1016/S0377-0427(00)00540-9.
  13. Li, X., Shi, J., Dong, X., Yu, J.: J. Comput. Appl. Math. 350 (2019), 372-379. Zbl 1524.90295, MR 3877400, DOI 10.1016/j.cam.2018.10.035.
  14. Mtagulwa, P., Kaelo, P.: Appl. Numer. Math. 145 (2019), 111-120. Zbl 1477.65095, MR 3963704, DOI 10.1016/j.apnum.2019.06.003.
  15. Narushima, Y., Yabe, H., Ford, J. A.: SIAM J. Optim. 21 (2011), 212-230. Zbl 1250.90087, MR 2765496, DOI 10.1137/080743573.
  16. Nocedal, J., Wright, S. J.: Numerical Optimization. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006). Zbl 1104.65059, MR 2244940, DOI 10.1007/978-0-387-40065-5.
  17. Perry, A.: A class of conjugate gradient algorithms with a two-step variable metric memory. Discussion paper 269 (1977), 16 pages. Available at https://EconPapers.repec.org/RePEc:nwu:cmsems:269.
  18. Shanno, D. F.: SIAM J. Numer. Anal. 15 (1978), 1247-1257. Zbl 0408.90071, MR 0512697, DOI 10.1137/0715085.
  19. Wei, Z., Li, G., Qi, L.: Appl. Math. Comput. 175 (2006), 1156-1188. Zbl 1100.65054, MR 2220320, DOI 10.1016/j.amc.2005.08.027.
  20. Yabe, H., Takano, M.: Comput. Optim. Appl. 28 (2004), 203-225. Zbl 1056.90130, MR 2072533, DOI 10.1023/B:COAP.0000026885.81997.88.
  21. Yuan, G., Wei, Z., Li, G.: J. Comput. Appl. Math. 255 (2014), 86-96. Zbl 1291.90315, MR 3093406, DOI 10.1016/j.cam.2013.04.032.
  22. Zheng, Y., Zheng, B.: J. Optim. Theory Appl. 175 (2017), 502-509. Zbl 1380.65108, MR 3717341, DOI 10.1007/s10957-017-1140-1.
  23. Zhou, Q., Hang, D.: Appl. Numer. Math. 91 (2015), 75-88. Zbl 1310.65070, MR 3312325, DOI 10.1016/j.apnum.2014.12.009.
  24. Zhou, W., Zhang, L.: Optim. Methods Softw. 21 (2006), 707-714. Zbl 1112.90096, MR 2238653, DOI 10.1080/10556780500137041.
