Convergence analysis for principal component flows

Shintaro Yoshizawa; Uwe Helmke; Konstantin Starkov

International Journal of Applied Mathematics and Computer Science (2001)

  • Volume: 11, Issue: 1, page 223-236
  • ISSN: 1641-876X

Abstract

A common framework for analyzing the global convergence of several flows for principal component analysis is developed. It is shown that flows proposed by Brockett, Oja, Xu and others are all gradient flows and the global convergence of these flows to single equilibrium points is established. The signature of the Hessian at each critical point is determined.
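The flows analyzed here include Oja's single-unit flow, whose continuous-time form is the well-known equation dw/dt = Cw − (wᵀCw)w for a covariance matrix C. As a rough illustration of the convergence behavior the abstract describes (not code from the paper; the matrix, step size, and iteration count below are arbitrary choices), a forward-Euler integration of this flow drives a random initial vector toward a unit dominant eigenvector of C:

```python
import numpy as np

# Oja's single-unit flow  dw/dt = C w - (w^T C w) w.
# Its stable equilibria are unit eigenvectors of C associated with the
# largest eigenvalue; the paper establishes convergence of such flows
# to single equilibrium points.  Everything below is an illustrative
# sketch: C, dt, and the step count are arbitrary.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
C = A @ A.T                      # symmetric positive semidefinite "covariance"

w = rng.standard_normal(4)
w /= np.linalg.norm(w)
dt = 0.01
for _ in range(20000):           # forward-Euler integration of the flow
    w = w + dt * (C @ w - (w @ C @ w) * w)

# Compare the limit with the dominant eigenvector computed directly.
eigvals, eigvecs = np.linalg.eigh(C)
v = eigvecs[:, -1]               # eigenvector of the largest eigenvalue
alignment = abs(w @ v) / np.linalg.norm(w)
print(np.linalg.norm(w), alignment)   # norm near 1, alignment near 1
```

At an equilibrium w = cv with v a unit eigenvector, the flow forces c² = 1, so the limit is automatically normalized; the direction converges at a rate governed by the spectral gap of C.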

How to cite


Yoshizawa, Shintaro, Uwe Helmke, and Konstantin Starkov. "Convergence analysis for principal component flows." International Journal of Applied Mathematics and Computer Science 11.1 (2001): 223-236. <http://eudml.org/doc/207501>.

@article{Yoshizawa2001,
abstract = {A common framework for analyzing the global convergence of several flows for principal component analysis is developed. It is shown that flows proposed by Brockett, Oja, Xu and others are all gradient flows and the global convergence of these flows to single equilibrium points is established. The signature of the Hessian at each critical point is determined.},
author = {Yoshizawa, Shintaro and Helmke, Uwe and Starkov, Konstantin},
journal = {International Journal of Applied Mathematics and Computer Science},
keywords = {Hessians; neural networks; principal component analysis; phase portrait; gradient flows},
language = {eng},
number = {1},
pages = {223-236},
title = {Convergence analysis for principal component flows},
url = {http://eudml.org/doc/207501},
volume = {11},
year = {2001},
}

TY - JOUR
AU - Yoshizawa, Shintaro
AU - Helmke, Uwe
AU - Starkov, Konstantin
TI - Convergence analysis for principal component flows
JO - International Journal of Applied Mathematics and Computer Science
PY - 2001
VL - 11
IS - 1
SP - 223
EP - 236
AB - A common framework for analyzing the global convergence of several flows for principal component analysis is developed. It is shown that flows proposed by Brockett, Oja, Xu and others are all gradient flows and the global convergence of these flows to single equilibrium points is established. The signature of the Hessian at each critical point is determined.
LA - eng
KW - Hessians; neural networks; principal component analysis; phase portrait; gradient flows
UR - http://eudml.org/doc/207501
ER -

References

  1. Baldi P. and Hornik K. (1991): Back-propagation and unsupervised learning in linear networks, In: Backpropagation: Theory, Architectures and Applications (Y. Chauvin and D.E. Rumelhart, Eds.). - Hillsdale, NJ:Erlbaum Associates. 
  2. Baldi P. and Hornik K. (1995): Learning in linear neural networks: A survey. - IEEE Trans. Neural Netw., Vol.6, No.4, pp.837-858. 
  3. Brockett R.W. (1991): Dynamical systems that sort lists, diagonalize matrices and solve linear programming problems. - Lin. Algebra Appl., Vol.146, pp.79-91. Zbl0719.90045
  4. Helmke U. and Moore J.B. (1994): Dynamical Systems and Optimization. - London: Springer. Zbl0984.49001
  5. Łojasiewicz S. (1983): Sur les trajectoires du gradient d'une fonction analytique. - Seminari di Geometria, Bologna, Vol.15, pp.115-117. Zbl0606.58045
  6. Oja E. (1982): A simplified neuron model as a principal component analyzer. - J. Math. Biol., Vol.15, No.3, pp.267-273. Zbl0488.92012
  7. Oja E. and Karhunen J. (1985): On stochastic approximation of the eigenvectors and eigenvalues of the expectation of a random matrix. - J. Math. Anal. Appl., Vol.106, No.1, pp.69-84. Zbl0583.62077
  8. Oja E. (1989): Neural networks, principal components, and subspaces. - Int. J. Neural Syst., Vol.1, pp.61-68. 
  9. Oja E., Ogawa H. and Wangviwattana J. (1992a): Principal component analysis by homogeneous neural networks, Part I: The weighted subspace criterion. - IEICE Trans. Inf. Syst., Vol.3, pp.366-375. 
  10. Oja E., Ogawa H. and Wangviwattana J. (1992b): Principal component analysis by homogeneous neural networks, Part II: Analysis and extensions of the learning algorithms. - IEICE Trans. Inf. Syst., Vol.3, pp.376-382. 
  11. Sanger T.D. (1989): Optimal unsupervised learning in a single-layer linear feedforward network. - Neural Netw., Vol.2, No.6, pp.459-473. 
  12. Williams R. (1985): Feature discovery through error-correcting learning. - Tech. Rep. No.8501, University of California, San Diego, Inst. of Cognitive Science. 
  13. Wyatt J.L. and Elfadel I.M. (1995): Time-domain solutions of Oja's equations. - Neural Comp., Vol.7, No.5, pp.915-922. 
  14. Xu L. (1993): Least mean square error reconstruction principle for self-organizing neural nets. - Neural Netw., Vol.6, No.5, pp.627-648. 
  15. Yan W.Y., Helmke U. and Moore J.B. (1994): Global analysis of Oja's flow for neural networks. - IEEE Trans. Neural Netw., Vol.5, No.5, pp.674-683. 
