Global convergence of Oja's subspace algorithm for principal component extraction

IEEE Trans Neural Netw. 1998;9(1):58-67. doi: 10.1109/72.655030.

Abstract

Oja's principal subspace algorithm is a well-known and powerful technique for learning and tracking principal information in time series. This paper undertakes a thorough investigation of the convergence properties of Oja's algorithm. The asymptotic convergence rates of the algorithm are established, and the dependence of the algorithm on its initial weight matrix and on the singularity of the data covariance matrix is comprehensively addressed.
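For readers unfamiliar with the algorithm under study, the following is a minimal sketch of Oja's subspace rule, ΔW = η(x yᵀ − W y yᵀ) with output y = Wᵀx, on synthetic data whose principal subspace is known by construction. The data dimensions, learning rate, and initialization are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 5-dimensional samples whose variance is concentrated
# in the first two coordinates, so the principal subspace is known.
n, d, k = 2000, 5, 2                      # samples, input dim, subspace dim
scales = np.array([3.0, 2.0, 0.3, 0.2, 0.1])
X = rng.standard_normal((n, d)) * scales  # per-coordinate std deviations

# Oja's subspace rule: W <- W + eta * (x y^T - W y y^T), y = W^T x
W = 0.1 * rng.standard_normal((d, k))     # random initial weight matrix
eta = 0.01                                # illustrative learning rate
for x in X:
    y = W.T @ x                           # k-dimensional output
    W += eta * (np.outer(x, y) - W @ np.outer(y, y))

# Columns of W should approximately span the top-k principal subspace,
# i.e. the first two coordinate axes for this data.
P = W @ np.linalg.pinv(W)                 # orthogonal projector onto span(W)
print(np.round(np.diag(P), 2))
```

The diagonal of the projector P should be close to 1 in the first two coordinates and close to 0 elsewhere, indicating that the learned weight matrix has converged to a basis of the principal subspace; the paper's results concern exactly this kind of convergence and its rate.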