Efficiently updating and tracking the dominant kernel principal components

Neural Netw. 2007 Mar;20(2):220-9. doi: 10.1016/j.neunet.2006.09.012. Epub 2007 Jan 17.

Abstract

The dominant eigenvectors of the symmetric kernel Gram matrix are used in many important kernel methods (e.g., kernel principal component analysis, feature approximation, denoising, compression, and prediction) in machine learning. Yet for dynamic and/or large-scale data, the batch nature and computational cost of the eigendecomposition limit these methods in numerous applications. In this paper we present an efficient incremental approach for fast calculation of the dominant kernel eigenbasis, which allows us to track the kernel eigenspace dynamically. Experiments show that our updating scheme delivers a numerically stable and accurate approximation of the eigenvalues and eigenvectors at every iteration, in comparison with the batch algorithm.
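The paper's exact updating equations are not reproduced in this record, but the following minimal sketch illustrates the general idea of tracking a dominant kernel eigenbasis incrementally: when a new sample arrives, the grown Gram matrix is projected onto the span of the current eigenvectors plus the new coordinate axis, and only a small (r+1)-by-(r+1) eigenproblem is solved instead of a full batch decomposition. The function names (rbf_kernel, update_eigenbasis), the RBF kernel choice, and the Rayleigh-Ritz-style projection are illustrative assumptions, not the authors' published scheme.

    import numpy as np

    def rbf_kernel(X, Y, gamma=0.5):
        """Gaussian RBF kernel matrix between rows of X and rows of Y (assumed kernel)."""
        sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)

    def update_eigenbasis(U, lam, k_new, kappa, r):
        """One generic rank-limited update of the dominant eigenpairs of a Gram
        matrix after appending a single sample (a sketch, not the paper's method).

        U     : (n, r) current dominant eigenvectors of K_n
        lam   : (r,)   current dominant eigenvalues of K_n
        k_new : (n,)   kernel evaluations between the new point and the old ones
        kappa : float  kernel self-similarity k(x_new, x_new)
        r     : number of components to keep
        """
        n = U.shape[0]
        # Project the bordered Gram matrix [[K_n, k], [k^T, kappa]] onto the
        # expanded trial subspace span(U) + new coordinate axis, using the
        # current approximation K_n U ~= U diag(lam). The part of k_new
        # orthogonal to span(U) is dropped, a standard subspace-tracking
        # approximation.
        p = U.T @ k_new                        # (r,) projection of the new column
        S = np.zeros((len(lam) + 1, len(lam) + 1))
        S[:-1, :-1] = np.diag(lam)
        S[:-1, -1] = p
        S[-1, :-1] = p
        S[-1, -1] = kappa
        d, V = np.linalg.eigh(S)               # small (r+1) x (r+1) eigenproblem
        order = np.argsort(d)[::-1][:r]        # keep the r largest eigenpairs
        d, V = d[order], V[:, order]
        # Rotate the expanded basis back to full (n+1)-dimensional coordinates.
        U_ext = np.zeros((n + 1, U.shape[1] + 1))
        U_ext[:n, :-1] = U
        U_ext[n, -1] = 1.0
        return U_ext @ V, d

A possible usage pattern, starting from one small batch decomposition and then updating sample by sample (for the RBF kernel, kappa = k(x, x) = 1):

    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 3))
    r = 5
    K = rbf_kernel(X, X)
    lam, U = np.linalg.eigh(K)
    U, lam = U[:, -r:], lam[-r:]               # batch start: r dominant eigenpairs

    x_new = rng.normal(size=(1, 3))
    k_new = rbf_kernel(X, x_new).ravel()
    U, lam = update_eigenbasis(U, lam, k_new, 1.0, r)

The per-update cost is O(nr) for the projection plus O(r^3) for the small eigenproblem, which is what makes this style of tracking attractive compared with a full O(n^3) batch eigendecomposition at every step.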

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Artificial Intelligence*
  • Humans
  • Information Storage and Retrieval*
  • Logical Observation Identifiers Names and Codes
  • Neural Networks, Computer*
  • Pattern Recognition, Automated
  • Principal Component Analysis*