Mixtures of Probabilistic Principal Component Analyzers

Article Properties
  • Language
    English
  • Publication Date
    1999/02/01
  • Indian UGC (Journal)
  • References
    18
  • Citations
    724
  • Authors
    Michael E. Tipping, Microsoft Research, St. George House, Cambridge CB2 3NH, U.K.
    Christopher M. Bishop, Microsoft Research, St. George House, Cambridge CB2 3NH, U.K.
Cite
Tipping, Michael E., and Christopher M. Bishop. “Mixtures of Probabilistic Principal Component Analyzers”. Neural Computation, vol. 11, no. 2, 1999, pp. 443-82, https://doi.org/10.1162/089976699300016728.
Tipping, M. E., & Bishop, C. M. (1999). Mixtures of Probabilistic Principal Component Analyzers. Neural Computation, 11(2), 443-482. https://doi.org/10.1162/089976699300016728
Tipping ME, Bishop CM. Mixtures of Probabilistic Principal Component Analyzers. Neural Computation. 1999;11(2):443-82.
Journal Categories
  • Medicine: Internal medicine: Neurosciences. Biological psychiatry. Neuropsychiatry
  • Science: Mathematics: Instruments and machines: Electronic computers. Computer science
  • Technology: Electrical engineering. Electronics. Nuclear engineering: Electronics
  • Technology: Mechanical engineering and machinery
Description

How can we overcome the limitations of traditional principal component analysis (PCA)? This paper introduces a mixture-model approach to PCA that addresses the limitations imposed by PCA's global linearity. While nonlinear variants of PCA exist, this research instead captures data complexity through a combination of local linear PCA projections. The key step is to formulate PCA within a maximum likelihood framework as a Gaussian latent variable model. That formulation leads to a well-defined mixture of probabilistic principal component analyzers whose parameters can be determined with an expectation-maximization (EM) algorithm. The model's advantages are discussed in the context of clustering, density modeling, and local dimensionality reduction, and its utility is demonstrated through applications in image compression and handwritten digit recognition.
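
To make the description concrete: the generative model behind probabilistic PCA, as formulated in the paper, is a Gaussian latent variable model in which an observed d-dimensional vector t arises from a q-dimensional latent vector x (with q < d):

```latex
t = W x + \mu + \epsilon, \qquad
x \sim \mathcal{N}(0, I_q), \quad
\epsilon \sim \mathcal{N}(0, \sigma^2 I_d)
\quad\Longrightarrow\quad
t \sim \mathcal{N}\!\left(\mu,\; W W^{\top} + \sigma^2 I_d\right).
```

The mixture model places several such analyzers side by side, each with its own mixing weight, mean, factor matrix, and noise variance, and fits them jointly by EM. The sketch below is a minimal NumPy illustration of that loop, assuming the paper's closed-form M-step in which each component's W and sigma^2 are recovered from an eigendecomposition of the responsibility-weighted covariance matrix. The function name fit_mppca and its defaults are invented for this example, not taken from the paper or any library.

```python
import numpy as np

def fit_mppca(T, n_analyzers, q, n_iter=50, seed=0):
    """Illustrative EM for a mixture of probabilistic PCA models.

    T: (N, d) data matrix; q: latent dimension per analyzer (q < d).
    """
    rng = np.random.default_rng(seed)
    N, d = T.shape
    pi = np.full(n_analyzers, 1.0 / n_analyzers)              # mixing weights
    mu = T[rng.choice(N, n_analyzers, replace=False)].astype(float)
    C = np.tile(np.eye(d), (n_analyzers, 1, 1))               # W W^T + sigma^2 I per analyzer

    for _ in range(n_iter):
        # E-step: responsibilities R[n, i] proportional to pi_i * N(t_n | mu_i, C_i)
        log_r = np.empty((N, n_analyzers))
        for i in range(n_analyzers):
            diff = T - mu[i]
            _, logdet = np.linalg.slogdet(C[i])
            maha = np.einsum('nd,nd->n', diff, np.linalg.solve(C[i], diff.T).T)
            log_r[:, i] = np.log(pi[i]) - 0.5 * (d * np.log(2 * np.pi) + logdet + maha)
        log_r -= log_r.max(axis=1, keepdims=True)             # stabilize the exponentials
        R = np.exp(log_r)
        R /= R.sum(axis=1, keepdims=True)

        # M-step: weighted means, then per-analyzer PPCA from the weighted covariance
        Nk = R.sum(axis=0)
        pi = Nk / N
        for i in range(n_analyzers):
            mu[i] = R[:, i] @ T / Nk[i]
            diff = T - mu[i]
            S = (R[:, i, None] * diff).T @ diff / Nk[i]       # responsibility-weighted covariance
            evals, evecs = np.linalg.eigh(S)
            evals, evecs = evals[::-1], evecs[:, ::-1]        # descending eigenvalue order
            sigma2 = evals[q:].mean()                         # mean of the discarded eigenvalues
            W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
            C[i] = W @ W.T + sigma2 * np.eye(d)
    return pi, mu, C
```

With a single analyzer this reduces to the closed-form probabilistic PCA solution, and letting the noise variance sigma^2 tend to zero recovers standard PCA projections.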

Published in Neural Computation, this paper addresses a core area of research in neural networks and machine learning: dimensionality reduction. The development of a mixture model for PCA is a significant contribution to the field, aligning with the journal's interest in novel computational techniques for data analysis and with its combined theoretical and applied focus.

Citations Analysis
The first work to cite this article, "A hierarchical latent variable model for data visualization," was published in 1998; the most recent citation comes from a 2024 study. Citations peaked in 2022, with 51 that year. The article has been cited in 325 different journals, 17% of which are open access. Among related journals, Neurocomputing has cited this research the most, with 31 citations. The chart below illustrates the annual citation trends for this article.
Citations of this article by year