Abstract
Over the last century, Principal Component Analysis (PCA) has become one of the pillars of modern scientific methods. Although PCA is typically viewed as a statistical tool aimed at finding orthogonal directions on which the variance is maximized, its first introduction by Pearson in 1901 was in the framework of the non-linear least-squares problem of fitting a plane to scattered data points. Since linear least-squares regression also fits a plane to scattered data points, PCA and linear least-squares regression thus have a natural kinship, which we explore in this paper. In particular, we present an iterated linear least-squares approach that yields a sequence of subspaces converging to the space spanned by the leading principal components. The key observation by which we establish our result is that each iteration of the Power (or Subspace) Iterations, applied to the covariance matrix, can be interpreted as the solution to a linear least-squares problem.
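As a quick numerical illustration of the convergence claim (this is a hedged sketch, not the paper's algorithm or its specific least-squares formulation), the snippet below runs plain subspace iteration on a sample covariance matrix and checks that the resulting basis spans the same subspace as the leading right singular vectors of the data. All names and sizes (`X`, `C`, `Q`, `n`, `d`, `k`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data with a clear spectral gap after the first k singular values,
# so that subspace iteration converges quickly (sizes are illustrative).
n, d, k = 500, 10, 3
U0, _ = np.linalg.qr(rng.normal(size=(n, d)))
V0, _ = np.linalg.qr(rng.normal(size=(d, d)))
s = np.array([10.0, 9.0, 8.0, 1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4])
X = (U0 * s) @ V0.T            # n x d data matrix, treated as centered

C = X.T @ X / n                # sample covariance matrix

# Plain subspace iteration on C: multiply by C, then re-orthonormalize.
Q, _ = np.linalg.qr(rng.normal(size=(d, k)))
for _ in range(100):
    Q, _ = np.linalg.qr(C @ Q)

# Leading k principal directions from the SVD of X, for comparison.
Vk = np.linalg.svd(X, full_matrices=False)[2][:k].T

# Cosines of the principal angles between the two k-dimensional subspaces;
# they are all ~1 when the subspaces coincide.
cosines = np.linalg.svd(Q.T @ Vk, compute_uv=False)
print(cosines)                 # expected: [1. 1. 1.] up to numerical error
```

The paper's contribution is to recast each such iteration as a linear least-squares problem; the sketch above only verifies the standard fact that the iterated subspace converges to the span of the leading principal components.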
| Original language | English |
| --- | --- |
| Pages (from-to) | 84-92 |
| Number of pages | 9 |
| Journal | Applied and Computational Harmonic Analysis |
| Volume | 63 |
| DOIs | |
| State | Published - Mar 2023 |
Keywords
- Least-squares
- Principal component analysis
- Singular value decomposition
- Subspace iterations
All Science Journal Classification (ASJC) codes
- Applied Mathematics