Abstract
Often, large, high-dimensional datasets collected across multiple modalities can be organized as a higher-order tensor. Low-rank tensor decomposition then arises as a powerful and widely used tool to discover simple low-dimensional structures underlying such data. However, we currently lack a theoretical understanding of the algorithmic behavior of low-rank tensor decompositions. We derive Bayesian approximate message passing (AMP) algorithms for recovering arbitrarily shaped low-rank tensors buried within noise, and we employ dynamic mean-field theory to precisely characterize their performance. Our theory reveals the existence of phase transitions between easy, hard, and impossible inference regimes, and displays an excellent match with simulations. Moreover, it reveals several qualitative surprises compared to the behavior of symmetric, cubic tensor decomposition. Finally, we compare our AMP algorithm to the most commonly used algorithm, alternating least squares (ALS), and demonstrate that AMP significantly outperforms ALS in the presence of noise.
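As a toy illustration of the setting the abstract describes, the sketch below plants a rank-1 signal inside a noisy 3-way tensor and runs rank-1 ALS, the baseline algorithm the abstract compares against. This is a minimal sketch under assumed conventions, not the paper's AMP algorithm or its experimental setup: the size `n`, signal strength `lam`, and noise level `sigma` are illustrative choices.

```python
import numpy as np

# Minimal spiked rank-1 tensor model (illustrative parameters, not the paper's setup)
rng = np.random.default_rng(0)
n = 30          # tensor side length (assumption)
lam = 10.0      # planted signal strength (assumption)
sigma = 0.01    # noise standard deviation (assumption)

# Planted unit-norm factors and the noisy observed tensor T = lam*(u ⊗ v ⊗ w) + noise
u, v, w = (x / np.linalg.norm(x) for x in rng.standard_normal((3, n)))
T = lam * np.einsum('i,j,k->ijk', u, v, w) + sigma * rng.standard_normal((n, n, n))

# Rank-1 ALS: update each factor in turn by contracting T against the
# other two current estimates, then renormalize.
uh, vh, wh = (x / np.linalg.norm(x) for x in rng.standard_normal((3, n)))
for _ in range(50):
    uh = np.einsum('ijk,j,k->i', T, vh, wh); uh /= np.linalg.norm(uh)
    vh = np.einsum('ijk,i,k->j', T, uh, wh); vh /= np.linalg.norm(vh)
    wh = np.einsum('ijk,i,j->k', T, uh, vh); wh /= np.linalg.norm(wh)

# |overlap| near 1 means the planted factor was recovered (sign is arbitrary)
print(abs(uh @ u), abs(vh @ v), abs(wh @ w))
```

At this low noise level ALS recovers the planted factors easily; the phase transitions the abstract refers to concern what happens as the noise grows relative to the signal.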
Field | Value
---|---
Original language | English
Article number | 124016
Journal | Journal of Statistical Mechanics: Theory and Experiment
Volume | 2019
Issue number | 12
DOIs |
State | Published - 20 Dec 2019
Externally published | Yes
Keywords
- machine learning
All Science Journal Classification (ASJC) codes
- Statistical and Nonlinear Physics
- Statistics and Probability
- Statistics, Probability and Uncertainty