Monotonicity of the Trace-Inverse of Covariance Submatrices and Two-Sided Prediction

Abstract
It is common to assess the 'memory strength' of a stationary process by looking at how fast the normalized log-determinant of its covariance submatrices (i.e., entropy rate) decreases. In this work, we propose an alternative characterization in terms of the normalized trace-inverse of the covariance submatrices. We show that this sequence is monotonically non-decreasing and is constant if and only if the process is white. Furthermore, while the entropy rate is associated with one-sided prediction errors (present from past), the new measure is associated with two-sided prediction errors (present from past and future). Minimizing this measure is then used as an alternative to Burg's maximum-entropy principle for spectral estimation. We also propose a counterpart for non-stationary processes, by looking at the average trace-inverse of subsets.
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 2767-2781 |
| Number of pages | 15 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 68 |
| Issue number | 4 |
| DOIs | |
| State | Published - 1 Apr 2022 |
Keywords
- Maximum entropy
- causality
- minimum mean square error
- prediction
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences