Quantization of signals is an integral part of modern signal processing applications, such as sensing, communication, and inference. While signal quantization provides many physical advantages, it usually degrades the performance of estimation based on the quantized data. In an attempt to satisfy physical constraints while simultaneously attaining substantial performance gains, we consider systems with mixed-resolution data: 1-bit quantized and continuous-valued measurements. First, we describe the linear minimum mean-squared error (LMMSE) estimator and its associated mean-squared error (MSE) for the general mixed-resolution model. However, the MSE of the LMMSE estimator requires a matrix inversion whose dimensions are set by the number of measurements, and thus may not be a tractable tool for optimization and system design. Therefore, we present the linear Gaussian orthonormal (LGO) measurement model and derive a closed-form analytic expression for the MSE of the LMMSE estimator under this model. We discuss two common special cases of the LGO model: 1) scalar parameter estimation, and 2) channel estimation in multiple-input multiple-output (MIMO) communication systems with mixed analog-to-digital converters (ADCs). We then solve the resource allocation optimization problem under the LGO model, with the proposed tractable MSE as the objective function and under a power constraint, by using a one-dimensional search. Further, we present the concept of dithering for mixed-resolution models and optimize the dithering noise as part of the resource allocation optimization problem. Simulations show that the proposed resource allocation and dithering policies provide significant performance improvement.
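As a minimal numerical illustration of the mixed-resolution LMMSE idea, the sketch below estimates a scalar Gaussian parameter from one continuous-valued and one 1-bit measurement using sample covariances. The signal model, noise levels, and Monte Carlo covariance estimates are illustrative assumptions for this sketch; they are not the paper's LGO model or its closed-form MSE expression.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Illustrative model (assumed for this sketch): scalar Gaussian parameter
# observed through one continuous and one 1-bit quantized noisy channel.
theta = rng.normal(0.0, 1.0, n)                     # parameter, variance 1
y_cont = theta + 0.5 * rng.normal(0.0, 1.0, n)      # continuous-valued data
y_1bit = np.sign(theta + 0.5 * rng.normal(0.0, 1.0, n))  # 1-bit quantized data

# Mixed-resolution data vector and its (sample) second-order statistics.
Y = np.stack([y_cont, y_1bit])                      # shape (2, n)
C_yy = np.cov(Y)                                    # 2x2 data covariance
c_ty = np.array([np.cov(theta, y_cont)[0, 1],
                 np.cov(theta, y_1bit)[0, 1]])      # parameter-data cross-cov

# LMMSE estimator: theta_hat = c_ty^T C_yy^{-1} y  (zero-mean case).
w = np.linalg.solve(C_yy, c_ty)
theta_hat = w @ Y

mse_empirical = np.mean((theta - theta_hat) ** 2)
mse_analytic = 1.0 - c_ty @ w                       # var(theta) - c^T C^{-1} c
mse_cont_only = 1.0 - c_ty[0] ** 2 / C_yy[0, 0]     # ignoring the 1-bit data

print(f"mixed-resolution MSE: {mse_empirical:.4f}")
print(f"continuous-only MSE:  {mse_cont_only:.4f}")
```

Even in this toy setting, fusing the 1-bit measurement with the continuous one lowers the MSE relative to using the continuous measurement alone, which is the qualitative effect the mixed-resolution design exploits.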
- Massive MIMO
- Resource allocation
- Linear minimum mean-squared error
- Mixed-ADC architecture