New Cramer-Rao-Type Bound for Constrained Parameter Estimation.

Research output: Working paper › Preprint

Abstract

Non-Bayesian parameter estimation under parametric constraints is encountered in numerous applications in signal processing, communications, and control. Mean-squared-error (MSE) lower bounds are widely used as performance benchmarks and for system design. The well-known constrained Cramér-Rao bound (CCRB) is a lower bound on the MSE of estimators that satisfy certain unbiasedness conditions. In many constrained estimation problems, these unbiasedness conditions are too strict, and popular estimators, such as the constrained maximum likelihood estimator, do not satisfy them. In addition, MSE performance can be uniformly improved by implementing estimators that do not satisfy these conditions. As a result, the CCRB is not a valid bound on the MSE of such estimators. In this paper, we propose a new definition of unbiasedness in constrained settings, denoted C-unbiasedness, which is based on applying Lehmann-unbiasedness with a weighted MSE (WMSE) risk while taking the parametric constraints into account. In addition, a Cramér-Rao-type bound on the WMSE of C-unbiased estimators, denoted the Lehmann-unbiased CCRB (LU-CCRB), is derived. It is shown that, in general, C-unbiasedness is less restrictive than the CCRB unbiasedness conditions. Thus, the LU-CCRB is valid for a larger set of estimators than the CCRB, and C-unbiased estimators whose WMSE is lower than the corresponding CCRB may exist. In the simulations, we examine linear and nonlinear estimation problems under nonlinear parametric constraints in which the constrained maximum likelihood estimator is shown to be C-unbiased and the LU-CCRB is an informative bound on its WMSE, while the corresponding CCRB is not valid.
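For context, the following background sketch is not taken from the paper itself; the notation $\mathbf{f}$, $\mathbf{F}$, $\mathbf{U}$, $\mathbf{J}$, and $\mathbf{W}$ is a common convention assumed here, since the abstract does not define symbols. In a standard formulation, the constraint set is $\{\boldsymbol{\theta} : \mathbf{f}(\boldsymbol{\theta}) = \mathbf{0}\}$ with Jacobian $\mathbf{F}(\boldsymbol{\theta}) = \partial\mathbf{f}(\boldsymbol{\theta})/\partial\boldsymbol{\theta}^{T}$ of full row rank, and $\mathbf{U}(\boldsymbol{\theta})$ is a matrix whose columns form an orthonormal basis for the null space of $\mathbf{F}(\boldsymbol{\theta})$. A standard form of the CCRB on the MSE matrix of estimators satisfying the associated unbiasedness conditions is

$$ \mathrm{E}_{\boldsymbol{\theta}}\big[(\hat{\boldsymbol{\theta}}-\boldsymbol{\theta})(\hat{\boldsymbol{\theta}}-\boldsymbol{\theta})^{T}\big] \succeq \mathbf{U}\big(\mathbf{U}^{T}\mathbf{J}(\boldsymbol{\theta})\mathbf{U}\big)^{-1}\mathbf{U}^{T}, $$

where $\mathbf{J}(\boldsymbol{\theta})$ is the Fisher information matrix. A WMSE risk of the kind named in the abstract is typically $\mathrm{E}_{\boldsymbol{\theta}}\big[(\hat{\boldsymbol{\theta}}-\boldsymbol{\theta})^{T}\mathbf{W}(\hat{\boldsymbol{\theta}}-\boldsymbol{\theta})\big]$ for a positive semidefinite weight matrix $\mathbf{W}$, and Lehmann-unbiasedness of $\hat{\boldsymbol{\theta}}$ with respect to a cost $C(\hat{\boldsymbol{\theta}}, \boldsymbol{\theta})$ requires

$$ \mathrm{E}_{\boldsymbol{\theta}}\big[C(\hat{\boldsymbol{\theta}}, \boldsymbol{\eta})\big] \ge \mathrm{E}_{\boldsymbol{\theta}}\big[C(\hat{\boldsymbol{\theta}}, \boldsymbol{\theta})\big] \quad \text{for all } \boldsymbol{\eta}, \boldsymbol{\theta} \text{ in the parameter space}. $$

As described in the abstract, C-unbiasedness applies this Lehmann condition with a WMSE cost while restricting attention to the constrained parameter set.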
Original language: American English
State: Published - Nov 2018
