Two New Families of Local Asymptotically Minimax Lower Bounds in Parameter Estimation

Research output: Contribution to journal › Article › peer-review

Abstract

We propose two families of asymptotically local minimax lower bounds on parameter estimation performance. The first family of bounds applies to any convex, symmetric loss function that depends solely on the difference between the estimate and the true underlying parameter value (i.e., the estimation error), whereas the second is oriented more specifically to the moments of the estimation error. The proposed bounds are relatively easy to calculate numerically (in the sense that they are optimized over only a few auxiliary parameters), yet across many application examples they turn out to be tighter (sometimes significantly so) than previously reported bounds of comparable computational effort. Beyond their relative simplicity, they offer the following advantages: (i) essentially no regularity conditions are required of the parametric family of distributions; (ii) the bounds are local (in a sense to be specified); (iii) the bounds exhibit the correct order of decay as functions of the number of observations, at least in all the examples examined; and (iv) at least the first family of bounds extends straightforwardly to vector parameters.
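
As a rough illustration of the object such bounds constrain, the following LaTeX sketch gives a generic textbook form of the local minimax risk; the notation (loss \rho, localization radius \delta, reference point \theta_0, sample size n) is assumed here and is not necessarily the exact criterion used in the paper.

% A generic local minimax risk at a reference point \theta_0 (assumed
% notation): the estimator \hat{\theta}_n, built from n observations,
% competes against the worst parameter value in a small neighborhood
% of \theta_0.
\[
  R_n(\theta_0) \;=\; \inf_{\hat{\theta}_n}\,
  \sup_{\theta :\, |\theta - \theta_0| \le \delta}
  \mathbb{E}_\theta\!\left[ \rho\bigl( \hat{\theta}_n - \theta \bigr) \right]
\]
% Here \rho(\cdot) is a convex, symmetric loss, e.g., \rho(u) = u^2 for
% the mean-square error; a lower bound on R_n(\theta_0) limits the
% performance of every estimator simultaneously near \theta_0.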

Original language: English
Article number: 944
Journal: Entropy
Volume: 26
Issue number: 11
DOIs
State: Published - Nov 2024

Keywords

  • Fisher information
  • hypothesis testing
  • mean-square error
  • minimax estimation
  • parameter estimation
  • probability of error

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Mathematical Physics
  • Physics and Astronomy (miscellaneous)
  • General Physics and Astronomy
  • Electrical and Electronic Engineering
