We develop a uniform Cramér-Rao lower bound (UCRLB) on the total variance of any estimator of an unknown deterministic parameter vector whose bias gradient matrix has norm bounded by a constant. We consider two different measures of norm, leading to two corresponding bounds. When the observations are related to the unknown vector through a linear Gaussian model, Tikhonov regularization and the shrunken estimator are shown to achieve the UCRLB. For more general models, we show that the penalized maximum likelihood estimator with a suitable penalizing function asymptotically achieves the UCRLB.
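As a minimal sketch of the linear Gaussian setting discussed above, the snippet below forms a Tikhonov-regularized (ridge) estimate for a hypothetical model y = Hx + w with white Gaussian noise, and computes the estimator's bias gradient matrix, whose norm is controlled by the regularization parameter. The dimensions, noise level, and regularization weight `lam` are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear Gaussian model: y = H x + w, with w ~ N(0, sigma^2 I).
n, m = 50, 10                      # observations, parameters (assumed sizes)
H = rng.standard_normal((n, m))    # known model matrix
x_true = rng.standard_normal(m)    # unknown deterministic parameter vector
sigma = 0.1
y = H @ x_true + sigma * rng.standard_normal(n)

# Tikhonov-regularized estimate: x_hat = (H^T H + lam I)^{-1} H^T y.
lam = 1.0                          # illustrative regularization weight
A = H.T @ H + lam * np.eye(m)
x_hat = np.linalg.solve(A, H.T @ y)

# For this linear estimator the bias gradient matrix is
#   B = (H^T H + lam I)^{-1} H^T H - I,
# with spectral norm strictly below 1 for lam > 0; this is the
# bounded-bias-gradient regime in which the UCRLB applies.
B = np.linalg.solve(A, H.T @ H) - np.eye(m)
print("estimation error:", np.linalg.norm(x_hat - x_true))
print("bias gradient spectral norm:", np.linalg.norm(B, 2))
```

Increasing `lam` shrinks the variance while growing the norm of the bias gradient, which is exactly the trade-off the UCRLB quantifies.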