A Minimal Complete Class Theorem for Decision Problems Where the Parameter Space Contains Only Finitely Many Points.
It was recently shown that all estimators which are locally best in the relative interior of the parameter set, together with their limits, constitute a complete class in linear estimation, both unbiased and biased. However, not all of these limits are admissible. A sufficient condition for the admissibility of a limit was given by the author (1986) for the case of unbiased estimation in a linear model with the natural parameter space. This paper extends that result to the general linear model and to biased...
Let Y be the observation vector in the usual linear model with expectation Xb and covariance matrix known up to a multiplicative scalar s^2, possibly singular. A linear statistic a'Y is called an invariant estimator for a parametric function c'b if its MSE depends on (b, s^2) only through s^2. It is shown that a'Y is admissible invariant for c'b if, and only if, it is a BLUE of c'b in the case when c'b is estimable with zero variance, and it is of the form k d'Y, where 0 <= k <= 1 and d'Y is an arbitrary BLUE, otherwise. This result is used in...
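The invariance property described in the abstract can be checked numerically in a toy instance. A minimal sketch, assuming the standard full-rank model Y = Xb + e with Cov(e) = s^2 I (all names and the Monte Carlo setup here are illustrative, not from the paper): the MSE of the BLUE of c'b should not change when the true parameter b changes.

```python
import numpy as np

# Illustrative sketch (not from the paper): in the model Y = X b + e with
# Cov(e) = s^2 * I, the BLUE of the parametric function c'b is
# c'(X'X)^{-1} X' Y.  We check by Monte Carlo that its MSE does not depend
# on b, i.e. the estimator is "invariant" in the sense used in the abstract.
rng = np.random.default_rng(0)
n, p = 50, 2
X = rng.standard_normal((n, p))
c = np.array([1.0, -1.0])
sigma = 2.0

def mse_of_blue(b, reps=20000):
    # Monte Carlo estimate of the MSE of the BLUE at true parameter b
    A = np.linalg.solve(X.T @ X, X.T)   # (X'X)^{-1} X'
    est = c @ A                         # coefficient vector of the BLUE
    Y = X @ b + sigma * rng.standard_normal((reps, n))
    return np.mean((Y @ est - c @ b) ** 2)

# The two MSEs agree up to Monte Carlo error: the risk depends only on s^2.
m1 = mse_of_blue(np.array([0.0, 0.0]))
m2 = mse_of_blue(np.array([5.0, -3.0]))
print(m1, m2)
```

The same computation with a scaled estimator k d'Y, 0 <= k <= 1, trades variance against bias while staying in the invariant class.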
Estimation in a truncated parameter space is one of the most important problems in statistical inference, because the frequently used criterion of unbiasedness is useless there: in general, no unbiased estimator exists. Other optimality criteria, such as admissibility and minimaxity, must therefore be considered. In this paper we consider a subclass of the exponential families of distributions. The Bayes estimator of a lower-bounded scale parameter, under the squared-log error loss function with...
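Under the squared-log error loss L(theta, d) = (log d - log theta)^2 mentioned above, the Bayes action is the exponential of the posterior mean of log theta. A minimal numerical sketch with a made-up discrete posterior on a lower-bounded grid (the posterior and grid here are illustrative assumptions, not the paper's derivation):

```python
import numpy as np

# Illustrative sketch: under L(theta, d) = (log d - log theta)^2, the Bayes
# estimator is d* = exp(E[log theta | data]).  We verify this on a discrete
# toy posterior supported on a lower-bounded grid theta >= theta0 = 1.
theta0 = 1.0
grid = np.linspace(theta0, 10.0, 2000)       # truncated parameter space
post = np.exp(-grid) * grid**2               # unnormalised toy posterior
post /= post.sum()

d_star = np.exp(np.sum(post * np.log(grid))) # claimed Bayes estimator

def risk(d):
    # posterior expected squared-log loss of action d
    return np.sum(post * (np.log(d) - np.log(grid)) ** 2)

# d_star minimises the posterior risk over a range of candidate actions.
candidates = np.linspace(theta0, 10.0, 500)
best = candidates[np.argmin([risk(d) for d in candidates])]
print(d_star, best)
```

Note that d* automatically respects the lower bound whenever the posterior is supported on [theta0, infinity), which is one reason this loss is convenient for truncated scale parameters.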