
A sufficient condition for admissibility in linear estimation

Czesław Stępniak (1988)

Aplikace matematiky

It was recently shown that all estimators which are locally best in the relative interior of the parameter set, together with their limits, constitute a complete class in linear estimation, both unbiased and biased. However, not all these limits are admissible. A sufficient condition for admissibility of a limit was given by the author (1986) for the case of unbiased estimation in a linear model with the natural parameter space. This paper extends this result to the general linear model and to biased...

Admissible invariant estimators in a linear model

Czesław Stępniak (2014)


Let 𝐲 be the observation vector in the usual linear model with expectation 𝐀β and covariance matrix known up to a multiplicative scalar, possibly singular. A linear statistic 𝐚ᵀ𝐲 is called an invariant estimator for a parametric function φ = 𝐜ᵀβ if its MSE depends on β only through φ. It is shown that 𝐚ᵀ𝐲 is admissible invariant for φ if and only if it is a BLUE of φ, in the case when φ is estimable with zero variance, and otherwise it is of the form kφ̂, where k ∈ [0, 1] and φ̂ is an arbitrary BLUE. This result is used in...

An admissible estimator of a lower-bounded scale parameter under squared-log error loss function

Eisa Mahmoudi, Hojatollah Zakerzadeh (2011)


Estimation in a truncated parameter space is one of the most important problems in statistical inference, because the frequently used criterion of unbiasedness is useless: no unbiased estimator exists in general. Hence other optimality criteria, such as admissibility and minimaxity, have to be sought instead. In this paper we consider a subclass of the exponential families of distributions. The Bayes estimator of a lower-bounded scale parameter, under the squared-log error loss function with...

Bayes unbiased estimation in a model with two variance components

Jaroslav Stuchlý (1987)

Aplikace matematiky

In the paper an explicit expression for the Bayes invariant quadratic unbiased estimate of a linear function of the variance components is presented for the mixed linear model 𝐭 = 𝐗β + ϵ, 𝐄(𝐭) = 𝐗β, 𝐃(𝐭) = θ₁𝐔₁ + θ₂𝐔₂ with unknown variance components in the normal case. The matrices 𝐔₁, 𝐔₂ may be singular. Applications to two examples of the analysis of variance are given.

Estimation of the size of a closed population

S. Sengupta (2010)

Applicationes Mathematicae

The problem considered is that of estimation of the size (N) of a closed population under three sampling schemes admitting unbiased estimation of N. It is proved that for each of these schemes, the uniformly minimum variance unbiased estimator (UMVUE) of N is inadmissible under the squared error loss function. For the first scheme, the UMVUE is also the maximum likelihood estimator (MLE) of N. For the second scheme and a special case of the third, it is shown respectively that an MLE and an estimator...

Evaluating default priors with a generalization of Eaton’s Markov chain

Brian P. Shea, Galin L. Jones (2014)

Annales de l'I.H.P. Probabilités et statistiques

We consider evaluating improper priors in a formal Bayes setting according to the consequences of their use. Let 𝛷 be a class of functions on the parameter space and consider estimating elements of 𝛷 under quadratic loss. If the formal Bayes estimator of every function in 𝛷 is admissible, then the prior is strongly admissible with respect to 𝛷. Eaton’s method for establishing strong admissibility is based on studying the stability properties of a particular Markov chain associated with the inferential...

Invariancia de las reglas admisibles y Bayes respecto de transformaciones monótonas. [Invariance of admissible and Bayes rules with respect to monotone transformations.]

Francisco Criado Torralba (1983)

Trabajos de Estadística e Investigación Operativa

From an optimality point of view, the solution of a decision problem is related to classes of optimal strategies (admissible, Bayes, etc.), which are closely related to boundaries of the risk set S, such as the lower boundary, the Bayes boundary and the positive Bayes boundary. In this paper we present some results concerning invariance properties of such boundaries when the set is transformed by means of a continuous, monotone increasing function W.

Minimax prediction under random sample size

Alicja Jokiel-Rokita (2002)

Applicationes Mathematicae

A class of minimax predictors of random variables with multinomial or multivariate hypergeometric distribution is determined in the case when the sample size is assumed to be a random variable with an unknown distribution. It is also proved that the usual predictors, which are minimax when the sample size is fixed, are not minimax, but they remain admissible when the sample size is an ancillary statistic with unknown distribution.
