Gamma-minimax estimation of multinomial probabilities
Our purpose in this paper is to provide a general approach to model selection via penalization for Gaussian regression and to develop our point of view on this subject. The advantage and importance of model selection come from the fact that it provides a suitable approach to many different types of problems, starting with model selection per se (deciding which model, among a family of parametric models, is most suitable for the data at hand), which includes, for instance, variable selection in regression models,...
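For orientation, a hedged sketch of the generic penalized criterion such an approach rests on; the symbols Y (response vector), ŝ_m (least-squares estimator over model S_m) and pen(·) (penalty) are assumed here, not taken from the abstract:

\[
\hat m \in \operatorname*{arg\,min}_{m \in \mathcal{M}} \left\{ \lVert Y - \hat s_m \rVert^2 + \operatorname{pen}(m) \right\},
\]

with variable selection recovered by taking the models S_m to be the linear spans of subsets of the candidate regressors.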
Information inequalities for the minimax risk of sequential estimators are derived in the case where the loss is measured by the squared error of estimation plus a linear functional of the number of observations. The results are applied to construct minimax sequential estimators of the failure rate in an exponential model with censored data, the expected proportion of uncensored observations in the proportional hazards model, the odds ratio in a binomial distribution, and the expectation of exponential...
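For concreteness, a hedged rendering of the loss just described, writing d for the terminal estimate of θ, N for the (possibly random) number of observations, and c > 0 for an assumed cost per observation:

\[
L(\theta, d, N) = (d - \theta)^2 + cN,
\]

so the information inequalities bound the minimax value \(\inf_{(N,d)} \sup_{\theta} \mathbb{E}_{\theta}\bigl[(d-\theta)^2 + cN\bigr]\) from below.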
The R-ε criterion is considered as a generalization of the minimax criterion, in a decision problem with Θ = {θ1, ..., θn}, and its relation to invariance is studied. If a decision problem is invariant under a finite group G, it is known from the minimax point of view that, for any rule δ, there exists an invariant rule δ' which is either preferred or equivalent to δ. The question raised in this paper is: given that the minimax ordering is a particular case of the R-ε ordering, is it possible...
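A hedged sketch of the classical invariance reduction invoked here: if the problem is invariant under the finite group G and the loss is convex in the decision (or randomized rules are allowed), the group average of any rule δ,

\[
\delta'(x) = \frac{1}{|G|} \sum_{g \in G} \tilde g^{-1} \, \delta(gx),
\]

is invariant and satisfies \(\sup_{\theta} R(\theta, \delta') \le \sup_{\theta} R(\theta, \delta)\), i.e. δ' is preferred or equivalent to δ in the minimax ordering (here \(\tilde g\) denotes the induced action of g on the decision space).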
The problems of minimax mutual prediction are considered for binomial and multinomial random variables and for sums of bounded random variables with unknown distribution. For a loss function that is a linear combination of quadratic losses, minimax mutual predictors are determined; the parameters of the predictors are obtained by numerically solving certain equations.
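As an assumed rendering of such a loss (the symbols are illustrative, not taken from the abstract), with X₁,...,X_k the variables, dᵢ a predictor of Xᵢ built from the remaining variables, and cᵢ ≥ 0 fixed weights:

\[
L(d, X) = \sum_{i=1}^{k} c_i \bigl( d_i(X_{-i}) - X_i \bigr)^2 ,
\]

so each variable is predicted from the others and the individual quadratic errors are combined linearly.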
The problem of minimax mutual prediction is considered for multinomial random variables, with the loss function being a linear combination of quadratic losses associated with the prediction of the individual variables. The basic parameter of the minimax mutual predictor is determined by numerically solving a certain equation.
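The equation determining the basic parameter is specific to the paper and is not reproduced here; purely as an illustration of the numerical step itself (scalar root finding for a predictor parameter), a minimal Python sketch with a hypothetical placeholder equation g:

from scipy.optimize import brentq

def g(t, n=50, k=3):
    # Hypothetical placeholder equation g(t) = 0 in the predictor parameter t;
    # n and k stand for an assumed sample size and number of multinomial cells.
    # This is NOT the equation derived in the paper.
    return t * (n + t) - n / k

t_star = brentq(g, 1e-8, 100.0)  # bracket assumed to contain the root
print(f"illustrative parameter value: {t_star:.6f}")

Only the generic root-finding step is shown; the actual defining equation has to be taken from the paper.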
Let U₀ be a random vector taking its values in a measurable space and having an unknown distribution P, and let U₁,...,Uₙ and a second, unobserved sample be independent simple random samples from P of sizes n and m, respectively. Further, let a number of real-valued functions be defined on the same space. Assuming that only the first sample is observed, we find a minimax predictor d⁰(n,U₁,...,Uₙ) of the vector determined by these functions and the second sample, with respect to a quadratic error loss function.
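On the assumption (the symbols below are mine, not the abstract's) that the quantity to be predicted is the vector of sums of the given functions over the unobserved second sample, a quadratic error loss for the predictor d = (d₁,...,d_k) can be written as

\[
L\bigl(d, (V_1, \dots, V_m)\bigr) = \sum_{j=1}^{k} c_j \Bigl( d_j - \sum_{i=1}^{m} g_j(V_i) \Bigr)^{2},
\]

with g₁,...,g_k the real-valued functions, V₁,...,V_m the unobserved sample, and c_j ≥ 0 assumed weights.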