Gaussian model selection

Lucien Birgé, Pascal Massart (2001)

Journal of the European Mathematical Society

Our purpose in this paper is to provide a general approach to model selection via penalization for Gaussian regression and to develop our point of view about this subject. The advantage and importance of model selection come from the fact that it provides a suitable approach to many different types of problems, starting from model selection per se (among a family of parametric models, which one is more suitable for the data at hand), which includes for instance variable selection in regression models,...
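A minimal sketch of the generic penalized criterion behind this line of work (the collection of models Sₘ, the estimators ŝₘ and the penalty pen(m) are illustrative notation, not the authors' specific choices): given Gaussian observations Y, one selects

\[
\hat m \in \operatorname*{arg\,min}_{m \in \mathcal M} \Bigl\{ \lVert Y - \hat s_m \rVert^2 + \mathrm{pen}(m) \Bigr\},
\]

where ŝₘ is the least-squares estimator within model Sₘ and the penalty typically grows with the dimension of Sₘ.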

Information inequalities for the minimax risk of sequential estimators (with applications)

Lesław Gajek, B. Mizera-Florczak (1998)

Applicationes Mathematicae

Information inequalities for the minimax risk of sequential estimators are derived in the case where the loss is measured by the squared error of estimation plus a linear functional of the number of observations. The results are applied to construct minimax sequential estimators of: the failure rate in an exponential model with censored data, the expected proportion of uncensored observations in the proportional hazards model, the odds ratio in a binomial distribution and the expectation of exponential...
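As an illustrative sketch of the loss structure described above (the cost constant c and the notation are assumptions, not the paper's): for a sequential procedure with stopping time N and terminal estimate d of θ, the loss is

\[
L(\theta, d, N) = (d - \theta)^2 + c\,N, \qquad c > 0,
\]

and the information inequalities bound from below the minimax value of the expected loss over all such sequential procedures.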

Invariance and R-ε criterion.

Julián de la Horra (1986)

Trabajos de Estadística

The R-ε criterion is considered as a generalization of the minimax criterion in a decision problem with Θ = {θ₁, ..., θₙ}, and its relation to invariance is studied. If a decision problem is invariant under a finite group G, it is known, from the minimax point of view, that for any rule δ there exists an invariant rule δ' which is either preferred to or equivalent to δ. The question raised in this paper is: given that the minimax ordering is a particular case of the R-ε ordering, is it possible...

Minimax mutual prediction

Stanisław Trybuła (2000)

Applicationes Mathematicae

The problems of minimax mutual prediction are considered for binomial and multinomial random variables and for sums of limited random variables with unknown distribution. For a loss function that is a linear combination of quadratic losses, minimax mutual predictors are determined, with the parameters of the predictors obtained by numerical solution of certain equations.
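A hedged sketch of a loss of the kind described (the weights cᵢ and the predictors dᵢ are illustrative notation): if each variable Xᵢ is predicted by a predictor dᵢ built from the remaining variables, the loss is

\[
L = \sum_i c_i \,(d_i - X_i)^2, \qquad c_i \ge 0,
\]

and a minimax mutual predictor minimizes the maximal expected loss over the unknown parameters.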

Minimax mutual prediction of multinomial random variables

Stanisław Trybuła (2003)

Applicationes Mathematicae

The problem of minimax mutual prediction is considered for multinomial random variables, with a loss function that is a linear combination of the quadratic losses associated with the prediction of the individual variables. The basic parameter of the minimax mutual predictor is determined by numerical solution of a certain equation.

Minimax nonparametric prediction

Maciej Wilczyński (2001)

Applicationes Mathematicae

Let U₀ be a random vector taking its values in a measurable space and having an unknown distribution P, and let U₁,...,Uₙ and V₁,...,Vₘ be independent, simple random samples from P of sizes n and m, respectively. Further, let z₁,...,zₖ be real-valued functions defined on the same space. Assuming that only the first sample is observed, we find a minimax predictor d⁰(n,U₁,...,Uₙ) of the vector Yₘ = ∑ⱼ₌₁ᵐ (z₁(Vⱼ),...,zₖ(Vⱼ))ᵀ with respect to a quadratic error loss function.
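For the quadratic error loss referred to here, a minimal sketch (up to possible weights, which the abstract does not specify) is

\[
L(d, Y_m) = \lVert d - Y_m \rVert^2 = \sum_{i=1}^{k} (d_i - Y_{m,i})^2,
\]

and the minimax predictor d⁰(n,U₁,...,Uₙ) minimizes the supremum over P of the expected loss.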
