Displaying 101 – 120 of 253

Minimax nonparametric hypothesis testing for ellipsoids and Besov bodies

Yuri I. Ingster, Irina A. Suslina (2010)

ESAIM: Probability and Statistics

We observe an infinite-dimensional Gaussian random vector x = ξ + v, where ξ is a sequence of standard Gaussian variables and v ∈ l₂ is an unknown mean. We consider the hypothesis testing problem H₀: v = 0 versus the alternatives Hε,τ: v ∈ Vε for sets Vε = Vε(τ, ρε) ⊂ l₂. The sets Vε are lq-ellipsoids of semi-axes aᵢ = i⁻ˢR/ε with an lp-ellipsoid of semi-axes bᵢ = i⁻ʳρε/ε removed, or similar Besov bodies Bq,t;s(R/ε) with Besov bodies Bp,h;r(ρε/ε) removed. Here τ = (κ, R) or τ = (κ, h, t, R); κ = (p, q, r, s) are the parameters which define the sets Vε for given radii...
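
For readers who want to see the setup numerically, here is a minimal sketch, assuming a finite truncation level, toy parameter values, and a naive χ²-type detection statistic rather than the minimax test of the paper: it simulates the Gaussian sequence model x = ξ + v and a mean v placed on the boundary of an l₂-ellipsoid.

```python
# Illustrative simulation of the Gaussian sequence model x = xi + v, truncated
# to N coordinates, with a naive chi-square-type detection statistic; this is
# a toy sketch, not the minimax test studied in the paper.
import numpy as np

rng = np.random.default_rng(0)
N = 500                                  # truncation level (assumption)
s, R, eps = 1.0, 1.0, 0.002              # illustrative ellipsoid parameters

i = np.arange(1, N + 1)
a = i**(-s) * R / eps                    # semi-axes a_i = i^-s * R / eps
v = a / np.sqrt(N)                       # a mean on the boundary of the l2-ellipsoid

def chi2_statistic(x):
    """Centred, scaled squared norm; large values suggest v != 0."""
    return (np.sum(x**2) - N) / np.sqrt(2 * N)

print("under H0:", chi2_statistic(rng.standard_normal(N)))
print("under the alternative:", chi2_statistic(rng.standard_normal(N) + v))
```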

Minimax nonparametric prediction

Maciej Wilczyński (2001)

Applicationes Mathematicae

Let U₀ be a random vector taking its values in a measurable space and having an unknown distribution P, and let U₁,...,Uₙ and V₁,...,Vₘ be independent, simple random samples from P of sizes n and m, respectively. Further, let z₁,...,zₖ be real-valued functions defined on the same space. Assuming that only the first sample is observed, we find a minimax predictor d⁰(n,U₁,...,Uₙ) of the vector Yₘ = ∑ⱼ₌₁ᵐ (z₁(Vⱼ),...,zₖ(Vⱼ))ᵀ with respect to a quadratic error loss function.
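
The toy sketch below illustrates the prediction target only; the plug-in rule it uses (m times the first-sample means of z₁,...,zₖ) is an illustrative assumption, not the paper's minimax predictor d⁰, and the distribution P, the functions zⱼ, and the sample sizes are arbitrary choices.

```python
# Toy illustration of predicting Y_m = sum_j (z_1(V_j), ..., z_k(V_j))^T from
# the first sample only; the plug-in rule below is NOT the paper's minimax d0.
import numpy as np

rng = np.random.default_rng(1)
n, m = 50, 30                              # sample sizes (assumptions)
z = [np.sin, np.cos, np.square]            # illustrative functions z_1, ..., z_k

U = rng.exponential(size=n)                # observed sample U_1, ..., U_n from P
V = rng.exponential(size=m)                # unobserved sample V_1, ..., V_m from P

Y_m = np.array([[zj(v) for zj in z] for v in V]).sum(axis=0)   # target vector
d_plugin = m * np.array([zj(U).mean() for zj in z])            # plug-in predictor

print("realized Y_m:      ", Y_m)
print("plug-in prediction:", d_plugin)
```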

Minimax Prediction for the Multinomial and Multivariate Hypergeometric Distributions

Alicja Jokiel-Rokita (1998)

Applicationes Mathematicae

A problem of minimax prediction for the multinomial and multivariate hypergeometric distributions is considered. A class of minimax predictors is determined for estimating linear combinations of the unknown parameter and the random variable having the multinomial or the multivariate hypergeometric distribution.
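
As a hedged illustration of the prediction problem in the multinomial case, the sketch below predicts a linear combination c₁ᵀp + c₂ᵀX of the unknown parameter p and a future count vector X with an ordinary plug-in rule; the coefficients, sample sizes, and the plug-in rule itself are illustrative assumptions, not the minimax class derived in the paper.

```python
# Illustrative plug-in prediction of a linear combination of the multinomial
# parameter p and a future multinomial count vector X (not the minimax rule).
import numpy as np

rng = np.random.default_rng(2)
p_true = np.array([0.5, 0.3, 0.2])         # unknown in practice
N_obs, N_future = 100, 40                  # sample sizes (assumptions)
c1 = np.array([1.0, 0.0, -1.0])            # weights on the parameter p
c2 = np.array([0.0, 1.0, 0.0])             # weights on the future counts X

counts = rng.multinomial(N_obs, p_true)    # observed data
p_hat = counts / N_obs                     # plug-in estimate of p

prediction = c1 @ p_hat + c2 @ (N_future * p_hat)   # predict c1'p + c2'X
X_future = rng.multinomial(N_future, p_true)
print("prediction:", prediction, " realized:", c1 @ p_true + c2 @ X_future)
```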

Minimax prediction under random sample size

Alicja Jokiel-Rokita (2002)

Applicationes Mathematicae

A class of minimax predictors of random variables with multinomial or multivariate hypergeometric distribution is determined in the case when the sample size is assumed to be a random variable with an unknown distribution. It is also proved that the usual predictors, which are minimax when the sample size is fixed, are not minimax, but they remain admissible when the sample size is an ancillary statistic with unknown distribution.

Minimax results for estimating integrals of analytic processes

Karim Benhenni, Jacques Istas (2010)

ESAIM: Probability and Statistics

The problem of predicting integrals of stochastic processes is considered. Linear estimators have been constructed by means of samples at N discrete times for processes having a fixed Hölderian regularity s > 0 in quadratic mean. It is known that the rate of convergence of the mean squared error is of order N⁻⁽²ˢ⁺¹⁾. In the class of analytic processes Hₚ, p ≥ 1, we show that among all estimators, the linear ones are optimal. Moreover, using optimal coefficient estimators derived through...
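
The rate N⁻⁽²ˢ⁺¹⁾ can be checked numerically in the Hölder case s = 1/2 with a generic linear estimator. The sketch below integrates simulated Brownian paths with the trapezoidal rule (standard quadrature weights, not the paper's optimal coefficients; the grid sizes and Monte Carlo setup are assumptions); the mean squared error should drop roughly fourfold each time N doubles.

```python
# Monte Carlo check of the N^-(2s+1) rate for integrating a Brownian path
# (Hoelder regularity s = 1/2) with a trapezoidal linear estimator built from
# N + 1 sampled values; generic weights, not the paper's optimal coefficients.
import numpy as np

rng = np.random.default_rng(3)

def mse_trapezoid(N, reps=2000, fine=4096):
    """Mean squared error of the N-point trapezoid estimator of int_0^1 W_t dt."""
    errs = []
    for _ in range(reps):
        dW = rng.standard_normal(fine) * np.sqrt(1.0 / fine)
        W = np.concatenate(([0.0], np.cumsum(dW)))        # path on a fine grid
        truth = np.sum((W[1:] + W[:-1]) / 2) / fine        # "true" integral
        idx = np.linspace(0, fine, N + 1).astype(int)      # N + 1 observation times
        Ws = W[idx]
        est = np.sum((Ws[1:] + Ws[:-1]) / 2) / N           # linear estimator
        errs.append((est - truth) ** 2)
    return np.mean(errs)

for N in (4, 8, 16, 32):
    print(N, mse_trapezoid(N))   # expect roughly a 4x drop each time N doubles
```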

Minimax theorems with applications to convex metric spaces

Jürgen Kindler (1995)

Colloquium Mathematicae

A minimax theorem is proved which contains a recent result of Pinelis and a version of the classical minimax theorem of Ky Fan as special cases. Some applications to the theory of convex metric spaces (farthest points, rendez-vous value) are presented.

Minimum disparity estimators for discrete and continuous models

María Luisa Menéndez, Domingo Morales, Leandro Pardo, Igor Vajda (2001)

Applications of Mathematics

Disparities of discrete distributions are introduced as a natural and useful extension of the information-theoretic divergences. The minimum disparity point estimators are studied in regular discrete models with i.i.d. observations and their asymptotic efficiency of the first order, in the sense of Rao, is proved. These estimators are applied to continuous models with i.i.d. observations when the observation space is quantized by fixed points, or at random, by the sample quantiles of fixed orders....
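
A minimal sketch of the idea, under assumed choices (a Poisson model, the squared Hellinger disparity, and a brute-force grid search; the paper treats general disparities under regularity conditions): the estimate minimizes the disparity between the empirical relative frequencies and the model probability mass function.

```python
# Minimum disparity estimation sketch: fit a Poisson(lam) model by minimizing
# the squared Hellinger disparity to the empirical frequencies (illustrative
# choices of model, disparity, and optimizer; not the paper's general setting).
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(4)
data = rng.poisson(3.0, size=200)                      # i.i.d. discrete sample

support = np.arange(data.max() + 1)                    # observed part of the support
emp = np.bincount(data, minlength=support.size) / data.size   # empirical pmf

def poisson_pmf(lam, k):
    return np.array([exp(-lam) * lam**int(x) / factorial(int(x)) for x in k])

def hellinger_disparity(lam):
    model = poisson_pmf(lam, support)
    return np.sum((np.sqrt(emp) - np.sqrt(model)) ** 2)

grid = np.linspace(0.5, 8.0, 751)
lam_hat = grid[np.argmin([hellinger_disparity(l) for l in grid])]
print("minimum Hellinger disparity estimate:", lam_hat)
```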

Minimum distance estimator for a hyperbolic stochastic partial differential equation

Vincent Monsan, Modeste N'zi (2000)

Applicationes Mathematicae

We study a minimum distance estimator in L₂-norm for a class of nonlinear hyperbolic stochastic partial differential equations, driven by a two-parameter white noise. The consistency and asymptotic normality of this estimator are established under some regularity conditions on the coefficients. Our results are applied to the two-parameter Ornstein-Uhlenbeck process.
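
To make the minimum distance idea concrete in a far simpler setting than the paper's two-parameter hyperbolic SPDE, the sketch below estimates the drift of a one-parameter Ornstein-Uhlenbeck process by minimizing an L₂-type distance between observed increments and the model drift; the discretization, criterion, and grid optimizer are all illustrative assumptions.

```python
# Simplified analogue of a minimum L2-distance estimator: recover the drift
# parameter theta of a one-parameter Ornstein-Uhlenbeck process
# dX_t = -theta * X_t dt + dW_t by minimizing the squared L2 distance between
# observed increments and the model drift (not the paper's SPDE setting).
import numpy as np

rng = np.random.default_rng(5)
theta_true, T, n = 2.0, 10.0, 5000
dt = T / n

# Euler scheme simulation of the OU path.
X = np.zeros(n + 1)
for i in range(n):
    X[i + 1] = X[i] - theta_true * X[i] * dt + np.sqrt(dt) * rng.standard_normal()

def l2_distance(theta):
    """Squared L2 distance between increments and the drift -theta * X * dt."""
    return np.sum((np.diff(X) + theta * X[:-1] * dt) ** 2)

grid = np.linspace(0.1, 5.0, 491)
theta_hat = grid[np.argmin([l2_distance(t) for t in grid])]
print("minimum distance estimate of theta:", theta_hat)
```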
