Mean square error for histograms when estimating Radon-Nikodym derivatives.
This paper deals with a scalar response conditioned by a functional random variable. The main goal is to estimate the conditional hazard function. An asymptotic formula for the mean square error of the proposed estimator is derived, taking into account, as usual, both the bias and the variance.
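For orientation, the bias–variance decomposition referred to above has the following standard form; the notation (ĥ for the estimator and h for the conditional hazard of the scalar response at a point t, given the functional covariate x) is purely illustrative and not taken from the paper.

    \mathrm{MSE}(t \mid x) = \mathbb{E}\bigl[\hat h(t \mid x) - h(t \mid x)\bigr]^{2}
        = \bigl(\mathbb{E}\,\hat h(t \mid x) - h(t \mid x)\bigr)^{2} + \operatorname{Var}\bigl(\hat h(t \mid x)\bigr).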
We consider a deconvolution problem of estimating a signal blurred with random noise. The noise is assumed to be a stationary Gaussian process multiplied by a weight function εh, where h ∈ L²(ℝ¹) and ε is a small parameter. The underlying solution is assumed to be infinitely differentiable. For this model we find asymptotically minimax and Bayes estimators. In the case of solutions having a finite number of derivatives, similar results were obtained in [G.K. Golubev and R.Z. Khasminskii,...
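A minimal sketch of an observation model consistent with this description, with illustrative notation (K for the blurring operator, f for the unknown signal, ξ for the stationary Gaussian process); the paper's exact formulation may differ.

    Y(t) = (Kf)(t) + \varepsilon\, h(t)\, \xi(t), \qquad h \in L^{2}(\mathbb{R}^{1}), \quad \varepsilon \to 0.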
Let U₀ be a random vector taking its values in a measurable space and having an unknown distribution P, and let U₁,...,Uₙ, together with a second sample, be independent simple random samples from P of sizes n and m, respectively. Further, let real-valued functions be defined on the same space. Assuming that only the first sample is observed, we find a minimax predictor d⁰(n,U₁,...,Uₙ) of the vector of interest with respect to a quadratic error loss function.
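As a purely illustrative rendering of the optimality criterion (the predicted vector Y, the class 𝒟 of predictors, and the family 𝒫 of candidate distributions are placeholder symbols, since the abstract does not display them), the minimax predictor under quadratic loss solves

    d^{0} = \arg\min_{d \in \mathcal{D}} \; \sup_{P \in \mathcal{P}} \; \mathbb{E}_{P}\bigl\| d(U_{1},\dots,U_{n}) - Y \bigr\|^{2}.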
The problem of predicting integrals of stochastic processes is considered. Linear estimators have been constructed by means of samples at N discrete times for processes having a fixed Hölderian regularity s > 0 in quadratic mean. It is known that the rate of convergence of the mean squared error is of order N⁻⁽²ˢ⁺¹⁾. In the class of analytic processes Hp, p ≥ 1, we show that among all estimators, the linear ones are optimal. Moreover, using optimal coefficient estimators derived through...
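The following is a minimal simulation sketch of this setting, not the paper's construction: Brownian motion is used only as a convenient process with quadratic-mean Hölderian regularity s = 1/2, the trapezoidal rule serves as a simple linear estimator, and the Monte Carlo mean squared error should then decay roughly like N⁻² = N⁻⁽²ˢ⁺¹⁾.

    import numpy as np

    rng = np.random.default_rng(0)

    def trapezoid(y, x):
        """Trapezoidal rule: a linear estimator of the integral from sampled values."""
        return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x))

    def simulate_brownian(n_fine):
        """Brownian path on a fine grid of [0, 1]; quadratic-mean Hölder regularity s = 1/2."""
        dt = 1.0 / n_fine
        increments = rng.normal(0.0, np.sqrt(dt), size=n_fine)
        return np.concatenate(([0.0], np.cumsum(increments)))

    def mse_of_linear_predictor(N, n_fine=4096, n_rep=2000):
        """Monte Carlo MSE of the trapezoidal predictor based on N+1 sample times."""
        t_fine = np.linspace(0.0, 1.0, n_fine + 1)
        idx = np.linspace(0, n_fine, N + 1).astype(int)    # N+1 equispaced observation times
        errors = np.empty(n_rep)
        for r in range(n_rep):
            path = simulate_brownian(n_fine)
            true_integral = trapezoid(path, t_fine)        # "exact" integral on the fine grid
            predictor = trapezoid(path[idx], t_fine[idx])  # linear estimator from the samples
            errors[r] = (predictor - true_integral) ** 2
        return errors.mean()

    for N in (4, 8, 16, 32):
        print(N, mse_of_linear_predictor(N))   # MSE shrinks by about 4 each time N doubles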
We propose a method, based on a penalised likelihood criterion, for estimating the number of non-zero components of the mean of a Gaussian vector. Following the work of Birgé and Massart on Gaussian model selection, we choose the penalty function so that the resulting estimator minimises the Kullback risk.
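A rough sketch of how such a criterion can be evaluated in practice; the constant c and the Birgé–Massart-type penalty pen(k) = c σ² k (1 + √(2 log(n/k)))² used below are illustrative assumptions, not the penalty derived in the paper.

    import numpy as np

    def select_k(y, sigma=1.0, c=1.1):
        """Pick the number of non-zero mean components by a penalised least-squares criterion."""
        n = len(y)
        y_sq_desc = np.sort(y ** 2)[::-1]                  # largest squared coordinates first
        total = np.sum(y ** 2)
        rss = np.concatenate(([total], total - np.cumsum(y_sq_desc)))   # rss[k]: keep k largest
        crit = np.empty(n + 1)
        crit[0] = rss[0]                                   # k = 0: no component kept
        for k in range(1, n + 1):
            pen = c * sigma ** 2 * k * (1.0 + np.sqrt(2.0 * np.log(n / k))) ** 2
            crit[k] = rss[k] + pen
        return int(np.argmin(crit))

    rng = np.random.default_rng(1)
    mu = np.zeros(200)
    mu[:5] = 5.0                                           # five genuinely non-zero components
    y = mu + rng.normal(size=200)
    print(select_k(y))                                     # typically prints a value close to 5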
This paper deals with a non-parametric problem coming from physics, namely quantum tomography. It consists in determining the quantum state of a mode of light through a homodyne measurement. We apply several model selection procedures: penalized projection estimators, where we may use pattern functions or wavelets, and penalized maximum likelihood estimators. In all these cases, we obtain oracle inequalities. In the former case we also have a polynomial rate of convergence for the non-parametric problem....
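For orientation, the oracle inequalities mentioned here typically take the following generic shape for a penalized projection estimator selected from a collection of models {S_m}; the constants, the penalty pen(m) and the remainder term are illustrative placeholders, not the bounds proved in the paper.

    \mathbb{E}\,\bigl\| \hat f_{\hat m} - f \bigr\|^{2} \le C \inf_{m}\Bigl\{ \inf_{g \in S_{m}} \| f - g \|^{2} + \mathrm{pen}(m) \Bigr\} + \frac{C'}{n}.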
The purpose of this paper is to investigate moderate deviations for the Durbin–Watson statistic associated with the stable first-order autoregressive process where the driven noise is also given by a first-order autoregressive process. We first establish a moderate deviation principle both for the least squares estimator of the unknown parameter of the autoregressive process and for the serial correlation estimator associated with the driven noise. This enables us to provide a moderate deviation...
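A minimal simulation sketch of the quantities involved (model notation assumed, not taken from the paper): the stable first-order autoregressive process with autoregressive driven noise, the least squares estimator of its parameter, the residual serial-correlation estimator, and the Durbin–Watson statistic.

    import numpy as np

    rng = np.random.default_rng(2)
    n, theta, rho = 5000, 0.5, 0.3                       # |theta| < 1: stable autoregression

    xi = rng.normal(size=n)
    eps = np.empty(n)
    eps[0] = xi[0]
    for t in range(1, n):
        eps[t] = rho * eps[t - 1] + xi[t]                # driven noise, itself first-order autoregressive
    X = np.empty(n)
    X[0] = eps[0]
    for t in range(1, n):
        X[t] = theta * X[t - 1] + eps[t]                 # stable first-order autoregressive process

    theta_hat = np.sum(X[1:] * X[:-1]) / np.sum(X[:-1] ** 2)          # least squares estimator
    resid = X[1:] - theta_hat * X[:-1]                                # residuals
    rho_hat = np.sum(resid[1:] * resid[:-1]) / np.sum(resid ** 2)     # serial correlation estimator
    dw = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)             # Durbin-Watson statistic
    print(theta_hat, rho_hat, dw)                        # dw is close to 2 * (1 - rho_hat)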
In this paper, we give sufficient conditions to establish central limit theorems and a moderate deviation principle for a class of support estimates of empirical and Poisson point processes. The estimates considered are obtained by smoothing some bias-corrected extreme values of the point process. We show how the smoothing makes it possible to obtain Gaussian asymptotic limits and therefore pointwise confidence intervals. Some unidimensional and multidimensional examples are provided.
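A hypothetical illustration of the general recipe described here (bin-wise extreme values, a bias correction, then smoothing) on a Poisson point process with a known boundary; the intensity, the correction factor, and the kernel below are illustrative choices, not the paper's estimator.

    import numpy as np

    rng = np.random.default_rng(3)
    f = lambda x: 1.0 + 0.5 * np.sin(2 * np.pi * x)      # true (unknown) boundary of the support

    # Poisson point process with intensity lam on the region {(x, y): 0 <= y <= f(x)}.
    lam, fmax = 2000.0, 1.5
    n_pts = rng.poisson(lam * fmax)
    xs = rng.uniform(0.0, 1.0, n_pts)
    ys = rng.uniform(0.0, fmax, n_pts)
    keep = ys <= f(xs)
    xs, ys = xs[keep], ys[keep]

    # Bin-wise maxima with a simple bias correction: within a narrow bin the heights are
    # roughly uniform on [0, f], so E[max of N points] = f * N / (N + 1), and multiplying
    # the maximum by (N + 1) / N removes this downward bias.
    n_bins = 40
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    corrected = np.full(n_bins, np.nan)
    for j in range(n_bins):
        in_bin = (xs >= edges[j]) & (xs < edges[j + 1])
        N = in_bin.sum()
        if N > 0:
            corrected[j] = ys[in_bin].max() * (N + 1) / N

    # Kernel smoothing of the corrected extremes; this is the step that yields
    # Gaussian asymptotic limits and hence pointwise confidence intervals.
    def estimate(x, bandwidth=0.05):
        w = np.exp(-0.5 * ((x - centers) / bandwidth) ** 2)
        w = np.where(np.isnan(corrected), 0.0, w)
        vals = np.where(np.isnan(corrected), 0.0, corrected)
        return np.sum(w * vals) / np.sum(w)

    print(estimate(0.25), f(0.25))                        # smoothed estimate vs. true boundary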