Displaying similar documents to “Adaptive estimation of the stationary density of discrete and continuous time mixing processes”

Model selection for (auto-)regression with dependent data

Yannick Baraud, F. Comte, G. Viennet (2001)

ESAIM: Probability and Statistics

Similarity:

In this paper, we study the problem of nonparametric estimation of an unknown regression function from dependent data with sub-Gaussian errors. As a particular case, we handle the autoregressive framework. For this purpose, we consider a collection of finite-dimensional linear spaces (e.g. linear spaces spanned by wavelets or piecewise polynomials on a possibly irregular grid), and we estimate the regression function by a least-squares estimator built on a data-driven selected linear...
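
The penalized least-squares strategy described above can be illustrated with a minimal Python sketch. The piecewise-constant spaces on regular partitions, the known noise variance, and the Mallows-type penalty are assumptions of this illustration, not details of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0.0, 1.0, n)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(n)  # noise sd 0.3

def piecewise_constant_fit(x, y, bins):
    """Least-squares fit on the space of piecewise-constant functions
    over a regular partition of [0, 1] into `bins` cells; returns the
    fitted values and the residual sum of squares."""
    edges = np.linspace(0.0, 1.0, bins + 1)
    idx = np.clip(np.digitize(x, edges) - 1, 0, bins - 1)
    means = np.array([y[idx == k].mean() if np.any(idx == k) else 0.0
                      for k in range(bins)])
    fitted = means[idx]
    return fitted, np.sum((y - fitted) ** 2)

# Data-driven model selection: minimize RSS plus a penalty proportional
# to the model dimension (Mallows-type; noise variance assumed known
# here for simplicity).
sigma2 = 0.09
best_dim = min(range(1, 33),
               key=lambda d: piecewise_constant_fit(x, y, d)[1] + 2 * sigma2 * d)
```

The selected dimension balances the fit, which always improves on richer nested spaces, against model complexity; on a smooth signal the criterion typically settles on an intermediate dimension.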

Exact adaptive pointwise estimation on Sobolev classes of densities

Cristina Butucea (2001)

ESAIM: Probability and Statistics

Similarity:

The subject of this paper is to estimate adaptively the common probability density of n independent, identically distributed random variables. The estimation is done at a fixed point x_0, over the density functions that belong to the Sobolev class W_n(β, L). We consider the adaptive problem setup, where the regularity parameter β is unknown and varies in a given set B_n. A sharp adaptive estimator is obtained, and the explicit asymptotic constant associated with its rate of convergence is found. ...
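
As a toy companion to this abstract, a Parzen-Rosenblatt kernel estimate of a density at a fixed point can be written in a few lines. The standard normal sample, the evaluation point, and the two fixed bandwidths are assumptions of the sketch; the paper's sharp adaptive bandwidth selection is not reproduced here:

```python
import numpy as np
from math import sqrt, pi

rng = np.random.default_rng(3)
n = 5000
sample = rng.standard_normal(n)   # true density N(0, 1)
x0 = 0.0                          # fixed estimation point

def fhat(h):
    """Parzen-Rosenblatt (Gaussian kernel) estimate of the density at x0."""
    return np.mean(np.exp(-0.5 * ((sample - x0) / h) ** 2)) / (h * sqrt(2 * pi))

# An adaptive procedure selects h from the data, because the optimal
# bandwidth depends on the unknown smoothness; here we only contrast a
# small (low bias) and a large (oversmoothing) bandwidth.
small_h_estimate = fhat(0.1)
large_h_estimate = fhat(1.5)
```

At x0 = 0 the true value is 1/sqrt(2π) ≈ 0.399; the large bandwidth oversmooths the peak and biases the estimate downward, which is the bias-variance trade-off that adaptive estimation resolves without knowing the regularity parameter.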

Diffusions with measurement errors. I. Local asymptotic normality

Arnaud Gloter, Jean Jacod (2001)

ESAIM: Probability and Statistics

Similarity:

We consider a diffusion process X which is observed at times i/n for i = 0, 1, ..., n, each observation being subject to a measurement error. All errors are independent and centered Gaussian with known variance ρ_n. There is an unknown parameter within the diffusion coefficient, to be estimated. In this first paper the case when X is indeed a Gaussian martingale is examined: we can prove that the LAN property holds under quite weak smoothness assumptions, with an explicit limiting Fisher information. What...

Extreme values and kernel estimates of point processes boundaries

Stéphane Girard, Pierre Jacob (2004)

ESAIM: Probability and Statistics

Similarity:

We present a method for estimating the edge of a two-dimensional bounded set, given a finite random set of points drawn from the interior. The estimator is based both on a Parzen-Rosenblatt kernel and extreme values of point processes. We give conditions for various kinds of convergence and asymptotic normality. We propose a method of reducing the negative bias and edge effects, illustrated by some simulations.
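
A minimal sketch of this strip-maxima-plus-kernel idea follows; the boundary curve, sample size, number of strips, and bandwidth are illustrative assumptions, and the paper's bias-reduction step is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)
g = lambda t: 0.5 + 0.4 * np.sin(np.pi * t)    # assumed true boundary
n = 2000
x = rng.uniform(0.0, 1.0, n)
y = rng.uniform(0.0, 1.0, n) * g(x)            # points drawn below the boundary

# Extreme-value step: the maximum observed height in each vertical strip.
bins = 25
edges = np.linspace(0.0, 1.0, bins + 1)
idx = np.clip(np.digitize(x, edges) - 1, 0, bins - 1)
maxima = np.array([y[idx == k].max() for k in range(bins)])
centers = 0.5 * (edges[:-1] + edges[1:])

# Kernel step: Parzen-Rosenblatt smoothing of the strip maxima.
def boundary_estimate(t, h=0.08):
    w = np.exp(-0.5 * ((t - centers) / h) ** 2)
    return float(np.sum(w * maxima) / np.sum(w))
```

The strip maxima sit slightly below the true edge, a negative bias that shrinks as the number of points per strip grows; this is the effect the bias-reduction method in the paper targets.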

Adaptive estimation of a quadratic functional of a density by model selection

Béatrice Laurent (2005)

ESAIM: Probability and Statistics

Similarity:

We consider the problem of estimating the integral of the square of a density f from the observation of an n-sample. Our method to estimate ∫ f²(x) dx is based on model selection via some penalized criterion. We prove that our estimator achieves the adaptive rates established by Efroimovich and Low on classes of smooth functions. A key point of the proof is an exponential inequality for U-statistics of order 2 due to Houdré and Reynaud.
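
The functional ∫ f²(x) dx can be estimated with a kernel-based U-statistic of order 2, the kind of statistic the Houdré-Reynaud inequality controls. The standard normal data, Gaussian kernel, and fixed bandwidth below are assumptions of this sketch, not the paper's penalized model-selection estimator:

```python
import numpy as np
from math import sqrt, pi

rng = np.random.default_rng(2)
n = 1000
x = rng.standard_normal(n)   # f = N(0, 1): target is 1/(2*sqrt(pi)) ~ 0.282

# Order-2 U-statistic: average K_h(X_i - X_j) over all pairs i != j.
# Its expectation tends to the integral of f squared as h -> 0.
h = 0.3
diffs = x[:, None] - x[None, :]
kernel = np.exp(-0.5 * (diffs / h) ** 2) / (h * sqrt(2 * pi))
np.fill_diagonal(kernel, 0.0)          # exclude the diagonal terms i == j
theta_hat = kernel.sum() / (n * (n - 1))
```

A fixed bandwidth leaves a small smoothing bias; selecting the smoothing level by a penalized criterion, as in the paper, removes the need to tune it by hand.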

Detecting abrupt changes in random fields

Antoine Chambaz (2002)

ESAIM: Probability and Statistics

Similarity:

This paper is devoted to the study of some asymptotic properties of an M-estimator in a framework of detection of abrupt changes in a random field's distribution. This class of problems includes e.g. recovery of sets. It involves various techniques, including the M-estimation method, concentration inequalities, maximal inequalities for dependent random variables and φ-mixing. Penalization of the criterion function when the size of the true model is unknown is performed. All the results apply...

Minimax and Bayes estimation in deconvolution problem

Mikhail Ermakov (2008)

ESAIM: Probability and Statistics

Similarity:

We consider a deconvolution problem of estimating a signal blurred with a random noise. The noise is assumed to be a stationary Gaussian process multiplied by a weight function depending on a small parameter. The underlying solution is assumed to be infinitely differentiable. For this model we find asymptotically minimax and Bayes estimators. In the case of solutions having a finite number of derivatives, similar results were obtained in [G.K. Golubev and R.Z. Khasminskii,...