Estimation of a simple linear regression model using ranked set sampling.
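The ranked set sampling scheme underlying this entry can be sketched in a few lines. This is a minimal illustration of the sampling mechanism itself (here used to estimate a mean), not the paper's regression estimator; the function name `ranked_set_sample` and all numerical choices are illustrative assumptions.

```python
import random

def ranked_set_sample(population, m, rng):
    """Draw one ranked-set sample of size m: for the i-th of m sets,
    draw m units at random, rank them, and measure the i-th smallest."""
    sample = []
    for i in range(m):
        s = sorted(rng.sample(population, m))
        sample.append(s[i])  # keep the i-th order statistic of set i
    return sample

rng = random.Random(0)
pop = [rng.gauss(10.0, 2.0) for _ in range(10_000)]  # toy population, mean 10
# 50 cycles of ranked-set samples of size m = 3
rss = [x for _ in range(50) for x in ranked_set_sample(pop, 3, rng)]
est = sum(rss) / len(rss)  # RSS estimate of the population mean
```

Because each cycle measures one unit per rank, the RSS mean is unbiased and typically less variable than a simple random sample of the same size.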
We consider the problem of estimating the mean of a Gaussian vector with independent components of common unknown variance. Our estimation procedure is based on estimator selection. More precisely, we start with an arbitrary and possibly infinite collection of estimators of the mean built from the observed vector and, with the same data, aim at selecting an estimator among the collection with the smallest Euclidean risk. No assumptions on the estimators are made and their dependencies with respect to the data may be unknown. We establish...
Gaussian semiparametric, or local Whittle, estimation of the memory parameter in standard long-memory processes was proposed by Robinson [18]. This technique has several advantages over the popular log-periodogram regression introduced by Geweke and Porter-Hudak [7]. In particular, under milder assumptions than those needed for the log-periodogram regression, it is asymptotically more efficient. We analyse the asymptotic behaviour of the Gaussian semiparametric estimate of the memory parameter in...
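The local Whittle estimator minimises Robinson's objective R(d) = log Ĝ(d) − (2d/m) Σ log λⱼ with Ĝ(d) = (1/m) Σ λⱼ^{2d} I(λⱼ), over the first m Fourier frequencies. A grid-search sketch (the grid range and bandwidth m are illustrative choices, not the paper's):

```python
import numpy as np

def local_whittle_d(x, m):
    """Gaussian semiparametric (local Whittle) estimate of the memory
    parameter d, by grid search over Robinson's objective R(d)."""
    n = len(x)
    lam = 2 * np.pi * np.arange(1, m + 1) / n        # Fourier frequencies
    fft = np.fft.fft(x - np.mean(x))
    I = (np.abs(fft[1:m + 1]) ** 2) / (2 * np.pi * n)  # periodogram

    def R(d):
        G = np.mean(lam ** (2 * d) * I)
        return np.log(G) - 2 * d * np.mean(np.log(lam))

    grid = np.linspace(-0.49, 0.99, 297)
    return float(grid[np.argmin([R(d) for d in grid])])

rng = np.random.default_rng(0)
d_hat = local_whittle_d(rng.standard_normal(2048), m=128)  # white noise: d near 0
```

For white noise the true memory parameter is 0, and the estimate concentrates there at rate m^{-1/2}.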
We consider the problem of hypothesis testing within a monotone regression model. We propose a new test of the hypothesis H0: "f = f0" against the composite alternative Ha: "f ≠ f0" under the assumption that the true regression function f is decreasing. The test statistic is based on the distance between the isotonic estimator of f and the function f0, since it is known that a properly centered and normalized version of this distance is asymptotically standard normally distributed under H0. We study the asymptotic...
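The raw ingredient of the test above can be sketched directly: fit a decreasing step function by the pool-adjacent-violators algorithm and measure its distance to f0 at the design points. This is a bare illustration of the uncentred, unnormalised statistic; the choice of the L1 distance here is an assumption, and the paper's actual test uses a properly centered and scaled version.

```python
import numpy as np

def pava_increasing(y):
    """Pool-adjacent-violators: least-squares increasing fit to y."""
    blocks = []  # list of [value, weight]
    for v in map(float, y):
        blocks.append([v, 1.0])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            v2, w2 = blocks.pop()
            v1, w1 = blocks[-1]
            blocks[-1] = [(v1 * w1 + v2 * w2) / (w1 + w2), w1 + w2]
    return np.array([v for v, w in blocks for _ in range(int(w))])

def monotone_test_stat(y, f0_vals):
    """Mean absolute distance between the decreasing isotonic fit and f0."""
    fhat = -pava_increasing(-np.asarray(y, dtype=float))  # decreasing fit
    return float(np.mean(np.abs(fhat - f0_vals)))

x = np.linspace(0, 1, 200)
f0 = 1.0 - x                                # hypothesised decreasing function
rng = np.random.default_rng(1)
t_null = monotone_test_stat(f0 + 0.1 * rng.standard_normal(200), f0)  # H0 true
t_alt = monotone_test_stat(0.5 + 0.1 * rng.standard_normal(200), f0)  # f constant
```

Under the alternative the statistic stays bounded away from zero, while under the null it shrinks with the sample size.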
We consider the problem of estimating the conditional mean of a real Gaussian variable Y = θ1X1 + ⋯ + θpXp + ε, where the vector of covariates (Xi)1≤i≤p follows a joint Gaussian distribution. This issue often occurs when one aims at estimating the graph or the distribution of a Gaussian graphical model. We introduce a general model selection procedure based on the minimization of a penalized least-squares-type criterion. It handles a variety of problems, such as ordered and complete variable selection,...
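Ordered variable selection, one of the problems mentioned above, can be sketched as follows: fit least squares on the first k covariates for each k and pick the k minimising a penalised criterion. The AIC-style penalty below is a simple stand-in, not the paper's calibrated penalty, and `ordered_selection` is an illustrative name.

```python
import numpy as np

def ordered_selection(X, y, c=2.0):
    """Ordered variable selection: for each k, least squares on the first
    k columns of X; return the k minimising n*log(RSS/n) + c*k."""
    n, p = X.shape
    best_k, best_crit = 0, n * np.log(np.mean(y ** 2))  # k = 0: empty model
    for k in range(1, p + 1):
        beta, *_ = np.linalg.lstsq(X[:, :k], y, rcond=None)
        rss = float(np.sum((y - X[:, :k] @ beta) ** 2))
        crit = n * np.log(rss / n) + c * k
        if crit < best_crit:
            best_k, best_crit = k, crit
    return best_k

rng = np.random.default_rng(2)
X = rng.standard_normal((300, 10))
y = X[:, :3] @ np.array([1.5, -1.0, 0.8]) + rng.standard_normal(300)  # 3 active
k_hat = ordered_selection(X, y)
```

With strong signals on the first three coordinates, the selected dimension recovers at least those.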
We deal with the problem of choosing a piecewise constant estimator of a regression function s. We consider a non-Gaussian regression framework with deterministic design points, and we adopt the non-asymptotic approach to model selection via penalization developed by Birgé and Massart. Given a collection of partitions of the design space, with possibly exponential complexity, and the corresponding collection of piecewise constant estimators, we propose a penalized least-squares criterion which...
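For a small collection of regular partitions, the penalised criterion can be sketched concretely: fit cell means on each partition and add a penalty of Birgé-Massart type, proportional to D(1 + log(n/D))/n for a partition with D cells. The constant c and the noise level passed in are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

def select_partition(y, sigma2, c=2.0):
    """Penalised least squares over regular partitions of the design into
    D cells: piecewise-constant fit by cell means, Birgé–Massart-type penalty."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    best_D, best_crit, best_fit = 1, np.inf, None
    for D in (1, 2, 4, 8, 16, 32):
        cells = np.array_split(y, D)
        fit = np.concatenate([np.full(len(cc), cc.mean()) for cc in cells])
        crit = np.mean((y - fit) ** 2) + c * sigma2 * D * (1 + np.log(n / D)) / n
        if crit < best_crit:
            best_D, best_crit, best_fit = D, crit, fit
    return best_D, best_fit

rng = np.random.default_rng(3)
x = np.linspace(0, 1, 256)
signal = np.where(x < 0.5, 0.0, 2.0)        # one jump at 1/2
y = signal + 0.3 * rng.standard_normal(256)
D_hat, fit = select_partition(y, sigma2=0.09)
```

The criterion trades off fit against partition size, and here recovers the two-cell partition matching the single jump.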
This paper presents a new algorithm for regression estimation, in both the inductive and transductive settings. The estimator is defined as a linear combination of functions in a given dictionary. The coefficients of the combination are computed sequentially using projections onto some simple sets. These sets are defined as confidence regions provided by a deviation (PAC) inequality on an estimator in one-dimensional models. We prove that every projection step of the algorithm actually improves the performance...
We derive the two-sample Kolmogorov-Smirnov type test when a nuisance linear regression is present. The test is based on regression rank scores and provides a natural extension of the classical Kolmogorov-Smirnov test. Its asymptotic distributions under the hypothesis and the local alternatives coincide with those of the classical test.
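The classical two-sample Kolmogorov-Smirnov statistic that this test extends is the supremum distance between the two empirical distribution functions; a self-contained sketch (the regression-rank-score extension itself is not reproduced here):

```python
import numpy as np

def ks_two_sample(a, b):
    """Classical two-sample Kolmogorov–Smirnov statistic:
    sup over pooled points of |F_a(x) - F_b(x)|."""
    a, b = np.sort(a), np.sort(b)
    pooled = np.concatenate([a, b])
    Fa = np.searchsorted(a, pooled, side="right") / len(a)  # empirical CDF of a
    Fb = np.searchsorted(b, pooled, side="right") / len(b)  # empirical CDF of b
    return float(np.max(np.abs(Fa - Fb)))

rng = np.random.default_rng(4)
stat_same = ks_two_sample(rng.standard_normal(500), rng.standard_normal(500))
stat_diff = ks_two_sample(rng.standard_normal(500),
                          rng.standard_normal(500) + 1.0)  # location shift
```

Under equal distributions the statistic is of order n^{-1/2}, while a unit location shift keeps it bounded away from zero.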
The problem of nonparametric function fitting using the complete orthogonal system of trigonometric functions is considered, for an observation model in which noisy values of f are observed at n equidistant points, the errors being uncorrelated random variables with zero mean and finite variance. Conditions for convergence of the mean-square prediction error, the integrated mean-square error and the pointwise mean-square error of the estimator for f ∈ C[0,2π] and...
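A truncated trigonometric-series fit at equidistant points can be sketched as a least-squares regression on the first 2K+1 basis functions; the truncation level K and noise level below are illustrative choices.

```python
import numpy as np

def trig_series_fit(y, K):
    """Least-squares fit of a0 + sum_{k=1..K} (ak cos kx + bk sin kx)
    at n equidistant points on [0, 2*pi)."""
    n = len(y)
    x = 2 * np.pi * np.arange(n) / n
    cols = [np.ones(n)]
    for k in range(1, K + 1):
        cols += [np.cos(k * x), np.sin(k * x)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coef

rng = np.random.default_rng(5)
n = 400
x = 2 * np.pi * np.arange(n) / n
f = np.sin(x) + 0.5 * np.cos(3 * x)          # true function, 3 active terms
fit = trig_series_fit(f + 0.2 * rng.standard_normal(n), K=5)
mse = float(np.mean((fit - f) ** 2))
```

Since f lies in the span of the first few basis functions, the integrated error is of order σ²(2K+1)/n.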
In this paper, we investigate the problem of estimating the conditional cumulative distribution function of a scalar response variable given a random variable taking values in a semi-metric space. The uniform almost-complete consistency of this estimate is established under some conditions. Moreover, as an application, we use the obtained results to derive some asymptotic properties of the local linear estimator of the conditional quantile.
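For a scalar covariate, the simplest kernel estimate of the conditional CDF is a Nadaraya-Watson-type weighted proportion of responses below the level of interest. This is a plain kernel sketch, not the paper's local linear estimator on a semi-metric space; the Gaussian kernel and bandwidth are assumptions.

```python
import numpy as np

def cond_cdf(x0, y0, X, Y, h):
    """Kernel estimate of F(y0 | x0): weighted proportion of Y_i <= y0,
    with Gaussian kernel weights in the covariate."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)
    return float(np.sum(w * (Y <= y0)) / np.sum(w))

rng = np.random.default_rng(6)
X = rng.uniform(-1, 1, 5000)
Y = X + 0.1 * rng.standard_normal(5000)
p = cond_cdf(0.0, 0.0, X, Y, h=0.1)   # true F(0 | 0) = 0.5
```

Inverting this estimate in y0 gives a plug-in estimate of the conditional quantile, the application mentioned in the abstract.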
We study the estimation of the mean function of a continuous-time stochastic process and its derivatives. The covariance function of the process is assumed to be nonparametric and to satisfy mild smoothness conditions. Assuming that n independent realizations of the process are observed at a sampling design of size N generated by a positive density, we derive the asymptotic bias and variance of the local polynomial estimator as n and N increase to infinity. We deduce optimal sampling densities, optimal...
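A local linear estimate of the mean function from pooled observations of the n curves can be sketched as weighted least squares on (1, t − t0). The Gaussian kernel, bandwidth, and simulated design are illustrative assumptions.

```python
import numpy as np

def local_linear(t0, T, Z, h):
    """Local linear estimate of the mean function at t0: weighted least
    squares of Z on (1, T - t0) with Gaussian kernel weights."""
    w = np.exp(-0.5 * ((T - t0) / h) ** 2)
    A = np.column_stack([np.ones(len(T)), T - t0])
    Aw = A * w[:, None]
    beta = np.linalg.solve(A.T @ Aw, A.T @ (w * Z))
    return float(beta[0])  # intercept = fitted value at t0

rng = np.random.default_rng(7)
T = np.tile(np.linspace(0, 1, 100), 50)          # n = 50 curves, N = 100 points
Z = T ** 2 + 0.2 * rng.standard_normal(T.size)   # noisy draws around mean t^2
m_hat = local_linear(0.5, T, Z, h=0.05)          # true mean at 0.5 is 0.25
```

Fitting the slope as well removes the first-order design bias, which is why the local polynomial (rather than a plain kernel average) is the natural estimator here.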
Let Y ∈ ℝn be a random vector with mean s and covariance matrix σ²Pn ᵗPn (with ᵗPn the transpose of Pn), where Pn is some known n × n matrix. We construct a statistical procedure to estimate s as well as σ² under a moment condition on Y or a Gaussian hypothesis. Both cases are developed for known or unknown σ². Our approach is free from any prior assumption on s and is based on non-asymptotic model selection methods. Given some collection of linear spaces {Sm, m ∈ ℳ}, we consider, for any m ∈ ℳ, the least-squares estimator ŝm of s in Sm....
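The building block ŝm is simply the orthogonal projection of Y onto the linear space Sm, and with unknown σ² the residuals provide a variance estimate. A minimal sketch for a single two-dimensional space (the specific model and the residual variance estimator here are illustrative, not the paper's selection procedure):

```python
import numpy as np

def project_ls(Y, S):
    """Least-squares estimator of s in the span of the columns of S:
    the orthogonal projection of Y onto that linear space."""
    coef, *_ = np.linalg.lstsq(S, Y, rcond=None)
    return S @ coef

rng = np.random.default_rng(8)
n = 100
t = np.arange(n) / n
S = np.column_stack([np.ones(n), t])      # a 2-dimensional model S_m
s = 1.0 + 2.0 * t                         # true mean lies in S_m
Y = s + rng.standard_normal(n)            # sigma^2 = 1, identity covariance
s_hat = project_ls(Y, S)
sigma2_hat = float(np.sum((Y - s_hat) ** 2) / (n - 2))  # residual estimate
mse = float(np.mean((s_hat - s) ** 2))
```

When s lies in Sm, the projection's risk is σ² dim(Sm)/n and the residual variance estimate is unbiased for σ².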
In this paper, we study the problem of nonparametric estimation of an unknown regression function from dependent data with sub-Gaussian errors. As a particular case, we handle the autoregressive framework. For this purpose, we consider a collection of finite-dimensional linear spaces (e.g. linear spaces spanned by wavelets or piecewise polynomials on a possibly irregular grid) and we estimate the regression function by a least-squares estimator built on a data-driven selected linear space among...
We consider the problem of estimating an unknown regression function when the design is random. Our estimation procedure is based on model selection and does not rely on any prior information on the target function. We start with a collection of linear functional spaces and build, on a data-selected space among this collection, the least-squares estimator. We study the performance of an estimator which is obtained by modifying this least-squares estimator on a set of small probability....