Displaying 61 – 80 of 119
A linearization of the nonlinear regression model causes a bias in estimators of the model parameters. This bias can be eliminated, e.g., either by a proper choice of the point at which the model is developed into its Taylor series, or by quadratic corrections of the linear estimators. The aim of the paper is to obtain formulae for the biases and variances of estimators in linearized models, and also for the corrected estimators.
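The linearization bias described above can be sketched numerically. Everything below is an illustrative assumption, not taken from the paper: a hypothetical exponential model, arbitrary expansion points, and a single first-order (Gauss–Newton-type) update; expanding near the true parameter value yields a nearly unbiased estimate, while expanding far away leaves a visible bias.

```python
import numpy as np

# Hedged sketch (model and numbers are hypothetical): linearize the
# nonlinear regression y = f(x, theta) + eps around a chosen point theta0
# via a first-order Taylor expansion, then solve the resulting linear
# least-squares problem for the parameter increment.
rng = np.random.default_rng(0)

def f(x, theta):          # nonlinear mean function (illustrative choice)
    return np.exp(theta * x)

def df_dtheta(x, theta):  # derivative of f with respect to the parameter
    return x * np.exp(theta * x)

theta_true = 0.5
x = np.linspace(0.0, 1.0, 200)
y = f(x, theta_true) + rng.normal(scale=0.05, size=x.size)

def linearized_estimate(theta0):
    # Taylor expansion: f(x, theta) ~ f(x, theta0)
    #                     + df/dtheta(x, theta0) * (theta - theta0)
    J = df_dtheta(x, theta0)
    delta = np.sum(J * (y - f(x, theta0))) / np.sum(J * J)  # 1-D least squares
    return theta0 + delta

near = linearized_estimate(0.45)  # expansion point close to theta_true
far = linearized_estimate(0.0)    # expansion point far from theta_true
print(near, far)
```

With the expansion point at 0.45 the estimate lands close to 0.5; with the expansion point at 0.0 the one-step linearized estimator overshoots, which is precisely the bias that a better expansion point or a quadratic correction is meant to remove.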
The longitudinal regression model, where is the th measurement of the th subject at a random time , is the regression function, is a predictable covariate process observed at time , and is a noise term, is studied in a marked point process framework. In this paper we introduce assumptions which guarantee the consistency and asymptotic normality of the smooth -estimator of the unknown parameter .
In this paper, we study the problem of nonparametric estimation of an unknown regression function from dependent data with sub-Gaussian errors. As a particular case, we handle the autoregressive framework. For this purpose, we consider a collection of finite-dimensional linear spaces (e.g. linear spaces spanned by wavelets or by piecewise polynomials on a possibly irregular grid), and we estimate the regression function by a least-squares estimator built on a data-driven selected linear space among...
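The selection step in the abstract above can be sketched in miniature. This is only an illustration under assumed choices (polynomial spaces rather than wavelets, i.i.d. rather than dependent data, a Mallows-C_p-type penalty with known noise level); none of these specifics come from the paper:

```python
import numpy as np

# Hedged sketch: fit least-squares estimators on a collection of
# finite-dimensional linear spaces (here, polynomial spaces of growing
# dimension) and select the space with a penalized data-driven criterion.
rng = np.random.default_rng(1)
n = 300
x = np.sort(rng.uniform(0.0, 1.0, n))
sigma = 0.2                                   # noise level, assumed known here
y = np.sin(2 * np.pi * x) + rng.normal(scale=sigma, size=n)

def ls_fit(dim):
    # Least-squares projection of y onto polynomials of degree < dim.
    X = np.vander(x, dim, increasing=True)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ coef

best_dim, best_crit = None, np.inf
for dim in range(1, 15):
    rss = np.mean((y - ls_fit(dim)) ** 2)
    crit = rss + 2 * sigma**2 * dim / n       # C_p-style complexity penalty
    if crit < best_crit:
        best_dim, best_crit = dim, crit
print("selected dimension:", best_dim)
```

The penalty trades fit against dimension: too-small spaces leave a large residual sum of squares, while too-large ones pay the complexity term, so the criterion picks an intermediate model.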
We consider the problem of estimating an unknown regression function when the design is random with values in . Our estimation procedure is based on model selection and does not rely on any prior information about the target function. We start with a collection of linear functional spaces and build the least-squares estimator on a space selected from this collection using the data. We study the performance of an estimator obtained by modifying this least-squares estimator on a set of small probability....
In the regression model with errors in variables, we observe n i.i.d. copies of (Y, Z) satisfying Y = fθ0(X) + ξ and Z = X + ɛ, involving independent and unobserved random variables X, ξ, ɛ, plus a regression function fθ0 known up to a finite-dimensional parameter θ0. The common densities of the Xi’s and of the ξi’s are unknown, whereas the distribution of ɛ is completely known. We aim at estimating the parameter θ0 by using the observations (Y1, Z1), …, (Yn, Zn). We propose an estimation procedure based on the least...
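The model above can be simulated to show why the measurement error ɛ matters. This sketch uses assumed specifics that are not the paper's: a linear fθ0(x) = θ0·x, Gaussian laws throughout, and the classical moment correction for attenuation rather than the paper's least-squares-based procedure:

```python
import numpy as np

# Hedged illustration of Y = f_theta0(X) + xi, Z = X + eps: regressing Y on
# the noisy covariate Z attenuates a linear slope, while a correction using
# the (known) variance of eps recovers it.
rng = np.random.default_rng(2)
n = 20000
theta0 = 2.0
X = rng.normal(size=n)                # unobserved covariate
xi = rng.normal(scale=0.5, size=n)    # regression noise
eps = rng.normal(scale=1.0, size=n)   # measurement error, law assumed known
Y = theta0 * X + xi
Z = X + eps                           # what is actually observed

# Naive slope from (Z, Y): attenuated toward zero by the error in Z.
naive = np.sum(Z * Y) / np.sum(Z * Z)
# Classical correction: subtract n * Var(eps) from the denominator.
corrected = np.sum(Z * Y) / (np.sum(Z * Z) - n * 1.0)
print(naive, corrected)
```

Here Var(X) = Var(ɛ) = 1, so the naive slope concentrates near θ0/2 = 1 while the corrected one concentrates near θ0 = 2, illustrating why knowledge of the distribution of ɛ is the key assumption.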
An approximate value of a parameter in a nonlinear regression model is known in many cases. In such a situation a linearization of the model is possible; however, it is important to recognize whether the difference between the actual value of the parameter and the approximate value causes significant changes, e.g., in the bias of the estimator, in its variance, etc. Some rules suitable for the solution of this problem are given in the paper.
The properties of two recursive estimators of the Fourier coefficients of a regression function with respect to a complete orthonormal system of bounded functions (e_k), k=1,2,..., are considered in the case of the observation model , i=1,...,n, where are independent random variables with zero mean and finite variance, and , i=1,...,n, form a random sample from a distribution with density ϱ = 1/(b-a) (the uniform distribution) and are independent of the errors , i=1,...,n. Unbiasedness and mean-square...
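The recursive idea in the abstract above can be sketched under assumed specifics (a cosine system on [0, 1], a hypothetical regression function, and a plain running-mean update; the paper's two estimators may differ): with a uniform design of density 1/(b−a), the average of y_i·e_k(x_i) is an unbiased estimate of the k-th Fourier coefficient and can be updated one observation at a time.

```python
import math

import numpy as np

def e(k, x):
    # Orthonormal cosine system on [0, 1] (illustrative choice of (e_k)).
    return 1.0 if k == 0 else math.sqrt(2) * math.cos(k * math.pi * x)

def R(x):
    # Hypothetical regression function with coefficients c_0 = 1, c_1 = 1.
    return 1.0 + math.sqrt(2) * math.cos(math.pi * x)

rng = np.random.default_rng(3)
n = 20000
c_hat = [0.0, 0.0, 0.0]
for i in range(1, n + 1):
    x = rng.uniform(0.0, 1.0)          # uniform design, density 1/(b-a), a=0, b=1
    y = R(x) + rng.normal(0.0, 0.3)    # noisy observation
    for k in range(3):
        # Recursive (running-mean) update of the k-th coefficient estimate.
        c_hat[k] += (y * e(k, x) - c_hat[k]) / i
print(c_hat)
```

Each update costs O(1) memory per coefficient, which is the practical appeal of the recursive form over refitting from scratch.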
Formulae for new three- and four-dimensional parameter-effects arrays corresponding to transformations of parameters in nonlinear regression models are given. These formulae make the construction of confidence regions for the parameters easier. An example is presented which shows that some care is necessary when a new array is computed.