Locally best linear-quadratic estimators
This paper presents a semi-global mathematical model for analyzing the signal of amperometric biosensors. Artificial neural networks were applied to the analysis of the biosensor response to multi-component mixtures. A large amount of learning and test data was synthesized by computer simulation of the biosensor response. The biosensor signal was analyzed with respect to the concentration of each component of the mixture. The paradigm of locally weighted linear regression was used for retraining...
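The abstract does not detail the retraining step, but the core of locally weighted linear regression can be sketched as follows; the Gaussian kernel, the bandwidth `tau`, and the toy signal are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def locally_weighted_fit(x_query, X, y, tau=0.5):
    """Predict y at x_query by locally weighted linear regression.

    Each training point receives a Gaussian kernel weight based on its
    distance to the query point; a weighted least-squares line is then
    fit and evaluated at the query.
    """
    A = np.column_stack([np.ones_like(X), X])            # design matrix with intercept
    w = np.exp(-((X - x_query) ** 2) / (2.0 * tau ** 2)) # kernel weights
    W = np.diag(w)
    # Weighted normal equations: (A' W A) beta = A' W y
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return beta[0] + beta[1] * x_query

# Toy data: a noisy smooth response curve (illustrative only).
rng = np.random.default_rng(0)
X = np.linspace(0, 3, 50)
y = np.sin(X) + 0.05 * rng.standard_normal(50)
print(round(locally_weighted_fit(1.5, X, y, tau=0.3), 2))
```

Because a fresh line is fit around every query point, the method adapts to local curvature without committing to one global functional form, which is what makes it attractive for retraining on simulated response data.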
The longitudinal regression model, in which each measurement of a subject at a random observation time is given by a regression function of a predictable covariate process observed at that time, plus noise, is studied in a marked point process framework. In this paper we introduce assumptions which guarantee the consistency and asymptotic normality of a smooth M-estimator of the unknown parameter.
Real-valued M-estimators in a statistical model are replaced by vector-valued M-estimators in a new model whose observations depend on regressors, a structural parameter, and a structural function of the new model. Sufficient conditions for the consistency of the new estimator are derived, motivated by the sufficiency conditions for the simpler “parent estimator”. The result is a general method of consistent estimation in a class of nonlinear (pseudolinear) statistical problems. If...
Two estimates of the regression coefficient in a bivariate normal distribution are considered: the usual one based on a paired sample, and a new one that also makes use of additional observations of one of the variables. They are compared with respect to variance. The same is done for two regression lines. The conclusion is that the additional observations are worth using only when the sample is very small.
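The paper's estimators are not reproduced here, but the comparison can be illustrated by a Monte Carlo sketch; the specific variant below (re-estimating the variance of x from the pooled x sample) and all parameter values are assumptions for illustration only:

```python
import numpy as np

# Compare the usual slope estimator from n paired observations with a
# hypothetical variant that also uses m extra observations of x alone.
rng = np.random.default_rng(1)
rho, n, m, reps = 0.6, 10, 40, 10000
beta = rho  # with unit variances, the regression coefficient equals rho

b_usual, b_extra = [], []
for _ in range(reps):
    z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
    x, y = z[:, 0], z[:, 1]
    x_extra = rng.standard_normal(m)        # extra x-only observations
    sxy = np.sum((x - x.mean()) * (y - y.mean()))
    sxx = np.sum((x - x.mean()) ** 2)
    b_usual.append(sxy / sxx)
    # Variant: replace the denominator by a pooled estimate of var(x),
    # rescaled to the n-point sum-of-squares scale.
    x_all = np.concatenate([x, x_extra])
    sxx_all = np.sum((x_all - x_all.mean()) ** 2) * n / (n + m)
    b_extra.append(sxy / sxx_all)

print(np.var(b_usual), np.var(b_extra))
```

Rerunning the comparison over a grid of sample sizes n is a simple way to see in which regimes, if any, the extra observations pay off in variance.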
Matrix mathematics provides a powerful tool set for addressing statistical problems; in particular, the theory of matrix ranks and inertias has been developed into an effective methodology for simplifying complicated matrix expressions and for establishing equalities and inequalities that occur in statistical analysis. This paper describes how to establish exact formulas for calculating the ranks and inertias of covariances of predictors and estimators of parameter spaces in general linear models (GLMs),...
The Least-Squares Solution (LSS) of a linear matrix equation and the Ordinary Least-Squares Estimator (OLSE) of unknown parameters in a general linear model are two standard algebraic methods in computational mathematics and regression analysis. Assume that a symmetric quadratic matrix-valued function Φ(Z) = Q − ZPZ′ is given, where Z is taken as the LSS of the linear matrix equation AZ = B. In this paper, we establish a group of formulas for calculating the maximum and minimum ranks and inertias of Φ(Z)...
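The closed-form rank and inertia formulas are the paper's contribution; numerically, the inertia of a given symmetric matrix such as Φ(Z) can always be checked by counting eigenvalue signs, as in this sketch (the matrices Q, P, Z below are made-up examples, not from the paper):

```python
import numpy as np

def inertia(S, tol=1e-10):
    """Return (n_plus, n_minus, n_zero): the counts of positive, negative,
    and zero eigenvalues of the symmetric matrix S.  The rank is
    n_plus + n_minus; by Sylvester's law of inertia these counts are
    invariant under congruence transformations."""
    vals = np.linalg.eigvalsh(S)
    n_plus = int(np.sum(vals > tol))
    n_minus = int(np.sum(vals < -tol))
    return n_plus, n_minus, len(vals) - n_plus - n_minus

# Illustrative Phi = Q - Z P Z' (assumed example data):
Q = np.diag([3.0, 1.0, -2.0])
P = np.eye(2)
Z = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
Phi = Q - Z @ P @ Z.T          # equals diag(2, 0, -2)
print(inertia(Phi))            # → (1, 1, 1)
```

Such a numerical check is a convenient sanity test when applying analytic rank/inertia formulas to concrete Q, P, A, B.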
In many cases the regression parameters can be considered as realizations of a random variable. In these situations the minimum mean square error estimator is useful and important. The explicit form of this estimator is given for the case in which both the covariance matrix of the random parameters and that of the error vector are singular.
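A sketch of one common form of such an estimator (not necessarily the paper's explicit form): for y = Xβ + e with a zero-mean random β, the best linear predictor Σ_β X′(XΣ_βX′ + Σ_e)⁺y remains well defined under singular covariances when the Moore–Penrose pseudoinverse is used. The matrices below are illustrative assumptions:

```python
import numpy as np

def mmse_estimate(X, y, Sigma_beta, Sigma_e):
    """Best linear (minimum mean square error) predictor of a zero-mean
    random coefficient vector beta in y = X beta + e.  The Moore-Penrose
    pseudoinverse keeps the formula valid even when Sigma_beta or
    Sigma_e is singular."""
    S = X @ Sigma_beta @ X.T + Sigma_e
    return Sigma_beta @ X.T @ np.linalg.pinv(S) @ y

# Illustrative example with a singular (rank-1) prior covariance:
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
Sigma_beta = np.array([[1.0, 1.0], [1.0, 1.0]])   # singular prior covariance
Sigma_e = 0.1 * np.eye(3)
rng = np.random.default_rng(2)
beta = np.array([0.5, 0.5])   # lies in the column space of Sigma_beta
y = X @ beta + 0.01 * rng.standard_normal(3)
print(mmse_estimate(X, y, Sigma_beta, Sigma_e))
```

Note how the singular prior confines the estimate to the column space of Σ_β: both components of the estimate come out equal, mirroring the rank-1 prior.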
The Minimum Norm Quadratic Unbiased Invariant Estimator of an estimable linear function of the unknown variance-covariance component parameter θ in a linear model with given linear restrictions of the type Rθ = c is derived for two special structures: the replicated model and the growth-curve model.
In this paper, we study the problem of nonparametric estimation of an unknown regression function from dependent data with sub-Gaussian errors. As a particular case, we handle the autoregressive framework. For this purpose, we consider a collection of finite-dimensional linear spaces (e.g. linear spaces spanned by wavelets or piecewise polynomials on a possibly irregular grid) and we estimate the regression function by a least-squares estimator built on a data-driven selected linear space among...
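The paper's selection criterion is not reproduced here; the following sketch uses nested piecewise-constant spaces on dyadic partitions of [0, 1] and a Mallows-type penalty with an assumed known noise level, purely to illustrate least-squares estimation over a data-driven selected linear space:

```python
import numpy as np

def select_regressogram(x, y, max_log_dim=6, sigma2=0.04):
    """Least-squares fits on nested piecewise-constant spaces over [0, 1],
    with the dimension D chosen by a Mallows-type penalized criterion
    crit(D) = ||y - yhat_D||^2 / n + 2 * sigma2 * D / n.
    (The penalty form and the noise level sigma2 are illustrative
    assumptions, not the paper's criterion.)"""
    n = len(y)
    best = None
    for k in range(max_log_dim + 1):
        D = 2 ** k                                  # cells in the partition
        bins = np.minimum((x * D).astype(int), D - 1)
        yhat = np.zeros(n)
        for b in range(D):
            mask = bins == b
            if mask.any():
                yhat[mask] = y[mask].mean()         # projection on the space
        crit = np.mean((y - yhat) ** 2) + 2 * sigma2 * D / n
        if best is None or crit < best[0]:
            best = (crit, D, yhat)
    return best[1], best[2]

rng = np.random.default_rng(3)
x = rng.uniform(0, 1, 400)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(400)
D, yhat = select_regressogram(x, y)
print(D)
```

The penalized criterion trades the shrinking approximation error of finer partitions against their growing stochastic error, so the selected dimension adapts to the smoothness of the unknown function.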