Estimation and prediction in regression models with random explanatory variables

Nguyen Bac-Van

  • Publisher: Instytut Matematyczny Polskiej Akademii Nauk (Warszawa), 1992

Abstract

The regression model $X(t), Y(t);\ t = 1,\ldots,n$ with random explanatory variable $X$ is transformed by prescribing a partition $S_1,\ldots,S_k$ of the given domain $S$ of $X$-values and specifying

$\{X(1),\ldots,X(n)\} \cap S_i = \{X_{i1},\ldots,X_{i\alpha(i)}\}, \quad i = 1,\ldots,k.$

Through the conditioning

$\{\alpha(i) = a(i),\ i = 1,\ldots,k\}, \quad \{X_{i1},\ldots,X_{i\alpha(i)};\ i = 1,\ldots,k\} = \{x_{11},\ldots,x_{ka(k)}\},$

the initial model with i.i.d. pairs $(X(t),Y(t))$, $t = 1,\ldots,n$, becomes a conditional fixed-design $(x_{11},\ldots,x_{ka(k)})$ model $\{Y_{ij},\ i = 1,\ldots,k;\ j = 1,\ldots,a(i)\}$, where the response variables $Y_{ij}$ are independent and distributed according to the mixed conditional distribution $Q(\cdot, x_{ij})$ of $Y$ given $X$ at the observed value $x_{ij}$. Afterwards, we investigate the case

$(Q)E(Y'|x) = \sum_{i=1}^{k} b_i(x)\theta_i I_{S_i}(x), \qquad (Q)D(Y|x) = \sum_{i=1}^{k} d_i(x)\Sigma_i I_{S_i}(x),$

which arises when the conditional distribution of $Y$ given $X$ changes as $X$ passes from one domain $S_i$ to another, so that $Y$ follows a mixture of distributions. The general transformation then yields an equivalent reduction to a conditional multivariate Behrens-Fisher model. We construct conditional generalized least squares (CGLS) estimators of $\theta' = (\theta_1' \,\vdots\, \cdots \,\vdots\, \theta_k')$ and predictors of $Y(n+1)$ given $X(n+1) = x \in S$. Under a condition imposed on the range of $\theta$, the CGLS estimator and predictor are shown to enjoy local and global optimality.

CONTENTS

Preface 5
I. A data transformation preserving the conditional distribution and localizing the explanatory variable 6
1. Introduction 6
2. Theorems on data transformation 7
3. Proofs of the theorems 9
4. Interpretation of the theorems 14
II. Conditional linear models and estimation of regression parameters 17
5. Introduction 17
6. Conditional generalized least squares estimators (CGLSE) 19
7. Conditional estimability 25
8. Properties of the CGLSE 29
III. Prediction of the response variable 34
9. Introduction 35
10. Predictors connected with the CGLSE 35
11. Properties of CGLS predictors 38
References 43

1991 Mathematics Subject Classification: Primary 62J02; Secondary 62F11.
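As a concrete illustration of the transformation described in the abstract, the following Python sketch (not from the monograph; the helper names, the shared regressor basis across cells, and the use of a plain least squares fit in place of the generalized one are simplifying assumptions) groups i.i.d. observations of (X, Y) by the cells of a prescribed partition of the X-domain and, conditionally on the observed design within each cell, fits the cell-wise regression parameters.

import numpy as np

def cell_of(x, breakpoints):
    # Cell index of each X-value for a partition S_1,...,S_k of the X-domain
    # defined by interior breakpoints (hypothetical helper, for illustration).
    return np.digitize(x, breakpoints)

def cellwise_ls(x, y, breakpoints, basis):
    # Conditionally on the observed design (cell counts and X-values within
    # each cell), fit theta_i in each cell S_i by least squares.
    # With known per-cell covariances Sigma_i one would instead weight by
    # Sigma_i^{-1} to obtain a conditional generalized least squares estimate.
    cells = cell_of(x, breakpoints)
    theta = {}
    for i in np.unique(cells):
        xi, yi = x[cells == i], y[cells == i]
        B = np.vstack([basis(v) for v in xi])   # fixed design within S_i
        theta[i], *_ = np.linalg.lstsq(B, yi, rcond=None)
    return theta

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 3.0, 200)                  # random explanatory variable
y = np.where(x < 1.5, 1.0 + 2.0 * x,            # regression law changes
             4.0 - 1.0 * x) + rng.normal(0.0, 0.3, 200)   # between cells
est = cellwise_ls(x, y, breakpoints=[1.5], basis=lambda v: np.array([1.0, v]))
print(est)  # per-cell estimates of (intercept, slope)

When the per-cell covariances Sigma_i are unknown and unequal, the least squares step would be replaced by a weighted fit based on estimated covariances, which is roughly the conditional multivariate Behrens-Fisher situation the abstract refers to.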

How to cite


Nguyen Bac-Van. Estimation and prediction in regression models with random explanatory variables. Warszawa: Instytut Matematyczny Polskiej Akademii Nauk, 1992. <http://eudml.org/doc/219330>.

@book{NguyenBac1992,
abstract = {The regression model X(t), Y(t); t=1,...,n with random explanatory variable X is transformed by prescribing a partition $S_\{1\},...,S_\{k\}$ of the given domain S of X-values and specifying $\{X(1),...,X(n)\} ∩ S_\{i\} = \{X_\{i1\},...,X_\{iα(i)\}\}, i=1,...,k.$ Through the conditioning $\{α(i)=a(i), i=1,...,k\}, \{X_\{i1\},...,X_\{iα(i)\}; i=1,...,k\} = \{x_\{11\},...,x_\{ka(k)\}\}$ the initial model with i.i.d. pairs (X(t),Y(t)), t=1,...,n, becomes a conditional fixed-design $(x_\{11\},...,x_\{ka(k)\})$ model $\{Y_\{ij\}, i=1,...,k; j=1,...,a(i)\}$ where the response variables $Y_\{ij\}$ are independent and distributed according to the mixed conditional distribution $Q(·,x_\{ij\})$ of Y given X at the observed value $x_\{ij\}$. Afterwards, we investigate the case $(Q)E(Y^\{\prime \}|x) = ∑^k_\{i=1\} b_\{i\}(x)θ_\{i\} I_\{S_\{i\}\}(x), (Q)D(Y|x) = ∑^k_\{i=1\} d_\{i\}(x)Σ_\{i\}I_\{S_\{i\}\}(x)$, which arises when the conditional distribution law of Y given X changes as X passes from a domain $S_\{i\}$ to another, whence Y follows a mixture of distributions. Then the general transformation gives the equivalent reduction to a conditional multivariate Behrens-Fisher model. We construct conditional generalized least squares estimators of $θ^\{\prime \} = (θ^\{\prime \}_\{1\} ⋮ ⋯ ⋮ θ^\{\prime \}_\{k\})$ and predictors of Y(n+1) given X(n+1) = x ∈ S. Through some condition imposed on the range of θ, the CGLS estimator and predictor are shown to enjoy local and global optimality. CONTENTS: Preface (5); I. A data transformation preserving the conditional distribution and localizing the explanatory variable (6); 1. Introduction (6); 2. Theorems on data transformation (7); 3. Proofs of the theorems (9); 4. Interpretation of the theorems (14); II. Conditional linear models and estimation of regression parameters (17); 5. Introduction (17); 6. Conditional generalized least squares estimators (CGLSE) (19); 7. Conditional estimability (25); 8. Properties of the CGLSE (29); III. Prediction of the response variable (34); 9. Introduction (35); 10. Predictors connected with the CGLSE (35); 11. Properties of CGLS predictors (38); References (43). 1991 Mathematics Subject Classification: Primary 62J02; Secondary 62F11.},
author = {Nguyen Bac-Van},
keywords = {least squares procedure; random explanatory variable regression problems; conditional fixed-design model; data transformation; conditioning; asymptotic estimability; conditional unbiasedness; conditional generalized least squares estimates; prediction problem},
language = {eng},
location = {Warszawa},
publisher = {Instytut Matematyczny Polskiej Akademii Nauk},
title = {Estimation and prediction in regression models with random explanatory variables},
url = {http://eudml.org/doc/219330},
year = {1992},
}

TY - BOOK
AU - Nguyen Bac-Van
TI - Estimation and prediction in regression models with random explanatory variables
PY - 1992
CY - Warszawa
PB - Instytut Matematyczny Polskiej Akademii Nauk
AB - The regression model X(t), Y(t); t=1,...,n with random explanatory variable X is transformed by prescribing a partition $S_{1},...,S_{k}$ of the given domain S of X-values and specifying ${X(1),...,X(n)} ∩ S_{i} = {X_{i1},...,X_{iα(i)}}, i=1,...,k.$ Through the conditioning ${α(i)=a(i), i=1,...,k}, {X_{i1},...,X_{iα(i)}; i=1,...,k} = {x_{11},...,x_{ka(k)}}$ the initial model with i.i.d. pairs (X(t),Y(t)), t=1,...,n, becomes a conditional fixed-design $(x_{11},...,x_{ka(k)})$ model ${Y_{ij}, i=1,...,k; j=1,...,a(i)}$ where the response variables $Y_{ij}$ are independent and distributed according to the mixed conditional distribution $Q(·,x_{ij})$ of Y given X at the observed value $x_{ij}$. Afterwards, we investigate the case $(Q)E(Y^{\prime }|x) = ∑^k_{i=1} b_{i}(x)θ_{i} I_{S_{i}}(x), (Q)D(Y|x) = ∑^k_{i=1} d_{i}(x)Σ_{i}I_{S_{i}}(x)$, which arises when the conditional distribution law of Y given X changes as X passes from a domain $S_{i}$ to another, whence Y follows a mixture of distributions. Then the general transformation gives the equivalent reduction to a conditional multivariate Behrens-Fisher model. We construct conditional generalized least squares estimators of $θ^{\prime } = (θ^{\prime }_{1} ⋮ ⋯ ⋮ θ^{\prime }_{k})$ and predictors of Y(n+1) given X(n+1) = x ∈ S. Through some condition imposed on the range of θ, the CGLS estimator and predictor are shown to enjoy local and global optimality. CONTENTS: Preface (5); I. A data transformation preserving the conditional distribution and localizing the explanatory variable (6); 1. Introduction (6); 2. Theorems on data transformation (7); 3. Proofs of the theorems (9); 4. Interpretation of the theorems (14); II. Conditional linear models and estimation of regression parameters (17); 5. Introduction (17); 6. Conditional generalized least squares estimators (CGLSE) (19); 7. Conditional estimability (25); 8. Properties of the CGLSE (29); III. Prediction of the response variable (34); 9. Introduction (35); 10. Predictors connected with the CGLSE (35); 11. Properties of CGLS predictors (38); References (43). 1991 Mathematics Subject Classification: Primary 62J02; Secondary 62F11.
LA - eng
KW - least squares procedure; random explanatory variable regression problems; conditional fixed-design model; data transformation; conditioning; asymptotic estimability; conditional unbiasedness; conditional generalized least squares estimates; prediction problem
UR - http://eudml.org/doc/219330
ER -
