A note on Sampford-Durbin sampling
The first-order autoregression model with heteroskedastic innovations is considered, and it is shown that the classical bootstrap procedure based on estimated residuals fails for the least-squares estimator of the autoregression coefficient. A different procedure, the wild bootstrap, together with a modification of it, is considered, and its consistency in the strong sense is established under very mild moment conditions.
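As an illustration of the idea (not the paper's exact algorithm), the following minimal Python sketch shows how wild bootstrap replications of the least-squares AR(1) coefficient can be generated: the estimated residuals are multiplied by independent zero-mean, unit-variance weights, a bootstrap series is rebuilt recursively, and the estimator is recomputed. The use of Rademacher weights and of numpy are assumptions made for the example.

    import numpy as np

    def ar1_ls(x):
        # Least-squares estimate of the AR(1) coefficient.
        return np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)

    def wild_bootstrap_ar1(x, n_boot=999, seed=None):
        # Wild-bootstrap replications of the AR(1) LS estimator.
        # Each residual is perturbed by an i.i.d. Rademacher multiplier,
        # so the (possibly heteroskedastic) innovation variances are
        # preserved observation by observation; this is the feature that
        # distinguishes the scheme from the classical residual bootstrap.
        rng = np.random.default_rng(seed)
        rho_hat = ar1_ls(x)
        resid = x[1:] - rho_hat * x[:-1]          # estimated innovations
        n = len(x)
        reps = np.empty(n_boot)
        for b in range(n_boot):
            w = rng.choice([-1.0, 1.0], size=n - 1)   # Rademacher multipliers
            x_star = np.empty(n)
            x_star[0] = x[0]
            for t in range(1, n):
                x_star[t] = rho_hat * x_star[t - 1] + w[t - 1] * resid[t - 1]
            reps[b] = ar1_ls(x_star)
        return rho_hat, reps

The empirical distribution of the replications reps - rho_hat then serves as the bootstrap approximation to the sampling distribution of the estimation error.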
In the paper, a heteroskedastic autoregressive process of the first order is considered in which the autoregressive parameter is random and the errors are allowed to be non-identically distributed. A wild bootstrap procedure for approximating the distribution of the least-squares estimator of the mean of the random parameter is proposed as an alternative to the approximation based on asymptotic normality, and the consistency of this procedure is established.
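For orientation (a generic formulation, not necessarily the paper's notation), such a random coefficient autoregressive process can be written as

    X_t = (\beta + B_t) X_{t-1} + Y_t,

where the random coefficients B_t are i.i.d. with zero mean and the errors Y_t are independent with zero mean but possibly unequal variances. The least-squares estimator of \beta, the mean of the random parameter, is the usual autoregression estimator

    \hat{\beta}_n = \left( \sum_{t=1}^{n} X_t X_{t-1} \right) / \left( \sum_{t=1}^{n} X_{t-1}^2 \right),

and the wild bootstrap is used to approximate the distribution of \hat{\beta}_n - \beta.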
In the paper, a sequential monitoring scheme is proposed to detect instability of parameters in a multivariate autoregressive process. The proposed monitoring procedure is based on the quasi-likelihood scores and the quasi-maximum likelihood estimators of the respective parameters computed from a training sample, and it is designed so that the sequential test has a small probability of a false alarm and asymptotic power one provided the training sample is sufficiently large. The asymptotic...
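A simplified sketch of a score-based monitoring detector of this general kind is given below; the boundary function shown is a common choice in the sequential-monitoring literature, not necessarily the one used in the paper, and the scores are assumed to be already standardized by an estimate of their covariance from the training sample.

    import numpy as np

    def monitor(scores_new, m, critical, gamma=0.0):
        # scores_new : array of shape (k, d) with the quasi-likelihood scores
        #              of the k observations arriving after the training
        #              sample, evaluated at the training-sample QMLE.
        # m          : size of the training sample.
        # critical   : critical value from the appropriate limiting
        #              distribution (problem specific).
        # Signals an alarm at the first time the sup-norm of the cumulated
        # scores exceeds the boundary c * sqrt(m) * (1 + t/m) * (t/(t+m))**gamma.
        cum = np.cumsum(np.atleast_2d(scores_new), axis=0)
        for t in range(1, cum.shape[0] + 1):
            boundary = critical * np.sqrt(m) * (1 + t / m) * (t / (t + m)) ** gamma
            if np.max(np.abs(cum[t - 1])) > boundary:
                return t          # alarm time, counted from the end of training
        return None               # no alarm within the monitoring horizon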
This work deals with a multivariate random coefficient autoregressive (RCA) model of the first order. A class of modified least-squares estimators of the parameters of the model, originally proposed by Schick for univariate first-order RCA models, is studied under more general conditions. The asymptotic behavior of such estimators is explored, and a lower bound for the asymptotic variance matrix of the estimator of the mean of the random coefficient is established. Finite sample properties are demonstrated...
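For reference (generic notation, not necessarily the paper's), a multivariate first-order RCA model has the form

    X_t = (B + B_t) X_{t-1} + Y_t,

where B is a fixed d x d coefficient matrix, the random coefficients B_t are i.i.d. random matrices with zero mean, and the errors Y_t have zero mean; the mean of the random coefficient, B, is the parameter estimated by the modified least-squares procedure.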