On extrapolation in multiple ARMA processes
A non-linear AR(1) process is investigated when the associated white noise is positive. A criterion for the geometric ergodicity of the process is derived. Explicit formulas are given for one- and two-step-ahead extrapolation, and the influence of parameter estimation on extrapolation is studied.
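As a rough illustration of the extrapolation problem, the following Python sketch simulates a nonlinear AR(1) process with positive (exponential) white noise and computes one- and two-step-ahead predictors. The nonlinear map f, the noise law, and all numeric values are illustrative assumptions; the paper's explicit two-step formula is replaced here by a Monte Carlo average.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # hypothetical nonlinear autoregression function (illustrative choice)
    return 0.5 * x / (1.0 + 0.1 * x)

# Simulate X_{t+1} = f(X_t) + e_{t+1} with positive, Exp(1) noise.
n = 10_000
x = np.empty(n)
x[0] = 1.0
noise = rng.exponential(scale=1.0, size=n)
for t in range(n - 1):
    x[t + 1] = f(x[t]) + noise[t + 1]

# One-step-ahead extrapolation: E[X_{t+1} | X_t] = f(X_t) + E[e].
mean_e = 1.0  # mean of the Exp(1) noise
pred_1 = f(x[-1]) + mean_e

# Two-step-ahead extrapolation: E[X_{t+2} | X_t] = E[f(f(X_t) + e)] + E[e],
# approximated by averaging over draws of the noise.
draws = rng.exponential(scale=1.0, size=100_000)
pred_2 = np.mean(f(f(x[-1]) + draws)) + mean_e

print(f"one step ahead: {pred_1:.4f}, two steps ahead: {pred_2:.4f}")
```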
The impact of additive outliers on the performance of the Kalman filter is discussed, and a less outlier-sensitive modification of the Kalman filter is proposed. The improved filter is then used to obtain an improved smoothing algorithm and improved estimation of the state-space model parameters.
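One common way to make the Kalman filter less sensitive to additive outliers is to downweight large standardized innovations before the update. The sketch below does this with a Huber weight in a scalar model; it is an illustrative robustification under assumed parameter values, not the paper's specific modification.

```python
import numpy as np

def robust_kalman_1d(y, a, q, r, c=1.345, x0=0.0, p0=1.0):
    """Scalar Kalman filter with Huber-type downweighting of large
    innovations (approximate robust covariance update)."""
    x, p = x0, p0
    out = np.empty(len(y))
    for t, yt in enumerate(y):
        # prediction step
        x, p = a * x, a * a * p + q
        # update step with weighted innovation
        s = p + r                                 # innovation variance
        v = yt - x                                # innovation
        z = v / np.sqrt(s)                        # standardized innovation
        w = 1.0 if abs(z) <= c else c / abs(z)    # Huber weight
        k = p / s                                 # usual Kalman gain
        x = x + k * w * v                         # outliers get a shrunken correction
        p = (1.0 - k * w) * p
        out[t] = x
    return out

rng = np.random.default_rng(1)
n = 200
state = np.cumsum(rng.normal(0, 0.1, n))              # random-walk state
y = state + rng.normal(0, 0.5, n)
y[::25] += rng.choice([-8, 8], size=len(y[::25]))     # inject additive outliers
xhat = robust_kalman_1d(y, a=1.0, q=0.01, r=0.25)
print(f"RMSE vs. true state: {np.sqrt(np.mean((xhat - state) ** 2)):.3f}")
```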
Periodic autoregressive processes are useful in the statistical analysis of seasonal time series. Some procedures (e.g. extrapolation) are quite analogous to those for classical autoregressive models. The problem of interpolation, however, calls for special methods. These are demonstrated in the paper for a second-order process with period 2.
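To make the model concrete, the following sketch simulates a periodic AR(2) process with period 2, where the coefficients and the noise variance alternate between the two seasons, and computes a one-step forecast the same way as in the classical AR case. All numeric values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# PAR(2) with period 2: AR coefficients and noise scale alternate
# between two "seasons" (values chosen for illustration only).
phi = {0: (0.5, -0.2), 1: (0.3, 0.4)}   # (phi_1, phi_2) per season
sigma = {0: 1.0, 1: 0.5}

n = 500
x = np.zeros(n)
for t in range(2, n):
    s = t % 2
    p1, p2 = phi[s]
    x[t] = p1 * x[t - 1] + p2 * x[t - 2] + rng.normal(0, sigma[s])

# One-step extrapolation works as in an ordinary AR(2), only with the
# coefficients of the season being predicted.
s = n % 2
p1, p2 = phi[s]
x_next = p1 * x[-1] + p2 * x[-2]
print(f"one-step forecast for season {s}: {x_next:.4f}")
```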
This paper introduces a novel method for selecting a feature subset that yields an optimal trade-off between class separability and feature space dimensionality. We assume the following feature properties: (a) the features are ordered into a sequence, (b) the robustness of the features decreases with increasing order, and (c) higher-order features supply more detailed information about the objects. We present a general algorithm for finding the optimal feature subset under these assumptions. Its performance...
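Because the features are ordered, candidate subsets reduce to prefixes of the sequence, so the search scans the cut-off order. The sketch below does this with a Fisher-type separability criterion and a linear dimensionality penalty; both choices, and the data model, are stand-in assumptions rather than the paper's criterion.

```python
import numpy as np

def best_prefix(X, y, penalty=0.05):
    """Scan prefix lengths d = 1..D of an ordered feature sequence and
    return the d maximizing separability minus a dimensionality penalty."""
    D = X.shape[1]
    best_d, best_score = 1, -np.inf
    for d in range(1, D + 1):
        Xd = X[:, :d]
        m0, m1 = Xd[y == 0].mean(axis=0), Xd[y == 1].mean(axis=0)
        v0, v1 = Xd[y == 0].var(axis=0), Xd[y == 1].var(axis=0)
        sep = np.sum((m0 - m1) ** 2 / (v0 + v1 + 1e-12))  # Fisher-type ratio
        score = sep - penalty * d
        if score > best_score:
            best_d, best_score = d, score
    return best_d, best_score

rng = np.random.default_rng(3)
n, D = 400, 20
y = rng.integers(0, 2, n)
# class shift decays with feature order, mimicking assumption (b)
shift = 2.0 / np.arange(1, D + 1)
X = rng.normal(0, 1, (n, D)) + y[:, None] * shift
print(best_prefix(X, y))
```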
Particle filter algorithms approximate a sequence of distributions by a sequence of empirical measures generated by a population of simulated particles. In the context of Hidden Markov Models (HMM), they provide approximations of the distribution of the optimal filters associated with these models. For a given set of observations, the behaviour of particle filters, as the number of particles tends to infinity, is asymptotically Gaussian, and the asymptotic variance in the central limit theorem depends...
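The following sketch shows the basic mechanism on a linear-Gaussian HMM: particles are propagated through the dynamics, weighted by the observation likelihood, and resampled, and the weighted empirical measure approximates the optimal filter. The model and all parameter values are illustrative assumptions; the linear-Gaussian case is chosen only because its optimal filter is well understood.

```python
import numpy as np

rng = np.random.default_rng(4)

# Linear-Gaussian HMM: X_t = a X_{t-1} + W_t,  Y_t = X_t + V_t.
a, qw, rv, T, N = 0.9, 1.0, 1.0, 50, 2000
x_true = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + rng.normal(0, np.sqrt(qw))
    y[t] = x_true[t] + rng.normal(0, np.sqrt(rv))

# Bootstrap particle filter: propagate, weight by the likelihood,
# then resample (multinomial).
particles = rng.normal(0, 1, N)
filter_means = np.zeros(T)
for t in range(T):
    particles = a * particles + rng.normal(0, np.sqrt(qw), N)
    logw = -0.5 * (y[t] - particles) ** 2 / rv
    w = np.exp(logw - logw.max())
    w /= w.sum()
    filter_means[t] = np.sum(w * particles)          # empirical-measure estimate
    particles = rng.choice(particles, size=N, p=w)   # resampling step

print(f"RMSE of particle-filter mean: "
      f"{np.sqrt(np.mean((filter_means - x_true) ** 2)):.3f}")
```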
In this article we consider the construction of prediction intervals for stationary time series using Bühlmann's [8], [9] sieve bootstrap approach. Basic theoretical properties concerning consistency are proved. We extend the results obtained earlier by Stine [21] and by Masarotto and Grigoletto [13] for autoregressive time series of finite order to the rich class of linear and invertible stationary models. The finite-sample performance of the constructed intervals is investigated by computer simulations.
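The core of the sieve bootstrap is to approximate the stationary series by an AR(p) model with p growing with the sample size, resample the centred residuals, and read off empirical quantiles of the bootstrapped future values. The sketch below is a minimal version of that idea: the AR fit is plain least squares and only the future paths are regenerated (a full sieve bootstrap would also regenerate the series and refit the model), so treat it as a simplified assumption-laden illustration.

```python
import numpy as np

def sieve_bootstrap_pi(x, h=1, p=None, B=1000, alpha=0.05, seed=0):
    """Sieve-bootstrap prediction interval for an h-step-ahead forecast."""
    rng = np.random.default_rng(seed)
    n = len(x)
    if p is None:
        p = int(np.floor(10 * np.log10(n)))        # a common cap on the sieve order
    # least-squares AR(p) fit
    Y = x[p:]
    Z = np.column_stack([x[p - j:n - j] for j in range(1, p + 1)])
    phi, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    resid = Y - Z @ phi
    resid = resid - resid.mean()                   # centre the residuals
    # bootstrap future paths conditional on the last p observations
    paths = np.empty((B, h))
    for b in range(B):
        hist = list(x[-p:])                        # in time order
        for step in range(h):
            x_new = np.dot(phi, hist[::-1]) + rng.choice(resid)
            paths[b, step] = x_new
            hist = hist[1:] + [x_new]
    lo, hi = np.quantile(paths[:, -1], [alpha / 2, 1 - alpha / 2])
    return lo, hi

rng = np.random.default_rng(5)
n = 300
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + rng.normal()
print(sieve_bootstrap_pi(x, h=2))
```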
Convergence of the ensemble Kalman filter to the Kalman filter in the limit of large ensembles is proved. In each step of the filter, convergence of the ensemble sample covariance follows from a weak law of large numbers for exchangeable random variables, the continuous mapping theorem gives convergence in probability of the ensemble members, and bounds on the ensemble then give convergence.
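The convergence statement can be checked numerically in the simplest setting: one analysis step of a scalar Gaussian model, comparing the ensemble update against the exact Kalman update as the ensemble size grows. The sketch below uses the perturbed-observation EnKF variant and illustrative parameter values; it is a demonstration of the limit, not the paper's proof technique.

```python
import numpy as np

rng = np.random.default_rng(6)

# Scalar model: forecast mean/variance, observation variance, observation.
mu_f, p_f, r, yobs = 0.0, 2.0, 1.0, 1.5

# Exact Kalman analysis step.
k = p_f / (p_f + r)
mu_a = mu_f + k * (yobs - mu_f)
p_a = (1 - k) * p_f

for N in (10, 100, 1000, 100_000):
    ens = rng.normal(mu_f, np.sqrt(p_f), N)            # forecast ensemble
    pe = ens.var(ddof=1)                               # ensemble sample covariance
    ke = pe / (pe + r)                                 # sample Kalman gain
    perturbed = yobs + rng.normal(0, np.sqrt(r), N)    # perturbed observations
    ens_a = ens + ke * (perturbed - ens)               # EnKF analysis ensemble
    print(f"N={N:>6}: mean err {abs(ens_a.mean() - mu_a):.4f}, "
          f"var err {abs(ens_a.var(ddof=1) - p_a):.4f}")
```

As N increases, the analysis-ensemble mean and sample variance approach the exact Kalman posterior mean and variance, matching the large-ensemble limit asserted in the abstract.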