Nonparametric prediction: a study of the quadratic error of the predictogram
We establish rates of convergence in statistical learning for time series forecasting. Using the PAC-Bayesian approach, slow rates of convergence √(d/n) for the Gibbs estimator under the absolute loss were given in a previous work [7], where n is the sample size and d the dimension of the set of predictors. Under the same weak dependence conditions, we extend this result to any convex Lipschitz loss function. We also identify a condition on the parameter space that ensures similar rates for the...
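The Gibbs estimator referred to above can be realized as exponential-weights aggregation over a finite set of predictors. The following is a minimal sketch, not the paper's construction: the constant forecasters, the toy series, and the temperature choice lam = √n are all illustrative assumptions.

```python
import math

def gibbs_weights(cum_losses, lam):
    """Gibbs weights: w_j proportional to exp(-lam * cumulative loss of predictor j)."""
    m = min(cum_losses)  # subtract the minimum for numerical stability
    w = [math.exp(-lam * (L - m)) for L in cum_losses]
    s = sum(w)
    return [wi / s for wi in w]

# Hypothetical finite predictor set: three constant forecasters.
predictors = [-1.0, 0.0, 0.5]
y = [0.4, 0.6, 0.5, 0.45, 0.55] * 20  # toy series to forecast, n = 100

n = len(y)
lam = math.sqrt(n)  # illustrative temperature; tuning lam governs the rate
# Cumulative absolute loss of each predictor over the series.
cum = [sum(abs(yt - p) for yt in y) for p in predictors]
w = gibbs_weights(cum, lam)
# Aggregated (Gibbs) forecast: posterior mean of the predictors.
gibbs_forecast = sum(wi * p for wi, p in zip(w, predictors))
```

With the absolute loss, the weight of the best constant predictor (0.5) dominates, so the aggregated forecast is driven toward it; the posterior concentrates faster as lam grows.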
Outliers in a time series often cause problems in fitting a suitable model to the data, so predictions based on such models are liable to be erroneous. In this paper we consider a stable first-order autoregressive process and suggest two methods of replacing an outlier by an imputed value and then predicting on that basis. The asymptotic properties of both the process parameter estimators and the predictors are also studied.
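The idea of imputing an outlier before estimating and predicting in an AR(1) model can be sketched as follows. This is an illustrative example only: the neighbour-interpolation imputation is one plausible choice and is not claimed to be either of the paper's two methods, and the least-squares estimator of the autoregressive parameter is the standard one.

```python
import random

def simulate_ar1(phi, n, sigma=1.0, seed=0):
    """Simulate a stable AR(1) process x_t = phi * x_{t-1} + e_t."""
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0.0, sigma))
    return x

def estimate_phi(x):
    """Least-squares estimator: sum x_t x_{t-1} / sum x_{t-1}^2."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den

phi_true = 0.6
x = simulate_ar1(phi_true, 500)

# Inject a single additive outlier at index k.
k = 250
x_out = list(x)
x_out[k] += 20.0

# Hypothetical imputation: replace the outlier by the mean of its neighbours.
x_imp = list(x_out)
x_imp[k] = (x_imp[k - 1] + x_imp[k + 1]) / 2.0

phi_contaminated = estimate_phi(x_out)
phi_imputed = estimate_phi(x_imp)

# One-step-ahead prediction from the imputed series.
pred = phi_imputed * x_imp[-1]
```

The single spike inflates the denominator of the least-squares estimator, biasing it toward zero; imputing the outlier first removes most of that bias, and the predictor built from the cleaned series inherits the better parameter estimate.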