
Prediction of time series by statistical learning: general losses and fast rates

Pierre Alquier, Xiaoyin Li, Olivier Wintenberger (2013)

Dependence Modeling

We establish rates of convergence in statistical learning for time series forecasting. Using the PAC-Bayesian approach, slow rates of convergence √(d/n) for the Gibbs estimator under the absolute loss were given in a previous work [7], where n is the sample size and d the dimension of the set of predictors. Under the same weak dependence conditions, we extend this result to any convex Lipschitz loss function. We also identify a condition on the parameter space that ensures similar rates for the...
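Schematically, the slow-rate result the abstract refers to is an oracle inequality of the following form (a sketch based on the abstract's notation; the constant c and the exact expectation/probability statement are assumptions, not quoted from the paper):

```latex
\[
\mathbb{E}\, R\bigl(\hat{f}_{\mathrm{Gibbs}}\bigr)
\;\le\;
\inf_{f \in \mathcal{F}} R(f) \;+\; c\,\sqrt{\frac{d}{n}},
\]
```

where R denotes the prediction risk (under the absolute loss in [7], extended here to any convex Lipschitz loss), n is the sample size, and d the dimension of the predictor set.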

Prediction problems related to a first-order autoregressive process in the presence of outliers

Sugata Sen Roy, Sourav Chakraborty (2006)

Applicationes Mathematicae

Outliers in a time series often cause problems in fitting a suitable model to the data, so predictions based on such models are liable to be erroneous. In this paper we consider a stable first-order autoregressive process and suggest two methods of substituting an outlier by imputed values, and then predicting on the basis of the modified series. The asymptotic properties of both the process parameter estimators and the predictors are also studied.
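The general workflow the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' actual procedures: the simulation parameters, the residual-based outlier detection rule, and the "impute with the fitted value" substitution are all assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stable AR(1) process x_t = phi * x_{t-1} + e_t with |phi| < 1,
# then inject one additive outlier (illustrative values, not from the paper).
phi_true, n = 0.6, 200
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal()
x[100] += 10.0  # additive outlier

def fit_phi(series):
    """Least-squares estimate of the AR(1) coefficient."""
    return np.dot(series[:-1], series[1:]) / np.dot(series[:-1], series[:-1])

# Crude outlier detection (an assumption for this sketch): flag points whose
# one-step-ahead residual is large relative to a robust residual scale.
phi_hat = fit_phi(x)
resid = x[1:] - phi_hat * x[:-1]
sigma = np.median(np.abs(resid)) / 0.6745  # MAD-based scale estimate
outliers = 1 + np.where(np.abs(resid) > 5 * sigma)[0]

# Substitute each flagged value by an imputed (fitted) value, then refit.
x_clean = x.copy()
for t in outliers:
    x_clean[t] = phi_hat * x_clean[t - 1]
phi_clean = fit_phi(x_clean)

# One-step-ahead prediction based on the cleaned series.
pred = phi_clean * x_clean[-1]
print(phi_hat, phi_clean, pred)
```

The paper's contribution lies in the specific substitution methods and the asymptotics of the resulting estimators and predictors; the sketch only shows where imputation sits in the detect-substitute-refit-predict pipeline.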
