
Iterative feature selection in least square regression estimation

Pierre Alquier — 2008

Annales de l'I.H.P. Probabilités et statistiques

This paper presents a new algorithm to perform regression estimation, in both the inductive and transductive settings. The estimator is defined as a linear combination of functions in a given dictionary. The coefficients of the combination are computed sequentially by projection onto simple sets, defined as confidence regions provided by a deviation (PAC) inequality on an estimator in one-dimensional models. We prove that every projection step of the algorithm actually improves the performance...
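A minimal sketch of the idea described here (one coefficient updated at a time by projecting onto a one-dimensional confidence interval), assuming a fixed dictionary given as the columns of X and a generic Hoeffding-style interval width; the paper's exact deviation bound and stopping rule differ:

```python
import numpy as np

def iterative_feature_selection(X, y, n_iter=50, delta=0.05):
    # X: n x d matrix of dictionary functions evaluated at the design points
    # y: n observations; returns a coefficient vector built by sequential projections
    n, d = X.shape
    beta = np.zeros(d)
    residual = y.astype(float).copy()
    for _ in range(n_iter):
        for j in range(d):
            xj = X[:, j]
            norm2 = xj @ xj
            if norm2 == 0:
                continue
            c = (xj @ residual) / norm2                          # 1-D least squares fit on the residual
            width = np.sqrt(2 * np.log(2 * d / delta) / norm2)   # assumed confidence half-width
            step = np.sign(c) * max(abs(c) - width, 0.0)         # keep only the part outside the interval
            beta[j] += step
            residual -= step * xj
    return beta
```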

Density estimation with quadratic loss: a confidence intervals method

Pierre Alquier — 2008

ESAIM: Probability and Statistics

We propose a feature selection method for density estimation with quadratic loss. This method relies on the study of unidimensional approximation models and on confidence regions for the density derived from these models. It is quite general and includes cases of interest such as the detection of relevant wavelet coefficients or the selection of support vectors in SVM. In the general case, we prove that every selected feature actually improves the performance of the estimator. In the case...
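A rough illustration of the selection rule described here (keep a basis coefficient only when its confidence interval excludes zero, so that the selected feature improves the quadratic-loss estimator), with an assumed Hoeffding-type interval rather than the paper's bound:

```python
import numpy as np

def select_density_features(samples, basis, delta=0.05):
    # samples: 1-D array of observations; basis: list of callables (e.g. wavelet functions)
    n = len(samples)
    selected = {}
    for j, phi in enumerate(basis):
        vals = phi(samples)                                       # basis function at the sample points
        coef = vals.mean()                                        # empirical coefficient of the density
        span = vals.max() - vals.min() + 1e-12
        width = span * np.sqrt(np.log(2 / delta) / (2 * n))       # assumed confidence half-width
        if abs(coef) > width:                                     # interval excludes zero: keep the feature
            selected[j] = coef
    return selected

# e.g. basis = [lambda x, k=k: np.cos(k * np.pi * x) for k in range(1, 20)] for data on [0, 1]
```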

Prediction of time series by statistical learning: general losses and fast rates

Pierre Alquier, Xiaoyin Li, Olivier Wintenberger — 2013

Dependence Modeling

We establish rates of convergence in statistical learning for time series forecasting. Using the PAC-Bayesian approach, slow rates of convergence √(d/n) for the Gibbs estimator under the absolute loss were given in a previous work [7], where n is the sample size and d the dimension of the set of predictors. Under the same weak dependence conditions, we extend this result to any convex Lipschitz loss function. We also identify a condition on the parameter space that ensures similar rates for the...
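For orientation, a minimal sketch of a Gibbs (exponentially weighted) aggregate over a finite set of predictors; the inverse-temperature choice lam below is purely illustrative and not the tuning analyzed in the paper:

```python
import numpy as np

def gibbs_weights(cum_losses, lam):
    # cum_losses: cumulative (e.g. absolute) losses of each candidate predictor
    w = np.exp(-lam * (cum_losses - cum_losses.min()))    # shift for numerical stability
    return w / w.sum()

# Example: combine d forecasters after n time steps (placeholder losses).
n, d = 200, 10
rng = np.random.default_rng(0)
cum_losses = rng.uniform(0.8, 1.2, size=d) * n
weights = gibbs_weights(cum_losses, lam=np.sqrt(d / n))   # forecast = weighted average of predictors
```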
