Gaussian model selection

Lucien Birgé; Pascal Massart

Journal of the European Mathematical Society (2001)

  • Volume: 003, Issue: 3, pages 203-268
  • ISSN: 1435-9855

Abstract

Our purpose in this paper is to provide a general approach to model selection via penalization for Gaussian regression and to develop our point of view about this subject. The advantage and importance of model selection come from the fact that it provides a suitable approach to many different types of problems, starting from model selection per se (among a family of parametric models, which one is more suitable for the data at hand), which includes for instance variable selection in regression models, to nonparametric estimation, for which it provides a very powerful tool that allows adaptation under quite general circumstances. Our approach to model selection also provides a natural connection between the parametric and nonparametric points of view and copes naturally with the fact that a model is not necessarily true. The method is based on the penalization of a least squares criterion which can be viewed as a generalization of Mallows’ $C_p$. A large part of our efforts will be put on choosing properly the list of models and the penalty function for various estimation problems like classical variable selection or adaptive estimation for various types of $l_p$-bodies.
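To illustrate the kind of procedure the abstract describes, here is a minimal sketch of penalized least-squares model selection in a Gaussian sequence model, using the classical Mallows-$C_p$-style penalty $2\sigma^2 D$ for a projection model of dimension $D$. This is only an illustration of the general idea; the paper studies much more general model lists and penalty functions, and the decaying coefficient sequence below is an invented toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian sequence model (white-noise analogue): y_i = theta_i + sigma * eps_i
n, sigma = 128, 1.0
theta = 5.0 / (np.arange(n) + 1.0)  # toy decaying "true" coefficients
y = theta + sigma * rng.standard_normal(n)

# Candidate models: keep the first D observed coefficients, set the rest to zero.
# Penalized least-squares criterion with a Mallows-Cp-style penalty pen(D) = 2*sigma^2*D.
def crit(D):
    rss = np.sum(y[D:] ** 2)  # residual sum of squares of the projection estimator
    return rss + 2.0 * sigma**2 * D

# Select the dimension minimizing the penalized criterion.
D_hat = min(range(n + 1), key=crit)
theta_hat = np.where(np.arange(n) < D_hat, y, 0.0)  # selected projection estimator
print("selected dimension:", D_hat)
```

Because the true coefficients decay, the selected dimension is small: beyond the first few indices the squared coefficients fall below the noise level, so the penalty outweighs the reduction in residual sum of squares.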

How to cite


Birgé, Lucien, and Massart, Pascal. "Gaussian model selection." Journal of the European Mathematical Society 003.3 (2001): 203-268. <http://eudml.org/doc/277724>.

@article{Birgé2001,
abstract = {Our purpose in this paper is to provide a general approach to model selection via penalization for Gaussian regression and to develop our point of view about this subject. The advantage and importance of model selection come from the fact that it provides a suitable approach to many different types of problems, starting from model selection per se (among a family of parametric models, which one is more suitable for the data at hand), which includes for instance variable selection in regression models, to nonparametric estimation, for which it provides a very powerful tool that allows adaptation under quite general circumstances. Our approach to model selection also provides a natural connection between the parametric and nonparametric points of view and copes naturally with the fact that a model is not necessarily true. The method is based on the penalization of a least squares criterion which can be viewed as a generalization of Mallows’ $C_p$. A large part of our efforts will be put on choosing properly the list of models and the penalty function for various estimation problems like classical variable selection or adaptive estimation for various types of $l_p$-bodies.},
author = {Birgé, Lucien and Massart, Pascal},
journal = {Journal of the European Mathematical Society},
keywords = {nonasymptotic point of view; white noise framework; curve estimation; parametric; nonparametric; penalized projection estimators; Bayesian view},
language = {eng},
number = {3},
pages = {203-268},
publisher = {European Mathematical Society Publishing House},
title = {Gaussian model selection},
url = {http://eudml.org/doc/277724},
volume = {003},
year = {2001},
}

TY - JOUR
AU - Birgé, Lucien
AU - Massart, Pascal
TI - Gaussian model selection
JO - Journal of the European Mathematical Society
PY - 2001
PB - European Mathematical Society Publishing House
VL - 003
IS - 3
SP - 203
EP - 268
AB - Our purpose in this paper is to provide a general approach to model selection via penalization for Gaussian regression and to develop our point of view about this subject. The advantage and importance of model selection come from the fact that it provides a suitable approach to many different types of problems, starting from model selection per se (among a family of parametric models, which one is more suitable for the data at hand), which includes for instance variable selection in regression models, to nonparametric estimation, for which it provides a very powerful tool that allows adaptation under quite general circumstances. Our approach to model selection also provides a natural connection between the parametric and nonparametric points of view and copes naturally with the fact that a model is not necessarily true. The method is based on the penalization of a least squares criterion which can be viewed as a generalization of Mallows’ $C_p$. A large part of our efforts will be put on choosing properly the list of models and the penalty function for various estimation problems like classical variable selection or adaptive estimation for various types of $l_p$-bodies.
LA - eng
KW - nonasymptotic point of view; white noise framework; curve estimation; parametric; nonparametric; penalized projection estimators; Bayesian view
UR - http://eudml.org/doc/277724
ER -

Citations in EuDML Documents

  1. Émilie Lebarbier, Tristan Mary-Huard, Une introduction au critère BIC : fondements théoriques et interprétation
  2. Pierre Alquier, Xiaoyin Li, Olivier Wintenberger, Prediction of time series by statistical learning: general losses and fast rates
  3. Nicolas Verzelen, High-dimensional Gaussian model selection on a Gaussian design
  4. Lucien Birgé, Yves Rozenholc, How many bins should be put in a regular histogram
  5. Cathy Maugis, Bertrand Michel, A non asymptotic penalized criterion for Gaussian mixture model selection
  6. Cathy Maugis, Bertrand Michel, Data-driven penalty calibration: A case study for Gaussian mixture model selection
  7. Yannick Baraud, Model selection for regression on a random design
  8. Xavier Gendre, Model selection and estimation of a component in additive regression
  9. Alice Cleynen, Emilie Lebarbier, Segmentation of the Poisson and negative binomial rate models: a penalized estimator
