Model selection for estimating the non zero components of a Gaussian vector
ESAIM: Probability and Statistics (2006)
- Volume: 10, pages 164-183
- ISSN: 1292-8100
Abstract
We propose a method, based on a penalised likelihood criterion, for estimating the number of non-zero components of the mean of a Gaussian vector. Following the work of Birgé and Massart on Gaussian model selection, we choose the penalty function such that the resulting estimator minimises the Kullback risk.
How to cite
Huet, Sylvie. "Model selection for estimating the non zero components of a Gaussian vector." ESAIM: Probability and Statistics 10 (2006): 164-183. <http://eudml.org/doc/249747>.
@article{Huet2006,
abstract = {
We propose a method, based on a penalised likelihood criterion, for
estimating the number of non-zero components of the mean of a
Gaussian vector. Following the work of Birgé and Massart on Gaussian
model selection, we choose the penalty function such that the resulting
estimator minimises the Kullback risk.
},
author = {Huet, Sylvie},
journal = {ESAIM: Probability and Statistics},
keywords = {Kullback risk; model selection; penalised likelihood criteria},
language = {eng},
month = {3},
pages = {164-183},
publisher = {EDP Sciences},
title = {Model selection for estimating the non zero components of a Gaussian vector},
url = {http://eudml.org/doc/249747},
volume = {10},
year = {2006},
}
TY - JOUR
AU - Huet, Sylvie
TI - Model selection for estimating the non zero components of a Gaussian vector
JO - ESAIM: Probability and Statistics
DA - 2006/3//
PB - EDP Sciences
VL - 10
SP - 164
EP - 183
AB - We propose a method, based on a penalised likelihood criterion, for estimating the number of non-zero components of the mean of a Gaussian vector. Following the work of Birgé and Massart on Gaussian model selection, we choose the penalty function such that the resulting estimator minimises the Kullback risk.
LA - eng
KW - Kullback risk; model selection; penalised likelihood criteria
UR - http://eudml.org/doc/249747
ER -
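The abstract describes choosing the number of non-zero mean components by minimising a penalised likelihood criterion. As a rough, hypothetical illustration only: the sketch below orders the observations by magnitude and, for each candidate count k, trades the residual fit of the k largest components against a sparsity penalty. The penalty c·k·(1 + log(n/k)) is a generic choice in the spirit of Birgé and Massart's work, not the Kullback-risk penalty derived in the paper, and the function name and constants are invented for this example.

```python
import numpy as np

def select_nonzero_count(x, sigma=1.0, c=2.0):
    """Estimate how many components of the mean of the Gaussian
    vector x are non-zero, via a penalised criterion.

    For each k, the k components with largest |x_i| are fitted and the
    remaining ones are treated as pure noise; the criterion is the
    scaled residual sum of squares plus a sparsity penalty
    c * k * (1 + log(n / k)).  This penalty shape is illustrative,
    not the one derived in the paper.
    """
    n = len(x)
    order = np.argsort(np.abs(x))[::-1]        # indices by decreasing |x_i|
    rss = np.sum(x**2) / sigma**2              # k = 0: every mean is zero
    best_k, best_crit = 0, rss
    for k in range(1, n + 1):
        i = order[k - 1]
        rss -= x[i]**2 / sigma**2              # residual after fitting k terms
        crit = rss + c * k * (1.0 + np.log(n / k))
        if crit < best_crit:
            best_k, best_crit = k, crit
    return best_k
```

On a vector with a few large coordinates buried in standard Gaussian noise, the selected k typically lands near the true number of signals; how close depends on the penalty constant c, which plays the role the paper's theory makes precise.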
References
- F. Abramovich, Y. Benjamini, D. Donoho and I. Johnstone, Adapting to unknown sparsity by controlling the false discovery rate. Technical Report 2000-19, Department of Statistics, Stanford University (2000).
- H. Akaike, Information theory and an extension of the maximum likelihood principle, in 2nd International Symposium on Information Theory, B.N. Petrov and F. Csaki Eds., Budapest Akademia Kiado (1973) 267–281.
- H. Akaike, A Bayesian analysis of the minimum AIC procedure. Ann. Inst. Statist. Math. 30 (1978) 9–14.
- A. Antoniadis, I. Gijbels and G. Grégoire, Model selection using wavelet decomposition and applications. Biometrika 84 (1997) 751–763.
- Y. Baraud, S. Huet and B. Laurent, Adaptive tests of qualitative hypotheses. ESAIM: PS 7 (2003) 147–159.
- A. Barron, L. Birgé and P. Massart, Risk bounds for model selection via penalization. Probab. Theory Rel. Fields 113 (1999) 301–413.
- Y. Benjamini and Y. Hochberg, Controlling the false discovery rate: a practical and powerful approach to multiple testing. J. R. Statist. Soc. B 57 (1995) 289–300.
- L. Birgé and P. Massart, Gaussian model selection. J. Eur. Math. Soc. (JEMS) 3 (2001) 203–268.
- L. Birgé and P. Massart, A generalized Cp criterion for Gaussian model selection. Technical report, Univ. Paris 6, Paris 7, Paris (2001).
- B.S. Cirel'son, I.A. Ibragimov and V.N. Sudakov, Norms of Gaussian sample functions, in Proceedings of the 3rd Japan-USSR Symposium on Probability Theory, Berlin, Springer-Verlag. Springer Lect. Notes Math. 550 (1976) 20–41.
- H.A. David, Order Statistics. Wiley Series in Probability and Mathematical Statistics. John Wiley and Sons, NY (1981).
- G.E.P. Box and R.D. Meyer, An analysis for unreplicated fractional factorials. Technometrics 28 (1986) 11–18.
- D.P. Foster and R.A. Stine, Adaptive variable selection competes with Bayes expert. Technical report, The Wharton School of the University of Pennsylvania, Philadelphia (2002).
- S. Huet, Comparison of methods for estimating the non zero components of a Gaussian vector. Technical report, INRA, MIA-Jouy, www.inra.fr/miaj/apps/cgi-bin/raptech.cgi (2005).
- M.C. Hurvich and C.L. Tsai, Regression and time series model selection in small samples. Biometrika 76 (1989) 297–307.
- I. Johnstone and B. Silverman, Empirical Bayes selection of wavelet thresholds. Available from www.stats.ox.ac.uk/~silverma/papers.html (2003).
- B. Laurent and P. Massart, Adaptive estimation of a quadratic functional by model selection. Ann. Statist. 28 (2000) 1302–1338.
- R. Nishii, Maximum likelihood principle and model selection when the true model is unspecified. J. Multivariate Anal. 27 (1988) 392–403.
- P.D. Haaland and M.A. O'Connell, Inference for effect-saturated fractional factorials. Technometrics 37 (1995) 82–93.
- J. Rissanen, Universal coding, information, prediction and estimation. IEEE Trans. Inform. Theory 30 (1984) 629–636.
- R.V. Lenth, Quick and easy analysis of unreplicated factorials. Technometrics 31 (1989) 469–473.
- G. Schwarz, Estimating the dimension of a model. Ann. Statist. 6 (1978) 461–464.