Specific Gaussian mixtures are considered to solve variable selection and clustering problems simultaneously. A non-asymptotic penalized criterion is proposed to choose the number of mixture components and the relevant variable subset. Because of the nonlinearity of the associated Kullback-Leibler contrast on Gaussian mixtures, a general model selection theorem for maximum likelihood estimation proposed by Massart [Concentration Inequalities and Model Selection, Lectures from the 33rd Summer School on Probability Theory, Springer, Berlin (2007)] ...
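As a hedged illustration only (the exact penalty shape is derived in the paper and is not reproduced here), a penalized maximum-likelihood criterion of this kind typically selects the number of components K and the relevant-variable subset V by minimizing the empirical contrast plus a penalty growing with the model dimension:

% Illustrative form, not the authors' exact penalty.
% \hat{s}_{(K,V)} denotes the maximum likelihood estimator in the model indexed by (K, V),
% D_{(K,V)} its dimension, n the sample size, and \kappa an unknown multiplicative constant.
\[
  (\widehat{K},\widehat{V})
  = \operatorname*{arg\,min}_{(K,V)}
    \Big\{ -\frac{1}{n}\sum_{i=1}^{n}\log \hat{s}_{(K,V)}(x_i) + \operatorname{pen}(K,V) \Big\},
  \qquad
  \operatorname{pen}(K,V) \propto \kappa\,\frac{D_{(K,V)}}{n}.
\]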
In the companion paper [C. Maugis and B. Michel, A non asymptotic penalized criterion for Gaussian mixture model selection, ESAIM: Probab. Stat. 15 (2011) 41–68], a penalized likelihood criterion is proposed to select a Gaussian mixture model from a specific model collection. This criterion depends on unknown constants which have to be calibrated in practical situations. A “slope heuristics” method is described and experimentally assessed to deal with this practical problem. In a model-based clustering context, the specific...
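To make the calibration idea concrete, here is a minimal sketch in Python of the generic slope-heuristics recipe, assuming the per-model dimensions and maximized log-likelihoods are already available; the function name, the frac_largest parameter, and the simple linear fit are illustrative choices, not the paper's exact procedure. The slope of the log-likelihood against dimension over the most complex models estimates the minimal penalty constant, and the final penalty is taken as twice that slope.

import numpy as np

def slope_heuristics(dims, loglik, frac_largest=0.4):
    # dims: model dimensions D_m; loglik: maximized log-likelihood of each model.
    # frac_largest: fraction of the most complex models used to estimate the slope.
    dims = np.asarray(dims, dtype=float)
    loglik = np.asarray(loglik, dtype=float)
    order = np.argsort(dims)
    k = max(2, int(frac_largest * len(dims)))
    idx = order[-k:]  # most complex models, where loglik is roughly linear in D_m
    slope, _ = np.polyfit(dims[idx], loglik[idx], deg=1)  # estimated minimal penalty constant
    kappa = max(slope, 0.0)
    crit = -loglik + 2.0 * kappa * dims  # "twice the minimal penalty" rule
    return int(np.argmin(crit)), 2.0 * kappa

# Hypothetical usage, given dims and loglik collected from EM fits of the candidate mixtures:
# best_index, penalty_constant = slope_heuristics(dims, loglik)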