Currently displaying 1 – 4 of 4

A non asymptotic penalized criterion for Gaussian mixture model selection

Cathy Maugis, Bertrand Michel — 2011

ESAIM: Probability and Statistics

Specific Gaussian mixtures are considered to solve simultaneously variable selection and clustering problems. A non asymptotic penalized criterion is proposed to choose the number of mixture components and the relevant variable subset. Because of the non linearity of the associated Kullback-Leibler contrast on Gaussian mixtures, a general model selection theorem for maximum likelihood estimation proposed by [Massart, Springer, Berlin (2007). Lectures from the 33rd Summer School on Probability Theory...
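
As a rough illustration of the penalised-likelihood idea mentioned in the abstract above, the sketch below selects the number of components of a Gaussian mixture by minimising a penalised log-likelihood. The BIC-style penalty 0.5 * D_m * log(n), the candidate range k_max, the synthetic data, and the use of scikit-learn's GaussianMixture are assumptions made for this example; they are not the paper's non-asymptotic penalty or its variable-selection step.

import numpy as np
from sklearn.mixture import GaussianMixture

def select_n_components(X, k_max=10, random_state=0):
    # Penalised maximum-likelihood choice of the number of components.
    # The BIC-style penalty 0.5 * D_m * log(n) is a stand-in; the paper's
    # non-asymptotic penalty has a different, data-dependent form.
    n, d = X.shape
    best_k, best_crit = 1, np.inf
    for k in range(1, k_max + 1):
        gm = GaussianMixture(n_components=k, covariance_type="full",
                             random_state=random_state).fit(X)
        log_lik = gm.score(X) * n                      # total log-likelihood
        dim = k * d + k * d * (d + 1) // 2 + (k - 1)   # free parameters D_m
        crit = -log_lik + 0.5 * dim * np.log(n)        # penalised contrast
        if crit < best_crit:
            best_k, best_crit = k, crit
    return best_k

# Toy usage: three well-separated 2-D Gaussian clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=m, scale=0.3, size=(200, 2)) for m in (-3.0, 0.0, 3.0)])
print(select_n_components(X, k_max=6))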

Data-driven penalty calibration: A case study for Gaussian mixture model selection

Cathy Maugis, Bertrand Michel — 2011

ESAIM: Probability and Statistics

In the companion paper [C. Maugis and B. Michel, A non asymptotic penalized criterion for Gaussian mixture model selection. 15 (2011) 41–68], a penalized likelihood criterion is proposed to select a Gaussian mixture model among a specific model collection. This criterion depends on unknown constants which have to be calibrated in practical situations. A “slope heuristics” method is described and tested to deal with this practical problem. In a model-based clustering context, the specific...
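
The abstract above refers to a “slope heuristics” calibration of the unknown penalty constant. The sketch below follows the commonly described recipe under stated assumptions: estimate the slope kappa of the maximised log-likelihood against model dimension over the most complex models, then minimise the criterion with the calibrated penalty 2 * kappa * D_m. The complexity range k_max, the regression window n_complex, and the scikit-learn fits are illustrative choices, not the paper's exact procedure.

import numpy as np
from sklearn.mixture import GaussianMixture

def slope_heuristics_select(X, k_max=15, n_complex=5, random_state=0):
    # Fit mixtures of increasing complexity and record, for each model,
    # its dimension D_m and maximised log-likelihood.
    n, d = X.shape
    dims, log_liks = [], []
    for k in range(1, k_max + 1):
        gm = GaussianMixture(n_components=k, covariance_type="full",
                             random_state=random_state).fit(X)
        dims.append(k * d + k * d * (d + 1) // 2 + (k - 1))
        log_liks.append(gm.score(X) * n)
    dims = np.asarray(dims, dtype=float)
    log_liks = np.asarray(log_liks)

    # Slope heuristics: for the most complex models the log-likelihood grows
    # roughly linearly in D_m; its slope estimates the minimal constant kappa.
    kappa_min = np.polyfit(dims[-n_complex:], log_liks[-n_complex:], 1)[0]

    # Calibrated penalty pen(m) = 2 * kappa_min * D_m, then minimise.
    crit = -log_liks + 2.0 * kappa_min * dims
    return int(np.argmin(crit)) + 1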

A non asymptotic penalized criterion for Gaussian mixture model selection

Cathy Maugis, Bertrand Michel — 2012

ESAIM: Probability and Statistics

Specific Gaussian mixtures are considered to solve simultaneously variable selection and clustering problems. A non asymptotic penalized criterion is proposed to choose the number of mixture components and the relevant variable subset. Because of the non linearity of the associated Kullback-Leibler contrast on Gaussian mixtures, a general model selection theorem for maximum likelihood estimation proposed by [Massart, Springer, Berlin (2007). Lectures from the 33rd Summer School on Probability...

Data-driven penalty calibration: A case study for Gaussian mixture model selection

Cathy Maugis, Bertrand Michel — 2012

ESAIM: Probability and Statistics

In the companion paper [C. Maugis and B. Michel, A non asymptotic penalized criterion for Gaussian mixture model selection. (2011) 41–68], a penalized likelihood criterion is proposed to select a Gaussian mixture model among a specific model collection. This criterion depends on unknown constants which have to be calibrated in practical situations. A “slope heuristics” method is described and tested to deal with this practical problem. In a model-based clustering context, the...
