
Data-driven penalty calibration: A case study for Gaussian mixture model selection

Cathy Maugis, Bertrand Michel (2011)

ESAIM: Probability and Statistics

In the companion paper [C. Maugis and B. Michel, A non asymptotic penalized criterion for Gaussian mixture model selection. ESAIM: P&S 15 (2011) 41–68], a penalized likelihood criterion is proposed to select a Gaussian mixture model among a specific model collection. This criterion depends on unknown constants which have to be calibrated in practical situations. A “slope heuristics” method is described and experimented with to deal with this practical problem. In a model-based clustering context,...
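As a rough illustration of how a slope-heuristics calibration proceeds in practice (a minimal sketch, not the authors' procedure; the function name, the choice of model dimensions as the penalty shape, the linear fit on the largest models, and the "twice the minimal penalty" rule are the standard ingredients assumed here), one can regress the maximized log-likelihood on the penalty shape over the most complex models and plug twice the fitted slope into the penalized criterion:

```python
import numpy as np

def slope_heuristics(dims, loglik, n_largest=10):
    """Data-driven penalty calibration via the slope heuristic (illustrative sketch).

    dims    : model dimensions D_m (penalty shape), one per fitted model
    loglik  : maximized log-likelihoods, one per fitted model
    returns : index of the selected model
    """
    dims = np.asarray(dims, dtype=float)
    loglik = np.asarray(loglik, dtype=float)

    # 1. Estimate the slope kappa on the most complex models, where the
    #    maximized log-likelihood is expected to be roughly linear in D_m.
    order = np.argsort(dims)[-n_largest:]
    kappa, _ = np.polyfit(dims[order], loglik[order], deg=1)

    # 2. Use twice the minimal penalty and select the model minimizing
    #    the penalized criterion  -loglik(m) + 2 * kappa * D_m.
    crit = -loglik + 2.0 * kappa * dims
    return int(np.argmin(crit))

# Toy usage with synthetic values (illustration only, not real data).
rng = np.random.default_rng(0)
dims = np.arange(1, 41)
loglik = 50.0 * np.sqrt(dims) + 0.8 * dims + rng.normal(0, 1, dims.size)
print("selected model index:", slope_heuristics(dims, loglik))
```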

Density deconvolution with associated stationary data

Le Thi Hong Thuy, Cao Xuan Phuong (2023)

Applications of Mathematics

We study the density deconvolution problem when the random variables of interest form an associated strictly stationary sequence and the random noises are i.i.d. with a nonstandard density. Based on a nonparametric strategy, we introduce an estimator depending on two parameters. This estimator is shown to be consistent with respect to the mean integrated squared error. Under additional regularity assumptions on the target function as well as on the noise density, some error estimates are derived....
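For readers unfamiliar with deconvolution estimators, the classical Fourier-inversion construction can be sketched as follows (a sketch under i.i.d.-style assumptions, not the estimator of this paper; the sinc kernel, the bandwidth h, and the Laplace noise in the toy example are illustrative choices): the empirical characteristic function of the observations is divided by the noise characteristic function before inverting.

```python
import numpy as np

def deconvolution_kde(x_grid, y, h, noise_cf):
    """Classical deconvolution kernel density estimator (sketch, not the paper's estimator).

    x_grid   : points at which to evaluate the density estimate
    y        : observed sample Y_j = X_j + eps_j
    h        : bandwidth (tuning parameter)
    noise_cf : characteristic function t -> E[exp(i t eps)] of the noise
    """
    # Sinc kernel: its Fourier transform is the indicator of [-1, 1],
    # so the inversion integral runs over |t| <= 1/h only.
    t = np.linspace(-1.0 / h, 1.0 / h, 2001)
    dt = t[1] - t[0]

    # Empirical characteristic function of the observations.
    ecf = np.exp(1j * np.outer(t, y)).mean(axis=1)

    # Fourier inversion: divide out the noise characteristic function.
    integrand = ecf / noise_cf(t)
    phase = np.exp(-1j * np.outer(x_grid, t))
    est = (phase @ integrand) * dt / (2.0 * np.pi)
    return np.maximum(est.real, 0.0)  # truncate small negative values

# Toy usage: standard normal signal observed with Laplace(0, 0.3) noise.
rng = np.random.default_rng(1)
n = 500
x = rng.normal(0.0, 1.0, n)
eps = rng.laplace(0.0, 0.3, n)
laplace_cf = lambda t: 1.0 / (1.0 + (0.3 * t) ** 2)
grid = np.linspace(-4, 4, 9)
print(deconvolution_kde(grid, x + eps, h=0.3, noise_cf=laplace_cf))
```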

Density estimation with quadratic loss: a confidence intervals method

Pierre Alquier (2008)

ESAIM: Probability and Statistics

We propose a feature selection method for density estimation with quadratic loss. This method relies on the study of unidimensional approximation models and on the definition of confidence regions for the density thanks to these models. It is quite general and includes cases of interest like detection of relevant wavelet coefficients or selection of support vectors in SVM. In the general case, we prove that every selected feature actually improves the performance of the estimator. In the case...
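A simplified sketch of the confidence-interval idea (not Alquier's exact construction; the cosine basis, the normal-approximation intervals, and the Bonferroni correction are assumptions made for this example) estimates each coefficient ⟨f, φ_j⟩ by an empirical mean and keeps a feature only when its confidence interval excludes zero:

```python
import numpy as np
from scipy import stats

def select_features(x, basis_fns, alpha=0.05):
    """Keep basis functions whose coefficient CI excludes zero (simplified sketch).

    The coefficient <f, phi_j> = E[phi_j(X)] is estimated by an empirical mean;
    a normal-approximation confidence interval is built for each coefficient,
    and the feature is retained only when the interval excludes zero.
    """
    n = len(x)
    z = stats.norm.ppf(1.0 - alpha / (2 * len(basis_fns)))  # Bonferroni correction
    selected = []
    for j, phi in enumerate(basis_fns):
        vals = phi(x)
        mean, sd = vals.mean(), vals.std(ddof=1)
        half_width = z * sd / np.sqrt(n)
        if abs(mean) > half_width:  # CI excludes zero
            selected.append((j, mean))
    return selected

# Toy usage on [0, 1] with a small cosine basis (illustrative choice).
rng = np.random.default_rng(2)
x = rng.beta(2.0, 5.0, size=2000)  # some density on [0, 1]
basis = [lambda t, k=k: np.sqrt(2) * np.cos(np.pi * k * t) for k in range(1, 11)]
kept = select_features(x, basis)
print("retained coefficients:", [(j + 1, round(c, 3)) for j, c in kept])
```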

Density smoothness estimation problem using a wavelet approach

Karol Dziedziul, Bogdan Ćmiel (2014)

ESAIM: Probability and Statistics

In this paper we consider a smoothness parameter estimation problem for a density function. The smoothness parameter of a function is defined in terms of Besov spaces. This paper is an extension of recent results (K. Dziedziul, M. Kucharska, B. Wolnik, Estimation of the smoothness parameter). The construction of the estimator is based on wavelet coefficients. Although we believe that the effective estimation of the smoothness parameter is impossible in the general case, we can show that it becomes...
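The underlying idea can be sketched as follows (an illustration only, not the estimator of the paper; the Haar basis, the resolution levels, and the least-squares fit of the decay rate are assumptions): empirical wavelet coefficients are computed level by level, and the smoothness parameter is read off the decay rate of the level-wise coefficient norms.

```python
import numpy as np

def level_coefficients(sample, j):
    """Empirical Haar wavelet coefficients beta_{jk} = mean of psi_{jk}(X_i) at level j."""
    y = np.clip(sample, 0.0, 1.0 - 1e-12) * 2 ** j
    k = np.floor(y).astype(int)
    signs = np.where(y - k < 0.5, 1.0, -1.0)  # Haar: +1 on the left half, -1 on the right
    sums = np.bincount(k, weights=signs, minlength=2 ** j)
    return 2 ** (j / 2) * sums / sample.size

def smoothness_estimate(sample, j_min=2, j_max=5):
    """Besov-type smoothness from the decay of level-wise coefficient norms (sketch).

    If ||beta_j||_2 decays like 2^{-j s}, then s is minus the slope of
    log2 ||beta_j||_2 regressed on the resolution level j.
    """
    levels = np.arange(j_min, j_max + 1)
    lognorms = [np.log2(np.linalg.norm(level_coefficients(sample, j))) for j in levels]
    slope, _ = np.polyfit(levels, lognorms, deg=1)
    return -slope

# Toy usage: Beta(2, 2) sample on [0, 1]. With the Haar basis the detectable
# smoothness is limited, so the estimate comes out near 1 even for a smoother density.
rng = np.random.default_rng(3)
sample = rng.beta(2.0, 2.0, size=500_000)
print("estimated smoothness:", round(smoothness_estimate(sample), 2))
```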

Dependent Lindeberg central limit theorem and some applications

Jean-Marc Bardet, Paul Doukhan, Gabriel Lang, Nicolas Ragache (2008)

ESAIM: Probability and Statistics

In this paper, a very useful lemma (in two versions) is proved: it notably simplifies the essential step in establishing a Lindeberg central limit theorem for dependent processes. Then, applying this lemma to the weakly dependent processes introduced in Doukhan and Louhichi (1999), a new central limit theorem is obtained for the sample mean and the kernel density estimator. Moreover, by using subsampling, extensions of these central limit theorems under weaker assumptions are provided. All the usual causal...
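A small simulation can illustrate the kind of dependent central limit theorem targeted here (a numerical sketch, not the lemma itself; the AR(1) model and its long-run variance formula are standard facts used only for illustration): the sample mean of a weakly dependent sequence is asymptotically normal, with the long-run variance replacing the marginal variance.

```python
import numpy as np

rng = np.random.default_rng(4)
n, phi, sigma_eps, n_rep = 2000, 0.5, 1.0, 500

# Simulate n_rep independent paths of a causal AR(1) process (a weakly dependent sequence).
eps = rng.normal(0.0, sigma_eps, size=(n_rep, n))
x = np.empty((n_rep, n))
x[:, 0] = rng.normal(0.0, sigma_eps / np.sqrt(1 - phi ** 2), size=n_rep)  # stationary start
for t in range(1, n):
    x[:, t] = phi * x[:, t - 1] + eps[:, t]

# Monte Carlo distribution of sqrt(n) * (sample mean) across paths.
stats = np.sqrt(n) * x.mean(axis=1)

# The dependent CLT involves the long-run variance sum_k cov(X_0, X_k),
# which for this AR(1) equals sigma_eps^2 / (1 - phi)^2.
print("empirical sd of sqrt(n)*mean :", round(stats.std(ddof=1), 3))
print("theoretical long-run sd      :", round(sigma_eps / (1 - phi), 3))
```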
