Existence, Consistency and computer simulation for selected variants of minimum distance estimators

Václav Kůs; Domingo Morales; Jitka Hrabáková; Iva Frýdlová

Kybernetika (2018)

  • Volume: 54, Issue: 2, page 336-350
  • ISSN: 0023-5954

Abstract

The paper deals with sufficient conditions for the existence of general approximate minimum distance estimator (AMDE) of a probability density function $f_0$ on the real line. It shows that the AMDE always exists when the bounded $\phi$-divergence, Kolmogorov, Lévy, Cramér, or discrepancy distance is used. Consequently, $n^{-1/2}$ consistency rate in any bounded $\phi$-divergence is established for Kolmogorov, Lévy, and discrepancy estimators under the condition that the degree of variations of the corresponding family of densities is finite. A simulation experiment empirically studies the performance of the approximate minimum Kolmogorov estimator (AMKE) and some histogram-based variants of approximate minimum divergence estimators, like power type and Le Cam, under six distributions (Uniform, Normal, Logistic, Laplace, Cauchy, Weibull). A comparison with the standard estimators (moment/maximum likelihood/median) is provided for sample sizes $n = 10, 20, 50, 120, 250$. The simulation analyzes the behaviour of estimators through different families of distributions. It is shown that the performance of AMKE differs from the other estimators with respect to family type and that the AMKE estimators cope more easily with the Cauchy distribution than standard or divergence based estimators, especially for small sample sizes.
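The minimum Kolmogorov distance idea studied in the paper can be sketched in code. The following is an illustrative Python sketch, not the authors' exact AMKE procedure: it fits a Normal(mu, sigma) model by numerically minimizing the Kolmogorov distance sup_x |F_n(x) - F_theta(x)| between the empirical and model CDFs. The Normal family, the log-sigma parametrization, and the Nelder-Mead optimizer are all assumptions made for the example.

```python
# Illustrative minimum Kolmogorov distance estimator for a Normal model
# (a sketch; the paper's AMKE is defined more generally).
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def kolmogorov_distance(theta, x_sorted):
    """Kolmogorov distance between the empirical CDF of x_sorted and the
    Normal(mu, sigma) CDF; sigma enters via log-sigma so the optimization
    is unconstrained."""
    mu, log_sigma = theta
    f = norm.cdf(x_sorted, loc=mu, scale=np.exp(log_sigma))
    n = len(x_sorted)
    upper = np.arange(1, n + 1) / n   # empirical CDF just after each point
    lower = np.arange(0, n) / n       # empirical CDF just before each point
    return max(np.max(np.abs(f - upper)), np.max(np.abs(f - lower)))

def min_kolmogorov_normal(x):
    """Approximate minimum Kolmogorov estimate of (mu, sigma)."""
    x = np.sort(np.asarray(x, dtype=float))
    start = np.array([np.median(x), np.log(x.std() + 1e-12)])
    res = minimize(kolmogorov_distance, start, args=(x,),
                   method="Nelder-Mead")
    mu_hat, log_sigma_hat = res.x
    return mu_hat, np.exp(log_sigma_hat)

rng = np.random.default_rng(0)
sample = rng.normal(loc=2.0, scale=1.5, size=120)
mu_hat, sigma_hat = min_kolmogorov_normal(sample)
```

Because the Kolmogorov distance depends on the data only through the ranks of the sample, estimators of this type are robust to heavy tails, which is consistent with the paper's observation that AMKE copes well with the Cauchy distribution.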

How to cite


Kůs, Václav, et al. "Existence, Consistency and computer simulation for selected variants of minimum distance estimators." Kybernetika 54.2 (2018): 336-350. <http://eudml.org/doc/294169>.

@article{Kůs2018,
abstract = {The paper deals with sufficient conditions for the existence of general approximate minimum distance estimator (AMDE) of a probability density function $f_0$ on the real line. It shows that the AMDE always exists when the bounded $\phi $-divergence, Kolmogorov, Lévy, Cramér, or discrepancy distance is used. Consequently, $n^\{-1/2\}$ consistency rate in any bounded $\phi $-divergence is established for Kolmogorov, Lévy, and discrepancy estimators under the condition that the degree of variations of the corresponding family of densities is finite. A simulation experiment empirically studies the performance of the approximate minimum Kolmogorov estimator (AMKE) and some histogram-based variants of approximate minimum divergence estimators, like power type and Le Cam, under six distributions (Uniform, Normal, Logistic, Laplace, Cauchy, Weibull). A comparison with the standard estimators (moment/maximum likelihood/median) is provided for sample sizes $n=10,20,50,120,250$. The simulation analyzes the behaviour of estimators through different families of distributions. It is shown that the performance of AMKE differs from the other estimators with respect to family type and that the AMKE estimators cope more easily with the Cauchy distribution than standard or divergence based estimators, especially for small sample sizes.},
author = {Kůs, Václav and Morales, Domingo and Hrabáková, Jitka and Frýdlová, Iva},
journal = {Kybernetika},
keywords = {Kolmogorov distance; $\phi $-divergence; minimum distance estimator; consistency rate; computer simulation},
language = {eng},
number = {2},
pages = {336-350},
publisher = {Institute of Information Theory and Automation AS CR},
title = {Existence, Consistency and computer simulation for selected variants of minimum distance estimators},
url = {http://eudml.org/doc/294169},
volume = {54},
year = {2018},
}

TY - JOUR
AU - Kůs, Václav
AU - Morales, Domingo
AU - Hrabáková, Jitka
AU - Frýdlová, Iva
TI - Existence, Consistency and computer simulation for selected variants of minimum distance estimators
JO - Kybernetika
PY - 2018
PB - Institute of Information Theory and Automation AS CR
VL - 54
IS - 2
SP - 336
EP - 350
AB - The paper deals with sufficient conditions for the existence of general approximate minimum distance estimator (AMDE) of a probability density function $f_0$ on the real line. It shows that the AMDE always exists when the bounded $\phi $-divergence, Kolmogorov, Lévy, Cramér, or discrepancy distance is used. Consequently, $n^{-1/2}$ consistency rate in any bounded $\phi $-divergence is established for Kolmogorov, Lévy, and discrepancy estimators under the condition that the degree of variations of the corresponding family of densities is finite. A simulation experiment empirically studies the performance of the approximate minimum Kolmogorov estimator (AMKE) and some histogram-based variants of approximate minimum divergence estimators, like power type and Le Cam, under six distributions (Uniform, Normal, Logistic, Laplace, Cauchy, Weibull). A comparison with the standard estimators (moment/maximum likelihood/median) is provided for sample sizes $n=10,20,50,120,250$. The simulation analyzes the behaviour of estimators through different families of distributions. It is shown that the performance of AMKE differs from the other estimators with respect to family type and that the AMKE estimators cope more easily with the Cauchy distribution than standard or divergence based estimators, especially for small sample sizes.
LA - eng
KW - Kolmogorov distance; $\phi $-divergence; minimum distance estimator; consistency rate; computer simulation
UR - http://eudml.org/doc/294169
ER -

References

  1. Al Mohamad, D., 10.1007/s00362-016-0812-5, Statistical Papers (published online 2016). DOI10.1007/s00362-016-0812-5
  2. Barron, A. R., The convergence in information of probability density estimators., In: IEEE Int. Symp. Information Theory, Kobe 1988. 
  3. Beran, R., 10.1214/aos/1176343842, Ann. Statist. 5 (1977), 455-463. MR0448700DOI10.1214/aos/1176343842
  4. Berger, A., 10.1214/aoms/1177729701, Ann. Math. Statist. 22 (1951), 119-120. MR0039789DOI10.1214/aoms/1177729701
  5. Broniatowski, M., Toma, A., Vajda, I., 10.1016/j.jspi.2012.03.019, J. Statist. Plann. Inference. 142 (2012), 9, 2574-2585. MR2922007DOI10.1016/j.jspi.2012.03.019
  6. Csiszár, I., Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten., Publ. Math. Inst. Hungar. Acad. Sci., Ser. A 8 (1963), 84-108. MR0164374
  7. Csiszár, I., Information-type measures of difference of probability distributions and indirect observations., Studia Sci. Math. Hungar. 2 (1967), 299-318. MR0219345
  8. Frýdlová, I., Vajda, I., Kůs, V., Modified power divergence estimators in normal model - simulation and comparative study., Kybernetika 48 (2012), 4, 795-808. MR3013399
  9. Gibbs, A. L., Su, F. E., 10.1111/j.1751-5823.2002.tb00178.x, Int. Statist. Rev. 70 (2002), 419-435. DOI10.1111/j.1751-5823.2002.tb00178.x
  10. Győrfi, L., Vajda, I., Meulen, E. C. van der, Family of point estimates yielded by $L_1$-consistent density estimate., In: L1-Statistical Analysis and Related Methods (Y. Dodge, ed.), Elsevier, Amsterdam 1992, pp. 415-430. MR1214843
  11. Győrfi, L., Vajda, I., Meulen, E. C. van der, Minimum Hellinger distance point estimates consistent under weak family regularity., Math. Methods Statist. 3 (1994), 25-45. MR1272629
  12. Győrfi, L., Vajda, I., Meulen, E. C. van der, 10.1007/bf02613911, Metrika 43 (1996), 237-255. MR1394805DOI10.1007/bf02613911
  13. Hrabáková, J., Kůs, V., 10.1080/03610926.2013.802806, Comm. Statist. - Theory and Methods 42 (2013), 20, 3665-3677. MR3170957DOI10.1080/03610926.2013.802806
  14. Hrabáková, J., Kůs, V., 10.1007/s00184-016-0601-0, Metrika 80 (2017), 243-257. MR3597584DOI10.1007/s00184-016-0601-0
  15. Kafka, P., Österreicher, F., Vincze, I., On powers of f-divergences defining a distance., Studia Sci. Mathem. Hungarica 26 (1991), 415-422. MR1197090
  16. Kůs, V., Blended φ-divergences with examples., Kybernetika 39 (2003), 43-54. MR1980123
  17. Kůs, V., 10.1007/s001840300286, Metrika 60 (2004), 1-14. MR2100162DOI10.1007/s001840300286
  18. Kůs, V., Morales, D., Vajda, I., Extensions of the parametric families of divergences used in statistical inference., Kybernetika 44 (2008), 1, 95-112. MR2405058
  19. Le Cam, L., 10.1007/978-1-4612-4946-7, Springer, New York 1986. MR0856411DOI10.1007/978-1-4612-4946-7
  20. Liese, F., Vajda, I., Convex Statistical Distances., Teubner, Leipzig 1987. MR0926905
  21. Liese, F., Vajda, I., 10.1109/tit.2006.881731, IEEE Trans. Inform. Theory 52 (2006), 4394-4412. MR2300826DOI10.1109/tit.2006.881731
  22. Matusita, K., 10.1007/bf02868578, Ann. Inst. Statist. Math. 16 (1964), 305-315. MR0172419DOI10.1007/bf02868578
  23. Österreicher, F., On a class of perimeter-type distances of probability distributions., Kybernetika 32 (1996), 4, 389-393. MR1420130
  24. Pardo, L., 10.1201/9781420034813, Chapman and Hall, Boston 2006. MR2183173DOI10.1201/9781420034813
  25. Pfanzagl, J., 10.1515/9783110889765, W. de Gruyter, Berlin 1994. MR1291393DOI10.1515/9783110889765
  26. Vajda, I., Theory of Statistical Inference and Information., Kluwer, Boston 1989. 
