Minimum disparity estimators for discrete and continuous models

María Luisa Menéndez; Domingo Morales; Leandro Pardo; Igor Vajda

Applications of Mathematics (2001)

  • Volume: 46, Issue: 6, page 439-466
  • ISSN: 0862-7940

Abstract

Disparities of discrete distributions are introduced as a natural and useful extension of the information-theoretic divergences. The minimum disparity point estimators are studied in regular discrete models with i.i.d. observations and their asymptotic efficiency of the first order, in the sense of Rao, is proved. These estimators are applied to continuous models with i.i.d. observations when the observation space is quantized by fixed points, or at random, by the sample quantiles of fixed orders. It is shown that the random quantization leads to estimators which are robust in the sense of Lindsay [9], and which can achieve the efficiency in the underlying continuous models provided these are regular enough.
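The minimum disparity idea described above can be sketched concretely. The following self-contained Python snippet is an illustration, not code from the paper: the parameter of a geometric model is estimated by minimizing the squared Hellinger distance — one member of Lindsay's family of disparities — between the empirical cell frequencies and the model probabilities. A plain grid search stands in for a proper numerical optimizer; the geometric model, sample size, and grid are all illustrative assumptions.

```python
import math
import random

def geom_pmf(k, theta):
    # Geometric model on {0, 1, 2, ...}: P(X = k) = (1 - theta) * theta**k
    return (1.0 - theta) * theta**k

def hellinger_disparity(freqs, theta, support):
    # Squared Hellinger distance between empirical frequencies and model
    # probabilities; zero cells contribute (0 - sqrt(p_theta(k)))**2.
    return 2.0 * sum(
        (math.sqrt(freqs.get(k, 0.0)) - math.sqrt(geom_pmf(k, theta))) ** 2
        for k in support
    )

def min_disparity_estimate(sample, grid):
    # Empirical relative frequencies of the observed cells
    n = len(sample)
    freqs = {}
    for x in sample:
        freqs[x] = freqs.get(x, 0.0) + 1.0 / n
    # Cover the observed support plus a tail of model cells
    support = range(max(sample) + 20)
    # Grid search in place of a real optimizer (illustrative shortcut)
    return min(grid, key=lambda t: hellinger_disparity(freqs, t, support))

random.seed(0)
theta_true = 0.6
# i.i.d. geometric(theta_true) variates by inversion: P(X >= k) = theta**k
sample = [int(math.log(1.0 - random.random()) / math.log(theta_true))
          for _ in range(2000)]
grid = [i / 1000 for i in range(1, 1000)]
theta_hat = min_disparity_estimate(sample, grid)
print(theta_hat)  # an estimate close to theta_true = 0.6
```

With 2000 observations the estimate lands near the true value 0.6, consistent with the first-order efficiency the paper establishes for regular discrete models; swapping `hellinger_disparity` for another disparity changes the robustness/efficiency trade-off discussed in the abstract.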

How to cite


Menéndez, María Luisa, et al. "Minimum disparity estimators for discrete and continuous models." Applications of Mathematics 46.6 (2001): 439-466. <http://eudml.org/doc/33096>.

@article{Menéndez2001,
abstract = {Disparities of discrete distributions are introduced as a natural and useful extension of the information-theoretic divergences. The minimum disparity point estimators are studied in regular discrete models with i.i.d. observations and their asymptotic efficiency of the first order, in the sense of Rao, is proved. These estimators are applied to continuous models with i.i.d. observations when the observation space is quantized by fixed points, or at random, by the sample quantiles of fixed orders. It is shown that the random quantization leads to estimators which are robust in the sense of Lindsay [9], and which can achieve the efficiency in the underlying continuous models provided these are regular enough.},
author = {Menéndez, María Luisa and Morales, Domingo and Pardo, Leandro and Vajda, Igor},
journal = {Applications of Mathematics},
keywords = {divergence; disparity; minimum disparity estimators; robustness; asymptotic efficiency},
language = {eng},
number = {6},
pages = {439-466},
publisher = {Institute of Mathematics, Academy of Sciences of the Czech Republic},
title = {Minimum disparity estimators for discrete and continuous models},
url = {http://eudml.org/doc/33096},
volume = {46},
year = {2001},
}

TY - JOUR
AU - Menéndez, María Luisa
AU - Morales, Domingo
AU - Pardo, Leandro
AU - Vajda, Igor
TI - Minimum disparity estimators for discrete and continuous models
JO - Applications of Mathematics
PY - 2001
PB - Institute of Mathematics, Academy of Sciences of the Czech Republic
VL - 46
IS - 6
SP - 439
EP - 466
AB - Disparities of discrete distributions are introduced as a natural and useful extension of the information-theoretic divergences. The minimum disparity point estimators are studied in regular discrete models with i.i.d. observations and their asymptotic efficiency of the first order, in the sense of Rao, is proved. These estimators are applied to continuous models with i.i.d. observations when the observation space is quantized by fixed points, or at random, by the sample quantiles of fixed orders. It is shown that the random quantization leads to estimators which are robust in the sense of Lindsay [9], and which can achieve the efficiency in the underlying continuous models provided these are regular enough.
LA - eng
KW - divergence; disparity; minimum disparity estimators; robustness; asymptotic efficiency
UR - http://eudml.org/doc/33096
ER -

References

  1. 10.1016/0167-7152(94)90236-4, Statist. Probab. Lett. 20 (1994), 69–73. (1994) MR1294806DOI10.1016/0167-7152(94)90236-4
  2. 10.1080/00949659408811609, J.  Statist. Comput. Simulation 50 (1994), 173–185. (1994) DOI10.1080/00949659408811609
  3. 10.1214/aoms/1177703581, Ann. Math. Statist. 35 (1964), 817–824. (1964) Zbl0259.62017MR0169324DOI10.1214/aoms/1177703581
  4. Goodness-of-fit using sample quantiles, J.  Roy. Statist. Soc. Ser.  B 35 (1973), 277–284. (1973) MR0336896
  5. Mathematical Methods of Statistics, Princeton University Press, Princeton, 1946. (1946) MR0016588
  6. Multinomial goodness-of-fit tests, J.  Roy. Statist. Soc. Ser.  B 46 (1984), 440–464. (1984) MR0790631
  7. Statistical Methods for Research Workers (8th edition), London, 1941. (1941) 
  8. Convex Statistical Distances, Teubner, Leipzig, 1987. (1987) MR0926905
  9. 10.1214/aos/1176325512, Ann. Statist. 22 (1994), 1081–1114. (1994) MR1292557DOI10.1214/aos/1176325512
  10. 10.1080/03610929808832117, Comm. Statist. Theory Methods 27 (1998), 609–633. (1998) MR1619038DOI10.1080/03610929808832117
  11. 10.1023/A:1012466605316, Ann. Inst. Statist. Math. 53 (2001), 277–288. (2001) MR1841136DOI10.1023/A:1012466605316
  12. 10.1016/0378-3758(95)00013-Y, J.  Statist. Plann. Inference 48 (1995), 347–369. (1995) MR1368984DOI10.1016/0378-3758(95)00013-Y
  13. Contribution to the theory of the χ² test, In: Proc. Berkeley Symp. Math. Statist. Probab., Berkeley, CA, Berkeley University Press, Berkeley, 1949, pp. 239–273. (1949) MR0028003
  14. 10.1080/03610919508813265, Comm. Statist. Simulation Comput. 24 (1995), 653–673. (1995) DOI10.1080/03610919508813265
  15. Asymptotic efficiency and limiting information, In: Proc. 4th Berkeley Symp. Math. Stat. Probab., Berkeley, CA, Berkeley University Press, Berkeley, 1961, pp. 531–545. (1961) Zbl0156.39802MR0133192
  16. Linear Statistical Inference and its Applications (2nd edition), Wiley, New York, 1973. (1973) MR0346957
  17. Goodness-of-fit Statistics for Discrete Multivariate Data, Springer-Verlag, New York, 1988. (1988) MR0955054
  18. On minimum discrepancy estimators, Sankhyā Ser. A 34 (1972), 133–144. (1972) Zbl0266.62021MR0331606
  19. χ²-divergence and generalized Fisher information, In: Transactions of the Sixth Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, Academia, Prague, 1973, pp. 223–234. (1973) Zbl0297.62003MR0356302
  20. Theory of Statistical Inference and Information, Kluwer Academic Publishers, Boston, 1989. (1989) Zbl0711.62002
  21. Mathematische Statistik, Springer-Verlag, Berlin, 1957. (1957) 
  22. On the criterion that a given system of deviations from the probable in the case of a correlated system of variables is such that it can be reasonably supposed to have arisen from random sampling, Philosophical Magazine 50 (1900), 157–172. (1900)
  23. Eine Informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten, Publications of the Mathematical Institute of the Hungarian Academy of Sciences, Series A 8 (1963), 85–108. (1963) MR0164374
  24. A general class of coefficients of divergence of one distribution from another, J.  Roy. Statist. Soc. Ser. B 28 (1966), 131–140. (1966) MR0196777
  25. On measures of entropy and information, In: Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, University of California Press, Berkeley, 1961, pp. 531–546. (1961) MR0132570
  26. Inequalities: Theory of Majorization and its Applications, Academic Press, New York, 1979. (1979) MR0552278
