Strong law of large numbers for additive extremum estimators

João Tiago Mexia; Pedro Corte Real

Discussiones Mathematicae Probability and Statistics (2001)

  • Volume: 21, Issue: 2, pages 81-88
  • ISSN: 1509-9423

Abstract

Extremum estimators are obtained by maximizing or minimizing a function of the sample and of the parameters with respect to the parameters. When the function to be maximized or minimized is a sum of subfunctions, each depending on a single observation, the extremum estimators are called additive. Maximum likelihood estimators are additive extremum estimators whenever the observations are independent. Another instance of additive extremum estimators is given by the least squares estimators for multiple regression when the usual assumptions hold. A strong law of large numbers is derived for additive extremum estimators. This law requires only the existence of first-order moments and may be of interest in connection with maximum likelihood estimators, since the usual assumption that the observations are identically distributed is dropped.
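
A minimal sketch of the setup described above, in notation that is ours rather than the paper's: an additive extremum estimator maximizes (or, after a sign change, minimizes) an objective that is a sum of per-observation terms,

    \hat{\theta}_n \in \operatorname*{arg\,max}_{\theta \in \Theta} Q_n(\theta),
    \qquad
    Q_n(\theta) = \sum_{i=1}^{n} g_i(X_i, \theta).

With independent observations, the maximum likelihood estimator is of this form with g_i(x, \theta) = \log f_i(x; \theta), the log-density of the i-th observation; the least squares estimator for multiple regression corresponds to g_i((x_i, Y_i), \beta) = -(Y_i - x_i^{\top} \beta)^2, so that maximizing Q_n minimizes the residual sum of squares.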

How to cite

João Tiago Mexia, and Pedro Corte Real. "Strong law of large numbers for additive extremum estimators." Discussiones Mathematicae Probability and Statistics 21.2 (2001): 81-88. <http://eudml.org/doc/287700>.

@article{JoãoTiagoMexia2001,
abstract = {Extremum estimators are obtained by maximizing or minimizing a function of the sample and of the parameters relatively to the parameters. When the function to maximize or minimize is the sum of subfunctions each depending on one observation, the extremum estimators are additive. Maximum likelihood estimators are extremum additive whenever the observations are independent. Another instance of additive extremum estimators are the least squares estimators for multiple regressions when the usual assumptions hold. A strong law of large numbers is derived for additive extremum estimators. This law requires only the existence of first order moments and may be of interest in connection with maximum likelihood estimators, since the usual assumption that the observations are identically distributed is discarded.},
author = {João Tiago Mexia, Pedro Corte Real},
journal = {Discussiones Mathematicae Probability and Statistics},
keywords = {Kolmogorov's strong law of large numbers; multiple regression; almost sure convergence; additive extremum estimators; regression; strong consistency},
language = {eng},
number = {2},
pages = {81-88},
title = {Strong law of large numbers for additive extremum estimators},
url = {http://eudml.org/doc/287700},
volume = {21},
year = {2001},
}

TY - JOUR
AU - João Tiago Mexia
AU - Pedro Corte Real
TI - Strong law of large numbers for additive extremum estimators
JO - Discussiones Mathematicae Probability and Statistics
PY - 2001
VL - 21
IS - 2
SP - 81
EP - 88
AB - Extremum estimators are obtained by maximizing or minimizing a function of the sample and of the parameters relatively to the parameters. When the function to maximize or minimize is the sum of subfunctions each depending on one observation, the extremum estimators are additive. Maximum likelihood estimators are extremum additive whenever the observations are independent. Another instance of additive extremum estimators are the least squares estimators for multiple regressions when the usual assumptions hold. A strong law of large numbers is derived for additive extremum estimators. This law requires only the existence of first order moments and may be of interest in connection with maximum likelihood estimators, since the usual assumption that the observations are identically distributed is discarded.
LA - eng
KW - Kolmogorov's strong law of large numbers; multiple regression; almost sure convergence; additive extremum estimators; regression; strong consistency
UR - http://eudml.org/doc/287700
ER -

References

[1] N. Bac Van, Strong convergence of least squares estimates in polynomial regression with random explanatory variables, Acta Mathematica Vietnamica 23 (2) (1998), 195-205. Zbl1054.62522
[2] N. Bac Van, Strong convergence of least squares estimates in polynomial regression with random explanatory variables, Acta Mathematica Vietnamica 19 (1) (1994), 111-137.
[3] J. Galambos, Advanced Probability Theory, Marcel Dekker 1988. Zbl0681.60005
[4] J.T. Mexia and P.C. Real, Extension of Kolmogorov's strong law to multiple regression, 23rd European Meeting of Statisticians, Funchal (Madeira Island), August 13-18, 2001, Revista de Estatística, 2.º Quadrimestre de 2001, 277-278.
[5] D. Williams, Probability with Martingales, Cambridge Mathematical Textbooks 1991. Zbl0722.60001
[6] S. Zacks, The Theory of Statistical Inference, John Wiley 1971.
