Scaling of model approximation errors and expected entropy distances

Guido F. Montúfar; Johannes Rauh

Kybernetika (2014)

  • Volume: 50, Issue: 2, pages 234-245
  • ISSN: 0023-5954

Abstract

We compute the expected value of the Kullback-Leibler divergence of various fundamental statistical models with respect to Dirichlet priors. For the uniform prior, the expected divergence of any model containing the uniform distribution is bounded by the constant 1 - γ, where γ is the Euler-Mascheroni constant. For the models that we consider, this bound is approached as the cardinality of the sample space tends to infinity, provided the model dimension remains relatively small. For Dirichlet priors with reasonable concentration parameters, the expected divergences behave in a similar way. These results serve as a reference for ranking the approximation capabilities of other statistical models.
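
As a quick numerical illustration of the limiting constant (a sketch of ours, not code from the paper; the sample-space size N and the Monte Carlo sample count M are arbitrary choices): under the uniform prior, a random distribution p on N outcomes satisfies D(p||u) = log N - H(p) with respect to the uniform distribution u, and the expectation of this divergence equals log N - ψ(N+1) + ψ(2), which tends to 1 - γ as N → ∞.

    import numpy as np
    from scipy.special import digamma, xlogy

    # Monte Carlo check (assumed setup, not from the paper): sample p from the
    # uniform prior Dirichlet(1, ..., 1) on the simplex over N outcomes and
    # average D(p || u) = log N - H(p); compare with the exact expectation
    # log N - psi(N + 1) + psi(2) and with the limit 1 - gamma.
    rng = np.random.default_rng(0)
    N = 1000                                   # cardinality of the sample space
    M = 5000                                   # number of Monte Carlo samples

    p = rng.dirichlet(np.ones(N), size=M)      # each row is a random distribution
    kl = np.log(N) + xlogy(p, p).sum(axis=1)   # D(p || u), with 0 log 0 = 0

    exact = np.log(N) - digamma(N + 1) + digamma(2)
    print(f"Monte Carlo estimate: {kl.mean():.4f}")
    print(f"Exact expectation:    {exact:.4f}")
    print(f"Limit 1 - gamma:      {1.0 - np.euler_gamma:.4f}")

Already for N = 1000 the exact expectation is about 0.4223, within 0.0005 of 1 - γ ≈ 0.4228, consistent with the claim that the bound is approached as the sample space grows.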

How to cite


Montúfar, Guido F., and Rauh, Johannes. "Scaling of model approximation errors and expected entropy distances." Kybernetika 50.2 (2014): 234-245. <http://eudml.org/doc/261861>.

@article{Montúfar2014,
abstract = {We compute the expected value of the Kullback-Leibler divergence of various fundamental statistical models with respect to Dirichlet priors. For the uniform prior, the expected divergence of any model containing the uniform distribution is bounded by a constant $1-\gamma $. For the models that we consider this bound is approached as the cardinality of the sample space tends to infinity, if the model dimension remains relatively small. For Dirichlet priors with reasonable concentration parameters the expected values of the divergence behave in a similar way. These results serve as a reference to rank the approximation capabilities of other statistical models.},
author = {Montúfar, Guido F. and Rauh, Johannes},
journal = {Kybernetika},
keywords = {exponential families; KL divergence; MLE; Dirichlet prior},
language = {eng},
number = {2},
pages = {234-245},
publisher = {Institute of Information Theory and Automation AS CR},
title = {Scaling of model approximation errors and expected entropy distances},
url = {http://eudml.org/doc/261861},
volume = {50},
year = {2014},
}

TY - JOUR
AU - Montúfar, Guido F.
AU - Rauh, Johannes
TI - Scaling of model approximation errors and expected entropy distances
JO - Kybernetika
PY - 2014
PB - Institute of Information Theory and Automation AS CR
VL - 50
IS - 2
SP - 234
EP - 245
AB - We compute the expected value of the Kullback-Leibler divergence of various fundamental statistical models with respect to Dirichlet priors. For the uniform prior, the expected divergence of any model containing the uniform distribution is bounded by a constant $1-\gamma $. For the models that we consider this bound is approached as the cardinality of the sample space tends to infinity, if the model dimension remains relatively small. For Dirichlet priors with reasonable concentration parameters the expected values of the divergence behave in a similar way. These results serve as a reference to rank the approximation capabilities of other statistical models.
LA - eng
KW - exponential families; KL divergence; MLE; Dirichlet prior
UR - http://eudml.org/doc/261861
ER -

References

  1. Ay, N., An information-geometric approach to a theory of pragmatic structuring., Ann. Probab. 30 (2002), 416-436. Zbl 1010.62007, MR 1894113, DOI 10.1214/aop/1020107773
  2. Drton, M., Sturmfels, B., Sullivant, S., Lectures on Algebraic Statistics., Birkhäuser, Basel 2009. Zbl 1166.13001, MR 2723140
  3. Frigyik, B. A., Kapila, A., Gupta, M. R., Introduction to the Dirichlet Distribution and Related Processes., Technical Report, Department of Electrical Engineering, University of Washington, 2010. 
  4. Matúš, F., Ay, N., On maximization of the information divergence from an exponential family., In: Proc. WUPES'03, University of Economics, Prague 2003, pp. 199-204. 
  5. Matúš, F., Rauh, J., Maximization of the information divergence from an exponential family and criticality., In: Proc. ISIT, St. Petersburg 2011, pp. 903-907. 
  6. Montúfar, G., Rauh, J., Ay, N., Expressive power and approximation errors of restricted Boltzmann machines., In: Advances in NIPS 24, MIT Press, Cambridge 2011, pp. 415-423. 
  7. Nemenman, I., Shafee, F., Bialek, W., Entropy and inference, revisited., In: Advances in NIPS 14, MIT Press, Cambridge 2001, pp. 471-478. 
  8. Rauh, J., Finding the Maximizers of the Information Divergence from an Exponential Family., Ph.D. Thesis, Universität Leipzig 2011. MR 2817016
  9. Rauh, J., Optimally approximating exponential families., Kybernetika 49 (2013), 199-215. Zbl 1283.94027, MR 3085392
  10. Wolpert, D., Wolf, D., Estimating functions of probability distributions from a finite set of samples., Phys. Rev. E 52 (1995), 6841-6854. MR 1384746, DOI 10.1103/PhysRevE.52.6841
