Optimality conditions for maximizers of the information divergence from an exponential family
Kybernetika (2007)
- Volume: 43, Issue: 5, page 731-746
- ISSN: 0023-5954
How to cite
Matúš, František. "Optimality conditions for maximizers of the information divergence from an exponential family." Kybernetika 43.5 (2007): 731-746. <http://eudml.org/doc/33891>.
@article{Matúš2007,
abstract = {The information divergence of a probability measure $P$ from an exponential family $\mathcal{E}$ over a finite set is defined as the infimum of the divergences of $P$ from $Q$ subject to $Q\in \mathcal{E}$. All directional derivatives of the divergence from $\mathcal{E}$ are explicitly found. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. The first order conditions for $P$ to be a maximizer of the divergence from $\mathcal{E}$ are presented, including new ones when $P$ is not projectable to $\mathcal{E}$.},
author = {Matúš, František},
journal = {Kybernetika},
keywords = {Kullback–Leibler divergence; relative entropy; exponential family; information projection; log-Laplace transform; cumulant generating function; directional derivatives; first order optimality conditions; convex functions; polytopes},
language = {eng},
number = {5},
pages = {731-746},
publisher = {Institute of Information Theory and Automation AS CR},
title = {Optimality conditions for maximizers of the information divergence from an exponential family},
url = {http://eudml.org/doc/33891},
volume = {43},
year = {2007},
}
TY - JOUR
AU - Matúš, František
TI - Optimality conditions for maximizers of the information divergence from an exponential family
JO - Kybernetika
PY - 2007
PB - Institute of Information Theory and Automation AS CR
VL - 43
IS - 5
SP - 731
EP - 746
AB - The information divergence of a probability measure $P$ from an exponential family $\mathcal{E}$ over a finite set is defined as the infimum of the divergences of $P$ from $Q$ subject to $Q\in \mathcal{E}$. All directional derivatives of the divergence from $\mathcal{E}$ are explicitly found. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. The first order conditions for $P$ to be a maximizer of the divergence from $\mathcal{E}$ are presented, including new ones when $P$ is not projectable to $\mathcal{E}$.
LA - eng
KW - Kullback–Leibler divergence; relative entropy; exponential family; information projection; log-Laplace transform; cumulant generating function; directional derivatives; first order optimality conditions; convex functions; polytopes
UR - http://eudml.org/doc/33891
ER -
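The abstract's central object, the divergence $D(P\Vert\mathcal{E}) = \inf_{Q\in\mathcal{E}} D(P\Vert Q)$, can be illustrated numerically in the simplest case. The sketch below assumes the independence model on a 2x2 table as the exponential family $\mathcal{E}$; for that family the rI-projection of $P$ is the product of its marginals, so the divergence equals the mutual information. The function name is illustrative, not from the paper.

```python
import numpy as np

def divergence_from_independence(P):
    """D(P || E) where E is the independence model on a 2x2 table.

    For this exponential family the rI-projection of P is the product
    of its marginals, so the divergence is the mutual information of P.
    """
    P = np.asarray(P, dtype=float)
    # Product of the two marginal distributions (the rI-projection of P).
    Q = np.outer(P.sum(axis=1), P.sum(axis=0))
    mask = P > 0  # 0 log 0 = 0 by convention
    return float(np.sum(P[mask] * np.log(P[mask] / Q[mask])))

# A product measure lies in E, so its divergence from E is zero.
P_prod = np.array([[0.25, 0.25], [0.25, 0.25]])

# The divergence is maximized, at log 2, by measures concentrated on a
# "diagonal", e.g. P(0,0) = P(1,1) = 1/2 (perfect correlation).
P_max = np.array([[0.5, 0.0], [0.0, 0.5]])
print(divergence_from_independence(P_max))  # log 2 ≈ 0.6931
```

Such diagonal maximizers are supported on a face of the probability simplex, which is why the paper's first order conditions must also cover measures $P$ that are not projectable to $\mathcal{E}$ itself but only to its closure.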
References
- Ay N., An information-geometric approach to a theory of pragmatic structuring, Ann. Probab. 30 (2002), 416–436 Zbl1010.62007 MR1894113
- Ay N., Locality of global stochastic interaction in directed acyclic networks, Neural Computation 14 (2002), 2959–2980 Zbl1079.68582
- Ay N., Knauf A., Maximizing multi-information, Kybernetika 42 (2006), 517–538 MR2283503
- Ay N., Wennekers T., Dynamical properties of strongly interacting Markov chains, Neural Networks 16 (2003), 1483–1497
- Barndorff-Nielsen O., Information and Exponential Families in Statistical Theory, Wiley, New York 1978 Zbl0387.62011 MR0489333
- Brown L. D., Fundamentals of Statistical Exponential Families, (Lecture Notes – Monograph Series 9.) Institute of Mathematical Statistics, Hayward, CA 1986 Zbl0685.62002 MR0882001
- Csiszár I., Matúš F., Information projections revisited, IEEE Trans. Inform. Theory 49 (2003), 1474–1490 Zbl1063.94016 MR1984936
- Csiszár I., Matúš F., Closures of exponential families, Ann. Probab. 33 (2005), 582–600 Zbl1068.60008 MR2123202
- Csiszár I., Matúš F., Generalized maximum likelihood estimates for exponential families, To appear in Probab. Theory Related Fields (2008) Zbl1133.62039 MR2372970
- Della Pietra S., Della Pietra V., Lafferty J., Inducing features of random fields, IEEE Trans. Pattern Anal. Mach. Intell. 19 (1997), 380–393
- Letac G., Lectures on Natural Exponential Families and their Variance Functions, (Monografias de Matemática 50.) Instituto de Matemática Pura e Aplicada, Rio de Janeiro 1992 Zbl0983.62501 MR1182991
- Matúš F., Maximization of information divergences from binary i.i.d. sequences, In: Proc. IPMU 2004, Perugia 2004, Vol. 2, pp. 1303–1306
- Matúš F., Ay N., On maximization of the information divergence from an exponential family, In: Proc. WUPES’03 (J. Vejnarová, ed.), University of Economics, Prague 2003, pp. 199–204
- Rockafellar R. T., Convex Analysis, Princeton University Press, Princeton, N.J. 1970 MR0274683
- Wennekers T., Ay N., Finite state automata resulting from temporal information maximization, Theory in Biosciences 122 (2003), 5–18 Zbl1090.68064