Information Theory in Mathematical Statistics

Tadeusz Inglot

Mathematica Applicanda (2014)

  • Volume: 42, Issue: 1
  • ISSN: 1730-2668

Abstract

In this paper we present an outline of information theory from the probabilistic and statistical point of view. This direction of information theory has been intensively developed in recent decades and has significantly influenced progress in statistical methodology. The aim of the article is to introduce the reader to these problems, provide some intuition, and present a specifically information-theoretic approach to mathematical statistics.

The first part of the paper is devoted to a brief and accessible introduction to the main notions of information theory, such as entropy, relative entropy (the Kullback-Leibler distance), information projection, and Fisher information, together with a presentation of their most important properties, including de Bruijn's identity, Fisher information inequalities, and entropy power inequalities.

In the short second part we apply the notions and results of the first part to limit theorems of probability theory, such as the asymptotic equipartition property, the convergence of empirical measures in the entropy distance, the large deviation principle with emphasis on Sanov's theorem, the convergence of distributions of homogeneous Markov chains in the entropy distance, and the central limit theorem.

The main, final part of the article presents some of the most significant applications of information theory to mathematical statistics. We discuss the connections of maximum likelihood estimators with information projections, and the notion of a sufficient statistic from the information-theoretic point of view. The problems of source coding, channel capacity, and the amount of information provided by statistical experiments are presented in a statistical framework. Some attention is paid to the expansion of Clarke and Barron and its corollaries, e.g. in density estimation. Next, applications of information theory to hypothesis testing are discussed. We state the classical Stein's lemma and its generalization to testing composite hypotheses, obtained by Bahadur, and show their connections with the asymptotic efficiency of statistical tests. Finally, we briefly discuss information criteria for model selection, including the most popular one, the two-stage minimum description length criterion of Rissanen. The bibliography is limited to papers and books referred to in the text.
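For orientation, the central quantities named in the abstract have the following standard textbook forms (these are the usual definitions, given here in common conventions such as those of Cover and Thomas; they are not excerpts from the paper, whose notation may differ). For a density f on the real line,

\[
H(f) = -\int f(x)\,\log f(x)\,dx \qquad \text{(differential entropy)},
\]
\[
K(f\,\|\,g) = \int f(x)\,\log\frac{f(x)}{g(x)}\,dx \qquad \text{(relative entropy, the Kullback-Leibler distance)},
\]
\[
I(f) = \int \frac{\bigl(f'(x)\bigr)^{2}}{f(x)}\,dx \qquad \text{(Fisher information of the location family generated by } f\text{)}.
\]

De Bruijn's identity links entropy and Fisher information along the Gaussian perturbation: if Z is standard normal and independent of X, then

\[
\frac{d}{dt}\,H\bigl(X+\sqrt{t}\,Z\bigr) = \tfrac{1}{2}\,I\bigl(X+\sqrt{t}\,Z\bigr),
\]

while the entropy power inequality states that for independent X and Y,

\[
N(X+Y) \ge N(X) + N(Y), \qquad N(X) = \frac{1}{2\pi e}\,e^{2H(X)}.
\]

Stein's lemma, in its classical form, asserts that when testing a simple hypothesis P_0 against a simple alternative P_1 from n i.i.d. observations, with the type I error held at most a fixed \varepsilon \in (0,1), the minimal type II error \beta_n satisfies

\[
\lim_{n\to\infty}\frac{1}{n}\log\beta_n = -K(P_0\,\|\,P_1).
\]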

How to cite


Tadeusz Inglot. "Information Theory in the mathematical Statistics." Mathematica Applicanda 42.1 (2014): null. <http://eudml.org/doc/293042>.

@article{TadeuszInglot2014,
abstract = {In this paper we present an outline of information theory from the probabilistic and statistical point of view. This direction of information theory has been intensively developed in recent decades and has significantly influenced progress in statistical methodology. The aim of the article is to introduce the reader to these problems, provide some intuition, and present a specifically information-theoretic approach to mathematical statistics. The first part of the paper is devoted to a brief and accessible introduction to the main notions of information theory, such as entropy, relative entropy (the Kullback-Leibler distance), information projection, and Fisher information, together with a presentation of their most important properties, including de Bruijn's identity, Fisher information inequalities, and entropy power inequalities. In the short second part we apply the notions and results of the first part to limit theorems of probability theory, such as the asymptotic equipartition property, the convergence of empirical measures in the entropy distance, the large deviation principle with emphasis on Sanov's theorem, the convergence of distributions of homogeneous Markov chains in the entropy distance, and the central limit theorem. The main, final part of the article presents some of the most significant applications of information theory to mathematical statistics. We discuss the connections of maximum likelihood estimators with information projections, and the notion of a sufficient statistic from the information-theoretic point of view. The problems of source coding, channel capacity, and the amount of information provided by statistical experiments are presented in a statistical framework. Some attention is paid to the expansion of Clarke and Barron and its corollaries, e.g. in density estimation. Next, applications of information theory to hypothesis testing are discussed. We state the classical Stein's lemma and its generalization to testing composite hypotheses, obtained by Bahadur, and show their connections with the asymptotic efficiency of statistical tests. Finally, we briefly discuss information criteria for model selection, including the most popular one, the two-stage minimum description length criterion of Rissanen. The bibliography is limited to papers and books referred to in the text.},
author = {Tadeusz Inglot},
journal = {Mathematica Applicanda},
keywords = {Entropy, Kullback-Leibler distance, Fisher information, entropy convergence, statistical model and source coding, Stein's Lemma, density estimation},
language = {eng},
number = {1},
pages = {null},
title = {Information Theory in Mathematical Statistics},
url = {http://eudml.org/doc/293042},
volume = {42},
year = {2014},
}

TY - JOUR
AU - Tadeusz Inglot
TI - Information Theory in Mathematical Statistics
JO - Mathematica Applicanda
PY - 2014
VL - 42
IS - 1
SP - null
AB - In this paper we present an outline of information theory from the probabilistic and statistical point of view. This direction of information theory has been intensively developed in recent decades and has significantly influenced progress in statistical methodology. The aim of the article is to introduce the reader to these problems, provide some intuition, and present a specifically information-theoretic approach to mathematical statistics. The first part of the paper is devoted to a brief and accessible introduction to the main notions of information theory, such as entropy, relative entropy (the Kullback-Leibler distance), information projection, and Fisher information, together with a presentation of their most important properties, including de Bruijn's identity, Fisher information inequalities, and entropy power inequalities. In the short second part we apply the notions and results of the first part to limit theorems of probability theory, such as the asymptotic equipartition property, the convergence of empirical measures in the entropy distance, the large deviation principle with emphasis on Sanov's theorem, the convergence of distributions of homogeneous Markov chains in the entropy distance, and the central limit theorem. The main, final part of the article presents some of the most significant applications of information theory to mathematical statistics. We discuss the connections of maximum likelihood estimators with information projections, and the notion of a sufficient statistic from the information-theoretic point of view. The problems of source coding, channel capacity, and the amount of information provided by statistical experiments are presented in a statistical framework. Some attention is paid to the expansion of Clarke and Barron and its corollaries, e.g. in density estimation. Next, applications of information theory to hypothesis testing are discussed. We state the classical Stein's lemma and its generalization to testing composite hypotheses, obtained by Bahadur, and show their connections with the asymptotic efficiency of statistical tests. Finally, we briefly discuss information criteria for model selection, including the most popular one, the two-stage minimum description length criterion of Rissanen. The bibliography is limited to papers and books referred to in the text.
LA - eng
KW - Entropy, Kullback-Leibler distance, Fisher information, entropy convergence, statistical model and source coding, Stein's Lemma, density estimation
UR - http://eudml.org/doc/293042
ER -
