Smoothness of Metropolis-Hastings algorithm and application to entropy estimation
Didier Chauveau; Pierre Vandekerkhove
ESAIM: Probability and Statistics (2013)
- Volume: 17, pages 419-431
- ISSN: 1292-8100
Abstract
How to cite
Chauveau, Didier, and Vandekerkhove, Pierre. "Smoothness of Metropolis-Hastings algorithm and application to entropy estimation." ESAIM: Probability and Statistics 17 (2013): 419-431. <http://eudml.org/doc/273622>.
@article{Chauveau2013,
abstract = {The transition kernel of the well-known Metropolis-Hastings (MH) algorithm has a point mass at the chain’s current position, which prevents smoothness properties from being derived directly for the successive marginal densities produced by this algorithm. We show here that under mild smoothness assumptions on the MH algorithm’s “input” densities (the initial, proposal and target distributions), propagation of a Lipschitz condition for the iterative densities can be proved. This allows us to build a consistent nonparametric estimate of the entropy of these iterative densities. This theoretical study can be viewed as a building block for a more general MCMC evaluation tool grounded on such estimates.},
author = {Chauveau, Didier and Vandekerkhove, Pierre},
journal = {ESAIM: Probability and Statistics},
keywords = {entropy; Kullback divergence; Metropolis-Hastings algorithm; nonparametric statistic},
language = {eng},
pages = {419-431},
publisher = {EDP-Sciences},
title = {Smoothness of Metropolis-Hastings algorithm and application to entropy estimation},
url = {http://eudml.org/doc/273622},
volume = {17},
year = {2013},
}
TY - JOUR
AU - Chauveau, Didier
AU - Vandekerkhove, Pierre
TI - Smoothness of Metropolis-Hastings algorithm and application to entropy estimation
JO - ESAIM: Probability and Statistics
PY - 2013
PB - EDP-Sciences
VL - 17
SP - 419
EP - 431
AB - The transition kernel of the well-known Metropolis-Hastings (MH) algorithm has a point mass at the chain’s current position, which prevents smoothness properties from being derived directly for the successive marginal densities produced by this algorithm. We show here that under mild smoothness assumptions on the MH algorithm’s “input” densities (the initial, proposal and target distributions), propagation of a Lipschitz condition for the iterative densities can be proved. This allows us to build a consistent nonparametric estimate of the entropy of these iterative densities. This theoretical study can be viewed as a building block for a more general MCMC evaluation tool grounded on such estimates.
LA - eng
KW - entropy; Kullback divergence; Metropolis-Hastings algorithm; nonparametric statistic
UR - http://eudml.org/doc/273622
ER -
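The abstract describes estimating the entropy of the successive marginal densities of a Metropolis-Hastings chain via a nonparametric (kernel) plug-in estimate. The sketch below is illustrative only, not the authors' code: the standard normal target, Gaussian random-walk proposal, number of parallel chains, and rule-of-thumb bandwidth are all assumptions made for the example. It runs N independent chains so that the positions at iteration t are i.i.d. draws from the marginal p_t, then estimates H(p_t) = -E[log p_t(X)] with a leave-one-out Gaussian kernel density estimate.

```python
# Illustrative sketch (not the paper's code): Monte Carlo entropy
# estimate for the iterative marginal densities of a Metropolis-
# Hastings chain. Target, proposal, chain count and bandwidth are
# assumptions chosen for the example.
import math
import random

random.seed(0)

def log_target(x):
    # Log-density of the (assumed) standard normal target.
    return -0.5 * x * x - 0.5 * math.log(2 * math.pi)

def mh_step(x, scale=1.0):
    # One MH move with a symmetric Gaussian proposal; a rejection
    # leaves the chain at x (the kernel's point mass at the current
    # position mentioned in the abstract).
    y = x + random.gauss(0.0, scale)
    if random.random() < math.exp(min(0.0, log_target(y) - log_target(x))):
        return y
    return x

def entropy_estimate(sample, h):
    # Plug-in estimate -mean(log p_hat(X_i)), where p_hat is a
    # leave-one-out Gaussian kernel density estimate of bandwidth h.
    n = len(sample)
    total = 0.0
    for i, xi in enumerate(sample):
        dens = sum(math.exp(-0.5 * ((xi - xj) / h) ** 2)
                   for j, xj in enumerate(sample) if j != i)
        dens /= (n - 1) * h * math.sqrt(2 * math.pi)
        total += math.log(dens)
    return -total / n

# N independent chains from a diffuse initial density; positions at
# iteration t are i.i.d. draws from the marginal p_t.
N, T = 400, 30
chains = [random.gauss(0.0, 3.0) for _ in range(N)]
for t in range(T):
    chains = [mh_step(x) for x in chains]

h = 1.06 * (N ** -0.2)  # rule-of-thumb bandwidth, assuming sigma near 1
print(round(entropy_estimate(chains, h), 2))
```

For a standard normal target the true entropy is 0.5*log(2*pi*e), roughly 1.42, so after the chains have mixed the printed estimate should be in that vicinity. Tracking this quantity across iterations t, as the paper's Lipschitz propagation result justifies, is what makes it usable as an MCMC convergence diagnostic.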
References
- [1] I.A. Ahmad and P.E. Lin, A nonparametric estimation of the entropy for absolutely continuous distributions. IEEE Trans. Inf. Theory 22 (1976) 372–375. Zbl0336.62001 MR408987
- [2] I.A. Ahmad and P.E. Lin, A nonparametric estimation of the entropy for absolutely continuous distributions. IEEE Trans. Inf. Theory 36 (1989) 688–692. Zbl0336.62001
- [3] C. Andrieu and J. Thoms, A tutorial on adaptive MCMC. Stat. Comput. 18 (2008) 343–373. MR2461882
- [4] Y.F. Atchadé and J. Rosenthal, On adaptive Markov chain Monte Carlo algorithms. Bernoulli 11 (2005) 815–828. Zbl1085.62097 MR2172842
- [5] P. Billingsley, Probability and Measure, 3rd edition. Wiley, New York (2005). Zbl0649.60001
- [6] D. Chauveau and P. Vandekerkhove, Improving convergence of the Hastings-Metropolis algorithm with an adaptive proposal. Scand. J. Stat. 29 (2002) 13–29. Zbl1023.65003 MR1894378
- [7] D. Chauveau and P. Vandekerkhove, A Monte Carlo estimation of the entropy for Markov chains. Methodol. Comput. Appl. Probab. 9 (2007) 133–149. Zbl1115.62011 MR2364985
- [8] Y.G. Dmitriev and F.P. Tarasenko, On the estimation of functionals of the probability density and its derivatives. Theory Probab. Appl. 18 (1973) 628–633. Zbl0301.62020
- [9] Y.G. Dmitriev and F.P. Tarasenko, On a class of non-parametric estimates of non-linear functionals of density. Theory Probab. Appl. 19 (1973) 390–394. Zbl0316.62013
- [10] R. Douc, A. Guillin, J.M. Marin and C.P. Robert, Convergence of adaptive mixtures of importance sampling schemes. Ann. Statist. 35 (2007) 420–448. Zbl1132.60022 MR2332281
- [11] E.J. Dudewicz and E.C. Van Der Meulen, Entropy-based tests of uniformity. J. Amer. Statist. Assoc. 76 (1981) 967–974. Zbl0484.62035 MR650913
- [12] P.P.B. Eggermont and V.N. LaRiccia, Best asymptotic normality of the kernel density entropy estimator for smooth densities. IEEE Trans. Inf. Theory 45 (1999) 1321–1326. Zbl0959.62005 MR1686275
- [13] W.R. Gilks, S. Richardson and D.J. Spiegelhalter, Markov Chain Monte Carlo in Practice. Chapman & Hall, London (1996). Zbl0832.00018 MR1397966
- [14] W.R. Gilks, G.O. Roberts and S.K. Sahu, Adaptive Markov chain Monte Carlo through regeneration. J. Amer. Statist. Assoc. 93 (1998) 1045–1054. Zbl1064.65503 MR1649199
- [15] L. Györfi and E.C. Van Der Meulen, Density-free convergence properties of various estimators of the entropy. Comput. Statist. Data Anal. 5 (1987) 425–436. Zbl0632.62031 MR907170
- [16] L. Györfi and E.C. Van Der Meulen, An entropy estimate based on a kernel density estimation, Limit Theorems in Probability and Statistics, Pécs (Hungary). Colloquia Mathematica Societatis János Bolyai 57 (1989) 229–240. Zbl0724.62038
- [17] H. Haario, E. Saksman and J. Tamminen, An adaptive Metropolis algorithm. Bernoulli 7 (2001) 223–242. Zbl0989.65004 MR1828504
- [18] W.K. Hastings, Monte Carlo sampling methods using Markov chains and their applications. Biometrika 57 (1970) 97–109. Zbl0219.65008
- [19] L. Holden, Geometric convergence of the Metropolis-Hastings simulation algorithm. Statist. Probab. Lett. 39 (1998). Zbl0914.60043MR1646224
- [20] A.V. Ivanov and M.N. Rozhkova, Properties of the statistical estimate of the entropy of a random vector with a probability density (in Russian). Probl. Peredachi Inform. 17 (1981) 33–43. Translated into English in Probl. Inf. Transm. 17 (1981) 171–178. Zbl0529.62002MR673736
- [21] S.F. Jarner and E. Hansen, Geometric ergodicity of Metropolis algorithms. Stoch. Proc. Appl. 85 (2000) 341–361. Zbl0997.60070 MR1731030
- [22] K.L. Mengersen and R.L. Tweedie, Rates of convergence of the Hastings and Metropolis algorithms. Ann. Statist. 24 (1996) 101–121. Zbl0854.60065 MR1389882
- [23] N. Metropolis, A.W. Rosenbluth, M.N. Rosenbluth, A.H. Teller and E. Teller, Equation of state calculations by fast computing machines. J. Chem. Phys. 21 (1953) 1087–1092.
- [24] A. Mokkadem, Estimation of the entropy and information of absolutely continuous random variables. IEEE Trans. Inf. Theory 23 (1989) 95–101. Zbl0669.94004 MR995340
- [25] R Development Core Team, R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. http://www.R-project.org (2010), ISBN 3-900051-07-0.
- [26] G.O. Roberts and J.S. Rosenthal, Optimal scaling for various Metropolis-Hastings algorithms. Statist. Sci. 16 (2001) 351–367. Zbl1127.65305 MR1888450
- [27] G.O. Roberts and R.L. Tweedie, Geometric convergence and central limit theorems for multidimensional Hastings and Metropolis algorithms. Biometrika 83 (1996) 95–110. Zbl0888.60064 MR1399158
- [28] D. Scott, Multivariate Density Estimation: Theory, Practice and Visualization. John Wiley, New York (1992). Zbl0850.62006MR1191168
- [29] F.P. Tarasenko, On the evaluation of an unknown probability density function, the direct estimation of the entropy from independent observations of a continuous random variable and the distribution-free entropy test of goodness-of-fit. Proc. IEEE 56 (1968) 2052–2053.