Smoothness of Metropolis-Hastings algorithm and application to entropy estimation

  • Didier Chauveau and Pierre Vandekerkhove

Abstract

The transition kernel of the well-known Metropolis-Hastings (MH) algorithm has a point mass at the chain’s current position, which prevents smoothness properties from being derived directly for the successive marginal densities produced by this algorithm. We show here that, under mild smoothness assumptions on the MH algorithm’s “input” densities (the initial, proposal and target distributions), propagation of a Lipschitz condition for the iterative densities can be proved. This allows us to build a consistent nonparametric estimate of the entropy of these iterative densities. This theoretical study can be viewed as a building block for a more general MCMC evaluation tool grounded on such estimates.
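The setting described above can be illustrated with a short sketch (not the authors' code; all names and parameters are illustrative): a random-walk Metropolis-Hastings sampler, whose rejection step produces the point mass mentioned in the abstract, together with a plug-in kernel entropy estimate in the spirit of Ahmad and Lin [1].

```python
import numpy as np

def metropolis_hastings(target_logpdf, x0, n_iter, prop_scale=1.0, rng=None):
    """Random-walk Metropolis-Hastings sampler with a Gaussian proposal.

    A rejected proposal leaves the chain at its current position; this
    rejection event is the point-mass part of the transition kernel.
    """
    rng = np.random.default_rng(rng)
    chain = np.empty(n_iter + 1)
    chain[0] = x0
    for t in range(n_iter):
        x = chain[t]
        y = x + prop_scale * rng.standard_normal()   # symmetric proposal
        # accept with probability min(1, pi(y)/pi(x)), computed on the log scale
        if np.log(rng.uniform()) < target_logpdf(y) - target_logpdf(x):
            chain[t + 1] = y
        else:
            chain[t + 1] = x                         # rejection: stay put
    return chain

def entropy_estimate(samples, bandwidth=None):
    """Plug-in (Ahmad-Lin type) entropy estimate:
    H_hat = -(1/N) * sum_i log f_hat(X_i),
    where f_hat is a leave-one-out Gaussian kernel density estimate.
    """
    x = np.asarray(samples, dtype=float)
    n = x.size
    if bandwidth is None:
        bandwidth = 1.06 * x.std() * n ** (-1 / 5)   # Silverman's rule of thumb
    d = (x[:, None] - x[None, :]) / bandwidth        # pairwise scaled distances
    k = np.exp(-0.5 * d ** 2) / (np.sqrt(2 * np.pi) * bandwidth)
    np.fill_diagonal(k, 0.0)                         # leave-one-out
    f_hat = k.sum(axis=1) / (n - 1)
    return -np.mean(np.log(f_hat))
```

In the paper's setting one would run many independent copies of the chain from the same initial distribution and apply `entropy_estimate` to the values of the chains at a fixed iteration n, so that the samples are i.i.d. draws from the marginal density at that iteration; the sketch above only fixes the two ingredients.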


References

[1] Ahmad, I.A. and Lin, P.E., A nonparametric estimation of the entropy for absolutely continuous distributions. IEEE Trans. Inf. Theory 22 (1976) 372–375.
[2] Ahmad, I.A. and Lin, P.E., A nonparametric estimation of the entropy for absolutely continuous distributions. IEEE Trans. Inf. Theory 36 (1989) 688–692.
[3] Andrieu, C. and Thoms, J., A tutorial on adaptive MCMC. Stat. Comput. 18 (2008) 343–373.
[4] Atchadé, Y.F. and Rosenthal, J., On adaptive Markov chain Monte Carlo algorithms. Bernoulli 11 (2005) 815–828.
[5] Billingsley, P., Probability and Measure, 3rd edition. Wiley, New York (2005).
[6] Chauveau, D. and Vandekerkhove, P., Improving convergence of the Hastings-Metropolis algorithm with an adaptive proposal. Scand. J. Stat. 29 (2002) 13–29.
[7] Chauveau, D. and Vandekerkhove, P., A Monte Carlo estimation of the entropy for Markov chains. Methodol. Comput. Appl. Probab. 9 (2007) 133–149.
[8] Dmitriev, Y.G. and Tarasenko, F.P., On the estimation of functionals of the probability density and its derivatives. Theory Probab. Appl. 18 (1973) 628–633.
[9] Dmitriev, Y.G. and Tarasenko, F.P., On a class of non-parametric estimates of non-linear functionals of density. Theory Probab. Appl. 19 (1973) 390–394.
[10] Douc, R., Guillin, A., Marin, J.M. and Robert, C.P., Convergence of adaptive mixtures of importance sampling schemes. Ann. Statist. 35 (2007) 420–448.
[11] Dudewicz, E.J. and Van Der Meulen, E.C., Entropy-based tests of uniformity. J. Amer. Statist. Assoc. 76 (1981) 967–974.
[12] Eggermont, P.P.B. and LaRiccia, V.N., Best asymptotic normality of the kernel density entropy estimator for smooth densities. IEEE Trans. Inf. Theory 45 (1999) 1321–1326.
[13] Gilks, W.R., Richardson, S. and Spiegelhalter, D.J., Markov Chain Monte Carlo in Practice. Chapman & Hall, London (1996).
[14] Gilks, W.R., Roberts, G.O. and Sahu, S.K., Adaptive Markov chain Monte Carlo through regeneration. J. Amer. Statist. Assoc. 93 (1998) 1045–1054.
[15] Györfi, L. and Van Der Meulen, E.C., Density-free convergence properties of various estimators of the entropy. Comput. Statist. Data Anal. 5 (1987) 425–436.
[16] Györfi, L. and Van Der Meulen, E.C., An entropy estimate based on a kernel density estimation, in Limit Theorems in Probability and Statistics, Pécs (Hungary). Colloquia Mathematica Societatis János Bolyai 57 (1989) 229–240.
[17] Haario, H., Saksman, E. and Tamminen, J., An adaptive Metropolis algorithm. Bernoulli 7 (2001) 223–242.
[18] Hastings, W.K., Monte Carlo sampling methods using Markov chains and their applications. Biometrika 57 (1970) 97–109.
[19] Holden, L., Geometric convergence of the Metropolis-Hastings simulation algorithm. Statist. Probab. Lett. 39 (1998).
[20] Ivanov, A.V. and Rozhkova, M.N., Properties of the statistical estimate of the entropy of a random vector with a probability density (in Russian). Probl. Peredachi Inform. 17 (1981) 33–43. English translation in Probl. Inf. Transm. 17 (1981) 171–178.
[21] Jarner, S.F. and Hansen, E., Geometric ergodicity of Metropolis algorithms. Stoch. Process. Appl. 85 (2000) 341–361.
[22] Mengersen, K.L. and Tweedie, R.L., Rates of convergence of the Hastings and Metropolis algorithms. Ann. Statist. 24 (1996) 101–121.
[23] Metropolis, N., Rosenbluth, A.W., Rosenbluth, M.N., Teller, A.H. and Teller, E., Equation of state calculations by fast computing machines. J. Chem. Phys. 21 (1953) 1087–1092.
[24] Mokkadem, A., Estimation of the entropy and information of absolutely continuous random variables. IEEE Trans. Inf. Theory 23 (1989) 95–101.
[25] R Development Core Team, R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. http://www.R-project.org (2010). ISBN 3-900051-07-0.
[26] Roberts, G.O. and Rosenthal, J.S., Optimal scaling for various Metropolis-Hastings algorithms. Statist. Sci. 16 (2001) 351–367.
[27] Roberts, G.O. and Tweedie, R.L., Geometric convergence and central limit theorems for multidimensional Hastings and Metropolis algorithms. Biometrika 83 (1996) 95–110.
[28] Scott, D., Multivariate Density Estimation: Theory, Practice and Visualization. Wiley, New York (1992).
[29] Tarasenko, F.P., On the evaluation of an unknown probability density function, the direct estimation of the entropy from independent observations of a continuous random variable and the distribution-free entropy test of goodness-of-fit. Proc. IEEE 56 (1968) 2052–2053.
ESAIM: Probability and Statistics
  • ISSN: 1292-8100
  • EISSN: 1262-3318
  • URL: /core/journals/esaim-probability-and-statistics