Estimation of the transition density of a Markov chain

Mathieu Sart

Annales de l'I.H.P. Probabilités et statistiques (2014)

  • Volume: 50, Issue: 3, pages 1028-1068
  • ISSN: 0246-0203

Abstract

We present two data-driven procedures to estimate the transition density of a homogeneous Markov chain. The first yields a piecewise constant estimator on a suitable random partition. By using a Hellinger-type loss, we establish non-asymptotic risk bounds for our estimator when the square root of the transition density belongs to possibly inhomogeneous Besov spaces with possibly small regularity index. Some simulations are also provided. The second procedure is of theoretical interest and leads to a general model selection theorem from which we derive rates of convergence over a very wide range of possibly inhomogeneous and anisotropic Besov spaces. We also investigate the rates that can be achieved under structural assumptions on the transition density.
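
For context, a Hellinger-type loss for transition densities is built from the Hellinger distance between the conditional densities s(x, ·) and s'(x, ·), averaged over the state variable. One standard form (the precise weighting measure used in the paper may differ) is

h^2(s, s') = \frac{1}{2} \int\!\!\int \Big( \sqrt{s(x, y)} - \sqrt{s'(x, y)} \Big)^2 \, \mathrm{d}y \, \mathrm{d}\mu(x),

where \mu is a reference measure on the state space, for instance the stationary distribution of the chain.

As a much simpler point of comparison with the first procedure, the sketch below computes a piecewise constant estimate of the transition density on a fixed regular grid, for a chain with values in [0, 1]. It is an illustration only, not the paper's data-driven partition selection; the function name and parameters are ours.

import numpy as np

def histogram_transition_density(chain, m=16):
    """Piecewise constant estimate of s(x, y) on a regular m x m grid of [0, 1]^2.

    For x in bin I and y in bin J the estimate is N(I x J) / (N(I) * |J|), where
    N(I x J) counts observed transitions from I to J, N(I) counts visits to I,
    and |J| = 1/m is the bin width.
    """
    x, y = np.asarray(chain[:-1]), np.asarray(chain[1:])
    ix = np.clip((x * m).astype(int), 0, m - 1)   # bin index of X_i
    iy = np.clip((y * m).astype(int), 0, m - 1)   # bin index of X_{i+1}
    counts = np.zeros((m, m))
    np.add.at(counts, (ix, iy), 1.0)              # N(I x J)
    visits = counts.sum(axis=1, keepdims=True)    # N(I)
    # Divide by N(I) and by the bin width 1/m; rows never visited stay at 0.
    return np.where(visits > 0, counts / np.maximum(visits, 1) * m, 0.0)

# Example: an AR(1) chain mapped into [0, 1] through a logistic transform.
rng = np.random.default_rng(0)
z = np.zeros(2000)
for t in range(1, len(z)):
    z[t] = 0.5 * z[t - 1] + rng.normal()
chain = 1.0 / (1.0 + np.exp(-z))
s_hat = histogram_transition_density(chain, m=16)  # s_hat[i, j] approximates s on bin (i, j)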

How to cite

Sart, Mathieu. "Estimation of the transition density of a Markov chain." Annales de l'I.H.P. Probabilités et statistiques 50.3 (2014): 1028-1068. <http://eudml.org/doc/272004>.

@article{Sart2014,
abstract = {We present two data-driven procedures to estimate the transition density of a homogeneous Markov chain. The first yields a piecewise constant estimator on a suitable random partition. By using a Hellinger-type loss, we establish non-asymptotic risk bounds for our estimator when the square root of the transition density belongs to possibly inhomogeneous Besov spaces with possibly small regularity index. Some simulations are also provided. The second procedure is of theoretical interest and leads to a general model selection theorem from which we derive rates of convergence over a very wide range of possibly inhomogeneous and anisotropic Besov spaces. We also investigate the rates that can be achieved under structural assumptions on the transition density.},
author = {Sart, Mathieu},
journal = {Annales de l'I.H.P. Probabilités et statistiques},
keywords = {adaptive estimation; Markov chain; model selection; robust tests; transition density},
language = {eng},
number = {3},
pages = {1028-1068},
publisher = {Gauthier-Villars},
title = {Estimation of the transition density of a Markov chain},
url = {http://eudml.org/doc/272004},
volume = {50},
year = {2014},
}

TY - JOUR
AU - Sart, Mathieu
TI - Estimation of the transition density of a Markov chain
JO - Annales de l'I.H.P. Probabilités et statistiques
PY - 2014
PB - Gauthier-Villars
VL - 50
IS - 3
SP - 1028
EP - 1068
AB - We present two data-driven procedures to estimate the transition density of a homogeneous Markov chain. The first yields a piecewise constant estimator on a suitable random partition. By using a Hellinger-type loss, we establish non-asymptotic risk bounds for our estimator when the square root of the transition density belongs to possibly inhomogeneous Besov spaces with possibly small regularity index. Some simulations are also provided. The second procedure is of theoretical interest and leads to a general model selection theorem from which we derive rates of convergence over a very wide range of possibly inhomogeneous and anisotropic Besov spaces. We also investigate the rates that can be achieved under structural assumptions on the transition density.
LA - eng
KW - adaptive estimation; Markov chain; model selection; robust tests; transition density
UR - http://eudml.org/doc/272004
ER -

References

  [1] N. Akakpo. Estimation adaptative par sélection de partitions en rectangles dyadiques. Ph.D. thesis, Univ. Paris Sud, 2009.
  [2] N. Akakpo. Adaptation to anisotropy and inhomogeneity via dyadic piecewise polynomial selection. Math. Methods Statist. 21 (2012) 1–28. Zbl 1308.62070 MR2901269
  [3] N. Akakpo and C. Lacour. Inhomogeneous and anisotropic conditional density estimation from dependent data. Electron. J. Statist. 5 (2011) 1618–1653. Zbl 1271.62060 MR2870146
  [4] K. B. Athreya and G. S. Atuncar. Kernel estimation for real-valued Markov chains. Sankhyā 60 (1998) 1–17. Zbl 0977.62093 MR1714774
  [5] Y. Baraud. Estimator selection with respect to Hellinger-type risks. Probab. Theory Related Fields 151 (2011) 353–401. Zbl 05968717 MR2834722
  [6] Y. Baraud and L. Birgé. Estimating the intensity of a random measure by histogram type estimators. Probab. Theory Related Fields 143 (2009) 239–284. Zbl 1149.62019 MR2449129
  [7] Y. Baraud and L. Birgé. Estimating composite functions by model selection. Ann. Inst. Henri Poincaré Probab. Stat. 50 (2014) 285–314. Zbl 1281.62093 MR3161532
  [8] A. K. Basu and D. K. Sahoo. On Berry–Esseen theorem for nonparametric density estimation in Markov sequences. Bull. Inform. Cybernet. 30 (1998) 25–39. Zbl 0921.62039 MR1629735
  [9] L. Birgé. Approximation dans les espaces métriques et théorie de l’estimation. Probab. Theory Related Fields 65 (1983) 181–237. Zbl 0506.62026 MR722129
  [10] L. Birgé. Stabilité et instabilité du risque minimax pour des variables indépendantes équidistribuées. Ann. Inst. Henri Poincaré Probab. Stat. 20 (1984) 201–223. Zbl 0542.62018
  [11] L. Birgé. Sur un théorème de minimax et son application aux tests. Probab. Math. Statist. 2 (1984) 259–282. Zbl 0571.62036 MR764150
  [12] L. Birgé. Model selection via testing: An alternative to (penalized) maximum likelihood estimators. Ann. Inst. Henri Poincaré Probab. Stat. 42 (2006) 273–325. Zbl 1333.62094 MR2219712
  [13] L. Birgé. Model selection for Poisson processes. In Asymptotics: Particles, Processes and Inverse Problems 32–64. IMS Lecture Notes Monogr. Ser. 55. IMS, Beachwood, OH, 2007. Zbl 1176.62082 MR2459930
  [14] L. Birgé. Model selection for density estimation with 𝕃₂-loss. Probab. Theory Related Fields 158 (2014) 533–574. Zbl 1285.62037 MR3176358
  [15] L. Birgé. Robust tests for model selection. In From Probability to Statistics and Back: High-Dimensional Models and Processes. A Festschrift in Honor of Jon Wellner 47–64. IMS Collections 9. IMS, Beachwood, OH, 2012. Zbl 1327.62279 MR3186748
  [16] G. Blanchard, C. Schäfer and Y. Rozenholc. Oracle Bounds and Exact Algorithm for Dyadic Classification Trees. Lecture Notes in Comput. Sci. 3120. Springer, Berlin, 2004. Zbl 1078.62521 MR2177922
  [17] R. C. Bradley. Basic properties of strong mixing conditions. A survey and some open questions. Probab. Surv. 2 (2005) 107–144. Zbl 1189.60077 MR2178042
  [18] S. Clémençon. Adaptive estimation of the transition density of a regular Markov chain. Math. Methods Statist. 9 (2000) 323–357. Zbl 1008.62076 MR1827473
  [19] F. Comte and Y. Rozenholc. Adaptive estimation of mean and volatility functions in (auto-)regressive models. Stochastic Process. Appl. 97 (2002) 111–145. Zbl 1064.62046 MR1870963
  [20] W. Dahmen, R. DeVore and K. Scherer. Multi-dimensional spline approximation. SIAM J. Numer. Anal. 17 (1980) 380–402. Zbl 0437.41010 MR581486
  [21] R. DeVore and X. Yu. Degree of adaptive approximation. Math. Comput. 55 (1990) 625–635. Zbl 0723.41015 MR1035930
  [22] C. C. Y. Dorea. Strong consistency of kernel estimators for Markov transition densities. Bull. Braz. Math. Soc. (N.S.) 33 (2002) 409–418. Zbl 1033.62035 MR1978836
  [23] P. Doukhan. Mixing: Properties and Examples. Lecture Notes in Statistics 85. Springer, New York, 1994. Zbl 0801.60027 MR1312160
  [24] P. Doukhan and M. Ghindès. Estimation de la transition de probabilité d’une chaîne de Markov Doëblin-récurrente. Étude du cas du processus autorégressif général d’ordre 1. Stochastic Process. Appl. 15 (1983) 271–293. Zbl 0515.62037 MR711186
  [25] R. Hochmuth. Wavelet characterizations for anisotropic Besov spaces. Appl. Comput. Harmon. Anal. 12 (2002) 179–208. Zbl 1003.42024 MR1884234
  [26] A. Juditsky, O. Lepski and A. Tsybakov. Nonparametric estimation of composite functions. Ann. Statist. 37 (2009) 1360–1404. Zbl 1160.62030 MR2509077
  [27] C. Lacour. Adaptive estimation of the transition density of a Markov chain. Ann. Inst. Henri Poincaré Probab. Statist. 43 (2007) 571–597. Zbl 1125.62087 MR2347097
  [28] C. Lacour. Nonparametric estimation of the stationary density and the transition density of a Markov chain. Stochastic Process. Appl. 118 (2008) 232–260. Zbl 1129.62028 MR2376901
  [29] C. Lacour. Erratum to “Nonparametric estimation of the stationary density and the transition density of a Markov chain” [Stochastic Process. Appl. 118 (2008) 232–260]. Stochastic Process. Appl. 122 (2012) 2480–2485. Zbl 1277.62106 MR2376901
  [30] L. Le Cam. Convergence of estimates under dimensionality restrictions. Ann. Statist. 1 (1973) 38–53. Zbl 0255.62006 MR334381
  [31] L. Le Cam. On local and global properties in the theory of asymptotic normality of experiments. In Stochastic Processes and Related Topics (Proc. Summer Res. Inst. Statist. Inference for Stochastic Processes, Indiana Univ., Bloomington, Ind., 1974, Vol. 1; dedicated to Jerzy Neyman) 13–54. Academic Press, New York, 1975. Zbl 0389.62011 MR395005
  [32] P. Massart. Concentration Inequalities and Model Selection. Lecture Notes in Mathematics 1896. Springer, Berlin, 2003. Zbl 1170.60006 MR2319879
  [33] G. G. Roussas. Nonparametric estimation in Markov processes. Ann. Inst. Statist. Math. 21 (1969) 73–87. Zbl 0181.45804 MR247722
  [34] G. G. Roussas. Estimation of Transition Distribution Function and Its Quantiles in Markov Processes: Strong Consistency and Asymptotic Normality. NATO Adv. Sci. Inst. Ser. C Math. Phys. Sci. 335. Kluwer Acad. Publ., Dordrecht, 1991. Zbl 0735.62081 MR1154345
  [35] M. Sart. Model selection for Poisson processes with covariates. ArXiv e-prints, 2012.
  [36] G. Viennet. Inequalities for absolutely regular sequences: Application to density estimation. Probab. Theory Related Fields 107 (1997) 467–492. Zbl 0933.62029 MR1440142
