The “progressive mixture” estimator for regression trees

Gilles Blanchard

Annales de l'I.H.P. Probabilités et statistiques (1999)

  • Volume: 35, Issue: 6, pages 793-820
  • ISSN: 0246-0203

How to cite

Blanchard, Gilles. "The “progressive mixture” estimator for regression trees." Annales de l'I.H.P. Probabilités et statistiques 35.6 (1999): 793-820. <http://eudml.org/doc/77646>.

@article{Blanchard1999,
author = {Blanchard, Gilles},
journal = {Annales de l'I.H.P. Probabilités et statistiques},
language = {eng},
number = {6},
pages = {793-820},
publisher = {Gauthier-Villars},
title = {The “progressive mixture” estimator for regression trees},
url = {http://eudml.org/doc/77646},
volume = {35},
year = {1999},
}

TY - JOUR
AU - Blanchard, Gilles
TI - The “progressive mixture” estimator for regression trees
JO - Annales de l'I.H.P. Probabilités et statistiques
PY - 1999
PB - Gauthier-Villars
VL - 35
IS - 6
SP - 793
EP - 820
LA - eng
UR - http://eudml.org/doc/77646
ER -

References

  [1] Y. Amit and D. Geman, Shape quantization and recognition with randomized trees, Neural Computation 9 (1997) 1545-1588.
  [2] Y. Amit, D. Geman and K. Wilder, Joint induction of shape features and tree classifiers, IEEE Trans. PAMI 19 (11) (1997) 1300-1306.
  [3] A. Barron and Y. Yang, Information theoretic determination of minimax rates of convergence, Department of Statistics, Yale University, 1997. Zbl 0978.62008
  [4] A.R. Barron, Are Bayes rules consistent in information?, in: T.M. Cover and B. Gopinath (Eds.), Open Problems in Communication and Computation, Springer, Berlin, 1987, pp. 85-91.
  [5] L. Birgé, Approximation dans les espaces métriques et théorie de l'estimation, Z. Wahrscheinlichkeitstheor. Verw. Geb. 65 (1983) 181-237. Zbl 0506.62026, MR 722129
  [6] O. Catoni, "Universal" aggregation rules with exact bias bounds, Preprint of the Laboratoire de Probabilités et Modèles Aléatoires, Université Pierre et Marie Curie, available at http://www.proba.jussieu.fr/mathdoc/preprints/index.html#1999 (to appear in Annals of Statistics), 1999.
  [7] H. Chipman, E.I. George and R.E. McCulloch, Bayesian CART model search, JASA 93 (1998) 935-947.
  [8] T.M. Cover and J.A. Thomas, Elements of Information Theory, Wiley Series in Telecommunications, Wiley, New York, 1991. Zbl 0762.94001, MR 1122806
  [9] L. Devroye and L. Györfi, Nonparametric Density Estimation: The L1 View, Wiley, New York, 1985. Zbl 0546.62015, MR 780746
  [10] D. Helmbold and R. Schapire, Predicting nearly as well as the best pruning of a decision tree, Machine Learning 27 (1997) 51-68.
  [11] F.M.J. Willems, Y.M. Shtarkov and T.J. Tjalkens, The context-tree weighting method: basic properties, IEEE Trans. Inform. Theory 41 (3) (1995) 653-664. Zbl 0837.94011
  [12] F.M.J. Willems, Y.M. Shtarkov and T.J. Tjalkens, Context weighting for general finite-context sources, IEEE Trans. Inform. Theory 42 (5) (1996) 1514-1520. Zbl 0860.94016
