Construire un arbre de discrimination binaire à partir de données imprécises

E. Périnel

Revue de Statistique Appliquée (1999)

  • Volume: 47, Issue: 1, pages 5-30
  • ISSN: 0035-175X

How to cite


Périnel, E. "Construire un arbre de discrimination binaire à partir de données imprécises." Revue de Statistique Appliquée 47.1 (1999): 5-30.

BibTeX

@article{Perinel1999,
  author = {Périnel, E.},
  journal = {Revue de Statistique Appliquée},
  language = {fre},
  number = {1},
  pages = {5-30},
  publisher = {Société française de statistique},
  title = {Construire un arbre de discrimination binaire à partir de données imprécises},
  volume = {47},
  year = {1999},
}

RIS

TY - JOUR
AU - Périnel, E.
TI - Construire un arbre de discrimination binaire à partir de données imprécises
JO - Revue de Statistique Appliquée
PY - 1999
PB - Société française de statistique
VL - 47
IS - 1
SP - 5
EP - 30
LA - fre
ER -


References

  1. Arraya R., (1995). Induction of decision trees when examples are described with noisy measurements and with fuzzy class membership. In Séminaire du projet CLOREC, INRIA Rocquencourt, June.
  2. Belson W.A., (1959). Matching and prediction on the principle of biological classification, Applied Statistics, vol. VIII.
  3. Breiman L., Friedman J.H., Olshen R.A. and Stone C.J., (1984). Classification and regression trees. Belmont: Wadsworth. Zbl 0541.62042, MR 726392.
  4. Caillou B., Tartour E. and Schlumberger M., (1992). Les tumeurs neuroendocrines, Revue Prat., vol. 42, 7.
  5. Celeux G. and Nakache J.P., (1994). Analyse discriminante sur variables qualitatives, PolyTechnica.
  6. Celeux G. and Govaert G., (1992). A Classification EM algorithm for clustering and two stochastic versions, Computational Statistics & Data Analysis, vol. 14, 315-332. Zbl 0937.62605, MR 1192205.
  7. Ciampi A., (1992). Constructing prediction trees from data: the RECPAM approach. Proceedings from the Prague 1991 University Summer School on computational aspects of model choice, 105-152, Physica Verlag, Heidelberg. MR 1210551.
  8. Ciampi A., Diday E., Lebbe J., Périnel E. and Vignes R., (1996). Recursive partition with probabilistically imprecise data. In: Ordinal and Symbolic Data Analysis, 201-212, Diday E. et al. editors, Springer-Verlag. Zbl 0902.62006.
  9. Dempster A., Laird N. and Rubin D.B., (1977). Maximum likelihood from incomplete data via the EM algorithm, Journal of the Royal Statistical Society, B39, 1-38. Zbl 0364.62022, MR 501537.
  10. Diday E., (1987). Des objets de l'analyse des données à ceux de l'analyse des connaissances. In Induction symbolique et numérique à partir de données, E. Diday and Y. Kodratoff (eds.), Cépaduès.
  11. Diday E. and Émilion R., (1997). Treillis de Galois maximaux et Capacités de Choquet. C.R. Acad. Sci. Paris, Analyse mathématique, t. 324, série 1. Zbl 0882.06005, MR 1464817.
  12. Esposito F., Malerba D. and Semeraro G., (1997). A comparative analysis of methods for pruning decision trees, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 5, 476-492.
  13. Friedman J.H., (1977). A Recursive Partitioning Decision Rule for Nonparametric Classification, IEEE Transactions on Computers, April, 404-408. Zbl 0403.62036.
  14. Gueguen A. and Nakache J.P., (1988). Méthode de discrimination basée sur la construction d'un arbre de décision binaire, Revue de Statistique Appliquée, vol. XXXVI, 1, 19-38.
  15. Jamshidian M. and Jennrich R.I., (1993). Conjugate gradient acceleration of the EM algorithm, Journal of the American Statistical Association, March, vol. 88, 421. Zbl 0775.65025, MR 1212487.
  16. Jordan M.I. and Jacobs R.A., (1993). Hierarchical mixtures of experts and the EM algorithm, August 1993, submitted to Neural Computation.
  17. Lebart L., Morineau A. and Piron M., (1995). Statistique exploratoire multidimensionnelle, Dunod. Zbl 0920.62077.
  18. Lebbe J., (1991). Représentation des concepts en biologie et en médecine, Thèse de l'Université Pierre et Marie Curie, Paris VI-Jussieu.
  19. Lindstrom M.J. and Bates D.M., (1988). Newton-Raphson and EM algorithms for linear mixed-effects models for repeated-measures data, Journal of the American Statistical Association, vol. 83, 1014-1022. Zbl 0671.65119, MR 997577.
  20. Meng X.L. and Rubin D.B., (1993). Maximum likelihood estimation via the ECM algorithm: a general framework, Biometrika, vol. 80, 2, 267-278. Zbl 0778.62022, MR 1243503.
  21. Mingers J., (1989). An empirical comparison of pruning methods for decision-tree induction, Machine Learning, vol. 4, 2, 227-243.
  22. Morgan J.N. and Sonquist J.A., (1963). Problems in the analysis of survey data, and a proposal, J.A.S.A., vol. 58, 302. Zbl 0114.10103.
  23. Périnel E., (1996). Méthodes de segmentation et analyse des données symboliques. Le cas de données probabilistes imprécises. Thèse de l'Université Paris IX-Dauphine.
  24. Quinlan J.R., (1986). The effect of noise on concept learning. In Michalski, Carbonell & Mitchell (Eds.), Machine Learning: an artificial intelligence approach. San Mateo, CA: Morgan Kaufmann.
  25. Quinlan J.R., (1990). Probabilistic decision trees. In: Machine Learning III, Kodratoff Y., Michalski R. (eds.), 140-152.
  26. Quinlan J.R., (1993). C4.5: Programs for Machine Learning. Morgan Kaufmann, San Mateo, California.
  27. Sethi I.K. and Sarvarayudu G.P.R., (1982). Hierarchical classifier design using mutual information, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 2, 441-445.
  28. Titterington D.M., Smith A.F.M. and Makov U.E., (1985). Statistical analysis of finite mixture distributions, Wiley & Sons (Eds.), Wiley series in probability and mathematical statistics. Zbl 0646.62013, MR 838090.
  29. Wedel M. and De Sarbo W.S., (1995). A mixture likelihood approach for generalized linear models, Journal of Classification, vol. 12, 21-55. Zbl 0825.62611.
  30. Yuan Y. and Shaw M.J., (1995). Induction of fuzzy decision trees. Fuzzy Sets and Systems, 69, 125-139. MR 1317881.
