Data mining et statistique

Philippe Besse; Caroline Le Gall; Nathalie Raimbault; Sophie Sarpy

Journal de la société française de statistique (2001)

  • Volume: 142, Issue: 1, Pages: 5-36
  • ISSN: 1962-5197

How to cite


Besse, Philippe, et al. "Data mining et statistique." Journal de la société française de statistique 142.1 (2001): 5-36. <http://eudml.org/doc/198418>.

@article{Besse2001,
author = {Besse, Philippe and Le Gall, Caroline and Raimbault, Nathalie and Sarpy, Sophie},
journal = {Journal de la société française de statistique},
language = {fre},
number = {1},
pages = {5-36},
publisher = {Société française de statistique},
title = {Data mining et statistique},
url = {http://eudml.org/doc/198418},
volume = {142},
year = {2001},
}

TY - JOUR
AU - Besse, Philippe
AU - Le Gall, Caroline
AU - Raimbault, Nathalie
AU - Sarpy, Sophie
TI - Data mining et statistique
JO - Journal de la société française de statistique
PY - 2001
PB - Société française de statistique
VL - 142
IS - 1
SP - 5
EP - 36
LA - fre
UR - http://eudml.org/doc/198418
ER -
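
The BibTeX and RIS blocks above are machine-readable exports of the same record. As an illustration only, and not an EuDML or journal tool, the following Python sketch reads the RIS entry into a dictionary and reassembles a one-line citation; the helper parse_ris and the embedded copy of the record are written solely for this example.

# Illustrative sketch: parse the RIS record shown above and print a citation.
# RIS_RECORD simply repeats the record; parse_ris is a hypothetical helper.

RIS_RECORD = """\
TY - JOUR
AU - Besse, Philippe
AU - Le Gall, Caroline
AU - Raimbault, Nathalie
AU - Sarpy, Sophie
TI - Data mining et statistique
JO - Journal de la société française de statistique
PY - 2001
PB - Société française de statistique
VL - 142
IS - 1
SP - 5
EP - 36
LA - fre
UR - http://eudml.org/doc/198418
ER -
"""

def parse_ris(text):
    """Collect RIS tag/value pairs; repeated tags (e.g. AU) accumulate in lists."""
    record = {}
    for line in text.splitlines():
        if line.startswith("ER"):          # end-of-record marker
            break
        tag, sep, value = line.partition("-")
        if not sep:                        # skip lines without a tag separator
            continue
        record.setdefault(tag.strip(), []).append(value.strip())
    return record

rec = parse_ris(RIS_RECORD)
authors = "; ".join(rec["AU"])
print(f"{authors} ({rec['PY'][0]}). {rec['TI'][0]}. "
      f"{rec['JO'][0]} {rec['VL'][0]}({rec['IS'][0]}), {rec['SP'][0]}-{rec['EP'][0]}.")

Running it prints the same bibliographic information as the "How to cite" line above, assembled from the RIS fields.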

References

  1. ACADÉMIE DES SCIENCES (2000), La statistique. Rapport sur la science et la Technique. Technique & Documentation.
  2. BAUM E. (1989), What size net gives valid generalization ? Neural Computation 1, 151-160. 
  3. BERGERET F. and CHANDON Y. (1999), Improving yield in IC manufacturing by statistical analysis of a large data base. Micro Magazine, www.micromagazine.com/archive/99/03/bergeret.html.
  4. BESSE P. (2000), Statistique & data mining. www.ups-tlse.fr/Besse/enseignement.html.
  5. BREIMAN L. (1996), Bagging predictors. Machine learning 26(2), 123-140. Zbl0858.68080
  6. BREIMAN L. (2001), Random forests - random features. Machine Learning, to appear. Zbl1007.68152
  7. BREIMAN L., FRIEDMAN J., OLSHEN R. and STONE C. (1984), Classification and regression trees. Wadsworth & Brooks. Zbl0541.62042
  8. DE VEAUX R., SCHUMI J., SCHWEINSBERG J. and UNGAR L. (1998), Prediction intervals for neural networks via nonlinear regression. Technometrics 40(4), 273-282. Zbl1063.62582MR1659361
  9. EFRON B. (1983), Estimating the error rate of a prediction rule : improvement on cross-validation. Journal of the American Statistical Association 78, 316-331. Zbl0543.62079MR711106
  10. ELDER J. and PREGIBON D. (1996), A statistical perspective on knowledge discovery in data bases. In U. M. Fayyad, G. Piatetsky-Shapiro, P. Smyth, and R. Uthurusamy (Eds.), Advances in Knowledge Discovery and Data Mining, pp. 83-113. AAAI Press/MIT Press.
  11. FAYYAD U.M. (1997), Editorial. Data Mining and Knowledge Discovery 1, 5-10.
  12. FRIEDMAN J.H. (1997), Data mining and statistics. What's the connection ? In Proc. of the 29th Symposium on the Interface : Computing Science and Statistics.
  13. GARDNER R., BIEKER J., ELWELL S., THALMAN R. and RIVERA E. (2000), Solving tough semiconductor manufacturing problems using data mining. In IEEE/SEMI Advanced Semiconductor Manufacturing Conference.
  14. GHATTAS B. (1999), Importance des variables dans les méthodes CART. La Revue de Modulad 24, 17-28. 
  15. GHATTAS B. (2000), Agrégation d'arbres de classification. Revue de Statistique Appliquée 48(2), 85-98. 
  16. GOEBEL M. and GRUENWALD L. (1999), A survey of data mining and knowledge discovery software tools. In SIGKDD Explorations, pp. 20-33. ACM SIGKDD.
  17. HAND D., MANNILA H. and SMYTH P. (2001), Principles of data mining. MIT Press.
  18. HAND D.J. (1998), Data mining : Statistics and more? The American Statistician 52(2), 112-118.
  19. HAND D.J. (1999), Statistics and data mining : intersecting disciplines. In SIGKDD Explorations, Volume 1, pp. 16-19. ACM SIGKDD. 
  20. HÉBRAIL G. and LECHEVALLIER Y. (2002), Data mining et analyse de données symboliques, in Analyse de données. Hermes. 
  21. HO T.K. (1998), The random subspace method for constructing decision forests. IEEE Transactions on Pattern Analysis and Machine Intelligence 20(8), 832-844. citeseer.nj.nec.com/ho98random.html.
  22. HWANG J. and DING A. (1997), Prediction intervals for artificial neural networks. Journal of the American Statistical Association 92, 748-757. Zbl1090.62559MR1467864
  23. JAMBU M. (2000), Introduction au data mining. Eyrolles. 
  24. MICHIE D., SPIEGELHALTER D. and TAYLOR C. (1994), Machine learning, neural and statistical classification. Harwood. Zbl0827.68094
  25. MIENO F., SATO T., SHIBURA Y., ODAGIRI K., TSUDA H. and TAKE R. (1990), Yield improvement using data mining system. In Semiconductor Manufacturing Conference Proceedings, pp. 391-394. IEEE.
  26. QUINLAN J. (1993), C4.5: Programs for machine learning. Morgan Kaufmann.
  27. RAIMBAULT N., BES C. and FABRE P. (2001), Neural aircraft autopilot gain adjuster. In 15th IFAC Symposium on Automatic Control in Aerospace. 
  28. RAIMBAULT N. and FABRE P. (2001), Probabilistic neural detector of pilot-induced oscillations (pios). In AIAA Guidance, Navigation and Control conference. 
  29. S-plus (1987), S-plus 4 Guide to statistics. MathSoft. 
  30. SAS (1989), SAS/STAT User's Guide (fourth ed.), Volume 2. SAS Institute Inc., version 7.
  31. SEM (2001), SAS/Enterprise Miner User's Guide. SAS Institute Inc., version 8.
  32. SHLIEN S. (1990), Multiple binary decision tree classifiers. Pattern Recognition 23, 757-763. 
  33. TIBSHIRANI R. (1996), A comparison of some error estimates for neural network models. Neural Computation 8, 152-163. 
  34. ZIGHED D.A. and RAKOTOMALALA R. (2000), Graphes d'induction, apprentissage et data mining. Hermès.
