Rough sets methods in feature reduction and classification

Roman Świniarski

International Journal of Applied Mathematics and Computer Science (2001)

  • Volume: 11, Issue: 3, pages 565–582
  • ISSN: 1641-876X

Abstract

The paper presents an application of rough sets and statistical methods to feature reduction and pattern recognition. The presented description of rough sets theory emphasizes the role of rough sets reducts in feature selection and data reduction in pattern recognition. The overview of methods of feature selection emphasizes feature selection criteria, including rough set-based methods. The paper also contains a description of the algorithm for feature selection and reduction based on the rough sets method proposed jointly with Principal Component Analysis. Finally, the paper presents numerical results of face recognition experiments using the learning vector quantization neural network, with feature selection based on the proposed principal components analysis and rough sets methods.
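The pipeline the abstract outlines (projection onto principal components, then a rough-set reduct over the discretized components, feeding a classifier) can be sketched roughly as follows. This is a hypothetical illustration, not the paper's algorithm: the function names, the greedy positive-region criterion, and the overall structure are assumptions made for the sketch.

```python
# Hypothetical sketch of PCA-based reduction followed by a greedy
# rough-set reduct search. Not the paper's exact algorithm.
import numpy as np

def pca_project(X, k):
    """Project rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data matrix yields the principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def positive_region_size(table, labels, attrs):
    """Count objects whose equivalence class (w.r.t. the chosen
    attributes) is label-pure: the size of the positive region."""
    class_labels = {}
    for row, y in zip(table, labels):
        key = tuple(row[a] for a in attrs)
        class_labels.setdefault(key, set()).add(y)
    counts = {}
    for row in table:
        key = tuple(row[a] for a in attrs)
        counts[key] = counts.get(key, 0) + 1
    return sum(n for key, n in counts.items() if len(class_labels[key]) == 1)

def greedy_reduct(table, labels):
    """Greedily add attributes until the positive region is as large
    as with the full attribute set (an approximate reduct)."""
    n_attrs = table.shape[1]
    full = positive_region_size(table, labels, list(range(n_attrs)))
    chosen = []
    while positive_region_size(table, labels, chosen) < full:
        best = max((a for a in range(n_attrs) if a not in chosen),
                   key=lambda a: positive_region_size(table, labels, chosen + [a]))
        chosen.append(best)
    return chosen
```

In a full pipeline one would discretize the projected components (e.g. by binning) before the reduct search, then pass only the selected components to a classifier such as learning vector quantization.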

How to cite


Świniarski, Roman. "Rough sets methods in feature reduction and classification." International Journal of Applied Mathematics and Computer Science 11.3 (2001): 565-582. <http://eudml.org/doc/207520>.

@article{Świniarski2001,
abstract = {The paper presents an application of rough sets and statistical methods to feature reduction and pattern recognition. The presented description of rough sets theory emphasizes the role of rough sets reducts in feature selection and data reduction in pattern recognition. The overview of methods of feature selection emphasizes feature selection criteria, including rough set-based methods. The paper also contains a description of the algorithm for feature selection and reduction based on the rough sets method proposed jointly with Principal Component Analysis. Finally, the paper presents numerical results of face recognition experiments using the learning vector quantization neural network, with feature selection based on the proposed principal components analysis and rough sets methods.},
author = {Świniarski, Roman},
journal = {International Journal of Applied Mathematics and Computer Science},
keywords = {feature selection; rough sets; classification},
language = {eng},
number = {3},
pages = {565-582},
title = {Rough sets methods in feature reduction and classification},
url = {http://eudml.org/doc/207520},
volume = {11},
year = {2001},
}

TY - JOUR
AU - Świniarski, Roman
TI - Rough sets methods in feature reduction and classification
JO - International Journal of Applied Mathematics and Computer Science
PY - 2001
VL - 11
IS - 3
SP - 565
EP - 582
AB - The paper presents an application of rough sets and statistical methods to feature reduction and pattern recognition. The presented description of rough sets theory emphasizes the role of rough sets reducts in feature selection and data reduction in pattern recognition. The overview of methods of feature selection emphasizes feature selection criteria, including rough set-based methods. The paper also contains a description of the algorithm for feature selection and reduction based on the rough sets method proposed jointly with Principal Component Analysis. Finally, the paper presents numerical results of face recognition experiments using the learning vector quantization neural network, with feature selection based on the proposed principal components analysis and rough sets methods.
LA - eng
KW - feature selection; rough sets; classification
UR - http://eudml.org/doc/207520
ER -

References

  1. Almuallim H. and Dietterich T.G. (1991): Learning with many irrelevant features. — Proc. 9th Nat. Conf. Artificial Intelligence, Menlo Park, CA, AAAI Press, pp.547–552. Zbl0942.68657
  2. Atkeson C.G. (1991): Using locally weighted regression for robot learning. — Proc. IEEE Int. Conf. Robotics and Automation, pp.958–963 
  3. Bazan J., Skowron A. and Synak P. (1994a): Market data analysis: A rough set approach. — ICS Res. Rep., No.6, Warsaw University of Technology, Warsaw, Poland. 
  4. Bazan J., Skowron A. and Synak P. (1994b): Dynamic reducts as a tool for extracting laws from decision tables. — Proc. Symp. Methodologies for Intelligent Systems, Charlotte, NC, pp.16–19. 
  5. Bishop C.M. (1995): Neural Networks for Pattern Recognition. — Oxford: Oxford University Press. Zbl0868.68096
  6. Blumer A., Ehrenfeucht A., Haussler D. and Warmuth M.K. (1987): Occam’s razor. — Inf. Process. Lett., Vol.24, pp.377–380. Zbl0653.68084
  7. Diamantaras K.I. and Kung S.Y. (1996): Principal Component Neural Networks. Theory and Applications. — New York: Wiley. 
  8. Cios K., Pedrycz W. and Świniarski R.W. (1998): Data Mining Methods for Knowledge Discovery. — Boston/Dordrecht/London: Kluwer Academic Publishers. Zbl0912.68199
  9. Doak J. (1992): An evaluation of feature selection methods and their application to computer security. — Tech. Rep., No.CSE-92-18, University of California at Davis. 
  10. Duda R.O. and Hart P.E. (1973): Pattern Recognition and Scene Analysis. — New York: Wiley. Zbl0277.68056
  11. Fisher R.A. (1936): The use of multiple measurements in taxonomic problems. — Annals of Eugenics, Vol.7, pp.179–188. 
  12. Fukunaga K. (1990): Introduction to Statistical Pattern Recognition. — New York: Academic Press. Zbl0711.62052
  13. Geman S., Bienenstock E. and Doursat R. (1992): Neural networks and the bias/variance dilemma. — Neural Comput., Vol.4, No.1, pp.1–58. 
  14. Holland J.H. (1992): Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control and Artificial Intelligence. — MIT Press. 
  15. Hong Z.Q. (1991): Algebraic Feature Extraction of Image for Recognition. — Pattern Recognition, Vol.24, No.3, pp.211–219. 
  16. Jain A.K. (1989): Fundamentals of Digital Image Processing. — New York: Prentice Hall. Zbl0744.68134
  17. John G., Kohavi R. and Pfleger K. (1994): Irrelevant features and the subset selection problem. — Proc. 11th Int. Conf. Machine Learning (ICML-94), pp.121–129. 
  18. Karhunen K. (1947): Über lineare Methoden in der Wahrscheinlichkeitsrechnung. — Annales Academiae Scientiarum Fennicae, Series AI: Mathematica-Physica, 3rd Ed.: Van Nostrand, pp.373–379. 
  19. Kira K. and Rendell L.A. (1992): A practical approach to feature selection. — Proc. 9th Int. Workshop Machine Learning, Aberdeen, Scotland, pp.249–256. 
  20. Kittler J. (1986): Feature selection and extraction, In: Handbook of Pattern Recognition and Image Processing (T.Y. Young and K.S. Fu, Eds.), San Diego: Academic Press, pp.59–83. 
  21. Kohonen T. (1990): The Self-Organizing Map. — Proc. IEEE, Vol.78, pp.1464–1480. 
  22. Kononenko I. (1994): Estimating attributes: Analysis and extension of Relief. — Proc. Europ. Conf. Machine Learning. 
  23. Langley P. and Sage S. (1994): Selection of relevant features in machine learning. — Proc. AAAI Fall Symp. Relevance, pp.140–144. 
  24. Lawler E.L. and Wood D.E. (1966): Branch and bound methods: A survey. — Oper. Res., Vol.14, No.4, pp.699–719. 
  25. Liu H. and Setiono R. (1996a): A probabilistic approach to feature selection—A filter solution. — Proc. 13th Int. Conf. Machine Learning (ICML’96), Bari, Italy, pp.319–327. 
  26. Liu H. and Setiono R. (1996b): Feature selection and classification—A probabilistic wrapper approach. — 9th Int. Conf. Industrial and Engineering Applications of Artificial Intelligence and Expert Systems (IEA-AIE’96), Fukuoka, Japan, pp.419–424. 
  27. Liu H. and Motoda H. (1999): Feature Selection for Knowledge Discovery and Data Mining. — Dordrecht: Kluwer Academic Publishers. Zbl0908.68127
  28. Lobo V., Moura-Pires F. and Świniarski R. (1997): Minimizing the number of neurons for a SOM-based classification, using Boolean function formalization. — Int. Rep., San Diego State University, Department of Mathematical and Computer Sciences. 
  29. Marill T. and Green D.M. (1963): On the effectiveness of receptors in recognition systems. — IEEE Trans. Inf. Theory, Vol.9, pp.11–17. 
  30. Modrzejewski M. (1993): Feature selection using rough sets theory. — Proc. European Conf. Machine Learning, pp.213–226. 
  31. Narendra P.M. and Fukunaga K. (1977): A branch and bound algorithm for feature subset selection. — Trans. IEEE. Computers, Vol.C-26, pp.917–922. Zbl0363.68059
  32. Nguyen T. et al. (1994): Application of rough sets, neural networks and maximum likelihood for texture classification based on singular value decomposition. — Proc. Int. Workshop RSSC Rough Sets and Soft Computing, San Jose, U.S.A., pp.332–339. 
  33. Pal S.K. and Skowron A. (1999): Rough-Fuzzy Hybridization: A New Trend in Decision Making. — Singapore: Springer. Zbl0941.68129
  34. Pawlak Z. (1982): Rough sets. — Int. J. Comp. Inf. Sci., Vol.11, pp.341–356. Zbl0501.68053
  35. Pawlak Z. (1991): Rough Sets. Theoretical Aspects of Reasoning About Data. — Boston: Kluwer Academic Publishers. Zbl0758.68054
  36. Pregenzer M. (1997): Distinction sensitive learning vector quantization. — Ph.D. Thesis, Graz University of Technology, Graz, Austria. 
  37. Quinlan J.R. (1993): C4.5: Programs for Machine Learning. — New York: Morgan Kaufmann. 
  38. Rissanen J. (1978): Modeling by shortest data description. — Automatica, Vol.14, pp.465– 471. Zbl0418.93079
  39. Samaria F. and Harter A. (1994): Parameterisation of a stochastic model for human face identification. — Proc. IEEE Workshop Applications of Computer Vision. 
  40. Siedlecki W. and Sklansky J. (1988): On automatic feature selection. — Int. J. Pattern Recogn. Artif. Intell., Vol.2, No.2, pp.197–220. 
  41. Skowron A. (1990): The rough sets theory and evidence theory. — Fundamenta Informaticae, Vol.13, pp.245–262. Zbl0752.94023
  42. Swets D.L. and Weng J.J. (1996): Using discriminant eigenfeatures for image retrieval. — IEEE Trans. Pattern Anal. Mach. Intell., Vol.18, No.8, pp.831–836. 
  43. Świniarski R. (1993): Introduction to rough sets, In: Materials of the Int. Short Course Neural Networks. Fuzzy and Rough Systems. Theory and Applications. — San Diego State University, San Diego, California, pp.1–24. 
  44. Świniarski R. (1995): RoughFuzzyLab. — A software package developed at San Diego State University, San Diego, California. 
  45. Świniarski R. and Nguyen J. (1996): Rough sets expert system for texture classification based on 2D spectral features. — Proc. 3rd Biennial European Joint Conf. Engineering Systems Design and Analysis ESDA’96, Montpellier, France, pp.3–8. 
  46. Świniarski R., Hunt F., Chalret D. and Pearson D. (1995): Feature selection using rough sets and hidden layer expansion for rupture prediction in a highly automated production system. — Proc. 12th Int. Conf. Systems Science, Wrocław, Poland. 
  47. Świniarski R. and Hargis L. (2001): Rough sets as a front end of neural networks texture classifiers. — Neurocomputing, Vol.36, pp.85–102. Zbl1003.68642
  48. Swingler K. (1996): Applying Neural Networks. — London: Academic Press. 
  49. Weiss S. and Indurkhya N. (1998): Predictive Data Mining: A Practical Guide. — New York: Morgan Kaufmann. Zbl0885.68050
  50. Yu B. and Yuan B. (1993): A more efficient branch and bound algorithm for feature selection. — Pattern Recognition, Vol.26, No.6, pp.883–889. 
  51. Xu L., Yan P. and Chang T. (1989): Best first strategy for feature selection. — Proc. 9th Int. Conf. Pattern Recognition, pp.706–708. 
