A practical application of kernel-based fuzzy discriminant analysis

Jian-Qiang Gao; Li-Ya Fan; Li Li; Li-Zhong Xu

International Journal of Applied Mathematics and Computer Science (2013)

  • Volume: 23, Issue: 4, page 887-903
  • ISSN: 1641-876X

Abstract

A novel method for feature extraction and recognition called Kernel Fuzzy Discriminant Analysis (KFDA) is proposed in this paper to deal with recognition problems, e.g., for images. The KFDA method is obtained by combining the advantages of fuzzy methods and the kernel trick. Based on the orthogonal-triangular (QR) decomposition of a matrix and Singular Value Decomposition (SVD), two variants of KFDA, KFDA/QR and KFDA/SVD, are obtained. In the proposed method, the membership degree is incorporated into the definition of the between-class and within-class scatter matrices to obtain fuzzy between-class and within-class scatter matrices. The membership degree is obtained by combining measures of the features of the sample data. In addition, the effects of employing different measures are investigated from a purely mathematical point of view, and the t-test is used to compare the robustness of the learning algorithms. Experimental results on the ORL and FERET face databases show that KFDA/QR and KFDA/SVD are more effective and feasible than Fuzzy Discriminant Analysis (FDA) and Kernel Discriminant Analysis (KDA) in terms of the mean correct recognition rate.
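
To make the construction described in the abstract concrete, the short Python sketch below illustrates its two ingredients: fuzzy membership degrees computed with the fuzzy k-nearest-neighbor rule of Keller et al. (1985) and fuzzy between-class and within-class scatter matrices in which those degrees weight each sample. This is a minimal sketch under assumptions made here, not the authors' implementation: it stays in the input space for brevity (the paper carries out the analogous construction in the kernel-induced feature space and then applies QR decomposition or SVD), and the function names, the fuzzifier m, and the toy data are chosen only for demonstration.

# Minimal sketch (not the authors' code): fuzzy k-NN membership degrees and
# one common fuzzy weighting of the scatter matrices, in the input space.
# KFDA would build analogous quantities in the kernel feature space
# before the QR or SVD step described in the abstract.
import numpy as np

def fuzzy_knn_memberships(X, y, k=3):
    """Membership degree U[i, c] of sample i in class c (fuzzy k-NN rule, Keller et al., 1985)."""
    n = X.shape[0]
    classes = np.unique(y)
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # squared Euclidean distances
    np.fill_diagonal(d2, np.inf)               # a sample is not its own neighbor
    U = np.zeros((n, classes.size))
    for i in range(n):
        nbrs = np.argsort(d2[i])[:k]           # k nearest neighbors of sample i
        for c_idx, c in enumerate(classes):
            frac = np.mean(y[nbrs] == c)       # fraction of neighbors in class c
            U[i, c_idx] = 0.51 + 0.49 * frac if y[i] == c else 0.49 * frac
    return U, classes

def fuzzy_scatter_matrices(X, y, U, classes, m=2.0):
    """Fuzzy between-class (Sb) and within-class (Sw) scatter matrices (illustrative weighting)."""
    d = X.shape[1]
    mean_all = X.mean(axis=0)
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c_idx, c in enumerate(classes):
        w = U[:, c_idx] ** m                              # fuzzified membership weights
        mean_c = (w[:, None] * X).sum(axis=0) / w.sum()   # fuzzy class mean
        diff = (mean_c - mean_all)[:, None]
        Sb += w.sum() * (diff @ diff.T)
        Xc = X[y == c] - mean_c
        wc = w[y == c][:, None]
        Sw += (wc * Xc).T @ Xc
    return Sb, Sw

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (20, 5)), rng.normal(2.0, 1.0, (20, 5))])
    y = np.array([0] * 20 + [1] * 20)
    U, classes = fuzzy_knn_memberships(X, y, k=3)
    Sb, Sw = fuzzy_scatter_matrices(X, y, U, classes)
    # Classical discriminant step: leading eigenvectors of pinv(Sw) @ Sb give projections.
    evals = np.linalg.eigvals(np.linalg.pinv(Sw) @ Sb)
    print("largest discriminant eigenvalue:", evals.real.max())

The abstract also mentions a t-test for comparing the robustness of the learning algorithms; one standard way to realize such a comparison is a paired test (e.g., scipy.stats.ttest_rel) on the per-run recognition rates of two methods over the same random training/test splits.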

How to cite


Jian-Qiang Gao, et al. "A practical application of kernel-based fuzzy discriminant analysis." International Journal of Applied Mathematics and Computer Science 23.4 (2013): 887-903. <http://eudml.org/doc/262480>.

@article{Jian2013,
abstract = {A novel method for feature extraction and recognition called Kernel Fuzzy Discriminant Analysis (KFDA) is proposed in this paper to deal with recognition problems, e.g., for images. The KFDA method is obtained by combining the advantages of fuzzy methods and the kernel trick. Based on the orthogonal-triangular (QR) decomposition of a matrix and Singular Value Decomposition (SVD), two variants of KFDA, KFDA/QR and KFDA/SVD, are obtained. In the proposed method, the membership degree is incorporated into the definition of the between-class and within-class scatter matrices to obtain fuzzy between-class and within-class scatter matrices. The membership degree is obtained by combining measures of the features of the sample data. In addition, the effects of employing different measures are investigated from a purely mathematical point of view, and the t-test is used to compare the robustness of the learning algorithms. Experimental results on the ORL and FERET face databases show that KFDA/QR and KFDA/SVD are more effective and feasible than Fuzzy Discriminant Analysis (FDA) and Kernel Discriminant Analysis (KDA) in terms of the mean correct recognition rate.},
author = {Jian-Qiang Gao, Li-Ya Fan, Li Li, Li-Zhong Xu},
journal = {International Journal of Applied Mathematics and Computer Science},
keywords = {kernel fuzzy discriminant analysis; fuzzy k-nearest neighbor; QR decomposition; SVD; fuzzy membership matrix; t-test},
language = {eng},
number = {4},
pages = {887-903},
title = {A practical application of kernel-based fuzzy discriminant analysis},
url = {http://eudml.org/doc/262480},
volume = {23},
year = {2013},
}

TY - JOUR
AU - Jian-Qiang Gao
AU - Li-Ya Fan
AU - Li Li
AU - Li-Zhong Xu
TI - A practical application of kernel-based fuzzy discriminant analysis
JO - International Journal of Applied Mathematics and Computer Science
PY - 2013
VL - 23
IS - 4
SP - 887
EP - 903
AB - A novel method for feature extraction and recognition called Kernel Fuzzy Discriminant Analysis (KFDA) is proposed in this paper to deal with recognition problems, e.g., for images. The KFDA method is obtained by combining the advantages of fuzzy methods and the kernel trick. Based on the orthogonal-triangular (QR) decomposition of a matrix and Singular Value Decomposition (SVD), two variants of KFDA, KFDA/QR and KFDA/SVD, are obtained. In the proposed method, the membership degree is incorporated into the definition of the between-class and within-class scatter matrices to obtain fuzzy between-class and within-class scatter matrices. The membership degree is obtained by combining measures of the features of the sample data. In addition, the effects of employing different measures are investigated from a purely mathematical point of view, and the t-test is used to compare the robustness of the learning algorithms. Experimental results on the ORL and FERET face databases show that KFDA/QR and KFDA/SVD are more effective and feasible than Fuzzy Discriminant Analysis (FDA) and Kernel Discriminant Analysis (KDA) in terms of the mean correct recognition rate.
LA - eng
KW - kernel fuzzy discriminant analysis; fuzzy k-nearest neighbor; QR decomposition; SVD; fuzzy membership matrix; t-test
UR - http://eudml.org/doc/262480
ER -

References

  1. Aydilek, I.B. and Arslan, A. (2012). A novel hybrid approach to estimating missing values in databases using k-nearest neighbors and neural networks, International Journal of Innovative Computing, Information and Control 7(8): 4705-4717. 
  2. Baudat, G. and Anouar, F. (2000). Generalized discriminant analysis using a kernel approach, Neural Computation 12(10): 2385-2404. 
  3. Belhumeur, P.N., Hespanha, J.P. and Kriegman, D.J. (1997). Eigenfaces vs. Fisherfaces: Recognition using class specific linear projection, IEEE Transactions on Pattern Analysis and Machine Intelligence 19(7): 711-720. 
  4. Chen, L.F., Liao, H. Y.M., Ko, M.T., Lin, J.C. and Yu, G.J. (2000). A new LDA-based face recognition system which can solve the small sample size problem, Pattern Recognition 33(10): 1713-1726. 
  5. Cover, T.M. (1965). Geometrical and statistical properties of systems of linear inequalities with applications in pattern recognition, IEEE Transactions on Electronic Computers 14(3): 326-334. Zbl0152.18206
  6. Demsar, J. (2006). Statistical comparisons of classifiers over multiple data sets, The Journal of Machine Learning Research 7(1): 1-30. Zbl1222.68184
  7. Dietterich, T.G. (1998). Approximate statistical tests for comparing supervised classification learning algorithms, Neural Computation 10(7): 1895-1923. 
  8. Duda, R.O., Hart, P.E. and Stork, D.G. (2012). Pattern Classification, John Wiley and Sons, New York, NY. Zbl0968.68140
  9. Friedman, J.H. (1989). Regularized discriminant analysis, Journal of the American Statistical Association 84(405): 165-175. 
  10. Fukunaga, K. (1990). Introduction to Statistical Pattern Recognition, Academic Press, San Diego, CA. Zbl0711.62052
  11. Gao, J. and Fan, L. (2011). Kernel-based weighted discriminant analysis with QR decomposition and its application to face recognition, WSEAS Transactions on Mathematics 10(10): 358-367. 
  12. Gao, J., Fan, L. and Xu, L. (2012). Solving the face recognition problem using QR factorization, WSEAS Transactions on Mathematics 11(1): 728-737. 
  13. Gao, J.Q., Fan, L.Y. and Xu, L.Z. (2013). Median null (sw)-based method for face feature recognition, Applied Mathematics and Computation 219(12): 6410-6419. Zbl06281820
  14. Gao, Q.X., Zhang, L. and Zhang, D. (2008). Face recognition using FLDA with single training image per person, Applied Mathematics and Computation 205(2): 726-734. Zbl1152.68679
  15. Hastie, T., Buja, A. and Tibshirani, R. (1995). Penalized discriminant analysis, The Annals of Statistics 23(1): 73-102. Zbl0821.62031
  16. Hastie, T., Tibshirani, R. and Buja, A. (1994). Flexible discriminant analysis by optimal scoring, Journal of the American Statistical Association 89(428): 1255-1270. Zbl0812.62067
  17. Hastie, T., Tibshirani, R., Friedman, J. and Franklin, J. (2005). The elements of statistical learning: Data mining, inference and prediction, The Mathematical Intelligencer 27(2): 83-85. 
  18. Hong, Z.Q. and Yang, J.Y. (1991). Optimal discriminant plane for a small number of samples and design method of classifier on the plane, Pattern Recognition 24(4): 317-324. 
  19. Jain, A. and Zongker, D. (1997). Feature selection: Evaluation, application, and small sample performance, IEEE Transactions on Pattern Analysis and Machine Intelligence 19(2): 153-158. 
  20. Keller, J.M., Gray, M.R. and Givens, J.A. (1985). A fuzzy k-nearest neighbor algorithm, IEEE Transactions on Systems, Man and Cybernetics 15(4): 580-585. 
  21. Koc, M. and Barkana, A. (2011). A new solution to one sample problem in face recognition using FLDA, Applied Mathematics and Computation 217(24): 10368-10376. Zbl1221.62096
  22. Kwak, K.C. and Pedrycz, W. (2005). Face recognition using a fuzzy Fisherface classifier, Pattern Recognition 38(10): 1717-1732. 
  23. Lee, H.M., Chen, C.M., Chen, J.M. and Jou, Y.L. (2001). An efficient fuzzy classifier with feature selection based on fuzzy entropy, IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics 31(3): 426-432. 
  24. Liu, F. and Xue, X. (2012). Constructing kernels by fuzzy rules for support vector regressions, International Journal of Innovative Computing, Information and Control 8(7): 4811-4822. 
  25. Liu, Y. (2006). Website of the ORL face database, http://www.cam-orl.co.uk. 
  26. Liu, Y., Liu, X. and Su, Z. (2008). A new fuzzy approach for handling class labels in canonical correlation analysis, Neurocomputing 71(7): 1735-1740. 
  27. Loog, M., Duin, R.P.W. and Haeb-Umbach, R. (2001). Multiclass linear dimension reduction by weighted pairwise fisher criteria, IEEE Transactions on Pattern Analysis and Machine Intelligence 23(7): 762-766. 
  28. Pahasa, J. and Ngamroo, I. (2012). PSO based kernel principal component analysis and multi-class support vector machine for power quality problem classification, International Journal of Innovative Computing, Information and Control 8(3): 1523-1539. 
  29. Pal, N.R. and Eluri, V.K. (1998). Two efficient connectionist schemes for structure preserving dimensionality reduction, IEEE Transactions on Neural Networks 9(6): 1142-1154. 
  30. Phillips, P.J. (2004). Website of the facial recognition technology (FERET) database, http://www.itl.nist.gov/iad/humanid/feret/feret-master.html. 
  31. Raudys, S.J. and Jain, A.K. (1991). Small sample size effects in statistical pattern recognition: Recommendations for practitioners, IEEE Transactions on Pattern Analysis and Machine Intelligence 13(3): 252-264. 
  32. Schölkopf, B., Smola, A. and Müller, K.R. (1998). Nonlinear component analysis as a kernel eigenvalue problem, Neural Computation 10(5): 1299-1319. 
  33. Swets, D.L. and Weng, J.J. (1996). Using discriminant eigenfeatures for image retrieval, IEEE Transactions on Pattern Analysis and Machine Intelligence 18(8): 831-836. 
  34. Świercz, E. (2010). Classification in the Gabor time-frequency domain of non-stationary signals embedded in heavy noise with unknown statistical distribution, International Journal of Applied Mathematics and Computer Science 20(1): 135-147, DOI: 10.2478/v10006-010-0010-x. Zbl1300.62045
  35. Vapnik, V.N. (1998). Statistical Learning Theory, Wiley, New York, NY. Zbl0935.62007
  36. Woźniak, M. and Krawczyk, B. (2012). Combined classifier based on feature space partitioning, International Journal of Applied Mathematics and Computer Science 22(4): 855-866, DOI: 10.2478/v10006-012-0063-0. 
  37. Wu, X.H. and Zhou, J.J. (2006). Fuzzy discriminant analysis with kernel methods, Pattern Recognition 39(11): 2236-2239. Zbl1102.68606
  38. Yang, J., Frangi, A.F., Yang, J.Y., Zhang, D. and Jin, Z. (2005). KPCA plus LDA: A complete kernel Fisher discriminant framework for feature extraction and recognition, IEEE Transactions on Pattern Analysis and Machine Intelligence 27(2): 230-244. 
  39. Yang, J. and Yang, J.Y. (2001). An optimal FLD algorithm for facial feature extraction, Proceedings of SPIE Intelligent Robots and Computer Vision XX: Algorithms, Techniques, and Active Vision, Boston, MA, USA, pp. 438-444. 
  40. Yang, J. and Yang, J.Y. (2003). Why can LDA be performed in PCA transformed space?, Pattern Recognition 36(2): 563-566. 
  41. Yang, W., Wang, J., Ren, M., Zhang, L. and Yang, J. (2009). Feature extraction using fuzzy inverse FDA, Neurocomputing 73(13): 3384-3390. 
  42. Yu, H. and Yang, J. (2001). A direct LDA algorithm for high-dimensional data with application to face recognition, Pattern Recognition 34(10): 2067-2070. Zbl0993.68091
  43. Zadeh, L.A. (1965). Fuzzy sets, Information and Control 8(3): 338-353. Zbl0139.24606
  44. Zheng, Y.J., Yang, J., Yang, J.Y. and Wu, X.J. (2006a). A reformative kernel Fisher discriminant algorithm and its application to face recognition, Neurocomputing 69(13): 1806-1810. 
  45. Zheng, Y., Yang, J., Wang, W., Wang, Q., Yang, J. and Wu, X. (2006b). Fuzzy kernel Fisher discriminant algorithm with application to face recognition, 6th World Congress on Intelligent Control and Automation WCICA, Dalian, China, Vol. 2, pp. 9669-9672. 
  46. Zhuang, X.S. and Dai, D.Q. (2005). Inverse Fisher discriminate criteria for small sample size problem and its application to face recognition, Pattern Recognition 38(11): 2192-2194. 
  47. Zhuang, X.S. and Dai, D.Q. (2007). Improved discriminate analysis for high-dimensional data and its application to face recognition, Pattern Recognition 40(5): 1570-1578. Zbl1113.68086
