Random projection RBF nets for multidimensional density estimation

Ewa Skubalska-Rafajłowicz

International Journal of Applied Mathematics and Computer Science (2008)

  • Volume: 18, Issue: 4, pages 455-464
  • ISSN: 1641-876X

Abstract

The dimensionality and the amount of data that need to be processed when intensive data streams are observed grow rapidly with the development of sensor arrays, CCD and CMOS cameras, and other devices. The aim of this paper is to propose an approach to dimensionality reduction as a first stage of training RBF nets. As a vehicle for presenting the ideas, the problem of estimating multivariate probability densities is chosen. The linear projection method is briefly surveyed. Using random projections as the first (additional) layer, we are able to reduce the dimensionality of input data. Bounds on the accuracy of RBF nets equipped with a random projection layer in comparison to RBF nets without dimensionality reduction are established. Finally, the results of simulations concerning multidimensional density estimation are briefly reported.
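
A rough sketch of the pipeline the abstract describes is given below: project the data through a normal random projection layer, then estimate the density with a Gaussian RBF (kernel) estimator in the reduced space. This is not the paper's implementation; the 1/sqrt(k) scaling of the projection matrix, the bandwidth h, and the plain kernel-average form of the estimator are illustrative assumptions.

import numpy as np

def normal_random_projection(X, k, rng):
    # Project an (n, d) sample X down to (n, k) through a normal random
    # projection layer; the 1/sqrt(k) scaling is a JL-style assumption.
    d = X.shape[1]
    S = rng.normal(0.0, 1.0 / np.sqrt(k), size=(d, k))
    return X @ S, S

def rbf_density(train_proj, query_proj, h):
    # Gaussian RBF (kernel) density estimate in the k-dimensional projected
    # space: one kernel per projected training point, equal weights.
    k = train_proj.shape[1]
    norm = (2.0 * np.pi * h ** 2) ** (k / 2)
    d2 = ((query_proj[:, None, :] - train_proj[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * h ** 2)).mean(axis=1) / norm

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))   # high-dimensional training sample
Xq = rng.normal(size=(10, 50))   # query points
Xp, S = normal_random_projection(X, k=5, rng=rng)
print(rbf_density(Xp, Xq @ S, h=0.5))  # estimated densities at the queries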

How to cite


Ewa Skubalska-Rafajłowicz. "Random projection RBF nets for multidimensional density estimation." International Journal of Applied Mathematics and Computer Science 18.4 (2008): 455-464. <http://eudml.org/doc/207899>.

@article{EwaSkubalska2008,
abstract = {The dimensionality and the amount of data that need to be processed when intensive data streams are observed grow rapidly with the development of sensor arrays, CCD and CMOS cameras, and other devices. The aim of this paper is to propose an approach to dimensionality reduction as a first stage of training RBF nets. As a vehicle for presenting the ideas, the problem of estimating multivariate probability densities is chosen. The linear projection method is briefly surveyed. Using random projections as the first (additional) layer, we are able to reduce the dimensionality of input data. Bounds on the accuracy of RBF nets equipped with a random projection layer in comparison to RBF nets without dimensionality reduction are established. Finally, the results of simulations concerning multidimensional density estimation are briefly reported.},
author = {Ewa Skubalska-Rafajłowicz},
journal = {International Journal of Applied Mathematics and Computer Science},
keywords = {radial basis functions; multivariate density estimation; dimension reduction; normal random projection; novelty detection},
language = {eng},
number = {4},
pages = {455-464},
title = {Random projection RBF nets for multidimensional density estimation},
url = {http://eudml.org/doc/207899},
volume = {18},
year = {2008},
}

TY - JOUR
AU - Ewa Skubalska-Rafajłowicz
TI - Random projection RBF nets for multidimensional density estimation
JO - International Journal of Applied Mathematics and Computer Science
PY - 2008
VL - 18
IS - 4
SP - 455
EP - 464
AB - The dimensionality and the amount of data that need to be processed when intensive data streams are observed grow rapidly with the development of sensor arrays, CCD and CMOS cameras, and other devices. The aim of this paper is to propose an approach to dimensionality reduction as a first stage of training RBF nets. As a vehicle for presenting the ideas, the problem of estimating multivariate probability densities is chosen. The linear projection method is briefly surveyed. Using random projections as the first (additional) layer, we are able to reduce the dimensionality of input data. Bounds on the accuracy of RBF nets equipped with a random projection layer in comparison to RBF nets without dimensionality reduction are established. Finally, the results of simulations concerning multidimensional density estimation are briefly reported.
LA - eng
KW - radial basis functions; multivariate density estimation; dimension reduction; normal random projection; novelty detection
UR - http://eudml.org/doc/207899
ER -

References

  1. Achlioptas D. (2001). Database-friendly random projections, Proceedings of the 20th ACM SIGMOD-SIGACT-SIGART Symposium on Principles of Database Systems, Santa Barbara, CA, USA, pp. 274-281. 
  2. Ailon N. and Chazelle B. (2006). Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform, Proceedings of the 38th Annual ACM Symposium on Theory of Computing, Seattle, WA, USA, pp. 557-563. Zbl1301.68232
  3. Arriaga R. and Vempala S. (1999). An algorithmic theory of learning: Robust concepts and random projection, Proceedings of the 40th Annual Symposium on Foundations of Computer Science, New York, NY, USA, pp. 616-623. Zbl1095.68092
  4. Bishop C.M. (1994). Novelty detection and neural-network validation, IEE Proceedings - Vision, Image and Signal Processing 141(4): 217-222. 
  5. Bishop C.M. (1995). Neural Networks for Pattern Recognition, Oxford University Press, Oxford. Zbl0868.68096
  6. Bowman A.W. (1984). An alternative method of cross-validation for the smoothing of density estimates, Biometrika 71(2): 353-360. 
  7. Broomhead D. and Lowe D. (1988). Multivariable functional interpolation and adaptive networks, Complex Systems 2: 321-355. Zbl0657.68085
  8. Buhmann M.D. (2003). Radial Basis Functions: Theory and Implementations, Cambridge University Press, Cambridge. Zbl1038.41001
  9. Chen S., Cowan C.F.N. and Grant P.M. (1991). Orthogonal least squares learning algorithm for radial basis function networks, IEEE Transactions on Neural Networks 2(2): 302-309. 
  10. Chen S., Hong X. and Harris C.J. (2004). Sparse kernel density construction using orthogonal forward regression with leave-one-out test score and local regularization, IEEE Transactions on Systems, Man, and Cybernetics, Part B, 34(4): 1708-1717. 
  11. Dasgupta S. and Gupta A. (2003). An elementary proof of a theorem of Johnson and Lindenstrauss, Random Structures and Algorithms 22(1): 60-65. Zbl1018.51010
  12. Devroye L. and Györfi L. (1985). Nonparametric Density Estimation: The L1 View, Wiley, New York, NY. Zbl0546.62015
  13. Devroye L., Györfi L. and Lugosi G. (1996). Probabilistic Theory of Pattern Recognition, Springer-Verlag, New York, NY. Zbl0853.68150
  14. Frankl P. and Maehara H. (1987). The Johnson-Lindenstrauss lemma and the sphericity of some graphs, Journal of Combinatorial Theory A 44(3): 355-362. Zbl0675.05049
  15. Gertler J.J. (1998). Fault Detection and Diagnosis in Engineering Systems, Marcel Dekker, New York, NY. 
  16. Guh R. (2005). A hybrid learning-based model for on-line detection and analysis of control chart patterns, Computers and Industrial Engineering 49(1): 35-62. 
  17. Holmström L. and Hämäläinen A. (1993). The self-organizing reduced kernel density estimator, Proceedings of the 1993 IEEE International Conference on Neural Networks, San Francisco, CA, USA, 1: 417-421. 
  18. Haykin S. (1999). Neural Networks: A Comprehensive Foundation, 2nd Ed., Prentice-Hall, Upper Saddle River, NJ. Zbl0934.68076
  19. Indyk P. and Motwani R. (1998). Approximate nearest neighbors: Towards removing the curse of dimensionality, Proceedings of the 30th Annual ACM Symposium on Theory of Computing, Dallas, TX, USA, pp. 604-613. Zbl1029.68541
  20. Indyk P. and Naor A. (2006). Nearest neighbor preserving embeddings, ACM Transactions on Algorithms (to appear). Zbl1192.68748
  21. Johnson W.B. and Lindenstrauss J. (1984). Extensions of Lipschitz mappings into a Hilbert space, Contemporary Mathematics 26: 189-206. Zbl0539.46017
  22. Jones M.C., Marron J.S. and Sheather S.J. (1996). A brief survey of bandwidth selection for density estimation, Journal of the American Statistical Association 91(433): 401-407. Zbl0873.62040
  23. Karayiannis N.B. (1999). Reformulated radial basis neural networks trained by gradient descent, IEEE Transactions on Neural Networks 10(3): 657-671. 
  24. Krzyżak A. (2001). Nonlinear function learning using optimal radial basis function networks, Journal on Nonlinear Analysis 47(1): 293-302. Zbl1042.68651
  25. Krzyżak A., Linder T. and Lugosi G. (1996). Nonparametric estimation and classification using radial basis function nets and empirical risk minimization, IEEE Transactions on Neural Networks 7(2): 475-487. 
  26. Krzyżak A. and Niemann H. (2001). Convergence and rates of convergence of radial basis functions networks in function learning, Journal on Nonlinear Analysis 47(1): 281-292. Zbl1042.68652
  27. Krzyżak A. and Skubalska-Rafajłowicz E. (2004). Combining space-filling curves and radial basis function networks, Artificial Intelligence and Soft Computing ICAISC 2004, 7th International Conference, Zakopane, Poland, Lecture Notes in Artificial Intelligence, 3070: 229-234, Springer-Verlag, Berlin. Zbl1058.68574
  28. Korbicz J., Kościelny J.M., Kowalczuk Z. and Cholewa W. (Eds) (2004). Fault Diagnosis: Models, Artificial Intelligence, Applications, Springer-Verlag, Berlin. Zbl1074.93004
  29. Leonard J.A. and Kramer M.A. (1990). Classifying process behaviour with neural networks: Strategies for improved training and generalization, Proceedings of the American Control Conference, San Diego, CA, USA, pp. 2478-2483. 
  30. Leonard J.A. and Kramer M.A. (1991). Radial basis networks for classifying process faults, IEEE Control Systems Magazine 11(3): 31-38. 
  31. Li Y., Pont M.J. and Jones N.B. (2002). Improving the performance of radial basis function classifiers in condition monitoring and fault diagnosis applications where 'unknown' faults may occur, Pattern Recognition Letters 23(5): 569-577. Zbl1012.68173
  32. Li P., Hastie T.J. and Church K.W. (2007). Nonlinear estimators and tail bounds for dimension reduction in l1 using Cauchy random projections, Journal of Machine Learning Research 8(10): 2497-2532. 
  33. Li P., Hastie T.J. and Church K.W. (2006). Sub-Gaussian random projections, Technical report, Stanford University. 
  34. Magdon-Ismail M. and Atiya A. (2002). Density estimation and random variate generation using multilayer networks, IEEE Transactions on Neural Networks 13(3): 497-520. 
  35. Moody J. and Darken C.J. (1989). Fast learning in networks of locally tuned processing units, Neural Computation 1(2): 281-294. 
  36. Patton R.J. (1994). Robust model-based fault diagnosis: The state of the art, Proceedings of the IFAC Symposium on Fault Detection, Supervision and Safety of Technical Processes, Espoo, Finland, pp. 1-24. 
  37. Patton R.J., Chen J. and Benkhedda H. (2000). A study on neurofuzzy systems for fault diagnosis, International Journal of Systems Science 31(11): 1441-1448. Zbl1080.93600
  38. Parzen E. (1962). On estimation of a probability density function and mode, Annals of Mathematical Statistics 33(3): 1065-1076. Zbl0116.11302
  39. Poggio T. and Girosi F. (1990). Networks for approximation and learning, Proceedings of the IEEE 78(9): 1481-1497. Zbl1226.92005
  40. Powell M.J.D. (1987). Radial basis functions for multivariable interpolation: A review, in (J.C. Mason, M.G. Cox, Eds.) Algorithms for Approximation, Clarendon Press, Oxford, pp. 143-167. 
  41. Rafajłowicz E. (2006). RBF nets in fault localization, 8th International Conference on Artificial Intelligence and Soft Computing - ICAISC 2006. Zakopane, Poland, LNCS, Springer-Verlag, Berlin/Heidelberg, 4029/2006: 113-122. 
  42. Rafajłowicz E. and Skubalska-Rafajłowicz E. (2003). RBF nets based on equidistributed points, Proceedings of the 9th IEEE International Conference: Methods and Models in Automation and Robotics MMAR 2003, Szczecin, Poland, 2: 921-926. 
  43. Roberts S. (2000). Extreme value statistics for novelty detection in biomedical data processing, IEE Proceedings: Science, Measurement and Technology 147(6): 363-367. 
  44. Schiøler H. and Hartmann U. (1992). Mapping neural network derived from the Parzen window estimator, Neural Networks 5(6): 903-909. 
  45. Skubalska-Rafajłowicz E. (2000). On using space-filling curves and vector quantization for constructing multidimensional control charts, Proceedings of the 5th Conference on Neural Networks and Soft Computing, Zakopane, Poland, pp. 162-167. 
  46. Skubalska-Rafajłowicz E. (2006a). RBF neural network for probability density function estimation and detecting changes in multivariate processes, 8th International Conference: Artificial Intelligence and Soft Computing - ICAISC 2006. Zakopane, Poland, LNCS, Springer-Verlag, Berlin/Heidelberg 4029/2006: 133-141. 
  47. Skubalska-Rafajłowicz E. (2006b). Self-organizing RBF neural network for probability density function estimation, Proceedings of the 12th IEEE International Conference on Methods and Models in Automation and Robotics, Międzyzdroje, Poland, pp. 985-988. 
  48. Specht D.F. (1990). Probabilistic neural networks, Neural Networks 3(1): 109-118. 
  49. Vempala S. (2004). The Random Projection Method, American Mathematical Society, Providence, RI. Zbl1058.68063
  50. Wettschereck D. and Dietterich T. (1992). Improving the performance of radial basis function networks by learning center locations, in (B. Spatz, Ed.) Advances in Neural Information Processing Systems, Morgan Kaufmann, San Mateo, CA, Vol. 4, pp. 1133-1140. 
  51. Willsky A. S. (1976). A survey of design methods for failure detection in dynamic systems, Automatica 12(6): 601-611. Zbl0345.93067
  52. Xu L., Krzyżak A. and Yuille A. (1994). On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size, Neural Networks 7(4): 609-628. Zbl0817.62031
  53. Yee P. V. and Haykin S. (2001). Regularized Radial Basis Function Networks: Theory and Applications, John Wiley, New York, NY. 
  54. Yin H. and Allinson N.M. (2001). Self-organising mixture networks for probability density estimation, IEEE Transactions on Neural Networks 12(2): 405-411. 
  55. Zorriassatine F., Tannock J.D.T. and O'Brien C. (2003). Using novelty detection to identify abnormalities caused by mean shifts in bivariate processes, Computers and Industrial Engineering 44(3): 385-408. 
