Graph-based generation of a meta-learning search space

Norbert Jankowski

International Journal of Applied Mathematics and Computer Science (2012)

  • Volume: 22, Issue: 3, pages 647-667
  • ISSN: 1641-876X

Abstract

Meta-learning is becoming increasingly important in current and future research on broadly defined data mining and computational intelligence. It can solve problems that cannot be solved by any single specialized algorithm. The overall behavior of a meta-learning algorithm depends mainly on two elements: the learning machine space and the supervisory procedure. The former restricts the space of all possible learning machines to a subspace to be browsed by the meta-learning algorithm. The latter determines the order in which the selected learning machines are tested, using a module responsible for evaluating machine complexity; it also organizes the tests and analyzes their results. In this article we present a framework for meta-learning search that can be seen as a sophisticated method of describing and evaluating functional search spaces of learning machine configurations used in meta-learning. Machine spaces are defined by dedicated graphs whose vertices are specialized machine configuration generators. With such graphs, the learning machine space can be modeled much more flexibly, depending on the characteristics of the problem considered and on a priori knowledge. The presented method of search space description is used together with an advanced algorithm which orders test tasks according to their complexities.
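
As a rough illustration of the two elements named in the abstract, the Python sketch below is a hypothetical, minimal rendering of the idea, not the framework from the paper: two graph vertices act as machine configuration generators, the graph (reduced here to a single path) is traversed to enumerate composite configurations, and the resulting test tasks are ordered by a toy complexity estimate. All class names, component names and cost values are assumptions invented for this example.

import heapq
from typing import Dict, Iterable, List, Tuple

# A machine configuration is a tuple of processing steps,
# e.g. ("normalize", "kNN").
Config = Tuple[str, ...]

class GeneratorNode:
    """A graph vertex: extends partial configurations with its options."""
    def __init__(self, name: str, options: Iterable[str]):
        self.name = name
        self.options = list(options)

    def expand(self, partial: Config) -> List[Config]:
        return [partial + (opt,) for opt in self.options]

def enumerate_configs(path: List[GeneratorNode]) -> List[Config]:
    # Walk the generator graph (a simple path here) and collect every
    # complete machine configuration it can produce.
    configs: List[Config] = [()]
    for node in path:
        configs = [c for partial in configs for c in node.expand(partial)]
    return configs

def estimate_complexity(config: Config, costs: Dict[str, float]) -> float:
    # Toy stand-in for a complexity evaluation module: the sum of
    # per-component costs.
    return sum(costs[step] for step in config)

if __name__ == "__main__":
    # Two generator vertices: a transformation stage feeding a classifier stage.
    graph = [
        GeneratorNode("transform", ["none", "normalize", "select_features"]),
        GeneratorNode("classifier", ["kNN", "SVM", "decision_tree"]),
    ]
    costs = {"none": 0.0, "normalize": 1.0, "select_features": 3.0,
             "kNN": 2.0, "SVM": 5.0, "decision_tree": 1.5}

    # The supervisory procedure orders test tasks by estimated
    # complexity, cheapest first.
    queue = [(estimate_complexity(c, costs), c) for c in enumerate_configs(graph)]
    heapq.heapify(queue)
    while queue:
        complexity, config = heapq.heappop(queue)
        print(f"test task {config} (estimated complexity {complexity:.1f})")

Ordering the queue cheapest-first mirrors the role of the complexity-based supervisory procedure: inexpensive candidate machines are tested before costly ones, so useful results can arrive early even when the full search space is large.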

How to cite


Norbert Jankowski. "Graph-based generation of a meta-learning search space." International Journal of Applied Mathematics and Computer Science 22.3 (2012): 647-667. <http://eudml.org/doc/244065>.

@article{NorbertJankowski2012,
abstract = {Meta-learning is becoming increasingly important in current and future research on broadly defined data mining and computational intelligence. It can solve problems that cannot be solved by any single specialized algorithm. The overall behavior of a meta-learning algorithm depends mainly on two elements: the learning machine space and the supervisory procedure. The former restricts the space of all possible learning machines to a subspace to be browsed by the meta-learning algorithm. The latter determines the order in which the selected learning machines are tested, using a module responsible for evaluating machine complexity; it also organizes the tests and analyzes their results. In this article we present a framework for meta-learning search that can be seen as a sophisticated method of describing and evaluating functional search spaces of learning machine configurations used in meta-learning. Machine spaces are defined by dedicated graphs whose vertices are specialized machine configuration generators. With such graphs, the learning machine space can be modeled much more flexibly, depending on the characteristics of the problem considered and on a priori knowledge. The presented method of search space description is used together with an advanced algorithm which orders test tasks according to their complexities.},
author = {Norbert Jankowski},
journal = {International Journal of Applied Mathematics and Computer Science},
keywords = {meta-learning; data mining; learning machines; complexity of learning; complexity of learning machines; computational intelligence},
language = {eng},
number = {3},
pages = {647-667},
title = {Graph-based generation of a meta-learning search space},
url = {http://eudml.org/doc/244065},
volume = {22},
year = {2012},
}

TY - JOUR
AU - Norbert Jankowski
TI - Graph-based generation of a meta-learning search space
JO - International Journal of Applied Mathematics and Computer Science
PY - 2012
VL - 22
IS - 3
SP - 647
EP - 667
AB - Meta-learning is becoming increasingly important in current and future research on broadly defined data mining and computational intelligence. It can solve problems that cannot be solved by any single specialized algorithm. The overall behavior of a meta-learning algorithm depends mainly on two elements: the learning machine space and the supervisory procedure. The former restricts the space of all possible learning machines to a subspace to be browsed by the meta-learning algorithm. The latter determines the order in which the selected learning machines are tested, using a module responsible for evaluating machine complexity; it also organizes the tests and analyzes their results. In this article we present a framework for meta-learning search that can be seen as a sophisticated method of describing and evaluating functional search spaces of learning machine configurations used in meta-learning. Machine spaces are defined by dedicated graphs whose vertices are specialized machine configuration generators. With such graphs, the learning machine space can be modeled much more flexibly, depending on the characteristics of the problem considered and on a priori knowledge. The presented method of search space description is used together with an advanced algorithm which orders test tasks according to their complexities.
LA - eng
KW - meta-learning; data mining; learning machines; complexity of learning; complexity of learning machines; computational intelligence
UR - http://eudml.org/doc/244065
ER -

