Application of agent-based simulated annealing and tabu search procedures to solving the data reduction problem
Ireneusz Czarnowski; Piotr Jędrzejowicz
International Journal of Applied Mathematics and Computer Science (2011)
- Volume: 21, Issue: 1, page 57-68
- ISSN: 1641-876X
Abstract

The problem considered concerns data reduction for machine learning. Data reduction aims at deciding which features and instances from the training set should be retained for further use during the learning process. Data reduction results in increased capabilities and generalization properties of the learning model and a shorter time of the learning process. It can also help in scaling up to large data sources. The paper proposes an agent-based data reduction approach with the learning process executed by a team of agents (A-Team). Several A-Team architectures with agents executing the simulated annealing and tabu search procedures are proposed and investigated. The paper includes a detailed description of the proposed approach and discusses the results of a validating experiment.

How to cite

Ireneusz Czarnowski, and Piotr Jędrzejowicz. "Application of agent-based simulated annealing and tabu search procedures to solving the data reduction problem." International Journal of Applied Mathematics and Computer Science 21.1 (2011): 57-68. <http://eudml.org/doc/208037>.
@article{IreneuszCzarnowski2011,
abstract = {The problem considered concerns data reduction for machine learning. Data reduction aims at deciding which features and instances from the training set should be retained for further use during the learning process. Data reduction results in increased capabilities and generalization properties of the learning model and a shorter time of the learning process. It can also help in scaling up to large data sources. The paper proposes an agent-based data reduction approach with the learning process executed by a team of agents (A-Team). Several A-Team architectures with agents executing the simulated annealing and tabu search procedures are proposed and investigated. The paper includes a detailed description of the proposed approach and discusses the results of a validating experiment.},
author = {Ireneusz Czarnowski, Piotr Jędrzejowicz},
journal = {International Journal of Applied Mathematics and Computer Science},
keywords = {data reduction; machine learning; A-Team; optimization; multi-agent system},
language = {eng},
number = {1},
pages = {57-68},
title = {Application of agent-based simulated annealing and tabu search procedures to solving the data reduction problem},
url = {http://eudml.org/doc/208037},
volume = {21},
year = {2011},
}
TY - JOUR
AU - Ireneusz Czarnowski
AU - Piotr Jędrzejowicz
TI - Application of agent-based simulated annealing and tabu search procedures to solving the data reduction problem
JO - International Journal of Applied Mathematics and Computer Science
PY - 2011
VL - 21
IS - 1
SP - 57
EP - 68
AB - The problem considered concerns data reduction for machine learning. Data reduction aims at deciding which features and instances from the training set should be retained for further use during the learning process. Data reduction results in increased capabilities and generalization properties of the learning model and a shorter time of the learning process. It can also help in scaling up to large data sources. The paper proposes an agent-based data reduction approach with the learning process executed by a team of agents (A-Team). Several A-Team architectures with agents executing the simulated annealing and tabu search procedures are proposed and investigated. The paper includes a detailed description of the proposed approach and discusses the results of a validating experiment.
LA - eng
KW - data reduction; machine learning; A-Team; optimization; multi-agent system
UR - http://eudml.org/doc/208037
ER -
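The abstract describes joint instance and feature selection driven by simulated annealing and tabu search agents working as an A-Team. The paper's actual system is a multi-agent architecture (built on JABAT); the snippet below is only a minimal single-process sketch of one such optimizing component: simulated annealing over a pair of binary masks (instances kept, features kept), scored by a simple 1-NN wrapper on a validation set. All function names, parameters, and the synthetic data are illustrative assumptions, not taken from the paper.

```python
import math
import random

def accuracy_1nn(train, train_labels, test, test_labels, feat_mask):
    """Classify each test point by its nearest training point,
    measuring squared distance only over the selected features."""
    correct = 0
    for x, true_label in zip(test, test_labels):
        best_label, best_dist = None, float("inf")
        for t, t_label in zip(train, train_labels):
            d = sum((a - b) ** 2 for a, b, keep in zip(x, t, feat_mask) if keep)
            if d < best_dist:
                best_dist, best_label = d, t_label
        correct += best_label == true_label
    return correct / len(test)

def anneal_reduce(X, y, X_val, y_val, steps=400, t0=1.0, cooling=0.995, seed=0):
    """Simulated annealing over joint instance/feature selection masks."""
    rng = random.Random(seed)
    n, f = len(X), len(X[0])
    inst_mask = [rng.random() < 0.5 for _ in range(n)]  # which instances to keep
    feat_mask = [True] * f                              # which features to keep

    def fitness(inst, feat):
        train = [x for x, keep in zip(X, inst) if keep]
        labels = [c for c, keep in zip(y, inst) if keep]
        if not train or not any(feat):
            return 0.0  # degenerate reduction: reject outright
        return accuracy_1nn(train, labels, X_val, y_val, feat)

    current = fitness(inst_mask, feat_mask)
    temp = t0
    for _ in range(steps):
        cand_inst, cand_feat = inst_mask[:], feat_mask[:]
        if rng.random() < 0.7:          # usually flip one instance bit...
            k = rng.randrange(n)
            cand_inst[k] = not cand_inst[k]
        else:                           # ...occasionally flip one feature bit
            k = rng.randrange(f)
            cand_feat[k] = not cand_feat[k]
        cand = fitness(cand_inst, cand_feat)
        # accept improvements always, worse moves with Boltzmann probability
        if cand >= current or rng.random() < math.exp((cand - current) / temp):
            inst_mask, feat_mask, current = cand_inst, cand_feat, cand
        temp *= cooling
    return inst_mask, feat_mask, current

# Tiny synthetic problem: class is separable on feature 2; features 0-1 are noise.
def make_data(n, rng):
    X, y = [], []
    for _ in range(n):
        cls = rng.randrange(2)
        X.append([rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(3 * cls, 1)])
        y.append(cls)
    return X, y

rng = random.Random(1)
X_train, y_train = make_data(40, rng)
X_val, y_val = make_data(20, rng)
inst, feat, acc = anneal_reduce(X_train, y_train, X_val, y_val)
print(f"kept {sum(inst)}/40 instances, {sum(feat)}/3 features, accuracy {acc:.2f}")
```

A tabu search agent would explore the same bit-flip neighborhood but move to the best non-tabu neighbor each step while holding recently flipped bits on a tabu list; in the A-Team setting, several such agents read candidate solutions from, and write improved ones back to, a shared memory.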
References
- Aarts, E. and van Laarhoven, P. (1987). Simulated annealing: A pedestrian review of the theory and some applications, in J. Kittler and P.A. Devijver (Eds.), Pattern Recognition and Applications, Springer-Verlag, Berlin. Zbl0643.65028
- Aha, D., Kibler, D. and Albert, M. (1991). Instance-based learning algorithms, Machine Learning 6(1): 37-66.
- Aksela, M. (2007). Adaptive Combinations of Classifiers with Application to On-line Handwritten Character Recognition, Ph.D. thesis, Helsinki University of Technology, Helsinki.
- Asuncion, A. and Newman, D. (2007). Website of the UCI Machine Learning Repository, http://archive.ics.uci.edu/ml/.
- Barbucha, D., Czarnowski, I., Jędrzejowicz, P., Ratajczak-Ropel, E. and Wierzbowska, I. (2009). e-JABAT-An implementation of the Web-based A-Team, in N. Nguyen and L. Jain (Eds.), Intelligent Agents in the Evolution of Web and Applications, Studies in Computational Intelligence, Vol. 167, Springer-Verlag, Berlin/Heidelberg, pp. 57-86.
- Bellifemine, F., Caire, G., Poggi, A. and Rimassa, G. (2003). JADE: A white paper, Technical Report Exp, 3(3), September 2003, pp. 6-19, http://exp.telecomitalialab.com.
- Bezdek, J. and Kuncheva, L. (2001). Nearest prototype classifier designs: An experimental study, International Journal of Intelligent Systems 16(12): 1445-1473. Zbl0997.68125
- Bhanu, B. and Peng, J. (2000). Adaptive integrated image segmentation and object recognition, IEEE Transactions on Systems, Man and Cybernetics 30(4): 427-44.
- Bull, L. (2004). Learning classifier systems: A brief introduction, in L. Bull (Ed.), Applications of Learning Classifier Systems, Studies in Fuzziness and Soft Computing, Springer, Berlin, pp. 3-14.
- Czarnowski, I. (2010). Prototype selection algorithms for distributed learning, Pattern Recognition 43(6): 2292-2300. Zbl1192.68510
- Czarnowski, I. and Jędrzejowicz, P. (2004). An approach to instance reduction in supervised learning, in F. Coenen, A. Preece and A. Macintosh (Eds.), Research and Development in Intelligent Systems XX, Springer, London, pp. 267-282. Zbl1091.68568
- Czarnowski, I. and Jędrzejowicz, P. (2010). An approach to data reduction and integrated machine classification, New Generation Computing 28(1): 21-40. Zbl1191.68493
- Dash, M. and Liu, H. (1997). Feature selection for classification, Intelligent Data Analysis 1(3): 131-156.
- DataSet, C. (2009). Website of the Datasets Used for Classification: Comparison of Results, http://www.is.umk.pl/projects/datasets.html.
- Friedman, M. (1937). The use of ranks to avoid the assumption of normality implicit in the analysis of variance, Journal of the American Statistical Association 32(200): 675-701.
- Glover, F. (1989). Tabu search, Part I, ORSA Journal on Computing 1(3): 190-206.
- Glover, F. (1990). Tabu search, Part II, ORSA Journal on Computing 2(1): 4-32. Zbl0771.90084
- Glover, F. and Laguna, M. (1993). Tabu search, in C. Reeves (Ed.), Modern Heuristic Techniques for Combinatorial Problems, John Wiley and Sons, New York, NY, pp. 70-150. Zbl0774.90033
- Hamo, Y. and Markovitch, S. (2005). The COMPSET algorithm for subset selection, Proceedings of the 19th International Joint Conference on Artificial Intelligence, Edinburgh, UK, pp. 728-733.
- Hart, P. (1968). The condensed nearest neighbour rule, IEEE Transactions on Information Theory 14: 515-516.
- Jędrzejowicz, P. (1999). Social learning algorithm as a tool for solving some difficult scheduling problems, Foundations of Computing and Decision Sciences 24(2): 51-66. Zbl1204.90041
- Kim, S.-W. and Oommen, B. (2003). A brief taxonomy and ranking of creative prototype reduction schemes, Pattern Analysis and Applications 6(3): 232-244.
- Kirkpatrick, S., Gelatt, C. and Vecchi, M. (1983). Optimization by simulated annealing, Science 220(4598): 671-680. Zbl1225.90162
- Kohavi, R. and John, G. (1997). Wrappers for feature subset selection, Artificial Intelligence 97(1-2): 273-324. Zbl0904.68143
- Kuncheva, L. and Bezdek, J. (1998). Nearest prototype classification: Clustering, genetic algorithm or random search?, IEEE Transactions on Systems, Man and Cybernetics 28(1): 160-164.
- Kuncheva, L. and Jain, L. (1999). Nearest-neighbor classifier: Simultaneous editing and feature selection, Pattern Recognition Letters 20(11-13): 1149-1156.
- Michalewicz, Z. (1996). Genetic Algorithms + Data Structures = Evolution Programs, Springer, Berlin. Zbl0841.68047
- Nanni, L. and Lumini, A. (2009). Particle swarm optimization for prototype reduction, Neurocomputing 72(4-6): 1092-1097. Zbl1192.68598
- Quinlan, J. (1993). C4.5: Programs for Machine Learning, Morgan Kaufmann Publishers, San Mateo, CA.
- Raman, B. (2003). Enhancing learning using feature and example selection, Technical report, Texas A&M University, College Station, TX.
- Ritter, G., Woodruff, H., Lowry, S. and Isenhour, T. (1975). An algorithm for a selective nearest decision rule, IEEE Transactions on Information Theory 21(6): 665-669. Zbl0323.68023
- Rozsypal, A. and Kubat, M. (2003). Selecting representative examples and attributes by a genetic algorithm, Intelligent Data Analysis 7(4): 291-304. Zbl1083.68590
- Sahel, Z., Bouchachia, A., Gabrys, B. and Rogers, P. (2007). Adaptive mechanisms for classification problems with drifting data, in B. Apolloni, R. Howlett and L. Jain (Eds.), KES 2007, Lecture Notes in Artificial Intelligence, Vol. 4693, Springer-Verlag, Berlin/Heidelberg, pp. 419-426.
- Skalak, D. (1994). Prototype and feature selection by sampling and random mutation hill climbing algorithm, Proceedings of the International Conference on Machine Learning, New Brunswick, NJ, USA, pp. 293-301.
- Song, H. and Lee, S. (1996). LVQ combined with simulated annealing for optimal design of large set reference models, Neural Networks 9(2): 329-336.
- Talukdar, S., Baerentzen, L., Gove, A. and de Souza, P. (1996). Asynchronous teams: Co-operation schemes for autonomous, computer-based agents, Technical Report TR EDRC 18-59-96, Carnegie Mellon University, Pittsburgh, PA.
- Wilson, D. and Martinez, T. (2000a). An integrated instance-based learning algorithm, Computational Intelligence 16(1): 1-28.
- Wilson, D. and Martinez, T. (2000b). Reduction techniques for instance-based learning algorithms, Machine Learning 38(3): 257-286. Zbl0954.68126
- Zongker, D. and Jain, A. (1996). Algorithms for feature selection: An evaluation, Proceedings of the International Conference on Pattern Recognition, ICPR96, Vienna, Austria, pp. 18-22.
Citations in EuDML Documents
- Bogdan Trawiński, Magdalena Smętek, Zbigniew Telec, Tadeusz Lasota, Nonparametric statistical analysis for multiple comparison of machine learning regression algorithms
- Liming Yuan, Jiafeng Liu, Xianglong Tang, Multiple-instance learning with pairwise instance similarity
- Norbert Jankowski, Graph-based generation of a meta-learning search space
- Piotr Kulczycki, Szymon Łukasik, An algorithm for reducing the dimension and size of a sample for data exploration procedures