Towards a theory of practice in metaheuristics design: A machine learning perspective

Mauro Birattari; Mark Zlochin; Marco Dorigo

RAIRO - Theoretical Informatics and Applications (2006)

  • Volume: 40, Issue: 2, page 353-369
  • ISSN: 0988-3754

Abstract

A number of methodological papers published in recent years testify that the operations research community feels the need for a thorough revision of its research methodology – see, for example, [Barr et al., J. Heuristics 1 (1995) 9–32; Eiben and Jelasity, Proceedings of the 2002 Congress on Evolutionary Computation (CEC'2002) 582–587; Hooker, J. Heuristics 1 (1995) 33–42; Rardin and Uzsoy, J. Heuristics 7 (2001) 261–304]. In particular, the performance evaluation of nondeterministic methods, including widely studied metaheuristics such as evolutionary computation and ant colony optimization, requires the definition of new experimental protocols. A careful and thorough analysis of the problem of evaluating metaheuristics reveals strong similarities between this problem and that of evaluating learning methods in the machine learning field. In this paper, we show that several conceptual tools commonly used in machine learning – such as the probabilistic notion of a class of instances and the separation between training and testing datasets – fit naturally in the context of metaheuristics evaluation. Accordingly, we propose and discuss some principles, inspired by the experimental practice in machine learning, for guiding the performance evaluation of optimization algorithms. Among these principles, a clear separation between the instances used for tuning algorithms and those used in the actual evaluation is particularly important for a proper assessment.
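The separation principle highlighted in the abstract – tune on one set of instances, evaluate on a disjoint held-out set – can be illustrated with a minimal sketch. All names and the toy "metaheuristic" below are illustrative assumptions, not taken from the paper:

```python
import random

# Toy "instance": a target vector. The toy "metaheuristic" is a stochastic
# hill climber whose solution quality depends on a step-size parameter.
# Everything here is a hypothetical illustration of the tuning/evaluation
# protocol, not the authors' method.

def make_instances(n, rng):
    return [[rng.uniform(-5, 5) for _ in range(10)] for _ in range(n)]

def solve(instance, step, rng, iters=200):
    # Cost = squared distance to the target vector; accept improving moves.
    x = [0.0] * len(instance)
    cost = sum((a - b) ** 2 for a, b in zip(x, instance))
    for _ in range(iters):
        cand = [a + rng.gauss(0, step) for a in x]
        c = sum((a - b) ** 2 for a, b in zip(cand, instance))
        if c < cost:
            x, cost = cand, c
    return cost

def mean_cost(instances, step, rng):
    return sum(solve(inst, step, rng) for inst in instances) / len(instances)

rng = random.Random(42)
train = make_instances(20, rng)   # instances used ONLY for tuning
test = make_instances(20, rng)    # disjoint instances used ONLY for evaluation

# Tuning phase: select the step size that performs best on the training set.
candidates = [0.01, 0.1, 0.5, 1.0, 2.0]
best_step = min(candidates, key=lambda s: mean_cost(train, s, rng))

# Evaluation phase: report performance on the held-out test set only.
print("selected step:", best_step)
print("test-set mean cost:", mean_cost(test, best_step, rng))
```

Reporting the tuned parameter's performance on `train` itself would overstate it, for the same reason that reporting training-set accuracy overstates a classifier's performance; the held-out `test` set gives an unbiased estimate over the class of instances.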

How to cite


Birattari, Mauro, Zlochin, Mark, and Dorigo, Marco. "Towards a theory of practice in metaheuristics design: A machine learning perspective." RAIRO - Theoretical Informatics and Applications 40.2 (2006): 353-369. <http://eudml.org/doc/249732>.

@article{Birattari2006,
abstract = { A number of methodological papers published in recent years testify that the operations research community feels the need for a thorough revision of its research methodology – see, for example, [Barr et al., J. Heuristics 1 (1995) 9–32; Eiben and Jelasity, Proceedings of the 2002 Congress on Evolutionary Computation (CEC'2002) 582–587; Hooker, J. Heuristics 1 (1995) 33–42; Rardin and Uzsoy, J. Heuristics 7 (2001) 261–304]. In particular, the performance evaluation of nondeterministic methods, including widely studied metaheuristics such as evolutionary computation and ant colony optimization, requires the definition of new experimental protocols. A careful and thorough analysis of the problem of evaluating metaheuristics reveals strong similarities between this problem and that of evaluating learning methods in the machine learning field. In this paper, we show that several conceptual tools commonly used in machine learning – such as the probabilistic notion of a class of instances and the separation between training and testing datasets – fit naturally in the context of metaheuristics evaluation. Accordingly, we propose and discuss some principles, inspired by the experimental practice in machine learning, for guiding the performance evaluation of optimization algorithms. Among these principles, a clear separation between the instances used for tuning algorithms and those used in the actual evaluation is particularly important for a proper assessment. },
author = {Birattari, Mauro and Zlochin, Mark and Dorigo, Marco},
journal = {RAIRO - Theoretical Informatics and Applications},
language = {eng},
month = {7},
number = {2},
pages = {353-369},
publisher = {EDP Sciences},
title = {Towards a theory of practice in metaheuristics design: A machine learning perspective},
url = {http://eudml.org/doc/249732},
volume = {40},
year = {2006},
}

TY - JOUR
AU - Birattari, Mauro
AU - Zlochin, Mark
AU - Dorigo, Marco
TI - Towards a theory of practice in metaheuristics design: A machine learning perspective
JO - RAIRO - Theoretical Informatics and Applications
DA - 2006/7//
PB - EDP Sciences
VL - 40
IS - 2
SP - 353
EP - 369
AB - A number of methodological papers published in recent years testify that the operations research community feels the need for a thorough revision of its research methodology – see, for example, [Barr et al., J. Heuristics 1 (1995) 9–32; Eiben and Jelasity, Proceedings of the 2002 Congress on Evolutionary Computation (CEC'2002) 582–587; Hooker, J. Heuristics 1 (1995) 33–42; Rardin and Uzsoy, J. Heuristics 7 (2001) 261–304]. In particular, the performance evaluation of nondeterministic methods, including widely studied metaheuristics such as evolutionary computation and ant colony optimization, requires the definition of new experimental protocols. A careful and thorough analysis of the problem of evaluating metaheuristics reveals strong similarities between this problem and that of evaluating learning methods in the machine learning field. In this paper, we show that several conceptual tools commonly used in machine learning – such as the probabilistic notion of a class of instances and the separation between training and testing datasets – fit naturally in the context of metaheuristics evaluation. Accordingly, we propose and discuss some principles, inspired by the experimental practice in machine learning, for guiding the performance evaluation of optimization algorithms. Among these principles, a clear separation between the instances used for tuning algorithms and those used in the actual evaluation is particularly important for a proper assessment.
LA - eng
UR - http://eudml.org/doc/249732
ER -

References

  1. R.S. Barr, B.L. Golden, J.P. Kelly, M.G.C. Resende and W.R. Stewart, Designing and reporting computational experiments with heuristic methods. J. Heuristics 1 (1995) 9–32.
  2. M. Birattari, The Problem of Tuning Metaheuristics, as Seen from a Machine Learning Perspective. Ph.D. thesis, Université Libre de Bruxelles, Brussels, Belgium (2004).  
  3. M. Birattari, T. Stützle, L. Paquete and K. Varrentrapp, A racing algorithm for configuring metaheuristics, in Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2002), edited by W.B. Langdon, et al. Morgan Kaufmann Publishers, San Francisco, CA (2002) 11–18.  
  4. M. Demange, P. Grisoni and V.Th. Paschos, Differential approximation algorithms for some combinatorial optimization problems. Theoret. Comput. Sci. 209 (1998) 107–122.
  5. M. Dorigo and C. Blum, Ant colony optimization theory: A survey. Theoret. Comput. Sci. 344 (2005) 243–278.
  6. M. Dorigo and G. Di Caro, The Ant Colony Optimization meta-heuristic, in New Ideas in Optimization, edited by D. Corne, M. Dorigo and F. Glover. McGraw Hill, London, UK (1999) 11–32.  
  7. M. Dorigo, G. Di Caro and L.M. Gambardella, Ant algorithms for discrete optimization. Artificial Life 5 (1999) 137–172.
  8. M. Dorigo, V. Maniezzo and A. Colorni, Ant System: Optimization by a colony of cooperating agents. IEEE Trans. Systems, Man, and Cybernetics – Part B 26 (1996) 29–41.
  9. M. Dorigo and T. Stützle, Ant Colony Optimization. MIT Press, Cambridge, MA (2004).  
  10. A.E. Eiben and M. Jelasity, A Critical Note on Experimental Research Methodology in EC, in Proceedings of the 2002 Congress on Evolutionary Computation (CEC'2002), Piscataway, NJ, IEEE Press (2002) 582–587.  
  11. L.J. Fogel, A.J. Owens and M.J. Walsh, Artificial Intelligence Through Simulated Evolution. John Wiley & Sons, New York, NY (1966).  
  12. M.R. Garey and D.S. Johnson, Computers and Intractability: A Guide to the Theory of NP-Completeness. Freeman, San Francisco, CA (1979).
  13. F. Glover, Tabu search – part I. ORSA J. Comput. 1 (1989) 190–206.
  14. F. Glover, Tabu search – part II. ORSA J. Comput. 2 (1990) 4–32.
  15. D.E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, Reading, MA (1989).  
  16. R. Hassin and S. Khuller, Z-approximations. J. Algorithms 41 (2001) 429–442.
  17. J. Holland, Adaptation in Natural and Artificial Systems. University of Michigan Press, Ann Arbor, MI (1975).  
  18. J.N. Hooker, Testing heuristics: we have it all wrong. J. Heuristics 1 (1995) 33–42.
  19. D.S. Johnson, A theoretician's guide to the experimental analysis of algorithms, in Data structures, near neighbor searches, and methodology: 5th and 6th DIMACS implementation challenges. American Mathematical Society, Providence, RI (2002) 215–250.  
  20. S.A. Kauffman, The Origins of Order. Self-Organization and Selection in Evolution. Oxford University Press, Oxford, UK (1993).  
  21. S. Kirkpatrick, C.D. Gelatt Jr. and M.P. Vecchi, Optimization by simulated annealing. Science 220 (1983) 671–680.
  22. K. Liang, X. Yao and C. Newton, Adapting self-adaptive parameters in evolutionary algorithms. Appl. Intell. 15 (2001) 171–180.
  23. H.R. Lourenço, O. Martin and T. Stützle, Iterated local search, in Handbook of Metaheuristics. International Series in Operations Research & Management Science, edited by F. Glover and G. Kochenberger. Kluwer Academic Publishers, Norwell, MA 57 (2002) 321–353.  
  24. D.G. Luenberger, Introduction to Linear and Nonlinear Programming. Addison-Wesley Publishing Company, Reading, MA (1973).  
  25. O. Maron and A.W. Moore, Hoeffding races: Accelerating model selection search for classification and function approximation, in Advances in Neural Information Processing Systems, edited by J.D. Cowan, G. Tesauro and J. Alspector. Morgan Kaufmann Publishers, San Francisco, CA 6 (1994) 59–66.  
  26. C.C. McGeoch, Toward an experimental method for algorithm simulation. INFORMS J. Comput. 2 (1996) 1–15.
  27. Z. Michalewicz and D.B. Fogel, How to Solve it: Modern Heuristics. Springer-Verlag, Berlin, Germany (2000).  
  28. A.W. Moore and M.S. Lee, Efficient algorithms for minimizing cross validation error, in International Conference on Machine Learning. Morgan Kaufmann Publishers, San Francisco, CA (1994) 190–198.  
  29. B.L. Nelson, J. Swann, D. Goldsman and W. Song, Simple procedures for selecting the best simulated system when the number of alternatives is large. Oper. Res. 49 (2001) 950–963.
  30. R.R. Rardin and R. Uzsoy, Experimental evaluation of heuristic optimization algorithms: A tutorial. J. Heuristics 7 (2001) 261–304.
  31. I. Rechenberg, Evolutionsstrategie – Optimierung technischer Systeme nach Prinzipien der biologischen Evolution. Fromman Verlag, Freiburg, Germany (1973).
  32. H.-P. Schwefel, Numerical Optimization of Computer Models. John Wiley & Sons, Chichester, UK (1981).  
  33. I. Sommerville, Software Engineering. Addison Wesley, Harlow, UK, sixth edition (2001).  
  34. M. Toussaint, Self-adaptive exploration in evolutionary search. Technical Report IRINI-2001-05, Institut für Neuroinformatik, Ruhr-Universität Bochum, Bochum, Germany (2001).  
  35. D.H. Wolpert and W.G. Macready, No free lunch theorems for optimization. IEEE Trans. Evolutionary Computation 1 (1997) 67–82.
  36. E. Zemel, Measuring the quality of approximate solutions to zero-one programming problems. Math. Oper. Res. 6 (1981) 319–332.
  37. M. Zlochin, M. Birattari, N. Meuleau and M. Dorigo, Model-based search for combinatorial optimization: A critical survey. Ann. Oper. Res. 131 (2004) 375–395.
  38. M. Zlochin and M. Dorigo, Model based search for combinatorial optimization: A comparative study, in Parallel Problem Solving from Nature – PPSN VII, edited by J.J. Merelo Guervós et al. Springer Verlag, Berlin, Germany (2002) 651–661.
