Bias-variance decomposition in Genetic Programming

Taras Kowaliw; René Doursat

Open Mathematics (2016)

  • Volume: 14, Issue: 1, page 62-80
  • ISSN: 2391-5455

Abstract

We study properties of Linear Genetic Programming (LGP) through several regression and classification benchmarks. In each problem, we decompose the results into bias and variance components, and explore the effect of varying certain key parameters on the overall error and its decomposed contributions. These parameters are the maximum program size, the initial population, and the function set used. We confirm and quantify several insights into the practical usage of GP, most notably that (a) the variance between runs is primarily due to initialization rather than the selection of training samples, (b) parameters can be reasonably optimized to obtain gains in efficacy, and (c) functions detrimental to evolvability are easily eliminated, while functions well-suited to the problem can greatly improve performance; therefore, larger and more diverse function sets are always preferable.
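The decomposition the abstract refers to splits expected squared error into a squared-bias term and a variance term, estimated by refitting a model over many resampled training sets. As a minimal illustrative sketch (a polynomial least-squares regressor on a toy target, not the paper's LGP setup), the Monte Carlo estimate looks like this:

```python
import numpy as np

# Hedged sketch: Monte Carlo bias-variance decomposition of a polynomial
# regressor on a synthetic target. Uses the standard identity
#   E[(y_hat(x) - f(x))^2] = bias(x)^2 + variance(x)
# averaged over test points, with a fresh noisy training sample each run.

rng = np.random.default_rng(0)

def target(x):
    """Noise-free ground truth (hypothetical toy function)."""
    return np.sin(2 * np.pi * x)

def fit_predict(degree, x_train, y_train, x_test):
    """Fit a degree-d polynomial by least squares, predict at x_test."""
    coeffs = np.polyfit(x_train, y_train, degree)
    return np.polyval(coeffs, x_test)

def bias_variance(degree, n_runs=200, n_train=30, noise=0.3):
    """Estimate (squared bias, variance), each averaged over test points."""
    x_test = np.linspace(0.0, 1.0, 50)
    preds = np.empty((n_runs, x_test.size))
    for r in range(n_runs):
        x_tr = rng.uniform(0.0, 1.0, n_train)       # fresh training sample
        y_tr = target(x_tr) + rng.normal(0.0, noise, n_train)
        preds[r] = fit_predict(degree, x_tr, y_tr, x_test)
    mean_pred = preds.mean(axis=0)                  # average learner
    bias2 = np.mean((mean_pred - target(x_test)) ** 2)
    variance = np.mean(preds.var(axis=0))
    return bias2, variance

for d in (1, 4):
    b2, v = bias_variance(d)
    print(f"degree {d}: bias^2 = {b2:.4f}, variance = {v:.4f}")
```

The low-degree model exhibits high bias and low variance; the higher-degree model trades bias for variance. The paper performs the analogous decomposition for LGP runs, additionally separating variance due to initialization from variance due to training-sample selection.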

How to cite


Taras Kowaliw, and René Doursat. "Bias-variance decomposition in Genetic Programming." Open Mathematics 14.1 (2016): 62-80. <http://eudml.org/doc/276925>.

@article{TarasKowaliw2016,
abstract = {We study properties of Linear Genetic Programming (LGP) through several regression and classification benchmarks. In each problem, we decompose the results into bias and variance components, and explore the effect of varying certain key parameters on the overall error and its decomposed contributions. These parameters are the maximum program size, the initial population, and the function set used. We confirm and quantify several insights into the practical usage of GP, most notably that (a) the variance between runs is primarily due to initialization rather than the selection of training samples, (b) parameters can be reasonably optimized to obtain gains in efficacy, and (c) functions detrimental to evolvability are easily eliminated, while functions well-suited to the problem can greatly improve performance; therefore, larger and more diverse function sets are always preferable.},
author = {Taras Kowaliw, René Doursat},
journal = {Open Mathematics},
keywords = {Analysis of algorithms; Bias-variance decomposition; Classification; Computational learning theory; Evolutionary computation; Genetic programming; Learning and adaptive systems; Non-parametric inference; Regression},
language = {eng},
number = {1},
pages = {62-80},
title = {Bias-variance decomposition in Genetic Programming},
url = {http://eudml.org/doc/276925},
volume = {14},
year = {2016},
}

TY - JOUR
AU - Taras Kowaliw
AU - René Doursat
TI - Bias-variance decomposition in Genetic Programming
JO - Open Mathematics
PY - 2016
VL - 14
IS - 1
SP - 62
EP - 80
AB - We study properties of Linear Genetic Programming (LGP) through several regression and classification benchmarks. In each problem, we decompose the results into bias and variance components, and explore the effect of varying certain key parameters on the overall error and its decomposed contributions. These parameters are the maximum program size, the initial population, and the function set used. We confirm and quantify several insights into the practical usage of GP, most notably that (a) the variance between runs is primarily due to initialization rather than the selection of training samples, (b) parameters can be reasonably optimized to obtain gains in efficacy, and (c) functions detrimental to evolvability are easily eliminated, while functions well-suited to the problem can greatly improve performance; therefore, larger and more diverse function sets are always preferable.
LA - eng
KW - Analysis of algorithms; Bias-variance decomposition; Classification; Computational learning theory; Evolutionary computation; Genetic programming; Learning and adaptive systems; Non-parametric inference; Regression
UR - http://eudml.org/doc/276925
ER -
