Displaying 1 – 4 of 4

Bayesian estimation of mixtures with dynamic transitions and known component parameters

Ivan Nagy, Evgenia Suzdaleva, Miroslav Kárný (2011)

Kybernetika

Probabilistic mixtures provide flexible “universal” approximation of probability density functions. Their wide use is enabled by the availability of a range of efficient estimation algorithms. Among them, quasi-Bayesian estimation plays a prominent role as it runs “naturally” in one-pass mode. This is important in on-line applications and/or extensive databases. It even copes with the dynamic nature of the components forming the mixture. However, the quasi-Bayesian estimation relies on mixing via constant...
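The one-pass idea behind quasi-Bayesian mixture estimation can be illustrated with a minimal sketch. This is not the authors' algorithm, only an assumed simplification: two Gaussian components with known parameters, and Dirichlet pseudo-counts for the mixing weights updated recursively by each observation's responsibilities. The component parameters and the data stream are invented for illustration.

```python
import math

# Two hypothetical Gaussian components with KNOWN (mean, std) parameters;
# only the mixing weights are estimated.
COMPONENTS = [(0.0, 1.0), (5.0, 1.0)]

def gauss_pdf(x, mean, std):
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def quasi_bayes_weights(stream, prior_counts):
    """One-pass update of Dirichlet pseudo-counts, one observation at a time."""
    counts = list(prior_counts)
    for x in stream:
        weights = [c / sum(counts) for c in counts]
        likes = [w * gauss_pdf(x, m, s) for w, (m, s) in zip(weights, COMPONENTS)]
        total = sum(likes)
        # responsibility of each component for x is added to its pseudo-count
        counts = [c + l / total for c, l in zip(counts, likes)]
    return [c / sum(counts) for c in counts]

stream = [0.1, -0.3, 5.2, 4.8, 0.0, 5.1]  # toy data, half near each component
print(quasi_bayes_weights(stream, [1.0, 1.0]))
```

Because each observation is absorbed into the pseudo-counts and then discarded, the estimator needs only constant memory, which is what makes the one-pass mode attractive for on-line use.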

Bias-variance decomposition in Genetic Programming

Taras Kowaliw, René Doursat (2016)

Open Mathematics

We study properties of Linear Genetic Programming (LGP) through several regression and classification benchmarks. In each problem, we decompose the results into bias and variance components, and explore the effect of varying certain key parameters on the overall error and its decomposed contributions. These parameters are the maximum program size, the initial population, and the function set used. We confirm and quantify several insights into the practical usage of GP, most notably that (a) the...
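The bias-variance decomposition used in the abstract can be sketched empirically. The following is a hedged illustration, not the paper's LGP setup: a deliberately weak model (predict the mean of the training targets) is refit on many resampled training sets, and the squared error at a test point is split into a bias² term and a variance term. The target function, noise level, and sample sizes are arbitrary choices.

```python
import random

def target(x):
    return x * x  # assumed "true" function we try to learn

def fit_mean(train):
    """A deliberately weak model: predict the mean of the training targets."""
    ys = [y for _, y in train]
    return sum(ys) / len(ys)

def bias_variance(x0, n_trials=2000, n_train=20, noise=0.1, seed=0):
    """Monte-Carlo estimate of (bias^2, variance) of the predictor at x0."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_trials):
        train = [(x, target(x) + rng.gauss(0, noise))
                 for x in (rng.uniform(-1, 1) for _ in range(n_train))]
        preds.append(fit_mean(train))
    mean_pred = sum(preds) / len(preds)
    bias_sq = (mean_pred - target(x0)) ** 2
    variance = sum((p - mean_pred) ** 2 for p in preds) / len(preds)
    return bias_sq, variance

print(bias_variance(x0=1.0))
```

For this constant-output model the bias term dominates at x0 = 1, while the variance stays small; richer model classes trade the two terms the other way, which is the effect the paper measures as program size and function set vary.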

Building adaptive tests using Bayesian networks

Jiří Vomlel (2004)

Kybernetika

We propose a framework for building decision strategies using Bayesian network models and discuss its application to adaptive testing. Dynamic programming and the AO* algorithm are used to find optimal adaptive tests. The proposed AO* algorithm is based on a new admissible heuristic function.
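The dynamic-programming side of this idea can be sketched on a toy problem. This is not Vomlel's AO* construction, only an assumed miniature: a handful of hypothetical skill profiles, a few binary questions (each given as the set of profiles that answer "yes"), and a recursion over sets of remaining profiles that minimises the expected number of questions needed to identify the student's profile.

```python
from functools import lru_cache

PROFILES = range(4)                   # hypothetical skill profiles, equally likely
QUESTIONS = [{0, 1}, {0, 2}, {1, 3}]  # "yes"-sets of three hypothetical items

@lru_cache(maxsize=None)
def expected_cost(remaining):
    """Minimal expected question count to single out a profile in `remaining`."""
    if len(remaining) <= 1:
        return 0.0
    best = float("inf")
    for q in QUESTIONS:
        yes = frozenset(r for r in remaining if r in q)
        no = remaining - yes
        if not yes or not no:
            continue  # question does not split this set, so it is uninformative
        p_yes = len(yes) / len(remaining)
        cost = 1 + p_yes * expected_cost(yes) + (1 - p_yes) * expected_cost(no)
        best = min(best, cost)
    return best

print(expected_cost(frozenset(PROFILES)))
```

The recursion enumerates every reachable belief set, which is exact but exponential in general; this blow-up is what motivates heuristic search methods such as AO* with an admissible heuristic in the paper.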
