Displaying similar documents to “A complete gradient clustering algorithm formed with kernel estimators”

Application of agent-based simulated annealing and tabu search procedures to solving the data reduction problem

Ireneusz Czarnowski, Piotr Jędrzejowicz (2011)

International Journal of Applied Mathematics and Computer Science

Similarity:

The problem considered concerns data reduction for machine learning. Data reduction aims at deciding which features and instances from the training set should be retained for further use during the learning process. Data reduction improves the capabilities and generalization properties of the learning model and shortens the learning process. It can also help in scaling up to large data sources. The paper proposes an agent-based data reduction approach with the learning...
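
The agent-based simulated annealing and tabu search procedures themselves are not reproduced in this excerpt. As a rough, hedged illustration of what instance reduction means here (keeping only the training instances needed for further learning), a condensed-nearest-neighbour style sketch in Python might look as follows; the greedy 1-NN rule is an assumed stand-in, not the authors' method.

# Illustrative instance selection (not the agent-based algorithm of the paper):
# keep a reduced set of instances that still classifies the remaining
# instances correctly with a 1-NN rule.
import numpy as np

def reduce_instances(X, y, seed=0):
    """Greedy, CNN-style instance reduction (illustrative assumption)."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(X))
    keep = [order[0]]                      # start with one arbitrary instance
    for i in order[1:]:
        # classify instance i with 1-NN over the instances kept so far
        d = np.linalg.norm(X[keep] - X[i], axis=1)
        if y[keep][np.argmin(d)] != y[i]:  # misclassified -> must be kept
            keep.append(i)
    return np.array(keep)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (100, 2))])
    y = np.array([0] * 100 + [1] * 100)
    idx = reduce_instances(X, y)
    print(f"retained {len(idx)} of {len(X)} training instances")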

Detecting a data set structure through the use of nonlinear projections search and optimization

Victor L. Brailovsky, Michael Har-Even (1998)

Kybernetika

Similarity:

The problem of detecting a cluster structure is considered. This means solving either the problem of discovering a natural decomposition of data points into groups (clusters) or the problem of detecting clouds of data points of a specific form. Both problems are considered in this paper. To discover a cluster structure of a specific arrangement, or a cloud of data of a specific form, a class of nonlinear projections is introduced. Fitness functions that estimate to what extent a given subset of...
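
The specific projections and fitness functions are not given in this excerpt. As a loose illustration of the general idea (apply a nonlinear mapping, then score how well a candidate grouping separates the data), a Python sketch might look like this; the quadratic feature map and the silhouette-based fitness below are assumptions for illustration, not the definitions from the paper.

# Illustrative only: a nonlinear mapping followed by a fitness function
# that rates a candidate cluster structure.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def quadratic_map(X):
    """Map 2-D points to (x, y, x^2, y^2, x*y): a simple nonlinear projection."""
    x, y = X[:, 0], X[:, 1]
    return np.column_stack([x, y, x**2, y**2, x * y])

def fitness(X, k):
    """Fitness of a k-cluster hypothesis evaluated after the nonlinear mapping."""
    Z = quadratic_map(X)
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(Z)
    return silhouette_score(Z, labels)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # a ring-shaped cloud plus a compact blob: structure a linear view can miss
    angles = rng.uniform(0, 2 * np.pi, 200)
    ring = np.column_stack([3 * np.cos(angles), 3 * np.sin(angles)]) + rng.normal(0, 0.1, (200, 2))
    blob = rng.normal(0, 0.3, (200, 2))
    X = np.vstack([ring, blob])
    for k in (2, 3, 4):
        print(k, round(fitness(X, k), 3))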

Graphics processing units in acceleration of bandwidth selection for kernel density estimation

Witold Andrzejewski, Artur Gramacki, Jarosław Gramacki (2013)

International Journal of Applied Mathematics and Computer Science

Similarity:

The Probability Density Function (PDF) is a key concept in statistics. Constructing the most adequate PDF from the observed data is still an important and interesting scientific problem, especially for large datasets. PDFs are often estimated using nonparametric data-driven methods. One of the most popular nonparametric methods is the Kernel Density Estimator (KDE). However, a very serious drawback of using KDEs is the large number of calculations required to compute them, especially...
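
To see where the computational cost comes from, a direct Gaussian KDE evaluates a kernel for every (data point, grid point) pair, i.e. O(n * m) operations, which is exactly what GPU acceleration targets for large data sets. A minimal CPU sketch in NumPy, using Silverman's rule of thumb as one simple bandwidth selector (an illustrative choice, not the selector studied in the paper), might look like this:

# Naive 1-D Gaussian kernel density estimate: O(n * m) kernel evaluations.
import numpy as np

def silverman_bandwidth(x):
    """Rule-of-thumb bandwidth (one simple selector among many)."""
    return 1.06 * x.std(ddof=1) * len(x) ** (-1 / 5)

def gaussian_kde(x, grid, h):
    """Evaluate the KDE on 'grid' given sample 'x' and bandwidth 'h'."""
    u = (grid[:, None] - x[None, :]) / h          # (m, n) scaled distances
    k = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)  # Gaussian kernel values
    return k.mean(axis=1) / h

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2, 1, 5000), rng.normal(3, 0.5, 5000)])
    grid = np.linspace(-6, 6, 512)
    h = silverman_bandwidth(x)
    pdf = gaussian_kde(x, grid, h)
    print(f"bandwidth {h:.3f}, integral ~ {pdf.sum() * (grid[1] - grid[0]):.3f}")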

Bayes sharpening of imprecise information

Piotr Kulczycki, Małgorzata Charytanowicz (2005)

International Journal of Applied Mathematics and Computer Science

Similarity:

A complete algorithm is presented for the sharpening of imprecise information, based on the methodology of kernel estimators and the Bayes decision rule, including conditioning factors. The use of the Bayes rule with a nonsymmetrical loss function enables different consequences of underestimation and overestimation of a sharp value (a real number) to be taken into account, as well as minimizing potential losses. A conditional approach makes it possible to obtain a more precise result thanks to using information entered...
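
The complete algorithm is not reproduced in this excerpt. To illustrate the role of the nonsymmetrical loss, a small numerical sketch: with a linear loss that charges a per unit of underestimation and b per unit of overestimation, the value minimizing the expected loss under the kernel-estimated distribution is its a/(a+b) quantile, so the sharpened value shifts away from the costlier error. The data and cost ratio below are invented purely for illustration.

# Sharpening with an asymmetric linear loss:
#   L(est, true) = a * (true - est)  if est < true  (underestimation)
#                = b * (est - true)  otherwise      (overestimation)
# The minimizer of the expected loss is the a/(a+b) quantile of the
# kernel-estimated distribution.
import numpy as np

def expected_loss(est, support, pdf, a, b):
    under = support > est                        # true value above the estimate
    loss = np.where(under, a * (support - est), b * (est - support))
    return (loss * pdf).sum() * (support[1] - support[0])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(10.0, 2.0, 400)            # imprecise observations (made up)
    support = np.linspace(0, 20, 1001)
    h = 1.06 * data.std(ddof=1) * len(data) ** (-1 / 5)
    pdf = np.exp(-0.5 * ((support[:, None] - data) / h) ** 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

    a, b = 3.0, 1.0                              # underestimation three times as costly
    losses = [expected_loss(e, support, pdf, a, b) for e in support]
    best = support[int(np.argmin(losses))]
    print(f"sharpened value {best:.2f} (sample mean {data.mean():.2f}); "
          f"it sits near the {a / (a + b):.2f} quantile")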