Concept of Data Depth and Its Applications

Ondřej Vencálek — 2011

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

Data depth is an important concept in the nonparametric approach to multivariate data analysis. The main aim of the paper is to review possible applications of data depth, including outlier detection, robust and affine-equivariant estimates of location, rank tests for multivariate scale difference, control charts for multivariate processes, and depth-based classifiers solving the discrimination problem.
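As a point of orientation for the applications listed above, here is a minimal sketch of one depth function, the halfspace (Tukey) depth, used for outlier detection. The random-direction approximation, the `halfspace_depth` helper, and the 0.05 cut-off are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch: approximate halfspace (Tukey) depth by scanning random
# directions, then flag low-depth points as outlier candidates.
import numpy as np

def halfspace_depth(x, data, n_dir=500, seed=None):
    """Approximate halfspace depth of point x with respect to a data cloud.

    Depth = min over directions u of the fraction of data points whose
    projection onto u does not exceed the projection of x.
    """
    rng = np.random.default_rng(seed)
    d = data.shape[1]
    directions = rng.normal(size=(n_dir, d))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    proj_data = data @ directions.T          # shape (n, n_dir)
    proj_x = x @ directions.T                # shape (n_dir,)
    fractions = (proj_data <= proj_x).mean(axis=0)
    return fractions.min()

rng = np.random.default_rng(0)
cloud = rng.normal(size=(200, 2))
depths = np.array([halfspace_depth(p, cloud, seed=1) for p in cloud])
outlier_candidates = cloud[depths < 0.05]    # low-depth points sit at the fringe of the cloud
print(len(outlier_candidates), "candidate outliers")
```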

k-Depth-nearest Neighbour Method and its Performance on Skew-normal Distributions

Ondřej Vencálek — 2013

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

In the present paper we investigate the performance of the k-depth-nearest classifier. This classifier, proposed recently by Vencálek, uses the concept of data depth to improve the classification method known as the k-nearest neighbour method. The simulation study presented here deals with the two-class classification problem in which the considered distributions belong to the family of skewed normal distributions.
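A sketch of how such a two-class skew-normal simulation might be set up, assuming the standard stochastic representation of the skew-normal distribution; the shape parameters, sample sizes, and location shift below are illustrative choices, not the paper's design.

```python
# Two-class simulation data from skew-normal distributions, using the standard
# stochastic representation Z = delta*|U0| + sqrt(1 - delta**2)*U1,
# where delta = alpha / sqrt(1 + alpha**2) and U0, U1 are iid N(0, 1).
import numpy as np

def skew_normal(n, alpha, loc=0.0, scale=1.0, seed=None):
    rng = np.random.default_rng(seed)
    delta = alpha / np.sqrt(1.0 + alpha**2)
    u0 = np.abs(rng.normal(size=n))
    u1 = rng.normal(size=n)
    z = delta * u0 + np.sqrt(1.0 - delta**2) * u1
    return loc + scale * z

class_0 = skew_normal(100, alpha=4.0, loc=0.0, seed=42)    # right-skewed class
class_1 = skew_normal(100, alpha=-4.0, loc=1.0, seed=43)   # left-skewed, shifted class
X = np.concatenate([class_0, class_1]).reshape(-1, 1)
y = np.repeat([0, 1], 100)
# X, y can then be fed to a kNN-type classifier for comparison with a depth-based variant.
```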

A depth-based modification of the k-nearest neighbour method

Ondřej Vencálek, Daniel Hlubinka — 2021

Kybernetika

We propose a new nonparametric procedure to solve the problem of classifying objects represented by d-dimensional vectors into K ≥ 2 groups. The newly proposed classifier was inspired by the k-nearest neighbour (kNN) method. It is based on the idea of a depth-based distributional neighbourhood and is called the k nearest depth neighbours (kNDN) classifier. The kNDN classifier has several desirable properties: in contrast to the classical kNN, it can utilize global properties of the considered distributions...
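One way to make the idea of a depth-based neighbourhood concrete is sketched below: every point is represented by its vector of depths with respect to the classes, and the vote is taken among the k training points closest in that depth space. This is only an illustration of the general idea, not the exact kNDN procedure; Mahalanobis depth and the helper names are assumptions chosen for brevity.

```python
# Depth-based neighbourhood classification sketch: vote among the k training
# points whose class-wise depth vectors are closest to that of the query point.
import numpy as np

def mahalanobis_depth(points, sample):
    """Mahalanobis depth of each row of `points` with respect to `sample`."""
    mu = sample.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(sample, rowvar=False))
    diff = points - mu
    md2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
    return 1.0 / (1.0 + md2)

def knn_in_depth_space(x_new, X, y, k=5):
    classes = np.unique(y)
    # depth vector of every training point and of the query point
    train_depths = np.column_stack([mahalanobis_depth(X, X[y == c]) for c in classes])
    new_depths = np.concatenate([mahalanobis_depth(x_new[None, :], X[y == c]) for c in classes])
    dist = np.linalg.norm(train_depths - new_depths, axis=1)
    neighbours = y[np.argsort(dist)[:k]]
    return classes[np.argmax([np.sum(neighbours == c) for c in classes])]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.repeat([0, 1], 50)
print(knn_in_depth_space(np.array([1.5, 1.5]), X, y, k=7))
```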

Weighted halfspace depth

Daniel Hlubinka, Lukáš Kotík, Ondřej Vencálek — 2010

Kybernetika

A generalised halfspace depth function is proposed. Basic properties of this depth function, including strong consistency, are studied. We show on several examples that our depth function may be considered more appropriate for nonsymmetric distributions or for mixtures of distributions.
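A hedged illustration of the weighting idea (the exact weight scheme studied in the paper may differ): replace the indicator "observation lies in the halfspace" by a weight applied to the signed projection, so that the classical halfspace depth is recovered with the indicator weight. The direction-sampling approximation and the particular smooth weight below are assumptions.

```python
# Weighted-halfspace-depth sketch: the depth is the minimum, over directions,
# of the average weight of the signed projections of the data around x.
import numpy as np

def weighted_halfspace_depth(x, data, weight, n_dir=500, seed=0):
    rng = np.random.default_rng(seed)
    d = data.shape[1]
    u = rng.normal(size=(n_dir, d))
    u /= np.linalg.norm(u, axis=1, keepdims=True)
    signed = (data - x) @ u.T              # signed projections, shape (n, n_dir)
    return weight(signed).mean(axis=0).min()

def indicator(s):                          # recovers the classical halfspace depth
    return (s >= 0).astype(float)

def smooth(s):                             # one possible smooth weight (an assumption)
    return 1.0 / (1.0 + np.exp(-5.0 * s))

rng = np.random.default_rng(1)
cloud = np.column_stack([rng.exponential(1.0, 300), rng.normal(0, 1, 300)])  # nonsymmetric data
print(weighted_halfspace_depth(np.array([0.5, 0.0]), cloud, indicator))
print(weighted_halfspace_depth(np.array([0.5, 0.0]), cloud, smooth))
```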

On the optimality of the max-depth and max-rank classifiers for spherical data

Ondřej Vencálek, Houyem Demni, Amor Messaoud, Giovanni C. Porzio — 2020

Applications of Mathematics

The main goal of supervised learning is to construct a function from labeled training data which assigns arbitrary new data points to one of the labels. Classification tasks may be solved by using some measures of data point centrality with respect to the labeled groups considered. Such a measure of centrality is called data depth. In this paper, we investigate conditions under which depth-based classifiers for directional data are optimal. We show that such classifiers are equivalent to the Bayes...
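A minimal sketch of the max-depth rule for data on the unit sphere: a new unit vector is assigned to the class in which it is deepest. The toy angular depth below (mean cosine similarity to the class sample) only stands in for the spherical depth functions considered in the paper and is not the paper's exact depth.

```python
# Max-depth classification on the unit sphere with a toy angular depth.
import numpy as np

def unit(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def angular_depth(x, sample):
    return float(np.mean(sample @ x))      # mean cosine similarity, both unit-norm

def max_depth_classify(x_new, samples_by_class):
    depths = {label: angular_depth(x_new, s) for label, s in samples_by_class.items()}
    return max(depths, key=depths.get)     # class in which the new point is deepest

rng = np.random.default_rng(0)
class_a = unit(rng.normal(loc=[1, 0, 0], scale=0.3, size=(100, 3)))   # cluster near +x
class_b = unit(rng.normal(loc=[0, 1, 0], scale=0.3, size=(100, 3)))   # cluster near +y
x_new = unit(np.array([0.9, 0.2, 0.0]))
print(max_depth_classify(x_new, {"A": class_a, "B": class_b}))
```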
