In the present paper we investigate the performance of the k-depth-nearest classifier. This classifier, recently proposed by Vencálek, uses the concept of data depth to improve the classification method known as the k-nearest neighbour. The simulation study presented here deals with the two-class classification problem in which the considered distributions belong to the family of skewed normal distributions.
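As an illustration of the idea, here is a minimal Python sketch of depth-based nearest-neighbour classification: each observation is represented by its vector of Mahalanobis depths with respect to the two classes, and ordinary k-NN is run in that depth space. The choice of Mahalanobis depth and this DD-plot-style construction are assumptions made for the sketch, not necessarily Vencálek's exact procedure.

    import numpy as np
    from scipy.stats import skewnorm

    def mahalanobis_depth(x, sample):
        # Mahalanobis depth of the rows of x w.r.t. sample: 1 / (1 + d_M^2).
        mu = sample.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(sample, rowvar=False))
        diff = x - mu
        d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
        return 1.0 / (1.0 + d2)

    def depth_knn_predict(x_new, X, y, k=5):
        # k-NN in depth space: each point becomes its pair of depths
        # w.r.t. the two classes (DD-plot coordinates).
        classes = np.unique(y)
        dd_train = np.column_stack([mahalanobis_depth(X, X[y == c]) for c in classes])
        dd_new = np.column_stack([mahalanobis_depth(x_new, X[y == c]) for c in classes])
        preds = []
        for point in dd_new:
            idx = np.argsort(np.linalg.norm(dd_train - point, axis=1))[:k]
            vals, counts = np.unique(y[idx], return_counts=True)
            preds.append(vals[np.argmax(counts)])
        return np.array(preds)

    # Two skew-normal classes, echoing the simulation setting of the abstract.
    rng = np.random.default_rng(0)
    X0 = skewnorm.rvs(a=4, size=(100, 2), random_state=rng)
    X1 = skewnorm.rvs(a=4, loc=1.5, size=(100, 2), random_state=rng)
    X, y = np.vstack([X0, X1]), np.repeat([0, 1], 100)
    print(depth_knn_predict(X[:5], X, y, k=5))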
High-dimensional data models abound in genomics studies, where inadequately small sample sizes often preclude the use of standard statistical tools. Conventional assumptions of linearity of regression, homoscedasticity and (multi-)normality of errors may not be tenable in many such interdisciplinary setups. In this study, Kendall's tau-type rank statistics are employed for statistical inference, largely avoiding parametric assumptions. The proposed procedures...
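For reference, here is a minimal sketch of the classical Kendall's tau on which such tau-type statistics build; the data-generating model below is invented for illustration and deliberately violates linearity, homoscedasticity and normality, none of which the rank statistic requires.

    import numpy as np
    from scipy.stats import kendalltau

    rng = np.random.default_rng(1)
    x = rng.normal(size=30)
    # Monotone signal with heavy-tailed, heteroscedastic noise.
    y = np.exp(x) + rng.standard_t(df=2, size=30) * (1 + np.abs(x))

    tau, p_value = kendalltau(x, y)
    print(f"tau = {tau:.3f}, p = {p_value:.3g}")

    # Direct O(n^2) computation from the definition (continuous data, no ties):
    # tau = (#concordant - #discordant) / C(n, 2)
    n = len(x)
    s = sum(np.sign(x[i] - x[j]) * np.sign(y[i] - y[j])
            for i in range(n) for j in range(i))
    print(s / (n * (n - 1) / 2))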
It turns out that for standard kernel estimators no inequality like that of Dvoretzky-Kiefer-Wolfowitz can be constructed, and as a result it is impossible to answer the question of how many observations are needed to guarantee a prescribed level of accuracy of the estimator. A remedy is to adapt the bandwidth to the sample at hand.
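By contrast, the Dvoretzky-Kiefer-Wolfowitz inequality (with Massart's tight constant) does give such a distribution-free guarantee for the empirical distribution function: P(sup_x |F_n(x) - F(x)| > eps) <= 2 exp(-2 n eps^2), hence n >= ln(2/alpha) / (2 eps^2) observations suffice for accuracy eps at confidence 1 - alpha. A small Python sketch of this classical bound (not of the kernel case discussed in the paper):

    import math

    def dkw_sample_size(eps, alpha):
        # Smallest n with 2 * exp(-2 * n * eps**2) <= alpha, i.e. the
        # DKW/Massart guarantee P(sup_x |F_n(x) - F(x)| > eps) <= alpha.
        return math.ceil(math.log(2 / alpha) / (2 * eps ** 2))

    # Pin the empirical CDF within 0.05 everywhere, with 95% confidence:
    print(dkw_sample_size(eps=0.05, alpha=0.05))  # -> 738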
We derive a two-sample Kolmogorov-Smirnov-type test for the case in which a nuisance linear regression is present. The test is based on regression rank scores and provides a natural extension of the classical Kolmogorov-Smirnov test. Its asymptotic distributions under the hypothesis and under local alternatives coincide with those of the classical test.
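For orientation, here is the classical two-sample Kolmogorov-Smirnov test, i.e. the special case with no nuisance regression; the regression-rank-score construction of the paper is not reproduced here.

    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(2)
    # In the paper's setting the responses would first be adjusted for the
    # nuisance linear regression via regression rank scores.
    sample1 = rng.normal(loc=0.0, size=80)
    sample2 = rng.normal(loc=0.5, size=120)

    result = ks_2samp(sample1, sample2)
    print(f"D = {result.statistic:.3f}, p = {result.pvalue:.3g}")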