Displaying similar documents to “Algorithm 84. The choice of representative variables by stepwise regression”

Stacked regression with restrictions

Tomasz Górecki (2005)

Discussiones Mathematicae Probability and Statistics

Similarity:

When we apply stacked regression to classification we need only discriminant indices, which can be negative. In many situations we want these indices to be positive, e.g. when we want to use them to compute posterior probabilities, or when we want to use stacked regression to combine classifiers. In such situations we have to use least squares regression under the constraint βₖ ≥ 0, k = 1, 2, ..., K. In their earlier work [5], LeBlanc and Tibshirani used an algorithm given in [4]. However,...
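The constrained problem described above (least squares with βₖ ≥ 0) is the classical non-negative least squares problem. A minimal sketch, using SciPy's `nnls` solver rather than the specific algorithms of [4] or [5]; the data here are hypothetical stand-ins for base-classifier outputs:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical data: columns of X play the role of predictions from
# K = 3 base classifiers; y is the target to be stacked.
rng = np.random.default_rng(0)
X = rng.random((20, 3))
y = X @ np.array([0.5, 0.3, 0.2]) + 0.01 * rng.standard_normal(20)

# Solve min ||X beta - y||_2 subject to beta_k >= 0 for all k.
beta, residual_norm = nnls(X, y)
print(beta)  # every entry is non-negative by construction
```

Because the weights are non-negative, they can be rescaled to sum to one and then read as a convex combination of the base classifiers' posterior estimates.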

Directional quantile regression in Octave (and MATLAB)

Pavel Boček, Miroslav Šiman (2016)

Kybernetika

Similarity:

Although many words have been written about two recent directional (regression) quantile concepts, their applications, and the algorithms for computing the associated (regression) quantile regions, their software implementation is still not widely available, which, of course, severely hinders the dissemination of both methods. Wanting to partly fill this gap, we provide all the code needed for computing and plotting the multivariate (regression) quantile regions in Octave and MATLAB,...

An adaptive method of estimation and outlier detection in regression applicable for small to moderate sample sizes

Brenton R. Clarke (2000)

Discussiones Mathematicae Probability and Statistics

Similarity:

In small to moderate sample sizes it is important to make use of all the data when there are no outliers, for reasons of efficiency. It is equally important to guard against the possibility that there may be single or multiple outliers which can have disastrous effects on normal theory least squares estimation and inference. The purpose of this paper is to describe and illustrate the use of an adaptive regression estimation algorithm which can be used to highlight outliers, either single...

Fitting a linear regression model by combining least squares and least absolute value estimation.

Sira Allende, Carlos Bouza, Isidro Romero (1995)

Qüestiió

Similarity:

Robust estimation of the multiple regression model is performed using a convex combination of the Least Squares and Least Absolute Value criteria. A Bicriterion Parametric algorithm is developed for computing the corresponding estimates. The proposed procedure should be especially useful when outliers are expected. Its behavior is analyzed using some examples.
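A minimal sketch of the combined criterion itself (not the authors' Bicriterion Parametric algorithm): for a mixing weight λ ∈ [0, 1], minimize λ·Σrᵢ² + (1−λ)·Σ|rᵢ| over the coefficient vector, here by a generic numerical optimizer started from the ordinary least squares fit. Data and λ are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

def combined_fit(X, y, lam=0.5):
    """Minimize lam * sum(r^2) + (1 - lam) * sum(|r|), r = y - X @ beta."""
    def objective(beta):
        r = y - X @ beta
        return lam * np.sum(r**2) + (1.0 - lam) * np.sum(np.abs(r))
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]  # start from the LS solution
    return minimize(objective, beta0, method="Nelder-Mead").x

# Illustrative data: intercept + slope 2, with one gross outlier in y.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(30), rng.standard_normal(30)])
y = X @ np.array([1.0, 2.0]) + rng.standard_normal(30)
y[0] += 20.0
beta_hat = combined_fit(X, y, lam=0.3)
print(beta_hat)
```

Pulling λ toward 0 emphasizes the absolute-value criterion and so downweights the outlier; λ = 1 recovers ordinary least squares.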

Note on universal algorithms for learning theory

Karol Dziedziul, Barbara Wolnik (2007)

Applicationes Mathematicae

Similarity:

We study the universal estimator for the regression problem in learning theory considered by Binev et al. This new approach allows us to improve their results.

On the Equivalence between Orthogonal Regression and Linear Model with Type-II Constraints

Sandra Donevska, Eva Fišerová, Karel Hron (2011)

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

Similarity:

Orthogonal regression, also known as the total least squares method, regression with errors-in-variables, or a calibration problem, analyzes the linear relationship between variables. In contrast to standard regression, both the dependent and the explanatory variables account for measurement errors. In this paper we briefly discuss the orthogonal least squares, the least squares and the maximum likelihood methods for estimation of the orthogonal regression line. We also show that all mentioned...
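A minimal sketch of the orthogonal regression line in the simplest two-variable case: the direction minimizing the sum of squared orthogonal distances is the leading principal direction of the centred data, obtainable from an SVD. The simulated errors-in-variables data are assumptions for illustration:

```python
import numpy as np

def orthogonal_regression_line(x, y):
    """Fit the total least squares (orthogonal) line via SVD."""
    pts = np.column_stack([x, y])
    centre = pts.mean(axis=0)
    # First right-singular vector of the centred data = line direction.
    _, _, vt = np.linalg.svd(pts - centre)
    direction = vt[0]
    slope = direction[1] / direction[0]
    intercept = centre[1] - slope * centre[0]
    return slope, intercept

# Errors-in-variables data: both coordinates observed with noise.
rng = np.random.default_rng(2)
x_true = np.linspace(0.0, 10.0, 50)
x = x_true + 0.1 * rng.standard_normal(50)
y = 3.0 * x_true + 1.0 + 0.1 * rng.standard_normal(50)
slope, intercept = orthogonal_regression_line(x, y)
print(slope, intercept)  # close to the true slope 3 and intercept 1
```

Unlike ordinary least squares, this fit treats the two variables symmetrically, which is exactly the errors-in-variables setting the abstract describes.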