The Frisch scheme in algebraic and dynamic identification problems

Roberto P. Guidorzi, Roberto Diversi, Umberto Soverini (2008)

Kybernetika

This paper considers the problem of determining linear relations from data affected by additive noise in the context of the Frisch scheme. The loci of solutions of the Frisch scheme and their properties are first described in the algebraic case. In this context two main problems are analyzed: the evaluation of the maximal number of linear relations compatible with data affected by errors and the determination of the linear relation actually linking the noiseless data. Subsequently the extension...
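As a rough numerical illustration of the setting (not the paper's procedure): if the diagonal covariance of the additive noise were known, subtracting it from the covariance of the noisy observations leaves an approximately singular matrix whose null direction recovers the linear relation among the noiseless variables. All names and values in the sketch below are hypothetical.

# Minimal sketch of the noise-compensation idea behind the Frisch scheme.
import numpy as np

rng = np.random.default_rng(0)
n, N = 3, 100_000                       # number of variables, number of samples
a = np.array([1.0, -2.0, 0.5])          # hypothetical linear relation: a' x = 0
z = rng.standard_normal((N, n))
x = z - np.outer(z @ a, a) / (a @ a)    # noiseless data constrained to a' x = 0
sigma_tilde = np.diag([0.3, 0.2, 0.1])  # hypothetical diagonal noise covariance
y = x + rng.standard_normal((N, n)) @ np.sqrt(sigma_tilde)   # noisy observations
S = np.cov(y, rowvar=False)             # covariance of the noisy data
w, V = np.linalg.eigh(S - sigma_tilde)  # noise-compensated covariance
relation = V[:, 0]                      # eigenvector of the near-zero smallest eigenvalue
print(relation / relation[0])           # approximately a / a[0], up to scale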

The LASSO estimator: Distributional properties

Rakshith Jagannath, Neelesh S. Upadhye (2018)

Kybernetika

The least absolute shrinkage and selection operator (LASSO) is a popular technique for simultaneous estimation and model selection. There have been many studies of the large-sample asymptotic distributional properties of the LASSO estimator, but it is also well known that the asymptotic results can give a misleading picture of the LASSO estimator's actual finite-sample behaviour. The finite sample distribution of the LASSO estimator has been previously studied for the special case of orthogonal models....
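The orthogonal special case mentioned above admits a simple closed form: each LASSO coordinate is the soft-thresholded least-squares estimate. The short numerical sketch below (assuming an orthonormal design and a penalty level lam chosen purely for illustration; it is not code from the paper) demonstrates this.

# LASSO under an orthonormal design X'X = I: soft-thresholding of the OLS estimate.
import numpy as np

def soft_threshold(z, lam):
    """Componentwise soft-thresholding: sign(z) * max(|z| - lam, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

rng = np.random.default_rng(1)
p, lam = 5, 0.8
beta = np.array([3.0, 0.0, -1.5, 0.0, 0.5])
X, _ = np.linalg.qr(rng.standard_normal((50, p)))   # orthonormal columns: X'X = I
y = X @ beta + 0.3 * rng.standard_normal(50)

beta_ols = X.T @ y                                  # OLS estimate when X'X = I
beta_lasso = soft_threshold(beta_ols, lam)          # minimiser of 0.5*||y - Xb||^2 + lam*||b||_1
print(beta_lasso)                                   # small coefficients are shrunk exactly to zero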

The least trimmed squares. Part I: Consistency

Jan Ámos Víšek (2006)

Kybernetika

The consistency of the least trimmed squares estimator (see Rousseeuw [Rous] or Hampel et al. [HamRonRouSta]) is proved under general conditions. The assumptions employed in the paper are discussed in detail to clarify the consequences for applications.
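For orientation, the least trimmed squares fit minimizes the sum of the h smallest squared residuals, which makes it resistant to gross outliers. The sketch below (a rough concentration-step search from random starts, with an assumed trimming size h; it is neither the authors' algorithm nor a production FAST-LTS implementation) illustrates the idea.

# Rough least trimmed squares search via repeated concentration steps.
import numpy as np

def lts_fit(X, y, h, n_starts=50, n_steps=20, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        subset = rng.choice(n, size=p + 1, replace=False)    # random elemental start
        for _ in range(n_steps):
            beta, *_ = np.linalg.lstsq(X[subset], y[subset], rcond=None)
            r2 = (y - X @ beta) ** 2
            subset = np.argsort(r2)[:h]                       # keep the h best-fitted points
        obj = np.sort(r2)[:h].sum()                           # trimmed sum of squared residuals
        if obj < best_obj:
            best_obj, best_beta = obj, beta
    return best_beta

rng = np.random.default_rng(2)
X = np.column_stack([np.ones(100), rng.uniform(0, 10, 100)])
y = 2.0 + 0.5 * X[:, 1] + 0.2 * rng.standard_normal(100)
y[:20] += 15.0                                                # gross outliers
print(lts_fit(X, y, h=70))                                    # close to (2.0, 0.5) despite the outliers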

The least trimmed squares. Part III: Asymptotic normality

Jan Ámos Víšek (2006)

Kybernetika

Asymptotic normality of the least trimmed squares estimator is proved under general conditions. At the end of the paper, a discussion of the applicability of the estimator (including a discussion of the algorithm for its evaluation) is offered.

The linear model with variance-covariance components and jackknife estimation

Jaromír Kudeláš (1994)

Applications of Mathematics

Let $\theta^*$ be a biased estimate of the parameter $\vartheta$ based on all observations $x_1, \dots, x_n$ and let $\theta_{-i}^*$ ($i = 1, 2, \dots, n$) be the same estimate of the parameter $\vartheta$ obtained after deletion of the $i$-th observation. If the expectations of the estimators $\theta^*$ and $\theta_{-i}^*$ can be expressed as $E(\theta^*) = \vartheta + a(n)\,b(\vartheta)$ and $E(\theta_{-i}^*) = \vartheta + a(n-1)\,b(\vartheta)$, $i = 1, 2, \dots, n$, where $a(n)$ is a known sequence of real numbers and $b(\vartheta)$ is a function of $\vartheta$, then this system of equations can be regarded as a linear model. The least squares method gives the generalized jackknife estimator. Using this method, it is possible to obtain the unbiased...
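A brief worked illustration (not taken from the paper itself) of how a bias of this form is eliminated: writing $\bar\theta_{(\cdot)}^* = \frac{1}{n}\sum_{i=1}^n \theta_{-i}^*$, the standard generalized-jackknife combination
\[
  \hat\vartheta_G \;=\; \frac{a(n-1)\,\theta^* - a(n)\,\bar\theta_{(\cdot)}^*}{a(n-1) - a(n)}
\]
satisfies
\[
  E(\hat\vartheta_G)
  \;=\; \frac{a(n-1)\bigl(\vartheta + a(n)\,b(\vartheta)\bigr) - a(n)\bigl(\vartheta + a(n-1)\,b(\vartheta)\bigr)}{a(n-1) - a(n)}
  \;=\; \vartheta,
\]
so the bias term $a(n)\,b(\vartheta)$ is removed exactly; for $a(n) = 1/n$ this reduces to the classical Quenouille jackknife $n\,\theta^* - (n-1)\,\bar\theta_{(\cdot)}^*$.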
