
Bernstein inequality for the parameter of the pth order autoregressive process AR(p)

Samir Benaissa (2006)

Applicationes Mathematicae

The autoregressive process plays an important role in prediction problems that lead to decision making. In practice, we use the least squares method to estimate the parameter θ̃ of the first-order autoregressive process taking values in a real separable Banach space B (ARB(1)), provided it satisfies the relation $\tilde{X}_t = \tilde{\theta}\tilde{X}_{t-1} + \tilde{\varepsilon}_t$. In this paper we study the convergence in distribution of the linear operator $I(\tilde{\theta}_T, \tilde{\theta}) = (\tilde{\theta}_T - \tilde{\theta})\,\tilde{\theta}_T^{-2}$ for $\|\tilde{\theta}\| > 1$, and we construct Bernstein-type inequalities for this operator.
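As an illustration only, and not the paper's Banach-space setting, the following minimal sketch shows the least squares estimator for a real-valued AR(1) process, assuming i.i.d. Gaussian noise; the helper names and parameter values are hypothetical:

```python
import numpy as np

def simulate_ar1(theta, n, sigma=1.0, seed=None):
    """Simulate a real-valued AR(1) process X_t = theta * X_{t-1} + eps_t."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    eps = rng.normal(scale=sigma, size=n)
    for t in range(1, n):
        x[t] = theta * x[t - 1] + eps[t]
    return x

def ls_estimate_ar1(x):
    """Least squares estimator: theta_hat = sum X_t X_{t-1} / sum X_{t-1}^2."""
    return np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)

x = simulate_ar1(theta=0.6, n=500, seed=0)
print(ls_estimate_ar1(x))   # should be close to 0.6
```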

Bias correction on censored least squares regression models

Jesus Orbe, Vicente Núñez-Antón (2012)

Kybernetika

This paper proposes a bias reduction of the coefficients' estimator for linear regression models when observations are randomly censored and the error distribution is unknown. The proposed bias correction is applied to the weighted least squares estimator proposed by Stute [W. Stute: Consistent estimation under random censorship when covariables are present. J. Multivariate Anal. 45 (1993), 89–103], and it is based on model-based bootstrap resampling techniques that also allow us to work with...
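A rough sketch, not the paper's procedure: the hypothetical helpers below compute Stute-type Kaplan–Meier weights and the corresponding weighted least squares fit for right-censored responses, with a naive resample-the-data bootstrap bias correction rather than the model-based resampling the paper actually develops:

```python
import numpy as np

def km_weights(y, delta):
    """Kaplan-Meier jump weights (Stute 1993) for right-censored responses y;
    delta[i] = 1 if y[i] is uncensored, 0 if censored."""
    order = np.argsort(y)
    d = delta[order].astype(float)
    n = len(y)
    w_sorted = np.zeros(n)
    surv = 1.0                       # running product of survival factors
    for i in range(n):
        w_sorted[i] = d[i] * surv / (n - i)
        surv *= 1.0 - d[i] / (n - i)
    w = np.zeros(n)
    w[order] = w_sorted              # return weights in the original order
    return w

def weighted_ls(X, y, w):
    """Weighted least squares estimate solving (X'WX) beta = X'Wy."""
    Xw = X * w[:, None]
    return np.linalg.solve(X.T @ Xw, Xw.T @ y)

def bootstrap_bias_corrected(X, y, delta, n_boot=500, seed=0):
    """Naive bootstrap bias correction: corrected = 2*beta_hat - mean(beta_boot)."""
    rng = np.random.default_rng(seed)
    beta_hat = weighted_ls(X, y, km_weights(y, delta))
    boots = []
    n = len(y)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)          # resample (X, y, delta) triples
        boots.append(weighted_ls(X[idx], y[idx], km_weights(y[idx], delta[idx])))
    return 2.0 * beta_hat - np.mean(boots, axis=0)
```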

Bias of LS estimators in nonlinear regression models with constraints. Part I: General case

Andrej Pázman, Jean-Baptiste Denis (1999)

Applications of Mathematics

We derive expressions for the asymptotic approximation of the bias of the least squares estimators in nonlinear regression models with parameters which are subject to nonlinear equality constraints. The suggested approach modifies the normal equations of the estimator and approximates them up to $o_p(N^{-1})$, where $N$ is the number of observations. The “bias equations” so obtained are solved under different assumptions on the constraints and on the model. For functions of the parameters the invariance of the approximate...

Bias of LS estimators in nonlinear regression models with constraints. Part II: Biadditive models

Jean-Baptiste Denis, Andrej Pázman (1999)

Applications of Mathematics

General results giving the approximate bias for nonlinear models with constrained parameters are applied to bilinear models in an ANOVA framework, called biadditive models. Known results on the information matrix and the asymptotic variance matrix of the parameters are summarized, and the Jacobians and Hessians of the response and of the constraints are derived. These intermediate results are the basis for any subsequent second order study of the model. Despite the large number of parameters involved,...

Bias-variance decomposition in Genetic Programming

Taras Kowaliw, René Doursat (2016)

Open Mathematics

We study properties of Linear Genetic Programming (LGP) through several regression and classification benchmarks. In each problem, we decompose the results into bias and variance components, and explore the effect of varying certain key parameters on the overall error and its decomposed contributions. These parameters are the maximum program size, the initial population, and the function set used. We confirm and quantify several insights into the practical usage of GP, most notably that (a) the...
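A sketch of the standard Monte Carlo bias/variance decomposition the abstract refers to, with an ordinary polynomial least squares fit standing in for an evolved LGP program; the benchmark setup, function names, and parameters below are illustrative assumptions, not the authors' protocol:

```python
import numpy as np

def bias_variance_decomposition(fit, true_fn, x_test, n_train=50, n_repeats=200,
                                noise_sd=0.3, seed=0):
    """Monte Carlo bias/variance decomposition of squared error at test inputs.

    `fit(X, y)` returns a callable predictor; `true_fn` is the noise-free target.
    Averages over repeated training sets drawn from the same distribution."""
    rng = np.random.default_rng(seed)
    preds = np.empty((n_repeats, len(x_test)))
    for r in range(n_repeats):
        X = rng.uniform(-1, 1, size=n_train)
        y = true_fn(X) + rng.normal(scale=noise_sd, size=n_train)
        preds[r] = fit(X, y)(x_test)
    mean_pred = preds.mean(axis=0)
    bias_sq = (mean_pred - true_fn(x_test)) ** 2   # squared bias at each test point
    variance = preds.var(axis=0)                   # variance across training runs
    return bias_sq.mean(), variance.mean()

# Degree-3 polynomial least squares as a stand-in for an evolved program.
def poly_fit(X, y, degree=3):
    coeffs = np.polyfit(X, y, degree)
    return lambda x: np.polyval(coeffs, x)

b2, var = bias_variance_decomposition(poly_fit, np.sin, np.linspace(-1, 1, 100))
print(f"bias^2 ~ {b2:.4f}, variance ~ {var:.4f}")
```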

Binary segmentation and Bonferroni-type bounds

Michal Černý (2011)

Kybernetika

We introduce the function $Z(x;\xi,\nu) := \int_{-\infty}^{x} \phi(t-\xi)\cdot\Phi(\nu t)\,dt$, where $\phi$ and $\Phi$ are the pdf and cdf of $N(0,1)$, respectively. We derive two recurrence formulas for the effective computation of its values. We show that with an algorithm for this function, we can efficiently compute the second-order terms of Bonferroni-type inequalities yielding the upper and lower bounds for the distribution of a max-type binary segmentation statistic in the case of small samples (where asymptotic results do not work), and in general for max-type random variables...
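A small numerical check of the definition (not the recurrence formulas that are the paper's contribution), assuming the reconstructed integral above with lower limit $-\infty$:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def Z(x, xi, nu):
    """Z(x; xi, nu) = integral_{-inf}^{x} phi(t - xi) * Phi(nu * t) dt,
    where phi and Phi are the standard normal pdf and cdf."""
    val, _ = quad(lambda t: norm.pdf(t - xi) * norm.cdf(nu * t), -np.inf, x)
    return val

# Sanity check: for nu = 0, Phi(0) = 1/2, so Z(x; xi, 0) = Phi(x - xi) / 2.
print(Z(1.0, 0.5, 0.0), norm.cdf(0.5) / 2)
```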
