
Binary segmentation and Bonferroni-type bounds

Michal Černý (2011)

Kybernetika

We introduce the function Z(x; ξ, ν) := ∫_{−∞}^{x} ϕ(t − ξ)·Φ(νt) dt, where ϕ and Φ are the pdf and cdf of N(0, 1), respectively. We derive two recurrence formulas for the efficient computation of its values. We show that, given an algorithm for this function, we can efficiently compute the second-order terms of Bonferroni-type inequalities yielding upper and lower bounds on the distribution of a max-type binary segmentation statistic in the case of small samples (where asymptotic results do not work), and in general for max-type random variables...
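The integral defining Z has no elementary closed form, but its values are easy to approximate directly. A minimal sketch using plain trapezoidal quadrature (not the recurrence formulas derived in the paper) might look like this:

```python
import math

def phi(t):
    """Standard normal pdf."""
    return math.exp(-t * t / 2.0) / math.sqrt(2.0 * math.pi)

def Phi(t):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def Z(x, xi, nu, steps=20000):
    """Approximate Z(x; xi, nu) = integral_{-inf}^{x} phi(t - xi) * Phi(nu*t) dt
    by the composite trapezoidal rule, truncating the lower limit where
    phi(t - xi) is negligible."""
    lo = xi - 10.0  # phi(t - xi) < 1e-22 below this point
    if x <= lo:
        return 0.0
    h = (x - lo) / steps
    total = 0.5 * (phi(lo - xi) * Phi(nu * lo) + phi(x - xi) * Phi(nu * x))
    for i in range(1, steps):
        t = lo + i * h
        total += phi(t - xi) * Phi(nu * t)
    return total * h

# Sanity check: for nu = 0, Phi(0) = 1/2, so Z(x; xi, 0) = Phi(x - xi) / 2.
print(Z(0.0, 0.0, 0.0))  # ≈ 0.25
```

For ν = 0 the factor Φ(νt) is constant, which gives the closed-form check used above; the paper's recurrences would replace this brute-force quadrature with exact recursive evaluation.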

Bounds and asymptotic expansions for the distribution of the maximum of a smooth stationary Gaussian process

Jean-Marc Azaïs, Christine Cierco-Ayrolles, Alain Croquette (2010)

ESAIM: Probability and Statistics

This paper uses the Rice method [18] to give bounds for the distribution of the maximum of a smooth stationary Gaussian process. We give simpler expressions for the first two terms of the Rice series [3, 13] for the distribution of the maximum. Our main contribution is a simpler form of the second factorial moment of the number of upcrossings, which is in some sense a generalization of Steinberg et al.'s formula ([7], p. 212). We then present a numerical application and asymptotic expansions...
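The first term of the Rice series is the classical bound P(max_{[0,T]} X > u) ≤ P(X(0) > u) + E[number of u-upcrossings]. As an illustration on a toy process not taken from the paper (the random cosine wave X(t) = ξ cos t + η sin t with ξ, η ~ N(0,1), whose maximum over one period is exactly √(ξ² + η²)), the bound can be checked numerically:

```python
import math, random

def Phi(t):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Toy stationary Gaussian process X(t) = xi*cos(t) + eta*sin(t).
# Its covariance is r(t) = cos(t), so the second spectral moment is
# lambda2 = -r''(0) = 1.
T = 2.0 * math.pi
u = 2.0

# First-order Rice bound:
#   P(max X > u) <= P(X(0) > u) + E[# u-upcrossings on [0, T]],
# where E[# upcrossings] = (T / (2*pi)) * sqrt(lambda2) * exp(-u^2 / 2).
rice_bound = (1.0 - Phi(u)) + (T / (2.0 * math.pi)) * math.exp(-u * u / 2.0)

# For this process max X = sqrt(xi^2 + eta^2), so P(max > u) = exp(-u^2/2)
# exactly; a Monte Carlo estimate lets us verify the bound.
random.seed(0)
n = 200000
hits = sum(1 for _ in range(n)
           if math.hypot(random.gauss(0, 1), random.gauss(0, 1)) > u)
p_est = hits / n

print(p_est, math.exp(-u * u / 2.0), rice_bound)
```

The higher-order terms of the Rice series, which involve the factorial moments of the number of upcrossings studied in the paper, tighten this first-order bound.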

Bounds on tail probabilities for negative binomial distributions

Peter Harremoës (2016)

Kybernetika

In this paper we derive various bounds on tail probabilities of distributions for which the generated exponential family has a linear or quadratic variance function. The main result is an inequality relating the signed log-likelihood of a negative binomial distribution to that of a Gamma distribution. This bound leads to a new bound relating the signed log-likelihood of a binomial distribution to that of a Poisson distribution, which can be used to prove an intersection property...
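The signed log-likelihood here is the signed square root of twice the information divergence from the observation to the mean (one common normalization; the paper's exact definition may differ). A minimal sketch of the binomial-versus-Poisson quantities mentioned above, with all function names my own:

```python
import math

def d_bernoulli(q, p):
    """KL divergence D(q || p) between Bernoulli(q) and Bernoulli(p), in nats."""
    def term(a, b):
        return 0.0 if a == 0.0 else a * math.log(a / b)
    return term(q, p) + term(1.0 - q, 1.0 - p)

def signed_ll_binomial(k, n, p):
    """Signed log-likelihood of observing k from Binomial(n, p):
    sign(k/n - p) * sqrt(2 * n * D(k/n || p))."""
    q = k / n
    return math.copysign(math.sqrt(2.0 * n * d_bernoulli(q, p)), q - p)

def signed_ll_poisson(k, lam):
    """Signed log-likelihood of observing k from Poisson(lam)."""
    d = lam - k if k == 0 else k * math.log(k / lam) + lam - k
    return math.copysign(math.sqrt(2.0 * d), k - lam)

# Compare the two at matched means (lam = n*p); both vanish at k = n*p
# and share the sign of k - n*p.
n, p = 100, 0.1
for k in (4, 10, 20):
    print(k, signed_ll_binomial(k, n, p), signed_ll_poisson(k, n * p))
```

Inequalities between such signed log-likelihoods translate directly into tail-probability bounds, which is the mechanism the abstract alludes to.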

Bounds on the information divergence for hypergeometric distributions

Peter Harremoës, František Matúš (2020)

Kybernetika

The hypergeometric distributions have many important applications, but they have received little attention in information theory. Hypergeometric distributions can be approximated by binomial distributions or Poisson distributions. In this paper we present upper and lower bounds on the information divergence. These bounds are important for statistical testing and for a better understanding of the notion of exchangeability.
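As a small numerical illustration (not the paper's bounds), the information divergence from a hypergeometric distribution to its binomial approximation with success probability p = K/N can be computed directly; it shrinks as the population size N grows with the sample size n fixed:

```python
from math import comb, log

def hypergeom_pmf(k, N, K, n):
    """P(k marked items when drawing n without replacement from N items, K marked)."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

def binom_pmf(k, n, p):
    """Binomial(n, p) probability mass at k."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def kl_hyp_vs_binom(N, K, n):
    """Information divergence D(hypergeometric || binomial) in nats, where the
    binomial uses the same draw count n and success probability p = K / N."""
    p = K / N
    lo, hi = max(0, n - (N - K)), min(n, K)
    return sum(hypergeom_pmf(k, N, K, n)
               * log(hypergeom_pmf(k, N, K, n) / binom_pmf(k, n, p))
               for k in range(lo, hi + 1))

print(kl_hyp_vs_binom(N=100, K=30, n=10))    # small and positive
print(kl_hyp_vs_binom(N=1000, K=300, n=10))  # smaller still as N grows
```

Sampling without replacement shrinks the variance relative to the binomial, so the divergence is strictly positive for finite N and vanishes in the limit; bounding its exact rate of decay is the kind of question the paper addresses.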
