Bounds on Capital Requirements for Bivariate Risk with Given Marginals and Partial Information on the Dependence

Carole Bernard, Yuntao Liu, Niall MacGillivray, Jinyuan Zhang (2013)

Dependence Modeling

Nelsen et al. [20] find bounds for a bivariate distribution function when there are constraints on the values of its quartiles. Tankov [25] generalizes this work by giving explicit expressions for the best upper and lower bounds for a bivariate copula when its values on a compact subset of [0, 1]² are known. He shows that these bounds are quasi-copulas but not necessarily copulas. Tankov [25] and Bernard et al. [3] both give sufficient conditions for these bounds to be copulas. In this note we give weaker...
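For context, the classical Fréchet–Hoeffding bounds state that every bivariate copula C satisfies max(u + v − 1, 0) ≤ C(u, v) ≤ min(u, v). One common way to write the improved bounds in the spirit of Tankov [25], when the values Q(a, b) are prescribed on a compact set S ⊂ [0, 1]², is the following (a sketch in our own notation, not a quotation from the paper):

\[
\overline{Q}^{\,S}(u,v) \;=\; \min\Bigl(u,\; v,\; \min_{(a,b)\in S}\bigl\{Q(a,b) + (u-a)^{+} + (v-b)^{+}\bigr\}\Bigr),
\]
\[
\underline{Q}^{\,S}(u,v) \;=\; \max\Bigl(0,\; u+v-1,\; \max_{(a,b)\in S}\bigl\{Q(a,b) - (a-u)^{+} - (v-b)^{-\!}\,\;\text{read as}\;-(b-v)^{+}\bigr\}\Bigr),
\]

where x⁺ = max(x, 0). Both expressions are pointwise bounds that are quasi-copulas but, as the abstract notes, need not themselves be copulas.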

Bounds on tail probabilities for negative binomial distributions

Peter Harremoës (2016)

Kybernetika

In this paper we derive various bounds on tail probabilities of distributions for which the generated exponential family has a linear or quadratic variance function. The main result is an inequality relating the signed log-likelihood of a negative binomial distribution with the signed log-likelihood of a Gamma distribution. This bound leads to a new bound on the signed log-likelihood of a binomial distribution compared with a Poisson distribution that can be used to prove an intersection property...
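For orientation, the signed log-likelihood used in this line of work is typically defined as follows (a standard formulation, stated here under our own notation rather than quoted from the paper): for a one-parameter exponential family parametrized by its mean, with P_x denoting the element with mean x and D the information divergence,

\[
G_{\mu}(x) \;=\; \operatorname{sgn}(x-\mu)\,\sqrt{2\,D\!\left(P_{x}\,\middle\|\,P_{\mu}\right)}.
\]

The main inequality then compares this quantity for the negative binomial family with the corresponding quantity for the Gamma family.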

Bounds on the information divergence for hypergeometric distributions

Peter Harremoës, František Matúš (2020)

Kybernetika

The hypergeometric distributions have many important applications, but they have received little attention in information theory. Hypergeometric distributions can be approximated by binomial distributions or Poisson distributions. In this paper we present upper and lower bounds on the information divergence between these distributions. These bounds are important for statistical testing and for a better understanding of the notion of exchangeability.
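As a concrete illustration of the quantity being bounded, the divergence D(P‖Q) = Σ_k P(k) log(P(k)/Q(k)) between a hypergeometric law and its binomial approximation can be computed numerically; the minimal sketch below uses scipy's hypergeom and binom distributions, with illustrative parameters of our own choosing:

    import numpy as np
    from scipy.stats import hypergeom, binom

    def kl_divergence(p, q):
        """D(p||q) in nats; assumes the support of p is contained in that of q."""
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    # Population of N items, K of which are marked; draw n without replacement.
    N, K, n = 100, 30, 10                 # illustrative parameters
    ks = np.arange(0, n + 1)
    p_hyper = hypergeom.pmf(ks, N, K, n)  # exact law of the number of marked draws
    q_binom = binom.pmf(ks, n, K / N)     # with-replacement (binomial) approximation

    print(f"D(hypergeom || binom) = {kl_divergence(p_hyper, q_binom):.6f} nats")

The divergence shrinks as N grows with K/N and n fixed, which is the regime in which the binomial approximation becomes accurate.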
