
On the Jensen-Shannon divergence and the variation distance for categorical probability distributions

Jukka Corander, Ulpu Remes, Timo Koski — 2021

Kybernetika

We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled Jeffreys' divergence and a reversed Jensen-Shannon divergence. Upper and lower bounds for the Jensen-Shannon divergence are then found in terms of the squared (total) variation distance. The derivations rely upon the Pinsker inequality and the reverse Pinsker inequality. We use these bounds to prove the asymptotic equivalence of the maximum likelihood estimate and minimum Jensen-Shannon divergence...
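The quantities discussed in the abstract are straightforward to compute for categorical distributions. Below is a minimal sketch (not the paper's own derivation) of the Jensen-Shannon divergence, the total variation distance, and the classical Pinsker inequality KL(P‖Q) ≥ 2·TV(P,Q)² in natural logarithms; the example distributions `p` and `q` are illustrative.

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence in nats; assumes q[i] > 0 wherever p[i] > 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def tv(p, q):
    # Total variation distance: half the L1 distance between the distributions
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def jsd(p, q):
    # Jensen-Shannon divergence: average KL divergence to the midpoint mixture
    m = [0.5 * (pi + qi) for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Illustrative categorical distributions (hypothetical, not from the paper)
p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]

# Pinsker's inequality: KL(P||Q) >= 2 * TV(P,Q)^2
assert kl(p, q) >= 2 * tv(p, q) ** 2
# The Jensen-Shannon divergence is bounded: 0 <= JSD <= ln 2
assert 0 <= jsd(p, q) <= math.log(2)
```

The symmetry of JSD (it compares both arguments to their mixture) is what makes two-sided bounds in terms of the squared variation distance, as established in the paper, possible.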
