Upper and lower bounds on TVD and KLD between centered elliptical distributions in high-dimensional setting

Authors: Pavel Ievlev, Timofei Shashkov
Published in: Statistics and Probability Letters
Date: 2025

Abstract

In this paper, we derive upper and lower bounds and inequalities for the total variation distance (TVD) and the Kullback-Leibler divergence (KLD), also known as the relative entropy, between two probability measures $\mu$ and $\nu$ in the high-dimensional setting.

Keywords

total variation distance, Kullback-Leibler divergence, multivariate Student distribution, high-dimensional statistics, Gamma distribution

Details

In this paper, we derive upper and lower bounds and inequalities for the total variation distance (TVD) and the Kullback-Leibler divergence (KLD), also known as the relative entropy, between two probability measures $\mu$ and $\nu$, defined by
\[
D_{\mathrm{TV}} ( \mu, \nu ) = \sup_{B \in \mathcal{B} (\mathbb{R}^n)} \left| \mu(B) - \nu(B) \right| \quad \text{and} \quad D_{\mathrm{KL}} ( \mu \, \| \, \nu ) = \int_{\mathbb{R}^n} \ln \left( \frac{d\mu}{d\nu}(x) \right) \mu(dx),
\]
respectively, when the dimension $n$ is high. We begin with some elementary bounds for centered elliptical distributions admitting densities and illustrate their use by estimating the TVD and KLD between the multivariate Student and multivariate normal distributions in the high-dimensional setting. We then show how the same approach simplifies when applied to multivariate Gamma distributions with independent components (in this case we study only the TVD, since the KLD can be computed explicitly; see [1]). Our approach is motivated by the recent contribution of Barabesi and Pratelli [2].
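
As a quick numerical illustration of the two quantities defined above (a minimal sketch, not the bounding method of the paper), one can Monte Carlo-estimate both distances between a centered multivariate Student distribution and a standard multivariate normal using SciPy; the dimension, degrees of freedom, and sample size below are arbitrary illustrative choices.

import numpy as np
from scipy.stats import multivariate_normal, multivariate_t

rng = np.random.default_rng(0)

n = 50          # dimension (illustrative; the paper's interest is large n)
df = 100        # degrees of freedom of the Student distribution (illustrative)
m = 200_000     # Monte Carlo sample size (illustrative)

# mu: centered multivariate Student, nu: centered multivariate normal,
# both with identity shape/covariance matrix
mu = multivariate_t(loc=np.zeros(n), shape=np.eye(n), df=df)
nu = multivariate_normal(mean=np.zeros(n), cov=np.eye(n))

x = mu.rvs(size=m, random_state=rng)
log_ratio = mu.logpdf(x) - nu.logpdf(x)   # log of the density ratio d mu / d nu at x

# D_KL(mu || nu) = E_mu[ log (d mu / d nu) ]
kld = log_ratio.mean()

# D_TV(mu, nu) = (1/2) E_mu[ |1 - (d nu / d mu)| ]
tvd = 0.5 * np.abs(1.0 - np.exp(-log_ratio)).mean()

print(f"n = {n}, df = {df}: KLD estimate {kld:.4f}, TVD estimate {tvd:.4f}")

Such plain Monte Carlo estimates degrade as $n$ grows, which is precisely why explicit high-dimensional bounds of the kind derived in the paper are useful.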
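
For the Gamma case, the closed form alluded to above (the result cited as [1]) is, in the univariate shape-rate parameterization, the classical expression
\[
D_{\mathrm{KL}} \big( \Gamma(a_1, b_1) \, \| \, \Gamma(a_2, b_2) \big)
= (a_1 - a_2) \, \psi(a_1) - \ln \Gamma(a_1) + \ln \Gamma(a_2)
+ a_2 \ln \frac{b_1}{b_2} + a_1 \, \frac{b_2 - b_1}{b_1},
\]
where $\psi$ denotes the digamma function; for multivariate Gamma distributions with independent components the KLD is the sum of these coordinatewise terms. (This is a standard formula recalled here for reference, not a result of the paper.)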

Citation

BibTeX
@article{ievlev_shashkov_2025,
  title={Upper and lower bounds on {TVD} and {KLD} between centered elliptical distributions in high-dimensional setting},
  author={Ievlev, Pavel and Shashkov, Timofei},
  journal={submitted to Statistics and Probability Letters},
  year={2025}
}