(f,Γ)-Divergences: Interpolating between f-Divergences and Integral Probability Metrics
Jeremiah Birrell, Paul Dupuis, Markos A. Katsoulakis, Yannis Pantazis, Luc Rey-Bellet; 23(39):1–70, 2022.
Abstract
We develop a rigorous and general framework for constructing information-theoretic divergences that subsume both f-divergences and integral probability metrics (IPMs), such as the 1-Wasserstein distance. We prove under which assumptions these divergences, hereafter referred to as (f,Γ)-divergences, provide a notion of 'distance' between probability measures and show that they can be expressed as a two-stage mass-redistribution/mass-transport process. The (f,Γ)-divergences inherit features from IPMs, such as the ability to compare distributions that are not absolutely continuous, as well as from f-divergences, namely the strict concavity of their variational representations and the ability to control heavy-tailed distributions for particular choices of f. When combined, these features establish a divergence with improved properties for estimation, statistical learning, and uncertainty quantification applications. Using statistical learning as an example, we demonstrate their advantage in training generative adversarial networks (GANs) for heavy-tailed, not-absolutely-continuous sample distributions. We also show improved performance and stability over the gradient-penalized Wasserstein GAN in image generation.
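For orientation, the following is a minimal sketch of the kind of variational representation referred to above; the notation is our paraphrase, and the precise conditions on the test-function class Γ, the convex function f, and its Legendre transform f* are given in the paper.

\[
D_f^{\Gamma}(Q \,\|\, P) \;=\; \sup_{g \in \Gamma} \Big\{ \mathbb{E}_Q[g] - \Lambda_f^{P}[g] \Big\},
\qquad
\Lambda_f^{P}[g] \;=\; \inf_{\nu \in \mathbb{R}} \Big\{ \nu + \mathbb{E}_P\big[ f^{*}(g - \nu) \big] \Big\},
\]

where \(f^{*}\) denotes the Legendre transform of \(f\). Roughly speaking, letting Γ range over all bounded measurable functions recovers the classical f-divergence, while a degenerate choice of f collapses \(\Lambda_f^{P}[g]\) to \(\mathbb{E}_P[g]\) and yields the IPM \(\sup_{g \in \Gamma}\{\mathbb{E}_Q[g] - \mathbb{E}_P[g]\}\), e.g. the 1-Wasserstein distance when Γ is the class of 1-Lipschitz functions; this is the sense in which the (f,Γ)-divergences interpolate between the two families.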
© JMLR 2022.