Bounds on L_p Errors in Density Ratio Estimation via f-Divergence Loss Functions
Yoshiaki Kitazawa
Abstract
Density ratio estimation (DRE) is a fundamental machine learning technique for identifying the relationship between two probability distributions. f-divergence loss functions, derived from variational representations of f-divergence, are commonly employed in DRE to achieve state-of-the-art results. This study presents a novel perspective on DRE with f-divergence loss functions by deriving upper and lower bounds on the L_p errors. These bounds apply to any estimator within a class of Lipschitz continuous estimators, irrespective of the specific f-divergence loss function used. The bounds are formulated as a product of terms that include the data dimension and the expected value of the density ratio raised to the power of p. Notably, the lower bound contains an exponential term that depends on the Kullback-Leibler divergence, indicating that the L_p error grows significantly with the Kullback-Leibler divergence for p > 1, and that this growth becomes more pronounced as p increases. Furthermore, these theoretical findings are substantiated through numerical experiments.
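For context, the f-divergence losses referenced in the abstract are standardly obtained from the variational (Fenchel-dual) representation of an f-divergence (Nguyen, Wainwright, and Jordan, 2010; Nowozin et al., 2016). A minimal sketch of this background, assuming P and Q admit densities p and q, that f is convex with f(1) = 0, and writing f^* for the convex conjugate of f (none of these symbols are defined in the abstract itself):

```latex
% Variational representation of an f-divergence:
% the supremum is over measurable critic functions T.
\[
  D_f(P \,\|\, Q)
  = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx
  \;\ge\; \sup_{T}\,
    \Bigl( \mathbb{E}_{x \sim P}[T(x)] - \mathbb{E}_{x \sim Q}[f^{*}(T(x))] \Bigr),
\]
% with equality attained at T(x) = f'(p(x)/q(x)). Minimizing the empirical loss
\[
  \mathcal{L}(T) = \mathbb{E}_{x \sim Q}[f^{*}(T(x))] - \mathbb{E}_{x \sim P}[T(x)]
\]
% over an estimator class therefore yields a density ratio estimate
% r(x) = p(x)/q(x) via the inverse link (f')^{-1}(T(x)).
```

The paper's bounds concern the L_p error of the ratio estimate recovered this way, for any Lipschitz continuous estimator in the class, regardless of which convex f generates the loss.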