Minimax Rates for Conditional Density Estimation via Empirical Entropy
- Blair Bilodeau,
- Dylan Foster,
- Daniel M. Roy
Annals of Statistics
We consider the task of estimating a conditional density using i.i.d. samples from a joint distribution, which is a fundamental problem with applications in both classification and uncertainty quantification for regression. For joint density estimation, minimax rates have been characterized for general density classes in terms of uniform (metric) entropy, a well-studied notion of statistical capacity. When applying these results to conditional density estimation, the use of uniform entropy — which is infinite when the covariate space is unbounded and suffers from the curse of dimensionality — can lead to suboptimal rates. Consequently, minimax rates for conditional density estimation cannot be characterized using these classical results.
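Concretely, the quantity at stake can be written as a minimax risk; the notation below is an illustrative sketch of the standard formulation, not the paper's exact statement:

```latex
% Minimax Kullback--Leibler risk for conditional density estimation
% (illustrative notation; the paper's precise statement may differ).
% Given n i.i.d. pairs (X_i, Y_i) with X_i \sim P_X and
% Y_i \mid X_i \sim f^*(\cdot \mid X_i), where f^* lies in a known
% class \mathcal{F} (the well-specified setting), define
\[
  \mathfrak{M}_n(\mathcal{F})
  = \inf_{\widehat{f}} \sup_{f^* \in \mathcal{F}}
    \mathbb{E}\Bigl[\mathrm{KL}\bigl(f^*(\cdot \mid X)
      \,\big\|\, \widehat{f}(\cdot \mid X)\bigr)\Bigr],
\]
% where the infimum ranges over estimators \widehat{f} built from the
% sample and the expectation is over both the sample and a fresh
% covariate X \sim P_X.
```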
We resolve this problem for well-specified models, obtaining matching (within logarithmic factors) upper and lower bounds on the minimax Kullback–Leibler risk in terms of the empirical Hellinger entropy for the conditional density class. The use of empirical entropy allows us to appeal to concentration arguments based on local Rademacher complexity, which — in contrast to uniform entropy — leads to matching rates for large, potentially nonparametric classes and captures the correct dependence on the complexity of the covariate space. Our results require only that the conditional densities are bounded above, and do not require that they are bounded below or otherwise satisfy any tail conditions.
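The capacity notion driving both bounds can be sketched as follows; again, the notation is an assumption made here for illustration (Hellinger conventions vary, sometimes by a factor of 1/2):

```latex
% Empirical Hellinger distance (sketch). For conditional densities
% f, g and observed covariates x_1, \dots, x_n, define
\[
  d_n^2(f, g)
  = \frac{1}{n} \sum_{i=1}^{n}
    \mathrm{H}^2\bigl(f(\cdot \mid x_i),\, g(\cdot \mid x_i)\bigr),
  \qquad
  \mathrm{H}^2(p, q) = \int \bigl(\sqrt{p} - \sqrt{q}\,\bigr)^2 \, d\mu .
\]
% The empirical Hellinger entropy of \mathcal{F} at scale \varepsilon
% is \log N(\varepsilon, \mathcal{F}, d_n), the log covering number
% under d_n. Unlike uniform (metric) entropy, it is evaluated only at
% the observed covariates, which is why it can remain finite even when
% the covariate space is unbounded.
```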