Nonparametric Density Estimation under Adversarial Losses

by Shashank Singh, Ananya Uppal, Boyue Li, Chun-Liang Li, Manzil Zaheer, Barnabás Póczos

Released as an article.



We study minimax convergence rates of nonparametric density estimation under a large class of loss functions called "adversarial losses", which, besides classical L^p losses, includes maximum mean discrepancy (MMD), Wasserstein distance, and total variation distance. These losses are closely related to the losses encoded by discriminator networks in generative adversarial networks (GANs). In a general framework, we study how the choice of loss and the assumed smoothness of the underlying density together determine the minimax rate. We also discuss implications for training GANs based on deep ReLU networks, and more general connections to learning implicit generative models in a minimax statistical sense.
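As an illustration of one adversarial loss in the class the abstract describes, the following sketch estimates the squared maximum mean discrepancy (MMD) between two samples. The Gaussian RBF kernel and bandwidth are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

def gaussian_kernel(x, y, bandwidth=1.0):
    # RBF kernel k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2));
    # the kernel and bandwidth here are illustrative, not from the paper.
    return np.exp(-np.sum((x - y) ** 2) / (2 * bandwidth ** 2))

def mmd_squared(X, Y, bandwidth=1.0):
    """Biased (V-statistic) estimate of squared MMD between samples X and Y.

    MMD^2 = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)], which equals the
    squared RKHS distance between the two mean embeddings, so this
    estimate is always nonnegative.
    """
    k_xx = np.mean([[gaussian_kernel(x1, x2, bandwidth) for x2 in X] for x1 in X])
    k_yy = np.mean([[gaussian_kernel(y1, y2, bandwidth) for y2 in Y] for y1 in Y])
    k_xy = np.mean([[gaussian_kernel(x, y, bandwidth) for y in Y] for x in X])
    return k_xx + k_yy - 2.0 * k_xy
```

In the GAN analogy discussed in the abstract, the kernel plays the role of the discriminator class: samples from the same density should yield an MMD estimate near zero, while samples from well-separated densities should not.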

Archived Files and Locations

application/pdf  609.4 kB
file_gaxeh4ux6naaboxwxjk2kfisja (webarchive) (repository)
Type  article
Stage   submitted
Date   2018-10-28
Version   v2
Language   en
arXiv  1805.08836v2
Work Entity
access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: 169b1a54-e730-49c6-9b05-c6fbac905657