Rates of convergence for density estimation with generative adversarial networks

  • Nikita Puchkin
  • , Sergey Samsonov
  • , Denis Belomestny
  • , Eric Moulines
  • , Alexey Naumov

Research output: Contribution to journal › Article › peer-review

Abstract

In this work we undertake a thorough study of the non-asymptotic properties of the vanilla generative adversarial networks (GANs). We prove an oracle inequality for the Jensen-Shannon (JS) divergence between the underlying density p and the GAN estimate, with a significantly better statistical error term than in previously known results. The advantage of our bound becomes clear in application to nonparametric density estimation. We show that the JS divergence between the GAN estimate and p decays as fast as (log n/n)^{2β/(2β+d)}, where n is the sample size and β determines the smoothness of p. This rate of convergence coincides (up to logarithmic factors) with the minimax optimal rate for the considered class of densities.
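The abstract measures estimation error in Jensen-Shannon divergence, JS(p, q) = ½ KL(p ‖ m) + ½ KL(q ‖ m) with m = (p + q)/2. As a reminder of the quantity involved (not code from the paper), here is a minimal sketch computing the JS divergence of two discrete distributions; the function name `js_divergence` is illustrative.

```python
import math

def js_divergence(p, q):
    """Jensen-Shannon divergence (natural log) between two discrete
    distributions given as equal-length probability vectors."""
    def kl(a, b):
        # Kullback-Leibler divergence D(a || b); terms with a_i = 0 contribute 0.
        return sum(ai * math.log(ai / bi) for ai, bi in zip(a, b) if ai > 0)
    m = [(ai + bi) / 2 for ai, bi in zip(p, q)]  # mixture (p + q) / 2
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# JS divergence is symmetric and bounded above by log 2 (in nats):
print(js_divergence([1.0, 0.0], [0.0, 1.0]))  # disjoint supports: log 2 ≈ 0.6931
print(js_divergence([0.5, 0.5], [0.5, 0.5]))  # identical distributions: 0.0
```

The log 2 bound is what makes JS divergence a convenient risk for GAN analysis: unlike KL, it stays finite even when the estimate and the target have disjoint supports.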

Original language: English
Journal: Journal of Machine Learning Research
Volume: 25
Publication status: Published - 1 Jan 2024
Externally published: Yes

Keywords

  • Jensen-Shannon risk
  • generative model
  • minimax rates
  • nonparametric density estimation
  • oracle inequality
