Approximating Probability Distributions by Using Wasserstein Generative Adversarial Networks

Yihang Gao*, Michael K. Ng*, Mingjie Zhou

*Corresponding author for this work

Research output: Contribution to journal - Journal article (peer-reviewed)

Abstract

Studied here are Wasserstein generative adversarial networks (WGANs) with GroupSort neural networks as their discriminators. It is shown that the approximation error bound for the target distribution depends on the width and depth (capacity) of the generators and discriminators, as well as on the number of training samples. A quantitative generalization bound is established for the Wasserstein distance between the generated and target distributions. According to the theoretical results, WGANs place a higher capacity requirement on discriminators than on generators, which is consistent with some existing results. More importantly, overly deep and wide (high-capacity) generators may perform worse than low-capacity generators if the discriminators are not sufficiently strong. Numerical results obtained using the Swiss roll and MNIST datasets confirm the theoretical findings.
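For readers unfamiliar with the discriminator class analyzed in the paper, the sketch below illustrates the GroupSort activation (which sorts features within fixed-size groups) and a small GroupSort critic. This is not the authors' code; the use of PyTorch, the group size of 2, and the layer widths are illustrative assumptions only.

```python
import torch
import torch.nn as nn


class GroupSort(nn.Module):
    """GroupSort activation: split the feature vector into groups of
    `group_size` and sort the entries within each group (ascending)."""

    def __init__(self, group_size: int = 2):
        super().__init__()
        self.group_size = group_size

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, d = x.shape
        assert d % self.group_size == 0, "feature dim must be divisible by group size"
        grouped = x.view(n, d // self.group_size, self.group_size)
        sorted_groups, _ = torch.sort(grouped, dim=-1)  # sort within each group
        return sorted_groups.view(n, d)


# A small GroupSort critic of the general kind studied in the paper;
# widths and depth here are placeholders, not the paper's settings.
critic = nn.Sequential(
    nn.Linear(2, 64), GroupSort(2),
    nn.Linear(64, 64), GroupSort(2),
    nn.Linear(64, 1),
)
```

In the WGAN setting, such a critic is trained (with a Lipschitz constraint, e.g. via weight normalization or clipping) to estimate the Wasserstein-1 distance between the generated and target distributions; the paper's bounds relate the quality of this estimate to the critic's width and depth.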
Original language: English
Pages (from-to): 949–976
Number of pages: 28
Journal: SIAM Journal on Mathematics of Data Science
Volume: 5
Issue number: 4
DOIs
Publication status: Published - Dec 2023

User-Defined Keywords

  • Wasserstein GAN
  • GroupSort neural networks
  • approximation
  • generalization error
  • capacity
