Code for "Score-based Generative Modeling Secretly Minimizes the Wasserstein Distance", NeurIPS 2022.

UW-Madison-Lee-Lab/score-wasserstein


Score-based Generative Modeling Secretly Minimizes the Wasserstein Distance

Dohyun Kwon, Ying Fan, Kangwook Lee. Advances in Neural Information Processing Systems 35 (NeurIPS 2022).

Links: Paper | Poster

Abstract

Score-based generative models have achieved remarkable empirical performance in various applications such as image generation and audio synthesis. However, a theoretical understanding of score-based diffusion models is still incomplete. Recently, Song et al. showed that the training objective of score-based generative models is equivalent to minimizing the Kullback-Leibler divergence of the generated distribution from the data distribution. In this work, we show that score-based models also minimize the Wasserstein distance between the two distributions. Specifically, we prove that the Wasserstein distance is upper bounded by the square root of the objective function up to multiplicative constants and a fixed constant offset. Our proof is based on a novel application of the theory of optimal transport, which can be of independent interest to the community. Our numerical experiments support our findings. By analyzing our upper bounds, we provide a few techniques to obtain tighter upper bounds.
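As a loose illustration of the square-root relationship (a toy case, not the paper's exact bound): for unit-variance Gaussians N(0,1) and N(mu,1), the KL divergence is mu²/2 and the 2-Wasserstein distance is |mu|, so W2 = sqrt(2·KL) exactly. The sketch below, with names of our own choosing rather than code from this repository, checks this in closed form and also estimates the 1-Wasserstein distance from samples:

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)

for mu in [0.5, 1.0, 2.0]:
    # Closed forms for N(0,1) vs N(mu,1)
    kl = mu**2 / 2    # KL(N(mu,1) || N(0,1))
    w2 = abs(mu)      # 2-Wasserstein distance: the optimal map is the mean shift
    assert np.isclose(w2, np.sqrt(2 * kl))  # W2 = sqrt(2 * KL) in this toy case

    # Empirical 1-Wasserstein distance from samples (scipy computes W1 in 1D)
    xs = rng.normal(0.0, 1.0, size=50_000)
    ys = rng.normal(mu, 1.0, size=50_000)
    print(f"mu={mu}: W2={w2:.2f}, sampled W1={wasserstein_distance(xs, ys):.2f}")
```

For a pure mean shift, W1 and W2 coincide, so the sampled W1 should land close to |mu|.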

Experiments

Please see the .ipynb notebook, which contains the results verifying the upper bound. The notebook was generated with Google Colab, and we recommend running the online version directly.
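The objective appearing in the bound is a score matching loss. As a minimal, self-contained sketch of what such an objective looks like at a single noise level (a toy setup of our own, not code from the notebook), denoising score matching can be estimated by Monte Carlo for Gaussian data, where the true score of the perturbed distribution is known in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

def dsm_loss(score_fn, x, z, sigma):
    """Monte Carlo estimate of the denoising score matching objective
    E || score_fn(x + sigma * z) + z / sigma ||^2 at one noise level."""
    x_tilde = x + sigma * z
    return np.mean((score_fn(x_tilde) + z / sigma) ** 2)

# Toy data: x ~ N(0,1); the perturbed x_tilde is N(0, 1 + sigma^2), so the
# true score of the perturbed distribution is s(v) = -v / (1 + sigma^2).
sigma = 0.5
x = rng.normal(size=200_000)
z = rng.normal(size=200_000)
true_score = lambda v: -v / (1 + sigma**2)
bad_score = lambda v: -v  # mis-specified score (ignores the added noise)

loss_true = dsm_loss(true_score, x, z, sigma)
loss_bad = dsm_loss(bad_score, x, z, sigma)
print(loss_true, loss_bad)  # the true score attains the smaller loss
```

The loss never reaches zero (the residual noise term z/sigma is irreducible), but among score functions it is minimized by the true score of the perturbed distribution, which is the quantity the bound in the paper controls.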
