Understanding the Effectiveness of Lipschitz Constraint in Training GANs via Gradient Analysis

Published in ArXiv, 2018

Zhiming Zhou, Yuxuan Song, Lantao Yu, Yong Yu. In submission to NIPS 2018.

[ArXiv]

Abstract

This paper aims to bring a new perspective to the understanding of GANs by delving into the key factors that lead to failure and success in GAN training. Specifically, (i) we study the value surface of the optimal discriminative function, from which we show that the fundamental source of failure in GAN training stems from unwarranted gradient directions; (ii) we show that the Lipschitz constraint is not always necessary for evaluating the Wasserstein distance, and we further demonstrate that, without the Lipschitz constraint, Wasserstein GAN may fail in the same way as other GANs; (iii) we theoretically show that the Lipschitz constraint is in general a powerful tool for guaranteeing meaningful gradient directions, and we further propose a generalized family of GAN formulations based on the Lipschitz constraint, of which Wasserstein GAN is a special case.
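
As an illustrative aside (not the paper's own formulation): in practice, a Lipschitz constraint on the critic is commonly enforced in an approximate way through a gradient penalty, as in WGAN-GP. The PyTorch sketch below shows that standard approach; the names `critic` and `lambda_gp` and the toy data are placeholders chosen for illustration.

```python
# Hedged sketch: approximately enforcing a Lipschitz constraint on a
# critic via a gradient penalty on interpolated samples (WGAN-GP style).
# This is illustrative only; it is not the specific formulation proposed
# in the paper.
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    """Penalize deviations of the critic's gradient norm from 1."""
    batch_size = real.size(0)
    # Random interpolation between real and fake samples.
    eps = torch.rand(batch_size, 1, device=real.device).expand_as(real)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)

    scores = critic(interp)
    grads = torch.autograd.grad(
        outputs=scores,
        inputs=interp,
        grad_outputs=torch.ones_like(scores),
        create_graph=True,
    )[0]
    grad_norm = grads.view(batch_size, -1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

# Minimal usage example with a toy critic on 2-D data.
if __name__ == "__main__":
    critic = torch.nn.Sequential(
        torch.nn.Linear(2, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1)
    )
    real = torch.randn(16, 2)
    fake = torch.randn(16, 2)
    print(gradient_penalty(critic, real, fake))
```

In this sketch, the penalty term is added to the critic loss so that the critic's gradient norm stays close to 1 on points between the real and generated distributions, one practical way of keeping the critic approximately Lipschitz.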