Exploring Guided Sampling of Conditional GANs
Yifei Zhang*, Mengfei Xia, Yujun Shen, Jiapeng Zhu, Ceyuan Yang, Kecheng Zheng, Lianghua Huang, Yu Liu, Fan Cheng*
Abstract
Guided sampling is a widely used inference technique in diffusion models for trading off sample fidelity against diversity. In this work, we confirm that generative adversarial networks (GANs) can also benefit from guided sampling, without even requiring a pre-trained classifier (i.e., classifier guidance) or an unconditional counterpart (i.e., classifier-free guidance) as in diffusion models. Inspired by the organized latent space of GANs, we estimate the data-condition joint distribution from a well-learned conditional generator simply through vector arithmetic. With this easy implementation, our approach improves the FID score of a state-of-the-art GAN model pre-trained on ImageNet 64 × 64 from 8.87 to 6.06, while barely increasing inference time. We then propose a learning-based variant of our framework that better approximates the distribution of the entire dataset, further improving the FID score to 4.37. Notably, our sampling strategy substantially closes the gap between GANs and one-step diffusion models (e.g., with FID 4.02) under comparable model size. Code is available at https://github.com/zyf0619sjtu/GANdance.
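The abstract describes steering a conditional generator through vector arithmetic in its latent space. The paper's exact formulation is not given here, so the following is only a minimal sketch of the general guidance-by-extrapolation idea: given a conditional latent and a base (condition-agnostic) latent, move beyond the conditional one by a guidance scale. All names (`guided_latent`, `w_cond`, `w_base`, `scale`) are illustrative assumptions, not the authors' API.

```python
import numpy as np

def guided_latent(w_cond: np.ndarray, w_base: np.ndarray, scale: float) -> np.ndarray:
    """Extrapolate from a base latent toward (and past) the conditional latent.

    scale = 0 recovers the base latent, scale = 1 recovers the conditional
    latent, and scale > 1 amplifies the conditioning direction, trading
    diversity for fidelity -- analogous to guidance scales in diffusion models.
    This is an illustrative sketch, not the paper's actual algorithm.
    """
    return w_base + scale * (w_cond - w_base)

# Hypothetical usage with random stand-ins for generator latents.
rng = np.random.default_rng(0)
w_cond = rng.normal(size=512)   # latent for the target class condition
w_base = rng.normal(size=512)   # condition-agnostic reference latent
w_guided = guided_latent(w_cond, w_base, scale=1.5)
```

The guided latent `w_guided` would then be fed to the pre-trained conditional generator in place of `w_cond`, which is why such a scheme adds almost no inference cost.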