[IEEE TNNLS] Bayesian Cycle-Consistent Generative Adversarial Networks via Marginalizing Latent Sampling

Accepted as an IEEE TNNLS regular paper!


Abstract:

Recent techniques built on generative adversarial networks (GANs), such as cycle-consistent GANs, are able to learn mappings among different domains from unpaired data sets through min-max optimization games between generators and discriminators. However, it remains challenging to stabilize the training process, and cyclic models often fall into mode collapse as the discriminator succeeds. To address this problem, we propose a novel Bayesian cyclic model and an integrated cyclic framework for interdomain mappings. Motivated by Bayesian GAN, the proposed method explores the full posteriors of the cyclic model via sampling latent variables and optimizes the model with maximum a posteriori (MAP) estimation; hence, we name it Bayesian CycleGAN. In addition, while the original CycleGAN cannot generate diversified results, the Bayesian framework can diversify generated images by replacing the restricted latent variables in the inference process. We evaluate the proposed Bayesian CycleGAN on multiple benchmark data sets, including Cityscapes, Maps, and Monet2photo. The proposed method improves per-pixel accuracy on the Cityscapes semantic segmentation task by 15% within the original framework and by 20% within the proposed integrated framework, showing better resilience to imbalanced adversarial confrontation. The diversified results of Monet2photo style transfer also demonstrate its superiority over the original cyclic model.
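Conceptually, the marginalization described above can be sketched as a Monte Carlo average of the cycle-consistency loss over latent draws: each direction of the mapping is conditioned on a sampled latent code, and the loss is averaged over samples. The toy closed-form "generators" below are placeholders for illustration only, not the networks used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def generator_xy(x, z):
    # Hypothetical toy mapping X -> Y conditioned on latent code z
    # (stands in for a learned generator network).
    return x + 0.1 * z

def generator_yx(y, z):
    # Hypothetical toy mapping Y -> X conditioned on latent code z.
    return y - 0.1 * z

def cycle_loss_marginalized(x, n_samples=8):
    """Monte Carlo estimate of the L1 cycle-consistency loss,
    marginalizing over latent samples z ~ N(0, I)."""
    losses = []
    for _ in range(n_samples):
        z1 = rng.standard_normal(x.shape)
        z2 = rng.standard_normal(x.shape)
        y_fake = generator_xy(x, z1)       # forward mapping with sampled latent
        x_rec = generator_yx(y_fake, z2)   # reverse mapping with a fresh latent
        losses.append(np.abs(x - x_rec).mean())
    return float(np.mean(losses))

x = rng.standard_normal((4, 3))
loss = cycle_loss_marginalized(x)
print(loss)
```

In training, this marginalized cycle term would be combined with the adversarial losses and a prior term to form the MAP objective; the sketch only shows how latent sampling turns a single deterministic cycle loss into an expectation over latent codes.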


Code is now available at https://github.com/ranery/Bayesian-CycleGAN !


Bibtex

If you find this work inspiring, please cite:

@ARTICLE{you2020bayesian,
  author={H. {You} and Y. {Cheng} and T. {Cheng} and C. {Li} and P. {Zhou}},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  title={Bayesian Cycle-Consistent Generative Adversarial Networks via Marginalizing Latent Sampling},
  year={2020},
  pages={1-15},
  doi={10.1109/TNNLS.2020.3017669},
  ISSN={2162-2388},
}