Learning Cycle-Consistent Cooperative Networks via Alternating MCMC Teaching for Unsupervised Cross-Domain Translation



Jianwen Xie 1*, Zilong Zheng 2*, Xiaolin Fang 3, Song-Chun Zhu 2,4,5, and Ying Nian Wu 2

(* Equal contributions)
1 Cognitive Computing Lab, Baidu Research, USA
2 University of California, Los Angeles (UCLA), USA
3 Massachusetts Institute of Technology, USA
4 Tsinghua University, China
5 Peking University, China


Abstract

This paper studies the unsupervised cross-domain translation problem by proposing a generative framework, in which the probability distribution of each domain is represented by a generative cooperative network that consists of an energy-based model and a latent variable model. The use of a generative cooperative network enables maximum likelihood learning of the domain model by MCMC teaching, where the energy-based model seeks to fit the data distribution of the domain and distills its knowledge to the latent variable model via MCMC. Specifically, in the MCMC teaching process, the latent variable model, parameterized by an encoder-decoder, maps examples from the source domain to the target domain, while the energy-based model further refines the mapped results by Langevin revision such that the revised results match the examples in the target domain in terms of the statistical properties defined by the learned energy function. To build a correspondence between two unpaired domains, the proposed framework simultaneously learns a pair of cooperative networks with cycle consistency, accounting for a two-way translation between the two domains, by alternating MCMC teaching. Experiments show that the proposed framework is useful for unsupervised image-to-image translation and unpaired image sequence translation.
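
To make the alternating MCMC teaching procedure concrete, the sketch below shows one X-to-Y training step in TensorFlow 2: the encoder-decoder proposes a translation, the energy-based model revises it by Langevin dynamics, the energy function is updated by maximum likelihood, and the generator is updated to regress the revised samples together with a cycle-consistency term. This is a minimal illustration under assumed names and hyperparameters (langevin_revision, gen_xy, gen_yx, energy_y, step sizes), not the released implementation; the Y-to-X direction, alternated with this one, is symmetric. Please refer to the code linked under "Code and Data" below for the actual implementation.

    import tensorflow as tf

    def langevin_revision(energy_fn, x_init, n_steps=30, step_size=0.002):
        # Refine initial samples by Langevin dynamics on the learned energy function:
        #   x <- x - (step_size^2 / 2) * dE(x)/dx + step_size * N(0, I)
        x = tf.identity(x_init)
        for _ in range(n_steps):
            with tf.GradientTape() as tape:
                tape.watch(x)
                energy = tf.reduce_sum(energy_fn(x))
            grad = tape.gradient(energy, x)
            x = x - 0.5 * step_size ** 2 * grad + step_size * tf.random.normal(tf.shape(x))
        return x

    def xy_teaching_step(x_batch, y_batch, gen_xy, gen_yx, energy_y, opt_gen, opt_ebm, lam=10.0):
        # One X -> Y step of alternating MCMC teaching (hypothetical names; Y -> X is symmetric).
        y_mapped = gen_xy(x_batch)                         # encoder-decoder proposal
        y_revised = langevin_revision(energy_y, y_mapped)  # MCMC revision toward the Y distribution

        # Maximum likelihood update of the energy-based model:
        # lower the energy of observed Y examples, raise it on the revised samples.
        with tf.GradientTape() as tape:
            ebm_loss = tf.reduce_mean(energy_y(y_batch)) - tf.reduce_mean(energy_y(y_revised))
        ebm_grads = tape.gradient(ebm_loss, energy_y.trainable_variables)
        opt_ebm.apply_gradients(zip(ebm_grads, energy_y.trainable_variables))

        # MCMC teaching: the generator chases the revised samples;
        # cycle consistency ties the two translation directions together.
        with tf.GradientTape() as tape:
            y_hat = gen_xy(x_batch)
            teach_loss = tf.reduce_mean(tf.square(y_hat - tf.stop_gradient(y_revised)))
            cycle_loss = tf.reduce_mean(tf.abs(gen_yx(y_hat) - x_batch))
            gen_loss = teach_loss + lam * cycle_loss
        gen_vars = gen_xy.trainable_variables + gen_yx.trainable_variables
        gen_grads = tape.gradient(gen_loss, gen_vars)
        opt_gen.apply_gradients(zip(gen_grads, gen_vars))
        return gen_loss

In this sketch, gen_xy and gen_yx would be Keras encoder-decoder models, energy_y a Keras network mapping an image batch to per-example energies, and opt_gen / opt_ebm tf.keras optimizers; alternating this step with its Y-to-X counterpart gives the two-way translation described above.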

Results

Paper

The AAAI conference paper can be downloaded here.

The AAAI LaTeX source can be downloaded here.

The poster can be downloaded here.

The slides can be downloaded here.

Code and Data

The Python (TensorFlow) code can be downloaded here.

If you wish to use our code, please cite the following paper: 

Learning Cycle-Consistent Cooperative Networks via Alternating MCMC Teaching for Unsupervised Cross-Domain Translation
Jianwen Xie*, Zilong Zheng*, Xiaolin Fang, Song-Chun Zhu, Ying Nian Wu
The Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI) 2021

Related Work
