Learning Energy-Based Model with Variational Auto-Encoder as Amortized Sampler



Jianwen Xie, Zilong Zheng, and Ping Li

Cognitive Computing Lab, Baidu Research, USA


Abstract

Due to the intractable partition function, training energy-based models (EBMs) by maximum likelihood requires Markov chain Monte Carlo (MCMC) sampling to approximate the gradient of the Kullback-Leibler divergence between data and model distributions. However, it is non-trivial to sample from an EBM because of the difficulty of mixing between modes. In this paper, we propose to learn a variational auto-encoder (VAE) to initialize the finite-step MCMC, such as Langevin dynamics derived from the energy function, for efficient amortized sampling of the EBM. With these amortized MCMC samples, the EBM can be trained by maximum likelihood, which follows an "analysis by synthesis" scheme, while the VAE learns from these MCMC samples via variational Bayes. We call this joint training algorithm the variational MCMC teaching, in which the VAE chases the EBM toward the data distribution. We interpret the learning algorithm as a dynamic alternating projection in the context of information geometry. Our proposed models can generate samples comparable to GANs and EBMs. Additionally, we demonstrate that our model can learn effective probabilistic distributions for supervised conditional learning tasks.
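
To give a concrete picture of the training loop summarized above, the following is a minimal PyTorch sketch of variational MCMC teaching: the VAE proposes initial samples, finite-step Langevin dynamics refines them under the energy function, the EBM is updated by maximum likelihood on data versus refined samples, and the VAE is updated by variational Bayes on the refined samples. The architectures, step sizes, and names (EnergyNet, VAE, langevin_refine, train_step) are illustrative assumptions and do not reproduce the released code; please refer to the Code and Data section for the actual implementation.

# Minimal sketch of variational MCMC teaching (illustrative, not the released code).
import torch
import torch.nn as nn

class EnergyNet(nn.Module):
    """Scalar energy f_theta(x) for a flattened input x."""
    def __init__(self, dim=784, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, 1),
        )
    def forward(self, x):
        return self.net(x).squeeze(-1)

class VAE(nn.Module):
    """Amortized sampler: encoder q_phi(z|x) and decoder p_alpha(x|z)."""
    def __init__(self, dim=784, zdim=32, hidden=256):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(),
                                 nn.Linear(hidden, 2 * zdim))
        self.dec = nn.Sequential(nn.Linear(zdim, hidden), nn.SiLU(),
                                 nn.Linear(hidden, dim))
        self.zdim = zdim
    def sample(self, n):
        # Ancestral sampling: draw z from the prior, decode to x.
        z = torch.randn(n, self.zdim)
        return self.dec(z)
    def elbo(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        recon = self.dec(z)
        rec = ((recon - x) ** 2).sum(-1)  # Gaussian reconstruction term
        kl = 0.5 * (mu ** 2 + logvar.exp() - logvar - 1).sum(-1)
        return -(rec + kl).mean()

def langevin_refine(ebm, x, steps=20, step_size=0.01):
    """Finite-step Langevin dynamics initialized from the VAE samples."""
    x = x.detach().clone().requires_grad_(True)
    for _ in range(steps):
        grad = torch.autograd.grad(ebm(x).sum(), x)[0]
        x = x - 0.5 * step_size ** 2 * grad + step_size * torch.randn_like(x)
        x = x.detach().requires_grad_(True)
    return x.detach()

def train_step(ebm, vae, x_data, opt_ebm, opt_vae):
    # 1) VAE proposes initial samples; Langevin dynamics refines them.
    x_init = vae.sample(x_data.size(0))
    x_syn = langevin_refine(ebm, x_init)

    # 2) EBM maximum-likelihood update ("analysis by synthesis"):
    #    lower the energy of observed data, raise it on synthesized samples.
    loss_ebm = ebm(x_data).mean() - ebm(x_syn).mean()
    opt_ebm.zero_grad()
    loss_ebm.backward()
    opt_ebm.step()

    # 3) VAE update by variational Bayes on the refined MCMC samples
    #    (MCMC teaching), so the amortized sampler chases the EBM.
    loss_vae = -vae.elbo(x_syn)
    opt_vae.zero_grad()
    loss_vae.backward()
    opt_vae.step()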

Results


Paper

The AAAI conference paper can be downloaded here.

The AAAI TeX file can be downloaded here.

The poster can be downloaded here.

The slides can be downloaded here.

Code and Data

The code can be downloaded here. The pretrained model can be downloaded here.

If you wish to use our code, please cite the following paper: 

Learning Energy-Based Model with Variational Auto-Encoder as Amortized Sampler
Jianwen Xie, Zilong Zheng, Ping Li
The Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI) 2021

Related Work
