Stat 202C: Monte Carlo Methods for Optimization 

 MW 2-3:15 pm, Spring 2018, Math Science 5128


Course Description

This graduate-level course introduces Monte Carlo methods for simulation, optimization, estimation, learning, and complex landscape visualization, including: importance sampling; sequential importance sampling; Markov chain Monte Carlo (MCMC) sampling techniques, including Gibbs samplers, Metropolis-Hastings, and various improvements; simulated annealing; exact sampling; convergence analysis; data augmentation; cluster sampling, such as Swendsen-Wang and SW-cuts; Hamiltonian and Langevin Monte Carlo; equi-energy and multi-domain samplers; and techniques for mapping complex energy landscapes.

   The lectures will be based on the following book draft.
Grading Plan: 4 units, letter grades
    The grade will be based on the following components:
        2 homework assignments                      20%
        3 small projects                            45%
        Final exam                                  35%

Tentative List of Topics

   Chapter 1,  Introduction to Monte Carlo Methods                 [Lect1.pdf] (PDF files will be distributed through CCLE)
   1. Monte Carlo methods in science and engineering
      -- Simulation, estimation, sampling, optimization, learning, and visualization.
   2. Topics and issues in Monte Carlo methods

  Chapter 2,   Sequential Monte Carlo                          
   1. Importance sampling and weighted samples 
   2. Advanced importance sampling techniques 
   3. Framework for sequential Monte Carlo 
         (selection, pruning, resampling, ...)          
   4. Application: particle filtering in object tracking, Monte Carlo Tree Search         
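
   The importance-sampling machinery in items 1-2 can be sketched in a few lines; the N(0,1) target, the N(0,4) proposal, and the test function h(x) = x^2 below are assumptions chosen for illustration, not an example from the course:

```python
import numpy as np

rng = np.random.default_rng(0)

# Estimate E[X^2] = 1 under N(0,1) using a heavier-tailed proposal
# q = N(0, 2^2), via self-normalized importance sampling.
n = 100_000
x = rng.normal(0.0, 2.0, size=n)              # draws from the proposal q
log_p = -0.5 * x**2                           # unnormalized log target N(0,1)
log_q = -0.5 * (x / 2.0)**2 - np.log(2.0)     # log proposal (shared constants cancel)
w = np.exp(log_p - log_q)                     # importance weights p/q
w /= w.sum()                                  # self-normalize
estimate = np.sum(w * x**2)                   # weighted estimate of E[X^2]
ess = 1.0 / np.sum(w**2)                      # effective sample size
print(estimate, ess)
```

   The effective sample size 1 / sum(w_i^2) is a standard diagnostic for weight degeneracy; it drives the resampling decisions in the sequential framework of item 3.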

  Chapter 3,  Background on Markov Chains
   1. The transition matrix 
   2. Topology of transition matrix: communication and period 
   3. Positive recurrence and invariant measures 
   4. Ergodicity theorem                 
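
   These notions can be checked numerically; the 3-state transition matrix below is an assumed toy example:

```python
import numpy as np

# A small ergodic chain on 3 states (rows sum to 1); an assumed toy example.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# The invariant measure pi satisfies pi P = pi: take the left eigenvector
# of P for eigenvalue 1, i.e. the leading eigenvector of P^T.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

# Ergodicity: mu P^n -> pi from any starting distribution mu.
mu = np.array([1.0, 0.0, 0.0])
for _ in range(200):
    mu = mu @ P
print(pi, mu)
```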

   Chapter 4, Metropolis methods and their variants
   1. The Metropolis algorithm and Hastings' generalization
   2. Special case: Metropolized independence sampler    
   3. Reversible jumps and trans-dimensional MCMC           
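
   A minimal random-walk Metropolis sketch (the N(3,1) target and unit proposal scale are assumptions for illustration); since the Gaussian proposal is symmetric, the Hastings ratio reduces to the plain Metropolis ratio p(y)/p(x):

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    """Unnormalized log density of the target, here N(3, 1)."""
    return -0.5 * (x - 3.0)**2

x, chain = 0.0, []
for _ in range(50_000):
    y = x + rng.normal(0.0, 1.0)                       # symmetric proposal
    if np.log(rng.uniform()) < log_target(y) - log_target(x):
        x = y                                          # accept
    chain.append(x)                                    # else keep current state

samples = np.array(chain[5_000:])                      # discard burn-in
print(samples.mean(), samples.std())
```

   Note that a rejected proposal still contributes the current state to the chain; dropping rejections would bias the samples.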

   Chapter 5 Gibbs sampler and its variants
   1. Gibbs sampler                           
   2. Generalizations: 
       Hit-and-run, Multi-grid, generalized Gibbs, Metropolized Gibbs
   3. Data association and data augmentation
   4. Slice sampling 
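
   Item 1 can be sketched on a bivariate normal, where both full conditionals are available in closed form (the correlation rho = 0.8 is an assumption for the example):

```python
import numpy as np

rng = np.random.default_rng(2)

# Gibbs sampler for a bivariate standard normal with correlation rho:
# each full conditional is N(rho * other, 1 - rho^2).
rho = 0.8
x, y = 0.0, 0.0
xs, ys = [], []
for _ in range(50_000):
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))   # draw x | y
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))   # draw y | x
    xs.append(x)
    ys.append(y)

xs, ys = np.array(xs[5_000:]), np.array(ys[5_000:])
print(np.corrcoef(xs, ys)[0, 1])
```

   The stronger the correlation between coordinates, the slower this coordinate-wise scheme mixes, which motivates the generalizations in item 2.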

   Chapter 6  Cluster sampling
   1. Ising/Potts models
   2. Swendsen-Wang and cluster sampling
   3. Three interpretations of the SW method 
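
   A minimal Swendsen-Wang sketch for a small Ising model (the lattice size, temperature, and sweep count are assumptions for illustration): bonds between equal neighboring spins are frozen with probability 1 - exp(-2*beta), and each resulting cluster is then flipped to a fresh uniform spin:

```python
import numpy as np

rng = np.random.default_rng(4)

L, beta = 16, 0.3                      # lattice side and inverse temperature (assumed)
spins = rng.choice([-1, 1], size=(L, L))

def sw_sweep(spins):
    p_bond = 1.0 - np.exp(-2.0 * beta)
    # Sample bonds to right and down neighbors (periodic boundary).
    same_r = spins == np.roll(spins, -1, axis=1)
    same_d = spins == np.roll(spins, -1, axis=0)
    bond_r = same_r & (rng.random((L, L)) < p_bond)
    bond_d = same_d & (rng.random((L, L)) < p_bond)
    # Union-find over sites to form connected clusters of frozen bonds.
    parent = list(range(L * L))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    def union(i, j):
        parent[find(i)] = find(j)
    for i in range(L):
        for j in range(L):
            if bond_r[i, j]:
                union(i * L + j, i * L + (j + 1) % L)
            if bond_d[i, j]:
                union(i * L + j, ((i + 1) % L) * L + j)
    # Assign each cluster a fresh uniform spin.
    new_spin = {}
    out = np.empty_like(spins)
    for i in range(L):
        for j in range(L):
            r = find(i * L + j)
            if r not in new_spin:
                new_spin[r] = rng.choice([-1, 1])
            out[i, j] = new_spin[r]
    return out

for _ in range(100):
    spins = sw_sweep(spins)
print(abs(spins.mean()))               # magnetization after the sweeps
```

   Because whole clusters flip at once, the sampler avoids the critical slowing-down that single-site updates suffer near the phase transition.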

  Chapter 7 Hamiltonian and Langevin Monte Carlo
   1. Hamiltonian Monte Carlo
   2. Langevin dynamics used in machine learning
       Gibbs reaction and diffusion equations, alternating back-propagation
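
   Item 2 can be illustrated with the unadjusted Langevin algorithm on a standard normal target (the step size and iteration counts are assumptions; adding a Metropolis accept/reject step would give MALA):

```python
import numpy as np

rng = np.random.default_rng(3)

def grad_log_p(x):
    """Gradient of the log density of the target N(0, 1)."""
    return -x

# Unadjusted Langevin algorithm (ULA): Euler discretization of the
# Langevin diffusion dX = (1/2) grad log p(X) dt + dW.
eps = 0.05                                 # step size (assumed)
x = 5.0                                    # deliberately far-off start
trace = []
for _ in range(100_000):
    x = x + 0.5 * eps * grad_log_p(x) + np.sqrt(eps) * rng.normal()
    trace.append(x)

samples = np.array(trace[10_000:])         # discard burn-in
print(samples.mean(), samples.var())
```

   The discretization leaves a small O(eps) bias in the stationary distribution, which is the price ULA pays for skipping the accept/reject correction.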

  Chapter 8 Convergence analysis                                       
   1. Monitoring and diagnosing convergence 
   2*. Contraction coefficient 
   3. Peskun's ordering 
   4*. Eigen-structures of the transition matrix 
         (Perron-Frobenius theorem, spectral theorem)
   5. Geometric bounds 
   6*. Exact analysis of the Metropolized independence sampler (IMS) 
   7*. First hitting time analysis and bounds for IMS (paper) 
   8. Path coupling techniques.
         Bounds for the Gibbs sampler and the Swendsen-Wang algorithm (paper).
   * Discussed in previous chapters.
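
   The eigen-structure view of item 4 can be illustrated on an assumed 3-state toy chain: beyond the Perron eigenvalue 1, the second-largest eigenvalue modulus sets the geometric rate of convergence to stationarity:

```python
import numpy as np

# An assumed 3-state toy chain (rows sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
lam2 = sorted(np.abs(np.linalg.eigvals(P)))[-2]   # second-largest eigenvalue modulus

# Stationary distribution as the left eigenvector for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

# Total-variation distance to pi decays geometrically, at rate |lambda_2|.
mu = np.array([1.0, 0.0, 0.0])
tv = []
for _ in range(15):
    mu = mu @ P
    tv.append(0.5 * np.abs(mu - pi).sum())
rate = tv[-1] / tv[-2]                            # empirical decay factor per step
print(lam2, rate)
```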

   Chapter 9  Exact sampling
   1. Coupling from the past (CFTP)  
   2. Bounding chains
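
   A minimal CFTP sketch on an assumed 3-state toy chain: the grand coupling is run from time -T with fixed randomness, doubling T until all starting states coalesce; the common state at time 0 is then an exact draw from the stationary distribution:

```python
import numpy as np

# An assumed 3-state toy chain (rows sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
C = np.cumsum(P, axis=1)                 # per-state inverse-CDF tables

def step(state, u):
    """Deterministic update: move from `state` using the shared uniform u."""
    return int(np.searchsorted(C[state], u))

def cftp(rng):
    us = []                              # randomness for times -1, -2, ... (reused as T grows)
    T = 1
    while True:
        while len(us) < T:
            us.append(rng.uniform())
        states = list(range(3))
        for t in range(T - 1, -1, -1):   # run from time -T up to 0
            states = [step(s, us[t]) for s in states]
        if len(set(states)) == 1:        # all starting states coalesced
            return states[0]
        T *= 2                           # extend further into the past

rng = np.random.default_rng(5)
draws = [cftp(rng) for _ in range(20_000)]
counts = np.bincount(draws, minlength=3) / len(draws)
print(counts)
```

   Reusing the same randomness for each time slot as T grows is essential; resampling it would bias the output.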

  Chapter 10 Advanced topics                                           
   1. Equi-energy and multi-domain samplers 
   2. Wang-Landau algorithm
   3. Attraction-Diffusion Algorithm
   4. Mapping the energy landscape and case studies
   5. Visualization of object recognition and the image universe
   6. Landscapes for curriculum learning