Stat 231 / CS 276A

Pattern Recognition and Machine Learning

TR 3:30-4:45 PM, Fall 2015, Lakretz 120      

www.stat.ucla.edu/~sczhu/Courses/UCLA/Stat_231/Stat_231.html

Course Description

This course introduces fundamental concepts, theories, and algorithms for pattern recognition and machine learning,
which are used in computer vision, speech recognition, data mining, statistics, information retrieval, and bioinformatics.
Topics include: Bayesian decision theory, parametric and non-parametric learning, data clustering, component analysis,
boosting techniques, kernel methods and support vector machine.
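As a preview of the first topic, Bayesian decision theory, here is a minimal sketch of the two-class Bayes decision rule. The priors and Gaussian class-conditional parameters below are made-up illustrative values, not course material:

```python
import math

# Two-class Bayes decision rule: pick the class with the larger
# posterior p(w|x) ∝ p(x|w) * P(w). Priors and (mean, std) of the
# Gaussian class-conditionals are illustrative assumptions.
PRIORS = {"w1": 0.6, "w2": 0.4}
PARAMS = {"w1": (0.0, 1.0), "w2": (2.0, 1.0)}  # (mean, std) per class

def gaussian_pdf(x, mean, std):
    z = (x - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2.0 * math.pi))

def decide(x):
    # Unnormalized posteriors; the shared evidence p(x) cancels.
    scores = {w: gaussian_pdf(x, *PARAMS[w]) * PRIORS[w] for w in PRIORS}
    return max(scores, key=scores.get)

print(decide(-1.0))  # far below w2's mean, so "w1" wins
print(decide(3.0))   # close to w2's mean, so "w2" wins
```

With zero-one loss this rule minimizes the probability of error, which is the starting point of lectures 2-3.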

Prerequisites

You do not need to have taken these exact courses, as long as you know the material.

Textbook

The textbook is not mandatory if you can understand the lecture notes and handouts.

Instructors

Grading Plan: 4 units, letter grades

Two homework assignments: 20%
Three projects (15% each): 45%
Midterm exam: none (0%)
Final exam (Dec 8, Tuesday, 11:30 AM-2:30 PM, closed-book): 35%

Grading policy

Tentative Schedule for 2015

Lecture | Date | Topics | Reading Materials | Handouts
1 | 09-24 | Introduction to Pattern Recognition [problems, applications, examples, and project introduction] | Ch 1 |
2 | 09-29 | Bayesian Decision Theory I [Bayes rule, discriminant functions] | Ch 2.1-2.6 |
3 | 10-01 | Bayesian Decision Theory II [loss functions and Bayesian error analysis] | Ch 2.1-2.6 |
4 | 10-06 | Component Analysis and Dimension Reduction I [principal component analysis (PCA), face modeling; explanation of Project 1: code and data format] | Ch 3.8.1, Ch 10.13.1 | Project 1
5 | 10-08 | Component Analysis and Dimension Reduction II [Fisher linear discriminant, multi-dimensional scaling (MDS)] | Ch 3.8.2, Ch 10.14 |
6 | 10-13 | Component Analysis and Dimension Reduction III [local linear embedding (LLE), intrinsic dimension] | paper |
7 | 10-15 | Boosting Techniques I [perceptron, backpropagation, and AdaBoost] | Ch 9.5 |
8 | 10-20 | Boosting Techniques II [RealBoost and example on face detection; explanation of Project 2] | |
9 | 10-22 | Boosting Techniques III [analysis, LogitBoost, cascade, and decision policy] | |
10 | 10-27 | Boosting Techniques III (continued) [analysis, LogitBoost, cascade, and decision policy] | |
11 | 10-29 | Non-metric Methods I [tree-structured classification] | Ch 8.1-8.3 |
12 | 11-03 | Non-metric Methods II [syntactic pattern recognition and example on human parsing] | Ch 8.5-8.8 | Lect11.pdf
13 | 11-05 | Support Vector Machines I [kernel-induced feature space] | |
14 | 11-10 | Support Vector Machines II [support vector classifier; explanation of Project 3] | |
15 | 11-12 | Support Vector Machines III [loss functions, latent SVM, neural networks and DeepNet] | Ch 5.11 |
16 | 11-17 | Parametric Learning [maximum likelihood estimation (MLE), sufficient statistics and maximum entropy] | Ch 3.1-3.6 |
17 | 11-19 | Non-parametric Learning I [Parzen window and k-NN classifier] | Ch 4.1-4.5 |
18 | 11-24 | Non-parametric Learning II [k-NN classifier and error analysis] | Ch 4.6 | handout
- | 11-26 | Thanksgiving Holiday | |
19 | 12-01 | Topics on Deep Learning | Tutorial |
20 | 12-03 | Advanced Topics on Lifelong Machine Learning | |
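Project 1 (lecture 4) applies PCA to face modeling. As a hedged preview, here is a minimal sketch that extracts the first principal component of 2-D points using the closed-form eigendecomposition of the 2x2 sample covariance matrix; the data points are made-up illustrative values, not project data:

```python
import math

# PCA on 2-D points via the 2x2 sample covariance matrix.
# Data points are illustrative, made-up values.
data = [(2.5, 2.4), (0.5, 0.7), (2.2, 2.9), (1.9, 2.2),
        (3.1, 3.0), (2.3, 2.7), (2.0, 1.6), (1.0, 1.1)]

n = len(data)
mx = sum(x for x, _ in data) / n
my = sum(y for _, y in data) / n

# Sample covariance matrix [[cxx, cxy], [cxy, cyy]].
cxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
cyy = sum((y - my) ** 2 for _, y in data) / (n - 1)
cxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)

# Largest eigenvalue of a symmetric 2x2 matrix (closed form).
tr, det = cxx + cyy, cxx * cyy - cxy * cxy
lam = tr / 2.0 + math.sqrt(tr * tr / 4.0 - det)

# Corresponding unit eigenvector = first principal component.
# (Assumes cxy != 0, which holds for this data.)
vx, vy = cxy, lam - cxx
norm = math.hypot(vx, vy)
pc1 = (vx / norm, vy / norm)
print(pc1)
```

Projecting the centered data onto `pc1` gives the 1-D representation with maximal variance; the same idea, applied to face images in high dimensions, yields the eigenface model used in Project 1.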