Stat 231 / CS 276A
Pattern Recognition and Machine Learning
MW 3:30–4:45 PM, Fall 2016, Kinsey Pavilion 1220B
www.stat.ucla.edu/~sczhu/Courses/UCLA/Stat_231/Stat_231.html
Course Description
This course introduces fundamental concepts, theories, and algorithms for
pattern recognition and machine learning,
which are used in computer vision,
speech recognition, data mining, statistics, information retrieval, and bioinformatics.
Topics include: Bayesian decision theory, parametric and nonparametric learning,
data clustering,
component analysis,
boosting techniques, kernel methods and support vector machines, and deep learning with neural networks.
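As a small taste of the first topic (Bayesian decision theory), the sketch below classifies a scalar observation by comparing class posteriors. All priors, means, and variances are hypothetical illustration values, not material from the course:

```python
import numpy as np

# Hypothetical two-class problem: priors and Gaussian class-conditional
# densities (all numbers are illustrative only).
priors = np.array([0.6, 0.4])   # P(w0), P(w1)
means = np.array([0.0, 2.0])    # class-conditional means
std = 1.0                       # shared standard deviation

def likelihood(x, k):
    """Gaussian density p(x | w_k)."""
    return np.exp(-0.5 * ((x - means[k]) / std) ** 2) / (std * np.sqrt(2 * np.pi))

def bayes_decide(x):
    """Pick the class with the larger posterior: P(w_k | x) is proportional
    to p(x | w_k) * P(w_k), so the evidence p(x) can be ignored."""
    posteriors = [likelihood(x, k) * priors[k] for k in range(2)]
    return int(np.argmax(posteriors))

print(bayes_decide(-1.0))  # well below both means -> class 0
print(bayes_decide(3.0))   # near the mean of class 1 -> class 1
```

The decision boundary sits where the two weighted densities cross; later lectures generalize this with loss functions and discriminant functions.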
Prerequisites
You do not need to have taken these exact courses, as long as you know the material.
- Math 33A: Linear Algebra and Its Applications, Matrix Analysis
- Stat 100B: Introduction to Mathematical Statistics
- CS 180: Introduction to Algorithms and Complexity
- Programming skills in Matlab or Python
Textbook
A textbook is not mandatory if you can follow the lecture notes and handouts.
- R. Duda et al., Pattern Classification, John Wiley & Sons, 2001. [Good for CS students]
- T. Hastie et al., The Elements of Statistical Learning, Springer, 2009. [Good for Stat students]
- C. Bishop, Pattern Recognition and Machine Learning, Springer, 2006. [With advanced material]
Instructors
- Prof. Song-Chun Zhu, sczhu@stat.ucla.edu, office: Boelter Hall 9404.
  Office hours: Tuesday 1:00–3:00 pm
- Reader: Yang Lu, yanglv@ucla.edu, office: Boelter Hall 9406.
  Office hours: Thursday 1:00–3:00 pm
Grading Plan: 4 units, letter grades
- Project 1: 15%
- Project 2: 15%
- Project 3: 15%
- Project 4: 15%
- Final Exam (Dec 8, Thursday, 3:00–6:00 pm, closed-book): 40%
Grading policy
- You are encouraged to work and discuss in a group, but each person must complete his/her own project. Submit your project report and code through the CCLE website.
- You have a total of four late days (not counting weekends) for the entire class (4 projects); once the four late days are used, no credit will be given for late homework/projects.
Tentative Schedule for 2016 (Once the enrollment is fixed, course materials will be posted on the CCLE site.)
Lecture | Date  | Topics
1       | 09/26 | Introduction to Pattern Recognition [problems, applications, examples, and Project 0]
2       | 09/28 | Bayesian Decision Theory I [Bayes rule, discriminant functions]
3       | 10/03 | Bayesian Decision Theory II [loss functions and Bayesian error analysis]
4       | 10/05 | Component Analysis and Dimension Reduction I [PCA, face modeling; Project 1: code and data format]
5       | 10/10 | Component Analysis and Dimension Reduction II [Fisher linear discriminant; multidimensional scaling (MDS)]
6       | 10/12 | Component Analysis and Dimension Reduction III [locally linear embedding (LLE), intrinsic dimension]
7       | 10/17 | Boosting Techniques I [perceptron, backpropagation, and AdaBoost]
8       | 10/19 | Boosting Techniques II [RealBoost and example on face detection; explanation of Project 2]
9       | 10/24 | Boosting Techniques III [analysis, LogitBoost, cascade and decision policy]
10      | 10/26 | Boosting Techniques III, continued [analysis, LogitBoost, cascade and decision policy]
11      | 10/31 | Non-metric Methods I [decision trees and random forests]
12      | 11/02 | Non-metric Methods II [syntactic pattern recognition and example on human parsing]
13      | 11/07 | Support Vector Machines I [kernel-induced feature space]
14      | 11/09 | Support Vector Machines II [support vector classifier; explanation of Project 3]
15      | 11/14 | Support Vector Machines III [loss functions, latent SVM]
16      | 11/16 | Parametric Learning [maximum likelihood estimation (MLE); sufficient statistics and maximum entropy]
17      | 11/21 | Nonparametric Learning I [Parzen window and kNN classifier]
18      | 11/23 | Nonparametric Learning II [kNN classifier and error analysis]
19      | 11/28 |
20      | 11/30 |