TR 3:30-4:45 PM, Fall 2015, Lakretz 120
www.stat.ucla.edu/~sczhu/Courses/UCLA/Stat_231/Stat_231.html
This course introduces fundamental concepts, theories, and algorithms for pattern recognition and machine learning, which are used in computer vision, speech recognition, data mining, statistics, information retrieval, and bioinformatics. Topics include: Bayesian decision theory, parametric and nonparametric learning, data clustering, component analysis, boosting techniques, kernel methods, and support vector machines.
You do not have to have taken exactly these courses, as long as you know the material.
The textbook is not mandatory if you can understand the lecture notes and handouts.
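As a flavor of the first scheduled topic (Bayesian decision theory), here is a minimal sketch of the Bayes decision rule: pick the class that maximizes the posterior p(class | x). The two one-dimensional Gaussian class-conditional densities and equal priors below are illustrative assumptions, not course data.

```python
import numpy as np

def posterior_scores(x, means, stds, priors):
    """Unnormalized posteriors p(x | c) * p(c) for each class c,
    assuming Gaussian class-conditional densities (an illustrative choice)."""
    likelihoods = np.array([
        np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))
        for m, s in zip(means, stds)
    ])
    return likelihoods * np.array(priors)

def bayes_decide(x, means, stds, priors):
    """Bayes decision rule: return the index of the class with the
    largest posterior (equivalently, the minimum-error decision)."""
    return int(np.argmax(posterior_scores(x, means, stds, priors)))

# Two classes: N(0, 1) and N(3, 1) with equal priors; the decision
# boundary then sits at the midpoint x = 1.5.
label = bayes_decide(0.5, means=[0.0, 3.0], stds=[1.0, 1.0], priors=[0.5, 0.5])
```

With equal priors and equal variances the rule reduces to nearest-mean classification, which is the standard textbook special case.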
Two homework assignments: 20%
Three projects: 15% + 15% + 15%
Midterm exam: none (0%)
Final exam (Dec 8, Tuesday, 11:30 AM - 2:30 PM, closed-book): 35%
Tentative Schedule for 2015
Lecture | Date  | Topics | Reading Materials | Handouts
1  | 09/24 | Introduction to Pattern Recognition [problems, applications, examples, and project introduction] | Ch 1 |
2  | 09/29 | Bayesian Decision Theory I [Bayes rule, discriminant functions] | Ch 2.1-2.6 |
3  | 10/01 | Bayesian Decision Theory II [loss functions and Bayesian error analysis] | Ch 2.1-2.6 |
4  | 10/06 | Component Analysis and Dimension Reduction I [principal component analysis (PCA), face modeling] [explanation of Project 1: code and data format] | Ch 3.8.1, Ch 10.13.1 | Project 1
5  | 10/08 | Component Analysis and Dimension Reduction II [Fisher linear discriminant] [multidimensional scaling (MDS)] | Ch 3.8.2, Ch 10.14 |
6  | 10/13 | Component Analysis and Dimension Reduction III [locally linear embedding (LLE), intrinsic dimension] | paper |
7  | 10/15 | Boosting Techniques I [perceptron, backpropagation, and AdaBoost] | Ch 9.5 |
8  | 10/20 | Boosting Techniques II [RealBoost and example on face detection] [explanation of Project 2] | |
9  | 10/22 | Boosting Techniques III [analysis, LogitBoost, cascade, and decision policy] | |
10 | 10/27 | Boosting Techniques III (continued) [analysis, LogitBoost, cascade, and decision policy] | |
11 | 10/29 | Nonmetric Methods I [tree-structured classification] | Ch 8.1-8.3 |
12 | 11/03 | Nonmetric Methods II [syntactic pattern recognition and example on human parsing] | Ch 8.5-8.8 | Lect11.pdf
13 | 11/05 | Support Vector Machines I [kernel-induced feature space] | |
14 | 11/10 | Support Vector Machines II [support vector classifier] [explanation of Project 3] | |
15 | 11/12 | Support Vector Machines III [loss functions, latent SVM, neural networks, and DeepNet] | Ch 5.11 |
16 | 11/17 | Parametric Learning [maximum likelihood estimation (MLE)] [sufficient statistics and maximum entropy] | Ch 3.1-3.6 |
17 | 11/19 | Nonparametric Learning I | Ch 4.1-4.5 |
18 | 11/24 | Nonparametric Learning II [k-NN classifier and error analysis] | Ch 4.6 | handout
-- | 11/26 | Thanksgiving holiday (no class) | |
19 | 12/01 | Topics on Deep Learning | Tutorial |
20 | 12/03 | Advanced Topics on Lifelong Machine Learning | |
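One of the scheduled topics, the k-NN classifier of Lecture 18, admits a very short sketch: classify a query point by majority vote among its k nearest training points. The tiny two-cluster dataset below is an illustrative assumption, not course data.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """k-nearest-neighbor classification: majority vote among the
    k training points closest to x under Euclidean distance."""
    dists = np.linalg.norm(X_train - x, axis=1)   # distance to each training point
    nearest = np.argsort(dists)[:k]               # indices of the k closest points
    values, counts = np.unique(y_train[nearest], return_counts=True)
    return int(values[np.argmax(counts)])         # most common label among neighbors

# Toy data: two points near the origin (class 0), two near (1, 1) (class 1).
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
label = knn_predict(X, y, np.array([0.05, 0.1]), k=3)
```

For k = 1 the method reduces to nearest-neighbor classification, whose asymptotic error is at most twice the Bayes error, a result covered in the Ch 4.6 reading.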
