Stat 231 / CS 276A
Pattern Recognition and Machine Learning

Fall 2017, MW 2:00-3:15 PM, Physics and Astronomy Building 1434A
www.stat.ucla.edu/~sczhu/Courses/UCLA/Stat_231/Stat_231.html

Syllabus.pdf


Course Description

This course introduces fundamental concepts, theories, and algorithms for pattern recognition and machine learning,
which are used in computer vision, speech recognition, data mining, statistics, information retrieval, and bioinformatics.
Topics include Bayesian decision theory, parametric and non-parametric learning, data clustering, component analysis,
boosting techniques, support vector machines, and deep learning with neural networks.

Prerequisites

You do not need to have taken these exact courses, as long as you know the material they cover.

Textbook

The textbook is not mandatory if you can understand the lecture notes and handouts.

Instructors

Grading Plan

4 units, letter grades.

Five projects (60% in total):

  Project 1: 10%
  Project 2: 15%
  Project 3: 15%
  Project 4: 10%
  Project 5: 10%

Final Exam (date and room to be announced): 40%

Grading Policy

Schedule

Tentative Schedule for 2017 (once enrollment is fixed, course materials will be posted on the CCLE site).

Lecture  Date   Topics and Handouts
1        10-02  Introduction to Pattern Recognition [problems, applications, examples]
                Handouts: Lect1.pdf; Tutorial.zip (TA session on Tuesday 10/03)
2        10-04  Bayesian Decision Theory I [Bayes rule, discriminant functions]
                Handouts: Lect2.pdf
3        10-09  Bayesian Decision Theory II [loss functions and Bayesian error analysis]
4        10-11  Component Analysis and Dimension Reduction I [PCA, face modeling by the Active Appearance Model (AAM)]
5        10-16  Component Analysis and Dimension Reduction II [Fisher linear discriminant, multi-dimensional scaling (MDS)]
6        10-18  Component Analysis and Dimension Reduction III [local linear embedding (LLE), intrinsic dimension]
7        10-23  Boosting Techniques I [perceptron, backpropagation, and AdaBoost]
8        10-25  Boosting Techniques II [RealBoost and an example on face detection]
9        10-30  Boosting Techniques III [analysis, LogitBoost, cascade and decision policy]
10       11-01  Boosting Techniques III (continued) [analysis, LogitBoost, cascade and decision policy]
11       11-06  Non-metric Methods I [decision trees and random forests]
12       11-08  Non-metric Methods II [syntactic pattern recognition and an example on human parsing]
13       11-13  Support Vector Machines I [kernel-induced feature space]
14       11-15  Support Vector Machines II [support vector classifier]
15       11-20  Support Vector Machines III [loss functions, latent SVM]
16       11-22  Parametric Learning [maximum likelihood estimation, sufficient statistics and maximum entropy]
17       11-27  Non-parametric Learning I [Parzen window and k-NN classifier]
18       11-29  Non-parametric Learning II [k-NN classifier and error analysis]
19       12-04  Deep Learning I
20       12-06  Deep Learning II