This course introduces fundamental concepts, theories, and algorithms for pattern recognition and machine learning,
which are used in computer vision, speech recognition, data mining, statistics, information retrieval, and bioinformatics.
Topics include: Bayesian decision theory, parametric and non-parametric learning, data clustering, component analysis,
boosting techniques, support vector machines, and deep learning with neural networks.
You don't have to have taken exactly these courses, as long as you know the material.
The textbook is not mandatory if you can understand the lecture notes and handouts.
Grading:

| Component | Weight |
|---|---|
| Project 1 | 10% |
| Project 2 | 15% |
| Project 3 | 15% |
| Project 4 | 10% |
| Project 5 | 10% |
| Final Exam (Friday, December 14, 11:30 AM - 2:30 PM, Franz Hall 1260) | 40% |
Tentative Schedule for 2018 (Course materials will be posted on the CCLE site.)
| Lecture | Topics | Handouts |
|---|---|---|
| 1 | Introduction to Pattern Recognition [problems, applications, examples] | |
| 2 | Bayesian Decision Theory I [Bayes rule, discriminant functions] | Lecture2 pdf |
| 3 | Bayesian Decision Theory II [loss functions and Bayesian error analysis] | |
| 4 | Component Analysis and Dimension Reduction I [PCA, face modeling by Active Appearance Model (AAM), auto-encoder] | |
| 5 | Component Analysis and Dimension Reduction II [Fisher linear discriminant, multi-dimensional scaling (MDS)] | |
| 6 | Component Analysis and Dimension Reduction III [local linear embedding (LLE), intrinsic dimension] | |
| 7 | Boosting Techniques I [perceptron, backpropagation, and AdaBoost] | |
| 8 | Boosting Techniques II [RealBoost and an example on face detection] | |
| 9 | Boosting Techniques III [analysis, LogitBoost, cascade, and decision policy] | |
| 10 | Boosting Techniques III (continued) [analysis, LogitBoost, cascade, and decision policy] | |
| 11 | Non-metric Methods I [decision trees and random forests] | |
| 12 | Non-metric Methods II [syntactic pattern recognition and an example on human parsing] | |
| 13 | Support Vector Machines I [kernel-induced feature space] | |
| 14 | Support Vector Machines II [support vector classifier] | |
| 15 | Support Vector Machines III [loss functions, latent SVM] | |
| 16 | Parametric Learning [maximum likelihood estimation, sufficient statistics, and maximum entropy] | |
| 17 | Non-parametric Learning I [Parzen window and k-NN classifier] | |
| 18 | Non-parametric Learning II [k-NN classifier and error analysis] | |
| 19 | Deep Learning I | |
| 20 | Deep Learning II | |