Pattern Recognition and Machine Learning

TR 3:30-4:45 PM, Fall 2015, Lakretz 120

www.stat.ucla.edu/~sczhu/Courses/UCLA/Stat_231/Stat_231.html

Course Description

This course introduces fundamental concepts, theories, and algorithms for pattern recognition and machine learning,
which are used in computer vision, speech recognition, data mining, statistics, information retrieval, and bioinformatics.
Topics include: Bayesian decision theory, parametric and non-parametric learning, data clustering, component analysis,
boosting techniques, kernel methods, and support vector machines.
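As a taste of one listed topic, component analysis, here is a minimal sketch of principal component analysis (PCA) via the eigendecomposition of the sample covariance matrix. This is an illustration only, not course material; the data is synthetic and the function name is my own.

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the top-k principal components."""
    X_centered = X - X.mean(axis=0)                 # center each feature
    cov = np.cov(X_centered, rowvar=False)          # d x d sample covariance
    eigvals, eigvecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    top_k = eigvecs[:, np.argsort(eigvals)[::-1][:k]]  # top-k eigenvectors
    return X_centered @ top_k                       # n x k projected data

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))   # synthetic data: 100 samples, 5 features
Z = pca(X, 2)
print(Z.shape)                  # (100, 2)
```

Project 1 applies the same idea to face images ("eigenfaces"), where each row of X is a flattened image.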

Prerequisites

These exact courses are not required, as long as you know the material they cover.

• Math 33A: Linear Algebra and Its Applications, Matrix Analysis
• Stat 100B: Introduction to Mathematical Statistics
• CS 180: Introduction to Algorithms and Complexity
• Programming skills in Matlab or R

Textbook

The textbooks are not mandatory if you can understand the lecture notes and handouts.

• R. Duda, P. Hart, D. Stork, "Pattern Classification", second edition, 2000. [Good for CS students]
• T. Hastie, R. Tibshirani, and J.H. Friedman, "The Elements of Statistical Learning: Data Mining, Inference, and Prediction", Springer Series in Statistics, 2nd edition, 2009. [Good for Statistics students]

Instructors

• Prof. Song-Chun Zhu, sczhu@stat.ucla.edu, 310-206-8693, office: Boelter Hall 9404.
Office Hours: Tuesday 1:00-3:00pm
• Reader: Yang Liu, yangliu2014@ucla.edu, office: Boelter Hall 9401.
Office hours: Thursday 1:00-3:00pm

Grading

• Two homework assignments: 20%
• Three projects: 15% each
• Final: 35%

Schedule

| Lecture | Date | Topics | Reading Materials | Handouts |
|---|---|---|---|---|
| 1 | 09-24 | Introduction to Pattern Recognition [problems, applications, examples, and project introduction] | Ch 1 | syllabus.pdf, Lect1.pdf |
| 2 | 09-29 | Bayesian Decision Theory I [Bayes rule, discriminant functions] | Ch 2.1-2.6 | Lect2.pdf |
| 3 | 10-01 | Bayesian Decision Theory II [loss functions and Bayesian error analysis] | Ch 2.1-2.6 | Lect3.pdf |
| 4 | 10-06 | Component Analysis and Dimension Reduction I [principal component analysis (PCA), face modeling; explanation of Project 1: code and data format] | Ch 3.8.1, Ch 10.13.1, Project 1 | Lect4-5.pdf |
| 5 | 10-08 | Component Analysis and Dimension Reduction II [Fisher linear discriminant, multi-dimensional scaling (MDS)] | Ch 3.8.2, Ch 10.14 | FisherFace.pdf, Lect5-6.pdf |
| 6 | 10-13 | Component Analysis and Dimension Reduction III [local linear embedding (LLE), intrinsic dimension] | paper | LLE paper |
| 7 | 10-15 | Boosting Techniques I [perceptron, backpropagation, and AdaBoost] | Ch 9.5 | Lect7-9.pdf |
| 8 | 10-20 | Boosting Techniques II [RealBoost and example on face detection; explanation of Project II] | Tutorial | Handout 1, Handout 2 |
| 9 | 10-22 | Boosting Techniques III [analysis, LogitBoost, cascade, and decision policy] | | |
| 10 | 10-27 | Boosting Techniques III (continued) [analysis, LogitBoost, cascade, and decision policy] | | |
| 11 | 10-29 | Non-metric Methods I [tree-structured classification] | Ch 8.1-8.3 | Lect10.pdf |
| 12 | 11-03 | Non-metric Methods II [syntactic pattern recognition and example on human parsing] | Ch 8.5-8.8 | Lect11.pdf |
| 13 | 11-05 | Support Vector Machines I [kernel-induced feature space] | Tutorial paper | lecture12-15.pdf |
| 14 | 11-10 | Support Vector Machines II [support vector classifier; explanation of Project III] | | |
| 15 | 11-12 | Support Vector Machines III [loss functions, latent SVM, neural networks and DeepNet] | Ch 5.11 | |
| 16 | 11-17 | Parametric Learning [maximum likelihood estimation (MLE), sufficient statistics and maximum entropy] | Ch 3.1-3.6 | Lect16.pdf |
| 17 | 11-19 | Non-parametric Learning I [Parzen window and k-NN classifier] | Ch 4.1-4.5 | Lect17.pdf |
| 18 | 11-24 | Non-parametric Learning II [k-NN classifier and error analysis] | Ch 4.6, handout | Lect18.pdf |
| | 11-26 | Thanksgiving Holiday | | |
| 19 | 12-01 | Topics on Deep Learning | Tutorial | |
| 20 | 12-03 | Advanced Topics on Lifelong Machine Learning | | |