General information
Description: This course provides an accessible introduction to machine learning aimed at advanced undergraduate and graduate students in statistics, computer science, electrical engineering, or related disciplines. Topics covered include Bayes decision theory, parameter estimation, regression, PCA, K-means, SVMs, and hidden Markov models. The emphasis is on learning the high-level concepts behind machine learning algorithms and gaining practical experience applying them to real data and problems.
Lectures: Mon, Wed, 4-5:15pm, Kinsey Pavilion 1220B (note new classroom)
Instructor and Reader:
- Instructor: Alyson Fletcher, akfletcher@ucla.edu
- Instructor office hours: Tuesdays 11-1, Room 8100 Math Sci Building (Stat lounge)
- Special reader: Chengcheng Yu, Boelter Hall 9401, chengchengyu@ucla.edu
- Reader office hours: Mondays 12-2, Wednesdays 11-12, Boelter Hall 9401
Textbook:
Alpaydin, Introduction to Machine Learning, 3rd edition.
Available as an ebook
Supplementary texts:
- Hastie, Tibshirani, Friedman. The Elements of Statistical Learning.
- Bishop. Pattern Recognition and Machine Learning.
- Murphy. Machine Learning: A Probabilistic Perspective.
- Shalev-Shwartz and Ben-David. Understanding Machine Learning: From Theory to Algorithms.
Announcements
- Beginning Monday, April 4, class will be in Kinsey Pavilion 1220B.
- The text Introduction to Machine Learning (Third Edition) by Alpaydin is available as an ebook (unlimited simultaneous users).
- Professor Fletcher's office hours have changed: they are now Tuesdays 11-1, Room 8100 Math Sci (Stat lounge).
- Other announcements will be posted on CCLE.
Syllabus
- Hypothesis testing, Bayes decision theory, receiver operating characteristic (ROC) curves
- Learning parametric distributions, bias-variance tradeoff, curse of dimensionality, exponential models, sufficient statistics
- Non-parametric estimation
- Linear regression, model selection, validation
- Perceptron, linear classifiers, support vector machines
- Principal component analysis (PCA) and dimensionality reduction
- Clustering, K-means and expectation maximization
- Hidden Markov models
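To give a flavor of the hands-on component, here is a minimal sketch of one syllabus topic, K-means clustering, written from scratch in NumPy. This is an illustrative example only, not course material; the data, function name, and initialization scheme are all assumptions made for the sketch.

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Plain K-means: alternate assignment and centroid-update steps."""
    rng = np.random.default_rng(seed)
    # Initialize centroids as k distinct data points chosen at random
    # (each initial cluster is then non-empty on the first assignment).
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: label each point with its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points.
        new_centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centroids, centroids):
            break  # converged: assignments can no longer change
        centroids = new_centroids
    return centroids, labels

# Toy data (an assumption for the sketch): two well-separated Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])
centroids, labels = kmeans(X, k=2)
```

On data this well separated, the two recovered centroids land near the blob means; the expectation-maximization view of this alternation is covered later in the syllabus.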
Prerequisites
Upper division probability and statistics, and linear algebra. Familiarity with, or willingness to learn, Python or MATLAB.
Grading
4-6 homework assignments: 40%; pop quizzes and midterm: 30%; final: 30%.
Resources
Homework:
Solutions may be submitted electronically on the CCLE website.
Lecture Notes:
- Lecture 1 [Slides] [Notes]. Reading: Chapter 1
- Lecture 2 [Slides] [Notes]. Readings: Chapter 2 and Sections 3.1, 3.2
- Lecture 3: Readings: Sections 3.3, 3.4 and 4.1-4.8
Background Resources: