CS 446/ECE 449: Machine Learning
Spring 2025
- Course Description:
The goal of machine learning is to develop algorithms and models that enable computers to learn from
data and make predictions or decisions without being explicitly programmed for a particular task. In this
course, we will cover the common algorithms and models encountered in both traditional machine learning
and modern deep learning, including those in unsupervised learning, supervised learning, generative models, and
reinforcement learning. The algorithms that we will cover include k-means, Gaussian mixture models,
expectation maximization, decision trees, Naive Bayes, linear regression, logistic regression, support vector
machines, kernel methods, boosting, learning theory, common deep architectures, GANs and diffusion
models, and basic reinforcement learning algorithms such as Q-learning and policy gradient.
- Course Credit:
- Undergraduate: 3
- Graduate: 3 or 4
- Prerequisites:
- Linear algebra (and calculus)
- Probability
- Basic statistics
- Programming in Python (NumPy and PyTorch)
- Class time:
Tues, Thu, 9:30am–10:45am, SC 1404
- Instructor:
Prof. Tong Zhang (tozhang@illinois.edu)
- Office: SC 2118
- Office Hour: Thu, 10:50AM – 11:50AM
- Teaching Assistants:
- Jane Du
Email: zd16@illinois.edu
Office: Lounge outside SC 3102
Office Hour: Monday, 10AM – 11AM
- Yao Xiao
Email: yaox11@illinois.edu
Office: Siebel Center 3307
Office Hour: Monday, 4PM – 5PM
- Yangyi Chen
Email: yangyic3@illinois.edu
Office: Lounge outside SC 3102
Office Hour: Friday, 4PM – 5PM
- Course resources:
- Grading:
- 3-credit: best five out of six homeworks (60%) + midterm 1 (20%) + midterm 2 (20%) (illustrated in the sketch below)
- 4-credit: six homeworks (60%) + midterm 1 (20%) + midterm 2 (20%)
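As an illustration only (not official grading policy), here is a minimal Python sketch of the 3-credit weighting, using hypothetical homework and midterm scores assumed to be on a 0–100 scale:

    # Weighted 3-credit score: best five of six homeworks (60%) + two midterms (20% each).
    # All scores below are hypothetical examples on an assumed 0-100 scale.
    def course_grade_3credit(hw_scores, midterm1, midterm2):
        assert len(hw_scores) == 6, "expects six homework scores"
        best_five = sorted(hw_scores, reverse=True)[:5]   # drop the lowest homework
        hw_avg = sum(best_five) / 5.0                      # average of the best five
        return 0.6 * hw_avg + 0.2 * midterm1 + 0.2 * midterm2

    # Example with made-up scores:
    print(course_grade_3credit([90, 80, 100, 70, 95, 85], midterm1=88, midterm2=92))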
- Course Material:
- Lecture slides (distributed before each lecture)
- Reference books:
Tom Mitchell. Machine Learning. McGraw-Hill, 1997. URL https://www.cs.cmu.edu/~tom/mlbook.html
Ethem Alpaydin. Introduction to Machine Learning. MIT Press, 2020. URL https://mitpress.mit.edu/9780262043793/introduction-to-machine-learning/
Shai Shalev-Shwartz and Shai Ben-David. Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press, 2014. URL https://www.cs.huji.ac.il/~shais/UnderstandingMachineLearning/copy.html
John Shawe-Taylor and Nello Cristianini. Kernel Methods for Pattern Analysis. Cambridge University Press, 2004. URL https://www.cambridge.org/core/books/kernel-methods-for-pattern-analysis/811462F4D6CD6A536A05127319A8935A
Bernhard Schölkopf and Alexander J. Smola. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, 2018. URL https://direct.mit.edu/books/monograph/1821/Learning-with-KernelsSupport-Vector-Machines
Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning. MIT Press, 2016. URL https://www.deeplearningbook.org
Christopher M. Bishop and Hugh Bishop. Deep Learning: Foundations and Concepts. Springer Nature, 2023. URL https://www.bishopbook.com
Kevin P. Murphy. Probabilistic Machine Learning: Advanced Topics. MIT Press, 2023. URL http://probml.github.io/book2
Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani, and Jonathan Taylor. An Introduction to Statistical Learning: With Applications in Python. Springer Nature, 2023. URL https://www.statlearning.com
- Topics
- Introduction (one lecture)
- Unsupervised learning (three lectures on clustering and dimension reduction)
- Supervised learning (eight lectures on linear models, decision trees, boosting, kernel methods, model
selection and combination)
- Learning theory (one lecture)
- Midterm 1 (covers material up to Midterm 1)
- Deep learning (six lectures on models and training)
- Generative models (three lectures)
- Sequential decision making (four lectures on online learning, bandits, and RL)
- Midterm 2 (covers material after Midterm 1)