**ECE 566: COMPUTATIONAL INFERENCE AND LEARNING, FALL 2023**

**Course Description:** Computational inference and machine learning have seen a surge of interest in the last 20 years, motivated by applications as diverse as computer vision, speech recognition, analysis of networks and distributed systems, big-data analytics, large-scale computer simulations, and indexing and searching of very large databases. This new course will introduce the mathematical and computational methods that enable such applications. Topics include computational methods for statistical inference, information theory, sparsity analysis, approximate inference and search, and fast optimization.

The course will complement ECE561 (Statistical Inference for Engineers and Data Scientists), ECE544NA (Pattern Recognition and Machine Learning), and ECE543 (Statistical Learning Theory), which introduce core theory for statistical inference and machine learning but do not focus on computational methods. Teaching materials include notes from the instructor and articles from scientific journals.

**Prerequisites:** ECE490 and ECE534. **Class time and place:** 2:00-3:30 PM TR, Room 3020, ECE Building.

**Instructor:** Prof. Pierre Moulin. Room 310 CSL. Email: pmoulin at illinois dot edu

Office Hours: 10:00-11:30 AM Wednesdays. Room 310 CSL.

**TA:** Aditya Deshmukh. Room 312 CSL. Email: ad11 at illinois dot edu

Office Hours: 2:00 PM-3:00 PM Mondays. Room 312 CSL.

**Grading:** Homework (20%), quiz (10%), midterm exam (30%), and final project (40%).

**Homework submissions:** Gradescope. Entry code: 7GYP84

**Project:** List of Topics

**ECE561 book by P. Moulin and V. Veeravalli:**

Chapters 1 and 2: Introduction, Hypothesis Testing

**ECE566 Fall 2017 Notes:**

Notes

**ECE566 Notes (Part 2):**

Notes

- Homework-1. Due on Thursday, Sept 7, 2023 on Gradescope. Solutions.
- Homework-2. Due on Thursday, Sept 28, 2023 on Gradescope.

- Quiz on Tuesday, Sept 26, 2023, in class. The quiz will cover probability, optimization, and the material covered in HW1.

- **Lecture 1:** Introduction, review of optimization concepts.
- **Lecture 2:** Bayesian inference; maximum likelihood principle; Maximum A Posteriori (MAP) estimation; Minimum Mean Squared Error (MMSE) estimation.
- **Lecture 3:** Empirical Risk Minimization.
- **Lectures 4, 5:** Stochastic approximation and Stochastic Gradient Descent.
- **Lecture 6:** Statistical performance analysis via Monte Carlo methods and importance sampling.
- **Lecture 7:** Bootstrap.
- **Lecture 8:** Bayesian recursive estimation using particle filtering.
- **Lectures 9-11:** Parameter estimation via the Expectation-Maximization (EM) algorithm.
- **Lectures 12, 13:** Hidden Markov Models, the Viterbi algorithm, Baum-Welch learning.
- **Lectures 14, 15:** Linear dynamical systems, the RTS smoother.
- **Lectures 16-18:** Graphical models.
- **Lectures 19-22:** Variational inference, mean-field techniques.
- **Lectures 23-25:** L1-penalized least-squares minimization.
- **Lectures 26-28:** Reconstruction of sparse signals using compressive sensing.
- **Lecture 29:** Dimensionality reduction using random projections; hashing.