ECE 563 - Information Theory (Fall 2025)
Lecturer: Olgica Milenkovic (Office hours: Thursday 2:30-3:30pm, 311 CSL or by appointment as needed)
Teaching Assistant: Peizhi Niu (Office hour: Tuesday 3:00-4:00pm, Electrical and Computer Engineering Building ECEB 3036; peizhin2@illinois.edu)
Lectures: Tuesday and Thursday, 12:30-13:50, Electrical and Computer Engineering Building ECEB 3081

Course Objectives:
Catalog Description
Mathematical models for channels and sources; entropy, information, data compression, channel capacity, Shannon's theorems, rate-distortion theory. Information theory and statistics, learning.
Prerequisites: Solid background in probability (ECE 534, MATH 464, or MATH 564).
Required textbook: T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed., Wiley, 2006.
Recommended textbook: Y. Polyanskiy and Y. Wu, Information Theory: From Coding to Learning, Cambridge University Press, 2025.
Grading: Homework will be assigned and solutions provided, but not graded. Midterm I exam: October 7th, 6:30-8:00 pm, Electrical and Computer Engineering Building ECEB 2015 (25%), CLOSED NOTES. Midterm II exam: November 13th, 6:30-8:00 pm, Electrical and Computer Engineering Building ECEB 2013 (25%), CLOSED NOTES. Final exam: Dec. XXX, 2025 (50%).
Homework Submission: Please submit your homework to the designated Box folder corresponding to the assignment: Box Link. Each submission should be named using the format NetID_HW# (for example: peizhin2_HW1). Please upload only a single PDF file per assignment. Make sure to upload your file before the deadline.
Homework, Fall 2025
Additional Instructional Material
Entropy in Physics (Video, TEDed)
Operational Characterization of Entropy (Video, Khan Academy)
Lecture subjects, Fall 2025
1. Tuesday, August 26th: Introduction, Syllabus Overview, How to measure information in physics, engineering, communication theory.
The part of the first lecture pertaining to entropy in physics and how it inspired Shannon's entropy can be found at Notes 1. The axiomatic derivation of Shannon's entropy from class is based on R. Ash, Information Theory, pp. 5-12. More on axiomatic approaches can be found here Entropy axioms
2. Thursday, August 28th: Axioms of Shannon entropy, derivation of the Shannon entropy function through an axiomatic approach, generalized means, Renyi entropy
A summary of the notes pertaining to generalized means and Renyi entropy can be found at Notes 2.
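For readers who like to check definitions numerically, here is a minimal Python sketch (function names are illustrative, not from the course materials) of Shannon entropy and Rényi entropy, showing that the Rényi entropy approaches the Shannon entropy as the order α approaches 1.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits (0 log 0 := 0)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) = (1/(1-alpha)) log2 sum_i p_i^alpha, alpha != 1."""
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p))        # 1.5 bits
print(renyi_entropy(p, 0.999))   # close to 1.5: H_alpha -> H as alpha -> 1
print(renyi_entropy(p, 2.0))     # collision entropy, -log2(sum p_i^2)
```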
3. Tuesday, September 2nd: Properties of entropy, Jensen's inequality, Conditional entropy, Joint entropy, Conditioning reduces entropy, Submodularity
A summary of the notes pertaining to properties of entropy, Jensen's inequality, conditional and joint entropy, and submodularity can be found at Notes 3.
4. Thursday, September 4th: Submodularity and Han's inequality, Kullback-Leibler divergence, Mutual information, Bregman divergence
A summary of the notes pertaining to Lovasz's extension, entropy and submodularity, KL divergence and MI can be found here Notes 4.
5. Tuesday, September 9th: Hamming codes, Data processing inequality, Log-sum inequality, Fano's inequality
A summary of the notes pertaining to Hamming codes, Data processing inequality, Log-sum inequality, Fano's inequality can be found here Notes 5.
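As a concrete companion to the Hamming-code portion of the lecture, here is a sketch of the classical (7,4) Hamming code (parity bits at positions 1, 2, and 4; the syndrome equals the error position). This is a minimal illustration, not code from the course notes.

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit Hamming codeword.
    Positions are 1..7; parity bit at position 2^k covers the positions
    whose index has binary digit k set."""
    c = [0] * 8  # index 0 unused, for 1-based positions
    c[3], c[5], c[6], c[7] = d
    c[1] = c[3] ^ c[5] ^ c[7]
    c[2] = c[3] ^ c[6] ^ c[7]
    c[4] = c[5] ^ c[6] ^ c[7]
    return c[1:]

def hamming74_decode(r):
    """Correct at most one bit error; the syndrome is the error position."""
    c = [0] + list(r)
    s = (4 * (c[4] ^ c[5] ^ c[6] ^ c[7])
         + 2 * (c[2] ^ c[3] ^ c[6] ^ c[7])
         + (c[1] ^ c[3] ^ c[5] ^ c[7]))
    if s:
        c[s] ^= 1  # flip the erroneous bit
    return [c[3], c[5], c[6], c[7]]

cw = hamming74_encode([1, 0, 1, 1])
cw[2] ^= 1                        # inject a single bit error
print(hamming74_decode(cw))       # recovers [1, 0, 1, 1]
```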
6. Thursday, September 11th: Convergence of sequences of random variables, Law of large numbers, Typical sequences. The material covered closely followed the textbook by Cover+Thomas, Chapter 3.
7. Tuesday, September 16th: Lecture was cancelled due to Allerton conference.
8. Thursday, September 18th: The Asymptotic Equipartition Property (AEP) and Compression of sequences. The material covered closely followed the textbook by Cover+Thomas, Chapter 3.
9. Tuesday, September 23rd: Data compression - uniquely decodable codes, prefix codes, Kraft's inequality (proofs for both finite and countably infinite alphabets). The material covered closely followed the textbook by Cover+Thomas, Chapter 5. We skipped Chapter 4 (Entropy rates) but will cover it during the discussion of compression algorithms.
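Kraft's inequality is easy to check numerically; the small sketch below (my own helper, not from the text) evaluates the Kraft sum for a proposed set of codeword lengths, which tells you whether a prefix code with those lengths can exist.

```python
def kraft_sum(lengths, D=2):
    """Kraft sum sum_i D^{-l_i} for codeword lengths over a D-ary alphabet.
    A prefix code with these lengths exists iff the sum is <= 1; by the
    McMillan theorem the same holds for any uniquely decodable code."""
    return sum(D ** (-l) for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0 -> a complete binary prefix code exists
print(kraft_sum([1, 1, 2]))     # 1.25 -> no uniquely decodable code
```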
10. Thursday, September 25th: Data compression - Shannon codes, Huffman codes, optimality of Huffman codes
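The Huffman construction from this lecture can be sketched in a few lines with a priority queue; the implementation below is a minimal illustration (the tie-breaking counter is just a device to keep heap entries comparable, not part of the algorithm).

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Binary Huffman code for probs = {symbol: probability}.
    Repeatedly merge the two least-probable subtrees, prepending a bit
    to every codeword in each merged subtree."""
    tiebreak = count()  # keeps heap tuples comparable on probability ties
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(avg_len)  # 1.75 bits, equal to H(p) since the distribution is dyadic
```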
11. Tuesday, September 30th: Data compression - Entropy rates and compression of stationary sources (Chapter 4 of Cover and Thomas).
12. Thursday, October 2nd: Extended Huffman codes, Tunstall codes, Adaptive Huffman codes. Notes can be found here Notes 6 and Adaptive Huffman coding.
13. Tuesday, October 7th: Examples of channels, information channel capacity, symmetric channels. The material closely follows the text.
14. Thursday, October 9th: Joint typicality, Shannon's second theorem (channel capacity theorem) - achievability. The material closely follows the text.
15. Tuesday, October 14th: Recap of Fano's inequality, Shannon's second theorem (channel capacity theorem) - converse. Feedback capacity and source-channel coding separation theorem are briefly discussed but delegated to the HW. All notes are following Cover and Thomas.
16. Thursday, October 16th: Differential entropy.
17. Tuesday, October 21st: Differential entropy, Additive Gaussian noise channels and their capacity. Parallel Gaussian channels and waterfilling arguments.
18. Thursday, October 23rd: Parallel Gaussian channels and waterfilling arguments. Arimoto-Blahut's algorithm.
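The waterfilling solution for parallel Gaussian channels can be computed by a one-dimensional search for the water level; below is a minimal sketch using bisection (function and variable names are my own).

```python
def waterfill(noise, P, tol=1e-12):
    """Power allocation over parallel Gaussian channels with noise levels
    N_i and total power P: P_i = max(mu - N_i, 0), where the water level
    mu is chosen so that sum_i P_i = P.  Found by bisection, since the
    total allocated power is nondecreasing in mu."""
    lo, hi = min(noise), max(noise) + P
    while hi - lo > tol:
        mu = (lo + hi) / 2
        if sum(max(mu - n, 0.0) for n in noise) > P:
            hi = mu
        else:
            lo = mu
    mu = (lo + hi) / 2
    return [max(mu - n, 0.0) for n in noise]

powers = waterfill([1.0, 2.0, 4.0], P=3.0)
print(powers)  # water level mu = 3: channel 3 is too noisy and gets no power
# per-channel capacity is then 0.5 * log2(1 + P_i / N_i)
```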
19. Tuesday, October 28th: Arimoto-Blahut's algorithm continued. MSE distortion, scalar quantization, optimal uniform scalar quantizers. Notes can be found here Quantization - Gray and Quantization - Gray and Neuhoff and Quantization - Class.
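For the Arimoto-Blahut lectures, a compact Python sketch of the alternating-maximization iteration is given below (a minimal illustration of the method, not code from the course materials); on a symmetric channel it converges to the known closed-form capacity.

```python
import math

def blahut_arimoto(W, iters=200):
    """Capacity of a discrete memoryless channel with transition matrix
    W[x][y] = P(y|x), via the Blahut-Arimoto iteration.  Returns
    (capacity in bits, capacity-achieving input distribution)."""
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx  # start from the uniform input distribution
    for _ in range(iters):
        # output distribution induced by the current input distribution
        q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        # d[x] = D(W(.|x) || q), in nats
        d = [sum(W[x][y] * math.log(W[x][y] / q[y])
                 for y in range(ny) if W[x][y] > 0)
             for x in range(nx)]
        # multiplicative update, then renormalize
        w = [p[x] * math.exp(d[x]) for x in range(nx)]
        Z = sum(w)
        p = [wx / Z for wx in w]
    cap_nats = sum(p[x] * d[x] for x in range(nx))
    return cap_nats / math.log(2), p

# BSC with crossover 0.1: capacity is 1 - H2(0.1) ~ 0.531 bits/use
C, p_opt = blahut_arimoto([[0.9, 0.1], [0.1, 0.9]])
print(C, p_opt)
```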
20. Thursday, October 30th: Nonuniform scalar quantization and Bennett's integral.
21. Tuesday, November 4th: Nonuniform scalar quantization and Bennett's integral, vector quantization.
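The nearest-neighbor/centroid alternation behind MSE-optimal quantizer design can be sketched in one dimension as follows (a minimal Lloyd-iteration illustration on empirical data; names and the quantile initialization are my own choices).

```python
def lloyd_1d(samples, k, iters=50):
    """One-dimensional Lloyd iteration for a k-level MSE quantizer:
    alternate the nearest-neighbor partition of the samples and the
    centroid (conditional mean) update of the representation points."""
    samples = sorted(samples)
    # initialize the codebook at k evenly spaced empirical quantiles
    reps = [samples[(2 * i + 1) * len(samples) // (2 * k)] for i in range(k)]
    for _ in range(iters):
        cells = [[] for _ in range(k)]
        for x in samples:
            j = min(range(k), key=lambda i: (x - reps[i]) ** 2)
            cells[j].append(x)
        # centroid step; keep the old point if a cell is empty
        reps = [sum(c) / len(c) if c else r for c, r in zip(cells, reps)]
    return reps

reps = lloyd_1d([0.0, 0.1, 0.2, 10.0, 10.1, 10.2], k=2)
print(reps)  # one representation point per cluster
```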
22. Thursday, November 6th: Rate-distortion theory. Metric entropy (coverings and packings, volume bounds, etc.), information projection, and large deviations.
23. Tuesday, November 11th: Basics of statistical decision theory
24. Thursday, November 13th: Large-sample asymptotics
25. Tuesday, November 18th: Mutual information method
26. Thursday, November 20th: Entropic bounds for statistical estimation
27. Tuesday, November 25th: Thanksgiving break
28. Thursday, November 27th: Thanksgiving break
29. Tuesday, December 2nd: Fisher information
30. Thursday, December 4th: Strong data processing inequality