ECE 563 - Information Theory (Fall 2025)

Lecturer: Olgica Milenkovic (Office hours: Thursday 2:30-3:30pm, 311 CSL or by appointment as needed)

Teaching Assistant: Peizhi Niu (Office hour: Tuesday 3:00-4:00pm, Electrical and Computer Engineering Building ECEB 3036; peizhin2@illinois.edu)

Lectures: Tuesday and Thursday, 12:30-13:50, Electrical and Computer Engineering Building ECEB 3081

Course Objectives:

Catalog Description

Mathematical models for channels and sources; entropy, information, data compression, channel capacity, Shannon's theorems, rate-distortion theory. Information theory and statistics, learning.

Prerequisites: Solid background in probability (ECE 534, MATH 464, or MATH 564).

Required textbook: T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed., Wiley, 2006.

Recommended textbook: Y. Polyanskiy and Y. Wu, Information Theory: From Coding to Learning, Cambridge University Press, 2025.

Grading: Homework will be assigned and solutions provided, but it will not be graded. Midterm I exam: September 30th (tentative), CLOSED NOTES (25%). Midterm II exam: November 6th (tentative), CLOSED NOTES (25%). Final exam: Dec. XXX, 2025 (50%).

Syllabus


Homework, Fall 2025


Additional Instructional Material

Entropy in Physics (Video, TED-Ed)

Operational Characterization of Entropy (Video, Khan Academy)  


Lecture subjects, Fall 2025

1. Tuesday, August 26th: Introduction, Syllabus Overview, How to measure information in physics, engineering, communication theory.

The part of the first lecture pertaining to entropy in physics and how it inspired Shannon's entropy can be found at Notes 1. The axiomatic derivation of Shannon's entropy from class is based on R. Ash, Information Theory, pp. 5-12. More on axiomatic approaches can be found here: Entropy axioms
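
Before the axiomatics, it may help to see the quantity itself; below is a minimal Python sketch (an illustration added here, not part of the course notes) that evaluates the Shannon entropy of a finite probability vector:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 contribute nothing, by the convention 0 log 0 = 0.
    """
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.469 bits
```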

2. Thursday, August 28th: Axioms of Shannon entropy, derivation of the Shannon entropy function through an axiomatic approach, generalized means, Renyi entropy

A summary of the notes pertaining to generalized means and Renyi entropy can be found at Notes 2.
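
As a companion to the notes, here is a minimal sketch of the Renyi entropy of order alpha (the function renyi_entropy and the example distribution are mine, chosen for illustration); values of alpha near 1 recover the Shannon entropy:

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) = log2(sum_i p_i^alpha) / (1 - alpha), in bits."""
    if alpha == 1:
        # The alpha -> 1 limit is the Shannon entropy.
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)
    return math.log2(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
for a in (0.5, 0.999, 1, 1.001, 2):
    print(a, renyi_entropy(p, a))  # values bracket H(p) = 1.5 near alpha = 1
```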

3. Tuesday, September 2nd: Properties of entropy, Jensen's inequality, Conditional entropy, Joint entropy, Conditioning reduces entropy, Submodularity

A summary of the notes pertaining to properties of entropy, conditioning, and submodularity can be found at Notes 3.

4. Thursday, September 4th: Submodularity and Han's inequality, Kullback-Leibler divergence, Bregman divergence

5. Tuesday, September 9th: Mutual information, Data processing inequality, Log-sum inequality, Fano's inequality

6. Thursday, September 11th: Extremization of mutual information and the Blahut-Arimoto algorithm
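
Since the Blahut-Arimoto algorithm is short enough to state in code, here is a minimal NumPy sketch (an illustrative implementation written for this page, not the course's reference code) that estimates the capacity of a discrete memoryless channel from its transition matrix:

```python
import numpy as np

def _kl_rows(W, q):
    """d[x] = D(W(.|x) || q) in bits, with the 0 log 0 = 0 convention."""
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(W > 0, W * np.log2(W / q), 0.0).sum(axis=1)

def blahut_arimoto(W, iters=200):
    """Estimate C = max_p I(X;Y) for a DMC with W[x, y] = P(Y=y | X=x)."""
    p = np.full(W.shape[0], 1.0 / W.shape[0])  # start from the uniform input
    for _ in range(iters):
        d = _kl_rows(W, p @ W)   # divergence of each row from the output law
        p = p * np.exp2(d)       # multiplicative (exponentiated) update
        p /= p.sum()
    return float(p @ _kl_rows(W, p @ W))  # I(X;Y) = sum_x p(x) D(W(.|x)||q)

# Binary symmetric channel, crossover 0.1: C = 1 - H(0.1), about 0.531 bits
W = np.array([[0.9, 0.1], [0.1, 0.9]])
print(blahut_arimoto(W))
```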

7. Tuesday, September 16th: Typical sequences and the asymptotic equipartition property, compressing typical sequences
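
A quick numerical illustration of the AEP (a sketch with made-up parameters, not course material): for i.i.d. Bernoulli(p) strings, the normalized log-probability -(1/n) log2 p(X^n) concentrates around H(p):

```python
import math, random

p, n = 0.3, 10_000
h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
print(f"H(p) = {h:.4f} bits")

for _ in range(5):
    ones = sum(random.random() < p for _ in range(n))   # draw X^n i.i.d.
    log_prob = ones * math.log2(p) + (n - ones) * math.log2(1 - p)
    print(f"-(1/n) log2 p(X^n) = {-log_prob / n:.4f}")  # close to H(p)
```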

8. Thursday, September 18th: Data compression - uniquely decodable codes, prefix codes, Kraft's inequality for prefix and uniquely decodable code, bounds on optimal code-length
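
A one-function sketch of Kraft's inequality (illustrative only): codeword lengths l_i over a D-ary alphabet are realizable by a prefix code if and only if sum_i D^(-l_i) <= 1, and by McMillan's theorem the same condition governs uniquely decodable codes:

```python
def kraft_sum(lengths, D=2):
    """Kraft sum: a D-ary prefix code with these codeword lengths exists
    if and only if the sum is at most 1."""
    return sum(D ** (-l) for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0  -> realizable, e.g. 0, 10, 110, 111
print(kraft_sum([1, 1, 2]))     # 1.25 -> no binary prefix code exists
```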

9. Tuesday, September 23rd: Data compression - Shannon codes, Huffman codes, optimality of Huffman codes
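
A compact Huffman construction in Python (a sketch of the standard heap-based merge; the name huffman_code and the example probabilities are mine):

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Build a binary Huffman code for a dict {symbol: probability}.

    Repeatedly merges the two least probable subtrees; the tie-breaking
    counter keeps heap entries comparable when probabilities are equal.
    """
    tie = count()
    heap = [(p, next(tie), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tie), merged))
    return heap[0][2]

probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(code, f"average length = {avg_len:.2f} bits")  # 1.90 vs H(p) = 1.85
```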

10. Thursday, September 25th: Data compression - Extended Huffman codes, Entropy rates and compression of stationary sources

11. Tuesday, September 30th: Data compression - Adaptive Huffman codes, Tunstall codes, Asymmetric numeral systems

12. Thursday, October 2nd: Compressing sources with large alphabets, patterns and Good-Turing estimators

13. Tuesday, October 7th: Examples of channels, information channel capacity, symmetric channels

14. Thursday, October 9th: Joint typicality, Shannon's second theorem (channel capacity theorem) - achievability

15. Tuesday, October 14th: Recap of Fano's inequality, Shannon's second theorem (channel capacity theorem) - converse

16. Thursday, October 16th: Feedback capacity, source-channel coding separation theorem

17. Tuesday, October 21st: Differential entropy

18. Thursday, October 23rd: Additive Gaussian noise channels and their capacity, parallel Gaussian channels and waterfilling arguments
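
The waterfilling allocation is easy to compute numerically; below is a minimal sketch (illustrative, with a bisection search for the water level nu) for parallel Gaussian channels with noise levels N_i and total power P, where P_i = max(nu - N_i, 0):

```python
import numpy as np

def waterfill(noise, P):
    """Waterfilling power allocation and the resulting total capacity
    (in bits per channel use) for parallel Gaussian channels."""
    noise = np.asarray(noise, dtype=float)
    lo, hi = noise.min(), noise.max() + P
    for _ in range(100):                       # bisect on the water level nu
        nu = 0.5 * (lo + hi)
        if np.maximum(nu - noise, 0.0).sum() > P:
            hi = nu
        else:
            lo = nu
    powers = np.maximum(nu - noise, 0.0)
    capacity = 0.5 * np.log2(1.0 + powers / noise).sum()
    return powers, capacity

powers, C = waterfill([1.0, 2.0, 4.0], P=5.0)
print(powers, C)   # quieter channels get more power: [3, 2, 0], C = 1.5
```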

19. Tuesday, October 28th: MSE distortion, scalar quantization, optimal uniform scalar quantizers

20. Thursday, October 30th: Nonuniform scalar quantization and Bennett's integral, Rate-distortion theory

21. Tuesday, November 4th: Metric entropy (coverings and packings, volume bound, etc.)

22. Thursday, November 6th: Information projection and large deviations

23. Tuesday, November 11th: Basics of statistical decision theory

24. Thursday, November 13th: Large-sample asymptotics

25. Tuesday, November 18th: Mutual information method

26. Thursday, November 20th: Entropic bounds for statistical estimation

27. Tuesday, November 25th: Thanksgiving break

28. Thursday, November 27th: Thanksgiving break

29. Tuesday, December 2nd: Fisher information

30. Thursday, December 4th: Strong data processing inequality