ECE 563 - Information Theory (Fall 2024)

Lecturer: Olgica Milenkovic (Office hours: Thursday 2:30-3:30pm, 311 CSL or by appointment as needed)

Teaching Assistant: Kanad Sarkar (Office hour: Wednesdays 2pm, ECEB 3036; kanads2@illinois.edu)

Lectures: Tuesday and Thursday, 12:30pm, Electrical and Computer Engineering Building ECEB 3081

Problem Solving Sessions: The sessions will start in September. Attendance is optional. Time/Location: Tuesdays 2:30 - 4:00 pm (every other week) @ ECEB 4070

Course Objectives:

Catalog Description

Mathematical models for channels and sources; entropy, information, data compression, channel capacity, Shannon's theorems, and rate-distortion theory.

Prerequisites: Solid background in probability (ECE 534, MATH 464, or MATH 564).

Textbook: T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed., Wiley, 2006.

Grading:
Homework (25%)
Midterm I exam (25%): 6:00pm-8:00pm, October 24th, 2024, ECEB 3017 (closed notes)
Final exam (25%): Dec. XXX, 2024
Group project/paper presentations (25%)

Project Presentations:  December 5th, 2024 starting at noon in CSL 141. Presentation Schedule

Syllabus


Homework, Fall 2024


Recitations, Fall 2024

 Project Slides, Fall 2024


Research Project Topics

Asymmetric Numeral Systems

Genomic Data Compression Review paper  

The Information Bottleneck Problem Research paper  

Lovász number of a graph Lecture notes  

Quantized Deep Learning Blog  

Capacity of DNA-Storage Channels Research paper  

Polar codes Research paper  

Network coding Text  

Quantum information theory Introduction  

Rényi entropy NeurIPS paper  

Deletion error-correction and capacity of deletion channels Review paper by Sloane  

Channel dispersion: finite block-length regime Y. Polyanskiy et al., see also the prior work by Strassen.  
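
For orientation, the headline result of this line of work (the normal approximation of Polyanskiy, Poor, and Verdú, refining Strassen's expansion) can be stated compactly:

```latex
% Maximal code size M*(n, eps) at blocklength n and error probability eps:
\log M^*(n,\varepsilon) \;=\; nC \;-\; \sqrt{nV}\,Q^{-1}(\varepsilon) \;+\; O(\log n)
% where C is the capacity, V is the channel dispersion, and Q^{-1} is the
% inverse of the Gaussian complementary CDF.
```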


Additional Instructional Material

Entropy in Physics (Video, TEDed)  

Operational Characterization of Entropy (Video, Khan Academy)  

The first lecture on the axiomatic derivation of Shannon's entropy is based on R. Ash, Information Theory, pp. 5-12. More on axiomatic approaches can be found here: Entropy axioms  
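
For quick reference, here is one standard axiom set and the uniqueness theorem it yields (the bookkeeping in Ash differs slightly, but the statement is equivalent):

```latex
% (A1) Continuity: H(p_1, ..., p_n) is continuous in its arguments.
% (A2) Monotonicity: A(n) = H(1/n, ..., 1/n) is strictly increasing in n.
% (A3) Grouping:
H(p_1,\dots,p_n) \;=\; H(p_1+p_2,\,p_3,\dots,p_n)
  \;+\; (p_1+p_2)\,H\!\Big(\tfrac{p_1}{p_1+p_2},\,\tfrac{p_2}{p_1+p_2}\Big)
% The unique function satisfying (A1)-(A3) is, for some constant C > 0
% (equivalently, a choice of logarithm base),
H(p_1,\dots,p_n) \;=\; -\,C \sum_{i=1}^{n} p_i \log p_i
```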


Lecture subjects, Fall 2022 -- the Fall 2024 offering will follow a similar outline, with the second half of the topics updated

1. Tuesday, August 23rd: Introduction, Syllabus Overview, How to measure information in physics, engineering, communication theory and biology

2. Thursday, August 25th: Axioms of Shannon entropy, derivation of the Shannon entropy function through an axiomatic approach, properties of Shannon entropy

3. Tuesday, August 30th: Conditional entropy, joint entropy, conditioning reduces entropy, Kullback-Leibler divergence, mutual information
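
A minimal NumPy sketch of these quantities for a finite alphabet; the joint pmf below is an arbitrary illustrative choice:

```python
import numpy as np

p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])   # hypothetical joint pmf p(x, y), rows indexed by x

p_x = p_xy.sum(axis=1)          # marginal of X
p_y = p_xy.sum(axis=0)          # marginal of Y

def H(p):
    """Shannon entropy in bits; 0 log 0 is taken as 0."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

H_X, H_Y, H_XY = H(p_x), H(p_y), H(p_xy.ravel())
H_X_given_Y = H_XY - H_Y                    # chain rule: H(X|Y) = H(X,Y) - H(Y)
I_XY = H_X + H_Y - H_XY                     # mutual information
kl = np.sum(p_xy * np.log2(p_xy / np.outer(p_x, p_y)))  # D(p_xy || p_x p_y) = I(X;Y)

print(f"H(X)={H_X:.3f}  H(X|Y)={H_X_given_Y:.3f}  I(X;Y)={I_XY:.3f}  KL check={kl:.3f}")
```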

4. Thursday, September 1st: Problem solving session

5. Tuesday, September 6th: Data processing inequality, Jensen's inequality, the Log-sum inequality, Fano's inequality

6. Thursday, September 8th: Typical sequences and the asymptotic equipartition property, compressing typical sequences
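
A quick empirical illustration of the AEP: for an i.i.d. Bernoulli(p) source, the normalized log-probability of a long random sequence concentrates around the entropy H(p). (The parameter choices are arbitrary.)

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, trials = 0.3, 10_000, 5                 # Bernoulli(p) source; illustrative values
H = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))   # source entropy in bits

for _ in range(trials):
    x = rng.random(n) < p                     # one i.i.d. Bernoulli(p) sequence
    k = int(x.sum())                          # number of ones
    log_p = k * np.log2(p) + (n - k) * np.log2(1 - p)  # log2 of the sequence probability
    print(f"-(1/n) log2 p(x^n) = {-log_p / n:.4f}   (H = {H:.4f})")
```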

7. Tuesday, September 13th: Data compression - uniquely decodable codes, prefix codes, Kraft's inequality for prefix and uniquely decodable codes, bounds on optimal code-length
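
Kraft's inequality is a one-line check; a sketch for binary codeword lengths:

```python
# A binary prefix code (indeed, any uniquely decodable code) with codeword
# lengths l_1, ..., l_m exists iff sum_i 2^(-l_i) <= 1.
lengths = [1, 2, 3, 3]                         # realized, e.g., by 0, 10, 110, 111
kraft_sum = sum(2.0 ** -l for l in lengths)
print(kraft_sum, "satisfies Kraft:", kraft_sum <= 1)   # 1.0, True
```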

8. Thursday, September 15th: Data compression - Shannon codes, Huffman codes, optimality of Huffman codes
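
A compact heap-based sketch of the Huffman construction; the dyadic pmf in the example is chosen so that the average length meets the entropy exactly:

```python
import heapq
from itertools import count

def huffman_code(pmf):
    """Binary Huffman code for a dict mapping symbol -> probability."""
    tiebreak = count()   # avoids comparing dicts when probabilities tie
    heap = [(p, next(tiebreak), {s: ""}) for s, p in pmf.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)        # merge the two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

pmf = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(pmf)
avg_len = sum(pmf[s] * len(w) for s, w in code.items())
print(code, "average length:", avg_len)        # 1.75 bits = H(X) for this dyadic pmf
```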

9. Tuesday, September 20th: Data compression - Extended Huffman codes, Entropy rates and compression of stationary sources

10. Thursday, September 22nd: Data compression - Adaptive Huffman codes

11. Tuesday, September 27th: Data compression - Adaptive Huffman codes continued

12. Thursday, September 29th: Data compression - Tunstall codes, run-length codes

13. Tuesday, October 4th: Examples of channels, information channel capacity, symmetric channels
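
As a concrete instance, the binary symmetric channel has capacity C = 1 - H(eps), achieved by the uniform input distribution; a few lines suffice to tabulate it:

```python
import numpy as np

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

# Capacity of the BSC with crossover probability eps: C = 1 - h2(eps).
for eps in (0.0, 0.05, 0.11, 0.5):
    print(f"eps = {eps}: C = {1 - h2(eps):.4f} bits/use")
```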

14. Thursday, October 6th: Joint typicality

15. Tuesday, October 11th: Shannon's second theorem (channel capacity theorem) - achievability

16. Thursday, October 13th: Recap of Fano's inequality, Shannon's second theorem (channel capacity theorem) - converse

17. Tuesday, October 18th: Practical error-correction codes - Hamming codes, parity-check matrices, minimum Hamming distance of codes, linear codes
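
A small encode/corrupt/correct round trip for the (7,4) Hamming code via syndrome decoding; the systematic G and matching H below are one standard choice:

```python
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],           # generator, systematic form [I | P]
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],           # parity-check matrix [P^T | I]
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

msg = np.array([1, 0, 1, 1])
cw = msg @ G % 2                               # encode
rx = cw.copy(); rx[5] ^= 1                     # flip one bit in transit
syndrome = H @ rx % 2                          # nonzero syndrome = column of H at the error
err_pos = next(j for j in range(7) if (H[:, j] == syndrome).all())
rx[err_pos] ^= 1                               # correct the single error
print("corrected:", bool((rx == cw).all()))
```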

18. Thursday, October 20th: Practical error-correction codes - Dual codes, sphere-packing bound, Singleton bound, introduction to finite fields

19. Tuesday, October 25th: Practical error-correction codes - Constructions of finite fields, irreducible and primitive polynomials and elements, BCH and RS codes

20. Thursday, October 27th: Practical error-correction codes - Regular and irregular LDPC codes, the peeling decoder for BECs, stopping sets, belief propagation decoding for the BSC
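
A toy peeling decoder for the BEC: repeatedly find a check equation with exactly one erased position and solve for it. The small parity-check matrix and received word below are illustrative; decoding stalls exactly when the erased positions contain a stopping set.

```python
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],              # illustrative parity-check matrix
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1]])
rx = np.array([0, 1, 1, None, None, 0], dtype=object)   # None marks an erasure

progress = True
while progress:
    progress = False
    for row in H:
        idx = np.flatnonzero(row)              # variable positions in this check
        erased = [j for j in idx if rx[j] is None]
        if len(erased) == 1:                   # exactly one unknown: solvable
            j = erased[0]
            rx[j] = sum(int(rx[k]) for k in idx if k != j) % 2
            progress = True
print(rx)                                      # [0 1 1 1 0 0]: both erasures recovered
```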

21. Tuesday, November 1st: LDPC codes Reference

22. Thursday, November 3rd: Feedback capacity, source-channel coding separation theorem

23. Tuesday, November 8th: No lecture, election day

24. Thursday, November 10th: Differential entropy
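
The defining formula and the standard Gaussian example, for reference:

```latex
h(X) \;=\; -\int f(x)\,\log f(x)\,dx,
\qquad
X \sim \mathcal{N}(\mu,\sigma^2) \;\Longrightarrow\; h(X) = \tfrac12 \log\!\big(2\pi e\,\sigma^2\big)
% Unlike discrete entropy, h(X) can be negative, e.g. for a uniform density
% on an interval of width less than 1.
```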

25. Tuesday, November 15th: Additive Gaussian noise channels and their capacity, parallel Gaussian channels and waterfilling arguments
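
A short waterfilling sketch for parallel Gaussian channels; the noise variances and power budget are arbitrary, and the allocation is reported in order of increasing noise:

```python
import numpy as np

def waterfill(noise, P):
    """Split power P across parallel Gaussian channels with the given noise
    variances, maximizing sum_i 0.5 * log2(1 + p_i / n_i)."""
    noise = np.sort(np.asarray(noise, dtype=float))
    for k in range(len(noise), 0, -1):         # try the k least noisy channels
        mu = (P + noise[:k].sum()) / k         # candidate water level
        if mu > noise[k - 1]:                  # all k allocations strictly positive
            p = np.maximum(mu - noise, 0.0)
            return p, float(np.sum(0.5 * np.log2(1.0 + p / noise)))
    raise ValueError("total power must be positive")

p, C = waterfill([1.0, 2.0, 4.0], P=3.0)
print("power allocation:", p, " capacity:", C, "total bits per channel use")
```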

26. Thursday, November 17th: MSE distortion, scalar quantization, optimal uniform scalar quantizers

27. Tuesday, November 22nd: Thanksgiving break

28. Thursday, November 24th: Thanksgiving break

29. Tuesday, November 29th: Nonuniform scalar quantization and Bennett's integral, rate-distortion theory
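
Lloyd's alternating centroid/boundary iteration, run empirically on Gaussian samples, gives a feel for MSE-optimal nonuniform quantizers; the sample size, seed, and initialization below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)                   # unit-variance Gaussian source samples
levels = np.linspace(-2.0, 2.0, 8)             # initial reproduction levels

for _ in range(50):                            # Lloyd iteration
    edges = (levels[1:] + levels[:-1]) / 2     # nearest-neighbor cell boundaries
    cell = np.digitize(x, edges)               # assign each sample to a cell
    levels = np.array([x[cell == i].mean() if np.any(cell == i) else levels[i]
                       for i in range(len(levels))])   # centroid condition

edges = (levels[1:] + levels[:-1]) / 2
mse = float(np.mean((x - levels[np.digitize(x, edges)]) ** 2))
print("8-level quantizer MSE:", mse)           # near the Lloyd-Max optimum (~0.0345)
```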

 

Homework, Fall 2022

Problem Solving Sessions, Fall 2019

Recitations, Fall 2022


Course Schedule from Fall 2019

Date / Topic / Reading Assignment

8/28 1. The problem of communication; information theory beyond communication [slides]
  • Chapter 1 (Introduction and Preview) of Cover & Thomas

8/30 2. The idea of error-control coding and linear codes [slides] [handwritten]
  • Chapter 7.11 (Hamming Codes) of Cover & Thomas

9/4 3. Information measures and their axiomatic derivation
    4. Basic inequalities with information measures
  • Chapter 2.4-2.10 (Entropy, Relative Entropy and Mutual Information) of Cover & Thomas

9/11 5. Asymptotic Equipartition Property
  • Chapter 3.1 of Cover & Thomas

9/13 6. Source Coding Theorem
  • Chapter 3.2 of Cover & Thomas
  • Chapter 5.2 of Yeung, if you'd like

9/18 7. Variable-length Codes
  • Chapter 5 of Cover & Thomas

9/20 8. Entropy Rate of Stochastic Processes
  • Chapter 4 of Cover & Thomas

9/25 9. Distributed Source Coding
  • Chapter 15.4 of Cover & Thomas

9/27 10. Universal Source Coding
  • Chapter 13 of Cover & Thomas

10/2 11. Method of Types
  • Chapter 11.1-11.3 of Cover & Thomas

10/4 12. Allerton Conference [no lecture]

10/9 13. Hypothesis Testing
  • Chapter 11.7-11.10 of Cover & Thomas

10/11 14. Channel Coding Theorem: Converse and Joint AEP
  • Chapter 7.9 and 7.6 of Cover & Thomas

10/16 15. Channel Coding Theorem: Achievability and Examples
  • Chapter 7.7 and 7.1 of Cover & Thomas

10/18 16. Midterm [no lecture]

10/23 17. Source-Channel Separation
  • Chapter 7.13 of Cover & Thomas (and, e.g., Gastpar et al., 2003)

10/25 18. Differential Entropy, Maximum Entropy, and Capacity of Real-Valued Channels
  • Chapters 8, 9, and 12 of Cover & Thomas

10/30 19. Rate-Distortion Theorem: Converse and Examples
  • Chapter 10 of Cover & Thomas

11/1 20. Rate-Distortion Theorem: Achievability and More Examples
  • Chapter 10 of Cover & Thomas (and Chapter 9 of Yeung)

11/6 21. Quantization Theory

11/8 22. Blahut-Arimoto Algorithm (see the sketch at the end of this schedule)
  • Chapter 10.8 of Cover & Thomas (and Chapter 10 of Yeung)

11/13 23. Strong Data Processing Inequalities

11/15 24. Large Deviations
  • Chapter 11.4-11.5 of Cover & Thomas

11/27 25. Error Exponents for Channel Coding
  • Chapter 5.6 of Blahut

11/29 26. Error Exponents for Channel Coding (continued)

12/4 27. Multiple Access Channel: Achievability
  • Chapter 15.3 of Cover & Thomas

12/6 28. Quantum Information Theory [guest lecture]

12/11 29. Multiple Access Channel: Converse, Examples, and Duality
  • Chapter 15.3 and 15.5 of Cover & Thomas