ECE 563 - Information Theory (Fall 2020)

Lecturer: Lav Varshney (office hours, Friday 9:30am-10:30am, Zoom)

Teaching Assistant: Sourya Basu (office hours, Wednesday, 9:00am-10:00am, Zoom)

Lectures: Tuesday and Thursday, 12:30pm, Zoom (if you have not received the password, please ask the course staff).  Recordings via Illinois Media Space.

Problem Solving Sessions: Monday, 9:00am-10:00am, Zoom [optional]

Course Goals

Catalog Description

Mathematical models for channels and sources; entropy, information, data compression, channel capacity, Shannon's theorems, and rate-distortion theory.

Prerequisites: Solid background in probability (ECE 534, MATH 464, or MATH 564).

Textbook: T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed., Wiley, 2006.

Grading: 

Syllabus


Homework (all via GradeScope; if you have not received an invitation, please ask the course staff)

Problem Solving Sessions

Old exams

Exams

Juxtaposition Paper

Course Schedule

Date | Topic | Reading Assignment | Learning Objectives | Multimedia Supplements
8/25

1. The problem of communication, information theory beyond communication

[slides]

  • Chapter 1 (Introduction and Preview) of Cover & Thomas
8/27

2. The idea of error-control coding and linear codes

[slides][handwritten]

  • Chapter 7.11 (Hamming Codes) of Cover & Thomas
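As an optional supplement to this reading, here is a short Python sketch of the Hamming (7,4) code. The generator and parity-check matrices below are one standard systematic choice; other texts permute the columns, and the function names are ours.

```python
import numpy as np

# Hamming (7,4) code: 4 data bits -> 7 coded bits; corrects any single bit flip.
# Systematic generator G = [I | P] and parity-check H = [P^T | I].
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(u):
    """Encode 4 data bits into a 7-bit codeword."""
    return (u @ G) % 2

def decode(r):
    """Correct up to one bit flip, then return the 4 data bits."""
    s = (H @ r) % 2
    if s.any():
        # the syndrome equals the column of H at the error position
        err = int(np.where((H.T == s).all(axis=1))[0][0])
        r = r.copy()
        r[err] ^= 1
    return r[:4]  # systematic code: data bits come first
```

For example, encoding [1, 0, 1, 1], flipping any single bit of the codeword, and decoding recovers the original data bits.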
9/1

3. Information measures and their axiomatic derivation

[handwritten]

9/3

4. Basic inequalities with information measures

[handwritten]

  • Chapter 2.4-2.10 (Entropy, Relative Entropy and Mutual Information) of Cover & Thomas
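Optionally, the quantities in this reading are easy to check numerically. A small Python sketch (the function names are ours, not from the text; 0 log 0 is taken as 0 throughout):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits, with the convention 0 log 0 = 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits (assumes q > 0 wherever p > 0)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / q[nz])))

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a matrix."""
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(pxy)
```

A quick sanity check: I(X;Y) computed this way agrees with D(p(x,y) || p(x)p(y)), and it vanishes for an independent joint distribution.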
9/8

5. Asymptotic Equipartition Property

[handwritten]

  • Chapter 3.1 of Cover & Thomas
9/10

6. Source Coding Theorem

[handwritten]

  • Chapter 3.2 of Cover & Thomas
  • Chapter 5.2 of Yeung, if you'd like
9/15

7. Variable-length Codes

[handwritten]

  • Chapter 5 of Cover & Thomas
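Optionally, a compact Python sketch of Huffman's algorithm from Chapter 5 (the heap-of-symbol-groups representation below is one common implementation choice, not the text's):

```python
import heapq
from itertools import count

def huffman_code(pmf):
    """Build a binary Huffman code for a dict {symbol: probability}."""
    tiebreak = count()  # breaks probability ties without comparing symbol lists
    heap = [(p, next(tiebreak), [sym]) for sym, p in pmf.items()]
    heapq.heapify(heap)
    codes = {sym: "" for sym in pmf}
    while len(heap) > 1:
        # merge the two least probable subtrees
        p1, _, g1 = heapq.heappop(heap)
        p2, _, g2 = heapq.heappop(heap)
        for s in g1:
            codes[s] = "0" + codes[s]  # prepend a bit as we move up the tree
        for s in g2:
            codes[s] = "1" + codes[s]
        heapq.heappush(heap, (p1 + p2, next(tiebreak), g1 + g2))
    return codes
```

For the dyadic pmf {a: 1/2, b: 1/4, c: 1/4} the expected codeword length equals the entropy (1.5 bits), and the Kraft inequality holds with equality.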
9/17

8. Entropy Rate of Stochastic Processes

[handwritten]

  • Chapter 4 of Cover & Thomas
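Optionally, the entropy rate of a stationary Markov chain, H = -Σ_i π_i Σ_j P_ij log2 P_ij, can be checked numerically. A sketch (assumes an irreducible chain, so the stationary distribution π is unique; function name is ours):

```python
import numpy as np

def entropy_rate(P):
    """Entropy rate (bits/symbol) of a stationary, irreducible Markov chain
    with transition matrix P[i, j] = Pr(next = j | current = i)."""
    # stationary distribution: left eigenvector of P with eigenvalue 1
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    pi = pi / pi.sum()
    # H = -sum_i pi_i sum_j P_ij log2 P_ij, with the convention 0 log 0 = 0
    safe = np.where(P > 0, P, 1.0)
    return float(-(pi @ np.sum(P * np.log2(safe), axis=1)))
```

For a symmetric two-state chain with flip probability 0.1, π is uniform and the rate reduces to the binary entropy H(0.1) ≈ 0.469 bits/symbol.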
9/22

9. Distributed Source Coding

[handwritten]

  • Chapter 15.4 of Cover & Thomas
9/24

10. Universal Source Coding

[handwritten]

  • Chapter 13 of Cover & Thomas
9/29

11. Method of Types

[handwritten]

  • Chapter 11.1-11.3 of Cover & Thomas
10/1 12. Exam 1 [no lecture]      
10/6

13. Hypothesis Testing

[handwritten]

  • Chapter 11.7-11.10 of Cover & Thomas
10/8

14. Channel Coding Theorem: Converse and Joint AEP

[handwritten]

  • Chapter 7.9 and 7.6 of Cover & Thomas
10/13

15. Channel Coding Theorem: Achievability and Examples

[handwritten]

  • Chapter 7.7 and 7.1 of Cover & Thomas
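Optionally, two of the standard closed-form examples from Chapter 7.1 in a few lines of Python — the binary symmetric channel with C = 1 - H(p) and the binary erasure channel with C = 1 - e (function names are ours):

```python
import numpy as np

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def bsc_capacity(p):
    """Capacity of the binary symmetric channel, crossover probability p."""
    return 1.0 - h2(p)

def bec_capacity(e):
    """Capacity of the binary erasure channel, erasure probability e."""
    return 1.0 - e
```

Sanity checks: a noiseless BSC (p = 0) carries 1 bit/use, a completely noisy one (p = 1/2) carries nothing, and a BSC with p ≈ 0.11 has capacity about 1/2 bit/use.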
10/15

16. Source-Channel Separation

[handwritten]

  • Chapter 7.13 of Cover & Thomas (and e.g. Gastpar et al., 2003)
  • Michael Gastpar, et al., "To code, or not to code," IEEE Transactions on Information Theory, 49(5): 1147-1158, May 2003.
10/20

17. Differential Entropy, Maximum Entropy, and Capacity of Real-Valued Channels

[handwritten]

  • Chapter 8, 9, and 12 of Cover & Thomas
10/22

18. Rate-Distortion Theorem: Converse and Examples

[handwritten]

  • Chapter 10 of Cover & Thomas
10/27 19. Exam 2 [no lecture]
10/29

20. Rate-Distortion Theorem: Achievability and More Examples

[handwritten]

  • Chapter 10 of Cover & Thomas (and Chapter 9 of Yeung)
11/3 21. Election Day [no lecture]
11/5

22. Quantization Theory

[handwritten]

11/10

23. Blahut-Arimoto

[handwritten]

  • Chapter 10.8 of Cover & Thomas (and Chapter 10 of Yeung)
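Optionally, a minimal Python sketch of the Blahut-Arimoto iteration for the capacity of a discrete memoryless channel (fixed iteration count rather than a convergence test, for brevity; function names are ours):

```python
import numpy as np

def _divergences(W, q):
    """D( W(.|x) || q ) in bits for each input x, with 0 log 0 = 0."""
    safe_W = np.where(W > 0, W, 1.0)
    safe_q = np.where(W > 0, q, 1.0)  # q broadcasts over the rows of W
    return np.sum(W * np.log2(safe_W / safe_q), axis=1)

def blahut_arimoto(W, iters=200):
    """Capacity (bits/use) of a DMC with transition matrix W[x, y] = p(y | x).

    Alternates the two coordinate-ascent steps of Blahut-Arimoto:
    fix the input distribution p and compute the induced output q = pW,
    then reweight p multiplicatively by 2^{D(W(.|x) || q)}.
    """
    p = np.full(W.shape[0], 1.0 / W.shape[0])  # start from the uniform input
    for _ in range(iters):
        q = p @ W
        p = p * 2.0 ** _divergences(W, q)
        p /= p.sum()
    # at the fixed point, I(p; W) = sum_x p_x D( W(.|x) || pW )
    return float(p @ _divergences(W, p @ W))
```

As a check, a noiseless binary channel gives 1 bit/use, and a BSC with crossover 0.11 gives approximately 1 - H(0.11) ≈ 0.5 bits/use.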
11/12

24. Strong Data Processing Inequalities

[handwritten][slides]

11/17 25. Large Deviations
  • Chapter 11.4-11.5 of Cover & Thomas
11/19

26. Error Exponents for Channel Coding

[slides]

  • Chapter 5.6 of Blahut
12/1 27. Error Exponents for Channel Coding  
12/3 28. Multiple Access Channel: Achievability
  • Chapter 15.3 of Cover & Thomas
12/8 29. Multiple Access Channel: Converse, Examples, and Duality
  • Chapter 15.3 and 15.5 of Cover & Thomas