ECE 563 - Information Theory (Fall 2020)

Lecturer: Lav Varshney (office hours, Friday 9:30am-10:30am, Zoom)

Teaching Assistant: Sourya Basu (office hours, Wednesday, 9:00am-10:00am, Zoom)

Lectures: Tuesday and Thursday, 12:30pm, Zoom (if you have not received the password, please ask the course staff).  Recordings via Illinois Media Space.

Problem Solving Sessions: Monday, 9:00am-10:00am, Zoom [optional]

Course Goals

• Learn how to establish the fundamental limits and optimal designs for communication systems, including the basic conceptual idea of coding.
• Develop skills in mathematical abstraction of engineering and natural systems, so as to define closed deductive systems within which to prove theorems.
• Attain facility with mathematical tools from information theory that may be broadly applicable.
• Gain confidence in using information-theoretic approaches for problems beyond communication, e.g. in learning, computing, biology, and social sciences.

Catalog Description

Mathematical models for channels and sources; entropy, information, data compression, channel capacity, Shannon's theorems, and rate-distortion theory.
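As a quick, unofficial illustration of two quantities named above (these few lines and function names are our own sketch, not part of the course materials), entropy and the capacity of a binary symmetric channel can be computed directly:

```python
# Minimal sketch, for intuition only: Shannon entropy of a discrete
# distribution and the capacity C = 1 - H(eps) of a binary symmetric
# channel with crossover probability eps.
from math import log2

def entropy(p):
    """Shannon entropy H(X) in bits for a probability vector p."""
    return -sum(x * log2(x) for x in p if x > 0)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel, in bits per use."""
    return 1 - entropy([eps, 1 - eps])

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(bsc_capacity(0.0))     # noiseless channel: 1.0 bit per use
```

A crossover probability of 0.5 makes the channel output independent of the input, and indeed `bsc_capacity(0.5)` returns 0.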

Prerequisites: Solid background in probability (ECE 534, MATH 464, or MATH 564).

Textbook: T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed., Wiley, 2006.

Grading

• Homework [including peer production assignment] (25%) --- late submissions incur a 25% score reduction
• Exam 1 [October 1 at 7pm via CBTF] (15%)
• Exam 2 [October 27 at 7pm via CBTF] (15%)
• Final exam [as designated by university via CBTF] (20%)
• Group juxtaposition paper [in groups of three, in roughly Allerton format] (25%)
• Weekly check-ins with course staff [to be scheduled] (0%)

Syllabus

• Homework 0 (released August 23, due August 25, preferably before class)
• Homework 1 (released , due ) [solutions]
• Homework 2 (released , due ) [solutions]
• Homework 3 (released , due ) [solutions]
• Homework 4 (released , due ) [solutions]
• Homework 5 (released , due ) [solutions]
• Homework 6 (released , due ) [solutions]

Problem Solving Sessions

Old exams

Exams

• Exam 1 - October 1 at 7pm via CBTF
• Exam 2 - October 27 at 7pm via CBTF
• Final Exam - as designated by university

Juxtaposition Paper

Course Schedule

• 8/25: 1. The problem of communication, information theory beyond communication [slides]. Reading: Chapter 1 (Introduction and Preview) of Cover & Thomas.
• 8/27: 2. The idea of error-control coding and linear codes. Reading: Chapter 7.11 (Hamming Codes) of Cover & Thomas.
• 9/1: 3. Information measures and their axiomatic derivation. Reading: Chapter 2.1-2.3 (Entropy, Relative Entropy, and Mutual Information) of Cover & Thomas. Supplements: T. Berger, "An Axiomatic Derivation of the Information Measure"; Federico Echenique and Roland G. Fryer, Jr., "A Measure of Segregation Based on Social Interactions," The Quarterly Journal of Economics, vol. 122, May 2007, pp. 441–485.
• 9/3: 4. Basic inequalities with information measures. Reading: Chapter 2.4-2.10 (Entropy, Relative Entropy, and Mutual Information) of Cover & Thomas. Supplement: J. Aczél and Z. Daróczy, On Measures of Information and Their Characterizations, 1975.
• 9/8: 5. Asymptotic Equipartition Property. Reading: Chapter 3.1 of Cover & Thomas. Supplements: L. D. Goodfellow, "A psychological interpretation of the results of the Zenith radio experiments in telepathy," Journal of Experimental Psychology, 23(6), 601-632, 1938; Raymond Yeung, "Weak Typicality."
• 9/10: 6. Source Coding Theorem. Reading: Chapter 3.2 of Cover & Thomas (and Chapter 5.2 of Yeung, if you'd like). Supplement: Raymond Yeung, "Source Coding Theorem."
• 9/15: 7. Variable-length Codes. Reading: Chapter 5 of Cover & Thomas.
• 9/17: 8. Entropy Rate of Stochastic Processes. Reading: Chapter 4 of Cover & Thomas.
• 9/22: 9. Distributed Source Coding. Reading: Chapter 15.4 of Cover & Thomas.
• 9/24: 10. Universal Source Coding. Reading: Chapter 13 of Cover & Thomas. Supplement: David Brailsford, "Elegant Compression in Text (The LZ77 Method)."
• 9/29: 11. Method of Types. Reading: Chapter 11.1-11.3 of Cover & Thomas.
• 10/1: 12. Exam 1 [no lecture].
• 10/6: 13. Hypothesis Testing. Reading: Chapter 11.7-11.10 of Cover & Thomas.
• 10/8: 14. Channel Coding Theorem: Converse and Joint AEP. Reading: Chapters 7.9 and 7.6 of Cover & Thomas. Supplement: Brit Cruise.
• 10/13: 15. Channel Coding Theorem: Achievability and Examples. Reading: Chapters 7.7 and 7.1 of Cover & Thomas.
• 10/15: 16. Source-Channel Separation. Reading: Chapter 7.13 of Cover & Thomas (and, e.g., Gastpar et al., 2003). Supplement: Michael Gastpar et al., "To code, or not to code," IEEE Transactions on Information Theory, 49(5): 1147-1158, May 2003.
• 10/20: 17. Differential Entropy, Maximum Entropy, and Capacity of Real-Valued Channels. Reading: Chapters 8, 9, and 12 of Cover & Thomas.
• 10/22: 18. Rate-Distortion Theorem: Converse and Examples. Reading: Chapter 10 of Cover & Thomas. Supplements: Claude Shannon, "Scientific Aspects of Juggling"; Jordana Cepelewicz, "Overtaxed Working Memory Knocks the Brain Out of Sync." "This duality can be pursued further and is related to a duality between past and future and notions of control and knowledge. Thus we may have knowledge of the past but cannot control it; we may control the future but have no knowledge of it." - Shannon (1959)
• 10/27: 19. Exam 2 [no lecture].
• 10/29: 20. Rate-Distortion Theorem: Achievability and More Examples. Reading: Chapter 10 of Cover & Thomas (and Chapter 9 of Yeung).
• 11/3: 21. Election Day [no lecture].
• 11/5: 22. Quantization Theory. Reading: Chapters 5 and 6 of Gersho & Gray.
• 11/10: 23. Blahut-Arimoto. Reading: Chapter 10.8 of Cover & Thomas (and Chapter 10 of Yeung).
• 11/12: 24. Strong Data Processing Inequalities [handwritten][s]. Supplements: W. S. Evans and L. J. Schulman, "Signal Propagation and Noisy Circuits," 1999; Y. Polyanskiy and Y. Wu, "Strong Data-Processing Inequalities for Channels and Bayesian Networks," 2017.
• 11/17: 25. Large Deviations. Reading: Chapter 11.4-11.5 of Cover & Thomas.
• 11/19: 26. Error Exponents for Channel Coding [s]. Reading: Chapter 5.6 of Blahut.
• 12/1: 27. Error Exponents for Channel Coding. Reading: Chapter 5.7 of Blahut. Supplement: G. D. Forney, Jr., "On Exponential Error Bounds for Random Codes on the BSC."
• 12/3: 28. Multiple Access Channel: Achievability. Reading: Chapter 15.3 of Cover & Thomas.
• 12/8: 29. Multiple Access Channel: Converse, Examples, and Duality. Reading: Chapters 15.3 and 15.5 of Cover & Thomas.
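The 8/27 lecture introduces Hamming codes (Chapter 7.11 of Cover & Thomas). As an unofficial sketch of the flavor of that material, the Hamming(7,4) code below encodes 4 message bits into 7 and corrects any single bit error; the generator and parity-check matrices are one standard systematic choice, not necessarily the ones used in lecture:

```python
# Toy Hamming(7,4) encoder/decoder over GF(2), for illustration only.
# G is in systematic form (identity in the first four columns), and the
# seven columns of H are exactly the seven nonzero 3-bit patterns, so a
# single-error syndrome identifies the flipped position.
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]

def encode(msg):
    """Multiply the 4-bit message by G over GF(2)."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

def decode(word):
    """Correct at most one bit error, then return the 4 message bits."""
    word = list(word)
    syndrome = [sum(h * w for h, w in zip(row, word)) % 2 for row in H]
    if any(syndrome):
        # the syndrome equals the column of H at the flipped position
        for i in range(7):
            if [row[i] for row in H] == syndrome:
                word[i] ^= 1
                break
    return word[:4]  # systematic code: first four bits carry the message
```

For example, flipping any one of the seven bits of `encode([1, 0, 1, 1])` still decodes back to `[1, 0, 1, 1]`.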