This course introduces students to machine learning, security, privacy, adversarial machine learning, and basic game theory. Students will study different machine learning algorithms and analyze their implementations and security vulnerabilities through a series of homework assignments and projects.
Please contact the instructor if you have questions regarding the material or concerns about whether your background is suitable for the course.
The following table outlines the schedule for the course. We will update it as the semester progresses.
Date | Lecture | Content | Slides | Readings |
---|---|---|---|---|
1/27 | Course Intro | | Slides | Reading 1, Reading 2 |
1/29 | Supervised Learning I | Regression, classification, gradient | Slides | Reading 1, Reading 2, Reading 3 |
2/3 | Supervised Learning II | PAC learnability, supervised learning in adversarial settings | Slides | Reading 1, Reading 2 |
2/5 | Unsupervised Learning I | Clustering, PCA, matrix completion | Slides | Reading 1 |
2/10 | Unsupervised Learning II | Unsupervised learning in adversarial settings | Slides | Reading 1, Reading 2 |
2/12 | Homework 1 Walkthrough, Q&A | | | Reading 1 |
2/17 | Non-instructional Day | | | |
2/19 | Attacks at Decision Time | Evasion attacks, anomaly detection | Slides | Reading 1 |
2/24 | Backdoor Attacks on Machine Learning | Transferability-based attacks, score-based black-box attacks, and decision-based attacks | Slides | Reading 1, Reading 2 |
2/26 | Poisoning Attacks | | Slides | Reading 1, Reading 2, Reading 3 |
3/3 | Guest Lecture | | Slides | Reading 1, Reading 2 |
3/5 | White-box/Black-box Decision-time Attacks against Different Models | | Slides | |
3/10 | Defending against Decision-time Attacks | Optimal evasion-robust classification, feature-level protection | Slides | Reading 1 |
3/12 | Midterm Exam | | | |
3/17 | Guest Lecture | | Slides | Reading 1, Reading 2, Reading 3 |
3/19 | Certified Defenses against Evasion Attacks | | Slides | Reading 1, Reading 2 |
3/24 | Non-instructional Day | | | |
3/26 | Knowledge-enriched Robust Learning Models | | | |
3/31 | Defending against Poisoning Attacks I | Data sub-sampling, outlier removal | Slides | Reading 1, Reading 2, Reading 3, Reading 4 |
4/2 | Defending against Poisoning Attacks II | Trimmed optimization | Slides | |
4/7 | Privacy Attacks against Machine Learning Models | | Slides | Reading 1, Reading 2 |
4/9 | Privacy-preserving Machine Learning | | Slides | |
4/14 | Game-theoretic Analysis for Privacy Protection | | Slides | |
4/16 | Fairness in Machine Learning | | Slides | |
4/21 | Robustness in Federated Learning | | Slides | |
4/23 | Data Valuation in Machine Learning | | Slides | Reading 1, Reading 2 |
4/28 | Final Review, Final Project Presentations | | | |
4/30 | Final Exam | | | |
5/5 | Final Exam Analysis | | | |
The course will involve three programming homework assignments (due on 2/28, 3/28, and 5/2, respectively), a midterm exam, and a final exam. Unless otherwise noted by the instructor, all work in this course is to be completed independently. If you are ever uncertain about how to complete an assignment, you can come to office hours or engage in high-level discussions about the problem with your classmates on the Piazza boards.
Grades will be assigned as follows:
The expectations for the course are that students will attend every class, complete any readings assigned for class, and actively and constructively participate in class discussions. Class participation will be measured by contributions to the discourse both in class, through discussion and questions, and outside of class, through posting and responding on the Piazza forum.
More information about course requirements will be made available leading up to the start of classes.
This course will include topics related to computer security and privacy. As part of this investigation, we may cover technologies whose abuse could infringe on the rights of others. As computer scientists, we rely on the ethical use of these technologies. Unethical use includes circumventing existing security or privacy mechanisms for any purpose, or disseminating, promoting, or exploiting vulnerabilities in these services. Any activity outside the letter or spirit of these guidelines will be reported to the proper authorities and may result in dismissal from the class and possibly more severe academic and legal sanctions.
The University of Illinois at Urbana-Champaign Student Code should also be considered as a part of this syllabus. Students should pay particular attention to Article 1, Part 4: Academic Integrity. Read the Code at the following URL: http://studentcode.illinois.edu/.
Academic dishonesty may result in a failing grade. Every student is expected to review and abide by the Academic Integrity Policy: http://studentcode.illinois.edu/. Ignorance is not an excuse for any academic dishonesty. It is your responsibility to read this policy to avoid any misunderstanding. Do not hesitate to ask the instructor(s) if you are ever in doubt about what constitutes plagiarism, cheating, or any other breach of academic integrity.
To obtain disability-related academic adjustments and/or auxiliary aids, students with disabilities should contact both the course instructor and Disability Resources and Educational Services (DRES) as soon as possible. To ensure that disability-related concerns are properly addressed from the beginning, please speak to me after class, make an appointment, or see me during office hours if you need accommodations of any sort. DRES provides students with academic accommodations, access, and support services. To contact DRES, you may visit 1207 S. Oak St., Champaign, call 333-4603 (V/TDD), or e-mail disability@uiuc.edu. Please refer to http://www.disability.illinois.edu/.
Emergency response recommendations can be found at the following website: http://police.illinois.edu/emergency-preparedness/. I encourage you to review this website and the campus building floor plans website within the first 10 days of class: http://police.illinois.edu/emergency-preparedness/building-emergency-action-plans/.
Any student who has suppressed their directory information pursuant to the Family Educational Rights and Privacy Act (FERPA) should self-identify to the instructor to ensure that the privacy of their attendance in this course is protected. See http://registrar.illinois.edu/ferpa for more information on FERPA.