CIS 6250: Exploring the Depths of Machine Learning Theory

Welcome to CIS 6250, a doctoral-level course on the theory of machine learning. The course is a deep dive into the mathematical foundations and formal proofs that underpin modern machine learning algorithms, blending lectures with interactive discussion and critical analysis of core concepts.

This course meets twice a week, on Tuesdays and Thursdays from 10:15 AM to 11:45 AM, with the first session on Tuesday, August 27th. There are no strict prerequisites, but a solid background in algorithms, complexity theory, discrete mathematics, combinatorics, convex optimization, probability theory, and statistics will be highly beneficial. More broadly, “mathematical maturity” is essential, since we will work through detailed proofs throughout the semester. If you’re unsure whether your background is suitable, please don’t hesitate to ask. The course is also open to auditors and occasional participants.

Course requirements for enrolled students include active participation in class discussions, completion of problem sets, possibly leading a class discussion, and a final project. Final projects are flexible, ranging from original research contributions to comprehensive literature reviews or in-depth problem-solving exercises.

Course Schedule and Topics

Note: The following schedule is tentative and subject to adjustments based on the course’s progression.

Module 1: Foundations and PAC Learning
This module sets the stage with a course overview covering essential topics and mechanics. We will analyze the rectangle learning problem and introduce the Probably Approximately Correct (PAC) learning model; a small code sketch of the rectangle learner appears after the reading list below. Topics include PAC learning conjunctions, the intractability of PAC learning 3-term DNF, and PAC learning 3-term DNF using 3CNF.

  • Course Overview Lecture Notes
  • Rectangle Learning Problem Lecture Notes
  • PAC Model Lecture Notes
  • K&V Chapter 1
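
As a concrete anchor for the PAC discussion, here is a minimal Python sketch of the tightest-fit strategy for the rectangle learning problem: output the smallest axis-aligned rectangle enclosing the positive examples. Names and data layout are illustrative, not the course’s own code.

    def tightest_fit_rectangle(samples):
        # samples: list of ((x, y), label) pairs with label in {0, 1}.
        positives = [point for point, label in samples if label == 1]
        if not positives:
            return None  # degenerate hypothesis: predict negative everywhere
        xs = [x for x, _ in positives]
        ys = [y for _, y in positives]
        return (min(xs), max(xs), min(ys), max(ys))

    def predict(rect, point):
        # A point is labeled positive iff it falls inside the learned rectangle.
        if rect is None:
            return 0
        x_lo, x_hi, y_lo, y_hi = rect
        x, y = point
        return int(x_lo <= x <= x_hi and y_lo <= y <= y_hi)

Because the learned rectangle always lies inside the target rectangle, its error region is a thin border of the target, and bounding the probability mass of that border is exactly what the PAC sample-size analysis does.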

Module 2: Consistency, PAC Learning, and Finite Hypothesis Spaces
We study consistency and PAC learning in finite hypothesis spaces, along with compression-based arguments; a small numeric illustration of the finite-class sample bound follows the reading list below.

  • Consistency and Compression Lecture Notes
  • K&V Chapter 2
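
The quantitative centerpiece of this module is the finite-class bound: any hypothesis consistent with m ≥ (1/ε)(ln|H| + ln(1/δ)) random examples has, with probability at least 1 − δ, error at most ε. A small calculator sketch in Python (the function name is ours, purely for illustration):

    import math

    def sample_bound(h_size, epsilon, delta):
        # Samples sufficient for a consistent learner over a finite class H:
        # m >= (1/epsilon) * (ln|H| + ln(1/delta)).
        return math.ceil((math.log(h_size) + math.log(1 / delta)) / epsilon)

    # Example: conjunctions over n = 20 boolean variables, |H| = 3^n.
    print(sample_bound(3 ** 20, epsilon=0.05, delta=0.01))  # about 532

Note the dependence on ln|H| rather than |H|: even the exponentially large class of conjunctions needs only a few hundred examples here.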

Module 3: Infinite Hypothesis Spaces and VC Dimension
This module extends our understanding to infinite hypothesis spaces, introducing the Vapnik-Chervonenkis (VC) dimension and the principle of uniform convergence.
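
To make the definition concrete, the following brute-force Python check (illustrative only, and feasible only for tiny classes and samples) tests whether a finite set of hypotheses shatters a set of points, i.e. realizes all possible labelings:

    def shatters(hypotheses, points):
        # The class shatters the points iff every {0, 1}-labeling is realized.
        realized = {tuple(h(p) for p in points) for h in hypotheses}
        return len(realized) == 2 ** len(points)

    # Threshold functions h_t(x) = 1 iff x >= t have VC dimension 1:
    thresholds = [lambda x, t=t: int(x >= t) for t in (0.5, 1.5, 2.5)]
    print(shatters(thresholds, [1.0]))       # True
    print(shatters(thresholds, [1.0, 2.0]))  # False: labeling (1, 0) is unrealizable

The VC dimension is the size of the largest shattered set; it replaces ln|H| in the sample-complexity bounds once the hypothesis space is infinite.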

Module 4: Boosting Techniques
We examine boosting, a family of techniques for converting weak learners, whose accuracy is only slightly better than random guessing, into strong learners with arbitrarily small error.
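
As one concrete representative, here is a minimal AdaBoost sketch in Python. The weak_learner interface is an assumption of ours: it takes weighted samples (x, y) with y in {−1, +1} and returns a hypothesis whose weighted error is strictly between 0 and 1/2.

    import math

    def adaboost(samples, weak_learner, rounds):
        m = len(samples)
        weights = [1.0 / m] * m
        ensemble = []  # (alpha, hypothesis) pairs
        for _ in range(rounds):
            h = weak_learner(samples, weights)
            err = sum(w for w, (x, y) in zip(weights, samples) if h(x) != y)
            alpha = 0.5 * math.log((1 - err) / err)  # assumes 0 < err < 1/2
            ensemble.append((alpha, h))
            # Upweight mistakes, downweight correct points, then renormalize.
            weights = [w * math.exp(-alpha * y * h(x))
                       for w, (x, y) in zip(weights, samples)]
            total = sum(weights)
            weights = [w / total for w in weights]
        return lambda x: 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1

The reweighting step forces each round’s weak hypothesis to focus on the examples the ensemble currently gets wrong, which is what drives the exponential decrease in training error.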

Module 5: Noise and Statistical Queries
This module addresses PAC learning in the presence of classification noise (CN) and introduces the Statistical Query (SQ) Model.
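
In the SQ model the learner never sees individual examples; it may only ask for expectations E[χ(x, c(x))] up to a tolerance τ. A simulated oracle is easy to sketch in Python (the sample size and names here are illustrative; a Hoeffding bound gives the high-probability guarantee):

    def sq_estimate(dist_sampler, target, chi, tolerance):
        # Average chi(x, c(x)) over enough i.i.d. draws that the estimate
        # is within `tolerance` of its mean with high probability.
        n = int(4.0 / tolerance ** 2) + 1
        total = 0.0
        for _ in range(n):
            x = dist_sampler()
            total += chi(x, target(x))
        return total / n

The point of the model is that any algorithm phrased this way automatically tolerates classification noise, since the needed expectations can still be estimated from noisy data.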

Module 6: No-Regret Learning and Game Theory
Exploring the intersection of machine learning and game theory through no-regret learning algorithms.
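
The workhorse in this area is the multiplicative-weights (Hedge) algorithm; here is a minimal Python sketch, with losses in [0, 1] and a learning rate eta chosen by the caller:

    import math

    def hedge(loss_sequence, eta):
        # loss_sequence[t][i]: loss of expert i at round t, in [0, 1].
        n = len(loss_sequence[0])
        weights = [1.0] * n
        plays = []
        for losses in loss_sequence:
            total = sum(weights)
            plays.append([w / total for w in weights])  # distribution over experts
            weights = [w * math.exp(-eta * l) for w, l in zip(weights, losses)]
        return plays

With eta on the order of sqrt(ln N / T) over T rounds and N experts, the regret to the best fixed expert grows as O(sqrt(T ln N)); playing such no-regret algorithms against each other yields convergence to equilibrium in zero-sum games.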

Module 7: Fairness in Machine Learning
Addressing the critical topic of fairness in machine learning algorithms and outcomes.
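
One simple statistic in this space, offered purely as an illustration (the course may emphasize other criteria, such as equalized odds or subgroup fairness), is the demographic-parity gap between groups:

    def demographic_parity_gap(predictions, groups):
        # Largest difference in positive-prediction rates across groups;
        # predictions are 0/1, groups are arbitrary hashable labels.
        stats = {}
        for y_hat, g in zip(predictions, groups):
            count, positives = stats.get(g, (0, 0))
            stats[g] = (count + 1, positives + y_hat)
        rates = [positives / count for count, positives in stats.values()]
        return max(rates) - min(rates)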

Module 8: Differential Privacy and Machine Learning
Examining the principles of differential privacy and its applications within machine learning.
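
The canonical starting point is the Laplace mechanism: adding Laplace noise with scale sensitivity/ε to a numeric query satisfies ε-differential privacy. A minimal Python sketch (the two-exponentials trick below is just one convenient way to draw Laplace noise):

    import random

    def laplace_mechanism(true_answer, sensitivity, epsilon):
        scale = sensitivity / epsilon
        # A Laplace(0, scale) draw as the difference of two i.i.d. exponentials.
        noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
        return true_answer + noise

    # Example: privately release a count, where any one person changes it by at most 1.
    print(laplace_mechanism(true_answer=42, sensitivity=1.0, epsilon=0.5))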

CIS 6250 offers a rigorous exploration of the theory of machine learning, aimed at doctoral students who want a deep understanding of the field’s theoretical foundations. Through lectures, discussions, and projects, students will work through the core principles and advanced topics shaping modern machine learning.
