Computational Learning Theory: 2017-2018
Overview

Machine learning studies automatic methods for identifying patterns in complex data and for making accurate predictions based on past observations. In this course, we develop rigorous mathematical foundations of machine learning, in order to provide guarantees about the behaviour of learning algorithms and also to understand the inherent difficulty of learning problems.
The course will begin by providing a statistical and computational toolkit: generalisation guarantees, fundamental algorithms, and methods for analysing learning algorithms. We will address questions such as: when can we generalise well from limited amounts of data? How can we develop computationally efficient learning algorithms? What are the statistical and computational trade-offs in learning? We will also discuss newer models designed to address relevant practical questions of the day, such as learning with limited memory, communication, privacy, and limited labelled (as opposed to unlabelled) data. In addition to core concepts from machine learning, we will make connections to principal ideas from information theory, game theory and optimisation.
This is an advanced course requiring a high level of mathematical maturity. It is expected that students will have taken prior courses on machine learning, algorithms, and complexity.
NB. This course was titled "Advanced Machine Learning" in 2016/17 and the previous course materials can be found at... Students may also refer to the previous Computational Learning Theory course taught in Michaelmas term 2014 for preparation, but it should be noted that this course will be different in scope and content from that course.
These are not hard prerequisites. However, if you have not taken a majority of the following courses at Oxford (or their equivalents elsewhere), you will find the course challenging.
- Algorithms and Data Structures
- Machine Learning
- At least one of Probability and Computing, Computational Complexity, Foundations of Computer Science
If you have any doubt about your background, you are encouraged to discuss it with the instructor before the beginning of term, or as soon thereafter as possible. Extra lectures will cover some basic computer science background for mathematics students, as well as basic inequalities from probability that we will encounter frequently.
- (Background) Probability
- (Background) Computational Complexity
- PAC learning, sample complexity, computational complexity
- Proper vs improper learning
- Growth function, VC dimension, sample complexity lower bounds
- Learning with membership queries, Angluin's algorithm
- Boosting, weak learning, Adaboost, Margin Bounds
- Learning in the presence of noise, SQ Learning, Agnostic Learning
- Learning Parities with noise, Learning Juntas
- Complexity-theoretic and cryptographic hardness of learning
- Online learning, perceptron and winnow, attribute efficient learning
- Learning from expert advice, multi-armed bandit problems
- (Time permitting) Learning discrete distributions
- (Time permitting) Privacy preserving learning, adaptive data analysis
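To give a flavour of the algorithms covered above, here is a minimal sketch of the perceptron algorithm from the online learning topic. The dataset and function names are invented for illustration; this is not course material, just an indication of the level at which algorithms will be analysed (e.g. the classical mistake bound of (R/γ)² for linearly separable data with margin γ and example norm at most R).

```python
import numpy as np

def perceptron(X, y, max_epochs=100):
    """Learn a linear separator w for labels y in {-1, +1}.

    On linearly separable data with margin gamma and ||x|| <= R, the
    perceptron makes at most (R / gamma)**2 mistakes in total.
    """
    w = np.zeros(X.shape[1])
    for _ in range(max_epochs):
        mistakes = 0
        for x_i, y_i in zip(X, y):
            if y_i * np.dot(w, x_i) <= 0:  # mistake: prediction disagrees with label
                w += y_i * x_i             # perceptron update rule
                mistakes += 1
        if mistakes == 0:                  # converged: all examples classified correctly
            break
    return w

# Tiny separable example: the label is the sign of the first coordinate.
X = np.array([[2.0, 1.0], [1.0, -1.0], [-2.0, 1.0], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
w = perceptron(X, y)
assert all(np.sign(X @ w) == y)  # the learned w separates the data
```

The mistake-driven update is the point of interest in the course: the analysis bounds the total number of updates independently of the number of passes over the data.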
- Michael Kearns and Umesh Vazirani. An Introduction to Computational Learning Theory. MIT Press, 1994.
- Shai Shalev-Shwartz and Shai Ben-David. Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press, 2014.