# Probability and Computing: 2019-2020

Schedule C1 (CS&P) — Computer Science and Philosophy

Schedule C1 — Computer Science

Hilary Term 2020 (20 lectures)

## Overview

This course will be examined by a take-home examination.

In this course we study applications of probabilistic techniques to computer science, focusing on randomised algorithms and the probabilistic analysis of algorithms. Randomisation and probabilistic techniques have applications in areas ranging from combinatorial algorithms, data structures, graph and network algorithms, communication protocols, combinatorial optimisation and computational geometry to machine learning. Randomised algorithms, which typically guarantee a correct result only with high probability, are often simpler and faster than the corresponding deterministic algorithms. Randomisation can also be used to break symmetry and achieve load balancing in parallel and distributed computing. This course introduces students to the ideas in probability that are most relevant to computer science; this background theory is motivated and illustrated by a wide-ranging series of applications.
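As a taste of the "correct only with high probability" guarantee, the string-equality problem covered in the course can be tested by comparing random polynomial fingerprints instead of the strings themselves. A minimal sketch, assuming a fingerprint over a fixed large prime `P` with a random evaluation point (the prime and the function names here are illustrative choices, not part of the course material):

```python
import random

P = (1 << 61) - 1  # a large Mersenne prime; illustrative choice of modulus


def fingerprint(s: str, x: int) -> int:
    """Evaluate sum(ord(s[i]) * x**i) mod P via Horner's rule."""
    acc = 0
    for ch in reversed(s):
        acc = (acc * x + ord(ch)) % P
    return acc


def probably_equal(a: str, b: str) -> bool:
    """One-sided Monte Carlo equality test.

    Equal strings always return True.  Unequal strings of length at most n
    give matching fingerprints only if the random x is a root of the nonzero
    difference polynomial, which happens with probability at most (n - 1) / P.
    """
    x = random.randrange(P)
    return fingerprint(a, x) == fingerprint(b, x)
```

The point of the sketch is the trade: a vanishingly small error probability buys a comparison that only ever looks at one random evaluation, the same pattern used throughout the course.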

Lectures for this course will not be recorded in Hilary Term 2020.

## Learning outcomes

At the end of the course students are expected to:

- Understand basic concepts and tools in probability theory that are relevant to computing, including random variables, independence, linearity of expectations, tail bounds, Markov chains and martingales.
- Be able to use the above tools to devise and analyse randomised algorithms and to carry out the probabilistic analysis of deterministic algorithms.
- Understand some of the main paradigms in the design of randomised algorithms, including random sampling, random walks, Markov-chain simulation, algebraic techniques and amplification.

## Prerequisites

Exposure to probability theory, discrete mathematics, algorithms and linear algebra will be assumed. Specific concepts that will be assumed include discrete probability spaces, the principle of inclusion-exclusion, conditional probability, random variables, expectation, basic graph theory, the binomial theorem, finite fields, power series, sorting algorithms, asymptotic notation, vector spaces, and matrices. Graduate students who have not taken a course in algorithms will not be at a significant disadvantage. Diligent students lacking some of the required background but willing to acquaint themselves with the necessary material are encouraged to contact the lecturer to discuss their situation prior to registering for this course.

## Synopsis

The material will be drawn mostly from the course text "Probability and Computing" by Mitzenmacher and Upfal. Supplementary material is taken from the book "Counting, Sampling and Integrating: Algorithms and Complexity" by Mark Jerrum and from the book "Randomized Algorithms" by Motwani and Raghavan. It is expected that most of the following will be covered:

- Introduction to randomised algorithms: string equality and finding minimum-size cut-sets
- Linearity of expectations and its relation to algorithms for Max Cut and Max-3-Sat
- Derandomisation by the method of conditional expectations and pairwise independence
- Applications of linearity of expectations: The coupon collector's problem, randomised quicksort and Jensen's inequality
- Tail bounds: Markov's inequality, Chebyshev's inequality and Chernoff bounds
- Application of tail bounds to set balancing and load balancing via balls and bins
- Markov chain algorithms for 2SAT, 3SAT and random walks
- Monte Carlo algorithms: Counting and Approximate Counting
- Sampling algorithms and Markov Chain Monte Carlo
- Markov Chain mixing times, coupling and path coupling
- Martingales and stopping times
- The Azuma-Hoeffding inequality and its applications to chromatic number, pattern matching, and balls and bins
- The Lovász Local Lemma
- Online algorithms (the secretary problem and the prophet inequality)
- Interactive proofs
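To illustrate the linearity-of-expectations topic in the list above: placing every vertex on a uniformly random side of a cut separates each edge with probability 1/2, so the expected cut size is |E|/2, which is at least half the optimum. A hedged sketch, where the function name and the graph representation are my own choices:

```python
import random


def random_cut(vertices, edges):
    """Put each vertex on side 0 or 1 uniformly at random, count cut edges.

    Each edge (u, v) is cut with probability 1/2, so by linearity of
    expectations the expected cut size is len(edges) / 2 -- already a
    1/2-approximation to Max Cut, with no optimisation at all.
    """
    side = {v: random.randrange(2) for v in vertices}
    return sum(1 for u, v in edges if side[u] != side[v])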

## Syllabus

Tools and Techniques: random variables, independence, linearity of expectations, tail bounds, the probabilistic method, Markov chains and random walks, Monte Carlo algorithms, probability amplification and derandomisation.

Applications: Sorting and searching, graph algorithms, combinatorial optimisation, propositional satisfiability, load balancing (balls and bins), random sampling, counting algorithms.

## Reading list

**Required text:**

- Probability and Computing, by Michael Mitzenmacher and Eli Upfal, Cambridge University Press.

**Also useful**:

- Counting, Sampling and Integrating: Algorithms and Complexity, by Mark Jerrum, Birkhäuser.
- Randomized Algorithms, by Rajeev Motwani and Prabhakar Raghavan, Cambridge University Press.