Concurrent Programming: 2020-2021
Many challenges arise during the design and implementation of concurrent and distributed programs. The aim of this course is to understand those challenges, and to see techniques for tackling them. The course considers several paradigms for concurrent programming: message-passing concurrency; datatype-based concurrency; synchronous data-parallel concurrency; monitors; and semaphores.
Learning outcomes
At the end of the course students are expected to understand:
- The conceptual foundations of concurrent programming, and
- A variety of effective ways of structuring concurrent and distributed programs.
This course combines well with the Concurrency course: Concurrent Programming helps provide motivation for Concurrency, while Concurrency helps to provide formal underpinnings for this course.
Students should have a basic understanding of Object Oriented Programming (objects, classes, interfaces, inheritance, abstract classes, polymorphism), and be comfortable with programming in Scala. Although the methods of reasoning used in the course will be informal, students should be broadly familiar with the notions of pre- and post-condition, invariant, and abstraction function.
The course will have a number of practicals, to allow students to gain experience with concurrent programming. These practicals will use Scala.
MSc students may not take both this course and Concurrent Algorithms and Data Structures.
Syllabus
Introduction: reasons for concurrency; processes, threads; shared variables; race conditions; concurrent architectures; concurrent computing paradigms; safety and liveness; challenges of concurrent computing.
Message-passing concurrency; channels. Example: numerical integration using workers and a controller; data parallelism; bag of tasks; closing channels; testing against a sequential implementation.
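The worker/controller and bag-of-tasks ideas can be sketched as follows. This is not the course's own message-passing library: here `LinkedBlockingQueue`s stand in for channels, and a `None` message acts as a poison pill to close the bag. The function, bounds, and helper names are illustrative.

```scala
import java.util.concurrent.LinkedBlockingQueue

// Estimate the integral of f over [a,b] by the trapezium rule.
// The interval is split into tasks placed in a bag; workers repeatedly
// take a task, integrate that sub-interval, and send the partial result
// back to the controller, which sums them.
object TrapeziumBag {
  def integrate(f: Double => Double, a: Double, b: Double,
                nTasks: Int, nWorkers: Int): Double = {
    val toWorkers = new LinkedBlockingQueue[Option[(Double, Double)]] // the bag
    val toController = new LinkedBlockingQueue[Double]                // results

    // Sequential trapezium rule on one sub-interval.
    def trap(l: Double, r: Double, n: Int = 1000): Double = {
      val h = (r - l) / n
      h * (0 until n).map(i => (f(l + i * h) + f(l + (i + 1) * h)) / 2).sum
    }

    val workers = for (_ <- 0 until nWorkers) yield new Thread(() => {
      var done = false
      while (!done) toWorkers.take() match {
        case Some((l, r)) => toController.put(trap(l, r))
        case None         => done = true    // poison pill: bag is empty
      }
    })
    workers.foreach(_.start())

    // Controller: fill the bag, add one poison pill per worker,
    // then collect and sum the partial results.
    val width = (b - a) / nTasks
    for (i <- 0 until nTasks) toWorkers.put(Some((a + i * width, a + (i + 1) * width)))
    for (_ <- 0 until nWorkers) toWorkers.put(None)
    val result = (0 until nTasks).map(_ => toController.take()).sum
    workers.foreach(_.join())
    result
  }
}
```

Because tasks are drawn from a shared bag, faster workers automatically take on more work, which gives load balancing for free. The result can be tested against a purely sequential integration, as the lectures suggest.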
Alternation: syntax, semantics. Example: dining philosophers; deadlock.
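The deadlock in the dining philosophers, and one standard fix, can be shown in a few lines. This sketch models forks as locks rather than the channel-based processes used in lectures: if every philosopher picks up the left fork first, all five can hold one fork and wait forever; making one philosopher asymmetric breaks the cycle of waiting.

```scala
import java.util.concurrent.locks.ReentrantLock

// Dining philosophers with forks modelled as locks.
object Philosophers {
  val N = 5
  val forks = Array.fill(N)(new ReentrantLock)

  // Philosopher N-1 acquires its forks in the opposite order, so the
  // circular wait that causes deadlock cannot form.
  def dine(id: Int, rounds: Int): Unit = {
    val (first, second) =
      if (id == N - 1) (forks((id + 1) % N), forks(id))  // asymmetric philosopher
      else (forks(id), forks((id + 1) % N))
    for (_ <- 0 until rounds) {
      first.lock(); second.lock()
      // eat
      second.unlock(); first.unlock()
    }
  }

  def run(rounds: Int): Unit = {
    val ts = (0 until N).map(i => new Thread(() => dine(i, rounds)))
    ts.foreach(_.start()); ts.foreach(_.join())
  }
}
```

Other fixes covered by the same idea include limiting the number of seated philosophers, or a butler/footman process mediating access.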
Client-server architectures: concurrent datatypes; linearizability and testing of concurrent datatypes.
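A concurrent datatype in the client-server style can be sketched as a server thread that owns the state, with clients sending requests over a channel; since only the server touches the state, no locking is needed, and each request is trivially linearizable at the point the server handles it. The queue types here are stand-ins for the channels used in the course.

```scala
import java.util.concurrent.{LinkedBlockingQueue, SynchronousQueue}

// A counter as a server process: clients send requests on a shared
// channel, each carrying a private reply channel.
object CounterServer {
  sealed trait Req
  case class Inc(reply: SynchronousQueue[Int]) extends Req
  case class Get(reply: SynchronousQueue[Int]) extends Req
  case object Stop extends Req

  val requests = new LinkedBlockingQueue[Req]

  val server = new Thread(() => {
    var count = 0; var done = false
    while (!done) requests.take() match {
      case Inc(r) => count += 1; r.put(count)  // the linearization point
      case Get(r) => r.put(count)
      case Stop   => done = true
    }
  })

  def inc(): Int = { val r = new SynchronousQueue[Int]; requests.put(Inc(r)); r.take() }
  def get(): Int = { val r = new SynchronousQueue[Int]; requests.put(Get(r)); r.take() }
}
```

Testing such a datatype for linearizability amounts to checking that each concurrent history could have been produced by some sequential interleaving of the operations against a sequential specification.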
Interacting peers: patterns of interaction: centralised, fully-connected, ring and tree topologies.
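As one illustration of a peer topology, a ring of symmetric-ish peers can compute a global sum in two laps of a token: the first lap accumulates each node's value, the second broadcasts the total. Queues again stand in for the channels between neighbours; the two-lap protocol is a standard example, not the only one from lectures.

```scala
import java.util.concurrent.LinkedBlockingQueue

// Ring of n peers: node i sends to node (i+1) % n. Node 0 initiates a
// token carrying a running sum; when it returns, node 0 knows the total
// and sends it round again so every node learns it.
object RingSum {
  def run(values: Array[Int]): Array[Int] = {
    val n = values.length
    val links = Array.fill(n)(new LinkedBlockingQueue[Int])
    val results = new Array[Int](n)
    val nodes = (0 until n).map { i =>
      new Thread(() => {
        val next = links((i + 1) % n)
        if (i == 0) {
          next.put(values(0))               // lap 1: start the token
          val total = links(0).take()       // token returns with the sum
          results(0) = total
          next.put(total)                   // lap 2: broadcast the total
          links(0).take()                   // absorb the returning token
        } else {
          next.put(links(i).take() + values(i))  // add own value, pass on
          val total = links(i).take()       // lap 2: learn the total
          results(i) = total
          next.put(total)
        }
      })
    }
    nodes.foreach(_.start()); nodes.foreach(_.join())
    results
  }
}
```

A centralised (star) topology would do the same job in fewer messages per node but concentrates load on the centre; a tree reduces the number of rounds to logarithmic.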
Synchronous data-parallel computation: barrier synchronisation.
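The shape of a barrier-synchronised, data-parallel phase can be sketched with `java.util.concurrent.CyclicBarrier` rather than a hand-built barrier: all workers write in phase one, synchronise, then all read in phase two, relying on the barrier to make the phase-one writes visible.

```scala
import java.util.concurrent.CyclicBarrier

// n workers each produce a value, meet at a barrier, then all read the
// combined total. The barrier action (run once, by the last arrival)
// computes the sum between the two phases.
object BarrierSum {
  def run(n: Int): Array[Int] = {
    val values = new Array[Int](n)
    val total = new Array[Int](1)
    val barrier = new CyclicBarrier(n, () => total(0) = values.sum)
    val results = new Array[Int](n)
    val workers = (0 until n).map(i => new Thread(() => {
      values(i) = i + 1    // phase 1: each worker writes its own slot
      barrier.await()      // all phase-1 writes are visible after this
      results(i) = total(0) // phase 2: every worker reads the total
    }))
    workers.foreach(_.start()); workers.foreach(_.join())
    results
  }
}
```

The same two-phase read/write discipline underlies iterative data-parallel algorithms such as Jacobi-style grid computations, where each iteration ends with a barrier.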
Monitors: syntax and semantics; examples; conditions.
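A standard monitor example is the bounded buffer. Scala objects carry JVM intrinsic locks, so `synchronized`, `wait`, and `notifyAll` give monitor semantics directly; each `wait` sits in a `while` loop re-checking its condition, since a woken thread holds no guarantee the condition still holds.

```scala
// A bounded buffer as a monitor: at most `capacity` items at a time.
class BoundedBuffer[A](capacity: Int) {
  private val buf = scala.collection.mutable.Queue[A]()

  def put(x: A): Unit = synchronized {
    while (buf.size == capacity) wait()  // condition: wait until not full
    buf.enqueue(x)
    notifyAll()                          // wake any waiting consumers
  }

  def take(): A = synchronized {
    while (buf.isEmpty) wait()           // condition: wait until not empty
    val x = buf.dequeue()
    notifyAll()                          // wake any waiting producers
    x
  }
}
```

With JVM monitors there is a single implicit condition queue per object, which is why `notifyAll` is used; explicit condition variables (as in `java.util.concurrent.locks`) allow producers and consumers to be signalled separately.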
Patterns of concurrent computation: recursive parallel, bag of tasks with replacement, pipelines, competition parallel, task parallel, map/reduce, revision of other patterns.
Semaphores: syntax and semantics; using semaphores for mutual exclusion and signalling; examples; passing the baton.
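Both semaphore idioms appear in this small sketch, which uses `java.util.concurrent.Semaphore`: `mutex`, initialised to 1, gives mutual exclusion on a shared counter, while `done`, initialised to 0, is used purely for signalling, letting the main thread wait until every worker has finished.

```scala
import java.util.concurrent.Semaphore

object SemaphoreDemo {
  def run(nWorkers: Int, incsEach: Int): Int = {
    val mutex = new Semaphore(1)  // mutual exclusion: one permit
    val done = new Semaphore(0)   // signalling: starts with no permits
    var count = 0
    for (_ <- 0 until nWorkers) new Thread(() => {
      for (_ <- 0 until incsEach) {
        mutex.acquire()           // enter critical section
        count += 1
        mutex.release()           // leave critical section
      }
      done.release()              // signal: this worker has finished
    }).start()
    done.acquire(nWorkers)        // wait for nWorkers completion signals
    count
  }
}
```

Without `mutex`, the unsynchronised `count += 1` would be a race condition and increments could be lost; the acquire/release pairing also provides the memory visibility that makes the final read of `count` safe.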
Synopsis
Reasons for concurrency; processes, threads; safety and liveness. Message-passing concurrency; deadlock. Concurrent datatypes. Clients and servers. Interacting peers. Synchronous parallel computation. Patterns of concurrent programming: data parallel; bag of tasks; recursive parallel; task parallel. Low-level concurrency controls: barrier synchronisation; monitors; conditions; semaphores. Testing.
Course textbook: Foundations of Multithreaded, Parallel, and Distributed Programming, Gregory R. Andrews, Addison-Wesley, 2000.
Alternative: Principles of Concurrent and Distributed Programming, M. Ben-Ari, Prentice Hall, 1990.
Background reading on Scala: Programming in Scala, Martin Odersky, Lex Spoon, and Bill Venners, Artima, 2008.