Graph Representation Learning: 2022-2023
This is an advanced course on machine learning with relational data, focusing on recent advances in graph representation learning. The goal is to provide systematic coverage of the fundamentals and foundations of the field. The course will introduce the relevant machine learning models (e.g., graph neural networks), discuss their mathematical underpinnings, formally study their properties (e.g., relational inductive bias, expressive power), and demonstrate how to effectively develop and train such models.
After studying this course, students will:
- Have knowledge of the different paradigms for performing graph machine learning.
- Understand the definition of a range of neural network models, along with their properties.
- Be able to derive and implement optimisation algorithms for these models.
- Understand the foundations of the Bayesian approach to graph machine learning.
- Understand how to choose a model to describe a particular type of data.
- Know how to evaluate a learned model in practice.
- Understand the mathematics necessary for constructing novel machine learning solutions.
- Be able to design and implement graph machine learning algorithms for a range of real-world applications.
Required background knowledge includes probability theory, linear algebra, continuous mathematics, multivariate calculus, and a basic understanding of graph theory and logic. Students are required to have already taken a machine learning course. Good programming skills are needed, and lecture examples and practicals will be given mainly in Python and PyTorch.
Introduction, motivation, and applications of graph representation learning
- Lecture 1: Overview of graph representation learning
- Lecture 2: Applications of graph representation learning
Shallow node embedding models
- Lecture 3: Node embeddings
- Lecture 4: Knowledge graph embeddings
- Lecture 5: Knowledge graph embedding models
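To give a flavour of the shallow embedding models covered in Lectures 3-5, the following is a minimal sketch of a TransE-style knowledge graph embedding scorer in PyTorch. The embedding sizes and the triples being scored are hypothetical placeholders, not material from the lectures.

```python
import torch

torch.manual_seed(0)

# Hypothetical toy setting: 5 entities, 2 relation types, 8-dim embeddings.
num_entities, num_relations, dim = 5, 2, 8
entity_emb = torch.nn.Embedding(num_entities, dim)
relation_emb = torch.nn.Embedding(num_relations, dim)

def transe_score(head, relation, tail):
    """TransE scores a triple (h, r, t) by the negative distance between
    h + r and t; higher (less negative) scores mean more plausible triples."""
    h = entity_emb(head)
    r = relation_emb(relation)
    t = entity_emb(tail)
    return -torch.norm(h + r - t, p=2, dim=-1)

# Score a batch of two (head, relation, tail) triples.
heads = torch.tensor([0, 1])
rels = torch.tensor([0, 1])
tails = torch.tensor([2, 3])
scores = transe_score(heads, rels, tails)
```

In practice such models are trained with negative sampling, pushing scores of observed triples above those of corrupted ones.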
Fundamentals of graph neural networks
- Lecture 6: Message passing neural networks
- Lecture 7: A deep dive into message passing neural networks
- Lecture 8: Graph neural network architectures
- Lecture 9: Graph neural networks and knowledge graphs
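As a taste of the message passing framework from Lectures 6-9, here is a minimal sketch of one message passing layer with mean aggregation, written in plain PyTorch without a graph library. The toy graph and dimensions are hypothetical illustrations.

```python
import torch

torch.manual_seed(0)

class MeanMessagePassing(torch.nn.Module):
    """One message passing layer: aggregate neighbour features by mean,
    then update each node from its own and aggregated features."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = torch.nn.Linear(2 * in_dim, out_dim)

    def forward(self, x, adj):
        # adj is a dense (n, n) adjacency matrix; divide by node degree
        # to average neighbour features (the "aggregate" step).
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neighbour_mean = (adj @ x) / deg
        # Concatenate self and neighbour features (the "update" step).
        return torch.relu(self.linear(torch.cat([x, neighbour_mean], dim=1)))

# Hypothetical toy graph: 3 nodes on a path (edges 0-1 and 1-2).
adj = torch.tensor([[0., 1., 0.],
                    [1., 0., 1.],
                    [0., 1., 0.]])
x = torch.randn(3, 4)          # 4-dimensional node features
layer = MeanMessagePassing(4, 6)
out = layer(x, adj)            # new 6-dimensional node representations
```

Stacking k such layers lets information propagate k hops across the graph, which is exactly where the over-smoothing and over-squashing phenomena of the next section arise.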
Foundations, limitations, and extensions of graph neural networks
- Lecture 10: Information bottlenecks: over-smoothing
- Lecture 11: Information bottlenecks: over-squashing
- Lecture 12: Expressive power of message passing neural networks
- Lecture 13: Higher-order graph neural networks
- Lecture 14: Message passing neural networks with node identifiers
Generative graph representation learning
- Lecture 15: Generative graph learning
- Lecture 16: Variational approaches to generative graph learning
- Lecture 17: Autoregressive approaches to generative graph learning
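To illustrate the autoregressive idea from Lecture 17 in its simplest form: a graph can be generated one node at a time, sampling each new node's edges to the nodes generated so far. In the sketch below the fixed edge probability is a hypothetical placeholder; in a learned model (e.g., GraphRNN-style approaches) it would be predicted from the partial graph.

```python
import torch

torch.manual_seed(0)

def generate_graph(num_nodes, edge_prob=0.5):
    """Autoregressively build an undirected adjacency matrix, adding one
    node per step and sampling its edges to all previous nodes."""
    adj = torch.zeros(num_nodes, num_nodes)
    for i in range(1, num_nodes):
        # Placeholder: a trained model would condition edge_prob on the
        # partial graph adj[:i, :i] rather than use a constant.
        edges = torch.bernoulli(torch.full((i,), edge_prob))
        adj[i, :i] = edges
        adj[:i, i] = edges  # mirror to keep the graph undirected
    return adj

adj = generate_graph(5)
```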
Guest lecture (Peter Battaglia): Modeling physical structure and dynamics using graph-based machine learning
Overview of graph representation learning; node embedding models and knowledge graphs; graph neural networks; node-level, edge-level, and graph-level tasks; expressive power of message passing neural networks; limitations and extensions of graph neural networks; generative graph learning models.
William L. Hamilton (2020). Graph Representation Learning. Synthesis Lectures on Artificial Intelligence and Machine Learning, Vol. 14, No. 3.
Students are formally asked for feedback at the end of the course. Students can also submit feedback at any point here. Feedback received here will go to the Head of Academic Administration, and will be dealt with confidentially when being passed on further. All feedback is welcome.