Graph Representation Learning: 2024-2025
Degrees: Schedule C1 (CS&P) — Computer Science and Philosophy; Schedule C1 — Computer Science
Term: Michaelmas Term 2024 (20 lectures)
Overview
This is an advanced course on machine learning with graph-structured data, focusing on recent advances in graph representation learning. The goal is to provide systematic coverage of the fundamentals and foundations of graph representation learning. The course will introduce the definitions of the relevant machine learning models (e.g., graph neural networks), discuss their mathematical underpinnings, formally study their properties (e.g., relational inductive bias, expressive power), and demonstrate ways to effectively develop and train such models.
Learning outcomes
After studying this course, students will:
- Have knowledge of the different paradigms for performing graph machine learning.
- Understand the definition of a range of neural network models, along with their properties.
- Be able to derive and implement optimisation algorithms for these models.
- Understand the foundations of the Bayesian approach to graph machine learning.
- Understand how to choose a model to describe a particular type of data.
- Know how to evaluate a learned model in practice.
- Understand the mathematics necessary for constructing novel machine learning solutions.
- Be able to design and implement various graph machine learning algorithms for a range of real-world applications.
Prerequisites
Required background knowledge includes probability theory, linear algebra, continuous mathematics, multivariate calculus, and a basic understanding of graph theory and logic. Students are required to have already taken a machine learning course. Good programming skills are needed, and lecture examples and practicals will be given mainly in Python and PyTorch.
Synopsis
Introduction
- Lecture 1: Overview and applications of graph learning
Shallow embeddings
- Lecture 2: Node embeddings
- Lecture 3: Knowledge graph embeddings
- Lecture 4: Knowledge graph embedding models
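To give a flavour of the knowledge graph embedding models covered in Lectures 3–4, the sketch below implements the scoring function of TransE, which models a relation as a translation in embedding space and scores a triple (h, r, t) by the negative distance ||h + r − t||. The function name and the toy embeddings are illustrative choices, not trained values or course code:

```python
import math

def transe_score(h, r, t):
    """TransE plausibility score: negative L2 distance ||h + r - t||.
    A higher (less negative) score means a more plausible triple."""
    return -math.sqrt(sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t)))

# Toy 3-dimensional embeddings (hand-picked for illustration, not learned).
emb = {
    "Oxford": [1.0, 0.0, 0.0],
    "UK": [1.0, 1.0, 0.0],
    "locatedIn": [0.0, 1.0, 0.0],
}

# A plausible triple scores near 0; a corrupted triple scores lower.
true_score = transe_score(emb["Oxford"], emb["locatedIn"], emb["UK"])
false_score = transe_score(emb["UK"], emb["locatedIn"], emb["Oxford"])
```

In training, such scores are typically optimised with a margin-based loss that ranks true triples above corrupted ones.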
Fundamentals of graph neural networks
- Lecture 5: Message passing neural networks
- Lecture 6: A deep dive into message passing neural networks
- Lecture 7: Graph neural network architectures
- Lecture 8: Graph neural networks for knowledge graphs
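The message passing paradigm behind Lectures 5–8 can be sketched in plain Python. Each node aggregates its neighbours' feature vectors and combines the result with its own state; here the aggregation is a sum and the update simply adds the aggregate (an identity-weight simplification chosen for clarity — real layers apply learned transformations and nonlinearities):

```python
def message_passing_layer(adj, features):
    """One round of message passing with sum aggregation: each node's new
    feature is its own feature plus the sum of its neighbours' features.
    (Identity weights are used for illustration only.)"""
    new_features = {}
    for v, x in features.items():
        agg = [0.0] * len(x)                      # aggregated neighbour messages
        for u in adj[v]:
            for i, xi in enumerate(features[u]):
                agg[i] += xi
        new_features[v] = [xi + ai for xi, ai in zip(x, agg)]
    return new_features

# Triangle graph (nodes 0, 1, 2) with a pendant node 3 attached to node 2.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
feats = {v: [1.0] for v in adj}
h1 = message_passing_layer(adj, feats)
```

After one round, each node's value reflects its degree — the simplest example of how message passing propagates local structure.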
Foundations and limitations of graph neural networks
- Lecture 9: Information bottlenecks of graph neural networks
- Lecture 10: Expressive power of message passing neural networks
- Lecture 11: Logical expressiveness of message passing neural networks
- Lecture 12: Convergence properties of graph neural networks
- Lecture 13: Generalization in graph neural networks
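A central result for Lectures 10–11 is that the expressive power of message passing neural networks is bounded by the 1-dimensional Weisfeiler-Leman (1-WL) colour refinement test. A minimal sketch of 1-WL (function name and round count are illustrative), together with its classic failure case — it cannot distinguish two disjoint triangles from a 6-cycle:

```python
def wl_refine(adj, rounds=3):
    """1-WL colour refinement: repeatedly relabel each node by its current
    colour together with the multiset of its neighbours' colours."""
    colours = {v: 0 for v in adj}
    for _ in range(rounds):
        signatures = {
            v: (colours[v], tuple(sorted(colours[u] for u in adj[v])))
            for v in adj
        }
        # Compress signatures to small integers for the next round.
        palette = {s: i for i, s in enumerate(sorted(set(signatures.values())))}
        colours = {v: palette[signatures[v]] for v in adj}
    return colours

# Two disjoint triangles vs. a single 6-cycle: both are 2-regular,
# so 1-WL assigns every node the same colour in both graphs.
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4, 5], 4: [3, 5], 5: [3, 4]}
six_cycle = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
indistinguishable = (sorted(wl_refine(two_triangles).values())
                     == sorted(wl_refine(six_cycle).values()))
```

Any standard message passing network is therefore also unable to separate these two graphs — the motivation for the higher-order and substructure-based extensions in Lectures 14–15.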
Extensions of graph neural networks
- Lecture 14: Higher-order graph neural networks
- Lecture 15: Graph neural networks using substructures
- Lecture 16: Graph transformers
- Lecture 17: Graph foundation models
Generative graph representation learning
- Lecture 18: Generative graph learning
- Lecture 19: Approaches to generative graph learning
Guest lecture: TBD
Syllabus
Overview of graph representation learning; node embedding models and knowledge graphs; graph neural networks; node-level, edge-level, and graph-level tasks; expressive power of message passing neural networks; limitations and extensions of graph neural networks; generative graph learning models.
Reading list
William L. Hamilton (2020), Graph Representation Learning, Synthesis Lectures on AI and ML, Vol. 14, No. 3.
Related research
Taking our courses
This form is not to be used by students studying for a degree in the Department of Computer Science, or by Visiting Students who are registered for Computer Science courses.
Other matriculated University of Oxford students who are interested in taking this or other courses in the Department of Computer Science must complete this online form by 17.00 on Friday of 0th week of the term in which the course is taught. Late requests, and requests sent by email, will not be considered. All requests must be approved by the relevant Computer Science departmental committee and can only be submitted using this form.