
Student Projects

Undergraduate students in the third and fourth years of the Final Honour School of Computer Science, and students for the MSc in Computer Science are required to undertake a project. Mathematics & Computer Science undergraduates are required to undertake a Computer Science project or a Mathematics dissertation in their fourth year. Computer Science & Philosophy undergraduates may choose to undertake a Computer Science project or a Philosophy thesis in their fourth year.

This is your chance to work on your own project, something that interests and inspires you. We have put together a brief guidance document on how to do it; please click on the link below. On this site, you will also find a sortable and searchable list of projects and potential supervisors. You are welcome to choose one of the projects on the list, or approach a potential supervisor and negotiate your own topic.

Important deadlines:

Monday of Week 7, Hilary Term: online project registration form due.
MSc students, Monday of Week 1, Trinity Term: project proposal due.
3rd/4th-year students, 12pm on Monday of Week 4, Trinity Term: project submission deadline.
MSc students, Tuesday of Week -5 (minus 5), Michaelmas Term: project submission deadline.

Project writing handbook

Sample projects

3rd year

4th year

List of projects

Each entry below gives the project title, supervisor(s), research theme, suitability codes (B, C, MSc), and a description.
Aggregation of Photovoltaic Panels Alessandro Abate Automated Verification C MSc

The increased relevance of renewable energy sources has modified the behaviour of the electrical grid. Some renewable energy sources affect the network in a distributed manner: whilst each unit has little influence, a large population can have a significant impact on the global network, particularly in the case of synchronised behaviour. This work investigates the behaviour of a large, heterogeneous population of photovoltaic panels connected to the grid. We employ Markov models to represent the aggregated behaviour of the population, while the rest of the network (and its associated consumption) is modelled as a single equivalent generator, accounting for both inertia and frequency regulation. Analysis and simulations of the model show that it is a realistic abstraction, and quantitatively indicate that heterogeneity is necessary to enable the overall network to function in safe conditions and to avoid load shedding. This project will provide extensions of this recent research. In collaboration with an industrial partner.

Prerequisites: Computer-Aided Formal Verification, Probabilistic Model Checking

Analysis and verification of stochastic hybrid systems Alessandro Abate Automated Verification C MSc

Stochastic Hybrid Systems (SHS) are dynamical models that are employed to characterize the probabilistic evolution of systems with interleaved and interacting continuous and discrete components.

Formal analysis, verification, and optimal control of SHS models are important goals, owing both to their theoretical generality and to their applicability to a wealth of studies in the sciences and in engineering.

In many practical instances, the presence of a discrete number of continuously operating modes (e.g., in fault-tolerant industrial systems), the effect of uncertainty (e.g., in safety-critical air-traffic systems), or both (e.g., in models of biological entities) calls for a mathematical framework, such as SHS, that is structurally suited to modelling such heterogeneous systems.

In this project, we plan to investigate and develop new analysis and verification techniques (e.g., based on abstractions) that are directly applicable to general SHS models, while being computationally scalable.

Courses: Computer-Aided Formal Verification, Probabilistic Model Checking, Probability and Computing, Automata Logic and Games

Prerequisites: Familiarity with stochastic processes and formal verification

Automated verification of complex systems in the energy sector Alessandro Abate Automated Verification C MSc

Smart microgrids are small-scale versions of centralized electricity systems, which locally generate, distribute, and regulate the flow of electricity to consumers. Among other advantages, microgrids have been shown to improve the reliability of distribution networks.

These systems are heterogeneous and complex, owing to 1. local and volatile renewables generation, and 2. the presence of nonlinear dynamics over both continuous and discrete variables. These factors call for the development of proper quantitative models. Such a framework provides the opportunity to employ formal methods to verify properties of the microgrid. The goal of the project is in particular to focus on energy production via renewables, such as photovoltaic panels.

The project can benefit from a paid visit/internship to industrial partners.

Courses: Computer-Aided Formal Verification, Probabilistic Model Checking, Probability and Computing, Automata Logic and Games

Prerequisites: Familiarity with stochastic processes and formal verification, whereas no specific knowledge of smart grids is needed.

Development of software for the verification of MPL models Alessandro Abate Automated Verification C MSc

This project aims to enhance the software toolbox VeriSiMPL (''very simple''), which has been developed to enable the abstraction of Max-Plus-Linear (MPL) models. MPL models are specified in MATLAB, and abstracted to Labeled Transition Systems (LTS). The LTS abstraction is formally related to its MPL counterpart via a (bi)simulation relation. The abstraction procedure runs in MATLAB and leverages sparse representations, fast manipulations based on vector calculus, and optimized data structures such as Difference-Bound Matrices. An LTS can be pictorially represented via the Graphviz tool and exported to the PROMELA language. This enables the verification of MPL models against temporal specifications within the SPIN model checker.
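
MPL dynamics are linear in the max-plus algebra, where "addition" is max and "multiplication" is +. A minimal Python sketch of one MPL step (independent of VeriSiMPL, which runs in MATLAB; the matrix below is invented for illustration):

```python
def maxplus_matvec(A, x):
    """Max-plus product: (A (x) x)_i = max_j (A[i][j] + x[j])."""
    return [max(a + xj for a, xj in zip(row, x)) for row in A]

# Hypothetical 2-dimensional MPL model x(k+1) = A (x) x(k),
# e.g. firing times of two machines in a production cycle.
A = [[3.0, 7.0],
     [2.0, 4.0]]
x = [0.0, 0.0]
for _ in range(3):
    x = maxplus_matvec(A, x)
# x now holds the event times after three cycles: [16.0, 13.0]
```

The abstraction in VeriSiMPL partitions the state space of such dynamics into regions that become the states of the LTS.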

Courses: Computer-Aided Formal Verification, Numerical Solution of Differential Equations

Prerequisites: Some familiarity with dynamical systems, working knowledge of MATLAB and C

Formal verification of a software tool for physical and digital components Alessandro Abate, Daniel Kroening Automated Verification MSc

We are interested in working with existing commercial simulation software that is targeted at the modelling and analysis of physical, multi-domain systems. It further encompasses integration with related software tools, as well as interfacing with devices and the generation of code. We are interested in enriching the software with formal verification features, envisioning the extension of the tool towards capabilities that might enable the user to raise formal assertions or guarantees on model properties, or to synthesise correct-by-design architectures. Within this long-term plan, this project shall target the formal generation of fault warnings, namely messages to the user that are related to "bad (dynamical) behaviours" or to unwanted "modelling errors". The student will be engaged in developing algorithmic solutions towards this goal, while reframing them within a formal and general approach. The project is inter-disciplinary in dealing with hybrid models involving digital and physical quantities, and in connecting the use of formal verification techniques from computer science with more classical analytical tools from control engineering.

Courses: Computer-Aided Formal Verification, Software Verification. Prerequisites: Knowledge of basic formal verification.

Innovative Sensing and Actuation for Smart Buildings Alessandro Abate Automated Verification C MSc

Sensorisation and actuation in smart buildings, and the development of smart HVAC (heating, ventilation and air-conditioning) control strategies for energy management, allow for optimised energy usage, leading to reduced power consumption or to optimised demand/response strategies that are key in a rather volatile market. This can further lead to optimised maintenance for the building devices. Of course, the sensorisation of buildings places heavy requirements on the overall infrastructure: we are interested in devising new approaches towards the concept of using "humans as sensors". Further, we plan to investigate approaches to perform meta-sensing, namely to extrapolate the knowledge from physical sensors towards that of virtual elements (as an example, to infer the current building occupancy from correlated measurements of temperature and humidity dynamics). On the actuation side, we are likewise interested in engineering non-invasive minimalistic solutions, which are robust to uncertainty and performance-certified. The plan for this project is to make the first steps in this direction, based on recent results in the literature. The project can benefit from a visit to Honeywell Labs (Prague).

Courses: Computer-Aided Formal Verification. Prerequisites: Some familiarity with dynamical systems.
Model learning and verification Alessandro Abate, Daniel Kroening Automated Verification C MSc

This project will explore connections of techniques from machine learning with successful approaches from formal verification. The project has two sides: a theoretical one, and a more practical one; it will be up to the student to emphasise either of the two sides depending on his/her background and/or interests. The theoretical part will develop existing research, for instance in one of the following two inter-disciplinary domain pairs: learning & repair, or reachability analysis & Bayesian inference. On the other hand, a more practical project will apply the above theoretical connections to a simple model setup in the area of robotics and autonomy.

Courses: Computer-Aided Formal Verification, Probabilistic Model Checking, Machine Learning

Precise simulations and analysis of aggregated probabilistic models Alessandro Abate Automated Verification C MSc

This project shall investigate a rich research line, recently pursued by a few members of the Department, on the development of quantitative abstractions of Markovian models. Abstractions come in the form of lumped, aggregated models, which are beneficial in being easier to simulate or to analyse. Key to the novelty of this work, the proposed abstractions are quantitative, in that precise error bounds with respect to the original model can be established. As such, whatever can be shown over the abstract model carries over formally to the original one.

This project, grounded on existing literature, will pursue (depending on the student's interests) extensions of this recent work, or its implementation as a software tool.
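
The core idea of aggregating a Markov chain over a state partition, and measuring how far the chain is from being exactly lumpable, can be sketched as follows (the chain and partition are invented for illustration, and this is not the error metric of the papers themselves):

```python
# Hypothetical 4-state Markov chain and a 2-block partition of its states.
P = [[0.2, 0.3, 0.5, 0.0],
     [0.3, 0.2, 0.0, 0.5],
     [0.1, 0.1, 0.4, 0.4],
     [0.1, 0.1, 0.5, 0.3]]
partition = [[0, 1], [2, 3]]

def lump(P, partition):
    """Aggregate P over the partition; return (P_hat, err), where err is the
    largest deviation of any concrete state's block-to-block jump probability
    from the aggregated value (err == 0 means exact lumpability)."""
    k = len(partition)
    # r[i][J] = probability of jumping from concrete state i into block J
    r = [[sum(P[i][j] for j in J) for J in partition] for i in range(len(P))]
    P_hat = [[sum(r[i][J] for i in partition[I]) / len(partition[I])
              for J in range(k)] for I in range(k)]
    err = max(abs(r[i][J] - P_hat[I][J])
              for I in range(k) for i in partition[I] for J in range(k))
    return P_hat, err

P_hat, err = lump(P, partition)  # here err is (numerically) zero: exactly lumpable
```

For this chain the aggregated model is [[0.5, 0.5], [0.2, 0.8]] with zero error; perturbing any entry of P makes err positive, which is the kind of quantity the precise error bounds control.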

Courses: Computer-Aided Formal Verification, Probabilistic Model Checking, Machine Learning

Safe Reinforcement Learning Alessandro Abate Automated Verification C MSc

Reinforcement Learning (RL) is a known architecture for synthesising policies for Markov Decision Processes (MDPs). We work on extending this paradigm to the synthesis of 'safe policies', or, more generally, policies such that a linear-time property is satisfied. We convert the property into an automaton, then construct a product MDP between the automaton and the original MDP. A reward function is then assigned to the states of the product automaton, according to the accepting conditions of the automaton. With this reward function, RL synthesises a policy that satisfies the property: as such, the policy synthesis procedure is 'constrained' by the given specification. Additionally, we show that the RL procedure sets up an online value iteration method to calculate the maximum probability of satisfying the given property, at any given state of the MDP. We evaluate the performance of the algorithm on numerous numerical examples. This project will provide extensions of these novel and recent results.
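
The product construction can be illustrated in miniature, with plain value iteration standing in for the RL procedure (the MDP, labelling, and numbers below are all invented; the property is "eventually g", whose automaton has a single accepting sink):

```python
# Hypothetical MDP: states 0-3, state 2 labelled 'g', state 3 a bad sink.
P = {
    (0, 'a'): {1: 0.8, 3: 0.2},
    (0, 'b'): {2: 0.5, 3: 0.5},
    (1, 'a'): {2: 1.0},
    (1, 'b'): {0: 1.0},
    (2, 'a'): {2: 1.0},
    (3, 'a'): {3: 1.0},
}
label = {2: 'g'}

# Automaton for "eventually g": q0 --g--> q1, with q1 accepting and absorbing.
def delta(q, lab):
    return 'q1' if q == 'q1' or lab == 'g' else 'q0'

states = sorted({s for s, _ in P})
prod = [(s, q) for s in states for q in ('q0', 'q1')]
V = {sq: 0.0 for sq in prod}
for _ in range(200):  # value iteration on the product MDP
    V = {(s, q): 1.0 if q == 'q1' else
         max(sum(p * V[(t, delta(q, label.get(t)))] for t, p in succ.items())
             for (s2, a), succ in P.items() if s2 == s)
         for (s, q) in prod}
# V[(0, 'q0')] is the maximal probability of satisfying "eventually g"
# from state 0: here 0.8, achieved by playing 'a' and then 'a' again.
```

Accepting product states receive value (reward) 1, and the iteration propagates the maximal satisfaction probability backwards, exactly the quantity the RL procedure estimates online.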

Prerequisites: Computer-Aided Formal Verification, Probabilistic Model Checking, Machine Learning

Software development for abstractions of stochastic hybrid systems Alessandro Abate Automated Verification C MSc

Stochastic hybrid systems (SHS) are dynamical models for the interaction of continuous and discrete states. The probabilistic evolution of the continuous and discrete parts of the system is coupled, which makes analysis and verification of such systems challenging. Among specifications of SHS, probabilistic invariance and reach-avoid have received quite some attention recently. Numerical methods have been developed to compute these two specifications. These methods are mainly based on state-space partitioning and abstraction of the SHS by Markov chains, which are optimal in the sense of minimising the abstraction error for a given number of Markov states.
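
The partitioning idea can be sketched for a one-dimensional purely stochastic system x(k+1) = x(k) + w with w ~ N(0, sigma^2), abstracted over the safe set [0, 1] into a finite Markov chain with an absorbing "unsafe" sink collecting the escaping mass (all parameters below are invented; each cell is represented by its centre, the standard simplification):

```python
import math

def Phi(z):  # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def abstract(n, sigma, lo=0.0, hi=1.0):
    """Abstract x(k+1) = x(k) + N(0, sigma^2) on [lo, hi] into an
    (n+1)-state chain: n grid cells plus an absorbing unsafe sink."""
    w = (hi - lo) / n
    centres = [lo + (i + 0.5) * w for i in range(n)]
    P = []
    for c in centres:
        row = [Phi((lo + (j + 1) * w - c) / sigma) - Phi((lo + j * w - c) / sigma)
               for j in range(n)]
        row.append(1.0 - sum(row))   # mass escaping [lo, hi] goes to the sink
        P.append(row)
    P.append([0.0] * n + [1.0])      # the sink is absorbing
    return P

def invariance(P, n, steps):
    """Probability of remaining in the safe cells for `steps` steps,
    reported for the middle cell as initial condition."""
    V = [1.0] * n + [0.0]
    for _ in range(steps):
        V = [sum(p * v for p, v in zip(row, V)) for row in P]
    return V[n // 2]

P = abstract(n=10, sigma=0.2)
p_inv = invariance(P, 10, steps=3)  # probability of staying in [0, 1] for 3 steps
```

The probabilistic invariance value computed on the abstraction approximates the true one, and the grid resolution n controls the abstraction error.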

The goal of the project is to combine the code that has been developed for these methods into a single tool. The student should also design a clean user interface (for the choice of dynamical equations, parameters, methods, etc.).

Courses: Probabilistic Model Checking, Probability and Computing, Numerical Solution of Differential Equations

Prerequisites: Some familiarity with stochastic processes, working knowledge of MATLAB and C

Contextuality and quantum advantage Samson Abramsky Foundations, Structures, and Quantum C MSc The aim of the project is to explore the use of quantum resources to gain advantage over classical computational models in a range of tasks. The tasks which may be considered include: bounded-depth circuits, non-local games, measurement-based computation, communication complexity, and query complexity. High-level methods for describing quantum solutions for such problems will be studied, based on recent work on a structural theory of contextuality and its application to quantum advantage developed by Abramsky and co-workers. Both qualitative aspects (existence of perfect strategies or solutions) and quantitative aspects (contextuality monotones and resource inequalities) will be studied. References: Abramsky, Barbosa and Mansfield, Contextual fraction as a measure of contextuality, PRL 2017; Abramsky, Barbosa, de Silva and Zapata, The quantum monad, MFCS 2017.
Pebble games, monads and comonads in classical, probabilistic and quantum computation Samson Abramsky Foundations, Structures, and Quantum C MSc Pebble games are an important and widely used tool in logic, algorithms and complexity, constraint satisfaction and database theory. The idea is that we can explore a pair of structures, e.g. graphs, by placing up to k pebbles on them, so we have a window of size at most k on the two structures. If we can always keep these pebbles in sync so that the two k-sized windows look the same (are isomorphic) then we say that Duplicator has a winning strategy for the k-pebble game. This gives a resource-bounded notion of approximation to graphs and other structures which has a wide range of applications. Monads and comonads are widely used in functional programming, e.g. in Haskell, and come originally from category theory. It turns out that pebble games, and similar notions of approximate or local views on data, can be captured elegantly by comonads, and this gives a powerful language for many central notions in constraints, databases and descriptive complexity. For example, k-consistency can be captured in these terms; another important example is treewidth, a key parameter which is very widely used to give “islands of tractability” in otherwise hard problems. Finally, monads can be used to give various notions of approximate or non-classical solutions to computational problems. These include probabilistic and quantum solutions. For example, there are quantum versions of constraint systems and games which admit quantum solutions when there are no classical solutions, thus demonstrating a “quantum advantage”. Monads and comonads can be used in combination, making use of certain “distributive laws”. The aim of this project is to explore a number of aspects of these ideas. 
Depending on the interests and background of the student, different aspects may be emphasised, ranging over functional programming, category theory, logic, algorithms and descriptive complexity, and probabilistic and quantum computation. Some specific directions include: 1. Developing Haskell code for the k-pebbling comonad and various non-classical monads, and using this to give a computational tool-box for various constructions in finite model theory and probabilistic or quantum computation. 2. Developing the category-theoretic ideas involved in combining monads and comonads, and studying some examples. 3. Using the language of comonads to capture other important combinatorial invariants such as tree-depth. 4. Developing the connections between category theory, finite model theory and descriptive complexity. Some background reading: 1. Leonid Libkin, Elements of finite model theory (background on pebble games and the connection with logic and complexity). 2. S. Abramsky, A. Dawar and P. Wang, The pebbling comonad in finite model theory (technical report describing the basic ideas, which can serve as a starting point).
Sheaf theoretic semantics for vector space models of natural language Samson Abramsky Foundations, Structures, and Quantum B C MSc

Contextuality is a fundamental feature of quantum physical theories, and one that distinguishes them from classical mechanics. In a recent paper by Abramsky and Brandenburger, the categorical notion of sheaves has been used to formalize contextuality. This has resulted in generalizing and extending contextuality to other theories which share some structural properties with quantum mechanics. A consequence of this type of modeling is a succinct logical axiomatization of properties such as non-local correlations, and of classical no-go theorems such as those of Bell and Kochen-Specker. Like quantum mechanics, natural language has contextual features; these have been the subject of much study in distributional models of meaning, originated in the work of Firth and later advanced by Schütze. These models are based on vector spaces over the semiring of positive reals with an inner product operation. The vectors represent meanings of words, based on the contexts in which they often appear, and the inner product measures degrees of word synonymy. Despite their success in modeling word meaning, vector spaces suffer from two major shortcomings: firstly, they do not immediately scale up to sentences; secondly, they cannot, at least not in an intuitive way, provide semantics for logical words such as 'and', 'or', 'not'. Recent work in our group has developed a compositional distributional model of meaning in natural language, which lifts vector space meaning to phrases and sentences. This has already led to some very promising experimental results. However, this approach does not deal so well with the logical words.

The goal of this project is to use sheaf theoretic models to provide both a contextual and logical semantics for natural language.  We believe that sheaves provide a generalization of the logical Montague semantics of natural language which did very well in modeling logical connectives, but did not account for contextuality. The project will also aim to combine these ideas with those of the distributional approach, leading to an approach which combines the advantages of Montague-style and vector-space semantics.



The interested student should have taken the category theory and computational linguistics courses, or be familiar with the contents of these.

Genomic analysis using machine learning and large scale data management techniques Michael Benedikt Data and Knowledge MSc

We will investigate novel risk analysis -- likelihood
of a patient having some medical condition -- using statistical analysis of
a variety of genomics data sources. This will make use of some new infrastructure for
data management -- a query language for nested data -- along with the use
of the SPARK framework, coupled with some basic statistics
and machine learning algorithms. No background in genomics or statistics
is necessary, but the project
does require knowledge of the basics of data management (e.g. undergrad database
course or some experience with SQL) and good programming skills.

Interpolation Michael Benedikt Data and Knowledge C MSc

Let F1 and F2 be sentences (in first-order logic, say) such that F1 entails F2: that is, any model of F1 is also a model of F2. An interpolant is a sentence G such that F1 entails G, and G entails F2, but G only uses relations and functions that appear in *both* F1 and F2.

The goal in this project is to explore and implement procedures for constructing interpolants, particularly for certain decidable fragments of first-order logic. It turns out that finding interpolants like this has applications in some database query rewriting problems.
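
For intuition, a propositional analogue can be checked by brute force (the formulas here are toy examples): with F1 = p AND q and F2 = p OR r, the shared vocabulary is {p}, and G = p is an interpolant.

```python
from itertools import product

VARS = ['p', 'q', 'r']

def assignments():
    # all truth assignments to the variables
    for bits in product([False, True], repeat=len(VARS)):
        yield dict(zip(VARS, bits))

def entails(f, g):
    """f |= g iff every assignment satisfying f also satisfies g."""
    return all(g(a) for a in assignments() if f(a))

F1 = lambda a: a['p'] and a['q']
F2 = lambda a: a['p'] or a['r']
G  = lambda a: a['p']   # mentions only the shared variable p

assert entails(F1, G) and entails(G, F2)   # G is an interpolant for F1 |= F2
```

The project works in (fragments of) first-order logic rather than propositional logic, where interpolants must be constructed from proofs rather than found by enumeration, but the defining conditions are exactly the ones checked above.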

Prerequisites: Logic and Proof (or equivalent)

Optimization of Web Query Plans Michael Benedikt Data and Knowledge MSc

This project will look at how to find the best plan for a query, given a collection of data sources with access restrictions.

We will look at logic-based methods for analyzing query plans, taking into account integrity constraints that may exist on the data.

Optimized reasoning with guarded logics Michael Benedikt Data and Knowledge C MSc Inference in first-order logic is undecidable, but a number of logics have appeared in the last decade that achieve decidability. Many of them are guarded logics, which include the guarded fragment of first-order logic. Although the decidability has been known for some decades, no serious implementation has emerged. Recently we have developed new algorithms for deciding some guarded logics, based on resolution, which are more promising from the perspective of implementation. The project will pursue this both in theory and experimentally. Prerequisites: A knowledge of first-order logic, e.g. from the Foundations of CS or Knowledge Representation and Reasoning courses, would be important.
High-throughput Molecular Algorithms Luca Cardelli Automated Verification MSc Recent technological developments allow massive parallel reading (sequencing) and writing (synthesis) of heterogeneous pools of DNA strands. We are no longer limited to circuits built out of a small number of different strands, nor to reading out a few bits of output by fluorescence. While these emerging capabilities are somewhat stochastic and error-prone, new classes of algorithms should feasibly take advantage of them. We plan to start by investigating new algorithms to record the order in which a series of molecular events occurs, e.g., the appearance of certain chemicals over time in a dynamic chemical soup. A large network of DNA gates acting in parallel records the relationships between the events; the result can be read out by massive DNA sequencing. We plan to study feasible DNA structures to implement this and similar algorithms, simulate them, and investigate their correctness and robustness. Prerequisites: background in distributed algorithms, verification, and/or synthetic biology.
Noise in Molecular Logic Gates Luca Cardelli Automated Verification MSc

A Boolean variable B that ranges over {true,false} can be represented by two molecular species {B_true,B_false}. Boolean gates like AND can then be described by chemical reactions over those species. These reactions can in turn be implemented by DNA molecules and physically executed. Networks of such Boolean gates can function as controllers for molecular-scale devices, including devices we may want to insert into living organisms. We want to investigate, by mathematical analysis, modelchecking, and simulation, the noise behavior of these logical gates, due to both noisy inputs and to the intrinsic molecular fluctuations generated by chemical reactions. How we can we compute reliably in such a regime, and how can we design logic gates that are resistant to noise? Prerequisites: Background in verification and/or simulation
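
A minimal stochastic-simulation sketch of the dual-rail idea (the reaction set and rates are invented for illustration): the AND output is produced on the true rail by A_t + B_t -> C_t, while any firing involving a false-rail molecule produces C_f, so noisy inputs induce a distribution over outputs.

```python
import random

# Dual-rail species counts for inputs A, B and output C.
# Each reaction: (mass-action propensity, (consumed, consumed, produced)).
def reactions(s):
    return [
        (s['At'] * s['Bt'], ('At', 'Bt', 'Ct')),  # A_t + B_t -> C_t
        (s['At'] * s['Bf'], ('At', 'Bf', 'Cf')),  # A_t + B_f -> C_f
        (s['Af'] * s['Bt'], ('Af', 'Bt', 'Cf')),  # A_f + B_t -> C_f
        (s['Af'] * s['Bf'], ('Af', 'Bf', 'Cf')),  # A_f + B_f -> C_f
    ]

def gillespie(state, rng):
    # fire reactions in proportion to their propensities until none can fire
    while True:
        rxns = reactions(state)
        total = sum(p for p, _ in rxns)
        if total == 0:
            return state
        r = rng.random() * total
        for p, (x, y, out) in rxns:
            if r < p:
                state[x] -= 1; state[y] -= 1; state[out] += 1
                break
            r -= p

rng = random.Random(0)
wins = 0
for _ in range(100):  # noisy "true AND true": a few false-rail molecules present
    s = gillespie({'At': 20, 'Af': 2, 'Bt': 20, 'Bf': 2, 'Ct': 0, 'Cf': 0}, rng)
    wins += s['Ct'] > s['Cf']
```

At this noise level a majority-vote readout is always correct (C_t is at least 18 while C_f is at most 4); raising the false-rail counts makes the race between the reactions genuinely decide the output, which is the regime the project asks about.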

Merging all the biomedical image processing power of ITK with Matlab Ramón Casero Cañas, Vicente Grau Computational Biology and Health Informatics MSc The Insight Toolkit (ITK), a C++ library, is the most advanced software base for biomedical image processing. Our group has produced a basic working prototype of a software interface to run ITK image filters from Matlab [1], the biomedical engineer's platform of choice. We are looking for a student who, as a minimum, integrates ~155 existing image filters into this interface to make them available for biomedical research. This is more or less straightforward by hand. The challenge for an exceptional student would be to make a parser to automatically link ITK filters to Matlab. [1]
Information processing and conservation laws in general probabilistic theories Giulio Chiribella Foundations, Structures, and Quantum B C MSc Project descriptions, and their suitability for different students, are given in the appendix. Giulio Chiribella can supervise student projects in quantum information theory and the foundations of quantum mechanics.
A generic evaluation of a categorical compositional-distributional model of meaning Bob Coecke Foundations, Structures, and Quantum MSc

The categorical compositional distributional model of meaning is an emerging field of Computational Linguistics and natural language processing. It has been pioneered in the Quantum and Computational Linguistics groups of the department. The general theoretical underpinnings of the model are based on compact closed categories, inspired by the categorical quantum models of Abramsky and Coecke. In order to apply the model to mainstream tasks, one has to instantiate it on concrete linguistic models, and in particular on distributional (vector space) models of meaning. So far, these instantiations have been done on datasets with small sentences containing few language units such as verbs, adjectives, and relative pronouns, all separately. In this project, we aim to connect these single experiments and perform a unified task. This is a term-classification task, the goal of which is to successfully classify a number of dictionary terms to their definitions (or vice versa). Specifically, the project involves the following:

  1. Build a concrete compositional-distributional model based on the categorical framework of Coecke, Sadrzadeh, Clark (2010), putting together all advances that have been taking place on this topic recently.
  2. Use the model to compose vectors for dictionary definitions, and then measure the cosine distance between them and the term vectors as a guidance for classification.
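
The classification step can be sketched with toy co-occurrence vectors, using additive composition as a stand-in for the full tensor-based composition of the categorical model (all vectors below are invented):

```python
import math

def cosine(u, v):
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return dot / (nu * nv) if nu and nv else 0.0

# toy distributional vectors over three invented context dimensions
vec = {
    'dog':         [5, 1, 0],
    'bank':        [0, 1, 5],
    'domestic':    [3, 1, 0],
    'animal':      [4, 2, 0],
    'financial':   [0, 1, 4],
    'institution': [0, 2, 5],
}
definitions = {
    'dog':  ['domestic', 'animal'],
    'bank': ['financial', 'institution'],
}

def compose(words):
    # additive composition: sum the word vectors of the definition
    return [sum(c) for c in zip(*(vec[w] for w in words))]

def classify(term):
    # assign each term to its most cosine-similar composed definition
    return max(definitions, key=lambda d: cosine(vec[term], compose(definitions[d])))

assert classify('dog') == 'dog' and classify('bank') == 'bank'
```

In the project, the composed definition vectors would instead be built via the categorical framework, and classification accuracy measures the quality of the composition.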

We seek candidates who know Computational Linguistics and Linear Algebra, as well as programming. A knowledge of category theory is encouraged, but not required. The project will be co-supervised by Dimitri Kartsaklis, who has performed the previous term-classification tasks, Mehrnoosh Sadrzadeh and Bob Coecke. The outcome may be considered for publication in a peer-reviewed conference or journal.


  • Coecke, B., Sadrzadeh, M., and Clark, S. (2010). Mathematical Foundations for Distributed Compositional Model of Meaning. Lambek Festschrift. Linguistic Analysis, 36:345–384.
  • Grefenstette, E. and Sadrzadeh, M. (2011). Experimental Support for a Categorical Compositional Distributional Model of Meaning. In Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing (EMNLP).
  • Kartsaklis, D., Sadrzadeh, M., and Pulman, S. (2012). A Unified Sentence Space for Categorical Distributional-Compositional semantics: Theory and experiments. In Proceedings of 24th International Conference on Computational Linguistics (COLING 2012)
  • Sadrzadeh, M., Clark, S., and Coecke, B. (2013). The Frobenius anatomy of word meanings I: Subject and object relative pronouns. Journal of Logic and Computation, Advance Access.
  • Kartsaklis, D. and Sadrzadeh, M. (2013). Prior Disambiguation of Word Tensors for Constructing Sentence Vectors. In Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing (EMNLP).
Categorical quantum mechanics and Lie Algebras Bob Coecke, Amar Hadzihasanovic Foundations, Structures, and Quantum MSc

Is there a way to define a category whose objects are root systems (with additional structure), so that the construction of Lie algebras from them is functorial? If so, what are its properties - e.g., is it monoidal, and in that case, is the functor (lax) monoidal?

Prerequisites: categorical quantum mechanics and Lie algebras

Developing conceptual space semantics for compositional distributional models of meaning Bob Coecke, Martha Lewis Foundations, Structures, and Quantum MSc

The compositional distributional model of meaning (Coecke et al., 2010) models the meaning of sentences using a category such as pregroups to model the grammar of the sentences and another category with the same compact closed structure to model the meanings of the individual words. In practice, word meanings are often modelled using finite-dimensional vector spaces, built via word co-occurrence in text corpora. The conceptual spaces approach to semantics (Gärdenfors, 2014) views word meanings as inhabiting regions of a conceptual space, with carefully chosen dimensions. This representation has some similarities but also key differences with the compositional distributional model of meaning. The project would examine a few key conceptual spaces such as 'colour', 'taste', or 'shape', for example, and investigate how these can be incorporated within the compositional distributional model of meaning. This might be undertaken by giving a category-theoretic formalisation of these spaces, and identifying a suitable grammatical structure to implement composition, or alternatively, by analysing the behaviour of nouns, adjectives, and verbs within conceptual spaces, and fitting these to the pregroup grammar.

Prerequisites: category theory

Coecke, B., M. Sadrzadeh, and S. Clark. "Mathematical Foundations for Distributed Compositional Model of Meaning. Lambek Festschrift." Linguistic Analysis 36 (2010): 345-384.
Gärdenfors, Peter. The geometry of meaning: Semantics based on conceptual spaces. MIT Press, 2014.

Diagrammatic methods in computer security Bob Coecke, Stefano Gogioso Foundations, Structures, and Quantum MSc

Throughout the last decade, categorical and diagrammatic methods have found enormous success in the field of quantum information theory, and are currently being extended to a variety of other disciplines. As part of this project, we will explore their application to classical cryptography and computer security. A number of distinct approaches are available, and the project can be tailored to each student's specific interests in the field.

Prerequisites: One of Quantum Computer Science or Categorical Quantum Mechanics required. Computer Security (or equivalent experience) desirable, but not required.

Effective Categorical Reasoning Bob Coecke, Dan Marsden Foundations, Structures, and Quantum MSc

Category theory is an important tool in theoretical computer science. In introductory courses, proofs are generally conducted by pasting commuting diagrams together, or by simple equational reasoning. There are further proof styles in category theory, for example using string diagrams, “Yoneda style” proofs and the use of internal logics. The aim of this project would be to investigate and contrast these different approaches, showing how, and when, they can be used effectively on realistic problems. A concrete starting point would be to understand the techniques involved, and then apply them to non-trivial proofs from the literature in order to demonstrate their relative benefits. An ideal outcome would be an example-based account for computer scientists of how to reason efficiently in category theory.

Prerequisites: A good understanding of elementary category theory and comfort with mathematical proofs is essential. Some experience with string diagrams, as for example in the quantum computer science courses in the department would also be helpful.

Extending pregroup grammars for inflected languages Bob Coecke, Martha Lewis Foundations, Structures, and Quantum MSc

Within the compositional distributional model of meaning developed by (Coecke et al., 2010), the model of grammar used is based on pregroup grammars, which have the feature that word order is important. In inflected languages, word order may become less important, and therefore pregroups may not be the best model of grammar. The project would involve extending the pregroup grammar to allow for different grammatical cases and to allow for more flexibility in word order. The model developed might simplify the approach taken in (Casadio and Lambek, 2005), which still uses the fundamentally ordered structure of pregroups as a basis for the grammar.

Prerequisites: either category theory or quantum computer science would do

Casadio, Claudia, and Jim Lambek. "A computational algebraic approach to Latin grammar." Research on Language and Computation 3.1 (2005): 45-60.
Coecke, B., M. Sadrzadeh, and S. Clark. "Mathematical Foundations for a Compositional Distributional Model of Meaning. Lambek Festschrift." Linguistic Analysis 36 (2010): 345-384.

Quantum Computing and Quantum Information, Logic, Category Theory, Fundamental Physics Bob Coecke Foundations, Structures, and Quantum B C MSc

Bob Coecke is willing to supervise projects in the following areas. Please feel free to contact him.

- Development and applications of the categorical quantum mechanics formalism and corresponding graphical languages

- Development of the quantomatic automated graphical reasoning software

- Theory development and empirical experiments for meaning in natural language processing

Software development: quantomatic, automated graphical reasoning tools Bob Coecke Foundations, Structures, and Quantum B C MSc Recent graph-based formalisms for computation provide an abstract and symbolic way to represent and simulate quantum information processing. Manual manipulation of such graphs is slow and error prone. This project employs a formalism, based on monoidal categories, that supports mechanised reasoning with open-graphs. This gives a compositional account of graph rewriting that preserves the underlying categorical semantics.
Syntax and semantics for noun-noun combinations Bob Coecke, Martha Lewis Foundations, Structures, and Quantum MSc

In English, differences in the meaning of a compound noun phrase may be signalled by the intonation used. A 'TOY shop' is a shop that sells toys, whereas a 'toy SHOP' is a toy that is a shop. In the latter case the word 'toy' is used as an adjective and is more straightforwardly compositional, but in the former case the meaning of the compound is idiomatic (Kamp and Partee, 1995). The project would model these facets of meaning within the compositional distributional model of meaning (Coecke et al. 2010), and test the model thus developed within a semantic vector space based on text corpora. The project might use methods from (Coecke and Lewis, 2015), which models overextension in noun combinations, and (Kartsaklis and Sadrzadeh, 2015), which models differences in intonation.

Prerequisites: Linear Algebra, either category theory or quantum computer science would do, Python (or similar language suitable for text processing)

Coecke, Bob, and Martha Lewis. "A Compositional Explanation of the Pet Fish Phenomenon." arXiv preprint arXiv:1509.06594 (2015).
Coecke, B., M. Sadrzadeh, and S. Clark. "Mathematical Foundations for a Compositional Distributional Model of Meaning. Lambek Festschrift." Linguistic Analysis 36 (2010): 345-384.
Kamp, Hans, and Barbara Partee. "Prototype theory and compositionality." Cognition 57.2 (1995): 129-191.
Kartsaklis, Dimitri, and Mehrnoosh Sadrzadeh. "A Frobenius Model of Information Structure in Categorical Compositional Distributional Semantics." arXiv preprint arXiv:1505.06294 (2015).

Tool Support for Graphical Categorical Reasoning Bob Coecke, Dan Marsden Foundations, Structures, and Quantum MSc

Recent work has demonstrated how a suitable form of string diagrams can provide a powerful tool for reasoning in elementary category theory. Currently these diagrams are laid out and manipulated manually, as good layout choices are important for aiding the intuitions of readers. The project would involve investigating appropriate tool support for working with these diagrams, and ideally to provide a prototype implementation to demonstrate some of the ideas involved. The emphasis will be on convenient layout of practically useful diagrams that convey the correct geometric intuitions, rather than proof automation as provided by the Quantomatic tool for similar graphical calculi.

Prerequisites: A good knowledge of elementary category theory is essential. Some experience with string diagrams, as for example in the quantum computer science courses in the department would be helpful. Good programming skills in at least one language will be needed for prototyping.

General game playing using inductive logic programming Andrew Cropper Artificial Intelligence and Machine Learning B C MSc In the general game playing competition, an agent is given the rules of a game (described as a logic program) and then starts playing the game (i.e. the agent generates traces of behaviour). This project will invert the task so that an agent is given traces of behaviour and then has to learn a set of rules that could produce the behaviour. The project will use techniques from inductive logic programming, a form of machine learning which learns computer programs from input/output examples. This work is mainly implementation and experimentation. Prerequisites: familiarity with logic programming (Prolog)
Lifelong program induction Andrew Cropper Artificial Intelligence and Machine Learning C MSc The goal of program induction is to learn computer programs from input/output examples of a target program. This project will develop lifelong program induction techniques [1]. In this approach, rather than learn a single program in isolation, the goal is to learn many programs over time, allowing for solutions to be reused between tasks. A particular aim is to develop techniques to handle large amounts of background knowledge. This work is a mix of theory, implementation, and experimentation. [1] D. Lin, E. Dechter, K. Ellis, J.B. Tenenbaum, and S.H. Muggleton. Bias reformulation for one-shot function induction. In Proceedings of the 23rd European Conference on Artificial Intelligence (ECAI 2014), pages 525-530, Amsterdam, 2014. IOS Press. Prerequisites: familiarity with statistical machine learning and ideally logic programming
Logical reduction of higher-order Horn clauses Andrew Cropper Artificial Intelligence and Machine Learning C MSc The goal of this project is to study the logical redundancy of sets of higher-order Horn clauses. In particular, we want to know whether certain infinite sets of higher-order Horn clauses can be reduced to minimal finite sets. This work has implications for the field of program induction, where higher-order Horn clauses are used as a form of inductive bias. This work is purely theoretical and builds on two existing papers [1,2]. [1] Cropper A., Tourret S. (2018) Derivation Reduction of Metarules in Meta-interpretive Learning. In: Riguzzi F., Bellodi E., Zese R. (eds) Inductive Logic Programming. ILP 2018. Lecture Notes in Computer Science, vol 11105. Springer, Cham. [2] Cropper A., Muggleton S.H. (2015) Logical Minimisation of Meta-Rules Within Meta-Interpretive Learning. In: Davis J., Ramon J. (eds) Inductive Logic Programming. Lecture Notes in Computer Science, vol 9046. Springer, Cham. Prerequisites: strong knowledge of mathematical logic and computational logic
Machine learning efficient time-complexity programs Andrew Cropper Artificial Intelligence and Machine Learning C MSc Metaopt [1,2] is an inductive logic programming (ILP) system that learns efficient logic programs from examples. For instance, given input/output examples of unsorted/sorted lists, Metaopt learns quicksort (rather than bubblesort). However, Metaopt does not identify the complexity class of learned programs. The goals of this project are to (1) develop methods to identify the complexity class of a program during learning, and (2) see whether the complexity information can improve the proof search. This work is a mix of theory, implementation, and experimentation. [1] A. Cropper and S.H. Muggleton. Learning efficient logic programs. Machine Learning (2018). [2] A. Cropper and S.H. Muggleton. Learning efficient logical robot strategies involving composable objects. In Proceedings of the 24th International Joint Conference on Artificial Intelligence (IJCAI 2015), pages 3423-3429. IJCAI, 2015. Prerequisites: familiarity with general machine learning (and preferably computational learning theory) and ideally logic programming
Machine learning functional programs Andrew Cropper Artificial Intelligence and Machine Learning B C MSc Meta-interpretive learning (MIL) [1,2] is a state-of-the-art program induction system that learns computer programs from input/output examples of a target program. MIL is based on inductive logic programming, and learns logic programs (Prolog and ASP programs). The goal of this project is to see whether the MIL techniques can be adapted or translated into a functional paradigm so as to learn functional programs. This work is a mix of theory, implementation, and experimentation, but most of the work will be developing the system. [1] A. Cropper and S.H. Muggleton. Learning higher-order logic programs through abstraction and invention. In Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI 2016), pages 1418-1424. IJCAI, 2016. [2] S.H. Muggleton, D. Lin, and A. Tamaddoni-Nezhad. Meta-interpretive learning of higher-order dyadic datalog: Predicate invention revisited. Machine Learning, 100(1):49-73, 2015. Prerequisites: comfortable with functional programming and preferably comfortable with logic programming
Relevancy of background knowledge in inductive logic programming Andrew Cropper Artificial Intelligence and Machine Learning B C MSc Inductive logic programming (ILP) is a form of machine learning which learns computer programs from input/output examples of a target program. To improve learning efficiency, ILP systems use background knowledge (i.e. auxiliary functions such as partition and append). However, most ILP systems cannot handle large amounts of background knowledge, and overcoming this limitation is a key challenge in ILP. The goal of this project is to explore techniques to identify relevant background knowledge. There is much freedom with this project, where one could focus on logical aspects, such as finding logically redundant background knowledge, or one could instead focus on statistical aspects, such as finding background knowledge most likely to be relevant for a given task. This work is a mix of theory, implementation, and experiments. Prerequisites: familiarity with statistics, statistical machine learning, and ideally logic programming
Graph Neural Networks for Knowledge Graphs Bernardo Cuenca Grau, Egor Kostylev Artificial Intelligence and Machine Learning, Data and Knowledge C MSc Description: Graph Neural Networks (GNNs) are a class of neural network architectures that has recently become popular for a wide range of applications dealing with graph-structured data. GNNs can be seen as a generalisation of both Convolutional Neural Networks and Recurrent Neural Networks. This project will explore GNN-based techniques for solving various tasks relevant to Knowledge Graphs (datastores where factual information is stored as a graph with nodes representing objects and edges representing relationships between such objects). Examples of relevant tasks include  (a.k.a. link prediction) and entity resolution (determining whether two nodes in the graph represent the same entity).
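The core GNN idea mentioned above can be sketched in a few lines. The following is a minimal, illustrative message-passing layer (not any particular published architecture): each entity updates its features from the mean of its neighbours', and a candidate link is scored by a dot product of the resulting embeddings. The graph and weights are toy values.

```python
import numpy as np

def gnn_layer(H, A, W_self, W_neigh):
    """One message-passing step: combine each node's features with the
    mean of its neighbours' features, then apply a ReLU."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    neigh = (A @ H) / deg                      # mean-aggregated neighbour features
    return np.maximum(0.0, H @ W_self + neigh @ W_neigh)

# Toy knowledge graph with 4 entities; A is its (symmetric) adjacency matrix.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
H = rng.random((4, 3))                         # initial entity features
W_self, W_neigh = rng.random((3, 3)), rng.random((3, 3))

H1 = gnn_layer(H, A, W_self, W_neigh)
# A simple link-prediction score for a candidate edge (0, 3):
score = H1[0] @ H1[3]
```

Stacking several such layers lets information propagate over longer paths in the graph; real systems learn the weights by gradient descent on observed edges.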
Information Disclosure in Data Integration Bernardo Cuenca Grau, Michael Benedikt, Egor Kostylev Artificial Intelligence and Machine Learning, Data and Knowledge C MSc Data integration systems allow users to effectively access data sitting in multiple datasources (typically relational databases) by means of queries over a global schema. In practice, datasources often contain sensitive information that the data owners want to keep inaccessible to users. In a recent research paper, the project supervisors have formalized and studied the problem of determining whether a given data integration system discloses sensitive information to an attacker. The paper studies the computational properties of the relevant problems and also identifies situations in which practical implementations are feasible. The goal of the project is to design and implement practical algorithms for checking whether information disclosure can occur in a data integration setting. These algorithms would be applicable to the aforementioned situations for which practical implementations seem feasible. Prerequisites: Familiarity with Databases. The students would also benefit from taking the Knowledge Representation and Reasoning Course and/or Theory of Data and Knowledge Bases.
Analysis of Schelling segregation models. Edith Elkind Artificial Intelligence and Machine Learning MSc In Schelling's model of strategic segregation, agents are placed on a highly regular graph (such as a line or a grid), and each agent belongs to one of k types. Agents have a preference towards being surrounded by agents who belong to their own type, and may change locations if they are not happy at their current location (by moving to an empty location or swapping with another discontent agent). Many variants of this basic model have been considered over the years. The goal of this project is to investigate, theoretically and empirically, the degree of diversity of stable outcomes in Schelling's model, as well as to explore novel variants of the model where agents' preferences may evolve over time.
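The basic dynamics described above fit in a short script. This is a minimal sketch of one common variant (grid topology, jump-to-empty-cell moves); the grid size, threshold, and step count are arbitrary toy choices, not part of any specific paper's setup.

```python
import random

def neighbours(i, j, w):
    """Moore neighbourhood of cell (i, j) on a w x w grid."""
    return [(i + di, j + dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
            if (di, dj) != (0, 0) and 0 <= i + di < w and 0 <= j + dj < w]

def happy(grid, i, j, threshold):
    """An agent is happy if at least `threshold` of its occupied
    neighbouring cells hold agents of its own type."""
    me = grid[i][j]
    nbrs = [grid[x][y] for x, y in neighbours(i, j, len(grid))
            if grid[x][y] is not None]
    return not nbrs or sum(n == me for n in nbrs) / len(nbrs) >= threshold

def schelling(w=15, types=2, empty=0.1, threshold=0.5, steps=5000, seed=0):
    """Pick a random agent each step; if discontent, move it to a random
    empty cell. Returns the final grid (None marks an empty cell)."""
    rng = random.Random(seed)
    pool = [None] * int(w * w * empty)
    pool += [t % types for t in range(w * w - len(pool))]
    rng.shuffle(pool)
    grid = [pool[i * w:(i + 1) * w] for i in range(w)]
    for _ in range(steps):
        i, j = rng.randrange(w), rng.randrange(w)
        if grid[i][j] is None or happy(grid, i, j, threshold):
            continue
        empties = [(x, y) for x in range(w) for y in range(w)
                   if grid[x][y] is None]
        x, y = rng.choice(empties)
        grid[x][y], grid[i][j] = grid[i][j], None
    return grid

grid = schelling()
```

Measuring the mean same-type neighbour share of the final grid, and comparing it across thresholds and move rules (jump vs. swap), is exactly the kind of empirical study the project envisages.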
Stable roommates problem with structured preferences Edith Elkind Artificial Intelligence and Machine Learning MSc In the stable roommates problem, there are k rooms of varying sizes and n agents, who need to be allocated places in these rooms; it is sometimes assumed that the total number of places is exactly n. Agents may have preferences over rooms as well as potential roommates, and may move between rooms so as to improve their assignment. The goal of the project is to understand the complexity of finding stable outcomes in such settings, assuming that agents' preferences over assignments have a simple structure (e.g., each agent wants to maximize the number of friends in their room), and to compare the best social welfare that can be achieved by a centralized assignment with that achievable in a stable assignment.
Form Corpus & Benchmark Tim Furche Data and Knowledge B C MSc (Supervisor C Schallhart) Static web pages are a thing of the past, since interactive web application interfaces have reshaped the online world. With all their feature richness, they enrich our personal online experience and provide some great new challenges for research. In particular, forms have become much more complex in assisting the user during filling, e.g., with completion options, or by structuring the form-filling process by dynamically enabling or hiding form elements. Such forms are a very interesting research topic, but their complexity has so far prevented the establishment of a corpus of modern forms with which to benchmark different tools that deal with forms automatically. This MSc project will fill this gap by building a corpus of such forms: based on a number of production sites from one or two domains, we will build our corpus of web interfaces, connected to a (shared) database. Not only will future evaluations in the DIADEM project rely on this corpus, but we will also publish the corpus, promoting it as a common benchmark for the research community working on forms. Knowledge of Java, HTML, CSS, JavaScript, and web application development is required.
Form Filling & Probing Tim Furche Data and Knowledge B C MSc (Joint with C Schallhart) Unearthing the knowledge hidden in queryable web sites requires a good understanding of the forms involved. As part of DIADEM, we are developing OPAL (Ontology based web Pattern Analysis with Logic), a tool to recognize forms belonging to a parameterizable application domain, such as the real estate or used car market. OPAL determines the meaning of individual form elements, e.g., it identifies the field for the minimum or maximum price or for some location. This MSc project will build upon OPAL to deal not only with static forms but also with sequences of interrelated forms, as in the case of a rough initial form followed by a refinement form, or of forms showing some options only after other parts have been filled. Over the course of this MSc project, we will develop a tool which invokes OPAL to analyze a given form, explores all available submission mechanisms on the form, analyzes the resulting pages for forms continuing the initial query, and combines the outcome of all found forms into a single interaction description. Knowledge of Java, HTML, and CSS is required; prior experience in logic programming would be a strong plus.
Constrained Reinforcement Learning via Self-Supervision Yarin Gal Artificial Intelligence and Machine Learning MSc

Despite the recent advances and success of reinforcement learning methods, trial-and-error-based approaches have been limited to toy settings (e.g. video/board games). In safety-critical domains (e.g. self-driving vehicles, domestic robots), autonomous systems are trained against a simulator, since mistakes in the real world cannot be tolerated. Nonetheless, simulators are hardly capable of capturing the full complexity of the problem, and hence the trained machines fail when exposed to novel settings [1]. Consequently, we would like those systems to be able to learn online, in the real world, without acting catastrophically, respecting pre-defined constraints. Self-supervised (a.k.a. unsupervised) reinforcement learning studies sequential decision-making without external reward functions, driven only by intrinsically-motivated utilities. The experiments will be based on the recently proposed OpenAI safety-gym [2], and a suite of unsupervised reinforcement learning methods [3, 4, 5] will be implemented and benchmarked in this framework.


Requirements: constrained optimisation, experience with deep learning frameworks (e.g. TensorFlow), reinforcement learning.

[1] Generalizing from a few environments in safety-critical reinforcement learning. Zachary Kenton, Angelos Filos, Owain Evans, Yarin Gal.

[2] Benchmarking Safe Exploration in Deep Reinforcement Learning. Alex Ray, Joshua Achiam, Dario Amodei.

[3] Unsupervised meta-learning for reinforcement learning. Abhishek Gupta, Benjamin Eysenbach, Chelsea Finn, Sergey Levine.

[4] Reinforcement Learning with Unsupervised Auxiliary Tasks. Max Jaderberg, Volodymyr Mnih, Wojciech Marian Czarnecki, Tom Schaul, Joel Z Leibo, David Silver, Koray Kavukcuoglu.

[5] Diversity is All You Need: Learning Skills without a Reward Function. Benjamin Eysenbach, Abhishek Gupta, Julian Ibarz, Sergey Levine.


Autonomous Driving in Old University Towns Yarin Gal Artificial Intelligence and Machine Learning MSc Autonomous driving simulators are paramount to the development of effective policies to drive autonomous cars. For example, the popular open-source CARLA simulator, implemented using Unreal Engine, has driven the community forward by providing open digital assets simulating American towns of various characteristics. However, we would expect autonomous car agents trained in an American town simulator to fail when deployed in new scenarios, for example when deployed in old university towns. This is because many visual scenes are very different from what the agent saw at training time, thus the agent would not know how to react. In this project we will try to validate this hypothesis by extending the current assets implemented in the simulator to include Old University Towns, and assess the robustness of imitative models trained on American towns when deployed in Old University Towns. We will use Google Maps to create a new map for the simulator based on Oxford city centre, and use Google Street View to create new assets for the simulator as well. Requirements: strong Python experience; experience with deep learning, including computer vision models; preferred: experience with Unreal Engine and asset creation.
Benchmarks for Bayesian Deep Learning: Astrophysics Yarin Gal Artificial Intelligence and Machine Learning C MSc Bayesian deep learning (BDL) is a field of Machine Learning where we develop tools that can reason about their confidence in their predictions. A main challenge in BDL is comparing different tools to each other, with common benchmarks being much needed in the field. In this project we will develop a set of tools to evaluate Bayesian deep learning techniques, reproduce common techniques in BDL, and evaluate them with the developed tools. The tools we will develop will rely on downstream tasks that have made use of BDL in real-world applications such as parameter estimation in Strong Gravitational Lensing with neural networks. Prerequisites: only suitable for someone who has done Probability Theory, has worked in Machine Learning in the past, and has strong programming skills (Python).
Deconstructing Active Learning Yarin Gal Artificial Intelligence and Machine Learning MSc Deep learning with small amounts of data - e.g. deep active learning - is a very attractive topic of research in Deep Learning, widely used in industry applications, while allowing for many different approaches. Close examination of the active learning literature, though, suggests many baselines used in previous research to be very weak. Even naive baselines such as random acquisition often perform much better than reported in the literature. This project aims to deconstruct previous research, reproduce results, and show that random acquisition performs much better than reported. Moreover, we would investigate the effect of training with pseudo-labels on active learning, and using random acquisition on low-confidence samples only (which should boost performance at low cost and could be competitive). Requirements: strong Python coder; comfortable with reproducing/reimplementing existing papers and adapting published research code; interested in Deep Learning, Computer Vision and Active Learning.
Topics in Randomised Algorithms and Computational Complexity Andreas Galanis Algorithms and Complexity Theory C MSc Description: Andreas Galanis is willing to supervise projects in the areas of randomised algorithms and computational complexity. Problems of interest include (i) the analysis of average case instances of hard combinatorial problems (example: can we satisfy a random Boolean formula?), and (ii) the computational complexity of counting problems (example: can we count approximately the number of proper colourings of a graph G?). The projects would suit mathematically oriented students, especially those with an interest in applying probabilistic methods to computer science.
Data Science and Machine Learning Techniques for Model Parameterisation in Biomedical and Environmental Applications David Gavaghan, Martin Robinson, Michael Clerx Computational Biology and Health Informatics B C MSc Time series data arise as the output of a wide range of scientific experiments and clinical monitoring techniques. Typically the system under study will either be undergoing time-varying changes which can be recorded, or the system will have a time-varying signal as input and the response signal will be recorded. Familiar everyday examples of the former include ECG and EEG measurements (which record the electrical activity in the heart or brain as a function of time), whilst examples of the latter occur across scientific research from cardiac cell modelling to battery testing. Such recordings contain valuable information about the underlying system under study, and gaining insight into the behaviour of that system typically involves building a mathematical or computational model of that system, which will have embedded within it key parameters governing system behaviour. The problem that we are interested in is inferring the values of these key parameters through applications of techniques from machine learning and data science. Currently used methods include Bayesian inference (Markov Chain Monte Carlo (MCMC), Approximate Bayesian Computation (ABC)) and non-linear optimisation techniques, although we are also exploring the use of other techniques such as probabilistic programming and Bayesian deep learning. We are also interested in developing techniques that will speed up these algorithms, including parallelisation and the use of Gaussian Process emulators of the underlying models. Application domains of current interest include modelling of the cardiac cell (for assessing the toxicity of new drugs), understanding how biological enzymes work (for application in developing novel fuel cells), as well as a range of basic science problems. Prerequisites: some knowledge of Python
Topics in Continuous Maths, Discrete Maths, Linear Algebra, Object-oriented Programming, Machine Learning and Probability and Computing David Gavaghan Computational Biology and Health Informatics B C Prof Dave Gavaghan is willing to supervise in the area of Topics in Continuous Maths, Discrete Maths, Linear Algebra, Object-oriented Programming, Machine Learning and Probability and Computing.
Benchmarking SAT-solvers via necklace splitting Paul Goldberg Algorithms and Complexity Theory C MSc We consider the problem of generating challenge instances (or benchmarks) for SAT solvers. We propose to investigate the use of the "NECKLACE-SPLITTING" problem to create such challenge instances, since this problem is hard but has guaranteed solutions. The project involves generating instances of NECKLACE-SPLITTING, converting them to SAT instances, and passing them to various SAT solvers to test the solvers' performance. It will also be of interest to know whether certain constrained kinds of solutions are harder to find via this approach. There is scope to experiment with alternative ways of generating the random instances of NECKLACE-SPLITTING.
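Before encoding instances into SAT, it is useful to have a reference solver to sanity-check the guaranteed solutions. The following is a brute-force sketch for the two-thief case (not an efficient algorithm, and not the SAT encoding itself); the example necklace is made up.

```python
from itertools import combinations, product

def fair_split(necklace, num_cuts):
    """Brute-force reference solver: try up to `num_cuts` cuts and every
    assignment of the resulting intervals to two thieves, returning the
    first split that gives each thief exactly half of every colour."""
    n = len(necklace)
    colours = set(necklace)
    target = {c: necklace.count(c) // 2 for c in colours}
    for k in range(num_cuts + 1):
        for cuts in combinations(range(1, n), k):
            bounds = [0, *cuts, n]
            intervals = [necklace[bounds[i]:bounds[i + 1]]
                         for i in range(len(bounds) - 1)]
            for assign in product([0, 1], repeat=len(intervals)):
                share = {c: 0 for c in colours}
                for interval, owner in zip(intervals, assign):
                    if owner == 0:
                        for bead in interval:
                            share[bead] += 1
                if share == target:
                    return cuts, assign
    return None

# Two colours, so by the necklace-splitting theorem two cuts always suffice.
solution = fair_split([0, 0, 1, 1, 0, 0, 1, 1], num_cuts=2)
```

A SAT encoding would introduce Boolean variables for cut positions and interval ownership and assert the same counting constraints; this solver then certifies the generated benchmarks.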
Cake-cutting with low envy Paul Goldberg Algorithms and Complexity Theory C MSc

Description: I-cut-you-choose is the classical way for two people to share a divisible good. For three people, there exists a sequence of operations using 5 cuts, that is also envy-free, but for 4 or more people, it is unknown whether you can share in an envy-free manner, using a finite number of cuts. (This is with respect to a well-known class of procedures that can be represented using a tree whose nodes are labelled with basic "cut" and "choose" operations.) The general idea of this project is to generate and test large numbers of potential cake-cutting procedures, and measure the "extent of envy" in the case where they are not envy-free. (See wikipedia's page on "cake-cutting problem".) It is of interest to find out the amount of envy that must exist in relatively simple procedures.

Prerequisites: competence and enthusiasm for program design and implementation; mathematical analysis and proofs.
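The "measure the extent of envy" idea above can be prototyped quickly. This sketch (toy discretisation, not a general framework) implements I-cut-you-choose for two agents with valuation densities on [0, 1] and reports each agent's envy, i.e. how much more she values the other's piece than her own.

```python
import numpy as np

def value(density, a, b, grid):
    """Approximate an agent's value of the interval [a, b) by a Riemann
    sum of her valuation density over the grid."""
    mask = (grid >= a) & (grid < b)
    return density[mask].sum() / len(grid)

def cut_and_choose(d1, d2, grid):
    """I-cut-you-choose: agent 1 cuts at her half-value point, agent 2
    takes the piece she prefers. Returns (envy of agent 1, envy of agent 2)."""
    cum = np.cumsum(d1) / d1.sum()
    cut = grid[np.searchsorted(cum, 0.5)]
    left2, right2 = value(d2, 0, cut, grid), value(d2, cut, 1, grid)
    if left2 >= right2:
        pieces = {2: (0, cut), 1: (cut, 1)}
    else:
        pieces = {1: (0, cut), 2: (cut, 1)}
    envy1 = max(0.0, value(d1, *pieces[2], grid) - value(d1, *pieces[1], grid))
    envy2 = max(0.0, value(d2, *pieces[1], grid) - value(d2, *pieces[2], grid))
    return envy1, envy2

grid = np.linspace(0, 1, 1000, endpoint=False)
envy1, envy2 = cut_and_choose(np.ones(1000), np.ones(1000), grid)
```

The chooser is envy-free by construction; the cutter's envy here is only discretisation error. Generating candidate cut-and-choose trees and scoring them with this kind of envy measure is the project's central experiment.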

Construction of finite automata from examples Paul Goldberg Algorithms and Complexity Theory C

Description: The aim of the project is to allow a user to construct a finite automaton (or alternatively, a regular expression) by providing examples of strings that ought to be accepted by it, in addition to examples that ought not to be accepted. An initial approach would be to test simple finite automata against the strings provided by the user; more sophisticated approaches could be tried out subsequently. One possibility would be to implement an algorithm proposed in a well-known paper by Angluin, "Learning regular sets from queries and counterexamples".

Prerequisites: competence and enthusiasm for program design and implementation; familiarity with finite automata, context-free languages, etc.; mathematical proofs.
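The "initial approach" of testing simple automata against the user's strings can be sketched directly: enumerate all small complete DFAs and keep the first one consistent with the examples. This brute-force baseline (not Angluin's L* algorithm, which would come later) is only feasible for very few states, which is exactly what motivates the more sophisticated approaches.

```python
from itertools import product

def dfa_accepts(delta, accept, s):
    """Run a DFA given as a transition table {(state, symbol): state},
    with start state 0 and accepting set `accept`."""
    state = 0
    for ch in s:
        state = delta[(state, ch)]
    return state in accept

def consistent(delta, accept, positives, negatives):
    return (all(dfa_accepts(delta, accept, s) for s in positives) and
            not any(dfa_accepts(delta, accept, s) for s in negatives))

def search_dfa(positives, negatives, alphabet="ab", max_states=3):
    """Enumerate all complete DFAs with up to `max_states` states and
    return the first one consistent with the labelled examples."""
    for n in range(1, max_states + 1):
        keys = [(q, a) for q in range(n) for a in alphabet]
        for targets in product(range(n), repeat=len(keys)):
            delta = dict(zip(keys, targets))
            for bits in product([False, True], repeat=n):
                accept = {q for q in range(n) if bits[q]}
                if consistent(delta, accept, positives, negatives):
                    return delta, accept
    return None

# Examples consistent with "strings over {a, b} containing at least one a".
result = search_dfa(positives=["a", "aa", "ba"], negatives=["", "b", "bb"])
```

Returning the smallest consistent DFA also gives an Occam-style learning guarantee, which connects this baseline to the theory in Angluin's paper.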

Medical information extraction with deep neural networks Paul Goldberg, Alejo Nevado-Holgado Algorithms and Complexity Theory MSc

Background. Electronic Health Records (EHRs) are the databases used by hospitals and general practitioners to log, on a daily basis, all the information they record from patients (i.e. disorders, medications taken, symptoms, medical tests…). Most of the information held in EHRs is in the form of natural language text (written by the physician during each session with each patient), making it inaccessible for research. Unlocking all this information would bring a very significant advancement to biomedical research, multiplying the quantity and variety of scientifically usable data, which is why major efforts have relatively recently been initiated towards this aim (e.g. the I2B2 challenges or the UK-CRIS network of EHRs).

Project. Recent Deep Neural Network (DNN) architectures have shown remarkable results in traditionally unsolved NLP problems, including some Information Extraction tasks such as Slot Filling [Mesnil et al] and Relation Classification [dos Santos et al]. When transferring this success to EHRs, DNNs offer the advantage of not requiring well-formatted text, although the problem remains that labelled data are scarce (ranging in the hundreds for EHRs, rather than the tens of thousands used in typical DNN studies). However, ongoing work in our lab has shown that certain extensions of recent NLP-DNN architectures can reproduce the typical remarkable success of DNNs in situations with limited labelled data (paper in preparation). Namely, incorporating interaction terms into feed-forward DNN architectures [Denil et al] can raise the performance of relation classification on I2B2 datasets from 0.65 F1 score to 0.90, while the highest performance previously reported on the same dataset was 0.74.

We therefore propose to apply DNNs to the problem of information extraction in EHRs, using I2B2 and UK-CRIS data as a testbed. More specifically, the DNNs designed and implemented by the student should be able to extract medically relevant information, such as prescribed drugs or diagnoses given to patients. This corresponds to some of the challenges proposed by I2B2 during recent years, and to objectives of high interest in UK-CRIS which have sometimes been addressed with older techniques such as rules [Iqbal et al, Jackson et al]. The student is free to use the extension of the feed-forward DNN developed in our lab, or to explore other feed-forward or recurrent (e.g. RNN, LSTM or GRU) alternatives. The DNN should be implemented in Python's Keras, Theano, Tensorflow, or PyTorch.


G. Mesnil et al., Using Recurrent Neural Networks for Slot Filling in Spoken Language Understanding, IEEEACM Trans. Audio Speech Lang. Process. 23 (2015) 530–539. doi:10.1109/TASLP.2014.2383614.

C.N. dos Santos, B. Xiang, B. Zhou, Classifying Relations by Ranking with Convolutional Neural Networks, CoRR. abs/1504.06580 (2015).

M. Denil, A. Demiraj, N. Kalchbrenner, P. Blunsom, N. de Freitas, Modelling, Visualising and Summarising Documents with a Single Convolutional Neural Network, CoRR. abs/1406.3830 (2014).

E. Iqbal, R. Mallah, R.G. Jackson, M. Ball, Z.M. Ibrahim, M. Broadbent, O. Dzahini, R. Stewart, C. Johnston, R.J.B. Dobson, Identification of Adverse Drug Events from Free Text Electronic Patient Records and Information in a Large Mental Health Case Register, PLOS ONE. 10 (2015) e0134208. doi:10.1371/journal.pone.0134208.

R.G. Jackson MSc, M. Ball, R. Patel, R.D. Hayes, R.J. Dobson, R. Stewart, TextHunter – A User Friendly Tool for Extracting Generic Concepts from Free Text in Clinical Research, AMIA. Ann. Symp. Proc. 2014 (2014) 729–738.

Neural network genomics Paul Goldberg, Alejo Nevado-Holgado Algorithms and Complexity Theory MSc

Background. For more than 10 years, GWAS studies have represented a revolution in the study of human disorders and human phenotypes. By measuring how the risk of suffering any given disease changes with SNP mutations, GWAS studies can measure how relevant each gene is to the disease under study. However, this SNP-disease association is calculated “one SNP at a time”, and ignores the gene-gene interactions that are often crucially important in causing the disorder. If a given disorder is produced by the interaction of a number of genes (e.g. insulin and its receptor in type 2 diabetes, or APP and the alpha-, beta- and gamma-secretases in Alzheimer’s), this interaction will not be detected by a GWAS analysis. This shortcoming may not be a problem in monogenic heritable disorders, such as Huntington’s disease, where mutations in a single gene are by themselves enough to cause the disease. However, GWAS studies will likely not uncover the mechanisms of complex disorders, where the disease emerges from the interaction of a number of genes. This is likely the source of the so-called “missing heritability” problem observed in most complex traits and diseases, where all the SNP variations taken together cannot account for most of the heritability of a given trait or disease [Manolio et al]. In addition, it has been demonstrated that complex traits such as height and BMI are clearly and strongly heritable [Cesarini and Visscher], yet GWAS studies simply cannot detect most of this heritability. In summary, GWAS analyses detect simple individual genetic factors, but not interactions between genetic factors.

Project. We propose to identify the interacting genetic factors behind Alzheimer’s disease and other neurodegenerations with neural networks, which are known for exploiting the interactions present in the data they analyse. While the linear models used in GWAS studies can identify only linear, univariate contributions of each gene to a disorder, neural networks can analyse how genes interact with each other to explain the studied disorder. In our laboratories we already have the hardware and the expertise required to build neural networks, and we have used them in other research areas relevant to Alzheimer’s disease. In addition, we have access to, and experience of using, UK Biobank, which is the ideal dataset for this project. UK Biobank has measured wide-array SNPs, diagnoses and demographics in ~500,000 participants aged between 37 and 73, more than 5000 of whom suffer from Alzheimer’s disease, Parkinson’s disease or another neurodegeneration.

We propose that the MSc student build a neural network to predict either diagnoses or disease-related endophenotypes (i.e. brain volumes of affected areas, cognitive scores…) for each of these participants, using only the information present in the wide-array SNPs and relevant demographics. The student is free to use the extension of the feed-forward DNN developed in our lab, or to explore other feed-forward or recurrent (e.g. RNN, LSTM or GRU) alternatives. The DNN should be implemented in Python, using Keras, Theano, TensorFlow, or PyTorch.
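As a minimal illustration of why a hidden layer matters here, the NumPy sketch below runs a forward pass of a tiny one-hidden-layer network over a toy vector of minor-allele counts. All sizes and weights are placeholders, not UK Biobank data:

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(snps, W1, b1, W2, b2):
    """One hidden layer lets the model represent gene-gene
    interactions that a per-SNP linear GWAS model cannot."""
    h = np.maximum(0.0, snps @ W1 + b1)  # ReLU hidden layer
    return h @ W2 + b2                   # scalar phenotype score

n_snps, n_hidden = 10, 5                 # toy sizes for illustration
W1 = rng.normal(scale=0.1, size=(n_snps, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=n_hidden)
b2 = 0.0

snps = rng.integers(0, 3, size=n_snps).astype(float)  # minor-allele counts 0/1/2
print(float(forward(snps, W1, b1, W2, b2)))
```

In practice the same forward pass would be expressed in Keras or PyTorch and trained by gradient descent on diagnoses or endophenotype targets.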


T.A. Manolio et al., Finding the missing heritability of complex diseases, Nature. 461 (2009) 747–753. doi:10.1038/nature08494.

D. Cesarini, P.M. Visscher, Genetics and educational attainment, Npj Sci. Learn. 2 (2017). doi:10.1038/s41539-017-0005-6.

Rank aggregation Paul Goldberg Algorithms and Complexity Theory C

Description: In social choice theory, a general theme is to take a set of rankings of a set of candidates (also known as alternatives) and compile an "overall" ranking that attempts to be as close as possible to the individual rankings. Each individual ranking can be thought of as a vote that we want to compile into an overall decision. The project involves taking some real-world data, for example university league tables, and computing aggregate rankings to see which individual votes are closest to the consensus. In the case of the Kemeny consensus, an NP-hard rank aggregation rule, it is of interest to exploit heuristics that may be effective on real-world data, and to see how large a data set the Kemeny consensus can be computed for.
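For a handful of candidates the Kemeny consensus can be computed exactly by exhaustive search over all permutations, which makes concrete why heuristics are needed at scale. A minimal Python sketch with toy votes:

```python
from itertools import permutations, combinations

def kendall_tau(r1, r2):
    """Number of candidate pairs the two rankings order differently."""
    pos1 = {c: i for i, c in enumerate(r1)}
    pos2 = {c: i for i, c in enumerate(r2)}
    return sum(
        (pos1[a] < pos1[b]) != (pos2[a] < pos2[b])
        for a, b in combinations(r1, 2)
    )

def kemeny(votes):
    """Exhaustive search: feasible only for very few candidates,
    since the number of permutations grows factorially."""
    candidates = votes[0]
    return min(
        permutations(candidates),
        key=lambda r: sum(kendall_tau(r, v) for v in votes),
    )

votes = [("A", "B", "C"), ("A", "C", "B"), ("B", "A", "C")]
print(kemeny(votes))  # ('A', 'B', 'C')
```

On real league-table data one would replace the exhaustive `min` with a heuristic (e.g. local search over adjacent swaps) and compare its output against this exact method on small instances.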

Prerequisites: familiarity with polynomial-time algorithms, NP-hardness; interest in computational experiments on data

Rock-paper-scissors for the computerised strategist Paul Goldberg Algorithms and Complexity Theory C MSc

In the well-known game of rock-paper-scissors, it is clear that any player can "break even" by playing entirely at random. On the other hand, people do a poor job of generating random numbers, and expert players of the game can take advantage of predictable aspects of opponents' behaviour. In this project, we envisage designing algorithms that adapt to a human opponent's behaviour, using for example no-regret learning techniques, and modelling the opponent as a probabilistic automaton. Ideally, the student taking this project should manage to persuade some volunteers to test the software! This is needed to provide data for the algorithms, and the software should provide feedback to opponents on what mistakes they are making. Depending on how it goes, we could also consider extending this approach to related games.
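A standard no-regret rule for this setting is multiplicative weights (Hedge). The plain-Python sketch below, with toy win/lose payoffs, shifts probability toward the actions that would have scored well against the opponent's observed history:

```python
ACTIONS = ("rock", "paper", "scissors")
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def payoff(mine, theirs):
    """+1 for a win, -1 for a loss, 0 for a draw."""
    if mine == theirs:
        return 0.0
    return 1.0 if BEATS[mine] == theirs else -1.0

def multiplicative_weights(opponent_moves, eta=0.5):
    """Hedge update: each action's weight grows or shrinks with the
    payoff it would have earned against each observed move."""
    weights = {a: 1.0 for a in ACTIONS}
    for move in opponent_moves:
        for a in ACTIONS:
            weights[a] *= (1.0 + eta) ** payoff(a, move)
    total = sum(weights.values())
    return {a: w / total for a, w in weights.items()}

# Against a heavily rock-biased opponent, probability mass
# should drift toward paper.
history = ["rock"] * 8 + ["scissors"] * 2
probs = multiplicative_weights(history)
print(max(probs, key=probs.get))  # paper
```

Sampling one's next move from `probs` (rather than playing the argmax deterministically) keeps the strategy itself hard to exploit while still punishing a predictable opponent.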

Prerequisite: Interest in working with probability is important.

Using behavioural data to model a game-theoretic adversary Paul Goldberg Algorithms and Complexity Theory MSc To take the project, a student should have done the courses in Computational Learning Theory and Machine Learning. The project is related to the paper (Yang et al.) below, and uses a data set (studied in the paper) obtained from an on-line game that simulates an adversarial security setting, in which the player selects one of a number of targets to attack, based on information about their values and the probabilities that they will be defended. The purpose of this game is to understand a player's "bounded rationality", i.e. the ways in which they fail to make the best decision. The project would study a model of player behaviour, related to ones studied in (Yang et al.), and aim to learn the ways in which a player fails to best-respond. A related objective is to design optimal defender behaviour based on what is learned about attackers' behaviour. R. Yang, C. Kiekintveld, F. Ordóñez, M. Tambe, R. John, Improving resource allocation strategies against human adversaries in security games: An extended study, Artificial Intelligence 195 (2013) 440–469.
A Conceptual Model for Assessing Privacy Risk Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola Security B C MSc Privacy is not a binary concept: the level of privacy enjoyed by an individual or organisation will depend upon the context within which it is being considered; the more data an attacker has access to, the more potential there may be for privacy compromise. We lack a model which considers the different contexts that exist in current systems and which would underpin a measurement system for determining the level of privacy risk that might be faced. This project would seek to develop a prototype model, based on a survey of known privacy breaches and common practices in data sharing. The objective is to propose a method by which privacy risk might be considered, taking into account the variety of (threat and data-sharing) contexts that any particular person or organisation might be subject to. It is likely that a consideration of the differences and similarities of the individual and organisational points of view will need to be made, since the nature of the contexts faced could be quite diverse.
Applying AI Techniques to Auto-categorise Violent Images Michael Goldsmith, Harjinder Lallie Security B C MSc The analysis of violent images in digital forensics cases can take a long time and be traumatising for digital investigators. Artificial intelligence (AI) techniques can aid in automating the categorisation of violent images, and datasets such as the Hollywood violent scenes dataset can be used to aid the categorisation. This project involves developing a tool which uses AI techniques to categorise violent images. Please note that you will not be required to look at 'real' violent images; the images in the dataset referenced here are from Hollywood films such as Reservoir Dogs, Armageddon, Harry Potter and the Order of the Phoenix, Billy Elliot, Kill Bill 1 and The Wizard of Oz. Useful reference(s): C.H. Demarty, C. Penet, M. Soleymani, G. Gravier. VSD, a public dataset for the detection of violent scenes in movies: design, annotation, analysis and evaluation. In Multimedia Tools and Applications, May 2014. Prerequisites: suitable for someone who has some AI experience. Knowledge/experience of digital forensics is not critical.
Applying AI Techniques to Estimate Suspect/Victim Age Michael Goldsmith, Harjinder Lallie Security B C MSc Law enforcement agencies are often tasked with determining the age of a suspect or victim. In these circumstances, age determination is often guided by an investigator's professional judgement and can be subjectively biased. This project requires you to apply AI techniques to estimate the age of people in a dataset of images. Datasets such as DEX provide exciting opportunities for testing age-estimation algorithms. Useful reference(s): Rothe, R., Timofte, R. and Van Gool, L., 2015. DEX: Deep expectation of apparent age from a single image. In Proceedings of the IEEE International Conference on Computer Vision Workshops (pp. 10-15). Prerequisites: suitable for someone who has some AI experience. Knowledge/experience of digital forensics is not critical.
Computer Vision for Physical Security Michael Goldsmith, Sadie Creese Security B C MSc

Computer Vision allows machines to recognise objects in real-world footage. In principle, this allows machines to flag potential threats in an automated fashion based on historical and present images in video footage of real-world environments. Automated threat detection mechanisms to help security guards identify threats would be of tremendous help to them, especially if they have to temporarily leave their post, or have to guard a significant number of areas. In this project, students are asked to implement a system that is able to observe a real environment over time and attempt to identify potential threats, independent of a number of factors, e.g. lighting conditions or wind conditions. The student is encouraged to approach this challenge as they see fit, but would be expected to design, implement and assess any methods they develop. One approach might be to implement a camera system using e.g. a web camera or a Microsoft Kinect to conduct anomaly detection on real-world environments, and flag any issues related to potential threats.
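As a baseline before any learned model, simple per-frame differencing already flags gross scene changes. A minimal NumPy sketch on synthetic greyscale frames (thresholds and sizes are arbitrary, for illustration only):

```python
import numpy as np

def motion_score(prev, curr):
    """Mean absolute per-pixel difference between two greyscale frames."""
    return float(np.mean(np.abs(curr.astype(float) - prev.astype(float))))

def flag_anomalies(frames, threshold=10.0):
    """Flag frame indices whose change from the previous frame exceeds
    a fixed threshold -- the crudest baseline before moving to learned
    background models or object detectors."""
    return [
        i for i in range(1, len(frames))
        if motion_score(frames[i - 1], frames[i]) > threshold
    ]

# Synthetic footage: a static scene, then a bright object appears.
static = np.zeros((8, 8), dtype=np.uint8)
intruder = static.copy()
intruder[2:6, 2:6] = 255
frames = [static, static, intruder, intruder]
print(flag_anomalies(frames))  # [2]
```

A real system would replace the fixed threshold with an adaptive background model so that lighting or wind-driven changes do not trigger constant false alarms, which is exactly the robustness challenge the project poses.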

Requirements: Programming skills required

Considering the performance limiters for cybersecurity controls Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola Security B C MSc

The behaviour, and so effectiveness, of security controls is dependent upon how they are used. One obvious example: the protection afforded by a firewall depends upon the maintenance of the rules that determine what the firewall stops and what it does not. The benefit (to the organisation or user seeking to manage risk) of various technical controls in an operational context lacks good evidence and data, so there is scope to consider the performance of controls in a lab environment. This mini-project would select one or more controls from the CIS Top 20 Critical Security Controls (CSC) (version 6.1) and seek to develop laboratory experiments (and implement them) to gather data on how the effectiveness of the control is impacted by its deployment context (including, for example, configuration, dependence on other controls, and the nature of the threat faced).

Requirements: Students will need an ability to develop a test-suite and deploy the selected controls.

Cybersecurity visualization Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola Security B C MSc

Cybersecurity visualization helps analysts and risk owners alike to make better decisions about what to do when the network is attacked. In this project the student will develop novel cybersecurity visualizations. The student is free to approach the challenge as they see fit, but would be expected to design, implement and assess the visualizations they develop. These projects tend to have a focus on network traffic visualization, but the student is encouraged to visualize the datasets they would be most interested in. Other interesting topics might for instance include host-based activities (e.g. CPU/RAM/network usage statistics) and network scans (incl. vulnerability scanning). Past students have visualized network traffic patterns and Android permission violation patterns. Other projects on visualization are possible (not just cybersecurity), depending on interest and inspiration.

Requirements: Programming skills required.

Detecting disguised processes using Application-Behaviour Profiling Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola Security MSc

In order to avoid detection, malware can disguise itself as a legitimate program or hijack system processes to reach its goals. Commonly used signature-based Intrusion Detection Systems (IDS) struggle to distinguish between these processes and are thus of only limited use in detecting these kinds of attacks. They also have the shortcoming that they need to be updated frequently to possess the latest malware definitions, making them inherently prone to missing novel attacks. Anomaly-based IDSs, by contrast, overcome this problem by maintaining a ground truth of normal application behaviour and reporting deviations from it as anomalies. In this project, students will be tasked to investigate how a process’s behaviour can be profiled in an attempt to identify whether it is behaving anomalously, and whether it can then be correctly identified as malware. This project will build on existing research in this area.
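The simplest possible behaviour profile is a mean and standard deviation over one numeric feature, with large z-scores reported as anomalies. The sketch below uses a hypothetical syscall-rate feature purely for illustration; a real profiler would combine many such features:

```python
import statistics

def build_profile(observations):
    """Ground truth of normal behaviour: mean and standard deviation
    of a single numeric feature (e.g. syscalls per second)."""
    return statistics.mean(observations), statistics.stdev(observations)

def is_anomalous(value, profile, z_threshold=3.0):
    """Report a deviation when the observation lies more than
    z_threshold standard deviations from the baseline mean."""
    mean, stdev = profile
    return abs(value - mean) / stdev > z_threshold

# Hypothetical baseline: a text editor making ~50 syscalls/sec.
baseline = [48, 51, 50, 49, 52, 50, 47, 53]
profile = build_profile(baseline)
print(is_anomalous(50, profile))   # False
print(is_anomalous(400, profile))  # True
```

The project's real difficulty lies in choosing features that separate a hijacked process from a busy-but-legitimate one, which a single z-score cannot do on its own.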

Requirements: Programming skills required.

Experimenting with anomaly detection features for detecting insider attacks Michael Goldsmith, Ioannis Agrafiotis, Sadie Creese, Arnau Erola Security B C MSc

This project will use an anomaly detection platform being developed by the Cyber Security Analytics Group to consider relative detection performance using different feature sets, and different anomalies of interest, in the face of varying attacks. This research would be experimental in nature and conducted with a view to exploring the minimal sets that would result in detection of a particular threat. Further reflection would then be given to how generalisable this result might be and what we might determine about the critical datasets required for this kind of control to be effective.

Eyetracker data analysis Michael Goldsmith, Ivan Martinovic Security MSc

Eyetracking allows researchers to identify where observers look on a monitor. A challenge in analysing eyetracker output, however, is identifying patterns of viewing behaviour in the raw eyetracker logs over time, and what they mean in a research context, particularly as no semantic information about the pixel being viewed is captured. In practice, this means that we know where someone is looking, but nothing about what it means to look at that pixel, or at a particular sequence of pixels. From a research perspective, being able to tag areas of interest and what those areas mean, or to conduct other statistical analysis of viewing patterns, would be of significant use in analysing results, particularly if this could be automated. The purpose of this project is to conduct an eyetracker experiment, and to design and implement useful methods to analyse eyetracker data. We will be able to provide training for the student, so that they are able to use the eyetracking tools themselves. Other eyetracking projects are also possible, depending on interest and inspiration.
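A common first analysis step is dispersion-based fixation detection (I-DT): grouping consecutive gaze samples that stay within a small spatial window. The simplified Python sketch below uses arbitrary thresholds on synthetic gaze coordinates, purely for illustration:

```python
def fixations(gaze, max_dispersion=25.0, min_points=3):
    """Simplified I-DT: group consecutive gaze samples whose combined
    x- and y-extent stays within max_dispersion pixels; groups of at
    least min_points samples count as fixations. Returns (start, end)
    sample-index pairs."""
    out, start = [], 0
    for end in range(len(gaze)):
        xs = [p[0] for p in gaze[start:end + 1]]
        ys = [p[1] for p in gaze[start:end + 1]]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
            if end - start >= min_points:
                out.append((start, end - 1))
            start = end
    if len(gaze) - start >= min_points:
        out.append((start, len(gaze) - 1))
    return out

# Two tight gaze clusters separated by a saccade.
gaze = [(100, 100), (102, 101), (101, 99), (99, 100),
        (400, 300), (402, 299), (401, 301), (399, 300)]
print(fixations(gaze))  # [(0, 3), (4, 7)]
```

Once fixations are extracted, tagging them against user-defined areas of interest reduces the semantic-gap problem described above to a point-in-region test per fixation.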

Skills: Programming, Statistics

Formal threat and vulnerability analysis of a distributed ledger model Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola Security B C MSc

This project would utilise the process algebra CSP and associated model checker FDR to explore various types of threat and how they might successfully compromise a distributed ledger. This type of modelling would indicate possible attacks on a distributed ledger, and could guide subsequent review of actual designs and testing strategies for implementations. The modelling approach would be based on the crypto-protocol analysis techniques already developed for this modelling and analysis environment, and would seek to replicate the approach for a distributed ledger system. Novel work would include developing the model of the distributed ledger, considering which components are important, formulating various attacker models and also formulating the security requirements / properties to be assessed using this model-checking based approach.

Requirements: In order to succeed in this project students would need to have a working knowledge of the machine readable semantics of CSP and also the model checker FDR. An appreciation of threat models and capability will need to be developed.

High Dynamic Range Imaging for Documentation Applications Michael Goldsmith, Sadie Creese Security B C MSc High dynamic range imaging (HDRI) allows more accurate information about light to be captured, stored, processed and displayed to observers. In principle, this allows viewers to obtain more accurate representations of real-world environments and objects. Naturally, HDRI would be of interest to museum curators to document their objects, particularly non-opaque objects, or objects whose appearance alters significantly depending on the amount of lighting in the environment. Currently, few tools exist that aid curators, archaeologists and art historians to study objects under user-defined parameters in meaningful ways. In this project the student will develop techniques to study such objects under user-specified conditions, enabling curators and researchers to study the surfaces of these objects in novel ways. These methods may involve tone mapping or other modifications of light exponents to view objects under non-natural viewing conditions, so that surface details stand out in ways that are meaningful to curators. The student is free to approach the challenge as they see fit, but would be expected to design, implement and assess any tools and techniques they develop.
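One such user-controllable mapping is Reinhard's global tone-mapping operator, which compresses HDR luminance into a displayable range. A minimal NumPy sketch on a synthetic luminance patch (the `key` value is the conventional mid-grey default, adjustable by the user):

```python
import numpy as np

def reinhard_tonemap(luminance, key=0.18, eps=1e-6):
    """Reinhard global operator: scale by the log-average luminance,
    then compress with L/(1+L) so HDR values fit a display range."""
    log_avg = np.exp(np.mean(np.log(luminance + eps)))
    scaled = key * luminance / log_avg
    return scaled / (1.0 + scaled)

# Synthetic HDR patch spanning four orders of magnitude.
hdr = np.array([[0.01, 0.1], [1.0, 100.0]])
ldr = reinhard_tonemap(hdr)
print(np.all((ldr >= 0) & (ldr < 1)))  # True
```

Exposing `key` (and similar exponents) as interactive sliders is one way such a tool could let curators push surface details into visibility under non-natural viewing conditions.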
Intelligent user activity timelines Michael Goldsmith Security B C MSc Operating systems store temporal data in multiple locations. Digital investigators are often tasked with reconstructing timelines of user activities. Timeline-generation tools such as log2timeline can aid in extracting temporal data; similarly, professional tools such as EnCase and Autopsy build and visualise low-level timelines. Collectively, these tools: (1) provide (often high volumes of) low-level data, and (2) are not able to apply any form of reasoning. This project involves the extraction of temporal data and the application of reasoning algorithms to develop reliable event sequences of interest to an investigator. Useful references: Olsson, J. and Boldt, M., 2009. Computer forensic timeline visualization tool. Digital Investigation, 6, pp.S78-S87. Buchholz, F.P. and Falk, C., 2005, August. Design and Implementation of Zeitline: a Forensic Timeline Editor. In DFRWS.
International Cybersecurity Capacity Building Initiatives Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola Security B C MSc There is a large investment being made by the international community aimed at helping nations and regions to develop their capacity in cybersecurity. The work of the Global Cyber Security Capacity Centre (based at the Oxford Martin School) studies and documents this. There is scope to study in more detail the global trends in capacity building in cybersecurity, the nature of the work and the partnerships that exist to support it. An interesting analysis might be to identify what is missing (through comparison with the Cybersecurity Capacity Maturity Model, a key output of the Centre), and also to consider how strategic, or not, such activities appear to be. An extension of this project, or indeed a second parallel project, might seek to compare the existing efforts with the economic and technology metrics that exist for countries around the world, exploring whether the data shows any relationships between those metrics and the capacity-building activities underway. This analysis would involve regression techniques.
Modelling of security-related interactions, in CSP or more evocative notations such as Milner's bigraphs Michael Goldsmith Security B C MSc (Joint with Sadie Creese) Modelling of security-related interactions, in CSP or more evocative notations such as Milner's bigraphs. Concurrency and Computer Security are a distinct advantage. Appropriate for good 3rd or 4th year undergraduates or MSc. Open to suggestions for other interesting topics in Cybersecurity, if anyone has a particular interest they would like to pursue.
Organisational Resilience and Self-* re-planning Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola Security MSc

Key Performance Indicators (KPIs) are a type of performance measurement in organisations. They evaluate the success of activities in which an organisation or system engages. Often success is measured in terms of whether an activity has been completed and can be repeated (e.g. zero defects, customer satisfaction or similar); at other times it is measured in terms of progress towards a strategic goal. Well-chosen KPIs allow us to reflect upon the performance of an organisation, and possibly to identify potential future degradation issues in systems. In this project, students are tasked to investigate how KPIs can be used to measure and improve cyber-resilience in an organisation. In particular, we are interested in how the performance of an organisation can be measured, with respect to both security and mission performance. After identifying which aspects of an organisation or mission are prone to error, it may be beneficial to propose solutions for how these issues can be addressed. With self-* re-planning, we believe it would be possible for a system to suggest and automate certain aspects of how the mission, infrastructure or organisation can be repurposed so as not to fail. The student is encouraged to approach this challenge as they see fit, but would be expected to design, implement and assess any methods they develop. For instance, the project may be network-security focused, mission-process focused or otherwise. It may also investigate more formal approaches to self-* re-planning methods.

Requirements: Programming and/or Social Science Research Methods.

Penetration testing for harm Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola Security B C MSc

Current penetration testing is typically utilised for discovering how organisations might be vulnerable to external hacks, and testing methods are driven by using techniques determined to be similar to approaches used by hackers. The result is a report highlighting various exploitable weak-points and how they might result in unauthorised access should a malign entity attempt to gain access to a system. Recent research within the cybersecurity analytics group has been studying the relationship between these kinds of attack surfaces and the kinds of harm that an organisation might be exposed to. An interesting question is whether an orientation around intent, or harm, might result in a different test strategy: would a different focus be given to the kinds of attack vectors explored in a test if a particular harm were aimed at? This mini-project would aim to explore these and other questions by designing penetration test strategies based on a set of particular harms, and then seek to consider potential differences from current penetration practices in consultation with the professional community.

Requirements: Students will need to have a working understanding of penetration testing techniques.

Photogrammetry Michael Goldsmith Security B C MSc Photogrammetry is a set of techniques that allows for 3D measurements from 2D photographs, especially those measurements pertaining to geometry or surface colours. The purpose of this project is to implement one or more photogrammetry techniques from a series of 2D photographs. The student is free to approach the challenge as they see fit, but would be expected to design, implement and assess the tool they develop.
Predicting exposure to risk for active tasks Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola Security B C MSc

Prior research has considered how we might better understand and predict the consequences of cyber-attacks based on knowledge of the business processes, people and tasks, and of how they utilise the information infrastructure and digital assets that might be exposed to specific attack vectors. However, this can clearly be refined by moving to an understanding of those tasks live or active at the time an attack propagates across a system. If this can be calculated, then an accurate model of where risk may manifest and the harm that may result can be constructed. This project would explore the potential for such a model through practical experimentation and the development of software monitors to be placed on a network, aimed at inferring the tasks and users that are active based on network traffic. If time allows, host-based sensors might also be explored (such as on an application server) to further refine the understanding of which users are live on which applications, etc.

Requirements: Students must be able to construct software prototypes and have a working knowledge of network architectures and computer systems.

Procedural Methods in Computer Graphics Michael Goldsmith Security B C MSc

Procedural methods in computer graphics help us develop content for virtual environments (geometry and materials) using formal grammars. Common approaches include fractals and L-systems. Examples of content may include the creation of cities, planets or buildings. In this project the student will develop an application to create content procedurally. The student is free to approach the challenge as they see fit, but would be expected to design, implement and assess the methods they develop. These projects tend to have a strong focus on designing and implementing existing procedural methods, but also include a portion of creativity. The project can be based on reality, e.g. looking at developing content that has some kind of basis in how the real-world equivalent objects were created (physically-based approaches), or the project may be entirely creative in how it creates content. Past students have, for instance, built tools to generate cities based on real-world examples and non-existent city landscapes; others have built procedural planets, including asteroids and earth-like planetary bodies.
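As a minimal example of the grammar-based approach, the sketch below expands an L-system by parallel string rewriting. Lindenmayer's algae system is used only as the classic toy example; the same engine drives turtle-graphics plant or city generators by attaching drawing commands to symbols:

```python
def expand(axiom, rules, iterations):
    """Apply an L-system's rewrite rules in parallel to every symbol;
    symbols without a rule are copied through unchanged."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's original algae system: A -> AB, B -> A.
algae = {"A": "AB", "B": "A"}
print(expand("A", algae, 4))  # ABAABABA
```

The exponential growth of the expanded string is what lets a few rules describe arbitrarily detailed geometry.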

Reflectance Transformation Imaging Michael Goldsmith, Stefano Gogioso Foundations, Structures, and Quantum, Security B C MSc Reflectance Transformation Imaging (RTI) is a powerful set of techniques (the first of which was known as Polynomial Texture Maps, PTMs) that enables us to capture photographs of objects under several lighting conditions. Combined, these RTI images form a single photograph in which users can relight the objects by moving the light sources around the hemisphere in front of the object, and can also specify user-defined parameters, such as removing colour, or making the objects more specular or diffuse, in order to investigate the surface details in depth. It can be used for forensic investigation of crime scenes as well as cultural heritage documentation and investigation. The purpose of this project is to implement RTI methods of the student's preference.
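The original PTM technique fits, per pixel, a 6-term polynomial in the light direction by least squares, and relights by evaluating it at new directions. The NumPy sketch below fits and relights one synthetic pixel whose response is, by construction, exactly such a polynomial:

```python
import numpy as np

def ptm_basis(lu, lv):
    """Standard 6-term PTM basis in the light direction (lu, lv)."""
    return np.array([lu * lu, lv * lv, lu * lv, lu, lv, 1.0])

def fit_ptm(light_dirs, intensities):
    """Least-squares fit of the 6 PTM coefficients for one pixel,
    given its observed intensity under each known light direction."""
    A = np.array([ptm_basis(lu, lv) for lu, lv in light_dirs])
    coeffs, *_ = np.linalg.lstsq(A, np.array(intensities), rcond=None)
    return coeffs

def relight(coeffs, lu, lv):
    """Evaluate the fitted polynomial at a new light direction."""
    return float(ptm_basis(lu, lv) @ coeffs)

# Synthetic pixel whose true response is exactly a PTM polynomial.
true = np.array([0.1, -0.2, 0.05, 0.3, 0.4, 0.5])
dirs = [(u / 2, v / 2) for u in range(-2, 3) for v in range(-2, 3)]
obs = [ptm_basis(lu, lv) @ true for lu, lv in dirs]
coeffs = fit_ptm(dirs, obs)
print(round(relight(coeffs, 0.1, 0.2), 3))  # 0.604
```

A full implementation repeats this fit for every pixel of the capture set and stores the coefficient maps, which is what makes interactive relighting cheap.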
Resilience – metrics and tools for understanding organisational resilience Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola Security MSc

Resilience in the face of cyber-attack is considered a key requirement for organisations. Prior work within the cybersecurity analytics group has been developing a resilience model for organisations, but there remains no commonly agreed set of metrics that organisations can use to determine just how resilient they are (at a given point in time and continuously). This mini-project would seek to propose a set of metrics, and if time allows tools for applying them (although the latter may well better suit a subsequent DPhil). The approach will be to develop metrics to measure the various characteristics, organisational capabilities and behaviours that are determined to be necessary in order to be resilient. It will be necessary to consider how the metrics might vary according to, or take account of, different threat environments. It may also be interesting to consider whether there are refinements of the resilience requirements that apply to different types of organisations. Data analytics approaches will need to be considered in order to apply the metrics, and students might consider the use of visualisation to help with this. The resulting metrics would be validated in consultation with security professionals and organisations possessing resilience-related experience. In the context of the mini-project this is most likely achievable via a small number of interviews, with a more detailed validation and iterative design approach being supported by a longitudinal study working with 2 or 3 organisations which might adopt and test the metrics.

Smartphone security Michael Goldsmith Security B C MSc

(Joint with Sadie Creese) Smartphone security: one concrete idea is the development of a policy language to allow the authors of apps to describe their behaviour, designed to be precise about the expected access to peripherals and networks and the purpose thereof (data required and usage); this uses skills in formal specification and understanding of app behaviour (by studying open-source apps), possibly leading to prototyping a software tool to perform run-time checking that the claimed restrictions are adhered to. Suitable for good 3rd or 4th year undergraduates or MSc; Concurrency, Concurrent Programming and Computer Security are all possibly an advantage. Other projects within this domain are possible, according to interest and inspiration.



Technology-layer social networks Michael Goldsmith Security C MSc (Joint with Sadie Creese) Technology-layer social networks: investigate the potential to identify relationships between people via technology metadata - which machines their machines are "friendly" with. Research will involve identification of all metadata available from the network layer, app layers and the data layer, development of appropriate relationship models, and practical experimentation / forensic-style work exploring how to extract relationships between technologies and identities. Appropriate for 4th year undergraduates or MSc.
Trip-wires Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola Security B C MSc At Oxford we have developed a framework for understanding the components of an attack, and documenting known attack patterns can be instrumental in developing trip-wires aimed at detecting the presence of insiders in a system. This project will seek to develop a library of such trip-wires based on a survey of openly documented and commented-upon attacks, using the Oxford framework. There will be an opportunity to deploy the library into a live-trial context, which should afford an opportunity to study the relative utility of the trip-wires within large commercial enterprises. The mini-project would also need to include experimenting with the trip-wires in a laboratory environment, and this would involve the design of appropriate test methods.
Understanding Enterprise Infrastructure Dependencies Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola Security B C MSc

There are many tools available for detecting and monitoring cyber-attacks based on network traffic, and these are accompanied by a wide variety of tools designed to make alerts tangible to security analysts. By comparison, the impact of these attacks at an organisational level has received little attention. One aspect that could be enhanced is the addition of a tool facilitating the management and updating of our understanding of business processes, and of how those processes depend on a network infrastructure. This tool could capture the mapping between company strategies and the activities needed to accomplish company goals, and map these down to the network and people assets. At the top of the hierarchy lies the board, responsible for strategic decisions. These decisions are interpreted at the managerial level and could be captured and analysed with business objective diagrams. These diagrams in turn could be refined further to derive business processes and organisational charts, ensuring that decisions made at the top level will be enforced at the lower levels. The combination of business processes and organisational charts could eventually provide the network infrastructure. For this project we suggest that a student could develop novel algorithms for mapping business processes to network infrastructures in an automated way (given the updated business process files). That said, the student is encouraged to approach this challenge as they see fit, but would be expected to design, implement and assess any methods they develop. Other projects on business process modelling are also possible, depending on interest and inspiration.

Vulnerability analysis and threat models for Distributed Ledgers Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola Security B C MSc

This project would seek to study the general form of distributed ledgers to identify weak points that might make implementations open to compromise and attack, and to develop corresponding threat models. A number of different approaches might be considered, for example: formal modelling of protocols used to establish agreement or resolve conflicts; design of test suites using attack-graphs and then validation in a laboratory setting.

Bioinformatics Projects Jotun Hein Computational Biology and Health Informatics B C MSc

Jotun Hein has been in Oxford since 2001 and has supervised many students from Computer Science. His main interest is computational biology and combinatorial optimization problems in this area, especially from phylogenetics, biosequence analysis, population genetics, grammars and the origin of life. Some ideas for projects can be obtained by browsing these pages:

A student is also welcome to propose his or her own project.

Compilation of a CSP-like language Geraint Jones Programming Languages B C MSc

This is a compiler project, also requiring familiarity with concurrency.

The parallel programming language occam is essentially an implementable sublanguage of CSP. The aim of this project is to produce a small portable implementation of a subset of occam; the proposed technique is to implement a virtual machine based on the inmos transputer, and a compiler which targets that language.

One of the aims of the project is to implement an extension to occam which permits recursion; more ambitious projects might implement a distributed implementation with several communicating copies of the virtual machine. Other possibilities are to produce separate virtual machines, optimised for displaying a simulation, or for efficiency of implementation, or translating the virtual machine code into native code for a real machine.
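As a flavour of the proposed technique, a virtual machine of this kind reduces to an instruction-dispatch loop. The sketch below is Python with a made-up five-instruction set loosely in the transputer's load/store style; it is not the real transputer ISA, and all opcode names are illustrative.

```python
# A minimal sketch of a stack-style virtual machine that a small occam
# compiler could target. The instruction set here is invented.

def run(program):
    """Interpret a list of (opcode, operand) pairs; return the final stack."""
    stack, pc, mem = [], 0, {}
    while pc < len(program):
        op, arg = program[pc]
        pc += 1
        if op == "ldc":        # load constant onto the stack
            stack.append(arg)
        elif op == "ldl":      # load a local from memory
            stack.append(mem[arg])
        elif op == "stl":      # store top of stack into a local
            mem[arg] = stack.pop()
        elif op == "add":      # add the top two stack values
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "jz":       # jump to arg if top of stack is zero
            if stack.pop() == 0:
                pc = arg
        else:
            raise ValueError(f"unknown opcode {op}")
    return stack

# Example: compute 2 + 3, store it in local 0, then reload it.
run([("ldc", 2), ("ldc", 3), ("add", None), ("stl", 0), ("ldl", 0)])  # → [5]
```

A real implementation would add channel communication and a process queue to model occam's parallelism, which is where a distributed version with several communicating copies of the VM becomes interesting.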

Logic circuit workbench Geraint Jones Programming Languages B C MSc

This is an interactive programming project with some logic simulation behind it.

The idea is to produce a simulator for a traditional logic breadboard. The user should be able to construct a logic circuit by drawing components from a library of standard gates and latches and so on, and connecting them together in allowable ways. It should then be possible to simulate the behaviour of the circuit, to control its inputs in various ways, and to display its outputs and the values of internal signals in various ways.

The simulator should be able to enforce design rules (such as those about not connecting standard outputs together, or limiting fan-out) but should also cope with partially completed circuits; it might be able to implement circuits described in terms of replicated sub-circuits; it should also be able to output some sort of standard netlist.

It might be that this would make a project for two undergraduates: one implementing the interface, the other implementing a simulator that runs the logic.
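The simulation core of such a workbench can be very small. The following Python sketch (the gate library and circuit representation are my own assumptions, not a prescribed design) evaluates a combinational circuit whose components are drawn from a library of standard gates:

```python
# Hypothetical minimal core of a logic simulator: a circuit is a list of
# (output, gate, inputs) triples, assumed to be in dependency order.

GATES = {
    "AND": lambda a, b: a & b,
    "OR":  lambda a, b: a | b,
    "NOT": lambda a: 1 - a,
    "XOR": lambda a, b: a ^ b,
}

def simulate(circuit, inputs):
    """Evaluate a combinational circuit given input signal values."""
    signals = dict(inputs)
    for out, gate, ins in circuit:
        signals[out] = GATES[gate](*(signals[i] for i in ins))
    return signals

# A half-adder: sum = a XOR b, carry = a AND b.
half_adder = [("sum", "XOR", ("a", "b")), ("carry", "AND", ("a", "b"))]
simulate(half_adder, {"a": 1, "b": 1})  # → {'a': 1, 'b': 1, 'sum': 0, 'carry': 1}
```

Latches and design-rule checks (fan-out limits, output clashes) would sit on top of a core like this, which also suggests the natural split for a two-student version: interface versus simulation engine.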

Modelling of arithmetic circuits Geraint Jones Programming Languages B C MSc

This is a project in the specification of hardware, which I expect to make use of functional programming.

There is a great deal of knowledge about ways (that are good by various measures) of implementing standard arithmetic operations in hardware. However, most presentations of these circuits are at a very low level, involving examples, diagrams, and many subscripts, for example these descriptions.

The aim of this project is to describe circuits like this in a higher-level way by using the higher order functions of functional programming to represent the structure of the circuit. It should certainly be possible to execute instances of the descriptions as simulations of circuits (by plugging in simulations of the component gates), but the same descriptions might well be used to generate circuit netlists for particular instances of the circuit, and even to produce the diagrams.
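As an illustration of the idea, here is the pattern in Python rather than a typed functional language: a generic "row" combinator captures the repetitive structure of a ripple-carry adder, and the resulting description can be executed as a simulation by plugging in a full-adder cell. All names here are illustrative, not part of any existing library.

```python
# Describing a ripple-carry adder with a higher-order "row" combinator.

def full_adder(carry_in, pair):
    """One full-adder cell: returns (carry_out, sum_bit)."""
    a, b = pair
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return carry_out, s

def row(cell):
    """Thread a carry through a list of inputs: the structural combinator."""
    def circuit(carry, pairs):
        outs = []
        for p in pairs:
            carry, out = cell(carry, p)
            outs.append(out)
        return carry, outs
    return circuit

ripple_adder = row(full_adder)
# Add 6 (bits 0,1,1 least-significant first) and 3 (bits 1,1,0):
ripple_adder(0, [(0, 1), (1, 1), (1, 0)])  # → (1, [1, 0, 0]), i.e. 6 + 3 = 9
```

The same `row` description, instantiated with a cell that emits gate names instead of computing values, could generate a netlist or a diagram, which is the point of working at this level.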

Toys for Animating Mathematics Geraint Jones Programming Languages B C

The aim is to take some mathematics that would be within the grasp of a mathematically-inclined sixth-former and turn it into some attention-grabbing web pages. Ideally this reveals a connection with computing science. I imagine that this project necessarily involves some sort of animation, and I have visions of Open University television maths lectures.

The programming need not be the most important part of this project, though, because some of the work is in choosing a topic and designing problems and puzzles and the like around it. There's a lot of this sort of thing about, though, so it would be necessary to be a bit original.

Think of GeomLab (and then perhaps think of something a little less ambitious). It might involve logic and proof, it might be about sequences and series, it might be about graphs, it might be about the mathematics of cryptography... you might have something else in mind.

Extensible Search-Based Theorem Prover Mark Kaminski Data and Knowledge C MSc Background: Search-based proof methods such as those based on DPLL, tableau and connection calculi are widely used in automated theorem proving, ontological reasoning, and automated verification. As most implementations of such methods are developed with a particular logic, algorithm, and set of applications in mind, it is often difficult to extend them with new features. Project goals: The project aims to develop a modular, extensible base for implementing search-based decision procedures for propositional, modal, and description logics. A key objective is to develop an abstract proof search engine that can be instantiated with different logics, proof rules, and search strategies. The capabilities of the engine should be demonstrated by implementing a simple tableau-based or DPLL-based reasoner for propositional or modal logic. Prerequisites: good functional programming skills, familiarity with logic and automated theorem proving.
A fast numerical solver for multiphase flow David Kay Computational Biology and Health Informatics B C MSc

The Allen-Cahn equation is a differential equation used to model the phase separation of two, or more, alloys. This model may also be used to model cell motility, including chemotaxis and cell division. The numerical approximation, via a finite difference scheme, ultimately leads to a large system of linear equations. In this project, using numerical linear algebra techniques, we will develop a computational solver for these linear systems. We will then investigate the robustness of the proposed solver.
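For intuition about where the discretisation comes from, the sketch below (plain Python; the parameter choices and boundary treatment are my own, purely illustrative) advances a one-dimensional Allen-Cahn equation, u_t = eps·u_xx + u − u³, by one explicit Euler step. An implicit scheme for the same equation is what produces the large linear systems this project solves.

```python
# One explicit finite-difference step for the 1D Allen-Cahn equation
#   u_t = eps * u_xx + u - u**3   (illustrative form and parameters).

def step(u, dt, dx, eps):
    """One explicit Euler step with mirrored (zero-flux) boundaries; len(u) >= 2."""
    n = len(u)
    new = u[:]
    for i in range(n):
        left = u[i - 1] if i > 0 else u[1]          # mirror at left boundary
        right = u[i + 1] if i < n - 1 else u[n - 2]  # mirror at right boundary
        lap = (left - 2 * u[i] + right) / dx ** 2    # discrete Laplacian
        new[i] = u[i] + dt * (eps * lap + u[i] - u[i] ** 3)
    return new

# The pure phases u = +1 and u = -1 are stationary: the reaction term
# u - u**3 vanishes there, so the state is unchanged.
step([1.0] * 5, 0.01, 0.1, 1.0)  # → [1.0, 1.0, 1.0, 1.0, 1.0]
```

The explicit step needs very small dt for stability; the implicit alternative trades that for solving a linear system per step, motivating the solver work.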

Efficient solution of one dimensional airflow models David Kay Computational Biology and Health Informatics B C MSc In this project we will investigate the use of numerical and computational methods to efficiently solve the linear system of equations arising from a one dimensional airflow model within a network of tree like branches. This work will build upon the methods presented within the first year Linear Algebra and Continuous Maths courses. All software to be developed can be in the student’s preferred programming language.  General area: Design and computational implementation of fast and reliable numerical models arising from biological phenomena.
The Square Root Law of Steganography: Empirical Validation Andrew Ker Security B C Steganography means hiding a hidden payload within an apparently-innocent cover, usually an item of digital media (in this project: images). Steganalysis is the art of detecting that hiding took place. A key question is how the amount of information that can be securely hidden (i.e. such that detectors have a high error rate) scales with the size of the cover. In 2008 I co-authored a paper showing that my theoretical "square root law" was observed experimentally, using state-of-the-art (for 2008) hiding and detection methods. This project is to run similar experiments using methods 10 years more modern. It would involve combining off-the-shelf code (some in MATLAB, some in Python) from various researchers and running fairly large scale experiments to measure detection accuracy versus cover size and payload size in tens of thousands of images, then graphing the results suitably. Prerequisites: No particular prerequisites. Ability to piece together others' code and draw graphs nicely.
Transparent Session-Layer Steganography Andrew Ker Security B C MSc Steganography means hiding a hidden payload within an apparently-innocent cover, usually an item of digital media. This project is to implement a transparent proxy which uses something like a webcam video stream to hide steganographic packets. A local process should receive communication on a local socket and merge it with the webcam data stream, a process which can be reversed at the receiver. Prerequisites: It is necessary to have some experience working with video codecs, for example experience contributing to the H.264/5 codec, in order to place the payload in a webcam stream. It is definitely NOT possible to learn the format within the time available. Don't ask if it is possible to read up on it instead: it isn't.
Twitter Pictures as a Steganographic Channel Andrew Ker Security B C MSc Steganography means hiding a hidden payload within an apparently-innocent cover, usually an item of digital media. Most pure research focuses on bitmap or JPEG images, or simple video codecs. In practice, there are often further constraints: the carrier might recompress or damage the object. This project is primarily programming. First we must characterize the properties of image transmission through Twitter, and then provide an implementation of image steganography through it. This might be a complete piece of software with a nice interface, if the channel is straightforward, or a proof-of-concept if the channel causes difficult noise (in which case we will need suitable error-correcting codes). Prerequisites: Linear algebra. You will need a Twitter account and to learn the Twitter API without help from me. It would be useful to know a little about the JPEG image format before starting the project.
Visualisation of Steganalysis Andrew Ker Security B C Steganography means hiding a hidden payload within an apparently-innocent cover, usually an item of digital media (in this project: images). Steganalysis is the art of detecting that hiding took place. The most effective ways to detect steganography are machine learning algorithms applied to "features" extracted from the images, trained on massive sets of known cover and stego objects. The images are thus turned into points in high-dimensional space. We have little intuition as to the geometrical structure of the features (do images form a homogeneous cluster? do they scale naturally with image size? how much and in what ways do images from different cameras differ?), or how they are altered under embedding (do they move in broadly the same direction? is there a linear separator of cover from stego?). This is a programming project that creates a visualization tool for features, extracting them from images and then projecting them onto 2-dimensional space in interesting ways, while illustrating the effects of embedding. Prerequisites: Linear Algebra, Computer Graphics. Machine Learning an advantage.
Bots for a Board Game Stefan Kiefer Automated Verification B C The goal of this project is to develop bots for the board game Hex. In a previous project, an interface was created to allow future students to pit their game-playing engines against each other. In this project the goal is to program a strong Hex engine. The student may choose the algorithms that underlie the engine, such as alpha-beta search, Monte-Carlo tree search, or neural networks. The available interface allows a comparison between different engines. It is hoped that these comparisons will show that the students' engines become stronger over the years. The project involves reading about game-playing algorithms, selecting promising algorithms and data structures, and the design and development of software (in Java or Scala).
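By way of illustration, alpha-beta search, one candidate for the engine's core, can be sketched over an explicit game tree (shown in Python for brevity, though the interface expects Java or Scala; the tree and its leaf values are invented):

```python
# Alpha-beta pruning over an explicit tree: leaves are numbers,
# internal nodes are lists of children.

def alphabeta(node, alpha, beta, maximizing):
    if isinstance(node, (int, float)):
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:
                break  # beta cut-off: opponent will avoid this branch
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, alpha, beta, True))
            beta = min(beta, value)
            if alpha >= beta:
                break  # alpha cut-off
        return value

# A three-level toy tree (max / min / max); its minimax value is 5.
tree = [[[3, 5], [6, 9]], [[1, 2], [0, -1]]]
alphabeta(tree, float("-inf"), float("inf"), True)  # → 5
```

A real Hex engine would replace the explicit tree with move generation on the board, add an evaluation function at a depth cut-off, and order moves to make the pruning effective.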
Small matrices for irrational nonnegative matrix factorisation Stefan Kiefer Automated Verification B C MSc Recently it was shown ( that there is a 6x11-matrix M with rational nonnegative entries so that for any factorisation M = W H (where W is a 6x5-matrix with nonnegative entries, and H is a 5x11-matrix with nonnegative entries) some entries of W and H need to be irrational. The goal of this project is to explore if the number of columns of M can be chosen below 11, perhaps by dropping some columns of M. This will require some geometric reasoning. The project will likely involve the use of tools (such as SMT solvers) that check systems of nonlinear inequalities for solvability. Prerequisites: The project is mathematical in nature, and a good recollection of linear algebra is needed. Openness towards programming and the use of tools is helpful.
Applications of deductive machine learning Daniel Kroening Automated Verification C MSc

The goal of deductive machine learning is to provide computers with the ability to automatically learn a behaviour that provably satisfies a given high-level specification. As opposed to techniques that generalise from incomplete specifications (e.g. examples), deductive machine learning starts with a complete problem description and develops a behaviour as a particular solution.

Potential applications of deductive machine learning are detailed below, and a student would focus on one of these items for their project. We envisage applying existing algorithms, with potential to develop new ones.

- Game playing strategy: given the specification of the winning criteria for a two-player game, learn a winning strategy.

- Program repair: given a buggy program according to a correctness specification, learn a repair that makes the program correct.

- Lock-free data structures: learn a data structure that guarantees the progress of at least one thread when executing multi-threaded procedures, thereby helping to avoid deadlock.

- Security exploit generation: learn code that takes advantage of a security vulnerability present in a given software in order to cause unintended behaviour of that software.

- Security/cryptographic protocol: learn a protocol that performs a security-related function and potentially applies cryptographic methods.

- Compression: learn an encoding for some given data that uses fewer bits than the original representation. This can apply to both lossless and lossy compression.
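As a toy instance of the first item above, a provably winning strategy for a simple subtraction game can be deduced exhaustively from the winning criterion alone. The game, the function names, and the move set below are my own illustration, not part of the project:

```python
from functools import lru_cache

# Subtraction game: players alternately remove 1-3 tokens from a pile;
# whoever takes the last token wins. The winning criterion fully
# determines a strategy, which we deduce by exhaustive recursion.

@lru_cache(maxsize=None)
def winning_move(n):
    """Return a move guaranteeing a win from n tokens, or None if n is losing."""
    for take in (1, 2, 3):
        # A move wins if it takes the last token, or leaves a losing position.
        if take <= n and (take == n or winning_move(n - take) is None):
            return take
    return None

# Multiples of 4 are losing for the player to move:
[winning_move(n) for n in range(1, 9)]  # → [1, 2, 3, None, 1, 2, 3, None]
```

The deduced table *is* the strategy, and its correctness follows directly from the recursion mirroring the winning criterion; scaling this idea to richer specifications is where the research lies.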

Probabilistic Model Checking Marta Kwiatkowska Automated Verification B C MSc

Professor Marta Kwiatkowska is happy to supervise projects involving probabilistic modelling, verification and strategy synthesis. This is of interest to students taking the Probabilistic Model Checking course and/or those familiar with probabilistic programming.

Below are some concrete project proposals, but students’ own suggestions will also be considered:

  • Synthesis of driver assistance strategies in semi-autonomous driving. Safety of advanced driver assistance systems can be improved by utilising probabilistic model checking. Recently ( a method was proposed for correct-by-construction synthesis of driver assistance systems. The method involves cognitive modelling of driver behaviour in ACT-R and employs PRISM. This project builds on these techniques to analyse complex scenarios of semi-autonomous driving such as multi-vehicle interactions at road intersections.
  • Equilibria-based model checking for stochastic games. Probabilistic model checking for stochastic games enables formal verification of systems where competing or collaborating entities operate in a stochastic environment. Examples include robot coordination systems and the Aloha protocol. Recently ( probabilistic model checking for stochastic games was extended to enable synthesis of strategies that are subgame perfect social welfare optimal Nash equilibria, soon to be included in the next release of PRISM-games ( This project aims to model and analyse various coordination protocols using PRISM-games.

  • Probabilistic programming for affective computing. Probabilistic programming facilitates the modelling of cognitive processes ( In a recent paper (, a probabilistic programming approach to affective computing was proposed, which enables cognitive modelling of emotions and executing the models as stochastic, executable computer programs. This project builds on these approaches to develop affective models based on, e.g., this paper (
Safety Assurance for Deep Neural Networks Marta Kwiatkowska Automated Verification B C MSc


Professor Marta Kwiatkowska is happy to supervise projects in the area of safety assurance and automated verification for deep learning, including Bayesian neural networks. For recent papers on this topic see, and, and also

Below are some concrete project proposals, but students’ own suggestions will also be considered:

  • Robustness of attention-based sentiment analysis models to substitutions. Neural network models for NLP tasks such as sentiment analysis are susceptible to adversarial examples. In a recent paper ( a method was proposed for verifying robustness of NLP tasks to symbol and word substitutions. The method was evaluated on CNN models. This project aims to develop similar techniques for attention-based NLP models (
  • Attribution-based safety testing of deep neural networks. Despite the improved accuracy of deep neural networks, the discovery of adversarial examples has raised serious safety concerns. In a recent paper ( a game-based method was proposed for robustness evaluation, which can be used to provide saliency analysis. This project aims to extend these techniques with the attribution method ( to produce a methodology for computing the causal effect of each feature and evaluate it on image data.
  • Uncertainty quantification for end-to-end neural network controllers. NVIDIA has created a deep learning system for end-to-end driving called PilotNet ( It inputs camera images and produces a steering angle. The network is trained on data from cars being driven by real drivers, but it is also possible to use the Carla simulator. In a recent paper ( a robustness analysis with statistical guarantees for different driving conditions was carried out for a Bayesian variant of the network. This project aims to develop a methodology based on these techniques and semantic transformation of weather conditions (see to evaluate the robustness of PilotNet or similar end-to-end controllers in a variety of scenarios.


An Innovative Network Scanning Framework Harjinder Lallie, Michael Goldsmith Security B C Networks often contain multiple vulnerabilities and weaknesses which can be exploited by attackers. Network scanning, probing and vulnerability testing aids the process of discovering and correcting network vulnerabilities. There exist a number of challenges in gathering network data in this manner:

- Scalability. Scanning/probing techniques do not scale well to large networks (Lippmann et al., 2006, Bopche and Mehtre, 2014).

- Semantic problems. A number of approaches – such as that by Roschke et al. (2009) and Cheng et al. (2011) – attempted to combine and correlate the results returned from multiple vulnerability databases. However, the databases return the results in different formats – some textual, others XML – which means that the results have to be unified into a common meaningful format.

- Data consolidation. Other than Roschke et al.'s (2009) study into combining network scanning methods, there have been very few studies which critically evaluate methods of combining the use and results of multiple scanning tools.

- Performance. Scanning and then combining the results from multiple scanners takes a lot of time (Cheng et al., 2011). Quite often the network configuration changes during the scan, which means that the results are quite often inaccurate.

This project addresses one or more of the challenges presented above. Prerequisites: This project involves practical work which may involve setting up a virtual network and applying a range of scanning, probing and vulnerability testing mechanisms on the network.
Calculating Network Security Metrics Harjinder Lallie, Michael Goldsmith Security B C The calculation of network security metrics is a complex problem which involves understanding the state and configuration of network connections, devices and protocols. This research analyses the problem of calculating network security metrics and proposes a framework which can calculate security metrics for a typical small network comprising numerous devices, operating systems and hosts. The project will involve the configuration of virtual networks for testing the framework. CVSS or other metric scores may be used to aid the calculation of the metric. This dissertation involves some practical work which may involve setting up a virtual network which has a number of machines on the network and applying a range of scanning, probing and vulnerability testing mechanisms on the network.
Analysing concurrent datatypes in CSP Gavin Lowe Automated Verification C The aim of this project would be to model some concurrent datatypes in CSP, and to analyse them using the model checker FDR. The concurrent datatypes could be some of those studied in the Concurrent Algorithms and Datatypes course. Typical properties to be proved would be linearizability (against a suitable sequential specification) and lock freedom. Prerequisites: Concurrency and Concurrent Algorithms and Datatypes. Reading: Analysing Lock-Free Linearizable Datatypes using CSP, Gavin Lowe.
Analysis of wildlife tracking data Andrew Markham Cyber Physical Systems MSc We have developed ultra-low power wireless tracking tags which have been deployed on badgers, hares and swans in the wild. The sensors mainly measure 3-D acceleration, but some data is also from our world-first underground 3-D tracking system. The sensors are typically sampled at 8 Hz and collars run for between 2 and 3 months. This has generated a large dataset (> 500 million samples) of animal behaviour traces. For some of these deployments, there are secondary (ground truth) data including video and RFID. The challenge is to develop tools to analyse these datasets and present results to end-users (zoologists) in a user-friendly way. Such a tool would perform classification (typically unsupervised) on behaviour traces, generating classes of behaviour. At a very coarse level, this could simply be sleeping vs active, but the richer the classification, the better for automatic generation of ethograms. This classification can also be used online to alter the behaviour of the tracking tags themselves, such as to reduce the sampling rate when an animal is resting to prolong the lifetime of the collar. Suitable for an MSc student interested in statistical analysis/data mining and interdisciplinary science.
Automatic Heap Layout Manipulation for Exploitation of Javascript Interpreters Tom Melham, Daniel Kroening Automated Verification C MSc In order to exploit heap-based buffer overflows an exploit developer must place the target object relative to the source of the overflow. This has been automated for situations in which the starting state of the heap is known and the memory allocator behaves deterministically. However, the allocators found in the Javascript interpreters of modern web browsers behave non-deterministically. This project aims to develop automated heap layout manipulation for such situations. The goal will be to build a system that takes as input a proof-of-concept exploit that triggers memory corruption on the heap, and produces as output a new exploit that places an 'interesting' corruption target after the overflow source with a high degree of probability. The proposed projects will expand on the work of the listed DPhil student (Sean Heelan). Prerequisites: * (Required) Ability to comprehend large systems written in C++ * (Required) Familiarity with operating systems concepts such as virtual memory, heap allocators and low level security vulnerabilities * (Optional) Some knowledge of modern web browser internals * (Optional) Some knowledge of exploitation of memory corruption vulnerabilities
Converting Vulnerabilities into Exploitation Primitives Tom Melham, Daniel Kroening Automated Verification C MSc

Fuzzing is a popular approach for producing inputs that cause a program to crash. In the case of heap overflow vulnerabilities, to use them in an exploit it is necessary to determine what can be corrupted by the overflow, how to manipulate the heap to ensure that this corruption occurs, and then how to make use of the corrupted data. The goal of this project is to investigate how to automate this process and develop a system that can convert a vulnerability trigger into a new input that can be used as part of an exploit to hijack the control flow of the target.

The proposed projects will expand on the work of the listed DPhil student (Sean Heelan).

Prerequisites: * (Required) Ability to comprehend C/C++ * (Required) Comfort developing an experimental system of moderate size * (Required) Familiarity with operating systems concepts such as virtual memory, heap allocators and low level security vulnerabilities * (Optional) Some knowledge of exploitation of memory corruption vulnerabilities

Assessing the performance of stochastic optimization methods in structural modelling Peter Minary Computational Biology and Health Informatics B C

The efficiency of Monte Carlo (MC) based stochastic optimization methods (e.g. Kirkpatrick et al., Science, 220, 671–680 (1983)) in finding low-energy conformational states of a given system will be compared. We are particularly interested in MC protocols where the temperature varies as a series of impulses followed by relaxation, and only the temperature of a given part (e.g. the site of interest) of the system is changed.

Prerequisites: Recommended for students who have done the Probability and Computing and Geometric Modelling courses and have an interest in randomized search methods and their applications in structural biology.

Developing computational tools to aid the design of CRISPR/Cas9 gene editing experiments Peter Minary Computational Biology and Health Informatics C MSc

At present, the most versatile and widely used gene-editing tool is the CRISPR/Cas9 system, which is composed of a Cas9 nuclease and a short oligonucleotide guide RNA (or guide) that directs the Cas9 nuclease to the targeted DNA sequence (on-target) through complementary binding. There are a large number of computational tools for designing highly specific and efficient CRISPR/Cas9 guides, but there is great variation in performance and a lack of consensus among the tools. We aim to use ensemble learning to combine the benefits of a selected set of guide design tools, to reach superior performance compared to any single method in correctly predicting the efficiency of guides for which experimental data is available.
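One simple baseline for combining tools that score on incompatible scales is rank aggregation. The sketch below is illustrative only: the scores and the number of tools are invented, and a real ensemble learner would fit weights against the experimental efficiency data rather than averaging uniformly.

```python
# Combine guide scores from several design tools by averaging each tool's
# ranking of the candidate guides (tool outputs are hypothetical).

def rank(scores):
    """Rank positions by score, highest first (rank 0 = best)."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    ranks = [0] * len(scores)
    for r, i in enumerate(order):
        ranks[i] = r
    return ranks

def ensemble(score_lists):
    """Average the per-tool ranks; a lower combined rank means a better guide."""
    all_ranks = [rank(s) for s in score_lists]
    n = len(score_lists[0])
    return [sum(r[i] for r in all_ranks) / len(all_ranks) for i in range(n)]

# Three tools score four candidate guides (higher score = better):
ensemble([[0.9, 0.2, 0.5, 0.1],
          [0.8, 0.3, 0.9, 0.2],
          [0.7, 0.1, 0.6, 0.4]])
```

Rank averaging sidesteps the calibration problem (each tool's raw scores mean different things), which is one reason it is a common first ensemble to try before learned weightings.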

Recommended for students who have done the Machine Learning and the Probability and Computing courses.


Software development for computational structural biology Peter Minary Computational Biology and Health Informatics B C

Several projects are available to implement new algorithms or protocols into the MOSAICS ( software package. The first project would aim at the development of new analysis tools to interpret the simulation results. For example, the new protocol would take as input a structural similarity measure and a trajectory of simulated conformations, and would produce as output a measure of the structural diversity of the conformations visited. In the second project, students would implement and compare different physical models to describe hydrogen bonding, which is among the most important canonical interactions that stabilize the double-helical DNA.


The project is recommended for students who took the Geometric Modelling and Computer Animation courses as well as have some interest in Numerical Methods (e.g. solution of Ordinary Differential Equations) and their applications to atomistic simulations.

Tools and applications in computational epigenetics Peter Minary Computational Biology and Health Informatics C

Chemical modifications such as (hydroxy)methylation on nucleic acids are used by the cell for silencing and activation of genes. These so-called epigenetic marks can be recognized by ‘protein readers’ indirectly due to their structural ‘imprints’, the effects they impose on DNA structure. The project includes the development of computational protocols to assess the effect of epigenetic modifications on DNA structure. This research may shed light on how different epigenetic modifications affect the helical parameters of the double-stranded DNA.


Prerequisites: Strong interest in visualizing, analysing and comparing 3D objects and modelling molecular structures. The project can be tailored to suit those from a variety of backgrounds but would benefit from having taken the following courses: Computer Graphics, Geometric Modelling and Computer Animation.

Topics in Automata Theory, Program Verification and Programming Languages (including lambda calculus and categorical semantics) Andrzej Murawski Programming Languages C MSc

Prof Murawski is willing to supervise in the area of automata theory, program verification and programming languages (broadly construed, including lambda calculus and categorical semantics).

For a taste of potential projects, follow this link.

An Electronic Commerce Protocol Hanno Nickau Foundations, Structures, and Quantum B C

Commercial use of the Internet is becoming more and more common, with an increasing variety of goods becoming available for purchase over the Net. Clearly, we want such purchases to be carried out securely: a customer wants to be sure of what (s)he's buying and the price (s)he's paying; the merchant wants to be sure of receiving payment; both sides want to end up with evidence of the transaction, in case the other side denies it took place; the act of purchase should not leak secrets, such as credit card details, to an eavesdropper.

The aim of this project is to find out more about the protocols that are used for electronic commerce, and to implement a simple e-commerce protocol. In more detail:

  • Understand the requirements of e-commerce protocols;
  • Specify an e-commerce protocol, in terms of both its functional and security requirements;
  • Understand cryptographic techniques;
  • Understand how these cryptographic techniques can be combined to create a secure protocol, and the weaknesses that allow some protocols to be attacked;
  • Design a protocol to meet the requirements identified;
  • Implement the protocol.
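As a flavour of the cryptographic building blocks involved, the following minimal Python sketch shows integrity protection of an order with a message authentication code. The function names and order encoding are illustrative; a real e-commerce protocol would use public-key signatures, since a shared-key MAC cannot provide the non-repudiation evidence described above.

```python
import hashlib
import hmac
import json

def sign_order(order, key):
    """MAC over a canonical encoding of the order, so tampering in
    transit is detectable by the holder of the shared key."""
    message = json.dumps(order, sort_keys=True).encode()
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify_order(order, key, tag):
    """Constant-time comparison to avoid timing side channels."""
    return hmac.compare_digest(sign_order(order, key), tag)
```

For example, changing the price after signing makes `verify_order` fail, which is exactly the property an attacker-resistant purchase message needs.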

A variant of this project would be to implement a protocol for voting on the web (which would have a different set of security properties).

Prerequisites for this project include good program design and implementation skills, including some experience of object-oriented programming, and a willingness to learn about protocols and cryptography. The courses on concurrency and distributed systems provide useful background for this project.

Reference: Jonathan Knudsen, Java Cryptography, O'Reilly, 1998.

Bootstrapping document annotation with Nadeschda Nikitina Data and Knowledge MSc

The recently started initiative of the major search engine providers aims at fostering semantic annotations across the Web; you can read about it here. Semi-automatic annotation of natural-language documents is a long-standing problem area. The goal of this project would be to apply state-of-the-art annotation techniques to a large corpus based on this semantic model.


Some familiarity with topics from Computational Linguistics and Knowledge Representation and Reasoning.

Factorised Databases Dan Olteanu Data and Knowledge B C MSc

More details can be found on Dr Olteanu's web page; he would be happy to discuss specific projects within the aforementioned topics with interested students.

Branching Temporal Logics, Automata and Games Luke Ong Programming Languages B C MSc

Model checking has emerged as a powerful method for the formal verification of programs. Temporal logics such as CTL (computation tree logic) and CTL* are widely used to specify programs because they are expressive and easy to understand. Given an abstract model of a program, a model checker (which typically implements the acceptance problem for a class of automata) verifies whether the model meets a given specification. A conceptually attractive method for solving the model checking problem is to reduce it to the solution of (a suitable subclass of) parity games, which are a type of two-player infinite game played on a finite graph. The project concerns the connections between the temporal logics CTL and/or CTL*, automata, and games. Some of the following directions may be explored:

1. Representing CTL / CTL* as classes of alternating tree automata.
2. Inter-translation between CTL / CTL* and classes of alternating tree automata.
3. Using Büchi games and other subclasses of parity games to analyse the CTL / CTL* model checking problem.
4. Efficient implementation of model checking algorithms.
5. Application of the model checker to higher-order model checking.
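For a taste of what a model checking implementation involves, the standard fixpoint characterisations of the CTL operators EX and E[p U q] can be computed directly on a small transition system. The following Python sketch is purely illustrative (states as a set, the transition relation as a dict from state to successors):

```python
def sat_EX(states, trans, sat_p):
    """States with at least one successor satisfying p."""
    return {s for s in states if any(t in sat_p for t in trans.get(s, ()))}

def sat_EU(states, trans, sat_p, sat_q):
    """Least fixpoint for E[p U q]: the q-states, plus p-states
    from which some path reaches them."""
    result = set(sat_q)
    changed = True
    while changed:
        changed = False
        for s in states:
            if (s not in result and s in sat_p
                    and any(t in result for t in trans.get(s, ()))):
                result.add(s)
                changed = True
    return result
```

The remaining CTL operators reduce to these plus Boolean combinations; CTL* and the game-based view require the automata machinery described above.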


Orna Kupferman, Moshe Y. Vardi, Pierre Wolper: An automata-theoretic approach to branching-time model checking. J. ACM 47(2): 312-360 (2000).

Rachel Bailey: A Comparative Study of Algorithmics for Solving Büchi Games. University of Oxford MSc Dissertation, 2010.

Foundational Structures for Concurrency Luke Ong Programming Languages MSc

Game Semantics and Linear Logic Luke Ong Programming Languages MSc

Lambda Calculus Luke Ong Programming Languages MSc


Luke Ong's Projects Luke Ong Programming Languages B C MSc

See his web page for more information.

Semantics of Programming Languages Luke Ong Programming Languages MSc

Types, Proofs and Categorical Logic Luke Ong Programming Languages MSc

Building databases of mathematical objects in Sagemath (Python) Dmitrii Pasechnik Algorithms and Complexity Theory B C MSc

There is an enormous amount of information on constructing various sorts of mathematical objects that are "interesting" in one way or another, e.g. block designs, linear and non-linear codes, Hadamard matrices, elliptic curves, etc. There is considerable interest in having this information available in computer-ready form. However, usually the only available form is a paper describing the construction, with no computer code (and often no detailed description of a possible implementation) provided. This creates interesting algorithmic and software engineering challenges in producing verifiable implementations; properly structured and documented code, supplemented by unit tests, has to be provided, preferably in a functional programming style (although performance is important too).


The Sagemath project aims in part to remedy this by implementing such constructions; see e.g. Hadamard matrices in Sagemath.


The project will contribute to such implementations.
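As a small example of the kind of verifiable, unit-tested construction the project calls for, the classical Sylvester doubling construction of Hadamard matrices can be implemented with its defining property checked alongside. This is a plain-Python sketch; Sagemath's own implementations are far more general.

```python
def sylvester_hadamard(k):
    """Hadamard matrix of order 2**k via Sylvester's doubling
    construction: H_{2n} = [[H_n, H_n], [H_n, -H_n]]."""
    H = [[1]]
    for _ in range(k):
        H = ([row + row for row in H]
             + [row + [-x for x in row] for row in H])
    return H

def is_hadamard(H):
    """Verify the defining property: entries are +-1 and
    distinct rows are orthogonal (H * H^T = n * I)."""
    n = len(H)
    return all(sum(H[i][t] * H[j][t] for t in range(n)) == (n if i == j else 0)
               for i in range(n) for j in range(n))
```

Pairing each construction with a checker like `is_hadamard` is precisely what makes an implementation verifiable in the sense described above.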

There may be a possibility of participation in Google Summer of Code (GSoC), with Sagemath as a GSoC organisation, and of being partially funded by the EU project "Open Digital Research Environment Toolkit for the Advancement of Mathematics".

Prerequisites: Interest in open source software, some knowledge of Python, some maths background.

Computing with semi-algebraic sets in Sage (math) Dmitrii Pasechnik Algorithms and Complexity Theory B C MSc

Semi-algebraic sets are subsets of R^n specified by polynomial inequalities. The project will extend the capabilities of Sage to deal with them, such as CAD (cylindrical algebraic decomposition) computations or sums-of-squares based (i.e. semi-definite programming based) methods. There may be a possibility of participation in Google Summer of Code (GSoC) or Google Semester of Code, with Sage as a GSoC organisation.


Interest in open source software, some knowledge of Python, appropriate maths background.

Implementing and experimenting with variants of Weisfeiler-Leman graph stabilisation Dmitrii Pasechnik Algorithms and Complexity Theory B C MSc Weisfeiler-Leman graph stabilisation is a procedure that plays an important role in modern graph isomorphism algorithms (L. Babai 2015) and has recently enjoyed attention in the machine learning community (cf. "Weisfeiler-Lehman graph kernels"). While it is relatively easy to implement, efficient portable implementations seem to be hard to find. In this project we will work on producing such an open-source implementation, either by reworking our old code or by producing a new one; we will also work on providing Python/Cython interfaces for possible inclusion in the SageMath library, and/or elsewhere.
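The basic (1-dimensional) colour refinement underlying Weisfeiler-Leman stabilisation is compact enough to sketch in a few lines of Python. This is an illustrative reference implementation, not the efficient portable one the project aims to produce:

```python
def wl_colours(adj, rounds=None):
    """1-dimensional Weisfeiler-Leman colour refinement.
    adj maps each vertex to its list of neighbours; returns the
    stable colouring reached by repeatedly hashing each vertex's
    colour together with the multiset of its neighbours' colours."""
    colours = {v: 0 for v in adj}
    for _ in range(rounds or len(adj)):
        signatures = {v: (colours[v], tuple(sorted(colours[u] for u in adj[v])))
                      for v in adj}
        # Relabel signatures with small integers, in a canonical order.
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        new = {v: palette[signatures[v]] for v in adj}
        if new == colours:   # stable: a fixed point has been reached
            break
        colours = new
    return colours
```

On a path 0-1-2, for instance, refinement separates the middle vertex from the two endpoints; comparing the resulting colour multisets of two graphs gives the isomorphism-rejection test used in WL graph kernels.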
Automatic translation to GPGPU Joe Pitt-Francis Computational Biology and Health Informatics B C MSc

This project involves running cardiac cell models on a high-end GPU card. Each model simulates the electrophysiology of a single heart cell and can be subjected to a series of computational experiments (such as being paced at particular heart rates). For more information about the science, and to see it in action on CPU, see the "Cardiac Electrophysiology Web Lab". An existing compiler (implemented in Python) is able to translate from a domain-specific XML language into a C++ implementation. The goal of the project is to add functionality to the compiler in order to generate OpenCL or CUDA implementations of the same cell models and thus to increase the efficiency of the Web Lab.

General GPGPU and high performance computing projects Joe Pitt-Francis Computational Biology and Health Informatics B C MSc

I am willing to supervise projects which fit into the general areas of General-Purpose Graphics Processing Unit (GPGPU) programming and High-Performance Computing (HPC). Specific technologies used are likely to be based around

  • NVIDIA CUDA for GPU programming;
  • MPI for distributed-memory cluster computing.

All application areas considered although geometric algorithms are favourites of mine.

General graphics projects Joe Pitt-Francis Computational Biology and Health Informatics B C MSc

I am interested in supervising general projects in the area of computer graphics.  If you have a particular area of graphics-related research that you are keen to explore then we can tailor a bespoke project for you.  Specific projects I have supervised in the past include

  • "natural tree generation" which involved using Lindenmayer systems to grow realistic looking bushes and trees to be rendered in a scene;
  • "procedural landscape generation" in which an island world could be generated on-the-fly using a set of simple rules as a user explored it;
  • "gesture recognition" where a human could control a simple interface using hand-gestures;
  • "parallel ray-tracing" on distributed-memory clusters and using multiple threads on a GPU card;
  • "radiosity modelling" used for analysing the distribution of RFID radio signal inside a building; and
  • "non-photorealistic rendering" where various models were rendered with toon/cel shaders and a set of pencil-sketch shaders.

MSc students should note that in order for this option to work as a potential MSc project then it should be combined with a taught-course topic such as machine learning, concurrent programming, linguistics etc.

Graphics pipeline animator Joe Pitt-Francis Computational Biology and Health Informatics B

Prerequisites: Computer graphics, Object-oriented programming

The idea behind this project is to build an educational tool which enables the stages of the graphics pipeline to be visualised. One might imagine the pipeline being represented by a sequence of windows; the user is able to manipulate a model in the first window and watch the progress of her modifications in the subsequent windows. Alternatively, the pipeline might be represented by an annotated slider widget; the user inputs a model and then moves the slider down the pipeline, watching an animation of the process.

Intuitive exploration through novel visualisation Joe Pitt-Francis Computational Biology and Health Informatics B C I am interested in novel visualisation as a way to represent things in a more appealing and intuitive way. For example, the Gnome disk usage analyser (Baobab) uses either a "ring chart" or a "treemap chart" representation to show us which sub-folders are using the most disk space. In the early 1990s the IRIX file system navigator used a 3D skyscraper representation to show similar information. There are plenty more ways of representing disk usage, from DAGs to centroidal Voronoi diagrams. What kind of representation is most intuitive for finding a file which is hogging disk space, and which is most intuitive for helping us to remember where something is located in the file-system tree? The aim is to explore other places where visualisation can provide intuition: for example, visualising the output of a profiler to find bottlenecks in software, visualising the output of a code coverage tool in order to check that test suites are testing the appropriate functionality, or even visualising the prevalence of diabetes and heart disease in various regions of the country.
Computation of particle interactions in n-dimensional space Martin Robinson Computational Biology and Health Informatics C MSc

Aboria is a C++ library for evaluating and solving systems of equations that can be described as interactions between particles in n-dimensional space. It can be used as a high-performance library to implement numerical methods such as molecular dynamics in computational chemistry, or Gaussian processes for machine learning.

Project 1: Aboria features a radial neighbour search to find nearby particles in the n-dimensional space, in order to calculate their interactions. This project will implement a new algorithm based on calculating the interactions between neighbouring *clusters* of particles. Its performance will be compared against the existing implementation, and across the different spatial data structures used by Aboria (Cell-list, Octree, Kdtree). Prerequisites: C++
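The radial neighbour search that Project 1 builds on can be illustrated with a minimal cell-list scheme. This is purely a Python sketch of the classical technique to convey the idea; Aboria's actual implementation is templated C++ over several spatial data structures.

```python
import math
from collections import defaultdict

def cell_list_neighbours(points, radius):
    """Radial neighbour search with a uniform grid ('cell list') in
    n dimensions: bin points into cells of side `radius`, then only
    compare points in adjacent cells.  Returns the set of unordered
    index pairs (i, j) at distance <= radius."""
    cells = defaultdict(list)
    for i, p in enumerate(points):
        cells[tuple(int(math.floor(x / radius)) for x in p)].append(i)

    def nearby(cell):
        # All cells within one step of `cell` in every dimension.
        offsets = [()]
        for _ in cell:
            offsets = [o + (d,) for o in offsets for d in (-1, 0, 1)]
        return [tuple(c + o for c, o in zip(cell, off)) for off in offsets]

    pairs = set()
    for cell, members in list(cells.items()):
        for other in nearby(cell):
            for i in members:
                for j in cells.get(other, ()):
                    if i < j and sum((a - b) ** 2
                                     for a, b in zip(points[i], points[j])) <= radius ** 2:
                        pairs.add((i, j))
    return pairs
```

The cluster-based algorithm the project proposes would replace the point-pair loop here with interactions between whole cells (or tree nodes), which is where the performance comparison across cell lists, octrees and kd-trees comes in.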

Project 2: Aboria features a serial Fast Multipole Algorithm (FMM) for evaluating smooth long range interactions between particles. This project will implement and profile a parallel FMM algorithm using CUDA and/or the Thrust library.

Prerequisites: C++, Knowledge of GPU programming using CUDA and/or Thrust

Project 3: The main bottleneck of the FMM is the interactions between well-separated particle clusters, which can be described as low-rank matrix operations. This project will explore different methods of compressing these matrices in order to improve performance, using either Singular Value Decomposition (SVD), randomised SVD, or Adaptive Cross Approximation.

Prerequisites: C++, Linear Algebra

Statistical shape atlas of cardiac anatomy Blanca Rodriguez, Pablo Lamata Computational Biology and Health Informatics MSc

Description: Cardiac remodelling is the change in the shape of the anatomy due to disease processes. 3D computational meshes encode shape variation in cardiac anatomy, and offer higher diagnostic value than conventional geometrical metrics. Better shape metrics will enable earlier detection, more accurate stratification of disease, and more reliable evaluation of remodelling response. This project will contribute to the development of a toolkit for the construction of anatomical atlases of cardiac anatomy, and to its translation to clinical adoption. The student will learn about the challenges and opportunities of the cross-disciplinary field between image analysis, computational modelling and cardiology, and be part of a project with big potential impact on the management of cardiovascular diseases.

Prerequisites: Motivation. Good programming skills. Experience with computational graphics and image analysis is an advantage.

A High-Level Scripting Language for AudioMoth Alex Rogers Cyber Physical Systems MSc AudioMoth is a low-cost open-source smart acoustic sensor used for biodiversity and environmental sensing. There are 10,000 AudioMoth devices in use today, the majority of which run the standard firmware, which provides scheduled recording. We want to open up AudioMoth to ecologists and environmental scientists who wish to use it as a generic data-logging platform; to do so we need to provide a high-level scripting language that is easy to use and familiar to users of Python and R (currently all AudioMoth code must be written in low-level embedded C). This project will deliver this language (possibly a new language, or the porting of an existing language, such as MicroPython, to the AudioMoth platform) and an associated standard library that make reprogramming and reconfiguring AudioMoth simple and straightforward.
A functional/dataflow programming language for acoustic signal processing Alex Rogers Cyber Physical Systems MSc Many detection tasks for acoustic sensors deployed in environmental monitoring (for example, triggering an extended recording after analysing a small sample of sound in order to capture the sound of a particular species of animal, or to detect human activity such as hunting and logging) can be readily described using functional or dataflow programming languages. Prototyping algorithms at this level is intuitive, and with the correct tools can be interactive, enabling non-specialist programmers to make rapid progress. However, to run on small embedded devices the resulting program typically needs to be converted to low-level C code. In this project, you will build a toolchain to automate this process. The resulting system will allow a functional or dataflow programming language (possibly one of your own design) to be used to prototype and evaluate acoustic algorithms. It will then automatically generate low-level C code that can be compiled to run on an open-source resource-constrained acoustic sensor. Prerequisites: Compilers and Principles of Programming Languages useful but not essential.
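A minimal version of such a detection pipeline, written here in prototyping-level Python rather than the generated C, might smooth per-frame signal energy and fire when it crosses a threshold. The function names, the smoothing scheme, and the parameters are illustrative assumptions, not part of any existing toolchain:

```python
def energy(frame):
    """Mean squared amplitude of one audio frame."""
    return sum(x * x for x in frame) / len(frame)

def trigger(frames, threshold, smooth=0.9):
    """Dataflow-style detector: exponentially smooth the per-frame
    energy and yield the indices of frames where the smoothed level
    exceeds `threshold` (e.g. to start an extended recording)."""
    level = 0.0
    for i, frame in enumerate(frames):
        level = smooth * level + (1 - smooth) * energy(frame)
        if level > threshold:
            yield i
```

Each stage (framing, energy, smoothing, thresholding) is a pure function over a stream, which is what makes this style of algorithm a natural fit for automatic translation to embedded C.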
An operating system for embedded devices Alex Rogers Cyber Physical Systems MSc Most small embedded devices do not separate high-level application code from the low-level device-specific code that interacts with hardware peripherals. As a result, small changes to application logic often result in the entire firmware of a device being updated. In many settings this is inconvenient and results in inefficient use of constrained resources such as communication bandwidth. In this project, you will design and implement a low-level operating system for embedded devices that separates relocatable application logic from low-level I/O logic. The resulting operating system should allow easy access to only that hardware functionality that the device designer wishes to make available to application developers. The project will target a typical small micro-controller for embedded applications that uses an ARM M0+ core with 64KB flash and 8KB RAM. Prerequisites: Digital Systems and some knowledge of C programming are useful but not essential.
Deploying TensorFlow Lite on AudioMoth Alex Rogers Cyber Physical Systems MSc The use of convolutional and deep neural networks on computationally constrained hardware has received much attention recently with the development of TensorFlow Lite, which targets small ARM devices. This project will explore the use of this framework for audio detection tasks. The intended platform is AudioMoth, a low-cost open-source smart acoustic sensor used for biodiversity and environmental sensing; a typical application would involve training a neural network to recognise the sound of a particular species of bird such that only recordings of that species are captured by the device. This project will develop a complete toolchain to allow a model to be trained and tested on a desktop machine using library recordings, and then ported to run on the AudioMoth device (constrained by its limited RAM and flash storage).
Educational tools for micro:bit Alex Rogers Cyber Physical Systems MSc The micro:bit is an ARM-based embedded development board intended for STEM education. To date over two million micro:bit devices have been manufactured and distributed. In this project, you will work with the Micro:bit Educational Foundation to extend the capabilities of this device. Potential projects include (i) developing interaction mechanisms to facilitate local large-scale multi-player games and simulations (e.g. modelling virus contagion throughout a classroom or synchronisation of fireflies) using the existing I/O and peer-to-peer networking capabilities of the micro:bit board, (ii) a data-logging application that allows micro:bit devices to collect experimental data, exposes that data to a webpage through a WebUSB interface, and presents useful data analysis and plotting tools, or (iii) deploying small convolutional neural networks on the device to perform image recognition tasks using a low-resolution camera made from an optical mouse sensor. You will be encouraged to explore other ideas of your own. The focus is to provide a complete end-to-end solution, balancing the needs of both students and instructors, that can ultimately be incorporated into publicly available micro:bit educational resources. Prerequisites: No specific prerequisites.
Applications of ad hoc security Bill Roscoe Security MSc My group have developed novel technology for developing security between pairs and groups of devices such as mobile phones. This technology clearly has many potential applications: the objective of this project is to build an implementation directed at an application of the student's choosing, for example email, healthcare or social networking. See my papers with Bangdao Chen, Ronald Kainda and Long Nguyen on my web page. Prerequisites: Security
Developing and using SVA Bill Roscoe Security MSc SVA is a front end for FDR that analyses shared-variable programs. See Chapters 18 and 19 of Understanding Concurrent Systems. It has the potential to support a number of projects, including: (a) building a version that accepts a subset of some standard programming language as opposed to the custom language it presently uses; (b) extending its functionality; (c) case studies using it. (a) and (b) would both involve programming in CSP and Java, the two languages SVA is written in. Prerequisites: Concurrency for (a) and (b)
Modelling and verifying systems in Timed CSP and FDR Bill Roscoe Security B C MSc

Timed CSP reinterprets the CSP language in a real-time setting and has a semantics in which the exact times of events are recorded as well as their order. Originally devised in the 1980s, it has only just been implemented as an alternative mode for FDR. The objective of this project is to take one or more examples of timed concurrent systems from the literature, implement them in Timed CSP, and where possible compare the performance of these models with similar examples running on other analysis tools such as Uppaal.


(Reference: Understanding Concurrent Systems, especially Chapter 15, and Model Checking Timed CSP, from AWR's web list of publications)

Automatically synthesizing Spark programs Amir Shaikhha Data and Knowledge B C MSc There are different ways to write a query processing algorithm (such as a join algorithm) using Spark. Depending on the data size and the number of worker nodes, the best Spark implementation may vary. The aim of this project is to use program synthesis techniques to automatically derive the most efficient Spark program for the given data characteristics and the given platform.
Data Compression Schemes as Compiler Transformations Amir Shaikhha Data and Knowledge B C Compression algorithms can improve the runtime performance and memory usage of database systems. This improvement is more obvious in the context of column-store database systems. On the other hand, with the advent of in-memory database systems, query compilers are getting more important. As a result, using compilation techniques in database development is gaining more attention. DBLAB is a framework for building database systems using high-level programming, producing optimized code in a low-level language like C. This is achieved through compilation techniques implemented in an extensible optimizing compiler called SC (Systems Compiler). The suggested project combines these two aspects. To be more concrete, we would like to implement different compression schemes using compilation transformations, which will be encoded using SC and added to DBLAB.
Model-Based Machine Learning for Optimizing Data Analytics Amir Shaikhha Data and Knowledge B C MSc Model-based machine learning advocates for using probabilistic models to learn patterns on data. Probabilistic programming languages are the key frameworks for expressing such machine learning models. The aim of this project is to use these frameworks for optimizing data analytics workloads. More specifically, the focus of this project is on finding more precise estimates for cardinalities of intermediate relations in database queries, using probabilistic programming languages.
A simple object-oriented language Michael Spivey Programming Languages B C MSc

Use Keiko to implement a simple language that is purely object-oriented. Study the compromises that must be made to get reasonable performance, comparing your implementation with Smalltalk, Ruby or Scala.

Please see Prof Spivey's web page for further details.

N.B. This project is not available to MSc students in 2016-17.

Undergraduate students who wish to enquire about a project for 2017-18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical.

Better JIT translation Michael Spivey Programming Languages B C MSc

The existing JIT for Keiko is very simple-minded, and does little more than translate each bytecode into the corresponding machine code. Either improve the translation by using one of the many JIT libraries now available, or adjust the Oberon compiler and the specification of the bytecode machine to free it of restrictive assumptions and produce a better pure-JIT implementation.

Please see Prof Spivey's web page for further details.

N.B. This project is not available to MSc students in 2016-17.

Undergraduate students who wish to enquire about a project for 2017-18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical.

Better performance for GeomLab Michael Spivey Programming Languages B C MSc

At present, GeomLab programs show a performance that competes favourably with Python, making it possible to address tasks like computing images of the Mandelbrot set using a purely functional program that calls a function once for each pixel. But there is still a gap between the performance of GeomLab programs and similar ones written in Java or C, and more ambitious image-processing tasks would be made possible by better performance, particularly in the area of arithmetic. Explore ways of improving performance, perhaps including the possibility of improving the performance of GeomLab by allowing numbers to be passed around without wrapping them in heap-allocated objects, or the possibility of compiling the code for Haskell-style pattern matching in a better way.

Please see Prof Spivey's web page for further details.

N.B. This project is not available to MSc students in 2016-17.

Undergraduate students who wish to enquire about a project for 2017-18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical.

Eliminating array bounds checks Michael Spivey Programming Languages B C MSc

The Oberon compiler inserts code into every array access and every pointer dereference to check for runtime errors, like a subscript that is out of bounds or a pointer that is null. In many cases, it is possible to eliminate the checks because it is possible to determine from the program that no error can occur. For example, an array access inside a FOR loop may be safe given the bounds of the loop, and several uses of the same pointer in successive statements may be able to share one check that the pointer is non-null. Modify the Oberon compiler (or a simpler one taken from the Compilers labs) so that it represents the checks explicitly in its IR, and introduce a pass that removes unnecessary checks, so speeding up the code without compromising safety.
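The flavour of such a pass can be sketched on a toy IR. The tuple instruction format here is purely illustrative, not the Oberon compiler's actual intermediate representation:

```python
def eliminate_checks(ir):
    """Toy optimisation pass: drop ('check', var, lo, hi) instructions
    when an enclosing ('for', var, lo, hi) loop already guarantees the
    variable stays within (at least) those bounds."""
    known = []   # stack of (var, lo, hi) facts from enclosing FOR loops
    out = []
    for op in ir:
        kind = op[0]
        if kind == 'for':
            known.append(op[1:])
            out.append(op)
        elif kind == 'endfor':
            known.pop()
            out.append(op)
        elif kind == 'check':
            var, lo, hi = op[1:]
            # Redundant if some enclosing loop range is contained in [lo, hi].
            if any(v == var and l >= lo and h <= hi for v, l, h in known):
                continue
            out.append(op)
        else:
            out.append(op)
    return out
```

A real pass would of course also track assignments that invalidate facts, propagate ranges through arithmetic, and share null-pointer checks across successive uses, as described above.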

Please see Prof Spivey's web page for further details.

N.B. This project is not available to MSc students in 2016-17.

Undergraduate students who wish to enquire about a project for 2017-18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical.

GeomLab and Mindstorms Michael Spivey Programming Languages B C MSc

GeomLab has a turtle graphics feature, but the pictures are drawn only on the screen. It should be possible to make a turtle out of Lego Mindstorms, then control it with an instance of GeomLab running on a host computer, with communication over Bluetooth.

Please see Prof Spivey's web page for further details.

N.B. This project is not available to MSc students in 2016-17.

Undergraduate students who wish to enquire about a project for 2017-18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical.

GeomLab on Android Michael Spivey Programming Languages B C MSc

Produce an implementation of GeomLab's GUI and graphics library that works on the Android platform. Either use an interpreter for GeomLab's intermediate code to execute GeomLab programs, or investigate dynamic translation of the intermediate code into code for Android's virtual machine Dalvik.

Please see Prof Spivey's web page for further details.



Heap-based activation records Michael Spivey Programming Languages B C MSc

At present, Keiko supports only conventional Pascal-like language implementations that store activation records on a stack. Experiment with an implementation where activation records are heap-allocated (and therefore recovered by a garbage collector), procedures are genuinely first-class citizens that can be returned as results in addition to being passed as arguments, and tail recursion is optimised seamlessly.

Please see Prof Spivey's web page for further details.



Keiko on Mindstorms Michael Spivey Programming Languages B C MSc

Alternative firmware for the Mindstorms robot controller provides an implementation of the JVM, allowing Java programs to run on the controller, subject to some restrictions. Using this firmware as a guide, produce an interpreter for a suitable bytecode, perhaps some variant of Keiko, allowing Oberon or another robot language of your own design to run on the controller. Aim to support the buttons and display at first, and perhaps add control of the motors and sensors later.

Please see Prof Spivey's web page for further details.



Type-checking for GeomLab Michael Spivey Programming Languages B C MSc

The GeomLab language is untyped, leading to errors when expressions are evaluated that would be better caught at an earlier stage. Most GeomLab programs, however, follow relatively simple typing rules. The aim in this project is to write a polymorphic type checker for GeomLab and integrate it into the GeomLab system, which is implemented in Java. A simple implementation of the type-checker would wait until an expression is about to be evaluated, and type-check the whole program at that point. As an extension of the project, you could investigate whether it is possible to type-check function definitions one at a time, even when some of the functions they call have not yet been defined.
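At the heart of such a polymorphic checker is unification of type expressions. A minimal Python sketch follows; the encoding of types (base types as strings, variables as strings starting with '?', function types as tuples) is an illustrative assumption, and a full checker would add an occurs-check and let-generalisation:

```python
def is_var(t):
    # Convention for this sketch: type variables are strings like '?a'.
    return isinstance(t, str) and t.startswith('?')

def unify(t1, t2, subst=None):
    """Robinson-style unification, the engine of a polymorphic
    (Hindley-Milner style) type checker.  Returns a substitution
    mapping variables to types, or raises TypeError on a clash."""
    subst = dict(subst or {})

    def resolve(t):
        while is_var(t) and t in subst:
            t = subst[t]
        return t

    stack = [(t1, t2)]
    while stack:
        a, b = stack.pop()
        a, b = resolve(a), resolve(b)
        if a == b:
            continue
        if is_var(a):
            subst[a] = b            # no occurs-check in this sketch
        elif is_var(b):
            subst[b] = a
        elif isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
            stack.extend(zip(a, b))
        else:
            raise TypeError("cannot unify %r with %r" % (a, b))
    return subst
```

For example, unifying `('->', '?a', 'int')` with `('->', 'bool', '?b')` binds `?a` to `bool` and `?b` to `int`, which is exactly the inference step needed to check an application one definition at a time.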

Please see Prof Spivey's web page for further details.



MSc and Undergraduate Projects Bernard Sufrin Programming Languages MSc

Bernard Sufrin's most recent collection of detailed project descriptions can always be found on his web page. In addition to the projects described there, he is happy to discuss supervising projects in concurrent programming, user-interface design, programming language implementation, program transformation, and proof support.

Parallel Reasoning in Sequoia David Tena Cucala, Bernardo Cuenca Grau Artificial Intelligence and Machine Learning, Data and Knowledge C MSc

Sequoia is a state-of-the-art system, developed in the Information Systems Group at the University of Oxford, for automated reasoning in OWL 2, a standard ontology language for the Semantic Web. Sequoia uses a consequence-based calculus, a novel and promising approach towards fast and scalable reasoning. Consequence-based algorithms are naturally amenable to parallel reasoning; however, the current version of Sequoia does not take advantage of this possibility. This project aims at implementing and testing parallel reasoning capabilities in Sequoia. The student involved in this project will acquire knowledge of state-of-the-art techniques for reasoning in OWL 2. They will develop an implementation that exploits the characteristics of Sequoia's underlying algorithm to achieve parallel reasoning, and they will evaluate the system's performance against other state-of-the-art reasoners.


Only suitable for someone who has done the Knowledge Representation and Reasoning course. Experience with concurrent programming and/or Scala is desirable.

ARETHA: A Transparent and Respectful Virtual Assistant for the Home Max Van Kleek Human Centred Computing B C MSc Virtual assistants like Alexa and Siri have become hugely popular conveniences in homes worldwide. However, they also pose unprecedented privacy risks by being black-box vehicles of corporate surveillance by the likes of Amazon, Apple and others. Is it possible to design a virtual assistant that can be guaranteed to protect user privacy, and be fully transparent and controllable by end-users? This project will involve designing and developing a fully modular, open source virtual assistant called ARETHA from the ground up, using open source components, which can provide strong guarantees about being privacy-preserving and pro-user.
Algorithmic Explanations Max Van Kleek, Jun Zhao Human Centred Computing B C MSc Machine learning is a range of methods for classifying or predicting real-world phenomena using data. When ML models are used to predict employee success, or set insurance prices, they can have significant effects on people's lives. Understandably, in such circumstances people might want explanations for why a system has given them a negative score. Various methods have been proposed to generate interpretable ML models, including systematically perturbing the inputs to observe effects on the output, or providing power indices for each input. However, few of these techniques have been translated into user-friendly explanation interfaces. In this project, you would explore ways to improve explanation systems. This could involve generating novel explanation methods; translating existing ones into graphical or natural language formats; and/or testing them out on users for different purposes.
App and IoT X-Ray Max Van Kleek, Jun Zhao Human Centred Computing B C MSc The X-Ray project enables people to see all of the first and third parties that receive data from the smartphone apps and IoT devices in their homes, so that they can make better privacy decisions. There are several pieces to the X-Ray: the Observatory, which discovers and retrieves smartphone apps; the analyser, a static analysis method for deriving information flow models from these apps; and end-user clients, which provide visualisations and ways for people to understand these models in the context of specific privacy choices. In this project, you will be invited to extend the X-Ray in some way: such as by introducing a new module for the Observatory (to ingest new kinds of content), extending the analyser, or creating a new way for end-users to view and use the models derived.
Privopticon: A privacy-preserving mesh OS for self-observation Max Van Kleek Human Centred Computing B C MSc

Self-monitoring has many potential applications within the home, such as the ability to understand important health and activity rhythms within a household automatically. But such monitoring activities also carry extreme privacy risks. Can we design new kinds of sensing architectures that preserve inhabitants' privacy? We have made initial inroads with a new mesh operating system prototype for Raspberry Pi hardware, supporting new classes of privacy-preserving self-monitoring applications for the home, which will provide user-configurable degrees of information fidelity, have built-in forgetting, and be accountable by design. We need your help to implement and evaluate different methods for achieving these goals.

3D demos of geometric concepts Irina Voiculescu Computational Biology and Health Informatics B The Geometric Modelling course (Part B) deals with several interesting concepts which would be best visualised in a suite of applications. A coherent suite of 3D demos could easily become a useful tool for this course, as well as for users worldwide. This project would be most suitable for a candidate who already has some experience using a 3D graphics library of their choice and wants to improve this skill. The mathematical concepts are well-documented.
3D environment for Hand Physiotherapy Irina Voiculescu Computational Biology and Health Informatics B C After hand surgery, patients almost always require physiotherapy to help with their recovery. As part of this, the patient will need to perform hand exercises at home. However, the patient may not always do the exercises correctly, or they might forget to do them. The goal of this project is to use the Leap Motion to create a user-friendly GUI which a patient could use to aid them with their home exercises. The interface would show the user where their hand should be and they would then need to follow the movements. It could be web-based or a downloadable application. It would need to be tailored to the patient so it contained their specific required exercises, which could be input by the physiotherapist. It would need to store data on how the patient is doing and feed this data back to the patient, and possibly also to the physiotherapist via the internet. If internet-based, patient confidentiality and security would need to be considered. This project would be performed in close collaboration with a physiotherapist, an orthopaedic hand surgeon, and a post-doctoral researcher based at the Nuffield Orthopaedic Centre.
3D printing medical scan data Irina Voiculescu Computational Biology and Health Informatics B C MSc

Computed tomography (CT) scanning is a ubiquitous scanning modality. It produces volumes of data representing internal parts of a human body. Scans are usually output in a standard imaging format (DICOM) and come as a series of axial slices (i.e. slices across the length of the person's body, in planes perpendicular to the imaginary straight line along the person's spine).

The slices most frequently come at a resolution of 512 x 512 voxels, achieving an accuracy of about 0.5 to 1mm of tissue per voxel, and can be viewed and analysed using a variety of tools. The distance between slices is a parameter of the scanning process and is typically much larger, about 5mm.

During the analysis of CT data volumes it is often useful to correct for the large spacing between slices. For example, when preparing a model for 3D printing, the axial voxels would appear elongated; this can be corrected through an interpolation process along the spinal axis.
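As a toy illustration of the idea (hypothetical code, not the group's DICOM software; a real pipeline would read the slice spacing from the scan headers and work on numpy volumes), linear interpolation between two adjacent slices could look like this:

```python
# A minimal sketch of linear interpolation between two adjacent axial
# slices, each represented as a 2D list of voxel intensities.
def interpolate_slices(s0, s1, n_new):
    """Insert n_new evenly spaced slices between s0 and s1."""
    out = []
    for k in range(1, n_new + 1):
        t = k / (n_new + 1)            # fractional position between slices
        out.append([[(1 - t) * a + t * b for a, b in zip(r0, r1)]
                    for r0, r1 in zip(s0, s1)])
    return out

# Two 2x2 slices 5 mm apart, resampled to ~1 mm spacing (4 new slices):
s0 = [[0, 0], [0, 0]]
s1 = [[10, 10], [10, 10]]
mid = interpolate_slices(s0, s1, 4)
print(mid[1][0][0])  # 4.0  (second inserted slice, t = 0.4)
```

Higher-order schemes (e.g. cubic interpolation along the spinal axis) would give the smoother surfaces the project asks for.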

This project is about the interpolation process, either in the raw data output by the scanner, or in the post-processed data which is being prepared for further analysis or 3D printing.

The output models would ideally be files in a format compatible with 3D printing, such as STL. The main aesthetic feature of the output would be its smoothness, measured by a factor parameterisable by the user.

Existing DICOM image analysis software designed within the Spatial Reasoning Group at Oxford is available to use as part of the project.

3D stereo display of medical scan data Irina Voiculescu, Stuart Golodetz Computational Biology and Health Informatics B C The Medical Imaging research group has been working with a variety of data sourced from CT and MRI scans. This data comes in collections of (generally greyscale) slices which together make up 3D images. Our group has developed software to generate 3D models of the major organs in these images. This project aims to develop a simple augmented reality simulation for the Oculus Rift which will render these organs within a transparent model of a human and allow the user to walk around the model so as to view the organs from any angle. This has a number of possible applications, including to train medical students and to help surgeons to explain medical procedures to their patients.
Exact Algorithms for Complex Root Isolation Irina Voiculescu Computational Biology and Health Informatics B C MSc

Not available in 2013/14

Isolating the complex roots of a polynomial can be achieved using subdivision algorithms. Traditional Newton methods can be applied in conjunction with interval arithmetic. Previous work (jointly with Prof Chee Yap and MSc student Narayan Kamath) has compared the performance of three operators: Moore's, Krawczyk's and Hansen-Sengupta's. This work makes extensive use of the CORE library, which is a collection of C++ classes for exact computation with algebraic real numbers and arbitrary precision arithmetic. CORE defines multiple levels of operation over which a program can be compiled and executed. Each of these levels provides stronger guarantees on exactness, traded against efficiency. Further extensions of this work include (but are not limited to): (1) extending the range of applicability of the algorithm at CORE's Level 1; (2) making an automatic transition from CORE's Level 1 to the more detailed Level 2 when extra precision becomes necessary; (3) designing efficiency optimisations to the current approach (such as confirming a single root or analysing areas potentially not containing a root with a view to discarding them earlier in the process); (4) tackling the isolation problem using a continued fraction approach. The code has been included and is available within the CORE repository. Future work can continue to be carried out in consultation with Prof Yap at NYU.
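To give a flavour of the subdivision approach, here is a deliberately simplified real-line sketch in Python, using plain interval arithmetic and an exclusion test rather than CORE's exact types or the complex-interval Newton operators named above:

```python
# Interval arithmetic on (lo, hi) pairs.
def iadd(a, b): return (a[0] + b[0], a[1] + b[1])
def imul(a, b):
    ps = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(ps), max(ps))

def ieval(coeffs, x):
    """Horner evaluation of a polynomial over an interval x.
    Coefficients are listed highest degree first."""
    acc = (coeffs[0], coeffs[0])
    for c in coeffs[1:]:
        acc = iadd(imul(acc, x), (c, c))
    return acc

def isolate(coeffs, lo, hi, tol=1e-6):
    """Subdivision: discard boxes whose interval value excludes zero,
    and output the surviving boxes once they are small enough."""
    boxes, roots = [(lo, hi)], []
    while boxes:
        a, b = boxes.pop()
        v = ieval(coeffs, (a, b))
        if v[0] > 0 or v[1] < 0:       # 0 not in f([a, b]): no root here
            continue
        if b - a < tol:
            roots.append((a, b))
        else:
            m = (a + b) / 2
            boxes += [(a, m), (m, b)]
    return roots

print(isolate([1, 0, -2], 0.0, 2.0))   # one tiny box around sqrt(2)
```

The operators studied in the project (Moore, Krawczyk, Hansen-Sengupta) replace the naive exclusion test with interval Newton steps that both discard boxes faster and certify uniqueness of a root within a box.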

Gesture recognition using Leap Motion Irina Voiculescu Computational Biology and Health Informatics B C MSc

Scientists in the Experimental Psychology Department study patients with a variety of motor difficulties, including apraxia - a condition usually following stroke which involves lack of control of a patient over their hands or fingers. Diagnosis and rehabilitation are traditionally carried out by Occupational Therapists. In recent years, computer-based tests have been developed in order to remove the human subjectivity from the diagnosis, and in order to enable the patient to carry out a rehabilitation programme at home. One such test involves users being asked to carry out static gestures above a Leap Motion sensor, and these gestures being scored according to a variety of criteria. A prototype has been constructed to gather data, and some data has been gathered from a few controls and patients. In order to deploy this as a clinical tool into the NHS, there is a need for a systematic data collection and analysis tool, based on machine learning algorithms, to help classify the data into different categories. Algorithms are also needed in order to classify data from stroke patients, and to assess the degree of severity of their apraxia. Also, the graphical user interface needs to be extended to give particular kinds of feedback to the patient in the form of home exercises, as part of a rehabilitation programme.

This project was originally set up in collaboration with Prof Glyn Humphreys, Watts Professor of Experimental Psychology. Due to Glyn's untimely death a new co-supervisor needs to be found in the Experimental Psychology Department. It is unrealistic to assume this project can run in the summer of 2016.

Identifying features in MRI scan data Irina Voiculescu Computational Biology and Health Informatics B C MSc In recent years, medical diagnosis using a variety of scanning modalities has become quasi-universal and has brought about the need for computer analysis of digital scans. Members of the Spatial Reasoning research group have developed image processing software for CT (tomography) scan data. The program partitions (segments) images into regions with similar properties. These images are then analysed further so that particular features (such as bones, organs or blood vessels) can be segmented out. The team's research continues to deal with each of these two separate meanings of medical image segmentation. The existing software is written in C++ and features carefully-crafted and well-documented data structures and algorithms for image manipulation. In certain areas of surgery (e.g. orthopaedic surgery involving hip and knee joints) the magnetic resonance scanning modality (MRI) is preferred, both because of its safety (no radiation involved) and because of its increased visualisation potential. This project is about converting MRI scan data into a format compatible with the existing segmentation algorithms. The data input would need to be integrated into the group's analysis software in order then to carry out 3D reconstructions and other measurements. This project is co-supervised by Professor David Murray MA, MD, FRCS (Orth), Consultant Orthopaedic Surgeon at the Nuffield Orthopaedic Centre and the Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), and by Mr Hemant Pandit MBBS, MS (Orth), DNB (Orth), FRCS (Orth), DPhil (Oxon), Orthopaedic Surgeon / Honorary Senior Clinical Lecturer, Oxford Orthopaedic Engineering Centre (OOEC), NDORMS.
Reinforcement learning techniques for games Irina Voiculescu Computational Biology and Health Informatics B C

This project is already taken for 2017-2018


Psychology has inspired and informed a number of machine learning methods. Decisions within an algorithm can be made so as to maximise an overall (cumulative) reward. Machine learning methods in this class are known as Reinforcement Learning. A basic reinforcement learning model consists of establishing a number of environment states, a set of valid actions, and rules for transitioning between states. Applying this model to the rules of a board game means that the machine can be made to learn how to play a simple board game by playing a large number of games against itself. The goal of this project is to set up a reinforcement learning environment for a simple board game with a discrete set of states (such as Backgammon). If time permits, this will be extended to a simple geometric game (such as Pong) where the states may have to be parameterised in terms of geometric actions to be taken at each stage in the game.
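To make the setup concrete, here is a minimal tabular Q-learning agent that teaches itself one-pile Nim by self-play (a hypothetical sketch for illustration; the project itself would target a richer game such as Backgammon):

```python
import random

# Tabular Q-learning for one-pile Nim: players alternately take 1-3
# stones, and whoever takes the last stone wins.  The agent learns by
# self-play; since the player to move alternates, the value of the
# successor state is negated (negamax convention).
PILE, ACTIONS = 12, (1, 2, 3)
Q = {s: {a: 0.0 for a in ACTIONS if a <= s} for s in range(1, PILE + 1)}

def train(episodes=50000, lr=0.5, eps=0.2, seed=0):
    rng = random.Random(seed)
    for _ in range(episodes):
        s = PILE
        while s > 0:
            # Epsilon-greedy action selection over the legal moves.
            a = (rng.choice(list(Q[s])) if rng.random() < eps
                 else max(Q[s], key=Q[s].get))
            s2 = s - a
            # Reward +1 for taking the last stone; otherwise the target
            # is minus the opponent's best value in the successor state.
            target = 1.0 if s2 == 0 else -max(Q[s2].values())
            Q[s][a] += lr * (target - Q[s][a])
            s = s2

train()
# Optimal play takes s % 4 stones whenever s % 4 != 0; the learned
# greedy policy recovers this, e.g. from a pile of 7 it takes 3.
```

The same state/action/reward skeleton carries over to board games; only the state encoding and transition rules change.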

Simple drawing analysis Irina Voiculescu Computational Biology and Health Informatics B C MSc

Scientists in the Experimental Psychology Department study patients with a variety of motor difficulties, including apraxia - a condition usually following stroke which involves lack of control of a patient over their hands or fingers. Diagnosis and rehabilitation are traditionally carried out by Occupational Therapists. In recent years, computer-based tests have been developed in order to remove the human subjectivity from the diagnosis, and in order to enable the patient to carry out a rehabilitation programme at home. One such test involves users drawing simple figures on a tablet, and these figures being scored according to a variety of criteria. Data has already been gathered from 200 or so controls, and is being analysed for a range of parameters in order to assess what a neurotypical person could achieve when drawing such simple figures. Further machine learning analysis could help classify such data into different categories. Algorithms are also needed in order to classify data from stroke patients, and to assess the degree of severity of their apraxia.

This project was originally co-supervised by Prof Glyn Humphreys, Watts Professor of Experimental Psychology. Due to Glyn's untimely death a new co-supervisor needs to be found in the Experimental Psychology Department. It is unrealistic to assume this project can run in the summer of 2016.

Surgery teaching and training tool Irina Voiculescu Computational Biology and Health Informatics B C MSc

Knee replacement surgery involves a precise series of steps that a surgeon needs to follow. Trainee surgeons have traditionally mastered these steps by learning from textbooks or experienced colleagues. Surgeons at the Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS) in Oxford have been working on a standardised method to help trainees internalise the sequence of events in an operation. It is proposed to construct a computer-based tool to help with this goal. Apart from the choice of tools and materials, the tool would also feature a virtual model of the knee. The graphical user interface would present a 3D model of a generic knee to be operated on, and would allow the user to make the cuts necessary for the knee replacement procedure. There would be pre-defined parameters regarding the type and depth of each cut, and an evaluation tool to assess how the virtual cuts compare against the parameters.

The project goals are quite extensive and so this would be suitable for an experienced programmer.

This project is co-supervised by Professor David Murray MA, MD, FRCS (Orth), Consultant Orthopaedic Surgeon at the Nuffield Orthopaedic Centre and the Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), and by Mr Hemant Pandit MBBS, MS (Orth), DNB (Orth), FRCS (Orth), DPhil (Oxon), Orthopaedic Surgeon / Honorary Senior Clinical Lecturer, Oxford Orthopaedic Engineering Centre (OOEC), NDORMS.

Textile fabric detail simulation Irina Voiculescu Computational Biology and Health Informatics B C MSc Conventional computer-aided design (CAD) software uses methods such as extrusion and revolution tools that the user can apply to create the 3D shape of a part. These tools are based on traditional manufacturing methods and work very well for most CAD applications. One application for which these tools do not work well is creating 3-dimensional representations of textiles. The exact path of each fibre within the textile is dependent upon the other fibres, and the flexibility of the fibres. The purpose of this project is to create a simple software tool/algorithm into which a user can input a weave pattern (flat) or a braid pattern (cylindrical), together with the flexibility of the fibres, and which will create a 3-dimensional representation of the structure.
The efficiency of numerical algorithms Jonathan Whiteley Computational Biology and Health Informatics B C MSc Many numerical algorithms have error bounds that depend on some user-provided input.  For example, the error in a numerical method for solving a differential equation is bounded in terms of the step-size h, and so the user may change the step-size h until a desired accuracy is attained.  Although useful, these error bounds do not take account of computational efficiency.  For example, a numerical method for solving a differential equation may have a very impressive bound with respect to step-size h, but may require significantly more computational effort than other methods with less impressive error bounds.  The field of scientific computing is a rich source of algorithms such as these, for example the numerical solution of differential equations, the numerical solution of linear systems, and interpolation of functions.  The aim of this project is to work with a variety of algorithms for solving a given problem, and to assess the computational efficiency of these algorithms.
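As an illustrative sketch of the trade-off (methods chosen by us for the example, not named in the description): the classical fourth-order Runge-Kutta method costs four evaluations of f per step against forward Euler's one, yet reaches a given accuracy with far fewer steps.

```python
import math

# Compare forward Euler (global error O(h)) with classical RK4
# (global error O(h^4)) on y' = -y, y(0) = 1, exact solution exp(-t).
def euler(f, y, t_end, n):
    h, t = t_end / n, 0.0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

def rk4(f, y, t_end, n):
    h, t = t_end / n, 0.0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

f = lambda t, y: -y
exact = math.exp(-1.0)
euler_err = abs(euler(f, 1.0, 1.0, 1000) - exact)
rk4_err = abs(rk4(f, 1.0, 1.0, 10) - exact)
# RK4 with 10 steps (40 evaluations of f) already beats Euler with
# 1000 steps (1000 evaluations of f).
print(euler_err, rk4_err)
```

Counting evaluations of f rather than steps is one simple way of making the "computational effort" in the project description measurable.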
The efficient storage of sparse matrices Jonathan Whiteley Computational Biology and Health Informatics B Many matrix applications involve sparse matrices, i.e. matrices that have a very large number of rows and columns, but only a small number of non-zero entries in each row.  On a given computer we may only store a finite number of matrix entries.  When working with sparse matrices we usually store only the non-zero entries and their locations.  There are several established techniques for storing sparse matrices.  These methods all have individual strengths and weaknesses - some allow efficient multiplication of sparse matrices by vectors, others allow entries to be modified effectively, while others allow the sparsity pattern to be modified dynamically.  The aim of this project is to investigate these sparse storage methods, and to highlight the strengths and weaknesses of each approach.
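One established format of this kind is compressed sparse row (CSR), which stores the non-zero values, their column indices, and per-row offsets in three flat arrays. A minimal pure-Python sketch (hypothetical, for illustration only):

```python
# Build CSR arrays from a dense matrix: vals holds the non-zero
# entries, cols their column indices, and rowptr[i]..rowptr[i+1]
# delimits the entries of row i.
def to_csr(dense):
    vals, cols, rowptr = [], [], [0]
    for row in dense:
        for j, x in enumerate(row):
            if x != 0:
                vals.append(x)
                cols.append(j)
        rowptr.append(len(vals))
    return vals, cols, rowptr

def csr_matvec(vals, cols, rowptr, x):
    # y[i] = sum of A[i][j] * x[j] over the stored entries of row i.
    y = [0] * (len(rowptr) - 1)
    for i in range(len(y)):
        for k in range(rowptr[i], rowptr[i + 1]):
            y[i] += vals[k] * x[cols[k]]
    return y

A = [[5, 0, 0], [0, 0, 2], [1, 0, 3]]
print(csr_matvec(*to_csr(A), [1, 2, 3]))  # [5, 6, 10]
```

CSR makes matrix-vector products cheap, but inserting a new non-zero forces the flat arrays to shift; formats such as coordinate lists or linked rows make the opposite trade, which is exactly the kind of comparison the project invites.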
Model Checking LTL on Markov Chains James Worrell Automated Verification B C

The goal of this project is to write a program that model checks a Markov chain against an LTL formula, i.e., calculates the probability that the formula is satisfied. The two main algorithmic tasks are to efficiently compile LTL formulas into automata and then to solve systems of linear equations arising from the product of the Markov chain and the automaton. An important aspect of this project is to make use of an approach that avoids determinising the automaton that represents the LTL formula. This project builds on material contained in the Logic and Proof and Models of Computation courses.
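The linear-equation step can be sketched as follows (illustrative Python with exact rational arithmetic; the states here stand in for the product of the chain with the automaton, whose accepting component is the target set):

```python
from fractions import Fraction as F

def reach_prob(P, target, sink):
    """Probability of eventually reaching `target` from each state,
    found by Gaussian elimination on x_i = sum_j P[i][j] * x_j."""
    states = range(len(P))
    unknown = [s for s in states if s not in target and s not in sink]
    n = len(unknown)
    # Restricted system (I - P) x = b, where b_i is the probability of
    # stepping directly from unknown state i into the target set.
    A = [[F(int(u == v)) - F(P[u][v]) for v in unknown] for u in unknown]
    b = [sum(F(P[u][t]) for t in target) for u in unknown]
    for i in range(n):                              # forward elimination
        p = next(r for r in range(i, n) if A[r][i] != 0)
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    x = [F(0)] * n
    for i in reversed(range(n)):                    # back substitution
        x[i] = (b[i] - sum(A[i][c] * x[c]
                           for c in range(i + 1, n))) / A[i][i]
    return dict(zip(unknown, x))

# Toy chain: state 2 is accepting, state 3 is a rejecting sink.
half = F(1, 2)
P = [[0, half, 0, half],
     [half, 0, half, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1]]
print(reach_prob(P, {2}, {3}))  # {0: Fraction(1, 3), 1: Fraction(2, 3)}
```

A full model checker would first identify the accepting bottom strongly connected components of the product, then solve a system of exactly this shape.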

Reading: J-M. Couvreur, N. Saheb and G. Sutre. An optimal automata approach to LTL model checking of probabilistic systems. Proceedings of LPAR'03, LNCS 2850, Springer 2003.

Topics in Linear Dynamical Systems James Worrell Automated Verification C MSc A linear dynamical system is a discrete- or continuous-time system whose dynamics is given by a linear function of the current state. Examples include Markov chains, linear recurrence sequences (such as the Fibonacci sequence), and linear differential equations. This project involves investigating the decidability and complexity of various reachability problems for linear dynamical systems. It would suit a mathematically oriented student. Linear algebra is essential, and number theory or complexity theory is desirable. A relevant paper is Ventsislav Chonev, Joël Ouaknine, James Worrell: The orbit problem in higher dimensions. STOC 2013: 941-950.
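As a concrete instance of the reachability questions involved (an illustrative Python sketch; deciding such questions in general, without an a priori bound on the number of steps, is precisely what makes the area hard):

```python
# The orbit problem asks whether repeatedly applying a linear map A to
# a start vector x ever hits a target vector y.  A brute-force search
# over a bounded horizon:
def hits_target(A, x, y, bound=1000):
    for step in range(bound):
        if x == y:
            return step
        x = [sum(a * v for a, v in zip(row, x)) for row in A]
    return None

# Fibonacci as a linear dynamical system:
# (F_{n+1}, F_n) = A (F_n, F_{n-1}) with A = [[1, 1], [1, 0]].
A = [[1, 1], [1, 0]]
print(hits_target(A, [1, 0], [13, 8]))  # 6: after six steps, (F_7, F_6)
```

The project's questions concern what can be decided without any such bound, which draws on the number-theoretic and algebraic tools mentioned above.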
Analysing user-algorithm interaction Jun Zhao Human Centred Computing MSc

"Members of the Human Centred Computing Group are currently involved in the UnBias research project ( This investigates the user experience of algorithm-driven internet platforms. As part of this project we have been conducting observational data collection in which pairs of volunteers are video recorded whilst browsing online. The on screen activity is also captured and the data are being analysed to identify instances of user-algorithm interaction; in particular instances in which an algorithmic outcome – such as the auto complete suggestion in a search bar or the filtering mechanisms on a recommendation platform – might shape the trajectory of browsing activity. We will shortly undertake a second wave of data collection in which users browse online but are placed in situations in which their usual experience of certain algorithmic processes will be disrupted. We are looking for a student to conduct analysis on the collected data and to then build on this analysis by designing and conducting their own disruption experiment. The student can set their own specific research questions for the analysis, and determine what particular kind of methodological approach they would like to undertake. :There are no prerequisites. The study will suit a student interested in developing  skills in social research methods and human computer interaction approaches. We will assist in securing ethical clearance and recruiting research participants for the data collection activities. 

Children, Well-Being, and Apps Jun Zhao Human Centred Computing MSc Tablet computers are becoming the primary means for young children to go online. However, few studies have examined how young children perceive and cope with personal data privacy during their interactions with these mobile technologies. Our current research has shown that young children and their parents largely have little awareness of personal data collection by mobile apps. Parents' primary concerns are misplaced, and remain focused on familiar risks, like content appropriateness or in-app purchases. As a result, children are regularly exposed to personal data risks and have few skills to recognise or cope with them. In the KOALA project, we will explore different ways to mitigate these risks, including recognising and detecting apps that are designed for children in the general app store, and analysing the personal data collection practices of existing mobile apps from leading kids' app stores.
Topics in Algorithms, Complexity, and Combinatorial Optimisation Standa Živný Algorithms and Complexity Theory C MSc Prof Živný is willing to supervise projects in the area of algorithms, complexity, and combinatorial optimisation; in particular, on problems related to convex relaxations (linear and semidefinite programming relaxations), submodular functions, and algorithms for and the complexity of homomorphism problems and constraint satisfaction problems.