MSc in Computer Science  Student projects
Suggested projects
Project  Supervisors  Parts  Description  

Aggregation of Photovoltaic Panels  Alessandro Abate  The increased relevance of renewable energy sources has modified the behaviour of the electrical grid. Some renewable energy sources affect the network in a distributed manner: whilst each unit has little influence, a large population can have a significant impact on the global network, particularly in the case of synchronised behaviour. This work investigates the behaviour of a large, heterogeneous population of photovoltaic panels connected to the grid. We employ Markov models to represent the aggregated behaviour of the population, while the rest of the network (and its associated consumption) is modelled as a single equivalent generator, accounting for both inertia and frequency regulation. Analysis and simulations of the model show that it is a realistic abstraction, and quantitatively indicate that heterogeneity is necessary to enable the overall network to function in safe conditions and to avoid load shedding. This project will provide extensions of this recent research. In collaboration with an industrial partner. Prerequisites: Computer-Aided Formal Verification, Probabilistic Model Checking 
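As a rough illustration of the kind of aggregate model involved, the sketch below simulates a heterogeneous population of panels, each an independent two-state Markov chain, and tracks the fraction that is generating over time. All switching rates and the population size are invented for illustration; the actual project models are considerably richer.

```python
import numpy as np

def simulate_population(n_panels, steps, rng):
    """Simulate n_panels independent two-state Markov chains
    (state 1 = generating, 0 = disconnected) and return the
    fraction of generating panels at each time step."""
    # Heterogeneity: each panel gets its own (illustrative) switching probabilities.
    p_on = rng.uniform(0.05, 0.2, n_panels)   # P(off -> on) per panel
    p_off = rng.uniform(0.01, 0.1, n_panels)  # P(on -> off) per panel
    state = np.ones(n_panels, dtype=int)      # all panels start generating
    fractions = []
    for _ in range(steps):
        u = rng.random(n_panels)
        # generating panels drop out w.p. p_off; disconnected ones reconnect w.p. p_on
        state = np.where(state == 1, (u >= p_off).astype(int), (u < p_on).astype(int))
        fractions.append(state.mean())
    return np.array(fractions)

fractions = simulate_population(n_panels=1000, steps=50, rng=np.random.default_rng(0))
```

A homogeneous population (identical probabilities for every panel) would make the chains synchronise much more easily, which is exactly the risk the project's heterogeneity analysis quantifies.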

Analysis and verification of stochastic hybrid systems  Alessandro Abate  Stochastic Hybrid Systems (SHS) are dynamical models that are employed to characterize the probabilistic evolution of systems with interleaved and interacting continuous and discrete components. Formal analysis, verification, and optimal control of SHS models represent relevant goals because of their theoretical generality and for their applicability to a wealth of studies in the Sciences and in Engineering. In a number of practical instances the presence of a discrete number of continuously operating modes (e.g., in fault-tolerant industrial systems), the effect of uncertainty (e.g., in safety-critical air-traffic systems), or both occurrences (e.g., in models of biological entities) advocates the use of a mathematical framework, such as that of SHS, which is structurally predisposed to model such heterogeneous systems. In this project, we plan to investigate and develop new analysis and verification techniques (e.g., based on abstractions) that are directly applicable to general SHS models, while being computationally scalable. Courses: Computer-Aided Formal Verification, Probabilistic Model Checking, Probability and Computing, Automata, Logic and Games Prerequisites: Familiarity with stochastic processes and formal verification 

Automated verification of complex systems in the energy sector  Alessandro Abate  Smart microgrids are small-scale versions of centralized electricity systems, which locally generate, distribute, and regulate the flow of electricity to consumers. Among other advantages, microgrids have shown positive effects on the reliability of distribution networks. These systems present heterogeneity and complexity coming from 1. local and volatile renewables generation, and 2. the presence of nonlinear dynamics both over continuous and discrete variables. These factors call for the development of proper quantitative models. This framework provides the opportunity of employing formal methods to verify properties of the microgrid. The goal of the project is in particular to focus on the energy production via renewables, such as photovoltaic panels. The project can benefit from a paid visit/internship to industrial partners. Courses: Computer-Aided Formal Verification, Probabilistic Model Checking, Probability and Computing, Automata, Logic and Games Prerequisites: Familiarity with stochastic processes and formal verification; no specific knowledge of smart grids is needed. 

Development of software for the verification of MPL models  Alessandro Abate  This project aims to enhance the software toolbox VeriSiMPL (''very simple''), which has been developed to enable the abstraction of Max-Plus-Linear (MPL) models. MPL models are specified in MATLAB, and abstracted to Labeled Transition Systems (LTS). The LTS abstraction is formally put in relationship with its MPL counterpart via a (bi)simulation relation. The abstraction procedure runs in MATLAB and leverages sparse representations, fast manipulations based on vector calculus, and optimized data structures such as Difference-Bound Matrices. LTS can be pictorially represented via the Graphviz tool and exported to the PROMELA language. This enables the verification of MPL models against temporal specifications within the SPIN model checker. Courses: Computer-Aided Formal Verification, Numerical Solution of Differential Equations Prerequisites: Some familiarity with dynamical systems, working knowledge of MATLAB and C 
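For background, an MPL model evolves as x(k+1) = A ⊗ x(k), where the max-plus product ⊗ replaces the usual sum-of-products with a max-of-sums. A minimal sketch of one step of such dynamics (in Python rather than the MATLAB used by VeriSiMPL; the matrix values are invented):

```python
import numpy as np

def maxplus_matvec(A, x):
    """Max-plus matrix-vector product: (A ⊗ x)_i = max_j (A_ij + x_j).
    Entries equal to -inf play the role of the max-plus zero."""
    return np.max(A + x, axis=1)

# A two-dimensional example: one synchronisation step of an MPL system,
# e.g. event times of two machines with processing/transport delays A_ij.
A = np.array([[2.0, 5.0],
              [3.0, 3.0]])
x0 = np.array([1.0, 0.0])
x1 = maxplus_matvec(A, x0)  # next event times under the max-plus dynamics
```

Here `A + x` broadcasts `x` across the columns, so each row's maximum gives the next firing time of that event.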

Formal verification of a software tool for physical and digital components  Alessandro Abate  We are interested in working with existing commercial simulation software that is targeted at the modelling and analysis of physical, multi-domain systems. It further encompasses the integration with related software tools, as well as the interfacing with devices and the generation of code. We are interested in enriching the software with formal verification features, envisioning the extension of the tool towards capabilities that might enable the user to raise formal assertions or guarantees on model properties, or to synthesise correct-by-design architectures. Within this long-term plan, this project shall target the formal generation of fault warnings, namely of messages to the user that are related to ``bad (dynamical) behaviours'' or to unwanted ``modelling errors''. The student will be engaged in developing algorithmic solutions towards this goal, while reframing them within a formal and general approach. The project is interdisciplinary in dealing with hybrid models involving digital and physical quantities, and in connecting the use of formal verification techniques from the computer sciences with more classical analytical tools from control engineering. With Daniel Kroening. Courses: Computer-Aided Formal Verification, Software Verification. Prerequisites: Knowledge of basic formal verification. 

Innovative Sensing and Actuation for Smart Buildings  Alessandro Abate  Sensorisation and actuation in smart buildings and the development of smart HVAC (heating, ventilation and air-conditioning) control strategies for energy management allow for optimised energy usage, leading to the reduction in power consumption or to optimised demand/response strategies that are key in a rather volatile market. This can further lead to optimised maintenance for the building devices. Of course, the sensorisation of buildings leads to heavy requirements on the overall infrastructure. Further, we plan to investigate approaches to perform meta-sensing, namely to extrapolate the knowledge from physical sensors towards that of virtual elements (as an example, to infer the current building occupancy from correlated measurements of temperature and humidity dynamics). On the actuation side, we are likewise interested in engineering non-invasive, minimalistic solutions, which are robust to uncertainty and performance-certified. The plan for this project is to make the first steps in this direction, based on recent results in the literature. The project can benefit from a visit to industrial partners. 

Model learning and verification  Alessandro Abate, Daniel Kroening  This project will explore connections of techniques from machine learning with successful approaches from formal verification. The project has two sides: a theoretical one, and a more practical one; it will be up to the student to emphasise either of the two sides, depending on his/her background and/or interests. The theoretical part will develop existing research, for instance in one of the following two interdisciplinary domain pairs: learning & repair, or reachability analysis & Bayesian inference. On the other hand, a more practical project will apply the above theoretical connections to simple model setups in the area of robotics and autonomy. Courses: Computer-Aided Formal Verification, Probabilistic Model Checking, Machine Learning 

Precise simulations and analysis of aggregated probabilistic models  Alessandro Abate  This project shall investigate a rich research line, recently pursued by a few within the Department of CS, looking at the development of quantitative abstractions of Markovian models. Abstractions come in the form of lumped, aggregated models, which are beneficial in being easier to simulate or to analyse. Key to the novelty of this work, the proposed abstractions are quantitative, in that precise error bounds with respect to the original model can be established. As such, whatever can be shown over the abstract model can also be formally asserted over the original one. This project, grounded in existing literature, will pursue (depending on the student's interests) extensions of this recent work, or its implementation as a software tool. Courses: Computer-Aided Formal Verification, Probabilistic Model Checking, Machine Learning 

Safe Reinforcement Learning  Alessandro Abate  Reinforcement Learning (RL) is a known architecture for synthesising policies for Markov Decision Processes (MDP). We work on extending this paradigm to the synthesis of ‘safe policies’, or, more generally, of policies such that a linear-time property is satisfied. We convert the property into an automaton, then construct a product MDP between the automaton and the original MDP. A reward function is then assigned to the states of the product automaton, according to the accepting conditions of the automaton. With this reward function, RL synthesises a policy that satisfies the property: as such, the policy synthesis procedure is `constrained' by the given specification. Additionally, we show that the RL procedure sets up an online value iteration method to calculate the maximum probability of satisfying the given property, at any given state of the MDP. We evaluate the performance of the algorithm on numerous numerical examples. This project will provide extensions of these novel and recent results. Prerequisites: Computer-Aided Formal Verification, Probabilistic Model Checking, Machine Learning 
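To make the product construction concrete, the sketch below builds the product of a small MDP with a deterministic automaton and assigns reward 1 to product states whose automaton component is accepting. The state names, labels, and probabilities are all invented for illustration, and a DFA stands in for the more general automata used in the actual work.

```python
def build_product(mdp, dfa):
    """Build the product MDP of an MDP and a DFA.
    mdp: dict with 'states', 'actions', 'P' (P[s][a] = list of (s2, prob))
         and 'label' (label[s] = atomic proposition the DFA reads on entering s).
    dfa: dict with 'q0', 'delta' (delta[(q, ap)] = q2) and 'accepting' (a set).
    Returns the product transition function and a reward function on product states."""
    dfa_states = {dfa['q0'], *dfa['delta'].values()}
    P_prod, reward = {}, {}
    for s in mdp['states']:
        for q in dfa_states:
            # reward 1 on accepting automaton states drives RL towards satisfaction
            reward[(s, q)] = 1.0 if q in dfa['accepting'] else 0.0
            P_prod[(s, q)] = {}
            for a in mdp['actions']:
                succ = []
                for s2, p in mdp['P'][s][a]:
                    q2 = dfa['delta'][(q, mdp['label'][s2])]  # DFA tracks the run
                    succ.append(((s2, q2), p))
                P_prod[(s, q)][a] = succ
    return P_prod, reward

# A toy MDP with two states and a DFA that accepts once 'goal' has been seen.
mdp = {
    'states': ['s0', 's1'],
    'actions': ['a'],
    'P': {'s0': {'a': [('s0', 0.5), ('s1', 0.5)]},
          's1': {'a': [('s1', 1.0)]}},
    'label': {'s0': 'idle', 's1': 'goal'},
}
dfa = {
    'q0': 'q0',
    'delta': {('q0', 'idle'): 'q0', ('q0', 'goal'): 'q1',
              ('q1', 'idle'): 'q1', ('q1', 'goal'): 'q1'},
    'accepting': {'q1'},
}
P_prod, reward = build_product(mdp, dfa)
```

Any off-the-shelf RL algorithm run on `P_prod` with `reward` then maximises the probability of reaching an accepting automaton state, i.e. of satisfying the property.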

Software development for abstractions of stochastic hybrid systems  Alessandro Abate  Stochastic hybrid systems (SHS) are dynamical models for the interaction of continuous and discrete states. The probabilistic evolution of the continuous and discrete parts of the system are coupled, which makes analysis and verification of such systems compelling. Among specifications of SHS, probabilistic invariance and reach-avoid have received quite some attention recently. Numerical methods have been developed to compute these two specifications. These methods are mainly based on state-space partitioning and abstraction of the SHS by Markov chains, which are optimal in the sense of minimising the abstraction error for a given number of Markov states. The goal of the project is to combine the codes that have been developed for these methods. The student should also design a clean user interface (for the choice of dynamical equations, parameters, methods, etc.). Courses: Probabilistic Model Checking, Probability and Computing, Numerical Solution of Differential Equations Prerequisites: Some familiarity with stochastic processes, working knowledge of MATLAB and C 
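On the finite Markov-chain abstraction, the reach-avoid probability can be computed by a simple value iteration. The sketch below is an illustrative stand-in for the numerical methods mentioned above, on an invented three-state chain:

```python
import numpy as np

def reach_avoid(P, safe, target, horizon):
    """Probability of reaching `target` within `horizon` steps while
    staying inside `safe`, on a finite Markov chain with row-stochastic
    transition matrix P. States are the indices 0..n-1."""
    n = P.shape[0]
    V = np.zeros(n)
    V[list(target)] = 1.0          # target states are immediately successful
    for _ in range(horizon):
        W = P @ V                  # one-step expectation
        V_new = np.zeros(n)        # unsafe, non-target states stay at 0
        for s in range(n):
            if s in target:
                V_new[s] = 1.0
            elif s in safe:
                V_new[s] = W[s]
        V = V_new
    return V

# Toy chain: state 2 is the (absorbing) target, state 1 is absorbing and unsafe.
P = np.array([[0.6, 0.2, 0.2],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
probs = reach_avoid(P, safe={0}, target={2}, horizon=50)
```

From state 0 the fixpoint is 0.2 / (1 - 0.6) = 0.5: each step either hits the target (0.2), gets absorbed unsafely (0.2), or tries again (0.6).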

Pebble games, monads and comonads in classical, probabilistic and quantum computation  Samson Abramsky  Pebble games are an important and widely used tool in logic, algorithms and complexity, constraint satisfaction and database theory. The idea is that we can explore a pair of structures, e.g. graphs, by placing up to k pebbles on them, so we have a window of size at most k on the two structures. If we can always keep these pebbles in sync so that the two k-sized windows look the same (are isomorphic) then we say that Duplicator has a winning strategy for the k-pebble game. This gives a resource-bounded notion of approximation to graphs and other structures which has a wide range of applications. Monads and comonads are widely used in functional programming, e.g. in Haskell, and come originally from category theory. It turns out that pebble games, and similar notions of approximate or local views on data, can be captured elegantly by comonads, and this gives a powerful language for many central notions in constraints, databases and descriptive complexity. For example, k-consistency can be captured in these terms; another important example is treewidth, a key parameter which is very widely used to give “islands of tractability” in otherwise hard problems. Finally, monads can be used to give various notions of approximate or non-classical solutions to computational problems. These include probabilistic and quantum solutions. For example, there are quantum versions of constraint systems and games which admit quantum solutions when there are no classical solutions, thus demonstrating a “quantum advantage”. Monads and comonads can be used in combination, making use of certain “distributive laws”. The aim of this project is to explore a number of aspects of these ideas. 
Depending on the interests and background of the student, different aspects may be emphasised, from functional programming, category theory, logic, algorithms and descriptive complexity, to probabilistic and quantum computation. Some specific directions include: 1. Developing Haskell code for the k-pebbling comonad and various non-classical monads, and using this to give a computational toolbox for various constructions in finite model theory and probabilistic or quantum computation. 2. Developing the category-theoretic ideas involved in combining monads and comonads, and studying some examples. 3. Using the language of comonads to capture other important combinatorial invariants such as tree-depth. 4. Developing the connections between category theory, finite model theory and descriptive complexity. Some background reading: 1. Leonid Libkin, Elements of Finite Model Theory. (Background on pebble games and the connection with logic and complexity.) 2. S. Abramsky, A. Dawar and P. Wang, The pebbling comonad in finite model theory. (Technical report describing the basic ideas, which can serve as a starting point.)  
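As a tiny illustration of the "windows look the same" condition, the sketch below checks whether a placement of pebble pairs on two graphs induces a partial isomorphism, which is exactly what Duplicator must maintain at every round. Graph encodings and names are invented for illustration:

```python
def is_partial_iso(pairs, edges_a, edges_b):
    """Check that pebble pairs [(a1, b1), ..., (ak, bk)] induce a partial
    isomorphism between two graphs given as sets of undirected edges."""
    f = {}
    for a, b in pairs:
        if a in f and f[a] != b:
            return False          # same pebbled vertex sent to two places
        f[a] = b
    if len(set(f.values())) != len(f):
        return False              # the induced map must be injective
    verts = list(f)
    for i, u in enumerate(verts):
        for v in verts[i + 1:]:
            # edges must be preserved and reflected within the k-sized window
            if (frozenset((u, v)) in edges_a) != (frozenset((f[u], f[v])) in edges_b):
                return False
    return True

triangle = {frozenset(e) for e in [(0, 1), (1, 2), (0, 2)]}
path = {frozenset(e) for e in [(0, 1), (1, 2)]}
```

With 2 pebbles Duplicator can always match a triangle against a path edge-by-edge, but a third pebble exposes the missing edge; the game thus distinguishes the structures exactly at k = 3.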
Sheaf theoretic semantics for vector space models of natural language  Samson Abramsky  Contextuality is a fundamental feature of quantum physical theories, and one that distinguishes them from classical mechanics. In a recent paper by Abramsky and Brandenburger, the categorical notion of sheaves has been used to formalize contextuality. This has resulted in generalizing and extending contextuality to other theories which share some structural properties with quantum mechanics. A consequence of this type of modeling is a succinct logical axiomatization of properties such as non-local correlations and, as a consequence, of classical no-go theorems such as those of Bell and Kochen-Specker. Like quantum mechanics, natural language has contextual features; these have been the subject of much study in distributional models of meaning, originating in the work of Firth and later advanced by Schütze. These models are based on vector spaces over the semiring of positive reals with an inner product operation. The vectors represent meanings of words, based on the contexts in which they often appear, and the inner product measures degrees of word synonymy. Despite their success in modeling word meaning, vector spaces suffer from two major shortcomings: firstly, they do not immediately scale up to sentences; secondly, they cannot, at least not in an intuitive way, provide semantics for logical words such as `and', `or', `not'. Recent work in our group has developed a compositional distributional model of meaning in natural language, which lifts vector space meaning to phrases and sentences. This has already led to some very promising experimental results. However, this approach does not deal so well with the logical words. The goal of this project is to use sheaf theoretic models to provide both a contextual and logical semantics for natural language. 
We believe that sheaves provide a generalization of the logical Montague semantics of natural language, which did very well in modeling logical connectives but did not account for contextuality. The project will also aim to combine these ideas with those of the distributional approach, leading to an approach which combines the advantages of Montague-style and vector-space semantics. Prerequisites: The interested student should have taken the category theory and computational linguistics courses, or be familiar with the contents of these. 
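As a reminder of the distributional baseline this project builds on, the sketch below constructs word vectors from co-occurrence counts in a toy corpus and compares them with the (normalised) inner product. The corpus and window size are invented purely for illustration:

```python
import numpy as np
from collections import Counter

def cooccurrence_vectors(sentences, window=2):
    """Build co-occurrence count vectors over the corpus vocabulary:
    entry v of the vector for w counts how often v appears near w."""
    vocab = sorted({w for s in sentences for w in s})
    counts = {w: Counter() for w in vocab}
    for s in sentences:
        for i, w in enumerate(s):
            for j in range(max(0, i - window), min(len(s), i + window + 1)):
                if j != i:
                    counts[w][s[j]] += 1
    return {w: np.array([counts[w][v] for v in vocab], dtype=float)
            for w in vocab}, vocab

def cosine(u, v):
    """Normalised inner product, used as a degree of word synonymy."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

corpus = [['cats', 'chase', 'mice'],
          ['dogs', 'chase', 'cats'],
          ['mice', 'fear', 'cats'],
          ['mice', 'fear', 'dogs']]
vectors, vocab = cooccurrence_vectors(corpus)
```

Note that nothing in this construction says how to compose the vectors for 'cats' and 'chase' into a sentence meaning, nor how to interpret 'and' or 'not', which is precisely the gap the compositional and sheaf-theoretic work addresses.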

Classical simulation of quantum systems  Jonathan Barrett  For many information theoretic tasks, an advantage can be gained if quantum systems are used as the basic carrier of information, rather than classical variables. For example, it may be the case that with quantum systems, fewer resources are required. The project will investigate the classical simulation of quantum systems in simple scenarios in which quantum systems are communicated from one party to another, or in which quantum systems are measured to produce correlated outcomes. The aim is to compare the resources required by the classical simulation with those required when quantum systems are used. Prerequisites: Linear algebra. A student taking this project should also be taking the Quantum Computer Science course. Some extra reading to cover the basic formalism of quantum theory would be an advantage. 

Device-independent quantum cryptography  Jonathan Barrett  One of the most successful applications of quantum information science is quantum key distribution, which enables separated parties to send secret messages, with security guaranteed by the laws of quantum theory. The mysterious phenomenon of “quantum nonlocality”, wherein two quantum systems appear to influence one another even though they are separated in space, can be used to design a particularly strong kind of key distribution protocol. The idea is that the honest users do not need to trust that their quantum devices are behaving as advertised, or even that quantum theory is correct. The project will explore the relationship between different kinds of nonlocality and the possibilities for secure communication. Prerequisites: Linear algebra. A student taking this project should also be taking the Quantum Computer Science course. Some extra reading to cover the basic formalism of quantum theory would be an advantage. 

Optimization of Web Query Plans  Michael Benedikt  This project will look at how to find the best plan for a query, given a collection of data sources with access restrictions. We will look at logic-based methods for analyzing query plans, taking into account integrity constraints that may exist on the data. 

Simulation / Learning Tool for Process Scheduling  Stephen Cameron  In domains such as manufacturing there may be a large number of individual steps required to complete an overall task, with various constraints between the steps and finite availability of resources. For example, an aircraft may require hundreds of thousands of steps to build, with constraints like "we cannot mount the engines before the wings", and resources like the number of workers and pieces of key machinery. Scheduling software exists that takes the lists of steps, constraints, and resources and generates feasible schedules; that is, it produces lists of which steps should be performed at what times. Given the complexity of the problem it is impractical to generate optimal schedules, but in general close-to-optimal schedules ('good schedules') can be generated in a reasonable time. However, the choice of which good schedule to use is often determined by factors that are not known early in the process or are difficult to quantify, such as the layout of a factory or the temporary loss of a worker due to illness. The goal of this project is to take an existing scheduling program and a class of real-life industrial problems and to develop a simulation program that could assist a process engineer to visualise the similarities and differences between a small number of good schedules, and hence to interactively adjust the scheduling parameters in order to improve the schedule. For example, the set of feasible schedules can be displayed as a graph with the steps as graph nodes, and a visualisation might show the progress of the schedule as annotations on the graph nodes, whilst providing a set of on-screen controls to adjust the scheduling parameters. The scheduling program itself is written in C++, but this does not constrain the simulation program to be written in a particular language. 
The skillset required of a student taking this project would then be a mixture of two-dimensional graphics, plus a desire to find out more about two-dimensional animation and graphical user interface design. There is also the option of applying techniques from machine learning in order to automatically improve the schedule quality. The scheduling scenario to be used as an example in this project will be provided by an Oxford-based company who are interested in potentially applying these techniques in the future. Online videos: https://www.promodel.com/solutionscafe/webinars/PCS_Quickstart/PCS_Quickstart.html gives a very simple example of the style of animation envisioned for this project, although we would expect the scheduling graph to be produced by the scheduling system rather than by hand as in this video. https://uk.mathworks.com/videos/simeventsforoperationsresearch118566.html shows a Matlab extension (Simulink) on a more realistic example; this is without any animation but includes the use of machine learning in the form of a genetic algorithm. 
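A minimal sketch of the underlying scheduling problem may help: greedy list scheduling of steps with precedence constraints and a fixed worker pool. The steps, durations, and constraints are invented (echoing the engines-before-wings example above), and a real scheduler would be far more sophisticated:

```python
import heapq

def list_schedule(durations, precedes, workers):
    """Greedy list scheduler: start each step as soon as all its
    predecessors have finished and a worker is free.
    Returns {step: (start_time, finish_time)}."""
    preds = {s: set() for s in durations}
    succs = {s: [] for s in durations}
    for a, b in precedes:            # a must finish before b starts
        preds[b].add(a)
        succs[a].append(b)
    schedule = {}
    busy = []                        # heap of (finish_time, step) for running steps
    ready = sorted(s for s in durations if not preds[s])
    time = 0
    while ready or busy:
        # start as many ready steps as the worker pool allows
        while ready and len(busy) < workers:
            s = ready.pop(0)
            heapq.heappush(busy, (time + durations[s], s))
            schedule[s] = (time, time + durations[s])
        # advance to the next completion and release its successors
        time, done = heapq.heappop(busy)
        for s2 in succs[done]:
            preds[s2].discard(done)
            if not preds[s2]:
                ready.append(s2)
    return schedule

steps = {'wings': 3, 'engines': 2, 'fuselage': 4, 'paint': 1}
order = [('wings', 'engines'), ('fuselage', 'paint')]
schedule = list_schedule(steps, order, workers=2)
```

Different tie-breaking rules in the `ready` queue yield different good schedules; visualising and comparing such alternatives is exactly what the project proposes.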

Visualization for Process Scheduling  Stephen Cameron  In domains such as manufacturing there may be a large number of individual steps required to complete an overall task, with various constraints between the steps and finite availability of resources. For example, an aircraft may require hundreds of thousands of steps to build, with constraints like "we cannot mount the engines before the wings", and resources like the number of workers and pieces of key machinery. Scheduling software exists that takes the lists of steps, constraints, and resources and generates feasible schedules; that is, it produces lists of which steps should be performed at what times. Given the complexity of the problem it is impractical to generate optimal schedules, but in general close-to-optimal schedules ('good schedules') can be generated in a reasonable time. However, the choice of which good schedule to use is often determined by factors that are not known early in the process or are difficult to quantify, such as the layout of a factory or the temporary loss of a worker due to illness. The goal of this project is to take an existing scheduling program and a class of real-life industrial problems and to develop a visualisation program that could help an end-user picture the running of a particular schedule as a three-dimensional animation. The scheduling program itself is written in C++, but this does not constrain the visualisation program to be written in a particular language. The skillset required of a student taking this project would primarily be in three-dimensional animation, either using their own code or by bolting onto an existing animation tool such as Blender. It should also be possible for the end-user to easily vary the context of the visualisation (i.e., the placement of the equipment in the three-dimensional world and its relationship to the output of the scheduling program). 
The scheduling scenario to be used as an example in this project will be provided by an Oxford-based company who are interested in potentially applying these techniques in the future. Online video: An online example of such a visualisation is at https://www.youtube.com/watch?v=YWuM43yP09A, although this level of detail is not expected in the student project; rather, we are seeking a proof of concept that integrates the scheduling program output directly. 
Merging all the biomedical image processing power of ITK with Matlab  Ramón Casero Cañas, Vicente Grau  The Insight Toolkit (ITK), a C++ library, is the most advanced software base for biomedical image processing. Our group has produced a basic working prototype of a software interface to run ITK image filters from Matlab [1], the biomedical engineer's platform of choice. We are looking for a student who, as a minimum, integrates ~155 existing image filters into this interface to make them available for biomedical research. This is more or less straightforward by hand. The challenge for an exceptional student would be to make a parser to automatically link ITK filters to Matlab. [1] http://code.google.com/p/gerardus/source/browse/trunk/matlab/?r=1082#matlab%2FItkToolbox  
Information processing and conservation laws in general probabilistic theories  Giulio Chiribella  Project description and suitability for different students: see the appendix, which lists the general areas in which I could supervise student projects, namely quantum information theory and the foundations of quantum mechanics. 
A generic evaluation of a categorical compositional-distributional model of meaning  Bob Coecke  The categorical compositional distributional model of meaning is an emerging field of Computational Linguistics and natural language processing. It has been pioneered in the Quantum and Computational Linguistics groups of the department. The general theoretical underpinning of the model is based on compact closed categories, inspired by the categorical quantum models of Abramsky and Coecke. In order to apply the model to mainstream tasks, one has to instantiate it on concrete linguistic models, and in particular on distributional (vector space) models of meaning. So far, these instantiations have been done on datasets with small sentences containing few language units such as verbs, adjectives, and relative pronouns, all separately. In this project, we aim to connect these single experiments and perform a unified task. This is a term-classification task, the goal of which is to successfully classify a number of dictionary terms to their definitions (or vice versa). Specifically, the project involves the following:
We seek candidates who know Computational Linguistics and Linear Algebra, as well as programming. A knowledge of category theory is encouraged, but not required. The project will be co-supervised by Dimitri Kartsaklis, who has performed the previous term-classification tasks, Mehrnoosh Sadrzadeh and Bob Coecke. The outcome may be considered for publication in a peer-reviewed conference or journal. References:


Categorical quantum mechanics and Lie Algebras  Bob Coecke, Amar Hadzihasanovic  Is there a way to define a category whose objects are root systems (with additional structure), so that the construction of Lie algebras from them is functorial? If so, what are its properties? For example, is it monoidal, and in that case, is the functor (lax) monoidal? Prerequisites: categorical quantum mechanics and Lie algebras 

Developing conceptual space semantics for compositional distributional models of meaning  Bob Coecke, Martha Lewis  The compositional distributional model of meaning (Coecke et al., 2010) models the meaning of sentences using a category such as pregroups to model the grammar of the sentences and another category with the same compact closed structure to model the meanings of the individual words. In practice, word meanings are often modelled using finite-dimensional vector spaces, built via word co-occurrence in text corpora. The conceptual spaces approach to semantics (Gärdenfors, 2014) views word meanings as inhabiting regions of a conceptual space, with carefully chosen dimensions. This representation has some similarities but also key differences with the compositional distributional model of meaning. The project would examine a few key conceptual spaces such as 'colour', 'taste', or 'shape', for example, and investigate how these can be incorporated within the compositional distributional model of meaning. This might be undertaken by giving a category-theoretic formalisation of these spaces, and identifying a suitable grammatical structure to implement composition, or alternatively, by analysing the behaviour of nouns, adjectives, and verbs within conceptual spaces, and fitting these to the pregroup grammar. Prerequisites: category theory References: 

Diagrammatic methods in computer security  Bob Coecke, Stefano Gogioso  Throughout the last decade, categorical and diagrammatic methods have found enormous success in the field of quantum information theory, and are currently being extended to a variety of other disciplines. As part of this project, we will explore their application to classical cryptography and computer security. A number of distinct approaches are available, and the project can be tailored to each student's specific interests in the field. Prerequisites: One of Quantum Computer Science or Categorical Quantum Mechanics required. Computer Security (or equivalent experience) desirable, but not required. 

Effective Categorical Reasoning  Bob Coecke, Dan Marsden  Category theory is an important tool in theoretical computer science. In introductory courses, proofs are generally conducted by pasting commuting diagrams together, or by simple equational reasoning. There are further proof styles in category theory, for example using string diagrams, “Yoneda style” proofs, and the use of internal logics. The aim of this project would be to investigate and contrast these different approaches, showing how, and when, they can be used effectively on realistic problems. A concrete starting point would be to understand the techniques involved, and then apply them to non-trivial proofs from the literature in order to demonstrate their relative benefits. An ideal outcome would be an example-based account for computer scientists of how to reason efficiently in category theory. Prerequisites: A good understanding of elementary category theory and comfort with mathematical proofs is essential. Some experience with string diagrams, as for example in the quantum computer science courses in the department, would also be helpful. 

Extending pregroup grammars for inflected languages  Bob Coecke, Martha Lewis  Within the compositional distributional model of meaning developed by (Coecke et al., 2010), the model of grammar used is based on pregroup grammars, which have the feature that word order is important. In inflected languages, word order may become less important, and therefore pregroups may not be the best model of grammar. The project would involve extending the pregroup grammar to allow for different grammatical cases and to allow for more flexibility in word order. The model developed might simplify the approach taken in (Casadio and Lambek, 2005), which still uses the fundamentally ordered structure of pregroups as a basis for the grammar. Prerequisites: either category theory or quantum computer science would do References: 

Quantum Computing and Quantum Information, Logic, Category Theory, Fundamental Physics  Bob Coecke  Bob Coecke is willing to supervise projects in the following areas; please feel free to contact him: development and applications of the categorical quantum mechanics formalism and corresponding graphical languages (http://en.wikipedia.org/wiki/Categorical_quantum_mechanics); development of the quantomatic automated graphical reasoning software (https://sites.google.com/site/quantomatic/); theory development and empirical experiments for meaning in natural language processing (http://www.cs.ox.ac.uk/people/bob.coecke/NewScientist.pdf, http://www.cs.ox.ac.uk/industry/content/IndustryNewsSummer2011.pdf) 

Software development: Quantomatic, automated graphical reasoning tools  Bob Coecke  Recent graph-based formalisms for computation provide an abstract and symbolic way to represent and simulate quantum information processing. Manual manipulation of such graphs is slow and error-prone. This project employs a formalism, based on monoidal categories, that supports mechanised reasoning with open graphs. This gives a compositional account of graph rewriting that preserves the underlying categorical semantics.  
Syntax and semantics for noun-noun combinations  Bob Coecke, Martha Lewis  In English, differences in the meaning of a compound noun phrase may be signalled by the intonation used. A 'TOY shop' is a shop that sells toys, whereas a 'toy SHOP' is a toy that is a shop. In the latter case the word 'toy' is used as an adjective and is more straightforwardly compositional, but in the former case the meaning of the compound is idiomatic (Kamp and Partee, 1995). The project would model these facets of meaning within the compositional distributional model of meaning (Coecke et al., 2010), and test the model thus developed within a semantic vector space based on text corpora. The project might use methods from Coecke and Lewis (2015), which models overextension in noun combinations, and Kartsaklis and Sadrzadeh (2015), which models differences in intonation. Prerequisites: Linear Algebra; either category theory or quantum computer science would do; Python (or a similar language suitable for text processing). References: 

Tool Support for Graphical Categorical Reasoning  Bob Coecke, Dan Marsden  Recent work has demonstrated how a suitable form of string diagrams can provide a powerful tool for reasoning in elementary category theory. Currently these diagrams are laid out and manipulated manually, as good layout choices are important for aiding readers' intuitions. The project would involve investigating appropriate tool support for working with these diagrams, and ideally providing a prototype implementation to demonstrate some of the ideas involved. The emphasis will be on convenient layout of practically useful diagrams that convey the correct geometric intuitions, rather than on proof automation as provided by the Quantomatic tool for similar graphical calculi. Prerequisites: A good knowledge of elementary category theory is essential. Some experience with string diagrams, as for example in the department's quantum computer science courses, would be helpful. Good programming skills in at least one language will be needed for prototyping. 

Kinect interface development for cybersecurity visual analytics tool  Sadie Creese  (Joint with Michael Goldsmith) Kinect interface development for a cybersecurity visual analytics tool: using the open development environment for Kinect to develop an alternative HCI for the Oxford CyberVis tool http://www.cs.ox.ac.uk/projects/cybervis/ (focusing on navigation of the CyberVis environment). As CyberVis is primarily developed in Java, a good understanding of that language is essential in order to interface Kinect to CyberVis. Suitable for 3rd or 4th year undergraduates, or MSc. Other projects on novel human-computer interfaces for security are possible, depending on interest and inspiration.


Modelling of security-related interactions, in CSP or more evocative notations such as Milner's bigraphs  Sadie Creese  (Joint with Michael Goldsmith) Modelling of security-related interactions, in CSP or more evocative notations such as Milner's bigraphs (http://www.springerlink.com/content/axra2lvj4ph81bdm/; http://www.amazon.co.uk/TheSpaceMotionCommunicatingAgents/dp/0521738334/ref=sr_1_2?ie=UTF8&qid=1334845525&sr=82). Concurrency and Computer Security are a distinct advantage. Appropriate for good 3rd or 4th year undergraduates or MSc. * Open to suggestions for other interesting topics in Cybersecurity, if anyone has a particular interest they would like to pursue. 

Smartphone security  Sadie Creese  (Joint with Michael Goldsmith) Smartphone security: one concrete idea is the development of a policy language to allow the authors of apps to describe their behaviour, designed to be precise about the expected access to peripherals and networks and the purpose thereof (data required and usage); uses skills in formal specification and understanding of app behaviour (by studying open-source apps), possibly leading to prototyping a software tool to perform runtime checking that the claimed restrictions are adhered to. Suitable for good 3rd or 4th year undergraduates, or MSc. Other projects within this domain are possible, according to interest and inspiration.
Prerequisites: Concurrency, Concurrent Programming, Computer Security all possibly an advantage. 

Technology-layer social networks  Sadie Creese  (Joint with Michael Goldsmith) Technology-layer social networks: investigate the potential to identify relationships between people via technology metadata: which machines their machines are "friendly" with. Research will involve identification of all metadata available from the network layer, app layers and the data layer; development of appropriate relationship models; and practical experimentation / forensic-style work exploring how to extract relationships between technologies and identities. Appropriate for 4th year undergraduates or MSc. 

Information Disclosure in Data Integration  Bernardo Cuenca Grau, Michael Benedikt, Egor Kostylev  Data integration systems allow users to effectively access data sitting in multiple data sources (typically relational databases) by means of queries over a global schema. In practice, data sources often contain sensitive information that the data owners want to keep inaccessible to users. In a recent research paper, the project supervisors have formalized and studied the problem of determining whether a given data integration system discloses sensitive information to an attacker. The paper studies the computational properties of the relevant problems and also identifies situations in which practical implementations are feasible. The goal of the project is to design and implement practical algorithms for checking whether information disclosure can occur in a data integration setting. These algorithms would be applicable to the aforementioned situations for which practical implementations seem feasible. Prerequisites: Familiarity with Databases. The student would also benefit from taking the Knowledge Representation and Reasoning course and/or Theory of Data and Knowledge Bases.  
Practical Confidentiality Enforcement Algorithms in Ontology-based Information Systems  Bernardo Cuenca Grau  BACKGROUND Preserving confidentiality of information (i.e., ensuring that sensitive data is only accessible to authorised users) is a critical requirement for the design of information systems. In recent years, Semantic Technologies have become widespread in many application domains. There is consequently a pressing need for suitable confidentiality enforcement infrastructure in information systems which rely on RDF as a data model, SPARQL as a query language, and OWL 2 as a language for describing background knowledge. Controlled Query Evaluation (CQE) is an approach to confidentiality enforcement where system administrators specify, in a declarative way, the information that cannot be disclosed to users by means of a confidentiality policy. When given a user query, a censor checks whether returning the answer would lead to a violation of the corresponding policy and thus to a disclosure of confidential information to unauthorised users; in that case, the censor returns a distorted answer. CQE has recently been studied in the context of ontologies, and algorithms have been proposed for the case where the relevant ontologies are expressed in the OWL 2 RL profile: a fragment of OWL 2 for which query answering is known to be theoretically tractable in the size of both ontology and data, and efficiently implementable by means of rule-based technologies. PROJECT GOALS The main goals of this project are the following: • Implement the basic CQE algorithm for linear OWL 2 RL ontologies. This algorithm will be implemented on top of RDFox (http://www.cs.ox.ac.uk/isg/tools/RDFox/): a query answering engine for OWL 2 RL. • Study and implement potential optimisations. • Evaluate the implementation using both benchmark and realistic datasets. • Design and implement an extension of the basic algorithm that is applicable to a larger fragment of OWL 2 RL. 
PROJECT PREREQUISITES • Good programming skills in Java and/or C++ • The following courses are relevant for this project: o Knowledge Representation and Reasoning o Theory of Data and Knowledge Bases o Databases o Computer Security  
Voting for facility location  Edith Elkind  Description: suppose that people living in a given geographic area would like to decide where to place a certain facility (say, a library or a gas station). There are several possible locations, and each person prefers to have the facility as close to them as possible. However, the central planner, who will make the final decision, does not know the voters' locations, and, moreover, for various reasons (such as privacy or the design of the voting machine), the voters cannot communicate their locations. Instead, they communicate their ranking over the available locations, ranking them from the closest one to the one that is furthest away. The central planner then applies a fixed voting rule to these rankings. The quality of each location is determined by the sum of distances from it to all voters, or alternatively by the maximum distance. A research question that has recently received a substantial amount of attention is whether classic voting rules tend to produce good-quality solutions in this setting. The goal of the project would be to empirically evaluate various voting rules with respect to this measure, both for single-winner rules and for multi-winner rules (where more than one facility can be opened). In addition to purely empirical work, there are interesting theoretical questions that one could explore, such as proving worst-case upper and lower bounds on the performance of various rules. Prerequisites: basic programming skills.  
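The empirical setup described above can be prototyped in a few lines. Below is a minimal sketch, assuming voters and candidate locations are points on a line, plurality as the (purely illustrative) voting rule, and sum of distances as the cost measure; all function names are our own, not part of any proposed framework.

```python
# Sketch: compare the social cost of a plurality winner against the optimal
# facility location, with voters and candidate locations as points on a line.

def ranking(voter, locations):
    """A voter ranks candidate locations from closest to furthest."""
    return sorted(range(len(locations)), key=lambda j: abs(voter - locations[j]))

def plurality_winner(voters, locations):
    """Each voter's top choice gets one point; highest total wins."""
    counts = [0] * len(locations)
    for v in voters:
        counts[ranking(v, locations)[0]] += 1
    return max(range(len(locations)), key=lambda j: counts[j])

def social_cost(location, voters):
    """Sum of distances from the facility to all voters."""
    return sum(abs(v - location) for v in voters)

voters = [0.0, 0.1, 0.2, 4.0]
locations = [0.1, 4.0]
w = plurality_winner(voters, locations)
opt = min(locations, key=lambda x: social_cost(x, voters))
print(social_cost(locations[w], voters), social_cost(opt, voters))
```

An empirical study would repeat this over many random instances and rules, recording the ratio of the winner's cost to the optimal cost (the "distortion").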
Form Corpus & Benchmark  Tim Furche  (Supervisor C Schallhart) Web pages are the past: interactive web application interfaces have reshaped the online world. With all their feature richness, they enrich our personal online experience and provide some great new challenges for research. In particular, forms have become much more complex in assisting the user during filling, e.g., with completion options, or through structuring the form-filling process by dynamically enabling or hiding form elements. Such forms are a very interesting research topic, but their complexity has so far prevented the establishment of a corpus of modern forms for benchmarking different tools that deal with forms automatically. This MSc project will fill this gap by building a corpus of such forms: based on a number of production sites from one or two domains, we will build a corpus of web interfaces, connected to a (shared) database. Not only will future evaluations in the DIADEM project rely on this corpus, but we will also publish it, promoting it as a common benchmark for the research community working on forms. Knowledge of Java, HTML, CSS, Javascript, and web application development is required.  
Form Filling & Probing  Tim Furche  (Joint with C Schallhart) Unearthing the knowledge hidden in queryable web sites requires a good understanding of the forms involved. As part of DIADEM, we are developing OPAL (Ontology based web Pattern Analysis with Logic), a tool to recognize forms belonging to a parameterizable application domain, such as the real estate or used car market. OPAL determines the meaning of individual form elements, e.g., it identifies the field for the minimum or maximum price or for some location. This MSc project will build upon OPAL to deal not only with static forms but also with sequences of interrelated forms, as in the case of a rough initial form followed by a refinement form, or of forms showing some options only after some other parts have been filled. Over the course of this MSc project, we will develop a tool which invokes OPAL to analyze a given form, explores all available submission mechanisms on this form, analyzes the resulting pages for forms continuing the initial query, and combines the outcomes of all found forms into a single interaction description. Knowledge of Java, HTML, CSS is required; prior experience in logic programming would be a strong plus.  
Benchmarks for Bayesian Deep Learning: Astrophysics  Yarin Gal  Bayesian deep learning (BDL) is a field of Machine Learning where we develop tools that can reason about their confidence in their predictions. A main challenge in BDL is comparing different tools to each other, with common benchmarks being much needed in the field. In this project we will develop a set of tools to evaluate Bayesian deep learning techniques, reproduce common techniques in BDL, and evaluate them with the developed tools. The tools we will develop will rely on downstream tasks that have made use of BDL in real-world applications, such as parameter estimation in Strong Gravitational Lensing with neural networks. Prerequisites: only suitable for someone who has done Probability Theory, has worked in Machine Learning in the past, and has strong programming skills (Python).  
Benchmarks for Bayesian Deep Learning: Diabetes Diagnosis  Yarin Gal  Bayesian deep learning (BDL) is a field of Machine Learning where we develop tools that can reason about their confidence in their predictions. A main challenge in BDL is comparing different tools to each other, with common benchmarks being much needed in the field. In this project we will develop a set of tools to evaluate Bayesian deep learning techniques, reproduce common techniques in BDL, and evaluate them with the developed tools. The tools we will develop will rely on downstream tasks that have made use of BDL in real-world applications, such as detecting diabetic retinopathy from fundus photos and referring the most uncertain decisions for further inspection. Prerequisites: only suitable for someone who has done Probability Theory, has worked in Machine Learning in the past, and has strong programming skills (Python).  
Small Data Challenge in Reinforcement Learning  Yarin Gal  Reinforcement learning (RL) algorithms often require large amounts of data for training, data which is often collected from simulations or experiments on robotic systems. The requirement for large amounts of data forms a major hurdle to using RL algorithms for tasks in robotics, though, where each real-world experiment costs time and potential damage to the robot. In this project we will develop a mock "Challenge" similar to Kaggle challenges. In this challenge we will restrict the amount of data a user can query from the system at each point in time, and try to implement simple RL baselines under this constraint. We will inspect the challenge definition and try to improve it. Prerequisites: only suitable for someone who has worked in Machine Learning in the past, is familiar with Reinforcement Learning, and has strong programming skills (Python).  
Data Science and Machine Learning Techniques for Model Parameterisation in Biomedical and Environmental Applications  David Gavaghan, Martin Robinson, Michael Clerx, Sanmitra Ghosh  Time series data arise as the output of a wide range of scientific experiments and clinical monitoring techniques. Typically the system under study will either be undergoing time-varying changes which can be recorded, or the system will have a time-varying signal as input and the response signal will be recorded. Familiar everyday examples of the former include ECG and EEG measurements (which record the electrical activity in the heart or brain as a function of time), whilst examples of the latter occur across scientific research from cardiac cell modelling to battery testing. Such recordings contain valuable information about the underlying system under study, and gaining insight into the behaviour of that system typically involves building a mathematical or computational model of that system, which will have embedded within it key parameters governing system behaviour. The problem that we are interested in is inferring the values of these key parameters through the application of techniques from machine learning and data science. Currently used methods include Bayesian inference (Markov Chain Monte Carlo (MCMC), Approximate Bayesian Computation (ABC)) and nonlinear optimisation techniques, although we are also exploring the use of other techniques such as probabilistic programming and Bayesian deep learning. We are also interested in developing techniques that will speed up these algorithms, including parallelisation and the use of Gaussian Process emulators of the underlying models.
Application domains of current interest include modelling of the cardiac cell (for assessing the toxicity of new drugs), understanding how biological enzymes work (for application in developing novel fuel cells), as well as a range of basic science problems. Prerequisites: some knowledge of Python  
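As a flavour of the inference methods mentioned above, here is a minimal Metropolis-Hastings MCMC sketch that infers a single rate parameter of a toy exponential-decay model from noisy synthetic time series data. The model, noise level, step size and all names are illustrative assumptions, not the project's actual cardiac or enzyme models.

```python
# Sketch: Metropolis-Hastings MCMC inferring one rate parameter k of a toy
# exponential-decay model y(t) = exp(-k t) from noisy observations.
import math, random

random.seed(0)
ts = [0.5 * i for i in range(10)]
k_true, sigma = 0.7, 0.05
data = [math.exp(-k_true * t) + random.gauss(0, sigma) for t in ts]

def log_likelihood(k):
    """Gaussian measurement noise with known standard deviation sigma."""
    return sum(-((y - math.exp(-k * t)) ** 2) / (2 * sigma ** 2)
               for t, y in zip(ts, data))

def metropolis(n_steps=5000, step=0.1):
    k, ll = 0.3, log_likelihood(0.3)   # arbitrary starting guess
    samples = []
    for _ in range(n_steps):
        k_new = k + random.gauss(0, step)       # symmetric random-walk proposal
        if k_new > 0:                           # flat prior on k > 0
            ll_new = log_likelihood(k_new)
            if math.log(random.random()) < ll_new - ll:   # accept/reject
                k, ll = k_new, ll_new
        samples.append(k)
    return samples

samples = metropolis()
posterior_mean = sum(samples[1000:]) / len(samples[1000:])  # discard burn-in
print(round(posterior_mean, 2))  # should land close to k_true = 0.7
```

Real applications replace the toy model with an ODE simulation of the system, which is where Gaussian Process emulators and parallelisation become important for speed.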
Cake-cutting with low envy  Paul Goldberg  Description: I-cut-you-choose is the classical way for two people to share a divisible good. For three people, there exists a sequence of operations using 5 cuts that is also envy-free, but for 4 or more people it is unknown whether you can share in an envy-free manner using a finite number of cuts. (This is with respect to a well-known class of procedures that can be represented using a tree whose nodes are labelled with basic "cut" and "choose" operations.) The general idea of this project is to generate and test large numbers of potential cake-cutting procedures, and measure the "extent of envy" in the case where they are not envy-free. (See Wikipedia's page on the "cake-cutting problem".) It is of interest to find out the amount of envy that must exist in relatively simple procedures. Prerequisites: competence and enthusiasm for program design and implementation; mathematical analysis and proofs. 
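To make the "extent of envy" measure concrete, here is a small sketch on a discretised cake (a list of segments, each agent assigning its own value to each segment). It simulates I-cut-you-choose for two agents and reports each agent's envy; the valuations and the discrete-cake simplification are illustrative assumptions.

```python
# Sketch: measuring "extent of envy" under I-cut-you-choose on a discrete cake.

def cut_and_choose(v1, v2):
    """Agent 1 cuts at the index splitting the cake into two pieces she values
    as equally as possible; agent 2 then takes the piece he values more."""
    n = len(v1)
    cut = min(range(n + 1),
              key=lambda c: abs(sum(v1[:c]) - sum(v1[c:])))
    left, right = list(range(cut)), list(range(cut, n))
    if sum(v2[i] for i in left) >= sum(v2[i] for i in right):
        return right, left   # agent 2 chooses left, agent 1 keeps right
    return left, right

def envy(piece_own, piece_other, v):
    """How much more (in v's own units) this agent values the other piece."""
    return max(0, sum(v[i] for i in piece_other) - sum(v[i] for i in piece_own))

v1 = [1, 1, 1, 1]          # agent 1 values the cake uniformly
v2 = [0, 0, 3, 1]          # agent 2 only cares about the right end
p1, p2 = cut_and_choose(v1, v2)
print(envy(p1, p2, v1), envy(p2, p1, v2))  # → 0 0 (envy-free, as expected)
```

The project's search would generate many such cut/choose trees automatically and record the worst-case envy of each over random valuation profiles.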

Computing prices in a generalisation of the Product-Mix Auction  Paul Goldberg  The Product-Mix Auction was devised by Klemperer in 2008 for the purpose of providing liquidity to commercial banks during the financial crisis; it was used for a number of years by the Bank of England. See the following link: 

Contest design  Paul Goldberg  Description: A "contest" refers to a widespread form of competition that has attracted a huge literature in Economics. In a contest, players expend costly effort that translates into some form of output, or score. Then they (or some of them) receive prizes associated with their ranking on output/score. The project would study a class of contests considered in the paper "Ranking games that have competitiveness-based strategies". One aspect of the project would be to study the convergence properties of the Fictitious Play procedure applied to these games. We also envisage addressing, experimentally, the challenge of handicapping. Handicapping involves ranking the players on the values of monotonic functions of their outputs, rather than just raw outputs, in order to elicit greater effort; the challenge is to choose the best functions. The project is experimental, with some scope for analytical work (specifically, for the problem of optimal handicapping in the 2-player case). Prerequisites: familiarity with basic probability theory. 

Medical information extraction with deep neural networks  Paul Goldberg, Alejo Nevado-Holgado  Background. Electronic Health Records (EHRs) are the databases used by hospitals and general practitioners to log daily all the information they record from patients (i.e. disorders, medications taken, symptoms, medical tests…). In number of subjects (e.g. 50 million patients in the case of EMIF http://www.emif.eu/), EHRs are the largest source of empirical data in biomedical research, allowing for major scientific findings in central disorders such as cancer and Alzheimer's disease [1]. However, most of the information held in EHRs is in the form of natural language text (written by the physician during each session with each patient), making it inaccessible for research. Unlocking all this information would bring a very significant advancement to biomedical research, multiplying the quantity and variety of scientifically usable data, which is why major efforts have relatively recently been initiated towards this aim (e.g. the I2B2 challenges https://www.i2b2.org/NLP/ or the UK-CRIS network of EHRs https://crisnetwork.co/ukcrisprogramme). In Artificial Intelligence, Information Extraction (IE) is the task of systematically extracting information from natural language text in a form that can later be processed by a computer. For instance, if a physician describes the symptoms and full treatment of a patient, an IE task could be identifying from the text alone all the drugs prescribed to the patient and their dosages. Although Natural Language Processing (NLP) algorithms can perform this task with fair accuracy in the simpler situations (e.g. well-structured text, large amounts of labelled data available…), the challenge remains an unsolved problem in the more complex cases (e.g. badly structured language; no labelled samples…), which are more akin to the text typically found in EHRs. Namely, physicians tend to use badly formatted shorthand and non-widespread acronyms (e.g. 'transport pt to OT tid via W/C' for 'transport patient to occupational therapy three times a day via wheel chair'), while labelled records are scarce (ranging in the hundreds for a given task, with the best labelled datasets provided by I2B2). Project. Recent Deep Neural Network (DNN) [2] architectures have shown remarkable results in traditionally unsolved NLP problems, including some IE tasks such as Slot Filling [3] and Relation Classification [4]. When transferring this success to EHRs, DNNs offer the advantage of not requiring well-formatted text, while the problem remains that labelled data is scarce (ranging in the hundreds for EHRs, rather than the tens of thousands used in typical DNN studies). However, ongoing work in our lab has shown that certain extensions of recent NLP-DNN architectures can reproduce the typical remarkable success of DNNs in situations with limited labelled data (paper in preparation). Namely, incorporating interaction terms into feed-forward DNN architectures [5] can raise the performance of relation classification on I2B2 datasets from 0.65 F1 score to 0.90, while the highest performance previously reported on the same dataset was 0.74. With an F1 score of 0.90, the quality of the extracted information meets the standards required for such information to be used in subsequent biomedical studies, promising to unlock the scientific data that at present is hidden in the free text of EHRs. We therefore propose to apply DNNs to the problem of information extraction in EHRs, using I2B2 and UK-CRIS data as a testbed. More specifically, the DNNs designed and implemented by the student should be able to extract medically relevant information, such as prescribed drugs or diagnoses given to patients. This corresponds to some of the challenges proposed by I2B2 during recent years (https://www.i2b2.org/NLP/Medication/), and these are objectives of high interest in UK-CRIS which have sometimes been addressed with older techniques such as rules [1,6,7]. 
The student is free to use the extension of the feed-forward DNN developed in our lab, or to explore other feed-forward or recurrent (e.g. RNN, LSTM or GRU) alternatives. The DNN should be implemented in Python's Keras (https://keras.io/), Theano (http://deeplearning.net/software/theano/), Tensorflow (https://www.tensorflow.org/), or PyTorch (http://pytorch.org/). Bibliography [1] G. Perera, M. Khondoker, M. Broadbent, G. Breen, R. Stewart, Factors Associated with Response to Acetylcholinesterase Inhibition in Dementia: A Cohort Study from a Secondary Mental Health Care Case Register in London, PLOS ONE. 9 (2014) e109484. doi:10.1371/journal.pone.0109484. [2] Y. LeCun, Y. Bengio, G. Hinton, Deep learning, Nature. 521 (2015) 436–444. doi:10.1038/nature14539. [3] G. Mesnil, Y. Dauphin, K. Yao, Y. Bengio, L. Deng, D. Hakkani-Tur, X. He, L. Heck, G. Tur, D. Yu, G. Zweig, Using Recurrent Neural Networks for Slot Filling in Spoken Language Understanding, IEEE-ACM Trans. Audio Speech Lang. Process. 23 (2015) 530–539. doi:10.1109/TASLP.2014.2383614. [4] C.N. dos Santos, B. Xiang, B. Zhou, Classifying Relations by Ranking with Convolutional Neural Networks, CoRR. abs/1504.06580 (2015). http://arxiv.org/abs/1504.06580. [5] M. Denil, A. Demiraj, N. Kalchbrenner, P. Blunsom, N. de Freitas, Modelling, Visualising and Summarising Documents with a Single Convolutional Neural Network, CoRR. abs/1406.3830 (2014). http://arxiv.org/abs/1406.3830. [6] E. Iqbal, R. Mallah, R.G. Jackson, M. Ball, Z.M. Ibrahim, M. Broadbent, O. Dzahini, R. Stewart, C. Johnston, R.J.B. Dobson, Identification of Adverse Drug Events from Free Text Electronic Patient Records and Information in a Large Mental Health Case Register, PLOS ONE. 10 (2015) e0134208. doi:10.1371/journal.pone.0134208. [7] R.G. Jackson, M. Ball, R. Patel, R.D. Hayes, R.J. Dobson, R. Stewart, TextHunter – A User Friendly Tool for Extracting Generic Concepts from Free Text in Clinical Research, AMIA Annu. Symp. Proc. 2014 (2014) 729–738.  
Neural network genomics  Paul Goldberg, Alejo Nevado-Holgado  Background. For more than 10 years, GWAS studies have represented a revolution in the study of human disorders and human phenotypes. By measuring how your risk of suffering any given disease changes according to SNP mutations, GWAS studies can measure how relevant each gene is to the disease under study. However, this SNP-disease association is calculated "one SNP at a time", and ignores all gene-gene interactions, which are often crucially important to the causation of the disorder. If the interaction between a number of genes (e.g. insulin and its receptor in type 2 diabetes; or APP and alpha, beta and gamma secretase in Alzheimer's…) is what produces a given disorder, this interaction will not be detected in a GWAS analysis. This shortcoming may not be a problem in monogenic heritable disorders, such as Huntington's disease, where mutations in a single gene are by themselves enough to cause the disease. However, GWAS studies will likely not uncover the mechanisms of complex disorders, where the disease emerges from the interaction of a number of genes. This is likely the source of the so-called "missing heritability" problem observed in most complex traits or diseases, where all the SNP variations taken together cannot account for most of the heritability of a given trait or disease [1]. In addition, it has been demonstrated that complex traits such as height and BMI are clearly and strongly heritable [2], but GWAS studies simply cannot detect most of this heritability. In summary, GWAS analyses detect simple individual genetic factors, but not interactions between genetic factors. Project. In this project, we propose to identify the interacting genetic factors behind Alzheimer's disease and other neurodegenerations with neural networks, which are known for exploiting the interactions present in the data being analysed. While the linear models used in GWAS studies are able to identify only linear and monovariate contributions of each gene to a disorder, neural networks can analyse how genes interact with each other to explain the studied disorder. This versatility is what has allowed neural networks to find such widespread use in industry in the last decade, where they are revolutionizing image, sound and language analysis [3–5]. In our laboratories we already have the hardware and the expertise required to build neural networks, and we have used them in other research areas relevant to Alzheimer's disease. In addition, we have access to and experience using UK Biobank, which is the ideal dataset for this project. UK Biobank has measured wide-array SNPs, all disorders and demographics in ~500,000 participants between the ages of 37 and 73, more than 5000 of whom suffer from Alzheimer's disease, Parkinson's disease or other neurodegenerations. We propose that the MSc student build a neural network to predict either diagnoses or disease-related endophenotypes (i.e. brain volumes of affected areas, cognitive scores…) of each of these participants, using only the information present in the wide-array SNPs and relevant demographics. The student is free to use the extension of the feed-forward DNN developed in our lab, or to explore other feed-forward or recurrent (e.g. RNN, LSTM or GRU) alternatives. The DNN should be implemented in Python's Keras (https://keras.io/), Theano (http://deeplearning.net/software/theano/), Tensorflow (https://www.tensorflow.org/), or PyTorch (http://pytorch.org/). Bibliography [1] T.A. Manolio, F.S. Collins, N.J. Cox, D.B. Goldstein, L.A. Hindorff, D.J. Hunter, M.I. McCarthy, E.M. Ramos, L.R. Cardon, A. Chakravarti, J.H. Cho, A.E. Guttmacher, A. Kong, L. Kruglyak, E. Mardis, C.N. Rotimi, M. Slatkin, D. Valle, A.S. Whittemore, M. Boehnke, A.G. Clark, E.E. Eichler, G. Gibson, J.L. Haines, T.F.C. Mackay, S.A. McCarroll, P.M. Visscher, Finding the missing heritability of complex diseases, Nature. 461 (2009) 747–753. doi:10.1038/nature08494. [2] D. Cesarini, P.M. Visscher, Genetics and educational attainment, Npj Sci. Learn. 2 (2017). doi:10.1038/s41539-017-0005-6. [3] Y. LeCun, Y. Bengio, G. Hinton, Deep learning, Nature. 521 (2015) 436–444. doi:10.1038/nature14539. [4] A. Esteva, B. Kuprel, R.A. Novoa, J. Ko, S.M. Swetter, H.M. Blau, S. Thrun, Dermatologist-level classification of skin cancer with deep neural networks, Nature. 542 (2017) 115–118. doi:10.1038/nature21056. [5] V. Mnih, K. Kavukcuoglu, D. Silver, A.A. Rusu, J. Veness, M.G. Bellemare, A. Graves, M. Riedmiller, A.K. Fidjeland, G. Ostrovski, S. Petersen, C. Beattie, A. Sadik, I. Antonoglou, H. King, D. Kumaran, D. Wierstra, S. Legg, D. Hassabis, Human-level control through deep reinforcement learning, Nature. 518 (2015) 529–533. doi:10.1038/nature14236.  
Rock-paper-scissors for the computerised strategist  Paul Goldberg  In the well-known game of rock-paper-scissors, it is clear that any player can "break even" by playing entirely at random. On the other hand, people do a poor job of generating random numbers, and expert players of the game can take advantage of predictable aspects of opponents' behaviour. In this project, we envisage designing algorithms that adapt to a human opponent's behaviour, using for example no-regret learning techniques, and modelling the opponent as a probabilistic automaton. Ideally, the student taking this project should manage to persuade some volunteers to test the software! This is needed to provide data for the algorithms, and should provide feedback to the opponent on what mistakes they are making. Depending on how it goes, we could also consider extending this approach to related games. Prerequisite: Interest in working with probability is important. 
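As a baseline for the adaptive algorithms envisaged above, the sketch below counts the opponent's past moves and best-responds to the empirical distribution. It is a simple stand-in for the no-regret and probabilistic-automaton models mentioned, and all class and function names are illustrative.

```python
# Sketch: a baseline adaptive rock-paper-scissors player that predicts the
# opponent's most frequent move and plays whatever beats it.
from collections import Counter

BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

class FrequencyPlayer:
    def __init__(self):
        # Start each count at 1 (a uniform prior), so early play is not
        # dominated by one or two observations.
        self.counts = Counter({m: 1 for m in BEATS})

    def move(self):
        predicted = self.counts.most_common(1)[0][0]
        return BEATS[predicted]          # play what beats the prediction

    def observe(self, opponent_move):
        self.counts[opponent_move] += 1

player = FrequencyPlayer()
for m in ["rock", "rock", "rock", "paper"]:
    player.observe(m)
print(player.move())  # opponent mostly plays rock, so → paper
```

A stronger opponent model (e.g. a probabilistic automaton conditioning on the last few moves) can be dropped in by replacing how `counts` is keyed, and no-regret methods would mix such predictors rather than trusting one.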

Using behavioural data to model a game-theoretic adversary  Paul Goldberg  To take this project, a student should have done the courses in Computational Learning Theory and Machine Learning. The project is related to the paper (Yang et al.) below, and uses a data set (studied in the paper) obtained from an online game that simulates an adversarial security setting, in which the player selects one of a number of targets to attack, based on information about their values and the probabilities that they will be defended. The purpose of this game is to understand a player's "bounded rationality", i.e. the ways in which he fails to make the best decision. The project would study a model of player behaviour, related to ones studied in (Yang et al.), and aim to learn the ways in which a player fails to best-respond. A related objective is to design optimal defender behaviour based on what is learned about attackers' behaviour. @article{YKOTJ13, author="R.\ Yang and C.\ Kiekintveld and F.\ Ord{\'o}{\~n}ez and M.\ Tambe and R.\ John", title="Improving resource allocation strategies against human adversaries in security games: An extended study", journal="Artificial Intelligence", volume="195", pages="440-469", year="2013", }  
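One family of bounded-rationality models studied in this literature is quantal response, where the attacker chooses target i with probability proportional to exp(lambda * u_i), for expected utility u_i. The sketch below is illustrative only: the utilities and the lambda value are made up, and fitting lambda to the behavioural data would be part of the project.

```python
# Sketch: a quantal-response model of a boundedly rational attacker.
# lambda = 0 gives uniform random play; large lambda approaches best response.
import math

def quantal_response(utilities, lam):
    """Softmax over expected utilities: P(i) ∝ exp(lam * u_i)."""
    weights = [math.exp(lam * u) for u in utilities]
    total = sum(weights)
    return [w / total for w in weights]

# Expected utility of attacking each of three targets (illustrative numbers).
utilities = [1.0, 2.0, 0.5]
print([round(p, 3) for p in quantal_response(utilities, lam=1.0)])
```

Fitting the model means choosing lambda (and possibly richer utility terms) to maximise the likelihood of the observed attack choices; the fitted model then feeds into computing the defender's optimal coverage.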
A Conceptual Model for Assessing Privacy Risk  Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola, Jason Nurse  Privacy is not a binary concept: the level of privacy enjoyed by an individual or organisation will depend upon the context within which it is being considered; the more data an attacker has access to, the more potential there may be for privacy compromise. We lack a model which considers the different contexts that exist in current systems, and which would underpin a measurement system for determining the level of privacy risk that might be faced. This project would seek to develop a prototype model – based on a survey of known privacy breaches and common practices in data sharing. The objective is to propose a method by which privacy risk might be considered, taking into account the variety of (threat and data-sharing) contexts that any particular person or organisation might be subjected to. It is likely that the differences and similarities of the individual and organisational points of view will need to be considered, since the nature of the contexts faced could be quite diverse.  
Computer Vision for Physical Security  Michael Goldsmith, Sadie Creese, Jassim Happa  Computer Vision allows machines to recognise objects in real-world footage. In principle, this allows machines to flag potential threats in an automated fashion based on historical and present images in video footage of real-world environments. Automated threat detection mechanisms to help security guards identify threats would be of tremendous help to them, especially if they have to temporarily leave their post, or have to guard a significant number of areas. In this project, students are asked to implement a system that is able to observe a real environment over time and attempt to identify potential threats, independent of a number of factors, e.g. lighting conditions or wind conditions. The student is encouraged to approach this challenge as they see fit, but would be expected to design, implement and assess any methods they develop. One approach might be to implement a camera system using e.g. a web camera or a Microsoft Kinect to conduct anomaly detection on real-world environments, and flag any issues related to potential threats. Requirements: Programming skills required 
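As a baseline for what "observing an environment over time" might mean computationally, a simple background-subtraction motion detector can be sketched in a few lines of NumPy. This is only a starting point, not a threat detector; thresholds and the synthetic footage are illustrative:

```python
import numpy as np

def flag_anomalous_frames(frames, threshold=25.0, min_fraction=0.01):
    """Flag frames whose pixel-wise difference from a running background
    exceeds `threshold` on more than `min_fraction` of pixels."""
    background = frames[0].astype(float)
    flags = []
    for frame in frames[1:]:
        diff = np.abs(frame.astype(float) - background)
        changed = (diff > threshold).mean()
        flags.append(changed > min_fraction)
        # Exponential moving average lets the background adapt to slow
        # lighting changes while still flagging sudden appearances.
        background = 0.95 * background + 0.05 * frame
    return flags

# Synthetic example: a static scene, then a bright object appears at frame 5.
moving = np.zeros((10, 8, 8), dtype=np.uint8)
moving[5:, 2:6, 2:6] = 200
flags = flag_anomalous_frames(moving)
```

A project would replace the synthetic frames with camera input and the crude differencing with whatever detection method the student designs.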

Considering the performance limiters for controls  Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola, Jason Nurse  The behaviour of controls depends upon how they are used. One obvious example is that the protection afforded by a firewall depends upon the maintenance of the rules that determine what the firewall stops and what it does not. The benefit of various technical controls in an operational context lacks good evidence and data, so there is scope to consider the performance of controls in a lab environment. This mini-project would select one or more controls from the CIS Top 20 Critical Security Controls (CSC) (version 6.1) and seek to develop laboratory experiments (and implement them) to gather data on how the effectiveness of the control is impacted by its deployment context (including, for example, configuration, dependence on other controls, and the threat faced). Requirements: Students will need an ability to develop a test suite and deploy the selected controls. 

Cybersecurity visualization  Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola, Jassim Happa  Cybersecurity visualization helps analysts and risk owners alike to make better decisions about what to do when the network is attacked. In this project the student will develop novel cybersecurity visualizations. The student is free to approach the challenge as they see fit, but would be expected to design, implement and assess the visualizations they develop. These projects tend to have a focus on network traffic visualization, but the student is encouraged to visualize the datasets they would be most interested in. Other interesting topics might for instance include: host-based activities (e.g. CPU/RAM/network usage statistics), network scans (incl. vulnerability scanning). Past students have visualized network traffic patterns and Android permission violation patterns. Other projects on visualizations are possible (not just cybersecurity), depending on interest and inspiration. Requirements: Programming skills required. 

Data poisoning  Michael Goldsmith  How worried should we be about a malign entity deliberately changing or poisoning our data? Is there a risk? What is its nature? What harms might be effected? Can we build data analytics that are resistant to such attacks, and can we detect them? It is highly unlikely that techniques for handling erroneous data will be sufficient, since we are likely to face highly targeted data corruption.  
Design/Architectural Vulnerability Analysis of Distributed Ledgers  Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola, Jason Nurse  This project would seek to study the general form of distributed ledgers, and the nuances and variations claimed in implementations, and assess all the possible weak points that might make implementations open to compromise. The general approach will be to develop a detailed understanding of the security requirements and interdependencies of functionality – capturing the general security case for a distributed ledger and how it decomposes into lower-level security requirements. Then an assessment will be made of each, and the potential for vulnerabilities in design and implementation considered. The ultimate result will be a broad analysis of potential weak points. If possible these will then be practically investigated in a lab-based environment. One output might be a proposal for testing strategies. 

Detecting disguised processes using Application-Behaviour Profiling  Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola, Jason Nurse  In order to avoid detection, malware can disguise itself as a legitimate program or hijack system processes to reach its goals. Commonly used signature-based Intrusion Detection Systems (IDS) struggle to distinguish between these processes and are thus only of limited use in detecting such attacks. They also have the shortcoming that they need to be updated frequently to possess the latest malware definitions, which makes them inherently prone to missing novel attacks. Anomaly-based IDSs, however, overcome this problem by maintaining a ground truth of normal application behaviour and reporting deviations from it as anomalies. In this project, students will be tasked to investigate how a process's behaviour can be profiled in an attempt to identify whether it is behaving anomalously and whether it can be correctly identified as malware. This project will build on existing research in this area. Requirements: Programming skills required. 
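One simple way to profile process behaviour, assuming system-call traces are available, is to record the n-grams observed during known-good runs and score new traces by the fraction of n-grams never seen before. A minimal sketch; the traces below are invented for illustration:

```python
def ngrams(seq, n=3):
    """All length-n contiguous subsequences of a trace, as a set."""
    return {tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)}

def anomaly_score(trace, profile, n=3):
    """Fraction of the trace's n-grams absent from the normal profile:
    0.0 = entirely familiar behaviour, 1.0 = entirely novel."""
    grams = ngrams(trace, n)
    if not grams:
        return 0.0
    unseen = sum(1 for g in grams if g not in profile)
    return unseen / len(grams)

# Hypothetical system-call traces: build a profile from normal runs, then
# score a trace where the process suddenly spawns another executable.
normal_runs = [
    ["open", "read", "read", "close", "open", "read", "close"],
    ["open", "read", "write", "close"],
]
profile = set().union(*(ngrams(r) for r in normal_runs))
benign = anomaly_score(["open", "read", "close"], profile)
suspicious = anomaly_score(["open", "read", "execve", "fork", "execve"], profile)
```

A real profiler would of course use far richer features (arguments, timing, resource usage) and a statistical model rather than set membership.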

Experimenting with anomaly detection features for performance  Michael Goldsmith, Ioannis Agrafiotis, Sadie Creese, Jason Nurse, Arnau Erola  This project considers relative detection performance using different feature sets, and different anomalies of interest, in the face of varying attacks. It will be conducted with a view to exploring the minimal feature sets that would still result in threat detection, and to producing guidance aimed at determining the critical datasets required for the control to be effective.  
Eye-tracker data analysis  Michael Goldsmith, Jassim Happa, Ivan Martinovic  Eye-tracking allows researchers to identify where observers look on a monitor. A challenge in analysing eye-tracker output, however, is identifying patterns of viewing behaviour from the raw eye-tracker logs over time and what they mean in a research context, particularly as no semantic information about the pixel being viewed is considered. In practice, this means that we know where someone is looking, but nothing about what it means to look at that pixel or the order of pixels. From a research perspective, being able to tag areas of interest, and what those areas mean, or to conduct other statistical analysis on viewing patterns would be of significant use in analysing results, particularly if these analyses could be automated. The purpose of this project is to conduct an eye-tracker experiment, and to design and implement useful methods to analyse eye-tracker data. We will be able to provide training for the student, so they are able to use the eye-tracking tools themselves. Other eye-tracking projects are also possible, depending on interest and inspiration. 
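As one example of the kind of analysis method a student might implement, the classic dispersion-threshold (I-DT) algorithm groups consecutive gaze samples into fixations. The thresholds and the synthetic gaze data below are illustrative:

```python
def _dispersion(window):
    """Spread of a window of (x, y) samples: x-range plus y-range."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=30.0, min_length=5):
    """Dispersion-threshold (I-DT) fixation detection on (x, y) gaze
    samples. Returns (start_index, end_index, centroid) triples."""
    fixations = []
    i = 0
    while i + min_length <= len(samples):
        j = i + min_length
        if _dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while the dispersion stays under threshold.
            while j < len(samples) and _dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            w = samples[i:j]
            cx = sum(p[0] for p in w) / len(w)
            cy = sum(p[1] for p in w) / len(w)
            fixations.append((i, j - 1, (cx, cy)))
            i = j
        else:
            i += 1
    return fixations

# Synthetic gaze: a fixation near (100, 100), a saccade, then a fixation
# near (400, 300).
gaze = ([(100 + d, 100 - d) for d in (0, 1, 2, 1, 0, 2)]
        + [(250, 200)]
        + [(400 + d, 300 + d) for d in (0, 1, 0, 2, 1)])
fix = detect_fixations(gaze)
```

Mapping fixation centroids onto tagged areas of interest is then a simple point-in-region test, which is where the semantic tagging described above would come in.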

Formal threat and vulnerability analysis of a distributed ledger model  Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola, Jason Nurse  This project would utilise the process algebra CSP and associated model checker FDR to explore various types of threat and how they might successfully compromise a distributed ledger. This type of modelling would indicate possible attacks on a distributed ledger, and could guide subsequent review of actual designs and testing strategies for implementations. The modelling approach would be based on the crypto-protocol analysis techniques already developed for this modelling and analysis environment, and would seek to replicate the approach for a distributed ledger system. Novel work would include developing the model of the distributed ledger, considering which components are important, formulating various attacker models and also formulating the security requirements / properties to be assessed using this model-checking based approach. Requirements: In order to succeed in this project students would need to have a working knowledge of the machine-readable semantics of CSP and also the model checker FDR. An appreciation of threat models and capability will need to be developed. 

High Dynamic Range Imaging for Documentation Applications  Michael Goldsmith, Jassim Happa, Sadie Creese  High dynamic range imaging (HDRI) allows more accurate information about light to be captured, stored, processed and displayed to observers. In principle, this allows viewers to obtain more accurate representations of real-world environments and objects. Naturally, HDRI would be of interest to museum curators to document their objects, particularly non-opaque objects, or objects whose appearance alters significantly depending on the amount of lighting in the environment. Currently, few tools exist that aid curators, archaeologists and art historians to study objects under user-defined parameters in meaningful ways. In this project the student is free to approach the challenge as they see fit, but would be expected to design, implement and assess any tools and techniques they develop. The student will develop techniques to study these objects under user-specified conditions, enabling curators and researchers to study the surfaces of these objects in novel ways. These methods may involve tone mapping or other modifications of light exponents to view objects under non-natural viewing conditions, so that surface details stand out in ways that are meaningful to curators.  
High Dynamic Range Imaging for Physical Security  Michael Goldsmith, Sadie Creese, Jassim Happa  High dynamic range imaging (HDRI) allows more accurate information about light to be captured, stored, processed and displayed to observers. In principle, this allows viewers to obtain more accurate representations of real-world environments. Naturally, HDRI would be of interest to security personnel who use closed-circuit television (CCTV) to identify potential threats or review security footage. Conceivably, it may be possible to identify threats or activities in shadows or overexposed areas. Another example is being better able to identify the facial features of someone who is wearing a hoodie. The student is encouraged to approach this challenge as they see fit, but would be expected to design, implement and assess any methods they develop. One approach might be to implement an HDR viewer, then conduct a user study in which participants attempt to identify how well low dynamic range content viewing compares to HDR viewing in a physical security context. Another approach might be to implement an HDR viewer that changes exposures based on where the viewer is looking. We have eye-trackers at our disposal that students would be able to use as part of their assessment. We will also be able to provide training for the student, so they are able to use the eye-tracking tools themselves. Requirements: Programming skills required 
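As a sketch of what an HDR viewer must do before display, the Reinhard global tone-mapping operator compresses scene luminance into the displayable range while preserving the ordering of bright and dark regions. The synthetic frame below is illustrative:

```python
import numpy as np

def reinhard_tonemap(luminance, key=0.18, eps=1e-6):
    """Reinhard global operator: scale the HDR luminance by
    key / log-average luminance, then compress with L / (1 + L),
    mapping arbitrary positive values into [0, 1)."""
    log_avg = np.exp(np.mean(np.log(luminance + eps)))
    scaled = key * luminance / log_avg
    return scaled / (1.0 + scaled)

# Synthetic HDR frame: a mid-grey scene containing one very bright
# highlight (e.g. a window or light source).
hdr = np.full((4, 4), 0.5)
hdr[0, 0] = 1000.0
ldr = reinhard_tonemap(hdr)
```

A gaze-driven viewer, as suggested above, would instead re-expose around the luminance at the fixation point rather than using a single global scale.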

International Cybersecurity Capacity Building Initiatives  Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola, Jason Nurse  There is a large investment being made by the international community aimed at helping nations and regions to develop their capacity in cybersecurity. The work of the Global Cyber Security Capacity Centre (based at the Oxford Martin School) studies and documents this: https://www.sbs.ox.ac.uk/cybersecuritycapacity/content/front. There is scope to study in more detail the global trends in capacity building in cybersecurity, the nature of the work and the partnerships that exist to support it. An interesting analysis might be to identify what is missing (through comparison with the Cybersecurity Capacity Maturity Model, a key output of the Centre), and also to consider how strategic, or not, such activities appear to be. An extension of this project, or indeed a second parallel project, might seek to perform a comparison of the existing efforts with the economic and technology metrics that exist for countries around the world, exploring if the data shows any relationships exist between those metrics and the capacity building activities underway. This analysis would involve regression techniques.  
Modelling of security-related interactions, in CSP or more evocative notations such as Milner's bigraphs  Michael Goldsmith  (Joint with Sadie Creese) Modelling of security-related interactions, in CSP or more evocative notations such as Milner's bigraphs (http://www.springerlink.com/content/axra2lvj4ph81bdm/; http://www.amazon.co.uk/TheSpaceMotionCommunicatingAgents/dp/0521738334/ref=sr_1_2?ie=UTF8&qid=1334845525&sr=82). Concurrency and Computer Security are a distinct advantage. Appropriate for good 3rd- or 4th-year undergraduates or MSc. * Open to suggestions for other interesting topics in Cybersecurity, if anyone has a particular interest they would like to pursue.  
Organisational Resilience and Self-* Replanning  Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola, Jason Nurse, Jassim Happa  A Key Performance Indicator (KPI) is a type of performance measurement in organisations. KPIs evaluate the success of activities in which an organisation or system engages. Often success is measured in terms of whether an activity has been completed and can be repeated (e.g. zero defects, customer satisfaction or similar); other times it is measured in terms of making progress towards a strategic goal. Well-chosen KPIs allow us to reflect upon the performance of an organisation, and possibly to identify potential future degradation issues in systems. In this project, students are tasked to investigate how KPIs can be used to measure and improve cyber-resilience in an organisation. In particular, we are interested in investigating how the performance of an organisation can be measured, with respect to security as well as mission performance. After identifying which aspects of an organisation or the mission are prone to error, it may be beneficial to propose solutions for how these issues can be addressed. With "self-* replanning", we believe it would be possible for a system to suggest and automate certain aspects of how the mission, infrastructure or organisation can be repurposed so as not to fail. The student is encouraged to approach this challenge as they see fit, but would be expected to design, implement and assess any methods they develop. For instance, the project may be network-security focused, mission-processes focused or otherwise. It may also investigate more formal approaches to self-* replanning methods. Requirements: Programming and/or Social Science Research Methods. 

Penetration testing for harm  Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola, Jason Nurse  Current penetration testing is typically utilised for discovering how organisations might be vulnerable to external hacks, and testing methods are driven by using techniques determined to be similar to approaches used by hackers. The result is a report highlighting various exploitable weak points and how they might result in unauthorised access should a malign entity attempt to gain access to a system. Recent research within the cybersecurity analytics group has been studying the relationship between these kinds of attack surfaces and the kinds of harm that an organisation might be exposed to. An interesting question is whether an orientation around intent, or harm, might result in a different test strategy: would a different focus be given to the kinds of attack vectors explored in a test if a particular harm were the target? This mini-project would aim to explore this question by designing penetration test strategies based on a set of particular harms, and then seek to consider potential differences from current penetration practices by consultation with the professional community. Requirements: Students will need to have a working understanding of penetration testing techniques.  
Photogrammetry  Michael Goldsmith, Jassim Happa  Photogrammetry is a set of techniques that allows for 3D measurements from 2D photographs, especially those measurements pertaining to geometry or surface colours. The purpose of this project is to implement one or more photogrammetry techniques from a series of 2D photographs. The student is free to approach the challenge as they see fit, but would be expected to design, implement and assess the tool they develop.  
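A minimal building block of most photogrammetry pipelines is triangulation: recovering a 3D point from its projections in two calibrated cameras. A linear (DLT) sketch, with synthetic camera matrices chosen purely for illustration:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover a 3D point from its pixel
    projections x1, x2 in two cameras with 3x4 projection matrices
    P1, P2. The point is the null vector of the stacked constraints,
    found via SVD."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenise

# Two hypothetical cameras: identity pose, and a 1-unit baseline along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point = np.array([0.3, -0.2, 4.0])
x1 = P1 @ np.append(point, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(point, 1.0); x2 = x2[:2] / x2[2]
recovered = triangulate(P1, P2, x1, x2)
```

A full photogrammetry project would add feature matching and camera calibration around this core, and triangulate many points to recover geometry and surface colour.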
Predicting exposure to risk for active tasks  Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola, Jason Nurse  Prior research has considered how we might better understand and predict the consequences of cyber-attacks based on knowledge of the business processes, people and tasks, and how they utilise the information infrastructure / digital assets that might be exposed to specific attack vectors. However, this can clearly be refined by moving to an understanding of those tasks live or active at the time of an attack propagating across a system. If this can be calculated, then an accurate model of where risk may manifest and the harm that may result can be constructed. This project would explore the potential for such a model through practical experimentation and development of software monitors to be placed on a network, aimed at inferring the tasks and users that are active based on network traffic. If time allows then host-based sensors might also be explored (such as on an application server) to further refine the understanding of which users are live on which applications, etc. Requirements: Students must be able to construct software prototypes and have a working knowledge of network architectures and computer systems. 

Procedural Methods in Computer Graphics  Michael Goldsmith, Jassim Happa  Procedural methods in computer graphics help us develop content for virtual environments (geometry and materials) using formal grammars. Common approaches include fractals and L-systems. Examples of content may include the creation of cities, planets or buildings. In this project the student will develop an application to create content procedurally. The student is free to approach the challenge as they see fit, but would be expected to design, implement and assess the methods they develop. These projects tend to have a strong focus on designing and implementing existing procedural methods, but also include a portion of creativity. The project can be based on reality – e.g. developing content that has some kind of basis in how the real-world equivalent objects were created (physically-based approaches) – or the project may be entirely creative in how it creates content. Past students, for instance, have built tools to generate cities based on real-world examples and non-existent city landscapes; other examples include the building of procedural planets, including asteroids and Earth-like planetary bodies. 
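At its core, an L-system engine is just parallel string rewriting; the bracketed plant grammar below is a standard textbook example, included for illustration:

```python
def expand(axiom, rules, iterations):
    """Rewrite `axiom` by applying the production rules to every symbol
    in parallel, `iterations` times. Symbols without a rule are copied."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's original algae system: string lengths follow Fibonacci.
algae = expand("A", {"A": "AB", "B": "A"}, 5)

# A turtle-interpretable bracketed L-system for a simple plant:
# F = draw forward, +/- = turn, [ ] = push/pop the turtle state.
plant = expand("X", {"X": "F+[[X]-X]-F[-FX]+X", "F": "FF"}, 2)
```

Rendering the expanded string with a turtle interpreter (and randomising rule choices) is where the geometry generation, and the creative portion of the project, begins.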

Reflectance Transformation Imaging  Michael Goldsmith, Jassim Happa, Stefano Gogioso  Reflectance Transformation Imaging (RTI) is a powerful set of techniques (the first of which is known as Polynomial Texture Maps, PTMs) that enables us to capture photographs of objects under several lighting conditions. Combined, these images form a single RTI image in which users can relight the object by moving the light source around the hemisphere in front of it, and can also specify user-defined parameters, including removing colour, or making the object more specular or diffuse, in order to investigate the surface details in depth. RTI can be used for forensic investigation of crime scenes as well as cultural heritage documentation and investigation. The purpose of this project is to implement RTI methods of the student's choice.  
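The original PTM formulation models each pixel's intensity as a biquadratic polynomial in the projected light direction, fitted per pixel by least squares over the captured photographs. A sketch on synthetic data (the coefficients are invented for illustration):

```python
import numpy as np

def fit_ptm(light_dirs, intensities):
    """Fit per-pixel Polynomial Texture Map coefficients by least squares.
    light_dirs: (n, 2) projected light directions (lu, lv);
    intensities: (n,) observed pixel values, one per light.
    Model: L = a0*lu^2 + a1*lv^2 + a2*lu*lv + a3*lu + a4*lv + a5."""
    lu, lv = light_dirs[:, 0], light_dirs[:, 1]
    basis = np.column_stack([lu**2, lv**2, lu * lv, lu, lv, np.ones_like(lu)])
    coeffs, *_ = np.linalg.lstsq(basis, intensities, rcond=None)
    return coeffs

def relight(coeffs, lu, lv):
    """Evaluate the fitted biquadratic under a new light direction."""
    return coeffs @ np.array([lu**2, lv**2, lu * lv, lu, lv, 1.0])

# Synthetic pixel: generate observations from known coefficients under 20
# random light directions, then refit them.
rng = np.random.default_rng(0)
true = np.array([0.1, -0.2, 0.05, 0.4, 0.3, 0.6])
dirs = rng.uniform(-1, 1, size=(20, 2))
obs = np.array([relight(true, u, v) for u, v in dirs])
fitted = fit_ptm(dirs, obs)
```

An RTI viewer repeats this fit for every pixel, then evaluates `relight` interactively as the user moves the virtual light.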
Resilience – metrics and tools for understanding organisational resilience  Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola, Jason Nurse, Jassim Happa  Resilience in the face of cyber-attack is considered a key requirement for organisations. Prior work within the cybersecurity analytics group has been developing a resilience model for organisations, but there is as yet no commonly agreed set of metrics that organisations can use to determine just how resilient they are (at a given point in time and continuously). This mini-project would seek to propose a set of metrics, and if time allows tools for applying them (although the latter may well better suit a following DPhil). The approach will be to consider the development of metrics to measure the various characteristics or organisational capabilities / behaviours that are determined to be necessary in order to be resilient. It will be necessary to consider how the metrics might vary according to, or take account of, different threat environments. It may also be interesting to consider whether there are refinements of the resilience requirements that apply to different types of organisations. Data analytics approaches will need to be considered in order to apply the metrics, and students might consider the use of visualisation to help with this. The resulting metrics will be validated in consultation with security professionals and organisations possessing resilience-related experience. In the context of the mini-project this is most likely achievable via a small number of interviews, with a more detailed validation and iterative design approach being supported by a longitudinal study working with 2 or 3 organisations which might adopt and test the metrics. 

Smartphone security  Michael Goldsmith  (Joint with Sadie Creese) Smartphone security: one concrete idea is the development of a policy language to allow the authors of apps to describe their behaviour, designed to be precise about the expected access to peripherals and networks and the purpose thereof (data required and usage); this uses skills in formal specification and understanding of app behaviour (by studying open-source apps), possibly leading to prototyping a software tool to perform run-time checking that the claimed restrictions are adhered to. Suitable for good 3rd- or 4th-year undergraduates, or MSc. Other projects within this domain are possible, according to interest and inspiration. Prerequisites: Concurrency, Concurrent Programming, Computer Security all possibly an advantage. 

Technology-layer social networks  Michael Goldsmith  (Joint with Sadie Creese) Technology-layer social networks: investigate the potential to identify relationships between people via technology metadata – which machines their machines are "friendly" with. Research will involve identification of all metadata available from the network layer, app layers and the data layer, development of appropriate relationship models, and practical experimentation / forensic-style work exploring how to extract relationships between technologies and identities. Appropriate for 4th-year undergraduates or MSc.  
Tripwires  Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola, Jason Nurse  At Oxford we have developed a framework for understanding the components of an attack, and documenting known attack patterns can be instrumental in developing tripwires aimed at detecting the presence of insiders in a system. This project will seek to develop a library of such tripwires based on a survey of openly documented and commented-upon attacks, using the Oxford framework. There will be an opportunity to deploy the library into a live-trial context, which should afford an opportunity to study the relative utility of the tripwires within large commercial enterprises. The mini-project would also need to include experimenting with the tripwires in a laboratory environment, and this would involve the design of appropriate test methods.  
Understanding Enterprise Infrastructure Dependencies  Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola, Jason Nurse  There are many tools available for detecting and monitoring cyber-attacks based on network traffic, and these are accompanied by a wide variety of tools designed to make alerts tangible to security analysts. By comparison, the impact of these attacks at an organisational level has received little attention. An aspect that could be enhanced further is the addition of a tool facilitating management and updating of our understanding of business processes, and also of how those processes depend on a network infrastructure. This tool could facilitate the mapping between company strategies and the activities needed to accomplish company goals, and map these down to the network and people assets. At the top of the hierarchy lies the board, responsible for strategic decisions. These decisions are interpreted at the managerial level and could be captured and analysed with business objective diagrams. These diagrams in turn could be refined further to derive business processes and organisational charts, ensuring that decisions made at the top level will be enforced at the lower levels. The combination of business processes and organisational charts could eventually provide the network infrastructure. For this project we suggest a student could develop novel algorithms for mapping business processes to network infrastructures in an automated way (given the updated business process files). That said, the student is encouraged to approach this challenge as they see fit, but would be expected to design, implement and assess any methods they develop. Other projects on business process modelling are also possible, depending on interest and inspiration. 

Visualising attack patterns  Michael Goldsmith, Sadie Creese, Ioannis Agrafiotis, Arnau Erola, Jassim Happa  The use of visualisation techniques to better detect attacks and predict the harm that might result is a common approach. This project will provide the student with sample datasets so that a novel visualisation might be developed (using open-source libraries) of one or more insider attacks. The aim is to help an analyst understand the nature of the attack and help identify appropriate response techniques. 

Parity Games over Random Graphs  Julian Gutierrez  Parity games are a tool for formal verification. For instance, many model checking problems can be reduced to the solution of parity games. The exact complexity of these games is unknown: they can be solved in NP and in coNP, but it is not known whether a polynomial-time algorithm exists, despite more than three decades of research. However, in practice, most parity games are solved efficiently. This project aims at defining new algorithms for parity games and evaluating them in practice, in particular over random graphs. Prerequisites: Discrete Mathematics, Introduction to Formal Proof, Logic and Proof Desirable: Computer-Aided Formal Verification, Computational Complexity  
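For reference, one natural baseline against which new algorithms could be evaluated is Zielonka's recursive algorithm, sketched below for explicit game graphs. The data representation and the example game are illustrative:

```python
def attractor(nodes, edges, owner, target, player):
    """Nodes from which `player` can force the play into `target`
    within the subgame induced by `nodes`."""
    attr = set(target)
    changed = True
    while changed:
        changed = False
        for v in nodes:
            if v in attr:
                continue
            succs = [w for w in edges[v] if w in nodes]
            if owner[v] == player:
                ok = any(w in attr for w in succs)
            else:
                ok = succs and all(w in attr for w in succs)
            if ok:
                attr.add(v)
                changed = True
    return attr

def zielonka(nodes, edges, owner, priority):
    """Zielonka's recursive algorithm: returns (win0, win1), the winning
    regions of players 0 and 1. The parity of the highest priority seen
    infinitely often decides the winner; assumes every node has a
    successor (max-parity convention)."""
    if not nodes:
        return set(), set()
    p = max(priority[v] for v in nodes)
    player = p % 2
    target = {v for v in nodes if priority[v] == p}
    A = attractor(nodes, edges, owner, target, player)
    w0, w1 = zielonka(nodes - A, edges, owner, priority)
    win_opp = w1 if player == 0 else w0
    if not win_opp:
        return (nodes, set()) if player == 0 else (set(), nodes)
    B = attractor(nodes, edges, owner, win_opp, 1 - player)
    w0b, w1b = zielonka(nodes - B, edges, owner, priority)
    if player == 0:
        return w0b, w1b | B
    return w0b | B, w1b

# Example: two disjoint cycles, a<->b (highest priority 2, even) and
# c<->d (highest priority 3, odd); ownership is irrelevant here as no
# node has a choice of successor.
nodes = {"a", "b", "c", "d"}
edges = {"a": {"b"}, "b": {"a"}, "c": {"d"}, "d": {"c"}}
owner = {v: 0 for v in nodes}
priority = {"a": 2, "b": 1, "c": 3, "d": 0}
win0, win1 = zielonka(nodes, edges, owner, priority)
```

Running such a solver over randomly generated graphs, and comparing its behaviour with new algorithms, is the empirical side of the project.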
Bisimilarity in Logics for Strategic Reasoning  Julian Gutierrez  Nash equilibrium is the standard solution concept for multi-player games. Such games have multiple applications in Logic and Semantics, Artificial Intelligence and Multi-Agent Systems, and Verification and Computer Science. Unfortunately, Nash equilibrium is not preserved under bisimilarity, one of the most important behavioural equivalences for concurrent systems. In a recent paper it was shown that this problem may not arise when certain models of strategies are considered. In this project the aim is to investigate further implications of considering the new model of strategies: for instance, whether a number of logics for strategic reasoning become invariant under bisimilarity if the new model of strategies is considered, whether some logics that are unable to express Nash equilibria can do so with respect to the new model of strategies, and whether the results already obtained still hold for more complex classes of systems, for instance where nondeterminism has to be considered. Prerequisites: Discrete Mathematics, Introduction to Formal Proof, Logic and Proof Desirable: Computer-Aided Formal Verification, Computational Complexity Project type: theory 

Towards Tractable Strategic Reasoning in Strategy Logic  Julian Gutierrez  Strategy Logic (SL) is a temporal logic to reason about strategies in multi-player games. Such games have multiple applications in Logic and Semantics, Artificial Intelligence and Multi-Agent Systems, and Verification and Computer Science. SL is a very powerful logic for strategic reasoning – for instance, Nash equilibria and many other solution concepts in game theory can be easily expressed – but it has an undecidable satisfiability problem and a non-elementary model checking problem. In this project, the aim is to study fragments of SL which can potentially have better results (decidability and complexity) with respect to the satisfiability and model checking problems. The fragments to be studied can be either syntactic fragments of the full language or semantic fragments where only particular classes of models are considered. Prerequisites: Discrete Mathematics, Introduction to Formal Proof, Logic and Proof Desirable: Computer-Aided Formal Verification, Computational Complexity Project type: theory 

Parallel Algorithms for Computing Hilbert Bases  Christoph Haase  Given a homogeneous system of linear equations A x = 0, a Hilbert basis is the unique finite minimal set of non-negative solutions from which every non-negative solution of the system can be generated. Computing Hilbert bases is a fundamental problem encountered in various areas of computer science and mathematics, for instance in decision procedures for arithmetic theories, the verification of infinite-state systems and pure combinatorics. In this project, we plan to revisit an approach to computing Hilbert bases described in [1] that is highly parallelisable. With the ubiquity of multi-core architectures, it seems conceivable that, with proper engineering, this approach will outperform existing approaches. The goal of this project is to deliver an implementation of the algorithm of [1] and benchmark it against competing tools. If successful, an integration into the SageMath platform could be envisioned. [1] https://pdfs.semanticscholar.org/6e84/f9ccfdaa37cb33cfc2024aec1dc7d13964c7.pdf Prerequisites: good knowledge of concurrent programming and data structures, and linear algebra  
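To make the object of study concrete, the following naive sketch enumerates Hilbert basis elements up to a component bound by brute force. It illustrates the definition only; it is unrelated to the parallel algorithm of [1], and basis elements with any component above the bound are missed:

```python
from itertools import product

def hilbert_basis_bounded(A, bound):
    """Hilbert basis elements of {x >= 0 : A x = 0} with all components
    at most `bound`. A is a list of integer rows of a homogeneous system.
    Exhaustive search: only suitable for tiny instances."""
    n = len(A[0])
    solutions = [
        x for x in product(range(bound + 1), repeat=n)
        if any(x) and all(sum(a * xi for a, xi in zip(row, x)) == 0
                          for row in A)
    ]
    def dominates(x, y):
        # y is componentwise <= x and strictly smaller somewhere.
        return x != y and all(yi <= xi for xi, yi in zip(x, y))
    # Keep only the minimal solutions under the componentwise order.
    return sorted(x for x in solutions
                  if not any(dominates(x, y) for y in solutions))

# x1 + x2 = 2*x3, i.e. the single row (1, 1, -2): every non-negative
# solution is a sum of the three basis vectors below.
basis = hilbert_basis_bounded([(1, 1, -2)], bound=2)
```

The algorithms the project targets avoid this enumeration entirely, constructing the basis by completion; the parallelism comes from processing candidate vectors independently.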
The Complexity of the First-Order Theory of the Integers with Addition  Christoph Haase  The goal of this project is to determine the computational complexity of the first-order theory of the integers with addition and equality, FO(Z,+,=), but without order. This theory is a strict fragment of Presburger arithmetic, an arithmetic theory that finds numerous applications, for instance in the verification of infinite-state systems. We aim to make an original scientific contribution by determining what time and space resources are needed in order to decide a sentence in FO(Z,+,=). If time permits, an implementation of the decision procedure developed in this project could be envisioned. Prerequisites: good familiarity with first-order logic, linear algebra and computational complexity  
Bioinformatics Projects  Jotun Hein  Bioinformatics Projects can be found here. If you choose any of these projects you must find a joint supervisor from the Department of Computer Science. Pete Jeavons can advise on a suitable choice. 

Analysing user-algorithm interaction  Marina Jirotka  Members of the Human Centred Computing Group are currently involved in the UnBias research project ( http://unbias.wp.horizon.ac.uk/). This investigates the user experience of algorithm-driven internet platforms. As part of this project we have been conducting observational data collection in which pairs of volunteers are video recorded whilst browsing online. The on-screen activity is also captured, and the data are being analysed to identify instances of user-algorithm interaction; in particular, instances in which an algorithmic outcome – such as the auto-complete suggestion in a search bar or the filtering mechanisms on a recommendation platform – might shape the trajectory of browsing activity. We will shortly undertake a second wave of data collection in which users browse online but are placed in situations in which their usual experience of certain algorithmic processes will be disrupted. We are looking for a student to conduct analysis on the collected data and then build on this analysis by designing and conducting their own disruption experiment. The student can set their own specific research questions for the analysis, and determine what particular kind of methodological approach they would like to undertake. Prerequisites: There are no prerequisites. The study will suit a student interested in developing skills in social research methods and human-computer interaction approaches. We will assist in securing ethical clearance and recruiting research participants for the data collection activities.  
User testing of ‘fairness toolkit’ to address algorithmic bias  Marina Jirotka, Helena Webb  Members of the Human Centred Computing Group are currently involved in the UnBias research project ( http://unbias.wp.horizon.ac.uk/). This investigates the user experience of algorithm-driven internet platforms and explores contemporary concerns that algorithmic processes can lead to inaccurate, unfair or even discriminatory outcomes. As part of the project we are developing a ‘fairness toolkit’ of material resources that are designed to raise awareness and debate over key project themes. The toolkit will consist of: 1) Awareness Cards – a set of cards intended to help young people (secondary school age, roughly 12-17) devise and explore scenarios that illustrate how bias in algorithmic systems can affect them; 2) TrustScapes – a tool for mapping (on large sheets of paper) how young people perceive the issues of algorithmic bias, data protection and online safety, and what they would like to see done to make the online world fair and trustworthy; and 3) Empathy MetaMaps – a tool for “stakeholders” – such as employees of online platforms, developers, technologists, researchers, policymakers, regulators, public agency officials, civil servants etc. – to collaboratively respond as a sector to the TrustScapes shared by young people. We are looking for a student to conduct several rounds of user testing of the tools. This testing will take the form of group activities, focus groups, interviews, surveys etc. In addition to conducting the testing, the student will analyse the data collected in order to assess the quality of the toolkit, identify ways in which it might be refined, and devise an impact strategy for the toolkit in practice. Prerequisites: There are no prerequisites. 
The study will suit a student interested in developing skills in social research methods and human-computer interaction approaches, as well as in gaining experience in public engagement. It would therefore particularly suit a student interested in undertaking doctoral study. We will assist in providing training in the methods to be used in the study, as well as assistance with securing ethical clearance and recruiting research participants. 

Compilation of a CSP-like language  Geraint Jones  This is a compiler project, also requiring familiarity with concurrency. The parallel programming language occam is essentially an implementable sublanguage of CSP. The aim of this project is to produce a small portable implementation of a subset of occam; the proposed technique is to implement a virtual machine based on the Inmos transputer, and a compiler which targets that machine. One of the aims of the project is to implement an extension to occam which permits recursion; more ambitious projects might produce a distributed implementation with several communicating copies of the virtual machine. Other possibilities are to produce separate virtual machines, optimised for displaying a simulation or for efficiency of implementation, or to translate the virtual machine code into native code for a real machine. 

Logic circuit workbench  Geraint Jones  This is an interactive programming project with some logic simulation behind it. The idea is to produce a simulator for a traditional logic breadboard. The user should be able to construct a logic circuit by drawing components from a library of standard gates and latches and so on, and connecting them together in allowable ways. It should then be possible to simulate the behaviour of the circuit, to control its inputs in various ways, and to display its outputs and the values of internal signals in various ways. The simulator should be able to enforce design rules (such as those about not connecting standard outputs together, or limiting fan-out) but should also cope with partially completed circuits; it might be able to implement circuits described in terms of replicated subcircuits; it should also be able to output some sort of standard netlist. It might be that this would make a project for two undergraduates: one implementing the interface, the other implementing a simulator that runs the logic. 

Modelling of arithmetic circuits  Geraint Jones  This is a project in the specification of hardware, which I expect to make use of functional programming. There is a great deal of knowledge about ways (that are good by various measures) of implementing standard arithmetic operations in hardware. However, most presentations of these circuits are at a very low level, involving examples, diagrams, and many subscripts, for example these descriptions. The aim of this project is to describe circuits like this in a higherlevel way by using the higher order functions of functional programming to represent the structure of the circuit. It should certainly be possible to execute instances of the descriptions as simulations of circuits (by plugging in simulations of the component gates), but the same descriptions might well be used to generate circuit netlists for particular instances of the circuit, and even to produce the diagrams. 

Extensible Search-Based Theorem Prover  Mark Kaminski  Background: Search-based proof methods such as those based on DPLL, tableau and connection calculi are widely used in automated theorem proving, ontological reasoning, and automated verification. As most implementations of such methods are developed with a particular logic, algorithm, and set of applications in mind, it is often difficult to extend them with new features. Project goals: The project aims to develop a modular, extensible base for implementing search-based decision procedures for propositional, modal, and description logics. A key objective is to develop an abstract proof search engine that can be instantiated with different logics, proof rules, and search strategies. The capabilities of the engine should be demonstrated by implementing a simple tableau-based or DPLL-based reasoner for propositional or modal logic. Prerequisites: good functional programming skills, familiarity with logic and automated theorem proving.  
Translating OWL Ontologies into Rules  Mark Kaminski  Background: Many reasoning tasks for expressive ontology languages such as OWL 2 DL can be reduced to or approximated by rule-based reasoning in formalisms such as OWL 2 RL, Datalog, or Disjunctive Datalog. A key step of any such reduction is a structural transformation of the original ontology into a set of rules in the target formalism that are in a certain sense equivalent to the ontology. The goal of the project is to implement an algorithm for transforming OWL 2 DL ontologies into rules that can then serve as input to Datalog and OWL 2 RL reasoners, as well as ASP solvers. Prerequisites: good programming skills in Java. The following courses are relevant for this project: Logic and Proof; Knowledge Representation and Reasoning. 

A fast numerical solver for multiphase flow  David Kay  The Allen-Cahn equation is a differential equation used to model the phase separation of two, or more, alloys. This model may also be used to model cell motility, including chemotaxis and cell division. The numerical approximation, via a finite difference scheme, ultimately leads to a large system of linear equations. In this project, using numerical linear algebra techniques, we will develop a computational solver for these linear systems. We will then investigate the robustness of the proposed solver. 
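To see where the linear systems come from, here is a minimal sketch assuming a 1D Allen-Cahn equation u_t = ε²u_xx + u - u³ with periodic boundaries, discretised by finite differences and stepped semi-implicitly: treating the stiff diffusion term implicitly means every time step requires solving a linear system. The dense solve below is a placeholder for the structured solvers the project would actually develop; all parameter values are illustrative.

```python
import numpy as np

# Semi-implicit step for 1D Allen-Cahn u_t = eps^2 u_xx + u - u^3:
# implicit diffusion, explicit reaction, so each step solves
# (I - dt*eps^2*L) u_new = u + dt*(u - u^3).
n, dx, dt, eps = 100, 1.0 / 100, 1e-3, 0.05

# Periodic second-difference (Laplacian) matrix L
L = (np.diag(-2 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1))
L[0, -1] = L[-1, 0] = 1          # periodic wrap-around
L /= dx ** 2
A = np.eye(n) - dt * eps ** 2 * L  # system matrix, fixed for all steps

x = np.linspace(0, 1, n, endpoint=False)
u = 0.1 * np.sin(2 * np.pi * x)    # small initial perturbation
for _ in range(200):
    rhs = u + dt * (u - u ** 3)    # explicit reaction term
    u = np.linalg.solve(A, rhs)    # the large linear solve per step
```

Since A is the same at every step, a practical solver would factorise it once (or exploit its tridiagonal-plus-corners structure), which is precisely the kind of numerical linear algebra the project targets.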

Efficient solution of one dimensional airflow models  David Kay  In this project we will investigate the use of numerical and computational methods to efficiently solve the linear system of equations arising from a one dimensional airflow model within a network of tree-like branches. This work will build upon the methods presented within the first year Linear Algebra and Continuous Maths courses. All software to be developed can be in the student’s preferred programming language. General area: Design and computational implementation of fast and reliable numerical models arising from biological phenomena.  
Transparent Session-Layer Steganography  Andrew Ker  Steganography means hiding a secret payload within an apparently-innocent cover, usually an item of digital media. This project is to implement a transparent proxy which uses something like a webcam video stream to hide steganographic packets. A local process should receive communication on a local socket and merge it with the webcam data stream, a process which can be reversed at the receiver. Prerequisites: It is necessary to have some experience working with video codecs, for example experience contributing to the H.264/5 codec, in order to place the payload in a webcam stream. It is definitely NOT possible to learn the format within the time available. Don't ask if it is possible to read up on it instead: it isn't.  
Twitter Pictures as a Steganographic Channel  Andrew Ker  Steganography means hiding a secret payload within an apparently-innocent cover, usually an item of digital media. Most pure research focuses on bitmap or JPEG images, or simple video codecs. In practice, there are often further constraints: the carrier might recompress or damage the object. This project is primarily programming. First we must characterize the properties of image transmission through Twitter, and then provide an implementation of image steganography through it. This might be a complete piece of software with a nice interface, if the channel is straightforward, or a proof-of-concept if the channel causes difficult noise (in which case we will need suitable error-correcting codes). Prerequisites: Linear algebra. You will need a Twitter account and to learn the Twitter API without help from me. It would be useful to know a little about the JPEG image format before starting the project.  
Faceted Search over the (Semantic) Web  Evgeny Kharlamov  An increasing number of applications rely on RDF, OWL and SPARQL for storing, publishing and querying data. SPARQL, however, is not targeted towards end-users, and suitable query interfaces are needed. Faceted search is a prominent end-user-oriented approach to data access; during the last decade it became a mainstream commercial technology, ubiquitous in e-commerce (e.g., Amazon.com, Booking.com). This project aims at the development and implementation of novel algorithms and/or interfaces for faceted search over RDF databases enhanced with OWL ontologies. The project will be conducted within the SemFacet project (www.cs.ox.ac.uk/isg/tools/SemFacet/). Preferable knowledge: Java, RDF, and HCI. 

Query formulation interfaces over semantically enhanced Big Data  Evgeny Kharlamov  Enabling end-user access to industrial Big Data is a challenging and important problem in many large companies. A large-scale EU-funded project, Optique (www.optiqueprojectspage), addresses this problem with Semantic Web technologies and aims at developing a system for Big Data access deployed at Statoil and Siemens. In Oxford (www.oxfpageforoptique) we contribute to both research and development for several components of this system. In particular, we develop query formulation interfaces (QFIs) powered by Semantic Web technologies that end-users, such as geologists or energy engineers, can use to access Big Data. This project aims at the development of novel solutions for QFIs over semantically enhanced Big Data: development (and adaptation) of techniques for ranking query answers and components of visual query formulation interfaces; development of techniques to support visual formulation of aggregate queries; and implementation and experiments with real and synthetic data. Preferable knowledge: Java, Databases, Semantic Web technologies. 

Semi-automatic installation support for Ontology-Based Data Access (OBDA) systems  Evgeny Kharlamov  OBDA is a prominent approach for easy, end-user-oriented access to collections of databases. The approach relies on Semantic Web technologies, and its key idea is to use an ontology to mediate between the user and the databases. The ontology provides a semantic abstraction from the technical details of the databases. It is connected to the databases by means of so-called mappings. This project aims at the development and implementation of novel techniques to (i) extract ontologies from relational metadata, and (ii) extract mappings to existing ontologies. The implemented techniques will be tested on different databases. The project will be conducted within the large-scale Optique project (www.optiqueprojectspage), which aims at developing a system for end-user access to industrial Big Data at Statoil and Siemens. Preferable knowledge: Java, Databases, Semantic Web technologies. 

Small matrices for irrational non-negative matrix factorisation  Stefan Kiefer  Recently it was shown (https://arxiv.org/abs/1605.06848) that there is a 6x11 matrix M with rational non-negative entries such that for any factorisation M = W H (where W is a 6x5 matrix with non-negative entries, and H is a 5x11 matrix with non-negative entries) some entries of W and H need to be irrational. The goal of this project is to explore whether the number of columns of M can be brought below 11, perhaps by dropping some columns of M. This will require some geometric reasoning. The project will likely involve the use of tools (such as SMT solvers) that check systems of nonlinear inequalities for solvability. Prerequisites: The project is mathematical in nature, and a good recollection of linear algebra is needed. Openness towards programming and the use of tools is helpful.  
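A quick way to get a feel for the objects involved: any product of non-negative factors has non-negative rank at most the inner dimension, and dropping columns of M simply drops the corresponding columns of H, so the non-negative rank cannot increase. The sketch below mirrors the 6x11 sizes, but the matrices are random placeholders, not the matrix M from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.random((6, 5))       # non-negative by construction
H = rng.random((5, 11))
M = W @ H                    # M has non-negative rank at most 5

# Dropping columns of M keeps the same W and drops columns of H,
# so the sub-matrix still factors through inner dimension 5.
cols = [0, 2, 4, 6, 8]       # an arbitrary choice of columns to keep
M_sub = M[:, cols]
assert np.allclose(M_sub, W @ H[:, cols])
```

The open question the project attacks is the converse direction: whether some sub-matrix of the specific 6x11 instance still forces irrational factors.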
Bilateral trade  Elias Koutsoupias  One of the fundamental problems at the interface of Algorithms and Economics is the problem of designing algorithms for simple trade problems that are incentive compatible. An algorithm is incentive compatible when the participants have no reason to lie or deviate from the protocol. In general, we have a good understanding when the problem involves only buyers or only sellers, but poor understanding when the market has both buyers and sellers. This project will investigate and implement optimal algorithms for this problem. It will also consider algorithms that are natural and simple but may be suboptimal. Prerequisites: Mathematical and algorithmic maturity. Fundamentals of game theory is useful, but not essential  
Bitcoin mining games  Elias Koutsoupias  Bitcoin is a new digital currency that is based on distributed computation to maintain a consistent account of the ownership of coins. The main difficulty that the bitcoin protocol tries to address is to achieve agreement in a distributed system run by selfish participants. To address this difficulty, the bitcoin protocol relies on a proof-of-work scheme. Since the participants in this proof-of-work scheme want to maximize their own utility, they may have reasons to diverge from the prescribed protocol. The project will address some of the game-theoretic issues that arise from the bitcoin protocol. Prerequisites: Mathematical and algorithmic maturity. Fundamentals of game theory is useful, but not essential  
Learning algorithms for linear programming  Elias Koutsoupias  Linear programming (LP) is an important algorithmic problem. There are algorithms for LP, such as the Simplex Algorithm, that are efficient in practice but do not perform well in the worst case, and there are algorithms, such as the Ellipsoid Algorithm, that are good in theory but not in practice. Learning algorithms, such as the Multiplicative Weight Update Algorithm, have the potential to be good both in theory and in practice. The project will address this question with theoretical analysis and implementation. Prerequisites: Mathematical and algorithmic maturity. Fundamentals of learning theory is useful, but not essential  
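As a concrete instance of the learning approach: solving a zero-sum matrix game is an LP, and multiplicative weights solves it approximately by repeatedly reweighting the rows against a best-responding column player. A minimal sketch, assuming the matching-pennies payoff matrix; the step size and round count are illustrative choices, not tuned values.

```python
import math

# Multiplicative weights for a zero-sum game (an LP in disguise): the row
# player keeps weights over rows, plays the normalised mixed strategy, and
# multiplicatively rewards rows that did well against the column player's
# best response. The time-averaged strategy approaches a maximin strategy.
def mwu_row_strategy(A, rounds=2000, eta=0.05):
    n = len(A)
    w = [1.0] * n            # one weight per row
    avg = [0.0] * n          # running average of the played strategies
    for _ in range(rounds):
        s = sum(w)
        p = [wi / s for wi in w]
        # column player best-responds: minimise the row player's payoff
        j = min(range(len(A[0])),
                key=lambda c: sum(p[i] * A[i][c] for i in range(n)))
        for i in range(n):
            w[i] *= math.exp(eta * A[i][j])  # reward rows by their payoff
            avg[i] += p[i] / rounds
    return avg

A = [[1, -1], [-1, 1]]  # matching pennies; the optimal strategy is (0.5, 0.5)
p = mwu_row_strategy(A)
```

The regret bound of multiplicative weights guarantees the averaged strategy is within roughly eta + ln(n)/(eta*rounds) of the game value, which is the kind of theory-versus-practice trade-off the project would analyse.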
"KLEEPROVER"  Daniel Kroening  Alter the CBMC symbolic execution engine to create a dynamic symbolic execution tool similar to KLEE. Possible features include: • Distributed and incremental support using the caching scheme here • A feature to run BMC as you go to explore the traces 'around' the execution traces • Use path exploration as the witness generation part of an ACDL process. • Perform precondition inference, possibly using ACDL.  
Applications of deductive machine learning  Daniel Kroening  The goal of deductive machine learning is to provide computers with the ability to automatically learn a behaviour that provably satisfies a given high-level specification. As opposed to techniques that generalise from incomplete specifications (e.g. examples), deductive machine learning starts with a complete problem description and develops a behaviour as a particular solution. 

Automatic Diverse Machine Code Generator  Daniel Kroening, Tom Melham  Supervisors: Daniel Kroening, Tom Melham, Alastair Donaldson. The target of this project is safety-relevant, embedded, automotive applications (steering, braking, airbags, drive-by-wire controls) where the computational state of an embedded controller can be disturbed by transient effects (EM and RF fields, ESD and voltage transients). The suggested project is to demonstrate a system taking one image of machine code and, using a binary-to-binary translator, transforming it into a functionally equivalent image (a diverse image) which has a high degree of architectural diversity (differing opcodes, differing intermediate values etc.). The two images can then be independently executed in real-time embedded controllers and the outputs cross-checked to hopefully detect disturbances to the machine state and operations. Having a diverse executable is advocated in the new ISO 26262 automotive functional safety standard, but no tools or methods exist today to automatically derive diverse images. A related project could be to measure the degree of diversity between code images; although the best measure might be based on the hardware footprint of the software, at a lower level it could be measured via some architectural properties: unique instructions, register resources, memory address ranges, etc. There is scope in this project for collaboration with Infineon Technologies (Bristol, UK).  
Bug Hunting in Linux  Daniel Kroening  Model Checkers for concurrent programs have made tremendous progress in the past years. It's time to revisit the Linux Kernel with these methods, and investigate scalability and applicability of automated program analysis in this context. This is an applied project. 

CORAL / Q  Daniel Kroening  Build something similar to the Q tool in SLAM.  
Coding Standards Checker  Daniel Kroening  Build instrumenters for various coding standards, e.g., AUTOSAR, MISRA-C, CERT C, HIC++, etc.  
Combine Uninitialized Variables Analysis with Alias Analysis  Daniel Kroening  goto-instrument features an instrumentation to track cases of use-before-define, i.e., reading of uninitialized variables. At present, the analysis does not take aliasing information into account. As such, the following code will yield a spurious counterexample:
int main() {
  int x;
  int *ptr;
  ptr = &x;
  *ptr = 42;   /* defines x through the pointer */
  if (x < 42)
    return 1;
  return 0;
}
Compute Program Dependence Graphs  Daniel Kroening  Program dependence graphs (PDGs) form an essential element for slicing. The reaching definitions analysis is particularly well suited to derive a clean and precise implementation of program dependence graphs, because it is field-sensitive.
Make SAMATE Tests Work  Daniel Kroening  Read this paper http://www.sciencedirect.com/science/article/pii/S0950584913000384 and make the tests mentioned in there work.  
Connect to GDB  Daniel Kroening  GDB has a client/server system to allow the debugger to be controlled from other programs. It would be awesome to have a feature for CBMC that allows a trace to be run via GDB. This would allow easy integration with existing debugging tools (an IDE plugin could simply drop the programmer into the debugger with an execution trace that causes the problem), some really cool CEGAR-like tricks, and some fun with parallel programs using multiple debuggers / the thread support, and possibly reverse debugging to allow error traces to be explored. To do this will require working out which functions are non-deterministic models of real functions and accounting for this. This would also be necessary for some of the preceding ideas.  
Constant Folding  Daniel Kroening  With the recently added reaching definitions analysis, it is straightforward to implement a program transformation that performs constant folding. To ensure soundness in concurrent settings, it should be combined with the existing is_threaded analysis. (A concurrency-aware version of reaching definitions now exists and is awaiting a merge.) Along the way, assertions that become trivially true should be eliminated.  
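To make the transformation concrete, here is a minimal constant folder over a toy expression representation, not CProver code: reaching definitions would supply the set of variables known to hold a single constant value, here passed in as a plain dictionary `env` (a hypothetical name).

```python
# Toy constant folder: expressions are ints, variable-name strings, or
# ('op', lhs, rhs) tuples. `env` maps variables that reaching definitions
# proved constant to their value; everything else stays symbolic.
def fold(e, env):
    if isinstance(e, str):
        return env.get(e, e)            # substitute known constants
    if isinstance(e, tuple):
        op, lhs, rhs = e
        a, b = fold(lhs, env), fold(rhs, env)
        if isinstance(a, int) and isinstance(b, int):
            return {'+': a + b, '-': a - b, '*': a * b}[op]
        return (op, a, b)               # partially folded
    return e                            # already a constant

expr = ('+', ('*', 'n', 3), 'x')
print(fold(expr, {'n': 2}))  # -> ('+', 6, 'x')
```

An assertion whose condition folds to a constant true value is exactly the "trivially true" case mentioned above, and can then be eliminated.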
Dalvik Front-End  Daniel Kroening  Connect to an existing unpacker, or add to the Java byte code front-end so that it can handle Dalvik / Android binaries. Some work implementing the Android libraries will be needed to make this useful.  
Decision Procedures  Daniel Kroening  A list of projects can be found here  
Enhancing the CBMC symbolic execution tool to support mixed concrete + symbolic execution  Daniel Kroening  The purpose of this enhancement is to solve cases where pure symbolic execution fails (incomplete decision procedures and handling of native libraries) while still maintaining its advantages. Previous work in [1] and [2] has mixed concrete and symbolic execution by using run-time values of program variables to simplify constraints that could not be handled previously, or using concrete solutions of the solvable constraints in the current symbolic path condition. The project would involve implementing an approach similar to [2] in the existing CBMC tool. [1] P. Godefroid, N. Klarlund, and K. Sen. DART: Directed Automated Random Testing. SIGPLAN Not., 40(6):213-223, 2005. [2] C. Pasareanu, N. Rungta, and W. Visser. Symbolic Execution with Mixed Concrete-Symbolic Solving. In International Symposium on Software Testing and Analysis (ISSTA), July 2011.  
Error Explanation  Daniel Kroening  Reimplement the algorithm described in this paper. http://www.kroening.com/papers/STTTerrorexplanation2005.pdf  
Front-End Testing  Daniel Kroening  Run the regressions mentioned here and the GCC C Torture tests with goto-cc and a goto-program interpreter (mild dependency on the interpreter-related task above).  
Hardware footprinting of software  Daniel Kroening, Tom Melham  The target of this project is safety-relevant, embedded, automotive applications (steering, braking, airbags, drive-by-wire controls) where the computational state of an embedded controller must be checked before the system can be enabled for each drive cycle. The suggested topic is to outline and demonstrate a system for determining the hardware (gates/transistors/memories) within a controller architecture that is required to execute a given binary image for all possible data inputs: the 'silicon footprint' of an executable. This 'silicon footprint' would be useful in verifying the diagnostic coverage of a given self-checking code, for targeting error injection into the computation for test purposes, and to optimise the power required for commonly used items. Such proof of diagnostic coverage (ability to detect static and transient faults in the processing core) is requested in the new ISO 26262 automotive functional safety standard, but no tools or methods exist today to measure the 'silicon footprint' of an algorithm, so claims on diagnostic coverage are mostly derived from generic silicon failure rates, which is a suboptimal approach. There is scope in this project for collaboration with Infineon Technologies (Bristol, UK). 

Improved Interpreter  Daniel Kroening  CProver has an interpreter for goto programs, but it could be improved, for example by adding support for dynamic analysis, calls to external functions, and possibly even mixed concrete and symbolic execution. It would also be good to be able to interpret traces found by BMC to catch mismatches / bugs between the logical model and the bit-blasting models. This would also be useful for more advanced CEGAR-like techniques.  
Separation Logic in SMT Solvers  Daniel Kroening  Reasoning about the heap is one of the biggest challenges in program analysis. The goal of this project is to evaluate ways of combining Satisfiability Modulo Theories with a decision procedure for a fragment of Separation Logic. 

Static Detection of Data Races  Daniel Kroening  It would be good to have a tool that checks for data races in a similar fashion to DRD / Helgrind. The current pointer analysis can over-approximate shared variables, so possibly some combination of this, trace generation, DRD / Helgrind, and then refinement of which traces to pick? Perhaps this is even something ACDL-like?  
The Galileo Phone  Daniel Kroening, Tom Melham  The Intel Galileo boards are microcontroller boards that can be networked, and also conform to the Arduino standard so that devices (sensors, actuators, screens, I/O devices) can be attached to them and controlled. The task is to build a device resembling a system-on-chip, i.e., a big system with several devices that work in concert. An obvious example would be to build your own smartphone. It should provide the ability to initiate and receive calls, feature an MP3 player, and possibly some other phone features – alarm clock, GPS, and so on. You will be given a small budget to add the devices you need. Part of the challenge is restricted battery power: how long can you run your phone off a largish capacitor? 

Unspecified C Behaviours  Daniel Kroening  Build instrumentation to check for unspecified behaviours in C, as per Section J.2 (and maybe also J.3) of the C standard. Talk to Michael Tautschnig or Martin Brain for this item. You can point to http://www.cprover.org/wiki/starter_projects for further information.  
k-liveness  Daniel Kroening  Implement k-liveness for software liveness checking.  
Automated protocol synthesis from requirements  Marta Kwiatkowska  Network protocols are sets of rules that define how information is exchanged between computers or agents on a network. Many communication protocols, such as Bluetooth, can be modelled as timed automata or their extensions. Protocols typically have the same algorithmic structure but may differ in values of parameters, e.g. timing delays. This project aims to develop a method for synthesising protocols from requirements given in temporal logic so that they are correct by construction. The idea is to combine template-based synthesis (“Template-based Program Verification and Program Synthesis”, Journal on Software Tools for Technology Transfer, 2012) with parameter synthesis for timed automata developed and implemented recently (http://www.veriware.org/bibitem.php?key=DKKM14). This way, the algorithmic structure will be given as a parametric template, and appropriate parameter values can be selected automatically to optimise a given quantitative requirement. This project would suit a student interested in theory and/or constraint solving.  
Autonomous urban driving  Marta Kwiatkowska  This project is concerned with synthesising strategies for autonomous driving directly from requirements expressed in temporal logic, so that they are correct by construction. Probability is used to quantify information about hazards, such as accident hotspots. Inspired by the DARPA Urban Challenge, a method for synthesising strategies (controllers) from multi-objective requirements was developed and validated on map data for villages in Oxfordshire (http://www.prismmodelchecker.org/bibitem.php?key=CKSW13). The idea is to develop the techniques further, to allow high-level navigation based on waypoints, and to develop strategies for avoiding threats, such as road blockage, at runtime. In the longer term, the goal is to validate the methods on realistic scenarios in collaboration with the Mobile Robotics Group. The project will suit a student interested in theory or software implementation. For more information about the project see http://www.veriware.org/autonomous.php  
Controller synthesis for robot coordination  Marta Kwiatkowska  Autonomous robots have numerous applications in scenarios such as warehouse management, planetary exploration, or search and rescue. In view of environmental uncertainty, such scenarios are modelled using Markov decision processes. The goals (e.g. the goods should be delivered to location A while avoiding the hazardous location B) can be conveniently specified using temporal logic, from which correct-by-construction controllers (strategies) can be generated. This project aims to develop a PRISM model of a system of robots for a particular scenario so that safety and effectiveness of their cooperation is guaranteed. Techniques based on machine learning, and specifically real-time dynamic programming (http://www.prismmodelchecker.org/papers/atva14.pdf), will be utilised to generate controllers directly from temporal logic goals. This project will suit a student interested in machine learning and software implementation.  
Driver modelling  Marta Kwiatkowska  Semi-autonomous driving is intended to improve safety, and involves, for example, taking control of the car from the driver if a collision is predicted. Probabilistic models of driver behaviour are needed in order to monitor whether the driver is attentive or distracted, and to predict if a manoeuvre, such as the driver swerving to the neighbouring lane, is safe. Recently, a driver model was developed and analysed using PRISM (http://www.prismmodelchecker.org/bibext.php?search=driver) and validated against data from the Carsim simulator. This project aims to extend this model in a number of ways: firstly, with more realistic threats, and, secondly, with active threats; the latter can be achieved using stochastic games and the PRISM-games model checker (http://www.prismmodelchecker.org/games/). In the longer term, the goal is to validate the model with real traffic data in collaboration with the Mobile Robotics Group. This project will suit a student interested in modelling.  
Heartbeat identification for smartphones  Marta Kwiatkowska  Today’s smartphones and fitness gadgets provide a variety of apps for health monitoring. Monitoring the heartbeat has also been proposed for use as a password, to ensure security and privacy, and this idea is currently being developed in, e.g., the Nymi wristband. As part of ongoing research, techniques have been developed to analyse the ECG signal to facilitate personalisation of pacemaker devices; see http://www.veriware.org/bibitem.php?key=KMP14. This project aims to implement the method proposed in the paper “Human Identification Using Heartbeat Interval Features and ECG Morphology”, Proceedings of the Seventh International Conference on Bio-Inspired Computing: Theories and Applications (BIC-TA 2012), Springer, 2013. The implementation will be evaluated using a hardware testbed to be provided. The project will suit a student interested in embedded software. For more information about the pacemaker project see http://www.veriware.org/pacemaker.php.  
Modelling and analysis of energy usage for RFID protocols  Marta Kwiatkowska  Radio frequency identification (RFID) is used for object identification, with application to devices connected to the Internet of Things. RFID protocols can benefit from formal verification techniques, and particularly quantitative verification, with which a number of cost measures can be analysed. Recently, a Markov model was developed for performance analysis (“Performance Analysis of RFID Protocols: CDMA Versus the Standard EPC Gen2”, IEEE Transactions on Automation Science and Engineering, vol. 11, no. 4, pp. 1250-1261, 2014). The idea is to develop a parametric PRISM model (www.prismmodelchecker.org) based on this Markov model and analyse its energy usage characteristics. The advantage of the parametric model is that it can be used to automatically determine optimal parameter values that guarantee that a given property is satisfied. The PRISM model of the ZigBee protocol (http://www.prismmodelchecker.org/casestudies/zigbee.php) may be useful for this purpose. This project would suit a student interested in probabilistic modelling and the Internet of Things.  
Modelling and verification of DNA programs  Marta Kwiatkowska  DNA molecules can be used to perform complex logical computation. DNA computation differs from conventional digital computation and is sometimes referred to as ‘computing with soup’ (http://www.economist.com/node/21548488). Correct design of DNA devices is error-prone, and the task is supported by tools such as the DNA Strand Displacement (DSD) programming language and simulator (http://research.microsoft.com/en-us/projects/dna/default.aspx) developed at Microsoft. The DSD tool enables the probabilistic model checking of DNA circuits and has been used to identify a flaw in a DNA transducer gate (see http://qav.comlab.ox.ac.uk/bibitem.php?key=LPC+12). The aim of this project is to model and analyse the DNA implementation of logic inference proposed in “Autonomous Resolution Based on DNA Strand Displacement”, DNA Computing and Molecular Programming, LNCS 6937, 2011. The project will suit a student interested in modelling DNA programs using DSD and/or PRISM. For more information about the DNA computing project see http://www.veriware.org/dna.php.  
Modelling trust in human-robot interaction  Marta Kwiatkowska  When human users interact with autonomous robots, appropriate notions of computational trust are needed to ensure that their interactions are safe and effective: too little trust can lead to user disengagement, and too much trust may cause damage. Trust management systems have been introduced for autonomous agents on the Internet, but need to be adapted to the setting of mobile robots, taking into account intermittent connectivity and uncertainty in sensor readings. Recently, a quantitative reputation-based trust model for user-centric networks has been modelled and analysed using PRISM-games, http://www.prismmodelchecker.org/games/, an extension of the PRISM model checker which supports stochastic multi-player games as models and objectives expressed in temporal logic (http://www.prismmodelchecker.org/bibitem.php?key=KPS13). This project aims to develop a quantitative trust management system that is suitable for mobile robots. Initially, a simplified setting will be considered and an approach to modelling trust developed for this setting. The project will suit a student interested in modelling and/or software implementation.  
Probabilistic model checking for cellular microcolonies  Marta Kwiatkowska  The gro language enables programming, modelling, specifying and visually simulating the behaviour of cells in growing microcolonies of microorganisms (http://depts.washington.edu/soslab/gro/index.html). gro has been used for teaching synthetic biology. Microcolonies develop through, e.g., cell division and cell growth, and can be described in a simple guarded command language. This project aims to extend the gro framework with probabilistic model checking by translating the language to the PRISM modelling language (www.prismmodelchecker.org) and extending the functionality to allow for model checking queries specified in probabilistic temporal logic. The project would suit a student interested in programming and synthetic biology.  
Quantitative/Probabilistic Modelling, Verification and Synthesis  Marta Kwiatkowska  Professor Marta Kwiatkowska is happy to supervise projects in the area of quantitative/probabilistic modelling, verification and synthesis, particularly those relating to the PRISM model checker. PRISM is an open source formal verification tool for analysis of probabilistic systems. PRISM has an extensive website which includes software for download, tutorial, manual, publications and many case studies. Students' own proposals in the broad area of theory, algorithms and implementation techniques for software verification/synthesis will also be considered. Below are some concrete project proposals:
A Molecular Recorder (with Luca Cardelli). Recent technological developments allow massive parallel reading (sequencing) and writing (synthesis) of heterogeneous pools of DNA strands. We are no longer limited to simple circuits built out of a small number of different strands, nor to reading out a few bits of output by fluorescence. While these 

Safety Assurance for Deep Neural Networks  Marta Kwiatkowska  Professor Marta Kwiatkowska is happy to supervise projects in the area of safety assurance and automated verification for deep learning. This is a new research topic initiated with this paper (http://qav.comlab.ox.ac.uk/bibitem.php?key=HKW+17), see also https://www.youtube.com/watch?v=XHdVnGxQBfQ. Below are some concrete project proposals:


Symmetry Declarations for SAT-solvers  Anthony Lin  Constraint programming is a programming paradigm in which programmers write programs by specifying the problem description (in terms of constraints) and letting computers use their computational power to figure out the solution automatically. This is in contrast to typical programming paradigms, wherein programmers explicitly spell out procedures for carrying out the computation. Constraint programming has a wide range of applications (e.g. optimisation, planning) and relies on powerful constraint solvers to solve computational problems. SAT-solvers, claimed by some researchers to be among the greatest achievements of the past decade, are one of the most powerful solvers in the constraint programming toolbox. In a nutshell, SAT-solvers are algorithms for solving satisfiability of Boolean formulas, which is an NP-complete problem that can be used as a universal language for encoding many practical combinatorial problems. Although the problem of satisfiability of Boolean formulas is difficult in theory, SAT-solvers have greatly advanced in the past two decades, to the extent that large formulas (with millions of variables/clauses) can now be handled. The aim of the project is to investigate ways in which to improve the performance of SAT-solvers by embedding symmetry information (e.g. variable symmetry). Boolean formulas that arise in practice (e.g. as encodings of combinatorial problems) exhibit many symmetries, which the programmers typically know. Specific questions to explore include: (1) what is a convenient (but general) way to specify symmetry information in Boolean formulas? (2) given the symmetry information, how do we engineer SAT-solvers to exploit such information to speed up the search for solutions?  
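A minimal sketch of question (2), with an invented formula: when two variables are interchangeable, a lex-leader symmetry-breaking predicate (here x1 <= x2 for the swap of x1 and x2) lets the search skip half the assignment space without changing satisfiability. A real SAT-solver would exploit this inside its backtracking search; brute-force enumeration is used here only to make the pruning visible.

```python
from itertools import product

def search(clauses, n, sym_break=None):
    """Brute-force SAT search over n variables in DIMACS-style clause
    lists (literal k means variable k, -k its negation).
    Returns (satisfiable?, number of assignments checked)."""
    checked = 0
    for bits in product([False, True], repeat=n):
        if sym_break and not sym_break(bits):
            continue                 # pruned using declared symmetry
        checked += 1
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
            return True, checked
    return False, checked

# x1 and x2 are interchangeable in this (unsatisfiable) formula:
# (x1 or x2) and (not x1 or not x2) and x3 and not x3
clauses = [[1, 2], [-1, -2], [3], [-3]]
plain = search(clauses, 3)
# lex-leader for x1 <-> x2: keep only assignments with x1 <= x2
pruned = search(clauses, 3, sym_break=lambda b: b[0] <= b[1])
```

Both searches report unsatisfiability, but the symmetry-aware one examines 6 assignments instead of 8; on formulas with large symmetry groups the saving compounds exponentially.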
Analysis of wildlife tracking data  Andrew Markham  We have developed ultra-low power wireless tracking tags which have been deployed on badgers, hares and swans in the wild. The sensors mainly measure 3D acceleration, but some data also come from our world-first underground 3D tracking system. The sensors are typically sampled at 8 Hz and collars run for between 2 and 3 months. This has generated a large dataset (> 500 million samples) of animal behaviour traces. For some of these deployments, there are secondary (ground truth) data including video and RFID. The challenge is to develop tools to analyse these datasets and present results to end-users (zoologists) in a user-friendly way. Such a tool would perform classification (typically unsupervised) on behaviour traces, generating classes of behaviour. At a very coarse level, this could simply be sleeping vs active, but the richer the classification, the better for automatic generation of ethograms. This classification can also be used online to alter the behaviour of the tracking tags themselves, for example to reduce the sampling rate when an animal is resting in order to prolong the lifetime of the collar. Suitable for an MSc student interested in statistical analysis/data mining and interdisciplinary science  
Medical information extraction with deep neural networks  Alejo Nevado-Holgado, Paul Goldberg  Background 

Bootstrapping document annotation with Schema.org  Nadeschda Nikitina  The recently started Schema.org initiative of the major search engine providers aims at fostering semantic annotations across the Web. Semi-automatic annotation of natural language documents is a long-standing problem area. The goal of this project would be to apply state-of-the-art annotation techniques to a large corpus based on the Schema.org semantic model. Prerequisites: Some familiarity with topics from Computational Linguistics and Knowledge Representation and Reasoning. 

Efficient processing of nested collections in Apache Spark  Milos Nikolic  Apache Spark is today one of the most popular frameworks for large-scale data analysis. Spark offers a functional (Scala-like) API for processing data collections that are distributed over a cluster of machines. Its declarative approach, domain-specific libraries (e.g., for machine learning and graph processing), and high performance have enabled its wide adoption in industry. Although Spark can transform collections of arbitrary types, it can exhibit severe performance problems when processing nested data formats such as JSON and XML. In particular, distributed processing of datasets in which nested collections have skewed cardinalities (e.g., one extremely large nested collection, the others small) leads to uneven distribution of work among the machines. In such cases, developers typically have to undergo a painful process of manual query rewriting to avoid load imbalance for large inner collections in their workloads. This project aims to extend the Spark API with new functionality that would automatically transform user queries to avoid data skew. This project is a great opportunity for students to understand how Apache Spark works under the hood and to contribute to an open-source project. 
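The manual rewrite the project would automate can be sketched in plain Python (no Spark dependency; the data and function names are invented). Instead of processing each record's inner collection as one unit, so that a single record with a huge inner list pins one machine, the query is flattened to (key, element) pairs, which a framework like Spark can repartition evenly, and the results are regrouped afterwards.

```python
from collections import defaultdict

# Skewed nested dataset: r1's inner collection dwarfs the others.
records = [("r1", list(range(1000))),
           ("r2", [1, 2]),
           ("r3", [5])]

def process_nested_naive(recs, f):
    # One unit of work per record: work is proportional to the size of
    # the inner collection, so "r1" dominates -> load imbalance.
    return [(k, [f(x) for x in inner]) for k, inner in recs]

def process_nested_flat(recs, f):
    # Rewrite: flatMap to (key, element), process per element (these
    # units of work are uniform and repartitionable), then groupByKey.
    flat = [(k, x) for k, inner in recs for x in inner]
    done = [(k, f(x)) for k, x in flat]
    regrouped = defaultdict(list)
    for k, y in done:
        regrouped[k].append(y)
    return sorted(regrouped.items())
```

The two forms compute the same result; the point of the project is to perform this rewrite automatically rather than by hand.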

Experimental evaluation of hashmap implementations  Milos Nikolic  This project will study existing in-memory hashmap implementations (e.g., GCC, Boost, Google's sparsehash) and experimentally evaluate their performance. This evaluation aims to provide deeper insights into the behaviour of these hashmaps under different types of workloads using performance profiling tools (e.g., Valgrind). Armed with this knowledge, the student will implement a hashmap with multi-index access optimised for use in real-time (streaming) environments. Prerequisites: solid programming skills (C++ preferable)  
Parallel query evaluation in streaming environments  Milos Nikolic  Many modern applications, such as social web and IoT applications, use stream processing engines to handle continuously arriving data in real time. In this project, we consider one such state-of-the-art engine, called DBToaster, which continuously evaluates (static) queries over changing (dynamic) data. The goal of this project is to enable multicore query processing over streaming data in DBToaster, which requires implementing the necessary primitives for parallelisation of the existing single-threaded query evaluation procedures.  
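The core idea behind this style of engine, shown here as a simplified single-threaded sketch with an invented view and event stream, is incremental view maintenance: instead of recomputing a query such as "SELECT k, SUM(v) GROUP BY k" from scratch on every arrival, the result is kept materialised and each insertion or deletion is applied as a small delta. (DBToaster compiles queries to such delta triggers automatically; the parallelisation the project targets would run these triggers across cores.)

```python
from collections import defaultdict

class SumView:
    """Materialised view of a grouped SUM, maintained incrementally."""
    def __init__(self):
        self.totals = defaultdict(int)

    def on_insert(self, k, v):
        self.totals[k] += v      # delta update: O(1) work per event

    def on_delete(self, k, v):
        self.totals[k] -= v      # deletions are just negative deltas

view = SumView()
for k, v in [("a", 3), ("b", 1), ("a", 4)]:
    view.on_insert(k, v)
view.on_delete("a", 3)
# the view now reflects the net stream: a -> 4, b -> 1
```
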
Compressed Databases  Dan Olteanu  The goal of this project is to investigate theoretical and practical aspects of databases that use compressed representations of records in both query input and output. By compression, we mean here techniques that resemble the algebraic factorisation of Boolean functions. This is a novel research topic with no publications available so far in the context of database research. In the broad context of factorisations of Boolean functions, a good starting point is a paper by Brayton on "Factoring logic functions". For more information, please contact Dan Olteanu.  
Various (see url)  Dan Olteanu  Please click here to see the full list of projects being offered: https://www.cs.ox.ac.uk/teaching/materials1516/databasesystemsimplementation/projects.html 

Branching Temporal Logics, Automata and Games  Luke Ong  Model checking has emerged as a powerful method for the formal verification of programs. Temporal logics such as CTL (computation tree logic) and CTL* are widely used to specify programs because they are expressive and easy to understand. Given an abstract model of a program, a model checker (which typically implements the acceptance problem for a class of automata) verifies whether the model meets a given specification. A conceptually attractive method for solving the model checking problem is to reduce it to the solution of (a suitable subclass of) parity games. These are a type of two-player infinite game played on a finite graph. The project concerns the connections between the temporal logics CTL and/or CTL*, automata, and games. Some of the following directions may be explored. 1. Representing CTL / CTL* as classes of alternating tree automata. 2. Inter-translation between CTL / CTL* and classes of alternating tree automata. 3. Using Büchi games and other subclasses of parity games to analyse the CTL / CTL* model checking problem. 4. Efficient implementation of model checking algorithms. 5. Application of the model checker to higher-order model checking. References: Orna Kupferman, Moshe Y. Vardi, Pierre Wolper: An automata-theoretic approach to branching-time model checking. J. ACM 47(2): 312-360 (2000). Rachel Bailey: A Comparative Study of Algorithmics for Solving Büchi Games. University of Oxford MSc Dissertation, 2010. 
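To make the model checking problem concrete, here is a minimal sketch of explicit-state CTL checking on an invented Kripke structure: the operators EX, E[· U ·] and EG are computed as least and greatest fixpoints over the state set, which is the computation that a reduction to Büchi or parity games performs implicitly.

```python
# Invented 4-state Kripke structure: transitions and atomic labels.
SUCC = {0: {1}, 1: {2}, 2: {2}, 3: {0}}
LABEL = {0: {"p"}, 1: {"p"}, 2: {"q"}, 3: set()}
STATES = set(SUCC)

def sat_EX(phi):
    """States with at least one successor satisfying phi."""
    return {s for s in STATES if SUCC[s] & phi}

def sat_EU(phi, psi):
    """E[phi U psi] as a least fixpoint: grow from psi backwards."""
    z = set(psi)
    while True:
        new = z | (phi & sat_EX(z))
        if new == z:
            return z
        z = new

def sat_EG(phi):
    """EG phi as a greatest fixpoint: shrink phi until stable."""
    z = set(phi)
    while True:
        new = phi & sat_EX(z)
        if new == z:
            return z
        z = new

p = {s for s in STATES if "p" in LABEL[s]}
q = {s for s in STATES if "q" in LABEL[s]}
```

The remaining CTL operators are derivable from these (e.g. AG phi is the complement of E[true U not-phi]), so the three fixpoints above are the whole algorithmic core.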

Foundational Structures for Concurrency  Luke Ong  Dr Ong is willing to supervise projects in this area.  
Game Semantics and Linear Logic  Luke Ong  Dr Ong is willing to supervise projects in this area.  
Lambda Calculus  Luke Ong  Dr Ong is willing to supervise projects in this area.  
Luke Ong's Projects  Luke Ong  See http://users.comlab.ox.ac.uk/luke.ong/projects/ for more information.  
Semantics of Programming Languages  Luke Ong  Dr Ong is willing to supervise projects in this area.  
Types, Proofs and Categorical Logic  Luke Ong  Dr Ong is willing to supervise projects in this area.  
Building databases of mathematical objects in Sagemath (Python)  Dmitrii Pasechnik 
Prerequisites: Interest in open source software, some knowledge of Python, some maths background. 

Computing with semialgebraic sets in Sage (math)  Dmitrii Pasechnik  Semialgebraic sets are subsets of R^n specified by polynomial inequalities. The project will extend the capabilities of Sage (http://www.sagemath.org) to deal with them, such as CAD computations or sums-of-squares based (i.e. semidefinite programming based) methods. There might be a possibility for participation in Google Summer of Code (GSoC) or Google Semester of Code with Sage as a GSoC organisation. Prerequisites: Interest in open source software, some knowledge of Python, appropriate maths background. 

Implementing and experimenting with variants of Weisfeiler-Leman graph stabilisation  Dmitrii Pasechnik  Weisfeiler-Leman graph stabilisation is a procedure that plays an important role in modern graph isomorphism algorithms (L. Babai 2015) and has recently enjoyed attention in the machine learning community (cf. "Weisfeiler-Lehman graph kernels"). While it is relatively easy to implement, efficient portable implementations seem to be hard to find. In this project we will work on producing such an open-source implementation, either by reworking our old code from http://arxiv.org/abs/1002.1921 or by producing a new one; we will also work on providing Python/Cython interfaces for possible inclusion in the SageMath (http://sagemath.org) library, and/or elsewhere.  
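For orientation, the simplest (1-dimensional) variant of the procedure, colour refinement, fits in a few lines: each vertex is repeatedly recoloured by its current colour together with the multiset of its neighbours' colours, until the partition stabilises. Graphs whose stable colour histograms differ are certainly non-isomorphic. (The project concerns the full 2-dimensional stabilisation, which refines colours of vertex *pairs*; this sketch shows only the 1-dimensional idea.)

```python
def wl_colours(adj):
    """1-WL colour refinement. adj: {vertex: iterable of neighbours}.
    Returns the stable colouring as {vertex: colour id}."""
    colour = {v: 0 for v in adj}
    while True:
        # signature = own colour + sorted multiset of neighbour colours
        sig = {v: (colour[v], tuple(sorted(colour[u] for u in adj[v])))
               for v in adj}
        # canonically rename signatures to small integers
        palette = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        new = {v: palette[sig[v]] for v in adj}
        # refinement never merges classes, so an unchanged class count
        # means the partition is stable
        if len(set(new.values())) == len(set(colour.values())):
            return new
        colour = new

path = {0: [1], 1: [0, 2], 2: [1]}        # path on 3 vertices
tri  = {0: [1, 2], 1: [0, 2], 2: [0, 1]}  # triangle
```

On the path, refinement separates the endpoints from the middle vertex; on the triangle, every vertex stays the same colour, so the two stable histograms distinguish the graphs.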
An interactive visual tutorial for Bayesian parameter estimation  Joe Pitt-Francis, Michael Clerx  The aim of this project is to build an educational tool which enables the progress of a Bayesian parameter estimation algorithm to be visualised. The model to be fitted might be (but is not limited to) a system of ordinary differential equations, and the Bayesian estimation tools might be built around an existing system such as Stan, PyML or Edward. A good tutorial system should let the user change the underlying model system, introduce noise to a system, visualise interactive updates to probability distributions, explore the progress of a chosen sampling method such as Metropolis-Hastings, and provide enough information that a novice student can gain an intuition into all aspects of the process.  
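The sampling step such a tool would animate can be sketched in a few lines. Below is a toy Metropolis-Hastings sampler (target and proposal are invented for illustration): it draws from a 1-D standard normal using a symmetric random-walk proposal, accepting each move with probability min(1, target ratio). A tutorial tool would plot the chain and the evolving histogram after every step.

```python
import math
import random

def log_target(x):
    """Log of the unnormalised N(0, 1) density (toy posterior)."""
    return -0.5 * x * x

def metropolis_hastings(n_steps, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings; returns the sampled chain."""
    rng = random.Random(seed)
    x, chain = 0.0, []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step)
        log_ratio = log_target(proposal) - log_target(x)
        # accept with probability min(1, ratio); log form avoids overflow
        if log_ratio >= 0 or rng.random() < math.exp(log_ratio):
            x = proposal
        chain.append(x)
    return chain

chain = metropolis_hastings(20000)
```

After many steps the empirical mean and variance of the chain approach the target's 0 and 1, which is exactly the convergence behaviour the tutorial would make visible.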
Automatic translation to GPGPU  Joe Pitt-Francis  This project involves running cardiac cell models on a high-end GPU card. Each model simulates the electrophysiology of a single heart cell and can be subjected to a series of computational experiments (such as being paced at particular heart rates). For more information about the science, and to see it in action on CPU, see the "Cardiac Electrophysiology Web Lab" at https://travis.cs.ox.ac.uk/FunctionalCuration/ An existing compiler (implemented in Python) is able to translate from a domain-specific XML language (http://models.cellml.org) into a C++ implementation. The goal of the project is to add functionality to the compiler in order to obtain OpenCL or CUDA implementations of the same cell models and thus increase the efficiency of the "Web Lab". 

General graphics projects  Joe Pitt-Francis  I am interested in supervising general projects in the area of computer graphics. If you have a particular area of graphics-related research that you are keen to explore then we can tailor a bespoke project for you. Specific projects I have supervised in the past include "natural tree generation", which involved using Lindenmayer systems to grow realistic-looking bushes and trees to be rendered in a scene; "procedural landscape generation", in which an island world could be generated on-the-fly using a set of simple rules as a user explored it; "gesture recognition", where a human could control a simple interface using hand gestures; "parallel ray-tracing" on distributed-memory clusters and using multiple threads on a GPU card; "radiosity modelling", used for analysing the distribution of RFID radio signal inside a building; and "non-photorealistic rendering", where various models were rendered with toon/cel shaders and a set of pencil-sketch shaders. 

Computation of particle interactions in n-dimensional space  Martin Robinson  Aboria (https://github.com/martinjrobins/Aboria) is a C++ library for evaluating and solving systems of equations that can be described as interactions between particles in n-dimensional space. It can be used as a high performance library to implement numerical methods such as Molecular Dynamics in computational chemistry, or Gaussian Processes for machine learning. Project 1: Aboria features a radial neighbour search to find nearby particles in the n-dimensional space, in order to calculate their interactions. This project will implement a new algorithm based on calculating the interactions between neighbouring *clusters* of particles. Its performance will be compared against the existing implementation, and across the different spatial data structures used by Aboria (cell-list, octree, kd-tree). Prerequisites: C++. Project 2: Aboria features a serial Fast Multipole Method (FMM) for evaluating smooth long-range interactions between particles. This project will implement and profile a parallel FMM algorithm using CUDA and/or the Thrust library. Prerequisites: C++, knowledge of GPU programming using CUDA and/or Thrust. Project 3: The main bottleneck of the FMM is the interactions between well-separated particle clusters, which can be described as low-rank matrix operations. This project will explore different methods of compressing these matrices in order to improve performance, using either Singular Value Decomposition (SVD), randomised SVD, or Adaptive Cross Approximation. Prerequisites: C++, linear algebra. 

Statistical shape atlas of cardiac anatomy  Blanca Rodriguez, Pablo Lamata  Description: Cardiac remodelling is the change in shape of the anatomy due to disease processes. 3D computational meshes encode shape variation in cardiac anatomy, and offer higher diagnostic value than conventional geometrical metrics. Better shape metrics will enable earlier detection, more accurate stratification of disease, and more reliable evaluation of remodelling response. This project will contribute to the development of a toolkit for the construction of anatomical atlases of cardiac anatomy, and to its translation to clinical adoption. The student will learn about the challenges and opportunities of the cross-disciplinary field between image analysis, computational modelling and cardiology, and be part of a project with big potential impact on the management of cardiovascular diseases. Prerequisites: Motivation. Good programming skills. Experience with computer graphics and image analysis is an advantage. 

Verification of Concurrent Software based on Partial-Order Semantics  César Rodríguez, Daniel Kroening  Description: A Boolean program is one where all variables are of Boolean type. In the context of formal verification of concurrent software, concurrent Boolean programs (CBPs) are generated by a process of abstracting the original program, in such a way that the original analysis problem reduces to an equivalent problem on the CBP. The unfolding technique is a well-studied verification method for another model of concurrency called Petri nets. Unfoldings represent the behaviour of a Petri net in a compact way, by means of a partial order enriched with additional information. Not only is this representation theoretically neat, it is also very efficient for practical verification. The goal of the project is to apply the unfolding technique to the verification of CBPs, and to compare it with existing verification techniques for CBPs. Prerequisites: suitable for students having followed the course "Computer-Aided Formal Verification" 

A processor design toolkit for an open-source FPGA toolchain  Alex Rogers  Field-programmable gate arrays (FPGAs) are integrated circuits that can be configured after manufacture into almost any conceivable logic circuit, combining both combinatorial and synchronous elements. They can be used to explore real hardware implementations of processor designs, from simple accumulator machines, through register machines, to more unusual stack machines. Their use outside industry has previously been limited due to the high cost and complexity of their associated development software. However, this is changing, with an open-source toolchain for popular FPGA devices becoming available in the last year (the equivalent of Linux in the OS world). This project will build on this toolchain to develop the additional software necessary to allow students to design and explore simple processor designs on a custom FPGA development board. The current toolchain requires the use of the Verilog hardware description language. Verilog is very powerful but also very general. This project will develop a high-level language (possibly a graphical language) focused on the development of simple processor designs from a small number of standard components (such as RAM, multiplexers and registers). Prerequisites: Digital Systems, Compilers & Computer Architecture useful but not essential.  
An Acoustic Detection Design Tool for Environmental Monitoring  Alex Rogers  Acoustic monitoring using low-power embedded devices is increasingly valuable in environmental and biodiversity monitoring. One aspect of this work is to automatically detect and count acoustic events such as the calls of bats, insects, birds or amphibians. This currently involves writing bespoke low-level code for each specific application. However, it is also possible to build effective detection algorithms by chaining together simple filters, threshold detectors, and finite state machines. This project will investigate the feasibility of automatically generating this code from a high-level description of this chain, either through a high-level language or through a graphical representation. The student will develop the key elements of a tool chain that will allow an ecologist or biologist to interactively design a detection algorithm, given an example of the sound to be detected, and to compile this algorithm for the AudioMoth acoustic sensor (www.openacousticdevices.info). Prerequisites: Compilers and experience of embedded devices useful but not essential  
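The kind of filter/threshold/state-machine chain described above can be sketched as follows (all thresholds and constants are invented, and a real detector would run on the device in fixed-point C): an envelope follower acts as a simple low-pass filter, a threshold detector binarises it, and a tiny state machine counts an event only when the signal stays above threshold for several consecutive samples.

```python
MIN_RUN = 3     # samples above threshold before an event is counted
THRESH = 0.5    # envelope threshold
ALPHA = 0.5     # smoothing factor of the envelope follower

def count_events(samples):
    """Chain: low-pass filter -> threshold -> debouncing state machine.
    Returns the number of detected acoustic events."""
    env, run, events = 0.0, 0, 0
    for x in samples:
        env = ALPHA * abs(x) + (1 - ALPHA) * env   # envelope follower
        above = env > THRESH                        # threshold detector
        run = run + 1 if above else 0               # state: consecutive hits
        if run == MIN_RUN:                          # fire once per burst
            events += 1
    return events
```

A design tool would let the user tune MIN_RUN, THRESH and the filter interactively against a recorded example call, then emit equivalent embedded code.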
Convolutional neural networks for microcontrollers and constrained hardware  Alex Rogers  Convolutional neural networks have made dramatic advances in recent years on many image and vision processing tasks. While training such networks is computationally expensive (typically requiring very large image datasets and exploiting GPU acceleration), they can often be deployed on much simpler hardware if simplifications such as integer or even binary weights are imposed on the network. This project will explore the deployment of trained convolutional networks on microcontrollers (and possibly also FPGA-based hardware), with the intention of demonstrating useful image processing (perhaps recognising the presence of a face in the field of view of a low-resolution camera) on low-power devices. Prerequisites: Machine Learning & Computer Architecture useful but not essential.  
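One common form of the binary-weight simplification mentioned above can be sketched as follows (a simplified illustration, not a specific paper's method): each float weight is replaced by its sign, scaled by the layer's mean absolute weight, so a dot product inside a convolution collapses to additions, subtractions and one final multiply, which is what makes inference feasible on a microcontroller.

```python
def binarise(weights):
    """Replace float weights by {+1, -1} signs plus one scale factor."""
    scale = sum(abs(w) for w in weights) / len(weights)
    return scale, [1 if w >= 0 else -1 for w in weights]

def dot_binary(scale, signs, xs):
    """Dot product with binarised weights: no per-weight multiplies,
    just sign-controlled adds and a single final scaling."""
    return scale * sum(s * x for s, x in zip(signs, xs))

w = [0.5, -0.25, 0.75, -1.0]
scale, signs = binarise(w)
```

The accuracy cost of this approximation, and whether integer (rather than binary) weights are a better trade-off on a given microcontroller, is exactly the kind of question the project would investigate empirically.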
Resurrecting Extinct Computers  Alex Rogers  While the architecture of current reduced instruction set processors is well established, and relatively static, the early days of computing saw extensive experimentation and exploration of alternative designs. Commercial processors developed during the 1960s, 1970s and 1980s included stack machines, LISP machines and massively parallel machines, such as the Connection Machine (CM-1), consisting of 65,536 individual one-bit processors connected together as a 12-dimensional hypercube. This period also saw the development of the first single-chip microprocessors, such as the Intel 4004, and the first personal computers, such as the Altair 8800 using the Intel 8080 microprocessor. This project will attempt to resurrect one of these extinct designs (or a scaled-down version if necessary) using a modern low-cost field-programmable gate array (FPGA). You will be required to research the chosen processor, using both original and modern sources, and then use Verilog to develop a register-level description of the device that can be implemented on an FPGA. The final device should be able to run the software of the original, and could be realised in a number of different forms depending on the chosen processor (e.g. an Altair 8800 on a small USB stick running Microsoft BASIC). Prerequisites: Digital Systems or Computer Architecture useful but not essential 

Units of Measure as Types within an Interactive Programming Environment  Alex Rogers  Being able to define the units of constants and variables in a programming language has great value in many applications. NASA's Mars Climate Orbiter was lost in 1999 due to software that calculated trajectory thruster firings in pound-seconds, rather than newton-seconds. In other cases, dimensional analysis (statically checking that the computed units match those that are expected) is sufficient to catch many errors in calculations. While F# supports units natively, and libraries exist in many other languages (e.g. Java, Haskell, Python), none are particularly easy to use, and they often introduce clumsy syntax. This project will improve on these approaches. You will be required to develop a unit-aware interactive programming environment enabling unit-safe physics-based calculations to be performed. This might be a stand-alone solution, or a kernel for an existing interactive computing environment such as Project Jupyter (jupyter.org). Prerequisites: Compilers useful but not essential  
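A hypothetical sketch of the dimension checking such an environment would perform (class and constant names invented; a real system would do this statically rather than at runtime): each quantity carries a vector of exponents over base SI units, addition demands matching dimensions, and multiplication adds exponents, so the Mars Climate Orbiter's pound-seconds vs newton-seconds confusion would surface as a type error.

```python
class Q:
    """A quantity: magnitude plus dimension exponents (m, kg, s)."""
    def __init__(self, mag, m=0, kg=0, s=0):
        self.mag, self.dim = mag, (m, kg, s)

    def __add__(self, other):
        # addition only makes sense for identical dimensions
        if self.dim != other.dim:
            raise TypeError("dimension mismatch: %r vs %r"
                            % (self.dim, other.dim))
        return Q(self.mag + other.mag, *self.dim)

    def __mul__(self, other):
        # multiplication adds dimension exponents
        return Q(self.mag * other.mag,
                 *(a + b for a, b in zip(self.dim, other.dim)))

newton = Q(1.0, m=1, kg=1, s=-2)        # kg*m/s^2
pound_force = Q(4.448, m=1, kg=1, s=-2)  # approx conversion factor
second = Q(1.0, s=1)
impulse = pound_force * second           # an impulse: dimensions N*s
# impulse + newton would raise TypeError: N*s cannot be added to N
```

The project's challenge is to provide this safety without the "clumsy syntax" noted above, e.g. by inferring units rather than requiring explicit wrappers.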
Applications of ad hoc security  Bill Roscoe  My group have developed novel technology for establishing security between pairs and groups of devices such as mobile phones. This technology clearly has many potential applications: the objective of this project is to build an implementation directed at an application of the student's choosing, for example email, healthcare or social networking. See my papers with Bangdao Chen, Ronald Kainda and Long Nguyen on my web page. Prerequisites: Security  
Developing and using SVA  Bill Roscoe  SVA is a front end for FDR that analyses shared-variable programs. See Chapters 18 and 19 of Understanding Concurrent Systems. It has the potential to support a number of projects, including: (a) building a version that accepts a subset of some standard programming language as opposed to the custom language it presently uses; (b) extending its functionality; (c) case studies using it. (a) and (b) would both involve programming in CSP and Java, the two languages SVA is written in. Prerequisites: Concurrency for (a) and (b)  
Modelling and verifying systems in Timed CSP and FDR  Bill Roscoe  Timed CSP reinterprets the CSP language in a real-time setting and has a semantics in which the exact times of events are recorded as well as their order. Originally devised in the 1980s, it has only just been implemented as an alternative mode for FDR. The objective of this project is to take one or more examples of timed concurrent systems from the literature, implement them in Timed CSP, and where possible compare the performance of these models with similar examples running on other analysis tools such as Uppaal. References: Understanding Concurrent Systems, especially Chapter 15, and Model Checking Timed CSP, from AWR's web list of publications. 

Building Heterogeneous Social Software, or, “How dare you tweet my snapchat?”  Nigel Shadbolt, Max Van Kleek, Jun Zhao  Social apps today require everybody to use the same application (e.g. Snapchat) or at least the same protocol (e.g. Twitter/Tweetdeck). This project explores how future social software could blur the boundaries between multiple social services by allowing the abstraction of social protocols for exchanging social information, while preserving the norms and social expectations around it. For example, Alice might send Bob a photo that should be deleted after its first viewing, and expect Bob to conform to that desire; to receive a ‘seen’ and ‘deleted’ receipt; and for Bob to be able to respond. To support this, we would require a novel language to express such social protocols and an architecture allowing application designers to easily build compliant social apps.  
Privacy-Preserving Personal Information Environments  Nigel Shadbolt, Max Van Kleek, Jun Zhao  While the Internet of Things is likely to enable many significant new kinds of applications that make people’s lives easier, it also presents unprecedented threats to individual privacy, by introducing sensors into the most private locations and aspects of people’s lives. This project seeks to reverse the current trend of having IoT-based sensors (ranging from environmental to wearable sensors) siphon data back to third-party providers, by creating private Personal Information Environments for storing and managing sensor data under the individual’s control. The student will be involved in designing and implementing the foundational architecture for a decentralised, privacy-respecting Personal Information Environment.  
A simple object-oriented language  Michael Spivey  Use Keiko to implement a simple language that is purely object-oriented. Study the compromises that must be made to get reasonable performance, comparing your implementation with Smalltalk, Ruby or Scala. Please see http://spivey.oriel.ox.ac.uk/corner/Undergraduate_and_M.Sc._projects for further details. N.B. This project is not available to MSc students in 2016/17. Undergraduate students who wish to enquire about a project for 2017/18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical. 

Better JIT translation  Michael Spivey  The existing JIT for Keiko is very simple-minded, and does little more than translate each bytecode into the corresponding machine code. Either improve the translation by using one of the many JIT libraries now available, or adjust the Oberon compiler and the specification of the bytecode machine to free it of restrictive assumptions and produce a better pure-JIT implementation. Please see http://spivey.oriel.ox.ac.uk/corner/Undergraduate_and_M.Sc._projects for further details. N.B. This project is not available to MSc students in 2016/17. Undergraduate students who wish to enquire about a project for 2017/18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical. 

Better performance for GeomLab  Michael Spivey  At present, GeomLab programs show a performance that competes favourably with Python, making it possible to address tasks like computing images of the Mandelbrot set using a purely functional program that calls a function once for each pixel. But there is still a gap between the performance of GeomLab programs and similar ones written in Java or C, and more ambitious image-processing tasks would be made possible by better performance, particularly in the area of arithmetic. Explore ways of improving performance, perhaps including the possibility of allowing numbers to be passed around without wrapping them in heap-allocated objects, or the possibility of compiling the code for Haskell-style pattern matching in a better way. Please see http://spivey.oriel.ox.ac.uk/corner/Undergraduate_and_M.Sc._projects for further details. N.B. This project is not available to MSc students in 2016/17. Undergraduate students who wish to enquire about a project for 2017/18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical. 

Eliminating array bounds checks  Michael Spivey  The Oberon compiler inserts code into every array access and every pointer dereference to check for runtime errors, like a subscript that is out of bounds or a pointer that is null. In many cases, it is possible to eliminate the checks because it is possible to determine from the program that no error can occur. For example, an array access inside a FOR loop may be safe given the bounds of the loop, and several uses of the same pointer in successive statements may be able to share one check that the pointer is non-null. Modify the Oberon compiler (or a simpler one taken from the Compilers labs) so that it represents the checks explicitly in its IR, and introduce a pass that removes unnecessary checks, thereby speeding up the code without compromising safety. Please see http://spivey.oriel.ox.ac.uk/corner/Undergraduate_and_M.Sc._projects for further details. N.B. This project is not available to MSc students in 2016/17. Undergraduate students who wish to enquire about a project for 2017/18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical. 

GeomLab and Mindstorms  Michael Spivey  GeomLab has a turtle graphics feature, but the pictures are drawn only on the screen. It should be possible to make a turtle out of Lego Mindstorms, then control it with an instance of GeomLab running on a host computer, with communication over Bluetooth. Please see http://spivey.oriel.ox.ac.uk/corner/Undergraduate_and_M.Sc._projects for further details. N.B. This project is not available to MSc students in 2016/17. Undergraduate students who wish to enquire about a project for 2017/18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical. 

GeomLab on Android  Michael Spivey  Produce an implementation of GeomLab's GUI and graphics library that works on the Android platform. Either use an interpreter for GeomLab's intermediate code to execute GeomLab programs, or investigate dynamic translation of the intermediate code into code for Android's virtual machine Dalvik. Please see http://spivey.oriel.ox.ac.uk/corner/Undergraduate_and_M.Sc._projects for further details. N.B. This project is not available to MSc students in 2016/17. Undergraduate students who wish to enquire about a project for 2017/18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical. 

Heap-based activation records  Michael Spivey  At present, Keiko supports only conventional Pascal-like language implementations that store activation records on a stack. Experiment with an implementation where activation records are heap-allocated (and therefore recovered by a garbage collector), procedures are genuinely first-class citizens that can be returned as results in addition to being passed as arguments, and tail recursion is optimised seamlessly. Please see http://spivey.oriel.ox.ac.uk/corner/Undergraduate_and_M.Sc._projects for further details. N.B. This project is not available to MSc students in 2016/17. Undergraduate students who wish to enquire about a project for 2017/18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical. 

Keiko on Mindstorms  Michael Spivey  Alternative firmware for the Mindstorms robot controller provides an implementation of the JVM, allowing Java programs to run on the controller, subject to some restrictions. Using this firmware as a guide, produce an interpreter for a suitable bytecode, perhaps some variant of Keiko, allowing Oberon or another robot language of your own design to run on the controller. Aim to support the buttons and display at first, and perhaps add control of the motors and sensors later. Please see http://spivey.oriel.ox.ac.uk/corner/Undergraduate_and_M.Sc._projects for further details. N.B. This project is not available to MSc students in 2016/17. Undergraduate students who wish to enquire about a project for 2017/18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical. 

Programming Language Semantics  Michael Spivey  Mike Spivey is willing to supervise projects in the area of Programming Language Semantics. N.B. This project is not available to MSc students in 2016/17. 

Typechecking for GeomLab  Michael Spivey  The GeomLab language is untyped, leading to errors when expressions are evaluated that would be better caught at an earlier stage. Most GeomLab programs, however, follow relatively simple typing rules. The aim in this project is to write a polymorphic type checker for GeomLab and integrate it into the GeomLab system, which is implemented in Java. A simple implementation of the typechecker would wait until an expression is about to be evaluated, and typecheck the whole program at that point. As an extension of the project, you could investigate whether it is possible to typecheck function definitions one at a time, even when some of the functions they call have not yet been defined. Please see http://spivey.oriel.ox.ac.uk/corner/Undergraduate_and_M.Sc._projects for further details. N.B. This project is not available to MSc students in 2016/17. Undergraduate students who wish to enquire about a project for 2017/18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical. 

MSc and Undergraduate Projects  Bernard Sufrin  Bernard Sufrin's most recent collection of detailed project descriptions can always be found at http://www.cs.ox.ac.uk/people/bernard.sufrin/personal/projects.html. In addition to the projects described there, he is happy to discuss supervising projects in concurrent programming, user-interface design, programming language implementation, program transformation, and proof support. 

A Machine Learning Approach to Personalizing Education: Improving Individual Learning through Tracking and Course Recommendation  Mihaela van der Schaar, Edith Elkind  More students are enrolling in college and professional degree programs than ever before. However, current degree programs are often “one size fits all”; such programs ignore the heterogeneity of students in terms of backgrounds, abilities, learning styles and career goals. Moreover, because of ever-increasing student/teacher ratios, students are often left struggling to find their own pathways through degree programs. This combination leads to poor learning outcomes, low engagement, dissatisfaction and high dropout rates. In this project, an interactive electronic system will be built that is personalized for each student, is able to continuously track progress and goals, capitalize on the knowledge accumulated, and recommend suitable courses and activities in order to build skills, enhance interest and promote long-term goals. In effect, our personalized interactive system operates as if there were a dedicated mentor for each student. To build this system, the following modules will need to be developed: (1) student and course similarity discovery methods; (2) student performance prediction algorithms; (3) personalized course recommendation algorithms. To read more about the role of machine learning in education – see medianetlab.ee.ucla.edu/EduAdvance Prerequisites: This project is suitable for someone with at least basic knowledge of machine learning  
Detecting structure in voters' preferences  Mihaela van der Schaar, Edith Elkind  A large international community of researchers is trying to use computers to help groups of people (or groups of automated agents) make better joint decisions. This is a hard problem, since the preferences that agents report might contradict each other, which leads to so-called voting paradoxes. Also, it can be computationally hard to calculate what decisions to make. A promising way to tackle this problem is to exploit structure in the reported preferences. For this purpose, Australian researchers have collected an impressive amount of real-world preference data (PrefLib http://www.preflib.org/) comprising over 3,000 data sets, ranging from user preferences taken from rating systems at Netflix and TripAdvisor to real political election data, with the aim of figuring out how we might use properties of preferences that actually occur in the real world. This project is about analysing this data to reveal how much structure is contained in these preferences. Technically, we are interested in figuring out how close the reported preferences are to what are known as single-peaked and/or single-crossing preferences (see, e.g., https://en.wikipedia.org/wiki/Single_peaked_preferences). There are different measures of closeness, and for many of them the associated decision problem is NP-hard; for others, the computational complexity is not known. Prerequisites: There are several ways in which this project can be pursued. On the one hand, one can consider the notions of closeness for which the complexity of the associated problem is not known, and try to develop an efficient algorithm or prove NP-hardness. On the other hand, one can try to develop practical algorithms for detecting profiles that are almost single-peaked/single-crossing, by encoding the associated problem as an instance of SAT or an integer linear program and running a respective solver; such algorithms could then be applied to PrefLib data.  
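As a flavour of the combinatorics involved, here is a minimal sketch (in Python; all names are ours, not PrefLib's) of testing whether a ranking is single-peaked with respect to a *given* left-to-right axis. Note that the hard parts of the project are finding a suitable axis and measuring closeness to single-peakedness, not this simple check:

```python
def is_single_peaked(ranking, axis):
    """Check that one voter's ranking (best candidate first) is
    single-peaked with respect to the given axis ordering."""
    pos = {c: i for i, c in enumerate(axis)}       # position on the axis
    rank = {c: i for i, c in enumerate(ranking)}   # lower number = preferred
    peak = pos[ranking[0]]                         # axis position of the favourite
    # Walking away from the peak in either direction, candidates must get worse,
    # i.e. their rank numbers must strictly increase.
    left = [rank[c] for c in reversed(axis[:peak])]
    right = [rank[c] for c in axis[peak + 1:]]
    return all(a < b for a, b in zip(left, left[1:])) and \
           all(a < b for a, b in zip(right, right[1:]))

def profile_single_peaked(profile, axis):
    """A profile is single-peaked iff every voter's ranking is."""
    return all(is_single_peaked(r, axis) for r in profile)
```

For example, on the axis a < b < c < d, the ranking b > c > a > d is single-peaked (the peak is b, and preference falls off in both directions), whereas a > c > b > d is not, because b lies between a and c on the axis yet is ranked below both.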
Machine learning for finance  Mihaela van der Schaar, Edith Elkind  This project aims to use machine learning techniques such as ensemble learning, convolutional neural networks etc. to predict spot prices for a variety of industries. Machine learning is increasingly used in finance to make predictions as well as to aggregate among existing strategies for making investments over time. We will use various free as well as proprietary data sets to assess the value of our newly developed methods in terms of both profit and risk, and compare them with state-of-the-art techniques. This will also involve developing new “lucky factors” (features) that can be extracted from the data to inform and improve existing and new investment strategies. The expectation is that the work will lead to a conference publication. Prerequisites: This project is suitable for someone with at least basic knowledge of machine learning.  
Revolutionizing medicine through machine learning – Using advanced graphical models for developing personalized policies for HIV screening and treatment  Mihaela van der Schaar, Edith Elkind  The first part of this project aims to use advanced graphical models (including enhancements of Hidden Markov Models etc.) to discover personalized trajectories for HIV disease progression, using available electronic health record data. The second part of the project aims to learn how personalized treatment and screening plans can affect disease trajectories in the short run and in the long run, with the overall goal of identifying effective treatment plans for various types of patients. The dataset contains various types of patients and their responses to different medications over time. The project will also involve interacting with a renowned clinician specializing in HIV. In the short run, this work will lead to a publication in an important conference. In the longer run, this work – and more generally, the development of these methods – will change and advance the way medicine is practiced. To read more about the role of machine learning in medicine – see medianetlab.ee.ucla.edu/MedAdvance Prerequisites: This project is suitable for someone with at least basic knowledge of neural networks and/or machine learning.  
Revolutionizing medicine through machine learning – Using convolutional neural networks for asthma  Mihaela van der Schaar, Edith Elkind  This project aims to use convolutional neural networks to discover how to best treat patients with asthma, using available electronic health record data. The dataset contains information about various (types of) patients and their responses to different medications over time. The focus of the project is to train a convolutional neural network to identify how to best treat patients over time. The project will involve interacting with a renowned clinician specializing in asthma diagnosis and treatment. In the short run, this work will lead to a publication in an important conference. In the longer run, this work – and more generally, the development of these methods – will change and advance the way medicine is practiced. To read more about the role of machine learning in medicine – see medianetlab.ee.ucla.edu/MedAdvance Prerequisites: This project is suitable for someone with at least basic knowledge of neural networks and/or machine learning.  
Deciding satisfiability of formulas in the guarded negation fragment  Michael Vanden Boom  The guarded negation fragment of first-order logic is an expressive logic of interest in databases and knowledge representation. It has been shown to have a decidable satisfiability problem but, to the best of my knowledge, there is no tool actually implementing a decision procedure for it. The goal would be to design a tool to determine whether or not a formula in this logic is satisfiable. Most likely, this would require designing and implementing a tableau-based algorithm, in the spirit of related tools for description logics and the guarded fragment. Prerequisites: Logic and Proof (or equivalent). There are some connections to material in Knowledge Representation and Reasoning, but this is not essential background 

Interpolation  Michael Vanden Boom  Let F1 and F2 be sentences (in first-order logic, say) such that F1 entails F2: that is, any model of F1 is also a model of F2. An interpolant is a sentence G such that F1 entails G, and G entails F2, but G only uses relations and functions that appear in *both* F1 and F2. For example, if F1 is p ∧ q and F2 is p ∨ r, then G = p is an interpolant. The goal in this project is to explore and implement procedures for constructing interpolants, particularly for certain decidable fragments of first-order logic. It turns out that finding interpolants like this has applications in some database query rewriting problems. Prerequisites: Logic and Proof (or equivalent) 

MSO Automata  Michael Vanden Boom  The Automata, Logic, and Games course presents automata that run on infinite binary trees. However, in certain applications, it is natural to consider trees with unbounded or even infinite branching degree. The automata that run on trees like this are sometimes called MSO automata, since the transition function is described by a monadic second-order logic (MSO) formula, rather than by explicitly giving the state for each successor. Although these MSO automata have been considered in the literature, the complexity of various operations on these automata is not well studied. The goal in this project is to fill this gap. Prerequisites: Automata, Logic, and Games 

3D printing medical scan data  Irina Voiculescu  Computed tomography (CT) scanning is a ubiquitous scanning modality. It produces volumes of data representing internal parts of a human body. Scans are usually output in a standard imaging format (DICOM) and come as a series of axial slices (i.e. slices across the length of the person's body, in planes perpendicular to the imaginary straight line along the person's spine). The slices most frequently come at a resolution of 512 x 512 voxels, achieving an accuracy of about 0.5 to 1 mm of tissue per voxel, and can be viewed and analysed using a variety of tools. The distance between slices is a parameter of the scanning process and is typically much larger, about 5 mm. During the analysis of CT data volumes it is often useful to correct for the large spacing between slices. For example, when preparing a model for 3D printing, the axial voxels would appear elongated. These could be corrected through an interpolation process along the spinal axis. This project is about the interpolation process, either in the raw data output by the scanner, or in the post-processed data which is being prepared for further analysis or 3D printing. The output models would ideally be files in a format compatible with 3D printing, such as STL. The main aesthetic feature of the output would be measurable as a smoothness factor, parameterisable by the user. Existing DICOM image analysis software designed within the Spatial Reasoning Group at Oxford is available to use as part of the project. 
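As an illustration of the kind of correction involved, here is a minimal sketch (plain Python; the function name is ours, and real code would operate on DICOM voxel arrays) of linearly interpolating intermediate slices between two consecutive axial slices:

```python
def interpolate_slices(slice_a, slice_b, n_new):
    """Linearly interpolate n_new intermediate slices between two
    consecutive axial slices, each given as an equal-sized 2D list
    of voxel intensities. Returns the list of new slices in order."""
    rows, cols = len(slice_a), len(slice_a[0])
    out = []
    for k in range(1, n_new + 1):
        t = k / (n_new + 1)   # fractional position between the two slices
        out.append([[(1 - t) * slice_a[r][c] + t * slice_b[r][c]
                     for c in range(cols)] for r in range(rows)])
    return out
```

With 5 mm slice spacing and 1 mm in-plane voxels, inserting four interpolated slices between each pair would give roughly cubic voxels; a smoother result (the "smoothness factor" mentioned above) would call for higher-order interpolation along the axis.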

Exact Algorithms for Complex Root Isolation  Irina Voiculescu  Not available in 2013/14 Isolating the complex roots of a polynomial can be achieved using subdivision algorithms. Traditional Newton methods can be applied in conjunction with interval arithmetic. Previous work (jointly with Prof Chee Yap and MSc student Narayan Kamath) has compared the performance of three operators: Moore's, Krawczyk's and Hansen-Sengupta's. This work makes extensive use of the CORE library, which is a collection of C++ classes for exact computation with algebraic real numbers and arbitrary precision arithmetic. CORE defines multiple levels of operation over which a program can be compiled and executed. Each of these levels provides stronger guarantees on exactness, traded against efficiency. Further extensions of this work can include (but are not limited to): (1) Extending the range of applicability of the algorithm at CORE's Level 1; (2) Making an automatic transition from CORE's Level 1 to the more detailed Level 2 when extra precision becomes necessary; (3) Designing efficiency optimisations to the current approach (such as confirming a single root or analysing areas potentially not containing a root with a view to discarding them earlier in the process); (4) Tackling the isolation problem using a continued fraction approach. The code has been included and is available within the CORE repository. Future work can continue to be carried out in consultation with Prof Yap at NYU. 
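For a flavour of the interval techniques involved, here is a minimal sketch of a single Moore-style interval Newton step for a real root, written in ordinary floating-point Python rather than CORE's exact arithmetic (so it illustrates the idea, not the exactness guarantees). If the returned interval lies inside the input interval and the derivative bounds exclude zero, the input interval contains a unique root:

```python
def interval_newton_step(f, df_lo, df_hi, lo, hi):
    """One Moore interval-Newton step N(X) = m - f(m)/F'(X) on X = [lo, hi],
    where [df_lo, df_hi] bounds f' over X and must not contain zero.
    Returns the endpoints of N(X)."""
    m = (lo + hi) / 2          # midpoint of the interval
    fm = f(m)
    # Quotient interval fm / [df_lo, df_hi]: take both endpoint quotients.
    q_lo, q_hi = sorted([fm / df_lo, fm / df_hi])
    return m - q_hi, m - q_lo
```

For f(x) = x^2 - 2 on [1, 2], with f' bounded by [2, 4], one step shrinks the interval to [1.375, 1.4375], which still contains sqrt(2); since the new interval is contained in the old one, the root in [1, 2] is unique.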

Gesture recognition using Leap Motion  Irina Voiculescu  Scientists in the Experimental Psychology Department study patients with a variety of motor difficulties, including apraxia, a condition usually following stroke in which a patient lacks control over their hands or fingers. Diagnosis and rehabilitation are traditionally carried out by Occupational Therapists. In recent years, computer-based tests have been developed in order to remove the human subjectivity from the diagnosis, and in order to enable the patient to carry out a rehabilitation programme at home. One such test involves users being asked to carry out static gestures above a Leap Motion sensor, and these gestures being scored according to a variety of criteria. A prototype has been constructed to gather data, and some data has been gathered from a few controls and patients. In order to deploy this as a clinical tool in the NHS, there is a need for a systematic data collection and analysis tool, based on machine learning algorithms, to help classify the data into different categories. Algorithms are also needed in order to classify data from stroke patients, and to assess the degree of severity of their apraxia. Also, the graphical user interface needs to be extended to give particular kinds of feedback to the patient in the form of home exercises, as part of a rehabilitation programme. This project was originally set up in collaboration with Prof Glyn Humphreys, Watts Professor of Experimental Psychology. Due to Glyn's untimely death, a new co-supervisor needs to be found in the Experimental Psychology Department. It is unrealistic to assume this project can run in the summer of 2016. 

Identifying features in MRI scan data  Irina Voiculescu  In recent years, medical diagnosis using a variety of scanning modalities has become quasi-universal and has brought about the need for computer analysis of digital scans. Members of the Spatial Reasoning research group have developed image processing software for CT (tomography) scan data. The program partitions (segments) images into regions with similar properties. These images are then analysed further so that particular features (such as bones, organs or blood vessels) can be segmented out. The team's research continues to deal with each of these two separate meanings of medical image segmentation. The existing software is written in C++ and features carefully crafted and well-documented data structures and algorithms for image manipulation. In certain areas of surgery (e.g. orthopaedic surgery involving the hip and knee joint) the magnetic resonance scanning modality (MRI) is preferred, both because of its safety (no radiation involved) and because of its increased visualisation potential. This project is about converting MRI scan data into a format that can become compatible with existing segmentation algorithms. The data input would need to be integrated into the group's analysis software in order then to carry out 3D reconstructions and other measurements. This project is co-supervised by Professor David Murray MA, MD, FRCS (Orth), Consultant Orthopaedic Surgeon at the Nuffield Orthopaedic Centre and the Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), and by Mr Hemant Pandit MBBS, MS (Orth), DNB (Orth), FRCS (Orth), DPhil (Oxon), Orthopaedic Surgeon / Honorary Senior Clinical Lecturer, Oxford Orthopaedic Engineering Centre (OOEC), NDORMS.  
Simple drawing analysis  Irina Voiculescu  Scientists in the Experimental Psychology Department study patients with a variety of motor difficulties, including apraxia, a condition usually following stroke in which a patient lacks control over their hands or fingers. Diagnosis and rehabilitation are traditionally carried out by Occupational Therapists. In recent years, computer-based tests have been developed in order to remove the human subjectivity from the diagnosis, and in order to enable the patient to carry out a rehabilitation programme at home. One such test involves users drawing simple figures on a tablet, and these figures being scored according to a variety of criteria. Data has already been gathered from 200 or so controls, and is being analysed for a range of parameters in order to assess what a neurotypical person could achieve when drawing such simple figures. Further machine learning analysis could help classify such data into different categories. Algorithms are also needed in order to classify data from stroke patients, and to assess the degree of severity of their apraxia. This project was originally co-supervised by Prof Glyn Humphreys, Watts Professor of Experimental Psychology. Due to Glyn's untimely death, a new co-supervisor needs to be found in the Experimental Psychology Department. It is unrealistic to assume this project can run in the summer of 2016. 

Surgery teaching and training tool  Irina Voiculescu  Knee replacement surgery involves a precise series of steps that a surgeon needs to follow. Trainee surgeons have traditionally mastered these steps by learning from textbooks or experienced colleagues. Surgeons at the Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS) in Oxford have been working on a standardised method to help trainees internalise the sequence of events in an operation. It is proposed to construct a computer-based tool which would help with this goal. Apart from the choice of tools and materials, the tool would also feature a virtual model of the knee. The graphical user interface will present a 3D model of a generic knee to be operated on, and would give the user the ability to make the cuts necessary to the knee replacement procedure. There would be predefined parameters regarding the type and depth of each cut, and an evaluation tool showing how the virtual cuts compare against the parameters. The project goals are quite extensive and so this would be suitable for an experienced programmer. This project is co-supervised by Professor David Murray MA, MD, FRCS (Orth), Consultant Orthopaedic Surgeon at the Nuffield Orthopaedic Centre and the Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), and by Mr Hemant Pandit MBBS, MS (Orth), DNB (Orth), FRCS (Orth), DPhil (Oxon), Orthopaedic Surgeon / Honorary Senior Clinical Lecturer, Oxford Orthopaedic Engineering Centre (OOEC), NDORMS. 

Textile fabric detail simulation  Irina Voiculescu  Conventional computer-aided design (CAD) software uses methods such as extrusion and revolution tools that the user can apply to create the 3D shape of a part. These tools are based on traditional manufacturing methods and work very well for most CAD applications. One application for which these tools do not work well is creating 3-dimensional representations of textiles. The exact path of each fibre within the textile depends upon the other fibres and upon the flexibility of the fibres. The purpose of this project is to create a simple software tool/algorithm into which a user can input a weave pattern (flat) or a braid pattern (cylindrical), together with the flexibility of the fibres, and which will create a 3-dimensional representation of the structure.  
An investigation of the solution of least squares problems using the QR factorisation  Jonathan Whiteley  Experimental data inevitably contains error. These experimental observations are often compared to theoretical predictions by posing a least-squares problem, i.e. minimising the sum of squared differences between the experimental data and the theoretical predictions. These least-squares problems are often solved using a QR factorisation of a known matrix, which uses the Gram-Schmidt method to write the columns of this matrix as a linear combination of orthonormal vectors. This method, when used in practice, can exhibit numerical instabilities, where the (inevitable) numerical errors due to fixed-precision calculations on a computer are magnified and may swamp the calculation. Instead, a modified Gram-Schmidt method is used for the QR factorisation. This modified Gram-Schmidt factorisation avoids numerical instabilities, but is less computationally efficient. The first aim of this project is to investigate the relative computational efficiencies of the two methods for QR factorisation. The second aim is to use the QR factorisation to identify: (i) what parameters can be recovered from experimental data; and (ii) whether the data can automatically be classified as "good" or "bad".  
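The classical and modified Gram-Schmidt procedures differ only in whether each projection is computed from the original column or from the partially orthogonalised vector. A minimal sketch of the two variants (plain Python, for illustration only; a real investigation would use carefully optimised library implementations):

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def subtract_projection(v, a, q):
    """Return v - (a . q) q, removing the component of v along unit vector q,
    where the coefficient is computed from vector a."""
    c = dot(a, q)
    return [x - c * y for x, y in zip(v, q)]

def normalise(u):
    n = dot(u, u) ** 0.5
    return [x / n for x in u]

def classical_gs(cols):
    """Classical Gram-Schmidt: coefficients use the ORIGINAL column."""
    q = []
    for a in cols:
        v = list(a)
        for qi in q:
            v = subtract_projection(v, a, qi)   # coefficient from original a
        q.append(normalise(v))
    return q

def modified_gs(cols):
    """Modified Gram-Schmidt: coefficients use the PARTIALLY
    orthogonalised vector, which is numerically more stable."""
    q = []
    for a in cols:
        v = list(a)
        for qi in q:
            v = subtract_projection(v, v, qi)   # coefficient from current v
        q.append(normalise(v))
    return q
```

In exact arithmetic the two produce identical orthonormal columns; the difference only shows up in floating point, on nearly linearly dependent columns, which is exactly the instability the project would investigate.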
The efficiency of numerical algorithms  Jonathan Whiteley  Many numerical algorithms have error bounds that depend on some user-provided input. For example, the error in a numerical method for solving a differential equation is bounded in terms of the step size h, and so the user may change the step size h until a desired accuracy is attained. Although useful, these error bounds do not take account of computational efficiency. For example, a numerical method for solving a differential equation may have a very impressive bound with respect to the step size h, but may require significantly more computational effort than other methods with less impressive error bounds. The field of scientific computing is a rich source of algorithms such as these, for example the numerical solution of differential equations, the numerical solution of linear systems, and interpolation of functions. The aim of this project is to work with a variety of algorithms for solving a given problem, and to assess the computational efficiency of these algorithms.  
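As a small illustration of the accuracy/cost trade-off described above, the sketch below (Python; the model problem and names are ours) applies forward Euler to y' = y on [0, 1] and shows the O(h) behaviour: halving the step size roughly halves the error while doubling the number of function evaluations:

```python
import math

def euler(f, y0, t_end, n_steps):
    """Forward Euler for y' = f(t, y) on [0, t_end]; cost is n_steps
    evaluations of f, and the global error is O(h) with h = t_end/n_steps."""
    h = t_end / n_steps
    t, y = 0.0, y0
    for _ in range(n_steps):
        y += h * f(t, y)
        t += h
    return y

f = lambda t, y: y                                   # exact solution y(1) = e
err_coarse = abs(euler(f, 1.0, 1.0, 10) - math.e)    # h = 0.1
err_fine = abs(euler(f, 1.0, 1.0, 20) - math.e)      # h = 0.05, double the work
```

A higher-order method (e.g. a Runge-Kutta scheme) would shrink the error much faster per halving of h, but at several function evaluations per step, which is precisely the kind of trade-off the project would quantify.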
The efficient storage of sparse matrices  Jonathan Whiteley  Many matrix applications involve sparse matrices, i.e. matrices that have a very large number of rows and columns but only a small number of nonzero entries in each row. On a given computer we may only store a finite number of matrix entries. When working with sparse matrices we usually store only the nonzero entries and their locations. There are several established techniques for storing sparse matrices. These methods all have individual strengths and weaknesses: some allow efficient multiplication of sparse matrices by vectors, others allow entries to be modified efficiently, while others allow the sparsity pattern to be modified dynamically. The aim of this project is to investigate these sparse storage methods, and to highlight the strengths and weaknesses of each approach.  
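One of the established schemes alluded to above is compressed sparse row (CSR) storage. A minimal sketch (Python, illustrative only) shows why matrix-vector multiplication is cheap in this format while changing the sparsity pattern is not: the nonzeros of each row sit contiguously in flat arrays, so inserting a new nonzero would shift everything after it:

```python
class CSRMatrix:
    """Compressed Sparse Row storage: three flat arrays instead of a dense
    grid. Row i's nonzeros are values[row_ptr[i]:row_ptr[i+1]], with their
    column indices in col_idx over the same range."""

    def __init__(self, dense):
        self.n_rows = len(dense)
        self.values, self.col_idx, self.row_ptr = [], [], [0]
        for row in dense:
            for j, x in enumerate(row):
                if x != 0:
                    self.values.append(x)
                    self.col_idx.append(j)
            self.row_ptr.append(len(self.values))

    def matvec(self, v):
        """Matrix-vector product touching only the stored nonzeros."""
        out = []
        for i in range(self.n_rows):
            lo, hi = self.row_ptr[i], self.row_ptr[i + 1]
            out.append(sum(self.values[k] * v[self.col_idx[k]]
                           for k in range(lo, hi)))
        return out
```

By contrast, a coordinate (COO) list of (row, col, value) triples makes dynamic insertion easy but row access slow, and a dictionary-of-keys layout sits in between; comparing such trade-offs is the substance of the project.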
Complexity of Language Equivalence for t-Turn Pushdown Automata  James Worrell  Language equivalence for t-turn pushdown automata was shown to be in coNP by Sénizergues in 2003. The aim of the project is to survey this result and perhaps improve the complexity bound using techniques from the following paper: James Worrell: Revisiting the Equivalence Problem for Finite Multitape Automata. ICALP (2) 2013: 422-433  
Topics in Linear Dynamical Systems  James Worrell  A linear dynamical system is a discrete- or continuous-time system whose dynamics is given by a linear function of the current state. Examples include Markov chains, linear recurrence sequences (such as the Fibonacci sequence), and linear differential equations. This project involves investigating the decidability and complexity of various reachability problems for linear dynamical systems. It would suit a mathematically oriented student. Linear algebra is essential, and number theory or complexity theory is desirable. A relevant paper is Ventsislav Chonev, Joël Ouaknine, James Worrell: The orbit problem in higher dimensions. STOC 2013: 941-950.  
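As a concrete example of the setting, the Fibonacci sequence arises as the orbit of a 2x2 linear map, and a reachability question asks whether the orbit ever hits a target value. The sketch below (Python; names are ours) only checks a bounded prefix of the orbit, whereas the project concerns the decidability and complexity of the unbounded question:

```python
def step(state, matrix):
    """One step of a discrete-time linear dynamical system x -> M x."""
    return [sum(m * x for m, x in zip(row, state)) for row in matrix]

# Fibonacci as a linear dynamical system: (F(n+1), F(n)) = M (F(n), F(n-1)).
M = [[1, 1],
     [1, 0]]

def orbit_hits(target, start, matrix, bound):
    """Naive bounded reachability: does the first coordinate of the orbit
    equal `target` within `bound` steps? Deciding this without a bound is
    (a variant of) the orbit problem studied in the project."""
    state = start
    for _ in range(bound):
        if state[0] == target:
            return True
        state = step(state, matrix)
    return False
```

Starting from (F(1), F(0)) = (1, 0), the first coordinates of the orbit are 1, 1, 2, 3, 5, 8, ..., so 21 is reached but 4 never is; the difficulty in general is that no computable bound on the search is obvious.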
Do you have full control of your ‘smart’ home?  Jun Zhao, Jason Nurse  Smart home devices are gaining huge popularity as they redefine how we entertain ourselves, keep a healthy lifestyle or educate our children. They promise to improve our quality of life, energy efficiency and home security. However, at the same time, we understand very little about these fast-emerging technologies and pay even less attention to how they achieve these efficiencies without compromising our personal privacy and security. 

Topics in Algorithms, Complexity, and Combinatorial Optimisation  Standa Živný  Prof Živný is willing to supervise projects in the area of algorithms, complexity, and combinatorial optimisation, in particular on problems related to convex relaxations (linear and semidefinite programming relaxations), submodular functions, and the algorithms for and complexity of homomorphism problems and Constraint Satisfaction Problems. 