Research Associate on Interpretable and Explainable Deep Learning for Natural Language Understanding and Commonsense Reasoning

Posted: 16th December 2019

Department of Computer Science, Parks Road, Oxford.

Research Associate on “Interpretable and Explainable Deep Learning for Natural Language Understanding and Commonsense Reasoning”

Fixed-term for up to 9 months

Grade 7: £32,817 - £40,322 p.a. (note: post may be under-filled at grade 6)

The Artificial Intelligence and Machine Learning group at the Department of Computer Science has a vacancy for a Research Associate on “Interpretable and Explainable Deep Learning for Natural Language Understanding and Commonsense Reasoning”, funded by the Alan Turing Institute. 

Reporting to Professor Thomas Lukasiewicz, you will be responsible for carrying out research towards new approaches to interpretable and explainable deep learning for natural language understanding and commonsense reasoning. You will explore, generalise, and integrate deep learning approaches to structured data extraction and to large-scale logic-based reasoning, working towards an interpretable and explainable deep-learning approach to human-like understanding and commonsense reasoning in natural language processing, and investigate its applications in other disciplines, such as healthcare, engineering, law, and finance. You will also collaborate with Professor Lukasiewicz and members of his research group, providing guidance to junior members of the group, including PhD students, MSc students, and/or project volunteers.

The primary selection criteria are a PhD/DPhil (or close to completion) in Computer Science, Mathematics, Statistics, Engineering, Computational Linguistics, or a related discipline, together with relevant experience: in particular, a good theoretical and programming background in machine learning and in knowledge representation and reasoning (desirably in deep learning and neural networks, deep-learning-based representations, knowledge bases and graphs, ontology languages, natural language processing, and explainable and interpretable artificial intelligence); good software engineering skills (especially in system implementations and experimental evaluations); and, ideally, experience in healthcare, engineering, law, or finance applications.

The closing date for applications is 12 noon on Friday 31 January 2020. For further details and to apply, please visit https://my.corehr.com/pls/uoxrecruit/erq_jobspec_details_form.jobspec?p_id=144561

Our staff and students come from all over the world, and we proudly promote a friendly and inclusive culture. Diversity is positively encouraged through diversity groups and champions (for example http://www.cs.ox.ac.uk/aboutus/women-cs-oxford/index.html), as well as a number of family-friendly policies, such as the right to apply for flexible working and support for staff returning from periods of extended absence, for example maternity leave.