I am interested in the ways in which users interact with technologies in different kinds of settings and how social action both shapes and is shaped by innovation. The projects I work on typically seek to identify mechanisms for the improved design, responsible development and effective regulation of technologies. I am a social scientist by training and I specialise in the application of qualitative research methods. I am also very interested in the ways in which detailed, granular analysis can be combined with larger-scale computational work. I was recently added to the list of Brilliant Women working in AI Ethics.
The current project I am working on is RoboTIPS: Developing Responsible Robots for the Digital Economy. This is a 5-year EPSRC Established Career Fellowship held by Marina Jirotka and runs from March 2019 to February 2024. The project focuses on the domain of social robots: those which interact with people and make decisions about what to do of their own accord. Because they make their own decisions in order to perform actions, we need to be able to recover what they did and why they did it when things seem to go wrong. We will develop an Ethical Black Box (EBB) through which a social robot will be able to explain its behaviour in simple and understandable ways. We will test this out in incident investigations, treated as a social process, in a variety of contexts. The development of the EBB is an example of Responsible Innovation (RI). We are developing an agile process which will take account of the views of a wide range of people in a fast-changing context, in order to have some influence over the trajectory of an innovation. We draw on this, and on an understanding of people's lived rights and obligations, to provide creative resources and methods for designers to develop responsible and accountable new technologies.
I teach in the Department on the Requirements course and on Cyber Security CDT elective courses on research methods. I helped to develop and lead the Department's new undergraduate course on Ethics and Responsible Innovation. I co-supervise individual students and co-sponsor 2nd year UG group projects. I contribute to various initiatives on research ethics and the ethics of technology within the University and externally. I am very committed to widening participation at Oxford and regularly take part in outreach activities organised by the Department. I am also a member of the Departmental Equality and Diversity Committee.
I have presented my work at a range of conferences and in journal publications. I am also on the editorial board of Inspired Research magazine.
I am a Senior Researcher in the Department of Computer Science and I work in the Human Centred Computing theme. My first degree was in Social and Political Sciences (University of Cambridge) and my PhD (University of Nottingham) explored doctor-patient communication in specialist obesity clinics. From 2009 to 2014 I worked as a researcher in the Work, Interaction and Technology Research Centre (WIT) at King’s College London. WIT specialises in exploring the relationships between interaction, technology and organisation in workplace settings. Whilst there I was involved in a number of projects covering topics including eye care, obesity management, electronic health records and robotic surgery technologies. I was also part of a collaboration with the College of Optometrists to produce and deliver a communication skills training package for eye care practitioners.
I joined the Department of Computer Science in November 2014 to work on the ESRC-funded project 'Digital Wildfire: (Mis)information flows, propagation and responsible governance'. The project investigated interactions on social media platforms such as Twitter and particularly focused on potentially harmful communication behaviours such as the spread of misinformation, hate speech and antagonistic content. We explored opportunities for the responsible governance of social media and, as part of that work, produced educational materials for schools and contributed to the government's inquiry on Children and the Internet.
From November 2014 to early 2019 I worked on projects about algorithmic bias and fairness. From September 2016 to November 2018 the EPSRC-funded project “UnBias: Emancipating Users Against Algorithmic Biases for a Trusted Digital Economy” explored the user experience of algorithm-driven internet services and the process of algorithm design. The project was a collaboration with the Universities of Nottingham and Edinburgh and focused on key questions including:
- Are algorithms ever neutral?
- How might algorithmic systems produce unexpected outcomes that systematically disadvantage individuals, groups or communities?
- How can we make sure that algorithmic processes operate in our best interests?
The project involved a range of empirical and engagement activities to address these questions. We identified concerns across different societal and professional groups about the contemporary prevalence of algorithm-driven online platforms. At the same time, we also identified amongst these groups a desire for change to improve the user experience of platforms. Our analysis highlighted several opportunities for positive change, in particular in relation to education, societal engagement and policy. As part of this work we produced a 'fairness toolkit' to raise awareness of, and stimulate a civic dialogue about, how algorithms shape our online experiences. We have also contributed to a recent European Parliament Science and Technology Options Assessment panel report on algorithmic accountability and transparency.
In December 2018 the project team began a new study. "ReEnTrust" is also funded by the EPSRC and builds on the UnBias project findings. ReEnTrust will identify mechanisms to help foster trust between users, systems and algorithms. The project continues until late 2020 - I was originally a Researcher Co-Investigator on the project and am now a member of its Advisory Board.
I have also worked on a number of small projects within the Department. In the summer of 2018 I was a co-investigator on the 'LabHackathon Zimbabwe' project. This was a pilot project to develop a new method to address resource scarcity in African science laboratories. We ran a hackathon-style event in which students competed to design and build frugal and reproducible pieces of laboratory equipment such as centrifuges, PCR machines and magnetic stirrers.
I am highly interested in research ethics and the ethics of development and innovation. I have served on the Departmental Research Ethics Committee and have worked with colleagues to run one-day workshops on research ethics here at the University of Oxford and at the Alan Turing Institute. I have also worked with Marina Jirotka and other members of the Human Centred Computing theme to develop the ethical hackathon model.
Why we trust dynamic consent to deliver on privacy.
Schuler Scott A., Goldsmith M., Teare H., Webb H. and Creese S.
In Trust Management XIII. IFIP Advances in Information and Communication Technology, vol 563. 2019.
Communication practices for delivering health behaviour change conversations in primary care: a systematic review and thematic synthesis.
Albury C., Hall A., Syed A., Ziebland S., Stokoe E., Roberts N., Webb H. and Aveyard P.
In BMC Family Practice. 2019.
Human Centred Computing approaches to embed Responsible Innovation in HCI.
Webb H., Jirotka M., Inglesant P. and Patel M.