I am interested in the ways in which users interact with technologies in different kinds of settings and how social action both shapes and is shaped by innovation. The projects I work on typically seek to identify mechanisms for the improved design, responsible development and effective regulation of technologies. I am a qualitative researcher and am very interested in the ways in which detailed, granular analysis can be combined with larger scale computational work.
Since November 2014 I have been working on projects about algorithmic bias and fairness. From September 2016 to November 2018 the EPSRC funded project "UnBias: Emancipating Users Against Algorithmic Biases for a Trusted Digital Economy" explored the user experience of algorithm driven internet services and the process of algorithm design. The project was a collaboration with the Universities of Nottingham and Edinburgh and focused on key questions including:
- Are algorithms ever neutral?
- How might algorithmic systems produce unexpected outcomes that systematically disadvantage individuals, groups or communities?
- How can we make sure that algorithmic processes operate in our best interests?
The project involved a range of empirical and engagement activities to address these questions. We identified the existence of concerns across different societal and professional groups over the contemporary prevalence of algorithm driven online platforms. At the same time, we also identified amongst these groups a desire for change to improve the user experience of platforms. Our analysis highlighted several opportunities for positive change – in particular in relation to education, societal engagement and policy. As part of this work we produced a 'fairness toolkit' to raise awareness of and stimulate a civic dialogue about how algorithms shape our online experiences. We have also contributed to a recent European Parliament Science and Technology Options Assessment panel report on algorithmic transparency and regulation.
In December 2018 our project team began a new study. "ReEnTrust" is also funded by the EPSRC and builds on the UnBias project findings. ReEnTrust will identify mechanisms to help foster trust between users, systems and algorithms.
I teach in the Department on the Requirements course and the Cyber Security CDT elective courses on research methods. I also regularly supervise individual students and the 2nd year UG group projects. I continue to contribute to initiatives on research ethics and the ethics of technology. I am very committed to widening participation at Oxford and regularly take part in outreach activities organised by the Department. I am also a member of the Departmental Equality and Diversity Committee.
I have presented my work at a range of conferences and in written journal publications. I am also a regular contributor to Inspired Research magazine.
I am a Senior Researcher in the Department of Computer Science and I work in the Human Centred Computing theme. My first degree was in Social and Political Sciences (University of Cambridge) and my PhD (University of Nottingham) explored doctor-patient communication in specialist obesity clinics. From 2009 to 2014 I worked as a researcher in the Work, Interaction and Technology Research Centre (WIT) at King’s College London. WIT specialises in exploring the relationships between interaction, technology and organisation in workplace settings. Whilst there I was involved in a number of projects covering topics including eye care, obesity management, electronic health records and robotic surgery technologies. I was also part of a collaboration with the College of Optometrists to produce and deliver a communication skills training package for eye care practitioners.
I joined the Department of Computer Science in November 2014 to work on the ESRC funded project 'Digital Wildfire: (Mis)information flows, propagation and responsible governance'. The project investigated interactions on social media platforms such as Twitter and particularly focused on potentially harmful communication behaviours such as the spread of misinformation, hate speech and antagonistic content. We explored opportunities for the responsible governance of social media and as part of that work produced educational materials for schools and contributed to the government's inquiry on Children and the Internet. You can find out more about the project here and follow project updates on Twitter @EthicsWildfire.
I have also worked on a number of small projects within the Department. In the summer of 2018 I was a co-investigator on the 'LabHackathon Zimbabwe' project. This was a pilot project to develop a new method to address resource scarcity in African science laboratories. We ran a hackathon style event in which students competed to design and build frugal and reproducible pieces of laboratory equipment such as centrifuges, PCR machines and magnetic stirrers.
I am highly interested in research ethics and the ethics of development and innovation. I have served on the Departmental Research Ethics Committee and have worked with colleagues to run one day workshops on research ethics here at the University of Oxford and at the Alan Turing Institute. I have also worked with Marina Jirotka and other members of the Human Centred Computing theme to develop the ethical hackathon model.
Can video-based qualitative analysis help us understand user-algorithm interaction?
Webb H.M. & Patel M.
In Proceedings of the 32nd International BCS Human Computer Interaction Conference (HCI 2018). 2018.
GP delivered brief weight loss interventions: a cohort study of patient responses and subsequent actions, using conversation analysis in UK primary care.
Albury C., Stokoe E., Ziebland S., Webb H. and Aveyard P.
In British Journal of General Practice 68 (674): e646–e653. 2018.
Work in Progress: Multi-Stakeholder Dialogue for Policy Recommendations on Algorithmic Fairness
Webb H., Koene A., Patel M. and Vallejos E.P.