
Human-Centred Computing

The Human-Centred Computing (HCC) group is an interdisciplinary research centre led by Professor Marina Jirotka and consists of research staff and postgraduate students. We carry out a range of projects that seek to:

  • advance empirical and conceptual understanding of the ways in which technology shapes - and is shaped by - communication, collaboration and knowledge exchange within scientific, work and home settings;
  • develop new approaches to requirements elicitation and analysis;
  • understand how people adapt to the rapid social changes brought by innovation and to assist them to use those innovations more productively and safely;
  • consider the ways in which new technologies can be designed, developed and evaluated to be more responsive to societal acceptability and desirability.

Our projects draw on a range of research methods and approaches, including digital ethnography, fieldwork observations and interviews, surveys, computational analysis, and interaction analysis.

In recent projects, we have been exploring the challenges of provocative content on social media (Digital Wildfire), the importance of establishing rights for participants in ‘sharing economy’ platforms (Smart Society), the risk of algorithmic bias online (UnBias), and responsible innovation in quantum computing (NQIT). We have strong working relationships with other research centres across the University, around the UK and worldwide. We work regularly with external collaborators and engage with stakeholders from various fields including policy, law enforcement, education, commerce and civil society. Our projects regularly involve engagement and participation activities with stakeholders. These activities aid the user-centred and collaborative design of new technologies and support the development of responsible innovations.

Our research areas include:


Requirements Elicitation and Analysis

Requirements Engineering (RE) is a process that identifies the properties a software-intensive system should have in order to operate successfully in the context for which it is designed. In our projects we conduct requirements elicitation and analysis for collaborative systems. Our approach particularly acknowledges the importance of understanding human activity, human-computer interaction and stakeholder needs for an effective requirements process. We also focus on developing methods for evaluating systems with users and different stakeholders. Members of the group lead and teach on the Department's Requirements course for students.

Responsible Research and Innovation

How can we integrate ethical and societal considerations into the processes and products of research and innovation?  

This is a core consideration of the field of Responsible Research and Innovation (RRI). The key aim of the field is to explore and develop the means by which societal and ethical concerns can be seriously embedded in research and innovation. This allows such considerations to be proactive and preventive, rather than reactive to serious ethical issues that may arise once technologies are introduced “in the wild”. Importantly, RRI may also be seen as shaping a creative space in which researchers and innovators can generate insights informed by, and aligned with, societal and ethical concerns, bringing mutual benefit, for example by increasing the likelihood that their research and its outcomes will be acceptable.

As a group, we have played a key role in the development of the field and continue to take a keen interest in it. We have led and undertaken seminal projects such as the EPSRC-funded FRRIICT Project, and the EU FP7-funded GREAT Project (Governance for Responsible Innovation) and RESPONSIBILITY Project (Global Model and Observatory for International Responsible Research and Innovation Coordination). We are currently leading the RRI component of the EU FP7-funded Smart Society Project, as well as RRI work within the large-scale EPSRC-funded NQIT (Networked Quantum Information Technologies) Hub. This is the largest of four hubs in the £270 million investment made by the government through the UK National Quantum Technology Programme.

FRRIICT has produced a video animation that describes the value of responsible research and innovation in ICT.

Read Professor Marina Jirotka talking about the importance of responsibility in research here.

Computing and the social  

We are interested in the inter-relationships between computing and the social world. How is computing conducted for social purposes? How do people’s computing practices shape the world around them and how does the world we live in shape computing? We can investigate these issues in a number of ways – for example by observing user interactions and collaborative behaviours or exploring people’s understandings of the roles that computers play in their lives. 

The recently started UnBias project examines the ways that algorithms shape user behaviours online and the implications this has for users' understandings of what is ‘fair’. We live in an age of ubiquitous online data collection, analysis and processing. News feeds, search engine results and product recommendations increasingly use personalisation algorithms to determine the information we see when browsing online. Whilst this can help us to cut through the mountains of available information and find those bits that are most relevant to us, how can we be sure that these algorithms are operating in our best interests? Are algorithms ever ‘neutral’, and how can we judge the trustworthiness and fairness of systems that rely heavily on algorithms? UnBias investigates the user experience of algorithm-driven internet services and the processes of algorithm design. The project is funded by the EPSRC as part of its TIPS theme - Trust, Identity, Privacy and Security in the Digital Economy.

Security and safety

The rapid pace and broad scope of contemporary innovation can affect the security of individuals, groups and entire communities. In our work we take a broad understanding of security to encompass issues such as trust, well-being, transparency and privacy. Our projects seek to explore these issues in both conceptual and practical terms. 

The 'Digital Wildfire' project is part of RCUK's Global Uncertainties programme. It follows a report produced by the World Economic Forum which describes social media as a global risk factor due to the capacity for harmful content (in the form of rumour, malicious campaigns, inflammatory speech etc.) to spread rapidly and have very negative consequences. We use a variety of methods to explore opportunities for the responsible governance of social media. How can we manage or limit the spread of harmful content on social media? What can we do to protect the most vulnerable users of social media - such as children and young people?

We undertake various activities to translate our research into practical impacts that can enhance security and safety. Marina Jirotka has been appointed a specialist advisor to the House of Lords inquiry into Children and the Internet. The Digital Wildfire project has produced e-safety teaching and learning materials for schools as well as a video animation that encourages young people to take care on social media. 


Previous DPhil students:

Grace Eden (The Contextual Evaluation Framework: A prototype evaluation technique for e-Research) is currently a Postdoctoral Research Assistant at the Dept. of Computer Science, University of Oxford.

Peter Darch (When scientists meet the public: An Investigation into Volunteer Computing) is currently a Postdoctoral Research Assistant at the Dept. of Information Studies, UCLA.

Matteo Turilli (Ethics and the Practice of Software Design) is currently a DoE Research Fellow at Rutgers University.

Fernando Muradas (A Framework for Requirements Elicitation in a Military Setting) has been promoted in the Brazilian Navy following completion of the doctorate and is now responsible for Software Development.

Chris Hinds (A Practical, Methodological and Philosophical Investigation in the use of Ethnomethodological Fieldwork for Engineering Software Requirements) is lead software developer in the Dept. of Psychiatry, University of Oxford.

Simon Smith (Towards a Knowledge Management methodology for articulating the role of tacit knowledge) works for Natural England.
