
Royal Society cautions against online censorship of scientific misinformation


Governments and social media platforms should not rely on content removal to combat harmful scientific misinformation online, according to a report published today by the Royal Society, the UK’s national academy of science.

Instead, the Online Information Environment report, produced by a working group of leading researchers including Oxford computing, internet and media experts, recommends wide-ranging measures to build resilience to misinformation and foster a healthy online information environment.

Professor Sir Nigel Shadbolt, Oxford Professor of Computing Science (Department of Computer Science) and one of the working group, maintains, ‘The internet has been one of humanity’s greatest innovations. The knowledge and information it supports and disseminates is amongst our greatest resources.’

But, he says, ‘We face a torrent of misinformation on topics great and small. The report reviews the challenges of misinformation and what steps we can take to deal with them. It does not call for content removal as a panacea; rather, it recommends a range of measures that governments, tech platforms and academic institutions can implement – recommendations that build resilience to misinformation and promote a healthy online environment.’

Professor Gina Neff, another working group member and Professor of Technology & Society at the Oxford Internet Institute, adds, ‘Scientific misinformation doesn’t just affect individuals, it can harm society and even future generations if allowed to spread unchecked.

‘Our polling showed people have complex reasons for sharing misinformation, and we won’t change this by giving them more facts.’

Meanwhile, Professor Michael Bronstein, Oxford DeepMind Professor of Artificial Intelligence (Department of Computer Science) and working group member, points out, ‘The coronavirus pandemic has highlighted the phenomenon of "scientific misinformation" and the harm it can inflict on individuals and societies. A complex and rapidly evolving scientific phenomenon such as a new disease is beyond the grasp of the public, which reaches out to online sources of information in an attempt to answer questions and understand how to behave.

‘Members of the public often lack the tools to tell authoritative sources from fictitious ones, and tend to regard science as the absolute "truth" rather than a constantly evolving picture. Consequently, they fall victim to honest mistakes and misreadings of scientific results as well as intentional manipulation. The report studies the phenomenon of scientific misinformation and makes recommendations to different stakeholders that we believe might help build a healthier information ecosystem.’

Professor Rasmus Kleis Nielsen, of Oxford University’s Reuters Institute for the Study of Journalism, concludes that ‘a lot of citizens [would have] their worst suspicions confirmed’ if access to information were limited – even if it is misinformation.

The working group’s report recommends a range of measures for policymakers, online platforms and others to understand and limit the harms of misinformation, including:

  • Supporting media plurality and independent fact-checking.
  • Monitoring and mitigating evolving sources of scientific misinformation online.
  • Investing in lifelong information literacy.

The working group also included:

  • Chair – Professor Frank Kelly CBE FRS, Emeritus Professor of the Mathematics of Systems, University of Cambridge
  • Dr Vint Cerf ForMemRS, Vice President, Google
  • Professor Lilian Edwards, Professor of Law, Innovation, and Society, University of Newcastle
  • Professor Derek McAuley, Professor of Digital Economy, University of Nottingham
  • Professor Melissa Terras, Professor of Digital Cultural Heritage, University of Edinburgh

Professor Bronstein is also Head of Graph Learning Research at Twitter.