Articulating Research Impact – Digital Science Webinar Write-up

1st July 2015 | Katy Alexander

As part of a continuing series, we recently broadcast our second Digital Science webinar on research impact strategies from around the world. The aim of these webinars is to provide the very latest perspectives on key topics in scholarly communication.

Research outcomes are diverse, complex and realised over a wide spectrum of time. High-quality research, whether basic or applied, delivers economic, social and health benefits, often in unexpected and complex ways. These benefits are not always easy to articulate. Our webinar examined some of the ways that funders, governments and institutions are engaging with research impact.

Laura Wheeler (@laurawheelers), Community Manager at Digital Science, started the webinar by giving a brief overview of the esteemed panel and their backgrounds before handing over to Ben McLeish (@BenMcLeish) from Altmetric, who moderated the discussion and put questions to the panel.

Ben described how “impact” has become a buzzword in scholarly communication that would benefit from closer examination. The nature of scholarly communication has changed dramatically in recent years, and research is no longer sealed off from the wider world. The world has become more connected, and with that come new opportunities to understand and engage with research impact.

Ben asked Jonathan Adams (@JAdams_Research) to share some insights from his work creating a searchable online database of real-world examples of research impact, drawn from the UK’s Research Excellence Framework (REF) and available at impact.ref.ac.uk/CaseStudies/.

Processing and Analysis of the REF Impact Case Studies

Jonathan explained how the REF has introduced assessment of broader non-academic impact via case studies, alongside the assessment of traditional academic impact. The use of case studies, documents in which researchers describe what they see as the critical impacts of their research, represents a new attempt to articulate the impact of UK research.

UK research has considerable impact which is global, diverse in nature and highly cross-disciplinary, but it is not easily “metricized”

In Jonathan’s view, these case studies are “food for a debate in the UK, and more widely, about how we construe what research impact really is”. Jonathan has led Digital Science’s work in taking the 6,600 documents, normalizing the text, attaching tags for location, institution, impact type, field of research and so on, and then making them available online via a searchable database. Jonathan highlighted a particular case study from the University of Oxford as an example of high-quality UK research with considerable and sustained impact overseas.
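To make the shape of that pipeline concrete, here is a minimal sketch in Python. This is not Digital Science’s actual code; the class, the tag fields and the normalization step are simplified assumptions based on the description above.

```python
from dataclasses import dataclass, field

@dataclass
class CaseStudy:
    """One REF impact case study, plus the tags attached during processing."""
    text: str
    tags: dict = field(default_factory=dict)

def normalize(text: str) -> str:
    # Collapse whitespace and lowercase: a stand-in for real text normalization.
    return " ".join(text.split()).lower()

def tag_case_study(study, institution, location, impact_type, field_of_research):
    # Attach the metadata used to filter the searchable database.
    study.text = normalize(study.text)
    study.tags = {
        "institution": institution,
        "location": location,
        "impact_type": impact_type,
        "field_of_research": field_of_research,
    }
    return study

def search(corpus, **filters):
    # Return every case study whose tags match all of the given filter values.
    return [s for s in corpus
            if all(s.tags.get(k) == v for k, v in filters.items())]

# Hypothetical usage with placeholder text and tag values.
corpus = [
    tag_case_study(CaseStudy("Example case-study text ..."),
                   institution="University of Oxford", location="UK",
                   impact_type="Health", field_of_research="Medicine"),
]
print(len(search(corpus, institution="University of Oxford")))
```

The real system tags and indexes thousands of documents, but the principle is the same: normalize once, tag consistently, and filter on the tags.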

Overall, Jonathan’s key conclusion from the analysis was that UK research has considerable impact which is global, diverse in nature and highly cross-disciplinary, but it is not easily “metricized”. 

US University Research Funding, Peer Reviews, and Metrics

Dan Katz (@danielskatz) from the University of Chicago spoke next, sharing his perspective on how the US is engaging with research impact. Dan pointed out key differences between research funding in the UK and in the US, explaining that the majority of US funding is awarded to projects, a process that starts with the submission of a proposal.

US funders use data and metrics as an input to peer review, not to replace it.

Research assessment varies across national funding agencies, each of which has its own mission. For example, the National Science Foundation (NSF) uses two standard criteria: the first is “intellectual merit”, i.e. in what way the work will advance scientific knowledge; the second is “broader impacts”, i.e. in what way it will benefit society. Different agencies have different expectations of what societal benefit looks like, informed by their particular missions.

Broadly speaking, US funders use data and metrics as an input to peer review, not to replace it. Overall the landscape in the US is more complex than in the UK due to the variety of US funding agencies and the fact that funding goes to projects, not institutions. 

Understanding “Impact” Beyond Citations

Following on from Dan, Stacy Konkiel (@skonkiel), Research Metrics Consultant at Altmetric, gave her presentation on understanding “impact” beyond citations. Stacy accepted that impact is difficult to capture with metrics, but argued that metrics and the data associated with them can be used as indicators of attention and impact.

Stacy described an in-depth but small-scale analysis that she carried out, exploring possible links between the REF case studies, traditional citation data and Altmetric data. Using Altmetric’s data, Stacy found that many of the REF case studies did not include all of the citations that exist in policy documents. She also found that evidence of impact is heavily concentrated on public health, epidemiology and climate change. The Altmetric data also provided evidence of some novel and unexpected routes to impact which were not always well represented in the corresponding case studies.
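As a rough illustration of one part of such a comparison (a minimal sketch, not Stacy’s actual method; the DOIs and the set-based matching are assumed for illustration), one could cross-reference the citations listed in a case study against those found in policy documents:

```python
def citation_gap(case_study_dois: set, policy_dois: set) -> set:
    # DOIs cited in policy documents but absent from the case study:
    # potential evidence of impact the case study did not capture.
    return policy_dois - case_study_dois

# Hypothetical example data; the DOIs are made up for illustration.
case_study_dois = {"10.1000/example.a", "10.1000/example.b"}
policy_dois = {"10.1000/example.b", "10.1000/example.c"}

print(citation_gap(case_study_dois, policy_dois))
# -> {'10.1000/example.c'}
```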

Evidence of impact is heavily concentrated on public health, epidemiology and climate change.

After Stacy’s presentation, Ben opened things up to a more general discussion, fielding several probing questions from webinar attendees.

Overall the webinar was a fascinating and enlightening examination of research impact and the ways in which policymakers, funders, institutions, researchers and technology providers are all engaging with impact. There are clearly many opportunities for development and progress in this area.