Altmetrics: A decade in the making
On 7th December 2020, Digital Science celebrates its official tenth birthday (though it existed as “Project Babbage” for more than a year before it was launched publicly). To mark the occasion, we will be releasing a few birthday posts over the course of the year.
This post is an edited version of an editorial that appeared in UKSG Insights in early 2020.
October 2020 marked 10 years since the term ‘altmetrics’ was first coined. Made famous in a manifesto authored by Jason Priem, Dario Taraborelli, Paul Groth, and Cameron Neylon, altmetrics, or alternative metrics, promised a new, better way to capture the reach and influence of any form of scholarly output.
In the context of a community that had embraced digital and was increasingly frustrated with the limitations of the Impact Factor as a measure of impact, altmetrics were an exciting new idea that promised to change the way we understand research outcomes forever.
So, where are we now? The last 10 years have seen a huge growth in the development, use and adoption of altmetrics. Numerous service providers have emerged, with Altmetric (owned by Digital Science) often noted as the most recognised of these, and publishers, institutions and funders have increasingly integrated these new data into their websites and workflows.
Their use, and the nature of the data itself, has changed over time, and new applications of these insights are still emerging each year.
With the idea of new metrics came a demand (and need) for new technologies and tools. Where Impact Factors relied on article-to-article citations, altmetrics required alternative methods of tracking that could capture a much broader range of online activity.
Since Altmetric’s initial launch in 2011, both the breadth and the range of altmetrics technologies have changed significantly. New sources of attention have been mined (for example, it’s now possible to use altmetrics to see where a research publication is cited in policy or patents). New methods have been created to help researchers not just track the outcomes of their research but actively plan and promote it to achieve the impact they believe it can have – an evolution that underlines the importance of open communication in an increasingly online world.
The precise methods of tracking and collating altmetrics data have evolved over the years, and a NISO initiative has helped to ensure better documentation on exactly how the data are gathered and presented.
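To make the idea of “collating” concrete: altmetrics providers typically gather mentions of an output from many online sources and combine them into a weighted indicator of attention. The sketch below is purely illustrative – the source names and weights are hypothetical and are not Altmetric’s actual values or methodology.

```python
from collections import Counter

# Hypothetical source weights for illustration only --
# not the real weighting used by any altmetrics provider.
SOURCE_WEIGHTS = {"news": 8, "blog": 5, "policy": 3, "twitter": 1}

def attention_score(mentions):
    """Tally mentions of one research output per source and
    return a single weighted attention indicator."""
    counts = Counter(m["source"] for m in mentions)
    return sum(SOURCE_WEIGHTS.get(source, 0) * n for source, n in counts.items())

# Example: one news story and two tweets about the same paper.
mentions = [
    {"source": "news"},
    {"source": "twitter"},
    {"source": "twitter"},
]
print(attention_score(mentions))  # 8 + 1 + 1 = 10
```

The key design point the NISO work addresses is transparency: whatever the weights and sources are, they should be documented so readers know what a given number actually counts.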
A new way of understanding reach and influence
Touted as a way to evidence the impact of research that would not be reflected in traditional citations, altmetrics have made a huge difference to many.
Today, altmetrics are widely used by publishers, both to publicly display the attention around their publications and to track and report on it internally. Altmetrics have also increasingly been adopted by institutions keen to help their researchers strengthen funding applications and to find interesting stories to tell about their work. Altmetrics are also being explored by funders and corporate organizations looking to track the outcomes of their programs and refine their research strategies.
More and more individual researchers are using altmetrics too, as a way to connect directly with new audiences, and to showcase the influence that their work is having. Some have started to include data on media coverage or mentions in policy in tenure applications, whilst others have used the information to refine their own engagement strategies and encourage others in their field to more proactively share their research with a global audience, instead of just their immediate colleagues.
Early career researchers or those publishing non-article outputs such as datasets, books, and other grey literature now have the opportunity to tell a story around their research that would not have been visible before, and an opportunity to influence real change in how we think about the value of research.
Initiatives such as Bookmetrix, a collaboration between Altmetric and Springer Nature, and Altmetric Badges for Books, enable book authors (often underserved by traditional citation metrics) to benefit from altmetrics too – making it possible to delve into the online engagement around individual chapters and to see where their work has been included in academic reading lists, shared on social media or mentioned in a blog.
The immediacy of altmetrics makes them a powerful tool, not just for reporting on engagement or attention, but for developing new connections, building alternative approaches, and shaping future research efforts.
Critically, altmetrics can show a valuable and different aspect of research engagement and influence than traditional citations. Early scientometric studies found little correlation between the two, and scholars in this field have also begun to investigate how discipline, author location and output type play a role in the attention a research output might receive.
The responsible use of metrics
The emergence of altmetrics and the ongoing use of the much-debated Impact Factor as a proxy for article quality have prompted much discussion over the last few years. Initiatives such as the Metric Tide report in the UK, the Metrics Toolkit, and the Expert Group on Indicators established by the European Commission have taken feedback from the research community in an effort to provide guidelines and shape a new approach to research metrics.
Ensuring the responsible use of metrics cannot be left only to the people who provide the data (indeed, some would argue it should not be left to them at all) – and this is where librarians and research offices increasingly find themselves needing to offer guidance to their organizational leadership, and to individual faculty and departments.
Like all emerging technologies, altmetrics are not (and likely never will be) a perfect solution to everything it was hoped they might achieve. The reasons for this are numerous – technical challenges, changing requirements, and even just the shifting priorities and needs of the research community. As we move towards a more open research world, issues around reproducibility and global accessibility are coming to the fore.
The research community still has a vital role to play in how these tools and data develop. Events and groups such as the Altmetrics Workshop, the Altmetrics Conference, ISSI and STI provide a platform for discussion and debate around what it is we are trying to capture and why, and the practical application of these data is being explored in forums such as the Altmetric REF 2021 working group and the LIS-Bibliometrics initiative.
What the future holds for altmetrics is yet to be determined – but it certainly promises to be an interesting journey!