Providing Context for Research and the Ever-Evolving Field of Altmetrics
Research always has context – the experience of the researcher, the institutional setting, the funding body, the publishing venue, the way it is interpreted and by whom. Trust is encoded in all of these complex contextual relationships. At ScienceOpen we are trying to open up the context of research as much as possible, to support the search and discovery of trustworthy research. With its transparent aggregation of online mentions, Altmetric provides a window into a key part of this web of context. These days everything is quantified, from citations to tweets, and from news mentions to Reddit upvotes. In every field, we are all driven by “key performance indicators”.
Is the communication of scholarly research any different? In a world where everything is measured and scrutinised so closely, how can the different stakeholders in the academic world use this to the best possible effect?
When Altmetric was launched back in 2011 it was a game changer. It didn’t so much provide an alternative to traditional, citation-based metrics as make us think about how we can use different metrics to suit a variety of purposes. Altmetrics became an integral part of an ongoing and rapidly evolving ‘open movement’, encompassing important aspects of research openness and evaluation. At ScienceOpen, we have been working with Altmetric for several years now to help provide services for both researchers and authors. We see altmetrics as a way to show essential ‘context’ about research articles, and we try to leverage this for a better search and discovery experience for users.
At the article level, altmetrics can be used to create enhanced discovery tools for researchers. The rising sea of metrics is matched only by a completely overwhelming number of newly published research articles: estimates suggest around 2–2.5 million new research papers are now published every year.
The real question for researchers these days is how to discover relevant research across all the existing publishing platforms.
Sorting articles by their Altmetric score is one efficient way of filtering research, much as we do on platforms such as Amazon or Reddit, where we sort by popularity or ratings. Researchers can get a quick overview of the papers in their field receiving the most online attention and use this to prioritise which articles to spend time reading and re-using.
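Conceptually, this kind of popularity-based filtering is just a sort over score metadata. A minimal sketch in Python – the article records and the `altmetric_score` field name here are hypothetical, purely for illustration:

```python
# Hypothetical article records; in practice the scores would come from
# an altmetrics data provider rather than being hard-coded.
articles = [
    {"title": "Paper A", "altmetric_score": 12},
    {"title": "Paper B", "altmetric_score": 157},
    {"title": "Paper C", "altmetric_score": 43},
]

# Rank by online attention, highest score first.
top_articles = sorted(articles, key=lambda a: a["altmetric_score"], reverse=True)

print([a["title"] for a in top_articles])  # ['Paper B', 'Paper C', 'Paper A']
```

A real discovery platform would of course combine such a sort with relevance ranking and filters, but the underlying idea is the same.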
With interdisciplinary research on the rise, however, researchers must also often do literature searches outside of their field – for grant proposals, collaborations and new courses. Searching within a new field – pulling up 800,000 papers on diabetes as a non-expert, for example – can be a challenge. Sorting by Altmetric score offers a different way to drill down into this content: just click on the Altmetric donut for a transparent view of who is saying what about a piece of research.
What is the point of doing research if no-one is there to tweet it?
OK, perhaps that goes a bit far, but it is not that far from the reality of modern research. Researchers want and need to know about the re-use of their research – for grant and tenure committees, for their CVs, for promotions, and, well, for a bit of self-gratification every now and then.
Who has shared your work? On what platforms did they share it? Who is writing about and re-using your work in the social sphere? What are they saying about your work?
All of these questions can be at least indicated, and often quantified, by Altmetric, providing important context at the individual level. If you integrate your ORCID profile – which every researcher should have as part of their essential toolkit – we pull in data from Altmetric so you can see which of your papers are most popular. This provides a great way of tracking how all of your research is being distributed and re-used in digital social spheres.
Our journey to better understand, quantify and harness the context of research at ScienceOpen has only begun, and we look forward to travelling down this road with Altmetric.