Last month I had the honor of organizing one of the Society for Scholarly Publishing’s fall educational seminars in Washington, DC. The topic I was given was ‘innovation in scholarly publishing’. The field of innovation is, of course, incredibly broad, and there were many directions the seminar could have taken. My co-panelists and I decided to take a warts-and-all look at innovation and try to tease out what makes some innovations stick, while others are barely noticed by users.

My co-panelists, Rick Anderson from the University of Utah and Angela Cochran from ASCE, gave insightful overviews of how their particular organizations see innovation. A surprising amount of common ground emerged between librarian and publisher, both in the recognition that publishers need to find new ways to satisfy changing researcher needs in a tough library funding environment, and in the understanding that we must work out which innovations provide real value. One way to build a framework for assessing innovations is to look at the lessons of successful innovations from the past.

Altmetrics are a great example, as they have become fully integrated into the culture of scholarly communication and academia itself. As my colleague Euan Adie of our own Altmetric recently said at a user group meeting,

‘There are two reasons to adopt altmetrics: the first is because users really like them, and the second is because they, and their institutions, increasingly need them to satisfy funder mandates’.

The first point is easy to prove, so I felt it worth taking a closer look at the second: funder mandates. Research funders increasingly require grant applicants to supply evidence of societal impact at the level of individual research projects as part of their evaluation frameworks; notable examples include the REF in the UK, the Dutch SEP, and the Australian ERA. For obvious reasons, these funding requirements form the basis of a strong market need for alternative impact analysis among academics and institutions alike.

It’s clear that innovations that stick are not merely bells and whistles but address real needs of the end user or institution. To assess these needs, it is necessary not only to engage directly with those stakeholders, as many publishers are increasingly doing, but also to look at the underlying drivers of those needs, most particularly funder mandates. An emerging trend is the requirement for data sharing. In the US, important funders like the NIH have data sharing policies in place, while in the UK, the research councils’ common principles document sets out the goal that all data be made available and, most interestingly, that research funding may be used to support that effort.

The trend towards funder-mandated open science is unlikely to stop any time soon. Currently, the Nuffield Council on Bioethics, which is funded by the Wellcome Trust and the UK Research Councils, is consulting with scientists of all disciplines as part of its ‘Culture of Scientific Research’ project, during which issues of reproducibility and the need to make science more open have been major talking points.

During a recent Nuffield discussion day at Edinburgh University, among the ideas considered were pre-registering experiments and mandating that all data, both positive and negative, be made not only available but interpretable. While ideas like these are still only at the consultation stage, it’s clear that an appetite for further mandates exists.

Funder mandates for open science create real and tangible market needs in academia that go beyond traditional publishing. Publishers have an opportunity here to get ahead of these emerging needs before somebody else does. One way to predict the future needs of scholars is to follow the money.