Changes in the scholarly research landscape, its inefficiencies and its evolution were hot topics at this year’s inaugural Society for Scholarly Publishing (SSP) London event. A packed room saw Liz Allen, director of strategic initiatives at F1000, deliver a keynote covering new ways to assign credit, the growth of collaboration, the evolution of research outputs and scholarly infrastructure, and new requirements and publishing models.
Allen has valuable insights from both sides of the funder/publisher coin, having spent over a decade working at the Wellcome Trust before joining F1000, where she is involved in shaping new initiatives and partnerships to promote and foster open research.
Her opening gambit was revealing, and perhaps not surprising: 'When joining a publisher, I was quite surprised at just how little they actually interacted with funders.'
This set the scene for discussions around how the scholarly community needs to think collectively rather than in silos: researchers, funders, publishers and institutions need to work together to contribute to the ecosystem. Ultimately, we all have the same goal in mind, and that’s to do science better.
Allen showed data highlighting how the research landscape is growing across all fields, leading to a massive increase in the amount and types of research outputs published and being shared. This can be challenging, particularly for funders, when attempting to evaluate the effect of their funding and link the outputs and contributions of individuals and teams to research findings.
Allen has also been a keen advocate of replacing narrow definitions of authorship with a more holistic and realistic concept of contributions to research outputs. Now that we are thinking about research outputs differently, perhaps we need to think about authorship differently too. Project CRediT (Contributor Roles Taxonomy) emerged from a collaborative effort between funders, researchers and publishers to reinvent the concept of authorship to reflect how research is done today and, importantly, to provide greater visibility and transparency around author contributions.
CRediT has been hosted by CASRAI for the last few years, and an increasing number of publishers and information providers are introducing the taxonomy into their workflows so that contributor information is captured as part of an output’s metadata and is easily usable. Allen put this into context: if CRediT has been used, you can look at a published research paper and easily identify, for example, the person responsible for the data or for the visualisation of figures. It is a better way to capture information. It also helps researchers to gain visibility and recognition for all their contributions, can be used to support research evaluation and, perhaps most practically, can support the identification of individuals with specific expertise for peer review or potential collaboration.
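To make this concrete, here is a minimal sketch of what capturing CRediT roles in an output’s metadata might look like. The role names are genuine CRediT taxonomy terms, but the record structure and the helper function are illustrative assumptions, not any publisher’s actual schema.

```python
# Illustrative sketch only: a hypothetical metadata record for a paper,
# with each contributor tagged with roles from the CRediT taxonomy.
# The role names are real CRediT terms; the record shape is invented.

paper_metadata = {
    "title": "Example study",
    "contributors": [
        {"name": "A. Researcher",
         "credit_roles": ["Conceptualization", "Writing - original draft"]},
        {"name": "B. Analyst",
         "credit_roles": ["Data curation", "Formal analysis"]},
        {"name": "C. Designer",
         "credit_roles": ["Visualization"]},
    ],
}

def contributors_with_role(metadata, role):
    """Return the names of contributors credited with a given CRediT role."""
    return [c["name"] for c in metadata["contributors"]
            if role in c["credit_roles"]]

# With roles captured like this, finding who curated the data is one lookup:
print(contributors_with_role(paper_metadata, "Data curation"))  # ['B. Analyst']
```

This is the practical point Allen makes: once contributions are structured metadata rather than a free-text acknowledgement, questions such as “who produced the figures?” become machine-answerable.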
However, transparency around author contribution is not the only issue; there are other inefficiencies across the research ecosystem. Allen touched on the historic lack of access to methods and protocols as an example, which can make papers hard to reproduce. Shockingly, it’s estimated around 85 per cent of clinical research spend is wasted due to a range of factors, including lack of data access and sharing and duplication of studies. A lot of work that has been funded – particularly studies producing ‘negative’, ‘null’ or confirmatory findings – can be difficult to publish or is not published at all, contributing to the ‘reproducibility crisis’ and exacerbating research waste.
Compounding that are issues with the 'publish or perish' culture, in which researchers are judged on research outputs primarily in the form of published articles. Allen explained we need to be more holistic in the way we review research outputs. Initiatives trying to make a difference include the Declaration on Research Assessment (DORA), which recognises the need to improve the ways in which the outputs of scholarly research are evaluated. The initiative is working with stakeholders across the research system (particularly funders, research institutions, researchers and publishers) to promote a more rounded way of evaluating researchers and their impact.
Allen challenged the audience: is the word 'publishing' out of date? Perhaps we need to reassess what we mean by 'publishing'. This bold question provoked further discussion around how research is shared. Publishing research on preprint servers prior to peer review is growing, speeding up access to research. Digital object identifiers (DOIs) now enable the tracking of research outputs, so outputs can be linked back to research funding. Allen explained the benefits of linking research outputs to research inputs and the importance of metadata and scholarly infrastructure in supporting these approaches – done in the right way, this will ultimately enable innovation and can help to deliver less burdensome approaches to evaluating academic research.
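The linkage Allen describes can be sketched in miniature. Crossref-style work records really do carry a `funder` field with award numbers alongside the DOI; the specific record and helper below are hypothetical illustrations, not an official API client.

```python
# Hedged sketch: how funder information can travel with an output's metadata.
# The record mimics the shape of a Crossref-style work record (a "funder"
# list with award numbers is real practice), but this particular DOI,
# funder and helper are invented for illustration.

work = {
    "DOI": "10.1234/example.doi",   # hypothetical DOI
    "funder": [
        {"name": "Example Funding Body",
         "award": ["GRANT-001", "GRANT-002"]},
    ],
}

def awards_by_funder(record):
    """Map each funder named in an output's metadata to its award numbers."""
    return {f["name"]: f.get("award", []) for f in record.get("funder", [])}

# A funder can now trace an output back to the grants that supported it:
print(awards_by_funder(work))  # {'Example Funding Body': ['GRANT-001', 'GRANT-002']}
```

This is what “linking research outputs to research inputs” means in infrastructure terms: when identifiers and funding metadata are captured consistently, evaluation becomes a query over the record rather than a manual reporting exercise.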
A missing link in the evaluation of academic research is transparency around the peer review process itself. Allen challenged: should open peer review be a common scholarly practice? Tools such as Publons, which assign credit and reward researchers for recording their peer-review activity online, were also discussed. But it’s not just publisher-side peer review that has inefficiencies; there is a lack of transparency in the grant peer review process too. It is important to consider research funding and its outputs in the round to provide greater insights into how we might do science more efficiently and accelerate impact.
It boils down to being more open in our approach to authorship, as well as in the ways we evaluate, review and share research outputs. It’s clear that how we share and talk about science is changing fast. The key is to understand the evolving roles within the scholarly ecosystem and adapt to new models that encourage collaboration and innovation. It’s all about fostering openness, and as a community we should all be embracing the fact that the future of research is indeed open.
Laura Wheeler is director of external communications at Clarivate Analytics