The future of evaluative metrics

Gali Halevi

Gali Halevi discusses the impact of new publishing outlets and formats

As more outlets and formats become available, scientific publishing continues to evolve. The transition from print to digital has enabled the development of open research initiatives, including open access (OA), preprints and open data. These initiatives have allowed the research community to communicate research outputs faster and more transparently, and given us the ability to measure aspects of impact that go beyond citations.

Open access publishing has made significant strides. With a growing number of mandates from global policymakers to publish funded scientific papers OA, journals are under pressure to offer more OA publishing options to researchers. Bibliometricians have also acknowledged OA publishing’s potential to change the evaluation arena, as can be seen in the expanding literature examining the correlation between OA and citation rates, and whether citations increase once an article is made openly available.
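
To make the kind of analysis this literature performs concrete, here is a minimal sketch of an OA–citation comparison. The records and field names are fabricated for illustration; real studies work at scale and control for factors such as field, venue and article age.

```python
# Illustrative only: a toy test of association between OA status and
# citation counts, the core question in the OA citation-advantage literature.
from scipy.stats import pointbiserialr

# Hypothetical records: (is_open_access, citation_count)
articles = [
    (1, 42), (1, 17), (1, 30), (1, 8), (1, 25),
    (0, 12), (0, 5), (0, 19), (0, 3), (0, 9),
]

oa_flags = [a[0] for a in articles]
citations = [a[1] for a in articles]

# Point-biserial correlation: association between a binary variable
# (OA status) and a continuous one (citation count).
r, p_value = pointbiserialr(oa_flags, citations)
print(f"point-biserial r = {r:.2f}, p = {p_value:.3f}")
```

A positive correlation on real data would be consistent with an OA citation advantage, though correlation alone cannot establish that OA causes the increase.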

Preprints are also a major factor in the shifting research landscape. Thirty years since arXiv launched (the repository is now hosted by Cornell University), preprint repositories continue to grow at both an institutional and a global level. In addition to the 60+ repositories founded and managed by commercial entities, there are estimated to be more than 8,000 repositories managed by universities and colleges.

Repositories not only present the first opportunity for authors to make their research openly available, but also pave the way for new forms of evaluation and impact metrics by capturing usage, downloads and views. In addition, artefacts not previously considered for evaluation, such as datasets, software code, working papers, posters and presentations, have become candidates for inclusion as sources for impact metrics focused on use and interaction.

Scholarly communication through social media enables researchers to gain attention, connect with readers and participate in discussions around their research in real time. Authors use these networks to share portions of their research, figures and links to full texts across various platforms.

Open scholarship, and the availability of data surrounding these artefacts across social platforms, open management systems, and recommendation and sharing sites, have changed the publishing landscape and given rise to a new cluster of evaluative metrics generally referred to as ‘alternative metrics’.

One subset of these new types of metrics is ‘attention metrics’, defined as a set of data indicators that capture the levels of engagement a research artefact has gained since publication. Although commonly perceived as only related to social media, they actually include more than ‘likes’ and ‘shares’ and are captured across journal platforms and databases. A good example of an attention metric that evolved from open science outlets is ‘usage’. Usage can be defined in many ways, including article downloads, page views, clicks, figure or table views, bookmarking and more.
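
As a minimal sketch of how raw engagement events might be rolled up into a ‘usage’ count, consider the following. The event types, weights and field names are hypothetical; as noted above, each platform defines usage differently.

```python
# A toy aggregation of engagement events into a per-article 'usage' metric.
# Event names are illustrative, not any provider's actual schema.
from collections import Counter

# Hypothetical event log: (article_id, event_type)
events = [
    ("art-1", "download"), ("art-1", "page_view"), ("art-1", "page_view"),
    ("art-1", "figure_view"), ("art-2", "download"), ("art-2", "bookmark"),
]

# Which event types this (hypothetical) platform counts as usage.
USAGE_EVENTS = {"download", "page_view", "click", "figure_view", "bookmark"}

def usage_counts(log):
    """Count usage-type events per article, ignoring anything else."""
    counts = Counter()
    for article_id, event_type in log:
        if event_type in USAGE_EVENTS:
            counts[article_id] += 1
    return counts

print(usage_counts(events))  # Counter({'art-1': 4, 'art-2': 2})
```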

Other attention metrics include ‘recommendation’ indicators, which count the number of times an article was shared with others, and ‘public reach’, which captures the number of times an article was featured in a news outlet, a blog post or on television.

Many publishers and database providers now offer some of these metrics across their platforms, alongside the citation-related metrics that have long been the standard evaluation tools, allowing interactions to be captured in ways that were not previously possible.

Although ‘attention metrics’ and ‘alternative metrics’ cannot speak to quality, they do offer a glimpse into a research artefact’s influence, not only within a community of practice, but also in the public domain. It remains the responsibility of the evaluator, however, to distinguish between negative and positive influences, and to pay attention to the context within which these metrics are captured.

Attention-related metrics are diverse and capture several points of engagement at various levels; when driven by the public or the news media, they can offer an indication of societal influence. They must nonetheless be interpreted cautiously, taking into consideration the context in which they are generated.

The move towards open scholarly platforms, whether repositories, OA publishing or social networks, has created a shift in the way research impact is measured and perceived. Data availability, coupled with the growing call for responsible metrics in the form of the Leiden Manifesto, DORA and others, has created a new focus on the need for a more holistic view of the impact of research. This has resulted not only in the capture of social media interactions, but also in the development of a subset of attention metrics as more research artefacts are hosted on open platforms.

Attention metrics are not yet universally accepted for evaluation, tenure and promotion exercises because they are heavily context dependent. However, as more open publishing channels and open scholarship data become available, there might be more room to consider them in the future, especially as we seek to understand the overall impact of research.

The COVID-19 pandemic and the urgent need to make related research quickly available have also accelerated the deposit and use of preprints. This trend highlights their growing potential to provide impact-related metrics, especially at a time when science is moving rapidly and the number of articles is increasing apace.

However, a word of caution is needed: the need for speed must be balanced by the rigour that comes from peer review. We must not lose sight of the quality and integrity that underpin good research, which in turn allows for true innovation and collaboration. Responsible research assessment is crucial, and evaluative metrics will continue to play a key role.

The seismic shift in the scientific publishing landscape might be far from over but, given the right access to data and our collective care in using it responsibly, I am certain our most exciting times lie ahead.

Gali Halevi is director of The Institute for Scientific Information
