
Panel reveals more awareness and implementation of altmetrics


William Gunn reports back from a session on altmetrics that he chaired at the SSP conference earlier this year

With the increasing amount of information available to help authors, readers, and publishers discover and assess scholarly content, new mechanisms are needed to understand the utility and value of this data. Alternative metrics are often seen as tools that can add value to journals and raise awareness of content.

At the Society for Scholarly Publishing (SSP) conference in May, a panel discussion took on this topic. Speakers included Todd Carpenter, director of standards organisation NISO, which has been leading a project to develop best practices around the use of altmetrics. He presented the results of the first phase of the NISO initiative, which recently concluded and yielded a list of topics of importance to librarians, funders, publishers, authors, and readers.

These topics of importance included research evaluation, data quality and gaming, and community adoption. They also resulted in potential action items, which the scholarly community helped prioritise last month to set the direction for Phase Two of the NISO altmetrics initiative. The decision on which items to work on is still pending, but potential action items include work on definitions; types of research output that are applicable to the use of metrics; methodologies; and use cases.

Michael Habib, senior product manager for Scopus, presented data from a survey Elsevier conducted of authors who have published with the company, examining researchers' awareness of various types of metrics and the utility they find in them. Brand awareness of altmetrics relative to the Journal Impact Factor (see page 32) remains relatively low. However, altmetrics showed the greatest increase in mindshare, rising eight-fold from last year's survey to eight per cent.

Euan Adie, founder of altmetrics company Altmetric (see page 32), discussed his work in promoting publisher adoption of the new metrics. He observed that every large scholarly publisher is now using altmetrics in some form, with many smaller and society journals showing increasing interest.

Adie spoke about what different audiences are currently getting out of altmetrics. For institutions, this is about awareness, reputation building and discovery; for publishers, it is about providing an author service; and, for funders, it is about collecting evidence for different types of impact in one place. He described how altmetrics has evolved from being something primarily linked with social media to encompassing policy documents, patents and more. He also identified areas that still need improvement, including more frequently updated data feeds and cleaner data from providers such as Mendeley and Twitter.

A new perspective was contributed by Elizabeth Iorns of Science Exchange, co-founder of the Reproducibility Initiative. Her presentation, 'Reproducibility as an Altmetric', looked at the story behind the metrics, focusing on the work the initiative is doing to improve the quality and reliability of pre-clinical research. What makes this approach interesting is its potential to improve the data sources from which metrics are derived: better-quality input data could make the resulting metrics more informative and focused.

Given that the presentations showed the increasing implementation of altmetrics and rapidly growing brand awareness of the concept among authors and readers, questions from the audience frequently focused on the practical aspects of implementing altmetrics.

The novel approach of the Reproducibility Initiative in illuminating some issues with the underlying content of scholarly communications also stimulated interest among the attendees, who had many questions about why peer review is not adequate to catch reproducibility problems and how to communicate the best practices developed by the initiative.

William Gunn is head of academic outreach at Mendeley. He chaired the session ‘Concurrent 3E: 21st Century Assessment: How Authors, Publishers, and Readers are using Altmetrics’ at the SSP conference in Boston, MA, USA in May