How altmetrics are reshaping research impact measurement

Digital Science’s Mike Taylor discusses how altmetrics are adapting to the evolving landscape of research impact measurement

Thirteen years after their launch, altmetrics find themselves at another inflection point. As social media landscapes shift and new platforms emerge, the field faces fresh questions about how to meaningfully measure research impact across an increasingly fragmented digital ecosystem.

Mike Taylor, Head of Data Insights at Digital Science, draws on three decades in academic publishing to examine how altmetrics are adapting to these changes.

Why altmetrics remain essential for measuring research impact

Despite being more than a decade old, altmetrics remain relevant in today’s research landscape. Taylor, whose career spans the transformation of academic publishing, points to the rise of open access as a key driver of this continued relevance. “The biggest thing we’ve seen over 15-20 years has been the rise of Open Access. It has cost a lot, caused a lot of disruption,” he says. “A lot of the thinking behind OA was moral, that it is right to have science available to all. Altmetric gives us a great way of quantifying that success.”

Beyond measuring open access impact, altmetrics provide unprecedented insights into the speed of research translation. The inclusion of clinical guidelines in Altmetric’s data sources offers researchers and publishers a window into how quickly academic research moves from publication to practical application, a crucial metric for understanding real-world impact.
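
As a rough illustration of what that metric looks like in practice, the sketch below computes the lag between a paper's publication and its first clinical guideline mention. The dates are invented for illustration; this shows the shape of the calculation, not Altmetric's methodology.

```python
from datetime import date

# Hypothetical dates for a single paper, invented for illustration.
published = date(2021, 3, 15)
first_guideline_mention = date(2023, 9, 2)

# Translation lag: time from publication to first appearance in a
# clinical guideline.
lag_days = (first_guideline_mention - published).days
print(f"Publication-to-guideline lag: {lag_days} days "
      f"(~{lag_days / 365.25:.1f} years)")
```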

Tracking the scholarly platform migration: from X to Bluesky

Perhaps nowhere is Altmetric’s adaptability more evident than in its recent addition of Bluesky tracking. This move comes as the scholarly communication landscape experiences unprecedented upheaval, with researchers migrating away from traditional platforms. “I would never have imagined seeing X drop as it has done,” Taylor says. “Although there’s been a small recovery since Mr. Musk backed out of politics, we see plenty of days when Bluesky has a bigger active community than X.”

This platform migration offers insights into academic community behavior patterns. Taylor identifies a paradox: “It taught me two things, which are opposed: firstly, how quickly a stable community moves, secondly how resistant communities are to disruption!” The data suggests that while academic communities can mobilize rapidly when pushed, they also demonstrate considerable resilience to change, a phenomenon that Taylor believes warrants further research attention.

The tracking of multiple platforms raises broader questions about where scholarly discourse happens and how comprehensively it can be captured.

Addressing long-standing technical challenges

One significant development addresses a persistent technical challenge: meaningful sentiment analysis for scientific content. Traditional sentiment analysis tools have long struggled with research-related posts, often misinterpreting positive findings as negative. “You’d put in a post about cancer survivorship increasing, and it’d say ‘oh no, cancer, that’s bad,’” explains Taylor. “Standard tools still do this.”
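
Taylor's example is easy to reproduce. The sketch below runs a positive finding through NLTK's general-purpose VADER analyser, whose lexicon scores the word "cancer" as negative regardless of context; it demonstrates the failure mode of standard tools, not Altmetric's own classifier.

```python
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)

# A clearly positive research finding about cancer survivorship.
post = "Cancer survivorship has increased over the past decade."

# VADER matches tokens against a fixed lexicon; "cancer" carries a
# strong negative weight, so the compound score comes out negative
# even though the finding is good news.
print(SentimentIntensityAnalyzer().polarity_scores(post))
```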

The solution involved developing a specialised system tailored to how people discuss science on social media. Led by Dr Carlos Areia, the team spent three years creating and refining this sentiment analysis approach. “The data was great, we’ve been refining this for the last couple of years, and listening to our customers,” Taylor explains. The development represents progress in understanding not just the volume of research attention, but its qualitative dimensions.

Empowering publishers with benchmark intelligence

For publishers seeking competitive intelligence and portfolio optimisation, Altmetric’s new Journal Benchmark dashboard addresses a persistent challenge. Drawing from his 20 years at Elsevier, Taylor understands firsthand the difficulty of comparing journals or benchmarking against disciplines. “Although the big publishers now have data science teams, and access to Dimensions and Altmetric data, it’s still pretty hard to make easy comparisons,” he says.

The Journal Benchmark dashboard emerged from Taylor’s personal vision to support journal communities with clear, actionable data. Whether publishers are managing flagship journals or developing emerging titles, the tool provides editor-friendly visualisations that facilitate meaningful comparisons and strategic decision-making. “I wanted to help people like me, people who love their journals, love the communities, and want to support editorial boards with good, clear data,” says Taylor.
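
As a minimal sketch of the kind of comparison such a dashboard makes easy, the snippet below ranks each journal's typical attention within its own discipline. The journals and figures are invented; this is the general idea, not Digital Science's implementation.

```python
import pandas as pd

# Invented journal-level figures for illustration.
df = pd.DataFrame({
    "journal": ["J Resp Med", "Thorax Today", "Lung Reports",
                "Cell Notes", "Bio Letters"],
    "discipline": ["respiratory", "respiratory", "respiratory",
                   "biology", "biology"],
    "median_attention": [14.0, 22.5, 8.0, 30.0, 11.0],
})

# Percentile rank of each journal within its own discipline -- the
# sort of editor-friendly comparison a benchmarking tool surfaces.
df["pct_in_discipline"] = (
    df.groupby("discipline")["median_attention"].rank(pct=True)
)
print(df.sort_values(["discipline", "pct_in_discipline"],
                     ascending=[True, False]))
```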

AI’s promise and peril in research metrics

Looking ahead, Taylor sees AI as having tremendous potential to democratise research evaluation. The ability to provide context-rich insights, such as identifying that a paper ranks as “the number three most discussed paper published in asthma in the last year” and is “discussed amongst patient advocates” or “covered in the mainstream media,” represents a significant advancement in making metrics more digestible and meaningful.
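
A contextual statement like “the number three most discussed paper in asthma” is, underneath, a simple rank over mention counts within a field and time window. The sketch below shows that calculation with invented DOIs and counts.

```python
# Invented mention counts for papers in one field and year.
mentions = {
    "10.1000/asthma.001": 412,
    "10.1000/asthma.002": 388,
    "10.1000/asthma.003": 295,
    "10.1000/asthma.004": 141,
}

# Rank papers by discussion volume, most discussed first.
ranked = sorted(mentions, key=mentions.get, reverse=True)
rank = ranked.index("10.1000/asthma.003") + 1
print(f"Number {rank} most discussed asthma paper this year")
```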

However, Taylor notes the responsibility that comes with this power. “The users of research evaluation are mostly academics; they expect robust answers, firmly rooted in the data,” he says. As AI increasingly influences promotion and tenure decisions, transparency becomes paramount. “Whether it’s promotion, or article rejection, or grant approval, we can’t get into a situation where ‘the algorithm says no’: our work is too important for that.”

This perspective reflects Altmetric’s broader philosophy of supporting rather than replacing human judgment, a principle rooted in Taylor’s belief that the field should “always value dissent and disruption.”

Demystifying the Altmetric Attention Score (AAS)

Despite altmetrics’ maturation, misconceptions persist. Taylor frequently encounters claims that social media metrics are meaningless or that sources like Wikipedia lack credibility. Recent research challenges these assumptions. Using Altmetric’s sentiment analysis, the team can demonstrate the meaningful nature of social media conversations about research. Additionally, survey data reveals that academics do tend to trust Wikipedia articles within their subject areas.

“Altmetrics mean things (plural) and this requires understanding and analysis,” says Taylor. “The Altmetric Attention Score is a great tool for telling people to look at the data more closely; it’s not an answer in itself.”
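
Altmetric describes the score as a weighted count of the attention a research output receives, with more influential sources weighted more heavily. The sketch below captures that idea only; the weights and counts are illustrative, and the real score involves further adjustments, so treat this as a pointer to the concept rather than the algorithm.

```python
# Illustrative source weights -- similar in spirit to Altmetric's
# published defaults, but not the actual algorithm, which also
# adjusts for factors such as audience and reach.
ILLUSTRATIVE_WEIGHTS = {
    "news": 8, "blog": 5, "policy": 3, "wikipedia": 3, "social_post": 1,
}

# Invented mention counts for one paper.
mentions = {"news": 2, "blog": 1, "social_post": 40, "wikipedia": 1}

score = sum(ILLUSTRATIVE_WEIGHTS.get(source, 0) * count
            for source, count in mentions.items())
print(f"Illustrative weighted score: {score}")  # 16 + 5 + 40 + 3 = 64
```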

The road ahead

Taylor also hints at significant developments that will bring more data and insights to the Altmetric Explorer. These additions, said to address long-standing user requests, promise to further enhance the platform’s utility for publishers, researchers, and institutions.

As the research communication landscape continues to evolve, Altmetric’s combination of platform agility, sophisticated analysis tools, and commitment to transparency positions it as an essential resource for understanding research impact. In an era where traditional metrics alone cannot capture the full scope of scholarly influence, Altmetric’s comprehensive approach to tracking attention across diverse sources, from clinical guidelines to emerging social platforms, provides the nuanced insights necessary for informed decision-making.
