Report highlights misused metrics and recommends alternatives


The Institute for Scientific Information has launched Global Research Reports – a series of publications aimed at those who work with research in academia, corporations, publishing and government – intended to inform, to stimulate debate and to demonstrate the rich information potential of research data.

The first report, Profiles not Metrics, draws attention to the information that is lost when data about researchers and their institutions are squeezed into simplified metrics or a league table. It states that research is not one-dimensional: the process is complex and no two projects are identical. Yet, to the dismay of many, the global research community is surrounded by analyses that claim to measure relative performance among people, publications and organisations, disregarding the counter-arguments offered by informed analysts.

Profiles not Metrics examines four familiar types of analysis that can obscure real research performance when misused, and offers four alternative visualisations that unpack the richer information that lies beneath each headline indicator:

Researchers: A beam plot, not an h-index. The h-index is a widely quoted but poorly understood way of characterising a researcher’s publication and citation profile, whilst the beam plot supports a fair and meaningful evaluation;

Journals: The whole Journal Citation Reports (JCR) record, not just the Journal Impact Factor (JIF). The JIF has been irresponsibly applied to wider research management, whilst the new JCR offers revised journal profiles with a richer data context;

Institutes: An Impact Profile, not an isolated Average Citation Impact. Category normalised citation impacts have no statistical power and can be deceptive whilst Impact Profiles show the real spread of citations; and

Universities: A Research Footprint, not a university ranking. A global university ranking may be fun but suppresses more information than most analyses and hides the diversity and complexity of activity of any one campus, whilst a Research Footprint provides a more informative approach as it can unpack performance by discipline or data type.
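The first contrast above can be sketched in code. The h-index collapses a researcher’s whole record into one number, whereas a beam plot presents the distribution of per-paper citation percentiles. This is a minimal illustration with hypothetical citation counts and percentiles; a real beam plot would use field- and year-normalised percentiles from a citation database:

```python
from statistics import median

def h_index(citations):
    """Largest h such that the researcher has h papers with at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(ranked, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

def beam_plot_summary(percentiles):
    """Summarise the distributional view a beam plot gives: spread and median
    of per-paper citation percentiles, rather than a single number."""
    return {"min": min(percentiles),
            "median": median(percentiles),
            "max": max(percentiles)}

# Two hypothetical researchers with very different citation profiles...
researcher_a = [10, 8, 5, 4, 3]
researcher_b = [4, 4, 4, 4, 0]
# ...share the same h-index, so the single metric cannot tell them apart.
assert h_index(researcher_a) == h_index(researcher_b) == 4
```

The assertion shows the information loss the report describes: distinct publication records collapse to the same headline indicator, while the beam-plot summary keeps the spread visible.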

Jonathan Adams, director at the Institute for Scientific Information, explained: 'In re-establishing ISI we committed to supporting the needs of the global community of researchers, research managers, policymakers and publishers with high-quality and timely information to inform and support best-practice analysis and interpretation of research trends and performance. In this, our first ISI report of the 21st century, we examine the efficiency and effectiveness of current metric indicators as we seek to support sound, responsible research management.

'For every over-simplified or misused metric there is a better alternative, usually involving proper and responsible data analysis through a graphical display with multiple, complementary dimensions. By placing data in a wider context, we see new features, understand more and improve our ability to interpret research activity.'
