Report cautions against the over-reliance on citation statistics

A new report warns against over-reliance on citation statistics such as the impact factor and the h-index.

Promoting the sensible use of citation statistics in evaluating research, the report points out several common misuses. While the authors recognise that assessment must be practical and that easily derived citation statistics will inevitably be part of the process, they caution that citations provide only a limited and incomplete view of research quality. Research is too important, they say, to measure its value with a single, coarse tool.

The report points out that statistics are no more accurate when they are improperly used; misused or misunderstood, they can mislead. It also notes that the meaning of citations is not well understood: what a citation signifies can be very far from 'impact', making the claimed objectivity of citation counts illusory. And while reducing quality to a single number is indeed simple, it invites a shallow understanding of something as complicated as research. Numbers are not inherently superior to sound judgements, the report observes.
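To see why a single number can flatten a complicated record, consider the h-index the report mentions: the largest h such that h of an author's papers each have at least h citations. A minimal sketch, using hypothetical citation counts, shows how two very different publication records can collapse to the same score:

```python
def h_index(citations):
    """Return the largest h such that h papers each have >= h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break     # counts are sorted, so no later paper can
    return h

# Two hypothetical authors with markedly different records:
print(h_index([4, 4, 4, 4]))        # steady record -> 4
print(h_index([100, 50, 40, 4, 1])) # highly cited record -> also 4
```

Both records yield h = 4, even though one includes papers with 100 and 50 citations, illustrating the report's point that a single coarse number can hide large differences in research impact.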

The report, entitled Citation Statistics, on the use of citations in assessing research quality, was commissioned by the International Mathematical Union (IMU) in cooperation with the International Council on Industrial and Applied Mathematics (ICIAM) and the Institute of Mathematical Statistics (IMS). The work also drew on practices reported by mathematicians and other scientists from around the world.