Responsible Use of Metrics

Research metrics can provide useful quantitative measures of publication and citation activity and patterns, but they do not provide a comprehensive picture of research activities, and they tend to reduce the complex and nuanced impacts of research outputs to a single number.

The use of metrics has become an intrinsic part of evaluating research, but the origins and original purpose of some of these measurement tools may surprise you. For example, one of the most widely used metrics – the Journal Impact Factor – was originally designed to assist librarians with stock selection. Over time it has been misappropriated to inform opinions about the scholarly prestige of a journal, the quality of the research it publishes, and the hiring potential of its authors.

The Responsible Metrics movement advocates for the balanced use of metrics alongside qualitative, expert assessment, and favours a more open approach to measurement, in line with other open research practices.

Resources

The following are three key documents that outline some recommended principles and practices for the responsible use of metrics.

The Metric Tide

The Metric Tide articulates five main principles for responsible metrics assessment:

  • Robustness: basing metrics on the best possible data in terms of accuracy and scope.
  • Humility: recognising that quantitative evaluation should support, but not supplant, qualitative, expert assessment.
  • Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results.
  • Diversity: accounting for variation by field, and using a range of indicators to reflect and support a plurality of research and researcher career paths.
  • Reflexivity: recognising and anticipating the systemic and potential effects of indicators, and updating them in response.

Source: Wilsdon, J., et al. (2015). The metric tide: Report of the independent review of the role of metrics in research assessment and management. https://doi.org/10.13140/RG.2.1.4929.1363

DORA

The Declaration on Research Assessment (DORA) identifies the need to improve the ways in which scientific research is evaluated.

The outputs from scientific research are many and varied including: research articles reporting new knowledge, data, reagents, software, intellectual property, and highly trained young scientists.

Funding agencies, institutions that employ scientists, and scientists themselves, all have a desire, and need, to assess the quality and impact of scientific outputs. It is thus imperative that scientific output is measured accurately and evaluated wisely.

DORA's general recommendation is that researchers do not use journal-based metrics, such as the Journal Impact Factor (JIF), as surrogate measures of the quality of individual research articles, to assess an individual scientist's contributions, or to influence hiring, promotion, or funding decisions.

Leiden Manifesto

The Leiden Manifesto outlines ten principles to guide quantitative research evaluation.

  • Quantitative evaluation should support qualitative, expert assessment.
  • Measure performance against the research missions of the institution, group, or researcher.
  • Protect excellence in locally relevant research.
  • Keep data collection and analytical processes open, transparent, and simple.
  • Allow those evaluated to verify data and analysis.
  • Account for variation by field in publication and citation practices.
  • Base assessment of individual researchers on a qualitative judgement of their portfolio.
  • Avoid misplaced concreteness and false precision.
  • Recognise the systemic effects of assessment and indicators.
  • Scrutinise indicators regularly and update them.

Source: Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520(7548). https://doi.org/10.1038/520429a

Position Statements

A growing number of research institutions are responding to these concerns by developing position statements and policy documents around the responsible use of research metrics and, more broadly, the responsible assessment of research.

Other Support Documentation

Profiles, not metrics (Clarivate)
This report draws attention to the information that is lost when data about researchers is squeezed into a simplified metric. It looks at four familiar types of analysis that can obscure real research performance when misused and discusses some alternative visualisations that support sound and responsible research management.

HuMetricsHSS (Humane Metrics Initiative)
HuMetricsHSS is an initiative that creates and supports values-enacted frameworks for understanding and evaluating all aspects of the scholarly life well-lived and for promoting the nurturing of these values in scholarly practice.

Responsible Metrics Explained
Watch this video by the Office of Scholarly Communication, Cambridge (03:20) for an overview of the Responsible Metrics movement.

Source: Office of Scholarly Communication, Cambridge. (2019, April 18). Research in 3 minutes: Responsible metrics [Video]. YouTube. https://www.youtube.com/watch?v=zbGb08jJzt0