
Responsible Research Metrics Policy

1. Introduction and scope

1.1 The University is committed to: (i) maintaining the integrity of the research disseminated by its affiliated authors; (ii) ensuring all opportunities for progression are assessed equitably and transparently; (iii) ensuring the influence and impact of research outputs are demonstrated and understood.

1.2 The scope of this policy includes the measurement and assessment of all Research Outputs, including but not limited to peer-reviewed publications, datasets and software.

1.3 This policy aims to provide guidance and understanding of how The University will responsibly engage in the use of metrics in line with the principles set out by DORA [1] and the Leiden Manifesto [2].

2. Definitions

Affiliated authors: Used in this policy to refer to authors who consider themselves representatives of the University.
JIF: A journal-level impact factor, calculated as the number of citations during a census period to articles published in a journal during a preceding designated period, divided by the total number of citable outputs published in that journal during the same designated period. For example, the 2017 two-year impact factor was the number of citations made in 2017 to articles published in 2015 and 2016, divided by the total number of citable articles published in 2015 and 2016.
Normalise: The rescaling of data, usually through division, which attempts to remove the effects of unwanted variables, for example the age of an article or the citation culture of a disciplinary field.
Representatives: Used in this policy to refer to people who are associated with the University. This can include, but is not limited to, academic staff and professional staff.
Research Outputs: The published products of a research project. For example: published journal articles; monographs; preprints; working papers; datasets; software; patents; commissioned reports; conference papers; invited talks.
Unitary level: In this document, this refers to an individual person or individual research output.
The University: In this document, the University of Southampton and/or its academic and professional representatives.
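The JIF arithmetic defined above can be sketched as follows. This is an illustrative calculation only; the function name and the citation counts are hypothetical, not data from any real journal.

```python
def journal_impact_factor(citations_in_census_year, citable_items_by_year, window_years):
    """Impact factor as defined above: citations made in the census year to
    articles published in the preceding window, divided by the number of
    citable items published in that same window."""
    total_citations = sum(citations_in_census_year[y] for y in window_years)
    total_citable = sum(citable_items_by_year[y] for y in window_years)
    return total_citations / total_citable

# Illustrative 2017 two-year JIF: citations made in 2017 to articles
# published in 2015 and 2016, over citable articles from those two years.
jif_2017 = journal_impact_factor(
    citations_in_census_year={2015: 120, 2016: 80},  # hypothetical counts
    citable_items_by_year={2015: 50, 2016: 50},      # hypothetical counts
    window_years=[2015, 2016],
)
```

Here 200 citations over 100 citable articles gives a two-year JIF of 2.0. Note that, per 5.3, such journal-level figures must not be used as a surrogate for the quality of individual articles.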

3. Responsibilities and ownership

3.1 The Research and Enterprise Executive Group (REEG) has responsibility for the oversight of this policy, for monitoring compliance with it, and for ensuring The University meets the ethical recommendations of its funding bodies, DORA [1] and the Leiden Manifesto [2].

3.2 This policy will be subject to periodic audit and review by REEG, to provide assurance to Senate that the terms remain fit for purpose.

3.3 This Policy will be reviewed at least every two years or as appropriate to respond to changes in relevant legislation or national guidance.

3.4 To support the implementation of this policy, the Library and Research and Innovation Services are responsible for maintaining expert knowledge on metrics so that individuals can seek advice on their use and be satisfied that the potential effects of numerical transformations are well understood and clearly communicated (see 5.1.2, 5.1.3, 5.1.5 & 5.1.7).

4. Institutional warranties

4.1 The University recognises that in order to interrogate big data, metrics are sometimes required. To achieve meaningful results, metrics need to be based on significant data, thus metrics are not a substitute for qualitative information or expert knowledge at a Unitary level.

4.2 The University recognises that effective metric analysis of recently published articles can be difficult.

4.3 All metrics are comparative measures. The University will identify and select key performance indicators and Normalise them (see 5.1.7) as appropriate ahead of analysing results, as set out in 5.1.4 and 5.2-5.2.1.

4.4 The University will be transparent about the use of metrics, especially when used for hiring and promotion decisions, as set out in 5.1.2, 5.1.3 & 5.1.5.

4.5 The University understands that metrics can create incentives and change behaviour. Behaviours associated with gaming metrics are actively discouraged.

4.6 It is well understood that protected characteristics can bias measurable units of research assessment.

5. Expectations

5.1 The University expects its representatives to:

  • 5.1.1 consider whether a metric is necessary and on balance why its use is more helpful than expert testimony alone;
  • 5.1.2 be transparent about the use of metrics;
  • 5.1.3 ensure results of research output data analysis are reproducible, where possible;
  • 5.1.4 define a question explicitly ahead of choosing a method and conducting an analysis on research outputs. As stated in the Leiden Manifesto [2], “measure performance against the research missions of the institution, group or researcher” (see 4.3 & 5.2.1);
  • 5.1.5 accompany any metrics with explanations in unambiguous plain English to ensure end users understand the data used, their reliability and limitations;
  • 5.1.6 where possible, use more than one metric to verify interpretations of the results when a metric is employed;
  • 5.1.7 where possible, Normalise metrics to diminish the effects of comparing across disciplines, years, etc.;
  • 5.1.8 regularly review commonly used metrics to establish if they remain fit for purpose.
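The normalisation expected in 5.1.7 can be sketched as a simple rescaling, in line with the definition of Normalise in section 2. The function name and all counts below are hypothetical, chosen only to show why division by a field-and-year baseline makes outputs comparable across disciplines.

```python
def field_normalised_citation_score(citations, field_year_average):
    """Rescale a raw citation count by the average citations of outputs in
    the same field and publication year, diminishing the effects of
    comparing across disciplines and years (per 5.1.7).
    A score of 1.0 means 'cited at the field average'."""
    return citations / field_year_average

# Hypothetical: a biology paper with 30 citations in a field averaging 15
# scores the same as a maths paper with 4 citations in a field averaging 2,
# even though their raw counts differ widely.
bio_score = field_normalised_citation_score(30, 15)    # 2x field average
maths_score = field_normalised_citation_score(4, 2)    # 2x field average
```

Raw counts would rank the biology paper far ahead; the normalised scores show both are cited at twice their respective field averages, which is the comparison 5.1.7 asks representatives to make.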

5.2 Metrics should not be used:

  • 5.2.1 without due cause. All metrics should be tailored to the question being asked, the data available and the expected end use;
  • 5.2.2 if the data available are insufficient in scope to answer the question being asked;
  • 5.2.3 solely to make decisions that affect personal circumstances, especially employment status or the reputation of an individual, i.e. “quantitative evaluation should support qualitative, expert assessment” [2].

5.3 Operationally, this means The University prohibits the use of journal-based metrics, such as JIFs, as a surrogate measure of the quality of individual research articles. Expert testimony or personal statements should be taken into account when assessing an individual.

5.4 The University is a signatory of DORA [1]. Within our individual roles at the University we act as representatives of the institution, but we may also act as publishers, as suppliers of metrics, or as researchers; it is therefore important to read and understand the terms DORA sets out for each of those roles (see 2).

6. Guidance

6.1 Guidance on how this policy relates to DORA [1] and the Leiden Manifesto [2] can be found here:

7. Further reading and resources

8. Related regulations and policies

Code of Conduct for Research

Open Access Policy

Research Data Management Policy

Data protection

9. Bibliography

[1] "San Francisco Declaration on Research Assessment," 01 01 2019. [Online]. Available:

[2] D. Hicks, P. Wouters, L. Waltman, S. de Rijcke and I. Rafols, "Bibliometrics: The Leiden Manifesto for research metrics," Nature 520, 429–431 (2015).


10. Version Control

Date Approved: 20/11/19
Author: Library Research Engagement Group

Version: 1.0
Revision Date: N/A
Revised by: N/A
Page No: N/A
Amendment: No previous versions.
Authorised by: Senate