On Metrics and Metaphysics

From: Stevan Harnad <amsciforum_at_GMAIL.COM>
Date: Sun, 19 Oct 2008 10:11:42 -0400

            'the man who is ready to prove that
            metaphysics is wholly impossible... is a
            brother metaphysician with a rival theory.'

            Francis Herbert Bradley (1846-1924), Appearance and Reality

      A critique of metrics and European Reference Index for
      the Humanities (ERIH) by History of Science, Technology
      and Medicine journal editors has been posted on
      the Classicists list. ERIH looks like an attempt to set
      up a bigger, better alternative to the ISI Journal Impact
      Factor (JIF), tailored specifically for the Humanities.
      The protest from the journal editors looks as if it is
      partly anti-JIF, partly opposed to the ERIH approach and
      appointees, and partly anti-metrics.

      Their vision seems rather narrow. In the Open Access era,
      metrics are becoming far richer, more diverse, more
      transparent and more answerable than just the ISI JIF:
      author/article citations, author/article downloads, book
      citations, growth/decay metrics, co-citation metrics,
      hub/authority metrics, endogamy/exogamy metrics,
      semiometrics and much more. The days of the univariate
      JIF are already over. This is not the time to reject
      metrics; it is the time to test and validate, jointly, as
      full a battery of candidate metrics as possible, but
      validating the battery separately for each discipline,
      against peer ranking or other validated or face-valid
      standards (as in the UK's RAE 2008).

      Brody, T., Kampa, S., Harnad, S., Carr, L. and Hitchcock,
      S. (2003) Digitometric Services for Open Archives
      Environments. In Proceedings of European Conference on
      Digital Libraries 2003, pp. 207-220, Trondheim, Norway.

      Brody, T., Carr, L., Harnad, S. and Swan, A. (2007) Time
      to Convert to Metrics. Research Fortnight pp. 17-18. 

      Brody, T., Carr, L., Gingras, Y., Hajjem, C., Harnad, S.
      and Swan, A. (2007) Incentivizing the Open Access
      Research Web: Publication-Archiving, Data-Archiving and
      Scientometrics. CTWatch Quarterly 3(3).

      Carr, L., Hitchcock, S., Oppenheim, C., McDonald, J. W.,
      Champion, T. and Harnad, S. (2006) Extending
      journal-based research impact assessment to book-based
      disciplines. Technical Report, ECS, University of Southampton.

      Harnad, S. (2001) Research access, impact and
      assessment. Times Higher Education Supplement 1487.

      Harnad, S., Carr, L., Brody, T. & Oppenheim, C.
      (2003) Mandated online RAE CVs Linked to University
      Eprint Archives: Improving the UK Research Assessment
      Exercise whilst making it cheaper and easier. Ariadne 35.

      Harnad, S. (2006) Online, Continuous, Metrics-Based
      Research Assessment. Technical Report, ECS, University of
      Southampton.

      Harnad, S. (2007) Open Access Scientometrics and the UK
      Research Assessment Exercise. In Proceedings of 11th
      Annual Meeting of the International Society for
      Scientometrics and Informetrics 11(1), pp. 27-33, Madrid,
      Spain. Torres-Salinas, D. and Moed, H. F., Eds. 

      Harnad, S. (2008) Self-Archiving, Metrics and
      Mandates. Science Editor 31(2): 57-59.

      Harnad, S. (2008) Validating Research Performance Metrics
      Against Peer Rankings. Ethics in Science and
      Environmental Politics 8(11). doi:10.3354/esep00088.
      (Theme section: The Use and Misuse of Bibliometric
      Indices in Evaluating Scholarly Performance.)

      Harnad, S., Carr, L. and Gingras, Y. (2008) Maximizing
      Research Progress Through Open Access Mandates and
      Metrics. Liinc em Revista.

Date: Sun, 19 Oct 2008 11:56:22 +0100
Sender: Classicists 
From: Nick Lowe
Subject: History of Science pulls out of ERIH
      [As editorial boards and subject associations in other
      humanities subjects contemplate their options, this
      announcement by journals in History of Science seems
      worth passing on in full. Thanks to Stephen Clark.]

Journals under Threat: A Joint Response from History of Science,
Technology and Medicine Editors

We live in an age of metrics. All around us, things are being
standardized, quantified, measured. Scholars concerned with the work
of science and technology must regard this as a fascinating and
crucial practical, cultural and intellectual phenomenon. Analysis of
the roots and meaning of metrics and metrology has been a
preoccupation of much of the best work in our field for the past
quarter century at least. As practitioners of the interconnected
disciplines that make up the field of science studies we understand
how significant, contingent and uncertain can be the process of
rendering nature and society in grades, classes and numbers. We now
confront a situation in which our own research work is being
subjected to putatively precise accountancy by arbitrary and
unaccountable agencies.

Some may already be aware of the proposed European Reference Index
for the Humanities (ERIH), an initiative originating with the
European Science Foundation. The ERIH is an attempt to grade journals
in the humanities - including "history and philosophy of science".
The initiative proposes a league table of academic journals, with
premier, second and third divisions. According to the European
Science Foundation, ERIH "aims initially to identify, and gain more
visibility for, top-quality European Humanities research published in
academic journals in, potentially, all European languages". It is
hoped "that ERIH will form the backbone of a fully-fledged research
information system for the Humanities". What is meant, however, is
that ERIH will provide funding bodies and other agencies in Europe
and elsewhere with an allegedly exact measure of research quality. In
short, if research is published in a premier league journal it will
be recognized as first rate; if it appears somewhere in the lower
divisions, it will be rated (and not funded) accordingly.

This initiative is entirely defective in conception and execution.
Consider the major issues of accountability and transparency. The
process of producing the graded list of journals in science studies
was overseen by a committee of four (the membership is currently
listed on the ERIH website). This committee cannot be considered
representative. It
was not selected in consultation with any of the various disciplinary
organizations that currently represent our field such as the European
Association for the History of Medicine and Health, the Society for
the Social History of Medicine, the British Society for the History
of Science, the History of Science Society, the Philosophy of Science
Association, the Society for the History of Technology or the Society
for Social Studies of Science. Journal editors were only belatedly
informed of the process and its criteria, and were not asked to
provide any information regarding their publications.

No indication has been given of the means through which the list was
compiled; nor how it might be maintained in the future. The ERIH
depends on a fundamental misunderstanding of conduct and publication
of research in our field, and in the humanities in general. Journals'
quality cannot be separated from their contents and their review
processes. Great research may be published anywhere and in any
language. Truly ground-breaking work may be more likely to appear
from marginal, dissident or unexpected sources, rather than from a
well-established and entrenched mainstream. Our journals are various,
heterogeneous and distinct. Some are aimed at a broad, general and
international readership, others are more specialized in their
content and implied audience. Their scope and readership say nothing
about the quality of their intellectual content. The ERIH, on the
other hand, confuses internationality with quality in a way that is
particularly prejudicial to specialist and non-English language
journals.

In a recent report, the British Academy, with judicious
understatement, concludes that "the European Reference Index for the
Humanities as presently conceived does not represent a reliable way
in which metrics of peer-reviewed publications can be constructed"
(Peer Review: the Challenges for the Humanities and Social Sciences,
September 2007). Such exercises as ERIH can become self-fulfilling
prophecies. If such measures as ERIH are adopted as metrics by
funding and other agencies, then many in our field will conclude that
they have little choice other than to limit their publications to
journals in the premier division. We will sustain fewer journals
and much less diversity, and we will impoverish our discipline.
Along with many
others in our field, this Journal has concluded that we want no part
of this dangerous and misguided exercise. This joint Editorial is
being published in journals across the fields of history of science
and science studies as an expression of our collective dissent and
our refusal to allow our field to be managed and appraised in this
fashion. We have asked the compilers of the ERIH to remove our
journals' titles from their lists.
      Hanne Andersen (Centaurus)
      Roger Ariew & Moti Feingold (Perspectives on Science)
      A. K. Bag (Indian Journal of History of Science)
      June Barrow-Green & Benno van Dalen (Historia Mathematica)
      Keith Benson (History and Philosophy of the Life Sciences)
      Marco Beretta (Nuncius)
      Michel Blay (Revue d'Histoire des Sciences)
      Cornelius Borck (Berichte zur Wissenschaftsgeschichte)
      Geof Bowker and Susan Leigh Star (Science, Technology and
      Human Values)
      Massimo Bucciantini & Michele Camerota (Galilaeana:
      Journal of Galilean Studies)
      Jed Buchwald and Jeremy Gray (Archive for History of
      Exact Sciences)
      Vincenzo Cappelletti & Guido Cimino (Physis)
      Roger Cline (International Journal for the History of
      Engineering & Technology)
      Stephen Clucas & Stephen Gaukroger (Intellectual History Review)
      Hal Cook & Anne Hardy (Medical History)
      Leo Corry, Alexandre Métraux & Jürgen Renn (Science in Context)
      D. Dieks & J. Uffink (Studies in History and Philosophy
      of Modern Physics)
      Brian Dolan & Bill Luckin (Social History of Medicine)
      Hilmar Duerbeck & Wayne Orchiston (Journal of
      Astronomical History & Heritage)
      Moritz Epple, Mikael Hård, Hans-Jörg Rheinberger & Volker
      Roelcke (NTM: Zeitschrift für Geschichte der
      Wissenschaften, Technik und Medizin)
      Steven French (Metascience)
      Willem Hackmann (Bulletin of the Scientific Instrument
      Society)
      Bosse Holmqvist (Lychnos)
      Paul Farber (Journal of the History of Biology)
      Mary Fissell & Randall Packard (Bulletin of the History
      of Medicine)
      Robert Fox (Notes & Records of the Royal Society)
      Jim Good (History of the Human Sciences)
      Michael Hoskin (Journal for the History of Astronomy)
      Ian Inkster (History of Technology)
      Marina Frasca-Spada (Studies in History and Philosophy of
      Science)
      Nick Jardine (Studies in History and Philosophy of
      Biological and Biomedical Sciences)
      Trevor Levere (Annals of Science)
      Bernard Lightman (Isis)
      Christoph Lüthy (Early Science and Medicine)
      Michael Lynch (Social Studies of Science)
      Stephen McCluskey & Clive Ruggles (Archaeoastronomy: the
      Journal of Astronomy in Culture)
      Peter Morris (Ambix)
      E. Charles Nelson (Archives of Natural History)
      Ian Nicholson (Journal of the History of the Behavioural
      Sciences)
      Iwan Rhys Morus (History of Science)
      John Rigden & Roger H Stuewer (Physics in Perspective)
      Simon Schaffer (British Journal for the History of Science)
      Paul Unschuld (Sudhoffs Archiv)
      Peter Weingart (Minerva)
      Stefan Zamecki (Kwartalnik Historii Nauki i Techniki)
      Viviane Quirke, RCUK Academic Fellow in twentieth-century
      Biomedicine, Secretary of the BSHS, Centre for Health,
      Medicine and Society, Oxford Brookes University
Received on Sun Oct 19 2008 - 15:12:46 BST
