Re: UK Research Assessment Exercise (RAE) review

From: Stevan Harnad <harnad_at_ecs.soton.ac.uk>
Date: Tue, 26 Nov 2002 19:05:22 +0000

On Tue, 26 Nov 2002, Jan Velterop wrote:

> Scientometrics and other metrics are about counting what can be
> counted... So 'quantity' is dealt with. What about 'quality'?
> Quality is relative, and based on judgement... utterly subjective,
> so what we count is 'votes'. Do more votes mean a higher 'quality'
> than fewer votes? Does it matter who does the voting?

All good scientometric questions, it seems to me (even the one about
how to identify and weight voting "authorities"). How to answer, if
not scientometrically? (Or do you think it should just be a matter of
individual opinion or taste?)

> I think it [matters who does the voting], at least in these matters,
> and therefore a review process is needed that ranks things like
> originality, fundamental new insights, and yes, contributions to
> wider dissemination and understanding as well, in order to base
> important decisions on more than just quasi-objective measurements.

Is this not among the things peer review is supposed to do? These are
almost literally the questions that appear in many referee evaluation
forms. Are you proposing a second round of review, a few years after
a paper appears? By all means, if you have the time and resources. And
certainly the RAE should include such secondary review data in its
scientometric equation too, if they are available in time.

But in what way is any of this an alternative to the quantitative,
scientometric assessment of research quality and impact? The only ones who
are not doing it scientometrically are the reviewers themselves (whether
in the primary peer review or in the second one Jan recommends). But their
judgments are just votes (i.e., scientometric data) too, just as the
journal-names are, in first-round peer review. Perhaps reviewer-names
could accrue some objective scientometric weight too, for the second
round.
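
To make the weighting idea concrete: here is a minimal sketch, in
Python, of judgments treated as weighted votes. The journals,
reviewers, weights, and scores are all hypothetical illustrations,
not real data.

    # Hedged sketch: review judgments as weighted votes. Every name and
    # number here is hypothetical; the point is only that "who voted"
    # can enter the count as a scientometric weight.

    # Hypothetical track-record weights for the sources of the votes.
    source_weight = {
        "Journal A": 2.0,   # journal-name as a first-round weight
        "Journal B": 1.0,
        "Reviewer X": 1.5,  # reviewer-name as a second-round weight
        "Reviewer Y": 0.5,
    }

    # Votes on a single paper: (source, judgment on a 0-1 scale).
    votes = [("Journal A", 1.0), ("Reviewer X", 0.8), ("Reviewer Y", 0.4)]

    total_weight = sum(source_weight[s] for s, _ in votes)
    score = sum(source_weight[s] * v for s, v in votes) / total_weight
    print("weighted quality score:", round(score, 3))  # 0.85 here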

But this is all speculation about what future scientometric analyses
will yield, once we have these (open access) data available to run all
these analyses on.

For the RAE, unless Jan is recommending that the assessors do a third
round of direct review of all the submissions themselves, scientometrics
(yes, counting!) seems to be the only way they can do their ranking
(which is likewise counting).

> Fortunately, in biology such secondary review is beginning to take shape:
> Faculty of 1000 (www.facultyof1000.com). It shows that the subjective
> importance of articles is often unconnected, or only very loosely connected,
> to established scientometrics. It constantly brings up 'hidden jewels',
> articles in pretty obscure journals that are nonetheless highly interesting
> or significant.

I would certainly want to use Faculty of 1000 ratings and citations in
my multiple regression equation for impact, perhaps even giving them a
special weight (if analysis shows they earn it!). But what is the point?
This is just a further source of scientometric data!
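
To show what such a regression equation might look like: a minimal
sketch, in Python, of ordinary least squares over several scientometric
predictors. The predictors (citations, downloads, a Faculty-of-1000-style
rating) and all the numbers are hypothetical illustrations; whether any
predictor earns a special weight is exactly what the fitted coefficients
would tell us.

    # Hedged sketch: a multiple regression combining scientometric
    # predictors into one estimate of impact. All data are hypothetical.

    import numpy as np

    # One row per article: citation count, download count, F1000-style rating.
    X = np.array([
        [12.0,  340.0, 1.0],
        [ 3.0,   80.0, 0.0],
        [45.0, 1200.0, 2.0],
        [ 7.0,  150.0, 3.0],   # a "hidden jewel": few citations, high rating
        [30.0,  900.0, 1.0],
    ])

    # Hypothetical outcome to predict: impact measured some years later
    # (say, citations accrued over the following five years).
    y = np.array([40.0, 10.0, 150.0, 60.0, 100.0])

    # Add an intercept column and fit ordinary least squares.
    A = np.column_stack([np.ones(len(X)), X])
    coefficients, *_ = np.linalg.lstsq(A, y, rcond=None)

    # The fitted coefficients say how much each kind of "vote" is worth
    # as a predictor of later impact.
    print("intercept and per-predictor weights:", coefficients)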

> I am sure that automated, more inclusive, counting of votes made possible by
> open and OAI-compliant online journals and repositories will help the
> visibility of those currently outside the ISI Impact Factory universe, such
> as the journals from Bhutan. But it can't replace judgement.

No, it can't replace judgment. Like all other analyses, it can merely
quantify the outcomes of judgments and weigh them against one another
and against other measures. What else is there? Even the decision to
browse, read, and cite is just a set of human judgments that we count
and try to use as predictors. Predict what? Later human performance,
findings, and judgments, i.e., research impact.
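
As a small illustration of the automated counting Jan mentions: a
minimal sketch, in Python, of harvesting from an OAI-PMH repository.
The base URL is a placeholder; ListIdentifiers and oai_dc are standard
parts of the OAI-PMH protocol, which any compliant archive supports.

    # Hedged sketch: counting the records one page of an OAI-PMH harvest
    # returns. The endpoint below is hypothetical.

    import urllib.request
    import xml.etree.ElementTree as ET

    OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
    BASE_URL = "http://repository.example.org/oai"  # placeholder endpoint

    # ListIdentifiers is a standard OAI-PMH verb; oai_dc is the metadata
    # format every compliant repository must expose.
    url = BASE_URL + "?verb=ListIdentifiers&metadataPrefix=oai_dc"

    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)

    headers = tree.getroot().iter(OAI_NS + "header")
    print("records in this page of the harvest:", sum(1 for _ in headers))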

In reminding us that all of this is based on human judgment (and, of
course, on empirical reality, in the case of science), Jan is not
giving us an alternative to scientometric quantification. He is just
reminding us of what it is that we are quantifying!

Stevan Harnad