Re: The Facts about Open Access: Results from the ALPSP/AAAS/HighWire Study

From: Stevan Harnad <harnad_at_ecs.soton.ac.uk>
Date: Thu, 27 Oct 2005 20:22:01 +0100

On Thu, 27 Oct 2005, [identity deleted] wrote:

> Dear Dr. Harnad,
> Are you familiar with the report ALPSP just published? I'm wondering if
> you might have any comments on it that I could incorporate into a news
> story I am writing for [deleted]. I've attached the report here.
> Many thanks for your time!

Yes, I am familiar with the report. It is a report commissioned by some
publishing groups on the current status of Open Access Publishing --
a new economic model that it is perhaps too early to assess, because
it has not been around long enough, nor spread sufficiently yet. So
on the report's general conclusions -- that (1) it is not yet clear
how viable the OA cost-recovery model is (author-institution pays for
publication instead of user-institution paying for access), and that (2)
many OA publishers do not use the OA cost-recovery model at all, but the
conventional subscription/license model -- I think this is generally true,
but not very new or informative, as it is too early to determine whether
the OA cost-recovery model is viable. (I think it will be, but opinions
don't matter: the evidence of time will tell.)

I am doubtful about one reported outcome: (3) that OA journals have less
rigorous peer review. This is not only not correct, but it does not even
make sense: What does the quality-level of a journal have to do with its
cost-recovery model? There are OA journals with higher peer-reviewing
standards and OA journals with lower peer-reviewing standards, exactly
as there are non-OA journals with higher and lower peer-reviewing
standards. PLoS Biology is an example of an OA journal of the highest
peer-review standards, every bit as high as those of Nature or Science,
which are non-OA, and I could easily name OA and non-OA journals with
lower peer-review standards. So I would say that the conclusion about
peer-review standards is certainly wrong, and based on a methodological
error: not comparing like with like.

But there is something much more seriously wrong with the report. It
is described as "The Facts about *Open Access*"; but not only is not
everything reported a fact (e.g., the part about lower OA peer-review
standards), and not only are the conclusions about the viability of OA
publishing vastly premature and not grounded in anywhere near sufficient
data, but these are not the "facts" about *OA* either, they are merely
some provisional observations about OA *Publishing*. For there is an
entirely different means of providing OA -- 100% OA for every article
published, in every (peer-reviewed) journal, whether the journal happens
to be an OA journal or not -- and that is OA self-archiving: The author
deposits a copy of the author's own peer-reviewed final draft on the web
for free for all would-be users who cannot afford access to the official
version in the journal in which it is published.

There are economics involved here too, but they are not the economics of
the publisher's cost-recovery model, but the economics of the return on
the tax-paying public's investment in research: Research that is funded
but not used, applied and built upon may as well not have been conducted
or funded. The benefits of research come from its uptake and usage,
not from its publishers' sales-revenue. Self-archiving maximizes how
much research is used, applied and built upon by making it accessible
not only to those users who can afford paid access, but to those who
cannot. It increases citation impact by 50%-250%. That is a fact about
Open Access, and it has nothing to do with OA publishing cost-recovery
models. Yet it is a fact that this report says next to nothing about,
so focused is this report on OA publishing cost-recovery models.

Last, most of this report is based on surveying (publishers')
*opinions*; so if it is facts about anything, it is merely facts about
opinions (of publishers, about OA publishing). What we need is facts
about OA: not just publishers' opinions about the present and future of
OA publishing, but facts about OA itself, whether via OA publishing or OA
self-archiving. Facts about researchers (not just their opinions, but
their actual practices); and facts about the *effects* of OA -- not just
the effects of the OA publishing on cost-recovery, but the effects of OA
on research usage, impact, productivity and progress, and the financial
implications of *those* effects, not just for the cost-recovery models
of publishers, but for the return on the tax-paying public's investment
in the research that is conducted and published.

> Thank you so much for your reply. I've had to finish the article
> without your comments, but I'm hoping I may have gotten the gist
> correct...
>
> As far as I can tell, no one has looked at citation impacts, or they are
> looking and still trying to figure them out. Thomson ISI showed
> PLoS Biology to be 13.9 on their scale, which is good for a 2-year-old
> journal, but [there are a] variety of journals out there (i.e. [including]
> ones that don't even publish anymore or have defunct websites).

Yes, there has been a *great* deal of evidence of the OA citation
impact advantage, but you are (alas) simply making the very same error
I noted above that the Kaufman-Wills report made: You are equating
OA with OA journal publishing!

Most of the evidence on the OA impact advantage comes from OA
self-archiving, not OA publishing. With OA publishing there is exactly
the same problem I mentioned in connection with comparing peer-review
quality: the problem of comparing like with like (otherwise one's
conclusions are arbitrary or biased). But with OA self-archiving, we
can (and do) compare scientific impact *within* the same journal and
year, by comparing articles that have and have not been self-archived
(a toy sketch of this comparison appears after the links below):

See:
    http://opcit.eprints.org/oacitation-biblio.html
    http://citebase.eprints.org/isi_study/
    http://www.crsc.uqam.ca/lab/chawki/graphes/EtudeImpact.htm
    http://openaccess.eprints.org/index.php?/archives/28-guid.html
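To make that within-journal, within-year comparison concrete, here is a
minimal Python sketch. It is purely illustrative and not taken from any
of the studies linked above; the record fields "journal", "year",
"self_archived" and "citations" are hypothetical stand-ins for the
bibliographic and citation data such studies actually use:

    # Toy sketch: group articles by (journal, year), then compare mean
    # citation counts of self-archived vs. non-self-archived articles.
    from collections import defaultdict
    from statistics import mean

    def oa_citation_advantage(articles):
        """Return {(journal, year): % citation advantage of self-archived articles}."""
        groups = defaultdict(lambda: {"oa": [], "non_oa": []})
        for art in articles:
            key = (art["journal"], art["year"])
            bucket = "oa" if art["self_archived"] else "non_oa"
            groups[key][bucket].append(art["citations"])

        advantage = {}
        for key, g in groups.items():
            if g["oa"] and g["non_oa"]:   # need both groups in the same journal-year
                oa_mean, non_oa_mean = mean(g["oa"]), mean(g["non_oa"])
                if non_oa_mean > 0:
                    advantage[key] = 100.0 * (oa_mean - non_oa_mean) / non_oa_mean
        return advantage

    # Made-up example: one self-archived article cited 12 times vs. two
    # non-self-archived articles cited 5 and 7 times in the same journal-year.
    sample = [
        {"journal": "J. Example", "year": 2003, "self_archived": True,  "citations": 12},
        {"journal": "J. Example", "year": 2003, "self_archived": False, "citations": 5},
        {"journal": "J. Example", "year": 2003, "self_archived": False, "citations": 7},
    ]
    print(oa_citation_advantage(sample))   # {('J. Example', 2003): 100.0}

The published studies are of course far more careful than this (larger
samples, more controls), but the core logic of holding journal and year
fixed while comparing self-archived with non-self-archived articles is
the same.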

As to OA journals coming and going: Again a problem of comparing like
with like, for many non-OA journals go belly-up too, especially
in their vulnerable first start-up years.

Stevan Harnad
Received on Thu Oct 27 2005 - 23:12:32 BST
