Summary of a colloquium in Melbourne

From: Arthur Sale <ahjs_at_ozemail.com.au>
Date: Sat, 17 Feb 2007 16:40:42 +1100

The RQF Explained

Colloquium held in Melbourne, Australia, 15 Feb 2007



Disclaimer

This is a personal summary and does not represent the official view of
anybody other than myself. Please forgive spelling and infelicitous
wording. Note for international readers: RQF = Research Quality
Framework, an RAE-type research evaluation activity; DEST = Australian
Government Dept of Education, Science & Training; HERDC = Higher
Education Research Data Collection. Note that all Australian universities
will be required to have a repository by year-end.

Arthur Sale



The Melbourne event followed, and was improved by, the prior Sydney
colloquium (13 Feb). Although I did not attend the Sydney event, I was
told by several people that the themes and outcomes were similar. The
talks in Melbourne were slightly updated from Sydney's.



The first part of the day was devoted to DEST briefings. The following
are highlights, including some matters that came out subsequently in
conversation and questions, a summary of things we knew already, and
some personal injections.

1. The ASHER program, to fund repositories in Australian
universities, was announced in December 2006. Guidelines for applying are
expected shortly, and expenditure backdated to 18 Dec 2006 should be
acceptable. The $25.5M will become available from July 2007 over
subsequent budget years, and the program will run for three years. ASHER
should not be seen as having only the short-term goal of making all
universities RQF-ready; more importantly, it will equip Australian
universities with repositories for their future needs and for the
Accessibility Framework (read this to mean something like Open Access).

2. There is also $16.4M specifically for the costs of the new
requirements for data gathering (the IAP program). A similar timetable
applies, but research offices will be the main beneficiaries.

3. The RQF will work with all repository software in use in
Australia, provided it is slightly adapted to RQF needs. There are six
packages, in rough order of installations: EPrints, DSpace, ProQuest
(bepress), VITAL, a single instance of Fez, and a possible Ex Libris
site. Universities are not required to use the same software, and no
particular package will be mandated.

4. DEST, together with CAUL, will negotiate with major publishers
about copyright issues. How the smaller publishers will be dealt with is
unclear. The RQF will be consistent with the Copyright Act 1968. DEST
will provide copyright advice.

5. The RQF will be conducted all-electronically. This intention may
be compromised slightly for books as research outputs, and for odd
outputs such as architectural drawings.

6. Evidence portfolios will be submitted and held centrally in a
DEST IMS.

7. Assessors will log in to the DEST IMS to see evidence portfolios.
If they choose to follow a persistent link to an evidence object, the IMS
will log on to the university repository and retrieve the required data.

8. Don't forget that evidence is not just journals, books,
chapters and conferences (the HERDC set), but can be very many things,
both in the quality arena ('best four publications') and in the impact
arena (evidence of impact). Examples: contracts, environmental impact
studies, reports, maps, architectural or engineering drawings, art
works, music and drama performances, software, exhibition catalogues,
attendance counts.

9. The present timetable is:

2nd quarter 2007 – trial IMS implementation and testing
with selected universities

July 2007 – release of RQF specifications; first tranche
of ASHER and IAP funds becomes available

3rd quarter 2007 – RQF roadshow and communications strategy;
talk to every university, including WA and NT

October 2007 – proposed research groupings to be submitted
to DEST; universities will not be held to the details, but the
submissions allow for panel workload planning

10. The DEST IMS will face challenges in handling large objects, and
possibly low-bandwidth assessors.

11. The fraction of research objects that assessors will access has not
even been estimated, though the RAE experience of around 15% suggests a
similar or lower figure. Coping with different platforms, and making the
interface usable, will be challenges.

12. Providing access to evidence in the wider "all research
outputs" set may be advantageous, but who knows who will look at what?
Access is probably more likely in impact measurement.

13. It was suggested that universities should be able to see what
assessors see, to check their presentations. Perhaps one person in a
research office.

14. The HERDC data collection will continue for the foreseeable future,
and without change; the RQF does not displace it. Conference papers
(suggested for deletion by a participant) are in fact more important
than, or of equal quality with, journals in some disciplines. Long-term
linkage of HERDC databases and repositories is seen as important.

15. Derek Whitehead spoke on copyright; his talk is difficult to
summarize. Read the OAK LAW Report and look at the presentation slides.
Note that what follows is not legal advice from either Derek or myself:

Note that the law on non-HERDC research objects may well be
different, eg music performances, artworks. Think also about research
data, newspaper articles demonstrating impact, etc.

Third party copyright in a research object is a difficult issue.

Even 'unpublished' material may be 'published' according to the
Copyright Act.

There are lots of pair-wise relationships to consider.

Is it worthwhile considering a just-in-time approach for some
objects?

Universities should perhaps look at the takedown procedures
prescribed by the Act for ISPs and comply, even if they are not strictly
required to.



There were then four presentations of Mock-RQF trials, all with
differing characteristics. Rather than embarrass anyone, I'll keep this
anonymous and pick out some highlights from the interesting
presentations.

16. Each university needs to analyze where its strengths and research
groupings lie (and how strong they are), and there are only 6-8 months
to do it.

17. Each repository should be up 24x7. Remember that assessors from the
other side of the world won't be impressed if your site is down for
backup when they want to work in the middle of our night or over
weekends.

18. It is a big challenge for us to collaborate while we are being
urged to be competitive.

19. It is important that every university identify the person who is
the driver of the RQF effort (or have a two-headed Cerberus?).

20. Three groups are key to this activity: researchers, the research
offices and their PVC administrator, and libraries/ICT sections. (The
first is usually under-represented in RQF planning.)

21. It was suggested that every university should have (if it does not
already)

 * an RQF Steering Committee, chaired by a senior person such as the
    DVC(R) or PVC(R) and comprising top-level decision-makers who can
    embody institutional commitments; and
 * an RQF Project Team, reporting to the above and composed of the
    people doing the work. The Chair can come from the academic sector,
    be an administrator from the research office, or be a responsible
    librarian. The important thing is that there be a 'champion' of the
    RQF activity.



Personal comments

The colloquium focused to a considerable extent on the repository and on
delivering evidence content to assessors, despite this being a
relatively small part of assessment, and only of contributory
significance. Maybe 10-15% of evidence will be looked at in detail (at
most), and then by an assessor who is almost certainly not competent to
referee the article concerned.



I had hoped for more clarity on the issue of citations and metrics
generally. Evidence is beginning to come to hand that the differences
between Thomson, Scopus and Google Scholar are significant, and the
determination of "best papers" may partly hang on citation evidence.
The sector needs clarity on the metric evidence to be provided to
panels, recognizing that panels will make their own decisions on how to
use it. Personally, I'd be surprised if panels to whom metrics were
relevant didn't give them high importance. After all, panel members are
supposed to be experienced researchers, and it doesn't make any sense
to rely on amateur evaluation as against expert evaluation.



I was also surprised by many speakers' and questioners' attitudes
(revealed by what they said, or sometimes did not say) about the Open
Access agenda. This did not generally apply to DEST, but it did to
sector participants. While recognizing that the colloquium was about the
RQF, it was as though university attention had been focused on the RQF
agenda to the exclusion of all else.



Unscalable repository deposit practices were reported. There were also
comments that the sector only needed to upload four documents per active
researcher, without the parallel observation that all research output
should have been uploaded years ago and that the four are now only a
belated start. Some even suggested holding off uploading until the RQF
requirements were known!



Some comments suggested that the RQF would shape repositories, whereas
the influence runs the other way. University repositories exist to
provide accessibility and improve citations independently of the RQF,
and the RQF will have to adapt to this reality, with a few minor changes
required in the repository software. It seems that DEST knows this, even
if the sector doesn't.






Received on Sat Feb 17 2007 - 06:30:23 GMT
