DESCRIPTION
Presentation by Michael Jubb to Russell Group Pro-Vice-Chancellors, June 2009
TRANSCRIPT
Research, Publication, Management
……and the REF
Michael Jubb
Research Information Network
Russell Group PVCs (Research) Working Group 3 June 2009
What are we trying to measure/assess?
productivity: individual, institutional and national volumes and shares of outputs
research impact: citations; readership and usage
networks: who is reading, citing and linking to whom
socio-economic impact: but what precisely, and over what time-frame?
Researchers’ publication and dissemination behaviour
study commissioned by RIN, with the broad aim of gathering and analysing evidence about:
the motivations, incentives and constraints that lead researchers in different subjects and disciplines to publish and disseminate their work in different ways and at different times
how and why researchers cite other researchers’ work
how researchers’ decisions on publication and citation are influenced (or not) by considerations arising from research assessment
What kinds of outputs? RAE 2008 (% of submitted outputs)
journal articles: sciences 79-99%; soc sci, arts & humanities 22-88%
books, chapters: sciences 0-3%; soc sci, arts & humanities 22-88%
conference papers: sciences 0-14%; soc sci, arts & humanities 0-5%
exhibitions, performances etc: sciences 0%; soc sci, arts & humanities 0-35%
reviews, reports etc: sciences 0-1%; soc sci, arts & humanities 0-1%
journals dominant across all disciplines (except the arts); diverse range of other outputs:
patents, instrument building, databases, web-based resources, working papers…
Some measures of productivity
[Chart: UK percentage share of world research outputs, 1988-2006, as reported by different sources: NSF 2006, OST 2006, WT (SCI), Evidence 2008, EC 2007, drawing on the WoS and SCOPUS databases. Reported shares range from about 5% to 9.5%.]
Why the differences?
data sources: WoS (SCI, SSCI, AHCI; which database version? include letters and conference proceedings?) vs SCOPUS
year to be counted: year published in print, or online?
counting method: integer counting vs fractional counting
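The integer/fractional distinction is easiest to see on a toy example. A sketch in Python, with invented data (not taken from any of the studies cited): under integer counting each country on a paper gets a full count, so shares across countries sum to more than 100%; under fractional counting each paper's single credit is split among its contributing countries.

```python
# Invented toy data: each paper lists the countries of its authors.
papers = [
    ["UK", "US"],          # two-country collaboration
    ["UK"],                # UK only
    ["UK", "DE", "FR"],    # three-country collaboration
]

def integer_counts(papers):
    """Integer counting: every country on a paper gets a full count."""
    counts = {}
    for countries in papers:
        for c in set(countries):
            counts[c] = counts.get(c, 0) + 1
    return counts

def fractional_counts(papers):
    """Fractional counting: each paper's single credit is split equally."""
    counts = {}
    for countries in papers:
        share = 1 / len(set(countries))
        for c in set(countries):
            counts[c] = counts.get(c, 0) + share
    return counts
```

Here the UK's integer count is 3 papers, but its fractional count is 0.5 + 1 + 1/3 ≈ 1.83: the choice of method changes national shares, and hence the reported UK presence.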
So what are researchers telling us?
where, when and how to publish: key motivation is recognition by peers
peer review critically important; recognition measured by citation; career advancement
secondary motivation is maximising dissemination
tension between targeting the best audience and the highest-quality journal
increasing collaboration and more co-authorship: significant rise in the proportion of multi-authored works between 2003 and 2008
research assessment affects choices; signs of an increase in productivity: small rise in the number of articles per author, 2003-2008
Disciplinary differences?
books and book chapters equal in importance to journals in the humanities
conference papers important in engineering
concerns about practice-led outputs: creative and performing arts; applied disciplines such as (aspects of) psychology, nursing and midwifery
So what are researchers telling us?
citation behaviour (citations out): some evidence of an increase in the volume of citations
varying reasons for citing others, and types of citation, associated with types of output: articles, books, conference papers etc
motivated by: authority of the cited material (64%) or of its author (44%); requirement to reference a method/theory/argument (53%); guidance from others, mainly reviewers and editors (29%)
self-citation of multi-authored papers
disciplinary differences: medical sciences tend not to cite conference papers; humanities cite personal communications and anecdotal references
Some citation measures (citations in)
number of citations, including and excluding self-citations
average number of citations per publication; % of publications not cited
normalised worldwide average number of citations per publication for a specific field
percentile breakdowns
comparisons of individuals, groups, departments, institutions etc against the normalised worldwide average for a specific field; quality profiles and percentile breakdowns
Some normalised worldwide averages
(field: worldwide average 2003-06 / UK average, without self-citations)
agriculture and food: 2.24 / 2.51
biological sciences: 2.69 / 3.47
biomedical sciences: 4.02 / 4.83
astronomy: 4.23 / 5.25
chemistry: 2.52 / 3.18
maths: 0.74 / 0.88
civil engineering: 0.82 / 0.71
energy: 1.21 / 1.11
economics: 1.13 / 1.34
psychology: 2.06 / 2.25
sociology: 1.06 / 1.12
history: 0.51 / 0.55
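Field normalisation works by dividing each paper's citation count by the worldwide average for its field, then averaging the ratios. A minimal sketch, using the chemistry (2.52) and maths (0.74) worldwide averages quoted above and an invented department's papers:

```python
# Worldwide averages per field, 2003-06 (figures quoted in the talk).
world_avg = {"chemistry": 2.52, "maths": 0.74}

# Hypothetical department: (field, citations received) per paper.
dept_papers = [("chemistry", 5), ("chemistry", 1), ("maths", 2)]

def normalised_impact(papers, world_avg):
    """Mean of per-paper citations divided by the field's world average.
    A score above 1.0 means above the worldwide average for the field mix."""
    ratios = [cites / world_avg[field] for field, cites in papers]
    return sum(ratios) / len(ratios)
```

Note how the maths paper with only 2 citations contributes the largest ratio (2 / 0.74 ≈ 2.7): normalisation is what makes low-citation fields like maths comparable with high-citation fields like biomedicine.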
Some problems: coverage of the literature
Internal WoS coverage, by field:
80-100%: Molecular Biology & Biochemistry; Bio Science – humans; Chemistry; Clinical Medicine; Physics & Astronomy
60-80%: App Physics & Chemistry; Bio Science – animals & plants; Psychology & Psychiatry; Geosciences; Soc Sci in Medicine
40-60%: Mathematics; Economics; Engineering
<40%: Other Soc Sci; Humanities & Arts
Some problems: timing and citation half-life
time-lags and skewed distributions
disciplinary, and sub-disciplinary, differences
Some problems: multiple authorships
Other measures? the Hirsch index
From: Lutz Bornmann (2006) H Index
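The Hirsch index is simple to state: a researcher has index h if h of their papers have each been cited at least h times. A minimal sketch:

```python
def h_index(citations):
    """Largest h such that the author has h papers with >= h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:   # the paper at this rank still clears the threshold
            h = rank
        else:
            break
    return h
```

For example, a researcher whose papers have 10, 8, 5, 4 and 3 citations has h = 4: four papers with at least four citations each, but not five with five.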
Other measures? network and page-rank analysis
weighted transfers of prestige from one journal/researcher/institution to another: Eigenfactor, SCImago journal ranking
usage measures: COUNTER metrics; network analysis
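The prestige-transfer idea behind measures such as Eigenfactor and SCImago can be illustrated with a damped random walk over a citation network: a citation from a prestigious journal is worth more than one from an obscure journal. The three-journal network and parameters below are invented for illustration, not drawn from either measure's actual data.

```python
# Toy citation network: links[i][j] is the share of journal i's
# outgoing citations that go to journal j (rows sum to 1).
links = [
    [0.0, 0.5, 0.5],
    [1.0, 0.0, 0.0],
    [0.5, 0.5, 0.0],
]

def prestige(links, damping=0.85, iters=200):
    """Power iteration: prestige flows along citation links, weighted by
    the prestige of the citing journal; scores converge to the stationary
    distribution of the damped walk (the page-rank idea)."""
    n = len(links)
    scores = [1.0 / n] * n
    for _ in range(iters):
        new = []
        for j in range(n):
            inflow = sum(links[i][j] * scores[i] for i in range(n))
            new.append((1 - damping) / n + damping * inflow)
        scores = new
    total = sum(scores)
    return [s / total for s in scores]
```

In this toy network journal 0 ends up with the highest score, because it receives all of journal 1's citations and half of journal 2's: unlike a raw citation count, the weighting takes account of who is doing the citing.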
Do they tell the same story?
[Chart: different metrics compared for earth and planetary sciences in selected institutions]
So what are researchers telling us?
Has dissemination behaviour been influenced by the RAE? Will it be influenced by the REF?
across all disciplines, yes
senior academics less affected, early-to-mid-career academics more so
focus on publication; institutional strategies
a quarter of all researchers think the RAE excluded important research outputs
if the REF is based on citations, they will employ open access publishing more often (42%)
So what are researchers telling us?
Has citation behaviour been influenced by the RAE?
across all disciplines (except physical sciences), no or not sure
Will it be influenced by the REF?
across all disciplines (except economics), yes or might be
likely to cite collaborators’ work more often (38%) and competitors’ work less (13%)
Issues for REF
coverage of different kinds of outputs: disciplinary differences
accuracy of data: original citation; publication databases (publishers’ and institutions’)
definition of fields: interdisciplinarity
costs… selectivity or not?
who owns the publication or citation: institution or individual?
bottom-up or top-down analysis? different metrics give different results
Relationship between metrics and peer review
“The future of research assessment exercises lies in the intelligent combination of metrics and peer review”
Henk Moed, CWTS, Leiden University
peer review informed by bibliometrics, or
bibliometrics moderated by peer review?
possibilities:
let the type of peer review depend on the outcomes of the bibliometrics
use citation analysis for initial rankings, and explicitly justify any subsequent deviations from them
Lessons for institutions? even simple bibliometrics are not simple
and they are rapidly becoming more complex and sophisticated
need for bibliometric expertise, to understand and be able to employ a range of measures: local, central or commercial services?
researchers’ motivations and behaviours are complex
the need for assessment by others is implicit in all their motivations; rewards come from assessments; the RAE/REF are part of a wider ecology
disciplinary differences are real; institutional policies and strategies must take account of them
Mapping research performance
top researchers in Distinctive Competencies
key metrics for performance in specific Distinctive Competencies
benchmark against competition per Distinctive Competencies
Lessons for institutions? staff awareness and consultation; lessons from pilots
comprehensive research information systems: publications databases
not necessarily the same as the repository
accurate bibliographic data
author ID systems?
other lessons once the current study is completed?
Questions???
Michael Jubb
www.rin.ac.uk