
Page 1

Journal rankings: Can’t live with ‘em, can’t live without ‘em!

Professor John Mingers, Kent Business School, ECU, February 2014, [email protected]

Page 2

Components of the REF

– Components of the submission

• Staff data
• Output data (65%)
• Impact case studies and contextual narrative (20%)
• Research environment narrative, income and PhD data (15%)

– Outcome reported as a ‘quality profile’
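The slide gives only the weights (65/20/15); assuming the standard REF rule (not stated here) that the overall quality profile is the weighted average of the three sub-profiles, the arithmetic is:

```latex
% Overall quality profile as a weighted average of the sub-profiles
% (weights as on this slide; s ranges over the star levels)
P_{\mathrm{overall}}(s) = 0.65\,P_{\mathrm{outputs}}(s) + 0.20\,P_{\mathrm{impact}}(s) + 0.15\,P_{\mathrm{environment}}(s),
\quad s \in \{4^{*},\, 3^{*},\, 2^{*},\, 1^{*},\, \mathrm{unclassified}\}
```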

Page 3

• Many different journal rankings each with its own biases and prejudices

• They are based on often arbitrary criteria, and can be peer-review based or behavioural (e.g., impact factors)

• The original Kent ranking was simply a statistical combination of other rankings – “Objectivity results from a combination of subjectivities” (Ackoff). (See the sketch below.)

• The ABS ranking was based on a UWE peer-review ranking that was developed for ABS

• Since the 2008 RAE it has become the de facto standard and yet is hugely contentious

Journal rankings
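The slides do not specify which statistical combination the original Kent ranking used; as a minimal sketch, one common approach is to standardise each list to z-scores so the scales are comparable, then average per journal. All journal names and scores below are made up for illustration.

```python
import statistics

# Hypothetical rankings: each list scores an overlapping set of journals
# on its own scale (all names and numbers are illustrative only).
rankings = {
    "list_a": {"EJOR": 3, "Mgmt Sci": 4, "JORS": 3, "Omega": 2},
    "list_b": {"EJOR": 80, "Mgmt Sci": 95, "Omega": 70},
    "list_c": {"EJOR": 2, "JORS": 3, "Omega": 2},
}

def standardise(scores):
    """Convert one list's scores to z-scores so different scales are comparable."""
    mean = statistics.mean(scores.values())
    sd = statistics.stdev(scores.values())
    return {journal: (s - mean) / sd for journal, s in scores.items()}

# Average each journal's z-scores across whichever lists rank it.
z_lists = [standardise(r) for r in rankings.values()]
journals = {j for z in z_lists for j in z}
combined = {
    j: statistics.mean(z[j] for z in z_lists if j in z)
    for j in journals
}

for journal, score in sorted(combined.items(), key=lambda kv: -kv[1]):
    print(f"{journal}: {score:+.2f}")
```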

Page 4

1. Problems of peer reviewing papers:
   1. Simply too time consuming
   2. Disagreements between reviewers (cf. journal referees disagreeing)
   3. Bias

2. Makes more open and transparent what would otherwise be very judgmental and open to bias

3. Provides a common currency against which to discuss and judge research quality

4. Provides clear guidance and targets for people to aim at

5. Provides a lot of information for DoRs, ECRs, libraries etc.

Advantages of journal rankings

Page 5

1. History and development

• 2004: Version 1, Bristol BS, “Not intended for general circulation”; based on journals submitted to RAE 2001 plus others; grades standardised to the 2008 RAE 1*–4* scale; decisions made by the editors. (In OR/MS there are only 19 journals but 6 have the top grade; in IS/IT there were 25 journals of which 7 have the top grade.)

• 2007: Version 1 of the ABS list; based on the BBS one but with input from subject specialists and use of impact factors; many journals downgraded. Who were the subject experts? How/why were they chosen? Why were disciplinary bodies, e.g., COPIOR, not included? (In OR there are 40 journals; 5 have the top score, but 3 of these are statistics journals, leaving only 2, American, OR journals – Management Science and Operations Research. In IS there are 68 journals, but only 4 top ones, all US. In both areas, all the UK/Euro ones had been demoted, leaving only US ones. The people on the panel were Chris Voss, an Ops Mgt person at LBS, and Bob O’Keefe, more an IS person, who had just returned from the US.)

• 2010: the current version. It has become highly contentious, especially in particular fields such as OR, IS, and Accounting and Finance.

Problems of journal rankings (ABS in particular)

Page 6

Numbers of journals in the RAE and the ABS list

2. General coverage of management

Page 7

Submission statistics for the last three RAEs. Adapted from Geary et al. (2004), Bence and Oppenheim (2004), RAE (2009a).
a Totals differ slightly between different sources. Figures for 2008 are after data cleaning as described later.

                                       1996         2001         2008
No. of submissions                      100           97           90
No. of staff submitted                2300+        3000+         3300
Total no. of outputs                  8000+         9942        12575
No. of journal papers (% of total)a   5494 (69%)   7973 (80%)   11625 (92%)
No. of journal titles                  1275         1582         1639
Mean outputs/journal                    4.3          5.0          7.1
Mean outputs/institution               80.0        102.5        139.7
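The two “mean” rows follow directly from the counts above; for 2008, for example:

```latex
\frac{11625}{1639} \approx 7.1 \ \text{outputs per journal},
\qquad
\frac{12575}{90} \approx 139.7 \ \text{outputs per institution}
```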

Page 8

Output type   Description                          2001    2008
A             Authored book                         431     285
B             Edited book                            77      60
C             Chapter in book                       863     332
D             Journal article                      7973   11374
E             Conference contribution               295      85
G             Software                                3       1
H             Internet publication                   24     318
N             Research report for external body      80      98
T             Other form of assessable output       184      22
              Total                                9942   12575

Number of publications by output type. Adapted from Geary et al. (2004), Bence and Oppenheim (2004), RAE (2009a). Categories with zero entries have been suppressed.

Page 9

Figure 1 Pareto curve for the number of entries per journal in the 2008 RAE

Page 10

3. Disciplinary coverage

• There are 22 different subject areas in the list, which seems like a lot. It is also very ad hoc: “an eclectic mix of categories consisting of: academic disciplines, business functions, industries, sectors, issues or interests as well as more or less residual categories which includes many of the leading business and management journals” (Rowlinson, 2013)

• Much of the list is devoted to reference disciplines rather than to B&M and applied areas – Economics (16%), Psychology (5%), Social science (7%) – nearly 30% in total. But: Gen Man (4%), HR (4%), Marketing (7%), OR (4%), Strategy (2%). (In OR, ABS had 35, but the COPIOR list has 68 and is growing.)

• Unequal proportions of 4*: Psychology (42%), Gen Mgt (23%), Soc Sci (20%), Econ (13%), HR (11%), Marketing (9%), Fin (7%), Ops Mgt (3%), Ethics/governance (0%), Mgt Ed (0%). (In OR, there are 4 x 4*, so apparently 11%, but in fact 2 are statistics journals, so in reality 6%.)

Page 11

• Specific disciplinary factors (e.g., why is Business History a 4*?). See Accounting Education, December 2011, for critiques from an Accounting perspective. (The two 4* OR journals are American and specifically exclude Soft OR, which is one of the major British strengths.)

• Over-reliance on ISI impact factors – journals not in ISI are ignored or, at best, given low grades

Page 12

4. Problems of process

• Lack of openness about how the list is created or updated

• Lack of engagement with disciplinary communities (no members of COPIOR on the committee despite our offers)

• Few changes made despite protests; little attempt to address the criticisms (COPIOR overtures were ignored, so reluctantly we produced our own ranking; this too has been ignored)

• Is it now seen as a money-making venture?

Page 13

Problems with a single dominant list

1. Journals outside the list are inevitably marginalised

2. With the REF, journals at 1*, and increasingly at 2*, are devalued

3. It’s very hard for new journals to get started

4. The quality levels given in the list tend to be taken as the quality levels of both the journal and then of the papers within it: “It’s a 3* paper”, “Jane Bloggs is a 4* researcher”

5. Individual researchers are disciplined into channelling their work into ABS journals

Page 14

6. It discourages cross-disciplinary or applied work

7. The particular focus of ABS appears to be on US journals – these tend to be highly theoretical, positivistic, and anti-pluralist. This leads to less practical and engaged work and more arcane theory

8. Ideas or work that is pushing the boundaries will not get published and hence will not get done

9. Journal fetishism - gimme that 4* “hit”

10. Potentially serious effects on people’s careers and on sections of Schools – e.g., OR at Warwick, which was decimated

Page 15

            ABS Grades                                        RAE Estimated Grades
      All        Journals     Journals in         All our    Journals     Journals in
      journals   not in RAE   RAE and our list    list       not in ABS   ABS and our list
4*    10%        4%           15%                 17%        13%          18%
3*    24%        12%          31%                 29%        28%          31%
2*    37%        39%          37%                 28%        26%          28%
1*    27%        45%          17%                 23%        19%          22%
0*    –          –            –                   3%         13%          2%
GPA   2.17       1.74         2.43                2.34       2.09         2.41

Table 9: Proportions of journals in particular ranks, comparing ABS with RAE grades.
Note: we show the proportions as percentages for ease of comparison, but all chi-square tests were performed on the underlying frequencies.
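Two things the caption compresses: the GPA row is simply the grade-weighted mean of each column, and the chi-square tests were run on raw frequencies rather than the rounded percentages shown. A minimal sketch of both follows; the counts are hypothetical, since the table reports only percentages, and recomputed GPAs can differ from the table by a point or two in the second decimal because of rounding.

```python
from scipy.stats import chi2_contingency

# GPA of a quality profile: sum of (grade x proportion).
# Column 4 of Table 9 ("All our list"): 17% 4*, 29% 3*, 28% 2*, 23% 1*, 3% 0*.
profile = {4: 0.17, 3: 0.29, 2: 0.28, 1: 0.23, 0: 0.03}
gpa = sum(grade * p for grade, p in profile.items())
print(f"GPA = {gpa:.2f}")  # 2.34, matching the table

# Chi-square comparison of two grade distributions, done on counts.
# Hypothetical counts over 4*, 3*, 2*, 1*, 0* (not taken from the table):
in_abs = [18, 31, 28, 22, 2]       # e.g. journals in ABS and our list
not_in_abs = [13, 28, 26, 19, 13]  # e.g. journals not in ABS
chi2, p_value, dof, expected = chi2_contingency([in_abs, not_in_abs])
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
```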

Page 16

Conclusions from Table 9

• Overall RAE grades were higher than overall ABS grades (cols 1, 4), but this was because of the selectivity of submissions

• This can be seen by comparing the ABS journals that were submitted with those that were not (cols 2, 3)

• Comparing those journals that are in common, the level of grading is very similar (cols 3, 6)

• In the RAE, ABS journals were graded more highly than non-ABS journals (cols 5, 6)

• 13% of non-ABS journals were graded 0*

Page 17

Figure 3 Scattergram showing association between GPA and proportion of an institution’s submitted journals that are in ABS

Page 18

There are at least 3 possible explanations of this (the slide draws them as causal diagrams linking “Higher % ABS journals”, “Better RAE grades” and “Higher quality of department”):

• “RAE Bias”: a higher % of ABS journals leads directly to better RAE grades

• “Better depts. more mainstream”: a higher quality of department leads both to a higher % of ABS journals and to better RAE grades

• “Greater selectivity”: a higher quality of department means greater selectivity in what is submitted, which raises the % of ABS journals alongside better RAE grades

Page 19

Problems with the whole RAE regime

• Current measurement regimes are hugely distorting to research:

• Narrow focus on types of outputs – i.e., “4*” English-language journal articles

• Narrow focus on types of measurements

• Narrow focus on types of impact

• The RAE/REF has had a huge negative effect on the overall contribution of research in the UK – lack of innovation and of opening up new areas; lack of major projects (books); lack of engaged research trying to deal with the real problems of our society and environment

• Concentration on peer review and rejection of bibliometrics (current REF Panel) – leads to maintenance of the status quo – “The Golden Triangle”

• Should we stop now and develop a system that aims to evaluate quality in a variety of forms, a variety of media, through a variety of measures with the ultimate goal of answering significant questions?

Page 20

N. Adler and A.-W. Harzing, 2009, “When Knowledge Wins: Transcending the Sense and Nonsense of Academic Rankings”, Academy of Management Learning and Education, 8, 1, pp. 72-95

S. Hussain, 2013, “Journal Fetishism and the ‘Sign of the 4’ in the ABS Guide: A Question of Trust?”, Organization (online publication)

S. Hussain, 2011, “Food for Thought on the ABS Academic Journal Quality Guide”, Accounting Education, 20, pp. 545-559

H. Willmott, 2011, “Journal List Fetishism and the Perversion of Scholarship: Reactivity and the ABS List”, Organization, 18, 4, pp. 429-442

J. Mingers and L. Leydesdorff, 2013, “Identifying Research Fields within Business and Management: A Journal Cross-Citation Analysis”, available from http://arxiv.org/abs/1212.6773

J. Mingers and H. Willmott, 2012, “Taylorizing Business School Research: On the ‘One Best Way’ Performative Effects of Journal Ranking Lists”, Human Relations, DOI: 10.1177/0018726712467048

J. Mingers, K. Watson and M. P. Scaparra, 2012, “Estimating Business and Management Journal Quality from the 2008 Research Assessment Exercise in the UK”, Information Processing and Management, 48, 6, pp. 1078-1093, http://dx.doi.org/10.1016/j.ipm.2012.01.008

J. Mingers, F. Macri and D. Petrovici, 2011, “Using the h-index to Measure the Quality of Journals in the Field of Business and Management”, Information Processing and Management, 48, 2, pp. 234-241, http://dx.doi.org/10.1016/j.ipm.2011.03.009

J. Mingers, 2009, “Measuring the Research Contribution of Management Academics Using the Hirsch-Index”, Journal of the Operational Research Society, 60, 8, pp. 1143-1153, DOI: 10.1057/jors.2008.94

J. Mingers and A.-W. Harzing, 2007, “Ranking Journals in Business and Management: A Statistical Analysis of the Harzing Database”, European Journal of Information Systems, 16, 4, pp. 303-316, http://www.palgrave-journals.com/ejis/journal/v16/n4/pdf/3000696a.pdf

J. Mingers and Q. Burrell, 2006, “Modelling Citation Behavior in Management Science Journals”, Information Processing and Management, 42, 6, pp. 1451-1464

H. Morris, C. Harvey, A. Kelly and M. Rowlinson, 2011, “Food for Thought? A Rejoinder on Peer Review and the RAE 2008 Evidence”, Accounting Education, 20, pp. 561-573

D. Tourish, 2011, “Leading Questions: Journal Rankings, Academic Freedom, and Performativity: What Is or Should Be the Future of Leadership?”, Leadership, 7, pp. 367-381

COPIOR JOURNAL LIST: http://www.copior.ac.uk/Journallist.aspx