An unhealthy obsession with Global University
Rankings?
A briefing paper
By
Lee Zi Sheng
&
Ong Kian Ming
8 September 2017
Table of Contents
1.0 Introduction
2.0 The performance of Malaysian universities in the THES-QS university ranking system (2004 to 2009)
3.0 The performance of Malaysian universities in the QS ranking system (2010 onwards)
3.1 Reliability of the QS Rankings – An indicator of quality?
3.2 Reliability of the QS Rankings: Can the Academic and Employer Reputation Surveys be manipulated?
3.3 Reliability of the QS Rankings: Scrutinising the Faculty to Student Ratio (FSR) Scores
3.4 Reliability of QS World University Rankings by Subject – Reputation and Transparency
4.0 The performance of Malaysian universities under other university ranking systems
4.1 Shanghai Jiao Tong Academic Ranking of World Universities (Shanghai ARWU)
4.2 Times Higher Education (THE) World University Rankings
4.3 US News & World Report Best Global Universities Rankings
4.4 Should these ranking systems be considered?
4.5 Should we rely on any world university ranking system?
5.0 MyRA, SETARA and home-grown solutions
6.0 What then? Summary and Conclusion
References
Appendix: Score Breakdown of Malaysian Universities in the QS World University Rankings by Subject
List of Figures
Figure 1: MOHE Facebook posting on the rise in the QS rankings by the top 5 public research universities in Malaysia
Figure 2: UM QS Ranking and Scores, 2013 to 2018
Figure 3: UKM QS Ranking and Scores, 2013 to 2018
Figure 4: UTM QS Ranking and Scores, 2014 to 2018
Figure 5: USM QS Ranking and Scores, 2013 to 2018
Figure 6: UPM QS Ranking and Scores, 2013 to 2018
Figure 7: The Former Vice-Chancellor congratulating UM on being ranked in the top 50 in 5 subjects
Figure 8: Breakdown of weightages according to each QS Subject Ranking, showing the importance of Academic Reputation by weightage
Figure 9: UM QS Subject Ranking and Scores, 2015-2017 for Arts & Humanities
Figure 10: UM QS Subject Ranking and Scores, 2014-2017 for Life Sciences & Medicine
Figure 11: Performance of UM in different World University Rankings, 2010-2017
Figure 12: Performance of USM in different World University Rankings, 2010-2017
Figure 13: Box E-1, University Rankings, Malaysia Education Blueprint (HE) 2015-2025
Figure 14: Quality, Malaysia Education Blueprint (HE) 2015-2025
List of Tables
Table 1: Performance of Malaysian Universities in the THES-QS World University Rankings from 2004-2009
Table 2: Universiti Malaya’s scores for the 2004 THE-QS Ranking Indicators
Table 3: Universiti Malaya’s scores for the 2005 THE-QS Ranking Indicators
Table 4: Rank of Malaysia’s top 5 research universities from 2010-2018 in the QS World University Rankings
Table 5: Breakdown of component scores of the QS rankings for UM, UPM, UKM, UTM and USM, 2013 to 2018
Table 6: Percentage of Malaysian Respondents in the Academic and Employer Surveys
Table 7: Total Full-Time Student Enrolment Statistics for QS Ranked Malaysian Institutions by the MoHE (2015), the QS Intelligence Unit (2018 Rankings) and the University Official Websites
Table 8: Total Academic Staff Statistics for QS Ranked Malaysian Institutions by the MoHE (2015), the QS Intelligence Unit and the University Official Websites
Table 9: Faculty to Student Ratio (Rounded off) Statistics for QS Ranked Malaysian Institutions Calculated According to the Full-Time Student Enrolment and Academic Staff Statistics and Normalized QS World University Ranking Faculty to Student Ratio Scores for 2015/2016 and 2018
Table 10: Performance of Malaysian Universities in the Shanghai Jiao Tong Academic Ranking of World Universities
Table 11: Performance of Malaysian Universities in the Times Higher Education World University Rankings
Table 12: Performance of Malaysian Universities in the US News Best Global Universities Rankings
1.0 Introduction
Global university rankings have risen in prominence over the last decade and their importance and
influence will likely grow in the future. Starting with the Times Higher Education-Quacquarelli Symonds
(better known as the THES-QS) World University Rankings in 2004, global university rankings of various
sorts have proliferated to the scale of a small cottage industry. They have evolved to become such
important measures of institutional quality and reputation in the public sphere that policymakers in
Malaysia are now using ranking performance as improvement targets for our public universities.
According to the Malaysian Education Blueprint (Higher Education) 2015-2025, the Ministry of Higher
Education “aims to place one university in Asia’s Top 25, two in the Global Top 100 and four in the Global
Top 200” by 2025, as measured by the QS World University Rankings.1
More recently, the Ministry of Higher Education (MoHE) has been busy promoting the rise in the global
rankings of our public universities as measured by the QS World University Rankings. Five Malaysian public
universities were among the top 300 in the latest QS rankings, with University Malaya (UM) being placed
highest with a ranking of 114 in 2017/2018, followed by Universiti Putra Malaysia (UPM), Universiti
Kebangsaan Malaysia (UKM), Universiti Teknologi Malaysia (UTM) and Universiti Sains Malaysia (USM).
All five of these research universities experienced significant increases in the latest ranking, with UKM, for example, jumping 72 places from 302 to 230, and USM jumping 66 places from 330 to 264 (see Figure 1 below).2
1 Malaysia, Ministry of Education, Malaysia Education Blueprint 2015-2025 (Higher Education) (Putrajaya: Kementerian Pendidikan Malaysia, 2015), E-5.
2 Ministry of Higher Education, Facebook, June 14, 2017, https://www.facebook.com/moheofficial/photos/pcb.1290338701065126/1290331181065878/?type=3&theater.
Figure 1: MOHE Facebook posting on the rise in the QS rankings by the top five public research universities in Malaysia
Should the MoHE be lauding the rise in QS rankings among Malaysian universities? Is this praise justified?
Does the rise in the rankings truly indicate an increase in the quality of teaching and the quality of research
in our public universities? Are there better alternatives, apart from these global university rankings, to
measure the standards and performance of our public universities? These are some of the questions
which we hope to answer in this briefing paper.
Firstly, we review the past performance of Malaysian universities under the THES-QS ranking system. We
will then proceed to analyse the methodology and components of the QS rankings and to explain the
reasons for the recent rise in the rankings of Malaysian universities. We then compare Malaysia’s
performance in other well-known global university rankings such as the Academic Ranking of World
Universities (ARWU) rankings, previously known as the Shanghai Jiao Tong rankings, the Times Higher
Education (THE) rankings and the US News and World Report Best Global Universities Ranking. We then
proceed to examine the strengths and weaknesses of Malaysia’s own university evaluation system,
SETARA, and suggest ways to improve this ranking. We conclude with some recommendations on why and how we should move away from an overdependence on global university ranking systems and focus instead on other metrics and processes to improve the quality of teaching and research in our public universities. All raw ranking data used in this paper have been openly obtained from the websites of the respective ranking agencies (QS, THE, the Shanghai Ranking Consultancy (ARWU) and the US News & World Report). The analysis of the performance of Malaysian universities in each ranking list covers different
time frames, beginning from the years where Malaysian universities first appear on the list up to the latest
available data.
2.0 The performance of Malaysian universities in the THES-QS
university ranking system (2004 to 2009)
The first worldwide university ranking system to capture public attention was the ranking produced by
the Times Higher Education (THE) and Quacquarelli Symonds (QS) in 2004, which was published in the
Times Higher Education Supplement. The components of this ranking system in 2004 were as follows: (i) a Peer Review score (50%), (ii) a Faculty / Student score (20%), (iii) a Citations / Faculty score (20%), (iv) an International Faculty score (5%) and (v) an International Student score (5%). In 2005, a Recruiter Review (10%) component was added.
This THES-QS ranking was produced until 2009 when both companies decided to go their separate ways
and publish their own rankings. Table 1 below summarizes the performance of Malaysian universities in
the THES-QS rankings from 2004 to 2009. The inaugural THE-QS rankings received significant coverage in Malaysia mostly because the oldest and most prestigious public university, the University of Malaya (UM), was ranked 89th, a feat which led the then Vice-Chancellor, Prof Dato’ Dr Hashim Yaacob, to inundate UM with buntings hailing this ‘achievement’.3
Table 1: Performance of Malaysian Universities in the THES-QS World University Rankings from 2004-2009
 | 2004 | 2005 | 2006 | 2007 | 2008 | 2009
Universiti Malaya (UM) | 89 | 169 | 192 | 246 | 230 | 180
Universiti Sains Malaysia (USM) | 111 | - | - | 307 | - | -
Universiti Kebangsaan Malaysia (UKM) | - | 289 | 185 | 309 | - | -
Universiti Putra Malaysia (UPM) | - | - | - | 364 | - | -
Tony Pua and Dr Ong Kian Ming, who were both education bloggers then, and are now Members of
Parliament for PJ Utara and Serdang respectively, speculated that the strong showing in the THE-QS
rankings by UM (89th) and Universiti Sains Malaysia (USM) (111th) was due to the misclassification of
3 “The Unbelievable Professor Hashim Yaacob,” Letters, Malaysiakini, November 08, 2005.
Chinese and Indian Malaysian students as international students, which significantly boosted the
international student score of both universities.4 This was later corrected by THE-QS as part of the process
of “clarification of data”.5 As a result of this correction, UM’s ranking fell from 89 in 2004 to 169 in 2005. This can be clearly seen in the international student score, which fell from 68 in 2004 to 12 in 2005. As a result of these adjustments, the overall score for UM fell from 166.4 in 2004 to 23.5 in 2005 (see Tables 2 and 3 below).
Table 2: Universiti Malaya’s scores for the 2004 THE-QS Ranking Indicators

Peer Score (50%) | International Faculty Score (5%) | International Student Score (5%) | Faculty / Student Score (20%) | Citations / Faculty Score (20%) | Overall Score
50 | 29 | 68 | 15 | 0 | 166.4
Table 3: Universiti Malaya’s scores for the 2005 THE-QS Ranking Indicators

Peer Score (40%) | Recruiter Review (10%) | International Faculty Score (5%) | International Student Score (5%) | Faculty / Student Score (20%) | Citations / Faculty Score (20%) | Overall Score
33 | 0 | 12 | 7 | 8 | 1 | 23.5
This episode highlights an important weakness of this ranking system: it can be ‘gamed’ by hiring more international faculty and/or accepting more international students to increase ranking performance. This can make a university’s performance volatile over a short period, showing that a sharp rise or fall in ranking may not necessarily correspond with a drastic change in institutional quality or performance.
Richard Holmes, the academic behind a respected blog6 which examines various university ranking systems, lists a number of major shortcomings7 of the THE-QS rankings, some of which continue to plague the recent QS rankings, as we shall see below.
3.0 The performance of Malaysian universities in the QS ranking
system (2010 onwards)
4 Tony Pua, “Universiti Malaya: 89th or Nowhere? (Part III),” Education in Malaysia, August 29, 2005.
5 Richard Holmes, “Universiti Malaya Again,” University Ranking Watch, August 25, 2012.
6 Richard Holmes, University Ranking Watch, http://rankingwatch.blogspot.my.
7 Richard Holmes, “The THE-QS World University Rankings, 2004 – 2009,” Asian Journal of University Education 6, no. 1 (January 2010): 91-113.
Since the split between THE and QS, the top 5 research universities in Malaysia have been consistently
ranked in the top 450 of the QS World University Rankings. Malaysian universities have generally been on
an upward trend in the rankings since 2013.
Table 4: Rank of Malaysia’s top 5 research universities from 2010-2018 in the QS World University Rankings

 | 2010 | 2011 | 2012-2013 | 2013-2014 | 2014-2015 | 2015-2016 | 2016-2017 | 2018
UM | 207 | 167 | 156 | 167 | 151 | 146 | 133 | 114
UKM | 263 | 279 | 261 | 269 | 259 | 312 | 302 | 230
USM | 309 | 335 | 326 | 355 | 309 | 289 | 330 | 264
UPM | 319 | 358 | 350 | 411-420 | 376 | 331 | 270 | 229
UTM | 365 | 401-450 | 358 | 355 | 294 | 303 | 288 | 253

Green figures in bold and italics show an increase in ranking from the previous year
As shown in Table 4 above, the rankings for all five research universities fluctuate from 2010 to 2015-
2016. In 2016-2017, four out of five of these universities (with the exception of USM) experienced an
improvement in their ranking. In 2018, all 5 universities experienced an improvement in their rankings
leading to all 5 universities being placed in the top 300 for the first time in the QS rankings’ history.
To understand the main factors behind the changes in the rankings for these top five research universities
in Malaysia, it is necessary to examine the components of the QS ranking system and the changes in the
scores of these components. Surprisingly, the components of the QS ranking and the weightage of these
components have not changed significantly from the methodology used in the THE-QS rankings. One
would have thought that QS would have used the opportunity of its separation from THE to update its ranking components and methodology, especially since one of the main reasons for the separation was disagreement between the two parties over the quality of assessment.8 Instead, the components and
the weightage remain largely the same with academic reputation9 accounting for 40% of the total score,
the employer reputation accounting for 10% of the score, the faculty/student ratio accounting for 20% of
8 “Leader: Only the best for the best,” Times Higher Education (THE), November 05, 2009.
9 The QS reputation scores are collected via surveys: from academics/educators for the Academic Reputation score and from employers for the Employer Reputation score. The questionnaire essentially asks respondents to list universities that are good/reputable in their field of expertise. The argument put forth later in this paper is that allowing reputation to influence rankings is flawed: if people refer to rankings to gauge reputation, then established top schools simply reinforce their status, and when the surveys are conducted for the next round the answers are backed by knowledge of previous rankings, hence the catch-22 situation. THE publishes a separate World Reputation Rankings, so reputation scores do not influence its World University Rankings.
the score, the citations per faculty10 accounting for 20% of the score and the international faculty and
international student ratios making up the final 10%, each accounting for 5% of the component score. The
breakdown of the component scores for the five research universities in Malaysia is given in Table 5 below.
Table 5: Breakdown of component scores of the QS rankings for UM, UPM, UKM, UTM and USM, 2013 to 2018

Year | Rank | Academic Reputation (40%) | Employer Reputation (10%) | Faculty Student Ratio (20%) | Citations per Faculty (20%) | International Faculty (5%) | International Students (5%) | Overall (out of 100)

UM
2013-14 | 167 | 58.8 | 60.8 | 90 | 7.5 | 77.3 | 75.2 | 56.9
2014-15 | 151 (16) | 64.4 (5.6) | 64 (3.2) | 93.1 (3.1) | 11.1 (3.6) | 79.2 (1.9) | 77 (1.8) | 61 (4.1)
2015-16 | 146 (5) | 62.0 (2.4) | 56.8 (7.2) | 94.1 (1.0) | 23.8 (12.7) | 80.6 (1.4) | 77.3 (0.3) | 62.1 (1.1)
2016-17 | 133= (13) | 55.7 (6.3) | 48 (8.8) | 92.2 (1.9) | 20.3 (3.5) | 77.7 (2.9) | 70.5 (6.8) | 57.1 (5.0)
2018 | 114 (19) | 65.7 (10) | 57.5 (9.5) | 87.8 (4.4) | 24.3 (4.0) | 65.4 (12.3) | 59.7 (10.8) | 60.8 (3.7)

UPM
2013-14 | 411-420 | - | - | 44 | 6.1 | 13.1 | 39.4 | 32
2014-15 | 376 (35-44) | 45.7 | 36.4 | 43.2 (0.8) | 9.8 (3.7) | 31.3 (18.2) | 44.8 (5.4) | 36.4 (4.4)
2015-16 | 331 (45) | 43.5 (2.2) | 33.0 (3.4) | 48.1 (4.9) | 20.8 (11.0) | 44.5 (13.2) | 57.2 (12.4) | 39.7 (3.3)
2016-17 | 270 (61) | 43.2 (0.3) | 25.5 (7.5) | 48.9 (0.8) | 16.4 (4.4) | 45.2 (0.7) | 71.8 (14.6) | 38.8 (0.9)
2018 | 229 (41) | 49 (5.8) | 34.4 (8.9) | 56 (7.1) | 17.2 (0.8) | 43 (2.2) | 73 (1.2) | 43.6 (4.8)

UKM
2013-14 | 269 | 48 | 36.5 | 58.5 | 5.7 | 93 | 44.6 | 42.7
2014-15 | 259 (10) | 52.1 (4.1) | 39.5 (3.0) | 60.4 (1.9) | 7.9 (2.2) | 95.1 (2.1) | 42.5 (2.1) | 45.4 (2.7)
2015-16 | 312 (53) | 49.0 (3.1) | 34.3 (5.2) | 50.1 (10.3) | 14.0 (6.1) | 72.9 (22.2) | 33.7 (8.8) | 41.3 (4.1)
2016-17 | 302= (10) | 40 (9.0) | 26.2 (8.1) | 53.9 (3.8) | 11.1 (2.9) | 55 (17.9) | 44 (10.3) | 36.7 (4.6)
2018 | 230= (72) | 49.6 (9.6) | 35.1 (8.9) | 66.5 (12.6) | 11.9 (0.8) | 44.7 (10.3) | 39.9 (4.1) | 43.4 (6.7)

UTM
2013-14 | 355 | 32.3 | 40.6 | 62.2 | 3.2 | 37.1 | 79.6 | 36
2014-15 | 294 (61) | 40.5 (8.3) | 37.6 (3.0) | 70.5 (8.3) | 4.3 (1.1) | 60.6 (23.5) | 76.7 (2.9) | 41.9 (5.9)
2015-16 | 303 (9) | 32.8 (7.7) | 35.1 (2.5) | 77.9 (7.4) | 12.8 (8.5) | 74.0 (13.4) | 66.4 (9.7) | 41.9
2016-17 | 288 (15) | 26.7 (6.1) | 30 (5.1) | 82.9 (5.0) | 10.7 (2.1) | 56.8 (17.2) | 41.3 (25.1) | 37.4 (4.5)
2018 | 253 (35) | 32.8 (6.1) | 38.9 (8.9) | 85.9 (3.0) | 12.5 (1.8) | 35.2 (21.6) | 51.4 (10.1) | 41.1 (3.7)

USM
2013-14 | 355 | 48.1 | 42.6 | 33.8 | 13.2 | 20.3 | 39.3 | 36
2014-15 | 309 (46) | 55.3 (7.2) | 46.4 (3.8) | 39.9 (6.1) | 17.4 (4.2) | 24.5 (4.2) | 30 (9.3) | 41 (5.0)
2015-16 | 289 (20) | 50.2 (5.1) | 41.1 (5.3) | 44.3 (4.4) | 36.7 (19.3) | 29.9 (5.4) | 27.5 (2.5) | 43.4 (2.4)
2016-17 | 330 (41) | 40.3 (9.9) | 34.7 (6.4) | 36.8 (7.5) | 25.9 (10.8) | 23.4 (6.5) | 25.8 (1.7) | 34.6 (8.8)
2018 | 264 (66) | 48.2 (7.9) | 43.8 (9.1) | 46.7 (9.9) | 22.1 (3.8) | 27.2 (3.8) | 27.5 (1.7) | 40.2 (5.6)

Change from the previous ranking is shown in brackets. Figures in green show increase from previous rankings, figures in red show decrease. Largest increase/smallest decrease in score for the year is highlighted in yellow.

10 The citations per faculty score can also be easily “gamed” through, for example, the appointment of “academic icons” whereby some of their research papers list their affiliations with local public universities and are counted as faculty citations for those universities.
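As a rough check on the internal consistency of Table 5, the overall score should be approximately the weighted sum of the six component scores. Below is a minimal sketch in Python (our own illustrative calculation using UM's 2018 figures from Table 5; small deviations from the published overall are expected because the published components are themselves rounded):

```python
# Recompute UM's 2018 overall QS score as the weighted sum of its six
# component scores from Table 5, using the weightages stated in the text.

weights = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

# UM's 2018 component scores from Table 5
um_2018 = {
    "academic_reputation": 65.7,
    "employer_reputation": 57.5,
    "faculty_student_ratio": 87.8,
    "citations_per_faculty": 24.3,
    "international_faculty": 65.4,
    "international_students": 59.7,
}

overall = sum(weights[k] * um_2018[k] for k in weights)
print(f"{overall:.2f}")  # close to UM's published 2018 overall score of 60.8
```

The weighted sum comes out within roughly a tenth of a point of the published overall score, which suggests the overall column in Table 5 is simply the weighted combination of the published (rounded) component scores.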
3.1 Reliability of the QS Rankings – An indicator of quality?
How reliable are these rankings as an indication of the quality of education and research at Malaysian
universities as compared to their counterparts in other countries? Have the QS scores for the Malaysian
universities improved in conjunction with the improvement in rankings? If we take the QS scores at face
value, then we must examine the relationship between the QS scores and rankings with regard to the five public universities that feature in the QS rankings, namely UM, UKM, USM, UTM and UPM.
Firstly, although the UM QS ranking has improved steadily from 167 in 2013/2014 all the way up to 114 in the latest 2018 rankings, UM’s QS score of 60.8 in 2018 is below its score of 62.1 in 2015/2016, when it was ranked 146. Furthermore, UM’s score decreased from 62.1 in 2015/2016 to 57.1 in 2016/2017 even as its ranking improved from 146 to 133 over the same period. (See Figure 2 below)
Figure 2: UM QS Ranking and Scores, 2013 to 2018
Secondly, although UKM’s QS ranking reached a high of 230 in 2018, its QS Score of 43.4 in 2018 is still
below its QS Score of 45.4 in 2014/2015 when it was ranked 259. (See Figure 3 below)
Figure 3: UKM QS Ranking and Scores, 2013 to 2018
Thirdly, UTM’s QS score fell from 41.9 in 2015/2016 to 37.4 in the following year in 2016/2017. Despite
this relatively significant drop in its QS scores, UTM’s ranking actually improved from 303 in 2015/2016 to
288 in 2016/2017, an improvement of 15 spots. Even though UTM’s QS scores bounced back up the
following year to 41.1 in 2018 and its ranking improved further to 253, it has yet to achieve a higher QS
score compared to what it achieved in 2015/2016. (See Figure 4 below)
Figure 4: UTM QS Ranking and Scores, 2014 to 2018
Fourthly, USM’s QS score reached a high of 43.4 in 2015/2016 when it was ranked 289, but its 2018 ranking improved to 264 even though its 2018 QS score of 40.2 was lower than its 2015/2016 score (See Figure 5 below).
Figure 5: USM QS Ranking and Scores, 2013 to 2018
Indeed, the only university whose QS score and ranking moved consistently was UPM.
UPM’s score improved steadily from 32 in 2013-2014 to a high of 43.6 in 2018. At the same time, its
ranking has also improved steadily from 411-420 in 2013/2014 to 229 in 2018. (See Figure 6 below)
Figure 6: UPM QS Ranking and Scores, 2013 to 2018
In other words, while the improvement in QS scores and rankings might seem indicative of rising quality of education and research at our top public universities, these metrics do not reflect a consistent trend of improvement, especially when it comes to the scores. A university’s score can decrease at the same time as its ranking improves, and vice versa.
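This score/rank divergence can be checked mechanically. A minimal sketch, using UM's rank and overall score from Table 5 (the helper function is our own, not part of any QS tooling):

```python
# Flag years where a university's QS score and rank moved in opposite
# directions. Note: a lower rank number is better.

um = [
    ("2013-14", 167, 56.9),
    ("2014-15", 151, 61.0),
    ("2015-16", 146, 62.1),
    ("2016-17", 133, 57.1),
    ("2018",    114, 60.8),
]

def divergent_years(series):
    """Return the years in which rank improved while score fell, or vice versa."""
    flagged = []
    for (_, prev_rank, prev_score), (year, rank, score) in zip(series, series[1:]):
        rank_improved = rank < prev_rank      # smaller rank number = better
        score_improved = score > prev_score
        if rank_improved != score_improved:
            flagged.append(year)
    return flagged

print(divergent_years(um))  # UM's rank improved in 2016-17 even as its score fell
```

Running the same check on the other four universities' rows in Table 5 reveals similar divergences, since a rank depends on how every other institution scored that year, not on the score alone.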
3.2 Reliability of the QS Rankings: Can the Academic and Employer Reputation Surveys be
manipulated?
The Academic and Employer Reputation Surveys have been the most controversial indicators in the QS
World University Rankings methodology. Together, these two indicators make up 50% of the QS ranking.
The subjectivity of survey responses poses serious credibility concerns for any ranking methodology and
especially one where they make up half of the ranking measure. Richard Holmes writes: “Some critics
object in principle to the use of such subjective measures in rankings, on the grounds that they reflect
past, not current performance, that they are based on stereotype or even ignorance, and that a good or
bad reputation may be mindlessly replicated.”11 Despite the criticism, QS has maintained both indicators,
choosing to refine the surveys to minimise bias and manipulation, as these two components set the QS
ranking apart from any other list. Yet, as we shall see, the indicators still contain serious problems and are
not foolproof.
11 Richard Holmes, "The THE-QS World University Rankings, 2004 – 2009," Asian Journal of University Education 6, no. 1 (January 2010): 107.
The first concern regarding the reputation surveys is that they are subject to a voluntary response bias. In
2010, QS launched its academic sign-up initiative as a way for academics to signal their interest in taking part in the Academic Reputation Survey. Since its launch, over 25,000 academics have signed up to participate in the survey.12 For the Employer Reputation Survey, institutions have also been invited to submit lists of employers.13 While this has increased the sample size of respondents, it has also distorted the geographical distribution of respondents relative to the percentage of universities in certain countries, allowing smaller countries like Malaysia to provide a higher percentage of responses than both the percentage of Malaysian universities in the ranking sample and the population in general.
Table 6 below shows the percentage of Malaysian respondents in both the Academic and Employer Reputation Surveys.
Table 6: Percentage of Malaysian Respondents in the Academic and Employer Surveys

Year | % of Malaysian Respondents (Academic Survey) | % of Malaysian Respondents (Employer Survey)
2012-13 | 1.70 (741) | 0.60 (142)
2014-15 | 1.31 (739) | 1.14 (207)
2016-17 | 2.70 (2160) | 1.50 (658)
2018 | 3.70 | 0.90

Number of respondents in brackets.
Source: QS Intelligence Unit, QS World University Rankings Country Reports.
Note: The 2015/2016 Country Report is unavailable for Malaysia.
The percentage of Academic Reputation Survey respondents from Malaysia has increased over the years even as the total number of respondents has grown. The percentage of Employer Reputation Survey respondents has also increased in recent years, except in the latest rankings, where it fell from 1.50% to 0.90%. The representation of Malaysian academics in the academic survey is unduly large: Malaysia makes up merely 0.41% of the world’s population, yet it accounts for 3.7% of academic survey respondents. The percentage of Malaysian respondents in the academic survey is even greater than that of countries like China (1.7%), Germany (2.9%) and Japan (3.2%). This raises the possibility that the survey can be ‘gamed’ to improve the scores of Malaysian universities. In a ranking system where a difference of 2 points in the ranking score can mean a fall or a rise in ranking
12 “Academic Reputation,” QS Intelligence Unit.
13 “Employer Reputation,” QS Intelligence Unit.
spots by as much as 10 places, an increase of 1 or 2 percentage points in the share of Malaysian respondents can make a significant difference.
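The degree of over-representation can be expressed as a simple ratio of survey share to population share. The input percentages are those cited above; the factor itself is our own illustrative calculation:

```python
# How over-represented are Malaysian academics in the Academic Reputation
# Survey? Compare Malaysia's share of survey respondents with its share
# of the world's population (2018 figures as cited in the text).

malaysia_survey_share = 3.7  # % of Academic Reputation Survey respondents
malaysia_pop_share = 0.41    # % of world population

factor = malaysia_survey_share / malaysia_pop_share
print(f"Over-representation factor: {factor:.1f}x")
```

On these figures, Malaysian academics are roughly nine times over-represented in the academic survey relative to Malaysia's share of the world's population.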
The second concern regarding the surveys is the credibility of the respondents. As disclosed by the QS
Intelligence Unit, respondents for the Academic Reputation Survey are sourced from subscribers and
contacts of World Scientific and Mardev-DM2,14 who are not necessarily contributors to the databases.
This can potentially allow unqualified respondents to participate in the survey, as rank and experience
are self-declared by respondents. Administrators, presidents/chancellors, functional managers and
“others” have been profiled as respondents, yet some may not have had significant experience in
academia. At the same time, employers may be guilty of over-reliance on past rankings to recruit graduates. This creates a catch-22 situation: selective recruitment policies mean employers are not exposed to a sufficiently diverse pool of job applicants, which in turn reduces the diversity and objectivity of their survey responses. Again, this provides opportunities for universities to manipulate the results of the survey.
3.3 Reliability of the QS Rankings: Scrutinising the Faculty to Student Ratio (FSR) Scores
The Faculty to Student Ratio (FSR) indicator is used by the QS Intelligence Unit to evaluate teaching quality
in universities, where a small ratio represents a “commitment to teaching” and correlates with higher
teaching quality.15 Strong improvements in FSR scores helped propel UKM and USM up the latest 2018 rankings (by 72 and 66 places respectively). In fact, all of our ranked universities, with the
exception of UM, received improved FSR scores.16 The latest official statistics for total student enrolment
and academic staff published by the Ministry of Higher Education (MoHE) in 2015,17 QS Intelligence Unit
and the respective official websites of each university are compared below.
Table 7: Total Full-Time Student Enrolment Statistics for QS Ranked Malaysian Institutions by the MoHE (2015), the QS Intelligence Unit (2018 Rankings) and the University Official Websites

Institution | MoHE | QS Intelligence Unit | Official University Website
UM | 27452 | 17902 | 17580
USM | 30853 | 20955 | NA
UTM | 31066 | 17419 | 17419
UKM | 27239 | 18094 | 26961
UPM | 30670 | 20924 | 24879

14 “Academic Reputation,” QS Intelligence Unit.
15 “Faculty Student Ratio,” QS Intelligence Unit.
16 FSR scores are also influenced by the type of faculty. Science faculties have smaller classes compared to the social sciences because of the need for laboratory practicals. If a university such as UM shifts to more Science and Engineering based programmes, it would ‘perform’ better according to the FSR.
17 Malaysia, Ministry of Education, Malaysia Educational Statistics 2016 (Putrajaya: Kementerian Pendidikan Malaysia, 2015), 162.
Table 8: Total Academic Staff Statistics for QS Ranked Malaysian Institutions by the MoHE (2015), the QS Intelligence Unit and the University Official Websites

Institution | MoHE | QS IU | Official Website
UM | 2177 | 2755 | 2807
USM | 1963 | 2318 | NA
UTM | 1965 | 2613 | 2613
UKM | 2159 | 2460 | NA
UPM | 1812 | 2334 | 2203
Tables 7 and 8 above show that the figures provided by QS and those published on the official websites differ significantly from the statistics provided by the MoHE in 2015 (the latest available statistics from the Higher Education Ministry). These discrepancies weaken the reliability of the QS and website data, especially since there are no supporting government statistics available for verification. The total student enrolment figures used by QS are consistently lower across all of the ranked universities, with differences ranging from 9,145 (UKM) to 13,647 students (UTM). The figures stated on the UTM website are the same as the figures used by QS, while the figures stated by UM, UPM and UKM are different; the figure from USM could not be verified. In contrast, the total academic staff figures used by QS are consistently higher, with differences in the range of 301 to 648 staff, although the figures from USM and UKM could not be verified. This combination of lower total student enrolment and higher total academic staff figures skews the faculty to student ratio in favour of the universities in the latest rankings.
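The enrolment gaps quoted above can be checked directly from Table 7. A minimal sketch in Python, using the figures copied from the table:

```python
# Full-time student enrolment from Table 7: (MoHE 2015, QS Intelligence Unit).
enrolment = {
    "UM":  (27452, 17902),
    "USM": (30853, 20955),
    "UTM": (31066, 17419),
    "UKM": (27239, 18094),
    "UPM": (30670, 20924),
}

# The QS figure is lower in every case; compute the gap per university.
gaps = {uni: mohe - qs for uni, (mohe, qs) in enrolment.items()}

# The smallest and largest gaps bound the range cited in the text:
# 9,145 (UKM) to 13,647 (UTM).
smallest = min(gaps, key=gaps.get)
largest = max(gaps, key=gaps.get)
```

Running this confirms the range stated above, with UKM showing the smallest gap and UTM the largest.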
Table 9: Faculty to Student Ratio (Rounded off) Statistics for QS Ranked Malaysian Institutions Calculated According to the Full-Time Student Enrolment and Academic Staff Statistics and Normalized QS World University Ranking Faculty to Student Ratio Scores for 2015/2016 and 2018
Institution MoHE (2015) QS IU Official Website QS 2015/2016 QS 2018
UM 1:13 1:6 1:6 94.1 87.8
USM 1:16 1:9 NA 44.3 46.7
UTM 1:16 1:7 1:7 77.9 85.9
UKM 1:13 1:7 NA 50.1 66.5
UPM 1:17 1:9 1:11 48.1 56
A simple formula of total student enrolment divided by total academic staff was used to calculate the
faculty to student ratio for each set of data, with figures rounded off to the nearest whole number. The
Faculty Student Ratio (FSR) using the 2015 figures from MOHE, from the QS Intelligence Report and from
the figures from the official website as well as the QS scores for 2015/2016 and for 2018 are listed in Table
9 above. The discrepancies are immediately clear. The difference in the FSR between UM and the other universities is not significant even using the statistics from QS IU. The MoHE (2015) figures show a 1:13 FSR for UM, the same as UKM's. And yet, UM's FSR score in the QS 2015/2016 ranking was 94.1, nearly twice as high as UKM's score of 50.1 and more than twice USM's score of 44.3! The QS 2015/2016 scores seem to indicate that UM has twice as many faculty members per student compared to UKM and USM. But this is clearly not reflected in the MoHE official statistics, and these discrepancies remained in the QS 2018 ranking. All of the universities in Table 9 above experienced an increase in their QS FSR scores from 2015/2016 to 2018 with the exception of UM. But these scores still show UM's FSR score to be significantly higher than those of the other universities (with the exception of UTM), a result that is not reflected in the MoHE statistics.
With inconsistent data and puzzlingly different scores despite similar ratios, the FSR scores for the Malaysian public universities are highly suspect.
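The ratio calculation described above is simple enough to reproduce. A quick sketch using the MoHE figures from Tables 7 and 8:

```python
# (Total full-time students, total academic staff) per Tables 7 and 8 (MoHE, 2015).
mohe_2015 = {
    "UM":  (27452, 2177),
    "USM": (30853, 1963),
    "UTM": (31066, 1965),
    "UKM": (27239, 2159),
    "UPM": (30670, 1812),
}

def students_per_faculty(students: int, staff: int) -> int:
    """Total student enrolment divided by total academic staff,
    rounded off to the nearest whole number, as in Table 9."""
    return round(students / staff)

# Reproduces the MoHE (2015) column of Table 9: 1:13, 1:16, 1:16, 1:13, 1:17.
ratios = {uni: students_per_faculty(s, f) for uni, (s, f) in mohe_2015.items()}
```

The same two lines applied to the QS Intelligence Unit figures reproduce the 1:6 to 1:9 ratios in the second column of Table 9, making the gap between the two data sources easy to verify.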
3.4 Reliability of QS World University Rankings by Subject - Reputation and Transparency
Malaysian universities have also been on the rise in the QS World University Rankings by Subject over the past few years. Unsurprisingly, the improvement in these subject rankings has also been used by the Ministry of Higher Education and by the individual universities themselves as 'proof' of the high quality of education in these institutions (see Figure 7 below for an example).
Figure 7: The Former Vice-Chancellor congratulating UM on being ranked in the top 50 in 5 subjects
Unfortunately, as this section aims to show, this improvement in rankings has not been supported by
consistent improvement across all ranking indicators. It is therefore premature to conclude through these
rankings that the quality of our universities has improved.
The QS World University Rankings by Subject uses different indicators and weightages from the main
World University Ranking list. While the Academic and Employer Reputation indicators are retained, the
Citations per Faculty, Faculty to Student Ratio as well as the International Student and Faculty scores have
been replaced with Citations per Paper and H-Index Citations scores. The Citations per Faculty score takes
into account all publications in Scopus regardless of language, while Citations per Paper scores only
include papers indexed in Scopus, a bibliographic database containing abstracts and citations for academic
journal articles, thus omitting local language journals that are not covered by this database.18 The Hirsch
Index, suggested by Jorge E. Hirsch, is different from usual citation scores as it aims to measure only high
impact and productive research.19 As the importance of research output differs from subject to subject,
QS introduced different weightages for each indicator in the different subject rankings, aiming to provide
a more accurate picture of less research-focused subjects. However, QS has done so by increasing the weightages of the Academic and Employer Reputation scores, which, as argued earlier, may be easily manipulated. Rankings for subjects like Art & Design, which depend entirely on the two surveys (Academic 90%, Employer 10%), call the objectivity and credibility of these rankings into serious question.
18 "Papers and Citations," QS Intelligence Unit.
19 "H Index," QS Intelligence Unit.
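The effect of these weightages can be illustrated with a small sketch. The indicator scores below are hypothetical; only the Art & Design weights (Academic 90%, Employer 10%) come from the QS methodology cited above:

```python
def composite(scores: dict, weights: dict) -> float:
    """Weighted sum of indicator scores (each on a 0-100 scale)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[k] * weights[k] for k in weights)

# Hypothetical university: strong survey scores, weak citation scores.
scores = {"academic": 85.0, "employer": 70.0, "citations_per_paper": 40.0}

# Art & Design: the two reputation surveys carry all of the weight,
# so citation performance is irrelevant to the final score.
art_design = composite(scores, {"academic": 0.9, "employer": 0.1})
```

Because only the keys named in the weights dictionary contribute, the citations score could double or halve without moving the Art & Design composite at all, which is exactly the concern raised above about survey-only subject rankings.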
Figure 8: Breakdown of weightages according to each QS Subject Ranking showing the importance of
Academic Reputation by weightage20
Once again, tabulated data indicates that the rise in the rankings is strongly linked to a consistent
improvement in the Academic Reputation Survey scores, which are subjective in nature. While ‘Citations
per Paper’ scores largely fluctuate, the ranking of Malaysian universities has improved even when their
‘Citations per Paper’ scores decrease. The ‘Employer Reputation’ and ‘H-Index Citations’ scores have
improved since 2014 and have supplemented the rise in the rankings of our universities. There are a
number of occasions where the ‘Hirsch-Index Citations’ scores have increased when the ‘Citations per
Paper’ scores decreased. The two graphs below show the performance of UM in the subject rankings of
Arts & Humanities, which is less research focused and in the subject rankings of Life Sciences & Medicine,
a research-heavy field. In both cases, Academic and Employer Reputation have propelled UM up the
20 “QS World University Rankings by Subject,” QS Intelligence Unit.
rankings despite a decrease in citation scores. For example, Figure 9 shows that the UM QS ranking for
the Arts and Humanities faculty has improved from 159 in 2015 to 85 in 2017 even though its Citations
per Paper has decreased from 83.4 in 2015 to 73.2 in 2017. In Figure 10, the Life Science and Medicine
ranking of UM has improved from 325 in 2014 to 164 in 2017 even though the Citations Per Paper score
has decreased from 73.4 in 2014 to 59.9 in 2017.
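As an aside, the 'H-Index Citations' indicator tracked in these charts is based on the Hirsch index described earlier. A minimal sketch of how the underlying index is computed over a hypothetical citation list (QS converts the raw index into a normalised score, by a method not detailed here):

```python
def h_index(citations: list) -> int:
    """Largest h such that h papers each have at least h citations
    (Hirsch's definition)."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# A hypothetical record: one heavily cited paper cannot lift the index
# on its own, which is why the measure rewards sustained, productive impact.
h_index([250, 4, 3, 1])  # -> 3
```

This is why the H-Index Citations score can rise even when Citations per Paper falls: adding many moderately cited papers lifts h while dragging down the per-paper average.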
Figure 9: UM QS Subject Ranking and Scores, 2015-2017 for Arts & Humanities
[Chart: Arts & Humanities Subject Rankings for UM (2015 to 2017), plotting Academic Reputation, Citations Per Paper and Employer Reputation scores alongside the Arts & Humanities ranking, which improved from 159 (2015) to 127 (2016) to 85 (2017).]
Figure 10: UM QS Subject Ranking and Scores, 2014-2017 for Life Sciences & Medicine
The full breakdown of scores in all broad subject areas for the top 5 Malaysian research universities is attached in the Appendix.
University administrators are increasingly cognizant of the effect that citations can have on university rankings, so much so that some of these administrators are trying to 'game' the citation scores by 'encouraging' academics to cite the work of colleagues in the same department as part of their KPIs.
Retraction Watch, a popular blog which highlights dubious academic research and publications, reported
a recent circular by Professor Ir. Dr Noor Azuan Abu Osman, the current Dean of the Faculty of Engineering
at Universiti Malaya, asking faculty members “to cite at least 3 relevant papers of colleagues in each of
your publications”.21 This practice of “Citations Stacking” does not inherently improve the quality of
research and publication in an academic institution but is a blatant way of trying to ‘game’ the system in
order to boost the ranking scores of the institution in question. Such a practice may have a more direct
effect on the QS subject rankings for specific faculties in a university (as opposed to the overall global
university rankings which encompass a larger number of faculties and a greater variety of publications).
This may explain the rationale behind the circular issued by the Dean of Engineering at Universiti Malaya.
21 http://retractionwatch.com/2017/08/22/one-way-boost-unis-ranking-ask-faculty-cite/
[Chart: Life Sciences & Medicine Subject Rankings for UM (2014 to 2017), plotting Academic Reputation, Citations Per Paper and Employer Reputation scores alongside the Life Sciences & Medicine ranking, which improved from 325 (2014) to 298 (2015), 249 (2016) and 164 (2017).]
One wonders how many more such circulars have been issued in other faculties at Universiti Malaya and
also other Malaysian public universities.
4.0 The performance of Malaysian universities under other university
ranking systems
Besides the QS World University Rankings, Malaysian universities have also been ranked in a few other
well-known world university ranking lists. We examined three of the most prominent: (i) the Academic Ranking of World Universities (ARWU), previously known as the Shanghai Jiao Tong Academic Ranking of World Universities, (ii) the Times Higher Education (THE) World University Rankings and (iii) the US News & World Report Best Global Universities Rankings. A comparison of Malaysian universities' performance under these lists and the QS rankings yields huge disparities: all three rank Malaysian universities consistently and considerably lower than QS does. Figures 11 and 12 below illustrate this disparity with the performances of UM, the best performing Malaysian university, and USM, the university selected for the MOHE's Accelerated Programme for Excellence (APEX).
Figure 11: Performance of UM in different World University Rankings, 2010-2017
[Chart: Performance of UM in different World University Rankings (2010 to 2017), showing UM's QS ranking improving from 207 (2010) to 114 (2017), its Shanghai ARWU ranking fluctuating between the 301-400 and 401-500 bands, and its US News ranking of 356 (2017).]
Figure 12: Performance of USM in different World University Rankings, 2010-2017
Figure 11 shows that UM's ranking according to the Shanghai ARWU fluctuated between the 301-400 and 401-500 bands from 2011 to 2017, while the 2017 US News & World Report Best Global Universities Rankings placed UM at 356 (UM had not participated in the THE rankings over this period and was therefore not ranked). This is a big difference from the QS rankings, in which UM improved to 114 in 2017. Meanwhile, Figure 12 shows that USM was ranked in the 601-800 range in 2016 and 2017 according to the THE rankings, between 401-500 according to the Shanghai ARWU rankings from 2014 to 2017, and 576 in the 2017 US News ranking. As with UM, USM's rankings in these lists again differ vastly from the QS rankings.
4.1 Shanghai Jiao Tong Academic Ranking of World Universities (Shanghai ARWU)
The Shanghai ARWU is widely known as a research focused ranking system, with criteria assessing the
quality of education and the faculty based on the number of alumni or faculty winning the Nobel Prize
(except the Peace and Literature awards) and the Fields Medal as well as being highly cited researchers.22
It has been published by the Centre for World-Class Universities at Shanghai Jiao Tong University (CWCU)
since 2003. Prior to the latest 2017 rankings, only three of our universities had ever been ranked, with UM making the earliest appearance among the three in 2011. UPM and UTM made their first appearance
22 "Methodology," Academic Ranking of World Universities, 2016.
[Chart: Performance of USM in different World University Rankings (2010 to 2017), showing USM's QS ranking moving from 309 (2010) to 264 (2017), its Shanghai ARWU ranking in the 401-500 band from 2014 to 2017, its THE ranking in the 601-800 band in 2016 and 2017, and its US News ranking of 576 (2017).]
this year, when the ranking list expanded from 500 to 800. Their performance here is a far cry from the QS rankings, as no Malaysian university has ever broken into the top 300 of the Shanghai ARWU ranking (see Table 10 below).
Table 10: Performance of Malaysian Universities in the Shanghai Jiao Tong Academic Ranking of World Universities
2011 2012 2013 2014 2015 2016 2017
UM 401-500 401-500 401-500 301-400 301-400 401-500 401-500
USM NA NA NA 401-500 401-500 401-500 401-500
UKM NA NA NA NA NA 401-500 501-600
UPM NA NA NA NA NA NA 501-600
UTM NA NA NA NA NA NA 701-800
4.2 Times Higher Education (THE) World University Rankings
The THE continued publishing a World University Rankings list after its split with QS but with a different
methodology and a different data provider. It now focuses on the teaching, research, citations, industry
income and international outlook of universities.23 Reputation is not included as an indicator as THE has a
separate World Reputation Rankings.24 Malaysian universities have performed poorly, with the strongest showing coming from UTM in the 401-500 bracket in 2015-2016. The other Malaysian universities featured in the THE rankings (UKM, UPM and USM) were all ranked in the 601-800 range in 2015-2016.
Even UTM fell to the 601-800 range in the 2016-2017 THE rankings. In the recently released 2017-2018 THE rankings, UTM, UKM, UPM and USM retained their 601-800 rankings, while UUM saw its ranking drop below 1000.
For the first time, UM was featured in these 2017-2018 THE rankings, making its debut in the 351-400 range (see Table 11 below).
In the 2016-2017 Asia-Pacific University Rankings released to reflect the region’s growing strength in the
higher education sector, no universities in Malaysia were ranked in the top 100.25 The highest ranked
Malaysian universities were UTM and UPM in the 121-130 range.
23 "World University Rankings," Times Higher Education (THE), September 23, 2016.
24 "World Reputation Rankings," Times Higher Education (THE), June 14, 2017.
25 "THE Asia-Pacific University Rankings 2017: ready for the next level?" Times Higher Education (THE), July 04, 2017.
Table 11: Performance of Malaysian Universities in the Times Higher Education World University Rankings
2015-16 2016-17 2017-2018
UM N/A N/A 351-400
UTM 401-500 601-800 601-800
UKM 601-800 601-800 601-800
UPM 601-800 601-800 601-800
USM 601-800 601-800 601-800
UUM N/A >800 >1000
4.3 US News & World Report Best Global Universities Rankings
The US News & World Report Best Global Universities Rankings were first published in 2014. This ranking takes into account an institution's global and regional reputation in addition to its academic research performance.26 With only 25% of the ranking based on reputation scores, Malaysian universities have not succeeded in replicating their high performance in the QS Rankings. UM is the highest ranked Malaysian university at 356, followed by USM at 576, UTM at 639, UPM at 670 and UKM at 783 (see Table 12 below).
However, UM is ranked as the 27th best engineering school in the world,27 an even better showing than its performance in the QS Subject Rankings, on the back of strong publication and citation scores as measured by Clarivate Analytics, the data provider for the US News rankings. (One wonders how much of this is driven by the citation stacking strategy highlighted earlier.)
Table 12: Performance of Malaysian Universities in the US News Best Global Universities Rankings
University 2017 (Year of Ranking)
UM 356
USM 576
UTM 639
UPM 670
UKM 783
26 "Best Global Universities Rankings," U.S. News & World Report Education, 2016.
27 "University of Malaya," U.S. News & World Report Education, 2016.
4.4 Should these ranking systems be considered?
If the quality of Malaysian universities is also reflected in these ranking systems, shouldn’t the MoHE also
use these rankings as achievement targets in addition to the QS rankings? The MoHE’s choice to focus
only on the QS World University Rankings as an indicator of institutional quality seems to be based on the
stronger performance of Malaysian universities in that ranking system. The Malaysian Education Blueprint
(HE) 2015-2025 offers the following justification:
Figure 13: Box E-1, University Rankings, Malaysia Education Blueprint (HE) 2015-202528
The MoHE’s justification of the QS ranking being ‘the oldest global ranking’ and providing a ‘broader
perspective’ is not persuasive. Firstly, an analysis of the performance of Malaysian universities across
different ranking lists would provide the MOHE with a much broader perspective of university
performance. By tracking the performance of our universities across a wider range of indicators, the
Ministry can identify weaknesses in our higher education system in specific areas. For example, the scores
in other rankings such as the THE’s Teaching Indicator, the Shanghai ARWU’s Per Capita Performance
Indicator and US News Bibliometric and Scientific Excellence Indicators can be incorporated into the
Ministry’s evaluation of the universities and help the Ministry improve the performance of universities
which show weaker scores, as measured by these indicators. In addition, if the quality of the overall
system is to be measured by research output (See Figure 14), the Ministry’s reason to disregard other
university rankings because they are ‘heavily weighted towards research outcomes’ is odd and
contradictory. It is not illogical to conclude that the MoHE’s decision is a selection bias towards favourable
results, ignoring obvious gaps in the system exposed by the other rankings. This will paint a one-sided,
unrealistic picture of the quality of our institutions.
28 Malaysia, Ministry of Education, Malaysia Education Blueprint 2015-2025 (Higher Education) (Putrajaya: Kementerian Pendidikan Malaysia, 2015), E-4.
Figure 14: Quality, Malaysia Education Blueprint (HE) 2015-202529
Nevertheless, these other ranking systems are not without their weaknesses, and universities around the world have benefitted from 'gaming' them too. Richard Holmes has pointed out issues that plague the credibility of the Shanghai ARWU and the THE World University Rankings. The Shanghai ARWU has struggled with the issue of secondary affiliation. A case in point is the massive recruitment of adjunct faculty by King Abdulaziz University (KAU), which propelled it to be ranked as the top university in the world for publications in mathematics.30 The research of these adjuncts contributed to KAU's scores in the 'number of faculty members who are highly cited researchers' category and to its publications in Nature and Science, even though they were only loosely affiliated with KAU.31 Malaysian universities that wish to improve their research scores could 'game' the system in this manner by giving short-term fellowships to overseas academics with impressive publication records. The THE rankings, on the other hand, have struggled with self-citations by researchers and with specialized institutions. Veltech University of India was ranked as the best in Asia for research impact with a citation score of 100,32 but this has been speculated to be the result of one researcher repeatedly citing himself.33 UTAR has also been ranked as the best university in Malaysia for research impact on the basis of one researcher's involvement in a prominent global medical project.34 Small, specialized universities have also received very strong citation scores, with the National Research
29 Malaysia, Ministry of Education, Malaysia Education Blueprint 2015-2025 (Higher Education) (Putrajaya: Kementerian Pendidikan Malaysia, 2015), E-5.
30 "ShanghaiRanking's Global Ranking of Academic Subjects 2017 - Mathematics," Academic Ranking of World Universities, 2016.
31 Richard Holmes, "Are global university rankings losing their credibility?" Wonkhe, September 23, 2016.
32 "Asia University Rankings," Times Higher Education (THE), March 15, 2017.
33 Richard Holmes, "The Abuse and Use of Rankings," University Ranking Watch, June 16, 2017.
34 Ibid.
Nuclear University, MEPhI, in Moscow, and St George’s, University of London, a medical school, ranked as
the top for research impact in the world in 2012-13 and 2016-17 respectively.35 In fact, Cambridge
University is surprisingly ranked as the second best research university in Cambridge behind Anglia Ruskin
University.36 These results point to evidence that other ranking systems can also be ‘gamed’ by university
administrators or be skewed because of outlier performances by a small number of very prolific
researchers. The fact that a small number of researchers in a university are very prolific does not indicate
that the quality and quantity of research produced across the board in that university is of a world class
standard.
4.5 Should we rely on any world university ranking system?
Given the various weaknesses inherent in all global university ranking systems, should we rely on any of
them at all as a KPI to benchmark our public universities against? Dr Sean Matthews has been a vocal critic
of world university rankings. He claims that they are ‘subjective, misleading, and profoundly damaging,’
and by relying on them, the MoHE is ‘surrendering the autonomy to choose the priorities and direction
for our children, our students, our universities and our country’37 to these ranking agencies. He gives the
following reasons: (i) The weightings that ranking agencies give to their ranking indicators are wholly
arbitrary and produce volatile results; (ii) It is neither cost-effective nor efficient for middle-ranking, good
quality universities to measure themselves against well-established and well-funded research
powerhouses that dominate the top of ranking lists; and (iii) Ranking agencies are commercial entities
that sell their products according to public interest.38 Miguel Antonio Lim adds to this final point through
his research of the THE rankings, describing the relationship that THE has with its audience as ‘weak
expertise’ and a ‘constant struggle for influence.’ 39 The consultancy service offered by QS to help
universities rise up the rankings is a potential conflict of interest that may affect the objectivity of the
rankings. Malaysian universities have clearly bought the hype, with four of the top five universities (all except UTM) employing QS to evaluate them based on its STAR rating system. By choosing to accept advice on how to improve according to the ranking methodologies, Malaysian universities are in danger of sacrificing more holistic and structural long-term improvements for quick fixes.
35 Richard Holmes, "Ten Universities with a Surprisingly Large Research Impact," University Ranking Watch, May 30, 2017.
36 Richard Holmes, "Proving anything you want from rankings," University Ranking Watch, July 3, 2017.
37 Sean Matthews, "Can we trust international university rankings?" Malay Mail Online, June 09, 2017.
38 Ibid.
39 Miguel Antonio Lim, "The building of weak expertise: the work of global university rankers," Higher Education, April 13, 2017, doi:10.1007/s10734-017-0147-8.
Yet, the reality of a highly commercialised higher education industry is still such that university ranking
lists are closely monitored by prospective students all over the world. If there is one area in which these
university rankings may be useful to Malaysian universities, it is that these rankings can help universities
attract international students. According to the 2017 International Student Survey (ISS) published by Hobsons, 19.6% of prospective international students cite rankings as the most important factor in their choice of destination country, while 23.5% say that a university being well-ranked is the most
important factor in their choice of university.40 Among all the world university ranking lists, QS has been
rated as the most popular, including in China and India, the two most populated countries in the world.
These findings suggest that the performance of Malaysian universities in the QS rankings is helpful in attracting more foreign talent, and that the MoHE's decision to use achievements in the QS rankings as marketing tools, in the form of physical banners and social media posts, is a strategic one. However, the performance of our universities in ranking lists must not distract policymakers and university administrators from addressing issues and flaws in our higher education system that are not reflected in these lists.
5.0 MyRA, SETARA and home-grown solutions
The MoHE's obsession with global university ranking lists, specifically the QS ranking, has distracted it from further developing and enhancing domestic ratings and indicators for both public and private universities. The Malaysian Research Assessment Instrument (MyRA), the Rating System for Malaysian Higher Education Institutions (SETARA) and the Discipline-Based Rating System (D-SETARA) were developed by the MoHE together with the Malaysian Qualifications Agency (MQA) to assess the quality of local public and private universities. MyRA is used for accreditation purposes and to monitor the research performance of public universities,41 while SETARA and D-SETARA use three generic domains to rate each institution, namely: 42
(i) Input: Addressing faculty and student talent, physical and financial resources and
governance
(ii) Process: Focusing on aspects related to the quality of curriculum
40 https://www.hobsons.com/apac/resources/entry/blog-ranking-the-rankings-how-international-students-choice-of-rankings-is
41 "What is MyRA?" KPIMS II, December 14, 2015.
42 Ian Jerome Leong, "Setting national education standards," The Star, March 6, 2016, Star Special sec.
(iii) Output: Measuring graduate satisfaction and the quality of graduates based on
information, such as marketability, graduate attributes and employers’ feedback
In the 2014-2015 MyRA rating, six universities achieved the highest rating of six stars.43 In the 2013 SETARA ratings, all our universities were rated as either Tier 5, the second highest category available, or Tier 4.44 These ratings place our institutions so high that there is little room left for local universities to improve. Yet it is undeniable that the gulf in the quality of teaching, research and graduates between Malaysian universities and the major powerhouse universities is still huge, casting doubt on the relevance of these results. The D-SETARA rating system, however, has yielded more varied results. One surprising result is that UM's Engineering programme is rated as Tier 4, lower than USM, UPM, UTAR, UiTM, UTP, UMP, MMU, Monash University and Curtin University, which all received Tier 5 ratings.45 This contrasts sharply with the QS Subject Rankings and the US News Rankings, which rated UM as the 35th and 27th best university in the world for engineering respectively, the highest among all Malaysian universities in both lists, further highlighting the subjectivity and volatility of different ranking results.
Undoubtedly, the MoHE faces pressure from both public and private universities to 'inflate' their MyRA and SETARA scores. The MoHE also has little incentive to resist these lobbying pressures, since having more local universities with higher ratings also reflects well on the Ministry. Instead of abandoning these local rating systems, more effort needs to be channelled into improving them so that they are more accurate reflections of different performance indicators: the quality of teaching, the quality of research output and the employability of graduates, to name a few. Moreover, these indicators should be made public and disclosed in a transparent manner so that they do not fall prey to 'lobbying' by individual universities. To date, the details of the MyRA, SETARA and D-SETARA scores are not listed anywhere on the MQA or MoHE websites. All we know about the general performance of these universities is how many stars they have received. This is unacceptable and can easily be rectified.
6.0 What next? Summary and Conclusion
In an age of commercialised higher education, world university rankings are still very much a relevant tool
for public relations and marketing, and will continue to capture wide attention and interest for years to
43 "Universities obtain six star rating," The Star Online, December 12, 2015.
44 Malaysia, Malaysian Qualifications Agency, The 2013 Rating for Higher Education Institutions in Malaysia.
45 Malaysia, Malaysian Qualifications Agency, Discipline Based Rating System.
come. However, as argued in this paper, if the MoHE is serious about improving the quality of Malaysian universities and graduates, improvements in ranking lists should not be a government-endorsed target: rankings are an inadequate benchmark, and are largely unhelpful in setting higher education on the right trajectory. Richard Holmes argues it this way: 'Rankings are not entirely worthless and if they did
not exist no doubt they would somehow be invented. But it is doing nobody any good to use them to
promote the special interests of university bureaucrats and insecure senior academics.’46
Instead, the MoHE should strive to improve our own domestic ranking systems such as MyRA, SETARA
and D-SETARA by increasing the robustness of these rating indicators. By focusing on these domestic
evaluation methods, the Ministry will be able to better promote long-term structural changes for local
institutions and free the higher education system from being bound by KPIs set by ranking agencies whose interests differ from, and may sometimes conflict with, the objectivity of the rankings. This kind of support for our own domestic evaluation standards will empower our own statutory review bodies, such as the MQA, to provide a more customised and accurate alternative to the current global university rankings. The academic and administrative leadership of IPTAs must also be given the time and autonomy to come up with their own internal Key Performance Indicators (KPIs) that are more holistic and relevant for both the Malaysian and regional contexts, and throughout the process the Ministry must uphold transparency and accountability. For example, the output of researchers can be measured by also including more diverse publications, such as books and monographs, instead of just internationally recognised, English-medium Scopus and ISI publications. The teaching quality of the universities can be improved by allowing separate teaching and research career paths for university lecturers instead of forcing all faculty members to do both.
In his report for IDEAS entitled "Autonomy and Accountability in Higher Education: Lessons from Ghana and Mexico", Dr Sean Matthews appealed to our government to learn from the experiences of the higher education sectors in Ghana and Mexico, two similarly developing, middle-income postcolonial nations.47 He suggested that 'Malaysia's university governance practices must be fundamentally
reconceived.’ 48 The changes made in Ghana and Mexico had entailed ‘confidence and trust in
professionals working within the sector to assess and articulate their own strengths, mission and
46 Richard Holmes, "Proving anything you want from rankings," University Ranking Watch, July 3, 2017.
47 Sean Matthews, Autonomy and Accountability in Higher Education: Lessons from Ghana and Mexico, working paper no. 39, Institute for Democracy and Economic Affairs, June 5, 2017.
48 Ibid.
32
strategy.’49 If Malaysia is to reap the same results, the unhealthy obsession with world university rankings
must stop.
49 Ibid.
References
Ministry of Higher Education Malaysia. Facebook post. June 14, 2017.
https://www.facebook.com/moheofficial/photos/pcb.1290338701065126/1290331181065878/?type=3
&theater.
"Academic Reputation." QS Intelligence Unit. Accessed July 07, 2017. http://www.iu.qs.com/university-
rankings/indicator-academic/#toggle-id-10.
"Asia University Rankings." Times Higher Education (THE). March 15, 2017. Accessed July 07, 2017.
https://www.timeshighereducation.com/world-university-rankings/2017/regional-
ranking#!/page/0/length/25/sort_by/rank/sort_order/asc/cols/stats.
"Best Global Universities Rankings." U.S. News & World Report Education. 2016.
https://www.usnews.com/education/best-global-universities/rankings.
Malaysia. Malaysian Qualifications Agency. Discipline Based Rating System.
http://www.mqa.gov.my/PortalMQAv3/D-SETARA/THE%20SUN%2037X26.1_.pdf.
"Employer Reputation." QS Intelligence Unit. Accessed July 07, 2017. http://www.iu.qs.com/university-
rankings/indicator-employer/.
"Faculty Student Ratio." QS Intelligence Unit. Accessed July 07, 2017. http://www.iu.qs.com/university-
rankings/indicator-faculty-student/.
"H Index." QS Intelligence Unit. Accessed July 07, 2017. http://www.iu.qs.com/university-rankings/h-
index/.
Mroz, Ann. “Leader: Only the best for the best.” Times Higher Education (THE). November 05, 2009.
Accessed August 16, 2017.
https://www.timeshighereducation.com/comment/leader/leader-only-the-best-for-the-
best/408968.article?sectioncode=26&storycode=408968&c=1.
"Papers and Citations." QS Intelligence Unit. Accessed August 15, 2017. http://www.iu.qs.com/university-
rankings/indicator-papers-citations/.
"QS World University Rankings by Subject." QS Intelligence Unit. Accessed August 15, 2017.
http://www.iu.qs.com/university-rankings/subject-tables/#toggle-id-1.
Holmes, Richard. "Are global university rankings losing their credibility?" Wonkhe. September 23, 2016.
Accessed July 07, 2017. http://wonkhe.com/blogs/are-global-rankings-losing-their-credibility/.
Holmes, Richard. "Proving anything you want from rankings." University Ranking Watch. July 3, 2017.
Accessed July 07, 2017. http://rankingwatch.blogspot.my/2017/07/proving-anything-you-want-from-
rankings.html.
Holmes, Richard. "Ten Universities with a Surprisingly Large Research Impact." University Ranking Watch.
May 30, 2017. Accessed July 07, 2017. http://rankingwatch.blogspot.my/2017/05/ten-universities-with-
surprisingly.html.
Holmes, Richard. "The Abuse and Use of Rankings." University Ranking Watch. June 16, 2017. Accessed
July 07, 2017. http://rankingwatch.blogspot.my/2017/06/the-abuse-and-use-of-rankings.html.
Holmes, Richard. "The THE-QS World University Rankings, 2004 – 2009." Asian Journal of University
Education 6, no. 1 (January 2010): 107.
Holmes, Richard. "The THE World University Rankings: Arguably the Most Amusing League Table in the
World." University Ranking Watch. September 24, 2016. Accessed July 07, 2017.
http://rankingwatch.blogspot.my/2016/09/the-world-university-rankings-arguably.html.
Holmes, Richard. "University Malaya Again." University Ranking Watch. August 25, 2012. Accessed August
25, 2012. http://rankingwatch.blogspot.my/2012/08/universiti-malaya-again-in-many.html.
Leong, Ian Jerome. "Setting national education standards." The Star, March 6, 2016, Star Special sec.
https://www.scribd.com/doc/302803916/SETARA-6-March-2016.
Lim, Miguel Antonio. "The building of weak expertise: the work of global university rankers." Higher
Education, April 13, 2017. doi:10.1007/s10734-017-0147-8.
Malaysia. Ministry of Education. Malaysia Education Blueprint 2015-2025 (Higher Education). Putrajaya:
Kementerian Pendidikan Malaysia, 2015. E-4.
Malaysia. Ministry of Education. Malaysia Education Blueprint 2015-2025 (Higher Education). Putrajaya:
Kementerian Pendidikan Malaysia, 2015. E-5.
Malaysia. Ministry of Education. Malaysia Educational Statistics 2016. Putrajaya: Kementerian Pendidikan
Malaysia, 2015. 162.
Matthews, Sean. "Can we trust international university rankings?" Malay Mail Online. June 09, 2017.
Accessed July 07, 2017. http://www.themalaymailonline.com/what-you-think/article/can-we-trust-
international-university-rankings-sean-matthews.
Matthews, Sean. Autonomy and Accountability in Higher Education: Lessons from Ghana and Mexico.
Working paper no. 39. Institute for Democracy and Economic Affairs. June 5, 2017.
http://www.ideas.org.my/wp-content/uploads/2017/06/Autonomy-and-Accountability-in-Higher-
Education-Lessons-from-Ghana-and-Mexico.pdf.
"Methodology." Academic Ranking of World Universities. Accessed July 07, 2017.
http://www.shanghairanking.com/ARWU-Methodology-2016.html.
Pua, Tony. “Universiti Malaya: 89th or Nowhere? (Part III)”, Education in Malaysia. August 29, 2005.
Accessed August 15, 2017.
http://educationmalaysia.blogspot.my/2005/08/universiti-malaya-89th-or-nowhere-part_29.html.
"Shanghai Ranking's Global Ranking of Academic Subjects 2017 - Mathematics." Academic Ranking of
World Universities. 2016. Accessed July 07, 2017. http://www.shanghairanking.com/Shanghairanking-
Subject-Rankings/mathematics.html.
Malaysia. Malaysian Qualifications Agency. The 2013 Rating for Higher Education Institutions in
Malaysia. http://www.mqa.gov.my/PortalMQAv3/SETARA13/SETARA%20'13%20Result.pdf.
"THE Asia-Pacific University Rankings 2017: ready for the next level?" Times Higher Education (THE). July
04, 2017. Accessed July 07, 2017. https://www.timeshighereducation.com/world-university-
rankings/asia-pacific-university-rankings-2017-ready-for-the-next-level.
“The Unbelievable Professor Hashim Yaacob,” Letters, Malaysiakini, November 08, 2005. Accessed August
15, 2017. https://www.malaysiakini.com/letters/42736.
"Universities obtain six star rating - Education." The Star Online. December 12, 2015. Accessed July 07,
2017. http://www.thestar.com.my/news/education/2015/12/13/universities-obtain-six-star-rating/.
"University of Malaya." U.S. News & World Report Education. 2016.
https://www.usnews.com/education/best-global-universities/university-of-malaya-501730.
"What is MyRA?" KPIMS II. December 14, 2015. Accessed July 07, 2017.
https://www.kpims.usm.my/v2/?p=what-is-myra.
"World Reputation Rankings." Times Higher Education (THE). June 14, 2017. Accessed July 07, 2017.
https://www.timeshighereducation.com/world-university-rankings/2017/reputation-
ranking#!/page/0/length/25/sort_by/rank/sort_order/asc/cols/stats.
"World University Rankings." Times Higher Education (THE). September 23, 2016. Accessed July 07, 2017.
https://www.timeshighereducation.com/world-university-rankings/2017/world-
ranking#!/page/0/length/25/sort_by/rank/sort_order/asc/cols/stats.
Appendix: Score Breakdown of Malaysian Universities in the QS World University Rankings by Subject

Universiti Malaya (UM)
Arts & Humanities
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2015 | 159 | 66.3 | 68.8 | 83.4 | 55.5 | 27.9
2016 | 127 | 66.5 | 72 | 74 | 63.2 | 21.5
2017 | 85 | 72.8 | 78.8 | 73.2 | 74.5 | 33
Engineering & Technology
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2014 | 213 | 64.4 | 59.5 | 73.8 | 68.5 | 54.8
2015 | 83 | 78 | 68.6 | 87.5 | 77.2 | 84.4
2016 | 54 | 79.7 | 73.7 | 80.7 | 81 | 92.5
2017 | 35 | 82.8 | 82.9 | 86.2 | 75 | 94.9
Life Sciences & Medicine
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2014 | 325 | 56.6 | 65.2 | 73.4 | 52.6 | 40
2015 | 298 | 58.6 | 65.4 | 85.7 | 55.1 | 40.4
2016 | 249 | 62.9 | 82.7 | 74.6 | 52.8 | 36.6
2017 | 164 | 71.5 | 87.2 | 59.9 | 74.9 | 56.7
Natural Sciences
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2015 | 367 | 62.1 | 55.9 | 87.1 | 62.1 | 49.4
2016 | 217 | 68.7 | 75.3 | 79.7 | 65.7 | 47.5
2017 | 188 | 73.3 | 80.7 | 69.7 | 75.2 | 60.2
Social Sciences & Management
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2014 | 167 | 64.1 | 58.7 | 72.1 | 69.4 | 62.2
2015 | 123 | 72.1 | 64.3 | 84.8 | 85.2 | 60
2016 | 69 | 73.7 | 74.3 | 80 | 72.1 | 52.9
2017 | 71 | 73.2 | 75.6 | 74.9 | 74.4 | 55.6
Universiti Sains Malaysia (USM)
Arts & Humanities
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2015 | 228 | 61.3 | 61.9 | 80.3 | 53 | 27.9
2016 | 191 | 62 | 66.4 | 69.7 | 60.8 | 21.5
2017 | 144 | 68.3 | 74.1 | 71.5 | 66.9 | 33
Engineering & Technology
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2014 | 169 | 67.1 | 56.6 | 69.7 | 81.9 | 75.1
2015 | 122 | 74.7 | 67 | 84.1 | 78.2 | 72.9
2016 | 85 | 75.9 | 72.6 | 75.9 | 78.9 | 82
2017 | 93 | 76.7 | 80 | 80.3 | 69.8 | 77.8
Life Sciences & Medicine
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2014 | 361 | 54.3 | 62.3 | 71.1 | 50.2 | 39
2015 | 327 | 56.8 | 66.5 | 83.2 | 51.4 | 36.2
2016 | 254 | 62.1 | 82 | 71.6 | 50.2 | 38.2
2017 | 269 | 65.1 | 84.7 | 49.7 | 68.4 | 47.9
Natural Sciences
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2014 | 262 | 63.9 | 56.6 | 70.4 | 75.9 | 60.1
2015 | 337 | 63.3 | 62.8 | 82.7 | 65.6 | 42.5
2016 | 226 | 68.2 | 76.4 | 75.1 | 68 | 45.3
2017 | 232 | 70.7 | 79.5 | 70 | 75.1 | 49.4
Social Sciences & Management
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2014 | 153 | 65.4 | 54.7 | 66.8 | 91.3 | 88.8
2015 | 161 | 69.9 | 61.4 | 80.3 | 91.3 | 60
2016 | 89 | 71.9 | 70.7 | 74.5 | 80.8 | 61
2017 | 106 | 70.1 | 71.1 | 77.7 | 69.7 | 58.3
Universiti Kebangsaan Malaysia (UKM)
Arts & Humanities
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2015 | 196 | 63.1 | 65.1 | 74.8 | 62.5 | 27.9
2016 | 179 | 63.1 | 69.1 | 63.4 | 67.9 | 21.5
2017 | 135 | 68.6 | 77.5 | 71.7 | 58.1 | 33
Engineering & Technology
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2014 | 333 | 59.2 | 55.3 | 64.4 | 63.4 | 54.8
2015 | 186 | 71.5 | 65.1 | 80.3 | 69.6 | 72.9
2016 | 149 | 71.1 | 69.9 | 68.9 | 71.7 | 77.8
2017 | 135 | 72.9 | 77.8 | 74.2 | 63.2 | 77.8
Life Sciences & Medicine
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2015 | 391 | 52.1 | 67.8 | 81.6 | 39.6 | 27.5
2016 | 355 | 57.1 | 81.7 | 68.3 | 39.4 | 31.1
2017 | 295 | 63.7 | 84.7 | 44.8 | 66.7 | 47.9
Natural Sciences
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2016 | 363 | 62.7 | 74.4 | 67.1 | 57 | 40.4
2017 | 369 | 64.8 | 78.7 | 59.7 | 63.5 | 43.4
Social Sciences & Management
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2014 | 308 | 57 | 54.7 | 65.2 | 57.4 | 43.8
2015 | 196 | 67.7 | 62.2 | 77.6 | 76.4 | 56.8
2016 | 161 | 67.5 | 71.7 | 70 | 57.2 | 49.5
2017 | 146 | 67.6 | 72.9 | 65.3 | 62.7 | 58.3
Universiti Putra Malaysia (UPM)
Arts & Humanities
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2016 | 295 | 55.9 | 59.9 | 67.6 | 64.5 | n/a
2017 | 223 | 63.4 | 69.7 | 67.4 | 64 | 20.8
Engineering & Technology
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2014 | 295 | 60.8 | 56.5 | 67.6 | 67.2 | 52.4
2015 | 161 | 72.6 | 66.3 | 81.8 | 74.1 | 69.1
2016 | 133 | 71.6 | 69 | 71.7 | 75.3 | 74.7
2017 | 145 | 72.4 | 75.7 | 76.9 | 64.9 | 73.9
Life Sciences & Medicine
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2015 | 340 | 55.6 | 63.5 | 85.5 | 50.4 | 36.2
2016 | 311 | 59.2 | 75.8 | 74.6 | 50.9 | 34.9
2017 | 289 | 64 | 78.6 | 53.9 | 67.6 | 49.1
Natural Sciences
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2014 | 387 | 58.1 | 57.7 | 68.6 | 59.8 | 46.7
2015 | 356 | 62.5 | 64.9 | 79.5 | 58.3 | 45
2016 | 256 | 66.9 | 75 | 71.8 | 63.3 | 49.6
2017 | 276 | 68.9 | 77 | 67.4 | 68.8 | 54.5
Social Sciences & Management
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2014 | 274 | 58.4 | 49.8 | 68.3 | 67.9 | 62.2
2015 | 232 | 66.2 | 56.2 | 78.1 | 84 | 62.9
2016 | 152 | 68.1 | 68.2 | 72.4 | 67.2 | 55.9
2017 | 187 | 65.7 | 67.7 | 69.4 | 65.5 | 52.6
Universiti Teknologi Malaysia (UTM)
Arts & Humanities
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2016 | 372 | 51.9 | 55.2 | 67.8 | 52.6 | n/a
2017 | 296 | 60 | 66 | 63.6 | 59.7 | 20.8
Engineering & Technology
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2014 | 256 | 62.4 | 63.3 | 68.1 | 60.8 | 49.9
2015 | 134 | 74.1 | 71.6 | 81.2 | 67 | 72.9
2016 | 100 | 74 | 75.1 | 72 | 69.9 | 79.3
2017 | 90 | 76.9 | 82.8 | 73.6 | 67.9 | 82.3
Natural Sciences
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2017 | 378 | 64.6 | 77.2 | 61.6 | 68.2 | 38.7
Social Sciences & Management
Year | Rank | Overall Score | Academic Reputation | Citations per Paper | Employer Reputation | H-Index Citations
2014 | 363 | 54.8 | 47.2 | 66 | 58.7 | 55.3
2015 | 283 | 64 | 57.9 | 76.5 | 77.1 | 43.9
2016 | 211 | 64.9 | 65.2 | 70.9 | 64.6 | 45.6
2017 | 204 | 64.7 | 65.3 | 70.5 | 66.8 | 49.2
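For reference, the Overall Score column in each of the tables above is a weighted combination of the four indicator columns; QS applies different weights to each of its five subject areas. The sketch below illustrates the arithmetic only. The weights used are assumptions for demonstration, not QS's published values:

```python
# Sketch of how a QS-style overall subject score combines the indicator
# columns in the tables above. The weights below are ILLUSTRATIVE
# ASSUMPTIONS for demonstration; QS publishes different weights for
# each of its five subject areas.

def overall_score(indicators, weights):
    """Weighted average of indicator scores; weights should sum to 1."""
    return round(sum(indicators[k] * weights[k] for k in weights), 1)

# UM Arts & Humanities 2017 indicator scores, taken from the first table.
um_arts_2017 = {
    "academic_reputation": 78.8,
    "citations_per_paper": 73.2,
    "employer_reputation": 74.5,
    "h_index_citations": 33.0,
}

# Hypothetical weighting, heavier on the reputation surveys.
weights = {
    "academic_reputation": 0.6,
    "citations_per_paper": 0.15,
    "employer_reputation": 0.15,
    "h_index_citations": 0.1,
}

print(overall_score(um_arts_2017, weights))  # prints 72.7
```

With these assumed weights the result (72.7) happens to fall close to UM's published 2017 Arts & Humanities overall score of 72.8; reproducing QS's exact figures would require its subject-specific weights and score normalisation.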