
November 2011

IREG Ranking Audit Manual

IREG Observatory on Academic Ranking and Excellence (IREG stands for International Ranking Expert Group)

www.ireg-observatory.org


Table of Contents

1 Introduction

2 The IREG Ranking Audit Criteria
2.1. Criteria on Purpose, Target Groups, Basic Approach
2.2. Criteria on Methodology
2.3. Criteria on Publication and Presentation of Results
2.4. Criteria on Transparency and Responsiveness
2.5. Criteria on Quality Assurance

3 The Assessment of Criteria
3.1. General Rules of Assessment
3.2. Weights of IREG Ranking Audit Criteria

4 The Ranking Audit Process
4.1. Eligibility and Formal Application
4.2. Nomination and Appointment of an Audit Team
4.3. Avoidance of Conflict of Interest
4.4. Production of a Self-report by the Ranking Organisation
4.5. Interaction Ranking Organisation – Audit Team
4.6. Production of a Ranking Audit Report
4.7. Ranking Audit Decision
4.8. Management of Disputes and Appeals
4.9. Publication of Ranking Audit Results

APPENDIX
A1. Data Sheet on Rankings/Ranking Organisation
A2. Structure of the Self-report
A3. Conflict of Interest Declaration for IREG Ranking Auditors
A4. Berlin Principles on Ranking of Higher Education Institutions

THE IREG RANKING AUDIT MANUAL

1 INTRODUCTION

Academic rankings are an entrenched phenomenon around the world and, as such, are recognized as a source of information, a transparency instrument and a method of quality assessment. There is also empirical evidence that rankings influence individual decisions as well as institutional and system-level policy-making. Consequently, those who produce and publish rankings are increasingly aware that they put their reputation on the line if their ranking tables are not free of material errors or are not carried out with due attention to basic deontological procedures. In this context an important initiative was undertaken by an ad-hoc expert group, the International Ranking Expert Group (IREG), which in May 2006 came up with a set of guidelines – the Berlin Principles on Ranking of Higher Education Institutions [in short "Berlin Principles" – see Appendix or www.ireg-observatory.org].

In October 2009, on the basis of IREG, the IREG Observatory on Academic Ranking and Excellence [in short "IREG Observatory"] was created. One of its main activities is quality assessment within its principal domain of activity – university rankings [which in fact cover all types of higher education institutions]. The IREG Ranking Audit initiative needs to be seen in this context. It is based on the Berlin Principles and is expected to:
• enhance the transparency of rankings;
• give users of rankings a tool to identify trustworthy rankings; and
• improve the overall quality of rankings.

Users of university rankings (i.e. students and their parents, university leaders, academic staff, representatives of the corporate sector, national and international policy makers) differ very much in their inside knowledge of higher education, universities and appropriate ranking methodologies. In particular, the less informed groups (like prospective students) do not have a deep understanding of the usefulness and limitations of rankings; an audit must therefore be a valid and robust evaluation. It offers a quality stamp that is easy to understand, and in case of a positive evaluation a ranking is entitled to use the quality label and corresponding logo "IREG approved".

The purpose of this manual is to guide ranking organisations in how to assemble and present requested information and other evidence at all stages of the IREG Ranking Audit. It will also serve the members of the IREG Secretariat and the audit teams in preparing and conducting all stages of the audit process – collection of information, team visits, and writing the reports.

The main objective of this manual is to develop a common understanding of the IREG Ranking Audit process. Accordingly, the second and third chapters of this document describe the criteria of the IREG Ranking Audit as well as the method of assessing those criteria. Chapter four presents the audit process in its various steps, from the application for an audit to the decision-making process within the IREG Observatory.

2 THE IREG RANKING AUDIT CRITERIA

The criteria of the IREG Ranking Audit have been developed and approved by the IREG Observatory Executive Committee in May 2011.

The criteria refer to five dimensions of rankings: first, the definition of their purpose, target groups and basic approach; second, various aspects of their methodology, including the selection of indicators, methods of data collection and the calculation of indicators; third, the publication and presentation of their results; fourth, aspects of transparency and responsiveness of the ranking and the ranking organisation; and last, aspects of internal quality assurance processes and instruments within the ranking.

A number of criteria refer to the Berlin Principles (see Appendix). The Berlin Principles were not meant to provide an operational instrument for assessing individual rankings; they were a first attempt to define general principles of good ranking practice. Not all relevant aspects of the quality of rankings were covered by the Berlin Principles, and not all dimensions were elaborated in full detail. In addition, rankings and the discussion about rankings have developed further since the publication of the Berlin Principles in 2006. Hence there are a number of new criteria that do not relate directly to the Berlin Principles.

2.1. Criteria on Purpose, Target Groups, Basic Approach

The method of evaluation called "ranking" allows a comparison and ordering of units, in this case higher education institutions and their activities, by quantitative and/or quantitative-like (e.g. stars) indicators. Within this general framework rankings can differ in their purpose and aims, their main target audiences and their basic approach.

Criterion 1: The purpose of the ranking and the (main) target groups should be made explicit. The ranking has to demonstrate that it is designed with due regard to its purpose (see Berlin Principles, 2). This includes a model of indicators that refers to the purpose of the ranking.

Criterion 2: Rankings should recognize the diversity of institutions and take the different missions and goals of institutions into account. Quality measures for research-oriented institutions, for example, are quite different from those that are appropriate for institutions that provide broad access to underserved communities (see Berlin Principles, 3). The ranking has to be explicit about the type/profile of institutions which are included and those which are not.

Criterion 3: Rankings should specify the linguistic, cultural, economic, and historical contexts of the educational systems being ranked. International rankings in particular should be aware of possible biases and be precise about their objectives and data (see Berlin Principles, 5).

International rankings should adopt indicators with sufficient comparability across various national systems of higher education.

2.2. Criteria on Methodology

The use of a proper methodology is decisive for the quality of rankings. The methodology has to correspond to the purpose and basic approach of the ranking. At the same time rankings have to meet standards of collecting and processing statistical data.

Criterion 4: Rankings should choose indicators according to their relevance and validity. The choice of data should be grounded in recognition of the ability of each measure to represent quality and academic and institutional strengths, and not the availability of data. Rankings should be clear about why measures were included and what they are meant to represent (see Berlin Principles, 7).

Criterion 5: The concept of quality of higher education institutions is multidimensional and multi-perspective, and "quality lies in the eye of the beholder". Good ranking practice is to combine the different perspectives provided by various sources in order to get a more complete view of each higher education institution included in the ranking. Rankings have to avoid presenting data that reflect only one particular perspective on higher education institutions (e.g. employers only, students only). If a ranking refers to one perspective/one data source only, this limitation has to be made explicit.


Criterion 6: Rankings should measure outcomes in preference to inputs whenever possible. Data on inputs and processes are relevant as they reflect the general condition of a given establishment and are more frequently available. Measures of outcomes provide a more accurate assessment of the standing and/or quality of a given institution or program, and compilers of rankings should ensure that an appropriate balance is achieved (see Berlin Principles, 8).

Criterion 7: Rankings have to be transparent regarding the methodology used for creating the rankings. The choice of methods used to prepare rankings should be clear and unambiguous (see Berlin Principles, 6). It should also be indicated who establishes the methodology and whether it is externally evaluated.

Rankings must provide clear definitions and operationalisations for each indicator as well as the underlying data sources and the calculation of indicators from raw data. The methodology has to be publicly available to all users of the ranking as long as the ranking results are open to the public. In particular, methods of normalizing and standardizing indicators have to be explained with regard to their impact on raw indicators.

Criterion 8: If rankings use composite indicators the weights of the individual indicators have to be published. Changes in weights over time should be limited and have to be justified by methodological or conceptual considerations. Institutional rankings have to make clear the methods of aggregating results for a whole institution. Institutional rankings should try to control for the effects of different field structures (e.g. specialized vs. comprehensive universities) in their aggregate result (see Berlin Principles, 6).
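
Criteria 7 and 8 together imply that the arithmetic path from raw data to composite score must be reproducible. The manual does not prescribe any particular normalization method; the following sketch only illustrates, with invented institutions and numbers, why the chosen method has to be disclosed: min-max scaling and z-score standardization preserve the ordering of institutions but change the gaps between them, and therefore the resulting composite scores.

```python
# Illustrative only: two common ways of normalizing one raw indicator
# before it enters a weighted composite. Data and names are invented.
from statistics import mean, stdev

raw = {"Univ A": 1200, "Univ B": 300, "Univ C": 900}  # e.g. publication counts

def min_max(values: dict) -> dict:
    """Rescale to [0, 1]: best institution gets 1, worst gets 0."""
    lo, hi = min(values.values()), max(values.values())
    return {k: (v - lo) / (hi - lo) for k, v in values.items()}

def z_score(values: dict) -> dict:
    """Standardize to mean 0, standard deviation 1."""
    m, s = mean(values.values()), stdev(values.values())
    return {k: (v - m) / s for k, v in values.items()}

print(min_max(raw))  # {'Univ A': 1.0, 'Univ B': 0.0, 'Univ C': approx. 0.67}
print(z_score(raw))  # {'Univ A': approx. 0.87, 'Univ B': approx. -1.09, 'Univ C': approx. 0.22}
```

Both methods rank the three institutions identically, yet the same published weights applied to the two sets of normalized values would yield different composite scores, which is why the method must be made explicit.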

Criterion 9: Data used in the ranking must be obtained from authorized, audited and verifiable data sources and/or collected with proper procedures for professional data collection, following the rules of empirical research (see Berlin Principles, 11 and 12). Procedures of data collection have to be made transparent, in particular with regard to survey data.

Information on survey data has to include: the source of the data, the method of data collection, response rates, and the structure of the samples (such as geographical and/or occupational structure).

Criterion 10: Although rankings have to adapt to changes in higher education and should try to enhance their methods, the basic methodology should be kept as stable as possible. Changes in methodology should be based on methodological arguments and not be used as a means to produce different results compared to previous years. Changes in methodology should be made transparent (see Berlin Principles, 9).

2.3. Criteria on Publication and Presentation of Results

Rankings should provide users with a clear understanding of all of the factors used to develop a ranking, and offer them a choice in how rankings are displayed. This way, the users of rankings have a better understanding of the indicators that are used to rank institutions or programs (see Berlin Principles, 15).

Criterion 11: The publication of a ranking has to be made available to users throughout the year, in print publications and/or in an online version of the ranking.

Criterion 12: The publication has to deliver a description of the methods and indicators used in the ranking. That information should take into account the knowledge of the main target groups of the ranking.

Criterion 13: The publication of the ranking must provide the scores of each individual indicator used to calculate a composite indicator, in order to allow users to verify the calculation of ranking results. Composite indicators may not refer to indicators that are not published.
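
As an illustration of the verification this criterion enables: with published weights of, say, 0.4, 0.35 and 0.25 and published indicator scores of 78, 65 and 90 (invented numbers), any user can recompute the composite as 0.4 × 78 + 0.35 × 65 + 0.25 × 90 = 76.45 and compare it with the published result.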

Criterion 14: Rankings should allow users to have some opportunity to make their own decisions about the relevance and weights of indicators (see Berlin Principles, 15).


2.4. Criteria on Transparency and Responsiveness

Accumulated experience with regard to the degree of confidence in and "popularity" of rankings demonstrates that greater transparency means higher credibility for a given ranking.

Criterion 15: Rankings should be compiled in a way that eliminates or reduces errors caused by the ranking, and be organized and published in a way that errors and faults caused by the ranking can be corrected (see Berlin Principles, 16). This implies that such errors should be corrected within a ranking period, at least in an online publication of the ranking.

Criterion 16: Rankings have to be responsive to higher education institutions included in/participating in the ranking. This involves giving explanations of methods and indicators as well as explanations of the results of individual institutions.

Criterion 17: Rankings have to provide a contact address in their publication (print, online version) to which users and the institutions ranked can direct questions about the methodology, feedback on errors and general comments. They have to demonstrate that they respond to questions from users.

2.5. Criteria on Quality Assurance

Rankings assess the quality of higher education institutions and aim to have an impact on the development of those institutions. This claim puts a great responsibility on rankings concerning their own quality and accuracy. They have to develop their own internal instruments of quality assurance.

Criterion 18: Rankings have to apply measures of quality assurance to the ranking processes themselves. These processes should take note of the expertise that is applied to evaluate institutions and use this knowledge to evaluate the ranking itself (see Berlin Principles, 13).

Criterion 19: Rankings have to document their internal processes of quality assurance. This documentation has to refer to the processes of organising the ranking and data collection as well as to the quality of data and indicators.

Criterion 20: Rankings should apply organisational measures that enhance the credibility of rankings. These measures could include advisory or even supervisory bodies, preferably (in particular for international rankings) with some international participation (see Berlin Principles, 14).

3 THE ASSESSMENT OF CRITERIA

3.1 General Rules of Assessment

The audit decision will be based on a standardised assessment of the criteria set out above. Criteria are assessed with numerical scores. In the audit process the score of each criterion is graded by the review team according to the degree of fulfilment of that criterion. The audit applies a scale from 1 to 6:

1 – Not sufficient/not existing
2 – Marginally applied
3 – Adequate
4 – Good
5 – Strong
6 – Distinguished

Not all criteria are of the same relevance. Hence the criteria are divided into core criteria with a weight of two and regular criteria with a weight of one (see the table below, Weights of IREG Ranking Audit Criteria). The maximum score for each core criterion is therefore twelve, and for each regular criterion six. With 10 core and 10 regular criteria, the total maximum score is 180.
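
Expressed as a formula, with $s_i$ the score awarded for criterion $i$ and $w_i$ its weight:

$$\text{Total score} = \sum_{i=1}^{20} w_i \, s_i, \qquad w_i \in \{1, 2\}, \quad s_i \in \{1, \dots, 6\},$$

so the maximum is $10 \times (2 \times 6) + 10 \times (1 \times 6) = 180$.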

On the basis of the assessment scale described above, the threshold for a positive audit decision is 60 per cent of the maximum total score. This means the average score on the individual criteria has to be slightly higher than "adequate". In order to establish the IREG Ranking Audit as a quality label, none of the core criteria may be assessed with a score lower than three.
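
For illustration only, the decision rule can be written out in a few lines of code. This is a minimal sketch rather than part of the official procedure; the criterion numbers and weights follow the table in section 3.2, while the function name and example scores are invented:

```python
# Core criteria carry weight 2, regular criteria weight 1 (section 3.2).
CORE = {1, 2, 4, 7, 8, 9, 13, 16, 18, 20}
REGULAR = {3, 5, 6, 10, 11, 12, 14, 15, 17, 19}

def audit_decision(scores: dict) -> tuple:
    """scores maps criterion number (1-20) to a grade on the 1-6 scale.

    Returns (total score, positive decision?).
    """
    total = sum(2 * scores[c] for c in CORE) + sum(scores[c] for c in REGULAR)
    max_total = 12 * len(CORE) + 6 * len(REGULAR)  # 120 + 60 = 180
    # Positive decision: at least 60 per cent of the maximum total score,
    # and no core criterion graded below 3 ("adequate").
    positive = total >= 0.6 * max_total and all(scores[c] >= 3 for c in CORE)
    return total, positive

example = {c: 4 for c in range(1, 21)}  # "good" on every criterion
print(audit_decision(example))          # (120, True): 120 >= 108
```

A ranking graded 4 ("good") throughout thus scores 10 × 8 + 10 × 4 = 120 points, above the threshold of 0.6 × 180 = 108 points.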


3.2 Weights of IREG Ranking Audit Criteria

Criterion (short description) – Weight

PURPOSE, TARGET GROUPS, BASIC APPROACH
1. The purpose of the ranking and the (main) target groups should be made explicit. – 2
2. Rankings should recognize the diversity of institutions. – 2
3. Rankings should specify the linguistic, cultural, economic, and historical contexts of the educational systems being ranked. – 1

METHODOLOGY
4. Rankings should choose indicators according to their relevance and validity. – 2
5. The concept of quality of higher education institutions is multidimensional and multi-perspective (...). Good ranking practice is to combine the different perspectives. – 1
6. Rankings should measure outcomes in preference to inputs whenever possible. – 1
7. Rankings have to be transparent regarding the methodology used for creating the rankings. – 2
8. If rankings use composite indicators the weights of the individual indicators have to be published. Changes in weights over time should be limited and justified by methodological or conceptual considerations. – 2
9. Data used in the rankings must be obtained from authorized, audited and verifiable data sources and/or collected with proper procedures for professional data collection. – 2
10. The basic methodology should be kept as stable as possible. – 1

PUBLICATION AND PRESENTATION OF RESULTS
11. The publication of a ranking has to be made available to users throughout the year, in print publications and/or in an online version of the ranking. – 1
12. The publication has to deliver a description of the methods and indicators used in the ranking. – 1
13. The publication of the ranking must provide the scores of each individual indicator used to calculate a composite indicator, in order to allow users to verify the calculation of ranking results. – 2
14. Rankings should allow users to have some opportunity to make their own decisions about the relevance and weights of indicators. – 1

TRANSPARENCY, RESPONSIVENESS
15. Rankings should be compiled in a way that eliminates or reduces errors. – 1
16. Rankings have to be responsive to higher education institutions included in/participating in the ranking. – 2
17. Rankings have to provide a contact address in their publication (print, online version). – 1

QUALITY ASSURANCE
18. Rankings have to apply measures of quality assurance to the ranking processes themselves. – 2
19. Rankings have to document their internal processes of quality assurance. – 1
20. Rankings should apply organisational measures that enhance the credibility of rankings. – 2

MAXIMUM TOTAL SCORE (with the 6-grade assessment scale): 180

4 THE RANKING AUDIT PROCESS

This section of the manual is designed to help ranking organisations learn how to assemble and present requested information and other evidence at all stages of the IREG Ranking Audit. It also serves the Secretariat of the IREG Observatory as well as the audit teams in preparing and conducting all stages of the audit process – collection of information, team visits, and writing the reports. The audit process follows the structure, procedures, processes and good practices established in other forms of quality assurance, in particular accreditation procedures covering higher education institutions as well as their study programs and other activities.

The process includes a number of steps that are described in this section of the manual. The actors involved are:

• The IREG Executive Committee has overall responsibility for the audit, in order to assure the highest standards and impartiality of the process, and takes the decision about the approval of rankings.

• The IREG Ranking Audit Teams are nominated by the Executive Committee, in consultation with the Coordinator of IREG Ranking Audit, out of a pool of auditors. The Audit Team prepares a report and a recommendation on the approval of a ranking for the Executive Committee.

• The Coordinator of IREG Ranking Audit: In order to assure the impartiality and the highest professional and deontological standards of the audit process, the Executive Committee appoints a Coordinator of IREG Ranking Audit for a period of three years. He/she is not a member of the Executive Committee and is not involved in doing rankings. His/her task is to guarantee that all stages of the process as well as the collected evidence (i.e. the self-reports submitted by ranking organisations and the audit reports drafted by the Audit Teams) meet the standards set by this manual. He/she provides advice on the composition of the audit teams, reviews the reports drafted by the Audit Teams and submits a recommendation to the Executive Committee, but does not participate in the vote. The Coordinator of IREG Ranking Audit receives organisational support from the Secretariat of the IREG Observatory.

• The IREG Observatory Secretariat gives administrative and technical support to the Audit Teams and the Audit Coordinator. The Secretariat is the contact address for the ranking organisation.

• The ranking organisation applying for the IREG Ranking Audit: The ranking organisation has to submit all relevant information to the IREG Observatory, in particular in the form of a self-report, and is involved in communication and interaction with the IREG Observatory throughout the process.

The following overview shows the whole audit process step by step. The individual steps and procedures are described in the sections that follow.

Overview: The IREG Ranking Audit Process

• Ranking organisation: application for the ranking audit.
• IREG Secretariat: check of eligibility; audit manual and materials sent to the ranking (within 1 month of application).
• Executive Committee: setup of the audit group.
• Ranking organisation: preparation of the self-report (2 months).
• IREG Audit Coordinator: check of the self-report (completeness, consistency); distribution of the report to the audit group (within 1 month).
• IREG Audit Team: check of the self-report; sending comments and additional questions (within 1 month).
• IREG Audit Coordinator: forwarding the additional questions to the ranking.
• Ranking organisation: answering the additional questions (1 month).
• IREG Audit Team: on-site visit to the ranking (on invitation by the ranking only); drafting of the audit report (2 months).
• IREG Audit Coordinator: check of the audit report (coherence with criteria and standards); sending the report to the ranking.
• Ranking organisation: reaction/statement to the report (1 month).
• IREG Audit Coordinator: submitting the report and the ranking's statement to the Executive Committee.
• Executive Committee: AUDIT DECISION; information to the ranking. A positive decision entitles the ranking to the "IREG approved" label and is published; a negative decision is not published.

4.1. Eligibility and Formal Application

Eligible for the IREG Ranking Audit are national and international rankings in the field of higher education that have been published at least twice within the last four years. The last release should not be older than two years.

The Ranking Audit and the approval refer to individual rankings, not to the ranking organisation as a whole. If a ranking organisation produces several rankings based on the same basic methodology, they can be audited in one review, but the decision will be made for each individual ranking.

A ranking organisation that wishes to enter the IREG Ranking Audit process sends an application letter to the President of IREG and completes a datasheet containing basic data about the ranking and the ranking organisation. The datasheet can be downloaded from the IREG website. The IREG Secretariat may request further clarification if this appears necessary.

The decision about the start of the audit process will be made by the Executive Committee by a simple majority of its members. Members who are related to the applying ranking (either as part of the ranking organisation or as a member of any body of the ranking, executive or advisory) are excluded from the vote.

The decision about the start of the Ranking Audit will be communicated to the ranking within four weeks after application. Together with the decision about eligibility, the ranking organisation will be informed about the members of the Audit Team. The names of the auditors have to be treated confidentially by the ranking organisation.

The Ranking Audit has been conceived as a public-responsibility initiative. For this reason its financing is based on a cost-recovery principle. The fee, which is periodically established by the Executive Committee, takes into account the costs of organising and conducting the audit. The fee is 50 per cent higher for non-members of the IREG Observatory. The ranking organisation has to pay the fee for the audit process within two weeks after receiving confirmation of the start of the audit from the IREG Observatory.

4.2. Nomination and Appointment of an Audit Team

The nomination of an Audit Team will be made by the Executive Committee after the decision about the start of an audit process has been made.

The Audit Team consists of three to five members. In order to guarantee independence, the majority of auditors are not actively involved in doing rankings. The IREG Executive Committee appoints one member of the Audit Team to chair the team. In order to guarantee the neutrality and independence of the Audit Teams, the chairs of Audit Teams are not formally associated with an organisation that is doing rankings.

There is no single best model for the composition of Audit Teams. The key requirements are that auditors should be independent of the ranking(s) under review and have a sufficient level of knowledge, experience and expertise to conduct the Ranking Audit to a high standard. The acceptance of the IREG Ranking Audit will largely depend on the quality and integrity of the Audit Teams. The Executive Committee can also consult the Coordinator of IREG Ranking Audit.

Members of an Audit Team should represent a range of professional experience in higher education, quality assurance and the assessment of higher education institutions or systems. With regard to the audit of national rankings, at least one member of the Audit Team should have a sound knowledge of the respective national higher education system. International auditors in the team can provide valuable insights for the audit and help to enhance its credibility; therefore at least one member of the Audit Team should be an expert from outside the country or countries (in the case of cross-national regional rankings) covered by the ranking. The members of Audit Teams for global rankings should represent the diversity of regions and cultures. IREG aims to include experts from quality assurance agencies who are experienced in processes of evaluating, accrediting or auditing institutions or agencies in the field of higher education.

Auditors will be required to notify the Executive Committee in writing of any connection or interest which could result in a conflict, or potential conflict, of interest related to the audit. In particular, auditors have to notify the Executive Committee of any commitment to the ranking(s) under review. This includes prior association with the ranking organisation and membership in any executive or advisory board of the ranking organisation or any of its rankings. If auditors are unsure as to whether an interest or conflict should be disclosed, they should discuss the matter with the IREG Ranking Audit Coordinator.

4.3. Avoidance of Conflict of Interest

In carrying out its audit responsibilities, the Audit Team should ensure that its participation and decisions are based solely on the application of the criteria and procedures of the IREG Ranking Audit and on professional judgement. Therefore, it is important to avoid any conflict of interest or appearance of a conflict of interest.

A conflict of interest is defined as any circumstance in which an individual's capacity to make an impartial and unbiased decision may be affected because of prior, current, or anticipated institutional affiliation(s) or other significant relationship(s) with an evaluated organisation or institution.

In order to avoid conflicts of interest, all members of the Audit Team should be independent of the institution being reviewed, with no personal, professional or commercial relationships that could lead to a conflict of interest.

When first approached about participating in a particular audit, the proposed member of the Audit Team will be asked to indicate any potential conflict of interest or prior association which could appear to influence their judgment. The members of the team will be asked to sign a Declaration [see Appendix] certifying that they have no conflict of interest with the institution under review. If a member has any doubts about whether a past relationship could be considered a conflict of interest, details should be provided to the Executive Committee and the IREG Ranking Audit Coordinator for consideration.

As a general rule, members of the Executive Committee will not be included in Audit Teams for audits of rankings published in their own country (with regard to audits of national rankings); with regard to audits of global rankings, members of the Executive Committee who are doing global rankings will not be part of Audit Teams either.

4.4. Production of a Self-report by the Ranking Organisation

The production of the self-report by the ranking organisation is an essential part of the audit process, and the self-report is the major portion of the evidence on which the Audit Team will draw in forming its report and its suggestion regarding the audit decision. All claims, judgments and statements made by the ranking organisation should be backed up by the facts necessary to corroborate them. The self-report has to be submitted within two months after the start of the Ranking Audit has been communicated to the ranking organisation by the IREG Observatory.

The self-report has to follow the structure determined by IREG, which is sent to the ranking organisation. According to the model structure (see Appendix), issues to be dealt with in the report include:
• information on the previous record of ranking activities;
• an outline of the purpose and main target groups of the ranking(s);
• information on the scope of the ranking in terms of regional coverage, types of institutions included, fields, time cycle of publication etc.;
• a detailed description of the methodology;
• a description of the instruments of internal quality assurance of the ranking;
• an outline of the publication and use of the ranking; and
• as far as available, information on the impact of the ranking at the level of the individual (e.g. on student choice), of the institutions and of the higher education system.

The assessment has to be based on the current version of the ranking. Nevertheless, the report should include a section on recent and planned changes.

The ranking organisation can decide about relevant annexes to the report beyond the mandatory deliverables (within reason, preferably not more than 5 to 10 annexes).

The self-report has to be in English; annexes should preferably be in English, too. The report, together with annexes and additional documents, should be sent to the IREG Secretariat in electronic copy, and to each of the Audit Team members and the IREG Ranking Audit Coordinator in both electronic and hard copy (printed double-sided). The addresses of the Audit Team members will be provided by the IREG Secretariat.


The report and materials delivered by the ranking organisation will be checked for completeness and coherence by the IREG Ranking Audit Coordinator. The ranking organisation will be informed about additional requirements within four weeks after delivery of the report. A revised report should then be provided within another four weeks.

It is important to note that should a self-report be considered an inadequate basis for the audit report, the whole process may be postponed (including a site visit that may already have been planned). In this case, any additional costs incurred, e.g. in the rescheduling of auditor flights, will be at the expense of the ranking organisation.

4.5. Interaction Ranking Organisation – Audit Team

In order to guarantee a correct understanding of the ranking and a high quality of the audit report and decision, interaction between the ranking organisation and the Audit Team/IREG Observatory is a necessary part of the process.

Communication should preferably take place by e-mail. All communication should be distributed to all members of the Audit Team, to the IREG Ranking Audit Coordinator, to the IREG Secretariat and to the contact person responsible for the audit in the ranking organisation. The IREG Observatory will provide a mailing list to the ranking organisation.

Specification of the interaction process:
• After a first check of the report by the IREG Ranking Audit Coordinator, the Audit Team will react to the self-report within four weeks with written questions, comments and requests for additional information and materials. This feedback will be sent electronically to the ranking organisation by the IREG Secretariat.
• The ranking organisation is expected to answer the additional questions within one month. The answer by the ranking organisation should be submitted electronically to the whole mailing list described above and in hard copy to the IREG Secretariat.
• In order to avoid a flood of information at this stage, the ranking organisation should send any additional material to the Audit Team beyond the written answers to the additional questions only after consultation with the IREG Ranking Audit Coordinator.
• The decision about the audit is taken by the Executive Committee (based on the report by the Audit Team). Hence members of the Audit Team and the IREG Ranking Audit Coordinator are asked not to give any statements on the possible outcome of the audit process to the ranking organisation.

At the end of the interaction process the IREG Secretariat informs the ranking organisation about the formal closing of the interaction.

Optional: Site visits (on invitation by the ranking organisation)

In order to keep the costs of the Ranking Audit low, a visit to the ranking organisation is not planned as a regular element of the audit process. Yet the ranking organisation is free to invite the Audit Team to a visit. An invitation for such a visit has to be extended together with the application for the Ranking Audit. Members of the Audit Team are free to accept the invitation. At least two members of the Audit Team have to participate in a visit. The costs of the visit (travel costs, accommodation) for members of the Audit Team have to be paid by the ranking organisation.

Site visits should preferably take place after the additional questions have been sent to the ranking organisation. IREG expects the head of the ranking organisation, the person responsible for the Ranking Audit, all employees and the heads of existing advisory bodies to be available during the visit. The duration and agenda of the visit will be scheduled by the contact person at the ranking institution and the chair of the Audit Team. The site visit is normally conducted in English. If the ranking organisation wishes to use interpreters, it should inform the IREG Secretariat at least one month prior to the visit. The ranking organisation will bear the cost of interpretation. When planning the site visit, it should be kept in mind that the use of interpretation will lengthen the duration of meetings and may also lead to some loss of information and full understanding of details.

If there is a site visit, it is a good opportunity for an internal meeting of the Audit Team at the end of the visit. At this meeting the team will review the evidence presented and draw preliminary findings, and if possible put them into a "skeleton" report.

In sum, a site visit can have a number of functions:
• to enable the IREG Ranking Audit Team to share, in personal communication, the impressions gained from the written materials;
• to explore, in meetings with the key individuals at the ranking organisation, their expertise and compliance with the IREG Ranking Audit criteria;
• to formulate the Audit Team's preliminary findings in face-to-face communication; and
• to produce material for the draft report as a basis for further elaboration after the site visit.

To enable the site visit to fulfil these key functions, it is essential that the visit is managed efficiently and effectively. The IREG Ranking Audit Team should avoid any impression that a "social program" might have influenced their evaluation of the ranking.

4.6. Production of a Ranking Audit Report

The Ranking Audit report by the Audit Team is the main basis of the decision about the approval of the ranking by the Executive Committee. The report will be drafted by the chair of the Audit Team in collaboration with the other members of the Audit Team. The basis for the audit report is the self-report and additional materials provided by the ranking organisation, the communication between the ranking organisation and the Audit Team (including the site visit if it was part of the process) and the Audit Team's findings.

The audit report should not exceed 15 to 20 pages (font size 11). The draft report will be sent to the members of the Audit Team by the IREG Secretariat for comments and approval. They should submit their comments within two weeks. The whole report should be finalised and sent to the ranking organisation not later than three months after the formal statement of the end of interaction (see 4.5). A major part of the audit report is the Criteria Audit Form. Each criterion is assessed with a score from one to six. The final audit score is calculated as the sum of the products of each criterion score and the weight of that criterion (see chapter 3). The auditors have to agree upon the scores for each individual criterion.

A generic structure of the report:
1) Executive Summary and Criteria Assessment Form
2) The ranking organisation
3) The ranking(s) under audit – short description
   a) Scope (countries, types of institutions, ...)
   b) Purpose and target groups
   c) Methodology and data sources
   d) Indicators
4) Findings and assessment of criteria
   a) Purpose, target groups, basic approach (criteria 1 to 3)
   b) Methodology (criteria 4 to 10)
   c) Publication and presentation of results (criteria 11 to 14)
   d) Transparency, responsiveness (criteria 15 to 17)
   e) Quality assurance (criteria 18 to 20)
   f) Final assessment (total score)
5) Recommendations and – if applicable – conditions to be fulfilled within one year

The draft of the audit report is submitted to the IREG Ranking Audit Coordinator, who checks whether the report fulfils the standards for audit reports set out in this manual and whether the assessment of the ranking is coherent with prior Ranking Audit exercises. He/she can recommend a revision of the report. The report will be sent electronically to the ranking organisation within two months after the end of the interaction between the ranking organisation and the Audit Team.

The ranking organisation can formulate a reaction to the report within one month. The reaction has to be sent electronically to the IREG Secretariat, which will forward it to the IREG Ranking Audit Coordinator and the members of the Audit Team.

The audit report and the reaction of the ranking organisation are submitted to the IREG Observatory Executive Committee.

4.7. Ranking Audit Decision

Based on a statement by the IREG Ranking Audit Coordinator, the Executive Committee verifies that the report applies the criteria for the Ranking Audit. The Executive Committee decides about the approval of the ranking on the basis of the audit report delivered by the Audit Team and the reaction to the report submitted by the ranking organisation.

The decision is made by a simple majority of the members of the Executive Committee. Members of the Committee who are or used to be associated with the ranking under audit (either directly or as a member of any body of the ranking organisation) are excluded from the vote. The vote can take place either in a meeting of the Executive Committee or electronically.


4.8. Management of Disputes and Appeals

The process of audit and the preparation of reports are intended to be consultative and supportive rather than critical and adversarial. Nevertheless, it is possible that differences of opinion or disputes may arise, or that judgments about the conduct of the audit or its negative result may be contested. Consequently, procedures should be made available for the resolution of disputes.

Complaints or disputes may relate to the following kinds of issues:
1. The way the tasks of the Audit Team or a particular member have been carried out.
2. Errors of fact, or misinterpretations of the evidence presented in the self-report or other documents presented during the audit.
3. A faulty decision reached by the Audit Team on the results of the audit.

In order to assure that the appeal process demonstrates procedural fairness for the appellant, it is proposed that:

Ad 1. Any complaints about the procedures followed by members of the Audit Team should be addressed to the IREG Ranking Audit Coordinator (with a copy to the IREG Secretariat), who will consider the matter, determine a response, and advise the complainant of the action taken. In considering the matter, the IREG Ranking Audit Coordinator may at his/her discretion seek advice from members of the IREG Observatory who are independent of the dispute and have not been involved with the given audit review. The final decision will be made by the IREG Ranking Audit Coordinator.

Ad 2. The ranking organisation will be given an opportunity to comment on significant matters of fact that may have been overlooked or misunderstood during the review or site visit, and to request corrections. The IREG Ranking Audit Coordinator will consider any such requests and may consult the chair of the Audit Team in doing so. The ranking organisation will be advised of the response and provided with a copy of the revised wording in the draft report. In case of further dispute a second, and final, round of clarification is envisaged.

Ad 3. The audit results are the outcome of the work of the Audit Team and the ranking organisation, and the conclusions should be based on evidence. If there is a dispute over the judgment of the Audit Team on the audit results, the ranking organisation may submit an appeal to the IREG Ranking Audit Coordinator, citing evidence in support of its appeal in relation to the IREG Ranking Audit criteria. The Coordinator will consider the submission and, if he/she believes there are good reasons for considering the appeal, will appoint an ad-hoc three-person appeal board to advise on the matter. The members of the board will be selected from the members of the IREG Observatory in consultation with the President of the IREG Observatory. The IREG Ranking Audit Coordinator will chair the board. The appeal board may recommend rejection of the appeal if it believes the decision made was reasonable in the light of the evidence and the criteria adopted for the audit. The decision of the board will be final; in case of rejection the ranking organisation is consequently not allowed to claim the status "IREG approved".

If the appeal board believes there is insufficient evidence to make a fully informed decision, due to some ambiguity in technicalities or in the interpretation of evidence, it may recommend that the decision be suspended and a further full or partial new audit be undertaken. The financial aspects of such an audit are subject to a decision of the Executive Committee of the IREG Observatory.

4.9. Publication of Ranking Audit Results

The audit decision does not have any formal consequences, e.g. concerning IREG Observatory membership. In order not to deter rankings from undergoing the audit procedure, only positive audit decisions will be published. The IREG Observatory hopes that negative decisions will create incentives for the ranking organisation to enhance the quality of its ranking(s) – and perhaps to apply for a second audit at a later time.

The audit decision and the executive summary of the audit report are published on the IREG website. The detailed report can be published by agreement between IREG and the audited ranking organisation. The audits will not be turned into a ranking of rankings, and hence the audit scores will not be published.


APPENDIX

APPENDIX 1: Data Sheet – to be attached to the application for audit

A. Information on the Ranking Organisation

Name of the ranking organisation:
Head/director of the organisation:
Year of foundation of the ranking organisation:

Kind of organisation (please select):
• Commercial/for-profit (incl. media)
• Private, non-profit
• University/higher education institution
• Independent public organisation
• State organisation
• Other: …………………………………………………

Major activities (besides ranking):

Is the ranking organisation part of a parent organisation?
• No
• Yes – if so, name of the organisation:

Address of the ranking organisation:
Street / Post box
Zip code / City / Country
URL

Contact person for the ranking audit (please nominate a contact person to whom all communication will be directed):
Name / Function
E-mail address / Phone / Fax

Name(s) of the ranking(s) for which IREG approval is sought (please note that only rankings with a largely similar methodology can be audited jointly!):
Ranking 1:
Ranking 2:
Ranking 3:

If you apply for a joint audit of several rankings, please describe the similarity in methodology:


B. Information on Ranking(s)
Note: This sheet has to be submitted for each ranking for which IREG approval is sought.

Name of the ranking:

Cycle of publication/update of results:
• Annual
• Other:
First year of publication:
Most recent year of publication:

Regional scope:
• National – country:
• Regional/cross-national – countries:
• International/global

Level of ranking/analysis (multiple answers are possible):
• Institutional ranking
• Broad fields (e.g. humanities)
• Fields (e.g. history)
• Programmes

Major dimensions covered:
Dimension 1:
Dimension 2:
Dimension 3:
Dimension 4:
Dimension 5:
Dimension 6:

Publication (multiple answers are possible):
• Print: part of magazine/other publication; special publication
• Online: Internet URL; mobile app
Is there an online publication on the ranking methodology? If so, URL:

APPENDIX 2: Structure of the Self-report

1. Information on the ranking organisation
1.1. Name and address
1.2. Type (academic/non-academic, public/private, for-profit/non-profit)
1.3. Financing model/sources of the ranking
1.4. Contact person

2. Information on the previous record of the ranking activity
2.1. Date of first publication and brief history
2.2. Publication frequency
2.3. Date of the latest two publications

3. Purpose and main target groups of the ranking
3.1. Purpose
3.2. Main target groups/users

4. Scope of the ranking
4.1. Geographical scope (global, international (e.g. European), national, etc.)
4.2. Types of institutions, number of institutions ranked, number of institutions published
4.3. Level of comparison (e.g. institutional, field-based, etc.)

5. Methodology of the ranking
5.1. General approach; options include an overall composite score based on fixed weights for different indicators, separate indicators that allow customised rankings, or other aggregation methods
5.2. Measured dimensions (research, teaching & learning, third mission, etc.)
5.3. Indicators (relevance, definitions, weights, etc.)
5.4. Data sources; options include third-party databases (data not provided by universities), data collected from universities by third-party agencies, data collected from universities by the ranking organisation (or its representative), surveys of university staff or students conducted by the ranking organisation in collaboration with the universities, surveys conducted by the ranking organisation exclusively, etc.
5.5. Transparency of methodology
5.6. Display of ranking results (league tables, groups or clusters, or a mixed form)

6. Quality assurance of the ranking
6.1. Quality assurance of data collection and processing
6.2. Organisational measures for quality assurance (consultants, boards, etc.)

7. Publication and use of the ranking
7.1. Type of publication (print, online or both)
7.2. Language of publication (primary language, other available languages)
7.3. Access for users of the rankings (registration, fees)

8. Impact of the ranking
8.1. Impact on the personal level (students, parents, researchers, etc.)
8.2. Impact on the institutional level (the higher education institution as a whole, rectors and presidents, deans, students, administration, etc.)
8.3. Impact on the higher education system

APPENDIX 3: Declaration Related to Conflict of Interest

(to be signed by members of IREG Ranking Audit Teams)

I have read and fully understand the IREG Ranking Audit policy with regard to conflict of interest and, to the best of my knowledge, declare that there are no situations or circumstances which may be considered conflicts of interest, or potential conflicts of interest, related to my participation as a member of the Audit Team undertaking the following audit:

Name of the audited organisation:

Printed name:

Signature:                    Date:

APPENDIX 4: Berlin Principles on Ranking of Higher Education Institutions

Rankings and league tables of higher education institutions (HEIs) and programs are a global phenomenon. They serve many purposes: they respond to demands from consumers for easily interpretable information on the standing of higher education institutions; they stimulate competition among them; they provide some of the rationale for the allocation of funds; and they help differentiate among different types of institutions and different programs and disciplines. In addition, when correctly understood and interpreted, they contribute to the definition of "quality" of higher education institutions within a particular country, complementing the rigorous work conducted in the context of quality assessment and review performed by public and independent accrediting agencies. This is why rankings of HEIs have become part of the framework of national accountability and quality assurance processes, and why more nations are likely to see the development of rankings in the future. Given this trend, it is important that those producing rankings and league tables hold themselves accountable for quality in their own data collection, methodology, and dissemination.

In view of the above, the International Ranking Expert Group (IREG) was founded in 2004 by the UNESCO European Centre for Higher Education (UNESCO-CEPES) in Bucharest and the Institute for Higher Education Policy in Washington, DC. It was upon this initiative that IREG's second meeting (Berlin, 18 to 20 May 2006) was convened to consider a set of principles of quality and good practice in HEI rankings – the Berlin Principles on Ranking of Higher Education Institutions.

It is expected that this initiative has set a framework for the elaboration and dissemination of rankings – whether they are national, regional, or global in scope – that ultimately will lead to a system of continuous improvement and refinement of the methodologies used to conduct these rankings. Given the heterogeneity of ranking methodologies, these principles of good ranking practice will be useful for the improvement and evaluation of rankings.

Rankings and league tables should:

A) Purposes and Goals of Rankings

1. Be one of a number of diverse approaches to the assessment of higher education inputs, processes, and outputs. Rankings can provide comparative information and improved understanding of higher education, but should not be the main method for assessing what higher education is and does. Rankings provide a market-based perspective that can complement the work of government, accrediting authorities, and independent review agencies.

2. Be clear about their purpose and their target groups. Rankings have to be designed with due regard to their purpose. Indicators designed to meet a particular objective or to inform one target group may not be adequate for different purposes or target groups.

3. Recognize the diversity of institutions and take the different missions and goals of institutions into account. Quality measures for research-oriented institutions, for example, are quite different from those that are appropriate for institutions that provide broad access to underserved communities. Institutions that are being ranked and the experts that inform the ranking process should be consulted often.

4. Provide clarity about the range of information sources for rankings and the messages each source generates. The relevance of ranking results depends on the audiences receiving the information and the sources of that information (such as databases, students, professors, employers). Good practice would be to combine the different perspectives provided by those sources in order to get a more complete view of each higher education institution included in the ranking.

5. Specify the linguistic, cultural, economic, and historical contexts of the educational systems being ranked. International rankings in particular should be aware of possible biases and be precise about their objective. Not all nations or systems share the same values and beliefs about what constitutes "quality" in tertiary institutions, and ranking systems should not be devised to force such comparisons.

B) Design and Weighting of Indicators

6. Be transparent regarding the methodology used for creating the rankings. The choice of methods used to prepare rankings should be clear and unambiguous. This transparency should include the calculation of indicators as well as the origin of data.

7. Choose indicators according to their relevance and validity. The choice of data should be grounded in recognition of the ability of each measure to represent quality and academic and institutional strengths, and not the availability of data. Be clear about why measures were included and what they are meant to represent.

8. Measure outcomes in preference to inputs whenever possible. Data on inputs are relevant as they reflect the general condition of a given establishment and are more frequently available. Measures of outcomes provide a more accurate assessment of the standing and/or quality of a given institution or program, and compilers of rankings should ensure that an appropriate balance is achieved.

9. Make the weights assigned to different indicators (if used) prominent and limit changes to them. Changes in weights make it difficult for consumers to discern whether an institution's or program's status changed in the rankings due to an inherent difference or due to a methodological change.

C) Collection and Processing of Data

10. Pay due attention to ethical standards and the good practice recommendations articulated in these Principles. In order to assure the credibility of each ranking, those responsible for collecting and using data and undertaking on-site visits should be as objective and impartial as possible.

11. Use audited and verifiable data whenever possible. Such data have several advantages, including the fact that they have been accepted by institutions and that they are comparable and compatible across institutions.

12. Include data that are collected with proper procedures for scientific data collection. Data collected from an unrepresentative or skewed subset of students, faculty, or other parties may not accurately represent an institution or program and should be excluded.

13. Apply measures of quality assurance to ranking processes themselves. These processes should take note of the expertise that is being applied to evaluate institutions and use this knowledge to evaluate the ranking itself. Rankings should be learning systems continuously utilizing this expertise to develop methodology.

14. Apply organisational measures that enhance the credibility of rankings. These measures could include advisory or even supervisory bodies, preferably with some international participation.

D) Presentation of Ranking Results

15. Provide consumers with a clear understanding of all of the factors used to develop a ranking, and offer them a choice in how rankings are displayed. This way, the users of rankings would have a better understanding of the indicators that are used to rank institutions or programs. In addition, they should have some opportunity to make their own decisions about how these indicators should be weighted.

16. Be compiled in a way that eliminates or reduces errors in original data, and be organized and published in a way that errors and faults can be corrected. Institutions and the public should be informed about errors that have occurred.

Berlin, 20 May 2006.


If you need more information, please contact:

[email protected]

IREG Observatory on Academic Ranking and Excellence

Rue Washington 40

1050 Brussels, Belgium

Secretariat:

IREG Observatory Secretariat

Nowogrodzka 31 room 415

00-511 Warsaw, Poland

[email protected]

www.ireg-observatory.org