
Review title: Peer Observation of Teaching in Health Professionals Education – Accuracy of Observation Tools – A BEME Systematic Review

Review Group:

Professor Zena Moore (Lead reviewer), Professor, Head of School of Nursing & Midwifery, Royal College of Surgeons in Ireland (RCSI); [email protected]

Dr Richard Arnett, Associate Director, Quality Enhancement Office, RCSI; [email protected]

Ms Jane Burns, Research Officer, Health Professions Education Centre, RCSI; [email protected]

Dr Martina Crehan, Curriculum Innovator, Health Professions Education Centre, RCSI; [email protected]

Ms Grainne McCabe, Information Services Librarian, RCSI; [email protected]

Dr Tom O’Connor, Senior Lecturer, Deputy Head of School, School of Nursing & Midwifery, RCSI; [email protected]

Dr Tommy Kyaw Tun, Consultant Endocrinologist/Senior Lecturer, RCSI; Connolly Hospital, Blanchardstown, Dublin 15, Ireland; [email protected]

Ms Anne Weadick, Executive Assistant, Quality Enhancement Office, RCSI; [email protected]

Professor Teresa Pawlikowska, Director, Health Professions Education Centre, RCSI; [email protected]


Table of Experience of Review Group

Professor Zena Moore (PI)
RCSI role: Head of School of Nursing & Midwifery, Royal College of Surgeons in Ireland (RCSI)
Experience with BEME: First-time involvement with a BEME review
Experience with systematic reviewing generally: Currently involved with 15 Cochrane systematic reviews, 5 of which are completed
Area of speciality/research/skills: RCTs; systematic review and meta-analysis; quantitative research methods; peer observation of teaching

Dr Richard Arnett
RCSI role: Associate Director, Quality Enhancement Office, RCSI
Experience with BEME: First-time involvement with a BEME review
Experience with systematic reviewing generally: Systematic reviews for research
Area of speciality/research/skills: Experience in assessment; strong background in statistics applied in internal and external consultancy; psychometrics; data collection; quantitative analyses; qualitative analyses

Ms Jane Burns
RCSI role: Research Officer, Health Professions Education Centre (RCSI)
Experience with BEME: BEME Coordinator for the HPEC/BICC Centre; currently involved with 2 other BEME reviews
Experience with systematic reviewing generally: Teaching/advising on search strategies for systematic reviews; Cochrane reviews; systematic reviews for research
Area of speciality/research/skills: Information management; advanced searching skills; database development; open access; library management; digital technology

Dr Martina Crehan
RCSI role: Curriculum Innovator, Health Professions Education Centre, RCSI
Experience with BEME: Reviewer role as part of the HPEC/BICC Centre
Experience with systematic reviewing generally: Systematic reviews for research; RCSI BEME BICC Panel of Experts register; experience of qualitative methodology for DEd thesis
Area of speciality/research/skills: Co-ordinates and supervises a peer observation of teaching process within a Postgraduate Diploma in Higher Education Teaching; delivers faculty training in peer observation of teaching; general educational research on the impact of professional development programmes on discipline-specific faculty, reflective practice and student transition; qualitative analysis skills and expertise

Ms Grainne McCabe
RCSI role: Information Services Librarian, RCSI
Experience with BEME: First-time involvement with a BEME review; also involved in another BEME review in development at RCSI
Experience with systematic reviewing generally: Teaching/advising on search strategies for systematic reviews
Area of speciality/research/skills: Library skills; information literacy; search skills; advanced search and database management skills

Dr Tom O’Connor
RCSI role: Senior Lecturer, Deputy Head of School, School of Nursing & Midwifery, RCSI
Experience with BEME: First-time involvement with a BEME review
Experience with systematic reviewing generally: Cochrane reviews; systematic reviews for research; qualitative and quantitative research experience
Area of speciality/research/skills: Research in health care education (development of critical thinking abilities); effects of education interventions on the development of risk management abilities in pressure ulcers; evaluating interventions to prevent elder abuse

Dr Tommy Kyaw Tun
RCSI role: Consultant Endocrinologist/Senior Lecturer, RCSI; Connolly Hospital Blanchardstown, Dublin 15
Experience with BEME: First-time involvement with a BEME review
Experience with systematic reviewing generally: Systematic reviews for research; qualitative and quantitative research experience
Area of speciality/research/skills: Completed an MSc in Leadership in Health Professions Education, including a dissertation on feedback; role in training and supervising interns; feedback and feedforward

Ms Anne Weadick
RCSI role: Executive Assistant, Quality Enhancement Office, RCSI
Experience with BEME: First-time involvement with a BEME review
Experience with systematic reviewing generally: Systematic reviews for research; qualitative and quantitative research experience
Area of speciality/research/skills: Researching peer observation of teaching for an MSc dissertation; experience of quantitative and qualitative data collection and analysis

Professor Teresa Pawlikowska
RCSI role: Director, Health Professions Education Centre, RCSI
Experience with BEME: BEME Lead for the HPEC/BICC Centre; currently involved with 2 other BEME reviews; Chair of Review Panel
Experience with systematic reviewing generally: Systematic reviews for research; qualitative and quantitative research and synthesis; PI on a range of research projects; PhD using mixed research methods
Area of speciality/research/skills: Communication and consultation skills teaching and assessment; clinical reasoning; methodology


Contact details for the lead reviewer

Professor Zena Moore (Lead reviewer), Professor, Head of School of Nursing & Midwifery, Royal College of Surgeons in Ireland, 123 St. Stephen's Green, Dublin 2, Ireland
Tel: 0035314022569; Email: [email protected]

Sources of support

Internal: Dean, Faculty of Medicine & Health Sciences, Royal College of Surgeons in Ireland

Abstract

Background: Peer observation of teaching is a way to evaluate teaching quality and may be undertaken using either formative or summative methods. This BEME review will systematically review the existing literature, focusing specifically on the evaluation of the reliability and validity of tools used for peer observation of teaching in health professionals’ education.

Methods: A review of the literature will be undertaken using defined key search terms and databases. Following the BEME protocol structure, the researchers will evaluate content against developed inclusion and exclusion criteria. The protocol sets out to answer the research question, which aims to evaluate the reliability and validity of tools used for peer observation of teaching in health professionals’ education.

Results: A modified BEME coding form will be used for data extraction and auditing. An evaluation of the methodological strength of the identified studies will be performed using the BEME coding form ‘Strength of Findings’ model (Dornan et al., 2006).

Discussion: The results of this review will identify research papers describing tools that can be used for peer observation of teaching in health professionals’ education.


1.0 Background

Peer Observation of Teaching (POT) is defined as “a collaborative and reciprocal process whereby one peer observes another’s teaching (actual or virtual) and provides supportive and constructive feedback” (Lublin, 2002:5).

POT is a method of evaluating teaching quality that may be either formative or summative (McMahon et al., 2007). Formative methods are concerned with staff development initiatives, whereas summative methods are most closely aligned with performance appraisal and external quality assurance systems (McMahon et al., 2007).

For students to achieve their potential they need to be exposed to the highest standards of teaching and learning within an environment conducive to the attainment of academic goals (Department of Education and Skills, 2011). The quality of teaching offered by higher education institutions is fundamental to achieving this goal; this is true for the wider education field and is of equal importance in health professionals’ education (Hendry and Oliver, 2012). Therefore, owing to a number of diverse drivers, such as public expectation, economic competitiveness and the drive for enhanced quality within institutions, evaluation of teaching is increasingly seen as central to individual and organisational growth and development (Hénard, 2010).

The ultimate aim of POT is to improve teacher and student outcomes, to create a quality working environment and to increase staff development (Cabrera et al., 2001). Thus, with the increasing drive for quality assurance, combined with greater attention by external reviewers to the quality of teaching within higher education, not least within health professionals’ education, POT has become increasingly popular (McMahon et al., 2007).

The concept of POT arose in the United States in the 1960s as a means by which educational organisations could evaluate teaching from the perspective of internal and external quality assurance (McMahon et al., 2007). Furthermore, POT was also used as a means by which teachers were assessed for ongoing tenure and suitability for promotion (McMahon et al., 2007). This approach has remained the main focus of POT within the United States; however, as the concept spread to Australia and the United Kingdom, a formative aspect to POT emerged (McMahon et al., 2007, Iqbal, 2013). Here, the emphasis was placed on the additional contribution of POT to the personal and professional growth of the individuals involved (McMahon et al., 2007). This aspect is seen as key to the potential for enhanced participation in POT strategies (Bell and Mladenovic, 2008). From an Irish perspective, POT began in the early 2000s and is slowly beginning to be seen as important from both a summative and a formative point of view (Murphy Tighe and Bradshaw, 2012).

POT may occur both informally and formally (Newman et al., 2012). When an informal approach is taken, academic colleagues may observe a single teaching session for a fellow colleague, following which feedback and discussion on the observed teaching session are given (Newman et al., 2012). In a formal approach, a similar activity takes place; however, it occurs within the domain of a structured faculty system of POT (Newman et al., 2012). POT is linked to enhanced teaching and learning through reflection, critical thinking and discussion (Hammersley-Fletcher and Orsmond, 2005).

Health professionals account for a large proportion of the student body. In Australia, for example, statistics from December 2013 indicate that there were 25,295 registered psychologists, 252,868 registered nurses, 24,166 registered physiotherapists, 24,867 registered pharmacists and 98,194 registered doctors (Australian Health Practitioner Regulation Agency, 2013). All of these health professionals received education in order to qualify. Clearly, the quality of the education these individuals received has a significant impact on their ability to demonstrate that they are eligible for registration. Furthermore, continuing professional development is necessary to maintain registration and, as such, the ability to work in their chosen field of healthcare. Therefore, POT is as important in the field of health professionals’ education as it is in other education sectors.


The General Medical Council’s 2009 report ‘Tomorrow’s Doctors’ emphasised the importance of, and need for, standards by which to judge the quality of undergraduate teaching and assessment in medical schools (General Medical Council, 2009). Siddiqui et al. (2007) argue that POT is a tool which may provide rich, qualitative evidence for teachers, which can sit alongside measures such as student evaluations. As such, POT has been utilised in a variety of health professions education contexts and settings, for example with paediatric teaching faculty (Sullivan et al., 2012), in nursing as part of a triangulation of methods with student ratings and a teaching portfolio (Berk et al., 2004), and in pharmacy (Davis, 2011). POT is also increasingly used in the clinical environment (Finn et al., 2011).

Bias in peer observation of teaching is a potential problem unless there is a systematic approach to how the POT is undertaken. Bias is defined as a consistent deviation from the truth and is particularly problematic when measures used to assess performance are inadvertently influenced by personal or professional rivalries, or by a lack of skills and training in POT. However, the potential for bias can be reduced through the use of a valid and reliable evaluation instrument (Trujillo et al., 2008).

1.1 Research Question

For the purposes of this review, the following research question has been developed:

“What is the reliability and validity of tools used during peer observation of teaching of health professions in higher education?”

Specifically, the objectives of the review are to establish the reliability and validity of instruments used during peer observation in health professionals’ education, through the conduct of a systematic review and analysis of the existing literature.

We are using the best-practice classic psychometric approach because it fits most closely with the question we wish to answer. In the real world, however, other methods are applied and we would like to be inclusive and fair. Therefore, we will include all relevant methods as appropriate.


1.1.1 Rationale for the choice of research question

Regardless of the model utilised, the POT process usually, although not exclusively, involves an observation tool which is completed by the observer and is used as the basis for feedback between observer and observed. The successful implementation of peer observation of teaching is dependent on the quality of the feedback processes employed (Hammersley-Fletcher and Orsmond, 2004). Guidance on best practice in the selection of such tools varies from institution to institution, and the literature provides many examples of the development and testing of customised templates, for example Beckman et al. (2003) and Trujillo et al. (2008). Institutional practice ranges from utilising a pre-existing instrument for observation to developing customised, in-house instruments (Pattison et al., 2012).

Instruments used during the POT process should demonstrate validity and reliability. Validity is a measure of accuracy and reliability is a measure of consistency (Anthony, 1999). These concepts are important given that POT is seen as a means by which competency may be demonstrated (McCarrick, 2011). Furthermore, POT is one method used to determine whether teaching and learning within an institution are fit for purpose (Hammersley-Fletcher and Orsmond, 2004). Therefore, applying consistent measures which capture the essence of POT is essential to yield meaningful outputs from engagement in the process.

1.1.2 Statement of the significance of the research question

Many issues impact on the use of peer observation of teaching as a developmental tool. Definitions of what constitutes effective pedagogy are not widely shared (Strong et al., 2011). Preconceived notions of what constitutes effective pedagogy, as well as the tendency to utilise a frame of “self” when observing the practice of others, may strongly influence perceptions (Courneya et al., 2008). Thus, the process of observation and evaluation necessitates the planning and implementation of a systematic approach in order to reduce bias and unreliability. A key aspect of this systematic approach is the observation instrument utilised.

Ko et al. (2013) report on a range of validated observation instruments which are in use in teaching observation schemes at primary and second level, for example the Assessment Profile for Early Childhood Programs (APECP), the Classroom Practices Inventory (CPI) and the UTeach Teacher Observation Protocol (UTOP). Guidance and best practice are available in terms of their development and implementation. In higher education, a number of instruments are identified in the literature (Beckman et al., 2003; Fry and Morris, 2004; Bell, 2005); they range from checklists to rating scales and open-ended narratives. Whilst Kohut et al. (2007) suggest a need for instruments to be flexible in order to accommodate various teaching styles, there is debate regarding the optimal process of choosing an instrument. Seldin (1999) recommends a combination of two instruments. Kohut et al. (2007) note that varied types of peer observation instruments are in use, with a preference for the narrative form, frequently combined with other forms such as video, checklists and self-analysis. Observing and reporting on teaching accurately relies on the use of reliable and validated instruments, standardised observation procedures, and peer observer training.

At the outset, it seems logical that if the purpose of POT is to make an evaluation of teaching quality, then there is a requirement to define what is meant by quality teaching (Gosling, 2002). In the absence of this, it is likely that assessments will be so subjective in nature that they may yield unreliable information. Indeed, as the process involves measurement, the methods employed need to demonstrate consistency (Green and South, 2006). It is unclear whether observation tools employed in the POT process have been subjected to rigorous testing for accuracy. Furthermore, this has not been assessed systematically (Yon et al., 2002).

2.0 Review topic/question(s), objectives and key words

The research question for this review is:

“What is the reliability and validity of tools used during peer observation of teaching of health professions in higher education?”

Specifically, the objectives of the review are to establish the validity and reliability of instruments used during peer observation in health professionals’ education, through the conduct of a systematic review and analysis of the existing literature. The following section will outline the specific elements of this question.


Population

o The population of interest for this review is adults involved in teaching within the health professions higher education sector.
o Higher education is defined as “education at a college or university where subjects are studied at an advanced level” (Cambridge Dictionaries Online, 2014).
o Teaching may be at undergraduate or postgraduate level, or continuing education, in a broad range of health professions education areas including, but not limited to, medicine, nursing, dentistry, physiotherapy, pharmacy and other disciplines.

Activity

o The activity of interest is peer observation of teaching.
o POT is defined as “a collaborative and reciprocal process whereby one peer observes another’s teaching (actual or virtual) and provides supportive and constructive feedback” (Lublin, 2002:5).
o POT is usually a threefold process and involves: (1) a pre-meeting, where both the observee and observer agree what is to be observed; (2) the actual observation of the agreed teaching session; and (3) provision of feedback by the observer to the observee (Newman et al., 2012).
o For the purpose of this review the instrument of interest is a structured POT tool using predetermined content outlining aspects of teaching to be observed.

Outcomes

o For the purpose of this review the outcomes of interest are:
o Reliability: stability, equivalence and internal consistency.
o Validity: content, criterion-related and construct validity.


Key Words

o Health professionals education
o Peer observation of teaching
o Observation tools
o Reliability; Validity
o Instrument(s)

3.0 Search Sources and Strategies

An English language limit will be applied. The scope of years of publication will be limited to a start date of 1960. As indicated in the Background section of the protocol, the concept of POT arose in the United States in the 1960s as a means by which educational organisations could evaluate teaching from the perspective of internal and external quality assurance (McMahon et al., 2007). A preliminary, limited search will be undertaken to define and identify the range of years to be included. This will serve as a baseline indicator of the range of years to be included in the fuller, more comprehensive search. The preliminary search will inform the following criteria: scope and range of the topic, and validity of research models.

3.1.1 Selected Databases & Search Engines

PubMed, CINAHL, ERIC, Web of Science, EMBASE and Google Scholar.

3.1.2 Strategies

Term searching will be undertaken in each database and search engine. Free-text searching using key words will be the initial step. Secondary searching will use Boolean logic to create compound term searches to ensure maximum discoverability.

The reference lists of the initial tranche of relevant articles and studies will be reviewed to cross-check that all relevant references have been captured.

The Mayo Teaching Evaluation Form (MTEF) (Beckman et al., 2003) has been identified as an established instrument in the area of POT and it will be included in the secondary search strategy.


Output from the implementation of this strategy will be reviewed to identify any other key words or search terms not identified in the original search.

3.1.3 Search terms

The identified search strategy was tested on a beta sample of four articles. From this test sample and an analysis of key concepts in the field of peer observation, the following MeSH terms and keywords were identified for use in the search implementation:

Primary MeSH (Medical Subject Headings)

Peer Review
Peer Group
Teaching
Educational Measurement
Instrument(s)

Secondary MeSH (Medical Subject Headings)

“Mayo Teaching Evaluation Form”
Mayo Teaching Form
MTEF

Tertiary MeSH (Medical Subject Headings)

If the scope of references retrieved from primary and secondary searching is very broad, then the application of tertiary MeSH terms related to the instruments, tools and validity of testing will be considered. MeSH terms for these attributes will be developed if required.

Keywords:

Peer AND observer(s) / observing / observation
Peer AND evaluator(s) / evaluating / evaluation
Peer AND assessor(s) / assessing / assessment
Peer AND review(s) / assessing / assessment
Teaching
Educational AND measurement
Edumetric
Edumetrics
Educational Measurement
Instrument(s)
Observing
Evaluating
Teach
Teaching
Validate
Validates

These will be combined to search PubMed and mapped onto similar headings and keywords in the other databases and search engines.
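To keep the compound term searching consistent across databases, the Boolean combinations described above can be generated programmatically. The following Python sketch is purely illustrative: the concept groups and truncated terms are assumptions based on the keyword list above, not the final syntax for any one database.

```python
# Illustrative sketch of compound Boolean term searching: terms within a concept
# group are joined with OR (to maximise recall), and the groups are joined with
# AND (so every concept must be present). The groups below are assumptions based
# on the keyword list above, not the final search syntax for any one database.

peer_terms = ["observer*", "observation", "evaluat*", "assess*", "review*"]
teaching_terms = ["teach*", '"educational measurement"', "edumetric*"]
instrument_terms = ["instrument*", "tool*", "checklist*", "scale*", "valid*", "reliab*"]


def or_group(terms):
    """Wrap a list of terms in parentheses joined by OR."""
    return "(" + " OR ".join(terms) + ")"


def build_query(*groups):
    """AND together one OR-group per concept."""
    return " AND ".join(or_group(group) for group in groups)


if __name__ == "__main__":
    print(build_query(["peer"], peer_terms, teaching_terms, instrument_terms))
```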

3.2 Scoping Review

The scoping review methodology has permitted us to categorise the literature pertaining to peer observation of teaching in health professions education. The resulting literature repository that our review has created can be of use to researchers and educators interested in the topic of peer observation of teaching in these areas. For the purposes of the BEME review, the scoping review was conducted to identify whether a range of literature relevant to this topic exists.

The following constructed search was used for the scoping review:

((((("Peer Review"[Mesh]) OR ((peer[Title/Abstract]) AND (((((((observe[Title/Abstract]) OR observes[Title/Abstract]) OR observed[Title/Abstract]) OR observing[Title/Abstract]) OR observation[Title/Abstract])) OR ((((evaluate[Title/Abstract]) OR evaluates[Title/Abstract]) OR evaluating[Title/Abstract]) OR evaluation[Title/Abstract]))))) AND (((("Teaching"[Mesh]) OR (((teach[Title/Abstract]) OR teaches[Title/Abstract]) OR teaching[Title/Abstract]))) OR ((edumetrics[Title/Abstract]) OR edumetric[Title/Abstract])))) AND ((((((((((((("Reproducibility of Results"[Mesh]) OR "Validation Studies as Topic"[Mesh]) OR "Questionnaires"[Mesh]) OR (((instrument[Title/Abstract]) OR instruments[Title/Abstract]) OR instrumentation[Title/Abstract])) OR (((test[Title/Abstract]) OR tests[Title/Abstract]) OR testing[Title/Abstract])) OR ((survey[Title/Abstract]) OR surveys[Title/Abstract])) OR ((questionnaire[Title/Abstract]) OR questionnaires[Title/Abstract])) OR ((tool[Title/Abstract]) OR tools[Title/Abstract])) OR ((scale[Title/Abstract]) OR scales[Title/Abstract])) OR ((checklist[Title/Abstract]) OR checklists[Title/Abstract])) OR (((((valid[Title/Abstract]) OR validity[Title/Abstract]) OR validation[Title/Abstract]) OR validate[Title/Abstract]) OR validates[Title/Abstract])) OR ((reliable[Title/Abstract]) OR reliability[Title/Abstract])) OR ((predictive value[Title/Abstract]) OR predictive values[Title/Abstract]))

This search resulted in 507 research papers that will be included in the review. The results of the scoping review indicate that there is sufficient material to proceed. A sample of 5% of the retrieved articles was reviewed against the coding sheet using the intended inclusion and exclusion criteria.
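If the scoping search needs to be re-run, for example when updating the review, the constructed PubMed string above can be submitted programmatically. The sketch below assumes the Biopython package is available and uses a shortened placeholder for the full search string; it simply reports the number of matching records.

```python
# Minimal sketch (assuming Biopython is installed) for re-running the scoping
# search against PubMed and reporting the record count. SCOPING_QUERY is a
# shortened placeholder; in practice the full constructed string above would
# be passed as the term argument.
from Bio import Entrez

Entrez.email = "[email protected]"  # placeholder; NCBI requires a contact address

SCOPING_QUERY = '("Peer Review"[Mesh] OR peer[Title/Abstract]) AND "Teaching"[Mesh]'


def scoping_count(query):
    """Return the number of PubMed records matching the query."""
    handle = Entrez.esearch(db="pubmed", term=query, retmax=0)
    record = Entrez.read(handle)
    handle.close()
    return int(record["Count"])


if __name__ == "__main__":
    print(f"Records retrieved: {scoping_count(SCOPING_QUERY)}")
```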

4.0 Study selection criteria

4.1 Inclusion Criteria

4.1.1 Type of Study

For the purposes of this review, studies using quantitative or qualitative research methodology will be included. We will include studies that describe the development, validation and use of instruments for peer observation of teaching.

4.1.2 Setting

The development and validation of peer observation instruments in a broad range of health professions education areas including, but not limited to, medicine, nursing, dentistry, physiotherapy, pharmacy and other disciplines.

The instruments which are the subject of the article may be developed for teaching faculty in any of the above disciplines, at undergraduate or postgraduate level, and in pre-clinical or clinical stages of teaching.

We will include studies conducted in any geographic location, but published in English.

4.1.3 Type of instrument

The instrument may be developed for formative or summative purposes, and may take the form of a checklist, rating scale or open-ended narrative.

4.1.4 Limits

Please refer to section 3.0, Search Sources and Strategies, where limits are identified and explained.

4.2 Exclusion criteria at title and abstract screening phase

Studies exploring POT in pre-primary, primary or secondary education.

Studies not published in English. We do not have adequate resources to translate articles in enough detail for them to be included in a systematic review.

Subject and context of the study in a non-health professions education field. Our focus is a systematic review in the context of health professions education. Whilst literature and research conducted in other disciplinary areas have informed our research rationale and approach, our review focus must incorporate instruments relevant to the specific contexts of health professions teaching, such as clinical teaching, bedside teaching, etc.

Studies that do not include empirical data. The paper must document the development and testing/validation of an instrument, not just an account of its use or of faculty perceptions of same. Literature reviews, commentaries and opinion pieces will also be excluded.

Studies that focus on other forms of teaching feedback or assessment, such as student review or managerial review. The focus must be peer observation of teaching.

5.0 Procedure for extracting data

Data will be extracted using a pre-designed data extraction template which has been adapted from the BEME Coding Sheet. As outlined in the BEME guideline, examples of review coding sheets can be found in each of the published BEME systematic reviews; these will be used for guidance. Thus, a bespoke coding sheet will be developed in accordance with systematic reviewing criteria related to the topic of peer observation of teaching.

The members of the review team will be divided into pairs. The retrieved articles will be divided among the pairs, and each pair will extract data from the assigned articles, each member working independently of the other member of their pairing and of the other pairs. A sample of 5 articles from each pair will be obtained and, using the kappa statistic (Viera and Garrett, 2005), we will determine the inter-rater reliability of the data extraction.
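As an illustration of the planned inter-rater reliability check, the following Python sketch computes Cohen's kappa (Viera and Garrett, 2005) for two reviewers' independent include/exclude decisions on a shared sample of articles; the decision lists are invented for demonstration only.

```python
# Illustrative calculation of Cohen's kappa for two reviewers' independent
# include/exclude decisions on the same sample of articles. The decisions
# below are invented purely for demonstration.
from collections import Counter


def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    chance = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (observed - chance) / (1 - chance)


if __name__ == "__main__":
    reviewer_1 = ["include", "exclude", "include", "include", "exclude"]
    reviewer_2 = ["include", "exclude", "exclude", "include", "exclude"]
    print(f"kappa = {cohens_kappa(reviewer_1, reviewer_2):.2f}")  # 0.62 for this sample
```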

5.1 Procedure for dealing with discrepancies

Where there are differing opinions as to the inclusion or exclusion of an article, or where both reviewers are unsure, the differences in data extraction will be resolved through discussion. If agreement is not achieved, data extraction for the relevant article will be independently undertaken by a third member of the team, and differences resolved through discussion.

5.2 Training of Reviewers

Training of reviewers will be administered by team members Prof Teresa Pawlikowska and Jane Burns, both of whom have experience with systematic reviews generally and BEME reviews specifically. All reviewers will be familiarised with the BEME support and training information available on the BEME.org website.


5.3 Quality Appraisal

Each included study will be quality appraised using the EBL critical appraisal checklist devised by Glynn (2006). This checklist appraises the study under the following domains:

Population
Data Collection
Study Design
Results

Each domain has a number of subcategories, and each is assessed using a yes (Y), no (N), unclear (U) or not applicable (N/A) rating. The calculation for each section’s quality is as follows: Y + N + U = T. If Y/T < 75%, or if (N + U)/T > 25%, then one can conclude that the section identifies significant omissions and that the quality of the study is questionable. It is important to look at the overall quality as well as the section quality; therefore, the calculation for the total validity is also Y + N + U = T. If Y/T ≥ 75%, or if (N + U)/T ≤ 25%, then one can safely conclude that the study is of sound quality. The critical appraisal tool provides a thorough, generic list of questions that one would ask when attempting to determine the validity, applicability and appropriateness of a study, whether quantitative or qualitative, since the tool allows the use of ‘not applicable’ for questions that are not relevant to the particular study under examination.
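A short sketch of the quality calculation described above follows; the section counts are hypothetical and serve only to show how the Y/T threshold from Glynn (2006) would be applied to an appraised study.

```python
# Sketch of the Glynn (2006) quality calculation described above:
# T = Y + N + U (N/A answers are excluded), and a section or study is
# considered of sound quality when Y/T >= 75%. The counts are hypothetical.
def quality(yes, no, unclear):
    """Return (Y/T, verdict) for a checklist section or for the whole study."""
    total = yes + no + unclear
    if total == 0:
        return None, "all items not applicable"
    ratio = yes / total
    return ratio, "sound" if ratio >= 0.75 else "questionable"


if __name__ == "__main__":
    sections = {
        "Population": (7, 1, 1),       # hypothetical Y, N, U counts
        "Data Collection": (6, 1, 1),
        "Study Design": (4, 1, 0),
        "Results": (5, 0, 1),
    }
    for name, (y, n, u) in sections.items():
        ratio, verdict = quality(y, n, u)
        print(f"{name}: {ratio:.0%} ({verdict})")
    total_y = sum(y for y, _, _ in sections.values())
    total_t = sum(y + n + u for y, n, u in sections.values())
    overall_ratio, overall_verdict = quality(total_y, total_t - total_y, 0)
    print(f"Overall: {overall_ratio:.0%} ({overall_verdict})")
```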

6.0 Synthesis of extracted evidence

Data synthesis will be undertaken quantitatively if appropriate; a narrative summary of the data will also be provided (Moore and Cowman, 2008).

Reliability is established as follows:

Stability means that the same results will be achieved with repeated testing using the tool; outcomes from stability testing are usually presented as agreement between measures using the correlation coefficient, with results ranging between -1 and +1.

Equivalence is measured by inter-rater reliability, where two people score an event independently using agreed scoring criteria; correlation coefficients between the scores are calculated, with results ranging between -1 and +1.

Internal consistency addresses the correlation of the various items within an instrument, i.e. whether all the subparts of the tool measure the same characteristic; in essence, this type of analysis examines how consistent the results are across different items for the same construct within the measure. Correlation coefficients between the scores for the questions on the instrument are calculated, with results ranging between -1 and +1.

Validity is established as follows:

Content validity is concerned with how well the content of the tool covers the subject area. Each item is examined for its relevance, often by asking experts in the field to examine the items. Correlation coefficients are calculated from the responses of the experts, with results ranging between -1 and +1.

Criterion-related validity is concerned with how well the tool compares with previous tools; both instruments are applied to the same observation and correlation coefficients between the two sets of scores are calculated, with results ranging between -1 and +1.

Construct validity is concerned with the fit between the construct under exploration and the conceptual and operational definitions of the variables within the instrument; this may be established through an iterative process, and results are presented as correlations of the measure being examined with variables that are known to be related to the construct.

We will synthesise the outcomes from the validity and reliability studies, individually for each instrument and also for each aspect of reliability and validity. Results will be presented using means and standard deviations.
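By way of illustration, the sketch below computes the kind of summary statistics described above for the equivalence (inter-rater) aspect of reliability: a Pearson correlation coefficient between two observers' scores on the same teaching sessions, together with the mean and standard deviation of each observer's ratings. The scores are invented for demonstration.

```python
# Illustrative synthesis step: Pearson correlation between two observers'
# scores on the same teaching sessions (equivalence / inter-rater reliability),
# together with means and standard deviations. The ratings are invented.
from statistics import mean, stdev


def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists of scores."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / ((len(x) - 1) * stdev(x) * stdev(y))


if __name__ == "__main__":
    observer_1 = [4.0, 3.5, 5.0, 2.5, 4.5, 3.0]  # hypothetical ratings of six sessions
    observer_2 = [4.5, 3.0, 5.0, 3.0, 4.0, 3.5]
    print(f"Inter-rater correlation: {pearson(observer_1, observer_2):.2f}")
    for label, scores in (("Observer 1", observer_1), ("Observer 2", observer_2)):
        print(f"{label}: mean {mean(scores):.2f}, SD {stdev(scores):.2f}")
```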

6.1 Clarification of Quality Assessment

Quality assessment will be ensured by the application of the EBL Critical Appraisal Checklist (below) and an adaptation of the Critical Appraisal Skills Programme.

6.2 The potential expected outcomes for education research and practice

The objective of this review is to establish the reliability and validity of tools used during peer observation of teaching. Should this review identify a tool or tools demonstrated to have good reliability and validity, the instrument(s) will be of value for further research in education. In addition, given that peer observation of teaching is widely used in practice, the outcomes of this review will have application for practice, in that use of a reliable and valid tool will enhance confidence in the peer observation process through the adoption and implementation of a more systematic approach to this important quality measure.


EBL Critical Appraisal Checklist (each item is answered Yes (Y), No (N), Unclear (U) or N/A)

Section A: Population
Is the study population representative of all users, actual and eligible, who might be included in the study?
Are inclusion and exclusion criteria definitively outlined?
Is the sample size large enough for sufficiently precise estimates?
Is the response rate large enough for sufficiently precise estimates?
Is the choice of population bias-free?
If a comparative study: Were participants randomized into groups? Were the groups comparable at baseline? If groups were not comparable at baseline, was incomparability addressed by the authors in the analysis?
Was informed consent obtained?

Section B: Data Collection
Are data collection methods clearly described?
If a face-to-face survey, were inter-observer and intra-observer bias reduced?
Is the data collection instrument validated?
If based on regularly collected statistics, are the statistics free from subjectivity?
Does the study measure the outcome at a time appropriate for capturing the intervention’s effect?
Is the instrument included in the publication?
Are questions posed clearly enough to be able to elicit precise answers?
Were those involved in data collection not involved in delivering a service to the target population?

Section C: Study Design
Is the study type / methodology utilized appropriate?
Is there face validity?
Is the research methodology clearly stated at a level of detail that would allow its replication?
Was ethics approval obtained?
Are the outcomes clearly stated and discussed in relation to the data collection?

Section D: Results
Are all the results clearly outlined?
Are confounding variables accounted for?
Do the conclusions accurately reflect the analysis?
Is subset analysis a minor, rather than a major, focus of the article?
Are suggestions provided for further areas to research?
Is there external validity?

Calculation for section validity: Y + N + U = T. If Y/T < 75%, or if (N + U)/T > 25%, then one can conclude that the section identifies significant omissions and that the study’s validity is questionable. It is important to look at the overall validity as well as the section validity.

Calculation for overall validity: Y + N + U = T. If Y/T ≥ 75%, or if (N + U)/T ≤ 25%, then one can safely conclude that the study is valid.

Section A validity calculation: 1/4 = 25% (not valid)
Section B validity calculation: 5/7 = 71% (not valid)
Section C validity calculation: 1/5 = 20% (not valid)
Section D validity calculation: 4/6 = 67% (not valid)
Overall validity calculation: 11/22 = 50% (not valid)

Glynn, L. 2006. A critical appraisal tool for library and information research. Library Hi Tech, 24(3), 387-399.


7.0 Project timetable

The project will run from October 2015 to November 2016 and will cover the following activities:

Search & retrieval
Raw materials
Divide articles
Work off abstracts
Agree potential articles
Test template for data extraction
Read papers & check suitability against template
Final batch of papers & versions of data extraction
Verify that everything matches up
Analysis
Results & write up
Implications for future research
Disseminate results


8.0 Conflict of interest statement

The authors have no conflict of interest to declare.

9.0 Plans for updating the review

The review will be updated within three years of the date of the review. The lead reviewer will take responsibility for liaising with the review team to plan the update of the review. We will follow the guidance of BEME (Best Evidence Medical Education Collaboration, 2014) and will adhere to the relevant option of the three potential directions for updating reviews:

A collection of significant papers on the review topic published since completion of the review.

A short supplement to the original review highlighting key developments in the topic area.

A second edition of the review, involving a major reworking and rewrite of the original and further peer review.

10.0 Dissemination

A number of steps for dissemination of the outcomes of this review will be taken. These are:

Journal publication, for example:
o Assessment & Evaluation in Higher Education
o Medical Education
o Medical Teacher
o Nurse Education Today

Conference presentations, both oral and poster:
o ASME: http://www.asme.org.uk/
o AMEE: http://www.amee.org/home
o INMED: http://www.inmed.ie/
o ENQA: http://www.enqa.eu/
o INQAAHE: http://www.inqaahe.org/index.php
o International Conference on Faculty Development in the Health Professions
o International Education Forum, RCSI
o Faculty of Nursing & Midwifery International Education and Research Conference, RCSI

Integration into local, national and international guidelines for peer observation of teaching.


11.0 References

Anthony, D. 1999. Validity and reliability. In: Understanding Advanced Statistics. London: Churchill Livingstone.

Australian Health Practitioner Regulation Agency. 2013. Statistics [Online]. Canberra, Australia: Australian Health Practitioner Regulation Agency. Available: http://www.ahpra.gov.au/Registration/Graduate-Applications.aspx [Accessed 27th March 2014].

Beckman, T. J., Lee, M. C., Rohren, C. H. & Pankratz, V. S. 2003. Evaluating an instrument for the peer review of inpatient teaching. Medical Teacher, 25, 131-135.

Bell, A. & Mladenovic, R. 2008. The benefits of peer observation of teaching for tutor development. Higher Education, 55, 735-752.

Berk, R., Naumann, P. & Appling, S. 2004. Beyond student ratings: peer observation of classroom and clinical teaching. International Journal of Nursing Education Scholarship, 1, 1-26.

Best Evidence Medical Education Collaboration (BEME). 2014. Updating the review [Online]. Available: http://bemecollaboration.org/Step+9+Update+Review/ [Accessed 26th January 2014].

Cabrera, A. F., Colbeck, C. L. & Terenzini, P. T. 2001. Developing performance indicators for assessing classroom teaching practices and student learning: the case of engineering. Research in Higher Education, 42, 327-352.

Cambridge Dictionaries Online. 2014. Definition of higher education [Online]. Cambridge: Cambridge University Press. Available: http://dictionary.cambridge.org/dictionary/british/higher-education?q=Higher+education+ [Accessed 12th April 2014].

Courneya, C. A., Pratt, D. & Collins, J. 2008. Through what perspective do we judge the teaching of peers? Teaching and Teacher Education, 24, 69-79.

Davis, S. 2011. Peer observation: a faculty initiative. Currents in Pharmacy Teaching and Learning, 3, 106-115.

Department of Education and Skills. 2011. National Strategy for Higher Education to 2030 [Online]. Dublin: Government Publications Office. Available: http://www.hea.ie/en/policy/national-strategy [Accessed 12th April 2014].

Dornan, T., Littlewood, S., Margolis, S. A., Scherpbier, A., Spencer, J. & Ypinazar, V. 2006. How can experience in clinical and community settings contribute to early medical education? A BEME systematic review. Medical Teacher, 28, 3-18.

Finn, K., Chiappa, V., Puig, A. & Hunt, D. P. 2011. How to become a better clinical teacher: a collaborative peer review process. Medical Teacher, 33, 151-155.

Fry, H. & Morris, C. 2004. Peer observation of clinical teaching. Medical Education, 38, 560-561.

General Medical Council. 2009. Tomorrow’s Doctors [Online]. London: General Medical Council. Available: http://www.gmc-uk.org/education/undergraduate/tomorrows_doctors_2009.asp [Accessed 12th April 2014].

Glynn, L. 2006. A critical appraisal tool for library and information research. Library Hi Tech, 24, 387-399.

Gosling, D. 2002. Models of peer observation of teaching. Paper for the LTSN Generic Centre [Online]. Available: http://www.ltsn.ac.uk/genericcentre/ [Accessed 26th January 2014].

Green, J. & South, J. 2006. Evaluation: Concepts and Approaches. Berkshire, UK: Open University Press.

Hammersley-Fletcher, L. & Orsmond, P. 2004. Evaluating our peers: is peer observation a meaningful process? Studies in Higher Education, 29, 489-503.

Hammersley-Fletcher, L. & Orsmond, P. 2005. Reflecting on reflective practices within peer observation. Studies in Higher Education, 30, 213-224.

Hénard, F. 2010. Learning Our Lesson: Review of Quality Teaching in Higher Education [Online]. Paris, France: OECD Publishing. Available: http://www.oktemvardar.com/docs/OECD_Learning_our_Lesson_2010.pdf [Accessed 12th April 2014].

Hendry, G. D. & Oliver, G. R. 2012. Seeing is believing: the benefits of peer observation. Journal of University Teaching & Learning Practice, 9, 1-9.

Iqbal, I. 2013. Academics' resistance to summative peer review of teaching: questionable rewards and the importance of student evaluations. Teaching in Higher Education, 18, 557-569.

Ko, J., Sammons, P. & Bakkum, L. 2013. Effective Teaching: A Review of Research and Evidence. CfBT Education Trust. Available: http://cdn.cfbt.com/~/media/cfbtcorporate/files/research/2013/reffective-teaching-2013.pdf [Accessed 8th May 2015].

Kohut, G. F., Burnap, C. & Yon, M. G. 2007. Peer observation of teaching. College Teaching, 55, 19-25.

Lublin, J. 2002. A Guide to Peer Review of Teaching. Available: www.utas.edu.au/tl/improving/peerreview/ [Accessed 26th January 2014].

McCarrick, E. 2011. Meeting Professional Competence Requirements: a proposal to the Irish College of General Practitioners on behalf of clinical teachers who do not see patients. Available: http://www.icgp.ie/go/pcs [Accessed 26th January 2014].

McMahon, T., Barrett, T. & O'Neill, G. 2007. Using observation of teaching to improve quality: finding your way through the muddle of competing conceptions, confusion of practice and mutually exclusive intentions. Teaching in Higher Education, 12, 499-511.

Moore, Z. & Cowman, S. 2008. The Cochrane Collaboration, systematic reviews and meta-analysis. In: Watson, R., McKenna, H., Cowman, S. & Keady, J. (eds.) Nursing Research: Designs and Methods. London: Churchill Livingstone.

Murphy Tighe, A. & Bradshaw, C. 2012. Peer-supported review of teaching: making the grade in midwifery and nursing education. Nurse Education Today, 33, 1347-1351.

Newman, L., Roberts, D. & Schwartzstein, R. M. 2012. Peer Observation of Teaching Handbook. MedEdPORTAL [Online]. Available: www.mededportal.org/publication/9150 [Accessed 26th January 2014].

Pattison, A., Sherwood, M., Lumsden, C., Gale, A. & Markides, M. 2012. Foundation observation of teaching project: a developmental model of peer observation of teaching. Medical Teacher, 34, 136-142.

Seldin, P. & Associates. 1999. Changing Practices in Evaluating Teaching: A Practical Guide to Improved Faculty Performance and Promotion/Tenure Decisions. Bolton, MA: Anker Publishing.

Siddiqui, Z. S., Jonas-Dwyer, D. & Carr, S. E. 2007. Twelve tips for peer observation of teaching. Medical Teacher, 29, 297-300.

Strong, M., Gargani, J. & Hacifazlioglu, O. 2011. Do we know a successful teacher when we see one? Experiments in the identification of effective teachers. Journal of Teacher Education, 62, 367-382.

Sullivan, P. B., Buckle, A., Nicky, G. & Atkinson, S. H. 2012. Peer observation of teaching as a faculty development tool. BMC Medical Education, 12, 26-32.

Trujillo, J. M., Divall, M. V., Barr, J., Gonyeau, M., Van Amburgh, J. A., Matthews, J. & Qualters, D. 2008. Development of a peer teaching-assessment program and a peer observation and evaluation tool. American Journal of Pharmaceutical Education, 72, 1-9.

Viera, A. J. & Garrett, J. M. 2005. Understanding interobserver agreement: the kappa statistic. Family Medicine, 37, 360-363.

Yon, M., Burnap, C. & Kohut, G. 2002. Evidence of effective teaching: perceptions of peer reviewers. College Teaching, 50, 104-110.