Assessing Knowledge and Performance

John Littlefield, University of Texas Health Science Center at San Antonio

TRANSCRIPT

Page 1: Assessing Knowledge  and Performance

Assessing Knowledge and Performance

John Littlefield

University of Texas Health Science Center

at San Antonio

Page 2: Assessing Knowledge  and Performance

Recall a medical student or resident whose performance made you uneasy.

1. What behavior or event made you uneasy?

2. What action did you take?

a. Talk with faculty colleagues about your concerns

b. Write a candid performance appraisal and send it to the course/residency director

3. If you wrote a candid appraisal, did an administrative action occur related to the student/ resident?

Page 3: Assessing Knowledge  and Performance

Goals: Assessment of Knowledge and Performance

1. Clarify two distinct uses for assessments of clinical knowledge and performance

2. Define two aspects of validity for all assessment methods

3. Compare and contrast 6 techniques for assessing clinical knowledge and performance

4. Identify poorly written multiple choice test items

5. Write a key features test item

6. Describe a role for narrative comments in scoring interactions with Standardized Patients

7. Describe three elements of a clinical performance assessment system

8. Critique a clinical performance assessment system that you use

Page 4: Assessing Knowledge  and Performance

Agenda: Assessment of Knowledge and Performance

1. Exercise: Recall a medical student or resident whose performance made you uneasy

2. Presentation: Quality assurance when assessing clinical knowledge and performance

3. Exercise: Take then critique a multiple choice test

4. Presentation: Key features test items

5. Exercise: Write a key features test item

6. Exercise: Critique a videotaped student-SP interaction

7. Presentation: Widening the lens on SP assessment

8. Exercise: Recommend program director actions based on faculty comments about a resident

9. Presentation: Improving clinical performance assessment systems

10. Exercise: Critique your clinical performance assessment system

Page 5: Assessing Knowledge  and Performance

Uses for Assessment: Formative vs. Summative

                            Formative                              Summative
Purpose                     Feedback for learning                  Certification/grading
Breadth of scope            Narrow focus on specific objectives    Broad focus on general goals
Scoring                     Explicit feedback                      Overall performance
Learner affective response  Little anxiety                         Moderate to high anxiety
Target audience             Learner                                Society

Page 6: Assessing Knowledge  and Performance

Validity of Knowledge and Performance Assessments

Content – Does the assessment method measure a representative cross-section of student competencies?

Structural – Degree to which content and scoring represent competence in each subsection of the performance domain

External – Do scores from this assessment method correlate highly with scores from other measures of the same student competencies?

Consequential – Do various subgroups of students (e.g., different ethnic groups) score equally well on the assessment?

Generalizability
– Does the student perform at about the same level across 5 to 7 different patients/case problems?
– Does the student receive a similar rating from different faculty?

Substantive – Does the context surrounding the assessment evoke the domain of cognitive processes used by a physician?

Page 7: Assessing Knowledge  and Performance

Six Aspects of Assessment Validity Viewed as a Cube

[Figure: cube whose six faces are labeled Generalizability, Content, Substantive, Consequential, External, and Structural]

Page 8: Assessing Knowledge  and Performance

Generalizability of Physician Performance on Multiple Patients

[Figure: chart plotting Ideal vs. Actual performance scores (0–120 scale) for the same physician across Patients 1 through 5]

Page 9: Assessing Knowledge  and Performance

Validity of Knowledge and Performance Assessments

Content - Does the assessment method measure a representative cross-section of student competencies?

External - Do scores from this assessment method correlate highly with scores from other measures of the same student competencies?

Consequential - Do various subgroups of students (e.g., different ethnic groups) score equally well on the assessment?

Generalizability
– Does the student perform at about the same level across 5 to 7 different patients/case problems?
– Does the student receive a similar rating from different faculty?

Substantive – Does the context surrounding the assessment evoke the domain of cognitive processes used by a physician?

Page 10: Assessing Knowledge  and Performance

Substantive Aspect of Validity: Four Levels of Performance Assessment 1

Knows

(Examination – Multiple-choice)

Knows How

(Examination – Oral)

Shows How

( OSCE)

Does

(Global Rating)

1. Miller, GE. Assessment of clinical skills/competence/performance, Academic Medicine, 65(9), supplement, S63-7, 1990

Page 11: Assessing Knowledge  and Performance

Comparisons among Six Techniques for Assessment of Clinical Knowledge and Performance

                              Multiple  Modified  Structured  Multimedia  Standardized  On-the-
                              Choice    Essay     Oral Exam   Simulation  Patients      Job PA
Knowledge                     +++       ++        +           +           +             +
Interviewing/interpersonal    –         –         +           +           ++            +
Data gathering/history        +         ++        ++          ++          ++            +
Physical exam (technique)     –         –         –           +           +++           +
Reasoning/diagnosis           +         ++        ++          ++          ++            +
Lab utilization/management    +         ++        ++          ++          ++            +
Personal qualities            –         –         –           –           –             ++

+ = adequate   ++ = good   +++ = excellent   – = not applicable

Newble D. (1992). Assessing clinical competence at the undergraduate level. Medical Education, 26, 504-511.
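For readers who want to query this comparison programmatically, the table can be encoded as a small lookup structure. The sketch below is illustrative only: the 0–3 numeric scale and the `best_technique` helper are my own mapping of the –/+/++/+++ ratings, not part of Newble's paper.

```python
# Encode the comparison table: 0 = not applicable, 1 = adequate,
# 2 = good, 3 = excellent (an assumed numeric mapping of -, +, ++, +++).
TECHNIQUES = ["Multiple Choice", "Modified Essay", "Structured Oral Exam",
              "Multimedia Simulation", "Standardized Patients", "On-the-Job PA"]

RATINGS = {
    "Knowledge":                  [3, 2, 1, 1, 1, 1],
    "Interviewing/Interpersonal": [0, 0, 1, 1, 2, 1],
    "Data gathering/history":     [1, 2, 2, 2, 2, 1],
    "Physical exam (technique)":  [0, 0, 0, 1, 3, 1],
    "Reasoning/diagnosis":        [1, 2, 2, 2, 2, 1],
    "Lab utilization/management": [1, 2, 2, 2, 2, 1],
    "Personal qualities":         [0, 0, 0, 0, 0, 2],
}

def best_technique(competency):
    """Return the technique rated highest for a given competency."""
    scores = RATINGS[competency]
    return TECHNIQUES[max(range(len(scores)), key=scores.__getitem__)]
```

For example, `best_technique("Physical exam (technique)")` picks out Standardized Patients, consistent with the table.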

Page 12: Assessing Knowledge  and Performance

Interim Summary of Session

Session thus far

– Two uses of knowledge and performance assessments: Formative and Summative

– Validity of all assessment techniques

– Comparisons among 6 assessment techniques

Coming up

– Take and critique a 14 item multiple choice exam

– Presentation on Key Features items

Page 13: Assessing Knowledge  and Performance

Knowing and Realizing

The mere knowledge of a fact is pale; but when you come to realize your fact, it takes on color. It is all the difference between hearing of a man being stabbed in the heart, and seeing it done.

Mark Twain, A Connecticut Yankee in King Arthur’s Court

Page 14: Assessing Knowledge  and Performance

How are Multiple Choice Items Selected for an Exam?

Page 15: Assessing Knowledge  and Performance

Sample Exam Blueprint based on Clinical Problems

Patient Life Span   % Total Patients   % Men/Women   Number of Problems
Pregnancy/Infant    5                  50/50         4
Pediatrics          16                 53/47         6
Adolescence         16                 31/69         6
Adults              47                 34/66         6
Geriatrics          16                 45/46         6
Total               100                39/61         40

Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills, Acad. Med. 70(3), 1995.

Page 16: Assessing Knowledge  and Performance

Key Features of a Clinical Problem

Definition: Critical steps that must be taken to identify and manage a patient’s problem

– focuses on a step in which examinees are likely to make an error

– is a difficult aspect of identifying and managing the problem

Example: For a pregnant woman experiencing third-trimester bleeding with no abdominal pain, the physician should:

– generate placenta previa as the leading diagnosis

– avoid performing a pelvic examination (may cause bleeding)

– avoid discharging her from the clinic or emergency room

– order coagulation tests and a cross-match

Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills, Acad. Med. 70(3), 1995.

Page 17: Assessing Knowledge  and Performance

Test Items based on a Clinical Problem and its Key Features

Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills, Acad. Med. 70(3), 1995.

Jennifer, a 24-year-old woman, G3, P2, 31 weeks pregnant, comes to the emergency room at 8:00 PM complaining of bright red vaginal bleeding for the past two hours. The three sanitary napkins that she used were completely soaked. Her pregnancy has been uneventful, as were the previous ones. She has not had any contractions or abdominal pain. The fetus is moving as usual. Her BP is 110/70 mm Hg, and her pulse is 92/min. The examination of the abdomen reveals a uterine height of 31 cm with a soft and nontender uterus. The fetus is in a breech position and has a heart rate of 150/min. No active bleeding has occurred since she arrived 25 minutes ago.

1. What is your leading diagnosis at this time? List only one diagnosis. Write “normal” if you judge Jennifer’s situation to be within normal limits.

2. What steps would you include in your immediate assessment and management of this patient? List as many as are appropriate.

Page 18: Assessing Knowledge  and Performance

Scoring the Placenta Previa Clinical Problem

Key Feature 1: To receive one point, the examinee must list placenta previa or one of the following synonyms: marginal placenta or low placental insertion

Key Features 2-4: Receive 1/3 point for listing each of the following:
1. Avoid performing a pelvic exam
2. Avoid discharging from clinic
3. Order coagulation tests and cross-match

Total Score for Problem: Add scores for items 1 and 2 and divide by 2 (range: 0-1)
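The scoring rubric above is mechanical enough to express in code. The sketch below assumes examinee answers have already been normalized to fixed strings (real key-features exams score free-text write-ins, which needs human judgment or more elaborate matching); the function name and the normalized labels are my own.

```python
def score_placenta_previa(diagnosis, management_steps):
    """Score this key-features problem per the rubric above (range 0-1)."""
    # Item 1: one point for the correct leading diagnosis (key feature 1)
    synonyms = {"placenta previa", "marginal placenta", "low placental insertion"}
    item1 = 1.0 if diagnosis.strip().lower() in synonyms else 0.0

    # Item 2: 1/3 point for each of key features 2-4 the examinee lists
    # (labels below are assumed pre-normalized forms of the three actions)
    key_actions = {"avoid pelvic exam", "avoid discharge",
                   "order coagulation tests and cross-match"}
    listed = {step.strip().lower() for step in management_steps}
    item2 = len(key_actions & listed) / 3

    # Total: average the two items
    return (item1 + item2) / 2
```

A fully correct answer sheet scores 1.0; a wrong diagnosis with two of the three management steps scores (0 + 2/3) / 2 = 1/3.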

Page 19: Assessing Knowledge  and Performance

Steps to Develop a Clinical Problem Based Exam

Define the domain of clinical problems to be sampled by the exam

Develop an exam blueprint to guide selection of clinical problems

Develop a key-feature problem for each clinical problem selected

– define clinical situation for the problem (e.g. single typical problem, life-threatening situation etc.)

– define key features of the problem

– select a clinical case to represent the problem and write scenario

– write exam items for case; in general one item for each key feature

– select suitable format for each item (e.g., write-in or mcq)

– develop scoring key for each item

– pilot test items for item analysis data to guide refinement

Page 20: Assessing Knowledge  and Performance

Interim Summary of Session

Session thus far

– Two uses of knowledge and performance assessments: Formative and Summative

– Validity of all assessment techniques

– Comparisons among 6 assessment techniques

– Take and critique a 14 item multiple choice exam

– Write a Key Features item

Coming up

– View and critique a videotaped student-patient interaction as part of an OSCE

Page 21: Assessing Knowledge  and Performance

Schematic Diagram of a 9 Station OSCE

[Figure: nine numbered stations arranged in a loop from Start to End; examinees rotate through stations 1-9 in sequence]
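The rotation implied by the schematic can be sketched as a simple schedule: with as many examinees as stations, advancing every examinee one station per time slot puts exactly one examinee at each station in every slot (a Latin-square rotation). This sketch is illustrative and not part of the original deck.

```python
def osce_schedule(n_stations=9):
    """schedule[slot][examinee] = station (1-based) occupied in that time slot.

    Examinee e starts at station e+1 and advances one station per slot,
    wrapping around, so no two examinees ever share a station.
    """
    return [[(examinee + slot) % n_stations + 1
             for examinee in range(n_stations)]
            for slot in range(n_stations)]
```

In every time slot all nine stations are in use, and over the nine slots each examinee visits all nine stations exactly once.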

Page 22: Assessing Knowledge  and Performance

Widening the Lens on SP Assessment 1

Traditional scoring of SP assessment focuses on numerical data typically from checklists

Dimensions of the SP exam
– basic science knowledge (organize the information)
– physical exam skills (memory of routines)
– establishing a human connection
– role of the student (appear knowledgeable)
– existential dimension of the human encounter (balance one’s own beliefs with the patient’s)

Clinical competence – a mixture of knowledge and feeling, information processing and intuition

1. Rose, M. & Wilkerson, L. Widening the Lens on Standardized Patient Assessment: What the Encounter Can Reveal about the Development of Clinical Competence, Acad. Med. 76(8), 2001.

Page 23: Assessing Knowledge  and Performance

Interim Summary of Session

Session thus far
– Two uses of knowledge and performance assessments: Formative and Summative
– Validity of all assessment techniques
– Comparisons among 6 assessment techniques
– Take and critique a 14-item multiple choice exam
– Write a Key Features item
– View and critique a videotaped student-patient interaction
– Presentation: Widening the lens on SP assessment

Coming up

– Improving clinical performance assessment systems

Page 24: Assessing Knowledge  and Performance

Dr. Tough’s Memo regarding Dr. Will E. Makit (PGY 2)

“The performance of Dr. Makit in General Surgery this month has been completely unsatisfactory. Every member of the clinical faculty who has had any contact with him tells me of his gross incompetence and irresponsibility in clinical situations. This person is an embarrassment to our school and our training program. I spoke to him about his performance after I talked with you several weeks ago and he told me that he would improve it. There was no evidence that he made any effort to improve. There is no way this can be considered a completed or satisfactory rotation in General Surgery. In fact, he is the most unsatisfactory resident who has rotated through our service in the last five years, and his behavior is an appalling example to the rest of our housestaff.” *************************************************************

Your Action?
1. Refer the problem to the Resident Education Committee for an administrative decision.
2. Assign Dr. Makit to a rotation with Dr. Insightful as the Attending Faculty.

Page 25: Assessing Knowledge  and Performance

Dr. Insightful’s Phone Comments regarding Dr. Makit

“Judging from this week, Will has a hesitant style of speech that sometimes conveys a sense of confusion. He needs to carefully outline what he wants to say before speaking. He knows what he is doing, but sometimes does not get credit for it. Rounds were well-organized. He allowed students full rein to discuss their cases before he contributed.”

Your Action?
1. Request that Dr. Insightful continue observing Dr. Makit.
2. Request that Dr. Insightful coach Dr. Makit on making patient presentations.
3. Request that Dr. Insightful write an appraisal of Dr. Makit’s performance now.

Page 26: Assessing Knowledge  and Performance

Resident Performance Assessment System

Organizational Infrastructure

Page 27: Assessing Knowledge  and Performance

A. Department’s Organizational Infrastructure

Department head emphasizes completing and returning PA forms

Consequences for evaluators who don’t complete PA forms

PA form is brief (< 10 competencies)

Don’t request pass/fail judgment by individual faculty

Evaluators trained to use PA form & criteria

Evaluators believe they will be supported when writing honest appraisals

Specific staff assigned to monitor compliance in returning forms

Program Director alerted immediately when a returned form reflects cautionary info

Page 28: Assessing Knowledge  and Performance

Evaluator Role in a Performance Assessment System

Page 29: Assessing Knowledge  and Performance

B. Evaluator Role: Communicate Expectations and Observe Performance

Communicate Expectations
– Consensus among evaluators about service and education expectations
– Residents are crystal clear about service and education expectations

Observe Performance
– Evaluators observe resident multiple times before completing PA form
– Appraise only performance directly observed
– Other staff (e.g., nurses) complete PA forms

Page 30: Assessing Knowledge  and Performance

B. Evaluator’s Role: Interpret and Judge Performance

Evaluators agree on what behaviors constitute outstanding, ‘average’, and marginal performance

When facing a marginal resident, evaluators record rationale for their judgment and info surrounding the event

Evaluators record their interpretation of the performance soon after behavior occurs


Page 31: Assessing Knowledge  and Performance

B. Evaluator’s Role: Coach Resident

Evaluators aware of difference between corrective feedback, criticism and compliments

Faculty actively coach residents in timely manner

Residents are encouraged to ask for feedback

Evaluators regularly invite self-assessment from residents before giving feedback


Page 32: Assessing Knowledge  and Performance

B. Evaluator’s Role: Communicate Performance Information and Complete PA Form

Communicate Performance Info
– Communicate incidents that are significantly negative or positive
– Document in writing even single instances of poor or inappropriate behavior

Complete PA Form
– Evaluators write specific narrative comments on PA forms
– Evaluators forward completed PA forms to Director in a timely way

Page 33: Assessing Knowledge  and Performance

Program Director Role in a Clinical Performance Assessment System

Page 34: Assessing Knowledge  and Performance

C. Program Director’s Role: Monitor and Interpret Appraisals

Recognize evaluator rating patterns (stringent vs. lenient) to accurately interpret PA

Contact evaluators to elicit narrative information when it is absent but needed to substantiate a marginal PA

Store PA forms in residents’ files in a timely manner

Summarize PA data to facilitate decision making by Resident Education Committee

Keep longitudinal records of PA data to develop norms for the PA form

Page 35: Assessing Knowledge  and Performance

C. Program Director’s Role: Committee Decision

PA decisions are a collaborative process involving multiple faculty

Seven or more PA forms per resident are available when admin decisions are made

Sufficient written narrative documentation is available when admin decisions are made

Page 36: Assessing Knowledge  and Performance

C. Program Director’s Role: Formally Inform Resident

Residents are given a summary of their performance every six months

Evaluators have written guidelines outlining what must legally be in a probation letter

Evaluators know what documentation is needed to ensure adequate due process

Each resident receives an end of program performance evaluation


Page 37: Assessing Knowledge  and Performance

Formative Evaluation: Diagnostic Checklist for Resident Performance Assessment System

Identify Problems

Recommend Improvements

A. Organizational Infrastructure

B. Evaluator’s Role
1. Communicate Expectations
2. Observe Performance
3. Interpret and Judge Performance
4. Communicate Performance Info
5. Coach Resident
6. Complete PA Form

C. Program Director’s Role
1. Monitor & Interpret Appraisals
2. Committee Decision
3. Formally Inform Resident

Page 38: Assessing Knowledge  and Performance

Research: Improving Resident Performance Appraisals 1

Organizational Infrastructure
– Discussed PA problems at department meetings
– Appointed a task force to review PA problems and propose solutions
– Revised old appraisal form
– Pilot-tested and adapted the new appraisal form

Evaluator Role
– Provided evaluators with examples of behaviorally-specific comments

Results
– Increased # of forms returned, # of forms with behaviorally-specific comments, and # of administrative actions by program

1. Littlefield, J. and Terrell, C. Improving the quality of resident performance appraisals, Academic Medicine, 1997: 72(10) Supplement, S45-47.

Page 39: Assessing Knowledge  and Performance

Research: Improving Written Comments by Faculty Attendings 1

Organizational Infrastructure
– Conducted a 20-minute educational session on evaluation and feedback
– Provided a 3-by-5 reminder card and diary

Results
– Increased written comments specific to defined dimensions of competence
– Residents rated quantity of feedback higher and were more likely to make changes in clinical management of patients

1. Holmboe, E., et al. Effectiveness of a focused educational intervention on resident evaluations from faculty. J Gen Intern Med. 2001;16:427-34.

Page 40: Assessing Knowledge  and Performance

Research: Summative Evaluation of a PA System1,2

Research Question                                   Pre-intervention              Post-intervention
1. % rotations returning PA forms?                  mean: 73% (range: 31-100%)    mean: 97% (range: 67-100%)
2. % PA forms communicating performance problems?   3.6% (n = 17/479)             5.9% (n = 64/1085)
3. Probability program will take admin action?      .50 (5/10)                    .47 (8/17)
4. Reproducibility of numerical ratings?            Gen. coef. = .59 (10 forms)   Gen. coef. = .80 (10 forms)

1. Littlefield, J.H., Paukert, J., Schoolfield, J. Quality assurance data for resident global performance ratings, Academic Med., 76(10), supp., S102-04, 2001.

2. Paukert, J., et al. Improving quality assurance data for resident subjective performance assessment, manuscript in preparation.
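The generalizability coefficients in row 4 come from variance-components analysis. As a rough illustration, a relative G coefficient for a fully crossed persons-by-raters design can be estimated as below; this is a generic one-facet sketch (the function and its name are mine), not the exact computation used in the cited studies.

```python
import numpy as np

def g_coefficient(ratings):
    """Relative G coefficient for an n_persons x n_raters matrix
    (one-facet, fully crossed persons-by-raters design)."""
    n, k = ratings.shape
    grand = ratings.mean()
    # Sums of squares for persons, raters, and residual (two-way ANOVA)
    ss_p = k * ((ratings.mean(axis=1) - grand) ** 2).sum()
    ss_r = n * ((ratings.mean(axis=0) - grand) ** 2).sum()
    ss_res = ((ratings - grand) ** 2).sum() - ss_p - ss_r
    ms_p = ss_p / (n - 1)
    ms_res = ss_res / ((n - 1) * (k - 1))
    # Estimated person (true-score) variance; clipped at zero
    var_p = max((ms_p - ms_res) / k, 0.0)
    # Person variance relative to person variance plus rater error
    return var_p / (var_p + ms_res / k)
```

Raters who agree perfectly yield G = 1; noisier ratings shrink G toward 0, which is the pattern reported above (.59 before the intervention, .80 after).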

Page 41: Assessing Knowledge  and Performance

Recall a medical student or resident whose performance made you uneasy.

1. What behavior or event made you uneasy?

2. What action did you take?

a. Talk with faculty colleagues about your concerns

b. Write a candid performance appraisal and send it to the course/residency director

3. If you wrote a candid appraisal, did an administrative action occur related to the student/ resident?

Page 42: Assessing Knowledge  and Performance

Goals: Assessment of Knowledge & Performance

1. Clarify two distinct uses for assessments of clinical knowledge and performance

2. Define two aspects of validity for all assessment methods

3. Compare and contrast 6 techniques for assessing clinical knowledge and performance

4. Identify poorly written multiple choice test items

5. Write a key features test item

6. Describe a role for narrative comments in scoring interactions with Standardized Patients

7. Describe three elements of a clinical performance assessment system

8. Critique a clinical performance assessment system that you use