
Where are we with assessment and where are we going?

Cees van der Vleuten, University of Maastricht

This presentation can be found at: www.fdg.unimaas.nl/educ/cees/amee

Overview of presentation

- Where is education going?
- Where are we with assessment?
- Where are we going with assessment?
- Conclusions

Where is education going?

School-based learning:
- Discipline-based curricula
- (Systems-)integrated curricula
- Problem-based curricula
- Outcome/competency-based curricula

Where is education going?

Underlying educational principles:
- Continuous learning of, or practicing with, authentic tasks (in steps of complexity, with constant attention to transfer)
- Integration of cognitive, behavioural and affective skills
- Active, self-directed learning, in collaboration with others
- Fostering domain-independent skills and competencies (e.g. teamwork, communication, presentation, science orientation, leadership, professional behaviour…)


These principles are grounded in cognitive psychology, constructivism, cognitive load theory, collaborative learning theory, and empirical evidence.

Where is education going?

Work-based learning: practice, practice, practice…
Optimising learning by:
- More reflective practice
- More structure in the haphazard learning process
- More feedback, monitoring, guiding, reflection, role modelling
- Fostering of a learning culture or climate
- Fostering of domain-independent skills (professional behaviour, team skills, etc.)


These approaches are grounded in deliberate practice, emerging work-based learning theories, and empirical evidence.

Where is education going?

Educational reform is on the agenda everywhere

Education is professionalizing rapidly

A lot of ‘educational technology’ is available

How about assessment?

Overview of presentation

- Where is education going?
- Where are we with assessment?
- Where are we going with assessment?
- Conclusions

Expanding our toolbox…
[Miller's pyramid (knows, knows how, shows how, does), with 'knows' and 'knows how' highlighted]
Established technology of efficient written or computer-based high-fidelity simulations (MCQ, key feature, script concordance test, MEQs…)

Expanding our toolbox…
[Miller's pyramid, with 'knows how' and 'shows how' highlighted]
Established technology of structured high-fidelity in vitro simulations requiring behavioural performance (OSCE, SP-based testing, OSPE…)

Expanding our toolbox…
[Miller's pyramid, with 'shows how' and 'does' highlighted]
Emerging technology of appraising in vivo performance (work-based assessment: clinical work sampling, mini-CEX, portfolio, practice visits, case orals…)

Expanding our toolbox…
[Miller's pyramid, contrasting 'domain-specific' skills with 'domain-independent' skills]
Emerging technology of appraising in vivo performance (self-, peer and co-assessment, portfolio, multisource feedback, learning process evaluations…)

What have we learned?

Competence is specific, not generic

Reliability as a function of testing time

Method                        Testing time in hours
                              1       2       4       8
MCQ¹                          0.62    0.76    0.93    0.93
Case-based short essay²       0.68    0.73    0.84    0.82
PMP¹                          0.36    0.53    0.69    0.82
Oral exam³                    0.50    0.69    0.82    0.90
Long case⁴                    0.60    0.75    0.86    0.90
OSCE⁵                         0.47    0.64    0.78    0.88
Practice video assessment⁷    0.62    0.76    0.93    0.93
Incognito SPs⁸                0.61    0.76    0.92    0.93
Mini-CEX⁶                     0.73    0.84    0.92    0.96

¹Norcini et al., 1985; ²Stalenhoef-Halling et al., 1990; ³Swanson, 1987; ⁴Wass et al., 2001; ⁵Petrusa, 2002; ⁶Norcini et al., 1999; ⁷Ram et al., 1999; ⁸Gorter, 2002
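Extrapolations of this kind are conventionally made with the Spearman-Brown prophecy formula. The worked example below is an illustration of that standard psychometric result, not a calculation taken from the cited studies:

    \rho_k = \frac{k \rho_1}{1 + (k - 1)\rho_1}

Here \rho_1 is the reliability at one hour of testing and k is the factor by which testing time is extended. Taking the OSCE row, \rho_1 = 0.47 projects to \rho_4 = (4 \times 0.47)/(1 + 3 \times 0.47) = 1.88/2.41 \approx 0.78 at four hours, matching the tabled value. The pattern across rows makes the slide's point: given enough testing time, no format is inherently superior.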

What have we learned?

- Competence is specific, not generic
- Any single-point measure is flawed
- One measure is no measure
- No method is inherently superior
- Subjectivity and unstandardised conditions are not something to be afraid of.

What have we learned?

Competence is specific, not generic

One method can’t do it all

Magic expectations…
[figure: Miller's pyramid with methods mapped to its levels: key features (short cases) at 'knows' and 'knows how', OSCEs at 'shows how', direct observation methods and portfolio at 'does'; no single method covers the whole pyramid]

What have we learned?

- Competence is specific, not generic
- One method can't do it all
- One measure is no measure
- We need a mixture of methods to cover the entire pyramid
- We can choose from a rich toolbox!

What have we learned?

- Competence is specific, not generic
- One method can't do it all
- Assessment drives learning

Assessment and learning

“The in-training assessment programme was perceived to be of benefit in making goals and objectives clear and in structuring training and learning. In addition, and not surprisingly, this study demonstrated that assessment fosters teaching and learning….” (Govaerts et al., 2004, p. 774)

Assessment and learning

“Feedback generally inconsistent with and lower than self-perceptions elicited negative emotions. They were often strong, pervasive and long-lasting….” (Sargeant et al., under editorial review)

Assessment and learning

“You just try and cram - try and get as many of those facts into your head just that you can pass the exam and it involves… sadly it involves very little understanding because when they come to the test, when they come to the exam, they’re not testing your understanding of the concept. They test whether you can recall ten facts in this way?” (student quote from Cilliers et al., in preparation)

The continuous struggle

[figure: the learner caught between curriculum and assessment (content, format, programming/scheduling, regulations, standards, examiners…)]

What do we know?

- Competence is specific, not generic
- One method can't do it all
- Assessment drives learning
- Verify the consequences; use the effect strategically
- Educational reforms are only as good as the assessment allows them to be.


Overview of presentation

- Where is education going?
- Where are we with assessment?
- Where are we going with assessment?
- Conclusions

My assumptions

- Innovation in education programmes can only be as successful as the assessment programme is
- Assessment should reinforce the direction in which education is going
- Future directions should use our existing evidence on what matters in assessment.

The Big Challenge
- Established assessment technologies have been developed in the conventional psychometric tradition of standardisation, objectification and structuring
- Emerging technologies are in vivo and by nature less standardised: unstructured, noisy, heterogeneous, subjective
- Finding an assessment answer beyond the classic psychometric solutions is The Big Challenge for the future.

Design requirements for future assessment
Dealing with real life:
- In vivo assessment cannot and should not be (fully) standardised, structured and objectified
- Assessment includes quantitative AND qualitative information
- Professional and expert judgement play a central role.

Design requirements for future assessment
Dealing with learning:
- All assessment should be meaningful to learning, and thus information-rich
- Assessment should be connected to learning (the framework of the curriculum and that of the assessment are identical)
- Assessment is 'embedded' in learning (this is the 'in vivo' of educational practice, and it adds significantly to the complexity).

Design requirements for future assessment
Dealing with sampling:
- Assessment is programmatic
- Comprehensive: includes domain-specific and domain-independent skills
- Combines sampling across many information sources, methods, examiners/judges and occasions (a formal sketch follows this slide)
- Is planned, coordinated, implemented, evaluated and revised (just like a curriculum design).
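One formal model commonly used for such sampling questions is generalizability theory; the sketch below is an illustrative assumption, not a model prescribed by the talk. For a design crossing learners (p) with judges (j) and occasions (o), the generalizability coefficient is

    E\rho^2 = \frac{\sigma_p^2}{\sigma_p^2 + \sigma_{pj}^2/n_j + \sigma_{po}^2/n_o + \sigma_{pjo,e}^2/(n_j n_o)}

where \sigma_p^2 is the variance between learners (the signal) and the interaction components are noise attributable to particular judges and occasions. Increasing the numbers of judges (n_j) and occasions (n_o) shrinks the error terms: the formal counterpart of sampling broadly across sources, methods and judges rather than perfecting any single measure.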

Challenges we face
Dealing with real life:
- How to use professional judgement? Do we understand judgement?
- How to elicit, structure and record qualitative information?
- How to use (flexible) standards?
- What strategies for sampling should we use? When is enough enough?
- How to demonstrate rigour? What (psychometric, statistical, qualitative) models are appropriate?

Challenges we face

Dealing with learning:
- What are methodologies for embedding assessment (e.g. Wilson & Sloane, 2000)?
- How to deal with the confounding of the teaching and assessor roles?
- How to combine formative and summative assessment?
- How to involve stakeholders? How to educate stakeholders?

Challenges we face
Dealing with sampling at the programme level:
- What strategies are useful in designing a sampling plan or structure for an assessment programme?
- How to combine qualitative and quantitative information?
- How to use professional judgement in decision making on aggregated information?
- How to longitudinally monitor competence development?
- What are (new) strategies for demonstrating rigour in decision making? What formal models are helpful?

Contrasting views in approach

Conventional assessment            | Programmatic embedded assessment
-----------------------------------|----------------------------------
Assessment separate from learning  | Assessment as part of learning
Context-free                       | Context matters (a dynamic relation between an ability, a task and the context in which the task occurs; Epstein & Hundert, 2002)
Method-centred                     | Programme-centred (based on an overarching, cohesive structure)

Contrasting approaches in research

Conventional assessment                                  | Programmatic embedded assessment
---------------------------------------------------------|----------------------------------
Rigour defined in direct (statistical) outcome measures  | Rigour defined by evidence on the trustworthiness or credibility of the assessment process
Reliability/validity                                     | Saturation of information, triangulation
Benchmarking                                             | Accounting

Contrasting views in approach
[slide: 'Confused?' shown between conventional assessment and programmatic embedded assessment]

Overview of presentation

- Where is education going?
- Where are we with assessment?
- Where are we going with assessment?
- Conclusions

Conclusions

- Assessment has made tremendous progress
- Good assessment practices based on established technology are implemented widely
- Sharing of high-quality assessment material has begun (IDEAL, UMAP, the Dutch consortium)

Conclusions

- We are facing a major next step in assessment
- We have to deal with the real world
- The real world is not only the work-based setting but also the educational training setting

Conclusions

To make that step:
- We need to think out of the box
- We need new methodologies to support assessment strategies
- We need new methodologies to validate the assessment

Conclusions

There is a lot at stake: Educational reform depends on it

“I’m here because I couldn’t change the assessment”

Conclusions

Let’s join forces to make that next step!

This presentation can be found at: www.fdg.unimaas.nl/educ/cees/amee
