Why a programme view? Why TESTA? Professor Tansy Jessop TESTA Workshop Trinity College Dublin 9 February 2017


TRANSCRIPT

Page 1: Why a programme view? Why TESTA?

Why a programme view? Why TESTA?

Professor Tansy Jessop
TESTA Workshop

Trinity College Dublin
9 February 2017

Page 2

Jottings

1. One thing you already know about TESTA
2. One problem you have faced with assessment
3. One problem you have faced with feedback
4. One blue sky idea to address a problem

Page 3

What I am hoping to achieve today

1. Brief overview of TESTA
2. Why people find it useful
3. Three problems TESTA addresses
4. Four themes in the data
5. Solutions: a taster

Page 4
Page 5

Mixed methods approach

• Programme Team Meeting
• Assessment Experience Questionnaire (AEQ)
• TESTA Programme Audit
• Student Focus Groups

Page 6

Sustained growth

Page 7

TESTA…

“…is a way of thinking about assessment and feedback”

Graham Gibbs

Page 8

It enables you to see the whole elephant

Page 9

Three problems

Problem 1: Something awry, not sure why

Problem 2: The curriculum design problem

Problem 3: The problem of educational change

Page 10

1. Something awry, not sure why

Page 11

Wow! Our students love History! Fantastic!

Page 12

Whoops there’s a little problem here

Page 13

Fix it!

OK, we’ll look especially at polishing up our feedback. Students seem to rate that lowest.

Page 14

Apply spit and polish

Page 15

Anyone for the feedback sandwich?

I cushion the blow!

The hard truths are nicely disguised!

Me too - nice and soft!

Page 16

Problem 2: The curriculum design problem

Page 17

Does IKEA 101 work for complex learning?

Page 18

Curriculum privileges ‘knowing’ stuff

“Content is often the most visible aspect for students, the control of which is frequently devolved to individual academics, who receive little or no training in curriculum design and planning”

(Blackmore and Kandiko 2014, 7).

Page 19

Blunt instrument curriculum

Page 20

Problem 3: The problem of educational change

Three misguided assumptions:

1. There is not enough high-quality data.

2. Data will do it.

3. Academics will buy it.

http://www.liberalarts.wabash.edu/study-overview/

Page 21

Proving is different from improving

“It is incredibly difficult to translate assessment evidence into improvements in student learning”

“It’s far less risky and complicated to analyze data than it is to act”

(Blaich & Wise, 2011)

Page 22

Paradigm | What it looks like

Technical rational | Focus on data and tools

Relational | Focus on people

Emancipatory | Focus on systems and structures

Page 23

TESTA themes and impacts

1. Variations in assessment patterns
2. High summative: low formative
3. Disconnected feedback
4. Lack of clarity about goals and standards

Page 24

Defining the terms

• Summative assessment carries a grade which counts toward the degree classification.

• Formative assessment does not count towards the degree (it is ungraded or pass/fail), elicits comments, and is required of all students.

Page 25

1. Huge variations

• What is striking for you about these data?

• How does it compare with your context?

• Does variation matter?

Page 26

Assessment features across a 3-year UG degree (n=73)

Characteristic | Range

Summative assessments | 12–227

Formative assessments | 0–116

Varieties of assessment | 5–21

Proportion of examinations | 0%–87%

Time to return marks & feedback | 10–42 days

Volume of oral feedback | 37–1,800 minutes

Volume of written feedback | 936–22,000 words

Page 27

Theme 2: High summative: low formative

• Summative ‘pedagogies of control’

• Circa 2 per module in UK

• A formative to summative ratio of 1:8

• Formative weakly understood and practised

Page 28

Assessment Arms Race

Page 29

What students say about high summative

• A lot of people don’t do wider reading. You just focus on your essay question.

• In Weeks 9 to 12 there is hardly anyone in our lectures. I'd rather use those two hours of lectures to get the assignment done.

• It’s been non-stop assignments, and I’m now free of assignments until the exams – I’ve had to rush every piece of work I’ve done.

Page 30

What students say about formative

• If there are no actual consequences of not doing it, most students are going to sit in the bar.

• The lecturers do formative assessment but we don’t get any feedback on it.

Page 31

Actions based on evidence

1. Rebalance summative and formative
2. Programme approach
3. Formative in the public domain
4. Linking formative and summative
5. Risky, creative, challenging tasks
6. Students reading and producing more
7. Deeper understanding of the value of formative

Page 32

Theme 3: Disconnected feedback

Page 33

Take five

• Choose a quote that strikes you.

• What is the key issue?

• What strategies might address this issue?

Page 34

What students say…

It’s difficult because your assignments are so detached from the next one you do for that subject. They don’t relate to each other.

Because it’s at the end of the module, it doesn’t feed into our future work.

Because they have to mark so many that our essay becomes lost in the sea that they have to mark.

It was like ‘Who’s Holly?’ It’s that relationship where you’re just a student.

Page 35

Actions based on evidence

• Conversation: who starts the dialogue?
• Iterative cycles of reflection across modules
• Quick generic feedback: the ‘Sherlock’ factor
• Feedback synthesis tasks
• Technology: audio, screencast and blogging
• From feedback as ‘telling’…
• …to feedback as asking questions

Page 36

Theme 4: Confusion about goals and standards

• Consistently low scores on the AEQ for clear goals and standards

• Alienation from the tools, especially criteria and guidelines

• Symptoms: perceptions of marker variation, unfair standards and inconsistencies in practice

Page 37

What students say…

We’ve got two tutors: one marks completely differently to the other and it’s pot luck which one you get.

They have different criteria, they build up their own criteria.

It’s such a guessing game.... You don’t know what they expect from you.

They read the essay and then they get a general impression, then they pluck a mark from the air.

Page 38

Taking action: internalising goals and standards

Lecturers
• Regular calibration exercises
• Discussion and dialogue
• Discipline-specific criteria (no cut and paste)

Lecturers and students
• Rewrite/co-create criteria
• Marking exercises
• Design and value formative

Students
• Enter the secret garden: peer review
• Engage in drafting processes
• Self-reflection

Page 39

From this educational paradigm…

Page 40

Transmission Model

Page 41

Social Constructivist Model

Page 42

References

Blaich, C. and Wise, K. (2011) From Gathering to Using Assessment Results: Lessons from the Wabash National Study. Occasional Paper #8. University of Illinois: National Institute for Learning Outcomes Assessment.

Boud, D. and Molloy, E. (2013) ‘Rethinking models of feedback for learning: the challenge of design’, Assessment & Evaluation in Higher Education, 38(6), pp. 698–712. doi: 10.1080/02602938.2012.691462.

Gibbs, G. and Simpson, C. (2004) ‘Conditions under which assessment supports students’ learning’, Learning and Teaching in Higher Education, 1(1), pp. 3–31.

Harland, T., McLean, A., Wass, R., Miller, E. and Sim, K. N. (2014) ‘An assessment arms race and its fallout: high-stakes grading and the case for slow scholarship’, Assessment & Evaluation in Higher Education.

Jessop, T. and Tomas, C. (2016) ‘The implications of programme assessment on student learning’, Assessment & Evaluation in Higher Education. Published online 2 August 2016.

Jessop, T. and Maleckar, B. (2014) ‘The influence of disciplinary assessment patterns on student learning: a comparative study’, Studies in Higher Education. Published online 27 August 2014. http://www.tandfonline.com/doi/abs/10.1080/03075079.2014.943170

Jessop, T., El Hakim, Y. and Gibbs, G. (2014) ‘The whole is greater than the sum of its parts: a large-scale study of students’ learning in response to different assessment patterns’, Assessment & Evaluation in Higher Education, 39(1), pp. 73–88.

Nicol, D. (2010) ‘From monologue to dialogue: improving written feedback processes in mass higher education’, Assessment & Evaluation in Higher Education, 35(5), pp. 501–517.

O’Donovan, B., Price, M. and Rust, C. (2008) ‘Developing student understanding of assessment standards: a nested hierarchy of approaches’, Teaching in Higher Education, 13(2), pp. 205–217.

Sadler, D. R. (1989) ‘Formative assessment and the design of instructional systems’, Instructional Science, 18(2), pp. 119–144. doi: 10.1007/bf00117714.