IBLC10: Making an existing assessment more efficient


Using technology to make an existing assessment more efficient

Fang Lou1,2, Helen Barefoot2, Dominic Bygate2 & Mark Russell2

School of Life Science1 & Learning and Teaching Institute2, University of Hertfordshire, Herts. AL10 9AB, UK

What happened in 2007-8/2008-9?

2. Workshop after all students had completed the laboratory class
• One-hour workshop
• Rationale for, and benefits of, peer assessment discussed
• Guidance on laboratory report writing
• Marking criteria considered
• Consideration of student perceptions regarding peer assessment (EVS)

3. Submission of the report
• Online submission of the student’s laboratory report (for moderation and academic conduct purposes)
• Submission of an anonymous hard copy for distribution during the marking session

Introduction
Peer assessment is recognised as student-centred and empowering. It engages students with marking criteria and helps clarify what good performance is. Peer assessment also encourages dialogue around learning amongst students and promotes interaction between staff and students. These aspects are all recognised as good assessment and feedback practice (Nicol, 2009). Additionally, peer assessment prepares students for lifelong learning and provides opportunities for students to give constructive feedback to their peers (Orsmond, 2004). First-year Bioscience students studying a Human Physiology module at the University of Hertfordshire have engaged in peer assessment of a laboratory report from a practical class over the last three years. It is a summative assessment and forms 20% of the module grade. The peer assessment activity was designed to give students clear instructions and transparent assessment criteria, and to stimulate student learning. This year a web-based data gatherer was introduced to make the peer assessment process more effective and efficient.

1. Laboratory study

4. Peer assessment
• Two-hour peer marking activity
• Each student was randomly given another student’s report
• Lecturer guided the students through each section of the report according to detailed marking criteria
• Students marked each section and provided annotated feedback
• Reports collected for moderation

However, it wasn’t perfect…

What has been changed in 2009-10?

Peer assessment
• A print-out of the marking criteria was given to each student to enable them to keep up with the marking.
• A sheet designed for an Optical Mark Reader (OMR) was given to students; marks for each section were entered on it.

Core features of the peer assessment (2007–2009) were kept, but actions were taken to respond to the noted challenges and ultimately improve the peer assessment. Changes include:

Reflection and feedback
After the peer assessment, students were asked to reflect on their learning experience by answering a web-based questionnaire. The web-based data-collection facility was re-purposed specifically to support this activity. Five per cent of the module marks were allocated to the reflection and feedback activity.

Results
148 students (out of 186 who submitted a report) completed the online questionnaire. The vast majority of students engaged very well, answering all 27 questions and providing detailed free-text comments; as such, all but three students gained the full 5 marks allocated for feedback and reflection. The first part of the questionnaire used closed questions relating to reflection on the report and the peer assessment process. The results are reported here:

(SA: strongly agree; A: agree; NAND: neither agree nor disagree; D: disagree; SD: strongly disagree)
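The per-question tallies behind graphs of this kind can be produced with a short script. The sketch below is illustrative only: the response values are invented, and the five-point scale codes are taken from the key above.

```python
from collections import Counter

# Five-point Likert scale used in the questionnaire key
scale = ["SA", "A", "NAND", "D", "SD"]

# Hypothetical responses to one closed question (invented for illustration)
responses = ["SA", "A", "A", "NAND", "SA", "A", "D"]

# Count each option, including scale points nobody chose
counts = Counter(responses)
summary = {option: counts.get(option, 0) for option in scale}
```

A dictionary like `summary` can then be fed directly to a charting tool to draw one bar per scale point.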

As can be seen from the graphs, most students explicitly stated that peer assessment will benefit their future learning. The students also identified the Discussion section of the report as the most challenging. Example responses to the question “What other comments do you have regarding the peer assessment process?” include …

To enhance the resource efficiency of the activity, the students were asked to comment on their mark and to indicate, with explicit reference to the marking criteria, where they believed they had been over- or under-marked. Sixteen reports were moderated using the online submitted report. Extra marks were awarded to ten reports (9.4 ± 1.8%, mean ± SEM).
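A mean ± SEM figure like the one above is computed as the sample mean and the sample standard deviation divided by the square root of the number of values. A minimal sketch (the ten adjustment values are invented, not the poster's data):

```python
import math

def mean_sem(values):
    """Return (mean, standard error of the mean) for a list of numbers."""
    n = len(values)
    mean = sum(values) / n
    # Sample standard deviation (n - 1 in the denominator)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return mean, sd / math.sqrt(n)

# Hypothetical percentage mark adjustments for ten moderated reports
adjustments = [7.5, 12.0, 5.0, 10.0, 15.0, 8.0, 6.5, 11.0, 9.0, 13.0]
m, sem = mean_sem(adjustments)
```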

It was estimated that 25 to 30 staff marking hours were saved.

It was a very good eyeopener

the marking criteria was a great help and provided a guideline for future lab reports.

It wasn't as bad as I thought, but had trouble keeping up.

I think the assessment wasn't very fair as some markers can be really generous and others will not be.

References
Nicol, D. (2009) http://www.reap.ac.uk/resourcesPrinciples.html (accessed 28 February 2010).
Orsmond, P. (2004) Self- and Peer-Assessment: Guidance on Practice in the Biosciences. HEA Centre for Bioscience.

• Three-hour laboratory class
• Data recorded and analysed

Challenges:
• High moderation requirement for the teaching team
• Even after moderation, a number of students still questioned their marks, further increasing the workload of the module team
• Evident lack of self-reflection (developing self-reflection was one of the intended learning outcomes of the peer assessment)
• Some students fell behind during the marking
• Time-consuming and burdensome to enter 200+ marks manually onto a spreadsheet

Action points
We needed to develop a system that would encourage students to read their marked report and to:
• Reflect on their learning experience
• Consider whether they felt the mark they received was appropriate according to the marking criteria
• Identify what they need to do to improve future laboratory reports
We also identified the need for an automated system for entering marks, to decrease the moderation requirement and reduce the administrative burden.
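The automated mark entry described above can be approximated by aggregating an OMR export instead of typing 200+ marks into a spreadsheet by hand. The sketch below assumes a hypothetical CSV layout (one row per marked report, one column per report section); the column names are invented for illustration.

```python
import csv
from io import StringIO

# Hypothetical OMR export: student identifier plus per-section marks
omr_export = StringIO(
    "student_id,intro,method,results,discussion\n"
    "s001,3,4,5,3\n"
    "s002,4,4,4,5\n"
)

# Total each student's marks across all report sections
totals = {}
for row in csv.DictReader(omr_export):
    section_marks = [int(v) for k, v in row.items() if k != "student_id"]
    totals[row["student_id"]] = sum(section_marks)
```

In practice the `StringIO` stand-in would be replaced by the file the OMR software produces, and the totals written back out for upload to the module gradebook.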

Conclusion
Peer assessment engaged students in productive learning. Prompt and relevant feedback was provided, and students were required to act on it. The activity encouraged deep learning and interaction between students. Additionally, and importantly for 2009/10, staff time was saved.