

How do practitioners assess students’ reflective writing?

Janet Strivens
Centre for Lifelong Learning

University of Liverpool

Paper presented at the Learning Communities and Assessment Cultures Conference organised by the EARLI Special Interest Group on Assessment and Evaluation,

University of Northumbria, 28-30 August 2002

This paper arises from a workshop offered to educational practitioners by the Centre for Recording Achievement (CRA), called “Supporting Learning From Reflection”. The workshop was repeated on four occasions: at the University of Manchester on 29th October 2001; the University of Brighton on 7th December 2001; the University of Huddersfield on 8th April 2002; and University College Worcester on 18th June 2002.

The aims of all four workshops were the same: to articulate the criteria we use to assess the quality of students’ reflective work; to identify how these criteria may vary depending on the purposes for which reflective tasks are set; and to identify the kinds of support which would improve the quality of learning from reflection for ‘weaker’ students.

Approximately 150 people attended the four workshops. Forty-nine people registered for Manchester, thirty-two for Brighton, thirty-eight for Huddersfield and thirty-one for Worcester. In addition to Higher Education Institutions (HEIs), participants came from the NUS, the school sector and UCAS. Within HE, there was a wide range of different areas of work, levels of seniority and academic disciplines. Careers, staff development, research posts and academic posts were all represented.

The format of the workshops

The context for the development of the CRA workshop was the imminent introduction of the process known as personal development planning (PDP) into UK HEIs. The Centre for Recording Achievement is a cross-sectoral national network committed to the furtherance of PDP for all learners. (For more information about its work and mission, see http://www.recordingachievement.org)

The workshops were designed to give the maximum space for discussion. The idea of a day given over completely to practitioners talking to each other had grown out of a feeling of frustration over a number of events concerned with developing PDP practice. Typically a sizeable number of participants already knew each other well from their common work in this area and/or through membership of the CRA. A frequent comment from this group was that the standard ‘workshop’ sessions, where the participants broke into groups to discuss issues, had just about got underway when everyone was called back to attend to the next presenter. No doubt what the presenters had to say was valuable, but who knew where the discussion might have arrived if it had been allowed to continue? The workshop designers, then, were looking to design a format which facilitated purposeful discussion, but in such a way as to root the discussion firmly in the practice of developing students’ reflective writing.

The first two workshop aims addressed the process of assessment. A recurrent issue for practitioners in discussion was the assessment of the ‘quality’ of reflection. There was undoubtedly a lot of reflective work going on, in terms of the tasks set by tutors within programmes. Equally clearly these tutors knew at an intuitive level what they were looking for (or at least they knew when they were disappointed by the student’s work), but often felt that this was a highly subjective judgement. When the task contributed in some significant way to the outcomes of the programme (for example, when an aim of the programme was to produce the reflective practitioner), the feeling of discomfort over the subjectivity of the assessment was heightened.

The designers therefore wanted to give participants the space to examine their own judgements about the value of students’ responses to the reflective tasks set. Two things seemed important in facilitating the process of making these judgements explicit: a supportive group process and the use of ‘real’ student material. Participants were asked in the workshop flyers, and reminded when they registered, to bring multiple copies of two pieces of student work, a ‘good’ response and a ‘weak’ response to a reflective task which they had set. They were divided into groups of around six, small enough for each or any member of the group to present his or her assessment of student work to the rest of the group, and talk through the process of assessing, pointing out what seemed to them to be the ‘virtues and weaknesses’ in the work. The size of the groups meant that a relationship between strangers could build quite quickly and also that each member had a chance to present his or her students’ work.

Participants were also asked to bring any documentation relating to the task, especially any written criteria for assessment if these existed. However, to help structure the discussions a list of possible criteria for assessing reflection was provided (see Appendix 1, the ‘coversheet’). It was explained that this list was meant as a prompt only: the items could be edited or added to. However it was stipulated on the sheet that a maximum of five criteria should be identified for the assessment of each task, ranked if possible in order of importance. This ‘forced choice’ was intended to help the group (and particularly the presenter of the set of material under discussion), to clarify the main purpose and emphasis of the reflective task being considered.

Apart from the criteria on the sheet, no models of reflection were offered to the participants in advance of their discussion (although they were provided with a copy of Moon’s (2001) summary to take away from the day). It was expected that individuals would bring very different levels of awareness of the literature on reflection, and indeed, it was clear from the discussions and from the tasks themselves that some individuals used very clear models of reflection with their students, both implicitly and explicitly. Others were very new to the idea of reflection and were working largely intuitively. Of course, the designers (both of them staff and educational developers) were attempting to design an experience which would be developmental for every participant, and the possibility of useful research data arising out of the exercise was a secondary consideration. Nevertheless, it was expected that within the range of tasks brought along (despite the fact that they were all in some sense ‘reflective’), differences would emerge in the criteria selected, which would help the group to pinpoint the precise (or at least the most important) purpose in the tutor’s mind for setting the task for the students.

The groups were invited to start the day by considering the ‘good’ piece of student work. It seemed logical that a response judged to be satisfactory to the task set would help to clarify the criteria for the assessment of that task. (However, some participants found it easier and more valuable to consider the two pieces of work, good and weak, together.) The intention was to move in the afternoon to a consideration of the weaker piece, armed with the criteria which had been identified earlier, in order to answer more precisely the question: “how could this student be supported to learn more effectively from this reflective task?” In this way it was hoped that the strategies for supporting students who find reflection more difficult could be more tailored to the precise purpose of the task set.

Perhaps it is unnecessary to say that things didn’t quite turn out like this in practice! People who come to workshops do not always read the instructions carefully, much less follow them! The first risk was whether enough participants would bring materials. To be prepared for the worst, the organisers took along sets of materials from their own students. In the event, about one third of the Manchester attendees (the first workshop) had brought their own material, rising to closer to two thirds at the three subsequent workshops. (Two sets of the organisers’ material were used with groups at Manchester, but comments on the evaluation sheet indicated that the value of the discussion was not diminished by this. However, these sets have been excluded from this analysis.)

In practice, it could take far longer than the designers had envisaged for a group to feel it had exhausted the possibilities of a set of material: more than one group managed to look at only one set of materials during the morning discussion session, which lasted an hour and a half.

The possibility of valuable research material arising out of these workshops had already been recognised by the designers. Participants who were willing were asked to leave a copy of the materials they had brought. Forty-four full or partial sets of material were offered by the participants, a rich source of information about practice in the field of reflection (though not all of the student work may be made public). A ‘set’ of materials contains the filled-in ‘coversheet’ provided by the organisers with an indication of the assessment criteria chosen, two pieces of student work of differing quality, and where available, the documentation given to the student to describe the task and how it would be judged.


This paper is a preliminary analysis of this material, focussing on the nature of the tasks and the criteria chosen. The intention is to share this analysis with all the participants (almost all willingly agreed to form a mailbase to continue discussion and exchange of ideas) and invite their further thoughts and comments. The ultimate goal (which may or may not be realisable) is to support action research by the practitioners in developing and supporting their students’ capacity to reflect, and to learn from reflection.

The criteria

The criteria offered were based on the experience of the designers from their own practice and from long involvement in discussion within the CRA, rather than being derived from an explicit theoretical model. As the coversheet makes clear, they were intended as prompts, to provide structure rather than constraint, and participants were invited to add to or amend them [Table 2].

A few groups, particularly in the first workshop, had difficulty relating the criteria to the task set rather than to the individual pieces of student work. This may have been because a particularly good piece of reflective writing seemed to exemplify most of the characteristics listed. After the first workshop, the facilitator took pains to stress that the criteria should be considered in relation to the task. Some groups felt that they had managed to identify one or more ‘essential’ characteristics of reflection by identifying a criterion common to all the work they had brought, but consensus within a group was not matched by consensus between groups. Looking at the scatter of choices [Table 1], it would probably be fair to say that the workshop designers’ assumption that different purposes for reflective tasks would be mirrored in different criteria for assessment was borne out, even allowing for the artificial restrictions placed on the participants.

Several criteria were added to the list offered. The most common suggestion for a criterion which participants felt had been omitted was “awareness of own strengths”. Clearly participants felt that this was not strongly enough indicated by the criterion “awareness of own qualities/attributes”. Two others seem like clear omissions: “record progress against goals” and “provide evidence”. These will be added to future workshops. Other suggestions were less clear and clarification will be sought.

Some emendations were suggested. “Accurate observation” was amended to “precise observation” in one instance and “meaningful observation” in another. “Appropriate” was added to “analysis of events”. There was evidence that these first two criteria were strongly linked in some people’s minds.

“Honesty/authenticity” provoked several comments and queries. Clearly some would have preferred to omit this criterion, on a variety of grounds: some wanted to distinguish between the two concepts, others queried whether they could or should be assessed. One group in verbal feedback argued that this should be the one fundamental criterion for all reflective tasks, but only three individuals ranked it on their coversheets as most important in relation to their material.


Over a quarter of participants disliked the idea of prioritising among the five criteria they had chosen or, in a few cases, of selecting criteria at all.

The tasks

As expected, the tasks varied tremendously along several dimensions: size, degree of structure, and whether or not they carried formal credit [Table 3]. The most substantial were essays of 2000 words or more; the shortest consisted of three questions on a form. Several were highly structured by a series of questions, sometimes with a box limiting the space for an answer. For others the structure was contained within the guidance: it was the students’ task to make sure that the points were addressed within their continuous prose. While there was a tendency for highly structured tasks to be shorter, some of the more substantial pieces were guided by a detailed brief.

Nearly a quarter of the tasks could be described as essays, in that they were relatively discursive and lengthy pieces of writing, with the structure either chosen by the student or adapted from headings in the guidance documentation [A7, A17, B4, B7, B8, B9, C6a&b, D1b]. These are an interesting sub-group in that they pose particular assessment problems to tutors (and, by implication, production problems to weaker students?). There is an expectation with most of these that academic literature will be referred to, and credit is given for an appropriate bibliography. There is an expectation that theoretical models will be applied to the analysis of the writer’s experiences or observations. Language is also an issue: one feedback comment specifically mentions that the language is sometimes inappropriate for a piece of academic writing. Typically these pieces of work are submitted for formal academic credit.

In these cases, the ‘strong’ example of a student response to the task was likely to be one which handled the demands of the academic genre while at the same time using interesting and relevant episodes of personal experience. A ‘weak’ response presumably could be weak in many different ways, not necessarily related, making the task of supporting the ‘weaker’ student more problematic.

There were many examples which could be called self-evaluation, varying according to whether the evaluation was against clear targets or learning objectives; whether the writer was expected to produce an action plan following the evaluation; and whether the purpose was to assess the writer’s current state of development, his/her performance on a particular occasion, or his/her learning from a particular event. ‘Current state’ evaluations looking back over a specified period of time tended to be the least structured; more often, however, these self-evaluation tasks were highly structured by a series of questions and/or a proforma. A ‘weak’ response to such a highly structured task seems to be a minimal one, suggesting that the task has not been addressed with any seriousness.

A small but interesting subgroup could be characterised as ‘critical incident analysis’ [A7(?), B5, B10b, D1b]. These examples are dissimilar in format: two are essays, two proformas. However, they all rely on the writer ‘noticing’ something worth thinking about and reporting, usually but not always involving his/her own performance. When the writer is involved, the issue arises as to whether the tutor’s assessment will relate to the performance itself or to the quality of reflection. Assessors of portfolios will be well aware that experiences of failure typically provide much richer material for reflection than experiences of success, but it is difficult to persuade students, in particular weaker students, to expose themselves in this way. When the writer is only a spectator, the ‘incident worth noticing’ may well involve perceived failure in others (because this is again more likely to raise questions than watching successful performances). In some circumstances this may raise difficult issues of ‘professional attitudes’ for the tutor/assessor, since the ‘failing actors’ whose behaviour the student writer is critically analysing may be more experienced and qualified professionals.

An attempt to re-examine the selection of criteria using sub-categories of tasks did not reveal any clear patterns matching criteria to tasks [Table 4]. The next step is to return this data to the workshop participants who contributed it. They may choose to adjust the author’s tentative classification of their own tasks, and could supply some of the missing data in the selection of criteria.

Discussion

There are some existing models of reflection from which assessment criteria can be derived, which practitioners seem to find useful. Most often cited are Hatton and Smith (1995), who suggest a four-level model of development from descriptive writing through descriptive reflection and dialogic reflection to critical reflection. Johns’ (1994) model is rather different, delineating the field of ‘reflection’ through a series of eighteen questions divided into five areas: description of experience; reflection; influencing factors; alternative actions; learning. (The six questions under the heading ‘reflection’ have no privileged status).

In designing these workshops, the underlying concept of reflection could perhaps be seen as closer to Johns in several ways. The designers expected a wide variety in the materials brought and, by implication, in the reflective tasks set. (Incidentally, one participant objected to the use of the word ‘task’ to describe what the students were asked to do, but we have failed to find a more satisfactory word.) From their own experience they expected these tasks to have a range of different purposes, for which different assessment criteria might well be appropriate. Reflective tasks may become more demanding at higher levels, and may in consequence activate a wider range of criteria for their assessment. However it would clearly be simplistic to look for this in terms of academic level. Students exposed to different curricula and different tutors will be asked to reflect, if at all, at different stages of their progress. The workshops were not structured in such a way as to reveal a developmental aspect to a tutor’s or programme team’s planning. The focus was on supporting students in a reflective task at a point in time, rather than looking at the design of a curriculum to deepen reflection.

Participants found the discussion of criteria useful and illuminating, according to their evaluations, but there was too little time to explore whether ‘support for the weaker student’ could be more effective if targeted more clearly to learning outcomes and assessment criteria. While highly structured proformas appeared to be accepted by the practitioners as valid reflective tasks, there was certainly some discussion (provoked, interestingly, by some of the ‘strong’ examples of students’ work) of whether some students were simply better at jumping through hoops than others, and would take care to hit the criteria given when asked to write reflectively. The obvious question which follows is: if we can induce students to write reflectively, is this habit-forming, and do they in fact think better when away from our carefully-honed assignments?

To summarise this work so far: when articulated, criteria do appear to vary depending upon the learning intention behind the task set, but not according to any definable pattern. Tutors are often unclear about the criteria they are using and assess intuitively, and they welcome models which offer guidance. Greater clarity about purpose and criteria could lead to better targeted support for students who find reflective tasks difficult, but does not necessarily lead to the development of a more reflective individual.

References:

Hatton, N. and Smith, D. (1995) “Reflection in teacher education: towards definition and implementation”, Teaching and Teacher Education, 11(1), pp. 33-49.

Johns, C. (1994) “Guided reflection”, in A. Palmer, S. Burns and C. Bulman (eds.) Reflective Practice in Nursing, Oxford: Blackwell Scientific.

Moon, J. (2001) Reflection in Higher Education Learning, PDP Working Paper 4, LTSN Generic Centre (at http://www.ltsn.ac.uk/genericcentre/projects/pdp/working-papers/).

Special thanks to: Catherine O’Connell for initial planning and enthusiasm; Jenny Moon for co-facilitating at Brighton and supplying a background paper for all events; Rosemary Warner, Peter Kahn and colleagues at the University of Manchester/UMIST for organising the event at Manchester; Julie Fowlie for organising the event at Brighton; Rob Lloyd Owen for organising the event at Huddersfield; John Peters for organising the event at Worcester; and Cath Hewson for all the administration from the CRA end!


Table 1: Participants’ selection of criteria

A = Accurate observation
B = Analysis of events/situations
C = Identification of relevant elements
D = Awareness of own feelings/emotions
E = Awareness of own attributes/qualities
F = Awareness of own weaknesses
G = Awareness of own learning
H = Making connections (ideas)
I = Making connections (prior knowledge)
J = Honesty/authenticity
K = Recognition of/planning for next stage
L = Commitment to change

A B C D E F G H I J K L Add?
A1 3 1 3 1 3
A2 x x 1 x 5
A3 x x x x x
A4 4 5 2 3 1
A5 x x x x x
A7 x x x x x x
A8 1 2 3 4 5 Yes
A9 x x x x x Yes
A12 3 4 1 2 5
A13 2 3 5 2 1
A14 4 3 1 2 5 Yes
A15 x x x x x
A17 4 3 1 2 5 Yes
B1 5 4 1 3 2 Yes
B2 3 2 3 3 3 2 1 1 1 1 3 3
B3 x x x x
B4 5 4 3 2 1
B5 4 1 2 3 5
B6 2 2/3 2/3 3 1 1
B7 2 2 2 2 2 ? 1 3 3 4 4
B8 3 4 3 5 2 2 1 1 2 3 2 2
B10 x x x x x
B12 3 1 2 5 4
C1 5 2= 1 4
C2 3 2 1 4
C3 1 2 3
C5 4 1 2
C6 1 5 4 2 3
D1 1 2 3 6 4 5 Yes
D2 4 5 2 1 3 Yes
D3 5 3 2 1 4
D6 2 3= 3= 1 5


Table 2: Additions/amendments to criteria

A1 Inability to prioritise
A2 Offered both a linear model and priorities
A3 V. difficult/impossible to prioritise. Adds: “application of practice to theory”; “awareness of strengths”
A4 Adds: “to record progress against goals” (most important)
A5 No prioritisation
A6 Equal priority
A7 Chose six – no prioritisation. Queried “honesty/authenticity” – relevant? measurable?
A8 Prioritised unwillingly! Adds: “to record progress against goals”; “awareness of own strengths”
A9 ‘In reality...would not prioritise’. Adds: “awareness of own strengths”
A10 Adds: “strength”; “evaluation”
A11 Linked together “accurate observation” and “analysis of events/situations”; linked together “recognition/planning” and “commitment to change”. Adds: “awareness of strengths”
A12 Prefers “precision” to “accurate observation”
A13
A14 Prefers “precise” to “accurate”. Adds: “progress against learning goals”; “learning dialogues”
A15
A16 Queried “honesty/authenticity”; added “appropriate” to “analysis of events/situations”, “providing examples” to “making connections with prior knowledge/experience”
A17 Adds: “record progress against goals”; “awareness of own strengths”
A18
B1 Provided ‘where we’re at’ and ‘where to be’ choices. Adds: “more level guidance”; “action/future learning”
B2 Selected all, weighting 1, 2 or 3
B3 No priorities; selected four
B4 Adds: “integrating reflected material” (most important)
B5 Prefers “meaningful” to “accurate (observation)”
B6 Prefers “meaningful” to “accurate (observation)”
B7 Selects all but “honesty/authenticity”, four weightings. Adds “purpose of reflection” to “(awareness of) own learning”
B8 Selects all, five weightings
B9 Selection/prioritisation exercise not carried out
B10 No prioritisation. Adds: “why?”
B11 No coversheet
B12
C1 Adds: “provide evidence”. Table of group responses
C2
C3 Ranks in order of doing, not importance
C4 No coversheet
C5
C6 Criteria relate to first example, not second (different exercise)
C7 Selection/prioritisation exercise not carried out
C8 No coversheet
D1 Adds: “awareness of moving on”; “use of model”
D2 ‘V. difficult to prioritise – all inter-related’. Adds: “identification of areas needing improvement/change”; “changing prior assumptions”. Queries “honesty/authenticity” – ‘should honesty be here at all?’ (but weights it 3). Prefers “acknowledgement” to “recognition (of/planning for next stage)”
D3 Adds “awareness of own strengths”
D4 ‘Not done! What need for ranking? Why just five? Not discrete anyway’. Adds “strengths” to “areas for improvement”; “observations” to “(making connections between) ideas”
D5 Refers to assessment criteria in own documentation
D6


Table 3: Tasks

ID | Title | Type/Category | Students | Size/format/structure
A1 | Self-evaluation sheet | Self-evaluation of language development / action plan | Ug Socrates | Form: 2 pages, 12 questions
A2 | Personal evaluation/review | Self-evaluation of general skill development / action plan | Ug L2 | 1000w
A3 | Reflections on final year project | Self-evaluation of project work | Ug L3 (end) | (3
A4 | Summary of achievements | Self-evaluation of general development against goals | Ug various | (1-2pp)
A5 | Reflection form | Reflection on course assignment | Ug L1 (early) | Form: 1 page, 4 questions
A7 | Reflective essay | Reflection on subject skill development over year | Ug L2 | Essay (2500+w): 2 parts, including literature
A8 | Assignment | Report on sandwich year | Ug L2/3 | 1000w
A9 | Learning log | Self-evaluation of own learning from session/review/action plan | Ug L2 | 4x1000w
A12 | Skills review | Self-evaluation of general skill development, end of placement | Ug summer | Online form, questions
A13 | Reflection on learning, skills & pd | Self-evaluation of 1st semester | Ug L1 | Form: 2 pages
A14 | Reflection and evaluation | Summative reflection to justify claim for credit | PG | Unstructured: 1 page
A15 | Clinical placement reflective diary | Self-evaluation of professional skills development | Ug Yr1 | Form: 2 pages
A17 | Portfolio Stage 3 | Reflection on previous experience against objectives | Stage 3 PAMS | Essay: including review/action plan and literature
B1 | Self-assessment | Self-development plan/review with evidence | Ug Sandwich | Structured by student against objectives
B2 | Reflective journal | Weekly description of/reflection on course content | HND Yr1 | 2000+w, sectioned by weeks
B3 | PDP Base statement | Self-evaluation | Ug Yr 1 | Online, 9 questions
B4 | Assignment (Management Science) | Analysis of self-selected problem using MS modelling | M.Sc | Report structured by detailed brief (2000+w)
B5 | Example of a critical incident | Critical incident analysis | Pharmacy technicians | Structured by 5 questions in guidance: 2 pages
B6 | Academic review of the year | Self-evaluation of development in Yr 1 | Ug Yr 1/2 | Form: 1 page
B7 | Position paper | Self-evaluation of learning against competency targets | Social work L1 | Short essay
B8 | Portfolio ‘piece’ | Application of theory to own learning | Pg (MA) | Essay: 2000+w
B9 | Learning log | Assessment of a learning experience | Ug Yr 2 | Essay
B10a | Evaluation of session | Self-evaluation of learning from teaching session | Pharmacist | Form: 1 page with 5 headings
B10b | Example of significant incident report | Critical incident analysis | Pharmacy technician | 1 page: description/reflection
C1 | Weekly learning review and plan (WRAP) | Weekly self-assessment against competencies, with evidence -> summary | Ug Yr1 | Online: comments against competency statements
C2 | Student placement report forms/diary/contact list/record/portfolio of work | Evidence claims that module LOs have been met | Ug Level 1 WBL | Various, extensive, mostly structured
C3 | Reflection notes | Reflection on past week | Ug | Form: 1 page, 4 qu
C5 | Programme evaluation | Questionnaire | CPD Certificate | Online interview
C6a | APEL claim | Reflection on skills/attributes developed over 10 years | BA/MA Accredited ministry | Essay
C6b | Spiritual autobiography | Evidence claim against competency | | Essay
D1a | Reflection on learning from workshop | Evaluation of learning from workshop | Yr3 PAMS | Unstructured: 1 page
D1b | Reflective essay | Reflection on critical incidents during placement | Ug ? PAMS | Essay
D2 | Reflective self-evaluation | Self-evaluation against targets | Ug Yr1 | 2pp
D3 | Study Diary | Self-assessment after exercise | Ug Yr1 | Form: 1 page, 3 qu
D4 | Placement log sheet | Monthly record/reflection on work experiences | Ug Yr3 (of 4) | Short essay: 1500+w
D6 | End of placement evaluation | Self-evaluation of placement | Level 3 App. Theology | Structured by introductory questions


Table 4: Some groupings of tasks/criteria

A = Accurate observation
B = Analysis of events/situations
C = Identification of relevant elements
D = Awareness of own feelings/emotions
E = Awareness of own attributes/qualities
F = Awareness of own weaknesses
G = Awareness of own learning
H = Making connections (ideas)
I = Making connections (prior knowledge)
J = Honesty/authenticity
K = Planning for next stage
L = Commitment to change

A B C D E F G H I J K L Add

Self-evaluation of an assignment
A3 x x x x x
A5 x x x x x
D3 5 3 2 1 4
D6 2 3= 3= 1 5

Self-evaluation of general development over a period
A12 3 4 1 2 5
A13 2 3 5 2 1
A8 1 2 3 4 5 Yes
B3 x x x x
B6 2 2/3 2/3 3 1 1
C3 1 2 3

As above, against goals/targets/competencies
A4 4 5 2 3 1
B7 2 2 2 2 2 ? 1 3 3 4 4
C1 5 2= 1 4
D2 4 5 2 1 3 Yes

As above, with action plan
A1 3 1 3 1 3
A2 x x 1 x 5

Assessment of own learning from workshop/learning sessions
A9 x x x x x Yes
B2 3 2 3 3 3 2 1 1 1 1 3 3
D1a 1 2 3 6 4 5 Yes

Critical Incident Analysis
A7 x x x x x x
B5 4 1 2 3 5
B10b x x x x x

Application of theoretical models
A7 x x x x x x
B4 5 4 3 2 1
A17 4 3 1 2 5 Yes
B7 2 2 2 2 2 ? 1 3 3 4 4
B8 3 4 3 5 2 2 1 1 2 3 2 2
C6 1 5 4 2 3


Appendix 1: Coversheet for task

Supporting Learning from Reflection

Name: ............................. Institution: ......................................

(For each set of materials)

Who were the students?

What was the task? Describe as precisely as you can the task set for these learners. If this is written down in the documentation, include this or reproduce the instructions.

Indicate from the list below up to five criteria by which the task described above should be assessed. Add to the list (or edit the items) if you feel your criteria are not included, but don’t go beyond five.

If you can, prioritise these criteria from 1: Most Important to 5: Least Important.

Accurate observation
Analysis of events/situations
Identification of relevant elements of the situation
Awareness of own feelings/emotions in response to events
Awareness of own attributes/qualities displayed
Awareness of own weaknesses/areas for improvement
Awareness of own learning
Making connections between ideas
Making connections with prior knowledge/experience
Honesty/authenticity
Recognition of/planning for the next stage
Commitment to improvement/change
........................................................................
........................................................................

For your ‘good’ example, ‘mark’ the text by highlighting/annotation to show how it meets the above criteria

Did your group agree with your own assessment? If not, why not?

For your ‘weak’ example:

Did your group agree with your own assessment? If not, why not?


What strategies could be used to develop the quality of this student’s reflection?

Record here any key issues that arose in your discussion of these pieces of work, particularly any differences in views.

Do you still agree with your original assessment of these two pieces of work?
