Development of the student placement evaluation form: A tool for assessing student fieldwork performance

Heather Allison and Merrill J. Turpin

Australian Occupational Therapy Journal (2004) 51, 125–132
Blackwell Publishing, Ltd.
Feature Article

Department of Occupational Therapy, School of Health and Rehabilitation Sciences, University of Queensland, Queensland, Australia

Evaluation of students undertaking fieldwork education placements is a critical process in the health professions. As training programs and practice evolve, systems for assessing students need to be reviewed and updated constantly. In 1995, staff of the occupational therapy training program at the University of Queensland, Australia, decided to develop a new tool for assessing student fieldwork performance. Using an action research methodology, a team developed the Student Placement Evaluation Form, a flexible and comprehensive criterion-referenced evaluation tool. The present paper examines action research as an appropriate methodology for considering real-life organisational problems in a systematic and participatory manner. The action research cycles undertaken, including preliminary information gathering, tool development, trial stages and current use of the tool, are detailed in the report. Current and future development of the tool is also described.

KEY WORDS action research, fieldwork evaluation.

INTRODUCTION

Most health-related degree programs offer a substantial component of fieldwork education in conjunction with formal or class-based education. Fieldwork is a formative and essential part of training that provides students with the opportunity to work in a real-life practice setting (Cohn & Crist, 1995). Working under the supervision of experienced practitioners in the role of clinical educators (Hummel, 1997), students are able to practise and develop competence in a broad range of skills relevant to their particular profession (Thompson & Ryan, 1996b). The present paper describes the development of a tool to assess the performance of occupational therapy students during fieldwork. A discussion about fieldwork education and the complexities of assessing fieldwork will provide a background to a description of the process undertaken to develop the Student Placement Evaluation Form (SPEF). The current paper also highlights the suitability of an action learning methodology for solving practical problems.

IMPORTANCE OF FIELDWORK EDUCATION IN OCCUPATIONAL THERAPY

The formal (class-based) components of professional training foster the cognitive growth of students, equipping them with basic concepts, theories, rules and practical skills that underpin practice. However, real-life practice often presents complex dilemmas, for which theories and rules are insufficient to inform decision-making. Effective occupational therapy practice requires the integration of knowledge and practical skills with clinical reasoning skills. Fieldwork experience is the major avenue through which this integration occurs.

Heather Allison BOccThy, MOccThy; Clinical Educator, Doctoral Candidate. Merrill J. Turpin PhD, BOccThy, GradDipCounsel; Lecturer. Correspondence: Heather Allison, Division of Occupational Therapy, School of Health and Rehabilitation Sciences, The University of Queensland, QLD 4072, Australia. Email: [email protected]. Accepted for publication May 2003. © 2004 Australian Association of Occupational Therapists

Fieldwork education contributes to student development in a number of ways. It provides unique opportunities to apply and test the theories and facts learnt in formal settings and to acquire new knowledge. It also allows students to learn and refine practical skills by working with actual clients. Many practical and procedural skills, such as assessing muscle tone or joint range, are best learnt through direct experience. More significantly, however, students learn clinical reasoning skills, and begin to understand the need to consider many human and non-human factors when judging how to provide appropriate and personalised intervention for individuals. Equally important is the development of effective interpersonal skills that are integral to professional competence.

Many occupational therapy authors have discussed the issue of student therapists learning by observation and action (Thompson & Ryan, 1996b; Bossers, Cook, Polatajko & Laine, 1997; Craik & Austin, 2000). These processes expose students to the complexity of practice dilemmas and to the holistic nature of clinical practice. Students can observe the use of clinical judgment in determining the appropriate application of generalised knowledge to individual clients. Fieldwork is the logical starting point for students to begin to accumulate a storehouse of experience to draw from when making practice decisions (Mattingly & Fleming, 1994). Learning by doing is particularly important in occupational therapy, as practical wisdom is not always formulated in words. Therapists spend more time and energy doing rather than talking about what they do, so they develop a whole ‘discourse of knowledge’ that is embedded in action (Mattingly & Fleming, p. 25). Those aspects of practice that are difficult to discuss are often shared directly between practitioners and their students through unspoken means, such as demonstration or physically correcting a student’s performance.

Fieldwork education also contributes to the socialisation of students into the culture of the profession (Thompson & Ryan, 1996a; Lyons & Ziviani, 1998; Kasar & Muscari, 2000). Opportunities to observe and model from experienced professionals expose students to the subtle, unspoken social mores of the profession and the practice environment. Professional socialisation contributes to the development of relevant professional values and attitudes and the skills of working cooperatively with other staff.

In summary, fieldwork education is a vital part of the total educational needs of occupational therapy students. It reinforces and facilitates the application of previous knowledge as well as fostering the development of additional knowledge and skills that are often practice-based and difficult to articulate.

The changing nature of fieldwork

In the past, fieldwork students usually worked in an apprenticeship model, whereby one student worked with one skilled therapist (Nolinske, 1995). Under supervision, students worked with clients, gradually taking on more responsibility as their skills developed. In this model, the supervisory process was direct and immediate, but the resulting learning may have been too narrowly focused and lacking in generality. Significant changes to both practice and educational environments have challenged the relevance of these traditional apprenticeship models of fieldwork education (Mackenzie, 1997). With the growing number and diversity of occupational therapy programs in Australia, student numbers have increased, resulting in a shortage of fieldwork opportunities. In addition, the student cohort is now increasingly diverse, with a greater representation of older students, male students and minority groups (Kautzmann, 1990). More flexible models of fieldwork are required to accommodate students with work or family responsibilities. Human service structures are also changing, with an increase in community-based and generic services, and more non-traditional work roles for occupational therapists.

In response to these changes, fieldwork practices have diversified (Nolinske, 1995). Currently, students are gaining access to a variety of fieldwork options and opportunities (Hummell, 1997). Students are being increasingly directed towards new or role-emerging areas of practice, where the occupational therapy role is still being established (Fleming, Christenson, Franz & Letourneau, 1996; Bossers et al., 1997). Private practice and other developing practice settings are also being explored (Potts, Babcock & McKee, 1998), as are part-time placements and fieldwork models where students might share their time between two centres (Huddlestone, 1999b). Some educational institutions have established occupational therapy clinics of their own, to provide fieldwork experiences for students (Ravetz & Granell, 1996). In order to facilitate quality education in diverse fieldwork settings, new and original models of fieldwork supervision are being trialled (Huddlestone, 1999a; Huddlestone). These include: reflective practice; self-directed learning in which students set their own goals for the placement; use of contracts between students and clinical educators; cooperative or peer learning; and multiple mentoring schemes that offer support to students undertaking placements at centres without an on-site occupational therapist (Agnew, 1995; Heath, 1996; Huddlestone; Mackenzie, 1997; Mason, 1998).

Consistency in the evaluation of students undertaking fieldwork has always been difficult to achieve in situations where there are a large number of assessors. In addition, the systemic changes to occupational therapy practice and fieldwork environments have added variability to the assessment of students. In the mid-1990s, the occupational therapy school at The University of Queensland initiated a review of its fieldwork assessment practices, due to concerns that its existing practices were not adequately dealing with the changing nature of fieldwork education (Agnew, 1995).

DESCRIPTION OF PROBLEM AND CONTEXT OF THE RESEARCH

The University of Queensland provided the first university-based training program for occupational therapy students in Australia. The current program is accredited and its fieldwork complies with the guidelines of the World Federation of Occupational Therapists (WFOT), which require a minimum of 1000 hours of supervised practice, the majority of which occurs in the later years of the program (Council of the World Federation of Occupational Therapists, 1998). Although more flexible options are being used, students at The University of Queensland typically undertake full-time blocks of supervised fieldwork experience, incorporating paediatric, physical, and psychosocial caseloads.

Until the mid-1990s, the Occupational Therapy Department used a fieldwork assessment tool known as the Modified Fieldwork Performance Report (MFPR). This tool was a revision of the Fieldwork Performance Report (FPR), originally developed by the American Occupational Therapy Association. It was revised at the University of Queensland in 1983 (Shah, Schonfeld & Maas, 1985). The MFPR form consisted of 59 behaviours, each rated on a 12-point rating scale where 1 = extremely poor and 12 = extremely excellent. Students were assessed at the halfway point and at the end of each fieldwork placement. In a review of fieldwork practices in this department, Agnew (1995) found widespread dissatisfaction regarding the MFPR among students, new graduates, and clinicians, as it was seen as difficult to mark and punitive in its orientation. Students emphasised its subjective nature, while clinical educators found it time-consuming to complete. Clinical educators stressed that many components of the MFPR lacked relevance to their particular work settings, because of its orientation to acute hospital practice. In particular, Agnew found that it did not encourage the development of skills in clinical reasoning. The following year, the University of Queensland completely revised its assessment policies in favour of criterion-based assessment (The University of Queensland, 1996). This environment of change provided the impetus and opportunity to comprehensively review the clinical evaluation process and to confront criticisms of the assessment tool.

ACTION RESEARCH AS METHODOLOGY

At the outset of the review, it was unclear whether the MFPR could be modified further or would need to be replaced. Thus, action research was selected for this project because it is an appropriate methodology for tackling practical problems that do not have a clear solution. Action research has been defined in a variety of ways, most simply by Coghlan and Brannick (2001), as research that is embedded in action. Definitions of action research vary, and highlight its different features, so the following definition is offered as a summary of these. Action research is defined as a collaborative form of inquiry that involves cycles of action and reflection about a specific practical situation. The process ideally involves those who are directly affected, and aims to benefit those people in an active and practical way (Coghlan & Brannick). The process of action research has been used in other occupational therapy literature (Mackenzie, 1997).

A number of features of action research confirmed it as the methodology of choice for the problem of improving the assessment of students undertaking fieldwork. First, action research is used to understand real-life situations or problems, and researchers need to acquaint themselves with the relevant issues. In this case, the nature of occupational therapy practice was expanding and new areas of practice were continually being developed. Similarly, fieldwork experiences were occurring in a wide range of practice contexts. It was vital that the assessment tool selected or developed was not biased toward any particular model of professional practice. Any tool needed to be sufficiently flexible to be appropriate in areas of practice that might still be developing. If the assessment tool was not able to assist supervisors to make judgments about the student’s performance, nor provide students with feedback in a wide range of situations, it would be of little use to supervising clinicians.

A second important feature of action research is its participatory nature. Action research aims to engage informants as collaborators in the research process. Such collaboration was important to the project in three ways. First, gaining the perspective of therapists might contribute to an understanding of their needs in assessing students, which could be used to make the assessment more user-friendly. Second, engaging therapists in the process may increase their receptiveness to any changes that might be made to the practice of assessing students. Third, students were also engaged in the developmental processes at the stage of planning the assessment and also at the trial stage, in order to ensure that they felt the tool was equitable.

A third feature of action research is its cyclical nature. Put simply, each cycle consists of a combination of action or practical experience and reflection on that action. Cycles combine in a series, connected as if in a spiral (Coghlan & Brannick, 2001; Winter & Munn-Giddings, 2001). Zuber-Skerritt (1993) described each cycle as consisting of planning, acting, observing and reflecting. First, the initial situation is described and the problem or goal identified; then action is planned, carried out and evaluated. Each successive cycle is planned based on the outcomes of preceding cycles. Through this cyclic process of action and reflection, the researchers gradually work toward the desired outcome. The journey towards the solution is essentially evolutionary rather than being linear or predetermined. In the situation facing the occupational therapy department, the presenting problem was relatively clear but the possible paths to a solution needed exploration and clarification. The cyclical process of action and reflection allowed for the exploration of possible solutions before there was a commitment to any particular course of action.

The potential limitations of the action research method are that it can be resource-intensive in terms of time and people. However, these disadvantages can be offset by its participatory nature and capacity to address complex and practical problems. The process of developing the Student Placement Evaluation Form (SPEF) using an action research methodology will be described in three major cycles.

Cycle 1: Preliminary information gathering

The authors successfully attracted funding from The University of Queensland’s Teaching and Learning Grants Scheme in order to explore the needs and perspectives of those people involved in the assessment of fieldwork students. Qualitative methodologies, specifically focus groups, were selected for this cycle as they are appropriate when canvassing opinions without introducing bias to participants, and permit depth of inquiry. Focus groups allow researchers to question a wide range of people in a short period of time. The stakeholders in this process were occupational therapy students, academic staff, and clinical occupational therapists across the State of Queensland, Australia. Clinicians were sought from rural, provincial and metropolitan areas; from paediatric, psychosocial, and physical backgrounds; and from a variety of work environments (e.g. government, industry, private practice and specialist organisations).

Invitations to participate in focus groups were published in the local newsletter of the professional association OT AUSTRALIA (Queensland). To ensure participation from the widest range of clinicians available, some therapists were purposefully selected and directly contacted by the researchers. In some large work settings, focus groups were held at the workplace, while other groups were composed of therapists from various work settings. To facilitate participation from rural and remote therapists, focus groups were aligned with regional therapist meetings. Where possible, students undertaking fieldwork also participated in the groups. A total of 23 focus group meetings were convened by occupational therapists who travelled widely throughout the state.

Focus group participants were asked questions about positive and negative aspects of the MFPR, and its use during half-way and final placement evaluations. Specific comments were sought about the appropriateness of the MFPR’s rating scale and the items on which students were scored. Participants were asked about recommendations for a new student evaluation tool, including specific details such as layout. All focus groups were audiotaped.

Analysis and findings of the focus groups

The information was transcribed verbatim and thematically analysed using the QSR NUD*IST (Non-numerical Unstructured Data: Indexing, Searching, Theorising) data management software (Richards & Richards, 1994). Most comments made by respondents described the shortcomings of the MFPR tool. The main theme of these comments was its perceived lack of objectivity. The 12-point rating scale was described as cumbersome and difficult to interpret. Clinical educators identified confusion about the standard of student performance required for each rating, with scores invariably skewed to the upper end of the scale. They also reported the lack of clear criteria to indicate the point in the rating scale that distinguished between a passing and failing grade. Because students’ grades on the MFPR contributed to their rating in a University subject and, for honours students, their grade of honours, concerns were raised about the equity of these grades. The MFPR primarily reflected hospital-based practice and many of the terms and performance items were seen as irrelevant or inappropriate. Therapists in emerging or non-medical areas of occupational therapy practice found that many items referred to patients, treatment or discharge, and they increasingly resorted to ‘Not Applicable’ scores. The MFPR was also perceived as offering little space to provide written feedback to students.


Respondents were overwhelmingly in favour of the development or adoption of a new evaluation tool. They sought a tool that would de-emphasise numerical scores and focus students’ attention on timely and targeted feedback. Practical suggestions included: that fieldwork evaluation be graded on a pass/fail basis; that a new rating scale be developed with clear criteria for awarding a passing grade; that the number of items be reduced; that scoring be quick; that more emphasis be placed on qualitative comments; and that assessment continue to occur both half-way through the placement and at the end.

Cycle 2: Development of tool

After analysis of focus group data and a review of other current English-language occupational therapy assessment tools, it became clear that the development of a new assessment tool was required. An Action Learning grant was obtained from The University of Queensland’s Tertiary Education Institute. A team was formed to develop the tool. The team included the occupational therapy school’s clinical manager, a number of academic staff from a variety of practice areas, two clinicians who were experienced clinical supervisors, a research assistant and an external mentor. The research team represented a diversity of clinical perspectives, and was aided by an external reference group, whose members represented diverse areas of practice not already represented by the team, for example, private and rural practice. This group agreed to provide additional input as needed during the process.

The overall guiding principle in the development of the assessment tool was the need to emphasise and support the provision of useful feedback to students. The team identified the need to place a greater emphasis on the qualitative components of feedback rather than on numerical ratings. In order to do this, the rating scale needed to be very simple and descriptive. The team initially developed a four-point rating scale based on student competencies, with each rating category having clear, detailed descriptors. The four-point scale included: unacceptable performance; concerns exist; competent performance; and exceptional performance. This was selected as it did not have a natural middle point, thereby requiring the supervisor to make a clear distinction between the two passing and two failing categories. Another strategy to de-emphasise the numerical ratings was to provide considerable space next to each learning objective for additional comments.

In constructing the standards by which students would be assessed, eight major areas of competence were identified. These were: professional practice; self-management skills; communication skills; documentation; assessment/information gathering; intervention; evaluation; and group skills. A behavioural learning objective was created for each area. For example, the competency area ‘professional practice’ yielded the learning objective ‘behaves in a manner appropriate to professional practice’. Each learning objective was expanded into a number of component behaviours, descriptions of which were called ‘items’. There were five items for the area ‘professional practice’, including: (i) respects values, beliefs and needs of clients and staff; (ii) maintains confidentiality of information; (iii) demonstrates respect for workplace and procedures; (iv) represents occupational therapy in an appropriate manner; and (v) shows awareness of the broad impact of political, legal and industrial issues on the profession, workplace and client group.

As the team developed learning objectives and behavioural items, it became clear that the assessment tool needed to provide sufficient flexibility to cater for the wide range of settings in which students are assessed. For example, therapists working directly with clients might value different intervention behaviours and skills from therapists providing less direct services, like case management. Thus, the team conceived the idea to develop ‘banks’ of items, that is, items grouped according to practice setting. Direct client contact, case management, and project management/consultancy were identified as the main models of practice in which therapists typically engaged. Thus, alternative learning objectives and items were provided for the areas of communication, assessment and intervention. This allowed supervisors in different settings to select the learning objective and items that best suited their particular setting. For example, under assessment/information gathering, the learning objective for direct client care emphasised the effectiveness of the person’s skills in assessment, whereas the learning objective for case management emphasised the development and management of assessment processes. Typical examples were provided after each item. For example, within the professional practice area, one item is: ‘Represents occupational therapy in an appropriate manner.’ A number of examples or prompts followed, such as, ‘explains OT to others, assumes OT role appropriate to setting.’ The authors added a small amount of extra space for each item, which allowed therapists to add examples that might be typical of their setting or specific to the individual student’s performance.
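The ‘banks’ arrangement described above amounts to a simple lookup keyed by model of practice. The sketch below is purely illustrative: the area names are taken from this paper, but the objective wording is paraphrased and the structure is an assumption, not the published SPEF content.

```python
# Illustrative sketch of the SPEF 'banks' idea. Three areas carried
# alternative learning objectives keyed by practice model; the other
# five areas were shared across settings. Wording is paraphrased from
# the paper, not quoted from the actual form.

SHARED_AREAS = [
    "professional practice", "self-management skills",
    "documentation", "evaluation", "group skills",
]

BANKED_AREAS = {
    "assessment/information gathering": {
        "direct client contact":
            "demonstrates effective assessment skills with clients",
        "case management":
            "develops and manages assessment processes",
        "project management/consultancy":
            "plans information gathering for a project",  # hypothetical
    },
    # "communication" and "intervention" would hold analogous banks.
}

def select_objective(area: str, practice_model: str) -> str:
    """Return the learning objective best suited to the supervisor's setting."""
    return BANKED_AREAS[area][practice_model]
```

A supervisor in a case-management setting would thus be steered to the objective emphasising the development and management of assessment processes, rather than hands-on assessment skill.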

The principle of assessing students both half-way through and at the end of the placement was retained, in the interests of providing timely and formative feedback to students. The layout was planned so that students could see all scores and feedback for each item on the same page. The provision of early feedback was considered very important for students who were slow to develop competence. Consequently, the team developed the ‘at risk’ form, to be completed at any point during the fieldwork placement if a supervisor had concerns about the performance of a student. Both student and supervisor would discuss and sign this form and send a copy to the fieldwork manager. A written document, which required supervisors to articulate specific concerns, was considered important. This would facilitate accountability and clear communication between all parties and assist in the development of a management plan, including support from the university’s fieldwork manager.

The assessment tool and instructions were presented in a handbook. This handbook also contained two other forms. The first was the ‘Student Evaluation of Placement’. This form was developed to enable students to provide formal feedback to their supervisors regarding issues such as the orientation program, expectations, and the provision of feedback and supervision. The second form was the ‘Student Reflection Tool’. This tool provided a guide for personal reflection by the students about their learning during the fieldwork placement.

Cycle 3: Trial of tool

Participatory feedback is an integral part of the action research process and has been recommended as important in the development of student evaluations (Coates & Chambers, 1992; Coghlan & Brannick, 2001). Occupational therapy practitioners and students were constantly informed about the ongoing development and review of the tool through regular reports in newsletters, clinical meetings, and by frequent correspondence seeking their comment and participation. Once a draft form of the SPEF was completed, the next cycle involved ongoing trial and revision of the tool. A preliminary trial was held in 18 clinical centres throughout Queensland in 1997. The settings chosen for the trial represented a diversity of student placement locations, and included metropolitan, regional, and isolated rural areas, and a range of practice areas and work sites. Both the therapist and their allocated student(s) were invited to participate in the trial, resulting in a participant group of 18 clinical educators and 18 students. Responses were received from 17 clinical educators and nine students. The trial required clinical educators to rate students using both the MFPR then in use and the new SPEF, although only the MFPR scores counted towards their assessment. Feedback forms were completed by both students and clinical educators. These forms invited the participant to rate each of the eight learning objectives in terms of clarity of wording on a four-point rating scale from 1 = unsatisfactory to 4 = highly satisfactory. Clinical educators were further invited to rate the relevance and completeness of the learning objectives, and provide any additional recommendations about the specific behaviours under each learning objective. Additional yes/no questions considered the clarity and utility of the handbook, the rating scale, layout, and other details of the tool. Reminders were communicated by telephone where delays in returning forms were experienced.

Feedback from the pilot group of clinical educators and students was overwhelmingly positive. The participants found that the learning objectives were clear, relevant and complete. Feedback about the specific behaviours suggested that the project management items were too narrowly focused, and indicated that some of the items evaluated more critical behaviours than others, opening the issue of weighting. The participants also strongly approved of the clarity of the handbook instructions and the rating scale, the layout, and the extra forms it contained; however, a number of minor changes were recommended to enhance clarity. The main point of dissension was with respect to the four-point rating scale, as both students and therapists found it gave inadequate scope to comment on improvements in student performance between the half-way and final assessments. The feedback from the majority of participants was that the scale should have at least five points.

Responding to this feedback, the team developed a five-point rating scale with the following descriptors: unacceptable performance; concerns exist; developing competence; competent performance; and exceptional performance. Developing competence was considered a pass and was included to acknowledge that students might adequately apply knowledge and skill but with a level of inconsistency, perhaps due to lack of practice or opportunities to demonstrate the skill. To develop a set of criteria to objectively determine passing and failing, some weighting of learning objectives and behaviours was necessary. Three learning objectives were considered to be relevant to all aspects of occupational therapy practice and were therefore categorised as core objectives. Some items in each learning objective were similarly categorised as core items. This categorisation formed the basis for determining a passing grade. In order to pass, students were required to obtain a passing score on all core items and a minimum number of non-core items for each learning objective.
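The pass criteria described above amount to a simple decision rule, which can be sketched as follows. This is an illustrative sketch only: the pass threshold of 3 (‘developing competence’), the function names, and the minimum non-core counts are assumptions for the example, not details taken from the SPEF handbook.

```python
PASS_THRESHOLD = 3  # hypothetical: 'developing competence' or better on the 5-point scale


def objective_passed(core_scores, non_core_scores, min_non_core):
    """True if every core item passes and at least `min_non_core`
    non-core items pass for this learning objective."""
    all_core_pass = all(s >= PASS_THRESHOLD for s in core_scores)
    non_core_passed = sum(1 for s in non_core_scores if s >= PASS_THRESHOLD)
    return all_core_pass and non_core_passed >= min_non_core


def placement_passed(objectives):
    """objectives: list of (core_scores, non_core_scores, min_non_core)
    tuples, one per learning objective. A student passes the placement
    only if every learning objective is passed."""
    return all(objective_passed(c, nc, m) for c, nc, m in objectives)
```

For example, a student scoring (4, 3, 5) on core items and (2, 4, 3) on non-core items, with a hypothetical minimum of two non-core passes, would pass that objective; a single core-item score below the threshold would fail it regardless of the other scores.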

A final draft of the SPEF was completed, and a more extensive trial commenced in 1998. All Queensland-based clinical educators who provided fieldwork in the first semester were invited to participate. Participants were required to complete both the MFPR and the SPEF with their student(s), and provide feedback about the SPEF. Forms were returned from 37 clinical educators and 29 students. The feedback on the amended SPEF was overwhelmingly positive. All participants approved of the weighting of learning objectives and items, and the clear criteria outlining the minimum requirements for passing. Most participants felt that the changed scale was very clear, gave them sufficient scope to objectively award scores, and allowed for comment on student improvement between halfway and final assessments. The layout was considered easy to read, providing sufficient space for written comments, and the handbook was clear and user-friendly. As a result of this feedback, only minor alterations were made to the final product.

Launch and reception of the SPEF

In mid-1998, the SPEF was launched and adopted for use at The University of Queensland. A statewide training program was conducted to familiarise Queensland clinical educators with the use of the tool. Feedback from clinical educators to date has been extremely positive. Clinical educators have particularly suggested that the SPEF facilitates definition of student performance difficulties, thus clarifying remediation. Perhaps the best indication of the success of a new occupational therapy student evaluation tool is its reception by other occupational therapy schools. To date, the SPEF has been licensed for use in eight out of the 10 Australian occupational therapy schools.

Continuing development of the tool

Training therapists and students with the SPEF is an ongoing process. Training sessions for therapists are offered during supervisor workshops and occupational therapy conferences. In addition, each new cohort of students beginning fieldwork is introduced to the SPEF. A training video is nearing completion. It is well documented that consistency is difficult to achieve between large numbers of raters working in different settings. The interrater reliability (IRR) of the SPEF is currently under investigation, to identify any areas of the evaluation where inconsistencies may occur. As a consequence of this project, criteria for each of the points on the rating scale are being refined. In addition, an electronic version of the SPEF has been developed, which allows clinical supervisors to complete the form and easily pass it to co-supervisors for comment. Future research may involve monitoring the utility of the SPEF in different states of Australia, and examining its applicability to new and emerging areas of occupational therapy practice.
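Interrater reliability of the kind mentioned above is commonly quantified with a chance-corrected agreement statistic such as Cohen’s kappa. The sketch below shows how kappa could be computed for two raters scoring the same set of students on a rating scale; it is a generic illustration of the statistic, not the analysis used in the SPEF investigation.

```python
from collections import Counter


def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters who each score the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion
    of agreement and p_e is the agreement expected by chance, derived
    from each rater's marginal rating frequencies.
    """
    if len(ratings_a) != len(ratings_b):
        raise ValueError("Both raters must score the same items")
    n = len(ratings_a)
    # Observed agreement: proportion of items given identical ratings.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: product of each rater's marginal proportions,
    # summed over all rating categories used by either rater.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
              for c in set(freq_a) | set(freq_b))
    if p_e == 1.0:
        return 1.0  # raters (and chance) agree completely
    return (p_o - p_e) / (1 - p_e)
```

Kappa of 1.0 indicates perfect agreement, 0.0 indicates agreement no better than chance; in practice such a statistic would be computed per item or per learning objective to locate where rater inconsistencies arise.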

In a rapidly changing health-care system, therapists are working in diverse settings and under a range of models. Fieldwork education and assessment of students must reflect this diversity, so that students have the breadth of skills necessary to meet the challenges of future work. The authors used action research as a methodology to seek a solution to a significant educational issue. This method involved broad and ongoing consultation with stakeholders at each step of the project, to develop a tool that is widely applicable for assessing students undertaking fieldwork experiences. The methodology has been criticised as overly time-consuming, and the researchers did find it demanding to hear the views of many stakeholders and reflect these in their findings. However, these steps were necessary to produce a tool that can incorporate the breadth of occupational therapy student education in Australia.

ACKNOWLEDGMENTS

The development and trial of this tool was made possible with Teaching and Learning, and Action Learning grants from the Teaching Education and Development Institute within the University of Queensland. The authors thank the practitioners, students and academic staff who contributed to the process. The authors acknowledge the members of the project team and, in particular, Pamela Meredith for her initial contributions to the paper.

REFERENCES

Agnew, P. (1995). Qualitative studies into the assessment of clinical practice of occupational therapy students in relation to student learning and competencies. In: J. Butler, G. Evans, G. Isaacs & D. Price (Eds), Developments in medical and health sciences education. St Lucia, Australia: TEI, The University of Queensland.

Bossers, A., Cook, J., Polatajko, H. & Laine, C. (1997). Understanding the role-emerging fieldwork placement. Canadian Journal of Occupational Therapy, 64, 70–81.

Coates, V. E. & Chambers, M. (1992). Evaluation of tools to assess clinical competence. Nurse Education Today, 12, 122–129.

Coghlan, D. & Brannick, T. (2001). Doing action research in your own organization. Thousand Oaks, CA: Sage.

Cohn, E. S. & Crist, P. (1995). Back to the future: New approaches to fieldwork education. American Journal of Occupational Therapy, 49, 103–106.

Craik, C. & Austin, C. (2000). Educating occupational therapists for mental health practice. British Journal of Occupational Therapy, 63, 335–339.

Council of the World Federation of Occupational Therapists (1998). Minimum standards for the education of occupational therapists. Retrieved 23 September 2002 from http://www.wfot.org.au.

Fleming, J. D., Christenson, J., Franz, D. & Letourneau, L. (1996). A fieldwork model for non-traditional community practice. Occupational Therapy in Health Care, 10, 15–35.

Heath, L. A. (1996). The use of self-directed learning during fieldwork education: The students’ perspective. British Journal of Occupational Therapy, 59, 515–519.


Huddlestone, R. (1999a). Clinical placements for the professions allied to medicine, part 1: A summary. British Journal of Occupational Therapy, 62, 213–219.

Huddlestone, R. (1999b). Clinical placements for the professions allied to medicine, part 2: Placement shortages? Two models that can solve the problem. British Journal of Occupational Therapy, 62, 295–298.

Hummell, J. (1997). Effective fieldwork supervision: Occupational therapy perspectives. Australian Occupational Therapy Journal, 44, 147–157.

Kasar, J. & Muscari, M. E. (2000). A conceptual model for the development of professional behaviours in occupational therapists. Canadian Journal of Occupational Therapy, 67, 42–50.

Kautzmann, L. N. (1990). Clinical teaching: Fieldwork supervisors’ attitudes and values. American Journal of Occupational Therapy, 44, 835–838.

Lyons, M. & Ziviani, J. (1998). ‘I’m allowed to experiment’: The role of people with psychiatric disorders in facilitating students’ learning. Occupational Therapy International, 5, 104–117.

Mackenzie, L. (1997). An application of the model of human occupation to fieldwork supervision and fieldwork issues in NSW. Australian Occupational Therapy Journal, 44, 71–80.

Mason, L. (1998). Fieldwork education: Collaborative group learning in community settings. Australian Occupational Therapy Journal, 45, 124–130.

Mattingly, C. & Fleming, M. H. (1994). Clinical reasoning: Forms of inquiry in a therapeutic practice. Philadelphia, PA: F.A. Davis.

Nolinske, T. (1995). Multiple mentoring relationships: Facilitate learning during fieldwork. American Journal of Occupational Therapy, 49, 39–43.

Potts, H., Babcock, J. & McKee, M. (1998). Considerations for fieldwork education within a private practice. Canadian Journal of Occupational Therapy, 65, 104–107.

Ravetz, C. & Granell, C. (1996). The establishment of occupational therapy clinics within an educational setting: A report. British Journal of Occupational Therapy, 59, 503–505.

Richards, L. & Richards, T. J. (1994). NUD.IST Version 3.0 [Computer Software]. Melbourne: Qualitative Solutions for Research.

Shah, S., Schonfeld, C. & Maas, F. (1985). Evaluation of clinical competence in occupational therapy. Clinical Supervisor, 3(2), 85–98.

The University of Queensland (1996). Policies and guidelines on assessment: Advice to examiners from the assessment subcommittee. Retrieved 13 September 2001 from http://www.admin.uq.edu.au/AcadBoardOffice/policy/assess_Pol&Guid.html.

Thompson, M. M. & Ryan, A. G. (1996a). The influence of fieldwork on the professional socialisation of occupational therapy students. British Journal of Occupational Therapy, 59, 65–70.

Thompson, M. M. & Ryan, A. G. (1996b). Students’ perspective of fieldwork: Process, purpose and relationship to coursework. Australian Occupational Therapy Journal, 43, 95–104.

Winter, R. & Munn-Giddings, C. (2001). A handbook for action research in health and social care. New York, NY: Routledge.

Zuber-Skerritt, O. (1993). Improving learning and teaching through action learning and action research. Higher Education Research and Development, 12, 45–58.