Designing Classroom Tests

OVERVIEW OF ASSESSMENT: CONTEXT, ISSUES & TRENDS

DESIGNING CLASSROOM TESTS
TSL3112 LANGUAGE ASSESSMENT
PISMP TESL SEMESTER 6
IPGKDRI

STAGES OF TEST CONSTRUCTION
What is the purpose of the test?
What are the objectives of the test?
How will the test specifications reflect both the purpose and the objectives?
How will the test item types (tasks) be selected and the separate items arranged?
In administering the test, what details should I attend to in order to help students achieve optimal performance?
What kind of scoring, grading, and/or feedback is expected?

DETERMINING THE PURPOSE OF A TEST
Consider the overall purpose of the exercise that students are about to perform. For example:
Why the test is being created.
Its significance relative to the course, e.g. to evaluate overall proficiency or to place a student in a course.
The importance of the test compared with other measures of student performance.
The impact of the test, before and after it is given, on teachers and students.
Bachman & Palmer (1996) refer to the purpose of an assessment as test usefulness: the use to which teachers will put the assessment.

DESIGNING CLEAR, UNAMBIGUOUS OBJECTIVES
Begin by taking a careful look at everything that students should know and be able to do, based on the material they are responsible for.
In other words, examine the objectives for the unit being tested.
Consider:
What to find out.
The language abilities to be assessed.
Establishing appropriate objectives involves a range of issues, from relatively simple ones (e.g. the forms and functions covered in a course unit) to more complex ones (e.g. the constructs to be represented on the test).

DRAWING UP TEST SPECIFICATIONS
Test specifications are an outline of the test: what it will look like.
To design or evaluate a test, make sure it has a structure that follows logically from the unit or lesson it is testing.
The class objectives should be present in the test through appropriate task types and weights, a logical sequence, and a variety of tasks.
The specifications are a blueprint of the test that includes:
a description of its content
item types (method, such as multiple-choice, cloze, etc.)
tasks (e.g. written essay, reading a short passage, etc.)
the skills to be included
how the test will be scored
how results will be reported to students
For classroom purposes, the specifications (specs) are a guiding plan for designing an instrument that effectively fulfils the desired principles, especially validity (Davidson & Lynch, 2002). An illustrative specification sketch follows at the end of this section.
Spaan (2006) notes that for a large-scale standardised test to be widely distributed, and therefore broadly generalised, test specifications are much more formal and detailed. They are also usually kept confidential, to protect the validity of subsequent forms of the test.

DEVISING TEST ITEMS
The tasks need to be practical.
For content validity, the tasks should mirror the tasks of the course, lesson, or segment being assessed.
They should be authentic, with a progression biased for best performance.
They must be capable of being evaluated reliably by the teacher or scorer.
Test development is not always a clear, linear process; test design usually involves a number of loops to work through problems and shortcomings.
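To make the blueprint above concrete, the short Python sketch below shows one way a classroom teacher might write down a unit-test specification and check that the section weights account for the whole test. It is illustrative only: the course label, section names, item types, and weights are invented for the example and are not taken from these slides.

# Illustrative only: a hypothetical unit-test specification written as a
# small Python structure. Names, item types, and weights are invented.
test_spec = {
    "course": "TSL3112-style unit test (hypothetical)",
    "purpose": "achievement test on one course unit",
    "sections": [
        {"name": "Listening", "item_type": "multiple-choice", "items": 10,
         "skill": "listening comprehension", "weight": 20},
        {"name": "Reading", "item_type": "cloze", "items": 15,
         "skill": "reading comprehension", "weight": 30},
        {"name": "Writing", "item_type": "written essay", "items": 1,
         "skill": "narrative/descriptive writing", "weight": 50},
    ],
    "scoring": "section scores weighted and summed to 100",
    "reporting": "total score plus per-section subscores and comments",
}

# A complete spec accounts for the whole test: the weights must total 100.
assert sum(s["weight"] for s in test_spec["sections"]) == 100

Writing the spec out this explicitly, even on paper rather than in code, makes it easy to see whether every class objective is represented by a task type and a weight.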

ADMINISTERING THE TEST
Once the test is ready to administer, students need to feel well prepared for their performance.
Reduce unnecessary anxiety in students, raise their confidence, and help them view the test as an opportunity to learn.
Pre-test considerations (e.g. the day before an in-class essay):
Provide pre-test information on:
the conditions for the test (e.g. time limit, no portable electronics, breaks)
materials to bring
test item types
suggested strategies for optimal performance
evaluation criteria (rubrics; show benchmark samples)
Offer a review of the components of narrative and descriptive essays.
Give students a chance to ask questions, and provide responses.

Test administration details:
Arrive early and see to it that the classroom conditions (lighting, temperature, a clock, furniture arrangement, etc.) are conducive.
Try out the audio/video or other technology needed for administration in advance.
Have extra paper, writing instruments, or other response materials on hand.
Start on time.
Distribute the test itself.
Remain quietly seated at the teacher's desk, available for questions from students as they proceed.
For a timed test, warn students when time is about to run out, and encourage them to complete their work.

SCORING, GRADING, AND GIVING FEEDBACK
SCORING:
The scoring plan reflects the relative weight placed on the items in each section.
Greater weight should be given to the more significant tasks, e.g. tasks that represent more general or integrative language ability.
Classroom teachers may decide to revise the scoring plan for the course the next time they teach it.
By that point, teachers will have valuable information about how easy or difficult the test was, whether the time limit was reasonable, students' affective reactions to it, and their general performance.
Finally, teachers will have an intuitive judgement about whether the test correctly assessed the students.
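As a concrete illustration of the weighting described above, the minimal Python sketch below combines raw section scores into a weighted total out of 100, with the integrative essay task carrying the greatest weight. The section names, maximum marks, and weights are invented for the example.

# Illustrative only: combining raw section scores into a weighted total.
# Section names, maximums, and weights are invented; the integrative
# essay task is given the greatest weight.
weights = {"listening": 0.2, "reading": 0.3, "essay": 0.5}   # must sum to 1.0
max_points = {"listening": 10, "reading": 15, "essay": 20}

def weighted_total(raw_scores):
    """Return a score out of 100 from raw per-section scores."""
    total = 0.0
    for section, raw in raw_scores.items():
        proportion = raw / max_points[section]          # 0.0-1.0 on this section
        total += proportion * weights[section] * 100    # scale by section weight
    return round(total, 1)

# Example: a student with 8/10 listening, 12/15 reading, 15/20 essay.
print(weighted_total({"listening": 8, "reading": 12, "essay": 15}))  # 77.5

The same arithmetic can, of course, be done by hand on the scoring sheet; the point is simply that the weights, not the raw mark totals, determine each section's contribution.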

GRADING:
Grading is such a thorny issue!
The assignment of letter grades to a test is a product of:
the country, culture, and context of the English classroom
institutional expectations (most of them unwritten)
the explicit and implicit definitions of grades that you have set forth
the relationship you have established with the class
student expectations engendered by previous tests and quizzes in the class

GIVING FEEDBACK:
Feedback is normally beneficial.
A few of the many possible manifestations of feedback associated with tests:

In general, scoring/grading for a test may take the form of:
a letter grade
a total score
subscores (e.g. for separate skills or sections of a test)
For responses to listening and reading items:
indication of correct/incorrect responses
a diagnostic set of scores (e.g. scores on certain grammatical categories)
a checklist of areas needing work and strategic options
For oral production tests:
scores for each element being rated
a checklist of areas needing work and strategic options
oral feedback after the performance
a post-interview conference to go over the results
On written essays:
scores for each element being rated
a checklist of areas needing work and suggested strategies/techniques for improving writing
marginal and end-of-essay comments and suggestions
a post-test conference to go over the work
Additional/alternative feedback on a test:
peer conferences on the results of all or selected parts of the test
whole-class discussion of the test results
individual conferences with each student to review the complete test
self-assessment in various manifestations

CONCLUSION
STAGES OF TEST CONSTRUCTION (COURSE PRO FORMA)
Assessing clear, unambiguous objectives
Drawing up test specifications
Item writing
Item moderation
Pre-testing
Analysis
Training
Reporting