Assessment Use Argument


Page 1: Assessment Use Argument

1

Assessment Use Argument

Nancy Powers, Chief of English Testing Section

SHAPE, Mons, Belgium, Sept 2013

Page 2: Assessment Use Argument

2

Introduction

“Validity is an integrated evaluative judgment of the degree to which empirical evidence and theoretical rationales support the adequacy and appropriateness of inferences and actions based on test scores” (Messick, 1989: 13)

Page 3: Assessment Use Argument

3

Assessment Use Argument

Based on Toulmin’s (2003) approach to practical reasoning

Justification

Accountability

Page 4: Assessment Use Argument

4

According to Bachman & Palmer

Assessment development consists of two parallel processes that serve two purposes.

1. The assessment production process

2. The assessment justification process

(2010, p. 430)

Page 5: Assessment Use Argument

5

Therefore…

An AUA is a theoretical framework that provides a rationale and set of procedures for justifying the intended uses of the assessment.

Page 6: Assessment Use Argument

6

The nitty-gritty of an AUA

It comprises four parts:

1. Claims
   1. The beneficial consequences of an assessment
   2. The decisions that are made
   3. The interpretations that are made
   4. The assessment records

2. Warrants – statements that elaborate the claims

Page 7: Assessment Use Argument

7

The nitty-gritty of an AUA (cont’d)

Not everyone will agree with us

3. Rebuttals – counterclaims

4. Backing – evidence supporting the warrants; this includes feedback from stakeholders through questionnaires, verbal protocols, observations, interviews, previous research, and statistical analyses
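Because these four parts always stand in the same relationship (claims elaborated by warrants, each warrant facing possible rebuttals and supported by backing), the structure can be written down directly. Below is a minimal sketch in Python; the class and field names are hypothetical illustrations, not part of Bachman & Palmer's framework.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Warrant:
    """A statement that elaborates a claim."""
    statement: str
    rebuttals: List[str] = field(default_factory=list)  # counterclaims to be addressed
    backing: List[str] = field(default_factory=list)    # evidence supporting the warrant


@dataclass
class Claim:
    """One of the four AUA claims: consequences, decisions, interpretations, or records."""
    label: str       # e.g. "Claim 1: Beneficial consequences"
    statement: str
    warrants: List[Warrant] = field(default_factory=list)


@dataclass
class AssessmentUseArgument:
    """The full argument for one assessment: a set of claims with their warrants."""
    assessment: str
    claims: List[Claim] = field(default_factory=list)
```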

Page 8: Assessment Use Argument

8

An AUA at work

Lots of theory…

Concrete example: Justifying the inclusion of videos in a listening test

Page 9: Assessment Use Argument

9

Claim 1: The consequences are beneficial

I make the claim that

The consequences of using a video listening test are beneficial to the test developers and to the students.

So, what does this mean? I need to elaborate.

Page 10: Assessment Use Argument

10

Warrant

The consequences of using the VLT that are specific to the test developers and to the students will be beneficial.

• The test developers will produce tests that are more authentic and better reflect the target language use (TLU) domain

• Students can use the visual cues to help with comprehension

• The context will be clear, thereby reducing student anxiety

Page 11: Assessment Use Argument

11

Rebuttals

I disagree with you!

The consequences of using the VLT that are specific to the test developers and to the students will NOT be beneficial.

• Videos will be distracting
• Attending to multiple sources of stimulation is more tiring and demanding

Page 12: Assessment Use Argument

12

Backing: Collection of evidence that justifies your claims

1. The students who trialled the test reported that…
• “The video aspect helped to ground the task, making it more authentic than just an audio test”
• “It gave focus to me, therefore allowing me to listen. Often, when listening to audio-only, my mind wanders, i.e. I think of something else, therefore missing the listening text.”
• “They [the videos] were relaxing; therefore there was no mental block to listening because of nervousness.”

2. The use of videos can be theoretically justified in that it introduces construct-relevant variance (Wagner, 2002, 2007)

Page 13: Assessment Use Argument

13

Backing cont’d

3. Wagner (2010) found that student performance on a listening test that included videos increased by 6.5%

4. If test task characteristics are similar to the TLU characteristics, then the test can be seen as having construct validity (Bachman & Palmer, 1996)
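Using the hypothetical dataclasses sketched earlier, Claim 1 for the VLT could be recorded with its warrant, rebuttals, and backing taken from the slides above; the object below is illustrative only, not the testing section's actual documentation.

```python
claim1 = Claim(
    label="Claim 1: Beneficial consequences",
    statement=("The consequences of using the video listening test (VLT) are "
               "beneficial to the test developers and to the students."),
    warrants=[
        Warrant(
            statement=("The consequences of the VLT that are specific to the test "
                       "developers and to the students will be beneficial."),
            rebuttals=[
                "Videos will be distracting.",
                "Attending to multiple sources of stimulation is more tiring and demanding.",
            ],
            backing=[
                "Trial students reported the video made the task more authentic "
                "and reduced nervousness.",
                "Videos introduce construct-relevant variance (Wagner, 2002, 2007).",
                "Wagner (2010): performance on a video listening test increased by 6.5%.",
            ],
        )
    ],
)
```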

Page 14: Assessment Use Argument

14

Claim 2: Decisions made

The decisions to award a proficiency level reflect existing educational and societal values and the content/task/accuracy (C/T/A) statements in the NATO STANAG 6001 Language Proficiency Levels, and are equitable for those students who are placed at different proficiency levels. These decisions are made by the test developers and determine the proficiency level to which each student is assigned. The individuals affected by these decisions are the students and the teachers of the MTCP program.

Page 15: Assessment Use Argument

15

Warrant: Values sensitivity

Relevant educational values of CDA are carefully considered in the proficiency level decisions that are made.

Rebuttal:

Relevant educational values of CDA are NOT carefully considered in the proficiency level decisions that are made.

Page 16: Assessment Use Argument

16

Backing:

The CDA is governed by two documents: the Qualification Standard and the Foreign National Training Plan.

The VLT respects the C/T/A statements for each proficiency level in STANAG 6001.

Page 17: Assessment Use Argument

17

Warrant: Equitability

Test takers and teachers are fully informed about how the decision will be made.

Rebuttal:

Test takers and teachers are NOT fully informed about how the decision will be made.

Page 18: Assessment Use Argument

18

Backing:

The testing section conducts information sessions with teachers and testers when introducing new testing methods.

Candidate’s Guide

Page 19: Assessment Use Argument

19

Claim 3: Interpretations

The interpretations about the students’ ability to utilize verbal and non-verbal behaviour to comprehend the main idea, explicitly stated information, and implicit information are meaningful in terms of the construct definition of listening comprehension, impartial to all groups of test takers, generalizable to tasks that resemble the TLU, and relevant to and sufficient for the proficiency level decisions that are to be made.

Page 20: Assessment Use Argument

20

Warrant: Meaningful

The claim is meaningful in terms of listening to and comprehending general English with respect to the construct definition.

Rebuttal: The claim is NOT meaningful in terms of listening to and comprehending general English with respect to the construct definition.

Page 21: Assessment Use Argument

21

Backing: Meaningful

The construct definition is based on research on listening comprehension.

The items were developed according to the NATO STANAG 6001 Proficiency levels.

Item specs

Page 22: Assessment Use Argument

22

Warrant: Impartiality

Test takers are treated impartially during all aspects of the administration of the assessment

Rebuttal:

Test takers are NOT treated impartially during all aspects of the administration of the assessment

Page 23: Assessment Use Argument

23

Backing: Impartiality

Candidate’s Guide; all sessions are administered in the same way every time.

All students are given the same test with the same instructions, regardless of their country of origin, their rank, their gender, etc.

Further warrants: Generalizable, Relevant, Sufficient

Page 24: Assessment Use Argument

24

Claim 4: Assessment Records

The scores from the video listening test are consistent across different forms and administrations of the test, across students from different military trades, and across groups with different nationalities and first languages.
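Since Claim 4 is about score consistency, its backing would typically include a reliability analysis of item-level results (one of the “statistical analyses” mentioned under backing earlier). The sketch below is a hypothetical illustration, not the testing section's actual procedure: it computes the KR-20 internal-consistency coefficient for dichotomously scored multiple-choice items, using toy data.

```python
from typing import List


def kr20(responses: List[List[int]]) -> float:
    """KR-20 internal-consistency estimate for dichotomous (0/1) item scores.

    `responses` is a matrix with one row per test taker and one column per item.
    """
    n = len(responses)       # number of test takers
    k = len(responses[0])    # number of items
    # proportion correct for each item, and the sum of p*q across items
    p = [sum(row[i] for row in responses) / n for i in range(k)]
    pq_sum = sum(pi * (1 - pi) for pi in p)
    # variance of the total scores
    totals = [sum(row) for row in responses]
    mean_total = sum(totals) / n
    var_total = sum((t - mean_total) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - pq_sum / var_total)


# toy data: 5 test takers x 4 items (illustrative only)
print(round(kr20([[1, 1, 0, 1],
                  [1, 0, 0, 1],
                  [1, 1, 1, 1],
                  [0, 0, 0, 1],
                  [1, 1, 1, 0]]), 2))   # -> 0.31
```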

Page 25: Assessment Use Argument

25

Warrants: Inter-/intra-rater reliability

Scored the same way across administrations

Rebuttal: no rebuttal

Backing: this is a multiple-choice, computer-delivered test, so inter-/intra-rater reliability is not an issue; scoring is handled by an internal algorithm in the computer program.
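For a multiple-choice, computer-delivered test, “scored the same way across administrations” follows from the fact that scoring is a fixed comparison against an answer key. The routine below is a minimal sketch with a hypothetical key and function name, not the actual SHAPE scoring program.

```python
# item number -> correct option (illustrative key, not a real test form)
ANSWER_KEY = {1: "B", 2: "D", 3: "A", 4: "C"}


def score(responses: dict) -> int:
    """Return the raw score: one point per response that matches the key."""
    return sum(1 for item, answer in responses.items()
               if ANSWER_KEY.get(item) == answer)


print(score({1: "B", 2: "A", 3: "A", 4: "C"}))   # -> 3
```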

Page 26: Assessment Use Argument

26

Conclusion: in a nutshell

Basically, you are saying something about the test that you have designed.

You make these claims clear by elaborating on what you mean.

Then, you address any perspective that goes against what you have claimed and gather evidence that supports your point of view.

Page 27: Assessment Use Argument

27

Questions?

Thank you

Page 28: Assessment Use Argument

28

References

Bachman, L. F., & Palmer, A. S. (1996). Language testing in practice. Oxford: Oxford University Press.

Bachman, L. F., & Palmer, A. S. (2010). Language assessment in practice. Oxford: Oxford University Press.

Hostetter, A. B. (2011). When do gestures communicate? A meta-analysis. Psychological Bulletin, 137(2), 297-315.

Kellerman, S. (1990). Lip service: The contribution of the visual modality to speech perception and its relevance to the teaching and testing of foreign language listening comprehension. Applied Linguistics, 11(3), 272-280.

Kellerman, S. (1992). “I see what you mean”: The role of kinesic behaviour in listening and implications for foreign and second language learning. Applied Linguistics, 13, 239-258.

Page 29: Assessment Use Argument

29

Ockey, G. J. (2007). Construct implications of including still image or video in computer-based listening tests. Language Testing, 24, 517-537.

Toulmin, S. E. (2003). The uses of argument (updated edn). Cambridge: Cambridge University Press.

Wagner, E. (2002). Video listening tests: A pilot study. Working Papers in TESOL & Applied Linguistics, Teachers College, Columbia University, 2(1). Retrieved August 20, 2007, from http://journals.tc-library.org/index.php/tesol/article/viewFile/7/8

Wagner, E. (2007). Are they watching? Test-taker viewing behaviour during an L2 video listening test. Language Learning & Technology, 11, 67-86.

Wagner, E. (2010). The effect of the use of video texts on ESL listening test-taker performance. Language Testing, 27, 493-513.