Item Analysis


Upload: cik-noorlyda

Post on 14-Jan-2015


Page 1: Item analysis2

Item Analysis

• When analyzing the test items, we have several questions about the performance of each item. Some of these questions include:
• Are the items congruent with the test objectives?
• Are the items valid? Do they measure what they are supposed to measure?
• Are the items reliable? Do they measure consistently?
• How long does it take an examinee to complete each item?
• Which items are most difficult to answer correctly?
• Which items are easy?
• Are there any poor-performing items that need to be discarded?

Page 2: Item analysis2

Types of Item Analysis for CTT

• Three major types:

1. Assess quality of the distracters

2. Assess difficulty of the items

3. Assess how well an item differentiates between high and low performers

Page 3: Item analysis2

Purposes and Elements of Item Analysis

To select the best available items for the final form of the test.

To identify structural or content defects in the items.

To detect learning difficulties of the class as a whole.

To identify the areas of weakness of students in need of remediation.

Page 4: Item analysis2

Three elements of item analysis

1. Examination of the difficulty level of the items.

2. Determination of the discriminating power of each item, and

3. Examination of the effectiveness of distracters in multiple-choice or matching items.

Page 5: Item analysis2

The difficulty level of an item is known as the index of difficulty. The index of difficulty is the percentage of students answering each item in the test correctly.

The index of discrimination refers to the percentage of high-scoring individuals responding correctly versus the percentage of low-scoring individuals responding correctly to an item.

This numeric index indicates how effectively an item differentiates between the students who did well and those who did poorly on the test.

Page 6: Item analysis2

Preparing Data for Item Analysis

1. Arrange test score from highest to lowest.

2. Get one-third of the papers with the highest scores and another third with the lowest scores.

3. Record separately the number of times each alternative was chosen by the students in both groups.

Page 7: Item analysis2

4. Add the number of correct answers to each item made by the combined upper and lower groups.

5. Compute the index of difficulty for each item with the following formula:

IDF = (NRC / TS) × 100

where IDF = index of difficulty, NRC = number of students responding correctly to an item, and TS = total number of students in the upper and lower groups.
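As a minimal sketch (the helper name is ours, not the slides'), the difficulty formula can be computed directly:

```python
def index_of_difficulty(n_correct, n_total):
    """IDF = (NRC / TS) * 100: the percent of students in the
    combined upper and lower groups who answered the item correctly."""
    return 100 * n_correct / n_total

# Example: 27 of 40 students in the combined groups answered correctly
print(index_of_difficulty(27, 40))  # → 67.5
```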

Page 8: Item analysis2

6. Compute the index of discrimination, based on the formula:

IDN = (CU – CL) / NSG

where IDN = index of discrimination, CU = number of correct responses of the upper group, CL = number of correct responses of the lower group, and NSG = number of students per group.
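The discrimination formula is equally direct; this sketch uses hypothetical example numbers:

```python
def index_of_discrimination(correct_upper, correct_lower, group_size):
    """IDN = (CU - CL) / NSG: the difference in correct responses
    between the upper and lower groups, scaled by group size."""
    return (correct_upper - correct_lower) / group_size

# Example: 18 of 20 upper-group students and 8 of 20 lower-group students correct
print(index_of_discrimination(18, 8, 20))  # → 0.5
```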

Page 9: Item analysis2

Using information about Index of Difficulty

The difficulty index of a test item tells a teacher about the students' comprehension of, or performance on, the material or task contained in the item.

Page 11: Item analysis2

For an item to be considered a good item, its difficulty index should be about 50%. An item with a 50% difficulty index is neither easy nor difficult.

If an item has a difficulty index of 67.5%, this means that 67.5% of the students answered it correctly, so the item leans easy.

Information on the index of difficulty of an item can help a teacher decide whether a test item should be revised, retained or modified.

Page 12: Item analysis2

Interpretation of the Difficulty Index

Range        Difficulty Level
20 & below   Very difficult
21 - 40      Difficult
41 - 60      Average
61 - 80      Easy
81 & above   Very easy
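The interpretation table above maps cleanly onto a small lookup function; this is a sketch with our own function name:

```python
def difficulty_level(idf):
    """Map an index of difficulty (in percent) to the table's verbal levels."""
    if idf <= 20:
        return "Very difficult"
    if idf <= 40:
        return "Difficult"
    if idf <= 60:
        return "Average"
    if idf <= 80:
        return "Easy"
    return "Very easy"

print(difficulty_level(67.5))  # → Easy
```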

Page 13: Item analysis2

Using Information About Index Of Discrimination

• The index of discrimination tells a teacher the degree to which a test item differentiates the high achievers from the low achievers in his or her class. A test item may have positive or negative discriminating power.

• An item has positive discriminating power when more students from the upper group got the right answer than those from the lower group.

• When more students from the lower group got the correct answer on an item than those from the upper group, the item has negative discriminating power.

Page 14: Item analysis2

There are instances when an item has zero discriminating power – when an equal number of students from the upper and lower groups got the right answer to a test item.

In the given example, item 5 has the highest discriminating power. This means that it can differentiate between high and low achievers.

Page 15: Item analysis2

Interpretation of the Index of Discrimination

Range        Verbal Description
.40 & above  Very Good Item
.30 - .39    Good Item
.20 - .29    Fair Item
.09 - .19    Poor Item
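As with the difficulty table, the discrimination ranges can be encoded as a lookup; note that values below .09 fall outside the table, so the label we return for them is our assumption:

```python
def discrimination_quality(idn):
    """Map an index of discrimination to the table's verbal descriptions."""
    if idn >= 0.40:
        return "Very Good Item"
    if idn >= 0.30:
        return "Good Item"
    if idn >= 0.20:
        return "Fair Item"
    if idn >= 0.09:
        return "Poor Item"
    return "Below table range"  # not classified by the table; our label

print(discrimination_quality(0.35))  # → Good Item
```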

Page 16: Item analysis2

When should a test item be rejected? Retained? Modified or revised?

A test item can be retained when its level of difficulty is average and its discriminating power is positive.

It has to be rejected when it is either easy/very easy or difficult/very difficult and its discriminating power is negative or zero.

An item can be modified when its difficulty level is average and its discrimination index is negative.
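The three retain/reject/modify rules can be sketched as a decision function; the fallback for combinations the rules do not cover is our assumption:

```python
def item_decision(difficulty, discrimination):
    """Apply the retain/reject/modify rules.

    difficulty: a verbal level ("Very difficult" ... "Very easy")
    discrimination: the item's discrimination index
    """
    extreme = ("Very difficult", "Difficult", "Easy", "Very easy")
    if difficulty == "Average" and discrimination > 0:
        return "retain"
    if difficulty in extreme and discrimination <= 0:
        return "reject"
    if difficulty == "Average" and discrimination < 0:
        return "modify"
    return "review"  # combinations not covered by the rules; our fallback

print(item_decision("Average", 0.45))  # → retain
```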

Page 17: Item analysis2

Examining Distracter Effectiveness

An ideal item is one that all students in the upper group answer correctly and all students in the lower group answer incorrectly. In addition, the responses of the lower group should be evenly distributed among the incorrect alternatives.

Page 18: Item analysis2

Developing an Item Data File

Encouraging teachers to undertake an item analysis as often as practical

Allowing accumulated data to be used to make item analysis more reliable

Providing for a wider choice of item formats and objectives

Facilitating the revision of items

Accumulating a large pool of items so as to allow some items to be shared with the students for study purposes.

Page 19: Item analysis2

Limitations Of Item Analysis

• It cannot be used for essay items.
• Teachers must be cautious about the damage that may be done to the table of specifications when items not meeting the criteria are deleted from the test. These items should be rewritten or replaced.

Page 20: Item analysis2

What is Item Discrimination?

• Generally, students who did well on the exam should select the correct answer to any given item on the exam.

• The discrimination index distinguishes, for each item, between the performance of students who did well and the performance of students who did poorly.

Page 21: Item analysis2

How does it work?

• For each item, subtract the number of students in the lower group who answered correctly from the number of students in the upper group who answered correctly.

• Divide the result by the number of students in one group.

• The discrimination index is listed in decimal format and ranges between -1 and 1.

Page 22: Item analysis2

What is a “good” value?

Page 23: Item analysis2

Item Discrimination : Examples

Item no.   Number of correct answers in group   Item Discrimination Index
           Upper 1/4        Lower 1/4
1          90               20                   0.7
2          80               70                   0.1
3          100              0                    1.0
4          100              100                  0
5          50               50                   0
6          20               60                   -0.4
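The example values can be reproduced by applying the subtract-and-divide procedure to each item's upper- and lower-group percentages (a sketch; the helper name is ours):

```python
def discrimination_index(upper_pct_correct, lower_pct_correct):
    """D = (upper % correct - lower % correct) / 100, as in the worked examples."""
    return (upper_pct_correct - lower_pct_correct) / 100

# (upper 1/4 % correct, lower 1/4 % correct) for items 1-6
examples = [(90, 20), (80, 70), (100, 0), (100, 100), (50, 50), (20, 60)]
for item_no, (upper, lower) in enumerate(examples, start=1):
    print(item_no, discrimination_index(upper, lower))
```

Item 6 illustrates negative discrimination: more low scorers than high scorers answered it correctly.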

Page 24: Item analysis2

Quick Reference

• Use the following table as a guideline to determine whether an item (or its corresponding instruction) should be considered for revision.

                          Item Difficulty
Item Discrimination (D)   High     Medium    Low
D <= 0%                   review   review    review
0% < D < 30%              ok       review    ok
D >= 30%                  ok       ok        ok

Page 25: Item analysis2

Distracter analysis

First question of item analysis: how many people chose each response?

If there is only one best response, then all other response options are distracters.

Example from an in-class assignment (N = 35):

Which method has the best internal consistency?
a) Projective test     1
b) Peer ratings        1
c) Forced choice       21
d) Differences n.s.    12

Page 26: Item analysis2

Distracter analysis (cont’d)

• A perfect test item would have 2 characteristics:
1. Everyone who knows the item gets it right.
2. People who do not know the item have their responses equally distributed across the wrong answers.

• It is not desirable to have one of the distracters chosen more often than the correct answer.

• This result indicates a potential problem with the question. The distracter may be too similar to the correct answer and/or there may be something in either the stem or the alternatives that is misleading.

Page 27: Item analysis2

Distracter analysis (cont’d)

• Calculate the number of people expected to choose each of the distracters. If examinees respond at random, the same number is expected for each wrong response (Figure 10-1).

Expected number choosing each distracter = (number answering incorrectly) / (number of distracters) = 14 / 3 ≈ 4.7
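The expected-count calculation is a one-line ratio; this sketch reproduces the 14-wrong-answers, 3-distracters example:

```python
def expected_per_distracter(n_incorrect, n_distracters):
    """Expected count per distracter if wrong answers are chosen at random."""
    return n_incorrect / n_distracters

# 14 students answered incorrectly across 3 distracters
print(round(expected_per_distracter(14, 3), 1))  # → 4.7
```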

Page 28: Item analysis2

Distracter analysis (cont’d)

When the number of persons choosing a distracter significantly exceeds the number expected, there are 2 possibilities:

1. It is possible that the choice reflects partial knowledge.
2. The item is a poorly worded trick question.

• Unpopular distracters may lower item and test difficulty because they are easily eliminated.

• Extremely popular distracters are likely to lower the reliability and validity of the test.

Page 29: Item analysis2

Distracter analysis : Definition

• Compare the performance of the highest- and lowest-scoring 25% of the students on the distracter options (i.e. the incorrect answers presented on the exam).

• Fewer of the top performers should choose each of the distracters as their answer compared to the bottom performers.

Page 30: Item analysis2

Distracter analysis : Examples

Item 1                        A    B    C    D    E    Omit
% of students in upper 1/4    20   5    0    0    0    0
% of students in middle       15   10   10   10   5    0
% of students in lower 1/4    5    5    5    10   0    0

Item 2                        A    B    C    D    E    Omit
% of students in upper 1/4    0    5    5    15   0    0
% of students in middle       0    10   15   5    20   0
% of students in lower 1/4    0    5    10   0    10   0

Page 31: Item analysis2

Distracter Analysis : Discussion

• What is the purpose of a good distracter?

• Which distracters should you consider throwing out?

Page 32: Item analysis2

Item analysis report

Page 33: Item analysis2

Exercise : Interpret Item Analysis

• Review the sample report.
• Identify any exam items that may require revision.
• For each identified item, list your observations and your hypothesis about the nature of the problem.

Page 34: Item analysis2

Knowledge Or Successful Guessing?

Multiple-Choice Exam Strategies
- improve the odds by eliminating 1 or more infeasible or unlikely answer options

Descriptive Exam Strategies
- brain dumping
- part marks
- consideration for perfect answers to questions that were not asked

Page 35: Item analysis2

Possibility of a “Random Pass”

Depends on the number of answer options per question and the number of questions!

Page 36: Item analysis2

Percent Pass (≥ 50%) by Chance

Number of Questions   2 choice   3 choice   4 choice   5 choice
1                     50         33         25         20
2                     75         56         44         36
4                     69         41         26         18
6                     66         32         17         10
10                    62         21         8          3
20                    59         9.2        1.4        .3
50                    56         1          .01        .0004
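The pass-by-chance percentages follow from the binomial distribution: with n questions of k choices each, the chance of guessing at least half right is a binomial tail sum. A minimal sketch (the function name is ours):

```python
from math import comb

def pass_by_chance(n_questions, n_choices):
    """Probability of scoring at least 50% by guessing every question at random."""
    p = 1 / n_choices
    need = -(-n_questions // 2)  # ceil(n/2): at least half the questions correct
    return sum(
        comb(n_questions, k) * p**k * (1 - p)**(n_questions - k)
        for k in range(need, n_questions + 1)
    )

# 10 four-choice questions: roughly an 8% chance of passing by luck alone
print(round(100 * pass_by_chance(10, 4), 1))  # → 7.8
```

Note how quickly guessing fails as the test grows: with 50 four-choice questions, the chance of a random pass is a hundredth of a percent.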