
technical report

Systematic Feedback for More Effective Teaching and Learning

Margaret A. Jorgensen, Ph.D.

August 2005

Copyright © 2005 by Pearson Education, Inc. or its affiliate(s). All rights reserved. Pearson and the Pearson logo are trademarks of Pearson Education, Inc. or its affiliate(s).

Page 2: Systematic Feedback for More Effective Teaching and Learningimages.pearsonassessments.com/images/.../Systematic... · Systematic Feedback for More Effective Teaching and Learning

.

.

.

.

.

.

.

TECHNICAL REPORT Systematic Feedback for More Effective Teaching and Learning

2

Introduction

The value of taking a test is in getting the results back quickly and in a manner that communicates them clearly. In the 80-plus years that norm-referenced achievement tests (NRT) have been used in U.S. schools, score reports typically have been returned to the districts and schools four to six weeks after test administration. Reporting timelines remained about the same when schools moved to criterion-referenced tests (CRT). For some testing programs, score reports would not be returned until months after the test administration. It was not uncommon for testing to occur in the spring and for score reports to be returned in the fall. This situation was especially true if the test’s questions included both open-ended items and multiple-choice items, the former requiring hand scoring by trained readers.

With the enactment of the No Child Left Behind Act of 2001 (NCLB), the requirements for reporting changed. Regardless of when tests are administered, states must report results to the federal government by the first week of the following school year. As a result, for testing programs that report under the Adequate Yearly Progress (AYP) requirement of NCLB, data must be available in the summer for the states to complete their analysis and report to the federal government on schedule.

This requirement has altered the dynamics of reporting test information and the flexibility of easily disaggregating the data to comply with federal guidelines. States and districts have been pushed beyond mere electronic (PDF) score reports that mirror paper reports to requirements for electronic query functionality along with their data. Timeframes have been compressed to only two or three weeks between test administration and reporting. From the teachers’ or parents’ perspectives, neither the electronic delivery nor the query capability has caused score reports to change in either their appearance or their usability.
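To make this query functionality concrete, here is a minimal sketch, assuming a simple tabular layout of student records; the column names and values are hypothetical, not any state's actual schema.

```python
# Hypothetical sketch of disaggregating test results by NCLB reporting
# subgroup; the data layout and values are invented for illustration.
import pandas as pd

scores = pd.DataFrame({
    "student_id":  [101, 102, 103, 104, 105],
    "subgroup":    ["ELL", "ELL", "General", "General", "General"],
    "scale_score": [588, 601, 636, 648, 612],
    "proficient":  [False, True, True, True, False],
})

# Mean scale score and percent proficient for each reporting subgroup.
summary = scores.groupby("subgroup").agg(
    mean_scale_score=("scale_score", "mean"),
    pct_proficient=("proficient", "mean"),
)
print(summary)
```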


While it is evident that educational administrators make important resource decisions based on test data, it is equally clear that teachers and parents would prefer different kinds of information for the decisions that they want and need to make in support of learning. This report begins the story of what test information can and should be to meet the needs of teachers as they strive to meet the needs of each student. A large part of this vision has been sparked by the work of Edward Tufte (1990, 1997, 2001). His commitment to representing statistical information visually is contagious and compelling. It causes one to think quite differently about data and how best to communicate the message of that data, because, for most people, data is not, in and of itself, information. For most people, data must be translated by a specialist into words or pictures that can then be readily understood and used for action. Tufte’s challenge is to take the important data from test performance and translate it into intuitively clear information that becomes an invitation to action.

What Information Is Useful?

The “what is wanted” category is clearly a function of who is asking. What is important to policy makers is likely to be quite different from what is important to parents. What is important to educational administrators may be quite different from what matters to teachers. However, the common expectation is the same—accuracy and dependability. Everyone wants to get reliable information and make the right decisions.

If we focus directly on teachers, we know that the score reports received today (either on paper or online) do little to help them refine their teaching strategy or content to meet the individual needs of students. Consider, for example, the information contained in a typical norm-referenced test score report, shown in Figure 1:


                              Number of   Raw     Scaled   National
                              Items       Score   Score    Percentile/Stanine
Total Reading                 80          59      641      71-6
Reading Vocabulary            30          23      650      76-6
Reading Comprehension         50          36      636      66-6
Total Mathematics             70          66      708      93-8
Concepts & Problem Solving    40          36      689      87-7
Computation                   30          30      747      96-9
Language                      48          45      702      97-9
Spelling                      30          27      681      90-8
Science                       35          30      669      89-8
Social Science                35          29      661      80-7
Research Skills               43          35      658      75-6
Thinking Skills               141         122     672      91-8
Basic Battery                 228         197     NA       88-7
Complete Battery              298         256     NA       88-7

Figure 1. Sample norm-referenced test score report for Student A (Harcourt Assessment, Inc., 2002, p. 56).1

How can a teacher use this report to improve instruction? There are, of course, several different ways to rank-order content areas to identify where a student falls in the distribution. For example, based on the percentile rank information, this student does better in language and mathematics than in reading. You can reach the same conclusion by comparing stanine information or the scale score values.

1 The sample data found in the figures and tables in this paper are used for display purposes only and should not be interpreted.
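As a concrete illustration of that rank-ordering, the short sketch below (hypothetical code, using the national percentile ranks from Figure 1) sorts the major subtests from strongest to weakest.

```python
# Rank-order the Figure 1 subtests by national percentile rank.
percentile_ranks = {
    "Total Reading": 71,
    "Total Mathematics": 93,
    "Language": 97,
    "Spelling": 90,
    "Science": 89,
    "Social Science": 80,
}

# Strongest areas first: Language and Total Mathematics outrank Total Reading.
for subtest, pr in sorted(percentile_ranks.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{subtest}: PR {pr}")
```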


A more detailed score report would break these content areas down into finer-grained information that would help the teacher better understand the student’s specific strengths and weaknesses within each content area. The missing piece is a specific plan for what the teacher needs to do to move this student forward.

Take an example of a score report for a hypothetical standards-based test administered to Student A:

Content Domain /
Content Standard   Number of Items   Raw Score   Scale Score
Reading            50                28          650
  Main Idea        5                 0           NA
  Detail           20                15          NA
  Sequence         20                13          NA
  Inference        5                 0           NA

Figure 2. Sample standards-based test score report for Student A.

How can a teacher use this report to improve instruction? There is less information in this standards-based score report than in the NRT score report. The teacher would know that this student missed every Main Idea and Inference item and has some difficulty with Sequence and Detail. But there is no information that helps the teacher understand how this student compares with other students at the same grade level. If we add another column for "Performance Level," the teacher would know whether this student was "Basic," "Proficient," "Advanced," or another designated proficiency level.

Performance Level
  Basic
  Proficient
  Advanced

Figure 3. Performance levels.

State accountability assessment programs have, during the past few years, relied on released test questions to help teachers and parents understand the knowledge and skills that are required to answer correctly. To better understand what Student A cannot do, the teacher could look at released items that are classified under "Main Idea." The teacher would have to analyze the items and determine what they have in common that Student A cannot do. The teacher would then have to map that breakdown in understanding against his or her instructional plan and deduce where the breakdown happened.

These approaches do not provide the information that teachers need to make immediate decisions about what instruction Student A needs. Each of these score reports requires the teacher to determine independently where the learning broke down. If a class has 25–28 students, this process can rapidly overwhelm even the most dedicated of teachers.

Important Information

Deciding what constitutes “important information” from achievement tests traditionally has been rather straightforward. A typical score report for individual students and groups of students appears below:

Content Area Subtest

Number of Test Questions or Points Available2

Raw Score

Scaled Score

National Individual Percentile Rank and Stanine (PR – S)

Performance Level

Student Spelling 40 5 548 4 – 2 Below Basic

Class Spelling 40 17.0 588 19 – 3 Basic

School Spelling 40 26.3 633 59 – 5 Proficient

District Spelling 40 26.6 636 62 – 6 Proficient Figure 4. Typical score report for individual students and groups of students.

2 For programs having open-ended or constructed-response questions, points represent a range of possible values.


Inferences can be drawn about what students know and do not know from these types of reports. Based only on this information, conversations about where learning breaks down can be difficult.

Some score reports have generic action steps included. These are typically generic recommendations known to help students grow stronger in the content area. They are not typically derived from individual or group performance on a specific test administration. Even the instructional text provided with the New Standards® score reports is not driven by individual performance but is developed for groups of students. However, these bullets do provide a place to begin a conversation about what a student needs to do next. An example of Student B’s grade 8 mathematics score report from the New Standards® Reference Examinations follows:

Total Score: 151

                                 Mathematics Skills          Mathematical Concepts       Problem Solving
                                 (percentiles)               (percentiles)               (percentiles)
                                 Class   School   National   Class   School   National   Class   School   National
Achieved Standard with Honors    38%     33%      11%        23%     14%      6%         4%      5%       10%
Achieved Overall Standard        42%     45%      22%        31%     25%      14%        38%     34%      11%
Nearly Achieved Standard         19%     20%      24%        23%     33%      17%        12%     25%      14%
Below the Standard               0%      2%       14%        22%     8%       26%        42%     36%      24%
Little Evidence of Achievement   0%      5%       19%        12%     30%      37%        4%      7%       6%

Figure 5. Score report for Student B on the New Standards grade 8 mathematics test (Harcourt Assessment, Inc., 2002, p. 109).


Instructional recommendations for Student B include improving performance in mathematics concepts by working on:

• ordering rational numbers, exponents

• reasoning proportionally

• using rate to estimate and calculate

• using inverse relationships

• identifying translations

• determining areas of two-dimensional shapes

• modeling situations geometrically

• extending numerical and spatial patterns

• representing functions (table, graph, formula)

• applying A = πr²

Certainly, this approach provides some guidance for teachers and parents, but it is not adequate.

What if score reports were not just numbers in columns on a page, not graphs of numbers for classrooms or grade levels of students, not generic instructional strategies such as the examples noted above?

In 1999, ACT released a new type of report associated with the ACT Assessment. This report, Standards for Transition®, examined the item-response choices of thousands of students scoring within different total score ranges. By identifying the common skills that students lacked, based on the questions they did not answer correctly, ACT was able to highlight the gaps between these score levels. When it published the Pathways to Progress reports in 2001, Harcourt Assessment, Inc. adapted this approach for the eighth edition of the Metropolitan Achievement Test.


The specific methodology for this report was to disaggregate student data from the standardization sample by stanine. For the students within a specific stanine, the items answered correctly by 80% or more of them were analyzed to determine the range of skills, processes, and behaviors needed to respond correctly. These skills, processes, and behaviors describe the common knowledge of students in that stanine group (Harcourt Assessment, Inc., 2002, pp. 45–46, 73–74). This information helps the teacher know what knowledge a specific student needs in order to move up to the next stanine.3 This technique essentially puts a roadmap in the teacher's hands (see the sketch following Table 1). But even this approach does not help a teacher differentiate among students by the way they responded to items incorrectly.

Table 1. Explanations of Subject Area Knowledge and Skills by Stanine.

Stanine 5

Math Problem Solving: Students in this score category can compare whole numbers and sets to 99,999 and, to a limited extent, compare and order decimal fractions. They can identify numbers evenly divisible by a given number and are moderately able to translate simple word problems into symbolic notation. These students can estimate using compatible numbers, are able to extend number patterns, and can solve algebraic equations. These students can identify geometric solid figures. They can calculate the perimeters of polygons and determine measurements indirectly from scale drawings using inches. Students at this level are able to compare and interpret data presented in bar graphs and pictographs. These students have developed a limited degree of proficiency with probability concepts. They also have limited proficiency in applying strategies to solve problems. Students at this level are able to identify a reasonable conclusion based on data presented in graphic format and identify missing or extraneous information presented in the context of a problem.

Reading Comprehension: Students in this score category can understand explicit details on about one-half the items. They can make inferences and draw conclusions to a limited extent when the readability of the selection is below their level. They can understand and interpret implicit ideas, events, and relationships on about one-half of the items.

3 In Table 1, boldface content is incremental for each stanine.


Stanine 6

Math Problem Solving: Students who score at this level are able to compare whole numbers and sets to 99,999, identify equivalent forms of fractions and mixed numbers, and compare and order decimal fractions. They can identify numbers evenly divisible by a given number, identify the effects of an arithmetic operation, and are moderately able to translate simple word problems into symbolic notation. These students can estimate using compatible numbers and are able to extend number patterns. They can evaluate algebraic expressions and solve algebraic equations. These students can identify geometric solid figures, parallel and perpendicular lines, geometric transformations, and models of line segments, lines, and rays. They can select units of measurement appropriate for a given situation, calculate the perimeters of polygons, and determine measurements indirectly from scale drawings using inches. Students at this level are able to use, compare, and interpret data presented in bar graphs and pictographs. They can identify possible and impossible outcomes and identify most and least likely outcomes in situations involving probability concepts. They are marginally able to apply appropriate strategies to solve problems. They also can identify a reasonable conclusion based on data presented in graphic format, solve problems using numerical reasoning, and identify missing or extraneous information presented in the context of a problem.

Reading Comprehension: Students in this score category can understand explicit details in a variety of creative, informational, and functional reading selections. They perform as well when the readability of text is at mid-sixth grade level as they do when the readability level is appropriate to fifth grade. They can understand explicitly stated actions, reasons, and sequences on about two-thirds of the items. They can make inferences and draw conclusions in some selections written below their level. They can understand and interpret implicit ideas, events, and relationships on about one-half the items. They can synthesize and evaluate explicit and implicit information primarily only in reading selections below their level.
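The disaggregation behind Table 1 is mechanical enough to sketch. The code below is a minimal, hypothetical illustration (invented response data, not the published procedure): within each stanine group it flags the items answered correctly by 80% or more of students, and the skills behind items mastered at one stanine but not the one below describe the incremental knowledge that footnote 3 refers to.

```python
# Hypothetical sketch of the stanine disaggregation described above.
# responses[stanine] holds one correctness vector (1 = correct) per student.
responses = {
    5: [[1, 1, 0, 1], [1, 1, 0, 0], [1, 0, 0, 1], [1, 1, 1, 1], [1, 1, 0, 1]],
    6: [[1, 1, 1, 1], [1, 1, 0, 1], [1, 1, 1, 1], [1, 1, 1, 0], [1, 1, 1, 1]],
}

def mastered_items(students, threshold=0.80):
    """Indices of items answered correctly by at least `threshold` of the group."""
    n = len(students)
    return {
        item for item in range(len(students[0]))
        if sum(s[item] for s in students) / n >= threshold
    }

by_stanine = {s: mastered_items(group) for s, group in responses.items()}
print(by_stanine)  # {5: {0, 1, 3}, 6: {0, 1, 2, 3}}

# The skills tested by items mastered at stanine 6 but not at stanine 5
# describe what a stanine-5 student needs in order to move up.
print(by_stanine[6] - by_stanine[5])  # {2}
```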

With this information, teachers can begin to focus instruction on the learning to be accomplished for student growth. Still, the burden of work falls back on the teacher to map these learning needs to instructional plans and then to create instruction that delivers on these goals.

Other advances in score reporting have focused on the “less is more” principle—that is, using less dense text and data and more open space, more graphics, and more color. In many ways, score reports have begun to resemble newspapers such as USA Today. These changes are powerful. Note the obvious differences in data display within the reports shown in Figures 6 through 9.


Figure 6. Score report showing use of color graphics.


Numbers are presented not only as statistics but also as commonly used graphics that reinforce their meaning. Likewise, relationships are clarified through graphic representation. For example, a single student's performance relative to the group is represented both numerically and graphically. These comparisons highlight discrepancies between an individual's performance at a point in time and the performance of relevant aggregate groups such as class, school, district, and state. Other relevant comparisons new to this type of reporting include the year-to-year comparison demonstrated in Figure 7.

Figure 7. Sample score report showing vertical scale score and student achievement growth.

This type of report begins to set the stage for teachers and parents to be able to have meaningful discussions about a student’s achievement relative to previous and subsequent years. In this prototype for Florida, it is possible to project where student growth needs to occur to stay on track for next year, given that this state testing program has a vertical scale.

For testing programs that do not have a vertical scale that allows direct comparisons across grade levels within content areas, a similar graphic could be built projecting out the percentage correct required at each grade level to stay on track for proficiency or other performance goals.
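As a minimal, hypothetical sketch of that projection (the grades, starting point, and proficiency target below are invented for display purposes only), one could interpolate the percent correct required at each grade:

```python
# Hypothetical sketch: a straight-line path of percent-correct milestones
# from a student's current performance to a proficiency target.
def on_track_targets(current_grade, current_pct, target_grade, target_pct):
    """Percent-correct milestone for each grade from now to the target."""
    step = (target_pct - current_pct) / (target_grade - current_grade)
    return {
        grade: round(current_pct + step * (grade - current_grade), 1)
        for grade in range(current_grade, target_grade + 1)
    }

# A third grader at 52% correct aiming at a 70% proficiency bar by grade 6.
print(on_track_targets(3, 52.0, 6, 70.0))
# {3: 52.0, 4: 58.0, 5: 64.0, 6: 70.0}
```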

The previous example focuses on test scores for individual students. It is equally important to use these same techniques to heighten understanding of group performance. A straightforward, simple bar graph, such as in Figure 8 and Figure 9, can also be used to help stakeholders understand how “their” school compares to the district and state.


Figure 8. Report comparing scores for an individual, school, district, and state.

Figure 9. Score report with a bar graph comparing school, district, and state scores.

The power of graphics in simplifying test data should not be ignored. If we provide visualizations of test data, the message can be crisp and clear without requiring knowledge of statistics and data analyses.
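As a simple illustration of the point, a few lines of plotting code are enough to produce a comparison graph in the spirit of Figure 9; the values here are hypothetical.

```python
# Hypothetical sketch of a school/district/state comparison bar graph.
import matplotlib.pyplot as plt

groups = ["School", "District", "State"]
mean_scale_scores = [633, 636, 641]  # invented values for illustration

plt.bar(groups, mean_scale_scores)
plt.ylabel("Mean scale score")
plt.title("Grade 4 Spelling: school, district, and state comparison")
plt.show()
```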


The Challenge: Have Students Mastered the Content Standards?

The challenges facing stakeholders in education today include methods for immediately obtaining specific, relevant, useful information to guide the day-by-day decisions that teachers make about each individual student. One approach not yet discussed is to alter the structure of test questions so that students' incorrect responses can be used to connect them to differentiated instruction. This approach requires differentiating among incorrect responses so that each reflects a different level of understanding of the particular content standard being assessed. The power behind this possibility is that targeted instruction could be delivered based on the way students answer questions incorrectly. If, for example, a student consistently selected a detail from a passage when asked for the main idea, he or she would receive a "score report" that includes a lesson clarifying what a main idea is and how details may or may not be part of the correct answer. If another student consistently selected the passage summary when asked for the main idea, she or he would receive a "score report" clarifying the differences between main idea and summary.
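To make the mechanics concrete, here is a minimal, hypothetical sketch of such distractor-driven feedback; the item identifiers, misconception tags, and lesson titles are invented, not drawn from any published test.

```python
# Hypothetical sketch: tag each incorrect answer choice with the
# misconception it reveals, then map misconceptions to lessons.
DISTRACTOR_TAGS = {
    # item_id: {answer_choice: misconception}
    "R-06": {"B": "chose_detail", "C": "chose_summary"},
    "R-14": {"A": "chose_detail", "D": "chose_summary"},
}

LESSONS = {
    "chose_detail": "Lesson: distinguishing the main idea from supporting details",
    "chose_summary": "Lesson: how a main idea differs from a summary",
}

def assign_lessons(wrong_answers):
    """Map a student's incorrect answer choices to targeted lessons."""
    misconceptions = {
        DISTRACTOR_TAGS[item][choice]
        for item, choice in wrong_answers.items()
        if choice in DISTRACTOR_TAGS.get(item, {})
    }
    return [LESSONS[m] for m in sorted(misconceptions)]

# A student who keeps picking a supporting detail when asked for the main idea:
print(assign_lessons({"R-06": "B", "R-14": "A"}))
```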

These kinds of "score reports," although not conventional in any way, would provide teachers with specific instructional tools for intervening in an individual student's learning journey. The connections between assessment patterns and instruction could be validated by teachers over time, and this evidence of validity could be shared in an electronic tool for teachers across a school district or state. This approach does, however, conflict with best practices in high-stakes assessment, because its instructional value depends on test questions whose incorrect answer choices sit at different levels of cognitive complexity.

It is also important to understand that, with this approach, students who answer the same test questions incorrectly may receive different instructional messages depending on the particular incorrect response that each student selects.

Consider the example in Figure 10 of what this “score report” might look like with the relevant instructional lessons attached:


Student Error and Intervention Report
Student: Jane A. Doe    Teacher: Mrs. Smith    Today's Date: 05/15/04
Date of Birth: 04/01/1996    Grade: 03

Standard 6A. Apply knowledge of letter-sound correspondences, language structure, and context to recognize words.
  Items tested: 4, 12, 19, 22
  Jane's score: 2/4 (class average: 3/4)
  Error analysis: No pattern of error.
  Next steps: Because Jane's responses did not follow a pattern, intervention should focus on general vocabulary development and acquisition.

Standard 6B. Use structural analysis to identify root words with prefixes such as dis-, non-, and in- and suffixes such as -ness, -tion, and -able.
  Items tested: 1, 2, 3, 5, 9
  Jane's score: 1/5 (class average: 4/5)
  Error analysis: Jane chose an option that indicates misunderstanding of suffixes.
  Next steps (Suffixes): Click here to print Jane's in-class practice on suffixes. In addition to completing the exercises on the practice sheet, ask Jane to circle the suffixes of words in red. Click here to print Jane's homework, which is the Test Prep activity on page 123 of Jane's textbook.

Standard 10F. Determine a text's main (or major) ideas and how those are supported with details.
  Items tested: 6, 14, 21, 29, 33
  Jane's score: 2/5 (class average: 3/4)
  Error analysis: Jane chose an option that states a main idea of the selection instead of a summary.
  Next steps (Summary): Click here to print a practice book activity that will help Jane understand the difference between main idea and summary. Click here to print a skill card about summarizing to discuss with Jane.

On-screen annotations: "These standards were the focus of the assessment." "Mouse over underlined items to see the items that the student missed."

Figure 10. Score report with corresponding instructional lessons.


Building on this type of information, the next-level score report would provide specific action steps for each student. Consider the case of Jane and Peter.

Table 2. Standards-Based Report

Standard 6A. Apply knowledge of letter-sound correspondences, language structure, and context to recognize words.
  Items tested4: 4, 12, 19, 22. Jane's and Peter's score: 2/4. Class average: 3/4.

Standard 6B. Use structural analysis to identify root words with prefixes such as dis-, non-, and in- and suffixes such as -ness, -tion, and -able.
  Items tested: 1, 2, 3, 5, 9. Jane's and Peter's score: 1/5. Class average: 4/5.

Standard 10F. Determine a text's main (or major) ideas and how those are supported with details.
  Items tested: 6, 14, 21, 29, 33. Jane's and Peter's score: 2/5. Class average: 3/4.

4 Boldface font indicates items answered incorrectly.


From this information, it is clear that both Jane and Peter are performing below the class average. We also know that Jane and Peter answered the same test questions incorrectly, but they chose different incorrect responses. This is the key to the next level of reporting. An analysis of how they answered the questions incorrectly appears in the following table:

Table 3. Standards-Based Report (cont.)

Error Analysis for Jane
  6A:  No pattern of error.5 Because Jane's responses did not follow a pattern, instruction should focus on general vocabulary development and acquisition.
  6B:  Jane chose an option that indicates misunderstanding of suffixes for each of the questions answered incorrectly.
  10F: Jane chooses options that state a main idea of the selection instead of the summary.

Error Analysis for Peter
  6A:  No pattern of error. Because Peter's responses did not follow a pattern, intervention should focus on general vocabulary development and acquisition.
  6B:  Peter chose options that indicate a misunderstanding of affixes.
  10F: Peter chooses options that represent a misunderstanding of the defining characteristics of a summary.

5 The student must select three or more out of five items in the same incorrect way in order for a pattern to be defined. Fewer items answered incorrectly in the same way represent something other than systematic confusion.
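The decision rule in footnote 5 is simple enough to sketch in code; the following hypothetical example reports a pattern only when three or more items were missed in the same incorrect way.

```python
# Hypothetical sketch of the pattern rule from footnote 5.
from collections import Counter

def error_pattern(misconception_tags, min_count=3):
    """Return the dominant misconception, or None when no pattern exists."""
    if not misconception_tags:
        return None
    tag, count = Counter(misconception_tags).most_common(1)[0]
    return tag if count >= min_count else None

# Four of Jane's five 6B items were missed the same way: a suffix pattern.
print(error_pattern(["suffix", "suffix", "suffix", "suffix"]))  # suffix

# Scattered errors with no tag reaching three: no systematic confusion.
print(error_pattern(["suffix", "prefix"]))  # None
```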


A teacher armed with this specific information can then proceed to focus Jane on suffixes and Peter on affixes. Likewise, the teacher can work with Jane on understanding the difference between a main idea and a summary while working with Peter on understanding the defining characteristics of a summary. For content standard 6A, there is no additional information to guide the next steps of instruction because there were no consistent patterns in wrong answers for either Jane or Peter.

Of course, the “ultimate score report” would then be the actual delivery of the practice work and instructional materials for both Peter and Jane. This content could come from the basal reading series or from supplemental materials already in use in the classroom but deployed electronically so that these materials could be customized on the fly as needed according to test performance. This type of information would at last close the gap between assessment and instruction (see Figure 11).


Figure 11. Sample ClassLinks score report.

This type of score report represents the optimum tool for teachers to refine their instruction, and it is likely the best foundation for conversations with parents about their children's learning, and with the children themselves. Deep and relevant connections with instruction are targets at which score reports have not been aimed. It is time not only to align the content of assessments and instruction but also to heighten the information links between testing and teaching.


References

Harcourt Assessment, Inc. (2002). Education catalog. Tests and related products and services. San Antonio, TX: Author.

Tufte, Edward R. (1990). Envisioning information. Cheshire, CT: Graphics Press LLC.

Tufte, Edward R. (1997). Visual explanations: Images and quantities, evidence and narrative. Cheshire, CT: Graphics Press LLC.

Tufte, Edward R. (2001). The visual display of quantitative information. Cheshire, CT: Graphics Press LLC.

Additional copies of this and related documents are available from:

Pearson Inc.
19500 Bulverde Road
San Antonio, TX 78259
1-800-211-8378
1-877-576-1816 (fax)
http://www.pearsonassess.com
