Using Assessment Data for Educator and Student Growth

Andy Hegedus, Ed.D., Kingsbury Center at NWEA, June 2014


DESCRIPTION

This presentation reviews the major topics to consider when using assessment data to implement a school's program of educator and student growth and evaluation. By attending this workshop, participants will improve their assessment literacy, learn how thoughtful data use can improve student achievement and instructional effectiveness, and discuss common issues educators share when using data for evaluative purposes.

TRANSCRIPT

Page 1: Using Assessment Data for Educator and Student Growth

Andy Hegedus, Ed.D. Kingsbury Center at NWEA

June 2014

Using Assessment Data for Educator and Student Growth

Page 2: Using Assessment Data for Educator and Student Growth

• Increase your understanding about various urgent assessment-related topics
– Ask better questions
– Useful for making all types of decisions with data

My Purpose

Page 3: Using Assessment Data for Educator and Student Growth

1. Alignment between the content assessed and the content to be taught

2. Selection of an appropriate assessment
• Used for the purpose for which it was designed (proficiency vs. growth)
• Can accurately measure the knowledge of all students
• Adequate sensitivity to growth

3. Adjust for context/control for factors outside a teacher’s direct control (value-added)

Three primary conditions

Page 4: Using Assessment Data for Educator and Student Growth

1. Assessment results used wisely as part of a dialogue to help teachers set and meet challenging goals

2. Use of tests as a “yellow light” to identify teachers who may be in need of additional support or are ready for more

Two approaches we like

Page 5: Using Assessment Data for Educator and Student Growth

• What we’ve known to be true is now being shown to be true
– Using data thoughtfully improves student achievement and growth rates
– 12% mathematics, 13% reading

• There are dangers present, however
– Unintended consequences

Go forth thoughtfully, with care

Slotnik, W. J., & Smith, M. D. (2013, February). It’s More Than Money. Retrieved from http://www.ctacusa.com/PDFs/MoreThanMoney-report.pdf

Page 6: Using Assessment Data for Educator and Student Growth

“What gets measured (and attended to), gets done”

Remember the old adage?

Page 7: Using Assessment Data for Educator and Student Growth

• NCLB
– Cast light on inequities
– Improved performance of “bubble kids”
– Narrowed the taught curriculum

The same dynamic happens inside your schools

An infamous example

Page 8: Using Assessment Data for Educator and Student Growth

It’s what we do that counts

A patient’s health doesn’t change because we know their blood pressure

It’s our response that makes all the difference

Page 9: Using Assessment Data for Educator and Student Growth

Be considerate of the continuum of stakes involved

Support

Compensate

Terminate

Increasing levels of required rigor

Increasing risk

Page 10: Using Assessment Data for Educator and Student Growth

[Chart: Marcus’ growth compared with normal growth and the growth needed to reach the college readiness standard]

Page 11: Using Assessment Data for Educator and Student Growth

The Test

The Growth Metric

The Evaluation

The Rating

There are four key steps required to answer this question

Top-Down Model

Page 12: Using Assessment Data for Educator and Student Growth

Assessment 1

Goal Setting

Assessment(s)

Results and Analysis

Evaluation (Rating)

How does the other popular process work?

Bottom-Up Model (Student Learning Objectives)

Understanding all four of the top-down elements is needed here

Page 13: Using Assessment Data for Educator and Student Growth

The Test

The Growth Metric

The Evaluation

The Rating

Let’s begin at the beginning

Page 14: Using Assessment Data for Educator and Student Growth

3rd Grade ELA Standards

3rd Grade ELA Teacher?

3rd Grade Social Studies Teacher?

Elem. Art Teacher?

What is measured should be aligned to what is to be taught

1. Answer questions to demonstrate understanding of text….

2. Determine the main idea of a text….

3. Determine the meaning of general academic and domain specific words…

Would you use a general reading assessment in the evaluation of a….

~30% of teachers teach in tested subjects and grades
The Other 69 Percent: Fairly Rewarding the Performance of Teachers of Nontested Subjects and Grades, http://www.cecr.ed.gov/guides/other69Percent.pdf

Page 15: Using Assessment Data for Educator and Student Growth

• Assessments should align with the teacher’s instructional responsibility
– Specific advanced content
• HS teachers teaching discipline-specific content, especially 11th and 12th grade
• MS teachers teaching HS content to advanced students
– Non-tested subjects
• School-wide results are more likely “professional responsibility” rather than reflecting competence
– HS teachers providing remedial services

What is measured should be aligned to what is to be taught

Page 16: Using Assessment Data for Educator and Student Growth

• Many assessments are not designed to measure growth

• Others do not measure growth equally well for all students

The purpose and design of the instrument are significant

Page 17: Using Assessment Data for Educator and Student Growth

Let’s ensure we have similar meaning

[Diagram: a vertical scale from Beginning Literacy to Adult Reading; a 5th grade student’s score at Time 1 shows status, and the change from Time 1 to Time 2 shows growth]

Two assumptions:
1. Measurement accuracy, and
2. Vertical interval scale

Page 18: Using Assessment Data for Educator and Student Growth

Accurately measuring growth depends on accurately measuring achievement

Page 19: Using Assessment Data for Educator and Student Growth

Questions surrounding the student’s achievement level

The more questions the merrier

What does it take to accurately measure achievement?

Page 20: Using Assessment Data for Educator and Student Growth

Teachers encounter a distribution of student performance

[Chart: a 5th grade class’s scores spread widely across the scale from Beginning Literacy to Adult Reading, both above and below grade level performance]

Page 21: Using Assessment Data for Educator and Student Growth

Adaptive testing works differently

Item bank can span full range of achievement

Page 22: Using Assessment Data for Educator and Student Growth

How about accurately measuring height?

What if the yardstick stopped in the middle of his back?

Page 23: Using Assessment Data for Educator and Student Growth

Items available need to match student ability

[Comparison: California STAR vs. NWEA MAP]

Page 24: Using Assessment Data for Educator and Student Growth

How about accurately measuring height?

What if we could only mark within a pre-defined six inch range?

Page 25: Using Assessment Data for Educator and Student Growth

These differences impact measurement error

[Chart: test information versus scale score (160-240) for a fully adaptive test compared with a constrained adaptive or paper/pencil test built from 5th grade level items; the two tests show significantly different error]
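As a rough illustration of why those information curves matter: in item response theory, the standard error of measurement at a given achievement level is the inverse square root of the test information at that level. The sketch below uses hypothetical information values, not figures read from the chart, to show how much larger the error becomes when a constrained test provides little information about an off-grade student.

```python
import math

def standard_error(information: float) -> float:
    """In item response theory, the conditional standard error of
    measurement at an ability level is 1 / sqrt(test information)."""
    return 1.0 / math.sqrt(information)

# Hypothetical information values for one off-grade student.
fully_adaptive_info = 0.10   # adaptive test keeps selecting items near the student
constrained_info = 0.02      # constrained test offers mostly 5th grade level items

print(f"Fully adaptive SE: {standard_error(fully_adaptive_info):.1f} scale points")
print(f"Constrained SE:    {standard_error(constrained_info):.1f} scale points")
# The constrained test's error is about 2.2 times larger for this student.
```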

Page 26: Using Assessment Data for Educator and Student Growth

To determine growth, achievement measurements must be related through a scale

Page 27: Using Assessment Data for Educator and Student Growth

If I was measured as: 5’ 9”

And a year later I was: 1.82 m

Did I grow? Yes, ~2.5”

How do you know?

Let’s measure height again
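A quick arithmetic check of the conversion above: putting both measurements on a common scale (inches, in this sketch) is what makes the growth computable.

```python
# Convert both height measurements to a common scale before computing growth.
INCHES_PER_METER = 39.3701

height_year1_in = 5 * 12 + 9               # 5' 9" = 69 inches
height_year2_in = 1.82 * INCHES_PER_METER  # 1.82 m is about 71.65 inches

growth_in = height_year2_in - height_year1_in
print(f"Growth: {growth_in:.2f} inches")   # ~2.65 inches, in line with the ~2.5" estimate
```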

Page 28: Using Assessment Data for Educator and Student Growth

Traditional assessment uses items reflecting the grade level standards

[Diagram: on a scale from Beginning Literacy to Adult Reading, the traditional assessment item bank covers only the 4th, 5th, and 6th grade level standards]

Page 29: Using Assessment Data for Educator and Student Growth

Traditional assessment uses items reflecting the grade level standards

[Diagram: the 4th, 5th, and 6th grade level standards overlap on the scale; this overlap allows linking and scale construction]

Page 30: Using Assessment Data for Educator and Student Growth

Black, P., & Wiliam, D. (2007). Large-scale assessment systems: Design principles drawn from international comparisons. Measurement: Interdisciplinary Research & Perspectives, 5(1), 1-53.

• …when science is defined in terms of knowledge of facts that are taught in school…(then) those students who have been taught the facts will know them, and those who have not will…not. A test that assesses these skills is likely to be highly sensitive to instruction.

The instrument must be able to detect instruction

Page 31: Using Assessment Data for Educator and Student Growth

Black, P., & Wiliam, D. (2007). Large-scale assessment systems: Design principles drawn from international comparisons. Measurement: Interdisciplinary Research & Perspectives, 5(1), 1-53.

• When ability in science is defined in terms of scientific reasoning…achievement will be less closely tied to age and exposure, and more closely related to general intelligence. In other words, science reasoning tasks are relatively insensitive to instruction.

The more complex, the harder to detect and attribute to one teacher

Page 32: Using Assessment Data for Educator and Student Growth

• Tests specifically designed to inform classroom instruction and school improvement in formative ways

No incentive in the system for inaccurate data

Using tests in high-stakes ways creates a new dynamic

Page 33: Using Assessment Data for Educator and Student Growth

[Chart: mean value-added growth by school, comparing students who took 10+ minutes longer in spring than in fall with all other students]

New phenomenon when used as part of a compensation program

Page 35: Using Assessment Data for Educator and Student Growth

When teachers are evaluated on growth using a once per year assessment, one teacher who cheats disadvantages the next teacher

Other consequence

Page 36: Using Assessment Data for Educator and Student Growth

• Both a proctor and the teacher should be present during testing
– Teacher can best guide students and ensure effort
– Proctor protects integrity of results and can support defense of the teacher if results are challenged
• Have all students test each term
– Need two terms to determine growth
– The more students aggregated, the more you know

Proctoring

Page 37: Using Assessment Data for Educator and Student Growth

• Important for reliable test data, particularly when determining growth
• Use testing condition indicators as KPIs (see the sketch after this slide)
– Accuracy, duration, changes in duration
– Formative conversations to improve over time
• Short test durations are worth considering for follow-up
– Apply criteria at each test event
• Be more concerned with consistency in test duration than with duration itself

Consistent Testing Conditions
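One way to operationalize those indicators is sketched below. The thresholds are hypothetical; the slides do not prescribe specific cut-offs, only that consistent criteria be applied at each test event.

```python
from dataclasses import dataclass

@dataclass
class TestEvent:
    student_id: str
    term: str
    duration_min: float  # total test duration in minutes

# Hypothetical screening thresholds; each district would set its own criteria.
MIN_DURATION_MIN = 15.0        # flag unusually short test events
MAX_DURATION_DROP_MIN = 10.0   # flag large fall-to-spring drops in duration

def flag_for_follow_up(fall: TestEvent, spring: TestEvent) -> list[str]:
    """Return reasons a student's spring event may warrant a formative follow-up."""
    reasons = []
    if spring.duration_min < MIN_DURATION_MIN:
        reasons.append("short spring test duration")
    if fall.duration_min - spring.duration_min > MAX_DURATION_DROP_MIN:
        reasons.append("large drop in duration from fall to spring")
    return reasons

fall = TestEvent("S123", "Fall", duration_min=42.0)
spring = TestEvent("S123", "Spring", duration_min=18.0)
print(flag_for_follow_up(fall, spring))  # ['large drop in duration from fall to spring']
```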

Page 38: Using Assessment Data for Educator and Student Growth

• Pause or terminate before completion
– Preferred option: address when problems are identified
– Not subject to the challenge that a student retested simply because the score wasn’t good enough
• Monitor students as testing is going on
– Ensure effort
– Support students as they struggle (G&T)
• Show that accurate data is important

Early Intervention

Page 39: Using Assessment Data for Educator and Student Growth

• Define “significant” decline between test events
– Apply the significant decline criteria each test term
• Simply missing a cut score is not an acceptable reason to retest

Retesting

Page 40: Using Assessment Data for Educator and Student Growth

Testing is complete . . . What is useful to answer our question?

The Test

The Growth Metric

The Evaluation

The Rating

Page 41: Using Assessment Data for Educator and Student Growth

The metric matters - Let’s go underneath “Proficiency”

[Chart: difficulty of the New York cut score between Level 2 and Level 3, shown as national percentiles, for reading and math in grades 2-8, with a college readiness reference]

A study of the alignment of the NWEA RIT scale with the New York State (NYS) Testing Program, November 2013

Page 42: Using Assessment Data for Educator and Student Growth

Difficulty of ACT college readiness standards

Page 43: Using Assessment Data for Educator and Student Growth

The metric matters - Let’s go underneath “Proficiency”

Dahlin, M., & Durant, S. (2011, July). The State of Proficiency. Kingsbury Center at NWEA.

Page 44: Using Assessment Data for Educator and Student Growth

What gets measured and attended to really does matter

[Chart: one district’s change in 5th grade mathematics performance relative to the KY proficiency cut scores; number of students by fall RIT, shown as up, down, or no change, with proficiency and college readiness reference points]

Page 45: Using Assessment Data for Educator and Student Growth

Changing from Proficiency to Growth means all kids matter

[Chart: number of 5th grade students in the same district meeting projected mathematics growth, by the student’s score in fall, split into below projected growth and met or above projected growth]

Page 46: Using Assessment Data for Educator and Student Growth

• What did you just learn?
• How will you change what you typically do?

Guiding Questions

Page 47: Using Assessment Data for Educator and Student Growth

How can we make it fair?

The Test

The Growth Metric

The Evaluation

The Rating

Page 48: Using Assessment Data for Educator and Student Growth

Without context, what is “Good”?

[Diagram: a scale from Beginning Reading to Adult Literacy with several reference frames laid alongside it: national percentiles from the norms study, ACT college readiness benchmarks, state test performance levels (“Meets” proficiency), and Common Core performance levels (Proficient)]

Page 49: Using Assessment Data for Educator and Student Growth

Normative data for growth is a bit different

Basic factors: starting achievement (fall score), instructional weeks, subject (Reading), grade (4th) → typical growth (7 points)

Outside of a teacher’s direct control:
• FRL vs. non-FRL?
• IEP vs. non-IEP?
• ESL vs. non-ESL?
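A minimal sketch of how a growth norm like the one above gets applied. The 7-point value for 4th grade reading is from the slide; the other table entries are illustrative only, and real norms also condition on the starting score and the number of instructional weeks.

```python
# Hypothetical fall-to-spring growth norms keyed by (subject, grade).
# Real norms also condition on starting achievement and instructional weeks.
TYPICAL_GROWTH = {
    ("Reading", 4): 7.0,      # the example on the slide
    ("Reading", 5): 6.0,      # illustrative value only
    ("Mathematics", 4): 9.0,  # illustrative value only
}

def growth_vs_typical(subject: str, grade: int, fall_score: float, spring_score: float) -> float:
    """Observed growth minus typical growth for this subject and grade."""
    typical = TYPICAL_GROWTH[(subject, grade)]
    return (spring_score - fall_score) - typical

print(growth_vs_typical("Reading", 4, fall_score=201, spring_score=210))  # 2.0 points above typical
```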

Page 50: Using Assessment Data for Educator and Student Growth

How did we address requirements in New York?

State Tested Grades / Subjects (4-8 Math and Reading):
• APPR Observations 60%
• State Test Growth 20%
• EA Value-Added 20%

Other Grades / Subjects for which there is an available non-state test:
• APPR Observations 60%
• Local Measure 2 (SLO) 20%
• EA Value-Added 20%

Partnered with Education Analytics on VAM
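A minimal sketch of how such a weighted composite can be combined. The 60/20/20 weights are from the slide; the component scores and the assumption that each is already on a common 0-100 scale are hypothetical.

```python
# Weighted APPR-style composite for a state-tested grade/subject.
# Weights come from the slide; component scores here are hypothetical and
# assumed to be expressed on a common 0-100 scale.
WEIGHTS = {"observations": 0.60, "state_test_growth": 0.20, "value_added": 0.20}

def composite_score(components: dict[str, float]) -> float:
    """Weighted sum of the evaluation components."""
    return sum(WEIGHTS[name] * score for name, score in components.items())

teacher = {"observations": 82.0, "state_test_growth": 70.0, "value_added": 65.0}
print(f"Composite: {composite_score(teacher):.1f}")  # 49.2 + 14.0 + 13.0 = 76.2
```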

Page 51: Using Assessment Data for Educator and Student Growth

The Oak Tree Analogy* – a conceptual introduction to the metric

*Developed at the Value-Added Research Center

An Introduction to Value-Added

Page 52: Using Assessment Data for Educator and Student Growth

The Oak Tree Analogy

Page 53: Using Assessment Data for Educator and Student Growth

Gardener A Gardener B

Explaining Value-Added by Evaluating Gardener Performance

• For the past year, these gardeners have been tending to their oak trees trying to maximize the height of the trees.

Page 54: Using Assessment Data for Educator and Student Growth

This method is analogous to using an Achievement Model.

Gardener A Gardener B

61 in.

72 in.

Method 1: Measure the Height of the Trees Today (One Year After the Gardeners Began)

• Using this method, Gardener B is the more effective gardener.

Page 55: Using Assessment Data for Educator and Student Growth

[Diagram: Oak A grew from 47 in. at age 3 (1 year ago) to 61 in. at age 4 (today); Oak B grew from 52 in. to 72 in. over the same year]

This Achievement Result is not the Whole Story

• We need to find the starting height for each tree in order to more fairly evaluate each gardener’s performance during the past year.

Page 56: Using Assessment Data for Educator and Student Growth

This is analogous to a Simple Growth Model, also called Gain.

[Diagram: Oak A grew from 47 in. to 61 in. (+14 in.); Oak B grew from 52 in. to 72 in. (+20 in.)]

Method 2: Compare Starting Height to Ending Height

• Oak B had more growth this year, so Gardener B is the more effective gardener.

Page 57: Using Assessment Data for Educator and Student Growth

Gardener A Gardener B

What About Factors Outside the Gardener’s Influence?

• This is an “apples to oranges” comparison.
• For our oak tree example, three environmental factors we will examine are: Rainfall, Soil Richness, and Temperature.

Page 58: Using Assessment Data for Educator and Student Growth

External condition: Oak Tree A / Oak Tree B
• Rainfall amount: High / Low
• Soil richness: Low / High
• Temperature: High / Low

Page 59: Using Assessment Data for Educator and Student Growth

Gardener A Gardener B

How Much Did These External Factors Affect Growth?

• We need to analyze real data from the region to predict growth for these trees.
• We compare the actual height of the trees to their predicted heights to determine if the gardener’s effect was above or below average.

Page 60: Using Assessment Data for Educator and Student Growth

In order to find the impact of rainfall, soil richness, and temperature, we will plot the growth of each individual oak in the region compared to its environmental conditions.

Page 61: Using Assessment Data for Educator and Student Growth

Growth in inches relative to the average:
• Rainfall: Low -5, Medium -2, High +3
• Soil Richness: Low -3, Medium -1, High +2
• Temperature: Low +5, Medium -3, High -8

Calculating Our Prediction Adjustments Based on Real Data

Page 62: Using Assessment Data for Educator and Student Growth

[Diagram: starting from 47 in. (Oak A) and 52 in. (Oak B) a year ago, adding the +20 in. average growth gives initial predictions of 67 in. for Oak A and 72 in. for Oak B]

Make Initial Prediction for the Trees Based on Starting Height

• Next, we will refine our prediction based on the growing conditions for each tree. When we are done, we will have an “apples to apples” comparison of the gardeners’ effect.

Page 63: Using Assessment Data for Educator and Student Growth

[Diagram: Oak A’s prediction becomes 47 + 20 + 3 = 70 in.; Oak B’s prediction becomes 52 + 20 - 5 = 67 in.]

Based on Real Data, Customize Predictions based on Rainfall

• For having high rainfall, Oak A’s prediction is adjusted by +3 to compensate.
• Similarly, for having low rainfall, Oak B’s prediction is adjusted by -5 to compensate.

Page 64: Using Assessment Data for Educator and Student Growth

[Diagram: Oak A’s prediction becomes 47 + 20 + 3 - 3 = 67 in.; Oak B’s prediction becomes 52 + 20 - 5 + 2 = 69 in.]

Adjusting for Soil Richness

• For having poor soil, Oak A’s prediction is adjusted by -3.
• For having rich soil, Oak B’s prediction is adjusted by +2.

Page 65: Using Assessment Data for Educator and Student Growth

[Diagram: Oak A’s prediction becomes 47 + 20 + 3 - 3 - 8 = 59 in.; Oak B’s prediction becomes 52 + 20 - 5 + 2 + 5 = 74 in.]

Adjusting for Temperature

• For having high temperature, Oak A’s prediction is adjusted by -8.
• For having low temperature, Oak B’s prediction is adjusted by +5.

Page 66: Using Assessment Data for Educator and Student Growth

[Diagram: Oak A’s predicted growth is +20 average + 3 rainfall - 3 soil - 8 temperature = +12 inches during the year (47 in. → 59 in.); Oak B’s predicted growth is +20 average - 5 rainfall + 2 soil + 5 temperature = +22 inches during the year (52 in. → 74 in.)]

Our Gardeners are Now on a Level Playing Field

• The predicted height for trees in Oak A’s conditions is 59 inches.

• The predicted height for trees in Oak B’s conditions is 74 inches.

Page 67: Using Assessment Data for Educator and Student Growth

[Diagram: predicted Oak A 59 in. vs. actual 61 in. (+2); predicted Oak B 74 in. vs. actual 72 in. (-2)]

Compare the Predicted Height to the Actual Height

• Oak A’s actual height is 2 inches more than predicted. We attribute this to the effect of Gardener A.
• Oak B’s actual height is 2 inches less than predicted. We attribute this to the effect of Gardener B.

Page 68: Using Assessment Data for Educator and Student Growth

This is analogous to a Value-Added measure.

[Diagram: Gardener A shows above average value-added (actual 61 in. vs. predicted 59 in., +2); Gardener B shows below average value-added (actual 72 in. vs. predicted 74 in., -2)]

Method 3: Compare the Predicted Height to the Actual Height

• By accounting for last year’s height and environmental conditions of the trees during this year, we found the “value” each gardener “added” to the growth of the trees.
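The arithmetic of the analogy can be written out directly. The sketch below simply restates the slides' numbers and the adjustment table from the earlier slide; it is not the actual Education Analytics model, which estimates the adjustments statistically from data on many students.

```python
# Oak Tree Analogy: compute each gardener's "value added" from the slides' numbers.
AVERAGE_GROWTH = 20  # inches of growth expected for trees of this starting height

# Growth adjustments (inches relative to the average) from the regional data table.
ADJUSTMENTS = {
    "rainfall":    {"Low": -5, "Medium": -2, "High": +3},
    "soil":        {"Low": -3, "Medium": -1, "High": +2},
    "temperature": {"Low": +5, "Medium": -3, "High": -8},
}

def predicted_height(start: float, conditions: dict[str, str]) -> float:
    """Starting height + average growth + condition adjustments."""
    return start + AVERAGE_GROWTH + sum(
        ADJUSTMENTS[factor][level] for factor, level in conditions.items()
    )

def value_added(start: float, actual: float, conditions: dict[str, str]) -> float:
    """Actual height minus predicted height is attributed to the gardener."""
    return actual - predicted_height(start, conditions)

oak_a = value_added(47, 61, {"rainfall": "High", "soil": "Low", "temperature": "High"})
oak_b = value_added(52, 72, {"rainfall": "Low", "soil": "High", "temperature": "Low"})
print(f"Gardener A: {oak_a:+.0f} in.")  # +2 (above average)
print(f"Gardener B: {oak_b:+.0f} in.")  # -2 (below average)
```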

Page 69: Using Assessment Data for Educator and Student Growth

Gardener A

Value-Added is a Group Measure

• To statistically isolate a gardener’s effect, we need data from many trees under that gardener’s care.

Gardener B

Page 70: Using Assessment Data for Educator and Student Growth

How does this analogy relate to value-added in the education context?

What are we evaluating?
• Oak Tree Analogy: Gardeners
• Value-Added in Education: Districts, schools, grades, classrooms, programs and interventions

What are we using to measure success?
• Oak Tree Analogy: Relative height improvement in inches
• Value-Added in Education: Relative improvement on standardized test scores

Sample
• Oak Tree Analogy: Single oak tree
• Value-Added in Education: Groups of students

Control factors
• Oak Tree Analogy: Tree’s prior height; other factors beyond the gardener’s control (rainfall, soil richness, temperature)
• Value-Added in Education: Students’ prior test performance (usually the most significant predictor); other demographic characteristics such as grade level, gender, race/ethnicity, low-income status, ELL status, disability status, Section 504 status

Page 71: Using Assessment Data for Educator and Student Growth

• What if I skip this step?
– Comparison is likely against normative data, so the comparison is to “typical kids in typical settings”
• How fair is it to disregard context?
– Good teacher, bad school
– Good teacher, challenging kids

Consider . . .

Page 72: Using Assessment Data for Educator and Student Growth

• Control for measurement error
– All models attempt to address this issue
• Population size
• Multiple data points
– Error is compounded when combining two test events (see the sketch below)
– Many teachers’ value-added scores will fall within the range of statistical error

A variety of errors means more stability only at the extremes
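A minimal numeric sketch of the compounding point above, using hypothetical standard errors: when growth is the difference of two independently measured scores, the measurement errors combine in quadrature, so many individual growth estimates sit inside the error band.

```python
import math

# Hypothetical standard errors of measurement for one student's two test events.
se_fall, se_spring = 3.0, 3.0   # scale points

observed_growth = 5.0           # spring score minus fall score
se_growth = math.sqrt(se_fall**2 + se_spring**2)  # errors combine in quadrature

low, high = observed_growth - 2 * se_growth, observed_growth + 2 * se_growth
print(f"Growth = {observed_growth:.1f} +/- {se_growth:.1f} (approx. 95% band: {low:.1f} to {high:.1f})")
# With these numbers the band runs from about -3.5 to 13.5 points, so this single
# student's growth is not clearly distinguishable from zero; aggregating many
# students shrinks the error, which is why stability improves with group size.
```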

Page 73: Using Assessment Data for Educator and Student Growth

[Chart: Mathematics Growth Index Distribution by Teacher - Validity Filtered; the average growth index score and range for each teacher, grouped into quintiles Q1-Q5]

Each line in this display represents a single teacher. The graphic shows the average growth index score for each teacher (green line), plus or minus the standard error of the growth index estimate (black line). We removed students who had tests of questionable validity and teachers with fewer than 20 students.

Range of teacher value-added estimates

Page 74: Using Assessment Data for Educator and Student Growth

With one teacher, error means a lot

Page 75: Using Assessment Data for Educator and Student Growth

Because we want students to learn more!

• Research view
– Setting goals improves performance

Why should we care about goal setting in education?

Page 76: Using Assessment Data for Educator and Student Growth

What does research say on goal setting?

Locke, E. A., & Latham, G. P. (2002). Building a practically useful theory of goal setting and task motivation: A 35-year odyssey. American Psychologist.

[Diagram: Essential Elements of Goal-Setting Theory and the High-Performance Cycle: goals, acting through mechanisms and conditioned by moderators, lead to performance, then to satisfaction with performance and rewards, and then to willingness to commit]


Page 78: Using Assessment Data for Educator and Student Growth

Goals

Goals / Explanation
• Specificity: Specific goals are typically stronger than “Do your best” goals
• Difficulty: Moderately challenging is better than too easy or too hard
– Performance and learning goals: If complex and new knowledge or skills are needed, set learning goals (e.g., master five new ways to assess each student’s learning in the moment)
– Proximal goals: If complex, set short-term goals to gauge progress and feel rewarded

Page 79: Using Assessment Data for Educator and Student Growth

• Lack of a historical context
– What have this teacher and these students done in the past?
• Lack of comparison groups
– What have other teachers done in the past?
• What is the objective?
– Is the objective to meet a standard of performance or to demonstrate improvement?
• Do you set safe goals or challenging goals?

Challenges with goal setting

Page 80: Using Assessment Data for Educator and Student Growth

• Goals and targets themselves
– Appropriately balance moderately challenging goals with consequences
• Only use “stretch” goals for the organization to stimulate creativity and create unconventional solutions

Suggestions

Locke, E. A., & Latham, G. P. (2013). New developments in goal setting and task performance.

Page 81: Using Assessment Data for Educator and Student Growth

• Goals and targets themselves (cont.)
– Set additional learning goals if complex and new
– Set interim benchmarks for progress monitoring
– Carefully consider what will not happen to attain the goal
• Can you live with the consequences?
• How will you look for other unintended ones?

Suggestions

Locke, E. A., & Latham, G. P. (2013). New developments in goal setting and task performance.

Page 82: Using Assessment Data for Educator and Student Growth

How tests are used to evaluate teachers

The Test

The Growth Metric

The Evaluation

The Rating

Page 83: Using Assessment Data for Educator and Student Growth

• How would you translate a rank order to a rating?
• Data can be provided
• A value judgment is ultimately the basis for setting cut scores for points or a rating (see the sketch below)

Translation into ratings can be difficult to inform with data
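A minimal sketch of the mechanical part of this step. The cut points and rating labels below are hypothetical; the slide's point is that choosing them is a value judgment the data cannot make on its own.

```python
# Convert a teacher's growth percentile (rank within a comparison group)
# into a rating band. The cut points below are hypothetical value judgments.
CUT_POINTS = [          # (minimum percentile, rating)
    (90, "Highly Effective"),
    (30, "Effective"),
    (10, "Developing"),
    (0,  "Ineffective"),
]

def percentile_to_rating(growth_percentile: float) -> str:
    """Map a growth percentile onto a rating using the chosen cut points."""
    for minimum, rating in CUT_POINTS:
        if growth_percentile >= minimum:
            return rating
    return CUT_POINTS[-1][1]

print(percentile_to_rating(72))  # Effective
print(percentile_to_rating(12))  # Developing
```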

Page 84: Using Assessment Data for Educator and Student Growth

• What is far below a district’s expectation is subjective

• What about:
– Obligation to help teachers improve?
– Quality of replacement teachers?

Decisions are value based, not empirical

Page 85: Using Assessment Data for Educator and Student Growth

• The system for combining elements and producing a rating is also a value-based decision
– Multiple measures and principal judgment must be included
– Evaluate the extremes to make sure it makes sense

Even multiple measures need to be used well

Page 86: Using Assessment Data for Educator and Student Growth

Leadership Courage Is A Key

[Chart: observation ratings and assessment-based ratings on a 0-5 scale for Teacher 1, Teacher 2, and Teacher 3]

Ratings can be driven by the assessment

Real or Noise?

Page 87: Using Assessment Data for Educator and Student Growth

If evaluators do not differentiate their ratings, then all differentiation comes from the test

Big Message

Page 88: Using Assessment Data for Educator and Student Growth

1. Alignment between the content assessed and the content to be taught

2. Selection of an appropriate assessment
• Used for the purpose for which it was designed (proficiency vs. growth)
• Can accurately measure the knowledge of all students
• Adequate sensitivity to growth

3. Adjust for context/control for factors outside a teacher’s direct control (value-added)

Please be thoughtful about . . .

Page 89: Using Assessment Data for Educator and Student Growth

• Presentations and other recommended resources are available at:
– www.nwea.org
– www.kingsburycenter.org
– www.slideshare.net
• Contacting us:
NWEA Main Number: 503-624-1951
E-mail: [email protected]

More information