
School and Teacher Effects: A Team Effort

Sharon Walpole, University of Delaware

Michael C. McKenna, University of Virginia

So how’d we do last year?

Let’s look at some data from the UGA external evaluator’s report.

Letter Naming Fluency, Kindergarten - End of Year (% of students at each risk level)

             Low Risk   Some Risk   At Risk
2004-2005      67.5        19.2       13.3
2005-2006      74          16         10

Phoneme Segmentation Fluency, Kindergarten - End of Year (% of students at each risk level)

             Low Risk   Some Risk   At Risk
2004-2005      70.6        21.6        7.7
2005-2006      82          13          5

Nonsense Word Fluency, Kindergarten - End of the Year (% of students at each risk level)

             Low Risk   Some Risk   At Risk
2004-2005      70.7        16.5       12.8
2005-2006      77          13         10

But WHY?

Think about your most effective and your least effective kindergarten teacher. They both have the same materials. They have the same professional support system. They both have the same reading block.

What is it that actually differs between the two and might lead to differences in achievement?

Phoneme Segmentation Fluency, First Grade - End of the Year (% of students at each risk level)

             Low Risk   Some Risk   At Risk
2004-2005      76.3        22.1        1.6
2005-2006      85          14          1

Nonsense Word Fluency, First Grade - End of the Year (% of students at each risk level)

             Low Risk   Some Risk   At Risk
2004-2005      64.9        27.7        7.5
2005-2006      73          23          4

Oral Reading Fluency, First Grade - End of the Year (% of students at each risk level)

             Low Risk   Some Risk   At Risk
2004-2005      64.7        22.7       12.6
2005-2006      70          20         10

But WHY?

Think about your most effective and your least effective first-grade teacher. They both have the same materials. They have the same professional support system. They both have the same reading block.

What is it that actually differs between the two and might lead to differences in achievement?

Oral Reading Fluency, Second Grade - End of the Year (% of students at each risk level)

             Low Risk   Some Risk   At Risk
2004-2005      50.8        22.3       26.9
2005-2006      56          21         24

But WHY?

Think about your most effective and your least effective second-grade teacher. They both have the same materials. They have the same professional support system. They both have the same reading block.

What is it that actually differs between the two and might lead to differences in achievement?

Oral Reading Fluency, Third Grade - End of the Year (% of students at each risk level)

             Low Risk   Some Risk   At Risk
2004-2005      44.5        33.9       21.9
2005-2006      51          31         18

But WHY?

Think about your most effective and your least effective third-grade teacher. They both have the same materials. They have the same professional support system. They both have the same reading block.

What is it that actually differs between the two and might lead to differences in achievement?

Portion of Variance in Achievement Explained

Individual differences
Teacher differences
School differences
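One concrete way to read “portion of variance explained” is as a decomposition of the total variance in achievement into child-, teacher-, and school-level components, as in a multilevel model. The notation below is only an illustrative sketch, not an analysis reported by the studies discussed here.

```latex
% Illustrative only: children (i) nested in teachers (j) nested in schools (k).
% Each sigma^2 term is the variance attributable to that level.
\[
\operatorname{Var}(y_{ijk}) = \sigma^2_{\text{child}} + \sigma^2_{\text{teacher}} + \sigma^2_{\text{school}},
\qquad
\text{school share} = \frac{\sigma^2_{\text{school}}}{\sigma^2_{\text{child}} + \sigma^2_{\text{teacher}} + \sigma^2_{\text{school}}}.
\]
```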

Lessons from School Effectiveness Literature

An LC can form judgments about teacher differences. It’s harder to judge differences at the school level because most of us lack the opportunities to make such comparisons.

Nevertheless, factors at the school level can have strong effects on achievement. The better we understand them, the more we can control them through leadership.

Differences among teachers have a clear impact on learning. Coaching attempts to reduce some of these differences by guiding teachers toward best practice.

But what about differences among schools? Can school factors be important as well?

What are some differences among kids that affect achievement?

What are some differences among teachers that affect achievement?

What are some differences among schools that affect achievement?

Organize Your Hunches

Grade   Child   Teacher   School
K
1
2
3

Let’s look at three studies that attempted to identify school effects on reading achievement. What can we learn from them?

Beat-the-Odds Study*

Design Features
Looked at school and teacher factors
Used measures of word reading, fluency, and retellings
14 schools, 11 of which “beat the odds”
2 teachers at each grade, and 4 children per classroom
Relied on interviews, surveys, observations, and scores

School factors positively correlated with growth
Forging links with parents
Using systematic assessment
Fostering communication and collaboration

*Taylor, Pearson, Clark, & Walpole, 2000.

State-Level Outlier Study*

Design Features
2 high-achieving and 1 low-achieving school from each of 3 clusters: country, main street, uptown
Used state tests, interviews, and observations

School factors present in the high-achieving schools
Strong leadership and commitment
Teacher knowledge
Time and opportunity for children to read
Commitment of 8-10 years to the change process

*Mosenthal, Lipson, Torncello, Russ, & Mekkelsen, 2004.

Curriculum Effects Study*

Design Features
Compared 4 major reform models in first grade
Each used a coach, lots of PD, and regrouping of kids
4 experienced schools in each model
Measures of decoding, comprehension, vocabulary; classroom observations

Comparisons
All of the schools were relatively successful
None of the 4 models proved best

*Tivnan & Hemphill, 2005.

What lessons do these studies teach?

Common Characteristics

1. No one curriculum or intervention model is a magical solution to student achievement problems

2. Intense focus on the school’s goals is associated with success

Assessment, communication, collaboration
Leadership, vision, knowledge
PD plus specific curriculum

Differences From Us

1. These schools were successful or experienced already; we are striving to be successful and we are still new to our curricula

2. These schools had different demographics from ours (except for the Curriculum Effects Study)

3. These studies do not include the effects of intensive interventions

Limits of Generalizability

1. We can’t tell whether these characteristics are causes of success or merely correlates of it

2. We don’t know whether these factors “transfer” to striving schools

Now let’s look at a fourth study – one with better lessons for Georgia and Reading First.

GA REA Study*

Question

In a two-year reform initiative, what school-level characteristics are significantly related to first-grade achievement?

Sample

22 striving schools in REA (Mean FRL = 73%)
1,146 first-grade children in those schools -- all children with full data

*Walpole, Kaplan, & Blamey (in preparation).

Data

Standardized measures of decoding and comprehension, DIBELS LNF and ORF
Interviews, surveys

Findings

All of the schools were relatively successful; mean grade 1 ORF = 60 wcpm
Standardized scores above the 50th percentile!
Assessment-based planning explained a small but significant amount of the variance in first-grade achievement. This is a school factor!

So What IS Assessment-Based Planning Anyway?

Schools with higher achievement were rated higher on these two characteristics:

The leaders made thoughtful choices to purchase commercial curriculum materials to meet school-level needs.

The leaders of this project designed a comprehensive assessment system that teachers used to differentiate instruction.

How Did We Find Out?

We made a special rating sheet to capture the level of implementation of each aspect of the project.

We correlated those ratings with student achievement to find the ones that were most powerful.
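As a rough sketch of that step (not the project’s actual analysis), the Python below correlates school-level implementation ratings with a school-mean achievement score; the file name, column names, and rating scale are hypothetical placeholders.

```python
# Minimal sketch: correlate school-level implementation ratings with achievement.
# File name, column names, and rating scale are hypothetical placeholders.
import pandas as pd

ratings = pd.read_csv("school_implementation_ratings.csv")  # one row per school

rating_columns = [
    "differentiated_word_recognition",   # rating of differentiation strategies
    "read_aloud_strategy_vocabulary",    # rating of read-aloud instruction
    "materials_choice",                  # rating of curriculum materials choices
    "assessment_system_design",          # rating of the assessment system
]

# Pearson correlation of each implementation rating with mean grade 1 ORF (wcpm)
correlations = ratings[rating_columns].corrwith(ratings["mean_grade1_orf"])
print(correlations.sort_values(ascending=False))
```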

How Did We Find Out?

Four variables “survived” the correlations:
1. Differentiation strategies for word recognition and fluency
2. Strategy and vocabulary instruction during read-alouds
3. Careful choice of instructional materials
4. School-level design of an assessment system linked to instruction

In our RF work . . .

These variables correspond to:
1. Differentiated word recognition strategies for needs-based work
2. Interactive read-alouds of children’s literature (Beck + Duffy)
3. Use (or purchase) of curriculum materials based on their match with emergent achievement data
4. Selection of assessments that are really used to plan needs-based instruction

How Did We Find Out?

These 4 variables were highly correlated with one another; we combined the two leadership variables and the two differentiation variables.

We controlled for LNF in January of kindergarten and for SES, and assessment-based planning was still correlated with first-grade scores at the school level.
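A minimal sketch of one way to do that kind of control is below, using ordinary least squares on school-level data; the variable names are hypothetical, and the study itself may have used a different (for example, multilevel) specification.

```python
# Minimal sketch: is assessment-based planning still related to first-grade scores
# after controlling for mid-kindergarten LNF and SES? Variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

schools = pd.read_csv("school_level_data.csv")  # one row per school

model = smf.ols(
    "mean_grade1_score ~ assessment_based_planning + mean_january_k_lnf + frl_percent",
    data=schools,
).fit()

print(model.summary())  # inspect the coefficient on assessment_based_planning
```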

What Do We Need to Learn?

We need to see what variables are powerful in GARF.

We need to add the level of the teacher:
– Identify the characteristics we are targeting
– Collect meaningful observational data on them to see whether they make a difference

We need to use the data we have and get the data we need.

That’s a tall order!

(But we think we can tackle it together.)

From a design standpoint

We are all trying to collect student data to measure the success of our programs.

It does not make sense to measure program effects without measuring treatment fidelity.

It does not make sense to measure treatment fidelity without observing the treatment.

It does not make sense to document treatment fidelity without trying to improve it.

Why?

Let’s look at the concept of “innovation configuration.” This is a way of finding out how fully we are implementing Reading First.

Innovation Configuration*

Full implementation: The target practice is described here.

Partial implementation: A practice in between (or, more likely, several different ones) is described here.

No implementation: A description of a practice inconsistent with the target is described here.

*Hall & Hord, 2001

NSDC Staff Development Standards
http://www.nsdc.org/standards/index.cfm

Context: Learning communities, Leadership, Resources
Process: Data-driven, Evaluation, Research-based, Design, Learning, Collaboration
Content: Equity, Quality teaching, Family involvement

Moving NSDC's Staff Development Standards into Practice: Innovation Configurations

By Shirley Hord, Stephanie Hirsh & Patricia Roy

Innovation Configuration for Teacher’s Professional Learning

Level 4

Engages in collaborative interactions in learning teams and participates in a variety of activities that are aligned with expected improvement outcomes (e.g., collaborative lesson design, professional networks, analyzing student work, problem solving sessions, curriculum development).

Innovation Configuration for Teacher’s Professional Learning

Level 2

Attends workshops to gain information about new programs and receives classroom-based coaching to assist with implementation of new strategies and activities that may be aligned with expected improvement outcomes.

Innovation Configuration for Teacher’s Professional Learning

Level 1

Experiences a single model or inappropriate models of professional development that are not aligned with expected outcomes.

Procedure for Making an IC

1. Designer of an innovation describes ideal implementation of various components

2. Those “ideals” are compared with “real” implementation through observation

3. The “reals” are lined up from least like the ideal to most like the ideal

4. Then the IC can be used for observations, and even linked to student achievement!
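To make the procedure concrete, here is a small Python sketch of how an IC could be stored and used during an observation; the component, levels, and descriptions are hypothetical placeholders, not the GARF instrument.

```python
# Minimal sketch: an innovation configuration as level descriptions per component,
# plus a lookup from an observer's rating to its description. Content is hypothetical.
innovation_configuration = {
    "small_group_instruction": {
        3: "Daily, clearly differentiated small groups for every child",      # full
        2: "Occasional small groups, or groups that are not differentiated",  # partial
        1: "No small-group instruction observed",                             # none
    },
}

def describe_rating(component: str, observed_level: int) -> str:
    """Return the IC description that matches an observer's level rating."""
    return innovation_configuration[component][observed_level]

# Example: an observer rates small-group instruction as partial implementation.
print(describe_rating("small_group_instruction", 2))
```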

Here are our classroom-level “ideals” for GARF so far.

Physical Environment

The classroom is neat, clean, and organized so that the teacher can monitor all children and accomplish whole-group and needs-based instruction and so that children can get the materials they need. Wall space is used to display student work and curriculum-related materials that children need to accomplish tasks.

Curriculum Materials

There is one core reading program in active use. There is physical evidence of coherence in the text-level and word-level skills and strategies targeted in the classroom environment. Texts and manipulatives for whole-group, small-group, and independent practice are organized and available.

Children’s Literature

There is a large classroom collection of high-quality children’s literature deliberately organized and in active use that includes narratives, information texts, and multicultural texts.

Instructional Schedule

There is a posted schedule inside and outside the classroom to define an organized plan for using curriculum materials for whole-group and needs-based instruction; teacher and student activities correspond to the schedule.

Assessment System

There is an efficient system for screening, diagnosing specific instructional needs, and progress-monitoring that is visible to the teacher and informs instructional groupings, the content of small-group instruction, and a flexible intervention system. All data are used to make instruction more effective.

Whole-Group Instruction

Whole-group instruction is used to introduce new concepts and to model strategies. Children have multiple opportunities to participate and respond during instruction.

Small-Group Instruction

Small-group instruction is used to reinforce, reteach, and review. Each child spends some time in a small group each day, and small-group instruction is clearly differentiated. Children have multiple opportunities to participate and respond during instruction.

Independent Practice

Children work alone, in small groups, or in pairs to practice skills and strategies that have been previously introduced. They read and write during independent practice. They do this with a high level of success because the teacher organizes independent practice so that it is linked to whole-group and small-group instruction.

Management

The classroom is busy and active, but focused on reading. Classroom talk is positive and academic, including challenging vocabulary. Children know how to interact during whole-class, small-group, and independent work time. Very little time is spent teaching new procedures.

There’s a Lot We Don’t Know!

Is it possible to observe with one form across all grades?

How could we collect these observational data reliably and efficiently?

Which of these might explain variance in student achievement?

Procedure for Making an IC

1. Designer of an innovation describes ideal implementation of various components

2. Those “ideals” are compared with “real” implementation through observation

3. The “reals” are lined up from least like the ideal to most like the ideal

4. Then the IC can be used for observations, and even linked to student achievement!

Professional Support System*

*Joyce & Showers, 2002

Why?

Theory, Demonstration, Practice, Feedback

You be the rater!

We will give you the IC that we made for the Georgia REA study.

Think about your first-grade team last year.

Provide us a realistic reflection on where you stood during Year 2.

We will use that data to analyze the Year 2 scores and provide you an update at our next meeting!

References

Hall, G., & Hord, S. (2001). Implementing change: Patterns, principles, and potholes. Boston: Allyn and Bacon.

Joyce, B., & Showers, B. (2002). Student achievement through staff development. White Plains, NY: Longman.

Mosenthal, J., Lipson, M., Torncello, S., Russ, B., & Mekkelsen, J. (2004). Contexts and practices of six schools successful in obtaining reading achievement. The Elementary School Journal, 104, 343-367.

Taylor, B. M., Pearson, P. D., Clark, K. M., & Walpole, S. (2000). Effective schools and accomplished teachers: Lessons about primary-grade reading instruction in low-income schools. The Elementary School Journal, 101, 121-165.

Tivnan, T., & Hemphill, L. (2005). Comparing four literacy reform models in high-poverty schools: Patterns of first-grade achievement. The Elementary School Journal, 105, 419-441.

Walpole, S., Kaplan, D., & Blamey, K. L. (in preparation). The effects of assessment-driven instruction on first-grade reading performance: Evidence from REA in Georgia.
