
MAINE READING FIRST

ANNUAL PROGRESS REPORT (YEAR ONE)

Prepared by: Janet Fairman, Ph.D.

June 2006

This report was prepared by Janet Fairman, Assistant Research Professor, Maine Education Policy Research Institute (MEPRI) and Center for Research and Evaluation (CRE), in collaboration with CRE staff members Debra Allen, Brian Doore, Yurui Zhen, and Amy Cates, and with consultant Geoffrey Gordon for the analysis of student assessment data.

Maine Education Policy Research Institute

A nonpartisan research institute funded by the Maine State Legislature, the University of Maine, and the University of Southern Maine.

———————————————————————————————————

Center for Research and Evaluation, College of Education & Human Development University of Maine, 5766 Shibles Hall, Orono, ME 04469-5766

(207) 581-2493 • FAX (207) 581-9510

A Member of The University of Maine System


Table of Contents

Executive Summary
    Introduction
    Delivery of the MRF Intervention
    Demographics for Subgrant Schools
    Program Implementation in Subgrant Schools
    Participant Survey Results
    Baseline (Year One) Student Assessment Results
Introduction
Part I: Program Activities
    Activities Coordinated/Delivered by MRF
        Staffing
        Program Activities for Subgrant Schools
    Cohort 1 Meetings
    Cohort 2 Meetings
    Coursework for Subgrant School Educators
    Statewide Program Activities
    Section Summary
Part II: Demographic Data for MRF Subgrant Schools
    School Size
    Teacher Demographic Data
    Student Demographic and Achievement Data
    Section Summary
Part III: Program Implementation
    Reading Assessment Practices
    School Leadership and Staffing Capacity
    Core Reading Program
    Participation in MRF Course
    Section Summary
Part IV: Participant Survey Data
    MRF Literacy Leadership Team Survey
        Feedback on Technical Assistance
        Feedback on Professional Development
        School Implementation Efforts
            Supports
            Challenges
        Perceptions of Early Impacts of MRF
        Impacts on Student Reading Achievement
        Views about MRF
        Summary for Leadership Team Survey
    MRF Course Surveys and Data Comparison
        Demographics
        Knowledge of Topics and Concepts in Reading
            Knowledge of Research-Based Reading Practices
            Knowledge of Reading Development
            Knowledge of Literacy Environments
            Knowledge of Reading Elements and Reading Instruction
        Professional Development Needs and Other Supports
        Feedback on Core Reading Program
        Feedback on MRF Course
        Instructional Views
        Instructional Practice
        Impacts of New Learning on Instructional Practice
        Summary for MRF Course Surveys
    Section Summary
Part V: Baseline Student Assessment Data
    Aggregate Assessment Results: Year One
        Student progress in phonemic awareness
            ISF results
            PSF results
        Student progress in phonics
            LNF results
            NWF results
            TerraNova--Word Analysis results
        Student progress in vocabulary
            WUF results
            TerraNova--Vocabulary results
        Student progress in fluency
            ORF results
        Student progress in comprehension
            RTF results
            TerraNova--Comprehension results
        Summary of aggregate assessment results
    Comparison of Assessment Results Across Cohort 1 Schools
        Phonemic awareness
        Phonics
        Vocabulary
        Fluency
        Comprehension
        Summary of cross sample comparison
    Comparison of Assessment Results for Student Subgroups
        Phonemic awareness
        Phonics
        Vocabulary
        Fluency
        Comprehension
        Summary for comparison of student subgroups
    Summary for Part V: Baseline Student Assessment Data
References


Executive Summary

Introduction

This report provides a summary of progress for Maine Reading First (MRF), an initiative

coordinated by the Maine Department of Education. The Maine Education Policy Research

Institute is the external evaluator for MRF. This report includes a description of program

activities taking place from the midyear report of March 2005 to the present, and baseline

reading assessment data for students in MRF during the 2004-2005 school year (year one). MRF

is currently finishing the 2nd year of a 6-year grant project.

Delivery of the MRF Intervention

MRF coordinators, staff, and consultants provided numerous occasions and events for

technical assistance, progress monitoring, and professional development for participating schools

and professional development for literacy educators statewide during year one and into year two

of the grant project. Professional development focused on scientifically based reading research

and the five essential elements of reading, actively engaged educators in learning and reflecting

on their instructional practice, and drew on the expertise of local, state, and national leaders in

literacy. A new component of the professional development in year two was a training series for

participating school literacy intervention specialists. There is ample evidence that MRF met

program requirements during years one and two.

Demographics for Subgrant Schools

The focus of this report is on participating schools rather than districts or local

educational agencies (LEAs). LEAs apply for subgrant funding for particular elementary

schools and LEA staff participate in the MRF professional development and provide resources

and leadership to help support improvements in the participating schools. Yet, the leadership


provided by the school principals, literacy leadership teams, coaches, and interventionists,

together with teacher implementation efforts, may be the critical factors influencing the degree of

improvement in teaching and learning. Thus, the participating schools are truly the “target” of

interest for the MRF interventions. Assessment data are collected and analyzed at the school and

grade level, not district level. Thus, we refer to “subgrant schools” or “participating schools” to

denote schools that are participating in the MRF program through subgrants to their LEAs.

Demographic data for the 17 subgrant schools indicate that the participating schools are

generally small and that enrollment declined from year one to year two in Cohort 1 schools,

resulting in slightly fewer K-3 classrooms and teachers in year two.

Eighteen percent of all K-3 teachers in Cohort 1 schools in year two (range 6%-35%)

were either new to the school or newly reassigned to grades K-3. Teacher turnover

may have a negative impact on capacity to effectively implement reading curricula and

assessments, and may limit the sustainability of program effects over time.

Student demographic data for the 17 subgrant schools and their districts indicate high

rates of student poverty, but average participation rates for special education and limited English proficiency (LEP) programs. Only a

few districts had participation rates that were significantly higher than the statewide rates.

Baseline fourth-grade reading and writing achievement data for the state assessment

(MEA) indicate that reading and writing achievement in most of the subgrant schools lags

significantly behind statewide levels.

Program Implementation in Subgrant Schools

Most of the K-3 educators in Cohort 1 schools participated in the year-long MRF course

in year one, and all literacy coaches participated in a year-long course for coaches. Cohort 1

coaches began delivering the Maine Literacy Partnership (MLP) course to K-3 educators in their


schools in year two, and provided classroom coaching support. School enrollment size presents a

challenge for literacy staffing. The capacity for literacy support in the larger schools declines as

enrollment and the number of classroom teachers increase. Larger Cohort 1 schools have addressed

this problem for year two by hiring an additional literacy coach (in the largest school) and by

delivering the MLP course to half their K-3 staff this year and the remaining half next year.

Three schools had a change in principal leadership between year one and year two. The

lack of stability in principal leadership may have reduced school leadership for literacy and

limited school implementation efforts.

All Cohort 1 schools and six Cohort 2 schools will use the Houghton-Mifflin core reading

program in year two, while four of the ten Cohort 2 schools selected the Scott Foresman program.

In year two, classroom observations and participant surveys will provide additional data on

program implementation at both the school and classroom levels. Results will be reported in

subsequent evaluation reports.

Participant Survey Results

Surveys were conducted in year one with the subgrant school Literacy Leadership Teams

(Cohort 1) and with MRF course participants from subgrant schools (Cohort 1) and from

statewide course sites. Data from both surveys indicate a high level of satisfaction with the

technical assistance, training on reading assessments, and professional development provided by

MRF staff, consultants, and course instructors.

Both subgrant school and statewide course participants reported significant gains in

knowledge of literacy instruction and all five essential elements of reading by the end of the

course. Of the two groups, statewide course participants reported the larger gains in knowledge. Across

topic areas, respondents reported the largest gains in reading comprehension and fluency. Respondents in


some subgrant schools indicated that the professional development experience helped educators

to develop shared understandings and goals around literacy and had increased teacher

collaboration in their schools.

Teachers described some of the changes in their reading instruction, which included:

delivering more systematic and explicit reading instruction, giving more emphasis to fluency and

comprehension, using more hands-on activities, modeling reading strategies for students, and

conducting reading assessments to target students’ reading skills.

The chief challenge to implementation mentioned by subgrant school respondents

centered on time: time to engage in the learning and time to implement the required

changes in instruction and assessment.

Most subgrant school respondents (79.3%) to the MRF Course Survey agreed that the

screening, progress monitoring, and benchmark assessments have informed their reading

instruction during year one. A majority of respondents (61%) agreed that the core reading

program had positive impacts on their students’ reading development by the end of the 1st year

of implementation.

Baseline (Year One) Student Assessment Results

Aggregate results across the seven Cohort 1 schools indicated mixed levels of progress

during year one for phonics and fluency, and strong progress in phonemic awareness. The

evidence is inconclusive with regard to progress during year one for vocabulary and

comprehension, either because the DIBELS measures lacked benchmarks or because there was

only one data point for the TerraNova assessment. These baseline assessment data cannot be

used to determine the impact or effectiveness of the MRF interventions. Subsequent analyses

will compare year one and year two assessment results to see what progress students have made


in their reading skills. The baseline assessment data do provide strong evidence that the

participating schools were in need of assistance to improve student performance in reading.

An analysis of the assessment results disaggregated by school (7 schools representing 7

LEAs) indicated consistent patterns of performance for these schools. Three schools had

consistently low performance on the primary reading measures and two of the three schools also

had some of the largest enrollments for this school sample. One school had higher performance

and also had one of the smallest enrollments.

An analysis of the assessment results disaggregated by student subgroup populations also

revealed some patterns. Male students consistently scored lower than female students on all

measures, but the gap was generally not large. On most measures, the gap was largest in

kindergarten and decreased at each subsequent grade level. Disadvantaged students scored lower

than other students and the gap was somewhat large. On most measures, the gap was largest in

kindergarten and decreased at each subsequent grade level. Special education students scored

significantly lower than other students. On most measures, the gap was smallest in kindergarten

and increased significantly at each subsequent grade level.


Introduction

This report provides a summary of progress for Maine Reading First (MRF), an initiative

coordinated by the Maine Department of Education. The Maine Education Policy Research

Institute is the external evaluator for MRF. This report includes a description of program

activities taking place from the midyear report of March 2005 to the present, and baseline

reading assessment data for students in MRF during the 2004-2005 school year (year one). MRF

is currently finishing the 2nd year of a 6-year grant project.

Part I: Program Activities

Activities Coordinated/Delivered by MRF

Staffing. MRF staffing remained stable during the 1st year and into the 2nd year of the

program. Co-directors Patrick O’Shea and Lee Anne Larsen are assisted by school technical

assistance provider Julie Scammell and statewide professional development coordinator Janet

Trembly. All four individuals provide technical assistance to MRF subgrant schools through site

visits, phone, and email communications. MRF contracts with the Maine Literacy Partnership

(MLP) to train literacy coaches from subgrant schools, with consultant Dr. Janet Spector to

provide training to subgrant school participants on reading assessment, and with representatives

from curriculum publishers to provide training to subgrant school participants on the core

reading programs. Other literacy experts also provide training to educators statewide through

workshops on the five essential elements of reading.

Program activities for subgrant schools. Seven subgrant schools (Cohort 1) have

completed their 1st year in MRF and are beginning their 2nd year. Ten new subgrant schools

(Cohort 2) have just begun their participation during summer-fall 2005. Of the 75 local


educational agencies (LEAs) eligible to apply to the grant project in year two, 10 LEAs applied

and were awarded subgrants in spring 2005. MRF staff members made monthly site visits to

Cohort 1 schools in year one, and scheduled site visits in these schools every 6 weeks for year

two. MRF staff members scheduled monthly site visits to Cohort 2 schools for year two. Site

visits were used to provide technical assistance and to monitor school progress in program

implementation. Orientation and training meetings with each cohort during summer-fall 2005 are

described as follows:

Cohort 1 Meetings

• “Leading with the End in Mind” summer workshop for administrators on 7/15/05. This was a full-day workshop. The focus was on understanding how scientifically based reading research impacts K-3 instruction and assessment, using data to improve student achievement, using grade-level team meetings to improve reading instruction, and providing constructive feedback to support teacher growth.

• Orientation meeting for Cohort 1 school literacy leaders (principal, coach, and

interventionist) on 9/9/05 outlining activities, schedule, and goals for year two of the project

• Meeting with school literacy leaders (principals, coaches, interventionists, and a few K-3

teachers), facilitated by consultant Janet Spector on 10/27/05, focusing on assessment data analysis and using data to inform instruction and assessment practice

Cohort 2 Meetings

• Orientation meeting for Cohort 2 schools on 5/19/05

• Training on core reading program facilitated by publishers (Houghton-Mifflin and Scott Foresman) in June 2005

• “Leading with the End in Mind” summer workshop for administrators on 7/15/05 (see above description)

• Training for school literacy leadership teams on 8/16/05

• Training on Houghton-Mifflin core curriculum—The publisher provided a day of training to the six schools that selected this program in June 2005. ERRFTAC (regional technical assistance provider) delivered another day of training on 8/23/05. The publisher will provide additional training during site visits in November and December 2005, and ERRFTAC will provide another day of training during winter 2006.

• Training on Scott Foresman core curriculum—The publisher provided a day of training to

each of the four schools that selected this program in June 2005 and another day for all four schools together in August 2005. The publisher will provide additional training to each school in November and December 2005.

• Meeting with K-3 teachers and literacy leaders (principals, coaches, interventionists, and

a few K-3 teachers), facilitated by consultant Janet Spector on 10/28/05, focusing on assessment data analysis and using data to inform instruction and assessment practice

Literacy Leadership Team members in Cohort 1 were surveyed at the end of year one for

feedback on the technical assistance and professional development activities provided by MRF,

as well as members’ perceptions of early impacts of the program on reading instruction and

student reading achievement. Results of the survey are presented later in this report.

In addition to the above professional development activities, tailored to the needs of

subgrant schools in their 1st or 2nd year of the program, subgrant school participants were

invited to participate in other literacy events. Cohort 2 participants and newly hired educators in

Cohort 1 schools participated in the summer institutes that were organized primarily to target

non-subgrant school K-3 educators statewide. Approximately one third of the summer institute

participants were from subgrant schools. Also, three of the seven Cohort 1 schools sent teams of

four educators to attend the National Reading First conference in New Orleans in July 2005.

Cohort 1 and 2 participants will also be invited to attend state-level workshops on DIBELS,

vocabulary, and comprehension during the 2005-2006 school year.

A new component of the professional development in year two was a training series for

subgrant school literacy intervention specialists. MRF staff members, along with consultant Dr.

Janet Spector and ERRFTAC, provided five sessions, each a full day of training, beginning on

11/15/05 and continuing throughout the school year. This component was added to provide


additional support for literacy intervention specialists and to help schools implement the 3-Tier

Model of Reading Intervention.

At the end of year one (spring 2005), 5 of the 7 Cohort 1 schools began to use Palm Pilot

technology to administer and manage DIBELS assessment data. Two schools did not shift to

Palm Pilots in the spring. In 1 of these 2 schools, the principal resigned before the end of the

school year. In year two, all 7 Cohort 1 schools used the Palm Pilots, and approximately 8-9 of

the 10 Cohort 2 schools began training and piloting the Palm Pilots in their schools. To support

the transition from paper to wireless technology, one to three educators (literacy coach,

interventionist, and teacher leaders) from Cohort 1 schools participated in Wireless User Group

meetings throughout the 2005-2006 school year. The meetings were run by consultants from

Wireless Generation, the vendor that provides software for the Palm Pilots. Wireless Generation

began training Cohort 2 schools to use the Palm Pilots in January 2006.

Coursework for subgrant school educators. The central piece in the MRF professional

development is a year-long MRF course for K-3 educators and a year-long course for coaches.

Course syllabi have not changed significantly from year one to year two. The MRF course in

year two will include video segments of classroom reading instruction. The video segments were

recorded in subgrant and non-subgrant schools during year one. A survey of the MRF course

was conducted with course participants from subgrant schools and from other course sites around

the state at the end of year one. MRF course participants were surveyed for their feedback on the

course and to obtain participants’ perceptions of early impacts on teaching. Results from this

survey are presented in Part IV of this report.

Literacy coaches in Cohort 1 schools attended an MLP course at the University of Maine

in year one of the program (2004-2005). This course prepared participants for the responsibilities


of coaching teachers in their schools during year two of the program. An MLP trainer provided additional training and site visits to Cohort 1 coaches in year two (2005-2006), when literacy coaches began to deliver an MRF course to K-3 teachers in their schools.

Literacy coach staffing remained stable from year one to year two for Cohort 1 schools.

The largest subgrant school (School E) is in a district that participates in MLP on a district-wide basis. The district realized it needed more than one literacy coach to serve its several elementary schools. In year two, it hired an additional coach who had been trained through the MLP. This

coach works in School E, while another coach serves the other schools. Both coaches have

similar job duties.

In year two, literacy coaches in Cohort 1 schools will deliver the MLP course to K-3

teachers in their schools. Because of their large enrollments and staff sizes, three Cohort 1 schools

decided that their literacy coaches will deliver the MLP course to approximately half the K-3

staff in year two, and to the remaining half in year three. In one school, most of the K-3 teachers

(82%) participated in the course in year one. Course enrollment size is kept small (no more than

12 educators) intentionally so that coaches can better assist the teachers in the course and

through coaching. Teachers who do not take the course until year three will still be supported in

their learning by the coach through grade level team meetings and individual consultation.

In the four remaining Cohort 1 subgrant schools, all K-3 educators and literacy staff

members are asked to participate in the MLP course provided by their literacy coaches, and

participants receive three graduate credits through the University of Maine. Literacy coaches in

the subgrant schools will continue to offer the MLP course in subsequent years to new teachers

in their schools as needed. The syllabus for the MLP course provides a more in-depth review of

scientifically based reading research, the five essential elements of reading, and applications of


literacy concepts to classroom instruction. Participants videotape three of their own reading

lessons to reflect on their own teaching, use student observation and on-going assessment to

inform instructional decisions, participate with the literacy coach in coaching sessions, and plan

literacy instruction to meet diverse needs of students.

Literacy coaches in Cohort 2 schools will attend the MLP course at the University of

Maine this year with trainer Lucie Boucher. An additional trainer will be hired. To increase

program coordination and coherence, MRF staffer Julie Scammell will work closely with the

MLP trainers to ensure that the core reading program is well integrated with the MLP

framework. Kindergarten through grade 3 educators in Cohort 2 schools will attend the MRF

course at each of the 10 school sites facilitated by instructors trained by MRF.

Statewide program activities. MRF staffer Janet Trembly coordinates the statewide

professional development activities. There are several components to the statewide effort as

described below:

• The MRF course is offered to K-3 educators across the state in 15 school sites in year two (there were 13 sites in year one). There are 15 additional sites on the waiting list to provide the course in 2006-2007. All courses will consist of 17 sessions with the exception of one school site that will offer a modified version of 12 sessions. MRF staffers Janet Trembly and Julie Scammell trained 13 new instructors on 5/9/05, 7/12-13/05, and 8/9-10/05. All MRF instructors will attend 4 additional training days throughout the year.

• MRF staff collaborated with Maine Public Broadcasting Network in year one to develop video segments of reading instruction to be used in the MRF course during year two.

• Statewide workshops on the five essential elements of reading and scientifically based reading research were offered throughout the year. Cathy Block and John Mangieri facilitated a full-day workshop on vocabulary and comprehension research and instructional implications for 120 participants on 5/11/05. Karen Burke will lead a 1-day workshop on vocabulary and comprehension in March 2006. Educators from subgrant schools may also choose to attend these workshops.

• One-day conference for preservice teachers in March 2006 to focus on the five elements of reading. Facilitators will be MRF staff and MRF course instructors. All Maine universities and colleges with teacher preparation programs will be invited to send participants. Total attendance will be limited to 150 participants.

• Two half-day workshops on using DIBELS assessments and analyzing data results led by

the MRF Program Coordinators in spring 2006.

• Summer institutes—Two summer institutes were provided in summer 2005 at different locations in the state. Each institute consisted of 2 days of professional development on scientifically based reading research and the five essential elements of reading.

• Seminar for school leaders—“Leading with the End in Mind” was held 7/15/05 for

school and district administrators from both subgrant schools and from other schools statewide. The focus was on understanding how scientifically based reading research impacts instruction and assessment and using data to improve student achievement. ERRFTAC consultants facilitated the meeting, and a total of 62 participants attended.

• MRF electronic newsletter “Literacy Links”—distributed in five editions during the

school year and accessible through the MRF website. Subscription information was sent to all elementary principals statewide. MRF distributes the newsletter to over 400 subscribers, many of whom are principals who forward the newsletter electronically to educators within their schools.

• MRF website (www.maine.gov/education/rfhomepage.htm)—in place during year one to

provide information on MRF activities to educators statewide.

As with the subgrant schools, the MRF year-long course, summer institutes, and

workshops were the primary vehicles for professional development in literacy that MRF

provided for educators statewide. Participants in the MRF course were surveyed to obtain

feedback on the course, and teachers were asked about their perceptions of early impacts on their

teaching. Results are presented later in this report.

The summer institutes included the Level I Institute (6/30-7/1/05) in Rockport, Maine

(midcoast region) and the Level II Institute (6/23-6/24/05) in Bethel, Maine (southwest region).

Both institutes served K-3 teachers, paraprofessionals, special educators, and administrators; and

presenters included MRF staff, literacy experts, and representatives from subgrant schools. The

Level I Institute was designed to introduce educators to Reading First concepts and principles.

Thus, the majority of the 93 participants were from non-subgrant schools, while approximately


30 educators from subgrant schools also attended. The Level II Institute provided more

flexibility for participants to select the sessions they wished to attend. About one third of the 106

participants were from subgrant schools.

MRF Co-Directors Patrick O’Shea and Lee Anne Larsen convened a second meeting of

the State Reading First Leadership Team on 9/29/05. The team consists of education

stakeholders at the state level. The half-day session provided members with an update on MRF

program activities. Additional updates will be shared with team members by email, and a third

meeting will be scheduled at the end of year two in spring 2006.

The Literacy Faculty Group, consisting of MRF staff and higher education literacy

faculty from across the state, met on 5/20/05, 10/6/05, and 11/4/05 to continue its work on preservice course review and alignment. The group discussed participation in a national

collaborative for literacy faculty and the 3-Tier Reading Intervention Model, and planned a

conference for preservice teachers for March 2006.

Section Summary

MRF coordinators, staff, and consultants provided numerous occasions and events for

technical assistance, progress monitoring, and professional development for subgrant schools and

professional development for literacy educators statewide during year one and into year two of

the grant project. Professional development focused on scientifically based reading research and

the five essential elements of reading, actively engaged educators in learning and reflecting on

their instructional practice, and drew on the expertise of local, state, and national leaders in

literacy. The list of major accomplishments showing evidence of meeting program requirements for delivery of the MRF program includes the following:

• MRF provided technical assistance and progress monitoring through monthly site visits and frequent communications with subgrant schools.


• MRF worked closely with school leadership teams to ensure that teams were informed about all components of program participation.

• MRF provided additional support to one subgrant school where the principal resigned prior to the end of the school year.

• In year one, MRF provided training to subgrant schools on interpreting and using student assessment data to inform instructional decisions. This effort increased in year two, based partly on the feedback participants provided on surveys.

• MRF provided a course to K-3 educators and administrators in 20 sites in year one (7 of

which were subgrant schools), and in 25 sites in year two (10 of which are subgrant schools). Literacy coaches attended a course through the MLP. Both courses took place over an entire academic year. In year two, literacy coaches in Cohort 1 schools began delivering a course to K-3 teachers in their schools and will begin to provide coaching support to teachers.

• MRF collaborated with Maine Public Broadcasting Network to produce video segments of classroom reading instruction to use in the MRF course in year two.

• MRF designed the technical assistance and professional development for year two to meet the different needs of Cohort 1 and 2 schools and educators attending the summer institutes.

• MRF added new components for professional development in year two, to provide extra

support to participants in certain job roles. These included training for administrative school leaders and for school literacy intervention specialists.

• MRF implemented Palm Pilot technology for administering and recording student assessment data in year one, and is expanding this effort in year two.

• MRF increased its outreach to educators statewide through implementation of a website and an electronic newsletter.

• MRF convened a second annual meeting of the State Reading First Leadership Team to inform education stakeholders at the state level about MRF activities.

• MRF participated in meetings of the Literacy Faculty Group, which is continuing to review preservice education courses for alignment with Reading First principles and scientifically based reading research content.


Part II: Demographic Data for MRF Subgrant Schools

This section of the report presents a description and analysis of demographic data for

MRF subgrant schools in years one and two. These data include: enrollment data, data on K-3

teacher staffing turnover from year one to year two, teacher demographic data, student

demographic data, and fourth-grade student reading and writing achievement data.

School Size

Table 1 shows the school configuration, K-3 enrollment ranges, and number of K-3

classrooms. Enrollment ranges are used to maintain school confidentiality. The 17 subgrant schools (1 of which is housed in two buildings) have a variety of school configurations, ranging from a K-3 to a K-12 grade span. The most common school configuration is K-6 (7 schools, or 41%), followed by K-8 (3 schools, or 18%). Six of the 10 Cohort 2 schools have early childhood (4-year-old, preschool) programs.

Following a general demographic trend throughout Maine, K-3 enrollment for Cohort 1

schools either held steady (4 of the 7 schools) or declined slightly (2 schools) from year one to

year two. One school had a more dramatic decline of approximately 60 students (17%). As

these data indicate, the 17 subgrant schools range from very small (40 students, 3 classrooms) to

medium-sized (300 K-3 students, 16 classrooms). Median K-3 enrollment was approximately

200 in 2004-2005 and declined to 190 in 2005-2006. The median number of K-3 classrooms was

12 in 2004-2005 and declined to 10 in 2005-2006.


Table 1. Subgrant School Size

School   Grade           K-3 Enrollment Range   Change in K-3 Enrollment   K-3 Classrooms   K-3 Classrooms
         Configuration   2005-2006              (Cohort 1 only)            2004-2005        2005-2006

Cohort 1:
A        K-8             50-99                  -10                        6                5
B        K-6             200-249                -10                        13               11
C        K-8             50-99                  0                          5                4
D        K-5             200-249                0                          12               11
E        K-6             300-349                -60                        16               16
F        K-5             250-299                0                          17               17
G        K-6             150-199                0                          11               9

Cohort 2:
H        K-6             200-249                —                          —                12
I        EK-6            50-99                  —                          —                6
J        EK-4            250-299                —                          —                14
K        ED-12           50-99                  —                          —                4
L        EK-6            150-199                —                          —                10
M        EK-8            1-49                   —                          —                3
N        K-3             200-249                —                          —                12
O        K-5             150-199                —                          —                10
P        EK-6            150-199                —                          —                10
Q        K-3             150-199                —                          —                13

Mean (2004-2005): 11 K-3 classrooms. Median (2004-2005): 12 K-3 classrooms.
Mean (2005-2006): 170 K-3 students, 10 K-3 classrooms. Median (2005-2006): 190 K-3 students, 10 K-3 classrooms.

Source: Subgrant schools.
Note. N = 7 schools for Cohort 1 in 2004-2005. N = 17 schools for Cohorts 1 and 2 in 2005-2006. Schools B, E, G, H, L, and Q have half-day kindergarten classes (morning and afternoon).

Teacher Demographic Data

Table 2 shows demographic data for teachers in the 17 subgrant schools for 2005-2006

(year two). A comparison of teacher names on Cohort 1 staffing rosters from year one to year

two revealed that between 6% and 35% of the K-3 teachers (17.6%, or 13 teachers across all

seven schools) are either new to their school or new to grades K-3 in their school. Most of these teachers have not participated in the MRF course as their peers did during year one. Five of the 13 teachers previously taught in grades other than K-3 in their current schools, while the remaining eight are new to their schools.


Since many teachers are new to MRF and have not received full training in the program, there will be variation in teacher knowledge and capacity to effectively implement the core

reading program and assessments. Continued turnover in teaching staff from year to year could

reduce the impact of the professional development component of the intervention, unless schools

find a way to sustain that training for newly hired or newly assigned K-3 teachers and to foster a

school culture that values professional growth and teacher learning. Subgrant schools are

supporting new teacher hires in a variety of ways. Some schools have sent new teachers to the summer institute (Level I), the Core Program Training, or technical assistance training, or have had them participate in the MRF course at a nearby course site. Literacy coaches in subgrant schools

will also continue to provide the MLP course in their schools in subsequent years for new

teachers as needed.

Teacher demographic data indicate that the subgrant schools employ a largely veteran teaching staff, which is consistent with demographic patterns across the state. Fifty-five percent of the subgrant school teachers have 16 or more years of teaching experience, while 13% of the teachers have 11 to 15 years of teaching experience.

Most K-3 subgrant teachers have professional teaching certification. Only 7.5% of the

teachers (12 teachers) had provisional certification during year two. The percentage of teachers

with advanced educational degrees is fairly low. Only 24% of the K-3 teachers had a master’s

degree or higher level of educational attainment in year two.


Table 2. Teacher Demographic Data for 2005-2006 (Year Two)

School        K-3        New K-3    % New     Mean Yrs    Median Yrs   % 11-15 Yrs   % 16+ Yrs    % Provisional   % Master's
              Teachers   Teachers   in Yr 2   Teaching    Teaching     Experience    Experience   Certification   or Higher
              (Yr 2)     (Yr 2)               in School   in School

Cohort 1:
A             5          1          20%       15          17           20%           60%          20%             40%
B             11         3          27%       10          7            9%            63%          27%             27%
C             4          1          25%       13          8            0%            25%          0%              25%
D             11         0          0%        18          22           9%            82%          0%              0%
E             17         6          35%       9           10           0%            76%          0%              24%
F             17         1          6%        15          10           12%           47%          12%             29%
G             9          1          11%       15          12           22%           44%          0%              11%

Cohort 2:
H             9          —          —         12          9            40%           30%          0%              30%
I             5          —          —         24          20           20%           80%          0%              20%
J             14         —          —         8           7            21%           21%          21%             21%
K             3          —          —         10          6            0%            33%          0%              33%
L             10         —          —         16          17           20%           70%          10%             30%
M             3          —          —         21          22           0%            67%          0%              0%
N             12         —          —         17          15           17%           67%          0%              17%
O             9          —          —         14          7            0%            56%          0%              11%
P             10         —          —         17          18           0%            70%          20%             50%
Q             12         —          —         8           6            8%            33%          0%              33%

All Schools   161        13         17.6%     13          12           13%           55%          7.5%            24%

Source: Subgrant schools.
Note. Data include only K-3 regular classroom teachers. N = 74 K-3 teachers for Cohort 1 and N = 87 K-3 teachers for Cohort 2 in year two (2005-2006). Percentage values shown for all schools were computed by finding the sum in each category and then dividing the sum by the total number of teachers in these schools (161). A similar procedure was used to compute the mean number of years teaching for all schools.
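The note to Table 2 specifies how the all-schools percentages were computed: category counts are summed across schools and then divided by the total number of teachers, rather than averaging the per-school percentages. The following minimal sketch illustrates the difference between the two approaches, using the Cohort 1 new-teacher counts reported in Table 2 (school labels and counts are taken from the table):

```python
# Aggregate percentage per the Table 2 note: sum the category counts across
# schools, then divide by the total number of teachers. This weights each
# school by its size, unlike a simple mean of the per-school percentages.
new_teachers = {"A": 1, "B": 3, "C": 1, "D": 0, "E": 6, "F": 1, "G": 1}
k3_teachers = {"A": 5, "B": 11, "C": 4, "D": 11, "E": 17, "F": 17, "G": 9}

total_new = sum(new_teachers.values())      # 13 new K-3 teachers
total_k3 = sum(k3_teachers.values())        # 74 K-3 teachers in Cohort 1
aggregate_pct = 100 * total_new / total_k3  # matches the 17.6% in Table 2

# A naive mean of the per-school percentages would weight a 4-teacher school
# the same as a 17-teacher school, giving a slightly different figure.
naive_mean = sum(
    100 * new_teachers[s] / k3_teachers[s] for s in new_teachers
) / len(new_teachers)

print(f"aggregate: {aggregate_pct:.1f}%")  # aggregate: 17.6%
print(f"naive mean: {naive_mean:.1f}%")    # naive mean: 17.8%
```

The same size-weighted approach applies to the experience, certification, and degree columns in the all-schools row.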


Student Demographic and Achievement Data

Table 3 presents student demographic data for subgrant schools. The 17 subgrant schools have high levels of student poverty, as indicated by the percentage of students eligible for the free and reduced school lunch program in 2005-2006; poverty rates range from 21% to 73% (mean

of 55%). Only 3 schools have a poverty rate that is close to or below the statewide rate of

34.38%. Most schools have a significantly higher poverty rate. Free and reduced school lunch

data shown in Table 3 are at the level of the school.

Data for special education and Limited English Proficiency (LEP) participation rates are

only available at the level of the district for 2004-2005. Participation rates for special education

were close to the statewide rate of 16.61% for most of the 17 subgrant school districts. Special

education participation rates ranged from 11% to 31% (mean of 18%). Three districts had rates

close to 20%, a fourth district had a rate of 24%, and a fifth district had a rate of 31%.

Eleven of the 17 subgrant school districts had LEP students. For these 11 districts, the

participation rate was generally low, ranging from 0.10% to a high of almost 25% (the mean for

all 17 districts is 2%). Three districts had a participation rate that is almost double the state rate

of 1.54%, and one district had a significantly higher participation rate of almost 25%.

The percentage of students meeting or exceeding the standards on the Maine Educational

Assessment (MEA) for fourth-grade reading was averaged for the most recent 3 years for which

data are available (2002-2005). These are essentially baseline data. The 3-year averaged

percentage of students ranged from 18% to 56% (mean of 42%) across the 15 schools

participating in the fourth-grade MEA. Fourth-grade student reading achievement was lower

than the statewide level in most of the subgrant schools. Only two schools had slightly higher


reading achievement than the state level of 50.67%. Many of the schools had significantly lower

levels of reading achievement.

In writing, fourth-grade student achievement on the MEA was also lower than the

statewide average. The percentage of students meeting or exceeding state writing standards (averaged over 3 years) ranged from 0% to 10% (mean of 5.27%) for the 15 subgrant schools

participating in the fourth-grade MEA, which is significantly lower than the statewide level of

11.33%.

Table 3. Student Demographic and Achievement Data for Subgrant Schools

School      % Free/Reduced   % Special      % Elementary   % Meeting/Exceeding     % Meeting/Exceeding
            Lunch            Education      LEP            Reading Standards,      Writing Standards,
            (2005-2006)      (2004-2005)    (2004-2005)    Gr. 4 MEA (3-yr avg.    Gr. 4 MEA (3-yr avg.
                                                           2002-2005)              2002-2005)

Cohort 1:
A           41.81%           17.95%         0.00%          51.67%                  9.00%
B           48.06%           10.95%         0.25%          30.00%                  3.00%
C           72.73%           20.73%         0.13%          55.00%                  0.00%
D           35.02%           16.77%         0.15%          48.33%                  7.67%
E           48.78%           19.76%         2.85%          42.33%                  7.67%
F           57.98%           17.21%         0.10%          31.00%                  5.00%
G           61.70%           18.29%         0.31%          39.00%                  2.67%

Cohort 2:
H           32.28%           16.57%         2.95%          41.33%                  10.33%
I           64.86%           17.05%         24.65%         51.67%                  7.00%
J           21.23%           12.83%         0.00%          47.33%                  9.33%
K           68.57%           31.21%         0.00%          18.00%                  2.67%
L           55.03%           16.32%         0.00%          55.67%                  6.33%
M           70.45%           16.43%         0.00%          43.67%                  2.33%
N           67.72%           12.28%         0.00%          No data                 No data
O           63.45%           19.42%         0.86%          39.67%                  4.67%
P           69.25%           24.16%         2.85%          41.33%                  1.33%
Q           48.97%           10.95%         0.25%          No data                 No data

Mean        54.58%           17.58%         2.08%          42.40%                  5.27%
Statewide   34.38%           16.61%         1.54%          50.67%                  11.33%

Source: Maine Department of Education.
Note. Data shown are the most recent that were available at the time of this report. Data for free and reduced lunch are at the school level. Data for special education and LEP are available at the district level. Schools N and Q do not have fourth-grade students and therefore do not have MEA data.


In sum, the subgrant schools have high rates of student poverty and low rates of reading

and writing achievement at the fourth-grade level when compared with statewide data. Low

student performance on the state assessment was one of the criteria for participation in the MRF

project. MEA achievement data should be considered essentially baseline data. Students taking

the fourth-grade MEA in 2004-2005 had not participated in the MRF program in K-3

classrooms.

It is reasonable to expect that student achievement in the subgrant schools will improve

over time as K-3 students receiving the MRF interventions move up to the fourth grade. Third-grade students participating in the first year of the MRF program in 2004-2005 entered fourth

grade this school year and will take the MEA in spring 2006. MEA data for these students will

be available in late fall 2006.

It is also possible that the program will have “spillover” effects in grades above third

grade as some schools use their own funding to include teachers from higher grades in the MRF

course or other MRF professional development events. Teachers will also learn new instructional

approaches from their peers through informal interactions. The evaluator will continue to collect

and analyze fourth-grade MEA data over the course of the project to see if students’ reading and

writing achievement improves over the same period.

Section Summary

Demographic data for the 17 subgrant schools indicate that the participating schools are

typically configured as K-6 schools and that enrollment size is generally small. Enrollment

declined from year one to year two in Cohort 1 schools, resulting in slightly fewer K-3

classrooms and teachers in year two.


Most K-3 teachers in the subgrant schools have professional certification and most are

veteran teachers. Across all 17 schools, 68% of the K-3 teachers have 11 or more years of teaching experience, and 55% have 16 or more years of teaching experience. One quarter of the teachers

have a master’s degree or higher educational attainment. Turnover in some Cohort 1 schools

was higher than in others. Eighteen percent of the K-3 Cohort 1 teachers are either new to their

schools this year or have been reassigned to K-3 grades. In four of the seven Cohort 1 schools,

between 20% and 35% of the K-3 teachers are new. Teacher turnover may limit the sustainability

of program effects over time. This concern should be addressed at the program level and at the

level of individual schools and districts.

Student demographic data for the 17 subgrant schools and their districts indicate high

student poverty rates, but average participation rates for special education and LEP. Only a few

districts had participation rates that were significantly higher than the statewide rates.

Baseline fourth-grade reading and writing achievement data for the state assessment

(MEA) indicate that reading and writing achievement in most of the subgrant schools lags behind

statewide levels. These data will continue to be examined during the course of the project as K-3

students receiving MRF interventions advance through grade levels and take the fourth-grade

assessment beginning this year.


Part III: Program Implementation

Reading Assessment Practices

At the end of year one (spring 2005), the evaluator conducted surveys to obtain data on

program implementation in the seven Cohort 1 schools. A survey on Maine Reading First

Assessment Information (Appendix A) was emailed to the person in each school designated as

the primary coordinator for MRF assessment data. This person was typically the school literacy

coordinator and was sometimes the literacy intervention specialist or the principal. Respondents

were asked to complete tables providing information for K-3 grades on diagnostic assessments

used, progress monitoring assessments used, and method of recording DIBELS results. The

evaluator also obtained information about assessment practices in the subgrant schools from

MRF staff who work closely with each school and monitor program implementation.

All seven Cohort 1 schools administered DIBELS subtests for screening, progress monitoring, and outcome measurement, as required by the MRF program. Some schools

elected to administer the Peabody Picture Vocabulary Test for screening, and some schools elected

to administer the Early Reading Diagnostic Assessment (ERDA) for diagnostic assessment. All

seven Cohort 1 schools administered the TerraNova as an outcome measure in grades 1-3, and two

schools elected to administer this assessment in kindergarten. The evaluator collected student

assessment data for DIBELS and TerraNova reading assessments (results are presented in Part V

of this report). MRF recommended two different diagnostic assessments to subgrant schools—the

ERDA and Fox in a Box. All subgrant schools elected to use the ERDA, as this assessment covers

all five essential elements of reading and all four grade levels (K-3). Subgrant schools

administered the ERDA as needed with a small number of students. The evaluator did not collect

data results for diagnostic assessments.


Table 4 below shows the required DIBELS and TerraNova assessments that were

administered in each school.

Table 4. DIBELS and TerraNova Assessments Required by MRF (Year One)

Grade   Reading Component    Fall Screening       Winter Benchmark   Spring Benchmark/
                             Assessments          Assessments        Outcome Measures

K       Phonemic Awareness   ISF, PSF (midyear)   ISF, PSF           PSF
K       Phonics              LNF, NWF (midyear)   LNF, NWF           LNF, NWF
K       Vocabulary           WUF                  WUF                WUF

1       Phonemic Awareness   PSF                  PSF                PSF
1       Phonics              LNF, NWF             NWF                NWF, TerraNova (Plus)
1       Vocabulary           WUF                  WUF                WUF, TerraNova (Plus)
1       Fluency              ORF (midyear)        ORF                ORF
1       Comprehension        RTF                  RTF                RTF, TerraNova

2       Phonics              NWF                  —                  TerraNova (Plus)
2       Vocabulary           WUF                  WUF                WUF, TerraNova (Plus)
2       Fluency              ORF                  ORF                ORF
2       Comprehension        RTF                  RTF                RTF, TerraNova

3       Phonics              —                    —                  TerraNova (Plus)
3       Vocabulary           WUF                  WUF                WUF, TerraNova (Plus)
3       Fluency              ORF                  ORF                ORF
3       Comprehension        RTF                  RTF                RTF, TerraNova

There was considerable variation in the information provided by the seven Cohort 1

schools on the assessment survey. When asked what diagnostic assessments had been used in

year one for reading, three schools indicated they used only the ERDA, one school indicated they

used only the Woodcock Reading Mastery Test in grades 1-3, one school said they used a

combination of both the ERDA and the Woodcock Reading Mastery Test in grades K-3, and one

school said they did not conduct diagnostic assessments. (This school was one of three schools

that had a change in principal leadership from year one to year two.) Subgrant schools planned

to administer the ERDA as a diagnostic assessment. These data were not collected as part of the

evaluation.


The variation in responses to the survey question on diagnostic assessment indicates that

subgrant school educators still have some confusion about which assessments are to be used for

which purposes (screening, progress monitoring, diagnostic assessment, or outcome measures).

In year one, interventionists from Cohort 1 schools were trained by ERRFTAC on the ERDA.

However, this training was at an introductory level and did not cover the topic in sufficient

depth. Subgrant schools were urged to administer the ERDA selectively for a few students and

only after the screening and progress monitoring assessments had been conducted. Some schools

may have thought they were required to use the diagnostic assessment more extensively. Cohort

1 and 2 schools received additional training on the ERDA in year two, which will help to clarify

these issues.

Cohort 1 schools indicated that the number of students assessed and/or identified for

targeted intervention based on the diagnostic assessments ranged from 0 to 4 students across the

four grade levels with a higher number of students being assessed and identified in grades 2-3.

Results from this survey item are shown in Table 5.

Table 5. Diagnostic Assessment Practices for Cohort 1 Schools

Grade Level | Number of Schools Using ERDA | Number of Schools Using Woodcock | Mean Number of Students Assessed | Mean Number of Students Identified
K | 3 | 0 | 0.33 | 0.17
1 | 1 | 1 | 0.67 | 0.50
2 | 2 | 2 | 1.17 | 0.83
3 | 2 | 2 | 1.83 | 1.67

Source: Subgrant schools. Note. N = 6 Cohort 1 schools.

When asked what assessments were used for progress monitoring in reading for grades

K-3, all seven Cohort 1 schools indicated they used DIBELS subtests. Three of the schools

indicated they also used the TerraNova reading assessment for the purpose of progress

monitoring in all four grades although this assessment was required only as an outcome measure.


One school also used the Peabody Picture Vocabulary Test (PPVT) for kindergarten

students. One school began progress monitoring late in the school year. In this school, the

principal resigned prior to the end of the school year, and there were other major district-wide

initiatives occurring during the same year (e.g., accreditation and curriculum and assessment

development). Thus, it was a challenge for this school to fully implement the MRF program

during year one of their participation.

The number of students assessed for progress monitoring varied considerably across the

seven schools, ranging from 8 students to 113 students. The number of students identified for

targeted intervention based on progress monitoring ranged from 4 students to 20 students across

six schools and was not reported for one school. Typically school literacy specialists or

interventionists administered progress monitoring assessments, but classroom teachers (four

schools) and literacy coaches (three schools) also administered the assessments in both large and

small schools.

In year one, all seven schools administered and recorded the DIBELS assessments in fall

2004 and January-February 2005 in paper format. Five schools began to use Palm Pilots for this

purpose in spring 2005. The five schools included three larger and two smaller schools (Schools

A, B, C, D, and F). In year two, all but one of the Cohort 1 schools used Palm Pilots to record

DIBELS data in fall 2005, and the remaining school will use the Palm Pilots in January-February

2006. The software provider helped to train teachers to use the new technology.

Training for the Palm Pilots began with the technology coordinators from Cohort 1

schools. Schools then selected two to three K-3 educators to attend a 2-day training session.

These assessment leaders then trained other K-3 teachers in their schools. Some Cohort 1 schools

trained all their K-3 teachers in spring 2005, while other schools trained only members of their


assessment teams. Assessment teams for literacy include the interventionist, coach, one to two

teachers from grades K-3, and sometimes other literacy support staff. MRF coordinators report

that the Palm Pilot training went well and that consultants from the software provider have been

responsive to questions and training needs. The software provider added a user group for year

two to provide additional support to Cohort 1 educators. Approximately 8-9 of the 10 Cohort 2

schools elected to use Palm Pilot technology during their 1st year of participation in the program,

and the software provider will begin training educators in these schools to use the Palm Pilots in

January 2006.

School Leadership and Staffing Capacity

Change in school administrative leadership also presented a hurdle for some Cohort 1

schools. Three of the seven Cohort 1 schools had a change in principal leadership from year one

to year two. One principal retired, one principal was reassigned within his/her district, and one

principal resigned prior to the end of the school year to assume a different position. In one of

these schools, an assistant principal was promoted to principal. Principals are members of their

school literacy leadership team, and are typically the “point” person for communicating with

teachers and other staff about MRF and for making sure that the school completes tasks required

by the program. Principals participate in MRF training and professional development, such as

the year-long MRF course. It takes time for a new principal to get “up to speed” with the MRF

program requirements, and this can create a lag time in leadership for literacy, implementation of

program components, and response to MRF requests for information or data. Unstable school

leadership could be a factor that limits full implementation and limits improvements in student

outcomes. This relationship will be monitored over time by the evaluator and by the MRF

program coordinators.


School size is another factor that could impact staffing capacity to fully implement all

components of the MRF program. In two of the largest Cohort 1 schools, half of the K-3

teaching staff will participate in the MLP course this year (year two) and the remaining half will

participate next year. This decision will allow the coach/course instructor to better serve course

participants. The largest subgrant school district (School E) participates in MLP on a district-

wide basis. This district hired an additional coach who primarily serves School E, while the

second coach serves other elementary schools in the district.

The impact of staffing size on the literacy coach’s job responsibility is obvious—a larger

teaching staff means that the capacity of a single literacy coach will be stretched more thinly.

The seven Cohort 1 schools vary from very small (one classroom per grade, or four classroom

teachers K-3) to medium-sized (three to five classrooms per grade, or 12 to 20 classroom

teachers K-3). In the very small schools, it is not difficult for a literacy coach and interventionist

to work with all K-3 teachers on a daily or weekly basis, while this might be stretched to

biweekly visits in the larger schools.

Less intense coaching might mean slower change in classroom practices in reading

instruction. Participant surveys in year one included questions about supports and challenges in

the implementation effort as well as perceptions of early impacts on teaching and learning (see

Part IV). Surveys of coaches, classroom teachers, and principals in spring 2006 (end of year

two) will be conducted to evaluate the adequacy of coach staffing, perceptions of impacts from

coaching services, and implementation of the core reading program.

Core Reading Program

All seven Cohort 1 schools elected to implement Houghton-Mifflin’s core reading

program The Nation’s Choice 2005. In Cohort 2, six schools selected the Houghton-Mifflin


program, while four schools selected Scott Foresman’s program. Classroom observations and

participant surveys in year two will provide data on classroom reading instruction and

implementation of the core reading programs. MRF coordinators report that both publishers have

been responsive and helpful in providing training in each of the subgrant schools. Educators

have requested that the publishers develop video material to provide a visual model of teachers

implementing the core programs in real classrooms.

Participation in MRF Course

MRF specified a participation rate of 85% or better for the MRF course as a goal for subgrant

schools. Based on final course rosters checked at the end of the course, only one of the Cohort 1

schools did not meet this goal. (Participation rates cited in the midyear report of March 2005 were

based on information provided by each of the subgrant schools, and have since been corrected using

rosters submitted by the course instructors.) In year two, most schools appear to be meeting or

exceeding the participation goal, although a few teachers have not participated in the course. A lack

of full participation in MRF professional development may pose a leadership challenge for a few

schools. More data on course participation will be collected at the end of year two.

Section Summary

Key findings on program implementation at midyear in year two include the following:

• Survey results on reading assessment practices in Cohort 1 schools indicate a variation in diagnostic and progress monitoring assessments administered for reading, as well as variation in the grades assessed and number of students identified for targeted intervention. Survey responses also indicate some confusion about the different types of reading assessment.

• One school did not administer diagnostic assessments, and another school did progress

monitoring late in the school year. In one of these schools, the principal resigned prior to the end of year one.


• In total, three schools had a change in principal leadership between year one and year two. The lack of stability in principal leadership may have reduced school leadership for literacy and limited school implementation efforts.

• Most of the Cohort 1 schools (five of seven) began to implement technology for administering and recording assessment data in spring 2005, and this effort will be expanded for Cohort 2 schools in year two.

• Most of the K-3 educators in Cohort 1 schools participated in the year-long MRF course in year one, and all literacy coaches participated in a year-long course for coaches.

• Cohort 1 coaches began delivering a course to K-3 educators in their schools in year two, and will provide classroom coaching support.

• All Cohort 1 schools and six Cohort 2 schools will use the Houghton-Mifflin core reading program in year two, while 4 of the 10 Cohort 2 schools selected the Scott Foresman program.

• School enrollment size presents a challenge for literacy staffing. The capacity for literacy

support in the larger schools declines as enrollment and number of classroom teachers increases. Larger Cohort 1 schools have addressed this problem for year two by hiring an additional literacy coach (in the largest school) and by delivering the MLP course to half their K-3 staff this year and the remaining half next year.

Classroom observations and participant surveys in year two will provide additional data on

program implementation at both the school and classroom levels.


Part IV: Participant Survey Data

This section presents data results and analysis for participant surveys conducted in spring

2005 (end of year one). The first survey discussed in this section is the Literacy Leadership

Team Survey, which was mailed to the seven Cohort 1 principals in May 2005. Principals were

asked to administer the survey to team members during a team meeting. The second survey

discussed is the MRF Course Survey that was conducted with course participants from subgrant

schools and from other schools statewide at the end of the course. A pretest version of the survey

was conducted in summer-fall 2004, and results were presented in the evaluator’s midyear

progress report. A posttest version of the survey was conducted in spring 2005. A comparison of

the pretest and posttest survey results is presented here, along with results for additional survey

items included on the posttest survey. Survey instruments and data tables are appended to this

report. For all surveys, quantitative data were analyzed using SPSS (version 12.0) and qualitative

data were typed into a WORD document and analyzed for themes and patterns.

MRF Literacy Leadership Team Survey

The Leadership Team Survey was mailed to each school team in early May 2005.

Completed surveys were received by the evaluator in June. All seven school teams returned

surveys, for a total of 55 surveys returned. Preliminary analysis of the survey results was shared

with the MRF coordinators in September. Most of the seven Cohort 1 subgrant schools have 12

or 13 members on their leadership teams, while the largest team (School E) has 15 members.

Leadership team members generally include the principal, literacy coach, literacy interventionist,

literacy specialist, literacy coordinator, Title 1 teacher, special education teacher, a parent, and

classroom teachers representing grades K-3. The response rate varied across the school teams,

ranging from a return rate of 38% to 100%. This information is shown in Table 6 below.


Table 6. Survey Return Rates by School

School | Leadership Team Size | Number of Surveys Returned | Return Rate
A | 12 | 12 | 100%
B | 12 | 7 | 58%
C | 13 | 8 | 62%
D | 12 | 7 | 58%
E | 15 | 7 | 47%
F | 13 | 5 | 38%
G | 13 | 9 | 69%
Total | 90 | 55 | 61%

The survey included both Likert-scaled items and open-ended items asking respondents for

their feedback on the following areas for year one of the program: technical assistance provided

by MRF, professional development provided by MRF, supports and challenges in school

implementation efforts, perceptions of early impacts of MRF, and general views and levels of

satisfaction with the MRF program. Results for each of these topics are discussed in turn.
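As a quick arithmetic check, the per-school and overall return rates reported in Table 6 follow directly from the team sizes and counts of returned surveys. A minimal sketch of that computation (school labels and counts copied from the table; the rounding convention is an assumption):

```python
# Return-rate arithmetic for Table 6 (counts copied from the table).
team_size = {"A": 12, "B": 12, "C": 13, "D": 12, "E": 15, "F": 13, "G": 13}
returned = {"A": 12, "B": 7, "C": 8, "D": 7, "E": 7, "F": 5, "G": 9}

# Per-school return rate, rounded to the nearest whole percent.
rates = {s: round(100 * returned[s] / team_size[s]) for s in team_size}

# Overall rate: 55 of 90 surveys, about 61%.
total_rate = round(100 * sum(returned.values()) / sum(team_size.values()))

print(rates)
print(total_rate)
```

This reproduces, for example, the 38% (School F) to 100% (School A) range noted above.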

Feedback on technical assistance. One survey item (item 1) asked respondents to rate the

extent to which they found seven technical assistance events in year one useful to their school’s

implementation of the MRF program, using a scale of 1 (not at all useful) to 5 (very useful). A

response choice of 6 (no opinion) was included for respondents (such as parents) who did not

participate in an event. When mean scores for each item were computed for all respondents, the

highest ratings were indicated for the technical assistance with assessment data and support from

the MRF staff and consultants. Mean score ratings in descending order were as follows:

Assistance with DIBELS and TerraNova provided by assessment consultant (4.42); Assistance

with Palm Pilots provided by Wireless Generation (4.28); Phone, email, or written support

provided by MRF coordinators or consultants (4.10); Assistance with reading and interventions

provided by ERRFTAC consultant (4.04); Monthly site visits by MRF coordinators or

consultants (3.93); On-site assistance on core reading program provided by Houghton-Mifflin

(3.44); and Assistance with ERDA assessment tool provided by ERRFTAC consultant (3.35).


Mean rating scores varied across the seven school teams indicating that school teams found some

technical assistance activities more useful than others (see Tables 1-3 in Appendix C for mean

scores and frequencies disaggregated by school team).
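A note on how such item means are typically computed: the "no opinion" choice (6) falls outside the 1-5 usefulness scale, so it would ordinarily be treated as missing rather than averaged in. A minimal sketch of that computation with made-up ratings (whether MRF handled the 6s exactly this way is an assumption):

```python
# Hypothetical ratings for one survey item on the 1 (not at all useful)
# to 5 (very useful) scale; 6 codes a "no opinion" response.
responses = [5, 4, 6, 5, 3, 6, 4]

# Treat the "no opinion" code as missing before averaging.
valid = [r for r in responses if r <= 5]
mean_rating = sum(valid) / len(valid)

print(round(mean_rating, 2))  # prints 4.2 for these made-up ratings
```

Including the 6s as if they were ratings would inflate the mean, which is why dropping them is the standard treatment.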

An open-ended item (item 2) asked respondents for suggestions to improve MRF

technical assistance in year two. Six schools provided a total of 24 written comments. Positive

comments indicated general satisfaction with the support and technical assistance, such as: “I

was extremely satisfied by the timely and thorough responses we received whenever we had a

question.” Suggested improvements for technical assistance were:

• Better explanation of core reading program requirements
• Earlier communication of scheduled MRF meetings and training dates
• More training on Palm Pilots for DIBELS assessment
• More training on assessment data analysis, data interpretation
• User-friendly information on implementing the core reading program

Respondents from one school indicated a desire for additional resources (more leveled books),

and one respondent said there was a need for more intervention materials for students.

Feedback on professional development. One survey item (item 3) asked respondents to rate

the extent to which they found seven professional development events in year one useful in

developing their own understanding of the MRF program using a scale of 1 (not at all useful) to 5

(very useful). A response choice of 6 (no opinion) was included for respondents (such as parents)

who did not participate in an event. When mean scores for each item were computed for all

respondents, the highest ratings were indicated for the course for literacy coaches and the

leadership team training. Mean scores in descending order were as follows: Course for literacy

coaches provided by the Maine Literacy Partnership (4.71); Leadership team training provided by

the Maine Literacy Partnership (4.62); Workshop led by Wiley Blevins (on phonemic awareness,

phonics, and fluency, 4.30); MRF course (3.94); Leadership team training provided by MRF


coordinators and consultants (3.79); Initial training on core reading program provided by publisher

(3.77); and Workshop led by Cathy Block and John Mangieri (on vocabulary and comprehension,

3.31). Again, mean rating scores varied considerably across the seven schools, indicating that school teams found some professional development events more helpful than others (see Tables

4-6 in Appendix C for mean scores and frequencies disaggregated by school team).

One possible explanation for this finding is that two of the largest schools (schools B and

E) had participated in the MLP prior to their participation in the MRF program. Educators in

these schools may have been further along in their learning about literacy at the time they entered

the MRF program.

Further, the shift from using only the MLP framework for literacy instruction to a

combination of the MLP and the MRF program represented a substantial paradigm shift for

teachers. Prior to their participation in MRF, teachers were using leveled texts for guided

reading but did not have a scope and sequence to guide the use of this instructional tool. The

infusion of a core reading program provided a framework for more systematic and sequential

development of reading skills. Teachers had to learn to integrate the use of leveled texts and the

core reading program.

An open-ended item (item 4) asked respondents for suggestions to improve MRF

professional development in year two. The seven schools provided a total of 26 written

comments. Comments included mixed views but were generally positive. Positive comments

included: “Terrific activities were suggested”; “X has done an excellent job with the class”; and

“I was overall very pleased with the professional development activities.” Suggested

improvements for professional development included requests for more emphasis on certain

topics or a different approach:


• More guidance or models for implementing components into the classroom
• Connect professional development with daily instructional activity
• Guidance on scheduling and organizing the classroom
• More in-depth instruction on the five elements of reading
• More training with DIBELS
• Focus professional development topics at specific grade levels
• More individualized approach to professional development
• A support group for interventionists to share ideas

A few respondents indicated a need for earlier communication of training dates, and others said

that full involvement by all K-3 teachers in their school was needed. A few respondents

expressed the view that the course covered too much material and that too much learning was

expected in 1 academic year.

School implementation efforts. Two open-ended items (items 5 and 6) asked respondents

about significant supports or challenges in their school’s implementation of the MRF program in

year one. Respondents from all seven schools responded to the items, with a total of 51 written

responses to each item.

Supports. Respondents emphasized the support and guidance from MRF staff and

consultants, and resources of staffing (e.g., literacy coordinator, coach, and interventionist) as

being the most valuable supports in their school’s implementation effort. Other supports included

MRF training and professional development, leadership teams, principal leadership, knowledge

of literacy specialists, and MRF funding used for substitutes. Some of the written responses to

item 5 were:

The state consultants were extremely helpful. They were prompt in all areas, email, phone conversations, and researching questions and sharing info.

The ongoing encouragement and support from X [MRF staff, consultants]. They were accessible through email or attendance at leadership meetings.

Training on the core reading program. Assistance with assessments.


Excellent leadership by a knowledgeable literacy coach and principal.

Having literacy coaches available for classroom teachers.

Challenges. Respondents said the most significant challenge was time—

specifically time demands for classroom teachers (for learning and implementation). Other

challenges included unclear fit between literacy partnership and the core reading program, stress

or resistance among teachers in their school, lack of strong leadership at the school level, and

pressure from other state and federal mandates in education. Responses showed a pattern within

schools, where some challenges were of more concern to teachers in particular schools than in

other schools. Respondents from four schools emphasized time constraints, and respondents

from four schools focused on the considerable amount of teacher learning expected by the program.

Respondents from two schools cited teacher resistance, and respondents in one of these schools

emphasized lack of leadership in their school. Written responses to item 6 describing

implementation challenges included:

Bringing in a core program so quickly and with so little preparation was very challenging.

Learning how to implement the core reading program alongside the use of leveled reading books.

Learning the program, assessment, and literacy partnership training all at once.

The vast amount of changes in programming, materials, communication, and assessments.

Time! There are many demands from this program that added to a full plate for teachers.

Lack of leadership and vision. Lack of flexibility to change in general.

Changing teacher perceptions about the most effective methodology.


Perceptions of early impacts of MRF. An open-ended item (item 7) asked respondents

about the most positive impacts of their school’s participation in MRF for K-3 reading

instruction. The seven schools responded to this item with a total of 49 written comments.

Respondents described a variety of positive impacts on reading instruction, focusing on common

goals and better consistency in teaching practices and the development of a common language or

understanding about literacy among teachers. Other positive impacts included the provision of:

curriculum and instructional materials, a scope and sequence, professional development,

research-based practices, targeted interventions, and the use of data to inform instructional

decisions. Respondents in some schools noted the positive impacts of increased teacher

collaboration and improved school culture more generally. Some of the comments included:

Making literacy a priority (i.e., materials, training, 2-hour block).

Everyone is on the same page as far as their literacy instruction. Also, assessment is driving our instruction.

More focused instruction in literacy school-wide. Teachers are looking at data and holding themselves accountable.

All teachers are using the same materials, so a common language is in place.

We are consistent with our language, and curriculum, and moving children forward.

Impacts on student reading achievement. Another survey item (item 8) asked

respondents to rate the extent to which three different activities had contributed to improved

reading achievement for K-3 students in year one, using a scale of 1 (no impact) to 4 (large

impact). A response choice of 5 (no opinion) was included. When mean rating scores were

computed for all respondents, respondents gave the highest rating for professional development

on research-based practices in reading instruction (3.41), followed closely by teachers’ use of


research-based practices in reading instruction (3.39), and lastly, teachers’ use of research-based

reading assessments and assessment data (3.31) (see Tables 7-9 in Appendix C for mean rating

scores and frequencies disaggregated by school team).

Results for this survey item suggest that respondents believed teachers’ new knowledge

and use of research-based instructional practices in reading had a slightly greater impact on

students’ reading achievement (moderate to large impact) in year one than did teachers’ use of

research-based assessments and assessment data (moderate impact). This finding is consistent

with respondents’ comments about the challenges of finding time to meet all requirements of the

program (such as administering assessments and using the results to make instructional

decisions) and with respondents’ requests for more training on administering reading

assessments and interpreting assessment results. It may be that teachers did not have sufficient

time or knowledge to interpret assessment data and make appropriate adjustments in their

reading instruction during year one. Additional assistance on reading assessment is currently

being provided to Cohort 1 schools in year two (fall 2005) and will continue to be provided to

subgrant schools during their 1st year of participation in the program.

Classroom observations by trained literacy specialists during the 2nd year of the project

will provide data on instructional practice and teachers’ use of the core reading program.

Observations will be made in randomly selected grade 1 through 3 classrooms in both Cohort 1

and Cohort 2 schools at two points in time during year two (November 2005 and April 2006).

These data are currently being collected by the evaluator and will be reported in next year’s annual

report (November 2006). Analysis of student assessment data over a 2-year period will provide

objective data on program impacts on student reading achievement. This report includes baseline


student assessment data for year one. Data from year one and year two will be compared and

reported in the November 2006 report.

Views about MRF. Survey item 9 asked respondents to rate their overall level of

satisfaction with the MRF program using a scale of 1 (very dissatisfied) to 5 (very satisfied).

When mean rating scores were computed for all respondents, the overall mean score was 3.96.

However, mean rating scores varied considerably across school teams, ranging from a mean of

3.0 in two schools to a mean of 5.0 in one school. Across the 52 participants who responded to

this item, 77% indicated they were “somewhat satisfied” or “very satisfied” overall with the

program (see Tables 10-11 in Appendix C).

When respondents were asked (item 10) if they felt their school should continue to

participate in the MRF program, 82% of all respondents said “yes,” 7% said “no,” and 11% said

“unsure” (see Tables 12-13 in Appendix C). There was considerable variation in responses

across the seven school teams. In three schools, all respondents said “yes,” while in one school,

half the respondents said “no” or “unsure.” In describing why they would choose to continue in

the program or not, respondents offered a total of 46 written responses. Most respondents said

they felt their schools should continue in the MRF program because of a variety of positive

impacts they perceived from the school’s participation. Reasons cited for continuing in the

program included:

• Perceived improvement in teachers’ understandings, instructional practices, collaboration, and school climate

• Provision of professional development and instructional materials
• Perceived improvements in instructional effectiveness
• Perceived improvements in students’ reading achievement


A few respondents (4) in two schools indicated they felt their schools should not continue in the

program, and a few respondents (6) in three schools were unsure if their schools should continue.

Respondents in these schools cited the following concerns:

• Concern about teachers’ time constraints and stress from other educational mandates
• Concern that all teachers need to be actively involved in and support the program for successful implementation and student gains
• Concerns about the fit between literacy partnership and the core reading program

When respondents were asked (item 11) if they would recommend the MRF program to

another school, 69% of all respondents said “yes,” 13% said “no,” and 18% said “unsure” (see

Tables 12-13 in Appendix C). In describing whether they would recommend the MRF program

to other schools, respondents offered a total of 35 written responses. Most respondents again

cited the perceived positive benefits from their school’s participation. Reasons cited for

recommending the program included:

• Provision of funding for literacy staff, a curriculum, and instructional materials
• Provision of professional development
• Perceived student gains in reading achievement
• Improved consistency in instructional practices in literacy

The few respondents who said they would not recommend the program or were unsure cited the

following reasons:

• Time, workload issues, and stress resulting from this and other educational initiatives in the schools

• Lack of full teacher support for the program or a negative school culture
• Lack of leadership and communication about the program in their schools

School teams that indicated the highest levels of satisfaction overall with the program

also indicated they would like to continue their participation in the program and would

recommend the program to other schools. Two school teams indicating the lowest level of

satisfaction with the program also indicated more uncertainty about their willingness to continue


in the program or to recommend the program. One of these schools had a change in the principal

leadership.

Three of the seven Cohort 1 schools experienced a change in principal leadership

between year one and year two, with one principal leaving prior to the end of the school year.

Respondents from these schools expressed more frustration about leadership in their schools,

lack of full teacher support and participation in the program, and stress over time demands for

teachers. These conditions may have preceded the schools’ participation in the MRF program

and may help to explain some of the variation in survey responses across school teams. Still,

respondents in some of these school teams expressed high levels of satisfaction with the MRF

program and perceived positive benefits from their participation.

Summary for leadership team survey. Overall, respondents from Cohort 1 schools

indicated high levels of satisfaction with the technical assistance and professional development

provided by MRF staff and consultants. Respondents gave the highest ratings for technical

assistance provided through site visits, phone and email, training on assessment data, and for

professional development for literacy coaches and leadership teams.

Respondents indicated a desire for more training on analyzing and using assessment data

for instructional decisions. Suggestions for improvements included better communication of

training dates, grade-specific professional development on each essential element of reading, and

models for implementation.

Respondents said the most important supports for their implementation effort were the

guidance provided by MRF staff and consultants, the resources of literacy coaches and

interventionists, instructional materials, and the core reading program.


Challenges centered on the time demands for teachers to learn about all components

of the program. A few schools had concerns about internal school leadership and

communication with teachers, climate, and teacher support for instructional change. Most of

these schools had experienced a change in their principal leadership, which served to weaken the

effectiveness of their school leadership teams.

Despite the challenges mentioned by some teams, all teams expressed the view that their

participation in MRF had many positive impacts. Respondents emphasized greater consistency in

reading instruction and shared understandings and goals for reading among teachers as positive

results of the MRF program. Some respondents said the program had encouraged greater

collaboration among educators in their school. Some respondents mentioned they perceived

early positive impacts for student achievement in reading.

MRF coordinators have focused on areas identified as needs from this survey in the

following ways:

• MRF staff provided additional technical assistance to subgrant schools that were struggling with implementation due to a change in principal leadership.

• MRF provided additional technical assistance on analyzing and using assessment data for year two (such as DIBELS data), for both subgrant schools and for educators statewide.

• MRF added a professional development series for subgrant school literacy intervention specialists in year two, to support the capacity of these educators and schools’ implementation of the 3-Tier Model of Reading Intervention.

• MRF worked to bring about closer collaboration with MLP, so that the professional development provided by MLP aligns with the MRF program and core reading programs being implemented by subgrant schools.

MRF Course Surveys and Data Comparison

The MRF Course Survey was conducted at two points during year one. The pretest

version of the survey was administered to course participants in the seven Cohort 1 subgrant


school sites in June 2004 and to course participants in other sites statewide in October 2004. The

survey instruments for both groups were identical with the exception that the survey for subgrant

schools included three additional items on research-based reading practices. In the subgrant

school sites, 139 of the 162 course participants (86%) responded to the pretest survey, while 335

of the 346 statewide course participants (97%) responded to the survey. Survey items asked

course participants to rate their own knowledge of literacy concepts, and to describe their goals

for their school’s participation in the MRF program and for their own professional growth.

Results of the pretest survey were reported in the midyear progress report of March 2005.

A posttest survey was conducted with both groups of MRF course participants at the end

of year one in May 2005. Survey items asked participants to rate their own level of knowledge

of literacy concepts at the end of the MRF course, rate their level of agreement with contrasting

statements describing views on literacy instruction, provide feedback on the course, describe

professional development needs, and describe how they had applied new learning to their

instructional practice in literacy. The survey for subgrant schools included additional items

asking participants to describe supports that had been helpful in implementing the core reading

program, provide feedback on the core program, and to describe instructional practices in

literacy.

Response rates for the posttest survey were slightly lower than for the pretest survey. The

subgrant school survey had a response rate of 63% (102 participants), while the statewide survey

had a response rate of 71% (245 participants). The surveys did not ask respondents to identify

themselves, so we cannot determine which individuals completed both the pretest and posttest

surveys. Respondents did provide demographic information, including job role, school, and district

name. The following sections report the results of the posttest survey and compare results for


common items on the pretest and posttest surveys. Specifically, this report will describe the

reported gains in literacy knowledge by comparing pretest and posttest survey data.

Demographics. Tables 7 and 8 below present demographic data for pretest and posttest

MRF course surveys for subgrant school and statewide course participants. Over half of the

survey respondents from subgrant schools and close to half of the respondents from statewide

course sites were regular classroom teachers. A slightly larger percentage of respondents from

statewide course sites were special education teachers (3.4% more for the posttest), and a

considerably larger percentage of respondents from statewide course sites were educational

technicians (18.4% more for the posttest). A larger percentage of respondents from subgrant

schools were administrators (3.8% more for the posttest) or other literacy related positions

(10.9% more, e.g., literacy coordinators, interventionists, and coaches), reflecting the strong

involvement of administrators and literacy specialists in the implementation of the MRF program

in subgrant schools. The posttest survey broke out the role of “educational technician” from

“other literacy related positions,” while the pretest survey did not include a separate response

choice of “educational technician.”

The higher concentration of statewide course participants who were not regular

classroom teachers may explain some of the differences between the statewide and subgrant

school participants’ responses to some survey items. Statewide participants often commented in

the open-ended items that they did not have a classroom of their own, did not have extensive

literacy knowledge prior to the MRF course, and that they had learned a great deal about literacy

acquisition from their participation in the MRF course. Statewide course respondents said they

felt overwhelmed by the amount of learning or material covered in the course more often than

did subgrant school respondents.


Table 7. Position at School (Subgrant Schools)

                                      Pre             Post
                                   n      %        n      %
Regular Classroom Teacher         70   51.5%      56   58.3%
Special Education Teacher         17   12.5%       9    9.4%
Educational Technician            --     --       10   10.4%
Other Literacy Related Position   39   28.7%      16   16.7%
Administration                     9    6.6%       4    4.2%
Other                              1     .7%       1    1.0%

Note. Three respondents did not answer this item on the pre-survey; six did not on the post-survey.

Table 8. Position at School (Statewide Course Sites)

                                      Pre             Post
                                   n      %        n      %
Regular Classroom Teacher        139   42.5%     117   48.1%
Special Education Teacher         54   16.5%      31   12.8%
Educational Technician            --     --       70   28.8%
Other Literacy Related Position   41   12.5%      14    5.8%
Administration                     4    1.2%       1     .4%
Other                             89   27.2%      10    4.1%

Note. Four respondents did not answer this item on the pre-survey; two did not on the post-survey.

Tables 9 and 10 below show the total number of years respondents taught or worked in

their job positions for all respondents. A comparison of these data indicates that subgrant school

course participants were more veteran educators than were statewide course participants: 74% of

the subgrant school respondents indicated they had taught in their schools for more than 10

years, while 58% of the statewide respondents gave this response. Similar results were found

when the analysis was repeated to include only the regular classroom teachers, special education

teachers, and administrators. With this data set, 81% of the subgrant school respondents

indicated they had taught in their schools for more than 10 years, while 62% of the statewide

respondents gave this response. Thus, even when the educational technicians and other literacy


positions were excluded from the analysis, the subgrant respondents were still a more veteran

group of educators.

Table 9. Number of Years Teaching (Subgrant Schools)

                         Pre             Post
                      n      %        n      %
1 year                7    5.1%       1    1.0%
2 years               4    2.9%       1    1.0%
3 years               6    4.4%       3    3.1%
4 years               5    3.7%       4    4.1%
5 years               6    4.4%       2    2.1%
6 years               8    5.9%       4    4.1%
7 years               8    5.9%       4    4.1%
8 years               5    3.7%       4    4.1%
9 years               3    2.2%       2    2.1%
10 years              3    2.2%      --     --
More than 10 years   81   59.6%      72   74.2%

Note. Three respondents did not answer this item on the pre-survey; five did not on the post-survey.

Table 10. Number of Years Teaching (Statewide Course Sites)

                         Pre             Post
                      n      %        n      %
1 year               12    3.9%      10    4.3%
2 years              18    5.9%      13    5.6%
3 years              17    5.6%      12    5.2%
4 years              10    3.3%      10    4.3%
5 years              22    7.2%      11    4.7%
6 years              15    4.9%       8    3.4%
7 years              12    3.9%      12    5.2%
8 years              13    4.3%      11    4.7%
9 years              14    4.6%       7    3.0%
10 years              7    2.3%       4    1.7%
More than 10 years  164   53.9%     135   57.9%

Note. Twenty-seven respondents did not answer this item on the pre-survey; 12 did not on the post-survey.

Tables 11 and 12 below present data on the grade levels in which respondents taught or

worked. Some course participants were not regular classroom teachers but were administrators,


special educators, or educational technicians. These respondents indicated they taught or

supervised more than one grade level. Some educators may also teach multigrade classrooms in

very small schools. Some subgrant schools involved educators from grades K-6 in the MRF

course, while statewide course sites involved educators from grades K-12. MRF provided

funding for regular classroom teachers in grades K-3 and for special education teachers in grades

K-12 to participate in the MRF course. School districts could choose to involve educators in

grades above grade 3 at their own expense.

As the data indicate, course participants from both subgrant schools and statewide sites

predominantly taught or worked within the K-3 grade levels. For the subgrant schools, 67% of

all respondents and 82% of the regular classroom teachers said they taught or worked within

grades K-3 only. For the statewide course sites, 60% of all respondents and 85% of the regular

classroom teachers said they taught or worked within grades K-3 only. Thus, the course was

successful in reaching educators in the targeted K-3 grade range.

Again, differences in the population participating in the MRF course resulted in some

differences in the responses to survey items. Some statewide survey respondents commented in

open-ended items that they teach in middle or secondary grades and would like more

professional development in literacy specifically for their grade levels, as well as more guidance

in helping special needs students in upper grades learn to read.


Table 11. Grades Currently Teaching (Subgrant Schools)

                     Pre             Post
                  n      %        n      %
Pre-Kindergarten   3    2.5%       4    4.5%
Kindergarten      44   36.7%      38   43.7%
Grade 1           59   49.2%      40   46.0%
Grade 2           50   41.7%      34   39.1%
Grade 3           49   40.8%      29   33.3%
Grade 4           26   21.7%      16   18.4%
Grade 5           26   21.7%      13   14.9%
Grade 6           12   10.0%      11   12.6%
Grade 7            8    6.7%       0      0%
Grade 8            7    5.8%       0      0%
Grade 9            0      0%       0      0%
Grade 10           0      0%       0      0%
Grade 11           0      0%       0      0%
Grade 12           0      0%       0      0%

Table 12. Grades Currently Teaching (Statewide Course Sites)

                     Pre             Post
                  n      %        n      %
Pre-Kindergarten  23    7.0%      20    8.8%
Kindergarten      95   29.1%      73   32.2%
Grade 1          103   31.5%      85   37.4%
Grade 2          115   35.2%      88   38.8%
Grade 3          100   30.6%      76   33.5%
Grade 4           58   17.7%      39   17.2%
Grade 5           47   14.4%      41   18.1%
Grade 6           32    9.8%      19    8.4%
Grade 7           22    6.7%      18    7.9%
Grade 8           20    6.1%      16    7.0%
Grade 9           17    5.2%      15    6.6%
Grade 10          18    5.5%      14    6.2%
Grade 11          18    5.5%      13    5.7%
Grade 12          18    5.5%      15    6.6%

Tables 13 and 14 below present data on respondents’ educational attainment for all

respondents. The data on educational attainment indicate that the subgrant school course

participants had higher educational attainment than did the statewide course participants. A total

of 29.5% of the subgrant school respondents had a master’s degree or higher degree, while 15%


of the statewide respondents had the same level of education. Similar results were obtained when

the analysis was repeated to include only regular classroom teachers, special education teachers,

and administrators. With this data set, 28.4% of the subgrant respondents had a master’s degree

or higher degree, while 13.7% of the statewide respondents had the same level of education.

Thus, even when educational technicians and other literacy positions were excluded from the

data set, the subgrant respondents had higher educational attainment.

Table 13. Educational Attainment (Subgrant Schools)

                               Pre             Post
                            n      %        n      %
Less than 2 years college    7    5.1%       4    4.2%
2 years college              2    1.5%       3    3.2%
BA/BS                       88   64.2%      60   63.2%
MAT                         --     --        4    4.2%
M.Ed.                       19   13.9%      13   13.7%
MA/MS                       12    8.8%      10   10.5%
CAS                          5    3.6%       1    1.1%
Ed.D./Ph.D.                  1     .7%      --     --
2 master's degrees           3    2.2%      --     --

Note. Two respondents did not respond to this item on the pre-survey; seven did not on the post-survey.

Table 14. Educational Attainment (Statewide Course Sites)

                               Pre             Post
                            n      %        n      %
Less than 2 years college   25    8.0%      12    5.0%
2 years college             31    9.9%      21    8.8%
BA/BS                      211   67.2%     170   70.8%
MAT                          3    1.0%       2     .8%
M.Ed.                       28    8.9%      21    8.8%
MA/MS                       15    4.8%      11    4.6%
CAS                          1     .3%       1     .4%
2 master's degrees          --     --        1     .4%

Note. Seventeen respondents did not answer this question on the pre-survey; six did not on the post-survey.


Tables 15 and 16 present data on professional certification for all survey respondents.

Subgrant school respondents were more likely to have professional certification than were

statewide respondents: 81.5% of the subgrant school respondents indicated

they had professional certification, while only 65.4% of the statewide respondents indicated this

level of certification. A higher percentage of statewide respondents than subgrant school

respondents had either conditional or provisional certification (10.2% more). The higher

concentration of educational technicians in the statewide course may partly explain this

difference in certification levels.

Table 15. Certification (Subgrant Schools)

                     Pre             Post
                  n      %        n      %
Conditional        2    1.6%      --     --
Provisional       11    8.6%       5    5.4%
Targeted Needs    --     --        1    1.1%
Transitional       2    1.6%       1    1.1%
Professional     102   79.7%      75   81.5%
Ed Tech           11    8.6%      10   10.9%

Note. Eleven respondents did not respond to this item on the pre-survey; 10 did not on the post-survey.

Table 16. Certification (Statewide Course Sites)

                     Pre             Post
                  n      %        n      %
Conditional       12    4.6%       8    3.5%
Provisional       39   14.9%      28   12.1%
Targeted Needs    --     --        3    1.3%
Transitional       2     .8%      --     --
Professional     202   77.1%     151   65.4%
Ed Tech            7    2.7%      41   17.7%

Note. Sixty-nine respondents did not answer this question on the pre-survey; 14 did not on the post-survey.


Knowledge of topics and concepts in reading. MRF course participants from statewide

sites included a much higher percentage of educational technicians and non-regular classroom

teachers than did participants from subgrant school sites. Respondents who indicated

“educational technician” on the posttest survey, “other literacy related position,” or “other” on

either the pretest or posttest survey were excluded from the dataset before comparing educators’

self-ratings of literacy knowledge before and after the MRF course. Thus, the tables and figures

that follow in this section on literacy knowledge include only respondents who indicated “regular

classroom teacher,” “special education teacher,” or “administrator.”

Respondents were asked to rate their literacy knowledge in the areas of research-based

reading practices, reading development, and literacy environments using a 4-point scale of 1

(limited understanding) to 4 (extensive knowledge). Figures 1-3 display results for these three

areas. Frequency tables for these survey items are presented in Tables 1-6 in Appendix F.

Knowledge of research-based reading practices. In the area of research-based

reading practices, both subgrant school and statewide respondents reported on the posttest survey

that they were most knowledgeable about “essential components of reading instruction” and the

“purpose of uninterrupted blocks of time for reading instruction.” There were some differences

in the areas where subgrant school and statewide respondents reported they had gained the most

knowledge. Subgrant school respondents indicated they gained the most knowledge in

“definition of scientifically based reading research,” “definition of systematic instruction,” and

“components of explicit instruction,” while statewide respondents indicated they gained the most

knowledge in “definition of scientifically based reading research,” “characteristics of schools

with effective school-wide literacy programs,” and “essential components of reading

instruction.” Overall, subgrant school respondents reported larger gains in knowledge about


research-based reading practices than did statewide respondents. Figure 1 presents a comparison

of the mean scores for research-based reading practices. Table 17 presents pretest and posttest

mean scores and effect sizes.

The survey for subgrant school respondents included three additional items in the scale

for research-based reading practices. Subgrant school respondents reported significant gains in

knowledge for all three areas: “knowledge and experience utilizing a comprehensive core

reading program”; “definition of systematic instruction”; and “knowledge and use of flexible

grouping strategies,” with the largest gains reported for the first two areas.


Figure 1. Mean Scores for Knowledge of Research-Based Reading Practices

[Bar chart comparing pretest and posttest mean self-ratings (1-4 scale) for statewide and subgrant respondents on the seven common items: definition of scientifically based reading research; characteristics of schools with effective school-wide literacy programs; essential components of reading instruction; components of explicit instruction; knowledge and use of screening, progress monitoring, and outcome assessment; purpose of uninterrupted blocks of time for reading instruction; and alignment of Maine's Learning Results with scientifically based reading research. Series: Statewide-Pre, Statewide-Post, Sub-Grant Pre, Sub-Grant Post.]


Table 17. Mean Scores for Knowledge of Research-Based Reading Practices

                                          Subgrant Schools               Statewide Schools
                                        Pre        Post      Effect    Pre        Post      Effect
Item                                  M    SD    M    SD      Size    M    SD    M    SD     Size
Definition of scientifically
  based reading research             2.6  0.7   3.4  0.5     1.27    2.4  0.9   3.2  0.6    1.10
Characteristics of schools with
  effective school-wide literacy
  programs                           2.6  0.8   3.3  0.7     0.90    2.6  0.8   3.3  0.6    1.02
Essential components of reading
  instruction                        3.2  0.6   3.6  0.5     0.71    3.1  0.6   3.7  0.5    0.98
Components of explicit
  instruction                        2.7  0.8   3.4  0.6     1.07    2.6  0.8   3.4  0.6    0.95
Knowledge and use of screening,
  progress monitoring, and outcome
  assessment to inform instruction   2.9  0.8   3.5  0.5     0.73    2.8  0.8   3.4  0.6    0.85
Purpose of uninterrupted blocks
  of time for reading instruction    3.4  0.8   3.8  0.6     0.74    3.1  0.7   3.7  0.6    0.90
Alignment of Maine's Learning
  Results with scientifically
  based reading research             2.9  0.7   3.3  0.6     0.68    2.6  0.7   3.2  0.7    0.88
Definition of systematic
  instruction*                       2.6  0.6   3.4  0.4     1.09     --   --    --   --      --
Knowledge and use of flexible
  grouping strategies for the
  provision of instruction*          3.1  0.8   3.5  0.6     0.73     --   --    --   --      --
Knowledge and experience utilizing
  a comprehensive/core reading
  program (e.g., Houghton Mifflin
  core reading program)*             2.5  0.8   3.3  0.7     1.11     --   --    --   --      --

* These questions only appeared on the survey distributed to the subgrant schools.


Knowledge of reading development. In the area of reading development, subgrant

school respondents reported on the posttest survey that they were most knowledgeable about

“connection of oral language to reading development” and “cue systems that readers use.”

Statewide respondents reported they were most knowledgeable about “connection of oral

language to reading development” and “stages of reading development.” Written responses to

open-ended items reflected similar views. Statewide respondents’ written comments on the

survey emphasized their perceived gain in knowledge about stages of reading development and

the importance of oral language skills for reading development. Subgrant school respondents

reported the most gain in knowledge about “connection of oral language to reading

development” followed by “methods of supporting oral language development in primary

classrooms.” Statewide respondents reported the most gain in knowledge about “stages of

reading development” followed by “stages of oral language development” and “methods of supporting oral

language development in primary classrooms.” Overall, statewide respondents reported larger

gains in knowledge about reading development than did subgrant school respondents. Figure 2

presents the mean scores for reading development, and Table 18 which follows presents pretest

and posttest mean scores and effect sizes.


Figure 2. Mean Scores for Knowledge of Reading Development

[Bar chart comparing pretest and posttest mean self-ratings (1-4 scale) for statewide and subgrant respondents on: stages of oral language development; connection of oral language to reading development; methods for supporting oral language development in primary classrooms; stages of reading development; concepts of print that young children develop; cue systems readers use while reading; and methods for supporting reading development during the preschool years. Series: Statewide-Pre, Statewide-Post, Sub-Grant Pre, Sub-Grant Post.]


Table 18. Mean Scores for Knowledge of Reading Development

                                          Subgrant Schools               Statewide Schools
                                        Pre        Post      Effect    Pre        Post      Effect
Item                                  M    SD    M    SD      Size    M    SD    M    SD     Size
Stages of oral language
  development                        2.8  0.7   3.3  0.7     0.73    2.6  0.8   3.4  0.6    1.12
Connection of oral language to
  reading development                2.9  0.7   3.6  0.6     0.93    2.9  0.7   3.6  0.5    1.10
Methods for supporting oral
  language development in primary
  classrooms                         2.9  0.8   3.5  0.6     0.87    2.7  0.7   3.5  0.6    1.12
Stages of reading development        3.1  0.7   3.5  0.6     0.65    2.9  0.7   3.6  0.6    1.14
Concepts of print that young
  children develop                   3.1  0.8   3.5  0.6     0.62    2.9  0.8   3.5  0.6    0.84
Cue systems readers use while
  reading                            3.0  0.7   3.6  0.5     0.78    2.9  0.7   3.5  0.6    0.97
Methods for supporting reading
  development during the preschool
  years                              2.8  0.8   3.2  0.7     0.61    2.7  0.8   3.3  0.7    0.85


Knowledge of literacy environments. In the area of literacy environments,

subgrant school respondents reported on the posttest survey that they were most knowledgeable

about “knowledge and use of reading aloud” and “knowledge and use of guided reading.”

Statewide respondents reported they were most knowledgeable about “knowledge and use of

reading aloud” followed by “knowledge and use of shared reading” and “knowledge and use of

guided reading.” Subgrant school respondents reported the most gain in knowledge in the areas

of “knowledge and use of word work as an instructional component” followed by “knowledge

and use of reading aloud” and “criteria for the design of classroom libraries.” Statewide

respondents reported the most gain in knowledge in the areas of “criteria for selecting research-

based materials to support instruction” followed by “knowledge and use of reading aloud” and

“knowledge and use of shared reading.” Figure 3 presents the mean scores for literacy

environments, and Table 19, which follows, presents the mean scores and effect sizes. Overall,

statewide respondents reported larger gains in knowledge about literacy environments than did

subgrant school respondents.


Figure 3. Mean Scores for Knowledge of Literacy Environments

[Bar chart comparing pretest and posttest mean self-ratings (1-4 scale) for statewide and subgrant respondents on: physical organization of classrooms to promote reading development; criteria for selecting research-based materials; criteria for the design of classroom libraries; and knowledge and use of reading aloud, shared reading, guided reading, independent reading, word work, and writing instruction to promote reading instruction. Series: Statewide-Pre, Statewide-Post, Sub-Grant Pre, Sub-Grant Post.]


Table 19. Mean Scores for Knowledge of Literacy Environments

                                          Subgrant Schools               Statewide Schools
                                        Pre        Post      Effect    Pre        Post      Effect
Item                                  M    SD    M    SD      Size    M    SD    M    SD     Size
Physical organization of
  classrooms to promote reading
  development                        3.0  0.8   3.6  0.6     0.77    2.8  0.7   3.6  0.6    1.06
Criteria for selecting research-
  based materials to support
  instruction                        2.6  0.7   3.1  0.7     0.70    2.4  0.7   3.3  0.6    1.27
Criteria for the design of
  classroom libraries that include
  a wide range of high-quality
  children's literature              2.8  0.9   3.4  0.6     0.80    2.6  0.8   3.4  0.6    1.07
Knowledge and use of reading
  aloud as an instructional
  component                          3.3  0.6   3.8  0.5     0.84    3.1  0.6   3.8  0.4    1.14
Knowledge and use of shared
  reading as an instructional
  component                          3.3  0.7   3.7  0.6     0.56    3.1  0.7   3.7  0.5    1.07
Knowledge and use of guided
  reading as an instructional
  component                          3.3  0.7   3.8  0.6     0.76    3.1  0.7   3.7  0.5    0.89
Knowledge and use of independent
  reading as an instructional
  component                          3.2  0.6   3.7  0.6     0.74    3.0  0.7   3.6  0.5    0.97
Knowledge and use of word work as
  an instructional component         3.1  0.7   3.6  0.6     0.91    2.9  0.7   3.5  0.6    1.02
Knowledge and use of writing
  instruction to promote reading
  instruction                        3.2  0.6   3.6  0.6     0.77    2.9  0.7   3.5  0.6    1.03


The differences between the mean scores from the pretest and posttest surveys were

analyzed for both statistical and practical significance, using t-tests and effect sizes. Statistically

significant (p < .05) differences were found in all areas of literacy knowledge (research-based

reading practices, reading development, and literacy environments), and the examination of the

effect sizes showed that these differences were large (Cohen’s d > .80). Respondents from both

subgrant schools and statewide sites reported significantly higher levels of knowledge in these

three areas after completing the year-long MRF course.
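The effect sizes above follow the Cohen's d convention of expressing a mean difference in standard-deviation units. As an illustrative sketch only: the report does not state which variant of d was used (pooled-SD versus a paired, gain-score formulation), so values computed with the common pooled-SD formula below from the rounded published means and SDs may differ slightly from the published effect sizes.

```python
from math import sqrt

def cohens_d(mean_pre, sd_pre, mean_post, sd_post):
    """Cohen's d using the pooled standard deviation of the two score sets.

    Note: this is one common variant; the report does not specify whether a
    pooled-SD or paired (gain-score) formulation was used, so results from
    this function may differ slightly from the published effect sizes.
    """
    pooled_sd = sqrt((sd_pre ** 2 + sd_post ** 2) / 2)
    return (mean_post - mean_pre) / pooled_sd

# Illustration with rounded values from Table 19 (subgrant schools,
# "physical organization of classrooms"): pre 3.0 (SD 0.8), post 3.6 (SD 0.6).
d = cohens_d(3.0, 0.8, 3.6, 0.6)
print(round(d, 2))  # ~0.85 from the rounded inputs; the report lists 0.77
```

A value of d at or above 0.80 is conventionally interpreted as a large effect, which is the threshold the report applies.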

Knowledge of reading elements and reading instruction. Using the same 4-point

scale, respondents rated the extent of their knowledge in the five essential elements of reading:

• phonological awareness
• alphabetic principle and phonics
• vocabulary
• fluency
• comprehension

Within each of the five elements, respondents rated their knowledge in five areas of instructional

knowledge:

• basic understanding of definitions or components of the element
• reading research on the element
• instructional practice for the element
• ways to monitor progress for the element
• differentiation of instruction for the element

Figures 4-8 present a comparison of mean scores for reported levels of knowledge within

each of the five essential elements of reading and the five areas of instructional knowledge in

literacy. Mean scores and effect sizes are presented in Tables 20-24. Frequency tables for these

survey items are presented in Tables 7-16 in Appendix F.


Figure 4. Mean Scores for Knowledge of Phonological Awareness

[Bar chart: pre- and post-course mean scores (0–4 scale) for statewide and subgrant respondents across the five areas of instructional knowledge; values appear in Table 20.]

Table 20. Mean Scores for Knowledge of Phonological Awareness

                                  Subgrant Schools                Statewide Schools
                               Pre        Post       Effect    Pre        Post       Effect
                             Mean  SD   Mean  SD      Size   Mean  SD   Mean  SD      Size
Definitions/Components        2.7  0.8   3.4  0.7     0.96    2.6  0.8   3.4  0.6     1.14
Reading Research              2.7  0.8   3.3  0.6     0.84    2.5  0.8   3.4  0.6     1.25
Instructional Practices       2.8  0.8   3.4  0.6     0.88    2.5  0.8   3.5  0.6     1.29
Monitor Progress              2.4  0.7   3.2  0.7     1.12    2.3  0.7   3.2  0.6     1.34
Differentiating Instruction   2.5  0.8   3.2  0.6     0.99    2.3  0.7   3.2  0.6     1.33


Figure 5. Mean Scores for Knowledge of Alphabetic Principle and Phonics

[Bar chart: pre- and post-course mean scores (0–4 scale) for statewide and subgrant respondents across the five areas of instructional knowledge; values appear in Table 21.]

Table 21. Mean Scores for Knowledge of Alphabetic Principle and Phonics

                                  Subgrant Schools                Statewide Schools
                               Pre        Post       Effect    Pre        Post       Effect
                             Mean  SD   Mean  SD      Size   Mean  SD   Mean  SD      Size
Definitions/Components        2.5  0.9   3.2  0.7     0.86    2.4  0.9   3.4  0.6     1.18
Reading Research              2.5  0.9   3.2  0.7     0.89    2.3  0.8   3.3  0.6     1.34
Instructional Practices       2.8  0.8   3.4  0.7     0.82    2.7  0.9   3.5  0.6     1.11
Monitor Progress              2.5  0.8   3.2  0.7     0.90    2.4  0.8   3.3  0.6     1.17
Differentiating Instruction   2.6  0.8   3.2  0.7     0.76    2.4  0.8   3.3  0.6     1.23


Figure 6. Mean Scores for Knowledge of Fluency

[Bar chart: pre- and post-course mean scores (0–4 scale) for statewide and subgrant respondents across the five areas of instructional knowledge; values appear in Table 22.]

Table 22. Mean Scores for Knowledge of Fluency

                                  Subgrant Schools                Statewide Schools
                               Pre        Post       Effect    Pre        Post       Effect
                             Mean  SD   Mean  SD      Size   Mean  SD   Mean  SD      Size
Definitions/Components        2.8  0.7   3.6  0.5     1.20    2.7  0.7   3.5  0.6     1.28
Reading Research              2.6  0.7   3.4  0.6     1.10    2.5  0.8   3.4  0.6     1.25
Instructional Practices       2.8  0.7   3.4  0.6     0.95    2.6  0.7   3.4  0.5     1.32
Monitor Progress              2.6  0.7   3.3  0.6     1.07    2.4  0.8   3.3  0.6     1.37
Differentiating Instruction   2.6  0.7   3.3  0.6     1.08    2.4  0.7   3.3  0.6     1.31


Figure 7. Mean Scores for Knowledge of Vocabulary

[Bar chart: pre- and post-course mean scores (0–4 scale) for statewide and subgrant respondents across the five areas of instructional knowledge; values appear in Table 23.]

Table 23. Mean Scores for Knowledge of Vocabulary

                                  Subgrant Schools                Statewide Schools
                               Pre        Post       Effect    Pre        Post       Effect
                             Mean  SD   Mean  SD      Size   Mean  SD   Mean  SD      Size
Definitions/Components        3.0  0.7   3.5  0.6     0.84    2.7  0.6   3.5  0.5     1.46
Reading Research              2.7  0.8   3.4  0.6     0.98    2.5  0.8   3.3  0.6     1.15
Instructional Practices       2.9  0.6   3.4  0.6     0.80    2.6  0.7   3.4  0.5     1.38
Monitor Progress              2.6  0.7   3.2  0.7     0.88    2.2  0.7   3.2  0.6     1.56
Differentiating Instruction   2.5  0.8   3.2  0.7     0.95    2.3  0.7   3.3  0.6     1.46


Figure 8. Mean Scores for Knowledge of Reading Comprehension

[Bar chart: pre- and post-course mean scores (0–4 scale) for statewide and subgrant respondents across the five areas of instructional knowledge; values appear in Table 24.]

Table 24. Mean Scores for Knowledge of Reading Comprehension

                                  Subgrant Schools                Statewide Schools
                               Pre        Post       Effect    Pre        Post       Effect
                             Mean  SD   Mean  SD      Size   Mean  SD   Mean  SD      Size
Definitions/Components        3.0  0.7   3.5  0.6     0.67    2.8  0.7   3.5  0.5     1.17
Reading Research              2.7  0.9   3.3  0.6     0.73    2.5  0.8   3.4  0.6     1.30
Instructional Practices       3.0  0.7   3.5  0.6     0.76    2.7  0.7   3.5  0.5     1.30
Monitor Progress              2.7  0.7   3.3  0.7     0.82    2.5  0.7   3.3  0.6     1.31
Differentiating Instruction   2.7  0.7   3.2  0.7     0.67    2.5  0.7   3.3  0.6     1.22


Data presented in the previous figures and tables on knowledge of literacy, reading

elements, and reading instruction include only respondents who indicated “regular classroom

teacher,” “special education teacher,” or “administrator,” and exclude respondents who indicated

“educational technician,” “other literacy related position,” or “other” on the MRF course

surveys.

As the figures and tables illustrate, subgrant school respondents generally reported

slightly higher levels of knowledge on the pretest survey than did statewide respondents.

Statewide respondents reported larger gains in knowledge than did the subgrant school

respondents.

Preexisting differences in the two populations of course participants may partly explain

differences in self-ratings of knowledge before the MRF course and in the amount of gain in

knowledge. First, subgrant school respondents were more veteran educators than the statewide

respondents, and had higher levels of educational attainment. Second, two of the largest subgrant

schools had participated in the MLP prior to participating in the MRF program and course.

Therefore, subgrant school educators may have begun the MRF course with more extensive prior

knowledge in literacy than did statewide participants, and therefore could be expected to report

higher levels of knowledge prior to the course and slightly smaller gains in knowledge by the end

of the course.

Across the five elements of reading, subgrant school respondents and statewide course

respondents rated their levels of knowledge fairly consistently. Thus, respondents reported they

were just as knowledgeable about fluency and comprehension as they were about phonological

awareness, alphabetic principle and phonics, and vocabulary. This finding conflicts somewhat

with respondents’ written comments on the posttest survey, in which respondents indicated that


the MRF course had helped them to realize the importance of fluency and comprehension and

that they were putting more emphasis on these areas in their reading instruction by using

strategies and activities they had learned about through the course.

Across the five areas of instructional knowledge, there was more variation in reported

levels of knowledge. Subgrant school and statewide respondents generally reported slightly

lower levels of instructional knowledge about monitoring students’ reading progress,

differentiation of reading instruction, and reading research than they reported for knowledge

about components of the element and instructional practices. This finding is consistent with

respondents’ written comments on the posttest survey that indicated a desire for more help in the

areas of reading assessment and differentiation of instruction, and continued opportunities to

learn about and discuss research on reading with their peers.

The differences between the mean scores from the pretest and posttest surveys were

analyzed for both statistical and practical significance using t-tests and effect sizes. Statistically

significant (p < .05) differences were found in all areas of knowledge of reading elements and

reading instruction, and the examination of the effect sizes showed that these differences were

large (Cohen’s d > .80). For statewide respondents, the effect sizes are quite large and, in some

areas, almost double the effect sizes for subgrant school respondents.

Across the five essential elements of reading, subgrant school respondents reported the

largest gain in knowledge in the area of fluency and the least amount of gain in the area of

vocabulary. Statewide respondents reported the largest gain in knowledge in the area of

comprehension followed by fluency.

Across the five areas of instructional knowledge, subgrant school respondents reported

the largest gains in knowledge about monitoring progress, definitions/components of fluency,


and reading research on comprehension. Statewide respondents reported the largest gains in

knowledge about monitoring progress, differentiating instruction, instructional practices, and

definitions/components of reading comprehension.

Professional development needs and other supports. The spring 2005 survey included an

item asking course participants “In what areas would you like additional support or professional

development to improve your skills in reading instruction?” The total number of written

responses to this item was 73 from subgrant school sites and 174 from other MRF course sites

statewide. Responses were very positive, and indicated that the respondents found the MRF

course and other professional development events very helpful in deepening their professional

knowledge. Across all responses, the most frequently mentioned areas of need for additional

professional development were in the five essential elements of reading: comprehension (30

respondents mentioned this need), phonological awareness (21), fluency (23), vocabulary (17),

and phonics (11), with the area of comprehension receiving the most mentions overall. Subgrant

school course participants emphasized a need for support in the areas of comprehension, fluency,

and vocabulary, while statewide course participants emphasized these three areas and

phonological awareness. Two respondents from subgrant schools and two respondents from

schools statewide said they wanted to learn “How to teach students to self monitor their

comprehension.”

Respondents also identified other areas where they would like additional learning or in-

classroom support and materials. These areas are listed below with the corresponding number of

respondents who mentioned these needs.

Areas of additional training needed:

• Connections between reading and writing (11 respondents)
• Guided reading (12)
• Reading assessment (10)—administering, scoring, interpreting
• Differentiating instruction and activities, individualized instruction (8)
• Help/advice on classroom organization, management, scheduling for the literacy block (4)
• Interventions/remediation activities for struggling students (5)
• Strategies/activities to keep older students (fourth grade or above) engaged, interested in reading (2)
• Individual reading programs for struggling middle level students (1)
• Grouping strategies for middle level reading classroom (1)
• Ways to make one-on-one reading instruction more fun (1)
• Best sequence to cover reading skills (1)

Other resources or supports needed:

• Time—to digest, reread, organize course materials (10 respondents)
• More materials, leveled books, list of leveled readers, other instructional materials (6)
• Opportunities to observe other teachers’ reading instruction, to be observed and get feedback, have peer mentors and coaching (10)
• Newsletter (1)

Subgrant school participants mentioned a need for support in the areas of: guided reading

and small groups, instructional time management, management of paper records, organizing the

physical space in the classroom, writing, differentiation and interventions for struggling students

or special needs students, more time to review materials and discuss experiences with peers,

more opportunities for observation and feedback on instruction and videos modeling instruction,

and more integration of the MLP and MRF programs. Two teachers mentioned the need for

instructional materials, such as more nonfiction literature, magnetic letters, and white boards.

The survey for subgrant school participants included an additional item asking “What

supports have helped you implement the core reading program in your classroom this year?” A

total of 67 written responses to this item were collected. Supports mentioned included:

• Instructional materials, core reading program (10 respondents)
• MRF course or other professional development (10)
• In classroom support of ed techs, Title 1, intervention specialists (9)
• Training on core reading program by consultants from publisher (8)
• Peers, other teachers in school (8)
• Literacy coordinator, coach (8)
• Planning time, substitutes (5)
• Maine Literacy Partnership (5)
• Block scheduling for literacy instruction (4)
• MRF teacher manual (1)

Responses to this item emphasized the supports of instructional materials, professional

development, and other literacy support staff in their schools. Five respondents from two schools

said they did not feel they had adequate support from their school principal, coach, or from the

MRF professional development to be able to implement the core reading program. Both of these

schools had a change in principal leadership between years one and two.

Feedback on core reading program. The survey for subgrant school course participants

included several items asking respondents for feedback on their use of the core reading program.

The attitudinal scale on instructional views included four statements about the core reading

program. Subgrant school respondents were asked to indicate the extent to which they agreed

with the statements using a scale of 1 (strongly agree) to 5 (strongly disagree). For ease in

presenting data results, the 5-point scale has been collapsed to a 3-point scale in Table 25 below.

Table 17 in Appendix F includes the frequency of responses for all response categories.
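The collapsing described above is a simple recoding of each response. The mapping below (1–2 to agree, 3 to not sure, 4–5 to disagree) is the natural one implied by the column labels in Table 25; the report does not spell it out explicitly, and the example data are hypothetical.

```python
# Collapse 5-point agreement ratings (1 = strongly agree ... 5 = strongly
# disagree) into the three categories used in Table 25. This mapping is an
# assumption inferred from the table's column labels.
from collections import Counter

COLLAPSE = {
    1: "Strongly Agree/Agree",
    2: "Strongly Agree/Agree",
    3: "Not Sure",
    4: "Disagree/Strongly Disagree",
    5: "Disagree/Strongly Disagree",
}

def collapse_counts(ratings):
    """Return (n, percent) per collapsed category for a list of ratings."""
    counts = Counter(COLLAPSE[r] for r in ratings)
    total = len(ratings)
    return {cat: (n, round(100.0 * n / total, 1)) for cat, n in counts.items()}

# Hypothetical example: ten survey responses.
example = [1, 2, 2, 1, 3, 3, 4, 5, 2, 1]
print(collapse_counts(example))
```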

Table 25. Views on the Core Reading Program

                                                    Strongly                      Disagree/
                                                   Agree/Agree     Not Sure   Strongly Disagree
Statement                                           n      %      n      %      n      %
Use of core reading program (Houghton-
  Mifflin's The Nation's Choice 2005) this
  year has had a positive impact on my
  students' reading development.                   50   61.0%    20   24.4%    12   14.6%
I would recommend the core reading program
  for use with all children in the grade(s)
  I teach.                                         46   54.8%    20   23.8%    18   21.4%
I would recommend the core reading program
  for use with children of special needs.          39   45.3%    31   36.0%    16   18.6%
Screening, progress monitoring, and
  benchmark assessments have informed my
  reading instruction this year.                   73   79.3%     8    8.7%    11   12.0%


As data in Table 25 above indicate, a majority of the subgrant school respondents (61%)

agreed that the core reading program had a positive impact on their students’ reading

development during year one, and slightly over half (55%) agreed that they would recommend

the program for use with all children they teach. Slightly less than half of the respondents (45%)

agreed that they would recommend the core reading program for use with children of special

needs and over a third (36%) indicated they were unsure. Most respondents (79%) agreed that

screening, progress monitoring, and benchmark assessments have informed their reading

instruction this year. These responses are consistent with respondents’ written comments on

open-ended items in which respondents emphasized the importance of the five elements of

reading and reading assessments, the importance of delivering systematic and explicit reading

instruction, and respondents’ desire for more guidance on differentiation and interventions for

different groups of struggling readers.

One open-ended item asked: “If you could change any two things in your core reading

program, what would they be?” A total of 70 written responses were received for this item.

Participants’ suggestions focused on the leveled books and guided reading books, student

workbooks/practice books, pacing of the program, instructional materials and hands-on

activities.

Suggestions regarding the leveled books used for guided reading practice included: more

books, a larger variety of levels, better quality books (literature), and larger print size for younger

readers (kindergarten, first grade). Respondents from various subgrant schools said the leveling

of books did not fit their students’ needs, and they needed more levels. One respondent wrote: “I

do not like the leveled readers. They are not leveled according to our benchmark levels.”


Another respondent commented: “Leveled readers are of poor quality and are leveled higher than

other guided reading texts.”

Some respondents were dissatisfied with the guided reading texts. One respondent wrote:

“The guided reading books often don’t fit my groups. The below level is often too easy and I

would like more choices.” As with the leveled readers, respondents suggested a need for greater

variation in levels and better quality literature.

Overall, respondents liked having the additional literature provided by the grant funds.

Suggestions emphasized the desire for additional big books and more variety in literary genres

such as poetry, novels, and nonfiction texts. Some respondents said they wanted more

manipulatives (kindergarten) and more phonics activities for the lower grades.

Student workbooks were mentioned frequently as the least favorite component of the core

reading program. Several respondents suggested less use of the workbook or elimination of it

altogether, while a few respondents said the workbooks provided beneficial practice. Other

suggestions included more flexibility for teachers to choose the topics they wished to cover in

the workbook and for students to choose the topic of their writing response and to be creative in

their responses. Respondents suggested more focus on comprehension rather than on decoding

text.

Some respondents said the pacing of the program was too fast and expected too much

material to be covered. Respondents suggested spending more time on each skill before moving

to the next skill. Some comments on pacing included: “I felt pressure to get through all six

books.” and “Spend more time on a skill at time of introducing, skills change too quickly and

don’t allow for mastery.”


Respondents said they would like more hands-on activities, more use of Readers’

Theatre, and less paperwork in their core reading program. Respondents mentioned they would

like a greater emphasis on writing or connections between reading and writing in their core

reading program.

Some respondents said they would not change anything about the core program.

Feedback on MRF course. Both the subgrant school course survey and the statewide

course survey included various items asking participants to provide feedback on the MRF course.

One item asked course participants to indicate their level of agreement with 10 statements about

the MRF course using a scale of 1 (Strongly Agree) to 5 (Strongly Disagree). Table 26 below

presents the results for this survey scale, and collapses the 5-point scale to a 3-point scale for

ease in presentation. Frequencies for the 5-point scale are presented in Tables 18-19 in

Appendix F.

Overall, both subgrant school respondents and statewide respondents indicated a high

level of satisfaction with the MRF course. Seventy-one percent (71%) of the subgrant school

respondents and 95% of the statewide respondents agreed with the statement: “Overall I have

found the course to be a worthwhile professional development experience.” Sixty-eight percent

(68%) of the subgrant school respondents and 97% of the statewide respondents agreed with the

statement: “I have found the course content to be valuable.” Differences in satisfaction levels

may be due to the fact that the subgrant school respondents had higher levels of educational

attainment and literacy knowledge prior to the course. This would be consistent with written

comments that indicated subgrant school respondents would like the course content to better fit

each participating group’s or school’s level of professional knowledge in literacy.


Subgrant school respondents most strongly agreed with the following statements: “I have

found the instructor(s) to be knowledgeable”; “I found the instructor(s) to be easily approachable

and responsive to my needs”; and “I have found the instructor(s) to be well prepared and

organized.” Statewide respondents most strongly agreed with the statements: “I have found the

instructor(s) to be knowledgeable”; “I found the instructor(s) to be easily approachable and

responsive to my needs”; and “I have found the course content to be valuable.” These results are

consistent with respondents’ written comments indicating high levels of satisfaction with the

course instructors.


Table 26. Feedback on MRF course

(SA/A = Strongly Agree/Agree; D/SD = Disagree/Strongly Disagree)

                                                Subgrant Schools                       Statewide Schools
                                          SA/A       Not Sure      D/SD          SA/A       Not Sure      D/SD
Statement                                n    %      n    %      n    %        n     %      n    %      n    %
I have found the course content to
  be valuable.                          65  67.71   13  13.54   18  18.75     235  97.11    4   1.65    3   1.24
I have found the course sessions
  contain a variety of activities in
  which I am engaged.                   83  85.57    5   5.15    9   9.28     223  92.53    6   2.49   12   4.98
I have found the course readings to
  be valuable.                          72  75.00   13  13.54   11  11.46     223  92.53   10   4.15    8   3.32
I have found the course readings to
  be manageable.                        79  82.29    5   5.21   12  12.50     225  92.59    8   3.29   10   4.12
I have found the course assignments
  to be valuable.                       63  64.95   18  18.56   16  16.49     227  93.80   10   4.13    5   2.07
I have found the course assignments
  to be manageable.                     75  78.13   14  14.58    7   7.29     223  92.15   10   4.13    9   3.72
I have found the instructor(s) to be
  knowledgeable.                        89  91.75    3   3.09    5   5.15     236  97.52    2   0.83    4   1.65
I have found the instructor(s) to be
  easily approachable and responsive
  to my needs.                          84  86.60    5   5.15    8   8.25     234  97.10    4   1.66    3   1.24
I have found the instructor(s) to be
  well prepared and organized.          84  86.60    8   8.25    5   5.15     221  91.32    8   3.31   13   5.37
Overall, I have found the course to
  be a worthwhile professional
  development experience.               67  71.30    8   8.50   19  20.20     229  95.02    9   3.73    3   1.24


Another item asked respondents if they would recommend the MRF course to a colleague

and to whom (which job role types) the respondent would recommend the course.

Overwhelmingly, respondents said they would recommend the course to their colleagues. Eighty

percent (80%) of the subgrant school participants and 98% of the statewide respondents

indicated “yes” to this question. Respondents who said “yes” indicated they would primarily

recommend the course to colleagues who are regular classroom teachers, educational

technicians, and special education teachers. Tables 27-28 present these data, which include all

survey respondents. Twelve subgrant school respondents and five statewide respondents did not

answer this survey item.

Table 27. Would You Recommend this Course to a Colleague?

                                       Yes             No
Respondent Group                     n     %        n     %
Subgrant school respondents         72    80%      18    20%
Statewide respondents              240   98.4%      4    1.6%

Table 28. If “Yes,” To Whom Would You Recommend this Course?

                               Subgrant respondents   Statewide respondents
Position                           n       %              n       %
Principal                         43     59.7%           63     26.3%
District administrator            31     43.1%           45     18.8%
Classroom teacher                 62     86.1%          227     94.6%
Special education teacher         58     80.6%          175     72.9%
Educational technician            63     87.5%          191     79.6%

Course participants were also asked for suggestions to improve the MRF course.

Subgrant school participants provided a total of 68 written comments and statewide course

participants provided a total of 140 written comments in response to this open-ended item.

Responses were overwhelmingly positive. Only four participants from subgrant schools made

slightly less positive comments. Three of the four participants were from districts that had


participated in MLP prior to participating in MRF. These respondents emphasized the need to

adjust or differentiate the course content to fit the needs of local school faculties.

Both subgrant school course participants and statewide course participants suggested that

the course provide even more reading activities for teachers to use in their own classrooms (as

well as time for modeling these applications), that the number of handouts or paperwork be

reduced, and that more time for sharing of instructional ideas be provided within the course.

Subgrant school participants indicated high levels of satisfaction with the course, and

many respondents did not make any suggestions for improvements to the MRF course.

Respondents who did make suggestions focused on three areas: providing more instructional

activities for classroom use, providing more guidance in using the core reading program, and

adjusting the course content to better fit the different learning needs of each school faculty since

two of the schools had prior experience with MLP. It should be noted that the course was not

intended to provide guidance on implementing the core reading program but rather to deepen

educators’ understanding of the underlying concepts in teaching various components of reading

to young children. MRF program staff provided guidance on implementation to subgrant schools

during site visits and technical training sessions. Subgrant schools are also expected to support

teachers’ implementation of the core reading program through their literacy leadership teams, the

coach, and the interventionist.

A few respondents said they wanted a smaller class size, and not to include educators

from other schools in their district in their class. A few respondents said they felt the course

covered too much material, while others said the course should only run for one semester instead

of two. A few respondents suggested that closer alignment between MLP and the course is

needed.


Statewide course participants focused on the number of handouts provided to participants

and suggested that these should be condensed and organized in a binder for participants. These

participants also emphasized that less time be spent on theory and more time devoted to practical

applications and classroom activities. Some respondents suggested that the course allow

participants to present instructional activities and that more time be provided for sharing ideas

and instructional experiences. A few respondents said they were overwhelmed by the amount of

material and new learning expected in the course. The statewide course involved a much higher

percentage of educational technicians than did the subgrant school course, which might help to

explain this view.

Instructional views. Subgrant school and statewide course participants were asked about

their instructional views in literacy, to ascertain their level of agreement with Reading First

principles. Using a scale of 1 (strongly agree) to 5 (strongly disagree), participants were asked

to rate their level of agreement with 13 statements reflecting contrasting views of literacy

instruction across all five elements of reading. Of the 13 statements, 7 statements expressed

Reading First principles while 6 statements expressed contrasting views. The scale included one

general statement about literacy instruction that reflected Reading First views, one general

statement about the grouping of students for instruction that reflects a contrasting view, and a

pair of statements (generally one statement reflecting the Reading First view and one statement

reflecting a contrasting view) for each of the five elements of reading. The area of phonics

included two statements reflecting Reading First views.

Items for this scale were derived from statements that appear in Put Reading First: The

Research Building Blocks for Teaching Children to Read (Adler, 2001). The authors of this publication

based their findings and conclusions on the National Reading Panel’s (2000) report Teaching

Children to Read: An Evidence-Based Assessment of the Scientific Research Literature on

Reading and Its Implications for Reading Instruction. In order to create statements that contrast

with Reading First principles, the wording of some statements from the Put Reading First

publication was slightly altered. Put Reading First was one of the required texts in the MRF

course. Thus, both subgrant school and statewide course participants could be expected to be

familiar with the views and statements expressed in the publication. At least one of the

statements reflecting a contrasting view (“Most children learn to read through exposure to good

literature and do not need explicit instruction in letters and letter sounds”) was derived from a

teacher attitude survey used by Foorman, Fletcher, Francis, Schatschneider, & Mehta (1998) that

is also cited in the text Reading in the Classroom (Vaughn & Briggs, 2003).

Table 29 below presents the results for this survey scale and collapses the 5-point scale to

a 3-point scale for ease in presentation. Frequencies for the 5-point scale are presented in Tables

20-21 in Appendix F. The statements shown in the table have been reordered for the purpose of

analysis so that the statement reflecting the Reading First view (RF) is listed first, followed by a

statement reflecting a contrasting or non-Reading First view (N) for the same reading element.

The reading element or topic is identified in brackets after each statement. The statements were

randomly ordered on the actual survey instrument.
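The collapsing of the 5-point response scale into the 3-point categories can be sketched in code. The following is an illustrative example only, not the report's actual analysis code; the category labels are taken from Table 29 and the data passed in would be hypothetical.

```python
def collapse_likert(rating):
    """Map a 1-5 agreement rating (1 = strongly agree ... 5 = strongly
    disagree) to the collapsed 3-point categories used in Table 29."""
    if rating in (1, 2):
        return "Strongly Agree/Agree"
    if rating == 3:
        return "Not Sure"
    if rating in (4, 5):
        return "Disagree/Strongly Disagree"
    raise ValueError(f"rating out of range: {rating}")

def tabulate(responses):
    """Return {category: (n, percent)} frequencies for a list of 1-5 ratings."""
    counts = {}
    for r in responses:
        category = collapse_likert(r)
        counts[category] = counts.get(category, 0) + 1
    total = len(responses)
    return {c: (n, round(100.0 * n / total, 1)) for c, n in counts.items()}
```

For example, `tabulate([1, 1, 2, 3, 5])` collapses five hypothetical responses into the three agreement categories with their counts and percentages.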

Table 29. Instructional Views: Subgrant School and Statewide Respondents

Note. Statements shown in this table include the following areas or elements of reading: general, grouping, phonemic awareness (PA), and phonics. RF = statement reflects the Reading First view; N = statement reflects a contrasting (non-Reading First) view. Cell entries are n (%); "Agree" combines strongly agree/agree and "Disagree" combines disagree/strongly disagree.

Effective literacy instruction provides ample opportunities for children to practice newly acquired reading and writing skills. (General) [RF]
    Subgrant:  Agree 88 (91.7%);  Not sure 6 (6.3%);   Disagree 2 (2.1%)
    Statewide: Agree 236 (95.9%); Not sure 3 (1.2%);   Disagree 7 (2.8%)

Whole-group, small-group, and individual reading instruction are all equally effective grouping practices. (Grouping) [N]
    Subgrant:  Agree 59 (61.5%);  Not sure 9 (9.4%);   Disagree 28 (29.2%)
    Statewide: Agree 159 (65.4%); Not sure 18 (7.4%);  Disagree 66 (27.2%)

Phonemic awareness instruction is most effective when children are taught to manipulate phonemes (sounds) by using the letters of the alphabet. (PA) [RF]
    Subgrant:  Agree 76 (78.4%);  Not sure 14 (14.4%); Disagree 7 (7.2%)
    Statewide: Agree 180 (74.7%); Not sure 23 (9.5%);  Disagree 38 (15.8%)

Children should be taught several different ways to manipulate phonemes (sounds) rather than focusing only on 1 or 2 ways of manipulating phonemes at a time. (PA) [N]
    Subgrant:  Agree 56 (58.9%);  Not sure 20 (21.1%); Disagree 19 (20.0%)
    Statewide: Agree 145 (58.7%); Not sure 38 (15.4%); Disagree 64 (25.9%)

Teaching sound and letter relationships is most effective when begun in grades K-1. (Phonics) [RF]
    Subgrant:  Agree 84 (87.5%);  Not sure 7 (7.3%);   Disagree 5 (5.2%)
    Statewide: Agree 219 (89.4%); Not sure 9 (3.7%);   Disagree 17 (6.9%)

Phonics instruction is most effective when it is systematic (planned and sequential) and explicit (clearly explained, modeled). (Phonics) [RF]
    Subgrant:  Agree 91 (94.8%);  Not sure 3 (3.1%);   Disagree 2 (2.1%)
    Statewide: Agree 229 (93.1%); Not sure 9 (3.7%);   Disagree 8 (3.3%)

Most children learn to read through exposure to good literature and do not need explicit instruction in letters and letter sounds. (Phonics) [N]
    Subgrant:  Agree 12 (12.5%);  Not sure 8 (8.3%);   Disagree 76 (79.2%)
    Statewide: Agree 21 (8.5%);   Not sure 19 (7.7%);  Disagree 206 (83.7%)

Table 29. Instructional Views: Subgrant School and Statewide Respondents (Continued)

Note. Statements shown in this table include the following elements of reading: vocabulary (Vocab), fluency, and comprehension (Comp). RF = statement reflects the Reading First view; N = statement reflects a contrasting (non-Reading First) view. Cell entries are n (%); "Agree" combines strongly agree/agree and "Disagree" combines disagree/strongly disagree.

Although a great deal of vocabulary is learned indirectly, some vocabulary should be taught directly. (Vocab) [RF]
    Subgrant:  Agree 92 (95.8%);  Not sure 2 (2.1%);   Disagree 2 (2.1%)
    Statewide: Agree 238 (96.7%); Not sure 4 (1.6%);   Disagree 4 (1.6%)

It is better to expose children to many new words each week than to focus on teaching a few words. (Vocab) [N]
    Subgrant:  Agree 16 (16.5%);  Not sure 28 (28.9%); Disagree 53 (54.6%)
    Statewide: Agree 79 (32.6%);  Not sure 46 (19.0%); Disagree 117 (48.3%)

Repeated and monitored oral reading improves reading fluency and overall reading achievement. (Fluency) [RF]
    Subgrant:  Agree 86 (89.6%);  Not sure 6 (6.3%);   Disagree 4 (4.2%)
    Statewide: Agree 223 (91.8%); Not sure 7 (2.9%);   Disagree 13 (5.3%)

Independent, unmonitored silent reading has been proven to increase fluency. (Fluency) [N]
    Subgrant:  Agree 22 (23.4%);  Not sure 34 (36.2%); Disagree 38 (40.4%)
    Statewide: Agree 62 (25.4%);  Not sure 65 (26.6%); Disagree 117 (48.0%)

Students should be taught strategies for monitoring their own comprehension. (Comp) [RF]
    Subgrant:  Agree 90 (92.8%);  Not sure 5 (5.2%);   Disagree 2 (2.1%)
    Statewide: Agree 223 (90.7%); Not sure 14 (5.7%);  Disagree 9 (3.7%)

Comprehension should be emphasized only after children have well-developed decoding skills. (Comp) [N]
    Subgrant:  Agree 13 (13.4%);  Not sure 6 (6.2%);   Disagree 78 (80.4%)
    Statewide: Agree 34 (14.0%);  Not sure 25 (10.3%); Disagree 184 (75.7%)

Both groups of respondents (subgrant school and statewide course participants) indicated

high levels of agreement with the seven statements reflecting research-based reading practices

advocated by Reading First and articulated through the MRF course. The percentage of

respondents who agreed with these statements ranged from 75% to 96%. Only one item resulted

in less than 88% agreement, a statement about phonemic awareness: “Phonemic awareness

instruction is most effective when children are taught to manipulate phonemes (sounds) by using

the letters of the alphabet.” Fourteen percent of the subgrant respondents indicated they were

“not sure” if they agreed or disagreed with the statement, and 16% of the statewide respondents

indicated they disagreed with the statement.

Both groups of respondents indicated more mixed views and higher levels of uncertainty

in response to the six statements reflecting contrasting or non-Reading First views of reading

instruction. Of the six non-Reading First statements, respondents indicated high levels of

disagreement with only two statements (one on phonics and one on comprehension), and were

either “not sure” or disagreed with two other statements (one on vocabulary and one on fluency).

Respondents indicated a high level of agreement (over 61%) with a non-Reading First statement

about grouping students for instruction: “Whole group, small group, and individual reading

instruction are all equally effective grouping practices.” Reading First asserts that small group

reading instruction is more effective than whole group or individual instruction because children

benefit from listening to their classmates respond and receive feedback from the teacher.

Respondents indicated they were either “not sure” or agreed with another non-Reading First

statement on phonemic awareness: “Children should be taught several different ways to

manipulate phonemes (sounds) rather than focusing only on 1 or 2 ways of manipulating

phonemes at a time.” Reading First advocates teaching students only 1 or 2 ways of

manipulating phonemes at a time to reduce students’ confusion.

It is not clear how to interpret respondents’ indication of uncertainty or agreement with

some of the non-Reading First statements on reading instruction. Different explanations for

these responses are possible, and some caveats must be offered. First, because each survey was

lengthy (four to five pages in total) and the attitude scale on instructional views appeared after a

two-and-a-half-page scale on literacy knowledge, respondents quite likely experienced

fatigue and may not have read each statement carefully enough to notice that the scale

included contrasting views on instruction. The large percentage of respondents (15% to 36%)

who indicated “not sure” in response to three of the non-Reading First statements suggests

that respondents were somewhat confused by the altered statements and did not know whether

they reflected Reading First views. The directions for this scale did not inform

respondents that the scale included contrasting views on reading instruction but only asked

respondents to indicate their level of agreement with the statements.

Second, the attitude scale on instructional views includes pairs of statements that contrast

for each element of reading. Ideally, the scale should have been longer and contained several

items per reading element, and items that clearly presented contrasting views of the same

instructional action. The pairs of statements included in the scale are not mirror opposites but

relate to somewhat different instructional actions. The scale was kept short intentionally, in view

of the overall length of the survey.

Given these considerations, it is not clear how to interpret respondents’ uncertainty or

agreement with some of the non-Reading First statements. We cannot be sure if respondents

truly agree with the statements or if they simply did not read the items carefully enough to

realize how these statements departed from Reading First views.

While the majority of both subgrant school and statewide course participants agreed with

the Reading First statements on reading instruction, the mixed views and uncertainty expressed

in response to some statements may indicate a need for the MRF course to place more emphasis

on certain ideas about best practices in reading instruction. Areas where respondents indicated

mixed views or uncertainty included: grouping practices, phonemic awareness, vocabulary, and

fluency.

Instructional practice. Course participants from subgrant school sites were asked to

describe their instructional practices in literacy in two survey items. One item asked participants

“How close is the match between the core reading program and your own beliefs about how to

teach children to read?” and specified four possible response choices ranging from “An exact

match” to “Not similar at all.” This survey item was used in a teacher attitude survey by

Foorman et al. (1998). Table 30 below presents the results for this item.

Table 30. How close is the match between the core reading program and your own beliefs about how to teach children to read?

Response                                                                      N      %
An exact match. This is the way I already teach.                              7    8.4%
Very similar. I agree with most aspects of the core reading program.         40   48.2%
Somewhat similar. I agree with some aspects of the core reading program.     32   38.6%
Not similar at all. My beliefs about the teaching of reading are
contradictory to those of the core reading program.                           4    4.8%

As these data indicate, subgrant school course participants were somewhat split in their

responses to this survey item. Eight percent (8%) of respondents said the core reading program

was “an exact match” with the way they already teach. Almost half (48%) said the program is

“very similar” to their own instructional beliefs, agreeing with most aspects of the core

program, while 39% said the program is “somewhat similar,” agreeing with some aspects of it.

Five percent (5%) of the respondents said the program is “not similar at all.”

These findings indicate varied levels of agreement with the instructional approach used in the

core reading program adopted by subgrant schools as part of their participation in the MRF

program.

Another item asked participants “How many minutes per day do your students spend in

the following activities during reading instruction?” and specified four different types of student

grouping for reading instruction. Table 31 below presents the results from this survey item, for

just those respondents who identified themselves as regular classroom teachers or special

education teachers.

Table 31. How many minutes per day do your students spend in the following activities during reading instruction?

Activity                          N    Minimum   Maximum   Mean    Std. Deviation
Teacher reading to students      56       5        90      24.46       12.35
Small-group guided reading       57      10        90      33.11       19.23
Whole-class instruction          54       0        60      31.94       15.28
Independent reading              57       8        60      21.28        9.10

Across respondents, there was considerable variation in the amount of time reported for each of

the four instructional grouping types. Subgrant school teachers reported that students spend, on

average, almost as much time in whole-class instruction as in small-group instruction.

Additionally, subgrant educators spend considerable time reading to students and having students

read independently, which provides fewer opportunities for students to hear other students read

aloud and receive corrections from the teacher. The reader is cautioned that the survey item did

not ask teachers to identify the types of instructional activities or reading skills for which they

would use each grouping strategy; it simply asked how students typically spend their time during

reading instruction.
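The descriptive statistics reported in Table 31 (N, minimum, maximum, mean, standard deviation) are standard summaries; a minimal sketch of how one table row could be computed, using invented per-teacher minutes rather than the study's data, is:

```python
from statistics import mean, stdev

def describe(minutes):
    """Return (N, min, max, mean, sample standard deviation), rounded to
    two decimals, for a list of reported minutes per day."""
    return (len(minutes), min(minutes), max(minutes),
            round(mean(minutes), 2), round(stdev(minutes), 2))
```

Note that `stdev` computes the sample (n-1) standard deviation, the conventional choice for survey data of this kind.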

Impacts of new learning on instructional practice. The spring 2005 survey included two

similar items asking respondents to describe how the new knowledge they gained through MRF

professional development activities had impacted their instructional practice during year one.

One item asked participants to “give 2-3 examples of how you have applied new knowledge of

reading concepts to your reading instruction this year,” while another item asked participants to

“give 2-3 examples of how this [MRF] course has had a positive impact on your reading

instruction.”

In response to the first item, there were 76 written responses from subgrant school course

participants and 215 from statewide course participants. Respondents from both subgrant and

non-subgrant schools were overwhelmingly positive. Respondents said they were using activities

they had learned about through the MRF course, particularly in the areas of fluency, vocabulary,

phonological awareness, and comprehension. Some respondents said they were emphasizing one

or two elements of reading in their instruction. Respondents also frequently mentioned using

reading assessments to identify students’ weaknesses.

Respondents said they were learning to deliver more systematic, explicit instruction in

reading, particularly in the area of phonological awareness. One respondent wrote: “It [MRF

course] gave me a better sense of how to go about developing a program of systematic, explicit

instruction.” Another respondent wrote: “Phonics instruction is easier to integrate

systematically.”

Respondents said the MRF course had helped them better understand the differences

between “phonics,” “phonological,” and “phonemic.” One respondent commented, “Becoming

aware of differences between phonics and phonemic awareness has changed my instruction. I

include different activities to cover both, instead of one.”

Many respondents mentioned that they felt the activities they had learned to use through

the MRF course made reading instruction more fun and increased student engagement. Some

respondents said they were including more games and play as an instructional approach for

reading, for example, by having students act out stories or retell stories using puppets.

One respondent explained: “I’ve used Readers’ Theatre and it seemed to be very effective. It

really put a spark in the reading lesson and kids were very motivated.”

Several respondents said they had improved their questioning strategies, stopping while

reading aloud to check for understanding. Some respondents said they were giving more

attention to higher level thinking skills (for example by asking students to make predictions

about the story) and metacognitive skills (such as teaching students self-monitoring strategies for

reading comprehension).

Respondents said they were modeling reading strategies for students more often. One

respondent wrote: “My modeling activities have more purpose than before.” Other respondents

said they were using scaffolding approaches more effectively.

Many respondents said they were emphasizing oral reading and retelling more in their

practice. Many respondents said they were using the literacy time differently such as trying to

include all five elements of reading daily; allowing more time for students to reread stories to

strengthen fluency; and increasing waiting time for student responses. Some respondents said

they were using writing as an instructional tool for reading. A few respondents said they were

using a basal reader less frequently and using more literature groups. These written responses

were not analyzed by grade level, so we cannot distinguish what views were expressed for

certain grade levels.

Several respondents said the MRF course had helped them learn about different ways to

organize the physical classroom space to promote reading and writing in the classroom and that

they had made changes in their own classroom space.

Some respondents said they had improved their ability to make instructional plans,

beginning with the student outcome goals and planning backwards to select activities to promote

those goals.

Respondents mentioned many specific instructional activities that they had incorporated

into their reading instruction, and said they felt these activities were effective strategies for

developing students’ reading skills. Specific activities that were mentioned include:

Phonics

• Word mapping
• Word sounding
• Word families
• Word wall
• Secret word
• Syllable bingo, chunk bingo
• Flip book
• More sequential instruction of phonics decoding

Phonological Awareness

• Short drills
• Sound cards
• Blending sounds
• Onset and rime
• Rhyming

Vocabulary

• Secret word
• Graphic organizers
• Using the 3-tier system: evaluating which tier students are at and then teaching specific vocabulary words
• Concept definition map
• Cloze procedures

Comprehension

• Think-aloud reading
• Oral retelling
• Story maps, story charts, graphic organizers
• Mental imaging, visualization
• Predicting what will happen in a story
• Comprehension monitoring strategies
• Cloze procedures

Fluency

• Read aloud/oral reading/partner reading/choral reading
• Echo reading
• Readers’ Theater
• Timed leveled rereads
• Poems
• Speed drills

In addition to the specific activities or instructional materials listed above, respondents

said they were making more use of leveled books and big books for reading instruction. Several

respondents said they were no longer using the “Round Robin” approach for small-group reading

where students take turns reading short text passages aloud, but are instead using “guided reading”

practice, which allows all students to read quietly at the same time and therefore obtain more

reading practice.

In response to the survey item asking how the MRF course had impacted reading

instruction in year one, subgrant school course participants provided 77 written comments and

statewide course participants provided 215 written comments. Responses to this item were quite

similar to responses for the item asking how new knowledge of reading concepts generally had

impacted reading instruction. Responses to both items emphasized similar changes in

instructional practice: the use of a greater variety of instructional activities for all five elements

of reading; increased use of assessments in reading; more systematic and explicit reading

instruction; a better balance across the five elements of reading and between whole group and

small group instruction; use of the guided reading approach instead of the “Round Robin”

approach; increased use of hands-on activities, games and play to increase student interest and

engagement; increased use of literature in reading instruction; reorganization of the physical

classroom space to better promote reading and writing; increased use of teacher modeling of

reading strategies; and increased waiting time in teacher questioning.

While the survey item asking about the impacts of literacy knowledge generally on

instruction elicited mentions of specific instructional activities that respondents were beginning

to use in the classroom, the survey item asking about the impact of the MRF course on

instruction elicited more comments on perceived affective benefits for the course participants.

Both subgrant school and statewide course participants said they felt the course was valuable for

new learning and for reviewing prior knowledge: it supported and affirmed what they were doing

in the classroom, encouraged participants to reflect on their own practices and make

adjustments, and provided opportunities for participants to share ideas with their peers both

within their schools and across schools in the state. Some respondents said there was increased

collaboration and shared instructional understandings or goals within their schools and districts

as a result of their district’s participation in the MRF course.

Written responses to this item from subgrant school participants included:

“It has affirmed where I am in terms of my own classroom instruction and expectations, as well as guiding me towards new ideas, strategies, and theories.”

“It has validated a lot of what I knew to be best practice. It has helped me increase my teaching methods and to become more systematic and explicit.”

“The components of reading instruction were brought to the forefront and it helped me create a better balance within the classroom.”

“Wow! My reading groups are much more focused. Fluency is now being taught whereas I never taught it before.”

“I feel better connected to my peers. We have more of a common language.”

“The course has built community and provided opportunities for staff to become current on best instructional methods for our students.”

A few of the subgrant course participants (9) wrote less positive comments regarding the

course suggesting that they did not learn as much as they had hoped. Some of these respondents

indicated they had prior knowledge of literacy through their district’s participation in the MLP,

and they suggested that the content of the MRF course be adjusted to fit the particular needs of

each subgrant school faculty.

Many of the statewide course participants said they felt more confident in their reading

instruction, as a result of their participation in the course. Comments from statewide course

participants included:

“It [MRF course] has given me more confidence. I realized how many things I was doing right, but how to improve.”

“I have been able to dovetail old knowledge with new to make better use of resources and activities in literacy instruction.”

“The course has caused me to re-examine the ways in which I teach and to make changes in some areas of reading instruction.”

“I was more aware of what it is that I am expecting for an outcome when assessing my class.”

“I have a better understanding of why the things I spend time on are important.”

“It [MRF course] has made me more explicit in my instruction—comprehension strategies, vocabulary instruction, and fluency.”

“[Several] educators from this district were involved in this course and we are all abuzz with added knowledge.”

“[The MRF course] Provided the opportunity to dialogue with colleagues that continues after the class.”

Summary for MRF course surveys. Subgrant school and statewide course participants

gave high ratings of satisfaction with the MRF course, and most said they would recommend the

course to other classroom teachers, special education teachers, or educational technicians.

Survey respondents said their instructors were very knowledgeable and helpful. Respondents

indicated they felt the course was a worthwhile professional development experience that had

positive impacts on their knowledge of literacy and reading instruction. Specifically,

respondents felt the course increased their understanding of important components of reading,

how young children develop reading skills, and how to support and monitor the development of

reading skills in young children through instruction and assessment.

Respondents said they were using a wider variety of instructional activities that they had

learned about through the MRF course and that these activities were having the impact of getting

students more engaged in reading instruction. Respondents especially liked using instructional

activities that incorporated fun games or play (such as the Readers’ Theatre). Respondents said

they were doing more hands-on activities with students, modeling reading strategies for students

more often, and taking time to check for comprehension. Respondents said the course had helped

them to understand the importance of fluency and comprehension, and that they were putting

more emphasis on these areas in their instruction such as through oral reading and retell.

Respondents said they have learned how to deliver more explicit and systematic reading

instruction, particularly in the area of phonological awareness, and they have learned how to

conduct reading assessments and use the information to target students’ reading skills.

A few respondents in subgrant schools indicated that the content of the course did not

take into consideration their prior experience in literacy, such as their school’s participation in

the MLP. These respondents suggested that the content and pacing of the course should be

differentiated to meet the particular needs of each school site. The higher concentration of

educational technicians and non-regular classroom teachers in the statewide course sites meant

that these groups had a steeper learning curve than did participants from subgrant schools.

Although respondents indicated a high level of satisfaction with the course and

instructional materials, they offered a few suggestions for improvement. These suggestions

included: more instructional activities (particularly hands-on activities and manipulatives for

younger children), more concise and better organized handouts in a binder, more time for sharing

of instructional ideas with peers in class, and instructor and student modeling of instructional

activities and strategies.

A comparison of pretest and posttest survey data on respondents’ self-ratings of literacy

knowledge before and after the MRF course showed statistically significant gains in reported

levels of literacy knowledge for all five elements of reading (phonological awareness, alphabetic

principle and phonics, vocabulary, fluency, and comprehension), three areas of literacy

knowledge (research-based reading practices, reading development, and literacy environments),

and five areas of instructional knowledge (definitions or components of the reading element,

reading research on the element, instructional practice for the element, ways to monitor progress

for the element, and differentiation of instruction for the element).
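The pre/post comparison summarized above rests on paired comparisons of self-ratings; the specific test is not restated here, but a paired-samples t statistic, the conventional choice for such gains, can be sketched as follows (using hypothetical self-rating data, not the study's dataset):

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-samples t statistic for posttest-minus-pretest gains.

    A positive value indicates an average gain; significance would be
    judged against the t distribution with len(pre) - 1 degrees of freedom.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))
```

For example, `paired_t([1, 2, 3, 4], [3, 3, 5, 6])` measures whether four hypothetical respondents' self-ratings rose reliably from pretest to posttest.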

Respondents from statewide course sites reported larger gains than did subgrant school

respondents, which may be explained by differences in demographics and prior experience in

literacy for the two groups. Statewide respondents reported the largest gain in knowledge in the

areas of comprehension and fluency, while subgrant school respondents reported the largest gain

in the area of fluency. Across the five areas of instructional knowledge, respondents indicated

they were less knowledgeable about monitoring student reading progress, differentiation of

instruction, and reading research.

Both subgrant school and statewide respondents indicated a need for more professional

development in the areas of reading comprehension, phonological awareness, fluency,

vocabulary, guided reading, differentiation of instruction and interventions, assessment, and

organization of the physical classroom.

Both subgrant school and statewide respondents generally indicated high levels of

agreement with Reading First views of best practices in reading instruction. Areas where

respondents indicated mixed views or uncertainty were grouping practices, phonological

awareness, vocabulary, and fluency.

Subgrant school respondents (Cohort 1) provided feedback on their use of the core

reading program and on their own instructional practices in reading during year one. Subgrant

school respondents were generally positive in their feedback on the core program, while some

respondents were less positive. More than half of the respondents (61%) agreed that the core

program had had positive impacts on students’ reading development, and 55% of the respondents

agreed that they would recommend the program for students in the grades they teach. Slightly

less than half of the respondents (45%) agreed that they would recommend the program for

students with special needs. Most respondents (79%) agreed that the screening, progress

monitoring, and benchmark assessments have informed their reading instruction during year one.

Subgrant school respondents described supports that had been helpful in their efforts to

implement the core reading program. These supports included: instructional materials and

professional development provided by MRF; in-classroom support from coaches,

interventionists, and other literacy specialists; and peer support and sharing of instructional ideas.


Suggestions for improvements to the core reading program included: more levels for

leveled books; better quality literature and greater diversity in literary genres (novels, nonfiction,

poetry, etc.); larger print size for younger children; more manipulatives for kindergarten

students; more phonics activities for kindergarten students; and less paperwork. The least

favorite component of the reading program was the student workbook.

Subgrant school respondents did not wholly agree with the instructional approach

advocated by the core reading program. Respondents said the core reading program is either

“very similar” or “somewhat similar” to their own beliefs about how to teach children to read.

This finding indicates some degree of disjuncture between the core reading program and teachers’

beliefs. Further, respondents indicated that their students spend almost as much time each day in

whole-class instruction as they do in small-group instruction. A considerable portion of the

literacy block is also spent with the teacher reading to students or with students reading

independently. These grouping practices give less priority to small-group instruction than is

suggested by Reading First.

Continued participation in the MRF course and other professional development activities

and continued practice with new instructional approaches may bring teachers’ beliefs and

practices into closer alignment with the instructional practices advocated by Reading First.

Teachers’ beliefs will influence how they choose to implement components of the program.

Some subgrant respondents indicated in their written comments that there is a great deal of

variation in teachers’ support for the core reading program and implementation effort in their

schools. This presents a challenge for both school leaders and for the MRF program.


Section Summary

Surveys were conducted in year one with the subgrant school Literacy Leadership Teams

(Cohort 1) and with MRF course participants from subgrant schools (Cohort 1) and from

statewide course sites. Data from both surveys indicate a high level of satisfaction with the

technical assistance, training on reading assessments, and professional development provided by

MRF staff, consultants, and course instructors. Suggestions for improvements to program

support included: earlier communication of training dates, grade-specific professional

development on each reading component, and models for implementing the core reading

program.

Respondents indicated that the professional development experiences increased their

knowledge of reading components, acquisition, instruction, and assessment; improved their

instructional skills; and increased teacher reflection on instructional practices. Respondents in

some subgrant schools indicated that the professional development experience helped educators

to develop shared understandings and goals around literacy and had increased teacher

collaboration in their schools. Teachers described some of the changes in their reading

instruction, which included: delivering more systematic and explicit reading instruction, giving

more emphasis to fluency and comprehension, using more hands-on activities, modeling reading

strategies for students, and conducting reading assessment to target students’ reading skills.

Educational technicians and special education teachers indicated that much of the material

covered in the MRF course was new to them and that it was challenging to process so much

information during the academic year. Respondents from a few subgrant schools indicated that

they began the course and their participation in the MRF program with prior knowledge and

experience in literacy through their schools’ participation in MLP. These respondents indicated


they would like the course content and pacing to be differentiated to meet each school’s

particular needs.

Respondents to both the Leadership Team Survey and the MRF Course Survey indicated

a need for more training on monitoring students’ reading progress, analyzing and using reading

assessment data to inform instruction, and differentiating instruction and providing interventions

for students with different learning needs. Respondents to the MRF Course Survey also indicated

a need for more training on reading research, reading comprehension, phonological awareness,

fluency, vocabulary, guided reading, managing instructional time within the literacy block, and

organizing the physical layout of the instructional space in the classroom. Subgrant school

respondents indicated that they hold some instructional views and use grouping practices that, in

some cases, depart from those that are advocated by Reading First. This finding may indicate a

need for greater emphasis on certain instructional ideas.

Analysis of the pretest and posttest data from the MRF Course Surveys revealed that both

subgrant school and statewide course participants reported significant gains in knowledge of

literacy instruction and all five essential elements of reading by the end of the course. These

gains were statistically significant (p < .05) and large (Cohen’s d > .80). Statewide course

participants reported the largest gains in knowledge. Respondents reported the largest gains in

the areas of reading comprehension and fluency. The method of using self-reported estimates of

knowledge presents limitations to these findings. More direct and objective measures of literacy

knowledge would be more reliable and preferable.
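For readers unfamiliar with the effect-size criterion cited above, the following sketch computes a paired-samples Cohen's d (the mean of the pre/post differences divided by their standard deviation, one common formulation for pretest-posttest designs). The ratings are invented for illustration; this is not the evaluators' actual code or data.

```python
# Illustrative paired-samples Cohen's d on invented pre/post self-ratings
# (1-5 scale) for one knowledge area. Demonstrates the d > .80 "large"
# threshold used in the report; the data below are hypothetical.
from statistics import mean, stdev

pre  = [2, 2, 3, 1, 2, 3, 2, 2]
post = [4, 3, 4, 3, 4, 4, 3, 4]

diffs = [b - a for a, b in zip(pre, post)]
d = mean(diffs) / stdev(diffs)  # within-subject (paired) effect size

print(round(d, 2))  # → 2.81, a large effect by the d > .80 rule of thumb
```

Because the paired formulation divides by the standard deviation of the differences rather than of the raw scores, it typically yields larger values than the between-groups d; the report does not state which variant was used.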

Survey respondents suggested some improvements to the MRF course, including more

hands-on activities and manipulatives for young children, better organization of course handouts,


more time for participants to share instructional ideas, and more modeling of instructional

activities and strategies.

Subgrant school respondents to both surveys said that supports that were helpful to their

school’s efforts to implement the MRF program and the core reading program included:

assistance and training from MRF; the instructional materials and core reading program provided

through MRF; in-classroom support from literacy coaches and interventionists; the literacy

leadership team and principal support; funding for substitutes; and peer support.

The chief challenge to implementation mentioned by subgrant school respondents

centered around time—time to engage in the learning and time to implement the required

changes in instruction and assessment. For a few subgrant schools, turnover in school leadership,

teacher stress and low morale, uneven teacher support for instructional change, and the need to

reconcile MLP with the core reading program presented challenges to implementation.

Most subgrant school respondents (79.3%) to the MRF Course Survey agreed that the

screening, progress monitoring, and benchmark assessments have informed their reading

instruction during year one. A majority of respondents (61%) agreed that the core reading

program had positive impacts on their students’ reading development by the end of the first year

of implementation. Respondents were less sure about whether or not they would recommend the

core reading program for use with all students in the grades they teach or for students with

special needs. Suggestions for improvements to the core reading program included: more books;

more levels for the leveled books, better quality literature, greater diversity in literary genres,

larger print size for younger children, more manipulatives and phonics activities for young

children, and less use of the student workbook.


Part V: Baseline Student Assessment Data

This section of the evaluation report presents student reading assessment results for year

one for the seven Cohort 1 schools. These results represent baseline data, as equivalent data

sources were not available for the year prior to the schools’ participation in the MRF program. It

is important to point out that these baseline assessment results cannot be used to determine the

impacts or effectiveness of the MRF interventions. These data simply describe students’ reading

skills during the first year of the program. Assessment results from subsequent years will be

collected and compared with these baseline data to determine if the interventions resulted in

improved reading skills.

The focus of this report is on participating schools rather than districts or local

educational agencies (LEAs). Assessment data are collected and analyzed at the school and

grade level, not district level. Thus, we refer to “participating schools” to denote schools that are

participating in the MRF program through subgrants to their LEAs.

MRF required the participating schools to administer the Dynamic Indicators of Basic

Early Literacy Skills (DIBELS) for the purpose of screening and progress monitoring assessment

in reading. Most DIBELS measures were administered at three points in the year (fall, winter,

spring), and the measures could be used for the purpose of screening or progress monitoring at

any time during the year. Typically, progress monitoring assessments were administered in

between the screening assessments, and the spring assessments were used as outcome measures

for some elements of reading. Participating schools also conducted diagnostic assessments

throughout the year for reading as needed; however, these data were not analyzed for the purpose

of the evaluation. For outcome measures, MRF required participating schools to administer both

the DIBELS and TerraNova CTBS reading and Plus assessments (CTB McGraw Hill, Aug.


2001). The TerraNova measures reading comprehension and the Plus subtests measure reading

vocabulary and word analysis (phonics). MRF required participating schools to administer the

TerraNova in grades 1-3 in year one, and in grades K-3 in year two. Two of the seven Cohort 1

schools elected to administer the TerraNova in kindergarten in year one. DIBELS assessments

were administered in grades K-3. The required assessments are listed in Table 32 below.

Table 32. DIBELS and TerraNova Assessments Required by MRF (Year One)

Kindergarten
  Reading Component    Fall Screening       Winter Benchmark   Spring Benchmark/Outcome
  Phonemic Awareness   ISF, PSF (midyear)   ISF, PSF           PSF
  Phonics              LNF, NWF (midyear)   LNF, NWF           LNF, NWF
  Vocabulary           WUF                  WUF                WUF

Grade 1
  Phonemic Awareness   PSF                  PSF                PSF
  Phonics              LNF, NWF             NWF                NWF, TerraNova (Plus)
  Vocabulary           WUF                  WUF                WUF, TerraNova (Plus)
  Fluency              ORF (midyear)        ORF                ORF
  Comprehension        RTF                  RTF                RTF, TerraNova

Grade 2
  Phonics              NWF                  --                 TerraNova (Plus)
  Vocabulary           WUF                  WUF                WUF, TerraNova (Plus)
  Fluency              ORF                  ORF                ORF
  Comprehension        RTF                  RTF                RTF, TerraNova

Grade 3
  Phonics              --                   --                 TerraNova (Plus)
  Vocabulary           WUF                  WUF                WUF, TerraNova (Plus)
  Fluency              ORF                  ORF                ORF
  Comprehension        RTF                  RTF                RTF, TerraNova

DIBELS subtests included: ISF (Initial Sound Fluency); PSF (Phoneme Segmentation Fluency); LNF (Letter Naming Fluency); NWF (Nonsense Word Fluency); WUF (Word Use Fluency); ORF (Oral Reading Fluency); and RTF (Retell Fluency). Measures shown in bold indicate when students are expected to meet benchmark goals with “established” skills. Benchmark goals have not been established for WUF and RTF. ORF has benchmark goals, but students are expected to increase their fluency skills throughout the elementary grades rather than “establishing” these skills in grades K-3.

The evaluator worked with a private consultant to create a database (using SAS software)

for student assessment data. DIBELS data were downloaded directly from the assessment

vendor’s website. These data included both demographic information about individual students

that schools had entered into the DIBELS database as well as assessment results reported by


DIBELS. The evaluator created a simplified spreadsheet containing the demographic data for

each school’s MRF student population (K-3), and each school reviewed these data at the end of

year one for accuracy, making corrections as necessary. The corrected demographic data were

then uploaded into the evaluator’s database. The evaluator is currently working with the Maine

Department of Education to obtain student demographic data directly from the state’s database,

rather than asking school personnel to provide this information. This action should improve the

accuracy and reliability of the student demographic data in subsequent years of the project.

Each subgrant school purchased a data CD with results from the TerraNova assessment

from the vendor for the evaluation. These data were uploaded into the evaluator’s database. For

both the DIBELS and TerraNova assessments, the evaluator was able to match all tested students

with their assessment scores.

This section of the report presents assessment results for year one in both graphic and

narrative form, organized by the five essential elements of reading and by the type of assessment

used to measure each element of reading (i.e., DIBELS or TerraNova assessments). Data tables

and figures include results for all grades tested and all administrations of the assessment. Data

are presented in aggregate form (including results for all seven Cohort 1 schools), and are

disaggregated by school and by various subgroup populations of students. The number of

students tested (n) and the number or percentage of students not tested for each reading

assessment are included in data tables comparing assessment results for the seven schools and in

Appendix G of this report.

Paired tests were used to determine if differences in assessment results were statistically

significant when comparing beginning-of-year and end-of-year aggregate results, when comparing

assessment results for the seven Cohort 1 schools, and when comparing assessment results for


students in special subgroup populations and results for other students. We note where the

differences in assessment performance were significant at the 95% confidence level (p < .05).
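As an illustration of this kind of paired comparison, the sketch below computes a paired t statistic on invented fall and spring scores for the same ten students. The report does not name the specific paired test used, and the actual analyses were run in SAS, so this Python example is an assumed, standard formulation rather than the evaluators' procedure.

```python
# Illustrative paired comparison of fall vs. spring scores for the same
# ten students. All scores are invented; the paired t statistic is shown
# as one standard choice for this kind of before/after comparison.
from math import sqrt
from statistics import mean, stdev

fall   = [12, 18, 25, 9, 30, 22, 15, 27, 11, 20]
spring = [35, 40, 51, 28, 55, 47, 33, 58, 25, 44]

diffs = [s - f for f, s in zip(fall, spring)]
t = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))  # paired t statistic

# With len(diffs) - 1 = 9 degrees of freedom, |t| > 2.262 corresponds to
# p < .05 (two-tailed), the report's 95% confidence criterion.
print(round(t, 2), abs(t) > 2.262)  # → 15.07 True
```

Pairing each student's two scores controls for between-student variation, which is why the comparison is made on the differences rather than on the two score lists separately.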

This section of the report focuses on addressing the following evaluation questions based

on analysis of the baseline assessment results:

• What progress did students make in each reading element (as measured by DIBELS and TerraNova) over the course of year one?

• What percentage of students met/did not meet the benchmark goal for each

administration of the DIBELS measures, and what percentage of students performed above/below the 40th percentile when compared with a national norm group on the TerraNova outcome measures?

• How do the reading assessment results compare across the seven Cohort 1 participating

schools?

• How do assessment results compare when the data are disaggregated for different student subgroup populations?

After assessment data for year two are obtained (summer 2006), the evaluator will be able to

compare students’ progress on these reading measures over the first two years of the MRF

program. The Maine Educational Assessment (MEA) will be administered to third-grade

students for the first time this year (2005-2006), after being piloted in 2004-2005, and the state

is currently setting performance standards for that assessment. The MEA is also administered to

fourth-grade students. MEA data will be collected in subsequent years to track improvement in

students’ reading scores, and these analyses will be presented in subsequent evaluation reports.

MRF selected the DIBELS set of reading assessments for several reasons. First, the

assessments are standardized and are individually administered. Second, the assessments cover

all five essential elements of reading described in both the National Reading Panel (2000) and

the National Research Council (1998) reports, and there are assessments covering all grades in

the MRF Program (K-3). Third, the assessments have been thoroughly researched (Good,

Simmons, & Kame’enui, 2001) and were found to be valid and reliable measures of early

literacy skills and to be predictive of later reading proficiency. Fourth, each DIBELS assessment

is a short (one minute) measure of skills essential for developing reading fluency.

DIBELS reports student assessment results either in terms of performance levels (low

risk, some risk, at risk), or in terms of students’ scores on the timed assessments (e.g., on WUF,

number of words used correctly in a sentence per minute, or on RTF, percentage of words retold

in one minute that the student read orally). The benchmarks were established through research

with thousands of children across the U.S. (Good, Gruba, & Kaminski, 2001; Good, Simmons, &

Kame’enui, 2001) and represent minimal levels of satisfactory progress for the lowest achieving

students (Good, Gruba, & Kaminski, 2001). Students performing at the “low risk” level are

considered to be making satisfactory progress, indicating a high probability of attaining a

subsequent benchmark goal for a given literacy skill (i.e., performing at grade level). For the

purpose of clearly distinguishing the three performance levels, this report uses the label “high

risk” instead of “at risk.”

The TerraNova CTBS reading assessment was selected primarily for four reasons. First,

the assessment was on the list of approved reading assessments for Reading First. Second, all

MRF participating schools are required to participate in the Maine Literacy Partnership (MLP),

and MLP required the use of TerraNova as an outcome measure. Third, many schools in Maine

have historically administered the TerraNova assessment, so educators were familiar with

administering this assessment. Fourth, the CTBS edition of the TerraNova was the only edition

for which reading assessments could be purchased as a stand-alone package (i.e., without the

mathematics assessment). The more recent edition of the TerraNova did not offer that option.


Information about the norming process, validity, and reliability of the TerraNova is described in

the publisher’s technical report (CTB McGraw-Hill, August 2001).

While the DIBELS assessments are criterion-referenced and report students’ performance

in terms of absolute standards (as levels of risk for subsequent failure to meet reading

benchmarks), the TerraNova is a norm-referenced assessment that reports students’ performance

in terms of percentile rankings for students nationally who took the same assessment. Thus, the

two types of assessments use very different frameworks and results must be interpreted in

relation to each assessment’s structure and reporting approach.

Maine participating schools use an assessment format that includes only multiple-choice

items for the TerraNova CTBS. The assessment has two parts that are similar in composition

and format. Students generally take a break between the two parts of this assessment or may

take them on different days. This assessment covers several different reading skills as determined

by the grade level. Each part of this assessment takes between 25 and 40 minutes to administer,

depending on the grade level. Either the teacher reads orally the text passages, the questions, and

the possible response choices or students read the material themselves, depending on the grade

level. Reading skills covered by this assessment include: introduction to print (K-2); basic

understanding (K-3); oral comprehension (K-1); analyzing text (grades 1-3); evaluating and

extending meaning (grades 1-3); and identifying reading strategies (grades 1-3).

The TerraNova Plus assessments for Word Analysis and for Vocabulary cover just those

particular content areas. Each of these assessments takes 15 minutes to administer, and only

students in grades 1-3 take the Plus assessments; kindergarten students do not. The

actual time for each assessment in grades K-3 is indicated in

Table 33 below. Most of the participating schools in Maine also administer the practice items,

which take 40 minutes, to help students become familiar with the test format.

In year one, MRF gave participating schools the option to use the TerraNova in

kindergarten, and two of the seven participating schools did so. In year two (2005-2006), MRF

required participating schools to administer this measure in kindergarten.

Table 33. TerraNova Administration Time (shown in minutes)

TerraNova Assessments               Kindergarten   Grade 1   Grade 2   Grade 3
Reading Assessment, Part I               25           30        45        40
Reading Assessment, Part II              30           30        30        25
Plus Assessment for Word Analysis        --           15        15        15
Plus Assessment for Vocabulary           --           15        15        15

Aggregate Assessment Results: Year One

Student progress in phonemic awareness. The DIBELS Initial Sounds Fluency (ISF) and

the DIBELS Phoneme Segmentation Fluency (PSF) assessments measure phonemic awareness

skills. Phonemic awareness is the ability to hear and manipulate sounds in words, which is a

fundamental skill required for reading. The PSF is the primary measure for this element of

reading.

ISF requires the student to identify and produce the initial sound (phoneme) in a word.

Researchers have characterized this skill as one typically accomplished at kindergarten age

(Moats, 2003; Snow, Burns, & Griffin, 1998). According to the developers of DIBELS, students

who can correctly identify/segment 25 or more initial sounds per minute by the middle of

kindergarten are likely to meet subsequent phonemic awareness and phonics benchmarks,

whereas students who identify fewer than 10 initial sounds per minute are at risk for not meeting

subsequent benchmarks and may need intensive instructional support (Good & Kaminski, 2002,

also online: http://dibels.uoregon.edu).
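The midyear ISF cutoffs just described imply a simple three-way classification. The sketch below is illustrative only: the function name is invented, and the assignment of scores from 10 through 24 to the "some risk" band is an assumption based on the two cutoffs stated above (the report relabels DIBELS's "at risk" category as "high risk").

```python
# Illustrative classification of a midyear kindergarten ISF score into
# the three risk categories used in this report. The cutoffs of 25 and
# 10 correct initial sounds per minute come from the text above; the
# function name and the 10-24 "some risk" band are assumptions.
def isf_midyear_risk(correct_initial_sounds_per_minute: int) -> str:
    if correct_initial_sounds_per_minute >= 25:
        return "low risk"    # on track for later PA/phonics benchmarks
    if correct_initial_sounds_per_minute >= 10:
        return "some risk"   # may need strategic support
    return "high risk"       # may need intensive instructional support

print(isf_midyear_risk(28), isf_midyear_risk(14), isf_midyear_risk(6))
# → low risk some risk high risk
```

Each DIBELS measure has its own cutoffs and assessment windows, so an analogous mapping would be defined per measure and per grade.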


The teacher or examiner states four different words and presents pictures for each word.

The examiner then asks, “Which picture begins with (a letter sound)?” The student is asked to

correctly identify the picture that corresponds to the sound and also produce that sound orally.

The student’s score is the number of correct initial sounds identified/produced per minute. ISF

takes about 3 minutes to administer (Good & Kaminski, 2002).

PSF requires the student to segment a word into the three or four sounds (phonemes)

comprising that word. According to the developers of DIBELS, students who can correctly

segment 35 or more words per minute by the end of kindergarten and fall of first grade are likely

to meet subsequent phonemic awareness and phonics benchmarks, while students who segment

fewer than ten words per minute are at risk for not meeting subsequent benchmarks and may

need intensive instructional support (Good & Kaminski, 2002).

The examiner states a word and asks the student to produce orally the individual sounds

(phonemes) that make up the word. The student’s score is the number of correct phonemes

produced in one minute. A student’s risk category for PSF is based on the student’s performance.

PSF takes about 2 minutes to administer (Good & Kaminski, 2002).

ISF results. The following figure presents the aggregate results from the ISF

assessment for kindergarten for the fall and winter administrations of the assessment during the

2004-2005 academic year.


Figure 9

The results for ISF indicate mixed levels of progress for kindergarten students across year

one. Although the percentage of students in the highest risk category decreased by 2.7

percentage points by mid-year, the percentage of students in the low risk category decreased sharply (by 27.1 percentage points).

Overall, the percentage of students considered to be below benchmark (“some” and “high” risk

categories combined) by mid-year increased significantly, from 35% at the beginning of the year

to 58.6% by mid-year. These results indicate mixed progress for initial sound recognition skills.
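As a concrete check, the below-benchmark figures cited above are simply the sum of the "some risk" and "high risk" percentages reported in Figure 9; the short sketch below (illustrative only, using the aggregate values from the figure) reproduces them.

```python
# Reproducing the ISF "below benchmark" percentages cited above by
# summing the "some risk" and "high risk" shares from Figure 9.
beginning = {"low": 59.2, "some": 24.1, "high": 10.9}  # fall screening
middle    = {"low": 32.1, "some": 50.4, "high": 8.2}   # winter benchmark

below_fall   = round(beginning["some"] + beginning["high"], 1)
below_winter = round(middle["some"] + middle["high"], 1)

print(below_fall, below_winter)  # → 35.0 58.6
```

Note that the three risk categories in the source figure do not sum to 100 percent; the remainder presumably reflects students who were not tested in a given period (counts of untested students are reported in Appendix G).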

PSF results. Aggregate results from PSF for kindergarten and first grade are

presented in the following figures.

ISF Kindergarten: percentage of students by risk category

              Low     Some    High
  Beginning   59.2    24.1    10.9
  Middle      32.1    50.4     8.2
  End          --      --      --


Figure 10

Figure 11

PSF Kindergarten: percentage of students by risk category

              Low     Some    High
  Beginning    --      --      --
  Middle      49.9    24.4    16.4
  End         58.4    27.9     7.2

PSF Grade 1: percentage of students by risk category

              Low     Some    High
  Beginning   32.3    41.5    19.9
  Middle      76.9    14.8     1.8
  End         76.6    17.5     0.6


The results for PSF for both kindergarten and first-grade students in year one indicate

strong progress. In kindergarten, the percentage of students considered below benchmark

(“some” and “high” risk categories combined) decreased from 40.8% at mid-year to 35.1% by

end of year, with the percentage of students in the high-risk category decreasing by more than

half. In first grade, the percentage of students considered below benchmark decreased

significantly from 61.4% at the beginning of the year to 18.1% by the end of the year. The

percentage of students in the high-risk category alone decreased significantly from 19.9% at the

beginning of the year to 1.8% by mid-year, and to 0.6% by the end of the year. Statistically

significant (p < .05) differences were found between beginning-of-year and end-of-year

performance on this measure in grade 1. The PSF results indicate strong progress for phoneme

segmentation skills.

Student progress in phonics. The DIBELS Nonsense Word Fluency (NWF) and the

TerraNova Word Analysis assessments measure phonics skills including the alphabetic principle

and the ability to decode written text. The DIBELS Letter Naming Fluency (LNF) assessment

measures the student’s ability to name a written letter symbol, which is a skill that is correlated

with phonics skills. Phonics involves the ability to understand the relationship between sounds

(phonemes) and letters (graphemes), and to sound out letters or decode whole words. These skills

are critical for reading text. NWF and TerraNova Word Analysis are the primary measures for

this element of reading.

LNF requires the student to identify the names of letters. The examiner presents the

student with a page of randomly ordered upper- and lower-case letters and asks the student to

name as many letters as he/she can in one minute. The student’s score is the number of letters

named correctly in one minute. The assessment takes about 2 minutes to administer. There are no


benchmark goals for this measure. According to the developers of DIBELS, students who can

correctly identify 40 or more letters per minute by the end of kindergarten are likely to meet

subsequent phonics benchmarks, while students who identify fewer than 29 letters per minute are

at risk for not meeting subsequent benchmarks. Students scoring in the lowest 20% are

considered to be at “high risk” for failing to achieve subsequent phonics benchmarks (Good &

Kaminski, 2002).

NWF measures alphabetic principle skills by requiring the student to identify the sound

(phoneme) that corresponds to a written letter and to decode nonsense words. The examiner

presents the student with a page of randomly ordered three phoneme nonsense words in VC or

CVC pattern, and asks the student to read as many words as possible or, alternatively, to sound

out the individual letter sounds for as many words as possible in one minute. The student’s score

is the number of letter sounds produced correctly in one minute. According to the developers of

DIBELS, students who can correctly decode 50 or more nonsense words per minute by the

middle of first grade are likely to meet subsequent phonics benchmarks, whereas students who

decode fewer than 30 nonsense words per minute are at risk for not meeting subsequent

benchmarks and may need intensive instructional support (Good & Kaminski, 2002). A student

who can decode whole words is a more fluent reader and has more automated phonics skills than

a student who can only sound out individual letters. The assessment takes about 2 minutes to

administer.
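The categorization logic described above can be sketched as a small routine. This is an illustrative example, not part of the MRF evaluation: the function name is this sketch's own, and the mid-first-grade cutoffs (50 and 30 correct letter sounds per minute) are supplied as defaults.

```python
def nwf_risk_category(sounds_per_minute, low_risk_cutoff=50, high_risk_cutoff=30):
    """Classify a mid-first-grade NWF score using the cutoffs described above:
    >= 50 correct letter sounds per minute is on track ("low" risk), < 30 is
    "high" risk, and scores in between fall in the middle ("some" risk) band.
    Illustrative only -- the official DIBELS materials define the categories."""
    if sounds_per_minute >= low_risk_cutoff:
        return "low"
    if sounds_per_minute < high_risk_cutoff:
        return "high"
    return "some"
```

Applied to every tested student in a grade, such a rule could be used to tally the percentage of students in each risk category of the kind reported in the figures that follow.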

The TerraNova Word Analysis is administered to students in grades 1-3 and takes about 15 minutes. This assessment includes items on the following phonics skills:

consonants (singles, blends, digraphs; grades 1-3); sight words (grade 1); vowels (grades 1-3);


contractions, compounds (grades 2-3); roots, and affixes (grades 2-3). Word Analysis skills are

measured in the context of reading a text passage (CTB McGraw-Hill, 1996).

In grades 1-3, the teacher orally reads the directions and item prompts. In grade 1, the

teacher also orally reads the possible response choices. Tasks may include asking students to: find a word that begins with the same beginning or

ending sound as a word read out loud by the teacher; find a word with the same vowel sound in

the middle of the word as in a word read out loud by the teacher; or to find a written word among

the provided response choices that is the same as a word read out loud by the teacher in a

sentence and by itself. The student’s score on the Word Analysis assessment is reported as a

percentile score.

LNF results. Aggregate results from LNF for kindergarten and first grade are

presented in the following figures.

Figure 12

LNF Kindergarten: Percentage of Students in Each Risk Category

             Low    Some   High
Beginning    50.9   25.5   17.8
Middle       63.4   19.1    8.5
End          53.6   25.7   14.1


Figure 13

LNF Grade 1: Percentage of Students in Each Risk Category

             Low    Some   High
Beginning    47.5   26.7   19.9

The results for LNF for kindergarten students in year one indicate slight progress. In kindergarten, the percentage of students who scored in the “some” and “high” risk categories combined decreased from 43.3% at the beginning of the year to 39.8% by the end of the year. The percentage of students in the high-risk category alone decreased from 17.8% at the beginning of the year to 14.1% by the end of the year.

In first grade, the percentage of students who scored in the “some” and “high” risk categories at the beginning of the year was 46.6%, with 19.9% in the high-risk category alone. Since LNF was administered only one time, it is not possible to describe progress on this measure for first grade.

NWF results. Aggregate results from NWF for kindergarten through second grade are presented in the following figures.


Figure 14

NWF Kindergarten: Percentage of Students in Each Risk Category

             Low    Some   High
Middle       46.7   24.4   19.6
End          48.0   23.1   22.3

Figure 15

NWF Grade 1: Percentage of Students in Each Risk Category

             Low    Some   High
Beginning    27.9   25.8   40.1
Middle       25.2   45.1   22.8
End          44.2   38.0   12.5


Figure 16

NWF Grade 2: Percentage of Students in Each Risk Category

             Low    Some   High
Beginning    34.4   32.2   28.5

The results for NWF indicate mixed levels of progress for year one. In kindergarten, the percentage of students considered below benchmark (“some” and “high” risk categories combined) increased slightly from 44% at mid-year to 45.4% by the end of the year. The percentage of students in the “high” risk category alone increased slightly over the same period from 19.6% to 22.3%. Thus, the results for kindergarten indicate little change across year one.

In first grade, the percentage of students scoring below benchmark decreased from 65.9% at the beginning of the year to 50.5% by the end of the year. The percentage of students in the “high” risk category decreased significantly from 40.1% to 12.5% over the same time period. Statistically significant (p < .05) differences were found between beginning-of-year and end-of-year performance on this measure in grade 1. The results for first grade indicate progress for nonsense word decoding skills.

In second grade, the percentage of students scoring below benchmark was 60.7%, with 28.5% in the “high” risk category alone. Since NWF was only administered at the beginning of


the year, it is not possible to describe students’ progress for this measure for second grade. The

results from the fall assessment indicate that a sizable percentage of second-grade students

demonstrated difficulty with decoding nonsense words.

TerraNova--Word Analysis results. Aggregate results from the end-of-year

TerraNova Word Analysis assessment for first, second and third grades are presented in the

following figure.

Figure 17

TerraNova Word Analysis: Percentage of Students in Each National Percentile Band
(Low risk = 80th percentile or above; High risk = below the 20th percentile)

             80+    60<80  40<60  20<40   <20
Grade 1      25.9   25.6   24.4   13.1    9.1
Grade 2      24.7   19.9   24.7   15.1   13.8
Grade 3      18.1   21.0   22.2   20.4   16.9

The TerraNova assessment was administered at the end of the year in first through third grade. For the Word Analysis measure, 29.6% of all students tested (grades 1-3) scored below the 40th percentile when compared with a national norm group. The percentages of students scoring below the 40th percentile in each grade were: 22.2%, 28.8%, and 37.3% (grades 1-3 respectively). Since results were only available for year one, it is not possible to describe progress on this measure of phonics skills.


In comparison to a national norm group, the percentage of students in the Maine Reading

First schools who scored below the 40th percentile on the TerraNova CTBS was relatively small

(22% to 37%). These results are not surprising, given that Maine schools traditionally score high

on reading relative to other states on the NAEP (website: http://nces.ed.gov/nationsreportcard/).

By contrast, the percentages of students scoring below benchmark on the DIBELS (criterion-

referenced) assessments were: 47% (grade 1) on LNF and 51% (grade 1) and 61% (grade 2) on

NWF. DIBELS measures students’ performance against an absolute standard, rather than in

comparison with the performance of other students nationally. It is important to note that the

TerraNova Word Analysis measures a broader variety of phonics skills than do the DIBELS

measures. In addition to the different reporting methods, DIBELS and TerraNova are

administered under different conditions. DIBELS measures are brief, one-minute tests

administered individually to students. This means that DIBELS also tests students’ automaticity

with a skill. Thus, the two different types of assessments can produce somewhat different

results.

Student progress in vocabulary. The DIBELS Word Use Fluency (WUF) and the

TerraNova Vocabulary assessments measure vocabulary skills. Oral vocabulary helps to build

reading vocabulary, and vocabulary is important for developing reading fluency and reading

comprehension (Adler, 2001). The primary measure for this reading element is the TerraNova

assessment.

WUF assesses a student’s knowledge of vocabulary and expressive language by requiring

the student to use a word correctly in a sentence. The examiner states a word and asks the student

to use that word in a sentence. The student’s score is the number of words the student can use

correctly in a phrase, sentence, or verbal expression within one minute. This assessment is


relatively new and benchmark goals have not yet been specified by the developers of DIBELS.

Within the Maine Reading First program, students scoring above the 40th percentile are

considered to be performing at grade level. The percentile results are based on the number of

students tested in each grade level for the seven cohort 1 schools in MRF.
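Because WUF lacks published benchmarks, each student is ranked against the other MRF students tested in the same grade. A minimal sketch of one common way such a within-group percentile rank can be computed (the report does not specify MRF's exact procedure, and the scores below are hypothetical):

```python
def percentile_rank(score, all_scores):
    """Percent of tested students in the grade scoring below this score.
    One simple definition of percentile rank; MRF's exact method may differ."""
    below = sum(1 for s in all_scores if s < score)
    return 100.0 * below / len(all_scores)

# Hypothetical WUF scores for ten students in one grade:
scores = [4, 10, 15, 22, 22, 30, 41, 41, 55, 60]
pr = percentile_rank(30, scores)   # five of ten scores are lower -> 50.0
at_grade_level = pr > 40           # above the 40th percentile
```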

The TerraNova Vocabulary is administered to students in grades 1-3 and takes about 15 minutes. This assessment includes items on the following vocabulary skills: word

meaning (grades 2-3); multiple meanings of words (grades 2-3); words in context (grades 1-3).

In grades 1-2, the teacher orally reads the directions for test items. In grade 1, the teacher also

orally reads the possible response choices. Students in grade 3 independently read the directions

and test items. Tasks may include asking students to:

find a written word among the provided response choices that has a certain meaning (for

example, a word that is a place where animals live); or to find a written word with about the

same meaning as a written word (also orally read by the teacher in lower grades) from among the

provided response choices. The student’s score on the Vocabulary assessment is reported as a

percentile score.

WUF results. Aggregate results from WUF assessment for kindergarten through

third grade are presented in the following figures.


Figure 18

WUF Kindergarten: Percentage of Students in Each Within-Program Percentile Band
(Low risk = 80th percentile or above; High risk = below the 20th percentile)

             80+    60<80  40<60  20<40   <20
Beginning     9.3    9.8    1.9   26.0    0.0
Middle       15.6   14.6   15.6   16.2   14.6
End          16.2   14.6   16.4   14.6   15.9

Figure 19

WUF Grade 1: Percentage of Students in Each Within-Program Percentile Band
(Low risk = 80th percentile or above; High risk = below the 20th percentile)

             80+    60<80  40<60  20<40   <20
Beginning    11.3   12.5   12.2   12.2   11.9
Middle       16.3   13.4   17.2   15.7   15.1
End          14.8   14.8   13.9   15.7   13.4


Figure 20

WUF Grade 2: Percentage of Students in Each Within-Program Percentile Band
(Low risk = 80th percentile or above; High risk = below the 20th percentile)

             80+    60<80  40<60  20<40   <20
Beginning    12.7   13.9   11.8   13.0   13.0
Middle       16.7   16.1   15.5   16.4   17.3
End          13.6   16.1   14.6   12.7   15.5

Figure 21

WUF Grade 3: Percentage of Students in Each Within-Program Percentile Band
(Low risk = 80th percentile or above; High risk = below the 20th percentile)

             80+    60<80  40<60  20<40   <20
Beginning    11.3   11.0   10.5   11.3   10.5
Middle       16.3   14.9   15.2   15.7   15.2
End          15.7   15.7   14.6   15.2   16.3


In year one, four of the seven cohort 1 participating schools administered the WUF

assessment at all three points in the school year in each grade. Three schools did not administer

this assessment at all three points in the year (schools D, E and F).

The results for WUF indicate little progress across year one for all grades tested

(kindergarten through third grade). The percentile results are based on the number of students

tested in each grade level for the seven cohort 1 schools in MRF. In all grade levels, the

percentage of students scoring at or below the 40th percentile within MRF increased from the

beginning of the year to the end of the year. The percentage of students at or below the 40th

percentile increased by the largest amount in the third grade, while the increase was smaller in

other grades. The percentage of students scoring in the highest risk category (below the 20th

percentile) increased in all grade levels, but increased most significantly in kindergarten and in

third grade. The percentages of students scoring below the 40th percentile at the end of the year

were: 30.5%, 29.1%, 28.2%, 31.5% (grades K-3 respectively). Overall, WUF results indicate

little progress across year one for vocabulary skills.

TerraNova--Vocabulary results. Aggregate results from the TerraNova

Vocabulary for first grade through third grade are presented in the following figure.


Figure 22

TerraNova Vocabulary: Percentage of Students in Each National Percentile Band
(Low risk = 80th percentile or above; High risk = below the 20th percentile)

             80+    60<80  40<60  20<40   <20
Grade 1      18.4   32.8   23.4   14.1    9.4
Grade 2      34.6   23.1   15.7   14.1    9.9
Grade 3      20.7   19.2   21.9   18.4   19.0

For the Vocabulary measure, 28.5% of all students tested (grades 1-3) scored below the 40th percentile when compared with a national norm group. The percentages of students scoring below the 40th percentile were roughly equivalent for first and second grade (23.4% and 24% respectively) but were largest at third grade (37%), particularly for the highest risk category (below the 20th percentile).

The results from the TerraNova Vocabulary measure are roughly comparable to the results from the DIBELS WUF. The percentages of students scoring below the 40th percentile on the TerraNova Vocabulary when compared to a national norm group were: 23%, 24%, and 37% (grades 1-3 respectively), while the percentages of students scoring at or below the 40th percentile on WUF were: 29%, 28%, and 32% (grades 1-3). It is important to note that the TerraNova Vocabulary measures a wider variety of vocabulary skills than does the DIBELS WUF.

Student progress in fluency. The DIBELS Oral Reading Fluency (ORF) assessment measures reading fluency, and this measure is also correlated with reading comprehension skills


(Parker, Hasbrouck, & Tindal, 1992). Fluency is the ability to read a text accurately and

quickly, where decoding becomes automated. As a student becomes a more fluent reader, he/she

is able to read orally with more expression and to focus on text comprehension rather than

decoding (Adler, 2001).

ORF measures a student’s oral reading fluency by requiring the student to read

benchmark text passages orally. A student reads three passages that are calibrated for the

student’s grade level. The student’s score for reading each passage is the number of correct

words read per minute. The student’s final ORF score is the median score across the three

passages the student has read orally. According to the developers of DIBELS, students who can

correctly read 40 or more words per minute by the end of first grade are likely to meet

subsequent fluency benchmarks, whereas students who read fewer than 10 words correctly per

minute are at risk for not meeting subsequent benchmarks for fluency (Good & Kaminski, 2002).
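The scoring procedure above (a median across three passage readings, then comparison to the end-of-first-grade cutoffs) can be sketched as follows. The function names are illustrative, and the cutoffs (40 and 10 correct words per minute) are supplied as defaults; this is not the official DIBELS implementation.

```python
from statistics import median

def orf_final_score(passage_wpm):
    """Final ORF score: the median of the three per-passage scores
    (correct words read per minute), as described above."""
    return median(passage_wpm)

def orf_end_grade1_category(wpm, low_risk_cutoff=40, high_risk_cutoff=10):
    """End-of-first-grade categories from the cutoffs above: >= 40 wpm is
    on track ("low" risk), < 10 wpm is "high" risk, otherwise "some" risk."""
    if wpm >= low_risk_cutoff:
        return "low"
    if wpm < high_risk_cutoff:
        return "high"
    return "some"

final = orf_final_score([38, 45, 41])        # median of the three passages -> 41
category = orf_end_grade1_category(final)    # -> "low"
```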

ORF results. Aggregate results from ORF assessment for first through third grade

are presented in the following figures.

Figure 23

ORF Grade 1: Percentage of Students in Each Risk Category

             Low    Some   High
Middle       54.9   29.7    8.6
End          51.3   28.8   14.5


Figure 24

ORF Grade 2: Percentage of Students in Each Risk Category

             Low    Some   High
Beginning    42.4   29.4   23.8
Middle       55.1   13.0   27.2
End          46.7   22.0   25.7

Figure 25

ORF Grade 3: Percentage of Students in Each Risk Category

             Low    Some   High
Beginning    38.7   23.2   32.3
Middle       34.5   24.6   33.1
End          35.6   30.7   27.1

The results from ORF assessment indicate mixed levels of progress in year one: first and third grade results indicated little progress, while second grade results indicated slight progress.


The percentage of students scoring in the “some” and “high” risk categories combined increased

by 5% in first grade and by 2.3% in third grade, while the percentage of students scoring at this

level decreased by 5.5% in second grade. The percentages of students scoring in the “some” and

“high” risk levels on ORF by the end of the year were: 43.3%, 47.7%, and 57.8% (grades 1-3

respectively). Overall, the results are mixed for student progress in oral reading fluency skills,

with the least progress in third grade.

Student progress in comprehension. The Retell Fluency (RTF) and the TerraNova

(comprehension or mastery reading score) measure a student’s reading comprehension.

Comprehension is the reason people learn to read—to “read for meaning”—and this is the

ultimate goal for reading instruction.

RTF measures a student’s reading comprehension by requiring a student to read a text

passage orally (i.e., the ORF assessment), and then tell the examiner as much as he/she can about what he/she

read within one minute. In order to administer this assessment, the student should be able to read

at least ten words or more correctly from a leveled passage. The student’s score is the number of

words the child uses to orally retell the story within one minute. Only words that illustrate the

child’s understanding of the passage are scored, not irrelevant remarks or exclamations (Good &

Kaminski, 2002).

The developers of DIBELS have not yet set benchmarks for RTF. In reporting RTF

results, the student’s ORF score (number of words read per minute) is compared to the student’s

RTF score (number of words used to retell per minute). Again, the student has the opportunity to

read and retell three passages, with the final ORF and RTF scores being the median of the three

individual scores. According to the DIBELS developers, if a student retells at least 50% of the

words read orally when reading 40 words per minute or less, this indicates adequate


reading comprehension. If a student retells 25% or less when reading more than 40 words per

minute, then this may indicate a deficiency in comprehension skills (Good & Kaminski, 2002).
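The two rules of thumb above compare the RTF score to the ORF score as a ratio. A sketch, assuming a simple ratio check; the label names (including "inconclusive", which covers the cases the published guidance does not address) are this sketch's own, not official DIBELS terminology.

```python
def rtf_check(orf_wpm, rtf_words):
    """Apply the Good & Kaminski (2002) rules of thumb described above.
    Returns "adequate", "possible concern", or "inconclusive"."""
    ratio = rtf_words / orf_wpm   # share of words read that were retold
    if orf_wpm <= 40 and ratio >= 0.50:
        return "adequate"        # retold at least half of a slow reading
    if orf_wpm > 40 and ratio <= 0.25:
        return "possible concern"  # fluent reading but very sparse retell
    return "inconclusive"
```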

The TerraNova CTBS reading assessment is administered in two parts. The student’s

performance on both parts is combined to produce one “mastery reading” score for reading

comprehension. The teacher’s manual produced by the publisher states that “…reading

comprehension items focus on the central meaning of a passage rather than surface details” (CTB

McGraw-Hill, 1996, p. 31). On this assessment, the teacher orally reads or the student

independently reads a story or text passage, and then students select a response to each question

or item. In kindergarten and grade 1, the teacher orally reads the text and directions to students.

In grades 2-3, the teacher orally reads only the directions to students, and students independently

read the text and select a response choice.

RTF results. Aggregate results from RTF assessment for first through third grade

are presented in the following figures.

Figure 26

RTF Grade 1: Percentage of Students by Retell Score as a Percentage of Words Read
(Low risk = 75% or more retold; High risk = less than 25%)

             75+    50<75  25<50   <25
Middle       19.9   27.0   25.5   19.0
End          23.4   29.7   33.2    6.8


Figure 27

RTF Grade 2: Percentage of Students by Retell Score as a Percentage of Words Read
(Low risk = 75% or more retold; High risk = less than 25%)

             75+    50<75  25<50   <25
Beginning    19.5   27.2   31.6   16.1
Middle        9.6   26.9   48.6   10.2
End          13.9   29.4   47.1    4.0

Figure 28

RTF Grade 3: Percentage of Students by Retell Score as a Percentage of Words Read
(Low risk = 75% or more retold; High risk = less than 25%)

             75+    50<75  25<50   <25
Beginning    20.4   36.5   30.1    5.5
Middle       13.5   36.7   37.3    4.4
End           4.4   23.2   55.8    9.1


The results from RTF assessment indicate mixed levels of progress: first grade results

indicate some progress, while second and third grade results indicate less progress across year

one. In first grade, the percentage of students retelling less than 50% of the words they had read

orally decreased by 4.5% by the end of the year. In second and third grades, the percentages of

students retelling less than 50% of the words read increased by 3.4% and by 29.3% respectively.

The percentages of students retelling less than 50% of the words read by the end of the year

were: 40%, 51.1%, and 64.9% (grades 1-3 respectively). Statistically significant (p < .05)

differences were found between beginning-of-year and end-of-year performance on this measure

in grade 3. Overall, RTF results indicated mixed levels of progress for comprehension skills across year one, with some progress in first grade, little progress in second grade, and the least progress in third grade.

TerraNova--Comprehension results. Aggregate results from the TerraNova

(comprehension/reading mastery score) for first grade through third grade are presented in the

following figure.

Figure 29

TerraNova Reading Comprehension: Percentage of Students in Each National Percentile Band
(Low risk = 80th percentile or above; High risk = below the 20th percentile)

                80+    60<80  40<60  20<40   <20      N
Kindergarten    15.2   17.7   24.0   25.4   17.3    282
Grade 1         23.1   26.3   18.8   13.4   16.9    315
Grade 2         23.7   26.6   21.2   15.4   10.9    305
Grade 3         24.5   22.7   22.4   16.9   12.5    340


The above figure represents test score data from five of the seven cohort 1 participating

schools. In year one, two of the schools (schools B and C) did not administer the TerraNova in

kindergarten, but did administer the assessment in grades 1-3. In year two, the program required

all participating schools to administer the TerraNova in kindergarten. For the TerraNova reading

comprehension measure, 31.9% of all students tested (grades K-3) scored below the 40th

percentile when compared with a national norm group. The percentage of students scoring below

the 40th percentile was largest in kindergarten (42.7%), and was roughly equivalent for grades 1-

3 (30.3%, 26.3% and 29.4% respectively).

In comparison to a national norm group, the percentage of students in the Maine Reading

First schools who scored below the 40th percentile on the TerraNova reading comprehension

measure was relatively small (30%, 26%, and 29% in grades 1-3 respectively). By contrast, the

percentage of students retelling less than 50% of the words read on the DIBELS RTF measure

was relatively large (40%, 51%, and 65% for grades 1-3 respectively). Again, because DIBELS

and TerraNova use different reporting methods and test administration conditions, the two

assessments can produce somewhat different results. As benchmarks have not yet been set for

the RTF, the TerraNova assessment will be used as the primary measure for reading

comprehension for the purpose of this evaluation.

Summary of aggregate assessment results. This report presents reading assessment

results for year one of the program, or the baseline year. During this year, educators in

participating schools were attending professional development sessions in literacy and a year-

long course, learning how to use a new reading program, and learning how to administer and

interpret results for the many DIBELS measures and other reading assessments.


When interpreting the results, it is important to keep in mind that the DIBELS benchmark

goals increase at each of the three points during the year when the assessments are administered.

Thus, students may have made progress on these measures from one point to the next, but

because the “bar” was raised each time, it became more difficult to make the cut-off for the “low

risk” category. Some of the DIBELS measures lack benchmark goals. In these cases, the

primary measures are the DIBELS measures with benchmark goals and the TerraNova

assessment.

One might regard these baseline assessment data as a “needs assessment” for the MRF

initiative—the data merely provide ample evidence that the participating schools are in need of

the intervention, because of the high percentages of students falling below grade level on the

many different measures for reading skills. These data cannot be used to determine the impact or

effectiveness of the MRF interventions. Once data from year two are available, it will be possible

to compare year-one and year-two data to see what progress students have made in each element

of reading.

A review of aggregate assessment data over the course of year one generally showed

mixed levels of progress for phonics and fluency, strong progress in phonemic awareness (grades

K-1), and inconclusive evidence of progress across the year for vocabulary and comprehension

(due to the lack of benchmarking for some DIBELS measures and the fact that there was only

one data point for the TerraNova). The baseline assessment results are summarized below by the

elements of reading:

Phonemic Awareness: The results indicate progress across year one for grades K-1. PSF is the primary measure for this reading element.

• ISF results indicate little progress for kindergarten students. • PSF results indicate strong progress for both kindergarten and grade 1.


Phonics: The results were mixed. NWF and TerraNova are the primary measures for this reading element; however, the TerraNova was only administered at one point in the year.

• LNF results indicate slight progress for kindergarten students. (LNF was administered

only once in grade 1, so it was not possible to track improvement for that grade level.) • NWF results indicate mixed levels of progress: kindergarten students showed less

progress while grade 1 students showed some progress. (NWF was only administered once in grade 2.)

• The TerraNova Word Analysis assessment was only administered once at the end of year

one, so it was not possible to track students’ progress on this measure. The percentage of students who performed below the 40th percentile on this assessment ranged from 22% (grade 1) up to 37% (grade 3).

Vocabulary: The results are inconclusive with respect to progress over year one. Results from WUF do not indicate progress over the year; however, WUF lacks benchmarks. TerraNova is the primary measure for this element of reading; however, this assessment was only administered at one point in the year.

• WUF results indicate little progress for grades K-3, particularly for K and 3.

Approximately one third of the students performed at or below the 40th percentile by the end of the year in each grade tested.

• The TerraNova Vocabulary assessment was only administered once at the end of year

one, so it was not possible to track students’ progress on this measure. The TerraNova results were comparable to WUF results for grades 1-3. Approximately one quarter to over one third of the students performed below the 40th percentile on the TerraNova Vocabulary.

Fluency: The results were mixed.

• ORF results indicate mixed levels of progress. Students showed little progress in grades 1 and 3, but slight progress in grade 2.

Comprehension: The results were inconclusive with respect to progress over the course of year one. RTF results indicated mixed levels of progress; however, RTF lacks benchmarks. TerraNova is the primary measure for this element of reading; however, this assessment was only administered at one point in the year.

• RTF results indicate mixed levels of progress. Results showed slight progress in grade 1, but little progress in grades 2-3.

• The TerraNova assessment was only administered once at the end of year one, so it was

not possible to track students’ progress on this measure. The TerraNova reading


comprehension results indicated smaller percentages of students performing below benchmark for reading comprehension in grades 1-3 than did RTF results. Approximately one third of the students performed below the 40th percentile on the TerraNova assessment, while 40%-60% of the students performed in the some or high risk categories (retelling less than 50% of the words read orally) on the RTF.

Comparison of Assessment Results Across Cohort 1 Schools

This section of the report presents

results across the seven Cohort 1 schools (representing seven LEAs which receive the MRF

subgrants). For this comparison, the evaluator focused on the primary measures for each reading

element: phonemic awareness (PSF); phonics (NWF & TerraNova); vocabulary (TerraNova);

fluency (ORF); and comprehension (TerraNova). The data tables indicate the number of

students tested in each school, the percentage of students not tested, and the percentages of

students performing at each of the risk levels on the last administration of the assessment within

the school year. Within the tables, “begin” denotes a beginning of the year administration, “mid”

denotes a mid-year administration, and “end” denotes an end-of-year administration of the

assessment. If a school did not administer an assessment, no data are shown.

Paired tests were used to identify statistically significant differences in assessment

results. The school with the highest percentage of students performing below benchmark or

below the 40th percentile on a particular measure within a particular grade was compared to each

of the other six schools. We note where the differences in assessment performance were

significant at the 95% confidence level (p < .05).
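The report does not specify the exact form of the paired tests. One common choice for comparing below-benchmark rates between two schools is a two-proportion z-test; the sketch below is a minimal illustration of that approach, not the evaluators' actual procedure. The counts (75 of 131, 2 of 20) are hypothetical values chosen only to mirror the approximate School E versus School C kindergarten PSF rates discussed below.

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two proportions.

    x1, x2: counts of students below benchmark; n1, n2: students tested.
    Returns the z statistic and the two-sided p-value.
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)              # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail area
    return z, p_value

# Hypothetical counts mirroring a School E vs. School C comparison:
z, p = two_proportion_z_test(75, 131, 2, 20)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < .05 would be statistically significant
```

With samples this small (e.g., 20 kindergartners in one school), an exact test such as Fisher's would be a reasonable alternative to the normal approximation shown here.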

Phonemic awareness. A comparison of PSF results is shown in Table 34 below. Schools E and D had the largest percentages of kindergarten students performing below benchmark on this measure (57% and 55% respectively, with “high risk” and “some risk” categories combined), while schools C, G, B, and F all had the smallest percentages of kindergarten students performing below benchmark on this measure (ranging from 10-12%). Statistically significant (p < .05) differences were found between School E’s and four other schools’ end-of-year performance on PSF in kindergarten.

In first grade, School D had the largest percentage of students performing below grade level (59%), while School B had the smallest percentage of students below benchmark (2%). Statistically significant (p < .05) differences were found between School D’s and all other schools’ end-of-year performance on this measure in grade 1.

Table 34. Subgrant School Comparison for PSF Results

PSF                    All      A      B      C      D      E      F      G
Kindergarten (End)
  N                    377     20     60     20     54    131     64     28
  % High Risk          7.2    5.0    5.0    0.0    3.7   13.0    4.7    3.6
  % Some Risk         27.9   30.0    6.7   10.0   51.9   44.3    7.8    7.1
  % Low Risk          58.4   55.0   80.0   85.0   42.6   35.1   79.7   85.7
  % Untested           6.6   10.0    8.3    5.0    1.9    7.6    7.8    3.6
First Grade (End)
  N                    337     18     54     18     56     66     81     44
  % High Risk          0.6    0.0    1.9    0.0    0.0    0.0    0.0    2.3
  % Some Risk         17.5   27.8    0.0   22.2   58.9   15.2    4.9    6.8
  % Low Risk          76.6   66.7   94.4   77.8   35.7   77.3   86.4   90.9
  % Untested           5.3    5.6    3.7    0.0    5.4    7.6    8.6    0.0

Phonics. A comparison of NWF results is shown in Table 35 below. School E had the

largest percentage of kindergarten students performing below benchmark on this measure (76% with “high risk” and “some risk” categories combined), while School C had the smallest percentage of students performing below benchmark (10%). Statistically significant (p < .05) differences were found between School E’s and all other schools’ end-of-year performance on NWF in kindergarten.

In first grade, School D had the largest percentage of students performing below benchmark, while School A had the smallest percentage of students below benchmark (28%). In second grade, Schools C and E had the largest percentages of students performing below benchmark (79% and 72% respectively), while School F had the smallest percentage of students performing below benchmark (41%).

Table 35. Subgrant School Comparison of NWF Results

NWF                    All      A      B      C      D      E      F      G
Kindergarten (End)
  N                    377     20     60     20     54    131     64     28
  % High Risk         22.3   20.0   13.3    0.0    7.4   42.0   17.2    7.1
  % Some Risk         23.1   20.0   26.7   10.0   25.9   33.6    6.3   10.7
  % Low Risk          48.0   50.0   51.7   85.0   64.8   16.8   68.8   78.6
  % Untested           6.6   10.0    8.3    5.0    1.9    7.6    7.8    3.6
First Grade (End)
  N                    337     18     54     18     56     66     81     44
  % High Risk         12.5    5.6    9.3   11.1   14.3   16.7    6.2   22.7
  % Some Risk         38.0   22.2   35.2   44.4   51.8   31.8   38.3   36.4
  % Low Risk          44.2   66.7   51.9   44.4   28.6   43.9   46.9   40.9
  % Untested           5.3    5.6    3.7    0.0    5.4    7.6    8.6    0.0
Second Grade (Begin)
  N                    323     22     58     19     42     62     77     43
  % High Risk         28.5   27.3   29.3   52.6   31.0   43.5   11.7   23.3
  % Some Risk         32.2   36.4   39.7   26.3   33.3   29.0   29.9   30.2
  % Low Risk          34.4   31.8   25.9   21.1   33.3   22.6   50.6   41.9
  % Untested           5.0    4.5    5.2    0.0    2.4    4.8    7.8    4.7

A comparison of the TerraNova Word Analysis Assessment results is shown in Table 36

below. School F had the largest percentage (27%) of first-grade students performing below the 40th percentile on this measure when compared to a national norm group, while School A had the smallest percentage (12%) of first-grade students performing at this level.

In second grade, Schools C and E had the largest percentages of students performing below the 40th percentile (56% and 41% respectively), while School F had the smallest percentage (18%) of students performing at the same level. In third grade, Schools F and G had the largest percentages of students performing below the 40th percentile (49% and 46% respectively), while School C had the smallest percentage (18%) of students performing at this level.

Table 36. Subgrant School Comparison of TerraNova Word Analysis Results

TerraNova Word Analysis   All      A      B      C      D      E      F      G
First Grade
  N                       320     17     51     18     53     64     73     44
  % < 20                  9.1   11.8    3.9    5.6    9.4    9.4    9.6   13.6
  % 20 -< 40             13.1    0.0   11.8   11.1   15.1   14.1   17.8    9.1
  % 40 -< 60             24.4   11.8   27.5   38.9   30.2   29.7   19.2   13.6
  % 60 -< 80             25.6   29.4   27.5   11.1   24.5   23.4   26.0   31.8
  % 80+                  25.9   41.2   29.4   33.3   20.8   21.9   21.9   31.8
  % Untested              1.9    5.9    0.0    0.0    0.0    1.6    5.5    0.0
Second Grade
  N                       312     20     56     16     42     61     74     43
  % < 20                 13.8   25.0    3.6   18.8   11.9   24.6    6.8   18.6
  % 20 -< 40             15.1    5.0   17.9   37.5   14.3   16.4   10.8   14.0
  % 40 -< 60             24.7   10.0   19.6   12.5   21.4   23.0   33.8   32.6
  % 60 -< 80             19.9   25.0   26.8   18.8   14.3   18.0   23.0   11.6
  % 80+                  24.7   35.0   32.1    6.3   38.1   13.1   23.0   23.3
  % Untested              1.9    0.0    0.0    6.3    0.0    4.9    2.7    0.0
Third Grade
  N                       343     22     62     17     52     94     59     37
  % < 20                 16.9    9.1   21.0    5.9    9.6   18.1   18.6   24.3
  % 20 -< 40             20.4   22.7   19.4   11.8   23.1   13.8   30.5   21.6
  % 40 -< 60             22.2   18.2   25.8   23.5   26.9   20.2   16.9   24.3
  % 60 -< 80             21.0   40.9   24.2   23.5   21.2   20.2   13.6   16.2
  % 80+                  18.1    9.1    8.1   29.4   19.2   25.5   18.6   13.5
  % Untested              1.5    0.0    1.6    5.9    0.0    2.1    1.7    0.0

Vocabulary. A comparison of the TerraNova Vocabulary Assessment is shown in Table

37 below. Schools E and F had the largest percentages of first-grade students performing below the 40th percentile on this measure (32% and 31% respectively), while School C had the smallest percentage of students performing at the same level (0%).

In second grade, Schools C and G had the largest percentages of students performing below the 40th percentile (37%), while School B had the smallest percentage of students performing at the same level (14%). In third grade, School F had the largest percentage of students performing below the 40th percentile (49%), while School D had the smallest percentage of students performing at the same level (29%).

Table 37. Subgrant School Comparison of TerraNova Vocabulary Results

TerraNova Vocabulary      All      A      B      C      D      E      F      G
First Grade
  N                       320     17     51     18     53     64     73     44
  % < 20                  9.4    0.0    7.8    0.0    3.8   10.9   13.7   15.9
  % 20 -< 40             14.1   11.8   19.6    0.0    1.9   21.9   17.8   11.4
  % 40 -< 60             23.4   23.5   15.7   38.9   34.0   18.8   19.2   27.3
  % 60 -< 80             32.8   23.5   37.3   50.0   43.4   35.9   26.0   18.2
  % 80+                  18.4   41.2   19.6   11.1   17.0   10.9   16.4   27.3
  % Untested              1.9    0.0    0.0    0.0    0.0    1.6    6.8    0.0
Second Grade
  N                       312     20     56     16     42     61     74     43
  % < 20                  9.9   10.0    5.4   12.5    4.8   14.8    5.4   20.9
  % 20 -< 40             14.1   15.0    8.9   25.0   14.3   14.8   13.5   16.3
  % 40 -< 60             15.7   15.0   17.9   18.8   14.3   18.0   16.2    9.3
  % 60 -< 80             23.1   25.0   10.7   37.5   35.7   18.0   24.3   25.6
  % 80+                  34.6   35.0   57.1    0.0   31.0   29.5   35.1   27.9
  % Untested              2.6    0.0    0.0    6.3    0.0    4.9    5.4    0.0
Third Grade
  N                       343     22     62     17     52     94     59     37
  % < 20                 19.0    9.1   17.7   11.8   13.5   20.2   27.1   21.6
  % 20 -< 40             18.4   22.7   21.0   29.4   15.4   16.0   22.0   10.8
  % 40 -< 60             21.9   27.3   22.6   17.6   19.2   21.3   20.3   27.0
  % 60 -< 80             19.2   13.6   21.0   17.6   21.2   24.5   11.9   16.2
  % 80+                  20.7   27.3   17.7   23.5   30.8   16.0   16.9   24.3
  % Untested              0.9    0.0    0.0    0.0    0.0    2.1    1.7    0.0

Fluency. A comparison of ORF results is shown in Table 38 below. School C had the

largest percentage of first-grade students performing in the “some” and “high” risk categories combined (61%), while School A had the smallest percentage of first graders performing at the same level (17%).

In second grade, Schools D and G had the largest percentages of students performing in the “some” and “high” risk categories combined (57% and 56% respectively), while School C had the smallest percentage of students performing at the same level (37%). In third grade, School C had the largest percentage of students performing in the “some” and “high” risk categories combined (71%), while Schools B, D, and F had the smallest percentages of students performing at the same level (52%).

Table 38. Subgrant School Comparison of ORF Results

ORF                    All      A      B      C      D      E      F      G
First Grade (End)
  N                    337     18     54     18     56     66     81     44
  % High Risk         14.5    0.0    9.3   16.7    8.9   13.6   18.5   27.3
  % Some Risk         28.8   16.7   29.6   44.4   25.0   30.3   28.4   29.5
  % Low Risk          51.3   77.8   57.4   38.9   60.7   48.5   44.4   43.2
  % Untested           5.3    5.6    3.7    0.0    5.4    7.6    8.6    0.0
Second Grade (End)
  N                    323     22     58     19     42     62     77     43
  % High Risk         25.7   31.8   27.6   15.8   23.8   32.3   18.2   30.2
  % Some Risk         22.0   22.7   17.2   21.1   33.3   16.1   22.1   25.6
  % Low Risk          46.7   36.4   51.7   42.1   42.9   45.2   51.9   44.2
  % Untested           5.6    9.1    3.4   21.1    0.0    6.5    7.8    0.0
Third Grade (End)
  N                    362     23     67     17     54     98     62     41
  % High Risk         27.1   21.7   29.9   47.1   11.1   32.7   29.0   22.0
  % Some Risk         30.7   43.5   22.4   23.5   40.7   28.6   22.6   43.9
  % Low Risk          35.6   26.1   40.3   29.4   44.4   31.6   40.3   26.8
  % Untested           6.6    8.7    7.5    0.0    3.7    7.1    8.1    7.3

Comprehension. A comparison of the TerraNova (comprehension/mastery reading score)

results is shown in Table 39 below. Two of the seven schools (Schools B and C) did not administer the TerraNova assessment to kindergarten students, so no data were available for that grade for these two schools. School F had the largest percentage of kindergarten students who performed below the 40th percentile on this measure, while School G had the smallest percentage of students performing at the same level (7%). Statistically significant (p < .05) differences were found between School F’s and other schools’ end-of-year performance on this measure in kindergarten.

In first grade, School D had the largest percentage of students performing below the 40th percentile (47%), while School A had the smallest percentage of students performing at the same level (6%). Statistically significant (p < .05) differences were found between School D’s and four other schools’ end-of-year performance on this measure in grade 1.

In second grade, School E had the largest percentage of students performing below the 40th percentile, while School B had the smallest percentage of students performing at this level. In third grade, School G had the largest percentage of students performing below the 40th percentile, while School A had the smallest percentage of students performing at this level (14%).

Table 39. Subgrant School Comparison of TerraNova Reading Comprehension Results

TerraNova Mastery
Reading (Comprehension)   All      A      B      C      D      E      F      G
Kindergarten
  N                       283     18      .      .     53    125     60     27
  % < 20                 17.3   16.7      .      .    7.5   22.4   21.7    3.7
  % 20 -< 40             25.4   16.7      .      .   13.2   19.2   61.7    3.7
  % 40 -< 60             24.0   16.7      .      .   24.5   37.6    3.3   11.1
  % 60 -< 80             17.7   33.3      .      .   30.2   13.6    3.3   33.3
  % 80+                  15.2   16.7      .      .   24.5    7.2    8.3   48.1
  % Untested              0.4    0.0      .      .    0.0    0.0    1.7    0.0
First Grade
  N                       320     17     51     18     53     64     73     44
  % < 20                 16.9    5.9    5.9    0.0   28.3   18.8   21.9   15.9
  % 20 -< 40             13.4    0.0   13.7   16.7   18.9   15.6   16.4    2.3
  % 40 -< 60             18.8   17.6   27.5   16.7   20.8   12.5   15.1   22.7
  % 60 -< 80             26.3   17.6   27.5   22.2   17.0   31.3   28.8   29.5
  % 80+                  23.1   52.9   25.5   44.4   15.1   21.9   12.3   29.5
  % Untested              1.6    5.9    0.0    0.0    0.0    0.0    5.5    0.0
Second Grade
  N                       312     20     56     16     42     61     74     43
  % < 20                 10.9   15.0    5.4   12.5   11.9    8.2    9.5   20.9
  % 20 -< 40             15.4    5.0   14.3   12.5   19.0   26.2   10.8   11.6
  % 40 -< 60             21.2    5.0   25.0   31.3   14.3   21.3   27.0   16.3
  % 60 -< 80             26.6   45.0   19.6   31.3   40.5   21.3   25.7   20.9
  % 80+                  23.7   30.0   35.7    6.3   14.3   18.0   23.0   30.2
  % Untested              2.2    0.0    0.0    6.3    0.0    4.9    4.1    0.0
Third Grade
  N                       343     22     62     17     52     94     59     37
  % < 20                 12.5    4.5   14.5   11.8    3.8   13.8   13.6   21.6
  % 20 -< 40             16.9    9.1   16.1   29.4   13.5   16.0   18.6   21.6
  % 40 -< 60             22.4   31.8   29.0   29.4   15.4   20.2   20.3   21.6
  % 60 -< 80             22.7   40.9   16.1   11.8   26.9   24.5   22.0   18.9
  % 80+                  24.5   13.6   24.2   17.6   40.4   23.4   23.7   16.2
  % Untested              0.9    0.0    0.0    0.0    0.0    2.1    1.7    0.0

Summary of cross-sample comparison. The baseline assessment results for year one

indicate that participating School E, followed by Schools C and F, consistently had the largest percentages of students performing below benchmark on the DIBELS measures or below the 40th percentile on the TerraNova. School A consistently had the smallest percentages of students performing below benchmark or below the 40th percentile (and thus had the largest percentages of students performing at benchmark (“low risk” category) or above the 40th percentile).

Schools that had the largest percentages of students performing below benchmark or below the 40th percentile in more than one grade level on the primary measures presented above included the following: School E (NWF measure for phonics); School F (TerraNova Word Analysis measure for phonics); School F (TerraNova Vocabulary measure); and School C (ORF measure for fluency). These results may indicate areas of weakness in certain reading elements for these schools. Once the assessment data for year two are available, it will be possible to see if these trends continue.

School A had the largest percentage of students in the “low risk” category or above the 40th percentile in at least one grade level on all but two of the reading assessments (the DIBELS PSF and TerraNova Vocabulary measures).

Given the small size of the school sample, it is unclear whether instructional time was related to performance for kindergarten students. Two of the low performing schools (Schools B and E) have half-day kindergarten programs, while the highest performing school (School A) has a full-day kindergarten program.

School size may be related to assessment results. Two of the three schools with the lowest performing students (Schools E and F) have some of the highest enrollments among the Cohort 1 schools (see Part II of this report), while the school with the highest performing students has one of the smallest enrollments. Given the small size of the sample, it is difficult to determine the strength of the relationship between enrollment size and student performance. When Cohort 2 schools are added to the sample, the relationship between school size and performance will be examined further.

Student demographic variables, such as the percentage of K-3 students who receive special education services or who are eligible for free/reduced school lunch, did not determine which schools had the highest or lowest performing students on the DIBELS and TerraNova measures during the baseline year. Similarly, teacher demographic variables, such as the number of years of teaching experience and teachers’ educational attainment, did not determine which schools had the highest or lowest performing students (see Part II of this report for demographic data on schools).

Subsequent evaluation reports will compare the degree of progress made on student assessment results across the participating schools, and will continue to examine the relationship of school size, student and teacher demographic variables, and program implementation with student performance on reading assessments.

Comparison of Assessment Results for Student Subgroups

This section compares baseline reading assessment data for students in certain subgroups (special education, economically disadvantaged, and gender) with all other students. Data are presented at the aggregate level (results for all seven Cohort 1 participating schools) for each grade level tested, and are organized by the five reading elements. For this comparison, the evaluator focused on the primary measures for each reading element: phonemic awareness (PSF); phonics (NWF & TerraNova); vocabulary (TerraNova); fluency (ORF); and comprehension (TerraNova). Further, only the end-of-year results for each tested grade (or the last administration during the school year) are presented here for ease of comparison.

Because Maine schools tend to have few ethnic minorities, and the sample size is very small (seven schools), there are too few students in the Limited English Proficiency (LEP) or ethnic minority groups to allow for comparison with other students. Therefore, data are not presented for these two groups in the following tables.

Paired tests were used to identify statistically significant differences in assessment performance between students in a subgroup and other students within each grade level. We note where the differences in assessment performance were significant at the 95% confidence level (p < .05).
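Throughout the subgroup comparisons that follow, the percentage of students below benchmark is the sum of the “high risk” and “some risk” rows, and the performance gap is the difference between the subgroup’s combined rate and that of other students. A minimal sketch of that arithmetic, using the kindergarten PSF percentages reported in Table 40:

```python
def below_benchmark_pct(high_risk_pct, some_risk_pct):
    """Combine the two below-benchmark risk categories into one rate."""
    return high_risk_pct + some_risk_pct

# Kindergarten PSF rates from Table 40 (special education vs. other students).
sped = below_benchmark_pct(19.1, 29.8)   # students receiving special education
other = below_benchmark_pct(5.5, 27.4)   # all other students
gap = sped - other                       # performance gap in percentage points

print(f"special ed: {sped:.0f}%, other: {other:.0f}%, gap: {gap:.0f} points")
```

This reproduces the 49% versus 33% comparison (a gap of about 16 percentage points) cited in the phonemic awareness discussion below.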

Phonemic awareness. Table 40 below presents a comparison of results for PSF for each tested grade. On this measure, a significantly larger percentage of students eligible for special education services performed below benchmark (“some risk” and “high risk” categories combined) than did other students (e.g., 49% compared with 33% for kindergarten). The performance gap was larger by two percentage points in grade one. Statistically significant (p < .05) differences were found between special education and non-special education students’ end-of-year performance on PSF in kindergarten and grade 1.

Similar percentages of economically disadvantaged students (students eligible for free/reduced lunch) and other students performed below benchmark on PSF (a difference of only 4-5 percentage points).

A slightly larger percentage of male students performed below benchmark in all tested grades on PSF than did female students. There was a larger performance gap in kindergarten (about 11-12 percentage points), while this gap was smaller in first grade (3 percentage points). Statistically significant (p < .05) differences were found between male and female students’ end-of-year performance on this measure in kindergarten.

Table 40. Comparison of PSF Results for Different Groups of Students

                           Special Ed.     Disadvantaged       Gender
DIBELS PSF           All    Yes     No      Yes     No      Male  Female
Kindergarten (End)
  N                  377     47    328      131    244       209    166
  % High Risk        7.2   19.1    5.5      8.4    6.6       7.7    6.6
  % Some Risk       27.9   29.8   27.4     29.8   26.6      32.5   21.7
  % Low Risk        58.4   46.8   60.1     56.5   59.4      51.7   66.9
  % Untested         6.6    4.3    7.0      5.3    7.4       8.1    4.8
First Grade (Mid)
  N                  337     41    296      152    185       175    162
  % High Risk        1.8    9.8    0.7      2.0    1.6       0.6    3.1
  % Some Risk       14.8   19.5   14.2     17.1   13.0      16.0   13.6
  % Low Risk        76.9   58.5   79.4     75.0   78.4      77.7   75.9
  % Untested         6.5   12.2    5.7      5.9    7.0       5.7    7.4

Phonics. Tables 41 and 42 below present a comparison of results for NWF and the

TerraNova Word Analysis for each tested grade level. A significantly larger percentage of special education students performed below benchmark (“some risk” and “high risk” categories combined) on NWF than did other students, with a difference of 20-27 percentage points across all tested grades (K-2) and the largest difference in second grade. On the TerraNova Word Analysis measure, the difference for students scoring below the 40th percentile was 37-47 percentage points across all tested grades (1-3), with the largest difference in third grade. Statistically significant (p < .05) differences were found between special education and non-special education students’ end-of-year performance on NWF in kindergarten and grades 1-2, and on TerraNova Word Analysis in grades 1-3.

The performance gap was smaller when results for economically disadvantaged students were compared with results for other students. For students performing below benchmark on NWF, the difference was 8-12 percentage points across all tested grades, with the largest difference in second grade. For students performing below the 40th percentile on the TerraNova Word Analysis, the difference in results for disadvantaged students versus other students was 1-15 percentage points, with the largest difference in grade 1. Statistically significant (p < .05) differences were found between disadvantaged and non-disadvantaged students’ end-of-year performance on NWF in grade 1 and on the TerraNova Word Analysis in grades 1 and 3.

A slightly larger percentage of male students performed below the benchmark on NWF than did female students (a difference of 0-18 percentage points), with the largest difference in kindergarten. A similar pattern was found for the TerraNova Word Analysis results, with a smaller difference of 1-8 percentage points and the largest difference in grade 1. Statistically significant (p < .05) differences were found between male and female students’ end-of-year performance on NWF in kindergarten and on TerraNova Word Analysis in grade 1.

Table 41. Comparison of NWF Results for Different Groups of Students

                           Special Ed.     Disadvantaged       Gender
DIBELS NWF           All    Yes     No      Yes     No      Male  Female
Kindergarten (End)
  N                  377     47    328      131    244       209    166
  % High Risk       22.3   36.2   20.4     26.0   20.5      25.8   18.1
  % Some Risk       23.1   25.5   22.6     23.7   22.5      27.8   16.9
  % Low Risk        48.0   34.0   50.0     45.0   49.6      38.3   60.2
  % Untested         6.6    4.3    7.0      5.3    7.4       8.1    4.8
First Grade (End)
  N                  337     41    296      152    185       175    162
  % High Risk       12.5   31.7    9.8     15.1   10.3      11.4   13.6
  % Some Risk       38.0   36.6   38.2     42.1   34.6      40.6   35.2
  % Low Risk        44.2   22.0   47.3     40.8   47.0      42.3   46.3
  % Untested         5.3    9.8    4.7      2.0    8.1       5.7    4.9
Second Grade (Begin)
  N                  323     44    279      148    175       164    157
  % High Risk       28.5   63.6   22.9     33.1   24.6      28.7   28.7
  % Some Risk       32.2   20.5   34.1     32.4   32.0      32.3   32.5
  % Low Risk        34.4    9.1   38.4     31.1   37.1      34.1   34.4
  % Untested         5.0    6.8    4.7      3.4    6.3       4.9    4.5


Table 42. Comparison of TerraNova Word Analysis Results for Different Groups of Students

                           Special Ed.     Disadvantaged       Gender
TerraNova
Word Analysis        All    Yes     No      Yes     No      Male  Female
First Grade (End)
  N                  320     38    281      148    171       167    153
  % < 20             9.1   31.6    6.0     13.5    5.3      13.2    4.6
  % 20 -< 40        13.1   26.3   11.0     16.9    9.4      13.2   13.1
  % 40 -< 60        24.4   23.7   24.6     23.0   25.7      26.3   22.2
  % 60 -< 80        25.6    5.3   28.5     21.6   29.2      22.8   28.8
  % 80+             25.9    7.9   28.5     23.0   28.7      22.2   30.1
  % Untested         1.9    5.3    1.4      2.0    1.8       2.4    1.3
Second Grade (End)
  N                  312     43    265      143    165       158    153
  % < 20            13.8   37.2    9.8     14.0   13.3      14.6   13.1
  % 20 -< 40        15.1   23.3   13.6     14.0   15.8      17.1   13.1
  % 40 -< 60        24.7   25.6   24.9     28.7   21.8      22.2   27.5
  % 60 -< 80        19.9   11.6   21.5     22.4   18.2      19.6   20.3
  % 80+             24.7    2.3   28.3     19.6   29.1      24.1   24.8
  % Untested         1.9    0.0    1.9      1.4    1.8       2.5    1.3
Third Grade (End)
  N                  343     41    296      162    174       196    146
  % < 20            16.9   61.0   10.8     25.3    9.2      16.3   17.1
  % 20 -< 40        20.4   17.1   20.9     19.1   21.3      20.4   20.5
  % 40 -< 60        22.2   14.6   23.3     21.6   23.0      24.5   19.2
  % 60 -< 80        21.0    2.4   23.3     17.3   24.1      19.9   22.6
  % 80+             18.1    2.4   20.3     15.4   20.7      17.3   19.2
  % Untested         1.5    2.4    1.4      1.2    1.7       1.5    1.4

Vocabulary. Table 43 below presents a comparison of results for the TerraNova

Vocabulary measure for each tested grade level. The percentage of special education students performing below the 40th percentile on this measure was significantly larger than for other students (a difference of 27-49 percentage points across grades 1-3), with the largest difference in performance in third grade. Statistically significant (p < .05) differences were found between special education and non-special education students’ end-of-year performance on TerraNova Vocabulary in grades 1-3.

A somewhat larger percentage of economically disadvantaged students performed below the 40th percentile than did other students (a difference of 8-16 percentage points across the tested grades), with the largest difference in third grade. Statistically significant (p < .05) differences were found between disadvantaged and non-disadvantaged students’ end-of-year performance on this measure in grades 1 and 3.

The performance gap for males was smaller (only 6-9 percentage points across the tested grades), with the largest difference in grade 1. Statistically significant (p < .05) differences were found between male and female students’ end-of-year performance on this measure in grade 1.

Table 43. Comparison of TerraNova Vocabulary Results for Different Groups of Students

                           Special Ed.     Disadvantaged       Gender
TerraNova
Vocabulary           All    Yes     No      Yes     No      Male  Female
First Grade (End)
  N                  320     38    281      148    171       167    153
  % < 20             9.4   31.6    6.4     14.2    5.3      14.4    3.9
  % 20 -< 40        14.1   15.8   13.9     16.9   11.7      13.2   15.0
  % 40 -< 60        23.4   28.9   22.4     19.6   26.3      24.6   22.2
  % 60 -< 80        32.8   15.8   35.2     33.1   32.7      29.9   35.9
  % 80+             18.4    2.6   20.6     14.2   22.2      16.2   20.9
  % Untested         1.9    5.3    1.4      2.0    1.8       1.8    2.0
Second Grade (End)
  N                  312     43    265      143    165       158    153
  % < 20             9.9   41.9    4.5     11.2    8.5      10.8    9.2
  % 20 -< 40        14.1   23.3   12.5     16.8   11.5      17.1   11.1
  % 40 -< 60        15.7   14.0   16.2     16.8   15.2      15.8   15.7
  % 60 -< 80        23.1    7.0   26.0     23.1   23.6      21.5   24.8
  % 80+             34.6   14.0   38.1     30.1   38.8      32.3   36.6
  % Untested         2.6    0.0    2.6      2.1    2.4       2.5    2.6
Third Grade (End)
  N                  343     41    296      162    174       196    146
  % < 20            19.0   68.3   12.2     23.5   14.4      22.4   14.4
  % 20 -< 40        18.4   12.2   18.6     21.0   14.9      17.3   19.9
  % 40 -< 60        21.9    4.9   24.7     25.9   19.0      20.4   23.3
  % 60 -< 80        19.2    9.8   20.6     19.1   19.5      18.9   19.9
  % 80+             20.7    2.4   23.3      9.9   31.0      19.9   21.9
  % Untested         0.9    2.4    0.7      0.6    1.1       1.0    0.7

Fluency. Table 44 below presents a comparison of results for ORF for each tested grade

level. A significantly larger percentage of special education students performed in the “some” and “high” risk categories on this measure than did other students (a difference of 28-38 percentage points across grades 1-3), with the largest difference in third grade. Statistically significant (p < .05) differences were found between special education and non-special education students’ end-of-year performance on ORF in grades 1-3.

A larger percentage of economically disadvantaged students performed in the “some” and “high” risk categories on this measure than did other students (a difference of 14-19 percentage points across the tested grades), with the largest difference in grade 1. Statistically significant (p < .05) differences were found between disadvantaged and non-disadvantaged students’ end-of-year performance on ORF in grades 1-3.

A slightly larger percentage of male students performed in the “some” and “high” risk categories on this measure than did female students (a difference of 9-14 percentage points), with the largest difference in third grade. Statistically significant (p < .05) differences were found between male and female students’ end-of-year performance on this measure in grades 1 and 3.

Table 44. Comparison of ORF Results for Different Groups of Students

                           Special Ed.     Disadvantaged       Gender
DIBELS ORF           All    Yes     No      Yes     No      Male  Female
First Grade (End)
  N                  337     41    296      152    185       175    162
  % High Risk       14.5   41.5   10.8     19.7   10.3      17.1   11.7
  % Some Risk       28.8   26.8   29.1     34.2   24.3      31.4   25.9
  % Low Risk        51.3   22.0   55.4     44.1   57.3      45.7   57.4
  % Untested         5.3    9.8    4.7      2.0    8.1       5.7    4.9
Second Grade (End)
  N                  323     44    279      148    175       164    157
  % High Risk       25.7   68.2   19.0     29.1   22.9      29.9   21.7
  % Some Risk       22.0   11.4   23.7     25.7   18.9      22.6   21.7
  % Low Risk        46.7   15.9   51.6     41.2   51.4      41.5   52.2
  % Untested         5.6    4.5    5.7      4.1    6.9       6.1    4.5
Third Grade (End)
  N                  362     44    318      173    188       206    155
  % High Risk       27.1   70.5   21.1     32.4   21.8      31.1   21.9
  % Some Risk       30.7   20.5   32.1     32.4   29.3      32.0   28.4
  % Low Risk        35.6    0.0   40.6     27.7   43.1      32.0   40.6
  % Untested         6.6    9.1    6.3      7.5    5.9       4.9    9.0


Comprehension. Table 45 below presents a comparison of results for the TerraNova reading comprehension measure for each tested grade level. A significantly larger percentage of special education students performed below the 40th percentile on this measure than did other students (a difference of 26-58 percentage points across grades K-3), with the largest difference in third grade. Statistically significant (p < .05) differences were found between special education and non-special education students’ end-of-year performance on this measure in kindergarten and grades 1-3.

A larger percentage of economically disadvantaged students performed below the 40th percentile than did other students (a difference of 6-17 percentage points), with the largest difference in third grade. Statistically significant (p < .05) differences were found between disadvantaged and non-disadvantaged students’ end-of-year performance on this measure in kindergarten, grade 1, and grade 3.

A slightly larger percentage of male students performed below the 40th percentile than did female students (a difference of 6-12 percentage points), with the largest difference in kindergarten. Statistically significant (p < .05) differences were found between male and female students’ end-of-year performance on the TerraNova comprehension measure in grades 1 and 3.

Table 45. Comparison of TerraNova Reading Comprehension Results for Different Groups of Students

                           Special Ed.     Disadvantaged       Gender
TerraNova
Comprehension        All    Yes     No      Yes     No      Male  Female
Kindergarten
  N                  283     37    238       99    176       161    121
  % < 20            17.3   29.7   14.7     22.2   13.6      22.4   10.7
  % 20 -< 40        25.4   35.1   23.1     30.3   21.6      25.5   25.6
  % 40 -< 60        24.0   18.9   25.6     28.3   22.7      23.0   25.6
  % 60 -< 80        17.7   13.5   18.5      7.1   23.9      15.5   19.8
  % 80+             15.2    2.7   17.6     12.1   17.6      13.0   18.2
  % Untested         0.4    0.0    0.4      0.0    0.6       0.6    0.0
First Grade
  N                  320     38    281      148    171       167    153
  % < 20            16.9   34.2   14.6     25.0    9.9      19.2   14.4
  % 20 -< 40        13.4   18.4   12.5     13.5   12.9      15.6   11.1
  % 40 -< 60        18.8   18.4   18.9     19.6   18.1      22.2   15.0
  % 60 -< 80        26.3   21.1   27.0     22.3   29.8      19.8   33.3
  % 80+             23.1    7.9   25.3     17.6   28.1      21.0   25.5
  % Untested         1.6    0.0    1.8      2.0    1.2       2.4    0.7
Second Grade
  N                  312     43    265      143    165       158    153
  % < 20            10.9   41.9    5.7     12.6    9.1      10.1   11.8
  % 20 -< 40        15.4   27.9   13.2     16.8   13.9      19.0   11.8
  % 40 -< 60        21.2    7.0   23.8     23.1   20.0      21.5   20.9
  % 60 -< 80        26.6   18.6   28.3     27.3   26.7      25.3   27.5
  % 80+             23.7    4.7   26.8     18.2   28.5      21.5   26.1
  % Untested         2.2    0.0    2.3      2.1    1.8       2.5    2.0
Third Grade
  N                  343     41    296      162    174       196    146
  % < 20            12.5   63.4    5.7     16.7    8.6      13.3   11.6
  % 20 -< 40        16.9   17.1   16.9     21.6   12.6      20.9   11.0
  % 40 -< 60        22.4    9.8   24.0     24.1   20.7      21.9   23.3
  % 60 -< 80        22.7    2.4   25.3     21.0   24.1      19.4   27.4
  % 80+             24.5    2.4   27.7     15.4   33.3      23.5   26.0
  % Untested         0.9    4.9    0.3      1.2    0.6       1.0    0.7

Summary for comparison of student subgroups. A comparison of baseline (year one)

assessment results for certain student subgroups versus other students revealed fairly consistent

patterns across all five reading elements and most reading measures. These patterns included:

• The performance gap for male students was generally not large. On most measures (but not ORF), the gap was largest in kindergarten and decreased at each subsequent grade level. The differences in performance were statistically significant in some grade levels.


• The performance gap for disadvantaged students was somewhat larger, particularly in kindergarten. On most measures (but not NWF or TerraNova), the gap was largest in kindergarten and decreased at each subsequent grade level. The differences in performance were statistically significant in most grade levels.

• The performance gap for special education students was large at all grade levels. On most measures (but not PSF), the gap was smallest in kindergarten and widened substantially at each subsequent grade level. The differences in performance were statistically significant at all grade levels.
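As an illustration of the kind of subgroup comparison summarized above, the sketch below applies a Pearson chi-square test to approximate counts reconstructed from the kindergarten special education columns of Table 45 (counts = N × percentage, rounded). This is not taken from the report, which does not describe its statistical procedures; the counts and the choice of test are assumptions for illustration only.

```python
# Illustrative only: chi-square comparison of percentile-band distributions
# for kindergarten special education students vs. other students, using
# approximate counts reconstructed from Table 45 (rounding introduces error).

def chi_square_stat(observed):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(observed):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    return stat

# Percentile bands: <20, 20-<40, 40-<60, 60-<80, 80+
sped_yes = [11, 13, 7, 5, 1]     # ~37 students x Table 45 percentages
sped_no  = [35, 55, 61, 44, 42]  # ~238 students x Table 45 percentages

stat = chi_square_stat([sped_yes, sped_no])
df = (2 - 1) * (5 - 1)   # 4 degrees of freedom
CRITICAL_05 = 9.488      # chi-square critical value at p = .05, df = 4

print(f"chi-square = {stat:.2f}, significant at .05: {stat > CRITICAL_05}")
```

On these reconstructed counts the statistic exceeds the .05 critical value, consistent with the report's finding of a significant special education gap, though the report's own procedure may have differed.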

These trends will continue to be monitored as part of the program evaluation. Once year-two data are available, it will be possible to compare the degree of improvement from year one to year two for these student subgroups relative to other students. Due to the very small number of LEP students and ethnic minority students in the participating schools, assessment results for those subgroups were not compared with results for other students.

Summary for Part V: Baseline Student Assessment Data

Part V of this report presented baseline (year one) assessment results for the seven Cohort 1 participating schools. An analysis of the aggregate results across all seven schools indicated mixed levels of progress over year one for phonics and fluency, and strong progress in phonemic awareness. The evidence for year-one progress in vocabulary and comprehension is inconclusive, either because the DIBELS measures lacked benchmarks or because there was only one data point for the TerraNova assessment. Baseline assessment data cannot be used to determine the impact or effectiveness of the MRF interventions; subsequent analyses will compare year one and year two assessment results to see what progress was made. The baseline results do, however, provide strong evidence that the participating schools needed assistance to improve student performance in reading.

An analysis of the assessment results disaggregated by school (seven schools) indicated consistent patterns of performance. Three schools had consistently low performance on the primary measures, while one school had higher performance. School size may be related to student performance: two of the three schools with the lowest performance also had some of the largest enrollments in this sample, while the school with the highest performance had one of the smallest enrollments. Given the small size of this sample, it is difficult to know how strong the relationship is between school size and student performance.

An analysis of the assessment results disaggregated by student subgroup also revealed some patterns. Male students consistently scored lower than female students on all measures, but the gap was generally not large; on most measures, it was largest in kindergarten and decreased at each subsequent grade level. Disadvantaged students scored lower than other students, and the gap was somewhat larger; here too, on most measures the gap was largest in kindergarten and decreased at each subsequent grade level. Special education students scored significantly lower than other students, and on most measures the gap was smallest in kindergarten and widened substantially at each subsequent grade level.


Bibliography

Adler, C. R. (Ed.). (2001). Put reading first: The research building blocks for teaching children to read. Jessup, MD: National Institute for Literacy, Partnership for Reading.

CTB McGraw-Hill. (2001, August). Technical report for TerraNova. Monterey, CA: Author.

CTB McGraw-Hill. (1996). Teacher’s guide to the TerraNova. Monterey, CA: Author.

Foorman, B. R., Fletcher, J. M., Francis, D. J., Schatschneider, C., & Mehta, P. (1998). The role of instruction in learning to read: Preventing reading failure in at-risk children. Journal of Educational Psychology, 90(1), 37-55.

Good, R. H., Gruba, J., & Kaminski, R. A. (2001). Best practices in using Dynamic Indicators of Basic Early Literacy Skills (DIBELS) in an outcomes-driven model. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology IV (pp. 679-700). Washington, DC: National Association of School Psychologists.

Good, R. H., & Kaminski, R. A. (Eds.). (2002). Dynamic indicators of basic early literacy skills (6th ed.). Eugene, OR: Institute for the Development of Educational Achievement. Available online: http://dibels.uoregon.edu/

Good, R. H., Simmons, D., & Kame’enui, E. J. (2001). The importance and decision-making utility of a continuum of fluency-based indicators of foundational reading skills for third-grade high-stakes outcomes. Scientific Studies of Reading, 5(3), 257-288.

Moats, L. (2003). LETRS: Language essentials for teachers of reading and spelling, Book 1. Longmont, CO: Sopris West.

National Assessment of Educational Progress. (2006). The nation’s report card: Science 2005. Retrieved from http://nces.ed.gov/nationsreportcard/


National Reading Panel. (2000). Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction. Washington, DC: National Institute of Child Health and Human Development.

Parker, R., Hasbrouck, J., & Tindal, G. (1992). Greater validity for oral reading fluency: Can miscues help? The Journal of Special Education, 25(4), 492-503.

Snow, C. E., Burns, M. S., & Griffin, P. (Eds.). (1998). Preventing reading difficulties in young children. Washington, DC: National Academy Press.

Vaughn, S., & Briggs, K. (Eds.). (2003). Reading in the classroom: Systems for the observation of teaching and learning. Baltimore, MD: Paul H. Brookes Publishing Co.