Spring Data Review, Cohort 7, Spring 2013, Elementary Schools (http://miblsi.cenmi.org)


Spring Data Review
Cohort 7, Spring 2013, Elementary Schools
http://miblsi.cenmi.org
Trainer Notes: This will be C7's last core training with MiBLSi.

The materials for this training day were developed with the efforts of:
Melissa Nantais
Kim St. Martin
Anna Harms
Jennifer Rollenhagen
Tennille Whitmore

Content was based on the work of:
Roland Good, University of Oregon
Stephanie Stollar, Dynamic Measurement Group (DMG)
Rob Horner, University of Oregon
George Sugai, University of Connecticut
Joe Torgesen, Florida Center for Reading Research
Dawn Miller, Shawnee Mission School District
Michigan's RtI State Framework and Guidance Committee

Acknowledgements
Trainer Notes: This slide contains acknowledgements of those who contributed to the development of this content, along with those whose work this content is based upon.

To make this day the best possible, we need your assistance and participation:
Be Responsible
Attend to the "Come back together" signal
Active participation
Please ask questions
Be Respectful
Please allow others to listen
Please turn off cell phones and pagers
Please limit sidebar conversations
Share air time
Please refrain from email and Internet browsing
Be Safe
Take care of your own needs

Group Expectations
Trainer Notes: Please take a few moments to review the group expectations at the start of the day.

Review the role of the school leadership team in sustaining the work of MTSS implementation

Make the work of MTSS visible within the school improvement plan

Provide teams with time and a structure to review school-wide data for the purposes of developing a plan that will improve student outcomes

Provide teams with time and a structure to identify and summarize celebrations and areas of need to share with stakeholders

Purpose of Spring Data Review Activities
Trainer Notes: This slide states the purpose of the Spring Data Review activities. Be sure to review the purpose with participants.

Introduction

Gather: Ensuring continued accurate and efficient collection and use of data

Study: Understanding the data, strengths, areas of need, gap and cause for gap

Plan: Integrate improvement priorities into the school improvement plan

Do: Monitoring the plan and next steps

Today's Agenda
Trainer Notes: This slide contains the agenda for today's spring data review. Please consider posting the agenda on chart paper at the front of the room to refer to throughout the day.

1.0 Introduction
Trainer Notes: This module provides an introduction to today's data review. It should begin by 9:05 a.m. and end by 9:30 a.m.

Getting Ready for Today
Take a moment to identify the following roles:

Facilitator

Recorder

Timekeeper

It will be helpful for the recorders to have access to someone's computer!
Trainer Notes: Take a few moments to allow the implementation teams to determine roles for the day.

Pink Assessment Binder
Paper or electronic copies of your data
Follow-up activities worksheets/action plans from the Fall and Winter Data Reviews

School Improvement Plan

Do You Have What You Need?
Trainer Notes: The intent of this slide is to provide a quick review of the materials needed for today's training.

Completed three times per year

Problem-solving focus tied to School Improvement Plan

Focused on both Program Quality/Fidelity data & Student Outcome data

Results in an action plan that specifies what needs to be done, by whom & by when, and the resources needed

Building-Level Data Review: Big Ideas
Trainer Notes: This slide provides a review of the big ideas behind building-level data review. Please take a few moments to review the list with the group.

To understand the status of MTSS implementation and its impact on student outcomes

To identify small and large successes, communicate those successes and capitalize on them

To identify where support is needed and begin communicating and organizing resources so that support is provided where needed

Purpose of a Building-Level Data Review
Trainer Notes: This slide provides a review of the purpose of a building-level data review. Have participants read through these bullets on their own.

At the Winter Data Review, we asked each team to share one aspect of their implementation plan that the group could hold them accountable for in May

Briefly review your progress with your team

Identify at least one aspect of this year's implementation efforts, related to what your team was being held accountable for, that has gone well. Be prepared to share.

Team Share
Trainer Notes: This team share should begin by 9:10 a.m. and end by 9:30 a.m. It may end earlier depending on the size of your group.

There are a variety of ways you can structure the share-out of this team time. Please choose an option that will work best for your audience. Here are some suggestions (please note that you do not need to choose one of these options if there is another option you would like to use):
For a small to medium group: Same-role partner. Find someone at another table who has the same job role as you (e.g., teacher, principal, ISD staff, coach, etc.). Once you have a partner, have a standing conversation, each sharing your group's implementation success. At the signal, thank your partner and return to your team.
For a medium to large group: Go Visual/Museum Tour. Create a poster (with a visual) representing what your team was being held accountable for and your success with implementation. Designate one team member to stay with your poster and answer questions/provide clarification. All other team members go on a museum tour, looking at what other teams are doing and making notes to take back to their team.

2.0 Gather
Trainer Notes: This module should begin by 9:30 a.m. and end at 10:00 a.m.

MiBLSi does not have direct access to pull data from your AIMSweb accounts. Cohort 7 schools need to submit summaries of their AIMSweb screening data to MiBLSi using a spreadsheet: the AIMSweb Data Sheet for 2012_13 (send to [email protected] or your TAP). 8 teams sent their spreadsheets after Winter screening. We use these data for problem solving to provide the best supports possible. We also use summaries of the data for grant reporting.
AIMSweb Data Collection
Trainer Notes: [Skip this]

The letter provides information about how MiBLSi uses data collected from schools.
NEED TO ACT: Please have either the UO DIBELS Data System form or the DIBELSnet form signed by your district's superintendent or assistant superintendent (someone who can provide permission for the entire district).
We need the DIBELSnet form to access data from schools that have switched to this data system for DIBELS Next.
We need the UO DIBELS Data System form as part of an update process to comply with recent changes to FERPA requirements.
Even though the form is being signed by the district, we will not access data from schools in C1-7 that have not participated with us.
Cohort 1-7 District Data Sharing Agreements
Trainer Notes: [Skip]

MiBLSi asks schools and districts to use data for:
Data-based decision making as part of a continuous school improvement process to improve student outcomes through effective/efficient implementation of research-based practices.
MiBLSi collects data from schools and uses it for:
Project-level data-based decision making to inform allocation of resources and effective programming support
Accountability to our grant funding sources
Trainer Notes: [Ignore bottom part] This slide talks about why these data are important. Data are used as part of an ongoing school improvement process, where we look at student outcome data and systems process data to determine if our instruction is working and if it is done with fidelity.

Rates of Data Submission: Cohort 7
Trainer Notes: You can ignore the results of this graph, which shows what percentage of MiBLSi schools submitted each evaluation piece. [Not SE Data Collection Form]

We want to gather information that tells us:
How well we are implementing/doing something: Program Quality/Fidelity Data
AND
Whether what we're doing is working: Student Outcome Data
Gather
Trainer Notes: The data you've collected this year have been a combination of program quality/fidelity/systems process data and student outcome data. So, how well are we implementing something, and is it resulting in improved performance? Remember that these two things are closely related; without fidelity, an otherwise effective system is unlikely to result in student improvement.

Why Do We Want Both Types of Data?
Trainer Notes: Download and embed the video "Data in the Classroom" onto this slide. The intent of the video is to emphasize the importance of having both types of data: student outcome data and program quality/fidelity data. When you watch the video, your attention is on the data the announcer asks for: the number of red cards dealt. However, when watching the video a second time, most people will notice the additional data: the message on the cards that says "Be sure to use at least 2 types of data." The two types of data we need to look at together to get the complete picture are student outcome data and program quality/fidelity data.

Behavior Data You've Collected
Student Outcome Data:
Office Discipline Referral Data
Tier 2/Tier 3 Intervention Tracking Form
Program Quality/Fidelity Data:
Benchmarks of Quality (BoQ)
PBIS Self-Assessment Survey (SAS)
Benchmarks for Advanced Tiers (BAT)
School-wide Evaluation Tool (SET)
Trainer Notes: Here is a list of the behavior data teams have been collecting since beginning with the project.

Literacy Data You've Collected
Student Outcome Data:
Reading Curriculum-Based Measurement (R-CBM) screening & progress monitoring data
Tier 2/Tier 3 Intervention Tracking Form
Program Quality/Fidelity Data:
Planning and Evaluation Tool - Revised (PET-R)
or
School-wide Evaluation and Planning Tool (SWEPT)
Trainer Notes: Here is a list of the literacy data teams have been collecting since beginning with the project.

Considerations in Building Sustainable Systems of Data-Based Decision Making
Data Collection:
Training (Initial and Re-Training)
Accuracy Checks for Administration & Scoring
Scheduling of Assessments
Data Entry:
Time for Data Entry
Accuracy of Data Entry
Training for Data Entry (Initial and Re-Training)
Data Sharing:
Training in Interpretation of Data (Initial and Re-Training)
Ensuring Timely Access to Data
Formal and Informal Data Sharing

Trainer Notes: The goal is to ensure accurate collection of data that is accessible in a timely fashion for ongoing decision making at the school-wide, grade-level, classroom, and individual student levels. The intent of this slide and the upcoming activity is to help teams consider how to ensure data-based decision making remains part of the sustainable system that will continue after their formal partnership and trainings with the project have ended.

Supports from MiBLSi:
Measurement Schedule
Measurement page on the MiBLSi website
Reading Data Coordinator listserv
PBIS Assessment listserv
SWIS Facilitator listserv
Training materials on the MiBLSi website for data review
DIBELS Mentor training (August)
SWIS Facilitator trainings (Fall 2013)
Sustaining Data Collection & Review
Trainer Notes: This slide provides a review of the continued supports the project will provide to assist with sustaining data collection and review after formal trainings with the project have ended.

Sustaining Data Collection & Review

Trainer Notes: We know that teams often encounter roadblocks or barriers to the continued collection and use of data for decision making after their time with the project. Since our focus has always been on sustainable practices, teams need to begin to identify what the potential roadblocks might be in order to brainstorm ways to prevent them, or to get around them if and when they come up.

Example Planning Sheet

Trainer Notes: This is a screenshot of an example Sustaining Data Collection & Analysis Planning Sheet. Walk teams through the example, focusing on the ODR row with the anticipated barrier, the brainstormed action items, and the identification of the necessary supports to address the potential barrier. Teams will use a similar process during the Team Time on the next slide.

Review the 2013-14 Measurement Schedule and determine how to best ensure that data are collected during the 2013-14 school year

Complete the Planning Sheet by identifying any potential barriers to continued data collection and data reviews for the 2013-14 school year

Brainstorm strategies to prevent or overcome the potential barriers

Identify who can help address any remaining barriers

Team Time
Trainer Notes: Provide teams with 20 minutes to complete this activity. This team time should begin by 9:40 a.m. and end at 10:00 a.m. The intent of this activity is to help teams consider how to ensure data-based decision making remains part of the sustainable system that will continue after their formal partnership and trainings with the project have ended. Teams are also asked to anticipate potential barriers to continued data collection and analysis and to brainstorm how to address these potential barriers.

There is a worksheet in the participant workbook to help teams through this Team Time. The worksheet is titled Sustaining Data Collection & Analysis Planning Sheet.

Provide a break from 10:00 a.m. to 10:15 a.m.

3.0 Study
Trainer Notes: This module should begin at 10:15 a.m. and end at 12:00 p.m., when teams break for lunch.

Using Data for Decision Making

Trainer Notes: Download and embed the video entitled "School Leadership Videos: Using Data." The intent of this video is to stress the importance of the data review process and its link to school improvement planning.

Sometimes, reviewing data can be awkward
Trainer Notes: Download and embed the video clip "Do you trust your data?" The intent of this video is to convey the message that sometimes reviewing data can be awkward, especially if someone does not trust the data. Teams need to be aware of the emotions that may come along with reviewing data and stay focused on the things that are within their control.

Understanding the Parts of a School Improvement Plan
Program Quality/Fidelity Data
Student Outcome Data
Trainer Notes: This slide is animated. The intent of this slide is to outline the steps of the School Improvement Process and to point out where program quality/fidelity data and student outcome data fit into the School Improvement Planning Process.

Partner 1: Review the definition of goals and objectives and share with Partner 2

Partner 2: Review the definition of strategies and activities and share with Partner 1

Partner Activity
Trainer Notes: Provide 5 minutes for this activity.

Questions & Data Sources for Building-Level Data Analysis

Trainer Notes: These screenshots of the two-page document "Questions and Data Sources for Building-Level Analysis" are intended to help teams organize the upcoming study process. Before teams get started studying their data, it is important to point out a few important parts of this document. Have teams open their workbook to these two pages. On the first page, have them highlight the box after question #6 and stress that the answers to items 1-6 should allow teams to identify the gap statement and write precise problem statements. On the second page, have participants highlight the last box and stress that the use of the program quality/fidelity data along with the behavior data (remember the integration of reading and behavior) in questions 7-15 will help teams formulate the cause-for-gap statement. There are three versions of this document in the participant workbook: one for DIBELSnet, one for DDS, and one for AIMSweb. Be sure to have participants locate the appropriate version for the data system they use.

Independently: Review the document "Questions and Data Sources for Building-Level Analysis"

As a Team: Identify any questions or data sources that your team needs additional clarification around

Activity
Trainer Notes: Provide teams with 5-10 minutes for this activity. Be sure to circulate the room while teams are engaged in this activity. Debrief with the group after the activity to determine which data sources teams may need additional support with. The following 34 slides are hidden; trainers can unhide the slides that are needed for reviewing data reports with teams.

Effectiveness of Instructional Support Reports


The Effectiveness of Instructional Support Levels Report examines the effectiveness of a school's instructional support by grouping students by their benchmark status category at one assessment period and then determining how well that group did at the next assessment period.
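The grouping this report performs can be sketched as a small tally. This is a hedged illustration only, not DIBELS or AIMSweb code; the status labels, function name, and example data are hypothetical.

```python
# Hedged sketch only, not DIBELS/AIMSweb code: tallies how students move
# between benchmark status categories from one assessment period to the next,
# which is the grouping the Effectiveness of Instructional Support Levels
# Report is described as performing. Names and data here are hypothetical.
from collections import Counter

STATUSES = ["At or Above Benchmark", "Below Benchmark", "Well Below Benchmark"]

def effectiveness_summary(first_period, next_period):
    """Map each first-period status group to the percent of that group
    landing in each status at the next period. Inputs map student ID -> status."""
    moves = Counter(
        (first_period[s], next_period[s]) for s in first_period if s in next_period
    )
    summary = {}
    for start in STATUSES:
        group_size = sum(n for (ws, _), n in moves.items() if ws == start)
        if group_size == 0:
            continue  # no students started the period in this category
        summary[start] = {
            end: round(100 * moves[(start, end)] / group_size)
            for end in STATUSES
            if moves[(start, end)]
        }
    return summary

# Example: of two students at or above benchmark in winter, one stays on
# track in spring and one falls to below benchmark.
winter = {"s1": "At or Above Benchmark", "s2": "At or Above Benchmark"}
spring = {"s1": "At or Above Benchmark", "s2": "Below Benchmark"}
print(effectiveness_summary(winter, spring))
```

The same tally answers each of the questions the next slides walk through (what percent stayed at or above benchmark, what percent moved up a category, and so on).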

Contains data on the performance of the entire grade at the middle of the year based on the composite score.
Contains data on the performance of the entire grade at the end of the year based on the composite score.

Of the students who were at or above benchmark in the winter, what percent remained at or above benchmark in the spring? The data indicate that of the 56 students who were at or above benchmark in the winter, 84% (47) remained at or above benchmark in the spring.

Of the students who were below benchmark in the winter, what percent moved from below benchmark to at or above benchmark in the spring? The data indicate that of the 8 students who were below benchmark in the winter, 67% (8) moved to at or above benchmark in the spring.

Of the students who were well below benchmark in the winter, what percent moved to below benchmark or to at or above benchmark in the spring? The data indicate that of the 1 student who was well below benchmark in the winter, 100% (1) moved to below benchmark or at or above benchmark in the spring.

Status Report by Subgroup

DIBELS Next Distribution Report

Trainer Notes: This is a screenshot of a DIBELS Next Distribution Report.

DIBELS Next Distribution Report by Subgroup

Trainer Notes: This is a screenshot of the DIBELS Next Distribution Report by subgroup. Remind teams that it is important to examine their data disaggregated by subgroups such as race/ethnicity and SES. To be able to run these reports, teams must be sure to enter relevant student information into the data system.

DIBELS Next Cross-Year Box Plot

Trainer Notes: This is a screenshot of the DIBELS Next Cross-Year Box Plot graph. This report is helpful to teams when examining data over time.

DIBELS Next Summary of Effectiveness Report

Trainer Notes: This is a screenshot of the DIBELS Next Summary of Effectiveness Report.

AIMSweb Tier Transition Report

Trainer Notes: This is a screenshot of the AIMSweb Tier Transition Report.

AIMSweb Tier Transition Report by Subgroup

Expand Report Options
Select group
Click Display
Trainer Notes: The AIMSweb Tier Transition Report by subgroup looks identical to the typical Tier Transition Report. This slide provides a screenshot of how to select the options to run this report by subgroup.

How Effective is our Reading System of Support?
Core: Meets the needs of at least 80% of all students; supports 95-100% of students to make adequate progress
Strategic: Meets the needs of the 15% of students who need more than just Core; supports 80% of these students to achieve benchmark
Intensive: Meets the needs of the 5% of students who need intensive intervention; supports 80% of these students to progress to strategic or benchmark support

Electronic Version of the Planning & Evaluation Tool-Revised (PET-R)
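As a rough sketch, the criteria from the "How Effective is our Reading System of Support?" table can be expressed as simple threshold checks. This is a hypothetical helper for illustration only, not an MiBLSi or AIMSweb tool; the function and argument names are assumptions.

```python
# Hypothetical helper, for illustration only (not an MiBLSi/AIMSweb tool):
# checks screening results against the rule-of-thumb thresholds in the
# "How Effective is our Reading System of Support?" table.
def reading_system_checks(pct_at_benchmark, pct_strategic_reach_benchmark,
                          pct_intensive_progressing):
    """Each argument is a percentage (0-100) taken from screening data."""
    return {
        "core_serves_at_least_80_pct": pct_at_benchmark >= 80,
        "strategic_moves_80_pct_to_benchmark": pct_strategic_reach_benchmark >= 80,
        "intensive_moves_80_pct_up_a_tier": pct_intensive_progressing >= 80,
    }

# Example: core instruction is healthy, but strategic and intensive
# supports are not yet moving enough students.
print(reading_system_checks(82, 65, 40))
```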

Electronic Version of the Planning & Evaluation Tool-Revised (PET-R)

Self-Assessment Survey (SAS) Total Score Report: School-wide

Self-Assessment Survey (SAS) Total Score Report: Non-Classroom

Self-Assessment Survey (SAS) Total Score Report: Classroom

Self-Assessment Survey (SAS) Total Score Report: Individual System

Self-Assessment Survey (SAS) Subscale Report

Self-Assessment Survey (SAS) Subscale Report

Benchmarks of Quality (BoQ) Total Score Report

Target = 70% (this example school scored 51%)

Benchmarks of Quality (BoQ) Subscale Report

Benchmarks of Quality (BoQ) Items Report


Benchmarks for Advanced Tiers Total Score Report

Benchmarks for Advanced Tiers Subscales Report

Benchmarks for Advanced Tiers Item Report

Average Referrals Per Day Per Month

Referrals by Problem Behavior

Referrals by Location

Referrals by Time of Day

Referrals by Student
School-wide Information System (SWIS)

SWIS: Average Referrals Per Day Per Month

SWIS: Referrals by Problem Behavior

SWIS: Referrals by Location

SWIS: Referrals by Time

SWIS: Referrals by Student

SWIS: Ethnicity Report

Acts on school-wide data (Program Quality/Fidelity and Student Outcomes) on a regular basis

Sends grade level specific information to the grade level staff to address during grade level meetings

Provides all stakeholders with an overview of the data and areas for celebration and areas targeted for growth. This includes teachers, support staff, volunteers and parents.

Utilizes work groups to address relevant needs

Sends school-wide information to district level staff

Role of the School Leadership Team
Trainer Notes: This slide provides an outline of the role of the School Leadership Team related to data review activities.

Cascading Model of Support
Levels of support: MiBLSi; ISD Leadership; LEA District; Building; Building Staff; Students (improved academics and behavior)
Building team: Identifies school-wide concerns and grade-level-specific concerns; develops an action plan based on building-level data and concerns and in alignment with the district goals for MTSS implementation; turfs grade-level-specific concerns to grade-level teams; responsible for implementing plans and communicating successes/challenges on a regular basis using data anchor information
Building staff: Learns the strategies and practices necessary to effectively teach critical skills; analyzes data at the classroom and grade level to identify areas of success and need; communicates needs to the building team so the needs can be addressed
Trainer Notes: This is a screenshot of the cascading model of support. This is an animated slide. The animations highlight the various levels of support we are focused on, along with their specific roles and responsibilities. It is important that teams recall the role of the leadership team in examining school-wide data, which is the purpose of today's activities.

Remember
The Building Leadership Team does not have to solve every problem, but it does need to study building data to determine the school-wide needs it will address, identify grade-level needs, and ensure the appropriate individual(s) who will address these needs are identified (e.g., which grade-level teams need to address the identified needs).
Trainer Notes: This is a reminder to the teams that they are not being asked to solve every problem related to the data they are reviewing, but rather to study the data to determine which school-wide needs they will address and to identify any specific grade-level needs that will be addressed by the grade-level teams.
Making Sense of Student Outcomes and Program Quality/Fidelity

Program Quality/Fidelity on track, Student Outcomes on track:
Stay the course and work to do it better (elaboration and continuous regeneration)

Program Quality/Fidelity on track, Student Outcomes not on track:
Explore accuracy of program quality/fidelity data. Determine if enough time has passed to expect changes in student outcomes (keep initial implementation on track and move into full implementation)

Program Quality/Fidelity not on track, Student Outcomes on track:
Explore accuracy of data. Examine what else is happening/present that could be contributing to strong student outcomes (keep working to do it right through initial and full implementation)

Program Quality/Fidelity not on track, Student Outcomes not on track:
Consider intensive implementation supports (revisit exploration/adoption and installation)
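The four combinations in this matrix amount to a small decision rule. The sketch below is an illustration only; the function name and return strings paraphrase the slide text and are not project code.

```python
# Illustration only: the function name and return strings paraphrase the
# slide's 2x2 matrix; this is not MiBLSi project code.
def next_step(fidelity_on_track, outcomes_on_track):
    """Map the 2x2 combination of data statuses to the suggested action."""
    if fidelity_on_track and outcomes_on_track:
        return "Stay the course and work to do it better"
    if fidelity_on_track:
        return ("Explore accuracy of fidelity data; check whether enough time "
                "has passed to expect changes in student outcomes")
    if outcomes_on_track:
        return ("Explore accuracy of data; examine what else could be "
                "contributing to strong student outcomes")
    return "Consider intensive implementation supports"

# Example: fidelity is on track, outcomes are not (the "yellow box" case).
print(next_step(True, False))
```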

Trainer Notes: Let's unpack this graphic. If we start with the green box in the top left corner, this is a situation when the program quality/fidelity data and the student outcome data are both on track. When this occurs, teams should keep doing what they have been doing, and the focus can shift to working to do things even better.

Moving to the yellow box in the top right corner, this is a situation when program quality/fidelity data are high but student outcome data are in need of support. In this case, the first step a team would take is to explore the accuracy of the program quality/fidelity data. The team would also want to consider whether enough time has passed to be able to expect changes in student outcomes. When it has been verified that the program quality/fidelity data are accurate, teams should focus on keeping initial implementation on track and expanding to full implementation. Teams will also need to keep an eye on student outcome data.

Moving to the bottom left corner, this is a situation where the program quality/fidelity scores are low and student outcome data are high. If this is the situation, teams should explore the accuracy of both sets of data. Things to consider at this point include determining what else might be occurring that could be contributing to the strong student outcomes; those are the things we want to keep doing.

Finally, the red box in the bottom right corner describes a situation where neither your program quality/fidelity data nor your student outcome data are on track. In this situation, teams should consider how to provide intensive implementation supports and revisit exploration/adoption and installation. Staff consensus is important in moving the work of MTSS forward.

Be specific by describing the:
Big Idea
Time of Year
Tier 1, 2, 3
Performance Gaps
Possible Program Quality/Fidelity Links
Data Accuracy
Translating the Analysis into Celebrations and Gap Statements

Example School-wide Reading Student Outcome Data

Trainer Notes: This set of example school-wide data can be found in the participant workbook. Have participants review the data in their workbook before moving on to the next series of slides, which will provide an example celebration and gap statement based on these data. These are intended as examples to model the link between the data analysis (questions 1-6 on the Questions & Data Sources for Building-Level Data Analysis) and the celebrations and gap statement.

Translating the Analysis into Celebrations

Trainer Notes: This is a screenshot of an example celebrations worksheet. Please point out to the participants that the celebrations are very specific, identifying the time of year, the specific Big Idea, and which tier(s) are involved.

Celebrate Successes!!!

Trainer Notes: This slide is a reminder to teams that they need to celebrate the specific successes at each grade level and at the building level.

Recall Our Example School-wide Reading Student Outcome Data

Trainer Notes: This is a quick transition slide to remind teams that we are using the same data set as we gear up to show an example gap statement on the next slide.

Example Gap Statement
As of May 2013, K is the only grade level where at least 80% of all students are at or above benchmark on the Composite score; this also holds for the Phonemic Awareness measure, but not for Alphabetic Principle. Grades 1 through 3 have not established at least 80% or higher on the composite scores (1st: 71%, 2nd: 65%, 3rd: 73% in May 2013) or on the individual measures for each grade level, indicating needs in the areas of Alphabetic Principle, Fluency, Comprehension, and Vocabulary across these grade levels.

There is inconsistent performance across subgroups at all grade levels when the May 2013 composite scores are disaggregated by ethnicity.

The data related to the effectiveness of instructional supports indicate that 1st (second semester only), 2nd, and 3rd grades demonstrate a relative strength in keeping 90% or more of students at or above benchmark across the year, but they do not yet consistently have 80% or more of students at or above benchmark at each assessment period. More than 15% of students demonstrate a need for strategic reading supports, with the exception of 2nd grade second semester and 3rd grade all year. However, at each grade level, with the exception of first-semester K, strategic reading supports are not moving enough students (at least 80%) to at or above benchmark at the next assessment period. At each benchmark period there are also more than 5% of students demonstrating a need for intensive reading supports in grades 1-3, and the intensive reading supports are not moving enough students (at least 80%) out of the well below benchmark range.

Translating the Analysis into a Gap Statement
Data come from the Reading Data Summary Sheet

Data come from the Subgroup Performance Sheet

Data come from the Summary of Effectiveness Table

Trainer Notes: This example gap statement can be found as a full sheet in the participant workbook. Prompt participants to read through the example gap statement in their workbook and highlight the areas where the data from the Reading Data Summary Sheet, the Summary of Effectiveness Table, and the Subgroup Performance Table appear in the example gap statement. Prompt participants to write where the data came from in their participant workbook. This slide is animated. After allowing individual time to review this example, advance the slide and indicate that the information in blue is from the Reading Data Summary Sheet, the information in orange is from the Summary of Effectiveness Table, and the information in purple is from the Subgroup Performance Table.

Example Program Quality/Fidelity Data for Reading & Behavior

Trainer Notes: This set of example reading and behavior program quality/fidelity data can be found in the participant workbook. Have participants review the data in their workbook before moving on to the next series of slides, which will provide an example cause for gap. These are intended as examples to model the link between the data analysis (questions 7-14 on the Questions & Data Sources for Building-Level Data Analysis) and the cause-for-gap statement.

Example Behavior Student Outcome Data

Trainer Notes: This set of example behavior student outcome data can be found in the participant workbook. Have participants review the data in their workbook before moving on to the next series of slides, which will provide an example cause for gap. These are intended as examples to model the link between the data analysis (questions 7-14 on the Questions & Data Sources for Building-Level Data Analysis) and the cause-for-gap statement.

Translating the Analysis into a Cause for Gap
Example Cause for Gap: Our DIBELS Next data indicate we have weaknesses in our core reading curriculum as well as in our strategic and intensive reading supports. While our overall PET-R score is 98%, the subscale score for Professional Development has been at 75% for the past two years, and we consistently score low on item 3 (time is systematically allocated for educators to analyze, plan, and refine instruction) and item 4 (professional development efforts are explicitly linked to practices and programs that have been shown to be effective through documented research). In addition, our BoQ score of 51% is below the criterion of 70%, and our average referrals per day per month are consistently above the national median of .22, indicating that we do not have full implementation of a three-tier model of behavior supports in place. Most of our referrals are coming from the classroom, which means that our behavior systems are likely impacting our academic outcomes. All of these factors are contributing to our gap.

Data come from the Reading Program Quality/Fidelity Measure

Data come from the Behavior Program Quality/Fidelity Measure

Data come from the Behavior Student Outcome Data
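The referral metric cited in the example cause for gap (average referrals per day per month, compared against a criterion such as the .22 national median) can be computed as a simple rate. This is a hedged sketch with made-up counts, not SWIS code.

```python
# Hedged sketch with made-up counts (not SWIS code): computes average
# referrals per school day for each month and flags months above a criterion
# such as the 0.22 national median cited in the example cause for gap.
def avg_referrals_per_day(monthly_referrals, school_days):
    """Both inputs map month -> count; returns month -> referrals per day."""
    return {m: round(monthly_referrals[m] / school_days[m], 2)
            for m in monthly_referrals}

referrals = {"Sep": 12, "Oct": 18}   # hypothetical referral counts
days = {"Sep": 20, "Oct": 22}        # hypothetical school days per month
rates = avg_referrals_per_day(referrals, days)
print(rates)                                          # {'Sep': 0.6, 'Oct': 0.82}
print({m: rate > 0.22 for m, rate in rates.items()})  # months above the median
```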

Trainer Notes: This example cause for gap can be found as a full sheet in the participant workbook. Prompt participants to read through the example cause for gap in their workbook and highlight the areas where the data from the Reading Program Quality/Fidelity Data, the Behavior Student Outcome Data, and the Behavior Program Quality/Fidelity Data appear. Prompt participants to write where the data came from in their participant workbook. This slide is animated. After allowing individual time to review this example, advance the slide and indicate that the information in blue is from the Reading Program Quality/Fidelity Data, the information in purple is from the Behavior Program Quality/Fidelity Data, and the information in orange is from the Behavior Student Outcome Data.

80Use the Questions and Data Sources for Building-Level Data Analysis with the Data Review Workbook to study your data

The intended outcome of this team time is to clearly and specifically identify celebrations, a gap statement, and a cause for gap based on your analysis of both the student outcome data and the program quality/fidelity data. Team Time#Trainer Notes:This team time goes until teams break for lunch from 12:00 p.m. to 12:45 p.m. There is no facilitator guide for this day. Teams are asked to use the Questions and Data Sources for Building-Level Data Analysis along with the questions/prompts in the data review workbook to study their data.

814.0 Plan#Trainer Notes:This module should begin at 12:45 p.m. and end by 2:55 p.m. The intent of this module is for teams to use the information from the study section to develop specific strategies and activities ideally tied to their School Improvement Plan. 82

School Improvement Plan MTSS What We Want to Avoid#Trainer Notes:The goal is for the work of MTSS and School Improvement to be connected and not in separate silos (as seen in this picture). 83References to the Program Quality/Fidelity Data and Student Outcome Data collected this year

Through strategies and activities related to the core principles of Multi-Tiered System of Supports, a School-wide Reading Model and School-wide Positive Behavioral Interventions & Supports Should reflect the integrated work you are doing in reading and behavior

Making MTSS Visible in Your School Improvement Plan #Trainer Notes:This slide provides general ideas of how to make MTSS visible in School Improvement Plans. 84Discuss why the following statements are considered non-examples of making MTSS visible in a School Improvement Plan

We will implement MTSS. We will get trained in MTSS.

Group Discussion#Trainer Notes:Provide 3-5 minutes for this group discussion. 85

MTSS Framework Evidence Based Instructional Practices Paragraph Shrinking Explicit vocabulary instruction

Research Based Core Program Reading Street Prentice Hall Evidence Based Interventions K-PALS REWARDS Read 180 Read to Achieve Behavioral Supports Schoolwide & Classroom PBIS Check-in Check-out

PLCs, Grade level meetings, problem solving process Assessments PLCs Student engagement strategies The Reality#Trainer Notes:This slide is animated. The reality is that MTSS is a framework or umbrella under which many evidence-based programs and practices exist. Making MTSS visible in School Improvement will require being much more specific than "we will implement MTSS" or "we will get trained in MTSS."

86So What Should it Look Like?

#Trainer Notes:This slide is a transition slide. We've talked about what not to do related to making MTSS visible in the school improvement plan. Now let's look at what it should look like. 87Recall the Cause for Gap Example Example Cause for Gap Our DIBELS Next data indicate we have weaknesses in our core reading curriculum as well as strategic and intensive reading supports. While our overall PET-R score is 98%, the subscale score for Professional Development has been at 75% for the past two years and we consistently score low on items (3) related to time being systematically allocated for educators to analyze, plan and refine instruction, and (4) professional development efforts are explicitly linked to practices and programs that have been shown to be effective through documented research. In addition, our BoQ score of 51% is below the criteria of 70% and we have an average referral per day per month that is consistently above the national median of .22 indicating that we do not have full implementation of a three tier model of behavior supports in place which means that our behavior systems are likely impacting our academic outcomes. All of these factors are contributing to our gap. Example Objective: At least 80% of students in all grade levels and all subgroups at Sample School will have basic literacy skills established by May 2014, as measured by DIBELS Next composite scores and subscale scores for each grade level.
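The example objective ("at least 80% of students in all grade levels ... at benchmark") implies a simple per-grade calculation: the share of students whose composite score meets the benchmark cut. A minimal sketch, with invented scores and an invented cut score (real DIBELS Next benchmark goals vary by grade level and time of year):

```python
# Hypothetical sketch of checking the example objective: at least 80% of
# students in a grade at or above the DIBELS Next composite benchmark.
# The scores and the cut score below are invented; real DIBELS Next
# benchmark goals vary by grade level and time of year.

BENCHMARK_GOAL = 0.80  # proportion of students the objective requires

def percent_at_benchmark(scores, cut_score):
    """Proportion of composite scores at or above the benchmark cut."""
    at_or_above = sum(1 for s in scores if s >= cut_score)
    return at_or_above / len(scores)

grade_scores = [155, 120, 98, 170, 142, 88, 160, 149, 131, 101]  # invented
cut = 130  # invented cut score, for illustration only

rate = percent_at_benchmark(grade_scores, cut)
print(rate, rate >= BENCHMARK_GOAL)
```

With these invented numbers, 6 of 10 students (60%) are at benchmark, so this grade would not yet meet the objective; the same check would be repeated per grade level and subgroup.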

#Trainer Notes:This slide is animated. We return to the cause for gap statement. Now we provide an example objective statement. 88Translating the Cause for Gap into Action Example Cause for Gap Our DIBELS Next data indicate we have weaknesses in our core reading curriculum as well as strategic and intensive reading supports. While our overall PET-R score is 98%, the subscale score for Professional Development has been at 75% for the past two years and we consistently score low on items (3) related to time being systematically allocated for educators to analyze, plan and refine instruction, and (4) professional development efforts are explicitly linked to practices and programs that have been shown to be effective through documented research. In addition, our BoQ score of 51% is below the criteria of 70% and we have an average referral per day per month that is consistently above the national median of .22 indicating that we do not have full implementation of a three tier model of behavior supports in place which means that our behavior systems are likely impacting our academic outcomes. All of these factors are contributing to our gap. Example Strategy #1:Sample Schools staff will strengthen the MTSS framework for reading by increasing the implementation percentage on the Professional Development section of the Planning & Evaluation Tool Revised (PET-R).

#Trainer Notes:This slide is animated. We provide an example strategy and highlight the portions of the cause for gap that directly led to the example strategy. 89Translating the Cause for Gap into Action Example Cause for Gap Our DIBELS Next data indicate we have weaknesses in our core reading curriculum as well as strategic and intensive reading supports. While our overall PET-R score is 98%, the subscale score for Professional Development has been at 75% for the past two years and we consistently score low on items (3) related to time being systematically allocated for educators to analyze, plan and refine instruction, and (4) professional development efforts are explicitly linked to practices and programs that have been shown to be effective through documented research. In addition, our BoQ score of 51% is below the criteria of 70% and we have an average referral per day per month that is consistently above the national median of .22 indicating that we do not have full implementation of a three tier model of behavior supports in place which means that our behavior systems are likely impacting our academic outcomes. All of these factors are contributing to our gap. Strategy #1 - Example Activities:Monthly grade level team meetings that focus on analyzing DIBELS benchmark, progress monitoring data, and additional classroom data to inform instruction and result in clearly defined action plans will be scheduled for the 2013-14 school year.

Staff will be provided with professional development prior to the start of the 2013-14 school year so that all staff are able to read, analyze, and interpret DIBELS reports and SWIS reports, ensuring the integration of academic and behavioral data during grade-level team meetings.#Trainer Notes:This slide is animated. Given the example cause for gap and example strategy on the previous slide, here are some potential activities that would be directly related to the strategy. 90Translating the Cause for Gap into Action Example Cause for Gap Our DIBELS Next data indicate we have weaknesses in our core reading curriculum as well as strategic and intensive reading supports. While our overall PET-R score is 98%, the subscale score for Professional Development has been at 75% for the past two years and we consistently score low on items (3) related to time being systematically allocated for educators to analyze, plan and refine instruction, and (4) professional development efforts are explicitly linked to practices and programs that have been shown to be effective through documented research. In addition, our BoQ score of 51% is below the criteria of 70% and we have an average referral per day per month that is consistently above the national median of .22 indicating that we do not have full implementation of a three tier model of behavior supports in place which means that our behavior systems are likely impacting our academic outcomes. All of these factors are contributing to our gap. Example Strategy #2:Staff will increase the fidelity of implementation of School-wide PBIS with a score of 70% on the Benchmarks of Quality (BoQ) in order to decrease the average referrals per day per month to at or below the national median.

#Trainer Notes:This slide is animated. We've created a second example strategy linked to the example cause for gap statement. 91Translating the Cause for Gap into Action Example Cause for Gap Our DIBELS Next data indicate we have weaknesses in our core reading curriculum as well as strategic and intensive reading supports. While our overall PET-R score is 98%, the subscale score for Professional Development has been at 75% for the past two years and we consistently score low on items (3) related to time being systematically allocated for educators to analyze, plan and refine instruction, and (4) professional development efforts are explicitly linked to practices and programs that have been shown to be effective through documented research. In addition, our BoQ score of 51% is below the criteria of 70% and we have an average referral per day per month that is consistently above the national median of .22 indicating that we do not have full implementation of a three tier model of behavior supports in place which means that our behavior systems are likely impacting our academic outcomes. All of these factors are contributing to our gap. Strategy #2 Example Activities:

Review Benchmarks of Quality data and SWIS data with staff and facilitate a data dialogue using the BoQ Total Score, Subscale, and Items report along with the SWIS Big 5 reports. Facilitate a conversation with staff in order to gain consensus around implementation of SW-PBIS and the integration of academic and behavioral supports.

The school staff will use the completed lesson plans for teaching School-wide behavioral expectations following the schedule of dates for the 2013-14 kick-off and review.

School staff will review SWIS data at monthly staff meetings and engage in data dialogues to problem solve and action plan based on the needs identified in the school-wide data. The Leadership Team will use this spring's BoQ data (Items report) to monitor the progress of implementation efforts.

#Trainer Notes:This slide is animated. It lists example activities that are directly related to the example strategy from the previous slide. 92Building-Level School Improvement Support Tool

Do not underestimate the importance of identifying the following information as a part of your planning: Who will do it? By when? How often? Resources needed Plan for Monitoring#Trainer Notes:This slide is animated. It provides a screenshot of the School Improvement Support Tool (that is in the pink assessment binder). We want to be very explicit about the link between the example objectives, strategies, and activities on the previous slides and this form. We also do not want to underestimate the importance of completing the entire form, including who will do it, by when, how often, resources needed, and the plan for monitoring. Remember, the goal is to have a completed action plan based on the data. 93Importance of Communication

#Trainer Notes:Communication is a key component of planning. Teams need to keep in mind what needs to be communicated out and to whom. 94Use the Data Analysis you completed during the Study Phase along with the examples in your Data Review Workbook to develop your plan.

The intended outcome of this team time is to develop specific strategies and activities to address the gap statement and the cause for gap, as well as to identify what information needs to be communicated and to whom it will be communicated. Team Time#Trainer Notes:This team time should begin by 1:30 p.m. at the latest and will end by 2:55 p.m.

Teams should be prompted to take a 15-minute break during this time. 955.0 Do#Trainer Notes:This module should begin at 2:55 p.m. and end by 3:00 p.m. 96

#Trainer Notes:The intent of this module is to convey this message: just do it! Teams should have completed a detailed action plan based on the data analysis completed earlier today. The task then becomes to actually implement the plan: just do it!

97Wrapping Up Our Time with C7#Trainer Notes:Each TAP will insert content to wrap up the time with C7 schools. This should begin at 3:00 p.m. and end at 3:30 p.m. 98Thank you for your time and your dedication to the hard work of MTSS implementation!
