
The Florida Problem Solving/Response to Intervention Project

Year 4 Evaluation Report

Stockslager, K.M., Castillo, J.M., Hines, C.V., Batsche, G.M., and Curtis, M.J.

April 2013

This project is a collaboration between the University of South Florida and the Florida Department of Education.


Table of Contents

Contact Information
Acknowledgements
Executive Summary
Abstract
Introduction
Evaluation Design
Methods and Procedures
Findings: Ongoing Pilot Site Implementation and Project Capacity Building Efforts
Summary of Findings
References
Appendix A – Problem Solving-Response to Instruction/Intervention Training Outline
Appendix B – Copies of Evaluation Instruments
Appendix C – Data Tables


Contact Information

April 2013

Project Director
George Batsche, EdD, NCSP
Michael Curtis, PhD, NCSP

Project Leader
Clark Dorman, EdS

Project Evaluators
José Castillo, PhD, NCSP
Kevin Stockslager, PhD, NCSP
Constance Hines, PhD

Regional Coordinators
Deanne Cowley, EdS
Beth Hardcastle, MA, MEd
Kelly Justice, MEd

Regional Facilitators
Michael McAuley, EdS
Larry Ruble, EdS
Lisa Yount, MA

For specific staff contact information, go to http://floridarti.usf.edu/floridaproject/contactinfo.html. For an electronic copy of this document, go to http://floridarti.usf.edu/resources/program_evaluation/index.html. Please contact Kevin Stockslager at [email protected] for additional information on this report.


Acknowledgements

The Florida Problem Solving/Response to Intervention (PS/RtI) Project would like to thank the Regional Coordinators and Regional Facilitators who were so instrumental in collecting the data reviewed in this report. Special thanks are also extended to the following Project Graduate Research Assistants for their efforts in entering, managing, and analyzing the data presented:

Lisa Bateman
Cheryl Gelley
Tom Makowski
Mario Montesino
Derek Powers
Krystle Preece

Finally, the Project would like to thank Judi Hyde for her assistance formatting the charts, graphs, and other components of this report needed to communicate findings clearly to readers.


Executive Summary

This report presents information on the Florida Problem Solving/Response to Intervention (PS/RtI) Project from the 2010-11 school year. To help facilitate and inform implementation of the PS/RtI model in the state, the Florida Department of Education (FDOE) created the Project in 2006. The Project represents a collaborative effort between the FDOE and the University of South Florida, created to (1) provide training, technical assistance, and support across the state on the PS/RtI model, and (2) systematically evaluate PS/RtI implementation in a limited number of demonstration sites. The information contained within this report focuses primarily on these two components of the Project's mission. First, the report presents data on important implementation outcomes collected from demonstration sites during the one-year follow-up period after three years of professional development and support concluded. Second, the report focuses on the PS/RtI Project's evolving mission to build the capacity of Florida's schools and districts to implement an integrated multi-tiered system of supports[1] (MTSS). Specifically, the second section focuses on the activities that Project leadership and staff engaged in to build internal capacity to achieve the Project's mission, as well as the activities that they engaged in to assist districts in building capacity to implement an MTSS.

The demonstration site component of the Project, implemented from 2007-2010, was intended to provide a comprehensive evaluation of the impact of implementing an MTSS model on schools, educators, and students. Data from the evaluation of the demonstration sites were intended to inform the statewide scale-up of an MTSS in Florida. The demonstration site component was implemented in 34 pilot schools within seven demonstration school districts across the state. The Project provided funding, technical assistance, and follow-up support to demonstration districts and pilot schools to facilitate implementation of the model (for more information on the three-year demonstration component, including information on the PS/RtI Project's systems change model, see Castillo, Hines, Batsche, and Curtis [2011]). Project staff collected data on a number of variables to evaluate implementation of an MTSS. Following the completion of the three-year professional development cycle, Project staff continued to collect data in 27 pilot schools within six districts during the 2010-11 school year to evaluate the relationship between the removal of professional development supports and MTSS implementation.

The capacity building section is intended to provide an evaluation of the activities that Project staff engaged in to build internal capacity to assist districts in implementing an MTSS, as well as the activities engaged in to build district capacity for MTSS implementation. The capacity building section is organized around two main goals. First, the extent to which the Project built internal capacity is discussed. Second, the extent to which the Project staff worked with districts to begin building capacity to implement an MTSS is discussed. Data sources include the Project Tracking System (PTS; a statewide accountability database used by discretionary projects in the state to enter information on projects’ activities and outcomes) and training evaluation surveys, as well as permanent products from activities that the Project engaged in to build internal capacity.

[1] The term multi-tiered system of supports (MTSS) is increasingly being used to describe the process of matching multiple tiers of instruction and intervention to student needs using data. Previously, the Project referred to the process of delivering multiple tiers of instruction and intervention informed by a data-based problem-solving process as PS/RtI. To maintain consistent terminology throughout this report, the term MTSS is used to describe implementation efforts instead of PS/RtI.


Findings

Pilot school findings are organized around the systems change model adopted by the Project. First, information is presented on the extent to which school and district staff participated in and supported MTSS implementation (Consensus). Next, the development of school structures and staff skills needed to support implementation is examined (Infrastructure). Then, the extent to which schools actually implemented the components of an MTSS is discussed (Implementation).

Increases in educator beliefs and indicators of consensus were evident during the three years of professional development support. Despite continued increases in reported consensus building activities, decreases in some educator beliefs (as measured by mean scores on the Beliefs Survey) were evident during the following year. While receiving professional development supports from the Project, educators reported increasing levels of agreement with core beliefs central to implementation of an MTSS. School-Based Leadership Team (SBLT) members also reported increases in indicators of consensus at the school and district levels. Decreases in SBLT member beliefs were evident following the withdrawal of Project-provided professional development supports after the 2009-10 school year. SBLT members' beliefs regarding the abilities and performance of students with disabilities, using data to make educational decisions, and the functions of core and supplemental instruction decreased in terms of their alignment with the tenets of an MTSS. Instructional school staff beliefs also decreased, although the decreases were smaller than for SBLT members. While decreases in educators' beliefs were noted, SBLT members continued to report high levels of consensus building activities.

A similar pattern emerged when examining the structures and educator skills necessary to support implementation of an MTSS. Specifically, increases in the infrastructure for MTSS implementation and educators' skills were noted during the three-year professional development period, followed by continued increases in infrastructure development but decreases in educators' perceived skills. Increases in perceived skills appeared to be related to the level of systematic and intensive professional development provided. SBLT members who received the most systematic and intensive training tended to report the highest levels of perceived skills and demonstrated the greatest increases. However, in the year following the conclusion of the professional development cycle, both SBLT members and instructional school staff reported lower levels of perceived skills in applying data-based problem solving to academic content and in data manipulation and technology use. Reported skill levels in applying data-based problem solving to behavior content were sustained during the follow-up year among SBLT members, while slight decreases were noted among instructional school staff.

Consistent with patterns in consensus and infrastructure development, increases in MTSS implementation were evident during the three years of professional development support, but decreases in implementation levels tended to occur during the follow-up year. SBLT reports and reviews of permanent products (e.g., meeting notes, worksheets, graphs, charts) generated from meetings during which the critical components of an MTSS were likely to be implemented indicated increasing levels of implementation across the three-year period for pilot schools. However, permanent product reviews suggested decreases in implementation of the four-step problem-solving process to inform instruction and intervention during the subsequent school year. These decreases suggest that additional professional development and other systematic capacity building activities may be necessary for pilot schools to sustain implementation of an MTSS.


In terms of the Project's capacity building efforts, the Project began or continued several initiatives (i.e., Statewide Professional Development and Support for MTSS, Training of Trainers Regional Meetings, Secondary School MTSS Implementation, Differentiated Accountability Support for MTSS, Technology to Support Improved Outcomes for All Students, and an Integrated Model for Academic and Behavior Supports) to build school districts' capacity to effectively implement an MTSS. During the 2010-11 school year, Project leadership began to build internal capacity by hiring additional staff to support districts in implementing an MTSS. Additionally, Project leadership and staff began systematically collaborating with the Florida PBS/RtI:B Project to develop an MTSS that effectively integrates academic and behavioral supports. Leadership and staff from both projects collaborated to develop a vision and mission statement, formed inter-project workgroups, and began to develop professional development models around critical areas of need (e.g., secondary MTSS implementation, professional development and coaching, K-12 alignment, leadership models for MTSS implementation). Finally, professional development and support were provided to staff (1) to share information on the collaboration between the two projects and the activities that staff would be engaging in to support the statewide implementation of an MTSS, and (2) to build the knowledge and skills necessary to successfully support districts.

In addition to developing the internal capacity of the Project, Project staff also worked to begin building the capacity of districts to implement an MTSS. Project staff provided training and technical assistance, developed and disseminated materials to stakeholders, and provided consultation support to school-, district-, and state-level stakeholders. Examples of the professional development delivered include the 2011 MTSS Leadership Institute, the Training of Trainers Regional Meetings, and Secondary MTSS Implementation Trainings. The materials and support provided by Project staff typically focused on improving the instructional strategies of schools and districts throughout the state.

Preliminary Implications and Future Directions

Findings from four years of evaluation activities in pilot schools across the state suggest improvements in consensus, infrastructure development, and implementation of an MTSS; however, decreases occurred across several important implementation outcomes once the three-year professional development and support cycle ended. Specifically, on average, educators reported lower beliefs and skill levels, and implementation of most components of the problem-solving process decreased as well. These findings suggest that a relationship may exist between systematic professional development and building educator capacity to implement an MTSS. Additional professional development and support may be necessary for educators to sustain implementation. Other variables may have influenced implementation of an MTSS as well. For example, staff turnover, district policies and procedures that are inconsistent with implementation, and a lack of ongoing professional development and support may have impacted implementation. Future evaluation efforts will continue to examine the extent to which changes in consensus, infrastructure, and implementation occurred following the three-year professional development cycle.

Future evaluation activities also will examine the extent to which the Project built capacity for the statewide implementation of an integrated MTSS model of academic and behavior supports. The evaluation will include information on the activities that the Project engaged in to build internal capacity to support districts in implementing an MTSS and the associated outcomes. The evaluation also will include information on activities engaged in to build district capacity to implement an MTSS as well as the relationship between those activities and associated outcomes.


Abstract

The Florida Problem Solving/Response to Intervention (PS/RtI) Project represents a collaborative effort between the Florida Department of Education and the University of South Florida. The Project was created in 2006 to (1) provide professional development across the state on a multi-tiered system of supports (MTSS) model, and (2) systematically evaluate the impact of MTSS implementation in a limited number of demonstration sites. During the 2010-11 school year, the Project began building capacity to support statewide implementation of an integrated MTSS model of service delivery. The Florida Problem Solving/Response to Intervention Project: Year 4 Evaluation Report compares formative evaluation data from the three years of Project-supported implementation in the demonstration sites to data from the year following the conclusion of Project-provided supports. This report also contains information on the Project's new initiatives. Specifically, information on the Project's goals (1) to build internal capacity to support districts and (2) to build district capacity to implement an MTSS is discussed in the context of the statewide implementation of an MTSS. Data from various sources are presented to provide formative information on the extent to which Project activities facilitated attainment of these two goals. Finally, potential explanations for the findings presented and possible implications for future Project activities are discussed.


Introduction

An effective public education system is fundamental to the United States' ability to make significant social and economic contributions in the global marketplace. Evidence of a national emphasis on reforming public education to prepare students to be competitive in the 21st century global economy can be found in recent federal legislation as well as policy proposals. The No Child Left Behind Act (NCLB) of 2002 was authorized by Congress to hold schools accountable for the educational outcomes of students. Blueprint for Educational Reform 2010: The Reauthorization of the Elementary and Secondary Education Act (the proposal for reauthorizing NCLB generated by the U.S. Department of Education) recommends assessment of student growth, blending funding from categorical programs to support access to evidence-based interventions, and meeting the needs of students with disabilities through the Elementary and Secondary Education Act (currently referred to as NCLB) as well as through the Individuals with Disabilities Education Improvement Act (IDEIA, 2004). IDEIA (2004) allows school districts to include student response to evidence-based interventions in their criteria for determining eligibility for services under the Specific Learning Disabilities (SLD) category. Importantly, schools must demonstrate student response to interventions implemented for a reasonable period of time through frequently administered assessments that directly assess educational standards/benchmarks (IDEIA Regulations, 2006).

One mechanism receiving attention across the nation for making the data-based decisions called for in these policy documents is the Problem Solving/Response to Intervention (PS/RtI) model. A recent survey of school districts sampled across the United States (Spectrum K12/CASE, 2011) indicated that 68% of responding districts reported implementing PS/RtI practices (i.e., reported that they have fully implemented or are in the process of district-wide implementation), compared to 40% in 2009.

The Problem Solving/Response to Intervention (PS/RtI) Model

A PS/RtI model uses assessment to facilitate the development and implementation of evidence-based interventions in the general education environment and to determine the extent to which students respond to the interventions through continuous progress monitoring (Batsche et al., 2005). The three components of the PS/RtI model are the use of a four-step problem-solving process (Bergan & Kratochwill, 1990), the provision of instruction and interventions through a three-tiered model of service delivery, and the use of data to make educational decisions. In response to federal mandates and national support for PS/RtI, the Florida Department of Education (FDOE) has created infrastructure to support the statewide implementation of PS/RtI, including the dissemination of the "Florida Department of Education Statewide Response to Instruction/Intervention (RtI) Implementation Plan" (a copy of the plan is available at http://www.florida-rti.org/floridaMTSS/history.htm). For more information on the PS/RtI model and Florida's focus on PS/RtI practices, see Castillo, Hines, Batsche, and Curtis (2011).

Florida's Problem Solving/Response to Intervention (PS/RtI) Project

To help facilitate and inform implementation of a PS/RtI model in the state, the FDOE created the Florida PS/RtI Project in 2006. The original mission of the Project, a collaborative effort between the FDOE and the University of South Florida, was to (1) provide training, technical assistance, and support across the state on the PS/RtI model, and (2) systematically evaluate the impact of PS/RtI implementation in a limited number of demonstration sites. The statewide training component of the Project was intended to provide school-based teams with the knowledge and skills needed to implement the PS/RtI model. The demonstration site component of the Project, on the other hand, was intended to provide a comprehensive evaluation of the impact of implementing the PS/RtI model on schools, educators, and students. From 2007 to 2010, the demonstration site component was implemented in 34 pilot schools in seven demonstration school districts across the state. Program evaluation efforts in those 34 pilot schools were aimed at informing the statewide scale-up of PS/RtI implementation in Florida. Pilot schools received 13 days of training across the three years as well as ongoing technical assistance and support from PS/RtI coaches and Project staff. Systematic data collection to evaluate PS/RtI implementation also occurred (see Castillo et al. [2011] for more information on the three years of professional development and support provided by the Project). Project staff have continued to collect data from the pilot schools focused on important implementation outcomes; however, the mission of the Project has evolved to focus on building Florida's school district capacity to implement a Multi-Tiered System of Supports (MTSS).

Changing Role of Florida's PS/RtI Project

MTSS is a term that has emerged in recent years to describe a comprehensive, integrated approach to addressing the academic, behavioral, and social-emotional needs of students. Although MTSS represents a change in terminology, Project staff consider MTSS and PS/RtI to be different terms for the same model of service delivery. The critical elements of multi-tiered instruction and intervention and the use of data-based problem solving to inform decision-making are the same. What has changed is the level of support for an MTSS at the national and state levels (see the FDOE's website for more information on the support for MTSS implementation in the State of Florida [FDOE MTSS]), requiring the Project's mission to evolve to continue to meet the needs of districts and schools.

Feedback from Project stakeholders at the demonstration districts and pilot schools suggested the need to assist district leadership in building their capacity to support schools in implementing an MTSS. Data collected from pilot schools suggested that a lack of district capacity to support schools was a barrier to schools reaching and sustaining full implementation. As the Project began planning to shift its focus to building district capacity during the 2010-11 school year, Project staff recognized the need to build internal capacity to support districts throughout the state. Because Project support had mainly been targeted at the school level, goals for the 2010-11 school year included building relationships with other projects in the state supporting MTSS implementation (e.g., Florida's PBS/RtI:B Project) and continuing to cultivate relationships with district leaders across the state. The Project also sought to hire additional staff and provide professional development and support to new and existing staff. In addition to these capacity building goals, the Project increased its emphasis on supporting the implementation of an MTSS in Florida schools and districts. During the 2010-11 school year, Project staff worked toward building the capacity of districts through the five initiatives listed below:

1. Statewide Professional Development and Support for MTSS Implementation

   a. One of the main goals of the Project is to provide professional development and support to schools and districts implementing an MTSS. While most Project staff deliver some professional development and support, this initiative is primarily provided by three Regional Coordinators (RCs) and three Regional Facilitators (RFs). The RCs and RFs provide training, technical assistance, and support to districts. Common activities engaged in to support MTSS implementation include Training of Trainers workshops, needs-based regional and district technical assistance and support, and piloting of the District Action Planning and Problem-Solving Process (DAPPS; described below). Professional development models and materials are also created to support MTSS implementation across the state.

2. Secondary School MTSS Implementation

   a. This initiative focuses on the implementation of data-based problem solving within an MTSS at the secondary level. Project staff who support the piloting of this initiative have been responsible for training and technical assistance efforts to increase implementation at the secondary level. Data-based problem solving, the development and delivery of instruction/intervention, and the use of Early Warning Systems at the district and secondary school levels are important aspects of this initiative.

3. Differentiated Accountability (DA) Support for MTSS

   a. The DA Model was adopted by the FDOE to improve student performance in the State's most struggling schools. Differentiated Accountability MTSS Specialists support the work of the State DA teams. The focus of the MTSS Specialists' work is to provide training, technical assistance, and evaluation assistance to ensure that data-based problem solving within a multi-tiered system is a critical component of school improvement efforts in targeted schools. MTSS Specialists provide internal professional development to their regional teams on MTSS and collaborate with other content specialists (e.g., reading, math) to improve the performance of schools in need of improvement. See the Florida DOE website (Florida Differentiated Accountability Model) for more information on the State's DA model.

4. Technology to Support Improved Outcomes for All Students

   a. In order to build the Project's capacity to support the use of technology to improve the outcomes of all students, Project leadership engaged in activities to integrate the Regional Technology Network into the Project. The Regional Technology Network team focuses on establishing and maintaining five regional technology support centers throughout the state. The team also focuses on ensuring that students with disabilities have the assistive technology and materials they need to support their learning and performance. The initiative will begin to focus on integrating technology and universal design for learning principles to support statewide MTSS implementation.

5. Integrated MTSS Model for Academic and Behavior Supports

   a. This initiative focused on developing and supporting the statewide implementation of an integrated MTSS model of academic and behavior supports in Florida. The primary goals of this initiative during the 2010-11 school year were to build the Project's internal capacity to support schools and districts in implementing an MTSS, develop partnerships with the Florida PBS/RtI:B Project to develop an integrated MTSS model, develop relationships with and gather information from stakeholders, and further develop a plan for the statewide implementation of an integrated MTSS.


Evaluation Design

Florida PS/RtI Project Demonstration Site Component

Prior to the first year of the Project demonstration component in 2007, Project staff developed a structured program evaluation model to evaluate PS/RtI implementation in the pilot schools. This evaluation model included both formative and summative evaluation approaches with the purpose of evaluating educators' beliefs, knowledge, and skills; implementation of MTSS activities and processes; and the relationship between PS/RtI implementation and student and systemic outcomes. The evaluation design also highlighted the importance of engaging in change systemically. Project staff adopted a three-stage change model to assist pilot schools in building consensus among staff, developing infrastructure for MTSS implementation, and implementing practices consistent with an MTSS. The continued efforts of pilot schools to systematically implement an MTSS following the conclusion of three years of professional development and support are the focus of current evaluation activities. For a more detailed review of the evaluation design for the demonstration site component, see the Florida PS/RtI Project Year 3 Evaluation Report (Castillo et al., 2011).

Florida PS/RtI Project Capacity Building Component

Purpose and Design

The overall evaluation design for the Project’s capacity building component includes a focus on evaluating progress on the following goals:

1) Increase the Project's internal capacity to assist districts in implementing an MTSS
2) Increase the capacity of school districts to implement an MTSS

The evaluation design included a review of process data, permanent products created by Project staff, and survey methodology. The design for the 2010-11 school year was developed post hoc; capacity building efforts related to the five initiatives described above also applied to the development of a comprehensive program evaluation model. In the absence of an a priori evaluation design, Project staff identified potential data sources to address the extent to which progress was made on the two aforementioned goals and made efforts to access and analyze the data/information. Given the limitations of this approach, findings related to internal and external stakeholder capacity should be interpreted with caution. Despite these limitations, Florida PS/RtI Project Leadership and Evaluators believe that data and information gathered retrospectively still provide useful information to Project staff and stakeholders regarding capacity building efforts during the 2010-11 school year.


Methods and Procedures

Project Goals and Activities

Ongoing Pilot Site Implementation Component. Previous research on MTSS and systems change informed the first three goals of the Project. During the first three years of Project supports, the goals were to increase consensus, infrastructure development, and implementation in the pilot schools through Project training, technical assistance, coaching, and support. These goals helped shape the development of training provided to pilot schools. Project staff (i.e., RCs and the Project Leader) delivered 13 full-day training sessions to School-Based Leadership Team (SBLT) members at the pilot schools. Training modules delivered to SBLT members focused on the conceptual and legislative/policy reasons to implement an MTSS, the aforementioned three-stage systems change model, and the knowledge and skills necessary to implement an MTSS. More information on the content of the training modules can be found in Appendix A and on the Project website (http://floridarti.usf.edu/resources/program_evaluation/index.html). In addition to training, RCs and MTSS Coaches provided ongoing technical assistance throughout the three years. Coaches provided the majority of technical assistance to SBLT members and staff at the pilot schools. Examples of support provided by coaches included additional trainings on MTSS content, ongoing support in data meetings, and assistance with planning for MTSS activities. To inform training and technical assistance, data were collected from the pilot and comparison schools and analyzed by Project staff. For more information on the support provided to pilot schools during the three-year professional development cycle, see Castillo et al. (2011).

After systematic professional development and supports ended, goals for the subsequent year shifted to evaluating pilot schools' continued activities and outcomes related to consensus building, infrastructure development, and MTSS implementation. In previous years (i.e., Years 1, 2, and 3), data were collected from both pilot and comparison schools. Following the three-year professional development sequence, six of the demonstration districts agreed to continue data collection in their pilot schools; data collection in comparison schools was discontinued due to resource limitations. Funding and data collection support from RCs and RFs were provided to participating pilot schools. Data collection continued in 27 of the pilot schools during the 2010-11 school year.

Specific evaluation goals for the ongoing pilot school data collection component were to:

1) Evaluate the relationship between the end of professional development and support provided by the Project and the following outcomes in pilot schools:
   i. Level of consensus among SBLT members and staff regarding implementation of an MTSS,
   ii. Infrastructure necessary to support implementation of an MTSS, and
   iii. Level of MTSS implementation

Project Capacity Building Component. Goals developed to attain the new aspects of the Project mission were (1) to increase internal capacity to support districts implementing an MTSS and (2) to increase district capacity to implement an MTSS. These goals informed the activities engaged in by Project staff throughout the 2010-11 school year. Examples of activities engaged in to build Project capacity include the hiring of additional personnel, the provision of professional development to Project staff, and collaboration with the Florida PBS/RtI:B Project. In order to assist districts in building their capacity for MTSS implementation, Project staff provided training and technical assistance to school- and district-level staff, collected needs assessment data from several school districts, and began to pilot the Educational Systems Review process and implementation of an MTSS at the secondary level. As a result of the shift in goals and activities, beginning to evaluate internal and external capacity building became a focus of Project evaluation activities.

Specific evaluation goals for the Project capacity building component were to:

2) Evaluate the extent to which the Project began to:
   i. Increase internal capacity to support districts in implementing an MTSS
   ii. Increase district capacity to implement an MTSS

Evaluation Goals and Questions

During the three-year demonstration site initiative, Project staff regularly engaged in formative evaluation of the extent to which Project processes and activities related to attainment of implementation goals. Formative analyses were used to stimulate discussion regarding goal attainment and modifications to Project activities to address identified needs. Previous reports focused primarily on the extent to which those goals were achieved. The purpose of this year's report is to provide information on the extent to which pilot schools continued to develop consensus, build infrastructure, and implement the critical components of an MTSS following the conclusion of Project-provided professional development and supports. Additionally, this report focuses on the extent to which Project staff worked to build internal capacity to support MTSS implementation as well as district capacity to support schools in implementing an MTSS. The following evaluation questions were asked to provide Project stakeholders with information on the status of MTSS implementation in pilot schools as well as the status of the Project's capacity building initiative. To facilitate interpretation of the data, questions one through four are organized around the Project's systems change model of consensus, infrastructure, and implementation. Question five addresses the Project's goal of building capacity for the statewide implementation of an MTSS.

The evaluation questions addressed in the report are as follows:

Consensus

1. To what extent did the conclusion of professional development and support provided by the Project relate to changes in:
   a. Beliefs consistent with an MTSS model?
   b. Consensus development?

Infrastructure

2. To what extent did the conclusion of professional development and support provided by the Project relate to changes in:
   a. The knowledge and skills required to implement an MTSS?
   b. Infrastructure development?

Implementation

3. To what extent did the conclusion of professional development and support provided by the Project relate to changes in:
   a. Establishment of a three-tiered instruction and intervention system?
   b. Implementation of problem-solving steps when addressing student needs at the Tier I and/or II levels?
   c. Implementation of problem-solving steps when addressing individual student needs?

4. To what extent did pilot schools continue to engage in data-based planning to facilitate implementation of an MTSS after direct Project support ended?

Increasing capacity for statewide implementation of an MTSS

5. To what extent did the five Project initiatives:
   a. Increase internal capacity to assist districts in implementing an MTSS?
   b. Assist districts in building their capacity to implement an MTSS?

To address the first four evaluation questions referenced above, data were gathered from SBLT members and instructional staff in 27 pilot schools from six demonstration districts. To address the last evaluation question, data were gathered from the Project Tracking System (PTS; a statewide accountability database used by discretionary projects in the state to enter information on their activities and the associated outcomes), surveys administered following Project-provided trainings, and reviews of permanent products created by Project staff members. What follows is a description of the instrumentation and procedures used to answer the evaluation questions.

Instrumentation and Administration Procedures

To answer the above evaluation questions, a variety of instruments and data sources were employed. The instruments and administration procedures described below were designed to assess components of consensus building, infrastructure development, MTSS implementation, and building capacity for the statewide implementation of an MTSS. Copies of each instrument described below are included in Appendix B. Copies of the instruments developed or adapted by the Project are also posted on the Project's website (see the PS/RtI Project Evaluation Tools page). See Castillo, Batsche, Curtis, Stockslager, March, and Minch (2010) for information on the technical characteristics of the instruments, including available reliability and validity data (see the PS/RtI Project Technical Assistance Manual). Please note that additional analyses were conducted on the technical characteristics of several surveys after the data collection described in this report. For more information on the revised surveys and Technical Assistance Manual, see Castillo et al. (2012) (see the PS/RtI Project Technical Assistance Manual-Revised).

Beliefs Survey. The Beliefs Survey was designed to assess educators’ beliefs regarding data-based decision-making, functions of instruction and intervention, and the capabilities and performance of students with high-incidence disabilities. To determine educator beliefs in these domains, respondents were asked to indicate their level of agreement with each belief statement (items 6-27) using a 5-point Likert-type response scale:

1 = Strongly Disagree
2 = Disagree
3 = Neutral
4 = Agree
5 = Strongly Agree


The survey was administered to both SBLT members and instructional staff in pilot schools at the beginning and end of Year 1 and at the end of Years 2, 3, and 4 to examine possible changes in beliefs over time. During the first three years, RCs administered the survey to SBLT members at SBLT trainings and MTSS Coaches administered the survey to instructional staff at the pilot schools. Administration of the instrument during staff and grade-level team meetings and dissemination via mailboxes were the primary ways that MTSS Coaches facilitated completion of the survey by instructional staff. During Year 4, SBLT members and instructional staff from five of the six demonstration districts completed the survey online via SurveyMonkey®. SBLT members and instructional staff from pilot schools in the sixth demonstration district completed hard copy versions of the surveys. The extent to which educators have agreed with the beliefs assessed by the instrument has been used by the Project as one data source to examine an important component of consensus among school staff.

Perceptions of RtI Skills Survey. The Perceptions of RtI Skills Survey was designed to assess educators’ perceptions of their skills in data-based problem solving (1) when applied to academic content, (2) when applied to behavior content, and (3) when applied to data manipulation and technology use. Respondents were asked to indicate perceptions of their skill levels using the following response scale:

1 = I do not have this skill at all (NS)
2 = I have minimal skills in this area; need substantial support to use it (MnS)
3 = I have this skill, but still need some support to use it (SS)
4 = I can use this skill with little support (HS)
5 = I am highly skilled in this area and could teach others this skill (VHS)

The survey was administered to both SBLT members and instructional staff in pilot schools at the beginning and end of Year 1 and the end of Years 2, 3, and 4 using the same procedures described above for the Beliefs Survey.

Project Tracking System (PTS). As a discretionary grant project funded by the FDOE Bureau of Exceptional Education and Student Services, Project staff are required to enter activities into the PTS. Project staff enter activities under the following Performance Categories: Training, Deliverable, and Service Delivery. Trainings include the implementation of a formal conference or event, presenting to an audience for the purpose of disseminating information, and providing in-depth instruction to recipients in order to support effective services. Deliverables include any tangible resources disseminated for the purpose of supporting data collection and analysis, providing information on effective services, and providing instructional support. Finally, service delivery events include directly working with students or adults, providing support and consulting with intermediaries (e.g., parents, families, teachers, schools) in order to assist them in providing quality services to students, and facilitating collaborative activities between several parties (e.g., statewide workgroups, local multidisciplinary teams). These data collected from Project staff are reported to the FDOE in quarterly reports.

Self-Assessment of Problem Solving Implementation. The Self-Assessment of Problem Solving Implementation (SAPSI) is a needs assessment and progress monitoring tool designed to inform implementation of an MTSS. More specifically, the SAPSI provides information on the extent to which a school is working toward consensus regarding implementing an MTSS, has the infrastructure in place to implement the model, and has implemented the critical components of an MTSS. The SAPSI contains items that require educators to report the extent to which specific activities in the above systems change domains are occurring using the following 4-point response scale:

0 = Not Started (N): The activity occurs less than 25% of the time
1 = In Progress (I): The activity occurs approximately 25% to 74% of the time
2 = Achieved (A): The activity occurs approximately 75% to 100% of the time
3 = Maintaining (M): The activity was rated as "Achieved" last time and continues to occur approximately 75% to 100% of the time

The SAPSI was completed by SBLT members at the beginning and end of Year 1 as well as at the middle and end of Years 2, 3, and 4. One SAPSI was completed per pilot school by the SBLT at each time point. MTSS Coaches facilitated a discussion among SBLT members regarding responses to each item until consensus on a response was reached, recorded the agreed-upon response, and submitted the final protocol to the Project.
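The SAPSI response scale above maps an activity's frequency, plus whether it was previously rated "Achieved," onto a 0-3 rating. The short sketch below is a literal encoding of that rubric for illustration only; in practice, SBLTs assigned ratings by consensus discussion, not by computation, and the function name is hypothetical.

    # Literal encoding of the SAPSI response scale, for illustration only.
    def sapsi_rating(percent_of_time: float, previously_achieved: bool) -> int:
        """Map an activity's frequency to the SAPSI 0-3 response scale."""
        if not 0 <= percent_of_time <= 100:
            raise ValueError("percent_of_time must be between 0 and 100.")
        if percent_of_time < 25:
            return 0  # Not Started (N)
        if percent_of_time < 75:
            return 1  # In Progress (I)
        # Occurs 75-100% of the time: Achieved, or Maintaining if the
        # activity was rated "Achieved" at the previous administration.
        return 3 if previously_achieved else 2

    print(sapsi_rating(80, previously_achieved=False))  # 2 (Achieved)
    print(sapsi_rating(80, previously_achieved=True))   # 3 (Maintaining)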

Tiers I & II Critical Components Checklist. The Tiers I & II Critical Components Checklist contained items that assessed the extent to which critical problem-solving steps were present when educators examined core (i.e., Tier I) and/or supplemental (i.e., Tier II) instruction. MTSS Coaches examined permanent products from meetings targeting Tier I and II instruction for evidence of the problem-solving steps. Common examples of permanent products used to complete the checklists included data printouts/graphs, meeting notes, and completed worksheets or forms used to record meeting outcomes. Data from the Tiers I & II Critical Components Checklists were collected three times during each of the four years examined in this report. This instrument also was completed three times per year for each of the three school years that preceded the start of Project-provided professional development and support in order to establish a baseline. Documentation was gathered from data meetings targeting Tier I and/or II instruction occurring from August through November (Window 1), December through March (Window 2), and April through July (Window 3). Permanent products were examined during these windows to align with expectations for universal screenings and for Tier I problem-solving meetings to occur at least three times per year (see Batsche et al., 2005). One checklist was completed for every content area and grade level targeted by the pilot schools within each of the windows (checklists were completed for comparison schools based on the pilot school targets). MTSS Coaches completed the checklists during the first three years of the Project by looking through the available documentation for evidence of components of problem solving and rating the extent to which each component was present using a standard rubric. During Year 4, Project staff (RCs and RFs) completed the checklists using the same procedures as in previous years. The standard rubric used by MTSS Coaches, RCs, and RFs employed the following scale:

0 = Absent
1 = Partially Present
2 = Present
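The report does not state how these 0-2 component ratings were aggregated into the implementation levels discussed in the findings. Purely as an illustration, the sketch below computes one plausible summary, the percentage of possible points earned on a checklist; the function and the aggregation rule are assumptions, not the Project's documented scoring procedure.

    # Illustration only: one plausible way to summarize a Critical Components
    # Checklist, where each component is rated 0 (Absent), 1 (Partially
    # Present), or 2 (Present). This aggregation rule is an assumption.
    def implementation_percent(component_ratings: list[int]) -> float:
        """Percent of possible points earned across all rated components."""
        if not component_ratings:
            raise ValueError("At least one component rating is required.")
        if any(r not in (0, 1, 2) for r in component_ratings):
            raise ValueError("Each rating must be 0, 1, or 2.")
        return 100 * sum(component_ratings) / (2 * len(component_ratings))

    # Example: a checklist rating eight problem-solving components.
    print(implementation_percent([2, 2, 1, 0, 1, 2, 2, 1]))  # 68.75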

Tier III Critical Components Checklist. The Tier III Critical Components Checklist contained items that assessed the extent to which critical problem-solving steps were present when educators examined individual student cases. MTSS Coaches examined permanent products from meetings targeting individual student progress for evidence of the problem-solving steps. Common examples of permanent products used to complete the checklists included data printouts/graphs, meeting notes, and completed worksheets or forms used to record meeting outcomes. Data from the Tier III Critical Components Checklists were collected on up to five individual student cases per year. This instrument was completed for cases that occurred during the 2007-08, 2008-09, 2009-10, and 2010-11 school years (i.e., Years 1-4). During Years 1-3, MTSS Coaches randomly selected cases from lists of students who had been discussed by the school team identified as responsible for addressing individual student needs. Coaches were asked to select three cases initiated before Winter Break and two cases initiated after Winter Break to facilitate a sample representative of cases that occurred throughout the year. Coaches completed one checklist for each individual student case by looking through the available documentation for evidence of components of problem solving and rating the extent to which each component was present using a standard rubric. During Year 4, Project staff (RCs and RFs) completed the checklists using the same procedures as in previous years. The standard rubric used by MTSS Coaches and Project staff employed the following scale:

0 = Absent
1 = Partially Present
2 = Present

Training Evaluation Survey. The Training Evaluation Survey was designed to assess educators' satisfaction with a training session delivered by Project staff. To determine educator satisfaction with various elements of a training session, respondents were asked to indicate their level of agreement with satisfaction statements (items 5-17) using a 5-point Likert-type response scale:

1 = Strongly Disagree
2 = Disagree
3 = Neutral
4 = Agree
5 = Strongly Agree

The survey was administered to educators who attended Training of Trainers sessions during the 2010-11 school year. The Training of Trainers sessions were designed to provide district personnel with the training curriculum used by the Project to facilitate professional development with the aforementioned pilot schools. The purpose of the training sessions was to provide a structured, three-year training model for MTSS implementation, which could then be used as the basis for an MTSS professional development model in participants’ districts. The Training Evaluation Survey was used to examine the recipients’ satisfaction with the quality of the Training of Trainers sessions.


Findings: Ongoing Pilot Site Implementation and Project Capacity Building Efforts

What follows is a discussion of data examining selected evaluation questions that address the extent to which the Project goals highlighted above were attained. First, information regarding the relationship between the removal of professional development and support and targeted systems change outcomes in pilot schools is provided. Second, information regarding the Project's capacity building activities to support statewide MTSS implementation is discussed.

Consensus

To what extent did the conclusion of professional development and support provided by the Project relate to changes in beliefs consistent with an MTSS? Project staff used data from the Beliefs Survey to answer this evaluation question. Specifically, responses from SBLT members and instructional staff at the 27 pilot schools were examined. Mean domain scores summarizing beliefs about (1) the academic capabilities and performance of students with disabilities, (2) data-based decision-making, and (3) the functions of core and supplemental instruction were calculated for each administration. Project staff graphed the data for both SBLT members and instructional staff to facilitate interpretation. See Figures 1a-1c below for data examining educator beliefs regarding students with disabilities, data-based decision-making, and functions of instruction, respectively.
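The domain scores graphed in Figures 1a-1c are means of 1-5 Likert ratings, computed separately for each respondent group, administration, and belief domain. The sketch below shows one way such means could be computed with pandas; the column names and sample data are hypothetical illustrations, not the Project's actual dataset or analysis code.

    # Minimal sketch of the mean domain score calculation, assuming a
    # long-format dataset. Column names and values are hypothetical.
    import pandas as pd

    # Each row: one respondent's 1-5 Likert rating on one Beliefs Survey item.
    responses = pd.DataFrame({
        "group":          ["SBLT", "SBLT", "Staff", "Staff"],
        "administration": ["Beginning of Year 1", "End of Year 4"] * 2,
        "domain":         ["data_based_decision_making"] * 4,
        "rating":         [4, 5, 3, 4],
    })

    # Average item ratings within each group, administration, and domain to
    # obtain the mean domain scores plotted in Figures 1a-1c.
    domain_means = (
        responses
        .groupby(["group", "administration", "domain"])["rating"]
        .mean()
        .reset_index(name="mean_domain_score")
    )
    print(domain_means)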


Figure 1a. Trends in School-Based Leadership Team (SBLT) and Pilot School Staff Beliefs About Students with Disabilities Achieving Academic Benchmarks.

Figure 1b. Trends in School-Based Leadership Team (SBLT) and Pilot School Staff Beliefs About Data-Based Decision-Making.


Figure 1c. Trends in School-Based Leadership Team (SBLT) and Pilot School Staff Beliefs About the Functions of Core and Supplemental Instruction.


Visual analysis of responses to the Beliefs Survey indicated that SBLT members and pilot school instructional staff continued to report beliefs about data-based decision-making and the functions of core and supplemental instruction that, on average, were consistent with the fundamental beliefs of an MTSS. However, both SBLT members and instructional staff continued to report lower beliefs, on average, about the academic ability and performance of students with disabilities.

Importantly, differences in levels of beliefs continued to be evident across the two groups.

SBLT members, on average, reported beliefs about data-based decision-making and functions of core and supplemental instruction that exceeded 4.0 (a score of 4.0 indicates agreement with the identified belief) across all five administrations. Overall, there was an increase in the reported beliefs of SBLT members from the beginning of Year 1 to the end of Year 3. However, there was a slight decrease in the average score for SBLT members during Year 4 for these two domains. While a slight decrease was observed, SBLT members, on average, still reported agreement with both domains at the end of Year 4. Beliefs within the two domains reported by instructional staff typically approximated 4.0, but remained consistently lower than beliefs reported by SBLT members.

As previously mentioned, both SBLT members and instructional staff continued to report beliefs reflecting lower expectations for the academic capabilities and performance of students with disabilities. Consistent with beliefs in the other two domains, differences were evident across the two groups. SBLT members’ mean domain score exceeded 3.0 (a score of 3.0 is equivalent to being neutral in terms of the identified belief) across administrations, with a small increase evident from the beginning of Year 1 to the end of Year 3 but a small decrease evident from the end of Year 3 to the end of Year 4. Pilot school instructional staff reported beliefs that typically approximated or slightly exceeded 3.0, with small overall increases from the beginning of Year 1 to the end of Year 4. As with the other domains, SBLT members reported, on average, higher levels of agreement regarding the academic capabilities and performance of students with disabilities than instructional staff at each time point. The mean domain scores of approximately 3.0 across administrations indicate that, on average, SBLT members and instructional staff continued to report being neutral in terms of their beliefs regarding the academic performance and capabilities of students with disabilities.

Overall, visual analysis of the Beliefs Survey data suggests that SBLT member belief levels were initially more consistent with the tenets of an MTSS than those of other instructional staff and remained more consistent throughout the three years of systematic professional development provided by the Project. However, the decreases in belief levels between Year 3 and Year 4, when Project-provided professional development ended, were more noticeable among SBLT members. The increases in belief levels among SBLT members from Year 1 to Year 3 and the decreases between Year 3 and Year 4 suggest a potential relationship between the professional development provided to SBLT members and beliefs consistent with an MTSS. In other words, the end of Project-provided professional development seemed to relate to decreases in the beliefs of SBLT members in all three domains. While these results cannot be considered causal, the decreases in SBLT member beliefs from Year 3 to Year 4 suggest that additional activities may be needed for staff to fully develop and sustain beliefs consistent with implementing an MTSS. See Tables 1a-1c and 2a-2c located in Appendix C for item-level data from the Beliefs Survey for SBLT members and instructional staff, respectively.

To what extent did the end of professional development and support provided by the Project relate to changes in consensus development? Project staff used data from the SAPSI, administered to pilot schools five times throughout the 4-year period, to address this evaluation question. Specifically, SBLT members’ responses to five items that assessed indicators of comprehensive commitment to and support of MTSS were analyzed. Project staff analyzed SBLT member responses in terms of the number of schools that reported not starting, being in progress, achieving, or maintaining an activity. Figure 2 below shows the levels of consensus-building activities reported by SBLT members across the 4-year period for 26 of the 27 pilot schools. Results for one pilot school were not included due to deviations from SAPSI administration procedures.
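The SAPSI analyses in this and later sections reduce to a tally of schools by reported status for each item. A minimal sketch of that tally, with hypothetical school data, might look as follows (Python is used for illustration only; the Project's actual analysis tools are not described in this report):

    STATUSES = ("Not Started", "In Progress", "Achieved", "Maintaining")

    def status_counts(school_statuses):
        """Count schools reporting each SAPSI status for one item."""
        return {s: school_statuses.count(s) for s in STATUSES}

    def percent_achieved_or_maintaining(school_statuses):
        """Percentage of schools reporting Achieved or Maintaining for one item."""
        hits = sum(1 for s in school_statuses if s in ("Achieved", "Maintaining"))
        return 100.0 * hits / len(school_statuses) if school_statuses else 0.0

    # Hypothetical statuses for one consensus-building item across five schools.
    item = ["Maintaining", "Achieved", "In Progress", "Achieved", "Not Started"]
    print(status_counts(item))
    print(f"{percent_achieved_or_maintaining(item):.0f}% achieved or maintaining")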


Figure 2. School-Based Leadership Team (SBLT) Self-Assessment of Problem Solving Implementation (SAPSI) Trends: Consensus Building Activities (BOY = Beginning of Year; EOY = End of Year; Y1 = Year 1; Y2 = Year 2; Y3 = Year 3; Y4 = Year 4).


Visual analysis of the data indicated continued increases for three of the five consensus-building activities across the 4-year period. At the beginning of Year 1, the percentage of schools that reported achieving or maintaining consensus-building activities ranged from 0% to slightly greater than 40%, depending on the item. At the end of Year 3, the percentage of schools reporting achieving or maintaining the same five activities ranged from approximately 60% to 100%, depending on the item. At the end of Year 4, there was a slight decrease in the percentage of schools reporting achieving or maintaining for two items (Items 4 and 5).

Overall, pilot school SBLTs continued to report increases in three consensus-building activities after Project professional development and support concluded, but reported decreases in two consensus-building activities. The continued increase, or maintenance, of several of the consensus-building activities reflected on the SAPSI is noteworthy. During the final year of systematic supports, Project staff worked to build the capacity of the SBLTs to continue MTSS implementation in subsequent years. Project staff provided time and support to SBLT members during training sessions to strategically plan for continued MTSS implementation. Results from the SAPSI indicate that the majority of pilot schools continued to report engaging in consensus-building activities regarding MTSS implementation during the 2010-11 school year.

Infrastructure

To what extent did the conclusion of professional development and support provided by the Project relate to changes in educators’ knowledge and skills required to implement an MTSS? Project staff used data from the Perceptions of RtI Skills Survey to answer this evaluation question. Specifically, responses from SBLT members and instructional staff at the 27 pilot schools were examined. Mean domain scores summarizing perceptions regarding (1) skills applied to academic content, (2) skills applied to behavior content, and (3) data manipulation and technology use skills were calculated using the items that comprise each domain. Project staff graphed the data for both SBLT members and instructional staff to facilitate interpretation. See Figures 3a-3c below for data on educators’ perceived skills when applied to academic content, behavior content, and data manipulation and technology use, respectively.


Figure 3a. Trends in School-Based Leadership Team (SBLT) and Pilot School Staff Perceptions of MTSS Skills Applied to Academic Content.


Figure 3b. Trends in School-Based Leadership Team (SBLT) and Pilot School Staff Perceptions of MTSS Skills Applied to Behavior Content.


Figure 3c. Trends in School-Based Leadership Team (SBLT) and Pilot School Staff Perceptions of Data Manipulation and Technology Use Skills.


Visual analysis of the data suggested continued differences in the levels of and trends in perceived skills across SBLT members, instructional staff, and domains. Educators in both groups consistently reported the highest level of perceived skills when applying MTSS skills to academic content; however, changes in perceived skills differed across the groups. At the beginning of Year 1, the mean perceived skill level was approximately 3.6 for SBLT members and 3.4 for instructional staff. These levels indicated that, on average, educators reported having some skills in applying MTSS concepts to academic content, but required support. Increases in perceived skill levels occurred for both groups across the 3-year period of Project support. At the end of the three years of Project support, the mean perceived skill level when applying MTSS skills to academic content was approximately 4.0 (i.e., "I can use this skill with little support") for SBLT members and 3.7 for instructional staff. However, by the end of Year 4, perceived skills in applying MTSS concepts to academic content had decreased for both SBLT members and pilot school instructional staff.

A similar pattern of perceived skills continued for the domains measuring application to behavior content and application to data manipulation and technology use, despite lower levels of perceived skill. Mean reported levels for MTSS skills applied to behavior content exceeded 3.0 at the beginning of Year 1 for both groups, and increases across the three years of Project support were evident (mean responses approximated 3.5 at the end of Year 3). Furthermore, SBLT members indicated similar levels of perceived skills from Year 3 to Year 4. However, instructional staff perceptions of their skills in applying MTSS concepts to behavior content decreased. Regarding data manipulation and technology use skills, mean skill levels slightly exceeded or fell below 3.0 for both groups at the beginning of Year 1. At the end of Year 3, discernible increases in mean reported skill levels were evident for both groups (SBLT members’ mean skills approximated 3.5, and instructional staff means now exceeded 3.0). However, perceived skills in data manipulation and technology use decreased from Year 3 to Year 4 for both SBLT members and instructional staff.

Overall, data from the Perceptions of RtI Skills Survey suggest a potential relationship between the level of systematic, intensive professional development provided and the perceived skill levels of educators (see Tables 3a-3c and 4a-4c located in Appendix C for item-level data on SBLT members’ and instructional staff’s perceptions of MTSS skills, respectively). During the three years of systematic professional development, SBLT members’ responses indicated increasing perceptions of MTSS skills across several domains. However, SBLT members’ responses during the follow-up year indicated, on average, decreases in perceived skills related to academic content and to data manipulation and technology use. These data seem to indicate that additional supports (e.g., additional training, technical assistance, and coaching) may be necessary for SBLT members and instructional staff to develop and sustain the skills needed to implement an MTSS.

Beyond the conclusion of Project-provided professional development, other hypotheses for the lower levels of perceived skills include turnover among leadership team members in pilot schools and a lack of district-provided professional development and support. Specifically, only approximately 54% of the SBLT members present on the first day of training during Year 1 remained at the end of Year 3. Additional analyses showed that an average of 19% of instructional school staff were new to their school each year, meaning that each year some staff had likely not been trained to implement an MTSS. These high turnover rates could have led to new staff and SBLT members missing training, resulting in lower perceived skills. Regarding district support, several stakeholders reported that inconsistent district policies and procedures, as well as limited district-level professional development, were barriers to training staff and implementing an MTSS in the pilot schools.


To what extent did the conclusion of professional development and support provided by the Project relate to changes in infrastructure development? Project staff used data from the SAPSI to address this evaluation question. Specifically, SBLT members responded to items that assessed indicators of infrastructure development designed to support MTSS implementation. Responses to these items were analyzed in terms of the number of schools that reported not starting, being in progress, achieving, or maintaining the activity. Figures 4a and 4b below contain reported levels of infrastructure activities from 26 of the 27 pilot schools across the 4-year period.


Figure 4a. School-Based Leadership Team (SBLT) Self-Assessment of Problem Solving Implementation (SAPSI) Trends: Infrastructure Development - Data Collection Activities (BOY = Beginning of Year; EOY = End of Year; Y1 = Year 1; Y2 = Year 2; Y3 = Year 3; Y4 = Year 4).


Figure 4b. School-Based Leadership Team (SBLT) Self-Assessment of Problem Solving Implementation (SAPSI) Trends: Infrastructure Development - Team Structure and Process (BOY = Beginning of Year; EOY = End of Year; Y1 = Year 1; Y2 = Year 2; Y3 = Year 3; Y4 = Year 4).


Visual analysis of the data suggested increases on all indicators of infrastructure development assessed by the SAPSI from Year 1 to Year 3. Notably, SBLT members reported continued increases for more than half of the indicators during Year 4, despite the end of professional development supports. At the beginning of Year 1, the percentage of schools reporting having achieved or maintained an infrastructure activity ranged from approximately 5% to slightly greater than 50%. By the end of Year 4, the percentage of schools that reported achieving or maintaining a given infrastructure activity exceeded 80% for all but three items. These data suggest that most pilot schools reported increases in their capacity to collect and use data to inform decisions regarding student need and RtI across tiers during the 4-year period examined. Additionally, the majority of SBLTs at pilot schools reported continuing to engage in activities to build the infrastructure necessary to implement an MTSS following the conclusion of Project-provided professional development. Schools reported engaging in activities such as collecting data to evaluate core instruction, identifying students who are at risk academically, establishing a process to identify evidence-based practices across tiers, and meeting regularly to evaluate student RtI.

While SBLT reports suggested increases in some infrastructure-building activities, there were decreases from Year 3 to Year 4 in the percentage of schools reporting achieved or maintained for a few indicators. Indicators of infrastructure development on which decreases occurred included collecting data, selecting evidence-based practices for Tiers I and II, and having an SBLT that meets regularly. However, it is important to note that these decreases typically represented less than 5% of schools.

Implementation

To what extent did the conclusion of professional development and support provided by the Project relate to changes in the establishment of a three-tiered instruction and intervention system? Project staff used data from the SAPSI to address this evaluation question. Specifically, SBLT members responded to items that assessed the extent to which schools clearly identified evidence-based academic and behavioral practices available across tiers. Responses to these items were analyzed in terms of the number of schools that reported not starting, being in progress, achieving, or maintaining the activity. Figure 5 below shows the reported levels of activities from 26 of the 27 pilot schools across the four-year period.


Figure 5. School-Based Leadership Team (SBLT) Self-Assessment of Problem Solving Implementation (SAPSI) Trends: Implementation - Three-Tiered Intervention System (BOY = Beginning of Year; EOY = End of Year; Y1 = Year 1; Y2 = Year 2; Y3 = Year 3; Y4 = Year 4).


Visual analysis of the data revealed continued increases in the extent to which schools reported clearly defining the evidence-based practices available across content areas and tiers. At the beginning of Year 1, the percentage of schools that reported achieving or maintaining activities in this area was below 50% for all items and below 25% for five of the six items. At the end of Year 4, the percentage of schools reporting clearly defined evidence-based practices for Tiers I, II, and III in academic content exceeded 80% for all items. The percentage of schools reporting achieving or maintaining the same activities for behavior content exceeded 80% for Tiers I and III and exceeded 60% for Tier II. Consistent with the SAPSI infrastructure items and other data sources, pilot schools typically reported higher levels of engagement in MTSS activities for academic content than for behavior content across all time points. Regardless of whether the indicator focused on academic or behavior content, increases in reported implementation of an MTSS continued following the conclusion of Project-provided professional development.

To what extent did the conclusion of professional development and support provided by the Project relate to changes in implementation of problem solving steps when addressing student needs at the Tier I and/or II levels? Project staff used the Tier I and II Critical Components Checklist as the primary data source to address this evaluation question. MTSS Coaches and Project staff (RCs and RFs) completed the Tier I and II Critical Components Checklist at pilot schools three times per year, to coincide with typical universal screening windows. Because the checklist involved applying a standard rubric (0 = Absent, 1 = Partially Present, 2 = Present) to rate evidence of problem solving reflected in permanent products from data meetings, MTSS Coaches could also complete the checklist using products from previous years. The three administrations within each year were combined to yield one mean score for each item to facilitate interpretation.
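A minimal sketch of the within-year aggregation just described (three rubric-scored administrations combined into one mean score per item) is shown below. The ratings are hypothetical, and Python is used purely for illustration.

    def yearly_item_means(administrations):
        """administrations: one list of 0/1/2 item ratings per screening window;
        returns the mean rating for each checklist item across the windows."""
        n_items = len(administrations[0])
        n_windows = len(administrations)
        return [
            round(sum(window[i] for window in administrations) / n_windows, 2)
            for i in range(n_items)
        ]

    # Hypothetical ratings for four checklist items at the fall, winter,
    # and spring screening windows of one school year.
    fall, winter, spring = [2, 1, 0, 1], [2, 2, 1, 1], [2, 2, 1, 0]
    print(yearly_item_means([fall, winter, spring]))  # [2.0, 1.67, 0.67, 0.67]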

Figure 6 shows the mean levels of implementation of the components of problem solving when addressing Tier I and II issues for pilot schools. Data are included from meetings during the 2004-05 through 2010-11 school years that addressed reading content only (2004-05 through 2006-07 data represent baseline years, 2007-08 through 2009-10 data represent the years during which the Project provided professional development, and 2010-11 data represent the year after systematic professional development ended). The decision to analyze only data from meetings that focused on reading performance was made because the vast majority of pilot schools chose to focus on reading, while only a small subset focused on math and/or behavior. The data represent checklists completed for 24 pilot schools because three pilot schools within one demonstration district did not target reading.


Figure 6. Average Pilot School Levels of Problem Solving Implementation Evident During Data Meetings Targeting Tier I and/or II Instruction in Reading: 2004-05 Through 2010-11 School Years.


Visual analysis of the data suggests that pilot schools demonstrated increases in levels of implementation across the 3-year period of systematic professional development, followed by decreases in implementation of all problem-solving components during the subsequent year. During the 2006-07 school year (the final baseline year before Project implementation), mean item values ranged from 0 to .74. Further examination of the problem analysis (Step 2) and intervention development and implementation (Step 3) steps suggests that pilot schools typically did not engage in these components during the 2006-07 school year. Mean values were slightly higher for some components of the problem identification and program evaluation/RtI steps, but the majority of the values were below .50.

From 2007-08 to 2009-10, increases in implementation were evident. At the end of Year 3, mean implementation levels ranged from .61 to 1.71 across problem-solving components, with values exceeding 1.0 for eight of the 15 items. However, when Project supports concluded following the 2009-10 school year, mean item values decreased for each of the 15 components. At the end of Year 4 (2010-11), pilot school mean implementation levels ranged from .13 to 1.25. To determine the magnitude of the decreases in implementation from 2009-10 to 2010-11, difference scores were calculated for each item. Decreases in the level of implementation in pilot schools ranged from -.16 to -.90.
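The difference-score calculation referenced here (and again for the Tier III checklist below) is a simple per-item subtraction of the Year 3 mean from the Year 4 mean. A minimal sketch with hypothetical item means follows; the values do not reproduce the Project's actual results.

    def difference_scores(year3_means, year4_means):
        """Per-item change in mean implementation: Year 4 minus Year 3."""
        return [round(y4 - y3, 2) for y3, y4 in zip(year3_means, year4_means)]

    # Hypothetical mean implementation levels for three checklist items.
    year3 = [1.71, 1.20, 0.61]   # 2009-10 means
    year4 = [1.25, 0.55, 0.13]   # 2010-11 means
    print(difference_scores(year3, year4))  # [-0.46, -0.65, -0.48]; negatives indicate decreases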

The increases observed during the three years of Project-provided professional development and the subsequent decreases in the year following the conclusion of those supports suggest a potential relationship between the professional development and support provided to schools and implementation of problem solving. Despite the increases observed during the first three years, it should be noted that full implementation of all steps of the problem-solving process was not achieved. At the end of Year 3, mean implementation levels for seven of the 15 components did not exceed 1.0 (equivalent to the component being partially present). Furthermore, the mean implementation level exceeded 1.5 for only two of the 15 components. These data, when considered alongside the decreases in mean implementation levels following the conclusion of Project-provided professional development, suggest that additional supports are necessary for schools to reach full implementation. Two additional hypothesized factors contributing to lower levels of implementation are a lack of sustained district professional development and support and high levels of turnover among school staff (these issues were discussed above in terms of their potential relationship to educator skills).

To what extent did the conclusion of professional development and support provided by the Project relate to changes in implementation of problem solving steps when addressing individual student needs? Project staff used the Tier III Critical Components Checklist as the primary data source to address this evaluation question. MTSS Coaches and Project staff completed the Tier III Critical Components Checklist at pilot schools on five randomly selected individual student-focused cases each year. In some instances, checklists were completed on fewer than five cases within a given year because too few students were referred to the identified team. The checklist involved applying a standard rubric (0 = Absent, 1 = Partially Present, 2 = Present) similar to the one used for the Tier I and II Critical Components Checklist. See Figure 7 below for the pilot schools’ (n = 27) mean levels of implementation of problem-solving components applied to individual student cases from the 2007-08 through 2010-11 school years. The data represent only those checklists completed on student cases in which reading was the primary concern. Due to other demands placed on MTSS Coaches by the Project and demonstration districts, MTSS Coaches collected data using the permanent product review protocol only during the three years of Project-provided professional development. Thus, baseline data were not gathered.


Figure 7. Pilot School Average Levels of Problem Solving Implementation Evident During Individual Student-Focused Data Meetings Addressing Reading Concerns: 2007-08 Through 2010-11 School Years.


Visual analysis of the data revealed increases in the levels of implementation from Year 1 to Year 3, followed by decreases from 2009-10 to 2010-11 for 14 of the 16 items. A small increase of .05 was evident for one item, which measured the extent to which progress monitoring data were collected and reported graphically. A slightly larger increase of .17 was evident for the second item, which measured the extent to which criteria for a positive RtI were agreed upon. To examine the magnitude of the decreases in implementation from Year 3 to Year 4, difference scores were calculated by subtracting the mean implementation level for each component during the 2009-10 school year from the mean implementation level for that component during the 2010-11 school year. The magnitude of the decreases for the 14 items ranged from -.08 to -.70.

Notably, SBLT-reported levels of implementation derived from SAPSI items examining use of the problem-solving process suggested higher levels of implementation than the other data sources. Figure 8 below includes data from items examining problem-solving components across 26 pilot schools. Increases in the components of problem solving were evident when comparing the percentage of schools reporting achieving or maintaining a given activity at the beginning of Year 1 (less than 20% of schools reported achieving or maintaining any problem-solving activity) to the end of Year 4 (greater than 80% of schools reported achieving or maintaining all problem-solving activities). One plausible explanation for the difference in implementation levels across data sources is that self-report data tend to be positively biased (see Noell & Gansle, 2006). Another possible explanation is that the permanent product review methodology from which the critical components checklists are derived relies on documentation of activities. It is conceivable that pilot schools implemented the steps of problem solving at higher levels than the available documentation suggests (i.e., the steps were implemented, but records were not maintained at the same level). However, it should be noted that direct observations of data meetings conducted during previous years resulted in implementation levels consistent with the critical components checklists.


Figure 8. School-Based Leadership Team (SBLT) Self-Assessment of Problem Solving Implementation (SAPSI) Trends: Implementation - Problem-Solving Process (BOY = Beginning of Year; EOY = End of Year; Y1 = Year 1; Y2 = Year 2; Y3 = Year 3; Y4 = Year 4).


To what extent did pilot schools continue to engage in data-based planning to facilitate implementation of an MTSS after the professional development and support provided by the Project concluded? Project staff used data from the SAPSI administered to pilot schools to address this evaluation question. Specifically, SBLT members responded to items that assessed the extent to which schools had engaged in strategic planning and decision-making activities intended to increase levels of MTSS implementation. Responses to these items were analyzed in terms of the number of schools that reported not starting, being in progress, achieving, or maintaining the activity. Figure 9 below contains the school-reported levels of strategic planning and decision-making activities from 26 pilot schools.


Figure 9. School-Based Leadership Team (SBLT) Self-Assessment of Problem Solving Implementation (SAPSI) Trends: Implementation - Monitoring and Action Planning (BOY = Beginning of Year; EOY = End of Year; Y1 = Year 1; Y2 = Year 2; Y3 = Year 3; Y4 = Year 4).


Visual analysis of the data revealed increases in the percentage of schools engaging in strategic planning and decision-making across the 4-year period. At the beginning of Year 1, less than 15% of pilot schools reported achieving or maintaining any of the activities examined. At the end of Year 3, the percentage of schools that reported achieving or maintaining strategic planning and decision-making activities ranged from approximately 50% to 95%, depending on the item. After Project support concluded following Year 3, continued increases in the percentage of SBLTs that reported engaging in strategic planning and decision-making activities were evident. Specifically, the percentage of schools that reported achieving or maintaining activities increased for four of the five items. By the end of Year 4, three items had 80% or more of schools reporting achieved or maintained activities. Those three items involved using a strategic MTSS implementation plan to guide efforts, SBLT members meeting at least twice per year to review implementation and student outcome data to inform decision-making, and providing feedback to staff on the progress of the school’s MTSS initiative at least once per year. Consistent with the data from the item examining district commitment and support discussed above, fewer schools reported consistently meeting with district leadership to review implementation issues and student outcomes and using those data to make changes to the school’s implementation plan.

Increasing capacity for statewide implementation of an MTSS

As mentioned in the Introduction, the Project also worked to increase the capacity for statewide implementation of an MTSS during the 2010-11 school year. The Project’s two main goals were to increase the Project’s capacity to support districts in implementing an MTSS and to build district capacity for MTSS implementation. Activities that Project leadership and staff engaged in to meet these two goals are described below.

To what extent did the Project increase internal capacity to support districts in implementing an MTSS?

Capacity Development for All Staff. As the Project mission evolved, Project staff engaged in several activities to increase their capacity to support school districts. In collaboration with the Florida PBS/RtI:B Project, workgroups were created to develop service delivery models and generic professional development materials to support the training and technical assistance provided to districts (these workgroups are described in greater detail below). Using the models and materials developed by the workgroups, several days of professional development and support were provided to Project staff members. The professional development activities focused on several topics that coalesced around processes and procedures for collaborating with school districts and State partners. The professional development was provided to all PS/RtI and PBS/RtI:B Project staff and was delivered by Project leadership and content experts within the Projects’ staff. Overall, the professional development was intended to help staff build relationships with one another, come to consensus on a common way of work, and develop goals and expectations for the upcoming year.

Statewide Professional Development and Support for MTSS Implementation. For the past several years, RCs have provided professional development opportunities to various stakeholders throughout the state. During the 2010-11 school year, Project leadership hired additional staff to support the shift in the Project mission toward building the capacity of districts to implement an MTSS. Three RFs were hired during the winter of the 2010-11 school year. The RFs worked to coordinate activities and collaborate with their respective RCs and began to support the Project’s training, implementation, and data collection activities.


In addition to acclimating new staff to the Project, RCs and RFs piloted a district needs assessment process for MTSS implementation. RCs and RFs met with District Leadership Teams (DLTs) to collect data on how DLTs evaluated student progress, the evaluation models DLTs employed, and the support districts needed to implement an MTSS in their schools. A protocol for engaging in the needs assessment process was developed and used as the basis for the needs assessment component of a district action planning and problem-solving process collaboratively developed by members of the Florida PS/RtI and PBS/RtI:B Projects.

Secondary School MTSS Implementation. While the Project focused primarily on elementary school implementation during its initial years, feedback from stakeholders often indicated that support was needed to scale up implementation to middle and high schools. To assist schools and districts in implementing at the secondary level, the Project first needed to build its own internal capacity for secondary implementation. Prior to the 2010-11 school year, the Project hired a new staff member to coordinate a secondary initiative. In addition to the Secondary Coordinator, two Learning and Development Facilitators (Reading and Math) were hired during the Fall of the 2010-11 school year to assist in scaling up implementation to the secondary level. The team was constructed to include content expertise in academic subject areas as well as in an MTSS. The team worked with Project leadership during the 2010-11 school year to develop and deliver MTSS trainings for secondary settings across the state.

Differentiated Accountability Support for MTSS. The Differentiated Accountability (DA) Model was adopted by the Florida DOE in 2008 to improve student outcomes in schools in need of improvement. The DA Model created five Regional Teams to collect and monitor data, deliver professional development, and provide direct support to staff in targeted schools. Each of the five Regional Teams included MTSS Specialists who are employed by the Project but work primarily with their DA Regional Teams. Because the majority of the DA MTSS Specialists’ roles, responsibilities, and activities are governed by the DA initiative, information on their activities is typically more limited than for other Project staff. Nevertheless, the DA Specialists engaged in the following activities during the 2010-11 school year to build their capacity to support schools and districts in implementing an MTSS. First, one additional DA Regional MTSS Specialist was hired during the 2010-11 school year to assist the DA Region 4 team in supporting schools and districts within that DA Region. Second, all DA Regional MTSS Specialists attended the Project-level professional development meetings described above. For more information on the Florida Differentiated Accountability Model, see the FDOE website.

Technology to Support Improved Outcomes for All Students. During the latter part of the 2010-11 school year, the Regional Technology Network (RTN) was incorporated into the Florida PS/RtI Project. The RTN consists of a Technology Project Coordinator, a Regional Loan Library Specialist, and five Regional Technology Coordinators. RTN staff began to attend both internal and external professional development events to familiarize themselves with the Project’s future mission to assist districts in implementing an MTSS.

Integrated Model for Academic and Behavior Supports. In order to increase internal capacity to assist districts in implementing an MTSS, the Project began collaborating more actively with Florida’s PBS/RtI:B Project during the 2010-11 school year. Despite the similarity in their approaches to meeting the needs of students through an MTSS, the two projects had been operating separately in terms of service delivery. In collaboration with the FDOE, the two Projects engaged in several activities during the school year to build internal capacity for a more integrated academic and behavior MTSS model. First, leadership from both the PS/RtI and PBS/RtI:B Projects held initial meetings to discuss what the inter-project collaboration would entail. Subsequently, the Inter-Project Leadership Team (ILT), a team of representatives from the leadership of both projects, met throughout the 2010-11 school year to create a vision and mission for the Inter-Project initiative, develop norms and expectations for a way of work, and develop common goals.

In addition to establishing the ILT, six workgroups were created: Coaching, Data-Based Problem Solving, Leadership, Program Evaluation, K-12 Alignment, and Family and Community Engagement. These workgroups were established based on feedback from a district needs assessment survey administered during the Spring of 2010, in which district-level staff requested support in these areas. During the 2010-11 school year, the six workgroups established norms and expectations among team members, developed models for integrated service delivery within their domains of interest, and began developing generic professional development materials. The presentations and materials that each workgroup developed and presented at the 2011 MTSS Leadership Institute are available on the Leadership Institute website (2011 MTSS Leadership Institute Presentations).

In addition to the workgroup products highlighted above, inter-project collaboration also produced the District Action Planning and Problem Solving (DAPPS) process. The process was created to provide a systematic way of work for the Projects’ staff to use when working with district leadership teams to build their capacity to support MTSS implementation. The process comprised a formal initiation of services with an interested school district, a comprehensive needs assessment, and a structured planning and problem-solving process intended to assist districts in achieving goals identified through the needs assessment. The professional development provided to all staff included a focus on the DAPPS process and the collaborative problem-solving skills needed to implement it.

To what extent did the Project engage in activities to build district capacity for MTSS implementation?

Statewide Professional Development and Support for MTSS Implementation. During the 2010-11 school year, Regional Coordinators (RCs) and Regional Facilitators (RFs) engaged in similar activities to support the statewide implementation of an MTSS. One of the main professional development activities RCs and RFs engaged in was the Training of Trainers Regional Meetings. At these meetings, RCs and RFs trained attendees to deliver systematic and structured professional development to their schools and districts using the same scope and sequence delivered to pilot school SBLTs during the three-year demonstration site initiative. RCs and RFs delivered five training sessions during the Fall of 2010 and four follow-up technical assistance sessions during the Spring of 2011. Overall, 329 individuals representing 53 of Florida’s 67 school districts attended the trainings during the Fall, and 102 individuals representing 20 school districts attended the technical assistance sessions during the Spring. Training Evaluation Survey data collected from attendees indicate a high level of satisfaction with the professional development provided by the RCs and RFs. Specifically, 89% of attendees at the fall trainings and almost 70% of attendees at the spring technical assistance sessions reported that they were “Satisfied” or “Very Satisfied” with the quality of the training they received. When asked about the content that was provided, 96% of attendees at the fall trainings and 74% of attendees at the technical assistance sessions “Agreed” or “Strongly Agreed” that the content presented addressed the needs of their school or district. See the Florida PS/RtI Project website for the complete, three-year MTSS training module presented at the Training of Trainers sessions (Training of Trainers Modules).

In addition to the Training of Trainers Regional Meetings, RCs and RFs reported almost 300 service delivery events. These events included consultation and assessment-related activities at the state, regional, and district levels. The vast majority of services were delivered at the district level (238 service delivery events), but services were also delivered to school sites (34) and to regional (8), state (14), and national (1) audiences. RCs and RFs provided services to stakeholders on topics including assessment and evaluation, instructional strategies, personnel preparation, and ESE policy and procedures. Consistent with one of the Project’s 2010-11 goals, the RCs and RFs frequently collaborated with staff from Florida’s PBS/RtI:B Project to deliver services.

Secondary School MTSS Implementation. Project staff developed trainings geared toward assisting schools in building their capacity to implement an MTSS at the secondary level. In the Spring of 2011, the Secondary Team delivered five training sessions throughout the state. Overall, 288 individuals representing 48 of Florida’s 67 school districts attended these trainings. Training Evaluation Survey data were not available to evaluate these trainings during the 2010-11 school year.

The Secondary Team also worked with schools and districts to support the Educational Systems Review process. Over the course of the school year, Secondary Team members reported 94 service delivery events. Services were provided to schools (38 service delivery events) and districts (17), as well as to regional (4), state (28), and national (1) audiences. The content of the services delivered focused on several educational topics, including accessing instructional materials, assessment and evaluation, curriculum, instructional strategies, personnel preparation, and secondary transition. Finally, it is important to note that the Secondary Team collaborated with staff from Florida’s PBS/RtI:B Project during 22 of these service delivery events.

Finally, the Secondary Team created instructional and informational deliverables to support schools and districts in implementing an MTSS at the secondary level. Approximately 20 deliverables were created. The majority of these deliverables included information on assessment and evaluation, curriculum, instructional strategies, and personnel preparation, and involved collaboration with other state-level projects.

Differentiated Accountability Support for MTSS. During the 2010-11 school year, the five DA Regional MTSS Specialists engaged in over 600 service delivery events in the form of Instructional Reviews, site visits/technical assistance, and coaching/training. The vast majority of the services were provided to school sites (75%), while DA Regional MTSS Specialists also worked with districts (12%), regional teams (7%), and state teams (6%).

Technology to Support Improved Outcomes for All Students. The RTN was incorporated into the Project during the Summer of 2011. As a result, information on the activities that the RTN engaged in to assist districts in building their capacity to implement an MTSS was not available.

Integrated Model for Academic and Behavior Supports. During the 2010-11 school year, the two Projects began working to increase the capacity of schools and districts to implement an MTSS by organizing the MTSS Summer Leadership Institute in June of 2011. The Institute, attended by 214 individuals representing 48 districts, provided stakeholders with information regarding MTSS.


Training Evaluation Survey results suggest that attendees were satisfied with the quality of information and training provided at the Summer Institute. Specifically, 93% of attendees reported that they were “Satisfied” or “Very Satisfied” with the quality of the information received. Additionally, when asked about the content of the Summer Institute, 91% of attendees reported that they “Agreed” or “Strongly Agreed” that the content addressed the needs of their school or district.


Summary of Findings

Visual and descriptive analyses of the data collected during the three years of professional development and support provided to pilot schools and the one year of follow-up data collection were reported as part of this evaluation. Self-report data from SBLT members and instructional staff at pilot schools (e.g., needs assessments, surveys) and permanent product reviews (e.g., Tier I and II Critical Components Checklist) suggested that the increases in consensus, infrastructure development, and implementation of PS/RtI observed during the three years of Project support did not continue for many indicators. In fact, decreases in educators’ beliefs and perceived skills and in PS/RtI implementation were evident. These data suggest that a relationship between systematic professional development and important educational outcomes may exist. Pilot schools may have needed more than three years of support to build sustainable implementation levels; it has been suggested that full implementation can take 4 to 6 years (Batsche et al., 2005). Other potential explanations, such as staff turnover and district expectations and support, also should be examined.

The second component of this report provided stakeholders with information regarding the Project’s evolving mission to build district capacity to implement an MTSS. Project staff provided training throughout the state, created instructional and informational deliverables, and provided other services to stakeholders at the school, district, state, and national levels. Available data on these services suggest that stakeholders were typically satisfied. Additionally, the Project hired several new staff members during the year and provided professional development to all staff to increase capacity to provide these services. Finally, the Florida PS/RtI Project began collaborating with the Florida PBS/RtI:B Project to assist districts in implementing an integrated academic and behavior MTSS model.


References

Batsche, G.M., Elliot, J., Graden, J.L., Grimes, J., Kovaleski, J.F., Prasse, D., Reschly, D.J., Schrag, J., & Tilly, W.D. (2005). Response to intervention: Policy considerations and implementation. Alexandria, VA: National Association of State Directors of Special Education, Inc.

Bergan, J.R., & Kratochwill, T.R. (1990). Behavioral consultation and therapy. New York: Plenum.

Castillo, J.M., Batsche, G.M., Curtis, M.J., Stockslager, K., March, A., & Minch, D. (2010). Problem Solving/Response to Intervention: Evaluation tool technical assistance manual. Retrieved from http://floridarti.usf.edu/resources/program_evaluation/ta_manual/index.html

Castillo, J.M., Batsche, G.M., Curtis, M.J., Stockslager, K., March, A., & Minch, D. (2012). Problem Solving/Response to Intervention: Evaluation tool technical assistance manual - Revised. Retrieved from http://floridarti.usf.edu/resources/program_evaluation/ta_manual_revised2012/index.html

Castillo, J.M., Hines, C.V., Batsche, G.M., & Curtis, M.J. (2011). The Florida Problem Solving/Response to Intervention Project: Year 3 evaluation report. University of South Florida, Florida Problem Solving/Response to Intervention Project. Retrieved from http://floridarti.usf.edu/resources/program_evaluation/index.html

Individuals with Disabilities Education Improvement Act, U.S.C. H.R. 1350 (2004).

Individuals With Disabilities Education Act regulations, 34 C.F.R. Part 300 (2006).

No Child Left Behind Act, U.S.C. 115 STAT. 1426 (2002).

Noell, G.H., & Gansle, K.A. (2006). Assuring the form has substance: Treatment plan implementation as the foundation of assessing response to intervention. Assessment for Effective Intervention, 32(1), 32-39.

Spectrum K12 School Solutions (2010). Response to Intervention Adoption Survey 2010. Retrieved from the Spectrum K12 School Solutions website: http://www.spectrumk12.com//uploads/file/Collateral/2010RTIAdoptionSurveyReport-SpectrumK12.pdf

United States Department of Education, Office of Planning, Evaluation and Policy Development (2010). A blueprint for reform: The reauthorization of the Elementary and Secondary Education Act. Washington, D.C.


Appendix A - Problem Solving - Response to Instruction/Intervention Training Outline

Day 1

Year One

Curriculum
Change Model - Consensus, Infrastructure, Implementation
Big Ideas of Problem Solving
Four Problem Solving Steps – Overview
Problem Identification
Problem Analysis
Intervention Design/Implementation
Response to Instruction/Interventions
Three-Tiered Model of Service Delivery
Law – NCLB, IDEA, Florida Rule/Statute
Formation, Function and Purpose of Problem Solving Teams

Data Collection
Beliefs Survey
Perception of Practices
School Personnel Satisfaction

Year Two

Curriculum
Review of Year 1 Training
Consensus
Focus on Tier One
Four Problem Solving Steps
State RtI Plan
National RtI Data
Review Data from Year One: SAPSI Data, Survey Data, Skill Assessment Data
Strategies for Consensus
Roles for Team Members

Data Collection
Perception of Practices
School Personnel Satisfaction
Skill Assessment
Training Evaluation

Year Three

Curriculum
Problem Solving Case Study Example
Tier Three Problem Identification
T1, T2, T3 Data Sources
Linking the Tiers in Context
Using Tier Two Data to Determine Effectiveness of Tier Two and Appropriateness of Tier Three Intervention
T3 Problem Analysis
Hypothesis Generation, Validation, Prediction Statements
Worksheet - Problem Identification, Problem Analysis
School Blueprint - Consensus

Data Collection
Skill Assessment
Training Evaluation

Days 1 & 2 back to back (Year One)
Technical Assistance Session(s) (Year Two)
Technical Assistance Session(s) (Year Three)

Appendix A – Problem Solving - Response to Instruction/Intervention Training Outline 1


Day 2

Year One
Curriculum:
- Step I – Problem Identification
- Tier One Data Sources: Academic, Behavioral
- Replacement Behaviors
- Current Performance
- Benchmark Performance
- Peer Performance
- Gap Analysis
Data Collection:
- Perception of Skills
- Beliefs Survey
- Skill Assessment
- Training Evaluation

Year Two
Curriculum:
- Data Feedback Activity
- Examples: Tier 1 Data Indicating Tier 2 Needs
- Tier 2 Defined & Characterized
- Standard Treatment Protocol
- Strategies for Identifying Tier 2/Standard Protocol Needs
- Tier 2 and the K-12 Reading Plan
- Decision Making at Tier 2
Data Collection:
- Skill Assessment
- Training Evaluation

Year Three
Curriculum:
- Case Study Review
- Review Y3D1 Content
- Skill Assessment Performance Review
- Integrated Tier One, Tier Two, Tier Three Scheduling with examples
- Review of Master Schedule & Resource Maps
- Tier Three Intervention Development
- Characteristics of Tier Three Interventions
- Intervention Support
- Comprehensive Intervention Plan Tier Three: Components 1 & 2
- Green Book Examples/References
- Worksheet - Intervention Development
- School Blueprint – Infrastructure
- Collect School Blueprint – Consensus
Data Collection:
- Skill Assessment
- Training Evaluation

Technical Assistance Session(s) (Years One, Two, and Three)


Day 3

Year One
Curriculum:
- Step II – Problem Analysis
- Data Feedback Activity
- Review: Problem Identification
- Big Ideas/Concepts of Problem Analysis
- Hypothesis/Prediction Statement
- Assessment & Hypothesis Validation
- Examples of Hypothesis Generation and Evaluation
Data Collection:
- Skill Assessment
- Training Evaluation

Year Two
Curriculum:
- Data Feedback Activity
- Intervention Evaluation Protocol
- Resource Maps
- Intervention Evaluation Plan
- Goal Setting
- Resource Mapping Activity
- Intervention Integrity: Types, Barriers, Improving, Assessing
Data Collection:
- Skill Assessment
- Training Evaluation

Year Three
Curriculum:
- Case Study Review
- Review Y3D2 Content
- Skill Assessment Performance Review
- Tier Three Intervention Design
- Intervention Integrity Documentation
- Examination of integrity measures currently used to assess Tier Three
- Tier Three RtI Progress Monitoring: Arrangements (frequency, data source, who, etc.); Content specific measures
- Decision Rules: Actions when RtI is Positive, Questionable, Poor; Movement among Tiers relative to student need
- Complete Comprehensive Intervention Plan with supporting Resource Map & Schedule
- SLD TAP
- School Blueprint - Implementation
- Collect School Blueprint – Infrastructure
Data Collection:
- School Personnel Satisfaction Survey
- Perceptions of Practices
- Skill Assessment
- Training Evaluation


Technical Assistance Session(s) (Years One, Two, and Three)

Day 4

Year One
Curriculum:
- Step III – Intervention Design and Implementation
- Data Feedback Activity
- Review: Consensus, Infrastructure, Implementation
- Linking Problem Analysis to Intervention
- Intervention Design
- Intervention Content
- Intervention Plan
- Intervention Integrity, Support, Documentation
- Integrating Tiers of Intervention
Data Collection:
- Skill Assessment
- Training Evaluation

Year Two
Curriculum:
- Review Foundational Concepts
- Data Feedback Activity
- Small Group Planning/Problem Solving
- Goal Setting and Planning
Data Collection:
- Beliefs Survey
- Perception of Skills
- Skill Assessment
- Training Evaluation

Year Three
Curriculum:
- Review Y3D3 Content
- Skill Assessment Performance Review
- Case Study – Eligibility decisions
- SLD Eligibility
- Collect School Blueprint - Implementation
Data Collection:
- Beliefs Survey
- Perception of Skills
- Skill Assessment
- Training Evaluation
Technical Assistance Session(s)

Day 5 (Year One)

Curriculum:
- Step IV – Response to Intervention
- Rationale for Progress Monitoring
- Graphing
- Goal Setting
- Interpreting Graphs
- Decision Making: Positive Response to Instruction/Intervention; Questionable Response to Instruction/Intervention; Poor Response to Instruction/Intervention
- Review of Problem-Solving Steps
Data Collection:
- Beliefs Survey
- Perception of Skills
- Skill Assessment
- Training Evaluation
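The Day 5 progress-monitoring topics above (graphing, goal setting, interpreting graphs, and the positive/questionable/poor decision) lend themselves to a worked example. Below is a minimal Python sketch, not part of the Project's training materials: it compares the slope of an aimline drawn from baseline to goal against a least-squares trendline through weekly scores. The 0.8 cutoff separating a questionable from a poor response is an illustrative assumption, not a Project decision rule.

```python
# Minimal sketch (not Project training code) of Day 5 decision making:
# compare the aimline slope (baseline to goal) with the trendline slope
# (least-squares fit through weekly scores). The 0.8 cutoff between a
# "questionable" and "poor" response is an illustrative assumption.

def trend_slope(scores):
    """Ordinary least-squares slope of scores over equally spaced sessions."""
    n = len(scores)
    mean_x = (n - 1) / 2
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(scores))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

def classify_response(scores, baseline, goal, weeks):
    """Label RtI as positive, questionable, or poor."""
    aim = (goal - baseline) / weeks   # growth per week the aimline requires
    trend = trend_slope(scores)       # growth per week actually observed
    if trend >= aim:
        return "positive"
    return "questionable" if trend >= 0.8 * aim else "poor"

weekly_wcpm = [32, 35, 34, 38, 41, 43]  # hypothetical words correct per minute
print(classify_response(weekly_wcpm, baseline=32, goal=60, weeks=12))  # questionable
```

Because both lines are reduced to slopes, the same comparison works whether the graph tracks reading fluency, math probes, or counts of behavior incidents.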


Appendix B – Copies of Evaluation Instruments

Beliefs Survey

1. Your PS/RtI Project ID: Your PS/RtI Project ID was designed to assure confidentiality while also providing a method to match an individual's responses across instruments. In the space provided (first row), please write in the last four digits of your Social Security Number and the last two digits of the year you were born. Then, shade in the corresponding circles.

Directions: For items 2-5 below, please shade in the circle next to the response option that best represents your answer.

2. Job Description: PS/RtI Coach; Teacher-General Education; Teacher-Special Education; School Counselor; School Psychologist; School Social Worker; Principal; Assistant Principal; Other (Please specify)

3. Years of Experience in Education: Less than 1 year; 1-4 years; 5-9 years; 10-14 years; 15-19 years; 20-24 years; 25 or more years; Not applicable

4. Number of Years in your Current Position: Less than 1 year; 1-4 years; 5-9 years; 10-14 years; 15-19 years; 20 or more years

5. Highest Degree Earned: B.A./B.S.; M.A./M.S.; Ed.S.; Ph.D./Ed.D.; Other (Please specify)


Directions: Using the scale below, please indicate your level of agreement or disagreement with each of the following statements by shading in the circle that best represents your response.

Scale: SD = Strongly Disagree; D = Disagree; N = Neutral; A = Agree; SA = Strongly Agree

6. I believe in the philosophy of No Child Left Behind (NCLB) even if I disagree with some of the requirements.

7. Core instruction should be effective enough to result in 80% of the students achieving benchmarks in

7.a. reading

7.b. math

8. The primary function of supplemental instruction is to ensure that students meet grade-level benchmarks in

8.a. reading

8.b. math

9. The majority of students with learning disabilities achieve grade-level benchmarks in

9.a. reading

9.b. math

10. The majority of students with behavioral problems (EH/SED or EBD) achieve grade-level benchmarks in

10.a. reading

10.b. math

11. Students with high-incidence disabilities (e.g., SLD, EBD) who are receiving special education services are capable of achieving grade-level benchmarks (i.e., general education standards) in

11.a. reading

11.b. math

12. General education classroom teachers should implement more differentiated and flexible instructional practices to address the needs of a more diverse student body.


13. General education classroom teachers would be able to implement more differentiated and flexible interventions if they had additional staff support.

14. The use of additional interventions in the general education classroom would result in success for more students.

15. Prevention activities and early intervention strategies in schools would result in fewer referrals to problem-solving teams and placements in special education.

16. The “severity” of a student’s academic problem is determined not by how far behind the student is in terms of his/her academic performance but by how quickly the student responds to intervention.

17. The “severity” of a student’s behavioral problem is determined not by how inappropriate a student is in terms of his/her behavioral performance but by how quickly the student responds to intervention.

18. The results of IQ and achievement testing can be used to identify effective interventions for students with learning and behavior problems.

19. Many students currently identified as "LD" do not have a disability; rather, they came to school "not ready" to learn or fell too far behind academically for the available interventions to close the gap sufficiently.

20. Using student-based data to determine intervention effectiveness is more accurate than using only “teacher judgment.”

21. Evaluating a student’s response to interventions is a more effective way of determining what a student is capable of achieving than using scores from “tests” (e.g., IQ/Achievement test).

22. Additional time and resources should be allocated first to students who are not reaching benchmarks (i.e., general education standards) before significant time and resources are directed to students who are at or above benchmarks.

23. Graphing student data makes it easier for one to make decisions about student performance and needed interventions.

24. A student’s parents (guardian) should be involved in the problem-solving process as soon as a teacher has a concern about the student.

25. Students respond better to interventions when their parent (guardian) is involved in the development and implementation of those interventions.


26. All students can achieve grade-level benchmarks if they have sufficient support.

27. The goal of assessment is to generate and measure effectiveness of instruction/intervention.

THANK YOU!


Perceptions of RtI Skills Survey

Directions: Please read each statement about a skill related to assessment, instruction, and/or intervention below, and then evaluate YOUR skill level within the context of working at a school/building level. Where indicated, rate your skill separately for academics (i.e., reading and math) and behavior. Please use the following response scale:

Scale: NS = I do not have this skill at all; MnS = I have minimal skills in this area, need substantial support to use it; SS = I have this skill, but still need some support to use it; HS = I can use this skill with little support; VHS = I am highly skilled in this area and could teach others this skill

1. Your PS/RtI Project ID: Your PS/RtI Project ID was designed to assure confidentiality while also providing a method to match an individual's responses across instruments. In the space provided (first row), please write in the last four digits of your Social Security Number and the last two digits of the year you were born. Then, shade in the corresponding circles.

The skill to: (rated NS, MnS, SS, HS, or VHS)

2. Access the data necessary to determine the percent of students in core instruction who are achieving benchmarks (district grade-level standards) in:

a. Academics

b. Behavior

3. Use data to make decisions about individuals and groups of students for the:

a. Core academic curriculum

b. Core/Building discipline plan


4. Perform each of the following steps when identifying the problem for a student for whom concerns have been raised:

a. Define the referral concern in terms of a replacement behavior (i.e., what the student should be able to do) instead of a referral problem for:

• Academics

• Behavior

b. Use data to define the current level of performance of the target student for:

• Academics

• Behavior

c. Determine the desired level of performance (i.e., benchmark) for:

• Academics

• Behavior

d. Determine the current level of peer performance for the same skill as the target student for:

• Academics

• Behavior

e. Calculate the gap between student current performance and the benchmark (district grade level standard) for:

• Academics

• Behavior

f. Use gap data to determine whether core instruction should be adjusted or whether supplemental instruction should be directed to the target student for:

• Academics

• Behavior

5. Develop potential reasons (hypotheses) that a student or group of students is/are not achieving desired levels of performance (i.e., benchmarks) for:

a. Academics

b. Behavior

6. Identify the most appropriate type(s) of data to use for determining reasons (hypotheses) that are likely to be contributing to the problem for:

a. Academics

b. Behavior


7. Identify the appropriate supplemental intervention available in my building for a student identified as at-risk for:

a. Academics

b. Behavior

8. Access resources (e.g., internet sources, professional literature) to develop evidence-based interventions for:

a. Academic core curricula

b. Behavioral core curricula

c. Academic supplemental curricula

d. Behavioral supplemental curricula

e. Academic individualized intervention plans

f. Behavioral individualized intervention plans

9. Ensure that any supplemental and/or intensive interventions are integrated with core instruction in the general education classroom:

a. Academics

b. Behavior

10. Ensure that the proposed intervention plan is supported by the data that were collected for:

a. Academics

b. Behavior

11. Provide the support necessary to ensure that the intervention is implemented appropriately for:

a. Academics

b. Behavior

12. Determine if an intervention was implemented as it was intended for:

a. Academics

b. Behavior

13. Select appropriate data (e.g., Curriculum-Based Measurement, DIBELS, FCAT, behavioral observations) to use for progress monitoring of student performance during interventions:

a. Academics

b. Behavior


14. Construct graphs for large group, small group, and individual students:

a. Graph target student data

b. Graph benchmark data

c. Graph peer data

d. Draw an aimline

e. Draw a trendline

15. Interpret graphed progress monitoring data to make decisions about the degree to which a student is responding to intervention (e.g., positive, questionable or poor response).

16. Make modifications to intervention plans based on student response to intervention.

17. Use appropriate data to differentiate students who have not learned skills (e.g., did not have adequate exposure to effective instruction, not ready, got too far behind) from those who have barriers to learning due to a disability.

18. Collect the following types of data:

a. Curriculum-Based Measurement

b. DIBELS

c. Access data from appropriate district- or school-wide assessments

d. Standard behavioral observations

19. Disaggregate data by race, gender, free/reduced lunch, language proficiency, and disability status

20. Use technology in the following ways:

a. Access the internet to locate sources of academic and behavioral evidence-based interventions.

b. Use electronic data collection tools (e.g., PDAs)

c. Use the Progress Monitoring and Reporting Network (PMRN)

d. Use the School-Wide Information System (SWIS) for Positive Behavior Support

e. Graph and display student and school data

21. Facilitate a Problem Solving Team (Student Support Team, Intervention Assistance Team, School-Based Intervention Team, Child Study Team) meeting.

THANK YOU!
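Items 4e and 4f of the survey above involve a concrete computation: the gap between the target student's current performance and the benchmark, and the parallel gap for peers. The Python sketch below is illustrative only, not Project code; expressing the gap as a benchmark-to-performance ratio and using a 1.5 peer-gap threshold to flag core instruction are assumed conventions, not requirements stated in the instrument.

```python
# Illustrative sketch (not Project code) of the gap analysis in items 4e-4f.
# Gaps are expressed as benchmark / current performance; the 1.5 threshold
# for flagging core instruction is an assumed convention.

def gap_analysis(student, peers, benchmark):
    """Return the student gap, peer gap, and the suggested focus of support."""
    student_gap = benchmark / student
    peer_gap = benchmark / peers
    # When peers also lag far behind the benchmark, adjusting core
    # instruction is indicated; otherwise target the individual student.
    focus = "adjust core instruction" if peer_gap >= 1.5 else "supplemental intervention"
    return round(student_gap, 2), round(peer_gap, 2), focus

# Example: oral reading fluency in words correct per minute (hypothetical)
print(gap_analysis(student=30, peers=85, benchmark=90))
# -> (3.0, 1.06, 'supplemental intervention')
```

Here the student is roughly three times behind the benchmark while peers are essentially on track, so the data point toward supplemental intervention rather than a core-instruction problem.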


Self-Assessment of Problem Solving Implementation (SAPSI)

School Name: ____________________  Date of Report: ____________________
District Name: ____________________  District & School ID: ____________________

INSTRUCTIONS

The members of your School-Based Leadership Team (Problem Solving Team) should complete this needs assessment as a group. We ask that all members of the team participate in this process. Each group member will receive a copy of the needs assessment; however, only one form should be returned to the Problem Solving/Response to Intervention (PS/RtI) Project. Your PS/RtI Coach will work with your team to facilitate completion of the SAPSI and will serve as the recorder for the version to be sent to Project staff. This needs assessment will be completed three times per school year to help you and the Project monitor activities for implementation of PS/RtI in your school.

The items on the SAPSI are meant to assess the degree to which schools implementing the PS/RtI model are (1) achieving and maintaining consensus among key stakeholders, (2) creating and maintaining the infrastructure necessary to support implementation, and (3) implementing practices and procedures consistent with the model. Members of the team should not be discouraged if your school has not achieved many of the criteria listed under the Consensus, Infrastructure, and Implementation domains. This instrument is intended to help your team identify needs at your school for which action plans can be developed. Whenever possible, data should be collected and/or reviewed to determine if evidence exists that suggests that a given activity is occurring.


School-Based Leadership Team Members (Name & Position)

Person(s) Completing Report (Name & Position)


PS/RtI Implementation Assessment

Directions: In responding to each item below, please use the following response scale:

Not Started (N) — The activity occurs less than 25% of the time
In Progress (I) — The activity occurs approximately 25% to 74% of the time
Achieved (A) — The activity occurs approximately 75% to 100% of the time
Maintaining (M) — The activity was rated as achieved last time and continues to occur approximately 75% to 100% of the time

For each item below, please write the letter of the option (N, I, A, M) that best represents your School-Based Leadership Team's response in the column labeled "Status". In the column labeled "Comments/Evidence", please write any comments, explanations and/or evidence that are relevant to your team's response. When completing the items on the SAPSI, the team should base its responses on the grade levels being targeted for implementation by the school.
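Teams that tally SAPSI responses electronically may find it useful to encode the response scale directly. The Python sketch below is a minimal illustration, not part of the SAPSI itself; the function name and the idea of carrying forward last cycle's Achieved rating are the only assumptions beyond the scale defined above.

```python
# Minimal sketch (not part of the SAPSI) of the response scale above.
# `pct` is the team's estimate of how often the activity occurs;
# `achieved_last_time` supports the Maintaining (M) rating.

def sapsi_status(pct, achieved_last_time=False):
    """Return N, I, A, or M per the SAPSI response scale."""
    if pct < 25:
        return "N"   # Not Started
    if pct < 75:
        return "I"   # In Progress
    return "M" if achieved_last_time else "A"  # Maintaining vs. Achieved

print(sapsi_status(80, achieved_last_time=True))  # -> M
```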

Consensus: Comprehensive Commitment and Support

1. District level leadership provides active commitment and support (e.g., meets to review data and issues at least twice each year).

2. The school leadership provides training, support and active involvement (e.g., principal is actively involved in School-Based Leadership Team meetings).

3. Faculty/staff support and are actively involved with problem solving/RtI (e.g., one of top 3 goals of the School Improvement Plan, 80% of faculty document support, 3-year timeline for implementation available).

4. A School-Based Leadership Team is established and represents the roles of an administrator, facilitator, data mentor, content specialist, parent, and teachers from representative areas (e.g., general ed., special ed.)

5. Data are collected (e.g., beliefs survey, satisfaction survey) to assess level of commitment and impact of PS/RtI on faculty/staff.

Additional Comments/Evidence:



Infrastructure Development: Data Collection and Team Structure

6. School-wide data (e.g., DIBELS, Curriculum-Based Measures, Office Discipline Referrals) are collected through an efficient and effective systematic process.

7. Statewide and other databases (e.g., Progress Monitoring and Reporting Network [PMRN], School-Wide Information System [SWIS]) are used to make data-based decisions.

8. School-wide data are presented to staff after each benchmarking session (e.g., staff meetings, team meetings, grade-level meetings).

9. School-wide data are used to evaluate the effectiveness of core academic programs.

10. School-wide data are used to evaluate the effectiveness of core behavior programs.

11. Curriculum-Based Measurement (e.g., DIBELS) data are used in conjunction with other data sources to identify students needing targeted group interventions and individualized interventions for academics.

12. Office Disciplinary Referral data are used in conjunction with other data sources to identify students needing targeted group interventions and individualized interventions for behavior.

13. Data are used to evaluate the effectiveness (RtI) of Tier 2 intervention programs.

14. Individual student data are utilized to determine response to Tier 3 interventions.

15. Special Education Eligibility determination is made using the RtI model for the following ESE programs:

a. Emotional/Behavioral Disabilities (EBD)

b. Specific Learning Disabilities (SLD)


16. The school staff has a process to select evidence-based practices.

a. Tier 1

b. Tier 2

c. Tier 3

17. The School-Based Leadership Team has a regular meeting schedule for problem-solving activities.

18. The School-Based Leadership Team evaluates target student’s/students’ RtI at regular meetings.

19. The School-Based Leadership Team involves parents.

20. The School-Based Leadership Team has regularly scheduled data day meetings to evaluate Tier 1 and Tier 2 data.

Additional Comments/Evidence:


Implementation: Three-Tiered Intervention System and Problem-Solving Process

21. The school has established a three-tiered system of service delivery.

a. Tier 1 Academic Core Instruction clearly identified.

b. Tier 1 Behavioral Core Instruction clearly identified.

c. Tier 2 Academic Supplemental Instruction/Programs clearly identified.

d. Tier 2 Behavioral Supplemental Instruction/Programs clearly identified.

e. Tier 3 Academic Intensive Strategies/Programs are evidence-based.

f. Tier 3 Behavioral Intensive Strategies/Programs are evidence-based.

22. Teams (e.g., School-Based Leadership Team, Problem-Solving Team, Intervention Assistance Team) implement effective problem solving procedures including:

a. Problem is defined as a data-based discrepancy (GAP Analysis) between what is expected and what is occurring (includes peer and benchmark data).

b. Replacement behaviors (e.g., reading performance targets, homework completion targets) are clearly defined.

c. Problem analysis is conducted using available data and evidence-based hypotheses.

d. Intervention plans include evidence-based (e.g., research-based, data-based) strategies.

e. Intervention support personnel are identified and scheduled for all interventions.


f. Intervention integrity is documented.

g. Response to intervention is evaluated through systematic data collection.

h. Changes are made to intervention based on student response.

i. Parents are routinely involved in implementation of interventions.

Additional Comments/Evidence:


Implementation: Monitoring and Action Planning

23. A strategic plan (implementation plan) exists and is used by the School-Based Leadership Team to guide implementation of PS/RtI.

24. The School-Based Leadership Team meets at least twice each year to review data and implementation issues.

25. The School-Based Leadership Team meets at least twice each year with the District Leadership Team to review data and implementation issues.

26. Changes are made to the implementation plan as a result of school and district leadership team data-based decisions.

27. Feedback on the outcomes of the PS/RtI Project is provided to school-based faculty and staff at least yearly.

Additional Comments/Evidence:


Tier III Critical Components Checklist

School Name: ______________________
FL or District Student ID: ______________
School Year: 2007-08 / 2008-09 / 2009-10 / 2010-11
Date Initial Meeting Occurred: ___________________
Grade Level: ________________
Area(s) of Concern (Check all that apply): Reading / Math / Behavior

Directions: For each selected student, please use the scale provided to indicate the extent to which each critical component of problem-solving is present in the Problem-Solving Team (i.e., Intervention Assistance Team, School-Based Intervention Team, Student Success Team, Child Study Team) paperwork. See the attached rubric for the criteria for determining the extent to which each critical component is present.

Ratings: 0 = Absent; 1 = Partially Present; 2 = Present (record Evidence/Comments for each component)

Problem Identification
1. Replacement behavior (i.e., target skill) was identified (0 1 2)
2. Data were collected to determine the target student's current level of performance, the expected level, and peer performance (0 1 2)
3. A gap analysis between the student's current level of performance and the benchmark, and the peers' current level of performance (or adequate representation of peer performance) and the benchmark was conducted (0 1 2)

Problem Analysis
4. Hypotheses were developed across multiple domains (e.g., curriculum, classroom, home/family, child, teacher, peers) or a functional analysis of behavior was completed (0 1 2)
5. Data were used to determine viable or active hypotheses for why students were not attaining benchmarks (0 1 2)

Intervention Development and Implementation
6. A complete intervention plan (i.e., who, what, when) was developed in areas for which data were available and hypotheses were verified (0 1 2)
7. An intervention support plan was developed (including actions to be taken, who is responsible, and when the actions will occur) (0 1 2)
8. A plan for assessing intervention integrity (i.e., fidelity) was agreed upon (0 1 2)
9. Frequency, focus, dates of progress monitoring, and responsibilities for collecting the data were agreed upon (0 1 2)
10. Criteria for positive response to intervention were agreed upon prior to implementing the intervention plan (0 1 2)
11. A follow-up meeting was scheduled at the initial meeting (0 1 2)


Program Evaluation/RtI
12. Progress monitoring data were collected and presented graphically (0 1 2)
13. Documentation of implementation of the intervention plan was presented (0 1 2)
14. A decision regarding good, questionable, or poor RtI was made (0 1 2)
15. A decision to continue, modify, or terminate the intervention plan was made (0 1 2)
16. An additional follow-up meeting was scheduled to re-address student progress at the follow-up meeting (0 1 2)

Additional Comments: _______________________________________________
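One way to summarize a completed checklist is to convert the 0/1/2 component ratings into a percentage of available points. The sketch below is an assumption about how such an index might be computed; it is not a scoring rule stated in the checklist itself.

```python
# Illustrative sketch (not a Project scoring rule): summarize the 0/1/2
# component ratings as the percentage of available points earned.

def implementation_index(ratings):
    """Percent of possible points across components scored 0, 1, or 2."""
    if not all(r in (0, 1, 2) for r in ratings):
        raise ValueError("each component rating must be 0, 1, or 2")
    return 100 * sum(ratings) / (2 * len(ratings))

# Hypothetical ratings for the 16 components above
print(round(implementation_index([2, 2, 1, 0, 2, 1, 2, 2, 1, 2, 2, 1, 2, 0, 1, 2]), 1))
```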


Tiers I and II Critical Components Checklist Rubric

1. Data were used to determine the effectiveness of core academic and behavior instruction

1 = Present — Data quantifying the effectiveness of core academic and/or behavior instruction for all students, and for demographic subgroups of students, are documented
2 = Partially Present — Data quantifying the effectiveness of core academic and/or behavior instruction for all students, or for demographic subgroups of students, are documented
3 = Absent — No data quantifying the effectiveness of core academic and/or behavior instruction are documented

2. Decisions were made to modify core instruction or to develop supplemental (Tier II) interventions

1 = Present — A decision to modify core instruction or to develop supplemental interventions was indicated and the decision was appropriate given the data used to evaluate the effectiveness of core instruction
2 = Partially Present — A decision to modify core instruction or to develop supplemental interventions was indicated, but the decision was not appropriate given the data used to evaluate the effectiveness of core instruction
3 = Absent — No decision regarding modifying core instruction or developing supplemental interventions was indicated

3. Universal screening (e.g., DIBELS, ODRs) or other data sources (e.g., district-wide assessments) were used to identify groups of students in need of supplemental intervention

1 = Present — Data from universal screening assessments or other data sources were factored into the decision to identify students as needing supplemental intervention
2 = Partially Present — Students were identified for supplemental intervention based on data; however, the data used to make the decision came from outcome assessments such as the SAT-10 or FCAT
3 = Absent — Data were not used to identify students in need of supplemental intervention

4. The school-based team generated hypotheses to identify potential reasons for students not meeting benchmarks

1 = Present — Reasons for the students not meeting benchmarks were developed. The reasons provided span multiple hypotheses domains (e.g., child, curriculum, peers, family/community, classroom, teacher)
2 = Partially Present — Reasons for the students not meeting benchmarks were developed, but the reasons do not span multiple hypotheses domains (e.g., curriculum hypotheses only)
3 = Absent — Reasons for the students not meeting benchmarks were not developed


5. Data were used to determine viable or active hypotheses for why students were not attaining benchmarks

1 = Present — Data were collected using RIOT (Review, Interview, Observe, Test) procedures for all hypotheses to determine the reasons that are likely to be barriers to the students attaining benchmarks
2 = Partially Present — Data were collected using RIOT (Review, Interview, Observe, Test) procedures for some hypotheses to determine the reasons that are likely to be barriers to the students attaining benchmarks
3 = Absent — Data were not collected to determine the reasons that are likely to be barriers to the students attaining benchmarks

6a. A plan for implementation of modifications to core instruction was documented

1 = Present — A plan for implementing modifications to core instruction was documented, and included the personnel responsible, the actions to be completed, and the deadline for completing those actions
2 = Partially Present — A plan for implementing modifications to core instruction was documented, but the personnel responsible, the actions to be completed, or the deadline for completing those actions was not included
3 = Absent — No plan for implementing the modifications to core instruction was documented
N/A = Not applicable — The data used to evaluate the effectiveness of the core curriculum suggested that the development or modification of supplemental instruction was appropriate

6b. Support for implementation of modifications to core instruction was documented

1 = Present — A plan for providing support to the personnel implementing modifications to core instruction was documented, and included the personnel responsible, the actions to be completed, and the deadline for completing those actions
2 = Partially Present — A plan for providing support to the personnel implementing modifications to core instruction was documented, but the personnel responsible, the actions to be completed, or the deadline for completing those actions was not included
3 = Absent — No plan for providing support to the personnel implementing the modifications to core instruction was documented
N/A = Not applicable — The data used to evaluate the effectiveness of the core curriculum suggested that the development or modification of supplemental instruction was appropriate

6c. Documentation of implementation of modifications to core instruction was provided

1 = Present — Data were documented demonstrating that the modifications to core instruction were implemented and at least some of the data were quantifiable
2 = Partially Present — Data were documented demonstrating that the modifications to core instruction were implemented, but none of the data were quantifiable
3 = Absent — No information on the degree to which the modifications to core instruction were implemented was documented
N/A = Not applicable — The data used to evaluate the effectiveness of the core curriculum suggested that the development or modification of supplemental instruction was appropriate


7a. A plan for implementation of supplemental instruction was documented

1 = Present — A plan for implementation of supplemental instruction was documented, and included the personnel responsible, the actions to be completed, and the deadline for completing those actions
2 = Partially Present — A plan for implementation of supplemental instruction was documented, but the personnel responsible, the actions to be completed, or the deadline for completing those actions was not included
3 = Absent — No plan for implementation of supplemental instruction was documented
N/A = Not applicable — The data used to evaluate the effectiveness of the core curriculum suggested that modification of core instruction was appropriate

7b. Support for implementation of supplemental instruction was documented

1 = Present — A plan for providing support to the personnel implementing supplemental instruction was documented, and included the personnel responsible, the actions to be completed, and the deadline for completing those actions
2 = Partially Present — A plan for providing support to the personnel implementing supplemental instruction was documented, but the personnel responsible, the actions to be completed, or the deadline for completing those actions was not included
3 = Absent — No plan for providing support to the personnel implementing supplemental instruction was documented
N/A = Not applicable — The data used to evaluate the effectiveness of the core curriculum suggested that modifications to core instruction were appropriate

7c. Documentation of implementation of supplemental instruction was provided

1 = Present — Data were documented demonstrating that the supplemental instruction protocol was implemented and at least some of the data were quantifiable
2 = Partially Present — Data were documented demonstrating that the supplemental instruction protocol was implemented, but none of the data were quantifiable
3 = Absent — No information on the degree to which supplemental instruction was implemented was documented
N/A = Not applicable — The data used to evaluate the effectiveness of the core curriculum suggested that modifications to core instruction were appropriate

8. Criteria for determining positive RtI defined

1 = Present — The rate of improvement on the target skill needed for the student's RtI to be considered positive was provided in measurable terms
2 = Partially Present — Quantifiable data defining the improvement in the target skill needed for positive RtI were provided, but the data did not include a rate index
3 = Absent — No criteria for determining positive RtI were provided

9. Progress monitoring data collected/scheduled

1 = Present — Progress monitoring data were collected at an appropriate frequency using measures that are sensitive to small changes in the target skill
2 = Partially Present — Progress monitoring data were collected, but were not collected frequently enough or were collected using measures that are not sensitive to small changes in the target skill
3 = Absent — Little or no progress monitoring data were collected


10. Decisions regarding student RtI documented

1 = Present — Documented decisions regarding whether the students demonstrated positive, questionable, or poor RtI were made based on progress monitoring data
2 = Partially Present — A discussion of student RtI was provided, but no decisions regarding positive, questionable, or poor RtI were made
3 = Absent — No discussion of the students' RtI was provided

11. Plan for continuing, modifying, or terminating the intervention plan provided

1 = Present — A plan for continuing, modifying, or terminating the intervention plan was provided based on the students' RtI
2 = Partially Present — A plan for continuing, modifying, or terminating the intervention plan was provided, but it did not link directly to the students' RtI
3 = Absent — No plan for continuing, modifying, or terminating the intervention plan was provided


Tiers I and II Critical Components Checklist

Directions: For each selected grade level, please use the scale provided to indicate the degree to which each critical component of problem solving is present in the problem-solving team paperwork. See the attached rubric for the criteria for determining the degree to which each critical component is present.

Ratings: 1 = Present; 2 = Partially Present; 3 = Absent; N/A = Not applicable (record Evidence/Comments for each component)

Problem Identification
1. Data were used to determine the effectiveness of core academic and behavior instruction (1 2 3)
2. Decisions were made to modify core instruction or to develop supplemental (Tier II) interventions (1 2 3)
3. Universal screening (e.g., DIBELS, ODRs) or other data sources (e.g., district-wide assessments) were used to identify groups of students in need of supplemental intervention (1 2 3)

Problem Analysis
4. The school-based team generated hypotheses to identify potential reasons for students not meeting benchmarks (1 2 3)
5. Data were used to determine viable or active hypotheses for why students were not attaining benchmarks (1 2 3)

Intervention Development and Implementation
6. Modifications to core instruction
a. A plan for implementation of modifications to core instruction was documented (1 2 3 N/A)
b. Support for implementation of modifications to core instruction was documented (1 2 3 N/A)
c. Documentation of implementation of modifications to core instruction was provided (1 2 3 N/A)
7. Supplemental (Tier II) instruction development or modification
a. A plan for implementation of supplemental instruction was documented (1 2 3 N/A)
b. Support for implementation of supplemental instruction was documented (1 2 3 N/A)


c. Documentation of implementation of supplemental instruction was provided (1 2 3 N/A)

Program Evaluation/RtI
8. Criteria for positive response to intervention defined (1 2 3)
9. Progress monitoring data were collected/scheduled (1 2 3)
10. A decision regarding student RtI was documented (1 2 3)
11. A plan for continuing, modifying, or terminating the intervention plan was provided (1 2 3)


Appendix C — Data Tables

Table 1a
Frequency and Percentage of School-Based Leadership Teams' Beliefs Ratings Across Pilot Schools: Factor One (Student Academic Ability and Benchmarks)

Columns: Response | Y1 BOY | Y1 EOY | Y2 EOY | Y3 EOY | Y4 EOY

9A. The majority of students with learning disabilities achieve grade-level benchmarks in reading
SA | 8 (4.3%) | 14 (8.7%) | 15 (8.4%) | 13 (7.8%) | 15 (6.2%)
A | 38 (20.4%) | 50 (31.1%) | 65 (36.3%) | 55 (33.1%) | 70 (28.8%)
N | 32 (17.2%) | 37 (23.0%) | 49 (27.4%) | 50 (30.1%) | 77 (31.7%)
D | 104 (55.9%) | 57 (35.4%) | 46 (25.7%) | 48 (28.9%) | 75 (30.9%)
SD | 4 (2.2%) | 3 (1.9%) | 4 (2.2%) | 0 (0%) | 6 (2.5%)

9B. The majority of students with learning disabilities achieve grade-level benchmarks in math
SA | 10 (5.4%) | 10 (6.2%) | 18 (10.1%) | 15 (9.1%) | 16 (6.6%)
A | 45 (24.3%) | 59 (36.7%) | 66 (36.9%) | 57 (34.6%) | 75 (30.9%)
N | 29 (15.7%) | 37 (23.0%) | 48 (26.8%) | 49 (29.7%) | 76 (31.3%)
D | 97 (52.4%) | 53 (32.9%) | 43 (24.0%) | 44 (26.7%) | 71 (29.2%)
SD | 4 (2.2%) | 2 (1.2%) | 4 (2.2%) | 0 (0%) | 5 (2.1%)

10A. The majority of students with behavioral problems (EH/SED or EBD) achieve grade-level benchmarks in reading
SA | 18 (9.7%) | 11 (6.9%) | 15 (8.4%) | 8 (4.8%) | 12 (4.9%)
A | 39 (21.0%) | 57 (35.6%) | 69 (38.6%) | 59 (35.5%) | 69 (28.4%)
N | 39 (21.0%) | 43 (26.9%) | 55 (30.7%) | 64 (38.6%) | 84 (34.6%)
D | 82 (44.1%) | 45 (28.1%) | 38 (21.2%) | 33 (19.9%) | 65 (26.8%)
SD | 8 (4.3%) | 4 (2.5%) | 2 (1.1%) | 2 (1.2%) | 13 (5.4%)

10B. The majority of students with behavioral problems (EH/SED or EBD) achieve grade-level benchmarks in math
SA | 12 (6.5%) | 11 (6.9%) | 14 (7.8%) | 8 (4.9%) | 12 (4.9%)
A | 48 (26.0%) | 59 (36.9%) | 67 (37.4%) | 60 (36.4%) | 71 (29.2%)
N | 41 (22.2%) | 42 (26.3%) | 58 (32.4%) | 63 (38.2%) | 83 (34.2%)
D | 76 (41.1%) | 44 (27.5%) | 38 (21.2%) | 33 (20.0%) | 65 (26.8%)
SD | 8 (4.3%) | 4 (2.5%) | 2 (1.1%) | 1 (0.6%) | 12 (4.9%)

11A. Students with high-incidence disabilities (e.g., SLD, EBD) who are receiving special education services are capable of achieving grade-level benchmarks (i.e., general education standards) in reading
SA | 27 (14.5%) | 24 (14.9%) | 38 (21.4%) | 32 (19.4%) | 37 (15.2%)
A | 103 (55.4%) | 88 (54.7%) | 78 (43.8%) | 91 (55.2%) | 94 (38.7%)
N | 36 (19.4%) | 32 (19.9%) | 46 (25.8%) | 35 (21.2%) | 77 (31.7%)
D | 17 (9.1%) | 16 (9.9%) | 16 (9.0%) | 7 (4.2%) | 34 (14.0%)
SD | 3 (1.6%) | 1 (0.6%) | 0 (0%) | 0 (0%) | 1 (0.4%)

11B. Students with high-incidence disabilities (e.g., SLD, EBD) who are receiving special education services are capable of achieving grade-level benchmarks (i.e., general education standards) in math
SA | 30 (16.2%) | 24 (14.9%) | 38 (21.2%) | 32 (19.4%) | 37 (15.2%)
A | 106 (57.3%) | 89 (55.3%) | 82 (45.8%) | 92 (55.8%) | 96 (39.5%)
N | 31 (16.8%) | 31 (19.3%) | 43 (24.0%) | 34 (20.6%) | 75 (30.9%)
D | 16 (8.7%) | 16 (9.9%) | 16 (8.9%) | 7 (4.2%) | 35 (14.4%)
SD | 2 (1.1%) | 1 (0.6%) | 0 (0%) | 0 (0%) | 0 (0%)

Note. SA = Strongly Agree; A = Agree; N = Neutral; D = Disagree; SD = Strongly Disagree; Y1 BOY = Year 1 Beginning of Year; Y1 EOY = Year 1 End of Year; Y2 EOY = Year 2 End of Year; Y3 EOY = Year 3 End of Year; Y4 EOY = Year 4 End of Year.
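The cell entries in Tables 1a-1c are simple frequencies with percentages of valid responses. For readers replicating the tables from raw survey data, a minimal Python sketch follows; the ratings shown are hypothetical and the function name is illustrative.

```python
# Minimal sketch (illustrative, not the Project's analysis code) of how a
# row of Tables 1a-1c is computed: count each response and report
# "n (percent%)" over the valid responses.

from collections import Counter

def frequency_row(ratings, scale=("SA", "A", "N", "D", "SD")):
    counts = Counter(ratings)
    total = sum(counts[s] for s in scale)
    return {s: f"{counts[s]} ({100 * counts[s] / total:.1f}%)" for s in scale}

print(frequency_row(["A", "A", "D", "N", "SA", "A", "D", "D"]))
# -> {'SA': '1 (12.5%)', 'A': '3 (37.5%)', 'N': '1 (12.5%)', 'D': '3 (37.5%)', 'SD': '0 (0.0%)'}
```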


Table 1b Frequency and Percentage of School Based Leadership Teams’ Beliefs Ratings Across Pilot Schools: Factor Two (Data-Based Decision Making) Belief Statement Response Y1 BOY Y1 EOY Y2 EOY Y3 EOY Y4 EOY 12. General education classroom teachers should implement more differentiated and flexible instructional practices to address the needs of a more diverse student body

SA A N D SD

109 (58.6%) 65 (35.0%) 9 (4.8%) 3 (1.6%) 0 (0%)

90 (55.9%) 65 (40.4%) 3 (1.9%) 3 (1.9%) 0 (0%)

113 (62.8%) 63 (35.0%) 4 (2.2%) 0 (0%) 0 (0%)

87 (52.7%) 71 (43.0%) 6 (3.6%) 0 (0%) 1 (0.6%)

101 (41.7%) 114 (47.1%) 19 (7.9%) 8 (3.3%) 0 (0%)

13. General education classroom teachers would be able to implement more differentiated and flexible interventions if they had additional staff support

SA A N D SD

122 (65.6%) 59 (31.7%) 5 (2.7%) 0 (0%) 0 (0%)

102 (63.4%) 51 (31.7%) 5 (3.1%) 3 (1.9%) 0 (0%)

113 (62.8%) 58 (32.2%) 5 (2.8%) 4 (2.2%) 0 (0%)

111 (66.9%) 48 (28.9%) 6 (3.6%) 1 (0.6%) 0 (0%)

153 (63.0%) 75 (30.9%) 13 (5.4%) 2 (0.8%) 0 (0%)

14. The use of additional interventions in the general education classroom would result in success for more students

SA A N D SD

92 (49.5%) 87 (46.8%) 5 (2.7%) 2 (1.1%) 0 (0%)

94 (58.4%) 61 (37.9%) 6 (3.7%) 0 (0%) 0 (0%)

104 (57.8%) 69 (38.3%) 7 (3.9%) 0 (0%) 0 (0%)

101 (61.2%) 56 (33.9%) 8 (4.9%) 0 (0%) 0 (0%)

109 (44.9%) 104 (42.8%) 24 (9.9%) 5 (2.1%) 1 (0.4%)

15. Prevention activities and early intervention strategies in schools would result in fewer referrals to problem-solving teams and placements in special education

SA A N D SD

83 (44.9%) 86 (46.5%) 12 (6.5%) 4 (2.2%) 0 (0%)

94 (58.4%) 56 (34.8%) 10 (6.2%) 1 (0.6%) 0 (0%)

103 (57.2%) 71 (39.4%) 6 (3.3%) 0 (0%) 0 (0%)

94 (57.0%) 63 (38.2%) 7 (4.2%) 1 (0.6%) 0 (0%)

92 (37.9%) 104 (42.8%) 34 (14.0%) 12 (4.9%) 1 (0.4%)

16. The “severity” of a student’s academic problem is determined not by how far behind the student is in terms of his/her academic performance but by how quickly the student responds to intervention

SA A N D SD

28 (15.1%) 100 (54.1%) 34 (18.4%) 23 (12.4%) 0 (0%)

48 (30.0%) 87 (54.4%) 17 (10.6%) 8 (5.0%) 0 (0%)

60 (33.3%) 86 (47.8%) 28 (15.6%) 6 (3.3%) 0 (0%)

47 (28.5%) 88 (53.3%) 24 (14.6%) 6 (3.6%) 0 (0%)

64 (26.3%) 101 (41.6%) 53 (21.8%) 22 (9.1%) 3 (1.2%)

17. The “severity” of a student’s behavioral problem is determined not by how inappropriate a student is in terms of his/her behavioral performance but by how quickly the student responds to intervention

SA A N D SD

27 (14.6%) 94 (50.8%) 36 (19.5%) 26 (14.1%) 2 (1.1%)

48 (29.8%) 84 (52.2%) 18 (11.2%) 11 (6.8%) 0 (0%)

46 (25.6%) 95 (52.8%) 24 (13.3%) 15 (8.3%) 0 (0%)

45 (27.4%) 86 (52.4%) 25 (15.2%) 8 (4.9%) 0 (0%)

44 (18.1%) 111 (45.7%) 54 (22.2%) 32 (13.2%) 2 (0.8%)

20. Using student-based data to determine intervention effectiveness is more accurate than using only “teacher judgment”

SA A N D SD

64 (34.4%) 88 (47.3%) 18 (9.7%) 13 (7.0%) 3 (1.6%)

86 (53.4%) 59 (36.7%) 10 (6.2%) 5 (3.1%) 1 (0.6%)

97 (53.9%) 69 (38.3%) 11 (6.1%) 3 (1.7%) 0 (0%)

81 (49.1%) 71 (43.0%) 8 (4.9%) 5 (3.0%) 0 (0%)

69 (28.4%) 111 (45.7%) 40 (16.5%) 21 (8.6%) 2 (0.8%)

78 Appendix C – Data Tables

Page 83: The Florida Problem Solving/ Response to …floridarti.usf.edu › resources › format › pdf › yr4_eval_report.pdfThis report presents information on the Florida Problem Solving/Response

21. Evaluating a student’s response to interventions is a more effective way of determining what a student is capable of achieving than using scores from “tests” (e.g., IQ/Achievement test)

SA A N D SD

32 (17.2%) 98 (52.7%) 38 (20.4%) 17 (9.1%) 1 (0.5%)

54 (33.5%) 74 (46.0%) 20 (12.4%) 13 (8.1%) 0 (0%)

56 (31.1%) 100 (55.6%) 21 (11.7%) 2 (1.1%) 1 (0.6%)

58 (35.2%) 86 (52.1%) 15 (9.1%) 6 (3.6%) 0 (0%)

50 (20.6%) 111 (45.7%) 62 (25.5%) 18 (7.4%) 2 (0.8%)

22. Additional time and resources should be allocated first to students who are not reaching benchmarks (i.e., general education standards) before significant time and resources are directed to students who are at or above benchmarks

SA A N D SD

21 (11.3%) 89 (47.9%) 34 (18.3%) 36 (19.4%) 6 (3.2%)

30 (18.9%) 69 (43.4%) 24 (15.1%) 31 (19.5%) 5 (3.1%)

23 (12.8%) 78 (43.3%) 39 (21.7%) 35 (19.4%) 5 (2.8%)

24 (14.6%) 77 (46.7%) 37 (22.4%) 24 (14.6%) 3 (1.8%)

24 (9.9%) 109 (45.0%) 48 (19.8%) 52 (21.5%) 9 (3.7%)

23. Graphing student data makes it easier for one to make decisions about student performance and needed interventions

SA A N D SD

37 (19.9%) 116 (62.4%) 24 (12.9%) 8 (4.3%) 1 (0.5%)

75 (46.6%) 75 (46.6%) 10 (6.2%) 1 (0.6%) 0 (0%)

84 (46.9%) 80 (44.7%) 14 (7.8%) 1 (0.6%) 0 (0%)

75 (45.5%) 89 (53.9%) 1 (0.6%) 0 (0%) 0 (0%)

60 (24.7%) 129 (53.1%) 39 (16.1%) 13 (5.4%) 2 (0.8%)

24. A student’s parents (guardian) should be involved in the problem-solving process as soon as a teacher has a concern about the student

SA A N D SD

89 (47.9%) 87 (46.8%) 7 (3.8%) 3 (1.6%) 0 (0%)

92 (57.1%) 64 (39.8%) 3 (1.9%) 1 (0.6%) 1 (0.6%)

118 (65.9%) 58 (32.4%) 3 (1.7%) 0 (0%) 0 (0%)

99 (60.4%) 58 (35.4%) 4 (2.4%) 2 (1.2%) 1 (0.6%)

140 (57.6%) 93 (38.3%) 8 (3.3%) 2 (0.8%) 0 (0%)

25. Students respond better to interventions when their parent (guardian) is involved in the development and implementation of those interventions

SA A N D SD

106 (57.0%) 64 (34.4%) 10 (5.4%) 5 (2.7%) 1 (0.5%)

86 (53.4%) 62 (38.5%) 12 (7.5%) 0 (0%) 1 (0.6%)

85 (47.2%) 73 (40.6%) 18 (10.0%) 4 (2.2%) 0 (0%)

79 (47.9%) 69 (41.8%) 16 (9.7%) 1 (0.6%) 0 (0%)

128 (52.7%) 82 (33.7%) 31 (12.8%) 2 (0.8%) 0 (0%)

27. The goal of assessment is to generate and measure effectiveness of instruction or intervention

SA A N D SD

54 (29.0%) 109 (58.6%) 13 (7.0%) 9 (4.8%) 1 (0.5%)

57 (35.4%) 88 (54.7%) 8 (5.0%) 7 (4.4%) 1 (0.6%)

71 (39.7%) 102 (57.0%) 5 (2.8%) 1 (0.6%) 0 (0%)

61 (37.0%) 90 (54.6%) 10 (6.1%) 4 (2.4%) 0 (0%)

57 (23.5%) 154 (63.4%) 25 (10.3%) 3 (1.2%) 4 (1.7%)

Note. SA = Strongly Agree; A = Agree; N = Neutral; D = Disagree; SD = Strongly Disagree; Y1BOY = Year 1 Beginning of Year; Y1EOY = Year 1 End of Year; Y2EOY= Year 2 End of Year; Y3EOY = Year 3 End of Year; Y4 EOY = Year 4 End of Year.

Table 1c
Frequency and Percentage of School-Based Leadership Teams' Beliefs Ratings Across Pilot Schools: Factor Three (Functions of Instruction)

(Each belief statement below is followed by five response rows, one per administration, in order: Y1 BOY, Y1 EOY, Y2 EOY, Y3 EOY, Y4 EOY.)

7A. Core instruction should be effective enough to result in 80% of the students achieving benchmarks in reading

SA A N D SD

75 (40.5%) 95 (51.4%) 10 (5.4%) 3 (1.6%) 2 (1.1%)

85 (52.8%) 70 (43.5%) 1 (0.6%) 3 (1.9%) 2 (1.2%)

96 (53.3%) 80 (44.4%) 3 (1.7%) 0 (0%) 1 (0.6%)

77 (46.4%) 82 (49.4%) 5 (3.0%) 2 (1.2%) 0 (0%)

82 (33.7%) 126 (51.9%) 24 (9.9%) 10 (4.1%) 1 (0.4%)

7B. Core instruction should be effective enough to result in 80% of the students achieving benchmarks in math

SA A N D SD

73 (39.9%) 94 (51.4%) 11 (6.0%) 3 (1.6%) 2 (1.1%)

84 (52.8%) 69 (43.4%) 1 (0.6%) 3 (1.9%) 2 (1.3%)

93 (52.5%) 79 (44.6%) 4 (2.3%) 0 (0%) 1 (0.6%)

74 (45.7%) 82 (50.6%) 4 (2.5%) 2 (1.2%) 0 (0%)

81 (33.3%) 126 (51.9%) 24 (9.9%) 11 (4.5%) 1 (0.4%)

8A. The primary function of supplemental instruction is to ensure that students meet grade-level benchmarks in reading

SA A N D SD

54 (29.0%) 115 (61.8%) 8 (4.3%) 9 (4.8%) 0 (0%)

70 (44.0%) 82 (51.6%) 2 (1.3%) 5 (3.1%) 0 (0%)

76 (42.7%) 93 (52.3%) 4 (2.3%) 5 (2.8%) 0 (0%)

62 (37.8%) 90 (54.9%) 7 (4.3%) 4 (2.4%) 1 (0.6%)

77 (31.7%) 141 (58.0%) 15 (6.2%) 9 (3.7%) 1 (0.4%)

8B. The primary function of supplemental instruction is to ensure that students meet grade-level benchmarks in math

SA A N D SD

50 (27.0%) 115 (62.2%) 11 (6.0%) 9 (4.9%) 0 (0%)

70 (44.0%) 82 (51.6%) 2 (1.3%) 5 (3.1%) 0 (0%)

75 (42.1%) 94 (52.8%) 4 (2.3%) 5 (2.8%) 0 (0%)

59 (36.2%) 92 (56.4%) 7 (4.3%) 4 (2.5%) 1 (0.6%)

77 (31.7%) 142 (58.4%) 14 (5.8%) 9 (3.7%) 1 (0.4%)

Note. SA = Strongly Agree; A = Agree; N = Neutral; D = Disagree; SD = Strongly Disagree; Y1BOY = Year 1 Beginning of Year; Y1EOY = Year 1 End of Year; Y2EOY= Year 2 End of Year; Y3EOY = Year 3 End of Year; Y4 EOY = Year 4 End of Year.

Table 2a
Frequency and Percentage of Instructional School Staff Beliefs Ratings Across Pilot Schools: Factor One (Student Academic Ability and Benchmarks)

(Each belief statement below is followed by five response rows, one per administration, in order: Y1 BOY, Y1 EOY, Y2 EOY, Y3 EOY, Y4 EOY.)

9A. The majority of students with learning disabilities achieve grade-level benchmarks in reading

SA A N D SD

31 (2.0%) 261 (17.0%) 491 (31.9%) 660 (42.9%) 97 (6.3%)

25 (1.6%) 346 (22.1%) 522 (33.3%) 596 (38.0%) 79 (5.0%)

44 (2.8%) 405 (26.2%) 577 (37.3%) 482 (31.2%) 39 (2.5%)

46 (3.4%) 361 (26.6%) 477 (35.2%) 429 (31.6%) 43 (3.2%)

19 (2.4%) 230 (29.3%) 276 (35.2%) 233 (29.7%) 26 (3.3%)

9B. The majority of students with learning disabilities achieve grade-level benchmarks in math

SA A N D SD

31 (2.0%) 286 (18.7%) 505 (33.0%) 618 (40.3%) 92 (6.0%)

27 (1.7%) 391 (25.0%) 522 (33.3%) 554 (35.4%) 72 (4.6%)

45 (2.9%) 426 (27.6%) 578 (37.4%) 460 (29.8%) 37 (2.4%)

50 (3.7%) 366 (27.1%) 475 (35.2%) 417 (30.9%) 41 (3.0%)

20 (2.6%) 240 (30.7%) 277 (35.4%) 226 (28.9%) 20 (2.6%)

10A. The majority of students with behavioral problems (EH/SED or EBD) achieve grade-level benchmarks in reading

SA A N D SD

28 (1.8%) 265 (17.3%) 559 (36.4%) 588 (38.3%) 95 (6.2%)

16 (1.0%) 318 (20.3%) 645 (41.2%) 518 (33.1%) 67 (4.3%)

38 (2.5%) 366 (23.7%) 662 (42.8%) 427 (27.6%) 53 (3.4%)

39 (2.9%) 333 (24.4%) 562 (41.2%) 389 (28.5%) 41 (3.0%)

15 (1.9%) 204 (26.0%) 332 (42.4%) 205 (26.2%) 28 (3.6%)

10B. The majority of students with behavioral problems (EH/SED or EBD) achieve grade-level benchmarks in math

SA A N D SD

29 (1.9%) 287 (18.8%) 557 (36.4%) 565 (37.0%) 91 (6.0%)

16 (1.0%) 327 (20.9%) 652 (41.7%) 500 (32.0%) 67 (4.3%)

38 (2.5%) 382 (24.7%) 666 (43.1%) 410 (26.5%) 51 (3.3%)

37 (2.7%) 338 (25.0%) 555 (41.0%) 386 (28.5%) 38 (2.8%)

15 (1.9%) 211 (27.0%) 334 (42.7%) 196 (25.0%) 27 (3.5%)

11A. Students with high-incidence disabilities (e.g. SLD, EBD) who are receiving special education services are capable of achieving grade-level benchmarks (i.e., general education standards) in reading

SA A N D SD

72 (4.7%) 477 (31.0%) 596 (38.7%) 330 (21.4%) 66 (4.3%)

62 (4.0%) 558 (35.6%) 589 (37.6%) 300 (19.1%) 58 (3.7%)

76 (4.9%) 555 (35.9%) 615 (39.8%) 263 (17.0%) 38 (2.5%)

70 (5.1%) 480 (35.1%) 525 (38.4%) 253 (18.5%) 38 (2.8%)

52 (6.6%) 298 (38.0%) 291 (37.1%) 118 (15.0%) 26 (3.3%)

11B. Students with high-incidence disabilities (e.g. SLD, EBD) who are receiving special education services are capable of achieving grade-level benchmarks (i.e., general education standards) in math

SA A N D SD

74 (4.8%) 479 (31.3%) 602 (39.4%) 311 (20.3%) 63 (4.1%)

63 (4.0%) 560 (35.8%) 592 (37.9%) 295 (18.9%) 54 (3.5%)

77 (5.0%) 563 (36.6%) 613 (39.8%) 250 (16.2%) 37 (2.4%)

70 (5.2%) 477 (35.2%) 519 (38.3%) 251 (18.5%) 38 (2.8%)

51 (6.5%) 302 (38.6%) 287 (36.7%) 118 (15.1%) 25 (3.2%)

Note. SA = Strongly Agree; A = Agree; N = Neutral; D = Disagree; SD = Strongly Disagree; Y1BOY = Year 1 Beginning of Year; Y1EOY = Year 1 End of Year; Y2EOY= Year 2 End of Year; Y3EOY = Year 3 End of Year; Y4 EOY = Year 4 End of Year.

Table 2b
Frequency and Percentage of Instructional School Staff Beliefs Ratings Across Pilot Schools: Factor Two (Data-Based Decision Making)

(Each belief statement below is followed by five response rows, one per administration, in order: Y1 BOY, Y1 EOY, Y2 EOY, Y3 EOY, Y4 EOY.)

12. General education classroom teachers should implement more differentiated and flexible instructional practices to address the needs of a more diverse student body

SA A N D SD

347 (22.5%) 684 (44.4%) 277 (18.0%) 178 (11.6%) 55 (3.6%)

372 (23.6%) 780 (49.5%) 258 (16.4%) 138 (8.8%) 29 (1.8%)

366 (23.6%) 754 (48.6%) 283 (18.3%) 125 (8.1%) 23 (1.5%)

344 (25.1%) 676 (49.4%) 234 (17.1%) 101 (7.4%) 14 (1.0%)

205 (26.2%) 424 (54.2%) 88 (11.3%) 47 (6.0%) 18 (2.3%)

13. General education classroom teachers would be able to implement more differentiated and flexible interventions if they had additional staff support

SA A N D SD

826 (53.6%) 571 (37.0%) 100 (6.5%) 33 (2.1%) 12 (0.8%)

866 (55.0%) 584 (37.1%) 91 (5.8%) 25 (1.6%) 10 (0.6%)

835 (53.8%) 577 (37.2%) 108 (7.0%) 25 (1.6%) 8 (0.5%)

784 (57.2%) 486 (35.5%) 80 (5.8%) 15 (1.1%) 6 (0.4%)

456 (58.1%) 263 (33.5%) 50 (6.4%) 6 (0.8%) 10 (1.3%)

14. The use of additional interventions in the general education classroom would result in success for more students

SA A N D SD

454 (29.5%) 733 (47.6%) 253 (16.4%) 83 (5.4%) 18 (1.2%)

503 (32.0%) 773 (49.1%) 220 (14.0%) 69 (4.4%) 9 (0.6%)

478 (30.8%) 777 (50.0%) 240 (15.5%) 51 (3.3%) 7 (0.5%)

447 (32.7%) 682 (49.8%) 171 (12.5%) 61 (4.5%) 8 (0.6%)

262 (33.4%) 392 (49.9%) 104 (13.3%) 19 (2.4%) 8 (1.0%)

15. Prevention activities and early intervention strategies in schools would result in fewer referrals to problem-solving teams and placements in special education

SA A N D SD

353 (22.9%) 744 (48.3%) 315 (20.4%) 106 (6.9%) 24 (1.6%)

369 (23.4%) 806 (51.2%) 301 (19.1%) 88 (5.6%) 11 (0.7%)

422 (27.2%) 766 (49.4%) 274 (17.7%) 74 (4.8%) 15 (1.0%)

365 (26.7%) 696 (50.9%) 252 (18.4%) 50 (3.7%) 5 (0.4%)

225 (28.7%) 382 (48.7%) 126 (16.1%) 48 (6.1%) 4 (0.5%)

16. The “severity” of a student’s academic problem is determined not by how far behind the student is in terms of his/her academic performance but by how quickly the student responds to intervention

SA A N D SD

93 (6.0%) 595 (38.7%) 545 (35.4%) 269 (17.9%) 37 (2.4%)

116 (7.4%) 716 (45.7%) 515 (32.9%) 199 (12.7%) 21 (1.3%)

124 (8.0%) 762 (48.2%) 488 (31.5%) 157 (10.1%) 18 (1.2%)

147 (10.8%) 653 (47.9%) 402 (29.5%) 150 (11.0%) 12 (0.9%)

106 (13.5%) 394 (50.3%) 182 (23.2%) 94 (12.0%) 8 (1.0%)

17. The “severity” of a student’s behavioral problem is determined not by how inappropriate a student is in terms of his/her behavioral performance but by how quickly the student responds to intervention

SA A N D SD

77 (5.0%) 555 (36.2%) 526 (34.3%) 314 (20.5%) 63 (4.1%)

97 (6.2%) 657 (41.9%) 518 (33.0%) 256 (16.3%) 41 (2.6%)

109 (7.0%) 685 (44.3%) 490 (31.7%) 232 (15.0%) 32 (2.1%)

125 (9.2%) 585 (42.8%) 421 (30.8%) 210 (15.4%) 25 (1.8%)

81 (10.3%) 353 (45.0%) 210 (26.8%) 128 (16.3%) 12 (1.5%)

20. Using student-based data to determine intervention effectiveness is more accurate than using only “teacher judgment”

SA A N D SD

155 (10.1%) 660 (42.9%) 403 (26.2%) 286 (18.6%) 36 (2.3%)

157 (10.0%) 728 (46.4%) 401 (25.5%) 250 (15.9%) 34 (2.2%)

171 (11.0%) 734 (47.4%) 363 (23.4%) 244 (15.7%) 38 (2.5%)

170 (12.4%) 647 (47.3%) 348 (25.4%) 177 (12.9%) 26 (1.9%)

117 (14.9%) 375 (47.8%) 171 (21.8%) 104 (13.3%) 18 (2.3%)

21. Evaluating a student’s response to interventions is a more effective way of determining what a student is capable of achieving than using scores from “tests” (e.g., IQ/Achievement test)

SA A N D SD

123 (8.0%) 729 (47.5%) 513 (33.4%) 150 (9.8%) 21 (1.4%)

106 (6.8%) 798 (50.8%) 510 (32.9%) 133 (8.5%) 23 (1.5%)

135 (8.7%) 730 (47.2%) 536 (34.7%) 130 (8.4%) 16 (1.0%)

116 (8.5%) 654 (47.8%) 465 (34.0%) 120 (8.8%) 13 (1.0%)

94 (12.0%) 413 (52.7%) 202 (25.8%) 62 (7.9%) 13 (1.7%)

22. Additional time and resources should be allocated first to students who are not reaching benchmarks (i.e., general education standards) before significant time and resources are directed to students who are at or above benchmarks

SA A N D SD

130 (8.5%) 548 (35.7%) 314 (20.4%) 395 (25.7%) 150 (9.8%)

111 (7.1%) 595 (37.9%) 344 (21.9%) 390 (24.9%) 129 (8.2%)

118 (7.6%) 574 (37.0%) 349 (22.5%) 378 (24.4%) 133 (8.6%)

115 (8.4%) 538 (39.3%) 336 (24.5%) 279 (20.4%) 102 (7.5%)

58 (7.4%) 314 (40.1%) 170 (21.7%) 191 (24.4%) 51 (6.5%)

23. Graphing student data makes it easier for one to make decisions about student performance and needed interventions

SA A N D SD

126 (8.2%) 779 (50.6%) 467 (30.3%) 137 (8.9%) 31 (2.0%)

129 (8.2%) 799 (50.9%) 478 (30.5%) 143 (9.1%) 21 (1.3%)

177 (11.4%) 797 (51.3%) 444 (28.6%) 110 (7.1%) 25 (1.6%)

181 (13.2%) 710 (51.9%) 355 (25.9%) 95 (6.9%) 28 (2.1%)

117 (14.9%) 429 (54.7%) 171 (21.8%) 55 (7.0%) 12 (1.5%)

24. A student’s parents (guardian) should be involved in the problem-solving process as soon as a teacher has a concern about the student

SA A N D SD

754 (49.0%) 684 (44.4%) 80 (5.2%) 16 (1.0%) 5 (0.3%)

759 (48.2%) 714 (45.4%) 81 (5.2%) 16 (1.0%) 4 (0.3%)

793 (51.1%) 656 (42.3%) 82 (5.3%) 15 (1.0%) 5 (0.3%)

692 (50.5%) 589 (43.0%) 80 (5.8%) 8 (0.6%) 2 (0.2%)

382 (48.7%) 367 (46.8%) 25 (3.2%) 8 (1.0%) 3 (0.4%)

25. Students respond better to interventions when their parent (guardian) is involved in the development and implementation of those interventions

SA A N D SD

725 (47.1%) 644 (41.9%) 136 (8.8%) 28 (1.8%) 6 (0.4%)

746 (47.5%) 640 (40.8%) 151 (9.6%) 29 (1.9%) 4 (0.3%)

736 (47.4%) 643 (41.4%) 145 (9.3%) 22 (1.4%) 7 (0.5%)

659 (48.4%) 543 (39.9%) 142 (10.4%) 14 (1.0%) 4 (0.3%)

374 (47.6%) 311 (39.6%) 87 (11.1%) 10 (1.3%) 3 (0.4%)

27. The goal of assessment is to generate and measure effectiveness of instruction or intervention

SA A N D SD

208 (13.5%) 906 (58.9%) 250 (16.2%) 129 (8.4%) 46 (3.0%)

229 (14.6%) 980 (62.5%) 238 (15.2%) 95 (6.1%) 25 (1.6%)

299 (19.2%) 904 (58.2%) 237 (15.3%) 93 (6.0%) 21 (1.4%)

286 (21.0%) 807 (59.3%) 204 (15.0%) 56 (4.1%) 9 (0.7%)

141 (18.0%) 488 (62.2%) 116 (14.8%) 32 (4.1%) 7 (0.9%)

Note. SA = Strongly Agree; A = Agree; N = Neutral; D = Disagree; SD = Strongly Disagree; Y1BOY = Year 1 Beginning of Year; Y1EOY = Year 1 End of Year; Y2EOY= Year 2 End of Year; Y3EOY = Year 3 End of Year; Y4 EOY = Year 4 End of Year.

Table 2c
Frequency and Percentage of Instructional School Staff Beliefs Ratings Across Pilot Schools: Factor Three (Functions of Instruction)

(Each belief statement below is followed by five response rows, one per administration, in order: Y1 BOY, Y1 EOY, Y2 EOY, Y3 EOY, Y4 EOY.)

7A. Core instruction should be effective enough to result in 80% of the students achieving benchmarks in reading

SA A N D SD

286 (18.6%) 906 (59.0%) 179 (11.7%) 147 (9.6%) 18 (1.2%)

309 (19.7%) 984 (62.8%) 173 (11.1%) 88 (5.6%) 12 (0.8%)

342 (22.2%) 930 (60.4%) 182 (11.8%) 77 (5.0%) 9 (0.6%)

320 (23.4%) 803 (58.7%) 154 (11.3%) 78 (5.7%) 12 (0.9%)

179 (22.8%) 462 (58.9%) 92 (11.7%) 44 (5.6%) 8 (1.0%)

7B. Core instruction should be effective enough to result in 80% of the students achieving benchmarks in math

SA A N D SD

282 (18.9%) 875 (58.6%) 172 (11.5%) 147 (9.8%) 18 (1.2%)

302 (19.6%) 971 (63.1%) 171 (11.1%) 86 (5.6%) 9 (0.6%)

340 (22.2%) 931 (60.7%) 182 (11.9%) 75 (4.9%) 7 (0.5%)

320 (23.7%) 796 (58.9%) 155 (11.5%) 69 (5.1%) 12 (0.9%)

174 (22.2%) 469 (59.9%) 87 (11.1%) 45 (5.8%) 8 (1.0%)

8A. The primary function of supplemental instruction is to ensure that students meet grade-level benchmarks in reading

SA A N D SD

288 (18.8%) 993 (64.9%) 155 (10.1%) 79 (5.2%) 15 (1.0%)

301 (19.3%) 1065 (68.1%) 138 (8.8%) 48 (3.1%) 11 (0.7%)

368 (23.8%) 1000 (64.8%) 127 (8.2%) 44 (2.9%) 5 (0.3%)

331 (24.4%) 879 (64.7%) 104 (7.7%) 37 (2.7%) 7 (0.5%)

177 (22.6%) 508 (64.7%) 62 (7.9%) 30 (3.8%) 8 (1.0%)

8B. The primary function of supplemental instruction is to ensure that students meet grade-level benchmarks in math

SA A N D SD

278 (18.5%) 980 (65.2%) 158 (10.5%) 75 (5.0%) 13 (0.9%)

292 (18.8%) 1067 (68.7%) 138 (8.9%) 45 (2.9%) 12 (0.8%)

361 (23.4%) 999 (64.8%) 129 (8.4%) 47 (3.1%) 5 (0.3%)

329 (24.3%) 875 (64.7%) 105 (7.8%) 37 (2.7%) 7 (0.5%)

173 (22.1%) 511 (65.3%) 62 (7.9%) 29 (3.7%) 8 (1.0%)

Note. SA = Strongly Agree; A = Agree; N = Neutral; D = Disagree; SD = Strongly Disagree; Y1BOY = Year 1 Beginning of Year; Y1EOY = Year 1 End of Year; Y2EOY= Year 2 End of Year; Y3EOY = Year 3 End of Year; Y4 EOY = Year 4 End of Year.

Table 3a
Frequency and Percentage of School-Based Leadership Team Perceptions of RtI Skills Ratings Across Pilot Schools: Factor One (Perceptions of RtI Skills When Addressing Academic Issues)

(Each skill statement below is followed by five response rows, one per administration, in order: Y1 BOY, Y1 EOY, Y2 EOY, Y3 EOY, Y4 EOY.)

2A. Access the data necessary to determine the percent of students in core instruction who are achieving benchmarks (district grade-level standards) in academics

VHS HS SS MnS NS

40 (21.5%) 74 (39.8%) 45 (24.2%) 21 (11.3%) 6 (3.2%)

47 (29.2%) 73 (45.3%) 38 (23.6%) 3 (1.9%) 0 (0%)

54 (29.8%) 70 (38.7%) 52 (28.7%) 3 (1.7%) 2 (1.1%)

55 (32.9%) 69 (41.3%) 39 (23.4%) 4 (2.4%) 0 (0%)

71 (32.3%) 95 (43.2%) 42 (19.1%) 9 (4.1%) 3 (1.4%)

3A. Use data to make decisions about individuals and groups of students for the: core academic curriculum

VHS HS SS MnS NS

40 (21.5%) 84 (45.2%) 48 (25.8%) 11 (5.9%) 3 (1.6%)

38 (23.5%) 87 (53.7%) 33 (20.4%) 4 (2.5%) 0 (0%)

53 (29.6%) 81 (45.3%) 43 (24.0%) 2 (1.1%) 0 (0%)

57 (34.3%) 77 (46.4%) 31 (18.7%) 1 (0.6%) 0 (0%)

74 (33.6%) 98 (44.6%) 39 (17.7%) 9 (4.1%) 0 (0%)

4. Perform each of the following steps when identifying the problem for a student for whom concerns have been raised:

A1. Define the referral concern in terms of a replacement behavior (i.e., what the student should be able to do) instead of a referral problem for: academics

VHS HS SS MnS NS

34 (18.3%) 81 (43.6%) 57 (30.7%) 13 (7.0%) 1 (0.5%)

33 (20.4%) 91 (56.2%) 35 (21.6%) 3 (1.9%) 0 (0%)

42 (23.2%) 97 (53.6%) 36 (19.9%) 6 (3.3%) 0 (0%)

53 (31.7%) 79 (47.3%) 32 (19.2%) 3 (1.8%) 0 (0%)

62 (28.3%) 88 (40.2%) 53 (24.2%) 11 (5.0%) 5 (2.3%)

B1. Use data to define the current level of performance of the target student for: academics

VHS HS SS MnS NS

49 (26.3%) 90 (48.4%) 38 (20.4%) 7 (3.8%) 2 (1.1%)

47 (29.0%) 94 (58.0%) 19 (11.7%) 2 (1.2%) 0 (0%)

51 (28.2%) 99 (54.7%) 30 (16.6%) 1 (0.6%) 0 (0%)

58 (34.7%) 86 (51.5%) 22 (13.2%) 1 (0.6%) 0 (0%)

75 (34.4%) 94 (43.1%) 39 (17.9%) 7 (3.2%) 3 (1.4%)

C1. Determine the desired level of performance (i.e., benchmark) for: academics

VHS HS SS MnS NS

46 (25.0%) 98 (53.3%) 26 (14.1%) 13 (7.1%) 1 (0.5%)

52 (32.1%) 85 (52.5%) 23 (14.2%) 2 (1.2%) 0 (0%)

57 (31.7%) 93 (51.7%) 26 (14.4%) 4 (2.2%) 0 (0%)

62 (37.1%) 80 (47.9%) 23 (13.7%) 2 (1.2%) 0 (0%)

73 (33.3%) 97 (44.3%) 42 (19.2%) 6 (2.7%) 1 (0.5%)

D1. Determine the current level of peer performance for the same skill as the target student for: academics

VHS HS SS MnS NS

37 (20.1%) 93 (50.5%) 39 (21.2%) 15 (8.2%) 0 (0.0%)

47 (29.0%) 83 (51.2%) 30 (18.5%) 2 (1.2%) 0 (0%)

47 (26.0%) 88 (48.6%) 45 (24.9%) 1 (0.6%) 0 (0%)

52 (31.1%) 80 (47.9%) 32 (19.2%) 3 (1.8%) 0 (0%)

69 (31.5%) 99 (45.2%) 42 (19.2%) 7 (3.2%) 2 (0.9%)

E1. Calculate the gap between student current performance and the benchmark (district grade level standard) for: academics

VHS HS SS MnS NS

35 (18.8%) 65 (35.0%) 54 (29.0%) 23 (12.4%) 9 (4.8%)

34 (21.1%) 87 (54.0%) 35 (21.7%) 5 (3.1%) 0 (0%)

40 (22.1%) 60 (33.2%) 66 (36.5%) 15 (8.3%) 0 (0%)

47 (28.3%) 67 (40.4%) 44 (26.5%) 8 (4.8%) 0 (0%)

54 (24.7%) 74 (33.8%) 65 (29.7%) 20 (9.1%) 6 (2.7%)
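
Item E1 names a concrete computation. As a worked illustration only (this report does not prescribe a formula), problem-solving/RtI training materials commonly express the academic gap as the ratio of the expected benchmark to the student's current performance; the numbers and the 2.0x "significant gap" cutoff below are hypothetical.

```python
def performance_gap(benchmark: float, current: float) -> float:
    """Gap ratio: how many times the benchmark exceeds current performance."""
    return benchmark / current

# Hypothetical example: grade-level benchmark of 100 words correct per
# minute (wcpm) versus a target student currently reading 40 wcpm.
gap = performance_gap(benchmark=100, current=40)
print(f"Gap = {gap:.1f}x")                                   # Gap = 2.5x
print("significant" if gap >= 2.0 else "not significant")    # significant
```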

F1. Use gap data to determine whether core instruction should be adjusted or whether supplemental instruction should be directed to the target student for: academics

VHS HS SS MnS NS

24 (12.9%) 60 (32.4%) 60 (32.4%) 30 (16.2%) 11 (6.0%)

39 (24.2%) 80 (49.7%) 36 (22.4%) 6 (3.7%) 0 (0%)

38 (21.0%) 70 (38.7%) 64 (35.4%) 9 (5.0%) 0 (0%)

48 (28.7%) 75 (44.9%) 39 (23.4%) 4 (2.4%) 1 (0.6%)

62 (28.3%) 75 (34.3%) 60 (27.4%) 16 (7.3%) 6 (2.7%)
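
Item F1 ties the gap data to a tiered decision. A minimal sketch of one common heuristic follows, keyed to the 80% core-effectiveness criterion that appears in belief items 7A/7B; the threshold and function here are illustrative assumptions, not the project's rule.

```python
def instruction_decision(pct_peers_at_benchmark: float,
                         core_threshold: float = 0.80) -> str:
    """Illustrative heuristic (assumed, not prescribed by this report):
    if fewer than ~80% of students in core instruction reach benchmarks
    -- the effectiveness criterion in belief items 7A/7B -- the gap is
    likely a core-instruction problem; otherwise, direct supplemental
    instruction to the target student."""
    if pct_peers_at_benchmark < core_threshold:
        return "adjust core instruction"
    return "provide supplemental instruction to the target student"

print(instruction_decision(0.62))  # adjust core instruction
print(instruction_decision(0.87))  # provide supplemental instruction ...
```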

5A. Develop potential reasons (hypotheses) that a student or group of students is/are not achieving desired levels of performance (i.e., benchmarks) for: academics

VHS HS SS MnS NS

24 (12.9%) 91 (48.9%) 51 (27.4%) 19 (10.2%) 1 (0.5%)

32 (19.8%) 76 (46.9%) 50 (30.9%) 4 (2.5%) 0 (0%)

35 (19.3%) 90 (49.7%) 49 (27.1%) 7 (3.9%) 0 (0%)

45 (27.0%) 83 (49.7%) 35 (21.0%) 4 (2.4%) 0 (0%)

59 (26.8%) 90 (40.9%) 60 (27.3%) 9 (4.1%) 2 (0.9%)

6A. Identify the most appropriate type(s) of data to use for determining reasons (hypotheses) that are likely to be contributing to the problem for: academics

VHS HS SS MnS NS

24 (13.0%) 72 (38.9%) 54 (29.2%) 31 (16.8%) 4 (2.2%)

28 (17.3%) 79 (48.8%) 49 (30.3%) 6 (3.7%) 0 (0%)

32 (17.7%) 79 (43.7%) 61 (33.7%) 9 (5.0%) 0 (0%)

44 (26.4%) 75 (44.9%) 45 (27.0%) 3 (1.8%) 0 (0%)

50 (22.7%) 93 (42.3%) 59 (26.8%) 14 (6.4%) 4 (1.8%)

7A. Identify the appropriate supplemental intervention available in my building for a student identified as at-risk for: academics

VHS HS SS MnS NS

22 (11.8%) 72 (38.7%) 68 (36.6%) 22 (11.8%) 2 (1.1%)

30 (18.6%) 77 (47.8%) 49 (30.4%) 5 (3.1%) 0 (0%)

32 (17.7%) 85 (47.0%) 54 (29.8%) 9 (5.0%) 1 (0.6%)

36 (21.6%) 79 (47.3%) 46 (27.5%) 6 (3.6%) 0 (0%)

50 (22.7%) 89 (40.5%) 62 (28.2%) 14 (6.4%) 5 (2.3%)

8. Access resources (e.g., internet sources, professional literature) to develop evidence-based interventions for:

A. Academic core curricula

VHS HS SS MnS NS

40 (21.5%) 70 (37.6%) 57 (30.7%) 13 (7.0%) 6 (3.2%)

36 (22.4%) 79 (49.1%) 41 (25.5%) 5 (3.1%) 0 (0%)

43 (23.8%) 91 (50.3%) 37 (20.4%) 10 (5.5%) 0 (0%)

46 (27.5%) 86 (51.5%) 30 (18.0%) 4 (2.4%) 1 (0.6%)

59 (26.8%) 95 (43.2%) 52 (23.6%) 12 (5.5%) 2 (0.9%)

C. Academic supplemental curricula

VHS HS SS MnS NS

33 (17.7%) 75 (40.3%) 56 (30.1%) 16 (8.6%) 6 (3.2%)

31 (19.3%) 78 (48.5%) 42 (26.1%) 10 (6.2%) 0 (0%)

41 (22.8%) 86 (47.8%) 41 (22.8%) 12 (6.7%) 0 (0%)

41 (24.6%) 92 (55.1%) 30 (18.0%) 4 (2.4%) 0 (0%)

53 (24.1%) 101 (45.9%) 48 (21.8%) 16 (7.3%) 2 (0.9%)

E. Academic individualized intervention plans

VHS HS SS MnS NS

35 (18.8%) 74 (39.8%) 55 (29.6%) 19 (10.2%) 3 (1.6%)

34 (21.1%) 75 (46.6%) 42 (26.1%) 10 (6.2%) 0 (0%)

36 (20.0%) 87 (48.3%) 49 (27.2%) 8 (4.4%) 0 (0%)

39 (23.4%) 89 (53.3%) 35 (21.0%) 4 (2.4%) 0 (0%)

57 (25.9%) 86 (39.1%) 60 (27.3%) 15 (6.8%) 2 (0.9%)

9A. Ensure that any supplemental and/or intensive interventions are integrated with core instruction in the general education classroom: academics

VHS HS SS MnS NS

31 (17.0%) 74 (40.4%) 47 (25.7%) 28 (15.3%) 3 (1.6%)

30 (18.6%) 77 (47.8%) 49 (30.4%) 5 (3.1%) 0 (0%)

33 (18.2%) 71 (39.2%) 63 (34.8%) 13 (7.2%) 1 (0.6%)

37 (22.2%) 85 (50.9%) 42 (25.2%) 3 (1.8%) 0 (0%)

54 (24.7%) 91 (41.5%) 54 (24.7%) 18 (8.2%) 2 (0.9%)

10A. Ensure that the proposed intervention plan is supported by the data that were collected for: academics

VHS HS SS MnS NS

31 (16.7%) 74 (39.8%) 62 (33.3%) 17 (9.1%) 2 (1.1%)

27 (16.8%) 90 (55.9%) 40 (24.8%) 4 (2.5%) 0 (0%)

37 (20.6%) 86 (47.8%) 48 (26.7%) 8 (4.4%) 1 (0.6%)

54 (32.5%) 75 (45.2%) 35 (21.1%) 2 (1.2%) 0 (0%)

65 (29.6%) 90 (40.9%) 51 (23.2%) 11 (5.0%) 3 (1.4%)

11A. Provide the support necessary to ensure that the intervention is implemented appropriately for: academics

VHS HS SS MnS NS

28 (15.0%) 85 (45.7%) 46 (24.7%) 23 (12.4%) 4 (2.2%)

31 (19.4%) 86 (53.8%) 37 (23.1%) 6 (3.8%) 0 (0%)

33 (18.2%) 85 (47.0%) 51 (28.2%) 11 (6.1%) 1 (0.6%)

42 (25.3%) 82 (49.4%) 38 (22.9%) 3 (1.8%) 1 (0.6%)

57 (25.9%) 94 (42.7%) 53 (24.1%) 15 (6.8%) 1 (0.5%)

12A. Determine if an intervention was implemented as it was intended for: academics

VHS HS SS MnS NS

33 (17.9%) 74 (40.2%) 55 (29.9%) 18 (9.8%) 4 (2.2%)

34 (21.1%) 87 (54.0%) 36 (22.4%) 4 (2.5%) 0 (0%)

37 (20.6%) 90 (50.0%) 44 (24.4%) 8 (4.4%) 1 (0.6%)

47 (28.5%) 81 (49.1%) 33 (20.0%) 4 (2.4%) 0 (0%)

61 (27.7%) 91 (41.4%) 55 (25.0%) 11 (5.0%) 2 (0.9%)

13A. Select appropriate data (e.g., Curriculum-Based Measurement, DIBELS, FCAT, behavioral observations) to use for progress monitoring of student performance during interventions: academics

VHS HS SS MnS NS

48 (26.0%) 74 (40.0%) 46 (24.9%) 13 (7.0%) 4 (2.2%)

43 (26.7%) 76 (47.2%) 37 (23.0%) 5 (3.1%) 0 (0%)

51 (28.3%) 84 (46.7%) 39 (21.7%) 5 (2.8%) 1 (0.6%)

50 (30.1%) 88 (53.0%) 24 (14.5%) 4 (2.4%) 0 (0%)

65 (29.6%) 93 (42.3%) 45 (20.5%) 12 (5.5%) 5 (2.3%)

16. Make modifications to intervention plans based on student response to intervention.

VHS HS SS MnS NS

26 (14.0%) 93 (50.0%) 49 (26.3%) 16 (8.6%) 2 (1.1%)

34 (21.3%) 86 (53.8%) 38 (23.8%) 2 (1.3%) 0 (0%)

29 (16.0%) 100 (55.3%) 51 (28.2%) 1 (0.6%) 0 (0%)

45 (27.3%) 83 (50.3%) 36 (21.8%) 1 (0.6%) 0 (0%)

60 (27.3%) 108 (49.1%) 41 (18.6%) 9 (4.1%) 2 (0.9%)

17. Use appropriate data to differentiate between students who have not learned skills (e.g., did not have adequate exposure to effective instruction, not ready, got too far behind) from those who have barriers to learning due to a disability.

VHS HS SS MnS NS

19 (10.3%) 78 (42.4%) 54 (29.4%) 26 (14.1%) 7 (3.8%)

24 (15.2%) 71 (44.9%) 54 (34.2%) 9 (5.7%) 0 (0%)

21 (11.8%) 74 (41.6%) 63 (35.4%) 17 (9.6%) 3 (1.7%)

41 (25.0%) 73 (44.5%) 43 (26.2%) 6 (3.7%) 1 (0.6%)

53 (24.1%) 99 (45.0%) 49 (22.3%) 16 (7.3%) 3 (1.4%)

18. Collect the following types of data:

A. Curriculum-Based Measurement

VHS HS SS MnS NS

38 (20.4%) 77 (41.1%) 48 (25.8%) 12 (6.5%) 11 (5.9%)

35 (21.6%) 84 (51.9%) 31 (19.1%) 9 (5.6%) 3 (1.9%)

49 (27.2%) 89 (49.4%) 31 (17.2%) 6 (3.3%) 5 (2.8%)

54 (32.7%) 71 (43.0%) 31 (18.8%) 8 (4.9%) 1 (0.6%)

71 (32.3%) 85 (38.6%) 46 (20.9%) 13 (5.9%) 5 (2.3%)

B. DIBELS

VHS HS SS MnS NS

62 (33.3%) 69 (37.1%) 36 (19.4%) 10 (5.4%) 9 (4.8%)

63 (39.1%) 66 (41.0%) 18 (11.2%) 10 (6.2%) 4 (2.5%)

72 (40.0%) 69 (38.3%) 27 (15.0%) 7 (3.9%) 5 (2.8%)

55 (33.7%) 65 (39.9%) 30 (18.4%) 9 (5.5%) 4 (2.5%)

67 (30.5%) 75 (34.1%) 46 (20.9%) 16 (7.3%) 16 (7.3%)

C. Access data from appropriate district- or school-wide assessments

VHS HS SS MnS NS

48 (25.8%) 79 (42.5%) 41 (22.0%) 9 (4.8%) 9 (4.8%)

47 (29.2%) 77 (47.8%) 29 (18.0%) 8 (5.0%) 0 (0%)

62 (34.4%) 81 (45.0%) 31 (17.2%) 3 (1.7%) 3 (1.7%)

55 (33.3%) 75 (45.5%) 27 (16.4%) 7 (4.2%) 1 (0.6%)

76 (34.6%) 95 (43.2%) 33 (15.0%) 11 (5.0%) 5 (2.3%)

20C. Use technology in the following ways: Use the Progress Monitoring and Reporting Network (PMRN)

VHS HS SS MnS NS

55 (29.6%) 49 (26.3%) 35 (18.8%) 23 (12.4%) 24 (12.9%)

52 (32.3%) 55 (34.2%) 32 (19.9%) 18 (11.2%) 4 (2.5%)

59 (32.6%) 59 (32.6%) 41 (22.7%) 14 (7.7%) 8 (4.4%)

56 (33.9%) 68 (41.2%) 26 (15.8%) 14 (8.5%) 1 (0.6%)

67 (30.5%) 86 (39.1%) 45 (20.5%) 14 (6.4%) 8 (3.6%)

Note. VHS = Very Highly Skilled; HS = Highly Skilled; SS = Somewhat Skilled; MnS = Minimally Skilled; NS = Not Skilled; Y1BOY = Year 1 Beginning of Year; Y1EOY = Year 1 End of Year; Y2EOY= Year 2 End of Year; Y3EOY = Year 3 End of Year; Y4 EOY = Year 4 End of Year.

Table 3b
Frequency and Percentage of School-Based Leadership Team Perceptions of RtI Skills Ratings Across Pilot Schools: Factor Two (Perceptions of RtI Skills When Addressing Behavior Issues)

(Each skill statement below is followed by five response rows, one per administration, in order: Y1 BOY, Y1 EOY, Y2 EOY, Y3 EOY, Y4 EOY.)

2B. Access the data necessary to determine the percent of students in core instruction who are achieving benchmarks (district grade-level standards) in behavior

VHS HS SS MnS NS

16 (8.7%) 43 (23.2%) 61 (33.0%) 47 (25.4%) 18 (9.7%)

20 (12.5%) 61 (38.1%) 53 (33.1%) 13 (8.1%) 13 (8.1%)

23 (12.8%) 63 (35.0%) 63 (35.0%) 16 (8.9%) 15 (8.3%)

31 (18.9%) 57 (34.8%) 42 (25.6%) 28 (17.1%) 6 (3.7%)

42 (19.1%) 87 (39.6%) 56 (25.5%) 23 (10.5%) 12 (5.5%)

3B. Use data to make decisions about individuals and groups of students for the: core/building discipline plan

VHS HS SS MnS NS

21 (11.4%) 64 (34.6%) 61 (33.0%) 29 (15.7%) 10 (5.4%)

19 (11.7%) 79 (48.8%) 45 (27.8%) 15 (9.3%) 4 (2.5%)

26 (14.5%) 67 (37.4%) 70 (39.1%) 13 (7.3%) 3 (1.7%)

39 (23.5%) 59 (35.5%) 46 (27.7%) 17 (10.2%) 5 (3.0%)

51 (23.2%) 99 (45.0%) 46 (20.9%) 17 (7.7%) 7 (3.2%)

4. Perform each of the following steps when identifying the problem for a student for whom concerns have been raised:

4A2. Define the referral concern in terms of a replacement behavior (i.e., what the student should be able to do) instead of a referral problem for: behavior

VHS HS SS MnS NS

32 (17.2%) 79 (42.5%) 52 (28.0%) 19 (10.2%) 4 (2.2%)

28 (17.3%) 80 (49.4%) 42 (25.9%) 9 (5.6%) 3 (1.9%)

34 (19.0%) 85 (47.5%) 45 (25.1%) 10 (5.6%) 5 (2.8%)

41 (24.6%) 71 (42.5%) 39 (23.4%) 14 (8.4%) 2 (1.2%)

57 (26.0%) 81 (37.0%) 56 (25.6%) 18 (8.2%) 7 (3.2%)

4B2. Use data to define the current level of performance of the target student for: behavior

VHS HS SS MnS NS

26 (14.0%) 74 (39.8%) 57 (30.7%) 23 (12.4%) 6 (3.2%)

32 (19.8%) 74 (45.7%) 43 (26.5%) 11 (6.8%) 2 (1.2%)

31 (17.2%) 71 (39.4%) 64 (35.6%) 8 (4.4%) 6 (3.3%)

35 (21.1%) 75 (45.2%) 36 (21.7%) 17 (10.2%) 3 (1.8%)

53 (24.3%) 91 (41.7%) 49 (22.5%) 16 (7.3%) 9 (4.1%)

4C2. Determine the desired level of performance (i.e., benchmark) for: behavior

VHS HS SS MnS NS

31 (16.8%) 87 (47.0%) 40 (21.6%) 19 (10.3%) 8 (4.3%)

33 (20.4%) 74 (45.7%) 45 (27.8%) 7 (4.3%) 3 (1.9%)

36 (20.2%) 78 (43.8%) 48 (27.0%) 9 (5.1%) 7 (3.9%)

46 (27.7%) 62 (37.4%) 39 (23.5%) 16 (9.6%) 3 (1.8%)

56 (25.6%) 95 (43.4%) 51 (23.3%) 11 (5.0%) 6 (2.7%)

4D2. Determine the current level of peer performance for the same skill as the target student for: behavior

VHS HS SS MnS NS

24 (13.0%) 80 (43.2%) 55 (29.7%) 18 (9.7%) 8 (4.3%)

32 (19.8%) 65 (40.1%) 52 (32.1%) 8 (4.9%) 5 (3.1%)

35 (19.6%) 68 (38.0%) 59 (33.0%) 9 (5.0%) 8 (4.5%)

43 (26.1%) 60 (36.4%) 40 (24.2%) 17 (10.3%) 5 (3.0%)

51 (23.3%) 96 (43.8%) 52 (23.7%) 14 (6.4%) 6 (2.7%)

4E2. Calculate the gap between student current performance and the benchmark (district grade level standard) for: behavior

VHS HS SS MnS NS

20 (10.9%) 51 (27.7%) 63 (34.2%) 31 (16.9%) 19 (10.3%)

23 (14.3%) 56 (34.8%) 68 (42.2%) 8 (5.0%) 6 (3.7%)

23 (12.8%) 48 (26.7%) 78 (43.3%) 21 (11.7%) 10 (5.6%)

32 (19.4%) 54 (32.7%) 53 (32.1%) 18 (10.9%) 8 (4.9%)

36 (16.4%) 70 (32.0%) 76 (34.7%) 25 (11.4%) 12 (5.5%)

4F2. Use gap data to determine whether core instruction should be adjusted or whether supplemental instruction should be directed to the target student for: behavior

VHS HS SS MnS NS

20 (10.8%) 51 (27.6%) 58 (31.4%) 38 (20.5%) 18 (9.7%)

28 (17.3%) 65 (40.1%) 58 (35.8%) 8 (4.9%) 3 (1.9%)

25 (13.9%) 57 (31.7%) 72 (40.0%) 19 (10.6%) 7 (3.9%)

31 (18.7%) 65 (39.2%) 46 (27.7%) 18 (10.8%) 6 (3.6%)

47 (21.5%) 70 (32.0%) 69 (31.5%) 24 (11.0%) 9 (4.1%)

5B. Develop potential reasons (hypotheses) that a student or group of students is/are not achieving desired levels of performance (i.e., benchmarks) for: behavior

VHS HS SS MnS NS

23 (12.4%) 80 (43.0%) 55 (29.6%) 23 (12.4%) 5 (2.7%)

29 (17.9%) 61 (37.7%) 63 (38.9%) 7 (4.3%) 2 (1.2%)

30 (16.8%) 72 (40.2%) 57 (31.8%) 15 (8.4%) 5 (2.8%)

39 (23.4%) 65 (38.9%) 48 (28.7%) 12 (7.2%) 3 (1.8%)

53 (24.1%) 85 (38.6%) 60 (27.3%) 15 (6.8%) 7 (3.2%)

6B. Identify the most appropriate type(s) of data to use for determining reasons (hypotheses) that are likely to be contributing to the problem for: behavior

VHS HS SS MnS NS

16 (8.7%) 55 (29.7%) 71 (38.4%) 35 (18.9%) 8 (4.3%)

23 (14.2%) 57 (35.2%) 66 (40.7%) 13 (8.0%) 3 (1.9%)

18 (10.0%) 65 (36.1%) 71 (39.4%) 19 (10.6%) 7 (3.9%)

32 (19.2%) 62 (37.1%) 54 (32.3%) 16 (9.6%) 3 (1.8%)

41 (18.6%) 80 (36.4%) 64 (29.1%) 25 (11.4%) 10 (4.6%)

7B. Identify the appropriate supplemental intervention available in my building for a student identified as at-risk for: behavior

VHS HS SS MnS NS

16 (8.6%) 60 (32.3%) 70 (37.6%) 33 (17.7%) 7 (3.8%)

21 (13.0%) 65 (40.4%) 57 (35.4%) 14 (8.7%) 4 (2.5%)

17 (9.4%) 68 (37.8%) 68 (37.8%) 20 (11.1%) 7 (3.9%)

21 (12.7%) 69 (41.6%) 49 (29.5%) 22 (13.3%) 5 (3.0%)

41 (18.6%) 75 (34.1%) 69 (31.4%) 26 (11.8%) 9 (4.1%)

8B. Access resources (e.g., internet sources, professional literature) to develop evidence-based interventions for: Behavioral core curricula

VHS HS SS MnS NS

22 (11.9%) 60 (32.4%) 64 (34.6%) 29 (15.7%) 10 (5.4%)

16 (9.9%) 73 (45.3%) 56 (34.8%) 15 (9.3%) 1 (0.6%)

28 (15.6%) 67 (37.2%) 64 (35.6%) 18 (10.0%) 3 (1.7%)

26 (15.6%) 65 (38.9%) 54 (32.3%) 16 (9.6%) 6 (3.6%)

43 (19.6%) 86 (39.1%) 60 (27.3%) 25 (11.4%) 6 (2.7%)

8D. Access resources (e.g., internet sources, professional literature) to develop evidence-based interventions for: Behavioral supplemental curricula

VHS HS SS MnS NS

18 (9.7%) 62 (33.3%) 62 (33.3%) 33 (17.7%) 11 (5.9%)

16 (9.9%) 63 (39.1%) 64 (39.8%) 15 (9.3%) 3 (1.9%)

23 (12.8%) 62 (34.4%) 71 (39.4%) 20 (11.1%) 4 (2.2%)

23 (13.8%) 71 (42.5%) 50 (29.9%) 17 (10.2%) 6 (3.6%)

39 (17.7%) 82 (37.3%) 63 (28.6%) 30 (13.6%) 6 (2.7%)

8F. Access resources (e.g., internet sources, professional literature) to develop evidence-based interventions for: Behavioral individualized intervention plans

VHS HS SS MnS NS

20 (10.8%) 64 (34.4%) 64 (34.4%) 28 (15.1%) 10 (5.4%)

21 (13.0%) 67 (41.6%) 56 (34.8%) 13 (8.1%) 4 (2.5%)

22 (12.2%) 64 (35.6%) 74 (41.1%) 15 (8.3%) 5 (2.8%)

28 (16.8%) 65 (38.9%) 50 (29.9%) 16 (9.6%) 8 (4.8%)

44 (20.0%) 84 (38.2%) 54 (24.6%) 32 (14.6%) 6 (2.7%)

9B. Ensure that any supplemental and/or intensive interventions are integrated with core instruction in the general education classroom: behavior

VHS HS SS MnS NS

16 (8.7%) 54 (29.5%) 74 (40.4%) 34 (18.6%) 5 (2.7%)

19 (11.8%) 62 (38.5%) 65 (40.4%) 13 (8.1%) 2 (1.2%)

16 (8.9%) 62 (34.4%) 76 (42.2%) 20 (11.1%) 6 (3.3%)

25 (15.0%) 67 (40.1%) 56 (33.5%) 13 (7.8%) 6 (3.6%)

41 (18.7%) 89 (40.6%) 53 (24.2%) 30 (13.7%) 6 (2.7%)

10B. Ensure that the proposed intervention plan is supported by the data that were collected for: behavior

VHS HS SS MnS NS

18 (9.7%) 62 (33.5%) 71 (38.4%) 26 (14.1%) 8 (4.3%)

20 (12.4%) 76 (47.2%) 55 (34.2%) 7 (4.4%) 3 (1.9%)

23 (12.9%) 68 (38.0%) 63 (35.2%) 17 (9.5%) 8 (4.5%)

39 (23.5%) 59 (35.5%) 53 (31.9%) 12 (7.2%) 3 (1.8%)

46 (21.0%) 86 (39.3%) 57 (26.0%) 24 (11.0%) 6 (2.7%)

11B. Provide the support necessary to ensure that the intervention is implemented appropriately for: behavior

VHS HS SS MnS NS

19 (10.2%) 68 (36.6%) 62 (33.3%) 28 (15.1%) 9 (4.8%)

19 (12.0%) 76 (47.8%) 54 (34.0%) 8 (5.0%) 2 (1.3%)

23 (12.8%) 70 (38.9%) 62 (34.4%) 20 (11.1%) 5 (2.8%)

30 (18.2%) 69 (41.8%) 44 (26.7%) 16 (9.7%) 6 (3.6%)

47 (21.4%) 85 (38.6%) 55 (25.0%) 28 (12.7%) 5 (2.3%)

12B. Determine if an intervention was implemented as it was intended for: behavior

VHS HS SS MnS NS

24 (13.0%) 70 (38.0%) 60 (32.6%) 21 (11.4%) 9 (4.9%)

25 (15.6%) 79 (49.4%) 47 (29.4%) 7 (4.4%) 2 (1.3%)

29 (16.3%) 73 (41.0%) 53 (29.8%) 18 (10.1%) 5 (2.8%)

35 (21.2%) 67 (40.6%) 42 (25.5%) 14 (8.5%) 7 (4.2%)

50 (22.7%) 88 (40.0%) 52 (23.6%) 24 (10.9%) 6 (2.7%)

13B. Select appropriate data (e.g., Curriculum-Based Measurement, DIBELS, FCAT, behavioral observations) to use for progress monitoring of student performance during interventions: behavior

VHS HS SS MnS NS

26 (14.1%) 58 (31.4%) 58 (31.4%) 31 (16.8%) 12 (6.5%)

20 (12.4%) 63 (39.1%) 62 (38.5%) 12 (7.5%) 4 (2.5%)

25 (14.0%) 62 (34.6%) 69 (38.6%) 17 (9.5%) 6 (3.4%)

29 (17.5%) 58 (34.9%) 53 (31.9%) 19 (11.5%) 7 (4.2%)

47 (21.4%) 84 (38.2%) 53 (24.1%) 27 (12.3%) 9 (4.1%)

18D. Collect the following types of data: Standard behavioral observations

VHS HS SS MnS NS

39 (21.1%) 74 (40.0%) 46 (24.9%) 19 (10.3%) 7 (3.8%)

27 (16.7%) 84 (51.9%) 40 (24.7%) 10 (6.2%) 1 (0.6%)

31 (17.2%) 85 (47.2%) 50 (27.8%) 10 (5.6%) 4 (2.2%)

43 (26.4%) 62 (38.0%) 41 (25.2%) 11 (6.8%) 6 (3.7%)

60 (27.3%) 80 (36.4%) 55 (25.0%) 19 (8.6%) 6 (2.7%)

Note. VHS = Very Highly Skilled; HS = Highly Skilled; SS = Somewhat Skilled; MnS = Minimally Skilled; NS = Not Skilled; Y1BOY = Year 1 Beginning of Year; Y1EOY = Year 1 End of Year; Y2EOY= Year 2 End of Year; Y3EOY = Year 3 End of Year; Y4 EOY = Year 4 End of Year.

Table 3c
Frequency and Percentage of School-Based Leadership Teams' Perceptions of RtI Skills Ratings: Factor Three (Perceptions of RtI Skills in Accessing, Interpreting, and Graphing Data)

(Each skill statement below is followed by five response rows, one per administration, in order: Y1 BOY, Y1 EOY, Y2 EOY, Y3 EOY, Y4 EOY.)

14. Construct graphs for large group, small group, and individual students:

A. Graph target student data

VHS HS SS MnS NS

20 (10.8%) 61 (32.8%) 63 (33.9%) 28 (15.1%) 14 (7.5%)

25 (15.5%) 46 (28.6%) 59 (36.7%) 22 (13.7%) 9 (5.6%)

20 (11.1%) 66 (36.5%) 63 (34.8%) 26 (14.4%) 6 (3.3%)

35 (21.2%) 67 (40.6%) 39 (23.6%) 20 (12.1%) 4 (2.4%)

51 (23.2%) 71 (32.3%) 61 (27.7%) 31 (14.1%) 6 (2.7%)

B. Graph benchmark data

VHS HS SS MnS NS

19 (10.2%) 51 (27.4%) 73 (39.3%) 28 (15.1%) 15 (8.1%)

23 (14.3%) 44 (27.3%) 64 (39.8%) 22 (13.7%) 8 (5.0%)

22 (12.2%) 64 (35.6%) 61 (33.9%) 27 (15.0%) 6 (3.3%)

34 (20.6%) 69 (41.8%) 36 (21.8%) 21 (12.7%) 5 (3.0%)

47 (21.4%) 68 (30.9%) 67 (30.5%) 32 (14.6%) 6 (2.7%)

C. Graph peer data

VHS HS SS MnS NS

18 (9.7%) 55 (29.6%) 64 (34.4%) 32 (17.2%) 17 (9.1%)

23 (14.3%) 45 (28.0%) 60 (37.3%) 25 (15.5%) 8 (5.0%)

21 (11.7%) 64 (35.6%) 56 (31.1%) 31 (17.2%) 8 (4.4%)

31 (18.8%) 66 (40.0%) 42 (25.5%) 21 (12.7%) 5 (3.0%)

47 (21.4%) 67 (30.5%) 66 (30.0%) 34 (15.5%) 6 (2.7%)

D. Draw an aimline

VHS HS SS MnS NS

12 (6.5%) 44 (23.7%) 57 (30.7%) 38 (20.4%) 35 (18.8%)

24 (14.8%) 37 (22.8%) 70 (43.2%) 22 (13.6%) 9 (5.6%)

17 (9.5%) 44 (24.6%) 65 (36.3%) 42 (23.5%) 11 (6.2%)

31 (18.8%) 62 (37.6%) 43 (26.1%) 23 (13.9%) 6 (3.6%)

37 (16.8%) 52 (23.6%) 74 (33.6%) 40 (18.2%) 17 (7.7%)

E. Draw a trendline

VHS HS SS MnS NS

12 (6.5%) 38 (20.5%) 63 (34.1%) 35 (18.9%) 37 (20.0%)

22 (13.6%) 38 (23.5%) 71 (43.8%) 23 (14.2%) 8 (4.9%)

20 (11.2%) 45 (25.1%) 63 (35.2%) 39 (21.8%) 12 (6.7%)

33 (20.0%) 59 (35.8%) 45 (27.3%) 23 (13.9%) 5 (3.0%)

42 (19.1%) 48 (21.8%) 77 (35.0%) 38 (17.3%) 15 (6.8%)

15. Interpret graphed progress monitoring data to make decisions about the degree to which a student is responding to intervention (e.g., positive, questionable or poor response).

VHS HS SS MnS NS

32 (17.2%) 75 (40.3%) 54 (29.0%) 21 (11.3%) 4 (2.2%)

36 (22.4%) 91 (56.5%) 29 (18.0%) 5 (3.1%) 0 (0%)

33 (18.2%) 92 (50.8%) 49 (27.1%) 7 (3.9%) 0 (0%)

59 (35.8%) 81 (49.1%) 21 (12.7%) 4 (2.4%) 0 (0%)

68 (30.9%) 94 (42.7%) 48 (21.8%) 8 (3.6%) 2 (0.9%)
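
Items 14D, 14E, and 15 involve related computations. The sketch below is a minimal illustration under stated assumptions, not the project's scoring rules: the aimline runs from the baseline score to a goal, the trendline is an ordinary least-squares fit to the progress-monitoring scores, and the response category comes from comparing the two slopes (the data and the 50% cutoff are hypothetical).

```python
import statistics  # statistics.linear_regression requires Python 3.10+

weeks  = [0, 1, 2, 3, 4, 5, 6, 7]
scores = [22, 24, 23, 27, 29, 30, 33, 34]  # hypothetical weekly wcpm probes

# Aimline slope: expected weekly growth from baseline to an end-of-period goal.
baseline, goal, total_weeks = scores[0], 50, 16
aim_slope = (goal - baseline) / total_weeks

# Trendline slope: ordinary least-squares fit to the observed scores.
trend_slope = statistics.linear_regression(weeks, scores).slope

# Hypothetical classification rule for positive/questionable/poor response.
if trend_slope >= aim_slope:
    response = "positive"
elif trend_slope >= 0.5 * aim_slope:
    response = "questionable"
else:
    response = "poor"

print(f"aimline: {aim_slope:.2f}/wk, trendline: {trend_slope:.2f}/wk -> {response}")
```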

19. Disaggregate data by race, gender, free/reduced lunch, language proficiency, and disability status

VHS HS SS MnS NS

32 (17.4%) 55 (29.9%) 49 (26.6%) 34 (18.5%) 14 (7.6%)

27 (17.2%) 59 (37.6%) 55 (35.0%) 13 (8.3%) 3 (1.9%)

30 (16.7%) 67 (37.2%) 57 (31.7%) 20 (11.1%) 6 (3.3%)

44 (26.8%) 56 (34.2%) 41 (25.0%) 17 (10.4%) 6 (3.7%)

62 (28.3%) 76 (34.7%) 54 (24.7%) 16 (7.3%) 11 (5.0%)
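
Item 19's disaggregation is straightforward to mechanize. A small pandas sketch follows, with hypothetical column names and a hypothetical benchmark of 40; it is illustrative only, not the project's reporting procedure.

```python
import pandas as pd

# Hypothetical student-level records; column names are illustrative only.
df = pd.DataFrame({
    "gender": ["F", "M", "F", "M", "F", "M"],
    "frl":    [True, False, True, True, False, False],  # free/reduced lunch
    "score":  [41, 38, 35, 44, 47, 33],
})

# Percent of each subgroup at or above a hypothetical benchmark of 40,
# repeated for each demographic variable of interest.
for group_col in ["gender", "frl"]:
    pct = df.groupby(group_col)["score"].apply(
        lambda s: round((s >= 40).mean() * 100, 1))
    print(pct, end="\n\n")
```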

20. Use technology in the following ways:

A. Access the internet to locate sources of academic and behavioral evidence-based interventions.

VHS HS SS MnS NS

61 (32.8%) 72 (38.7%) 43 (23.1%) 10 (5.4%) 0 (0%)

54 (33.3%) 77 (47.5%) 25 (15.4%) 6 (3.7%) 0 (0%)

58 (32.0%) 85 (47.0%) 30 (16.6%) 5 (2.8%) 3 (1.7%)

71 (43.3%) 64 (39.0%) 27 (16.5%) 1 (0.6%) 1 (0.6%)

80 (36.4%) 90 (40.9%) 35 (15.9%) 9 (4.1%) 6 (2.7%)

B. Use electronic data collection tools (e.g., PDAs)

VHS HS SS MnS NS

22 (11.8%) 43 (23.1%) 55 (29.6%) 38 (20.4%) 28 (15.1%)

19 (11.8%) 48 (29.8%) 49 (30.4%) 31 (19.3%) 14 (8.7%)

18 (10.1%) 43 (24.0%) 68 (38.0%) 30 (16.8%) 20 (11.2%)

33 (20.3%) 53 (32.5%) 43 (26.4%) 24 (14.7%) 10 (6.1%)

46 (20.9%) 62 (28.2%) 64 (29.1%) 30 (13.6%) 18 (8.2%)

D. Use the School-Wide Information System (SWIS) for Positive Behavior Support

VHS HS SS MnS NS

11 (6.0%) 14 (7.7%) 38 (20.8%) 44 (24.0%) 76 (41.5%)

12 (7.6%) 23 (14.5%) 40 (25.2%) 39 (24.5%) 45 (28.3%)

13 (7.3%) 31 (17.3%) 44 (24.6%) 36 (20.1%) 55 (30.7%)

22 (13.4%) 34 (20.7%) 41 (25.0%) 23 (14.0%) 44 (26.8%)

26 (11.8%) 59 (26.8%) 53 (24.1%) 39 (17.7%) 43 (19.6%)

E. Graph and display student and school data

VHS HS SS MnS NS

25 (13.5%) 60 (32.4%) 49 (26.5%) 38 (20.5%) 13 (7.0%)

27 (16.7%) 56 (34.6%) 46 (28.4%) 27 (16.7%) 6 (3.7%)

28 (15.6%) 61 (33.9%) 63 (35.0%) 22 (12.2%) 6 (3.3%)

44 (27.0%) 57 (35.0%) 41 (25.2%) 17 (10.4%) 4 (2.5%)

56 (25.5%) 70 (31.8%) 61 (27.7%) 23 (10.5%) 10 (4.6%)

21. Facilitate a Problem Solving Team (Student Support Team, Intervention Assistance Team, School-Based Intervention Team, Child Study Team) meeting.

VHS HS SS MnS NS

33 (17.7%) 68 (36.6%) 45 (24.2%) 29 (15.6%) 11 (5.9%)

35 (21.7%) 54 (33.5%) 53 (32.9%) 16 (9.9%) 3 (1.9%)

25 (13.8%) 74 (40.8%) 65 (35.9%) 14 (7.7%) 3 (1.7%)

42 (26.4%) 54 (34.0%) 40 (25.2%) 19 (12.0%) 4 (2.5%)

57 (25.9%) 71 (32.3%) 63 (28.6%) 20 (9.1%) 9 (4.1%)

Note. VHS = Very Highly Skilled; HS = Highly Skilled; SS = Somewhat Skilled; MnS = Minimally Skilled; NS = Not Skilled; Y1BOY = Year 1 Beginning of Year; Y1EOY = Year 1 End of Year; Y2EOY= Year 2 End of Year; Y3EOY = Year 3 End of Year; Y4 EOY = Year 4 End of Year.

Table 4a
Frequency and Percentage of Total School Perceptions of RtI Skills Ratings Across Pilot Schools: Factor One (Perceptions of RtI Skills When Addressing Academic Issues)

(Each skill statement below is followed by five response rows, one per administration, in order: Y1 BOY, Y1 EOY, Y2 EOY, Y3 EOY, Y4 EOY.)

2A. Access the data necessary to determine the percent of students in core instruction who are achieving benchmarks (district grade-level standards) in academics

VHS HS SS MnS NS

160 (11.6%) 637 (46.0%) 405 (29.3%) 103 (7.4%) 79 (5.7%)

240 (15.6%) 731 (47.5%) 381 (24.7%) 106 (6.9%) 82 (5.3%)

205 (13.3%) 806 (52.4%) 382 (24.8%) 88 (5.7%) 58 (3.8%)

233 (17.1%) 744 (54.8%) 289 (21.3%) 61 (4.5%) 32 (2.4%)

91 (12.6%) 358 (49.5%) 189 (26.1%) 58 (8.0%) 28 (3.9%)

3A. Use data to make decisions about individuals and groups of students for the: core academic curriculum

VHS HS SS MnS NS

205 (14.8%) 680 (49.2%) 357 (25.8%) 81 (5.9%) 59 (4.3%)

289 (18.8%) 774 (50.4%) 344 (22.4%) 80 (5.2%) 50 (3.3%)

248 (16.1%) 890 (57.8%) 303 (19.7%) 60 (3.9%) 38 (2.5%)

271 (20.0%) 782 (57.6%) 240 (17.7%) 44 (3.2%) 20 (1.5%)

86 (11.9%) 399 (55.1%) 175 (24.2%) 48 (6.6%) 16 (2.2%)

4. Perform each of the following steps when identifying the problem for a student for whom concerns have been raised:

A1. Define the referral concern in terms of a replacement behavior (i.e., what the student should be able to do) instead of a referral problem for: academics

VHS HS SS MnS NS

132 (9.6%) 638 (46.6%) 425 (31.0%) 123 (9.0%) 52 (3.8%)

191 (12.4%) 714 (46.3%) 497 (32.3%) 96 (6.2%) 43 (2.8%)

148 (9.7%) 822 (53.7%) 448 (29.2%) 80 (5.2%) 34 (2.2%)

146 (10.8%) 746 (55.1%) 380 (28.0%) 58 (4.3%) 25 (1.9%)

60 (8.3%) 336 (46.5%) 254 (35.1%) 57 (7.9%) 16 (2.2%)

B1. Use data to define the current level of performance of the target student for: academics

VHS HS SS MnS NS

219 (15.9%) 717 (52.2%) 332 (24.2%) 73 (5.3%) 34 (2.5%)

283 (18.4%) 794 (51.5%) 361 (23.4%) 75 (4.9%) 29 (1.9%)

241 (15.7%) 910 (59.2%) 316 (20.6%) 46 (3.0%) 24 (1.6%)

255 (18.8%) 799 (58.8%) 253 (18.6%) 39 (2.9%) 14 (1.0%)

88 (12.2%) 405 (55.9%) 178 (24.6%) 39 (5.4%) 14 (1.9%)

C1. Determine the desired level of performance (i.e., benchmark) for: academics

VHS HS SS MnS NS

218 (15.9%) 729 (53.3%) 321 (23.5%) 68 (5.0%) 33 (2.4%)

278 (18.1%) 850 (55.2%) 320 (20.8%) 63 (4.1%) 28 (1.8%)

245 (16.0%) 932 (60.8%) 286 (18.6%) 47 (3.1%) 24 (1.6%)

282 (20.9%) 787 (58.2%) 241 (17.8%) 29 (2.1%) 13 (1.0%)

86 (11.9%) 416 (57.5%) 171 (23.7%) 39 (5.4%) 11 (1.5%)

D1. Determine the current level of peer performance for the same skill as the target student for: academics

VHS HS SS MnS NS

171 (12.6%) 669 (49.1%) 382 (28.1%) 93 (6.8%) 47 (3.5%)

240 (15.6%) 791 (51.4%) 379 (24.6%) 90 (5.9%) 38 (2.5%)

211 (13.7%) 868 (56.5%) 363 (23.6%) 66 (4.3%) 28 (1.8%)

232 (17.1%) 775 (57.2%) 298 (22.0%) 29 (2.1%) 21 (1.6%)

78 (10.8%) 371 (51.2%) 210 (29.0%) 47 (6.5%) 18 (2.5%)

E1. Calculate the gap between student current performance and the benchmark (district grade level standard) for: academics

VHS HS SS MnS NS

124 (9.1%) 472 (34.6%) 500 (36.6%) 179 (13.1%) 90 (6.6%)

173 (11.2%) 577 (37.5%) 526 (34.2%) 176 (11.4%) 88 (5.7%)

149 (9.7%) 657 (42.7%) 515 (33.5%) 146 (9.5%) 71 (4.6%)

147 (10.8%) 596 (43.9%) 462 (34.0%) 111 (8.2%) 42 (3.1%)

50 (6.9%) 261 (36.1%) 273 (37.8%) 102 (14.1%) 37 (5.1%)

F1. Use gap data to determine whether core instruction should be adjusted or whether supplemental instruction should be directed to the target student for: academics

VHS HS SS MnS NS

114 (8.4%) 527 (38.7%) 423 (31.1%) 190 (14.0%) 108 (7.9%)

176 (11.5%) 585 (38.1%) 501 (32.6%) 182 (11.9%) 92 (6.0%)

126 (8.3%) 672 (44.0%) 484 (31.7%) 163 (10.7%) 81 (5.3%)

141 (10.4%) 627 (46.2%) 435 (32.1%) 115 (8.5%) 38 (2.8%)

50 (6.9%) 267 (36.9%) 269 (37.2%) 94 (13.0%) 44 (6.1%)

5A. Develop potential reasons (hypotheses) that a student or group of students is/are not achieving desired levels of performance (i.e., benchmarks) for: academics

VHS HS SS MnS NS

134 (9.8%) 611 (44.6%) 418 (30.5%) 147 (10.7%) 60 (4.4%)

181 (11.8%) 756 (49.2%) 434 (28.2%) 121 (7.9%) 46 (3.0%)

149 (9.7%) 809 (52.6%) 472 (30.7%) 75 (4.9%) 32 (2.1%)

159 (11.7%) 747 (55.0%) 371 (27.3%) 63 (4.6%) 19 (1.4%)

58 (8.0%) 344 (47.6%) 246 (34.0%) 58 (8.0%) 17 (2.4%)

6A. Identify the most appropriate type(s) of data to use for determining reasons (hypotheses) that are likely to be contributing to the problem for: academics

VHS HS SS MnS NS

85 (6.3%) 557 (41.0%) 456 (33.5%) 179 (13.2%) 83 (6.1%)

154 (10.1%) 626 (40.9%) 520 (33.9%) 168 (11.0%) 64 (4.2%)

112 (7.3%) 702 (45.8%) 549 (35.8%) 121 (7.9%) 49 (3.2%)

126 (9.3%) 646 (47.5%) 477 (35.1%) 93 (6.8%) 17 (1.3%)

42 (5.8%) 296 (40.9%) 285 (39.4%) 75 (10.4%) 25 (3.5%)

7A. Identify the appropriate supplemental intervention available in my building for a student identified as at-risk for: academics

VHS HS SS MnS NS

106 (7.7%) 545 (39.7%) 497 (36.2%) 169 (12.3%) 57 (4.2%)

151 (9.8%) 671 (43.7%) 522 (34.0%) 144 (9.4%) 46 (3.0%)

138 (9.0%) 729 (47.5%) 522 (34.0%) 108 (7.0%) 38 (2.5%)

132 (9.7%) 653 (48.1%) 472 (34.8%) 86 (6.3%) 15 (1.1%)

41 (5.7%) 310 (42.9%) 268 (37.1%) 81 (11.2%) 23 (3.2%)

8. Access resources (e.g., internet sources, professional literature) to develop evidence-based interventions for:

A. Academic core curricula

VHS HS SS MnS NS

154 (11.2%) 594 (43.3%) 416 (30.3%) 153 (11.2%) 54 (3.9%)

217 (14.1%) 663 (43.1%) 483 (31.4%) 124 (8.1%) 50 (3.3%)

203 (13.2%) 757 (49.3%) 435 (28.3%) 101 (6.6%) 39 (2.5%)

197 (14.5%) 711 (52.4%) 355 (26.2%) 78 (5.8%) 16 (1.2%)

77 (10.7%) 362 (50.1%) 210 (29.1%) 58 (8.0%) 16 (2.2%)

C. Academic supplemental curricula

VHS HS SS MnS NS

136 (9.9%) 545 (39.8%) 457 (33.4%) 170 (12.4%) 62 (4.5%)

197 (12.8%) 648 (42.2%) 510 (33.3%) 129 (8.4%) 50 (3.3%)

171 (11.1%) 729 (47.5%) 485 (31.6%) 108 (7.0%) 43 (2.8%)

171 (12.6%) 679 (50.1%) 409 (30.2%) 79 (5.8%) 18 (1.3%)

71 (9.9%) 338 (46.9%) 233 (32.3%) 65 (9.0%) 14 (1.9%)

E. Academic individualized intervention plans

VHS HS SS MnS NS

121 (8.8%) 534 (39.0%) 456 (33.3%) 196 (14.3%) 62 (4.5%)

180 (11.7%) 639 (41.6%) 525 (34.2%) 137 (8.9%) 54 (3.5%)

159 (10.4%) 705 (46.0%) 505 (32.9%) 115 (7.5%) 50 (3.3%)

166 (12.2%) 650 (47.9%) 437 (32.2%) 85 (6.3%) 18 (1.3%)

67 (9.3%) 312 (43.3%) 254 (35.2%) 74 (10.3%) 14 (1.9%)

9A. Ensure that any supplemental and/or intensive interventions are integrated with core instruction in the general education classroom: academics

VHS HS SS MnS NS

138 (10.1%) 617 (45.0%) 425 (31.0%) 143 (10.4%) 47 (3.4%)

181 (11.8%) 742 (48.4%) 457 (29.8%) 100 (6.5%) 53 (3.5%)

172 (11.2%) 806 (52.5%) 432 (28.2%) 82 (5.4%) 42 (2.7%)

190 (14.1%) 735 (54.4%) 348 (25.7%) 64 (4.7%) 15 (1.1%)

61 (8.4%) 382 (52.8%) 211 (29.2%) 49 (6.8%) 20 (2.8%)

10A. Ensure that the proposed intervention plan is supported by the data that were collected for: academics

VHS HS SS MnS NS

116 (8.5%) 583 (42.6%) 453 (33.1%) 157 (11.5%) 61 (4.5%)

174 (11.3%) 706 (46.0%) 474 (30.9%) 129 (8.4%) 51 (3.3%)

153 (10.0%) 751 (48.9%) 490 (31.9%) 96 (6.3%) 46 (3.0%)

169 (12.5%) 744 (54.9%) 361 (26.6%) 64 (4.7%) 18 (1.3%)

65 (9.0%) 350 (48.4%) 242 (33.5%) 42 (5.8%) 24 (3.3%)

11A. Provide the support necessary to ensure that the intervention is implemented appropriately for: academics

VHS HS SS MnS NS

125 (9.1%) 626 (45.7%) 438 (32.0%) 130 (9.5%) 52 (3.8%)

199 (13.0%) 744 (48.5%) 436 (28.4%) 111 (7.2%) 44 (2.9%)

168 (11.0%) 797 (52.0%) 449 (29.3%) 78 (5.1%) 41 (2.7%)

185 (13.6%) 758 (55.8%) 343 (25.3%) 59 (4.3%) 13 (1.0%)

71 (9.8%) 360 (49.8%) 217 (30.0%) 55 (7.6%) 20 (2.8%)

12A. Determine if an intervention was implemented as it was intended for: academics

VHS HS SS MnS NS

117 (8.6%) 624 (45.6%) 427 (31.2%) 139 (10.2%) 62 (4.5%)

200 (13.1%) 741 (48.4%) 443 (28.9%) 99 (6.5%) 48 (3.1%)

180 (11.7%) 797 (52.0%) 438 (28.6%) 76 (5.0%) 42 (2.7%)

189 (14.0%) 753 (55.6%) 335 (24.7%) 62 (4.6%) 16 (1.2%)

63 (8.8%) 367 (50.8%) 226 (31.3%) 46 (6.4%) 21 (2.9%)

13A. Select appropriate data (e.g., Curriculum-Based Measurement, DIBELS, FCAT, behavioral observations) to use for progress monitoring of student performance during interventions: academics

VHS HS SS MnS NS

200 (14.6%) 696 (50.9%) 344 (25.2%) 85 (6.2%) 43 (3.1%)

295 (19.2%) 772 (50.3%) 341 (22.2%) 72 (4.7%) 54 (3.5%)

261 (17.0%) 850 (55.4%) 318 (20.7%) 75 (4.9%) 30 (2.0%)

248 (18.3%) 748 (55.0%) 286 (21.0%) 62 (4.6%) 15 (1.1%)

74 (10.2%) 362 (50.1%) 216 (29.9%) 48 (6.6%) 23 (3.2%)

16. Make modifications to intervention plans based on student response to intervention.

VHS HS SS MnS NS

128 (9.3%) 581 (42.4%) 463 (33.8%) 137 (10.0%) 63 (4.6%)

154 (10.1%) 739 (48.3%) 461 (30.2%) 126 (8.2%) 49 (3.2%)

148 (9.6%) 772 (50.3%) 474 (30.9%) 93 (6.1%) 48 (3.1%)

143 (10.6%) 733 (54.3%) 389 (28.8%) 65 (4.8%) 21 (1.6%)

58 (8.0%) 348 (48.1%) 243 (33.6%) 53 (7.3%) 22 (3.0%)

17. Use appropriate data to differentiate between students who have not learned skills (e.g., did not have adequate exposure to effective instruction, not ready, got too far behind) from those who have barriers to learning due to a disability.

VHS HS SS MnS NS

87 (6.4%) 530 (38.9%) 479 (35.1%) 185 (13.6%) 82 (6.0%)

122 (8.2%) 592 (39.7%) 516 (34.6%) 184 (12.4%) 76 (5.1%)

103 (6.8%) 702 (46.0%) 521 (34.1%) 151 (9.9%) 49 (3.2%)

119 (8.8%) 626 (46.5%) 470 (34.9%) 107 (7.9%) 25 (1.9%)

50 (6.9%) 295 (40.8%) 285 (39.4%) 68 (9.4%) 25 (3.5%)

18. Collect the following types of data:

A. Curriculum-Based Measurement

VHS HS SS MnS NS

205 (15.0%) 641 (47.0%) 339 (24.8%) 113 (8.3%) 67 (4.9%)

298 (19.5%) 703 (46.1%) 332 (21.8%) 113 (7.4%) 80 (5.2%)

260 (17.0%) 774 (50.6%) 345 (22.6%) 89 (5.8%) 62 (4.1%)

244 (18.1%) 724 (53.6%) 280 (20.7%) 65 (4.8%) 37 (2.7%)

96 (13.3%) 348 (48.1%) 191 (26.4%) 56 (7.7%) 33 (4.6%)

B. DIBELS

VHS HS SS MnS NS

260 (19.0%) 630 (46.0%) 271 (19.8%) 90 (6.6%) 118 (8.6%)

375 (24.5%) 653 (42.7%) 279 (18.3%) 98 (6.4%) 123 (8.1%)

343 (22.4%) 708 (46.3%) 301 (19.7%) 100 (6.5%) 77 (5.0%)

233 (17.8%) 601 (45.9%) 267 (20.4%) 107 (8.2%) 102 (7.8%)

99 (13.7%) 289 (39.9%) 172 (23.7%) 85 (11.7%) 79 (10.9%)

C. Access data from appropriate district- or school-wide assessments

VHS HS SS MnS NS

213 (15.6%) 626 (45.9%) 346 (25.4%) 126 (9.2%) 52 (3.8%)

316 (20.1%) 700 (45.9%) 322 (21.1%) 120 (7.9%) 66 (4.3%)

271 (17.7%) 795 (52.0%) 329 (21.5%) 85 (5.6%) 50 (3.3%)

261 (19.4%) 716 (53.1%) 281 (20.8%) 68 (5.0%) 23 (1.7%)

104 (14.4%) 348 (48.1%) 181 (25.0%) 59 (8.2%) 32 (4.4%)

20C. Use technology in the following ways: Use the Progress Monitoring and Reporting Network (PMRN)

VHS HS SS MnS NS

181 (13.3%) 517 (37.9%) 314 (23.0%) 174 (12.8%) 179 (13.1%)

276 (18.1%) 516 (33.8%) 391 (25.6%) 175 (11.5%) 170 (11.1%)

243 (15.9%) 613 (40.1%) 407 (26.6%) 158 (10.3%) 109 (7.1%)

302 (22.4%) 671 (49.7%) 271 (20.1%) 66 (4.9%) 40 (3.0%)

126 (17.4%) 332 (45.9%) 167 (23.1%) 58 (8.0%) 41 (5.7%)

Note. VHS = Very Highly Skilled; HS = Highly Skilled; SS = Somewhat Skilled; MnS = Minimally Skilled; NS = Not Skilled; Y1BOY = Year 1 Beginning of Year; Y1EOY = Year 1 End of Year; Y2EOY= Year 2 End of Year; Y3EOY = Year 3 End of Year; Y4 EOY = Year 4 End of Year.

Table 4b
Frequency and Percentage of Total School Perceptions of RtI Skills Ratings Across Pilot Schools: Factor Two (Perceptions of RtI Skills When Addressing Behavior Issues)

(Each skill statement below is followed by five response rows, one per administration, in order: Y1 BOY, Y1 EOY, Y2 EOY, Y3 EOY, Y4 EOY.)

2B. Access the data necessary to determine the percent of students in core instruction who are achieving benchmarks (district grade-level standards) in behavior

VHS HS SS MnS NS

96 (7.2%) 446 (33.2%) 419 (31.2%) 164 (12.2%) 217 (16.2%)

151 (10.0%) 561 (37.0%) 434 (28.7%) 163 (10.8%) 206 (13.6%)

120 (8.0%) 553 (36.8%) 474 (31.5%) 166 (11.0%) 191 (12.7%)

115 (8.6%) 562 (42.2%) 387 (29.1%) 151 (11.3%) 116 (8.7%)

63 (8.8%) 248 (34.4%) 244 (33.8%) 94 (13.0%) 76 (10.0%)

3B. Use data to make decisions about individuals and groups of students for the: core/building discipline plan

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 120 (8.8%) | 542 (39.8%) | 411 (30.2%) | 144 (10.6%) | 145 (10.7%)
Y1 EOY | 181 (11.8%) | 664 (43.3%) | 431 (28.1%) | 140 (9.1%) | 119 (7.8%)
Y2 EOY | 139 (9.1%) | 698 (45.8%) | 444 (29.1%) | 125 (8.2%) | 119 (7.8%)
Y3 EOY | 153 (11.3%) | 640 (47.4%) | 382 (28.3%) | 105 (7.8%) | 71 (5.3%)
Y4 EOY | 59 (8.2%) | 297 (41.0%) | 248 (34.3%) | 82 (11.3%) | 38 (5.3%)

4. Perform each of the following steps when identifying the problem for a student for whom concerns have been raised:

4A2. Define the referral concern in terms of a replacement behavior (i.e., what the student should be able to do) instead of a referral problem for: behavior

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 119 (8.7%) | 566 (41.5%) | 447 (32.8%) | 150 (11.0%) | 81 (5.9%)
Y1 EOY | 149 (9.7%) | 681 (44.5%) | 523 (34.1%) | 123 (8.0%) | 56 (3.7%)
Y2 EOY | 124 (8.1%) | 736 (48.2%) | 501 (32.8%) | 123 (8.1%) | 44 (2.9%)
Y3 EOY | 123 (9.1%) | 639 (47.3%) | 457 (33.8%) | 93 (6.9%) | 40 (3.0%)
Y4 EOY | 57 (7.9%) | 288 (39.9%) | 283 (39.2%) | 66 (9.1%) | 28 (3.9%)

4B2. Use data to define the current level of performance of the target student for: behavior

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 150 (11.0%) | 562 (41.2%) | 428 (31.4%) | 137 (10.0%) | 87 (6.4%)
Y1 EOY | 180 (11.7%) | 688 (44.7%) | 463 (30.1%) | 130 (8.4%) | 77 (5.0%)
Y2 EOY | 133 (8.7%) | 738 (48.1%) | 476 (31.1%) | 109 (7.1%) | 77 (5.0%)
Y3 EOY | 150 (11.1%) | 681 (50.4%) | 379 (28.0%) | 105 (7.8%) | 37 (2.7%)
Y4 EOY | 58 (8.0%) | 322 (44.5%) | 249 (34.4%) | 63 (8.7%) | 31 (4.3%)

4C2. Determine the desired level of performance (i.e., benchmark) for: behavior

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 161 (11.9%) | 613 (45.2%) | 404 (29.8%) | 111 (8.2%) | 68 (5.0%)
Y1 EOY | 207 (13.5%) | 771 (50.2%) | 398 (25.9%) | 101 (6.6%) | 60 (3.9%)
Y2 EOY | 164 (10.7%) | 795 (52.0%) | 410 (26.8%) | 98 (6.4%) | 61 (4.0%)
Y3 EOY | 194 (14.4%) | 692 (51.3%) | 342 (25.4%) | 80 (5.9%) | 40 (3.0%)
Y4 EOY | 66 (9.1%) | 341 (47.2%) | 239 (33.1%) | 51 (7.1%) | 26 (3.6%)


4D2. Determine the current level of peer performance for the same skill as the target student for: behavior

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 134 (9.9%) | 565 (41.8%) | 434 (32.1%) | 132 (9.8%) | 87 (6.4%)
Y1 EOY | 169 (11.0%) | 716 (46.6%) | 461 (30.0%) | 119 (7.8%) | 71 (4.6%)
Y2 EOY | 139 (9.1%) | 775 (50.6%) | 442 (28.8%) | 112 (7.3%) | 65 (4.2%)
Y3 EOY | 155 (11.5%) | 672 (49.7%) | 397 (29.4%) | 85 (6.3%) | 43 (3.2%)
Y4 EOY | 61 (8.4%) | 316 (43.7%) | 254 (35.1%) | 55 (7.6%) | 37 (5.1%)

4E2. Calculate the gap between the student's current performance and the benchmark (district grade-level standard) for: behavior

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 87 (6.4%) | 393 (29.0%) | 506 (37.3%) | 210 (15.6%) | 159 (11.7%)
Y1 EOY | 109 (7.1%) | 504 (32.8%) | 578 (37.6%) | 209 (13.6%) | 136 (8.9%)
Y2 EOY | 87 (5.7%) | 550 (35.9%) | 570 (37.2%) | 202 (13.2%) | 125 (8.2%)
Y3 EOY | 95 (7.0%) | 487 (36.1%) | 529 (39.2%) | 162 (12.0%) | 78 (5.8%)
Y4 EOY | 33 (4.6%) | 211 (29.1%) | 308 (42.5%) | 110 (15.2%) | 62 (8.6%)
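Item 4E2 asks educators to quantify the gap between a student's current performance and the benchmark. This appendix does not give a formula, so the sketch below assumes a common convention of expressing the gap both as a simple difference and as a benchmark-to-current ratio; all numbers are hypothetical.

```python
# Sketch of a gap calculation, assuming the difference-and-ratio convention
# (the report itself does not specify the formula).
def performance_gap(current, benchmark):
    """Return (absolute gap, gap ratio) between benchmark and current level."""
    return benchmark - current, benchmark / current

# Hypothetical example: a student meets classroom behavior expectations in
# 40% of observed intervals against a benchmark of 80%.
diff, ratio = performance_gap(current=40, benchmark=80)
print(f"Gap: {diff} points; benchmark is {ratio:.1f}x current performance")
# -> Gap: 40 points; benchmark is 2.0x current performance
```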

4F2. Use gap data to determine whether core instruction should be adjusted or whether supplemental instruction should be directed to the target student for: behavior

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 78 (5.8%) | 447 (33.1%) | 455 (33.7%) | 211 (15.6%) | 161 (11.9%)
Y1 EOY | 118 (7.7%) | 520 (34.0%) | 543 (35.5%) | 218 (14.3%) | 131 (8.6%)
Y2 EOY | 67 (4.4%) | 558 (36.8%) | 554 (36.5%) | 215 (14.2%) | 122 (8.1%)
Y3 EOY | 88 (6.5%) | 523 (38.6%) | 497 (36.7%) | 170 (12.6%) | 76 (5.6%)
Y4 EOY | 35 (4.8%) | 217 (30.0%) | 305 (42.1%) | 107 (14.8%) | 60 (8.3%)

5B. Develop potential reasons (hypotheses) that a student or group of students is/are not achieving desired levels of performance (i.e., benchmarks) for: behavior

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 115 (8.5%) | 536 (39.4%) | 445 (32.7%) | 166 (12.2%) | 99 (7.3%)
Y1 EOY | 144 (9.4%) | 689 (44.9%) | 476 (31.0%) | 164 (10.7%) | 62 (4.0%)
Y2 EOY | 97 (6.3%) | 749 (49.0%) | 508 (33.2%) | 116 (7.6%) | 59 (3.9%)
Y3 EOY | 114 (8.4%) | 661 (48.8%) | 444 (32.8%) | 99 (7.3%) | 37 (2.7%)
Y4 EOY | 50 (6.9%) | 289 (39.9%) | 283 (39.1%) | 77 (10.6%) | 25 (3.5%)

6B. Identify the most appropriate type(s) of data to use for determining reasons (hypotheses) that are likely to be contributing to the problem for: behavior

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 64 (4.7%) | 472 (34.8%) | 476 (35.1%) | 225 (16.6%) | 120 (8.8%)
Y1 EOY | 117 (7.8%) | 542 (35.5%) | 564 (36.9%) | 208 (13.6%) | 98 (6.4%)
Y2 EOY | 73 (4.8%) | 590 (38.6%) | 598 (39.1%) | 184 (12.0%) | 85 (5.6%)
Y3 EOY | 89 (6.6%) | 536 (39.5%) | 538 (39.7%) | 142 (10.5%) | 51 (3.8%)
Y4 EOY | 31 (4.3%) | 240 (33.2%) | 322 (44.5%) | 88 (12.2%) | 42 (5.8%)

7B. Identify the appropriate supplemental intervention available in my building for a student identified as at-risk for: behavior

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 77 (5.6%) | 444 (32.4%) | 502 (36.6%) | 242 (17.7%) | 105 (7.7%)
Y1 EOY | 102 (6.7%) | 571 (37.3%) | 584 (38.1%) | 199 (13.0%) | 77 (5.0%)
Y2 EOY | 80 (5.2%) | 594 (38.8%) | 596 (39.0%) | 185 (12.1%) | 75 (4.9%)
Y3 EOY | 82 (6.1%) | 515 (38.1%) | 558 (41.3%) | 140 (10.4%) | 56 (4.2%)
Y4 EOY | 32 (4.4%) | 229 (31.7%) | 323 (44.7%) | 101 (14.0%) | 38 (5.3%)


8B. Access resources (e.g., internet sources, professional literature) to develop evidence-based interventions for: Behavioral core curricula

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 101 (7.4%) | 464 (33.8%) | 467 (34.0%) | 245 (17.8%) | 96 (7.0%)
Y1 EOY | 126 (8.2%) | 522 (34.1%) | 600 (39.1%) | 189 (12.3%) | 96 (6.3%)
Y2 EOY | 98 (6.4%) | 594 (38.8%) | 583 (38.0%) | 181 (11.8%) | 77 (5.0%)
Y3 EOY | 103 (7.6%) | 500 (36.9%) | 538 (39.7%) | 172 (12.7%) | 44 (3.2%)
Y4 EOY | 48 (6.6%) | 256 (35.4%) | 292 (40.4%) | 97 (13.4%) | 30 (4.2%)

8D. Access resources (e.g., internet sources, professional literature) to develop evidence-based interventions for: Behavioral supplemental curricula

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 96 (7.0%) | 442 (32.2%) | 479 (34.9%) | 259 (18.9%) | 96 (7.0%)
Y1 EOY | 114 (7.4%) | 510 (33.3%) | 608 (40.0%) | 205 (13.4%) | 96 (6.3%)
Y2 EOY | 85 (5.6%) | 571 (37.3%) | 601 (39.3%) | 189 (12.3%) | 85 (5.6%)
Y3 EOY | 91 (6.7%) | 481 (35.5%) | 564 (41.6%) | 170 (12.5%) | 50 (3.7%)
Y4 EOY | 42 (5.8%) | 238 (33.0%) | 307 (42.5%) | 104 (14.4%) | 31 (4.3%)

8F. Access resources (e.g., internet sources, professional literature) to develop evidence-based interventions for: Behavioral individualized intervention plans

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 94 (6.9%) | 432 (31.6%) | 492 (36.0%) | 254 (18.6%) | 96 (7.0%)
Y1 EOY | 109 (7.1%) | 530 (34.7%) | 607 (39.8%) | 187 (12.3%) | 94 (6.2%)
Y2 EOY | 89 (5.8%) | 571 (37.3%) | 602 (39.4%) | 187 (12.2%) | 81 (5.3%)
Y3 EOY | 97 (7.2%) | 492 (36.3%) | 551 (40.7%) | 170 (12.6%) | 45 (3.3%)
Y4 EOY | 45 (6.2%) | 238 (33.0%) | 314 (43.5%) | 93 (12.9%) | 32 (4.4%)

9B. Ensure that any supplemental and/or intensive interventions are integrated with core instruction in the general education classroom: behavior

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 101 (7.4%) | 544 (39.9%) | 450 (33.0%) | 190 (13.9%) | 79 (5.8%)
Y1 EOY | 134 (8.8%) | 644 (42.2%) | 522 (34.2%) | 140 (9.2%) | 85 (5.6%)
Y2 EOY | 125 (8.2%) | 690 (45.2%) | 497 (32.5%) | 141 (9.2%) | 75 (4.9%)
Y3 EOY | 116 (8.6%) | 610 (45.3%) | 458 (34.0%) | 122 (9.1%) | 42 (3.1%)
Y4 EOY | 52 (7.2%) | 312 (43.2%) | 262 (36.2%) | 68 (9.4%) | 29 (4.0%)

10B. Ensure that the proposed intervention plan is supported by the data that were collected for: behavior

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 82 (6.0%) | 489 (35.9%) | 491 (36.0%) | 204 (15.0%) | 98 (7.2%)
Y1 EOY | 127 (8.3%) | 600 (39.3%) | 566 (37.1%) | 148 (9.7%) | 86 (5.6%)
Y2 EOY | 95 (6.2%) | 658 (42.9%) | 545 (35.5%) | 159 (10.4%) | 77 (5.0%)
Y3 EOY | 105 (7.8%) | 615 (45.5%) | 462 (34.2%) | 113 (8.4%) | 57 (4.2%)
Y4 EOY | 46 (6.4%) | 289 (40.0%) | 291 (40.3%) | 60 (8.3%) | 37 (5.1%)

11B. Provide the support necessary to ensure that the intervention is implemented appropriately for: behavior

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 93 (6.8%) | 530 (38.8%) | 483 (35.4%) | 178 (13.0%) | 82 (6.0%)
Y1 EOY | 143 (9.4%) | 644 (42.2%) | 534 (35.0%) | 127 (8.3%) | 79 (5.2%)
Y2 EOY | 126 (8.3%) | 704 (46.2%) | 504 (33.1%) | 130 (8.5%) | 60 (3.9%)
Y3 EOY | 122 (9.0%) | 627 (46.2%) | 454 (33.5%) | 114 (8.4%) | 40 (3.0%)
Y4 EOY | 51 (7.1%) | 299 (41.4%) | 266 (36.8%) | 77 (10.7%) | 30 (4.2%)

12B. Determine if an intervention was implemented as it was intended for: behavior

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 89 (6.5%) | 540 (39.6%) | 463 (34.0%) | 178 (13.1%) | 93 (6.8%)
Y1 EOY | 157 (10.3%) | 648 (42.5%) | 519 (34.0%) | 123 (8.1%) | 78 (5.1%)
Y2 EOY | 138 (9.0%) | 712 (46.6%) | 486 (31.8%) | 134 (8.8%) | 58 (3.8%)
Y3 EOY | 134 (9.9%) | 632 (46.8%) | 431 (31.9%) | 115 (8.5%) | 40 (3.0%)
Y4 EOY | 49 (6.8%) | 315 (43.6%) | 265 (36.7%) | 64 (8.9%) | 30 (4.2%)


13B. Select appropriate data (e.g., Curriculum-Based Measurement, DIBELS, FCAT, behavioral observations) to use for progress monitoring of student performance during interventions: behavior

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 127 (9.3%) | 534 (39.3%) | 430 (31.6%) | 171 (12.6%) | 98 (7.2%)
Y1 EOY | 178 (11.6%) | 671 (43.9%) | 466 (30.5%) | 126 (8.2%) | 89 (5.8%)
Y2 EOY | 155 (10.2%) | 689 (45.2%) | 435 (28.5%) | 160 (10.5%) | 87 (5.7%)
Y3 EOY | 130 (9.6%) | 600 (44.3%) | 443 (32.7%) | 128 (9.5%) | 54 (4.0%)
Y4 EOY | 49 (6.8%) | 279 (38.6%) | 282 (39.0%) | 78 (10.8%) | 35 (4.8%)

18D. Collect the following types of data: Standard behavioral observations

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 162 (11.9%) | 591 (43.5%) | 388 (28.6%) | 147 (10.8%) | 71 (5.2%)
Y1 EOY | 229 (15.1%) | 708 (46.7%) | 381 (25.1%) | 128 (8.4%) | 71 (4.7%)
Y2 EOY | 184 (12.0%) | 721 (47.2%) | 425 (27.8%) | 119 (7.8%) | 80 (5.2%)
Y3 EOY | 159 (11.8%) | 698 (51.8%) | 351 (26.1%) | 95 (7.1%) | 44 (3.3%)
Y4 EOY | 87 (12.0%) | 318 (43.9%) | 219 (30.3%) | 70 (9.7%) | 30 (4.1%)

Note. VHS = Very Highly Skilled; HS = Highly Skilled; SS = Somewhat Skilled; MnS = Minimally Skilled; NS = Not Skilled; Y1 BOY = Year 1 Beginning of Year; Y1 EOY = Year 1 End of Year; Y2 EOY = Year 2 End of Year; Y3 EOY = Year 3 End of Year; Y4 EOY = Year 4 End of Year.


Table 4c
Frequency and Percentage of Total School Perceptions of RtI Skills Ratings Across Pilot Schools: Factor Three (Perceptions of RtI Skills in Accessing, Interpreting, and Graphing Data)

14. Construct graphs for large group, small group, and individual students:

A. Graph target student data

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 115 (8.4%) | 418 (30.6%) | 461 (33.7%) | 218 (15.9%) | 156 (11.4%)
Y1 EOY | 156 (10.2%) | 474 (30.9%) | 539 (35.2%) | 228 (14.9%) | 135 (8.8%)
Y2 EOY | 140 (9.1%) | 503 (32.7%) | 560 (36.4%) | 218 (14.2%) | 116 (7.6%)
Y3 EOY | 123 (9.1%) | 467 (34.5%) | 507 (37.4%) | 187 (13.8%) | 70 (5.2%)
Y4 EOY | 58 (8.0%) | 203 (28.0%) | 277 (38.3%) | 140 (19.3%) | 46 (6.4%)

B. Graph benchmark data

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 108 (7.9%) | 404 (29.6%) | 459 (33.6%) | 232 (17.0%) | 163 (11.9%)
Y1 EOY | 146 (9.5%) | 460 (30.1%) | 545 (35.6%) | 241 (15.8%) | 138 (9.0%)
Y2 EOY | 134 (8.8%) | 490 (31.9%) | 564 (36.7%) | 226 (14.7%) | 122 (7.9%)
Y3 EOY | 110 (8.1%) | 470 (34.7%) | 511 (37.7%) | 190 (14.0%) | 73 (5.4%)
Y4 EOY | 58 (8.0%) | 193 (26.7%) | 284 (39.2%) | 137 (18.9%) | 52 (7.2%)

C. Graph peer data

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 100 (7.3%) | 362 (26.5%) | 483 (35.4%) | 235 (17.2%) | 184 (13.5%)
Y1 EOY | 131 (8.6%) | 439 (28.8%) | 549 (36.0%) | 254 (16.6%) | 154 (10.1%)
Y2 EOY | 117 (7.6%) | 438 (28.5%) | 590 (38.4%) | 250 (16.3%) | 140 (9.1%)
Y3 EOY | 107 (7.9%) | 426 (31.5%) | 532 (39.4%) | 203 (15.0%) | 83 (6.1%)
Y4 EOY | 53 (7.3%) | 181 (25.0%) | 281 (38.8%) | 151 (20.9%) | 58 (8.0%)

D. Draw an aimline

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 54 (4.0%) | 205 (15.0%) | 378 (27.8%) | 299 (22.0%) | 425 (31.2%)
Y1 EOY | 65 (4.3%) | 242 (15.9%) | 540 (35.4%) | 320 (21.0%) | 357 (23.4%)
Y2 EOY | 73 (4.8%) | 279 (18.2%) | 540 (35.2%) | 339 (22.1%) | 302 (19.7%)
Y3 EOY | 71 (5.3%) | 295 (21.8%) | 521 (38.5%) | 301 (22.3%) | 164 (12.1%)
Y4 EOY | 37 (5.1%) | 103 (14.2%) | 276 (38.1%) | 196 (27.1%) | 112 (15.5%)

E. Draw a trendline

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 50 (3.7%) | 216 (15.9%) | 387 (28.5%) | 288 (21.2%) | 419 (30.8%)
Y1 EOY | 74 (4.9%) | 247 (16.2%) | 535 (35.1%) | 331 (21.7%) | 336 (22.1%)
Y2 EOY | 73 (4.8%) | 287 (18.7%) | 557 (36.4%) | 335 (21.9%) | 280 (18.3%)
Y3 EOY | 73 (5.4%) | 307 (22.7%) | 531 (39.3%) | 281 (20.8%) | 159 (11.8%)
Y4 EOY | 35 (4.8%) | 120 (16.6%) | 276 (38.1%) | 188 (26.0%) | 105 (14.5%)
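Items 14D and 14E refer to two standard progress monitoring lines: the aimline, which connects the baseline score to the goal, and the trendline, a line of best fit through the observed scores. The sketch below uses hypothetical weekly scores, with numpy's least-squares fit standing in for whatever graphing tool a school actually uses.

```python
# Sketch of aimline and trendline slopes over hypothetical monitoring data.
import numpy as np

weeks = np.arange(8)                                  # monitoring occasions 0-7
scores = np.array([22, 25, 24, 28, 30, 29, 33, 35])   # hypothetical weekly scores

# Aimline slope: straight line from the baseline score to the goal at week 7.
baseline, goal = scores[0], 40
aim_slope = (goal - baseline) / weeks[-1]             # ~2.57 points/week

# Trendline slope: least-squares line through the observed scores.
trend_slope, intercept = np.polyfit(weeks, scores, deg=1)

print(f"aimline: {aim_slope:.2f} pts/week, trendline: {trend_slope:.2f} pts/week")
```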

15. Interpret graphed progress monitoring data to make decisions about the degree to which a student is responding to intervention (e.g., positive, questionable or poor response).

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 123 (9.0%) | 490 (35.7%) | 477 (34.8%) | 183 (13.4%) | 98 (7.2%)
Y1 EOY | 148 (9.7%) | 625 (41.1%) | 505 (33.2%) | 164 (10.8%) | 79 (5.2%)
Y2 EOY | 132 (8.6%) | 712 (46.3%) | 497 (32.3%) | 128 (8.3%) | 68 (4.4%)
Y3 EOY | 144 (10.7%) | 665 (49.2%) | 425 (31.5%) | 88 (6.5%) | 29 (2.2%)
Y4 EOY | 71 (9.8%) | 326 (45.1%) | 241 (33.3%) | 62 (8.6%) | 23 (3.2%)
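Item 15 concerns classifying a student's response to intervention as positive, questionable, or poor. One common heuristic, sketched below with an illustrative tolerance value rather than the project's published decision rules, compares the slope of the trendline to the slope of the aimline.

```python
# Sketch of a response-to-intervention decision rule; the 10% tolerance is an
# illustrative assumption, not a criterion taken from the report.
def response_to_intervention(trend_slope, aim_slope, tolerance=0.10):
    """Classify response as positive, questionable, or poor."""
    if trend_slope >= aim_slope * (1 - tolerance):
        return "positive"       # closing the gap at roughly the needed rate
    if trend_slope > 0:
        return "questionable"   # growing, but not fast enough to close the gap
    return "poor"               # flat or worsening performance

print(response_to_intervention(trend_slope=1.8, aim_slope=2.6))  # -> questionable
```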

19. Disaggregate data by race, gender, free/reduced lunch, language proficiency, and disability status

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 120 (8.9%) | 440 (32.6%) | 451 (33.4%) | 179 (13.3%) | 159 (11.8%)
Y1 EOY | 153 (10.4%) | 496 (33.8%) | 493 (33.6%) | 171 (11.7%) | 153 (10.4%)
Y2 EOY | 165 (10.9%) | 530 (35.1%) | 498 (32.9%) | 204 (13.5%) | 115 (7.6%)
Y3 EOY | 153 (11.4%) | 546 (40.7%) | 428 (31.9%) | 149 (11.1%) | 66 (4.9%)
Y4 EOY | 59 (8.2%) | 227 (31.5%) | 259 (35.9%) | 115 (16.0%) | 61 (8.5%)
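Item 19 asks educators to disaggregate outcome data by demographic subgroup. A minimal sketch, using hypothetical column names and pandas' groupby as a stand-in for a district data system:

```python
# Sketch of subgroup disaggregation; the column names and data are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "gender":    ["F", "M", "F", "M", "F", "M"],
    "frl":       [True, False, True, True, False, False],  # free/reduced lunch
    "benchmark": [1, 0, 1, 1, 1, 0],  # 1 = met the grade-level benchmark
})

# Percent of students meeting the benchmark within each subgroup.
print(df.groupby("gender")["benchmark"].mean().mul(100).round(1))
print(df.groupby("frl")["benchmark"].mean().mul(100).round(1))
```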


20. Use technology in the following ways:

A. Access the internet to locate sources of academic and behavioral evidence-based interventions.

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 223 (16.3%) | 547 (39.9%) | 389 (28.4%) | 131 (9.6%) | 80 (5.8%)
Y1 EOY | 295 (19.3%) | 637 (41.6%) | 379 (24.8%) | 138 (9.0%) | 81 (5.3%)
Y2 EOY | 299 (19.5%) | 681 (44.4%) | 386 (25.2%) | 106 (6.9%) | 62 (4.0%)
Y3 EOY | 260 (19.2%) | 633 (46.8%) | 345 (25.5%) | 93 (6.9%) | 23 (1.7%)
Y4 EOY | 130 (18.0%) | 321 (44.3%) | 183 (25.3%) | 54 (7.5%) | 36 (5.0%)

B. Use electronic data collection tools (e.g., PDAs)

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 99 (7.3%) | 323 (23.7%) | 392 (28.7%) | 280 (20.5%) | 272 (19.1%)
Y1 EOY | 133 (8.7%) | 365 (23.9%) | 502 (32.9%) | 310 (20.3%) | 215 (14.1%)
Y2 EOY | 121 (7.9%) | 402 (26.3%) | 534 (34.9%) | 264 (17.3%) | 209 (13.7%)
Y3 EOY | 130 (9.6%) | 426 (31.6%) | 433 (32.1%) | 209 (15.5%) | 151 (11.2%)
Y4 EOY | 61 (8.4%) | 191 (26.4%) | 255 (35.2%) | 130 (18.0%) | 87 (12.0%)

D. Use the School-Wide Information System (SWIS) for Positive Behavior Support

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 30 (2.2%) | 192 (14.1%) | 367 (26.9%) | 301 (22.1%) | 473 (34.7%)
Y1 EOY | 78 (5.1%) | 249 (16.4%) | 455 (29.9%) | 339 (22.3%) | 400 (26.3%)
Y2 EOY | 61 (4.0%) | 276 (18.1%) | 512 (33.6%) | 319 (20.9%) | 357 (23.4%)
Y3 EOY | 68 (5.1%) | 311 (23.1%) | 428 (31.8%) | 272 (20.2%) | 266 (19.8%)
Y4 EOY | 26 (3.6%) | 128 (17.7%) | 233 (32.2%) | 180 (24.9%) | 156 (21.6%)

E. Graph and display student and school data

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 109 (8.0%) | 365 (26.7%) | 460 (33.7%) | 253 (18.5%) | 180 (13.2%)
Y1 EOY | 177 (11.6%) | 441 (28.8%) | 474 (31.0%) | 265 (17.3%) | 174 (11.4%)
Y2 EOY | 150 (9.8%) | 462 (30.2%) | 533 (34.9%) | 241 (15.8%) | 142 (9.3%)
Y3 EOY | 149 (11.1%) | 473 (35.1%) | 453 (33.6%) | 192 (14.2%) | 81 (6.0%)
Y4 EOY | 64 (8.8%) | 209 (28.9%) | 265 (36.6%) | 131 (18.1%) | 55 (7.6%)

21. Facilitate a Problem Solving Team (Student Support Team, Intervention Assistance Team, School-Based Intervention Team, Child Study Team) meeting.

Response | VHS | HS | SS | MnS | NS
Y1 BOY | 73 (5.3%) | 308 (22.6%) | 415 (30.4%) | 297 (21.7%) | 273 (20.0%)
Y1 EOY | 110 (7.2%) | 372 (24.4%) | 523 (34.4%) | 320 (21.0%) | 197 (12.9%)
Y2 EOY | 80 (5.3%) | 358 (23.5%) | 577 (37.9%) | 299 (19.7%) | 208 (13.7%)
Y3 EOY | 87 (6.5%) | 354 (26.4%) | 476 (35.5%) | 249 (18.6%) | 176 (13.1%)
Y4 EOY | 21 (2.9%) | 129 (17.8%) | 247 (34.2%) | 197 (27.3%) | 129 (17.8%)

Note. VHS = Very Highly Skilled; HS = Highly Skilled; SS = Somewhat Skilled; MnS = Minimally Skilled; NS = Not Skilled; Y1 BOY = Year 1 Beginning of Year; Y1 EOY = Year 1 End of Year; Y2 EOY = Year 2 End of Year; Y3 EOY = Year 3 End of Year; Y4 EOY = Year 4 End of Year.
