TRANSCRIPT
Presented by: Kim Gibbons, Ph.D., Associate Director for Innovation and Outreach (CAREI)
CLM Fall Conference November 20th, 2015
The Importance of Using Research and Data to Design, Implement, and Evaluate Quality Core Instruction
AGENDA
1. Share some examples of how research can inform practice.
2. Discuss the importance of evaluating core instruction using data.
3. Talk about CAREI and our mission to collaborate in the areas of research, evaluation, and assessment.
THREE CULTURES THAT NEED TO CHANGE
We need to move from: excuse to accountability; compliance to performance; uniformity to differentiation based on talent and need.
CHANGE
True or False: People resist change.
People resist change when they experience a loss.
PIECEMEALNESS
“It is not the pace of change that is the culprit, it is the piecemealness and fragmentation that wears us down.” (Fullan, 2003)
QUESTIONS
How important is research to you as an administrator?
How do you stay current on research?
What are the barriers to using research to inform practice?
REASSESSING OUR DIRECTION IS NOT ALWAYS EASY!
How to do it:
1. When adopting new policies, programs, or curriculum, examine the research base and pick things that have already been proven to work with large effect sizes.
2. Take inventory of current policies, programs, & practices and compare to current research.
3. Continually evaluate policies, programs, & practices using data to drive decisions.
[Hattie barometer graphic: Response to Intervention (MTSS), d = 1.09, in the zone of desired effects above the 0.40 hinge point.]
VARIABLES IMPACTING STUDENT ACHIEVEMENT: THE BIGGIES!
Student self-report of grades 1.44
Formative evaluation 0.90
Teacher clarity 0.75
Reciprocal teaching 0.74
Feedback 0.73
Teacher-student relationships 0.72
Spaced vs. massed practice 0.71
Metacognitive strategies 0.69
Repeated reading 0.67
Vocabulary programs 0.67
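For reference, the numbers above are standardized mean-difference effect sizes (Cohen's d), aggregated across meta-analyses in Hattie's Visible Learning. A standard formulation, with d = 0.40 the "hinge point" Hattie associates with a typical year of growth, is:

$$
d = \frac{\bar{x}_{\text{intervention}} - \bar{x}_{\text{comparison}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}
$$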
[Hattie barometer graphic: formative evaluation, d = 0.90, in the zone of desired effects.]
ASK THESE QUESTIONS:
Does our district have reliable and valid systems in place to monitor the progress of students receiving supplemental and intensive intervention?
Do teachers in our district regularly collect formative assessment data to guide their decisions in the classroom with all students?
[Hattie barometer graphic: feedback, d = 0.73, in the zone of desired effects.]
FEEDBACK (0.73)
Feedback is most powerful when it is from student to teacher!
Teachers need to seek and be open to feedback about:
What students know
What they understand
Where they make errors
When they have misconceptions
When they are not engaged
Feedback is the breakfast of champions! – Ken Blanchard
ASK THESE QUESTIONS:
1. How do teachers in your district know:
What students know?
What they understand?
Where they make errors?
When they have misconceptions?
When they are not engaged?
2. What type of professional development and coaching exists in this area?
[Hattie barometer graphic: teacher-student relationships, d = 0.72, in the zone of desired effects. © John Hattie, Visible Learning.]
TEACHER-STUDENT RELATIONSHIPS
Factors that are important:
ü Demonstrate that you care for the learning of each student as a person
ü High levels of empathy (see their perspective and communicate back)
ü Warmth
ü Encouragement of higher-order thinking
ü Adapting to differences
ü Genuineness
ü Respect of self and others
ASK THESE QUESTIONS:
ü How do teachers in your district connect with students who have challenging behaviors?
ü How do teachers in your district connect with students who lack motivation?
ü How do teachers in your district connect with students who are very shy and withdrawn?
ü How do teachers in your district connect with students from different ethnic backgrounds?
THE POLITICS OF DISTRACTION (HATTIE)
• Too much discussion is focused on between-school differences (34%) when the biggest issue is within-school differences (64%)!
• What is the cause of the variance?
• Teachers: variability in their impact on student learning
• We all know this, but it is absent from discussions about policy, teachers, and schools!
INSTEAD WE USE………
The Politics of Distraction!
DISTRACTION #1: APPEASE THE PARENTS
q School Choice
q Class Size: What changes in teaching as a result of smaller class size?
DISTRACTION #2: FIX THE INFRASTRUCTURE
Do we need:
• More effective curriculum?
• More rigorous standards?
• More tests?
• More alternatively shaped buildings?
DISTRACTION #3: FIX THE STUDENTS
If only we had better and more well-prepared students!
ü Label students who don’t fit the norm.
ü Hold students back.
ü Teach to learning styles.
DISTRACTION #4: FIX THE SCHOOLS!
If only schools had more money and more autonomy, they would be better schools!
ü Charter schools
ü Local decision-making
ü New leaders
ü Invest more money
DISTRACTION #5: FIX THE TEACHERS
If only teachers had better initial training, were paid for performance, and had better technology!
ü Change teacher education: while there may be some truth to this, the greatest source of teacher learning is not the teacher education program but the first year of teaching, and then the second. After the second year, teacher education programs matter very little.
ü Performance pay
ü Technology
ü More adults in schools
SUMMING IT ALL UP!
ü We love talking about distractors that don’t matter!
ü Some of our most politically popular “fixes” have the lowest impact on student achievement!
EVALUATING CORE INSTRUCTION
WHY SHOULD WE EVALUATE CORE INSTRUCTION?
EVALUATING CORE INSTRUCTION
1. Problem Identification: Is the core program sufficient?
2. Problem Analysis: If the core program is not sufficient, why isn’t it?
3. Plan Development: How will the needs identified in the core be addressed?
4. Plan Implementation: How will the effectiveness and efficiency of the core be monitored over time?
5. Plan Evaluation: Have improvements to the core been effective?
TURN AND TALK
How do you define a sufficient core instructional program? “A sufficient core instructional program is one in which…”
DEFINING A STRONG CORE
All materials and instruction used to provide the main classroom instruction in a particular content area; often more than a single textbook.
Whatever it takes to get most students meeting grade-level standards; will differ from district to district, school to school, and cohort to cohort.
GOAL FOR A STRONG CORE
To create a core instructional program that results in about 80 percent of students meeting grade-level expectations without additional support.
At least 95% of students who begin the year at grade-level expectations will end the year (and begin the next year) at grade-level expectations.
Utilizing evidence-based materials and instructional techniques.
Utilizing personnel and time resources creatively and wisely.
A SUFFICIENT CORE PROGRAM
At least 80% of students will meet grade-level standards given access to core instruction alone; these students did not need supplemental support to meet grade-level standards.
Under this goal scenario, more than 80% of students are proficient: some reached proficiency through the support of core plus supplemental or intensive instruction!
ASSESSING CORE PROGRAM SUFFICIENCY
MCA and other district assessments:
Total student population
Students who only received core instruction
Disaggregated subgroups
Individual grade levels
Subject areas
KEY QUESTIONS
What percentage of students in each grade level met our target expectations in fall, winter, and spring in Reading and Math?
KEY QUESTIONS
How do the percent-above-target scores for the current year in each grade level compare to those of the previous (three?) years?
KEY QUESTIONS
What percentage of students in each grade level met our target expectations in spring without receiving supplemental interventions?
KEY QUESTIONS
What percentage of students who began the year at or above target also ended the year at or above target?
Fall: 76% above target. Spring: 75% above target.
136 / 150 = 91% stayed proficient.
HOW DO YOU KNOW:
What percentage of students who began the year at or above target also ended the year at or above target?
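One way to make these key questions concrete is a small script over screening records. This is a minimal sketch, not the presenter's method: the record fields, the cut score, and the goal thresholds are hypothetical placeholders for whatever a district's assessment system actually stores.

```python
# Sketch: evaluating core sufficiency from fall/spring screening records.
# Field names, TARGET, and goal thresholds are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    fall_score: float
    spring_score: float
    had_intervention: bool  # received supplemental or intensive support

TARGET = 200.0  # hypothetical grade-level cut score on the screener

def pct(part: int, whole: int) -> float:
    """Percentage, guarded against an empty denominator."""
    return 100.0 * part / whole if whole else 0.0

def core_sufficiency_report(students: list[StudentRecord]) -> dict[str, float]:
    above_fall = [s for s in students if s.fall_score >= TARGET]
    core_only_proficient = [
        s for s in students
        if not s.had_intervention and s.spring_score >= TARGET
    ]
    stayed_on_target = [s for s in above_fall if s.spring_score >= TARGET]
    return {
        # Goal: about 80% meet standards with core instruction alone.
        "pct_core_only_proficient": pct(len(core_only_proficient), len(students)),
        # Goal: at least 95% who start on target end on target.
        "pct_stayed_on_target": pct(len(stayed_on_target), len(above_fall)),
    }
```

Applied to the slide's example cohort, where 136 of the 150 students who began the year on target also ended on target, pct_stayed_on_target comes out near 91%, short of the 95% goal.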
EVALUATION OF IMPLEMENTATION: THE BIG FIVE!
1. Assessments
2. Data-Based Decision Making
3. Multilevel Instruction
4. Infrastructure & Support
5. Fidelity & Evaluation
ASSESSMENTS
• Tools: reliable and valid, with predictive validity; staff can articulate their purpose
• Screening: all students, implementation accuracy, more than once per year
• Data points: screening data plus two other data sources; convergence
• Progress monitoring (PM) tools: alternate forms with equal difficulty, specification of minimum expected growth, benchmarks, reliable and valid
• PM process: schedules; implementation accuracy
DATA-BASED DECISION MAKING
• Process: decisions about participation in intervention levels are data-driven, involve teams, and are operationalized with clear decision rules
• Data system: a system is in place to access data in a timely way, with graphical displays and a process for setting and evaluating goals
• Responsiveness: decisions are made based on reliable and valid data reflecting slope or goal attainment, and are implemented accurately
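To make "clear decision rules" and "slope" concrete, here is a minimal sketch of one such rule: fit a trend line to weekly progress-monitoring scores and compare its slope to a minimum expected growth rate. The expected growth value and the six-data-point minimum are hypothetical examples, not a prescribed standard.

```python
# Sketch: an operationalized responsiveness decision rule.
# expected_weekly_growth and the 6-point minimum are hypothetical.

def slope(scores: list[float]) -> float:
    """Least-squares slope of weekly progress-monitoring scores."""
    n = len(scores)
    mean_x = (n - 1) / 2  # mean of week indices 0, 1, ..., n-1
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(scores))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

def responsiveness_decision(scores: list[float],
                            expected_weekly_growth: float = 1.0) -> str:
    """Suggests a team decision once enough data points exist."""
    if len(scores) < 6:
        return "keep collecting data"
    if slope(scores) >= expected_weekly_growth:
        return "responding: continue (or fade) the intervention"
    return "not responding: intensify or change the intervention"
```

For example, responsiveness_decision([10, 11, 13, 12, 14, 16, 17]) fits a slope of about 1.14 points per week against the default criterion of 1.0 and returns the "responding" branch.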
MULTILEVEL INSTRUCTION
• Universal (Tier 1): research-based curriculum; articulation of teaching and learning; differentiated instruction; standards-based; exceeding benchmark
• Supplemental (Tier 2): evidence-based; instructional characteristics; complements and is supplemental to universal instruction
• Intensive (Tier 3): intensity matched to need; instructional characteristics; relationship to universal instruction
FIDELITY & EVALUATION
• Action plan, evaluation of the plan, review of student data, and implementation data, spanning curriculum & instruction and assessments
• Five elements of fidelity: adherence, exposure, quality of delivery, program specificity, and student engagement
FIVE ELEMENTS OF FIDELITY
(Dane & Schneider, 1998; Gresham et al., 1993; O’Donnell, 2008)
Adherence: How well do we stick to the plan/curriculum/assessment?
Exposure/Duration: How often does a student receive an intervention? How long does an intervention last?
Quality of Delivery: How well is the intervention, assessment, or instruction delivered? Do you use good teaching practices?
Program specificity: How well is the intervention defined and different from other interventions?
Student Engagement: How engaged and involved are the students in this intervention or activity?
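As an illustration of how the five elements might be captured on an observation form, here is a minimal sketch; the 0-2 rubric and the 80% example are hypothetical, not a published scoring standard.

```python
# Sketch: recording one fidelity observation across the five elements.
# The 0-2 rubric and the example values are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class FidelityObservation:
    adherence: int            # 0 = not at all, 1 = partially, 2 = as designed
    exposure: int             # dosage and duration delivered as planned
    quality_of_delivery: int  # use of good teaching practices
    program_specificity: int  # clearly defined, distinct from other interventions
    student_engagement: int   # students engaged and involved

    def percent_fidelity(self) -> float:
        scores = (self.adherence, self.exposure, self.quality_of_delivery,
                  self.program_specificity, self.student_engagement)
        return 100.0 * sum(scores) / (2 * len(scores))

# Example: strong adherence and exposure, weaker delivery and engagement.
obs = FidelityObservation(2, 2, 1, 2, 1)
print(obs.percent_fidelity())  # 80.0
```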
FIDELITY AT THE PRIMARY PREVENTION LEVEL
ü Staff implement program components or core components of the curriculum with fidelity; might not be scripted, but is comprehensive given the age group and content area
ü High-quality instruction, including differentiation
ü Staff may use a mix of whole group, small group, dyadic, and independent practice
ü Staff implement the curriculum based on an established delivery timeframe (for example, provide 90–120 minutes of reading instruction five days a week)
ü Students are engaged throughout lessons
YOUR TURN
Take a moment to discuss with your colleagues:
How do we evaluate our core instruction?
What data do we use, how often, and where are they stored?
What are our areas of strength and weakness? How will we work to address these?
CENTER FOR APPLIED RESEARCH AND EDUCATIONAL IMPROVEMENT
(CAREI)
How can we help?
CONVERSATIONS AND SURVEYS
What we know:
ü Most educators agree that quality data can improve decision making!
ü Schools are data rich but information poor. An overarching theme was that our school staff and leaders need to understand how to use data for various types of decisions related to student outcomes!
Needs Around Data Usage:
ü Professional development around data literacy
ü Point person with expertise
ü Customized Data Reports
ü Timely Data
ü Help with evaluating existing assessments (almost half indicated this was a need)
ü Time to Collaborate
CAPACITY BUILDING AROUND DATA USE
ü A recent survey of MASA and MASE members indicated that only 7% of respondents believe they have “very good” capacity for data use.
ü Staff expertise is an issue.
ü Needs exist in the areas of:
Providing training on data interpretation
How to use data to make decisions
Developing surveys and questionnaires
Data for resource allocation decisions
Selecting diagnostic assessments
Infrastructure for progress monitoring
Analyzing data from screening
Selecting screening assessments
SUMMARY OF CONVERSATIONS AND SURVEYS
1. District-level respondents believe that quality data can improve their decision-making.
2. There is a large need for increased data literacy at all levels (teachers and administrators).
3. Districts currently lack capacity in the area of data-based decision making for a variety of reasons. Only 33% of districts indicated that they have staff with advanced training in evaluation and assessment.
4. A statewide MTSS implementation survey indicates many needs:
Ø Differentiated instruction (27% full implementation)
Ø Multiple measures used to evaluate effectiveness of instruction (21%) and for decision making (23%)
Ø Evaluation of interventions (29%)
Ø Accessible and timely data for decision making (40%)
Ø Valid and reliable assessments at the unit and lesson levels (30%)
Ø Use of valid and reliable progress monitoring measures (39%)
Ø Using progress monitoring data to determine if interventions are effective (33%)
CAREI WANTS TO HELP!
v CAREI is well-positioned to address the statewide gaps in data-based decision making!
v We can assist with data literacy training!
v We have staff with high levels of expertise in areas of research, evaluation, and assessment!
v We can increase services statewide through collaboration!
EXAMPLES OF QUESTIONS WE CAN ANSWER:
1. What is the current level of implementation of a district framework or initiative (e.g., MTSS, PBIS, PLCs, anti-bullying curriculum)? What things are working well? What are the barriers to success?
2. What are the effects of school wide or district wide decisions (block scheduling, after-‐school programs, curriculum adoption, etc.) on student achievement?
3. What is the effect of a flipped instruction model on student achievement? Does this framework produce better outcomes for certain students?
4. What impact do 1:1 technology initiatives have on student achievement? Student engagement?
5. Is a particular assessment valid for the intended purpose?
6. What is the process to develop reliable and valid common formative assessments?
TECHNICAL ASSISTANCE WE CAN PROVIDE:
1. How to develop surveys to assess … (student engagement, staff satisfaction, parental engagement).
2. Customized data reports targeting district variables of interest.
3. What types of assessments would be needed to answer specific district questions.
4. And much more! If you ask, we can help figure it out!
CAREI ASSEMBLY
ü Participation in the District Assembly four times per year, with remote access to meetings via video conferencing.
ü Participation in the CAREI collaborative grant program, with the potential of your district receiving up to $3,500 in funds for a teacher/leader in your district to collaborate with a College of Education faculty member on a co-designed research project.
ü A half-‐day of research consultation available to your district at no cost (e.g., meeting with school leaders, feedback on a survey design, a brief compilation of research literature on a topic, etc.) with CAREI staff.
ü Linkage to other resources in the University of Minnesota.
ü At-cost assistance in larger projects, such as large-scale survey design and data analysis, in-district or grant-funded evaluation projects, and so on.
FOR MORE INFORMATION
Please contact Kim Gibbons! [email protected] 651-303-4141
CAREI Website http://www.cehd.umn.edu/carei/