Data Guided Decision-Making
How asking questions can improve fidelity
Allison Metz, Ph.D. National Implementation Research Network
FPG Child Development Institute
University of North Carolina at Chapel Hill
SPDG Grantees Meeting
US Department of Education
Office of Special Education Programs
November 6, 2013
Washington, DC
Goals for Today’s Session
• Why should we start with the questions?
• How can we foster the curiosity of key stakeholders?
• How can data guided decision-making improve fidelity of a well-defined “what”?
• How can data guided decision-making further operationalize your “what” and “how”?
Decision Support Data Systems
Starting with Questions
Achieving Good Outcomes Implementation Science
Effective
Interventions
The “WHAT”
Effective Implementation
The “HOW”
Positive Outcomes
for Children
DATA DATA DATA
Where to start
1. Determine what questions you want to answer (implementation team)
2. Determine what data will help to answer questions
3. Determine the simplest way to get the data
4. Put systems in place to collect data
5. Analyze data to answer questions
(Lewis, 2009)
Data-Based Decision Making
Questions to Answer
Consider questions related to:
Data-Based Decision Making
Intervention Implementation Outcomes
How is this different?
Practitioner fidelity is an important component of a DSDS, but not the full picture. What else do we need to understand?
Decision Support Data
Questions to Answer
•Are practitioners implementing the intervention with fidelity?
•Are practitioners “bought in” to the new way of work?
•Are teachers, families, partners satisfied?
•Are stakeholders collaborating?
•Are coaching and supervision being delivered as intended?
•Are outcomes being achieved?
•Are appropriate referrals being made?
Data-Based Decision Making
Fostering Curiosity
Rather than simply providing grantees with data (e.g., fidelity data provided by program developers), we need to foster the curiosity of grantees regarding their own implementation efforts so that grantees become learning organizations.
Metz and Albers, In Press
Discussion Points: Starting with Questions
• What do you need to know? Why do you need to know it?
• What type of data could help you answer this question?
• What would be the easiest way to get this information?
• Do you have systems in place to collect this information? To conduct the analysis to answer your questions?
• What are your next right steps?
Supporting New Ways of Work
Asking questions to improve fidelity of the
“what”…
Implementation Supports
Promote High Fidelity
Fidelity is an implementation outcome
How can we create an implementation infrastructure that supports high fidelity implementation?
IMPROVED OUTCOMES FOR CHILDREN AND FAMILIES
Performance Assessment (fidelity)
Coaching
Training
Selection
Integrated & Compensatory
Systems Intervention
Facilitative Administration
Decision Support Data System
[Implementation Drivers diagram: Competency Drivers, Organization Drivers, and Leadership Drivers (Technical and Adaptive), Integrated & Compensatory]
Common Questions
FIDELITY IS AN IMPLEMENTATION OUTCOME
Therefore, common questions include:
•Are staff able to implement the intervention as intended (with fidelity)?
– If yes, what supports are in place to continue to ensure ongoing performance? (e.g., ongoing training, coaching, data systems, facilitative administration, teams)
– If no, what barriers or challenges are impeding their ability to implement with fidelity?
• Competency challenge – e.g., need for more training, ongoing coaching, additional support from the developer
• Organizational challenge – e.g., a policy or procedure in the agency or system inhibiting fidelity
• Leadership challenge – e.g., need for leadership to attend to organizational or system barriers
Fidelity Results
Common Questions: Fidelity Results
FIDELITY IS AN IMPLEMENTATION OUTCOME
Common questions include:
– If no, who is primarily responsible for this driver?
– What stage of implementation is this program in?
– What data do we have telling us this is a challenge?
– How can potential solutions be identified?
– How can potential solutions be tracked to monitor improvement?
– Is this challenge being experienced with other programs?
– Who needs to know about this challenge?
Results from Child Wellbeing Project
Case Example
Component T1
Selection 1.44
Training 1.33
Coaching 1.27
Perf. Assessment 0.78
DSDS 0.18
Fac. Administration 1.38
Systems Intervention 1.29
Average Composite Score 1.1
Fidelity (% of cases) 18%
Case management model involved intense program development of core intervention components and accompanying implementation drivers.
Clinical case management and home visiting model for families post-care.
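The composite score in the table reads as an unweighted mean of the seven driver scores. A minimal sketch of that arithmetic, assuming a simple average (the averaging rule is inferred from the table, not stated in the source):

```python
# T1 driver scores from the Child Wellbeing Project case example.
# Assumption: the composite is an unweighted mean of the seven drivers.
t1_scores = {
    "Selection": 1.44,
    "Training": 1.33,
    "Coaching": 1.27,
    "Performance Assessment": 0.78,
    "DSDS": 0.18,
    "Facilitative Administration": 1.38,
    "Systems Intervention": 1.29,
}

composite = sum(t1_scores.values()) / len(t1_scores)
weakest_driver = min(t1_scores, key=t1_scores.get)
print(round(composite, 1))   # 1.1, matching the reported composite
print(weakest_driver)        # DSDS, the lowest-scoring driver at T1
```

Note that the lowest-scoring driver at T1 (DSDS, at 0.18) is exactly the decision-support infrastructure this session argues for.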
Program Improvement
Program Review Process
• Process and Outcome Data
• Detection Systems for Barriers
• Communication Protocols
Questions to Ask
• What formal and informal data have we reviewed?
• What are the data telling us?
• What barriers have we encountered?
• Would improving the functioning of any Implementation Driver help address the barrier?
Fidelity Data
Results from Child Wellbeing Project
Case Example
Component T1 T2 T3
Selection 1.44 2.00* 1.89*
Training 1.33 1.5* 1.10
Coaching 1.27 1.73* 1.83*
Perf. Assessment 0.78 1.34 2.0*
DSDS 0.18 1.36 2.0*
Fac. Administration 1.38 2.00* 2.0*
Systems Intervention 1.29 1.86* 2.0*
Average Composite Score 1.1 1.68* 1.83*
Fidelity (% of cases) 18% 83% 83%
Success Coach model involved intense program development of core intervention components and accompanying implementation drivers
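One way an implementation team might work with these repeated measurements is to compute per-driver change between time points and flag where support improved or slipped. A sketch using the T1 and T3 columns above (the comparison logic is illustrative, not from the source):

```python
# T1 and T3 driver scores from the table above.
t1 = {"Selection": 1.44, "Training": 1.33, "Coaching": 1.27,
      "Perf. Assessment": 0.78, "DSDS": 0.18,
      "Fac. Administration": 1.38, "Systems Intervention": 1.29}
t3 = {"Selection": 1.89, "Training": 1.10, "Coaching": 1.83,
      "Perf. Assessment": 2.0, "DSDS": 2.0,
      "Fac. Administration": 2.0, "Systems Intervention": 2.0}

# Change per driver between the first and last measurement.
change = {name: round(t3[name] - t1[name], 2) for name in t1}
most_improved = max(change, key=change.get)
declined = [name for name, delta in change.items() if delta < 0]
print(most_improved)  # DSDS shows the largest gain
print(declined)       # Training slipped between T1 and T3
```

A review like this points the team at questions from the slides above: who is responsible for the declining driver, and what data say it is a challenge?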
Discussion Points: Data Guided Decision-Making
• How curious is your agency about your program fidelity? How can you foster greater curiosity?
• Are you currently using a structured process to assess how implementation efforts are contributing to program fidelity? How could you improve this process?
• How might developing a core set of questions to address each month regarding fidelity and implementation be useful? How could you build this process into regular routines?
Supporting New Ways of Work
Asking questions to define the “what” and
“how”…
Analyze Data
7 Basic Evaluation Questions:
1.What does “it” look like now?
2.Are we satisfied with how “it” looks?
3.What would we like “it” to look like?
4.What would we need to do to make “it” look like that?
5.How would we know if we’ve been successful with “it”?
6.What can we do to keep “it” like that?
7.What can we do to make “it” more efficient & durable?
(Sugai, 2004)
Data-Based Decision Making
“Key Aspects of Improvement”
“Many initiatives fail for lack of study and reflection on what is actually being done and what the results are from having done it. Observing, describing, and documenting are key aspects to a program improvement cycle, and particularly critical during the pilot phase when key functions of interventions are emerging.” The Child Wellbeing Project, Improvement Cycle Tool
Improvement Cycles
The crux of the DSDS is Improvement Cycles
•Decisions should be based on data
•Change should occur on purpose
•Improvements must be ongoing – always striving to be better in order to succeed and have impact
DSDS
Cycles
Improvement Cycles
Three critical Improvement Cycles:
1.Usability Testing
2.Rapid Cycle Improvement Teams
3.Practice-Policy Communication Loops
DSDS
Cycles
Usability Testing
• Usability Testing is the strategic use of Plan, Do, Study, Act cycles to "test" and improve processes and procedures that are being used for the "first time"
• Occurs during initial implementation of the process or procedure being tested
DSDS
Cycles
Usability Testing
Why is it helpful?
•Designed to improve and “stabilize”
• Early occurring components
• Implementation supports
• Data collection processes
•So that major “bugs” are worked out
•And therefore:
• Processes are more likely to be “effective”
• Implementation Drivers can support the “right” processes
DSDS
Cycles
Which processes to test?
• Intervention Processes
• Are literacy coaches able to engage children & families?
• Are literacy coaches able to do the intervention as intended?
• Implementation Processes
• Does training occur as intended?
• Can fidelity measures be collected as intended?
• Data Collection Processes
• Are assessments done on schedule?
• Is the data entry system functional?
Usability Testing
Cycles
Testing Dimensions
Limited number of “cases” within a given test
• Enough to sample with variability in order to detect systematic problems rather than individual challenges
• Staged to quickly get a sense of challenges
– Small enough number to give you quick early returns of data
– Metrics are both functional and easy to collect
– Quantitative and Qualitative Information
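The point about sampling enough cases to separate systematic problems from individual ones can be sketched as a simple decision rule. The data, the 75% target, and the "more than half" rule below are illustrative assumptions, not part of the source:

```python
# Hypothetical fidelity checks per practitioner across a small test
# (True = component delivered as intended). Illustrative data only.
results = {
    "practitioner_a": [True, True, False, True],
    "practitioner_b": [False, False, True, False],
    "practitioner_c": [True, True, True, True],
}

def hit_rate(checks):
    """Fraction of checks where the component was delivered as intended."""
    return sum(checks) / len(checks)

rates = {name: hit_rate(checks) for name, checks in results.items()}

# Assumed rule of thumb: if most practitioners fall below target, the
# problem is likely systematic (training, procedures) rather than a
# matter of individual coaching.
target = 0.75
below = [name for name, rate in rates.items() if rate < target]
systematic = len(below) > len(rates) / 2
```

With this toy data only one of three practitioners misses the target, so the sketch would point toward an individual coaching need rather than a systematic fix.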
Usability Testing
Cycles
Testing Processes
For EACH Test…
• What's the relevant measure/key outputs that can be quickly revealed? (e.g., opinions of teachers or parents, percentage of assessments done on schedule)
• Who will be responsible for reporting the data and on what schedule? (e.g., Teachers will send an e-mail at the end of each week to their supervisor indicating the percentage of home visits that were done as scheduled)
• Who will be responsible for summarizing the data? (e.g., Supervisors will summarize the data across all practitioners and forward to the Implementation Task Group by 4 p.m. on Tuesday for the prior week’s data).
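The reporting chain described above (teachers e-mail weekly percentages, supervisors summarize across practitioners for the Implementation Task Group) could be sketched like this; the function, data, and field names are hypothetical:

```python
# Hypothetical weekly reports: percentage of home visits done as
# scheduled, one entry per teacher, as in the example above.
weekly_reports = {"teacher_1": 80.0, "teacher_2": 100.0, "teacher_3": 60.0}

def summarize_for_task_group(reports):
    """Roll practitioner-level percentages up into the summary a
    supervisor would forward to the Implementation Task Group."""
    values = list(reports.values())
    return {
        "practitioners_reporting": len(values),
        "mean_pct_on_schedule": sum(values) / len(values),
        "lowest_reporter": min(reports, key=reports.get),
    }

summary = summarize_for_task_group(weekly_reports)
```

The design point is the one from the slide: the metric is both functional (it flags who needs support) and easy to collect on a weekly rhythm.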
Usability Testing
Testing Results
What if it’s not good enough?
Usability Testing
Plan, Do, Study, Act
1. You Planned
2. You Did It As Intended
3. You Studied the Unacceptable Results
   1. Data
   2. Conversation
4. You Act – Plan, Do, Study Again
“Get Started, Get Better”
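The Plan, Do, Study, Act sequence above amounts to iterating until results are acceptable. A toy sketch of that loop, where the scores, the threshold, and the improvement step are invented purely for illustration:

```python
# Toy Plan-Do-Study-Act loop: each cycle the team plans an adjustment,
# does it as intended, studies the result, and acts by re-planning if
# results are still unacceptable. All numbers are illustrative.
def do_cycle(score):
    """Stand-in for one Plan-Do step: the adjustment nudges results up."""
    return min(score + 0.2, 1.0)

score = 0.3       # initial, unacceptable result
threshold = 0.8   # "good enough" target set by the team
cycles = 0
while score < threshold:     # Study: are results still unacceptable?
    score = do_cycle(score)  # Act: plan, do again
    cycles += 1
```

The "Get Started, Get Better" framing is exactly this loop: begin with an imperfect process and let repeated cycles of data and conversation carry it to the target.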
Improvement Cycles
Three critical Improvement Cycles:
1.Usability Testing
2.Rapid Cycle Improvement Teams
3.Practice-Policy Communication Loops
DSDS
Cycles
Rapid Cycle Problem Solving
• Problem-solving during early efforts
  – Team Lead identified
  – Right people on the team
  – Time-limited to address the problem
  – Team disbands
• Practice Improvement
  – On-going efforts to improve practices and competencies
  – Use data to achieve better outcomes for children and "embed" solutions
DSDS
Cycles
Use of Data
Quickly Identify:
•Data needs
•Potential Indicators
•Methods of assessment
•Efficient analysis
•Targeted strategies based on analysis and how reassessment will occur quickly
Rapid Cycle Problem Solving
Cycles
Improvement Cycles
Three critical Improvement Cycles:
1.Usability Testing
2.Rapid Cycle Improvement Teams
3.Practice-Policy Communication Loops
DSDS
Cycles
Practice-Policy Communication Cycle
[Diagram: Policy (Plan) and Practice (Do) linked in a loop. Policy shapes practice through structure and procedure; practice sends data and feedback back to policy (Study, Act). FORM SUPPORTS FUNCTION.]
Practice to Policy
Practice to Policy Communication Loops use data:
1. To understand what's happening during service delivery
2. To create hospitable organizations and supports for practice
3. To achieve results
DSDS
Cycles
Practice to Policy
Typically, Implementation Teams are the vehicle for this information:
1.Gather information from practitioners
2.Share results with leadership and practitioners
3.Communicate changes and responses bi-directionally
DSDS
Cycles
Discussion Points: Data Guided Decision-Making
• How might you employ usability testing to stabilize your "what"?
– What questions would you ask?
– What would your targets be?
– How would you collect and analyze the data?
– Who would be responsible?
• How are you making use of improvement cycles on a regular basis to ensure your infrastructure (the “how”) is supporting your innovation or program model (the “what”)?
Supporting New Ways of Work
Summary Data-based Decision Making Requires:
•Thoughtful consideration of what’s most important to measure
•An efficient system for collecting data
•Simple processes for analyzing and targeting strategies for improvement
•Transparent and inclusive processes for communicating results of data and improvement strategies
•Plans to celebrate strengths and successes
•Teams that use stage-appropriate data-based decisions to make improvements through various types of cycles