NRS Data Monitoring for Program Improvement
Objectives—Day 1
1. Describe the importance of getting involved with and using data;
2. Identify four models for setting performance standards, as well as the policy strategies, advantages, and disadvantages of each model;
3. Determine when and how to adjust standards for local conditions;
4. Set policy for rewards and sanctions for local programs;
5. Identify programmatic and instructional elements underlying the measures of educational gain, NRS follow-up, enrollment, and retention.
Agenda—Day 1
Welcome, Introduction, Objectives, Agenda Review
The Power of Data
– Why Get Engaged with Data? Exercise
– The Data-driven Program Improvement Model
– Setting Performance Standards
– Adjusting Standards for Local Conditions
– Establishing a Policy for Rewards and Sanctions
Getting Under the Data
– Data Pyramids
– Data Carousel
Evaluation and Wrap-up for Day 1
Objectives—Day 2
1. Distinguish between the uses of desk reviews and on-site monitoring of local programs;
2. Identify steps for monitoring local programs;
3. Identify and apply key elements of a change model; and
4. Work with local programs to plan for and implement changes that will enhance program performance and quality.
Agenda—Day 2
Agenda Review
Planning for and Implementing Program Monitoring
– Desk Reviews Versus On-site Reviews
– Data Sources (small group work)
– Steps and Guidelines for Monitoring Local Programs
Planning for and Implementing Program Improvement
– A Model of the Program Improvement Process
– State Action Planning
Closing and Evaluation
Question for Consideration
Why is it important to be able to produce evidence of what your state (or local) adult education program achieves for its students?
The Motivation Continuum
Intrinsic ↔ Extrinsic
Which is the more powerful force for change?
NRS Data-driven Program Improvement (Cyclical Model)
STEPS
– Set performance standards
– Examine program elements underlying the data
– Monitor program data, policy, and procedures
– Plan and implement program improvement
– Evaluate progress and revise, as necessary, and recycle
What’s Under Your Data? The Powerful Ps
Performance (Data)
Program Policies
Procedures
Processes
Products
NRS Data-driven Program Improvement Model
[Cyclical diagram centered on NRS Data: Set Performance Standards → Examine Program Elements Underlying the Data → Monitor Program Data, Policy, Procedures → Plan and Implement Program Improvement; Evaluate Improvement → back to Set Performance Standards]
Educational Gains for ESL Levels and Performance Standards
[Exhibit 1-2: Bar chart (y-axis 0%–100%) comparing program performance with performance standards for educational gain at each ESL level: Beg. Lit, Beg., Low Int., High Int., Low Adv., and High Adv.]
Questions Raised by Exhibit 1-2
How were performance standards set? Based on past performance?
Are standards too low at the higher levels?
Is performance pattern similar to that of previous years? If not, why not?
What are program’s assessment and placement procedures? Same assessments for high and low ESL?
How do curriculum and instruction differ by level?
What are student retention patterns by level?
Essential Elements of Accountability Systems
• Goals
• Measures
• Performance Standards
• Sanctions and Rewards
National Adult Education Goals
Reflected in NRS Outcome Measures of educational gain, GED credential attainment, entry into postsecondary education, and employment.
Performance Standards
Similar to a “sales quota”: how well are you going to perform this year?
– Should be realistic and attainable, but
– Should stretch you toward improvement
Set by each state in collaboration with ED
Each state’s performance is a reflection of the aggregate performance of all the programs it funds
Standards-setting Models
Continuous Improvement
Relative Ranking
External Criteria
Return on Investment (ROI)
Continuous Improvement
Standard based on past performance
Designed to make all programs improve compared to themselves
Works well when there is stability and a history of performance on which to base standard
Ceiling reached over time, resulting in little additional improvement
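To make the mechanics concrete, here is a minimal sketch of how a continuous-improvement standard might be computed from a program's own history. The averaging baseline, the two-point stretch increment, and the program data are illustrative assumptions, not NRS policy.

```python
# A minimal sketch, not from the source: next year's target is the
# program's own past performance plus a modest stretch increment, capped
# so targets stay attainable. The increment and data are invented.

def continuous_improvement_target(past_rates, increment=2.0, ceiling=100.0):
    """Next year's target (%): average of past rates plus a stretch increment."""
    baseline = sum(past_rates) / len(past_rates)
    return min(baseline + increment, ceiling)

history = {
    "Program A": [31.0, 34.0, 36.0],  # illustrative completion rates (%)
    "Program B": [22.0, 26.0, 26.0],
}

for program, rates in history.items():
    print(f"{program}: baseline {sum(rates) / len(rates):.1f}%, "
          f"target {continuous_improvement_target(rates):.1f}%")
```

Because each program is measured against its own baseline, targets rise as performance rises, which is also why the model eventually hits the ceiling noted above.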
Relative Ranking
Standard is mean or median performance of all programs
Programs ranked relative to each other
Works for stable systems where median performance is acceptable
Improvement focus mainly on low-performing programs
Little incentive for high-performing programs to improve
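A minimal sketch, assuming the median variant of the model: the standard is the median rate across all funded programs, and each program is flagged by where it falls relative to that standard. The program names and rates below are invented.

```python
# Hypothetical relative-ranking sketch: the standard is the median
# performance of all programs; programs are ranked and those below the
# median are flagged. All data are illustrative.
from statistics import median

rates = {"Program A": 22.0, "Program B": 27.0, "Program C": 33.0,
         "Program D": 36.0, "Program E": 50.0}

standard = median(rates.values())
print(f"Relative-ranking standard (median): {standard:.1f}%")

# Rank programs from highest to lowest and mark those below the standard.
for rank, (program, rate) in enumerate(
        sorted(rates.items(), key=lambda kv: kv[1], reverse=True), start=1):
    status = "meets" if rate >= standard else "below"
    print(f"{rank}. {program}: {rate:.1f}% ({status} standard)")
```

Because the median moves with overall performance, roughly half the programs always sit at or below the standard; that is the source of the weak incentive for high performers noted above.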
External Criteria
Set by formula or external policy
Promotes a policy goal to achieve a higher standard
Used when large-scale improvements are called for over the long term
No consideration of past performance, so standards risk being unrealistic or unattainable
Return on Investment
Ratio of program value to program cost
A business model; answers the question: Are the services or program worth the investment?
Can be a powerful tool for garnering funding (high ROI) or for losing funding (low ROI)
May ignore other benefits of program
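A minimal sketch of the ROI framing. Monetizing program “value” is the hard part in practice; the wage-gain assumption and all figures below are invented for illustration.

```python
# Hypothetical ROI sketch: value the program produces relative to what
# it costs. How "value" is monetized (here, assumed earnings gains of
# students entering employment) is an invented assumption.

def roi(program_value: float, program_cost: float) -> float:
    """Value produced per dollar spent."""
    return program_value / program_cost

# Illustrative: 120 students enter employment with an assumed $4,000
# average annual earnings gain, against a $300,000 program budget.
value = 120 * 4_000
cost = 300_000
print(f"ROI: {roi(value, cost):.2f} returned per $1.00 invested")
```

On these assumptions the ratio is 1.60, i.e., $1.60 of value per dollar invested; the same arithmetic with a low ratio is what makes ROI a tool for losing funding as well as garnering it.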
Decision Time for State Teams
1. Which model(s) do you favor for setting standards for/with locals?
2. Is it appropriate to use one statewide model or different models for different programs?
3. How will you involve the locals in setting the standards they will be held to?
Question for Consideration
How do the standard-setting model(s) that states select represent a policy statement on the relationship between performance and quality that states want to instill in local programs?
Adjusting Standards for Local Conditions
Research suggests that standards often need to be adjusted for local conditions before locals can work to improve program quality.
WHY IS THIS SO?
Factors that May Require Adjustment of Standards
Student Characteristics
– An especially challenging group
– Students at lower end of level
– Influx of different types of students
Local Program Elements
External Conditions
Shared Accountability
State and locals share responsibility to meet accountability requirements
– State provides tools and environment for improved performance
– Locals agree to work toward improving performance
Locals should know…
The purpose of the performance standards;
The policy and programmatic goals the standards are meant to accomplish;
The standard-setting model that the state adopts; and
That State guidance and support is available to locals in effecting change.
Shared Accountability
Which state-initiated efforts have been easy to implement at the local level? Which have not?
What factors contributed to locals’ successfully and willingly embracing the effort?
What factors contributed to a failed effort?
Shared Accountability
[Quadrant chart: Local Program Involvement (low to high) plotted against State Administrative Control (low to high), with quadrant captions “Locals Out of Control??”, “Hot Dog!! We’re really moving!”, “Anything Happening Out There??”, and “Get OFF our backs!!”]
What About Setting Rewards and Sanctions?
Which is the more powerful motivator: rewards or sanctions?
List all the different possible reward structures you can think of for local programs.
How might sanctioning be counter-productive?
List sanctioning methods that will not destroy locals’ motivation to improve or adversely affect relationships with the state office.
Variations on a Theme Exercise (refer to H-10)
Brainstorm as many possible rewards or incentives as you can for recognizing local programs that meet their performance standards.
Then brainstorm sanctions that the state might impose on local programs that do not meet their performance standards.
Select a recorder for your group to write one reward per Post-It Note and one sanction per Post-It Note.
When you have finished, wait for further instructions from the facilitator.
Summary of Local Performance Standard-setting Process
Procedure: Select standard-setting model
Goal: Reflect state policies; promote program improvement
Procedure: Set rewards and sanctions policy
Goal: Create incentives; avoid unintended effects
Procedure: Make local adjustments
Goal: Ensure standards are fair and realistic for all programs
Procedure: Provide T/A
Goal: Create atmosphere of shared accountability
Procedure: Monitor often
Goal: Identify and avoid potential problems
Getting Under the Data
NRS data, as measured and reported by states, represent the product of underlying programmatic and instructional decisions and procedures.
Four Sets of Measures
1. Educational gain
2. NRS Follow-up Measures
– Obtained a secondary credential
– Entered and retained employment
– Entered postsecondary education
3. Retention
4. Enrollment
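Before getting under each measure, a simplified sketch of the core educational-gain computation may help: the share of students who advance at least one educational functioning level. Real NRS business rules (pre- and post-test requirements, level exit criteria) are more involved, and the levels and student records below are invented.

```python
# Simplified, illustrative educational-gain computation: the share of
# students whose exit level is higher than their entry level. Actual NRS
# rules are more detailed; these levels and records are invented.

LEVELS = ["Beg. Lit", "Beg.", "Low Int.", "High Int.", "Low Adv.", "High Adv."]

students = [  # (entry level, exit level) pairs for a hypothetical cohort
    ("Beg. Lit", "Beg."), ("Beg.", "Beg."), ("Low Int.", "High Int."),
    ("High Int.", "Low Adv."), ("Low Adv.", "Low Adv."),
]

gained = sum(1 for entry, exit_ in students
             if LEVELS.index(exit_) > LEVELS.index(entry))
rate = 100 * gained / len(students)
print(f"Educational gain: {gained} of {len(students)} students = {rate:.0f}%")
```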
Educational Gain
[Diagram: elements underlying Educational Gain include Instruction, Goal Setting and Placement Procedures, Retention, Professional Development, Class Organization, Assessment Procedures, and Assessment Policies and Approach.]
Follow-up Measures
[Diagram: elements underlying the Follow-up Measures (GED, Employment, Postsecondary) include Goal-Setting, Support Services, Tracking Procedures, Professional Development, Retention, and Instruction.]
Retention
[Diagram: elements underlying Retention include Instruction, Students, Class Schedules and Locations, Placement Procedures, Support Services, Professional Development, and Retention Support and Policies.]
Enrollment
[Diagram: elements underlying Enrollment include Recruitment, Community Characteristics, Class Schedules and Locations, Professional Development, and Instruction.]
Question for Consideration
How might it benefit local programs if the State office were to initiate and maintain a regular monitoring schedule to compare local program performance against performance standards?
Regular Monitoring of Performance Compared with Standards
Keeps locals focused on outcomes and processes;
Highlights issues of importance;
Increases staff involvement in the process;
Helps refine data collection processes and products;
Identifies areas for program improvement;
Identifies promising practices;
Yields information for decision-making;
Enhances program accountability.
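As a hypothetical illustration of what routine desk-review monitoring can look like, this sketch compares each local program's reported rates against its standards and flags shortfalls for follow-up. Program names, measures, standards, and rates are all invented.

```python
# Hypothetical desk-review sketch: compare local programs' reported rates
# against performance standards and flag shortfalls. All data invented.

standards = {"educational_gain": 34.0, "retention": 60.0}  # percent

programs = {
    "Metro ABE":  {"educational_gain": 36.0, "retention": 55.0},
    "Valley ESL": {"educational_gain": 28.0, "retention": 64.0},
}

for name, rates in programs.items():
    for measure, standard in standards.items():
        gap = rates[measure] - standard
        if gap < 0:
            print(f"{name}: {measure} at {rates[measure]:.1f}% is "
                  f"{-gap:.1f} points below the {standard:.1f}% standard")
        else:
            print(f"{name}: {measure} meets the standard (+{gap:.1f} points)")
```

A report like this, run on data already in the state office, is one way the desk-review workload can be folded into staff's regular routine.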
BUT…
How can states possibly monitor performance of all local programs?
Don’t we have enough to do already??
Where will we find staff to conduct the reviews?
You’re kidding, right??
So…Let’s Find Some Answers
How can you monitor performance of locals without overburdening state staff?
What successful models are already out there?
How does your state office currently ensure local compliance with state requirements?
Can you build on existing structures?
Approaches to Monitoring
Desk Reviews
– Ongoing process
– Useful for quantitative data
 • Proposals
 • Performance measures
 • Program improvement plans
 • Staffing patterns
 • Budgets
On-site Reviews
– Single event, lasting 1-3 days
– Useful for qualitative data
– Review of processes & program quality
– Input from diverse stakeholders
Advantages and Disadvantages of Desk Reviews
Advantage: Data, reports, proposals, etc., are already in the state office
Disadvantage: Assumes accurate data that reflect reality
Advantage: Review can be built into staff’s regular workload
Disadvantage: Local staff and stakeholders are not heard
Advantage: Data are quantitative and can be compared to previous years
Disadvantage: Static view of data; no interaction in context
Advantage: No travel time or costs required
Disadvantage: No team perspective
Advantages and Disadvantages of On-site Reviews
Advantage: Data are qualitative; review of processes & program quality
Disadvantage: Stressful for local program and team
Advantage: Input from perspectives of diverse stakeholders
Disadvantage: Arranging site visits and the team is time-intensive for both locals and state
Advantage: State works with locals to explore options for improvement; provides T/A
Disadvantage: Requires time out of the office
Advantage: Opportunity to recognize strengths, offer praise, and identify best practices
Disadvantage: Incurs travel costs
Data Collection Strategies for Monitoring
1. Program Self-Reviews (PSRs)
2. Document Reviews
3. Observations
4. Interviews
Program Self-Reviews
Conducted by local program staff
Review indicators of program quality
Completed in advance of monitoring visit and can help focus the on-site review
Results can guide the program improvement process
Document Reviews
Can review from a distance:
– Proposals
– Qualitative and quantitative reports
– Improvement plans
Can review on-site:
– Student files
– Attendance records
– Entry and update records
– Course evaluations
Observations
Interactions
– During meetings
– At intake and orientation
– In hallways and on grounds
– In the classroom
Link what is observed to
– Indicators of quality
– Activities in the program plan
– Professional development workshops
Interviews
Help clarify or explore ambiguous findings
Provide information re: stakeholders’ opinions, knowledge, and needs
– Administrative, instructional, and support staff
– Community partners
– Community agencies (e.g., employment, social services)
– Learners
Fill in the Boxes: Monitoring with Indicators of Program Quality
In teams of 4-5 and using H-12, fill in the data sources you would expect to use, the questions you would ask locals, and the strategies you would use in conducting a desk review versus an on-site review.
Steps for Monitoring Local Programs
1. Identify state policy for monitoring; gather support from stakeholders.
2. Consider past practices when specifying scope of work for monitoring.
3. Identify persons to lead and participate in monitoring.
4. Identify resources available for monitoring locals.
5. Determine process for collecting data, with clearly defined criteria for rating; conduct monitoring.
6. Report findings and recommendations.
7. Follow up on results.
Data Help…
Measure student progress
Measure program effectiveness
Assess instructional effectiveness
Guide curriculum development
Allocate resources wisely
Promote accountability
Report to funders and to the community
Meet state and federal reporting requirements
Show trends
BUT…
Data do not help:
If the data are not valid and reliable;
If the appropriate questions are not asked after reviewing the data; or
If data analysis is not used for making wise decisions.
A Word about the Change Process
Factors that allow us to accept change:
1. There is a compelling reason to do so;
2. We have a sense of ownership of the change;
3. Our leaders model that they are serious about supporting the change;
4. We have a clear picture of what the change will look like; and
5. We have organizational support for lasting systemic change.
Stages of Change
1. Maintenance of the old system
2. Awareness of new possibilities
3. Exploration of those new possibilities
4. Transition to some of those possibilities or changes
5. Emergence of a new infrastructure
6. Predominance of the new system
A Word of Caution
Start small; don’t overwhelm locals with a “data dump.”
Begin with the core issues, such as educational gain.
Listen to what the data tell about the big picture; don’t get lost in too many details.
Work to create trust and build support by laying data on the table without fear of recrimination.
Provide training opportunities for staff on how to use data.
Be patient, working with what is possible in the local program.
Source: Spokane, WA School Superintendent Brian Benzel
Planning and Implementing Program Improvement
Stages of the Program Improvement Process
1. Planning;
2. Implementing;
3. Evaluating; and
4. Documenting Lessons Learned and Making Adjustments, as needed
Planning Questions
Who should be included on your program improvement team?
How will you prioritize areas needing improvement?
How will you identify and select strategies for effecting improvement?
Guiding Questions for Strategies
Is the strategy:
Clear and understandable to all users?
One specific action or activity, or dependent on other activities? (If so, describe the sequence of actions.)
An activity that will lead to accomplishing the goal?
Observable and measurable?
Assignable to specific persons?
Based on best practices?
One that all team members endorse?
Doable (one that can be implemented)?
Implementation Questions
Who will be responsible for taking the lead on ensuring that the change is implemented?
Who will be members of the “change” team and what will be their roles?
How will expectations for the change be promoted and nurtured?
How will the change be monitored?
Evaluation Questions
How will the changes that are implemented be evaluated?
How will the team ensure that both short- and long-term effects are measured?
Who will interpret the results?
Who will be on the lookout for unintended consequences?
Possible Evaluation Results
Significant improvement with no significant unintended consequences: Stay the course.
Little or no improvement: Stay the course OR scrap the changes?
A deterioration in outcomes: Scrap the changes.
Documenting the Process
Document what worked and what didn’t; lessons learned; and logical next steps or changes to the plan.
Use as a guide for future action.
State Planning Time
In your state teams, consider the questions on H-14 and begin planning.
Consider the stakeholders you want to include in your planning for data monitoring and program improvement.
Consider the problems you anticipate facing and propose solutions to those problems.
Complete H-14 to the best of your ability and be prepared to report on your plan in one hour.