1
Performance Measurement Community Literacy
March 19, 2007
Harry P. Hatry
The Urban Institute, Washington, DC
2
Key Distinctions
• Performance Measurement vs. Program Evaluation
• Performance Measurement vs. Performance Management
3
                        PROGRAM                    PERFORMANCE
                        EVALUATIONS                MONITORING

Frequency:              Irregular                  Regular, continuing
Coverage:               Done on only a             Covers most programs
                        few programs
Depth of Information:   Seeks reasons for          Only tells "the score,"
                        poor performance           not WHY
Cost:                   High for each study        Cost spread out
Utility:                Major decisions            Continuous program
                                                   improvement
4
Performance Measurement Information
Plus
Use of that Information to Improve Services
Produces
Performance Management
5
Outcome Sequence Chart
Fed/State Funds Provided → Organization Develops Improvement Plan → Teachers Implement Changes to Instructional Practice in Classroom → Students Participate in Regular Classroom Instruction → Students Demonstrate Improved Performance → Students Successfully Complete Education Requirements → Students Enrolled in Postsecondary Education and/or Employed

(Output → "Intermediate" Outcomes → "End" Outcomes)
6
Outcome Sequence Chart
Fed/State Funds Provided → Organization Develops Improvement Plan → Teachers Implement Changes to Instructional Practice in Classroom → Students Participate in Regular Classroom Instruction → Children with Disabilities Demonstrate Improved Performance → Children Successfully Complete General Education Requirements → Students Enrolled in Postsecondary Education and/or Employed

(Output → "Intermediate" Outcomes → "End" Outcomes)

Indicators for each step (modeled in the sketch below):
• Quantity of funds provided
• # of schools with SEA-approved plans
• #/% of teachers reporting changes
• #/% of students participating in regular instruction
• #/% of students demonstrating improved performance
• #/% of students who complete education requirements
• #/% of students enrolled in postsecondary education and/or employed
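The chart pairs each step of the outcome sequence with an indicator, and that mapping can be written down as a simple data structure. A minimal Python sketch: the stages and indicators come from the chart above, but the class name, field names, and the exact output/intermediate/end boundaries are our assumed reading of the slide.

```python
# Minimal sketch of the outcome-sequence chart as a data structure.
# Stages and indicators are from the chart; the class/field names and
# the exact category boundaries are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Stage:
    name: str       # step in the outcome sequence
    category: str   # "output", "intermediate outcome", or "end outcome"
    indicator: str  # performance indicator tracked for this step

LOGIC_MODEL = [
    Stage("Funds provided", "output", "Quantity of funds provided"),
    Stage("Improvement plan developed", "output",
          "# of schools with SEA-approved plans"),
    Stage("Teachers change instructional practice", "intermediate outcome",
          "#/% of teachers reporting changes"),
    Stage("Students participate in regular instruction", "intermediate outcome",
          "#/% of students participating in regular instruction"),
    Stage("Students demonstrate improved performance", "end outcome",
          "#/% of students demonstrating improved performance"),
    Stage("Students complete education requirements", "end outcome",
          "#/% of students who complete education requirements"),
    Stage("Students in postsecondary education and/or employed", "end outcome",
          "#/% enrolled in postsecondary education and/or employed"),
]

for stage in LOGIC_MODEL:
    print(f"[{stage.category}] {stage.name}: {stage.indicator}")
```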
7
Suggested Added Outcome Indicators
• #/% of students whose test scores showed one year's gain. [End Outcome]
• #/% of students reporting satisfaction with the assistance they received. [End Outcome]
• # of students who volunteered for the assistance. [Intermediate Outcome]
• % of eligible students who enrolled, i.e., the number enrolling divided by the number eligible. [Intermediate Outcome] (The arithmetic is sketched below.)
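The last two indicators are simple ratios. A minimal sketch of the arithmetic, where all counts and names are made-up illustrations rather than data from the presentation:

```python
# Minimal sketch of the suggested ratio indicators; every count below
# is a hypothetical illustration, not data from the presentation.

def pct(numerator: int, denominator: int) -> float:
    """Return numerator/denominator as a percentage."""
    return 100.0 * numerator / denominator

students_with_one_year_gain = 84  # hypothetical
students_served = 140             # hypothetical
students_enrolled = 120           # hypothetical
students_eligible = 300           # hypothetical

# End outcome: share of served students whose scores gained one year
print(f"{pct(students_with_one_year_gain, students_served):.0f}% showed one year's gain")

# Intermediate outcome: enrollment rate among eligible students
print(f"{pct(students_enrolled, students_eligible):.0f}% of eligible students enrolled")
```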
8
TYPICAL SERVICE QUALITY CHARACTERISTICS
1. TIMELINESS/WAIT TIMES
2. PLEASANTNESS/FRIENDLINESS
3. CONVENIENCE/ACCESSIBILITY OF HOURS OF OPERATION; CUSTOMER CAN REACH SOMEONE TO TALK TO
4. AWARENESS OF PROGRAM SERVICES
5. CLARITY OF INFORMATION/REGULATIONS
6. STAFF/TEACHER HELPFULNESS/KNOWLEDGE
7. OVERALL CUSTOMER SATISFACTION
9
Categories of Data Sources and Collection Procedures
• Agency Records
• Administered Tests
• Customer Surveys
• Trained Observer Procedures
• Expert Judgments
• Focus Groups
10
Sample Outcome Information From Customer Surveys
• Ratings of overall satisfaction
• Ratings of specific service quality characteristics
• Ratings of results of the service
• Whether actions/behavior sought by the program occurred
• Extent of service use
• Awareness of services
• Reasons for dissatisfaction or non-use
• Suggestions for improvements
11
Making Performance Information Really Useful
1. Provide frequent, timely information to programs and their staffs.
2. Set targets each year.
3. Disaggregate outcome data by customer and service characteristics.
4. Do regular, basic analysis of the data, such as comparisons.
5. Seek explanations for unexpected outcomes.
12
Percent of Students Who Reported the Program's Assistance Had Helped Them Improve Their Reading

                            N     Very or Somewhat   Target   Difference
                                  Helpful                     (Percentage Points)
All Clients                 560   50%                60%      -10

Gender
  Females                   230   30%                60%      -30
  Males                     330   64%                60%      +4

Beginning Reading Level
  Lowest                    100   60%                60%      0
  Second                    220   55%                60%      -5
  Third                     180   44%                60%      -16
  Highest                    60   33%                60%      -27

Faculty
  A                         190   53%                60%      -7
  B                          30   67%                60%      +7
  C                         150   33%                60%      -27
  D                         190   58%                60%      -2
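Disaggregated breakouts like the table above can be produced mechanically from survey records. A minimal pandas sketch, assuming one row per surveyed client; the records and column names below are invented for illustration, while the 60% target comes from the slide.

```python
# Minimal sketch of producing a disaggregated breakout with pandas.
# The DataFrame stands in for real survey records; column names are
# hypothetical. The 60% target is from the slide.
import pandas as pd

TARGET = 60.0  # target percent rating assistance very/somewhat helpful

# One row per surveyed client (invented records for illustration).
df = pd.DataFrame({
    "gender":  ["F", "M", "F", "M", "M", "F", "M", "M"],
    "faculty": ["A", "A", "B", "C", "C", "D", "D", "D"],
    "helpful": [0, 1, 1, 0, 1, 0, 1, 1],  # 1 = very/somewhat helpful
})

def breakout(frame: pd.DataFrame, by: str) -> pd.DataFrame:
    """Percent rating the service helpful, by one characteristic."""
    g = frame.groupby(by)["helpful"].agg(N="count", pct_helpful="mean")
    g["pct_helpful"] *= 100.0
    g["diff_vs_target"] = g["pct_helpful"] - TARGET
    return g.round(0)

print(breakout(df, "gender"))
print(breakout(df, "faculty"))
```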
13
Which Hospital Would You Choose?
                    MERCY HOSPITAL                 APOLLO HOSPITAL
Surgery patients    2,100                          800
Deaths              63                             16
Death rate          3%                             2%

BUT…

Good condition      600 patients, 6 deaths        600 patients, 8 deaths
                    (1% death rate)               (1.3% death rate)
Poor condition      1,500 patients, 57 deaths     200 patients, 8 deaths
                    (3.8% death rate)             (4% death rate)
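The slide's point is a Simpson's-paradox effect: Mercy looks worse overall only because it takes far more poor-condition patients, yet it has the lower death rate within each condition. A minimal sketch that reproduces the slide's arithmetic; the counts are taken directly from the slide, and only the code structure is ours.

```python
# Reproduces the hospital slide's arithmetic. Counts are taken directly
# from the slide; only the structure of the code is an assumption.
cases = {
    # hospital: {condition: (patients, deaths)}
    "Mercy":  {"good": (600, 6), "poor": (1500, 57)},
    "Apollo": {"good": (600, 8), "poor": (200, 8)},
}

for hospital, strata in cases.items():
    patients = sum(p for p, _ in strata.values())
    deaths = sum(d for _, d in strata.values())
    print(f"{hospital}: overall {100 * deaths / patients:.1f}% death rate")
    for condition, (p, d) in strata.items():
        print(f"  {condition} condition: {100 * d / p:.1f}%")

# The output shows the paradox: Mercy is higher overall (3.0% vs 2.0%)
# yet lower within each condition (1.0% vs 1.3% good; 3.8% vs 4.0% poor),
# because 1,500 of Mercy's 2,100 patients arrive in poor condition.
```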
14
Types of Comparison
Compare the latest outcome data:
1. To previous performance
2. To targets set by the organization
3. Among categories of customers
4. Among facilities
5. By type and amount of service
6. To results in other communities
15
Making Performance Information Really Useful (Continued)
6. Hold “How Are We Doing?” sessions after each performance report.
7. Prepare “Service Improvement Action Plans” for areas with low performance.
8. Provide recognition rewards.
9. Identify successful practices.
16
17
Outcome Indicators Project
A joint project of the Urban Institute and The Center for What Works
Website: http://www.urban.org/center/cnp/projects/outcomeindicators.cfm
The Outcome Indicators Project provides a framework for tracking nonprofit performance. It suggests candidate outcomes and outcome indicators to assist nonprofit organizations that seek to develop new outcome monitoring processes or improve their existing systems.
This website contains three primary elements:
1. Building a Common Outcome Framework to Measure Nonprofit Performance
2. Outcomes and Performance Indicators for 14 Specific Program Areas
3. Nonprofit Taxonomy of Outcomes
The 14 program areas:
• Adult Education and Family Literacy
• Advocacy
• Affordable Housing
• Assisted Living
• Business Assistance
• Community Organizing
• Emergency Shelter
• Employment Training
• Health Risk Reduction
• Performing Arts
• Prisoner Re-entry
• Transitional Housing
• Youth Mentoring
• Youth Tutoring
18
Crocodiles May Get You
But in the End
It Should Be Very Worthwhile
For Student Literacy