
CONDUCTING PROGRAM EVALUATIONS FOR FEDERAL PROGRAMS

Brooke Blair, ALSDE
Mark Ward, ALSDE
Erin McCann, SEDL
Mary Lou Meadows, SEDL

Where is Home?

Session Objectives

Participants will:

• Increase their understanding of the connection between program evaluation, the Federal Programs Monitoring document, and the eGAP Consolidated Application.

• Increase their understanding of the differences between immediate, short-term, intermediate, and long-term outcomes.

• Increase their knowledge of indicators and performance measures for reporting the effectiveness of actions using some short-term and intermediate outcomes.

• One measure, by itself, gives some useful information . . . But:

• Comprehensive measures used together and over time provide much richer information.

• Together, these measures can provide a powerful picture that can help us understand the school’s impact on student achievement.

• These measures, when used together, give schools the information they need to get the results they want (see the sketch below).
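As a rough illustration of these points, here is a minimal sketch (Python, with hypothetical student IDs, values, and an assumed 80% attendance threshold) that reads one measure in the context of two others:

```python
# Minimal sketch: combining measures from different data categories.
# Student IDs, values, and the 80% threshold are all hypothetical.

attendance = {"s01": 0.95, "s02": 0.78, "s03": 0.99}    # attendance rate
reading_score = {"s01": 82, "s02": 61, "s03": 90}       # benchmark score
subgroup = {"s01": "EL", "s02": "EL", "s03": "non-EL"}  # demographic flag

# One measure by itself: the schoolwide mean reading score.
print(sum(reading_score.values()) / len(reading_score))

# Measures together: mean reading score of students with attendance
# below 80%, broken out by subgroup -- a much richer picture.
for group in sorted(set(subgroup.values())):
    scores = [reading_score[s] for s in reading_score
              if subgroup[s] == group and attendance[s] < 0.80]
    if scores:
        print(group, sum(scores) / len(scores))
```

The schoolwide mean alone hides that the low-attendance students in this toy data score well below it; only the combined view surfaces that pattern.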

Needs Assessment Data

[Diagram: Student Learning, Demographics, Perceptions, School Processes]

Bernhardt’s Model of Data Categories

Bernhardt, V. (2004). Data analysis for continuous school improvement (2nd ed.). Larchmont, NY: Eye on Education.

[Diagram: Student Learning, Demographics, Perceptions, School Processes; Demographics highlighted]

Examples:

Enrollment

Attendance

Drop-out Rate

Ethnicity

Gender

Grade Level

Language Proficiency

Bernhardt’s Model of Data Categories

[Diagram: Student Learning, Demographics, Perceptions, School Processes; Perceptions highlighted]

Examples:

Perceptions of learning environment

Values and beliefs

Attitudes

Observations

Bernhardt’s Model of Data Categories

[Diagram: Student Learning, Demographics, Perceptions, School Processes; Student Learning highlighted]

Examples:

Norm-referenced tests

Criterion-referenced tests

Teacher observations

Bernhardt’s Model of Data Categories

[Diagram: Student Learning, Demographics, Perceptions, School Processes; School Processes highlighted]

Examples:

Scheduling

Common Planning Time

Special Services Referrals

School Policies

Bernhardt’s Model of Data Categories

Why do you think that time would be an important variable in data collection?

Time
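To make the role of time concrete, here is a minimal sketch (hypothetical years and attendance rates) that turns annual snapshots of one measure into year-over-year changes:

```python
# Minimal sketch: the same measure tracked over time.
# Years and attendance rates are hypothetical.
attendance_rate = {2012: 0.91, 2013: 0.93, 2014: 0.95}

years = sorted(attendance_rate)
for prev, curr in zip(years, years[1:]):
    change = attendance_rate[curr] - attendance_rate[prev]
    print(f"{prev}->{curr}: {change:+.2%}")

# A single-year snapshot (0.95 in 2014) shows status but not
# direction; the year-over-year changes show the trend.
```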

Compliance Assistance Review Document

Examples of Programs Requiring a Needs Assessment:

• Title I
• Title II
• Title III
• McKinney-Vento
• Neglected & Delinquent

Data Quality

No Child Left Behind Act of 2001
Title I – Best Use of Funds

• SEC. 1001. Statement of purpose:

• (4) holding schools, LEAs…accountable for improving the academic achievement of all students, and identifying and turning around low-performing schools that have failed to provide a high-quality education to their students, while providing alternatives to students in such schools to enable the students to receive a high-quality education;

• (5) distributing and targeting resources sufficiently to make a difference to LEAs and schools where needs are the greatest;

(Title I, Improving the Academic Achievement of the Disadvantaged)

Key Considerations for Program Evaluation:

• The types of data used to determine success.
• The activities that are associated with success.
• How the results are being used to drive future improvement efforts.
• How you are prioritizing needs to make the greatest impact.
• AND whether you are achieving the desired outcomes.

Outcomes/Impacts

Immediate → Short-Term → Intermediate → Long-Term

Adapted from Innovation Network, Inc., Logic Model Workbook, www.innonet.org

IMMEDIATE IMPACTS

Direct results of an activity

– # of participants who attended a workshop
– # of students attending a tutoring program
– # of materials provided
– Web site designed and activated
– Policy manual written and approved
– Position descriptions developed
– Job positions filled

SHORT-TERM IMPACTS

Changes in Learning as a result of an activity

– New knowledge
– New skills
– Changed attitudes, opinions, or values
– Changed motivation
– Changed aspirations

INTERMEDIATE IMPACTS

Changes in Action as a result of gains in learning:

– Modified behavior
– Changed practice
– Changed decisions
– Changed policies

LONG-TERM IMPACTS

Changes in Condition as a result of actions taken:

– Human
– Economic
– Civic/Community
– Environment

Strategy and Action Steps

Strategy: Provide supplemental reading/literacy instruction for students identified as at risk.

Actions:
• Purchase Read with Ease (computer-assisted learning program).
• Hire lab instructors, or reallocate teacher time to allow for time to work in the lab with at-risk students.
• Schedule lab hours for at-risk students before and after school.
• Train lab instructors in use of Read with Ease.
• Lab instructors provide support to at-risk students in the computer reading lab.

Strategy Action Steps: Types of expected outcomes/impacts

• Purchase Read with Ease -- Immediate
• Hire lab instructors -- Immediate
• Schedule lab hours -- Immediate
• Train lab instructors -- Immediate, Short, Intermediate
• Lab instructors provide support -- Short, Intermediate, Long
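For evaluation planning, this mapping can be kept as a simple lookup table. A minimal sketch, with the entries taken from the slide above and the structure itself an assumption:

```python
# Minimal sketch: action steps mapped to their expected
# outcome/impact levels, as listed on the slide above.
expected_impacts = {
    "Purchase Read with Ease": ["Immediate"],
    "Hire lab instructors": ["Immediate"],
    "Schedule lab hours": ["Immediate"],
    "Train lab instructors": ["Immediate", "Short", "Intermediate"],
    "Lab instructors provide support": ["Short", "Intermediate", "Long"],
}

# Example query: which actions should eventually show long-term impacts?
for action, levels in expected_impacts.items():
    if "Long" in levels:
        print(action)
```

Keeping the mapping explicit makes it easy to check that each expected impact level has at least one planned performance measure behind it.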

Evidence of Outcome/Impact: Performance Measures

Strategy and Action Steps

Strategy: Provide school-based reading/literacy professional development for administrators, teachers, and other instructional staff.

Actions:
• Hire reading coach to facilitate ongoing reading/literacy professional development at the school.

• Reading coach and principal meet weekly to discuss reading/literacy issues related to students and teachers.

• Instructional staff meet weekly for 1 hour on reading/literacy instruction.

• Reading coach assists instructional staff in meetings and in implementation of new reading/literacy strategies.

Measuring Impacts: Performance Measurements

• Surveys, interviews, focus groups
  – teachers, administrators, coaches/mentors, students, parents, community
• Pre-post tests of knowledge/skill
  – professional development participants, teachers, students
• Observations
  – of teachers, administrators, coaches, students
• Document/records reviews
  – participation/attendance records, lesson plans, journals/logs, student homework/projects, class grades, performance on benchmark and standardized tests
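For the pre-post tests listed above, one common performance measure is the average gain from pre-test to post-test. A minimal sketch with hypothetical participants and scores:

```python
# Minimal sketch: average pre-post gain as one performance measure.
# Participant IDs and scores are hypothetical.
pre = {"t01": 55, "t02": 60, "t03": 48}
post = {"t01": 72, "t02": 66, "t03": 70}

gains = [post[p] - pre[p] for p in pre]
avg_gain = sum(gains) / len(gains)
share_improved = sum(g > 0 for g in gains) / len(gains)

print(f"Average gain: {avg_gain:.1f} points")
print(f"Share of participants who improved: {share_improved:.0%}")
```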