TRANSCRIPT
CADEA WORKSHOP 2 INPUT
BRAD COUSINS, UNIVERSITY OF OTTAWA
Welcome, introductions
Conceptualizing the evaluation problem
Stakeholder interests
Divergent and convergent processes
Developing evaluation questions
Program logic models
Constructing the evaluation framework
CONCEPTUALIZING THE EVALUATION PROBLEM
Program (non-evaluator) stakeholders: individuals, groups, or organizations with a stake in the program or its evaluation
Program structural and context considerations: age, program theory, program organization, openness to change, consensus among stakeholders, micro-politics
CONCEPTUALIZING THE EVALUATION PROBLEM
Evaluator-stakeholder relationships: external evaluation, internal evaluation, and collaborative evaluation (collaborative, participatory, empowerment)
Relationships among non-evaluator stakeholders: differential access to power, knowledge, and expertise; conflicting value perspectives
Resource considerations: fiscal resources, human resources, expertise (evaluation logic, program logic), time
TYPES OF STAKEHOLDERS
Policy makers/sponsors
Program developers
Program administrators/managers
Program implementers
Intended program beneficiaries
Special interest groups
Others

WHO ARE THEY? WHAT ARE THEIR INTERESTS? WHOSE INTERESTS COUNT?
DIVERGENT PHASE
Identify reasons for initiating the evaluation: issues, questions of interest
Identify contextual conditions: micro-political analysis, evaluability assessment
List stakeholders and specify their interests and relative importance
Identify dimensions of performance (criteria) and standards
Describe the object of evaluation
Tap multiple sources: stakeholders, models/frameworks, literature, professional standards, expert consultants, professional judgement
DIVERGENT PHASE
Useful information from stakeholders:
Perception of the program
Program purposes/goals
Program theory
Concerns
Evaluation questions
Intended uses of the evaluation
Other stakeholders and their stake
CONVERGENT STRATEGIES FOR PLANNING AND FRAMING THE EVALUATION
Identify primary users and other stakeholders and their main interests, issues, or questions
Obtain an initial statement of the problem from primary stakeholders
Obtain other stakeholders' take on the problem
Generate a rank-ordered list of issues
Attend to potential uses and decision rules
Identify constraints and modify evaluation issues as appropriate
FORMULATE EVALUATION QUESTIONS AND ISSUES
From the convergent process, identify and prioritize a feasible list of questions:
Implementation evaluation: Is the program being implemented as intended? Why or why not?
Process evaluation: Which aspects or components of the program are most potent in affecting desired outcomes?
Impact evaluation: Did the program meet its objectives? What unintended effects of the program can be observed?
PROGRAM LOGIC MODEL
"A tool for describing program theory and for guiding program measurement, monitoring, and management" (Rossi et al.)
PROGRAM LOGIC MODELS
Needs: raison d'être for the program; the problem to be solved
Inputs: human, fiscal, and other resources (e.g., partnerships, infrastructure) needed to run the program
Activities: all action steps needed to produce program outputs
Outputs: goods and services generated by the program (necessary but insufficient conditions for realizing program outcomes; can be counted)
Outcomes: link to program objectives; short-term or immediate, intermediate, long-term (observed changes)
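The five components above form a simple chain, which can be sketched as a small data structure. This is an illustrative sketch only: the class and field names are my own, not part of the workshop material or any standard evaluation toolkit, and the tutoring program is a made-up example.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a program logic model as a record.
# Field names mirror the components listed above; they are
# illustrative assumptions, not an established schema.
@dataclass
class LogicModel:
    need: str                                           # raison d'etre: the problem to be solved
    inputs: list[str] = field(default_factory=list)     # human, fiscal, and other resources
    activities: list[str] = field(default_factory=list) # action steps producing outputs
    outputs: list[str] = field(default_factory=list)    # countable goods and services
    outcomes: list[str] = field(default_factory=list)   # observed short-, intermediate-, long-term changes

# Example: an invented tutoring program
model = LogicModel(
    need="Low reading scores among grade-3 students",
    inputs=["tutors", "funding", "classroom space"],
    activities=["recruit students", "weekly tutoring sessions"],
    outputs=["120 tutoring sessions delivered"],
    outcomes=["improved reading comprehension"],
)
print(model.need)
```

Writing the model down this way makes the needs-to-outcomes chain explicit and easy to review with stakeholders, though in practice logic models are usually drawn as diagrams rather than coded.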
PROGRAM LOGIC MODEL
[Diagram: Program need -> Inputs/resources -> Program activities -> Outputs -> Outcomes]

'Results Chain' Logic Model
[Diagram: Inputs (resources) -> Activities -> Outputs -> Immediate outcomes (direct) -> Intermediate outcomes (indirect) -> Final outcome. Inputs through outputs lie in the organization's area of control (internal to the organization; efficiency); outcomes lie in its area of influence (external to the organization; effectiveness), subject to external factors.]
PROGRAM LOGIC MODELS
Activities (service utilization plan):
Recruitment
Induction
Delivery: regular program activities, streaming, supplemental activities
Terminal
EVALUATION FRAMEWORK
Matrix that associates evaluation questions with methods
Elements (rows):
Evaluation questions
Indicators (how would we know?)
Data sources (from whom to gather evidence)
Methods (how evidence will be gathered)
Basis for comparison (how good is good enough?)
EVALUATION FRAMEWORK (example)

Question: Implemented as intended?
  Indicators: instructional activities; scope & sequence
  Data sources: teachers; administrators
  Methods: observation; focus group; interview; document review
  Basis for comparison: program logic model; instructional plans

Question: Objectives met?
  Indicators: student learning
  Data sources: students; teachers
  Methods: pre-post test; interviews
  Basis for comparison: baseline; external standard; comparison group
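The framework rows above could also be kept as simple records during evaluation planning. A minimal sketch, assuming one record per evaluation question; the class and field names are my own illustrative choices, not from the workshop:

```python
from dataclasses import dataclass

# Hypothetical sketch: one row of an evaluation framework matrix.
# Fields correspond to the framework elements described above.
@dataclass
class FrameworkRow:
    question: str          # evaluation question
    indicators: list[str]  # how would we know?
    sources: list[str]     # from whom to gather evidence
    methods: list[str]     # how evidence will be gathered
    comparison: list[str]  # how good is good enough?

framework = [
    FrameworkRow(
        question="Is the program implemented as intended?",
        indicators=["instructional activities", "scope & sequence"],
        sources=["teachers", "administrators"],
        methods=["observation", "focus group", "interview", "document review"],
        comparison=["program logic model", "instructional plans"],
    ),
    FrameworkRow(
        question="Were program objectives met?",
        indicators=["student learning"],
        sources=["students", "teachers"],
        methods=["pre-post test", "interviews"],
        comparison=["baseline", "external standard", "comparison group"],
    ),
]

for row in framework:
    print(row.question, "->", ", ".join(row.methods))
```

Keeping the matrix in a structured form like this makes it easy to check that every question has at least one indicator, data source, method, and basis for comparison before data collection begins.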