Program Evaluation PSYCO 325 Oct 23, 2008 Rebecca Watchorn


Page 1

Program Evaluation

PSYCO 325

Oct 23, 2008

Rebecca Watchorn

Page 2

History

• Effect of:
  – Installing electric street lighting
  – Purification of water
  – Prohibiting child labour
  – Unemployment benefits

Page 3

Modern example

• Documentary about juvenile delinquents sent to visit a prison, where they meet actual inmates who show them what life in prison is like, in order to deter the young people from that life

• Developed into actual programs across US

Petrosino, A., Turpin-Petrosino, C., & Buehler, J. (2005). Scared Straight and Other Juvenile Awareness Programs for Preventing Juvenile Delinquency. The Scientific Review of Mental Health Practice, 4(1), 48-54.

Page 4

Becky’s example

• Let’s Talk Science: an organization of scientists, early childhood educators, and teachers who work on improving science literacy

• developed Wings of Discovery: a hands-on program for children to explore, learn, and apply basic mathematics, science, and technology concepts in daily programming, within a structured early-years learning program (ages infant to 5 years) and an after-school program (ages 6-12)

• tested in a preschool / after-school care facility (BLC) in Waterloo, ON

Page 5

What is Evaluation?

Coming up…

• Definition

• Outcome vs. Process evaluation

• Who wants evaluation?

• Functions of evaluation

Page 6

Definition of Evaluation

The systematic assessment of the operation and/or outcomes of a program or policy, compared to a set of explicit or implicit standards, as a means of contributing to the improvement of the program or policy. (Weiss, 1998)

Page 7

Outcome vs. Process Evaluation

• Outcome evaluation: outcomes / results / effects
  – What about things you don’t want to happen (e.g., self-labels)?

• Process evaluation:
  – Integrity of the protocol
  – Can help with understanding the outcome (what are the outcomes actually OF?)

(Summative vs. Formative)

Page 8

Process evaluation

• Mobiles in nursery

• Found no effect. Why?

• Importance of program fidelity

Page 9

Who wants evaluation?

• Philanthropic donors

• Local/provincial/federal governments

• Program directors

• Program managers

• Mandated grant requirement

– Each has their own questions and concerns (organizational learning, decision making)

Page 10

Who wants evaluation?

• Community / corporate donors

• BLC teachers

• BLC directors / board of governors

• Wings of Discovery program creators

• Let’s Talk Science administrators

• Queen’s University Professors

• Wilfrid Laurier University graduate students

• Parents

[Slide images: a “$30,000 - Premiere Room Sponsors” donor plaque and an Assessment and Evaluation Group logo]

Page 11

Evaluation measures

• How is information collected?
  – E.g., implementation logs, questionnaires, interviews, observations, tests, expert review

Page 12

Functions of Evaluation

• Formative evaluation
• Effectiveness evaluation
• Impact evaluation
• Maintenance evaluation

Page 13

Formative evaluation

Purpose: to provide information to guide decisions about fixing problems or enhancing a program at various stages of development

Page 14

Formative Evaluation

E.g.

• How is the Wings of Discovery program being implemented by actual teachers?

• How are the kids responding to the lessons? Do they seem to like them? Are they giving the types of responses program creators anticipated?

Page 15

Effectiveness evaluation

Purpose: to determine whether a program accomplishes its objectives within the immediate or short-term context of its implementation

Page 16

Effectiveness Evaluation

E.g.

• Are the children learning what they are supposed to?

• Are they using the science terms they are introduced to?

Page 17

Impact evaluation

Purpose: to determine whether the knowledge, skills, and attitudes learned via the program transfer to the intended context of use

Page 18

Impact Evaluation

E.g.

• Are the children transferring what they learn from this program to other contexts (at home, outside of school, or later grades)?

Page 19

Maintenance evaluation

Purpose: to monitor the progress and regular use of the program so that decisions about support, modification, or reconceptualization can be informed.

Page 20

Maintenance Evaluation

E.g. Future evaluation:

• Is the program still up to date? (advances in technology, scientific understanding)

• Have any aspects of the program been dropped through the years?

• Do parents still think this program is something worthwhile?

Page 21

Planning the evaluation

Coming up…

• How to decide which questions to pursue

• Goals of evaluation questions

• Quantitative or qualitative?

Page 22

How to decide which questions to pursue

• Possible criteria:
  – Decisional timetable (can evaluation information contribute to making a more informed decision?)
  – Relative clout of interested parties
  – Preferences of stakeholders
  – Uncertainties in the knowledge base
  – Practicalities
  – Assumptions of program theory
  – Potential for use of the findings
  – Evaluator’s professional judgment

Page 23

Goals of evaluation questions

• Attributing outcomes to the program: determining whether any changes are due to the program
  – The economy might have improved and better jobs became available; were trainees at a low point before? Are they now just older and more savvy?

• Links between process and outcomes
  – Are particular features of the program related to better or poorer outcomes?
  – E.g., did group discussions lead to better outcomes than the same information given one on one?

• Explanations: not only what happened, but how and why?
  – If you want to improve the likelihood of success, it helps to know the reasons for achievements and shortfalls

Page 24

Quantitative or qualitative?

• Methods should match central focus of the inquiry

• Program process:
  – New programs: often qualitative. The whole program may be too volatile to tie to a few arbitrary measures. Can remain open to new information and ideas about what the program is and does.
  – Established programs: often quantitative. Clearly defined program, well-specified activities; can use quantitative methods to characterize program process.

• Program outcomes:
  – Are there precise questions? Quantitative may be preferable

Page 25

Quantitative or qualitative?

• E.g., job training program:
  – Qualitative: how trainees feel about the job hunting process, the kinds of jobs they look for, why they quit jobs after a brief time, etc.
  – Quantitative: accurate data on the proportion of trainees who find a job after training and the wages they earn (see the sketch below)

Note: neither approach is as limited as this implies; these are just tendencies
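To make the quantitative side concrete, here is a minimal Python sketch computing the proportion of trainees who found a job and their mean wage. The records, figures, and field names are hypothetical illustrations, not data from the lecture.

# Hypothetical post-training records for four trainees
trainees = [
    {"employed": True,  "hourly_wage": 18.50},
    {"employed": False, "hourly_wage": None},
    {"employed": True,  "hourly_wage": 22.00},
    {"employed": True,  "hourly_wage": 16.75},
]

# Proportion who found a job, and mean wage among those who did
employed = [t for t in trainees if t["employed"]]
proportion_employed = len(employed) / len(trainees)
mean_wage = sum(t["hourly_wage"] for t in employed) / len(employed)

print(f"Proportion employed after training: {proportion_employed:.0%}")
print(f"Mean hourly wage of employed trainees: ${mean_wage:.2f}")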

Page 26

Developing Measures

Coming up…

• Choices among measures

• Measurement of variables

• Developing new measures

• Desirable characteristics of measures in evaluation

• Sources of data

• Interpretation of results

Page 27

Choice among measures

• Want to measure: inputs, processes, interim markers of progress, longer term outcomes, unintended consequences

• Want to be sure the measures are tapping the outcome you want to assess

Page 28

Measurement of variables

• A demanding/time-consuming phase

• Might be able to use existing measures from earlier studies (trial-and-error work done, established reliability, comparison groups)

Page 29

Developing new measures

• If you can’t find existing measures, you may need to develop your own.

• Much more difficult than it looks!
  – Balancing, interpretation of questions, etc.

Page 30

Desirable characteristics of measures in evaluation

• Reliability (do repeated efforts to measure the same phenomenon come up with the same answer? See the sketch below.)

• Validity (extent to which the measure captures the concept of interest)

• Direction (for evaluation, outcome measures usually have a “good end” and a “bad end”. E.g., which direction do you hope to see unemployment rates, birth weights, history test scores, etc. go?)
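As one concrete illustration of reliability, here is a minimal Python sketch of a test-retest check: administer the same measure twice and correlate the scores. The scores are hypothetical, and test-retest is only one approach among several (internal consistency, inter-rater agreement, etc.).

from statistics import correlation  # available in Python 3.10+

# Hypothetical scores from two administrations of the same measure
time1 = [12, 15, 9, 20, 14, 17, 11]
time2 = [13, 14, 10, 19, 15, 16, 12]

# A high positive r suggests repeated measurements give consistent answers
r = correlation(time1, time2)
print(f"Test-retest reliability (Pearson r): {r:.2f}")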

Page 31

Sources of data

• Informal / formal interviews
• Observations
• Written questionnaires
• Program records
• Data from other institutions (e.g., school, jail, etc.)
• Many others (e.g., tests of knowledge, simulation games, psychometric tests of attitudes/values/personality/beliefs, diaries, focus groups, physical tests, etc.)

Page 32

Interpretation of results

• Actual graph (described below):
  – Mentoring effective for boys and not for girls
  – Funder asks: scrap the program for girls?

• Averages
• What are we measuring?
• How could you improve the program for girls (or more girls)? (See the sketch after the graph description.)

[Graph: “BASC TRS AB - Final Scores Spring 2005: Interaction of Gender and Months Mentored”. X-axis: Gender (female, male); y-axis: BASC TRS AB score (0-50); bars compare children at the 25th vs. 75th percentile of months mentored.]
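To make the “averages” point concrete, here is a minimal Python sketch with entirely hypothetical scores (not the study’s data) that loosely echo the graph’s pattern: the overall mean looks unremarkable, while the gender-by-mentoring cell means reveal the interaction.

# Hypothetical (gender, months-mentored group, final score) records;
# assume lower scores are better (fewer problem behaviours reported)
scores = [
    ("male",   "25th pct", 46), ("male",   "25th pct", 48),
    ("male",   "75th pct", 38), ("male",   "75th pct", 40),
    ("female", "25th pct", 41), ("female", "25th pct", 43),
    ("female", "75th pct", 45), ("female", "75th pct", 47),
]

def mean(values):
    return sum(values) / len(values)

# The overall average hides the opposite trends in the two subgroups
print(f"Overall mean: {mean([s for _, _, s in scores]):.1f}")

for gender in ("male", "female"):
    for group in ("25th pct", "75th pct"):
        cell = [s for g, m, s in scores if g == gender and m == group]
        print(f"{gender}, {group} months mentored: mean = {mean(cell):.1f}")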

Page 33

Post-evaluation

• Replication

• Meta-analysis

• Cost-benefit analysis (Is the program worth the cost? Do the benefits outweigh the costs that the program incurs?)
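As a minimal sketch of the cost-benefit arithmetic (all figures hypothetical), total the program’s costs and its monetized benefits, then compare:

# Hypothetical annual program costs and monetized benefits, in dollars
costs = {"staff": 60_000, "materials": 8_000, "facilities": 12_000}
benefits = {"reduced_remediation": 45_000, "increased_earnings": 55_000}

total_cost = sum(costs.values())
total_benefit = sum(benefits.values())

print(f"Net benefit: ${total_benefit - total_cost:,}")
print(f"Benefit-cost ratio: {total_benefit / total_cost:.2f}")
# A ratio above 1 suggests benefits outweigh costs, given these
# assumptions about what to count and how to value it.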

Page 34

Evaluation vs. other research

• After all of this, what do you think?
  – Purpose for which it is done
  – Judgmental quality

Page 35

References

• Weiss, C. H. (1998). Evaluation. Upper Saddle River, NJ: Prentice Hall.

• Reeves, T. C., & Hedberg, J. G. (2003). Interactive learning systems evaluation. Englewood Cliffs, NJ: Educational Technology Publications.