Evaluation – Principles and methods
Matt Barnard, Head of Evaluation, NSPCC


Page 1: Evaluation – Principles and methods

Matt Barnard
Head of Evaluation
NSPCC

Page 2: Purpose of evaluation

• Definition: “Examine how a policy or intervention was designed and carried out and with what results.” (Magenta Book)
• Asks objective questions
  – What were the impacts?
  – How was it delivered?
  – What were the barriers and facilitators?
  – Did it deliver value for money?
• Aims to provide
  – A ‘scientific’ basis for policy making

Page 3: Evaluation Design Process

[Diagram: the evaluation design process, linking intervention design, logic model, evaluation design, intervention implementation planning, intervention implementation, evaluation implementation, and evaluation findings.]

Page 4: Logic models

• Characteristics (illustrated in the sketch after this list)
  – Mechanisms, not processes
  – Key steps, not every step
  – Explanatory, not descriptive
  – Reflects theoretical assumptions
• Benefits
  – Sense check
  – Identifies realistic outcomes
  – Facilitates evaluation design
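One way to make these characteristics concrete is to write the logic model down as a simple structure. The sketch below is not from the slides; the programme, mechanisms and outcomes are invented purely for illustration.

```python
# Minimal sketch of a logic model captured as a plain data structure.
# The programme name, mechanisms and outcomes are invented for illustration,
# not taken from the presentation.
logic_model = {
    "programme": "Example parenting support service",   # hypothetical
    "inputs": ["trained practitioners", "weekly sessions"],
    "mechanisms": [
        # key explanatory steps, not every delivery process
        "parents learn positive discipline techniques",
        "parent-child interaction improves",
    ],
    "short_term_outcomes": ["reduced parental stress"],
    "long_term_outcomes": ["reduced risk of maltreatment"],
    "assumptions": ["families attend most sessions"],
}

# A quick sense check: the mechanisms should plausibly lead to the stated outcomes.
for step in logic_model["mechanisms"]:
    print("Mechanism:", step)
print("Outcomes:", logic_model["short_term_outcomes"] + logic_model["long_term_outcomes"])
```

Writing the model out like this also makes it clear which outcomes are realistic to measure, which is the point of the "facilitates evaluation design" benefit.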

Page 5: Strength of design matrix

• Low power (small numbers/effect size) with weak design (poor/no counterfactual): unlikely to detect a difference, low confidence in attribution
• Low power with strong design (realistic counterfactual): unlikely to detect a difference, high confidence in attribution
• High power (large numbers/effect size) with weak design: likely to detect a difference, low confidence in attribution
• High power with strong design: likely to detect a difference, high confidence in attribution
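To make the power axis of the matrix concrete, here is a minimal sketch, not part of the original slides, of an a priori power calculation using Python's statsmodels. The effect size, significance level and power values are illustrative assumptions only.

```python
# Minimal sketch: sample size needed to detect a standardised effect
# in a two-arm trial. All numbers below are illustrative assumptions,
# not figures from the presentation.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Assumed small-to-moderate effect (Cohen's d = 0.3), 5% significance, 80% power
n_per_arm = analysis.solve_power(effect_size=0.3, alpha=0.05, power=0.8)
print(f"Participants needed per arm: {n_per_arm:.0f}")

# Conversely, the power a small study actually has for the same assumed effect
achieved_power = analysis.solve_power(effect_size=0.3, alpha=0.05, nobs1=30)
print(f"Power with 30 per arm: {achieved_power:.2f}")
```

This only moves a study between the low-power and high-power rows of the matrix; the design axis, whether the counterfactual is realistic, is a separate choice about how the comparison group is constructed.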

Page 6: Strength – Evaluation design

Designs in descending order of strength:
• Randomized controlled trial
• Quasi-experimental design
• Before and after measures

Page 7: Types of design

• RCT
  – Individual randomization (see the sketch after this list)
  – Cluster randomization/roll-out
  – BAU/waiting list/alternative services
• Quasi-experimental designs
  – Matched areas/groups
  – Matched individuals
  – Interrupted time series
  – Regression discontinuity
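As a concrete illustration of individual randomization to the intervention or a business-as-usual (BAU) control, here is a minimal sketch; it is not from the original slides, and the participant IDs and arm labels are made up for illustration.

```python
# Minimal sketch of individual randomization to two arms.
# Participant IDs and arm labels are illustrative assumptions.
import random

participants = [f"P{i:03d}" for i in range(1, 21)]  # hypothetical IDs

rng = random.Random(2024)      # fixed seed so the allocation is reproducible
shuffled = participants[:]
rng.shuffle(shuffled)

half = len(shuffled) // 2
allocation = {pid: "intervention" for pid in shuffled[:half]}
allocation.update({pid: "BAU (control)" for pid in shuffled[half:]})

for pid in sorted(allocation):
    print(pid, "->", allocation[pid])
```

Cluster randomization follows the same logic but allocates whole sites (for example, schools or teams) rather than individuals, which is why it typically needs larger samples for the same power.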

Page 8: Factors influencing methodology

• Intervention stage of development
  – Early exploration
  – Defined and established but not proven
  – Transferability
• Potential costs and benefits
  – Resources
  – Timescales

Page 9: Key principles

• Clarity about the key question
  – Avoid ‘default’ questions
• Methods matched to the question
  – Ensure methods match the desired questions
• Claims match the evidence
  – Avoid over-claiming
• Have a coherent story to tell