Towards a Transformative View of Evaluation: Evaluating Global Health Interventions

Presentation at the International Food Policy Research Institute
Sanjeev Sridharan, The Evaluation Centre for Complex Health Interventions, University of Toronto & St. Michael's Hospital
November 19, 2013


DESCRIPTION

Day 2 keynote: Sanjeev Sridharan, University of Toronto: "Research and evaluation in global health policy processes." Presented at the Workshop on Approaches and Methods for Policy Process Research, co-sponsored by the CGIAR Research Programs on Policies, Institutions and Markets (PIM) and Agriculture for Nutrition and Health (A4NH), at IFPRI-Washington DC, November 18-20, 2013.

TRANSCRIPT

Page 1

Towards a Transformative View of Evaluation: Evaluating Global Health Interventions

Presentation at the International Food Policy Research Institute

Sanjeev Sridharan, The Evaluation Centre for Complex Health Interventions, University of Toronto & St. Michael's Hospital

November 19, 2013

Page 2

• What is evaluation?
• How do we make evaluations matter?
• Two Examples
• Summing up
• Evaluating Research: Pathways of Influence

Page 3

What is evaluation? A useful but perhaps incomplete definition

• Evaluation is defined both as a means of assessing performance and as a way of identifying alternative ways to deliver programs.
• "Evaluation is the systematic collection and analysis of evidence on the outcomes of programs to make judgments about their relevance, performance and alternative ways to deliver them or to achieve the same results."

…what role can evaluation and evaluative thinking play in navigating interventions?

Page 4

Purposes of evaluation (Mark, Henry and Julnes, 2000)

• Assessing merit and worth
  o Causal questions, RCTs, observational studies
• Programme and organizational improvement
  o Formative evaluation
• Oversight and compliance
• Knowledge development
  o A neglected purpose of many evaluations

Page 5

An Example: Primary Prevention - Have a Heart Paisley

Page 6

Features of complex interventions (Pawson et al., 2004)

• The intervention is a theory or theories.
• The intervention involves the actions of people.
• The intervention consists of a chain of steps.
• These chains of steps or processes are often not linear, and involve negotiation and feedback at each stage.
• Interventions are embedded in social systems and how they work is shaped by this context.
• Interventions are prone to modification as they are implemented.
• Interventions are open systems and change through learning as stakeholders come to understand them.

Page 7

System dynamics approaches (Sterman, 2006)

Complex systems are:
• Constantly changing;
• Governed by feedback;
• Non-linear and history-dependent;
• Adaptive and evolving;
• Characterized by trade-offs;
• Prone to policy resistance: "The result is policy resistance, the tendency for interventions to be defeated by the system's response to the intervention itself." (See the sketch below.)
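
The policy-resistance idea can be made concrete with a toy model. The sketch below is not from the presentation: it assumes a single risk "stock" that an intervention lowers by a fixed amount each year, while a compensating feedback proportional to the gap from baseline pushes back. All function names and parameter values are illustrative.

```python
# Minimal sketch (not from the presentation): a one-stock system with a
# compensating feedback loop, illustrating policy resistance -- the system's
# response dilutes the intervention's intended effect.

def simulate_policy_resistance(years=20, baseline=100.0,
                               intervention_effect=10.0,
                               compensation_rate=0.4):
    """Return the yearly trajectory of a 'risk' level under an intervention.

    Each year the intervention lowers risk by a fixed amount, while the
    system pushes risk back toward its baseline in proportion to the gap.
    """
    risk = baseline
    trajectory = []
    for _ in range(years):
        risk -= intervention_effect                    # intended effect
        risk += compensation_rate * (baseline - risk)  # compensating feedback
        trajectory.append(round(risk, 1))
    return trajectory

if __name__ == "__main__":
    # Risk falls at first, then settles well above the naive target of
    # baseline - years * intervention_effect: the system "pushes back".
    print(simulate_policy_resistance())
```

With these illustrative numbers the trajectory levels off near 75 rather than falling without limit, which is the qualitative signature of a policy-resistant system.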

Page 8

"Solutions" Can Also Create New Problems

Meadows DH, Richardson J, Bruckmann G. Groping in the dark: the first decade of global modelling. New York, NY: Wiley, 1982.

Merton RK. The unanticipated consequences of purposive social action. American Sociological Review 1936;1:894-904.

Forrester JW. Counterintuitive behavior of social systems. Technology Review 1971;73(3):53-68.

"Policy resistance is the tendency for interventions to be delayed, diluted, or defeated by the response of the system to the intervention itself." -- Meadows, Richardson and Bruckmann

Page 9

System-as-Cause

Forrester JW. Counterintuitive behavior of social systems. Technology Review 1971;73(3):53-68.

Meadows DH. Leverage points: places to intervene in a system. Sustainability Institute, 1999. Available at <http://www.sustainabilityinstitute.org/pubs/Leverage_Points.pdf>.

Richardson GP. Feedback thought in social science and systems theory. Philadelphia, PA: University of Pennsylvania Press, 1991.

Sterman JD. Business dynamics: systems thinking and modeling for a complex world. Boston, MA: Irwin McGraw-Hill, 2000.

Page 10

So why are evaluations so often not very useful?

Page 11

UN Office of Internal Oversight Services, 2008

• A Critique of Results-Based Management (2008)
• "Results-based management at the United Nations has been an administrative chore of little value to accountability and decision-making."

Page 12

The UN critique of performance management and evaluation

• Lack of strategic direction and cross-organizational performance incentives
• Problems of attribution and trivializing innovation
• Trivializing outcomes
• The practice lacks rigor
• A lack of purpose

Page 13

The UN critique (2)

• Lack of clarity on the consequences of good and poor performance
• Lack of clarity on the capacity needed to build a results-based management system
• Technical solutions are not a substitute for substantive clarity

Page 14

The logic of an evolutionary strategy

Box et al. (1978, p. 303):

"... the best time to design an experiment is after it is finished, the converse is that the worst time is the beginning, when least is known. If the entire experiment was designed at the outset, the following would have to be assumed as known: (1) which variables were the most important, (2) over what ranges the variables should be studied... The experimenter is least able to answer such questions at the outset of an investigation but gradually becomes more able to do so as a program evolves."

Page 15

What kind of evaluation will you be doing?

• Formative
• Developmental
• Summative

Page 16

A Ten-Step Approach to Evaluation

A. Intervention theory and developing expectations of impacts over time
   • The key components of the complex intervention
   • The program theory of the complex intervention
   • Learning from the evidence base
   • The anticipated timeline of impact

B. Learning frameworks and pathways of influence
   • The pathways of influence of an evaluation
   • Learning framework for the evaluation

C. Impacts and learning
   • Assessing the impact of the intervention: design
   • Learning about the intervention over time

D. Spread and sustainability
   • Spreading learning from an evaluation
   • Reflections on performance and sustainability

Page 17

An Example: Primary Prevention - Have a Heart Paisley

• What are the key LEARNINGS from the evaluation?
• On what basis do we make a decision to SUSTAIN the program?
• What gets SPREAD as a result of the evaluation?

Page 18

Figure: pathways of evaluation influence. Elements shown include design, methods and approaches; action; programmes; interpersonal, individual and collective mechanisms; evaluation theory; evidence; a dynamic policy landscape that changes over time; hypothesised pathways; the evidence base linked to pathways; areas of uncertainty; the evaluator; alignment (moving beyond programmes as the level of analysis); and a risk landscape, with policy impact, individual impacts, organisational learning and process learning as outcomes.

Page 19

Example 1

Page 20

Background

• Project in collaboration with the China National Health Development Research Centre to build evaluation capacity for evaluating health system reform efforts
• Focus on health inequities
• Developing guidelines to evaluate health inequity initiatives
• Relationship to complexity

Page 21

Figure: evaluation capacity building cycle.
• Evaluate 3 projects: test and refine the guidelines
• Levels of evaluation influence: individual, inter-personal, collective
• Create guidelines (literature reviews; surveys of innovations; determine evaluation assets and needs)
• Revise guidelines
• Knowledge translation
• Evaluation capacity building

Page 22

Figure: evaluation approach plotted against complexity of intervention.

Page 23

Figure: approach to evaluation against complexity of intervention, for clinical, community and system-level interventions.

Page 24

Review of articles

• Context
• Expectations and timelines
• Understanding and changing understanding
• Organizational structures for learning
• Multiple understandings of impacts
• Sustainability and spread

Page 25

Describe the intervention:
• What was the setting of the intervention? What was the context?
• Was there a discussion of the evidence informing the program? Was the evidence from multiple disciplines?
• Is the program a pilot?
• What were the challenges of adaptation to the specific setting?
• What was the duration of the intervention?
• Was there a discussion of timelines and trajectories of impacts?
• Were there changes in the intervention over time? How did the evaluation explore this?
• Was the theory of change described? Did it change over time?

Page 26

• How were impacts studied? What designs were implemented?
• Was there a formal process of ruling out threats to internal and external validity?
• Was the program a success? Were there unintended outcomes? Differential impacts for different groups?
• Did the evaluation help with decisions about sustainability?
• Was the organizational structure of the intervention described?
• What was the intervention planners' view of success?
• Were there formal structures to learn and modify the program over time?
• Was there a discussion of what can be spread as part of learnings from the evaluation?
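
The review questions on Pages 25 and 26 can be operationalized as a structured extraction template, so that each reviewed article yields a comparable record. The sketch below is a minimal illustration; the class and field names are assumptions, not the instrument actually used in the guidelines project.

```python
# Minimal sketch (illustrative field names, not the project's actual
# instrument): one structured record per reviewed article, mirroring the
# questions on Pages 25-26.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ArticleReview:
    setting_and_context: str = ""
    evidence_informing_program: str = ""          # incl. whether multi-disciplinary
    is_pilot: Optional[bool] = None
    adaptation_challenges: str = ""
    intervention_duration: str = ""
    timelines_of_impact_discussed: Optional[bool] = None
    changes_in_intervention_over_time: str = ""
    theory_of_change_described: Optional[bool] = None
    impact_designs: List[str] = field(default_factory=list)
    validity_threats_formally_addressed: Optional[bool] = None
    unintended_or_differential_impacts: str = ""
    informed_sustainability_decisions: Optional[bool] = None
    organizational_structure_described: Optional[bool] = None
    planners_view_of_success: str = ""
    structures_to_learn_and_modify: Optional[bool] = None
    spread_of_learnings_discussed: Optional[bool] = None

# Example use: one record per article, then simple tabulation across records.
example = ArticleReview(is_pilot=True, impact_designs=["interrupted time series"])
```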

Page 27

Example 2

Page 28

Goals

"To contribute to improving health and strengthening health systems in low and middle income countries (LMICs), by supporting innovative international approaches to integrating health knowledge generation and synthesis (including consideration of environmental, economic, socio-cultural, and public policy factors) through research, health research capacity development, and the use of research evidence for health policy and practice."

Page 29

Objectives

• Foster international partnerships and collaboration to promote the generation and effective communication and use of relevant health research (including consideration of environmental, economic, socio-cultural, and public policy factors) in, for and by low- and middle-income countries (LMICs);
• Train and support researchers responsive to policy and practice priorities of LMICs relating to or influencing health; and
• Support active collaboration between researchers and research users (e.g. policymakers, practitioners, civil society organizations, and community members) to support health priorities of LMICs.

Page 30

Basic Questions

• How complete is our knowledge of how these initiatives are intended to work?
• Do we have clarity on the timeline of impact of such initiatives?
• How do we ensure that we don't get caught up in the 'activity space' of such initiatives?
• How do we align metrics to provide incentives to focus on the outcomes?

Page 31

Key Concepts

• Limited sphere of control
• Flexible sphere of control
• Timeline of impact
• Metrics and incentives
• Integrated planning and evaluation

Page 32

The Problem of Insufficient Imagination

Page 33

Granting mechanisms → Research, capacity building and knowledge translation activities → Health impacts

Page 34

Figure: Teasdale-Corti theory of change, spanning spheres of direct control, direct influence and indirect influence. Numbered components:
1. Drivers for the development of Teasdale-Corti
2. Commissioning research and capacity building and knowledge translation grants: team and leadership grants, KTE grants
3. North-South researchers and knowledge users respond to request for proposals
4. Merit review
5. Implementation of projects
6. Building teams: initial relationship building
7. Research, capacity building and knowledge translation processes and outputs
8. Knowledge generation: focussed and cross-cutting themes
9. Enhanced research capacity and capacity to use evidence
10. Strengthened relationships between North and South researchers
11. Knowledge users enabled to implement learnings
12. Enabled researchers
13. Research dissemination
14. Greater awareness of the salience of research and capacity building for local policy and practice concerns
15. Enhanced local foundations for future research and capacity building
16. Planning for implementation: alignment of research and capacity building with local policy and practice needs
17. Mechanisms to sustain relationships and foundations
18. Local use and influence of research and capacity building
19. Strengthened health systems and enhanced health equity
20. Improved health
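
One way to work with a theory of change of this kind is to encode it as a small directed graph and tag each numbered component with a sphere of influence. The sketch below is illustrative only: the sphere assignments and the edges are assumptions made for the example, since the diagram's arrows and lettered pathway labels are not recoverable here.

```python
# Illustrative sketch: a theory of change as a directed graph with spheres of
# influence. Node numbers follow the list above; sphere assignments and edges
# are assumptions made for the example, not the diagram's actual layout.

spheres = {
    "direct control":     {2, 4, 5, 6, 7},
    "direct influence":   {9, 10, 12, 13, 16, 17},
    "indirect influence": {14, 15, 18, 19, 20},
}

edges = [(2, 5), (5, 7), (7, 9), (9, 18), (18, 19), (19, 20)]  # assumed chain

def sphere_of(node: int) -> str:
    """Return the sphere a numbered component is assigned to."""
    for name, members in spheres.items():
        if node in members:
            return name
    return "unassigned"

# Edges that cross spheres mark where the funder's direct control ends and
# where evaluation expectations arguably need to shift from attribution
# toward contribution.
for src, dst in edges:
    if sphere_of(src) != sphere_of(dst):
        print(f"{src} -> {dst}: {sphere_of(src)} -> {sphere_of(dst)}")
```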

Page 35

Data

• Interviews with Teasdale-Corti planners in multiple funding organizations, including CIHR and IDRC
• Formal analysis of 8 final reports
• Formal analysis of 8 proposals
• Surveys of Teasdale-Corti grantees; separate surveys were conducted with Canadian researchers, Southern researchers and knowledge users
• Interviews with grantees at the October 2012 GHRI Ottawa meeting, including video interviews with grantees
• Brief case studies of three grantees, including Skype interviews with Southern partners
• Bibliometric analysis
• Data collection to support theory of change work

Page 36

Support from evidence

• There was strong support from the evidence for the overall goal of Teasdale-Corti: there was support for how Teasdale-Corti contributed to building research and practice that can contribute to health and strengthening health systems.
• There was limited evidence that the Teasdale-Corti initiative directly impacted health outcomes (but there is a need to reflect on the timeline of impact).

Page 37

Some learnings (1)

• Greater upfront clarity on the anticipated timeline of impact
• Greater upfront thinking about planning for sustainability
• Greater clarity around who counts as a knowledge user
• Focused monitoring of knowledge-user engagement: possible metrics
• Greater clarity on what equity in partnerships means
• Boundaries and contexts of the Teasdale-Corti model

Page 38

Some learnings (2)

• Limited focus on health equity
• Greater clarity on the amount and types of support grantees need
• Focus on the end-user and not just the knowledge user
• Greater focus on the benefits of the North-South relationships
• Greater clarity on what progress and accountability mean given the complexity of Teasdale-Corti
• Incentives in the system of metrics to encourage collaboration and co-creation

Page 39

Refining the theory of change: examples of learnings

• Greater clarity (by providing models, exemplars, narratives of successful implementation, etc.) on how best to create synergy between knowledge generation, knowledge translation and capacity building
• More explicitly clarify the role of the knowledge user in the implementation of Teasdale-Corti
• Explicitly recognize the 'values' that guide the implementation of Teasdale-Corti
• The implementation of the theory of change needs to be supported by an M&E system that pays attention to the complexities and heterogeneities of the implementation of Teasdale-Corti

Page 40

Towards a more imagined plan

Page 41

1. Problem definition
2. LMIC priorities
3. Designing locally-useful research
4. Promoting research use and influence
5. Potential for useful capacity building
6. Collaboration

Page 42

7. Equity
8. Sustainability
9. Clarity and coherence of outcomes and specific theory of change
10. Timeline of impact
11. Monitoring/evaluation plan
12. Gender and ethical considerations

Page 43

Ideas for improvement: recommendations based on proposal analysis

• More explicit focus on barriers
• Have a clearer focus on the diversity of knowledge users
• Equity should have a more explicit focus
• Encourage thinking about sustainability upfront
• Planning for relationship-building over time
• Monitoring and evaluation of complex interventions

Page 44

• Synergizing activities
• The 'black-box' of capacity-building
• Consistency in values and metrics
• Non-linear models of the research/policy interface
• The long timeline of impact that may exist for research to influence health outcomes

Page 45

Unpacking the Plan

• Theory of change
• Developmental: what are the capacities needed?
• Sphere of control
• Timeline of impact
• Metrics and incentives
• Phased approach to commissioning

Page 46

Evaluating research

• What are the pathways of influence by which research impacts policies?
• Can a results-based culture help with greater understanding of the pathways of influence?
• Approaches:
  o Conceptualizing research as an intervention with short- and long-term goals. One of the intermediate goals is to influence policy; the long-term goal is to improve individual lives.

Page 47

Figure: research as intervention, linking individual research projects and multiple research projects to policy influence and social betterment, with monitoring and evaluation and evaluative thinking operating at each stage.

Page 48

Figure: a monitoring and evaluation system connecting research quality to pathways of policy influence.

Page 49

Research as intervention: context, mechanisms, outcomes.

Page 50

Model 1: From Knowledge to Policy

See Knowledge to Policy: Making the Most of Development Research, Fred Carden, 2009.

Page 51

Figure: research as intervention (after Carden, 2009).

Research as intervention:
• Research quality
• Research networks
• Action research
• New tools

Overall context:
• Stability of decision-making institutions
• Policy absorption capacity
• Nature of governance
• Opportunities for countries in transition
• Economic crisis and pressures on government

Government context:
• Clear government demand
• Government interest but leadership absent
• Interest but no capacity
• Interesting research, but research uninteresting to government
• Government disinterest or hostility

Mechanisms (so, what can you do?):
• Relationship building
• Communication
• Networks of influence
• Institutional mechanisms

Outcomes:
• Policy influence
• "Better" decisions
• Social betterment
• Process outcomes
• Policy capacity
• Broader policy horizons
• Policy regimes
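
Pages 49 to 51 frame research as an intervention in terms of context, mechanisms and outcomes. The sketch below shows one minimal way to record such context-mechanism-outcome configurations; the data structure is an assumption for illustration, with example values drawn from the Carden (2009) categories above.

```python
# Minimal sketch: recording context-mechanism-outcome (CMO) style configurations
# for "research as intervention". The structure is illustrative; the example
# values come from the Carden (2009) categories on the previous slide.
from dataclasses import dataclass

@dataclass
class CMOConfiguration:
    context: str      # e.g. the government context for research uptake
    mechanism: str    # what the research team can actually do
    outcome: str      # the influence expected if the mechanism fires in that context

configs = [
    CMOConfiguration(
        context="Clear government demand",
        mechanism="Relationship building and communication",
        outcome="Policy influence and 'better' decisions",
    ),
    CMOConfiguration(
        context="Interest but no capacity",
        mechanism="Action research and new tools",
        outcome="Expanded policy capacity and broader policy horizons",
    ),
]

for c in configs:
    print(f"IF context '{c.context}' AND mechanism '{c.mechanism}' "
          f"THEN expect '{c.outcome}'")
```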

Page 52

Model 2: Pathways of Influence

Mel Mark and Gary Henry, Evaluation, 2003

Page 53

Page 54

Page 55

Research Initiatives in specific places

• Local research for lasting solutions
• Some key questions:
  o What is local about your research? How does your research respond to local needs? What is different about your local networks? What is the value added of the local research?
  o Are there local pathways of influence?
  o How different are the local pathways of influence from other, more general pathways?

Page 56

Figure: research as intervention, with global context and global mechanisms alongside local context and local mechanisms leading to outcomes.

Page 57

Local context

• Are your theories locally generated? Are the needs local?
• Examples of local contexts that have interfered with standard influence processes
• Does your research address local needs in a timely manner?
• Are the local networks different from other, more general networks of influence?
• Example: What are some of the challenges of policy influence in specific contexts in Africa?

Page 58

Summing up and looking ahead

Page 59

• Models of causation (successionist vs. generative models of causation)
• Ecology of evidence
• Integrating knowledge translation with evaluation
• Capacity building
• Developmental evaluation in complex dynamic settings
• Portfolio of designs and approaches
• Program theory and incompleteness
• Time horizons and functional forms
• Spread, scaling up and generalization