
Page 1:

Impact Evaluation in the Real World

One non-experimental design for evaluating behavioral HIV prevention campaigns

Page 2:

Implementation realities

• BCC program:
  - Has already started
  - Builds on the previous campaign (not the first one addressing behaviour)
  - Is being rolled out in communities that have other HIV prevention interventions
  - There are endogenous 'interventions' (e.g. conversations on the way to school, or in the waiting line at the clinic)
  - Diffusion is a good thing
  - Cannot (and does not want to) control implementation

Page 3: (no transcribed content)

Page 4: (no transcribed content)

Page 5:

Difference in Differences Example

Page 6:

Difference in differences, computed within each group over time:

                   Before    After    Change
Participants        62.90    66.37     3.47
Non-participants    46.37    57.50    11.13

Effect = 3.47 - 11.13 = -7.66

Page 7:

The same effect, computed between groups at each point in time:

          Participants    Non-participants    Gap
After         66.37            57.50          8.87
Before        62.90            46.37         16.53

Effect = 8.87 - 16.53 = -7.66
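As a quick check, the same arithmetic in a few lines of Python (cell means taken from the two tables above; per the next slide these are pregnancy rates):

```python
# Cell means from the slides (pregnancy rates)
participants_before, participants_after = 62.90, 66.37
nonparticipants_before, nonparticipants_after = 46.37, 57.50

# Decomposition 1: within-group changes over time
change_p = participants_after - participants_before          # 3.47
change_n = nonparticipants_after - nonparticipants_before    # 11.13
effect_within = change_p - change_n                          # -7.66

# Decomposition 2: between-group gaps at each point in time
gap_after = participants_after - nonparticipants_after       # 8.87
gap_before = participants_before - nonparticipants_before    # 16.53
effect_between = gap_after - gap_before                      # -7.66

# Both decompositions give the same diff-in-diff estimate
assert round(effect_within, 2) == round(effect_between, 2) == -7.66
```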

Page 8:

Counterfactual assumption:

Without the intervention, participants' and non-participants' pregnancy rates would have followed the same trend (parallel trends).
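Spelled out with the slide's numbers (a clarifying note added here; the 74.0 appears on the next two slides):

Counterfactual participant rate = before rate + non-participants' change
                                = 62.90 + 11.13 = 74.03

Effect = observed - counterfactual = 66.37 - 74.03 = -7.66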

Page 9:

(figure: difference-in-differences graph; the annotations 74.0 and 16.5 mark the counterfactual participant rate and the before-period gap)

Page 10:

(figure: the observed participant rate shown as the counterfactual 74.0 plus the -7.6 effect)

Page 11:

Matching Example

Page 12:

Implementation realities

• BCC program:
  - Has already started
  - Builds on the previous campaign (not the first one addressing behaviour)
  - Is being rolled out in communities that have other HIV prevention interventions
  - There are endogenous 'interventions' (e.g. conversations on the way to school, or in the waiting line at the clinic)
  - Diffusion is a good thing
  - Cannot (and does not want to) control implementation

Page 13:

What do we need to know?

• Can a specific set of communication messages manipulate a specific set of sexual behaviors?

• What magnitude of behaviour change will give what magnitude of changes in incidence?

Page 14:

Approach decided on

• NON-intervention approach:
  - We are NOT trying to prove that one campaign works... BUT we are trying to see whether a specific set of messages works, irrespective of the method of delivery or transmission
• Observation approach:
  - We are not trying to force one intervention to work; not focusing on the implementation of one intervention

Page 15:

So what WILL we do?

• Non-experimental design
  - Researcher does not manipulate the independent variable (message exposure)
  - No control group in the community; create the control group statistically through matching
• Collection of exposure, behavioural and biological data from a random sample of individuals and their sexual partners
• Develop a measurement of intensity of exposure ('doses' of exposure)
• Determine the probability of having a specific dose of exposure
• Match individuals with similar covariates, but different doses of exposure
• Compare biological and behavioural outcomes

Page 16:

So what WILL we do?

1. Survey to measure demographic covariates (or use population survey data)

2. Measure type and intensity of exposure to messages
   - Different doses of exposure to MCP campaign messages among the population
   - Detailed measurement of the method of exposure to messages during surveys: direct channels (# times heard messages on radio...) AND indirect channels (conversations with friends, relatives, etc.; shown to be important in accounting for HIV declines in Uganda)
   - Construct a message exposure scale (low vs. high, or more detailed) using statistical techniques (e.g., principal components analysis); a minimal sketch follows below
   - Every individual has a single score for message exposure
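As a rough illustration of the scale construction, here is a minimal sketch in Python: standardize a set of exposure items, then take the first principal component as the single score. The survey items, their values, and the median low/high split are hypothetical, not the actual instrument:

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Hypothetical direct- and indirect-channel exposure items
survey = pd.DataFrame({
    "radio_times_heard":    [0, 3, 7, 1, 5],
    "poster_times_seen":    [1, 2, 6, 0, 4],
    "talks_with_friends":   [0, 1, 5, 0, 3],
    "talks_with_relatives": [0, 2, 4, 1, 2],
})

# Standardize the items, then take the first principal component
# as a single exposure score per respondent
z = StandardScaler().fit_transform(survey)
score = PCA(n_components=1).fit_transform(z).ravel()

survey["exposure_score"] = score
# One simple version of the low vs. high scale: split at the median
survey["exposure_group"] = np.where(score > np.median(score), "high", "low")
print(survey[["exposure_score", "exposure_group"]])
```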

Page 17:

So what WILL we do?

3. Survey to measure exposure, behavioural outcomes, couple and social network norms, and HIV incidence amongst a random selection of individuals

4. Nested sub-study to trace partners of those who reported one or more sexual partners, and collect same data from them

5. Parallel measurement of ‘social norms’ – hearsay ethnography or other methods

Page 18:

So what WILL we do?

6. Analyses (a minimal code sketch follows after this list)
   - Use covariates to calculate an individual's propensity (a scalar summary of all covariates) to receive a specific 'dose of treatment' (position on the message exposure scale)
   - Match pairs of participants (index cases and their sexual partners) with similar propensity scores but different doses of treatment (control and treatment groups)
   - Calculate impact (behavioural and biological outcomes) by comparing mean outcomes across participants and their matched pairs

7. Modeling
   - Has the density of the sexual network changed over time, and to what extent?
   - How 'much' behaviour change is needed, over what period of time and in how many individuals, to bring about what levels of reduction in new infections?
   - What are the individual and combined effects of MC, ART, increased condom use, and MCP reductions, respectively, on the number of new infections?
   - What is the ideal 'mix' of interventions to implement?
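A minimal sketch of the analysis step in Python, on simulated data: estimate the propensity of high exposure from covariates with logistic regression, match each high-exposure individual to the nearest low-exposure individual on that score, and compare mean outcomes. The covariate names, the binary low/high dose, and 1-nearest-neighbour matching are simplifying assumptions; the actual analysis could use finer dose categories and a different matching algorithm:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 500

# Simulated covariates and exposure group (all names illustrative)
df = pd.DataFrame({
    "age": rng.integers(15, 50, n),
    "education_years": rng.integers(0, 15, n),
    "urban": rng.integers(0, 2, n),
})
df["high_exposure"] = rng.integers(0, 2, n)
df["outcome"] = rng.normal(0, 1, n)  # e.g. a behavioural outcome

X = df[["age", "education_years", "urban"]]

# 1. Propensity: probability of high exposure given covariates X
ps_model = LogisticRegression().fit(X, df["high_exposure"])
df["propensity"] = ps_model.predict_proba(X)[:, 1]

# 2. Match each high-exposure individual to the nearest low-exposure
#    individual on the propensity score (1-nearest-neighbour)
treated = df[df["high_exposure"] == 1]
control = df[df["high_exposure"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["propensity"]])
_, idx = nn.kneighbors(treated[["propensity"]])
matched_control = control.iloc[idx.ravel()]

# 3. Impact estimate: mean outcome difference across matched pairs
effect = treated["outcome"].mean() - matched_control["outcome"].mean()
print(f"Estimated effect of high exposure: {effect:.3f}")
```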

Page 19:

(figure: common-support plot for the propensity scores; overlapping density curves of scores for the 'Low exposure' and 'High exposure' groups, plotted along an axis running from low to high probability of exposure given X)

Page 20:

What we will know

• Can a specific set of communication messages (delivered in different ways) manipulate a specific set of sexual behaviors?

• What magnitude of behaviour change will give what magnitude of changes in incidence?