The National Evaluation Platform Approach Robert E Black MD, MPH Institute for International Programs Bloomberg School of Public Health Johns Hopkins University


Page 1

The National Evaluation Platform Approach

Robert E Black MD, MPH
Institute for International Programs
Bloomberg School of Public Health
Johns Hopkins University

Page 2

Outline

1. Why a new approach is needed

2. National Evaluation Platforms (NEPs): The basics

3. Country example: Malawi

4. Practicalities and costs

Page 3

Most current evaluations of large-scale programs aim to use designs like this

[Diagram: a simple results chain, Program → Coverage → Impact, contrasted with No program → No coverage → No impact]

Page 4

But reality is much more complex

[Diagram: Program → Coverage → Impact, with coverage and impact also influenced by routine health services, other health programs, other nutrition and health programs, interventions in other sectors, and general socioeconomic and other contextual factors]

Page 5

Mozambique

• How to evaluate the impact of USAID-supported programs?

• Traditional approach: intervention versus comparison areas

Source: Hilde De Graeve, Bert Schreuder.

Page 6

Mozambique

• Simultaneous implementation of multiple programs

• Separate, uncoordinated, inefficient evaluations (if any)

• Inability to compare different programs due to differences in methodological approaches and indicators

Source: Hilde De Graeve, Bert Schreuder.

Page 7

New evaluation designs are needed

• Large-scale programs

• Evaluators do not control timetable or strength of implementation

• Multiple simultaneous programs with overlapping interventions and aims

• Contextual factors that cannot be anticipated

• Need for country capacity and local evidence to guide programming


Sources: Victora CG, Bryce JB, Black RE. Learning from new initiatives in maternal and child health. Lancet 2007; 370 (9593): 1113-4. Victora CG, Black RE, Bryce J. Evaluating child survival programs. Bull World Health Organ 2009; 87: 83.

Page 8

NATIONAL EVALUATION PLATFORMS: THE BASICS

Lancet, 2011

Page 9

Builds on a common evaluation framework, adapted at country level

Common principles (with IHP+, Countdown, etc.)

Standard indicators

Broad acceptance

Page 10

Evaluation databases with districts as the units

• District-level databases covering the entire country

• Containing standard information on:
  – Inputs (partners, programs, budget allocations, infrastructure)
  – Processes/outputs (DHMT plans, ongoing training, supervision, campaigns, community participation, financing schemes such as conditional cash transfers)
  – Outcomes (availability of commodities, quality of care measures, human resources, coverage)
  – Impact (mortality, nutritional status)
  – Contextual factors (demographics, poverty, migration)

Permits national-level evaluations of multiple simultaneous programs

Page 11

A single database with districts as the rows

[Diagram: a table with one row per district (Chitipa, Karonga, …) and columns for core data points from the health sector (HMIS, DHS, national stocks database, nutrition surveillance system) and from other sectors (rainfall patterns, women's education), with quality checking and feedback to each source]
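As a minimal sketch of how such a district-keyed database might be assembled, the example below merges a few hypothetical source tables on a shared district column using pandas. All column names and values are invented for illustration; they are not the platform's actual schema.

```python
# Illustrative sketch: assembling a district-level evaluation database
# by merging records from several sources on a shared district key.
# All column names and figures are hypothetical examples.
import pandas as pd

hmis = pd.DataFrame({
    "district": ["Chitipa", "Karonga"],
    "chws_trained": [45, 60],          # process/output indicator
})
dhs = pd.DataFrame({
    "district": ["Chitipa", "Karonga"],
    "care_seeking_pct": [52.0, 61.0],  # coverage outcome
})
context = pd.DataFrame({
    "district": ["Chitipa", "Karonga"],
    "avg_rainfall_mm": [900, 1100],    # contextual factor
})

# One row per district; outer merges leave gaps visible for
# quality checking and feedback to the source.
platform = (hmis.merge(dhs, on="district", how="outer")
                .merge(context, on="district", how="outer"))
print(platform)
```

Outer merges are used deliberately: a district missing from any one source shows up as a gap rather than silently dropping out of the database.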

Page 12

Types of comparisons supported by the platform approach

• Areas with or without a given program
  – Traditional before-and-after analysis with a comparison group

• Dose-response analyses
  – Regression analyses of outcome variables according to dose of implementation

• Stepped-wedge analyses
  – In case the program is implemented sequentially
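The first comparison type, a before-and-after analysis with a comparison group, can be sketched as a difference-in-differences calculation at district level. The districts and coverage figures below are invented for illustration only.

```python
# Sketch of a before-and-after comparison with a comparison group
# (difference-in-differences). Districts and numbers are made up.
import pandas as pd

df = pd.DataFrame({
    "district": ["A", "A", "B", "B"],
    "program":  [1, 1, 0, 0],     # 1 = program district, 0 = comparison
    "period":   [0, 1, 0, 1],     # 0 = baseline, 1 = endline
    "coverage": [40.0, 65.0, 42.0, 50.0],
})

def mean_cov(prog, period):
    """Mean coverage for one program group in one period."""
    m = (df["program"] == prog) & (df["period"] == period)
    return df.loc[m, "coverage"].mean()

# Change in program districts minus change in comparison districts
did = (mean_cov(1, 1) - mean_cov(1, 0)) - (mean_cov(0, 1) - mean_cov(0, 0))
print(f"Difference-in-differences estimate: {did:.1f} percentage points")
```

Here the program districts gained 25 points and the comparison districts 8, so the estimate attributes 17 points to the program, under the usual parallel-trends assumption.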

Page 13

Evaluation platform: Interim (formative) data analyses

• Are programs being deployed where need is greatest?
  – Correlate baseline characteristics (mortality, coverage, SES, health systems strength, etc.) with implementation strength
  – Allows assessment of placement bias

• Is implementation strong enough to have an impact?
  – Document implementation strength and run simulations for likely impact (e.g., LiST)

• How best to increase coverage?
  – Correlate implementation strength/approaches with achieved coverage (measured in midline surveys)

• How can programs be improved?
  – Disseminate preliminary findings with feedback to government and partners

(All analyses at district level)

Page 14

Evaluation platform: Summative data analyses

• Did programs increase coverage?
  – Comparison of areas with and without each program over time
  – Dose-response time-series analyses correlating strength of program implementation to achieved coverage

• Was coverage associated with impact?
  – Dose-response time-series analyses of coverage and impact indicators
  – Simulation models (e.g., LiST) to corroborate results

• Did programs have an impact on mortality and nutritional status?
  – Comparison of areas with and without each program over time
  – Dose-response time-series analyses correlating strength of program implementation with impact measures
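A dose-response analysis of this kind can be sketched as a simple district-level regression of coverage change on implementation strength. The dose values and coverage gains below are fabricated for the sketch; a real analysis would use the platform's measured indicators and a time-series specification.

```python
# Illustrative district-level dose-response analysis: regress the
# change in coverage on program implementation strength ("dose").
# All numbers are fabricated.
import numpy as np

dose = np.array([0.2, 0.5, 0.8, 1.1, 1.5])                # e.g. CHWs trained per 1,000 pop
coverage_change = np.array([3.0, 8.0, 11.0, 17.0, 22.0])  # percentage points gained

# Ordinary least squares fit: coverage_change ≈ intercept + slope * dose
slope, intercept = np.polyfit(dose, coverage_change, 1)
print(f"Estimated gain per unit of dose: {slope:.1f} percentage points")
```

A positive, statistically robust slope across districts is the signature the summative analyses look for; simulation models such as LiST can then be used to corroborate whether gains of that size are plausible.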

Page 15

The platform approach can contribute to all types of designs

• Having baseline information on all districts allows researchers to measure and control placement bias

• In real life one cannot predict which districts will have strong implementation and which ones will not

• In intervention/comparison designs, it is important to document that comparison districts are free of the intervention

• Collecting information on several outcomes allows assessment of side-effects of the program on other health indicators

Page 16

COUNTRY EXAMPLE: CCM IN MALAWI

Page 17

Simultaneous implementation of multiple programs

Separate, uncoordinated, inefficient evaluations (if any)

Inability to compare different programs due to differences in methodological approaches and indicators

Page 18

Malawi CCM scale-up limits use of intervention-comparison design

CCM supported in all districts beginning in 2009…

… and implemented in Hard-to-Reach Areas! (March 2011)

[Chart: proportion of Hard-to-Reach Areas with ≥1 functional village clinic in each of Malawi's districts, March 2011 (ranging roughly from 30% to 100%), with the supporting partner shown for each district: UNICEF/WHO/UNFPA, Save the Children, DELIVER/BASICS, PSI, SC4CCM, Concern, or at risk]

Page 19

Malawi adaptation of National Evaluation Platform approach

National Evaluation Platform design using dose-response analysis, with

DOSE = program implementation strength
RESPONSE = increases in coverage; decreases in mortality

Evaluation question: Are increases in coverage and reductions in mortality greater in districts with stronger MNCH program implementation?

Page 20

Platform design overview (sample = 29 districts)

Design element: data sources

• Documentation of program implementation and contextual factors: full documentation every 6 months through systematic engagement of DHMTs

• Quality of care survey at first-level health facilities: existing 2009 data to be used for 18 districts; repeat survey in 2011

• Quality of care at community level (HSAs): desirable to conduct in all 28 districts (not included in this budget proposal)

• Intervention coverage: DHS 2010, with samples of 1,000 households representative at district level in all 28 districts; DHS/MICS 2014 with samples representative at district level in all 28 districts

• Costs: costing exercises in ≈1/3 of districts, distributed by region and chosen systematically to reflect differences in implementation strategy or health system context

• Impact (under-five mortality and nutritional status): end-line household survey (MICS or DHS?) in 2014; modeled estimates of impact based on measured changes in coverage using LiST

Page 21

National Evaluation Platform: Progress in Malawi - 1

• Continued district-level documentation in 16 districts

• Pilot of cellphone interviews for community-level documentation

• Stakeholder meeting in April 2011
  – Full endorsement by the MOH
  – Partners urged to coordinate around developing a common approach for assessment of CCM and non-CCM program implementation strength
  – Need to allow sufficient implementation time to increase likelihood of impact
  – MOH addressed letter to donors requesting support for platform

• Partners' meetings in September and December 2011 to agree on plans for measuring implementation strength

Page 22

National Evaluation Platform: Progress in Malawi - 2

• All partners (SCF, PSI, WHO, UNICEF) actively monitoring CCM implementation in their districts

• Funding secured for 16 of 28 districts; additional funding for remaining districts seems probable

• Discussions under way about broadening platform to cover nutrition programs

• Other countries expressing interest: Mozambique, Bangladesh, Burkina Faso, …

Page 23

Analysis Plan

"Dose" = CCM implementation strength (per 1,000 pop):
• CHWs
• CHWs trained in CCM
• CHWs supervised
• CHWs with essential commodities available
• Financial inputs

"Response":
• Change in treatment rates for childhood illnesses
• Change in U5M
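Turning raw counts into the per-1,000-population dose indicators above is a simple normalization; the sketch below shows the arithmetic with invented district names and figures.

```python
# Sketch: computing an implementation-strength "dose" per 1,000
# population for each district. All figures are invented examples.
chws_trained = {"Chitipa": 45, "Karonga": 60}
population = {"Chitipa": 180_000, "Karonga": 270_000}

dose = {d: chws_trained[d] / population[d] * 1_000 for d in chws_trained}
for district, value in dose.items():
    print(f"{district}: {value:.2f} trained CHWs per 1,000 population")
```

Normalizing by population makes dose comparable across districts of very different sizes, which is what the dose-response analyses require.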

Page 24

Contextual Factors (categories and indicators)

ENVIRONMENTAL, DEMOGRAPHIC AND SOCIOECONOMIC
• Rainfall patterns: average annual rainfall; seasonal rain patterns
• Altitude: height above sea level
• Epidemics: qualitative
• Humanitarian crises: qualitative
• Socio-economic factors: women's education & literacy; household assets; ethnicity, religion and occupation of head of household
• Demographic: population; population density; urbanization; total fertility rate; family size
• Fuel costs: added as this slowed program implementation in 2010-11

HEALTH SYSTEMS AND PROGRAMS
• User fees: changes in user fees for IMCI drugs
• Other MNCH health programs: the presence of other programs or partners working in MNCH

Page 25

Advantageous context for NEP

• Strong network of MNCH partners implementing CCM

• Administrative structure decentralized to districts

• SWAp II in development now

• District-level databases (2006 MICS, 2010 DHS, Malawi Socio-Economic Database (MASEDA))

• DHS includes approx. 1,000 households in each district

Page 26

PRACTICALITIES AND LIMITATIONS

Page 27

Sample sizes must be calculated on a country-by-country basis

Statistical power (likelihood of detecting an effect) will depend on:

– Number of districts in country (fixed; e.g. 28 in Malawi)

– How strongly the program is implemented, and by how much implementation affects coverage and mortality

– How much implementation varies from district to district

– Baseline coverage levels

– Presence of other programs throughout the districts

– How many households are included in surveys in each district

• May require oversampling
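One way to see how these factors interact is a Monte Carlo power calculation: simulate many hypothetical dose-response datasets under assumed effect and noise levels, and count how often the analysis detects the effect. Every parameter below (true slope, district-level noise, dose range) is an illustrative assumption, not a figure from the Malawi design.

```python
# Monte Carlo sketch of statistical power for a district-level
# dose-response design. All parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_districts = 28      # e.g. Malawi
true_slope = 10.0     # assumed pp coverage gain per unit of dose
noise_sd = 8.0        # assumed district-to-district variability
n_sims = 2000
hits = 0

for _ in range(n_sims):
    dose = rng.uniform(0, 1.5, n_districts)
    y = true_slope * dose + rng.normal(0, noise_sd, n_districts)
    # OLS slope and a t-test on it
    slope, intercept = np.polyfit(dose, y, 1)
    resid = y - (slope * dose + intercept)
    se = np.sqrt(resid.var(ddof=2) / ((dose - dose.mean()) ** 2).sum())
    # |t| > ~2.06 approximates the 5% two-sided threshold for 26 df
    hits += abs(slope / se) > 2.06

power = hits / n_sims
print(f"Estimated power: {power:.2f}")
```

Rerunning the sketch with fewer districts, weaker implementation, or more between-district variability shows power falling, which is exactly why sample sizes (and any oversampling) must be worked out country by country.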

Page 28

Practical arrangements

• Platform should be led by a credible independent partner (e.g. university or statistical office)

• Supported by an external academic group if necessary

• Steering committee with MOH and other relevant government units (Finance, Planning), Statistical Office, international and bilateral organizations, NGOs, etc.

Page 29

Main costs of the platform approach

• Building and maintaining a database with secondary information already collected by others
  – Requires a database manager and a statistician/epidemiologist for supervision
  – May require reanalysis of existing surveys, censuses, etc.

• Keeping track of implementation of different programs at district level
  – Requires hiring local informants, training them and supervising their work

• Adding special assessments (costs, quality of care, etc.)
  – May require substantial investments in facility or CHW surveys

• Oversampling household surveys
  – May require substantial investments
  – But this will not be required in all countries

Page 30

Summary: Evaluation platform

Advantages
– Adapted to current reality of multiple simultaneous programs/interventions
– Identification of selection biases
– Promotes country ownership and donor coordination
– Evaluation as a continuous process
– Flexible design allows for changes in implementation

Limitations
– Observational design (but no other alternative is possible)
– High cost, particularly due to large size of surveys
  • But cheaper than stand-alone surveys
– Requires transparency and collaboration by multiple programs and agencies

Page 31

Thank you