
Page 1

Evaluation made Easy
5th December, 2012.

Nottingham Trent University

Designing & Evaluating Interventions

Dr Zoe Rutherford - Nottingham Trent University

Mr Daniel Parnell - University of Derby

Page 2


Outline

• Research, monitoring and evaluation
– Types of monitoring & evaluation
– Cost effectiveness & cost benefit
– Evaluation Design

• Designing interventions in Public Health

Page 3

So why do we bother with M&E?

• Because we have to?
– To satisfy commissioners?
– To satisfy our bosses?

• To find out whether the intervention ‘worked’?
– What’s the context?
– Did you reach the people you were trying to reach?
– Did you affect the behaviour you were trying to change?
– Could your results be down to chance, or to other factors?

• To find out what we could do better next time?
– What went well?
– What didn’t work so well?


Page 4

So why do we bother with M&E?

• e.g. Exercise referral…


(NICE, 2006)

Page 5


Research

• Rigorously designed studies that aim to produce generalisable knowledge, based on inference from a sample to a population

Page 6


Monitoring

• The routine collection of information that will help you answer questions about your programme

Page 7


Evaluation

• Using monitoring and other information to assess the process and impact of a programme.

• Used to determine whether a programme is effective in achieving its aims and objectives
– Quantitative methods: include subjective (e.g., questionnaires) and objective measurements (e.g., pedometers), analysed using statistical techniques (a short sketch follows this list)
– Qualitative methods: include open-ended questionnaires, interviews and focus group discussions, and can be used to gain a more in-depth understanding of people’s experiences. Data may be analysed with the use of qualitative software (e.g., NVivo)

• Identify aspects that can be improved in both the short and long term
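To make the quantitative side concrete, here is a minimal sketch (not from the slides) of summarising an objective measure such as pedometer step counts with basic descriptive statistics; the data are fabricated for illustration:

```python
# Minimal sketch: summarising pedometer step counts (fabricated data).
import statistics

steps = [4200, 5100, 3900, 6000, 4800, 5300, 4500, 5900]  # steps/day, one value per participant

print(f"n    = {len(steps)}")
print(f"mean = {statistics.mean(steps):.0f} steps/day")
print(f"sd   = {statistics.stdev(steps):.0f} steps/day")
```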

Page 8


Types of Evaluation

• Formative Evaluation: background work undertaken to prepare for the planning and delivery of a programme/intervention and to provide a sound basis for programme content and for subsequent process and outcome evaluation

• Process Evaluation: recording and assessing how an initiative is delivered or implemented, including who attended and participant satisfaction, with the aim of understanding how the programme worked

• Outcome Evaluation: assesses the short- and long-term effects of an initiative

Page 9


Cost Effectiveness

• Compares the relative expenditure (costs) and outcomes (effects) of two or more courses of action

• Often used to evaluate new interventions and compare them with current practice

• Costs are usually expressed in monetary terms

• Outcomes are measured in some other unit appropriate to the condition being treated
– QALY: Quality-Adjusted Life Year

• NICE’s cost-effectiveness threshold is currently £30,000 per QALY (see the sketch below)
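As an illustration of the arithmetic (not from the slides), a minimal sketch comparing a hypothetical new intervention against usual care via the incremental cost-effectiveness ratio (ICER) and checking it against the threshold cited above; all figures are invented:

```python
# Minimal sketch: incremental cost-effectiveness ratio (ICER).
# All figures are hypothetical, per participant.

def icer(cost_new, cost_old, qalys_new, qalys_old):
    """Incremental cost per QALY gained: delta-cost / delta-QALYs."""
    return (cost_new - cost_old) / (qalys_new - qalys_old)

ratio = icer(cost_new=1_200, cost_old=400, qalys_new=1.25, qalys_old=1.20)
print(f"ICER: £{ratio:,.0f} per QALY gained")  # £16,000 per QALY

THRESHOLD = 30_000  # £ per QALY, the NICE threshold cited on the slide
print("Cost-effective" if ratio <= THRESHOLD else "Not cost-effective")
```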

Page 10


Cost Benefit

• Usually considers one intervention at a time

• Assesses whether the value of the benefits is greater than the costs

• Both costs and benefits are expressed in monetary terms (a minimal sketch follows)

[Diagram comparing Cost (£) with Benefit (£)]
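A minimal sketch (not from the slides) of the cost-benefit arithmetic, with both sides in monetary terms; the figures are hypothetical:

```python
# Minimal sketch: cost-benefit comparison (hypothetical figures).
costs = 50_000     # £, total programme cost
benefits = 80_000  # £, monetised value of the programme's outcomes

net_benefit = benefits - costs  # £30,000
bcr = benefits / costs          # benefit-cost ratio: 1.6

print(f"Net benefit: £{net_benefit:,}")
print(f"Benefit-cost ratio: {bcr:.2f}")
print("Benefits exceed costs" if net_benefit > 0 else "Costs exceed benefits")
```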

Page 11

Football Foundation: Evaluation of Extra Time
Wednesday 12 January 2011 / Friday 17th August 2012

Extra Time

Page 12

• Extra Time aimed:
To promote positive physical and social opportunities for older people (over 55 years)

• The research aimed:
To explore the social and economic impacts of Extra Time using a Social Return on Investment (SROI) analysis

Extra Time. What is the project?


Page 13

• SROI is an adjusted cost-benefit analysis that quantifies the value of social, environmental and economic outcomes that result from a service/programme (NEF, 2004).

• Stakeholder (i.e., participants and the state) engagement helped identify the outcomes (using entrance and exit surveys).

• A focus on five Football in the Community (FitC) schemes undergoing in-depth study.

Research methodology. How did we complete the research?


Page 14

• Five projects purposefully selected: a mix of delivery methods, target groups, new and existing projects, and a spread around the country.

– Bristol Rovers.

– Colchester United.

– QPR.

– Rotherham United.

– Scunthorpe United.

• Each club was visited by Lizzie, who employed informal and interactional research techniques (including observations, informal interviews and personal reflections); over 70 people were consulted.

• Project partners also interviewed.

Research methodology. In-depth case studies in the 5 clubs


Page 15

The results show that £1 of investment into Extra Time results in £5.22 of social value.

Most value is created for the participant. The authors are keen to tell the story of the complex measures and data behind this figure, not merely to focus on the SROI headline (a minimal sketch of the headline arithmetic follows).
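For illustration, a minimal sketch of the headline SROI calculation: total monetised social value divided by total investment. The figures below are invented purely to reproduce the £1 : £5.22 ratio reported above:

```python
# Minimal sketch: SROI headline ratio (hypothetical totals).
investment = 100_000    # £ invested in the programme
social_value = 522_000  # £ of monetised social, environmental and economic value

sroi_ratio = social_value / investment
print(f"Every £1 invested returns £{sroi_ratio:.2f} of social value")  # £5.22
```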

Page 16

How to…?

• The key to good monitoring and evaluation is to start right at the very beginning…


Page 17


The Scientific Method

1. Make an observation

2. Propose a testable hypothesis

3. Design an experiment to test the hypothesis

4. Collect and analyse the data

5. Draw conclusions to accept or reject the hypothesis

“Hypothesis-driven research”

[Diagram mapping the steps to report sections: Introduction & Literature Review; Methods; Results & Discussion]

Page 18


Designing Interventions

What should happen:

• Identify a need for intervention

• Explore what’s been done before
– Literature?
– Regionally?
– Nationally?

• Examine the evidence base for interventions that have taken place

• Design a robust intervention whereby a controlled evaluation can be conducted
– Identify cause and effect
– Has your intervention had the desired effect?

Page 19

Designing Interventions


• Who?
– Population demographics: gender, ethnicity, age, socio-economic status…

• Why?
– Observation, i.e. there is a problem with X or Y
– Evidence base

• Where?
– Region, area, wards, groups/buildings…?

• When & how long?
– At what point might you intervene? Prevention, management, cure, secondary prevention…
– 3, 6, 12, 24 months…?

• How?
– Evidence-based intervention
– Theoretically driven?

Page 20

How do you know if it worked?


1. What’s your question(s)?

2. What are your variables, i.e. what are you measuring?
– Independent variable(s): what are you manipulating to effect change?
– Dependent variable(s): what are you trying to change?

3. What do you need to measure / have you measured?
– How, and on what occasions?

4. How do you analyse the data you’ve collected?
– Quantitative/qualitative data?
– Statistics?

5. How do you interpret your results?
– Have you considered the other variables that may have influenced your results? (A small sketch follows this list.)
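For point 5, a minimal sketch (assuming pandas and statsmodels are available; the data are fabricated) of one way to consider other variables: regress the outcome on group membership while adjusting for a baseline covariate:

```python
# Minimal sketch: adjusting an estimated effect for a covariate.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "outcome":  [5.1, 6.0, 4.8, 7.2, 5.5, 6.8, 4.9, 7.0],
    "group":    [0, 1, 0, 1, 0, 1, 0, 1],            # 0 = control, 1 = intervention
    "baseline": [4.9, 5.2, 4.7, 6.1, 5.3, 5.8, 4.8, 6.0],
})

# The coefficient on 'group' is the intervention effect after
# adjusting for baseline differences between participants.
model = smf.ols("outcome ~ group + baseline", data=df).fit()
print(model.summary().tables[1])
```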

Page 21

A Framework for Action: ‘How’ to Evaluate Programmes

CDC Framework for Program Evaluation in Public Health: http://www.cdc.gov/

Page 22


Evaluation Design: ‘Gold Standard’

Randomised Controlled Trial (RCT):

• A study in which people are allocated at random to receive one of several interventions.

• One of these interventions is the standard of comparison, or control.

• Every individual/group has an equal chance of being offered the programme or not (a minimal sketch follows)

• Significant implications for interventions:
– Ethics
– Feasibility
– Cost
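A minimal sketch (not from the slides) of the allocation step: after shuffling, each participant has an equal chance of being offered the programme or serving as a control:

```python
# Minimal sketch: 1:1 random allocation for an RCT.
import random

random.seed(42)  # fixed seed so the allocation is reproducible and auditable

participants = [f"P{i:03d}" for i in range(1, 21)]
shuffled = random.sample(participants, k=len(participants))

# First half of the shuffled list gets the intervention, second half control.
half = len(shuffled) // 2
allocation = {p: ("intervention" if i < half else "control")
              for i, p in enumerate(shuffled)}

for p in participants[:5]:
    print(p, "->", allocation[p])
```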

Page 23


Evaluation Design: Real Life!

Pre-Post Study Design:

• Involves the assessment of variables before and after an intervention.

• Improvement or regression is then attributed to the intervention (a minimal analysis sketch follows this list)

Case Study:

• In-depth examination of aspects of a programme, or of specific individuals’ or groups’ experiences of a programme

Quasi-Experimental:

• Has all the makings of an experiment, but lacks randomisation
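As flagged above, a minimal sketch of a pre-post analysis using a paired t-test (assuming SciPy is available; the step counts are fabricated):

```python
# Minimal sketch: pre-post comparison with a paired t-test.
from scipy import stats

pre  = [4200, 5100, 3900, 6000, 4800, 5300, 4500, 5900]  # steps/day before
post = [5400, 5600, 4800, 6900, 5100, 6200, 5200, 6500]  # steps/day after

t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# A small p-value suggests a change, but with no control group this design
# cannot rule out other explanations (seasonality, regression to the mean, ...).
```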

Page 24


NICE Guidance: Keep it Simple!

The strategy outlines 3 phases:

1. Planning

2. Design

3. Evaluation

• The guidelines go on to state the major principles for each phase, and finally give a shopping list of recommended actions for each principle

• However, keeping it overly simple at the highest level (phases) compromises practicability at the intermediate level (principles), and leads to a mixture of numerous overly generic and very specific action recommendations that are not distinct to the phases.

Page 25


Intervention Mapping: the IM Framework (Bartholomew, Parcel, Kok, & Gottlieb, 2001)

Provides clearly defined, detailed guidance for six steps:

1. Needs assessment, including capacities and problems

2. Definition of evidence-based intervention objectives, ranging from the most proximal behavioural targets to health and quality-of-life outcomes

3. Theoretically informed selection of determinants of behaviour at different levels of influence, as well as methods for addressing them

4. Planning and producing practical programme components and materials

5. Planning for programme adoption, implementation and sustainability

6. Developing a framework for process and impact evaluation

Page 26

Logic Model for programme planning

• Inputs: resources necessary for programme implementation

• Activities: the actual interventions that the programme implements in order to achieve health outcomes

• Outputs: direct products obtained as a result of programme activities

• Outcomes (short-term, intermediate and long-term): the changes, impacts, or results of programme implementation (activities and outputs) (a minimal sketch follows)
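A minimal sketch (not from the slides) of a logic model captured as a plain data structure, using the column headings above; the entries are hypothetical examples:

```python
# Minimal sketch: a logic model as a plain data structure (hypothetical entries).
logic_model = {
    "inputs":     ["funding", "coaches", "venue"],
    "activities": ["weekly exercise sessions", "social events"],
    "outputs":    ["sessions delivered", "participants attending"],
    "outcomes": {
        "short-term":   ["increased physical activity"],
        "intermediate": ["improved fitness", "new social ties"],
        "long-term":    ["better health and quality of life"],
    },
}

for stage, items in logic_model.items():
    print(stage, "->", items)
```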

Page 27


RE-AIM Framework (Glasgow, McKay, Piette, & Reynolds, 2001).

Reach, Efficacy/Adoption, Implementation, Maintenance

A guide for designing interventions that:

1. Will reach the intended population (a small sketch of ‘reach’ follows this list)

2. Can be demonstrated to have high efficacy/effectiveness

3. Are adopted by the programme users and implemented with high treatment fidelity

4. Will be maintained, as programmes but also, preferably, as long-term changes in the target behaviours
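As a small illustration of the ‘Reach’ dimension, a sketch of the proportion of the intended population actually reached; the numbers are hypothetical:

```python
# Minimal sketch: RE-AIM 'Reach' as a simple proportion (hypothetical numbers).
eligible = 2_000  # people in the target population
enrolled = 340    # people who actually joined the programme

reach = enrolled / eligible
print(f"Reach: {reach:.1%} of the intended population")  # 17.0%
```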

Page 28

Review

• Evaluation should follow a suitable framework, appraising its usefulness and, importantly, whether it is feasible, ethical, and accurate.

• A programme will have various outcomes; make sure the outcomes are captured at the right time!

• Consider the application to other programmes from today: can you create or refine a current programme/evaluation from here?

Page 29

Recommendations

• Understand your initiative’s purpose and organisational structure to support M&E

• Explore your people’s (human) capacity to engage in M&E

• Develop positive M&E partnerships

• Collaboratively develop an M&E plan

• Advocate M&E to work towards developing a culture of evidence-based practice

• Reflect on M&E strategically and operationally

• Disseminate and develop ways forward to improve, and create positive action

www.derby.ac.uk/science