
How can you Measure Magic?
How can smaller organisations, especially those working with street-connected young people, measure impact?

Nicola Sansom


“If you think you are too small to have an impact, try going to bed with a mosquito.” Anita Roddick, businesswoman, human rights and environmental activist

Introduction

This short report was put together by Nicola Sansom, CEO of S.A.L.V.E. International, as part of her Clore Social Leadership Fellowship practical research, to explore how and why organisations that work with street-connected children and young people measure their impact. Special thanks go to the Oak Foundation for funding this fellowship and for its commitment to enriching and improving work being done in the homelessness field.

The report explores impact assessment through interviews with a range of leaders and monitoring and evaluation experts from across the global street-connected young people sector. It looks at how they define impact, why and how they measure it, how much time and brain space impact assessment is given within their organisation, and what the barriers are to measuring it.

This report sits beside a short toolkit, “How can you Measure Magic? A toolkit”, and it is advised that you read the report and its recommendations before using the toolkit. The purpose of this research and toolkit is to share ideas and learning with others who might find it useful to creatively develop the ideas and tools, adapt them to fit different contexts, or create entirely new tools to suit their needs. The tools are not intended to be used exactly as they are presented here, and most of them are themselves adaptations of other methodologies.


“Success is never final, failure is never fatal. It's courage that counts. Never mistake activity for achievement.” John Wooden, basketball player and coach

Summary of Key Recommendations

Why: Always define why you are measuring impact and who you are measuring it for, as this will affect which methodologies will be the most useful.

Don’t be afraid to experiment and adapt: Explore the tools that already exist and adapt them or be inspired by them to create something to suit your needs and context.

Keep things simple: Don’t overcomplicate the process by trying to measure everything. Technology can help store and present the findings, but it can’t do the thinking for you.

Make time to analyse or there is no point measuring: If you don’t have time to analyse certain results or to be able to implement changes based on their findings, question if you should be using your time measuring it at all.

Find a buddy: Find a buddy, ideally outside your own organisation, who is willing to explore your assumptions on how you plan to measure impact.

Don’t overclaim: Be realistic about your effect.

Remember safeguarding and transparency: Think about safeguarding and give stakeholders enough information to make an informed decision to opt in.

Introduction

Imagine yourself in a small organisation with limited resources and a small multitasking staff and volunteer team. You work with an unpredictable population group, who move around a lot, which makes it hard to track change over time. Your team are working hard towards your vision and want to know whether your programmes are actually achieving what they set out to do. But you can’t afford an independent evaluation to explore that for you, let alone a specialist role. It is possible to measure your activity outputs and short term outcomes, but assessing the effect of your longer term outcomes and wider societal impact is much more difficult. On top of this you are likely to be doing different types of evaluations for different elements of your overall programme, to suit your various funders’ needs.

So how could you measure your impact to really understand the difference you are making? Some people suggest you shouldn’t even try, as you don’t have the qualified expertise or time to do it to a quality standard, and you have an inherent bias to find positive impact to help you secure funding. All organisations, no matter their size, should be striving to deliver the best possible service they can, and looking for ways to refine and improve their work based on the tools, partnerships and evidence they can access. I believe that smaller organisations can and should measure impact, often are already trying their best to do so, and should be further encouraged and supported in this. This resource is written to offer some ideas, examples and support to help smaller organisations measure their impact.

If you aren’t asking “Are we making a difference?” and testing your assumptions, you risk not being the best service you can be at the time. There is no magician to pull social change out of a hat for you, but there are lots of people working hard to develop trusting relationships and make magic happen every day. We need to invest longer term funding to measure, support and harness that energy, so that no young person has to sleep on the streets in the future.


Research Methodology

This exploration involved two core elements.

1. Interviewing a range of leaders and monitoring and evaluation experts from across the global street-connected young people sector, exploring how they define impact, why they measure it, how they measure it, how much time and brain space is given to impact assessment within their organisation, and what the barriers are to being able to measure it. The answers from these interviews have been made anonymous for the purpose of this report, as requested by participants, to allow them to speak freely.

2. Exploring impact assessment needs in depth with street-connected young people organisations in three countries – Uganda, India and the UK. This was to try to explore common challenges and to practically experiment with adapting resources for different contexts. Some of these adaptations are available in a separate toolkit, with examples that can be adapted or learnt from, such as an audio tour of stories from Manchester and Uganda.

How can you define Impact?

“Impact assessment is the systematic analysis of the lasting or significant changes – positive or negative, intended or not intended – in people’s lives brought about by a given action or series of actions.” Chris Roche, Impact Assessment for Development Agencies, Oxfam, 1999.

To gain a deeper understanding of how we could be measuring impact, it was important to first understand what the term could mean by asking a range of influential thinkers across the sector. To be able to make any conclusions, we need to make sure that we are exploring a similar concept in the first place.

“How would you define impact?” A sample of responses:

- “The difference that has been made by your work in people’s lives.”
- “The visible change that has been made that any person can see.”
- “The broader change you are seeking to achieve in society, which you are one of the contributors towards.”
- “The young person has changed to be able to better cope with their challenges and shape their future.”
- “The intended and unintended consequences that occur as a result of your outcomes to make long term systematic change.”
- “The sustainable change that your work brings about.”
- “Concrete outcomes for the target population that you serve.”
- “The ultimate goal an organisation is trying to achieve with its beneficiaries.”
- “The change that occurs as a result of your activities.”

The variety within the responses seems to break down around the longevity of the change and what exactly is being measured. Interestingly, no-one brought up the idea of impact meaning keeping things the same, or just not making things worse.

Output vs Outcome vs Impact: If activities are what you do to produce your outputs, then outcomes are the changes that happen as a result of your outputs. The nuance of opinion seems to be over whether impact means general longer term outcomes, sustainable outcomes for your service users that will carry on with or without your organisation, or the way your outcomes affect society beyond your service users.


“The very first requirement in a hospital is that it should do the sick no harm.” Florence Nightingale, nurse and social reformer

Why do you measure Impact?

Why? This is the most important question that you need to ask yourselves before attempting to measure any part of your impact. If you don’t know why you are measuring something, or for whom, you might not ask the right questions and thus you might not pick the most appropriate tools and methodologies to meet your needs.

After thinking about what you really want to know, why and for whom you are measuring impact, there are a few other key questions I would recommend that you consider.

- What would you do if you find the impact of your work is negative, not positive?
- How much of the impact that you find should be credited to you?
- Do you have the capacity (time and skills) to analyse the information that you collect?
- Do you have the flexibility to make changes based on the results of the analysis?
- If smaller organisations don’t have the capacity to measure their impact, is that a problem?

“Why do you measure impact?” A sample of responses:

“To find out if we are making a difference. Even if (gulp) it’s not the difference we were planning to make”.

“To find out if we achieved what we set out to do”.

“To feedback to donors that we have had impact using the funding they gave us. To provide evidence of need for further funds”.

“To make sure we stay focused on the target group; the hardest to reach and the most vulnerable young people”.

“To continually look for the best way of doing things and to overcome negatives along the way”.

“To take change to decision makers, so that wider benefit can be created from your evidence. Otherwise what is the point of collecting it”?

“To help you to keep building and refining what is working until it is close to perfect”.

“To give you confidence in the quality of your programme”.

“Do we measure impact? You need funding for that – which we don’t have… yet. But we hope to start soon”.

“People want to see an outcome. That it’s working and you’re going in the right direction. It can be hard when new children come to the streets and they just seem to replace the others you have successfully transitioned home. The community says, “I can still see children on the streets, so you must have failed”. But causal factors like war, poverty, disease and family breakdown are outside our control”.


“I think, at a child’s birth, if a mother could ask a fairy godmother to endow it with the most useful gift, that gift should be curiosity.” Eleanor Roosevelt, politician and activist

Some Existing Tools and Approaches

There is no single tool that can demonstrate all of your organisation’s impact. The fascinating thing about impact assessment is that there are so many different ways of approaching it. I thought it would be interesting to investigate a few tools that organisations are already using to assess their impact while working with children on the streets, and the strengths and challenges they have found.

Tool or approach: Case notes and informal conversations
Strengths: You can understand which services a young person has used, and any positive or negative change that they and staff identify they have made at the time. This can be tailored to the individual child. For example, a child who takes drugs having one less hit per day could be a big change for one young person, but not for another who has never taken drugs.
Challenges: This is a very anecdotal approach, which staff can apply with varying techniques, so it is not possible to compare results across your service user group, only to see the change made according to an individual.

Tool or approach: Focus groups, surveys and interviews
Strengths: They can be fully tailored to the specific questions and areas you are investigating. Open questions can allow the stakeholder the chance to give in-depth feedback.
Challenges: Who is facilitating the exercise, and how they facilitate and record it, can have a big effect on the quality of the data that is being gathered. Analysing open questions can be more time consuming and difficult than closed questions.

Tool or approach: Story Telling Analysis
Strengths: It feels a more natural way of communicating for stakeholders and has a depth of information and meaning.
Challenges: It can be time consuming to gather the stories, and different interpretations of meaning and context can affect the usefulness of the analysis.

Tool or approach: Registers
Strengths: You can understand how many young people are using particular services and how often they use them, broken down by different demographics (see the illustrative sketch after this table).
Challenges: They can show attendance or non-attendance, but don’t give any information as to why services are or aren’t being used.

Tool or approach: Head Counting
Strengths: You can get a sense of how many young people are still living and working on the streets and compare over time.
Challenges: Young people on the streets often want to be hidden, so it is incredibly hard to get accurate data, and changes could be due to other external factors.

Tool or approach: Spider’s web situational analysis
Strengths: It helps you to track the need for your services in any given geographical location over time, so you can see whether you are still needed there or not.
Challenges: It is hard to compare the data, as stakeholders can have different interpretations of what levels within an indicator mean to them.
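As a rough illustration of the registers approach, the sketch below is not from the original report: it tallies hypothetical attendance records by service and by a demographic field, using invented field names such as "service" and "age_band". It is only a minimal example of the kind of simple counting that a spreadsheet or database could equally do.

    # Hypothetical sketch of summarising an attendance register.
    # Field names ("service", "age_band") and the sample entries are invented
    # for illustration; a real register would be defined by the organisation.
    from collections import Counter

    register = [
        {"date": "2017-05-01", "service": "drop-in", "age_band": "10-13"},
        {"date": "2017-05-01", "service": "drop-in", "age_band": "14-17"},
        {"date": "2017-05-02", "service": "counselling", "age_band": "14-17"},
        {"date": "2017-05-02", "service": "drop-in", "age_band": "10-13"},
    ]

    # How many attendances per service?
    by_service = Counter(entry["service"] for entry in register)

    # Attendances per service, broken down by age band.
    by_service_and_age = Counter(
        (entry["service"], entry["age_band"]) for entry in register
    )

    print(by_service)          # e.g. Counter({'drop-in': 3, 'counselling': 1})
    print(by_service_and_age)  # e.g. Counter({('drop-in', '10-13'): 2, ...})

As the table notes, counts like these show attendance and non-attendance, but say nothing about why services are or aren’t being used.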

7

Some Existing Tools and Approaches (continued)

Tool or approach: Outcome Star/Wellbeing Toolkit
Strengths: It splits change into indicators that are significant to your service users, and helps you to measure distance travelled (positively or negatively) over time (see the sketch after this table). It can also be used as a counselling and action point tool, not just for impact measurement.
Challenges: It is hard to compare the data, as service users can have different interpretations of what levels within an indicator mean to them. Some people believe that this means you can’t compare across a service user group, while others think it can be done if facilitated consistently and carefully.

Tool or approach: Strengths and Difficulties Questionnaire
Strengths: It helps you to explore where a young person is at, and you can use it to compare and measure change over time because it is standardised and normative.
Challenges: It can’t be adapted to fit the context you want to use it in; you have to use it exactly as it was designed.

Tool or approach: Child Council/Federation
Strengths: The young people are the ones investigating impact that matters to them, in a way that is meaningful to them.
Challenges: The group might be biased in favour of the organisation having high impact, and might not have the technical knowledge and skills needed.

Tool or approach: Log Frame
Strengths: It gives a structural framework of the programme and how activities are supposed to lead to change, so you can set indicators to measure by.
Challenges: It is a structural framework that can encourage you to collect and explore evidence that is easier to find, i.e. outputs rather than longer term outcomes.

Tool or approach: Theory of Change
Strengths: It demonstrates a flexible causal chain framework of how the organisation hopes to stimulate change, so that you can set indicators to plan and evaluate by.
Challenges: It can be quite hard to capture all the assumptions and to create a shared understanding of a chain leading to change which everyone can agree with.

Tool or approach: Social Return on Investment/Social Audit
Strengths: It helps you to demonstrate the extra value you give to society, the environment and the economy around where you work (if any).
Challenges: It relies on being able to use similar but not exactly alike data to make calculations, still focuses on financial value, and is a long process to complete.

Tool or approach: Long term stakeholder follow up after programme
Strengths: You can find out to what extent your services have contributed to change in a young person’s life, over the course of their life.
Challenges: It’s hard to know how much of the change (if any) can be attributed to your work, especially after a longer period of time has gone by. Also, your current programme is likely to have evolved from the one they were a part of.
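To make “distance travelled” more concrete, the sketch below is not part of the original report and uses invented indicator names and scores: it shows how change between a baseline and a review reading of a star-style tool might be summarised per indicator. It assumes a simple 1–5 scale; real tools such as the Outcome Star define their own scales and facilitation guidance.

    # Hypothetical sketch: summarising "distance travelled" between two readings
    # of a star-style wellbeing tool. Indicator names and scores are invented;
    # they are not taken from the Outcome Star or any other published tool.

    baseline = {"safety": 2, "health": 3, "relationships": 1, "education": 2}
    review   = {"safety": 4, "health": 3, "relationships": 3, "education": 1}

    # Positive numbers mean movement up the scale, negative numbers movement down.
    distance_travelled = {
        indicator: review[indicator] - baseline[indicator]
        for indicator in baseline
    }

    total_change = sum(distance_travelled.values())

    print(distance_travelled)  # {'safety': 2, 'health': 0, 'relationships': 2, 'education': -1}
    print(total_change)        # 3

As the table cautions, such scores rest on each person’s own interpretation of the scale, so comparing totals across different young people should be done carefully, if at all.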

While selecting tools to suit your purpose, it is important to also consider standards of evidence, and whether some of these approaches might carry more weight or validity than others based on the type of evidence you are choosing to use. You also have to think about the rigour of certain methods versus their relevance to the context that you are working in, and what can therefore be practically applicable.


All of the people interviewed (100%) rated impact assessment as difficult: 83% rated it quite difficult and 17% very difficult.

Barriers and Challenges

Interviews helped to highlight a number of common barriers and challenges being faced by organisations in the sector.

- Measuring the current homeless population in an area accurately is almost impossible, due to the transient and often hidden nature of homelessness, so it is incredibly difficult to have a robust baseline to measure change against.

- In a small organisation there is not enough staff time, staff skill and knowledge, or staff confidence to do impact assessment. The day-to-day running of the organisation comes first.

- Not having enough money or training/knowledge to make best use of current technology support. Technology should help speed up your organisation’s ability to collate and analyse data.

- Everyone is measuring impact differently and there are a confusing number of options to choose from. This makes it hard to compare data or share tips on how to improve tools and methodologies.

- Impact assessment is often done for the funder, and the learning is not internalised into the organisation’s working practices and future planning.

- Staff working with the young people don’t always fully understand the tools or why they are being used. Staff might not use the tools as effectively if they seem like a burden rather than a benefit.

- Funders often support short term project cycles, with new and innovative projects, rather than those with long-term evidence of success. This doesn’t really fit with measuring and growing impact, which is about long term sustainability. Putting learning into action can also be difficult if it makes programme costs higher per person rather than cheaper.


Time currently spent on impact assessment and analysis by leadership in the organisations interviewed varied from none at all up to 100%. Best practice was identified as being a minimum of 25% of a leader’s time.

Improving Quality

At the same time, the interviews highlighted a range of thought-provoking ideas and suggestions to help organisations either to start to measure their impact or to improve the quality of the assessments and analysis they are making.

- More sharing of adaptable templates, data and case studies of tools in practice, which highlight the strengths and weaknesses of different approaches.

- Being bolder: not only celebrating success, but also talking more publicly about things that haven’t worked and the learning from them.

- Long term funding to invest in staff training and development, and to access existing, or develop new, technology systems that make impact assessment easier.

- The development of more enjoyable and more creative tools, so that impact assessment becomes more fun for the stakeholders and the staff.

- Increased government collaboration and open access to regional and national data and government records.

- Organisations working together to offer support, staff swaps and buddying to solve problems and co-develop methodologies.


Recommendations when Assessing Impact

Start with “Why?”: This is the most important question that you need to ask yourselves before attempting to measure any part of your impact. If you don’t know why you are measuring something, or for whom, you might not ask the right questions or pick the most appropriate tools and methodologies to meet your needs. Be honest with yourselves if you are doing it just to meet a funder’s needs.

Have time to analyse: If you don’t have the time, or relevant knowledge or skills, to analyse certain results or to be able to implement changes based on their findings, then you should question if you should be measuring it at all. Choose to make time, upskill or find someone externally who can help you, or admit your limitations and don’t do it. There is no point having data just for the sake of it.

Keep it simple: Don’t overcomplicate the process by trying to measure everything. Make sure that both stakeholders and facilitators understand the methodology and why you are doing the assessment. Remember that technology can help to store and present the findings, but it can’t do the thinking for you.

Find a buddy: Find a peer/buddy, ideally outside your own organisation, who is willing to share ideas and explore your assumptions in how you plan to measure impact together. Agree in advance how much time each of you can give to help each other and if it will be a one off or longer term arrangement.

Work together to adapt and make tools relevant and interesting: Involve your stakeholders in helping you to adapt a tool so that it fits the purpose of your assessment and is interesting to take part in. Don’t be afraid to experiment and be creative. Remember that one tool alone can’t investigate everything, so think about which other tools and evidence could help to fill in the gaps.


Don’t overclaim: Think about the assumptions that you are basing any claims of impact upon before you make them. Consider carefully which evidence you are using to prove that the impact was a result of your work.

Remember safeguarding and transparency: Make sure you are always clear about why you are doing the impact assessment and what will happen with any results collected. Give stakeholders enough information, in a way that they can truly understand, so that they can make an informed decision to opt in or out of taking part. Make sure you consider any child safeguarding issues in how you are going to record and share the information before gathering the data.


“I can do things you cannot, you can do things I cannot; together we can do great things.” Mother Teresa, humanitarian

Conclusion

Exploring impact in many ways feels like starting off a chain reaction, where one question leads to another and every approach leads you to an alternative that you could try. This can be incredibly interesting to explore when you have enough time, energy and resources to put into it. But it can also be a huge barrier that leaves smaller organisations feeling less confident about where best to focus their efforts. Caroline Fiennes even argues that small organisations shouldn’t be trying to measure impact at all, because they can’t be expected to produce good quality, robust evidence about their own work; instead, they should focus on what they are good at, which is implementing the programmes. The fact that there is currently no common language, and no tools that are widely accepted as best practice for impact measurement, makes it difficult to work together and to learn from others across the sector. However, I don’t believe this means we shouldn’t keep working towards developing that best practice together and sharing the learning that we have, regardless of the size of organisation.

Everyone is having an impact. Practitioners need to be willing to challenge themselves on whether their programme really is completely unique, a claim that can be used to avoid the subject of measuring impact. We are often encouraged into this kind of emotive language in an attempt to secure donors in a competitive funding environment. In reality, most of our programmes are based around similar principles of trying to deliver high quality relationship-based social work with a hard-to-reach group of young people who are living and working on the streets. Organisations’ systems are nuanced for different country contexts, and to allow for different skill sets and interests within particular staff teams, but they have a similar base underpinning them. What we need to be trying to understand is which adaptations have improved quality, and which of them might be applicable to the wider sector and not just the local context.

Organisations also need to think about the priorities they are setting between continuing the day-to-day programme work and trying to measure, analyse and improve the quality of their work. If funding is limited, or being reduced, then impact measurement should have greater rather than reduced focus, so that organisations can make sure they are doing the activities that create the most impact, rather than those they, or their funders, like the best or have traditionally done. The sector needs to challenge itself to do the best it can with whichever resources it has. Ultimately we are working towards the same goal of being able to do ourselves out of a job, because our programmes have been so successful that there are no more young people having to live on the streets.


Giving Thanks

I want to give thanks to everyone who gave their time, energy and ideas to help to develop this exploration and the accompanying toolkit.

Special mention goes to:

- Street-connected young people (or those at risk of homelessness) and their family members where appropriate in Delhi, Agra, Jinja and Manchester
- Sanjay Gupta, Rila Kharwanlang and all the team at CHETNA
- Mike Asiya, Amy Calcutt and all the team at S.A.L.V.E. International
- Kate Macdonald, Alex Sporidou, and all the team at YPSF
- Eileen O’Sullivan (Oak Foundation)
- The Clore Fellowship network, especially Bethia McNeil, Michael Cooke and the Clore Social Fellowship team
- Michael Ashe and Jessica Greenhalf (Bond)
- Ian Harvey (Congo Children’s Trust)
- Tricia Young (Child to Child)
- Eleanor Harrison (Global Giving UK)
- Duncan Dyason (Street Kids Direct)
- Chris Hands (Moroccan Children’s Trust)
- Hugo Rukavina (Street Invest)
- Deborah Grossman (Mission Mexico)
- Emilie Secker (Safe Child Africa)
- Katie McQuaid (University of Leeds)

Conclusion (continued)

It feels like there is a lot of energy within the sector of organisations working with homeless young people to keep developing the skills and methodologies to better understand and measure their impact, and I hope this continues to flourish and evolve. To support this, it would be wonderful to see more opportunities, through longer term funder investment and organisational buddying, that promote the sharing and co-development of tools and methodologies together, rather than people feeling overwhelmed and that they have to do it all themselves. We need to stop re-inventing the wheel.

The sector needs to develop its relationship with its funders so that we are working hand-in-hand on this, rather than leaving it to funders to drive the agenda with organisations delivering it for them. Funders should also question whether it is really possible to measure such relationship-centred work with street-connected young people in a rigorous enough way to conclusively demonstrate sustainable impact. Funding bodies should think about the time it will take to gather and analyse different types of data, and whether it is really answering the fundamental questions they are trying to find out through the impact investigation, so that time and resources aren’t being wasted in asking the wrong questions or collecting useless evidence. They also need to be willing to finance improvements that call for increased investment per person on a programme, rather than selectively interpreting impact assessments to justify cost cutting.

Organisations also need to be braver (and not be penalised) in being able to talk about what they have (and haven’t) got the capacity to measure and analyse. They also need to share their learning from when things haven’t been successful, where impacts may not have been intended, or where impacts have arisen due to outside circumstances.

It’s hard to predict the future, but I hope that it will involve more government, academic, donor, organisational and, most importantly, service user collaboration and creativity. Together we can adapt, develop and share useful tools and methodologies so that we can better understand how, why, and if, a sustainable difference is being made. As John Wooden said, “Success is never final, failure is never fatal, it’s courage that counts.”


An interesting article to help organisations question when evaluation is useful is Caroline Fiennes’ “Most Charities Shouldn’t Evaluate their Work”, Stanford Social Innovation Review.

Further Reading

There is a wide variety of fascinating articles and approaches to impact assessment that you could explore. Here are just a few of them that I have found most interesting and helpful so far. I also hope you will look at the toolkit and case studies that go alongside this report.

Other useful tools/ approaches to consider: What are you trying to measure?

- Charities Evaluation Service’s “Assessing Change: Developing and using Outcomes Monitoring Tools”
- Better Evaluation’s “Log Frame Development”
- The Young Foundation’s “Framework of outcomes for young people”
- Inspiring Impact’s “Measuring up tool”
- Bond’s “Impact Builder”

Other useful tools/ approaches to consider: Difference made over the time of the programme

- Retrak’s “Family Integration Standard Operating Procedures”
- Street Invest’s “Wellbeing Toolkit development”
- Youth in Mind’s “Strengths and Difficulties questionnaire”

Other useful tools/ approaches to consider: Longer term impact for young person

- Most Significant Change Story Telling Technique
- Global Giving’s “Storytelling Project”
- Consortium for Street Children’s “Passport to Participatory Planning”
- Child to Child’s “Participation Toolkit”

Other useful tools/ approaches to consider: How much of the impact was us? Attribution

- World Bank’s “Address the Attribution Problem”
- Zewo’s “Attribution Analysis”

Other useful tools/ approaches to consider: What is our wider impact (social, environmental and economic)?

- Social Audit Network’s “Getting Started”
- Cabinet Office’s “A Guide to Social Return on Investment”