
Evaluation Workshop

Self evaluation – some workable ideas and approaches

Why self-evaluation?

• Reflexive practice – knowing what has been successful or unsuccessful, and why
  – Evidence for beneficiaries
  – Evidence for commissioners or partners

• Practitioner-led research - encouraged by the Children’s Workforce Development Council

• Strengthening the wider evaluation arrangements for the 0-7 Pilot Partnership

Self evaluation: the stages involved

Getting Started
• What do you want to evaluate or find out about?
• What do you want to measure?
• Who will do the evaluation?

Doing your evaluation
• Who needs to be consulted?
• What methods will you use?
• What support is available?

Interpreting your findings
• How will you analyse your findings (surveys and interviews)?

Reporting and sharing your findings
• How to write your report
• Sharing your findings with others

Getting started

• The Action Learning Sets:
  – Aims
  – Objectives
  – Milestones
  – Outcomes

• Leap of faith from testing the activities to proving that certain outcomes have been achieved

• Theory of Change model breaks this down…

1. Start here: the problem or issue to be addressed

2. Theory of Change: the idea or approach for tackling the problem

3. Inputs: resources; time or money to be used

4. Activities: what will actually be done, and how

5. Outputs: the end result of the activities

6. Outcomes: what will be different for the people or service that you are trying to improve

7. Impacts: the ‘big’ / wider effects that are brought about

Theory of change…

What to measure?

Hard outcome: a clearly defined (quantifiable) change
• ‘The family learning course improved my literacy and numeracy skills’

Soft outcome: a harder-to-measure (qualitative) change
• ‘The family learning course made me more confident in speaking about my experiences with other parents’

Indicator: a sign that change has occurred over time
• Number of new enrolments
• Percentage (%) of parents completing OCN Level 1

Showing that change has happened

[Chart: a chosen indicator – e.g. % of parents reading to their child every week – plotted over time, from the baseline at the start of the project to the end of the project.]
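To make the baseline idea above concrete, here is a minimal Python sketch; the figures and the `reads_weekly` field are invented for illustration and do not come from the Pilot.

```python
# Illustrative only: one chosen indicator measured at the start and end of a project.
# Indicator: % of parents reading to their child every week.

def percent_reading_weekly(responses):
    """responses: list of dicts such as {"parent": "P01", "reads_weekly": True}."""
    if not responses:
        return 0.0
    weekly = sum(1 for r in responses if r["reads_weekly"])
    return 100.0 * weekly / len(responses)

# Made-up baseline and end-of-project survey returns for 30 parents.
baseline = [{"parent": f"P{i:02d}", "reads_weekly": i % 3 == 0} for i in range(30)]
end_of_project = [{"parent": f"P{i:02d}", "reads_weekly": i % 2 == 0} for i in range(30)]

start = percent_reading_weekly(baseline)
end = percent_reading_weekly(end_of_project)
print(f"Start of project: {start:.0f}%   End of project: {end:.0f}%")
print(f"Change over time: {end - start:+.0f} percentage points")
```

The point of the sketch is simply that the same indicator is measured the same way at both points in time, so the difference can be attributed to the period of the project.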

Gathering the information

1. Secondary evidence

• Existing sources of information to show what has been done / achieved by the project

• Think – what data is already available?
  – Core service data
  – School visitor records
  – Child or parent-held records
  – Staff observations / CPD
  – CAF records
  – Partner evidence

• What else do you need, on top of this?
  – Adding to your records
  – Project-specific monitoring

Practical suggestions

• Have something in place from the start
  – Much harder to collect evidence after the event
  – Need a ‘baseline’ to measure what has changed

• Who will be responsible for gathering the information and how often?
  – Realistic approach
  – Match it to the available time and resources

• Don’t change the format halfway through…

2. Primary evidence

• New or additional information to show what has been achieved, e.g. insights, personal accounts, experiences and case studies

• Who needs to be consulted?
  – Stakeholders
  – Beneficiaries of your project
    • Staff, parents, children

• Who will gather the information and how?
  – Self-reporting / gathered by someone else

Examples…

Self-reporting
• Feedback sheets
• Comments boxes
• Research diaries
• Pictorial or video evidence
• SMS / text or email feedback
• Blogs

Gathered directly
• Observations
• Structured interviews or focus groups
• Action research / practitioner research
• Piggy-backing:
  – Involve existing parent or community panels
  – Work into annual residents’ surveys

Some key points

• Structure is needed – what are you asking people to feed back?
  – Structured doesn’t have to mean formal
  – Satisfaction is different to outcomes

• Impartiality and avoiding bias
  – Are you too close to ask the questions?
  – Who else could do this instead?
    • Volunteers / parent advocates / partner staff
    • CRB checks and Health and Safety

• Confidentiality and informed consent
  – Do participants know how you intend to use the information?

Questionnaires – some tips

• Make them confidential

• Language used – support for completion

• Appearance / length

• Style of questions – avoid bias. A leading example to avoid (✗):
  ‘Was the session: A) Excellent  B) Very good  C) Good’

Questionnaires…

Overall, how would you rate your personal confidence, now? Tick the box that describes you.

a) I am confident in myself (4)
b) I struggle a bit with my confidence (3)
c) I need a lot of help with my confidence (2)
d) I have very low confidence (1)

How far would you agree with the following statements?

Response scale: Strongly agree (4) / Agree (3) / Disagree (2) / Strongly disagree (1) / Don't know

a. "I feel nervous about using childcare services"
b. "I am confident about my child's development"
c. "I wish I knew more about services for families in my area"

Analysing and reporting on evidence


1) Qualitative evidence

Case studies
• Individual or service level
• Telling a story:
  – what was the issue?
  – how was it tackled?
  – what was successful?
  – what is different now?

Using quotes
• Powerful form of evidence
• Confidentiality

2) Quantitative evidence

Reporting survey findings
• Less can be more
• Non-numeric is often best (a small sketch follows below):
  – ‘approaching half’
  – ‘nearly a third’
  – ‘a sizeable majority’

Charts and tables
• Presenting a clear message
• Outcomes for different groups
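Here is a short Python sketch of that ‘non-numeric’ reporting style; the phrase bands and cut-off points are illustrative assumptions, not rules from the Evaluation Toolkit.

```python
# Illustrative: turning a survey proportion into the kind of wording suggested above.
# The bands and phrases are assumptions for this sketch, not fixed conventions.
def describe(proportion):
    """proportion: a fraction between 0 and 1, e.g. 14 / 30."""
    bands = [
        (0.20, "a small minority"),
        (0.30, "around a quarter"),
        (0.35, "nearly a third"),
        (0.50, "approaching half"),
        (0.70, "over half"),
        (0.90, "a sizeable majority"),
        (1.00, "almost all"),
    ]
    for upper, phrase in bands:
        if proportion < upper:
            return phrase
    return "all"

print(describe(14 / 30))  # approaching half
print(describe(9 / 30))   # nearly a third
print(describe(24 / 30))  # a sizeable majority
```

Reporting a figure such as 14 of 30 respondents as ‘approaching half’ often reads better in a short report than an exact percentage, provided the underlying numbers are available in a table.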


Support from ECOTEC

• Evaluation Toolkit
  – Key principles of doing evaluation
  – A source of ideas and examples
  – Benchmarking what you are doing – good practice examples

• One-to-one advice
  – Commenting on your individual evaluation plans
  – Q&A and troubleshooting via the support email / telephone

• Sharing findings from the independent evaluation
  – Interim and final reports
  – Case studies

Thank you for listening
