Monitoring, Evaluation and Impact Assessment



Objectives
- Be able to explain basic monitoring and evaluation theory in relation to accountability
- Identify what you want to monitor, how and for whom
- Identify indicators that can be used to monitor and evaluate your work
- Know what tools can be used for monitoring and evaluating, their limitations and advantages

Stakeholders
- Who has an interest in monitoring and evaluation in your organisation? Is this primarily accountability, quality or learning?
- What are the relationships between them?
- What are their key questions for M&E?

What is the cause and effect relationship?

The underlying principle of the logframe structure is cause and effect, i.e. If... Then:

IF we carry out activities, THEN we will produce outputs... THEN we will achieve objectives... THEN we will contribute to the overall goal.

The better the cause and effect linkage, the better the project design.

Indicators for different levels of monitoring and evaluation

You need different indicators for the different levels of monitoring and evaluation:

Inputs
What is measured: money, people, resources
Indicators: quantities, skills, amounts

Outputs
What is measured: how much work has been done (the effort put in)
Indicators: implementation of activities

Objectives (or outcomes)
What is measured: the immediate effect the project has had on the initial situation; what has happened as a result of the effort; how beneficiaries have benefited from the project
Indicators: use of outputs and sustained production of benefits

Impact
What is measured: the longer-term change brought about as a result of the project
Indicators: difference from the original situation
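If you keep this framework in a simple script rather than a document, one way to capture the levels above is sketched below (Python; the class name and field names are our own, chosen only for illustration, and are not part of any standard M&E tool):

```python
from dataclasses import dataclass, field

@dataclass
class MonitoringLevel:
    """One level of the M&E hierarchy, mirroring the table above."""
    point_of_measurement: str          # Inputs, Outputs, Objectives/Outcomes or Impact
    what_is_measured: str
    indicators: list[str] = field(default_factory=list)

# The table above captured as data (wording taken from the table)
levels = [
    MonitoringLevel("Inputs", "Money, people, resources",
                    ["Quantities", "Skills", "Amounts"]),
    MonitoringLevel("Outputs", "How much work has been done: the effort put in",
                    ["Implementation of activities"]),
    MonitoringLevel("Objectives (outcomes)",
                    "The immediate effect the project has had on the initial situation",
                    ["Use of outputs and sustained production of benefits"]),
    MonitoringLevel("Impact",
                    "The longer-term change brought about as a result of the project",
                    ["Difference from the original situation"]),
]

for level in levels:
    print(f"{level.point_of_measurement}: {', '.join(level.indicators)}")
```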

Planning
- Situation analysis / needs assessment
- Aims (goals, wider objectives)
- Objectives (outcomes, purpose)
- Activities (outputs and inputs): plan of action, budget, resources
- Indicators to measure progress; baseline information

Monitoring (and management)
- Continuous throughout the project
- Strengths and weaknesses: what adjustments need to be made?
- External changes: does the work need to change to respond?
- Progress towards achieving outcomes: do activities need to be changed?
- Impact: how is the project affecting different groups in the longer term?
- Critical analysis and feedback

Evaluation
- Carried out at significant stages in the project's development
- Looks backwards at what has been done and learns lessons for the future
- Assesses progress towards outcomes, objectives and goal
- Answers specific questions about effectiveness, efficiency and sustainability, as set in the TOR
- Self-evaluation or external; can be peer evaluation

Impact assessment

- Short-term and longer-term impact
- Intended and unintended impact
- Positive and negative impact
- Attribution vs contribution: "plausible association" of contributions to impact

A BOND Approach to Quality Standards in NGOs: Putting Beneficiaries First

Throughout our research, we consistently heard that NGOs deliver quality when their work is based on a sensitive and dynamic understanding of beneficiaries' realities; responds to local priorities in a way that beneficiaries feel is appropriate; and is judged to be useful by beneficiaries.

- Downward accountability
- Meaningful participation: process, outputs and empowering?
- Adequate attention to the quality of relationships
- Ongoing learning and reflection: keep debates alive
- Efficient use of resources and minimised costs
- Sustainability and impact (quality does not always produce impact)

Tools

Methods for collecting and analysing information

Qualitative approaches: allow a more open-ended and in-depth investigation, but often over a smaller area.

Quantitative approaches: useful for examining predetermined variables.

Participatory approaches: allow differences in people's interests, needs and priorities to be recognised, and form the basis for negotiation between stakeholders. They also allow people to benefit from analysing and asserting their own interests.

How to choose different methods
- Who needs the information and why?
- What key questions are you trying to answer?
- What indicators are you monitoring?
- What resources are there to collect and analyse the information?

RIGOUR vs. PARTICIPATION: critical thinking is more important than the actual tools.

SMART objectives
Specific, Measurable (or Monitorable), Achievable, Relevant, Timebound
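Whether an objective is SMART is a judgement rather than a calculation, but if you want to record that judgement consistently across a project, here is a minimal sketch of one way to do so (Python; the class, field names and the example objective are invented purely for illustration):

```python
from dataclasses import dataclass

@dataclass
class SmartCheck:
    """Records a SMART review of one objective (illustrative structure only)."""
    objective: str
    specific: bool
    measurable: bool
    achievable: bool
    relevant: bool
    timebound: bool

    def passes(self) -> bool:
        # An objective is treated as SMART only if every criterion is met
        return all([self.specific, self.measurable, self.achievable,
                    self.relevant, self.timebound])

# Hypothetical example objective, not taken from any real project
check = SmartCheck(
    objective="500 households have access to safe drinking water by December 2026",
    specific=True, measurable=True, achievable=True, relevant=True, timebound=True,
)
print(check.passes())  # True
```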

Logical Framework Analysis
The logical framework is a tool for:
- organising thinking
- relating activities to expected results
- setting performance objectives
- allocating responsibilities

The logical framework matrix

Columns: Narrative summary, Measurable indicators, Means of verification, Assumptions/risks
Rows: Goal, Objectives, Outputs, Activities, Inputs

Indicators and Means of verification

Set indicators for all levels of the hierarchy

Show how you will get the information; sources and methods. Are these realistic?

What about the influence of uncertain factors?

Good project design requires uncertain factors to be taken into account

Assumptions/risks are the uncertainty factors between each level

Assumptions complete the if/then logic and are set out in the last column of the matrix

The project team is not responsible for assumptions/risks, but it must monitor changes and report on the implications.
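For teams that keep their logframe in a spreadsheet or script, the matrix structure above can be captured directly. Below is a minimal sketch (Python; the class and field names simply mirror the column headings, and the content of the example row is invented):

```python
from dataclasses import dataclass, field

@dataclass
class LogframeRow:
    """One row of the logical framework matrix; fields mirror the columns above."""
    level: str                    # Goal, Objectives, Outputs, Activities or Inputs
    narrative_summary: str
    measurable_indicators: list[str] = field(default_factory=list)
    means_of_verification: list[str] = field(default_factory=list)
    assumptions_risks: list[str] = field(default_factory=list)  # uncertainty factors to the next level

# Invented example row, shown only to illustrate the structure
outputs_row = LogframeRow(
    level="Outputs",
    narrative_summary="Community water points constructed and functioning",
    measurable_indicators=["Number of water points completed and in regular use"],
    means_of_verification=["Site inspection reports", "Community water committee records"],
    assumptions_risks=["Spare parts remain available locally"],
)
print(outputs_row.level, "-", outputs_row.narrative_summary)
```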

Complementary approaches

Most Significant Change - no predefined indicators; participants reflect on what happened and which changes were significant

Outcome mapping - focus on changes in behaviour and relationships of “boundary partners”

Impact: What Change Did We Make?

Most Significant Change

- Each person tells a story
- Draw out common themes
- Pick a story that is representative
- Share stories

Put it into practice for your project

- Who are the main stakeholders?
- What is the main focus: learning, quality or accountability?
- Select one outcome and activity from your project and identify some indicators.
- What M&E tools could be used?
- How will the resulting information be used?