Key points
1. Planning, Monitoring and Evaluation in the Spanish Cooperation
2. Manual for the Management of Evaluations of the Spanish Cooperation
3. Value and use of evaluation quality standards
4. Standards in practice: Challenges and Balances
1. Planning, Monitoring and Evaluation in the Spanish Cooperation
1. History:
• Spanish stakeholders (ministries, departments, municipalities…)
• II Master Plan for Spanish Cooperation (2005-2008)
• Manual for Management of Evaluations of the SC
• Evaluation of the II MPSC (05-08) (EES Lisbon, Oct 2008)
• 25 evaluations (1999-2009) of Spanish Cooperation
2. What did we learn?
• Evaluability (planning, design and monitoring)
• Evaluation culture (institutionalisation)
3. Future challenges:
• III Master Plan for Spanish Cooperation (2009-2012)
• Implementing the Paris Declaration and Accra commitments
• Putting the standards into practice
• Learning to learn (knowledge management and evaluation system)
Specific Objective
Offer the stakeholders of the Spanish Cooperation a tool to manage evaluations that is:
• Instructive
• Flexible
• Useful
• Systematic
General Objectives
• Strengthen the evaluation culture of the Spanish Cooperation.
• Strengthen the implementation of evaluations: quality-focused, systematic, participative and planning-oriented.
2. Manual for the Management of Evaluations of the Spanish Cooperation
Manual for the Management of Evaluations of the Spanish Cooperation
Learning to do things better
Principles, Criteria and Standards for assessing the quality of SC evaluation reports: from philosophy (principles) to practice (standards).
They are based on those used by the European Commission and include the DAC evaluation standards and the criteria used by the Evaluation Division of DG POLDE.
3. Value and use of evaluation quality standards
PRINCIPLES FOR EVALUATION OF DEVELOPMENT ASSISTANCE

DAC (1992):
• Impartiality and independence
• Credibility
• Usefulness
• Participation of donors and recipients
• Donor co-operation
• Evaluation programming, design and implementation
• Dissemination and feedback

Other reviews reinforcing with guidance (1998):
• Participation
• Learning and incorporation of lessons learned
• Usefulness
• Transparency

SPANISH COOPERATION (MANUAL) — this implies:
• A comprehensive, result-centred approach.
• A pluralistic and participatory approach.
• An analytical, learning-centred and conclusive approach.
• A strategy based on the use of results.
CRITERIA FOR EVALUATION OF DEVELOPMENT ASSISTANCE

DAC criteria:
• Relevance
• Effectiveness
• Efficiency
• Impact
• Sustainability

Other criteria (SPANISH COOPERATION MANUAL):
• Consistency
• Ownership
• Alignment
• Harmonisation
• Participation
• Coverage
STANDARDS FOR EVALUATION OF DEVELOPMENT ASSISTANCE

DAC:
1. Rationale, purpose and objectives
2. Scope
3. Context
4. Methodology
5. Information sources
6. Independence
7. Ethics
8. Quality assurance
9. Relevance of the evaluation results
10. Completeness

SPANISH COOPERATION (MANUAL):
1: Compliance with requirements – responds to the evaluation questions
2: Context analysis
3: Justification for the methodology used
4: Reliability of the data
5: Robustness of the analysis
6: Credibility of findings
7: Validity of conclusions
8: Usefulness of the recommendations
9: Clarity of the report
SPANISH STANDARDS
Std 1: Compliance with requirements – responds to the evaluation questions
This assesses adherence to the terms of reference. In other words, a good report is one which adequately fulfils the requirements laid down in the terms of reference and satisfactorily answers the evaluation questions. The report is expected to provide a good general overview of how the announced objectives were met and to clarify the logic of the intervention.
Std 2: Context analysis
This assesses whether the report focuses on the intervention as a whole, including its temporal, geographical and regulatory dimensions, and whether it analyses the context surrounding the intervention, i.e. the institutional, political, economic and social situation and both the foreseen and unforeseen interactions with other related policies and their consequences.

Std 3: Justification for the methodology used
This assesses whether the tools and methodology used are clearly explained and whether they have indeed been applied throughout the process. The methodological choices made must meet the requirements laid down in the ToR. The limits inherent to the evaluation method should also be clearly defined, and arguments should be presented as to why certain options were chosen over others.
Std 4: Reliability of the data
This does not judge the intrinsic validity of the available data but rather the way in which the evaluation team obtained and used that data. The evaluation team is expected to identify the sources of quantitative and qualitative data and explain and justify the reliability of the data. To this end, it must clearly explain the collection tools used, which, in turn, must be adapted to the information sought.

Std 5: Robustness of the analysis
A robust analysis of the quantitative and qualitative data should be done by closely following the recognised and appropriate steps depending on the type of data analysed. The cause-effect relationships between the intervention and its consequences must be clearly explained, and there must be consistency and a logical sequence between evidence and assessment, assessment and conclusions, and conclusions and recommendations. We would recommend that the steps of the analysis be explained and its validity limits specified.
Std 6: Credibility of findings
The findings made in the analysis are expected to be reliable and balanced. The findings should suitably reflect the reality drawn by the data and documented test elements, on the one hand, and the reality of the intervention as perceived by actors and beneficiaries, on the other. The effects of the evaluated intervention should be isolated from external factors and from contextual restrictions.

Std 7: Validity of conclusions
This does not judge the intrinsic value of the conclusions but rather the way in which they were reached. In accordance with this criterion, the conclusions must be rooted in the analysis, must be supported by facts and analyses easily identifiable in the rest of the report, and must avoid bias or personal feelings. They should also indicate the limits and context of the validity of the conclusions.
Std 8: Usefulness of the recommendations
Recommendations should be formulated in a clear and concise manner, should derive from the conclusions and be based on balanced and unbiased analyses. They should also be detailed enough so as to be specifically applicable by the different actors responsible for the evaluated intervention.
Std 9: Clarity of the report
A clear report is one that is brief, concise, easy to read and follows a readily recognisable, logical structure. A short summary should be an accurate reflection of the report. Annexes focusing on specialised concepts and technical demonstrations should be provided and clearly referenced throughout the text. The report should clearly describe the evaluated intervention, its context and the evaluation findings, and the information furnished should be readily understandable.
Quality standards (Spain) / Quality standards (DAC)
1: Compliance with requirements – responds to the evaluation questions / 10.1 Evaluation questions answered by conclusions (10. Completeness)
2: Context analysis / 3. Context
3: Justification for the methodology used / 4.1 Explanation of the methodology used (4. Evaluation methodology)
4: Reliability of the data / 5. Information sources
5: Robustness of the analysis / 4.2 Assessment of results (4. Evaluation methodology); 10.2 Clarity of analysis (10. Completeness)
6: Credibility of findings / 9.1 Formulation of evaluation findings (9. Relevance of the evaluation results)
7: Validity of conclusions / 10.3 Distinction between conclusions, recommendations and lessons learned (10. Completeness)
8: Usefulness of the recommendations / 9.3 Recommendations and lessons learned (9. Relevance of the evaluation results)
9: Clarity of the report / 10.5 Clarity and representativeness of the summary (10. Completeness)
Quality standards (DAC) / Quality standards (Spanish)
1. Rationale, purpose and objectives / ToR: Phase I
2. Scope / ToR: Phase I
3. Context / Context analysis (Std 2)
4. Methodology / Justification for the methodology used (Std 3)
5. Information sources / Reliability of the data (Std 4)
6. Independence / Not directly (principles, criteria / ToR)
7. Ethics / Not directly (principles, criteria / ToR)
8. Quality assurance / Not directly (principles, criteria / ToR)
9. Relevance of the evaluation results / Usefulness of the recommendations (Std 8 and Std 1)
10. Completeness / Compliance with requirements (Std 1) and validity of analysis, findings and conclusions (Stds 5, 6 and 7)
4. Standards in practice: Challenges and Balances
1. Rationale: institutional motivation for learning / accountability
2. Scope: ambition vs realism (concentration / prioritisation)
3. Context: the specific context that affects the evaluation
4. Methodology: limitations (budget? evaluators' skills?), perception / evidence basis, impact and efficiency?
5. Information sources: existence of consistent data for evidence, privacy vs traceability
6. Independence: evaluation culture and institutionalisation
7. Ethics: evaluation culture and institutionalisation
8. Quality assurance: experience for flexible vs rigid approaches
9. Relevance of the evaluation results: utility and use!
10. Completeness: focus vs ambition