
Strategies for improving Monitoring and Evaluation

South African Monitoring & Evaluation Association
Inaugural conference, 28 to 30 March 2007

Associate Professor Patricia Rogers
CIRCLE at RMIT University, Australia

Sign at the Apartheid Museum, Johannesburg

• Good evaluation can help make things better
• Bad evaluation can be useless – or worse:
  – Findings that are too late, not credible, or not relevant
  – False positives (wrongly conclude things work)
  – False negatives (wrongly conclude things don't work) – illustrated in the sketch below
  – Destructive effect of poor processes
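The false positive / false negative point can be made concrete with a small simulation. This sketch is not part of the original slides: it assumes Python with NumPy and SciPy, and the effect sizes and sample sizes are made-up numbers chosen only to show how an under-powered evaluation usually misses a real but modest effect (a false negative), while evaluating a program with no effect at all still "finds" one about 5% of the time (a false positive) at the conventional 0.05 threshold.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def detection_rate(true_effect, n_per_group, trials=2000, alpha=0.05):
    """Share of simulated evaluations that declare a statistically significant effect."""
    significant = 0
    for _ in range(trials):
        control = rng.normal(0.0, 1.0, n_per_group)           # outcomes without the program
        treated = rng.normal(true_effect, 1.0, n_per_group)   # outcomes with the program
        _, p_value = stats.ttest_ind(treated, control)
        significant += p_value < alpha
    return significant / trials

# Real but modest effect (0.3 standard deviations), small groups:
# most evaluations fail to detect it -> false negatives.
print("true effect 0.3 SD, n=20 per group:", detection_rate(0.3, 20))

# No true effect at all: roughly 5% of evaluations are still
# 'significant' by chance -> false positives.
print("no true effect, n=20 per group:", detection_rate(0.0, 20))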

The 'Big Five' problems in M & E

Seven possible strategies for improving the quality of M & E

Overview of presentation

The 'Big Five' problems in M & E:
– Are these relevant for South Africa?
– Are there others?
– Which are most important to address?

Seven possible strategies for improving the quality of M & E:
– Are these relevant for South Africa?
– Are there others?
– Which are most important to enact – and how?

Questions for you


1. Presenting a limited view

• Only in terms of stated objectives and/or targets
• Only from the perspectives of certain groups and individuals
• Only certain types of data or research designs
• Bare indicators without explanation


2. Unfocused

• Trying to look at everything – and looking at nothing well
• Not communicating clear messages


3. Unrealistic expectations

Expecting:
• too much, too soon, and too easily
• definitive answers
• immediate answers about long-term impacts


4. Not enough good information

• Poor measurement and other data collection
• Poor response rate
• Inadequate data analysis
• Sensitive data removed
• Pressure to fill in missing data


5. Waiting till the end to work out what to do with what comes out

• Collecting lots of data – and then not being sure how to analyse it
• Doing lots of evaluations – and then not being sure how to use them


Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1. Better ways to think about M & E
2. Training and professional development
3. Organisational infrastructure
4. Supportive networks
5. External review processes
6. Strategies for supporting use
7. Building knowledge about what works in evaluation in particular contexts

1. Better ways to think about M & E

Useful definitions
• Not just measuring whether objectives have been met
• Articulating, negotiating:
  – What do we value?
  – How is it going?

Models of what evaluation is and how it relates to policy and practice
1. Different types of evaluation at different stages of the program/policy cycle – rather than a final activity
2. The effect of M & E
3. Iteratively building evaluation capacity

Common understandings of M & E

• Including definitions and models in major documents, not just training manuals
• Having these familiar to managers, staff and communities, not just to evaluators
• Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluation: Evaluation is often seen as a final activity
(Mainstreaming Social Inclusion, http://www.europemsi.org/index.php)

But this can lead to:
• Leaving it all to the end (no baselines)
• Not being evaluative early on

Conceptual model for evaluation: Different types of evaluation at different stages of the program/policy cycle
Diagram stages: needs analysis; program or policy design; implementation of activities and ongoing management; continuous improvement; outcomes evaluation & performance monitoring.
Based on Funnell 2006, Designing an evaluation

Conceptual model for evaluation: The effect of M & E

The underlying programme logic of the South African Public Service Commission M & E system (Public Service Commission 2003). Diagram: public service monitoring, reporting and follow-up lead to problem areas being identified, good practice by others being identified and promoted, priority areas in public administration being communicated, and departments reflecting on their own performance. In turn, problems are addressed, achievements are affirmed and promoted, departments focus on priority areas, and learning from good practice examples takes place. Overall result: better governance and service delivery in South Africa.

Simple model of building evaluation capacity
Diagram: various activities → build skills and knowledge in M & E → application of new capacity → improved programs → better outcomes for the public

Conceptual model for evaluation: Iterative model of building evaluation capacity (Rogers 2002)
Diagram: various activities → identify existing capacity and build new capacity (types of capital: human, economic, social, organisational) → opportunities to deploy the capacity → development of systems to apply evaluation capacity to undertake, oversee and use discrete evaluations, ongoing evaluative activity and monitoring → improved programs (through improved implementation, better resource allocation or improved selection of programs) → better outcomes for the public

2. Training and professional development

WHO is to receive training?
HOW will training be undertaken?
WHAT will training cover?
WHO will control content, certification and accreditation?

Training and professional development – WHO

WHO is to receive training?
– Those formally named as evaluators
– Those with formal responsibility for doing evaluation
– Those who will commission or require evaluation
– Those who will use evaluation (e.g. program managers, policy makers)
– Citizens and citizen advocates

Training and professional development – HOW

HOW will training be undertaken?
– Timing – before working in evaluation, or as ongoing professional development
– Duration – a few days, a few weeks, a few years
– Intensity – concentrated, weekly, annually, "sandwich"
– Method – face to face, distance (email, webinars, teleconference, videoconference), self-paced
– Level – short course, certificate, graduate program (Master's, Graduate Diploma, PhD)
– Customisation – generic, sector-specific, organisation-specific

Training and professional development – WHAT

WHAT will training cover?
– An integrated package – or a specific topic
– Methods for identifying the type of M & E required, and Key Evaluation Questions
– Evaluation designs
  • Specific types or a range
– Methods of data collection
  • Specific types or a range – especially mixed qual and quant
– Methods of data analysis
  • Specific types or a range
  • Focus on critical thinking
– Approaches to reporting and supporting use
– Managing evaluation – including participatory processes
– Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1. Professional Practice
1.1 Applies professional evaluation standards
1.2 Acts ethically and strives for integrity and honesty in conducting evaluations
1.3 Conveys personal evaluation approaches and skills to potential clients
1.4 Respects clients, respondents, program participants and other stakeholders
1.5 Considers the general and public welfare in evaluation practice
1.6 Contributes to the knowledge base of evaluation

2. Systematic Inquiry
2.1 Understands the knowledge base of evaluation (terms, concepts, theories, assumptions)
2.2 Knowledgeable about quantitative methods
2.3 Knowledgeable about qualitative methods
2.4 Knowledgeable about mixed methods
2.5 Conducts literature reviews
2.6 Specifies program theory
2.7 Frames evaluation questions
2.8 Develops evaluation design
2.9 Identifies data sources
2.10 Collects data
2.11 Assesses validity of data
2.12 Assesses reliability of data
2.13 Analyzes data
2.14 Interprets data
2.15 Makes judgments
2.16 Develops recommendations
2.17 Provides rationales for decisions throughout the evaluation
2.18 Reports evaluation procedures and results
2.19 Notes strengths and limitations of the evaluation
2.20 Conducts meta-evaluations

3. Situational Analysis
3.1 Describes the program
3.2 Determines program evaluability
3.3 Identifies the interests of relevant stakeholders
3.4 Serves the information needs of intended users
3.5 Addresses conflicts
3.6 Examines the organizational context of the evaluation
3.7 Analyzes the political considerations relevant to the evaluation
3.8 Attends to issues of evaluation use
3.9 Attends to issues of organizational change
3.10 Respects the uniqueness of the evaluation site and client
3.11 Remains open to input from others
3.12 Modifies the study as needed

4. Project Management
4.1 Responds to requests for proposals
4.2 Negotiates with clients before the evaluation begins
4.3 Writes formal agreements
4.4 Communicates with clients throughout the evaluation process
4.5 Budgets an evaluation
4.6 Justifies cost given information needs
4.7 Identifies needed resources for evaluation, such as information, expertise, personnel, instruments
4.8 Uses appropriate technology
4.9 Supervises others involved in conducting the evaluation
4.10 Trains others involved in conducting the evaluation
4.11 Conducts the evaluation in a nondisruptive manner
4.12 Presents work in a timely manner

5. Reflective Practice
5.1 Aware of self as an evaluator (knowledge, skills, dispositions)
5.2 Reflects on personal evaluation practice (competencies and areas for growth)
5.3 Pursues professional development in evaluation

(Stevahn et al. 2006)

Training and professional development – Short course examples

• University of Zambia M & E course
• IPDET (International Program for Development Evaluation Training) – Independent Evaluation Group of the World Bank and Carleton University, http://www.ipdet.org
• CDC (Centers for Disease Control) Summer Institute, USA, http://www.eval.org/SummerInstitute06/SIhome.asp
• The Evaluators' Institute – San Francisco, Chicago, Washington DC, USA, www.evaluatorsinstitute.com
• CDRA (Community Development Resource Association) Developmental Planning, Monitoring, Evaluation and Reporting, Cape Town, South Africa, www.cdra.org.za
• Pre-conference workshops:
  – AfrEA – African Evaluation Association, www.afrea.org
  – AEA – American Evaluation Association, www.eval.org
  – SAMEA – South African Monitoring and Evaluation Association, www.samea.org.za
  – AES – Australasian Evaluation Society, www.aes.asn.au
  – EES – European Evaluation Society, www.europeanevaluation.org
  – CES – Canadian Evaluation Society, www.evaluationcanada.ca
  – UKES – United Kingdom Evaluation Society, www.evaluation.org.uk

Training and professional development – Graduate programs

• Centre for Research on Science and Technology, University of Stellenbosch, Cape Town, South Africa: Postgraduate Diploma in Monitoring and Evaluation Methods. One-year course delivered in intensive mode of face-to-face courses interspersed with self-study.
• School of Health Systems and Public Health (SHSPH), University of Pretoria, South Africa, in collaboration with the MEASURE Evaluation Project: M&E concentration in their Master of Public Health degree program. Courses taught in modules of one to three weeks, with a six-month internship and individual research.
• Graduate School of Public & Development Management (P&DM), University of the Witwatersrand, Johannesburg: electives on monitoring and evaluation as part of their Masters degree programmes in Public and Development Management as well as in Public Policy.
• Centre for Program Evaluation, University of Melbourne, Australia: Masters of Assessment and Evaluation, available by distance education, www.unimelb.edu.au/cpe
• CIRCLE, Royal Melbourne Institute of Technology, Australia: Masters and PhD by research.
• Western Michigan University, USA: interdisciplinary PhD, residential coursework program.

Training and professional development – On-line material

• Self-paced courses
• Manuals
• Guidelines

Training and professional development – Key Questions

• Who controls the curriculum, accreditation of courses and certification of evaluators?
• What are the consequences of this control?

3. Organisational infrastructure

• Manuals
• Evaluation frameworks
• Guidelines
• Principles
• Standards
• Checklists
• Processes for commissioning and prioritising evaluation, including contracts
• Data resources – databases, collection hardware and software, analysis hardware and software, standardised data collection tools and measures
• Evaluation journals and books
• Support for an evaluation culture

Options for organisational infrastructure

• Importing existing infrastructure
• Adapting existing infrastructure
• Developing locally specific infrastructure

Some existing infrastructure

• Guidelines, e.g. Australasian Evaluation Society Ethical Guidelines, www.aes.asn.au

6. Practise within competence: The evaluator or evaluation team should possess the knowledge, abilities, skills and experience appropriate to undertake the tasks proposed in the evaluation. Evaluators should fairly represent their competence and should not practice beyond it.

21. Fully reflect evaluator's findings: The final report(s) of the evaluation should reflect fully the findings and conclusions determined by the evaluator, and these should not be amended without the evaluator's consent.

Some existing infrastructure

• Checklist, e.g. Patton's Qualitative Evaluation Checklist, http://www.wmich.edu/evalctr/checklists

1. Determine the extent to which qualitative methods are appropriate given the evaluation's purposes and intended uses.
– Be prepared to explain the variations, strengths and weaknesses of qualitative evaluations
– Determine the criteria by which the quality of the evaluation will be judged
– Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluation's purpose, users and audiences
– Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4. Supportive networks

• Informal networks
• Evaluation societies and associations
• Learning circles
• Mentoring

5. External review processes

• WHAT
– Priorities for evaluation
– Guidelines, manuals
– Plans for individual evaluations
– Specifications for indicators
– Data collection
– Data analysis
– Reports

5. External review processes

• WHEN
– Before next stage of evaluation (review for improvement)
– Before acceptance of evaluation report
– At end of an episode of evaluation – identify and document lessons learned about evaluation
– As part of ongoing quality assurance

5. External review processes

• WHO
– Peer review – reciprocal review of each other's work
– External expert

6. Supporting use

• Register of evaluation reports – summary of methods used, findings, availability of report (see the sketch below)
• Publishing evaluation reports – library and web access
• Tracking and reporting on implementation of recommendations
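As a concrete illustration of what a register of evaluation reports might hold, here is a minimal sketch. It is not from the original slides: the language (Python), the field names and the implementation_rate helper are illustrative assumptions about one possible way to keep the summary of methods, findings, report availability and recommendation follow-up in a single record.

from dataclasses import dataclass, field
from typing import List

@dataclass
class EvaluationRecord:
    # Illustrative fields only; a real register would be negotiated with its users.
    title: str
    methods: List[str]                 # e.g. ["household survey", "key informant interviews"]
    findings_summary: str              # short statement of key findings
    report_location: str               # library reference or web address of the full report
    recommendations: List[str] = field(default_factory=list)
    recommendations_implemented: int = 0   # updated over time to track use

    def implementation_rate(self) -> float:
        """Share of recommendations reported as implemented so far."""
        if not self.recommendations:
            return 0.0
        return self.recommendations_implemented / len(self.recommendations)

A register built from such records could then be published (library and web access) and summarised periodically to report on how far recommendations have been implemented.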

7. Building knowledge about what works in evaluation in particular contexts

• Research into evaluation – empirically documenting what is done and how it goes
• Publishing accounts and lessons learned
  – Books
  – Journals
  – Web sites
  – Locally and internationally

Example: Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations:
1. Aroha ki te tangata (respect for people)
2. Kanohi kitea (the seen face; a requirement to present yourself 'face to face')
3. Titiro, whakarongo… korero (look, listen… then speak)
4. Manaaki ki te tangata (share and host people, be generous)
5. Kia tupato (be cautious)
6. Kaua e takahia te mana o te tangata (do not trample on the mana of people)
7. Kaua e mahaki (do not flaunt your knowledge)

Smith, G.H. (1997). The Development of Kaupapa Maori theory and praxis. University of Auckland, Auckland.

7 strategies to avoid the Big 5 mistakes (and others)

The Big 5: LIMITED VIEW – UNFOCUSED – UNREALISTIC – GAPS IN DATA – WHAT TO DO WITH IT

1. Ways of thinking about M & E
2. Training and professional development
3. Organisational infrastructure
4. Supportive networks
5. External review processes
6. Supporting use
7. Building and sharing knowledge about what works in local contexts


Are these relevant for South Africa?
Are there others?
Which are most important to enact – and how?

Patricia.Rogers@rmit.edu.au

CIRCLE at RMIT University
Collaborative Institute for Research, Consulting and Learning in Evaluation
Royal Melbourne Institute of Technology
Melbourne, AUSTRALIA


Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 5: Strategies for improving Monitoring and Evaluation

The lsquoBig Fiversquo problems in M amp E Are these relevant for South AfricaAre there othersWhich are most important to address

Seven possible strategies for improving the quality of M amp EAre these relevant for South AfricaAre there othersWhich are most important to enact ndash and how

Questions for you

1 Presenting a limited view

1 Presenting a limited view

1 Presenting a limited view

1 Presenting a limited view

bull Only in terms of stated objectives andor targets

bull Only from the perspectives of certain groups and individuals

bull Only certain types of data or research designs

bull Bare indicators without explanation

2 Unfocused

2 Unfocused

2 Unfocused

2 Unfocused

2 Unfocused

bullTrying to look at everything ndash and looking at nothing wellbullNot communicating clear messages

3 Unrealistic expectations

3 Unrealistic expectations

3 Unrealistic expectations

Expecting bull too much too soon

and too easilybulldefinitive answersbull immediate answers

about long-term impacts

4 Not enough good information

4 Not enough good information

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 6: Strategies for improving Monitoring and Evaluation

1 Presenting a limited view

1 Presenting a limited view

1 Presenting a limited view

1 Presenting a limited view

bull Only in terms of stated objectives andor targets

bull Only from the perspectives of certain groups and individuals

bull Only certain types of data or research designs

bull Bare indicators without explanation

2 Unfocused

2 Unfocused

2 Unfocused

2 Unfocused

2 Unfocused

bullTrying to look at everything ndash and looking at nothing wellbullNot communicating clear messages

3 Unrealistic expectations

3 Unrealistic expectations

3 Unrealistic expectations

Expecting bull too much too soon

and too easilybulldefinitive answersbull immediate answers

about long-term impacts

4 Not enough good information

4 Not enough good information

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

Page 7: Strategies for improving Monitoring and Evaluation

1 Presenting a limited view

1 Presenting a limited view

1 Presenting a limited view

bull Only in terms of stated objectives andor targets

bull Only from the perspectives of certain groups and individuals

bull Only certain types of data or research designs

bull Bare indicators without explanation

2 Unfocused

2 Unfocused

2 Unfocused

2 Unfocused

2 Unfocused

bullTrying to look at everything ndash and looking at nothing wellbullNot communicating clear messages

3 Unrealistic expectations

3 Unrealistic expectations

3 Unrealistic expectations

Expecting bull too much too soon

and too easilybulldefinitive answersbull immediate answers

about long-term impacts

4 Not enough good information

4 Not enough good information

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 8: Strategies for improving Monitoring and Evaluation

1 Presenting a limited view

1 Presenting a limited view

bull Only in terms of stated objectives andor targets

bull Only from the perspectives of certain groups and individuals

bull Only certain types of data or research designs

bull Bare indicators without explanation

2 Unfocused

2 Unfocused

2 Unfocused

2 Unfocused

2 Unfocused

bullTrying to look at everything ndash and looking at nothing wellbullNot communicating clear messages

3 Unrealistic expectations

3 Unrealistic expectations

3 Unrealistic expectations

Expecting bull too much too soon

and too easilybulldefinitive answersbull immediate answers

about long-term impacts

4 Not enough good information

4 Not enough good information

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 9: Strategies for improving Monitoring and Evaluation

1 Presenting a limited view

bull Only in terms of stated objectives andor targets

bull Only from the perspectives of certain groups and individuals

bull Only certain types of data or research designs

bull Bare indicators without explanation

2 Unfocused

2 Unfocused

2 Unfocused

2 Unfocused

2 Unfocused

bullTrying to look at everything ndash and looking at nothing wellbullNot communicating clear messages

3 Unrealistic expectations

3 Unrealistic expectations

3 Unrealistic expectations

Expecting bull too much too soon

and too easilybulldefinitive answersbull immediate answers

about long-term impacts

4 Not enough good information

4 Not enough good information

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA


Seven strategies

1 Better ways to think about M & E
2 Training and professional development
3 Organisational infrastructure
4 Supportive networks
5 External review processes
6 Strategies for supporting use
7 Building knowledge about what works in evaluation in particular contexts

1 Better ways to think about M & E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M & E

Useful definitions

• Not just measuring whether objectives have been met

• Articulating, negotiating:
– What do we value?
– How is it going?

1 Better ways to think about M & E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the program/policy cycle – rather than a final activity

2 The effect of M & E

3 Iteratively building evaluation capacity

Common understandings of M & E

• Including definitions and models in major documents, not just training manuals

• Having these familiar to managers, staff and communities, not just to evaluators

• Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluation: Evaluation is often seen as a final activity

Mainstreaming Social Inclusion, http://www.europemsi.org/index.php

Conceptual model for evaluation: Evaluation is often seen as a final activity

But this can lead to:
– Leaving it all to the end (no baselines)
– Not being evaluative early on

Conceptual model for evaluation: Different types of evaluation at different stages of the program/policy cycle

Cycle stages: Needs analysis → Program or policy design → Implementation of activities and ongoing management → Continuous improvement → Outcomes evaluation & performance monitoring

Based on Funnell 2006, Designing an evaluation

Conceptual model for evaluation: The effect of M & E

The underlying programme logic of the South African Public Service Commission M & E system (Public Service Commission 2003) – public service monitoring, reporting and follow-up lead to: problem areas identified; good practice by others identified and promoted; priority areas in public administration communicated; departments reflecting on their own performance. In turn: problems are addressed; achievements are affirmed and promoted; departments focus on priority areas; learning from good practice examples takes place.

Overall result: better governance and service delivery in South Africa

Simple model of building evaluation capacity

Various activities → build skills and knowledge in M & E → application of new capacity → improved programs → better outcomes for the public

Conceptual model for evaluation: Iterative model of building evaluation capacity

Various activities → identify existing capacity and build new capacity (types of capital: human, economic, social, organisational) → opportunities to deploy the capacity → development of systems to apply evaluation capacity (to undertake, oversee and use discrete evaluations, ongoing evaluative activity and monitoring) → improved programs (through improved implementation, better resource allocation or improved selection of programs) → better outcomes for the public

(Rogers 2002)

2 Training and professional development

WHO is to receive training?

HOW will training be undertaken?

WHAT will training cover?

WHO will control content, certification and accreditation?

Training and professional development - WHO

WHO is to receive training?
– Those formally named as evaluators
– Those with formal responsibility for doing evaluation
– Those who will commission or require evaluation
– Those who will use evaluation (e.g. program managers, policy makers)
– Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken?
– Timing – before working in evaluation, or as ongoing professional development
– Duration – a few days, a few weeks, a few years
– Intensity – concentrated, weekly, annually, "sandwich"
– Method – face to face, distance (email, webinars, teleconference, videoconference), self-paced
– Level – short course, certificate, graduate program (Master's, Graduate Diploma, PhD)
– Customisation – generic, sector-specific, organisation-specific

Training and professional development - WHAT

WHAT will training cover?
– An integrated package – or a specific topic
– Methods for identifying the type of M & E required and Key Evaluation Questions
– Evaluation designs
  • Specific types, or a range
– Methods of data collection
  • Specific types, or a range – especially mixed qualitative and quantitative
– Methods of data analysis
  • Specific types, or a range
  • Focus on critical thinking
– Approaches to reporting and supporting use
– Managing evaluation – including participatory processes
– Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice
1.1 Applies professional evaluation standards
1.2 Acts ethically and strives for integrity and honesty in conducting evaluations
1.3 Conveys personal evaluation approaches and skills to potential clients
1.4 Respects clients, respondents, program participants and other stakeholders
1.5 Considers the general and public welfare in evaluation practice
1.6 Contributes to the knowledge base of evaluation

2 Systematic Inquiry
2.1 Understands the knowledge base of evaluation (terms, concepts, theories, assumptions)
2.2 Knowledgeable about quantitative methods
2.3 Knowledgeable about qualitative methods
2.4 Knowledgeable about mixed methods
2.5 Conducts literature reviews
2.6 Specifies program theory
2.7 Frames evaluation questions
2.8 Develops evaluation design
2.9 Identifies data sources
2.10 Collects data
2.11 Assesses validity of data
2.12 Assesses reliability of data
2.13 Analyzes data
2.14 Interprets data
2.15 Makes judgments
2.16 Develops recommendations
2.17 Provides rationales for decisions throughout the evaluation
2.18 Reports evaluation procedures and results
2.19 Notes strengths and limitations of the evaluation
2.20 Conducts meta-evaluations

3 Situational Analysis
3.1 Describes the program
3.2 Determines program evaluability
3.3 Identifies the interests of relevant stakeholders
3.4 Serves the information needs of intended users
3.5 Addresses conflicts
3.6 Examines the organizational context of the evaluation
3.7 Analyzes the political considerations relevant to the evaluation
3.8 Attends to issues of evaluation use
3.9 Attends to issues of organizational change
3.10 Respects the uniqueness of the evaluation site and client
3.11 Remains open to input from others
3.12 Modifies the study as needed

4 Project Management
4.1 Responds to requests for proposals
4.2 Negotiates with clients before the evaluation begins
4.3 Writes formal agreements
4.4 Communicates with clients throughout the evaluation process
4.5 Budgets an evaluation
4.6 Justifies cost given information needs
4.7 Identifies needed resources for evaluation, such as information, expertise, personnel, instruments
4.8 Uses appropriate technology
4.9 Supervises others involved in conducting the evaluation
4.10 Trains others involved in conducting the evaluation
4.11 Conducts the evaluation in a nondisruptive manner
4.12 Presents work in a timely manner

5 Reflective Practice
5.1 Aware of self as an evaluator (knowledge, skills, dispositions)
5.2 Reflects on personal evaluation practice (competencies and areas for growth)
5.3 Pursues professional development in evaluation

(Stevahn et al., 2006)

Training and professional development – Short course examples

• University of Zambia M & E course
• IPDET (International Program for Development Evaluation Training), Independent Evaluation Group of the World Bank and Carleton University, http://www.ipdet.org
• CDC (Centers for Disease Control) Summer Institute, USA, http://www.eval.org/SummerInstitute06/SIhome.asp
• The Evaluators Institute, San Francisco, Chicago, Washington DC, USA, www.evaluatorsinstitute.com
• CDRA (Community Development Resource Association) Developmental Planning, Monitoring, Evaluation and Reporting, Cape Town, South Africa, www.cdra.org.za
• Pre-conference workshops:
AfrEA – African Evaluation Association, www.afrea.org
AEA – American Evaluation Association, www.eval.org
SAMEA – South African Monitoring and Evaluation Association, www.samea.org.za
AES – Australasian Evaluation Society, www.aes.asn.au
EES – European Evaluation Society, www.europeanevaluation.org
CES – Canadian Evaluation Society, www.evaluationcanada.ca
UKES – United Kingdom Evaluation Society, www.evaluation.org.uk

Training and professional development – Graduate programs

• Centre for Research on Science and Technology, the University of Stellenbosch, Cape Town, South Africa: Postgraduate Diploma in Monitoring and Evaluation Methods. One-year course delivered in intensive mode – face-to-face courses interspersed with self-study.
• School of Health Systems and Public Health (SHSPH), the University of Pretoria, South Africa, in collaboration with the MEASURE Evaluation Project: M&E concentration in their Master of Public Health degree program. Courses taught in modules of one to three weeks, with a six-month internship and individual research.
• Graduate School of Public & Development Management (P&DM), the University of the Witwatersrand, Johannesburg – electives on monitoring and evaluation as part of their Masters degree programmes in Public and Development Management as well as in Public Policy.
• Centre for Program Evaluation, University of Melbourne, Australia: Masters of Assessment and Evaluation, available by distance education, www.unimelb.edu.au/cpe
• CIRCLE, Royal Melbourne Institute of Technology, Australia: Masters and PhD by research.
• Western Michigan University, USA: interdisciplinary PhD, residential coursework program.

Training and professional development – On-line material

• Self-paced courses
• Manuals
• Guidelines

Training and professional development – Key Questions

• Who controls the curriculum, accreditation of courses and certification of evaluators?
• What are the consequences of this control?

3 Organisational infrastructure

• Manuals
• Evaluation frameworks
• Guidelines
• Principles
• Standards
• Checklists
• Processes for commissioning and prioritising evaluation, including contracts
• Data resources – databases, collection hardware and software, analysis hardware and software, standardised data collection tools and measures (see the illustrative sketch after this list)
• Evaluation journals and books
• Support for an evaluation culture
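The "standardised data collection tools and measures" item above is infrastructure that can be written down very concretely. Purely as an illustrative sketch – not part of the original presentation – the snippet below shows one possible way to record a standardised indicator definition so that every programme collects and reports it the same way; all class, field and value names are hypothetical.

```python
# Hypothetical sketch: a standardised indicator definition that an organisation
# could agree on once and reuse across programmes. Not from the presentation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class IndicatorDefinition:
    name: str                    # short label for the indicator
    definition: str              # precise wording agreed across programmes
    unit: str                    # e.g. "% of households"
    data_source: str             # where the data come from
    collection_frequency: str    # how often it is collected
    baseline: Optional[float] = None
    target: Optional[float] = None

# Example (invented values) of one agreed measure:
water_access = IndicatorDefinition(
    name="Households with access to clean water",
    definition="Share of households within 500 m of an improved water source",
    unit="% of households",
    data_source="Quarterly household survey",
    collection_frequency="Quarterly",
    baseline=62.0,
    target=80.0,
)
print(water_access.name, "- target:", water_access.target)
```

Recording measures in a shared, structured form like this is one small way to reduce the "gaps in data" problem, because everyone collects and reports the same thing in the same way.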

Options for organisational infrastructure

• Importing existing infrastructure
• Adapting existing infrastructure
• Developing locally specific infrastructure

Some existing infrastructure

• Guidelines, e.g. Australasian Evaluation Society Ethical Guidelines, www.aes.asn.au

6. Practise within competence: The evaluator or evaluation team should possess the knowledge, abilities, skills and experience appropriate to undertake the tasks proposed in the evaluation. Evaluators should fairly represent their competence, and should not practice beyond it.

21. Fully reflect evaluator's findings: The final report(s) of the evaluation should reflect fully the findings and conclusions determined by the evaluator, and these should not be amended without the evaluator's consent.

Some existing infrastructure

• Checklist, e.g. Patton's Qualitative Evaluation Checklist, http://www.wmich.edu/evalctr/checklists

1. Determine the extent to which qualitative methods are appropriate given the evaluation's purposes and intended uses.
• Be prepared to explain the variations, strengths and weaknesses of qualitative evaluations
• Determine the criteria by which the quality of the evaluation will be judged
• Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluation's purpose, users and audiences
• Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

• Informal networks
• Evaluation societies and associations
• Learning circles
• Mentoring

5 External review processes

• WHAT
– Priorities for evaluation
– Guidelines, manuals
– Plans for individual evaluations
– Specifications for indicators
– Data collection
– Data analysis
– Reports

5 External review processes

• WHEN
– Before next stage of evaluation (review for improvement)
– Before acceptance of evaluation report
– At end of an episode of evaluation – identify and document lessons learned about evaluation
– As part of ongoing quality assurance

5 External review processes

• WHO
– Peer review – reciprocal review of each other's work
– External expert

6 Supporting use

• Register of evaluation reports – summary of methods used, findings, availability of report (see the illustrative sketch after this list)
• Publishing evaluation reports – library and web access
• Tracking and reporting on implementation of recommendations
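To make the register idea above concrete: the sketch below is a minimal, hypothetical illustration (not something proposed in the presentation) of what one entry in a register of evaluation reports might record, including the recommendation tracking mentioned in the last bullet. All class and field names are assumptions for illustration only.

```python
# Hypothetical sketch of a register entry for one evaluation report.
# Not from the presentation; names and values are invented.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Recommendation:
    text: str
    status: str = "not started"   # e.g. "not started", "in progress", "implemented"

@dataclass
class EvaluationReportEntry:
    title: str
    programme: str
    year: int
    methods_used: List[str]       # summary of methods used
    key_findings: List[str]       # summary of findings
    report_location: str          # library reference or web address
    recommendations: List[Recommendation] = field(default_factory=list)

    def implementation_summary(self) -> str:
        # Supports tracking and reporting on implementation of recommendations.
        done = sum(1 for r in self.recommendations if r.status == "implemented")
        return f"{done} of {len(self.recommendations)} recommendations implemented"

entry = EvaluationReportEntry(
    title="Mid-term evaluation of a rural clinic programme",
    programme="Primary health care",
    year=2006,
    methods_used=["document review", "site visits", "patient survey"],
    key_findings=["Coverage improved", "Staff retention remains weak"],
    report_location="Departmental library / intranet",
    recommendations=[Recommendation("Introduce rural staffing incentives", "in progress")],
)
print(entry.implementation_summary())
```

Even a simple spreadsheet with these columns would serve the same purpose; the point is that methods, findings, where the report can be found, and the status of each recommendation are kept in one findable place.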

7 Building knowledge about what works in evaluation in particular contexts

• Research into evaluation – empirically documenting what is done and how it goes
• Publishing accounts and lessons learned – books, journals, web sites, locally and internationally

Example: Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations:
1 Aroha ki te tangata (respect for people)
2 Kanohi kitea (the seen face; a requirement to present yourself 'face to face')
3 Titiro, whakarongo… korero (look, listen… then speak)
4 Manaaki ki te tangata (share and host people, be generous)
5 Kia tupato (be cautious)
6 Kaua e takahia te mana o te tangata (do not trample on the mana of people)
7 Kaua e mahaki (do not flaunt your knowledge)

Smith, G.H. (1997) The Development of Kaupapa Maori Theory and Praxis. University of Auckland, Auckland.

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M & E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts


Are these relevant for South Africa?

Are there others?

Which are most important to enact – and how?

Patricia.Rogers@rmit.edu.au

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 11: Strategies for improving Monitoring and Evaluation

2 Unfocused

2 Unfocused

2 Unfocused

2 Unfocused

bullTrying to look at everything ndash and looking at nothing wellbullNot communicating clear messages

3 Unrealistic expectations

3 Unrealistic expectations

3 Unrealistic expectations

Expecting bull too much too soon

and too easilybulldefinitive answersbull immediate answers

about long-term impacts

4 Not enough good information

4 Not enough good information

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 12: Strategies for improving Monitoring and Evaluation

2 Unfocused

2 Unfocused

2 Unfocused

bullTrying to look at everything ndash and looking at nothing wellbullNot communicating clear messages

3 Unrealistic expectations

3 Unrealistic expectations

3 Unrealistic expectations

Expecting bull too much too soon

and too easilybulldefinitive answersbull immediate answers

about long-term impacts

4 Not enough good information

4 Not enough good information

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 13: Strategies for improving Monitoring and Evaluation

2 Unfocused

2 Unfocused

bullTrying to look at everything ndash and looking at nothing wellbullNot communicating clear messages

3 Unrealistic expectations

3 Unrealistic expectations

3 Unrealistic expectations

Expecting bull too much too soon

and too easilybulldefinitive answersbull immediate answers

about long-term impacts

4 Not enough good information

4 Not enough good information

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluation: The effect of M & E

The underlying programme logic of the South African Public Service Commission M & E system (Public Service Commission 2003):
• Public service monitoring
• Problem areas are identified; good practice by others is identified and promoted
• REPORTING – priority areas in public administration are communicated
• Departments reflect on their own performance and focus on priority areas
• FOLLOW-UP – problems are addressed; achievements are affirmed and promoted; learning from good-practice examples takes place
• Overall result: better governance and service delivery in South Africa
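A chain like this can also be written down as structured data so that each link can be checked for supporting evidence during monitoring. The sketch below is purely illustrative: the stage names paraphrase the reconstructed logic model above, and the Python structure is an assumption of this edit, not part of the Public Service Commission system.

```python
# Illustrative sketch only: a programme logic chain as an ordered list of stages,
# so each link can be checked for supporting evidence during monitoring.
from dataclasses import dataclass, field

@dataclass
class Stage:
    name: str
    evidence: list = field(default_factory=list)  # indicators or reports backing this stage

programme_logic = [
    Stage("Public service monitoring"),
    Stage("Problem areas and good practice identified"),
    Stage("Reporting: priority areas communicated to departments"),
    Stage("Departments reflect on performance and focus on priority areas"),
    Stage("Follow-up: problems addressed, achievements affirmed and promoted"),
    Stage("Better governance and service delivery"),
]

# Report the first link in the chain that has no evidence recorded yet.
first_gap = next((s.name for s in programme_logic if not s.evidence), None)
print("First stage lacking evidence:", first_gap)
```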

Simple model of building evaluation capacity
Various activities → Build skills and knowledge in M & E → Application of new capacity → Improved programs → Better outcomes for the public

Conceptual model for evaluation: Iterative model of building evaluation capacity (Rogers 2002)
• Various activities
• Identify existing capacity and build new capacity (types of capital: human, economic, social, organisational)
• Development of systems to apply evaluation capacity – to undertake, oversee and use discrete evaluations, ongoing evaluative activity and monitoring
• Opportunities to deploy the capacity
• Improved programs (through improved implementation, better resource allocation or improved selection of programs)
• Better outcomes for the public

2 Training and professional development
• WHO is to receive training?
• HOW will training be undertaken?
• WHAT will training cover?
• WHO will control content, certification and accreditation?

Training and professional development – WHO

WHO is to receive training?
– Those formally named as evaluators
– Those with formal responsibility for doing evaluation
– Those who will commission or require evaluation
– Those who will use evaluation (e.g. program managers, policy makers)
– Citizens and citizen advocates

Training and professional development – HOW

HOW will training be undertaken?
– Timing – before working in evaluation, or as ongoing professional development
– Duration – a few days, a few weeks, a few years
– Intensity – concentrated, weekly, annually, "sandwich"
– Method – face to face, distance (email, webinars, teleconference, videoconference), self-paced
– Level – short course, certificate, graduate program (Master's, Graduate Diploma, PhD)
– Customisation – generic, sector-specific, organisation-specific

Training and professional development – WHAT

WHAT will training cover?
– An integrated package – or a specific topic
– Methods for identifying the type of M & E required and Key Evaluation Questions
– Evaluation designs
  • Specific types or a range
– Methods of data collection
  • Specific types or a range – especially mixed qualitative and quantitative
– Methods of data analysis
  • Specific types or a range
  • Focus on critical thinking
– Approaches to reporting and supporting use
– Managing evaluation – including participatory processes
– Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice
1.1 Applies professional evaluation standards
1.2 Acts ethically and strives for integrity and honesty in conducting evaluations
1.3 Conveys personal evaluation approaches and skills to potential clients
1.4 Respects clients, respondents, program participants and other stakeholders
1.5 Considers the general and public welfare in evaluation practice
1.6 Contributes to the knowledge base of evaluation

2 Systematic Inquiry
2.1 Understands the knowledge base of evaluation (terms, concepts, theories, assumptions)
2.2 Knowledgeable about quantitative methods
2.3 Knowledgeable about qualitative methods
2.4 Knowledgeable about mixed methods
2.5 Conducts literature reviews
2.6 Specifies program theory
2.7 Frames evaluation questions
2.8 Develops evaluation design
2.9 Identifies data sources
2.10 Collects data
2.11 Assesses validity of data
2.12 Assesses reliability of data
2.13 Analyzes data
2.14 Interprets data
2.15 Makes judgments
2.16 Develops recommendations
2.17 Provides rationales for decisions throughout the evaluation
2.18 Reports evaluation procedures and results
2.19 Notes strengths and limitations of the evaluation
2.20 Conducts meta-evaluations

3 Situational Analysis
3.1 Describes the program
3.2 Determines program evaluability
3.3 Identifies the interests of relevant stakeholders
3.4 Serves the information needs of intended users
3.5 Addresses conflicts
3.6 Examines the organizational context of the evaluation
3.7 Analyzes the political considerations relevant to the evaluation
3.8 Attends to issues of evaluation use
3.9 Attends to issues of organizational change
3.10 Respects the uniqueness of the evaluation site and client
3.11 Remains open to input from others
3.12 Modifies the study as needed

4 Project Management
4.1 Responds to requests for proposals
4.2 Negotiates with clients before the evaluation begins
4.3 Writes formal agreements
4.4 Communicates with clients throughout the evaluation process
4.5 Budgets an evaluation
4.6 Justifies cost given information needs
4.7 Identifies needed resources for evaluation, such as information, expertise, personnel, instruments
4.8 Uses appropriate technology
4.9 Supervises others involved in conducting the evaluation
4.10 Trains others involved in conducting the evaluation
4.11 Conducts the evaluation in a nondisruptive manner
4.12 Presents work in a timely manner

5 Reflective Practice
5.1 Aware of self as an evaluator (knowledge, skills, dispositions)
5.2 Reflects on personal evaluation practice (competencies and areas for growth)
5.3 Pursues professional development in evaluation

(Stevahn et al. 2006)

Training and professional development – Short course examples

• University of Zambia M & E course
• IPDET (International Program for Development Evaluation Training), Independent Evaluation Group of the World Bank and Carleton University, http://www.ipdet.org
• CDC (Centers for Disease Control) Summer Institute, USA, http://www.eval.org/SummerInstitute06/SIhome.asp
• The Evaluators' Institute – San Francisco, Chicago, Washington DC, USA, www.evaluatorsinstitute.com
• CDRA (Community Development Resource Association) Developmental Planning, Monitoring, Evaluation and Reporting, Cape Town, South Africa, www.cdra.org.za
• Pre-conference workshops:
  AfrEA – African Evaluation Association, www.afrea.org
  AEA – American Evaluation Association, www.eval.org
  SAMEA – South African Monitoring and Evaluation Association, www.samea.org.za
  AES – Australasian Evaluation Society, www.aes.asn.au
  EES – European Evaluation Society, www.europeanevaluation.org
  CES – Canadian Evaluation Society, www.evaluationcanada.ca
  UKES – United Kingdom Evaluation Society, www.evaluation.org.uk

Training and professional development – Graduate programs

• Centre for Research on Science and Technology, University of Stellenbosch, Cape Town, South Africa: Postgraduate Diploma in Monitoring and Evaluation Methods. One-year course delivered in intensive mode, with face-to-face courses interspersed with self-study.
• School of Health Systems and Public Health (SHSPH), University of Pretoria, South Africa, in collaboration with the MEASURE Evaluation Project: M&E concentration in their Master of Public Health degree program. Courses taught in modules of one to three weeks, a six-month internship and individual research.
• Graduate School of Public & Development Management (P&DM), University of the Witwatersrand, Johannesburg: electives on monitoring and evaluation as part of their Masters degree programmes in Public and Development Management as well as in Public Policy.
• Centre for Program Evaluation, University of Melbourne, Australia: Masters of Assessment and Evaluation, available by distance education, www.unimelb.edu.au/cpe
• CIRCLE, Royal Melbourne Institute of Technology, Australia: Masters and PhD by research.
• Western Michigan University, USA: interdisciplinary PhD, residential coursework program.

Training and professional development – On-line material
• Self-paced courses
• Manuals
• Guidelines

Training and professional development – Key Questions
• Who controls the curriculum, accreditation of courses and certification of evaluators?
• What are the consequences of this control?

3 Organisational infrastructure
• Manuals
• Evaluation frameworks
• Guidelines
• Principles
• Standards
• Checklists
• Processes for commissioning and prioritising evaluation, including contracts
• Data resources – databases, collection hardware and software, analysis hardware and software, standardised data collection tools and measures (see the sketch after this list)
• Evaluation journals and books
• Support for an evaluation culture
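As one way to picture the "standardised data collection tools and measures" item, the sketch below shows how a shared indicator definition might be recorded as a small configuration; the indicator, field names and values are hypothetical examples, not drawn from the presentation.

```python
# Illustrative sketch only: a standardised indicator definition that different
# programs could share, so the same measure is collected the same way everywhere.
STANDARD_INDICATORS = {
    "households_with_access_to_clean_water": {   # hypothetical indicator name
        "definition": "Share of households with access to a safe water source",
        "unit": "percent",
        "collection_method": "household survey",
        "frequency": "annual",
        "disaggregation": ["province", "urban/rural"],
    },
}

def describe(indicator_id: str) -> str:
    """Return a one-line, human-readable description of a standard indicator."""
    spec = STANDARD_INDICATORS[indicator_id]
    return (f"{indicator_id}: {spec['definition']} "
            f"({spec['unit']}, collected {spec['frequency']} via {spec['collection_method']})")

print(describe("households_with_access_to_clean_water"))
```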

Options for organisational infrastructure
• Importing existing infrastructure
• Adapting existing infrastructure
• Developing locally specific infrastructure

Some existing infrastructure
• Guidelines, e.g. Australasian Evaluation Society Ethical Guidelines, www.aes.asn.au

6 Practise within competence
The evaluator or evaluation team should possess the knowledge, abilities, skills and experience appropriate to undertake the tasks proposed in the evaluation. Evaluators should fairly represent their competence and should not practice beyond it.

21 Fully reflect evaluator's findings
The final report(s) of the evaluation should reflect fully the findings and conclusions determined by the evaluator, and these should not be amended without the evaluator's consent.

Some existing infrastructure
• Checklist, e.g. Patton's Qualitative Evaluation Checklist, http://www.wmich.edu/evalctr/checklists

1. Determine the extent to which qualitative methods are appropriate given the evaluation's purposes and intended uses.
  • Be prepared to explain the variations, strengths and weaknesses of qualitative evaluations
  • Determine the criteria by which the quality of the evaluation will be judged
  • Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluation's purpose, users and audiences
  • Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks
• Informal networks
• Evaluation societies and associations
• Learning circles
• Mentoring

5 External review processes

• WHAT
  – Priorities for evaluation
  – Guidelines, manuals
  – Plans for individual evaluations
  – Specifications for indicators
  – Data collection
  – Data analysis
  – Reports

• WHEN
  – Before the next stage of evaluation (review for improvement)
  – Before acceptance of the evaluation report
  – At the end of an episode of evaluation – identify and document lessons learned about evaluation
  – As part of ongoing quality assurance

• WHO
  – Peer review – reciprocal review of each other's work
  – External expert

6 Supporting use
• Register of evaluation reports – summary of methods used, findings, and availability of the report (a minimal sketch of such a register follows below)
• Publishing evaluation reports – library and web access
• Tracking and reporting on implementation of recommendations
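To make the register idea concrete, the sketch below shows one possible shape for a register entry, including simple tracking of whether recommendations have been implemented. The field names, the example entry and the Python structure are illustrative assumptions, not part of the original presentation or of any particular M & E system.

```python
# Illustrative sketch only: one possible structure for a register of evaluation
# reports, including tracking of recommendation implementation.
from dataclasses import dataclass, field

@dataclass
class Recommendation:
    text: str
    implemented: bool = False  # updated as departments report back

@dataclass
class EvaluationRecord:
    title: str
    year: int
    methods: list          # e.g. ["document review", "key informant interviews"]
    key_findings: list
    report_url: str        # where the full report can be accessed
    recommendations: list = field(default_factory=list)

    def implementation_rate(self) -> float:
        """Share of recommendations reported as implemented."""
        if not self.recommendations:
            return 0.0
        done = sum(r.implemented for r in self.recommendations)
        return done / len(self.recommendations)

# Hypothetical entry, for illustration only.
record = EvaluationRecord(
    title="Mid-term review of a service delivery programme",
    year=2006,
    methods=["site visits", "household survey"],
    key_findings=["Coverage improved", "Data quality uneven"],
    report_url="http://example.org/reports/mid-term-review.pdf",
    recommendations=[
        Recommendation("Standardise data collection tools", implemented=True),
        Recommendation("Publish the report on the department website"),
    ],
)
print(f"{record.title}: {record.implementation_rate():.0%} of recommendations implemented")
```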

7 Building knowledge about what works in evaluation in particular contexts

• Research into evaluation – empirically documenting what is done and how it goes
• Publishing accounts and lessons learned
  – Books
  – Journals
  – Web sites
  – Locally and internationally

Example: Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations:
1. Aroha ki te tangata (respect for people)
2. Kanohi kitea (the seen face; a requirement to present yourself 'face to face')
3. Titiro, whakarongo… korero (look, listen… then speak)
4. Manaaki ki te tangata (share and host people, be generous)
5. Kia tupato (be cautious)
6. Kaua e takahia te mana o te tangata (do not trample on the mana of people)
7. Kaua e mahaki (do not flaunt your knowledge)

Smith, G.H. (1997) The Development of Kaupapa Maori: Theory and Praxis. University of Auckland, Auckland.

7 strategies to avoid the big 5 mistakes (and others)

The Big 5:
• LIMITED VIEW
• UNFOCUSED
• UNREALISTIC
• GAPS IN DATA
• WHAT TO DO WITH IT

The 7 strategies:
1. Ways of thinking about M & E
2. Training and professional development
3. Organisational infrastructure
4. Supportive networks
5. External review processes
6. Supporting use
7. Building and sharing knowledge about what works in local contexts


Are these relevant for South Africa?
Are there others?
Which are most important to enact – and how?

Patricia.Rogers@rmit.edu.au

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

Page 14: Strategies for improving Monitoring and Evaluation

2 Unfocused

bullTrying to look at everything ndash and looking at nothing wellbullNot communicating clear messages

3 Unrealistic expectations

3 Unrealistic expectations

3 Unrealistic expectations

Expecting bull too much too soon

and too easilybulldefinitive answersbull immediate answers

about long-term impacts

4 Not enough good information

4 Not enough good information

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 15: Strategies for improving Monitoring and Evaluation

3 Unrealistic expectations

3 Unrealistic expectations

3 Unrealistic expectations

Expecting bull too much too soon

and too easilybulldefinitive answersbull immediate answers

about long-term impacts

4 Not enough good information

4 Not enough good information

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 16: Strategies for improving Monitoring and Evaluation

3 Unrealistic expectations

3 Unrealistic expectations

Expecting bull too much too soon

and too easilybulldefinitive answersbull immediate answers

about long-term impacts

4 Not enough good information

4 Not enough good information

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1. Professional Practice
1.1 Applies professional evaluation standards
1.2 Acts ethically and strives for integrity and honesty in conducting evaluations
1.3 Conveys personal evaluation approaches and skills to potential clients
1.4 Respects clients, respondents, program participants and other stakeholders
1.5 Considers the general and public welfare in evaluation practice
1.6 Contributes to the knowledge base of evaluation

2. Systematic Inquiry
2.1 Understands the knowledge base of evaluation (terms, concepts, theories, assumptions)
2.2 Knowledgeable about quantitative methods
2.3 Knowledgeable about qualitative methods
2.4 Knowledgeable about mixed methods
2.5 Conducts literature reviews
2.6 Specifies program theory
2.7 Frames evaluation questions
2.8 Develops evaluation design
2.9 Identifies data sources
2.10 Collects data
2.11 Assesses validity of data
2.12 Assesses reliability of data
2.13 Analyzes data
2.14 Interprets data
2.15 Makes judgments
2.16 Develops recommendations
2.17 Provides rationales for decisions throughout the evaluation
2.18 Reports evaluation procedures and results
2.19 Notes strengths and limitations of the evaluation
2.20 Conducts meta-evaluations

3. Situational Analysis
3.1 Describes the program
3.2 Determines program evaluability
3.3 Identifies the interests of relevant stakeholders
3.4 Serves the information needs of intended users
3.5 Addresses conflicts
3.6 Examines the organizational context of the evaluation
3.7 Analyzes the political considerations relevant to the evaluation
3.8 Attends to issues of evaluation use
3.9 Attends to issues of organizational change
3.10 Respects the uniqueness of the evaluation site and client
3.11 Remains open to input from others
3.12 Modifies the study as needed

4. Project Management
4.1 Responds to requests for proposals
4.2 Negotiates with clients before the evaluation begins
4.3 Writes formal agreements
4.4 Communicates with clients throughout the evaluation process
4.5 Budgets an evaluation
4.6 Justifies cost given information needs
4.7 Identifies needed resources for evaluation, such as information, expertise, personnel, instruments
4.8 Uses appropriate technology
4.9 Supervises others involved in conducting the evaluation
4.10 Trains others involved in conducting the evaluation
4.11 Conducts the evaluation in a nondisruptive manner
4.12 Presents work in a timely manner

5. Reflective Practice
5.1 Aware of self as an evaluator (knowledge, skills, dispositions)
5.2 Reflects on personal evaluation practice (competencies and areas for growth)
5.3 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

• University of Zambia M & E course
• IPDET (International Program for Development Evaluation Training), Independent Evaluation Group of the World Bank and Carleton University, http://www.ipdet.org
• CDC (Centers for Disease Control) Summer Institute, USA, http://www.eval.org/SummerInstitute06/SIhome.asp
• The Evaluators' Institute, San Francisco, Chicago, Washington DC, USA, www.evaluatorsinstitute.com
• CDRA (Community Development Resource Association), Developmental Planning, Monitoring, Evaluation and Reporting, Cape Town, South Africa, www.cdra.org.za
• Pre-conference workshops:
  AfrEA – African Evaluation Association, www.afrea.org
  AEA – American Evaluation Association, www.eval.org
  SAMEA – South African Monitoring and Evaluation Association, www.samea.org.za
  AES – Australasian Evaluation Society, www.aes.asn.au
  EES – European Evaluation Society, www.europeanevaluation.org
  CES – Canadian Evaluation Society, www.evaluationcanada.ca
  UKES – United Kingdom Evaluation Society, www.evaluation.org.uk

Training and professional development ndash Graduate programs

• Centre for Research on Science and Technology, the University of Stellenbosch, Cape Town, South Africa: Postgraduate Diploma in Monitoring and Evaluation Methods. One-year course delivered in intensive mode, with face-to-face courses interspersed with self-study.
• School of Health Systems and Public Health (SHSPH), the University of Pretoria, South Africa, in collaboration with the MEASURE Evaluation Project: M&E concentration in their Master of Public Health degree program. Courses taught in modules of one to three weeks, a six-month internship and individual research.
• Graduate School of Public & Development Management (P&DM), the University of the Witwatersrand, Johannesburg: electives on monitoring and evaluation as part of their Masters degree programmes in Public and Development Management as well as in Public Policy.
• Centre for Program Evaluation, University of Melbourne, Australia: Masters of Assessment and Evaluation. Available by distance education. www.unimelb.edu.au/cpe
• CIRCLE, Royal Melbourne Institute of Technology, Australia: Masters and PhD by research.
• Western Michigan University, USA: interdisciplinary PhD, residential coursework program.

Training and professional development ndash On-line material

• Self-paced courses
• Manuals
• Guidelines

Training and professional development ndash Key Questions

• Who controls the curriculum, accreditation of courses and certification of evaluators?

• What are the consequences of this control?

3 Organisational infrastructure

• Manuals
• Evaluation frameworks
• Guidelines
• Principles
• Standards
• Checklists
• Processes for commissioning and prioritising evaluation, including contracts
• Data resources – databases; collection hardware and software; analysis hardware and software; standardised data collection tools and measures
• Evaluation journals and books
• Support for an evaluation culture
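Another purely illustrative sketch, assuming nothing beyond the slide's mention of standardised data collection tools and measures: one way to standardise a measure is to keep its definition (wording, scale, version) in a single shared record and validate incoming responses against it. The field names and example values are invented for the sketch (Python).

from dataclasses import dataclass


@dataclass
class Measure:
    """A shared, versioned definition of one standardised measure."""
    measure_id: str
    question: str
    scale: tuple      # allowed response values
    version: str


# Invented example of a standardised measure definition.
client_satisfaction = Measure(
    measure_id="SAT-01",
    question="Overall, how satisfied were you with the service?",
    scale=(1, 2, 3, 4, 5),
    version="2007-03",
)


def is_valid_response(measure: Measure, value) -> bool:
    """Reject out-of-range responses before they enter the shared dataset."""
    return value in measure.scale


print(is_valid_response(client_satisfaction, 4))   # True
print(is_valid_response(client_satisfaction, 9))   # False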

Options for organisational infrastructure

• Importing existing infrastructure
• Adapting existing infrastructure
• Developing locally specific infrastructure

Some existing infrastructure

• Guidelines, e.g. Australasian Evaluation Society Ethical Guidelines, www.aes.asn.au

6. Practise within competence: The evaluator or evaluation team should possess the knowledge, abilities, skills and experience appropriate to undertake the tasks proposed in the evaluation. Evaluators should fairly represent their competence and should not practise beyond it.

21. Fully reflect evaluator's findings: The final report(s) of the evaluation should reflect fully the findings and conclusions determined by the evaluator, and these should not be amended without the evaluator's consent.

Some existing infrastructure

• Checklist, e.g. Patton's Qualitative Evaluation Checklist, http://www.wmich.edu/evalctr/checklists

1. Determine the extent to which qualitative methods are appropriate given the evaluation's purposes and intended uses.
  • Be prepared to explain the variations, strengths and weaknesses of qualitative evaluations.
  • Determine the criteria by which the quality of the evaluation will be judged.
  • Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluation's purpose, users and audiences.
  • Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible.

4 Supportive networks

• Informal networks

• Evaluation societies and associations

• Learning circles

• Mentoring

5 External review processes

• WHAT
  – Priorities for evaluation
  – Guidelines, manuals
  – Plans for individual evaluations
  – Specifications for indicators
  – Data collection
  – Data analysis
  – Reports

5 External review processes

• WHEN
  – Before next stage of evaluation (review for improvement)
  – Before acceptance of evaluation report
  – At end of an episode of evaluation – identify and document lessons learned about evaluation
  – As part of ongoing quality assurance

5 External review processes

• WHO
  – Peer review – reciprocal review of each other's work
  – External expert

6 Supporting use

• Register of evaluation reports – summary of methods used, findings, availability of report

• Publishing evaluation reports – library and web access

• Tracking and reporting on implementation of recommendations
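As a hedged sketch rather than a prescribed system (field names, statuses and the example entry are all assumptions), the register of evaluation reports and the tracking of recommendations listed above could be captured with records like these (Python).

from dataclasses import dataclass, field
from typing import List


@dataclass
class Recommendation:
    text: str
    status: str = "not started"    # assumed statuses: "not started", "in progress", "implemented"


@dataclass
class EvaluationReport:
    """One entry in a register of evaluation reports."""
    title: str
    methods_summary: str
    key_findings: List[str]
    availability: str              # e.g. library reference or web address
    recommendations: List[Recommendation] = field(default_factory=list)


def implementation_rate(report: EvaluationReport) -> float:
    """Share of the report's recommendations recorded as implemented."""
    if not report.recommendations:
        return 0.0
    done = sum(r.status == "implemented" for r in report.recommendations)
    return done / len(report.recommendations)


# Invented example entry for the register.
example = EvaluationReport(
    title="Mid-term review of programme X",
    methods_summary="Mixed methods: routine monitoring data plus interviews",
    key_findings=["Coverage improved", "Data quality uneven across districts"],
    availability="Departmental library and intranet",
    recommendations=[
        Recommendation("Standardise data collection tools", "implemented"),
        Recommendation("Publish quarterly progress reports"),
    ],
)

print(f"{example.title}: {implementation_rate(example):.0%} of recommendations implemented")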

7 Building knowledge about what works in evaluation in particular contexts

• Research into evaluation – empirically documenting what is done and how it goes

• Publishing accounts and lessons learned
  – Books
  – Journals
  – Web sites
  – Locally and internationally

Example: Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations:
1. Aroha ki te tangata (respect for people)
2. Kanohi kitea (the seen face; a requirement to present yourself 'face to face')
3. Titiro, whakarongo… korero (look, listen… then speak)
4. Manaaki ki te tangata (share and host people, be generous)
5. Kia tupato (be cautious)
6. Kaua e takahia te mana o te tangata (do not trample on the mana of people)
7. Kaua e mahaki (do not flaunt your knowledge)

Smith, G.H. (1997). The Development of Kaupapa Maori theory and praxis. University of Auckland, Auckland.

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M & E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa?

Are there others?

Which are most important to enact – and how?

Patricia.Rogers@rmit.edu.au

CIRCLE at RMIT University

Collaborative Institute for Research, Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

3 Unrealistic expectations

Expecting bull too much too soon

and too easilybulldefinitive answersbull immediate answers

about long-term impacts

4 Not enough good information

4 Not enough good information

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 18: Strategies for improving Monitoring and Evaluation

4 Not enough good information

4 Not enough good information

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 19: Strategies for improving Monitoring and Evaluation

4 Not enough good information

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

• Centre for Research on Science and Technology, University of Stellenbosch, Cape Town, South Africa: Postgraduate Diploma in Monitoring and Evaluation Methods. A one-year course delivered in intensive mode, with face-to-face courses interspersed with self-study.
• School of Health Systems and Public Health (SHSPH), University of Pretoria, South Africa, in collaboration with the MEASURE Evaluation Project: M&E concentration in their Master of Public Health degree program. Courses taught in modules of one to three weeks, a six-month internship and individual research.
• Graduate School of Public & Development Management (P&DM), University of the Witwatersrand, Johannesburg: electives on monitoring and evaluation as part of their Masters degree programmes in Public and Development Management as well as in Public Policy.
• Centre for Program Evaluation, University of Melbourne, Australia: Masters of Assessment and Evaluation, available by distance education. www.unimelb.edu.au/cpe
• CIRCLE, Royal Melbourne Institute of Technology, Australia: Masters and PhD by research.
• Western Michigan University, USA: interdisciplinary PhD, residential coursework program.

Training and professional development – On-line material

• Self-paced courses
• Manuals
• Guidelines

Training and professional development – Key questions

• Who controls the curriculum, accreditation of courses and certification of evaluators?

• What are the consequences of this control?

3 Organisational infrastructure

• Manuals
• Evaluation frameworks
• Guidelines
• Principles
• Standards
• Checklists
• Processes for commissioning and prioritising evaluation, including contracts
• Data resources – databases, collection hardware and software, analysis hardware and software, standardised data collection tools and measures
• Evaluation journals and books
• Support for an evaluation culture

Options for organisational infrastructure

• Importing existing infrastructure
• Adapting existing infrastructure
• Developing locally specific infrastructure

Some existing infrastructure

• Guidelines, e.g. Australasian Evaluation Society Ethical Guidelines, www.aes.asn.au

6. Practise within competence
The evaluator or evaluation team should possess the knowledge, abilities, skills and experience appropriate to undertake the tasks proposed in the evaluation. Evaluators should fairly represent their competence, and should not practise beyond it.

21. Fully reflect evaluator's findings
The final report(s) of the evaluation should reflect fully the findings and conclusions determined by the evaluator, and these should not be amended without the evaluator's consent.

Some existing infrastructure

• Checklist, e.g. Patton's Qualitative Evaluation Checklist, http://www.wmich.edu/evalctr/checklists

1. Determine the extent to which qualitative methods are appropriate given the evaluation's purposes and intended uses.
• Be prepared to explain the variations, strengths and weaknesses of qualitative evaluations
• Determine the criteria by which the quality of the evaluation will be judged
• Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluation's purpose, users and audiences
• Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

• Informal networks

• Evaluation societies and associations

• Learning circles

• Mentoring

5 External review processes

• WHAT
  – Priorities for evaluation
  – Guidelines, manuals
  – Plans for individual evaluations
  – Specifications for indicators
  – Data collection
  – Data analysis
  – Reports

5 External review processes

• WHEN
  – Before next stage of evaluation (review for improvement)
  – Before acceptance of evaluation report
  – At end of an episode of evaluation – identify and document lessons learned about evaluation
  – As part of ongoing quality assurance

5 External review processes

• WHO
  – Peer review – reciprocal review of each other's work
  – External expert

6 Supporting use

• Register of evaluation reports: summary of methods used, findings, availability of report

• Publishing evaluation reports: library and web access

• Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

• Research into evaluation – empirically documenting what is done and how it goes

• Publishing accounts and lessons learned
  – Books
  – Journals
  – Web sites
  – Locally and internationally

Example: Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations:
1. Aroha ki te tangata (respect for people)
2. Kanohi kitea (the seen face; a requirement to present yourself 'face to face')
3. Titiro, whakarongo… korero (look, listen… then speak)
4. Manaaki ki te tangata (share and host people, be generous)
5. Kia tupato (be cautious)
6. Kaua e takahia te mana o te tangata (do not trample on the mana of people)
7. Kaua e mahaki (do not flaunt your knowledge)

Smith, G.H. (1997) The Development of Kaupapa Maori: Theory and Praxis. University of Auckland, Auckland.

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M & E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa?

Are there others?

Which are most important to enact – and how?

Patricia.Rogers@rmit.edu.au

CIRCLE at RMIT University

Collaborative Institute for Research, Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne, AUSTRALIA

Page 20: Strategies for improving Monitoring and Evaluation

4 Not enough good information

bullPoor measurement and other data collection

bullPoor response ratebullInadequate data analysis

bullSensitive data removed

bullPressure to fill in missing data

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 21: Strategies for improving Monitoring and Evaluation

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 22: Strategies for improving Monitoring and Evaluation

5 Waiting till the end to work out what to do with what comes out

5 Waiting till the end to work out what to do with what comes out

bull Collecting lots of data ndash and then not being sure how to analyse it

bull Doing lots of evaluations ndash and then not being sure how to use them

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

• Research into evaluation – empirically documenting what is done and how it goes

• Publishing accounts and lessons learned
  – Books
  – Journals
  – Web sites
  – Locally and internationally

Example: Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations:
1. Aroha ki te tangata (respect for people)
2. Kanohi kitea (the seen face; a requirement to present yourself 'face to face')
3. Titiro, whakarongo… korero (look, listen… then speak)
4. Manaaki ki te tangata (share and host people, be generous)
5. Kia tupato (be cautious)
6. Kaua e takahia te mana o te tangata (do not trample on the mana of people)
7. Kaua e mahaki (do not flaunt your knowledge)

Smith, G.H. (1997). The Development of Kaupapa Maori Theory and Praxis. University of Auckland, Auckland.

7 strategies to avoid the big 5 mistakes (and others)

Big 5 mistakes: LIMITED VIEW • UNFOCUSED • UNREALISTIC • GAPS IN DATA • WHAT TO DO WITH IT

Seven strategies:
1 Ways of thinking about M & E
2 Training and professional development
3 Organisational infrastructure
4 Supportive networks
5 External review processes
6 Supporting use
7 Building and sharing knowledge about what works in local contexts


Are these relevant for South Africa?

Are there others?

Which are most important to enact – and how?

Patricia.Rogers@rmit.edu.au

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

Page 24: Strategies for improving Monitoring and Evaluation

Avoiding the Big 5

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 25: Strategies for improving Monitoring and Evaluation

Avoiding the Big 5

LIMITED VIEW

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 26: Strategies for improving Monitoring and Evaluation

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluation: Evaluation is often seen as a final activity
Mainstreaming Social Inclusion, http://www.europemsi.org/index.php

Conceptual model for evaluation: Evaluation is often seen as a final activity
But this can lead to:
• Leaving it all to the end (no baselines)
• Not being evaluative early on

Conceptual model for evaluation: Different types of evaluation at different stages of the program/policy cycle
[Figure: cycle diagram with stages – Needs analysis; Program or policy design; Implementation of activities and ongoing management; Continuous improvement; Outcomes evaluation & performance monitoring]
Based on Funnell 2006, Designing an evaluation

Conceptual model for evaluation: The effect of M & E
[Figure: the underlying programme logic of the South African Public Service Commission M & E system (Public Service Commission 2003). Elements: public service monitoring; reporting; follow-up; problem areas identified; problems are addressed; good practice by others is identified and promoted; learning from good practice examples takes place; achievements are affirmed and promoted; priority areas in public administration are communicated; departments reflect on their own performance; departments focus on priority areas. Overall result: better governance and service delivery in South Africa.]

Simple model of building evaluation capacity
[Figure: flow diagram – Various activities → Build skills and knowledge in M & E → Application of new capacity → Improved programs → Better outcomes for the public]

Conceptual model for evaluation: Iterative model of building evaluation capacity
[Figure: Various activities → Identify existing capacity and build new capacity (types of capital: human, economic, social, organisational) → Opportunities to deploy the capacity → Development of systems to apply evaluation capacity to undertake, oversee and use discrete evaluations, ongoing evaluative activity and monitoring → Improved programs (through improved implementation, better resource allocation or improved selection of programs) → Better outcomes for the public]
Rogers 2002

2 Training and professional development

WHO is to receive training?

HOW will training be undertaken?

WHAT will training cover?

WHO will control content, certification and accreditation?

Training and professional development - WHO

WHO is to receive training?
– Those formally named as evaluators
– Those with formal responsibility for doing evaluation
– Those who will commission or require evaluation
– Those who will use evaluation (e.g. program managers, policy makers)
– Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken?
– Timing – before working in evaluation, or as ongoing professional development
– Duration – a few days, a few weeks, a few years
– Intensity – concentrated, weekly, annually, "sandwich"
– Method – face to face, distance (email, webinars, teleconference, videoconference), self-paced
– Level – short course, certificate, graduate program (Master's, Graduate Diploma, PhD)
– Customisation – generic, sector-specific, organisation-specific

Training and professional development - WHAT

WHAT will training cover?
– An integrated package – or a specific topic
– Methods for identifying the type of M & E required and Key Evaluation Questions
– Evaluation designs
  • Specific types or a range
– Methods of data collection
  • Specific types or a range – especially mixed qual and quant
– Methods of data analysis
  • Specific types or a range
  • Focus on critical thinking
– Approaches to reporting and supporting use
– Managing evaluation – including participatory processes
– Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice
1.1 Applies professional evaluation standards
1.2 Acts ethically and strives for integrity and honesty in conducting evaluations
1.3 Conveys personal evaluation approaches and skills to potential clients
1.4 Respects clients, respondents, program participants and other stakeholders
1.5 Considers the general and public welfare in evaluation practice
1.6 Contributes to the knowledge base of evaluation

2 Systematic Inquiry
2.1 Understands the knowledge base of evaluation (terms, concepts, theories, assumptions)
2.2 Knowledgeable about quantitative methods
2.3 Knowledgeable about qualitative methods
2.4 Knowledgeable about mixed methods
2.5 Conducts literature reviews
2.6 Specifies program theory
2.7 Frames evaluation questions
2.8 Develops evaluation design
2.9 Identifies data sources
2.10 Collects data
2.11 Assesses validity of data
2.12 Assesses reliability of data
2.13 Analyzes data
2.14 Interprets data
2.15 Makes judgments
2.16 Develops recommendations
2.17 Provides rationales for decisions throughout the evaluation
2.18 Reports evaluation procedures and results
2.19 Notes strengths and limitations of the evaluation
2.20 Conducts meta-evaluations

3 Situational Analysis
3.1 Describes the program
3.2 Determines program evaluability
3.3 Identifies the interests of relevant stakeholders
3.4 Serves the information needs of intended users
3.5 Addresses conflicts
3.6 Examines the organizational context of the evaluation
3.7 Analyzes the political considerations relevant to the evaluation
3.8 Attends to issues of evaluation use
3.9 Attends to issues of organizational change
3.10 Respects the uniqueness of the evaluation site and client
3.11 Remains open to input from others
3.12 Modifies the study as needed

4 Project Management
4.1 Responds to requests for proposals
4.2 Negotiates with clients before the evaluation begins
4.3 Writes formal agreements
4.4 Communicates with clients throughout the evaluation process
4.5 Budgets an evaluation
4.6 Justifies cost given information needs
4.7 Identifies needed resources for evaluation, such as information, expertise, personnel, instruments
4.8 Uses appropriate technology
4.9 Supervises others involved in conducting the evaluation
4.10 Trains others involved in conducting the evaluation
4.11 Conducts the evaluation in a nondisruptive manner
4.12 Presents work in a timely manner

5 Reflective Practice
5.1 Aware of self as an evaluator (knowledge, skills, dispositions)
5.2 Reflects on personal evaluation practice (competencies and areas for growth)
5.3 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development – Short course examples

• University of Zambia M & E course
• IPDET (International Program for Development Evaluation Training), Independent Evaluation Group of the World Bank and Carleton University, http://www.ipdet.org
• CDC (Centers for Disease Control) Summer Institute, USA, http://www.eval.org/SummerInstitute06/SIhome.asp
• The Evaluators Institute, San Francisco, Chicago, Washington DC, USA, www.evaluatorsinstitute.com
• CDRA (Community Development Resource Association) Developmental Planning, Monitoring, Evaluation and Reporting, Cape Town, South Africa, www.cdra.org.za
• Pre-conference workshops:
  AfrEA – African Evaluation Association, www.afrea.org
  AEA – American Evaluation Association, www.eval.org
  SAMEA – South African Monitoring and Evaluation Association, www.samea.org.za
  AES – Australasian Evaluation Society, www.aes.asn.au
  EES – European Evaluation Society, www.europeanevaluation.org
  CES – Canadian Evaluation Society, www.evaluationcanada.ca
  UKES – United Kingdom Evaluation Society, www.evaluation.org.uk

Training and professional development – Graduate programs

• Centre for Research on Science and Technology, the University of Stellenbosch, Cape Town, South Africa: Postgraduate Diploma in Monitoring and Evaluation Methods. One-year course delivered in intensive mode of face-to-face courses interspersed with self-study.
• School of Health Systems and Public Health (SHSPH), the University of Pretoria, South Africa, in collaboration with the MEASURE Evaluation Project: M&E concentration in their Master of Public Health degree program. Courses taught in modules of one to three weeks, six-month internship and individual research.
• Graduate School of Public & Development Management (P&DM), the University of the Witwatersrand, Johannesburg: electives on monitoring and evaluation as part of their Masters degree programmes in Public and Development Management as well as in Public Policy.
• Centre for Program Evaluation, University of Melbourne, Australia: Masters of Assessment and Evaluation. Available by distance education, www.unimelb.edu.au/cpe
• CIRCLE, Royal Melbourne Institute of Technology, Australia: Masters and PhD by research.
• Western Michigan University, USA: interdisciplinary PhD, residential coursework program.

Training and professional development – On-line material

• Self-paced courses
• Manuals
• Guidelines

Training and professional development – Key Questions

• Who controls the curriculum, accreditation of courses and certification of evaluators?
• What are the consequences of this control?

3 Organisational infrastructure

• Manuals
• Evaluation frameworks
• Guidelines
• Principles
• Standards
• Checklists
• Processes for commissioning and prioritising evaluation, including contracts
• Data resources – databases, collection hardware and software, analysis hardware and software, standardised data collection tools and measures
• Evaluation journals and books
• Support for an evaluation culture

Options for organisational infrastructure

• Importing existing infrastructure
• Adapting existing infrastructure
• Developing locally specific infrastructure

Some existing infrastructure

• Guidelines, e.g. Australasian Evaluation Society Ethical Guidelines, www.aes.asn.au

6. Practise within competence
The evaluator or evaluation team should possess the knowledge, abilities, skills and experience appropriate to undertake the tasks proposed in the evaluation. Evaluators should fairly represent their competence and should not practice beyond it.

21. Fully reflect evaluator's findings
The final report(s) of the evaluation should reflect fully the findings and conclusions determined by the evaluator, and these should not be amended without the evaluator's consent.

Some existing infrastructure

• Checklist, e.g. Patton's Qualitative Evaluation Checklist, http://www.wmich.edu/evalctr/checklists

1. Determine the extent to which qualitative methods are appropriate given the evaluation's purposes and intended uses.
  • Be prepared to explain the variations, strengths and weaknesses of qualitative evaluations
  • Determine the criteria by which the quality of the evaluation will be judged
  • Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluation's purpose, users and audiences
  • Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

• Informal networks
• Evaluation societies and associations
• Learning circles
• Mentoring

5 External review processes

• WHAT
– Priorities for evaluation
– Guidelines, manuals
– Plans for individual evaluations
– Specifications for indicators
– Data collection
– Data analysis
– Reports

5 External review processes

• WHEN
– Before next stage of evaluation (review for improvement)
– Before acceptance of evaluation report
– At end of an episode of evaluation – identify and document lessons learned about evaluation
– As part of ongoing quality assurance

5 External review processes

• WHO
– Peer review – reciprocal review of each other's work
– External expert

6 Supporting use

• Register of evaluation reports – summary of methods used, findings, availability of report (see the sketch below)
• Publishing evaluation reports – library and web access
• Tracking and reporting on implementation of recommendations
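A minimal sketch of what such a register could look like in practice (not part of the original presentation): the record structure, field names and example values below are illustrative assumptions only, written in Python; a real register would use whatever fields and tools the commissioning organisation already has.

from dataclasses import dataclass, field
from typing import List

@dataclass
class EvaluationRecord:
    """One entry in a register of evaluation reports (illustrative fields only)."""
    title: str
    programme: str
    year: int
    methods: List[str]                     # e.g. ["document review", "survey", "interviews"]
    key_findings: str                      # short summary of findings
    report_available_at: str               # library reference or URL
    recommendations: List[str] = field(default_factory=list)
    recommendations_implemented: int = 0   # supports tracking of follow-up

    def implementation_rate(self) -> float:
        """Share of recommendations reported as implemented (0.0 if none were made)."""
        if not self.recommendations:
            return 0.0
        return self.recommendations_implemented / len(self.recommendations)

# Example usage: one record in the register
register: List[EvaluationRecord] = [
    EvaluationRecord(
        title="Mid-term review of a service-delivery programme",
        programme="Example programme",
        year=2006,
        methods=["document review", "site visits", "interviews"],
        key_findings="Implementation on track; data-quality gaps in two provinces.",
        report_available_at="Departmental library, ref EV-2006-014",
        recommendations=["Strengthen provincial data checks",
                         "Publish quarterly summaries"],
        recommendations_implemented=1,
    )
]
print(f"{register[0].title}: {register[0].implementation_rate():.0%} of recommendations implemented")

Even a flat spreadsheet with these columns would serve the same purpose; the point is that methods, findings, availability and follow-up on recommendations are recorded in one place.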

7 Building knowledge about what works in evaluation in particular contexts

• Research into evaluation – empirically documenting what is done and how it goes
• Publishing accounts and lessons learned
  – Books
  – Journals
  – Web sites
  – Locally and internationally

Example: Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations:
1. Aroha ki te tangata (respect for people)
2. Kanohi kitea (the seen face; a requirement to present yourself 'face to face')
3. Titiro, whakarongo… korero (look, listen… then speak)
4. Manaaki ki te tangata (share and host people, be generous)
5. Kia tupato (be cautious)
6. Kaua e takahia te mana o te tangata (do not trample on the mana of people)
7. Kaua e mahaki (do not flaunt your knowledge)

Smith, G.H. (1997) The Development of Kaupapa Maori theory and praxis. University of Auckland, Auckland.

7 strategies to avoid the big 5 mistakes (and others)

The big 5: LIMITED VIEW – UNFOCUSED – UNREALISTIC – GAPS IN DATA – WHAT TO DO WITH IT

1 Ways of thinking about M & E
2 Training and professional development
3 Organisational infrastructure
4 Supportive networks
5 External review processes
6 Supporting use
7 Building and sharing knowledge about what works in local contexts


Are these relevant for South Africa?

Are there others?

Which are most important to enact – and how?

Patricia.Rogers@rmit.edu.au

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 27: Strategies for improving Monitoring and Evaluation

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 28: Strategies for improving Monitoring and Evaluation

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 29: Strategies for improving Monitoring and Evaluation

Avoiding the Big 5

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

Seven strategies

1 Better ways to think about M amp E2 Training and professional development3 Organisational infrastructure4 Supportive networks 5 External review processes6 Strategies for supporting use7 Building knowledge about what works

in evaluation in particular contexts

1 Better ways to think about M amp E

Useful definitions

Models of what evaluation is and how it relates to policy and practice

1 Better ways to think about M amp E

Useful definitions

bull Not just measuring whether objectives have been met

bull Articulating negotiatingndash What do we valuendash How is it going

1 Better ways to think about M amp E

Models of what evaluation is and how it relates to policy and practice

1 Different types of evaluation at different stages of the programpolicy cycle ndash rather than a final activity

2 The effect of M amp E

3 Iteratively building evaluation capacity

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1. Professional Practice
1.1 Applies professional evaluation standards
1.2 Acts ethically and strives for integrity and honesty in conducting evaluations
1.3 Conveys personal evaluation approaches and skills to potential clients
1.4 Respects clients, respondents, program participants and other stakeholders
1.5 Considers the general and public welfare in evaluation practice
1.6 Contributes to the knowledge base of evaluation

2. Systematic Inquiry
2.1 Understands the knowledge base of evaluation (terms, concepts, theories, assumptions)
2.2 Knowledgeable about quantitative methods
2.3 Knowledgeable about qualitative methods
2.4 Knowledgeable about mixed methods
2.5 Conducts literature reviews
2.6 Specifies program theory
2.7 Frames evaluation questions
2.8 Develops evaluation design
2.9 Identifies data sources
2.10 Collects data
2.11 Assesses validity of data
2.12 Assesses reliability of data
2.13 Analyzes data
2.14 Interprets data
2.15 Makes judgments
2.16 Develops recommendations
2.17 Provides rationales for decisions throughout the evaluation
2.18 Reports evaluation procedures and results
2.19 Notes strengths and limitations of the evaluation
2.20 Conducts meta-evaluations

3. Situational Analysis
3.1 Describes the program
3.2 Determines program evaluability
3.3 Identifies the interests of relevant stakeholders
3.4 Serves the information needs of intended users
3.5 Addresses conflicts
3.6 Examines the organizational context of the evaluation
3.7 Analyzes the political considerations relevant to the evaluation
3.8 Attends to issues of evaluation use
3.9 Attends to issues of organizational change
3.10 Respects the uniqueness of the evaluation site and client
3.11 Remains open to input from others
3.12 Modifies the study as needed

4. Project Management
4.1 Responds to requests for proposals
4.2 Negotiates with clients before the evaluation begins
4.3 Writes formal agreements
4.4 Communicates with clients throughout the evaluation process
4.5 Budgets an evaluation
4.6 Justifies cost given information needs
4.7 Identifies needed resources for evaluation, such as information, expertise, personnel, instruments
4.8 Uses appropriate technology
4.9 Supervises others involved in conducting the evaluation
4.10 Trains others involved in conducting the evaluation
4.11 Conducts the evaluation in a nondisruptive manner
4.12 Presents work in a timely manner

5. Reflective Practice
5.1 Aware of self as an evaluator (knowledge, skills, dispositions)
5.2 Reflects on personal evaluation practice (competencies and areas for growth)
5.3 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development – Short course examples

• University of Zambia M & E course
• IPDET (International Program for Development Evaluation Training), Independent Evaluation Group of the World Bank and Carleton University, http://www.ipdet.org
• CDC (Centers for Disease Control) Summer Institute, USA, http://www.eval.org/SummerInstitute06/SIhome.asp
• The Evaluators' Institute, San Francisco, Chicago, Washington DC, USA, www.evaluatorsinstitute.com
• CDRA (Community Development Resource Association), Developmental Planning, Monitoring, Evaluation and Reporting, Cape Town, South Africa, www.cdra.org.za
• Pre-conference workshops:
  AfrEA – African Evaluation Association, www.afrea.org
  AEA – American Evaluation Association, www.eval.org
  SAMEA – South African Monitoring and Evaluation Association, www.samea.org.za
  AES – Australasian Evaluation Society, www.aes.asn.au
  EES – European Evaluation Society, www.europeanevaluation.org
  CES – Canadian Evaluation Society, www.evaluationcanada.ca
  UKES – United Kingdom Evaluation Society, www.evaluation.org.uk

Training and professional development – Graduate programs

• Centre for Research on Science and Technology, The University of Stellenbosch, Cape Town, South Africa: Postgraduate Diploma in Monitoring and Evaluation Methods. One-year course delivered in intensive mode, with face-to-face courses interspersed with self-study.
• School of Health Systems and Public Health (SHSPH), the University of Pretoria, South Africa, in collaboration with the MEASURE Evaluation Project: M&E concentration in their Master of Public Health degree program. Courses taught in modules of one to three weeks, with a six-month internship and individual research.
• Graduate School of Public & Development Management (P&DM), the University of the Witwatersrand, Johannesburg – electives on monitoring and evaluation as part of their Masters degree programmes in Public and Development Management as well as in Public Policy.
• Centre for Program Evaluation, University of Melbourne, Australia: Masters of Assessment and Evaluation, available by distance education. www.unimelb.edu.au/cpe
• CIRCLE, Royal Melbourne Institute of Technology, Australia: Masters and PhD by research.
• Western Michigan University, USA: interdisciplinary PhD, residential coursework program.

Training and professional development – On-line material

• Self-paced courses
• Manuals
• Guidelines

Training and professional development – Key Questions

• Who controls the curriculum, accreditation of courses and certification of evaluators?
• What are the consequences of this control?

3 Organisational infrastructure

• Manuals
• Evaluation frameworks
• Guidelines
• Principles
• Standards
• Checklists
• Processes for commissioning and prioritising evaluation, including contracts
• Data resources – databases, collection hardware and software, analysis hardware and software, standardised data collection tools and measures
• Evaluation journals and books
• Support for an evaluation culture

Options for organisational infrastructure

• Importing existing infrastructure
• Adapting existing infrastructure
• Developing locally specific infrastructure

Some existing infrastructure

• Guidelines, e.g. Australasian Evaluation Society Ethical Guidelines, www.aes.asn.au

6. Practise within competence
The evaluator or evaluation team should possess the knowledge, abilities, skills and experience appropriate to undertake the tasks proposed in the evaluation. Evaluators should fairly represent their competence, and should not practice beyond it.

21. Fully reflect evaluator's findings
The final report(s) of the evaluation should reflect fully the findings and conclusions determined by the evaluator, and these should not be amended without the evaluator's consent.

Some existing infrastructure

• Checklist, e.g. Patton's Qualitative Evaluation Checklist, http://www.wmich.edu/evalctr/checklists

1. Determine the extent to which qualitative methods are appropriate given the evaluation's purposes and intended uses.
  • Be prepared to explain the variations, strengths and weaknesses of qualitative evaluations
  • Determine the criteria by which the quality of the evaluation will be judged
  • Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluation's purpose, users and audiences
  • Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

• Informal networks
• Evaluation societies and associations
• Learning circles
• Mentoring

5 External review processes

• WHAT
  – Priorities for evaluation
  – Guidelines, manuals
  – Plans for individual evaluations
  – Specifications for indicators
  – Data collection
  – Data analysis
  – Reports

5 External review processes

• WHEN
  – Before next stage of evaluation (review for improvement)
  – Before acceptance of evaluation report
  – At end of an episode of evaluation – identify and document lessons learned about evaluation
  – As part of ongoing quality assurance

5 External review processes

• WHO
  – Peer review – reciprocal review of each other's work
  – External expert

6 Supporting use

• Register of evaluation reports – summary of methods used, findings, availability of report
• Publishing evaluation reports – library and web access
• Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

• Research into evaluation – empirically documenting what is done and how it goes
• Publishing accounts and lessons learned
  – Books
  – Journals
  – Web sites
  – Locally and internationally

Example: Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations:
1. Aroha ki te tangata (respect for people)
2. Kanohi kitea (the seen face; a requirement to present yourself 'face to face')
3. Titiro, whakarongo… korero (look, listen… then speak)
4. Manaaki ki te tangata (share and host people, be generous)
5. Kia tupato (be cautious)
6. Kaua e takahia te mana o te tangata (do not trample on the mana of people)
7. Kaua e mahaki (do not flaunt your knowledge)

Smith, G.H. (1997) The Development of Kaupapa Maori: Theory and Praxis. University of Auckland, Auckland.

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M & E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa?

Are there others?

Which are most important to enact – and how?

Patricia.Rogers@rmit.edu.au

CIRCLE at RMIT University

Collaborative Institute for Research, Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 33: Strategies for improving Monitoring and Evaluation

1. Better ways to think about M & E

Models of what evaluation is and how it relates to policy and practice:
1. Different types of evaluation at different stages of the program/policy cycle – rather than a final activity
2. The effect of M & E
3. Iteratively building evaluation capacity

Common understandings of M & E
• Including definitions and models in major documents, not just training manuals
• Having these familiar to managers, staff and communities, not just to evaluators
• Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluation: Evaluation is often seen as a final activity
Mainstreaming Social Inclusion: http://www.europemsi.org/index.php

Conceptual model for evaluation: Evaluation is often seen as a final activity
But this can lead to:
• Leaving it all to the end (no baselines)
• Not being evaluative early on

Conceptual model for evaluation: Different types of evaluation at different stages of the program/policy cycle
Stages in the cycle:
• Needs analysis
• Program or policy design
• Implementation of activities and ongoing management
• Continuous improvement
• Outcomes evaluation & performance monitoring
Based on Funnell 2006, Designing an evaluation

Conceptual model for evaluation: The effect of M & E
The underlying programme logic of the South African Public Service Commission M & E system (Public Service Commission 2003):
• Public service monitoring
• REPORTING – problem areas identified; good practice by others is identified and promoted; priority areas in public administration are communicated; departments reflect on their own performance
• FOLLOW UP – problems are addressed; achievements are affirmed and promoted; departments focus on priority areas; learning from good practice examples takes place
• Overall result – better governance and service delivery in South Africa

Simple model of building evaluation capacity:
Various activities → Build skills and knowledge in M & E → Application of new capacity → Improved programs → Better outcomes for the public

Conceptual model for evaluation: Iterative model of building evaluation capacity (Rogers 2002)
• Various activities
• Identify existing capacity and build new capacity (types of capital: human, economic, social, organisational)
• Opportunities to deploy the capacity
• Development of systems to apply evaluation capacity – to undertake, oversee and use discrete evaluations, ongoing evaluative activity and monitoring
• Improved programs (through improved implementation, better resource allocation or improved selection of programs)
• Better outcomes for the public

2. Training and professional development
• WHO is to receive training?
• HOW will training be undertaken?
• WHAT will training cover?
• WHO will control content, certification and accreditation?

Training and professional development – WHO
WHO is to receive training?
– Those formally named as evaluators
– Those with formal responsibility for doing evaluation
– Those who will commission or require evaluation
– Those who will use evaluation (e.g. program managers, policy makers)
– Citizens and citizen advocates

Training and professional development – HOW
HOW will training be undertaken?
– Timing – before working in evaluation, or as ongoing professional development
– Duration – a few days, a few weeks, a few years
– Intensity – concentrated, weekly, annually, "sandwich"
– Method – face to face, distance (email, webinars, teleconference, videoconference), self-paced
– Level – short course, certificate, graduate program (Master's, Graduate Diploma, PhD)
– Customisation – generic, sector-specific, organisation-specific

Training and professional development – WHAT
WHAT will training cover?
– An integrated package – or a specific topic
– Methods for identifying the type of M & E required and Key Evaluation Questions
– Evaluation designs: specific types or a range
– Methods of data collection: specific types or a range – especially mixed qualitative and quantitative
– Methods of data analysis: specific types or a range, with a focus on critical thinking
– Approaches to reporting and supporting use
– Managing evaluation – including participatory processes
– Identifying and including existing skills and knowledge

Example of suggested evaluation competencies
1. Professional Practice
1.1 Applies professional evaluation standards
1.2 Acts ethically and strives for integrity and honesty in conducting evaluations
1.3 Conveys personal evaluation approaches and skills to potential clients
1.4 Respects clients, respondents, program participants and other stakeholders
1.5 Considers the general and public welfare in evaluation practice
1.6 Contributes to the knowledge base of evaluation
2. Systematic Inquiry
2.1 Understands the knowledge base of evaluation (terms, concepts, theories, assumptions)
2.2 Knowledgeable about quantitative methods
2.3 Knowledgeable about qualitative methods
2.4 Knowledgeable about mixed methods
2.5 Conducts literature reviews
2.6 Specifies program theory
2.7 Frames evaluation questions
2.8 Develops evaluation design
2.9 Identifies data sources
2.10 Collects data
2.11 Assesses validity of data
2.12 Assesses reliability of data
2.13 Analyzes data
2.14 Interprets data
2.15 Makes judgments
2.16 Develops recommendations
2.17 Provides rationales for decisions throughout the evaluation
2.18 Reports evaluation procedures and results
2.19 Notes strengths and limitations of the evaluation
2.20 Conducts meta-evaluations
3. Situational Analysis
3.1 Describes the program
3.2 Determines program evaluability
3.3 Identifies the interests of relevant stakeholders
3.4 Serves the information needs of intended users
3.5 Addresses conflicts
3.6 Examines the organizational context of the evaluation
3.7 Analyzes the political considerations relevant to the evaluation
3.8 Attends to issues of evaluation use
3.9 Attends to issues of organizational change
3.10 Respects the uniqueness of the evaluation site and client
3.11 Remains open to input from others
3.12 Modifies the study as needed
4. Project Management
4.1 Responds to requests for proposals
4.2 Negotiates with clients before the evaluation begins
4.3 Writes formal agreements
4.4 Communicates with clients throughout the evaluation process
4.5 Budgets an evaluation
4.6 Justifies cost given information needs
4.7 Identifies needed resources for evaluation, such as information, expertise, personnel, instruments
4.8 Uses appropriate technology
4.9 Supervises others involved in conducting the evaluation
4.10 Trains others involved in conducting the evaluation
4.11 Conducts the evaluation in a nondisruptive manner
4.12 Presents work in a timely manner
5. Reflective Practice
5.1 Aware of self as an evaluator (knowledge, skills, dispositions)
5.2 Reflects on personal evaluation practice (competencies and areas for growth)
5.3 Pursues professional development in evaluation
(Stevahn et al. 2006)

Training and professional development – Short course examples
• University of Zambia M & E course
• IPDET (International Program for Development Evaluation Training), Independent Evaluation Group of the World Bank and Carleton University, http://www.ipdet.org
• CDC (Centers for Disease Control) Summer Institute, USA, http://www.eval.org/SummerInstitute06/SIhome.asp
• The Evaluators' Institute, San Francisco, Chicago, Washington DC, USA, www.evaluatorsinstitute.com
• CDRA (Community Development Resource Association), Developmental Planning, Monitoring, Evaluation and Reporting, Cape Town, South Africa, www.cdra.org.za
• Pre-conference workshops:
  AfrEA – African Evaluation Association, www.afrea.org
  AEA – American Evaluation Association, www.eval.org
  SAMEA – South African Monitoring and Evaluation Association, www.samea.org.za
  AES – Australasian Evaluation Society, www.aes.asn.au
  EES – European Evaluation Society, www.europeanevaluation.org
  CES – Canadian Evaluation Society, www.evaluationcanada.ca
  UKES – United Kingdom Evaluation Society, www.evaluation.org.uk

Training and professional development – Graduate programs
• Centre for Research on Science and Technology, University of Stellenbosch, Cape Town, South Africa: Postgraduate Diploma in Monitoring and Evaluation Methods. A one-year course delivered in intensive mode – face-to-face courses interspersed with self-study.
• School of Health Systems and Public Health (SHSPH), University of Pretoria, South Africa, in collaboration with the MEASURE Evaluation Project: M&E concentration in their Master of Public Health degree program. Courses taught in modules of one to three weeks, plus a six-month internship and individual research.
• Graduate School of Public & Development Management (P&DM), University of the Witwatersrand, Johannesburg: electives on monitoring and evaluation as part of their Masters degree programmes in Public and Development Management as well as in Public Policy.
• Centre for Program Evaluation, University of Melbourne, Australia: Masters of Assessment and Evaluation, available by distance education. www.unimelb.edu.au/cpe
• CIRCLE, Royal Melbourne Institute of Technology, Australia: Masters and PhD by research.
• Western Michigan University, USA: interdisciplinary PhD, residential coursework program.

Training and professional development – On-line material
• Self-paced courses
• Manuals
• Guidelines

Training and professional development – Key Questions
• Who controls the curriculum, accreditation of courses and certification of evaluators?
• What are the consequences of this control?

3. Organisational infrastructure
• Manuals
• Evaluation frameworks
• Guidelines
• Principles
• Standards
• Checklists
• Processes for commissioning and prioritising evaluation, including contracts
• Data resources – databases, collection hardware and software, analysis hardware and software, standardised data collection tools and measures
• Evaluation journals and books
• Support for an evaluation culture

Options for organisational infrastructure
• Importing existing infrastructure
• Adapting existing infrastructure
• Developing locally specific infrastructure

Some existing infrastructure
• Guidelines, e.g. Australasian Evaluation Society Ethical Guidelines, www.aes.asn.au
6. Practise within competence: The evaluator or evaluation team should possess the knowledge, abilities, skills and experience appropriate to undertake the tasks proposed in the evaluation. Evaluators should fairly represent their competence and should not practise beyond it.
21. Fully reflect evaluator's findings: The final report(s) of the evaluation should reflect fully the findings and conclusions determined by the evaluator, and these should not be amended without the evaluator's consent.

Some existing infrastructure
• Checklist, e.g. Patton's Qualitative Evaluation Checklist, http://www.wmich.edu/evalctr/checklists
1. Determine the extent to which qualitative methods are appropriate given the evaluation's purposes and intended uses.
  – Be prepared to explain the variations, strengths and weaknesses of qualitative evaluations
  – Determine the criteria by which the quality of the evaluation will be judged
  – Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluation's purpose, users and audiences
  – Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4. Supportive networks
• Informal networks
• Evaluation societies and associations
• Learning circles
• Mentoring

5. External review processes
• WHAT
– Priorities for evaluation
– Guidelines, manuals
– Plans for individual evaluations
– Specifications for indicators (see the sketch below)
– Data collection
– Data analysis
– Reports
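As an illustration only (not in the original slides), the sketch below shows the kind of content a specification for a single indicator might contain when it is put forward for external review; the field names and the Python representation are assumptions made for this example, not a prescribed template.

```python
# Illustrative sketch only: one way to record a specification for a single indicator.
# All field names are assumptions for illustration, not an official M & E template.
indicator_specification = {
    "name": "Proportion of evaluation recommendations implemented within 12 months",
    "definition": "Recommendations marked 'done' divided by total recommendations accepted",
    "unit": "percent",
    "data_source": "Register of evaluation reports",
    "collection_method": "Annual review of recommendation tracking records",
    "frequency": "Annually",
    "baseline": {"year": 2006, "value": None},  # None until baseline data are collected
    "target": {"year": 2009, "value": 60},
    "responsible": "M & E unit",
    "known_limitations": "Depends on departments updating implementation status",
}

# A reviewer could check that the essential fields are present before sign-off.
required = ["name", "definition", "data_source", "frequency", "baseline", "target"]
missing = [f for f in required if indicator_specification.get(f) in (None, "")]
if missing:
    print("Missing fields:", missing)
else:
    print("Specification complete.")
```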

5. External review processes
• WHEN
– Before the next stage of evaluation (review for improvement)
– Before acceptance of the evaluation report
– At the end of an episode of evaluation – identify and document lessons learned about evaluation
– As part of ongoing quality assurance

5. External review processes
• WHO
– Peer review – reciprocal review of each other's work
– External expert

6. Supporting use
• Register of evaluation reports – summary of methods used, findings, availability of report (see the sketch below)
• Publishing evaluation reports – library and web access
• Tracking and reporting on implementation of recommendations
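As an illustration only (not in the original slides), here is a minimal sketch of a register of evaluation reports that also supports tracking the implementation of recommendations; the EvaluationReport and EvaluationRegister names and their fields are assumptions made for this example, not an existing system.

```python
# Illustrative sketch only: a minimal register of evaluation reports,
# with hypothetical field names (not a prescribed or official schema).
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class EvaluationReport:
    title: str
    programme: str
    year: int
    methods: List[str]        # e.g. ["document review", "key informant interviews"]
    key_findings: List[str]
    availability: str         # library reference or web address of the full report
    # recommendation text -> implementation status ("not started", "in progress", "done")
    recommendations: Dict[str, str] = field(default_factory=dict)

class EvaluationRegister:
    """Holds summaries of completed evaluations so they can be found and used later."""

    def __init__(self) -> None:
        self._reports: List[EvaluationReport] = []

    def add(self, report: EvaluationReport) -> None:
        self._reports.append(report)

    def outstanding_recommendations(self) -> List[str]:
        # Supports tracking and reporting on implementation of recommendations.
        return [
            f"{r.title} ({r.year}): {rec} [{status}]"
            for r in self._reports
            for rec, status in r.recommendations.items()
            if status != "done"
        ]

# Example use with made-up details
register = EvaluationRegister()
register.add(EvaluationReport(
    title="Mid-term review of a district health programme",
    programme="District health services",
    year=2006,
    methods=["facility records analysis", "staff interviews"],
    key_findings=["Coverage improved", "Routine data quality remains weak"],
    availability="http://example.org/reports/mid-term-review.pdf",
    recommendations={"Strengthen routine data quality checks": "in progress"},
))
print(register.outstanding_recommendations())
```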

7. Building knowledge about what works in evaluation in particular contexts
• Research into evaluation – empirically documenting what is done and how it goes
• Publishing accounts and lessons learned
– Books
– Journals
– Web sites
– Locally and internationally

Example: Kaupapa Maori evaluation (New Zealand)
Seven key ethical considerations:
1. Aroha ki te tangata (respect for people)
2. Kanohi kitea (the seen face; a requirement to present yourself 'face to face')
3. Titiro, whakarongo… korero (look, listen… then speak)
4. Manaaki ki te tangata (share and host people, be generous)
5. Kia tupato (be cautious)
6. Kaua e takahia te mana o te tangata (do not trample on the mana of people)
7. Kaua e mahaki (do not flaunt your knowledge)
Smith, G.H. (1997). The Development of Kaupapa Maori: Theory and Praxis. University of Auckland, Auckland.

7 strategies to avoid the big 5 mistakes (and others)
The Big 5: LIMITED VIEW – UNFOCUSED – UNREALISTIC – GAPS IN DATA – WHAT TO DO WITH IT
1. Ways of thinking about M & E
2. Training and professional development
3. Organisational infrastructure
4. Supportive networks
5. External review processes
6. Supporting use
7. Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa?
Are there others?
Which are most important to enact – and how?

Patricia.Rogers@rmit.edu.au
CIRCLE at RMIT University
Collaborative Institute for Research Consulting and Learning in Evaluation
Royal Melbourne Institute of Technology
Melbourne, AUSTRALIA

Page 34: Strategies for improving Monitoring and Evaluation

Common understandings of M amp E

bull Including definitions and models in major documents not just training manuals

bull Having these familiar to managers staff and communities not just to evaluators

bull Also recognising the value of different definitions and conceptualisations

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 35: Strategies for improving Monitoring and Evaluation

Conceptual model for evaluationEvaluation is often seen as a final activity

Mainstreaming Social Inclusion httpwwweuropemsiorgindexphp

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 36: Strategies for improving Monitoring and Evaluation

Conceptual model for evaluationEvaluation is often seen as a final activity

But this can lead to

Leaving it all to the end (no baselines)

Not being evaluative early on

Needs analysis

Program or policy design

Implementation of activities and ongoing management

Continuous improvement

Outcomes Evaluation amp performance monitoring

Conceptual model for evaluationDifferent types of evaluation at different stages of the

programpolicy cycle

Based on Funnell 2006 Designing an evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

• University of Zambia M & E course
• IPDET (International Program for Development Evaluation Training), Independent Evaluation Group of the World Bank and Carleton University, http://www.ipdet.org
• CDC (Centers for Disease Control) Summer Institute, USA, http://www.eval.org/SummerInstitute06/SIhome.asp
• The Evaluators' Institute, San Francisco, Chicago, Washington DC, USA, www.evaluatorsinstitute.com
• CDRA (Community Development Resource Association), Developmental Planning, Monitoring, Evaluation and Reporting, Cape Town, South Africa, www.cdra.org.za
• Pre-conference workshops:
  AfrEA – African Evaluation Association, www.afrea.org
  AEA – American Evaluation Association, www.eval.org
  SAMEA – South African Monitoring and Evaluation Association, www.samea.org.za
  AES – Australasian Evaluation Society, www.aes.asn.au
  EES – European Evaluation Society, www.europeanevaluation.org
  CES – Canadian Evaluation Society, www.evaluationcanada.ca
  UKES – United Kingdom Evaluation Society, www.evaluation.org.uk

Training and professional development – Graduate programs

• Centre for Research on Science and Technology, the University of Stellenbosch, Cape Town, South Africa: Postgraduate Diploma in Monitoring and Evaluation Methods. One-year course delivered in intensive mode of face-to-face courses interspersed with self-study.
• School of Health Systems and Public Health (SHSPH), the University of Pretoria, South Africa, in collaboration with the MEASURE Evaluation Project: M&E concentration in their Master of Public Health degree program. Courses taught in modules of one to three weeks, six-month internship and individual research.
• Graduate School of Public & Development Management (P&DM), the University of the Witwatersrand, Johannesburg: electives on monitoring and evaluation as part of their Master's degree programmes in Public and Development Management as well as in Public Policy.
• Centre for Program Evaluation, University of Melbourne, Australia: Master of Assessment and Evaluation, available by distance education. www.unimelb.edu.au/cpe
• CIRCLE, Royal Melbourne Institute of Technology, Australia: Masters and PhD by research.
• Western Michigan University, USA: interdisciplinary PhD, residential coursework program.

Training and professional development – On-line material

• Self-paced courses
• Manuals
• Guidelines

Training and professional development – Key Questions

• Who controls the curriculum, accreditation of courses, and certification of evaluators?
• What are the consequences of this control?

3 Organisational infrastructure

• Manuals
• Evaluation frameworks
• Guidelines
• Principles
• Standards
• Checklists
• Processes for commissioning and prioritising evaluation, including contracts
• Data resources – databases, collection hardware and software, analysis hardware and software, standardised data collection tools and measures (an illustrative sketch follows this list)
• Evaluation journals and books
• Support for an evaluation culture
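
The "data resources" item above is where standardisation matters most. As a purely illustrative sketch, not drawn from the slides, a standardised indicator definition could be recorded in a small shared data structure such as the following (Python is used here only as an example; all field names and values are hypothetical).

```python
# Illustrative sketch only (not from the original slides): one way an organisation
# might standardise indicator definitions as part of its M & E data infrastructure.
# All field names and the example values are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class IndicatorSpec:
    code: str                  # short identifier used across departments
    name: str                  # what is being measured
    definition: str            # agreed wording, so data are comparable
    unit: str                  # e.g. "percent", "number of clinics"
    collection_method: str     # survey, administrative records, etc.
    frequency: str             # e.g. "quarterly", "annual"
    baseline: Optional[float] = None
    target: Optional[float] = None


# A hypothetical indicator record, as it might appear in a shared database.
attendance = IndicatorSpec(
    code="SVC-01",
    name="Service point attendance rate",
    definition="Clients served divided by clients scheduled, per quarter",
    unit="percent",
    collection_method="administrative records",
    frequency="quarterly",
    baseline=72.0,
    target=85.0,
)

print(f"{attendance.code}: {attendance.name} (target {attendance.target} {attendance.unit})")
```

A department could then check incoming monitoring data against these agreed definitions before any analysis or reporting is attempted.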

Options for organisational infrastructure

• Importing existing infrastructure
• Adapting existing infrastructure
• Developing locally specific infrastructure

Some existing infrastructure

• Guidelines, e.g. Australasian Evaluation Society Ethical Guidelines, www.aes.asn.au

6. Practise within competence
The evaluator or evaluation team should possess the knowledge, abilities, skills and experience appropriate to undertake the tasks proposed in the evaluation. Evaluators should fairly represent their competence and should not practise beyond it.

21. Fully reflect evaluator's findings
The final report(s) of the evaluation should reflect fully the findings and conclusions determined by the evaluator, and these should not be amended without the evaluator's consent.

Some existing infrastructure

• Checklist, e.g. Patton's Qualitative Evaluation Checklist, http://www.wmich.edu/evalctr/checklists

1. Determine the extent to which qualitative methods are appropriate given the evaluation's purposes and intended uses.
  • Be prepared to explain the variations, strengths, and weaknesses of qualitative evaluations
  • Determine the criteria by which the quality of the evaluation will be judged
  • Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluation's purpose, users and audiences
  • Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

• Informal networks
• Evaluation societies and associations
• Learning circles
• Mentoring

5 External review processes

• WHAT
  – Priorities for evaluation
  – Guidelines, manuals
  – Plans for individual evaluations
  – Specifications for indicators
  – Data collection
  – Data analysis
  – Reports

5 External review processes

• WHEN
  – Before next stage of evaluation (review for improvement)
  – Before acceptance of evaluation report
  – At end of an episode of evaluation – identify and document lessons learned about evaluation
  – As part of ongoing quality assurance

5 External review processes

• WHO
  – Peer review – reciprocal review of each other's work
  – External expert

6 Supporting use

• Register of evaluation reports: summary of methods used, findings, and availability of the report (an illustrative sketch follows this list)
• Publishing evaluation reports: library and web access
• Tracking and reporting on implementation of recommendations
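
As a purely illustrative sketch, not drawn from the slides, the register and recommendation-tracking ideas above could start life as a single shared table; the code below uses Python's built-in sqlite3 module, and every table, column and entry name is hypothetical.

```python
# Illustrative sketch only (not from the original slides): a minimal register of
# evaluation reports held in a single SQLite table. Table and column names are
# hypothetical; a real register would sit in a shared departmental system.
import sqlite3

conn = sqlite3.connect(":memory:")  # in practice, a file path such as "register.db"
conn.execute("""
    CREATE TABLE evaluation_register (
        id                      INTEGER PRIMARY KEY,
        programme               TEXT NOT NULL,
        year_completed          INTEGER,
        methods_summary         TEXT,   -- e.g. "document review plus site visits"
        findings_summary        TEXT,
        report_availability     TEXT,   -- library reference or web address
        recommendations_status  TEXT    -- supports tracking of implementation
    )
""")

# A hypothetical entry, recording what was done, what was found,
# where the report lives, and how follow-up is progressing.
conn.execute(
    """INSERT INTO evaluation_register
       (programme, year_completed, methods_summary, findings_summary,
        report_availability, recommendations_status)
       VALUES (?, ?, ?, ?, ?, ?)""",
    (
        "Example literacy programme",
        2006,
        "document review, learner assessments, educator interviews",
        "implementation on track; reporting gaps in rural sites",
        "departmental library and intranet",
        "2 of 5 recommendations implemented",
    ),
)

for programme, findings in conn.execute(
    "SELECT programme, findings_summary FROM evaluation_register"
):
    print(programme, "-", findings)
```

Published alongside library and web access to the full reports, even a register this small makes it easier to see what has been evaluated, how, and what happened to the recommendations.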

7 Building knowledge about what works in evaluation in particular contexts

• Research into evaluation – empirically documenting what is done and how it goes

• Publishing accounts and lessons learned

  – Books
  – Journals
  – Web sites
  – Locally and internationally

Example: Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations:
1. Aroha ki te tangata (respect for people)
2. Kanohi kitea (the seen face; a requirement to present yourself 'face to face')
3. Titiro, whakarongo… korero (look, listen… then speak)
4. Manaaki ki te tangata (share and host people, be generous)
5. Kia tupato (be cautious)
6. Kaua e takahia te mana o te tangata (do not trample on the mana of people)
7. Kaua e mahaki (do not flaunt your knowledge)

Smith, G.H. (1997). The Development of Kaupapa Maori theory and praxis. University of Auckland, Auckland.

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa?

Are there others?

Which are most important to enact – and how?

Patricia.Rogers@rmit.edu.au

CIRCLE at RMIT University

Collaborative Institute for Research, Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

Page 37: Strategies for improving Monitoring and Evaluation

Conceptual model for evaluation: Different types of evaluation at different stages of the program/policy cycle
[Diagram of the cycle: needs analysis → program or policy design → implementation of activities and ongoing management → outcomes, linked by evaluation & performance monitoring and continuous improvement]
Based on Funnell 2006, Designing an evaluation

Conceptual model for evaluation: The effect of M & E
[Programme logic diagram: the underlying programme logic of the South African Public Service Commission M & E system (Public Service Commission 2003). Elements: public service monitoring; reporting; follow-up; problem areas identified; good practice by others is identified and promoted; priority areas in public administration are communicated; departments reflect on their own performance; problems are addressed; achievements are affirmed and promoted; departments focus on priority areas; learning from good practice examples takes place. Overall result: better governance and service delivery in South Africa.]

Simple model of building evaluation capacity
[Diagram: various activities → build skills and knowledge in M & E → application of new capacity → improved programs → better outcomes for the public]

Conceptual model for evaluation: Iterative model of building evaluation capacity
[Diagram elements: various activities; identify existing capacity and build new capacity (types of capital: human, economic, social, organisational); opportunities to deploy the capacity; development of systems to apply evaluation capacity to undertake, oversee and use discrete evaluations, ongoing evaluative activity and monitoring; improved programs (through improved implementation, better resource allocation or improved selection of programs); better outcomes for the public]
Rogers 2002

2 Training and professional development

WHO is to receive training?
HOW will training be undertaken?
WHAT will training cover?
WHO will control content, certification and accreditation?

Training and professional development - WHO

WHO is to receive training?
– Those formally named as evaluators
– Those with formal responsibility for doing evaluation
– Those who will commission or require evaluation
– Those who will use evaluation (e.g. program managers, policy makers)
– Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken?
– Timing – before working in evaluation, or as ongoing professional development
– Duration – a few days, a few weeks, a few years
– Intensity – concentrated, weekly, annually, "sandwich"
– Method – face to face, distance (email, webinars, teleconference, videoconference), self-paced
– Level – short course, certificate, graduate program (Master's, Graduate Diploma, PhD)
– Customisation – generic, sector-specific, organisation-specific

Training and professional development - WHAT

WHAT will training cover?
– An integrated package – or a specific topic
– Methods for identifying the type of M & E required, and Key Evaluation Questions
– Evaluation designs
  • Specific types or a range
– Methods of data collection
  • Specific types or a range – especially mixed qual and quant
– Methods of data analysis
  • Specific types or a range
  • Focus on critical thinking
– Approaches to reporting and supporting use
– Managing evaluation – including participatory processes
– Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 38: Strategies for improving Monitoring and Evaluation

Conceptual model for evaluationThe effect of M amp E

Overall resultBetter governance and service delivery in South Africa

Problems areaddressed

Achievementsare affirmed

and promoted

Departmentsfocus on

priority areas

Learning fromgood practice

examples takesplace

Problem areasidentified

Good practiceby others is

identified andpromoted

Priority areasin public

administrationare

communicated

Departmentsreflect ontheir own

performance

Public service monitoring

FOLLOWUP

REPORTING

Public Service Commission 2003

The underlying programme logic of the South African Public Service Commission M amp E system

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 39: Strategies for improving Monitoring and Evaluation

Simple model of building evaluation capacity

Various activities

Build skills and knowledge in M amp E

Better outcomes for the public

Application of new capacity

Improved programs

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 40: Strategies for improving Monitoring and Evaluation

Conceptual model for evaluationIterative model of building evaluation capacity

Various activities

Identify existing capacity and build new capacity (types of capital)

Human Economic Social Organisational

Opportunities to deploy the

capacity

Better outcomes for the public

Development of systems to apply evaluation capacity to undertake oversee and use discrete evaluations ongoing evaluative activity and monitoring

Improved programs (through improved implementation better resource allocation or improved selection of programs)

Rogers 2002

2 Training and professional development

WHO is to receive training

HOW will training be undertaken

WHAT will training cover

WHO will control content certification and accreditation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1. Professional Practice
1.1 Applies professional evaluation standards
1.2 Acts ethically and strives for integrity and honesty in conducting evaluations
1.3 Conveys personal evaluation approaches and skills to potential clients
1.4 Respects clients, respondents, program participants and other stakeholders
1.5 Considers the general and public welfare in evaluation practice
1.6 Contributes to the knowledge base of evaluation

2. Systematic Inquiry
2.1 Understands the knowledge base of evaluation (terms, concepts, theories, assumptions)
2.2 Knowledgeable about quantitative methods
2.3 Knowledgeable about qualitative methods
2.4 Knowledgeable about mixed methods
2.5 Conducts literature reviews
2.6 Specifies program theory
2.7 Frames evaluation questions
2.8 Develops evaluation design
2.9 Identifies data sources
2.10 Collects data
2.11 Assesses validity of data
2.12 Assesses reliability of data
2.13 Analyzes data
2.14 Interprets data
2.15 Makes judgments
2.16 Develops recommendations
2.17 Provides rationales for decisions throughout the evaluation
2.18 Reports evaluation procedures and results
2.19 Notes strengths and limitations of the evaluation
2.20 Conducts meta-evaluations

3. Situational Analysis
3.1 Describes the program
3.2 Determines program evaluability
3.3 Identifies the interests of relevant stakeholders
3.4 Serves the information needs of intended users
3.5 Addresses conflicts
3.6 Examines the organizational context of the evaluation
3.7 Analyzes the political considerations relevant to the evaluation
3.8 Attends to issues of evaluation use
3.9 Attends to issues of organizational change
3.10 Respects the uniqueness of the evaluation site and client
3.11 Remains open to input from others
3.12 Modifies the study as needed

4. Project Management
4.1 Responds to requests for proposals
4.2 Negotiates with clients before the evaluation begins
4.3 Writes formal agreements
4.4 Communicates with clients throughout the evaluation process
4.5 Budgets an evaluation
4.6 Justifies cost given information needs
4.7 Identifies needed resources for evaluation, such as information, expertise, personnel, instruments
4.8 Uses appropriate technology
4.9 Supervises others involved in conducting the evaluation
4.10 Trains others involved in conducting the evaluation
4.11 Conducts the evaluation in a nondisruptive manner
4.12 Presents work in a timely manner

5. Reflective Practice
5.1 Aware of self as an evaluator (knowledge, skills, dispositions)
5.2 Reflects on personal evaluation practice (competencies and areas for growth)
5.3 Pursues professional development in evaluation

(Stevahn et al. 2006)

Training and professional development – Short course examples

• University of Zambia M & E course
• IPDET (International Program for Development Evaluation Training), Independent Evaluation Group of the World Bank and Carleton University, http://www.ipdet.org
• CDC (Centers for Disease Control) Summer Institute, USA, http://www.eval.org/SummerInstitute06/SIhome.asp
• The Evaluators' Institute, San Francisco, Chicago, Washington DC, USA, www.evaluatorsinstitute.com
• CDRA (Community Development Resource Association) Developmental Planning, Monitoring, Evaluation and Reporting, Cape Town, South Africa, www.cdra.org.za
• Pre-conference workshops:
  AfrEA – African Evaluation Association, www.afrea.org
  AEA – American Evaluation Association, www.eval.org
  SAMEA – South African Monitoring and Evaluation Association, www.samea.org.za
  AES – Australasian Evaluation Society, www.aes.asn.au
  EES – European Evaluation Society, www.europeanevaluation.org
  CES – Canadian Evaluation Society, www.evaluationcanada.ca
  UKES – United Kingdom Evaluation Society, www.evaluation.org.uk

Training and professional development – Graduate programs

• Centre for Research on Science and Technology, the University of Stellenbosch, Cape Town, South Africa: Postgraduate Diploma in Monitoring and Evaluation Methods. One-year course delivered in intensive mode – face to face courses interspersed with self-study.
• School of Health Systems and Public Health (SHSPH), the University of Pretoria, South Africa, in collaboration with the MEASURE Evaluation Project: M&E concentration in their Master of Public Health degree program. Courses taught in modules of one to three weeks, with a six-month internship and individual research.
• Graduate School of Public & Development Management (P&DM), the University of the Witwatersrand, Johannesburg: electives on monitoring and evaluation as part of their Masters degree programmes in Public and Development Management as well as in Public Policy.
• Centre for Program Evaluation, University of Melbourne, Australia: Masters of Assessment and Evaluation. Available by distance education. www.unimelb.edu.au/cpe
• CIRCLE, Royal Melbourne Institute of Technology, Australia: Masters and PhD by research.
• Western Michigan University, USA: interdisciplinary PhD, residential coursework program.

Training and professional development – On-line material

• Self-paced courses
• Manuals
• Guidelines

Training and professional development – Key Questions

• Who controls the curriculum, accreditation of courses and certification of evaluators?

• What are the consequences of this control?

3 Organisational infrastructure

• Manuals
• Evaluation frameworks
• Guidelines
• Principles
• Standards
• Checklists
• Processes for commissioning and prioritising evaluation, including contracts
• Data resources – databases, collection hardware and software, analysis hardware and software, standardised data collection tools and measures (see the sketch below)
• Evaluation journals and books
• Support for an evaluation culture
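
Purely as an illustration of the "standardised data collection tools and measures" point above – this sketch is not part of the original presentation, and every name and field in it is an assumption – one way a standardised indicator specification could be held as a shared data resource:

from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class IndicatorSpec:
    """A standardised definition of one monitoring indicator (all fields are illustrative assumptions)."""
    name: str
    definition: str            # agreed wording used across programmes
    unit: str                  # e.g. "% of households"
    collection_method: str     # e.g. "household survey"
    frequency: str             # e.g. "annual"
    data_source: str           # e.g. "routine information system"
    disaggregation: Tuple[str, ...] = ()   # e.g. ("province", "gender")

# Example of a shared, agreed definition that programmes can reuse
WATER_ACCESS = IndicatorSpec(
    name="Households with access to clean water",
    definition="Share of households whose main drinking-water source is an improved source",
    unit="% of households",
    collection_method="household survey",
    frequency="annual",
    data_source="national household survey",
    disaggregation=("province", "urban/rural"),
)

Keeping indicator definitions in one agreed structure like this is one way an organisation might keep measurement consistent across programmes and evaluations.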

Options for organisational infrastructure

• Importing existing infrastructure
• Adapting existing infrastructure
• Developing locally specific infrastructure

Some existing infrastructure

• Guidelines, e.g. Australasian Evaluation Society Ethical Guidelines, www.aes.asn.au

6. Practise within competence
The evaluator or evaluation team should possess the knowledge, abilities, skills and experience appropriate to undertake the tasks proposed in the evaluation. Evaluators should fairly represent their competence, and should not practise beyond it.

21. Fully reflect evaluator's findings
The final report(s) of the evaluation should reflect fully the findings and conclusions determined by the evaluator, and these should not be amended without the evaluator's consent.

Some existing infrastructure

• Checklist, e.g. Patton's Qualitative Evaluation Checklist, http://www.wmich.edu/evalctr/checklists

1. Determine the extent to which qualitative methods are appropriate given the evaluation's purposes and intended uses.
   • Be prepared to explain the variations, strengths and weaknesses of qualitative evaluations
   • Determine the criteria by which the quality of the evaluation will be judged
   • Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluation's purpose, users and audiences
   • Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

• Informal networks

• Evaluation societies and associations

• Learning circles

• Mentoring

5 External review processes

• WHAT
– Priorities for evaluation
– Guidelines, manuals
– Plans for individual evaluations
– Specifications for indicators
– Data collection
– Data analysis
– Reports

5 External review processes

• WHEN
– Before next stage of evaluation (review for improvement)
– Before acceptance of evaluation report
– At end of an episode of evaluation – identify and document lessons learned about evaluation
– As part of ongoing quality assurance

5 External review processes

• WHO
– Peer review – reciprocal review of each other's work
– External expert

6 Supporting use

• Register of evaluation reports
  Summary of methods used, findings, availability of report (see the sketch below)

• Publishing evaluation reports
  Library and web access

• Tracking and reporting on implementation of recommendations
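
As an illustrative sketch only – not from the original slides, with class and field names that are assumptions – one possible structure for such a register of evaluation reports:

from dataclasses import dataclass, field
from typing import List

@dataclass
class EvaluationReportEntry:
    """One entry in a register of evaluation reports (field names are illustrative assumptions)."""
    title: str
    programme: str
    year: int
    methods_used: List[str]     # e.g. ["household survey", "key informant interviews"]
    findings_summary: str       # brief summary of key findings
    availability: str           # e.g. "web", "library", "on request"

@dataclass
class ReportRegister:
    entries: List[EvaluationReportEntry] = field(default_factory=list)

    def add(self, entry: EvaluationReportEntry) -> None:
        self.entries.append(entry)

    def publicly_available(self) -> List[EvaluationReportEntry]:
        """Reports that can be published via library or web access."""
        return [e for e in self.entries if e.availability in ("web", "library")]

A register along these lines would make it straightforward to publish summaries, provide library and web access, and track which findings and recommendations have been followed up.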

7 Building knowledge about what works in evaluation in particular contexts

• Research into evaluation – empirically documenting what is done and how it goes

• Publishing accounts and lessons learned
– Books
– Journals
– Web sites
– Locally and internationally

Example: Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations:
1. Aroha ki te tangata (respect for people)
2. Kanohi kitea (the seen face; a requirement to present yourself 'face to face')
3. Titiro, whakarongo… korero (look, listen… then speak)
4. Manaaki ki te tangata (share and host people, be generous)
5. Kia tupato (be cautious)
6. Kaua e takahia te mana o te tangata (do not trample on the mana of people)
7. Kaua e mahaki (do not flaunt your knowledge)

Smith, G.H. (1997) The Development of Kaupapa Maori: Theory and Praxis. University of Auckland, Auckland.

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M & E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa?

Are there others?

Which are most important to enact – and how?

Patricia.Rogers@rmit.edu.au

CIRCLE at RMIT University

Collaborative Institute for Research, Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne, AUSTRALIA

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 42: Strategies for improving Monitoring and Evaluation

Training and professional development - WHO

WHO is to receive training ndash Those formally named as evaluatorsndash Those with formal responsibility for doing

evaluationndash Those who will commission or require

evaluationndash Those who will use evaluation (eg program

managers policy makers)ndash Citizens and citizen advocates

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 43: Strategies for improving Monitoring and Evaluation

Training and professional development - HOW

HOW will training be undertaken ndash Timing ndash before working in evaluation or as

ongoing professional developmentndash Duration ndash a few days a few weeks a few yearsndash Intensity ndash concentrated weekly annually

ldquosandwichrdquondash Method ndash face to face distance (email webinars

teleconference videoconference) selfpacedndash Level ndash short course certificate graduate program

(Masterrsquos Graduate Diploma PhD)ndash Customisation ndash generic sector-specific

organisation-specific

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 44: Strategies for improving Monitoring and Evaluation

Training and professional development - WHAT

WHAT will training cover ndash An integrated package ndash or a specific topicndash Methods for identifying the type of M amp E required and

Key Evaluation Questionsndash Evaluation designs

bull Specific types or a rangendash Methods of data collection

bull Specific types or a range- especially mixed qual and quantndash Methods of data analysis

bull Specific types or a rangebull Focus on critical thinking

ndash Approaches to reporting and supporting usendash Managing evaluation ndash including participatory processesndash Identifying and including existing skills and knowledge

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

• Guidelines, e.g. Australasian Evaluation Society Ethical Guidelines, www.aes.asn.au

6 Practise within competence: The evaluator or evaluation team should possess the knowledge, abilities, skills and experience appropriate to undertake the tasks proposed in the evaluation. Evaluators should fairly represent their competence and should not practise beyond it.

21 Fully reflect evaluator's findings: The final report(s) of the evaluation should reflect fully the findings and conclusions determined by the evaluator, and these should not be amended without the evaluator's consent.

Some existing infrastructure

• Checklist, e.g. Patton's Qualitative Evaluation Checklist, http://www.wmich.edu/evalctr/checklists

1 Determine the extent to which qualitative methods are appropriate given the evaluation's purposes and intended uses.
  – Be prepared to explain the variations, strengths and weaknesses of qualitative evaluations
  – Determine the criteria by which the quality of the evaluation will be judged
  – Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluation's purpose, users and audiences
  – Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

• Informal networks
• Evaluation societies and associations
• Learning circles
• Mentoring

5 External review processes

• WHAT
  – Priorities for evaluation
  – Guidelines, manuals
  – Plans for individual evaluations
  – Specifications for indicators
  – Data collection
  – Data analysis
  – Reports

5 External review processes

• WHEN
  – Before the next stage of evaluation (review for improvement)
  – Before acceptance of an evaluation report
  – At the end of an episode of evaluation – identify and document lessons learned about evaluation
  – As part of ongoing quality assurance

5 External review processes

• WHO
  – Peer review – reciprocal review of each other's work
  – External expert

6 Supporting use

• Register of evaluation reports: summary of methods used, findings, availability of the report (see the sketch after this list)
• Publishing evaluation reports: library and web access
• Tracking and reporting on implementation of recommendations
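
As a rough illustration of what a register of evaluation reports might capture, and how implementation of recommendations could be tracked against it, here is a minimal sketch. It is an assumption added for this written version – the record fields and helper functions are invented, not drawn from the presentation.

from dataclasses import dataclass, field
from typing import List

@dataclass
class EvaluationReportEntry:
    # Hypothetical record in a register of evaluation reports.
    title: str
    programme: str
    year: int
    methods_summary: str                  # brief description of the methods used
    key_findings: List[str]
    availability: str                     # e.g. library reference or URL for the full report
    recommendations: List[str] = field(default_factory=list)
    recommendations_implemented: int = 0  # updated as recommendations are acted on

register: List[EvaluationReportEntry] = []

def add_report(entry: EvaluationReportEntry) -> None:
    # Add a report so that later evaluations can find and build on it.
    register.append(entry)

def implementation_rate(entry: EvaluationReportEntry) -> float:
    # Share of recommendations reported as implemented (0.0 if none were made).
    if not entry.recommendations:
        return 0.0
    return entry.recommendations_implemented / len(entry.recommendations)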

7 Building knowledge about what works in evaluation in particular contexts

• Research into evaluation – empirically documenting what is done and how it goes

• Publishing accounts and lessons learned
  – Books
  – Journals
  – Web sites
  – Locally and internationally

Example: Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations:
1 Aroha ki te tangata (respect for people)
2 Kanohi kitea (the seen face; a requirement to present yourself 'face to face')
3 Titiro, whakarongo… korero (look, listen… then speak)
4 Manaaki ki te tangata (share and host people, be generous)
5 Kia tupato (be cautious)
6 Kaua e takahia te mana o te tangata (do not trample on the mana of people)
7 Kaua e mahaki (do not flaunt your knowledge)

Smith, G.H. (1997). The Development of Kaupapa Maori Theory and Praxis. University of Auckland, Auckland.

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M & E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa?

Are there others?

Which are most important to enact – and how?

Patricia.Rogers@rmit.edu.au

CIRCLE at RMIT University

Collaborative Institute for Research, Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne, AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 45: Strategies for improving Monitoring and Evaluation

Example of suggested evaluation competencies

1 Professional Practice11 Applies professional evaluation standards12 Acts ethically and strives for integrity and honesty in conducting

evaluations13 Conveys personal evaluation approaches and skills to potential clients14 Respects clients respondents program participants and other

stakeholders15 Considers the general and public welfare in evaluation practice16 Contributes to the knowledge base of evaluation2 Systematic Inquiry21 Understands the knowledge base of evaluation (terms concepts

theories assumptions)22 Knowledgeable about quantitative methods23 Knowledgeable about qualitative methods24 Knowledgeable about mixed methods25 Conducts literature reviews26 Specifies program theory27 Frames evaluation questions28 Develops evaluation design29 Identifies data sources210 Collects data211 Assesses validity of data212 Assesses reliability of data213 Analyzes data214 Interprets data215 Makes judgments216 Develops recommendations217 Provides rationales for decisions throughout the evaluation218 Reports evaluation procedures and results219 Notes strengths and limitations of the evaluation220 Conducts meta-evaluations

3 Situational Analysis31 Describes the program32 Determines program evaluability33 Identifies the interests of relevant stakeholders34 Serves the information needs of intended users35 Addresses conflicts36 Examines the organizational context of the evaluation37 Analyzes the political considerations relevant to the evaluation38 Attends to issues of evaluation use39 Attends to issues of organizational change310 Respects the uniqueness of the evaluation site and client311 Remains open to input from others312 Modifies the study as needed40 Project Management41 Responds to requests for proposals42 Negotiates with clients before the evaluation begins43 Writes formal agreements44 Communicates with clients throughout the evaluation process45 Budgets an evaluation 46 Justifies cost given information needs 47 Identifies needed resources for evaluation such as information

expertise personnel instruments48 Uses appropriate technology 49 Supervises others involved in conducting the evaluation 410 Trains others involved in conducting the evaluation 411 Conducts the evaluation in a nondisruptive manner 412 Presents work in a timely manner 50 Reflective Practice51 Aware of self as an evaluator (knowledge skills dispositions) 52 Reflects on personal evaluation practice (competencies and areas for

growth)53 Pursues professional development in evaluation

(Stevahn et al 2006)

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 46: Strategies for improving Monitoring and Evaluation

Training and professional development ndash Short course examples

bull University of Zambia M amp E coursebull IPDET (International Program for Development Evaluation Training)

Independent Evaluation Group of the World Bank and Carleton University httpwwwipdetorg

bull CDC (Centers for Disease Control) Summer Institute USA httpwwwevalorgSummerInstitute06SIhomeasp

bull The Evaluators InstituteSan Francisco Chicago Washington DC USA wwwevaluatorsinstitutecom

bull CDRA (Community Development Resource Association) Developmental Planning Monitoring Evaluation and Reporting Cape Town South Africa wwwcdraorgza

bull Pre-conference workshops AfrEA - African Evaluation Association wwwafreaorgAEA - American Evaluation Association wwwevalorgSAMEA - South African Monitoring and Evaluation Association

wwwsameaorgzaAES - Australasian Evaluation Society wwwaesasnauEES ndash European Evaluation Society wwweuropeanevaluationorgCES ndash Canadian Evaluation Society wwwevaluationcanadacaUKES ndash United Kingdom Evaluation Society wwwevaluationorguk

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 47: Strategies for improving Monitoring and Evaluation

Training and professional development ndash Graduate programs

bull Centre for Research on Science and Technology The University of Stellenbsoch Cape Town South Africa Postgraduate Diploma in Monitoring and Evaluation Methods One year couse

delivered in intensive mode of face to face courses interspersed with self-study bull School of Health Systems and Public Health (SHSPH) the University of

Pretoria South Africa in collaboration with the MEASURE Evaluation Project MampE concentration in their Master of Public Health degree program Courses

taught in modules of one to three weeks six-month internship and individual research

bull Graduate School of Public amp Development Management (PampDM) the University of the Witwatersrand Johannesburg ndash Electives on monitoring and evaluation as part of their Masters Degree

programmes in Public and Development Management as well as in Public Policybull Centre for Program Evaluation University of Melbourne Australia

Masters of Assessment and Evaluation Available by distance education wwwunimelbeduaucpe

bull CIRCLE Royal Melbourne Institute of Technology AustraliaMasters and PhD by research

bull University of Western Michigan USAInterdisciplinary PhD residential coursework program

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 48: Strategies for improving Monitoring and Evaluation

Training and professional development ndash On-line material

bull Self-paced coursesbull Manualsbull Guidelines

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 49: Strategies for improving Monitoring and Evaluation

Training and professional development ndash Key Questions

bull Who controls the curriculum accreditation of courses and certification of evaluators

bull What are the consequences of this control

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 50: Strategies for improving Monitoring and Evaluation

3 Organisational infrastructure

bull Manualsbull Evaluation frameworksbull Guidelines bull Principlesbull Standards bull Checklistsbull Processes for commissioning and prioritising

evaluation including contractsbull Data resources ndash databases collection hardware

and software analysis hardware and software standardised data collection tools and measures

bull Evaluation journals and booksbull Support for an evaluation culture

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

bull Checklist eg Pattonrsquos Qualitative Evaluation Checklisthttpwwwwmicheduevalctrchecklists

1 Determine the extent to which qualitative methods are appropriate given the evaluationrsquos purposes and intended uses

1048709 Be prepared to explain the variations strengths and weaknesses of qualitative evaluations

1048709 Determine the criteria by which the quality of the evaluation will be judged

1048709 Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluationrsquos purpose users and audiences

1048709 Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

bull Informal networks

bull Evaluation societies and associations

bull Learning circles

bull Mentoring

5 External review processes

bull WHATndash Priorities for evaluationndash Guidelines manualsndash Plans for individual evaluationsndash Specifications for indicatorsndash Data collection ndash Data analysisndash Reports

5 External review processes

bull WHENndash Before next stage of evaluation (review for

improvement)ndash Before acceptance of evaluation reportndash At end of an episode of evaluation ndash identify

and document lessons learned about evaluation

ndash As part of ongoing quality assurance

5 External review processes

bull WHOndash Peer review ndash reciprocal review of each

otherrsquos workndash External expert

6 Supporting use

bull Register of evaluation reportsSummary of methods used findings availability

of report

bull Publishing evaluation reportsLibrary and web access

bull Tracking and reporting on implementation of recommendations

7 Building knowledge about what works in evaluation in particular contexts

bull Research into evaluation ndash empirically documenting what is done and how it goes

bull Publishing accounts and lessons learned

ndash Booksndash Journalsndash Web sitesndash Locally and internationally

Example Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations 1 Aroha ki te tangata (respect for people)2 Kanohi kitea (the seen face a requirement to present

yourself lsquoface to facersquo)3 Titiro whakarongohellipkorero (look listenhellip then

speak)4 Manaaki ki te tangata (share and host people be

generous)5 Kia tupato (be cautious)6 Kaua e takahia te mana o te tangata (do not trample

on the mana of people)7 Kaua e mahaki (do not flaunt your knowledge)

Smith GH (1997) The Development of Kaupapa Maori theory and praxis University ofAuckland Auckland

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

7 strategies to avoid the big 5 mistakes (and others)

LIMITED VIEW

UNFOCUSED

UNREALISTIC

GAPS IN DATA

WHAT TO DO WITH IT

1 Ways of thinking about M amp E

2 Training and professional development

3 Organisational infrastructure

4 Supportive networks

5 External review processes

6 Supporting use

7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa

Are there others

Which are most important to enact ndash and how

PatriciaRogersrmiteduau

CIRCLE at RMIT University

Collaborative Institute for Research Consulting and Learning in Evaluation

Royal Melbourne Institute of Technology

Melbourne AUSTRALIA

  • Strategies for improving Monitoring and Evaluation
  • Slide 2
  • Slide 3
  • Overview of presentation
  • Questions for you
  • 1 Presenting a limited view
  • Slide 7
  • Slide 8
  • Slide 9
  • Slide 10
  • 2 Unfocused
  • Slide 12
  • Slide 13
  • Slide 14
  • Slide 15
  • 3 Unrealistic expectations
  • Slide 17
  • Slide 18
  • 4 Not enough good information
  • Slide 20
  • Slide 21
  • 5 Waiting till the end to work out what to do with what comes out
  • Slide 23
  • Slide 24
  • Avoiding the Big 5
  • Slide 26
  • Slide 27
  • Slide 28
  • Slide 29
  • Slide 30
  • Seven strategies
  • 1 Better ways to think about M amp E
  • Slide 33
  • Slide 34
  • Common understandings of M amp E
  • Slide 36
  • Slide 37
  • Slide 38
  • Slide 39
  • Simple model of building evaluation capacity
  • Conceptual model for evaluation Iterative model of building evaluation capacity
  • 2 Training and professional development
  • Training and professional development - WHO
  • Training and professional development - HOW
  • Training and professional development - WHAT
  • Example of suggested evaluation competencies
  • Training and professional development ndash Short course examples
  • Training and professional development ndash Graduate programs
  • Training and professional development ndash On-line material
  • Training and professional development ndash Key Questions
  • 3 Organisational infrastructure
  • Options for organisational infrastructure
  • Some existing infrastructure
  • Slide 54
  • 4 Supportive networks
  • 5 External review processes
  • Slide 57
  • Slide 58
  • 6 Supporting use
  • 7 Building knowledge about what works in evaluation in particular contexts
  • Example Kaupapa Maori evaluation (New Zealand)
  • 7 strategies to avoid the big 5 mistakes (and others)
  • Slide 63
  • Slide 64
Page 51: Strategies for improving Monitoring and Evaluation

Options for organisational infrastructure

bull Importing existing infrastructure bull Adapting existing infrastructurebull Developing locally specific infrastructure

Some existing infrastructure

bullGuidelines eg Australasian Evaluation Society Ethical Guidelineswwwaesasnau

6 Practise within competenceThe evaluator or evaluation team should possess the

knowledge abilities skills and experience appropriate to undertake the tasks proposed in the evaluation Evaluators should fairly represent their competence and should not practice beyond it

21 Fully reflect evaluatorrsquos findings The final report(s) of the evaluation should reflect fully the

findings and conclusions determined by the evaluator and these should not be amended without the evaluators consent

Some existing infrastructure

• Checklist, e.g. Patton's Qualitative Evaluation Checklist: http://www.wmich.edu/evalctr/checklists

1. Determine the extent to which qualitative methods are appropriate given the evaluation's purposes and intended uses.
– Be prepared to explain the variations, strengths and weaknesses of qualitative evaluations
– Determine the criteria by which the quality of the evaluation will be judged
– Determine the extent to which qualitative evaluation will be accepted or controversial given the evaluation's purpose, users and audiences
– Determine what foundation should be laid to assure that the findings of a qualitative evaluation will be credible

4 Supportive networks

• Informal networks
• Evaluation societies and associations
• Learning circles
• Mentoring

5 External review processes

• WHAT
– Priorities for evaluation
– Guidelines, manuals
– Plans for individual evaluations
– Specifications for indicators
– Data collection
– Data analysis
– Reports

5 External review processes

• WHEN
– Before next stage of evaluation (review for improvement)
– Before acceptance of evaluation report
– At end of an episode of evaluation – identify and document lessons learned about evaluation
– As part of ongoing quality assurance

5 External review processes

• WHO
– Peer review – reciprocal review of each other's work
– External expert

A minimal sketch of how one such review might be recorded follows.
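To make the what/when/who of external review concrete, the sketch below shows how a single review episode could be logged as a structured record. This is an illustrative Python sketch only; the class, field names and example values are assumptions, not part of the presentation.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ExternalReview:
    # WHAT is being reviewed, e.g. "indicator specifications" or "draft evaluation report"
    item: str
    # WHEN the review happens, e.g. "before acceptance of evaluation report"
    stage: str
    # WHO reviews it, e.g. "peer reviewer (reciprocal)" or "external expert"
    reviewer: str
    # Lessons learned about evaluation, documented at the end of the episode
    lessons_learned: List[str] = field(default_factory=list)

# Example: a reciprocal peer review of an individual evaluation plan
review = ExternalReview(
    item="plan for an individual evaluation",
    stage="before next stage of evaluation (review for improvement)",
    reviewer="peer reviewer (reciprocal review of each other's work)",
)
review.lessons_learned.append("clarify intended users before finalising indicators")

Recording each review this way also supports the ongoing quality-assurance use noted above, since the lessons learned accumulate alongside the review itself.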

6 Supporting use

• Register of evaluation reports: summary of methods used, findings, availability of report
• Publishing evaluation reports: library and web access
• Tracking and reporting on implementation of recommendations (a sketch of a simple register entry follows)
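The register of evaluation reports and the tracking of recommendations lend themselves to a very simple data structure. The sketch below is a hypothetical Python illustration; the field names and status labels are assumptions, not taken from the presentation.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Recommendation:
    text: str
    status: str = "not yet addressed"   # e.g. "accepted", "in progress", "implemented"

@dataclass
class RegisterEntry:
    # One entry in a register of evaluation reports
    title: str
    methods_summary: str                # summary of methods used
    key_findings: List[str]             # headline findings
    report_location: str                # library reference or web address
    recommendations: List[Recommendation] = field(default_factory=list)

def implementation_report(register: List[RegisterEntry]) -> str:
    # Track and report on implementation of recommendations, per evaluation
    lines = []
    for entry in register:
        done = sum(1 for r in entry.recommendations if r.status == "implemented")
        lines.append(f"{entry.title}: {done} of {len(entry.recommendations)} recommendations implemented")
    return "\n".join(lines)

Publishing the register (for example as a web page generated from such records) then serves both purposes at once: making reports findable and showing what has been done with their recommendations.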

7 Building knowledge about what works in evaluation in particular contexts

• Research into evaluation – empirically documenting what is done and how it goes

• Publishing accounts and lessons learned
– Books
– Journals
– Web sites
– Locally and internationally

Example: Kaupapa Maori evaluation (New Zealand)

Seven key ethical considerations:
1. Aroha ki te tangata (respect for people)
2. Kanohi kitea (the seen face; a requirement to present yourself 'face to face')
3. Titiro, whakarongo … korero (look, listen … then speak)
4. Manaaki ki te tangata (share and host people, be generous)
5. Kia tupato (be cautious)
6. Kaua e takahia te mana o te tangata (do not trample on the mana of people)
7. Kaua e mahaki (do not flaunt your knowledge)

Smith, G.H. (1997). The Development of Kaupapa Maori: Theory and Praxis. University of Auckland, Auckland.

7 strategies to avoid the big 5 mistakes (and others)

The Big 5 mistakes:
• Limited view
• Unfocused
• Unrealistic
• Gaps in data
• What to do with it

The seven strategies:
1 Ways of thinking about M & E
2 Training and professional development
3 Organisational infrastructure
4 Supportive networks
5 External review processes
6 Supporting use
7 Building and sharing knowledge about what works in local contexts

Are these relevant for South Africa?
Are there others?
Which are most important to enact – and how?

Patricia.Rogers@rmit.edu.au

CIRCLE at RMIT University
Collaborative Institute for Research, Consulting and Learning in Evaluation
Royal Melbourne Institute of Technology
Melbourne, AUSTRALIA
