KNOWLEDGE EXCHANGE FORUM CREATING A CULTURE OF EVALUATION FOR LEARNING AND IMPROVEMENT  

IN CHRONIC DISEASE PREVENTION AND HEALTH PROMOTION  

OTTAWA, MARCH 29-30, 2010

 

  

SUMMARY REPORT   

June 8, 2010 


TABLE OF CONTENTS

Executive Summary
A. Introduction
B. Concepts, Approaches and Techniques
   Examples of Evaluation for Learning Approaches
C. Creating a Culture Shift Towards Evaluation for Learning
   1. Systems Thinking
   2. Action Suggestions Using Levels of a System Framework
D. Closing Reflections
E. Forum Evaluation Feedback

Appendices:
1: KE Forum Participants
2: KE Forum Program
3: KE Forum Presentations


KNOWLEDGE EXCHANGE FORUM 2010: EXECUTIVE SUMMARY

The Public Health Agency of Canada's Interventions Research and Knowledge Exchange Unit of the Centre for Chronic Disease Prevention and Control, together with its partners, the CAPTURE Project and the Propel Centre for Population Health Impact, held a Knowledge Exchange Forum in Ottawa, March 29-30, 2010. The Forum was attended by 40 participants, including community-based practitioners, evaluators, policy makers, funders and researchers from across Canada working in the fields of planning and evaluation, knowledge development and exchange, public health, health promotion and community health. The Forum was designed to focus on developing a culture of evaluation for learning and improvement in chronic disease prevention and health promotion.

Concepts

Presentations and discussions during the first part of the session generated a range of ideas and approaches about evaluation for learning, some of which are highlighted below.

Re-Think Evaluation Questions: In designing an evaluation to lead to learning and improvement, it is crucial to get the evaluation issues right, especially in the design stage, and to clarify the purpose and use of the evaluation.

Evaluation for Learning vs. Evaluation for Accountability: Most organizations are interested in learning; the challenge is that they rarely associate learning with evaluation, which can be a negative experience for them. Discussion explored the idea of separating evaluation for accountability from evaluation for learning, as one way to encourage organizations to enter into evaluations with more openness and curiosity. In evaluations for learning, organizations are encouraged to share "failures" as a basis for future improvements, rather than focusing only on "successes".

Timing: The pace of decision-making and the pace of evaluation need to mesh, or the evaluation results will not effectively contribute to improvements in what the organization does. Many learning-based approaches are time intensive and may involve many questions and multiple evaluations as part of one initiative, which is challenging for organizations with limited capacity.

Identify end-users and their information needs: Understanding early in the evaluation process who will use the results of the evaluation and what they will use them for helps identify how users' information needs can best be met, so that evaluation results support their work to learn and improve, not to judge.

"It's all about relationships": Ensuring take-up of evaluation results has a great deal to do with the relationship developed between the evaluator and those who will decide on and implement the learning arising from the evaluation, which may include audiences with different contexts and perspectives (program deliverers, senior management, clients, etc.). These relationships are most effective when they are built early on, as this allows the evaluator to design the evaluation in a way that meets the needs of end-users from the outset and engages them at each stage.

Change in the role of the evaluator: Discussions challenged the traditional role of the evaluator as "objective" in a narrow sense, and encouraged evaluators to take on roles such as facilitator, relationship-builder, connector, and organizational consultant.

Embedding evaluation: Embedding roles related to evaluation throughout an organization brings evaluation closer to implementers and makes it more likely to lead to learning and improvement.


Helpful methodologies: Presenters and participants shared their experiences with developmental evaluation, the use of mock data to prepare organizations for potentially challenging results, participatory approaches, techniques to engage stakeholders in reviewing evaluation results, and the importance of using tools appropriate to the organization. It was emphasized that reports cannot speak for themselves and need to be supplemented by meaningful dialogue that engages stakeholders in moving from the report to implementation. Ultimately, reports are optional and only one method for this engagement.

Sharing the evaluation results: There are many ways to share results, and the specific methods chosen need to be tailored to the particular audience and their needs. Evaluation reports cannot speak for themselves; they need to be accompanied by knowledge sharing and an implementation strategy. Sharing evaluation results effectively includes the notion of negotiating findings and implementation options in a spirit of mutual exchange.

Creating a Culture Shift Towards Evaluation for Learning

The second part of the Forum addressed how to create a culture that nourishes evaluation for learning. The scale and scope of change being considered lends itself to a systems approach, in which we need to be conscious that many inter-related influences are at work and that there are existing initiatives and infrastructure that can support evaluation for learning. The Forum generated many suggested actions to advance a culture of evaluation for learning related to: evaluation process, funding of evaluations, communication of learning/results, capacity building, evaluation methodology, and collaboration/communities of practice.

Closing comments by participants indicated that: 1) there was enthusiasm for the potential of new ways of undertaking evaluation and enabling it to more effectively nourish decision-making; 2) work in this area is progressing on many fronts and is helping to shine a light on evaluation with a focus on learning and on ways to foster a supportive evaluation culture; and 3) there was strong interest in continuing the dialogue and exchange to support individual and collective follow-up action in the future.


A. Introduction

The Public Health Agency of Canada's Interventions Research and Knowledge Exchange Unit of the Centre for Chronic Disease Prevention and Control, together with its partners, the CAPTURE Project and the Propel Centre for Population Health Impact, held a Knowledge Exchange Forum in Ottawa, March 29-30, 2010. The Forum was attended by a range of community-based practitioners, evaluators, policy makers, funders and researchers from across Canada who work in the fields of planning and evaluation, knowledge development and exchange, and public and community health. A complete list of participants is included in Appendix 1 of this Report.

This event was the 2nd Annual Knowledge Exchange (KE) Forum and was designed to focus on developing a culture of evaluation for learning and improvement in chronic disease prevention and health promotion. The Forum was intended to build a supportive learning environment in which participants could share experiences and discuss together ways to move their collective wisdom forward. In particular, the Forum discussions provided an opportunity to:

1. Share and learn about innovative examples of documenting, sharing and using evaluation findings for improvement and learning purposes. 

2. Identify existing or potential strategies and opportunities to strengthen this approach.

3. Identify steps to promote and sustain learning and evaluation cultures in public health systems.

Participants were welcomed by representatives of the three organizations that collaborated to plan the Forum: Tim Hutchinson, Director, Chronic Disease Prevention Division, on behalf of the Public Health Agency of Canada; Diane Finegood, Executive Director, on behalf of the CAPTURE Project; and Barb Riley, Director, Strategy and Capacity Development, on behalf of the Propel Centre for Population Health Impact. The Forum extended over an evening and the following day, and included keynote and panel presentations from the following:

• Dr. Simon Roy, Goss Gilroy Inc., Management Consultants
• Dr. Barb Riley, Propel Centre for Population Health Impact
• Daniela Seskar-Hencic, Waterloo Public Health
• Dana Vocisano, McConnell Foundation
• Dr. Maximilien Tereraho, Human Resources & Skills Development Canada
• Bill Gordon, Alberta Coalition for Healthy School Communities
• Jamie Gamble, Imprint Consulting Inc.
• Dr. Diane Finegood, CAPTURE
• Dr. Kerry Robinson, Centre for Chronic Disease Prevention and Control, PHAC

 A copy of the Forum program is included in Appendix 2, and presentation slides and notes from speakers are included in Appendix 3. 

Every truth passes through three stages before it is recognized. In the first, it is ridiculed. In the second, it is opposed. In the third, it is regarded as self-evident.
- Marianne Williamson, Illuminata


B. Concepts, Approaches and Techniques  

The first part of the Forum provided opportunities for presenters and participants to share examples of their experiences with evaluation for learning and to extract lessons learned. In the course of this discussion, a number of concepts, approaches and practical techniques were shared. These are grouped into nine general categories, and ideas under each category are described in the sections below. Some of these are ideas that could be implemented by practitioners and researchers, while others are more appropriate to funders, and some can be put into action by any of these.

It should also be noted that a Background Paper on Evaluation for Learning and Improvement in Chronic Disease Prevention and Health Promotion was prepared by Dr. Simon Roy for the Centre for Chronic Disease Prevention and Control and circulated to participants prior to the Forum. The paper summarized the key concepts and research findings in the literature related to evaluation for learning and improvement.

The Utilization-Focused Evaluation Process Model (Patton, M.Q.), provided by Dr. Simon Roy (see below), sets out five stages for utilization-focused evaluation. Suggestions for how to encourage and undertake evaluation for learning were provided during the course of the Forum on all five stages (see notes in Appendix 2). It was clear from the discussions that no one approach to evaluation for improvement will work in all settings and situations, and thus a wide range of ideas and techniques were proposed at the Forum. A cross-section of the ideas shared is summarized below.

Figure 1: Utilization-focused evaluation process model [diagram not reproduced]

   


Re-Think Evaluation Questions

In designing an evaluation to lead to learning and improvement, it is important to return to first principles. It is crucial to get the issues right, especially in the design stage, and to clarify the purpose and use of the evaluation. Before embarking on an evaluation is the moment to re-think the evaluation questions. For example, in population health, there are four main aspects of interventions: effectiveness, adaptation, feasibility and reach. In order to come to grips with a learning agenda, it would be important to identify evaluation questions that address all four of these and that relate to making improvements at a population level. When developing evaluation questions, it is important to consider the "5 W's": why are we doing this program, what is the goal, what are we trying to achieve, and so on.

Evaluation for Learning vs. Evaluation for Accountability 

The term "evaluation" is fraught with stereotypes, and organizations often approach an evaluation with fear and hesitation. Most organizations are interested in learning; the challenge is that they rarely associate this with evaluation, which is more often a negative or threatening experience for them. The McConnell Foundation is experimenting with separating evaluation for accountability from evaluation for learning, as one way to encourage organizations to enter into evaluations with more openness and curiosity. Thus, some funders are making a distinction between grants management and monitoring, and may even assign different staff to evaluate a project for learning purposes than those who are charged with ensuring accountability. In evaluation for learning, organizations are encouraged to share "failures" as a basis for future improvements, rather than only focusing on "successes". Reports should reveal not just what is being done to produce results, but also what has been learned to shift thinking.

It was also pointed out that accountability and learning are not necessarily mutually exclusive. There are evaluation questions that relate to accountability and others that relate to learning; however, there is also a large area where they overlap. Many organizations may not have the resources to separate these two areas.

Timing 

Timeliness is key in effective evaluation for learning. The pace of decision-making and the pace of evaluation need to mesh, or the evaluation results will not effectively contribute to improvements in what the organization does. There is a lack of capacity and expertise in evaluation, especially when resources are scarce. Many learning-based approaches are time intensive and may involve many questions and multiple evaluations as part of one initiative. This can be challenging for organizations that have limited capacity. In larger initiatives, some of the most successful evaluations are short, but may be conducted within a longer-term relationship between the evaluator and the organization. This can facilitate an understanding of the results of the evaluation and a higher level of trust between the evaluator and the organization.


Identify end-users and their information needs

In "evaluation for learning", who is intended to do the learning? And what is the learning for? In other words, who will use the results of the evaluation and what will they use it for? Undertaking this discussion early in the evaluation process will help identify the end-users, who can then discuss some of their information needs and ways in which they might use the eventual results of the evaluation in their work.

 End‐users may include decision‐makers, program staff who implement the service or activity being evaluated, clients and beneficiaries, public officials, members of the public, and partners. It is important to consider the audience in the widest sense, including those who will be affected by decisions arising from the evaluation results.  These audiences often have different needs and perspectives and it is useful for the evaluator to be aware of them and to accommodate their needs to the extent possible.  For example, senior management may have a different strategic context for the evaluation than those who are most directly involved in the program or service. 

 One of the keys to successfully engaging those who are intended to use the evaluation results for improvements is to address the “audit” mentality associated with evaluations and to clearly establish that the purpose is to learn and improve, not judge. Another key to successfully engaging stakeholders may be the issue of language and how we label things (e.g., in the business sector, the term “continuous improvement” is often used).    

It’s all about Relationships 

Ensuring take-up of evaluation results has a great deal to do with the relationship developed between the evaluator and those who will decide on and implement learnings arising from the evaluation. Meaningful engagement increases trust, buy-in and take-up of the evaluation results. These relationships are most effective when they are built early on, as this allows the evaluator to design the evaluation focus, questions and methodology in a way that meets the needs of end-users and also engages them throughout. Without trust, the evaluation can more easily be dismissed; for example, the methodology may be criticized when the results are difficult to accept.

In large institutions that have in-house evaluators, it is important to have an ongoing relationship between evaluation staff and decision-makers, so that evaluation is strategically linked to policy and programs, and is of use in answering questions that matter to the organization. However, for external evaluators, ethical issues may arise, for example if an evaluator gets too close to a project. There was a lively debate on "objectivity" vs. engagement in the methodology used. It was suggested that qualitative research methods can be used in a manner that ensures solid rigor, and that objectivity is not the only way to achieve rigor and reliability. One approach is to see the job of an evaluator as a "critical friend" who can bring an informed outside perspective while still having the best interests of the organization at heart.

Leaders in the organization play a key role in creating space for learning during and after the evaluation process, and in encouraging a strategic link between evaluation and operations or decision-making. Establishing a positive relationship between them and the evaluator, whether in-house or external, is key to implementation and learning.

Change in the role of the evaluator 

A number of comments throughout the Forum tended towards a shift in the traditional role and competencies of the evaluator.  Some of this has been referred to in the previous paragraphs about relationship‐building.  Other thoughts about possible elements of this emerging role were also raised, including: 

• Behave as a management consultant for the organization or program, not a judge. Even when accountability is part of the purpose of the evaluation, opportunities for learning will be lost if the evaluation is seen as punitive.

• Be aware of political sensitivities and various possible external uses of the data, e.g. creative permutations of the performance story.

• Focus less on providing an objective body of knowledge and focus more on helping groups understand themselves and grow.

• Re-consider old practices and paradigms about evaluation and introduce emerging practices and approaches, such as developmental evaluation and evaluation for learning.

• Developmental evaluation can be a companion to change management. Learning is at the core of change management, and it is important to encourage organizations and their people to examine things from a different perspective.

• Facilitation that helps organizations ask the right questions is very important. If the organization is not in a learning mode, it is difficult to push them there.

• The researcher and evaluator can be a catalyst and bring people together more effectively.

Embedding evaluation throughout the organization

Part of the discussion focused on evaluation for learning not simply as a set of techniques used by evaluators, but as a different way of working throughout each organization, a way of bringing an evaluative lens and evaluative thinking to all aspects of an organization's work. Evaluation can be made less threatening by embedding an evaluation lens and evaluation skills in the regular daily work of an organization, so that evaluative thinking becomes more natural. The analogy was made with housework: if you only clean house once a year, it becomes an overwhelming task.

Comments also distinguished between evaluative thinking (critically examining) and actually undertaking a formal evaluation. Certain technical skills and methodologies may be useful for the latter, but everyone in an organization can contribute to the former. When evaluation becomes part of how an organization works, it becomes evaluation for learning. In organizations where this is the practice, evaluation feeds into decisions in a timely way. This can happen throughout the management cycle (strategic planning, operational planning, budgeting, day-to-day implementation, etc.). Linking evaluation directly into the strategic planning process is one way to facilitate implementation.

The shorter the distance between evaluation results and decision-makers or the decision-making process, the easier implementation and use becomes. This becomes self-reinforcing: if those in an organization see that the suggestions arising from previous evaluations are implemented, the incentive to participate in future evaluations grows. One way to do this is to understand how the reward system in institutions is changing, and to align research and evaluation to ensure that it is making a difference in informing practice and decisions.

Helpful methodologies

Many specific techniques and methods were mentioned in the course of the Forum; it is only possible to mention a few of them here. Developmental evaluation was seen as a useful tool in encouraging a learning environment during the evaluation process. Developmental evaluation is particularly relevant in early-stage innovation and in highly unstable and complex situations where conditions are constantly changing. Developmental evaluation also includes a stronger attention to community-based action research and aims to enable groups to examine options for action and formulate mutually acceptable solutions.

Others saw developmental evaluation as a powerful approach but believe that traditional formative and summative evaluations can also lead to learning, if they are approached in a way that engages decision-makers and those who will be affected by evaluation results. Some also linked evaluation for learning with Continuous Quality Improvement.

Using mock data at the beginning of an evaluation to role-play decision-making and the implications of results can help explore delicate or difficult issues in a hypothetical context. Mock data and mock decision sessions can thus help lay the ground for addressing issues if they arise from the evaluation.

Participatory approaches to data collection, with program participants, stakeholders or staff co-developing questions and data collection approaches, can be extremely useful techniques in themselves, and also help engage stakeholders throughout the evaluation. It is also sometimes helpful to offer stakeholders an opportunity to reflect on the process and learn from it, debriefing on learnings as they occur. It was also noted that there is a need to go beyond one-off evaluations and to synthesize learning from multiple evaluations.

For small organizations, especially not-for-profits, simple data collection tools are often best. These include online surveys with the membership, surveys at the end of events, reviewing minutes of meetings and reports from events, and using existing templates such as the Program Evaluation Reporting Tool. For more complex evaluations, it is often useful for small organizations to partner with a university, as some universities have the capacity to work with community agencies at low cost.

Sharing the evaluation results

Reports are an option, not a default – there are many ways to share the results. Among the other techniques are: executive summaries, PowerPoint slides or visual presentations, newsletter/brochure formats, oral briefings with charts, video presentations, etc. The specific methods chosen need to be tailored to the particular audience and their needs.


However, reports are not enough. Evaluation reports cannot speak for themselves, and the report, or whatever other method is used, needs to be accompanied by knowledge sharing and an implementation strategy. The key to effectively sharing evaluation results is to engender a meaningful conversation with the stakeholders. Learning takes place through social interactions, and the evaluation process must provide for these in order for those affected to digest, understand and act upon the evaluation results. The report should not be seen as a final product, but as the beginning of the next stage in the process.

Stakeholders should be encouraged to challenge and delve into the evaluation results. For example, one technique is to ask what results were expected, before unveiling what the findings actually were. This helps frame some of the gaps and disconnects that can be the basis for discussion. It is important to acknowledge that organizations may find it more difficult to reflect critically on some aspects of their work, and evaluation information that suggests certain strategies or activities need to change can be challenging to digest.

It was suggested that the term "disseminate" be dropped, as it is one-way, top-down and disempowering. Sharing evaluation results in more effective ways includes the notion of negotiating findings and implementation options in a spirit of mutual exchange. Program and service practitioners are generally interested in their own experience and, although that is an essential starting point, a knowledge synthesis approach can help raise that practical knowledge to a higher level. Sometimes it is desirable to share the results beyond the organization itself and to put information into the hands of the public or the clients of the organization, a practice referred to by some as the "democratization of data".

There are a number of challenges associated with sharing evaluation results. What if the results conflict with organizational direction? What if the organization does not possess the capacity to implement the changes that emerge from the results? What if there are political sensitivities that affect the way the results are seen and used?

Part of sharing evaluation results is helping groups understand their experience, and then generating a synthesis of learnings among many groups. Practitioners and program staff are generally hungry for ideas to improve their work and will use information gained through evaluation and strategic planning if it is relevant. To do this at an organizational level involves a talent for learning and for communicating information "up the chain" from programs to policy and from one partner to another. The inherent complexity of some population health issues will be a challenge – evaluation cannot always "connect the dots" and deal with attribution because of complexity.

Examples of Evaluation for Learning Approaches

Throughout the Forum, participants shared stories of how programs or organizations they had worked with had incorporated an evaluation for learning and improvement perspective into their work. A few of these examples are described below.


"Healthy Weights: Halton Takes Action" was introduced in response to a 2004 report from the Ontario Chief Medical Officer of Health. The report highlighted alarming obesity rates and the factors that affect weight, reviewing environmental factors such as urban design. Memorial University, the Halton Region Health Department, Dalhousie University and community partners examined research on the obesity issue and identified possible ways to promote healthy weights. Stakeholders chose to focus on the built environment and on ways to improve walkability and access to healthy foods. Geographic Information System (GIS) data were examined, and the baseline data collection and analysis is funded by CIHR. A Healthy Weights Report Card is being prepared, and analyses and recommendations will be presented to Municipal Councillors. Halton Region is committed to using the data to inform policy about neighbourhoods and intensification and to target programming to high-need areas. These baseline data will be used in an outcome evaluation.

"Vibrant Communities" is the application of a poverty reduction approach that has been prototyped in 12 communities across Canada. A range of evaluation activities has been carried out, including systematic outcome tracking, a summative end-of-campaign initiative and two formative evaluations. It was discovered that there was a need to bring together the "top down" and "bottom up" approaches and adapt them within different contexts. Top-down/bottom-up refers to the way this initiative was shaped: there were five main principles that were key for all participating communities:

• Focusing on poverty reduction over poverty alleviation;
• Comprehensive local initiatives aimed at poverty reduction;
• Grassroots collaboration involving all sectors of the community in these initiatives;
• Identifying community assets and putting them to good use in poverty-reduction efforts;
• A commitment to learning, change and sharing learnings – whether they are the product of successes or failures.

Communities each emphasized and shaped these principles in different ways, depending on a variety of factors. In terms of impact, Vibrant Communities (VC) stakeholders are currently making sense of the results. Initial understanding is that communities using a VC approach can: 1) attract significant resources, engage a broad and diverse range of multi-sectoral leadership, raise the profile and understanding of poverty, and introduce innovative solutions; 2) influence substantive public policies related to poverty, strengthen links and coordination of responses to poverty, and shape private sector practices; and 3) assist a large number of people in their journey out of poverty, address more than one root cause of poverty, and contribute to deep and durable impacts. The overall approach can raise the profile of the issue, build constituency, begin to shift systems and introduce new ways of working.

"KidsFirst" is an early childhood development intervention program in Saskatchewan that works at the family level to address multiple determinants of health: social support networks, personal health practices and coping skills, education, income and social services, and health services. By identifying and providing support to very vulnerable families with young children (beginning prenatally, up to five years of age), KidsFirst aims to reduce disparities in maternal and child health outcomes. The evaluation is being carried out in three phases:


In Phase 1, in consultation with program managers and staff, the researchers developed a comprehensive evaluation framework that will enable communication about the program and guide subsequent evaluation activities.

Phase 2 will consist of two parts: an analysis of existing quantitative data and a comparison of KidsFirst participants to a reference population using secondary data, followed by qualitative case studies designed to enrich understanding of the quantitative findings.

Finally, in Phase 3, findings will be integrated, linking them to program goals and objectives and to the provincial context. Ultimately, through this research, the team hopes to inform emerging knowledge on promising or best practices in early childhood intervention in Canada.

"The Healthy Beginnings Enhanced Home Visiting Initiative" in Nova Scotia supports families facing challenges through a comprehensive home visiting program for up to three years after the birth of a baby. An evaluation framework was developed through a participatory and consultative process that included users of the evaluation at the local, district, regional and provincial levels. As a result, the framework reflected the views of, and was supported by, a wide variety of stakeholders throughout the province. The evaluation was implemented in three phases:

Phase I, "Implementation", explored whether the program was implemented as planned and how implementation varied between regions. Findings were used to inform program development (e.g. the evaluation identified the need for staff capacity building and a strengthened information system to support full program implementation).

Phase II, "Quality Assurance", explored family, staff and partner satisfaction with the program, and identified what was working well and what could be improved. The findings were used at the district, regional and provincial levels to enhance program implementation.

Phase III, "Outcomes for Families", explored the difference the program has made for families, family achievement of goals, and achievement of short- and mid-term outcomes. These findings will be used in discussions with funders, partners and stakeholders working together for ongoing implementation and enhancement of the program.


C. Creating a Culture Shift Towards Evaluation for Learning

The second part of the Forum focused on ways to encourage a shift in how we think about evaluation, at the organizational, sectoral and systemic levels. Systems thinking was a key element of this discussion, and using that framework a number of potential actions and strategies were generated.

1. Systems Thinking  

A panel of the host organizations introduced some observations about systems thinking, as a backdrop to generating ideas on how to create a culture shift towards evaluation for learning. The scale and scope of change being considered lends itself to a systems approach, in which we need to be conscious that many inter-related influences are at work, and that causality is not always clear and is probably irrelevant. An insightful word cloud based on the actions described in the pre-meeting survey was presented to illustrate the complexity of the challenges involved in implementing evaluation for learning.

Figure 2: Complexity of the solution landscape [word cloud not reproduced]

 Based on software developed by: http://www.wordle.net/ 
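A word cloud like the one in Figure 2 can be generated from free-text survey responses with open-source tools. The short sketch below is illustrative only and assumes the Python wordcloud package rather than Wordle, which produced the Forum's figure; the sample responses are hypothetical placeholders, not the Forum's survey data.

# Illustrative sketch (assumption: Python "wordcloud" package; the Forum's
# actual figure was produced with Wordle, http://www.wordle.net/).
from wordcloud import WordCloud   # pip install wordcloud
import matplotlib.pyplot as plt

# Hypothetical placeholder responses, not the Forum's pre-meeting survey data.
responses = [
    "build evaluation capacity in community organizations",
    "share learnings with funders and decision makers",
    "embed evaluation for learning into program planning",
]

# Word frequency across the combined text determines each word's size.
cloud = WordCloud(width=800, height=400, background_color="white")
cloud.generate(" ".join(responses))

plt.imshow(cloud, interpolation="bilinear")
plt.axis("off")
plt.show()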

 

There are 12 places to intervene in a complex system (from most to least effective):1

1. The power to transcend paradigms
2. The paradigm that the system arises out of
3. The goal of the system
4. The power to add, change, evolve, or self-organize system structure
5. The rules of the system
6. The structure of information flow
7. The gain around driving positive feedback loops
8. The strength of negative feedback loops
9. The length of delays
10. The structure of material stocks and flows
11. The size of buffers and other stabilizing stocks
12. Constants, parameters, numbers

1 D. Meadows. Thinking in Systems, Chelsea Green, 2009.

The bar graph below illustrates the distribution of effort and resources involved in each of these aspects. It was observed that there appears to be an opportunity gap with respect to feedback and delays, and this may be an accessible level for strategies for change.

Figure 3: Distribution of effort/resources [bar graph, not reproduced: distribution of action (0-60) across intervention levels – Paradigm, Goals, Structure, Feedback, Elements – with a "speculative gap" indicated]

Three principles were proposed to guide various strategies and actions towards a culture shift:

1. Think like a system. Questions to ask ourselves about evaluation for learning at each level of a system:
• Paradigm: What are existing beliefs related to evaluation for learning?
• Goals: What are the goals of evaluation for learning?
• Structure: What structures exist or could be created?
• Feedback and delays: What are mechanisms for providing evaluation feedback?
• Structural elements: Who are key players – individuals and organizations? How do you currently understand their roles and assets?

2. Be the change we want to create:
• What can each of us do given our current mandates and responsibilities?
• What examples can be built on to demonstrate a new paradigm for evaluation?
• How can we individually and collectively lead by example?
• How might we take stock of progress with evaluation for learning, adapt and re-focus our efforts – individually and collectively?


3. Learn jujitsu (be adaptable, and take advantage of the momentum of other initiatives):
• How can evaluation for learning become part of existing agendas?
• How can we adapt our long-term goals to be a realistic stretch in the short-term?
• What are the strongest assets that could be used to advance evaluation for learning?

2. Action Suggestions Using Levels of a System Framework

A simplified listing of the 12 elements of a system was used as a method for identifying levels where it would be possible to intervene and contribute to a culture shift towards evaluation for learning. Under each of the headings below, ideas are listed that were generated by Forum participants, whether from presenters, from general and small group discussions, or from the pre-Forum survey.

Paradigm (System's deepest beliefs)

• View evaluation as critical and vital for learning and improvement
• View and trust evaluation as being relevant to policy and program development
• Consider evaluation an integral component of program planning
• Consider routine evaluation, implementation of change, and re-evaluation as a dynamic continuum
• Shift the paradigm so that a culture of learning and improvement is built into our thinking around developing interventions, so that changes along the way are seen as welcome opportunities and not failures or mistakes

 Goals (and rules) (What the system is trying to achieve) 

• Establish evaluation as a core element of accountability within public health
• Build evaluation for learning into public health goals, strategies and standards for accountability:

o Core public health functions: surveillance, population health assessment, health promotion / disease and injury prevention, health protection. 

o Public health core competencies (e.g., Ontario Public Health Standards)
• Shift evaluation culture away from something that "is being done" to people in a negative or blaming way, towards something that is participatory, non-blaming and perhaps even appreciative-focused, no matter what is found
• Distinguish between evaluation for learning and evaluation for accountability

• Encourage and support evaluation as one of the key instruments of program or policy development

• Make evaluation more utilization-focused and relevant

Structure (as a whole)

• Make it common for organizations to have an evaluation department or do evaluation, much like they have finance staff and finance departments
• Start evaluation planning before the initiative; consider evaluation as part of planning ("Why are we doing evaluation?", "How are we using evaluation?")
• Share structures/positioning of evaluation within organizations


• For large organizations with in-house evaluation capacity, develop a more systematic and strategic process of determining where, when and how to deploy the evaluation unit in order to have maximum impact on high priority decisions
• Embed evaluation in programs
• Build absorptive capacity in organizations – more integration of evaluation with policy makers and planning functions

Feedbacks and delays (Self-regulation, self-reinforcement, adaptation)

• Develop a common language that reinforces evaluation for learning concepts and practices
• Build evaluation into decision-making timelines
• Expect evaluation results before making decisions about a program, but not before the program has had a chance to show some impact
• Report on evaluation findings quickly so that they can inform program improvement, policy development and funding decisions
• Update objectives as programs evolve (flexibility from learnings)
• Reward/value evidence-based policy and management
• Unlink evaluation from monitoring and accountability functions; remove it from performance measures

 Structural elements: Evaluation Process 

• Make all new initiatives evaluable by ensuring that appropriate and meaningful data are collected (before and after implementation)
• Set up controlled experiments wherever possible
• Ensure clarity and transparency about the assumptions that underlie interventions/programs and prompt inquiry into whether these assumptions are borne out in practice
• Support and respect a range of methods, including participatory and action-oriented approaches, so that "real world evaluations" are grounded in the community/organizational context and involve those being evaluated in a collaborative manner

 Structural elements: Funding  

• Encourage and capitalize on the growing interest in the philanthropic community in the possibilities of evaluation for learning; many funders are searching for new forms of evaluation that respond to the realities of complex systems
• Budget for standardized data collection systems that support comparisons across jurisdictions
• PHAC Grants and Contributions program requirements and funding opportunities (HLCD grant programs, Innovation Strategy):
o Extend program funding term and amount to incorporate evaluation for learning approaches and considerations
o Revise reporting (and exchange/dissemination) requirements
o Support sharing and exchange mechanisms for evaluation methods as well as findings/learning (G&C Database)
• Provide supports and funding for evaluation planning in public health and NGOs: logic models, participatory approaches, a range of methods/approaches, a combination of in-depth evaluation and evaluation synthesis


• Build evaluation for learning concepts and approaches into CIHR Institute of Population & Public Health intervention research proposal funding calls
• Address timing and length of funding (timelines meeting decision-makers' needs)
• Embed enhanced evaluation funding into grants
• Set up a federal government funding pot that practitioners can access for learning support and to increase their ability to evaluate
• Make funding accessible for community-engaged evaluation projects

Structural elements: Communications

• Share evaluation results and practice learning with others through networks, committees and other platforms (see, for example, the CAPTURE project, the Canadian Best Practices Portal, and other projects under way to enhance this, including one at Carleton University and one at the Public Health Agency of Canada)
• Share the pre-reading KE Forum Report with the Provincial Health Promotion Clearinghouse to initiate awareness of and thinking about evaluation for learning
• Discuss evaluation for learning/improvement with quality departments to encourage internal discussion and potentially influence quality indicators
• Generate discussion with senior management about evaluation for learning, its implications for how organizations structure, do and require evaluations, and the positive utility of evaluations
• Share our stories – the details of what worked and didn't when we applied strategies to promote evaluation for learning, including various structures of evaluation capacity (within the program area or as a corporate resource)
• Require timely, regular feedback reports from evaluators to start encouraging the idea that evaluation is for learning
• Use a range of methods for communication: stories; DVDs/videos; posters; websites; found poetry; meetings/forums
• Clarify the relationship between evaluation and other approaches understood by health professionals (e.g. performance monitoring, internal audit, research, quality improvement)
• Develop and clarify for all stakeholders what a culture for learning and improvement looks like (including simplifying the language of evaluation)
• Acknowledge the value and usefulness of findings from existing evaluations
• Recognize the role of evaluation and evaluators in organizations, i.e. as essential as front-line service
• Find opportunities to celebrate and publicly recognize good evaluation work (work that has enhanced learning and program improvement, including showing how evaluation has made a clear impact)

 Structural elements: Build Capacity (Subsystems, actors and the physical structure of the system) 

• Develop an evaluation for learning on-line learning module or course; this could be built into the PHAC Skills Enhancement Program (on-line facilitated learning courses) or others
• Link or build evaluation for learning concepts/content into related workshops/on-line learning:
o NCC Methods and Tools – Evidence-informed decision-making
o OPHA – Towards Evidence Informed Practice – tools, training
o The Health Communication Unit – Evaluation process and workshops
o Canadian Evaluation Society workshops/learning events
• Build evaluation for learning into Master of Public Health program curriculum development, and improve the availability of places where people can receive training and continue to upgrade their skills (including masters and PhD programs)
• Connect with universities to access practicum students to assist with evaluations and bring their cost down, and to facilitate access to research and evaluation support from universities/academia
• Identify and assess current capacity within our organizations to support evaluation
• Encourage program managers to incorporate learning and knowledge exchange goals into performance measurement plans
• Get the right staff and skill sets (technical/interpersonal) and visionary leadership that supports evaluation and learning – this is critical
• Support demonstration projects (utilizing evaluation teams, capacity building within organizations, sharing results)
• Reduce the anxiety around evaluation by supporting capacity for evaluation within the public health and health promotion fields (capacity in its broadest sense)
• Establish the value of evaluation by setting 2-3 public health issues as priorities so evaluations can be completed and lessons learned shared nationally
• Establish greater expertise in cost-benefit analysis
• Attract and retain academic researchers in evaluation to study and develop strong methodologies for the evaluation of complex systems

 Structural elements: Methodology  

• Develop standardized evaluation tools that incorporate evaluation for learning concepts and questions
• Develop clearly defined indicators that will lend themselves to learning-oriented evaluations
• Use dynamic data collection methods and develop ways to better use this type of data: stories, qualitative themes; dynamic exchange events between programs; sharing of experiences, challenges and learning
• Develop government and NGO evaluation guidelines and requirements
• Conduct long-term follow-up (impact post-program)
• Develop new and innovative methodologies to support outcome/impact evaluation of community-based programs
• Develop methods that can encourage sharing of innovative methodologies among evaluators across Canada working in similar areas (e.g., a website or casebook of examples of completed evaluations and methods, sharing of innovative indicators)
• Recognize the specialized skill set and "science" of evaluation

Structural elements: Collaboration / Communities of Practice

• Establish collaboration/linkages/partnerships that foster knowledge development, transfer and exchange
• Utilize existing communication and collaboration structures:
o Public Health Network: network of PHAC, Provincial and Territorial Ministry Public Health leaders
o National Collaborating Centres for Public Health (e.g., Methods and Tools, Healthy Public Policy, Determinants of Health)
o PHIRIC (Population Health Intervention Research Initiative of Canada)
o Evaluation teams within PHAC, Provinces/Territories, Regional Health Authorities/Public Health Units
• Use existing networks and communities of practice:
o PHAC Population Health Evaluators Network
o Dialogue Public Health (NCC Methods and Tools)
o Health Promotion Clearinghouse Network (Nova Scotia)
o Health Nexus and OPHA networks (Ontario)
o Canadian Evaluation Society
• Share models of collaboration (researcher, evaluator, program/policy) to set and act on learning agendas and improve collaboration between research and practice/policy on real-life issues
• Support networks, informal and formal, including informal networks of colleagues, with animators/convenors to keep track, identify issues and facilitate
• Develop an expert directory on evaluation for learning
• Conduct additional KE Forum events, webinars, regional events or "collision opportunities" for exchange between groups that do not normally interact
• Coordinate existing evaluations on specific programs/topics so that researchers can share methods and findings
• Create forums, meetings and opportunities where learnings/tools/expertise related to evaluation, corporate evaluation policy or frameworks, and results and lessons learned can be shared and made easily accessible nationally
• Support non-academic organizations in disseminating their evaluation results
• Enhance the role of feedback loops (knowledge brokers? dynamic exchange forums?)
• Ensure there are learning organizations, communities of practice, mentorship and other bodies to support the learning and development of those who want to conduct evaluation, or to know who/how/when to use evaluation

D. Closing Reflections

At the closing of the Forum, participants reflected on the discussions that had unfolded. Several points emerged from their comments.

First of all, the interest in evaluation for learning methods and approaches continues to be very strong among those who participated. The potential for new ways of undertaking evaluation and enabling it to more effectively nourish the decision-making of different agencies, institutions and governments was very attractive to many of those present, and there is an appetite to learn more about these concepts and approaches. A number of participants commented that the discussion had re-ignited their interest in evaluation and in seeking out ways of conducting evaluation to make it more useful to their organization. Others indicated that the range and diversity of participants was instrumental in this, and included many perspectives outside their usual networks.

Secondly, work in this area is progressing on many fronts and in many different ways, yielding a wealth of ideas and possibilities, a number of which were shared in the course of the Forum.


Comments throughout the Forum indicated that the examples and suggestions shared were often ones which participants felt they could apply in their own settings, and that there was a thirst for more ideas of the same quality.  For example: 

• Several participants indicated they will be taking ideas back to their organization and sharing these with their colleagues with a view to implementing them.  Others stated they would also share the ideas more widely through networks of which they are members. 

• A number of participants commented on specific concepts or challenging ideas they heard during the course of the Forum which will influence their work in future (e.g. using a systems lens, using more imaginative and engaging ways to share results other than a report, use of wordles to convey a pattern, the simplified levels of a system as a framework for influencing change, etc.).   One participant indicated they took “pages of notes” they intended to apply in their work, and others echoed this comment. 

• One participant commented on the immediate application of some of these concepts to an evaluation their organization was about to undertake. 

• One government funder expressed interest in following up with the McConnell Foundation to explore their approach to separating evaluation for accountability from evaluation for learning. 

Thirdly, there was an interest in continuing the dialogue and exchange.  As participants return to their respective environments and move forward with any of the tools or ideas that they feel are relevant, there is a desire to keep sharing in the future.  This exchange could take many forms and participants identified a number of areas in their own fields of influence which they felt they would like to move ahead with following the Forum.  In addition, several comments noted and appreciated the list of participant contact information provided in the Forum information kit.  Several indicated that connections made at the Forum will be followed up afterwards.  For example: 

• Some participants expressed interest in a teleconference to follow‐up on ideas, and others suggested using existing fora to expand this conversation. 

• Several institutional participants said they intended to look at ways they can support continuing dialogue. 

• Some commented that there is much going on that is extremely interesting and needs to be documented more consistently in order to be able to pass it on to others.  Too often, organizations are so engrossed in the implementation that they neglect to capture their experience so that it is reflected in the literature and shared.  

• Several commented that the collaboration between PHAC, CAPTURE, and Propel to organize this Forum was highly successful and augurs well for future collaborations of a similar nature. 

In closing, Kerry Robinson thanked the participants for their excellent contributions, and encouraged everyone to continue their pioneering work in evaluation for learning.   


 

E. Forum Evaluation Feedback

The evaluation forms completed by participants indicated that they felt the Forum was highly effective in providing a better understanding of learning-oriented evaluation (85%), at providing examples of innovative approaches and current initiatives supporting evaluation for learning and improvement (75%), and as an opportunity to network and share with others (80%). It was also effective, although slightly less so, at identifying actions and opportunities to advance learning-oriented evaluation (65%), and at setting out how to scale up learning-oriented evaluation from a systems perspective for public health efforts in chronic disease prevention and health promotion (55%). The general design of the Forum, especially the mix of plenary sessions, small group discussions and speakers, was helpful (90%).

Overall, the Forum was very well received by participants, who also provided some helpful suggestions for improvement. Here are some participant comments:

Strengths:

• "Fabulous balance of practitioners, researchers and funders!"
• "I appreciated learning about diverse concrete examples of how people in different contexts are using Evaluation for Learning."
• "Unique organization of Forum - Well done!"
• "Please continue to sponsor these types of events."

Suggestions for Improvement:

• "Having more practitioners/front line workers in the room"
• "More visual aids from some presenting"


Appendix 1: KE Forum Participants

Name | Organization | Role | E-mail Address | Province / City
Daniela Seskar-Hencic | Waterloo Public Health | Manager, Planning and Evaluation | [email protected] | Waterloo
Judy Corcoran | Alberta Health Services | Manager, Public Health | [email protected] | Alberta
Charlene Beynon | Middlesex-London Health Unit | Director, Research Education Evaluation & Development | [email protected] | London
Sara Kreindler | Winnipeg Regional Health Authority | Researcher | [email protected] | Winnipeg
Ian Parker | Yukon Department of Health & Social Services | Manager, Health Promotion Unit | [email protected] | Yukon
Mary-Anne Finlayson | Nova Scotia Health Promotion and Protection | Manager, Research & Evaluation | [email protected] | Nova Scotia
Kathryn Graham | Alberta Innovates | Director, Performance Management | [email protected] | Alberta
Bonnie Jeffrey | University of Regina | Professor, Faculty of Social Work; Director, SPHERU | [email protected] | Regina
Cathy Steven | Health in Common | Executive Director, Health in Common | [email protected] | Winnipeg
Dana Vocisano | McConnell Foundation | Senior Program Officer | [email protected] | Montreal
Bill Gordon | Alberta Coalition for Healthy School Communities | Treasurer | [email protected] | Alberta
Jamie Gamble | Imprint Consulting Inc. | Principal | [email protected] | New Brunswick
Pamela Forsyth | NCCMT | Knowledge Broker | [email protected] | Hamilton
Wendy Young | Memorial University | Canada Research Chair in Healthy Aging | [email protected] | Newfoundland & Labrador
Katherine Gray-Donald | McGill University | Associate Professor, Dietetics and Human Nutrition | [email protected] | Montreal
Mary Frances MacLellan-Wright | PHAC, Alberta Region | Evaluation Analyst | [email protected] | Alberta
Munira Lalji | PHAC, Alberta Region | KDE Analyst | [email protected] | Alberta
Leslie Payne | PHAC, Ontario Region | KDE Analyst | [email protected] | Toronto
David Crouch | CAPTURE | Director of Innovation | [email protected] | BC
Marla Steinberg | CAPTURE | Director of Evaluation | [email protected] | BC
Barb Riley | Propel Centre for Population Health Impact, University of Waterloo | Director, Strategy and Capacity Development and Senior Scientist | [email protected] | Waterloo
Barb Zupko | Propel Centre for Population Health Impact, University of Waterloo | Senior Manager | [email protected] | Waterloo
Diane Finegood | CAPTURE | Executive Director, CAPTURE | [email protected] | BC
Judy Purcell | Cancer Care Nova Scotia | Prevention Coordinator | [email protected] | Nova Scotia
James Coyle | Interior Health Authority | Leader, Evaluation, Planning and Improvement | [email protected] | BC
Laurie Woodland | BC Ministry of Healthy Living and Sport | Executive Director, Chronic Disease/Injury Prevention and Built Environment | [email protected] | BC
Jay Barber | PHAC, Children’s Programs Evaluation | Team Leader, Evaluation | [email protected] | Ottawa
Charlene Cook | PHAC, Innovation Strategy | Senior Policy Analyst | [email protected] | Ottawa
Blaize Mumford | PHAC, CCDPC Evaluation | Manager, Evaluation | [email protected] | Ottawa
Christine Marshall | Health Canada, Evaluation | Evaluation Analyst | [email protected] | Ottawa
Maximilien Tereraho | Human Resources & Skills Development Canada, Evaluation | Director, Feedback and Knowledge Management | [email protected] | Ottawa
Christine Le Grand | Heart and Stroke Foundation, Science Policy | Science Policy Analyst | [email protected] | Ottawa
Kerry Robinson | PHAC, Knowledge Development & Exchange Unit | Manager, KDE | [email protected] | Ottawa
Fowsia Abdulkadir | PHAC, Knowledge Development & Exchange Unit | Planning and Evaluation Analyst | [email protected] | Ottawa
Manal Salibi | PHAC, Knowledge Development & Exchange Unit | KDE and Evaluation Analyst | [email protected] | Ottawa
Dawn Sheppard | PHAC, Knowledge Development & Exchange Unit | Senior Policy Analyst | [email protected] | Ottawa
Nina Jetha | PHAC, Canadian Best Practices Initiative | Manager, Best Practices Portal | [email protected] | Ottawa
Lynne Tyler | Catalyst Research and Communications | Facilitator | [email protected] | Ottawa
Diane Finkle | Wordsmith Writing and Editing Services | Note taker | [email protected] | Ottawa
Simon Roy | Goss Gilroy Inc. | Keynote Speaker | [email protected] | Ottawa
Derrick Jolicoeur | PHAC, Cancer Coordination Unit | Policy Analyst | [email protected] | Ottawa

Abbreviations:
CAPTURE: Canadian Platform to Increase Usage of Real-World Evidence
CBPI: Canadian Best Practices Initiative
CCDPC: Centre for Chronic Disease Prevention and Control
CIHR: Canadian Institutes of Health Research
HRSDC: Human Resources and Skills Development Canada
KDE: Knowledge Development and Exchange
KT: Knowledge Translation
NCCMT: National Collaborating Centre for Methods and Tools
PHAC: Public Health Agency of Canada
SPHERU: Saskatchewan Population Health & Evaluation Research Unit

 


Appendix 2: KE Forum Program 

 

Knowledge Exchange Forum: Creating a Culture of Evaluation for Learning and Improvement in Chronic Disease Prevention and Health Promotion

March 29-30, 2010

Public Program

Purpose of the Forum
The Knowledge Exchange Forum 2010 will focus on evaluation for learning and improvement in chronic disease prevention and health promotion. In particular, the Forum will discuss how to advance documenting, sharing and using evaluation findings to improve program practice in chronic disease prevention and healthy living promotion. Desired outcomes from the Forum discussions include:

• share and learn about innovative examples of documenting, sharing and using evaluation findings for improvement and learning purposes;
• identify existing or potential strategies and opportunities to strengthen this approach; and
• identify steps to promote and sustain learning and evaluation cultures in public health systems.

This event will bring together community-based practitioners, evaluators, policymakers, funders, and researchers from across Canada to develop a better understanding of how to advance evaluation for learning and improvement in chronic disease prevention and health promotion. This Forum is also meant to identify opportunities to strengthen the culture and use of evaluations for learning in participants’ respective current and future work.

Stories drawn from experiences at multiple levels and from a range of professionals will help to illustrate successes, challenges and lessons learned from evaluations for learning and improvement.


Monday, March 29

5:00pm  Arrival and check-in, hors d’oeuvres/refreshments
6:00pm  Opening Session
        Opening Remarks (5 minutes each):
        Public Health Agency of Canada – Tim Hutchinson, Director, Chronic Disease Prevention Division
        CAPTURE – Diane Finegood, Executive Director
        Propel Centre for Population Health Impact – Barb Riley, Director
6:15    Keynote Address: Dr. Simon Roy

6:45 Invitation to Conversation: Three Views of Evaluation

Researcher perspective: Dr. Barb Riley, Propel Centre for Population Health Impact (highlights from CBRPE needs assessment)

Practitioner/Evaluator perspective: Daniela Seskar-Hencic, Waterloo Public Health
Funder perspective: Dana Vocisano, McConnell Foundation

7:15  Open to all for comments and discussion
7:30  Dinner and Conversation

8:30 Adjourn (facilitator)


Tuesday, March 30

8:00  Light breakfast and networking (late arrival and check-in)
9:00  Opening (facilitator)
9:15  Stories of Innovation

A panel will profile three Canadian stories of innovative approaches and current initiatives in using knowledge from evaluations to advance practice and policy. (15 min each)

Maximilien Tereraho, Director, Human Resources and Skills Development Canada
Bill Gordon, Alberta Coalition for Healthy School Communities
Jamie Gamble, Imprint Consulting Inc.

10:45  Break – coffee & tea provided
11:00  Breakout Groups
Discussion questions:

• What are the big ideas we heard from the panel? (distill issues to discuss and think about)

• The panel presented three examples of applying evaluation to implement change. What other examples are you aware of? What is each of our organizations doing?

11:45  Group Report Back
Brief reports from small group discussions: 3 to 4 highlights per group. Discussion questions, if needed:

• What patterns have you observed in the examples so far about evaluation for learning and improvement?

• What gaps are emerging in our knowledge and in our practice of evaluation for learning?

12:15  Lunch will be provided (45 min lunch break; support available for travel claim completion)
1:00   Creating a Culture Shift Towards Evaluation for Learning

A panel of three presenters from CAPTURE, PHAC, and Propel will propose some initial thoughts about how systems thinking and other “big ideas” can help move all sectors towards a culture of evaluation for learning and improvement.


CAPTURE: What is systems thinking? Systems perspective? Why is it important?
PHAC: What are or could be elements or components for a system to support evaluation for learning?
Propel: What needs to happen from a systems perspective to create a culture for learning and improvement?

1:25 General discussion in plenary to identify how to move forward: what we can each do in our respective organizations and sectors, how to exchange and learn from each other, and so on.

Discussion questions:

• What can we each do to bring about a culture of evaluation for learning and improvement? What scope do you see in your own organization and your sector? What can we learn from what has succeeded so far?

• What possibilities are there for improved sharing, exchange and support amongst those who are trying to practice and promote this culture?

• What are your hopes in terms of actions we can take together?

2:30  Break – refreshments provided
2:45  Where Do We Go From Here?
This round of discussion will sift through the suggestions for action by various sectors and suggest some possible next steps. Small groups will generate specific actions and these will then be reported back to the full group so that we can compile an overview of the planned actions going forward.

Discussion in small groups:
• What is your thinking about your own next steps after this Forum? For example, are there ideas you have heard that may be promising for your agency or sector?

• What are some joint action steps? Who should be involved and how (who are the key players and how should they be involved)?

• Given all of the suggestions for action arising from the previous discussion, what do you think are the three most critical next steps / to-do items? Why?

4:00  Closing, Reflections and Next Steps
      Final comments from participants
      Closing remarks from Kerry Robinson
      Thank-you to all presenters and participants
      Completion of evaluation forms and travel claim forms
4:30  Closing


Appendix 3: KE Forum Presentations 

Evaluation for Learning and Improvement in Chronic Disease Prevention and Health Promotion: Lessons from Literature

Dr. Simon Roy and Dr. Wendy Ryan

March 29, 2010


The Issue

• According to research, gaps exist in the ability to share and use evaluation results for practical learning and improvement in population and public health

• 2005 survey of Canadian evaluators:
– Only 67% of evaluators thought that their evaluations improve programs
– 65% thought that they significantly benefit program clients


Why Evaluations Become Shelving Material

• Poor evaluations (e.g., methodology, reporting, recommendations, etc.)

• Evaluators exclude program involvement (‘We are the experts’)

• Evaluators ignore the bigger picture

• General denial of results or resistance to change on the part of programs or senior management

Evaluation for Learning and Improvement: What is it?

• Evaluations that are designed, conducted and reported in ways to maximize the dissemination and use of the evaluation findings, as well as lessons learned

• Purpose of evaluation in this context is learning and improvement, and not strictly accountability

• Literature identifies a number of strategies to achieve this


Evaluation for Learning: Key Strategies by Phase

1. Evaluation Design and Advisory Body

2. Identification of Stakeholders and Users

3. Data Collection Process

4. Analysis and Reporting

5. Knowledge Transfer and Follow-up Activities


Governance and Evaluation Design

• Governance of evaluation should include stakeholders (e.g., committees)

• Good evaluation design facilitates learning and takes into account the following:
– Recognized evaluation principles and good practices
– Solid methodology that meets the following standards:
  - Three basic evaluation issues: rationale, delivery, impacts
  - Plan to have recommendations
  - Proper sampling to ensure representativeness


Identification of Stakeholders and Users and their Needs

• Identify users and beneficiaries upfront. Challenges:
– Determining, building and sustaining interest throughout the evaluation process
– Determining knowledge and increasing knowledge as needed (e.g., training)

• Identify needs of users and beneficiaries
– Evaluation is a great opportunity to gather information – not just for traditional evaluation purposes


Data Collection Process

• Participatory approaches in data collection

• If stakeholders are not directly involved in the data collection, it is important that they be kept well informed of progress

• Any problems or delays should be communicated to relevant stakeholders as soon as they are known

• During this phase it is also important to offer stakeholders opportunities to reflect on the process and learn from it; debriefing process learnings as they occur


Analysis and Reporting

• Useful reports based on following principles:

Objective analysis and presentation of findings

Validation from stakeholders to ensure solid results and sense of ownership (recognition that evaluation data-gathering process is not a perfect science)

Evaluation report structured to meet stakeholder needs (avoid long ‘detective story’)


Analysis and Reporting

• Recommendations:
– Allow time to do a good job on recommendations
– Develop strategies for getting recommendations taken seriously
– Recommendations should clearly follow from and be supported by the evaluation findings
– Negotiate and clarify with stakeholders and evaluation funders
– Consider costs, benefits, and challenges of implementing recommendations
– Focus on actions within the control of intended users
– Exercise political sensitivity

• Should have a follow-up plan

Knowledge Transfer and Follow-up Activities

• Tailor reports and communications for each audience in terms of scope, timing, and presentation format

• Options include:
– Executive summary followed by a report or key graphics
– Traditional academic research paper
– Newsletter, press release, brochure
– Oral briefing with charts
– Discussion groups based on handouts
– Retreat-like work session
– Video or audiotape presentation
– Dramatic, creative presentation that involves role-playing


Contextual Considerations

• Need to consider:
– Culture of organization
– History
– Resources
– Political context

• Do homework in terms of relevant policies – need to be consistent

Final Thoughts

• All of this can be summarized as how to gain user buy-in of results through:
– Involvement
– Solid methodology
– Useful recommendations
– Good communications

• Stakeholder involvement is not a panacea – it should not be a justification for insufficient funds for evaluation

Evaluation for learning involves five key strategies:

1. Evaluation design and advisory body

Good evaluation design facilitates learning and takes into account recognized evaluation principles and good practices as well as solid methodology. 

2. Identification of stakeholders and users
Users and beneficiaries should be identified at the beginning. However, it may be challenging to determine, build and sustain interest throughout the evaluation process and to determine and increase knowledge as needed (e.g., training). 

It is also important to identify the needs of users and beneficiaries.  Evaluation provides an opportunity to gather information that can be used for a variety of purposes that may extend beyond traditional evaluation purposes. 

3. Data collection process
Participatory approaches in data collection are extremely important.  If stakeholders are not directly involved in data collection, they should be kept well informed of its progress, and any problems or delays should be communicated to them as soon as they are known.   

During this phase, it is also important to offer stakeholders an opportunity to reflect on the process and learn from it, debriefing on process learnings as they occur. 

4. Analysis and reporting
Useful reports are based on principles such as objective analysis and presentation of findings and validation from stakeholders to ensure solid results and a sense of ownership (recognizing that the evaluation data‐gathering process is not a perfect science).  

The evaluation report should be structured to meet stakeholder needs and avoid lengthy ‘detective story’ descriptions. 

Recommendations should: - Be allowed sufficient time to ensure quality - Include strategies to ensure they are taken seriously - Clearly follow from and be supported by the evaluation findings - Be negotiated and clarified with stakeholders and evaluation funders - Consider the costs, benefits, and challenges of implementation - Focus on actions within the control of intended users - Exercise political sensitivity 

It is important to have a follow‐up plan.

5. Knowledge transfer and follow‐up activities

Tailor reports and communications for each audience in terms of scope, timing, and presentation format.  

Options include:
- Executive summaries followed by a report or key graphics
- Traditional academic research papers
- Newsletters, press releases, brochures
- Oral briefings with charts
- Discussion groups based on handouts
- Retreat-like work sessions
- Video or audiotape presentations
- Dramatic, creative presentations that involve role-playing

Contextual considerations and evaluation use
Evaluators should consider the culture of the organization, its history, resources and the political context.  It is also important to remain consistent with and aware of relevant policies.

Dr. Roy concluded by summarizing that user buy‐in of results can be enhanced through involvement, solid methodology, useful recommendations and good communication.  He also stressed that stakeholder involvement is not a panacea and should not be a justification for insufficient funds for evaluation. 


Institutionalizing Evaluation for Learning and Improvement

Experience of HRSDC/Feedback & Knowledge Management Division

Maximilien Tereraho, Ph.D., Adm.A., Human Resources and Skills Development Canada (HRSDC)

Disclaimer: The views expressed in the paper are solely those of the author and do not necessarily represent the position of HRSDC.

@ 30 March 2010. Work in progress; comments should be sent directly to the author.


Introduction

This presentation uses lessons from the HRSDC/FKM Division’s work to discuss opportunities and challenges for institutionalizing learning- and improvement-oriented evaluation practice. Feedback and Knowledge Management (FKM) is a small unit (1-4 FTEs) within the Evaluation Directorate (7 units) dedicated to supporting the utilization of evaluation knowledge for decision-making by fostering linkages between evaluation, policy and programs. FKM was formally created and staffed in 2006-2007.


Main Objectives of FKM

FKM aims to enhance the use and usefulness of evaluations as well as their impact on decision-making, innovation and continuous improvement in HRSDC:

– Provide easily accessible comprehensive body of evidence, lessons learned and good practices from internal and external evaluations;

– Promote the application of evaluation findings, good practices, and lessons learned to improve the performance of HRSDC programs/policies;

– Foster evaluation capacity development within the Evaluation Directorate, the Department and the broader evaluation community.


Key Activities and Outputs

• Mapping and meta-analysis of HRSDC evaluations (thematic synthesis studies, Annual Evaluation Report).
• Dissemination of evaluation knowledge within and outside the Department. Current dissemination/brokering channels include the Website/Knowledge Portal, newsletters, workshops and participation in departmental committees/working groups.
• Support to Strategic Review by reviewing, compiling and organizing evaluation evidence, and effectively participating in dedicated working groups/committees.
• Coordinating evaluation assistance and producing the annual report on the state of performance measurement at HRSDC.
• Analysis and preparation of reports to the Evaluation Committee on follow-up actions taken by Program Management in response to evaluations.


Institutionalizing/Implementation Phases

Phase I – Problematizing (2004-2006):
• Lessons learned from the Audit and Evaluation Committee process served as arguments for a shift from a passive role of evaluation to an active one: advocate for the use of evaluation findings in policy and program improvement.
• In follow-up to a Branch retreat, a briefing note to the ADM (July 2005) emphasized the need to look for a better way to transfer knowledge of what works and lessons learned from evaluations into the process of organizational decision-making, learning and innovation beyond the concrete case at hand (individual evaluations).
• FKM’s design was inspired by existing best practices in leading national and international organizations, including those visited by its director in 2006 in New York and Washington, DC.


Institutionalizing/Implementation Phases

Phase II – Enrollment and Legitimacy Building (2005-2007):
• Senior Management (specifically the ADM, Strategic Policy) supported the Program Evaluation initiative to develop new tools and approaches to disseminate findings and to influence policy/program development.
• An action plan to improve evaluation support for evidence-based policy/program development was developed in 2006-2007 in consultation with a Branch task team. The plan emphasized:
– Policy’s interest in both single program evaluation results and thematic syntheses from multiple evaluation findings and other sources of evidence; and
– the need to develop forward-looking and better focused thematic analysis addressing program-mix and cross-cutting issues.


Institutionalizing/Implementation Phases

Phase II – Enrollment and Legitimacy Building (cont’d):
• A joint newsletter with the Policy Research Directorate was established in 2006.
• A first HRSDC Knowledge Conference was co-organized in December 2006 with the new (corporate) Knowledge and Data Management Directorate.
• Participation and presentations at the CES Annual Conferences and various national and international learning events since 2006, including panels.
• Participation in the TB Evaluation Policy renewal process through Heads of Evaluation consultations (special invitation and/or in replacement of the DG). Etc.


Institutionalizing/Implementation Phases

Phase III – Contest and Collaboration (2007 onward):
• FKM slowly became a standalone unit within the Evaluation Directorate, and its visibility continues to increase (renewed TB Evaluation Policy; Evaluation Committee and branches’ revealed demand).
• The unit is becoming a centre of excellence in evaluation knowledge management, recognized as such within HRSDC and the Government of Canada as well as internationally.
• The importance of evaluation knowledge development and sharing for policy/program improvement is recognized by both program/policy areas and Evaluation.
• Evaluation knowledge sharing is now a shared responsibility within the Evaluation Directorate and a cross-functional product.
• Both direct and indirect use of evaluation has significantly improved, although attribution is theoretically difficult to establish.


Institutionalizing/Implementation – Barriers

• There is a persistent lack of sufficient awareness and accessibility of existing knowledge and projects undertaken, for both evaluators and policy/program analysts.
• Traditional evaluation business lines are not always adapted to policy/program needs.
• Matching the pace of policy/program processes with the timeliness of evaluation products is a major challenge.
• Evaluation and program management turnover is associated with a lack of ownership and commitment, and therefore with the reversibility of processes and wins.
• Resistance to change towards learning-oriented evaluation can also come from inside (Evaluation) for various reasons, including the audit-like and risk-averse culture of evaluation embedded in the Public Service bureaucracy, and negative experience with evaluation use.


Institutionalizing/Implementation – Facilitators

• The new draft TB Evaluation Policy put emphasis on direct strategic use of evaluation for policy decision-making, expenditure management, learning and program innovation.
• Actual senior management interest in evaluation as strategic knowledge likely reflected commitment and support.
• The expertise and experience of the initiative’s proponent in introducing new management practices in organizations was an asset.
• Successive organizational changes proved to be somewhat positive:
– Location of the evaluation function in a policy branch (link);
– A short merger with Audit provided exposure to its rigours.
• In-house and participatory project delivery is likely to increase trust, buy-in and evaluation knowledge take-up.


Lessons Learned

• Evaluation reports cannot speak for themselves; an integrated evaluation knowledge management framework is needed.
• Learning takes place through social interactions: understand, interpret and make sense of experience for further action.
• An outcome of such social experiments is a change in the mindset of the learner: break from traditions and adopt new practices. For this to happen:
– Behave as a management consultant for program/policy, not a judge: assess needs and communicate constantly;
– Senior management views can be different from those of the program/evaluation manager: understand their perspective and get their attention;
– Take advantage of all possible new opportunities, build legitimacy through expertise and networks, and compromise, not capitulate;
– Be aware of political sensitivity and possible requests for data on creative permutations of the performance story.


Conclusion

It should be understood, from HRSDC’s particular organizational context and from experiences elsewhere in Canada and abroad, that the proposed shift from traditional audit-like program evaluation towards a more learning-oriented (developmental) evaluation constitutes a significant cultural shift and challenge. The effective implementation of such an initiative may require more time than originally expected and will certainly depend heavily on senior management commitment. “Good ideas are first ignored, then fought, and finally taken for granted / obvious.”

 


Developmental evaluation from a practitioner/consultant perspective
Jamie Gamble of Imprint Consulting Inc. shared his unique perspective as a practitioner and consultant in evaluation.  He offered a number of observations, noting that developmental evaluation is still in its early days and that its boundaries may not be totally clear.  He distinguished developmental evaluation from evaluation for improvement, and stressed that many practitioners operate within a dynamic and changing environment.  He shared three case studies and offered some provocative recommendations, such as the suggestion that evaluation reports may not always be necessary. 

Organizations generally have an interest in evaluation and are very capable of learning, processing and articulating; however, things may become problematic when they perceive that an “evaluation” is being “done to them”. 

Because developmental evaluation is still in its early days, there is permission to explore and play with it in different contexts such as during early‐stage innovation and in highly unstable and complex situations where boundaries may not be clear.   

Developmental evaluation is useful because practitioners operate within a dynamic environment that involves new populations, evolving target groups and constantly changing conditions.   

Case #1: Jamie shared a case example of his experience with an environmental organization that utilized a simple planning tool to work with municipalities in making decisions about sustainability.  As the evaluation evolved, it became evident that there was a need to reframe the principles of the planning tool.  Developmental evaluation was useful in this context because the organization understood the importance of science and rigour and its members were able to think critically, interpret their findings in a reasonable way and make the necessary adjustments to their model.   

Case #2: Developmental evaluation is a niche that can serve a specific purpose, such as in the case of a cross‐border network of organizations that worked in community economic development, sustainable environment and food security.  The network reached a “summative moment” where it needed to take stock of where it had been and where it was going.  A developmental evaluation approach provided an opportunity for the organization and its stakeholders to speak up and helped the network come to the conclusion that it needed to cease its operations. 

The point of these stories is that developmental evaluation can serve a variety of different purposes.  Developmental evaluation can also assist as a crisis response and can help organizations move quickly to assess and make changes. 

Leadership from within organizations is essential and funders can also play a key role by investing in and supporting learning space. 

Evaluating the work of the Alberta Coalition for Healthy School Communities
Bill Gordon of the Alberta Coalition for Healthy School Communities shared his experience with evaluation as an educator for more than 40 years.  He stressed that many communities do not have a strong research/evaluation base and that evaluation has always naturally been for learning.  He also described the results of a 2007 evaluation, which included strong recommendations, eventually acted upon, to engage more policy makers. 


ACHSC was funded by PHAC for a project to build capacity for comprehensive school health (CSH) in Alberta.  The project was designed to build effective partnerships and community linkages to advance CSH and to build on strong awareness and education among Alberta school communities of the ACHSC network and the supports available to help implement CSH.  

The ACHSC network strengthens understanding of health disparities among children and youth, families, and communities and provides supports to help address these issues.  In addition, the project focuses on ways to mobilize regional groups to look at the health of Aboriginal children and youth. 

Work to address these goals has included the development of background documents; hosting interactive learning events, meeting and networking opportunities; facilitating opportunities for partnership development; conducting research / evaluation and the formation of task groups / committees. 

Evaluation has always been a strong component of planning.  Because the organization is not‐for‐profit, its work is driven by the needs of its membership and the community.  Evaluation is used as a way to determine if the needs of stakeholders are being met, desired outcomes are obtained and to shape future work.   

For example, in 2007, evaluation results included a strong recommendation to move toward engaging policy makers.  As a result, an “invitation only” event brought together school board trustees, superintendents, physicians, Medical Officers of Health, the Alberta School Councils Association and the Alberta Teachers’ Association.  After this successful event, the delegates and our general membership conveyed the message that there was a need to partner at a local/regional level.   

Subsequently, in 2008‐2009, ACHSC hosted regional school health team meetings around the province as a way to address this need and to further determine if there was support for a local level team structure.  In 2009‐2010, based on the evaluation of 2009, ACHSC supported the development of five regional teams and the addition of seven regional Aboriginal team meetings.  

ACHSC uses various methods to evaluate its work.  These include online surveys with membership, surveys at the end of conferences, research projects, reviewing minutes of meetings and reports from events, and completing the PERT tool.  Evaluation is often considered separately for each strategy or activity and compiled into a large evaluation report / analysis for the work of the year.  Developing the PHAC work plans has also provided a structure to think about the expected results for activities and how to measure results.   

ACHSC also recognizes the importance of evaluating processes.  Its most recent process evaluation was a survey of board members about the organization, how it does its work, and future directions.  This year, ACHSC is working with the University of Alberta’s Community‐University Partnership team to develop an evaluation tool which will focus on processes, successes and challenges of creating effective regional teams for Comprehensive School Health as identified by the teams with which we have been working for the past two years. 

A focus on evaluation has led ACHSC to think critically about the impact that its initiatives will have on developing healthy school communities in Alberta.   This was 


especially true in its efforts to bring together education, health promotion, the medical community and parent organizations within each of Alberta’s health regions and education zones.   If ACHSC cannot link its work to its vision or mission through strong evaluation, then activities should be reconsidered.  Considering past evaluations allows the organization to be deliberate in the activities that it undertakes. 

Evaluation has allowed ACHSC to improve its practices and actions, and to best meet the needs of its stakeholder groups.  The organization has become more active in traveling to communities throughout Alberta and facilitating real conversation and communication between many groups. 

ACHSC has become more active in its relationships with policy level organizations such as the Alberta School Boards Association, the College of Alberta School Superintendents, the Alberta School Councils Association, the Alberta Medical Association, the Association of Medical Officers of Health and the ATA. 

Evaluation, which pointed to the need for some financial support for small and large CSH best practices and system-wide policy development, led to ACHSC’s decision to successfully lobby for the creation of a $3,000,000 Wellness Fund in joint partnership with Alberta Health and Wellness and the University of Alberta’s School of Public Health.  ACHSC is hopeful that the knowledge gathered from the evaluation/research components of the projects funded will be used as templates for other Alberta communities to build on successes or avoid the roadblocks encountered.  

Even something as simple as evaluating a learning event allows ACHSC to identify the learning needs of its stakeholders and how to best meet these needs.  An annual evaluation report to stakeholders also provides a way to express priorities and future activities.   

It is a challenge to acknowledge that evaluation can reveal both successes and areas for improvement.  It is very easy to pick up on information that validates what an organization is doing, but it is more difficult to reflect critically on information that suggests that strategies or activities need to change.  It can be difficult to recognize that process and outcome information are equally important.  How outcomes are achieved is just as important as whether or not they are achieved.  

More details regarding the Coalition, the Wellness Fund and the work of the Coalition in Alberta can be obtained at: www.achsc.org  

 


Systems Thinking and the Culture Shift Toward Evaluation for Learning

Diane T. Finegood, Executive Director, The CAPTURE Project
Kerry Robinson, Manager, Knowledge Development and Exchange, Public Health Agency of Canada
Barb Riley, Director, Strategy and Capacity Development, Propel Centre for Population Health Impact

Complexity of the Solution Landscape

Based on the actions described in the pre‐meeting survey, using http://www.wordle.net/


Part 1: What is systems thinking? Why is it important?

Diane Finegood, PhD
CAPTURE Project

Places to Intervene in a Complex System

1. The power to transcend paradigms

2. The paradigm that the system arises out of

3. The goal of the system

4. The power to add, change, evolve, or self‐organize system structure

5. The rules of the system 

6. The structure of information flow 

7. The gain around driving positive feedback loops

8. The strength of negative feedback loops

9. The length of delays

10. The structure of material stocks and flows 

11. The size of buffers and other stabilizing stocks

12. Constants, parameters, numbers

[Slide graphic: the leverage points are shown against axes labelled “Effectiveness” and “Difficulty”.]

D. Meadows, Thinking in Systems, Chelsea Green, 2009.


System Level Framework (adapted from Meadows)

• Paradigm

– System’s deepest beliefs 

• Goals (and Rules)

– What the system is trying to achieve

• Structure (as a whole)

– Connectivity, information flows, resources, trust

• Feedback and delays

– Self‐regulation, self‐reinforcement, adaptation

• Structural Elements

– Subsystems, actors, and the physical structure of the system

Pre‐Meeting Survey Actions Sorted into Intervention Level Framework 

• Paradigm

– View evaluation as critical and vital for learning and improvement.  

– View and trust evaluation as being relevant to policy development

– Consider evaluation an integral component of program planning

• Goals (and Rules)

– Establish evaluation  as a core element of accountability within public health

– Shift focus from evaluation for assessment to evaluation for learning and improvement.  

– Shift the evaluation culture away from something that ‘is being done to’ people in a negative or blaming way, towards something participatory, non-blaming and perhaps even appreciative, no matter what is found 

– Encourage and support evaluation as one of the key instruments of program or policy development


Pre‐Meeting Survey Actions Sorted into Intervention Level Framework 

• Structure (as a whole)

– Budget for standardized data collection systems that support comparisons across jurisdictions

– Develop a common language

– Develop standardized evaluation tools

– Provide clearly defined indicators

– Establish collaboration/linkages/partnerships that foster knowledge development, transfer and exchange

– Make it commonplace for organizations to have an evaluation department or do evaluation, much as they have finance staff and finance departments

– Start evaluation planning before the initiative

Pre‐Meeting Survey Actions Sorted into Intervention Level Framework 

• Feedback and Delays

– Build evaluation into decision‐making timelines

– Expect evaluation results before making decisions about a program, but not before the program has had a chance to show some impact

– Report on evaluation findings quickly so that they can inform program improvement, policy development and funding decisions

– Reward/value evidence‐based policy and management

– Unlink evaluation from monitoring and accountability functions; remove it from performance measures


Pre‐Meeting Survey Actions Sorted into Intervention Level Framework 

• Structural Elements

– Evaluation Process

– Funding

– Communications / Information Flows

– Advance Methodology

– Build Capacity

– Collaboration / Communities of Practice

Distribution of Effort / Resources (?)

[Slide chart: distribution of pre-meeting survey actions (scale 0–60) across the intervention levels Paradigm, Goals, Structure, Feedback and Elements, with a “speculative gap” annotation.]


Part 2: What are public health assets & system opportunities that could contribute to evaluation for learning?

Kerry Robinson, PhD
Centre for Chronic Disease Prevention & Control
Public Health Agency of Canada

Goals

• Build into public health goals, strategies, and standards for accountability
– Core public health functions: surveillance, population health assessment, health promotion / disease & injury prevention, health protection… + capacity building & knowledge exchange?
– Public health core competencies:
  1.4 Use evidence to inform policies and programs
  2.1-2.6 Assessment and analysis
  3.0 Policy and program planning, implementation & evaluation
– Example: Ontario Public Health Standards
  - Principles of need, impact and capacity
  - Foundations of Research and Knowledge Exchange


Structure

• Public Health Network: network of PHAC, Provincial and Territorial Ministry Public Health leaders
– Council & Expert Groups
• National Collaborating Centres for Public Health
– Methods & Tools, Healthy Public Policy, Determinants of Health
• PHIRIC (Population Health Intervention Research Initiative of Canada)
• Evaluation teams within PHAC, Provinces/Territories, Regional Health Authorities/Public Health Units

Structural Elements: Funding & Evaluation Process

• PHAC Grants & Contributions program requirements:
– Extend program funding term & amount to incorporate evaluation for learning
– Revise reporting (and exchange/dissemination) requirements
– Support sharing and exchange mechanisms for evaluation methods as well as findings/learning (G&C Database)
– Evaluation planning: participatory, range of methods/approaches, combination of in‐depth evaluation and evaluation synthesis
• Intervention research funding opportunities:
– PHAC Innovation Strategy: Mental Health & Obesity
– CIHR Institute of Population & Public Health: intervention research proposal funding calls


Structural Elements: Communication Flow / Collaboration

• Repositories/Information Hubs:
– Canadian Best Practices Portal
– CAPTURE platform
• Networks & Communities of Practice:
– PHAC Population Health Evaluators Network
– Dialogue Public Health (NCC Methods & Tools)
– Health Promotion Clearinghouse Network (Nova Scotia)
– Health Nexus & OPHA networks (Ontario)
– Others?

Structural Elements: Capacity Building & Methods

• PHAC Skills Enhancement Program: on‐line facilitated learning courses
– Epidemiology, evidence‐informed planning… + evaluation?
• Workshops/on‐line learning:
– NCC Methods & Tools – Evidence-informed decision‐making
– OPHA – Towards Evidence Informed Practice – tools, training
– The Health Communication Unit – Evaluation process & workshops
– Canadian Evaluation Society workshops/learning events
• Masters of Public Health Program: curriculum development


Part 3: How do we make it happen?

Barbara Riley, PhD
Propel Centre for Population Health Impact

Three Principles

1. Think like a system

2. Be the change we want to create

3. Learn jujitsu


1. Think like a system

• Paradigm: What are existing beliefs related to evaluation for learning?

• Goals: What are the goals of evaluation for learning?

• Structure: What structures exist or could be created?

• Feedback and delays: What are mechanisms for providing evaluation feedback? 

• Structural elements: Who are key players –individuals and organizations? How do you currently understand their roles and assets?

2. Be the change we want to create

• What can each of us do given our current mandates and responsibilities?
• What examples can be built on to demonstrate a new paradigm for evaluation?
• How can we individually and collectively lead by example?
• How might we take stock of progress with evaluation for learning, adapt and re‐focus our efforts – individually and collectively?


3. Learn jujitsu

Psychology 101: Piaget’s assimilation and accommodation

Strategy 101: Vision → Action

• How can evaluation for learning become part of existing agendas? 

• How can we adapt our long‐term goals to be a realistic stretch in the short‐term? 

• What are the strongest assets that could be used to advance evaluation for learning?