
Evaluation of “Building Capacity to Manage Water Resources and Climate Risk in the Caribbean” by Columbia University and University of the West Indies/Centre for Resource Management and Environmental Studies (UWI/CERMES)

By Program Evaluators Todd A. Eisenstadt and Stephen MacAvoy of American University for Higher Education in Development (HED)1

April 29, 2015

Contents of this Document

I. Partnership Overview
II. Best Practices – Climate Change Science
III. Best Practices – Hydrology and Meteorology related to Climate Change
IV. Best Practices – Barbados and Region
V. Evaluation Methodology
VI. Data Collection Plan
VII. Data Collection Tools
VIII. Plan for Structural Data Analysis
IX. Limitations of Evaluation
X. The Object of the Evaluation: Overall HED Criteria Project Compliance
XI. Evaluation of CERMES and CIMH Products with Regard to HED “Guiding Question” Criteria
XII. Evaluation of CERMES Products (Tables 1-3)
XIII. Evaluation of CIMH Products (Tables 4-6)
XV. Overall Project Achievements and Limitations
XVI. Recommendations
Appendix Documents:
Table 1A. HED Methodology Guiding Questions
Table 2A. Comparisons with other Caribbean nations: EPI, CRI, Fresh Water
Table 3A. Comparisons with other Caribbean nations: Water Resources
Table 4A. Columbia University Final M&E Objectives
Table 5A. Caribbean M&E Table (RF, PMP, PIP)
Figure 1A. Barbados population density
Sample of Questions Asked of Stakeholders
WACEP – Overall Evaluation
Sources Cited and List of Interviews Conducted

1 The authors thank American University doctoral student Daniela Stevens Leon for research assistance.


I. Partnership Overview

With a population of merely 280,000 people and a GDP per capita of over $15,000 per year, Barbados is a relatively affluent member of the dozen small island developing states in the Caribbean Basin (see Table I for climate change comparisons with other Caribbean nations). While the country has a moderate Environmental Performance Index score (ranking 118th out of 178 nations in 2014),2 indicating moderately successful implementation of environmental policy, the island is nevertheless threatened by climate change. Perceived recent changes in precipitation patterns, and sea level rise causing possible saline intrusion, may have prompted a dramatic scarcity of fresh water on the island. While Barbados’ vulnerability has been low,3 the island is located at the edge of the Atlantic storm zone and has been affected in recent years by tropical storms. This is especially troublesome as the country relies directly on tourism for 11.8 percent of Gross Domestic Product (GDP) annually and indirectly for 36.2 percent of GDP (which includes wider effects from investment, the supply chain, and induced income impacts). More broadly, the service sector (tourism and related industries) constitutes some 80 percent of the island’s economy (Ministry of Finance and Economic Affairs, 2013).

The biggest health threat from climate change that Barbados has faced most recently is chikungunya, which, like dengue, is spread through mosquitoes and causes fever and joint swelling. Barbados also has the highest rate of dengue fever in the Americas (World Health Organization, 2015), due in part to the shortage of sanitary drinking water. International organizations have helped Barbados mitigate climate change damage. For example, Barbados is part of the first global project on public health adaptation to climate change, launched in 2010 by the World Health Organization (WHO) and the UNDP to treat Barbados wastewater. In 2007 the island joined the Caribbean Community and Common Market’s (CARICOM) Caribbean Catastrophe Risk Insurance Facility (CCRIF), which covers nations against hurricane damages. Unlike some Caribbean nations, such as St. Lucia, which suffered severe losses from Hurricane Tomas in 2010 and severe rainfall damage in 2013 (“Accounting for Climate Variability: Time to Move On?”), Barbados has not suffered severely from a hurricane since 1955,4 and scientists and forecasters say this has made it difficult to create a sense of urgency to force the dedication of extensive resources to disaster planning and mitigation.

2 See Hsu, A., et al. 2014, The 2014 Environmental Performance Index.

3 While several interviewees invoked the folk saying that “God is Bajan” when asked why Barbados had been spared from most hurricanes, the scientific reason is more likely that the island lies south and east of the frequently traveled tropical storm track through the Caribbean.

4 Barbados did suffer some damage from Tomas, mostly from the related rains, and did receive a payout from the CCRIF.


In 2011, USAID and the Office of Economic Policy and Summit Coordination in the U.S. Department of State put forth a call for proposals for a U.S. university to partner with the University of the West Indies/Centre for Resource Management and Environmental Studies (UWI/CERMES) on a three-year program to train students at UWI in climate adaptation. According to the call for proposals, “the partnership will focus on research and policy and will result in tangible products such as expanded research, outreach to policy makers, short-course development for academia, public and private sector audiences and strategic planning to secure long-term funding (Request for Applications).” Seeking to help address a shortage of highly trained water and climate risk managers in the Caribbean region generally, UWI/CERMES sought to partner with the Caribbean Institute for Meteorology and Hydrology (CIMH) and three Columbia University institutes: the International Research Institute for Climate and Society (IRI), Columbia’s Water Center (CWC), and the Center for New Media Teaching and Learning (CCNMTL). Higher Education in Development (HED), the non-profit implementer that manages and evaluates the administrative, economic, and legal aspects of United States Agency for International Development (USAID) assistance to US universities and international partners for the transfer of applied knowledge and research in development, managed the project on behalf of the US government and commissioned this independent evaluation. In the sections that follow, this report describes the main components of the project, discusses best practices with regard to those components (and the teaching of climate change science more generally), and then elaborates the methodology of the evaluation.

Broadly speaking, the project sought to build human capacity to foster the creation of applied knowledge, education, and training in several areas. First, the project sought to develop and offer short courses for Caribbean environmental and water specialists in the public sector “engaging a range of topics related to water resource management, climate variability and climate change (Columbia University, 3).” Second, the project partners sought to construct a long-term research agenda to address Caribbean-wide issues relating to water management and climate change adaptation. Third, the partners proposed to create and strengthen a Caribbean-wide “community of practice” through contact, communication and exchange with Columbia University, the CIMH and CERMES (Columbia University, 8).

Concretely, the programs to be evaluated include, first, the establishment of UWI courses in: 1) Introduction to Climate & Water Resources, 2) Climate Information & Predictions from Seasons to Decades, 3) Tailoring Climate Information for the Water Sector, and 4) Using Climate & Water Information to Inform Policy. Second, to pursue a common research agenda, CIMH, CERMES, and Columbia hired a post-doc to “explore a research agenda that ties together all parties (Columbia University, 11),”5 enrolled three Caribbean-based students in Columbia University’s MA program in Climate and Society, and set up exchange visits to identify modes of collaboration. Third, to establish a “community of practice” involving the three country campuses of the University of the West Indies (Barbados, Jamaica, and Trinidad and Tobago), the project sought to construct a virtual network for information sharing on climate change adaptation. Before elaborating criteria for evaluating each of these activities, we discuss best practices in each area.

II. Best Practices – Climate Change Science

In terms of best practices, several aspects of the partnership between Columbia University’s IRI, CERMES and CIMH were examined. CERMES’ goal, as stated in the proposal, was to develop effective means of educating students regarding climate change issues, both from 1) the basic science literacy perspective and 2) the perspective of implications for the Caribbean region. CIMH and CERMES provided the technical deliverables and data to enhance and enable the learning objectives, as executed through the CERMES classes and CIMH trainings. Columbia’s role was to provide rigorous models of what is effective for teaching (HED, 2015). In the curricular area, the program sought to construct modules and case studies that, when combined with CIMH information, would enhance the climate change curriculum. Ultimately, the partnership goal was to develop a cadre of water resource professionals with the tools, both technical and analytical, to meet the changing hydrological needs of the Caribbean region. If the partnership produced a high quality educational product, efforts to sustain the collaboration beyond the grant period would help achieve that target objective at the level of outcomes as well as outputs.

Each educational module/case study developed in this partnership was evaluated on whether it included some aspect of STEM (Science, Technology, Engineering and Mathematics) knowledge applied to the particular issue or problem at hand. Activities that increased knowledge of climate change concepts should have been used directly with students. Below we address more specifically the structure such a curriculum could take.

Best practices enable climate literacy, a core goal (Climate Literacy: The Essential Principles of Climate Science, 2009, p. 2). Climate literacy entails producing students who 1) are aware of essential principles of Earth’s climate, 2) know how to assess (and access) scientifically credible information, 3) communicate effectively, and 4) make informed decisions.

5 While CIMH commenters on the initial draft of this report stated that no post-doctoral fellow was hired, but rather a research associate, Columbia project manager Curtis confirmed that the hire was made against the post-doctoral fellow position.

With these overall goals in mind, climate change best practices can be distilled to a few points, enumerated below. We have included what we intend to look for as evidence for addressing each goal, but recognize that the partners may effectively approach each goal using different information.

1. The program must enable a solid understanding of “climate function.” Students must demonstrate scientifically credible data assessment. Understanding and using climate data generated from remote sensing is important. Understanding basic principles of climatology, including air mass movements with the seasons, large-scale oscillations in ocean basins, and paleoclimate proxies, is also central. Topics for study may include, but need not be limited to, those listed below (Climate Literacy: The Essential Principles of Climate Science, 2009; AAAS Benchmarks for Science Literacy):

1A. Be aware of essential principles of Earth’s climate. This should at a minimum include: 1) radiation and heat distribution, 2) interactions between atmosphere and ocean, 3) the global carbon cycle, 4) general circulation models, and 5) natural sources of climate change.

1B. Know how to assess (and access) scientifically credible information. Demonstrate competency in basic information literacy regarding the quality of source material, i.e., government versus private sources of information and the possible agendas of non-governmental organizations in organizing and distributing information. Become familiar with remote sensing data and analysis.

1C. Understand changes in greenhouse gas emissions over time, but also land use change (rice, livestock, deforestation, crop productivity) and how these are affecting the global carbon cycle.

2. Standards require understanding of earth processes. Lessons must allow students to demonstrate knowledge (demonstrate what they learned in 1, above) (Climate Literacy: The Essential Principles of Climate Science, 2009; AAAS Benchmarks for Science Literacy).

2A. Students should become familiar with using tables and graphs of data. They must evaluate trends and discuss/distill key findings.

2B. Understand the earth as an interconnected system: students should demonstrate how biological, hydrological and geological cycles are interlinked. That understanding should be reinforced with case studies showing effects.

2C. Students should be familiar with how we know there is anthropogenic climate change in the first place. They must demonstrate the ability to recognize the signs pointing to disrupted cycles with assignable causes. They must also demonstrate an ability to interpret data, evaluate trends, and reason about causation, and they should differentiate between correlation and causation in the climate change context.

3. Feedback loops: examples of how positive feedback loops work (Climate Literacy: The Essential Principles of Climate Science, 2009; AAAS Benchmarks for Science Literacy) must be part of any training. Examples may include, but are not limited to, the following:

3A. Feedback loops associated with earth’s changing albedo as ice cover changes: for example, warming reduces ice, which reduces albedo, which exacerbates warming.

3B. A potential feedback loop associated with methane hydrate (clathrate) in permafrost and oceans: for example, warming may destabilize hydrate, releasing methane that may cause more warming.

3C. Ocean warming may bring warmer currents into contact with glaciers and ice sheets, accelerating melt, which in turn reduces the earth’s albedo and reinforces warming.
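To make the positive-feedback structure of examples 3A-3C concrete, the toy loop below iterates a cycle in which warming lowers albedo and the lower albedo adds further warming. It is a minimal illustrative sketch with invented coupling constants (the variable names and values are ours), not a climate model.

```python
# Toy ice-albedo positive feedback: warming -> less ice -> lower albedo -> more warming.
# All constants are invented for illustration; this is not a climate model.
albedo = 0.30               # fraction of sunlight reflected
temp_anomaly = 0.5          # warming relative to baseline, in degrees C
ALBEDO_SENSITIVITY = 0.01   # assumed albedo drop per degree of warming
WARMING_PER_ALBEDO = 10.0   # assumed extra degrees of warming per unit albedo drop

for step in range(1, 6):
    albedo_drop = ALBEDO_SENSITIVITY * temp_anomaly
    albedo -= albedo_drop                              # warming reduces ice cover
    temp_anomaly += WARMING_PER_ALBEDO * albedo_drop   # less reflection, more warming
    print(f"step {step}: albedo = {albedo:.3f}, anomaly = {temp_anomaly:.2f} C")
```

Because each pass through the loop amplifies the previous warming, the anomaly grows at every step; that self-amplification is what distinguishes a positive feedback from a simple forcing.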

4. Consequences: Students should show an understanding that human activities are impacting the climate system and that there are real consequences.

4A. The health of human populations along coasts may be among these consequences. The current rate of sea level rise, roughly three millimeters per year, is rapid, well above the average rate observed during the 20th century. What does that mean for coastal economic development or outlook? What about displaced people? Can either be forecast? For Caribbean nations such as Barbados, this is a national security and existential issue.

4B. Some ecosystems are undergoing structural changes, including in plankton productivity, terrestrial primary production, and ecological function (the metabolic activity of systems). For example, shifting plankton populations alter the base of oceanic food webs and could impact fisheries. Given that roughly 10,000 kg of plankton is needed to grow 1 kg of tuna or other piscivorous fish (Garrison, 2011), what are the fishery outlooks for the Caribbean if plankton populations are shifting? (The arithmetic behind that ratio is sketched after item 4C below.)

4C. Hydrological cycle changes are already underway. The intensity and duration of rain events may be changing. Seasonal fluxes (sometimes associated with monsoons in other parts of the world) are changing. There will be a strain on water infrastructure and availability (see below for best practices related to hydrology).
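The 10,000:1 plankton-to-tuna ratio cited in item 4B follows from the textbook assumption of roughly 10 percent energy transfer per trophic level across about four levels. A minimal sketch of that arithmetic (the values are the standard classroom figures, not project data):

```python
# Trophic transfer arithmetic behind the ~10,000:1 plankton-to-tuna figure.
# Assumes the classic ~10% energy transfer per trophic level and four levels
# between phytoplankton and a piscivorous fish such as tuna.
transfer_efficiency = 0.10
trophic_levels = 4  # plankton -> zooplankton -> small fish -> mid-level fish -> tuna

tuna_kg = 1.0
plankton_kg = tuna_kg / (transfer_efficiency ** trophic_levels)
print(f"{plankton_kg:,.0f} kg of plankton per {tuna_kg:g} kg of tuna")  # 10,000 kg
```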

5. Communication: Students should understand that meaningful, accurate and accessible delivery of science information is critical for it to have value. This is particularly true for communication regarding climate science, which is not only complex but also involves projections of the future behavior of large systems (Nisbet, 2009). Students should be able to do the following:

5A. Identify media appropriate to a target audience.

5B. Learn techniques for communicating dense content to a lay audience with limited time.

5C. Learn effective methods to reach those who 1) make policy decisions and 2) will be directly affected by climate change events.

6. Adaptation/Mitigation: The climate is changing, with real impacts on many communities. Although addressing the root causes is a global political, scientific and economic issue, managers and policy makers on the ground need tools to help defend against or address changes happening now. They also need information to hedge against future impact scenarios. Students should be able to offer guidance based on the best science available. They should know methods for mitigating risk where needs are greatest, and help identify areas where policy makers can make effective decisions. Topics to be considered include the following:

6A. Improve or sustain natural coastal defenses, such as barrier islands, wetlands, tidal marshes.

6B. Improve built defenses, such as sea walls, and strengthen construction standards (e.g., elevating buildings).

6C. Know how to use forecasting effectively. For example, modification of infrastructure may be necessary to manage more intense rainfall, and students should be able to forecast amounts. Looking longer term, students should be able to identify what areas are at risk of ocean incursion and advise policy makers on when mitigation measures need to be in place. Students should be able to help identify where and when populations are at physical risk.

III. Best Practices – Hydrology and Meteorology related to Climate Change

Hydrology: Threats to water availability are numerous. Some are widely recognized as related to changing climate, specifically: the frequency and intensity of storms, changing season onset (an earlier spring or a shift in monsoon timing, for example), erosion, saltwater incursion from rising sea levels, and wastewater infrastructure damage, among others (Arnell, 2006). Some threats are not directly related, such as pollution problems, the availability of potable water for growing urban populations, and a lack of distribution equity among users.

The partnership goal was to develop a cadre of water managers equipped with the science and technological tools to craft policy in an era of climate change, particularly in the Caribbean area (Columbia, 2011).6 Best practices for climate education to meet this objective in the context of island hydrology have not been formulated, although understanding hydrological changes is part of any general climate change curriculum (Climate Literacy, 2009). Our review will examine the partnership’s goals for advancing hydrological expertise in the region based on the proposed goals (as articulated in Building Capacity to Manage Water Resources and Climate Risk in the Caribbean (Columbia, 2011)). We will focus on what was done and why, and how actions reflected goals or target outcomes.

Understanding what climate change may bring, and mitigating or adapting to the expected challenges through policy or infrastructure development, will require a number of components. The following are considered key areas for water resources managers in an era of climate change (UNESCO-IHE, International Hydrological Program, 2015):

1) Urban resilience: as increased intensity events occur, is infrastructure robust enough to weather extreme events that will become more frequent?
2) Erosion: an increasing rate of coastal erosion is likely, and mitigation efforts will be necessary. Both hard and soft measures should be taken to adapt and reduce risk.
3) Soil conservation: land use policy toward soil conservation (in the face of increasing erosion) must change in light of changing rainfall patterns.
4) Data access: data must be available and shared to make information accessible to researchers everywhere. This information should also be available to stakeholders (farmers, for example) in a form that is useful to them.
5) Risk analysis: geological and flood histories need to be expressed as 100-year and 500-year probabilities (see the sketch following this list).
6) Modeling: tools need to be improved and adapted as higher resolution and more robust statistical analysis become available.
7) Training: exercises need to be integrated into the curriculum.
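To make item 5 concrete: a “100-year” event has an annual exceedance probability of 1/100, and the chance of experiencing at least one such event over a planning horizon follows directly from the complement rule. The sketch below shows this standard calculation (the function name is ours, for illustration):

```python
# Return-period arithmetic: a T-year event has annual probability p = 1/T,
# so P(at least one event in n years) = 1 - (1 - p)**n.
def prob_at_least_one(return_period_years: int, horizon_years: int) -> float:
    p_annual = 1.0 / return_period_years
    return 1.0 - (1.0 - p_annual) ** horizon_years

# Chance of at least one "100-year" flood over a 30-year planning horizon: ~26%.
print(f"{prob_at_least_one(100, 30):.1%}")
# Chance of at least one "500-year" flood over the same horizon: ~5.8%.
print(f"{prob_at_least_one(500, 30):.1%}")
```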

Meteorology: For stakeholders, short-term meteorological forecasting is often what is most necessary. Medium-term forecasting, based on patterns of seasonal drought, changes in annual rainfall, and storm frequency, is also important. Climatological projections, however, are often larger scale with respect to space and time. Applied meteorology brings insights from climate trends into forecasting and is necessary for adapting to changes rather than merely documenting them.

However, best practices for meteorology in the context of climate change have not been formulated. This is especially true of curricula for graduate schools or post-graduate training programs that equip professionals with applied tools. That said, topics that regularly appear in meteorological graduate curricula and texts can be compiled and compared to the topics contained in the CIMH trainings (which is what we do below). Because CIMH trains climate professionals who are meteorologists, climatologists, or academic researchers, the workshop coverage can be expected to be narrowly focused and less concerned with including overarching concepts (with which the participants will already be familiar).

6 CIMH commented that the institution “definitely did not work toward that partnership goal.”

Using graduate curricula in applied meteorology from Plymouth University, the University of Maryland and the Florida Institute of Technology, as well as published texts, we have developed a range of topics that would be covered by an applied meteorology program. They include: 1) the development of applied impact assessments; 2) tools of research, including ground-based instruments, remote sensing, statistical tools, models and management techniques; 3) the physical and biological environment (as impacted by climatology), including hydrology, soils, animals and vegetation; 4) culture and climate, including human health, agriculture, forestry, fisheries, commerce, planning and energy production; and 5) challenging environments, such as urban areas, polluted areas and climate as a human hazard.

IV. Best Practices – Barbados and Region

Barbados, and the Caribbean region in general, has intrinsic interests in climate change outcomes, which may be slightly different in focus from the general interest in anticipating outcomes. For example, Barbados has an existential interest in sea level rise, but much less of an interest in the frequency of tornadic storms. A series of “best practices” for climate change education for the Caribbean has not been well established; however, the specific interests of the region have been identified (Mullholland et al., 1997; Pulwarty et al., 2010; Poore et al., 2008). We would expect topics covered by a science curriculum aimed at providing forecasting tools for practitioners in the Caribbean region to include (but not be limited to): seasonal rain intensity models; scaling techniques, regional to local; methods for incorporating uncertainties into model projections; database use and data access; appropriate statistical methods (formulating percent likelihoods, etc.; a sketch follows below); and integrating anthropogenic climate change into models of the Atlantic Oscillation and the Walker Circulation (impacted by El Niño); among others.
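As one concrete reading of “formulating percent likelihoods”: regional seasonal outlooks (such as those issued through the CariCOF process) are commonly expressed as tercile probabilities. The sketch below converts a forecast ensemble into below/near/above-normal likelihoods; all numbers are invented for illustration, not observed Caribbean data:

```python
# Tercile "% likelihood" sketch: compare forecast ensemble members against
# climatological tercile boundaries. All values below are invented.
import numpy as np

climatology = np.array([610, 540, 720, 480, 650, 590, 700, 560, 630, 500])  # mm, past seasons
ensemble = np.array([520, 495, 560, 610, 475, 530, 500, 545])               # mm, forecast members

lower, upper = np.percentile(climatology, [33.3, 66.7])  # tercile boundaries
p_below = np.mean(ensemble < lower)
p_above = np.mean(ensemble > upper)
p_near = 1.0 - p_below - p_above
print(f"below normal: {p_below:.0%}, near normal: {p_near:.0%}, above normal: {p_above:.0%}")
```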

Specific topics of interest may also include (but not be limited to): 1) effective communication techniques (to non-science stakeholders), 2) using area NGOs effectively (getting the message out and generating interest), and 3) developing regional networks of expert users (communities of practice).


V. Evaluation Methodology

Following USAID and HED methodologies, this evaluation employs what HED has defined as an “evaluative case study approach.” As per HED instructions (HED, 2015), the approach starts from “in-depth description and analysis” of a particular case (here, climate change education in Barbados), and takes “account of the case as a whole and its surrounding context (HED, 2015).” The evaluation addresses how the program operates, why the program operates the way it does, and describes program outcomes (lower order) and impacts (higher order) and how these emerged (whether planned or unplanned, negative or positive). Following HED recommended practices, we make assertions about program functioning based on preliminary research and “test” these assumptions in the field through a series of 30 interviews with a wide range of project principals and stakeholders. Hence, following best practices in program evaluation, we mix an inductive approach (gleaning tendencies and assumptions through an initial round of research) with a more deductive round of research (confirming or disconfirming these assumptions through interviews and data collection).

We also adhere to the HED recommended practice of gathering information via “thick description,” an inductive approach to gathering targeted information in the field while also developing a sense of the general context of the study. As defined by the anthropologist Clifford Geertz, thick description is an account that conveys not only facts but also the conceptual structures and meanings that make up a culture. Such meanings are extracted via interpretation of local behaviors and truths; that is, thick description goes beyond pure observation (Geertz, 1973:6). Finally, as per HED recommended practices, we also undertake “progressive focusing,” which implies two practices: 1) that we updated instruments over the course of our study, and 2) that we “process traced” (Gerring, 2012); that is, we longitudinally studied the set of practices constituting the project, from start to finish, rather than just seeking to understand a cross-sectional “snapshot” at the temporal moment of our study.

Within this methodology, we sought to evaluate five basic criteria, as set out in the Scope of Work (HED, 2015): 1) the efficiency with which the project was executed (comparing outputs to inputs); 2) the relevance of the project to priorities of the donor, recipient groups, and stakeholder beneficiaries; 3) the effectiveness of activities in attaining objectives; 4) the (higher order) impacts of the project, as well as its direct outputs (which are considered “efficiency”); and 5) the sustainability of the project, or the likelihood of its continuance even after donor support ends. While all of these criteria are of central importance, sustainability was a central focus, given that the three-year program is in its final months.


VI. Data Collection Plan

The project sought to “drill down” to collect facts through several procedures. After collecting background information about the project’s objectives and “fit” with USAID and HED projects, we considered self-reporting via the Results Framework, Partnership Management Plan, and Partnership Implementation Plan. The desk study of these documents, and of others solicited from project principals early in the evaluation, allowed us to establish the assumptions tested via field research. In that later stage we utilized four specific methods to gather data, evaluate the assumptions (or hypotheses), and draw conclusions. First, we devised a questionnaire for interviewing project directors and staff, project beneficiaries (students, climate analysts, public officials), and objective observers (stakeholders in the dissemination of climate change knowledge and technical training in Barbados and elsewhere in the Caribbean who were not directly involved in the project). Second, we obtained, where possible, objective quantitative data about enrollments in classes, “profiles” of students, placements of these students upon finishing training in the project, and whether and how knowledge was diffused at the university, in Bridgetown and beyond, and in the Caribbean Basin, which was also part of the proposed work. Third, we qualitatively assessed, against a “best practice” baseline and our own experience, the content of course offerings, other training, and policy briefs. Fourth, we tested our assumptions as participant observers seeking to absorb the “thick description” of the context of the project (considering it as a dynamic process rather than a momentary “snapshot”) and made observations from that perspective. We elaborate on each of these procedures below.

Using flow sheets from Higher Education in Development (“Objectives through Outputs”), we were able to identify where a particular activity or intention was successful and where it was lacking or unsuccessful. Most of the information we started with is contained in the monitoring plans and progress reports supplied by the participants. We developed a questionnaire (Appendix 1) to ask of stakeholders, and we conducted more open interviews with staff, participants and other stakeholders during our site visit in March. These three sources of data illustrate our model of how the partnership implemented the plan to achieve the proposed objectives.

Qualitative analysis frequently involves coding keywords and concepts in order to aggregate these into meaningful patterns. We expected to follow this logic by undertaking our fieldwork with the goal of disaggregating broader theoretical interests, moving from the more deductive to the more inductive. At the same time, it was our aim to have inductive data aggregate from the empirical realm to that of more theoretical generalization. While we identified phrases and concepts that emerged in our analysis, we did not expect to gather enough data in a systematic fashion, through a score of interviews during our short deployment, to generate quantitative patterns from our qualitative analysis. We have, however, been attentive to the possible development of such patterns, and identify them as they emerge. In essence, we moved between categories and coding and back again in the analysis and classification of interview notes.
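To illustrate the keyword-coding step this paragraph describes, the sketch below tallies how often candidate codes appear across interview notes. The codes and note excerpts are invented placeholders, not actual project data:

```python
# Keyword-coding sketch: count how many interview notes mention each code.
# Codes and notes are invented placeholders for illustration.
from collections import Counter

codes = ["coordination", "sustainability", "outreach", "training"]
notes = [
    "Training workshops were valued, but coordination between partners lagged.",
    "Outreach to water managers needs sustained funding; sustainability is unclear.",
    "Coordination improved late; training materials will outlive the grant.",
]

counts = Counter()
for note in notes:
    text = note.lower()
    counts.update(code for code in codes if code in text)

for code, n in counts.most_common():
    print(f"{code}: appears in {n} of {len(notes)} notes")
```

In practice, such a tally is only meaningful with a far larger set of systematically collected responses, which is precisely the limitation noted above.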

VII. Data Collection Tools

TOOLS FOR METHOD 1: Questionnaire and Questionnaire Analysis (Presented in Appendix)

The central methodology was to conduct interviews with stakeholders. Most of these interviews were conducted during a trip by the two evaluators to Barbados in March 2015, but others were conducted via telephone or Skype upon our return, and a preliminary draft report was sent with a ten-day comment period to principals from the contract and the sub-contracts on April 1, with HED program officers also having a similar period to offer comments. A comprehensive set of questions was prepared to ascertain the project’s fulfillment of objectives. The questions were based, in part, on project outcome materials (and other primary sources), as mentioned below under METHOD 2. They also helped the evaluators understand the context and structural circumstances, as mentioned under METHOD 3. The questionnaire was the most important research instrument, and hence the battery of questions to be asked and the stakeholders to be interviewed were developed during the weeks prior to the field work. The questionnaire is presented in the Appendix, and the list of interviewees is given in the bibliography.

TOOLS FOR METHOD 2: Analysis of Project Outcome Materials

To better glean an understanding of the actual project results, we assessed the texts of the “products” we could access (course syllabi, newsletters, policy briefs) to evaluate relevance and impact, and asked project administrators to comment regarding the efficiency, effectiveness, and sustainability of these outputs.

TOOLS FOR METHOD 3: Contextual/Structural Analysis

This portion of our analysis evolved as we interviewed officials and received other exposures to the project and its outcomes.

VIII. Plan for Structural Data Analysis

Our quantitative analysis was descriptive, as we did not have sufficient data to fully establish causal relationships. However, given the call for a structural component of the evaluation, we strived to start from “in-depth description and analysis” of climate change education in Barbados, and to take “account of the case as a whole and its surrounding context (HED, 2015).” To achieve this beyond the well-prescribed evaluation exercises listed above, we sought to gain knowledge of the context and of other particular questions which arose but were not addressed in the Appendix I questionnaire.

As a precursor to understanding what some have viewed as a lack of cooperation between CERMES and CIMH on this project, it is important to note that formal cooperation between the two groups was never agreed upon, according to the two principals. According to CERMES Director Adrian Cashman and CIMH Chief of Applied Meteorology and Climatology Adrian Trotman (David Farrell, CIMH Principal, was not available to interview), there was never any direct cooperation spelled out in the project, although donors (Blackwood, Hutchinson, Royer) argued that a vital synergy between the two institutes should be forged so that the important data generated by CIMH could be contextualized, applied, and interpreted for the public by the faculty at CERMES.7 This is an important note: closer collaboration between the two entities, while highly recommended in future projects, needs to be institutionally required by the donor. In this case, USAID/HED apparently “told” Cashman, after he had submitted his bid with Columbia, that CIMH was also going to be a subcontractor. There was no formal linkage between the institutions because, apparently, such collaboration was never initially conceptualized by the directors of the institutes or by Columbia IRI, but rather was “cobbled in” later, after the program was already conceived.8 While CERMES and CIMH independently constructed excellent “expanded research” and “short-course development for academia, public, and private sector audiences,” both groups fell short on “outreach to policy makers” (although the CERMES “discussion briefs” were a step in the right direction, and CERMES, CIMH, and IRI principals were, at the time of submission of this evaluation, planning to meet in the late spring to discuss project outreach efforts and heighten these with Caribbean water managers) and on “strategic planning to secure long-term funding (Request for Applications).” In the following sections we consider the CERMES and CIMH contributions separately as well as evaluating their contributions to the overall partnership, offer lessons learned from each institutional partnership and the project as a whole, consider how more collaborative future arrangements might look, and offer ideas for future joint opportunities to appeal more to policy makers and conduct joint strategic planning.

7 CIMH commenters pointed out that this was not formally part of their sub-award.

8 CIMH commenters wrote that “such collaboration was conceptualized in the preparation documents.”


IX. Limitations of Evaluation

In the course of our individual conversations with stakeholders receiving products from the partnership and with professionals generating and delivering the content, we identified trends in what appeared to work, as well as consistent problem areas. We sought problems with assumptions that may have been made during project development, and disconnects between indicators of success and objectives as outlined in partnership plans. While we cannot reduce the analysis to a cause-and-effect model, we can determine the frequency of challenges and successes from our discussions and their contexts. The evaluators, one with extensive social science evaluation experience and the other with extensive environmental science curricular design experience, were both present at key stakeholder interviews (and also conducted some separately). Data collected during the fieldwork were assessed daily to identify patterns in project successes and shortcomings which may not be as tangible as those addressed directly by the questions in the appendix.

Several limitations hindered the acquisition of perfect data for analysis. With regard to METHOD 1, Questionnaire and Questionnaire Analysis, we were not able to get through all questions with all respondents; in fact, for the last few interviews with HED officials, the focus was more on “filling holes” in the authors’ understanding than on covering the questionnaire. Furthermore, we did not gather enough data (two dozen responses) to systematically code responses, although we did informally track the consistency of responses and draw conclusions from patterns of reply. We sought to offset these shortcomings by interviewing as wide a range of stakeholders as possible, as well as representatives from each sub-contractor and sub-sub-contractor, and by triangulating information gleaned from interviews with that yielded by documents.

Regarding METHOD 2, Analysis of Project Outcome Materials, we were unable to systematically analyze budgets, as HED decided not to release that information. This limited our ability to evaluate cost-effectiveness under the Efficiency, Effectiveness, and Sustainability rubrics of the HED evaluation scheme and, perhaps more importantly, to get an idea of whether the fiscal prioritization of project deliverables was consistent with stated programmatic priorities. Additionally, the evaluators did not initially get access to the “scale up” documents generated when the project budget was doubled through additional USAID and State Department allocations. While the evaluators should perhaps have asked for those documents earlier, they were received only on April 20, when requested from Paez-Cook, rather than with the initial receipt of desk copy documents in February 2015. The evaluators did have access to the information in time to consider it for the final draft.


In reference to METHOD 3, Contextual/Structural Analysis, the authors of this evaluation may have been biased by their “bottom up” field interviews. That is, upon learning of the lack of coordination between CIMH and CERMES (mentioned above and discussed further in the following section), the evaluation authors should perhaps have sought explanations from the project contractors (IRI) and HED, rather than inquiring through interviews in Barbados with CIMH and CERMES officials who were not privy to decisions made at higher levels regarding the project. While CIMH in particular seemed to have administrative shortcomings offering partial explanations (and the evaluators’ inability to interview that institution’s principal did not help clarify circumstances), it was clarified later to the evaluators that failures of coordination between CIMH and CERMES needed to be resolved by Columbia IRI (Paez-Cook and Sanchez interviews). We address this issue in the section that follows.

X. The Object of the Evaluation: Overall HED Criteria Project Compliance

Overall, HED’s program results framework seeks to evaluate the translation of management efficiency and project relevance into effectiveness within a range of rubrics summarized in the Guiding Questions given in the Appendix (and addressed systematically in the paragraphs that follow, and in the context of individual products thereafter). As mentioned in the limitations section above, shortcomings in project management early on reduced the effectiveness of the totality of the project, although it did succeed as “the sum of its parts.” The evaluators found it difficult to find contents “bigger than the sum of its parts,” as the pattern of collaboration between IRI (the principal), CERMES (the initial subcontract), and CIMH (an additional subcontract added later) did not develop to full strength early in the project, and CERMES and CIMH, in contrast to subcontractors on most other HED projects (Paez-Cook and Sanchez interviews), did not formally work together at all. A partial exception would seemingly come at the very end of the three-year project, in late May 2015, when the three project principals (IRI, CIMH, and CERMES) have a formal event scheduled at the CariCOF meeting in St. Lucia to discuss lessons learned and common agendas moving forward.9

What caused this failure of coordination, acknowledged in one form or another by all the principals? There seem to have been several immediate causes. In the first place, CERMES was listed as the primary sub-contractor, and while CIMH was mentioned in the original contract, the State Department wanted to increase the project’s funding. According to Paez-Cook: “At first the primary partner was CERMES, and because of the expertise and role of CIMH in the region – as stated directly by USAID and State players in the region – CIMH had to be involved. The agreement spoke of leveraging synergies with CIMH, but the involvement of CIMH was very limited at the beginning.” Paez-Cook and Baethgen said CIMH’s role was to be further specified, but that this did not occur sufficiently early in the project,10 leaving questions about the coordination with CERMES and, specifically, about how IRI, CERMES, and CIMH would jointly participate in the nebulous (and never further specified) objectives of constructing a common research agenda, furthering communities of practice, and offering policy-relevant outreach via the synergies gained from joint collaboration. With regard to the “community of practice” project area, the HED December 2014 site visit to Barbados concluded as much: “… answers to these questions [about the type of “community of practice” and how it could be constructed sustainably beyond the life of the project] were never clearly defined and in hindsight, all partners acknowledge that they should have had more honest conversations about this all during partnership design (“Caribbean Region Climate Adaptation [CRCA] Monitoring Visit report”).”

9 The three participated together in the selection of MA fellowship recipients to Columbia, but scheduled no joint public events, as specified in the original contracts, to establish common research agendas, a community of practice, or public policy outreach. Separate efforts were made by CIMH and by CERMES.

Second, the reporting requirements of the project were more onerous than any of the partners expected (as evidenced in the HED “Columbia University Domestic Monitoring Visit” notes and interviews with Baethgen, Cashman, Paez-Cook, Curtis, Sanchez, Trotman, and Van Meerbeeck). As stated by Sanchez (interview), “USAID reporting requirements are what they are and need to be complied with. Columbia accepted those realities and allocated additional admin resources to attend to those issues so that problems did not continue.” However, according to Paez-Cook, Columbia IRI accounted for only 5 percent of Project Manager Baethgen’s time on the project, whereas most HED project managers dedicate some 70 to 80 percent of their time to their project and are intimately involved in “day to day activities.” Initially, Columbia IRI was unable, with existing personnel, to fulfill these reporting requirements, and this caused a delay in project start up, particularly at CIMH, which also seemed understaffed at that moment with regard to project administration.

As a consequence of the under-specification, early on, of the specific role for CIMH (whose role was stepped up by the outside US government funder rather than by project planners), possible administrative shortcomings by Columbia IRI as the project commenced, and a failure to overcome such communication failures as the project proceeded, the project wound up being an IRI-CERMES project and an IRI-CIMH project, rather than an IRI-CERMES-CIMH project. The evaluation proceeds to evaluate the CERMES and CIMH products, which were successfully executed, in accordance with the HED “Guiding Questions” criteria, in the remainder of this section. However, we note here, and in the conclusion and recommendations, that the initial failure to formally establish strong institutional relationships among all three partners, rather than just between IRI and the other two, was detrimental to the overall Impact (in terms of all Outcome Level Results and Expansive Effects), as well as to Sustainability: Program Effects (Institutionalization, Stakeholder Engagement, and Continued Relevance of Program Design). Furthermore, in accordance with the overall framework of the evaluation, the lack of decisive decision-making by Columbia/IRI very early in the project reduced the project’s effectiveness, even though the US sponsor did manage the project more actively after this initial period, allowing CERMES and CIMH to execute their roles in the project, even if they never interacted. Having established these complications, which we will return to later, we now proceed to evaluate the two “separate” projects (the CERMES and CIMH subcontracts) with regard to their fulfillment of the HED “Guiding Question” criteria, and then in accordance with industry best practices (with reference back to the HED criteria).

10 As pointed out in August 2012 notes (“Conference Call: Barbados Caribbean Region Climate Adaptation Partnership”), the main CIMH agenda of incorporating “disaster and climate risk management” into the project was not even considered until after the project had been awarded and was scaled up.

XI. Evaluation of CERMES and CIMH Products with Regard to HED “Guiding Question” Criteria

With regard to Efficiency (Management Systems), CIMH exceeded expectations, rebudgeting from conference lines to create budget room for elaboration of the CID product. Additionally, CIMH made do for six months without the hire of a program associate and managed to get the project underway. The reason CIMH did not have a program associate does, however, relate to Efficiency (Decision-Making Processes), an area where CIMH failed to meet expectations. Here, CIMH fell short because it failed to submit its initial baseline indicators in a satisfactory manner quickly enough to hire administrative assistance (“Memo for the Record”). Several interviewees (such as Baethgen and Blackwood) reported that CIMH needed better administration and outreach, and the institute’s failure hampered this HED project’s timely initiation.11 That said, a CIMH climatologist stressed that the institution is not equipped with the professional administrative staff to manage grant requirements as a mid-sized university would (Van Meerbeeck).

11 CIMH Principal Farrell, in comments on a prior draft, acknowledged that CIMH “recognizes the concern,” but argued that a set of conditions prompted the six-month delay, including the failure of CIMH officials to undertake HED project management training, no-fault delays in preparing baseline tool measurements, and the inability to hire a strong candidate for the post-doc position.


In the area of Relevance (Contextualization), CIMH exceeded expectations with regard to the creation of fine-grained forecasting tools and the creation of the Climate Information Database (CID). CIMH was able to take the forecasting tools of Columbia’s International Research Institute for Climate and Society (specifically the Climate Predictability Tool) and apply them to the Caribbean context based on stakeholder feedback given at the Caribbean Climate Outlook Forum (CariCOF) meetings. Indeed, interviewees (Royer, Nurse) agreed that CIMH had learned, over the course of the project, to convey the relevance and longer-term implications of its forecasts to relevant stakeholders, and in simplified ways. Additionally, the June 2015 release of the Climate Impacts Database was eagerly awaited by emergency management personnel (Royer, Thomas), who expected it to offer much-needed Caribbean-relevant information. Regarding Relevance (Logic), the CIMH project also exceeded standards by identifying additional needs and filling them, by creating a range of new prediction tools, and by focusing on “climate variability” at the present moment rather than on climate change and adaptation based on models of the distant future.

CIMH exceeded expectations in the achievement of Effectiveness (Results). The institution delivered excellent new forecasting products and training, and offered these at the CariCOF meetings in order to maximize its ability to deliver them. It met with relevant stakeholders (meteorologists, water managers, emergency management officials, agriculturalists, tourism industries) and got “buy in” through live dissemination, as well as circulation of information online, in order to increase the use of its prediction tools and databases. In Effectiveness (Science Collaboration), CIMH only met expectations, as the group failed to collaborate closely with CERMES (elaborated in “Lessons Learned” below), but it established strong and beneficial ties to IRI.

In the area of Impact (Expansive Effects), CIMH exceeded expectations by achieving Caribbean-wide impacts, training meteorologists from around the region in the use of the Climate Predictability Tool and getting a range of region-wide stakeholders to engage with the output forecasts. Furthermore, as articulated by IRI managers (Baethgen, interview), progress was made in instilling a consideration of climate change effects into the “day-to-day” considerations of water managers and other policy makers. While the project did not try to transmit new norms of “building” climate change awareness directly to the public, it did manage to influence meteorologists, training them to consider such issues in forecasting models and having them, in turn, convey this new approach to the stakeholders and consumers of forecasts.


CIMH exceeded expectations in project Sustainability (Program Effects), as HED-USAID invested in the adaptation of the Climate Predictability Tool, its application to the Caribbean environment, and the training of dozens of meteorologists in its use in longer-term forecasting (including drought and flood forecasting). CIMH quickly made these tools part of the normal course of its forecast presentations (such as at the now semi-annual CariCOF meetings), ensuring that the project’s benefits in that regard will continue indefinitely. While it is too early to assess whether the Disaster Information Database elaborated by post-doctoral fellow Shelly-Anne Cox (who was officially hired as a research associate, although on the post-doctoral fellow budget line)12 will also continue, we would expect that with minimal investment that database can be updated and maintained too.

XII. Evaluation of CERMES Products

Recall that the original proposal from Columbia did not call for any CIMH-only activities, and that all of the initially planned outputs were CERMES-only. This makes evaluation of CERMES products readily possible. We evaluate, in accordance with best practices and some stakeholder input, the establishment of UWI courses in: 1) Introduction to Climate & Water Resources, 2) Climate Information & Predictions from Seasons to Decades, 3) Tailoring Climate Information for the Water Sector, and 4) Using Climate & Water Information to Inform Policy. Second, while the common research agenda was not established, we analyze the effort to pursue a common research agenda among the three groups (CIMH, CERMES, and Columbia) by hiring a post-doc to “explore a research agenda that ties together all parties (Columbia University, 11),” enrolling two UWI alumni in Columbia University’s MA program in Climate and Society, and setting up exchange visits to identify modes of collaboration. Third, we evaluate the effort to establish a “community of practice,” starting with the three country campuses of the University of the West Indies (Barbados, Jamaica, and Trinidad and Tobago), to construct a virtual network for information sharing on climate change adaptation.

The short course organization, perhaps the “core” of the project for CERMES, seems to have

been extremely successful. Already adept at online course offerings as part of their ongoing MA

program13 using the Moodle platform, CERMES constructed the one-credit (equivalent) courses on: 1)

Introduction to Climate & Water Resources, 2) Climate Information & Predictions from Seasons to

12 CIMH contested the characterization of Cox, who was completing a UWI doctoral dissertation, as a “post-doctoral fellow,” although HED and IRI officials (Paez-Cook, Curtis interviews) confirmed that Cox was hired in that budget line.

13 In the Climate Change sub-specialty, CERMES MA students regularly took online courses in natural resource economics.


Decades, 3) Tailoring Climate Information for the Water Sector, and 4) Using Climate & Water

Information to Inform Policy.

The courses were developed starting with the topics proposed by Columbia/IRI in their initial

proposal. The proposed topics were 1) Introduction to climate and water resources, 2) Climate

information and predictions from seasons to decades, 3) Tailoring climate information for the water

sector and 4) Using climate and water information to inform policy (Columbia University Proposal). As is

readily seen, the proposed topics and the final topics are identical. All principals agreed that the topics were “adequate and relevant” (Baethgen, interview). However, CIMH expressed the need to “improve seasonal outlooks” (Baethgen, interview) as part of their effort to revive their climate outlook forums. IRI

provided effective guidance on incorporating information, tools and products into models that regional

stakeholders could use for water resource management. Course topics adhered to what IRI originally

proposed, but the Caribbean institutions delivering the material developed the content and case studies

used to illuminate the topics. The material brought into the courses was decided in consultation with

stakeholders (Cashman, interview). Essentially, IRI worked to help implement the content that the

delivery institutions deemed relevant within the four course topics. Under the HED Framework, this was

an "Efficient" method of deciding content in the categories of Timeliness and Resource Use. The

principles did not comment on how the course content administration were altered, or planned to be

altered, based on feedback after the experience, so there is lack of evidence regarding that Category of

Efficiency. It is our experience however that all academics delivering content actively alter their

approach, as well as content, as courses progress. Is very much an organic process and we suspect that

CERMES used active management with regard to these courses.

The courses are evaluated at length with regard to their emphasis on best practices below.

Here it is sufficient to say that the courses were well evaluated and provided relevant content

(Relevance Contextualization). The post-course evaluations by participants showed a 62% increase in knowledge as a result of taking the CERMES classes (among those who finished the courses) (see appendix X Course Evaluations). While one interviewee said that the courses were not widely publicized and hence may not have engaged new interest or new stakeholders (Paul), and most

interviewees (policymakers and CERMES MA students and alums) had never heard of them, the courses

themselves were thoughtfully conceived and well executed (Relevance Logic). Details on possible

shortcomings regarding impact (expansive effects) and sustainability are raised in the section below

(Compliance with HED criteria). Overall, this was a successfully conceived and executed component of

the project.


The effort to establish a “community of practice” starting with the three country campuses of

the University of the West Indies (Barbados, Jamaica, Trinidad and Tobago) was less successful than

anticipated, although a reporting memo from Cashman summarizes an effort made midway through the

project to consider “discussion briefs” as a means of generating information for a community of

practice:

… whilst there is a role for posting information that might be generated by Columbia, CIMH and UWI that has a water and climate outlook focus there is a need to identify relevant links to other sources of information and perhaps provide some form of synopsis as to what could be found there. This is very much a one way linkage; to go beyond this might be a challenge (“Community of Practice”).

The memo (“Community of Practice”) recognizes that the discussion briefs would be “one way”

in that they would not foster a dialog and debate among the community, but Cashman in an interview

offered that these could be an important source of background for policymakers. The briefs (the last

three of which were available to us) were informative syntheses of central issues on Barbados (see for

example “Accounting for Climate Variability: Time to Move On? Discussion Brief No. 1” and “Water

and Food Security: Facing Up to Climate Variability: Discussion Brief No. 4”). While it is not clear how

widely disseminated these were (although they were made available at the annual meetings of the

water ministers), the briefs were a welcome additional benefit from the project (addressing the HED

category “Relevance: Contextualization: Responsiveness to Beneficiary Needs”). Cashman suggested that

had the briefs been available to the ministry's staff earlier, they might have had more effect.

Additionally, the briefs could, if coordinated with CIMH, have had a much greater impact. For example,

Brief No. 4 calls for seasonal forecasting of precisely the type CIMH adopted as part of this project, but

without mentioning CIMH or other Caribbean seasonal forecasting efforts underway (HED category

“Impact: Outcome Level Results: Negative”).

Having generally described the main products from the CERMES portion of the project, we

evaluate these in the sections that follow with regard to the following criteria: fit with best practices,

adherence to HED objectives, and lessons learned. In the next section we consider the

conceptualization and design of the short courses and an assessment of how well they embody

international best practices. We then directly evaluate the CERMES project in terms of the HED

objectives. We offer quantitative measures relating to whether the project fell below expectations (1),

met expectations (2), or exceeded expectations (3) for each of the HED criteria. Then we interpret the

findings with regard to lessons learned from the CERMES portion of the project before proceeding to

evaluate the CIMH components.


A. CERMES Compliance with Best Practices in Curriculum Development

This section of the report will focus on the CERMES courses developed through the program and

assess their content as related to accepted best practices in climate change education. The standards

used are based on “Climate Literacy: The Essential Principles of Climate Science” (2009), which was

drafted under the U.S. Global Change Research Program, a partnership involving the Departments of

Agriculture, Defense, Commerce, Energy, Health and Human Services, Interior, State, and Transportation, the Environmental Protection Agency, as well as NASA, NSF, the Smithsonian, and USAID. The evaluation also refers

to the AAAS Benchmarks for Science Literacy where appropriate. These evaluation tools are outlined in

Section II (above) of this report.

Climate change educational best practices have not been developed specifically for hydrology or

water resources as “stand alone” topics. This is generally because these areas of inquiry present very

different sets of concerns for different regions. The more specialized the area examined in the context

of climate change, the more problematic it is to apply a universal standard for study. However, a widely accepted set of topics that should be addressed is outlined by the UNESCO-IHE International Hydrological Program (2015), and these topics are used to examine the hydrological content of the course

work. The evaluation tool is outlined in Section III of this report.

It should be noted that the best practices are rubrics for educators and scientists to guide

curriculum development. They do not dictate specific models, forecasting methods, productivity models

or any other specific tool that must be used. They are guiding principles towards reaching a

comprehensive education curriculum around the topic. Additionally, because these CERMES classes are

post-graduate courses developed for practitioners, some of the very basic background information may

not be appropriate for the classes. Therefore, although such information is included in the climate

change best practices recommendations, it would not be appropriate for it to appear in the CERMES post-graduate short courses. The four courses were: 1) Introduction to Water Sustainability and

Climate, 2) Climate information and Predictions from Seasons to Decades, 3) Climate Information for

Improved Water Management, and 4) Water Planning and Policy for Climate Variability and Change.

Each class’s content will be evaluated separately for best practices, and student assessments will be

included in the summary.

There were three Partnership Objectives for the project, one of which was addressed with the

CERMES courses. That objective was to "Improve the capacity of UWI CERMES to train professional and

high-level students in issues surrounding climate risk and water resources management". The other two


objectives were to develop a research agenda within the three institutional partners and to bolster a

community of practice (Columbia University Final M&E Condensed Copy). The success regarding those

two objectives is discussed elsewhere in this report.

The Outcomes associated with achieving the first project objective above appear to have

evolved as the project developed. The "Columbia University Final M&E Condensed Copy" has different

Outcomes than the "Caribbean M-E Table (RF, PMP, PIP)" (both are included in Appendix). For this

evaluation, we use the criteria established in the "Columbia University Final M&E Condensed Copy." That

document lists two Outcomes associated with the objective above: 1) A heightened ability on the part of

UWI to attract and train students interested in water- and climate-related issues and 2) A greater capacity on the part of the Caribbean community to manage climate- and water-related risks (Columbia

University final M&E). There is little doubt that the second Outcome was achieved based on the

outputs expected, which included training students and professionals in climate risk and water

resources management. The "student" part of the training was undertaken via CERMES classes and the

"professional" training was performed via CIMH workshops. The particulars (content, enrollment,

attrition etc.) of these programs are discussed in the document below. The first outcome, related to

"heightened" ability to attract students, was envisioned to include the following four Outputs: 1) the

development of modules for online training on water and climate, 2) consolidation and tailoring of

curricula for courses, 3) delivering four face-to-face training events in the Caribbean to educate

professional communities, and 4) development of certificate programs and/or a semester-long for-credit

course on climate risk management and water resources.

While the program met the Outputs criteria as constructed, these do not flow naturally from the

Outcome criteria as written (heightened ability to attract students interested in UWI courses). The

program certainly attracted students and enrollment numbers demonstrate that fact. However there

was high attrition (discussed below) and the tuition was subsidized by the grant, which is what attracted

the students as much as course content. Now that the grant has ended, the course material is off line

and the courses will not be offered again at CERMES. Even though CERMES recognizes the value of the

courses, and would like to incorporate them in existing programs as electives (Cashman, interview),

administrative hurtles existed to making the classes credit-granting and/or part of a certificate program.

We believe the Outputs do not measure the longer-term goals suggested by the Outcomes. Under the

HED evaluation rubric, Sustainability: Program Effects: Institutionalization was lacking, and

Sustainability: Program Effects: Stakeholder engagement could have been more aggressive.


With regard to content, the first course, “Introduction to Water Sustainability and Climate”

included the following topics: 1) Water resources management, 2) Principles of hydrology, 3)

hydrological modeling, 4) Water Resources Management: supply systems, 5) Water Resources

Management: Flood Risk, and 6) Integrated Water Resources Management Concepts. The six modules

served as both an introduction to hydrological principles and an introduction to water management

approaches. They also included a very strong and discrete hydrological modeling component (that

contained a bit about climate change variation, but that was not the focus). Additionally, the class had a section on statistical methods for assessing hydrological risk, which is an applied use of theory. This class

was viewed as providing a strong background from which the other courses could build. However, it

also provided instruction on applied statistics and modeling for hydrological research/data towards

management goals. This emphasis on statistics in the appropriate modules provided a valuable and

applied frame for understanding hydrological aspects of other courses. The class was less about climate

change than it was about hydrology and water resources. It is our view that a class with a focus on

hydrological and resource management underpinnings is absolutely necessary for gaining as much value

as possible from the other course offerings.

The course covers areas 1 (climate function), 2 (earth principles), 4 (consequences), and 6

(adaptation and mitigation) of the Climate Literacy Guide (see Table 1). Feedback loops (area 3) were

touched on within the discussion of hydrological principles, but were not a strong focus for overall

content (which is appropriate given the intent of the class). Individual modules 1 and 2 (see the course one syllabus) cover basic information on understanding the global hydrological cycle (1A, 2B). The course very directly covered hydrological consequences of climate change (4C) and forecast use (6C) (Table 1).

In terms of hydrological best practices (as outlined in UNESCO-IHE, 2015), the course offered training exercises, risk evaluation tools, and forecast use for hydrological modeling tools (areas 5, 6, and 7

in Table 2). The course focused on applying tools and models plus the background for understanding

them. As was true for all of the courses, a strong background in science or natural resource

management was assumed, or it was assumed that students would rise to the challenge (there were no prerequisites for the classes). Although there are six topics in the best practices in hydrology rubric,

coverage of three in the regional context would be adequate (see below). In terms of best practices for

topics of regional importance (Mullholland et al., 1997; Pulwarty et al., 2010; Poore et al., 2008), the

course included three of the six: Database use (area 3), modeling uncertainty (area 4) and appropriate

statistical methods (area 5). These are appropriate for an introductory class to the global hydrological


cycle, and we would even see the modeling and applied statistics (as opposed to methodological statistics) as a very welcome, highly relevant addition. In terms of HED Relevance: Contextualization

criteria, this course reflected national (and regional) development priorities, was responsive to

beneficiary (i.e. management) needs and reflected funder strategies (empowering national and regional

water managers).

The second course, “Climate Information and Predictions from Seasons to Decades,” included the following topics: 1) Overview of climate and variability (the study of oscillations), 2) Drivers and sources of variability (teleconnections), 3) Forecast verification, 4) Seasonal/probabilistic forecasting, 5) Decadal prediction, and 6) Long-term scenarios. The six modules build an understanding of climate variability and

demonstrate how models deal with climate dynamics. They provided training in interpreting

probabilities within a climate-forecasting context. An emphasis was on developing the relatively near

term projections of climate change (seasonal to decadal). The content covers areas 1 (climate function), 2 (earth principles), 3 (feedback loops), and 6 (adaptation and mitigation) of the Climate Literacy Guide.

Within those areas the course covers several specific topics mentioned in the Guide. The initial modules

cover basic information regarding climate function (1A), including an examination of historic drivers of

climate (including carbon dioxide) (1C) and using data libraries (1B) (Table 1). The material for this class

was condensed into two weeks, which might have been too much, too quickly. Students commented that the course moved so quickly that information could not be absorbed fully and they fell behind (Course 2 Survey of Results). Cashman (interview) stated that ideally the class should be extended to a full

semester.

In terms of hydrological best practices (UNESCO-IHE, 2015), the course addressed four of the six. The four the course incorporated were accessing and using data mapping information in modules 1 and 3 (area 3), modeling uncertainty and statistics in modules 1, 3, 4, 5, and 6 (areas 4 and 5), and using and potentially adding data in module 1 (area 6) (Table 2). In terms of best practices for topics of regional

importance (Mullholland et al., 1997; Pulwarty et al., 2010; Poore et al., 2008), the course included 4 of

the 6. The four areas included rain intensity models, database use, modeling uncertainty, and statistical

methods.

In terms of HED Relevance: Contextualization criteria, the course reflected national (and

regional) development priorities in the sense that the ability to interpret modeling outcomes is vital to

sustainable development and water management. The course also was responsive to beneficiary (i.e.

management) needs and reflected funder strategies (empowering national and regional water


managers). In terms of the HED evaluation rubric, Efficiency: Decision-making Processes: Timeliness of activities could have been improved (based on student comments about cramming too much into two weeks).

The third course, “Climate Information for Improved Water Management,” included the following topics: 1) Forecast use in water resource management, 2) Inter-annual hydrological forecasting, 3) Use of climate projections, 4) Decision Analysis, 5) Problems with forecast use, and 6) Climate risk management concepts. These general topics contain the background necessary to understand the problems of the region, especially as related to water management using forecasting tools. The content covers areas 4 (consequences) and 6 (adaptation/mitigation) of the Climate Literacy Guide. Specifically, it covers the consequences of changing seasonal rain patterns and water availability (area 4C), using forecasting to inform adaptation plans (area 6C), and evaluating data (2A) (Table 1).

For the hydrological recommended best practices, the course covers area 5 (risk analysis) in

modules 4 “Decision Analysis” and 6 “Climate Risk Management Concepts”; area 6 (modeling tools) in

modules 2 “Developing seasonal to inter-annual hydrological forecasts”, 3 “Use of climate change

projections” and 6 “Climate Risk Management Concepts”; and area 7 (training exercises), which included modules 3 “Use of climate change projections” and 6 “Climate Risk Management Concepts”. All of the topics of importance to the region were addressed, except perhaps area 6, which suggests adding data to models for the region (Table 2).

In terms of HED Relevance: Contextualization criteria, the title of the course and content make

its relevance self-evident. It reflected national (and regional) development priorities since the class was

all about water management and risk assessment. The course also was responsive to beneficiary (i.e.

management) needs and reflected funder strategies (empowering national and regional water

managers).

The fourth course “Water Planning and Policy for Climate Variability and Change” included the

following topics: 1) Introduction to Water Resources Planning and Management, 2) Planning for the

Water Sector, 3) Developing a planning model, 4) Assessing climate change risks, 5) Climate Risk

Management and 6) Adaptation to climate change. This course is an appropriate end for the set of

WACEP classes, offering concrete, applied methods for using hydrological and climate forecasting effectively in planning. Water managers were taught how to make adaptation plans in the face

of uncertainty (which they are taught is simply an unavoidable outcome of any projection, including


those related to climate forecasting). They were taught to use the modeling for considering economic

and sustainability outcomes. In terms of climate literacy, the content covers areas 1 (understanding

climate function), 4 (consequences), 5 (communication) and 6 (adaptation/mitigation) (Table 1).

Specifically, the content pertains to areas 1A as related to climate influences on hydrology, 1B regarding

the value of (perfect) information and forecasting, 5C dealing with risk aversion and risk opportunities,

and 6C hydrological forecasting and climate projection use. All of the specific best practice applications

were in the context of managing effectively. This is appropriate for the capstone course of the series.

In terms of best hydrological practices, the course covers areas 5, 6 and 7. Area 5 (risk analysis)

is covered in modules 4 and 6 in particular, which examine decision analysis and climate risk

management. Area 6 (modeling tools) is covered by module 2, seasonal and inter-annual hydrological forecasting. Lastly, area 7 (training exercises) is covered by modules 2-6, each of which incorporates modeling exercises or uses forecasts. In terms of important regional topics, all were covered except for area 6 (Table 2). Data were modeled and applied as suggested in areas 1, 2, and 4. Databases were used and

analyzed using appropriate tools as suggested in areas 1, 3, and 5.

Climate literacy topics not covered. Examining Tables 1 and 2, there were areas in the climate

literacy best practices and hydrological topics list that were not covered by the CERMES classes. This is

not a problem, however, and actually speaks to the strength of the coursework given program

objectives. The climate literacy guide was developed by over a dozen government agencies and science

organizations as a curriculum guide for content. It is quite comprehensive and includes much more

earth science and “proving the case” for anthropogenic climate drivers than is included in the courses

we are evaluating. Likewise, the hydrological list contains a wide range of important topics necessary for comprehensive knowledge. However, covering the whole list is not necessary for these courses. The CERMES courses are post-graduate courses, and students should not need to be taught background in all

areas related to climate change curriculum. It is entirely appropriate for the CERMES courses to focus

on hydrology, modeling, and risk management within the context of a changing climate. So, while the

program could be criticized for not addressing climate change literacy comprehensively (therefore

possibly lacking in the HED Relevance criterion), we think the narrow focus of the courses was a

strength. While the general topics for the classes were put forth by Columbia in the original proposal,

the specific needs of the region were addressed through development of content by CERMES and CIMH.

Student Observations and Evaluations. The surveys and assessments completed for each of the four courses provide excellent data regarding course effectiveness. Students ranked their own


expertise in areas before and after the class, which allowed them to reflect on how much the classes

helped them. For course 1 (Introduction to Water Sustainability and Climate), students reported a 33.1% overall knowledge gain, with 77% reporting at least some increase. For course 2 (Climate Information and Predictions from Seasons to Decades) the overall knowledge increase was 64.5%, and 90% reported at least some gain. For the 3rd course (Climate Information for Improved Water Management) the overall knowledge gain was over 100%, meaning that students essentially doubled their knowledge. Lastly, for the 4th

course, “Water Planning and Policy for Climate Variability and Change”, there was a 41% knowledge

gain. This knowledge gain speaks to the Effectiveness: Results: Progress versus Targets of the program

in the HED rubric. The targets were not firmly set, but the results exceeded most possible targets,

especially as related to the 3rd course.
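
The underlying formula for these percentages is not spelled out in the course evaluation documents; a minimal sketch of one plausible computation, using invented 0-100 self-ratings rather than actual WACEP survey data, would be:

    # Hypothetical knowledge-gain computation; the pre/post self-ratings
    # below are invented examples, not actual WACEP survey responses.
    pre  = [40, 55, 30, 60]   # self-rated expertise before the course (0-100)
    post = [70, 75, 65, 80]   # self-rated expertise after the course (0-100)

    # Percent gain relative to each student's pre-course baseline,
    # averaged across students who completed the evaluation.
    gains = [100.0 * (b - a) / a for a, b in zip(pre, post)]
    mean_gain = sum(gains) / len(gains)
    print(f"mean self-assessed knowledge gain: {mean_gain:.1f}%")

On this reading, a gain "over 100%" simply means that post-course self-ratings were, on average, more than double the pre-course baseline.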

Student comments were generally positive and suggestions constructive. Descriptions included

“effective,” “a good package/combination” that was “intense” (Eversley interview), "very good"

(Broodram interview), “useful for understanding concepts”, “extremely applicable”, “particularly

informative”, and allowed “better use of information and knowledge” (WACEP Overall Course

Evaluation). It was also noted that the courses encouraged water professionals to "keep… thinking

outside the box." Suggestions included having a longer course so that information could be processed,

more discussion (Eversley interview) and reducing the number of exercises in the second course

(Broodram interview). For more student comments, see the course evaluation document (See appendix

for WACEP Overall Course Evaluation). Student comments praise the courses’ high “Relevance: Logic:

Causal Linkages in the Results Framework” and Effectiveness: Results: Progress vs Targets.

General Observations and Concluding Comments. None of the courses included the “topic of

regional importance” Area 6, namely “adding data” to regional models. As will be discussed below, this

does not reflect poorly on the CERMES classes. This practice is perhaps beyond CERMES’ role in this

particular grant, which was teaching and not data gathering. There is no doubt that better quality data would benefit the region, but it was not CERMES’ role, in this case, to gather it. Gathering or developing data was beyond the explicit mandate of the grant for CERMES, although it could have been done as a research collaboration with CIMH. Additionally, Cashman noted (interview) that the course material might have been better tailored to Barbados and the Caribbean. Barbados- and Caribbean-specific data were not sufficient (or not in existence) to inform the hydrological content requirements in a geographically specific manner. However, the course content can be applied to different regional areas

even if it cannot be derived from these areas.


Large attrition rates, with on average only 37 percent of enrolled students completing the courses (Table 3), were perhaps a foreseeable outcome of the course structure. This definitely reduced the Effectiveness: Results: Program Fidelity of the program and negatively affected Impact: Outcomes: Positive, as these could not be fully achieved. Losing students might have been a function of the course being offered tuition-free, or the fact that there were no prerequisites, so students could have been “in over their heads.” Also, “online only” classes with no financial incentive and no degree or certificate linked to the outcome are likely to see attrition. Completion would likely rise if CERMES monetized the course, developed it into an elective within the existing graduate programs, or developed a certificate (Cashman, interview). As mentioned previously in this document, CERMES might have generated greater interest in the courses had they been publicized more aggressively. Indeed, one interviewee stated that "it felt like they were keeping them a secret" (Paul interview), although this is open to debate.14

The course as originally proposed linked a full 50 percent of the course grade to an

"assessment" at the end of the class. This assessment was apparently conceived as an assignment and

not a course assessment (Curtis email April 7, 2015). In fact, the courses were neither offered for credit

nor graded. This might also explain some of the attrition. It seems that the tools used to evaluate the

effectiveness of the classes were 1) a course evaluation related to knowledge gain and 2) comments on

the course. In the original course syllabus there are percentages of grades assigned to various assignments; however, since grades were not given, the professors did not evaluate student achievement in any quantitative sense. We see this as a potential problem. Professional

scientists/teachers should assess student progress, not the students themselves, even if those students

are professionals. The method of assessment limited the ability of instructors to evaluate 1) Effectiveness: Results: Progress vs. Targets and Program Fidelity, 2) Efficiency: Decision-making: Use of Human Resources, and 3) Impact: Outcome level results (whether positive or negative), since evaluators cannot assess advancements in understanding except as the students convey it.

14 Cashman, in comments on an earlier draft of this document, said the WACEP course advertisements did generate some 200 email inquiries, with 100 coming from outside the Caribbean. He said the publicity was sufficient, but that attrition was an issue.


Table 1. Best practices in climate change as related to CERMES courses.

Climate Literacy Guide                                    CERMES Course Number
1  Understand Climate Function
   1A. Essential principles                               1, 2, 4
   1B. Access credible information                        2, 4
   1C. Carbon Cycle                                       2
2  Demonstrate Knowledge of Earth Principles
   2A. Using graphs and tables                            2
   2B. Understanding interconnections                     1, 2
   2C. Anthropogenic vs. natural climate change
3  Feedback loops
   3A. Related to albedo                                  2
   3B. Related to methane clathrate
   3C. Warming oceans and ice
4  Consequences
   4A. Human health
   4B. Changing productivity
   4C. Hydrological cycle                                 1, 2, 3, 4
5  Communication
   5A. Identify appropriate media for target
   5B. Communicating with lay audience
   5C. Effective communication (to policy makers)         4
6  Adaptation/Mitigation
   6A. Improve natural defenses (coasts, wetlands)
   6B. Improve constructed defenses
   6C. Effective forecast use                             1, 2, 3, 4


Table 2. Hydrological Best Practices as related to CERMES courses.

Best Practice Hydrological Curriculum Components          CERMES Course Number
1  Urban Resilience
2  Coastal Issues
3  Soil Conservation
4  Data sharing                                           2
5  Risk Analysis                                          1, 2, 3
6  Modeling tools                                         1, 2, 3, 4
7  Training exercises                                     1, 2, 3, 4

Best Practice Hydrological Topics of Regional Importance  CERMES Course Number
1  Rain intensity models                                  2, 3, 4
2  Applying granular data to scale                        3, 4
3  Database use and access                                1, 2, 3, 4
4  Modeling uncertainty                                   1, 2, 3, 4
5  Appropriate Statistical Methods                        1, 2, 3, 4
6  Adding data to models for the region                   3*

* Indicates the topic may not be directly covered although closely related concepts were.

On the whole, the courses and their arrangement were beneficial and relevant to the needs of

water resource managers in the region (Paul, Broodram, Eversley interviews). While the courses did not

cover the entire range of the Climate Literacy Guide recommendations, the topics covered were highly

appropriate for the course objectives (teaching post-graduate water professionals). Regional topics

were well covered by the curricula and the courses were successful in providing methods for giving

resource managers climate science applications. The courses would have been more effective if they had been marketed or advertised more extensively, incorporated as electives into a wider certificate or degree program, and paired with a method to reduce attrition.


Table 3. Enrollment and completion data for CERMES courses.

Course #              1       2       3       4
Number enrolled       43      29      34      19
Number completed      21      6       11      9
Percent completed     48.84   20.69   32.35   47.37

The courses were: 1) Introduction to Water Sustainability and Climate, 2) Climate Information and Predictions from Seasons to Decades, 3) Climate Information for Improved Water Management, and 4) Water Planning and Policy for Climate Variability and Change.
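
The completion percentages in Table 3 follow directly from the enrollment counts, and they bear on the attrition figure discussed above; the short sketch below, with the table's values hard-coded, reproduces them and the roughly 37 percent average completion rate (about 63 percent attrition).

    # Arithmetic check on Table 3; values are copied from the table.
    enrolled  = [43, 29, 34, 19]   # courses 1-4
    completed = [21,  6, 11,  9]

    pct = [100.0 * c / e for c, e in zip(completed, enrolled)]
    print([round(p, 2) for p in pct])     # [48.84, 20.69, 32.35, 47.37]

    avg_completion = sum(pct) / len(pct)
    print(round(avg_completion, 1))       # 37.3 -> roughly 63% attrition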

B. Compliance with HED Criteria

CERMES met expectations as the institute delivered well-designed content, but might have better “learned” from one year to the next by offering the courses more than once, since the “start-up costs” lay in development. The courses might have been adjusted and re-offered, perhaps with more input from CERMES faculty (the three water management courses were taught online by a staff member at Columbia’s Water Center (CWC), and the other, on meteorological forecasting, was taught by the director of IRI). Furthermore, while Columbia said they were responsible for hiring the post-doc (which wound up being Shelly-Ann Cox at CIMH) and hosting two Caribbean-originating MA students, and partly responsible for establishing the research community with CERMES and CIMH, CERMES did not execute the “community of practice” portion of the project efficiently.15 With regard to efficiency (decision-making), CERMES met expectations, as the institute seems also to have fulfilled its duties well by designing and offering relevant courses, and adding the discussion briefs in mid-course as a means of generating some “community of practice”-like content. The institute did not launch an effective online community of practice dialogue with the University of the West Indies and other stakeholders, but it did make overtures by releasing “discussion briefs” in October 2014 (to be discussed at the May/June 2015 CariCOF).

With regard to relevance (contextualization), CERMES did an outstanding job, exceeding

expectations, in offering curriculum that was timely and relevant to Caribbean community stakeholders.

The courses focused on Caribbean case studies and issues, and addressed these extensively.

Furthermore, they invited a range of Caribbean stakeholders to participate by taking the courses,

although they might have cast a slightly wider net to seek participants. CERMES met expectations for relevance (logic), as the course material flowed logically from one topic to the next, although the

15 CIMH commenters said the statement that Columbia had hired the post-doc, as stated in the proposal and confirmed by Curtis (interview), was “a misinterpretation at best,” as the research associate was not “meant to develop a common research agenda” as stated in the original proposal.


institute might have given consideration to making the one-unit courses “permanent” by making them

units in extant MA courses, or conceiving of a professional certificate program encompassing them.

In rating effectiveness (results) CERMES also met expectations, producing the courses sought

and executing them well, although the course materials were taken down, many stakeholders had not

heard of them nor had the chance to access course materials, and means still need to be found to

continue making the pedagogical material available. With regard to effectiveness (science

collaboration), CERMES fell short of expectations. Whatever the structural problems inherent in the project, the science collaboration between CERMES and CIMH was non-existent, and it appears also that CERMES did not get any new scientific knowledge out of the project.

CERMES’ impact (outcome level results) met expectations, as while the materials may not have

enduring value or be accessible to the public, the institute offered the promised courses, and to good

effect. The CERMES portion of the project also met expectations regarding impact (expansive effects) by

contributing- as the institute does in its normal educational mission – to the construction of a strong

cadre of trained climate change and hydrology specialists. While the four USAID-funded courses added

only a portion to this, they did make a contribution.

Regarding sustainability (program effects) CERMES met expectations as it imparted good

training and produced discussion briefs. But the continuation of the short courses, either in

combination into one comprehensive course for the MA program, or as short courses which continue to

be offered through the University of the West Indies Open University online program, is uncertain.

Cashman (interview) expressed a strong will to continue offering the courses, but also conveyed that the

university would have to give them some sort of administrative status (certificate program, portion of an

MA program, etc.), and that this had not yet been achieved. Cashman also indicated that without strong

advocacy by other faculty members, program continuance was dependent on his involvement.

C. Successes, Failures, and Lessons Learned from CERMES Partnership

The short courses generated by CERMES engaged students with active exercises, offered a more

nuanced view of environmental value by considering systemic effects, externalities, and complex

impacts as part of the “cause and effect” of environmental conditions and policies, and contained a

wealth of Caribbean- and Barbados-related material. They were well regarded by students and

practitioners who knew about them. Not enough people in the policy community knew of them, as


most interviewees in the sector had not heard of the courses (Paul, Braithwithe, Murray, Blenman). The

courses may not have been widely publicized, but they were very well designed and executed.

Moreover, CERMES has long been known for generating an outstanding pool of talent for

Barbados, and USAID wisely hand-picked the institute as a primary recipient for HED assistance. Many

interviewees (Paul, Braithwithe, Rowe, Hinds, Hutchinson for example) spoke extremely highly of

CERMES expertise, research, and training in the climate change and hydrology areas, and MA program

alumni at the Ministry of Environment and Drainage (Hinds, Rowe), Banks Beer Bottling Plant (Knight,

Best), and the Canadian Embassy (Hutchinson) told us of strong networks the institute possesses.

Still, CERMES’ strong positioning as perhaps the premier academic institution for the study of

climate change-related science in the Caribbean made the institute’s failure to instigate any policy

debate or generate new research somewhat disappointing. Project managers can truthfully cite the

project’s unusual genesis as the cause of this problem. As stated by Cashman (interview), USAID

“consulted” CERMES, but reserved the right to name the US-based partner institution which was to work

with the institute. And the science generation and knowledge transfer component, as will be seen in the

discussion of the CIMH portion of the project, was fully reserved to that portion. CERMES was not

formally asked to specifically deliver products in this area, even though the call for proposals – for which

CERMES was the only Barbados service provider listed until after the project was bid and won –

mentioned research and knowledge generation among the core objectives. But this failing in the

structure of the CERMES sub-contract, and the inability of CERMES, Columbia, or USAID to rectify it, left

CERMES’ role truncated with regard to delivering research, policy outreach, or facilitating a community of practice. While we proceed now to discuss the CIMH program, we return in the conclusion to a

discussion of how a stronger project may be structured in the future.

XIII. Evaluation of CIMH Products

There were three Partnership Objectives for the project listed in the "Columbia University Final

M&E" document. CIMH participated in Objective one "Improving the capacity of UWI CERMES to train

professional and high-level students in issues surrounding climate risk and water resources

management" by teaching in the CERMES courses that were discussed in the preceding section. The

effectiveness and drawbacks associated with those courses are addressed above and will not be revisited

here. CIMH was specifically named in the second objective, "Developing a targeted research agenda


that meets the identified needs of CIMH, CERMES, and the Caribbean community for information and

knowledge related to the management of water resources and climate-related risk." While the

development of a research agenda was an important (but neglected) component of this project, we are

focusing this section of the report on how CIMH delivered information and knowledge related to

regional climate forecast needs through workshops (CariCOF). The third Objective "Bolstering the

community of practice that informs work at the intersection of climate & water in the Caribbean" is

addressed elsewhere in this report.

CIMH generated some outstanding products. One of our interviewees, a water manager for the

Barbados Government, when asked about the three-month rainfall forecast product, stated that she

"lived by those" (Paul). That strong and supportive style of statement was consistent throughout our

stakeholder interviews. The institute’s shortcomings in meeting reporting requirements, and its failure, along with CERMES, to generate synergies with regard to new research and public outreach, will be

addressed more explicitly later, along with mention of the failure of USAID/HED to structure the

contract properly at its inception. In this section, it should be noted that the original proposal only

vaguely referenced CIMH in terms of pursuing a common research agenda between CIMH, CERMES, and

Columbia through the lead of a post-doctoral fellow and three students from the Caribbean enrolled in

Columbia’s MA in Climate and Society, and establishing the “community of practice” including a virtual

network for information sharing on climate change adaptation (these are explicitly stated in the

Columbia University Final M&E Condensed Copy).

However, in the undated subcontract released to evaluation consultants only on their visit to

Barbados (subsequent to the official contract between Columbia-IRI and HED), CIMH did provide

themselves a more specific scope of work.16 Specifically, they pledged to: 1) engage with the Climate Outlook Forum of Central America and extend their Caribbean forecasting using data from that group; 2) offer three training workshops in conjunction with annual Caribbean Regional Climate Outlook Forum (CariCOF) meetings, on seasonal rainfall forecasts, forecast verification, and forecasting extensions beyond their extant three-month outlooks; and 3) offer workshops at the CariCOF meetings to meteorological

forecasters on: socio-economic impacts of seasonal variability; public communication of forecasts; and

16 CIMH Principal Farrell, in comments on an earlier draft of this report, said that CIMH did contribute to the preparation of the proposal, and that despite evaluator notes on the “vagueness of CIMH’s role in the project inception,” CIMH did have email discussions with Columbia staff. Farrell added that “any pre-award discussion about courses etc. would have been inappropriate in our opinion since we did not engage with other institutions competing from the project on such matters.” Farrell also states that “the way the report is written distorts the facts.”


development and coordination of impact reporting. In addition, CIMH managers interviewed (Trotman,

Van Meerbeeck, Cox) discussed a fourth contribution; they are constructing a Climate Impacts Database

(CID) to help provide information to decision-makers and researchers by gathering geographical and

meteorological information about past disaster events, which authorities can then factor into early climate warnings. Fifth and finally, CIMH has delivered the CariCOF Climate Outlook Newsletter (a

monthly bulletin published since June 2013), which several stakeholders have said they routinely

consult.

The first three of the CIMH components were successful, and the fourth (the CID) will be “rolled out,” with much anticipation, in June 2015. CIMH has secured the cooperation of the Climate Outlook Forum of Central America, improving the quality of their forecasts with more data points and sharing information more seamlessly on important developments (both relevant in context and logic and effective in reaching targets and fidelity). The workshops on seasonal rainfall forecasts, forecast

verification, and forecasting extensions beyond their extant three-month outlooks have been well

received, and stakeholders commented repeatedly on the quality of CIMH data and the utility of

extending forecasts and adding drought and flood forecasts (Ballantyne, Simpson, interviews). Similarly,

participants in their workshops at the CariCOF meetings of meteorological forecasters on socio-

economic impacts of seasonal variability, public communication of forecasts, and development and

coordination of impact reporting also gave them strong reviews (Riley, Royer, Nurse, Simpson,

interviews). Below we address CIMH adherence to best practices, then evaluate the project in accordance with each HED objective, and finally consider lessons learned.

A. Course Content Compliance with Best Practices

Within the context of the grant, CIMH’s focus was not just higher education, although they did support the CERMES online courses with teaching staff. CIMH was working with IRI to build operational capacity, to train forecasters and meteorologists to use products effectively through workshops, and to build new data (Van Meerbeeck, interview).

The relationship with IRI was a greatly enhanced outcome of the HED grant and will be sustainable into the future, as the jointly developed deliverables have been integrated into the CIMH operational budget (the relationship “will withstand the test of time” as the products developed with IRI “will never be lost”) (Trotman interview). Indeed, products developed include Rainfall Forecasting maps (both three- and six-month forecasts), Drought Forecasts and forecasting tools (based on applications of IRI’s Climate Predictability Tool), and a Climate Impacts Database. The Rainfall Forecasting development was partially


supported by the HED grant. Distributing the forecasts, also funded by the HED project, involved social media and newsletters to potential users in agriculture, but also in the resource and environmental planning communities (Riley interview). The forecasts are highly useful products; CIMH recently improved the resolution of the models from 18 km to 4 km and expects to roll out a 2 km capacity (Riley interview).

The Drought Forecast programs are particularly important for users in the region, although they

are fairly new and are anticipated to be highly used, even if users are still learning about them (Simpson

interview). These were distributed (communicated) among Caribbean, Central American and South

Pacific forecasters through the HED co-funded training workshops offered during the CARICOF starting

with the 2014 “wet season” meeting in Jamaica (May 26-27, 2014). The rainfall forecast (six-month)

training sessions were offered concurrently with the CariCOFs in Trinidad May 2013, Jamaica May 2014,

and St. Lucia at the end of May 2015. These forecast workshops were highly useful and relevant to the

region (Ballantyne, Simpson, interviews), and although the content did not necessarily derive wholly

from HED funding, funding did support the six-month forecast product and did contribute to their

delivery (Trotman interview). So through workshops, HED helped deliver CIMH training to regional

stakeholders in addition to developing relevant products.

The Climate Impacts Database is a geo-spatial tool that links previous and present climate events, such as impacts from disasters (flood, landslide, etc.) and mitigation/planning strategies such as Standard Operating Procedures, to anticipate future impacts associated with predicted climate events. It was directly funded by HED, particularly with respect to support for a dedicated researcher for its development, using the post-doctoral fellow slot specified in the project proposal (Shelly-Ann Cox,

interview). According to the initial project proposal, that post-doctoral position was to help establish a

CERMES-CIMH-IRI joint research agenda, and while that did not happen, the slot was put to good use on

a CIMH applied research project.
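
The evaluators did not have access to the CID's actual schema; purely as a hypothetical illustration (field names invented, not CIMH's), the kind of geo-referenced record such an impacts database links to forecasts might look like:

    # Hypothetical sketch of a climate-impact record; every field name here
    # is invented for illustration and does not reflect the actual CID schema.
    from dataclasses import dataclass

    @dataclass
    class ImpactRecord:
        event_type: str      # e.g. "flood", "landslide", "drought"
        date: str            # ISO date of the observed event
        latitude: float      # geo-location of the impact
        longitude: float
        impact_summary: str  # damage observed, sectors affected
        response_sop: str    # Standard Operating Procedure invoked, if any

    # Querying past records against a forecast region lets authorities factor
    # historical impacts into early climate warnings.
    example = ImpactRecord("flood", "2013-12-24", 13.16, -59.55,
                           "road washout near St. Joseph (invented)", "national flood SOP")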

In addition to product development with IRI, CIMH offered training workshops for potential

users alongside the CariCOF meetings in May 2013 in Port of Spain, May 2014 in Jamaica, and December

2014 in Antigua. The topics of the training workshops were: 1) Seasonal Climate Outlook Methods: current forecast techniques, 2) Verification methods for seasonal forecasts, 3) Relative Operating Characteristics (ROC) (for diagnosing forecast accuracy), 4) Functions of the Climate Predictability Tool, 5) Drivers of Climate Variability, 6) Communication and Dissemination Strategies, 7) Forecasters’ role in climate outlook use by economic sectors: currently, and 8) Forecasters’ role in climate outlook use by


economic sectors: aims. In aggregate, these workshops satisfied (in terms of the HED evaluation rubric) the Relevance: Contextualization criteria: they reflected national (and regional) development priorities, since the workshops involved not only forecasting but also effective stakeholder communication strategies, and they reflected funder strategies (empowering national and regional water managers).
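
One of the workshop topics above, Relative Operating Characteristics (ROC), merits a brief illustration: it measures how well probabilistic forecasts discriminate events (say, a below-normal rainfall season) from non-events. The sketch below uses invented forecast probabilities and outcomes, not CIMH data, and the standard scikit-learn routines.

    # Hypothetical ROC-based forecast verification; the observed outcomes and
    # forecast probabilities are invented examples, not CIMH forecasts.
    from sklearn.metrics import roc_auc_score, roc_curve

    observed   = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]     # 1 = below-normal season occurred
    forecast_p = [0.8, 0.3, 0.6, 0.7, 0.7, 0.2,     # issued probability of a
                  0.9, 0.5, 0.1, 0.6]               # below-normal season

    auc = roc_auc_score(observed, forecast_p)       # 0.5 = no skill, 1.0 = perfect
    fpr, tpr, thresholds = roc_curve(observed, forecast_p)
    print(f"ROC area: {auc:.2f}")                   # ~0.90 for these invented values

An ROC area well above 0.5, as diagnosed in the workshop exercises, indicates a forecast with real discriminating skill.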

The "Overview" section above is necessary in order to place the workshop training in the

context of a traditional educational product. The workshops were non-traditional in the academic sense

since they were strictly for professional training. This section of the report will focus on the CIMH

workshops developed through the program, and assess their content as related to accepted best

practices in climate change education. The standards used are the same applied to the CERMES courses above: “Climate Literacy: The Essential Principles of Climate Science” (2009), drafted under the U.S. Global Change Research Program, with reference to the AAAS Benchmarks for Science Literacy where appropriate. These evaluation tools are outlined in Section II (above) of this report.

It must be noted immediately that the goal of these HED-supported training workshops was not

primarily to educate university students about climate change and the potential outcomes. Nor was it

to educate post-graduate users tasked with regional water management, as was the CERMES work. The

workshops were to train professional forecasters and meteorologists about using products developed by

CIMH and its partners (including IRI, a very valuable partnership supported by HED). In light of the very

narrow workshop focus and needs of participants, it is more important to look at how effective the workshops were than at the breadth of their coverage concerning climate literacy.

That said, we evaluated the training sessions using the Climate Literacy Guide, as we did for the CERMES online courses. We have not included the Hydrological Curriculum topics coverage since those are not directly related to meteorology or forecasting (although there is broad topical overlap). We have instead developed a list of major meteorological topics that would be included in graduate program curricula (derived from applied climatology topics from graduate curricula in applied meteorology at Plymouth University, University of Maryland, and Florida Institute of Technology) (Table 4).

The workshops offered by CIMH during the CariCOF were not arranged as courses with syllabi.

They were arranged as hands-on exercises, interactive discussions, and technical presentations to


introduce methodology. Some topic schedules were provided to the reviewers (for example, two rainfall forecasts for a three-month outlook). Based on the content extracted from the documents available, the topics covered are listed against the best practice and major topic rubrics in Tables 4 and 5.
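For orientation, the Climate Predictability Tool exercised in the workshops is built around regression-type statistical methods such as canonical correlation analysis (CCA) between a predictor field (e.g., sea-surface temperatures) and a predictand (e.g., station rainfall). The toy sketch below illustrates that general approach with scikit-learn on synthetic data; it is not CPT itself, and every value here is hypothetical.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)

# Synthetic training data: 30 years of a predictor field (20 SST grid points)
# and a predictand field (rainfall at 5 stations).
years, n_sst, n_stations = 30, 20, 5
sst = rng.normal(size=(years, n_sst))
signal = sst[:, :1]                        # one SST mode that drives rainfall
rain = 0.8 * signal + 0.5 * rng.normal(size=(years, n_stations))

cca = CCA(n_components=1)
cca.fit(sst, rain)                         # learn the coupled SST/rainfall mode

new_sst = rng.normal(size=(1, n_sst))      # this season's predictor field
print(cca.predict(new_sst).round(2))       # deterministic rainfall outlook
```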

Topics not covered. Examining Tables 4 and 5, one can observe that there were areas in the climate literacy best practices, as well as in the meteorological topics list, that were not covered by the CIMH workshops. This is not a problem and actually speaks to the workshops’ strength given their objectives. They were designed to provide practitioners with tools they can use, and were justified in assuming that the participants had prior broad knowledge of the underlying science. Examining

the participant lists (all meteorologists or climatologists), almost all participants in the short courses

were professionals with college or advanced degrees.

Participant feedback and evaluation. CIMH provided the participant assessment from a training evaluation held May 22-25, 2013, in Port of Spain. That assessment tool captured entry-level skill (0-100) and exit-level skill (0-100) through self-assessment. The skill level assessments for each topic within

the training session appear in Table 6. The mean skill gain was 26.8% and participants recorded an

increase in skill for an average of 6 topic areas (out of 8). In short, these were valuable workshops for

professionals. A perhaps inevitable limitation of the workshops was that they were neither offered for credit nor graded, so it is difficult to assess what was learned aside from the exit surveys. The percentage of participants completing the survey was high, however. For example, at the Port-of-Spain, Trinidad meeting

(May 22-25, 2013), of the 21 participants in the workshop, 17 filled out entrance and exit surveys. This

high level of participation speaks to Relevance: Contextualization. However, survey participation cannot

assess the effectiveness of instruction. That said, it was clear from interviews and from the workshop reports (e.g., the May 2013 Trinidad report) that the workshops were outstanding in terms of

Relevance: Context and Logic, Effectiveness (reaching targets) and Impact: expansive effects (regional,

business, government), reaching intended targets with results directly attributable to the intervention.
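The report does not state the exact formula behind the Table 6 gains (they are close to, but not always identical with, the difference between mean exit and entry scores, perhaps because per-respondent gains were averaged over varying response counts). A minimal sketch of the straightforward reading, on hypothetical survey rows:

```python
import numpy as np

# Hypothetical entry/exit self-assessments (0-100 scale) for one topic,
# one row per respondent; the actual Port-of-Spain survey had 17 respondents.
entry = np.array([50.0, 60.0, 40.0, 55.0, 65.0])
exit_ = np.array([75.0, 80.0, 70.0, 70.0, 85.0])

gain = exit_ - entry                         # per-respondent gain on the 0-100 scale
print(f"mean skill gain: {gain.mean():.1f}")  # reads as percentage points
print(f"respondents improving: {(gain > 0).sum()} of {gain.size}")
```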

Several professionals we talked to have taken CIMH trainings (Ballantyne, Simpson, Nurse, Paul,

Riley, and Royer interviews). Elizabeth Riley, Deputy Executive Director of the Caribbean Disaster Emergency Management Agency (CDEMA), stated that the CID (Climate Impact Database) added high value to their efforts and is strengthening disaster prevention in the region. Evidence-based planning has been lacking, and CIMH

products have been extremely helpful, she said (Riley interview). This statement was echoed by a

government water engineer in St. Vincent and the Grenadines (Ballantyne interview) and an agricultural

officer in Jamaica (Simpson interview), both of whom participated in three CIMH workshops.


Additionally, CIMH has improved its communication with stakeholders who need information delivered

in a useful and effective manner (Roy, Royer interviews).

General Observations and Concluding Comments. The format of the CIMH training does not

easily fit into the traditional academic framework for evaluation along the lines of the Climate Literacy

Guide. The workshops, which commenced with an August 2013 “train the trainers” session, were for training

professionals and regional stakeholders in how to use the models (not the background science).

However, the trainings did touch on a wide range of the more applied objectives of the

climate/meteorology topic list. CIMH’s main role was to work with IRI to develop the forecast models, the

"front end" work that was ultimately delivered in the CariCOF workshops. CIMH supported the CERMES

courses developed through this program, as WACEP course content was reviewed by CIMH and the

CariCOF seasonal forecasts by CIMH were included in the WACEP curriculum. Differences in teaching

approach between the two institutions existed but were to be expected given the different missions of

the faculty/staff at each. None of the CERMES WACEP students mentioned teaching-related problems

with the courses, so institutional differences were not perceived by students (efficiency: use of

resources). There is no doubt that the CIMH products (rainfall, drought alert outlooks and maps)

developed with IRI through this grant are extremely valuable to regional professionals. CIMH has shown

willingness, effort, and progress in delivering products effectively and in increasing content accessibility through better communication with stakeholders. The project was highly effective (efficiency: decision making) in developing and delivering quality science (results).


Table 4. Best practices in climate change as related to CIMH trainings held in conjunction with CariCOF meetings.

Major Meteorological Curriculum Components were derived from graduate curricula and texts. CID: Climate Impact Database; D: drought model. 1-8: the topics covered in the trainings, as numbered in the text.

Climate Literacy Guide topic | Workshop number(s) addressing topic

1. Understand Climate Function
1A. Essential principles | 1, 5
1B. Access credible information | 2, 3, 6, 7, 8, CID
1C. Carbon Cycle | 5

2. Demonstrate Knowledge of Earth Principles
2A. Using graphs and tables | 2, 5
2B. Understanding interconnections | 1, 5, 7, 8
2C. Anthropogenic vs. natural climate change |

3. Feedback loops
3A. Related to albedo | 5, 6
3B. Related to methane clathrate | 5
3C. Warming oceans and ice | 1, 5

4. Consequences
4A. Human health | 4, CID
4B. Changing productivity | 4, 7, 8, D
4C. Hydrological cycle | 1, 2, 4, 6, 7, 8

5. Communication
5A. Identify appropriate media for target | 6, 7, 8
5B. Communicating with lay audience | 6, 7, 8
5C. Effective communication (to policy makers) | 6, 7, 8

6. Adaptation/Mitigation
6A. Improve natural defenses (coasts, wetlands) | 7, 8
6B. Improve constructed defenses | 7, 8
6C. Effective forecast use | 1, 2, 4, 7, 8


Table 5. Major Meteorological Curriculum Components and Best Practice Meteorological Topics of Regional Importance.

Major Meteorological Curriculum Components | Workshop number(s) addressing topic

1. Emergence of Applied Climatological Impact Assessments | 1, 2, 7

2. Tools of Research
2A. Remote Sensing | 1
2B. Statistics | 1, 2, 3
2C. Ground Based Technology | 1, 2
2D. Models | 1, 2, 3
2E. Management | 4, 6, 7, 8

3. Physical and Biological Environment
3A. Hydrological processes and resources | 1, 5
3B. Soils/Animal/Vegetation Resources | 4, 7, 8

4. Culture and Climate
4A. Health | 4, 7, 8
4B. Planning/Building | 4, 7, 8
4C. Industry/Commerce and Transportation | 4, 7, 8
4D. Agriculture/Forestry/Fisheries | 4, 7, 8
4E. Energy Production | 4, 7, 8

5. Challenging Environments
5A. Urban Climate | 4, 7, 8
5B. Pollution and Climate | 4, 7, 8
5C. Climate Change as a Human Hazard | 4, 7, 8

Best Practice Meteorological Topics of Regional Importance | Workshop number(s) addressing topic

1. Rain intensity models | 1
2. Applying granular data to scale | 1, 2
3. Database use and access | 1, 2, 3
4. Modeling uncertainty | 1, 2, 3
5. Appropriate Statistical Methods | 1, 2, 3
6. Adding data to models for the region |


Table 6. Exit survey self-assessment for CIMH training sessions (Port-of-Spain, Trinidad, May 22-25, 2013). Skill level is from 0-100.

Topic | Average entry skill level | Average exit skill level | % gain
CariCOF seasonal climate outlook methodology – current forecasting techniques | 57.6 | 82.3 | 25.9
Methods used in verification of seasonal climate forecasts | 48.2 | 74.1 | 28.2
Relative Operating Characteristics as a measure of skill in forecasts | 40 | 74.1 | 31.8
Use of diverse functions of the Climate Predictability Tool | 43.5 | 69.4 | 25.9
Drivers of climate variability in the Caribbean | 64.7 | 82.4 | 18.8
Seasonal climate outlooks – communication and dissemination strategies | 54.1 | 83.5 | 30.6
Forecasters’ role in the uptake of CariCOF climate outlooks by economic sectors – current situation | 46.3 | 70.6 | 24.7
Forecasters’ role in the uptake of CariCOF climate outlooks by economic sectors – aim | 45 | 74.1 | 28.2
Average overall skill gain (%) | | | 26.8

Lessons Learned: The CIMH partnership was a mix of excellent impact and sustainability with some real obstacles to implementation. By the acknowledgement of Project Director Baethgen (interview), the lack of specific cooperation by CIMH early on made project design challenging.17 In contrast to CERMES, CIMH did not have specific outputs attached to its participation until the subcontract was signed: “When we were designing the proposal we were interacting with David [Farrell, CIMH Principal] and CIMH said they wanted to do it but it was not clear exactly what they were going to do.” According to Columbia interviewees, IRI brought CIMH into the project without full specification of its contracted deliverable outputs, which were only agreed to later, and this created problems until CIMH director Farrell, who travels constantly and was slow to return communications,

17 According to CIMH comments on an earlier version of this report, “the problem was not the cooperation of CIMH, but the engagement of CIMH in the initial discussions on the project.”


turned project management over to Chief of Applied Meteorology and Climatology Adrian Trotman.18 The failure to specify project tasks generated uncertainty about the collaboration; this was resolved, Baethgen said, but not until the project had already started and CERMES (which had been much more precise in specifying its deliverable outputs from the beginning) was on a different

timeline.19 To illustrate the degree of communication problems, let us provide a single example from

just the past month. In the Columbia University Final M&E document it is stated that as part of the

effort to foster collaboration between CIMH, CERMES, and Columbia, a post-doctoral researcher would be hired by CIMH. As of this past week, CIMH maintained (insistently) that the hire was a Research

Associate hired to develop a tool recording the impacts of climate on water availability, and not to

explore a common research agenda (CIMH draft comments).

As an aside, miscommunications between the parties involved in this project led to an extra

layer of complexity for the reviewers to navigate. Different interpretations of responsibilities, intentions

and objectives were commonly encountered. This was exemplified by the extensive comments received

from CIMH Principal Farrell on the initial draft of this document, even though he said in an earlier email

he would not be available to interview during the evaluators’ site visit to Barbados.

In addition, CIMH had low administrative capacity, and HED-USAID imposed onerous reporting requirements, the most burdensome, Baethgen said, of his 27-year career in project management.

Without dedicated project administrators or a budget in the relatively small HED-USAID grant to hire

administrative support, CIMH was unable to complete initial reporting in a timely way, and as a result,

was not able to formally start working on the project until six months after it had already started. And

even then, the solution was only found when IRI accepted responsibility for reporting on the CIMH elements of the project directly from New York, rather than leaving it in the hands of CIMH’s meteorologists and hydrologists, who are not project administrators (Baethgen, Curtis interviews). Baethgen’s

comments were also reflected in the interview with CIMH climatologists and modelers during the site

visit (Van Meerbeeck, interview).

Even with these obstacles, CIMH performed excellent and durable work. Deepening the scope

of the CariCOF meetings, introducing Central America Climate Forum data and personnel into the

Caribbean meetings (and having CIMH staff attend the Central America meetings) fortified the data and

18 CIMH comments on an earlier version of this report state that Trotman took over in May 2013, as the focus of the project came more into line with his meteorological expertise.

19 Farrell said in comments on an earlier draft that “There were two key factors for the delays as noted: 1) the lack of awareness that CIMH had to complete the baseline tool as CIMH was not the primary regional partner on the project and it was assumed that only the primary parties were required to complete the tool . . . and 2) the late hiring of the research associate [Cox]… “


human resources of the region. Furthermore, the regionally-relevant forecasting tools adapted by CIMH

increase the power of regional actors to anticipate medium-term climate variability and, thus, minimize

disruptions. CIMH did not solve its internal shortcomings by hiring dedicated administrators in time for the HED project,20 but its meteorological work is the best in the Caribbean region, and this project leveraged the institute’s expertise to its fullest.

XV. Overall Project Achievements and Limitations

“Building Capacity to Manage Water Resources and Climate Risk in the Caribbean” was a successful project. Indeed, parts of it were exemplary: mainly the applied and practical training offered in the CERMES courses and the CIMH professional training, the CIMH extension of its extended forecasts to encompass valuable new tools by applying Columbia technology, the “discussion briefs” by CERMES, and, if that meeting goes well, the encounter between CERMES and CIMH with water resource managers at the 2015 CariCOF meeting in St. Lucia. However, the failure of Columbia/IRI to better specify and incentivize means of IRI-CERMES-CIMH collaboration (and especially to leverage CERMES-CIMH synergies) kept the project from completely fulfilling HED’s program results framework (translating management efficiency and project relevance into effectiveness within a range of rubrics, including Efficiency, Impact, and Sustainability). The principal shortcomings of the project were two: 1) IRI/Columbia did not formulate a means for CERMES and CIMH to cooperate with each other in better specified areas within the execution of the project, and 2) the project fell short in its public projection: in establishing a future research agenda, fortifying communities of practice, and conducting public outreach regarding the project’s deliverables. We briefly address, in

of practice, or in public outreach regarding the deliverables from the project. We briefly address, in

broader terms, the implications of the successes and failures of the project in the following section, and

then offer concrete recommendations to improve the odds of success for future HED/USAID projects.

Both Barbados-based institutions helped improve technical knowledge in Barbados and

regionally throughout the Caribbean. HED/USAID identified the central partners for generating vital

applied knowledge on water resources and climate risk, and these partners built courses which did, as we have shown, follow best practices in creating an encompassing curriculum addressing central issues in contemporary applied climate and water sciences, and delivered these courses in a manner compliant

with the HED objectives of efficiency, relevance, and effectiveness. The CIMH forecasting trainings using

the IRI Climate Predictability Tool appear to be sustainable and increasingly effective with each annual

20 In comments on an earlier version of this report, CIMH staff report that they have designed an administrative work structure in relation to other projects with USAID and other funders.


CariCOF meeting, where new stakeholders (like the Central American Climate Forum meteorologists and

Caribbean area water managers) seem to be constantly brought in. The CERMES courses will also

apparently be sustainable, as the institute’s director (Cashman interview) has pledged to construct two

“full” MA courses from the material; one on water management and the other on forecasting (although

University of West Indies approvals are still being received for this conversion of the four one-credit

courses into two three-credit courses).

Both CERMES and CIMH focused in this project on the relation of water issues to climate

change. This focus seems increasingly necessary, as Barbados does not yet seem to be fully considering “climate change” in addition to “climate variability.” Indeed, while we interviewed several members of the

island’s inter-governmental task force to address climate change issues (Nurse, Rowe, Taylor), we were

unable, despite more than a half dozen phone calls and emails, to get a meeting with the coordinator of

that task force, Ricardo Ward, of the Ministry of Environment and Drainage. Members of the inter-

governmental group (Nurse, Rowe, Taylor) did not seem to have many details on the concrete tasks at the group’s command, and while water managers (Paul interview) had strong ideas about the impact of climate-related sea level rise on salt water intrusion into the fragile, porous limestone fresh water supply, coastal authorities and government regulators of tourism (the largest component of the service sector, which comprises 80 percent of the island’s economy) did not mention “climate change.” While one

interviewee suggested that mention of “climate change” might be controversial, precisely because

authorities wished not to raise doubts about preparations by the tourism-driven economy, it appeared

that a comprehensive special office, which could be called whatever authorities want, is needed to

receive reporting from representatives “embedded” in the central ministries (Planning, Finance,

Environment and Drainage, Tourism). It seems that only such an arrangement can ensure that sufficient

planning occurs for medium- and long-term climate adaptation. As stated by Hutchinson (interview),

Barbados has invested in the Caribbean Catastrophe Risk Insurance Facility (CCRIF) and is seeking to

hedge climate risk. However, explicit attention needs to be given to the trade-offs between

preparations to minimize climate event impacts, and “hedging” against climate risks by paying in to an

international insurance facility.21

Although motivating public bureaucracies to consider climate change impacts which may be decades away has proven to be an international problem (sources), IRI Project Director Baethgen (interview) identified perhaps the greatest benefit of the project: getting officials to take incremental

21 Barbados has received two small payouts from the CCRIF to date.


precautions to save water and clean up the environment, even if not to mitigate the effects of

climate change. He said that too many climate change scientists predict weather 60 years hence, and

that perhaps the best way to improve adaptation is to focus on variability in the current climate

(droughts, storms, hurricanes, etc.) rather than climate change. “If the focus is on 2080,” Baethgen said,

“then climate change becomes a problem of your grandchildren. The truth is the climate is changing

already. What can we do today? If you focus on 2080, what decision-maker is concerned about that?

When you focus on 2080 you are giving policymakers the perfect recipe for paralysis.” This pragmatic

approach, which minimizes policymaker consideration of future risk and instead focuses on present

variability and how this may be combatted, is effective and location- and timing-relevant, and points to

an area where greater CERMES-CIMH collaboration could have improved the project.

The glaring weakness of the project was this lack of collaboration among the three principal institutions early on, which hampered fulfillment of two objectives in the original call for proposals: the establishment of a joint research agenda, and public outreach to sensitize policymakers, a community of practice, and the public regarding climate change vulnerability and adaptation in the Caribbean. While

we have already discussed the miscommunications between Columbia and CIMH during project start-up

and how this led to a six-month delay in that institute’s participation, the problem seems to be bigger.

First of all, when asked about their collaboration failures, CIMH officials asked “What collaboration?”

(Trotman and Van Meerbeeck interviews) and cited their sub-contract, which did indeed specify a range

of “CIMH only” responsibilities.

The CERMES leadership (Cashman interview) agreed that the collaboration was not specified, but argued that such collaboration was necessary to achieve a more comprehensive result from the project, one that would have projected more strongly to specialists and the public. And Baethgen, the overall project director (interview), agreed that the lack of early collaboration made any joint dialog impossible. Baethgen did state that early on the IRI-CERMES collaboration was prioritized as the

CERMES courses were being designed and implemented, whereas later in the project, after IRI took on

some of the CIMH project reporting requirements, the CIMH forecasting tool development was the

project’s priority. From the beginning, Baethgen said, “CIMH said they wanted to do it [participate in

the project], but it was not clear what they were going to do. The idea was that when the project

started then CIMH would do it, and when we started interacting more with mid-level people at the

CIMH we were able to resolve the problem.” CIMH Director David Farrell was not available to interview,

but did send comments on an initial draft of this report.


The problem was indeed solved, as CIMH executed its charge quite effectively once that charge was developed (in the institute’s sub-contract, though not in the original call for proposals or in the Columbia/IRI proposal), but the lack of communication between CERMES and CIMH did diminish the

effectiveness of the project in reaching out to communities of practice and the public. While “Building

Capacity to Manage Water Resources and Climate Risk in the Caribbean” was better suited to drawing

linkages among technical experts relating water resource usage and climate change, if CERMES and

CIMH leadership had worked together to elaborate joint outreach strategies, and brainstormed

regarding joint research strategies and community of practice impacts, they might have achieved

stronger results. Along those lines, we offer recommendations in the final section of this report.

The last word on responsibility for any project shortcomings was articulated by HED’s Paez-Cook

(interview). “The US university [implementing partner] is expected to have a strong leadership role in

bringing the partners together and expanding impact. That happens as a rule . . . My experience from

managing this partnership is that I am not sure how much that happened in this case.” Columbia/IRI did

an excellent job of working with local partners to diffuse applied knowledge (forecasting tools and

course content) in the Caribbean, after selecting the right set of sub-contractors. Delays and ambiguities

in implementation seem to have diminished the end results of the project in some areas, but in the vital ones, the project was extremely successful.

XVI. Recommendations

1) Incentivize collaboration between key partners in the design of future projects, and prevent

“triangular” projects (such as CERMES-CIMH-IRI collaboration) from degenerating into two

parallel but non-interactive relationships (CERMES-IRI and CIMH-IRI). The failure to

specify precise responsibilities of each sub-contractor early in the project created ambiguities in

how the two Barbados institutes would work together, despite the clear statement in the initial

call for proposals that such collaboration was central to the project. Responsibilities must be

specifically and clearly delineated early in the process so that all parties understand how the

collaboration can transpire and to ensure the best chance for such collaboration to succeed. For

example, CERMES’ publication for the community of practice on “Water and Food Security:

Facing Up to Climate Variability: Discussion Brief No. 4” recommended the adoption of “season

time-frame” reporting (3), which is precisely what CIMH developed. Had CERMES coordinated

with CIMH, the two institutions could have reported the CIMH approach in the CERMES


document and distributed that document at CIMH events. This would have required only a very basic level of cooperation, which was absent from the project.

2) Improve specification of project tasks from the outset, and make sure that these are well

defined at a high level in project management, and across implementers. HED should have

assured there was shared understanding of terms like community of practice, policy, and

research agenda, rather than hoping these specifications would “percolate up” as the project

got underway. Establishing and advancing any joint research agenda among the three institutions should have been addressed at annual meetings of the principals rather than delegated to a post-doc (or research associate, whatever the title). Delegating that

central task to a relatively inexperienced researcher dramatically diminished its prospects for

effectiveness and impact.

3) Strive not to add new implementing sub-contractors to projects after the projects have

already been conceptualized and defined. The State Department, however well-intentioned in

pushing for greater CIMH involvement, generated some difficulties for all parties in the project.

The scale-up of the project eventually got back on track, but, as Baethgen and Paez-Cook said, it

was never clear how CIMH’s role would be better specified, particularly with regard to the ill-

defined research agenda, “community of practice,” and public policy outreach areas. Each of

the two sub-contractors fell back into their traditional role of providing products they knew best

(CERMES classes and CIMH training), and the more nebulous but higher value-added areas never did get addressed (the late May 2015 meeting of the three parties excepted, though that is likely “too little, too late” to achieve any sort of sustainable project success).

4) Reduce HED reporting requirements and/or make sure that funds are sufficient to cover a

project administrator at each sub-contractor agency. Columbia, CIMH, and CERMES were

apparently caught off guard (Baethgen, Cashman, Paez-Cook, interviews) by onerous reporting

requirements. The size of the CIMH sub-contract (about $132,000) was not really big enough to

include a project manager, although one was needed. In general, HED reporting requirements

were said by IRI managers to be extreme (Curtis interview), and even the most burdensome in

their entire portfolios (Baethgen interview). CIMH officials (Van Meerbeeck and Trotman,


interviews) said that while only two percent of their sub-contract was dedicated to administrative

work, they needed a project manager, which they did not have. HED might strive to reduce this

administrative burden so that more grant funds can go to project expenditures in country.22

5) Continue promoting climate adaptation in the long term, but by emphasizing climate

variability now. The implementers expertly avoided what Baethgen referenced as “policymaker

paralysis” by focusing on present climate forecasts rather than those which might hold true

decades from now. The high degree of uncertainty of future forecasts, combined with the

imprecision of forecasting across small areas (like Barbados, which is 166 square miles, or about

2.5 times the size of the District of Columbia), makes even present forecasting tricky, as do weather changes that are faster and more chaotic than those of non-tropical climates. The CIMH has

been able to greatly improve the precision and relevance of forecasts as part of this project, and

CERMES has helped provide policymakers and other stakeholders with needed context for

interpreting this data.

6) Continue adding more niche sectors and stakeholders to those presently impacted by project

gains, by publicizing project outcomes to “community of practice” stakeholders in the media,

emergency management, tourism, agriculture, water management, and environmental

regulation sectors. Late in the project, to be sure, implementers organized a presence of water

sector managers from around the Caribbean at the CariCOF meeting, which has, over its last few

iterations, emerged as an event where stakeholders from sectors with interests in meteorology

(like emergency management professionals) come together, along with meteorologists from the

region. That is an important precedent, and one which could be complemented in the future by

much better synergies between climate change science and policy “think tanks” like CERMES,

and applied research institutes like CIMH. Indeed it may be that these two institutions take

each other “for granted” and/or compete for attention in the Barbados policy space. In future

projects, it may behoove funders to locate sub-contractor institutions across the Caribbean so

22 While Columbia received some $893,000 initially, CIMH only received $132,000 and CERMES $149,000, meaning that the expenditure by sub-contractors in Barbados was about 24 percent of the project total. Evaluators did not receive detailed budget numbers after the “scale up” but understand the total budget was about $1.5 million and that each sub-contract was approximately doubled. While understandable in this case given that Columbia faculty taught the CERMES courses, worked with CIMH officials developing the Caribbean Climate Predictability Tools, funded the three Caribbean Columbia MA students, and took over reporting, especially from CIMH, some projects have achieved a better rate of “in country” funds transfer.


that these institutions must build a relationship from scratch and hence must communicate

better and build upon each other’s work.

7) Add oceanography and seismology applied science and, more concretely, ocean current

prediction information to the meteorological information now being generated in CIMH

extended forecasts released at CariCOF meetings. There does seem to be a bit of a “turf” issue among

institutions, as the CIMH said that coastal management institutions need to address seismic and

oceanographic issues which could also impact climate events as they are felt in Barbados and

across the Caribbean. A CIMH project manager (Trotman, interview) said tsunami risk-

projection was not part of the scope of his institution’s work, even though climate change

clearly impacts this, and oceanographic phenomena, such as water currents and diminishment

of coral reef buffering of beaches through coral bleaching, also should be factored into climate

change considerations. Moreover, Barbados seems to have only a couple of ocean current-

measuring buoys (Hutchinson, interview), although most stakeholders did not have much

information about access to seismic or oceanographic data.

8) Integrate climate variability forecasts and better understanding of fresh water variability with

emergency management systems and help integrate risk management and insurance into

stakeholder considerations. The Barbados Emergency Management System is extremely

antiquated and underfunded (Thomas interview). More precise coding of threats is needed so

that stakeholders can know not just that a threat exists, but exactly the nature of the threat,

especially if telecommunications falter. Furthermore, while the CCRIF has been established as a

Caribbean-wide insurance pool from which payouts can be received quickly (within three days) if

natural disaster strikes, risk assessment needs to be more explicitly brought into the

considerations of water managers, coastal zone administrators, environmental regulators, and

other stakeholders in the public sector as well as the private sector. More precise consideration

of “back up” supplies and contingencies should be part of a more comprehensive national plan

by Barbados – and other Caribbean nations – to address climate change adaptation explicitly, even if they are also more directly addressing “climate variability” (as per Recommendation 5) at the present moment.


APPENDIX

Table 1A. HED METHODOLOGY GUIDING QUESTIONS

Criterion (Sub-criterion) | Guiding Questions | Categories

Efficiency (Management Systems) | How did the partners make use of results-based management systems? | Results-based management; Timeliness of activities; Use of Resources (Human, Financial and Non-Financial); Feedback Loops

Efficiency (Decision Making processes) | How did the partnership make decisions regarding implementation? Did decision making processes contribute to efficient program implementation? |

Relevance (Contextualization) | To what extent were program activities and interventions adapted for the local context? | Local and national development priorities; Responsiveness to beneficiaries’ needs; Reflection of funder strategies

Relevance (Logic) | Did program design consistently link activities and outputs logically to program outcomes and objectives? | Causal linkages in the results framework; Single/multiple pathways

Effectiveness (Results) | To what extent have the intended outputs and outcomes been achieved or are likely to be achieved? How were the results affected by program fidelity? (Program fidelity may be defined as the extent to which delivery of an intervention adheres to the protocol or program model originally developed.) | Progress vs. targets; Program fidelity

Impact (Outcome level results) | What were the outcome results of implementation? To what degree are the outcome results attributable to implementation? | Intended; Unintended; Positive; Negative; Results directly attributable to intervention; Results not directly attributable to intervention

Impact (Expansive effects) | Is there evidence that outcome level results had a wider effect than anticipated? Consider strategic alliances, i.e., host country higher education institution-private sector partnerships | Local ownership; Regional institutions; Businesses affected; Government agencies; Other entities outside HCI

Sustainability (Program effects) | Do you think that program effects (development results and/or host-country and U.S. higher education partnership) are likely to continue over time after funding has ceased? If so, what aspects of the program have the best chance of continuing? In your opinion, what would be the major factors that would influence these lasting effects? | Financial resources management; Non-financial resource planning and management; Results-based management; Institutionalization; Stakeholder engagement; Continued relevance of program design
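To make the rubric's hierarchy explicit, the sketch below encodes Table 1A as a criterion -> sub-criterion -> categories mapping against which coded interview evidence could be tallied. This is our illustration, not an HED artifact; the names follow the table, but the structure and tallying are assumptions.

```python
# Illustrative encoding of the Table 1A rubric (names follow the table;
# the structure and the tallying step are our own, not HED's).
HED_RUBRIC = {
    "Efficiency": {
        "Management Systems": [
            "Results-based management", "Timeliness of activities",
            "Use of Resources (Human, Financial and Non-Financial)",
            "Feedback Loops"],
        "Decision Making processes": [],
    },
    "Relevance": {
        "Contextualization": [
            "Local and national development priorities",
            "Responsiveness to beneficiaries' needs",
            "Reflection of funder strategies"],
        "Logic": [
            "Causal linkages in the results framework",
            "Single/multiple pathways"],
    },
    "Effectiveness": {
        "Results": ["Progress vs. targets", "Program fidelity"],
    },
    "Impact": {
        "Outcome level results": [
            "Intended", "Unintended", "Positive", "Negative",
            "Results directly attributable to intervention",
            "Results not directly attributable to intervention"],
        "Expansive effects": [
            "Local ownership", "Regional institutions", "Businesses affected",
            "Government agencies", "Other entities outside HCI"],
    },
    "Sustainability": {
        "Program effects": [
            "Financial resources management",
            "Non-financial resource planning and management",
            "Results based management", "Institutionalization",
            "Stakeholder engagement", "Continued relevance of program design"],
    },
}

# Tally one hypothetical coded interview excerpt against a sub-criterion.
tally = {(c, s): 0 for c, subs in HED_RUBRIC.items() for s in subs}
tally[("Relevance", "Contextualization")] += 1
```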


Table 2A. Comparisons with other Caribbean nations: Environmental Performance Index (EPI), Climate Risk Index (CRI), and Fresh Water.

Country | Pop. | GDP pc | EPI Score (highest/lowest sub-scores) | EPI Rank | CRI Rank/Score | Death toll | GDP loss per capita | Fresh water availability (km3/year) | Per capita (m3)

Barbados | 284,000 | 12,660 | 44.5 (Highest: Air Quality 100; Lowest: Biodiversity and Habitat 2.28) | 118 | 154/137.50 | 0.05 | 0.119 | 0.1 | N/a

Bahamas | 377,000 | 21,540 | 46.58 (Highest: Air Quality 98; Lowest: Climate and Energy 0) | 105 | N/a | N/a | N/a | 0.0 | 54

Haiti | 10.17 million | 760 | 19.01 (Highest: Air Quality 69.6; Lowest: Water Resources 0 and Fisheries 0) | 176 | 3/16.83 | 307.50 | 1.729 | 14.0 | 1,261

Trinidad | 1.341 million | 14,400 | 52.28 (Highest: Air Quality 99.88; Lowest: Water Resources 5.25) | 79 | 158/142.33 | 0.85 | 0.011 | 3.8 | 2,863

Jamaica | 2.71 million | 5,140 | 58.26 (Highest: Air Quality 96.33; Lowest: Fisheries 0) | 55 | 52/60.5 | 4.75 | 0.846 | 9.4 | 3,464

Mexico | 120.85 million | 9,600 | 55.03 (Highest: Air Quality 87.09; Lowest: Forests 19.87) | 65 | 48/57.67 | 0.14 | 0.194 | 457.2 | 3,343

Honduras | | | | | 1/10.17 | 329.80 | 2.623 | |

Sources: Hsu et al., 2014; Kreft et al., 2014; World Bank, 2015; World Water, 2015.


Table 3A. Comparisons with other Caribbean nations: Water Resources.

Country | Total renewable water resources (km3) | Freshwater withdrawal (km3/y) | Water use per capita (m3/y)
Antigua and Barbuda | 0.05 | 0.01 | 97.67
Bahamas | 0.02 | NA | NA
Barbados | 0.08 | 0.01 | 371.3
Costa Rica | 112.4 | 5.77 | 1,582
Dominica | NA | 0.02 | 244.1
Dominican Republic | 21.0 | 5.47 | 574.2
Haiti | 14.03 | 1.20 | 134.3
Jamaica | 9.4 | 0.93 | 369.9
St. Kitts and Nevis | 0.02 | NA | NA
Saint Lucia | NA | 0.02 | 98.22
St. Vincent/Grenadines | NA | 0.01 | 92.59
Trinidad and Tobago | 3.84 | 0.23 | 177.9

Source: CIA FactBook, 2015. https://www.cia.gov/library/publications/the-world-factbook/wfbExt/region_cam.html


Table 4A. Columbia University Final M&E Objectives


Table 5A. Caribbean M-E Table (RF, PMP, PIP)


Figure 1A. Barbados population density

Source: EPortfolio. 2014. Retrieved from http://olivia-eportfolio.wikispaces.com/Humanities. Accessed on March 25, 2015.

“Building Capacity to Manage Water Resources and Climate Risk in the Caribbean” by Columbia

University and University of the West Indies/Centre for Resource Management and Environmental

Studies (UWI/CERMES)

Instruments for Gathering Information Via Three Methods (Method 1 questions)

Todd Eisenstadt and Stephen MacAvoy

SAMPLE QUESTIONS

Questions for project/partner administrators. As administrators, they may have something to add on almost all aspects of the project. This question list is largely unchanged from the original master list.

1. Efficiency (management systems): How did the partners make use of results-based management systems? With regard to issues briefs? With regard to courses (including personnel sharing)? With regard to research and knowledge exchange? With regard to diffusion and community of practice?


With regard to sharing technology or models effectively? "Moodle" was used as an online course management system. Was this system effective? What would have enabled more effective use of course materials and exercises? Lessons learned?

2. Efficiency (decision-making processes): How did the partnership make decisions regarding implementation? Did decision-making processes contribute to efficient program implementation? Why or why not? Did events get executed in the timeframe planned? Why or why not? Were financial and human resources adequate? Why or why not? Did activities have mutually reinforcing benefits of any kind? With regard to issues briefs? With regard to courses and personnel? With regard to research and knowledge exchange? With regard to diffusion and community of practice? With regard to sharing technology or models effectively? Lessons learned?

3. Relevance (contextualization): To what extent were program activities and interventions adapted for the local context? What was the local context for which activities were devised (i.e. what were the most important Barbados-centered issues)? How were activities tailored directly to this context? What was the regional context for which activities were devised (i.e. what were the most important Caribbean Basin-centered issues)? How were activities tailored directly to this context? Local and national development priorities? Responsiveness to beneficiaries’ needs (students, local policymakers)? Responsiveness to funder strategies (specifically and generally)? With regard to issues briefs? With regard to courses? With regard to research and knowledge exchange? With regard to diffusion and community of practice? With regard to sharing technology or models effectively? With regard to relevance of outcomes: participants/beneficiaries were from the larger Caribbean region. Did they perceive relevance to their local situations? Are there any metrics to measure that perception? Your FY2013 Custom Survey (in Annual Progress Report 10/1/2012-9/30/2013, document list 12) mentions "new in-country stakeholders participating in the CariCOF." The number of attendees given is double your target for 2013 and 2014, which is a credit to the project. What positions did these attendees have? The report indicates that the project offered pre-COF tech training, CERMES interns, CU Masters students, Simon Mason's training at CIMH in July, and "new in-country stakeholders" (10 of those). Who were the stakeholders? Is the inclusion of these stakeholders a measure of initial successful outreach to greater Barbados? Project collaborating stakeholders listed include: Red Cross/Crescent, Climate Services Partnership (online data sharing? towards CoP, right?), Caribbean Meteorological Organization, NOAA, and WMO.


What did this collaboration entail? Were they the attendees mentioned in 1 above? Did they collaborate in additional ways, and if so, what were those? Lessons learned?

4. Relevance (logic): Did program design consistently link activities and outputs logically to program outcomes and objectives? Please give examples for each of the main project areas. Were there causal linkages between program results (i.e. did one activity lead naturally to another)? Were there multiple pathways for achieving success on occasions, and if so, why did you choose the path you did? Consider these points when answering the questions regarding project components below… With regard to issues briefs? With regard to courses? With regard to research and knowledge exchange? With regard to diffusion and community of practice? Were all of the objectives and activities realistic? Did some have to be scaled back for reasons of logic? Were materials/courses offered considered “needed” by the target audience? Evidence of interest? Could content being offered be adjusted as course offerings unfolded? Lessons learned?

5. Effectiveness (results): From a project administration standpoint, to what extent have the intended outputs and outcomes been achieved or are likely to be achieved? How were the results affected by program fidelity (how much an “intervention” adheres to the protocol or program model originally developed)? Please be specific. In other words, were all targets reached? If not, how much progress was made? Why? How much did you have to deviate from initial plans (i.e. program fidelity)? Consider these points when answering the questions regarding project components below… With regard to issues briefs? With regard to courses? With regard to research and knowledge exchange? With regard to diffusion and community of practice? Could content being offered be adjusted as course offerings unfolded? Have climate-related short courses been integrated into the curriculum for CERMES (UWI) into the future? Do they "stand alone" (such as those offered in January and February 2013 online and in person at CERMES, for example) or are they part of broader courses (or other forms of content)? If these are integrated as separate courses, can you describe how that was done? If they are not integrated, can you also discuss the form they are taking? Have other institutions (such as other branches of UWI or other universities) implemented any of this curriculum? If so, what role did your courses play? Can you describe examples of how CIMH assisted CERMES in curriculum development (shared data, models, personnel, etc.)? What could have fostered better collaboration? How did you track the success of students? Internships in the project? Positions obtained by former students in project courses? Publications generated from project research? Did these increase over time? What would be a standard of success (and why)? Is a system in place to continue to track alumni? Could that speak to the legacy of the program partnership? How many students have taken courses that use information from the partnership? Certificate or MS or other non-degree? Are they spread Caribbean-wide (as per the established goal)? The progress reports suggest that direct beneficiaries were targeted to be 1,420 in FY2014 (actual 183). What happened? The FY2015 target was 102. Why was it so different from the FY2014 goal? Was it revised? Why or why not?


Lessons learned?

6. Effectiveness (science collaboration): How was technology proposed to be shared between partners? How were data-rich products of CIMH proposed to be incorporated into the CERMES curriculum (developing shared databases, curricular collaboration, personnel sharing, etc.)? What transfer/sharing was accomplished (what were the materials transferred between CIMH and CERMES)? Were the processes used effective? Why? Did products of collaboration reach users? Did they help with outreach to those outside of academia? What efforts were undertaken to support scientists from CIMH and CERMES in collaborating? Were these effective? Why or why not? Press coverage or academic output metrics resulting from the collaboration/partnership? (Becoming an international center for excellence was a goal of the RFP.) Lessons learned?

7. Impact (outcome level results): What were the outcome results of implementation? To what degree are the outcome results attributable to implementation? Can you discuss intended impacts and also any unintended ones? What were the positive and negative impacts? What results might be attributable directly to the intervention? What results may have had something to do with the intervention but may not be directly attributable? With regard to issues briefs? With regard to courses? With regard to research and knowledge exchange? With regard to diffusion and community of practice? Were outcomes properly requested by the funder? Were they properly specified by executors of the contract? Were they easily reported? Lessons learned?

8. Impact (expansive effects): Is there evidence that outcome level results had a wider effect than anticipated? (Consider strategic alliances, i.e. host country higher education institution-private sector partnerships.) What measurable expansive effects can you describe? With regard to issues briefs? With regard to courses? With regard to research and knowledge exchange? With regard to diffusion and community of practice? Was there “local ownership” by stakeholders of the project as it proceeded? How? Did local public institutions actively participate? Did regional institutions (Caribbean-wide) actively participate? Did businesses affected participate (energy sector, tourism, water-dependent industry, health and insurance industries, fishing and agriculture)? The 2013 progress report documents a meeting in Port of Spain, Trinidad, where forecasters met from 16 nations. Among what was taught were ways to "improve effective communication" of forecasts to users (economic, policy, planners, political). How was this done? Lessons learned?


9. Sustainability (program effects): What challenges to integrating short courses and other offerings into the MS were anticipated? Characterize certificate program effectiveness versus the MS program (from a sustainability standpoint); if more manageable/realistic, why? What could be done to improve sustainability long term? With regard to administration of new curriculum development, what were the challenges to sustaining the programs? With regard to personnel, what were the challenges to sustaining the programs? With regard to maintaining student interest, what were the challenges to sustaining the programs? With regard to maintaining stakeholder interest outside of academia (political leaders, policy makers, and those with economic interests), what were/are the challenges to sustaining the programs? Lessons learned?

“Building Capacity to Manage Water Resources and Climate Risk in the Caribbean” by Columbia University and University of the West Indies/Centre for Resource Management and Environmental Studies (UWI/CERMES)

Instruments for Gathering Information Via Three Methods (Method 1 questions)

Todd Eisenstadt and Stephen MacAvoy

Questions for technology and science users

1. Relevance (logic): Given what you expected from the partnership/program, did program design consistently link activities and outputs logically to program outcomes and objectives? Please give examples for each of the main project areas. Were there causal linkages between program results (i.e. did one activity lead naturally to another)? How would you have characterized success for the program from your vantage point? Were there multiple pathways for achieving success on occasions, and if so, why did you choose the path you did? Consider these points when answering the questions regarding project components below… With regard to issues briefs? With regard to courses? With regard to research and knowledge exchange? With regard to diffusion and community of practice? Were all of the objectives and activities realistic? Did some have to be scaled back for reasons of logic? Were materials/courses offered considered “needed” by the target audience? Evidence of interest? Could content being offered be adjusted as course offerings unfolded? Lessons learned?

2. Effectiveness (results): Given what you expected from the partnership/program, to what extent have the intended outputs and outcomes been achieved or are likely to be achieved? How were the results affected by program fidelity (i.e. sticking to the original plan), versus program flexibility? Did “interventions” or alterations to plans adhere to the objectives of the program ideas/intentions or models as originally developed? Please be specific. In other words, were all targets reached? If not, how much progress was made? Why? How much did you have to deviate from initial plans (i.e. program fidelity)? Consider these points when answering the questions regarding project components below… With regard to issues briefs? With regard to courses? With regard to research and knowledge exchange?


With regard to diffusion and community of practice? Could content being offered be adjusted as course offerings unfolded? Have climate-related short courses been integrated into the curriculum for CERMES (UWI) into the future? Do they "stand alone" (such as those offered in January and February 2013 online and in person at CERMES, for example) or are they part of broader courses (or other forms of content)? If these are integrated as separate courses, can you describe how that was done? If they are not integrated, can you also discuss the form they are taking? Have other institutions (such as other branches of UWI or other universities) implemented any of this curriculum? If so, what role did your courses play? Can you describe examples of how CIMH assisted CERMES in curriculum development (shared data, models, personnel, etc.)? What could have fostered better collaboration? Lessons learned?

3. Effectiveness (science collaboration): How was technology proposed to be shared between partners? How were data-rich products of CIMH proposed to be incorporated into the CERMES curriculum (developing shared databases, curricular collaboration, personnel sharing, etc.)? What transfer/sharing was accomplished (what were the materials transferred between CIMH and CERMES)? Were the processes used effective? Why? Did products of collaboration reach users? Did they help with outreach to those outside of academia? What efforts were undertaken to support scientists from CIMH and CERMES in collaborating? Were these effective? Why or why not? Press coverage or academic output metrics resulting from the collaboration/partnership? (Becoming an international center for excellence was a goal of the RFP.) Lessons learned?

4. Impact (outcome level results): What were the outcome results of implementation? To what degree are the outcome results attributable to implementation? Can you discuss intended impacts and also any unintended ones? What were the positive and negative impacts? What results might be attributable directly to the intervention? What results may have had something to do with the intervention but may not be directly attributable? Consider these points when answering the questions regarding project components below… With regard to issues briefs? With regard to courses? With regard to research and knowledge exchange? With regard to diffusion and community of practice? Did the funder properly request outcomes? Were they properly specified by executors of the contract? Were they easily reported? Lessons learned?

5. Impact (expansive effects): Is there evidence that outcome level results had a wider effect than anticipated? (Consider strategic alliances, i.e. host country higher education institution-private sector partnerships.) What measurable expansive effects can you describe? Include research collaborations or partnerships in general. With regard to issues briefs? With regard to courses?


With regard to research and knowledge exchange?
With regard to diffusion and community of practice?
Was there “local ownership” of the project by stakeholders as it proceeded? How? Did local public institutions actively participate? Did regional (Caribbean-wide) institutions actively participate? Did affected businesses participate (energy sector, tourism, water-dependent industry, health and insurance industries, fishing and agriculture)? The 2013 progress report documents a meeting in Port of Spain, Trinidad, at which forecasters from 16 nations met. Among the topics taught were ways to "improve effective communication" of forecasts to users (economic, policy, planners, political). How was this done? Lessons learned?

6. Sustainability (program effects) (this set of questions applies only to scientists in the partner institutions): What challenges to integrating short courses and other offerings into the MS were anticipated? Characterize the effectiveness of the certificate program versus the MS program (from a sustainability standpoint); if it was more manageable/realistic, why? What could be done to improve sustainability over the long term? With regard to administration of new curriculum development, what were the challenges to sustaining the programs? With regard to personnel, what were the challenges to sustaining the programs? With regard to maintaining student interest, what were the challenges to sustaining the programs? With regard to maintaining stakeholder interest outside of academia (political leaders, policy makers, and those with economic interests), what were/are the challenges to sustaining the programs? Lessons learned?

“Building Capacity to Manage Water Resources and Climate Risk in the Caribbean” by Columbia University and University of the West Indies/Centre for Resource Management and Environmental Studies (UWI/CERMES)

Instruments for Gathering Information Via Three Methods (Method 1 questions)

Todd Eisenstadt and Stephen MacAvoy
Questions for educators/education practitioners

1. Relevance (contextualization): To what extent were program activities and interventions adapted for the local context? What was the local context for which activities were devised (i.e., what were the most important Barbados-centered issues)? How were activities tailored directly to this context? What was the regional context for which activities were devised (i.e., what were the most important Caribbean Basin-centered issues)? How were activities tailored directly to this context? What were the local and national development priorities? How responsive was the program to beneficiaries’ needs (students, local policymakers)? How responsive was it to funder strategies (specifically and generally)?
With regard to issues briefs?
With regard to courses?
With regard to research and knowledge exchange?
With regard to diffusion and community of practice?


With regard to sharing technology or models effectively?
With regard to relevance of outcomes: participants/beneficiaries were from the larger Caribbean region. Did they perceive relevance to their local situations? Are there any metrics to measure that perception?

2. Relevance (logic): Did the program design consistently link activities and outputs logically to program outcomes and objectives? Please give examples for each of the main project areas. Were there causal linkages between program results (i.e., did one activity lead naturally to another)? Were there multiple pathways for achieving success on occasion, and if so, why did you choose the path you did? Consider these points when answering the questions regarding the project components below:
With regard to issues briefs?
With regard to courses?
With regard to research and knowledge exchange?
With regard to diffusion and community of practice?
Were all of the objectives and activities realistic? Did some have to be scaled back for reasons of logic? Were the materials/courses offered considered “needed” by the target audience? What was the evidence of interest? Could the content being offered be adjusted as course offerings unfolded? Lessons learned?

3. Effectiveness (results): From an educator or curriculum-development point of view, to what extent have the intended outputs and outcomes been achieved, or are they likely to be achieved? How were the results affected by program fidelity (how closely an “intervention” adheres to the protocol or program model originally developed)? Please be specific. In other words, were all targets reached? If not, how much progress was made? Why? How much did you have to deviate from initial plans (i.e., program fidelity)? Consider these points when answering the questions regarding the project components below:
With regard to issues briefs?
With regard to courses?
With regard to research and knowledge exchange?
With regard to diffusion and community of practice?
Could the content being offered be adjusted as course offerings unfolded? Have climate-related short courses been integrated into the CERMES (UWI) curriculum for the future? Do they "stand alone" (such as those offered online and in person at CERMES in January and February 2013, for example), or are they part of broader courses (or other forms of content)? If they are integrated as separate courses, can you describe how that was done? If they are not integrated, can you also discuss the form they are taking? Have other institutions (such as other branches of UWI or other universities) implemented any of this curriculum? If so, what role did your courses play? Can you describe examples of how CIMH assisted CERMES in curriculum development (shared data, models, personnel, etc.)? What could have fostered better collaboration? How did you track the success of students? Internships in the project? Positions obtained by former students in project courses? Publications generated from project research? Did these increase over time? What would be a standard of success (and why)? Is a system in place to continue to track alumni? Could that speak to the legacy of the program partnership? How many students have taken courses that use information from the partnership? Certificate, MS, or other non-degree? Are they spread Caribbean-wide (as per the established goal)?


The progress reports suggest that direct beneficiaries were targeted to be 1,420 in FY2014 (actual: 183). What happened? The FY2015 target was 102. Why was it so different from the FY2014 goal? Was it revised? Why or why not? Lessons learned?

4. Effectiveness (science collaboration): How was technology proposed to be shared between partners? How were the data-rich products of CIMH proposed to be incorporated into the CERMES curriculum (developing shared databases, curricular collaboration, personnel sharing, etc.)? What transfer/sharing was accomplished (what materials were transferred between CIMH and CERMES)? Were the processes used effective? Why? Did products of the collaboration reach users? Did they help with outreach to those outside of academia? Lessons learned?

5. Impact (expansive effects): Is there evidence that outcome-level results had a wider effect than anticipated? (Consider strategic alliances, i.e., host-country higher education institution-private sector partnerships.) What measurable expansive effects can you describe?
With regard to issues briefs?
With regard to courses?
With regard to research and knowledge exchange?
With regard to diffusion and community of practice?
Was there “local ownership” of the project by stakeholders as it proceeded? How? Did local public institutions actively participate? Did regional (Caribbean-wide) institutions actively participate? Did affected businesses participate (energy sector, tourism, water-dependent industry, health and insurance industries, fishing and agriculture)?

6. Sustainability (program effects): What challenges to integrating short courses and other offerings into the MS were anticipated? Characterize the effectiveness of the certificate program versus the MS program (from a sustainability standpoint); if it was more manageable/realistic, why? What could be done to improve sustainability over the long term? With regard to administration of new curriculum development, what were the challenges to sustaining the programs? With regard to personnel, what were the challenges to sustaining the programs? With regard to maintaining student interest, what were the challenges to sustaining the programs? With regard to maintaining stakeholder interest outside of academia (political leaders, policy makers, and those with economic interests), what were/are the challenges to sustaining the programs? Lessons learned?


WACEP - Overall Evaluation

Summary

The majority of people in the course were from either a public utility company or another governmental environmental agency. There was also representation from non-governmental organizations, the private sector, development, and academia. The group comprised engineers, scientists, people in policy and management, and a student.

All agreed that Course 4 was the most applicable to their work. Most also found Course 3 very applicable. Course 1 was found very applicable by a third of respondents, and Course 2 was least applicable. One person commented that Course 1 was “just understanding the principles. It is important to do so but just not as applicable as the other three.” When broken down by module, a similar pattern emerged: Course 4 Modules 2 and 5 were the most applicable, and Course 4 modules had the highest scores overall (an illustrative sketch of this kind of module-level tabulation appears at the end of this section). Some commented that all were applicable.

When asked what type of additional content would best supplement the courses, most participants said that more specialized topics would be best. A couple of respondents indicated that more general introductory material would be useful. All participants said they would recommend these courses to colleagues in their field.

When asked about the best format for using this content in the future, nearly all favored a professional development certificate program. One person recommended a master’s-level elective course at CERMES and commented, “I believe the Caribbean lacks adequate undergrad and postgrad options in water management/specialist studies.” Another person commented that in the future there should be better advertising, that it should be circulated online more, and that institutions should be targeted.

Most found the remote course experience satisfactory or very satisfactory, with a few students indicating it was neutral and one that it was unsatisfactory. One student commented, “Lecture times were not convenient as such the interactive discussions should have also been stimulated through Moodle.” Most also found the length of the course satisfactory, with a few students indicating that a longer duration (e.g., three weeks with 2-3 lectures per week) would be preferred.

Suggestions/comments

In response to how students have applied or plan to apply concepts from the course in their work:

To advise national stakeholders of various options and approaches to risk assessment and management and climate adaptation

I hope to use the information learned in these courses in future employment opportunities. One in particular I hope to achieve is with the Low Emissions Capacity Building Programme. Having insight into the impacts of climate change (including the water sector) is a valuable addition to the base knowledge I would need with this programme.


I plan to use the information as I review and analysis water quality data and examine trend. Also I planning of future projects and plans.

It is useful for my writing and understanding concepts. One policy diagram was particularly informative, wish I could have seen more.

Considering the implications of climate change on water quality management and regulation

Reservoir and watershed management approaches.

Already I am able to implement the decision making tools and analytical techniques demonstrated in both Courses 3 and 4. I especially find relevant the planning exercise in Course 4 which is extremely applicable to my field and guided me on the process i should adopt. By using the information and knowledge, i will be better equipped to make more informed decisions and also plan the intended actions better.

currently work with the University and also privately and this enhanced my knowledge in the subject area

Incorporate the information I have learned in the policy sector. I work with the Water Resource Management Agency therefore I would be able to add my voice to decision making based on the data coming from the field.

In response to what kind of additional content would best supplement and support these courses as part of a packaged curriculum:

Application in a multi-sector scenario and tools used to weigh or balance needs of different types of users in a national context

just specialize on the area that your group is interested in so that they can connect the teaching to their work. So get them to interact more (which you did)

Watershed modeling - alternative water sources - water resource management for drought - water reuse

areas specific to small island development i.e. current conditions and future as most of the course dealt with areas which actually have long historical records. How to handle situations with no data would be beneficial.

Management and Policy. Hydrologic Modeling

In addition to all participants saying they would recommend the courses to colleagues, they also provided the following comments:

useful in understanding the science and premises behind forecast models and forming a better context for decision making

It helps give good foundational knowledge.

informative, create good skill thinking for scenarios

They are more in depth than I thought, comprehensive which is a good thing.

Enhances your professional capabilities to increase their knowledge of the topics

Because in many instances they would not have done some in the past

In response to how the remote and online experience could be improved (some responded that there didn’t need to be improvements; those are left out):

Ability to access CHOL from WACEPonline and vice versa

Providing the presentations before the lectures.

By addressing the technical glitches

Timing of the lectures and more online discussion and more interaction


As most people were working individuals, an added feature that could be considered is to disseminate the information (pre-recorded lecture) via an online platform in order for questions to be emailed/posted through an online forum throughout the duration of course and not necessarily when they are only logged in. Evaluations based on online submission such as quizzes, and homework as well instead of having participation the current setup in order to achieve the certificate.

More classes and more detailed information.

By making the WACEP modules available a week in advance for browsing
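The module-level applicability pattern reported in the summary above reflects a simple tally of participants' ratings by course and module. Because the raw ratings are not reproduced in this report, the short Python sketch below is purely illustrative: the 1-5 scale, the data values, and the names used are all assumptions, and the sketch shows only the general kind of course-by-module averaging the summary describes, not the evaluators' actual procedure or data.

# Illustrative only: averaging hypothetical WACEP applicability ratings
# by (course, module). The rating scale and all values below are invented;
# the actual survey data are not reproduced in this report.
from collections import defaultdict
from statistics import mean

# One dict per respondent: (course, module) -> rating on an assumed 1-5 scale.
responses = [
    {("Course 4", "Module 2"): 5, ("Course 4", "Module 5"): 5, ("Course 2", "Module 1"): 2},
    {("Course 4", "Module 2"): 4, ("Course 3", "Module 1"): 4, ("Course 1", "Module 1"): 3},
    {("Course 4", "Module 5"): 5, ("Course 2", "Module 1"): 3, ("Course 1", "Module 1"): 5},
]

# Pool each (course, module) pair's ratings across respondents.
ratings_by_unit = defaultdict(list)
for response in responses:
    for unit, rating in response.items():
        ratings_by_unit[unit].append(rating)

# Report mean applicability per (course, module), highest scores first.
for (course, module), ratings in sorted(
        ratings_by_unit.items(), key=lambda kv: mean(kv[1]), reverse=True):
    print(f"{course} {module}: mean {mean(ratings):.2f} (n={len(ratings)})")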

SOURCES CITED

AAAS (American Association for the Advancement of Science). 1993. "Benchmarks for Science Literacy." New York: Oxford University Press. Retrieved from http://www.project2061.org/publications/bsl/, accessed March 25, 2015.

Arnell, Nigel W. 2006. "Climate Change and Water Resources: A Global Perspective," in Hans Joachim Schellnhuber, ed., Avoiding Dangerous Climate Change. New York: Cambridge University Press.

Centre for Resource Management and Environmental Studies, University of the West Indies. 2014. "Accounting for Climate Variability: Time to Move On? Discussion Brief No. 1." Typescript.

Centre for Resource Management and Environmental Studies, University of the West Indies. 2014. "Water and Food Security: Facing Up to Climate Variability: Discussion Brief No. 4." Typescript.

Climate Literacy: The Essential Principles of Climate Science. 2009. Retrieved from http://www.globalchange.gov/browse/educators, accessed March 25, 2015.

Columbia University. 2011. "Building Capacity to Manage Water Resources and Climate Risk in the Caribbean." Typescript.

Caribbean Institute for Meteorology and Hydrology (CIMH). 2015. Comments on Eisenstadt, Todd A., and Stephen MacAvoy, initial draft of "Evaluation of 'Building Capacity to Manage Water Resources and Climate Risk in the Caribbean.'" Typescript notes submitted by staff of CIMH on April 10, 2015.

Garrison, Tom. 2011. Essentials of Oceanography, 6th ed. Belmont: Brooks/Cole.

Gerring, John. 2012. Social Science Methodology: A Unified Framework, 2nd ed. New York: Cambridge University Press.

Geertz, Clifford. 1973. The Interpretation of Cultures: Selected Essays. New York: Basic Books.

Higher Education in Development (HED). 2011. "Request for Applications (RFA): Energy and Climate Partnership of the Americas (ECPA)/Caribbean Region Climate Adaptation (CRCA) Partnership Initiative." Document released to the public November 22, 2011.

Higher Education in Development (HED). 2012. "Conference Call: Barbados Caribbean Region Climate Adaptation Partnership." Notes emailed to the authors by Diana Paez-Cook on August 13, 2012.

Higher Education in Development (HED). 2014. "Columbia University Domestic Monitoring Visit: 'Building Capacity to Manage Water Resources and Climate Risk in the Caribbean.'" Typescript from April 3-4, 2014 visit by HED officials.


Higher Education in Development (HED). 2014. "Caribbean Region Climate Adaptation (CRCA) Monitoring Visit: 'Building Capacity to Manage Water Resources and Climate Risk in the Caribbean.'" Typescript from December 16-18, 2014 visit by Manny Sanchez and Omri Malul.

Higher Education in Development (HED). 2015. "Partnership External Evaluation – A Framework." PowerPoint presentation by Omri Malul and Manny Sanchez, February 4, 2015, via conference call with Todd Eisenstadt and Stephen MacAvoy.

Hsu, A., J. Emerson, M. Levy, A. de Sherbinin, L. Johnson, O. Malik, J. Schwartz, and M. Jaiteh. 2014. The 2014 Environmental Performance Index. New Haven, CT: Yale Center for Environmental Law & Policy.

Kreft, Sönke, and David Eckstein. 2014. "Global Climate Risk Index 2014: Who Suffers Most from Extreme Weather Events? Weather-Related Loss Events in 2012 and 1993 to 2012." Germanwatch. Retrieved from http://germanwatch.org/en/download/8551.pdf, accessed October 13, 2014.

Ministry of Finance and Economic Affairs. 2013. Barbados Economic and Social Report 2013. Retrieved from http://www.economicaffairs.gov.bb/news-detail.php?id=141, accessed March 25, 2015.

Mulholland, P., G. R. Best, et al. 1997. "Effects of Climate Change on Freshwater Ecosystems of the South-Eastern United States and the Gulf Coast of Mexico." Hydrological Processes 11(8): 949-970.

National Climate Assessment Resources for Educators. 2014. Retrieved from http://www.climate.gov/teaching/2014-national-climate-assessment-resources-educators, accessed February 13, 2015.

Nisbet, Matt. 2009. "Communicating Climate Change: Why Frames Matter for Public Engagement." Environment 51(2): 12-23.

Poore, R. Z., K. DeLong, et al. 2008. "Natural Climate Variability in Northern Gulf of Mexico: Implications for the Future." USGS Gulf Coast Science Conference and Florida Integrated Science Center Meeting: Proceedings with Abstracts. Orlando, Florida.

Pulwarty, R. S., L. A. Nurse, et al. 2010. "Caribbean Islands in a Changing Climate." Environment: Science and Policy for Sustainable Development 52(6): 16-27.

UNESCO-IHE. 2015. Education and Training Guide 2015. The Netherlands. 26 pp.

United States Agency for International Development. 2015. "USAID's Evaluation Policy and Education Strategy." PowerPoint presentation by Omri Malul and Manny Sanchez, February 4, 2015, via conference call with Todd Eisenstadt and Stephen MacAvoy.


World Bank. 2014. "Renewable Internal Freshwater Resources Per Capita (Cubic Meters)." Retrieved from http://data.worldbank.org/indicator/ER.H2O.INTR.PC, accessed February 16, 2015.

World Health Organization. 2015. "Climate Change Adaptation to Protect Human Health: Barbados Project Profile." Retrieved from http://www.who.int/globalchange/projects/adaptation/en/index1.html, accessed March 25, 2015.

World Water. 2014. "Total Renewable Freshwater Supply by Country (2013 Update)." Retrieved from http://worldwater.org/wp-content/uploads/sites/22/2013/07/ww8-table1.pdf, accessed February 16, 2015.

Interviews Conducted

Baethgen, Walter, Senior Research Scientist, International Research Institute for Climate and Society (IRI), Columbia University. Skype, March 20, 2015, from New York, USA.

Baethgen, Walter, Senior Research Scientist, International Research Institute for Climate and Society (IRI), Columbia University. Email response to course development methodology request, April 15, 2015.

Ballantyne, Danroy, Senior Engineering Technician/Hydrologist, Central Water and Sewerage Authority (CWSA), St. Vincent and the Grenadines. Skype and email, March 20, 2015, from St. Vincent and the Grenadines.

Best, Lawanda, Integrated Management Systems Coordinator, Banks Beer. March 13, 2015, Barbados.

Blackwood, Mansfield, Program Officer, United States Agency for International Development (USAID). March 12, 2015, Barbados.

Blenman, Rosalind, Forecaster, Barbados Meteorological Services. March 12, 2015, Barbados.

Braitwithe, Edmund, recent CERMES MA graduate and water resources professional. March 11, 2015, Barbados.

Broodram, Natalie, Water Resources Specialist, Caribbean Environmental Health Institute, Trinidad and Tobago. Email, March 26, 2015.

Cashman, Adrian, Director, Centre for Resource Management and Environmental Studies (CERMES). March 9, 2015, Barbados.

Curtis, Ashley, International Research Institute for Climate and Society (IRI). March 16, 2015, from New York, USA.

Cox, Shelly-Ann, Research Associate, Caribbean Institute for Meteorology and Hydrology (CIMH). March 9, 2015, Barbados.

Eversley, Ann Marie, Acting Marine Pollution Officer, Environmental Protection Division, Ministry of Environment and Drainage. March 13, 2015, Barbados.


Georges, Jeanel, Program Officer, Centre for Resource Management and Environmental Studies (CERMES). Skype, March 16, 2015, from Barbados.

Goddard, Lisa, Director, International Research Institute for Climate and Society (IRI), Columbia University. Skype, March 16, 2015, from New York, USA.

Hinds, Fabian, Director, Coastal Management Unit, Ministry of Environment and Drainage. March 12, 2015, Barbados.

Hutchinson, Natalie, Senior Development Officer, Government of Canada. March 13, 2015, Barbados.

Jessamy, Kerriann, 2013-14 student in the CERMES MA program (quality assurance officer at a cement plant). March 13, 2015, Barbados.

Jordan, Eleanor, Tourism Development Officer and representative on the inter-agency Climate Change Committee for the 2nd National Communication to the United Nations Framework Convention on Climate Change (UNFCCC). March 12, 2015, Barbados.

Knight, Kelly Ann, Environment, Health, Energy and Safety Officer, Banks Beer. March 13, 2015, Barbados.

Leacock, Shannon, CERMES MA student, Climate Change specialization. March 13, 2015, Barbados.

Marshall, Ricardo, Project Manager, Project Management Coordination Unit, Ministry of Environment and Drainage. March 13, 2015, Barbados.

Murray, Brian, Forecaster, Barbados Meteorological Services. March 12, 2015, Barbados.

Nurse, Sonia, Deputy Director, Barbados Meteorological Services. March 12, 2015, Barbados.

Paez-Cook, Diana, Latin America Program Officer. Phone call, April 20, 2015, from Rockville, MD.

Paul, Jamie, Geohydrologist, Barbados Water Authority, Ministry of Agriculture. March 10, 2015, Barbados.

Prowell, Sherryann, CERMES MA student, Climate Change specialization. March 13, 2015, Barbados.

Riley, Elizabeth, Deputy Executive Director, Caribbean Disaster Emergency Management Association. March 10, 2015, Barbados.

Rowe, Antonio, Director, Drainage Unit, and former Director, Coastal Management Unit, Ministry of Environment and Drainage. March 12, 2015, Barbados.

Royer, Reynette, Coordinator, Red Cross Caribbean Disaster Risk Management Reference Centre (CADRIM). March 10, 2015, Barbados.

Sanchez, Manny, Program Specialist and Project Manager, Higher Education in Development (HED). Phone call, April 20, 2015, from Rockville, MD.


Simpson, Leslie, Natural Resources Management Specialist, Caribbean Agricultural Research and Development Institute, Jamaica Unit. Skype, March 24, 2015, from Jamaica.

Taylor, Chanise, Program Officer and representative in the crisis management area, Ministry of Tourism. March 12, 2015, Barbados.

Thomas, Judy, Director of Emergency Management Services, Ministry of Homeland Services. March 16, 2015, Barbados.

Trotman, Adrian, Chief of Applied Meteorology and Climatology, Caribbean Institute for Meteorology and Hydrology (CIMH). March 9, 2015, Barbados.

Van Meerbeeck, Cedric, Program Officer, Caribbean Institute for Meteorology and Hydrology (CIMH). March 9, 2015, Barbados.