From boring to brilliant: Tips for teaching impact evaluation to humanitarians

WFP Evaluation
6 min read · Jun 13, 2024


By Kristen McCollum, Impact Evaluation Analyst, WFP; Katie Pons, Advisor Knowledge Learning and Communications, Save the Children; Lloyd Banwart, Activity Director, HAEC

This blog is co-published with HAEC.

Over the past 30 years, impact evaluations have become a routine tool in the development sector. In the humanitarian sector, however, the concept is comparatively new.

To address this gap, both the Humanitarian Assistance Evidence Cycle (HAEC) and the World Food Programme's (WFP) Impact Evaluation Unit have focused on building impact evaluation literacy within implementation teams: what the methods are, and how to use their results to improve current and future humanitarian programme designs.

Generous funding from USAID and its Bureau for Humanitarian Assistance (BHA) has allowed WFP's Impact Evaluation Unit and the HAEC Activity to bring the classroom to humanitarian colleagues across the globe.

In April, HAEC hosted a workshop on Impact Evaluations & Operational Research in Humanitarian Activities in Bogotá, Colombia, for USAID BHA-funded implementing partners and WFP colleagues from across the Latin America and Caribbean region. Given their overlapping objectives, the HAEC team invited WFP to support the delivery of technical content at the workshop.

Workshop participants discuss an activity from their workbook. Photo: HAEC

At the workshop, 50 participants from 17 organizations across the region, including colleagues from WFP El Salvador, Nicaragua, Ecuador, Colombia, and the Dominican Republic, took part in four days of interactive sessions and activities. HAEC designed the workshop to give participants a solid understanding of impact evaluation methodologies, including experimental and quasi-experimental designs.

Alongside the technical sessions, implementing partners and WFP country teams presented practical examples from their work in Latin America to the wider group. This combination of theory and real-world case studies gave participants concrete illustrations of each approach and helped them see how the methods could be applied in their own contexts.

The event received positive feedback from participants, who particularly valued the integration of applied real-world examples with technical content. Inspired by the event, WFP and HAEC want to share tested and approved methods for keeping impact evaluation training accessible and engaging for humanitarian colleagues.

Facilitator Carlos Cousido Cores from TANGO International at the training in Colombia. Photo: HAEC

Make the skepticism explicit

Readers are likely familiar with the virtues of platforms like AhaSlides or Mentimeter for keeping audiences engaged. Rooted in promising practices for adult learning, these platforms keep participants consistently engaged and give them a space to share experiences and thoughts quickly with a large group. Experience has also shown that the anonymity these platforms provide helps make skepticism explicit.

Colleagues are often quick to question the feasibility of impact evaluations in the humanitarian sector. During this workshop, HAEC chose to address this head-on by asking participants to share their assumptions and skepticism directly through these anonymous platforms.

Participants were asked, “What are some examples of when the assumptions of RCTs will not hold?” Responses ranged from the straightforward “when the intervention has already started and was not randomized” (fair!) to the not-so-straightforward: “when evaluating an intervention with migrants”, “when evaluating emergency assistance after a natural disaster”, or “when everyone must receive assistance”.

Walking participants through hesitations helps skepticism to settle. Photo: HAEC

These are legitimate challenges, but many do not necessarily preclude an RCT. By making the skepticism explicit, the facilitators could address these concerns directly and showcase real-world examples that dispel common misconceptions. For example, WFP’s anticipatory action evaluation in flood-prone Nepal, the HAEC-funded RCT of World Vision Guatemala’s cash transfer and water filter programme, WFP’s “Lean Impact Evaluations” and other HAEC-funded studies are all successful RCTs run in tricky circumstances. Eliciting these hesitations and walking through them with workshop participants helps the skepticism settle and encourages more detailed questions about how and when RCTs are feasible and useful.

Use activities to bring concepts to life

Adult learners thrive when they can apply concepts to their own context. This is why workshop participants were asked to come up with a real-world example, often an upcoming project they are planning, to which they could apply their learning. To facilitate this, participants received an activity book to accompany the sessions. During each technical module, they turned to the activity book for time to reflect and apply the concepts to their own contexts.

For example, to support participants’ conceptual understanding of when an RCT or a quasi-experimental design is appropriate, the activity book included activities titled “RCT Road” and “Quasi-Experimental Design Road.”

These activities turn a standard decision tree into an interactive game board, allowing participants to break down each question that implementers might ask themselves before designing an impact evaluation. Through this process, participants navigate the flow chart of design questions to see if an impact evaluation is right for their project.
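For readers who think in code, the logic of such a game board can be sketched as a short decision function. The sketch below is a minimal illustration in Python; the questions, their order, and the function name are illustrative assumptions, not the actual content of the workshop activity book.

```python
# A hypothetical sketch of the kind of flow chart behind "RCT Road" and
# "Quasi-Experimental Design Road". The questions and their order are
# invented for illustration, not the workshop's actual decision tree.
def suggest_design(can_randomize: bool,
                   has_eligibility_cutoff: bool,
                   has_comparison_group: bool) -> str:
    """Walk a simplified flow chart of impact evaluation design questions."""
    if can_randomize:
        return "An RCT may be feasible"
    if has_eligibility_cutoff:
        return "Consider a regression discontinuity design"
    if has_comparison_group:
        return "Consider another quasi-experimental design"
    return "An impact evaluation may not be feasible here"

# Example: assistance is already targeted by a vulnerability-score cut-off.
print(suggest_design(can_randomize=False,
                     has_eligibility_cutoff=True,
                     has_comparison_group=True))
```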

Throughout the workshop, the facilitation team looked for as many opportunities as possible to get participants to stand up and interact with one another to strengthen their understanding of challenging technical concepts. Another example was a WFP-led activity to explain regression discontinuity design (RDD), which is outlined by David Evans in his blog.

The blog contains a full explanation of the activity, but in essence, participants line up according to height; in a room of 50, this gets noisy and chaotic quickly. From there, the facilitator asks probing questions to demonstrate how those standing just on either side of a selected cut-off point in the middle are far more similar to one another than to those at either end of the height spectrum.
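To see the same intuition numerically, here is a minimal simulation in Python. It is not part of the workshop materials, and the bandwidth comparison is a deliberate simplification (a real RDD fits a regression on each side of the cut-off), but it shows why units just around the cut-off support a fairer comparison than the whole line-up.

```python
# Simulated illustration of the RDD intuition: people just on either side
# of a cut-off are nearly identical, so comparing them approximates a
# randomized comparison. All numbers, including the 5-unit "effect", are
# invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
n = 50                                  # a room of 50 participants

height = rng.normal(170, 10, n)         # running variable (cm)
cutoff = np.median(height)              # cut-off in the middle of the line
treated = height >= cutoff              # "treatment" only above the cut-off

# Outcome depends on height AND on treatment (true effect = 5).
outcome = 0.5 * height + 5 * treated + rng.normal(0, 2, n)

# Naive comparison of all treated vs. all untreated mixes the treatment
# effect with the underlying height difference.
naive = outcome[treated].mean() - outcome[~treated].mean()

# Restricting to a narrow band around the cut-off gets much closer to 5,
# because those participants are similar in everything except treatment.
near = np.abs(height - cutoff) <= 5
local = outcome[treated & near].mean() - outcome[~treated & near].mean()

print(f"naive difference: {naive:.1f}, near-cutoff difference: {local:.1f}")
```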

As the activity wrapped up and participants linked the concept of RDD to what they had just done, there was a collective “Ohhhhhhhh!” in the room. Colombia can definitely be added to the list of countries where the activity has been tested and approved!

“The exercises allowed us to apply the theory from the practical part and to be able to see our projects reflected.” — Workshop Participant

Workshop participants take part in an icebreaker between sessions. Photo: HAEC

Another engaging and educational activity during the workshop was the “Can you find your match?” exercise. Participants were instructed to stand up and find someone in the room who identified as the same gender and had the same eye colour as themselves. Once matched, they asked each other whether they liked spicy food and whether they preferred to get up early or stay up late. This interactive session highlighted the challenges of matching, emphasizing that even when matches are made on observable characteristics, this doesn’t guarantee alignment on unobservable traits.[1]

When participants made the connection between the activity and the concept of statistical matching, there were many lightbulb moments, accompanied by lively discussions and a deeper appreciation for the complexities involved.
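For those who want to replay the exercise in code, the sketch below simulates it: pairing people on their observable traits works fine, but the matched pairs agree on the unobserved traits no more often than chance. The simulation and its numbers are illustrative assumptions, not workshop data.

```python
# Simulated "Can you find your match?": pair people on observables
# (gender, eye colour), then check agreement on unobservables
# (spicy food, early riser). All data here are randomly invented.
import numpy as np

rng = np.random.default_rng(7)
n = 50

# Observable characteristics used for matching.
gender = rng.choice(["F", "M"], n)
eye_colour = rng.choice(["brown", "green", "blue"], n)

# Unobservable traits, drawn independently of the observables.
likes_spicy = rng.choice([True, False], n)
early_riser = rng.choice([True, False], n)

matched, pairs, agree = set(), 0, 0
for i in range(n):
    if i in matched:
        continue
    # Greedily pair person i with the first unmatched exact match.
    for j in range(i + 1, n):
        if j not in matched and gender[i] == gender[j] \
                and eye_colour[i] == eye_colour[j]:
            matched.update({i, j})
            pairs += 1
            # Do the "hidden" traits line up too?
            agree += int(likes_spicy[i] == likes_spicy[j]
                         and early_riser[i] == early_riser[j])
            break

# Expect agreement in roughly a quarter of pairs, the pure chance rate:
# matching on observables says nothing about the unobservables.
print(f"{pairs} matched pairs; both unobservables agree in {agree}")
```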

These interactive exercises demonstrate the value of hands-on engagement for deepening participants’ understanding of technical concepts. By physically embodying the principles of different impact evaluation methods, participants broke complex ideas down into tangible, relatable experiences. The approach created lively, memorable learning moments and fostered fun, engaging peer-to-peer interaction and discussion. Such dynamic activities make abstract statistical methodologies more accessible, ensuring that participants leave the workshop with a solid grasp of the concepts and a sense of camaraderie with their colleagues.

Workshop participants enjoy time for networking after a day of technical sessions. Photo: HAEC

You can keep in the loop through our newsletters by signing up here (WFP) and here (HAEC).

[1] While matching methods are employed by HAEC as a valid approach for creating counterfactuals in impact evaluations, it is important to note that WFP does not consider matching designs a credible counterfactual methodology. This reflects a difference in expert opinions on the definition and application of matching techniques within the field of impact evaluation.

