Lauren Landis: “Two sides of the coin for evaluative evidence”

In a recent interview, WFP’s country director in Kenya reflects on the usefulness of evaluation to improve programme efficiency, balancing it with the need for fundraising evidence.

WFP Evaluation
4 min read · Jul 3, 2023
WFP Kenya Country Director Lauren Landis with Tana River CECM Mwanajuma Hiribae (second from left) at the Wayu Boru Health Facility in Tana River County, Kenya in 2022. WFP/Alauna Dunphy

At the June board session, WFP presented the Kenya country strategic plan evaluation (CSPE) for the 2018 to 2023 period. What is the relevance of the CSPE for the Kenya country office and the region?

It stands out as one of the most useful evaluations for the Kenya country office because of how the process was timed. The mid-term review allowed me to make the necessary tweaks and anticipate elements to focus on in the final evaluation.

For country strategic plan evaluations to work, it's important that sufficient time and effort are spent with the country director and senior team to ensure that the evaluation recommendations are meaningful. I believe strongly that evaluation recommendations should be carefully negotiated and worded: it is not useful if they are written in a way that they cannot be implemented.

A lot of effort was put into this process, which prepared the ground for us to develop the new country strategic plan in an effective way.

In your view and experience, what have been some influential decentralized evaluations from Kenya, and why?

Another effective evaluation, commissioned by the Kenya office, was the school feeding evaluation of the USDA McGovern-Dole Food for Education and Child Nutrition Programme. The Kenyan Government is currently scaling up school feeding activities from 1.8 million to 10 million children across the country. The evaluation was robust enough to enable us to proceed with this scale-up and transition: the government has managed the school feeding process since 2018, with WFP providing technical assistance.

The evaluation findings surfaced issues that were detracting from our results. For example, cash for school feeding was not reaching the county level in a timely way. Understanding what went wrong, and why, allowed us to return, in an informed way, and design a new system.

What is the usefulness of evaluative evidence in starting conversations with donors and partners?

In the changing lives agenda, good evidence is one of the best ways to tell our story, highlighting what works and why.

My one dream would be for us to have more evidence on whether the people who were supported through resilience-building programmes suffered less and were able to bounce back more quickly after a drought.

Hard-hitting evidence on resilience could strengthen the case we make to donors and partners. It's the greatest piece of evidence we need in WFP. Without compelling evidence on resilience, it is difficult to have conversations in the changing lives space.

We had many useful evaluations to help inform programming and guide us towards greater programme efficiency; another example that comes to mind is one on food systems. What we need more of is evaluations that help change the dialogue with big donors.

For example, data from a BCG study in 2009 showed that if you invest one dollar in school feeding, you get a six-dollar return, but if you invest it in home-grown school feeding you get a nine-dollar return.

This is not the kind of evidence that helps you do your programmes better, but it does help you state something substantive that could change the dialogue in fundraising.

What has been your biggest lesson learnt from evaluations?

My years of experience with evaluation have taught me to pay attention at the beginning of the process and at the end. It is worth reading the terms of reference from cover to cover to ensure that the evaluation will answer the questions you need answered, and so avoid disappointment or dissatisfaction when the evaluation is complete. Equally, it's critical that, at the end of the evaluation, recommendations are written in a way that gets them off the country director's to-do list. They must be actionable and not open-ended.

Good evaluations will be used. If I had my way in Kenya, I would have fewer evaluations, focusing on the ones that are really hard-hitting and go into the depth of a specific issue, balancing quality with quantity.

Joint evaluations with partners could also be a strategy for achieving more impactful results on specific issues that carry weight globally. However, it's important to pick partners carefully and to ensure that WFP is closely involved in the process.

In 2019, Kenya was accepted into the thematic window on cash-based transfers and gender for impact evaluations. This was pre-COVID and before the worst drought the country has experienced in 40 years, both of which significantly changed the landscape for conducting randomized controlled trials. A lesson I took from this evaluation is that it's crucial to be able to adapt to black swan events by remaining flexible and ensuring clear, open lines of communication between the different actors involved.

How can we move towards a culture of evaluative thinking and evidence use in WFP?

We take evaluations very seriously in the Kenya country office. We have a highly inquisitive, highly skilled team, so often our biggest challenge is that we have too many questions we want answered through the evaluation. We then have to strike a balance between the questions that will help us improve programmes and those that will help raise funds.


Written by WFP Evaluation

Delivering evidence critical to saving lives & changing lives. The Independent Office of Evaluation of the UN World Food Programme works for #ZeroHunger