From Dr. Sue Goldstein:
Sitting in a room with armed guards permanently outside, I contemplate the nature of communication for development to eradicate polio and the role of monitoring and evaluation in the process. Surrounded on this trip by committed, enthusiastic people working to achieve the polio eradication targets in Pakistan (and the world), it is heartening that such teams are willing to work in dangerous areas (of course, many live in such areas without choice). There seems to be so much that the world of development and social change communication can learn from this massive effort: how closely the services and communication teams work together at all levels (not always smoothly); the intense focus on the end point and the clarity of the aim of all the work, reaching every last child with polio vaccination; the intense monitoring of reach and the social mapping in every Union Council, literally down to every house; the massive international persuasive effort to advocate for eradication and to ensure funding for the final push; and the critical advocacy and negotiation teams constantly working to gain access so that vaccination teams can reach unvaccinated children.
How could we, as an external team with little contextual experience, add value to this effort? This thought was constantly in my mind as we learned about the programme. But I am told by the very people whose work I am here to review that an outside eye can be useful in finding blind spots and in asking questions filtered through a different set of experiences. I can accept this, but I find myself respecting their resolve to leave no stone unturned in the struggle to find enough small gains to achieve eradication. Their willingness to listen to an outsider only marginally familiar with the complex, dangerous and difficult situations they work in daily makes this even more remarkable.
But it also made me think about what we can all take away from the experience, whether inside or outside the programme. The level of meticulous monitoring and constant evaluation is something I fear is rarely seen elsewhere, but what may be more common to other programmes is the challenge of integrating the information (evidence), often meticulously gathered, into communication plans, messages and processes.
The process of getting research understood, used and integrated into health communication is often taken for granted, but it is not as easy as it seems. Starting at the formative stage, it is important to understand how the data is collected and analysed: we all know how statistics can help us prove what we wanted to see in the first place, and how we can conveniently ignore information that we don’t understand or that shows a perspective different from our own. Poorly analysed data can lead to incorrect conclusions and then to poor communication.
But even further upstream, if the questions are not broad enough or open enough, all the research will do is reinforce our assumptions with numbers. The questions need to be guided by what has gone before: what can we learn from past experiences and from the field? The questions need to be guided by a model or theory of change, so that the research can unearth the critical areas to focus on. And the questions need to be guided by a true desire to listen and to hear other perspectives.
Listening is the one communication skill that health communicators often forget: so eager to tell people the correct thing to do and the correct way to behave, we forget to listen to how others see the world. This plays out at an interpersonal level and at a research level. Finally, thinking about areas that are not traditionally communication, like improving services, communicators could be exploring new and innovative ways to support services, motivate workers and engage communities to hold those responsible for delivering services accountable. In some instances this will have more impact than changing health behaviours.
In other instances, pursuing long-term social change towards empowered, equal societies is a dream to be followed with an ever-growing toolbox and community of practice. So, to come back to my original questions about the role played by external evaluation teams, both in support of the programme itself and as ‘learners’ who can take lessons away, I think the answers are yes and not enough. We must always ask ourselves whether the information we gather is useful, and whether we are using the information we already have. In this context, the usefulness of information brought by external evaluators can be measured to some degree by the value it is given by those in the programme and by the impact it has on supporting the programme to achieve its goal. But in the context of what we take away from the programme and what we do with that information, I think more could be done to ensure that the lessons learned by the evaluators themselves are shared in ways that can be used in other contexts.