
Measuring Change: Planning, Monitoring and Evaluation in Media and Development Cooperations

Author: A. Sofie Jannusch, ed.
Affiliation: Catholic Media Council (CAMECO)
Publication Date: September 27, 2007

This document summarises the proceedings of the symposium Measuring Change: Planning, Monitoring, Evaluation in Media Development, which focused on the utilisation aspect of evaluation in media development. Particular emphasis was placed on learning from monitoring and evaluation (M&E) experiences in order to improve existing projects and programmes at all levels, from planning to implementation and follow-up.

In the "Setting the Framework" section of presentations, Andrew Puddephatt and Alan Davis introduced various levels and aspects of M&E in the field of media and media assistance. Andrew Puddephatt shared a toolkit approach structured around five principal media outcomes:

  1. "the system of regulation and control,
  2. plurality and transparency in ownership,
  3. media as a platform for democratic discourse,
  4. the professional capacity building and supporting institutions, and
  5. infrastructural capacity."

The structure "can be conceptualised as a process of 'drilling down' from the desired media development outcome to the specific means of verifying how far this outcome is achieved in practice."

Alan Davis proposed twin pillars of M&E in media development: an M&E Handbook to guide individual projects, and a Media and Governance Index showing the degree to which media actually report on, and possibly influence, each of the six components of governance as defined by the World Bank. "His suggestions led to commonly shared proposals for the follow-up initiative mediaME...", which is described in the publication as a newly launched initiative resulting from the symposium and incorporating the following proposals from the symposium workshops:

  • Create a Media Monitoring & Evaluation expert working group to carry forward conference discussions. (Now established through the United Kingdom Department for International Development (DFID); registration is via the conference's Dgroups website.)
  • Create a wiki as a resource and a starting point for producing a practitioners' handbook. (Now available online.)


In the "Concepts and Tools" section of presentations, Birgitte Jallov presented the Most Significant Change method, a tool for documenting community radio impact. She considers "community radio as 'hinge' where the concepts of media development and development communication intertwine, as community radio is not only seen as a medium for information but also as a 'tool to facilitate participatory development and spurring local action'." She demonstrated how this dialogical, story-based impact assessment tool "builds on the strong oral traditions usually prevailing in illiterate communities, a tool, easy to use for community groups themselves... also useful in settings where no baseline studies exist to reflect changes with earlier findings - before and after the establishment of the community station."

Nadia El-Awady and Jan Lublinski spoke on their use of the Outcome Mapping framework to build a reporters' network, as exemplified in the work of the Science Journalists' Cooperative (SjCOOP), which aims to enhance the professional development of journalists in the developing world who cover health, environment, technology, and science issues. They described the web-based and face-to-face communication processes used in SjCOOP, and the programme's vision of the improvements to which it hopes to contribute. "Outcome mapping was established as a framework for quality management that allowed the group to overcome occurring difficulties, concentrating on the question: 'How can we help our partners?' rather than 'does our intervention work?'."

Serena Rix Triphatee demonstrated how the behavioural change of some listeners is influenced not just by the medium being evaluated, but also by the changing context in the country. She discussed a Nepali peacebuilding soap opera in the context of "'multiplying obstacles' in monitoring a moving target that is a country in conflict or a period of rapid transition....The presentation gives an insight into how Search for Common Ground tries to be continuously up-to-date on the question, how the changing country is affecting the lives of youths in the villages. With 20 young community focal points - an audience feedback team, and a story gathering team - a 'web of young grassroots monitors' has been established that 'has been complex and difficult to manage', but yet these field teams became the 'agents of change'."

Ondine Ullman shared experiences from the establishment of a network of information gatherers that allows Pact Mongolia to undertake nationwide surveys in Mongolia, where the nomadic population presents challenges to monitoring. "She also shows why Pact Mongolia prefers to gather data in one-on-one situations, utilising interviewer notations of a conversation style interview, how the Pact team tracks the respondents, and how the interviewers engage with them in their everyday activities, at a watering well with camels or while catching goats for cashmere combing."

Esther Saville and Anna Godfrey gave an overview of the BBC World Service Trust's approach to M&E at the four levels of intervention - system, organisation, practitioner, and public - and how research is embedded in projects. The group uses the following techniques:

  • Formative Research to establish the general parameters and content of a project, conducted during the project development phase.
  • Pre-testing prior to broadcast to refine output in terms of tone, language, relevance, and appropriateness.
  • Audience Feedback during the project delivery phase to assess how audiences are engaging with and interpreting the output.
  • Impact Evaluation to determine how much influence a project has had on those who have engaged with it, for instance, an association between exposure to outputs and changes in knowledge, attitudes, and behaviours.

Christoph Spurk presented a baseline study on radio news in Zambia that demonstrates how content analysis can be used both to identify training needs and to provide the baseline data against which changes brought about by the training programme can be measured. "Based on the functions media should fulfil in democratic discourse, he outlines corresponding quality criteria for journalistic reporting, to best support the functions of information, orientation, being a forum for public discourse, and scrutiny (watchdog)." He concludes that measuring change at the outcome level is not only more economical and realistic, but also more trustworthy than impact studies.

In the section "Changing the Perspective", Luckson Chipare reflected on the question: Who evaluates the donors’ performance? He gave some examples of how the Paris Declaration on Aid Effectiveness could be implemented to emphasise ownership, harmonisation, alignment, results, and mutual accountability between beneficiaries, external evaluators, and donors.

Source: Email from Sofie Jannusch to The Communication Initiative, July 31, 2009.

