
This summer I was fortunate enough to spend the first two weeks of July at an inspiring summer course in wonderful Budapest. I spent the two days before the start of the course cycling up and down the river Danube – crossing most of its eight bridges – and wandering through the 5th and 6th districts. The beauty of the city is overwhelming: the impressive architecture, the banks of the Danube, the Buda Castle Quarter, Andrássy Avenue, Heroes’ Square and the Millennium Underground Railway, the second oldest in the world. This sounds like a quote from a travel guide, but I cannot help it: it’s all true.

On Monday the course ‘Media Development and Democratization: Understanding and Implementing Monitoring and Evaluation Programs’ started. For the schedule and the syllabus, please have a look here. Beforehand, two weeks seemed like a long time, but it proved to be too short – and that is a good sign. In this post I will only give some sound bites and quotes, which do not at all do justice to the quality of the presenters or the value of the programme as a whole.

‘We journalists do not want to be monitored. The question, however, is how monitoring and evaluation can enable journalists to do their work better.’ Gerry Power, director of Intermedia, talked about the need for M&E to provide information that supports media organizations in their decision-making process: they need information to act on. When people – readers, listeners, visitors – start talking about a television programme, does that also affect their daily lives? For me, Power’s presentation came down to two points:

1. Generic (research) questions will only give generic answers. Therefore: define and fine-tune the questions before doing anything else. Use one broad, general question and break it down into sub-questions. The more precise the questions, the better the results.

2. Only ask questions to which the answers do not already exist. It is phenomenal what is already sitting out there on the shelves. Do not reinvent the wheel, and do not spend precious time and money on data that already exist.

‘Why do we care about M&E?’ was the question with which Gordon Adam started his presentation. For the founder and managing director of Media Support Solutions, the bottom line is the growing importance of media projects and programmes in developing countries over the past twenty years. In theory there is a clear distinction between media for development and development of media; in reality the two are often combined, cooperating or at least interrelated. Many donors also consider media projects important from a PR perspective: they can serve as showcases to prove that taxpayers’ money is well spent.

Adam quoted Einstein: ‘Everything that can be counted does not necessarily count.’ He stressed keeping M&E ‘short and simple’: most of the time, two or three measurable indicators will do. Adam: ‘Keep your M&E plans simple; this can increase clarity and accuracy. The implementation of complex M&E plans (including panels and other surveys) is also a lot of work.’

So what needs to be done? According to Adam, among other things: 1) academic research on new media, to provide tools for quantitative surveys, and 2) educating funders and NGOs – if they have a better understanding of what media can and cannot achieve, this can lead to more meaningful evaluation criteria.

Another media professional who shared his M&E views and experiences in Budapest was Daniel Bruce. The international media development consultant also stressed accuracy, especially on one point: what do indicators actually tell us? Define them as exactly as possible from the start. Check, double-check and cross-check the indicators and the research questions, but also the budget, deliverables and reporting. Bruce provided a top ten of mistakes, among them vague and/or overly general indicators, incorrect use of buzzwords (what do you mean by ‘baseline’?), general language mistakes and inappropriate timelines. As somebody put it: leave out the grandmothers as participants in the evaluation of an HIV/AIDS programme.

Susan Haas, doctoral candidate at the Annenberg School for Communication, talked about ‘the beauty of focus groups’. Her colleague Amelia Arsenault discussed how our organizations define media development. Are media used for development projects (strategic use), or do you want to support and strengthen existing and new journalistic media, with reporting and investigative journalism as their core business? Arsenault presented a nice summary of the history of media development (including the ‘Golden Age’ of media development in Eastern Europe in the eighties). According to Arsenault, there is no way to determine the total budgets that developing countries receive for media development.

Antonio Lambino – a member of the CommGAP team – presented a very clear way of looking at the logframe from an M&E perspective. A logframe is a management tool mainly used in the design, monitoring and evaluation of international development projects. He linked activities, outputs and purpose to the goal via the ‘if-then’ logic that underlies the logframe: if the assumptions stay unchanged, the if-then chain leads, step by step, to the long-term goal. It is a step-by-step process, based on a good framework and a theory of change, and on finding agreement on the manageable bits. ‘Promising the moon is especially problematic with media development projects because expectations are often too high.’
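To make that if-then chain concrete, here is a minimal sketch in Python of how a logframe’s levels link together. All descriptions, indicators and assumptions below are invented for illustration; they are not taken from Lambino’s presentation or any real project.

from dataclasses import dataclass

@dataclass
class LogframeLevel:
    description: str        # what this level promises to deliver
    indicators: list[str]   # measurable indicators for this level
    assumptions: list[str]  # conditions that must hold to reach the next level

# The four classic levels: activities -> outputs -> purpose -> goal.
activities = LogframeLevel(
    "Run reporting workshops for community radio journalists",
    ["number of workshops held", "number of journalists trained"],
    ["trainers and venues remain available"])
outputs = LogframeLevel(
    "Trained journalists produce investigative stories",
    ["stories aired per month using the new skills"],
    ["station managers grant airtime for the stories"])
purpose = LogframeLevel(
    "Audiences receive more reliable local news",
    ["audience trust scores from follow-up surveys"],
    ["audiences can access the broadcasts"])
goal = LogframeLevel(
    "More accountable local governance",
    ["documented official responses to the reporting"],
    [])

# The if-then logic: IF a level is achieved AND its assumptions hold,
# THEN the next level up should follow.
chain = [activities, outputs, purpose, goal]
for lower, higher in zip(chain, chain[1:]):
    print(f"IF '{lower.description}' (assuming: {', '.join(lower.assumptions) or 'none'})")
    print(f"THEN '{higher.description}'")

From an M&E perspective, the indicators at each level are what you would actually measure, while the assumptions are what you monitor to check whether the if-then logic still holds.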

Sofie Jannusch coordinates the knowledge-sharing initiative MediaME from within the German development organisation Cameco. ‘Media development professionals are not delivering enough proof; very few look deeper into the effects of the work that has been done. That is why we started MediaME two years ago. It is a collaborative initiative by many organisations and individual experts engaged in media development, offering resources and discussion about best practices in monitoring and evaluation in this area.’ The wiki project still has half a year to go before the budget runs out, and Jannusch used her time to recruit new, enthusiastic co-workers. She succeeded: at least three people from the course volunteered as participants.

Maureen Taylor from the University of Oklahoma stressed the long-term perspective: ‘Keep in mind the long-term goal of assistance. Do not reinvent the wheel, and there is no need to repeat mistakes made earlier. We spent more time collecting data than analyzing them.’ One of her practical suggestions, made almost as an aside, was to spend at least one full day in the newsroom of your grantee/partner organization.

Sheldon Himelfarb works for the US Institute of Peace, founded and funded by the US Congress. Its goal is to design media interventions for fragile societies. Himelfarb focused on the process as a strict and standardized way of doing things, using the example of aircraft taking off and landing every minute from carriers in a war situation: ‘This aircraft procedure is not a repetition but a well-established process, clear and transparent.’ Himelfarb represented the clear-cut US approach to development evaluations. His presentation raised many questions, mostly because many participants felt aircraft operations cannot be compared with development projects.

Course director Susan Abbott introduced most of the speakers and provided a kind of summary after the presentations. For instance: integrate M&E into media programmes and projects, where possible from beginning to end. Always look for a shared, collaborative approach – change is never a one-way street. And when there is donor coordination – the ideal situation – seek common ground on the indicators used.

This course once more reinforced my idea that development work has more to do with art than with science: the art of balancing instinct against a programmatic way of thinking. The presenters were very knowledgeable, open to all kinds of questions and remarks, and in that way provided us with sound insights into the M&E field. The multicultural diversity of my class and the blend of academic theory with practice added to the value of the course, and the programme was managed in a very smooth and enjoyable way. This combination made these two weeks an experience I will not easily forget. Moreover, my gut feeling tells me there will be a long-term spin-off: in lessons learned, experiences shared and network connections made.

Once again I thank the great team behind the course: Susan Abbott, Eva Bognar, Kate Coyer and Amelia Arsenault. They were firm but fair, are among the vanguard of M&E with regard to media, know how to keep a group of individuals together and – very important – know how to party.

Some of the relevant websites mentioned:

AudienceScapes

Freedom on the Net, Freedom House

Media Sustainability Index, IREX

Media Policy, by Marius Dragomir and Mark Thompson from the OSI Media Programme

Makutano Junction (case)

Kitchen Budapest (fieldtrip)

Radio C and Tilos Radio (fieldtrip)

Radio Okapi, DRC (case)

ProPublica, investigative journalism in the public interest

Global Voices Online

Radio Free Europe

Reporters sans frontières

Some M&E literature

Evaluating the evaluators, CIMA

Media Development Indicators, IPDC

Media map, Internews

Africa Media Development Initiative

Ten Steps to a Results-Based Monitoring and Evaluation System (pdf), World Bank, Kusek/Rist

The Road to Results, World Bank, Imas/Rist

Real World Evaluation, Bamberger/Rugh/Mabry

Several M&E guides

Evaluation for DFID

Good, but how good?, CIMA

Evaluation Manual, CIDA

Outcome Mapping, IDRC

Glossary of Key Terms in Evaluation and Results Based Management, OECD

For more guides and manuals
