
GLOBAL PARTNERSHIP, Volume 1, Issue 4 (October–December 2011)

PARTICIPATORY MONITORING AND EVALUATION

PRIA Global Partnership (PGP) is a global initiative on citizenship and democracy which aims to foster knowledge and relationships. It strengthens and nurtures partnerships across communities and countries to ‘make democracy work for all citizens’. It undertakes research, advocacy and capacity building activities, and provides demand-based advisory and consulting services. The broad thematic areas of PGP’s work are:
• Democratic Governance
• Participation, Voice and Social Accountability
• Effective and Empowered Civil Society
• Agency for Gender Equity
• Environmental Governance

Contents

Perspective
• A New Road to Measuring Development or Another Dead End? by Anne Garbutt

Practice-based Articles
• Dealing with Complexity Through a Variety of Planning, Monitoring and Evaluation (PME) Approaches by Cristien Temmink
• Co-Designing Learning, Monitoring and Evaluation (LME) Systems and Frameworks for Networks of Urban Poor by Kaustuv Kanti Bandyopadhyay
• Seeing Change Through: A Collaborative Approach to Monitoring Advocacy by Gweneth Barry and Janet Gunter
• Community Reflection Methodology: Suggested Framework to Promote Downward Accountability and Collective Learning by Awny Amer Morsy

Book Review
PGP Initiatives
Resources for Development Practitioners
Announcements

From the Director’s Desk

PRIA celebrates its 30th anniversary in February 2012. We are grateful to all our well-wishers for their continued support all these years. From the early days of PRIA, in our endeavour to advocate, enable and create spaces for participation, an important arena has been people’s participation in the monitoring and evaluation of development projects and programmes. It is coincidental that the current theme of Global Partnership is Participatory Monitoring and Evaluation, published while all of us are engaged in ‘revisiting our roots’ and ‘reconnecting our partnerships’.

PRIA Global Partnership is proud to publish this issue in collaboration with the International NGO Training and Research Centre (INTRAC), UK. All the articles published in this issue were presented and discussed at the International Conference on Monitoring and Evaluation: New Developments and Challenges, jointly organised by INTRAC, PSO and PRIA and held on 14-16 June 2011 in the Netherlands. We have chosen these articles from among several others primarily because they provide fresh insights and advocate courageously for innovative thinking in the search for ‘evidence of change’.

From 2012, Global Partnership, the e-newsletter, will be published as an e-journal three times a year in order to accommodate more articles without losing its niche for sharing practice-based knowledge generated by our colleagues around the world. We welcome your feedback, as always.

Kaustuv Kanti Bandyopadhyay
December 2011


Perspective

A New Road to Measuring Development or Another Dead End?
Anne Garbutt, INTRAC Fellow, INTRAC

As development organisations, we are witnessing a clear drive towards a results-based approach to Monitoring and Evaluation (M&E). Civil Society Organisations (CSOs) are being asked to justify how effectively they have spent the money donated to them. It is no longer sufficient for CSOs to report to donors or the public on what they have done or what they have produced; instead, there is a growing emphasis on results and evidence of change. This demand for greater accountability, focussing on results and impact, raises the question: are we measuring our work to feed a ‘results industry’ rather than measuring the complex changes we are seeing and the long-term impact on the lives of the poor?

Influences

A rise in high-profile critiques of international development as being neither effective nor accountable has led to an upsurge of what looks to be a crisis of faith in development from different quarters – the media, parliamentarians, high-profile development analysts and academics, and in some cases the general public. This crisis has increased the power of organisations that control the financing of aid to insist on their right to ask what impact their money is having. If organisations cannot demonstrate their impact (in some cases outputs and outcomes masquerading as impact) through their M&E systems, then future funding might be under threat.

A second trend has been the aid effectiveness agenda led by major bilateral donors and multilateral agencies, compounded by political change across Europe and by increased pressure from audit commissions on government departments responsible for international aid. The Paris Declaration on Aid Effectiveness was signed in March 2005 by over one hundred representatives of donor countries, international agencies and developing countries.1 Signatories committed their countries and organisations to increase efforts in harmonisation and alignment in order to reduce duplication of effort and waste in the use of aid, as well as to improve the management of aid for results. A set of ‘monitorable’ actions and indicators was established; consequently, the way governments of donor countries monitor their aid budgets, and the way governments of many partner countries track inflows and the use of aid, has changed dramatically. CSOs became drawn into an efficiency debate, in which how efficiently we spend money came to matter more than how we influence change in poor people’s lives.


Challenges

The effect on civil society, particularly those organisations that receive funding from donor agencies, has been profound. The architecture of international aid is now infused by a results-based culture. This total focus on effectiveness and efficiency, on cost of delivery and immediate results or outputs, is guiding the way many organisations measure their progress, including how they develop their M&E systems. The change is having a negative impact on organisations that approach the question of whether a programme was worthwhile from a very different perspective – by aiming to gain an increased understanding of the nature of the change that has taken place and determining the significance of that change. A primary concern arising from this oversimplification of measuring change is how to fit such information into the short-term, results-based frameworks favoured by donors. There is a major challenge around how to develop M&E systems that capture multiple levels of information needs, and that can both provide donors with their required results and ensure that the organisation is learning and growing. There is also an increasing reluctance on the part of some organisations to share their experiences and challenges whilst trying to measure social change using results-based methodologies. Many CSOs continue to find it difficult to introduce better quality M&E systems and approaches that will measure impact, or indeed anything much higher than outputs. These same organisations face major funding pressures, and few appear willing to share objective information about the impact they are having or the real challenges they are facing. Their concerns often relate to how the public view the organisation, and to how any ‘negative’ information will affect their funding prospects.


For example, while some interventions can be easily monitored through simple results-based frameworks, others, such as advocacy or empowerment, are much more difficult to measure in this way – the linear progress leading to concrete results is much less likely to be clear. These differences in intervention type can often be seen between large international organisations and those working directly with communities; sometimes they can even be encountered within the same organisation.

Navigating our way

To address the issues highlighted, we also need to consider the latest trends in M&E methodology. Progress has been made in the last few years in developing new M&E approaches and strengthening existing ones. Continued experimentation and discussion are essential to discover which M&E approaches are best suited to different contexts and types of intervention. The different approaches designed over the years – baseline surveys, outcome mapping, social auditing, logical framework analysis, most significant change stories – all have their place, but it is very clear that no single one of these tools is good enough to provide everything an organisation needs, particularly as organisations become larger and more complex in their operational needs.

Promoting participation

Ever since Robert Chambers wrote Rural Development: Putting the Last First in 1983, participation has been accepted as a key approach to M&E, and it is considered essential to ensuring the success of any community-based intervention. In his second book, Whose Reality Counts? Putting the First Last, Chambers challenges us to accept the change that will be brought about by rural people expressing and analysing their local and complex realities. However, these realities can sometimes be in conflict with the external perspectives other stakeholders are experiencing or expecting to measure. No matter how clear our expected results are, if there is no trust between the organisation serving the community and the community itself, the results will continue to be owned by the organisation and have very little impact on the community. The success of the project will depend not on how well the expected results have been defined but on improving trust levels; the latter is often not so easily measured.

Participation of partners in developing indicators is believed to be key to ensuring ownership of information and consequent reporting. Intensive support to facilitate the process of developing indicators is essential, as many smaller NGOs continue to struggle to provide data against results-focused indicators. One of the case studies from INTRAC’s 7th Evaluation Conference explored the importance of recognising the value of building both staff capacity and partner capacity to understand the concept of results at output, outcome and impact levels.2

Accountability versus learning

The trend towards a results-based approach to M&E creates another major tension: if our M&E systems are designed to measure agreed targets, they divert our focus from measuring for learning purposes. The need for results reduces the freedom to learn. The conference on Impact Evaluation in International Development held in Cairo in 20093 brought together over 600 evaluators, policymakers and development practitioners, including those who prioritised learning and others who prioritised accountability. Unsurprisingly, many questions were merely put on the table rather than answered, due to the wide diversity of priorities among the participants. However, there were glimmers of consensus, including on the importance of combining mixed methodologies suitable to support both accountability and learning for improving performance.

Conclusion

A worrying sign is that many of us seem to be taking a headlong sprint towards developing results-based frameworks. Is this finally a solution to our dilemma, or merely a race to produce quick-fix answers by measuring the lowest common denominator and focussing on the most easily measurable activity- or output-level indicators? The race to placate the results industry with a technocratic approach to M&E – rationally designed frameworks that provide increased managerial control, standardised procedures and replicable models – will inevitably come at the cost of measuring the space needed for civil society to grow, the space for dialogue and the perceptions of different stakeholders. We need a better balance, where we are able both to feed the industry and at the same time to measure the real change that has been brought about by civil society’s contribution to development.

1. See OECD (2008), ‘The Paris Declaration on Aid Effectiveness and the Accra Agenda for Action’, Paris, France: OECD. www.oecd.org/dataoecd/11/41/34428351.pdf
2. See www.intrac.org/pages/en/conferences.html for more details.
3. Cairo 2009: Conference on Impact Evaluation in International Development, SAGE Publications, http://evi.sagepub.com/content/15/4/487.full.pdf+html


Practice-based Articles

Dealing with Complexity Through a Variety of Planning, Monitoring and Evaluation (PME) Approaches1
Cristien Temmink, Consultant Learning for Change, PSO,2 The Netherlands

A number of recent trends and challenges in international development have pushed PME higher up the agenda of many development organisations. These include: (i) a growing international call for results-based management, whereby development actors are asked to be accountable for and demonstrate the achievement of ‘measurable’ results; (ii) a heated debate about the extent to which organisations should focus on quantifiable, easily measurable results versus less quantifiable results that are more difficult to measure; and (iii) growing recognition that dominant PME approaches such as the logical framework approach are not always helpful for organisations that are supporting complex processes of change. This article shares the emerging insights of a collaborative action research process (2010-2012) in which 10 development organisations, together with their Southern partners, explored whether and how more ‘complexity oriented’ PME approaches help them to address some of the challenges described above.

Towards a balanced PME approach

It takes more than just the right kind of PME tool or method to improve one’s PME practice in order to deal with complex change. Other dimensions appeared to be important in a complexity oriented PME approach.

Values or principles for PME:
• Strong commitment towards active participation of multiple stakeholders during the design and implementation of the PME system. An ‘actor focused’ PME approach involves stakeholders in reflecting on their own change process and on their roles and expectations.
• Commitment towards collaborative learning among stakeholders during PME activities. This is supported by fostering reflection, which often takes place through processes of dialogue between different stakeholders.

PME agenda (PME for what?):
Learning from the effects of programmes is an important motive for PME. Effects can mean different things to different actors, but one common denominator is the recognition that effects are changes (positive and negative, as well as intended and unintended) at the level of stakeholders. Improved programming by using the lessons generated by the PME system is another aspect of the PME agenda. This, however, does not happen automatically. In several instances stakeholders had to be actively involved in making sense of monitoring information during reflection sessions. This calls for a learning culture and sufficient space for learning. Learning and reflection need to be organised and integrated into working practices, and should contribute to making adjustments in the programme. The third important aspect of a PME agenda is satisfying upward and downward accountability needs. The pressure to demonstrate results is common in the contemporary context, whether or not it is communicated explicitly between NNGOs and SNGOs or between back donors and NNGOs.

Concepts, methods and tools for PME:
Methodological diversity is crucial in dealing with complex contexts, in order to plan and monitor changes from an actor oriented perspective. This involves looking for change within the actors involved in or affected by the programme. Unsurprisingly, this often leads to diverse information needs of different actors at different levels in the programme (e.g., donor organisation, local partners, beneficiaries). Addressing these diverse needs also requires different PME approaches, and sometimes a combination of PME approaches that involve various PME methods. The following list illustrates some key PME methods that were applied in the action research cases, by level:

• Northern NGO level: reflection meetings and internal workshops, including workshops with SNGOs.
• Southern NGO level: Outcome Mapping; Most Significant Change; reflection meetings; PMEL; outcome-level indicators; consumer panels.
• Final constituency level: Most Significant Change; logical framework; client satisfaction instruments (client satisfaction surveys, citizen reporting cards); a tailored M&E toolkit consisting of participatory PME methods such as impact maps, quizzes and personal goal exercises; impact-level indicators.


PME approach in practice:
Individuals within organisations are often the champions who make PME challenges and expectations explicit and who have the motivation to do something about them. Other enabling factors include: (i) an actual request for alternative PME methods by SNGOs; (ii) explicit support from higher management, by providing financial resources and time for staff involved in PME; and (iii) an environment of trust that allows dialogue among a group of people with a strong desire to learn from practice in order to improve that practice. Disabling factors include: (i) shortage of time for reflection and for trying out new PME methods; (ii) the absence of capacity and specific competencies for the application of new PME methods; and (iii) resistance resulting from previous or on-going experiences.

Learning to deal with complexity through PME

Insights from the first phase of the action research reveal that the ‘key’ for dealing with complex change processes lies within our own organisations and programmes. First, it takes a learning culture within an organisation to take up the challenge of customising and implementing different PME approaches that are relevant for a specific context. A crucial element of a learning culture is the presence of a group of people who have the motivation, the courage and the mandate to address PME challenges in their organisations by introducing new PME approaches. Support from higher management and trustful relationships can nurture such a learning culture.

Second, an actor focused PME approach has the potential to enable dialogue and collaborative learning, as it can contribute towards more trustful relationships and active participation of various stakeholders in PME activities. However, this takes a lot of effort; organisations which strongly value collaborative learning and active participation are best placed to sustain such efforts. Third, the ability to learn regularly about what works and what does not, and to adjust the programme accordingly, can help organisations deal with complex, unpredictable change. This implies a PME approach that facilitates cyclical relations between planning, monitoring and evaluation. Fourth, a methodologically varied PME approach has the potential to help organisations deal with complex contexts. The various levels in the programme where change can happen, and the various information needs of different stakeholders, call for a diverse ‘PME toolbox’ as well as the skills and resources to apply a mixed approach that aligns with the different levels of complexity in a specific programme. To conclude, exploring different PME approaches can contribute towards the increased internal adaptive capacity of organisations. In the action research this was evidenced by stronger reflective practice and a deliberate investment in learning practices (e.g., reflection meetings, peer assessments) in order to inform organisational practice and actions.

1. This is an abridged version of ‘Praxis Paper 26: Dealing with Complexity through Planning, Monitoring & Evaluation (PME)’, available to download at www.intrac.org/resources.php?action=resource&id=736
2. PSO is an association of sixty Dutch development organisations that focuses on capacity development of civil society in developing countries.


Co-Designing Learning, Monitoring and Evaluation (LME) Systems and Frameworks for Networks of Urban Poor
Kaustuv Kanti Bandyopadhyay, Director, PRIA Global Partnership

This article illustrates the process of co-designing an LME system and framework with two ‘national federations’ of Slum/Shack Dwellers International (SDI) and their ‘support organisations’ (together known as ‘national affiliates’) in Nepal and Sri Lanka. SDI is a trans-national network of national affiliates of the urban poor, active in 33 countries. SDI follows some common core rituals or methods developed through decades of experience. These rituals include savings, participatory enumeration and mapping of slum settlements, promoting horizontal learning exchanges between the urban poor, catalysing and supporting precedent-setting housing and/or local infrastructure projects, and engaging in partnerships with governments and other actors. These tools and practices help develop the capacity of the national federations to design, plan and implement housing and infrastructure projects and to influence public policies. In order to further improve the learning systems within SDI, nationally and trans-nationally, PRIA (Society for Participatory Research in Asia) was engaged as a facilitator to help develop an internal LME system.

The process of designing the LME system

Federation members carry out a range of activities for operationalising the various SDI rituals. In order to develop an understanding of the difference between ‘activities’ and ‘change’, an ‘intended change mapping’ exercise was carried out. The members listed all the activities they generally carry out to operationalise each ritual and then reflected upon the changes they ‘expect’ or ‘want’ to happen. The members identified the different ways in which ‘change’ could be defined, i.e., short term (outputs), medium term (outcomes) and long term (impacts), and rearranged the identified intended changes into these three categories. Concentrating on the outputs, the members identified corresponding change indicators for the various rituals to be monitored by the federation. The indicators measure different outputs at different levels of the federation. The decision about which level of the federation (primary groups, city/district level groups, or national) will collect data for the various indicators was discussed and agreed upon by the federations and NGOs.

The exercise on ‘visioning the change’ identified three levels of change (short, medium and long term) associated with their interventions. The idea was not to seek a direct linear relationship between the activities and the resulting changes at all levels, but to ascertain the contributions that a set of activities may make to these changes. A simple format was prepared and discussed with the federation. In this format, the federation collectively decided who will collect data, at what level, whether it will be in qualitative (narrative/story based) or quantitative form, and what support they would need from the support organisations and facilitators for this task. These formats helped in collecting some data for the first time and aggregating it at settlement, city, district and country levels, quarterly and annually.

At the outcome level, four broad ‘change domains’ were anticipated which could contribute to the goal of the federation: (i) a strong and capable federation of the urban poor; (ii) policies and institutions are changed to make them pro-poor; (iii) resources (e.g., land, subsidies, technical capacities) from government and financial institutions are leveraged to enhance affordability; and (iv) partnerships with government, academia, media and other non-government organisations are developed to garner external support. In pursuing these core objectives, federation members interact and interface with a number of stakeholders, and each national affiliate identified such stakeholders. It was realised that changes at the outcome level cannot be tracked or monitored without the help of some indicators. However, as the various city level federations are at different levels of maturity in terms of progress towards such outcomes, indicators which were fixed in nature and applicable throughout the federation were not very helpful. Instead, it was decided to use graded ‘progress markers’ in order to make the framework applicable to the entire federation, taking into account the uneven progress across the city level federations. The progress markers were graded as (i) fundamental (must be achieved), (ii) achievable (could be achieved) and (iii) difficult (better to achieve). Each national affiliate identified a number of progress markers for each stakeholder category.
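For teams that keep such graded markers in a spreadsheet or a simple script, the sketch below illustrates one possible way to record progress markers per stakeholder category and summarise achievement by grade. It is an illustration only, assuming hypothetical stakeholder categories, marker texts and field names; it is not the format used by the SDI federations or PRIA.

```python
# Illustrative sketch only: one possible way to record graded progress markers
# per stakeholder category. Field names, stakeholder categories and marker
# texts are hypothetical placeholders, not the federations' actual framework.
from collections import Counter
from dataclasses import dataclass

GRADES = ("fundamental", "achievable", "difficult")  # must / could / better to achieve

@dataclass
class ProgressMarker:
    stakeholder: str      # e.g. "local government", "financial institution"
    description: str      # the change in attitude, behaviour or relationship
    grade: str            # one of GRADES
    achieved: bool = False

def summarise(markers):
    """Count achieved and pending markers per grade, so uneven progress across
    city federations can be reported without forcing a single fixed indicator."""
    counts = Counter((m.grade, m.achieved) for m in markers)
    return {g: {"achieved": counts[(g, True)], "pending": counts[(g, False)]} for g in GRADES}

markers = [
    ProgressMarker("local government", "invites federation to planning meetings", "fundamental", True),
    ProgressMarker("local government", "allocates land for pilot housing", "achievable", False),
    ProgressMarker("media", "reports federation-led enumeration data", "difficult", False),
]
print(summarise(markers))
```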


Following the identification of the various progress markers relating to changes in attitudes, behaviour and relationships for each category of stakeholders, it was decided to collect information in the form of development stories to substantiate and describe the change against each progress marker, following the Most Significant Change (MSC) story method. By framing a simple question like ‘Looking back over the last [number of months], what do you think was the most significant change in relation to [a particular progress marker] in the [community/city/settlement]?’, the idea is to collect change stories that demonstrate and describe achievements against the various progress markers identified for each stakeholder. Each story will in turn substantiate the progress made towards the achievement of one of the four ‘change domains’ (strong federation, policy influence, leverage of resources and partnership development).

Contextual elements that influenced the LME design
• The national affiliates are working in an unpredictable and complex environment, including regressive actions by regulatory authorities (e.g., threats of or actual eviction). In such an environment a conventional linear model of LME, with predictable outputs and outcomes measured through fixed indicators of change, is inappropriate.
• The long-term objectives set by the national affiliates depend on changes in the current policies and institutions of government. This posed a considerable challenge to designing and applying LME systems which could track such long-term changes on a regular basis. In order to overcome this challenge, it was decided to design the system so that it could track, on an on-going basis, the short- and medium-term changes focusing on attitude, behaviour and relationships with various stakeholder institutions.

• Interventions of the national affiliates involve decision making at multiple levels. The information and learning generated through LME need to be collated, synthesised and analysed for all these levels.

Changes initiated by the LME designing process

The LME designing exercise helped catalyse collective reflection among the national affiliates, and they introduced some immediate changes in their practices. Some of these changes include:
• As the exercise clarified that planning, monitoring and evaluation is an integrated concept and practice, the national affiliates reflected on both their longer-term strategic planning and their short-term (annual and quarterly) planning processes. The practice of planning throughout the federations had been very informal and mostly intuitive; the exercise helped members introduce regular planning.
• During the discussion on ‘intended change mapping’, members also realised the current gaps between the existing capacities (organisational, technical and resource) of the federation and the interventions required to achieve the intended changes. The discussion then logically focused on how to augment or acquire these capacities in the collectives. It also prompted a deeper understanding of, and planning for, building new relationships and partnerships.
• The collective vision of change catalysed an enormous amount of energy among federation members. This energy, generated by the core group involved in the exercise, started radiating to provincial and district level federation members as well. The core group cascaded similar exercises with provincial and district level federations, and it became a federation-wide reflection on changing practices. It also catalysed a sense of ownership of the LME design as well as its processes.


Seeing Change Through: A Collaborative Approach to Monitoring Advocacy1
Gweneth Barry and Janet Gunter, CAFOD2

‘Normally, when we are asked how is the advocacy work going, we simply answer, “We are in the process” … but now we haven’t got that excuse anymore as we can see exactly where we are within this process.’ – Bolivian partner of CAFOD

In 2007 CAFOD was asked by one of its institutional donors to redevelop a log frame for an existing grant to capture change across the organisation. The new log frame required an indicator and means of verification to be developed. This was a challenging task for several reasons, as the political contexts in which CAFOD’s partners worked were very different. Some worked in relatively open democratic regimes, such as Brazil, whereas others existed in more repressive regimes where the space to engage and speak out against policies was quite limited, such as Cambodia. Another problem was the requirement to use a single indicator of change within the log frame, which seemed inadequate to measure the complex process of policy change. The traditional log frame structure points towards a simple linear view of the world, whereas advocacy initiatives (in which most of the partners were involved) are often more complex and can spring off in surprising directions. It was also recognised that advocacy can lead to different types of success. Policy change is one element, but it will be fragile and unconsolidated without the foundations of a strong strategy and community engagement.

The voice and accountability tool

In order to respond to these challenges, CAFOD followed an approach that enabled debate, dialogue and reflection on the different forms of change that could come about. To do this, a tool was developed to record change on a scale from level 1 to level 5 in four independent aspects of advocacy.3 Using this tool, the organisations had to identify the level which they felt best fitted their work and track how it changed over time. Levels were decided through discussion between members of the partner organisation’s staff and a CAFOD representative. Often members of the same organisation have different interpretations, and these are debated through the exercise, helping to generate a shared understanding. CAFOD’s role is to help facilitate, but also to challenge and question until the picture is clear. At the start of the process, the initial level is used to set a baseline. Annual follow-up conversations help reveal whether this has gone up, stayed the same or fallen.

The design process

The tool was designed through a collaborative effort to ensure the use of the collective knowledge of CAFOD and its partners. The idea was to understand the relationship of each organisation with those it was representing; each organisation had to identify its own constituency rather than make claims to speak for generic groups. The basic framework was developed and then discussed with partners and staff. Inspiration was taken from Arnstein’s ladder of participation4 to develop drafts. These were then refined through tools such as a ‘low-tech wiki’: the draft text was written up on large paper and CAFOD staff were given markers to edit, question or delete as they saw fit, allowing collective debate. In all of this, the work with users of the tool and with designers, including CAFOD’s Senior Designer,5 was crucial to the process. An interdisciplinary design team was formed, which included a designer, an illustrator, advocacy staff, a monitoring and evaluation adviser and programme staff.

The outcome

CAFOD used the tool with 14 partners in the Americas, Africa and Asia. With the use of the tool, the initial targets set with the donors were not only met but exceeded, and it emerged that a significant percentage of partners were making progress in one of the four categories over a three-year period. Setbacks and challenges were also registered. The tool appeared to have its own ‘agency’, as it provided new perspectives on ‘advocacy’.

Challenges

During the three-year monitoring period it emerged that it was hardest to make progress in the ‘constituency and community’ column. This column was also the most sensitive, as in some way or other it questioned the partners’ links to their ‘grassroots’. Another challenge was the presentation of data. People needed to be able to access trends and gain a sense of the results quickly, but at the same time over-simplification had to be checked.


The levels do have a number but drawing a numerical average, or median, across our partners globally would have been an empty exercise. It would represent what Dilnot and Blastland call the ‘white rainbow’ – an average devoid of all important colours.6 Therefore, the focus was on representing the percentages of partners who improved over the three years, and graphically representing the trajectories of a select group.
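The choice to report movement rather than an average can be made concrete with a small, purely hypothetical calculation. The sketch below (with invented partner names and levels, not CAFOD data, and aspect labels assumed from the footnote above) reports, for each of the four aspects, the share of partners whose level rose, stayed the same or fell between the baseline and year three.

```python
# Purely illustrative: hypothetical baseline and follow-up levels (1-5) for a
# few partners across four advocacy aspects; the data and names are invented.
ASPECTS = ["government", "corporate", "strategy", "constituency"]

baseline = {
    "partner_a": {"government": 2, "corporate": 1, "strategy": 2, "constituency": 1},
    "partner_b": {"government": 3, "corporate": 2, "strategy": 1, "constituency": 2},
    "partner_c": {"government": 1, "corporate": 1, "strategy": 2, "constituency": 2},
}
year_three = {
    "partner_a": {"government": 3, "corporate": 1, "strategy": 3, "constituency": 1},
    "partner_b": {"government": 4, "corporate": 3, "strategy": 2, "constituency": 2},
    "partner_c": {"government": 2, "corporate": 1, "strategy": 2, "constituency": 1},
}

def movement_by_aspect(before, after):
    """For each aspect, report the percentage of partners who improved,
    stayed the same or declined - instead of a single averaged level."""
    summary = {}
    for aspect in ASPECTS:
        diffs = [after[p][aspect] - before[p][aspect] for p in before]
        n = len(diffs)
        summary[aspect] = {
            "improved": 100 * sum(d > 0 for d in diffs) / n,
            "unchanged": 100 * sum(d == 0 for d in diffs) / n,
            "declined": 100 * sum(d < 0 for d in diffs) / n,
        }
    return summary

for aspect, shares in movement_by_aspect(baseline, year_three).items():
    print(aspect, shares)
```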

Conclusion

The monitoring of advocacy has always been difficult, as it is not necessarily a linear process and therefore not easy to fit within predetermined models. This is why it is important to recognise this complexity in both the method and the representation of the data. Through this tool, CAFOD is hoping to capture the unplanned and unpredictable, the good and the bad, and the detail alongside the bigger picture.

1. This is an abridged version of the main paper presented at the International Conference on Monitoring and Evaluation: New Developments and Challenges, held in the Netherlands from 14-16 June 2011 and organised by INTRAC, PSO and PRIA. To read the full paper please visit: http://www.intrac.org/data/files/Barry_Gunter_Seeing_Chnge_Through_Intrac_June_11.pdf
2. CAFOD is the official Catholic aid agency for England and Wales. It works with more than 500 partners overseas, and with partners in the UK, to reduce poverty.
3. The tool measures four different aspects of advocacy: engagement with government processes, engagement with corporate actors (if this is relevant), organisational advocacy strategy, and community or constituency development.
4. See http://lithgow-schmidt.dk/sherry-arnstein/ladder-of-citizen-participation.html
5. Our Senior Designer studied with John Wood, the driver behind the development of a new approach to design practice called ‘metadesign’ (see http://en.wikipedia.org/wiki/Metadesign).
6. Blastland, M. and Dilnot, A., 2008, The Tiger That Isn’t: Seeing Through a World of Numbers, London: Profile Books.

Community Reflection Methodology: Suggested Framework to Promote Downward Accountability and Collective Learning1
Awny Amer Morsy, Monitoring, Research & Evaluation Consultant, and Capacity Building Trainer with INGOs and NGOs

Community Based Monitoring (CBM) is focused on participatory monitoring and evaluation, and this underpins the process of individual and collective learning. Many development professionals as well as public service agencies tend to define it as a process where concerned citizens, government agencies, industry, academia, community groups and local institutions collaborate to monitor, track and respond to issues of common community concern. CBM is considered the cornerstone of participatory monitoring and evaluation, understood as ‘a process that leads to corrective actions by involving all levels of stakeholders in shared decision making’. CBM has been widely used in many sectors, with a primary focus on water and sanitation, hygiene and health related practices, and education activities. Its increasing popularity can be traced to its being a multi-benefit monitoring approach that brings people together from different groups.

What is Community Reflection Methodology (CRM)?

CRM, as a key driver of CBM, is a participatory process to track what changes took place in a community or village in a given period of time. Since communities are unique, any approach to CBM should be appropriate to the local context, i.e., it should continually evolve and be flexible to change. It involves four key inter-related phases: Community Mapping, Participation Assessment, Capacity Building, and Information Gathering and Delivery. CRM provides the means to work together to gather and deliver information and to adapt to change, not as isolated communities but as a network that learns from each other and shares resources. A coordinated network of CRM can provide standardised protocols, training support and data management systems. It can also provide decision-makers with early warnings of health, education and child rights issues before they become catastrophes that are difficult to contain.

CRM depends mainly on combining the Most Significant Change technique, a participatory M&E tool that captures outcomes from an individual’s perspective and promotes collective learning, with other community reflection tools such as the H-method, Spider Diagram and Mood Meter, which can be used to capture outcomes from the community’s perspective through participatory, user-friendly visual aids.

Community reflection tools

CRM proposes the following two components, comprising four tools/techniques:

1. Most Significant Change (MSC)
The MSC technique is a form of participatory monitoring and evaluation used to follow up and monitor qualitative outcomes. It is participatory because many project stakeholders are involved both in deciding the sort of changes to be recorded and in analysing the data. It is a form of monitoring because it occurs throughout the programme cycle and provides information to help people manage the programme, and it also contributes to evaluation. The philosophy of using MSC is to promote collective learning among programme participants and stakeholders. It involves the collection of Significant Change (SC) stories emanating from the field level, and the systematic selection of the most significant of these stories by a panel of designated stakeholders or staff, in addition to partners and beneficiaries.

2. Participatory Community Reflection
Participatory tools are used to conduct evaluation and are generally visual. Some of the important participatory tools are:

a. Mood Meter
The Mood Meter is a CRM tool and exercise designed to evaluate the performance of a programme by a community. It can be used in both small and large groups (5-20 persons). It is particularly useful and appropriate for evaluation work at the community level and with diverse groups of programme beneficiaries.

b. H-method
The H-method is a relatively easy participatory community reflection technique for generating data from participants to evaluate the achievement of the objectives of the programme. This tool incorporates elements of ranking, consensus-building and evaluative approaches to a given issue. The H-method enables individuals and/or groups to record their own views and ideas in a non-threatening yet structured way, and fosters individual expression as well as common understanding and consensus. It can be used in meetings, workshops, conferences and other group discussions.

c. Spider diagram
This tool plays a role similar to the previous one, the key difference being that the spider diagram can be used to evaluate and assess more than one programme at the same time across the national scope of an organisation’s work.

Impact assessment using combined MSC and community reflections

Information and related analysis generated through MSC and community reflections should be brought onto the same platform to see the impact of the programme intervention in the community. The individual stories of change are compiled according to domains. Similarly, the achievements of programmes mentioned in the community reflections are compiled for each domain. Using different participatory tools and critical in-depth analysis by the people enables programme staff and CBO leaders to view both positive and negative impacts of the interventions.
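To make the compilation step concrete, the sketch below shows one possible way of placing MSC stories and community reflection findings on the same platform, grouped by domain. The domains, story texts and findings are invented placeholders; only the grouping logic is being illustrated, not any specific programme’s data.

```python
# Illustrative sketch only: grouping collected change stories and community
# reflection results by domain, so both can be read side by side per domain.
# Story texts, domains and findings are hypothetical placeholders.
from collections import defaultdict

msc_stories = [
    {"domain": "health", "storyteller": "community member", "story": "clinic hours extended after joint monitoring"},
    {"domain": "education", "storyteller": "parent", "story": "school committee now publishes attendance data"},
    {"domain": "health", "storyteller": "volunteer", "story": "handwashing practised in most households"},
]
community_reflections = [
    {"domain": "health", "tool": "mood meter", "finding": "mostly positive"},
    {"domain": "education", "tool": "H-method", "finding": "mixed; consensus on need for teacher training"},
]

def compile_by_domain(stories, reflections):
    """Place individual MSC stories and community-level reflections on the
    same platform, keyed by domain, as the combined impact assessment suggests."""
    combined = defaultdict(lambda: {"stories": [], "reflections": []})
    for s in stories:
        combined[s["domain"]]["stories"].append(s["story"])
    for r in reflections:
        combined[r["domain"]]["reflections"].append(f'{r["tool"]}: {r["finding"]}')
    return dict(combined)

for domain, material in compile_by_domain(msc_stories, community_reflections).items():
    print(domain, material)
```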

1. This is an abridged version of the main paper presented at the International Conference on Monitoring and Evaluation: New Developments and Challenges, held in the Netherlands from 14-16 June 2011 and organised by INTRAC, PSO and PRIA. To read the full paper please visit: http://www.intrac.org/data/files/ME_conference_papers_2011/Working_groups_papers/Working_group_5/Awny_Amer-_Community_Reflection_Methodology_paper.pdf


BOOK REVIEW

Planning, Monitoring and Evaluation in Development Organisations: Sharing Training and Facilitation Experiences
Author: John de Coninck, Sage, 2008
Reviewed by: Mandakini Pant, Consultant in Development

This book provides practical insights into organisational approaches to Planning, Monitoring and Evaluation (PME). The book’s editors have rightly warned the readers at the outset that the book does not purport to be a PME manual. Instead, they intend to share their real-life experiences as PME facilitators and offer suggestions to support PME processes, with a focus on Civil Society Organisations (CSOs) including NGOs, church-linked development offices, networks and people’s organisations. They have handpicked ‘real-life’ experiences of 20 PME trainers and facilitators from Africa, Asia and Europe.

A basic premise of the book is that effective PME is essential for organisational survival and enables organisations to make an effective contribution to sustainable development. Few organisations have the know-how and skills to employ PME approaches, and fewer still are able to design and implement effective PME systems. They tend to perceive PME as an imposed donor conditionality rather than a way to strengthen their internal learning. Consequently, a blueprint approach, with an emphasis on the application of instruments for efficiency and control, is adopted. This approach tends to neglect the specificities of organisations vis-à-vis their interventions. An organisational approach to PME is rooted not only in programmes and projects but also in the operational contexts of the organisation, and it can be nurtured through careful and sensitive PME facilitation.

The book is structured in three parts, each detailing the components necessary for building an organisational approach to PME systems. Each part is discussed elaborately in separate chapters. A brief presentation of each part follows.

Part I, Why Are We Here?, explains the rationale of the book and addresses the basic concepts in PME.

It reflects on the challenges and the advantages of PME, highlighting in particular the need to support an organisational approach to PME, customised PME and the integration of PME into daily learning practices. This section also focuses on the language used in PME practice and the confusion that arises, and delineates a few working definitions for facilitators. It suggests ways in which an organisational approach to PME can be facilitated effectively, beginning with the rationale for facilitating a total organisational approach to PME: going beyond the project or programme to realising a coherent vision and mission of the organisation and understanding its larger operational context. The need to include financial PME as part of overall PME is touched upon as well.

Part II, Bringing PME in Daily Practice, suggests an organisational approach to PME. It explores what is needed at the organisational level for effective PME by highlighting the generic issues faced by many organisations and devising possible ways to overcome them. It also reflects on the organisational resources and structures necessary for PME work and the ways in which organisational learning on PME can be strengthened. Reflections on the challenges facilitators face in stimulating ownership of, and participation in, PME are included. This section also examines how influences beyond the boundaries of organisations, such as donors and national culture, need to be dealt with.

Part III, Further Customising PME, outlines ways to adapt PME processes and systems to fit diverse organisational characteristics, needs, skills and types, and probes some practical dimensions of adapting PME processes and systems to diverse organisational contexts. It presents PME facilitation experiences with partner organisations involved in advocacy, emergency situations or capacity development, discusses ways to facilitate different types of PME processes, and explores some of the PME challenges associated with interventions in advocacy, emergency situations, capacity development, and conflict and peace. It sums up the ways to plan, implement and evaluate such a process, as well as the requisite attitude facilitators need to adopt.

The book has been written essentially for PME facilitators and trainers supporting CSOs, PME practitioners working within CSOs or as consultants, and other PME users such as donor agency staff. It rightly highlights many critical challenges in facilitating PME within development organisations, and the practical counsel it offers is wise and timely. It is essential reading for all those in search of grounded perspectives on designing and using a well-structured PME system within development organisations.


PGP Initiatives

PRIA Global Partnership, in collaboration with Pro Public, Nepal, is engaged in providing capacity development support to Civil Society Organisations (CSOs) in Nepal to deepen social accountability practices in the country. The capacity development interventions are being undertaken through the World Bank initiative ‘Programme on Accountability in Nepal’ (PRAN). This multi-year initiative focuses on three thematic areas: (i) municipal good governance, (ii) public finance management, and (iii) public service delivery. The programme also emphasises knowledge generation, networking and partnership, and provides grant support to local CSOs to practise social accountability mechanisms. PRAN is expected to deepen democratic governance practices in Nepal by providing relevant information to citizens and facilitating civil society and citizen associations to hold governments accountable.

RESOURCES FOR DEVELOPMENT PRACTITIONERS

Report on ‘International Conference on Monitoring and Evaluation (M&E): New Developments and Challenges’

INTRAC (UK), PSO (the Netherlands) and PRIA (India) organised an international conference in the Netherlands from 14-16 June 2011. It examined the key elements and challenges confronting the evaluation of international development, including its funding, practice and future. The purpose was to review and share new initiatives, approaches, issues and challenges in the M&E of development.
To read the conference report, please visit: http://www.intrac.org/data/files/resources/714/Monitoring-and-Evaluation-New-developments-and-challenges-conference-report.pdf
To view the conference video, visit: http://www.intrac.org/pages/en/conferences.html

INTRAC Resources

Monitoring and Evaluating Capacity Building: Is It Really That Difficult? (Praxis Paper 23)
Whilst few doubt the importance of capacity building, and the need for effective monitoring and evaluation (M&E) to support this work, the M&E of capacity building is as much a challenge now as it was two decades ago. This paper examines both theory and current practice, and aims to promote debate on some of the key barriers to progress. To read the full paper please visit: http://www.intrac.org/data/files/resources/677/Praxis-Paper-23-Monitoring-and-Evaluating-Capacity-Building-is-it-really-that-difficult.pdf

HIV in the Workplace: 20 Ways for INGOs to Help Partners
HIV is not just ‘out there’ in communities; it also affects the staff of the partner organisations we work with. Many international NGOs are waking up to this reality. Some have developed innovative and effective ways to encourage partners to respond to HIV and AIDS in the workplace, assisting partners to develop organisational resilience to HIV and AIDS. Others are aware of the need to take action, but are just not sure how. To read the full paper please visit: http://www.intrac.org/data/files/resources/484/HIV-in-the-Workplace-20-ways-for-INGOs-to-help-partners.pdf

The Use and Abuse of the Logical Framework Approach
The logical framework approach (LFA) has come to play a central role in the planning and management of development interventions over the last twenty years. Although the logical framework has become universally known, it is far from universally liked. It has been the subject of much criticism over the years, concerning both the theoretical basis of the approach and the way it is applied in practice. This review takes stock of the current views of international development NGOs on LFA and the ways in which they use it. To read the full paper please visit: http://www.intrac.org/data/files/resources/518/The-Use-and-Abuse-of-the-Logical-Framework-Approach.pdf

Participatory Monitoring and Evaluation in Practice: Lessons Learnt from Central Asia (Praxis Paper 21)
This paper records an attempt to develop a fully participative M&E system, drawing on the experience of a team of INTRAC staff working on a civil society strengthening programme in close collaboration with their partners in the five countries of Central Asia. To read the full paper please visit: http://www.intrac.org/data/files/resources/420/Praxis-Paper-21-PME-in-Practice.pdf

Tracking Progress in Advocacy: Why and How to Monitor and Evaluate Advocacy Projects and Programmes
This paper introduces the scope of, and rationale for, engaging in advocacy work as part of development interventions. It then focuses on the issue of monitoring and evaluating these efforts, offering reasons why and when these processes should be planned and implemented, what is involved, and who should be engaged in the process. http://www.intrac.org/data/files/resources/672/Tracking-Progress-in-Advocacy-Why-and-How-to-Monitor-and-Evaluate-Advocacy-


Announcements

Reconnecting with Our Roots: Celebrating 30 Years of PRIA
It has been three decades – 30 years – since PRIA (Society for Participatory Research in Asia) began its journey as a development organisation committed to social change. To celebrate 30 years of PRIA, teams of current PRIA staff undertook journeys to revisit and reconnect with our roots – the people we had touched, the project sites we had worked in, the partners who had helped us along our way, and ex-colleagues who contributed to building PRIA. We wanted to learn what we did, how we did it and what difference it had made – to help strengthen us as we re-commit ourselves to ‘empowering the poor and the excluded such that they can claim their rights and improve their lives’. To know more please visit: http://www.pria.org/events/upcoming-events/details/54-reconnecting-with-our-roots-celebrating-30-years-of-pria

Launch of PRACTICE IN PARTICIPATION (www.practiceinparticipation.org)
PRACTICE IN PARTICIPATION is a one-of-its-kind knowledge portal which aims at global south-to-south collaboration for preserving, maintaining and collaborating on issues and practices of social justice. It is an invited space for practitioners to share their local knowledge and learn from others’ practical experiences, and to participate in the generation, production and dissemination of knowledge based on experiences from the field.
• If you work with cross-cutting participatory practices – participatory approaches, methods, tools, principles and concepts – in your work,
• Are a practitioner, academic, researcher, community leader/mobiliser or policy-maker who realises the need for ‘grounding’ your understanding and activities in the grassroots, and
• Want to share your experiences and make them available to a wider audience…
PRACTICE IN PARTICIPATION is the platform for you! The portal is distinctive in its emphasis on cross-cutting methodological practices – participatory approaches, methods, tools, principles and concepts – which are already being practised somewhere on the ground. Our focus is on maintaining and presenting a platform for archiving and creating a repository of ‘grey literature’, especially for community-based organisations. We believe that with collective learning, the empowerment of communities can proceed at a faster pace than ever. Join us now! (http://practiceinparticipation.org/index.php/signup)

Certificate Course on International Perspectives on Participatory Monitoring and Evaluation from PRIA International Academy of Lifelong Learning (PIALL)
This innovative course, one of its kind in a distance education mode, considers a range of conceptual and practical issues faced by practitioners, adult educators, researchers, resource providers and policy makers in managing organisations and programmes in a participatory manner. The course focuses on participatory monitoring and evaluation and facilitates critical analysis of different debates and perspectives on the theme. The next offering of the course is in April 2012, for which the last date for registration is 15 March 2012. To know more about the course and to register, please visit: http://www.pria.org/about-pria/our-divisions/piall/piall-distance-learning-courses/overview-current-course-offers/international-perspectives-in-participatory-monitoring-and-evaluation-ippme

Read past issues of Global Partnership

• Volume 1, Issue 3 (July–September 2011)
• Volume 1, Issue 2 (April–June 2011)
• Volume 1, Issue 1 (January–March 2011)

Request for Contributions

Interested individuals can share their experiences and learning with a wider audience. Contributions are invited from all readers – development practitioners, consultants, academicians, research students, etc. For information regarding article guidelines (word limit, font, reference style, etc.), write to pgp@pria.org, mentioning the specific theme for which you wish to contribute.


This issue of Global Partnership is published in partnership with the International NGO Training and Research Centre (INTRAC), UK.

Global Partnership is the quarterly e-newsletter of PRIA Global Partnership (PGP). The aim of the newsletter is to promote discourses among the partners and to support mutual learning and sharing. Each issue of Global Partnership will focus on a specific theme highlighting conceptual as well as empirical experiences. Publication of this newsletter is supported by SDC (Swiss Agency for Development and Cooperation). The opinions expressed in Global Partnership are those of the authors and do not necessarily reflect those of PGP or PRIA. Readers are welcome to reproduce materials published in Global Partnership. We request clear acknowledgement of Global Partnership as the source (including volume number, issue number and publication months).

Global Partnership is available free of charge. To subscribe, please visit www.pria.org/newsletters. PRIA Global Partnership (PGP), 42 Tughlakabad Institutional Area, New Delhi 110062, India, Tel: + 91-11-29960931/32/33 Fax: +91-11-29955183 Website: www.pria.org, e-mail: pgp@pria.org

