Preventing Corruption in Humanitarian Operations

SECTION III: CORRUPTION THROUGH THE PROGRAMME CYCLE

• Always verify or cross-check information
Allow time and budget for cross-checking M&E findings. Use multiple information sources, different tools for data collection and varied skills within the team. Watch for possible biases; ensure certain projects or sites aren't kept from monitors and that minority groups are included in data collection. Check information with other agencies working in the same region.

• Follow up on suspicious reports
Follow up reports that you suspect are biased or exaggerated. Check whether they're typical of the programme type, the staff responsible or the emergency context. Make surprise site visits to verify report conclusions, and ensure management acts on M&E findings.

You'll need
• Simple monitoring forms and templates setting out key factors to monitor.
• A set of basic evaluation standards for programmes, against which all evaluations should assess performance.
• To ensure field staff understand the importance of evaluations and cooperate fully.
• Feedback mechanisms for stakeholders to comment on M&E reports.
• Sufficient resources to follow up on suspicious reports (and spot-check others).
• Objective, verifiable indicators of project success, e.g. indicator tracking tables (a simple illustrative sketch follows the Challenges list below).

Challenges
• Staff or stakeholders with vested interests misinforming monitors and evaluators.
• Resistance from management or donors to allocating sufficient resources to M&E.
• The tendency to let M&E reports gather dust: ensure they're read and acted on.
• Challenges around staff rotation: inconsistency, loss of institutional knowledge, and new staff who may be easier to manipulate.
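To make the idea of an indicator tracking table concrete, here is a minimal sketch of how one might be kept and used to flag reports for follow-up. Everything in it is an illustrative assumption rather than part of the handbook: the field names, the sample figures and the 20% variance threshold are hypothetical, and an agency could equally keep such a table in a spreadsheet.

```python
# Minimal sketch of an indicator tracking table with a cross-check flag.
# Field names, sample figures and the variance threshold are illustrative
# assumptions, not taken from the handbook.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    target: float    # planned value for the reporting period
    reported: float  # figure given in the programme report
    verified: float  # figure found during a spot-check or site visit

    def variance(self) -> float:
        """Relative gap between the reported and verified figures."""
        if self.verified == 0:
            return float("inf") if self.reported else 0.0
        return abs(self.reported - self.verified) / self.verified

# Example table for a hypothetical distribution project.
tracking_table = [
    Indicator("Households receiving food kits", target=1200, reported=1180, verified=1150),
    Indicator("Latrines constructed", target=80, reported=78, verified=45),
]

THRESHOLD = 0.20  # flag reports differing from spot-checks by more than 20%

for ind in tracking_table:
    flag = "FOLLOW UP" if ind.variance() > THRESHOLD else "ok"
    print(f"{ind.name}: reported {ind.reported:g}, verified {ind.verified:g} -> {flag}")
```

In this sketch the second indicator would be flagged, since the verified figure is far below the reported one; that is exactly the kind of discrepancy the guidance above says should trigger surprise site visits and management follow-up.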

Reference materials
• AA Sri Lanka: Community Review, Colombo, n.d.
• AID: Monitoring and Evaluation (M&E), 2009.
• Byrne, Catriona (ed.): Monitoring and Evaluation, in “Participation by Crisis-Affected Populations in Humanitarian Action: A Handbook for Practitioners”, chapters 6 (pp. 193–209) and 7 (pp. 211–227), ALNAP, ODI, London 2003.
• FitzGibbon, Atallah: How to Monitor and Evaluate Emergency Operations, IR Handbook, May 2007, IR Worldwide, Birmingham 2008.
• HAP International: Benchmark 6: Continuous improvement, in “The Guide to the HAP Standard: Humanitarian Accountability and Quality Management”, Oxford 2008.
• ProVention Consortium: What is monitoring & evaluation?, IFRC, n.d.
• Qualité COMPAS (Quality COMPAS): Criteria and Tools for the Management and Piloting of Humanitarian Assistance, 2007.
• The Sphere Project: Common Standard 5: Monitoring, and Common Standard 6: Evaluation, in “Sphere Humanitarian Charter and Minimum Standards in Disaster Response”, 2004.

