
Data Quality Assessment (DQA) & M&E Guidelines April 23, 2012


AGENDA

I – INTRODUCTION
• What is DQA?
• Why does USAID perform DQA?
• What are the standards for data quality?


I - INTRODUCTION

1.1 What is DQA?

• DQA stands for Data Quality Assessment.

• Data play a central role in establishing effective performance management systems, so ensuring good data quality is essential.

• DQA addresses the quality of the data sets used to estimate performance indicators.


I - INTRODUCTION

1.2 Why Does USAID Perform DQA?

• Data sets are used to estimate indicators, so we need to be concerned with the strengths and limitations of each data set.

• Because indicators are used to make management decisions, the quality of the underlying data set directly affects the accuracy of those decisions.


I - INTRODUCTION

1.3 What Are the Standards for DQA?

• As per ADS 203.3.5.1, the quality of the data can be judged with respect to five quality standards:

• Validity
• Integrity
• Precision
• Reliability
• Timeliness


1. VALIDITY [ Valid representation of performance ]

• Do the data clearly, directly, and adequately represent the intended result? Data are valid to the extent that they clearly, directly, and adequately represent the result they are intended to measure. Measurement errors, unrepresentative sampling, and simple transcription errors may adversely affect data validity.


• What are Measurement Errors? – Measurement error results primarily from the poor design or management of a data collection process.

• What is Unrepresentative Sampling? – Data are representative if they accurately reflect the population they are intended to describe; sampling is unrepresentative when the sample does not reflect that population.

• What are Simple Transcription Errors? – Simple data entry errors made when transcribing data from one document or database to another (see the sketch below).
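To make the transcription-error check concrete, here is a minimal sketch in Python of a double-entry comparison combined with a simple range check. The field name, expected range, and double-entry workflow are illustrative assumptions, not FTF requirements.

```python
# Minimal sketch: flag possible transcription errors by comparing two
# independently keyed entries of the same record (double entry) and
# applying a range check. Field name and range are illustrative only.

EXPECTED_RANGE = {"beneficiaries_trained": (0, 10000)}

def find_transcription_issues(entry_a, entry_b):
    """Compare two independently keyed entries of one record and run range checks."""
    issues = []
    for field, value_a in entry_a.items():
        value_b = entry_b.get(field)
        if value_a != value_b:
            issues.append(f"{field}: double-entry mismatch ({value_a} vs {value_b})")
        if field in EXPECTED_RANGE:
            lo, hi = EXPECTED_RANGE[field]
            if not lo <= value_a <= hi:
                issues.append(f"{field}: {value_a} outside expected range {lo}-{hi}")
    return issues

# Example: the second clerk typed 130 where the source document says 13.
print(find_transcription_issues({"beneficiaries_trained": 13},
                                {"beneficiaries_trained": 130}))
```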


2. INTEGRITY [ Free of manipulations ]

• Are mechanisms in place to prevent or reduce intentional manipulation of the data, for political or personal reasons, as they are collected, analyzed, and reported? Data that are collected, analyzed, and reported should have mechanisms in place to reduce the possibility that they are manipulated for political or personal reasons. Data integrity is at greatest risk of being compromised during data collection and analysis.


3. PRECISION [ Acceptable Margin of error ]

• Do the data present a sufficiently accurate picture of performance? Data should be precise enough, with an acceptable margin of error, to present a fair picture of performance and enable the team to make confident management decisions.
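As a worked illustration of "acceptable margin of error", the short Python sketch below computes the approximate 95% margin of error for an indicator estimated as a sample proportion. The sample size and proportion are made-up numbers used only for the example.

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """Half-width of the approximate 95% confidence interval for a proportion."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Illustrative numbers only: 62% of 400 sampled households adopted a practice.
p_hat, n = 0.62, 400
moe = margin_of_error(p_hat, n)
print(f"Estimate: {p_hat:.0%} +/- {moe:.1%}")  # roughly 62% +/- 4.8%
```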


4. RELIABILITY [ Consistent data collection/analysis methods ]

• Do the data reflect stable and consistent data collection processes and analysis methods over time? Data should reflect stable and consistent data collection processes and analysis methods over time. Managers should be confident that progress toward performance targets reflects real changes rather than variations in data collection methods. Reliability can be affected by threats to validity and by changes in the data collection process.


5. TIMELINESS [ Frequent and current data set is available ]

• Are the data timely enough to influence management decision-making at the appropriate level? Data should be available with enough frequency and should be sufficiently current to inform management decision-making at the appropriate levels. Effective management decisions depend upon regular collection of up-to-date performance information.
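A minimal sketch of how a timeliness check might be operationalized, assuming a quarterly reporting cycle; the dates and the 90-day threshold are illustrative assumptions, not prescribed by USAID.

```python
from datetime import date

def is_current(last_reported, max_age_days, today=None):
    """True if the most recent data point is recent enough for the reporting cycle."""
    today = today or date.today()
    return (today - last_reported).days <= max_age_days

# Illustrative check: a quarterly indicator last reported on 10 Jan 2012,
# reviewed on 23 Apr 2012 (104 days later), fails a 90-day freshness rule.
print(is_current(date(2012, 1, 10), max_age_days=90, today=date(2012, 4, 23)))  # False
```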


I - INTRODUCTION

Quality Standards and Definitions

1. Validity [ Valid representation of performance ]
   – Do the data represent clearly, directly and adequately the intended result?

2. Integrity [ Free of manipulations ]
   – Do the data collected, analyzed and reported have mechanisms in place that prevent or reduce intentional manipulation for political or personal reasons?

3. Precision [ Acceptable margin of error ]
   – Do the data have an acceptable margin of error?

4. Reliability [ Consistent data collection/analysis methods ]
   – Do the data reflect stable and consistent data collection processes and analysis methods over time?

5. Timeliness [ Frequent and current data set is available ]
   – Are the data timely enough to influence management decision-making at the appropriate level?


Reported performance measures are the result of the process represented below:

1. Data collection;
2. Data transcription;
3. Data analysis;
4. Result estimation; and
5. Performance reporting.


FTF Partners' M&E System Requirements to Ensure Data Quality

1. The project should have a Performance Management Plan (PMP) in place that lists the indicators and details their definitions, collection methodologies, data sources, the tracking process from field to head office, data storage, the evaluation plan, sample survey forms, responsibilities of the respective staff, etc. The PMP should include clear guidelines/instructions for cleaning, analyzing, reporting, and assessing the quality of the data. For the FTF indicators, the project will STRICTLY adhere to the FTF Indicator Handbook/Reference Sheets. For any custom indicators, the project will develop indicator reference sheets with definitions and collection methods and submit them to USAID for review and approval.

2. All M&E team members should have copies of the project PMP.


FTF Partners' M&E System Requirements to Ensure Data Quality

3. The M&E team MUST be properly trained, and there should be documentation of where/how they have been trained.

4. The people collecting M&E data cannot be the same people who are in charge of project implementation; M&E should be an independent team to avoid bias.

5. Consistent data collection and analysis processes (as captured in the project PMP) MUST be used from year to year, location to location, and data source to data source. If any changes are made to these processes, they should be justified and documented, and USAID must be notified.

6. Data entry and analysis: steps need to be taken to limit transcription error. It is strongly recommended that the project develop and use a software-based database management system for data storage and analysis instead of Excel sheets (see the sketch after this list).
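A minimal sketch of what a database-backed store could look like, using SQLite purely as an illustrative choice (not an FTF-mandated tool); the table and column names are assumptions. The point is that NOT NULL, CHECK, and UNIQUE constraints reject bad entries at data-entry time, where a spreadsheet would accept them silently.

```python
import sqlite3

# Illustrative schema: one row per indicator, reporting period, and district.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE indicator_data (
        indicator_code   TEXT NOT NULL,   -- illustrative code, not a real FTF code
        reporting_period TEXT NOT NULL,
        district         TEXT NOT NULL,
        value            REAL NOT NULL CHECK (value >= 0),
        entered_by       TEXT NOT NULL,
        UNIQUE (indicator_code, reporting_period, district)
    )
""")

conn.execute("INSERT INTO indicator_data VALUES (?, ?, ?, ?, ?)",
             ("IND-01", "FY2012-Q2", "District A", 350.0, "field_officer_01"))

# A negative value (or a duplicate period/district entry) raises an error
# instead of being stored silently, as it would be in a spreadsheet.
try:
    conn.execute("INSERT INTO indicator_data VALUES (?, ?, ?, ?, ?)",
                 ("IND-01", "FY2012-Q2", "District A", -5.0, "field_officer_02"))
except sqlite3.IntegrityError as err:
    print("Rejected entry:", err)
```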


FTF Partners' M&E System Requirements to Ensure Data Quality

7. Safeguards MUST be in place to reduce the possibility that unauthorized changes could be made to the data (see the sketch after this list). Again, it is strongly recommended that the project develop and use a software-based database management system for data storage and analysis instead of Excel sheets.

8. There needs to be a plan for an independent review of the reported results; it should be part of the project PMP.

9. All M&E data collected and reported MUST have supporting documentation, such as hard copies of the original questionnaires, registers of trainees, etc.
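One possible safeguard, sketched below under assumed file names and a simple ledger format (not a prescribed FTF mechanism): record a SHA-256 fingerprint of each submitted data file at the time of submission, so that any later edit can be detected during review by re-hashing and comparing.

```python
import hashlib

# Minimal sketch: fingerprint submitted data files so unauthorized changes
# can be detected later. File name and ledger format are illustrative.

def fingerprint(raw_bytes):
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(raw_bytes).hexdigest()

submitted = b"indicator,period,value\ntrainees,FY2012-Q2,350\n"
ledger = {"district_a_q2.csv": fingerprint(submitted)}  # stored at submission time

# Later review: the file on disk has been altered (350 -> 530).
current = b"indicator,period,value\ntrainees,FY2012-Q2,530\n"
if fingerprint(current) != ledger["district_a_q2.csv"]:
    print("district_a_q2.csv has changed since submission - investigate.")
```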


Thank You!
