Development Research in Practice


the values for each variable fall within the expected range and whether related variables contradict each other. Electronic data collection systems often incorporate many quality control features, such as range restrictions and logical flows; data received from systems that do not include such controls should be checked more carefully.

Consistency checks are project specific, so it is difficult to provide general guidance. Having detailed knowledge of the variables in the data set and carefully examining the analysis plan are the best ways to prepare. Examples of inconsistencies in survey data include cases in which a household reports having cultivated a plot in one module but does not list any cultivated crops in another. Response consistency should also be checked across data sets, even though this task is much harder to automate. For example, if two sets of administrative records are received, one with hospital-level information and one with data on each medical staff member, the number of entries in the second data set should match the number of employed personnel reported in the first.

Finally, no amount of preprogrammed checks can replace actually looking at the data. Of course, that does not mean checking each data point; it does mean plotting and tabulating distributions for the main variables of interest. Doing so will help to identify outliers and other potentially problematic patterns that were not foreseen. A common source of outlier values in survey data is typos, but outliers can also occur in administrative data if, for example, the unit being reported changed over time but the data were stored under the same variable name. Identifying unforeseen patterns in the distribution will also help the team to gather relevant contextual information: for example, whether harvest data are missing because of a particular pest in the community, or whether unusual call records in a particular area were caused by the temporary downtime of a tower.

Analysis of metadata can also be useful in assessing data quality. For example, electronic survey software automatically generates timestamps and trace histories that show when data were submitted, how long enumerators spent on each question, and how many times answers were changed before or after the data were submitted. See box 5.4 for examples of data quality checks implemented in the Demand for Safe Spaces project.
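In practice, checks like these are usually scripted so that they can be rerun every time a new batch of data arrives. The following is a minimal sketch in Python with pandas, assuming hypothetical file names, variable names, and expected ranges (household_survey.csv, age_head, n_personnel_reported, and so on); it illustrates the kinds of checks described above rather than any project's actual code.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical input files: a household survey and two administrative data sets.
households = pd.read_csv("household_survey.csv")
hospitals = pd.read_csv("hospital_records.csv")
staff = pd.read_csv("staff_records.csv")

# 1. Range checks: flag values outside the expected range for key variables
#    (the ranges below are assumed for illustration).
EXPECTED_RANGES = {"age_head": (15, 110), "plot_area_ha": (0, 50)}
for var, (lo, hi) in EXPECTED_RANGES.items():
    out_of_range = households[~households[var].between(lo, hi)]
    if not out_of_range.empty:
        print(f"{var}: {len(out_of_range)} values outside [{lo}, {hi}]")

# 2. Cross-module consistency: households that report cultivating a plot
#    but list no cultivated crops (indicator variables are assumed).
inconsistent = households.query("cultivated_plot == 1 and n_crops_listed == 0")
print(f"{len(inconsistent)} households report a cultivated plot but no crops")

# 3. Cross-data-set consistency: the number of staff entries per hospital
#    should match the personnel count in the hospital-level records.
staff_counts = staff.groupby("hospital_id").size().rename("n_staff_entries")
merged = (hospitals.set_index("hospital_id")
          .join(staff_counts)
          .fillna({"n_staff_entries": 0}))
mismatch = merged[merged["n_staff_entries"] != merged["n_personnel_reported"]]
print(f"{len(mismatch)} hospitals where staff entries do not match reported personnel")

# 4. Looking at the data: tabulate and plot the distribution of a main
#    variable of interest to spot outliers and unforeseen patterns.
print(households["plot_area_ha"].describe())
households["plot_area_ha"].hist(bins=50)
plt.savefig("plot_area_distribution.png")
```

Teams working in Stata or R would implement the same logic with their own tools; the point is that every check prints or saves a record of the problem cases so they can be followed up with the field team or data provider.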

BOX 5.4 ASSURING DATA QUALITY: A CASE STUDY FROM THE DEMAND FOR SAFE SPACES PROJECT

The Demand for Safe Spaces team adopted three categories of data quality assurance checks for the crowdsourced ride data. The first, completeness, made sure that each data point made technical sense in that it contained the right elements and that the data as a whole covered all of the expected spaces. The second, consistency, made sure that real-world details were right: stations were on (Box continues on next page)
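A completeness check of the kind described in box 5.4 could, for instance, flag observations that are missing required fields and verify that every expected combination of line and station appears somewhere in the data. The sketch below is purely illustrative and assumes hypothetical file and column names (ride_tasks.csv, line, station, car_type); it is not taken from the project's actual code.

```python
import pandas as pd
from itertools import product

# Hypothetical crowdsourced ride data.
rides = pd.read_csv("ride_tasks.csv")

# Completeness, part 1: each data point contains the required elements.
REQUIRED = ["user_id", "line", "station", "car_type", "timestamp"]  # assumed fields
missing_fields = rides[rides[REQUIRED].isna().any(axis=1)]
print(f"{len(missing_fields)} observations with missing required fields")

# Completeness, part 2: the data as a whole cover all expected combinations
# of line and station (placeholder lists for illustration).
expected = pd.DataFrame(
    list(product(["Line A", "Line B"], ["Central", "North", "South"])),
    columns=["line", "station"],
)
observed = rides[["line", "station"]].drop_duplicates()
uncovered = expected.merge(observed, how="left", indicator=True)
uncovered = uncovered[uncovered["_merge"] == "left_only"]
print(f"{len(uncovered)} expected line-station combinations with no data")
```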
