Disaster Relief 2.0 Report


maps get updated frequently, building thematic layers for particular purposes. And yet, digital maps are printed and posted on walls, where they are annotated by hand. Documents have migrated from paper to digital files, yet remain the primary method by which key metrics and supporting data are collected, analyzed, distributed, and briefed to decision makers. Paper itself is not the problem: it is a durable, cheap, lightweight, and high-resolution medium that requires no power to use and allows for annotation by multiple individuals. The problem is the method of creating the content that goes onto paper. Today's predominant method of work relies on a human reading each document and distilling the important bits for others in their organization or network. It is a venerable method, but it is slow and does not scale easily to massive increases in data flows without also increasing the number of humans reading documents.

During the response to the 2010 earthquake in Haiti, the volume and velocity of data began to overwhelm this approach, compounded by a new dynamic: the rise of the cell phone. In the developing world, cell phone use has become almost ubiquitous. Even some of the world's most impoverished communities now have access to voice and data services. After the January 2010 quake, the Haitian community used cellular technology to tell the international community what it needed. Haitians sent in hundreds of thousands of text messages through social media sites.

At the same time, the scale and scope of the tragedy created an unprecedented volume of information flowing between humanitarian personnel. Humanitarian field staff had neither the tools nor the capacity to listen to the new flow of requests arriving directly from Haitian citizens. This gap did not go unnoticed.

“The absorptive capacity of responders is pretty low. It’s not because they do not have an affinity to technology. It’s because they are really, really busy 98% of the time, and they are sleeping the other 2%.” –Robert Kirkpatrick, UN Global Pulse

Working in communities, thousands of volunteers from around the world aggregated, analyzed, and mapped the flow of messages coming from Haiti. Using Internet collaboration tools and modern practices, they wrote software, processed satellite imagery, built maps, and translated reports between the three languages of the operation: Creole, French, and English. They provided their data to each other through interlinked services, so that the outputs from one effort became the inputs to another.

On the timeline of the Internet’s evolution, the 2010 Haiti earthquake response will be remembered as the moment when the level of access to mobile and online communication enabled a kind of collective intelligence to emerge—when thousands of citizens around the world collaborated in volunteer and technical communities (V&TCs) to help make sense of a large-scale calamity and give voice to an affected population.

That said, the humanitarian system had no formal protocols for communicating with these V&TCs. Despite the good will of field staff, their institutions’ policies and procedures were never designed to incorporate data from outside their networks. Some view this as a lost opportunity; others worry about what this change might mean for humanitarians who need to protect data about vulnerable populations. Regardless of one’s viewpoint on the contributions of V&TCs, the response to the 2010 Haiti quake made it clear that the rate of investment in humanitarian information management over a complex global network is failing to keep pace with new technological realities. The humanitarian system could use this revolution in connectivity and the evolution of systems for widely


