
The Magazine of the National Intelligence Community

Analysis Transformer Caryn Wagner

Under Secretary Intelligence and Analysis DHS





GIS Workflows

February 2012 Volume 10, Issue 1


Real-Time Processing


Put GEOINT in the hands of those who rely on it most

Today’s warfighter is the most sophisticated sensor in the world. But he can’t be expected to be a GIS expert. TerraGo® Technologies geospatial collaboration software and GeoPDF® maps and imagery are among the most widely adopted COTS solutions to produce, access, update and share geospatial information with anyone, anywhere. From virtually any mobile handheld device, warfighters can access interactive, compact, portable and secure GeoPDF maps and imagery and easily make georeferenced updates using notes, audio, video, Web services or other information in connected or offline environments. When armed with the most up-to-date GEOINT, warfighters can collaborate peer-to-peer in the field or with Command to produce more relevant, current products that enable better decision making. See TerraGo at Esri Federal GIS Conference booth 501 and at its Special Interest Group (SIG) meeting at 1:30 p.m. Thursday, Feb. 23 in room 306.

Geospatial Intelligence Forum

February 2012 Volume 10 • Issue 1


Cover / Q&A Big Data-in-Motion Solution Real-time analytical processing is an approach that can help master the growing tsunami of GEOINT data. By Alex Philp


Architecture for Intelligence As Pentagon officials prepare their proposal for an architecture to enable data sharing between intelligence organizations, key contractors are expecting the framework to better enable sharing. By William Murray


17 Caryn Wagner Under Secretary for Intelligence and Analysis Department of Homeland Security

Going with the Flow By stringing together tasks and guided, interactive processes, GIS workflow has become the de facto standard framework for defining the work and the flow from initial task to actionable product. By Cheryl Gerber


Departments 2 Editor’s Perspective 4 Program Notes/People

Intel Update A report on the status of some key intelligence-related issues and pieces of legislation currently under consideration in the nation’s capital. By George Meyers

14 Industry Raster 26 Homeland Vector 27 Calendar, Directory


Shedding Light with LiDAR LiDAR’s true value as a military and intelligence tool comes when it is used in conjunction with data from other sources such as electro-optical, infrared and hyperspectral sensors to enhance the situational picture. By Peter Buxbaum


Industry Interview

28 Antoine de Chassy President Astrium GEO-Information Services North America

Geospatial Intelligence Forum Volume 10, Issue 1 February 2012

The Magazine of the National Intelligence Community Editorial

Managing Editor Harrison Donnelly Online Editorial Manager Laura Davis Copy Editor Laural Hobbes Correspondents Peter A. Buxbaum • Cheryl Gerber Karen E. Thuermer • William Murray Art & Design

Art Director Jennifer Owers Senior Graphic Designer Jittima Saiwongnuan Graphic Designers Amanda Kirsch Scott Morris Kailey Waring Advertising

Associate Publisher Scott Parker

KMI Media Group Publisher Kirk Brown Chief Executive Officer Jack Kerrigan Chief Financial Officer Constance Kerrigan Executive Vice President David Leaf Editor-In-Chief Jeff McKaughan Controller Gigi Castro Administrative Assistant Casandra Jones Trade Show Coordinator Holly Foster Operations, Circulation & Production Distribution Coordinator Duane Ebanks Data Specialists Rebecca Hunter Tuesday Johnson Raymer Villanueva Summer Walker Donisha Winston

Subscription Information Geospatial Intelligence Forum ISSN 2150-9468 is published eight times a year by KMI Media Group. All Rights Reserved. Reproduction without permission is strictly forbidden. © Copyright 2012. Geospatial Intelligence Forum is free to qualified members of the U.S. military, employees of the U.S. government and non-U.S. foreign service based in the U.S. All others: $65 per year. Foreign: $149 per year. Corporate Offices KMI Media Group 15800 Crabbs Branch Way, Suite 300 Rockville, MD 20855-2604 USA Telephone: (301) 670-5700 Fax: (301) 670-5701 Web:

EDITOR’S PERSPECTIVE

Harrison Donnelly, Editor

One trend that seems to be especially resonating as 2012 gets under way concerns the importance not only of information sharing for national and homeland security intelligence, but also specifically the critical role of geospatial standards in undergirding and supporting that sharing.

One advocate of that view is Kshemendra Paul, program manager for the Information Sharing Environment, a federal office established by the 2004 intelligence reform law to provide analysts, operators and investigators with integrated and synthesized information related to terrorism, weapons of mass destruction, and homeland security. Paul recently spoke to a U.S. Geospatial Intelligence Foundation event on topics that included the use of geospatial standards as one of the foundations to enhance information sharing, as well as the identification of geospatial methods to increase interoperability among federal, state and local operators.

“The GEOINT community is clearly ahead of many others when it comes to information sharing,” Paul told the blog. “We want to drive standards development, so our government can better share and safeguard information in a repeatable, cost-effective way.”

Information sharing and standards are also likely to be major topics of discussion at the Esri Federal GIS Conference being held in Washington, D.C., in February. It’s not precisely the same topic as above, but one scheduled presentation that caught my eye was to highlight the Lawrence Livermore National Laboratory’s (LLNL) development of a suite of tools providing on-demand access to critical infrastructure geospatial databases for emergency response planning and management. Indeed, LLNL is clearly a significant force in this area, having designed a number of capabilities aimed at translating complex model data into scenario-specific formats for decision-makers and the general public.
One example is a tool called EleCent Earth, which is part of a broader component, Element Centric, that enables users to take full advantage of data related to counter-proliferation. Using Google Earth, EleCent Earth displays results in a geospatial format and enables users to search by area and topic of interest.

KMI Media Group Magazines and Websites Geospatial Intelligence Forum

Military Advanced Education

Military Information Technology

Military Logistics Forum

Military Medical/CBRN Technology

Ground Combat Technology

Military Training Technology

Special Operations Technology

Tactical ISR Technology

U.S. Coast Guard Forum








Providing actionable intelligence.

Maintaining strategic agility.

Protecting the warfighter.

Ready for what’s next.

In today’s electronic warfare environments, increased situational awareness and operational agility are essential to achieving mission success. The convergence of the defense and intelligence communities has created a need for integrated information that can be accessed easily and securely on the home front and the front lines. Booz Allen Hamilton is a trusted partner of the DoD in developing solutions that meet ever-changing warfare conditions. Our cost-effective, portable geospatial and C4ISR cloud computing solutions help military personnel gain a unified picture of their environment so they can maneuver and respond quickly to potential threats. Whether you’re managing today’s issues or looking beyond the horizon, count on us to help you be ready for what’s next.

Ready for what’s next.

Use of the Department of Defense image does not constitute or imply endorsement.


Compiled by KMI Media Group staff

Integrated Analysis Teams to Tackle NGA’s Big Issues

The National Geospatial-Intelligence Agency is taking a bold step toward transforming operations and accelerating the agency’s vision of putting GEOINT power in the hands of users by establishing integrated work groups (IWG), according to NGA’s director of analysis and production.

NGA has already established one IWG, which is focused on an unspecified strategic region of the world, and is looking to create at least two more, Lisa Spuria, director of the Analysis and Production Directorate, said in January at a GEOINTeraction Tuesday event sponsored by the U.S. Geospatial Intelligence Foundation.

The goal of the initiative, which will soon include IWGs focused on the war on terror and on domestic operations, is to bring together people with skills from across the entire agency, focus them on a specific region or topic, and have them try to solve questions on an integrated basis, Spuria explained.

But the project is about more than just bringing skills together, she continued. “The idea is to transform the way we do business. It’s not just about transforming how we do analysis, but it’s actually bringing people with skills—developers, engineers, statisticians—in with the analysts, to try to answer the key intelligence questions as part of a team. That means you bring your different types of experience to the analysis, and it’s amazing to see the enthusiasm when you bring engineers and analysts together at the grassroots level. They are beginning to make changes and develop tools and new ways of doing things quickly. We’re bringing everyone together, and there has been a lot of response.

“The goal of the IWGs is to provide GEOINT consumers with higher quality analysis on key questions,” Spuria said. “Because we’ve brought all of our expertise into one team, they’re going to take a holistic view of things. Everyone working an issue will be together on a team. Now, we have analysts spread over a number of offices working the same related issues, from different angles or organizations. We want to bring them all together, and work together on key issues.”

The IWGs are one aspect of a growing movement toward collaboration and integration that is also reflected in the work environment at the agency’s new headquarters in Springfield, Va., said Spuria. Her remarks included information provided by the originally scheduled speaker, NGA Deputy Director Lloyd Rowland, who was unable to attend.

“The building has demonstrated a lot of the qualities that we wanted to build into it—collaboration, collaboration, collaboration,” Spuria said. “The face-to-face collaboration has really gone up. It’s not just an NGA benefit, but also it’s going to drive changes in how the community does business, because it is an intelligence community building. We’re starting to see a lot of interest in the community in working with us and coming into the space.”

The new environment and focus on collaboration is also posturing the agency well for the 21st-century workforce, she added. “We have brought in hundreds of new employees over the past few years who have a ton of enthusiasm and creativity. They want to serve the nation, but they also want the best environment and the best tools, and new approaches to doing business, and this environment helps build the collaborative environment that they can thrive in. We’re trying to make changes so they can thrive even more.”

Another major benefit has been in the consolidation of functions within the formerly dispersed organization. “We brought together our 24-hour operations into a single operational center. That was a big deal, because we had 24-hour operations in a variety of locations. Putting them all in one area of the building has been phenomenal. The experts from one side of the operation, such as analytics or development, can all get up and talk. It’s been really important during the recent crisis situations that we have provided support to,” she said.

PEOPLE

DigitalGlobe has announced the addition of three new senior leaders to its management team. Marcy Steinke, who joins the company as senior vice president of government relations, is a retired Air Force colonel with 25 years of experience within the DoD. She served as director of congressional legislative affairs for the chairman and the vice chairman of the Joint Chiefs of Staff. In addition, Tim Hascall joins DigitalGlobe as senior vice president of operations, while Grover Wray is the company’s new chief human resources officer.

NJVC, a provider of IT solutions to the DoD, has hired three new executives: Jay Emerson, director, data center services; Van Henderson, director, business development; and Charles Barker, director, capture.

Larry G. Hill has been appointed business unit general manager in support of SAIC’s Mission Support Business Unit. He reports to Stu Shea, president of SAIC’s Intelligence, Surveillance and Reconnaissance Group.

The Open Geospatial Consortium board of directors has elected Jeffrey K. Harris as chairman. Harris, now a private consultant, recently retired from Lockheed Martin, where he served as president of Lockheed Martin Missiles and Space and president of Lockheed Martin Special Programs. He also served as president of Space Imaging Corp., the first company to commercially provide high-resolution satellite imagery and information products.

Big Data-in-Motion Solution

Real-time analytical processing can help master the growing tsunami of GEOINT data.

By Dr. Alex Philp

Our most important mission as members of the GEOINT community is to extract meaningful geographical information from data streaming in from sensors so we can deliver actionable intelligence to warfighters where and when they need it. Increasingly, “when” means “now.” To make this possible, we must process and analyze the data in real time—while it’s still moving. Because the data is streaming in like a tsunami that threatens to inundate us, this “big data-in-motion” issue is becoming one of the most critical challenges we face.

We are drowning in data, and every new imaging satellite, aircraft and UAV only adds to the GEOINT deluge. We celebrate breakthroughs in spatial resolution and hyperspectral content, and we cheer faster communications links with the sensors. But while valuable, these enhanced capabilities also make the data sets more challenging to transmit, manage, archive, process and analyze. Gigabytes of data once seemed large, but now terabytes are common. We are dealing with peta-, exa- and zetta-scale data problems, and their speed of arrival from sensors is now measured in minutes and seconds instead of days and weeks.

As if high-velocity, high-volume data weren’t significant enough problems on their own, the variety of data is increasing as well. The sub-meter imagery and full motion video that we traditionally associate with GEOINT are being fused with ELINT, SIGINT and MASINT data from many types of ground-based mobile and fixed-location sensors. Information-rich raster imagery feeds are now being cross-referenced with acoustic signals, biometric signatures, building control updates and cellular traffic—none of which are the structured data that easily fits into relational databases for traditional querying.

Big data-in-motion, therefore, is a problem that is uniquely complicated because the incoming data is high-volume, high-velocity and high-variety—increasingly referred to as “3V” data.

Defense/intelligence is just one arena facing a tsunami of 3V data and pursuing a big data-in-motion solution. In the private sector, for example, industries such as energy, utilities and telecommunications see very similar challenges as they protect their respective critical infrastructures. It is important to note that sensors protect more than just physical assets. Cyber-infrastructure is also being monitored by sensor networks that add more data to the mix. If the past 10 years have been about identifying, mapping and assessing our nation’s critical infrastructure and associated vulnerabilities, the next decade will be spent dealing with chronic, persistent cyber-attacks on those facilities. This vulnerability is poorly understood and consistent with the 3V dimensions.

Fortunately, progress is being made as a result of so many professions and industries dealing with the same challenge. Real-time analytical processing (RTAP) of big data-in-motion exists today, but there is plenty of room for advancement in the technology. By necessity, however, RTAP development will never be “finished.” It must constantly evolve to keep pace with 3V data, which shows no sign of slowing down.

RTAP Today

RTAP technology applies computationally intensive algorithms that perform traditional GEOINT processes, such as feature detection, pattern recognition and change detection, on data sets. But rather than wait for the sensor data to be transmitted from their remote location and stored in a static database, the algorithms conduct hundreds of thousands of calculations in fractions of a second as the data streams in from the sensors.

By removing the database from the equation, RTAP technology has focused on finding new ways for analytical processing to be carried out at vastly accelerated rates in the compute memory of the chip. There are currently several approaches to this type of solid-state processing, but most use a system-of-systems method that involves a hybrid collection of hardware, firmware and software. This hybrid approach to computer architecture typically relies heavily on parallel processing to perform the extensive calculations in the CPU rather than in the database. By not waiting for this data to stream into and come to a stop in the database, RTAP makes it possible for analyses to be faster than ever before, enabling information to reach decision-makers in minutes or seconds. Just as importantly, it eliminates the vast amounts of power and bandwidth that would otherwise be consumed in the transmission and storage of raw data.

A good example of how RTAP is used now can be found in the surveillance arena. Acoustic sensors have been buried along the perimeters of sensitive facilities to detect the approach of potential threats by continuously collecting sounds from the environment. A processing engine located nearby instantly analyzes the acoustic signals as they stream in from hundreds of sensors in the network. Embedded algorithms categorize the noises as mechanical, biological or anomalous to determine if they warrant further observation.
If a noise commonly associated with a possible threat, such as a vehicle motor, is detected in a location where it shouldn’t be, the processing system performs several functions simultaneously. It pinpoints the location coordinates of the noise on the sensor network and sends an alert in the form of an email or alarm to designated personnel who can formulate an appropriate response. Simultaneously, the processing engine uses the primary sensor information to trigger activation of a secondary sensor, such as a surveillance camera, to train on the sound location and provide real-time video to the security command center. This provides verification and validation. As a result, we have detection, classification, localization, tracking, correlation, verification, validation and communication all occurring in network real time.

The key to the instantaneous aspect of this application is that RTAP ignores the unimportant torrent of background noise and separates out the critical pieces of data. The processing engine focuses upon the anomalous sounds, identifies them to some level, and delivers actionable information in the form of an email text alert or video feed directly to the human decision-maker in a matter of seconds. No other resources—human or automated—are wasted sifting through the terabytes of mundane acoustic signals from the sensor network.
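The staged logic of this surveillance example (classify each incoming signal, discard background noise, cue a secondary sensor and alert on threats) can be sketched in a few lines. This is a toy illustration only, not TerraEchos' actual engine; the frequency thresholds, classification rules and sensor names are invented for the example.

```python
# Toy sketch of an RTAP-style acoustic pipeline: each sample is handled as it
# arrives, and only threats produce output; background noise is never stored.
# The thresholds, classes and sensor IDs below are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Sample:
    sensor_id: str
    lat: float
    lon: float
    peak_hz: float  # dominant frequency of the detected sound

def classify(sample: Sample) -> str:
    # Placeholder rules standing in for the embedded classification algorithms.
    if 50 <= sample.peak_hz <= 300:
        return "mechanical"   # e.g., a vehicle motor
    if sample.peak_hz < 50:
        return "biological"   # e.g., footsteps or wildlife
    return "anomalous"

def process(sample: Sample, in_restricted_zone) -> Optional[dict]:
    """Return an alert only for a mechanical sound where it shouldn't be."""
    if classify(sample) == "mechanical" and in_restricted_zone(sample.lat, sample.lon):
        return {
            "alert": "possible vehicle in restricted zone",
            "location": (sample.lat, sample.lon),
            "cue_camera": True,  # trigger the secondary sensor on this spot
        }
    return None  # mundane signal: no storage or bandwidth is spent on it

# Usage: two streamed samples; only the in-zone motor noise raises an alert.
zone = lambda lat, lon: 46.0 < lat < 47.0
stream = [Sample("ac-17", 46.5, -114.0, 120.0),
          Sample("ac-18", 48.2, -114.1, 900.0)]
alerts = [a for s in stream if (a := process(s, zone)) is not None]
```

The design point mirrors the article: the mundane torrent is dropped at the edge, and only a small, already-interpreted message travels onward to a human.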

Analysis on the Platform

RTAP research is focused on improving several aspects of the technology. Specifically, the goal is to accelerate and expand the ability to perform algorithmic calculations within the compute memory.

One of the techniques being developed to accomplish this involves moving the processing and analysis physically closer to, or embedded within, the sensors themselves. An example of how the geospatial industry is heading in this direction comes in the latest generation of digital imaging sensors that fly aboard observation satellites, aircraft and UAVs. Twenty years ago, raw data was transmitted from the satellite or delivered on a hard drive from the aerial platform to a ground facility for processing and analysis. Today, much of the pre-processing occurs on the satellite or aircraft, so that imagery is delivered to the ground station for enhancement, interpretation, fusion, change detection and a dozen other analyses. With RTAP, we want to perform all of the processing and analysis on the platform, and possibly within the electro-optical fabric of the sensor.

What would this mean? Imagine a classic GEOINT scenario involving an imaging sensor aboard a space or airborne platform. As the sensor is collecting image data, multiple algorithms are instantaneously sorting through the data searching for a pattern, feature or change in ground conditions that matches a predefined mission objective. And as with the real-life acoustic example above, the future RTAP may involve multi-sensor communication in which one type of sensor detects a feature of interest and a second sensor identifies it.

Once the target has been detected and possibly identified, the RTAP attaches three-dimensional coordinates to it and sends a communication in the form of an image chip, email or other alert directly to the warfighter positioned to act upon it. This happens within seconds of the initial target observation by the primary sensor. The communication carries only the information needed by the warfighter to make an informed decision.
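The detect, geolocate and alert chain described here can be illustrated with a small sketch. Nothing below reflects any real sensor interface; the frame, the brightness-threshold "signature" and the georeferencing function are all stand-ins. The point is that the outgoing message carries only a small chip and coordinates, never the full image.

```python
# Illustrative sketch of on-platform detection: scan pixels as they are read
# out; on a signature match, attach coordinates and emit a compact alert that
# carries only a small image chip. Threshold and georeferencing are made up.

def detect(image, threshold, to_geo):
    """Return a compact alert for the first pixel matching the signature."""
    for r, row in enumerate(image):
        for c, value in enumerate(row):
            if value >= threshold:                  # "signature" match
                chip = [rw[max(c - 1, 0):c + 2]     # 3x3 neighborhood only
                        for rw in image[max(r - 1, 0):r + 2]]
                lat, lon = to_geo(r, c)             # attach coordinates
                return {"lat": lat, "lon": lon, "chip": chip}
    return None  # nothing of interest: transmit nothing

# Usage: a 5x5 frame with one bright return; only the chip and location go out.
frame = [[0] * 5 for _ in range(5)]
frame[2][3] = 255
alert = detect(frame, threshold=200,
               to_geo=lambda r, c: (40.0 + r * 0.01, -105.0 + c * 0.01))
```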
The concept of embedding processing and analysis engines into GEOINT, MASINT and SIGINT sensors will be possible only if major advancements continue to be made in the hybrid computer technologies referenced earlier. New architectures in hardware, firmware and software are part of the equation for RTAP success, and high performance computing and cloud databases will play important roles.

But the most important need right now is a fundamental shift in the way algorithms are developed. As computer technology evolves to become faster and more scalable to push the boundaries of computationally intensive algorithms, processing will continue to move away from serial to parallel architecture. Parallel processing appears to be the only solution to scale beyond the limits of existing systems. This means that algorithms must be written for parallel execution—a sea-level change for most code developers in GEOINT and other big data industries.

The challenges described here are by no means insurmountable. Based on the existing rate of progress and new technologies coming online, RTAP has reached an inflection point that may soon put it in front of the big data-in-motion tsunami, revolutionizing the delivery of actionable intelligence to warfighters in support of the GEOINT mission. O

Dr. Alex Philp is the founder and chief executive officer of TerraEchos, which develops solutions for big data-in-motion challenges for monitoring and security applications. For more information, contact GIF Editor Harrison Donnelly at or search our online archives for related stories at
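The serial-to-parallel rewrite the author calls for amounts to restructuring an algorithm as a map over independent units of work. Here is a minimal sketch using a per-tile computation as a stand-in for a real GEOINT kernel; the thread pool simply keeps the example self-contained, since the restructuring, not the choice of executor, is the point.

```python
# Minimal sketch of restructuring a computation for parallel execution:
# express the algorithm as a map over independent tiles so workers can run
# them concurrently. The per-tile "algorithm" (summing pixel values) is a
# stand-in for real processing such as feature or change detection.
from concurrent.futures import ThreadPoolExecutor

def process_tile(tile):
    # Each tile touches no shared state, so tiles can run in any order.
    return sum(sum(row) for row in tile)

def run_serial(tiles):
    return [process_tile(t) for t in tiles]

def run_parallel(tiles, workers=4):
    # A real CPU-bound kernel would use a process pool or a GPU here; the
    # structure (independent tiles mapped onto workers) is what matters.
    with ThreadPoolExecutor(workers) as pool:
        return list(pool.map(process_tile, tiles))

# Usage: 16 identical 8x8 tiles; both paths must produce identical results.
tiles = [[[i + j for j in range(8)] for i in range(8)] for _ in range(16)]
assert run_serial(tiles) == run_parallel(tiles)
```

Once the kernel is expressed this way, scaling is a matter of swapping the executor, which is exactly why the rewrite has to happen in the algorithm first.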

Architecture for Intelligence

Pentagon officials develop a framework to better enable sharing and exploitation of geospatial and other data.

By William Murray, GIF Correspondent

As Pentagon officials prepare their proposal for an architecture to enable data sharing between intelligence organizations, key contractors are expecting the framework to better enable sharing and exploitation of geospatial and other data—even though policy and cultural issues in the intelligence community stand as obstacles.

The Defense Intelligence Information Enterprise (DI2E) is the next iteration of the Distributed Common Ground/Surface System Integration Backbone. Contractors hope that it will effectively lay the framework for how vendors could create applications that would work using standard operating systems, secure infrastructure and web services environments through a mandate for non-proprietary systems.

The proposals could help with what Bob Noonan, senior vice president at Booz Allen Hamilton, calls the “exploitation dissemination problem.” Intelligence analysts spend about 80 percent of their time gathering data for reports to their supervisors, he estimates, and only 20 percent analyzing the data.

While recognizing the limits of technology in the face of cultural hurdles and policy differences, Noonan said he remains guardedly optimistic. “With DI2E, I hope they would be able to spend 20 percent gathering information and 80 percent analyzing it.”

Kevin P. Meiners, deputy undersecretary of Defense for Intelligence, heads up the DI2E effort at the Pentagon. He said recently that the DI2E request for proposals should be released by April.

Too Much Sharing?

But while the drive to improve intelligence information sharing is a major factor in development of DI2E proposals, some intelligence analysts and industry observers also point to two high-profile incidents in the past two years that have underscored the contrasting need to closely guard intelligence information.

“There was too much information shared about the Osama bin Laden take-down,” said Mark Bigham, vice president of business development for Raytheon Intelligence & Information Systems and a former Air Force intelligence analyst, pointing to reports about the methods used by the U.S. military to track down and kill bin Laden in May 2011. Wide publicity about the methods used could make it more difficult to use the same methods in the future, he warned.

“There are reasons that we protect that data,” Bigham said, noting that, in a given instance, intelligence analysts need to ask themselves, “How much do you want to share?”

There is also the case of U.S. Army Private First Class Bradley E. Manning, arrested in Iraq in May 2010 and charged with transmitting more than 250,000 secret documents, which for many has offered a second cautionary tale about the sharing of “too much information” in the U.S. intelligence community.

The arrest of Manning and allegations about his sharing of classified information has “set back sharing between intelligence communities more than anything in the last 10 years,” Bigham said. Particularly with operational information, “the fewer people who know about it, the better. Sharing exactly what operation you’re planning to do as a result of connecting the dots” is not a good idea, he said.

Even taking into consideration Manning’s arrest and its reverberations in the intelligence community, however, Bigham said that information sharing has improved greatly since 2001. He called information sharing a “two-sided coin.”

Manning doesn’t actually represent anything new from an insider threat perspective, said Noonan, pointing to CIA traitor Aldrich Ames, a KGB double agent, and John Walker, a former Navy chief warrant officer who also spied for the Soviet Union, as examples of those who conveyed secret information to adversaries via more traditional methods.

According to Noonan, the fear of insider threats causes some intelligence analysts to take this standpoint toward geospatial data: “I’ll let you know what you need and when you need it.”

One good scenario when intelligence analysts should share data is when they feel that others might have other missing elements that could help them solve a more complex problem, according to Bigham.

In July 2001, two months before the 9/11 attacks, FBI agents in Arizona warned FBI headquarters officials to be vigilant for Middle Eastern students training in U.S. flight schools, urging their headquarters counterparts to discuss these issues with other U.S. intelligence officials.

In that particular case, Bigham pointed out, al-Qaida had benefited from surprise and the fact that there was no precedent for an attack using commercial airliners. “In the case of such cataclysmic events, if no one has done something like that before, we wouldn’t have known the pattern to look out for” for improved situational awareness and context, he said.

Open Formats

One key ingredient to the success of DI2E, according to Rob Mott, vice president of the Military Intelligence Solutions Group at Intergraph, is the requirement to develop data using open file formats so that end-users and agencies won’t have to purchase software developed by a particular vendor. “It should not be proprietary,” he said.

Such a requirement “levels the playing field,” according to Mott, because it enables smaller companies and academic research organizations to compete with larger companies. This competition benefits the intelligence community through greater technological innovation, lower prices and quicker speed of service, he said. Mott is excited about the potential of DI2E since it will enable intelligence analysts to subscribe to web services delivering UAV data about a particular area of terrain, with hourly, daily, weekly or other regular updates.

Regarding the sharing of information, “We’re not where we should be,” Noonan said. “It’s beyond a technology issue. It was a problem when I arrived in the military, when I was in command, and when I retired. It’s still a problem today.”
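Mott's point about open formats is easy to demonstrate: an open standard such as GeoJSON is plain JSON, so any language's stock JSON parser can produce and consume it without vendor software. The feature below is a made-up example, not data from any of the systems discussed.

```python
# Why open formats level the playing field: GeoJSON is plain JSON, so a stock
# parser in any language can read and write it; no vendor tool is required.
# This feature is a made-up example point (longitude first, per the format).
import json

feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-77.0365, 38.8977]},
    "properties": {"name": "example point of interest"},
}

text = json.dumps(feature)   # share as plain text with any consumer
parsed = json.loads(text)    # read it back with no special software
lon, lat = parsed["geometry"]["coordinates"]
```

Because nothing here depends on a particular product, a small company, an academic lab and a large integrator can all exchange the same file, which is the competitive effect Mott describes.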

presents a training conference:

Military Antennas Bridging gaps between government, industry and academia


Technology Focus Day: March 19, 2012
Main Summit Days: March 20-21, 2012 | San Diego, CA

Invest your time networking with leaders of the military antennas community and:
• Understand the program and requirement updates that are crucial for your forward planning
• Discover future visions for military antennas and what is needed from the community
• Advance your technical knowledge with updates on the latest antennas research: MIMO, electrically small antennas, reconfigurable antennas, active/nonlinear structures and more
• Explore potential partnerships with a variety of program managers, technical directors and engineers

1-800-882-8684

What has changed in recent years is the booming volume of geospatial data collected by drones and UAVs, which has added to the challenge of properly analyzing and sharing it. “Everyone knows it’s an issue,” Noonan said of cultural obstacles, which can only be effectively overcome through policy overhauls to encourage data sharing in the intelligence community. Despite the problems, Noonan praised the work of the National Security Agency and National Geospatial-Intelligence Agency. He also pointed to Army Lieutenant General Michael T. Flynn, assistant director of national intelligence after serving as a military intelligence leader in both Iraq and Afghanistan, as a thought leader on exploiting and disseminating intelligence data. Noonan made clear that he doesn’t want to see individual intelligence communities set up their own information clouds. What would be accomplished by moving from server farms to multiple clouds, he asked, since individual clouds will act as a hurdle for sharing information. “I would submit that multiple clouds don’t make things better,” because they mean that intelligence analysts will continue to spend about 80 percent of their time gathering information and only about 20 percent analyzing it, he said. From DI2E, Noonan would like to see standard operating systems, web security and web services prescribed, along with the ability to plug individual applications into the DI2E, much as one could plug a computer or DVD player into a home entertainment system. A retired lieutenant general who commanded the Army Intelligence and Security Command before joining Booz Allen, Noonan is impressed with the data that intelligence analysts can access through the NIPRNet and SIPRNet. He recalls serving in Afghanistan and using geospatial technology to overlay four maps—one U.S.-produced, two Russian-made and one Britishproduced—of an Afghan airfield that American troops were about to overtake, since U.S. forces couldn’t exclusively rely on NGA maps. 
The maps helped U.S. troops to detect where landmines could be laid, to better ensure the safety of the landing party.

“There needs to be leadership from the top down” that dictates changes in policy and culture within the intelligence community, according to Mott. “This isn’t just a good idea,” he said of sharing data using open file formats that are readable and adjustable. “It’s a requirement.”

The “Apple store” model for creating applications and allowing users to subscribe to them and download them within a cloud is a useful one to consider for DI2E, in part because it accommodates both mobile and desktop users and has been a successful commercial model, Mott said. The limit to the Apple store model, however, is that it requires users to operate a device provided by a single company. It is a proprietary model that would have limited applications in an environment such as the Department of Defense. “There are a lot of lessons learned from the commercial cloud community about what could work in a secure DoD intelligence community,” Bigham said.

Web Services

Intelligence analysts have the need to overlay different geospatial data sources to build a comprehensive view of a particular area of interest, according to Mott. In some cases, intelligence analysts might then be building their own applications to enable them to exploit multiple geospatial data sources, such as UAVs, commercial satellites, government satellites, infrared and signals intelligence. Open web services within DI2E will enable the intelligence community to require proper authentication of users and create a secure environment for the exploitation and dissemination of such geospatial data, Mott said.

Bigham foresees an 18- to 24-month contract for the DI2E framework architecture. Contractors are closely monitoring whether the DI2E request for proposals will bar winning contractors from later creating applications and integrating them into DI2E.

Noonan agreed, speculating that by the end of an eight- to 10-year period, using clouds to share information securely could give way to some entirely different technology. “The framework has to have some agility” to accommodate changes in technology and practice, he said. “Moore’s Law is now considered an anachronism,” he said, arguing that technology change and innovation are now outstripping that well-known prediction of the rate of technology growth.

To be successful, the DI2E framework architecture will have to address “edge users” and satellite communication terminal users who can’t “easily touch the network,” Bigham said. “DI2E absolutely has to be secure at multiple levels,” and it should have the ability to certify where classified data is within clouds. It will be easy to search DI2E for information, he predicted. Bigham compared DI2E to a city planner, who provides guidelines for home builders and other developers to design their plans to fit within the municipality’s infrastructure and regulations.

Another potential major player in this field is IBM Federal, which last year debuted its Defense Operations Platform (DOP), a reusable and interoperable software platform that company officials hope will meet the emerging DI2E standards.
“I don’t think you’ll see one platform selected,” since that wouldn’t likely be in DoD’s best interests, said Andras Szakal, vice president and chief technology officer with IBM Federal. “I think you’ll see multiple companies develop their own stacks, including open source options.” Szakal estimated that IBM has a two-year jump on its competition, however, arguing that IBM is the first to market with its DOP offering.

DOP reportedly has interested Defense Information Systems Agency, Marine Corps and intelligence agency officials because it provides a service-operating environment that gives DoD agencies needing to deploy rapidly a reliable, secure operating platform with virtualization capabilities. Szakal estimated that it could take some organizations a year to deploy the routing, messaging, protocols and up to 50 other applications that make up DOP. O


For more information, contact GIF Editor Harrison Donnelly at or search our online archives for related stories at


Going With the Flow

GIS workflow defines the path of geospatial data production from initial task to actionable product.

By Cheryl Gerber, GIF Correspondent

By stringing together tasks and guided, interactive processes, GIS workflow has become the de facto standard framework for defining the work and the flow—how the work is routed—from initial task to completed process and actionable product.

The streamlined simplicity of workflow has harnessed complex geospatial data and systems in recent years, rendering GIS production more effective. One of the biggest benefits is integration. Military and intelligence organizations are often working in separate acquisition, delivery and contracting offices, but workflow can serve to unite them.

“Geospatial data production leverages workflow, as it lends itself well to repeatable workflow processes with production-line capacity for the most efficient execution,” said Greg Pleiss, Esri solutions manager, Professional Services division.

GIS workflow automates common geo-processing activities, ensures standardization and consistency across operations, and manages geographically dispersed workforces. The software is growing more capable and flexible, with user-configurable tools to accommodate everyone, from those who define the workflow to those who execute tasks in standard ways. Step-by-step workflows now walk users through previously complex image analysis tasks.

Esri’s ArcGIS Workflow Manager serves to structure the work and flow in standard, repeatable GIS tasks as business processes across an enterprise.

As users execute each step in a workflow, the Workflow Manager launches disparate applications, configured as part of the workflow, behind a single interface. “Software adds execution behind the process using dialog boxes, like: ‘Is this imagery ready to use?’ If the user’s response is ‘No,’ and additional processing needs to take place, then the system takes the user back to re-task image collection. If it’s ‘Yes,’ then feature extraction would be the next step in that workflow,” Pleiss said. ArcGIS workflow users can configure a question step or a series of choices. Each selection leads the user down a different path toward the software to launch for the next task.
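The branching question step Pleiss describes amounts to a small state machine. The sketch below is purely illustrative—the step names and dictionary structure are invented for this example and do not reflect the actual ArcGIS Workflow Manager API:

```python
# Hypothetical model of a branching "question step": each step either asks a
# question whose answer selects the next step, or simply points at its
# successor. Step names are illustrative placeholders.

WORKFLOW = {
    "check_imagery": {
        "question": "Is this imagery ready to use?",
        "Yes": "feature_extraction",   # imagery is usable: move on
        "No": "retask_collection",     # imagery unusable: re-task collection
    },
    "retask_collection": {"next": "check_imagery"},
    "feature_extraction": {"next": None},  # terminal step in this sketch
}

def run_step(step_name, answer=None):
    """Return the name of the next step, branching on the user's answer."""
    step = WORKFLOW[step_name]
    if "question" in step:
        return step[answer]
    return step["next"]

run_step("check_imagery", "No")    # retask_collection
run_step("check_imagery", "Yes")   # feature_extraction
```

A real workflow engine would attach an application launch to each step; the point here is only how a single answer routes the user down a different path.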

Guided Software

Many geospatial imaging companies today offer guided software to increase GIS workflow productivity. In its ENVI image processing and analysis software, for example, Exelis Visual Information Solutions (formerly ITT VIS) provides discrete, task-oriented workflows that usher users through step-by-step processes to create specific end products. “We create interactive software processes to increase productivity through increased integration and automation,” said Peter McIntosh, solutions engineer, Exelis VIS Geographic Intelligence Systems Group.

One example is the process of developing a map workflow to create a helicopter landing zone (HLZ) for troops in theater. The HLZ map is then populated or published in the GIS for actionable intelligence.

GIS workflow expanded into cloud computing in 2010, when Esri’s ArcGIS Server 10 became available on the Amazon Elastic Compute Cloud (EC2). In 2011, PCI Geomatics released its GeoImaging Accelerator (GXL) large volume image production system on Amazon’s EC2 to support high resolution content production for the Esri ArcGIS Online platform. GXL uses high performance computing to optimize the speed of distributed automated workflows for industrial-strength production. As part of its image preprocessing technology stack, PCI Geomatics also offers GeoImaging Tools for ArcGIS, which support more than 20 different satellite sensors, and Geomatica, a stand-alone desktop image classification package with the ability to build in automated workflow.

When mobile enhancements were added to GIS workflow on the cloud in 2011, the ArcGIS Server ArcPad Extension began to offer the latest release of ArcPad (10.0.2) to abet the process of field mapping, data collection and updates from the field to the cloud.

Most GIS workflow takes place at the tool and task or process level rather than the systems level, although GIS and non-GIS workflows unite at the systems level for decision support. Software tools supporting tasks often start with capturing imagery in such processes as creating an HLZ. “A surveillance analyst seeking a process to identify an HLZ to publish to a broader audience first captures the imagery from satellite or sensors, and then starts the orthorectification workflow for accurate geospatial parameters. Next, the feature extraction workflow defines and classifies helicopters into vector-based, feature-classed information,” said McIntosh. “To determine the quantity and quality of movement, the analyst enters and runs the change detection workflow to see what features have changed and how they have changed,” he said. “At the end of each workflow we provide the option to publish the result of the work product into the GIS.”
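The capture-to-publish chain McIntosh outlines—orthorectify, extract features, detect change, publish—can be pictured as a simple sequence of steps. The function names below are hypothetical placeholders standing in for the real ENVI and ArcGIS operations:

```python
# Illustrative sketch of an HLZ production chain. Each stub "step" just tags
# the product so the flow is visible; real steps would run image processing.

def orthorectify(product):      return product + ["orthorectified"]
def extract_features(product):  return product + ["features"]
def detect_change(product):     return product + ["changes"]
def publish_to_gis(product):    return product + ["published"]

def run_hlz_pipeline(raw_image):
    """Run each workflow stage in order, passing the product along."""
    steps = [orthorectify, extract_features, detect_change, publish_to_gis]
    product = raw_image
    for step in steps:
        product = step(product)
    return product

run_hlz_pipeline(["raw"])
# ["raw", "orthorectified", "features", "changes", "published"]
```

The design point mirrors the article: each stage consumes the previous stage's output, and the option to publish sits at the end of every workflow.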

Step-by-Step Process

For example, the ENVI orthorectification workflow, integrated with Esri’s ArcGIS, provides a step-by-step process to remove geometric distortions introduced during image capture. The workflow produces a map with planimetric geometry and orthorectified imagery that is registered to a ground coordinate system with consistent scale throughout the image. Planimetric, or flat plane, geometry approximates the round surface of the earth by projecting it onto a flat plane. To increase the accuracy and simplicity of the orthorectification workflow, Exelis added a method called rational polynomial coefficients, which allows a wider variety of sensors to be processed.

The software is designed to be easy to use. Users select the input image they want to orthorectify from various types of commercial or multi-spectral sensors, Internet Explorer or ArcGIS, then drag and drop the selected images into ENVI. They can confine image processing to specific areas of interest to reduce processing time. Next, they select the output parameters such as pixel size or file name and path. A preview feature allows users to check orthorectification results without having to process the entire dataset.

ENVI uses both feature extraction and classification workflow. However, feature extraction is object-based image analysis rather than pixel-based classification workflow. “The object-based approach is supplanting the traditional pixel-based approach since it puts out GIS-ready feature classes as vectors, not rasters or pixels, with a rich set of attributes,” said McIntosh. “Vector has properties associated with it so you can colorize.”

While raster images are based on pixels and grids of pixels creating bitmaps, vector images use mathematical relationships between points and the paths connecting them to describe an image. Therefore, vector graphics are composed of paths.
Bitmaps require higher resolution and a spatial anti-aliasing technique to create a smooth appearance, while vector-based graphics appear smooth at any size or resolution since they are mathematically described. Spatial anti-aliasing minimizes distortion artifacts when representing a high-resolution image at a lower resolution.

Change detection in geospatial imagery is another example of a previously tedious, though vital, analysis task that is greatly simplified by workflow. First, GIS workflow users select the images they want to analyze, such as two images of the same scene taken from different satellite sensors. Next, they might select a before-and-after comparison of two images of the same location, based on whether the data is raw, already processed or classified. They can choose a thematic change method to detect changes in specific features over time, such as buildings, roads or natural land cover, to highlight changes that have occurred in categories. Change thresholding allows users to set parameters to identify the magnitude and type of changes that have occurred. Data cleanup refines results to make features appear more visually realistic. Finally, users can preview, export and choose how to use the results, whether in a PowerPoint presentation, a geodatabase or directly in an ArcGIS file to create a map.
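The thresholding idea at the heart of change detection can be shown in a few lines. This toy sketch assumes two co-registered image arrays and omits the radiometric normalization, thematic classification and cleanup a real workflow performs:

```python
import numpy as np

def detect_change(before, after, threshold):
    """Return a boolean mask flagging pixels whose absolute change in value
    between two same-shaped images exceeds the user-set threshold."""
    diff = np.abs(after.astype(float) - before.astype(float))
    return diff > threshold

# Two tiny "images" of the same scene; one pixel changes significantly.
before = np.array([[10, 10],
                   [10, 10]])
after  = np.array([[10, 80],
                   [12, 10]])

mask = detect_change(before, after, threshold=20)
# Only the pixel that changed by more than 20 is flagged:
# [[False,  True],
#  [False, False]]
```

Change thresholding in the article is exactly this knob: raising the threshold suppresses small radiometric differences, while lowering it surfaces subtler change.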

Web Capability

GIS workflow is expanding continually beyond pixels and desktops into the realm of vectors and servers. One of the most notable improvements to Esri’s ArcGIS Workflow Manager is the development of web capability. “We’ve increased focus on the development of the server side, by building out web capability through the use of REST interfaces,” said Pleiss.

Representational state transfer (REST) is a style of software architecture for distributed hypermedia systems such as the web. It is considered an alternative to the Simple Object Access Protocol (SOAP), an earlier standard originally developed at Microsoft for exchanging XML-based messages on the web. SOAP was designed to integrate with other standards but is exclusively XML-based, whereas REST information exchanges can use any of more than 350 Multipurpose Internet Mail Extensions (MIME) types. SOAP exchanges messages between SOAP nodes, whereas REST both retrieves information from resources and updates resources with information. Esri has made use of the flexibility inherent in the REST architecture.

“SOAP exchanges verbose XML responses back and forth across the web. It’s work intensive, whereas REST architectural style reduces every resource on the web to a URL. It’s a lighter weight, simpler way to do web services communications,” said Pleiss. ArcGIS Workflow Manager uses Open Geospatial Consortium (OGC) standards for GIS data integration and the Business Process Execution Language (BPEL) for standards-based workflow. BPEL is a standard executable language set forth by the Organization for the Advancement of Structured Information Standards (OASIS). BPEL is for specifying actions in business processes with web services. BPEL processes export and import information by using web service interfaces exclusively. OASIS is a global consortium that drives the development, convergence and adoption of e-business and web service standards. Another ArcGIS Workflow Manager development is the addition of spatial notifications, most notably for change detection. Notifications are more organizationally significant than they might seem initially. They serve to share vital information in a timely fashion across departments and can streamline project management. “We provide a plug-and-play framework that allows the configuration of notifications when certain changes happen to geospatial data,” Pleiss said. “You can plug in your chosen notification type, such as email.” “A lighthouse is a nautical, aeronautical and topographic image, so all three would subscribe to notifications of any change in the lighthouse,” Pleiss said. Spatial auto-notifications and alerts are rule based. If there were a change in the lighthouse, for example, then subscribers who had chosen to receive notifications that way would receive an email indicating the nature of the change. The flexibility of the notification function reflects Esri’s continued build-out of service-oriented architecture capability in GIS workflow. 
“It allows the plug-in of your own notification engine, which can trigger a new business process in another part of an organization,” Pleiss continued. “This lays the groundwork for cross-business unit collaboration in a proactive and automated fashion. And it enables the seamless integration of GIS and non-GIS users and business processes.”

Esri has been building GIS workflow capability to integrate business processes and bring non-GIS users into the GIS environment. “In the last two years we have undertaken a program to implement workflows that brings in the contracting officers who are the liaison between the agency and the contracted company for a seamless transfer of geospatial data,” said Pleiss.

Esri is in the process of rolling out GIS workflow at the executive and management level for actionable intelligence in the decision-making process. “It provides a window of visibility for up-to-the-minute, live workflow status without having to be a heavy GIS user. It allows executives to prioritize resources, tasks and activities,” he said.
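The REST style Pleiss describes reduces every resource on the web to a URL, so a workflow status check becomes a simple GET. The endpoint and parameters below are invented for illustration and are not Esri's actual API:

```python
from urllib.parse import urlencode

# Hypothetical base URL for a workflow service exposed over REST.
BASE = "https://example.mil/arcgis/rest/workflows"

def status_url(job_id, fmt="json"):
    """Build the GET URL addressing one workflow job's status resource."""
    return f"{BASE}/{job_id}/status?{urlencode({'f': fmt})}"

status_url(42)
# "https://example.mil/arcgis/rest/workflows/42/status?f=json"

# The SOAP equivalent would wrap the same request in a verbose XML envelope
# POSTed to a single service endpoint—the "work intensive" style Pleiss
# contrasts with REST's lighter-weight URL-per-resource approach.
```

An executive dashboard of the kind described above could poll such URLs for live workflow status without any heavyweight GIS client.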

New Technologies

GIS workflow increasingly has incorporated new technologies into the process. For instance, PCI Geomatics recently completed beta development of its synthetic aperture radar (SAR) GeoImaging tools for analysis of SAR images in ArcGIS. Users can create application-specific workflows since the SAR tools are integrated in ArcGIS. “The current most common application workflow for SAR imagery is maritime domain awareness, which detects ships on the ocean for the Navy and Coast Guard,” said Kevin Jones, director of marketing and product management, PCI Geomatics.

SAR is a form of radar that uses relative motion between an antenna and its target to provide distinctive, long-term coherent signal variations, which are used to obtain finer spatial resolution than is possible with conventional beam scanning. SAR is implemented by mounting a single beam-forming antenna on a moving platform such as an aircraft or a spacecraft. The target image is repeatedly illuminated with pulses of radio waves at different antenna positions. The echo waveforms are then rendered coherent, stored and processed to form an image of the target region.

“SAR satellite sensors cut through all weather, day or night. You can collect images regardless of clouds or darkness based on back scatter or sound bouncing,” said Jones. “To realize the potential of SAR imagery, we want to translate it from the technical realm to the application level so end-users can use this technology to solve business problems. PCI has integrated 40 SAR-specific algorithms into its GeoImaging tools.”

The new SAR tools for ArcGIS include coherent change detection, classification of multi-polarized imagery and advanced utilities to filter and analyze imagery. The tools support commercially available data from numerous satellites.

Despite the simplicity of workflow, the huge volume of geospatial data behind it can still clog installed systems. “GIS workflow on the cloud has [input/output] issues at multiple levels. Loading these large volumes of geospatial data uses huge bandwidth on the cloud, so we had to optimize our code to reduce the number of read/write operations in cloud processing,” noted Jones. “Sometimes the cheapest and fastest way to get large volumes of images loaded onto the cloud is still to FedEx it to the data center,” he added.

I/O issues are a key hurdle in cloud-based geospatial data processing, from the initial transfer of earth observation imagery data to the actual processing of content. “But at the same time, there is improved scalability on the cloud and it saves on hardware purchases and maintenance,” Jones said.

Workflows will continue to evolve to solve the problems users are facing. “We keep defining more workflows to address specific problems. Down the road, we’ll have workflows that say: ‘Go find IEDs in a specific location.’ We are moving toward making it one-click, easy flow analysis,” said McIntosh. “The workflows will become increasingly simplified as the algorithms underneath them become increasingly sophisticated,” he said. O

For more information, contact GIF Editor Harrison Donnelly at or search our online archives for related stories at
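The maritime domain awareness workflow Jones cites rests on a simple property of SAR: calm water returns little backscatter, so metal ships stand out as bright outliers. The toy fixed-threshold detector below only conveys that principle; operational detectors adapt the threshold to local sea clutter (e.g. constant false alarm rate, or CFAR, methods), and the scene here is fabricated:

```python
import numpy as np

def ship_candidates(amplitude, k=5.0):
    """Flag pixels whose SAR amplitude is far above the mean ocean return.

    A fixed multiple `k` of the scene mean stands in for the adaptive
    clutter statistics a real detector would estimate locally.
    """
    ocean_mean = amplitude.mean()
    return amplitude > k * ocean_mean

# Synthetic 4x4 "ocean" scene: uniform low backscatter plus one bright target.
scene = np.full((4, 4), 1.0)
scene[2, 1] = 40.0            # a ship-like bright point on dark water

hits = ship_candidates(scene)  # only the bright pixel is flagged
```

Because the detection is amplitude-based rather than optical, the same sketch works on imagery collected through cloud cover or at night, which is exactly the appeal Jones describes.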

Intel Update
By George Meyers

Budget Issues

By now, everyone has heard the news about the budget. The good news is that the intelligence community fared much better than everyone else. There seems to be a good understanding that intelligence, surveillance and reconnaissance is important—even in times of peace. With the drawdown of “people,” there will be much more interest in automated platforms and systems to replace these assets. One thing is certain: People have become used to having information and intelligence, and they are not going to stop anytime soon.

White House Cyber Plan

White House Cyber Coordinator Howard Schmidt has introduced a new strategy for cybersecurity research and development. The purpose of his plan is to better coordinate efforts to neutralize cyber-attacks through:

• Inducing change to get to the root causes of existing cybersecurity deficiencies, with the goal of disrupting the status quo
• Developing scientific foundations to minimize future cybersecurity problems by developing the science of security
• Maximizing research impact by catalyzing coordination, collaboration and integration of research activities across federal agencies
• Accelerating transition to practice, where research on how to improve cybersecurity makes its way to the commercial sector through transition programs

If You Think Nothing Is Getting Done

Since January 3, 2011, the Senate has introduced 1,914 bills, while the House has introduced 3,508 bills. A total of 80 bills were signed into law in 2011. The 112th Congress started January 3, 2011, and ends January 3, 2013. We are halfway through the 112th Congress, and it has already introduced a total of 5,422 bills. O

George Meyers is a senior vice president with Cassidy and Associates.

Bill # | Sponsor | Committee

H.R. 3674 | Rep. Dan Lungren (R-Calif.) | House Science, Space and Technology
Amend the Homeland Security Act of 2002 to make certain improvements in the laws relating to cybersecurity and for other purposes. Referred to subcommittee on January 12, 2012.

S. 413 | Sen. Joseph Lieberman (I-Conn.) | Senate Homeland Security
Amend the Homeland Security Act of 2002 and other laws to enhance the security and resiliency of the cyber and communications infrastructure of the United States. Hearing held in Committee on Homeland Security and Government Affairs.

H.R. 47 | Rep. Darrell Issa (R-Calif.) | House Intelligence
Provide a civil penalty for certain misrepresentations made to Congress and for other purposes. In Permanent Select Committee on Intelligence.

H.R. 67 | Rep. Mike Rogers (R-Mich.) | House Judiciary; House Intelligence
Extend expiring provisions of the USA PATRIOT Improvement and Reauthorization Act of 2005 and Intelligence Reform and Terrorism Prevention Act of 2004 until February 29, 2012. In the House Judiciary Subcommittee on Crime, Terrorism and Homeland Security.

H.R. 109 | Rep. John Conyers Jr. (D-Mich.) | House Constitution
Establish a national commission on presidential war powers and civil liberties. Referred to House Judiciary Subcommittee on the Constitution.

H.R. 174 | Rep. Bennie Thompson (D-Miss.) | House Oversight and Government Reform
Enhance homeland security, including domestic preparedness and collective response to terrorism, by amending the Homeland Security Act of 2002 to establish the Cybersecurity Compliance Division. In the House Oversight and Government Reform Subcommittee on Technology, Information Policy, Intergovernmental Relations, and Procurement Reform.

H.R. 514 | Rep. James Sensenbrenner (R-Wis.) | House Homeland Security
Extend expiring provisions of the USA PATRIOT Improvement and Reauthorization Act of 2005 and Intelligence Reform and Terrorism Prevention Act of 2004 relating to access to business records, individual terrorists as agents of foreign powers, and roving wiretaps until December 8, 2011. Became Public Law 112-3.

H.R. 703 | Rep. Peter King (R-N.Y.) | House Judiciary
Amend section 798 of title 18, U.S. Code, to provide penalties for disclosure of classified information related to certain intelligence activities of the United States and for other purposes. In the House Judiciary Subcommittee on Crime, Terrorism and Homeland Security.

H.R. 2096 | Rep. Michael McCaul (R-Texas) | House Science, Space & Technology
Advance cybersecurity research, development and technical standards and for other purposes.

S. 1469 | Sen. Kirsten Gillibrand (D-N.Y.) | Senate Foreign Relations
Require reporting on the capacity of foreign countries to combat cybercrime, to develop action plans to improve the capacity of certain countries to combat cybercrime and for other purposes.

S. 1152 | Sen. Robert Menendez (D-N.J.) | Senate Commerce, Science & Transportation
Advance cybersecurity research, development and technical standards and for other purposes.

S. 1159 | Sen. Kirsten Gillibrand (D-N.Y.) | Senate Armed Services
Require a study on the recruitment, retention and development of cyberspace experts.

S. 8 | Sen. Harry Reid (D-Nev.) | Senate Foreign Relations
Build a comprehensive strategy to confront the nuclear threat from Iran and North Korea; enhance U.S. tools for pursuing key national security interests; and avert and respond to catastrophic cyber-incidents.

S. 372 | Sen. Benjamin Cardin (D-Md.) | Senate Commerce, Science & Transportation
Reduce the ability of terrorists, spies, criminals and other malicious actors to compromise, disrupt, damage and destroy computer networks, critical infrastructure and key resources, and for other purposes.

INDUSTRY RASTER

Situational Awareness Video Unit Receives Multiple Feeds

Harris Corp. is broadening the capabilities of its RF-7800T Situational Awareness Video Receiver (SAVR) to address the growing requirements for secure wireless digital ISR video at the tactical edge. Harris has integrated the Small Unmanned Aerial Systems Digital Data Link (SUAS-DDL) waveform with the RF-7800T video receiver. With SUAS-DDL, the RF-7800T is now able to receive Advanced Encryption Standard video feeds from multiple small unmanned aerial systems (UAS) simultaneously.

SUAS-DDL, a Department of Defense-standard waveform, provides enhanced interoperability between small UAS in the air and video receivers and control stations on the ground. The characteristics of the waveform allow multiple UAS to transmit video on the same frequency. This enables warfighters to monitor ISR video streams covering a wider geographic area, leading to enhanced command and control and operational decision-making.

The multiband RF-7800T SAVR delivers real-time video feeds from cameras on aircraft or UAS platforms to ground forces. Designed for the dismounted warfighter as well as for fixed and vehicular applications, the small, lightweight RF-7800T enables feeds to be viewed outside the TOC and while personnel are on the move.

New GPS Payload Prototype Successfully Powered Up

ITT Exelis has passed a key Air Force Global Positioning System III program milestone: The company has successfully integrated and performed the initial power-up of the GPS III Non-Flight Satellite Testbed (GNST) Navigation Payload Element (NPE), or full-size payload prototype. The successful power-up of the GNST NPE system shows that the digital communications, telemetry and RF interfaces are working properly. It also assures that the system can be configured and operated correctly.

Scheduled for first launch in 2014, the next generation of GPS III satellites will deliver significant improvements compared with current GPS space vehicles. The new satellites, with capabilities yielding superior system security, accuracy and reliability, will improve position, navigation and timing services for warfighters and civil users worldwide. The GPS III team is led by the Global Positioning Systems Directorate at the Air Force Space and Missile Systems Center. Lockheed Martin is the prime contractor, together with teammates ITT Exelis, General Dynamics, Infinity Systems Engineering, Honeywell, ATK and other subcontractors. Irene Lockwood

Video Calibration Solution Offers GEOINT Visual Accuracy

Because GEOINT viewing requires the greatest possible visual detail and utmost visual accuracy, the CalMAN Geospatial video calibration solution from SpectraCal ensures that each PC monitor is precisely calibrated to the standards required by geospatial intelligence agencies. Prominent among the standards is an Electro-Optical Transfer Function, which makes sure each gradient step between black and white has an equal probability of detection (EPD). The EPD curve used in geospatial intelligence requires extremely rigorous settings, which the software calculates, tests, sets and verifies.

In addition to the EPD gamma curve, CalMAN Geospatial accurately calibrates RGB balance. This allows support of color as well as monochrome monitors. Built around a highly customizable workflow architecture, the software uses a built-in pattern generator and monitor control client derived from SpectraCal’s CalPC product and an auto calibration engine based on years of work in high definition video. Joshua Quain;

Enhancements Aid Transformation of Data to Use and Share

Safe Software, a provider of spatial data transformation technology, has released a new version of its flagship product, FME 2012. The release of FME Desktop and FME Server introduces new tools for overcoming data challenges so that data can be used and shared precisely where, when and how it is needed.

Now supporting over 275 formats, FME 2012’s enhancements provide faster, simpler ways to transform data to use and share. They include support for 16 new formats as well as, for point cloud/LiDAR data, the ability to transform billions of points in one workflow and enhanced abilities to extract the precise subset of a point cloud dataset that is needed. The release also offers new and enhanced transformation capabilities designed to make it even easier to read, write and prepare XML data—both the geometries and the geospatial components—without requiring knowledge of scripting languages. In addition, enhancements for 3-D data include making 3-D clipping z-aware, making it easier to work with 3-D local coordinate systems, and enhanced surface modeling. Lakhvir Brar;

Compiled by KMI Media Group staff

Storage Architecture Supports High Resolution Satellite Imagery

GeoEye has selected Cleversafe, a provider of limitless data storage, to meet its requirements for a highly reliable and secure active archive to support its business mission and to maintain its image archive. In order to collect, process and analyze massive amounts of geospatial data, GeoEye required a scalable and flexible storage architecture that could accommodate imagery from GeoEye-1, the world’s highest resolution commercial imaging satellite. Moreover, GeoEye demanded that satellite images be available with zero downtime and 99.999 percent reliability, while keeping costs under control. Cleversafe’s Dispersed Storage Network active archive solution provides GeoEye with over 99.9999999999999 percent annual data reliability and system uptime.

The need to store critical digital content and imaging at the petabyte level and beyond can no longer be met with traditional storage solutions. With complex requirements for both security and governance and the need for content preservation and data integrity, it’s more critical than ever that companies look at solutions that deliver built-in encryption, multiple levels of integrity checks and high levels of reliability. Russ Kennedy;

Metadata Capture Tool Aids Access to Geospatial Content The Canadian Department of National Defence (DND) has implemented an automated Metadata Capture and Publishing Tool, jointly developed with TerraGo Technologies, to make it easier to catalog, search, retrieve, distribute and archive its TerraGo GeoPDF formatted maps, imagery and derived products. The Mapping and Charting Establishment (MCE) is the DND organization responsible for providing geospatial information to the Canadian forces. During the past four years, MCE has implemented a geospatial information management solution that is centered on metadata. By tagging its geospatial holdings, MCE is able to search and retrieve its content quickly. The collected metadata are also leveraged to enable content management, distribution and archiving. The automated TerraGo Metadata Capture and Publishing Tool was developed to capture metadata based on the ISO 19115 geospatial metadata standard and the Adobe XMP metadata format. It uses easy-to-complete forms to capture specified metadata elements that provide critical information about individual georeferenced maps, derived products and images. With form-captured metadata, data managers can categorize geospatial information more precisely, enabling rapid searching and retrieval of desired content. Previously, MCE would manually key metadata into its GeoPDF maps, imagery and derived products, a process that was often cumbersome and prone to human error. The new tool offers a single interface with simple metadata pull-down menus, which also cut down on variance in metadata entries. Renee Wagner;
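Form-driven capture with pull-down menus usually amounts to validating each entry against a controlled vocabulary before the record is written, which is what eliminates the variance of hand-keyed metadata. A small sketch of the idea — the topic-category codes are genuine ISO 19115 values, but the field names and validation logic are hypothetical, not the actual TerraGo tool:

```python
# A few real MD_TopicCategoryCode values from ISO 19115 stand in for
# the tool's pull-down menu; everything else here is illustrative.
TOPIC_CATEGORIES = {"imageryBaseMapsEarthCover", "intelligenceMilitary", "elevation"}

def make_metadata(title: str, topic: str, west: float, east: float,
                  south: float, north: float) -> dict:
    """Build a minimal metadata record, rejecting entries that a free-text
    form would have let through."""
    if topic not in TOPIC_CATEGORIES:
        raise ValueError(f"unknown ISO 19115 topic category: {topic}")
    if not (west < east and south < north):
        raise ValueError("invalid bounding box")
    return {"title": title, "topicCategory": topic,
            "extent": {"west": west, "east": east,
                       "south": south, "north": north}}
```

Because every record passes the same checks, search-and-retrieve queries can rely on the vocabulary being consistent across the whole holding.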

New Satellite Returns High Resolution Imagery A new source of high resolution satellite imagery is now operating, as the Pléiades 1A satellite has begun returning images from orbit. The precision of the 50 cm Pléiades products is clearly revealed in an image of Washington, D.C. Marketed by Astrium Services, Pléiades products will be available to all users from March 2012. Users will then be able to take full advantage of the agility and responsiveness of Pléiades 1A, with its five acquisition scenarios, three daily tasking plans and acquisition capacity of 450 images per day. Jessi Dick;

New World Map Combines GIS with Cartographic Tradition In cooperation with National Geographic, Esri has released the National Geographic World Map. Esri and National Geographic collaborated to produce a distinctive basemap that reflects National Geographic's cartographic design, typographic style and map policies. Designed to be aesthetically pleasing, the National Geographic World Map is for users who want to display minimal data on a vibrant, highly detailed background. The map is currently available at ArcGIS Online in the basemap gallery. The new basemap combines a century-old cartographic tradition with the power of GIS technology to produce a distinctive Internet-based map service serving GIS, consumer, education and mobile users. This new map will be added to Esri's collection of existing basemap services designed for different uses and needs. All of Esri's basemaps are freely accessible for internal- and external-facing sites.

GIF 10.1 | 15

Analysis Transformer

Q&A

Providing Timely, Actionable and Useful Security Information Caryn Wagner Under Secretary for Intelligence and Analysis Department of Homeland Security Caryn A. Wagner was confirmed on February 11, 2010 as the under secretary for intelligence and analysis at DHS. Wagner served as an instructor in intelligence community management for the Intelligence and Security Academy from October 2008 to October 2009. She retired in 2008 from the House Permanent Select Committee on Intelligence (HPSCI), where she served as budget director and cyber security coordinator. Prior to that, she served in the Office of the Director of National Intelligence as an assistant deputy director of national intelligence for management and the first chief financial officer for the National Intelligence Program. She accepted this position after serving as the executive director for intelligence community affairs, where she was responsible for the Community Management Staff, which provided strategic planning, policy formulation, resource planning, program assessment and budget oversight for the IC. Wagner’s previous position was that of the senior Defense Intelligence Agency (DIA) representative to Europe. She served as liaison for the DIA director to U.S. European Command and to NATO from April 2003 to April 2004. From October 2000 to April 2003, Wagner served as DIA deputy director for analysis and production. From 1996 to 2000, Wagner headed the director, Military Intelligence (DMI) staff, where she conducted military intelligence community planning and was responsible for development and management of the General Defense Intelligence Program. Other previous positions included serving as staff director of the HPSCI Subcommittee on Technical and Tactical Intelligence and as an associate at Booz Allen Hamilton. Wagner also served as an Army signals intelligence and electronic warfare officer. 
Wagner received a Bachelor of Arts degree in English and history from the College of William and Mary, and a Master of Science degree in systems management from the University of Southern California. Wagner was interviewed by GIF Editor Harrison Donnelly.

Q: You last spoke with GIF early in your tenure at DHS. How would you characterize your experience over the past two years?

A: I honestly feel that, with a great deal of hard work from a lot of people, we have really transformed the Office of Intelligence and Analysis [I&A] into something that people who were in the office in the past, if they were to come back, would not recognize. I had some things laid out for me when I came into this job—things I knew I had to fix, and things that Congress, the Homeland Security Institute and others had identified as being problems. My leadership

team and I took those on, and I think we have made huge progress in management challenges, in terms of the basic business processes of the office. We've fixed our hiring process and budget process, all of our billets are filled, and we've spent all of our money. We have a functioning program-build process, and we have put in place procedures for identifying the need for policies, and writing, publishing and enforcing them. They are the basic things that organizations need, which we didn't have. The other thing we've done is to focus hard on what our missions are in I&A. Who are our customers, what do we do, and how do we do it better? What's our value proposition, and what do we add? We have made a lot of progress in focusing our analysis, improving the quality of our analysis and working with the Director of National Intelligence [DNI] Office of Analytic Standards and Integrity. We've also done a lot of work on the collection and reporting side. We're working with our partners in the department to standardize the process for reporting information out of the department to the intelligence community and other partners, in a way that it is timely, actionable and useful. We're also working with partners to build a consolidated architecture for collection, processing and dissemination. We've made huge strides in our information-sharing with state and local governments, which is one of the unique reasons we exist. All of these things come to bear, because we're providing them better products, we have a better information sharing architecture

with them, and we’ve upgraded the systems that we’re using to get the information to them. We’ve also developed new product lines in response to their feedback. Overall, we still have a lot of areas for improvement, but I feel pretty good that we have made substantial, measureable progress across the board. One frustration, though, is the articles about us that keep sounding the same themes about I&A. It’s frustrating to see that when I know how much we’ve changed. What’s interesting is that the people who are being quoted in the articles are people who left I&A years ago, certainly before I came onboard. If they’re talking to anyone inside the organization, they’re talking to people who, for whatever reason, still have an axe to grind that is rooted in the past, and not in anything that we’re doing now. I read those articles, and I don’t recognize the picture that they’re painting. I understand that perceptions frequently take a while to catch to up reality, and I’m confident that that will happen eventually. In the meantime, we’ll do what we can to counter that. But if you ask the intelligence community, and the people whose opinions I care about, who are looking at us and gauging our progress, they will tell you that I&A is a much better and more respected place than it was. There’s still room to grow, but we’re doing well. Q: How would you define the concept of “domestic intelligence”? A: Domestic intelligence is a problematic term, because it’s loaded for some people. It harkens back to the days when people were doing things they should not have been doing. So we’re trying now to focus the discussion on “homeland security intelligence,” rather than “domestic intelligence,” because that’s what we do here in the department. When people use the domestic intelligence term, they’re generally using that to capture what we, FBI, Drug Enforcement Agency [DEA] and everyone in the homeland does. 
There is some utility in trying to define it, but I&A's input into that is homeland security intelligence, and we've been spending a lot of time figuring out what that really means. We're moving toward a definition in which homeland security intelligence is more than intelligence, but also information that is useful to federal, state, local, tribal and private sector partners, because it will help them identify or mitigate threats to the homeland. We go beyond the traditional sources of foreign intelligence to put together things that are helpful to our customers. It's about the full range of threats—not just terrorism, because homeland security is bigger than just counter-terrorism. It also captures the fact that we have a huge number of customers, many of whom are also our partners, in the sense that we work with them to share information and put together products that can be of use to our mutual customers and constituencies. We have made a lot of progress in defining what homeland security intelligence is. One of the interesting things is that it's not just about the homeland, because one of the unique things about the department and I&A is that we have responsibilities for what we call the "approaches" to the homeland. Some people call it the "transit zone" between overseas and domestic or between foreign and homeland. Our job is to protect the borders—both virtual and actual. Every point of departure overseas is a virtual border to the U.S. Even if you're getting on a plane in Dubai, that's the border to the U.S. We also have the physical borders as well as the cyber borders, where we're trying to protect the "dot gov" domain, which is our responsibility in the department, but also working with

private industry to protect the broader cyber-infrastructure of the nation. Thinking through what our responsibilities are in homeland security intelligence, it extends into the transit zone, where we're trying to prevent bad people and things from coming from overseas. That's one of the department's unique value-adds. One of the things that we have to figure out in I&A is how best to support that, and more importantly how do we get the DNI and the extended intelligence community to help us support the needs of the people who are protecting the virtual and real borders of the nation.

Q: Please give readers an overview of the DHS intelligence enterprise, as well as what you think can be done to improve it.

A: The enterprise is made up of the intelligence elements of our operating components. I lead the enterprise, in a collaborative way, through the Homeland Security Intelligence Council [HSIC]. We've made a lot of progress over the past couple of years. One of the big things has been to build up trust that it is about looking for synergy and ways that we can mutually support one another and leverage different parts of the department to do a better job for the department's overall mission. It isn't about telling the components what to do, taking their money, or any of the other things you might be concerned with if you were in a component. With the trust that we've developed, we're now starting to do some interesting and constructive things. As an enterprise, we are reworking our intelligence information report process and trying to standardize the timelines, formats and processes for reviewing and releasing them. One of the main things that we provide from the department to the larger intelligence community and our federal partners is information to which we have unique access. We have unique access to information in a variety of areas, including the cyber domain, the border domain, encounter data in the immigration domain and the travel domain.
One of our main challenges is how to share reportable information in response to intelligence needs and requirements in a way that it can be used and received and is standardized across the enterprise. That's going reasonably well. We also are trying to make sure that we team together. There are a lot of natural partnerships, for example between Immigration and Customs Enforcement [ICE] and Customs and Border Protection [CBP] on the border, but we have to do the best job we can to bring in all the potential partners to appropriately share information that helps us with our distributed border mission. We've created the Border Intelligence Fusion Section within the El Paso Intelligence Center [EPIC]. We provided the SES billet to head it, but there are people from all the operating components in it, as well as people from DEA, FBI and NORTHCOM. It's all about how we do a better job of pulling in the information and pushing out stuff that's actionable and useful for interdictors and investigators along the border. That's one of the things that we started in the HSIC and then promulgated outwards. The operating components are very different in their individual missions, so finding areas of commonality that it makes sense to approach in a common way is not easy. But we're finding more and more opportunities where we can reinforce each other.

Q: Some in Congress and elsewhere have recently voiced criticism of your office, saying that it has done little to improve intelligence data. What is your response?

A: We tend to get good marks from the Homeland Security committees, but we have more of a challenge with the Intelligence authorizers. One of the reasons for that is because we’re so different in our mission space that if you’re going to evaluate us with the same standards used to evaluate CIA or DIA products, we’re not necessarily going to measure up. We’re trying to accomplish different things and do them in a different way. If you’re expecting us to put out a lot of products that look very similar to other IC products, you’re not necessarily going to see that because it’s not our job. If other people are writing on a topic and doing it well, we’re not going to try to duplicate what they’re doing. We’re just going to leverage that and make sure the information gets to our customers. What we’re trying to do is to advocate that our needs be met. If they’re not, we’ll fill the gaps ourselves, although that’s the exception rather than the rule, and make sure that we’re taking what is of interest to our customers and getting it to them, while maybe adding that bit of extra information that makes it useful. We have to put out a lot at the For Official Use Only [FOUO] level, because our law enforcement customers in general are operating at that level. What you can put in an FOUO product is frequently not a lot. If you read those and say you could have gotten that from cable news, or that it isn’t really intelligence, you’re missing the point. This is the best we can give them, based on our negotiations with the intelligence community on what can be released at what level. And it is useful for them, because it’s validating what they might have already heard. We’re trying to give them the most

we can, and then take it to the next step by saying here's what you can do about it—the preventative measures and counter-measures and the indicators you can look for. It's a different art form, and one that we are working constantly to refine and perfect. But if you grade it on the same scale as a CIA assessment, it's just not the same. We've been trying to make that case, and I think over time we will make progress. We're serving different customers, and our products look a little different. But we have been evaluated by the DNI's Office of Analytic Standards and Integrity, and they have documented a steady improvement in the quality of our products.

Q: What have you learned over the past two years about the role of geospatial intelligence in your work?

A: One thing I've learned is that the National Geospatial-Intelligence Agency has been extremely forward-leaning and innovative in tailoring geospatial products and services to its customers. They focus a lot of attention on the department, which we appreciate. They're helping us to think about new ways of using the kinds of products and services they provide. In the analytic realm, we are somewhat constrained in the use of imagery. But we're trying to explore new ways of showing the geospatial characteristics of some of our data and intelligence, and of figuring out when that's meaningful and useful and when it isn't. The department overall uses geospatial products and services in a wide variety of ways, and we have great support from NGA in doing that. There's a geospatial foundation


to the common operating picture that we're building across the department. The Federal Emergency Management Agency [FEMA] uses it extensively in both preparation for disasters and disaster recovery. My office helps pull together the available remote sensing capabilities to bring them to the service of FEMA in the wake of a disaster. We've been trying to incorporate more geospatial elements into the analysis that we're doing at the Border Intelligence Fusion Section in EPIC. We're very interested in some of the new apps that NGA Director Letitia Long has been talking about, such as being able to put information on mobile devices, because they could have a lot of utility for our state and local customers as well as the department. We get great support, and we're trying to incorporate more and more geospatial technologies and services into our products.

Q: How can intelligence agencies make use of the new social media while protecting citizen rights?

A: This issue came up in the context of conversations I was having with the DNI on what kinds of things we need to think about as a community that have applicability to both the foreign and homeland security intelligence communities. It's obvious that this is a huge growth area, and it can be of intelligence or counterintelligence value. But it has to be done very carefully, because everything we do includes making sure that we're protecting privacy, civil rights and civil liberties. Another reason this has to be done carefully is that it's a new thing—a huge stream of data of uncertain provenance and reliability. It's like the classic cartoon—"On the Internet, no one knows you're a dog." It's hard to know how much weight to put on things. How many Tweets do you need before you know that something is happening? Do you have to confirm it with information from another source? These are the kinds of tradecraft questions that the intelligence community is wrestling with now, both in the foreign and the homeland context.
We just want to participate in defining that. For my own purposes, I have the authority within I&A to do open-source collection domestically, which the rest of the intelligence community, except for the FBI, does not have. But we have very strict rules on how we do that—it has to be linked to a specific authority, mission and requirement. So we're not out there trolling through people's Tweets. But we have the ability to look at those things. The question is, though, what do they actually mean, and how do we incorporate them into a product, along with other sources, in a way that is rigorous and valid.

Q: What is your organizational and technological strategy for improving border security?

A: Border security is a departmental mission. My actual mission is to provide intelligence and information support to that. We do that in three basic ways. First, we provide the strategic, analytic context for border security, in close partnership with other elements of the department and our interagency partners. We're looking at things like whether it is getting better or worse, what different kinds of things are we seeing and what are the trends? There's a lot of talk, for example, about spillover violence—can we even define that, and establish enough of a baseline to know when we're starting to see things that are outside the norm? It's also looking at topics like this: What if things really have changed and we just don't

realize it yet? What if we've entered a new paradigm in terms of violence on the north side of the border? What would that look like, and how could we tell? It's just to help people think things through, inform the information that we collect, and prove or disprove our hypotheses.

The second piece is that we have a lot of people operating along the border, with different missions and authorities to do border security, law enforcement, and investigations and interdictions. How do we support all those people by sharing the most information we can in ways that are most useful and actionable? What we're trying to do, in cooperation with DEA and EPIC, is to pull as much information together as we can, to get greater insight into what's happening on the border, and to push that information out—a lead to an ICE investigator, for example, or a notice to CBP to look out for a red pickup truck crossing at a certain time and place. It's to enrich everyone's ability to do their job.

The third piece is working mostly with CBP to create an architecture within the department for tasking, collecting, processing and disseminating sensor data related to the border. All the components have different processes and systems, but we'd like to be able to share that information, and ideally to tip one to the other. They all developed on their own, and now we're trying to link them together in a way that will support the entire department and enable the components to be mutually reinforcing. Ideally, we want to be able to share data with our other partners, including the Mexicans, in a way that is compatible.

Q: What results have you seen from the extensive network of state and local fusion centers, and what have you learned from them about sharing information with other levels of government?

A: There are 72 fusion centers, and right now we have 75 I&A intelligence officers deployed at the centers, as well as nine regional directors.
What we are seeing is that they are all unique, since they are state and city owned. Some do terrorism, while others do “all threats, all hazards.” Some are big and some aren’t. They grow at their own pace, but we’ve seen steady improvement. Many of them now are at the point where they are able to take information we have provided them, combine it with information that they already have or receive from local law enforcement, and put out their own products. Some of them are quite good. Another important thing is that it’s not just that we push stuff to them and they send it back to us, but that they send it out to all the other fusion centers. They are becoming a truly national network—mutually reinforcing each other, sharing best practices and information. A lot of times, they have more granularity of data in their local area on a particular issue or problem. In several instances, we have highlighted their products in the secretary of homeland security’s daily briefing book, because they have been doing a good job. We spend a lot of time training them in areas such as writing products, analytic tradecraft and ensuring that privacy and civil rights/liberties concerns are understood and implemented. We’re starting to see the fruits of that. We’ve been working with the FBI to make sure we’re all being mutually reinforcing, and the FBI is currently considering whether to put some of their own intel officers in the fusion centers that have become mature enough to improve that linkage. The other thing we’re wrestling with now is what we call the “fusion center performance plan.” How do we assess and measure the progress against the key disciplines that we’ve established as

their goals—being able to receive information and analyze, share and disseminate it.

Q: What can industry do to enhance its contribution to domestic intelligence?

A: We have a lot of the same issues and challenges facing other members of the intelligence community in dealing with huge volumes of information, and figuring out ways to share with a geographically dispersed group of customers. We have a lot of challenges in the department in terms of enabling our systems and data repositories to talk to each other, but those are issues for any large agency. As I mentioned, NGA has been very forward-leaning in bringing their industrial base to bear on our issues as well. The interesting thing from our perspective is to turn that on its head and ask what we can do for industry, because the private sector is one of our five customer sets. We have built up over the years good relationships with the elements of industry that are involved in the critical infrastructure sectors. But we have a broader responsibility to industry, and working with our partners in infrastructure protection, we have started to do much more engagement with people like shopping mall owners, sporting consortiums and stadiums, which could potentially be targets of attacks, to help them think through their prevention, identification and mitigation efforts. We're always looking for ways to be of more service to key elements of the private sector that aren't necessarily under the 18 critical infrastructure sectors. There are a lot of them out there, and we know we're not reaching all of them, but we're working on it.

Q: Is there anything else you would like to add?

A: Coming up on my second anniversary in this job, I still hear comments sometimes about whether we really needed the DHS. The longer I'm here, the more I realize that we did. Most of the components existed before, and were doing their jobs. But as we become better and better as a department where people work together, and where information in one place can support operations in another, we're getting more bang for the buck from the jobs the components do every day. It's very rewarding to see that. I grew up as a military brat, and every Thanksgiving we would give a toast to all the military people who were serving away from their families. But now I've made my family add a toast to all the first responders, police officers and emergency workers who are doing that every day. They have an immediate impact on people's quality of life, safety and security, but a lot of people don't realize it. That's a message that I'm pushing on the Hill. I'm a veteran, I love our military services, and I don't want to see their budgets cut. But we need to start thinking about how the homeland security piece is as—or even more—important to the day-to-day well-being of people as the national security apparatus, and we should assign to it the same level of importance in our minds.




Sensor technology’s true value as a military and intelligence tool comes when it is used in conjunction with data from other sources. By Peter Buxbaum GIF Correspondent

More than anything else, one key circumstance has contributed to the growth in the collection, analysis and exploitation of light detection and ranging (LiDAR) data in recent years: U.S. forces have found themselves fighting in theaters in which they owned the skies, allowing the aerial overflights that collect LiDAR data in Afghanistan and Iraq to proceed undeterred. From there, it was just a matter of time before military and industry imaginations took over, thinking up and developing new and better ways to collect, extract, analyze, exploit and apply LiDAR data.

LiDAR, a technology that has been around since the mid-1990s, uses laser pulses at a wavelength of 1,064 nanometers to gauge distances by measuring the time delay between transmission of the pulse and detection of the reflected signal. A range finder mounted in an aircraft swings back and forth, collecting data on up to 150,000 points per second and providing resolutions of one point per meter on the ground and one point per 15 centimeters vertically. The data returned by the LiDAR sensor provides location data on an x-y-z axis, referred to as a point cloud.

LiDAR has an advantage over some other geospatial technologies in that it provides accurate elevation data. LiDAR outstrips the capabilities of other sensors with its ability to pinpoint the location and elevation of surface elements such as buildings, trees


and roads. Under the right circumstances, it can also detect hidden objects, for example by penetrating forest or jungle canopies. But LiDAR’s true value as a military and intelligence tool, say the experts, comes when it is used in conjunction with data from other sources such as electro-optical, infrared and hyperspectral sensors to enhance the picture used by analysts, planners and commanders. The same kind of fused data is also being used in simulations incorporated into training systems. “Changes have come to LiDAR in recent years,” said Matt Morris, director of product development at Overwatch Systems. “There are more providers out there building sensors. The costs to collect the data have come down dramatically. But the biggest development is that the Department of Defense and the intelligence community are using LiDAR in their daily workflow. When the defense and intelligence folks make something a priority, the effect is seen across the whole market space.” U.S. forces in Afghanistan use the BuckEye system, which combines airborne LiDAR technology with digital color camera imagery to provide pictures to commanders and planners on the lay of the land. LiDAR elevation data supports improved battlefield visualization, line-of-sight analysis and urban warfare planning. Fusing data from multiple sources increases the probability that features can be automatically extracted from the raw data and that an accurate situational picture will result. Automated feature

extraction is a capability that allows software to recognize certain specific objects represented in LiDAR point clouds. Programming the software to be on the lookout for topographical features such as hills or man-made objects such as buildings, vehicles or power transmission lines allows those features to be separately and distinctly portrayed in the LiDAR image.

“The total is greater than the sum of its parts,” said Matt Bethel, manager of systems engineering at Merrick GeoSpatial Solutions. “The more data that can be put together increases the probability that a specific target is what you are looking for and decreases the uncertainty of that decision.”

In Afghanistan, LiDAR is often used to help identify sites where IEDs may have been buried. “When a LiDAR sensor has passed over the same area multiple times, the data can be used for detecting changes in terrain,” Bethel explained. “An area where the ground has been disturbed may indicate the location of an IED. But it also pays to include a hyperspectral sensor which can detect the chemical signature of the explosive.”

Merrick utilizes software to visualize LiDAR point cloud data to develop products for customer consumption. The company’s Merrick Advanced Remote Sensing product is a Windows application used to visualize, manage, process and analyze LiDAR point cloud data. “Once the LiDAR data is processed it takes many steps to get to what the client is looking for,” said Bethel.

“LiDAR data is becoming more critical and important for developing accurate and current visual databases used for training pilots, troops and tank commanders,” added Pratish Shah, director of marketing at Quantum3D. “Manned or unmanned air and ground vehicles take LiDAR scans of a given area, which represents the most current visual information of that region. Integrating that current information improves realism of virtual training systems. Current data can also be used to extend training to mission rehearsal, giving pilots and ground troops access to the most current visual data to train for specific missions.”

Storage and Dissemination

The large volumes of data generated by LiDAR sensors represent a potential disadvantage to the use of that data. Storing LiDAR point clouds is more complex than storing and rendering image files because the addition of a third dimension renders the task more computationally intense.

“Point clouds are actually nothing but a pile of x-y-z data,” said Oodi Menaker, marketing product manager at Israel-based Tiltan Systems Engineering. “The main challenge is to extract point cloud data, which you can then work with to describe the ground, buildings, power lines, trees, power poles and many other geographical features.”

Dissemination of these huge data files also presents a problem. “Once you collect massive amounts of data, how do you disseminate it on existing networks to a lot of concurrent users?” said Rudi Ernst, president of Pixia. “If you can’t access the data, what is the point of collecting it?”

The development of multi-sensor data fusion suggests the collection of raw data with multi-sensor packages. “Merrick specializes in multi-sensor collection and analysis,” said Bethel.

Merrick provides LiDAR capabilities to the National Geospatial-Intelligence Agency, U.S. Naval Research Laboratory, and U.S. Army Research, Development and Engineering Command, operating the aircraft and sensors that collect the data and developing products based on customer needs.

Sensor and analysis requirements differ based on mission and geography, according to Bethel. “A customer in Latin America was interested in identifying semi-submersible submarines that are used to move cocaine,” he said. “LiDAR is very useful in that effort, but you also need thermal and hyperspectral sensors to see whether you have a positive identification or a false hit.”

Multi-INT Focus

Northrop Grumman focuses less on geospatial intelligence in isolation and more on multi-INT in response to customer demands, said Sean Love, a company business development manager. “The real focus going forward is on taking geospatial data and pushing it to other intelligence types and ingesting other intelligence into geospatial,” he said.

The company facilitates the fusion of different data types through translators supported by a service-oriented architecture. Northrop Grumman has expanded its existing geospatial portfolio to include geospatial data acquisition, collection and processing of LiDAR, full motion video and persistent surveillance data, photogrammetric services, geographic information systems and analysis.

“All this is transparent to the user,” said Love. “The user doesn’t care what the application looks like. He just wants to be getting the right kinds of information. That is why we focus on getting actionable intelligence to analysts and warfighters.”

The company’s offerings support functions such as intelligence gathering and mission planning, routing and logistics, execution monitoring, physical asset tracking, exploration of what-if scenarios, data exploitation and analysis, highly integrated databases and sensor networks, and secure command and control systems.

“In 2011 and in 2012, we are in the process of automating these processes and making them a lot faster,” said Love. “We have condensed the process of transforming LiDAR point clouds to a topographical map from days or weeks to minutes. We are now focused on the first part of the process—organizing point clouds from raw LiDAR data.”

The challenge in the processing of raw LiDAR data is in the math, said Love. “It’s all about the algorithms and getting smarter about it,” he added. “It is doing the error correction and consolidation, especially if you are doing multiple collects. It takes a long time to crunch that information. It involves taking raw data and shaping it into something the user can do something with.”

Northrop Grumman’s efforts at automation are aimed at taking users largely out of the preprocessing routine and data correction process. “We can now load all of the data on a server and let the software crunch it,” said Love. “The user doesn’t have to sit there swapping out disks. Earlier processes had users on the lookout for data anomalies. Now the computer spots these mistakes and corrects them automatically.”
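The change detection Bethel describes, comparing repeated collects over the same area, can be sketched in a few lines: grid each x-y-z point cloud into a ground-elevation raster and flag cells whose elevation shifts between passes. This is a minimal illustration, not any vendor’s algorithm; the cell size, the 0.15-meter threshold and the minimum-z ground approximation are assumptions made for the example.

```python
import numpy as np

def grid_min_elevation(points, cell=1.0, shape=(100, 100)):
    """Rasterize an (N, 3) x-y-z point cloud into an elevation grid,
    keeping the minimum z per cell as a crude bare-earth estimate."""
    grid = np.full(shape, np.nan)
    ix = (points[:, 0] // cell).astype(int)
    iy = (points[:, 1] // cell).astype(int)
    for x, y, z in zip(ix, iy, points[:, 2]):
        if 0 <= x < shape[0] and 0 <= y < shape[1]:
            if np.isnan(grid[x, y]) or z < grid[x, y]:
                grid[x, y] = z
    return grid

def changed_cells(pass_a, pass_b, threshold=0.15):
    """Flag cells whose estimated ground elevation shifted more than
    `threshold` meters between two collects of the same area."""
    diff = np.abs(grid_min_elevation(pass_b) - grid_min_elevation(pass_a))
    return np.argwhere(diff > threshold)  # cells with no data compare False
```

In practice a hyperspectral cue would then be checked against the flagged cells, as Bethel notes, before calling a disturbance a possible IED site.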

GIF 10.1 | 23

3-D Visualization

Quick Terrain Modeler, a 3-D point cloud and terrain visualization software package from Applied Imagery, was designed for use with LiDAR, but is flexible enough to accommodate other 3-D data sources, said Chris Parker, the company president. “Quick Terrain Modeler works with large 3-D data sets. It doesn’t matter whether it is LiDAR, synthetic aperture radar, or sonar. LiDAR happens to be the major 3-D source right now, and a majority of our users use Quick Terrain Modeler to work with LiDAR.”

A recent update to Quick Terrain Modeler, released last summer, provides greater speed and improved workflows. The tool’s workflow is called FLAP, for find, load, analyze and produce.

“Speed gains are achieved through optimization of analysis processes and by pushing more functions out to the graphics card,” said Parker. “Our latest release has a completely redesigned and intuitive interface, including a tool to keep the workspace organized and a mini-map to provide context when zoomed in, tilted or rotated. These gains translate to faster exploitation, production, briefing preparation, planning cycles and decision-making, and a shorter learning curve.”

Quick Terrain Modeler analyzes LiDAR point clouds and represents them as pixels to provide a digital elevation model (DEM). Users have the choice of working with a DEM or directly with the point cloud data.

Many users rely on processed LiDAR data to develop PowerPoint presentations. Quick Terrain Modeler enables that automatically. “Everyone up and down the chain in DoD needs to create PowerPoints,” said Parker. “When the complex analysis is complete, users will want to export that and share the information with others.”

Tiltan has developed a software program called TLiD, which “enables users to make sense out of the point clouds,” Menaker explained.

“We have automated the process of transferring point clouds to geographical features,” he added. “The latest improvements in the ability to depict precise features from LiDAR point-cloud data involve advances in the software algorithms used to process this information.” Tiltan has sold the exclusive rights to TLiD to ITT Exelis.

TLiD provides automatic extraction of features such as houses, trees and power lines; automatic full scene 3-D reconstruction; and output in a variety of file formats. It is integrated with a 3-D viewer.

“TLiD has the ability to develop and run complex urban scenes that include thousands of buildings and vegetation along with hundreds of moving objects,” said Menaker. “This is essential for MOUT [military operations on urban terrain] scenarios, including close air support and UAV operations.”

Disaster relief is another application of the analysis and visualization provided by TLiD, according to Menaker. “By comparing the pictures pre- and post-disaster, users can assess the status of bridges and rural roads.”

Overwatch has developed a tool called LiDAR Analyst, which was built as an extension to GIS products such as Esri’s ArcGIS. “It was ahead of the curve when it was first released in 2006,” said Morris. “Now it is picking up steam as LiDAR data is becoming more readily available.

“ArcGIS is one of the standard tools in the geospatial marketplace,” Morris added. “LiDAR Analyst dovetails directly into the ArcGIS workflow. Because we use standard formats, the output can be consumed by other applications such as Google Earth.”

Overwatch recently improved its tools for LiDAR visualization to include automated feature extraction. The company is also working to expand its cataloging products to include the ability to search for LiDAR files.

LiDAR Fusion is Quantum3D’s most recent software application. “LiDAR Fusion is a visualization tool to help the geospatial intelligence community visualize point cloud data and automatically identify key objects such as buildings, vehicles and people,” said Shah. “For visual simulation applications, LiDAR Fusion can be used to extract information from a LiDAR scan and place that information into the virtual database used for training pilots, ground troops and tank commanders.”

Quantum3D uses LiDAR data in simulation and training applications to develop current and realistic visual databases for training military pilots. The visual data automatically extracted from point clouds can be imported into an OpenFlight database and can be used



on any simulation platform, including Quantum3D’s Mantis Real-Time Scene Management software platform and Independence IDX real-time image generator platforms.

“Using current LiDAR data, for example, vegetation information is automatically extracted and placed into our flight simulation database,” said Shah. “Accurate placement of trees adds a level of realism as military pilots train. Pilots who regularly fly into military bases comment about tree placement in the virtual environment being very representative of the real environment.”
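A common first step in automatically extracting above-ground features such as trees or buildings from a point cloud is to estimate a local ground surface and flag the points standing well above it. The sketch below is a generic illustration of that idea, not the method used by any product named here; the 2-meter cell size and 2.5-meter height threshold are arbitrary assumptions for the example.

```python
import numpy as np

def above_ground_mask(points, cell=2.0, height_thresh=2.5):
    """Return a boolean mask of points standing well above the local
    ground, approximated as the minimum z within each grid cell.
    `points` is an (N, 3) array of x-y-z samples."""
    keys = (points[:, :2] // cell).astype(int)
    ground = {}
    for k, z in zip(map(tuple, keys), points[:, 2]):
        ground[k] = min(ground.get(k, z), z)  # lowest return per cell
    local_ground = np.array([ground[tuple(k)] for k in keys])
    return points[:, 2] - local_ground > height_thresh
```

Points flagged by the mask would then be clustered and classified (building versus tree, for instance) in a real extraction pipeline.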

Data Access

LiDAR point clouds are dense with data, which raises the question of how they are to be transferred across busy networks. “We are focusing on the data access piece,” said Pixia’s Ernst. “Dissemination is our core mantra.”

Pixia focuses on storing data in a way that allows for quick random access. “Pixia handles scalability on the server side,” said Ernst. “To have massive random access, the ultimate goal is to make a spinning disk perform like solid state memory. Our software boosts disk performance so that it can accommodate hundreds or thousands of concurrent users.”

Pixia allows data to be accessed at the object, rather than the file, level. That way, users are able to access the snippets of data that they need rather than having to wade through an entire LiDAR point cloud file.
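The object-level access Ernst describes can be illustrated at small scale: rather than reading a whole point cloud file, a reader seeks directly to the byte range holding only the records it needs. This is a toy sketch of the idea, not Pixia’s software, which works at server and network scale; the fixed-size three-double record layout is an assumption made for the example.

```python
import struct

# One x-y-z record: three little-endian doubles (24 bytes).
POINT = struct.Struct("<ddd")

def read_points(path, start, count):
    """Fetch `count` point records beginning at record `start`,
    seeking straight to the byte range instead of loading the
    whole file -- object-level rather than file-level access."""
    with open(path, "rb") as f:
        f.seek(start * POINT.size)
        data = f.read(count * POINT.size)
    return [POINT.unpack_from(data, i * POINT.size)
            for i in range(len(data) // POINT.size)]
```

The same seek-and-read pattern maps naturally onto HTTP range requests when the store sits behind a web service, which is why object-level layouts scale to many concurrent readers.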

For Bethel, LiDAR’s future will see even greater data density. The frequency of the laser pulses on sensors will increase, allowing both for greater detail in small areas and for capturing wider swathes of territory.

As multi-sensor data collection becomes more commonplace, Bethel expects that sensor packages will become miniaturized as well as modularized. “Right now it takes a lot of experience, knowledge, time and effort to combine sensors into one aircraft,” he said. “Future sensors will be more compact, more simplified and more rugged to better integrate them into unmanned aerial vehicles.”

“We see an increase in LiDAR use across both the GEOINT and visual simulation communities,” said Shah. “As LiDAR scanners are used repeatedly in unmanned air and ground vehicle environments, we see real-time LiDAR processing becoming more prominent. Getting LiDAR data from scan into a visual database faster and in near real-time will continue to enhance the benefits of virtual training for mission rehearsal.”

For all of the developments associated with LiDAR, enhancements to the collection and application of the data will only continue. “We see LiDAR as a big growth opportunity for our business,” said Morris. “People are just scratching the surface with what they can get from LiDAR.” O

For more information, contact GIF Editor Harrison Donnelly at or search our online archives for related stories at


GEOINT Community Week

Held annually in the Northern Virginia area, GEOINT Community Week brings together members from the geospatial-user community, including defense, intelligence and homeland security, for a week of networking, classified briefings, technology exhibits and learning workshops.

June 5: USGIF Workshop Series, Hyatt Regency Reston, Reston, VA

June 5: GEOINTeraction Tuesday, Hyatt Regency Reston, Reston, VA

June 6: NGA Tech Showcase East, Springfield, VA

June 5-8: Ground Warfighter Geospatial Intelligence Conference (formerly AGIC), TASC Heritage Conference Center, Chantilly, VA

June 7: USGIF Technology Day, NGA Campus East, Springfield, VA

June 8: USGIF Invitational, 1757 Golf Club, Dulles, VA



Compiled by KMI Media Group staff

GIS Data to Improve On-Base Emergency Response

In the next 12 months, 11 Marine Corps bases will be involved in a process designed to greatly improve their on-base emergency response systems. Space and Naval Warfare Systems Command (SPAWAR) Atlantic has partnered with GeoComm, a public safety GIS, software and consulting company, to assist in the deployment of modern emergency response capabilities. GeoComm will deliver National Emergency Number Association-compliant street and building address data and will develop E9-1-1 databases for each applicable CONUS Marine installation. This effort includes utilizing all existing installation data and coordinating all efforts to complete digital GIS centerline mapping and address point placement along with master street address guide and automatic location identification database development to improve emergency call management capabilities.

The goal of this project will be to provide the address data and process improvements necessary to ensure reliable, high quality support of public safety agencies in providing computer-aided dispatch, emergency response and related services. The addressing information developed is intended to be multi-purposed and will also support address geocoding, vehicle routing, mailing lists, address validation and incident location geo-verification.

Application Unites Data in Web-Based Visualizations

Blue Water Area Resilient, an application built on IDV Solutions’ Visual Fusion software and used by public safety and security agencies along the U.S.-Canada border, has been accepted into a national data-sharing initiative. Integration with the Unified Incident Command and Decision Support (UICDS) program will allow agencies using the application to automatically share information in response to disasters, including terrorist events.

The UICDS integration is part of the upcoming third phase of Virtual City, a Department of Homeland Security (DHS) pilot project based in St. Clair County, Mich. Under the Virtual City project, IDV Solutions, with St. Clair County and other U.S. and Canadian partners, developed an application to unite data from local, regional and federal public safety agencies in a common operating picture.

The application relies on IDV’s Visual Fusion, software that can unite data from virtually any source in interactive, web-based visualizations. Visual Fusion unites agencies’ data with feeds available from web sources and displays the data on an interactive map and timeline. This visualization provides a visual command center view that can be viewed through a web browser or an iPad app.
Map Drawing Solution Aids Emergency Responders

TrueVector Technologies has introduced a web-based, interactive drawing solution for emergency responder web mapping software intended for use by federal, state and local governments, and other emergency responder organizations. The first deployment of the technology is by Defense Group Inc., which will market this drawing solution as part of its CoBRA WEB Mapping within the CoBRA Crisis Management and Emergency Response business unit.

CoBRA WEB mapping will allow users to quickly develop situational awareness by viewing all essential information on a map, down to street level detail. It provides a common operating picture, where emergency personnel can log in and collaborate on a single universal map, which can be viewed and updated in real time to reflect changing events on the ground. Users of the map can draw and mark up incident information from anywhere, saving time, minimizing property losses and protecting the safety of personnel and the general public. The system has been designed to easily track assets and resources as data is automatically fed into a map from the incident site as well as reachback assets and command centers.

Software Offers Advanced Volumetric Data Visualization

Makai Ocean Engineering has released a new demonstration version of its geospatial visualization software, Makai Voyager. The new release, available at http://voyager., demonstrates Makai Voyager’s advanced volumetric data visualization and analysis capabilities.

Makai Voyager provides users with an easy-to-access, cross-platform software package to process, analyze, fuse and display vast amounts of scientific and GIS data being collected and simulated in the earth, ocean and atmosphere. The 1.1 update offers many new features, including volume rendering of large 4-D data models; display of dynamic data on the ocean surface; customizable graphs of scientific data; and faster streaming and improved WMS support.

The downloadable demo contains many of the scientific visualization capabilities of the Makai Voyager software platform. The full version of Makai Voyager will contain a wide variety of data import and fusion tools to import and process GIS and scientific data and provide users with access to add-on modules for specific tasks such as LiDAR analysis.

The advertisers index is provided as a service to our readers. KMI cannot be held responsible for discrepancies due to last-minute changes or alterations.

American Military University . . . . . 19
BAE Systems . . . . . C4
Booz Allen Hamilton . . . . . 3
EMC . . . . . 21
Fugro Earthdata Inc. . . . . . 16
Intelligence Sharing Summit . . . . . 24
Military Antennas West . . . . . 8
National Association of Broadcasters . . . . . INSERT
National Space Symposium . . . . . 9
TerraGo Technologies . . . . . C2
USGIF . . . . . 25

Compiled by KMI Media Group staff

Calendar

February 22-24, 2012: Esri Federal User Conference, Washington, D.C.

March 19-23, 2012: ASPRS Annual Conference, Sacramento, Calif.

April 16-19, 2012: National Space Symposium, Colorado Springs, Colo.

June 4-8, 2012: GEOINT Community Week, Washington, D.C. area


Volume 10, Issue 2 | March 2012

Cover and In-Depth Interview with:

Michael G. Vickers
Under Secretary of Defense for Intelligence

Features:

Video Searching
As analysts struggle to contend with rapidly growing volumes of full motion video data, defense researchers and industry are working to develop effective search technologies for such video imagery.

Hyperspectral Imaging
Hyperspectral imaging and processing, which can collect imagery at a subpixel level not visible to the human eye, is a growing area in remote sensing.

Human Terrain
Analysis of the human terrain and socio-cultural dynamics is playing a growing role in GEOINT and other intelligence analysis.

Web Apps
While industry pushes to develop new apps for mobile GEOINT, the National Reconnaissance Office and the National Geospatial-Intelligence Agency have also gotten in the game, developing mobile applications that allow smart phones and other personal devices to access and download imagery.

Insertion Order Deadline: February 24, 2012 | Ad Materials Deadline: March 2, 2012



Geospatial Intelligence Forum

Antoine de Chassy
President
Astrium GEO-Information Services North America

Q: You just had the successful launch of the Pléiades satellite. Can you tell us a little about the program and the capabilities of the new satellite?

A: Pléiades 1, which launched in December 2011, is the first of a two-satellite constellation. It will be followed later this year by its twin, Pléiades 2. Together, they will operate in a phased orbit, allowing for daily revisit to any place on the globe. With Pléiades, we will provide orthorectified, 50-centimeter resolution products as a standard. Our clients will have multiple products to choose from, including black and white, color, ortho or raw data, all for the same price. Stereo and tri-stereo products are also available and will be used to create high precision digital elevation models. With an imaging capacity of 2 million square kilometers per day with two satellites, five acquisition scenarios and three daily tasking plans, the Pléiades system is built to deliver precise geospatial information in record time.

Q: Will there be any other satellite launches in the near future?

A: As a complement to the Pléiades constellation, we are launching SPOT 6 later this year, followed shortly thereafter by SPOT 7. These satellites will provide world class 1.5-meter resolution imagery and maintain the 60-kilometer swath that the current SPOT satellites are known for. By integrating the two constellations, we will be the only commercial provider with the capability to bring our customers intraday satellite imagery revisits anywhere on the globe, every day of the week.

Q: Don’t you have access to radar imagery as well?

A: We operate two synthetic aperture radar satellites, TerraSAR-X and TanDEM-X, that allow for day and night collection regardless of weather and clouds. With this suite of earth observation satellites, we have an unprecedented capability, which will allow us to provide near real-time geo-information and provide decision support to the GEOINT community better than any other single provider in the industry.

Q: How soon do you anticipate having Pléiades imagery available to customers?

A: Just three days after the launch, we started receiving and releasing great images for everyone to view. Images of Washington, D.C., Dubai, Davis-Monthan AFB, Ariz., and others can be found on our website. The images are truly fantastic and the satellite is working well. We expect commercial operations and the first tasking orders to be taken at the beginning of March. With an expected accuracy of 4.5 meters CE 90, we will be able to deliver the first high quality 50 cm imagery products with a 20 km swath. This will enable the GEOINT community to observe and interpret larger areas of interest in a single satellite pass and provide a greater level of decision support.

Q: How will you differentiate yourselves from the existing high-resolution imagery providers?

A: The real differentiator will be in our ability to deliver. There is a demand in the market for high resolution data that is not being met. We will meet this demand with a strong dedication to both commercial and government users alike. Commercial users will no longer have to wait in line for new acquisitions. At the same time, we will have plenty of capacity to meet the needs of defense users, and provide each sector with the quality of products and service delivery they have come to expect from Astrium Services for more than 25 years.

Q: How will your offer differ to meet the demand for services?

A: The fundamental difference will be the range of services available for our customers. Everything from the satellite to the ground operations to the ordering system has been designed with maximum responsiveness in mind. For instance, a lot of work has gone into designing image production systems. The fully automatic orthorectification process is capable of generating a 20 km by 20 km color image in less than 30 minutes and a single-pass mosaic of 60 km by 60 km in two to three hours. On the user side, everything from ordering to data delivery has been made as flexible and easy as possible. New acquisitions, catalog data, subscription offers, online monitoring services and more mean that imagery is just a click away and ready to use.

Q: With programs such as the EnhancedView program facing possible serious budget cuts, how do you see the future of commercial satellite imagery for defense and other government agencies?

A: There is no doubt that the intelligence community and our industry will be significantly impacted by these cuts. However, we are not just delivering pixels anymore. There are clear opportunities out there in fields such as data on demand and monitoring delivered through cloud computing and online services. Multi-intelligence analysis and processing [UAV data with EO and SAR imagery] and bundled offers, such as GEOINT plus secured communication plus information security solutions, are needed as well. We have existing and proven capacity to build these offers, so there is still a strong future for commercial satellite imagery for defense and government agencies. O

Geospatial Intelligence Forum
KMI GIF 2012 Editorial Calendar

FEB (10.1)
Cover Q&A: Caryn Wagner, Under Secretary for Intelligence and Analysis, DHS
Special Section: NGA Support Teams
Features: Information Sharing; GIS Workflows; Human Terrain
Trade Shows: Esri Federal GIS Conference* (2-22); AUSA Winter* (2-22)

MAR (10.2)
Cover Q&A: Michael G. Vickers, Under Secretary of Defense for Intelligence
Special Section: IARPA Profile, with Dr. Lisa Porter Interview
Features: Full Motion Video; DCGS; Web Apps
Trade Shows: ISIC* (3-26)

APR (10.3)
Cover Q&A: Bruce Carlson, Director, National Reconnaissance Office
Special Section: Top Remote Sensing Companies; Multi-Agency Who’s Who: Remote Sensing
Features: EnhancedView; Commercial SAR
Trade Shows: DoDIIS* (4-1); Space Symposium* (4-16); SPIE* (4-23)

MAY/JUN (10.4)
Cover Q&A: Keith Barber, Director, National System for Geospatial-Intelligence Expeditionary Architecture Integrated Program Office
Special Section: Army GEOINT Roundtable
Features: Special Operations
Trade Shows: GEOINT Community Week* (6-4)

JUL/AUG (10.5)
Cover Q&A: Robert Cardillo, Deputy Director for Intelligence Integration, ODNI
Special Section: Marine Corps Intelligence Command Profile, with BGen Stewart Interview
Features: Homeland Security; Bathymetric LiDAR; Multi-INT
Trade Shows: Esri Users Conference* (7-23)

SEP (10.6)
Cover Q&A: BG Stephen G. Fogarty, Commander, INSCOM
Special Section: Aerial Imaging; 2012 Top Intelligence and Geospatial Companies
Features: Big Data; Tactical GEOINT; Training
Trade Shows: Modern Day Marine* (9-25)

OCT (10.7)
Special GEOINT 2012 Symposium Issue
Cover Q&A: Letitia A. Long, Director, NGA
Features: Emerging Technologies; Special Operations; Industry Showcase; M&S; Analytic Software; Maritime; Mission Planning
Trade Shows: GEOINT 2012* (10-8); AUSA* (10-22)

DEC (10.8)
Cover Q&A: Vice Adm. Kendall Card, Director of Naval Intelligence
Special Report: Full Motion Video
Features: Visualization; Feature Extraction; Industry Outlook

*BONUS DISTRIBUTION
This editorial calendar is a guide. Content is subject to change. Please verify advertising closing dates with your account executive.

To advertise, contact Scott Parker | 301.670.5700

GIF 10-1 (Feb. 2012)