
The Magazine of the National Intelligence Community

GEOINT Deployer: Joseph F. Fontanella, Director, Army Geospatial Center and Army GIO

May/June 2013 Volume 11, Issue 4

Motion Imagery Standards • Cloud Computing • Big-Data Analytics • Command Post Imagery • LiDAR Software

Streamline your workflow from beginning to end with unparalleled search functionality, exploitation capabilities, and product creation for commercial and defense industries. Discover your data with GXP Xplorer. Search multiple data stores across an enterprise with a single query to locate imagery, terrain, text documents, and video. Exploit data using SOCET GXP® to create geospatial intelligence products with advanced feature extraction tools, annotations, and 3-D visualization for planning, analysis, and publication. Our Geospatial eXploitation Products give you the power to deliver actionable intelligence, when it counts.

Imagery courtesy of DigitalGlobe





Cover / Q&A

Joseph F. Fontanella
Director, Army Geospatial Center and Army GIO

Features

Imagery for Command
Users of the Army's most popular battle command system will soon be able to quickly update maps with data and imagery from a wide range of information sources, thanks to a recent software switch. By Harrison Donnelly

Quality of Motion
The Air Force and the National Geospatial-Intelligence Agency are working together to evaluate the quality of motion imagery more comprehensively as it pours into networks in differing formats at high rates. By Cheryl Gerber

Infrastructure for Outreach
An industry team is exploring the use of shared computing infrastructures to deliver GEOINT and other data to emergency responders and non-governmental organizations. By Harrison Donnelly

Getting a Handle on LiDAR
As the volume of data from light detection and ranging (LiDAR) sensors increases, companies are developing software that employs new techniques and functions to satisfy user demands. By Peter Buxbaum

Data analytics for geospatial intelligence has reached a crossroads at the convergence of big data and cloud computing. By Cheryl Gerber

Departments

2 Editor's Perspective
3 Program Notes / People
14 Industry Raster
27 Resource Center

Industry Interview

Darin Powers
Vice President, Intelligence and Security
DynCorp International


© 2013 Lockheed Martin Corporation

Big Analytics

Mission On Demand™. A unique set of capabilities powered by advanced analytics. For innovative security built into every level of every solution. From Lockheed Martin, the #1 provider of I.T. solutions to the U.S. government. mission-on-demand


Geospatial Intelligence Forum Volume 11, Issue 4 • May/June 2013

The Magazine of the National Intelligence Community

Editorial
Managing Editor: Harrison Donnelly
Online Editorial Manager: Laura Davis
Copy Editors: Sean Carmichael, Laural Hobbes
Correspondents: Peter A. Buxbaum, Cheryl Gerber, William Murray, Karen E. Thuermer

Art & Design
Art Director: Jennifer Owers
Senior Graphic Designer: Jittima Saiwongnuan
Graphic Designers: Scott Morris, Eden Papineau, Amanda Paquette, Kailey Waring

Associate Publisher: Scott Parker

KMI Media Group
Publisher: Kirk Brown
Chief Executive Officer: Jack Kerrigan
Chief Financial Officer: Constance Kerrigan
Executive Vice President: David Leaf
Editor-In-Chief: Jeff McKaughan
Controller: Gigi Castro
Marketing & Communications Manager: Holly Winzler
Operations Assistant: Casandra Jones
Trade Show Coordinator: Holly Foster

Operations, Circulation & Production
Operations Administrator: Bob Lesser
Circulation & Marketing Administrator: Duane Ebanks
Circulation: Barbara Gill
Data Specialists: Raymer Villanueva, Summer Walker

EDITOR'S PERSPECTIVE

The federal government recently held a banquet to present 46 senior-level federal workers with the Presidential Rank Awards of Distinguished Executive and Distinguished Professional. The featured speaker was Director of National Intelligence James Clapper. According to a report in The Washington Post, however, none of those receiving the awards, which carry a substantial financial bonus, was a member of the U.S. intelligence community.

No doubt, the absence of the IC was an oversight. But it symbolized the ongoing quandary of intelligence professionals, who by the nature of their business must avoid the spotlight and the public recognition that they would otherwise richly deserve.

As Senator Barbara Mikulski (D-Md.) has observed, "Intelligence professionals serve our country in anonymity knowing that most of the work they do to protect our freedom will never be made public. They are unsung heroes who risk their lives and even give their lives fighting terrorists and those who wish to do this country harm."

In April, Mikulski and other members of the Senate Intelligence Committee introduced legislation to designate July 26 as U.S. Intelligence Professionals Day. That was the day when President Truman signed the National Security Act of 1947, which laid the foundations for the current defense and intelligence structure.

Along with Hollywood's Zero Dark Thirty, another sign of recognition is a new HBO documentary, Manhunt. It tells the story of the decades-long search for Osama bin Laden, focusing on a group of female agents who took on the task well before 9/11.

All that is great, but I would argue that the best way to honor IC professionals is not with public awards or films, but by providing them with the training and career development they need to do their jobs even better. That's why professional development programs, such as those led by Lieutenant General Michael T. Flynn at the Defense Intelligence Agency, are so important.

Harrison Donnelly
Editor

KMI Media Group Leadership Magazines and Websites

Border & CBRNE Defense
Geospatial Intelligence Forum
Ground Combat Technology
Military Advanced Education
Military Information Technology
Military Logistics Forum
Military Medical & Veterans Affairs Forum
Military Training Technology
Special Operations Technology
Tactical ISR Technology
U.S. Coast Guard Forum

Subscription Information

Geospatial Intelligence Forum, ISSN 2150-9468, is published eight times a year by KMI Media Group. All rights reserved. Reproduction without permission is strictly forbidden. © Copyright 2013. Geospatial Intelligence Forum is free to qualified members of the U.S. military, employees of the U.S. government and non-U.S. foreign service based in the U.S. All others: $65 per year. Foreign: $149 per year.

Corporate Offices

KMI Media Group
15800 Crabbs Branch Way, Suite 300
Rockville, MD 20855-2604 USA
Telephone: (301) 670-5700
Fax: (301) 670-5701
Web:


Compiled by KMI Media Group staff

Research Program Seeks Access for the Warfighter

The National Geospatial-Intelligence Agency, like other elements of the military services, combat support agencies and combatant commands, is a member of the Military Exploitation of Reconnaissance and Intelligence Technology (MERIT) program, administered by the National Reconnaissance Office. MERIT solicits, assesses and funds advanced research and development projects that focus on increasing warfighter access to intelligence from national systems and national systems' data, across the intelligence disciplines, and develops warfighter capability to exploit intelligence at the tactical level, said Corry Robb, the MERIT program manager.

NGA's MERIT program is managed and led by NGA's Future Warfare Systems Division within the Military Support Directorate, said Robb. The division coordinates an internal panel of agency subject matter experts to perform technical proposal evaluations. "The panel vets 70 to 85 proposals annually to ensure alignment with national and military intelligence priorities and guidance, including the recently published NGA strategy," said Robb. There were 76 proposals submitted for fiscal year 2014, said Robb, and nearly 40 of them directly support GEOINT. The team tracks annual and multi-year GEOINT-related MERIT projects from development to completion to ensure delivery of successful programs that support the larger Department of Defense and National System for Geospatial Intelligence enterprise, said Robb.

"MERIT places great emphasis on quickly transitioning promising technology from research and development into operations," said Robb. "NGA's involvement in the MERIT program directly aligns with NGA's Strategic Objective No. 7, 'to lead advancement in the GEOINT field, and transition research and development and science and technology activities to operations.'" A recent example of success from NGA's participation in MERIT is the development of the Enhanced Quality Imagery Search (EQUIS) tool, said Robb. EQUIS enables analysts to quickly and accurately identify the imagery they need by merging multiple image libraries into a single, integrated user environment.

Another GEOINT tool funded and developed through the program is the Web-based Access and Retrieval Portal, a major component of the NSG's GEOINT distribution system. It is a multi-domain system that front-ends imagery archives and provides the download, query and retrieval of data hosted on accessible libraries. The MERIT program is also responsible for myriad tools that improve overhead persistent infrared's utility at the tactical level, said Robb. The technology is a key contributor to emerging GEOINT capabilities. "The MERIT program continues to benefit the NGA mission by quickly delivering to NGA analysts and the DoD warfighter cutting-edge applications and tools that significantly enhance existing capabilities," said Robb.

(This article appeared originally in the Spring 2013 edition of Pathfinder, the magazine of NGA.)

PEOPLE

Compiled by KMI Media Group staff

Army Colonel Christopher S. Ballard has been selected for the rank of brigadier general and assigned as deputy chief of staff, intelligence, Headquarters International Security Assistance Force Joint Command, Operation Enduring Freedom.

Air Force Major General Robert P. Otto has been nominated for appointment to the rank of lieutenant general and for assignment as deputy chief of staff, ISR, Headquarters U.S. Air Force. He is currently serving as commander, Air Force ISR Agency, Joint Base San Antonio Lackland, Texas.

The list of Navy captains selected for appointment to the rank of rear admiral (lower half) includes Captain Christian D. Becker, who is currently serving as major program manager for Space and Naval Warfare Systems Command, Space Field Activity, Chantilly, Va., and Captain Timothy J. White, who is currently serving as commanding officer, Naval Intelligence Operations Center Maryland.

Army Reserve Colonel Christie L. Nixon has been selected for promotion to the rank of brigadier general and for assignment as deputy commander, Army Intelligence and Security Command.

Pixia has hired Gina Lundy as vice president of government relations and corporate communications. A retired Air Force lieutenant colonel, Lundy most recently served as deputy director for congressional and public affairs at the National Reconnaissance Office.

After nearly 10 years as the head of the U.S. Geospatial Intelligence Foundation (USGIF), Founder, Chairman and Chief Executive Officer (CEO) Stu Shea is stepping down as CEO. President Keith J. Masback will replace Shea as CEO, while Shea will remain as chairman of the board of directors. USGIF Vice President of Operations Aimee McGranahan was named to the newly created position of chief operating officer. In addition, Darryl Murdock has become vice president of professional development, where he will be primarily responsible for leading the foundation's GEOINT professional certification initiative.

After acquiring Tomnod, a company that analyzes imagery through crowdsourcing software, DigitalGlobe has hired Tomnod co-founder Shay Har-Noy as director of research and development.


Government, industry work to improve the quality of the vast quantities of motion imagery pouring into military and intelligence networks.

By Cheryl Gerber GIF Correspondent

The Air Force and the National Geospatial-Intelligence Agency are working together to evaluate the quality of motion imagery more comprehensively as it pours into networks in differing formats at higher rates than ever before. The Enterprise Image Quality Verification Program (EIQVP) expands the reach of quality assurance out to the enterprise for a bigger picture from which to differentiate the various causes of quality degradation.

"We need to figure out exactly which part in the image chain causes quality degradation so we can correct it to give warfighters the best image quality they can get," said Robert T. "Bo" Marlin, technical adviser for ISR capabilities and integration in the Air Force Office of the Deputy Chief of Staff for ISR. "We're expanding quality assurance with a methodology that evaluates more sensors."

To manage the surge of GEOINT collected today, the EIQVP will establish a broader baseline from which to assess the quality of full motion video (FMV) and other data types. "The Sensor Integration Division of the NGA/TQ Office is developing a methodology for how to assess the quality and utility of FMV and for identifying quality impacts resulting from storage, retrieval and exploitation of that data within the context of the National System for Geospatial Intelligence," said Lisa Jorgensen, chief, Sensor Integration Division, NGA.

The expanding use of high-definition motion imagery has a scientific basis: Humans process moving pictures more quickly and easily than still images. "Video is much closer to what the human brain is expecting to see than still imagery. The human eye and brain are built to look at real-time 3-D scenes in motion. It's a more natural process for the human visual system to integrate multiple frames of data, whereas when we look at an individual scene, only seeing it in 2-D, we have to guess at the shapes, motion and context," said Jim Salacain, an NGA contractor in the NGA/TQ Office.

Last December, Hanscom AFB, Mass., placed Harris Government Communications Systems Division (GCSD) on contract to test and analyze the quality of FMV and other forms of motion imagery as it flows from end to end. "Harris GCSD employed government-furnished equipment for video test and monitoring and installed it at selected sample points throughout the FMV end-to-end flow," said First Lieutenant Jessica Zencey, logistics lead, AF Life Cycle Management Center at Hanscom. "The test results made recommendations for areas in need of improvement as highlighted by the increasing demand for combat air patrols. Those recommendations will be used to plan future updates in an effort to improve FMV analysis and to meet broadcasting standards," she said.

Recommendations for future updates could include improved sensor-to-sensor communications and increased onboard automation in plug-and-play, sensor-agnostic platforms. It could mean shifting some production, exploitation and dissemination (PED) functionality from on ground to on board, and integrating standard technology such as Extensible Markup Language. "We want to get to a point where one sensor informs the other sensor automatically, communicating in a machine-to-machine way," said Marlin. But for now, PED is still on the ground. "We are not as close to on-board processing as on-the-ground processing today. As processors become smaller, lighter and use less power, the process will improve."

Harris Corp. is keeping an eye on these developments for integration into its Full Motion Video Asset Management Engine (FAME) architecture. "We're seeing advances in sensor-to-sensor algorithms that will allow us to do real-time analytics, alerting, cueing and tipping. As the technology matures to real time, we will apply it to FMV sensors and bring it into FAME," said John Delay, chief strategist and architect, geospatial products, for Harris. For example, if an FMV sensor is set to watch for a white truck, it could send an alert the instant the quality is good enough to discern a white truck. But before such onboard automation can be realized, the raw data being collected must be good from the start. "We won't get to the point where we can do machine automation until we solve the quality problem of core data at the point of collection, in real time," said Delay.
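The white-truck scenario amounts to a quality-gated alert rule: a detection is only surfaced once the imagery is good enough to trust it. A minimal sketch in Python (the quality scores, threshold and frame records below are invented for illustration, not part of any Harris or MISB interface):

```python
QUALITY_FLOOR = 6   # invented minimum quality score for "good enough"

def alerts(frames, target="white truck"):
    """Yield an alert for each frame where the target is discernible.

    A detection alone is not enough; the frame's quality score must
    also clear the floor, so low-quality hits are held back.
    """
    for t, quality, detected in frames:
        if detected == target and quality >= QUALITY_FLOOR:
            yield f"t={t}: {target} discernible (quality {quality})"

# Simulated frame stream: (time, quality score, detection label or None).
stream = [(0, 4, "white truck"), (1, 5, None), (2, 7, "white truck")]
print(list(alerts(stream)))
```

Only the third frame fires: the first detects the truck but the quality is too low to act on it.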

Points of Degradation

The EIQVP is examining the entire passage of motion imagery, from sensor to user, to identify the points of quality degradation, including the proliferation of it. "We need additional testing and performance characteristics at the sensors to understand the limitations and capabilities of sensors and to minimize the propagation of problems," Delay said. "The goal of the EIQVP is to identify the areas where infrastructure improvement is needed to improve the quality of data."

As one might expect, voice-only data travels more easily across networks than video data. "Voice is well behaved compared to video, which is very bursty and highly compressed. If you lose one packet of video, you can't recover that packet because subsequent frames are dependent upon past frames," noted Ulrica de Fort-Menares, director of product management, Cisco Medianet.

Cisco Medianet is an intelligent network architecture that has been optimized for the transmission of voice, data and video. The embedded Cisco Media Services Interface standardizes integration with Medianet network services such as media monitoring.

A burst is a continuous transfer of data from one device to another without interruption. Burst mode dedicates an entire channel for the transmission of data from one source, rather than dedicating a timeslot for each device that needs to transmit.

To prevent or resolve video degradation as it travels across networks, Cisco proactively reroutes. "Cisco performance routing is a new way of doing rerouting. Cisco networking devices are able to use performance routing to reroute video traffic dynamically based on real-time monitoring of information about delay or jitter in the network," she said.

As motion imagery travels great distances at high speed across the widest area networks today, the quality degrades in many places, from real-time sensor collection at the start to the process of encoding and decoding, in satellite or radio transmissions, conversions from analog to digital video or from incompatible network transport technologies. Increasing onboard automation would shorten the distance and lessen the degree of quality degradation in the long passage.

To speed delivery and save space, motion imagery is encoded in multiple blocks, and then decoded. If encoding and decoding are not correlated or standards-based, and if there isn't enough network bandwidth available to encode details, then pixels or blocks go missing, showing up as low-resolution, blurry and at times unrecognizable segments of motion imagery. Other defects, such as tiling, result from lost or misplaced blocks.

Standards Compliance

Some quality issues can be resolved or prevented by standards compliance, the use of high-definition sensors or a standards-based encoder/decoder used both for standard definition (SD) and HD motion imagery. Harris offers the Acuity H.264 HD airborne encoder, built on Motion Imagery Standards Board (MISB) standards for video processing in both manned and unmanned vehicles. H.264 refers to the International Telecommunication Union (ITU) H.264 compression standard, also called the ISO (International Organization for Standardization) MPEG-4 Part 10 Advanced Video Coding (Moving Picture Experts Group) standard. It was developed jointly by the ITU and the ISO.

The Acuity H.264 receives uncompressed video from SD or HD sensors, at which point users can crop or scale the uncompressed image in a way that best meets data link constraints. The video is then compressed and passed to the onboard processor for encapsulation with key, length and value metadata into an MPEG-2 transport stream protocol. Local storage in MPEG-2 supports content retrieval on-demand via an Internet Protocol interface.

"The MISB-compliant H.264 encoders can transfer HD at aggressive bit rates over existing networks. It's a major step in the right direction to be MISB compliant. But now we need sensor communications to catch up with compression technology," Delay said.

HD sensors and monitors are crucial improvements in the quality of motion imagery, but not all networks can transmit the higher bit rates of HD video quickly. The better the image, the bigger the file and the more it can cause network latency.

In fact, from the growth of HD, FMV, multi-sensor platforms on aircraft and automated video recording in military vehicles, the entire PED of motion imagery strains bandwidth and slows delivery. And too much unfiltered information can burden analysts' desktops.

Nonetheless, standards continue to improve the process. Just finalized in January, the H.265 video standard enters the world of Ultra High Definition video. An H.265 video image is noticeably sharper with finer detail than an H.264 image. "The new H.265 compression is 50 percent more efficient than H.264. It significantly improves the bit rate and consumes less bandwidth," Delay noted.

The quality of motion imagery is measured using the video version of the system for rating the quality of still imagery, the National Imagery Interpretability Rating Scale (NIIRS). Video NIIRS (VNIIRS) scores the quality of video on a 2 (worst) to 11 (best) range. The higher the NIIRS number, the finer the video and the more tasks an analyst can perform on the imagery.

"The video NIIRS scale was revised in October 2012 to align with the NIIRS scale used for still imagery," said Bryan Blank, MISB chair. "A level 5 is good for tracking the movement of a car or van on roadways in medium traffic, but a level 6 under the same conditions would produce finer detail of movement by smaller objects, such as a motorcycle."
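The key-length-value metadata encapsulation mentioned above can be illustrated with a short sketch of BER-style KLV framing. This is a simplified illustration: real MISB metadata uses 16-byte universal keys and precisely defined field encodings, whereas the one-byte tags and text values below are invented.

```python
def klv_encode(key: bytes, value: bytes) -> bytes:
    """Pack one key-length-value triplet using BER length encoding."""
    n = len(value)
    if n < 128:                # short form: a single length byte
        length = bytes([n])
    else:                      # long form: 0x80 | byte count, then big-endian length
        body = n.to_bytes((n.bit_length() + 7) // 8, "big")
        length = bytes([0x80 | len(body)]) + body
    return key + length + value

def klv_decode(buf: bytes, key_size: int = 1):
    """Yield (key, value) pairs back out of a concatenated KLV buffer."""
    i = 0
    while i < len(buf):
        key = buf[i:i + key_size]
        i += key_size
        first = buf[i]
        i += 1
        if first < 128:        # short-form length
            n = first
        else:                  # long-form length
            count = first & 0x7F
            n = int.from_bytes(buf[i:i + count], "big")
            i += count
        yield key, buf[i:i + n]
        i += n

# Two invented tags standing in for latitude/longitude metadata fields.
packet = klv_encode(b"\x0d", b"+37.00") + klv_encode(b"\x0e", b"-122.00")
print(list(klv_decode(packet)))
```

In an Acuity-style pipeline, triplets like these ride in a metadata stream of the MPEG-2 transport stream alongside the compressed video.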


The MISB Interpretability Quality and Metrics Working Group is developing a tool called Video Enterprise Rating Tool for Improved GEOINT (VERTIGO), which uses the VNIIRS scale for real-time, automated rating of motion imagery, allowing analysts to select only the video clips with the VNIIRS levels needed. “VERTIGO could save on bandwidth, since analysts would use it to select only what they need to accomplish their mission. It can prevent network transmission clogging with unnecessary video and help to determine where the degradation of quality occurred,” Blank said. Another way to measure video quality is with the peak signal to noise ratio, which compares the original, real-time-collected motion imagery with the noise or errors introduced by compression.
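Peak signal-to-noise ratio follows directly from the mean squared error between the original and compressed frames. A minimal pure-Python sketch for 8-bit grayscale frames (the tiny 2x2 frames are illustrative only):

```python
import math

def psnr(original, compressed, max_val=255):
    """Peak signal-to-noise ratio, in dB, between two equal-size frames.

    Higher PSNR means less compression noise; identical frames give
    infinity, since there is no error signal at all.
    """
    diffs = [(o - c) ** 2
             for row_o, row_c in zip(original, compressed)
             for o, c in zip(row_o, row_c)]
    mse = sum(diffs) / len(diffs)        # mean squared error per pixel
    if mse == 0:
        return math.inf                  # no degradation at all
    return 10 * math.log10(max_val ** 2 / mse)

# Tiny 2x2 grayscale frames: the "compressed" copy has small pixel errors.
frame = [[120, 121], [119, 122]]
noisy = [[118, 123], [119, 120]]
print(round(psnr(frame, noisy), 1))      # about 43.4 dB
```

In practice the comparison runs per frame over full-resolution imagery, but the arithmetic is exactly this.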

Reducing Latency

The Defense Information Systems Agency (DISA), meanwhile, is concerned continuously with reducing network transmission clogs to improve the quality of motion imagery. "The less end-to-end latency we have in the system, the more we can maintain the fidelity of the product, the more we can send motion imagery in real time, accurately, to the decision-maker for the warfighter," said Bruce Bennett, DISA program executive officer of communications.

A major source of latency, the process of encoding, has improved significantly in recent years. "In 1998, encoding took 7 seconds with 15 frames per second. Now it takes 1.2 seconds with 60 frames per second at high definition," noted Bennett. "Now we can get it to the eyeball almost as fast as we can transmit light, although light can only travel so fast. The less distance, the better."

One area the Air Force is looking to improve involves the way in which large motion imagery files are managed and sent across the network. "We'd rather send large, object-oriented files in 1-gigabyte chunks using Global Namespace, which would remain the same no matter where it moves," said Marlin. "Global Namespace provides a consistent identity, a common framework for the intelligence community," he said.

There are many advantages to using Global Namespace (GNS) rather than multitudinous, specific file names. GNS aggregates disparate file information under a virtualized, unified naming structure in which files physically exist in multiple servers but appear in one namespace. The use of GNS can open up storage space and relieve network clogs, thereby reducing latency.

Storing only what is necessary and reducing the unnecessary replication of motion imagery increases efficiency and eliminates another source of degradation. "Replication still takes about the same amount of time as it took to collect, although technology advances are having some success in reducing that time," Marlin noted.

There is also the question of how and whether to store FMV. "Does it reside in 'native' storage areas, and therefore, not have to be replicated? Do you store it or just use what you need based on the current activity and throw the rest away?" he asked. If this question were answered with onboard, automated testing of incoming motion imagery, to determine which VNIIRS level it represented before it entered the network, the process could eliminate a large degree of latency afflicting networks today.

Processing Software

Another player in this field is MotionDSP, which, with funding from In-Q-Tel, has made headway improving the quality of FMV in real time with its HD image processing software, Ikena ISR. MotionDSP adapted the Windows-based Ikena ISR to the OpenCL (Open Computing Language) standard to ensure it would be interoperable and supported by many hardware vendors. The GPU-accelerated version of Ikena ISR allows for the use of graphics cards from AMD and Nvidia to process video faster on embedded PCs in an aerial vehicle, remote laptops or blade servers. A graphics processing unit (GPU) is essentially a big, highly parallel, programmable system.

MotionDSP also offers the vReveal HD video enhancement software, which can use a Compute Unified Device Architecture (CUDA)-enabled GPU. CUDA was created by Nvidia.

Ikena ISR can integrate with IP video systems and connect directly with Microsoft's video chat program, Skype. "Testing in a live ISR environment, FMV analysts have validated that MotionDSP's Ikena ISR software improves video quality by one to two Video NIIRS," said Sean Varah, chief executive officer and founder of MotionDSP.

However, while Ikena ISR technology has been shown to improve quality in a live ISR situation, it is only one of many places where quality of motion imagery is an issue.

Harris and its partners are working in the EIQVP to test and measure motion imagery from end to end. "We don't just use Harris tools. We use different types of tools from other companies, such as Ineoquest, Spirent and Tektronix," noted Delay.

The Ineoquest Geminus G10 platform is a 10Gb, real-time, digital video monitoring, analysis and fault isolation tool that provides deep packet inspection of IP and MPEG and is standards-based. "We insert hardware probes in the network and lock onto a multitude of simultaneous video streams," said Jim Welch, director of product management. "We look at every packet in the streams and cover more than a thousand streams at once. Geminus analyzes each with 30-40 metrics applied to each stream."

The Ineoquest intelligent Video Management System (iVMS) collects statistics from one end of the network to another to provide a comprehensive view of how well video streams entered the network and how well they were delivered to the customer. "As video hops from one relay point to another, each step has potential for impairment. Our devices precisely identify where the impairments occur," said Welch. "iVMS provides advanced warning and advice to take action."

Spirent Communications offers Open Systems Interconnection (OSI) layers 2-7 testing capability on one platform. The seven-layer OSI model defines the standard functions of a communications system in terms of layers. Layer 2 is the data link layer, for example, while Layer 7 is the application layer.

"Modern sessions are extremely complicated, with a composite of multiple protocols working together in a symphony, such as Skype, which is more than five protocols bound together in HTTP," said Chris Chapman, Spirent senior technical marketing engineer. "The reality is that the technology is moving ahead of the ability to test it."

Spirent emulates the worst possible network scenarios to predict problems in advance. "One of the principles of testing is that you test under the worst conditions. We can emulate a solar event that impairs satellite-to-satellite communications. We measure whether quality of service is achievable or not under those and other circumstances with the most stringent QoS policies," said Steve Naylor, senior director, broadband, Spirent Federal Systems. "Satellites can dramatically impact latency, adding milliseconds and seconds to delivery time. All those latencies and packet losses add up," he said.

The Spirent Test Center also measures whether a network can handle IP version 4 (IPv4) and IPv6 at once. "There are bigger address ranges now on IPv6, whereas IPv4 has limited addressable entries (about 4 billion), and each process requires storage space. How those entries are stored in the database and how IPv4 and IPv6 interoperate with devices is different, so we test the forwarding capability of both," said Chapman.

Although Tektronix's Sentry product, a combination of hardware and software, is known for its quality of service (QoS) monitoring and testing, the company has added quality of experience (QoE) measurement in recent years, distinguishing the ability to meet QoS policy objectives from human perceptual judgments of audio and video quality.
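The address-range gap Chapman describes is easy to quantify with the Python standard library; the addresses below come from the reserved documentation ranges and are not test data from Spirent:

```python
import ipaddress

# IPv4 tops out at 2**32 addressable entries, the "about 4 billion"
# ceiling in the quote; IPv6 raises the ceiling to 2**128.
print(2 ** 32)    # 4294967296
print(2 ** 128)

# The stdlib ipaddress module parses both families, the kind of
# dual-stack handling that forwarding tests have to exercise.
v4 = ipaddress.ip_address("192.0.2.7")
v6 = ipaddress.ip_address("2001:db8::7")
print(v4.version, v6.version)
```

The scale difference is why IPv6 forwarding tables and storage behave so differently from IPv4's, and why both paths get tested separately.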

Tektronix's Mixed Signals is a family of products combining QoS and QoE that includes various versions of Sentry, MPEG analyzers, network IP statistics, embedded data, video capture and other functionality. "Test and measurement equipment in the past looked solely at how many packets were dropped. It didn't include QoE. Operators like Comcast care more about QoE than QoS. QoE appears to the viewer, whereas QoS does not," noted Steve Liu, Tektronix vice president of video network monitoring.

Tektronix has experienced heightened market demand for video monitoring resulting from the explosive growth of video content on limited bandwidth in recent years. "This is a scale issue now. We just upgraded from 1 to 3 gigabits per second throughput in November 2012 with the introduction of Sentry 3G. And now we are shooting for 6 gigabits per second with QoE and QoS real-time monitoring," he said.

As with other experts, Liu awaits the arrival of processor upgrades. "We're researching how much more we can add to our throughput with the latest and greatest hardware. We're looking for an entire systems performance increase: more powerful CPUs, multiple processors with faster memory and more hard disk space," he said.

For more information, contact GIF Editor Harrison Donnelly at or search our online archives for related stories at

The Industry Leader in Wide-Area Motion Imagery (WAMI) Data Access

High-Performance. Scalable. Interoperable.

• Supports all WAMI data formats
• Sustained data storage I/O
• Relevant data retrieval
• OGC WAMI open standards
• Transform frame-based to stream-based feeds
• Deploy on any infrastructure

Commercial WAMI courtesy of PV Labs


Visit our DEMO ROOMS at our new Reston Town Center offices:

GIF 11.4 | 7

Data analysis for geospatial intelligence drives toward a cloud-based, integrated environment, partly accessed in real time.
By Cheryl Gerber, GIF Correspondent

Data analytics for geospatial intelligence have reached a crossroads at the convergence of big data and cloud computing. The junction is filled with diverse data sources and emerging technologies driving toward the ultimate destination—a cloud-based, integrated analytic environment, partly accessed in real time. In the meantime, there are many steps along the way, starting with the spreadsheet and shapefile data format still commonly used today, and moving to GIS tools and relational or SQL databases housing structured data, as well as statistical analysis software used to perform increasingly complex queries of data. A crucial step is the growing use of “not only SQL” (NoSQL) databases, providing access to previously unreachable, unstructured data and expanded queries. Separately, GEOINT analysts still rely on electronic light tables (ELTs) and associated tool suites that are not integrated with GIS tools. “Analysts would like to see a merger of these technologies, where ELTs become more like GIS or GIS becomes more like ELTs. While analysts are looking at imagery, they want to be able to visualize, query, process and update spatial data over the top of the imagery, with all of their work being stored in a spatial database accessible by the enterprise,” said a National Geospatial-Intelligence Agency GEOINT data analyst, who asked not to be identified. The ELT image viewing and manipulation system is based on the National Imagery Transmission Format communication standard. Companies such as BAE Systems and Paragon offer ELTs that allow images to be accompanied by text, graphics, video, audio and

background data from other applications, including Internet and database technologies. The science of data analytics is either exploratory, seeking to discover new information, or confirmatory, seeking to confirm or deny existing hypotheses. Predictive analysis determines the likelihood of an occurrence. Large-scale data analytics, or data mining, uses massive data sets, data management and sophisticated software to discover hidden patterns and relationships. Location analytics uncovers patterns by linking data to locations, as in Esri’s popular ArcGIS software, based on relational or larger databases with structured data. Today, however, Esri is developing APIs to reach beyond maps in apps for ways to embed analytics with spatial references behind interactive mapping.
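Location analytics of the kind described above can be reduced to a tiny sketch: bin point events into grid cells and surface the densest cell. The coordinates below are invented for illustration; real inputs would come from a GIS layer.

```python
from collections import Counter

def grid_cell(lat, lon, cell_deg=1.0):
    """Snap a coordinate to its containing grid cell (floor to cell origin)."""
    return (int(lat // cell_deg), int(lon // cell_deg))

# Invented event reports as (lat, lon) pairs.
events = [(38.9, -76.9), (38.8, -76.95), (38.7, -76.8), (34.05, -118.25)]

counts = Counter(grid_cell(lat, lon) for lat, lon in events)
hotspot, n = counts.most_common(1)[0]
print(hotspot, n)  # (38, -77) 3
```

Exploratory analysis of this sort discovers where activity clusters; confirmatory analysis would instead test a hypothesis about a specific cell.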

Hadoop Convergence

The convergence of cloud computing and big data has opened analytics to an ocean of unstructured data in Facebook, Twitter, email and text messages, among others. At the center of the convergence is Apache Hadoop. Developed in the Java programming language about a decade ago, Hadoop is now omnipresent. Every major technology company now either operates on or offers some iteration of Hadoop. Intelligence operations use Hadoop for near real-time, tactical information in battlefield missions. Hadoop expansion and integration has kept companies like Data Tactics Corp. and EMC busy building technology. One example is a cloud-based system that combines NetApp data storage

technology, called NetApp Open Solution for Hadoop (NOSH), and Data Tactics’ software engineering expertise to bring advanced analytics to GEOINT. Data Tactics engineers developed tools for data extraction, link analysis and sophisticated search with geospatial capabilities. Link analysis is a type of data analytics used to define the relationships between network nodes, with uses in such areas as counterterrorism, computer security and fraud detection. The NetApp NOSH and Data Tactics solution is used for video analytics and a video library at NGA. “NetApp provided the hardware solution, and we are doing the data space architecture, data ingest and data management to facilitate the normalization of structured, semi-structured and unstructured data. The combination of NetApp NOSH and Data Tactics’ Big Data Engine provides a turnkey hardware and software cloud computing solution, enabling the analyst to ingest data, run analytics and visualize the results,” said Lee Shabe, vice president, Data Tactics. A key factor in achieving real-time analytics is to bring data analytics out of the data center and into the user’s local environment. EMC’s Affinity NG platform provides this capability on handheld devices, laptops and desktops running Intel and ARM processors, under Windows, OS X, various Linux distributions, iOS and Android. Affinity NG captures, processes and stores structured and unstructured data at the edge, including streams of data from sensors, video, audio, emails and other types of data. Affinity NG reads data directly from sensors using various network protocols and processes the data using streaming operators. It detects complex events using stored rules. GEOINT analysts would like to merge more than ELTs and GIS.
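The streaming-operator pattern described above can be sketched generically. This is not Affinity NG's API; the rule and the sensor readings are invented. The operator scans values as they arrive and flags a rule-based complex event without buffering the whole stream.

```python
def threshold_events(readings, limit, run=3):
    """Yield an event whenever `run` consecutive readings exceed `limit`."""
    streak = 0
    for i, value in enumerate(readings):
        streak = streak + 1 if value > limit else 0
        if streak == run:
            yield ("sustained_exceedance", i)

# Invented sensor stream, processed one value at a time, as at the edge.
stream = [1, 2, 9, 9, 9, 2, 9, 9, 9, 9]
print(list(threshold_events(stream, limit=5)))
```

Because the operator keeps only a counter, it runs in constant memory, which is what makes edge processing on handhelds and laptops plausible.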
While spreadsheets and SQL databases contain plenty of structured data, the untapped expanse of unstructured data could yield valuable GEOINT—if analysts knew where to search for it, and if it were filtered through customizable searches before arrival and made available as shared services in cloud infrastructures. Integrating SQL with NoSQL databases, structured with unstructured data, and intelligence behind interactive maps significantly expands the access to data sources and the possibilities for geospatial analytics. Open Database Connectivity (ODBC) or Java Database Connectivity (JDBC) standard APIs are essential for linking SQL and NoSQL databases, as well as for answering the multi-dimensional analytical queries in online analytical processing. “The main goal of analytics is to pull multiple sources together, to layer over the top multiple systems that traditionally have not been put together with an octopus of APIs that have tentacles into all these different sources of intelligence, for the freshest data possible,” said Rich Campbell, chief technologist, EMC Federal.

Signals and Noise

One of the most persistent problems with big-data analytics is the need to improve the signal-to-noise ratio of useful to useless data. “GEOINT analysts either don’t have access to data or have too much data,” noted Shabe. “They need to find signal from the noise.” A prolific source of big data for geospatial analytics is high-definition full motion video (FMV), collected en masse by

sensors on drones and satellites. To ease access to large volumes of data, Pixia Corp. provides standards-based technology that filters signal from noise. “Big-data analytics is an extremely complex process with many layers and hierarchies,” said Rahul Thakkar, Pixia vice president of technology. “When an analytics algorithm requests data in a format of its choice, we are able to saturate it with the appropriate data and only that much data—nothing more.” Pixia’s HiperWatch offers software-as-a-service applications and standardized access to big FMV data using RESTful web services. Representational State Transfer (REST), a software architecture for distributed systems, is the predominant web API model on the Internet. It was designed in parallel with the Hypertext Transfer Protocol. Pixia HiperWatch transforms feeds into MPEG-compliant transport streams and sends only the relevant data from disks to applications, thereby providing near real-time access to data during capture and analysis. Pixia wrote the OGC specification designating a set of web services for the dissemination of Wide Area Motion Imagery products. Pixia’s HiperStare is a traditional big-data wide area surveillance solution within a cloud-based architecture. For nearly a decade, statistical analysis software such as that from SAS has been operating over a bridge to ArcGIS. To expand statistical analysis, Esri built connectors to MicroStrategy, another prominent business intelligence software


provider, to integrate both server and cloud-based map functionality and visualization with the MicroStrategy platform. Esri is also developing 3-D visualization to conduct spatial data mining with big, unstructured NoSQL databases. One of Hadoop's most popular aspects is that data need not be formatted on ingest, as it must be for SQL databases. Based on parallel processing and used to store and analyze large volumes of structured and unstructured data, Hadoop pairs the Hadoop Distributed File System (HDFS) with the MapReduce model, pioneered by Google, to divide queries into multiple parts, process them at nodes, then aggregate the results for answers. The Hadoop framework code cannot be modified, but the environment is highly extensible to allow for functionally specific application development. Hadoop, HBase, Bigtable and Amazon’s SimpleDB fall into the class of database management systems called NoSQL, which scale with web requirements and do not use standard SQL to query data or follow other rules of a relational DBMS. HBase is an open-source, non-relational distributed database written in Java and based on Google’s Bigtable, a compressed, high-performance, proprietary database built on the Google File System. NoSQL databases are designed to be expansively distributed with large-scale, structured and unstructured data, using various Internet technologies. As a result, NoSQL databases have spread like wildfire; there are now more than 150 NoSQL databases of various types. Linking SQL with NoSQL and advancing Hadoop is critical to achieving an integrated analytic environment. As a means to integrated analytics, EMC Greenplum has been busy improving upon Hadoop and expanding its capabilities. The EMC Greenplum Pivotal HD distribution includes HDFS, MapReduce, Pig, Hive, Mahout and other tools, including visualization extensions and Isilon support. Pig, whose query language is called Pig Latin, is a Hadoop-based data processing platform developed at Yahoo.
Hive is a Hadoop-based data warehouse developed by Facebook that allows users to write queries in SQL. Mahout is a data mining library, while Isilon is scale-out network attached storage with HDFS data. EMC Greenplum also developed a massively parallel processing relational database that sits atop HDFS, called Hawq. The data is written directly to HDFS and not to disk, making this not a traditional relational database, but rather one that works with NoSQL. Although MapReduce is fully supported, Hawq uses its own engine for managing data. Hawq is ODBC- and JDBC-compliant. What is unusual about Hawq is that it is an SQL database with a Hadoop (or NoSQL) distribution.
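The divide-process-aggregate flow that MapReduce performs across cluster nodes can be illustrated in miniature, single-process Python. This is a toy word count, not Hadoop itself; the shards stand in for data blocks distributed across HDFS nodes.

```python
from collections import defaultdict
from itertools import chain

def map_phase(shard):
    # Each node emits (key, 1) pairs from its local shard of the data.
    return [(word, 1) for word in shard.split()]

def reduce_phase(pairs):
    # The framework groups pairs by key; reducers aggregate final counts.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

shards = ["imagery terrain imagery", "terrain video", "imagery"]
mapped = chain.from_iterable(map_phase(s) for s in shards)
print(reduce_phase(mapped))  # {'imagery': 3, 'terrain': 2, 'video': 1}
```

In a real cluster the map calls run in parallel where the data lives, and only the much smaller intermediate pairs move over the network.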

Federated Queries

The EMC Greenplum Unified Analytics Platform provides big-data analytics by fusing the co-processing of structured and unstructured data with a high-performance productivity engine. EMC utilizes federated query capabilities, which allow collaborative analytic development platforms, business intelligence tools and analytic applications to filter and join data from multiple sources. A combination of connectors, content and link indexing, and query optimization allows federated queries to reach heterogeneous data sources. Further, federation complements extract, transform and load (ETL) tools by offering near real-time access to distributed sources. ETL is the process of delivering data from its source to a data warehouse.
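A federated query can be sketched minimally: join rows from a relational store with a document-style source at query time, rather than first ETL-ing both into a warehouse. The schema and data here are invented, and sqlite3 stands in for the SQL side.

```python
import sqlite3

# Relational side: a structured table of facilities.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE facility (id INTEGER, name TEXT)")
db.executemany("INSERT INTO facility VALUES (?, ?)",
               [(1, "Depot A"), (2, "Airfield B")])

# Document ("NoSQL") side: schemaless records keyed by facility id.
documents = {1: {"reports": 4}, 2: {"reports": 9}}

# Federation layer: join the two sources in the application at query time.
joined = {name: documents[fid]["reports"]
          for fid, name in db.execute("SELECT id, name FROM facility")}
print(joined)  # {'Depot A': 4, 'Airfield B': 9}
```

Production federated engines push filters down to each source and optimize the join order, but the shape of the operation is the same.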

Data Tactics also works with EMC Greenplum to provide architecture and engineering support to the defense and intelligence communities. Both EMC and Data Tactics work with virtualization provider VMware, since virtualization plays a key role in the development of big-data analytics for GEOINT. “We were the first to virtualize Hadoop deployments, providing recommendations to VMware several years ago. Currently we are actively working on the integration of Nicira into big-data deployments to improve the performance of enterprise cloud computing platforms,” said Shabe. VMware acquired Nicira last year to integrate its software-defined networking technology into VMware’s vCloud suite. Since then, VMware has unveiled plans for a hybrid cloud service or infrastructure-as-a-service offering to compete with Amazon Web Services (AWS), IBM Enterprise SmartCloud and HP Cloud. Data Tactics is now working with VMware to integrate Nicira technology for the new product, called VMware NSX. EMC is also working with VMware on big-data cloud deployments. EMC’s G2 solution integrates purpose-optimized engines on one platform with ingestion into GemFire, VMware’s in-memory data grid, using many different standard, web-based protocols. GemFire is an object-based NoSQL database, while SQLFire is an SQL engine that runs in situ with GemFire in order to leverage GemFire’s data management features. “We’re doing innovative things with VMware and Hadoop, leveraging the compute to do initial analytics that refine the data set significantly,” said Campbell. “The ability to federate analytics and do real-time analytics at the collection point allows data to stay in place with a smaller data set. Once the initial query is done, the results can be pushed to a larger or centralized location for deeper analytics if needed.” Not all SQL databases are spatially aware, limiting their applicability to GEOINT analytics.
If data is not indexed for spatial queries, then functionality must be added to process spatial data types. “Many new data technologies are not spatially enabled yet. Companies like Esri are developing drivers to go from traditional relational GIS data to NoSQL, Big Table, XML and all of the newer data technologies emerging, but it’s not all in place yet,” the NGA GEOINT analyst said. The ultimate capability would be linked SQL and NoSQL databases that are spatially enabled. Software engineers are working on it. “We have built apps and plug-ins that take spatially aware data from HDFS and process and analyze the data. We make those derived data sets available through Open Geospatial Consortium web service specifications such as Web Map Service and Web Feature Service, to name two,” said Richard Heimann, lead analytics engineer for Data Tactics. Much of the work to spatially enable big-data analytics depends on the algorithms. “We’ve been tackling data analytics with a spatial component to create algorithms appropriate for the data. Linking time, space and networks, the writing of the algorithms needs to be well specified,” said Shabe.
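The OGC web services Heimann names are, at bottom, parameterized HTTP requests. A sketch of composing a WMS GetMap URL: the server address and layer name are hypothetical, while the parameter names follow the OGC WMS 1.3.0 specification.

```python
from urllib.parse import urlencode

# Required GetMap parameters per WMS 1.3.0; layer and server are invented.
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "imagery",
    "CRS": "EPSG:4326",
    "BBOX": "34.0,-119.0,35.0,-118.0",  # lat/lon axis order under 1.3.0
    "WIDTH": "512",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}
url = "https://wms.example.com/ows?" + urlencode(params)
print(url)
```

Any client that can build such a URL can pull map tiles from any compliant server, which is exactly the interoperability the open standards are meant to buy.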

Analysis Toolkit

Esri is getting ready to release GIS Tools for Hadoop, an open source toolkit for spatial big-data analytics that provides several libraries, a spokesman confirmed.

The toolkit allows users to perform spatial analysis on spatial data with a run-filter function that aggregates operations on billions of spatial data records inside Hadoop, using spatial criteria. New areas can be defined and represented as polygons with the ability to run point-in-polygon analysis. Users can visualize analysis results on a map with styling capabilities and a rich set of maps. Maps can be integrated in reports or published as map applications online. The libraries include the Esri Geometry API for Java, a generic geometry library that can be used to extend Hadoop with vector geometry types and operations. The library allows developers to build MapReduce applications for spatial data. The Esri Spatial Framework for Hadoop extends Hive and is based on the Esri Geometry API to enable Hive Query Language users to leverage a set of analytic functions and geometry types. ArcGIS Geoprocessing tools for Hadoop contain functionality based on the Esri Geometry API and Spatial Framework for Hadoop. Developers can download the tools’ source code and customize them, and can create new tools to move spatial data and execute a predefined workflow within Hadoop. Esri also has been testing the AWS Elastic MapReduce product by deploying prototypes on the Amazon cloud. Working with data partner Gnip, which provides social media data to the enterprise, Esri has built geospatial analysis of tweets generated from Twitter and collected by Gnip. Examples of social media monitoring are on public information maps, where tweets are captured and then displayed across relevant geographies.
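The point-in-polygon test underpinning that analysis is a classic ray-casting algorithm; a minimal pure-Python version follows, as a textbook sketch rather than Esri's implementation.

```python
def point_in_polygon(x, y, polygon):
    """Ray casting: count crossings of an eastward ray from (x, y)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        crosses = (y1 > y) != (y2 > y)  # edge spans the ray's latitude
        if crosses and x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
            inside = not inside  # odd number of crossings means inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon(2, 2, square))  # True
print(point_in_polygon(5, 2, square))  # False
```

Run per record in a map phase, a test like this is what lets billions of point observations be aggregated by analyst-defined areas.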

SAS recently updated its Enterprise Data Integration Server with Hadoop support, along with existing support for dozens of databases from Oracle, IBM DB2, Microsoft SQL Server, EMC Greenplum and Teradata, including Teradata Aster, among others. SAS support for Hadoop and big data includes features such as native Hadoop security integrated with SAS data security provisions, and brings visual analytics exploration, text mining and other analytics to Hadoop data, which also can be federated with other data sources. Users can also embed a federated query in a data management job flow. Also this spring, Teradata released Hadoop integration in its unified architecture, forming a platform that can ingest structured and unstructured data. Teradata has had geospatial functionality and big-data analytics in its database, as well as a partnership with Esri, for years. “We are seeing organizations expand their business intelligence analytics to include big analytics along with location intelligence to analyze transactions and interactions along with location data from mobile devices, baggage scans and other devices. This data can be routed into a data warehouse for real-time responses or first streamed into a data discovery platform for deep analysis and then routed on to the data warehouse for further analysis,” said Arlene Zaima, director of Integrated Analytics, Teradata. O For more information, contact GIF Editor Harrison Donnelly at or search our online archives for related stories at


Army battle management system’s switch to commercial software expands access to maps and geospatial imagery. By Harrison Donnelly GIF Editor Users of the Army’s most popular battle command system will soon be able to quickly update maps with data and imagery from a wide range of information sources, thanks to a recent switch to a commercial mapping capability based on popular GIS software. The Command Post of the Future (CPOF) system had been relying on proprietary software and periodically updated map and imagery libraries to provide field commanders with terrain and other visual information needed for operations. But the next version of CPOF to be fielded by the Project Manager Mission Command (PM MC), part of the Army Program Executive Office Command, Control and Communications-Tactical, will enhance situational awareness by offering access to custom-developed products tailored to conditions. CPOF is one of the most popular systems used in the C4ISR community, noted Lisa Bellamy, deputy product manager for tactical mission command within the PM MC. Its mapping and imagery capabilities, based on a traditional thick-client architecture, have been successfully fielded for seven years, keeping commanders abreast of the locations of their own forces, adversaries and needed resources. “But when you have a program for seven years, you learn some lessons over time,” Bellamy observed. “We took a look and said we might want to integrate more sources and formats of geographical data, which would make more data available for the warfighter to choose from. “The second improvement we were able to make was shrinking the amount of space that the data consumed. The less stuff you have on your hard drive, the faster you can serve up maps and other information,” she added. To provide the back-end functioning for the new system, Army developers turned to ArcEngine, a software development kit from Esri that uses the same underlying base code as the company’s ArcGIS Desktop product. 
The software was acquired from the National Geospatial-Intelligence Agency under the Commercial Joint Mapping Toolkit (CJMTK), the omnibus geospatial licensing program prime-sponsored by Northrop Grumman.

“That allowed us to leverage the familiarity that the geospatial community has, so that it was easier for the geospatial engineers who are resident with the unit to work with it, and to support and maintain it in the field. Additionally, it provided updated maps for the warfighter. The enemy is changing all the time, and that changes what is important in the commander’s map imagery. In one site, it may be infrastructure, particularly in an urban terrain, while in a mountainous terrain, you might need to know how high the mountains were or how to get around or over them,” Bellamy said.

For the user interface, designers selected World Wind 3-D, an open-source virtual globe developed by NASA. Key to the approach was an emphasis on open standards, which was especially important given the size and diversity of the C4ISR community to which it is being fielded. “So we had to make sure we were taking advantage of capabilities,” she explained. “We now can pull in many data sources, going from being able to pull in one type of map set in the beginning to now having more than 112 formats that we can accept.”

Another challenge was to modify training packages to ensure that the changes were folded in gracefully and with minimal interruption to the soldier in the field. In the end, Bellamy observed, “It actually caused some advantages, because we were able to increase the commonality, and the geospatial engineer teams were more integrated into the backbone CPOF. So it got us further along toward a shared geospatial foundation.”

Standards Advantage

Use of the software from the CJMTK toolkit had a financial benefit for the Army program, since the licensing fee was covered under that NGA-funded program. Even so, “probably the primary benefit that they saw was the ability to support lots of industry standard formats and pull in lots of different information and data sources that they were not previously able to do,” observed Brian T. Lehman, C2 team lead for Esri.

“In previous incarnations of the system, they would build a map set in CONUS and then field it to all their platforms,” Lehman said. “In many cases, that map set would be refreshed on about an 18-month cycle. But commanders want to see the latest maps and imagery that they have at their disposal locally through other systems and their own resident imagery and terrain analysts. That was a disconnect between the CPOF system and some of the other data sources that were available locally.

“The system was locked down to maintain a quality and reliable map. But that’s a tradeoff to having current information available,” he continued. “So what they’ve done is to provide that capability to the user community. When they have resident analysts at their echelon, they can take information from those resources and incorporate it into the system.”

Another benefit is the ability to tie into external servers using the Open Geospatial Consortium’s Web Map Service standard, which easily enables links with other information sources such as NGA. Lehman also highlighted the role of standards in enabling interoperability between components such as ArcEngine and World Wind. “They use our product to read the data sources that they have, to compile that into a map, and then pass the map off to the visualization environment. It’s a good example of a multi-vendor environment, and the key is the use of open standards. You can choose the visualization client that you want, and as long as that client supports open standards, we are the backend map infrastructure supporting it,” he said.

Looking ahead, CPOF developers are focusing on using the Internet to expand access to imagery data. Command Post Web, a web version of CPOF, will allow CPOF users to pull feeds from other map-based mission command systems such as the Distributed Common Ground System-Army. “Flexible access to geospatially based situational awareness is key. For us, what we’d like to do is to extend CPOF beyond our traditional clients and into CommandWeb,” Bellamy said. “We really do want to extend that capability via CommandWeb, so we can offer the warfighter renewed flexibility. All of this is happening because a dedicated team of military, civilian and contractor employees have come together and realized the technical, programmatic and cost-efficiencies for the taxpayer.” O

For more information, contact GIF Editor Harrison Donnelly at or search our online archives for related stories at





INDUSTRY RASTER

GEOINT Software Adds Intuitive User Interface

SeeMe Research Seeks Warfighter-Support Satellites

Overwatch, an operating unit of Textron Systems, a Textron Inc. company, has announced the latest release of its ELT GEOINT software, featuring a new intuitive and customizable user interface. ELT, which fuses image processing functionality with geographic information systems support, is used by analysts for military intelligence, mission planning and disaster management. The contemporary, ribbon-based interface in the new version of ELT enables users to customize the screen to better visualize and tailor the software’s wide variety of image viewing and analysis applications to their specific mission needs. The software interface offers large, logically grouped high-resolution icons and an improved workflow to reduce the time needed to process commands and execute tasks. Kevin Opitz;

ATK has received a contract to support the Defense Advanced Research Projects Agency for the Space Enabled Effects for Military Engagements (SeeMe) program. SeeMe seeks to develop enabling technologies to provide reliable surveillance data to the warfighter in the field, using small, low-cost satellites that are launched quickly to support the speed of military operations. ATK intends to transition advanced, imagery-processing algorithms used on UAVs to space and take advantage of the resulting higher-power processing to save size, weight and power, as well as cost, on satellites. ATK has partnered with Logos Technologies and University of Southern California/Information Sciences Institute on the study contract. Vicki Cox;

GIS Platform to Add Social Media Analysis Tools

Esri and Geofeedia have announced plans to extend the ArcGIS platform with Geofeedia’s innovative social media tools. Public safety professionals will be able to take advantage of these capabilities to accurately integrate, monitor, analyze and visualize live emergency data as events unfold. Deploying assets and personnel, understanding events on the ground, adjusting response on the fly, and post-event monitoring are all improved using social media combined with location analytics. The real-time data integration, searching and streaming will work across multiple social media platforms including Twitter, Instagram, Flickr, YouTube and Picasa. Geo-located tweets, photos and videos can be viewed within the context of digital imagery, street networks, topography and community base maps. The social data can be mashed up with other information such as public safety assets, city infrastructure, utility networks, hazardous materials, demographic data and more. Additional dynamic data including weather, automated vehicle location, GPS and traffic video camera feeds can be combined with social and map data. In addition to public safety, professionals in government, national security, health care and insurance will be able to extend the ArcGIS platform by adding intelligence about social conversations. Jesse Theodore;


Enhanced Software Products Leverage Geospatial Data

Thermopylae Sciences & Technology has announced the next generation of its iSpatial, Ubiquity and iHarvest software products. iSpatial is a web-based, collaborative geospatial tool that leverages Google Earth and Maps in a flexible, task-based approach to solving common problems. New features include an improved user interface, internationalization to swap languages on the fly, an enhanced ingest service that increases the ease and variety of data that can be pulled into iSpatial with a single click, search improvements, and temporal sliders to view data over time. Ubiquity is an extensible, web-based platform for creating dynamic, customized and geocentric mobile applications. New features

include new maps, forms and text widgets, a new backend architecture to improve performance, an enhanced widget software development kit (SDK) for users to create their own widgets, an HTML5 hybrid SDK for users not familiar with mobile development, and new user interfaces. iHarvest is a standards-based enterprise analytic service that organizes, analyzes and reports activities to inform critical decisions. New features include Dragon Algorithm enhancements that greatly improve performance and capabilities, new APIs to improve flexibility and the overall user experience, and adapters to connect with Google Apps elements. Harris Eisenberg;

Compiled by KMI Media Group staff

Partnership Couples Geospatial, Analysis Solutions

Wiser Co. and TerraGo have announced a strategic partnership that will allow for the coupling of TerraGo’s location intelligence and geospatial collaboration solutions and Wiser’s analysis and visualization platform in order to advance location intelligence and foster situational awareness across a broad range of industries, including defense and intelligence. The agreement paves the way for Wiser to build and deliver location intelligence solutions leveraging TerraGo’s software platform, including GeoXray. GeoXray automates the process of discovering, analyzing and geospatially visualizing big data from a variety of internal and external sources, including news feeds, blog posts, social media and maps,

and allows analysts to filter the results by place, time and topic. Wiser will develop a solution that comprises GeoXray coupled with its GeoUnity application, which incorporates heat maps and additional analytical and visualization capabilities to aid in discovery of correlations between data to enhance analyst productivity and efficiency. The agreement also enables Wiser to support, from within GeoUnity, the production of intelligent, portable, interactive TerraGo GeoPDF maps and imagery that enable geospatial collaboration and field data collection by users across the enterprise. Renee Wagner;

Geospatial Display Calibration Software Ensures Image Accuracy QUBYX has launched a new version of its geospatial display calibration software. PerfectEPD, developed to adjust monitors to the requirements of the National Geospatial-Intelligence Agency’s display performance standard, ensures accurate image reproduction and correct readings in geospatial visualization. For military and intelligence users of visualization, the accuracy of the image is extremely important. According to NGA, visualization displays must be adjustable, so that black level, luminance and color temperature of the white point can be adjusted to the desired values. Luminance response of such monitors must match the equal probability of detection (EPD) curve. The PerfectEPD display calibration tool by QUBYX fulfills these requirements, calibrating color, luminance and black level of any display to NGA requirements. PerfectEPD-calibrated monitors allow more exact readings and retain their imaging quality much longer than non-calibrated LCDs.

Unit Integrates Mobile Devices with Tactical Radios PAR Government Systems Corp. is now shipping GvTether, a lightweight rugged wearable computer designed to enhance situational awareness at the tactical edge. GvTether hardware and software autonomously connect a tactical network radio and an Android smart device, enabling real-time routing of geospatial data, full motion video and geo-based messages. The device runs the Linux operating system and readily allows for third-party developed applications. The software deployed with the GvTether is Android version agnostic and supports both rooted and un-rooted phones. Specifically, the GvTether supports real-time two-way routing of textual data, such as Cursor-on-Target messages, and streaming of geospatial data, such as raster imagery or video, from military radios to smartphones or tablets that are not 3G/4G or WiFi activated. Ed Bohling;
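Cursor-on-Target (CoT), mentioned above, is a compact XML schema for passing "what, where, when" reports between systems such as radios and Android devices. As a rough illustration of what such a message looks like (the helper function and field values below are hypothetical examples, not PAR's API), a minimal CoT event can be composed with the Python standard library:

```python
# Minimal sketch of a Cursor-on-Target (CoT) event message.
# The helper and its field values are illustrative, not a vendor API.
import datetime
import xml.etree.ElementTree as ET

def make_cot_event(uid, lat, lon, cot_type="a-f-G", stale_secs=60):
    """Build a minimal CoT <event> element with a <point> child."""
    now = datetime.datetime.now(datetime.timezone.utc)
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    event = ET.Element("event", {
        "version": "2.0",
        "uid": uid,                      # unique ID for the tracked object
        "type": cot_type,                # CoT type code (friendly ground unit)
        "time": now.strftime(fmt),       # when the event was generated
        "start": now.strftime(fmt),      # when the event becomes valid
        "stale": (now + datetime.timedelta(seconds=stale_secs)).strftime(fmt),
        "how": "m-g",                    # how the position was derived (GPS)
    })
    ET.SubElement(event, "point", {
        "lat": f"{lat:.6f}", "lon": f"{lon:.6f}",
        "hae": "0.0",                    # height above ellipsoid, meters
        "ce": "10.0", "le": "10.0",      # circular/linear error estimates, meters
    })
    return ET.tostring(event, encoding="unicode")

xml_msg = make_cot_event("UNIT-42", 34.52, 69.18)
print(xml_msg)
```

In practice the type and how codes, along with the error estimates, would come from the collecting sensor rather than being hard-coded.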

NGA Contract Supports Future Solutions Research Booz Allen Hamilton has received a $315 million single-award contract to support the National Geospatial-Intelligence Agency’s InnoVision Directorate. Booz Allen will provide specialized scientific and technical research and development subject matter expertise to all facets of the InnoVision Future Solutions Program (IFSP) through November 2017. IFSP provides support to perform path-breaking scientific research and transitions innovative concepts and capabilities required to solve the most complex military and intelligence problems. Additionally, IFSP explores emerging scientific capabilities and opportunities such as high-performance computing, big data, and surveillance in high-threat environments. As the prime on this award, Booz Allen will lead a complex team of partners on this work, including 10 small businesses. Carrie Lake;





GEOINT Deployer

Q&A

Providing Geospatial Expertise and Reachback Capabilities Joseph F. Fontanella Director Army Geospatial Center Army GIO Dr. Joseph F. Fontanella was selected to the Senior Executive Service in January 2011 to serve as director of the Army Geospatial Center (AGC). The AGC is the Army’s knowledge center for geospatial expertise and provides geospatial information reachback capability to field units. As director, Fontanella is responsible for supporting the operations, intelligence, acquisition, research and development, and modeling and simulation communities with geospatial information. The center focuses on the development, exploitation, production and distribution of topographic, geodetic and geospatial information tools and services for the Army and other Department of Defense and national programs. Fontanella also serves as the Army’s geospatial information officer (GIO), with responsibility for collecting and validating geospatial requirements, formulating geospatial policy, setting priorities and securing resources supporting the Army geospatial enterprise, and synchronizing geospatial solutions at both HQDA and Secretariat levels of Army governance. Fontanella was interviewed by GIF Editor Harrison Donnelly. Q: What are your current top priorities as Army geospatial information officer and director of the AGC? A: As the Army GIO, my top priority remains developing and implementing an Army Geospatial Enterprise [AGE] that allows for horizontal and vertical interoperability and sharing of geospatial information, from the national level to the last tactical edge. One of my other areas of focus is working with my counterparts at the National Geospatial-Intelligence Agency and sister services to ensure that commanders and warfighters continue to receive the accurate, timely and relevant geospatial information, products and services that they need to conduct their missions. Over the past 12 months we have built tremendous momentum to implement the AGE concept. 
This momentum will lead to the elimination of non-interoperable and proprietary geospatial data formats, data models and viewers, which decrease operational effectiveness and readiness and increase costs. Through partnerships with the assistant secretary of the Army for acquisition, logistics and technology, the AGC is providing direct support to program executive offices and program managers within the construct of the Common Operating Environment initiative. Through this initiative, soldiers using a handheld device will have access to the same geospatial information a soldier at a command post receives. As the director of the AGC, my top priority is simple, yet critical. We must continue to do all we can to provide critical geospatial information, domain expertise, training and reachback capabilities to

our soldiers and warfighters deployed across the globe. Our agency fields a robust set of capabilities—not just quick response capabilities like our Buckeye system or the High Altitude LiDAR Operations Experiment, but also a number of Acquisition Category Level III programs of record. Finally, we have a number of research, development, test and evaluation activities that focus on filling AGE requirements and capability gaps. Within the construct of the AGE, which we see as the organizing principle for the AGC, we have aligned those efforts to match up with the roadmap pieces that go along with the enterprise, such as map data and standards, visualization and analysis tools and geospatial applications. Additionally, we are performing value engineering studies that will shape the acquisition future as the enterprise matures. However, as director my top priority remains support to those who are engaged in combat, not just in Afghanistan but around the world, including our special operations community as it is engaged in phase zero or pre-phase zero activities, such as nation building and strategic engagement. Q: Looking beyond the debates over current federal budgets, what is your long-term strategy for AGC in a fiscally constrained future? A: For us, the whole enterprise framework is an efficiency-gaining activity, especially as you break down stovepipes, eliminate proprietary solutions and start sharing data based on a common set of standards and specifications. However, there is no denying that the current uncertain fiscal environment has put a greater emphasis

on becoming pervasively interoperable both internally and externally. My focus has been and will continue to be multi-layered. First and foremost, we will continue to find innovative ways to support the combatant commands, decision makers and soldiers around the globe. Secondly, the fiscal environment will force us to focus on producing information in a common set of standards, schemas and specifications to promote data sharing. I also believe there will be a greater focus on strengthening, or entering into, formal exchange agreements with our national, joint and coalition partners to optimize our resources. This includes synchronizing efficiency-gaining activities at the national level with those within the services to ensure we don’t create an unacceptable gap in capabilities. Lastly, as the demand for and use of geospatial technology, data and expertise continue to increase, we will continue to incorporate geospatial capabilities into other areas of defense and government: humanitarian assistance, disaster relief, and so on. Q: What do you see as the most pressing unfulfilled needs of warfighters for geospatial information, and what are some of your key initiatives for meeting those needs? A: Quite honestly, the GEOINT community is ever-evolving and is currently undergoing revolutionary change. While hardcopy mapping products remain a critical requirement for current operations, more detailed, interactive digital geospatial data and enterprise technology are becoming essential to today’s warfighters. As the GIO, it’s important for me and my staff to not only understand these trends and how they may affect our community’s support to Army operations, but also how we can best leverage them to ensure that our soldiers and warfighters receive the most current capabilities available to conduct missions efficiently, effectively and, most importantly, without the loss of lives. 
To help us better understand and meet critical requirements, we have a number of partnerships with key stakeholders, including Army Training and Doctrine Command [TRADOC], Army Forces Command, the Army Engineer School and the Army Intelligence Center of Excellence, to capture, prioritize and document critical requirements. The Army most recently sent a consolidated statement of requirements letter to the NGA director, in March. This document stated the Army’s geospatial information and service requirements for training, equipping forces and supporting systems. It was derived using information obtained from a comprehensive data call among TRADOC capability managers, Army service component commands and the Army acquisition community. In addition to this activity, we have constant engagements with the entire force, at various echelons, to ensure we are filling critical requirements. If you look at how the Army has functioned historically, we typically plan at a smaller scale than we conduct operations. For example, during Operation Iraqi Freedom, planning occurred at the 1:250,000 scale and execution occurred at the 1:100,000 or 1:50,000 scale. If you’re doing combined arms maneuvers at that scale, it works pretty well. But when operating in a wide area surveillance environment, as we’ve been in Afghanistan and Iraq for a long time, commanders have to plan at larger scales [1:50,000] and execute at an urban or human scale [1:10,000 or larger]. This shift has led to an increased requirement for high-fidelity, three-dimensional terrain information, urban feature data and new types of information, such as socio-cultural data.

Q: How is AGC working to meet growing demands for mobile GEOINT? A: Certainly, handheld technology, mobile applications and other user-based technologies have become more prominent. This has turned what used to be traditional consumers or users into producers, and in some cases analysts. We’re not completely there yet, but this will truly enable the “every soldier as a sensor” concept by allowing those closest to an object or event to collect real-time geospatial information with unprecedented levels of accuracy and timeliness. Our Geospatial Research and Engineering Directorate has developed strong partnerships with industry, the Defense Advanced Research Projects Agency and the NGA’s InnoVision Directorate to ensure we provide innovative tools and applications that expand query, mining, assembly and integration capabilities, creating smarter users. These capabilities will allow users to create user-defined products tailored to their mission. Aside from tools and applications, we are excited about some recent efforts that optimize storage and dissemination of large data sets for users with limited or no bandwidth and constrained storage. Ultimately, our Army will become more agile as this technology becomes available at lower echelons. The work we are doing will provide capabilities and decrease resources spent on the discovery of information, allowing for an increase of resources spent on exploitation and analysis while simultaneously enabling dissemination of geospatial information to the tactical edge regardless of network connectivity or bandwidth. One of the efforts we’ve been working on is looking at concepts of operations, tactics, techniques and procedures, and technology advances that optimize storage on low-storage devices. When you use a navigation app on your cell phone, the data likely isn’t resident on your device—you’re connecting to a web service that provides you the data through the app. 
It’s very light and works well in an environment where there is adequate bandwidth, good connectivity or 3G/4G networks. However, these types of apps and web services will not function in austere environments that don’t have bandwidth, connectivity or networks. Army users have to have the apps and data resident on the device. Our ability to store data on handhelds becomes a larger problem when you have high volumes of dense data. We’ve had great success taking the millions of JPEGs that might need to be assembled for an image scene and condensing them down into something that will fit on a small secure digital card that would go into a mobile device. That’s really important, and it gets to some of the large-data issues we’ve got to address in an Army tactical environment. Q: What is the significance of having a standard and shareable geospatial foundation, and what are some of the challenges involved in creating it? A: When you go into an Army command post, it’s common to see multiple mission command systems with various map backgrounds, but no common operating picture [COP]. There has to be some unity of synchronization, coordination and consistency in how those things are all working together. Having a standard and shareable geospatial foundation [SSGF] is critical to having a consistent, coordinated and synchronized COP, enabling all elements of the force to operate on the same map and support real-time coordination and collaboration. By leveraging


common standards, architecture and specifications, our force will be able to leverage national, commercial and joint, interagency, intergovernmental and multinational geospatial data sources. This will create an updateable, common map foundation and provide an accurate display to be used for maneuver, situation awareness and precision joint targeting, at locations across all echelons, irrespective of operating environment. The SSGF will enable an accurate geo-referenced display to support all warfighting functions with data tailored to a unit's mission, task and purpose, and enable visualization and dissemination of tactical plans via mission orders and graphic overlays. Q: How is your organization contributing to development of the Distributed Common Ground System-Army [DCGS-A] and Command Post of the Future [CPOF]? A: With the collapse of the Army Battle Command System, DCGS-A absorbed a number of programs, including the digital topographic support systems. Our organization plays a critical role in materiel development for all systems that produce or consume geospatial information, including DCGS-A and CPOF. For us, DCGS-A plays an important role because it will serve as the hub and spoke for the SSGF for other mission command systems at the tactical level. Additionally, we work with program managers and the CIO/G-6 to ensure systems remain in compliance with the standards and interoperability requirements established by the National System for Geospatial-Intelligence [NSG]. We also articulate concerns or limitations back to the NSG to synchronize Army deployment and materiel development schedules. With regard to DCGS-A and CPOF, we provided system engineering expertise that enabled DCGS-A to provide an SSGF to CPOF through the use of open standards. Through this effort, the Army was able to reduce costs associated with field service engineers and representatives, while improving interoperability and timeliness of updates. 
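As an illustration of the kind of open standard involved, an OGC Web Map Service (WMS) GetMap request lets any compliant client pull a rendered map layer from a shared foundation server. The sketch below composes such a request; the endpoint and layer names are hypothetical, not any fielded Army interface:

```python
# Sketch of composing an OGC WMS 1.3.0 GetMap request, the kind of
# open-standard interface that lets one system serve a shared map
# foundation to another. Endpoint and layer names are hypothetical.
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, width=1024, height=768):
    """Compose a WMS 1.3.0 GetMap URL for the given layers and bounding box."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "CRS": "EPSG:4326",                      # WGS84; 1.3.0 axis order is lat,lon
        "BBOX": ",".join(str(v) for v in bbox),  # min_lat,min_lon,max_lat,max_lon
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": "image/png",
        "STYLES": "",                            # server default styling
    }
    return base_url + "?" + urlencode(params)

url = wms_getmap_url(
    "https://example.mil/geoserver/wms",         # hypothetical endpoint
    ["foundation:imagery", "foundation:roads"],  # hypothetical layer names
    (34.0, 69.0, 35.0, 70.0),
)
print(url)
```

Because the request is defined by the standard rather than by any one vendor, the same URL pattern works against any conformant server, which is the interoperability point being made above.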
Q: What do you see as the chief issues and challenges involved in sharing geospatial information, particularly across different levels of security? A: One of the chief issues is data veracity and the notion of authoritative data sources. For example, if someone is out collecting data at the tactical level, how do we know that’s accurate data? Who classifies it as authoritative? Or do we just associate some metadata with it, which says that a person unknown to everyone else collected this data, so use it at your own risk? Therefore we must get to the issue of data veracity. There is some reluctance among people to serve up data that perhaps has some specious origins. But is that data better than no data? For example, when the U.S. went in to do humanitarian assistance and disaster relief [HA/DR] in Haiti, that country was not well mapped. OpenStreetMap went in and had people recording information about streets and addresses. Do we classify that as authoritative? I don’t know, but it was better than what we had, which was nothing. You have to address the issues of data veracity, especially when you have data collected by volunteers. If you’re in a HA/DR environment or a regional stability environment, with non-government organizations and countries that we don’t have a sharing agreement with, and you’re sharing data between them, how do you address that? I think that’s especially important for the tactical user. Another challenge is content management. I talked about data moving around—the soldier is providing data, and that data gets rolled up at their battalion or brigade combat team level. They’ve built 

a database at their level, with high-fidelity information. The data that soldiers collect is related back to the commander’s priority information requirements, and is now managed at some level. How much of that data gets rolled up to the next level, and at what point does the content manager put an authoritative stamp on it? Where is the data stored, and how does it become part of some service, regional or national holdings, and where does it reside? I don’t think we have good answers to those questions now. Right now, it’s being done in-theater, but you have to address the Relief in Place/Transfer of Authority issues when units are replacing units on the ground. Do they trust the data that was generated there, and has it been exposed to them in time to understand it? Is it in formats that they can use, and in which it can be ingested into the command system? Thirdly, you have to have a universal mechanism for collecting data and managing it within an established data model. One of the things that we’ve done here is to build a Ground Warfighter Geospatial Model, which we believe serves the needs of the ground warfighter. Basically, it is an extension of the data models that NGA has designed and developed, so that if we’re collecting data, we’re collecting it in a format that’s consistent with national standards and the data can be packaged and contribute to the national holdings. Another challenge for us, which is a big one, is the misunderstanding of data vs. applications vs. visualization software. We run into people who say that all they need is Google Maps or Google Earth. Google Earth and Google Maps are great visualizations, but they’re nothing without the data that feeds them. A lot of people don’t understand that the data that feeds these applications has to be built in a way that’s reliable, with rectified imagery and location errors removed. The data is very important. 
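One way to make the veracity question tractable, sketched here as an illustration of the metadata idea described above rather than any fielded Army schema, is to attach provenance fields to each volunteered observation so a content manager can later decide whether to stamp it authoritative:

```python
# Sketch of tagging a volunteered geospatial observation with provenance
# metadata so downstream users can judge how much to trust it.
# The property names are illustrative, not a DoD or NGA schema.
import json
import datetime

def tag_observation(lat, lon, description, collector_id, source="volunteered"):
    """Wrap a point observation as a GeoJSON Feature with provenance fields."""
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},  # GeoJSON is lon,lat
        "properties": {
            "description": description,
            "provenance": {
                "collector_id": collector_id,    # who collected it
                "source": source,                # volunteered vs. authoritative feed
                "collected_at": datetime.datetime.now(
                    datetime.timezone.utc).isoformat(),
                "authoritative": False,          # not yet validated by a content manager
            },
        },
    }

feature = tag_observation(18.54, -72.34, "road blocked by debris", "ngo-team-7")
print(json.dumps(feature, indent=2))
```

A content manager rolling the data up to the next echelon could then flip the authoritative flag, or filter on the source field, without discarding data of uncertain origin that may still be better than nothing.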
Many believe that if they can find out how to get to the nearest Starbucks by going to the Internet, that solution is also available for the services. But it’s a lot more complex than that. It’s a challenge for us to educate people, because geospatial information has become a commodity for them. They don’t understand that there is a great deal that goes on behind that. Google has so many servers serving up data, but you don’t necessarily have that in a tactical environment. The last thing is that technological advances come so quickly, and with our less-than-agile acquisition process, it’s hard to keep technology from outpacing what soldiers have in their hands. The Army has tried to improve agility through the Network Integration Evaluation process, which is an ongoing set of experiments and activities taking place at Fort Bliss, Texas, and White Sands Missile Range, N.M. That mechanism is designed to keep pace with technology, but it is really a challenge in this particular domain because the technology is moving at the speed of Moore’s Law—so quickly that it’s hard to keep pace—and as I indicated before, our acquisition processes aren’t nearly as expeditious. Imagine putting your cellphone in a drawer for 10 years and then taking it out and trying to use it. There’s a very good chance technology will have phased it out. Q: Is there anything else you would like to add? A: There is an increased interest at the joint level in using adaptive planning processes. We’ve been involved in efforts that used a map or geospatial foundation as the organizing principle behind the adaptive planning processes to figure out things like force flow, and addressing the development of multiple courses of action. The joint community is interested in finding ways to use geospatial information and technology to enable decision making, and reduce the timeline in the whole mission planning process. I think this is part of the future. O

Industry group explores using cloud computing to deliver geospatial information to nontraditional users. By Harrison Donnelly GIF Editor

In a demonstration of the growing potential of cloud computing to enhance every aspect of military and intelligence operations, the National Geospatial-Intelligence Agency is working with industry to explore the use of shared computing infrastructures to deliver GEOINT and other data to emergency responders, non-governmental organizations and other nontraditional users operating under difficult conditions in the field. The initiative, operating under the umbrella of the Network Centric Operations Industry Consortium (NCOIC) and led by NJVC, is planning a demonstration exercise this summer during which a cloud infrastructure will provide

data, including NGA geospatial information, and applications to organizations responding to a hypothetical disaster scenario in Haiti. With additional participation by Boeing, The Aerospace Corporation and Open Geospatial Consortium (OGC), the $350,000 project will study the interoperability and movement of data in an open-cloud-based demonstration. NGA will provide unclassified data that supports a scenario depicting the 2010 earthquake in Haiti. NCOIC’s foundational model is based on a series of successful lab interoperability demonstrations, also based on Haiti, that it conducted four times during 2010. While one commercial cloud

served as a data-transport vehicle during the 2010 lab demonstrations, the NGA work would put a number of clouds in the center of the action, thereby enabling the ever-expanding population of global cloud users, including emergency responders, to post their eyewitness views of what is happening where they are.

The output of the project will be a report offering an outline for how to create a cloud infrastructure linking different groups during times of crisis. “We want to bring a bunch of disparate organizations together and have them start sharing information,” explained Tip Slater, NCOIC director of business development. “How do you rapidly assemble an infrastructure to do that, and how do they plug into that infrastructure and share information. How do we manage that information so that certain information, such as health or financial data, is restricted to certain organizations?”

The first phase of the process has been completed, with the creation of a cloud infrastructure, and the NCOIC was planning this spring to select those that will provide applications for the demonstration. “The end result of this will be a process by which, if someone asked how to go about structuring a system that will allow the sharing of information, the NCOIC will provide a set of guidelines for how to duplicate the process,” Slater said.

As team leader for the project, NJVC is designing, implementing and managing a federated cloud environment that provides baseline infrastructure services to participating NCOIC member companies. Boeing is providing geoservices through a Boeing-developed set of capabilities via its OpenGeo software, while the OGC will offer guidance on the use of OGC standards throughout the demonstration period. Aerospace Corp. is supplying an OpenStack-based cloud and a virtual organization management system patterned after the one used by the Worldwide Large Hadron Collider Computing Grid.

Hybrid Cloud

The roots of the project lie in the international response in recent years to disasters such as the Indonesian tsunami of 2004 and the Haitian earthquake of 2010, recalled Kevin L. Jackson, vice president and general manager, cloud services for NJVC.

“Both of those events were characterized by a huge global response of governments to help in humanitarian assistance and disaster relief,” Jackson said. “One of the first issues that came up was how to collect and exchange information of all types. Militaries, non-governmental organizations and individuals responded to try to help. But all of them had the same problem—they just picked up what they had, went to the disaster location and asked what they could do.”

NGA was one of the government organizations that responded to both events, and particularly in Haiti, where geospatial information was critical. Existing maps were not useable after the devastating temblor transformed the landscape, so one of the first requirements was to remap affected areas.

“They had a lot of technology, and could use satellites and GPS to remap the whole country, which provided electronic maps that could be distributed to partner countries and organizations. But how could they consume those? Every country may have a geospatial system, but they may not have one that is compatible with what the U.S. and NGA have. On top of that, if they did have useful data and wanted to share, every country would have its own rules, regulations and policies with respect to sharing that information,” Jackson said.

In addition, GEOINT providers also needed to make data available to non-governmental organizations with widely varying capabilities for receiving and managing information. In that heterogeneous environment, the common denominator could come down to acetate-covered paper maps, accessorized with grease pencils.

In that situation, cloud computing is ideal, Jackson said. “Right off, it gives

some good capabilities. You don’t have to build or own an IT infrastructure, but can just rent one based on your needs. If you have an earthquake in Haiti or a tsunami in Indonesia, you look for cloud providers who can provide the infrastructure in those countries, and you rent it. Then you can broker that infrastructure based on immediate needs, whether storage, computing or networks from different providers. “You can put the geospatial applications in these different clouds, and assign users based on your access control requirements,” he explained. “You can also place data in different infrastructures and platforms, based on the participants’ particular status. Once you have a way to manage the different infrastructures, our idea is that you will be able to offer a platform as a service. “You have the infrastructure as a service, and layered on top of that a platform as a service that is independent of the infrastructure. So the applications can be written to support the requirements of the responders, such as translating from one format to another, or creating maps within a particular region, or maps designed for specific responders such as medical or search-and-rescue providers,” Jackson continued. “All this can be done in a cloud computing infrastructure with operational funding, with no capital expenditures required from any participating organization.” The Geospatial Community Cloud Project is using the NJVC Cloudcuity Management Portal to enable a hybrid cloud composed of Amazon Web Services, an OpenStack cloud provided by Aerospace Corp., and some services from Google, Jackson explained. The Cloudcuity platform is powered by cloudMatrix, a technology platform designed by Gravitant to deliver cloud brokerage services. Cloudcuity delivers comprehensive, easy-to-use reporting and administration tools to help IT organizations analyze the value of the cloud services model by comparing it with current IT operations. O

For more information, contact GIF Editor Harrison Donnelly at harrisond@kmimediagroup.com or search our online archives for related stories at

As the volume of data increases, companies are employing new techniques and functions to satisfy user demands.

By Peter Buxbaum, GIF Correspondent

Much as is the trend with many other sensing technologies, the volume of data coming off light detection and ranging (LiDAR) sensors available for analysis, visualization and dissemination is growing by leaps and bounds. Although LiDAR sensors have always generated plenty of information, the growth in demand for LiDAR data, along with advances in sensors and collection technologies, has created masses of data that just keep on growing.

As a result, the software that is required to process and analyze this data has had to keep pace. Companies providing such software have employed the multicore processing techniques now becoming common when tackling big data sets, and have also added features and functions to satisfy the escalating demands of users.

Three-dimensional visualizations of LiDAR data and the fusion of LiDAR and imagery are two of the key features that have begun to emerge in a major way in recent months.

"The volume of LiDAR data has grown massively," said Matt Morris, director of geospatial product development at Overwatch, an operating unit of Textron Systems Advanced Systems, a strategic business of Textron Systems, a Textron Inc. company. "Sensors are able to collect data on much larger areas in a much shorter period of time. The density of the data has also grown. When we first released our LiDAR analysis software, the resolution was 1 meter. Now it is down to centimeters."

"We've gone from zero to 60 in a very short time," said Patrick Cunningham, president of Blue Marble Geographics. "This has overwhelmed most LiDAR hardware and software systems in their ability to process these large volumes of data."

"Not only is there more LiDAR data being collected than in the past, but it is getting into the hands of a wider variety of users who are not necessarily LiDAR experts," said Jennifer Stefanacci, director of product management at Exelis VIS. "This creates the need for easy-to-use tools. New types of data formats are also being utilized. As we build new software tools we are always thinking about how to support that data."

"The data question can be broken down two different ways," said Nicholas Rosengarten, a product manager at BAE Systems. "One has to do with the wealth of data. It has become a lot more accessible and there is a lot more of it. You have to make sense of the data sets in order to answer intelligence questions."

"Within the past year or so there have been major changes in LiDAR hardware and how the data is collected," said Matt Bethel, manager of systems engineering at Merrick & Company. "Specifically, multiple look angles allow for better feature definition. Higher LiDAR pulse rates provide higher resolution and greater data density."

Higher pulse rates return denser data and provide more data fidelity and detail. Multiple looks involve generating more than one LiDAR pulse directed toward the same spot before the initial pulse is returned and detected.

"These are huge advancements in foliage penetration and better feature definition," said Bethel. "These developments allow LiDAR tools to be used more effectively and have brought significant changes to airborne LiDAR mapping. Higher pulse rates enable more data to be collected in a single pass and allow flight at higher and safer altitudes."

"The ability to interpret an unstructured LiDAR point cloud into a solid, real-world representation continues to evolve," said Sandra Vaquerizo, vice president, CG2 Inc. "By leveraging an oblique angle source, analysts can capture all of the rich detail along the sides of features that can be scanned from lower altitudes or longer look angles. The result is a highly accurate, complete scene that can be viewed from any angle."

Computational Challenge

The greater volumes of data being generated by LiDAR sensors represent a double-edged sword. On one hand, the data density and resolution allow LiDAR analysis software to do its job better. But the challenge of processing the larger data sets requires new computational approaches.

"More data allows the software to perform at higher success rates for automated feature extraction and fuller classification and synchronization of data," said Bethel. "As long as the LiDAR scans are accurately calibrated and positioned, the data coming from the multiple look angles is no hindrance. It is only a benefit."

There have been advances in the last two years in the accuracy of LiDAR data calibration, which improves data quality by correcting


misalignments resulting from the positioning of the sensor on the aircraft.

In order to deal with the larger data sets, distributed processing schemes have been brought to bear, much as they have for other big-data problems. "Feature extraction especially does not lend itself to computation on just one machine," said Morris. "Distributed processing in the cloud chops the process into bits, with each part being processed individually in a node and then merged back together. Something that could take two hours on one machine can be done in five minutes on a cluster of 10 workstations."

Overwatch recently released a new version of its LiDAR Analyst product. "Our history has been to focus on feature extraction," said Morris. "LiDAR Analyst automates the extraction of terrain, buildings and trees. The latest version allows users to obtain a full 3-D visualization of buildings and vegetation to perform line-of-sight and buffer-zone analyses. The dissemination tools allow the LiDAR data to be sent to Google Earth, PowerPoint presentations and 3-D PDF files."

The solution can render LiDAR point clouds in excess of 1 billion points. "To accomplish this, we improved our algorithms to focus on the denser point clouds," said Morris. "We also implemented a distributed processing system for the software. The software allows the program to process data on a cluster of workstations. This allows us to process extremely large data sets in a short amount of time. The output can be viewed in 3-D format, allowing users to pull the true value out of LiDAR."

Blue Marble Geographics recently released a new version of Global Mapper, a geographic information system that works with all kinds of geospatial data, including LiDAR. "Most LiDAR data is delivered to Global Mapper in compressed format," said Cunningham. "With the most recent release, we added our own compression and can now process hundreds of millions of point clouds. Most GIS software today works with only 5 or 10 million points at a time."

Global Mapper 14.1 has built-in functionality for distance and area calculations, elevation querying, line-of-sight calculations, image rectification and other functions. The package increases LiDAR processing and display speeds.

Global Mapper Package files are now able to store LiDAR point clouds in a special compressed format, much smaller than the usual uncompressed data in LAS, a standard file format for the interchange of LiDAR data. "This allows LiDAR data to be efficiently archived or shared with other Global Mapper users," said Cunningham.

Blue Marble's efforts to process larger volumes of data are based on its focus on better machine memory management, distributed processing across multiple machines, and storage techniques such as caching.

"We have focused in the last year on the ability to process large volumes of point clouds and manage them in a viewer so users can edit points to make sure they are classified correctly," said Cunningham. "Users can also zoom in and out of images and pan around them."

Version 14.1 introduces a new reader available in a special multifunction format that contains a variety of attribute and surface feature information useful for military work.

Vertical Slicing

Applied Imagery recently released a new version of its Quick Terrain Modeler product, which adds tools to better analyze LiDAR point clouds, including the ability to view cross-sections.

"The tool allows a user to take a vertical slice through the point cloud," said Chris Parker, the company's president. "This enables a much faster and more comprehensive analysis of the point clouds and makes it easier to do things like measuring trees, power lines and building heights. This feature also enables users to edit and remove points and to change their classifications based on what they see in the profile."

The ability to edit point clouds is important in cleaning up "noise" that may have shown up in the point cloud data, such as returns that derive from above or below the surface being examined. "Editing allows analysts to get rid of points that don't belong," said Parker.

A second function of the editing process is to change the point cloud classification. "Upon examination, an analyst may determine the system classified points in error, and would therefore want to reclassify those points to reflect what they really are, so that when he saves the edit of the point cloud and passes it on to a customer or a colleague, he can feel confident he has done the necessary quality assurance and that they have an accurate product that reflects reality," said Parker.

Applied Imagery's new release did not incorporate any new technology, but did move much of the processing to graphics processing units (GPUs). "GPUs enable real-time functionality, such as the ability to move a traveler down a path, to do line-of-sight analyses and to do route planning," he continued. "Quick Terrain uses LiDAR to pan through 3-D terrain to avoid natural and man-made obstacles and dangers. This would have been impossible to do in real time in the past relying on a CPU."

Merrick's recent focus has been to improve the processing performance of its Merrick Advanced Remote Sensing, a Windows application used to visualize, manage, process and analyze LiDAR point cloud data. "We have automated the quality control steps to assure quality throughout the entire work flow and to decrease the labor hours involved in checking data by manual methods," said Bethel.

Also in the last year, Merrick has been working on a feature that fuses LiDAR data with imagery data without losing resolution. "The benefit of this process is the ability to view the full resolution of the imagery in a 3-D format," said Bethel. "In the past we could fuse image colors into LiDAR point clouds but the color was always at higher resolution than the point cloud, and the result of the fusion was a loss of resolution. We are now able to preserve all of this information in high resolution and in three dimensions."

Exelis has recently repositioned its products to include LiDAR analysis tools in its ENVI image analysis platform. "This is important because it allows visualization of 3-D point clouds and the analysis that goes along with that," said Stefanacci. "We have also provided automated 3-D feature extraction. The built-in capabilities are for

buildings, trees and power lines. That is a complex and difficult process, which has now become a pushbutton tool."

Exelis VIS has also released an application programming interface that allows users to combine LiDAR data with images and image analysis in a single work flow within the ENVI system. The functionalities that can be deployed on a desktop can now also be implemented in an enterprise environment.

CG² has automated techniques used for LiDAR processing that go beyond the typical 2-D-rooftop-plus-height feature extrusion, according to Vaquerizo. "Viewing the scene is only the beginning," she said. "The point cloud is automatically clustered into individual identifiable objects which are automatically assigned a classification. This information can be used to highlight objects of interest and to compress the data based on the interpreted structure, such as a planar surface."

BAE Systems' SOCET GXP version 4.0, the company's full-spectrum GEOINT tool, released last year, was the first version that integrated the capability to visualize LiDAR point cloud data. "Digital elevation models in TIN and Grid formats can be derived from LiDAR point clouds," said Rosengarten. "Once integrated in the GXP platform, existing functionality can be used to do editing, analyses, modeling and texturing to better refine those data sets."

SOCET GXP's automatic feature extraction functions enable users to generate 3-D models with little human intervention. "We extract both buildings and trees," said Rosengarten. "The buildings extracted are not just rectangles. We capture roof details and make full 3-D models out of LiDAR point clouds."

GXP also enables users to layer imagery on top of the LiDAR surface models to enhance the visualization of the area being studied. "The images can be used to view the area from different angles, to add visualization to buildings, and to correct any errors. All of this data can be exported to Google Earth and shared across the intelligence community."

SOCET GXP version 4.1, to be released later this year, will allow users to perform measurements on the 3-D models. "They will be able to calculate things like surface areas and perimeter, and to use the 3-D model to make products like 3-D GeoPDF files and PowerPoint presentations," said Rosengarten.

Other enhancements expected in version 4.1 include increasing the speed and performance of automatic feature extraction using graphic processing unit technology. "This means utilizing graphics card technology to increase algorithm performance timelines," said Rosengarten.

BAE's answer to LiDAR's big-data problem includes a rewrite in version 4.0 of its 3-D multiport to accommodate tens of millions of data points "with no significant performance degradation," said Rosengarten. "Version 4.1 will also have 64-bit computing, as compared to the 32 bits in current versions of GXP, so that we can take advantage of increased computer memory."

BAE's GXP Xplorer also helps with LiDAR's big-data sets by crawling metadata and directories and cataloging user data. "That helps users find data and see what is relevant to their problem," said Rosengarten.

Human Intelligence

Virtual Geomatics has released a LiDAR feature extraction product that combines computer processing with human intelligence.

"You tell the software what you want to extract, the software does it for you, and you say 'yes' or 'no,'" said Ramesh Sridharan, the company's chief technologist. "BNSF Railway is using this to collect data on 15,000 miles of track."

Sridharan's company recently introduced PanoLiDAR Viewer, a tool designed to visualize the LiDAR point cloud along with the 360-degree panoramic images collected by laser scanning systems. "The overlay of point cloud and corresponding images allows for accurate measurements and asset extraction," he said. "With the point cloud embedded, features in the image can be picked instantly."

PanoLiDAR offers several functions designed to automatically extract features associated with roads, such as edge of pavements, road lines, signs, lamp posts, trees and manholes. The user can import LiDAR in a native format, and create outputs in different mapping systems. Virtual Geomatics is currently working on infusing more automation in its tools, making them easier to use and increasing analyst productivity.

Exelis VIS is working on finding synergies among its ENVI LiDAR and non-LiDAR tools as well as with other technologies to be found within the Exelis parent company. "We will continue to automate feature-extraction processes and will continue to develop application programming interfaces to give users more control over their work," said Stefanacci.

BAE's goal is to integrate LiDAR capabilities with other remote sensor technologies. "The approach we take is to see how LiDAR data can be used with other data sets to solve problems," said Rosengarten.

For example, tools within SOCET GXP can detect differences in the same terrain between two different LiDAR sensor passes, and calculate, for example, how much material, such as natural resources, has been removed over time. "Then we can use data from hyperspectral and multispectral sensors to identify what the material is that is being shifted," said Rosengarten. "The point is that LiDAR data can be used with other remote sensor capabilities to help answer intelligence questions."

"We've done a good job at improving LiDAR processing and analysis tools but we are not done by any stretch," said Cunningham. "Hardware capabilities need to be stepped up to better process and deliver data. When you are dealing in terabytes of data you have to deliver customers a hard drive because networks don't have enough bandwidth to transmit the volumes of data effectively. Online cloud services are really not yet a vehicle for processing and transmitting LiDAR point cloud data because of bandwidth issues. Eventually the ability to consume LiDAR data over the Internet will emerge."

On the software side, multicore processing will be supplemented by other techniques, according to Cunningham, including the use of video cards and tools that automatically and intelligently trim the LiDAR data without devaluing the data. "This is the year of LiDAR processing," he said. "We have taken some first steps and we will be taking more steps and releasing more tools for more powerful processing and 3-D management." O
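Morris's description of distributed point-cloud processing — chop the cloud into spatial tiles, classify each tile on its own node, then merge the results — can be illustrated with a short, hypothetical Python sketch. Nothing here is Overwatch code; the tile size, the toy ground-classification rule and all function names are assumptions for the example.

```python
import numpy as np

def split_into_tiles(points, tile_size):
    """Bucket an N x 3 array of (x, y, z) returns into square ground tiles."""
    keys = np.floor(points[:, :2] / tile_size).astype(int)
    tiles = {}
    for key, pt in zip(map(tuple, keys), points):
        tiles.setdefault(key, []).append(pt)
    return {k: np.array(v) for k, v in tiles.items()}

def classify_tile(tile):
    """Toy per-tile step: points within 0.5 m of the tile's lowest return
    are labeled ground (class 2); everything else is unclassified (class 1)."""
    ground = tile[:, 2] <= tile[:, 2].min() + 0.5
    return np.where(ground, 2, 1)

# Synthetic cloud: flat ground plus one 10 m rooftop return.
points = np.array([[0.5, 0.5, 0.0], [1.5, 0.5, 0.1],
                   [10.5, 0.5, 0.0], [10.6, 0.6, 10.0]])
tiles = split_into_tiles(points, tile_size=10.0)

# On a cluster, each classify_tile call would run on a separate node; the
# plain loop below stands in for that distributed step, and the keyed
# dictionary is the "merged back together" result Morris describes.
labels = {key: classify_tile(tile) for key, tile in tiles.items()}
```

The same tile-key scheme also lets edge effects be handled by padding each tile with a buffer of neighboring points before classification, a common refinement in production pipelines.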

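The vertical-slice tool Parker describes reduces to a simple geometric test: keep every return that lies within some tolerance of a vertical cutting plane. A minimal numpy sketch of that geometry (illustrative only, not Applied Imagery's implementation):

```python
import numpy as np

def vertical_slice(points, p0, p1, half_width):
    """Return the points within half_width of the vertical plane through
    ground coordinates p0 -> p1, i.e. a cross-section of the cloud."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = (p1 - p0) / np.linalg.norm(p1 - p0)   # unit direction of the slice
    rel = points[:, :2] - p0
    # Perpendicular distance from each point to the slice line (2-D cross product).
    dist = np.abs(rel[:, 0] * d[1] - rel[:, 1] * d[0])
    return points[dist <= half_width]

# A 1 m ground return, an 8 m tree/wire return near the line, and one point
# 3 m off to the side that should be excluded from the profile.
cloud = np.array([[0.0, 0.0, 1.0], [5.0, 0.2, 8.0], [5.0, 3.0, 2.0]])
profile = vertical_slice(cloud, p0=(0, 0), p1=(10, 0), half_width=0.5)
```

Plotting the along-track distance of the surviving points against their z values yields the profile view in which trees, power lines and building heights can be measured, and in which stray points can be selected for deletion or reclassification.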

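The terrain-differencing workflow Rosengarten outlines — compare two LiDAR passes over the same area and quantify how much material has been removed — can be sketched by gridding each pass into a surface and subtracting. The min-per-cell surface rule and the cell size below are assumptions for illustration, not SOCET GXP behavior.

```python
import numpy as np

def grid_surface(points, cell, shape):
    """Rasterize a point cloud to a grid, keeping the lowest return per cell."""
    grid = np.full(shape, np.nan)
    ix = (points[:, 0] // cell).astype(int)
    iy = (points[:, 1] // cell).astype(int)
    for x, y, z in zip(ix, iy, points[:, 2]):
        if np.isnan(grid[x, y]) or z < grid[x, y]:
            grid[x, y] = z
    return grid

# Two passes over the same 2 m x 2 m patch (1 m cells); in the second
# pass, 1.5 m of material is gone from every cell.
pass1 = np.array([[0.5, 0.5, 2.0], [1.5, 0.5, 2.0],
                  [0.5, 1.5, 2.0], [1.5, 1.5, 2.0]])
pass2 = pass1 - np.array([0.0, 0.0, 1.5])

cell = 1.0
diff = grid_surface(pass1, cell, (2, 2)) - grid_surface(pass2, cell, (2, 2))
volume_removed = np.nansum(diff) * cell * cell   # height change x cell area
```

With the volume of moved material in hand, coincident hyperspectral or multispectral imagery can then be consulted, as Rosengarten notes, to identify what the material is.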


The Navy’s shift to the Pacific inspires our twelfth title and website...


The new title will support the Navy with the latest program developments in air and sea for Congress, the executive branch, other services and industry.





Cover Q&As:
- Rear Adm. Thomas Moore, PEO Aircraft Carriers
- Rear Adm. Donald Gaddis, PEO Tactical Air Programs
- Rear Adm. David Lewis, PEO Ships
- Rear Adm. Paul Grosklags, PEO Air ASW, Assault and Special Mission Programs

Special Sections:
- Carrier Onboard Delivery Replacement
- Mine Warfare
- Airborne ISR
- USV/UUV Systems and Launch and Recovery Technologies
- Shipboard Fire Alarms and Control Systems
- Modeling & Simulation in Ship Design

Features:
- Ship Self-Defense; Riverine Patrol Craft; Precision Guided Munitions; Program Spotlight: Presidential Helicopter
- Vibration Control; Ship Life Cycle Management; Program Spotlight: LCS
- Maritime ISR Capabilities; Asia Focus; Program Spotlight: F-35
- Fleet At-Sea Replenishment; Corrosion Control; Program Spotlight: DDG1000

Contact Nikki James at 301-670-5700 to participate in the inaugural issue!

The advertisers index is provided as a service to our readers. KMI cannot be held responsible for discrepancies due to last-minute changes or alterations.


Compiled by KMI Media Group staff

Advertisers Index


American Military University . . . 11
BAE . . . C2
DigitalGlobe . . . 19
DynCorp . . . 16
General Dynamics AIS . . . C4
GIS For Government . . . 9
IHS . . . 1
Lockheed Martin . . . 1
PAR Government Systems Corp. . . . 13
Pixia . . . 7

July 8-12, 2013: Esri International User Conference, San Diego, Calif.
September 16-18, 2013: Air and Space Conference, National Harbor, Md.
September 24-26, 2013: Modern Day Marine, Quantico, Va.
October 13-16, 2013: GEOINT 2013 Symposium, Tampa, Fla.
October 21-23, 2013: AUSA Annual Meeting, Washington, D.C.
October 29, 2013: SAP NS2 Solutions Summit, Falls Church, Va.


Want to REACH the decision-makers in the DEFENSE COMMUNITY?

With a unique concentration on senior military officers and DoD leadership, KMI Media Group focuses on distinct and essential communities within the defense market. This provides the most powerful and precise way to reach the exact audience that procures and deploys your systems, services and equipment.

KMI Media Group offers by far the largest and most targeted distribution within critical market segments. Sharp editorial focus, pinpoint accuracy and depth of circulation make KMI Media Group publications the most cost-effective way to ensure your advertising message has true impact.

KMI'S FAMILY OF PUBLICATIONS

To learn about advertising opportunities, call KMI Media Group at 301.670.5700



Geospatial Intelligence Forum

Darin Powers
Vice President, Intelligence and Security
DynCorp International

Darin Powers is vice president of DynCorp International's Intelligence and Security business, where he oversees the company's intelligence, special operations and security-focused business. Prior to joining the company in 2009, he served as director of the operational support business area within Northrop Grumman's intelligence systems division; in that position he led a business unit that worked with customers performing round-the-clock intelligence operations. A veteran of the U.S. Marine Corps, Powers held command and staff positions through the battalion level during his 20-year military career.

Q: What types of products and services are you offering to military and other government customers?

A: DynCorp International [DI] is a global services company offering tailored solutions in aviation, logistics, training, intelligence and other areas, to support the unique needs of commercial, government and military customers around the world. The company is structured into three strategic business groups: DynAviation, DynLogistics and DynGlobal.

Our DynAviation Group provides comprehensive aerospace, aviation and air operations solutions worldwide. In DynLogistics, we offer best-value mission readiness through total support solutions, including conventional and contingency logistics; operations and maintenance support; platform modification and upgrades; supply chain management; training; security; and full-spectrum intelligence mission support services. Finally, DynGlobal brings the full range of DI's diverse capabilities and decades of experience to international and commercial customers.

My business area—Intelligence and Security—is housed within the DynLogistics Group and provides full-spectrum intelligence mission support services and intelligence-informed security services.

Q: What unique benefits does your company provide its customers in comparison with other companies in your field?

A: In contrast with companies that have sprung up in the past decade or so in response to the surge in U.S. government contracting, our value to our customers and our values as a company have been built through six decades of work in complex and difficult environments, dating back to World War II. In that time we've been a trusted partner to the U.S. government: We've supported peacekeeping initiatives during times of peace, sustained the warfighter during times of conflict, and supported post-conflict capacity-building in developing nations.

With a proven track record of performance, and building on the strong global presence and relationships we have built over the years, we're able to bring the same services we've been providing to the U.S. government for years to U.S. allies and new commercial customers.

In my business area, we offer a comprehensive solution that I believe is unique in our industry. Not only do we offer intelligence capabilities, but also they are backed by an extensive global presence, along with deep aviation, logistics and security know-how, international business experience, language capabilities, and cultural understanding that you simply won't find elsewhere. We're also proud to serve alongside our customers in all locations, environments and conditions.

Separate and apart from our capabilities, something else that I believe is unique to our company is its culture. Because we're a service company, we feel it is particularly important to focus on our people: the values they bring to the job each day and their leadership behaviors. We don't make a specific piece of equipment; our product is the knowledge, experience and solutions that we bring to our customers. So we have invested a great deal of time in leadership training, the result of which I believe is felt by our customers in the superior service they get from our people.

Q: How does your business currently work with its customers to contribute to U.S. national security?

A: Our intelligence collection and analysis experts—who have years of hands-on intelligence experience—are involved in the full intelligence exploitation cycle. We create functional teams that offer integrated collection, analysis, science and technology, and other competencies to deliver full-spectrum solutions in the field of document and media exploitation, forensic analysis, source operations and counter-intelligence.

In addition, we have developed and run advanced training programs that focus on educating, training and certifying the intelligence and security professionals of tomorrow. As experts in cross-cultural engagement, the training solutions we offer are tailored to the customer's needs. Our graduates gain the competencies and capabilities necessary to meet the unique demands required of today's national security professionals.

In short, the intelligence and security professionals of DI proudly serve forward with our customers, in all locations, environments and conditions, protecting our nation through agile, integrated intelligence and security solutions. O


July/August 2013 Volume 11, Issue 5

Cover and In-Depth Interview with:

Dr. Peter Highnam Director Intelligence Advanced Research Projects Agency

Features: Video Enhancement As was shown during the recent investigation of the Boston Marathon bombing, military and homeland security forces can benefit greatly from video imagery for situational awareness. Image quality is often so poor that operators can miss important details, but software for enhancing video imagery can help.

Social Media The integration of social media and location-based analysis will provide powerful new tools for geospatial intelligence, experts say.

Military Geographic Information Systems GIS technology is vital to a wide range of military and intelligence programs.

Intelligence Enterprise IC Information Technology Enterprise strategy focuses on enabling greater integration, information sharing, and information safeguarding through a common IC IT approach that substantially reduces costs.

Bonus Distribution Esri International User Conference July 8-12, 2013 San Diego, Calif.

Insertion Order Deadline: June 17, 2013 • Ad Materials Deadline: June 24, 2013

GIF 11-4 (May 2013)