
“Data fusion is critical to the future of persistent surveillance,” said Wein. “Fusion is much more than merely having different sensors collocated on the same platform, with each behaving more or less as its own stovepipe with side-by-side displays on the ground.”

The ability to fuse the output of different sensors so that they collectively behave as one system brings greatly enhanced capability. “One sensor senses and cues another sensor to place a tracking window around the moving object of interest,” said Wein. “The user on the ground would have a single video window on the display with the live video and all of the relevant geo-registered information from the various sensors streaming real time to the end-user. The data is fused such that all of the sensor data from the various sensors continues to be co-registered to the specific object or location of interest, even as the object moves.”

Thanks to the large volumes of data being generated by this generation of EO sensors, the vast majority of persistent surveillance video is stored on board the sensor. That helps both in sparing bandwidth and in avoiding drowning analysts in data.

“Persistent coverage of vast areas, with adequate resolution and frame rates, generates about a hundred times more data than will fit down the pipe, even with moderate data compression,” said Wein. “The goal is to make this on-board database behave as a server in the sky, such that users on the ground can sift through the data and rapidly access and pull down those specific areas of interest.”

Not all of the data collected is immediately relevant or of interest to the user. “To this end, the airborne processors are designed to look for cues such as moving target indication, or to watch a specific location of interest,” said Wein.

“In other words, they only send down the video feeds in real time that are directly relevant and store the rest,” he continued. “Multiple regions anywhere within the very wide field of view can be continuously watched by the airborne processor with virtual video perimeters. If something crosses a prescribed boundary, a real-time video window can be spawned and sent to the ground for monitoring. The user can also go backwards in time and view events at that particular location by remotely accessing the on-board storage. This approach of thinking smartly about what is and is not sent down the link prevents analysts from becoming saturated with data.”
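The “virtual video perimeter” Wein describes is, at its core, a geofence evaluated on the airborne processor: cued tracks are checked against a prescribed boundary, and only a crossing triggers a real-time downlink window, while everything else stays in on-board storage. The Python sketch below is purely illustrative of that logic; the Perimeter, Track and watch names are hypothetical stand-ins rather than any vendor’s actual interface, and a real system would operate on the sensor’s own track and imagery formats.

# A minimal, illustrative sketch of a "virtual video perimeter": the airborne
# processor watches a prescribed boundary inside a very wide field of view and
# only spawns a real-time downlink window when a cued track crosses it.
# All names here (Perimeter, Track, watch) are hypothetical.

from dataclasses import dataclass

@dataclass
class Track:
    track_id: int
    lat: float    # degrees, from the on-board moving-target-indication cue
    lon: float

@dataclass
class Perimeter:
    """Axis-aligned geographic boundary watched by the airborne processor."""
    south: float
    north: float
    west: float
    east: float

    def contains(self, lat: float, lon: float) -> bool:
        return self.south <= lat <= self.north and self.west <= lon <= self.east

def watch(perimeter: Perimeter, tracks: list[Track], already_streaming: set[int]) -> list[Track]:
    """Return the tracks that just crossed into the perimeter.

    Only these trigger a real-time video window; everything else remains in
    the on-board store for later retrieval.
    """
    crossings = []
    for track in tracks:
        inside = perimeter.contains(track.lat, track.lon)
        if inside and track.track_id not in already_streaming:
            crossings.append(track)                    # spawn a downlink window for this track
            already_streaming.add(track.track_id)
        elif not inside:
            already_streaming.discard(track.track_id)  # stop streaming once it leaves the boundary
    return crossings

# Example: one cued track crosses the boundary, so exactly one window is spawned.
area = Perimeter(south=33.40, north=33.45, west=-112.05, east=-112.00)
cues = [Track(1, 33.42, -112.02), Track(2, 33.60, -112.30)]
print([t.track_id for t in watch(area, cues, set())])  # -> [1]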
“We are developing tools to take care of data locally,” said Hansson. “We are trying to filter data so that we just don’t pump out everything to everybody. Most data is now preprocessed on board the sensor itself or in the ground control station before being distributed to users.”

High-fidelity EO imagery has become a critical component of a multi-sensor, data-fused situational picture. “The clarity and precision pointing of EO sensors allows for reliable target identification during daylight and low-light operations,” said Redd.

General Dynamics’ standards-based metadata allows integration with Motion Imagery Standards Profile-compliant exploitation tools. “These systems allow analysts to rapidly search for, identify and exploit critical mission data from various sensor types across multiple platforms in order to create a comprehensive mission picture,” said Redd.

General Dynamics customers are looking to activity-based intelligence to provide real-time indications and warning to the warfighter, according to Redd. “We see the need to tie forward-deployed sensing to the enterprise analytics. Initially, these may be independent sensors with some back-end processing. However, we have also been focusing on exploiting the cloud to link analytics to data where the data resides, and delivering the results directly to the user.”

Innovations in Control

Along with advances in EO sensors and their associated processing have come innovations in control devices.

Kutta Technologies, for example, has developed a bidirectional video transceiver, through which an operator can slew a camera to a particular location based on a graphical map interface on a laptop computer or handheld device. Kutta is also working on a universal ground control station, which will be able to interoperate with a variety of UAV systems.

The Office of the Secretary of Defense has implemented a consumer-style app store where developers of UAV applications can provide innovations to the user community. Kutta’s contribution is a downloadable app that allows operators to control UAVs and their payloads in a point-and-click environment.

“The benefit is that users don’t have to go through extensive training to learn how to use the system,” said Doug Limbaugh, Kutta’s chief executive officer. The company is also working on another app designed to reduce data latency.

Ball Aerospace touts its Total Sight Flash LiDAR, an emerging technology, as the future of EO sensor systems. Total Sight Flash LiDAR is a real-time, full-motion, color light imaging, detection and ranging capability that combines light detection and ranging (LiDAR) sensors and laser systems with real-time processing.

LiDAR uses laser light pulses to gauge distances by measuring the time delay between transmission of the pulse and detection of the reflected signal. An aircraft-mounted range finder swings back and forth, collecting data on hundreds of thousands of points per second. The data returned by the LiDAR sensor provides location data on an x-y-z axis, referred to as a point cloud.
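The time-of-flight relationship behind that paragraph is straightforward to show: the pulse’s round-trip delay gives range, and range plus the beam’s pointing angles give one x-y-z point in the cloud. The Python sketch below is a simplified, sensor-centered illustration; range_from_delay and point_from_return are hypothetical names, and a production processor would also fold in aircraft position, attitude and precise beam geometry before georeferencing each return.

# Illustrative back-of-the-envelope sketch of LiDAR time-of-flight ranging:
# range follows from the pulse's round-trip delay, and each return becomes
# one (x, y, z) point in the cloud. Simplified sensor-centered geometry only.

import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def range_from_delay(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface: the pulse travels out and back."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def point_from_return(round_trip_seconds: float, azimuth_deg: float, elevation_deg: float):
    """Convert one return into an (x, y, z) point relative to the sensor."""
    r = range_from_delay(round_trip_seconds)
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.sin(az)   # east
    y = r * math.cos(el) * math.cos(az)   # north
    z = r * math.sin(el)                  # up (negative for a beam looking down at terrain)
    return (x, y, z)

# A 10-microsecond round trip corresponds to roughly 1.5 km of range.
print(round(range_from_delay(10e-6), 1))  # -> 1499.0
print(point_from_return(10e-6, azimuth_deg=45.0, elevation_deg=-30.0))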

“What separates our technology is our ability to acquire a greater number of points per square meter and to display the results to pilots or transmit it to the ground in real time. With ordinary LiDAR, the information has latency to the end user. You don’t know what you have until you get to the ground,” said Jeff Schmidt, senior manager of business development at Ball Aerospace.

LiDAR also typically requires forward motion of the aircraft to send and receive laser pulses. “Our system can create 3-D images by pointing and staring, which is unique,” said Schmidt. “It is electronically versus mechanically steerable, which is also an innovation. The system is also platform agnostic. We have demonstrated these capabilities on fixed and rotary, manned and unmanned aircraft.”

Total Sight Flash LiDAR can be used to do mapping, to view potential landing areas and to assess hazards, according to Barnes. “But the technology is cost constrained right now,” he said. “More technology, such as less expensive lasers, steering systems and focal planes, needs to be developed before it becomes mainstream.”

For more information, contact GIF Editor Harrison Donnelly at harrisond@kmimediagroup.com or search our online archives for related stories at www.gif-kmi.com.
