Analysis at the Speed of Thought

The SAP HANA data platform has enough flexibility and speed to change the way analysts do their work.

With its ability to provide near-instant analysis of very large data sets from a huge variety of sources, in-memory computing technology brings a level of flexibility and speed to intelligence missions that can actually change the way analysts do their work.

That bold prediction comes from Bob Palmer, senior director of SAP NS2, who foresees a huge impact on the intelligence profession and mission from the recent introduction of SAP HANA, a platform that leverages technological advances in main memory, multicore processing and data management to deliver radically better computing performance at lower cost.

Engineered to provide “analysis at the speed of thought,” the SAP HANA platform delivers not only remarkable increases in processing speed but also new capabilities in areas such as predictive analysis and activity-based intelligence (ABI).

“Because it can handle both structured and unstructured data, all with the speed of in-memory computing, it empowers analysts to approach the information they need to solve a mission problem in new and different ways,” said Palmer. “The ability to do analysis at the speed of thought on multi-trillion-record data sets gives them the ability to work iteratively, at a natural rhythm and in real time, in order to pursue their analytical outcomes.”

The power of SAP HANA lies in its game-changing strategy for managing and analyzing big data. Developed in collaboration with Intel Corp., which wrote HANA-specific instruction sets into its newest generation of chips, the platform provides software and hardware that are optimized to deliver breakthrough speed for complex analysis.

SAP HANA embodies a unique approach to the problem of operating on very large data sets: both the logic to be applied to the data and all of the data subject to processing are held in main memory at all times. External memory, whether solid-state or spinning disk, is not part of the computational path. And because the platform is based on open standards, it can deliver this acceleration to any existing tool or user interface already in use.

“Historically, analysts have either worked iteratively on smaller data sets, or they’ve worked on very large data sets in batch mode, with a latency between the desire for an analytical outcome and the fulfillment of that desire,” Palmer explained. “SAP HANA changes that paradigm, in that even very large data sets can be exposed to complex predictive algorithmic processing with sub-second response times.

“For example, analysts can process geospatial data and open-source unstructured data all in one platform. It can perform ‘what if’ or predictive analysis on possible future outcomes in just seconds, even on trillions of records,” Palmer continued.

The result is an unprecedented ability to analyze real-time streaming events and compare them against historical data in time-series analysis, all in one platform. “Historically, streaming data is processed separately from historical analysis of data. The difference with SAP HANA is that both those types of data can be handled in one analytical view,” he added.

Another advantage is that the SAP HANA platform can interact easily with other technologies in the enterprise, without the need for esoteric skill sets or proprietary interfaces.
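To make the streaming-plus-historical combination described above concrete, the sketch below shows how a client tool might query such a combined view from Python over the platform’s standard SQL interface. It is a minimal sketch, assuming SAP’s hdbcli driver and a reachable HANA instance; the host, credentials, and the EVENTS_STREAM and EVENTS_HISTORY column tables are hypothetical placeholders rather than details from the article.

```python
# Minimal sketch, assuming SAP's hdbcli Python driver and a reachable HANA
# instance. Host, credentials, and the EVENTS_STREAM / EVENTS_HISTORY column
# tables are hypothetical placeholders.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana.example.gov",  # hypothetical HANA host
    port=30015,                  # default SQL port for instance 00
    user="ANALYST",
    password="********",
)
cursor = conn.cursor()

# Newly arrived (streaming) events and years of history live in the same
# in-memory column store, so one statement can aggregate across both
# without a separate batch job.
cursor.execute("""
    SELECT TO_DATE(event_ts) AS event_day,
           region,
           COUNT(*)          AS event_count
    FROM (
        SELECT event_ts, region FROM EVENTS_STREAM
        UNION ALL
        SELECT event_ts, region FROM EVENTS_HISTORY
    ) AS evt
    GROUP BY TO_DATE(event_ts), region
    ORDER BY event_day, region
""")

for event_day, region, event_count in cursor.fetchall():
    print(event_day, region, event_count)

cursor.close()
conn.close()
```

Because the union and the aggregation both run inside the in-memory column store, the same statement covers live and historical records in a single analytical view.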

This accelerates existing tools within the enterprise with the speed of in-memory processing. That is critical in areas such as ABI, where one of the prerequisites is that the analytical solution must be very fast. “If the data platform only operates in a batch mode, and can only give you a rear-view-mirror view of what has already happened, it does not fit well in an ABI-based paradigm,” Palmer observed.

Similarly, predictive analysis is particularly well suited to in-memory data processing. Because complex predictive algorithms are traditionally very computing-intensive, analysts historically have used only samples of data to run their predictive models, given the slow speed of traditional computing architectures. “With SAP HANA, by contrast, predictive analysis can be performed on every element of the data set. It’s axiomatic in predictive analysis that the more data points you use in the algorithm, the more accurate the range of predicted outcomes is,” Palmer said.

SAP HANA is also a strong complement to the Hadoop Distributed File System, which was created to speed work on very large data files. Combined with Hadoop, SAP HANA offers the ability to absorb any sort of data needed for the mission or the business into a large unstructured “bit-bucket” file in Hadoop, where it can be accessed by queries from the SAP HANA platform. “SAP HANA can leverage the ability of a Hadoop system to store multiple petabytes of data, and yet bring to it real-time analysis without the time delay that is usual in the Hadoop batch-processing paradigm,” Palmer said. “That’s a great way to leverage the existing investments that an agency has in data infrastructure.” ■
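As an illustration of the HANA-plus-Hadoop pattern Palmer describes, the sketch below assumes an administrator has already configured a smart data access remote source (here called HADOOP_SRC) against a Hive table in the Hadoop cluster; the source, schema, and table names are hypothetical, and the exact adapter setup varies by installation.

```python
# Minimal sketch, assuming a smart data access remote source called
# HADOOP_SRC has already been registered against a Hive table.
# All object names here are hypothetical.
from hdbcli import dbapi

conn = dbapi.connect(address="hana.example.gov", port=30015,
                     user="ANALYST", password="********")
cursor = conn.cursor()

# Expose the Hive table that lives in the Hadoop "bit bucket" as a virtual
# table inside HANA (one-time setup; the "<NULL>" level is used when the
# remote source has no database tier).
cursor.execute("""
    CREATE VIRTUAL TABLE ANALYST.RAW_COLLECTS_VT
    AT "HADOOP_SRC"."<NULL>"."default"."raw_collects"
""")

# A single query can now join petabyte-scale Hadoop data with hot,
# in-memory HANA tables, with no intermediate batch export.
cursor.execute("""
    SELECT h.region, COUNT(*) AS raw_hits
    FROM ANALYST.RAW_COLLECTS_VT AS h
    JOIN ANALYST.WATCHLIST       AS w
      ON h.entity_id = w.entity_id
    GROUP BY h.region
""")
print(cursor.fetchall())

cursor.close()
conn.close()
```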
