IQ – Intelligence Quarterly
Journal of Advanced Analytics
1Q 2014 International Edition

Set your data free for the big data world

In this issue
1  Holding a conversation with your data
3  Data management backgrounder
6  Big data in Italy's public sector
9  North Carolina gets tougher on crime with business analytics
12 What was your data doing during the financial crisis?
14 Build customer confidence with better data
17 5 data governance mistakes to avoid
19 Denmark's data bank gives insight into assistance programs
22 Understanding data in motion
25 The role for big data in health care's triple aim
28 Dutch hospital brings analytics to the workplace

IQ intelligence quarterly

Journal of Advanced Analytics

1Q 2014

what’s in this issue?

Editorial Director: Mikael Hagstrom, mikael.hagstrom@sas.com
Editor-in-Chief: Alison Bolen, alison.bolen@sas.com

Today organizations are pulling in social media data, exploring machine learning, and generally aggregating more data than they ever have before. Big data is here, but do you have a strategy for managing big data? Do you need one? Can you use the same strategies you developed decades ago? You’ll find the answers to these and other questions in the following pages.

Intelligence Quarterly is published quarterly by SAS Institute Inc. Copyright © 2014 SAS Institute Inc., Cary, NC, USA. All rights reserved. Limited copies may be made for internal staff use only. Credit must be given to the publisher. Otherwise, no part of this publication may be reproduced without prior written permission of the publisher. SAS and all other SAS Institute Inc. product or service names are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration. Other brand and product names are trademarks of their respective companies. SAS is the leader in business analytics software and services, and the largest independent vendor in the business intelligence market. Through innovative solutions, SAS helps customers at more than 65,000 sites improve performance and deliver value by making better decisions faster. Since 1976 SAS has been giving customers around the world THE POWER TO KNOW. ®

Managing Editor: Lindsay Beth Gunter, lindsaybeth.gunter@sas.com
Copy Editors: Amy Dyson, Chris Hoerter, Evan Markfield, Trey Whittenton
Editorial Contributors: Andrea Acton, Frédéric Combaneyre, Jill Dyché, Sven Hauge, Mirjam Hulsebos, Mazhar LeGhari, Augusta Longhi, Fiona McNeill, Kimberly Nevala, Stuart Rose, Sylvana Smith, Ed Walker, Carolina Wallenius, Lorena Zivelonghi
Art Direction: Brian Lloyd
Photography: John Fernez, Steve Muir



Holding a conversation with your data

Managing data for high-performance analytics opens it up for discussion

Have you been keeping up with the total amount of data being generated over the last few years? We’re moving from petabytes and exabytes to zettabytes and yottabytes faster than anyone ever imagined. And most of our systems were designed for a terabyte world.

So which of the traditional data management methods still provide value? And what new strategies should you be considering? This issue of Intelligence Quarterly bridges the gap between old and new, explains what still works, and examines what’s coming next so you can:

Eighty percent of the world’s data didn’t exist two years ago. If you’re still using the same IT and data management strategies that you used back then, it’s likely your systems are out of date.

• Develop a solid strategy for liquidizing your big data asset, so it can flow throughout the organization. Read our data management backgrounder (page 3) and data governance tips (pages 13 and 17) for relevant background information.

Until recently, conventional IT strategies have focused on generating, moving and storing data, as opposed to putting the data to work through the use of analytics. Instead of just capturing big data, you should be finding the best ways to use and reuse it wisely.

• Start analyzing data as it’s streaming in real time. We discuss event stream processing on page 22, explaining how to analyze streaming data to spot patterns and make decisions when it matters the most.



• Use new, in-memory technologies for analyzing large quantities and different types of data. For example, CSI Piemonte (page 6) is combining and analyzing data from digital libraries, health care records, sensors and the Web to improve public sector programs.
• Visualize your data as soon as possible in the information life cycle. The new data bank application in Denmark (page 19) shows how visual displays of data can be used to benefit citizens and make public policy more relevant.

The stories in this issue about retailers, insurance companies and public sector organizations illustrate the importance of having clean, clear, relevant data that you can use immediately. When was the last time you really had a conversation with your data? And what did it tell you? When you can break down gagged or constrained data structures and set your data free, it speaks more clearly and tells you things you would not know otherwise. That’s how the data starts to talk to you. It’s how you learn what your data has to say. And it draws your attention to questions you wouldn’t have thought to ask.

As head of an expanding global team of 5,000 professionals in 48 countries, Mikael Hagstrom is passionate about providing a culture where innovation can flourish, resulting in market leadership for the organization and its customers. He leads SAS’ Europe, Middle East, Africa (EMEA) and Asia Pacific regions, which account for more than half of SAS’ total revenue.

Unless you make the switch to managing data for high-performance analytics, your data will be an added cost, not an asset. With the right strategy, however, your data can be the competitive advantage you need to increase revenue and reduce costs. So stop amassing data in old structures and start putting it at your fingertips. In the new digital economy decision makers need to connect the virtual with the practical – or the service with the product – and high-performance analytics does just that.

online Follow Mikael Hagstrom: blogs.sas.com/mikaelhagstrom twitter.com/mikaelhagstrom


Data management backgrounder

What it is – and why it matters

You’ve done enough research to know that data management is an important first step in dealing with big data or starting any analytics project. But you’re not too proud to admit that you’re still confused about the differences between master data management and data federation. Or maybe you know these terms by heart. And you feel like you’ve been explaining them to your boss or your business units over and over again. Either way, we’ve created the primer you’ve been looking for. Print it out, post it to the team bulletin board, or share it with your mom so she can understand what you do. And remember, a data management strategy should never focus on just one of these areas. You need to consider them all.


Data Access

What is it? Data is only an asset if you can get to it. Data access refers to an organization’s ability to get to and retrieve information from any source. Data access technology, such as database drivers or document converters, is used to make this step as easy and efficient as possible so you can spend your time using the data – not just trying to find it.

Why is it important? The data that an organization might need can exist in many places – in spreadsheets, text files, databases, emails, business applications, Web pages and social media feeds. Without a good way to access data from these sources, collecting the information becomes a nightmare. Though it is a commonly forgotten element of data management, good data access technology is essential for organizations to extract useful data from any data storage mechanism and format that they have. Without it, trying to get the data you need is like walking into a vast, sprawling library with row after row of bookshelves and being told to look for a specific printed sentence with no instructions, no map, no organization, and no one to help you.
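To make the idea concrete, here is a minimal Python sketch of a single access layer over three hypothetical sources – a CSV export, a SQLite database and a plain text file. The file names, table and column names are invented for the example; this illustrates the concept, not any particular data access product.

# Sketch: one simple access layer over several hypothetical sources.
# File names, table and column names are illustrative assumptions.
import csv
import sqlite3

def read_spreadsheet(path):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))          # rows as dictionaries

def read_database(path):
    conn = sqlite3.connect(path)
    conn.row_factory = sqlite3.Row
    rows = conn.execute("SELECT name, address FROM customers").fetchall()
    conn.close()
    return [dict(r) for r in rows]

def read_text_log(path):
    with open(path) as f:
        return [{"line": line.rstrip()} for line in f]

if __name__ == "__main__":
    # placeholder paths – point these at real sources before running
    records = []
    records += read_spreadsheet("customers.csv")
    records += read_database("crm.db")
    records += read_text_log("notes.txt")
    print("Collected", len(records), "records from three sources")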

Data Quality

What is it? Data quality is the practice of making sure data is accurate and usable for its intended purpose. Just like ISO 9000 quality management in manufacturing, data quality should be leveraged at every step of a data management process. This starts from the moment data is accessed, through various integration points with other data, and even includes the point before it is published, reported on or referenced at another destination.

Why is it important? It is quite easy to store data, but what is the value of that data if it is incorrect or unusable? A simple example is a file with the text “123 MAIN ST Anytown, AZ 12345” in it. Any computer can store this information and provide it to a user, but without help, it can’t determine that this record is an address, which part of the address is the state, or whether mail sent to the address will even get there. Correcting a simple, single record manually is easy, but just try to perform this process for hundreds, thousands or even millions of records! It’s much faster to use a data quality solution that can standardize, parse and verify in an automated, consistent way. By doing so at every step, risks like sending mail to a customer’s incorrect address can be eliminated.
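The address example above can be made concrete with a small, rule-based Python sketch of the standardize-and-parse step a data quality tool automates. The abbreviation map, the single-word city assumption and the field layout are simplifications made for illustration.

# Sketch: rule-based standardization of a raw address string.
# The abbreviation map and field layout are assumptions for illustration.
import re

STREET_ABBREVIATIONS = {"ST": "STREET", "AVE": "AVENUE", "RD": "ROAD"}

def standardize_address(raw):
    # Split "street + city" from "state + ZIP" on the last comma first.
    try:
        left, right = raw.strip().rsplit(",", 1)
    except ValueError:
        return None                              # no comma: flag for manual review
    m = re.match(r"\s*(?P<state>[A-Z]{2})\s+(?P<zip>\d{5})\s*$", right)
    if not m:
        return None                              # unexpected layout: flag for review
    tokens = left.split()
    street = " ".join(STREET_ABBREVIATIONS.get(w.upper(), w.upper()) for w in tokens[:-1])
    return {"street": street,
            "city": tokens[-1].title(),          # naively assumes a one-word city name
            "state": m.group("state"),
            "zip": m.group("zip")}

print(standardize_address("123 MAIN ST Anytown, AZ 12345"))
# -> {'street': '123 MAIN STREET', 'city': 'Anytown', 'state': 'AZ', 'zip': '12345'}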

Data Integration

What is it? Once you have accessed the data, what do you do with it? A pretty common next step is to combine it with other data to present the unified results. Data integration is the process that defines the steps to do this, and data integration tools help you design and automate the steps that do this work. The most common types of data integration tools are known as ETL, which stands for extract, transform and load, and ELT, which stands for extract, load and transform. Today, data integration isn’t limited to movements between databases. With the availability of in-memory servers, you might be loading data straight into memory, which bypasses the traditional database altogether.

Why is it important? Data integration is what allows organizations to create blended combinations of data that are ultimately more useful for making decisions. For example, one set of data might include a list of all customer names and their addresses. Another set of data might be a list of online activity and the customer names. By itself, each set of data is relevant and can tell you something important. But when you integrate elements of both data sets, you can start to answer questions like, “Who are my best customers?” and “What is the next best offer?” Combining some key information from each set of data would allow you to create the best customer experience.
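As a rough illustration of that join, the Python sketch below runs a tiny extract-transform-load step over two in-memory "sources". The customer names, fields and values are invented; a real ETL tool would read from and write to actual systems.

# Sketch of a tiny ETL step: extract two sources, transform (join on a key),
# load the blended result. Field names and values are illustrative only.
customers = [                               # extract: names and addresses
    {"customer": "Ana Silva", "city": "Porto"},
    {"customer": "Jo Smith", "city": "Leeds"},
]
web_activity = [                            # extract: online activity
    {"customer": "Ana Silva", "page_views": 42, "last_purchase": "2014-01-12"},
    {"customer": "Jo Smith", "page_views": 3, "last_purchase": None},
]

activity_by_name = {row["customer"]: row for row in web_activity}

blended = []                                # transform: join on customer name
for c in customers:
    extra = activity_by_name.get(c["customer"], {})
    blended.append({**c, **extra})

# load: here we just print; in practice the target could be a warehouse
# table or, as the article notes, an in-memory analytics server
for row in sorted(blended, key=lambda r: r.get("page_views", 0), reverse=True):
    print(row)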


Data Federation

What is it? Data federation is a special kind of data integration. The ETL and ELT types of data integration combine data and then store it elsewhere for use – traditionally within a data mart or data warehouse. But what if you simply want to look at the combined results without the need to move and store the data beforehand? Data federation provides the capacity to do just that, allowing you to access the combined data at the time it is requested.

Why is it important? While many ETL and ELT data integration tools can run very fast, their results can only ever represent a snapshot of what happened at a certain point in time when the process ran. With data federation, a result is generated based on what the sources of data look like at the time the result is requested. This allows for a timelier and potentially more accurate view of information. Imagine you’re buying a gift for your loved one at the store. As you check out, you receive an offer for another item that complements the gift you’ve chosen and happens to be something your loved one would enjoy. Even better – the item is in stock in the same store. Thanks to real-time analysis of next-best offer data and location data, the retailer enhances your shopping experience by delivering a convenient, relevant offer to you at the right time and the right place.
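The difference from ETL can be sketched in a few lines of Python: a federated view calls the live sources when the question is asked, rather than reading a snapshot copied earlier. The two "sources" below are stand-in functions invented for the example.

# Sketch: a federated view computes its answer at request time from the
# live sources, instead of reading a snapshot copied earlier by an ETL job.
import datetime

def inventory_source():
    # stand-in for a call into the store's live inventory system
    return {"gift_candle": 14, "gift_card": 0}

def pricing_source():
    # stand-in for a call into a separate pricing service
    return {"gift_candle": 9.99, "gift_card": 25.00}

def federated_view(item):
    # combined answer assembled at the moment the question is asked
    stock = inventory_source().get(item, 0)
    price = pricing_source().get(item)
    return {"item": item, "in_stock": stock > 0, "price": price,
            "as_of": datetime.datetime.now().isoformat()}

print(federated_view("gift_candle"))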

Data Governance

What is it? Data governance is the exercise of decision-making authority over the processes that manage your organization’s data. Or to put it another way, it’s making sure that your data strategy is aligned to your business strategy.

Why is it important? Data governance starts by asking general business questions and developing policies around the answers: How does your organization use its data? What are the constraints you have to work within? What is the regulatory environment? Who has responsibility over the data? Once the answers to these questions are known, rules that enforce them can be defined. Examples of such rules might be defining what data users can access, defining which users can change the data versus simply view it, and defining how exceptions to rules are handled. Data governance tools can then be used to control and manage the rules, trace how they are handled, and deliver reports for audit purposes. The auditability aspect of this is perhaps the most vital, as the organization’s leaders have to sign off on the accuracy of financial reports to governance boards, shareholders, customers and governmental bodies. It’s a heavy responsibility and one that carries the risk of censure, heavy fines and even legal action if not handled correctly.
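One way to picture such rules is as data that a central service checks and logs. The Python sketch below is a toy illustration under that assumption; the roles, fields and rule set are invented, and real governance tools are far richer.

# Sketch: access rules expressed as data, checked centrally and logged for audit.
# Roles, fields and the rule set are invented for illustration.
RULES = {
    "customer_address": {"view": {"marketing", "finance"}, "change": {"finance"}},
    "credit_limit":     {"view": {"finance"},              "change": {"finance"}},
}
audit_log = []

def allowed(role, action, field):
    ok = role in RULES.get(field, {}).get(action, set())
    audit_log.append({"role": role, "action": action, "field": field, "allowed": ok})
    return ok

print(allowed("marketing", "view", "credit_limit"))      # False: an exception to follow up
print(allowed("finance", "change", "customer_address"))  # True
print(audit_log)                                          # the trail an auditor would review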

Master Data Management

What is it? Master data management (MDM) is a set of processes and technologies that defines, unifies and manages all of the data that is common and essential to all areas of an organization. This master data is typically managed from a single location, often called a master data management hub. The hub acts as a common access point for publishing and sharing this critical data throughout the organization in a consistent manner.

Why is it important? Simple: It ensures that different users are not using different versions of the organization’s common, essential data. Without MDM, a customer who buys insurance from an insurer might continue to receive marketing solicitations to buy insurance from the same insurer. This happens when the information managed by the customer relationship database and marketing database aren’t linked together, leading to two different records of the same person – and a confused and irritated customer. With master data management, all organizational systems and data sources can be linked together and managed consistently on an ongoing basis to make sure that any master data used by the organization is always consistent and accurate. In the big data world, MDM can also automate how to use certain data sources, what types of analytical models to apply, what context to apply them in and the best visualization techniques for your data.
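The insurance example can be sketched in a few lines of Python: two systems hold slightly different records for the same person, and a master record is built by matching them and keeping the most recently updated values. The match key (first initial plus surname) and the survivorship rule are deliberate simplifications for illustration; real MDM matching is far more sophisticated.

# Sketch: build a "golden record" per customer from two systems.
crm_record       = {"name": "J. Smith",   "address": "12 High St",     "updated": "2013-11-02"}
marketing_record = {"name": "john smith", "address": "12 High Street", "updated": "2014-01-20"}

def match_key(record):
    parts = record["name"].lower().replace(".", "").split()
    return parts[0][0] + " " + parts[-1]              # toy key: first initial + surname

master = {}
for record in [crm_record, marketing_record]:
    k = match_key(record)
    best = master.get(k)
    if best is None or record["updated"] > best["updated"]:   # ISO dates compare as strings
        master[k] = record                            # survivorship: newest value wins

print(master)   # one golden record per customer, shared by every downstream system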




Big data in Italy’s public sector

Northwest Italy’s Piedmont region harnesses big data to deliver better services more efficiently to its citizens

In the Piedmont region of northwest Italy, the region’s public administration has begun to make effective use of big data in a variety of ways using SAS® solutions. These huge and complex data sets are amassed by public sector bodies on a routine basis. The work to achieve this has been led by the CSI-Piemonte consortium, which is owned by 96 organizations within the public administration but operates in the form of a company. The consortium is one of Italy’s largest and most important IT entities. It employs approximately 1,100 people, and its turnover exceeds 160 million euros (US$207 million). “We plan and develop innovative public services that make life easier for citizens and businesses, and facilitate and accelerate dealings with the public administration,” explains Paola Leproni, Head of the Governance Management Area at CSI-Piemonte.

CSI also helps these public entities to cooperate, share best practices and optimize their internal processes. As a result, they can save time, reduce costs and satisfy the needs of their citizens. In addition, CSI encourages the involvement of local companies in public sector projects and helps them respond to calls for tender. It also supports their drive to differentiate and cooperate.

Making full use of big data

In data management, says Leproni, CSI has gone through different development stages over a long period. These stages have ranged from printed records, operational databases and data



banks to the present situation, where CSI links together data produced by numerous Web applications and collects data from citizens and from sensors that monitor the environment. In recent years, one of CSI’s main goals has been to help public administration entities share data among themselves. This involved setting up a single regional public administration database. Following this, CSI began to distribute master data to public organizations and to develop joint use of data. And it also started to share open data via the Internet. Nevertheless, Leproni points out that the volume of data is growing faster than the rate at which it is being utilized. At CSI they no longer talk only of data in general but also about big data, much of which comes via the media, entertainment, health care, video surveillance and social media.

[Photo: Marco Boero, IT Services Manager, and Paola Leproni, Head of Government Management, CSI-Piemonte]

“We believe that it is extremely important to be able to manage all this data,” Leproni says. Different sectors produce different quantities and types of big data. Banks, for example, produce a lot of numeric data, but less in the way of video, image and audio data. The media, on the other hand, produces an abundance of all data types. In some sectors the potential benefits of big data are greater than in others, notes Leproni. “Public administration is in a particularly good position to make use of big data, so this will be our mission in the future,” Leproni says. “We want to make ever better use of big data.”

New analytics power

The public administration entities in the Piedmont region have a total of 1,338 conventional databases, according to Leproni. If the databases are arranged by subject, the number increases to 1,485, because there are a lot of multisubject databases.


Gaining knowledge from Web and social data

Not only is CSI-Piemonte working with traditional data sources, but the group also collects and analyzes Web and social network data to help public sector groups better serve citizens. Listening programs include monitoring citizen tweets from events in the Piedmont region, as well as national and international events. Analyzing the main topics of discussion helps agencies improve plans for future events and react to immediate feedback. Another program focuses on the region’s labor markets by collecting public hiring information from Monster.it, LinkedIn and Twitter for the Piedmont region. Analyzing this data requires text analytics and data visualization from SAS. The results provide insight into the great dynamism of the labor market and job classifications, and increase knowledge about the demand and supply of labor. Another project analyzes institutional portal usage data by accessing public Web logs. The process parses a log file from a Web server and derives indicators about who, when and how a Web server is visited. It monitors four institutional portals and processes 2 terabytes of data each month. The system is developed using grid technologies and business intelligence from SAS to identify top users, top accessed Web pages and most frequently accessed time periods. This service highlights the strengths and weaknesses of the portals.
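As a loose illustration of that kind of log processing (not CSI's actual pipeline), the Python sketch below parses a few common-format Web server log lines and derives the "who, when and how" indicators mentioned above. The sample lines, and the regular expression's assumptions about the log format, are invented for the example.

# Sketch: derive simple indicators from Web server log lines.
import re
from collections import Counter

LOG_LINE = re.compile(r'(?P<host>\S+) \S+ \S+ \[(?P<day>[^:]+):(?P<hour>\d{2})[^\]]*\] "(?P<method>\S+) (?P<path>\S+)')

sample = [
    '10.0.0.1 - - [03/Feb/2014:09:15:02 +0100] "GET /services HTTP/1.1" 200 5120',
    '10.0.0.2 - - [03/Feb/2014:09:47:11 +0100] "GET /contacts HTTP/1.1" 200 2048',
    '10.0.0.1 - - [03/Feb/2014:14:05:40 +0100] "GET /services HTTP/1.1" 200 5120',
]

pages, visitors, hours = Counter(), Counter(), Counter()
for line in sample:
    m = LOG_LINE.match(line)
    if not m:
        continue                      # malformed lines are skipped, not fatal
    pages[m.group("path")] += 1
    visitors[m.group("host")] += 1
    hours[m.group("hour")] += 1

print("top pages:", pages.most_common(2))
print("top visitors:", visitors.most_common(2))
print("busiest hours:", hours.most_common(2))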

CSI has already begun to manage various new big data categories, such as the digital library, health care image data, streaming data and sensor data on the environment. It also distributes this data for public organizations to utilize. “We also aim to create links between this and conventional data, so that we can derive fresh analytical power from both,” says Leproni. At CSI the view is that traditional BI is turning into data science, because data volumes are growing exponentially and a completely new type of data is becoming available for analysis, and because more advanced tools and considerably greater processing power are available for the analysis. For this purpose, CSI requires new kinds of experts who are able to produce business value from the new type of data, says Leproni. “It is essential that we can convert what is purely data and information into knowledge and intelligence,” Leproni adds. “We believe that data visualization will grow rapidly. We want to optimize all our processes and do this using data visualization.”

SAS® infrastructure is the answer

According to CSI’s IT Services Manager Marco Boero, the aims in handling big data are that there should be improved management of the platform and reduced platform costs. “Business, on the other hand, requires more agile production processes for BI services, as well as compliance with SLA agreements made with customers and 24/7 services, task-critical services and protection from interruptions in use,” he explains.

With these needs in mind, says Boero, CSI had to develop its BI platform in order to secure business continuity using architectural solutions. The architectural solution chosen was the SAS analytical infrastructure based on distributed network computing, in-database computing and in-memory computing. Boero says that CSI selected SAS Grid Manager as its practical implementation tool, which allows centralized management of the environment and prioritization of the organization’s policies. Usability for SAS applications is high, notes Boero, as use is not disrupted by maintenance measures. Different applications can be used flexibly and simultaneously. And users can both obtain more data and receive more complex analyses in less time than before. According to Leproni and Boero, public administration is in a particularly good position to make use of big data, which is why it will remain CSI-Piemonte’s mission in the future.

online More on big data: sas.com/bigdata SAS Italy: sas.com/italy



North Carolina gets tougher on crime with business analytics

State saves $12 million annually thanks to better data access and process efficiencies

Increased data volume, archaic information systems, shrinking budgets and constrained resources can hinder law enforcement and criminal justice agencies from effectively coordinating information and proactively maintaining public safety. Public safety agencies need reliable, timely and accurate data to strategically and tactically reduce crime and victimization, enhance public safety and optimize the allocation of finite resources. Challenged with obtaining a comprehensive view of individuals with prior criminal records, including potentially dangerous offenders, law enforcement and criminal justice officials in North Carolina needed an efficient, integrated application to provide quick access to accurate

offender information. To replace the manual process of integrating historical criminal data from multiple systems, reduce the risk of overlooking critical data and improve the information needs of law enforcement agencies, North Carolina’s Office of the State Controller worked with SAS to develop the Criminal Justice Law Enforcement Automated Data Services (CJLEADS) application.

A composite picture, virtually

CJLEADS is an on-demand, Web-based application hosted by SAS. It integrates criminal offender data to provide courts, law enforcement, probation and parole agencies with a complete view of a criminal offender. The system also includes a watch list that allows officials to monitor the change of any offender’s status, such as arrests, future court appearances or a release from custody.



“CJLEADS is a tool to support criminal justice professionals with making quicker and more effective decisions,” says Danny Bell, Program Director at the North Carolina Office of the State Controller. “CJLEADS brings together disparate criminal justice data to help create a more rounded profile of offenders and provides a single source of information from a variety of criminal justice organizations – including court, warrant, probation, parole and local jail information – which agencies can access securely via the Web.”

Weaker budgets, stronger protection

With the CJLEADS system, authorized criminal justice professionals can log in to the application through a secure, Web-based interface to perform searches. Search results on individuals are displayed as summaries, which users click to view more detailed data, such as

an individual’s criminal background. In addition, automated messages can be requested to monitor an individual’s legal status changes. “Because SAS hosts CJLEADS, the state focuses on design and business requirements, rather than procurement and installation and maintenance of a technical infrastructure,” explains Bell. “With shrinking state budgets, leveraging existing computing capabilities and technical support resources continues to be the most economical and efficient way to enhance the application environment.”

Scalability

“CJLEADS is highly scalable. Initially CJLEADS supported 3,000 users – it now supports 26,700 criminal justice professionals and will continue to grow in the coming years,” Bell continues. “Based on improved access to more complete information and continued expansion of the system’s functionality, the state estimates time efficiencies and cost avoidance of $12 million annually.

SAS brought considerable resources to this project and demonstrated a vested interest in public safety. SAS’ expertise in data integration and analytics, as well as strong security controls of the technical environment, application access and authentication, was critical due to the complexity and sensitivity of the data.” Bell says the greatest challenge developing CJLEADS was data quality and the lack of common data identifiers from disparate sources. “Significant time and effort was spent developing consistent business rules to accurately match multiple system records for one individual. Because critical decisions are made based on information in CJLEADS, accurate data integration was pivotal to the project’s success.”

Preventing crime in less time

Bell points to a number of recent criminal arrests that demonstrate the effect CJLEADS is having:


Better citizen services through analytics

CJLEADS isn’t the only system that provides 360-degree views of individual citizens. Here are other examples of government organizations that use a similar approach to hone service delivery in the most efficient and effective way possible.

Combating fraud, waste and abuse. The state of Louisiana uses an innovative statewide fraud and abuse detection system designed to recoup state revenue lost due to illegal practices by some businesses and workers. Using a central data warehouse, state agencies can create a composite view of individuals’ or companies’ interactions with government programs and uncover fraudulent activity.

Improving health outcomes. California’s San Bernardino County Department of Behavioral Health uses a data warehouse to standardize, consolidate and access existing data within and across county organizations. With this integrated knowledge, the agency can create a holistic and integrated view of residents and families, analyze their service utilization and needs, and design effective programs of care.

Using citizen feedback to enhance service delivery. The Hong Kong government’s Efficiency Unit proactively identifies potential public risks and concerns from call detail records, emails and inquiries. Paired with population data from the Census and Statistics Department, the unit can develop greater insights and find innovative solutions to address citizens’ needs in a timely manner.

• One law enforcement agency cross-referenced security video images of an unidentified larceny suspect, who subsequently used a credit card fraudulently. While searching associates of the credit card owner in CJLEADS, investigators found an image that was an exact match with the suspect in the video.
• The state’s Department of Insurance criminal investigations division used CJLEADS to track a fugitive and, discovering the individual was scheduled to appear in a county court, had the person arrested at the appearance, saving a number of investigators several hours of work.
• One officer, questioning occupants of a stopped car, determined that one person being questioned had provided a fictitious name. Searching the alias in CJLEADS, the officer discovered outstanding warrants and arrested the person on-site. Searching CJLEADS

also led to the arrest of three other occupants in the car, who also had outstanding warrants.
• The CJLEADS watch list capability allows users to alert others that they are watching an offender. This feature helped officers alert other officers and track potential gang activity statewide through the use of CJLEADS.
• New online vehicle search capabilities automate a previously manual process to locate vehicles based on partial license plate information. The search returns potential vehicle matches to officers in a matter of minutes rather than days. Officers have indicated that this is especially helpful in hit-and-run situations.

“The strong, collaborative relationship between SAS and the state of North Carolina has been critical to the development of CJLEADS,” Bell adds. “SAS’ knowledge of key technology and best practices, combined with a

flexible, iterative design approach, enabled us to meet the tight, legislatively mandated deadline.”

online SAS® Business Analytics: sas.com/ba Take a look at CJLEADS: cjleads.nc.gov



What was your data doing during the financial crisis?

Financial institutions need to rethink data management strategies to escape the cycle of cataclysms

“Those who cannot remember the past are condemned to repeat it.” Spanish philosopher George Santayana wrote those words more than a century ago. But enlightenment by hindsight is a slow process. Humans still persist in behaviors that do not work, and organizations still persist with information processes that are broken. Nowhere is that more evident than in the financial services industry, where the past has been one of tremendous upheaval, and the future doesn’t look much different. Naturally, the US federal government responded with regulations intended to prevent us from being condemned to repeat the past, such as the Sarbanes-Oxley Act of 2002 and the Dodd-Frank Act of 2010. The 17 countries in the eurozone have more than 40 financial supervisory

authorities – with weak coordination among them – each with its own directives and oversight. The aims of these regulations are worthy – to ensure that financial institutions know the sources and recipients of funds, have transparency in accounting processes, are accurate in financial representations, and have sufficient capital reserves to continue operations even during times of economic and financial duress. However, each new regulation added complexity and reporting burdens in a business environment that was already getting more complex. E-commerce, mobile and online banking, wave to pay and more – new banking channels are dramatically reshaping the data management landscape.




What was your data doing while all this was happening?

A common denominator in all these historical markers is data – either the lack of it or the inability to gain timely and trusted insights from it. Could we have foreseen the mortgage meltdown, the financial institutions’ crises and the recession, if only we had gotten our arms around more data and done more to correlate it? Could the dot-com bubble have been averted if investors had better knowledge about the true valuation of the companies they were investing in? Could crash losses have been avoided or minimized if data systems had detected the early-warning signs?

Yes, of course, but that level of knowledge has been elusive. Many IT architectures were built 10 and 20 years ago for business as usual. But business is not as usual anymore, and financial institutions need to focus on six top data issues:
1. Unified perspective. Customer expectations and regulatory reporting require the ability to easily link quality data across business and product silos, without a wholesale overhaul or creating yet more single views.
2. Data agility. Once you have gained the necessary cross-functional, cross-system perspective, provide the means for data processes to adapt quickly to inevitable future changes.
3. Data definitions. To support enterprisewide decisions and reporting, establish consistency in how you define such elements as creditworthiness, risk tolerance and market segments.

Mazhar LeGhari delivers strategic insight for solutions and best practices in information management based on his deep experience in data governance, data quality, master data management and data integration. Before joining SAS, LeGhari spent 11 years as a solution architect and product manager for enterprise information management systems with two other large software vendors.

4. Defining anomalies. Get consensus about what constitutes a trouble condition, such as a high-risk credit application, fraud or other patterns that should be flagged for investigation.
5. Data-driven processes. Machine-to-machine interactions are commonplace, such as in algorithm-driven trading. Lacking human scrutiny, unmanned systems need trusted data and rigorous early-warning systems.
6. Data governance. Who owns the data? Who manages it? Who can use it? And for what? Traditional governance has been a hybrid model – part centralized with IT and part patchwork entropy. Now that different functions need parallel access to the same data, data should be business-led and IT-managed, with a close coupling between the two.

Until recently, many financial institutions have been buying software applications in silos to suit purposes and market pressures of each line of business. As a result, vendors often position capabilities to retail, commercial or wealth management divisions individually – even though they are all the same organization. Today, banks need to ask whether that siloed approach is still best. If we’re trying to avoid the mistakes of the past, we need to start doing things differently.

online Best practices in data management for big data: sas.com/iq-bigdatamanagement



Build customer confidence with better data

New Zealand insurer’s data quality program produces results in less than a month

To price premiums accurately and maintain the confidence of customers, shareholders and reinsurers, insurance companies need a clear view of risk. Using SAS® to improve data integrity, IAG can now better assess the risk to homes and businesses based on precise location data throughout New Zealand, resulting in more equitable pricing. Part of Insurance Australia Group – Australasia’s largest general insurance group – IAG’s New Zealand commercial division sought a departmental solution to quickly and affordably improve the integrity of the data coming through its commercial lines.

Location is key to understanding risk

One way insurers assess risk is by knowing the exact location of an insured business or household using longitude

and latitude geocode data rather than relying on a postcode. Using geographical information systems (GIS), insurers can pinpoint properties to their exact location and overlay them with risk ratings. This more accurately assesses the risk to a single property so premiums can be priced more equitably. When Carl Rajendram joined IAG as National Manager for Commercial Pricing and Analytics, the first thing he noticed was a need for better data quality. “There were a lot of fields captured in the core system that had not been validated, and we needed a solution to improve the data,” he said. The company wanted to ensure its customer address records achieved a better match rate with its geocoding tool.



Already using SAS for data exploration and reporting, IAG turned to SAS for a risk-profiling solution. “We are a team of five people who could have built something to achieve the result, but it would have taken three to four months,” Rajendram says. “The ‘off-the-shelf’ SAS Data Quality Desktop solution returned the results we were looking for within one month.”

‘Quick win’ justifies business case

Rajendram said SAS New Zealand was keen to support a quick win for IAG to produce a broader business case for the solution. “We used the SAS Data Quality solution to validate our address data for accurate geospatial information to better manage the organization’s risk exposure,” he says. “SAS spent a couple of days showing us how the solution could be used to standardize the addresses,

and we were impressed with how quick and easy it was.” “As an extreme example, when it comes to pricing a premium, you can’t compare Canterbury with Christchurch,” Rajendram explains. “Previously, we were using codes from the Territorial Local Authority database, but these cover a wide area. SAS enabled us to go down to x and y coordinates, which has made a huge difference in our ability to price and understand our risk profiles.” Geocoding allows IAG to identify risk factors, such as whether a property is on a slope or near the coast, and then make appropriate pricing and underwriting decisions.

SAS® brings confidence to data quality

Rajendram says the tool provided clarity on data issues and improved confidence in the data itself.


Intersection of telematics and insurance

In telematics, wireless mobile devices collect and send information on specific vehicles’ location and usage. What does this mean for business? Similar to using geographical location data for risk profiling, insurers and third parties can put the huge amount of data produced by telematics devices to work for them.

“Before, only about 70 percent of addresses were of a standard that allowed geocoding,” he says. “After we implemented SAS, the number exceeded 85 percent. Within a month of installing SAS, we achieved what we were after. And with a match rate of nearly 90 percent, we can now present this number to reinsurers with greater accuracy and confidence. “The speed of execution was good. Within a week and a half, we were seeing a return on our investment,” he says. IAG now uses the solution for other projects.

For instance, data points on date, time, speed, latitude, longitude, acceleration or deceleration, cumulative mileage, and fuel consumption can be recorded and used to develop more accurate pricing. Depending on the frequency and length of trips, these data sets can represent more than 500 MB per vehicle per year.

“We have initiated a data quality program for our commercial business division, which will form the start of a data governance project,” Rajendram says. “We use it to profile some of the data and provide a feedback loop to stakeholders such as the distribution team, business owners and operational staff. This type of visibility wasn’t available previously.”

Here are some additional examples of potential uses for insurers and others:


• Claims. Vehicles fitted with telematics devices drastically shorten the amount of time between an accident and a report date. Shorter claims cycles translate to savings for insurers.
• Marketing. Insurance companies use telematics data to segment customer data for marketing campaigns.
• Government usage. Transportation agencies use the data to improve the flow of traffic, and state and local governments use the data for remote emission testing, rather than relying on an annual inspection of the vehicle.

About IAG New Zealand Ltd.

IAG New Zealand Ltd. is New Zealand’s largest general insurer. It is part of Insurance Australia Group (IAG), which is headquartered in Sydney and is Australasia’s largest general insurance group.

IAG New Zealand Ltd. offers the majority of its products under the NZI, State and AMI brands. NZI specializes in providing business, rural and personal insurance through brokers and financial institutions, whereas State and AMI offer insurance for predominantly personal assets such as home and vehicle, with policies sold directly to the public.

online SAS solutions for insurance: sas.com/ins SAS New Zealand: sas.com/nz



5 data governance mistakes to avoid

Managing expectations and understanding corporate culture are essential

Data governance has become a veritable rubric for all things data. Google the term and you’ll come up with references to data quality, metadata, data warehousing, data ownership and data security – to name just a few. Data governance is, simply, an organizing framework that aligns strategy, defines objectives, and establishes policies for enterprise information. As promising as that might sound, data governance has failed in more than one well-meaning company because people misinterpreted its meaning, its value, and the shape it would eventually take. Once data governance becomes a dirty word, an organization rarely gets a second chance. “You can’t use the word governance here,” one executive confided recently. “We’ll have to call it something else.”

Here, we provide advice to save you from similar fates.

Mistake No. 1: Failing to define data governance

Using “data governance” synonymously with “data management” is a common mistake. Data governance is the decision-rights and policymaking framework for corporate data. Data management is the tactical execution of those policies. Both require executive commitment, and both require investment, but data governance is a business-driven process, while data management is an IT function. How you define data governance and how your organization understands it is crucial. Your governance program must clearly define and articulate its mission and value.

Mistake No. 2: Failing to design data governance

Designing data governance means tailoring it to your company’s specific culture, organizational structures, and decision-making processes. If you design a program for minimizing security breaches as your top priority when your company cares more about enriching the customer experience, you’re designing the wrong program. Your company’s needs are unique and your data definitions, rules, and policies should be too. Deliberate design ensures that governance supports the way your company does business. It also ensures that constituents know what data governance will look like before it’s launched.


Mistake No. 3: Prematurely launching a council

An earnest visionary perceives the need for data governance. A council of data stakeholders is convened. Everyone agrees to meet regularly, discuss prevailing data issues and address problems. At the follow-up meeting, fewer people show up. Someone complains the company has never really defined the term “customer.” Someone else pipes up about bad data on the billing system. A sidebar conversation starts on CRM consolidation. A third meeting never happens. In this all-too-common example, data governance isn’t overtly canceled. It simply fizzles. Until a core team of stakeholders deliberately designs a data governance framework that includes guiding principles, decision rights, and the appropriate governing bodies, no cross-functional council will have the clarity or the mission to effect change.

Mistake No. 4: Treating data governance as a project

In a well-intended effort to fix what’s broken, many companies will announce a data governance “project” with flourish and fanfare. When data governance is

formed as a discrete effort, however, instead of being “baked in” to existing processes, it will fail. When an initiative is deemed a project, it is, by definition, finite. The reality of data governance is that it should be continuous and systemic. As information needs change, data volumes increase, and new data enters the organization via new systems or third parties, decisions about how to treat, access, clean and enforce rules about data will not only continue, they’ll also proliferate. A structured, formal, and permanent process should be retrofitted into the way a company develops its data and conducts its business.

Mistake No. 5: Prematurely pitching data governance

In the first phase of its data governance program, a national financial services company solicited several business and IT subject-matter experts to function as data stewards. The stewards were tasked with identifying high-impact data issues within their domains that governance would rectify. The stewards did an excellent job. The problem: There was no defined procedure to validate, prioritize, or resolve the ever-increasing flood of identified business problems whose root causes could be attributed to data issues.

Jill Dyché is an acknowledged speaker, author and blogger on the topic of aligning IT with business solutions. As the Vice President of SAS Best Practices, she speaks, writes and blogs about the business value of analytics and information.

As the Director of Business Strategies for SAS Best Practices, Kimberly Nevala is responsible for developing industry thought leadership and key client strategies in the areas of business intelligence and analytics, data governance, and data management at SAS. She is the co-author of the first e-book on data governance, The Data Governance eBook: Morals, Maps and Mechanics.

The team expended significant effort to expose painful data sores without a method to heal them. A majority of the issues uncovered were good candidates for governance, but the lack of appropriate expectation-setting led to frustration and mistrust. Data governance became a dirty word, and getting business owners back to the table to talk about implementation remained an uphill battle.

Conclusion: Take your time and do it right

The mantra “think globally, act locally” is particularly apt when embarking upon data governance. The issues addressed by data governance are far-flung and pervasive, so successful programs begin with a series of tightly scoped initiatives with clearly articulated value and sponsorship. While an incremental approach takes time, not to mention patience, it engenders support by demonstrating the value of governance in a context relevant to each stakeholder or sponsor. Most important, a phased approach establishes data governance as a repeatable, core business practice rather than a one-time project.

online Read the full white paper on launching a data governance program: sas.com/iq-datagov



Denmark’s data bank gives insight into assistance programs

Visualization provides a single version of the truth about the country’s social services

The National Board of Social Services overcomes complexity in the Danish social sector by collecting and visualizing information about all services available to its citizens. Users can access interactive visualizations and maps. The system is powered by SAS® Visual Analytics software. The data bank has many audiences and many uses, including:
• All civil servants have one source of information for every therapeutic option and institution in the country, so they can easily find the right offer for a client.
• Health and wellness professionals have one unambiguous database that documents the results of all addiction therapies and facilities.

• Social workers can access information about experts and researchers regarding socially disadvantaged children and young people, all in one place.
• Citizens and journalists can be better informed about programs and facilities nearby.

These are just some of the important objectives Denmark is aiming for with the country’s new data bank. Improving welfare and the effects of social programs despite shrinking budgets is a tough fact of life in the welfare state of Denmark. Rising demands for social services make the work even more difficult. Adding to this picture is the complexity of the social sector.



The road from political legislation to real outcomes and benefits for children, addicts and the disabled is not straightforward. Local authorities deliver nearly all social services, and they also manage independent institutions with social workers and professionals who are close to citizens and clients. To overcome the inherent potential for conflicts and disagreements in this system, the Danish state has established a fact-based and national infrastructure for data about its social programs.

Opening up with visualization

“Benefitting from knowledge” is the motto of the National Board of Social Services, which brings with it a goal of benefitting society through data and research. The board’s task is to support Danish municipalities’ social efforts for children, the disabled, the elderly, and the socially disadvantaged, and ensure these efforts are successful. The board, which is part of the Ministry of Social Affairs, strives to ensure that social intervention is based on knowledge of

what works. Therefore, the data bank on the website is highly logical and a key component of the board’s knowledge dissemination to ministries, organizations, municipalities, consultants and all others with an interest in Danish social politics. This year, the board has started using SAS Visual Analytics for the data bank, and the first clear effect is a more rapid dissemination of facts. “We want to release data so that users outside the board also have access to model and create new insight,” says board BI Architect Allan Vestergaard. “With SAS Visual Analytics, we can provide data and tools for the users so they can work directly with our data bank, even if they are located elsewhere.”

Portals of knowledge

The board collects data in a number of knowledge portals that other groups – including municipalities and institutions – then update with their data. For example, the offer portal delivers qualitative and


quantitative data and practical information on 5,000 social services providers. This includes residential care facilities, residential institutions and treatment facilities. Caseworkers in the municipalities come here to look for relevant programs and treatments for their clients – according to diagnosis, geography or cost. Other data sources include the Drug Addiction Database and the Knowledge Portal for Socially Disadvantaged Children and Young People. Data from these portals is placed in one data warehouse and used as the basis for the board’s quick visual presentation of facts for the public eye.

Common language

Board leaders know there are many different perspectives on methods, quality and good practice in social services. Therefore, they have worked diligently to create a common language through facts. Denmark’s national infrastructure is based on concepts that are defined by World Health Organization standards and interaction with professionals and experts. In addition, the standards are continuously evolving to reflect reality. One aspect of the work is a large element of deregulation, because the concepts are unique and provide a common basis for reporting across both health care and social services. Knowing that the audiences for this information were varied, the board decided to differentiate the way information is disseminated to them. Some users benefit from adapting and elaborating reports, so the board offers them a variety of options to customize the information. Expert users can even tailor the way they explore and visualize data for their needs. SAS Visual Analytics is used for precisely this purpose.

Implemented in 8 days

“Our output is faster and more up-to-date, and in this way the technology contributes to solving our main task,” said Vestergaard. “It took exactly eight days from the time when the hardware part was in place until we had developed and published the first report on our website. We installed and configured SAS Visual Analytics and started working on our data. I see this as a great success. It was exactly the speed we demanded.” The Board plans to extend the data bank later this year with multiple data fields and more reports to provide facts and knowledge for researchers, journalists and interest groups, thus contributing to knowledge-based social politics for the benefit of citizens.

online SAS® Visual Analytics: See what you’ve been missing: sas.com/va

Two ways to design data for visualization

SAS Visual Analytics turns the classic data mart approach to data design for reporting on its head. Since aggregations and analytic routines now happen on the fly, you have more options for data design that takes all of your data into account. Consider a case from the public sector where different agencies have some form of data warehouse to organize their data into facts and dimensions. SAS can be used to join these tables, add business-friendly column names, add formats and produce a wide, flat, long and denormalized table optimized for analytical exploration and ad hoc reporting. Alternatively, data can be imported directly into local storage, so you can push a local spreadsheet to the SAS server and start visually exploring the data in a few clicks.
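A minimal sketch of the first approach, using SQLite in Python as a stand-in for an agency data warehouse: a fact table is joined with two dimension tables and given friendlier column names to produce one flat table for exploration. All table and column names are invented for the example.

# Sketch: join a fact table with two dimensions into one wide, denormalized table.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE fact_cases(case_id, citizen_id, program_id, cost);
CREATE TABLE dim_citizen(citizen_id, municipality);
CREATE TABLE dim_program(program_id, program_name);
INSERT INTO fact_cases VALUES (1, 10, 100, 2500), (2, 11, 101, 1800);
INSERT INTO dim_citizen VALUES (10, 'Aarhus'), (11, 'Odense');
INSERT INTO dim_program VALUES (100, 'Addiction therapy'), (101, 'Residential care');
""")

db.execute("""
CREATE TABLE flat_for_exploration AS
SELECT f.case_id       AS "Case",
       c.municipality   AS "Municipality",
       p.program_name   AS "Program",
       f.cost           AS "Cost (DKK)"
FROM fact_cases f
JOIN dim_citizen c ON c.citizen_id = f.citizen_id
JOIN dim_program p ON p.program_id = f.program_id
""")
for row in db.execute('SELECT * FROM flat_for_exploration'):
    print(row)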



Understanding data in motion

Event stream processing discovers interesting patterns in huge streams of data in motion

Data pours into the organization from every conceivable direction: from operational and transactional systems; from scanners, sensors and smart meters; from inbound and outbound customer contact points; from mobile media and the Web. Those streams of data contain a wealth of potentially valuable insight – if you can capture and analyze it. But how do you manage such a torrent of data? Where would you store it? How long would it take to make sense of it? Traditional approaches, which apply analytics after data is stored, may provide insights too late for many purposes – and most real-time-enabled applications can’t deal with this much constantly flowing data. Here’s an idea: Analyze it on the fly. Find what’s meaningful, grab only what you need, and get instant insights to react

immediately and make the best decisions as data is flowing in. That’s the promise of event stream processing. Event stream processing continuously analyzes data as it flows into the organization, and then triggers an action based on the information flow. It is a form of complex event processing that empowers you (or an automated system) to spot patterns and make decisions faster than ever.

Three steps for streaming data

Managing data in motion is different from managing data at rest. Event stream processing relies on three principal capabilities – aggregation, correlation and temporal analytics – to deal with data in motion.

Aggregation. Let’s say you wanted to detect gift card fraud: “Tell me when the value of gift card redemptions at any point-of-sale (POS) machine is more than $2,000 in an hour.”



Event stream processing can continuously calculate metrics across sliding time windows of moving data to understand real-time trends. This kind of continuous aggregation would be difficult with traditional tools. With the SAS® Event Stream Processing Engine, it’s built in.

Correlation. Connect to multiple streams of data in motion and, over a period of time that could be seconds or days, identify that condition A was followed by B, then C. For example, if we connect to streams of gift card redemptions from 1,000 POS terminals, event stream processing could continuously identify conditions that compare POS terminals to each other, such as: “Generate an alert if gift card redemptions in one store are more than 150 percent of the average of other stores.”

Temporal analysis. Event stream processing is designed for the concept of using time as a primary computing element, which is critical for scenarios

where the rate and momentum of change matters. For example, sudden surges of activity can be clues to potential fraud. Event stream processing could detect such surges as they occur, such as: “If the number of gift card sales and card activations within four hours is greater than the average number of daily activations of that store in the previous week, stop approving activations.” Unlike computing models designed to summarize and roll up historical data, event stream processing asks and answers these questions on data as it changes. These three capabilities set event stream processing apart from other approaches by revealing what’s happening now, not just what happened in the past, so you can take action immediately.
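To illustrate the aggregation capability, here is a minimal sketch of the gift card rule above as a sliding one-hour window. It is plain Python rather than SAS Event Stream Processing Engine syntax, and the event fields and alert handling are assumptions for the example.

```python
# Illustrative sketch only - not SAS Event Stream Processing Engine code.
# Continuous aggregation over a sliding time window: flag any POS terminal
# whose gift card redemptions exceed $2,000 in an hour.
from collections import defaultdict, deque
from datetime import datetime, timedelta

WINDOW = timedelta(hours=1)
THRESHOLD = 2000.00

# One rolling window of (timestamp, amount) per POS terminal
windows = defaultdict(deque)

def alert(terminal_id, total):
    print(f"ALERT: terminal {terminal_id} redeemed ${total:,.2f} in the last hour")

def on_redemption(terminal_id, timestamp, amount):
    """Process one gift card redemption event as it arrives."""
    window = windows[terminal_id]
    window.append((timestamp, amount))

    # Drop events that have slid out of the one-hour window
    while window and timestamp - window[0][0] > WINDOW:
        window.popleft()

    total = sum(a for _, a in window)
    if total > THRESHOLD:
        alert(terminal_id, total)

# Example: a burst of redemptions at one terminal trips the rule
now = datetime(2014, 1, 15, 9, 0)
for minutes, amount in [(0, 900), (10, 700), (25, 500)]:
    on_redemption("POS-042", now + timedelta(minutes=minutes), amount)
```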

Enrich and empower your analytic applications

Event stream processing really proves its value when it is embedded into analytical applications, such as risk management, fraud detection and prevention, anti-money laundering and customer intelligence. Event stream processing can be used to detect patterns and filter relevant events to send to analytic solutions before data is stored. Or it can detect when the data for a specific independent calculation is available so that calculation can run immediately, rather than having the analytic solution wait until all data is available before running the most time-intensive calculation. With the ability to process millions of records per second (with latencies around a microsecond), the possibilities are limited only by imagination.

Event stream processing at a global bank

Complex event processing doesn't mean the technology is too complicated to use. Quite the opposite. The term refers to the ability to identify complex sequences of conditions – as the data moves through the organization. For example, one global bank is using event stream processing to perform value-at-risk (VaR) calculations. The bank built a system that aggregates trading data on the fly into a single repository. So instead of waiting until the following day to get the latest reports, the risk management team can have an up-to-date view of its risk exposures on an intraday basis. The bank can now individually evaluate every trade before the deal is made, and then accumulate it with other current trades to evaluate trends. Finally, the bank can synchronize this data with the global corporate data and change course on any decision if the VaR exceeds its risk appetite.

Here are some idea-starters (the last one is sketched in code after this list):

• When transactions against the same credit card number come from four or more companies within one minute, deny the next request, flag the account and send a message to the fraud detection dashboard.
• When stock level for the book The Da Vinci Code drops to 10 percent of minimum, given the last 10 hours of buying behavior, trigger the distribution center to begin the restocking process.
• How many website visitors are going from the home page to About Company and clicking My Profile during a rolling, 10-minute window?
• If the time between in-store credit card transactions in different cities is less than the travel time between those cities, put the account on hold and flag it for investigation.
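As a minimal sketch of the last idea-starter, the check below compares the time between two in-store transactions on the same card against an assumed minimum travel time between the cities involved. It is generic Python, not SAS Event Stream Processing Engine syntax, and the city pairs, travel times and card numbers are made up for illustration.

```python
# Illustrative sketch only - hypothetical data, not SAS ESP syntax.
# If two in-store transactions on the same card happen in different cities
# faster than anyone could travel between them, flag the card.
from datetime import datetime, timedelta

# Assumed minimum travel times between city pairs (order-independent)
MIN_TRAVEL = {
    frozenset({"Copenhagen", "Amsterdam"}): timedelta(hours=1, minutes=30),
    frozenset({"Copenhagen", "Stockholm"}): timedelta(hours=1),
}

last_seen = {}  # card number -> (timestamp, city) of the previous transaction

def on_instore_transaction(card, timestamp, city):
    """Check one in-store transaction against the card's previous location."""
    if card in last_seen:
        prev_time, prev_city = last_seen[card]
        if city != prev_city:
            needed = MIN_TRAVEL.get(frozenset({prev_city, city}), timedelta(hours=2))
            if timestamp - prev_time < needed:
                print(f"HOLD: card {card} seen in {prev_city} and {city} "
                      f"only {timestamp - prev_time} apart")
    last_seen[card] = (timestamp, city)

# Example: two purchases 20 minutes apart in cities at least an hour apart
on_instore_transaction("5500-XXXX", datetime(2014, 1, 15, 10, 0), "Copenhagen")
on_instore_transaction("5500-XXXX", datetime(2014, 1, 15, 10, 20), "Stockholm")
```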

Event stream processing answers such questions while reducing storage requirements, computing demands and time to decision. When you consider the terabytes, petabytes and exabytes flowing in and around the organization, there's enormous value in being able to quickly find the meaningful nuggets and use them to make better decisions, faster.

Passionate about technology and innovation, Frédéric Combaneyre is an expert on event stream processing solutions. During his 19 years in the software industry, he has covered many domains, from business intelligence to information management. Combaneyre supports customers in multiple industries and speaks on a wide variety of topics at SAS and external conferences.

online SAS® Event Stream Processing Engine: sas.com/iq-esp



The role for big data in health care's triple aim

Improving care, improving health and reducing costs with analytics

Health care policies and practices differ tremendously around the world, but three objectives are common regardless of the health care system. In the industry, they are referred to as health care's triple aim:

1. Improving the patient experience (including quality and satisfaction).
2. Improving overall population health.
3. Reducing the per capita cost of health care.

It might seem counterintuitive that you could accomplish all three at once, but there are changes afoot in many aspects of the industry that are enabling a simultaneous shift in all three areas. In this article, I will focus on three areas in particular that are driving change and increasing the use of analytics in health care. They are:

1. The promise of value-based health care.
2. The ability to track more aspects of health care.
3. The trend toward engaging patients as more active consumers.

Each of these areas alone has the potential to contribute substantially to a positive development in how health care is practiced and paid for, and together, they have the potential to significantly improve the triple aim.

The promise of value-based health care

Since traditional methods have proven insufficient to manage health care, and modern technology has created the means to analyze large quantities of information, the timing is right to move toward a value-based perspective in health care.




What is value-based health care? It is an approach that focuses on health outcome per dollar spent. The value-based system pays providers based on their contributions to desired outcomes. Traditional fee-for-service or pay-for-volume methodologies have proven over time to discourage achieving the triple aim. With a value-based approach, quality and satisfaction increase, and innovation is encouraged.

Let's consider payment for hip replacements. The traditional method provides payment according to the number of procedures performed and – in some systems – leads to a fixed number of procedures. Payers who define and reimburse according to patients' desired functions or results have more incentive to achieve patient value.

The push toward a value-based emphasis will require new systems for managing and analyzing health care data, including new efforts for standardization, measuring outcomes and understanding quality measures in health care. Ultimately, these efforts will lead to a number of benefits. Value-based health care increases pressure for innovations, which could make operations easier and discourage unnecessary operations. Processes that focus on quality and increased health outcome per dollar spent will increase the efficiency of health care. Furthermore, a successful introduction of value-based health care can have positive effects far beyond medical care domains. Clinical researchers, as well as industries like pharmaceuticals and medical technologies, have everything to gain from a value-based approach in the health care industry.

The ability to track more aspects of health care

Value-based health care puts high demands on the ability to record and monitor data regarding specific information and achieved quality. Typically, a risk-based adjustment is applied so that comparisons are fair. This adjustment accounts for different results depending on the age and health of the patient, among other factors.

One way of approaching these information needs is to establish registries that focus on health care outcomes. Sweden provides a good example of how a long tradition of disease registries can contribute to platforms for data management and analysis in health care. Sweden has nearly 90 government-supported disease registries (also known as national guidelines), many of them established by the medical societies of the relevant specialty. These registries contain historical data related to patients with a specific diagnosis or condition.

Sweden's registries cover conditions representing more than 25 percent of total national health expenditures and provide a solid base for Swedish health care to benchmark and assess performance on various aspects of the health care system. The registries enable in-depth analysis of performance variations. Best practices can be identified and continuously improved over time by evaluating registry data. There are several examples from Swedish health care where improvements focused on quality but also led to cost control and reductions.

Finally, the aggregate data in the registries is made available to the public, which leads to demonstrated clinical improvements as well as the ability to understand comparative effectiveness. Since citizens can access the information, it provides a competitive incentive for even better results. For example, a recent study of Sweden's National Registry for Acute Coronary Care showed that hospitals with the strictest adherence to standards scored higher on all five value-based indicators studied.

Engaging patients as consumers

With more and more health care data becoming available, patients are taking an active interest in their health care choices. For starters, patients have become more eager to evaluate services using information from the Web and other channels. This underlines the importance of using analytics for risk adjustment to provide the consumer with the right information to make informed and fact-based decisions. As patients become more informed, they not only demand the most modern health care, but they also tend to take a role in collecting and sharing their own health care data with providers. This can increase patient adherence to physician instructions, which is critical for avoiding adverse events and post-discharge gaps in appropriate care.

The Swedish Rheumatoid Arthritis Registry is another example of an opportunity to continuously monitor patients' conditions and treatment outcomes. With rheumatoid arthritis, it is essential for patients to be engaged in their own treatments. Doctors therefore provide the patient with access to a Web interface or even an app where she answers a number of questions and evaluates her own health status. The patient assesses her ability to perform daily activities, such as vacuuming, taking a bath and cutting meat. The patient also indicates the degree of pain and registers swelling and tenderness herself. In this way, the doctor gets access to the current state of the patient's health. The physician can analyze the patient's health status over time and also compare it with data from the disease registry. Based on this proactive analysis, the treatment can be adjusted and potential problems prevented.

Big data and the triple aim

Gathering large amounts of data in health care has previously been time-consuming for clinical staff. New technologies such as high-performance analytics are making it easier to turn large amounts of data into critical and relevant insights that can be used to provide better care. Analytics can even be used to predict negative reactions and intervene before they become a problem. For example, unstructured data can be captured via text mining from patient records for use in disease registries. This means information can be gathered without causing additional work for clinicians.

With a large information base, analytics can answer questions such as how and why outcomes differ between hospitals, clinics and even between doctors performing the same care. Transparent, highly available information can thus improve quality and encourage innovation. As information becomes increasingly available, transparent and comparable, patients will also be empowered and more involved in their own treatment via online health applications, which can integrate patient information with their health records and make it available to clinicians. A large amount of data gathered from different sources reveals today's best practices and will help health care providers identify trends so they can achieve the triple aim.

What can we predict?

Huge amounts of health data are now being collected, including e-records, health databases and personal health care records. How can this data be used to improve health care? These are just some of the important predictions that can be made:

• Identify patients and populations at risk.
• Understand adherence to prescriptions.
• Forecast future health care utilization.
• Identify adverse drug events.
• Improve selection of candidates for patient-centered interventions.
• Predict future health outcomes.
• Identify costly procedures, waste and delays.
• Detect safety issues and risks.

Journal of the American Heart Association. "Quality Improvement in Coronary Care: Analysis of Sustainability and Impact on Adjacent Clinical Measures After a Swedish Controlled, Multicenter Quality Improvement Collaborative." August 6, 2012. jaha.ahajournals.org/content/1/4/e000737.full

online Follow the SAS health care blog: blogs.sas.com/content/hls

Carolina Wallenius is a senior advisor who specializes in applying modern technology and new ways of working to improve health care efficiency. She also focuses on e-health diagnostics, primarily for malaria. Wallenius has held leading positions within the Swedish health care industry for more than 10 years, including head of health care reimbursement within Stockholm County Council.



Dutch hospital brings analytics to the workplace

From data warehouses to data visualization, Ziekenhuis Gelderse Vallei expands analytics from back office to patient care

Like many modern hospitals, Ziekenhuis Gelderse Vallei has implemented an organizational structure that is based on results, with a system that measures performance for individual profit centers or units. This means that business decisions are made at lower levels within the organization. In order to provide these units with a way to measure performance and plan for the future, Ziekenhuis Gelderse Vallei implemented SAS Visual Analytics. In five questions, Rik Eding, a data specialist for the hospital, explains the importance of this technology and how it affects his organization.

How important is data for your organization?

Rik Eding: In 2005, health care in the Netherlands was liberalized. In that same year, we started building a data warehouse and implementing business intelligence (BI). In those early days, we used SAS only in the finance department. But throughout the years, we collected more and more care-related information in our data warehouse so we could examine our primary processes. Analyzing care data also helps us interact better with the insurance companies. After all, the hospital has a much better view of its market share than the insurer.

Whenever necessary and relevant, we can also add external data sources to our data warehouse. One example is patient social background information from the Institute of Social Research. It is commonly known among doctors that people of a lower socioeconomic standing have a higher mortality rate. One of our doctors wanted to link patient diagnoses with socioeconomic backgrounds to better understand mortality risks. This way, we can better treat each patient based on his or her individual situation.

Who uses the obtained insights – and what for?

Eding: The entire hospital uses the information. As indicated earlier, we started with BI in the finance department and steadily added new sources to our data warehouse so we could do more analyses. We have now invested in SAS Visual Analytics, so we can provide our units with timely information. Previously, our colleagues on the floor could not produce reports or generate analyses. They used to ask us to do that for them. In our experience, every report we make generates 10 subsequent questions. With SAS Visual Analytics we give them the tools to create these reports themselves – and find the answers.



How do analytics and visualization help with decision making in your hospital?

Eding: It helps in many ways. We analyze logistical processes around patients, and this allows us to identify potential bottlenecks. For example, we recently discovered that the average time patients had to wait for hernia treatment had gone up. When we looked closer at the data, however, we found that two patients had postponed their operations due to holidays. When we left those two cases out, it turned out that the average waiting period had in fact gone down. This is meaningful information.

In another example, one of our lung specialists requested that we include data from the Royal Meteorological Institute in our data warehouse. Based on that data, they can now use weather patterns to better predict when patients are more likely to have breathing complaints, and adapt the treatment accordingly. This is fantastic, of course.

What results has Ziekenhuis Gelderse Vallei achieved with visual analytics?

Eding: We are currently in the middle of rolling out the solution, so we are still looking forward to reaping real, measurable results. We are not worried, though. There is plenty of low-hanging fruit. Analytics has come a long way from being just the tool of the finance department that it once was. It has long since become a part of the care process itself, initially to monitor KPIs such as waiting lists, length of hospital stay and the number of treatments. Now, it is used increasingly in medical areas. As a result, we are able to really improve the quality of care, and the financial people are happy with that as well, because it often means a reduction in costs. It goes both ways.

What do you still hope to achieve with analytics and visualization?

Eding: I'd like for everybody to catch the analytics bug. As a BI team, we can provide reports, but the units know which information they need the most. This is also the reason why my role keeps moving from data specialist toward information analyst. I help the individual cost centers along and encourage them to generate their own ideas for possible analyses. It's a fun job, because with a tool like SAS Visual Analytics it is easy to get people excited. It looks great and it is easy to use. Whenever I give a demonstration, users start beaming with creativity, and they will ask, "Can I do this or that, too?"

There is a danger in that, though. I notice myself that it's pretty addictive to dig deeper and deeper all the time. Once you have established the broad outlines, you keep going to find new correlations. It's hard to stop. Before you know it, our psychiatrists are going to be working overtime, treating their own colleagues for this new form of addiction [laughter]. When that happens, I will have reached my goal.

online How different companies use data visualization: sas.com/iq-futurebright SAS Netherlands: sas.com/netherlands


