
Michael Rasmussen (Corporate Integrity) ■ Dennis Drogseth (Enterprise Management Associates) ■ Adrian Polley (Plan-net)
THE INDEPENDENT RESOURCE FOR IT EXECUTIVES ■ August 2009


Risky Business: Strategy is essential to GRC success

Paul Williams ISACA London Chapter

ETM ■ Contents page


July 2009


7 Editor's page and contributors

8 The path to success

Is there a way to integrate and reconcile

different management sources that actually works, or is it just another “Holy Grail?” DENNIS DROGSETH (ENTERPRISE MANAGEMENT ASSOCIATES) tells us how it’s possible to make your CMDB system initiative a success in 2009.

12 Finding the right balance

There has been a lot of attention and focus on GRC, but not all of it is positive. Organizations need to look at where GRC is headed and at the application of technology to possible GRC strategies. MICHAEL RASMUSSEN (CORPORATE INTEGRITY) moderates a discussion on these issues with the help of SIMON TAYLOR (COMMVAULT), SAM HARRIS (TERADATA CORPORATION), MANOJ KULWAL (SAS), and MARTIN KLING (IDS SCHEER).

24 It’s all in the planning...

Now is the time to implement a sustainable GRC strategy in your organization. Creating a plan of action today will see greater return in the future, says MICHAEL RASMUSSEN (CORPORATE INTEGRITY).

28 Journey to the center of corporate information ecosystems: The crucial realignment of IT and business priorities
COMMVAULT tells us how the power of Singular Information Management® provides the catalyst for delivering corporate change and shared ownership of valuable data assets.




Contents page n ETM

32 A tailored solution

ADRIAN POLLEY (PLAN-NET) examines whether ITIL—the best practice approach to IT Service Management—can actually deliver genuine business benefits, and if it will survive the recession intact.


36 Walk the line

Some organizations will tell you that PCI DSS is too complex; some say it’s an extremely weak standard, but is it still relevant and doing its job? ETM’S ALI KLAVER addresses the issues, and some new wireless standards, with expert advice from MIKE BAGLIETTO (AIRTIGHT NETWORKS) and ANTON CHUVAKIN (QUALYS).

41 Spend less...and improve access control

There are many examples of companies that were unable to fulfil all compliance requirements—with negative consequences. ALI KLAVER (ETM'S MANAGING EDITOR) talks to MARTIN VLIEM (MICROSOFT) and PAUL HEIDEN (BHOLD COMPANY) about their vision of identity and access management, and a concrete solution to help you.

46 Risky business

Poorly selected, planned, and executed IT-related business projects can result in massive value destruction. PAUL WILLIAMS (ISACA) says it’s time to ask the challenging questions about IT investment management.

50 Events and features


GIL 2009: India | Bangalore, India | October 2009
Growth, Innovation & Leadership: A Frost & Sullivan Global Congress on Corporate Growth
GIL 2009: Asia Pacific | Kuala Lumpur, Malaysia

GIL 2009: China Shanghai, China

GIL 2009: Europe London, UK

GIL 2009: Middle East Abu Dhabi, UAE

GIL 2009: Latin America | São Paulo, Brazil

GIL 2009: North America Phoenix, Arizona

Frost & Sullivan's premier client event, GIL – Growth, Innovation and Leadership, supports senior executives in their efforts to accelerate the growth rates of their companies. Each year, thousands of CEOs and their growth teams return to engage in this global community to explore actionable strategies, solutions, and growth processes that they can put to work in building a solid Growth Acceleration System. GIL 2009: Middle East is a "must attend" for any organisation seeking fresh perspectives, new ideas, and innovative, practical solutions to stay ahead of the curve. Please mention partner code GILETM09 to receive a Rs. 7,500 savings, courtesy of Enterprise Technology Management.

Register Today! Email: Tel: + 91 22 4001 3422

Capturing innovative ideas for growth.

Editor's Page ■ ETM

Contributors

Founder/Publisher: Amir Nikaein
Managing Editor: Ali Klaver
Art Director: Ariel Liu
Head of Digital: Xiao Gang Lu
Finance Director: Michael Nguyen
Project Director: Yen Nguyen
Podcast/Sound Editor: Mark Kendrick
Transcriber: Ann Read

Risky business


Much as I would have liked to feature a picture of Tom Cruise on the cover of our July issue, "Risky Business," he doesn't impart our main message: developing a strategy for the successful integration of GRC initiatives. As well as being a hot topic, GRC tends to be a hated one. But while I've been putting this issue together and interviewing industry experts, I've found that its integration is becoming essential to recognizing good business processes, management, and even ROI. I'm sure you'll see what I mean as you read through the issue, and listen to the podcasts.

This month, the team at ETM has had the good fortune to work with Michael Rasmussen. As well as being the founder of Corporate Integrity, he is also a sought-after keynote speaker and collaborator on GRC issues. No discussion of GRC would be complete without him, which is why he is our moderator for the exclusive panel discussion about moving beyond the basics of GRC, helped along with the expert advice of Sam Harris, Simon Taylor, Martin Kling, and Manoj Kulwal.

Also in this issue, Paul Heiden from BHOLD and Martin Vliem from Microsoft discuss their successful collaboration on solutions to help you with identity and access management. Mike Baglietto from AirTight Networks and Anton Chuvakin from Qualys go head to head on PCI DSS and discuss its challenges, critics, and the new wireless standards. Discover how it's possible to make your CMDB System initiative a success this year with the help of Dennis Drogseth from EMA, and follow Paul Williams (ISACA) as he takes you through the challenges in IT investment management.

I hope that we've provided some handy updates and essential solutions in the GRC and ITSM fields this issue, and that they help in your day-to-day business processes. Thank you for reading, and if you would like to contribute to any future issues of ETM, please feel free to contact us at or via email at editor@

Account Executives: Bee Katyal, Joe Miranda, Michael Osbourne, Sandino Suresh

Contributors:

Michael Rasmussen, Risk and Compliance Lecturer, Writer and Advisor, Corporate Integrity

Dennis Drogseth, Vice President of Research, IT Megatrends Analytics and CMDB Systems, Enterprise Management Associates (EMA)

Paul Williams, Chair, ISACA Strategic Advisory Group and IT Governance Advisor to Protiviti, ISACA London Chapter

Adrian Polley, CEO, Plan-Net

How to contact the editor

We welcome your letters, questions, comments, complaints, and compliments. Please send them to Informed Market Intelligence, marked to the Editor, Studio F7, Battersea Studios, 80 Silverthorne Road, London, SW8 3HE or email

PR submissions

All submissions for editorial consideration should be emailed to


For reprints of articles published in ETM magazine, contact
All material copyright Informed Market Intelligence. This publication may not be reproduced or transmitted in any form, in whole or in part, without the express written consent of the publisher.

Enterprise Technology Management is published by Informed Market Intelligence

Headquarters Informed Market Intelligence (IMI)

IMI Ltd, Battersea Studios, 80 Silverthorne Road London, SW8 3HE, United Kingdom

Ali Klaver Managing Editor

+44 207 148 4444

Tokyo: 1602 Itabashi View Tower, 1-53-12 Itabashi, Itabashi-Ku 173-0004, Japan

Dubai (UAE): 4th Floor, Office No. 510, Building No. 2 (CNN Building), Dubai Media City, Dubai


Analyst feature ■ Management systems

The path to success Is there a way to integrate and reconcile different management sources that actually works, or is it just another “Holy Grail?” DENNIS DROGSETH (ENTERPRISE MANAGEMENT ASSOCIATES) tells us how it’s possible to make your CMDB system initiative a success in 2009.




The definition of a CMDB system (CMDBS)

American interest in CMDB deployments began to accelerate in 2005 and 2006 as interest in ITIL grew, and as some vendors initiated product launches in both core CMDBs and application dependency mapping. With these developments, the notion of a cohesive and trusted vision of how infrastructure and change impact service and business dependencies captured a broad, public imagination. In EMA research, it became apparent that ITIL's definition of a "Configuration Management Database" was, at least by analogy, what IT had hoped for for years—a way of integrating and reconciling different management sources that finally worked. ITIL's attention to process versus pure technology brought maturity to this vision, while market technology advances seemed to hold the promise of actually delivering on this "Holy Grail."

Needless to say, the "Holy Grail" tends to exist more happily in myth than in reality, so the last years have witnessed a lot of lessons learned by IT organizations and vendors alike in stepping up to the many challenges of CMDB deployments. As a whole, these lessons prioritized organizational and process commitment over pure-play technology adoption, as well as the need to stage phases and evolve a system with clear and specific values, versus simply buying or building the "IT cure for cancer" all at once and expecting a miracle.

System versus product

In 2006, EMA first recognized that in reality a "CMDB" was a system of related technologies, rather than a single data repository or single product investment. The notion that a CMDB was a "thing" that could be simply and directly purchased, deployed, and utilized turned out to be counterproductive rather than helpful in enabling successful deployments.

In particular, in 2006, EMA recognized that many deployments were trying to create two critically linked but fundamentally separate systems. The first was a process-centric system, typically resident in a service desk purchase, and most directly linked to ITIL v2's notion of a common repository with a single version of truth. Such a system was not optimized to support near-real-time requirements in monitoring, as it depended too much on data replication as an enabling technology. As a result, a parallel, more near-real-time system was sometimes evolving on the operations side of the house, in which data was reconciled and accessed based on well-defined trusted sources, but not actually replicated. The benefit of the two systems is to create a reflexive awareness of the impacts of change across the infrastructure, with a focus on control and process improvement on the one hand, and real-time or near-real-time insights into service performance for validation and troubleshooting on the other.
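The contrast between the two systems—replication into a process-centric repository versus federated access to trusted sources—can be sketched in code. The sketch below is purely illustrative: the class names, sources, and attributes are invented for this article, not any vendor's API. The federated view resolves each class of attributes against its designated trusted source at query time, so nothing is replicated into a central store.

```python
# Illustrative sketch: a federated CMDB view that queries trusted
# sources on demand instead of replicating their data centrally.
# All names here are hypothetical stand-ins.

class MonitoringSource:
    """Stand-in for a near-real-time monitoring feed."""
    def get(self, ci_id):
        return {"ci_id": ci_id, "status": "up", "cpu_load": 0.42}

class AssetInventorySource:
    """Stand-in for an asset/inventory repository."""
    def get(self, ci_id):
        return {"ci_id": ci_id, "owner": "ops-team", "location": "DC-1"}

class FederatedView:
    """Resolves each attribute domain against its trusted source at
    query time -- no data is copied into a central repository."""
    def __init__(self):
        # Which source is authoritative for which attribute domain.
        self.trusted_sources = {
            "operational": MonitoringSource(),
            "asset": AssetInventorySource(),
        }

    def lookup(self, ci_id):
        record = {}
        for domain, source in self.trusted_sources.items():
            record.update(source.get(ci_id))  # fetched live, not replicated
        return record

view = FederatedView()
print(view.lookup("srv-001"))
```

A process-centric repository would instead copy both sources' records into its own database on a schedule, which is why it struggles with near-real-time monitoring needs.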

From a product perspective in 2006, BMC's Atrium was among the leaders in service-desk-centric investment, while HP's (Mercury's) Business Availability Center with its application dependency mapping was one of the leaders in the near-real-time deployment, along with Managed Objects (now Novell) and, prior to being acquired by IBM, Micromuse and Cohesion. The market has evolved considerably since then, with much more diversity and opportunity.

EMA's vision

In the spring of 2007, ITIL v3 introduced the concept of the Configuration Management System, which strongly approximated EMA's definition of a CMDB System. It specifically embraced the notion of multiple technologies versus a single repository, and took the term "DB" out of the solution-class definer. And in fact, the CMS could contain multiple CMDBs along with multiple discovery tools and multiple "trusted source" management data repositories.

In order to understand the broader implications of the CMDBS investment, it's good to understand the bigger pot of technologies that are now relevant. And while it's important to start with a finite subset of the below and then evolve, it's useful to know all the components that might eventually play to your deployment. EMA's vision for a full-phase CMS or CMDB System is the following:

1. Clear organizational commitments including a core team with powerful,

Figure 1: Which areas has your company seen monetary/business advantages from your CMDB System deployment? (% of valid cases; © 2008 Enterprise Management Associates, Inc.) Response categories: improved operational efficiencies (reduced operational overhead); reduced downtime for key applications; reduced asset costs through improved SW license management; reduced asset costs by identifying redundant HW assets; shortened MTTR by improving change management effectiveness; shortened MTTR by identifying the right person to fix a problem (CI owner); more efficient compliance audits; shortened MTTR through improved diagnostics; reduced SLA penalties; faster time to provision new application services (productivity and/or revenue benefits); disaster recovery–related savings; other.










ongoing executive commitment, defined stakeholders, and expertise in architecture, process, and unique but relevant technologies and communication skills. Far more than any single technology investment, this organizational/process focus is the single most consistent factor in determining the success of a CMDB/CMS.

2. An effective set of reconciled discovery sources—including but not limited to application dependency mapping. As the CMDB System evolves, these discovery sources should be optimized to support a wide range of disciplines, from asset and inventory, to root cause analysis topologies, to application and service provisioning (including application development for new services) in a full-phase system.

3. A core repository focused on process and governance—where the most critical CIs and their attributes are managed for change. This core repository could take differing physical instances, with different levels of granularity (for network, system, or other changes), but should be the same core software investment to optimize consistency and minimize complexity.

4. A more real-time capability optimized for accessing information across monitoring systems and other near-real-time capabilities. This system is sometimes more closely affiliated with an application dependency mapping investment than with the core repository, because it will depend on dynamic insights from that system to stay current. In a time of virtualized infrastructure and component-based applications (Web 2.0 and SOA-based web services), the importance of this system cannot be overstated.

5. A metadata capability targeted at reconciling and normalizing every aspect of this system, including directory services and security-related requirements. While currently this metadata system is affiliated most typically with the "core" CMDB repository, it is not optimized in a relational database format. As the market evolves it will take on more of an object-oriented design, where relationship modeling is more important than raw processing performance.

6. Future integrations to enable key analytics investments. While this is not part of the CMDB System directly, as the market evolves you should look to make investments tied most closely to your key analytic requirements, whether for performance management, capacity optimization, governance and risk management, or business and financial planning.

As an integrated system, this is of course not yet available on the market from any one or multiple vendors, nor will it most likely be available in the next three to four years. However, even if you're a mid-tier business or organization, you should be aware of these needed technologies as their initial investments evolve over time. And you will want to make intelligent choices about where to start, knowing that you may want a mix of some of these capabilities to optimize to your first-phase objectives.

Market trends are beginning to suggest that in late 2009, and throughout much of 2010 and 2011, we should see a great deal of innovation and diversity from vendors seeking to step up to the CMS vision with more finite, constituency-sensitive, and technology-diverse offerings than have defined the CMDB marketplace in the past. This is good news for smaller businesses and will provide companies of all sizes more modular and benefit-specific options for CMS adoption going forward.

Benefits you can expect today

During the first half of 2009, EMA surveyed 162 respondents across mid-tier and large enterprises—most of whom were in North America, but with some presence in Europe, Asia, and South America. Figure 1 (see previous page) shows what key benefits were achieved from CMDB System deployments. Not surprisingly, improved operational efficiencies took first place, as these can be achieved across a whole host of disciplines and apply to pretty much every type of initial phase deployment. It's telling that second place was reduced downtime for key applications—a fairly strong operational value. When combined with shortened MTTR, tied for third place, the importance of a CMS investment as a resource for improved service performance becomes apparent. Reduced MTTR from knowing the whereabouts of the "owner" of a problem is also a crowd-pleaser with initial phase deployments. One client described this benefit as "Mean-Time-to-Find-Someone." The role of asset management is also important, as more effective management of software licenses and reductions in hardware redundancy both crop up as early phase "winners" in CMS deployments. One of the more interesting variants is asset lifecycle management for remote desktops, where the initial CMDB System links configuration management, usage, problem management, and incident management with asset and cost-related information.

As can be seen in Figure 2 (below), actual dollar savings over a 12-month period range from under $50,000 (39%) to more than $5 million (6%). These dollar savings were not pure ROI, but estimated proactive value achieved through CMDB-related enabling services, or even CMDB/
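The "reflexive awareness of the impacts of change" that such a system is meant to deliver ultimately rests on relationship modeling between configuration items. As a hedged illustration (the dependency graph and CI names below are invented for this sketch), a change to one CI can be traced to everything downstream of it with a simple graph traversal:

```python
from collections import deque

# Illustrative dependency graph: each CI maps to the CIs that depend on it.
# Names are hypothetical, not from any real deployment.
dependents = {
    "disk-array-7": ["db-server-2"],
    "db-server-2": ["order-service"],
    "order-service": ["online-store"],  # business service at the top
}

def impact_of_change(ci, graph):
    """Breadth-first walk from a changed CI to everything downstream."""
    affected, queue = set(), deque([ci])
    while queue:
        current = queue.popleft()
        for dep in graph.get(current, []):
            if dep not in affected:
                affected.add(dep)
                queue.append(dep)
    return affected

print(impact_of_change("disk-array-7", dependents))
```

Here a proposed change to the disk array surfaces as risk to the database server, the order service, and ultimately the online store, which is exactly the kind of insight a dependency-mapping-fed CMS is meant to provide before a change is approved.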

Figure 2: How much would you estimate that your company has saved in proactive cost efficiencies (without subtracting CMDB costs) over the last twelve months? (column %; © 2008 Enterprise Management Associates, Inc.) Response bands: less than $50,000 (39%); $50,000 to under $100,000; $100,000 to under $250,000; $250,000 to under $500,000; $500,000 to under $1 million; $1 million to under $2.5 million; $2.5 million to under $5 million; $5 million to under $7.5 million; $7.5 million to $10 million; greater than $10 million.


Figure 3: Proactive cost efficiencies saved over the last twelve months (without subtracting CMDB costs), broken out by the highest-level executive/management directly responsible for day-to-day oversight (all C-level; VP & Director; all other) across savings bands of less than $50,000, $50,000–$499,999, and $500,000 or more. (© 2008 Enterprise Management Associates, Inc.)
CMS planning. Since a good number of the respondents had not yet achieved full phase one deployment (37%), it's not surprising that the preponderance had achieved less than $50,000 in proactive value.

Probably the most significant recommendation for "how to succeed" arose from cross-analyzing the above data with other factors, including company size, phase, and time in deployment. While there were some natural and obvious correlations, they were surprisingly weak. However, one factor did correlate strongly—and that was C-level executive involvement (see Figure 3 above).

Recommendations

Hopefully I've established in pretty graphic terms the number one requirement for success—senior executive-level commitment

on an ongoing basis. Why is this? CMDB Systems are first and foremost about organizational transformation—leveraging new technologies for assimilating and sharing information to support new and better ways of working. As such, managing stakeholders, staying the course on initial phase objectives, and getting more active participation from organizations across IT, remain ongoing requirements as the system evolves. And don’t forget the skill requirements of the “core team.” These include expertise in architecture, process, and in-depth expertise in unique but relevant technologies, as well as strong communication skills. This last should not be overlooked. Evolving a CMDB System is first and foremost a dialog in which bi-directional learning takes place between the core team and stakeholders, not a pushing out

of "doctrine" from a central source. Some other key things to keep in mind are:
1. Take "baby steps." Don't try to build the entire system at once. Leverage the modularity inherent in the CMDBS/CMS vision to focus on a staged program of evolution versus an all-or-nothing monolithic effort.
2. Ensure sufficiently detailed requirements are established for each phase.
3. Pay attention to process—leveraging ITIL as a resource versus a doctrine.
4. Along with executive support, make sure you get an ongoing budget commitment if possible. As CMDB Systems evolve, short, project-defined budget timelines can be counterproductive and create extra work, in spite of their obvious appeal in 2009.
5. Recognize the importance of managing expectations, both in terms of executive leadership and in terms of stakeholders.
6. Seek out areas of automation even in early phase deployments. Excessive administrative overhead not only incurs costs, it creates risks and often leads to inaccuracies. On the other hand, your CMDB System can become a powerful platform to enable more cohesive automation across silos as it evolves.
7. Learn from your mistakes—because there will be mistakes. The industry as a whole is still very much in learning mode. Look for avenues for dialog with other adopters—EMA is hoping to launch such a program within the next 12 months. Remember that as you learn, you also become a force in shaping the future of what a CMDB System is, and what it can reasonably be expected to achieve.
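Point 6, seeking out automation early, often starts with something as small as drift detection: comparing what discovery reports against the authorized baseline held in the core repository, so inaccuracies surface without a manual audit. A minimal sketch, with invented attribute names and values:

```python
# Illustrative drift check: compare discovered CI attributes against
# the authorized baseline held in the core repository.

def detect_drift(authorized, discovered):
    """Return {attribute: (authorized_value, discovered_value)} for
    every attribute that no longer matches its approved baseline."""
    drift = {}
    for attr, approved in authorized.items():
        actual = discovered.get(attr)
        if actual != approved:
            drift[attr] = (approved, actual)
    return drift

# Hypothetical baseline from the repository vs. a fresh discovery pass.
baseline = {"os_version": "RHEL 5.3", "ram_gb": 16, "app": "billing-v2"}
observed = {"os_version": "RHEL 5.3", "ram_gb": 32, "app": "billing-v2"}

print(detect_drift(baseline, observed))  # {'ram_gb': (16, 32)}
```

An unapproved RAM upgrade is flagged automatically; the same comparison run on every discovery cycle replaces a recurring administrative chore.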

Dennis Drogseth

Vice President of Research, IT Megatrends, Analytics and CMDB Systems Enterprise Management Associates (EMA)

Dennis joined EMA in 1998 and currently manages the New Hampshire office. He directs a team of analysts focused on the development of the Networked Services Management practice areas, which span performance, availability, and service management across enterprise and telecommunication markets. At EMA, Dennis has pioneered research in converging management strategies such as performance/availability and integrated security. Prior to joining EMA, Dennis worked to develop marketing strategies and new business models for Cabletron's SPECTRUM management software.



Executive panel ■ Moving beyond GRC

Finding the right balance

There has been a lot of attention and focus on GRC, but not all of it is positive. Organizations need to look at where GRC is headed and at the application of technology to possible GRC strategies. MICHAEL RASMUSSEN (CORPORATE INTEGRITY) moderates a discussion on these issues with the help of SIMON TAYLOR (COMMVAULT), SAM HARRIS (TERADATA CORPORATION), MANOJ KULWAL (SAS), and MARTIN KLING (IDS SCHEER).





MR: Governance, risk, and compliance, or GRC, is a very complicated area. We’ve seen a number of vendors aligned towards this, as well as buyers of products and services embrace this concept of GRC. In my research I find that GRC can be a very confusing term to begin with, so I’d like to open the panel to a more freeform discussion and ask: “How do you define governance, risk, and compliance, or GRC?”


I think GRC is a term that covers such a wide area of different interrelated types of activities. I tend to focus around the access management and retention of key corporate records for the management of risk, and certainly records-keeping compliance. The overall theme I see is the transparency of accessing that information. The clients I work with are very interested in the risks of record-keeping, and the context of, for instance, eDiscovery. So it’s almost a fusion of understanding how we retain information, for what reasons, and then ultimately how you access that information.


I have a simple definition: GRC is a summary expression for management disciplines and tools taking in the business risk, compliance, regulations, and internal politics. We understand GRC as a holistic process-centric approach, to ensure effectiveness of a control system while operating it in a cost-efficient way.


I think any existing definition of GRC doesn't capture its full potential. But I think there are three key principles:
1. GRC should minimize the negative surprises for all key stakeholders. We have seen various high-profile negative surprises over the last two years, and also in the last ten years, such as Enron.
2. GRC should be able to actively demonstrate that it is contributing to an organization's growth and performance by optimizing the risk and return ratio.
3. GRC should keep the organization prepared to comply with any existing or emerging regulations while continuously optimizing the compliance costs.

SH: For me, GRC is an understanding of the relationship between objectives, efforts to achieve them, and the results of governance, risk management, and compliance obligations. Furthermore, I feel that GRC is an evolutionary continuum, where it's easy for firms to end up with a whole layered set of disparate solutions. Modern GRC is really the realization that we need to integrate the spirit of GRC information in a framework or platform, so business leaders can help the firm to make data-driven decisions.

MR: As a research analyst in this space for eight years, it's quite frustrating when GRC is locked into a specific silo, which I've seen in areas like Sarbanes-Oxley and IT risk and compliance. In my mind, GRC is about the collaboration

and communication of risk and compliance issues across the business. The discussion so far on linking GRC to performance is unexplored territory for a lot of the GRC solutions out there. So often they are focused on crossing t’s and dotting i’s, as opposed to making GRC a value proposition and linking it to corporate performance management. So let’s go a bit deeper with the next question: “How has GRC been effective to date and how has it been ineffective?”


Well, GRC has a history of being hype-driven. In the beginning there was a concentration on quality issues—everybody was certifying against the respective bodies or concepts. Then came environmental, safety, and health, and suddenly all those environmental discussions started a wave of regulations, and all the big companies had respective departments taking care of it. The last few years have seen a focus on financial compliance, caused by a continuing stream of misconduct and overly risky business cases.

Looking back, I would say that a discussion about any of those GRC topics has always led to better management concepts and tools. Today's software and technology offers support for GRC activities that is miles away from what was available before. But still, it has to be noted that the list of companies with quality issues and the cases of environmental misconduct are not decreasing—and the cases of fraud, financial misconduct, and



lacking risk management even led to the current crisis. All those regulations were meant to—but did not—really change business behavior. And as long as GRC topics are discussed as specialist disciplines instead of as the core task of managing a company, it will stay that way.


In terms of GRC being effective, I think integrating governance, risk, and compliance components into a single discipline has been the biggest achievement so far. Previously these were silo disciplines typically resulting in duplication of efforts and costs. In terms of GRC being ineffective, I think that GRC initiatives have mainly been stuck in the “C”, or compliance aspect, which is where GRC started as a discipline. What we have seen with customers over the last two years is the start of integration of the “C” and “R” aspects. However, we have still not seen organizations seriously investing in the “G” aspect. But, as GRC managers ask for more investment for their GRC programs, they will have to bring this component into the picture because that is what will link the GRC program with the day-to-day performance of the organization. Another ineffective area is that business units still see GRC as a burden. Unlike specialized risk areas such as credit and market risk, GRC touches every part of the organization. The universe of GRC in an organization is large, and one central team cannot fully manage GRC processes. GRC managers need the contribution of business units to fully manage all aspects of GRC. And I think GRC managers still have some work to do in terms of convincing their business units that GRC can actually contribute in helping them achieve their business goals and objectives.




I think GRC has been effective to date in firms that have shown an appreciation for the value of fostering a risk-aware or GRC-aware culture, recognizing that it's not enough to simply meet a single GRC obligation. The organization needs to view these individual obligations in the context of the system, and of their impact on achieving corporate objectives. We don't have to look very far to see where GRC has been ineffective. One observation I would draw from this is that no one can stop an individual with intimate knowledge of the system and the driven intent to circumvent its controls. So I think the GRC platform—one that allows the firm to aggregate a single version of the truth from multiple GRC data sources, and allows for quantitative data, search, and reporting—will enable innovative collaboration, analysis, and reporting.


I think we've seen a different trend. Whereas before, the reliance was on applications and IT to deliver the information that people interpret at a GRC level, giving decision-makers the transparency they need, we've now seen a more unified approach, with more business ownership around this topic than ever before. I think the challenge always remains in different areas of the business. There's not enough cross-functional interaction going on, and implementations tend to be very siloed. I've got a client, a financial organization, with 160 different versions of governance and compliance record-keeping aligned to almost the same topic, so there certainly needs to be unification in terms of how people think about

these requirements. I think what's been largely ineffective is people's ability to adapt and understand how these things need to change with time, and it ends up being a very static view of the world. Businesses need to understand how to keep pace with these changes. At the moment people do a task or project, and they get to the point where they have at least some form of loose business strategy in this area, but it doesn't evolve and keep pace.

MR: Let's move on to another question which plays well to your perspective, Simon: how do companies provide transparent governance of electronic content while adhering to strict compliance or records-keeping requirements?


The reality here, as we see it, is that implementing GRC requirements leads to lots of different siloed approaches to the management of information across the business. So the real challenge today is very much around keeping things for compliance requirements, providing transparency for governance requirements, and mitigating the overall risk while increasing productivity across all this, but doing it in a way that is truly effective. And this isn't a challenge that is only related to North America. If we talk about, for instance, organizational compliance versus individual privacy and access requirements—they contradict each other. The only practical response I can give is that it needs to be flexible. We've spent a long time, certainly over the last five or six years, developing


Executive panel n moving beyond grc

very rigid approaches to compliance-driven, say, records retention, and certainly business use case-specific access and governance. But we have failed to provide unification and some sort of cross-functional analysis of how all these things relate. So the first thing companies need to do is understand how these things interrelate and how to bring them together. The second thing is that we’ve taken a very top-down view of how to manage this. It’s often driven by business requirements, but I think we need a meeting of minds here, so that people are responsible for information and manage it at organizational levels from the bottom up—a systems and application approach. For instance, it is no longer acceptable to keep multiple copies of the same pieces of information for different governance requirements, because that in turn creates risk, associated with something like a litigation request for eDiscovery. Unification is needed to bring this information together, while at the same time honouring different interpretations at a business level based on different governance and compliance requirements. So there is an overall theme now around unification—not just of strategy, but of the systems and information management that underpin that strategy.


I do concur with the comment that we need to be careful not to keep multiple copies of official records for different compliance purposes, because of the risk it increases. I would also add a perspective on sensitive data and its storage. Imagine you have a team working on an acquisition or a bonus structure—typically there’ll be collaboration between people exchanging documents via email, perhaps using a SharePoint site or a network file share, and unfortunately, after these projects are completed, there are often remnants of these sensitive documents. So how do you ensure that these containers for electronic documents are actually applying the appropriate controls, given the sensitivity of the data in them? A possible framework for the solution is to programmatically scan your corporate and file share environments, use a search engine to read the documents, and apply business tools to assess the sensitivity. Then programmatically assess the controls for that location, and see whether the sensitivity of the documents matches up to the controls that are applied. If not, you need to programmatically notify the owners of the stores and take corrective action—and furthermore, lock the store down if action is not taken. My other recommendation is: don’t search for sensitive documents in your environment until you’re



prepared to remediate. Some suggest that discovery without remediation could actually increase your risk of exposure.
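The scan-and-remediate loop described above (scan stores, assess document sensitivity, compare against the controls on the store, notify owners) can be sketched roughly as follows. This is a hedged illustration only: the regex-based sensitivity scoring, the ACL model, and every function name are placeholder assumptions, not any specific product's implementation.

```python
import re

# Placeholder sensitivity patterns; a real deployment would use a
# proper classification engine, not a handful of regexes.
SENSITIVE_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # SSN-like number
    re.compile(r"confidential", re.IGNORECASE),
    re.compile(r"bonus structure", re.IGNORECASE),
]

def sensitivity_score(text: str) -> int:
    """Count pattern hits as a crude proxy for document sensitivity."""
    return sum(1 for pattern in SENSITIVE_PATTERNS if pattern.search(text))

def controls_adequate(store_acl: set, score: int) -> bool:
    """Toy control check: sensitive content must not sit in a store
    that is readable by 'everyone'."""
    return score == 0 or "everyone" not in store_acl

def scan_store(documents: dict, store_acl: set, notify) -> list:
    """Scan one store, flag control mismatches, and notify the owner.

    documents: mapping of path -> extracted text content
    store_acl: groups with read access to the store
    notify:    callback invoked per violation (the 'corrective action'
               step; lock-down would follow if the owner ignores it)
    """
    violations = []
    for path, text in documents.items():
        score = sensitivity_score(text)
        if not controls_adequate(store_acl, score):
            violations.append(path)
            notify(path, score)
    return violations
```

For a share open to "everyone" that still holds a leftover acquisition document, `scan_store` would flag that document while leaving routine content alone.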


Well, I think there will always be a trade-off between traceability and privacy which cannot finally be solved by technology. Even today there are regulations on what must be kept and regulations on when it is not allowed to track and store data—and more often than not there is a conflict between the two. This happens even more in today’s multinationally operating companies, which have to adhere to different, and sometimes contradictory, national laws while trying to keep corporate governance working. In any event, a practical solution for maintaining governance has to be sought in an organizational approach, not on the technology side.


This is an ongoing challenge for our customers, and it doesn’t look like it will be resolved anytime soon. The priority is first to comply with regulations, and then, depending on the business context, decide whether to make electronic content available to various stakeholders. Where there is a conflict between multiple regulations, I have seen customers discuss the issue with their regulator and then resolve it based on the advice received.

MR: What is one perspective or perception that you would encourage organizations to change in how they approach GRC, and why? Obviously, a lot of organizations have already started down this GRC trail, so how would you encourage them to stretch their thinking so they can integrate a GRC program more effectively?


From the business perspective, I think GRC teams are very heavy in their use of headquarters power when dealing with individual business units. A balanced approach is needed so that open collaboration with the business unit can take place. We do not want a completely centralized approach where the business units have no say in the practices and policies being adopted. On the other hand, you do not want the business units left on their own; otherwise you won’t have a completely integrated view of your GRC environment.



GRC teams need to consider the processes of particular business units because, for example, local compliance regulations may differ slightly from those of other business units. They need to try to maintain a balance. From the technology perspective, organizations need to balance their investments in GRC applications and GRC platforms. The difference I see is that a GRC application is something you buy out of the box, which gives you access to common GRC capabilities. A GRC platform is something which allows you to create a bridge between your GRC environment and the rest of your operational systems. This allows you to consolidate all your compliance regulations and risk data in a common GRC repository. Organizations are focusing more on investing in GRC applications and not so much on the GRC platform. They should be looking to invest in both areas to be better prepared for the future.


I think organizations need to look at how they provide something that is more adaptive, and not just siloed into one area of the business. This plays into the same comment Manoj made—local variations to policy and implementations—and I think a corporate directive driven out to multinational companies on a global basis is really important. From a technology approach, I think it’s important to understand the lifecycle of information. We’ve certainly seen many situations that have created high risk in the last 12 to 18 months, particularly around organizations not understanding where information is kept. Compliance and record-keeping is one thing, but if you think about individuals and what they do with pieces of information in support of a particular policy, it often finds its way all over an organization. Today, technologies that can provide better transparency and understanding of where information lives, for the purposes of management in a GRC context, are very important for mitigating risk. It’s not just about searching, indexing, and accessing information; it’s also about providing good lifecycle management and control around the assets you’ve currently got, and building good compliance and governance around them.


I really like the business/technology split, and also Simon’s ideas around transparency. We want to encourage organizations to promote transparency and two-way communication. There’s really no magic bullet for achieving a GRC-aware organization. But the key is to foster a



collaborative environment between the board and the business units. One way to do this is to provide a single comprehensive, authoritative, and compliant GRC data source, so that you remove any distrust or question of the facts. You also need a common means to analyze and report on this GRC data, so really we should be promoting a combination of a single enterprise data source, enterprise-class analytics, and business intelligence for GRC. In fact, this would be the realization of the modern GRC platform we’ve been reading about for some time.


Companies tend to see GRC topics as decoupled from business processes and performance results. This leads to the general perception that GRC is an unnecessary burden placed on the company by external regulatory bodies, to be cleared with the least possible impact on the business. It’s important to understand that GRC is really nothing new. Companies have always had to judge risk in business, to guard the investments of their stakeholders, and to adhere to laws and regulations—and they have always needed to be able to report on proceedings regularly or on demand. More importantly, it should be understood that GRC has always been an inherent part of business processes and proceedings. The tasks asked for by any GRC program should be in the original interest of the company to create sustainable value. For example, there’s a tight link between business performance and GRC. The performance of a process is not only cycle time, and a company’s performance is not reduced to stock performance. The way things are accomplished is part of the value—more so when looked at from a mid-term perspective. Think of the investment industry: high earnings were achieved, but when internal controls were not working properly there were also gigantic losses. This is not about regulation—this is about how to conduct business.

MR: Organizations themselves have been focused on replacing very manual processes for governance, risk, and compliance for years, and part of that is often replacing thousands of spreadsheets. The current solutions in the market are really replacements for spreadsheets that provide better non-repudiation, a stronger audit trail, and integrity for collecting content-related information.




If we look forward, what should those organizations that are just now defining their technology architecture for GRC look for, and what’s new and exciting in the GRC technology space?


I think the big message should be around automation and integration of GRC, because if you look at a typical GRC universe in an organization, it is large, and it would be impossible to manage all the GRC aspects effectively with manual processes. So GRC managers need to bring more automation and integration into their GRC processes to become more efficient and effective. Why only do your risk and control assessment once every quarter? Why pull all the data from the system into the GRC environment and only then analyze it? Why not take GRC to the operational system itself, and monitor the business transactions in real time? This way you will be notified about potential risks and policy deviations long before they become actual incidents, enabling you to take corrective action. Then you’re able to take a more proactive approach rather than just collecting data and manually analyzing compliance breaches and potential risks after the fact.
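The shift described here, from periodic batch assessment to monitoring transactions inside the operational system itself, can be illustrated with a minimal rule engine. The two rules (an amount threshold and a segregation-of-duties check) and all the names below are assumptions for illustration, not any vendor's actual rule set.

```python
from dataclasses import dataclass, field

@dataclass
class Transaction:
    user: str
    amount: float
    approver: str

# Illustrative policy rules; the threshold is invented for the example.
def over_limit(tx: Transaction) -> bool:
    return tx.amount > 10_000

def self_approved(tx: Transaction) -> bool:
    # Segregation of duties: the initiator must not approve their own transaction.
    return tx.user == tx.approver

RULES = {"over_limit": over_limit, "self_approved": self_approved}

@dataclass
class TransactionMonitor:
    """Evaluate every transaction against the policy rules as it occurs,
    so deviations surface as alerts before they become incidents."""
    alerts: list = field(default_factory=list)

    def observe(self, tx: Transaction) -> list:
        fired = [name for name, rule in RULES.items() if rule(tx)]
        if fired:
            self.alerts.append((tx, fired))
        return fired
```

A transaction the initiator approves for themselves at 50,000 fires both rules the moment it is observed, instead of surfacing months later in a quarterly assessment.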


Let me build on that—I basically see the way we can move things forward in three areas. There’s the access and interpretation/alerting area, which is about understanding and identifying breaches in policy, and therefore creating some sort of event or direction around that. There is policy-based control, which is around the movement of information into a single place



so that you can analyze it and understand it more effectively, and then there’s the actual unification of information in terms of its lifecycle. CommVault takes a more unified approach with these three areas in mind. By that I mean understanding how all the data that we need to interpret for other business and information scenarios is managed and kept, without causing any duplication and inherent cost and complexity in the organization. If you can bring this information together at a very low level in the organization, it also mitigates risk at a high level—for example, if you think of the lifecycle of an email. It is created by an end-user, received by an organization, managed in some form of application, and then protected, backed up, stored, and so on throughout its lifecycle. This is where the complexity arrives, because when you apply GRC to that whole concept, gaining access to the information at any point in its lifecycle to understand the risk it potentially creates is a huge task. We’re very much about maintaining an almost transparent view of all data that’s managed, and then deciding what you want to do based on the particular GRC scenario. So let’s say, for instance, that you have to gather information for a particular compliance subject and store it in a more traditional records-management type view. The type of technology you can now implement in this area understands the pattern of behaviour that conforms to a particular policy around this compliance requirement, and then has the information redirected and placed in the most appropriate environment to manage that retention. That would be a more compliance-retention driven requirement. The end result would be an alert to an individual, with a review process around looking

at particular content and the risk it poses. The big difference between that and a more unified approach to technology is that you’re not again siloed into different types of application technology, so this is very much a bottom-up strategy for all businesses.


In IDS Scheer’s understanding, the process of improving GRC by supporting it with semi-automated workflows is far from complete. There are still billions of spreadsheets in use, especially when you look at medium-sized companies, so there is still a lot to be done in moving to the next step. We’d expect the application of existing technologies to GRC tasks, rather than genuinely new technologies. What I predict is that more and more new business intelligence technologies that gather data directly from operational source systems will be used for GRC. There I see explicitly those technologies that are flexible enough to read from business processes regardless of data structure or system architecture.


If you look at the contemporary business environment, there are too many interdependencies for manual GRC processes to be effective. You need look no further than the current credit, liquidity, and leverage crisis to confirm that point. A lot of people throw data quality into the conversation, and I think it’s relevant to this concept of integration. In my mind, data quality for GRC means several things: in terms of integration, it means the ability to capture relevant data relationships; in terms of integrity, it means reconciling the exposures to the financial statements; and in terms of completeness, capturing all the exposures across the organization. Five more important points:



• Accessibility—users should have easy access to current stores of data and be able to reuse the data in any form they require, within the limits of security.
• Flexibility—users should be able to analyze data across any dimension they choose, including being able to filter and summarize information in any dimension at will.
• Extensibility—the ability to include new types of data and to bring them into the environment along with the relevant linkages to quality tools.
• Timeliness—how soon the data is available after the event.
• Auditability—is the data easily traceable back to its source?
These five points on data quality are key to the integrated GRC platform in creating an environment for the risk-aware organization.

MR: In wrapping up, I want to give our panellists a platform to talk about their specific positioning in the GRC market. So, how is your company positioning itself in the GRC market, and what specific value are you providing?


SAS is focusing predominantly on delivering capabilities in the “G” and “R” aspects, because that’s where we see the business decision-making capability of an organization impacted most. We will continue to strengthen the “C” aspects, but customers will derive the most value from the SAS GRC Solution and our capabilities in the “G” and “R” aspects. We offer a robust GRC platform, an enterprise GRC application that covers things like risk and control assessments, SOX-type control testing, policy management, GRC reporting, and dashboards. We also offer a strong suite of compliance applications to comply with regulations such as Basel II, Sarbanes-Oxley, Fair Banking, and Anti-Money Laundering. The value for our customers is in making better business decisions by being able to integrate GRC into their day-to-day business processes.


CommVault focuses very much on the compliance and policy management aspect of GRC, and the intelligent system controls that are



needed for information or authenticity audit. We focus on fusing the requirements for risk, control, and supervision with requirements for other related areas, so we don’t see GRC in isolation. Although we position ourselves quite heavily at the information access and effective business intelligence level, we’re also very much about the unification of information assets. We bridge the gap by focusing not just on understanding and helping organizations facilitate the management of their GRC policies and processes, but also on how their IT systems can deliver that. It isn’t about fundamental change, but about the unification of information, so that a simple request to access certain types of records doesn’t become a monolithic task costing millions of dollars. We are also focused on making sure IT is enabled to support that, so it doesn’t become a task that disappears into the ether because it’s so complex to implement. We are bridging those two things: focusing on what you need to define at a business and organizational level for GRC, and making sure that everything at an IT level is in alignment to support that and make it more practical.


Teradata is solely focused on raising intelligence through data warehousing, consulting services, and enterprise analytics, which translates directly to the business problem of GRC. We provide industry-specific logical data models leveraging cutting-edge computer and data modelling science. In context, this solves all aspects of the business problem, serving up a single version of the truth for governance, risk management, and compliance information in a comprehensive, authoritative, and compliant GRC data environment. We work very closely with our partners to integrate analytical and business intelligence solutions for GRC, to properly optimize and exploit the GRC data environment. In doing so, our clients get the best of both worlds—industry-leading data infrastructure and GRC analysts, analytics, and business intelligence.


IDS Scheer is an international service provider for process and

IT solutions and the introduction of business applications. Through our solutions for GRC we deliver a proven and robust platform for integrated risk and compliance management. We offer our customers a unique process-centric approach to sustainable GRC that is independent of specific themes like Sarbanes-Oxley, and allows us to move between different GRC programs for the customer.

MR: I think this podcast has been very effective in stretching all of our views on GRC. Now I would like to put forward some of the value propositions that I’m seeing on a regular basis from companies. I think they’re trying to achieve five different things:
• Sustainability—GRC challenges aren’t getting easier; they’re getting more complicated. Ultimately, managing risk and compliance in little silos brings on greater risk, vulnerability, and exposure because you don’t see the big picture.
• Consistency—if we want to get everybody on the same page, we need a consistent business and technology architecture.
• Efficiency—this is the one I hear most often from companies. We need a consistent, streamlined risk and compliance assessment process for the organization.
• Accountability—this is about ownership: what is going right, where things are breaking down, and how to keep people accountable.
• Transparency—getting that big-picture perspective on risk and compliance that’s meaningful to the board of directors and executive management, and into the trenches of the organization.


Michael Rasmussen -Moderator

President, Risk and Compliance Advisor

Corporate Integrity

Michael Rasmussen is the authority in understanding governance, risk, and compliance (GRC). He is a sought-after keynote speaker, author, and collaborator on GRC issues around the world and is noted for being the first analyst to define and model the GRC market for technology and professional services. During his career, Michael has worked in the market analyst, consulting, and enterprise sectors.

Simon Taylor

Senior Director, Information Access and Management


Simon is Senior Director for the CommVault Information Access and Management business worldwide, including Information Risk, eDiscovery, Compliance, Information Search, Archiving, and Data Analysis. His field of expertise covers a range of topics including information and data intelligence, data warehousing, application, and information management.

Sam Harris

Director, Enterprise Risk Management

Teradata Corporation

Sam Harris is Director of Enterprise Risk Management for Teradata Corporation. In this role he leads the global ERM strategy and program development for banking, capital markets, and insurance industries. Programs include Teradata Enterprise Risk Intelligence and integrated offerings with strategic partners for complete enterprise risk management solutions.

Manoj Kulwal

Risk Business Solution Manager


Manoj is part of SAS’ Global Risk Practice. His group is responsible for defining and implementing SAS’ GRC Strategy. He has 11 years of industry experience and in his current role is responsible for supporting strategic customer engagements globally. Previously, he was the Global Product Manager for SAS Operational Risk Solution, where he was involved in driving the solution roadmap.

Martin Kling

Solution Manager, Governance, Risk and Compliance

IDS Scheer

Martin Kling joined IDS Scheer in April 2007 as Solution Manager, Governance, Risk, and Compliance. Prior to this he worked in various central functions for Infineon Technologies AG. His last assignment there was to set up the organizational part of the Sarbanes-Oxley activities, and to lead the implementation project for the ARIS GRC Solution at Infineon.


Analyst article n access security

It’s all in the planning… Now is the time to implement a sustainable GRC strategy in your organization. Creating a plan of action today will see greater return in the future, says MICHAEL RASMUSSEN (CORPORATE INTEGRITY).





One thing is certain—risk and compliance burdens are not going away


Governance, risk, and compliance can be confusing to understand in their individual capacities—bring them together as GRC, and it can be even more confounding. GRC is more than a catchy acronym used by technology providers and consultants to market their solutions—it is a philosophy of business. This philosophy permeates the organization: its oversight, processes, and culture. Ultimately, GRC is about the integrity of an organization:
• Does the organization properly manage and have sound governance?
• Does the organization take risk within risk appetite and tolerance thresholds?
• Does the organization meet its legal/regulatory compliance obligations?
• Does the organization make its code of ethics, policies, and procedures clear to its employees and business partners?
The challenge of GRC is that each individual term—governance, risk, compliance—has varied meanings across the organization. There is corporate governance, IT governance, financial risk, strategic risk, operational risk, IT risk, corporate compliance, Sarbanes-Oxley (SOX) compliance, employment/labor compliance, privacy compliance… the list of mandates and initiatives goes on and on. It is easier to define what GRC is NOT:
• GRC is not about silos of risk and compliance operating independently of each other.
• GRC is not solely about technology—though technology plays a critical role.
• GRC is not just a label of services that consultants provide.
• GRC is not just about Sarbanes-Oxley compliance.
• GRC is not another label for enterprise risk management (ERM), although GRC encompasses ERM.
• GRC is not about a single individual owning all aspects of governance, risk, and compliance.

GRC is a philosophy of business

It IS about individual GRC roles across the organization working in harmony to provide a complete view of governance, risk, and compliance. It is about collaboration and sharing of information, assessments, metrics, risks, investigations, and losses across these professional roles.

GRC’s purpose is to show the full view of risk and compliance and identify interrelationships in today’s complex and distributed business environment. GRC is a federation of professional roles—the corporate secretary, legal, risk, audit, compliance, IT, ethics, finance, line of business, and others—working together in a common framework, collaboration, and architecture to achieve sustainability, consistency, efficiency, accountability, and transparency across the organization.

GRC is a three-legged stool: governance, risk, and compliance are all necessary to effectively manage and steer the organization. In summary, good governance can only be achieved through diligent risk and compliance management. In today’s business environment, ignoring a federated view of GRC results in business processes, partners, employees, and systems that behave like leaves blowing in the wind—GRC aligns them to be more efficient and manageable. Inefficiencies, errors, and potential risks can be identified, averted, or contained, reducing the exposure of the organization and ultimately creating better business performance.

Governance, risk, and compliance are diverse and complex, with individual intricacies and issues ready to frustrate the organization. Organizations that attempt to build a GRC strategy with home-grown solutions, spreadsheets, or islands of technology not built to meet a range of needs are left in the dark, and boxed into a view of the world that they will find limiting down the road. The current business environment requires a new paradigm and approach to GRC, requiring a common framework, integrated processes, and a platform that spans the organization and its individual risk and compliance issues. This is brought together in a GRC strategy that is ready to tackle issues at their roots through core GRC processes that are leveraged across the organization.
GRC success starts with a simple five-step plan

This plan draws on the lessons learned from Corporate Integrity’s work with numerous large organizations around the world with complex business operations and relationships. Here are the steps that prepare





… good governance can only be achieved through diligent risk and compliance management


you to deliver a sustainable GRC program:
• Identify the interrelated processes, problems, and issues. An understanding of the scope of GRC issues, processes, technology, and requirements is the beginning. Organizations should start with a survey assessment aimed at identifying and cataloging the processes, technologies, methodologies, and frameworks used for risk and compliance across all business operations. This assessment is best aligned with the OCEG Red Book 2.0 Capability Model.
• Establish GRC program goals and objectives. Once the organization has identified the scope of GRC across the organization, it can establish the goals needed to achieve GRC. This starts with establishing a vision and mission statement for GRC from which the goals stem. Central to these goals will be a determination on GRC program structure—centralized, federated, or some form of deliberate, but ad hoc, collaboration. This structure will determine many other goals, particularly the consistent and relevant use of technology.
• Develop your short-term strategy for fulfilling GRC requirements. With your goals in mind, identify the “quick wins” that will demonstrate GRC success and improvement. Aim to tackle the items that immediately show a return to the organization and build greater buy-in to the GRC strategy across business operations. This short-term plan should be no longer than 12 months.
• Conduct a comprehensive organizational risk assessment. Part of the short-term plan should be a detailed risk assessment that provides a common framework and catalog of corporate risks across GRC management silos. This risk assessment is used to



further identify and feed into the long-term comprehensive GRC strategy to help the organization better understand, manage, and monitor risk exposure.
• Provide a comprehensive action plan. With the short-term plan in place, focused on the easy wins, the organization can begin working on the long-term strategic plan that develops a comprehensive GRC strategy focused on process improvement. The harder and more challenging components of GRC should be brought into this plan. This plan is optimal when it covers a three-to-five-year period.

Further advice
Prioritization of risk and compliance activities needs to be decided at an enterprise level. This can be difficult, as silos of risk and compliance can function buried within different functions of the business. To overcome

this and facilitate a top-down approach, a sustainable GRC strategy requires that the organization get executive buy-in and support. This provides endorsement of the effort and overcomes the obstacle of silos wanting to work independently and do things their own way.

One thing is certain—risk and compliance burdens are not going away. Government regulators continue to impose control on organizational practices through tighter regulation. Business partners are requiring stronger controls within their relationships. The globalization of business introduces significant risk, with more points of vulnerability and exposure to the organization. The time is now for organizations to define and implement a sustainable GRC strategy that drives sustainability, consistency, efficiency, accountability, security, and transparency of GRC across the organization.
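As a rough, hypothetical illustration of the risk-assessment step above (merging each silo's risk register into one enterprise-level catalog on a common scoring scale), the following Python sketch invents all names, silos, and scales purely for demonstration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RiskEntry:
    """One risk as recorded by a single silo (hypothetical 1-5 scales)."""
    silo: str
    name: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (minor) .. 5 (severe)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

def build_catalog(entries):
    """Merge per-silo entries into one enterprise catalog keyed by risk
    name, keeping the highest score reported for each risk, then rank."""
    catalog = {}
    for e in entries:
        if e.name not in catalog or e.score > catalog[e.name].score:
            catalog[e.name] = e
    return sorted(catalog.values(), key=lambda e: e.score, reverse=True)

entries = [
    RiskEntry("IT", "customer data breach", 3, 5),
    RiskEntry("Legal", "customer data breach", 2, 5),
    RiskEntry("Finance", "key vendor default", 2, 3),
]
for risk in build_catalog(entries):
    print(f"{risk.name}: {risk.score} (reported by {risk.silo})")
```

The point of the sketch is the de-duplication across silos: two functions may report the same risk with different ratings, and the enterprise view keeps the worst case rather than counting it twice.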


President, Risk and Compliance Advisor
Corporate Integrity

Michael Rasmussen is the authority in understanding governance, risk, and compliance (GRC). He is a sought-after keynote speaker, author, and collaborator on GRC issues around the world, and is noted for being the first analyst to define and model the GRC market for technology and professional services. During his career, Michael has worked in the market analyst, consulting, and enterprise sectors. Prior to founding Corporate Integrity, Michael was a Vice-President and top analyst at Forrester Research, Inc. Before Forrester, he led the risk consulting practice at a professional services firm in the Midwest.


THE WORLD’S MOST IMPORTANT GATHERING OF CIOs AND SENIOR IT EXECUTIVES BALANCING COST, RISK AND GROWTH Symposium/ITxpo 2009® is designed to deliver the insight, tools and relationships you need to get through what may be the toughest year of your career. More than 200 presentations delivered by world-renowned Gartner analysts will cover all facets of how business technology can help you strike the right balance between cost optimization, risk mitigation and a carefully timed return to growth. In challenging times, organizations rely on their leaders.

IT leaders rely on Symposium.


Meet IT’s best minds. Keynotes by top CEOs.


CIOs, senior IT executives, and industry experts conferring on tough challenges.


Immediately actionable take-aways for each of nine IT leadership roles.


The world’s top technology companies across IT.

Visit for an exclusive Enterprise Technology Management discount on your registration.

© 2009 Gartner, Inc. and/or its affiliates. All rights reserved. Gartner is a registered trademark of Gartner, Inc. or its affiliates. For more information, e-mail or visit



company feature n information management

Journey to the center of corporate information ecosystems: The crucial realignment of IT and business priorities
COMMVAULT tells us how the power of Singular Information Management® provides the catalyst for delivering corporate change and shared ownership of valuable data assets.




In the world of science, an ecosystem is a geographical area where plants, animals, the landscape, and the climate all interact together. This complete community of living organisms as well as non-living materials functions as a cohesive unit, regardless of whether the ecosystem is a rainforest covering an area larger than many nations, or a simple puddle in a backyard garden. All of the ecosystem elements operate in a balanced fashion, constantly adapting to environmental changes in order to thrive.

A similar concept can apply to the business world, where an information ecosystem embodies the people, processes, technologies, and data that exist within the confines of a corporation. As such, interlocking bits and bytes of data travel throughout the organization as well as to customers, partners, suppliers, regulators, and other key external audiences. Unlike the rainforest or backyard puddle, however, the elements of most information ecosystems today are completely out of balance, due to a rapidly spreading and increasingly complex web of files, emails, documents, and applications that are growing wildly out of control.

Fortunately, much-needed relief can be found without over-investing in hardware and software or reinventing the entire organization. What it takes is a realignment of IT and business strategies, and a crucial meeting of corporate minds to get everyone focused on regaining the company’s information equilibrium with unified data strategies that support both current and future business needs.

The keyword for this realignment is change. Change embodies ecosystems. Put simply, to change, a transition needs to occur from one state to another. Information and data management are no different. To gain the sort of integration that companies require in terms of cohesive data and information strategies, change is required.
Yesterday—roots of data fragmentation
Generally speaking, our recent computing past is characterized by company IT initiatives to move off legacy computing platforms in favor of more flexible and open systems. In doing so, countless opportunities are gained to access, manipulate, and share data in innovative ways. All this newfound “openness”, however, gave way to a host of fragmented solutions for managing, storing, and protecting what was

expected to be steady data growth but quickly became a never-ending explosion of data. In most organizations, data growth has now reached an all-time high, with 60% or higher year-over-year increases becoming the norm.

Unfortunately, these legacy quick-fixes led to disparate point solutions aimed at specific problem areas such as application scalability, content-specific data management and protection, departmental compliance, ad-hoc security, and siloed, long-term records retention. While these isolated solutions provided some short-term relief, they also exacerbated a problematic, recurring cost “virus” in terms of the exponential challenges of adding—and often duplicating—storage, infrastructure, power, reporting, and policy management. Meanwhile, coping with external factors, including regulatory mandates, green IT initiatives, and eDiscovery requirements, caused additional reactive pressures on IT departments, which led to further fragmentation of data. The result: failed attempts to save storage, power, and data center resources. Making matters worse is the time-consuming, costly, and tedious task of “hunting” for specific pieces of data in order to comply with regulatory or legal discovery demands.

The real root of the problem stems from an age-old disconnect between IT and business leadership when it comes to identifying, assessing, and classifying information based on its intrinsic merit to the organization. IT historically has worked in a vacuum since business executives have been remiss in taking ownership of the underlying information ecosystem. As a result, IT has focused on what IT does best—delivering leading-edge applications and tools to users. But this has all been done without broader insight into the impact these applications, tools, and data proliferation have had on the overall business.
Today—crossing the great divide
Clearly, technology and business teams need to close the gap and focus on developing proactive information strategies that maximize the value of information to the organization. Simplification and unification are required to provide improved data control and streamlined access to vital corporate information assets. To accomplish this, IT must change its pattern of technology investments, currently characterized by knee-jerk reactions to skyrocketing data growth, backup, and recovery bottlenecks, as well as other stopgap measures to address isolated data management problems.

Conversely, the business side of the house needs to step up to the plate and get involved in determining information workflow for the entire organization, so there’s greater insight into evolving corporate direction and the role technology can play in bolstering the bottom line. How can this be accomplished? In short, IT and business must work together proactively to develop a holistic approach to information management. To kick off the realignment process, IT and business leadership need to get in the same room to assess, analyze, and create a blueprint of how their information assets flow throughout the organization.

The future—it’s as simple as connecting the information dots
All information profiling starts with good data analysis. Therefore, the first “dot” is an audit of existing corporate data that needs to connect with the business context that creates value, and transforms it into information. To determine the value, the unified technology and business team must embrace a singular focus and mission in assessing business requirements as they relate to the following essential elements:
• Asset type
• Risk
• Retention
• Accessibility
• Protection
• Security
• Lifecycle, and
• Destruction.
Today, certain types of data may have little risk of exposure, so there’s no requirement for special storage or retention policies. Yet even a minor legislative or organizational change can alter the risk profile dramatically, and elevate once-insignificant information to the highest risk-alert status. Companies must be agile enough to make incremental modifications to their information frameworks and adapt to change. This process will include the reclassification of information assets, comprising both live and long-term data, as they suddenly become more critical from a risk management perspective. Retention policies also play a part.
The growing requirement in today’s organizations is to facilitate different retention policies across similar information assets based on business access and risk requirements (e.g., regulatory retention vs. civil evidence preservation). In carrying out the overall audit, the business side must be fully engaged with IT, establishing sponsors to aid in information assessment and analysis.
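The reclassification idea described above (a legislative or organizational change elevating a once low-risk asset and lengthening its retention) could be sketched as follows. The `AssetProfile` fields, risk tiers, and retention floors are all hypothetical, invented for illustration:

```python
from dataclasses import dataclass

# Illustrative retention floors (years) per risk tier; real values would
# come from the organization's regulatory and business requirements.
RETENTION_FLOOR_YEARS = {"low": 1, "medium": 5, "high": 10}

@dataclass
class AssetProfile:
    """Hypothetical profile covering a few of the assessment elements."""
    name: str
    asset_type: str       # e.g. "email", "file", "image"
    risk: str             # "low" | "medium" | "high"
    retention_years: int
    accessible_for_discovery: bool

def reclassify(profile: AssetProfile, new_risk: str) -> AssetProfile:
    """Re-rate an asset after a legislative or organizational change,
    raising its retention period to at least the floor for the new tier."""
    profile.risk = new_risk
    profile.retention_years = max(profile.retention_years,
                                  RETENTION_FLOOR_YEARS[new_risk])
    return profile

memo = AssetProfile("staff-memo", "email", "low", 1, False)
reclassify(memo, "high")   # a new regulation elevates once-insignificant data
print(memo.risk, memo.retention_years)  # high 10
```

Note that retention only ratchets up to the new floor; an asset already held longer than the floor (say, seven years) keeps its longer period, which mirrors the article's point about layering regulatory retention over existing business policies.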




Once the information blueprint is in place, the technology team can then initiate development of unified data strategies that blend best-of-class technologies with best practices to reduce costs, management complexity, and risk. In doing so, each information asset must be reviewed as it traverses its workflow through the organization. Developing a thorough understanding of these phases will prove pivotal in determining which data management strategies will be most effective in meeting both short- and long-term objectives.

SIX DEGREES OF DATA SEPARATION
Information is typically managed in six distinct, separate phases that aren’t necessarily sequential. Since information will have varying degrees of importance to technology and business stakeholders, it’s critical to consider both points of view when answering the following questions for each step. The six phases are:
• Conception—what is the origin and nature of the information? For example, is it an email, file document, or image?
• Proliferation—is the information meant for internal and/or external use? Will it be sent or was it received?
• Exploitation—how accessible must this information be? Will it be used for any legal or regulatory filings? Will it need to be searched or used in a discovery process?
• Revision—how is version control managed? Will different iterations of this information need to be available for auditing purposes?
• Retention—how is the information protected? What are the most appropriate storage technologies and security measures that need to be taken to ensure long-term information authenticity?
• Disposition—how will this information be deleted or purged?
How a piece of information is created and the context of its origin often has a major impact on its overall value to users and the business. For that reason, it’s important to group low- and high-value information to determine the appropriate level of data management. Once value has been assigned, it’s much easier to determine appropriate strategies and policies for managing data workflow, processes, storage, and protection. High-priority items, for instance, will require the most stringent data protection and security. Most likely, these mission-critical pieces of information also will be retained for the longest periods of time, yet need to be accessible for eDiscovery.

By aligning, dissecting, and classifying data based on information asset type and underlying business value, it’s possible to reduce duplication, storage costs, and administrative overheads. Also, a cohesive information management strategy, supported by unified data management solutions, will deliver dramatic improvements in operational efficiencies, scalability, and agility, in turn providing greater business value.
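The six phases and their review questions could be modeled as a simple per-asset checklist. This Python sketch is purely illustrative, and the question wording is condensed from the sidebar:

```python
from enum import Enum

class Phase(Enum):
    """The six phases, each paired with its core review question."""
    CONCEPTION = "What is the origin and nature of the information?"
    PROLIFERATION = "Is it for internal and/or external use?"
    EXPLOITATION = "How accessible must it be, e.g. for discovery?"
    REVISION = "How is version control managed?"
    RETENTION = "How is it protected and stored for the long term?"
    DISPOSITION = "How will it be deleted or purged?"

def review_checklist(asset_name: str) -> list[str]:
    """Emit one review item per phase for a given information asset."""
    return [f"[{asset_name}] {p.name.title()}: {p.value}" for p in Phase]

for item in review_checklist("contracts-mailbox"):
    print(item)
```

Running the checklist once per asset class, rather than once per product silo, is the sidebar's underlying point: both business and IT stakeholders answer the same six questions about the same asset.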

FLEXIBLE FRAMEWORKS FOR THE FUTURE
All ecosystems are living, breathing entities that change frequently to adapt to fluctuating environmental conditions. The same holds true for information ecosystems, which need to accommodate shifting business directions, changing economic conditions, ever-evolving customer requirements, and the ongoing introduction of new business processes and technologies.

What’s missing in an overwhelming majority of today’s data management solutions is ample flexibility to keep pace with mutations and transformations in the ecosystem. In fact, most data protection vendors provide only loose integration between different point products, and limited collaboration between storage and document management. The inflexibility of existing data management solutions severely limits a company’s ability to repurpose information and modify the way it’s managed, exploited, accessed, and retained.

Consider the journey that most organizations now need to take in order to exploit the greater value of their data. The shortcomings of existing products as enabling technologies in this journey, and the difficulties and frustrations of managing ad-hoc, disjointed solutions, have inspired a completely different and better way—a single platform approach to information management, with solution modules sharing a common code and function set designed to work together from the ground up. With its Simpana® software foundation, CommVault has created a single-platform architecture with centralized information management to ease data management through an entire lifecycle while providing unprecedented control over data growth, costs, and risks. For the first time, companies can move away from disparate products for data


protection, replication, archive, and resource management, as well as avoid the time-consuming and costly challenges of managing separate infrastructures. Simpana software makes it possible to manage disparate data from multiple sources simultaneously with the same underlying data management foundation. This singular approach also makes it much easier to move data and archive it offline until needed. For many companies, having a single platform also liberates information that has been trapped inside backups and archives or other applications, in order to create more compelling business value.

Long-term business success will hinge on how effectively organizations can unleash the power of their information while collaborating within and between groups to extract information value from their data assets. CommVault’s unifying strategy is a welcome catalyst for corporate change that effectively and economically handles today’s tough problems, yet is agile enough to grow and change in supporting the needs of tomorrow.

“The real root of the problem stems from an age-old disconnect between IT and business leadership…”

When litigation arises, will you have everything you need at your fingertips? Access & discover relevant ESI—including email messages, files, and backup data—at a moment’s notice with Simpana® software. When faced with legal challenges, every minute matters. After all, opposing counsel isn’t going to wait while you sift through years of documents, trying to find relevant data. CommVault® Simpana® 8 makes eDiscovery simple by providing you with a range of information management capabilities to deliver efficient and intuitive legal access to corporate email messages, files, and collaboration data. Simpana 8 enables search and classification across all ESI including laptops/desktops, backup copies, managed legal preservation, and archive retention—all from a single console and single infrastructure. To find out more about the most comprehensive, risk-averse, and cost-managed eDiscovery solution in existence, go to






©1999-2009 CommVault Systems, Inc. All rights reserved. CommVault, the “CV” logo, Solving Forward, and Simpana are trademarks or registered trademarks of CommVault Systems, Inc. All other third party brands, products, service names, trademarks, or registered service marks are the property of and used to identify the products or services of their respective owners. All specifications are subject to change without notice.

Analyst feature n ITIL

A tailored solution


ADRIAN POLLEY (PLAN-NET) examines whether ITIL—the best practice approach to IT Service Management— can actually deliver genuine business benefits, and if it will survive the recession intact.





ITIL implementations have become one of the first projects to be scrubbed off the list by those who hold the purse strings

ITIL—the IT Infrastructure Library—was created in the 1980s by the UK Government’s CCTA (Central Computer and Telecommunications Agency) with the objective of ensuring better use of IT services and resources. It was very different to the current ITIL, but still focused on service support and delivery. During the boom years the uptake in implementing ITIL increased dramatically in both government and commercial organizations as companies strived to deliver best practice. It reached a level where some form of the discipline was in evidence in 40-60% of IT departments, depending on which industry figure you choose to believe.

However, with the recent sharp downturn in economic conditions, ITIL implementations have become one of the first projects to be scrubbed off the list by those who hold the purse strings. So why has ITIL suffered more than many other projects under these circumstances? Perhaps it is simply that people do not believe that it will actually deliver the benefits that its advocates would have us all believe. Certainly they have reason to think so when success stories showing tangible business benefits seem few and far between.

So what happened? In my experience, the problem does not lie directly with ITIL. In fact, it can be used very effectively to generate efficiencies and make improvements to services. Few would dispute that, implemented properly, a best practice approach will deliver benefits, and ITIL is still the industry standard when it comes to best practice in IT service management. Where the problem lies is with the kind of full-scale, by-the-book implementations that were typically carried out before the downturn started to raise questions. The danger with this kind of untailored,


off-the-shelf approach is that a large number of the processes and procedures implemented do not take into account the specific requirements of the organization in question. This means they can be unnecessary, or worse still, inhibitive and directly at odds with the needs of the business. The extra work and unnecessary expense of a by-the-book implementation could be swept under the carpet when times are good and money isn’t a problem. Now, however, when minds become focused on the bottom line and pencils are sharpened, there is nowhere to hide for projects that promise much and cost the earth, but place their focus on the journey rather than the end result.

This revelation has led to some casualties. With the substantial financial rewards on offer to an IT service management consultancy involved in a full-scale, untailored, by-the-book ITIL implementation, there was no shortage of suitors promoting this approach. The problem for many of these consultancies is that they have not been able to adapt to changes in economic conditions, and the simple fact that there are far fewer businesses willing to undertake projects of this nature. Since the focus has turned to a more service-orientated, efficiency-driven way of thinking, it is no longer enough to simply use ITIL certification as the justification for a project. Consultancies that previously used this method have found themselves unable to provide the examples of measurable ROI that their prospective clients now demand.

However, it is not all doom and gloom for ITIL. Far from it. In fact, the shift in thinking has forced ITIL to evolve into something better. While the recession has undoubtedly uncovered some huge issues with a certain type of ITIL implementation, it has also shown how to get the most from it. While ITIL projects seem to have become less common, ITIL is still thriving as a standard for tailored, best practice-aligned projects with clear business goals in mind.





... ITIL implementation alone does not automatically deliver a best practice IT department, and is even less likely to deliver a fit-for-purpose one

Where to now?
So where does this leave an organisation that recognizes the benefits best practice can bring, and is looking to use ITIL standards within their service? The outlook is surprisingly positive. Best practice need not be tossed by the wayside—quite the opposite. ITIL still has a major part to play in improving efficiency and service levels, and as such is a perfect fit as a guide for surviving the downturn and capitalizing on the upturn when it eventually comes. It should, however, always be considered as a guide and always tailored selectively to the specific needs of each organisation. Best practice is beginning to be replaced by “fit-for-purpose” and the industry will be all the better for it. The secret is not to make the same mistakes as last time when the cycle begins again. Whatever the economic conditions, IT will maximize its use to the business by running in the most efficient way possible, and as such, all ventures

into the world of best practice should be conducted with this in mind. It is also important to remember that the best practice approach does not begin and end with an ITIL implementation. One thing we always communicate to our clients is that an ITIL implementation alone does not automatically deliver a best practice IT department, and is even less likely to deliver a fit-for-purpose one. Processes and procedures are enablers for the improvement of service, and it is in using them that an organization will see their benefit. With this in mind, it is just as essential for a best practice environment to possess the right people and the right technology in the right place, as it is to implement a standard such as ITIL—in any form. Where the recession has given another unlikely helping hand is by forcing many organizations into cutting everything but the necessities of service. In doing so, they have often discovered the most streamlined way of running their IT service.

Adrian Polley

Plan-net

Adrian Polley heads up Plan-net’s Information Security and Infrastructure consulting practices. Throughout his 15 years at Plan-net, he has maintained a hands-on approach and still delivers a significant portion of customer-facing work. He is an experienced consultant with a track record of delivering business-focused IT consulting, particularly in the areas of information security and IT strategy, as well as the ability to deliver technically complex projects and manage change within major businesses. Adrian is ultimately responsible for Plan-net’s ISO accreditations.




There are some caveats involved. It obviously requires that the right cuts be made, and that service levels do not diminish as a result of these cuts, but with the right guidance, a framework for a super-efficient, high-performance IT service can rise from the ashes of budget-led cuts to the department. It cannot be said too many times that disciplines such as ITIL should always be tailored to the needs of the organization in question, and it can therefore be seen as one of the few positives of the unfortunate economic situation that the industry has been forced to examine the way it approaches best practice. In order to ensure momentum is not lost, the industry should say good riddance to ITIL for ITIL’s sake once and for all, and embrace the tailored, ITIL-aligned approach to service management that will enable us to survive the downturn and capitalize on the future.

Don’t just store it. Do more with it.
• Are you struggling to manage increasing volumes of information efficiently?

• Do you need to optimise your storage capacity to save costs? • Is your information stored securely?

• Are you compliant with the latest legislation?

It’s not just about how you store your information. Find out how to do more at Storage Expo – the definitive event for data storage, information and content management.

14-15 October 2009, Olympia London. Register for free at – Organised by:

Platinum sponsors



Walk the line
Some organizations will tell you that PCI DSS is too complex; some say it’s an extremely weak standard, but is it still relevant and doing its job? ETM’s ALI KLAVER addresses the issues, and some new wireless standards, with expert advice from MIKE BAGLIETTO (AIRTIGHT NETWORKS) and ANTON CHUVAKIN (QUALYS).



AK: FIRSTLY ANTON, TELL ME WHAT IS THE MOST CHALLENGING PART OF PCI DSS? AC: PCI applies to people who take credit card numbers, store them, transmit them, and who otherwise deal with credit card or debit card data. Typically what I hear is that the most important and challenging part about PCI is making people understand that they are under this mandate and must protect cardholder data. In addition to this, there are two other things that I consider key challenges. One is that for many organizations the hardest part, after understanding that they need to be PCI compliant, is where do they start? This question has a very easy answer—you find out where the credit card data is, where it should be, and where your business processes

can be adjusted so you don’t store that data, thus reducing the scope of PCI. Reducing scope is quite important since it reduces the amount of data you need to protect. The other important challenge happens immediately after—how do you prioritize your efforts? What do you do first, and how do you consider your risks, threats, and vulnerabilities to plan your PCI project and protect cardholders’ data? MB: I agree. One of the things we see most is the applicability of PCI standards. We get a lot of customers saying that they’re not sure if it applies to them. It casts quite a wide net, and I think there are a lot of requirements that go into addressing how merchants prioritize and evaluate risk and so on. PCI did a good job of creating a set of milestones to help these merchants.


They created a document that outlines all the requirements and puts them in priority order. You have to look first at those areas that carry a higher risk and sense of urgency, such as encryption or firewalls, and then at things that are more procedural, that you maybe need to do on a quarterly basis, such as scanning your air space for unauthorized wireless devices. Essentially, it is about: a) understanding that you have to do this, and b) prioritizing effectively, and PCI will help you with that. AC: Yes, that’s an easy answer. The standard is relevant, and I’ve found that there are almost equal volumes of criticism from opposite sides: some say it is too weak and needs much more enforcement and fines, while others say that it is too strong and mandates too many onerous and supposedly “unneeded” things.

When I think about such responses, they indicate that PCI is about right. The fact that there are very different and opposing criticisms is a really good thing, because it pushes people to actually protect the data. MB: I think PCI as a standard, and the PCI Council, does a pretty good job with their working groups in order to stay on top of new developments in various areas. From a security professional’s point of view, no standard can ever dictate specifically every single element of a security policy in order to be secure. Networks, environments, and configurations change—in some cases quite frequently when you consider wireless and mobility in the enterprise. So, it’s not a prescription to cure all ills. It should be looked at as a road map and a guideline to the things organizations must do to secure cardholder data.

I think one of the biggest challenges is that people don’t want to continue to invest in technology on the off-chance that PCI may change their requirements. For example, with respect to the wireless element of the PCI standards, people don’t want to invest in certain tools until the PCI Council mandates that they do, because if PCI requirements change, they are going to have to redeploy new tools. AK: I THINK IT’S INTERESTING TO HAVE SUCH DISPARATE VIEWS OF PCI WHICH, AS ANTON SAID EARLIER, IS ACTUALLY A GOOD SIGN OF ITS RELEVANCE. RECENTLY, WE’VE HEARD STORIES ABOUT BREACHES WHERE COMPANIES ARE NOT “COMPLIANT” BUT ONLY “VALIDATED.” CAN YOU



These new guidelines clearly dictate that there does need to be a wireless component to your security policies

EXPLAIN THE DIFFERENCE BETWEEN THE TWO? AC: Of course. That’s actually a common misconception which is treated unfairly by the media sometimes, because they tend to say that a PCI compliant company was breached. Now, we know that PCI compliance status needs to be validated—the status is checked using certain mechanisms, so the larger, level one merchants do an on-site assessment via a special auditor or assessor called a QSA (Qualified Security Assessor). Small companies do a self-assessment, fill out the questionnaire, and state their compliance status at that moment. When a QSA leaves your environment and prepares a report on compliance, your compliance status is validated. You would most likely be in compliance at the time of the validation, but if you then switch all those things back to what they were before, you would not be compliant, and could be breached as a result. The easiest way to summarize this is that validation is a point in time—you check today, and you’re valid today. Compliance is an ongoing status—you can float in and out of compliance for certain reasons. That distinction is very often missed by the media when they say a compliant company has been breached. MB: I think the point here is that, whether you are mandated by some compliance body or not, a security breach is still a security breach. The difference is that compliance, as Anton says, is a continuous process. Configurations change, devices get added to or removed from the network, and clients come and go in the environment, especially with wireless—talk about an environment that’s really dynamic. You need to have strong security guidelines within your organization and you need to make sure that



you have best practices implemented as part of your day-to-day operations. When you look at compliance, especially the wireless components of it, you know this is one area that’s only going to get stronger. Certainly, the mandated quarterly scan of all locations is not frequent enough. AK: LET’S TURN NOW TO ADDRESSING SOME POINTERS FOR COMPANIES THAT ARE JUST BEGINNING THEIR PCI DSS COMPLIANCE PROJECT. WHAT ADVICE CAN YOU GIVE TO THESE COMPANIES? AC: The most important advice is to understand where the PCI standard applies—what is its scope? Before you even think of how to reduce the scope, you should think about what kind of business processes run in your environment that use payment card data. The second thing to think about is whether the data should be there—is there any way to avoid holding the card data in so many places? If you can shrink the number of systems that touch the data, your PCI project becomes much simpler. So, the first tip is to make sure that you know what the scope is, and the second is to make sure you can reduce it. A good way to do this is to outsource the process to somebody else, which makes your own PCI compliance validation scope extremely small. MB: What will make it much easier to meet your compliance, and more importantly your security requirements, is understanding where you keep the data, why you keep it, and eliminating anything that doesn’t need to interact with it. Number two is that I think you need to be honest with yourself as you go through the requirements. You need to look at your organization through the eyes of an auditor and ask if this is acceptable.
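Anton’s two tips—know your scope, then shrink it—can be made concrete with a small sketch. The system names and attributes below are invented for illustration (no real PCI tooling is implied); the point is simply that once you record which systems store or process cardholder data, the in-scope set falls out mechanically:

```python
# Hypothetical inventory of systems and whether each touches cardholder data.
systems = {
    "web-storefront":  {"stores_pan": False, "processes_pan": True},
    "payment-gateway": {"stores_pan": True,  "processes_pan": True},
    "order-database":  {"stores_pan": True,  "processes_pan": False},
    "marketing-crm":   {"stores_pan": False, "processes_pan": False},
}

def in_pci_scope(attrs):
    """A system is in scope if it stores or processes cardholder data (PAN)."""
    return attrs["stores_pan"] or attrs["processes_pan"]

in_scope = sorted(name for name, attrs in systems.items() if in_pci_scope(attrs))
print(f"{len(in_scope)} of {len(systems)} systems in PCI scope: {in_scope}")
```

Removing card data from a system—for example by handing payment processing to an outsourcer—flips its flags and visibly shrinks the list, which is exactly the scope reduction Anton describes.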

I think PCI has done a very good job of outlining requirements in the order in which they think you should go about meeting compliance, because they know it’s a big task. What’s really important with new technology, like wireless for example, is to understand those components at the beginning so you are not playing catch-up. There are ways to invest in tools and processes that make it easier to secure the data and maintain compliance. AK: WE OFTEN HEAR THAT PCI COMPLIANCE DOES NOT ACTUALLY ENSURE SECURITY. SO IS THERE ANY STANDARD THAT DOES? PERHAPS ISO CAN PERFORM THIS FUNCTION? AC: That is a bit of a trick question. Imagine a large organization with thousands of employees, many thousands of servers, desktops, and other information systems, with complex business processes that rely on IT to operate. Now, imagine that an outsider with no insight into how that business runs writes a huge document outlining all the steps the organization should take to protect it from hackers. At this point, most readers with any connection to IT security will be on the floor laughing, because there is absolutely no way such a document can guarantee security. Now, with that out of the way, our expectations are clearly reset. Neither PCI nor any other external guidance can ever guarantee “unhackability.” So what is the point of having this external document? It is to start your security management programme and move it from complete security ignorance to a certain degree of enlightenment.


This is the function of PCI, and the function of all other security standards. They provide a beginning, not an end, and that’s an important message when you are talking about data security, network security, and wireless security. MB: I think this question comes up because, regardless of which body is generating these guidelines, you are going to hear people say: “It’s too hard, it’s too complex, and it’s completely inefficient.” I think the whole point is that it is a road map for getting this process started and keeping it moving in a positive direction, addressing new technologies and embracing new security measures along the way. Compliance will never equal security unless organizations embrace these guidelines as a road map for their data operations and security policies. AK: I KNOW WE WERE TALKING EARLIER ABOUT COMPLIANCE AS A WORK IN PROGRESS, SO TO SPEAK, BUT WHY DO YOU THINK A STRICTER WIRELESS COMPONENT HAS BEEN ADDED TO PCI DSS, AND WHAT DOES IT MEAN FOR MERCHANTS? MB: I think the adoption of wireless throughout retail organizations, combined with several high-profile breaches, added to the sense of urgency for wireless security and the tightening of standards. In fact, on July 16th the PCI standards body released new guidelines for wireless security and compliance. What’s interesting is that they did a good job of putting the wireless requirements in the context of scope. They also acknowledge that wireless vulnerability scanning is mandatory across all locations. Before, it used to be acceptable to just do sampling, and while your auditor may only look at a sample of your scans, you are required to be compliant across all locations at all times. What’s interesting in these new guidelines is that the PCI Council goes one step further.
They acknowledge that one of the common means of scanning—typically a handheld analyzer, or a spectrum analyzer on a laptop—is not the most cost-effective or efficient, and maybe not even the most accurate, way to go about it. So they are strongly recommending that larger organizations with a distributed network of locations invest in automated solutions like a wireless intrusion prevention system.
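The move from sampled to universal scanning is easy to sketch. Location names and dates below are hypothetical; the check is just whether every site has had a wireless scan within the last quarter:

```python
from datetime import date, timedelta

QUARTER = timedelta(days=91)  # quarterly scan window (approximation)

# Hypothetical last-scan dates per retail location.
last_scan = {
    "store-001": date(2009, 7, 1),
    "store-002": date(2009, 3, 15),  # stale: last scanned over a quarter ago
    "store-003": date(2009, 6, 20),
}

def overdue_locations(last_scan, today):
    """Return locations whose most recent wireless scan is older than a quarter."""
    return sorted(loc for loc, d in last_scan.items() if today - d > QUARTER)

print(overdue_locations(last_scan, today=date(2009, 8, 1)))
```

An automated system like a wireless IPS effectively runs this check continuously for every site, rather than relying on someone visiting each location with a handheld analyzer.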

The guidelines also talk about introducing wireless IPS requirements. These new guidelines clearly dictate that there does need to be a wireless component to your security policies. It’s clear that the PCI Council is taking this very seriously. AK: NOW TO WRAP UP WITH OUR LAST QUESTION, MIKE: QUALYS HAS BEEN PROVIDING ON-DEMAND IT SECURITY RISK AND COMPLIANCE MANAGEMENT SOLUTIONS FOR YEARS. WHY DID AIRTIGHT NETWORKS LAUNCH AN ON-DEMAND WIRELESS NETWORK VULNERABILITY MANAGEMENT SERVICE? MB: About a year and a half ago we were servicing a lot of merchants with our on-site wireless intrusion prevention application. What we found was that PCI compliance was becoming a higher priority in the organization, and so a couple of dynamics were changing. One is that money was being allocated for meeting compliance. Some of the capital budgets that they may have had in previous years were now being redirected to investing in tools and meeting compliance.

The other one is the economy. After talking to a lot of different customers, what we found is that they had a need for our solutions as a more flexible model to meet their current business challenges. We were able to deliver our solution in a software-as-a-service model, so customers could now buy wireless security, wireless vulnerability scanning, and PCI compliance solutions on-demand, to give them the flexibility, capability, and functionality that they’d like. They can buy as little or as much functionality as they’d like. That was appealing to those who were especially interested in meeting their compliance requirements first and foremost, and then in upgrading to stronger security down the line once they got over that initial hurdle. It also gave them a flexible working model which allowed them to spend their operating budget on outsourcing these tools, rather than a capital budget to buy them. So, it was able to meet both challenges of compliance and economy at the same time, giving customers a choice in what they decided to buy and how they wanted to manage it.



Anton is author of Security Warrior and a contributor to Know Your Enemy II, Information Security Management Handbook, Hacker’s Challenge 3, PCI Compliance (the second edition is in production), OSSEC HIDS and others. Anton also presents at security conferences worldwide, and in his spare time blogs at www. Anton comes to Qualys from LogLogic, where he held the title of Chief Logging Evangelist. Anton holds a PhD from Stony Brook University. See for more information.



Mike is responsible for product marketing at AirTight and oversees the inside sales team and lead generation. He is a veteran of the Silicon Valley high-tech industry with 20 years of experience in product marketing, sales, and technical account management. Prior to joining AirTight, Mike held the position of senior product marketing manager for data protection services at eVault. He holds a BA in International Relations from the University of California, Davis.



identity management n in the hotseat

Spend less…

and improve access control There are many examples of companies who were unable to fulfil all compliance requirements—with negative consequences. ALI KLAVER (ETM’S MANAGING EDITOR) talks to MARTIN VLIEM (MICROSOFT) and PAUL HEIDEN (BHOLD COMPANY) about their vision on identity and access management, and a concrete solution to help you.




AK: CAN YOU BOTH TELL ME ABOUT THE CHALLENGES FOR IDENTITY AND ACCESS MANAGEMENT—WHAT’S GOING ON IN THIS MARKET? PH: I think one of the biggest compliance trends at the moment is that organizations are required to do a lot of things to protect access to their information and prevent its misuse. So, a major priority for identity management is definitely compliance. MV: Another trend we see concerns new hosting and sourcing strategies, most noticeably public and private cloud computing, which basically involve additional challenges in facilitating, managing, and controlling seamless and integrated access, and keeping control within this extended IT landscape. PH: A more long-term driver is agility—the challenge to quickly adopt changes in the organization structure due to mergers or acquisitions, or “classic” but now very frequent and flexible hire/fire scenarios: not only revoking permissions and access for leaving employees, but also making new employees productive as quickly as possible. MV: And of course, in current times, cost efficiency is always important. Not only through the enablement of self-service and the automation of costly identity provisioning tasks, but also by saving time, and thereby money, spent on external audits, identity

 42

and access management can make a significant contribution. AK: THESE FOUR ARE CERTAINLY GENERAL CHALLENGES THAT ALMOST EVERY COMPANY IS STRUGGLING WITH. SEEN FROM THE PERSPECTIVE OF A SENIOR MANAGER, HOW CAN IDENTITY MANAGEMENT CONTRIBUTE TO MEETING THESE CHALLENGES? MV: Well, in its very essence, identity management is a business issue, though the solutions are mostly provided by the IT department. IT has become a very important production asset for many organizations—not just for service organizations like financial institutions or insurance companies, but for virtually any organization. PH: What I think is very striking is that it would be unthinkable for an automotive manufacturer not to control its production assets. What we see in IT is that organizations are too often not capable of controlling the use of their IT production assets. Even organizations that consider themselves “in control” rely only on detective controls. But, instead of controlling and adjusting access rights after things have gone wrong, it would make more sense to prevent



unauthorized access in the first place. That’s why we strongly believe in preventive measures instead of relying on detective controls like certification, which is in essence the very same as taking inventory.
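The preventive model Paul argues for can be illustrated with a toy gatekeeper (user and permission names are invented): entitlement is checked at the moment of request, so a violation is refused up front rather than discovered in a later inventory:

```python
# Entitlements: the access rights each person is allowed to draw.
# All names are illustrative.
entitlements = {
    "alice": {"crm-read", "crm-write"},
    "bob":   {"crm-read"},
}

granted = []  # audit trail of what has actually been handed out

def request_access(user, right):
    """Preventive control: check entitlement *before* granting anything."""
    if right in entitlements.get(user, set()):
        granted.append((user, right))
        return True
    return False  # refused up front—nothing for a later inventory to find

assert request_access("alice", "crm-write") is True
assert request_access("bob", "crm-write") is False
```

A detective control, by contrast, would hand out the rights freely and periodically compare `granted` against `entitlements`—after any damage has already been done.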

AK: CAN YOU EXPLAIN THIS A BIT FURTHER—THE IDEA OF PREVENTIVE INSTEAD OF DETECTIVE AUTHORIZATION MANAGEMENT? PH: Sure. What I very often use is a comparison to warehouses. Imagine IT as an enormous set of access rights, and that you need to ensure that thousands or even tens of thousands of people each get exactly the right set of access rights. Seen from that perspective, it’s very similar to managing a warehouse. Now, it would be a compliance-only approach to simply let everybody into your warehouse, let them take out the access rights they want and go off again, and then take inventory every week, every month, or every year to see what you’ve been missing. But, of course, if you find a violation during this inventory, the damage has already been done. What we believe would make far more sense is to put a counter between the people who need access and the stock of available access rights, and then allow the people who manage access to IT to control the process of issuing and revoking access rights. In that way you are able to preventively ensure that people only get those access rights they (a) are entitled to have, and (b) need to have.

AK: THE COMPARISON WITH WAREHOUSES CERTAINLY SEEMS FRUITFUL, SO LET’S TAKE IT A STEP FURTHER. BEING IN CONTROL OF THE WAREHOUSE ALSO ENABLES BUSINESSES TO REDUCE THEIR STOCK, DELIVER JUST-IN-TIME, AND SO ON—WOULD THAT ALSO BE POSSIBLE WITH ENTERPRISE AUTHORIZATION MANAGEMENT SOFTWARE? MV: Yes. In essence we put identity-based access at the centre; basically all access and all protection is centered around the users who need controlled but flexible access, just in time. However, that requires several capabilities (which in combination we refer to as Business Ready Security within Microsoft). The first is to integrate and extend: you have to ensure that the identity and security services you offer are available to all layers of your business and IT environment. The second is to protect everywhere and access anywhere. We see more and more companies becoming open enterprises, facilitating more services to their




partners, users, and employees. So, these are basically two sides of the same coin: protect what is important to you, but at the same time facilitate access and control at all times for authorized users. The third one is management and control, and simplifying the security experience. It’s about staying in control, managing security and who can access what, but also ensuring that users can work transparently with applications, change their credentials, manage their personal identity information themselves, and access permissions easily while keeping track of corporate and legislative restrictions. AK: SO FAR WE’VE SEEN THE CHALLENGES AND WE KNOW THE ANSWER—YOU PUT IDENTITY-BASED ACCESS RIGHT AT THE CENTRE. BUT LET’S NOW TURN TO SOME CONCRETE SOLUTIONS. HOW DO YOU AND YOUR COMPANIES, MICROSOFT AND BHOLD, HELP PEOPLE MEET THESE CHALLENGES?

MV: Well, that’s basically where one of our important solutions in the identity space comes in. Microsoft offers customers an identity management solution called Microsoft Identity Lifecycle Manager 2007. And we have a new upcoming release, called Forefront Identity Manager, expected to be available in the first half of 2010. As a very important extension, especially on the access side, we have BHOLD as a partner. They add advanced role-based access management to our Identity Lifecycle Management solution. AK: CAN YOU EXPLAIN THAT A BIT FURTHER? PH: When I founded BHOLD about twelve years ago, what we really wanted to achieve was to enable organizations to manage and control the process of issuing and revoking access rights. Role-based access control is a method that ensures you can do that. What the roles basically allow you to do is translate technical access into something that the business can understand, which on the one hand enables people to communicate easily

and clearly exactly what they need in order to do their job. On the other hand, because managers now understand what the access rights are all about they can also be held responsible, and that is a stepping stone towards compliance and meeting your responsibilities from a Sarbanes-Oxley perspective, or other legislative requirements. AK: NOW LET’S GO BACK TO THE CHALLENGES WE MENTIONED BEFORE. HOW DOES THIS SOFTWARE HELP ORGANIZATIONS TO SAVE COSTS, MINIMIZE RISK, AND BE MORE AGILE IN TERMS OF CHANGE? PH: I think that’s a pretty easy list. What I tend to do is look at this from three perspectives. If you look at the end-user level, what you’ll see is that by simply optimizing the relation between user lifecycle events and access management, you prevent people from waiting for the correct access. But it is not only making people more productive, it is also making your IT safer: virtually all access rights are automatically revoked if

a user no longer needs them. This prevents the accumulation of access rights, which is exactly what brought Barings Bank to its knees and almost did the same to Société Générale. Another important benefit is that it enables organizations to start working in a truly collaborative fashion with partners. What has always been difficult is allowing separate organizations to work together, for instance on a certain project, because it was hard to provide these people with the right access and then take it away again. By organizing identity and access management better, you enable the collaboration that is a necessity these days for efficient production, while ensuring that you don’t jeopardize your security. From the management level, there are also big savings to be made by simply allowing people, particularly the IT department, to work more quickly. It is so much easier for them to quickly absorb, for example, an acquisition or divestment. Instead of IT being an impediment, the organization becomes a proactive, responsive unit, capable of quickly handling changes




that occur every now and then, or even daily, in the business. Finally, once you really control your access management, a big advantage is that you will really reduce your cost of compliance, simply because it’s no longer necessary to have managers detectively check all the access rights. Instead of having audits on the actual access, you can have audits on the processes, and that really reduces cost and time. We have had indications from customers that this approach quickly drops the cost of IT audits by more than 50%, and I think that’s a pretty important thing. MV: In addition, the combined solution also helps to reduce costs significantly through relatively straightforward capabilities, such as pushing identity management tasks like authorization approvals, end-user access requests, or password self-service reset back into the business where they belong, thereby effectively reducing helpdesk and IT operations costs. Of course, you have to invest in software and its implementation, but the advantages Paul and I just described indicate an ROI in a relatively short time. But even more important is the fact that you are better able to

prevent unauthorized access and the potential damage that it can bring to your company. Knowing you are in control, and putting self-service in scope, makes a manager’s life, and an end-user’s life, much easier. PH: What we currently see as the biggest driver is that this joint solution is able to automate about 60-80% of the allocation of access rights, which simply happens through identity lifecycle events, while authorized access is granted automatically by default when needed. This really contributes to IT efficiency, end-user productivity, and security. AK: LET’S ADDRESS THE MORE PRACTICAL SIDE. SAY, FOR EXAMPLE, THAT I WANT TO IMPLEMENT THIS IDENTITY SOLUTION IN MY ORGANIZATION. HOW LONG DOES IT TAKE, AND IS THERE ANYTHING THAT CAN GO WRONG DURING IMPLEMENTATION? PH: BHOLD has been in this market for twelve years and Microsoft even longer. The solutions are completely integrated and based on the Microsoft technology platform. Both solutions are well integrated so that, for instance, the BHOLD

admin really enables a far easier rollout of Microsoft workflows. This simplifies implementation of the synchronization layer of Microsoft ILM, because the type of authorization instruction that has to be propagated is so straightforward that it doesn’t need any additional programming. And we can prove that: an implementation of this kind would, a few years ago, have taken perhaps a year—nowadays we are capable of doing it within a few months. MV: And to make sure that your investments make sense over the longer term, we stay focused on the integration between our current solutions. In the upcoming version—Forefront Identity Manager—BHOLD can be seamlessly integrated, so we’re also guaranteeing forward investments and benefits in the future. AK: CAN YOU GIVE OUR AUDIENCE AN EXAMPLE OF COMPANIES WHO ALREADY WORK WITH YOUR SOFTWARE? MV: There are actually a lot of companies using our identity management software. Specifically, there is a unique case of our combined solution working very well at HUK Coburg, for example, a German insurance company where

the access rights of more than 15,000 employees are managed by the combined Microsoft ILM and BHOLD identity management solution. PH: And then of course there are a number of important banks, like the Dexia bank in Belgium and France, or the Dutch ABN Amro bank in the Netherlands. AK: AND WHERE CAN PEOPLE TURN FOR FURTHER INFORMATION ON THIS SUBJECT? MV: The easiest way to get more information is to visit one of our websites: windowsserver2008/en/us/idaidentity-lifecycle-management.aspx or PH: But of course, we are also pleased to meet people at any major identity management event, particularly Microsoft events. And you are always welcome to stop by at our BHOLD offices. MV: And if you have a direct relationship with Microsoft, for example in the enterprise space, you can contact us through your Microsoft representatives, ask them about identity management, and we will tell you more about the combined solutions that are applicable.
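The lifecycle-driven automation Paul and Martin describe can be sketched in a few lines. Role, event, and permission names are invented, and no Microsoft or BHOLD API is implied—the sketch only shows the mechanism: business roles map to technical permissions, and joiner/leaver events drive assignment and wholesale revocation:

```python
# Business roles map to technical permissions; illustrative names only.
role_permissions = {
    "claims-handler": {"claims-app:read", "claims-app:update"},
    "hr-officer":     {"hr-app:read", "hr-app:update"},
}

user_roles = {}  # current role assignments, driven by lifecycle events

def on_joiner(user, role):
    """Joiner event: assign the business role; permissions follow from it."""
    user_roles.setdefault(user, set()).add(role)

def on_leaver(user):
    """Leaver event: revoke everything at once—no accumulation of rights."""
    user_roles.pop(user, None)

def permissions(user):
    """Union of the permissions granted by all of a user's business roles."""
    perms = set()
    for role in user_roles.get(user, set()):
        perms |= role_permissions[role]
    return perms

on_joiner("carol", "claims-handler")
assert "claims-app:update" in permissions("carol")
on_leaver("carol")
assert permissions("carol") == set()  # nothing left behind to audit away
```

Because a manager assigns the role "claims-handler" rather than raw ACLs, the access is both understandable to the business and automatically revocable when the lifecycle event fires.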

Paul Heiden


Paul started his career as officer of the Royal Netherlands Marine Corps. Having obtained a Masters in business and Roman law, he became legal counsel and frequently encountered the problem of controlling access to confidential information. In this period he developed the ideas on business driven access control that became the foundation for BHOLD in 1997, and developed into BHOLD’s leading business applications for authorization management today.

Martin Vliem


After completing his scientific study Philosophy of Science, Technology and Society at the University of Twente (The Netherlands), Martin Vliem (CISSP, CISA) was a teacher for a year at the same university. His profound interest in internet-related technologies led him to work for a large international ICT consultancy firm, where he became acquainted with the full complexity of large ICT projects. Currently Martin works with large enterprise customers on their ICT infrastructure architectures, translating their challenges into solutions based on technology from Microsoft and its partners.



Spend less

Improve Access Control

BHOLD Enterprise Authorization Management
Preventive authorization management
60% to 80% automated authorization
Let people work, not wait for access

Minimize Identity Management (IdM) Implementation Risks
A solution within weeks
Complete integration with Microsoft IdM solutions

Reduce Cost of Compliance
Reduce IT audit efforts on access by more than 50%
Prevent unauthorized access and fraud

Make IT Responsive to Business Change
Adopt reorganizations in days
IT doesn’t need to be an impediment to change

analyst feature n investment management

Risky Business Poorly selected, planned, and executed IT-related business projects can result in massive value destruction. PAUL WILLIAMS (ISACA) says it’s time to ask the challenging questions about IT investment management.


august 2009


Portfolio Management

We are all aware of the old advertising industry adage that half of advertising expenditure is wasted—the difficulty is in identifying which half. IT-related investments are the same—the trick is being able to recognize the winners and the losers, and take appropriate and timely action to manage the winners to success, and cancel or rationalize the losers at the earliest opportunity. Hence the current upsurge of interest in IT portfolio management. The CIO and the board need to understand that discretionary (i.e. non-regulatory or

mandatory) expenditure on IT is a business investment requiring similarly robust scrutiny as might be applied to, for example, the latest merger and acquisition opportunity, or the decision to build a new manufacturing plant. In each case, expenditure is being approved today with a view to delivering an acceptable return over a defined period, and at an appropriate level of risk. These investments need to be managed from inception through to value delivery. This implies a need for a formalized system of portfolio management, with regular, properly informed reporting of key metrics to the board. Despite the plethora of automated portfolio management solutions now on the market (although it is recognized that merely acquiring and implementing an automated solution is unlikely to be the whole answer), few organizations practice effective portfolio management and oversight. Therefore, it should come as no surprise that, all too often, such investments either fail completely or do not deliver the value originally anticipated. Indeed, experience and research tell us that, with management having so little focus on the benefits and returns expected or achieved from the IT investment portfolio, in perhaps the majority of cases it is impossible to judge whether or not success has been achieved.

Measuring Value

In current recessionary times, there is a renewed focus on this area as enterprises seek to reduce costs, with IT often seen as a prime candidate for budgetary constraint. The IT project portfolio in particular is seen as an obvious target where cost can be saved without directly impacting current service levels. In this context, a recently released survey from ISACA, a global association of 86,000 IT

governance professionals, has significant relevance, as it highlights many of the issues that undermine the effectiveness and reliability of current IT investment portfolio management practices. The global study surveyed more than 1,100 IT professionals who work in a wide diversity of industries, including manufacturing, healthcare, financial services, and some public sector and not-for-profit organizations. One particularly interesting finding was that while 84% of respondents claimed to measure, at least to some extent, the value returned from IT-related business investments, only 41% claimed to have a common understanding between different business departments and IT of what constituted value. Indeed, more than 60% stated that either there was no common understanding or they were unsure. This clearly raises the question of how value can be measured if no common understanding of value exists in the first place. Value management frameworks such as Val IT (available as a free download at www.isaca.org/valit) provide guidance to enterprises on the definition of value. Value generally comprises more than just financial return, although even the softer benefits arising from an investment should be quantified in financial terms if at all possible (and usually it is possible). Value will have different meanings between, for example, the expectations of a commercial enterprise and a charity or public sector entity. A commercial enterprise will normally look for value to comprise enhanced profitability and improved shareholder return, while a public sector enterprise will normally see value in terms of cost reduction or improved services to stakeholders. What is clear is that no investment should be contemplated unless there is unanimity or at least a


It has long been recognised that achieving measurable and sustainable value from investment in IT-enabled change is a challenge facing most enterprises. Industry analysts such as Gartner have stated that on average 20% of investment in IT is wasted. Research from Butler Group tells us that in many organizations less than 8% of the IT budget is actually spent on initiatives that bring value to the enterprise. Of course, even in the best managed and governed organizations, some degree of investment waste is inevitable. Progress is never made without risk, and the nature of risk means that some investments will fail. This is precisely why higher-risk projects should require a higher level of return (in the same way that financial institutions charge higher rates of interest on higher-risk loans). However, the enterprises that better manage their IT business-related investments on a formalized risk and return basis will minimize their exposure to such losses.
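The point about higher-risk projects needing a higher return can be made concrete with a small worked example (the 8% benchmark and the success probabilities are invented for illustration). If a project succeeds with probability p and is written off otherwise, the return on success r must satisfy p(1 + r) = 1 + benchmark for its expected outcome to match a low-risk alternative:

```python
def required_success_return(p_success, benchmark=0.08):
    """Return on success needed so the expected outcome matches the benchmark:
    p * (1 + r) = 1 + benchmark  =>  r = (1 + benchmark) / p - 1"""
    return (1 + benchmark) / p_success - 1

for p in (0.95, 0.75, 0.50):
    print(f"success probability {p:.0%} -> required return "
          f"{required_success_return(p):.1%}")
```

A coin-flip project must promise a 116% return just to match an 8% sure thing—which is why pricing risk explicitly, project by project, is worth the formality of portfolio management.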




shared understanding of the value that should arise from the investment.


Responsibility and accountability

Another interesting finding from the survey concerns the primary allocation of responsibility for the delivery of value from an IT-related business investment. The survey revealed that in almost 40% of responses the primary responsibility rested with the CIO. Given that these are business investments enabled or supported by IT, it’s strange that so many enterprises regard this as the responsibility of the CIO. Only just over one-third of organizations say the board, the CEO, or the CFO is responsible. Interestingly (and worryingly), 8% responded that no one in their organization is responsible. It is recognized that one of the key requirements for success from an IT-related investment is the allocation of responsibility and accountability for delivery of value to a senior business executive. This should be a board-level person for all business-critical and high-value investments. Given the responses to these questions, it is perhaps hardly surprising that the actual achievement of value is rather limited. Less than 20% of respondents indicated that more than 75% of the expected value was actually realized, while 32% believed that less than 50% of expected value was realized. Can this lamentable state of affairs be allowed to continue? Audit committees, non-executive directors, and investors should take note and start asking questions. The cancellation of projects is always an emotive subject, and here the survey had mixed results: 36% of respondents had cancelled projects within their enterprises for a number of reasons

Paul Williams

(FCA, CITP) Chair, ISACA Strategic Advisory Group, and IT Governance Advisor to Protiviti



including budget overruns, lack of expected value delivery, changing business needs, and lack of strategic alignment. Changing business needs was the most common reason for cancellation, indicating that some form of active portfolio management does take place within many enterprises to detect and react to changing circumstances. This is good news. However, to offset this, 45% indicated that no projects had been cancelled within their enterprises. Because it is unlikely that all initiated projects will ultimately deliver their original objectives, it is of some concern that so many enterprises may be in blissful ignorance of the value destruction that may be lurking in their portfolios. The quest for consistently active portfolio management continues.

Conclusions

Given the somewhat worrying results from this extremely useful research, and the potential for massive value destruction that it reveals, it is perhaps surprising that value management practices are rarely covered by any internal assurance processes, including internal audit. Although the research indicated that value management is on the internal audit agenda for more than 40% of respondents, my own anecdotal experience suggests that this number is a little optimistic. Massive value destruction arises from poorly selected, poorly planned, and poorly executed IT-related business projects. Sadly, this trend will continue until enterprises, their boards, their audit committees, and other stakeholders begin to take this significant waste of shareholder funds seriously.

Paul has been a member of ISACA for 25 years, serving originally as a board member and president of the London (UK) Chapter. Currently, Williams serves as chair of the ISACA/ITGI Strategy Advisory Group and is a member of the association’s Governance Advisory Council. He is a regular speaker at ISACA/IT Governance Institute conferences, and the author or co-author of a number of ISACA/ITGI publications, including two of the titles in the IT Governance Domain Practices and Competencies series. Paul is an independent consultant specializing in IT governance, IT due diligence, IT audit, and project risk management. He has acted as strategy advisor for SeaQuation, a spin-out company from ING, specializing in intelligent IT portfolio analysis, and he has also been advisor on IT governance activities for Protiviti, a risk management consultancy.



Coming up in the October issue

ETM is focusing on Business Intelligence











y is






Interested in contributing? If you are an analyst, consultant or an independent and would like to contribute to the Business Intelligence issue released in October, please contact me: Featured topics range from cost-effective solutions to future forecasting, and should be vendor-neutral in tone.



SECURITY 2009 CONFERENCE AND EXHIBITION DATES: 24 – 26 August 2009 LOCATON: Sydney, Australia URL: DATACONNECTORS WASHINGTON DC TECH-SECURITY CONFERENCE DATE: 27 August 2009 LOCATION: Washington DC, WA URL: events/2009/08WashingtonDC/agenda.asp DEVELOPING YOUR GRC TECHNOLOGY IMPROVEMENT BOOTCAMP DATE: 27 August 2009 LOCATION: Seattle, WA URL: workshops.html?view=21 VMWORLD 2009 DATES: 31 August – 3 September 2009 LOCATION: San Francisco, CA URL: conferences/2009

l to


C su



My CEO doesn’t know my name. And that’s the way I plan to keep it. Effective data security is key to preventing breaches, simplifying the compliance process and reducing risk to your organization. Let us help you focus your time, money and resources on more strategic projects, reduce the workload of securing critical information, and streamline compliance reporting for mandates such as PCI DSS, HIPAA, NERC, and Sarbanes-Oxley.

Our solution provides a multi-level approach to data security and compliance: • NetIQ® Security ManagerTM – from Log Management to Complete SIEM

• NetIQ® Secure Configuration ManagerTM – Compliance Assessment to Security Configuration Auditing

• NetIQ® Change GuardianTM – Privileged-User Management and File Integrity Monitoring

• NetIQ® Aegis® – the First IT Process Automation Platform for Security and Compliance

If you’d like to find out more about how NetIQ can help you with data security and critical compliance mandates, visit or contact

© 2009 NetIQ Corporation. All rights reserved. NetIQ, the NetIQ logo, NetIQ Security Manager, NetIQ Secure Configuration Manager, NetIQ Change Guardian, and NetIQ Aegis are trademarks or registered trademarks of NetIQ Corporation or its subsidiaries in the United States and other jurisdictions. All other company and product names may be trademarks or registered trademarks of their respective companies.

When litigation arises, will you have everything you need

at your fingertips? Access & discover relevant ESI—including email messages, files, and backup data—at a moment’s notice with Simpana® software. When faced with legal challenges, every minute matters. After all, opposing counsel isn’t going to wait while you sift through years of documents, trying to find relevant data. CommVault® Simpana® 8 makes eDiscovery simple by providing you with a range of information management capabilities to deliver efficient and intuitive legal access to corporate email messages, files, and collaboration data. Simpana 8 enables search and classification across all ESI including laptop/desktops, backup copies, managed legal preservation, and archive retention—all from a single console and single infrastructure. To find out more about the most comprehensive, risk-averse, and cost-managed eDiscovery solution in existence, go to






©1999-2009 CommVault Systems, Inc. All rights reserved. CommVault, the “CV” logo, Solving Forward, and Simpana are trademarks or registered trademarks of CommVault Systems, Inc. All other third party brands, products, service names, trademarks, or registered service marks are the property of and used to identify the products or services servi of their respective owners. All specifications are subject to change without notice.

Risky business  

Much as I would have liked to feature a picture of Tom Cruise on the cover of our July issue: “Risky Business,” he doesn’t impart our main m...

Read more
Read more
Similar to
Popular now
Just for you