Page 1


Storage in your DNA

When a nation with a population of more than a billion goes digital, the impact can be truly transformational. The project that best exemplifies this potential is Aadhar. Mammoth in scale and ambition, it is one of the most challenging projects from a storage and analytics perspective: the scope is to capture 12 billion fingerprints, 1.2 billion photographs, and 2.4 billion iris scans. Not surprisingly, the whole world is keenly watching the progress of this project, as it holds huge lessons for every country from both a technology and an impact point of view.

While Aadhar is undoubtedly the poster boy of Big Data in India, there are many smaller projects with the potential to create huge social transformation by leveraging the four mega trends in action today: Cloud, Social, Mobile and Big Data. These four mega trends are driving the digital transformation of India Inc. This is corroborated by a recent EMC-IDC study, which states that the digital universe in India will grow 23-fold between 2012 and 2020. The report notes that India's digital information explosion is being driven by the proliferation of devices such as PCs and smartphones, increased Internet access, and a boost in data from machines such as surveillance cameras and smart meters.

Given this pace of rapid information explosion, Indian enterprises are accelerating the use of technologies like deduplication, cloud-based storage and alternative storage media to cut costs and boost efficiencies, as our writers at InformationWeek discover. This razor-sharp focus has obviously yielded results: India manages data at a much lower cost than its global counterparts. The EMC-IDC study points out that India spent USD 0.87 per GB to manage data, much lower than the U.S., China, and Western Europe.
While the world, and India too, continues to grapple with storage-related challenges, a tectonic shift could happen in the way we look at storage today. Researchers at the European Bioinformatics Institute (EBI) have discovered a new way of storing information in synthetic DNA molecules. Researchers say that once this method is perfected, a gram of DNA could safely store the equivalent of one million CDs for 10,000 years. The potential of such a technology is huge, given the advantages of long-term preservation of data at a low cost. While it is too early to say whether DNA can be the future medium of storage, India remains at the cusp of an exciting wave of digitization that has the potential to transform every sector and industry as we see them today.
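The core idea behind DNA storage is that digital bits can be mapped onto DNA's four bases. As a hedged illustration only (the actual EBI scheme is far more elaborate, using base-3 Huffman coding, avoiding repeated bases, and adding heavy redundancy), a naive two-bits-per-base mapping looks like this:

```python
# Naive sketch: pack bytes into DNA bases at 2 bits per base.
# This only shows the core idea of digital data as a base sequence;
# it is NOT the EBI researchers' actual encoding.

BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BITS_FOR_BASE = {b: v for v, b in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Map each byte to four bases, most-significant bits first."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            bases.append(BASE_FOR_BITS[(byte >> shift) & 0b11])
    return "".join(bases)

def decode(dna: str) -> bytes:
    """Reverse the mapping: every four bases become one byte."""
    out = bytearray()
    for i in range(0, len(dna), 4):
        byte = 0
        for base in dna[i:i + 4]:
            byte = (byte << 2) | BITS_FOR_BASE[base]
        out.append(byte)
    return bytes(out)

msg = b"DNA"
assert decode(encode(msg)) == msg  # lossless round trip
```

At two bits per base, the density argument becomes concrete: DNA packs on the order of a zettabyte per gram in theory, which is why a gram storing a million CDs' worth of data is plausible.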


India manages data at a much lower cost than its global counterparts

Srikanth RP is Executive Editor of InformationWeek India.


informationweek march 2013

contents Volume








24 Cover Story How storage technologies are evolving to manage the data deluge The exponential increase in data within enterprises is creating a surge in storage requirements. Vendors are pursuing this opportunity by innovating and launching products around emerging storage technologies like flash-based SSDs, automated tiered storage, storage virtualization and cloud-based storage

29 Organizations opt for deduplication as cure for backup woes A look at how companies like HDFC Bank, Marico, Shoppers Stop and Royal Sundaram Alliance Insurance have considerably reduced their backup-related challenges by adopting deduplication technology

32 How CIOs are tackling the data explosion challenge Multi-fold increase in the size and complexity of data, coupled with a stringent regulatory environment, is compelling CIOs to relook at their storage strategies

Cover Design : Deepjyoti Bhowmik

Feature case study

35 Archival solution helps Yash Raj Films ensure protection of its film catalogue By adopting an archival solution from Dell, the studio has been able to archive all its films at a single location

56 How open source helped People Interactive save more than ₹ 80 lakh The firm that owns the popular Indian matrimony website has saved huge costs related to licenses and maintenance by deploying Ubuntu Linux on more than 800 desktops

Do you Twitter? Follow us at




Find us on Facebook at http://www.facebook.com/informationweekindia

Storage virtualization gets real Our four business scenarios show how to improve disaster recovery, boost disk utilization and speed performance

Why the cloud ecosystem needs common standards of measurement With a plethora of cloud computing service providers defining their offerings in their own distinct terminologies, Indian CIOs tell us why there is a strong need for developing common standards of measurement for cloud computing service delivery, and propose key areas where standards should be framed

If you’re on LinkedIn, reach us at groups?gid=2249272


interview 36 ‘Big Data of tomorrow will be about images, audio, and sensor data’

Steve Stavridis PlateSpin Product Marketing Manager, NetIQ

interview 38

52 interview

Christian J. Leeb-Hetzer Vice President - Storage Sales, IBM

13 14 16

How TVS Motors is using Shelf Engineering to push efficiency to a new level TG Dhandapani CIO, TVS Motors



Global Big Data revenue pegged at USD 11.4 billion; IBM leads all players

INDEX................................................................ 10

NASSCOM expects disruptive tech to propel growth

news analysis............................................. 18

Use of social networking can increase the risk of APT attacks, finds survey

cio voice..........................................57, 58, 59

Industry icons experience cyber attacks

opinion.....................................48, 62, 64, 65

WebNMS set to tap 100 million dollar M2M solutions market

event.............................................................. 66

11-year-old child develops Trojan: Are pre-teens writing malware?

cio profile................................................... 68

C-DAC launches India’s fastest supercomputer


‘DR in the cloud is the new trend’

Hubert Yoshida CTO, Hitachi Data Systems

‘Organizations are exploring alternatives to optimize their existing storage infrastructure’


40 interview

analyst angle........................................... 69 global cio................................................... 70

Mobile phishing attacks on the rise, reveals Trend Micro

practical analysis...................................71

Google uses 120 variables to stop hacking of accounts

down to business......................................72



VOLUME 2 No. 05 | March 2013

Managing Director : Joji George
Printer & Publisher : Kailash Pandurang Shirodkar
Associate Publisher & Director : Anees Ahmed
Editor-in-Chief : Brian Pereira
Executive Editor : Srikanth RP
Principal Correspondent : Jasmine Kohli
Correspondent : Varun Haran
Principal Correspondent : Ayushman Baruah (Bengaluru)
Senior Correspondent : Amrita Premrajan (New Delhi)
Copy Editor : Shweta Nanda

Design
Art Director : Deepjyoti Bhowmik
Senior Visualiser : Yogesh Naik
Senior Graphic Designer : Shailesh Vaidya
Graphic Designer : Jinal Chheda, Sameer Surve

Marketing
Marketing Head : Samta Datta

Online
Manager—Product Dev. & Mktg. : Viraj Mehta
Deputy Manager—Online : Nilesh Mungekar
Web Designer : Nitin Lahare
Sr. User Interface Designer : Aditi Kanade

Operations
Head—Finance : Yogesh Mudras
Director—Operations & Administration : Satyendra Mehra

Management Service : Jagruti Kudalkar

Sales
Mumbai, Manager-Sales : Ranabir Das (M) +91 9820097606; Marvin Dalmeida (M) +91 8898022365
Bengaluru, Manager—Sales : Kangkan Mahanta (M) +91 89712 32344; Sudhir K (M) +91 9740776749
Delhi, Manager—Sales : Rajeev Chauhan (M) +91 98118 20301; Sanjay Khandelwal (M) +91 9811764515

Production
Production Manager : Prakash (Sanjay) Adsul

Circulation & Logistics Deputy Manager

: Bajrang Shinde

Subscriptions & Database Senior Manager Database : Manoj Ambardekar Assistant Manager : Deepanjali Chaurasia

Head Office UBM India Pvt Ltd, 1st floor, 119, Sagar Tech Plaza A, Andheri-Kurla Road, Saki Naka Junction, Andheri (E), Mumbai 400072, India. Tel: 022 6769 2400; Fax: 022 6769 2426

IBM (02 & 03), Akamai (4), IBM (5), Kyocera (7), Ctrl S (11), Interop (15), Avnet (21), 4G World (47), CloudConnect (55), TFM&A (63), Cisco (73), Microsoft (74)

Amit Luthra, Dell............................................................30 Amit Malhotra, Oracle ................................................25 Anil Shankar, Shoppers Stop....................................30

International Associate Offices USA Huson International Media (West) Tiffany DeBie, Tel: +1 408 879 6666, Fax: +1 408 879 6669 (East) Dan Manioci, Tel: +1 212 268 3344, Fax: +1 212 268 3355 EMEA Huson International Media Gerry Rhoades Brown, Tel: +44 19325 64999, Fax: + 44 19325 64998

Arun Gupta, Cipla..........................................................58 B. S. Nagarajan, VMware ............................................26 Christian J. Leeb-Hetzer, IBM ...................................38 Daya Prakash, LG Electronics ...................................50 Deep Roy, NetApp.........................................................25 Deepak Varma, EMC.....................................................26 Dharmesh Rathod, Essar Group..............................65 Dilip Patil, Yash Raj Films Studio .............................35 Girish Rao, Marico .........................................................30 Harnath Babu, Aviva Life Insurance ......................33 Hubert Yoshida, Hitachi Data Systems .................36

Japan Pacific Business (PBI) Shigenori Nagatomo, Tel: +81 3366 16138, Fax: +81 3366 16139

KT Rajan, Allergan ........................................................34

South Korea Young Media Young Baek, Tel: +82 2227 34819; Fax : +82 2227 34866

Nataraj N, Hexaware ....................................................34

Mathew C George, Indian Oil Corporation ...............................................51 Nagarajan Krishnan, Cairn India .............................30

Priya Narayanan, Cairn India ....................................30 S Sridhar, Dell .................................................................26 Sajan Paul, Juniper Networks ..................................62

Printed and Published by Kailash Pandurang Shirodkar on behalf of UBM India Pvt Ltd, 6th floor, 615-617, Sagar Tech Plaza A, Andheri-Kurla Road, Saki Naka Junction, Andheri (E), Mumbai 400072, India. Executive Editor: Srikanth RP Printed at Indigo Press (India) Pvt Ltd, Plot No 1c/716, Off Dadaji Konddeo Cross Road, Byculla (E), Mumbai 400027. RNI NO. MAH ENG/2011/39874

ADVERTISERS’ INDEX Company name Page No.

Editorial index Person & Organization



Sanchit Vir Gogia, IDC..................................................26 Sandeep Dutta, IBM .....................................................25 Sanjay Poonen, SAP .....................................................54 Shamik Sharma, ..................................33 Sheshagiri Anegondi, Oracle India.........................64 Sriram Krishnan, ING Life Insurance.......................34 Steve Stavridis, NetIQ...................................................40 Subrat Mohanty, HDFC Standard Life Insurance ................................68 Surajit Sen, EMC ............................................................29 TG Dhandapani, TVS Motors ....................................52 Udayan Banerjee, NIIT Technologies.....................59 Vijay Sethi, Hero MotoCorp.......................................57 Vijyant Rai, CA Technologies ....................................29 Vishnu Bhat, Infosys......................................................46 Vivekanand Venugopal, Hitachi Data Systems...................................................28

Important Every effort has been taken to avoid errors or omissions in this magazine. In spite of this, errors may creep in. Any mistake, error or discrepancy noted may be brought to our notice immediately. It is notified that neither the publisher, the editor or the seller will be responsible in respect of anything and the consequence of anything done or omitted to be done by any person in reliance upon the content herein. This disclaimer applies to all, whether subscriber to the magazine or not. For binding mistakes, misprints, missing pages, etc., the publisher’s liability is limited to replacement within one month of purchase. © All rights are reserved. No part of this magazine may be reproduced or copied in any form or by any means without the prior written permission of the publisher. All disputes are subject to the exclusive jurisdiction of competent courts and forums in Mumbai only. Whilst care is taken prior to acceptance of advertising copy, it is not possible to verify its contents. UBM India Pvt Ltd. cannot be held responsible for such contents, nor for any loss or damages incurred as a result of transactions with companies, associations or individuals advertising in its newspapers or publications. We therefore recommend that readers make necessary inquiries before sending any monies or entering into any agreements with advertisers or otherwise acting on an advertisement in any manner whatsoever.



News | Big Data

Global Big Data revenue pegged at USD 11.4 billion; IBM leads all players

Wikibon, a technology research and advisory community focusing on Big Data and software-led infrastructure, recently released its second annual Big Data vendor revenue and market forecast report. The study pegs worldwide 2012 Big Data revenue at USD 11.4 billion, up 58 percent over 2011. The market is expected to reach USD 18.2 billion in 2013 (a growth of 61 percent) and nearly USD 50 billion by 2017, an overall 2012-2017 CAGR of 31 percent. Powered by services, IBM leads all players with more than USD 1.3 billion in Big Data revenue, while HP and Teradata follow to round out the top three. MarkLogic, Cloudera and 10gen top the Hadoop/NoSQL vendors by revenue. In 2012, professional services accounted for the largest share of the market at USD 3.9 billion, followed by compute at USD 2.4 billion and storage at USD 1.8 billion. The fastest-growing segments in 2012 were NoSQL databases and Big Data application software, both growing over 90 percent.

Awareness of Big Data benefits within industries beyond the web, most notably financial services, pharmaceuticals and retail, powered the 2012 Big Data market. The maturation of Big Data software such as Hadoop, NoSQL data stores, in-memory analytic engines and massively parallel processing databases is providing the infrastructure necessary to sustain substantial market momentum.

Moreover, increasingly sophisticated professional services practices that assist enterprises in practically applying Big Data hardware and software to business use cases are helping close the skills gap that separates Big Data visions from delivered project value. The Wikibon study finds that increased investment in Big Data infrastructure by massive web properties, most notably Google, Facebook and Amazon, and by government agencies for intelligence and counter-terrorism purposes, is paving the way for the acceleration of Big Data deployments. Wikibon projects the Big Data market to top USD 18 billion in 2013, a growth rate of 61 percent. Looking beyond 2013, Wikibon forecasts the total Big Data market to approach USD 50 billion by 2017, which translates to a 31 percent compound annual growth rate over the five-year period 2012-2017.
— InformationWeek News Network
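As a back-of-envelope check on the forecast figures (our sketch, not part of the Wikibon report), the quoted 31 percent CAGR and the 2012 base can be projected forward, and the implied CAGR for the USD 50 billion headline can be derived from the two endpoints:

```python
# Sanity-check the Wikibon figures: USD 11.4 bn (2012) compounded
# at a 31 percent CAGR for five years, versus the rounded
# "nearly USD 50 billion by 2017" headline.

def project(base: float, cagr: float, years: int) -> float:
    """Compound a base value at a constant annual growth rate."""
    return base * (1 + cagr) ** years

revenue_2012 = 11.4                       # USD billion
forecast_2017 = project(revenue_2012, 0.31, 5)
print(f"2017 projection at 31% CAGR: USD {forecast_2017:.1f} bn")  # ~44

# Implied CAGR if the market actually hits USD 50 bn in 2017:
implied_cagr = (50 / revenue_2012) ** (1 / 5) - 1
print(f"implied CAGR for USD 50 bn: {implied_cagr:.1%}")           # ~34%
```

A 31 percent CAGR lands near USD 44 billion, so "nearly USD 50 billion" reflects rounding in the report's headline numbers rather than an exact compounding of them.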

News | IT-ITeS

NASSCOM expects disruptive tech to propel growth

Changing business models, the emergence of new technologies, new buyer segments and solutions for emerging markets will help India retain its position as the global sourcing leader and an emerging, trustworthy innovation hub. These and many other trends were revealed at the NASSCOM Strategic Review 2013. An increase in global technology spending and opportunities created through the adoption of disruptive technologies are expected to propel growth in FY2014. NASSCOM expects the industry to clock export revenues of USD 84-87 billion, maintaining a growth rate of 12-14 percent. Domestic revenues will also grow at a rate of 13-15 percent and are expected to reach ₹ 1,180-1,200 billion. Some of the key growth drivers expected to open new opportunities for the industry are smart computing, 'anything'-as-a-service, technology enablement in emerging verticals and the SMB market.

"Technology can also play a critical role in enabling transformation in India. The domestic market in India is maturing, it was the fastest growing market in the year and NASSCOM will look to partner with the government in enhancing technology adoption," said Som Mittal, President, NASSCOM.

India is the only country that offers the depth and breadth of offerings across the different segments of this industry: IT Services, BPM, Engineering & R&D, Internet & Mobility and Software Products. IT Services is a USD 50 billion sector, BPM is a USD 20 billion sector, Engineering has crossed USD 10 billion, and Software Products and Internet & Mobility are emerging opportunities.

Krishnakumar Natarajan, Vice Chairman, NASSCOM, said, "The rapid adoption of Internet and mobile is creating enormous opportunities for entrepreneurship in the country. A growing ecosystem of early stage funding, incubation and peer learning is creating innovative start-ups building technology solutions and products for India and the global market. Initiatives like creating a strong and robust ecosystem for start-ups, innovation clusters and CoEs will encourage entrepreneurship and build the next generation of global companies from India."
— InformationWeek News Network


Use of social networking can increase the risk of APT attacks, finds survey

A global cybersecurity survey of more than 1,500 security professionals found that more than one in five respondents said their enterprise has experienced an advanced persistent threat (APT) attack. According to the study by global IT association ISACA, 94 percent said APTs represent a credible threat to national security and economic stability, yet most enterprises are employing ineffective technologies to protect themselves. More than 60 percent of survey respondents said that it is only a matter of time before their enterprise is targeted.

Over 96 percent of respondents said they are at least somewhat familiar with APTs. While this is a positive finding, 53 percent of respondents said they do not believe APTs differ from traditional threats, indicating that many do not fully understand APTs. "APTs are sophisticated, stealthy and unrelenting," said Christos Dimitriadis, Ph.D., CISA, CISM, CRISC, International VP, ISACA, and Head of Information Security at INTRALOT Group. "Traditional cyberthreats often move right on if they cannot penetrate their initial target, but an APT will continually attempt to penetrate the desired target until it meets its objective — and once it does, it can disguise itself and morph when needed, making it difficult to identify or stop."

More than 60 percent of survey respondents said they are ready to respond to APT attacks. However, anti-virus and anti-malware (95 percent) and network perimeter technologies such as firewalls (93 percent) top the list of controls their enterprises are using to stop APTs — a concerning finding, given that APTs are known to evade these types of controls. Loss of enterprise intellectual property was cited as the biggest risk of an APT (by more than a quarter of respondents), followed closely by loss of customer or employee personally identifiable information (PII).
Interestingly, 90 percent of respondents believe that the use of social networking sites increases the likelihood of a successful APT, while 87 percent believe BYOD, combined with rooting or jailbreaking the device, makes a successful APT attack more likely. More than 80 percent said their enterprises have not updated their vendor agreements to protect against APTs. — InformationWeek News Network


Statement about ownership and other particulars (extract):

Address: Sagar Tech Plaza, A 615-617, 6th floor, Andheri Kurla Road, Saki Naka Junction, Andheri (E), Mumbai 400 072, India

6. Names and Addresses of individuals who own the newspaper/magazine and partners or shareholders holding more than one per cent of the total capital:

UBM India Pvt Ltd, Sagar Tech Plaza, A 615-617, 6th floor, Andheri Kurla Road, Saki Naka Junction, Andheri (E), Mumbai 400 072, India

Stormcliff Limited, Julia House, 3, Themistocles Dervis Street, 1066, Nicosia, Cyprus



News | Security

Industry icons experience cyber attacks

It seems that hackers have been wanting to make 2013 a special year. This year saw cyberattacks on three of the most iconic companies in the industry: Twitter, Facebook and Microsoft.

Twitter announced in a blog post that it detected unusual access patterns that led it to identify unauthorized access attempts to Twitter user data. Bob Lord, Director of Information Security, Twitter, said in the post, "We discovered one live attack and were able to shut it down in process moments later. However, our investigation has thus far indicated that the attackers may have had access to limited user information — usernames, e-mail addresses, session tokens and encrypted/salted versions of passwords — for approximately 250,000 users."

After Twitter, it was the turn of Facebook. The company revealed in a blog post that its systems were targeted in a sophisticated attack. "This attack occurred when a handful of employees visited a mobile developer website that was compromised. The compromised website hosted an exploit, which then allowed malware to be installed on these employee laptops. The laptops were fully patched and running up-to-date anti-virus software. As soon as we discovered the presence of the malware, we remediated all infected machines, informed law enforcement, and began a significant investigation that continues to this day," the company said in the blog post.

Next was the turn of software giant Microsoft. In a statement on the Microsoft Security Response Center blog, Microsoft's Matt Thomlinson, GM, Trustworthy Computing Security, said, "As reported by Facebook and Apple, Microsoft can confirm that we also recently experienced a similar security intrusion. During our investigation, we found a small number of computers, including some in our Mac business unit, that were infected by malicious software using techniques similar to those documented by other organizations."
— InformationWeek News Network

News | Software

WebNMS set to tap 100 million dollar M2M solutions market

Have you ever wondered who manages the ATMs you withdraw cash from? The banks do not own or manage any of them. They usually outsource them to third-party operators, who are paid on a per-transaction basis. These operators incur huge operating expenses, as a single ATM room typically houses a number of passive assets: two air conditioners, two illuminated signage boards, an inverter/UPS, a security camera and at least eight to 12 light bulbs. Evidently, it is a challenging task to manage all of these elements efficiently.

WebNMS, a telecom network management company and division of Zoho, recently introduced a machine-to-machine (M2M) solution to help these operators manage ATMs efficiently. The solution, named ATM Site Manager, enables assets at individual ATM centers to communicate with a centralized control system. "This solution is primarily to enable remote management of passive assets and increase preventive maintenance with less human intervention in many routine service operations," says Prabhu Ramachandran, Director, WebNMS.

The product, a combination of hardware and software, also generates insights for ATM operators from the way assets operate and consume power. "Based on the analytics derived from the solution, we help the operators with benefits such as remote security, reduction in energy bills, overall cost optimization and predictive maintenance," says Ramachandran.

With an estimated 100,000 ATMs in India growing at least 30 percent per annum, the M2M market is big and emerging. According to 6Wresearch, in a report titled India Machine to Machine (M2M) Modules Market (2011-16), the Indian M2M solutions market is expected to reach USD 98.38 million by 2016, a CAGR of 33.81 percent over 2011-16. Ramachandran believes M2M is the next big wave in the IT industry, perhaps just after cloud computing. ATM Site Manager, launched in December 2012, is currently at the proof-of-concept (PoC) stage, and WebNMS is working with many ATM operators who are potential customers. Going forward, the company also plans to undertake monitoring of mobile towers, power grids, and solar/wind mill farms.
—Ayushman Baruah
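The architecture described, with site assets reporting readings to a centralized control system that flags preventive maintenance, can be sketched in miniature. All field names, thresholds and the JSON payload shape below are invented for illustration; they are not WebNMS's actual format:

```python
# Toy sketch of M2M telemetry from an ATM site to a central
# controller, in the spirit of the architecture described above.
# Field names and thresholds are hypothetical.

import json

def site_report(site_id: str, readings: dict) -> str:
    """Serialize one polling cycle's readings for the central system."""
    return json.dumps({"site": site_id, "readings": readings})

def flag_maintenance(readings: dict) -> list:
    """Flag passive assets whose readings suggest preventive maintenance."""
    alerts = []
    if readings.get("ac_temp_c", 0) > 30:
        alerts.append("air conditioner underperforming")
    if readings.get("ups_battery_pct", 100) < 20:
        alerts.append("UPS battery low")
    return alerts

readings = {"ac_temp_c": 33.5, "ups_battery_pct": 15.0}
print(site_report("ATM-0042", readings))
print(flag_maintenance(readings))  # both alerts fire
```

The point of the pattern is that rules like these run centrally across thousands of sites, turning raw readings from passive assets into the "less human intervention" maintenance workflow the article describes.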

News | Software


11-year-old child develops Trojan: Are pre-teens writing malware?

In a world filled with laptops, tablets and smartphones, today's children become digitally fluent far earlier than previous generations. Now, AVG Technologies has found evidence that pre-teens are writing malware designed to steal login details from online gamers, both young and old. This was discovered by AVG and disclosed in its Q4 2012 Community Powered Threat Report. Among other malicious software developments, the most striking was the case of a Trojan developed by an 11-year-old child to steal game login information.

"We have now seen a number of examples of very young individuals writing malware, including an 11-year-old from Canada," said Yuval Ben-Itzhak, Chief Technology Officer at AVG Technologies. "The code usually takes the form of a basic Trojan written using the .NET framework, which is easy to learn for beginners and simple to deploy via a link in an e-mail or posted on a social media page."

While stealing someone's game logins may at first seem a minor problem, online gaming accounts are often connected to credit card details to enable in-game purchases, and may also have virtual currency attached to them amounting to hundreds of dollars. Furthermore, many gamers unfortunately use the same login details for social networks such as Facebook and Twitter, potentially putting the victim at risk of cyber-bullying, in addition to identity theft and major inconvenience.

"We believe these junior programmers are motivated mainly by the thrill of outwitting their peers, rather than financial gain, but it is nevertheless a disturbing and increasing trend. It is also logical to assume that at least some of those responsible will be tempted to experiment with much more serious cybercrimes," added Ben-Itzhak.



Mobile threats continue to rise

The Q4 Threat Report also highlights the dramatic and ongoing increase in mobile malware, particularly code designed to target Google's hugely popular Android operating system. During the course of 2012, AVG Threat Labs reported on the first Android rootkit, examples of mobile banking being targeted for attack, malicious apps that send text messages to premium-rate services, and Trojan-infected versions of popular games on unofficial app stores, including the bestseller Angry Birds Space. Mobile threats also feature in the Threat Report's predictions for 2013, notably in the form of increased MITMO (Man-In-The-Mobile) attacks that target PC and mobile Internet banking apps. Such threats might benefit from the growing BYOD trend, where workers connect their personal mobile devices to company networks.

Continued exploitation

Alongside the rise in mobile malware, AVG Threat Labs found that exploit toolkits continue to dominate when it comes to online threats. Almost 60 percent of all threat activity online was performed by exploit toolkits in 2012. The use of such kits is believed to be the result of established cybercriminals realizing that they can create and sell commercial toolkits at a premium to less technically savvy peers eager to get into the market. One example of a new exploit toolkit which emerged during the last quarter of 2012, and bore a remarkable resemblance to the Blackhole Exploit Kit, was the Cool Toolkit. This new toolkit accounted for 16 percent of the top web threats in Q4 2012, topped only by Blackhole at 40 percent. — InformationWeek News Network

C-DAC launches India's fastest supercomputer

J Satyanarayana, Secretary, Department of Electronics and Information Technology (DeitY), Govt of India, launched PARAM Yuva II, the new 500-TeraFlop version of the earlier PARAM Yuva, at C-DAC Pune. With this launch, C-DAC also becomes the first R&D institution in India to cross the 500 TF milestone. The launch of PARAM Yuva II was conducted as part of the Workshop on the National Mission on Supercomputing organized by C-DAC, Pune.

PARAM Yuva II provides more than half a Petaflop of raw compute power using hybrid compute technology with compute co-processors and hardware accelerators. The interconnection network comprises the home-grown PARAMNet-III and an InfiniBand FDR system area network. PARAM Yuva II has 200 Terabytes of high-performance storage and support software for parallel computing. With this launch, C-DAC has taken a quantum jump towards creating a general-purpose, research-oriented computational environment.

The system achieved a sustained performance of 360.8 Teraflop/s on the community-standard Linpack benchmark. Compared to the November 2012 list of the world's Top 500 supercomputers, PARAM Yuva II would have stood at 62nd position in the world and at number 1 in the country. In terms of power efficiency, it is better than most supercomputing systems and would have achieved 33rd position in the November 2012 list of the Top Green 500 supercomputers of the world.
—InformationWeek News Network
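Linpack results are usually read as sustained performance (Rmax) against theoretical peak (Rpeak). The article does not state the exact Rpeak, so taking the headline 500 TF figure as an assumed peak gives a rough efficiency estimate:

```python
# Rough Linpack efficiency for PARAM Yuva II from the article's
# headline numbers. The exact theoretical peak (Rpeak) is not
# given, so 500 TF is an assumption; the true figure depends on
# the machine's actual Rpeak.

r_max = 360.8    # sustained Linpack performance, Teraflop/s
r_peak = 500.0   # assumed theoretical peak, Teraflop/s

efficiency = r_max / r_peak
print(f"Linpack efficiency: {efficiency:.1%}")   # roughly 72%
```

An efficiency above 70 percent would be respectable for a hybrid (co-processor and accelerator) machine of that era, where sustained Linpack often falls well below peak.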


Mobile phishing attacks on the rise, reveals Trend Micro

With the rise in popularity of mobile devices, hackers have now found a new platform. Research by Trend Micro reveals that mobile phishing attacks are on the rise, targeting popular websites such as PayPal, eBay, Bank of America, Barclays, Wells Fargo and SFR (Societe Francaise du Radiotelephonie). In a post on Trend Micro's Security Intelligence blog, the firm's Gelo Abendan noted, "For 2012, we found 4,000 phishing URLs designed for the mobile web. Though this number represents less than 1 percent of all the phishing URLs gathered that year, this highlights that mobile devices (smartphones, tablets and the like) are valid platforms to launch phishing attacks."

Abendan says that cybercriminals use phishing sites, which are spoofed versions of legitimate sites, to trick users into disclosing sensitive information like usernames, passwords, and even account details. "What's more worrisome is the kind of websites these phishing attacks spoof. In 2012, 75 percent of mobile phishing URLs were rogue versions of well-known banking or financial sites. Once users are tricked into divulging their login credentials to these sites, cybercriminals can use the stolen data to initiate unauthorized transactions and purchases via the victim's account," said Abendan.

A portion of these phishing sites were designed to spoof social networking sites (2 percent) and online shopping sites (4 percent). The small number of phishing sites for social media may be attributed to users' preference for social media apps: because users are unlikely to visit social networking sites via the mobile web, launching phishing equivalents of these pages may not be an effective way to target users. These numbers are consistent with the top 10 most-phished entities, the majority of which are banking or credit card websites.

Trend Micro says the trend of launching phishing attacks on mobile devices can be attributed to certain limitations of the platform itself. These include the small screen size of most mobile devices, which prevents users from fully inspecting websites for anti-phishing security elements. With the majority of mobile devices using default browsers, it is also easier for cybercriminals to create schemes, as they need to focus on one browser instead of many.
— InformationWeek News Network


Google uses 120 variables to stop hacking of accounts

2013 has truly turned out to be the year of hacking. This year, two of the industry’s most iconic companies, Twitter and Facebook, have been hacked. While there is no certain way to prevent attacks, as hackers keep evolving their techniques, search engine giant Google revealed in a blog post the approach it is taking to prevent them. For example, as spam filters have become extremely powerful, spammers have changed their tactics to make sure their messages get delivered to your inbox. Spammers are now trying to break into the accounts of legitimate users with a known trick — buying credential databases put up for sale on the black market. As many people use the same password on many websites, stolen passwords are often valid on other websites too. “With stolen passwords in hand, attackers attempt to break into accounts across the web and across many different services. We’ve seen a single attacker using stolen passwords to attempt to break into a million different Google accounts every single day, for weeks at a time. A different gang attempted sign-ins at a rate of more than 100 accounts per second,” said Mike Hearn, Google Security Engineer, in a blog post. To prevent such hacks, Google’s security system does more than just check whether a password is correct. Every time someone signs in to Google, the system performs a complex risk analysis to determine how likely it is that the sign-in really comes from the original user. “There are more than 120 variables that can factor into how a decision is made. If a sign-in is deemed suspicious for some reason — maybe it’s coming from a country oceans away from your last sign-in — we ask some simple questions about your account. For example, we may ask for the phone number associated with your account, or for the answer to your security question. Using security measures like these, we’ve dramatically reduced the number of compromised accounts by 99.7 percent since the peak of these hijacking attempts in 2011,” writes Hearn.

— Srikanth RP

March 2013 InformationWeek 17

News Analysis

Why is India’s UID Aadhar a Big Data challenge and opportunity?

Building the world’s largest biometric identity platform for authenticating the identity of 1.2 billion residents is a Big Data challenge and a big opportunity for improving governance

By Srikanth RP

Everything about India’s UID project, or Aadhar as it is commonly known, is ambitious. Giving a unique identity to 1.2 billion residents is a challenging task. No country has attempted a project of this scale, which is why it is being watched keenly by everyone — not only in India, but in the rest of the world too. Let’s look at some interesting facts about Aadhar. The scope is to capture 12 billion fingerprints, 1.2 billion photographs, and 2.4 billion iris scans. The file size for each enrollment is approximately 5 MB. Summed over 1.2 billion people, the total would be measured in petabytes. And that is just the storage part. Another noteworthy aspect of the Aadhar project is that the UID team wants to ensure that every record is indeed unique. To ensure this, before every new ID is issued, it is checked against the existing database. This process, called ‘deduplication’, ensures that no person gets more than one identity number. This is where it gets even more interesting. Till date, the Unique Identification Authority of India (UIDAI) has issued approximately 25 crore UIDs. On a daily basis, the UIDAI issues over 1 million Aadhar cards. This means that each day, 1 million records have to be checked against the existing database of 25 crore IDs, and the database is only going to get larger every day. Additionally, as the UID database can be used for verifying identification, it receives more than 100 million authentications per day. Going forward, as more government-based



systems access this platform to authenticate their customers or stakeholders, the number of authentication-based queries will also shoot up exponentially. “UID will help us build a nationally verifiable ID and guarantee uniqueness through biometric deduplication. This means a person’s identification can be easily verified and authenticated using any device such as a cellphone or a laptop that is connected to the Internet,” says Dr A B Pandey, Deputy Director General, Unique Identification Authority

of India (UIDAI). Pandey spoke about this topic extensively at the recent CSI IT2020 conference held in Mumbai. Big Data is characterized by three V’s: Volume, Variety and Velocity. Aadhar exhibits all three. The size of the data is huge, and it is estimated that this database will be 10 times larger than the largest existing biometric database, created by the FBI in the U.S.
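The scale quoted above can be checked with some back-of-envelope arithmetic. A rough sketch, assuming the figures cited in this article (about 5 MB per enrollment packet, 25 crore issued IDs, 1 million enrollments per day):

```python
# Back-of-envelope scale estimates for the Aadhar figures quoted above.
# Assumption: ~5 MB per enrollment packet, 1.2 billion residents.

MB = 10**6  # bytes, decimal megabyte
residents = 1_200_000_000
enrollment_size = 5 * MB

total_bytes = residents * enrollment_size
total_pb = total_bytes / 10**15
print(f"Total enrollment data: {total_pb:.0f} PB")

# Deduplication workload: each of ~1 million daily enrollments is
# checked against the ~25 crore (250 million) IDs already issued.
daily_enrollments = 1_000_000
existing_ids = 250_000_000
comparisons_per_day = daily_enrollments * existing_ids
print(f"1:N comparisons per day: {comparisons_per_day:.2e}")
```

This is why the article calls deduplication the hard part: even at 6 PB, storage is a solved problem compared with performing hundreds of trillions of biometric comparisons per day, which is only feasible with heavy indexing and parallelism.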

Finding the true value of data

Once the system is fully developed, it can become the foundation or the platform to check an individual’s identity

across multiple services such as free education, public distribution systems and pension schemes. Dr Pandey explains this with the help of an example: “The Maharashtra Government used to spend close to ₹ 1,500 crore per year on scholarships to students. They took a simple decision to link student bank accounts to scholarships and transfer the scholarship amount directly to their bank accounts. By taking this simple step, the expenditure on scholarships came down drastically to ₹ 900 crore per year.” The same principle of authentication via UID can be used for PDS, old age pension schemes and other welfare schemes. For example, in the case of NREGA, verification can be done by asking the concerned individual to place any finger on a biometric device. This information can be immediately checked and authenticated against the information available in the system. Similarly, given the government’s aim of promoting financial inclusion, micro-ATMs can be used to dispense cash in rural areas against fingerprint-based authentication based on the UID. The possibilities are limitless, given that the platform’s usage is limited only by the imagination of the government agencies that can develop innovative applications on top of it. From providing insurance to the poor, ensuring health care to the bottom of the pyramid and providing banking facilities to the unbanked population, UID can be the game changer that a country like India wants today.

— Srikanth RP
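The yes/no authentication flow described in this article can be sketched in a few lines. Everything here is invented for illustration (the UID numbers, the function names, and the hash-based matcher — real biometric matching uses similarity scoring against templates, not exact hashes), but it shows the shape of the exchange: the device sends an ID plus a biometric sample, and the service answers only yes or no, never returning stored data.

```python
# Hypothetical sketch of a UID-style 1:1 authentication check (not the
# actual UIDAI API). A hash comparison stands in for the real biometric
# similarity matcher, purely to show the request/response flow.
import hashlib

# Toy enrollment store: UID number -> digest of the enrolled template.
enrolled = {"9999-0001": hashlib.sha256(b"alice-fingerprint").hexdigest()}

def authenticate(uid: str, template: bytes) -> bool:
    """Return True only if the presented sample matches the enrollment."""
    stored = enrolled.get(uid)
    return stored is not None and stored == hashlib.sha256(template).hexdigest()

print(authenticate("9999-0001", b"alice-fingerprint"))  # True
print(authenticate("9999-0001", b"someone-else"))       # False
```

The key design point, which the sketch preserves, is that the service is a verifier, not a data source: a micro-ATM or PDS terminal learns whether the person is who they claim to be, and nothing else.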

News Analysis

Is BlackBerry 10 right for you?

Use these smartphone decision points to compare BlackBerry 10 to its Apple, Android and Windows Phone rivals

By Eric Zeman


BlackBerry Z10 was launched amid much fanfare globally. Let’s weigh the pros and cons of the smartphone platform before taking that crucial leap to adopt it. Here’s a look at some of the strengths and weaknesses of the new platform from the company formerly known as RIM, as compared to rival operating systems from Apple, Google, and Microsoft. First, some basics. BlackBerry 10 and the Z10 and Q10 will be widely available from U.S. carriers. AT&T, Sprint, T-Mobile USA, and Verizon Wireless have all committed to selling the devices when they are ready. Pricing will be reasonable at about USD 149 and USD 199 for the Q10 and Z10, respectively. Both phones will support all four carriers’ LTE 4G networks. In other words, you’ll have your choice of carrier and won’t have to worry about missing features. Out of the box, BB10 covers the essentials: e-mail, social networking, messaging, basic apps (think calculator, alarms, file management), and of course contacts and calendar management. It has a good browser and an app store, and can play back your music and video content. Android, iOS, and Windows Phone all offer these same features. You’re not going to lose any of these basics by switching from another platform to BB10. So, what will you lose by switching? If you’re coming from Android, you’re going to lose device choice. True, the Z10 and Q10 offer a small selection, but Android wins hands-down when it comes to variety of form factors. Android devices are big and small, cheap and expensive, rugged and high class. You’re going to lose access to apps. You’re also going to lose native support for Google’s services. Yes, BB10 supports Gmail, Google Contacts, and Google Calendar via IMAP, and BlackBerry developed its own versions of Google Talk and YouTube for BB10 devices, but that is as far as it goes. There’s no Google+, no Google Maps, no Google Drive, no Google Docs, no Google Voice, no Google Search / Google Now.
Sure, you can access some of these through the BB10 browser, but you and I both know browser-based apps and native apps are two different things. If your business has “gone Google,” switching to BB10 simply doesn’t make sense. If you’re coming from iOS, you’re losing access to 800,000 apps. BlackBerry World has about 70,000, many of which are ported Android apps. The selection just isn’t there, yet. You’re also losing access to many of the same Google services that are available on Android. You’re losing access to an incredible array of accessories. Devices such as the iPhone have more accessories available than any other device on the market. Being so new, BB10 does not yet have such accessories, and there’s no telling if, or when, it will catch up. If you’re coming from Windows Phone, you’ll lose some device choice and access to apps, too. Windows Phone 8, which launched during the fourth quarter of 2012, is available from many U.S. carriers in a wide variety of devices, colors and price points. You’ll lose tight integration with other Windows equipment and services, including Xbox gaming. The app story isn’t as severe as it is with Apple and Google, but there are plenty of marquee apps missing from BlackBerry World that are available in the Windows Phone Store (Amazon, CNN and eBay, to name a few). Does BlackBerry 10 get you anything not offered by Android, iOS and Windows Phone? Very little. BlackBerry 10 offers BBM, its wildly popular instant messaging service. BBM has been duplicated by Apple, Google and Microsoft, though. BlackBerry offers BES 10 to enterprises, while Apple and Google do not offer such device management tools. BES 10 can also be used to control Android and iOS devices. If your business is already invested in BES, and has historically used BlackBerries, BB10 makes a lot more sense. BlackBerry 10 supports NFC; Android and Windows Phone do too, but iOS does not. BlackBerry 10 supports wireless charging; again, Android and Windows Phone do too, but iOS does not. The Q10 offers a QWERTY keyboard. iOS and Windows Phone don’t support QWERTY keyboards, but Android does. This is an important factor for many business users, for sure. It is important to point out that the app gap will be closed over time, though it could be months and months before BlackBerry catches up with the best apps.
BlackBerry has also said that it will ship up to six devices this year, varying in price. That will help close the gap with Google and Microsoft in terms of device selection — eventually.

Source: InformationWeek USA


Infographic: How the exploding digital universe is impacting data storage

- The digital universe will reach 40 ZB by 2020. That amount of data is equivalent to 57 times the amount of all the grains of sand on all the beaches on Earth.
- If we could save all 40 ZB onto today’s Blu-ray discs, the weight of those discs (without any sleeves or cases) would be the same as 424 Nimitz-class aircraft carriers.
- The current global breakdown of the digital universe is: U.S. 32 percent, Western Europe 19 percent, China 13 percent, India 4 percent, rest of the world 32 percent.
- Between 2012 and 2020, the digital universe in India will grow 23-fold.
- India is currently investing USD 0.87 per GB to manage the digital universe.
- Storage is emerging as a strategic investment priority; increased activity among key industry verticals such as BFSI, manufacturing, telecom, government, IT/ITeS, and media and entertainment boosts the market by a large extent. (Source: Netscribes)
- The presence of around 50 million SMBs in India further fuels growth in the storage market. (Source: Netscribes)

Source: IDC’s Digital Universe Study, sponsored by EMC, December 2012 (“Big Data, Bigger Digital Shadows, and Biggest Growth in the Far East – India”)

Cover Story

How storage technologies are evolving to manage the data deluge

Exponential increase in data within enterprises is creating a surge in storage requirements. Vendors are pursuing this opportunity by innovating and launching products around emerging storage technologies like flash-based SSDs, automated tiered storage, storage virtualization and cloud-based storage

By Amrita Premrajan


With enterprises witnessing dramatic growth in structured and unstructured data, analyzing these huge data sets to extract critical insights has emerged as a top business priority. A recent IDC research report says that extracting value from the expanding universe of digital information is becoming a core business mandate, with organizations spending more time and money collecting, storing, and monetizing fast-growing data sets and rich pools of content for the foreseeable future. To enable their organizations to unlock the business value of data, CIOs are confronted with the challenge of designing



and maintaining a robust storage architecture that can effectively support the spiraling data needs of the company. This involves not only forecasting and mapping storage requirements in tandem with the enterprise’s growth, but also managing the entire storage architecture cost-effectively and optimally. The point is corroborated by a recent IDC report, which highlights that the continued expansion of business-critical information and rich content within extended enterprises is changing the storage dynamic in a wide range of industries. Perceiving the storage-linked challenges that the information explosion is set to bring, technology vendors

have been working on solutions that make their storage systems not just robust but also intelligent and flexible enough to fit the specific storage needs of different enterprises. Let’s look at some of the emerging storage technologies that are becoming increasingly relevant today.

Flash-based Solid State Drives

Traditionally, the Hard Disk Drive (HDD) was the storage technology widely used in enterprises. But in the past few years, the flash-based Solid State Drive (SSD), previously adopted extensively in consumer technology products, has started making inroads into the enterprise IT environment. SSD technology started finding application within enterprise IT because it delivers much higher performance — that is, far greater Input/Output (I/O) data rates — than the conventional HDD. This is because SSDs are built on solid-state semiconductor technology, unlike HDDs, which are spinning disks that introduce noticeable delays in data retrieval. “SSDs that have evolved as flash drives have now created a new use-case scenario in the industry where you can bring in high performance and bring down the cost of data center and power and cooling. This is due to the fact that this technology can handle a larger part of the workload, without really increasing the cost of data center operations,” says Deepak Varma, Regional Head-Presales, EMC, India & SAARC. The entry of flash-based SSDs into the enterprise IT environment has become highly relevant for certain dynamic, high-volume workload industry verticals where business applications need to answer queries with extremely low latency. “Flash technology has fundamentally changed the paradigm of storage systems and is enabling new use cases for essential enterprise applications and solutions by enhancing their performance, efficiency, and design,” says Sandeep Dutta, Country Manager, Storage, IBM India/South Asia. Citing an example of a high-volume workload industry vertical where SSDs are increasingly becoming relevant, Amit


Malhotra, Director, Storage Sales, JAPAC, Systems Division, Oracle Corporation, says, “In intense trading environments of capital markets, where every millisecond taken to respond to transactions counts, SSDs are becoming highly relevant as a preferred storage technology to ensure high performance of the trading-linked business applications.” Another example where SSDs are becoming applicable is e-commerce websites, where at any given point in time a large number of users stream in and search for products of their choice. In such a scenario, it is essential for the business to ensure that when a user submits a search request, the page of results appears without any noticeable delay. Ensuring such a quick response time means that data needs to be read from back-end storage at very high speed, which SSDs do effectively. “Solid State Technology will continue to be applied broadly to accelerate a wide range of workloads, from virtualized servers and desktops to online transaction processing (OLTP) to file services,” adds Deep Roy, Consulting Systems Engineer, Technology Solutions Organization APAC, NetApp.
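The latency gap between the two media is what drives the performance claims above. A rough, illustrative calculation (the access times used here are typical textbook figures, not vendor measurements):

```python
# Serial random-read throughput implied by device access latency.
# Assumed, typical figures: ~8 ms for an HDD seek plus rotational
# delay, ~100 microseconds for a flash read.
hdd_latency_s = 0.008
ssd_latency_s = 0.0001

def max_iops(latency_s: float) -> int:
    """Upper bound on back-to-back random reads per second."""
    return round(1 / latency_s)

print(max_iops(hdd_latency_s))  # 125 reads/s for the spinning disk
print(max_iops(ssd_latency_s))  # 10000 for the SSD, an ~80x gap
```

Real-world IOPS also depend on queue depth and parallelism; flash widens the gap further when requests are issued concurrently, since there is no mechanical head to contend for.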


Understanding the relevance and application of SSDs within enterprises, and the large enterprise storage market waiting to be tapped, more and more technology vendors have started focusing their energies on building innovations around SSDs.

Innovations around SSDs

Within enterprise IT environments, there are always certain business-critical applications that need quick response times and are used much more frequently than non-critical business applications. However, ensuring higher performance for the business-critical applications has been one of the biggest challenges for CIOs, as both kinds of apps take the same path over the network and access the same back-end storage systems. An interesting application of flash technology that addresses this issue is the server flash caching solution: intelligent software that makes a copy of the hottest data accessed from the storage system and keeps it on a flash-based SSD sitting in the server. This means the cache is located closer to the application, within the server rather than on the storage arrays. “If there is a particular business application on the server, which is most hungry in terms of data access or performance, using the server flash caching solution we bring this data that was residing in the storage system closer to the server. This improves the overall response time of the application to the business,” says Varma of EMC. Though flash technology has its performance benefits, one of the


biggest roadblocks for enterprises in adopting this technology as a mainstream storage option has been that SSDs are more expensive than traditional storage technologies. To address this issue, Hitachi Data Systems last year introduced Accelerated Flash Storage, an innovative flash memory controller technology. “We have built our own multicore flash controller technology and we have integrated the controller with the flash modules under Hitachi Accelerated Flash storage. This meets the customer demands for higher capacity and performance but at a lower cost per bit, by being able to increase performance of multi-level flash to degrees that exceed that of single-level flash,” says Vivekanand Venugopal, VP and General Manager, India, Hitachi Data Systems. Another vendor, VMware, is also innovating in the space and is set to bring the concept of virtual flash to the market. “We have virtualized CPU, memory, and hard disks, but there is one thing we have not yet virtualized, which is storage flash memory. At VMware, we are working very closely with some of the storage vendors in the industry to come out with virtualized storage flash memory, which we call Virtual Flash. This will bring the benefits of virtualization of flash to the customers,” says B S Nagarajan, Director – Systems Engineering, VMware India & SAARC. This concept of virtual flash would enable enterprise users to view and manage the flash devices plugged into vSphere hosts as a single pool, just the way other resources like memory or CPU are currently viewed and managed


by the users.
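The server flash caching behavior described earlier, keeping copies of the hottest storage blocks on a server-local SSD, can be sketched as a simple least-recently-used cache. All names below are invented for illustration; vendor implementations are far more sophisticated:

```python
# Schematic sketch of server-side flash caching (invented names, not a
# vendor API): hot blocks read from the storage array are copied into a
# small, fast local cache; repeat reads are then served from the server.
from collections import OrderedDict

class FlashCache:
    def __init__(self, capacity: int, backend: dict):
        self.capacity = capacity        # how many blocks fit on the SSD
        self.backend = backend          # stands in for the storage array
        self.cache = OrderedDict()      # LRU order: oldest first
        self.hits = self.misses = 0

    def read(self, block_id: str) -> bytes:
        if block_id in self.cache:
            self.cache.move_to_end(block_id)   # mark as recently used
            self.hits += 1
            return self.cache[block_id]
        self.misses += 1
        data = self.backend[block_id]          # slow array read
        self.cache[block_id] = data
        if len(self.cache) > self.capacity:    # evict the coldest block
            self.cache.popitem(last=False)
        return data

array = {f"blk{i}": bytes([i]) for i in range(10)}
cache = FlashCache(capacity=3, backend=array)
for b in ["blk1", "blk2", "blk1", "blk3", "blk1"]:
    cache.read(b)
print(cache.hits, cache.misses)  # 2 3
```

Repeat reads of hot blocks are served locally, which is exactly where the response-time gain Varma describes comes from.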

Automated Tiered Storage

The concept of tiered storage is not new in itself. In many industry verticals there already exists a storage architecture differentiated into tiers, grading from the highest-performing storage array — tier zero — meant for active data generated by frequently accessed mission-critical applications, down to the lowest-cost tier, which need not deliver great performance but serves as high-capacity storage for enterprise data that is rarely used. In such a tiered architecture, a huge challenge for enterprises was that the IT team had to undergo the cumbersome process of regularly studying the dynamic storage needs of the enterprise, mapping them against the various storage tiers, and manually distributing petabytes of data across those tiers. “Till 3-4 years back, it was the role of the database administrator or the system administrator to place specific enterprise data in the right storage platforms, which could be different storage media types or different storage arrays to meet the service




levels,” highlights Hitachi Data Systems’ Venugopal. This challenge was resolved with the emergence of software that could intelligently understand the pattern of data usage by the business, enabling automated tiered storage. “The software automates tiering of storage by intelligently identifying and prioritizing the hot data, which is used most frequently, and by automating the relocation of cold and hot data across the different storage tiers. Automated tiering is highly relevant in the structured data environment, wherein users run queries in a structured way, leading to improvement in efficiency in terms of cooling, performance and usage,” says Sanchit Vir Gogia, Principal Analyst – Emerging Technologies, IDC India. The primary reason organizations are betting big on automated tiering is that it is a cost-effective storage option. “High-end storage comes at a high premium, and using top-shelf storage for all the enterprise data assets is no longer a cost-efficient solution. This is increasingly driving organizations to look at intelligence-based automated tiering, where management of data in turn takes care of the storage bandwidths,” explains S Sridhar, Director, Enterprise Solutions Business, Dell. EMC’s Varma cites the example of a major telecom player in India and SAARC that recently benefited from the company’s tiered storage offering. He says that previously the customer was manually selecting its non-performing or retention data and placing it on cost-effective drives, like SATA drives. “We helped the customer migrate the entire customer data from standard drives to tiered

storage that involved a combination of flash drives, fiber channel and SATA drives,” he says. In this case, flash drives formed the highest-performing tier, followed by fiber channel, while SATA drives formed the lowest tier, a cost-effective solution for storing large volumes of inactive enterprise data. “Since flash is a high-performance but expensive storage technology, we did not want to overuse it and incur huge costs for the customer. So, we used just 9 percent flash drives in the overall storage of the enterprise, with 40 percent of the storage being fiber channel drives and 51 percent SATA drives. Within this storage architecture, we then introduced automated storage tiering that removed the manual intervention, improved response time of applications and enabled the company to save almost 40 percent upfront cost, which it would otherwise have had to invest in various storage technologies,” adds Varma.
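The tiering decision described above can be sketched schematically. The function names and the single-signal policy below are invented for illustration (real products weigh many more signals than a raw access count), but the 9/40/51 percent split follows the telecom example just given:

```python
# Illustrative sketch of an automated-tiering decision: rank blocks by
# recent access count and place the hottest on flash, the next band on
# fiber channel, and the coldest on SATA. Invented names and thresholds.
def assign_tiers(access_counts: dict,
                 flash_frac: float = 0.09,
                 fc_frac: float = 0.40) -> dict:
    """Map each block to a tier, hottest first, using the rough
    9% flash / 40% fiber-channel / 51% SATA split quoted above."""
    ranked = sorted(access_counts, key=access_counts.get, reverse=True)
    n = len(ranked)
    flash_n = round(n * flash_frac)
    fc_n = round(n * fc_frac)
    tiers = {}
    for i, block in enumerate(ranked):
        if i < flash_n:
            tiers[block] = "flash"
        elif i < flash_n + fc_n:
            tiers[block] = "fiber-channel"
        else:
            tiers[block] = "sata"
    return tiers

counts = {f"blk{i}": 100 - i for i in range(100)}  # blk0 is the hottest
tiers = assign_tiers(counts)
print(tiers["blk0"], tiers["blk20"], tiers["blk80"])  # flash fiber-channel sata
```

Running this classification periodically and relocating blocks whose tier has changed is what removes the manual intervention Varma refers to.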

Storage Virtualization

IDC defines storage virtualization as the ability to reallocate a heterogeneous collection of storage resources across storage systems, along with the capability to automate storage management functions. As a technology, storage virtualization has been around for quite some time for enterprises to evaluate and test, and enterprises are now critically evaluating how it can be utilized to enable easy management of consolidated storage resources and ensure smooth data migration between different storage arrays.


Venugopal of Hitachi says, “Apart from the regular business benefits of storage virtualization like common management of heterogeneous storage arrays and applying common processes to meet different SLAs of business applications, the most important business benefit for the customer is that they can now reuse their existing assets in the data center and ensure a far better return on assets.” He adds that companies that implement storage virtualization are able to reclaim about 65 percent of the capacity lying as toxic waste in their data center and that these customers can reduce their CAPEX by 30–35 percent and OPEX by 40 percent.

Innovations around storage virtualization

Perceiving that with staggering data growth CIOs will seriously look at implementing storage virtualization, technology vendors are honing their offerings in the space. Automated Storage Provisioning is a recent innovation from VMware to manage storage needs within a virtual environment. One of the biggest challenges in the virtual environment with respect to storage was that customers kept creating hundreds and thousands of virtual machines, which


in turn created huge sprawl. Nagarajan of VMware tells us that their storage virtualization customers generally had to maintain very complex Excel sheets with notes on the different kinds of storage the company has, which storage is meant for which application, which storage has how much free capacity, which Virtual Machines (VMs) lie on which storage, and so on. To address this, VMware came out with an offering that completely eliminated the manual intervention in managing and provisioning virtual storage. “In 2011, this issue was resolved when we came out with vSphere 5.0, which came with automated storage provisioning under which administrators now have to just create VMs and the rest is taken care of by the hypervisor software, which decides where the VM should run, and at what point in time, in order to give the desired performance. This has drastically brought down the complexity of managing storage within virtual environments,” Nagarajan adds. The concept of Virtual SAN is another important storage-linked innovation that VMware is set to introduce in its virtual environments in 2013. A single server today ships with a minimum of two hard drives, each holding around 3 terabytes. Despite so much free storage being available across the many servers spread through the enterprise data center, businesses today are not able to put it to effective use. Instead, they have to use an external shared common storage — the physical SAN — for enterprise use. Talking about how virtual SAN technology from VMware is set


to change this situation in virtual environments, Nagarajan says, “With virtual SAN we could combine all the internal storage available on the servers and show this as an external shared storage for usage. As a result, enterprises would be able to use the unutilized internal storage rather than paying for external storage arrays. This technology would be storage vendor agnostic and would be a part of the VMware hypervisor, which would in turn make managing the storage available on servers as simple as managing CPU or memory today.” Converged infrastructure is another virtualization-driven concept that technology vendors like HP and Dell have started bringing to market. It involves virtualizing heterogeneous resources like storage, servers and networking to create an integrated virtualized resource pool that can be managed via a single console. This offering is aimed at industry verticals that have huge data storage and data management needs and at the same time need high performance from their business applications. “Today, responding to customer demands on time, managing IT budgets and performing with efficiency is the essence. And convergence in the data center is an emerging trend in IT that addresses the growing needs for agility, efficiency and quality to support the delivery of applications and IT services. Adoption of converged infrastructure lowers the cost of running critical workloads, and enables faster infrastructure deployments and simplicity and speed of management,” says Dell’s Sridhar.
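The virtual SAN idea Nagarajan describes, aggregating the servers' internal disks into one shared pool, can be sketched in a few lines. The host names and disk sizes here are invented for illustration:

```python
# Toy sketch of the virtual SAN idea: pool the unused internal disks of
# many servers and expose them as one shared capacity figure.
servers = {
    "host1": [3000, 3000],   # two internal 3 TB drives, sizes in GB
    "host2": [3000, 3000],
    "host3": [3000, 3000],
}

def pooled_capacity_gb(fleet: dict) -> int:
    """Aggregate internal-disk capacity across the whole fleet."""
    return sum(sum(disks) for disks in fleet.values())

print(pooled_capacity_gb(servers))  # 18000 GB of shared capacity
```

A real implementation also has to handle replication, failure domains and data placement across hosts, which is why it lives inside the hypervisor rather than being a simple sum.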

The whole gamut of intelligent storage technologies is today referred to by the broad term Software-Defined Storage (SDS). At this point in time, we can clearly see that storage vendors and virtualization vendors are working towards SDS, integrating software with the storage architecture in such a way that managing storage assets and smoothly scaling them up and down according to the changing needs of an enterprise becomes a reality.

Cloud Storage for data archiving and backup

Companies have petabytes of data which they can’t afford to delete from a business perspective, and they traditionally preserve it by spending on expensive on-premise data archiving and backup solutions. Using public cloud storage for data backup and archival is an interesting option for CIOs today, as it can turn out to be much cheaper than on-premise solutions. “From an IT perspective, by spending on data archiving and backup done on premise, enterprises are dealing with large costs in durably preserving this content indefinitely. This is one of the reasons that led us to invent and build Amazon Glacier, which is an extremely low-cost storage service that is optimized for data that is infrequently accessed and for which retrieval times of several hours are suitable,” says Alyssa Henry, VP, Storage Services, Amazon Web Services. “In general, we’re seeing a number of successful companies use AWS to both store their data and leverage it to develop greater insights. This is a trend we see across verticals, whether it’s biotech,

Companies that implement storage virtualization are able to reclaim about 65 percent of the capacity lying as toxic waste in their data centers Vivekanand Venugopal

VP and General Manager, India, Hitachi Data Systems


informationweek march 2013

advertising, or e-commerce,” she adds. The company has also brought out several other cloud storage offerings that are customized for workloads demanding high performance. Software-defined storage is a clear technology trend emerging in the area of storage. Here, technology vendors are coupling software and virtualization technologies with storage to enable enterprises to manage their storage environments in a much simpler way, while at the same time improving the speed with which enterprise applications can retrieve information from back-end storage. Though storage virtualization was not at the forefront for most enterprises, going forward, with the multi-fold increase in data — both structured and unstructured — enterprises are set to seriously evaluate this technology. Another interesting trend is how cloud storage is emerging as a cost-efficient yet effective option for data backup and archival for large enterprises. In an era where cost-cutting has become a mandate for CIOs, they are sure to look at cloud storage for backup and archival, as this is one way they can divert the money traditionally spent on storing huge volumes of inactive data to other business-critical operations. With an increase in storage and data management challenges in enterprises, both storage and virtualization technology vendors are coming out with more storage-linked innovations to tap the growing market and enable CIOs to consolidate their organization’s storage environment in the way best suited to its particular storage and data needs. This is, in fact, the right time for CIOs to analyze their storage architecture, identify the storage technologies best suited to the business and map out a strong storage strategy, keeping in mind the increasing volumes of enterprise data it will have to keep pace with. u Amrita Premrajan
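The archival economics Henry describes can be put in rough numbers. The per-GB rates below are made-up placeholders, not actual AWS or vendor prices; the point is the shape of the comparison, not the figures:

```python
# Toy monthly cost comparison for cold-data archival.
# Rates are illustrative placeholders, not real price lists.

def monthly_cost(data_gb, rate_per_gb_month):
    return data_gb * rate_per_gb_month

archive_gb = 500_000  # 500 TB of rarely accessed data

# Assumed all-in on-premise rate (disk, power, cooling, admin) vs an
# assumed low-cost cloud archival tier with hours-long retrieval.
onprem = monthly_cost(archive_gb, 0.10)
cloud = monthly_cost(archive_gb, 0.01)

print(f"on-premise: ${onprem:,.0f}/month")  # on-premise: $50,000/month
print(f"cloud tier: ${cloud:,.0f}/month")   # cloud tier: $5,000/month
```

The trade-off a model like this hides is retrieval latency: a tier priced this way typically takes hours to return data, which is acceptable for archives but not for primary storage.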


Organizations opt for deduplication as cure for backup woes

A look at how companies like HDFC Bank, Marico, Shoppers Stop and Royal Sundaram Alliance Insurance have considerably reduced their backup-related challenges by adopting deduplication technology

By Jasmine Kohli

Companies of all sizes today are faced with the challenge of managing and protecting data that is growing at an exponential rate. From confidential customer information and intellectual property to financial transactions, organizations today possess massive amounts of information that they need to store and manage to meet stringent regulatory and compliance requirements and to gain critical insights from crucial data. The fact is substantiated by a Symantec study, which says that business data in Indian organizations is expected to grow 67 percent in 2013. The study also reveals that 60 percent of organizations in India struggle to manage digital information. This is because while data continues to grow at a rapid pace, backup windows have either stayed the same or shrunk. With shrinking IT budgets, CIOs are realizing that adding more storage to manage data is no longer a viable option. They are thus contemplating deduplication technology to ensure robust backup of their highly critical data. Deduplication has now leapfrogged into almost every type of storage and networking device. “Most Indian customers are struggling to manage the costs of data storage and associated costs like rackspace, and power and cooling. They see deduplication as a long-term solution to address this problem,” says Surajit Sen, Country Manager, Backup & Recovery Systems Division, EMC India & SAARC. Deduplication is emerging as an important part of the storage strategy of CIOs, as it allows them to ensure robust data backup available for rapid recovery, reducing the need for more investment in storage. “If you are a medium to large company, rapid recovery is crucial for meeting compliance requirements,” states Vijyant Rai, Director – Channel Sales (India and SAARC), CA Technologies.

Eliminating redundant data

One of the major challenges organizations are grappling with is the generation of multiple copies of data. As per a survey by Symantec, more than a third (38 percent) of business information in Indian organizations is duplicate. Since with data deduplication only a single iteration of a file, block or byte is saved to the actual storage media, the technology addresses one of the top information management challenges. “One of the areas where the maximum amount of duplicate data is stored is in backups. The average data change in most organizations is less than 2 percent, however the backup policy forces them to do full backups every week and retain them for at least 30 days. Data archival policies make it mandatory to store information for extended periods of time dictated by regulatory compliance. This is where deduplication has made the biggest impact,” asserts Sen. Take the case of Royal Sundaram Alliance Insurance (RSAI), which was facing issues in managing its information as the company’s employees stored multiple copies of their data for redundancy purposes. This created problems for backup, as the same data in multiple forms used to be backed up multiple times. With approximately 1,500 desktops and 300 laptops, the amount of duplicate data that was getting backed up was huge, and backups were time consuming. With a deduplication solution, the firm’s backup window has considerably reduced, and it has cut down on a lot of overhead, maintenance and deployment-related costs. In addition, the company’s storage needs have reduced by a significant percentage.
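The single-instance principle described above (each unique file, block or byte stored once) can be sketched with a minimal block-level deduplicating store. This is an illustration only; production systems use variable-length chunking and persistent indexes:

```python
import hashlib

class DedupStore:
    """Stores each unique fixed-size block once, keyed by its SHA-256."""
    def __init__(self, block_size=4096):
        self.block_size = block_size
        self.blocks = {}          # hash -> block bytes (stored once)

    def write(self, data: bytes):
        """Returns the list of block hashes that represent the data."""
        recipe = []
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            h = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(h, block)   # duplicate blocks are skipped
            recipe.append(h)
        return recipe

    def read(self, recipe):
        return b"".join(self.blocks[h] for h in recipe)

store = DedupStore()
doc = b"quarterly report " * 1000
r1 = store.write(doc)        # first copy: blocks actually stored
r2 = store.write(doc)        # second copy: only references stored
assert store.read(r2) == doc
# Two logical copies, but each unique block is kept just once:
print(len(r1) + len(r2), "references,", len(store.blocks), "stored blocks")
```

Backing up the same file twice costs only a handful of hash references the second time, which is exactly why weekly full backups benefit so dramatically.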


Organizations are primarily embracing deduplication technology to reduce the amount of stored data and the costs associated with storing it. By eliminating redundant data, deduplication achieves significant data reduction levels. This results in smaller storage space requirements, which in turn reduces storage costs due to decreased disk requirements. HDFC Bank is a case in point; the bank is accruing huge savings by adopting data deduplication technology. With the advent of online banking, the bank faced a challenge in storing the staggering amount of consumer transaction data generated every day. The bank reported 1.6 million customer transactions per day, and this data was required to be stored for a period of 8–12 years owing to regulatory requirements. The cost of incremental direct-attached storage was eating into the bank’s profitability. To solve the issue, the bank deployed storage virtualization technology and extended the life of its storage investments. Storage virtualization and deduplication, along with other techniques like tiered storage and thin provisioning, have helped the bank efficiently manage the lifecycle of its data at the lowest cost. “More than compliance driving deduplication, it is replication and backup window, which is driving deduplication currently. The current state of higher backup window times and higher bandwidth drives deduplication in India,” says Amit Luthra, National Manager, Storage & Networking Solutions Marketing, Dell.

Optimizing backup speed

Along with reducing backup storage requirements, deduplication

We have reduced the previous 125 TB of data to 5 TB, with 96 percent deduplication Girish Rao

Head – IT, Marico

technology allows backup jobs to be completed in shorter time windows. Take the case of Cairn India, which has around 19 departments aligned to four major business units. Each department used a shared network drive (Windows NTFS with access control). Since both technical and non-technical teams were storing

data at this location, the storage size of this environment was over 30 TB, and the estimated annual growth rate of this file system was 30 percent. The growing data caused delays in completing the backup within the planned time. It also increased the possibility of data being unavailable.

Deduplication’s biggest advantage and gain was in terms of improved user experience while ensuring efficient backup Anil Shankar, Customer Care Associate & Sr. GM - Solutions & Technology, Shoppers Stop

Considering the criticality of data protection, Cairn India embarked on a deduplication project with two-fold objectives — ensuring 100 percent protection of business-critical data as per company policy and building quicker and realistic restoration capabilities. “We realized that backup to disk was a reliable and realistic solution to meet the project objectives and thus deployed EMC Avamar, which utilizes a mature source-based deduplication methodology,” says Priya Narayanan, DGM - Information Services, Cairn India. The project has resulted in improved productivity and has enabled the IT team to focus on other strategic initiatives. “There has been an increase in productivity and DR support capabilities with a 90 percent decrease in restoration time,” adds Nagarajan Krishnan, Senior Manager - Emerging Technologies, Cairn India. Another example is that of FMCG giant Marico, which has registered several benefits after deploying source-based backup software. The company generates an enormous amount of data through business transactions, market research, use of new applications, and the introduction of more devices and sensors in the business ecosystem. To manage this data in an intelligent manner, the company framed a backup policy according to which users were required to segregate data into specific folders for backup. This ensured that big files, such as movies, which cannot be compressed to a great extent, were not backed up. “We have been able to reduce the previous 125 TB of data to 5 TB, with 96 percent deduplication,” says Girish Rao, Head – IT, Marico. Also, there is just one
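A rough sketch of the source-based approach used by products in this class: the client hashes its data locally, asks the backup server which chunks it already holds, and ships only the rest. Chunk sizes and the protocol below are simplified assumptions, not EMC Avamar's actual design:

```python
import hashlib

def chunk(data: bytes, size=4096):
    return [data[i:i + size] for i in range(0, len(data), size)]

class BackupServer:
    """Server side: stores unique chunks keyed by hash."""
    def __init__(self):
        self.store = {}

    def missing(self, hashes):
        # Which chunk hashes does the server not yet have?
        return {h for h in hashes if h not in self.store}

    def upload(self, chunks):
        for c in chunks:
            self.store[hashlib.sha256(c).hexdigest()] = c

def source_dedup_backup(server, data):
    """Client side: hash locally, then send only unseen unique chunks."""
    chunks = chunk(data)
    hashes = [hashlib.sha256(c).hexdigest() for c in chunks]
    need = server.missing(set(hashes))
    unique_new = {h: c for c, h in zip(chunks, hashes) if h in need}
    server.upload(unique_new.values())
    return len(unique_new), len(chunks)

server = BackupServer()
monday = b"A" * 40_000 + b"B" * 4_096      # first full backup
sent1, total1 = source_dedup_backup(server, monday)
tuesday = b"A" * 40_000 + b"C" * 4_096     # only the tail changed
sent2, total2 = source_dedup_backup(server, tuesday)
print(sent1, "of", total1, "chunks sent on day 1")   # 3 of 11
print(sent2, "of", total2, "chunks sent on day 2")   # 2 of 11
```

Because deduplication happens before transfer, the second "full" backup moves only the changed chunks over the wire, which is how backup windows and WAN load shrink at the same time.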


Business benefits of data deduplication

Royal Sundaram Alliance Insurance

The firm’s backup window time has considerably reduced. It has also cut down on a lot of overhead, maintenance and deployment-related costs.


HDFC Bank

Storage virtualization and deduplication, along with other techniques like tiered storage and thin provisioning, have helped the bank efficiently manage the life cycle of its data at the lowest cost.

Cairn India

The company has registered increase in productivity and DR support capabilities with a 90 percent decrease in restoration time.


Marico

The company has registered significant reduction in the time to take a full backup of data over LAN.

backup administrator managing backups of around 700-800 machines spread across 35 locations. Earlier, to take a full backup of data over LAN, the company had to spend 2-3 hours — each user had an average data size of nearly 10-15 GB, and a major chunk used to be Outlook PST files. A lot of time was also spent on managing the backup generation process. “To restore even a single file, we had to spend a minimum of 1-2 hours. Today, the time taken for backup of each user is around 30 minutes per day over LAN and 1 hour over WAN,” informs Rao.

Improving backup effectiveness

Data deduplication also improves backup effectiveness and solves the challenges of data archival associated with traditional storage technologies. Take the example of Shoppers Stop, which was experiencing challenges with backing up data residing on users’ desktops and laptops. “Loss of archived data due to

crashes and high recovery times made it difficult for end users to resume work within acceptable timeframes. Dependency on external vendors, coupled with lower success ratios and high spend for recovery, made it an unacceptable option,” informs Anil Shankar, Customer Care Associate & Sr. GM - Solutions & Technology, Shoppers Stop. Apart from this, the company also faced challenges related to containing the overall footprint of the storage solution and ever-increasing disk space requirements. The company increasingly felt that traditional backup and retrieval solutions were constrained in a dynamically changing environment — both in terms of handling increasing data volumes and sustainability. Hence, in order to preserve and protect user data, the company opted for data deduplication technology. “From a retail business standpoint, it is important to capture the maximum possible data elements of transactions and customer interactions, and this



• Reduces storage needs by eliminating redundant data
• Reduces costs by lowering storage space requirements
• Cuts down data backup and recovery times
• Improves backup effectiveness
• Ensures robust data archival

is becoming more and more critical for creating a unified view of customers to serve them better. In our environment, we found that the data volume was doubling every 2-3 years. This prompted us to explore solutions beyond the traditional means of storage management,” says Shankar. From a situation where the company’s existing storage solution was running out of free space, re-architecting the storage landscape with deduplication brought utilization back to normal levels and sustained them for the next year and beyond. The company was also able to cut its backup window time by nearly half. Deduplication at source also reduced the load on the network, thereby freeing up bandwidth. “The biggest advantage and gain was in terms of improved user experience while ensuring efficient backup. Being a block-level solution, it is transparent to the end users and works seamlessly,” adds Shankar. From eliminating redundant data segments and reducing the amount of data transferred and stored, to bringing down backup window time, deduplication introduces significant benefits to data protection processes. As business data continues to grow at a staggering pace, more and more organizations will realize the limitations of traditional storage technologies and evaluate deduplication to keep storage requirements and costs under control.

u Jasmine Kohli



How CIOs are tackling the data explosion challenge

Multi-fold increase in the size and complexity of data coupled with a stringent regulatory environment is compelling CIOs to relook at their storage strategies By Ayushman Baruah


You can hate the unprecedented increase in data in your organization, but you cannot ignore it. It’s estimated that in 2013, the world will produce 4 zettabytes (4 million petabytes) of new data, of which 2 zettabytes will be stored in some form or the other. Concerned with the key word ‘store’, CIOs across verticals are rolling up their sleeves as they get ready for biting the data bytes in their respective organizations. At e-commerce company Myntra.com, revenues have almost quadrupled over the last year and data has grown more than 10x during the same period. Shamik Sharma, Chief Technology & Product Officer of the e-tailer, is facing the challenge of analyzing this data and storing it securely yet affordably.



Like any other e-tailer, Myntra.com deals with three kinds of data — transactional data, analytics data and internal IT data. Transactional data is low-volume but highly precious data, which includes customer orders and credit card information. “The challenge here is not about size and scale but it is about managing the data and making sure it’s reliable,” says Sharma. “This type of data also involves the additional challenge of regulatory compliance.” Myntra.com is in the process of making itself compliant with PCI DSS (Payment Card Industry Data Security Standard), a leading security standard for payment card processing. Analytics data, the second piece, deals with large volumes of data though it may not be as critical as transactional data. At Myntra.com, this

data, which tracks all customer behaviour patterns, is stored in the Amazon Web Services (AWS) cloud. According to Sharma, cloud is a great model for a company of their size as it is based on OPEX instead of CAPEX. “This is also the most compelling reason why we have moved to the cloud,” Sharma says. “However, not all of this data is highly valuable and only the analyzed data needs to be kept. So, we extract insights and analytics and put only the summarised data into a BI system locally in our data center,” he says. Moreover, analyzing this data is one of the biggest challenges the e-commerce company faces. “Analytics talent is very tough to find in India,” Sharma says. This is especially true for an e-tailer like Myntra.com that counts heavily on tracking customer behaviour

patterns, a metric that can give it a competitive edge over an offline store. The third type of data is internal IT data, which includes employee and HR data, and it’s kept in-house in an SQL server. “This is the data for running an organization, not a business,” says Sharma. Interestingly, though in terms of volume 98 percent of Myntra.com’s data is hosted on the cloud, the most critical data is still in its in-house data centers. This reinforces the much-debated security concerns around cloud in India. Explaining this, Sharma says that even if the AWS server went down for a day, their business would still run, but if their data center went down even for an hour, the company would lose lakhs of rupees. “Given that the AWS data center is located in Singapore, latency and network costs are the additional reasons why critical data is not hosted in AWS.” In terms of storage devices, though flash-based databases are slowly becoming an attractive option today, Myntra.com, like many other organizations, has not opted for flash drives yet because of the high CAPEX involved. However, Sharma clarifies that they would be re-evaluating the decision to go for flash in the next 3-4 months.


The insurance sector deals with specific challenges in terms of data intensity/longevity, regulatory compliance and competition in the market. “In this sector, the customer gets tied to the company for a long term, usually not less than 30 years. Therefore, it becomes mandatory for us to retain

While we are yet to deploy the last piece of auto-tiering, we have already deployed other storage infrastructure optimization technologies Harnath Babu

VP & CIO, Aviva Life Insurance

the data for the entire duration, which is about 30-40 years,” says Harnath Babu, VP & CIO, Aviva Life Insurance. Ditto with Sriram Krishnan, Executive VP-IT, ING Life Insurance, who says, “Insurance is a data- and document-intensive business. As we scale up our business volumes, there is increased pressure in managing the surge in data.” At Aviva Life Insurance, the size of the data currently stands at around 100 TB, and it is growing 5-6 TB every year. To deal with this continuous growth and overcome some of the challenges mentioned above, Aviva has recently undertaken a storage infrastructure optimization exercise, which involves data archiving (for e-mails and financial books), automated storage tiering, storage virtualization and storage consolidation with in-built data compression, data deduplication and auto-tiering. “While we are yet to deploy the last piece of auto-tiering, we have already deployed all the other technologies,” Babu told InformationWeek. The data scenario at ING Life Insurance is equally daunting. Its storage requirements are continuously increasing as many new applications have come into use in the last couple of years. This trend

We are hungry for specialist cloud storage providers who can cater to our requirements in the cloud, securely yet affordably and without latency Shamik Sharma

Chief Technology & Product Officer, Myntra

is also expected to continue for the next few years. “Preempting this, we took the conscious call to consolidate all our critical and storage-intensive applications onto SAN. This enabled us to cope with storage-related challenges as SAN encapsulates features like dynamic storage expansion, high and low performance storage to meet different storage requirements,” says Krishnan. The second key challenge for the insurance sector is in terms of regulatory reporting, wherein the status of a policy must be made available to the regulator/internal auditor whenever asked for. This is typically at the end of a month, quarter or a year. “Regulations require insurers to store a lot of information such as documents, images, voice recordings, etc. These have to be retained for long periods of time and be available on demand for servicing and in cases of litigation. While retention and access is critical, it is equally important to make sure that data integrity and security of information are also taken care of. At the same time, we have to ensure that data is not leaked to the outside world. Therefore, appropriate and adequate controls have to be built,” says Krishnan. The third challenge, which is common across almost all verticals, arises out of the highly competitive market space. “There is a huge need to analyze and derive insights from the data,” says Babu.
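The automated storage tiering Aviva is rolling out can be sketched as a simple policy: recently accessed data stays on the fast tier, and cold data is demoted to cheaper disks. Thresholds, tier names and file names below are illustrative assumptions, not any vendor's implementation:

```python
import time

# Toy automated-tiering policy: data touched recently stays on the
# fast (expensive) tier; cold data is demoted to the cheap tier.

HOT_WINDOW = 30 * 24 * 3600        # demote after 30 days without access

class TieredStore:
    def __init__(self):
        self.tier = {}             # name -> "ssd" or "sata"
        self.last_access = {}      # name -> unix timestamp

    def write(self, name, now=None):
        if now is None:
            now = time.time()
        self.tier[name] = "ssd"    # new data lands on the fast tier
        self.last_access[name] = now

    def read(self, name, now=None):
        if now is None:
            now = time.time()
        self.last_access[name] = now
        self.tier[name] = "ssd"    # promote on access
        return self.tier[name]

    def rebalance(self, now=None):
        """Periodic job: demote anything cold to the cheap tier."""
        if now is None:
            now = time.time()
        for name, ts in self.last_access.items():
            if now - ts > HOT_WINDOW:
                self.tier[name] = "sata"

store = TieredStore()
t0 = 0
store.write("policy-2012.pdf", now=t0)
store.write("claims-feed.db", now=t0)
store.read("claims-feed.db", now=t0 + 45 * 24 * 3600)   # still in use
store.rebalance(now=t0 + 45 * 24 * 3600)
print(store.tier)   # {'policy-2012.pdf': 'sata', 'claims-feed.db': 'ssd'}
```

Real arrays make the same decision at block granularity and factor in IO rates, but the economics are identical: only the working set pays the SSD premium.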


Low business margins and limited availability of funds are haunting the pharma industry. Though costs have come down on a per GB/TB basis, overall costs have gone up due to


the increased volumes. “Contingency planning is vital to ensure business continuity in the current volatile environment but this adds to multiplicity of storage,” says KT Rajan, Director - Operations, IS, Projects and Neurosciences - India & South Asia, Allergan. In line with the insurance sector, the pharma sector faces a similar challenge of storing data for longer durations. For instance, new drug discovery is a lengthy process that can run over a decade, and all the accumulated data needs to be available for easy access and timely reviews. The sector is extremely data-heavy and hence has high storage requirements. At speciality pharma company Allergan, storage requirements are increasing significantly year on year, with growth of over 20 percent in 2013 over the previous year. “Storage of clinical trial data, patient records in original form, i.e. EMG, MRI scans, etc., all contribute to the tremendous storage needs. Lots of data pertaining to each batch manufactured needs to be archived and stored for several months even beyond the expiry date of the products,” says Rajan. The sector faces regulatory challenges too, as some of the regulations have not kept pace with the advent of technology, says Rajan. “Physical copies are still necessary along with the electronic ones, which increases the storage requirements multi-fold. Compliance requirements from various authorities are steadily increasing, in turn increasing storage requirements.”


While it’s always economical to procure

We plan our storage requirements in a phased manner as we have to be business agile Nataraj N

CIO, Hexaware Technologies

storage in bulk, this isn’t the best strategy in the current environment, where technology is continuously driving down the cost of storage. Storage technology is also changing fast, which creates technology obsolescence and introduces operational risks that organizations have to constantly assess and manage. “This implies that the timeline for refresh has now reduced, say from five to three years, else organizations risk losing out on cost and efficiency. This also results in a difficult investment call for the CIO, especially when the initial investment has not been fully realized,” says Krishnan of ING Life Insurance. ING Life Insurance, for instance, plans its storage requirements based on a forward-looking business view of 12-18 months and ensures that the investments are well utilized. “We also constantly evaluate the storage arrangement and move data to low-cost storage hardware, which we procure as and when needed,” adds Krishnan. CIOs are also refraining from big one-time investments in storage and are instead budgeting their storage investments in a phased manner. Nataraj N, CIO, Hexaware Technologies, says, “We plan our storage requirements in a phased manner as we have to be business agile.”

Contingency planning is vital to ensure business continuity in the current volatile environment but this adds to multiplicity of storage KT Rajan, Director - Operations, IS, Projects and Neurosciences - India & South Asia, Allergan



Echoing a similar opinion, Rajan of Allergan says, “We try and estimate the requirements over the short and medium terms so that the investments are future-proof.”


In 2013, organizations will need to bridge the rising gap between their storage capabilities and requirements. To do this, many CIOs are embarking on storage infrastructure optimization projects with technologies like automated storage tiering, storage virtualization, data compression and data deduplication. Some of them are also struggling to harness the power of Big Data and derive meaningful insights from it, more so as analytics skills are tough to find in India. Apart from that, security concerns in the cloud, latency and network/bandwidth costs continue to be key challenges for many sectors. As Sharma of Myntra.com points out, like many e-commerce companies, they are “hungry” for specialist cloud storage providers who can cater to their storage requirements in the cloud, securely yet affordably and without latency. Companies, particularly in sectors such as BFSI and pharma, face stringent regulatory challenges with respect to data storage and retrieval, which add to their storage requirements and costs. Fast changes in technology have forced organizations across verticals to budget their storage in a phased manner and take a short- to medium-term view rather than going for a big one-time investment. u Ayushman Baruah

Case Study

Archival solution helps Yash Raj Films ensure protection of its film catalogue By adopting an archival solution from Dell, the studio has been able to archive all its films at a single location By Jasmine Kohli


Yash Raj Films (YRF) Studios started out as a film-making company in 1970. In the last four decades, it has grown from strength to strength and has to its credit India’s most enviable film catalogue. With its ever-growing film catalogue, the studio was confronted with the challenge of archiving an increasing number of films, which were being stored on a variety of devices, including servers, tape drives, and HDDs. The studio was looking for a solution that ensured robust archival of films and data protection, as loss of any film footage meant enormous cost to the company and a detrimental impact on distributors and audiences. “If we lose footage due to hard drive failure or disk corruption, then the entire film project could be lost. Each frame in a film footage has a resolution of 10-15 MB, and we have to archive around 130 footage. So if anything goes wrong with a particular frame, it becomes difficult to identify that frame and restore that part,” informs Dilip Patil, IT-Manager, Yash Raj Films Studio. Also, the studio wanted to capitalize on the trend of digitization of content and expand the distribution of its films and film footage over multiple media

Key Benefits

The studio can upload 500 GB of films in a few hours, instead of the days it took earlier


Automated tagging of the studio’s digital assets makes them easily searchable


The studio can use its archived catalogue of films for a range of media projects across platforms


platforms. With the final film being stored in high-resolution files, all the digital assets of a film can require up to 10-15 terabytes of hard disk space. Thus, to ensure high availability of films and reliable data protection, the studio was looking for a robust archival solution. After considering various archival solutions, the studio zeroed in on the Dell EqualLogic PS6500E virtualized iSCSI storage array with SATA disk drives for archiving its film catalogue. Today, the studio’s films are archived at a single location, and it has the flexibility to access and use its archived catalogue of films for a range of media projects across platforms. “We can respond to requests within days now, while previously it used to take weeks, and deliver exactly what our partners and media required,” says Patil. Earlier, with no backup or redundancy in place for its server storage, the studio’s film archive was at risk. Today, redundant disk drives ensure protection for film data archives. “Having fully redundant disk drives was a key feature for us as it meant that if one drive fails, our films remain protected and all we need to do is replace the faulty disk,” says Patil. The studio has millions of film fans who send requests for film content via various platforms, such as YouTube, Netflix, and iTunes. Delivering content and responding to these requests by having to locate film storage across multiple disparate devices would take the studio weeks. “With the centralized archive solution, we can easily access the film catalogue, which is searchable down to the level of a single frame. The centralized archive solution helps us deliver specific film files to dedicated workstations where the IT team can package the film or film clip in the appropriate format for upload. This enables us to

“The solution helps us deliver specific film files to dedicated workstations where the IT team can package it in the appropriate format for upload, thereby ensuring maximum exposure for the catalogue”

Dilip Patil

IT-Manager, Yash Raj Films Studio

expand our audience for both current and back catalogue films,” says Patil. The solution has also enabled the studio to significantly bring down the time taken to upload a film. Earlier, it took days to upload a 500 GB film to an archival server, whereas now it takes a few hours. The solution has automated the tagging of the studio’s digital assets, making them easily searchable. “With Dell storage solutions, we can integrate the storage array with industry software to add metadata tags to digital assets, which was earlier a time-consuming manual task,” says Patil. u Jasmine Kohli
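The tag-and-search workflow Patil describes can be sketched with a toy metadata index. Asset identifiers, field names and tags below are hypothetical:

```python
from collections import defaultdict

# Toy metadata index for archived film assets: tag each asset,
# then search by tag.

class AssetIndex:
    def __init__(self):
        self.assets = {}                   # asset id -> metadata dict
        self.by_tag = defaultdict(set)     # tag -> asset ids

    def add(self, asset_id, **metadata):
        self.assets[asset_id] = metadata
        for tag in metadata.get("tags", []):
            self.by_tag[tag].add(asset_id)

    def search(self, tag):
        return sorted(self.by_tag[tag])

index = AssetIndex()
index.add("film42/reel3/frame0091", film="Film A", year=2004,
          tags=["song", "outdoor"])
index.add("film42/reel5/frame1187", film="Film A", year=2004,
          tags=["dialogue"])
index.add("film57/reel1/frame0007", film="Film B", year=2008,
          tags=["song"])

print(index.search("song"))
# ['film42/reel3/frame0091', 'film57/reel1/frame0007']
```

Once every frame-level asset carries tags like these, "find every song sequence across the catalogue" becomes a lookup rather than a weeks-long hunt across disparate devices.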



‘Big Data of tomorrow will be about images, audio, and sensor data’ As a CTO, what is the biggest challenge and the biggest opportunity for you at Hitachi? The biggest challenge I have is keeping up with new technologies and understanding where they fit our customer’s requirements. The greatest opportunity is the opportunity I have as CTO to work with talented engineers, researchers, and customers who keep me informed, challenged and excited about my job.

Hubert Yoshida, a visionary in the storage industry and CTO of Hitachi Data Systems, tells us about the true value of Big Data, some common myths about it, and his perspective on exciting technologies that will impact the future of storage


InformationWeek March 2013

What is the true value of Big Data? Can you cite some innovative examples where Big Data has been used to its full potential?
There is tremendous potential in the Big Data phenomenon — potential to correlate information from different data sources to increase business and social innovation. Social innovation may include security applications, such as the ability to look for potential threats in crowds or analyze unstructured data on the Internet to identify patterns and interactions between people. There is tremendous potential to not only secure and save lives, but also identify new revenue opportunities. Today, we are developing and delivering systems that can process growing volumes of data faster and with greater frequency. The Tokyo Stock Exchange, in collaboration with Hitachi, has launched a new "High-Speed Index Service" in which indexes are recalculated with each change in the price of a constituent stock and disseminated at the millisecond level. Changes in market price that previously took as long as one second now take between 1 and 10 milliseconds. This enables investors to get immediately actionable best quote indexes with less tracking error. In health and life sciences, a major challenge has been the correlation of data from different modalities like X-rays, MRIs, and Electronic Medical Records. Now that data can be stored in one common repository, metadata can be used to correlate different clinical data to a patient's visit to a hospital or clinic and to reference it against previous visits automatically, improving patient care and diagnosis.

What are some of the common myths about Big Data?
While there are huge expectations that have been set with regard to this phenomenon, it is important to understand that Big Data in itself is not going to solve all problems. Undoubtedly, it will help address some real problems in terms of helping to increase efficiency, save on costs, generate new revenue opportunities and even help save lives by providing real-world intelligence, but it will still be a while before enterprises begin to see real benefits. Big Data today is about data mining, Hadoop, or SAP HANA, which is primarily around business data. The Big Data of tomorrow will be about images, audio, and sensor data and will require the orchestration of a number of historic, real-time, and predictive analytic tools. The value of Big Data will come in the analytics, which are specific to different verticals. The analytics require deep expertise in verticals like health care, oil and gas, transportation, financial systems, retail, and others.

In the top ten technology trends announced by you, you have cited dramatic changes in OPEX and CAPEX for storage needs. Against this context, how do you see the emergence of cloud-based storage, and the overall impact on OPEX and CAPEX for storage?
We will certainly see dramatic changes in CAPEX and OPEX over the next year. The increasing adoption of server

and storage virtualization is having a major impact on operational costs and is reversing the OPEX trend. Hitachi Data Systems' storage virtualization technologies have helped reduce TCO by 40 percent or more, with payback in less than a year. On the other hand, hardware costs have begun to trend upwards due to Big Data pressures and the need to retain data indefinitely, as well as the addition of more functions in hardware. At the same time, we have seen a slowdown in the advancement of recording technologies. The price erosion for storage capacity is projected to be only 20 percent per year through 2020. Five years ago the capital cost of hardware may have been less than 20 percent of the total cost of ownership. Today, the capital cost of hardware may be as much as 50 percent, for several reasons. One reason may be that accounts have reduced operational costs through tools like virtualization. Another may be that customers are using capacity as a management tool, as a substitute for hiring more people or investing in management tools. As CAPEX becomes a greater share of TCO, storage efficiency technologies, such as virtualization, thin provisioning, and the use of intelligent archives, become more important. Capacity on demand with a cloud-based service or managed services can also help to reduce CAPEX. OPEX and CAPEX are changing rapidly as new demands are created and new technologies are applied. As we move into 2013, IT will have to employ economic principles and measurements to ensure that it is making the right investments for a sustainable IT budget.

You have also cited that certified, pre-configured and pre-tested converged infrastructure solutions are gaining traction. Can you elaborate on the key reasons driving this growth?
Converged solutions, which include server, storage, and network components in a pre-configured and pre-tested bundle, have been around for some time and have already gained a measure of acceptance due to their

ease of acquisition and setup. Instead of a do-it-yourself (DIY) approach to acquiring server, storage, network, and systems software, and assembling the kit themselves, businesses are finding it more convenient to acquire a certified, pre-configured rack that is already assembled and ready to go when it rolls in the door. In 2013, we will see the growing acceptance of unified compute platforms, where the management and orchestration of server, storage, and network resources will be done through one pane of glass. The primary reasons for the growth of converged solutions are therefore the ease of deployment and management. For example, Hitachi Data Systems announced its converged infrastructure offering, which includes Hitachi Unified Compute Platform (UCP) Pro, a pre-configured, tightly integrated turnkey solution with world-class Hitachi Data Systems servers and storage, and industry-standard networking. This platform can be installed in a day or two, and the orchestration is done through vCenter, a single management interface.

How do you see the rising adoption of mobile devices and platforms impacting the overall storage market?
According to the iPass 2010 Mobile Workforce Report, employees with mobile devices work an extra 240 hours per year, which equates to an extra six weeks of work per year, per employee. Mobile devices make it possible for workers to be connected anywhere and anytime, which increases productivity but also increases IT costs and security exposure. The productivity of mobile workers can be further increased if they can share and access data that is in sync with other workers. However, the use of USB sticks, Dropbox, and Amazon S3 to store and share data is creating a nightmare for corporate data centers. Rather than considering mobile devices as a threat to IT, there is an opportunity to adopt a secure, content-anywhere platform that can be controlled and managed

by IT, where the responsibility and liability are owned.

If you had to pick some exciting new technologies that would impact the future of storage, what would those be?
There have been several interesting developments in the recent past that offer a glimpse into the future of storage. New controllers that are designed to increase the durability and capacity of MLC flash will replace current solid state disks and enable a lower price point to replace high-performance disks. This will enable disks to do what they do best: provide low-cost capacity without the need to worry about performance. This, in combination with page-level tiering, will enable data centers to reduce the cost of storage without sacrificing performance. Another technology is object-based file systems, which are required to scale beyond the limitations of traditional file systems. In a traditional file system, a file name is contained in a directory, which points to an inode that points to a physical location on disk. Object-based file systems separate the name of the file from the physical location and enable the file system to scale to petabytes and billions of objects. The management of object-based file systems, like tiering and replication, can also be done with the metadata in the object, without having to open the file content. Object-based stores can virtualize data from the application. This involves storing the data with metadata that describes the content, so that it can be searched and accessed independently of the application that created it. This enables us to see the intersection of data from different applications, like the example above in health care. Converged solutions are also an exciting new technology, where orchestration of the entire stack can be done through one user interface like vCenter. Converged solutions are currently application-specific, but in the future one can see the consolidation of converged solutions with hypervisor technology.

- Srikanth RP
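Yoshida's point about object-based storage (the file's name and its searchable metadata are decoupled from where the bytes physically live) can be illustrated with a minimal sketch. The `ObjectStore` class, its method names, and the healthcare-style metadata fields below are all invented for illustration; this is not any vendor's API.

```python
import hashlib

class ObjectStore:
    """Toy object store: names map to content-addressed object IDs, and
    metadata can be queried without ever opening the object content."""
    def __init__(self):
        self._blobs = {}   # object id (content hash) -> bytes; placement is hidden
        self._index = {}   # name -> (object id, metadata dict)

    def put(self, name, data: bytes, **metadata):
        oid = hashlib.sha256(data).hexdigest()
        self._blobs[oid] = data
        self._index[name] = (oid, metadata)
        return oid

    def get(self, name) -> bytes:
        oid, _ = self._index[name]
        return self._blobs[oid]

    def search(self, **criteria):
        """Find objects by metadata alone, independent of the creating app."""
        return [name for name, (_, md) in self._index.items()
                if all(md.get(k) == v for k, v in criteria.items())]

store = ObjectStore()
store.put("scan-001", b"...image bytes...", patient="P123", modality="MRI")
store.put("scan-002", b"...image bytes 2...", patient="P123", modality="X-ray")
print(store.search(patient="P123", modality="MRI"))  # ['scan-001']
```

Because the lookup key is metadata rather than a directory path, the same repository can serve clinical data created by different applications, as in the hospital example above.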



‘Organizations are exploring alternatives to optimize their existing storage infrastructure’

Please tell us about the latest trends in the storage sector. How do you view India compared to the rest of the world?
Our planet produces an estimated 15 petabytes of information each day, and more and more of that information is in the form of unstructured data, such as text messages, sensor data, satellite imagery and posts to social media sites. Some of the key trends in storage, based on client pain points, include:

Storage efficiency: Today, instead of investing more in storage technologies, organizations are exploring alternatives to make maximum use of their existing storage infrastructure. Thin provisioning has proved to be one of the most effective technologies for optimizing utilization of available storage.

Storage virtualization: Storage virtualization and consolidation is emerging as an effective technique to simplify management, reduce administrative costs and improve cycle time. Storage consolidation can help support better business efficiency by improving storage capacity utilization, which leads to more efficient storage management.

Disk-based backup with deduplication: Backup administrators are struggling to keep up with the rapid growth of storage, as backups and recoveries take too long. Huge manual efforts are required during backup, and it is difficult to measure the success of the backup process. Disk-based backup with deduplication helps address these issues.

Data Curation: The next step beyond data preservation is data curation — the ongoing management of data through its lifecycle. This will add value to data that will help businesses glean new opportunities, improve information sharing and preserve data for reuse. Social media sites such as Facebook and Twitter are examples of the power of curated data, compiling the digital lives of users and giving them a platform to organize their content.

Storage Analytics: Analytics will

help turn curated data into intelligence and knowledge. Historical trending analytics and infrastructure analytics let businesses index and search in a more intelligent way, and analytics on stored data can give businesses insights. Watson technology for health care is an example: Watson collects data from many sources and can analyze its meaning and context.

The Indian storage market is definitely booming. According to Gartner, the Indian IT infrastructure market (server and storage) is set to reach USD 3.01 billion by 2016, with storage as the fastest-growing segment.

According to you, what are the primary reasons for increased storage requirements within enterprises?
Data volumes are rapidly escalating; today, organizations create more data than ever. It's been estimated that every two days we now generate as much data as existed in total before the dawn

With enterprise data escalating rapidly, there is huge growth in storage requirements. In a discussion with Jasmine Kohli of InformationWeek, Christian J. Leeb-Hetzer, Vice President, Storage Sales, IBM, says that instead of investing in new storage technologies, organizations are primarily looking at alternatives that help them make maximum use of their existing storage infrastructure. He also talks about the latest trends in the storage sector and the impact of cloud computing on storage


of the new millennium. Data is increasingly helping organizations devise business strategies. With the rise of smart analytics, organizations strive to leverage all the available information for more business value by detecting and acting on underlying patterns. This is the primary reason why data storage and protection is a high priority for organizations. Additionally, in recent years there has been a significant increase in the level of oversight of the business practices of every organization. HIPAA, Sarbanes-Oxley, SEC/NASD, and US DoD 5015.2, among others, have forced organizations to closely review their electronic data management practices. Failure to comply with the regulations pertaining to an organization's area of business can result in civil, or in some cases, criminal penalties.

Please explain with a few examples how stringent regulations governing industry sectors are playing a role in increasing storage requirements.
Regulations have impacted the manner in which we store our data, retrieve it and use it at times of need. These regulations may vary from country to country, but they certainly play a big role in data storage. For instance, in India, the RBI has mandated that banks need to consider near-site DR (disaster recovery) architecture, given the need to drastically minimize data loss during exigencies and to enable quick recovery and continuity of critical business operations. Major banks with significant customer delivery channel usage and significant participation in financial markets/payment and settlement systems may need to consider a plan of action for creating a near-site DR architecture over the medium term. This sort of scenario makes it compulsory for banks to think about how data is not only stored, but also protected, available and accessible at all times. Another example is the healthcare industry — stringent laws and regulations require organizations in the sector to collect and manage enormous amounts of digital information in the form of patient records, accounts, images, studies, and other types of data. A hospital, for instance, requires reliable access to this data to perform its core functions and provide medications. Losing any of this critical patient data can result in a risk to a patient's well-being, as well as severe costs, penalties, and loss of reputation.

What, according to you, are the challenges of security from the storage front?
The most valuable asset in today's information society is data, which must be stored, backed up, and archived. Many modern storage systems secure the data using cryptography. Some of the critical issues of security in storage include data leakage and security threats for data stored in the cloud. Protecting data at rest in storage systems poses new challenges compared to protecting data in flight, which has been the focus of communication security for some time and is well understood today. One notable difference between these two problems is that communication channels typically use a streaming interface with First-In/First-Out (FIFO) characteristics, whereas storage systems must provide random access to small portions of the stored data. IBM offers a portfolio of information security solutions based on its innovative self-encrypting disk and tape drives. These drives are designed to encrypt data automatically as it enters the drive to be stored, and then automatically decrypt it as it moves out of the drive. The embedded encryption engine helps ensure that there is no performance degradation compared to non-encrypting drives. This drive-level encryption approach reduces the risk of information compromise when storage media are physically removed from the storage systems.

Please give us your view on the impact of cloud on storage.
According to IDC, by 2014, more than 50 percent of all new storage capacity will be deployed in a public cloud. Cloud is certainly gaining momentum. Yet many IT solutions traditionally used for data protection purposes, in areas such as backup and archiving, lack the complete range of features needed to apply to a cloud context. They may, for instance, not integrate well with other cloud capabilities, provide comprehensive scripting and policy management, or even work properly with virtual servers — the basic building block of clouds. For these reasons, organizations are increasingly seeking new ways to shield their critical data from many forms of risk. Yet given the fact that IT budgets are often either flat or actually falling, this is a puzzle that will often require a creative solution. Enterprises are deploying private and hosted clouds to respond faster to business demands for new apps, to improve service levels for given workloads, and to reduce cost across the data center. The enterprise storage architecture deployed is paramount for ensuring agility, performance, and reliability of a private or hosted cloud. Integration of storage infrastructure across hypervisors, cloud orchestration software, and open source architectures is required. Storage architectures for private and hosted clouds must support a range of mixed app workloads and the dynamic placement of data on the right storage tier. These essential storage functions enable faster service response times, improved service-level agreements (SLAs), and greater agility, performance, and reliability for private and hosted clouds.

"Traditional IT solutions for data protection lack the complete range of features needed to apply to a cloud context"

- Jasmine Kohli
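The drive-level encryption model described in the interview (data is encrypted on the way into the drive and decrypted on the way out, so applications never handle keys and media removed from the system hold only ciphertext) can be sketched as follows. This is a deliberately simplified toy: the SHA-256-based keystream stands in for the AES engine a real self-encrypting drive would use, and the class and method names are invented, not IBM's interfaces.

```python
import hashlib

def _keystream(key: bytes, n: int) -> bytes:
    """Illustrative counter-mode keystream built from SHA-256. NOT real
    cryptography; it only demonstrates the transparent encrypt/decrypt flow."""
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

class SelfEncryptingDrive:
    """Writes store ciphertext on the media; reads decrypt transparently."""
    def __init__(self, media_key: bytes):
        self._key = media_key
        self._media = {}  # sector number -> ciphertext bytes

    def write(self, sector: int, plaintext: bytes):
        ks = _keystream(self._key + sector.to_bytes(8, "big"), len(plaintext))
        self._media[sector] = bytes(a ^ b for a, b in zip(plaintext, ks))

    def read(self, sector: int) -> bytes:
        ct = self._media[sector]
        ks = _keystream(self._key + sector.to_bytes(8, "big"), len(ct))
        return bytes(a ^ b for a, b in zip(ct, ks))

drive = SelfEncryptingDrive(b"factory-media-key")
drive.write(7, b"patient record")
assert drive._media[7] != b"patient record"   # ciphertext at rest
assert drive.read(7) == b"patient record"     # transparent decrypt on read
```

The security property in the article follows directly: a disk pulled from the array carries only the contents of `_media`, which are useless without the media key held inside the drive.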


Interview

The trend of DR in the cloud is slowly gaining momentum, with organizations moving selective DR applications to the cloud, says Steve Stavridis, PlateSpin Product Marketing Manager, NetIQ. In an interview with Jasmine Kohli of InformationWeek, he discusses how organizations can ensure an effective DR strategy while optimizing their budget

‘DR in the cloud is the new trend’

Disaster recovery products feature low in the priority list of CIOs. What is your perspective?
Disaster recovery (DR) products, from a cost perspective, fall in the quadrant of being a priority but of low urgency. The reason is that a lot of CIOs and IT managers see their IT projects as revenue-generating projects. With projects such as a SIP implementation or an e-mail upgrade, they can see a visible total cost of ownership (TCO). The challenge with disaster recovery projects is that one does not know what the TCO or the return on investment (ROI) will be, because the ROI of a DR project is realized only when disaster strikes. However, nobody can predict beforehand when a disaster will occur. If we take a look at the traditional solutions that address the need for disaster recovery, customers effectively have one of two options: a very expensive option and an inexpensive one. Most customers spend 80 percent of DR budgets in protecting 20 percent of


informationweek march 2013

their applications — effectively the mission-critical applications. The other 80 percent of applications are either not protected at all, or organizations are paying too much to protect them. So, we advise our customers to reconsider their DR strategy and also protect the other 80 percent, the non-mission-critical apps, and we give them inexpensive options to ensure disaster recovery for those applications.

What could be considered an inexpensive DR solution? Please explain with an example of an organization that has benefited by adopting such a solution.
From a disaster recovery technology perspective, when you are looking at deploying technologies for a large enterprise and protecting its mission-critical applications, we are talking about a solution costing more than half a million dollars. For a mid-size enterprise, an inexpensive solution would cost less than a hundred thousand dollars. It is decided more on a case-by-case basis, looking at the process a particular company wants to protect and the solution it would

want to choose. Take the case of Multicolor Steel, a producer of a variety of color-coated steel roofing systems for clients throughout India. Multicolor Steel manages a complex production process, handling all aspects of the design, manufacture and installation of steel building systems. The company undertook a rigorous evaluation of several disaster recovery solutions, but found that most were too complex and costly for its needs. In contrast, the PlateSpin Forge appliance from NetIQ offered all the functionality the company wanted, in an easy-to-manage and cost-effective package. The mid-sized company invested in our PlateSpin Forge disaster recovery appliance to provide full protection for its six servers, which run SAP systems, design software, e-mail and file servers. The company realized that it could now manage all its data protection needs with just a single box, and there was no need to invest in duplicate hardware and software licenses.

BCM and DR are cross-disciplinary topics. What is your take on this?
Business continuity management (BCM) is more about having business processes in place and is the preliminary step to ensure that the business continues to function. DR is primarily about having the appropriate technology and personnel to provide the services that underpin the business processes. Though both are separate, they are intertwined. The distinction is that BCM supports business processes, while DR is more technology-focused.

What should enterprises consider before forming a DR strategy? What are the requirements that dictate the need for and applicability of a recovery solution?
While formulating their DR strategy, enterprises contemplating recovery solutions should look at the Recovery Time Objective (RTO) and Recovery Point Objective (RPO), because these two objectives effectively define the recovery required for each business application. RPO is essentially how much data, measured in time, an organization can afford to lose (the gap between the last recovery point and the moment of the disaster), while RTO is how long the organization can take to restore service after the disaster occurs. So, it is important for an organization to understand the RTO and RPO before formulating a DR strategy.

Please tell us what the evaluation criteria to set up a DR solution would be. How important would affordability be in the evaluation criteria?
In order to form evaluation criteria for setting up a DR plan, organizations should test the DR plan frequently. Secondly, it is extremely important for organizations to determine the RTO and RPO. Cost is unavoidable and important, but more important are the recovery requirements that drive the cost of investment. A host of people opine that DR is expensive insurance. However, going through that exercise and understanding the RTO and RPO,

will help them determine the cost of the technology that is required.

Your views on the challenges brought about by the diversity and magnitude of computing platforms and data. How can organizations ensure budget optimization?
It is a case of understanding the RTO. If we are looking at an RTO that needs to be zero, or that requires instantaneous recovery, the particular application will require many compute platforms: from clustering technology to duplicated infrastructure, one would be looking at everything needed to sustain that requirement. However, one can start looking at inexpensive solutions for recovery of the parts where recovery time can extend beyond four hours.

These days enterprises are moving away from traditional subscription models. What trends have you observed?
One of the emerging trends in the DR market is disaster recovery in the cloud. Disaster recovery as a service is a combination of infrastructure as a service and platform as a service. It gives enterprise customers the ability to look at some of the lower-tier ERP applications that don't have mission-critical impact in terms of IT and can sustain having their DR moved to the cloud. Medium-sized enterprises in particular are betting big on DR in the cloud, as it is a cost-effective option. Apart from this, the trend of recovery-as-a-service is also gaining momentum. Currently, a lot of organizations are cherry-picking and moving selective applications to the cloud. Enterprises moving to the cloud from a DR perspective mostly go for the non-mission-critical applications and opt for an on-premise model for mission-critical applications.
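The RTO/RPO definitions Stavridis gives can be made concrete with a small sketch. The function names, dates, and thresholds below are invented for illustration: worst-case data loss is the age of the newest recovery point (checked against RPO), and downtime is the gap between outage and restoration (checked against RTO).

```python
from datetime import datetime, timedelta

def meets_rpo(last_backup: datetime, now: datetime, rpo: timedelta) -> bool:
    """Worst-case data loss equals the age of the newest recovery point."""
    return (now - last_backup) <= rpo

def meets_rto(outage_start: datetime, service_restored: datetime,
              rto: timedelta) -> bool:
    """Downtime must not exceed the recovery time objective."""
    return (service_restored - outage_start) <= rto

now = datetime(2013, 3, 1, 12, 0)
# Backup taken 4 hours ago satisfies a 6-hour RPO.
assert meets_rpo(datetime(2013, 3, 1, 8, 0), now, timedelta(hours=6))
# A 6-hour outage violates a 4-hour RTO.
assert not meets_rto(datetime(2013, 3, 1, 12, 0),
                     datetime(2013, 3, 1, 18, 0), timedelta(hours=4))
```

Running such checks against each application's agreed objectives is one way to make the "80 percent of budget on 20 percent of applications" trade-off visible before a disaster, rather than after.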

Organizations can determine the cost of the DR technology required by understanding the Recovery Time Objective and Recovery Point Objective

- Jasmine Kohli



Storage virtualization gets real

Our four business scenarios show how to improve disaster recovery, boost disk utilization and speed performance

By Howard Marks


Storage virtualization can deliver benefits such as better utilization of existing storage, easier provisioning, improved performance of storage systems and applications, and lower-cost disaster recovery. But many different technologies fall under the "storage virtualization" umbrella. We'll use a fictional company to illustrate how small and midsize businesses might craft a storage virtualization strategy that meets different business needs and balances cost and performance trade-offs. In particular, we'll look at using



cloud gateways for faster file services and host-based replication for disaster recovery. We’ll show how hybrid storage that combines flash storage and traditional spinning disks can deliver faster performance while controlling costs, but doing so means sorting through several possible approaches. And last, we’ll show how server-side caching done right can improve virtual desktop performance. In our scenario, a new management group has taken over Acme Inc., a manufacturer of novelties and toys. Over the past several years, Acme has made limited investments

in its IT infrastructure as a result of the economic downturn. The company’s new CIO believes in the concept behind a software-defined data center, in which software performs functions such as networking and storage virtualization that have in the past been performed by dedicated hardware. He has asked his infrastructure group to virtualize as much new infrastructure as possible — including the storage. One of the first storage applications to be virtualized at Acme was file services. Before the upgrade, Acme had traditional disk-based network-

attached storage systems in its Los Angeles design and distribution center, and in its three sales offices across the country. Acme has millions of CAD and graphics files in its archive of product designs and marketing materials, and the company’s designers are often kept waiting as the NAS systems struggle to deliver these large files. Acme also struggled with transporting files from location to location. Most of the time, employees sent files via e-mail attachments to coworkers in other offices and to the company’s Asian manufacturing partners. This clogged up the Exchange server and backup repositories with multiple copies of files. In addition, employees began bypassing corporate IT, and its security safeguards, by using consumer services such as Dropbox.

Cloud Gateways Speed File Servers

To address these problems, Acme chose a cloud storage service that uses cloud gateways like those from Nasuni and Panzura. These gateways are deployed on premises at Acme's offices, and connect to cloud storage services from providers such as Nirvanix and Amazon's S3. The gateways in each location use local solid-state drives and spinning disks to cache actively accessed data while presenting a single integrated file namespace to users, regardless of their location. The LA design center will have a higher-end appliance with SSDs to provide the performance the designers need, while the sales offices and manufacturing partners can use less-expensive virtual appliances running under VMware's vSphere to keep costs reasonable. The gateways store all of Acme's data on the public cloud while encrypting it for safety and deduplicating it to keep Acme's monthly storage charges from becoming astronomical. Acme can also use cloud-based snapshots of its file system to reduce, or even eliminate, the need to make backups at its remote sites. The snapshots are a point-in-time view of the file system (and therefore the files) stored in the cloud,

and therefore don’t take up space on the appliance’s cache. The cloud gateway approach also provides some benefits for disaster recovery. Acme’s file data is stored in the cloud, so in the event of a disaster Acme can spin up a virtual machine version of the cloud storage gateway in the cloud provider’s infrastructureas-a-service environment. This allows employees to access their files quickly in the event of an emergency. In addition, Nasuni recently added remote access to its gateway, allowing users to get files from their home PCs, smartphones and tablets.
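The deduplication the gateways perform before upload can be sketched in a few lines: split each file into chunks, fingerprint them, and transmit only chunks the cloud has not already seen. The `DedupUploader` class, the fixed 4 MB chunk size, and the in-memory dict standing in for the cloud bucket are all simplifying assumptions; real gateways typically use variable-size chunking and a remote object store.

```python
import hashlib

CHUNK = 4 * 1024 * 1024  # fixed-size chunks for illustration only

class DedupUploader:
    """Gateway-side dedup sketch: files become ordered lists of chunk
    fingerprints; each unique chunk is uploaded and stored exactly once."""
    def __init__(self):
        self.cloud = {}       # fingerprint -> chunk bytes (stands in for S3 etc.)
        self.manifests = {}   # file path -> ordered fingerprints

    def upload(self, path: str, data: bytes) -> int:
        sent = 0
        fps = []
        for i in range(0, len(data), CHUNK):
            chunk = data[i:i + CHUNK]
            fp = hashlib.sha256(chunk).hexdigest()
            if fp not in self.cloud:      # dedup: skip chunks already stored
                self.cloud[fp] = chunk
                sent += len(chunk)
            fps.append(fp)
        self.manifests[path] = fps
        return sent  # bytes actually transferred to the cloud

g = DedupUploader()
design = b"A" * CHUNK + b"B" * CHUNK
g.upload("q1/design.cad", design)
sent = g.upload("q2/design-copy.cad", design)  # identical copy: nothing re-sent
print(sent)  # 0
```

This is why the article's e-mail-attachment workflow was so costly: every forwarded copy was stored whole, whereas a deduplicating gateway stores the duplicate CAD file as a manifest of fingerprints it already holds.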

Host Replication for Low-Cost DR

Given that Acme’s headquarters is in earthquake-prone Los Angeles, the company knows that natural disasters are a risk, but it couldn’t afford to build and manage a separate disaster recovery site. Instead, the company adopted host-based replication options, such as Vision Solutions’ Double-Take for its physical servers and Zerto Virtual Replication for Acme’s VMware infrastructure. Traditional array-based replication, where the array sends disk writes as they happen to a partner array at another location, would require Acme to purchase arrays from the same product line for both its primary and disaster recovery sites — a cost Acme can’t tolerate with its limited IT budget.

By contrast, host-based replication uses an agent in the host operating system or hypervisor to collect the data it will replicate, and transmits it to the remote site. At that remote site, an application running on another host computer receives the replicated data and writes it to its storage system, which need not be similar to the source storage system. Acme’s cloud providers run a multitenant version of the replication receiving program, which allows the provider to use a single host to receive replication streams from multiple customers and store each customer’s data in its own volume or repository. The customers pay a small monthly fee for each gigabyte of replicated data they store and for each host or virtual machine they replicate data from. Acme uses this cloud recovery service for its mission-critical ERP, database and Exchange servers. In the event of a disaster, Acme can spin up the replica servers in the cloud and have its applications up and running in a matter of minutes.
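The host-based replication flow described above (an agent captures writes on the source, ships them to a receiver that applies them to dissimilar storage) can be sketched as follows. The `HostAgent`/`ReplicaTarget` classes and the block-map representation are invented for illustration and are not the Double-Take or Zerto implementations.

```python
class ReplicaTarget:
    """Receiver at the DR site: applies journaled writes to its own storage,
    which need not match the source hardware."""
    def __init__(self):
        self.volume = {}  # block number -> data

    def apply(self, journal):
        for block, data in journal:
            self.volume[block] = data

class HostAgent:
    """Host-side agent: completes writes locally, queues them in a journal,
    and ships the journal to the remote site asynchronously."""
    def __init__(self, target: ReplicaTarget):
        self.target = target
        self.journal = []
        self.volume = {}

    def write(self, block: int, data: bytes):
        self.volume[block] = data
        self.journal.append((block, data))   # queue for replication

    def flush(self):
        self.target.apply(self.journal)      # one batched transmission
        self.journal.clear()

dr = ReplicaTarget()
host = HostAgent(dr)
host.write(0, b"ERP record v1")
host.write(0, b"ERP record v2")
host.flush()
print(dr.volume[0])  # b'ERP record v2'
```

Because the receiver only sees an ordered stream of logical writes, a multi-tenant provider can run one receiver for many customers and land each stream on whatever storage it has, which is exactly what makes the per-gigabyte cloud recovery pricing in the article possible.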

Go Hybrid To Balance Cost, Performance

Just as Acme’s designers struggled with performance on the old NAS system, Acme’s ERP and other database-driven systems were struggling to keep up with the demand for services at peak periods. During the lean budget years, Acme’s IT staff added disk drives to

What storage activities do you typically spend a significant time performing? (2012 / 2010)

Performing backup and archiving of data: 55% / 58%
Monitoring and managing storage systems: 49% / 54%
Developing strategies and tactics to improve our storage utilization and systems: 33% / 34%
Allocating storage capacity to applications and servers: 32% / 33%
Looking at new products and technologies: 27% / 30%

Data: InformationWeek State of Storage Survey of 313 business technology professionals in January 2012 and 377 in November 2010

InformationWeek, March 2013, p. 43

provide more random I/O performance by spreading the database across more spindles. However, while allocating more and more disk drives boosted performance, Acme only had a few hundred gigabytes of data, which didn’t come close to filling those drives. In other words, Acme was paying for more storage than it needed.

Acme’s analysis indicated that even at peak periods, its database servers would require somewhere between 10,000 and 15,000 IOPS. While an all-solid-state storage system could easily provide that level of performance, an all-SSD system was too expensive. A hybrid system, which automatically places the data being actively accessed in flash memory while using spinning disks to store the bulk of the data, struck Acme’s IT group as a more affordable way to get both the capacity and performance it needed.

Once Acme decided to go hybrid, it still had to choose from several hybrid approaches. Its IT team quickly rejected the approach of simply creating separate SSD and spinning-disk volumes on a common array and manually placing data on each. Acme’s ERP system keeps all its data in a single monolithic database, so while much of that data was cold much of the time, a volume-based approach would require keeping the entire database on expensive SSDs. In addition, Acme’s varying workloads

Do you use cloud storage services? (2012 / 2010)

Yes, for email: 8%
Yes, for archiving
Yes, for backup and recovery
No, but we’re considering it: 34% / 34%
would keep its small IT staff constantly migrating other workloads between the performance and capacity storage tiers. Acme wanted a system that would automatically determine which data should be in flash for performance and which should be on spinning disks.

At first glance, several of Acme’s staff found the concept of sub-LUN tiering intriguing. By moving frequently accessed, or hot, data from spinning disk to SSD, and colder, less frequently accessed data from flash to disk, sub-LUN tiering could address the performance problem while also letting Acme use the full capacity of both the disk and flash tiers.
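The sub-LUN tiering idea, tracking per-page access counts and periodically promoting the hottest pages to flash, can be sketched as follows. Everything here is illustrative (the class, the tiny page counts); real arrays operate on much larger pages and far more metadata.

```python
# Illustrative sketch of sub-LUN tiering: the array tracks an access count
# for every page and, on a periodic migration pass, promotes the hottest
# pages to the SSD tier and demotes the rest to spinning disk.

from collections import Counter

class TieringArray:
    def __init__(self, ssd_pages):
        self.ssd_capacity = ssd_pages     # how many pages fit in flash
        self.heat = Counter()             # page -> access count (ALL pages)
        self.ssd_tier = set()             # pages currently on flash

    def access(self, page):
        self.heat[page] += 1
        return "ssd" if page in self.ssd_tier else "disk"

    def migrate(self):
        # Run every few hours: hottest pages move to flash, the rest demote.
        hottest = [p for p, _ in self.heat.most_common(self.ssd_capacity)]
        self.ssd_tier = set(hottest)

array = TieringArray(ssd_pages=2)
for _ in range(100):
    array.access("db-index")              # very hot
for _ in range(80):
    array.access("hot-table")             # hot
array.access("cold-archive")              # cold
array.migrate()
print(array.access("db-index"))           # now served from flash: "ssd"
print(array.access("cold-archive"))       # still on spinning disk: "disk"
```

Note that `heat` covers every page in the system, which is exactly the metadata burden the article goes on to describe, and why such systems favor large pages and infrequent migration passes.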

What matters when buying storage
How important are these storage technologies and features when making storage purchase decisions? (Not important / Somewhat important / Very important)

Storage-based snapshots: 42% / 43%
Disk-to-disk-to-tape backup
Data deduplication
Data compression

Data: InformationWeek 2012 State of Storage Survey of 313 business technology professionals, January 2012



While sub-LUN tiering is more capacity-efficient than using flash as a cache, which uses SSDs to hold a copy of the hot data that’s also on disk, it requires the array controller to work harder by keeping access frequency metadata on each page in the system, and moving the colder data back to disks when promoting new data to the SSD tier. As a result, most of the sub-LUN tiering products on the market use large data pages of 1 to 42 MB, and only migrate data every few hours or overnight. Using large pages means that to migrate a hot database index that may be only 100 KB or so in size to the SSD tier, the system also has to migrate all the adjacent data, which may not be nearly as hot. This reduces the available flash capacity for other data that isn’t quite hot enough to be promoted.

In addition, Acme was concerned that a system that decided whether data should be in the SSD tier based on past access patterns wouldn’t always have the right data in the right place, especially with Acme’s varying workloads. While a cache-based system would require more disk capacity, as the disk tier would have to hold all the data, cache systems only have to track the “heat” of the data in the cache, not all the data on the system, and can therefore use smaller pages of a few kilobytes. Using smaller pages means a cache can be more efficient in its use of flash.
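The difference is easy to see in a toy read cache: it only needs metadata for the pages currently held in flash, so it can afford small pages. The sketch below assumes simple LRU eviction, which is an illustrative choice rather than any specific product's policy.

```python
# Sketch of a flash read cache with small pages. Unlike sub-LUN tiering,
# it tracks only the pages currently in the cache, using LRU eviction.

from collections import OrderedDict

class FlashReadCache:
    def __init__(self, capacity_pages):
        self.capacity = capacity_pages
        self.cache = OrderedDict()        # page -> data, in recency order

    def read(self, page, backing_store):
        if page in self.cache:
            self.cache.move_to_end(page)  # refresh recency on a hit
            return self.cache[page], "hit"
        data = backing_store[page]        # miss: go to spinning disk
        self.cache[page] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used page
        return data, "miss"

disk = {p: f"data-{p}" for p in range(10)}
cache = FlashReadCache(capacity_pages=3)
cache.read(1, disk)
cache.read(2, disk)
cache.read(1, disk)
_, status = cache.read(1, disk)
print(status)   # "hit": the hot 100-KB index stays in flash on its own
```

Because the metadata lives only for cached pages, the page size can shrink to kilobytes, which is the capacity-efficiency argument the article makes for caching over large-page tiering.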

In addition, because flash capacity is significantly more expensive than spinning disk, IT is usually more than happy to trade a little bit of additional disk capacity to reduce the amount of flash needed.

Acme’s database administrators were initially drawn to the idea of installing PCIe flash cards in the database servers and using server-side caching software like EMC’s VFCache, SanDisk’s FlashSoft or Intel’s Nevex. Putting the flash right on the server’s PCIe bus, they argued, would minimize latency and provide the biggest boost to database performance. On further research, they discovered that using server-side caching requires a significant compromise. If they used a write-back cache at the server, which caches both disk reads and disk writes, some of the write data would exist for some period of time only in the server’s write cache. This would mean that the back-end disk array wouldn’t have a consistent copy of the database, and therefore couldn’t create application-consistent snapshots of the database on the array. Even worse, if the server crashed, some data would be trapped in the PCIe cache, significantly complicating recovery.

While these problems could be avoided by using a write-through cache in the server, which would only cache reads and pass writes immediately to the back-end storage, a read cache doesn’t accelerate disk writes, and Acme’s DBAs insisted that during the pre-Halloween rush both write acceleration and snapshots were necessities. While several vendors, including Dell and QLogic, have announced server-side caching that replicates the write cache across multiple servers, Acme didn’t believe this technology was mature enough.

A new generation of hybrid storage system vendors, including NexGen, Nimble, Starboard Storage, Tegile and Tintri, build their systems around an integrated flash cache and capacity-oriented 7,200-rpm disks. While each of these systems has its own unique advantages, all have about 10 percent of their capacity in flash memory, which for most workloads will result in a cache hit ratio of 90 percent. The next-generation hybrid arrays also support thin provisioning, so the entire capacity of the array is managed as a single pool, and disk space is only allocated to each volume as data is written. Thin provisioning significantly improves disk utilization as space isn’t allocated to specific applications that

Do you use storage virtualization? (2012 / 2010)

Yes, all of our disk storage is in a single virtual pool of storage: 5%
Yes, some of our storage systems are in a virtual pool: 31% / 31%
No, but we’re planning to implement it in the next 12 months: 12%
No, but we’re looking into it: 30% / 29%
No, we’re not interested: 15% / 15%
Don’t know: 9% / 8%

Data: InformationWeek State of Storage Survey of 313 business technology professionals in January 2012 and 377 in November 2010

may or may not ever use it. Considering that most storage administrators allocate not just their best guess as to an application’s requirements, but also a substantial safety margin, thin provisioning should let Acme buy 30 percent to 50 percent less storage across all its applications.

The new disk array will also house storage for Acme’s VMware server infrastructure and Web servers. The SSD cache will automatically accelerate these applications as each application’s workload waxes and wanes. In addition, most of the new generation of hybrid arrays use data-reduction techniques like compression and data deduplication, which not only reduce the cost of storage by squeezing more data into the same amount of disk space, but do the same for the SSD cache.

Acme did choose to use a server-side disk cache, such as EMC’s VFCache, SanDisk’s FlashSoft or Proximal Data’s AutoCache, for its virtual desktop infrastructure initiative. Following VMware’s best practices, Acme isolated the storage for its VDI servers from the storage for its other applications. Virtual desktops present unique storage challenges. At the start of the day, the login storm presents a huge, read-intensive workload as employees fire up their desktop machines. However, throughout the workday, the steady-state load is predominantly write traffic as the virtual desktops create temporary files and update user profiles. Acme has chosen to use a write-back caching approach with PCIe SSDs in its VDI servers, as this will provide the best performance both during the boot storm and throughout the day.

While Acme is a relatively small company, its adoption of storage virtualization will allow it to support growing application needs with less disk space and effort, while providing a significantly higher level of data protection and recovery than traditional approaches.

Source: InformationWeek USA
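The thin-provisioning behavior described in the article, volumes that advertise a large logical size while the pool allocates physical space only as data is written, can be modeled in a few lines. The names and numbers are invented for illustration.

```python
# Toy model of a thin-provisioned storage pool: creating a volume reserves
# nothing; physical capacity is consumed only as data is written.

class ThinPool:
    def __init__(self, physical_gb):
        self.physical_gb = physical_gb
        self.allocated_gb = 0

    def create_volume(self, logical_gb):
        # No space is reserved up front, regardless of the advertised size.
        return {"logical_gb": logical_gb, "written_gb": 0}

    def write(self, volume, gb):
        # Physical space is consumed only when data lands on the volume.
        if self.allocated_gb + gb > self.physical_gb:
            raise RuntimeError("pool exhausted: add physical capacity")
        self.allocated_gb += gb
        volume["written_gb"] += gb

pool = ThinPool(physical_gb=1000)
erp = pool.create_volume(logical_gb=2000)   # over-provisioned on purpose
web = pool.create_volume(logical_gb=2000)
pool.write(erp, 300)
pool.write(web, 150)
print(pool.allocated_gb)   # 450: only written data consumes the pool
```

The safety-margin waste the article mentions disappears because the margin is logical, not physical; the operational cost is that the pool can be oversubscribed and must be monitored for exhaustion.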



Will Big Data platforms enable banks to make the next big leap?


Vishnu Bhat

The emergence of specialist Big Data platforms will open up new dimensions in insight generation by enabling the bank’s business users to generate insights up to eight times faster

On the Web: Does Big Data mean big security issues? Read article at: http://www.



It’s more than apparent that the financial community’s interest in data has grown to big proportions in recent times. In the 12 months ending October 2012, an IT analyst received more Big Data enquiries from the banking sector than from any other. So, with data — both structured and unstructured — exploding to petabyte levels across traditional, mobile and social channels, the Big Data world is evolving into the banking industry’s oyster.

A compelling instance of how banks are leveraging Big Data is evident in the arena of fraud detection — a priority that naturally tops their agenda. Let me walk you through the all-too-familiar situation of account takeover fraud (when a criminal impersonates a genuine card holder, gains control of the account and then makes unauthorized transactions) and how Big Data can save the day. Today, banks can predict and prevent such account takeovers in near real-time based on the analysis of customer transaction patterns. An event-level risk scoring mechanism is employed to identify potential fraudulent action or deviant behavior by tracking inputs from multiple sources, such as customer call logs, existing customer documentation, transactions and more. Transactions like a change of PIN, a change of residential address, and a request for issue of a new card are treated as events and assigned risk-rating scores too. These aggregated scores, in tandem with metrics such as the timing between transactions and existing red flags for customers, then indicate the potential risk of customer identity takeover. When these risk ratings cross a predetermined threshold, they trigger an alert that automatically initiates appropriate fraud-prevention measures.

This thirst for insight amongst financial institutions is nothing new. Businesses have invested quite a bit in data analysts tasked to provide meaningful insights from data captured through various systems. To cater to this need, most banks end up building point warehouses of structured data and, over a period of time, create multiple puddles of data. And, more often than not, data analysts struggle to find what they are looking for. And it takes them just too long to get to it.

The good news is that innovation is brewing in the form of a specialist Big Data platform: a platform that straddles the entire data and analytics value chain — from the discovery of Big Data and integration with structured data sources, to unlocking the value of insights and the operationalization of decisions. And because it cuts across internal and external data sources, both structured and unstructured data, enterprise systems and document management systems, customer touchpoints and social networks, the platform is an asset for all types of analyses — from rigorous, inward-looking insights to freewheeling analytics based on qualitative opinion gathered from the social universe. And unlike point solutions with limited reach, the platform connects to enterprise systems, banking channels and external data sources with ease in real time. It comprises a repository of pre-built algorithms and reporting options, which the bank’s business users can access to leverage insights up to eight times faster than before. Best of all, the platform goes the distance right until execution by providing a virtual space where key stakeholders can come together to collaborate and decide, and then operationalize those decisions using the platform’s integrated workflow capability.

Is this the platform that will enable banks to make the next Big Data leap? On current evidence, I’d say the answer is a resounding yes.

Vishnu Bhat is Vice President and Head of Cloud Services at Infosys
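The event-level risk scoring mechanism the column describes can be sketched as a simple aggregation against a threshold. The event names, scores and threshold below are invented for illustration; a real system would derive them from transaction history and statistical models.

```python
# Sketch of event-level risk scoring for account takeover detection.
# Per-event risk ratings and the alert threshold are hypothetical.

EVENT_RISK = {
    "pin_change": 30,
    "address_change": 25,
    "new_card_request": 35,
    "balance_inquiry": 2,
}
ALERT_THRESHOLD = 80

def takeover_risk(events):
    """Aggregate the risk scores of a customer's recent events."""
    return sum(EVENT_RISK.get(e, 0) for e in events)

def check_account(events):
    score = takeover_risk(events)
    if score >= ALERT_THRESHOLD:
        return score, "ALERT: initiate fraud-prevention measures"
    return score, "ok"

# A PIN change, address change and new-card request in quick succession
# is the classic account takeover pattern:
score, action = check_account(
    ["pin_change", "address_change", "new_card_request"])
print(score, action)   # 90 crosses the threshold and raises an alert
```

In practice the aggregation would also weigh the timing between events and existing red flags, as the column notes, but the threshold-and-alert shape is the same.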


Is shared storage’s price premium worth it?


George Crump

As local storage improves, shared storage proponents’ justifications for investing more in SANs and NAS just aren’t cutting it

On the Web: How data deduplication technology can enhance disaster recovery. Read article at: http://www.



Virtualization has propelled the adoption of storage area networks (SAN) and network attached storage (NAS) to new levels. That adoption comes with new levels of frustration, as the task of operating shared storage becomes even more challenging in the virtualized environment. Increasing numbers of vendors are encouraging IT professionals to just say “no,” and customers are looking for alternatives to shared storage. Local storage, in the form of internal hard disks and even PCIe SSDs, has emerged as a leading candidate to replace the SAN, and it has developed workarounds for its biggest weakness: lack of shareability.

But how did we get here? Why is the frustration with shared storage so high? Generally, administrators cite three sources of SAN frustration. First, there is the cost of shared storage, which almost always carries a premium compared to local storage. Second, there is the frustration of having to constantly tune the storage and its supporting infrastructure, something that is increasingly problematic in the ever-changing virtual environment. Finally, there is the frustration over the complexity of day-to-day management of the SAN. We will focus on the first frustration: the price premium.

The premium price of shared storage is caused partly by the cost of the infrastructure required to share storage: the adapters that go into the servers and the switches that the adapters and the storage connect to. Of course this is data, so everything has to be redundant, which compounds the cost problem. Another source of the price premium is the cost of the actual storage unit. It also must be highly available, so that means multiple ports, power supplies, and storage controllers. Local storage needs these same components, sometimes even in redundancy, but they all exist inside the server they are installed in.

Finally, shared storage almost always includes capabilities like unified storage (SAN/NAS), snapshots, replication, and automated storage tiering that may not exist in local storage. While many vendors include these capabilities in the storage system at no additional charge, nothing is actually free; most shared storage vendors hold significantly higher profit margins than their local storage competition.

Shared storage proponents can no longer claim that the advantage of being shared is enough justification for this premium cost. In many cases they can’t claim a performance advantage. Operating systems and hypervisors now offer many of the nice-to-have features listed above, so those hold less value as well. To justify their high price, shared storage solutions need to focus on one key area: offering greater capacity efficiencies than local storage. They should be able to do this in two ways. First, shared storage should reduce the physical capacity footprint required in a shared environment. Deduplication is an ideal way to reduce storage capacity needs, especially in the virtual environment, and the technology should become standard on all primary shared storage systems. Second, shared storage should allow better use of capacity, since it can be assigned as needed to a given host. Local storage will almost always waste capacity, and it can’t allocate spare capacity to another server. This granular allocation is especially important with flash storage, since that capacity is still premium priced. Shared storage can carve up the allocation of flash solid state to the exact requirements of each connecting host, or it can use it as a global pool accelerating only the most active blocks of storage. As a result, the total SSD investment may be less with shared storage than if flash is purchased for each individual server.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments
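The capacity saving from deduplication that Crump calls for can be illustrated with a toy content-addressed block store: identical blocks are stored once and referenced by hash. This is a simplified sketch; production systems deduplicate fixed- or variable-size chunks, inline or post-process.

```python
# Toy block-level deduplication store: blocks are keyed by a content hash,
# so identical blocks across volumes are stored exactly once.

import hashlib

class DedupeStore:
    def __init__(self):
        self.blocks = {}     # content hash -> block data (stored once)
        self.volumes = {}    # volume name -> list of block hashes

    def write(self, volume, data_blocks):
        refs = []
        for block in data_blocks:
            digest = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(digest, block)   # store new content only
            refs.append(digest)
        self.volumes[volume] = refs

store = DedupeStore()
# Twenty nearly identical VM images: ideal deduplication territory.
os_block = b"base OS image block"
for i in range(20):
    store.write(f"vm-{i}", [os_block, f"vm-{i} config".encode()])

logical = sum(len(refs) for refs in store.volumes.values())
print(logical, len(store.blocks))   # 40 logical blocks, 21 stored
```

The shared base-image block is stored once no matter how many volumes reference it, which is exactly why virtual environments see such large dedupe ratios on shared arrays.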


Why the cloud ecosystem needs common standards of measurement
With a plethora of cloud computing service providers defining their offerings in their own distinct terminologies, Indian CIOs tell us why there is a strong need to develop common standards of measurement for cloud computing service delivery, and propose key areas where standards should be framed
By Amrita Premrajan


Since its inception, cloud computing has undoubtedly been the one buzzword that has kept the market captivated: a radical new IT service delivery model that brings several benefits, including shifting cost models from CAPEX to OPEX, greater agility and reduced complexity. In the past few years, we witnessed the emergence of a number of cloud computing startups. The potential of this model also motivated traditional technology vendors to bring their own cloud solutions to market and tap this trend. In spite of the availability of a variety of cloud computing services from different technology vendors, one of the biggest challenges that

has been hindering the widespread acceptance of cloud computing solutions by enterprises has been the lack of specific, global standards that could enable CIOs to evaluate various vendors on specific parameters and choose the one that suits their needs. “As a CIO, to choose the right cloud computing service provider, I need to follow the traditional, elaborate procedure of talking to each one of them and evaluating each of their services. With every service provider defining their service in their own way, it makes the whole process of comparing and choosing between different vendors offering the same solution far more cumbersome,” says Daya Prakash, CIO, LG Electronics. Echoing the same thought,

Vijay Sethi, VP – Information Systems and CIO, Hero MotoCorp, says, “With a lot of new cloud computing offerings coming up every other day, a question very often asked by CIOs is how to judge the cloud offering of provider ‘A’ against that of provider ‘B’. The need of the hour is to define specific standards for cloud computing service delivery. That would make the question easier to answer, by enabling comparison of competing products or service offerings based on the extent to which various vendors follow standards and protocols, and by allowing a fair judgment on that basis.” This highlights the need to chart out common cloud computing standards that would ensure that the fundamental building blocks for various service offerings


are consistent and can be universally understood and compared across providers. “Once the standards are defined, CIOs and the vendor community will have a common understanding of these standards and will use common terminologies to define their services. This would not only make it easier for the service providers to offer their solutions, but would also make it easier for CIOs as practitioners to understand which cloud service providers are following which standards, and eventually to consume these services with confidence,” asserts Prakash.

Emphasizing how defining cloud computing standards will benefit both CIOs and vendors, Sethi says, “While for the vendors defining specific standards simplifies product development and improves time-to-market, from a customer perspective it gives CIOs confidence of interconnectivity and interoperability, and also of migration from one provider to another.”

Let’s look at some of the areas proposed by eminent Indian CIOs where robust cloud computing service delivery standards should be drafted.

Security Standards

If an enterprise adopts a cloud computing solution, it entrusts a lot of confidential enterprise data into the hands of the cloud computing vendor. So, enterprises expect more than just a verbal assurance or a vague written commitment from the vendor. “Today, the cloud computing vendors come to us and assure us that security measures are there, but in reality there is no safeguard or redressal

“Defining cloud computing standards will simplify product development and improve time-to-market for vendors and will give CIOs confidence of interconnectivity and interoperability” Vijay Sethi, CIO, Hero MotoCorp

mechanism for us if their security measures go wrong somewhere and our valuable data is lost,” says Mathew C George, Chief Manager, Indian Oil Corporation. George says that all the offered cloud computing platforms must have very detailed and robust policies and procedures. “Cloud computing platforms must have policies and procedures to guarantee the highest possible levels of security across several types of security — physical security, network security, application security, the security of internal systems, and data backup strategy. Also, since you as an organization are entrusting valuable organizational data to a vendor, it is extremely essential to get a third-party certification,” he says.

Apart from all aspects related to the safety of customer data, cloud security standards should also include policies regarding termination of contract. “Standards linked to information security should not only encompass all areas related to protecting customer data that may be running across multiple geographically dispersed data centers, but also ensure that at the time of termination of contract the provider does not have any residual data left, including in backups or tapes or archives,” says Sethi.

Scalability Standards

Cloud computing service providers widely publicize the scalability of their offerings. However, a lot of discrepancy exists in this area. “Although cloud computing service providers claim to be scalable, when you actually want to scale up, it takes some time to ramp up. The so-called seamless ramp-up is not quite there. If today scalability is a restriction for the vendors, then they should say so and tell the customers that they would take a certain amount of time to ramp up or ramp down,” says George. To resolve this discrepancy, George suggests that there should be proof of the ability to scale, and resources to guarantee the highest standards of service quality and performance.

Data Migration Standards

“With every service provider defining their service in their own way, it makes the process of comparing and choosing between vendors more cumbersome”
Daya Prakash, CIO, LG Electronics

Another important point, which Sethi highlights, is the necessity of a standard ensuring that in cases where the relationship with a provider goes bad, or the enterprise has to change providers for any reason, the movement of data to a new provider can be easily done. “This can be ensured only if there are specific standards that beforehand ensure that the data formats, applications, APIs etc., are such that the data can be easily moved from one provider to the other,” he says.

Disaster Recovery Standards


CIOs we spoke to unanimously felt a need for robust disaster recovery standards, as currently disaster recovery protocols are not very transparent. “Today, if you really ask the cloud computing service providers about their disaster recovery protocol, they are not very clear or transparent about this. When an enterprise gets onto a cloud computing platform, it is important to define and codify the DR standards so that customers clearly know what they are getting into. There should be standards that ensure that customer data, which might be running on geographically dispersed data centers, would be properly protected, while ensuring extensive backup, archiving, failover capability, etc.,” says George. Voicing similar thoughts, Sethi said, “A specific standard has to be defined on hosting of data centers, ensuring disaster recovery, business continuity and high availability, with SLAs which can be customer-specific.”

Standards for MIS reports on cloud performance

MIS reports on the performance of the cloud, which the cloud service provider generally sends to the user, are today in an antiquated state. Hence, CIOs feel there is definitely a need for standards for MIS reports on cloud performance. “There is a lot more standardization and upscaling which needs to be done to these reports. For example, if I got a daily report at some fixed time in the day showing me exactly what the cloud performance is and what is happening up there, it would really improve my confidence in the whole system,” says George.

Building a committee for drafting standards

We have highlighted some of the key areas where CIOs believe specific standards should be framed for cloud computing service delivery. The question now is which stakeholders should come together as a committee to frame a set of common cloud computing standards that would then be mandated across the vendor community. CIOs suggest that a committee comprising people from different backgrounds, like professors, CIOs, consultants, vendors and service providers, should come together to develop a common definition of cloud and standards for cloud computing service delivery, which then needs to be followed by every stakeholder. Adding to this, George says that along with all these stakeholders, concerned officials or bodies from the Government of India should also be included in this committee, so that they can re-look at the IT Act and accommodate new cloud computing standards within it. Giving an example,

“Cloud computing platforms must have policies and procedures to guarantee highest possible levels of security”
Mathew C George, Chief Manager, Indian Oil Corporation

he says that there is a clause in the IT Act which says that all data should remain within the physical boundary of India, which creates confusion. He opines that these issues can only be resolved from the ground up if a concerned government body is on board in the committee. Likewise, Sid Deshpande, Senior Analyst, Gartner India, says, “Cloud standards are a complex area, with the requirements differing from country to country, and effective cloud standards can only be brought about when vendors, IT buyers and governmental bodies are all on board and working towards common objectives.”

From these discussions it is quite clear that in spite of the clear advantages cloud computing can bring to an organization, there is a lot of skepticism in the minds of CIOs regarding adoption of cloud computing solutions, due to the lack of defined standards for cloud computing service delivery. These CIO insights cover a few areas where specific standards should be drafted for cloud computing service delivery. But this is just the tip of the iceberg: pooling collective insights from the vendor community and the CIO community would bring forth many more areas that need attention. CIOs strongly recommend building a committee comprising technology vendors, consultants, CIOs and relevant government bodies, which should draft a common definition of cloud computing and various standards for defining cloud computing service delivery. These standards then need to be followed across the globe. However, technology vendors should not see the drafting of common standards for cloud computing service delivery as a roadblock. Instead, it will serve as a catalyst that brings in transparency and, in turn, instills much-needed confidence amongst the CIO community.

Amrita Premrajan



How TVS Motors is using Shelf Engineering to push efficiency to a new level

How did the concept of ‘Shelf Engineering’ originate? What was the inspiration?
The concept of ‘Shelf Engineering’ comes from the product development process. At TVS Motors, we are driven to adopt best practices from other functions. In new product development, there are several good practices to ensure target quality, cost and time. Shelf engineering, FMEA (Failure Mode Effect Analysis), and early involvement of suppliers in design are some examples. Shelf engineering is the concept adopted to turn an idea into reality and use it when required. In IT, when our CXOs request solutions, the time taken from request to implementation is high, as IT needs time to find and adopt new technology, develop solutions, test, train and implement. Adopting Shelf Engineering in IT aims to reduce the latency of R&D in solution development. Having ensured alignment of

IT initiatives with business strategies, the IT team in our firm started looking at the future needs of the business. We are constantly in touch with happenings in the industry and technology availability, and have knowledge of use cases. On the other hand, we are also aware of business challenges and initiatives. We also understand that any solution offered when not called for will suffer from poor ownership among users. At the same time, when a user demands a solution, the expectation is to provide it overnight, if not at the speed of light. Given these propositions, we try to develop and prototype technology solutions based on insights we gain from our own business and technology available in the market, and shelf them. When users ask for solutions, we can provide them in less than a third of the usual time, with some minor modifications.

Can you explain briefly the concept of ‘Shelf Engineering’?
At TVS Motors, the Shelf Engineering concept means developing a part, process or technology for future use rather than immediate use. The concept of the shelf is to create or develop something and store it for future use (a three- to five-year period). This concept came into being to cut new product development lead time by proactively developing things ahead of time. In our own company, we found that development took longer if a technology, feature or process was new to us. Hence, the guideline came from our mentors that we should not mix technology development and new product development together, in order to achieve shorter development cycle times. Shelf engineering is a concept used in the new product development process where a new design for a product or part is either documented

How nice would it be if your IT team could pull out solutions to your pressing business need the moment you express a challenge or business requirement? If you are a business head at TVS Motors, you can actually experience this. In a detailed interview with Srikanth RP from InformationWeek, TG Dhandapani, CIO, TVS Motors, explains the concept of shelf engineering, and highlights how it has helped his firm cut down on new product development time and boost innovation. Some edited excerpts:


or developed to prototype stage and adopted at a later date.

How has this concept helped your organization?
This concept has helped my team work closely with business, understand challenges and orient major initiatives around an ROI approach. It has helped us reduce the lead time between problem identification and solution. Simultaneously, it has helped the IT team talk in a business language.

Can you state some examples where this concept has helped your organization?
Several projects were shelf engineered and later deployed in the last two years, numbering 16 so far. IT executives have started working “beyond current demand” and are encouraged to produce at least two ideas per person a year. One of the major byproducts of this initiative is IT stepping into the shoes of the user while testing.

One simple example is the centralized MoM (Minutes of the Meeting) project. In any large organization, several structured and ad hoc meetings are held. As per standard operating procedure, minutes of the meeting are prepared in an MS Word document and circulated to stakeholders of the meeting, and compliance with the minutes is reviewed at the next meeting. In many cases, executives look at the minutes only on the day of review and

manage the meeting with excuses, and many times the same subject is discussed again and again. There is no advance alert to the person responsible for compliance, nor does the chairperson have visibility of progress. Observing these issues, we developed a centralized MoM system on an open source platform. The system automates the entire process from meeting request to resolution, and is integrated with the mailing system and calendar. The project was engineered and tested within IT. After adoption in one area, there was a request from the Chairman's office to track the action points of business meetings for ‘on-time compliance’, with a mandate to develop the system in three months. Since it was already shelf engineered, the MoM project was deployed across the organization in less than a day. Now HoDs have a dashboard displaying the status of the actions they have committed to, which helps managerial effectiveness.

How do you measure the success of this initiative?
The simple measure for me is the rate of deployment of shelf engineered projects, which is now above 80 percent. My IT engineers are also now more often seen in the Gemba (place of usage).
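The core of such a centralized MoM system can be sketched in a few lines. The sketch below is illustrative only — the class and function names are our own, not TVS Motors' — but it captures the three pieces described: action items tracked per meeting, advance alerts before items fall due, and a compliance summary for the chairperson's dashboard.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class ActionItem:
    owner: str
    description: str
    due: date
    done: bool = False

@dataclass
class Meeting:
    title: str
    actions: list = field(default_factory=list)

def pending_alerts(meeting, today, lead_days=3):
    """Action items due within `lead_days`, so owners are reminded in
    advance instead of on the day of the review itself."""
    horizon = today + timedelta(days=lead_days)
    return [a for a in meeting.actions
            if not a.done and today <= a.due <= horizon]

def compliance_dashboard(meeting, today):
    """Chairperson's view: counts of completed, pending and overdue items."""
    done = sum(a.done for a in meeting.actions)
    overdue = sum((not a.done) and a.due < today for a in meeting.actions)
    pending = len(meeting.actions) - done - overdue
    return {"done": done, "pending": pending, "overdue": overdue}
```

A real system would, as the interview notes, also integrate with mail and calendar; the sketch only shows the tracking logic.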

Using shelf engineering, the IT team at TVS Motors can provide users with the required solution in less than a third of the time it normally takes to develop a solution

u Srikanth RP



‘BYOD era needs diligent monitoring of security incidents’

Please elaborate on the opportunities emerging with the ever-growing trend of mobility within the enterprise. How is SAP planning to cash in on this trend?
The rapid proliferation of consumer mobile devices is changing the traditional IT environment drastically, leading to a high level of IT consumerization. The explosion of connected devices in every aspect of our personal and professional lives presents a huge opportunity in applications and application development, and for new entrants in the mobile commerce space. In response to this trend, SAP has put a stake in every vertical, offering customers and partners a wide range of mobile apps and management solutions: mobile device management, multifunctional enterprise application platforms, workflow, ERP, and many more.

In an exclusive interview, Sanjay Poonen, President and Corporate Officer, Global Solutions and Head of Mobility Divisions, SAP, shares how enterprises can ride the BYOD wave while ensuring security.


With the proliferation of smartphones in the enterprise, what are the security issues? How can companies adopt the BYOD trend while ensuring security?
Mobility has emerged as an integral part of a company's business strategy and is transforming how businesses function. To gain a competitive edge, companies are increasingly aligning their processes to achieve a mobilized workforce. Today, BYOD is a policy decision that every CIO, chief security officer and business needs to make. Do you want to hand out devices to every single person? Do you have the budget to do that? At SAP, we decided to fund 18,000 iPads for our employees because we're in the mobility business, but not every company has that budget. Some companies also operate under very tight regulations, like those in the public sector. The Army or Marines aren't going to use any type of device that doesn't have secured

e-mail or secured communications. In many of the places where regulations and security constraints are higher, such as the public sector, defense, banks, and health care, even tighter policies will be required. In those places, the BYOD policy needs clear guidance, procedures and policies, and also some legal provisions allowing devices to be checked for compliance on the part that is "owned" by the company. Some companies require employees to sign an agreement that if they download corporate applications, the company "owns" the part of the device where those applications reside. Mobility is also intricately linked to cloud-based solutions; embracing the cloud is a cost-effective option that helps businesses ensure security. Moreover, in the BYOD era, security professionals will need to diligently monitor vulnerability announcements and security incidents involving mobile devices, and respond appropriately with policy updates.

How do you see the adoption and growth of the enterprise mobility trend in India?
Adoption of enterprise mobility in India is already in full swing and is expected to intensify in the near future, providing ample opportunities for vendors to capitalize on. Major industry verticals, led by retail, banking, financial services and insurance, followed by pharmaceuticals, transportation and logistics, will continue to be the foremost adopters of enterprise mobility in India. Growth in the nation's GDP has given these industries a significant boost, and as a result the majority of enterprises feel the urge to extract maximum work efficiency from their workforce while maintaining optimal profitability. u Jasmine Kohli

Case Study

How open source helped People Interactive save more than ₹ 80 lakh The firm, which owns a popular Indian matrimony website, has saved huge license and maintenance costs by deploying Ubuntu Linux on more than 800 desktops By Srikanth RP


As IT Head of People Interactive, the consumer Internet arm of the group that owns a popular Indian matrimony website, Joel Divekar had a tough task on hand. Even as business grew fast, Joel faced a huge challenge in maintaining and supporting IT infrastructure across more than 50 offices and locations in India. At locations where the IT team had no presence, it relied on external vendors for software installation. This created a problem, as vendors sometimes installed pirated software or incorrect versions of existing software, which led to a proliferation of viruses and malware, and to business downtime. "Every month around 12-15 machines had to be reformatted or cleaned due to virus and software configuration issues. In addition, we faced difficulties in providing support because of differences in software versions, and in upgrading our anti-virus software," says Joel Divekar, GM - IS, People Interactive (I).

To address these issues, Joel scouted for options and decided to migrate existing users from a proprietary OS to Ubuntu. The firm opted for open source after considering factors such as better system uptime, security, system management and reduced TCO. Ubuntu was the preferred Linux distribution. "We evaluated various Linux distributions on the basis of ease of use, stability, integration with existing infrastructure, documentation, infrastructure management, support, and patches and releases, and found Ubuntu topping most of these," states Joel. In the first stage, the team implemented Ubuntu in its new 75-seater office, and subsequently installed it in all new offices and centers. Simultaneously, it initiated a process to migrate existing users to Ubuntu. Today, Ubuntu runs on more than 800 desktops pan-India.

The firm now enjoys improved system uptime, as there are few cases of OS corruption or virus-related downtime. It has also benefited from improved system security, better system management, fewer software license management issues and reduced TCO. For remote locations, vendors now do only the basic installation; the rest of the Ubuntu setup is done remotely by the IT team, which gives better control. The team uses ClamAV as the open source anti-virus solution and Uncomplicated Firewall (ufw) as the front-end for managing the iptables firewall. The ROI has been fantastic, as there are no license costs. “We saved


₹ 10,000 per desktop for the operating system, and additional costs related to anti-virus and other software,” states Joel. Counting only the license costs saved on the desktop OS, the ROI works out to roughly ₹ 80 lakh on a conservative basis, as the firm has installed Ubuntu on more than 800 desktops. In the future, the team plans to migrate all remaining users to Ubuntu, and to explore other open source tools for specific functions. u Srikanth RP
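The arithmetic behind the quoted figure is worth making explicit. The numbers below are the ones given in the article (₹ 10,000 saved per desktop on the OS licence alone, 800 desktops), with 1 lakh = 100,000:

```python
desktops = 800                 # Ubuntu deployments cited in the article
saving_per_desktop = 10_000    # Rs saved per desktop, OS licence only

total_saving = desktops * saving_per_desktop   # Rs
saving_in_lakh = total_saving / 100_000        # 1 lakh = 100,000

print(saving_in_lakh)  # 80.0 -> the "80 lakh" figure, before anti-virus savings
```

The anti-virus and other software savings the article mentions would come on top of this conservative estimate.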

CIO Voice

5 areas CIOs must examine before moving to the cloud Vijay Sethi, VP – Information Systems and CIO, Hero MotoCorp, details the key areas that CIOs need to evaluate before moving to the cloud


Cloud computing is about delivering massively scalable IT-enabled capabilities as a service to external customers using Internet technologies. Rather than calling cloud computing a hyped tech trend, I will say it is a buzzword today, and many service providers still use SaaS and cloud computing interchangeably in their discussions. Not all SaaS solutions leverage cloud-based computing, and cloud computing is not another term for SaaS. It is a broad technological concept under which some types of SaaS offerings could qualify: if the IT application delivered under the SaaS concept is highly scalable, it could be considered a cloud computing application. Will cloud computing really take off? I think yes, it is poised to grow — and not just the private cloud but also the public cloud. Many people say that large companies will take the private cloud route, while SMEs will take the public cloud route. But I feel that even large companies will benefit a lot from the public cloud, and many enterprises have already ventured out on this path. However, before deciding whether to move forward on cloud computing, each CIO has to evaluate certain key areas, which are as follows:
1. Fully understand the concept and implications of cloud computing before deciding whether to maintain an IT investment in-house or buy it as a service through the cloud — and for that to happen, even the service providers need to understand the concept fully.
2. Look at the overall ROI and weigh short-term costs against long-term gains, comparing not just hardware, software, implementation and maintenance costs but also bandwidth and related costs, especially if one is moving enterprise applications to the cloud. One may also need to enhance Internet bandwidth significantly.
3. Evaluate the service levels offered by providers in terms of uptime, response time, performance, etc.
4. Determine whether implementation will be more efficient — in time to deploy or to scale up application infrastructure — by adopting cloud computing or by using in-house capabilities.
5. Lastly, culture and mindset have to change, and this could be the most difficult barrier to the adoption of cloud computing. The mindset in the organization — not just among IT managers — needs to be aligned to the fact that there will be a transition from on-premises to off-premises computing.
While there are obvious advantages to cloud computing, in that someone else manages day-to-day technology issues, this has to be weighed against the fact that we are leaving business-critical information resources in the hands of third parties. I strongly feel that not everything can become cloud computing, as each one of us has specific requirements — on functionality, performance, or security and privacy — which may be unique

to the organization and may not be supported through the public cloud. Secondly, I guess we need to start small, first understanding the concept from a practical perspective and then deciding whether to go ahead. — As told to Amrita Premrajan
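The point about weighing short-term costs against long-term gains can be made concrete with a toy total-cost-of-ownership comparison. The figures below are entirely invented for illustration; a real evaluation would plug in actual hardware, software, implementation, maintenance and bandwidth costs:

```python
def cumulative_cost(upfront, per_year, years):
    """Total cost of ownership: one-off costs plus recurring annual costs."""
    return upfront + per_year * years

# Illustrative (invented) figures, in lakh Rs.
in_house = {"upfront": 50, "per_year": 8}   # hardware + implementation, then maintenance
cloud    = {"upfront": 5,  "per_year": 18}  # migration, then subscription + bandwidth

for years in (1, 3, 5):
    ih = cumulative_cost(in_house["upfront"], in_house["per_year"], years)
    cl = cumulative_cost(cloud["upfront"], cloud["per_year"], years)
    print(f"{years} years: in-house {ih} vs cloud {cl}")
```

With these made-up numbers, the cloud is cheaper over one and three years but the in-house option overtakes it by year five — exactly the short-term/long-term trade-off the point describes.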

Advantages of cloud computing need to be weighed against the fact that we are leaving the business-critical information resources in the hands of third parties

u Vijay Sethi is VP – Information Systems and CIO, Hero MotoCorp


CIO Voice

The IT orphanage Every organization has an (un)labelled orphanage that sometimes gets very crowded, especially if the CIO and the IT team are unable to assert themselves, or if they collectively create solutions that are disconnected from business reality By Arun Gupta


It had been a long search, far and wide, across the oceans; many able men and women working in teams traversed the globe in her quest. A few options were shortlisted but discarded very quickly when deeper analysis uncovered some deficiency. The rigour redoubled, the pursuit unwavering; the promise of long-term reward kept them going. Their leader encouraged the team through the journey, especially when they appeared to falter and give up. Almost a year into the expedition, the quest finally came to an end with what appeared to be a perfect, made-to-order ending. The leadership team got together to discuss the outflow; she was expensive and required high maintenance. No one had thus far had the courage to take such a risk, but the promise of the future convinced everyone that it would be worth the investment. So they all agreed to part with the precious gold coins and get her on board. High risk, high return, said the treasurer. She was welcomed with a lot of fanfare; the headman chose a name from the many suggested, and word of the unique new acquisition spread. Everyone contributed to expectations that rose in unison, as if in a crescendo, and watched the future with euphoric anticipation. Smiths and specialists from all over the world got together to define the outcomes she would enable. Progress was slow, and soon people started paying less attention, focusing on their daily chores. Life continued as usual, with occasional reviews that highlighted


informationweek march 2013

challenges to understand and adapt to her whims. The workmen toiled day and night for many moons, encouraged by their leader, who did not give up his belief. Two winters later the team broke into a joyous dance; everything worked as designed, all the links delivered, the input validated — the outcome was as expected. Rushing to the leadership team, they demonstrated the end result, chests puffed with obvious pride. A celebration was called; everyone wanted to be associated with success, and anecdotes of the arduous journey spread with friendly banter. Almost 18 months after the start, the project had gone live and was churning out results that were unfamiliar territory but delivered the business outcomes the leader had believed possible. The competitive advantage gained using the new technology was evident, and accolades poured in locally and globally for the unique pioneering solution. Too good to last: some naysayers found reasons to challenge and doubt the results. Conventional wisdom did not support the new solution, so they were able to sow seeds of doubt, which spread quickly through the enterprise. The initial success was passed off as a stroke of luck, not sustainable. With no supporters, almost everyone went back to their old ways and deserted the solution as a bad dream and a mistake. The solution thus joined the IT orphanage. Applications and solutions that the IT team developed but no one really used; solutions that were bought by users only to be discarded, with no one to support them; applications and reports that

are always urgent for development but rarely complete UAT — and if they do, hardly anyone wants to use them: they all finally find their place in the IT orphanage. These have no owner, no user, and no parent to support them. Once relegated, they rarely, if ever, find a benefactor willing to support them. Every organization has an (un)labelled orphanage that sometimes gets very crowded, especially if the CIO and the IT team are unable to assert themselves, or if they collectively create solutions that are disconnected from business reality. The CIO needs to highlight such instances transparently and openly, to either change the team's behaviour and improve the chances of success, and/or change business engagement and ownership so that few solutions, if any, need to be assigned to the orphanage. P.S.: Within a year the project was revived by the CIO and has stayed a success for over two years now; but that is a story for another time. The article first appeared in Arun Gupta's blog: Oh I See (CIO Inverted)

u Arun Gupta is CIO at Cipla

CIO Voice

Dynamics of technology adoption in an era of consumerization


Udayan Banerjee

We are living in a world where business and government are no longer leaders in technology adoption. For many types of technologies the baton has been handed over to us — the users

FarmVille farmers outnumber real farmers — why? Angry Birds passes one billion downloads — why? On 21st November, 2011, Kolaveri became the ‘most searched video’ on YouTube — why? After two and a half years of near hibernation, our Yammer usage suddenly took off without any internal promotion — why? The answer is: "we don't know!" We don't know why something goes viral; we cannot predict it and we cannot prescribe a formula. But we can be sure similar unexpected events will happen again and again.

Hyperconnected World

We have been connecting ourselves like never before — wired communication, networked personal computing, hyperlinking, search, wireless communication, and social networking. We live in a hyperconnected world, and hyperconnectivity leads to unexpected behavior. Look at our brain:
l By the age of 5, our brain reaches about 80 percent of its adult size. What happens after that? We develop interconnections among the brain cells — the learning, the intelligence, is in the interconnection.
l In the last 32,000 years our brain size has decreased. Does that make us less intelligent? No: our intelligence is in the interconnectivity — inside the brain, and derived from social interaction.
Hyperconnectivity gives rise to an intelligence of its own, where 2 + 2 may be 22! Look at nature: does an ant-hill have a predefined

architecture? Do birds flying in formation have a flight plan? Do ants find food using a search algorithm? Hyperconnectivity leads to unexpected events, which change our world view, alter our business strategy, and reshape our technology adoption cycle.

One Way Communication to Instant Two Way Dialog

In the pre-Internet days, you had limited opportunity to communicate with the author of a book you liked or disliked. Beyond your circle of friends, you had no clue what others felt about the book. Today, there are multiple ways to communicate with the author and with fellow readers. The same is true for organizations trying to reach out to you. Previously, it was one-way marketing messages through printed ads and TV commercials; you had no power to let the world know about your experience with a product or service, good or bad. Today, you have many different ways of expressing your opinions and connecting with strangers who share your interests. Traditionally, IT was about payroll, general ledger, accounts payable, accounts receivable, sales order processing, materials management, production planning, and so on. From there we evolved into ERP packages and many other applications whose primary objective was to improve operational efficiency and reduce costs. We can classify all of these as "inside the fence" applications. The focus has shifted. Though efficiency is important, it is more important to be able to create a


unique product or service that will resonate with customers. The only way to do that is to reach out to the customer and listen; the opening up of two-way communication channels has made this possible. The focus of these applications is "outside the fence": current applications are about reaching out to customers and engaging them in a two-way dialog. This is both a cause and an effect of the shifting technology focus.

Improving Efficiency to Enhance Connectedness

If you go through the list of "hot" technologies, they are either about enhancing your ability to reach out to and interact with your customers, or about extracting meaning from the connections you have established and the interactions you have had with them. In fact, it is about more than just your own interactions — it is about all the interactions around you and your product. You had complete control over the earlier generation of technologies, which focused on efficiency improvement: you could decide when and how to adopt them, and you could choose the flavor and the variant. The trouble with technologies that deal with



connection is that there is another party involved, who has an independent opinion about which flavor and variant of the technology to use. So when you have to deal with hundreds, thousands or millions of such parties, you have practically no control over the technology you need to adopt.

Can IT adopt any technology?

IT always works under the constraint of a limited budget. Annual IT expenditure has to stay within the allocated budget, and there is always a demand to do more with less. So CIOs face the challenge of allocating the budget among running existing systems, new ROI-driven initiatives, and managing technology upgrades. The major share of the budget always goes to keeping existing applications running smoothly; failing to achieve this is a sure way to ensure that the CIO needs to look for alternative employment! Any new initiative based on return on investment is most likely to be driven by business need. The budget may be part of IT's, or it may come from the business, and the ROI may come either from increased sales or from reduced operating costs.

Typically, there will be many possible projects vying for the limited budget, and not all of them will be taken up for implementation. The third category, which is entirely at the discretion of IT, is managing the technology upgrade lifecycle — when to migrate to a new version of MS Office, say, or when to start supporting Android devices. Money for such initiatives is always limited, and the possible upgrades will far exceed the available money. In short, traditionally, many promising technologies may have to wait until there is a solid business case and money available for adoption.

How does IT choose which technology to work on?

Only a small percentage of organizations are in the business of creating technology. Most organizations look at technology as an aid to designing new offerings and improving existing ones. Such enterprises typically have an annual IT budget and a 3-5 year rolling technology roadmap. Any technology that cannot be accommodated within the current budget but is sufficiently important will find a place on the technology roadmap — one, two, three or even five years down the line! Each selected technology goes through a cycle: once identified, there is a plan for acquiring and implementing it, followed by a phase where it is supported as part of the effort to keep the lights on. A technology refresh may happen only when budget can be found for it. So specific technologies may never get chosen for adoption, and adoption may be delayed, in some cases by as much as 3-5 years. Identifying and analyzing each of these viewpoints is a complex process. To aid it, two popular models are used to describe how

any new technology gets adopted by enterprise: Technology Adoption Lifecycle Model and Gartner Hype Cycle Model.

How experts explain the technology adoption cycle

The accepted premise is that every new technology goes through the following phases:
l Hype: The search for the next big thing leads to hype around any new technology.
l Struggle: Adoption of these bleeding-edge technologies depends on visionaries who have the vision, energy and money to make them work.
l Success: Mainstream adoption requires convincing the pragmatists, who need success stories and a support system around the technology. Not all technologies make it to the mainstream.
All of this is from the perspective of an enterprise; consumers had very little role to play in this lifecycle. This underlying theme comes out both in the "Hype Cycle" model used by Gartner since 1995 and in the "Technology Adoption Lifecycle" model popularized by Everett Rogers and Geoffrey Moore. Though the curves look different — one plots "Expectation" and the other "Adoption Rate" against time — both are based on the same three premises stated above. But do we have evidence to support these theories?

What does past Gartner Hype Cycle data say?

If this pattern of technology adoption were true, then most of the technologies that find a place on the "Slope of Enlightenment" should, in the past, have appeared on the "Peak of Inflated Expectations". Do traditional technology adoption models like the "Technology Adoption Lifecycle" formulated by

Everett Rogers and Geoffrey Moore, or the "Hype Cycle" formulated by Gartner, reflect the current reality? In Gartner's "Hype Cycle for Emerging Technologies" for 2012, only 1 of the 6 technologies mentioned as "climbing the slope of enlightenment" ever went through the "peak of inflated expectations". From 2007 to 2012, only 3 of the 17 technologies ever mentioned as "climbing the slope of enlightenment" went through the "peak of inflated expectations", and almost two-thirds of them originated in the consumer space. The success of technologies is becoming unexpected and is driven by individual users. You notice a technology only when it succeeds; like a "Stealth Bomber", it hits you without notice! And you will find that your customers have already adopted it. By the time the majority of these technologies enter the radar of the enterprise, consumers have already adopted them. Enterprise success stories may not be available, but the kinks in the technologies will already have been sorted out by consumers. Since your consumers may already be using these technologies, they can become an extremely important channel to reach out to your customers. In short, we are living in a world where business and government are no longer the leaders in technology adoption. For many types of technologies, the baton has been handed over to us, the users. There are many examples:
l Smartphones – iPhone, Android devices
l Tablets – iPad
l Social networking – Facebook, Twitter, LinkedIn
l Cloud e-mail – Gmail, Hotmail
l Other cloud hosted services – Dropbox, Google Apps
l Web 2.0 tools – Blog, Wiki, Podcast
Because these technologies open up channels for sellers to reach out to us, the customers, organizations ignore them at their own peril. This trend — this change in direction of technology absorption — is called Consumerization of IT.

How does it concern us?

Being proactive may not be a choice! This may sound very counterintuitive, because we have always been advised to anticipate change and stay ahead of the curve. The problem is that in today's hyperconnected world it may be impossible to anticipate the change. Even if you succeed, it may be because you got lucky, not because you have a greater ability to predict. Yes, this is a tectonic shift. The question is: what should we do about it? The answer is actually quite obvious. We have to be vigilant and watch out for changes that are not random variation. Once we spot such a trend, we have to be agile and react quickly. The advice sounds simple, but the million dollar (or, if you prefer, billion dollar) question is: "How do we know which trend is not a random variation?"

u Udayan Banerjee is CTO, NIIT Technologies



Harness the power of virtualization without sacrificing security


While there is a broad spectrum of new technologies competing for the attention of IT managers and CIOs today, only a few innovations actually help companies improve top-line performance or bottom-line productivity. Virtualization is one of them, and it has gained a lot of traction in recent times. In the rush to implement virtualization, many have been forced to shelve security concerns in favor of rapid project completion. Others are struggling to reconcile competing priorities: virtualizing their environments while ensuring that existing protection and visibility requirements are maintained. Some challenges that need tackling as virtualization becomes more widespread in 2013 are:
l Starting out small, virtualization projects grow over time into a major portion of the IT environment. In 2013, many companies will combine VM project teams and infrastructure with corporate IT, highlighting the need for physical and virtual assets to work together as a platform.
l Virtualization increases management complexity, especially when there are contrasting tools to manage. The network, server and storage teams need to work in unison to eliminate silos.
l The ease of deployment and workload mobility that virtualization enables can make security, configuration management and compliance more challenging than in less dynamic physical server environments. The existing virtualized environment thus needs to be fortified to ensure that it is architected for security and compliance.


informationweek march 2013

Create a VM Service “Good List”

Creating an optimal access policy for virtual machines (VMs) requires listing the warranted apps/services that may run on each VM. The list will vary with the type of server and the group of users and applications it serves. Creating a "good list" (or whitelist) of commonly used and expected apps and services for each kind of VM (e.g., databases, web servers, file shares) gives you a security baseline for every type.
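A per-VM-type whitelist check can be sketched in a few lines. This is only an illustration — the VM type names and service names below are invented, and a real deployment would derive the baseline from observed, sanctioned traffic:

```python
# Per-VM-type service whitelists (hypothetical type and service names).
GOOD_LIST = {
    "web":       {"http", "https", "ssh"},
    "database":  {"postgres", "ssh"},
    "fileshare": {"smb", "nfs", "ssh"},
}

def violations(vm_type, observed_services):
    """Services seen running on a VM that fall outside the baseline
    for its type; unknown VM types allow nothing (default deny)."""
    allowed = GOOD_LIST.get(vm_type, set())
    return sorted(set(observed_services) - allowed)
```

Anything the check returns is a deviation from the baseline and a candidate for investigation.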

Enforce Access Control per VM

VMs of many different types may reside on one physical host; as a result, traffic flowing between them can spread malware, worms, or malicious access. It is therefore essential to monitor all traffic between VMs and apply appropriate access controls. One must also examine sanctioned apps and services for the presence of malicious traffic or intrusion, to ensure that business-critical communications flow securely while maintaining the flexibility afforded by virtualization.
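The combination of monitoring and access control described here can be sketched as a default-deny filter over inter-VM flows. The flow tuples and the allowed set below are invented for illustration, not a real firewall configuration:

```python
# A flow is (source_vm_type, dest_vm_type, service). Anything not
# explicitly allowed between VMs on the same host is blocked.
ALLOWED_FLOWS = {
    ("web", "database", "postgres"),
    ("web", "fileshare", "nfs"),
}

def filter_flows(flows):
    """Split observed inter-VM flows into permitted and blocked (default deny)."""
    permitted = [f for f in flows if f in ALLOWED_FLOWS]
    blocked = [f for f in flows if f not in ALLOWED_FLOWS]
    return permitted, blocked
```

The blocked list doubles as a monitoring feed: flows that keep showing up there are either misconfigurations or attempted lateral movement.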

Layered Defenses

Virtualized environments require a system of defenses just like physical networks. This means applying and enforcing security policies and controls that block unwanted services into and out of the environment; this first layer drastically reduces the probability of such attacks. The next layer of protection is traffic monitoring and inspection, which entails checking traffic against a set of known attack signatures and behaviors. Other protections include log aggregation and analysis, anti-virus, and robust alerting mechanisms. The key is to ensure that these protections do not provide security at the expense of the flexibility of virtualization.

Insist on Purpose-Built

Your virtual network has features and functionality that help you make the most of your data center hardware investments, in addition to giving you the means for rapid, near-infinite scalability. One primary advantage of virtualization is the near-instant provisioning of a new VM: rather than preparing and connecting a physical server, one can simply clone an existing VM and have it online in minutes. The new VM inherits the settings of its parent, including the security policies and applications defined for a VM of that type, so security for the new resource is automatically provisioned. The success of an enterprise virtualization deployment largely hinges on the time invested in discovering requirements, understanding options, and keeping best practices in mind. Since security is a key factor for successful deployments in the cloud, investing in a strategy, in organization and skills, and in technology to support security for virtualized and private cloud environments is indispensable.
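The clone-and-inherit behaviour described here can be sketched as follows. The `VM` structure and its field names are invented for illustration; the point is that a deep copy of the parent carries its security policy along, so a freshly provisioned VM never starts unprotected:

```python
import copy
from dataclasses import dataclass, field

@dataclass
class VM:
    name: str
    vm_type: str
    firewall_rules: list = field(default_factory=list)
    apps: list = field(default_factory=list)

def clone_vm(parent, new_name):
    """Provision a new VM by cloning a parent: security policies and
    applications are inherited rather than configured from scratch.
    A deep copy keeps the clone's state independent of the parent's."""
    child = copy.deepcopy(parent)
    child.name = new_name
    return child
```

Using `deepcopy` (rather than a shallow copy) matters: later changes to the clone's rules must not silently alter the parent's, and vice versa.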

Sajan Paul is Director, Systems Engineering, Juniper Networks, India


How Big Data analytics can benefit the retail sector


A popular survey highlights India as one of the most preferred retail destinations. The government's decision to allow 100 percent FDI in single-brand retail will further boost the Indian retail industry. New international retail giants like Walmart have already begun setting up shop in India, and existing players are expanding their reach into the country's tier 2 and tier 3 markets. As competition increases, retailers are interested in solutions that will help them differentiate themselves from their competitors and attract customers. In a retail market where margins are under constant pressure and product duplication is almost immediate, retail leaders need the capability to respond swiftly to changes in customer demand. They need to adopt technology solutions that enable better decision-making and faster response to market needs. Added to this, today's retail customers are modern: they are online, selective and social. Before making a purchase, they compare prices on the web, scan QR codes and browse at the stores, and they want the handover from one touchpoint to another to be as smooth an experience as possible. Due to these factors, modern retailers are exploring technology solutions to analyze structured data from diverse applications across their major operations, including sales and marketing, shelf-space management, payment/refund, customer service, logistics and warehouse operations, procurement, financial management, back-office operations, and merchandising and promotion. They are also looking for tools to analyze unstructured data from social networks, the web, software logs, information-sensing mobile devices, and more.


informationweek march 2013

Business benefits of Big Data analytics

To improve customer satisfaction and prevent customer churn, retailers are realizing that they have to do more than simply track complaints. Combining structured data from sales, marketing and the supply chain with unstructured or semi-structured data from surveys, syndicated data and other outside sources can give retailers a new perspective on their customers. For example, merging structured with unstructured content to find underlying customer satisfaction issues allows enterprises to proactively monitor customer satisfaction levels. In many organizations, sales and customer service work in separate silos, and customer feedback is often not allowed to flow freely between the different operations, resulting in ineffective distribution channels. However, a COO would be interested in the convergence of sales information and call center operations to get a holistic perspective of customer engagement. Tracking social media and analyzing feeds from Twitter and Facebook can, in effect, help retailers find the correlation between product sales, support and the customer's voice, to validate the true issues impacting customer satisfaction. Big Data can also help identify the most valuable customers through a 360-degree view, reward them with offers and benefits relevant to a loyalty program, and exclude those customers who merely take advantage of discounts without shopping with the merchant again. So Big Data can help retailers understand customer behavior segmentation and what actions trigger behavior attributes in different segments and channels, and it allows for real-time marketing execution at the time of purchase. Big Data also enables improvements to

loyalty programs by revealing what factors truly impact customer loyalty and retention, such as customer experience, ease of use, value for money and the effect of rewards programs. These Big Data insights can be leveraged to optimize product promotions, make offers to specific customer segments or even precisely targeted customers, and drive higher returns and greater customer satisfaction. Insights can also be generated on the customer churn rate, the reasons behind the churn and how to address them. Even competitors' customers can be tracked and analyzed to understand industry trends and customer propensity to buy certain products or services. Retailers are also using Big Data solutions to streamline functions like merchandising and the supply chain. One of the biggest challenges the retail sector faces today is storing and managing huge volumes of data in order to extract intelligence from it. Retailers are nevertheless actively taking steps to compile data on customers and their purchases, so as to build a wealth of insights into the demographic attributes, tastes, preferences and buying patterns of customers.
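As a toy illustration of combining structured and unstructured sources, the sketch below joins per-customer purchase counts with a crude keyword-based sentiment score from free-text survey comments to flag churn risk. All data, names and the scoring rule are invented for the example; production systems would use real sentiment models and far richer features:

```python
# Structured data: recent purchase counts per customer (illustrative).
sales = {"C1": 12, "C2": 1, "C3": 0}

# Unstructured data: free-text survey comments per customer (illustrative).
comments = {
    "C1": "Great service, fast delivery",
    "C2": "Refund was slow and support unhelpful",
    "C3": "Terrible experience, will not return",
}

NEGATIVE_WORDS = {"slow", "unhelpful", "terrible", "not"}

def sentiment(text: str) -> int:
    """Crude score: zero for neutral text, negative per negative keyword."""
    words = set(text.lower().split())
    return -len(words & NEGATIVE_WORDS)

def churn_risks(sales, comments, min_purchases=2):
    """Flag customers with few recent purchases AND negative feedback."""
    return sorted(
        cid for cid in sales
        if sales[cid] < min_purchases and sentiment(comments.get(cid, "")) < 0
    )
```

Here the structured signal (low purchases) and the unstructured signal (negative comments) each produce false positives alone; merging them is what narrows the list to genuinely at-risk customers.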

Sheshagiri Anegondi is Vice President, Technology, Oracle India


The opportunities and challenges of open standards


We see today's IT ecosystem as natively heterogeneous, with components from many vendors running the show. Gone are the days when a single, unilateral technology or solution dominated. While the "Closed Standards" ecosystem keeps getting better every year through standardization and compliance, we still struggle to manage such varied environments, with several technology OEMs and variants of similar solutions. For a long time, established OEMs have sustained their offerings through structured standardization and focused technology baselines. Products and solutions built on "Open Standards", on the other hand, have been able to deliver most of the benefits that "Closed Standards" products can offer. Let's look at the challenges for solutions and products that adopt "Open Standards".


Open Standards are asked every time to prove their very reason for existence via a reference case or a success story, and with a relatively short track record it is challenging to justify past performance. While Open Standards have been around for more than a decade, they still lack endorsements; it is thus difficult to sell them, or to win buy-in, compared to Closed Standards. "Basic blocks" come as part of pre-defined specifications and are free to tweak. This opens up an altogether different paradigm, with several variations originating from the same basic block. While this offers a wide range of flexibility, it also poses a greater threat when the boon is not used in the right sense. Open Standards are not yet fully mature, although many blogs and web consortiums are doing a number of

substantial activities to promote this segment. The absence of a uniform standardization body covering all IT components makes it harder for "Open Standards" vendors to justify their claims against established ones. Open Standards do suggest a lack of governance; in fact, the smaller the subset, the better the content adheres to set standards. However, the next level of customization that is open to all also needs a better governance model, which would prove a boon to the customer community. Clearly no uniform body exists, although several best practices are being followed. Several success stories from one basic block result in more and more varied demarcations between end products and deployment/utilization techniques. This segment is huge in terms of bandwidth and life span, since there are constant innovations and customizations; a calculated tap-and-check will result in a highly efficient and effective outcome. Different flavors of success make it difficult to map business requirements, and at times even more so when solution architects strive to knit solutions based on the characteristics of Open Standards and commodity architectures. This demands vision, and not every customer grasps it. Business solutions can be mapped after tweaking and customizing the basic blocks, and clear foresight will prove beneficial for both developer and customer. Undocumented success stories pose the biggest challenge, since the absence of the creator limits them to the first few basic blocks. Further innovations are created by several intermediaries, and this segment of intermediaries is not very organized as yet.


Solutions based on ‘Open Standards’

are beneficial because they allow customizations in which specifics are addressed directly. Needless to say, the results are far superior to generalized packages. In recent times, even vendors who propagate "Closed Standards" are opening their products for customization. The recent Microsoft Windows 8 release is one of the best examples, wherein developers are given a free hand to develop packages that integrate seamlessly with Windows 8. Such capabilities also exist in the "Open Standards" world; however, they need to be marketed adequately to the customer community. IT is now the core enabler behind several business success stories, which has raised the expectation bar, and customers are demanding more customization. Innovation does not stop, and Open Standards and commodity architectures carry this mantra inherently. Besides empowering the customer community, such solutions can be tweaked more finely, and hence the outcome is better. In the long run, this opens up a much larger world of "create your own IT ecosystem".

Dharmesh Rathod is Project Manager, Enterprise Architecture at CTO Office, Essar Group



Emerging technologies key focus at CSI IT2020


Computer Society of India, Mumbai Chapter hosted its annual conference IT2020 on 1st February 2013 at the Victor Menezes Convention Center at IIT Bombay. The conference theme, "Making Emerging Technologies a Boardroom Agenda", and its tracks on Security, Mobility, Social Media, and Big Data and Analytics were supported by KPMG, the Knowledge Partner; the Spoken Tutorials team from IIT Bombay was the Support Partner. The inauguration was attended by over 500 delegates, including senior CSI members, a delegation from the Navy, corporate leaders, professionals, academicians and students. Ravi Eppaturi, Chairman, CSI Mumbai Chapter, welcomed the delegates and shared the program and initiatives planned for the day. The keynote speaker, Sanjeev Dali from a popular FMCG company, set the tone for the conference by sharing the growth and impact of emerging technologies on the boardroom agenda. He shared that some of their emerging technology initiatives are being reported and tracked by their board. Prof Kannan Moudgalya from IIT Bombay, in his plenary address, shared

the process, progress and impact of the Spoken Tutorial Project and the popular Aakash initiative. He also gave a brief demo of the Aakash tablet. Prof Kannan invited Eppaturi to release the newly created spoken tutorials on C++ and Java, along with Dilip Ganeriwal, Sandip Chintawar, VL Mehta, George Arakal and Dr. SP Mudur. The delegates then went on to attend the five scheduled parallel tracks on Security, Mobility, Social Media, and Big Data & Analytics.

Security Track

The security track was a blend of the latest trends in organizational security and how various industries are strategizing to combat threats. A K Viswanathan, Senior Director, Deloitte India, was the track chair. In his introductory speech he highlighted how security remains a dynamic, much-discussed topic and shared the agenda for the day. The session on the Security Intelligence Operations Centre (SIOC) was presented by two subject-matter experts, Mark Fernandez, Partner, Deloitte Canada, and Vinay Puri, Senior Manager, Deloitte India. The interactive session focused on global and Indian perspectives on the growing complexity of security for an organization.

Release of spoken tutorials



Numerous test cases and scenarios were showcased and discussed. The security panel discussion on "Winning strategies to counter emerging security threats" was moderated by Puri of Deloitte India. Eminent panelists included Anantha Sayana, VP & Head, Corporate IT, Larsen & Toubro; Ashish Pachory, CIO, Tata Teleservices; Jagdish Mahapatra, MD, McAfee India and SAARC; and Pravin Sharma, CISO and Assistant GM, Union Bank of India. The panelists and delegates deliberated on emerging threats and how organizations can shield themselves based on their risk appetite. Pinakin Dave, National Manager, Channels and Alliances, McAfee India, presented the Security Connected Framework, McAfee's enterprise-wide approach to providing seamless integration of solutions, services and partnerships that intelligently reduces overall infrastructure risk.

Mobility Track

Lalit Sawhney, Director, Lalit Sawhney and Associates, chaired the Mobility Track. The Enterprise Mobility Track covered the opportunity provided by the increasingly popular and powerful smartphones and feature phones in the hands of knowledge workers, senior management and employees at all levels today. The speakers were Gerard Rego, Director, Developer Experience, Nokia; Sowri Santhanakrishnan, VP & Venture Leader, Mobility Solutions, Cognizant Technology Solutions; Suresh Anantpurkar, Consultant, Mobile Governance, and Ex-President, mChek; Nitin Bhandari, Associate VP, New Products & Partnerships, Vodafone; Jayantha Prabhu, Group CTO, Essar Group; Anish Gupte, IT Infrastructure & Services Lead, Kraft Asia Pacific; Manjula Sridhar, Sr Director, Sales, Arcot Systems (CA Technologies); LN Sundarrajan, Founder, Rewire; and Amit Chaubal, IS Security & Compliance Manager, Kraft Asia Pacific. Together they covered the opportunity, the challenges and the practical implementation of this consumerization in the enterprise world and in government, including the technologies and the security and network implications of allowing people to work the way they want, and of allowing enterprises to leverage trends and innovations for business advantage. Two interesting and interactive sessions covered policy issues, cost savings and productivity gains, the disruptive character of the technology, how "the pilots are finally flying" and how these technologies are changing the world.

Prof Kannan Moudgalya, IIT Bombay

Ravi Eppaturi, Chairman, CSI Mumbai Chapter

Kunal Pande, Partner, KPMG

Social Media Track

Hareesh Tibrewala, Joint CEO, Social Wavelength, the track chair, spoke about how businesses need to think about making business social, beyond just social media marketing. He also shared cases on how the internal use of social networking platforms can contribute to business productivity. Sandeep Walunj, CMO, Magma Fincorp, spoke about creating a framework that would enable all stakeholders to engage with the brand, and with each other, through a three-pronged approach of listening, communicating and engaging. Deepali Naair, Country Head (Brand and Corp Comm), L&T Insurance, explained how digital is no longer just an "afterthought", and how brand communication on the digital platform has become far more powerful and engaging.

Danish Mohammed, Leader (Marketing and Strategy), IBM Collaborative Solutions, mentioned that organizations will need to move towards internal communication and collaboration tools that can leverage the natural instincts of Gen Y, resulting in an overall increase in productivity. Binay Tewari, Head, Marketing, spoke about how the intersection of social, mobile and local (SoLoMo) is creating huge communication and business opportunities. Bharati Lele, Head, Innovations, L&T Infotech, shared some very interesting facts about social media, the importance of social media monitoring in CRM, and the relevance of social analytics.

Big Data and Analytics Track

C Kajwadkar, CIO, CCIL, the track chair, shared the drivers behind the explosive growth in data and identified the various dimensions of data that define the term "Big Data". Key parameters such as Volume, Velocity, Variety, Variability and Complexity were deliberated in the track. A B Pandey from UIDAI made an interesting presentation on the AADHAR project, which is set to capture data on over 1.2 billion residents of India; the complexity, nature and size of the data, and the challenges involved, were eye-openers. In the session "Elephant in the room: are corporates feeding it?", Arun Gupta, CIO, Cipla, shared his candid view that business value has to be assessed before embracing projects of this nature, and reiterated that one should critically evaluate the need before embarking on the journey of Big Data. Sanjay Mehta, CEO, MAIA Intelligence, presented an overview of the power and reach of analytics, while Pushkar Bhat of SAP shared the capabilities of its HANA product, which offers Big Data analytics using in-memory computing technology. S S Mulay of Netmagic Solutions presented "Data Jigsaw Puzzle: Tools & Technologies", covering Hadoop and the other relevant technology elements in a Big Data environment, and Harish Ganesan, CTO and Co-Founder of 8KMiles, made his pitch "Big Data: beyond hype" and shared implementation experience through a real-life case study. Yogesh Sawant of Hitachi Data Systems shared interesting real-life cases of Big Data deployments within his parent company and for clients.

Theme Discussion

Kunal Pande, Partner, KPMG, moderated the discussion. The group of business leaders and CIOs who joined the discussion included R Ramanan, CEO, CMC Limited; Vipin Agarwal, India Chair, BSA; C Kajwadkar, CIO, CCIL; Sebastian Joseph, CTO, DDB Mudra Group; and Shashi Kumar Ravulapaty, SVP & CTO, Reliance Capital. The discussion revolved around the requirements of the board and the impact of emerging technologies on the boardroom agenda. The joint CSI-KPMG thought leadership paper was released by Pande and Eppaturi, along with R Ramanan, Prof Kannan and Rajiv Gerela. Additionally, the CSI Mumbai Chapter mobile application was launched on Android, iOS and Windows.


CIO Profile

Career Track

How long at the current company? I have been working with HDFC Standard Life Insurance for the past 2.5 years.

Most important career influencer: Amitabh Chaudhry has been an important influencer for me, as has my wife Smitha, who has been a constant supporter and motivator.

Decision I wish I could do over: There are too many to recount. I guess I have made mistakes faster than others.


The next big thing for my industry will be... In my view, the disintermediation of distribution channels through technology. I believe this will lead to higher sales productivity at a cost structure that will increase insurance penetration in India.

Advice for future CIOs: The CIOs of the future should understand the business drivers, learn interdisciplinary skills and stick to basics.

On The Job

Top three initiatives:
- Technology-led business transformation has been the biggest transformation exercise at HDFC Standard Life Insurance to date. We ran a fairly large, 12-month transformation programme in our retail distribution channel, impacting 400-500 branches and about 7,000-8,000 people. We were changing a host of things, such as the distribution structure, how we approach the customer, and the kind of incentives given. The challenge that confronted us during this programme was that we could not be physically present at every center for the rollout, so we deployed UC to roll the initiative out across the country.
- Customer service strategy and mobility are other initiatives I have undertaken.

How I measure IT effectiveness IT effectiveness can be measured by the efficiency of the business on its usual parameters, the business results delivered through the adoption and usage of technology, and the future-proofing of the organization through technology.


Leisure activities: Reading books, especially those focused on historical events. I also enjoy watching world cinema and engaging in sports, especially running.



Subrat Mohanty Executive Vice President - Strategy, Customer Relations, Persistency and Technology, HDFC Standard Life Insurance

Best book read recently: Thinking, Fast and Slow by Daniel Kahneman

Unknown talents (singing, painting etc.): Music historian

If I weren't a CIO, I'd be... an itinerant writer

As told to Jasmine Kohli

Analyst Angle

Storage trends to watch out for


Tim Stammers

Currently, two of the biggest trends in data storage are the use of public cloud storage, and flash memory

On the web: Top 7 storage trends to watch out for in 2013: Symantec

The amount of data stored by organizations is growing remorselessly, and the effect of this growth on enterprise IT costs is driving both suppliers and buyers to explore new technologies and approaches to data storage. Currently, two of the biggest trends in data storage are the use of public cloud storage and flash memory. Because of network latencies, public storage clouds such as those operated by Amazon or Microsoft will never fully replace on-premise storage, to which they will always be a complement. But as data volumes continue to balloon, and increasing numbers of businesses decide to store data indefinitely, public storage clouds will inevitably become a commonplace home for business data that is not highly performance-sensitive. Public cloud storage promises many of the same benefits that have already driven the take-up of other cloud services. It is elastic and effectively limitless in scale, and carries highly predictable costs. By offloading responsibility for an infrastructure activity to a service provider, it allows IT departments to focus on the more directly productive work of improving or developing new applications. All of the top-tier IT suppliers have now either launched or promised public storage services, or have begun reselling third-party services. Enterprise usage of cloud storage is greatly simplified by on-premise devices known as cloud storage gateways, and over the last three to four years sales of these devices have been growing steadily. The market for storage gateways was pioneered by a handful of start-up suppliers, but during 2012 cloud giants Amazon and Microsoft entered the sector. Architecturally, flash memory is at the opposite end of the scale from public cloud storage. Flash is a blessing for IT, because it is solving the growing challenges created by the performance limitations of disk storage. A common misconception is that flash storage is

expensive, and only suits exotic, high-performance apps. The reality is that flash is already boosting performance and reducing costs in a range of other settings, including but not limited to mainstream database and front-office apps, Microsoft Exchange, and server and desktop virtualization. Currently, the most common data center usage of flash memory is within conventional disk-based storage systems. Installing a relatively limited number of flash drives in these devices can improve their performance significantly, while also reducing their purchase price. Future usage of flash will follow this pattern, and for the foreseeable future flash will remain a complement to disk in enterprise storage systems rather than a replacement. The biggest reason for this is that although flash costs are falling, disk drive costs per GB of capacity are falling even faster. However, other ways of using flash within data centers are emerging, which represent a departure from conventional storage architectures dominated by centralized, shared storage systems. In future, data will continue to be stored in such systems, which will themselves comprise flash and disk storage. However, the most frequently accessed data will be stored in flash drives located within servers, or inside large, flash-only storage systems, either as cache copies of centrally stored data or as the primary, working copies of that data. Because storage is a relatively slow-changing industry, wide adoption of these new architectures will take several years to complete. The market for central, shared storage systems is currently dominated by a handful of giant suppliers, who could lose their position if they do not move quickly enough to embrace flash in all its new forms. Tim Stammers is Senior Analyst at

Ovum, focusing on server and storage infrastructure, and covers the activities of vendors such as EMC, VMware, Microsoft, IBM, and HP


Global CIO

Are big companies ready for cloud ERP?


Chris Murphy

Any ERP change is high risk, but CIOs will look closely at cloud options

Blogs: Chris Murphy blogs at InformationWeek. Salesforce CEO Marc Benioff didn't invent this whole idea of going around the CIO to pitch software directly to business unit leaders. Back when Sandra Kurtzig founded ASK and was selling its MANMAN manufacturing software in the '70s and '80s, her teams regularly skirted IT to get directly to manufacturing chiefs. Dodging IT was so common that Kurtzig admits it's a bit unsettling how often CIOs are the ones initiating a discussion about her new startup's cloud-based ERP software. Kenandy's software-as-a-service handles manufacturing management, order management, financials and procurement. Its core is the order-to-cash cycle that SAP or Oracle software handles at most big companies. Kurtzig calls it the "heart and lungs" of the business, and she knows it will be among the last tasks transplanted to the cloud. But Kurtzig saw enough potential to come out of retirement in 2010 to start Kenandy, at the urging of Benioff. She says Benioff pitched her on the idea that companies will come around to running something as critical as manufacturing systems in the cloud, and that her ASK experience (CA acquired ASK in 1994) gave her the cred to deliver it. After meeting with Ray Lane, a Kleiner Perkins partner and Hewlett-Packard's chairman, and landing USD 10.5 million in funding, she started Kenandy. Kenandy's SaaS runs on Salesforce's development and data center platform, and is written in Salesforce's proprietary Apex code.

The ‘Chiquita Moment’

Kenandy has a lot to prove. It needs more than just a couple of dozen customers. The entire cloud ERP segment needs a Chiquita moment. It was in 2007 that Aneel Bhusri, cofounder of cloud HR software company Workday, announced at the InformationWeek 500 Conference that Workday had landed the 26,000-employee Chiquita



as a customer. It was Workday's first big multinational customer to put HR operations in the cloud. Soon came a deal from Flextronics, a contract manufacturer that had 200,000 employees worldwide at the time. Those deals let other cloud champions challenge their organizations: What's so special about our HR that they can do this and we can't? Flextronics CIO David Smoley drove the Workday deal, and Smoley's now optimistic about the potential of cloud-based ERP. He's spending time with startups such as Kenandy as well as established vendors such as Infor discussing the idea. A lot of companies aren't ready to put something as critical as manufacturing management in the cloud. But the switch has flipped, and cloud is now more often the first preference for new software, not a fallback. The boom in marketing technology spending is riding on the cloud, with software from workflow automation to analytics often bought as online services. "Heart and lungs" functions such as ERP will be among the last to go to the cloud because of the sensitivity of the data involved. And changing an ERP system (in the cloud or on premises) is a complex job that risks disrupting business operations. ERP replacements are to CIO careers what barely submerged shoals are to a sailor: a deadly hazard between you and where you want to go. But if the economy improves and CIOs think about ERP upgrades, they're going to explore whether phasing in cloud elements makes sense. I don't know if Kenandy has the chops to be a breakthrough startup. Perhaps the more established, cloud-based NetSuite, or cloud options from the likes of Oracle, SAP and Infor, will win the day. But I won't be shocked if ERP-in-the-cloud gets its Chiquita or Flextronics moment in 2013. Chris Murphy is Editor of

InformationWeek. Write to Chris at

Practical Analysis

Storage gets exciting — Really


Art Wittmann

We saw a big uptick in the use of Ethernet, SSDs and virtualization last year. Expect that trend to continue in 2013

Blogs: Art Wittmann blogs at InformationWeek.

We've done our State of Storage Survey for five years now, and one thing we've learned over that time is that storage pros are methodical in their adoption of technology. That's to say they're cautious; you might even say slow moving. After all, virtualization and all of its derivative technologies, Ethernet-based storage protocols and even solid-state drives have been around for a while. Uptake of one or two of those technologies generally increases only a few percentage points in a given year. But our 2013 survey shows that storage pros moved on all three of those fronts in big ways in 2012, and it looks like they'll continue that faster-paced adoption in 2013. The numbers appear to show that 2012 was a year of upgrades, often to products from new vendors, but existing vendors' products weren't necessarily retired, at least not yet. So while the percentage of survey respondents who reported using HP and IBM for Tier 1 and 2 systems stayed flat at 55 percent and 41 percent, respectively, reported use of EMC jumped eight points, NetApp jumped 12 points, Sun/Oracle jumped seven points and Dell jumped six. Most likely as a result of this trend, the 2013 survey revealed a large increase (31 percent) in IT organizations reporting that they manage 100 TB or more of stored data. We also see more concern about having enough budget and sufficient staffing. This cautious approach of bringing in new systems and technology while keeping the existing stuff isn't without its own pain. Adding vendors implies a short-term lack of expertise that always makes the first years with those vendors challenging. These new storage systems are largely networked with 10-Gbps Ethernet. Those reporting using 10 GigE for their SANs are up 46 percent, and those using it for NAS systems are up 29 percent. Storage architectures are moving toward Ethernet, somewhat at the expense of Fibre Channel but more so through a decrease in use of 1-Gbps

Ethernet. A good number of shops are aiming for a single networking technology within their data centers, and that single fabric will most certainly be Ethernet. But Fibre Channel fans shouldn't lose heart. We didn't ask about 16-Gbps Fibre Channel in last year's survey, but we did this year, and 10 percent of respondents reported using the technology. More interesting is the increased use of storage virtualization, as determined by those who say they're pooling some or all of their storage resources, up 24 percent from last year. That increase brought with it a significant uptick in other virtualization-related technologies. Thin provisioning use is up 32 percent. Tier 1 data reduction (deduplication, compression or both) is up 27 percent. Finally, there's the movement to solid-state storage. Spinning media, much like tape, still has a long life ahead of it, but improvements in the longevity, reliability and performance of solid state mean that, sooner or later, solid state will be the de facto storage technology. In our survey, 45 percent more respondents than in the previous year's survey reported using SSDs within servers. For most shops, the use is in hybrid systems that combine the performance of SSDs with the economy of traditional spinning media. Storage pros are learning that simply replacing hard drives with SSDs isn't the way to go: performance bottlenecks simply move from the drive subsystem to the networking system or elsewhere. Getting the most bang for the buck means reconsidering the storage architecture as a whole, and that transition will take a few more years to make its way from specialized apps to widespread general use. Storage pros are shaking things up like never before. For a discipline known for its caution, the next few years will be very exciting. Art Wittmann is Director of

InformationWeek Analytics, a portfolio of decision-support tools and analyst reports. You can write to him at

March 2013 InformationWeek 71

Down to Business

The taxman cometh for Big Data-driven companies


Rob Preston

Data-driven commerce is an economic good, not a bad, so we shouldn’t consider taxing it like carbon emissions




Amid soaring national deficits and debt, the powers that be complain that certain companies aren’t paying their “fair share” of taxes. A new report commissioned by the French government, but gaining some attention in the U.S. and elsewhere, recommends changing national and international tax rules to extract more money from Internet companies in particular in order to prop up government spending. The notion is that Internet titans such as Google and Amazon pay only a fraction of the taxes they ought to pay because the nature of their digital businesses lets them locate much of their profit-making operations in low-tax countries. The proposed solution: tax those companies’ “intensive use of data” in the country where that data is collected, writes Nicolas Colin, one of the authors of the controversial report.

The reasoning goes something like this: Internet companies collect all kinds of user data to deliver targeted advertising, customize products, make recommendations, adjust prices and drive any number of other profit-making endeavors. Those users, in effect, “become part of business operations,” Colin writes, in some cases replacing employees and contractors. And because those users aren’t paid like employees, their “free work” lets tech companies “reach the highest economies of scale and massive profitability,” he says. Yet the taxman can’t have at the full extent of those high profits when they’re on the books in other countries. Colin writes, “As the digital economy keeps growing, every sector’s margin will be relocated abroad, disappearing from our GDP and depriving the government from additional revenue.” The report’s broad goal is to recommend a way for developed countries to “recover the power to tax profits made by giant tech companies” based in those countries.

So who exactly are these “giant tech companies” to be subjected to this new form of data-based taxation? We hear about Google and Amazon, and it’s easy enough to extrapolate the thinking to the likes of eBay and Facebook. But is Wal-Mart a giant tech company? Is Procter & Gamble? Are General Motors and Ford? Each of them (and thousands more) is collecting many terabytes of customer and other data to fuel its business.

Another obvious question: What will be the process for determining the amount of taxable data? It’s an idea that reeks of imprecision. Colin acknowledges that because “the value of data is not yet mastered, the goal should not be to tax data collection per se. Instead it should be to create an incentive for businesses that rely on regular and systematic monitoring to adopt compliant practices in favor of user empowerment and innovation.” Huh? Is the proposal to tax data collection or not?

A fundamental assumption of the report is that governments deprived of fat tax revenues from new-economy companies are themselves fiscally responsible. They’re not. In the U.S. and elsewhere, no matter the party in power, governments have shown little interest in balancing their budgets. They need to spend the money they now collect more wisely, not turn to pie-in-the-sky Internet taxation schemes.

But I digress. Even if you think governments need more revenue, this is a loony way to get it. The report’s authors go so far as to compare their tax scheme to the one proposed in the Kyoto Protocol, whereby countries levy a tax on a company’s carbon emissions — as if leveraging data for competitive advantage were somehow congruous with polluting the environment. Data-driven commerce is an economic good, not a bad, even if companies misuse that data from time to time.

Rob Preston is VP and Editor-in-Chief of InformationWeek. You can write to Rob at
