Cloud Computing Intelligence
Discover Your Dedicated Cloud 30 mins. to set up your virtual datacentre
5 mins. to scale your resources up or down
100% SLA guarantee
Choice of European and North American datacentres
Test our vSphere as a Service www.ovh.co.uk/dedicated-cloud
From the editor
Welcome to CCi Issue 22
Every year IT analysts Gartner look at a raft of emerging technologies and place each one on a Hype Cycle infographic - a reality check on how far the technologies are from the mainstream - and, as you would expect, cloud in all its many flavours has featured heavily on the list. So it was quite a surprise to see in this year's emerging technologies Hype Cycle that Gartner still regards hybrid cloud as something that's not mature, with between 2-5 years to go before it hits the mainstream. In fact, Gartner not only finds that hybrid cloud is still not up to scratch, but has also chosen to keep it in exactly the same place as it was a year ago.
For those not familiar with the Hype Cycle, here's a quick crash course. New technologies enter Gartner's Hype Cycle in the Innovation Trigger phase, which is normally where the press and media start the long process of hype. They then move on to the Peak of Inflated Expectations, where businesses start to take notice and kick the tires. As a consequence of this tire kicking, many decide that the technology was indeed just hype, at which point it enters our favourite phase, the Trough of Disillusionment. At the end of a very long drop, businesses start to realise that there was actually something in what the media have been harping on about, and the technology moves swiftly towards the mainstream via the Slope of Enlightenment. Hybrid cloud is currently in the trough, and according to Gartner's cycle it will be another 2 to 5 years before it
moves into the mainstream, which to us seems unduly pessimistic. As far as we're concerned, hybrid is here now: many people have been using it successfully for a few years, and in our many conversations with vendors and cloud customers, hybrid is always regarded as firmly mainstream - I don't think I can remember a single business contradicting that view. Interestingly, looking back at last year's Hype Cycle, several technologies have dropped off the emerging technologies map entirely, despite Gartner predicting they were still 2-5 years from the mainstream. According to Gartner, in 2014 big data was just entering the Trough of Disillusionment with the mainstream still 5 to 10 years off, and cloud computing was just about to enter the Slope of Enlightenment with the mainstream 2 to 5 years away; both technologies are now no longer on the Hype Cycle. Make of that what you will. Additionally, on last year's cycle the Internet of Things was 5 to 10 years away and entering 'peak hype'; it is still in the same position this year. So what do you think? Are we totally wrong? We are prepared to admit that perhaps we see some of the most enlightened businesses here on CCI, but they can't all be wrong, can they?
Marcus Austin Editorial Director, CCI Magazine
EDITORIAL DIRECTOR Marcus Austin, t. +44 (0) 7973 511045 email@example.com PUBLISHING DIRECTOR Scott Colman, t. +44 (0)7595 023 460 e: firstname.lastname@example.org SPECIAL FEATURES EDITOR Graham Jarvis, t. +44 (0)203 198 9621 e. email@example.com WEB & DIGITAL Apal Goel, t. +91 (0)97 171 6733 e. firstname.lastname@example.org CIRCULATION & FINANCE MANAGER Emma Colman, t. +44 (0)7720 595 845 e. email@example.com DESIGN & PRODUCTION Jonny Jones, t. +44 (0)7803 543 057 e. firstname.lastname@example.org Editorial: All submissions will be handled with reasonable care, but the publisher assumes no responsibility for safety of artwork, photographs, or manuscripts. Every precaution is taken to ensure accuracy, but the publisher cannot accept responsibility for the accuracy of information supplied herein or for any opinion expressed. Subscriptions: CCi Magazine is free to qualified subscribers in the UK and Europe. To apply for a subscription, or to change your name and address, go to www.cloudcomputingintelligence.com, click on ‘Free Subscription – Register Now,’ and follow the prompts. Reprints: Reprints of all articles in this issue are available (500 minimum). Contact: Emma Colman +44 (0)7720 595 845. No responsibility for loss occasioned to any person acting or refraining from acting as a result of material in this publication can be accepted. Cloud Computing Intelligence (CCi) Magazine is published 10 times in 2014 by Future Publishing Solutions Ltd, and is a registered trademark and service mark of Future Publishing Solutions Copyright 2014. Future Publishing Solutions Ltd. All rights reserved. 
No part of this publication may be reproduced or used in any form (including photocopying or storing it in any medium by electronic means and whether or not transiently or incidentally to some other use of this publication) without prior permission in writing from the copyright owner except in accordance with the provisions of the Copyright, Designs, and Patents Act (UK) 1988, or under the terms of a licence issued by the Copyright Licencing Agency, 90 Tottenham Court Road, London, W1P 0LP, UK. Applications for the copyright owner’s permission to reproduce any part of this publication should be forwarded in writing to Permissions Department, Future Publishing Solutions Ltd, Lea Green Farm, Lea Green Lane, Church Minshull, Nantwich, Cheshire, CW5 6ED. Warning: The doing of an unauthorised act in relation to copyright work may result in both a civil claim for damages and criminal prosecution.
Lea Green Farm, Lea Green Lane, Church Minshull, Nantwich, Cheshire, CW5 6ED n + 44 (0)1270 522 132 n email@example.com n www.futurepublishingsolutions.com
WHAT'S BEHIND YOUR CLOUD? Tier III+ resilience
A cloud is only as reliable as the data center behind it. As the popularity of cloud-based services continues to soar, so does the importance of having the right infrastructure in place to sustain it. Our data centers provide multi-layer security, diverse power, carrier-neutral connectivity and Tier III+ levels of resilience – all designed to support exceptional cloud services. Visit our website to find out more and make sure your cloud is rock solid.
Carrier neutral connectivity
News
4-12 News
14 The smart data layer is opening up new frontiers in business
16 Getting rid of project chaos in the cloud
20 Moving your desktop to the cloud
23 Why the EU may force you to recruit a data protection officer
24 How to make big data analytics work for the business
26 Data Protection, Contracts and the Cloud: ISO 27018
30 Infinity creates the data centre for the customer of the future
34 What to do if the CFO says no
38 Eight steps to complete cloud security
IBM creates new open source developer resources
A new open source initiative from IBM aimed at cloud application developers launches with a wide range of free-to-use tools and projects. IBM has unveiled a new platform for developers to collaborate with IBM on a newly released set of open source technologies, contributing 50 projects to the open source community to speed enterprise adoption in mobile, analytics and other high-growth areas. The new developerWorks Open is a cloud-based environment where developers can access emerging IBM technologies and technical expertise, and collaborate with a global network to accelerate their projects. The platform gives developers the ability to download code and to access blogs, videos, tools and techniques to accelerate their efforts, along with the confidence to build and deploy open source apps that satisfy their clients'
demanding business requirements. Among the broad range of technologies already deployed on developerWorks Open, IBM is making available projects in key industry areas to help bridge the development gap. The 50 pre-written projects are split across industries including healthcare, mobile, retail, insurance and banking. Additionally, IBM is open sourcing several analytics technologies, including:
• Activity Streams, which provides developers with a standard model and encoding format for describing how users engage both with the application and with one another
• Agentless System Crawler, a unified cloud monitoring and analytics framework that enables visibility into all types of cloud platforms and runtimes
• IBM Analytics for Apache Spark, which adds big-data analytics and is available in Beta
on Bluemix. IBM will also continue with its IBM Object Storage on Bluemix Service Broker open source cloud data service, which can be used to integrate OpenStack Swift with Cloud Foundry, allowing fast access to cloud data without needing to know where the data is stored. “IBM firmly believes that open source is the foundation of innovative application development in the cloud,” IBM Vice President of Cloud Architecture and Technology Dr. Angel Diaz said. “With developerWorks Open, we are open sourcing additional IBM innovations that we feel have the potential to grow the community and ecosystem and eventually become established technologies.”
Cloud Native Computing Foundation aims to standardise the cloud
A new Cloud Native Computing Foundation is to advance the state of the art in building cloud-native applications and services. The newly formed Cloud Native Computing Foundation (CNCF) - a consortium of the Linux Foundation and every cloud business worth its salt - aims to advance the state of the art in building cloud-native applications and services. According to CNCF, it plans to “create and drive the adoption of a new set of common container technologies driven and informed by technical merit and end-user value and that is inspired by Internet-scale computing. This work seeks to improve the overall developer experience, paving the way for faster
code reuse, improved machine efficiency, reduced costs and increases in the overall agility and maintainability of applications.” That may sound a little like the aims of the OCP; however, according to Ben Golub, CEO of Docker, there's more to it than just containers. “The OCP initiative announced last month at DockerCon establishes a foundation for container-based computing, with a common image and runtime format for containers. At the orchestration layer of the stack, there are many competing solutions and the standard has yet to be defined. Through our participation in the Cloud Native Computing Foundation, we are pleased to be part of a collaborative effort that will establish interoperable reference stacks
for container orchestration, enabling greater innovation and flexibility among developers. This is in line with the Docker Swarm integration with Mesos, which we demonstrated at DockerCon one month ago.” The Foundation will look at open source at the orchestration level, followed by the integration of hosts and services, defining APIs and standards through a code-first approach to advance the state of the art of container-packaged application infrastructure. The organisation will also work with the recently announced Open Container Initiative on its container image specification. W: cncf.io
Ready for NOW? Introducing a new approach to data centre services that gives you the agility and flexibility you need to take control of Now.
Scale up and scale down
Only pay for the power you need
Call us on +44 (0) 20 7079 2990 For more information visit www.infinitysdc.com
Public cloud in financial services? You Canute be serious
Nearly half of financial services businesses admitted to bypassing the IT department when purchasing public cloud. The cloud, like the sea, cannot be stopped, and a new study launched jointly by EMC, VCE and VMware shows that the cloud is an unstoppable force: just as King Canute could not order back the tide, shouting at line-of-business users to stop using the cloud isn't going to have any effect. The survey found that even in the most heavily regulated industry in the UK, financial services, 92% of line-of-business workers admitted to using some form of public cloud. Of the 600 line-of-business (LOB) workers surveyed across the UK, some 44% of those in financial services admitted to bypassing the IT department when purchasing public cloud services.
The research also reveals the high level of public cloud adoption by LOBs in the financial services sector: the overwhelming majority (92%) of UK financial services LOBs say they use some form of public cloud, whether validated by IT or not, with the factors driving adoption ranging from affordability (41% cited it as a cheaper solution) and ease of use (34%) to wanting to meet client needs (30%). “The research highlights why the scale of public cloud adoption, without consultation with the IT department, should cause concern for the industry,” comments Nigel Moulton, Chief Technology Officer, EMEA at VCE. “As employees look to deploy cloud services that circumvent the IT department, IT organisations need to have a strategy that gives easier access to cloud services in a controlled,
compliant manner.” In order to meet these customer needs in an increasingly competitive and disruptive market, the study highlights the pressure IT is under to deliver value to the business at pace. IT departments need to embrace a hybrid cloud strategy that can provide portability of workloads in a flexible, agile fashion. “With an enterprise hybrid cloud strategy in place, lines of business can focus on their business interests while gaining immediate access to secure, easy-to-access resources and the benefits hybrid clouds provide,” explains Rob Lamb, Cloud Business Director, UK and Ireland, EMC. “As a result, IT and businesses can embrace the public cloud with confidence, in a secure and manageable way.” W: www.vce.com
Google’s Kubernetes orchestrator V1.0
After a year's worth of work, the first full 1.0 version of the open source container orchestration system Kubernetes is available, and includes 14,000 commits from 400 contributors. Last February, Kubernetes contributors and instigators Google got together in San Francisco and agreed on what the first full version of the open source container orchestration system would be in terms of features, reliability, and supportability. Now, six months later, the team of 400 contributors has produced a first version that is scale-tested to thousands of containers per cluster and hundreds of nodes. CNCF will be guided by a technical committee which will engage open source and partner communities to build new software to make the entire container toolset more robust. They
will also evaluate additional projects for inclusion in the foundation and ensure that the overall toolset works well as a whole. Features of the first version of Kubernetes include:
• Core functionality for deploying and managing workloads in production, including DNS, load balancing, scaling, application-level health checking, and service accounts
• Stateful application support with a wide variety of local and network-based volumes, such as Google Compute Engine persistent disk, AWS Elastic Block Store, and NFS
• Groupings of closely related containers into pods, enabling easy updates and rollback
• The ability to inspect and debug applications with command execution, port
forwarding, log collection, and resource monitoring via CLI and UI
• Live clusters that can be upgraded, dynamically scaled, and partitioned via namespaces for deeper control over resources - for example, you can segment a cluster into different applications, or test and production environments
• A stable, fast API with sub-five-second responses to schedule containers, and a formal deprecation policy
Early adopters of Kubernetes include CoreOS, which has launched Tectonic Preview with Kubernetes 1.0; CloudBees, which is releasing Kubernetes plugins for Jenkins; and Hitachi Data Systems, which is offering Kubernetes on its Unified Computing Platform. W: kubernetes.io
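The declarative model behind these features can be pictured as a reconciliation loop: you state the desired number of replicas, and the system works out the actions needed to converge the running state on it. The Python sketch below is purely illustrative - it is not Kubernetes code, and all names are invented:

```python
# Toy reconciliation loop: compare desired state with observed state
# and emit the actions needed to converge. Illustrative only.

def reconcile(desired_replicas, running):
    """Return the actions needed to converge running pods on the desired count."""
    actions = []
    if len(running) < desired_replicas:
        # Scale up: start enough new pods to reach the desired count.
        for i in range(desired_replicas - len(running)):
            actions.append(f"start pod-{len(running) + i}")
    elif len(running) > desired_replicas:
        # Scale down: stop the surplus pods.
        for pod in running[desired_replicas:]:
            actions.append(f"stop {pod}")
    return actions

print(reconcile(3, ["pod-0"]))                    # scale up by two
print(reconcile(1, ["pod-0", "pod-1", "pod-2"]))  # scale down by two
```

In the real system a controller runs this comparison continuously, which is also what makes the rolling updates and application-level health checking described above possible.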
Containerisation offers true pay-as-you-use IaaS
The new, simple, container-based bare-metal cloud infrastructure service promises to reduce bills by up to 37% by offering true usage-based billing. Cloud hosting business ElasticHosts has created Springs.io, a standalone business that claims to be the first pay-as-you-use cloud service. Yes, you read that right: the first. But isn't that what you already have with a pay-by-the-hour solution? It seems not. With pay-by-the-hour solutions, you select the server, memory and storage configurations up front and then pay by the hour; if you don't use all of the memory or the CPU, you still pay for it. With the Springs.io solution you still choose the specification up front, but you only pay for what you use, with resultant savings of up to 37% for a server operating under a normal load; even under a heavy load, savings could be as much as 16%. The solution is based on ElasticHosts' auto-scaling elastic containers technology announced last April and is aimed primarily at Linux developers, agencies, and small businesses looking for low-cost, auto-scaling compute infrastructure. “We have been listening to the market and what we are hearing is that people are craving simplicity,”
comments Richard Davies, founder of Springs.io. “They just want to be able to sign up to a service without having to choose instance sizes or worry about over-paying, just as you would with your gas or electricity. While some customers need greater support and configuration, many don't, and we wanted to provide a service for users looking for a simpler offering. For example, configuring an environment so that it can automatically scale is a skill that many SMEs may not have in-house. Springs.io is very clear and easy to navigate, and best of all it automatically scales and bills users based on their actual usage - so it really does the thinking for you.” With the new solution, you might pay for just 1GB when a 4GB server is lightly used. These containers boot up in two seconds and offer bare-metal performance without the overhead of virtualisation. Springs.io uses namespaces and cgroups, the same Linux kernel containerisation technologies as Docker or LXC, backed by high-I/O SSD storage throughout. As a result, Springs.io can allow auto-
scaling, high-performance Linux containers at a fraction of the cost of traditional virtualisation-based cloud servers. “While the technology is complex, the service is remarkably simple,” Davies concludes. “Businesses need a whole new service that strips away any complications, which is what Springs.io offers. This is the next step in the evolution towards a completely utility-based offering, which is what cloud has always promised to be, but until now has often failed to achieve.” Prices start at $0.008 per MHz/hour for CPU and $0.011 per MB/hour for RAM, with SSD storage at $0.25 per GB/month. Springs.io claims savings could peak at up to 50%, along with two-second server boot times and new servers spun up in under a minute. There's currently one terabyte of free data transfer per month, with fees of $0.05 per GB over the monthly terabyte allocation.
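To see how usage-based billing differs from paying for a provisioned configuration, here is a rough Python sketch. The per-unit rates are the list prices quoted above, but the server sizes and utilisation figures are invented for illustration; actual Springs.io billing mechanics may differ.

```python
# Usage-based vs provisioned billing: with hourly provisioned billing you
# pay for the full configured capacity; usage-based billing charges only
# for what was actually consumed each hour. Rates from the article's quoted
# list prices; workload numbers are illustrative.

CPU_RATE = 0.008   # $ per MHz-hour
RAM_RATE = 0.011   # $ per MB-hour

def hourly_cost(cpu_mhz, ram_mb):
    """Cost for one hour at the given CPU and RAM consumption."""
    return cpu_mhz * CPU_RATE + ram_mb * RAM_RATE

# A server configured with 2000 MHz of CPU and 4096 MB of RAM...
provisioned = hourly_cost(2000, 4096)

# ...that actually averages 40% CPU and 1024 MB of RAM in use.
used = hourly_cost(2000 * 0.4, 1024)

saving = 1 - used / provisioned
print(f"provisioned: ${provisioned:.2f}/h, used: ${used:.2f}/h, saving: {saving:.0%}")
```

The lighter the average load relative to the provisioned specification, the larger the gap between the two figures, which is the effect behind the savings claims quoted above.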
Cloud Industry Forum finds organisations are stuck fast to Windows Server 2003
Research from the Cloud Industry Forum highlights the challenge of migrating from Microsoft Windows Server 2003 for larger organisations. The latest research from the Cloud Industry Forum (CIF) indicates that while small businesses have been steadily migrating away from Windows Server 2003, larger enterprises, in spite of the July 14 deadline, have been staying put. The survey found that a quarter of organisations with fewer than 20 employees have upgraded from WS2003 in the past year, taking usage from 58% in 2014 to 44% in 2015. This is in stark contrast to larger enterprises, where use of WS2003
has remained almost static. More than half (51%) of organisations with 21-200 employees, and 72% of those with more than 200 staff, are still reliant on WS2003, despite the lack of support leaving them susceptible to potential vulnerabilities. Alex Hilton, CEO of CIF, added: “After July 14th, with Microsoft no longer issuing security updates for Windows Server 2003, there will be a potential risk to customers who have not upgraded. Cloud services now represent a viable short-term fix for many customers who lack the in-house resources. “The end-of-life of older operating systems and solutions such as Windows Server 2003 and Microsoft's
Small Business Server has created a new imperative and a new opportunity to look to cloud-based alternatives. While many businesses will undertake a rudimentary incremental upgrade and others will take the opportunity to refine their IT strategy, a far simpler and easier option would be to embrace the chance to move the infrastructure workloads to cloud services. The coming months represent a great opportunity for customers to make a cloud migration and adopt the latest enterprise-ready technology in a cost effective manner.” Hilton summarised. W: cloudindustryforum.org
AppFormix partners with Mirantis to optimise the cloud
New analytics embedded into Mirantis OpenStack will help developers and IT managers discover and eliminate bottlenecks. Newly launched AppFormix is to embed its analytics and control services into Mirantis OpenStack-based private cloud infrastructure. The intention of the partnership is to enable OpenStack operators to improve the efficiency of their infrastructure while ensuring predictable application performance, to provide real-time analysis of how applications use server, storage, and networking resources, and to set policies to control resource
allocation and eliminate bottlenecks. The new AppFormix software will give users of Mirantis OpenStack utilisation analytics designed to discover and eliminate bottlenecks across I/O resources, as well as provide data-driven capacity planning for infrastructure. The analytics have been designed to work whether applications are running in virtual machines or containers, on-premises or in the public cloud, and IT managers are able to designate exactly what resources an application can use, for more efficient infrastructure use and a more consistently performing
application. “Developers are adopting a new generation of tools that will help them build web applications quickly and easily,” said Mirantis co-founder and chairman, Alex Freedland. “To keep up with this wave of innovation, operators need to measure application and infrastructure resources together in real-time. Integrating Mirantis OpenStack with AppFormix will solve this problem for the operator, building diagnostics into OpenStack and making it easy to fine-tune application performance.” W: www.mirantis.com
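The bottleneck-discovery idea described above can be illustrated with a few lines of Python: given recent utilisation samples per resource, flag the resource that is saturating. This is a toy sketch invented for illustration, not AppFormix's actual algorithm or API.

```python
# Toy bottleneck detection: average each resource's recent utilisation
# samples and flag the busiest resource if it exceeds a threshold.
# Illustrative only.

def find_bottleneck(samples, threshold=0.8):
    """Return the resource with the highest average utilisation,
    if that average is at or above the threshold; otherwise None."""
    averages = {res: sum(vals) / len(vals) for res, vals in samples.items()}
    worst = max(averages, key=averages.get)
    return worst if averages[worst] >= threshold else None

metrics = {
    "cpu":     [0.55, 0.60, 0.58],
    "disk_io": [0.91, 0.95, 0.88],   # sustained high I/O utilisation
    "network": [0.30, 0.25, 0.35],
}
print(find_bottleneck(metrics))  # disk_io
```

A real system would of course correlate many more signals across hosts and time, but the principle - measure per-resource utilisation, then act on the outlier - is the same one the partnership describes.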
Logi Analytics self-service analytics for end-users and experts
Analytics business revamps its Info and Vision tools and introduces a new DataHub and Discovery Module to help deliver a single technology that fits both end-users and data analysts. The latest version of Logi Analytics' self-service analytics platform, Logi 12, sees a rethink of the platform, with a new solution built from the ground up to deliver analytics to non-professional users as well as data analytics experts. The new version consists of a new product (Logi DataHub), a new module (the Discovery Module for Logi Info), and significant enhancements to the Logi Info business intelligence (BI) platform and the Logi Vision data discovery application. The new solutions give users the choice of interacting with IT-created reports and dashboards, authoring their own reports and dashboards, or exploring data to detect
patterns, identify trends, and answer new questions. Additionally, Logi 12 offerings can be embedded directly within the context of the applications people use every day, improving adoption and leading to smarter business decisions for the line of business. The new Logi DataHub works with both Logi Info and Logi Vision to simplify data preparation, allowing users to rapidly connect, acquire, and blend data from files, applications or databases, whether on-premises or in the cloud; cache it in a high-performance, self-tuning repository; and prepare it using DataHub's smart profiling, joining, and intuitive data enrichment. The new Discovery Module is an add-on for Logi Info that offers exploratory visual analysis which can be embedded in existing applications. The Discovery Module includes chart
recommendations to help users understand data and discover insights, and users can publish new insights to shared dashboards to collaborate with workgroups. Logi Info now includes new interactive visualisations, self-service dashboard authoring, and real-time charts, and Logi claims faster application development, integration with Git and TFS, and 2x-10x faster performance. Logi Vision has an improved data visualisation recommendation engine that learns from user activities, and includes new project templates to help shortcut data discovery, with pre-populated data connections, visualisations, and dashboards that simply require users to connect their data to begin their analysis. W: www.logianalytics.com
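The "blend and join" step that DataHub automates can be pictured as a simple key-based join of rows from two sources before analysis. The Python below is a toy sketch invented for illustration; Logi DataHub's actual profiling, caching and joining are far more sophisticated.

```python
# Toy data blending: inner-join rows from two sources on a shared key,
# as a data-preparation tool might do before analysis. Illustrative only.

crm = [
    {"customer_id": 1, "name": "Acme"},
    {"customer_id": 2, "name": "Globex"},
]
sales = [
    {"customer_id": 1, "total": 1200.0},
    {"customer_id": 2, "total": 450.0},
]

def blend(left, right, key):
    """Inner-join two lists of row dicts on a shared key column."""
    index = {row[key]: row for row in right}
    return [{**l, **index[l[key]]} for l in left if l[key] in index]

for row in blend(crm, sales, "customer_id"):
    print(row["name"], row["total"])
```

The value of a product like DataHub is doing this across heterogeneous sources (files, applications, databases) with caching and enrichment, rather than hand-writing the join each time.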
Actifio now available on AWS
Actifio has expanded its data management technology to customer workloads on Amazon Web Services and added new DevOps features. Boston-based copy data virtualisation company Actifio has launched Actifio Sky for AWS to help businesses improve their data access, agility, and control on the AWS Cloud. Actifio is used by many organisations as a filter in front of their backup and recovery and business continuity solutions to save on data storage costs, and it powers offerings including IBM's new SmartCloud Data Virtualisation (SCDV) and Sungard's Recover2Cloud cloud disaster recovery services. The business recently added a series
of new features and functional enhancements aimed at better supporting the growing number of customers leveraging its patented Virtual Data Pipeline (VDP) technology in support of DevOps use cases. New features include:
• Authorised developers, DBAs, and data scientists can now access fresh copies of production data securely and instantly
• Operations teams are now able to better support users with self-service data access built for the enterprise
• Simplified protection of test/dev environments - administrators are able to protect, replicate, and restore clones used for test/dev environments whilst minimising the need for storage
• Enhancements to the popular
LogSmart feature - users can go beyond recovery and mount a database at any point in time for application development, testing, analysis and more
• The next generation of Actifio's powerful Resiliency Director
Greg Scorziello, Managing Director of Actifio EMEA, said: “Actifio improves application development by letting customers build apps faster, more easily and less expensively. It improves code quality and time to market. Once an application is live in production, Actifio improves data protection, access, and control. Moreover, it integrates the concept of application retirement, enabling not just access to historical data but access to entire applications already decommissioned from production.” W: www.actifio.com
Cyber threats evolving too fast to keep up with
New survey reveals that despite companies employing more IT security staff than ever before, the hackers are still winning. Despite organisations employing more IT security personnel than ever before, a new survey from US firm Lieberman Software reveals that a third of companies do not feel this is making them more secure, because cyber-attacks are evolving too fast for them to keep up. The survey, which was carried out at RSA Conference 2015, studied the attitudes of nearly 200 IT security professionals and revealed that 67% of organisations are now employing more IT security staff than they ever
have in the past. However, 76% of respondents still believe that cyber-attacks are evolving too fast for their IT security personnel to keep up with - and this from a survey of people at a security conference, who are supposedly the experts in security; if anyone can keep up, surely it's them. Other findings from the study reveal that a staggering 85% of organisations find it a struggle to find really good IT security personnel capable of combating today's cyber-attacks. As we have said before, this skills gap is liable to put organisations at risk, as it could make it easier for cyber criminals to gain access to company networks - particularly as this survey reveals that those
organisations may not have a proper understanding of cyber security threats, or may lack policies to prevent staff from carrying out tasks which put their company network at risk. Commenting on the findings, Lieberman Software CEO Philip Lieberman said: “The dramatic increase in data breaches over the last few years has led to a demand within organisations to employ skilled IT security staff. However, many companies have struggled to find staff who are competent enough to defend against the type of sophisticated cyber-attacks we are frequently seeing today.” W: www.liebsoft.com
Intel collaborates with Rackspace on OpenStack for the enterprise
Intel and Rackspace have got together to create a new joint OpenStack Innovation Centre and to work on fast-tracking OpenStack. Managed cloud business Rackspace and chip giant Intel Corporation are to create the OpenStack Innovation Centre in San Antonio, Texas, where they will collaborate on accelerating the development of enterprise capabilities and significantly adding to the number of developers contributing to upstream OpenStack code. The project will bring together OpenStack engineers from Rackspace and Intel to advance the scalability, manageability and reliability of OpenStack by adding new features and functionality and eliminating bugs through upstream code contributions. Rackspace founded the OpenStack open source cloud software platform with NASA in July 2010, and since then OpenStack has become
one of the fastest growing open source communities in the world, with more than 520 member companies and 27,000 individual contributors across 167 countries; 451 Research estimates that OpenStack revenue will reach $1.26 billion in 2015. The agreement between Rackspace and Intel also includes:
• OpenStack Developer Training - Rackspace and Intel will offer new modules of courseware designed to onboard and increase the number of open source developers actively contributing to the success of the community
• Joint OpenStack Engineering - Rackspace and Intel will resource OpenStack development, working in collaboration with the OpenStack Enterprise Work Group and community, targeting bug elimination and the development of new enterprise features. The companies will recruit
new engineers to participate in OpenStack development
• Largest OpenStack Developer Cloud - Rackspace and Intel will build and make available to the community two 1,000-node clusters to support advanced, large-scale testing of OpenStack and of new features developed through the joint engineering work. The companies anticipate having the clusters available for community use within the next six months
“The community's goal is to foster collaboration and spur innovation that drives broad adoption,” said Jonathan Bryce, Executive Director of the OpenStack Foundation. “The depth of experience and community engagement that Rackspace and Intel offer makes this an exciting project, as the code contributions and large-scale testing will benefit everyone who uses OpenStack.” W: www.rackspace.co.uk
Business Cloud: Matt Smith
The smart data layer is opening up new frontiers in business Matt Smith explains how significant changes in computing hardware and data science are now allowing a much broader range of businesses to utilise truly huge amounts of diverse data in real time.
Matt Smith Matthew Smith is Chief Technology Officer at Software AG in UK, Germany, Nordics & South Africa and has a background in IT and Business Transformation stretching back nearly 20 years starting as a published scientist in Military Research. He has worked in large transformational programmes ever since, and specialises in real-world advice and guidance on topics ranging from business discovery, solution design, business automation and transformation, board ready business case creation, governance and control, team staffing and business innovation.
Until fairly recently it was only the big banks and trading institutions that could afford to deploy streaming analytics and event processing technology, using complex algorithms and low latency messaging to take split-second decisions in the markets. Now, however, significant changes in hardware and advances in data science mean the same technology is being adapted for use in sectors as varied as telecoms, manufacturing and retail, where individual businesses generate masses of data. One of the key dynamics has been the falling price of parallel processing chips, which has occurred just as their computational power has increased hugely, following Moore's law about the doubling of transistors in integrated circuits every two years. Similarly, the cost of memory has fallen markedly, so that it is no longer prohibitive for a business to place several terabytes of its data in RAM. In simple terms, this means medium-sized businesses can now conduct tasks that were previously out of reach and required big farms of servers. These new opportunities have led
to a rapid change in attitudes towards in-memory computing, since it now takes a few hours to load and index huge amounts of data that would previously have taken days. A SMART Data Layer All kinds of data can now be stored and used in an entirely transient way that would have been inconceivable a few years ago when memory and processing power were so much more expensive. Traditionally, companies were only able to analyse a limited amount of their internally-generated data in a timely fashion. Now, however, they potentially have access to enough capacity to draw it in from external sources, along with their own historical records that could go back decades, and to submit it all to comparison and pattern matching. Indeed, in-memory data management finally makes big data consumable when used in conjunction with the technologies that were formerly only cost-effective for stock market trading and high-performance banking operations. Where breakthrough innovation is happening now is in combining this computational power and in-memory
management with pattern-matching technology, using advanced time-based algorithms – making the data layer smart – a SMART Data Layer. A simple example of how this works is in the supermarket sector, where it is often said that if bananas are not sold within ten minutes, something is seriously amiss within a store. Using smart data layer technology, it is possible to analyse the mass of cash till data to show that sales of bananas in a store are not currently matching the usual pattern. This then triggers a stock-take, using in-store sensors, and then if necessary an alert to the manager via their smart badge so that action can be taken to put sales of bananas back on track. Savings through long-term business rules to govern processes This is a straightforward example, but there are two further ways in which this technology is being used to bring major gains. One is in the creation of long-term business rules to govern processes, such as supply chain functions, manufacturing or complaint-handling. In pharmaceuticals, for example, businesses are able to use this technology to track their supply chains and spot the gaps in handling when
fakes may have been substituted or stock stolen. They can also use it to flag up when consumption rates have changed. Equally, by deploying smart data layer technology to monitor the information streaming from sensors, manufacturers of silicon wafers and high-grade copper wire have been able to lock in huge improvements and reduce waste. The technology ensures that processes are always operating optimally, by adapting to changing conditions in real time. For one wire manufacturer, waste has dropped tenfold, resulting in huge savings and a product of much higher quality. The second area of application is much more transient: correlating, for example, customer relationship management information, marketing and website data for short-lived campaigns of 24 hours. A retail company can load all the transactions from the last 12 hours and use them for a closely targeted flash marketing campaign, simply disposing of the data the following day. Assisted decision making lifts sales The algorithms within the smart data layer can look at the data, including public sentiment from social media, to help drive a rapid, short-lived
campaign in a particular location. This kind of assisted decision-making has led to campaigns that have lifted sales immediately. The Smart Data Layer is therefore a technology whose time has come, created by the perfect conjunction of cheaper and more powerful processing and memory, along with the use of advanced time-based algorithms. However, if companies are to ensure they reap the benefits from these opportunities, they will need data scientists to tailor the data, rules and algorithms to their precise needs. They also require access to substantial amounts of data and to the hardware on which this technology rests. Most of all they must have the expertise to integrate it all effectively. Not only will it yield a huge return on investment, it is also capable of doing so in a few weeks, compared with the many months it might otherwise take an enterprise's IT department with traditional methods. As enterprises undergo significant cultural change and develop their IT departments into more business-led digital technology operations, the Smart Data Layer will surely be at the heart of things. n
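The supermarket example above boils down to a streaming check of recent sales against the store's usual pattern. The following is a minimal sketch of that idea, not any vendor's implementation; the function names, window size and tolerance threshold are all illustrative assumptions.

```python
from collections import deque
from statistics import mean

def make_sales_monitor(expected_per_min, window=10, tolerance=0.5):
    """Watch per-minute banana sales and flag when the recent rate
    falls well below the store's usual pattern (all values illustrative)."""
    recent = deque(maxlen=window)  # rolling window of recent sales figures

    def record(units_sold):
        recent.append(units_sold)
        if len(recent) < window:
            return None  # not enough history yet to judge the pattern
        if mean(recent) < expected_per_min * tolerance:
            # In the article's scenario this would trigger a stock-take
            # and an alert to the manager's smart badge.
            return "ALERT: trigger stock-take"
        return None

    return record

monitor = make_sales_monitor(expected_per_min=4.0)
for minute in range(10):
    monitor(1)  # sales running well below the usual 4 per minute
print(monitor(1))  # the sustained shortfall raises the alert
```

In a real smart data layer this comparison would run in memory over the full till-data stream and against learned historical patterns, but the shape of the check - recent window versus expected baseline - is the same.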
Project Management: Maria Nordborg
Getting rid of project chaos in the cloud Maria Nordborg shows how cloud-based project management can help managers and their teams keep up in the always-on, digital world.
Maria Nordborg Maria Nordborg is Director of Customer Experience for Projectplace at Planview and has been working with the Projectplace product since April 2001 where she manages the Customer Experience teams. She also speaks frequently at events and webinars about Lean, Agile, Kanban, Change Management and Management techniques.
As the working environment becomes more and more complex, there is increasing discussion around the business benefits of cloud technology and superfast mobile broadband. Experts agree that new technologies will make true mobility and on-demand intelligence a reality, allowing people to work on their own terms and enabling leaner, more productive workspaces. Nonetheless, little has been said about how technology has also created an expectation for project managers to be more accountable, productive and collaborative than ever before. According to a recent report, new technologies are also impacting European business habits and changing how people work together on projects. The study reveals that inefficient ways of working lead managers to waste up to 20 working
days a year, resulting in stress, missed deadlines and overextended budgets. This leads to decreased productivity and leaves projects in chaos. However, by adopting some of the following steps, managers and their teams can keep up in the always-on, digital world: Embrace a cloud-based flexible working policy Begin by outlining your company's flexible working guidelines and looking towards technologies, such as the cloud, that will support your remote employees. You and your team might not all be sitting in the same office every day, but you can still collaborate and communicate online through a "virtual office" to produce the same level of work as before, with results just as effective. The cloud will enable your staff to work from different locations, and
use the hardware and applications of their choice, rather than those already under IT jurisdiction. Anything from mobiles to tablets and laptops will be used, so secure tools that can be accessed across multiple platforms and mobile devices are ideal. Look for cloud-based solutions to avoid the pain of having to reconfigure and manage on-premise software or the servers needed to support it. Roll out BYOD and other BYOX policies The Workforce 2020 study predicts that the ability to work at any time and from anywhere is becoming a major factor in the business world, especially with more millennials joining and shaping the workforce. However, despite the growing number of cloud-based, virtualised tools available, lack of communication between team members is still a big headache
for businesses, many of which still need to overcome the challenge of integrating their employees' devices and applications with their own systems. The key is to adopt a simple, open, company-wide policy that helps organisations achieve a balance between security and user-friendliness in all the technology choices made. This allows employees to make the most of cloud-based tools. Use the cloud to keep track of actions and milestones With studies showing how the visualisation of information can counterbalance cognitive overload, more and more businesses are turning to innovative ways of working that empower teams to collaborate, plan and execute their work in the cloud, while providing project managers with a complete overview of the progress made. For example, in addition to using Gantt charts to graphically represent goals, priorities and timelines, many businesses use a 60-year-old Japanese method called Kanban to visualise project workflows. By placing tasks in one of three columns - 'To Do', 'In Progress' and 'Done' - and moving them from one to the other as they progress, managers can easily see how a project is progressing and what their teams are working on. Stop email addiction with better communication methods With more dispersed teams, combined with the adoption of flexible working policies, it's worth looking beyond traditional communication tools like email. Whilst juggling various tasks and projects, people will often find they're struggling to keep on top of their emails - and achieve "inbox zero" as some call it - to get their jobs done. Email isn't exactly the most efficient tool for collaborating with peers and sharing or modifying documents together, with all the back and forth of edits. With the proliferation
of mobile devices and BYOD in the workplace, using cloud-based tools is a natural step for teams to streamline business communications. Protect your data Keeping data safe has never been as important as it is today. But while two-thirds of managers can access sensitive data in their organisation, only half can see who has actually read, changed or downloaded a shared document. If employees are using their own devices to access documents out of the office, businesses should establish some ground rules regarding security. Device security, firewalls and passwords are a great place to start, but they should also educate team members on how they can help prevent any slip-ups, too. While they shouldn't lose trust in the cloud, organisations should make sure that employees understand the possible consequences for the business if sensitive customer information or intellectual property lands in the wrong hands, such as large fines, a tainted reputation or even a damaged bottom line. Taking simple measures, such as making sure they read the small print when signing
up to cloud services or letting the IT department know about the tools they're using, can help ensure the business won't be at risk. Choosing certified cloud tools equipped with the latest two-factor authentication, single sign-on and encryption technologies is a must. Using the cloud to achieve project harmony To manage projects successfully, businesses need to achieve a balance of elements, including a motivated team, easy collaboration, smooth communication, and getting things done on time and within budget. When balance is achieved, projects run like clockwork. But if just one of those elements slips, the balance is disrupted and chaos follows. While the rapid evolution of technology might sometimes catch organisations off-guard, the latest cloud-based innovations are proving invaluable in empowering businesses to tackle chaotic, inefficient ways of working head-on. By harnessing tools that facilitate new ways of working, they can propel smarter, goal-driven collaboration and ensure project success. n
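The three-column Kanban flow described in this article can be sketched as a tiny data structure. This is an illustrative toy, not a real project tool's API; the class, column and task names are assumptions for the example.

```python
class KanbanBoard:
    """Minimal three-column Kanban board: tasks move
    'To Do' -> 'In Progress' -> 'Done' as work advances."""
    COLUMNS = ("To Do", "In Progress", "Done")

    def __init__(self):
        self.columns = {name: [] for name in self.COLUMNS}

    def add(self, task):
        # New work always enters the board in 'To Do'.
        self.columns["To Do"].append(task)

    def advance(self, task):
        # Move a task one column to the right.
        for i, name in enumerate(self.COLUMNS[:-1]):
            if task in self.columns[name]:
                self.columns[name].remove(task)
                self.columns[self.COLUMNS[i + 1]].append(task)
                return
        raise ValueError(f"{task!r} not found or already done")

    def overview(self):
        # The project manager's at-a-glance view of progress.
        return {name: list(tasks) for name, tasks in self.columns.items()}

board = KanbanBoard()
board.add("Draft budget")
board.add("Book venue")
board.advance("Draft budget")  # To Do -> In Progress
board.advance("Draft budget")  # In Progress -> Done
print(board.overview())
```

The point of the visualisation is exactly what `overview()` returns: one glance shows what is queued, what is being worked on and what is finished.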
Virtual Desktop: Graham Jarvis
Moving your desktop to the cloud Graham Jarvis looks at the best way to move your desktop infrastructure to the cloud and looks at the options available for the virtual desktop market.
Graham Jarvis Graham Jarvis is an experienced technology and business journalist and is a staff member on CCi.
In the US the virtual desktop infrastructure (VDI) market is expected to grow at a compound annual growth rate (CAGR) of 29.7% between now and 2019. In the EMEA region the growth rate of VDI isn't so certain, but in one of its earlier reports Research and Markets predicted that cloud-based VDI would grow at a similar rate of 24% CAGR. In spite of some distinct challenges, the market is very competitive and it has some immediate potential for growth. T-Systems, for example, brought its virtual desktop product Dynamic Workplace to market a year ago and has in that time signed up 100,000 users within global enterprises. The company says there is more growth in its sales pipeline too, and it finds that cloud-based end-user computing (EUC) is picking up. In the past most firms concentrated on the virtualisation and rationalisation of back-end systems, but CIOs are now looking to further reduce costs. To do this they have turned to the end-user computing space, as historically VDI didn't conclusively prove itself from a cost perspective. The newer technologies are permitting organisations to look at this again in order
to assess how they can use the cloud to drive out EUC costs. For example, they want to find out how they can use cloud-based desktop services to allow contractors to bring and use their own devices, because by doing so capital expenditure savings can be made. This strategy also leads to a reduced reliance on technical helpdesks and support - which in turn leads to an ability to reduce operational expenditure (OPEX). VDI's rough ride Scott Cairns, CTO at T-Systems, nevertheless says that virtual desktop solutions have had a rough ride over the years. That's because storage costs have held the technology back in the datacentre space. "This is compounded by the fact that companies have been pushed to move from one desktop migration to the next, and this can be quite frustrating for customers because applications run a business and not the operating system", he says. In spite of these issues he comments that the initial operating system upgrade is relatively painless compared to "assuring that your application portfolio runs on a new operating system." Yet companies are looking
for new technologies – including virtualisation in general, but they aren’t expecting VDI to stand on its own. Aggregating workspaces HTML 5, for example, is allowing companies to deliver desktops in a different manner. So VDI on its own is not going to enjoy a high uptake because customers are looking for other kinds of technology-based solutions, ranging from applications delivery through Citrix to access to their ‘apps’ via browser technologies such as HTML 5. In essence the market is in Cairns’ view looking to “create aggregated workspaces by pulling together all of the applications an employee consumes on the most appropriate software and hardware platform.” He adds: “With the consumerisation of IT and the penetration of a multitude of devices into the organisation, companies are looking to break the chain between a physical asset, and the workspace the employee needs to perform their job – and there should be no reason why they can’t move their different devices and still get to their workspace.”
CCI also caught up with Neil Thomas, Product Director of Claranet, after Cloud Expo Europe. He says the key benefits of deploying a virtual desktop strategy revolve around flexible working, disaster recovery and security. Yet there are in his view some secondary benefits too. These include the simplification of the desktop environment and its management, and bring-your-own-device (BYOD) support. "If for example you just need it for flexible working, you might be able to achieve that by deploying laptops, but then again desktop-as-a-service becomes the obvious choice if you want to combine this with other requirements", he explains. Infrastructure models Gabriel Gambill, Senior Systems Engineer for EMEA at Quorum, doesn't accept that VDI is always the answer anyway. Speaking from his own experience of VDI, he says "most users are leveraging client server applications in which the servers are doing most of the heavy processing and just feeding updates to the screen of the client." He adds that a properly configured server could
achieve the same goals as a VDI server, and he argues that "the terminal server can handle many more users than a VDI host because you don't have the operational overhead of running the OS for each user." So not all virtual desktops need to be deployed in the cloud. Thomas points out that they can sit on traditional IT infrastructure, but performance and licensing can become a significant challenge for organisations that elect to take this path. "Performance is key because virtual desktops can be quite complicated, and so if your user-base increases your performance could drop off quickly too", he says. Licensing can also be problematic. So it's important to have a solution that is designed with desktop services in mind. This enables the organisations that deploy virtual desktops to better optimise their performance and reduce their licensing costs. Kevin Linsell, Director of Strategy and Architecture at desktop-as-a-service (DaaS) vendor Adapt, says his company only delivers DaaS by using elements of a shared infrastructure. He says this approach saves both time and money. "In the traditional manner you have to design, build, configure and migrate the solution", he says. He has found that this can take "multiple weeks to achieve", and yet in contrast desktop-as-a-service is as quick as turning on a virtual machine. The whole migration process took just two weeks with one of Adapt's customers. It began with user acceptance testing and the migration of legacy applications. The traditional cloud arguments still apply in his opinion, but he recommends that very large organisations might still want to have their own infrastructure - particularly if they have security concerns and want to make sure that it remains strong. Yet Peter White, Chief Architect at Bell Integration, says that they will need to consider network latency issues, and they should assess whether a cloud solution offers more or less agility.
In comparison a cloud-based virtual desktop can deliver greater efficiencies, and to him that's not just about costs. To him efficiency is also about configuration standards, speed of resolution of support and self-service. He adds that federated directory and authentication services are essential to allow a single sign-on for public desktop-as-a-service offerings. With regards to latency, he says: "Latency issues can be a limiting factor for client-server applications; those applications that can't tolerate high latency or low bandwidth are required to be well connected, or in close proximity between the client application and the line of business application or service." Moving desktops to the cloud
So how can organisations move their desktops to the cloud? Linsell says the best way to achieve this goal is by working with skilled people, or by working with a partner to help them to understand their desktop strategy. It's important to think application first in his view, because at the end of the day virtual desktops - cloud-based or otherwise - are about user experience. The next step requires them to focus on the service offering to understand what it means to be a customer, in order to avoid thinking in silos of single applications. "We run through the whole process with our customers from pre-sales to try before you buy in order to let customers experience it, and then we do a proof of concept before running a pilot around performance and integration", he explains. The final stage involves rolling it out into the production environment. Adapt uses a hybrid cloud for this because it is then private, shared, multi-tenanted and it involves legacy applications. Top Cloud desktop tips The aforementioned participants in the research for this article offered their top tips for moving desktops to the cloud. Apart from a forecast that the future of virtual desktops and applications most probably lies in the realms of HTML 5, suggests White, there is a need to remember that people and process are just as important as the technology itself. Virtual desktops won't be the sole solution, and therefore it's important to consider using a blend of solutions. To achieve the best business outcome, as well as to save time and money, it's worth finding a partner - either a managed service provider or a consultant - who can assist with the design of the platform, aid the migration to it and help to manage it on a daily basis. It's also vital to analyse whether the organisation's network infrastructure is appropriate for a cloud-based model, and to ensure that there is a distinct and valid business case for deploying desktop-as-a-service (i.e.
for moving desktops to the cloud), as it needs to be aligned with the organisation's business strategy. Thomas concludes that moving desktops to the cloud may not cost less than a traditional desktop environment, and he warns that the increasing adoption of software-as-a-service, such as Office 365, could mean that the requirement to host desktops might diminish over time. To many organisations VDI, whether based in the cloud or not, is just a transitional solution. Once they have completed the migration of their legacy applications it can become redundant. So before you move your own desktop to the cloud, it's important to consider all of the options. n
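White's point about latency being a limiting factor for client-server desktop traffic is easy to check empirically. The following is a rough sketch of one way to do it - timing TCP connects to the desktop service endpoint as a proxy for round-trip time; the hostname and the 50 ms threshold are illustrative assumptions, not vendor figures.

```python
import socket
import time

def measure_rtt_ms(host, port=443, attempts=5):
    """Estimate round-trip latency by timing TCP connects
    to the desktop service endpoint (a rough proxy for RTT)."""
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=3):
            pass
        samples.append((time.perf_counter() - start) * 1000)
    return min(samples)  # best case filters out transient jitter

def suitable_for_daas(rtt_ms, threshold_ms=50.0):
    # Interactive desktop protocols degrade noticeably at high RTT;
    # 50 ms is an illustrative cut-off, not a vendor specification.
    return rtt_ms <= threshold_ms

# Example (requires network access; hostname is hypothetical):
# rtt = measure_rtt_ms("daas.example.com")
# print(f"{rtt:.1f} ms ->", "OK" if suitable_for_daas(rtt) else "investigate latency")
```

A check like this, run from each office, gives a first-pass answer to whether the network infrastructure is appropriate for a cloud-based desktop model before any pilot begins.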
Data Protection: Dan Sutherland
Why the EU may force you to recruit a data protection officer Dan Sutherland explains why the impending EU Data Regulation may mean you need to employ a data protection officer if you process more than 5000 records
Dan Sutherland Dan Sutherland is founder and CEO of Carrenza, a cloud services provider that blends IaaS and PaaS capabilities and has Cineworld, Royal Bank of Scotland and Government Digital Service as customers. Dan started the business following a career in management consultancy and the music industry.
Recently I joined a delegation of technology companies who met the key players in the European Parliament and Commission to discuss the current and likely impact of the Digital Single Market, and most specifically the General Data Protection Regulations, which are currently in trilogue (discussions between the three EU bodies - the Commission, the Council and the Parliament) - where the agreed texts from each of the three institutions are essentially horse-traded to get to a compromise text which can be implemented. The first thing to be clear about is that the proposed regulations will affect every company that holds data on its customers - and not positively. These regulations are little understood today, but they will be a board discussion in most companies in the UK in the next 12 months. This is in large part because they have been born out of post-Snowden concerns and focussed on consumer data. The Institutions see the dominance of big American tech companies
as negative, and want to protect against the misuse or overuse of personal data. For consumers the general strengthening and simplifying of data protection regulation is a good thing, but the regulations as proposed in any of the three texts are a blunt instrument, and they have largely ignored the business impact or viewpoint, especially where transactions are business to business. Key points to note include: 1. Businesses will have to employ a data protection officer if they process more than 5000 records a year, or process special classes of data. Those data protection officers have to be hired for a minimum of four years and they will enjoy special protection from dismissal. They can't be current senior staff or directors who could be seen to have a conflict of interest. 2. Joint liability for data breaches will extend to both processors and controllers of data, opening up all parties to huge liabilities in the event of a data breach, whether or not they had control over or even
knowledge of the systems where the breach occurred. 3. The regulations don't just apply to EU companies, but to anyone selling into the EU. 4. It will become illegal to automatically process personal information and then make a decision based on it which has legal effect. That means the systems of companies like Funding Circle would have to be de-automated, in effect a massive retrograde step. But it also affects consumer systems - your Nest thermostat could no longer learn in the way it does now, so in all likelihood your home energy bills will rise. The European Parliament is taking a more reasonable approach to these regulations than the European Commission, who created them, but the current position of both will act like an anchor on the productivity and competitiveness of companies across all 28 member states. We can but hope that during trilogue the key issues of joint liability, scope and hiring a DPO will be watered down. n
Analytics: Michael Kearney
How to make big data analytics work for the business Big data can help make predictions and inform what you do next, but Michael Kearney warns that we should concentrate more on finding solutions than on the data itself
Michael Kearney Michael Kearney is a product marketing specialist at Encompass and has more than 25 years in marketing and partner management roles, specialising in products used to manage, analyse and capitalise on structured and unstructured data. Throughout this time he has worked for IT vendors including IBM, Netezza and Oracle in Australia, Asia, Europe and the USA, and has consulted to financial services, electronics, energy, telecommunications, manufacturing and pharmaceutical companies. As a leading thinker in the domain of data analytics, Mike has regularly contributed to the 'IBM Big Data and Analytics Hub' as well as a range of other publications, and is looked to as an authority when it comes to defining the place for data-driven applications in business.
The initial excitement around big data is translating into real-world progress. Appropriate architectures and processing platforms are now in place, while modern data management technology - capable of storing and organising large data sets while taking advantage of the underlying massively parallel processing grids to crunch data at volume with high throughput and low latency - is accessible either from the cloud or as on-premise computing. Yet the excitement is contained, as despite some fine examples across different industries, success at converting investments in big data technology into real business value eludes many CIOs. The driver for investing in big data is the insight derived from analytics that improve existing processes or inspire entirely new ways of creating value. Analytic tools are increasingly coming on stream - but a gap exists between businesses' thirst to consume analytics and their ability to derive real business value. The core problem is a shortage of data scientists; those rare individuals who combine the business analysis skills needed to become knowledgeable in a specific business domain with the technical skills to codify that knowledge into applications that help organisations achieve their strategic goals. Businesses urgently need to fill
this gap. The next wave of enterprise computing, in the shape of data-driven software applications, offers an opportunity to do just that. This rapidly emerging new class of applications embeds deep domain knowledge of a business and its challenges in software designed to solve problems. Brian Ascher, a partner at venture capital firm Venrock, neatly encapsulates the benefits in a recent piece for Forbes' CIO Central website: "These solutions use algorithmic data mining on your own data and often on external third party data accessible by cloud ecosystems and APIs. Data Driven Solutions make predictions about business functions, prescribe what to do next, and in many cases take action autonomously. Trained analysts are not required to query databases. Instead, business users get answers directly from the software." Extracting value So technology fills the skills gap for businesses. These new data-driven applications ingest data from a wide range of internal and external sources, extract the entities of interest (people, groups, places, things), establish relationships between these entities while applying knowledge of what is and what is not important to the business community being served,
and then analyse the emergent situation to present answers, and not merely reports or dashboards. Importantly, to accelerate the rate at which business users realise value, these applications include virtuous feedback loops, increasingly implemented in machine learning algorithms, that continuously observe their users’ interactions to learn and capture valuable behaviours which can then be published to the entire community accessing this software as a service to improve outcomes for all. Deep domain knowledge embedded in software combined with this continuous learning capability helps businesses successfully convert data into knowledge. The business can now consume that knowledge in one of two ways. It may be channelled directly into a business process with no human intervention, for example as confirmation that all checks are completed successfully and a customer can be on-boarded. Alternatively, the knowledge can be presented to a human audience to make a decision, for example a deep analysis of an application for a business loan unearths a history of defaults that demand further
consultation with the applicants. In cases where knowledge is to be communicated to people, data-driven applications employ interactive data visualisation, technology already familiar to everyone who uses a touch screen and navigation software on a smartphone. Such visualisations help businesses quickly and efficiently gain an understanding of complex situations informed by multiple data sources, to drive fast time to insight. Above the line technologies In-built domain expertise, ongoing learning capability and interactive data visualisation distinguish what Jake Flomenberg of venture and growth equity firm Accel Partners describes in a recent O'Reilly Media article as "above-the-line" technologies, such as data-driven applications, from their "below-the-line" counterparts, such as data platforms, data infrastructure and data security services, which serve as essential building blocks but lack specific business domain intelligence. So instead of employing data scientists in every business and government department, software companies will employ these scientists within multi-disciplinary
teams to develop applications specifically for vertical industries. Embedding domain knowledge in software brings scale, meaning it can be used to the advantage of hundreds or thousands of businesses. With this in mind, it is not surprising that it is the above-the-line segment that industry observers expect to really take off. As Flomenberg puts it: “We’re in the early innings for the above-the-line zone and expect to see increasingly rapid growth there.” It might be counter-intuitive, but data is not really the point of data-driven software. Instead this rapidly emerging technology is all about finding solutions. As Ascher points out, “most business customers don’t really care about data. They care about solving business problems.” That’s what data-driven software is so effective at doing today – and that’s why, at Encompass, we echo Ascher’s confident assertion: “Software-as-a-Service and cloud computing has been transformational for the software industry, but compared to what is coming next, you ain’t seen nothing yet.” n
Cloud ISO 27018: Richard Kemp
Data Protection, Contracts and the Cloud: ISO 27018 Richard Kemp looks at the first standard for protecting personal data in the cloud, ISO 27018, and reveals why it’s proving to be a genuine help for cloud service users in managing their data protection legal obligations
Richard Kemp
Richard Kemp is the founder of Kemp IT Law. With over thirty years’ experience at the leading edge of technology law practice, Richard is widely recognised as one of the world’s top IT lawyers. He has built an outstanding reputation for advice that combines commerciality and client service with innovative legal solutions to the business challenges of technology development, deployment and regulation.
ISO 27018 – the first international standard focusing on the protection of personal data in the public cloud – continues to move centre stage as the battle for the Cloud moves up a gear. At the highest level, this is a competitive field for the biggest companies – think billion dollar investments and million square foot data centres with a hundred thousand servers using enough energy to power a city. According to research firm Synergy, the Cloud infrastructure services market – Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Private and Hybrid Cloud – was worth $16bn in 2014, up 50% on 2013, and is predicted to grow 30% to over $21bn in 2015. Synergy estimated that the four largest players accounted for 50% of this market, with Amazon at 28%, Microsoft at 11%, IBM at 7% and Google at 5%. Of these, Microsoft’s 2014 revenues almost
doubled over 2013 and Amazon recently reported its AWS (Amazon Web Services) revenues at $1.6bn in the first quarter of 2015, up 50% on 2014. Global SaaS (Software as a Service) revenues were estimated by Forrester Research at $72bn in 2014 and are predicted to grow by 20% to $87bn in 2015. Equally significantly, the proportion of computing sourced from the Cloud compared to on-premise is set to rise steeply: enterprise applications in the Cloud accounted for one fifth of the total in 2014 and this is predicted to increase to one third by 2018. This growth represents a huge increase year on year in the amount of personal data (PII or personally identifiable information) going into the Cloud and the number of Cloud customers contracting for the various and growing types of Cloud services on offer. But as the Cloud continues to grow at these startling rates, the
biggest inhibitor to Cloud services growth – trust about security of personal data in the Cloud – continues to hog the headlines. Under data protection law, the Cloud Service Customer (CSC) retains responsibility for ensuring that its PII processing complies with the applicable rules. In the language of the EU Data Protection Directive, the CSC is the data controller. In the language of ISO 27018, the CSC is either a PII principal (processing her own data) or a PII controller (processing other PII principals’ data). Where a CSC contracts with a Cloud Service Provider (CSP), Article 17 of the EU Data Protection Directive sets out how the relationship is to be governed. The CSC must have a written agreement with the CSP; must select a CSP providing ‘sufficient guarantees’ over the technical security measures and organizational measures governing PII in the Cloud service concerned; must ensure compliance with those measures; and must ensure
that the CSP acts only on the CSC’s instructions. As the pace of migration to the Cloud quickens, the world of data protection law continues both to be fragmented – 100 countries have their own laws – and to move at a pace driven by the need to mediate all competing interests rather than the pace of market developments. In this world of burgeoning Cloud take-up, ISO 27018 is building a bridge over troubled data protection waters between Cloud market developments and effective Cloud contracts by providing CSCs with a workable degree of assurance in meeting their data protection law responsibilities. Almost a year on from publication of the standard, Microsoft has become the first major CSP (in February 2015) to achieve ISO 27018 certification for its Microsoft Azure (IaaS/PaaS), Office 365 (PaaS/SaaS) and Dynamics CRM Online (SaaS) services (verified by BSI, the British Standards Institution) and its Microsoft Intune SaaS services
(verified by Bureau Veritas). Microsoft’s certifications show that ISO 27018 is not tied to any particular kind of Cloud service and applies to IaaS (Azure), PaaS (Azure and Office 365) and SaaS (Office 365 and Intune). If you consider computing services as a stack of layered elements ranging from networking (at the bottom of the stack) up through equipment and software to data (at the top), and that each of these elements can be carried out on premise or from the Cloud (from left to right), then ISO 27018 is flexible enough to cater for all situations across the continuum.
[Figure: Software as a Licence to Software as a Service – the Cloud Continuum]
Indeed, the standard specifically states at Paragraph 5.1.1: “Contractual agreements should clearly allocate responsibilities between the public cloud PII processor [i.e. the CSP], its sub-contractors and the cloud service customer, taking into account the type of cloud service in question (e.g. a service of an IaaS,
PaaS or SaaS category of the cloud computing reference architecture). For example, the allocation of responsibility for application layer controls may differ depending on whether the public cloud PII processor is providing a SaaS service or rather is providing a PaaS or IaaS service upon which the cloud service customer can build or layer its own applications.” Certification also shows that the provider’s service is fit for processing personal data even if the provider doesn’t know if the customer’s data contains personal data. Perhaps the biggest practical boon to the CSC however is the contractual certainty
that ISO 27018 certification provides. So when the procurement group at the Cloud customer is running a tender for a large Cloud contract, ISO 27018 certification enables them to grade the bidders; and then to include the requirement to comply with the standard as a contractual obligation when the agreement is let. In the Cloud contract lifecycle, the flexibility provided by ISO 27018 certification, along with the contract and the CSP’s policy statements, goes beyond this to provide the CSC with a framework to discuss with the CSP on an ongoing basis the Cloud PII measures taken and their adequacy.
In its first year, it is emerging that complying, and being seen to comply, with ISO 27018 is providing genuine assurance for CSCs in managing their data protection legal obligations. This reassurance operates across the continuum of Cloud services and through the procurement and contract lifecycle, regardless of whether or not any particular data is PII. All this becomes more important when the market for Cloud infrastructure and platform services is growing by 30% a year; and when Cloud enterprise applications are set to rise from a fifth of the total to a third by 2018. n
Case Study: Infinity
Infinity creates the data centre for the customer of the future IT solutions business Secura Hosting needed to review its infrastructure and find a flexible, scalable, efficient UK-based data centre for itself and its clients, and found it in Infinity Hertfordshire-based Secura Hosting, founded in 2001, provides highly scalable cloud infrastructure solutions to 100 clients from across the UK, including leading B2B software-as-a-service (SaaS) businesses Causeway and Selima, international Microsoft Dynamics developer AlfaPeople, and Park Plaza Hotels.
Early in 2014, Secura found that as cloud services became more critical to businesses, customers were asking to see where their data was being stored, meaning cloud providers needed to factor this requirement in when finding a data centre partner. As a business that builds its client relationships based on integrity and transparency,
Secura needed to find a partner that would offer an equal level of integrity and transparency to its clients. Additionally, Secura identified an opportunity to grow the business by building a new Virtual Private Cloud infrastructure on latest-generation technology, with the security and scalability to meet growing client demand. Secura’s old outsourced technology consisted of racks located across various data centre facilities, and while this system suited the company’s needs at the time, it was becoming a barrier to the business’s focus on cloud services. Secura needed to review its infrastructure and find a flexible, scalable, efficient data centre that met not only its own expectations but also those of its clients. Ollie Beaton, CEO of Secura, explains, “When growing our company we have to consider the business elements, such as reducing costs, increasing the number of new business wins and catering to client demand. But we also need to consider that new customers with business critical applications in the cloud will want to visit our data centre facilities to see where their data is physically
being stored. “It’s important that we not only consider our data centres from a technical point of view, but also aesthetically. As cloud technology is playing an increasingly central role in enterprise IT, we need to expect more visits in the future. A customer visit to a data centre can now really make or break a business deal.” Finding the right provider Extensive research was undertaken to identify a partner that could not only offer exceptional, highly secure infrastructure and the ability to upscale quickly, but one that could also impress clients with high-quality, remarkable data centres. Secura selected award-winning data centre provider Infinity, with a focus on its data centre facilities in Slough, offering growth and flexibility and maximising the value Secura provides its clients. Ollie says, “We evaluated several providers and found Infinity exceeded our needs. The facilities at Infinity Slough are technically exceptional, providing the ability to flex with our requirements and scale the service when we need it. It not only offers fantastic service, but also has an appearance that truly impresses our customers.”
With Secura in a rapid growth phase, Infinity’s Infinite Data Centre product provides the flexibility to grow and shrink the data centre footprint in tune with the changes in the IT estate. Infinity Slough Infinity’s Slough data centre supports a broad range of flexible products and provides a range of pre-built facilities ready for immediate occupation. It offers individual racks, private suites, scalable halls and dedicated data centres supported by innovative cooling techniques. Its halls provide a standard power density of 8.5–10kW per rack and the modular, dynamic chilled water cooling system delivers a market-leading PUE of 1.22. It is highly resilient with no single points of failure. Stuart Sutton, CEO of Infinity says, “We see the data centre as a fully integrated part of the overall IT solution and our Infinite Data Centre proposition provides evidence of our commitment to this. Our flexible data centre solutions enable our partners to truly add value to their customers.” Infinity Slough provides 9,200 m² of technical space and is supported by 34MVA delivered via a dedicated onsite primary sub-station. Its state-of-the-art design draws on all our previous experience. Everything about this data centre is truly world-class.
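To put a PUE (Power Usage Effectiveness) figure such as 1.22 in context, PUE is simply total facility power divided by the power delivered to IT equipment, so lower is better and 1.0 is the theoretical ideal. The load figures in this sketch are hypothetical, chosen only to illustrate the arithmetic, and are not Infinity's actual numbers:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Hypothetical load: 1,000 kW of IT equipment drawing 1,220 kW at the meter,
# i.e. 220 kW goes on cooling, power distribution losses and so on.
print(round(pue(1220.0, 1000.0), 2))  # → 1.22
```

At a PUE of 1.22, only 22W of overhead is spent for every 100W reaching the servers, which is why the figure is quoted as a headline efficiency metric.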
Taking the data centre beyond data The facility plays a pivotal role in business growth for Secura, which predicts growth of at least one rack every six months, and the organisation is utilising the visual effect Infinity Slough has on its customers. Ollie explains, “For our customers the quality of Infinity Slough is a big attraction. Although scalability is a key selling point in our negotiations, when our customers visit the site they all talk about how impressive the data centre is. Infinity works hand in hand with our team to ensure our customers are looked after on the tour and see every aspect of the data centre. “Our customers comment on the scale of the site, the high level of security and how modern the facility is. Infinity Slough has helped us secure deals and we now include site visits as an integral part of our new business process. Customers can see the investment in the facility and it is a reflection of Secura as a business. “Our growth is obviously dependent on how many new customers we win and Infinity Slough has certainly helped us make this easier. As customers become more aware of the cloud and ask to see where their data is stored, it is great to be able to show them a data centre facility that is designed to meet the future, rather than the past.” n
CFO: Graham Jarvis
What to do if the CFO says no Graham Jarvis looks at how to get the finance officer on your side and to make the move to the cloud a tick-box solution.
Graham Jarvis Graham Jarvis is an experienced technology and business journalist and is a staff member on CCi
The traditional view of chief financial officers (CFOs) is that they are only concerned about cutting costs, but this isn’t the reality. Yes they want to control costs, but they will permit investments that will lead to business growth and higher rates of profitability. Today CFOs are still concerned with the business cases for deploying, for example, cloud computing applications and platforms. Yet it’s not the financial arguments that they are solely focused on hearing about. They want to know whether the systems offer strict control over data protection and data security compliance. That’s because data breaches are potentially costly and damaging – from both a financial and reputational perspective. “CFOs want to protect customers and data because they are moving towards compliance, and as a director you are responsible for your data as there are financial penalties attached to non-compliance and a failure to
comply would also have a negative impact on a brand’s reputation”, says Simon Withers, Vice-President of Infrastructure-as-a-Service (IaaS) at Sungard AS. Cloud democratisation Even so, CFOs do still have some control over spending, but Gordon Davey, Cloud Strategy Leader at Dell, says that cloud computing democratises how IT expenditure occurs: “Democratisation can happen in many ways when it comes to the cloud”, he says. For example, shadow IT moves IT decisions away from chief information officers (CIOs) and to the rest of the business. It requires little or no traditional involvement with anyone in IT, and the CIO might not even be aware that it’s happening because individual departments and business users can now control IT expenditure without consent from a centralised IT department. This makes governance and cost
control a significant challenge for them to manage. “Cloud can be daunting to CFOs because they can lose control over their organisation’s cloud purchases, and yet it can also bring huge benefits to CFOs: the focus on an operational expenditure (OPEX) model to control how and when their organisations spend on IT”, he explains. Davey thinks that it’s inevitable that CIOs and CFOs are going to have to sit down to talk about costs and how they are controlled. Tony Lock, a distinguished analyst at Freeform Dynamics, adds that in some companies CIOs will control all of their budgets, but others won’t. Decisions to make There is nevertheless a decision that CFOs should make: do they want to use cloud technologies to cut costs or to make money? The danger is that they may choose to solely focus on costs, which can stifle innovation. So there has to be a balanced approach in reality
– gaining cost-efficiencies while also deploying cloud technologies to help their organisations increase their revenues. Lock adds that CIOs have to show that they are doing the best for their organisations by delivering IT services, and it would be good if CIOs became better at explaining why cloud would make a difference to the business in a language that CFOs understand: e.g. in pounds, pence, dollars, cents and euros. Yet CIOs don’t tend to come from a financial business case background. “CIOs have focused on the technology side of their role – to have accounting qualifications would help them, but ultimately it’s about being able to separate out the IT requirements at a business rather than at a technology level”, says Matthew Larby, Product Strategy Director at datacentre company Virtus. He argues that there needs to be a blending of the CFO’s need to add up, control and analyse the numbers and the CIO’s need to focus on driving innovation and revenues with technology. Increasing revenues “Revenue generation is often linked to the agility of the cloud, whether that is about quicker go-to-market or scale, and Dell’s Global Technology Adoption Index has shown that 72% of organisations actively using the cloud have experienced 6 percent growth or more over the last 3 years – only 4 percent experienced either zero or negative growth”, says Davey. He elaborates that only 24% of the companies that didn’t use cloud at all achieved growth rates
of over 6 percent, and 37% of them also experienced zero or negative growth. The survey also revealed that 42% of the respondents whose organisations use cloud have achieved cost-savings. So it’s this kind of information that CIOs need to use when talking with CFOs to gain support for the cloud. Simon Withers adds that Sungard has many travel, finance and healthcare customers who’ve seen a 50% increase in their revenues from its cloud platform over a two-week period. His customers needed to attain a higher level of computing performance in order to deliver more revenue, and he says that on the corporate side it’s about looking at applications and services that respond to the peaks and troughs of demand. He believes that IT needs to be a profit rather than a cost centre, and CIOs can achieve this by understanding the desired business outcomes and the financials of the organisation. For this reason he has become more focused on the economics of his business. Demonstrating ROI “So when I build up the business plan I have to show my CFO that it will provide a certain amount of ROI, and then the economics will play back into the service to ensure the growth of the company”, he explains. He therefore recommends that CIOs develop a three-to-five-year plan and show CFOs how it relates to the organisation’s required business outcomes – not just in terms of costs, but also in terms of security and compliance. To achieve the
organisation’s goals and objectives CIOs, chief marketing officers (CMOs) and CFOs need to work more closely with each other than they did in the past. In fact, arguably, CIOs should collaborate with the whole board and not just with CFOs. “The CFO is one amongst many within the group and so CIOs need to convince everyone that cloud can help the business to grow, and that’s a people thing because it’s not about technology”, says Lock. He thinks that some CIOs have good people skills while others don’t. He says they are important because at the end of the day selling anyone the benefits of investing in cloud technologies is about good communication and understanding what the different stakeholders within an organisation need. On the flip side the business also needs to talk more clearly to IT about its needs. “The business might wish for the moon, but it’s often not willing to pay for it”, he reveals. The organisation also needs to look at the consequences of making certain decisions in his view, such as moving data from one cloud provider to another and where it should be stored. With compliance in mind, data sovereignty questions become particularly important if data is stored in a public cloud – and particularly because a number of people are still concerned about how secure the public cloud is. To protect sensitive data CIOs might therefore wish to invest in a private or hybrid cloud solution. Much depends on what the business wants to achieve from
its cloud investments. Cloud benefits Davey proposes that the following benefits of the cloud should be discussed with CFOs: the agility of the cloud, which enables companies to scale up and down to meet seasonal demand trends; the ability to offer new applications and services; improved employee productivity; and the way investments in cloud computing technologies can enable a business to achieve its business goals and objectives in new ways – ones that might even give that organisation a competitive advantage over its rivals. For example, Dell’s survey showed that organisations that use three or more different types of cloud solution experienced a 15% increase in employee productivity. “So the key is not just about cloud; it’s about using the right combination of platforms for the right scenario, and so CIOs should avoid turning their case for investing in cloud
solutions into simply a technology conversation because understanding the commercial and business benefits is critical for it to become an end-to-end success story”, explains Davey. Getting the CFO on board The people interviewed for this article concluded with their top five tips for getting a CFO on board to ensure that CIOs are able to invest in cloud technologies. Here are some of them: • Understand the business goals and objectives; • Communicate the business needs and the benefits of cloud technologies effectively with all stakeholders – not just with CFOs; • Never assume that everyone has the same understanding, or the same expectations about cloud computing; • Embrace the ability to have a charge-back and develop a smart sourcing strategy; • Ensure that the CFO shifts from a
cost focus to one that talks about business agility and work with technology partners to perform the maintenance elements. As for which cloud solutions CFOs should allow CIOs to invest in today, the jury is out because it very much depends on the needs of each business. However, with the widening array of clouds and in the face of increasing complexity, firms such as Sungard and VCE believe that CFOs and CIOs should consider investing in solutions that make cloud management and deployment easier. Withers, for example, suggests that they should look at gaining cloud orchestration and a level set of cloud management capabilities. Those using a public cloud might have a subset of tools, and so they would need to invest in software integration and have an API strategy to tie them together. Before they can achieve this, CIOs need to ‘tie together’ their business case with their organisation’s CFO. n
• pPUEs of less than 1.03
• Free cooling to increase energy efficiency
• Data centre air fully separated
• Energy efficiency: Evaporative Cooling ESEERs over 45
• 3 modes of operation
• Munters heat exchanger technology
• Any fresh water type
• Configurable design
• Efficient internal fan wall
Series: 100 / 200 / 300 / 400 New!
Oasis™ Indirect Evaporative Coolers Munters Oasis™ Indirect Evaporative Coolers (IEC) save you money, provide optimal operation and serve as an energy-efficient solution for cooling your data centre. We are also proud to offer 1st class dedicated global sales and service teams plus fully operational test facilities for Factory Acceptance Testing and simulation. The facilities follow the ASHRAE Standard 143-2015 guideline method for indirect evaporative cooling testing: www.munters.com/dctestfacility. Call to find out how well the Oasis™ IEC can meet your data centre cooling and efficiency needs.
Munters Europe & Middle East +44 1480 410223 • Americas +1 978 241 1100 • Asia +86 10 8041 8057 E-mail: firstname.lastname@example.org Web: munters.com/datacentres
ISSUE 22
Cloud Security: Lawrence Jones
Eight steps to complete cloud security Eight steps, from data sovereignty to logging and maintenance, to keeping your business one step ahead and delivering positive results and growth with cloud
Lawrence Jones Lawrence Jones is CEO of UKFast and was awarded an MBE in the 2015 New Year’s Honours List for Services to the Digital Economy. UKFast recently appeared in the Sunday Times Profit Track 100 ranking with profits of £11m in 2014 on a turnover of £28.9m, and its UKFast Campus in Manchester is home to over 200 employees and hosts a graduate programme and apprenticeship scheme. His first business venture was The Music Design Company, a business dedicated to providing the North West with entertainment and event organising. At its height the company had over 240 musicians on its books.
Cloud has opened up new possibilities and advantages for organisations. We know that cloud is delivering upon its promise and that many businesses are looking for further ways to leverage the flexibility, scalability and agility of cloud for business advantage. But as the potential for progress grows, so does the potential for risk and failure. How can we continue to stay one step ahead and deliver further results with cloud? There is never a complete removal of risk, but following certain steps and empowering colleagues to make safe decisions can considerably improve your chances of avoiding a difficult situation. Step 1 - A few basics It’s important to approach the cloud environment with similar procedures to physical infrastructure. Anything on an operating system level in the cloud will be reflected in the physical server. They’ll have the same vulnerabilities and threats and people will always be looking to exploit those vulnerabilities.
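One practical way to act on this, sketched minimally below, is systematic patch maintenance: keeping an inventory of installed package versions and comparing it against the latest vendor advisories. The package names and version numbers here are invented for illustration, not a real advisory feed:

```python
def parse_version(v: str) -> tuple:
    """Turn a dotted version like '2.4.41' into (2, 4, 41) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def outdated(installed: dict, latest: dict) -> list:
    """Return the packages whose installed version lags behind the latest advisory."""
    return sorted(
        name for name, version in installed.items()
        if name in latest and parse_version(version) < parse_version(latest[name])
    )

# Hypothetical server inventory versus hypothetical vendor advisories.
installed = {"openssl": "1.0.1", "nginx": "1.9.3", "bash": "4.3.30"}
latest = {"openssl": "1.0.2", "nginx": "1.9.3"}
print(outdated(installed, latest))  # → ['openssl']
```

Running a comparison like this on a schedule, rather than patching ad hoc, is the difference between a one-off hardening exercise and the ongoing maintenance procedure the step describes.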
Patching operating systems is often a first step for security professionals but many do not follow through with updating patches or systematic patch maintenance procedures. Make sure your patching protocol includes regularly reviewing new patches for your operating systems. In addition to patch maintenance, anti-virus and anti-malware protection are required on each endpoint, and pre-encryption is strongly recommended in order to provide a last line of defence. In the case of a breach, data that has been rendered unreadable to attackers averts a catastrophic situation. Step 2 - Combined solutions A measure which is often employed to reduce critical risk is to use the public cloud for less critical functions; this helps to reduce costs, while a small dedicated physical infrastructure is reserved for functions which necessitate a higher level of protection. Using a combination of cloud and physical infrastructure can be a real win; organisations can identify the strongest and most appropriate elements of each and apply them within their own solution. With cloud there is an extra controlling layer, known as a ‘virtualised management layer’, so any service provider offering cloud should be offering a ‘hardened management layer’; preferably one which
has security controls around it and is separately validated by a third party accreditation. Step 3 - Public cloud When utilising public cloud and some hybrid cloud solutions it is important to think about the level of vulnerability testing you would like to conduct. If there are shared elements which are multi-tenanted, there can be issues with penetration testing and vulnerability testing. Penetration testing shared elements is often out of the question because it is unlikely that your co-tenants will be happy for you to start bombarding firewalls - which they are using - with traffic or test attacks. This is something to consider with anything that is shared; it could potentially result in more superficial testing than would be the case with a dedicated hosting solution. Something which CIOs need to consider when weighing up the strengths and weaknesses of their solution is that public cloud can sometimes be more vulnerable than dedicated hosting solutions. In the public cloud the security procedures of the weakest member of a shared solution can leave the whole of that cloud vulnerable to attack. Terminology can sometimes vary between providers so it is also important to
define what is being offered. With private cloud it is prudent to ask how they define this term, and which dedicated elements are available; with hybrid you should ask which elements are shared and which have dedicated controls. Standards like PCI dictate dedicated functions on separate servers with firewalls for particular high risk or mission critical services. It is always important to ask the right questions of your provider and a good provider should be happy to engage in a dialogue and explain which of their services will be appropriate in order for you to be compliant. Step 4 - Stay informed Be keen and stay up to date with which algorithms have been cracked. Security is never stable; it’s an ongoing battle between attackers and security personnel. Providers should have strong communications and information available in real time. They are often able to issue press releases and blogs to keep you updated – a good way to help maintain security. Dialogue and engagement between you and the cloud service provider is a good step towards maintaining best practice and staying ahead of threats.
Step 5 - Data sovereignty – where’s my data? Data sovereignty is another important question. You need to be assured that your data is being hosted in a physical location where data protection laws are strong. You need to ask whether the data is going to be moved between nation states. Often people don’t know where those cloud servers are – which doesn’t particularly affect us at UKFast – but for someone using, for example, software as a service, it’s very difficult to see where that particular data is held. If you are using software from an American company, say Office 365, you have to ask where that data resides and what the legal jurisdiction is on that data. We see now that the EU is trying to exert more control over the big US based operators. It is important to be aware of the developing legislative landscape when looking into potential cloud solutions. Cloud is in its infancy, and the incoming EU data protection reforms will reshape the landscape again in a year or two. Even the USA is having discussions about more stringent data protection. All of this plays into the importance of the question: ‘Where’s my cloud?’ This is an evolving issue, and one which CIOs and information staff will need to monitor. The main thing which distinguishes cloud infrastructure concerns from physical concerns is being diligent over where your data is held. Step 6 - Logging and monitoring Monitor your environment for developing threats – this is a must. To secure an environment you need to be aware of what’s going on in it. Logging and monitoring is important for secured cloud solutions; you should have a picture of what your environment looks like. Firewalls should be fitted with intrusion detection as a minimum standard. Logging and monitoring services which plug into your solution are offered by some providers and are a great tool
in helping you to be proactive rather than reactive. Assess your prospective provider: what kind of tools can they offer? Does the provider offer data loss prevention, which can scan areas of your solution to identify whether critical data is getting where it should not be? If there is an area of your solution that is less well protected, you will need to ensure that critical or highly confidential data is not stored in, or transited through, the weak spots. There is always the risk that data can leak into areas without your knowledge; that is why monitoring is so important. Understanding your data and how it moves will help you to spot when something is not as it should be. And, of course, data at rest should be encrypted. Increased visibility is a capability which good providers will support you with, helping you to spot potential threats before they become a catastrophic risk.

Step 7 - Physical measures

Physical security is also an obvious concern, and you need to be asking the right questions: is the site permanently manned? Has it got CCTV? Perimeter fencing? Access controls? Admission policies? There might be a need for a compartmentalised datacentre, where your solution is housed in a separate area with an extra layer of access required. It is also prudent to check whether your cloud provider vets its staff properly. Some of these controls are only necessary for ultra-high-end security requirements, but they are available if needed.

Step 8 - Balance the risk and enjoy the benefits

Weighing up a new solution is ultimately a matter of risk assessment. You have to balance the flexibility, scalability and all of the other fantastic benefits afforded by the cloud against the layers of management that the cloud can sometimes require. Can you manage the risk and balance it against the reward?
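To make the logging-and-monitoring advice in Step 6 concrete, here is a minimal sketch of the kind of check such tooling performs: scanning an authentication log for repeated failed logins from the same source. The log format, threshold and function names are hypothetical examples of ours, not any provider's product:

```python
import re
from collections import Counter

# Hypothetical example: flag source IPs that appear in repeated
# failed-login lines of an SSH-style auth log. Real monitoring
# tooling would tune the pattern and threshold to the environment.
FAILED_LOGIN = re.compile(r"Failed password .* from (\d+\.\d+\.\d+\.\d+)")

def suspicious_ips(log_lines, threshold=3):
    """Return IPs seen in at least `threshold` failed-login lines."""
    counts = Counter()
    for line in log_lines:
        match = FAILED_LOGIN.search(line)
        if match:
            counts[match.group(1)] += 1
    return sorted(ip for ip, n in counts.items() if n >= threshold)

sample = [
    "sshd[101]: Failed password for root from 203.0.113.9 port 4242",
    "sshd[102]: Accepted password for alice from 198.51.100.7 port 5151",
    "sshd[103]: Failed password for root from 203.0.113.9 port 4243",
    "sshd[104]: Failed password for admin from 203.0.113.9 port 4244",
]
print(suspicious_ips(sample))  # ['203.0.113.9']
```

The value of even a simple watcher like this is exactly the proactivity the article describes: the anomaly is surfaced while it is still a pattern in the logs, before it becomes an incident.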