

data centre news

January 2018



Centiel expands with the acquisition of MPower UPS

inside... Special Feature GDPR

Meet Me Room

Dave Ricketts of Six Degrees discusses what makes a successful data centre and why timing is everything.

Centre of attention

Greg McCulloch of Aegis Data on what to consider when it comes to data centre location.


in this issue… January 2018

New year, new regulations.

Cloud workloads could be at risk from security, management and compliance failings.

32 Patch Management Mathivanan V of ManageEngine tears up the traditional approach to patch management with automation.

12 Centre of Attention Greg McCulloch of Aegis Data on what to consider when it comes to data centre location.

14 Meet Me Room Dave Ricketts of Six Degrees discusses changes in the industry and what makes a successful data centre.

26 Cyber Security Greg Sim of Glasswall Solutions predicts what’s next for cyber security in 2018.

28 IoT and Big Data Darren Watkins of Virtus Data Centres addresses the bottlenecks that come with Big Data and the IoT.


04 Welcome

06 Industry News


34 Edge Data Centres Mattias Fridström of Telia Carrier looks at how cheaper fibre is driving data centre evolution.

36 Projects and Agreements Etix Everywhere to deliver Ghana’s first Tier IV data centre.

42 Company Showcase Could you be overestimating your cable certifier usage? Ideal Networks is here to help.

44 Final Thought Rob Perry of ASG Technologies gives us four top tips to help solve traditional ECM problems.



SPECIAL FEATURE: GDPR

16 Mervyn Kelly of Ciena gives us a guide to network security when it comes to GDPR.

18 Darren Mawhinney of CloudMigrator365 explains how the cloud could help your business get GDPR compliant.

22 Your GDPR FAQs answered. Alex Bateman of Virtual College gives us the lowdown, alongside experts from leading IT companies.

24 What you need to do to ensure compliance. Ian Kilpatrick of Nuvias Group gives us a GDPR checklist.

January 2018 | 3

data centre news

Editor Claire Fletcher

Image courtesy of Descrier

Group Advertisement Manager Kelly Byne – 01634 673163

Studio Manager Ben Bristow – 01634 673163

Designer Jon Appleton

Business Support Administrator Carol Gylby – 01634 673163

Managing Director David Kitchener – 01634 673163

Accounts 01634 673163

Suite 14, 6-8 Revenge Road, Lordswood, Kent ME5 8UD T: +44 (0)1634 673163 F: +44 (0)1634 673173

The editor and publishers do not necessarily agree with the views expressed by contributors nor do they accept responsibility for any errors in the transmission of the subject matter in this publication. In all matters the editor’s decision is final. Editorial contributions to DCN are welcomed, and the editor reserves the right to alter or abridge text prior to publication. © Copyright 2017. All rights reserved.



January: For many of us that means failed resolutions, excess Christmas weight and a mild case of the January blues. It is the time when we realise this ‘new year, new me’ stuff just isn’t worth the effort, and we conclude we will remain exactly the same and not change a thing. It’s a fact of life: we human beings don’t like change. But at the moment, particularly in this industry, change is the only constant we have. It also happens to be a constant that we need. The amount of data we produce is increasing at an exponential rate; hackers are constantly getting better and faster, and attacks are becoming relentless. In order to stay one step ahead, we need to be continually adapting to change. Perhaps a change that isn’t so welcome is the ominous arrival of the General Data Protection Regulation (GDPR), set to come into force in just four months’ time – on May 25, to be precise. The industry appears to have gone into panic mode, and no one

Claire Fletcher, editor

seems to be ready, or to actually know what they need to do. You might not think it affects you, but if your company handles data relating to any EU citizens, I hate to be the bearer of bad news, but it applies to you. This data could be anything from usernames and location data to something as simple as a password. And you should be aware that with GDPR comes ludicrous fines for a data breach – up to 4% of annual global turnover or €20 million, whichever is greater. So sticking your head in the sand probably isn’t the best solution, although according to many an industry survey that is exactly what the majority of us are doing, so at least we’re in good company. That said, in this issue our special feature centres around GDPR: what it is, why it’s happening, how you can make your life easier and, most importantly, what you need to do to ensure compliance. Should you have any questions or opinions on the topics discussed, please write to: Claire.fletcher@
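The fine ceiling mentioned above follows a simple rule: take 4% of annual global turnover or €20 million, whichever is greater. As a minimal sketch (the function name and figures used here are illustrative, not from any official calculator):

```python
def max_gdpr_fine(annual_global_turnover_eur: float) -> float:
    """Upper tier of GDPR administrative fines: up to 4% of annual
    global turnover or EUR 20 million, whichever is greater."""
    return max(0.04 * annual_global_turnover_eur, 20_000_000)

# 4% of EUR 300m is EUR 12m, below the floor, so EUR 20m applies.
print(max_gdpr_fine(300_000_000))    # 20000000.0
# 4% of EUR 1bn is EUR 40m, above the floor.
print(max_gdpr_fine(1_000_000_000))  # 40000000.0
```

In other words, the €20 million figure acts as a floor on the ceiling: smaller firms cannot assume their exposure is capped at 4% of turnover.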


industry news

Security, reliability and quality of service most critical factors when evaluating networking solutions providers

Reliability, security and quality of service all rank above cost when evaluating networking solutions providers, according to a new report released by CenturyLink. Conducted by Spiceworks, a professional network for IT, the CenturyLink-commissioned report surveyed US-based IT professionals who have influence over their organisations’ Ethernet, MPLS and SD-WAN purchase decisions. The survey gathered data on the challenges, benefits and drivers of utilising these networking technologies. It found that nearly half of respondents preferred purchasing networking solutions as part of a bundled offering, as opposed to a single solution. Nearly three out of 10 respondents believe their organisations will require significantly more bandwidth in the next three years, and more than eight out of 10 said their organisation will require more bandwidth. In addition, more than 70% of organisations using Ethernet and MPLS, and more than 60% using SD-WAN, are considering or evaluating vendors to assist with technology solutions. “As IT professionals work to meet the increasingly complex technology demands of their organisations, they are looking to deploy versatile networks that can scale with company growth,” said Eric Barrett, director of marketing, CenturyLink. “Regardless of the type of networking technology being utilised, reliability, security and quality are the driving factors when selecting a solutions provider.” CenturyLink,

Alert Logic finds advanced threats and insider security threats top cybersecurity concerns

Alert Logic has announced the results of a survey of 400 UK cybersecurity professionals, conducted to better understand the evolving cyber threat landscape UK companies face. The survey found that respondents’ confidence in their organisations’ overall cybersecurity posture is moderate to high, with only a fifth indicating they are not at all, or only slightly, confident in their organisation’s security posture. When asked about the top challenges facing their cybersecurity teams, respondents cited detection of advanced threats (62%) and detection and/or mitigation of insider threats (48%) as the two top security challenges. Furthermore, 41% lacked advanced security staff to oversee cyber threat management, and over a quarter (27%) lacked confidence in their automation tools catching all cyber threats. Lack of budget (51%), lack of skilled personnel (49%) and lack of security awareness amongst employees (49%) weighed in as the most significant obstacles facing cybersecurity teams, inhibiting their organisations from adequately defending against cyber threats. In addition, when asked about the business impact of security incidents, system downtime was highlighted as having the biggest impact. Interestingly, revenue impact was cited as only a relatively minor factor (16%), suggesting that security teams have either evolved their maturity to effectively manage risk, or lack full visibility into the downstream business impact of security incidents. Alert Logic,


Poor security practices put cloud-driven business growth and cost savings at risk

Swift adoption of cloud-based services and a lack of well-defined security strategies are leaving organisations struggling to keep control of their data across a sprawling number of services and applications, according to new research from Kaspersky Lab. For many organisations, the speed of cloud adoption and the lure of cost and operational savings have come at the expense of security, with many using cloud services with no strategy for the security of their information. Uncertainty around who is responsible for the security of data in the cloud can often be the basis for this approach. The research found that 70% of businesses using SaaS and cloud service providers have no clear plan in place to deal with security incidents that could affect their partners. A quarter admit to not even checking the compliance credentials of their service provider, suggesting an assumption that the provider will pick up the pieces if something goes wrong. And with a quarter of businesses having experienced a security incident affecting IT infrastructure hosted by a third party over the last 12 months, relying on cloud providers alone for complete protection could be a risky strategy.

This lack of planning and accountability by cloud adopters for the security of their information could have serious consequences for companies, with enterprises suffering an average financial impact of £900k as the result of a cloud-related security incident, compared to £75k for SMBs.

Kaspersky Lab,

451 Research: 28% of enterprises are already experimenting with blockchain

As it publishes its Blockchain Codex, 451 Research has revealed that 28% of enterprises are now evaluating or using blockchain, although fewer than 3% have any production applications. According to the 2017 Voice of the Enterprise Cloud Transformation, Vendor Evaluations study, 20% of organisations surveyed are using blockchain in a discovery or evaluation phase, 4% are running trials or pilots, 2% are in test and development environments, 2% are undertaking initial implementations of production applications, and less than 1% have broad implementations of production applications. The market is rife with blockchain-washing and there is little understanding of how enterprises can deploy blockchain profitably while navigating a market with thousands of vendors and hundreds of consortia vying for mindshare. The Blockchain Codex systematically decodes this market, with the goal of replacing confusion and complexity with an examination of the technology components and guidance on first steps.

How does it work? A blockchain is a distributed and decentralised database, shared among known or anonymous participants, that maintains a continuous list of records (transactions). Participants connect to the network through blockchain nodes; each blockchain node (a computer connected to the network) uses cryptography to secure strings of records (called blocks). Algorithms enable consensus among participants that new records are valid, and reject them if this consensus is not reached. The blockchain tracks the chronology of records (provenance), and ensures that records cannot be modified after being created (immutability). Each block includes a transaction and a reference to the previous block. This linear chain of blocks is replicated across all participating nodes, so that every participant is aware of all the transactions. Figure 1 illustrates the process.

[Figure 1: How blockchain works. A participant creates a transaction; the transaction is broadcast as a block among participants of the network; network participants approve the transaction; the block is added to the chain; and the replicated ledger is updated. Source: 451 Research, 2017]

A blockchain builds trust among participants as peers, without the need for a central authority, by distributing and securing records management and authenticating records using distributed consensus algorithms. Any participant can thus confidently engage in transactions or commerce with any other participant in the blockchain. Using a blockchain does not magically eliminate the possibility of fraud, but it makes it nearly impossible, or at least economically or practically unfeasible. The blockchain’s components – distributed database, cryptographic hashes and consensus protocols – are nothing new, but combined, they create a new way of sharing data and transferring assets.

Analysts believe blockchain has the potential to be the active ingredient for establishing universal trust amongst parties, through clever code and peer consensus. In the enterprise sector, where smart contracts will dictate terms and cloud-tasking using multiple providers is the norm, there will be a need for transparency and an immutable system of record. At the edge, IoT devices could take advantage of blockchain for authentication and to store and share interactions and data.

451 Research,
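The hash-chained structure 451 Research describes – each block holding a transaction plus a reference to its predecessor, so that history cannot be quietly rewritten – can be sketched in a few lines of Python. This is a toy illustration under our own naming (make_block, chain_is_valid are not from the report), omitting consensus and networking entirely:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's canonical (key-sorted) JSON form.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(transaction: str, prev_hash: str) -> dict:
    # Each block holds a transaction and a reference to the previous block.
    return {"transaction": transaction, "prev_hash": prev_hash}

def chain_is_valid(chain: list) -> bool:
    # Immutability check: every block must reference the hash of its predecessor.
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = [make_block("genesis", prev_hash="0" * 64)]
chain.append(make_block("Alice pays Bob 5", block_hash(chain[-1])))
chain.append(make_block("Bob pays Carol 2", block_hash(chain[-1])))

print(chain_is_valid(chain))            # True
chain[1]["transaction"] = "Alice pays Bob 500"  # tamper with history
print(chain_is_valid(chain))            # False: the later link no longer matches
```

Altering any historical transaction changes that block’s hash, breaking every link after it – which is why, as the report notes, records cannot be modified after being created without the whole network noticing.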



WinMagic finds cloud workloads at risk from security, management and compliance failings

New research from WinMagic has found that security, management and compliance challenges are eroding the benefits businesses receive from using the cloud as their infrastructures become more complex. 39% reported their infrastructure was more complex since adopting the cloud, and 53% spend more time on management tasks than they did previously. 98% of the 1,029 IT decision maker respondents reported using the cloud, with an average of 50% of their infrastructure hosted there. A third (33%) of respondents reported that data is only partially encrypted in the cloud, and 39% admitted to not having unbroken security audit trails across virtual machines in the cloud, leaving them exposed.

Asked about their top three concerns on future workloads in the cloud, 58% reported security as their top concern, followed by protecting sensitive data from unauthorised access (55%) and the increased complexity of infrastructure (44%). On average, companies had to use three encryption solutions to protect data across the cloud and on-premises infrastructure, illustrating one of the main ways this complexity emerges.

Hyperscale data centre count approaches the 400 mark; US still dominates

New data from Synergy Research Group shows that the number of large data centres operated by hyperscale providers is rapidly approaching the 400 mark. The year-end total will be over 390, after Q4 data centre openings in China, India and Malaysia. The mid-year period saw a flurry of openings in Germany, the UK, Singapore, Australia, Brazil and the US, with Google being particularly active. One notable feature of the global footprint is that, despite a major ongoing push to locate new operations in countries around the world, the US still accounts for 44% of major cloud and internet data centre sites. The next most prominent locations are China, Japan and the UK, which collectively account for another 20% of the total. The four leading countries are followed by Australia, Germany, Singapore, Canada, India and Brazil, each of which accounts for 3-5% of the total. The research is based on an analysis of the data centre footprint of 24 of the world’s major cloud and internet service firms, including the largest operators in SaaS, IaaS, PaaS, search, social networking and e-commerce. Synergy Research Group,

Responsibility for the regulatory compliance of data is a significant area of confusion, with only 39% considering themselves ultimately responsible for the compliance of data stored on cloud services. Worryingly, 20% believe it is solely the responsibility of the cloud service provider, while a further 20% believe they are covered by their cloud service provider’s SLA. WinMagic,


Digital transformation drives strong HCI adoption in the UK

NetApp has announced the findings of its study to uncover the true scale of digital disruption. In a survey of 501 UK IT decision makers from the full spectrum of the enterprise, NetApp found that 82% of businesses are in the midst of digital transformation. Hyper-converged infrastructure (HCI) emerged as the answer for accelerating digital transformation, with 22% of respondents adopting the technology for scalability and 20% for access to data. The level of concern is high among larger businesses, with 42% seeing digital transformation as a primary concern and only 2% citing it as a low priority. To navigate increasingly data-driven enterprises, HCI is the solution of choice, with 80% of UK businesses already using it and 11% planning to in the next 12 months. Only 4% say they have no plans to adopt HCI. Security is the leading motivation for HCI adoption (54%), with ease of use (44%) and cost savings (41%) following close behind. However, 49% cite hardware costs as a primary concern when considering HCI adoption. With 78% of IT decision makers indicating that they are ready for HCI technology, the market is ripe for a next-gen HCI solution. NetApp,

New research reveals cultural fit is a key consideration when choosing a cloud provider

A survey by Proact and AG Connect has found that 70% of companies would choose out-tasking over outsourcing. When making the complex decision of which cloud provider to select, organisations do not just consider cost, but instead seek a partner with the right cultural fit in order to achieve the desired service quality. This theory is supported by the results of the 2017 edition of the Hybrid Cloud Journey Barometer survey from Proact and AG Connect. The results illustrated that when choosing a service provider, three criteria play an important role: the service costs involved, the deep-seated IT knowledge of the IT partner, and their cultural fit. Flexibility also turned out to be a key factor. Almost 90% of respondents consider flexibility important or very important when purchasing cloud services. Flexibility in this instance refers to customers having a wide selection of choice, from the location where data is stored, to management and financial models. Sander Dekker, business unit director of Proact Unit West, commented, “Choosing the right cloud provider isn’t easy. How do you determine whether a service provider has an approach and a culture that truly suits your business? That is why it is not surprising that customers usually base their initial decision on costs. It is, however, a lost opportunity, because you need a lot more information than that to build a successful partnership.” Proact,

New study finds security professionals are wasting 40 hours per month due to inefficient systems

LogRhythm has released a new research study revealing that process and software inefficiencies play a major role in slowing down an organisation’s ability to detect and respond to cyber threats. Over one third of IT decision makers say their teams spend at least three hours a day on tasks that could be handled by better software, and the majority think the average cybersecurity professional wastes as much as 10 hours a week due to inadequate software. The study, conducted by Widmeyer, surveyed 751 IT decision makers from the US, UK and Asia/Pacific, and also found that an overwhelming majority (88%) of respondents view insider threats as a dangerous and growing concern in defending their organisations. The good news is that artificial intelligence (AI) is emerging as a critical weapon that organisations can use to fight the cyber war. The study reveals that IT executives in the US believe AI will be the biggest game changer for security over the next several years, enabling them to start winning the battle against external hackers and insider threats. These decision makers expect faster threat detection to be the number one benefit of cloud-based AI security, followed by superior data analysis and improved collaboration. LogRhythm,


on the cover

Thinking Big Centiel aims to significantly grow its UK market share with the acquisition of MPower UPS. Michael Brooks, managing director at MPower, tells us more and joins us for a Q&A.


Earlier in 2017, MPower joined forces with Centiel to market its UPS products, and it has now become a subsidiary company in the Centiel Group. Michael Brooks, managing director at MPower UPS, confirmed, “Whilst it is very much business as usual for all of our existing customers, the acquisition now means we have the product range and financial strength to exclusively and proactively market Centiel’s power protection solutions to a much larger target market.


“These solutions include the CumulusPower and PremiumTower products. CumulusPower is a three-phase, modular UPS system which offers class-leading system availability and very low total cost of ownership. PremiumTower is a unity output power factor machine for critical loads of between 10kW and 60kW. “The change is very exciting, as we can now offer our valued client base a manufacturer’s commitment and backing, meaning we can offer even better support,” continued Michael. “The new

structure also means we can target ever larger contracts and have access to ever greater resources, whilst maintaining the quality and level of personal assistance we have become known for over the past 12 years.” Filippo Marbach, founder of Centiel SA commented, “Centiel SA needed a strong UK subsidiary to expand its market share in the UK and MPower UPS is ideally placed to fulfil this role. MPower’s highly experienced team have an excellent reputation for high quality service and support.

on the cover

By combining such service excellence with an unrivalled expertise in project delivery and Centiel’s power protection products, we believe we can provide compelling arguments why customers should trust Centiel with their critical power protection requirements.” Michael Brooks of MPower explains more about the acquisition and what the future holds for both MPower and the industry.

What is Centiel’s history?

Centiel is a Swiss-based technology innovation company. The company’s founder, Filippo Marbach, and his highly experienced design team have led advances in UPS technology for many years, enabling the improvements in availability and efficiency that we see today. For example, the now commonplace transformerless UPS technology was pioneered by Filippo and his team in the early 1990s. Furthermore, this same design team also developed the first three-phase modular UPS systems, and the now widely available second generation of systems. It should be no surprise, therefore, that Centiel is now responsible for the latest third generation of three-phase modular UPS systems, and we are very excited to be a part of that.

What were the drivers to join with Centiel?

The Internet has transformed the landscape of business. Big data, Industry 4.0 and the Internet of Things require a global and always-on network. As a result, traditional high-availability technology is no longer good enough. Data centres and systems now span the globe and applications must be accessible at all times.

Filippo Marbach, owner and founder of Centiel (left) and Michael Brooks, managing director, MPower UPS (right)

Meeting these higher availability demands requires a new power protection strategy that accounts for the increasing complexity of enterprise application infrastructures. UPS solutions now need to be ultra-available, ultra-efficient and, of course, very high quality. Centiel’s technology offers a vastly superior solution in the form of CumulusPower, the innovative fault-tolerant modular UPS solution that takes availability and operational efficiency to a whole new level.

What does the future hold?

We have recently introduced the new Swiss-built Centiel CumulusPower UPS, incorporating Distributed Active Redundant Architecture (DARA), which provides a significant improvement over previous system designs. Each module contains all the power elements of a UPS – rectifier, inverter, static switch, display and, critically, all control and monitoring circuitry. This places it far above other current designs, which incorporate a separate, single static switch assembly and separate intelligence modules, and it ensures there is no single active component as a possible point of failure. As for the future, we are delighted to now be part of the Centiel group and look forward to introducing the latest innovations in product development. One area in which we have anticipated change for some time is battery technology. We believe that over time there will be a move towards lithium-ion (Li-ion) batteries, as cost reductions driven by developments in the automotive industry flow through to the standby power sectors. Incorporating Li-ion batteries will inevitably reduce the size and weight of UPS systems, and the longer useful working life of Li-ion will mean fewer costly replacements. All of which will benefit our customers. The systems of the future will need to be designed with Li-ion in mind. The good news is that Centiel’s technology is already Li-ion ready, so existing lead-acid battery installations will have the option to upgrade to Li-ion in the future without needing to replace the UPS. We will be displaying and talking about our Li-ion battery capabilities at this year’s Data Centre World in London.

centre of attention

Location, Location, Location Greg McCulloch, CEO at Aegis Data, discusses what we should be considering when it comes to choosing a data centre location and tells us why London shouldn’t necessarily be top of your list.


Location plays a critical part in our decision-making process: from choosing where to live, where to work and where to holiday, to where we do business and how we manage our information. The growth in data volumes, coupled with the need for more flexible and bespoke IT capabilities, means committing significant investment to managing and maintaining your own internal IT infrastructure is becoming increasingly difficult to justify. Instead, the rise of third-party outlets such as colocation facilities, and the flexibility of terms offered by these providers, means you can now pass that responsibility onto someone


else, giving you peace of mind in knowing that your IT infrastructure is safe and secure. But once you’ve made the decision to outsource, how do you choose a provider? Historically, London has often been the de facto location to store your information, but as considerations such as rent, scalability and data growth continue to creep higher up operators’ agendas, we are starting to see a much bigger pull towards alternative locations. Recently, a report from the data centre consulting group BroadGroup revealed Ireland to be the best place in Europe to set up a facility, citing several benefits including connectivity amongst cities, taxes and active government support.

Both Amazon and Microsoft have facilities in Dublin, with Microsoft’s being one of the largest in Europe. Now, Apple is looking to build an €850 million data centre in Athenry, outside Dublin, reinforcing why many are starting to look past more traditional data centre locations. So what are the considerations you need to take on board when looking for a location? To start with, facilities located in major cities are automatically privy to the costs and risks associated with city life. In somewhere as land-precious as London, for example, the rental market is very competitive, and because of this the costs associated are then passed onto the customer.

Another consideration is future cost – is your business likely to grow, meaning you may require more rack space? If this is a possibility, you need to ensure that the centre can provide the space to grow, because relocating to a new facility can be costly. Centres outside of London are often bigger, with greater capacity to grow and scale, and are therefore more easily able to incorporate new technologies. While the costs associated with location are likely to be specific to a given business, one common concern is security and risk. Traditionally, hubs can often be found in close proximity to one another. In London, you’ll find the majority located in the east of the city. While obvious conveniences such as proximity to financial districts and key exchanges will ring true, there are also risks to this, notably having all your centres in one basket.

We live in a society where the threat of terrorism is very real. The recent London and Manchester attacks are an unfortunate reminder of this, but also served to demonstrate that we can’t take anything for granted. In the event of an attack, a city may go into lockdown – the powers of remote working mean the impact of this is lessened, but what would happen if an issue occurred at your data centre and no one was around to fix it? Additionally, the natural surroundings of a data centre can also define vital elements of security, inherently increasing or reducing your exposure to risks. Is the data centre located close to a river, like the Thames for example?

“The natural surroundings of a data centre can define vital elements of security.”

Image courtesy of Dronepicr

How is this going to impact the data centre should there be heavy rains and flooding? Locating a data centre on a flood plain is always a risky strategy – all that needs to happen is for flood defences to fail once and your entire IT system could be compromised. Fire is also a constant threat for major cities. While today’s advanced fire prevention systems go a long way to protect the city’s data centres, those located outside the city carry far less risk. When looking for a data centre partner, these are all things that you’ll need to consider. Your business will have unique requirements, and it is critical that any facility is able to meet them. While major cities such as London might appear to be the obvious choice when it comes to outsourcing your information, it is important to understand that they are not the only choice, as the rise of locations such as Ireland has demonstrated. Recognising this might allow you to unlock more features and benefits and reduce your exposure to risk, all at a significantly lower cost.


meet me room

Dave Ricketts – Six Degrees Dave Ricketts, head of marketing at Six Degrees discusses changes in the industry, what makes a successful data centre and why timing is everything. What were you doing before you joined Six Degrees? How did you first get involved in the industry and would you have done anything differently? I was head of sales and marketing for C24, which was bought by Six Degrees about two years ago now. Originally when I completed my studies at university I worked for ad agencies, which had been a childhood dream, probably more to do with the 80s culture – the reality wasn’t for me. I happened upon the tech industry after I helped start 14 | January 2018

a company as one of the business development managers – we floated the company on the alternative investment market (AIM) and it was really my introduction to the industry. I believe you learn from your mistakes and successes, that everything happens for a reason and so far, I am happy with the journey. What are the biggest changes you have seen in the data centre industry? We have seen significant improvements to the services

“I think GDPR is an essential change for a maturing industry.”

offered by the giants in terms of public cloud. Strong partner ecosystems have enabled them to take a share of solutions that have traditionally been dominated by data centre owners and hosting providers. The challenge is the commoditisation of these solutions and the arrival of these companies as a competitive force within the marketplace. Developing ways to meet this competitive challenge will be a dominating issue throughout 2018.


What is the main motivation in the work that you do?

Client satisfaction. Personally, I try to deliver tangible benefits to all my customers – it is important to ensure clients see and feel that there is real value in the solutions we offer and in the services we can bring to them. It is a great feeling when a client re-signs at the end of a contract – to know you have done a good job and they are clearly happy with it.

What, in your opinion, is the most important aspect of a successful data centre?

To be successful, a data centre requires a combination of attributes: accreditation; aesthetics, so customers can comfortably hold business meetings there; security, which is paramount; power, cooling, backup and DR, and more. It is multifaceted and you need them all.

Are there any changes in laws or regulations that you would like to see, that you think would make your job easier?

Well, GDPR is coming, ready or not. It will be one of those things that people will need to learn and adopt, changing the way the business has been operating – the way data has previously been

managed. Really, I think it is an essential change for a maturing industry. We have taken on many things that we could not have coped with technologically 10-12 years ago, and so it follows that process and procedure must also change and develop.

What are the biggest pressures involved in your job?

From a marketing perspective, it is getting in front of the right clients at the right time with the right solution. You need such a perfect combination – the right message in the right verticals for the right people. It is tough, but when you nail it, it is a great feeling. The hardest lesson learned is the loss of a client and the realisation that they are not necessarily a client for life – the best you can do is be honest, fair and never become complacent.

What are your hobbies outside of work?

I like to exercise three to four times a week, but I love to read – immersing myself in the works of bestselling authors Chip and Dan Heath, or the lucid prose of George Orwell and the epic stories of Charles Dickens. In my spare time you will always find me with my nose in a book.

In his spare time, Dave loves reading classics by the likes of Charles Dickens.

Where is your favourite holiday destination and why?

St Ives in Cornwall. The destination has strong family links for me, and it epitomises what, for me, are the most important things – long family lunches, discussions in the front room – just time well spent.


If you could travel back to any period of history, where would it be and why?

My mum and dad are no longer with us, so it would be nice to travel back and see them again. Historically speaking, though, it would have to be 1930s America – to see how America was built and the emergence of this ‘American dream’ culture.

St Ives in Cornwall holds a lot of fond memories for Dave.

What is the best piece of advice you have ever been given?

“Always play with a straight bat” – whether it is family, friends, colleagues, clients or complete strangers, always be as honest as you would want them to be with you.


Rules and Regs

Mervyn Kelly, EMEA marketing director at Ciena, gives us a guide to network security when it comes to EU security regulations.


Rarely does a piece of legislation come along that has such wide-reaching implications as the European Union (EU) General Data Protection Regulation (GDPR). One of the largest and most detailed pieces of law that the EU has pulled together, it drastically changes and harmonises the rules for


how data is collected, used, stored and protected. Coming into effect in May 2018, it places greater rights in the hands of individuals to control the use of their data, and commands organisations to deliver a drastic overhaul of data governance. The legislation has implications for companies globally – this really isn’t something that is contained

within the EU – it has far-reaching implications for any organisation that handles data relating to EU citizens. This includes carriers, managed service providers, data centre operators and anyone responsible for operating data links from A to B. If you do business with anyone in the EU, or hold data on users in the EU, GDPR will impact you.


“If you do business with anyone in the EU, or hold data on users in the EU, GDPR will impact you.”

In addition to taking steps to collect, store and use data in a compliant manner, companies also need to look at how they move data, to ensure this too does not place information at risk of interception, miscommunication, misuse and general dereliction of care.  

The implications for network operators

The complex network infrastructure over which data travels is going to play a key part in ensuring compliance. With the new law establishing a legally required, albeit broad, duty of care for data security, everyone from telcos to data centre operators will need to show they have taken robust and reasonable steps to protect data at every part of its journey, including in-flight. This means ensuring data can’t be read if it is intercepted or otherwise tampered with as it moves from users into networks (such as inputting sensitive information via a secure web form), between locations (transferred from server to server, or database to database) and between points of storage (from short-term to long-term storage and archiving).

Encryption can play a significant role in helping organisations to move and store data in a way that is compatible with GDPR compliance. In-flight encryption can prevent the interception of data as it moves across fibre and copper networks. It is a major step towards ensuring a network infrastructure owner is not dragged into a data breach story and held liable for all or part of a large fine – which could be as much as 4% of global turnover or €20m, whichever is greater. To achieve this, operators should place appropriate hardware at each end of a connection so that in-flight data is automatically encrypted and decrypted with minimal lag and minimal disruption to users and applications.

Alongside GDPR, all operators of large network infrastructures will need to ensure they adhere to the EU’s e-Privacy Regulation. This operates in tandem with GDPR, applies to approximately 60% of data in active circulation, and serves to regulate traffic and location information on a telecoms network. As part of efforts to further harmonise it with GDPR and ensure compatibility between the two pieces of legislation, the regulation is set to be expanded to cover not just telco network operators and ISPs, but also over-the-top (OTT) services such as VoIP internet telephony, video streaming, webmail and instant messaging. This would see providers such as Skype, Facebook, Apple FaceTime, WhatsApp and others fall under the legislation.
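In-flight encryption is typically delivered by dedicated hardware at the network layer, but the same principle is visible in software. As an illustrative sketch (not a description of any vendor’s product), Python’s standard ssl module can be configured so that everything a client sends travels encrypted:

```python
import ssl

# A TLS context configured the way in-flight protection demands:
# the peer's certificate is verified and its hostname checked, so
# intercepted traffic can neither be read nor silently redirected.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocols

# Wrapping any TCP socket with this context encrypts all data in flight:
#   with socket.create_connection((host, 443)) as raw:
#       with context.wrap_socket(raw, server_hostname=host) as tls:
#           tls.sendall(data)  # encrypted on the wire
```

The dedicated encryption hardware described above does the equivalent job at line rate, below the application layer, so users and applications see no change.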

Permission to hold and move data

The GDPR requires all organisations collecting personal data – either directly or indirectly on behalf of wholesale customers – to be able to prove clear and affirmative consent to process that data. Organisations need to use simple language when asking for consent to collect personal data. They need to be clear at the point of collection about how they will use the information and how and where it will be stored, and they need to understand that users must actively opt in, not opt out. If they intend to shuttle data around the EU, or in and out of the EU to leverage global data centres and storage facilities, this too needs to be transparent to the individuals the data relates to, particularly in the event of a breach or loss of data.

The GDPR also introduces very restrictive, enforceable data handling principles. One of these is the ‘right to be forgotten’. It empowers individuals to require an organisation to purge all personally identifiable information (PII) it holds on them. This will have obvious implications for network operators, as an increase in internal search and discovery traffic will add strain to networks and storage arrays.

GDPR is coming in a little over six months. If network operators, OTT providers and data centre operators want to continue to be trusted advisors, and not be exposed to punitive sanctions and reputational damage when data goes astray, they must become reliable custodians with a focus on security, process and safe transit.


Cloud Control

Darren Mawhinney, managing director of CloudMigrator365, explains how the cloud can help your business get compliant with GDPR.


The UK’s Brexit planning has started in earnest, and companies and organisations are rightly looking at what leaving the EU will mean for their operations and staff. However, amid wide-ranging business concerns is a new piece of legislation affecting personal data which could potentially have similar aftershocks – the EU General Data Protection Regulation (GDPR), which will apply to the UK from May 25, 2018. GDPR is intended to strengthen data protection for individuals within the EU while imposing restrictions on the transfer of personal data outside the European Union to third countries or international organisations.


New rules

It would be a mistake for any data controller or processor to assume that because they know and adhere to the existing Data Protection Act 1998 (DPA), GDPR will be similar and no additional compliance work is required. GDPR brings a set of new and different requirements, and for anyone with day-to-day responsibility for data protection within an organisation, it is imperative to monitor the regulations and ensure the organisation is compliant-ready ahead of next year. Compliance requires investment as well as specialist knowledge, and many business leaders are looking at how the cloud can help with their data storage, protection and management while meeting GDPR compliance.

The challenge

GDPR is the biggest challenge facing data management in the last 20 years; it’s no understatement to say that it is presenting business leaders with a headache. A survey from analyst firm Gartner earlier this year showed that around half of those affected by the legislation, whether in the EU or outside, will not be in full compliance when the regulations take effect. The message coming through is that the cloud is the preferred option to help upgrade data security practices and data protection standards in line with the regulations. As the May 2018 deadline draws ever closer, moving data to the cloud can help ease the burden faced by senior IT leaders, many of whom see GDPR compliance as their top priority.


As a leading cloud services provider, we are increasingly being asked about GDPR considerations from concerned clients migrating to the cloud. We believe that the task of migrating people’s data such as emails, contacts, files, calendar and tasks over to Office 365 will make compliance easier for organisations.

The process of cloud migration

During any cloud migration, the most important result – particularly with the need for GDPR compliance ahead – is that data sovereignty is maintained and full control, with comprehensive reporting, is provided. After migration comes management, the next big part of the cloud journey, and one that is vital to GDPR compliance in addressing security and data protection.

This is why we introduced our own cloud management software, CloudManager. Launched at Microsoft Inspire 2017, CloudManager is a public cloud management platform which gives organisations and service providers the ability to control multi-tenant Office 365 users in a very intuitive and cost-effective way. It can be used by organisations and enterprises of different sizes, and so far it is helping companies to lower daily administration costs and Microsoft ticket escalations. It is particularly beneficial for large-scale operations with a mobile, fast-changing workforce, which need substantial licence flexibility to deal with the ebb and flow of their workers.

With the need for increased security, CloudManager offers bulk transaction processing, hierarchical management capability and role-based access control, which again helps companies comply with the increasingly stringent access controls required by GDPR. By providing software such as CloudMigrator365 and CloudManager, which help organisations meet GDPR requirements ahead of May 2018’s deadline, we are growing our business rapidly. GDPR compliance before May 25, 2018 isn’t an option for those doing business with EU countries, it’s a necessity. Organisations will need to look across their business and manage their data holistically to ensure compliance and avoid sanctions. With GDPR coming into effect in a matter of months, the time to act is now.

“GDPR is the biggest challenge facing data management in the last 20 years.”


The Event Of The Year. May 9th-10th 2018

Ready in good stead for 2018, EI Live! is back and ready to blow your smart home socks off. There’ll be a brand new format for exhibitors, a delicious networking dinner where the winners of the Smart Building Awards will be announced, and a jam-packed learning zone with a super schedule of spectacular speakers. Book your stand now, your competitor has!

Gala Awards Dinner – May 9
- Receive your award
- Network with industry leaders
- Live entertainment
- Three course meal

Product Categories
- Projector screen of the year
- Projector of the year
- Bracket/rack or mounting product of the year
- Loudspeaker of the year
- Cable solution of the year
- Matrix/signal distribution product of the year
- Multi-room/zone music solution of the year
- Dedicated touchscreen of the year
- TV/Display of the year
- Integrated TV/Display of the year

Project Categories
- Best cinema project
- Best commercial integration project
- Best whole house project
- Multi-dwelling unit (MDU) project of the year

Company Categories
- Distributor of the year
- Manufacturer of the year
- Training initiative of the year

People Categories
- Sales person of the year
- Special recognition award

twitter.com/eiliveshow


Just Do it

There’s no time like the present – Alex Bateman, strategy manager at Virtual College, explains why businesses need to update now for GDPR. Experts from leading IT companies also answer some key questions surrounding the new legislation.


The new EU General Data Protection Regulation (GDPR) is set to be a complete overhaul of the UK’s current Data Protection Act (1998). The new data protection laws will apply to businesses and public sector organisations of all sizes, and the regulation enforces new guidelines for data handling that both data processors and data controllers must abide by.


The purpose of GDPR

As we find ourselves in a new age of technology, companies are now able to store more personally identifiable information (PII) than ever before. This means there is a greater need for customers’ personal information to stay protected, and this is what the General Data Protection Regulation aims to achieve. Along with companies now being able to store more data

comes a rise in cybercrime activity, as cyber criminals become more advanced in their efforts to target companies. Recently, well-known organisations have found themselves victims of cybersecurity attacks, such as the WannaCry ransomware which affected the NHS in the UK, as well as hundreds of other companies globally. In total, 50 cyber incidents across different industries were reported to the ICO between April and June 2016, compared with 119 between January and March 2017 – a 138% increase.
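Those figures check out as a quick back-of-the-envelope calculation, using the numbers quoted above:

```python
# Cyber incidents reported to the ICO, per the figures quoted above.
incidents_apr_jun_2016 = 50
incidents_jan_mar_2017 = 119

# Percentage increase = (new - old) / old * 100
increase = (incidents_jan_mar_2017 - incidents_apr_jun_2016) / incidents_apr_jun_2016 * 100
print(f"{increase:.0f}% increase")  # prints "138% increase"
```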

Investing in cybersecurity

E-learning and training provider Virtual College says that in order to be compliant with GDPR, businesses need to ensure their technology and software are updated to protect stored data from cybersecurity threats. Peter Hilliard, head of marketing at Virtual College, commented, “There will be several key changes between the Data Protection Act 1998 and the GDPR that you need to be aware of. You need to start thinking about some of these changes now so that your business can be ready when the law changes. Virtual College has developed a free overview course that explains the changes that you need to be aware of as a risk owner.”

A study by CA Technologies of IT professionals in 200 businesses revealed that to meet the GDPR deadline next May, almost nine in 10 (88%) of businesses stated that they need to invest in new technologies and services. Plans for investment in technology included the following areas:

• Encryption (58%)
• Analytics and reporting (49%)
• Test data management (47%)

Businesses spent on average £4,590 on cybersecurity in the last financial year, according to the government’s Cybersecurity Breaches Survey 2017. Finance, insurance, information and utilities sectors predominantly spent the highest amounts, while education, health and social care sectors typically spent relatively little, despite cybersecurity being considered a high priority.

Some of the UK’s leading IT companies have provided answers below to key questions surrounding how businesses should prepare for GDPR.

What processes can a business put in place to be ‘GDPR ready’?

“Employees need to be sufficiently trained to take the proper precautions, be it with surveys, educational videos, or one-on-one meetings. The truth is that in order to become – and stay – compliant with the GDPR, organisations will have to establish the right processes that will ensure continual compliance. Information and data security need to be a part of every aspect of a company, its very foundation, for it to have a chance of succeeding,” said Harshini Carey, regional director at KMD Neupart UK.

“In order to prepare, small businesses should avoid the unnecessary usage of multiple disparate data systems, as they will need to account for the data they are holding by proving it is relevant to their business, including the legal justification (including consent) for doing so. They must also have in place the ability to manage and respond to customers’ Subject Access Requests in a timely and efficient manner. Finally, SMEs must re-examine their cybersecurity systems to make sure they are up to date and capable of protecting any data they are storing,” commented Phil Beckett, managing director of Global Disputes and Investigations at Alvarez and Marsal.

If a business has not updated technology correctly or is still using outdated software, what are the potential cybersecurity risks?

“This is a big problem as you are opening yourself up to hackers and the potential of an attack. Even simply ignoring an update is creating a risk,” says Harshini Carey, regional director at KMD Neupart UK.

“Technology has a shelf life, needs constant updates and maintenance, and failing to keep technology up to date results in vulnerabilities and being exposed to hacks, malware infections or ransomware attacks, to mention a few,” commented Austen Clark, managing director at Clark IT.

“With companies now being able to store more data, comes a rise in cybercrime activity.”

How can a business know the right amount of money to invest in IT and cybersecurity?

“There is no right answer to this, but taking it seriously is important – how much value do you place on your reputation? When we read news of a breach at a business, the stigma of being insecure stays with the victim; confidence and integrity are damaged, and that can be more damaging than the initial financial loss.

“Almost all software has a lifecycle. Software engineers are tasked to manage and maintain it during its lifespan. When it reaches the end of its useful life, these engineers move on to the ‘new’ software, leaving the older unsupported software vulnerable to future attacks – this is what happened to the NHS.

“Many actions cost nothing, like changing your password, locking your phone and educating yourself. Other steps have minimal cost, like installing a malware and antivirus package, or ensuring your router and firewall are secure and up to date,” commented Austen Clark, managing director at Clark IT.

Read Virtual College’s guide to GDPR to find out more about what your business can do to be prepared for the new regulations, or register for Virtual College’s free ‘An Introduction to GDPR’ online course, which is available to anyone looking for more information.


‘To Do’

Ian Kilpatrick, EVP cyber security at Nuvias Group, discusses what GDPR means for your business and what you need to do to ensure compliance.


GDPR is coming and will be a game-changer in how organisations store, secure and manage personal data. It will affect the whole of the EU, which currently spans 28 member countries and half a billion citizens. Its goal is to unify data protection across the European Union, but because GDPR applies to individuals within the EU or the European Economic Area (EEA), companies outside these zones will still have to meet its standards if they want to continue using data from customers in the EU. The purpose of the new regulation is to shift control of personal data back to the owner of that data.

Every organisation should be aware that GDPR brings huge fines for data breaches – up to 4% of annual global turnover or €20 million, whichever is greater. The consequences of any data loss could therefore be financially devastating for any company. The data in question could be usernames, location data, online identifiers like IP addresses or cookies, or passwords. The loss of personal or work-related information – whether that’s access details, passwords or any other customer data – is endemic today; almost 1.4 billion data records were lost in 2016 alone, an increase of 86% compared to the year before.
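The headline fine is ‘whichever is greater’ of the two figures, which matters: for any company turning over more than €500m, the 4% measure dominates. A quick, purely illustrative sketch:

```python
def gdpr_fine_cap_eur(annual_global_turnover_eur):
    """Upper bound of the headline GDPR fine: 4% of annual global
    turnover or EUR 20 million, whichever is greater."""
    return max(0.04 * annual_global_turnover_eur, 20_000_000)

print(gdpr_fine_cap_eur(100_000_000))    # smaller firm: the EUR 20m floor applies
print(gdpr_fine_cap_eur(1_000_000_000))  # EUR 1bn turnover: the cap rises to EUR 40m
```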


After next May, organisations will have 72 hours to disclose any serious data breach to the relevant authority – in the UK, the Information Commissioner’s Office (ICO) – as well as to the victims of the breach. The penalty for failing to notify a breach will be up to €10 million, or 2% of revenues. Analyst firm IDC predicts that the severity of fines, coupled with the substantial changes in scope, will drive enterprises to radically shake up their data protection practices, seeking the assistance of new technologies to help with compliance.

Despite all this, a survey by information services group Experian reports that nearly half of businesses (48%) admit they are not ready for GDPR and are only in the early stages of preparing for the regulations. If they are not doing so already, organisations need to start putting plans in place now if they’re to meet the May 2018 deadline.

So, what steps can companies take to ensure their GDPR compliance? The ability to ensure confidentiality, integrity, availability and resilience will be crucial – as will restoring data in a timely manner in the event of an incident. Organisations will need a process for testing and evaluating the effectiveness of their security measures, meaning they will need to demonstrate they have taken adequate steps to protect the data.

GDPR doesn’t prescribe specific data protection technologies, but rather processes that organisations should undertake. However, companies should be talking to their IT providers about core data security solutions covering encryption, access and identity management, two-factor authentication, application control, intrusion prevention and detection, URL filtering, APT blocking and data loss protection. They also shouldn’t neglect the network – by securing wireless access points, for example. Having a demonstrable security policy in place and


making sure employees are fully trained in the correct security practices will prove invaluable.

Larger organisations and public bodies will require a data protection officer; this is a senior role that operates independently of the IT department and will enjoy significant protection, along with the responsibility of reporting any data breach. They will act as a fulcrum for developing, enacting and continually testing the organisation’s security compliance posture. However, GDPR compliance is everyone’s responsibility and shouldn’t be left to one team – legal, IT, HR and other business functions must all be involved, with visible support from the executive level.

“The purpose of the new regulation is to shift control of personal data back to the owner of that data.”

Something else that GDPR will likely affect is insurance. As the regulations require every business to report any data breach, there is going to be more of an emphasis on liability and who is to blame as data losses come to light.

In simple terms, businesses should document everything they have done at a technical and policy level to show due diligence. There are several framework documents created at a national level that can help: for example, the UK’s National Cyber Security Centre publishes ‘10 steps’ guidance that offers a basic checklist of the areas that should be covered. With heavy financial and reputational risk threatening, the sooner the new regulations are adopted, the more confident a company can be that it will not be found wanting when GDPR comes into effect.

cyber predictions

What next?

Greg Sim, CEO at Glasswall Solutions, discusses what’s next for cybersecurity in 2018.


The last year has been a significant 12 months in the short history of cybersecurity, with headline security breaches such as Uber’s, and a scramble to come up with new approaches, particularly as the European Union’s General Data Protection Regulation comes into force next May. 2018 will see further developments in this dynamic field that will affect almost every organisation on the planet. Here are some predictions for the next 12 months.


Automation will continue to transform cybersecurity It is increasingly recognised that responses to security breaches and other incidents are badly slowed down by manual processes. As a result it is inevitable that security operations workflows will increasingly be supported within Security Information and Event Management tools and incident response (IR) platforms. We can expect to see hefty resources devoted to IR automation in particular.

This will involve, for example, blocking malicious IP addresses, web domains, and URLs, using threat intelligence. An organisation could orchestrate the workflow associated with a security investigation or patching a software vulnerability, but in 2018 we are more likely to see large organisations automating security analytics and operations, largely because security involves so many mundane tasks, whereas orchestration is complex.


Automation offers immediate gains across cybersecurity. With emails, for example, advanced solutions can automate the minute examination of every attachment against the manufacturer’s standard, so that only a sanitised document, free of malware, is admitted to an organisation’s system. Decisions on whether to click open an attachment are no longer left to the harassed employee.

Blockchain will be no cybersecurity panacea

It is tempting to think that blockchain perfectly complements internal security layers as part of a defence-in-depth approach. Implementations are starting to address blockchain data confidentiality and access control challenges by providing ready-made data encryption, authentication and authorisation capabilities. But blockchain provides little utility in threat detection or active defence, so throughout 2018 organisations will find they need other, more proven and tested forms of technological innovation to protect them from hackers and the millions of different malware variants they are throwing at businesses every year. This has to go alongside an overall cybersecurity programme that includes a governance framework covering roles, processes, accountability measures, performance metrics, and a change in mindset within the entire organisation.

“The Internet of Things extends the security border of an organisation way beyond its physical boundaries.”

The growth of IoT will necessitate further rethinking of security

The Internet of Things (IoT) extends the security border of an organisation way beyond its physical boundaries. Consider how many internet-enabled devices are part of an electricity grid. Smartphones, tablets and the new generation of electronics that users can control externally – such as refrigerators, home security systems and even home heating systems – are also part of the IoT and vulnerable to compromise. By 2020 we could be looking at a trillion connected devices in the world. The successful attack on the San Francisco MUNI transport system in 2016 is a prime example of just how vulnerable an organisation reliant on multiple internet-connected devices can be to hackers demanding a ransom to release encrypted data. An assault on the core infrastructure of the internet could have a massive effect, particularly if it is linked to terrorism. The best defence is to keep malicious code out of an organisation’s network in the first place, rather than relying on outdated anti-virus defences which, as is widely known, can never pick up the kinds of malware criminals are devising every hour of the day.
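The automated attachment examination mentioned earlier – checking that a file really is what it claims to be before it enters the network – can be sketched in miniature. This is a toy illustration, not Glasswall’s technology; the signature table and function name are assumptions for the example:

```python
import pathlib

# File signatures ("magic numbers") for a few common attachment types.
# A real content-inspection product validates far more than the header.
SIGNATURES = {
    ".pdf": b"%PDF",
    ".png": b"\x89PNG",
    ".zip": b"PK\x03\x04",  # also the container format for .docx/.xlsx
}

def extension_matches_content(path):
    """Return True/False if the file's bytes match its claimed extension,
    or None when the type is unknown and should be quarantined for review."""
    p = pathlib.Path(path)
    expected = SIGNATURES.get(p.suffix.lower())
    if expected is None:
        return None
    return p.read_bytes().startswith(expected)
```

Commercial solutions go further, rebuilding each document to a known-good standard rather than merely inspecting it, but the principle is the same: the decision is taken away from the harassed employee.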

GDPR will wake everyone up to security requirements Although the rush to achieve GDPR compliance is already underway, many businesses are going to be caught out as they fail to grasp their responsibilities to EU citizens whose personally identifiable data they hold. Legal challenges about the way data is handled are likely to proliferate, with fines, substantial costs and public exposure inevitable. It is likely, however, that the regulators will not inflict the full rigour of the penalties available where organisations have failed to comply through poor implementation of new processes.

The same may not be true of organisations that are breached by hackers and seen as failing to fulfil the GDPR’s requirement for state-of-the-art technology to be in place. Fines of up to €20 million or 4% of turnover may be levied if it is felt an example should be made to encourage everyone else to invest in effective security that protects citizens’ data. The first half of 2018 should be when the laggards finally address their major security loopholes such as continuing reliance on anti-virus solutions.

The small print – why innovation will trump cyber insurance in 2018

The cyber insurance market will continue to grow from a low base, but more businesses are also likely to realise that pay-outs can never cover the entirety of their losses if they are hacked. In the course of the year it will become apparent to many organisations, including SMEs, that investing in advanced security technology is a much better investment. They will be targeted by hackers using emails just like everyone else and need innovative solutions to protect them. Relying on traditional perimeter security and cyber insurance will come nowhere near protecting an organisation. Not only will substantial fines and legal costs be inflicted, the victim organisation will have to compensate the individuals affected and then spend substantial amounts of time and money on rebuilding its reputation. Enterprises will see that cyber insurance can never mitigate all the damage of a successful cyber-attack.


Battling the Bottleneck

Darren Watkins, managing director at Virtus Data Centres, talks Big Data, IoT and the need for high density and ultra-high density computing, addressing the capacity and complexity bottlenecks that come with the territory.


Big Data and IoT have long been heralded as the next revolution in the IT world. Beyond the headlines of connected devices – and customer behaviour analysis – IoT and Big Data are being used to solve increasingly complex business problems. Digital businesses are turning to IoT technology to manage the connections, devices and applications that make up their organisation. Automated workflows – which have long been a watchword of


manufacturing business strategy – are being embraced by many disparate organisations. IoT and Big Data are clearly intimately connected: billions of internet-connected ‘things’ will, by definition, generate massive amounts of data, and the IoT industry relies on Big Data techniques to turn all of the information it gathers into something useful, actionable and – sometimes – automated. On the flip side, IoT provides a wealth of data which, with compute

processing and intelligence, can generate invaluable insight for organisations to use. And although the future seems bright for these innovative technologies, for many the possibilities are limited by issues of complexity and capacity. The benefits of IoT and Big Data will only come to fruition if businesses can run analytics that – with the growth of data – have become too complex and time critical for normal enterprise servers to handle efficiently.


The big capacity challenge

IoT and Big Data put intense pressure on the security, servers, storage and network of any organisation – and the impact of these demands is being felt across the entire technology supply chain. IT departments need to deploy more forward-looking capacity management to proactively meet the business priorities associated with IoT connections, and Big Data processing requires a vast amount of storage and computing resources. All this means that, ultimately, the data centre now sits firmly at the heart of the business. Beyond simply storing IoT-generated data, the ability to access and interpret it as meaningful, actionable information – very quickly – is vitally important and will give a huge competitive advantage to those organisations that do it well.

At Virtus, we believe that getting the data centre strategy right means that a company has an intelligent and scalable asset that enables choice and growth. But – get it wrong and it becomes a fundamental constraint for innovation. So organisations must ensure their data centre strategy is ready and able to deal with the next generation of computing and performance needs – to remain not only competitive and cost efficient, but also ready for exponential growth.

High Performance Computing

Of course, the IT industry is devoted to designing innovative tools and techniques to keep up with the rapid evolution of tech trends like IoT and Big Data – and tech vendors already offer a multitude of solutions to the capacity and complexity problems.

“Being able to support High Performance Computing in the data centre has become the new battleground for colocation providers.”

High Performance Computing (HPC), once seen as the preserve of niche verticals such as education and pharmaceuticals, is now being looked at as a compelling way to address the challenges presented by IoT and Big Data. HPC has presented significant challenges in recent years – such as scaling computing performance for high velocity, high variety and high volume Big Data, and deep learning with massive-scale datasets – but the benefits are increasingly clear, and not just within a few key verticals. Data centre managers are now looking to adopt high density innovation strategies in order to maximise productivity and efficiency, increasing the available power density and the computing power per physical footprint of the data centre. Indeed, High Density Computing (HDC) also addresses an important cost element – a crucial concern



as complex tech developments mean that storage and power requirements spiral. HDC offers customers the ability to consolidate their IT infrastructure, reducing their data centre footprint and therefore their overall costs. The denser the deployment, the more financially efficient customer deployment becomes.
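The footprint arithmetic behind that claim can be sketched as follows – the load and density figures are invented for illustration, not Virtus pricing:

```python
import math

def racks_needed(total_load_kw: float, kw_per_rack: float) -> int:
    """Racks required to host a given IT load at a given power density."""
    return math.ceil(total_load_kw / kw_per_rack)

# A 400kW deployment at a conventional 5kW/rack versus a
# high-density 20kW/rack: quadrupling the density cuts the
# footprint (and the per-rack colocation charges) to a quarter.
print(racks_needed(400, 5))   # 80
print(racks_needed(400, 20))  # 20
```

The rack count falls linearly with density, which is why the denser deployment becomes the more financially efficient one, provided the facility can deliver the power and cooling per rack.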

Finding the right provider

We know that the processing requirements needed to meet the demands of IoT and Big Data, combined with cost mitigation, are accelerating the need for HPC. But many organisations may find the public cloud ill-suited to delivering the right platform. We believe the answer is not to design and build a highly expensive owned data centre – one that will age rapidly and become inefficient – but instead to look to the colocation providers who understand the specialised needs of HPC. Being able to support High Performance Computing in the

Data centre managers are now looking to adopt high density innovation strategies to maximise productivity and efficiency.

data centre has become the new battleground for colocation providers – and high density capability will be crucial for businesses deciding which third party data centre to use. We think that organisations need to look closely at these capabilities. If High Density has been designed ‘in’ from the beginning, it provides the ability to support the next generation of businesses’ IT infrastructure for High Performance Computing – optimising the data centre footprint required and the overall associated costs. This means that irrespective of whether existing data centres take steps to offer High Density, they are playing catch-up with a next generation of intelligent data centres that already have this capability. Providers that are working to upgrade legacy data centres for Ultra High Density are facing a more difficult task. Although the concept of high density is straightforward, it involves a lot more than simply mainlining more electricity into the

building. It’s essential that, before a data centre can support this requirement, it has a robust and fit-for-purpose infrastructure in place.

Future proofing your business

Businesses can either seize the opportunity that IoT and Big Data offer – like game-changers Netflix and Instagram – or watch their business disappear. While many industries have embraced this crucial opportunity to adopt IoT and Big Data technology, businesses that don’t get the basics right will ultimately struggle to remain competitive on every front. The key component of success is ensuring the data centre is equipped to handle the rigorous demands that these technology innovations place on it. Organisations must look to the right data centre partner to help their business succeed, and to new technologies like HPC and HDC to help meet these demands.

next issue

Infrastructure management

Next Time… As well as its regular range of features and news items, the February issue of Data Centre News will contain major features on infrastructure management. To make sure you don’t miss the opportunity to advertise your products to this exclusive readership, call Ian on 01634 673163 or email

data centre news


patch management

Patchwork

Mathivanan V, director of product management at ManageEngine, tears up the traditional approach to patch management with automation.


Ensuring full IT security is a thankless task that is often undermined by regular software updates, but it’s a necessary evil in today’s workplace. Businesses have little choice but to yield to this time-consuming task or face being compromised by unrelenting cyber-attackers. Keeping on top of patch updates places considerable stress on IT teams already inundated with a variety of other security responsibilities. Because the work is so labour intensive, many teams don’t have the time or resources, leaving businesses at considerable risk. Patch management means keeping every piece of software on every networked machine up to date to safeguard the business from vulnerabilities. It is something every IT manager wishes for, but most struggle to achieve because of the complexity of the task. With a


variety of operating systems, ranging from macOS Sierra to Windows 8.1, running off a Linux server, making sure everything talks to everything else is hard enough without constantly installing patch updates that could upset these delicate ecosystems.

The quick-fix affliction

Fulfilling the dream of seamless patch management and the associated security benefits requires an almost entirely self-sufficient tool that ensures all the available patches on applicable systems are up to date without troublesome, time-consuming daily intervention and management from the hard-pressed IT admin. Receiving, distributing and installing routine monthly Microsoft operating system and application patches on “Patch Tuesday”, for example, certainly benefits from automation.

Yet patching third-party applications on the desktop remains a significant challenge for many organisations, because of the fragility of many server environments. When virtualisation is added, the admin can face even greater complexity, especially when resources are limited, as they are in many medium-sized and larger businesses. Java, Adobe Reader, Flash and Firefox, along with many other business-specific applications, are often patched considerably later than Windows and Office, for instance.

Update inundation

With cybercrime rampant, hundreds of patches are released each month, which heaps even more pressure on the IT department. It has to decide which patches to install, which to ignore and what the


optimum order of installation should be. Delaying could expose the business to a devastating ransomware or zero-day attack – and we all know who receives the blame when that happens. Yet the variety of platforms and configurations a business may have, and their vital importance in day-to-day operations, may mean it is not desirable to install a new patch as soon as it is available.

Testing patches before implementation is another necessity that can complicate matters. While it is important to test patches to ensure their stability, that can be difficult when an organisation does not have the spare hardware, software or personnel readily available to create a testing environment. Software inventory management introduces yet another challenge, because patch management depends on having a current and complete inventory of the software installed on every device in the environment. Even when the IT department has an accurate inventory of systems, a list of controls, a system for collecting and analysing vulnerability alerts and a risk classification system, it still has to deploy patches without disrupting normal operations.
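One way such triage might be automated is to sort the pending queue by test status, severity and age. The sketch below is a hypothetical model – the `Patch` fields and severity labels are invented, not any vendor’s schema:

```python
from dataclasses import dataclass

SEVERITY_RANK = {"critical": 0, "important": 1, "moderate": 2, "low": 3}

@dataclass
class Patch:
    name: str
    severity: str            # vendor-assigned severity label
    days_since_release: int  # older unpatched holes carry more risk
    tested: bool = False     # has it passed the test environment?

def installation_order(patches: list) -> list:
    """Order patches for deployment: tested ones first, then by
    severity, then oldest-outstanding first."""
    return sorted(
        patches,
        key=lambda p: (not p.tested,
                       SEVERITY_RANK[p.severity],
                       -p.days_since_release),
    )

queue = [
    Patch("office-update", "moderate", 3, tested=True),
    Patch("kernel-fix", "critical", 10, tested=True),
    Patch("browser-fix", "critical", 2, tested=False),
]
print([p.name for p in installation_order(queue)])
# ['kernel-fix', 'office-update', 'browser-fix']
```

Note the untested critical patch sits behind the tested ones: a deliberate choice reflecting the article’s point that stability testing matters as much as speed.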

Automation for the people

“Patch reports lacking detail can place devices and applications at risk.”

Automation is already overcoming many of these hurdles, using a single interface to make the whole process of patch management much easier and far less taxing on the brain. Automating the entire patch management life-cycle, with tools such as ManageEngine’s Desktop Central, now means it is possible to detect missing patches without staff intervention. Patches are downloaded from the respective vendors’ websites and tested as required, in line with the business’s own assessment of its risks and priorities. Nonetheless, when opting for automation, it’s important to ensure that every one of an organisation’s current IT infrastructure platforms, including operating systems and applications, is addressed, and that remote offices and roaming devices are always included. Where necessary, it should be possible to exclude patches for specific groups of devices, or the departments in which they are used, to prevent the network falling over if they rely on a specific OS. Automation also makes it possible to minimise disruption and irritation for end-users by installing patches during non-business hours, or at least when applications are not in use. Devices are woken up before patches are

deployed and then rebooted after installation. The word ‘automation’ can also suggest an undesirable level of rigidity, but in reality it gives admins all the flexibility they need, so that a patch can be postponed if – for instance – an end-user is on a slow network at a remote location but urgently needs their notebook. Lack of access to detailed reports is also a common problem with potentially serious consequences, which automation resolves. It is worth remembering that patch reports lacking detail can place devices and applications at risk. If the business has to meet specific industry compliance standards, these risks are never worth taking, because they can place the organisation’s entire IT infrastructure in jeopardy. Although IT departments continue to face so many challenges in relation to patches, there is no longer any need for them to spend so much time grinding through the process when automation can take care of almost every aspect of this never-ending task. These advances in solution design not only cut out much of the drudgery, they also offer better insight into current status and greater reassurance that organisations will be safer without compromising business performance.
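The off-hours scheduling described above reduces to a simple window check before any wake-and-deploy cycle. A minimal sketch – the window times are an assumption, not any product’s default:

```python
from datetime import time

DEPLOY_START = time(1, 0)   # 01:00, after business hours
DEPLOY_END = time(5, 0)     # 05:00, before users return

def in_deployment_window(clock: time) -> bool:
    """True when patches may be pushed without disrupting users."""
    return DEPLOY_START <= clock < DEPLOY_END

# An agent would wake the device, install and reboot only when
# this returns True; otherwise the patch is postponed.
print(in_deployment_window(time(2, 30)))  # True
print(in_deployment_window(time(9, 15)))  # False
```

In practice a real agent would also honour the postponement cases the article mentions, such as a user on a slow remote link, by treating the window check as one condition among several.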

edge data centres

On The Edge

Mattias Fridström, chief evangelist at Telia Carrier, looks at how cheaper fibre is driving data centre evolution.


Consumers are making much more use of cloud computing, and therefore data centres, whether they realise it or not – for mobile apps, gaming and, of course, streaming content. As the number of people using these services keeps rising, so too does the data throughput needed to support a high quality experience. At the same time, businesses are making much greater use of cloud computing, whether through software as a service (SaaS) applications, storage, infrastructure as a service (IaaS),


or to offload intensive tasks that require raw computing power. The workhorses making all of this possible are, of course, the data centres and the backbone interconnections between them and end-users, consumers and businesses. Since the beginning of IT time, carriers have needed to move content between content producers and end-users. Content is seldom produced or archived where it is consumed. The key to fast service delivery is making sure content travels over a network that reaches as many end-users as possible in the most efficient fashion.

The edge – are you in or out?

The basic challenge has not changed. Even as hardware becomes cheaper, smaller and less power-intensive, being able to quickly and efficiently reach end-users across the network will always be the top requirement, regardless of where compute resources are located. The cost of installing high-speed fibre connections is constantly dropping, and the availability of those connections means carriers can now offer multi-gigabit speeds to any location served by fibre.


In turn, this dramatically increases the number of feasible data centre locations, including at the edge of the network. Enterprises are entering an era of reliable global high-speed connectivity that they have not experienced before. This is creating opportunities to have their own inexpensive data centres, built on commoditised hardware, connected anywhere in the world across high-speed fibre. Carriers are also working to enable enterprises to connect directly into their infrastructure at points that are convenient to them. As a result, the concept of traditional interconnection meeting points is becoming less important. Enterprises can build and operate data centres without the need for ‘old school’ interconnection points to shuffle data traffic around. This gives companies a lot of choice over how they use network services, but it’s not as straightforward as saying ‘edge is best because it gets you closer to users.’

CDNs and the IoT

The availability of fibre makes it practical and relatively inexpensive to put physical data centres in different countries.

Let’s look at two examples. First, content delivery: as content has become larger, more sophisticated and more critical to everyday life, end-users expect better quality. Distributing content closer to the user provides a better experience, but it comes at a cost to the content provider, in the form of a more complex distribution network with servers at the edge

and the overhead of locating and maintaining equipment across multiple geographic regions. Content delivery networks (CDNs) provide a distribution network as a service, which removes a lot of this complexity for the content distributor – but at a cost. CDNs have an important role to play but have limitations when it comes to speeding up non-cacheable data, cloud services or other interactive services. At first glance, edge computing might help by placing servers closer to the end-user, but tasks need to be relatively stand-alone and self-contained, so basic staples like financial transaction processing, e-commerce and anything else that requires a centralised database aren’t going to be ‘edge worthy.’ Our second example is Internet of Things (IoT) applications, which would also appear to be edge computing candidates at first glance. However, IoT’s power comes from the collection, aggregation and archiving of data from thousands to millions of end-point devices. The value in IoT comes from the data collected over time and analysed as a whole across the entire collection of ‘things.’ That means a centralised database is needed as the aggregate data store – one that can be sorted and probed in numerous ways with statistical tools.
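That centralised-aggregation point can be shown in miniature: readings from many ‘things’ only become useful once pooled and queried as a whole. The sensor names and values below are invented for illustration:

```python
from collections import defaultdict
from statistics import mean

# Central store: every reading from every sensor, kept over time.
store = defaultdict(list)

def ingest(sensor_id: str, value: float) -> None:
    store[sensor_id].append(value)

# Thousands of end-points would feed this; three readings suffice here.
for sensor, temp in [("t1", 20.5), ("t2", 22.0), ("t1", 21.5)]:
    ingest(sensor, temp)

# Fleet-wide analysis is only possible against the aggregate store.
print({s: mean(v) for s, v in sorted(store.items())})
# {'t1': 21.0, 't2': 22.0}
```

An edge node holding only its own slice of readings could never answer the fleet-wide question, which is why the IoT case argues for a central data store rather than a purely edge-based design.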

Fibre driving new opportunities and choice

Establishing high-speed network connections is now mainly a function of the availability of physical media – fibre – rather than being dependent upon more expensive proprietary network equipment. It is now almost as easy to set up a 100Gbps or faster optical connection across a thousand kilometres or more as it is for a few hundred metres between racks at an interconnect point.

There are many factors to consider when deciding whether you need centralised control, an edge-based approach, or indeed a combination. Many companies are moving to a model of multiple owned data centres across national boundaries so that they can meet regulatory requirements. For example, some countries require financial transaction data and personally identifiable information (PII) to remain within the originating country for privacy and security reasons. While the need in this case is being driven by legislation, lower cost networking hardware and the availability of fibre within and between metros make it practical and relatively inexpensive to put physical data centres in different countries without impacting performance.

Seizing the opportunity

With the availability of lower cost hardware and plentiful fibre, data centres are no longer restricted to one or two central locations. Instead, they can be in any place that has a fibre connection, with information collected, stored and transported to where it is needed for business processes and/or regulatory requirements. CIOs and IT staff need to take this opportunity to rethink their data centre strategies and the partnerships with carriers that can help achieve their goals. The right partnership could help build a more resilient business infrastructure, bring sensitive workloads back in-house, improve performance or allow exploration of new expansion opportunities. Now that anywhere with fibre can host a data centre, enterprises can do what is best for them without the constraints of the past.

Projects & Agreements

JPIX expands Tokyo data centre presence with Colt Data Centre Services

JPIX (Japan Internet Exchange) is expanding its presence to Colt DCS’s Tokyo Shiohama carrier-neutral data centre. Colt DCS has enjoyed years of strong growth across Asia and is now expanding its connectivity capabilities to serve its customers’ requirement for optimal network solutions. By joining Colt Shiohama, JPIX will be able to service the demand for high performance interconnectivity from the global customers located in this data centre. Peering at internet exchanges is an important requirement as organisations work to distribute their content most effectively. JPIX’s global customers in Shiohama will also benefit from bilingual staff on-site who can respond instantly and effectively to their needs. Furthermore, as the data centre is located in the heart of Tokyo, it provides a first-class edge location for organisations needing to be as close as possible to their users. This is a crucial consideration for many JPIX customers, especially internet service providers (ISPs), content delivery networks (CDNs) and communications service providers (CSPs). JPIX, established as the first commercial internet exchange (IX) in Japan in July 1997, currently serves more than 170 customers across the Asia-Pacific region with 700Gbps of peak traffic – more than any other IX provider in Japan. JPIX is a completely neutral internet exchange and plays a vital role at the core of Japan’s internet. Colt Data Centre Services,

BATM launches virtual cybersecurity solution with Trend Micro

BATM Advanced Communications, a provider of real-time technologies for networking solutions and medical laboratory systems, has announced that, together with Telco Systems and Trend Micro, it has launched a software-based cybersecurity solution for deployment across virtual networks. This new virtual network function (VNF) enhances BATM’s market position by expanding its NFV portfolio to include security solutions, and it is the only vSecurity offering by a worldwide software vendor that operates on both Arm architecture and all Intel platforms. The launch is a vSecurity VNF based on Telco’s NFVTime – which can convert any operating system into a software-based virtual network and comes with a broad portfolio of VNFs – and Trend Micro’s Virtual Network Function Suite, which offers flexible and high performance network security functions. As a result, telecoms operators and managed service providers can seamlessly deploy a high quality network security service that provides increased performance, flexibility and cost savings on their networks, regardless of the hardware or systems they choose to use. BATM,

APC recognises Scale Computing as an Alliance Partner for its Micro Data Centre Xpress solution

Scale Computing has been chosen as an Alliance Partner by APC by Schneider Electric, supporting its award-winning hyperconverged Micro Data Centre Xpress solution. The Micro Data Centre Xpress is designed to simplify physical deployments for edge environments, as well as meet the challenges of Big Data and IoT. Scale’s portfolio of HC3 hyperconverged solutions combines storage, servers and virtualisation in one comprehensive system to automate overall management, allowing IT to focus on managing applications rather than infrastructure. With no virtualisation software to license and no external storage to buy, HC3 products reduce out-of-pocket costs and radically simplify the infrastructure needed to keep applications running. To support the growing demand for micro data centres at the edge, APC by Schneider Electric created the Micro Data Centre Xpress solution, which combines a purpose-built infrastructure with a physical management wrapper for hyperconverged architectures. This enables organisations to simply and efficiently plug in the Micro Data Centre Xpress to deliver a complete and highly energy efficient IT solution that is pre-tested, optimised and able to be deployed rapidly. Scale Computing,


First Tier IV data centre in Ghana to be delivered by Etix Everywhere

The joint venture between Etix Everywhere, Ngoya and Africa Investment Group (AIG) has announced the start of construction of a Tier IV data centre in Accra, Ghana. This multi-tenant data centre (MTDC) will be carrier and vendor neutral. Etix Accra #1 will be the first Tier IV data centre in West Africa. The Tier certification from the Uptime Institute guarantees customers that the infrastructure, capacity and processes are in place to provide a truly maximum level of availability.

“African data centres always face a power supply challenge,” stated Dr Samuel Ankrah, chief executive officer of AIG. “We decided to entrust Etix Everywhere and their proven design and technical know-how to tackle this issue. The Tier IV certification aims to ensure the quality of our infrastructure for our clients.” “Due to the unpredictable nature of African power utilities and to secure an uninterruptible power supply for the data centre, Etix Everywhere proposed to build a solar plant next to the data centre,” said

Ross Macdonald, CEO of Ngoya. “The infrastructure will be able to support the data centre at full capacity.” The solar plant will be built and operated by Etix Energy, a subsidiary of Etix Group. Based on the modular design developed by Etix Everywhere, the first phase of the project will offer a capacity of 150 racks. Over half of the first phase is already booked. This data centre marks the next phase of Etix Everywhere’s expansion throughout Africa. Etix Everywhere,

Lenovo and Intel to deliver powerful next-gen supercomputer to Leibniz Supercomputing Centre

Lenovo and Intel will deliver a next-generation supercomputer to the Leibniz Supercomputing Centre (LRZ) of the Bavarian Academy of Sciences in Munich, Germany. One of the foremost European computing centres for professionals in the scientific, research and academic communities, LRZ is tasked with not only managing exponential amounts of Big Data, but processing and analysing that data quickly to accelerate research initiatives around the world. For example, LRZ recently completed the world’s largest simulation of earthquakes and resulting tsunamis, such as the Sumatra-Andaman earthquake. This research enables real-time scenario planning that can help predict aftershocks and other seismic hazards. Upon its completion in late 2018, the new supercomputer (called SuperMUC-NG) will have a staggering 26.7 petaflop compute capacity, and will support LRZ in its groundbreaking research across a variety of complex scientific disciplines, such as astrophysics, fluid dynamics and life sciences, by offering highly available, secure and energy efficient high performance computing (HPC) services that leverage industry leading

technology, optimised to address a broad range of scientific computing applications. The LRZ installation will also feature the 20-millionth server shipped by Lenovo, a significant milestone in the company’s data centre history. Lenovo,



Huawei announces research partnership with Trinity College

Huawei has announced a new research partnership with Trinity College Dublin as part of its growing R&D footprint in Ireland. At an event at Trinity College Dublin, Guo Ping, Huawei deputy chairman and rotating CEO, also announced the expansion of the company’s Cork R&D operation, which is growing from a small team to nearly 20 highly skilled staff. These developments bring Huawei’s R&D investment in Ireland to $21 million (€17.7 million) in 2017, a significant increase from 2016. As part of his visit to Dublin, Guo Ping will also meet Leo Varadkar, the Taoiseach, to update him on Huawei’s 13-year presence in Ireland. Huawei now employs over 160 people in Dublin, Athlone and Cork across its business and R&D operations, of whom 75% are locally recruited. Taoiseach Leo Varadkar welcomed the announcement, commenting, “Huawei’s continued investment in Ireland illustrates the innovative technology ecosystem we have developed, with more and more major international tech firms basing and growing their operations here. Bilateral trade between Ireland and China is now worth over €12 billion each year, and by strengthening our links with companies like Huawei we can increase this further in the years ahead.” Huawei,

Dutch colocation provider NLDC partners with cloud broker Cloudwirx

Dutch colocation provider NLDC has announced it is partnering with Cloudwirx, with the objective of bringing compelling cloud solutions to market. Cloudwirx is a procurement and systems integrator for data centre and bandwidth infrastructure services, helping companies to source, build and optimise their IT systems and networks. Many Cloudwirx customers need an end-to-end solution consisting of both colocation and connectivity services in Europe as their businesses continue to grow and expand in global reach. NLDC is the largest colocation provider in the Netherlands, with firm roots in the Dutch economy, and has been part of the Dutch digital ecosystem from the very beginning. The Netherlands continues to be a favourite destination for US and Asian IT clients due to its favourable business environment and central location. “We’re very pleased to become a Cloudwirx preferred supplier for data centre services provided from the Netherlands,” said Claartje Mangert, managing director of NLDC. “As more and more businesses look to outsource their infrastructure services, the Netherlands – and in particular Amsterdam – has proven to be a strategic European hub from which to host business-critical IT systems. Cloudwirx will also bring NLDC more exposure to business opportunities outside the Netherlands and Europe.” Cloudwirx,

SolarWinds acquires Loggly and strengthens portfolio of cloud offerings

SolarWinds has announced it has completed the acquisition of Loggly, a provider of cloud-based log monitoring and log analytics software. With the transaction, the company will also add to its team of software engineering talent. Loggly is a SaaS-based, unified log monitoring and log analytics product that aggregates, structures and summarises log data so users can analyse and visualise their data to answer key questions, spot trends and deliver actionable reports. The acquisition complements the company’s existing portfolio of SaaS-based cloud monitoring solutions, and SolarWinds plans to continue investing to innovate and enhance Loggly. With the acquisition, SolarWinds will also deepen its cloud-software engineering and analytics expertise. Former Loggly executives Manoj Chaudhary, CTO and VP of engineering, and Vito Salvaggio, VP of product, will join SolarWinds as leaders in engineering and product, respectively. Members of the core development, operations, support, sales and marketing teams will transition as part of the transaction. The Loggly acquisition is the latest advancement toward SolarWinds’ vision of enabling a single view of infrastructure, applications and digital experience management. Loggly will offer a solution to address use cases where customers need log monitoring and log analytics with structured log data and aggregated events. SolarWinds,



Produce World deploys Silver Peak EdgeConnect SD-WAN, increasing available bandwidth 500%

Silver Peak has announced that Produce World, one of the largest expert growers and suppliers of high quality fresh vegetables in Europe, has deployed the Silver Peak Unity EdgeConnect software-defined WAN (SD-WAN) solution to connect employees in its remote branch locations to business applications. Since deploying EdgeConnect, Produce World has increased its available bandwidth more than five-fold. As a dispersed organisation with five production sites spanning the UK – four factory sites and a smaller 10-user site in Scotland – Produce World needed to increase bandwidth to support more advanced video and audio conferencing, as well as business critical applications including Citrix. With each site running on a combination of asymmetric digital subscriber line (ADSL), fibre leased lines and broadband connections, with multiprotocol label switching (MPLS) back-up links, Produce World wanted a

solution that would give it the highest level of visibility and control over its network traffic, while assuring application service level agreements (SLAs) over any combination of transport services. Finally, the company wanted to improve internet connectivity for both staff and visitors. After assessing the market, Produce World elected to deploy the Silver Peak

EdgeConnect SD-WAN solution due to its centralised control, ease of deployment, and its ability to extend application SLAs to its dispersed locations. Since deploying the SD-WAN solution, Produce World is now able to take advantage of five times more available bandwidth than it had previously. Silver Peak,

Bulk Infrastructure partners with Epsilon to accelerate cloud access in the Oslo Internet Exchange
Epsilon has partnered with Bulk Infrastructure to deliver rapid access to the world’s leading cloud service providers. Bulk Infrastructure will be deploying the Cloud Link eXchange (CloudLX) module of the Infiny by Epsilon on-demand connectivity platform in its Oslo Internet Exchange (OS-IX) carrier hotel.

Service providers in the Oslo Internet Exchange will now be able to rapidly interconnect new services globally in and out of Bulk’s OS-IX facility at the click of a button. The deployment creates a major aggregation point for local network service providers, internet service providers, carriers and cloud service providers. Epsilon will be deploying a full Point-of-Presence (PoP) within Bulk’s OS-IX facility, expanding its global interconnect fabric in the Nordics and enabling it to serve local and global customers. Bulk Infrastructure is a Norwegian data centre infrastructure developer specialising in the development of data centre real estate, data centre services and fibre optic infrastructure with a focus on dark fibre. Its OS-IX carrier hotel facility acts as an aggregation centre and enterprise connect point for Norway. Bulk’s N01 Campus is located in Vennesla, southern Norway. With more than 300 hectares next to 3,600MW of hydroelectric power, the N01 Campus aims to become the world’s largest data centre campus on 100% renewable energy. Epsilon,


Bitdefender and Netgear partner to bring IoT security to customers worldwide
Bitdefender has announced a technology licensing agreement allowing Netgear to include Bitdefender’s IoT security technology on certain of its networking devices. With advanced threats leveraging vulnerabilities in smart devices, this joint offering will secure IoT devices at the Wi-Fi router level, reducing the threat of attacks and protecting sensitive user data from cybercriminals. Consumers can count on a comprehensive IoT security technology with the capability to detect devices within the Netgear Nighthawk Wi-Fi router network and identify those with vulnerabilities. Ciprian Istrate, Bitdefender’s vice president of Consumer Solutions, stated, “We are excited about our new relationship with Netgear. The integration of our solutions will help prevent damaging breaches and attacks through various types of IoT devices, from doorbells to thermostats to baby monitors to security cameras to smart TVs. We’re pleased that our long history of breakthroughs in IT security can now be a part of a new offering that helps protect IoT devices from cyber criminals.” Bitdefender, Netgear,


DADI announces partnership with Netwise
Technology company DADI has announced a partnership with Netwise, a provider of server colocation and data centre services. Netwise offers private facilities in London, designed and built entirely in-house, delivering end-user content on a national and international scale. It is also a pioneer in green colocation solutions, offering highly efficient rack space powered by 100% renewable energy – an issue of equal importance to the DADI team. The collaboration between Netwise and DADI will bring industry insight, consultation and support during DADI’s network development in 2018. It will also provide spare capacity for the network as it grows. DADI is a global decentralised cloud services platform, built using blockchain technology and offering compute power, database storage, content delivery and other functionality to help businesses grow. It represents a radical overhaul of the cloud computing sector by using cost-efficient fog computing, organised by a DAO, to provide web services for building digital products. This means it can offer cheaper and faster cloud hosting – with projected savings of up to 90%. “We are very happy to be supporting DADI as they go to market with their innovative new decentralised cloud platform. As avid supporters of bleeding-edge technologies, we are very much looking forward to helping DADI develop this system by supporting their growing core infrastructure requirements,” commented Matt Seaton, senior manager at Netwise Hosting. Netwise, DADI,

Megaport provides enterprises with direct access to IBM Cloud
Megaport has announced that it now provides direct, secure connectivity to IBM Cloud. Enterprises can now access high-speed, dedicated network connections to IBM Cloud from any of Megaport’s 179 data centres globally to help accelerate the transfer of business-critical data between private infrastructure and the cloud. With IBM Cloud Direct Link, Megaport accelerates cloud adoption for the enterprise, enabling them to architect a hybrid environment that connects on-premises infrastructure, private cloud, and public cloud services. Megaport customers can access IBM Cloud’s expanding global footprint and cloud-native services such as AI, analytics, blockchain, Internet of Things, serverless and more. In the race to provide the enterprise with better IT performance and secure connections to cloud-enabled applications, Megaport-enabled data centres provide a global footprint that extends beyond major metros to provide connectivity to previously underserved regions. Additionally, performance, security and regulatory compliance are all top of mind for enterprises. Megaport’s platform helps solve these barriers to entry for cloud adoption. Megaport,


Nlyte furthering commitment to EMEA with Simac ICT Belgium announcement
Nlyte Software has announced a partnership with Simac ICT Belgium, a technology organisation that provides personalised improvements to business processes. Simac ICT Belgium provides essential business improvement services throughout Belgium: business management solutions covering network monitoring, application performance monitoring, network tracing and forensics, and server KPIs and integration, as well as hybrid ICT solutions and professional services.

The new European partnership will see Nlyte Software working with Simac ICT Belgium’s Cabling and Infrastructure business unit to provide data centre infrastructure management solutions. This will ensure that organisations throughout the region have access to industry-leading products that can be introduced into existing infrastructures to improve efficiency and give more accurate and timely feeds of information. Simac ICT Belgium works with hundreds of organisations across a plethora of industries and, through the partnership with Nlyte Software, existing and future customers will benefit from solutions that give them the reliability and robustness to generate maximum output. The new strategic partnership will see Nlyte Software’s solutions supplied with Simac ICT Belgium’s trademark quality customer service, offering 360° support including expert implementation, on-site support and consultancy. Nlyte,

Secure IT Environments completes Ridgeons data centre relocation project
Secure IT Environments has announced the completion of its data centre relocation project for Ridgeons, the timber and builders’ merchant. The project involved the closure of a data centre, originally built by Secure IT Environments in 2007, and relocation to new facilities on an existing Ridgeons depot site, following the sale of land in Cambridge. The new site introduced design challenges due to its smaller size and location; however, given the team’s close relationship with Ridgeons, a design solution was found and the risks associated with planning the transfer of services were mitigated. The project was completed in just 10 weeks. New developments in server and cabinet technology meant no compromise had to be made on the compute and storage power available to Ridgeons, despite the new data centre site being smaller. The project included a new incoming power supply, UPS in N+1 format, energy-efficient air conditioning, structured cabling, access control/CCTV, a Novec fire suppression and detection system, 19in cabinets with intelligent power distribution and environmental monitoring, as well as general building works. Secure IT Environments will also relocate the existing generator as part of decommissioning the original site, creating cost savings for Ridgeons. Secure IT Environments,


Fast, simple, effective: Edge data centre for innovative IoT solutions
Companies that employ machine-to-machine communication to streamline manufacturing require real-time capabilities. IT resources deployed in close geographical proximity ensure that latency is low and data readily available. The Rittal Edge Data Centre provides an effective answer to this need. It is a turn-key, pre-configured solution based on standardised infrastructure. It can be implemented rapidly and cost-efficiently – paving the way for Industry 4.0 applications. The sensors and actuators deployed in smart production systems continuously relay information on the status of processes and infrastructure. This forms the basis for innovative services – such as alerts, predictive maintenance, and machine self-optimisation – delivered by the company’s IT department in real time. To make this possible, and to rapidly respond to events and anomalies, low latency between production and IT infrastructure is critical.

Fast, simple, effective
A remote cloud data centre is unable to support these scenarios. The solution is edge computing, i.e. computing resources at the perimeter of a given network. With this in mind, Rittal has introduced a new edge data centre: an end-to-end product with standardised, preconfigured IT infrastructure. The Rittal Edge Data Centre comprises two Rittal TS IT racks, plus corresponding modules for climate control, power distribution, UPS, fire suppression, monitoring and secure access. These units are available in various output classes, and can be easily combined for rapid deployment. Moreover, to safeguard critical components from heat, dust and dirt in industrial environments, the data centre can be implemented in a self-contained high-availability room. The Rittal Edge Data Centre can be extended two racks at a time. The modular approach also provides customers with diverse options, allowing it to accommodate a variety of scenarios – for example, installation in an IT security room, or in a container, to be located wherever it is required. The Rittal Edge Data Centre will be on show at Data Centre World (ExCeL, London, 21-22 March 2018) on Stand D510. Rittal,

Edgecore Networks introduces cost-effective Gigabit Ethernet web-smart switch
Edgecore Networks has announced its latest generation of cost-effective web-smart switches – the ECS2020 series. The new web-smart ECS2020 series is designed for SMB markets and provides a complete solution in 10- and 28-port configurations, including both non-PoE and PoE options. The series offers complete PoE solutions from 70W to 190W for VoIP, surveillance, and Wi-Fi APs. The series includes four models – the ECS2020-10T, ECS2020-10P, ECS2020-28T and ECS2020-28P. The switches support 8/24 x 1GbE Base-T ports and 2/4 integrated Gigabit SFP ports. Besides providing more uplink bandwidth, the SFP ports can be used for redundant links. The ECS2020 series can operate at temperatures from 0-50°C; the ECS2020-28P includes cooling fans, whereas the ECS2020-10T/10P/28T are fanless designs. The series provides 4kV surge protection on Ethernet ports, which can prevent damage to the network caused by power surges and lightning strikes. The new web-smart series is designed with powerful software features; the switches support web and SNMP v1/v2c/v3 management, provide CLI operation over a telnet connection, and offer both IPv4 and IPv6 management, ensuring networks can upgrade from IPv4 to IPv6. The series also supports automatic voice/surveillance VLANs, giving VoIP devices and IP cameras optimal network traffic handling. The ECS2020 series also supports multiple languages (TW/CN/English), enabling all users to operate the switch easily. Edgecore,



Schneider Electric introduces APC Smart-UPS with SmartConnect for intelligent UPS management through the cloud
Schneider Electric has introduced Smart-UPS with APC SmartConnect, the first and only cloud-enabled uninterruptible power supply (UPS) for distributed IT environments. The solution enables businesses, particularly small and medium-sized businesses (SMBs) that have limited IT staff and resources, to proactively and effectively manage the health of their UPS systems. UPS battery failure on any piece of equipment is undesirable, but on the most business-critical technology it could mean catastrophic business delays and profit loss. To compound the criticality, today’s IoT-enabled world means these devices are likely supporting onsite and remote edge environments that must function at the same level of availability and security as the largest and most mission-critical data centres. Available as a standard feature with select models in APC’s Smart-UPS portfolio of solutions, SmartConnect delivers the power reliability, security and certainty that SMBs need to stay connected to the technology and information that powers their business.

SmartConnect leverages the Schneider Electric cloud-enabled EcoStruxure IT architecture to:
• Gather and send data about the health and status of a customer’s UPS devices, including battery replacement, warranty renewal and UPS performance notifications.
• Provide a secure, cloud-based web portal where customers can view the status of their UPS, accessible from any internet-connected device.
• Send customisable automatic notifications, firmware update notifications and advanced troubleshooting support through an easy-to-use remote monitoring interface.
• Deploy right out of the box – no configuration required – making it easy for even non-technical users to install.
SmartConnect cloud-powered technology also enables managed service providers (MSPs) to expand their offerings to deliver remote UPS monitoring for SMB clients. This provides MSPs with a greater opportunity to better serve their customers through value-added power infrastructure services while generating new revenue streams – all with minimal effort and no additional cost. Schneider Electric,

Installers overestimate cable certifier usage
Data cable installers are overestimating their usage requirements for cable certifiers, according to Ideal Networks, suggesting that this may be generating extra costs for contractors and installation companies. A cable certifier is only an essential test tool on jobs where the specification explicitly states that a certifier or a cable manufacturer warranty is required. Ideal Networks’ analysis of the work typically carried out by data cable installers showed that manufacturer warranties were required in around 25% of jobs. Therefore, cable installers may not actually need to invest in more certifiers to improve productivity. For the other 75% of jobs, where a warranty is not needed or the job specification does not require a certifier, a cable and network transmission tester, such as Ideal Networks’ SignalTEK CT or SignalTEK NT, can be used to create comprehensive proof-of-performance reports to industry-recognised standards. This type of tester is easy to use, allowing installers to gather data that is sufficient for their own records and that can be shared as professional reports with clients. Importantly, these testers are far more affordable and, in most organisations, it is feasible for all installers to have their own. So rather than investing in more certifiers, businesses can instead meet their customers’ testing requirements with a mixed fleet. By combining a greater number of transmission testers with a smaller number of certifiers, such as Ideal Networks’ LanTEK III, businesses could reduce capital expenditure on testers by up to 57%. To help businesses select the right mix of test equipment for their needs, Ideal Networks has launched Test4Less, a suite of solutions, care plans and payment options. Following 18 months of research with data cable installation businesses, Test4Less is designed to help reduce capital expenditure, increase productivity and improve cash flow. Ideal Networks, Test4Less,


final thought

Sound Advice
Rob Perry, chief product officer at ASG Technologies, lends a helping hand to companies that are struggling to manage volumes of content, giving us four top tips to help solve traditional ECM problems.


In some situations, content is king and a key driver of customer interactions that can help lay the foundations of core business processes, helping guide decision-making. However, content can become problematic if not dealt with in a systematic and effective manner. Document volumes are always growing and regulations are becoming more complex, while users demand a clear and concise management system. Unfortunately, the volume and complexity of problems businesses face are typically compounded by unwieldy legacy solutions, with many enterprises using multiple systems to store content. This article considers the shortcomings of this increasingly outdated approach and how modern content services platforms can provide a solution.

Three areas where traditional ECM is failing
A recent ASG-commissioned technology adoption profile study, ‘Today’s Enterprise Content Demands a Modern Approach’, by Forrester Consulting, found 95% of respondents were using more than one system to manage enterprise content, including 31% using five or more systems. This leads to disjointed information and difficult access. Lack of flexibility is therefore one clear shortcoming of existing approaches to ECM. Organisations want to invest in systems and technology that allow them to grow and adapt to changing markets, but traditional ECM often hinders their progress. Further, 82% of respondents reported an increase in unstructured data in the form of business content, like office documents, presentations, spreadsheets, and rich media. They are also managing transactional content from outside the organisation. Traditional ECM systems struggle to cope with this level of growth due to another key shortcoming – their inability to scale. Most traditional ECM solutions also struggle to manage growing regulatory and security requirements. As the study highlights, “Sharing content with external parties is becoming the norm. But with that comes expanding regulatory and compliance demands and an increased urgency to protect both customer and enterprise data.”

What’s needed from a solution
Modern enterprises can leverage content services to manage assets across multiple content repositories, whether in the cloud or on-premise, and keep that information in its native form while still making it easily accessible. By providing controlled access and integrating content from any device, anywhere, these solutions can effectively scale to accommodate growing data volumes, while breaking down the repository walls created by proprietary systems and allowing content to be stored in public and private clouds, on-premise and hybrid environments for greater flexibility and savings. At the same time, however, enterprises need to be aware that the regulatory environment is becoming ever more complex, and its dynamic nature means processes must be put in place to effectively ensure compliance and business success.

Four top tips to help solve ECM content problems

“Lack of flexibility is one clear shortcoming of existing approaches to ECM.”

1. Recognise that technologies alone do not solve the problem of getting content into the right hands when organisations are making business decisions. Today’s content solutions connect people with the business and content they need to make decisions, disseminate knowledge and collaborate with customers and colleagues.
2. Look for purpose-built, decoupled content services architectures, such as ASG’s Mobius, to manage content. Build your content services infrastructure for a mobile-first workforce and look for platforms that expose specific ECM capabilities as services rather than fully formed features.
3. Seek vendors that deliver transparent, contextual access. Content should be delivered to the users’ workflow through an intuitive process offering options through a policy-controlled ‘learning’ process.
4. When reviewing content services architectures, look for those that have granular policy management services to provide content with contextual meaning, as well as how it should be governed. This rules-based policy foundation approach to content management is becoming more powerful in a world where regulatory and compliance pressures are constant.

Positive prospects
The cumbersome ECM suites of the past are giving way to flexible content services platforms. These enable access to content across on-premise, cloud-based and hybrid environments at any time and from anywhere, and also give businesses enhanced visibility across their disparate systems. In the modern business world, it will be the businesses that take the plunge and adopt the latest content management approaches that derive the most value from their content and use it most successfully – achieving improved decision-making capabilities, enhancing customer relationships and driving competitive edge.


Data Centre News is a new digital, news-based title for data centre managers and IT professionals. In this rapidly evolving sector it’s vital that data centre professionals keep on top of the latest news, trends and solutions – from cooling to cloud computing, security to storage, DCN covers every aspect of the modern data centre. The next issue will include a special feature examining infrastructure management in the data centre environment. The issue will also feature the latest news stories from around the world, plus high-profile case studies and comment from industry experts. REGISTER NOW to have your free edition delivered straight to your inbox each month, or read the latest edition online now at…

DCN January 2018  