DCR Q2 2022



G-Core Labs Introduces Full-featured Managed Kubernetes Service in Secure 15-Region Global Cloud


Green IT & Sustainability

Colocation & Outsourcing

Getting to grips with refurbished IT

Colo vs cloud, friends or foes?


Final Say Gender equality in the tech sector


News

04 • Editor’s Comment Green means go


06 • News Latest news from the sector.

Features

12 • Edge Computing Simon Michie, Chief Technology Officer at Pulsant, gives a state of play of the edge in the UK, and the potential benefits of implementing a UK-wide edge computing platform.

20 • Green IT & Sustainability Can the sector respond to the climate crisis in time, asks Simon Harris, Head of Critical Infrastructure at BCS.

28 • Colocation & Outsourcing Terry Storrar, Managing Director UK at Leaseweb, discusses whether cloud and colocation can be complementary.

32 • Storage, Servers & Hardware


To avoid a data disaster, cloud and hardware backups must work together, says Jon Fielding, Managing Director at Apricorn.


Regulars

34 • Industry Insight Kerry Osborne, Sustainability Lead at Coeus Consulting, discusses the challenges posed by actioning sustainability initiatives in IT.

40 • Products Innovations worth watching.

42 • Final Say


Gender equality is critical to the technology sector’s continuing growth, says Lorraine Wilkinson, Regional Vice President of Sales UK, at Equinix.



Editor’s Comment

With DCR being a quarterly publication, a lot can happen in the interval between when I write these editor’s comments. While last issue I was hoping for the end of the latest Covid variant – Omicron – this time around, I’m lamenting the state of the world, one month into a conflict between Russia and Ukraine that has been painful and difficult to comprehend for many of us. It feels a little as though the world is stuck in stasis, always moving two steps forward and one step back.

But one thing, I hope, will be a constant that we can rely on – the move towards a greener, more sustainable society. Yes, the progress is slow, and yes, government guidance on how we’re going to get there is oftentimes vague and inconsistent. But overall, the general consensus seems to be heading in the right direction – and both industry and individuals are changing how they operate to minimise their carbon footprint.


Kayleigh Hutchins kayleigh@datacentrereview.com


Jordan O’Brien jordano@sjpbusinessmedia.com


Alex Gold alexg@sjpbusinessmedia.com


Kelly Baker +44 (0)207 062 2534 kellyb@datacentrereview.com


Sunny Nehru +44 (0) 207 062 2539 sunnyn@sjpbusinessmedia.com

The last decade has seen the data centre sector start to scrutinise its impact on the environment, particularly when it comes to energy consumption. Already there has been a push for protocols and new technology to optimise how that energy is used – and this is just the start of the innovation.


But change inevitably takes time. Sustainability needs to be a key consideration right from the initial plans of a data centre build, and we need to be better at thinking about how we waste and reuse equipment. We’re a work in progress – and that’s okay, as long as we don’t lose momentum.


As always, you can drop me an email at kayleigh@datacentrereview.com, and don’t forget to join us on Twitter (@dcrmagazine) and on LinkedIn (Data Centre Review). Kayleigh Hutchins, Editor

Fidi Neophytou +44 (0) 7741 911302


Wayne Darroch
Printing by Buxton
Paid subscription enquiries: subscriptions@electricalreview.co.uk
SJP Business Media, 2nd Floor, 123 Cannon Street, London, EC4N 5AU
Subscription rates: UK £221 per year, Overseas £262
Electrical Review is a controlled circulation monthly magazine available free to selected personnel at the publisher’s discretion. If you wish to apply for regular free copies then please visit: www.electricalreview.co.uk/register
Electrical Review is published by

2nd floor, 123 Cannon Street London EC4N 5AU 0207 062 2526 Any article in this journal represents the opinions of the author. This does not necessarily reflect the views of Electrical Review or its publisher – SJP Business Media ISSN 0013-4384 – All editorial contents © SJP Business Media

Follow us on Twitter @DCRmagazine

Join us on LinkedIn

4 www.datacentrereview.com Q2 2022


News The latest highlights from all corners of the tech industry.

Image Credit: Kao Data

Kao Data expands Harlow Campus with second 10MW data centre

Construction is underway on a second 10MW facility at Kao Data’s Harlow data centre campus. The new facility, KLON-02, will provide 10MW of capacity and space for 1,800 racks of IT equipment across 3,400 sqm. Once fully operational, the site will be NVIDIA DGX-Ready and OCP-Ready. The carrier-neutral KLON-02 will feature an energy-efficient, sustainable design, including the use of 100% renewable energy and hydrotreated vegetable oil (HVO) in its backup generators. Kao’s latest site follows the company’s launch of its Slough data centre in February and its recent investment from Infratil Limited.


Micro Focus opens UK data centre Micro Focus has unveiled a new public cloud data centre in the UK. According to the company, the new facility is part of an ongoing strategy to move away from the private data centre business and provide improved availability, security and data sovereignty to its UK customers. “As a company, we felt as though it was important to provide customers with an alternative public zone for their data to reside, outside of the EU but still within Western Europe,” said Ian Simmons, VP and GM UKI at Micro Focus. “We have supplemented our public cloud offerings to ensure our customers’ needs are met and in addition to our cloud data centre in Frankfurt, this UK data centre reaffirms Micro Focus’ commitment to investing in the region.”


PRINCIPAL ANNOUNCES CLOSURE OF EUROPEAN DATA CENTRE FUND

Principal Global Investors has closed a fund dedicated to acquiring data centre assets in Europe. The ‘Principal European Data Centre Fund’ was completed by the company’s real estate investment arm, Principal Real Estate, and reached €155 million, surpassing its target. The amount is one-third of the €450 million total equity hard cap Principal set for the fund. The fund was raised through seven international investors, including asset managers, pension funds and insurance companies. Principal has said the fund will be focused on manage-to-core data centre assets, and has allocated 60% of it across the Netherlands, France, the UK and Ireland, with the remaining 40% to be distributed to secondary markets, such as Spain, Italy and Switzerland.

Telehouse opens fifth Docklands data centre

Telehouse has opened its fifth data centre at the Blackwall Yard campus in London Docklands. According to Telehouse, the new site – Telehouse South – is its largest facility yet. The first floor provides capacity for up to 668 racks and 2MW of power, with an upgrade scheduled for the end of 2023 to add a further 2.7MW. At full buildout, the 31,000 sqm facility will provide 12,000 sqm of colocation space and a total power capacity of 18MW. Like the company’s other facilities, Telehouse South is powered by 100% renewable energy from wind, solar, biomass and hydro generators. A network of 7,000 dark fibres, across two diverse routes, interconnects the new facility with the existing four data centres. Upon completion, Telehouse will have invested £223 million in Telehouse South, with its total investment in the Docklands campus set to reach £1 billion by 2025.


Image credit: Telehouse


Image credit: Valeriya Zankovych / Shutterstock.com

NTT opens seventh data centre in London

NTT Global Data Centers has unveiled its seventh London data centre site, located in Hemel Hempstead. The facility, called Hemel Hempstead 4, was opened in December 2021. Providing half its full capacity in phase 1, it will add a further 9,600 sqm of space and 24MW of IT load when complete. NTT is also planning to expand capacity at its Dagenham site, London 1, which was opened in 2020. Its London data centres currently provide more than 53MW of capacity, with NTT’s future investment plans potentially increasing this to over 120MW. According to the company, Hemel Hempstead 4 will also be powered by 100% renewable energy and has been designed for low water usage effectiveness (WUE). Masaaki Moribayashi, President and Board Director at NTT, said, “As the third-largest data centre provider in the world, NTT continues to significantly accelerate our investment into the London and UK market as a key global location. This investment provides a scalable data centre and connectivity infrastructure for our clients’ digital transformation needs.”

Equinix completes $320m acquisition of MainOne

Equinix has completed its $320 million deal to acquire West African data centre and connectivity solutions provider MainOne. According to a statement issued by Equinix, “The completion of this acquisition augments Equinix’s long-term strategy to become a leading African carrier-neutral digital infrastructure company by being able to bring a full range of transformative technologies and connectivity to Nigeria, Ghana and Cote d’Ivoire.” The investment will extend Platform Equinix into West Africa, providing organisations based inside and outside of Africa access to global and


regional markets. MainOne, headquartered in Lagos, was founded by Funke Opeke in 2010 to enable connectivity for the business community of Nigeria. It provides 64,000 sqft of space across four operational data centres, with an extra 570,000 sqft of land available for future expansion. MainOne operates more than 1,200 km of terrestrial fibre across Lagos, Edo and Ogun States, and its subsea cable network stretches 7,000 km between Nigeria and Portugal. MainOne will now operate under the name MainOne, an Equinix company, and Opeke will continue to lead under the new brand.

Image credit: Michael Vi / Shutterstock.com

Boeing forms multi-cloud partnership with AWS, Google and Microsoft

Boeing has formed partnerships with Amazon Web Services, Google Cloud and Microsoft, which will help the company expand its cloud operations, “creating a single foundation for the company’s approach to cloud computing in the years ahead,” it said in an announcement.

According to Boeing, its applications have previously been hosted on on-site servers, managed by Boeing or external partners. With its legacy systems ageing and difficult to maintain, there have been difficulties “developing and deploying digital solutions across the company.” Boeing said it hopes the new deals, which build on existing relationships with each provider, will remove these infrastructure constraints, allowing for the development of applications across the business.

“One of the biggest challenges to traditional hosting solutions is scalability – predicting, procuring, maintaining and paying for servers before a developer ever writes a single line of code,” said Susan Doniz, Boeing Chief Information Officer and Senior Vice President of Information Technology & Data Analytics. “Cloud adoption unlocks those challenges by allowing developers to tap into additional storage or capacity when they need it. It’s like having a nationwide broadband network and we’re still using dial-up.”

“These partnerships will strengthen our ability to test a system – or an aircraft – hundreds of times using digital twin technology, before it is deployed,” added Doniz. “Our partners will help Boeing take advantage of the best the industry has to offer while enabling employees to tap into leading tools, training and experts to improve skills and learn new ones.”

The cloud partnerships will also work hand-in-hand with Boeing’s sustainability goals, the company said, giving it access to more energy-efficient technologies, streamlined data centres, and digital tools and testing to help reduce its carbon footprint.


Box clever

Andrew Wreford, Rittal’s Product Manager for IT Systems, explains how the company’s ‘Data Centre in a Box’ is providing a sustainable data solution for Oxford University’s Gardens, Libraries and Museums division.

Oxford University’s Gardens, Libraries and Museums division (GLAM) forms one of the greatest concentrations of university collections in the world, holding over 21 million objects, specimens and printed items. Faced with the challenges of increased data demand, the Museum of Natural History – one of the museums within GLAM – wanted to upgrade its IT infrastructure to house the core network switches responsible for running its services. A major rewiring project was undertaken with the aim of significantly improving data connectivity for computers, phones and next-generation devices. The wiring presented a challenge in itself, as the historically significant listed building was not designed to accommodate the space for conventional hardware. This required ingenious methods to work with the fabric of the building. Faced with these challenges, Anjanesh Babu



– the Technical Project Lead in the Gardens, Libraries and Museums IT team – researched the options available. The traditional approach would be for the designated network core of a building to be stripped bare and rebuilt with air conditioning and electrics to meet the requirements of the equipment. However, given the nature of the building, this would present a number of challenges, including space constraints and cooling loss through the surfaces. The design approach was led by GLAM’s sustainability strategy. Babu approached Rittal’s IT team, who quickly identified the ‘Data Centre in a Box’ (DCiB) concept as a possible option. DCiB replicates the key data centre capabilities, but on a smaller scale – and has been developed to enable equipment to be deployed in non-traditional data centre environments. The turnkey package provides IT racks, demand-oriented climate control, PDUs, monitoring and fire suppression – a complete solution from product selection through to installation and ongoing maintenance.

When installed in the Museum of Natural History, the cooling footprint would be significantly lower than with traditional full-room air conditioning, and the absence of any work to the space to accommodate the system would mean the building remained relatively untouched. A site visit by Joel Farrington, Rittal’s Area Sales Manager for IT, was arranged and the requirements gathered. “The system was to be located in the museum’s basement, which had restricted access with a very narrow staircase and doorways. In addition to this, the building’s listed status would mean that any cooling equipment would have to be positioned cleverly and with the utmost consideration, not only to aesthetics but to any noise pollution emitted,” recalls Farrington. Farrington and members of the Rittal IT development team, Clive Partridge and Andrew Wreford, worked with Babu to identify the key objectives that needed to be achieved. “Given the kW loads and environment of the proposed location, it became clear that the DCiB’s LCU option was the best way to go, and


we quickly built up a package including racks, accessories, cooling, fire suppression, PDUs and monitoring. “To mitigate the access restrictions, we used the ‘rack splitting/re-joining’ service which enabled us to resolve the challenge of space limitations of the project,” says Partridge, Rittal’s Technical IT Manager. Rittal provided an end-to-end solution from the manufacture of kit, to the installation, commissioning and hand-over. To overcome the issues with the listed building status, Rittal’s IT team worked in collaboration with Babu and the lead contractor, Monard Electrical, to find a suitable home for the condenser.

“Rittal’s DCiB allowed the museum to utilise the proposed location without having to make costly building modifications, thus saving time, energy and effort,” reflects Babu on the options deployed. By adopting ‘in-rack’ precision cooling instead of ‘in-room’ cooling, the location is

more environmentally efficient, and this controls operational expenditure. Cooling via the high-performance LCU option provides temperature consistency, allowing better care of equipment along with nearly silent operation. Not only is the installation providing energy efficiency and longevity for the museum, there is the added benefit of noise reduction in the room compared to an existing server room utilising in-room cooling. Haas Ezzet, Head of IT at GLAM, contextualises this piece of work as part of the “Museum’s drive towards greater environmental sustainability. The approach piloted here, of focussing climate control specifically on the area needed – the data cabinet – rather than the entire space in which it is housed, will optimise energy consumption and afford a blueprint for other spaces within GLAM and beyond.” Further information at www.rittal.co.uk and www.friedhelm-loh-group.com, or on Twitter @rittal_ltd.



The view from the edge

Image Credit: Shutterstock.com



In this Q&A, Simon Michie, Chief Technology Officer at Pulsant, gives us a state of play on the edge in the UK, and the potential benefits of implementing a UK-wide edge computing platform.

What are the benefits edge could offer UK businesses? Put simply, edge computing is the adoption of decentralised strategies which enable data, applications and content to be processed and managed at the network edge. By bringing applications and data processing closer to where data is collected, organisations can benefit from faster access and lower latency, enabling better performance, reduced costs and strengthened security. This will prove crucial for businesses under increasing pressure to improve user experience and performance to gain a competitive advantage. Most immediately, edge computing offers benefits to those businesses where low latency is business-critical, such as financial services. It will also be beneficial for those businesses that have deployed hybrid and remote working strategies and have dispersed workers that require fast access to the cloud but are often located too far geographically from main providers. In the long term, this will also open doors for the use of emerging technologies, such as AI and machine learning, to improve efficiency and help with data sovereignty post-Brexit. With edge computing, sensitive data can be processed at a local level, completely separate from the cloud, helping to strengthen compliance and reduce security concerns.
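The latency benefit of proximity is easy to see from first principles. The sketch below is purely illustrative (the distances, the 200 km/ms fibre propagation figure, and the function names are assumptions for the example, not figures from Pulsant), showing only the distance contribution to round-trip time:

```python
# Back-of-the-envelope round-trip latency from fibre distance alone.
# Illustrative assumptions only: real networks add switching, queuing
# and processing delays on top of propagation.

SPEED_IN_FIBRE_KM_PER_MS = 200  # light travels roughly 200 km/ms in optical fibre

def fibre_rtt_ms(distance_km: float) -> float:
    """Minimum round-trip time over fibre, from propagation delay alone."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS

# A user 50 km from a regional edge hub vs 1,500 km from a distant cloud region
edge_rtt = fibre_rtt_ms(50)      # 0.5 ms
cloud_rtt = fibre_rtt_ms(1500)   # 15.0 ms
print(f"edge: {edge_rtt} ms, distant cloud: {cloud_rtt} ms")
```

Even before congestion or processing is considered, moving the compute an order of magnitude closer cuts the floor on latency by the same order of magnitude.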

What do you see as the main challenges facing the edge? The main challenge currently is a lack of understanding. The technology is still in its infancy, which means there are a lot of different definitions of what edge computing is, and many enterprises don’t fully comprehend how it works or the use cases for their organisation. There are also a lot of vendors promising edge services and solutions, which can be overwhelming and cause businesses to steer clear of the technology for fear of not knowing where to start. Others are looking for guidance and expertise on the best way to implement it and where to position workloads and data, so there’s a level of education that also needs to take place first, from businesses and the channel. Other challenges include concerns over cost, complexity and access – the latter of which is a big driver for the need for a UK-wide edge computing platform.

What does a UK-wide edge computing platform look like? A UK-wide edge computing platform essentially provides a grid-like architecture that enables the processing of data close to edge devices and bridges the gap between the micro-edge and centralised platforms. At its core, it is a network of strategically located and interconnected data centres across the UK, with fast connectivity to cloud services through a low latency and agile network, connected by high-performance fibre. With an edge computing platform, organisations can spread network traffic across multiple data centres, or regional hubs, to improve latency and reduce network congestion and data transaction costs. The best platforms provide support to a multi-cloud ecosystem and uninterrupted availability and resilience, with route diversity around the UK to protect against any major fibre outages.
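At its simplest, the grid-like routing described above means sending each client to the nearest regional hub. This sketch illustrates the idea only; the hub names, coordinates and selection policy are invented for the example (a real platform would route on measured network latency and load, not great-circle distance):

```python
import math

# Hypothetical regional edge hubs (name, latitude, longitude) — illustrative only.
HUBS = [
    ("london", 51.51, -0.13),
    ("manchester", 53.48, -2.24),
    ("edinburgh", 55.95, -3.19),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_hub(lat, lon):
    """Pick the hub with the shortest great-circle distance to the client."""
    return min(HUBS, key=lambda h: haversine_km(lat, lon, h[1], h[2]))[0]

print(nearest_hub(55.86, -4.25))  # a client in Glasgow lands on the Edinburgh hub
```

Spreading traffic across such hubs is what lets a platform keep regional users within a few milliseconds of their workloads while still offering route diversity if one path fails.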

What are the benefits of a unified UK network? A unified network provides the infrastructure and access that businesses need to be able to deploy resilient multi-cloud and edge strategies, regardless of location. It means that businesses no longer need to locate themselves near central business hubs, and can benefit from the same speed and service anywhere in the UK. For example, Pulsant’s edge computing platform can deliver sub-5 millisecond latency to over 95% of the UK population. For regional businesses, it is a true game changer, enabling them to compete on a level playing field against those located in central hubs. For SaaS providers, it can help improve performance and competitive advantage, and enable them to reach their UK customer base more quickly and readily.

What are the future possibilities for edge computing in the UK? The ultimate end goal of edge computing is that location will no longer matter. It will enable every UK business to benefit from the best digital products and services and to achieve their full potential. Currently, the edge computing market is still young. Our research suggests that less than 10% of today’s workloads need less than 10 milliseconds’ latency – however, this will grow. We’ve already seen robust growth in the edge computing market over the last five years, and I certainly expect this to continue. Edge is already starting to transform IoT, content delivery and enterprise applications. Moving forward, I think we’ll see it being applied to autonomous vehicles, drones, virtual reality, real-time advertising and even remote healthcare. Then, in the longer term, we will see the creation of entirely new classes of applications which will allow current SaaS providers to ‘edgify’ their applications.



Image Credit: Shutterstock.com

Is enterprise compute ready for the edge? Jason Matteson, Director of Product Strategy at Iceotope, discusses whether the industry is ‘edge-ready’ and why an edge transition plan is so important.

There is a perfect storm coming our way. Data is being generated at a pace we have never seen before. We contribute to 6.9 billion searches on Google every day, and WhatsApp users exchange up to 65 billion messages in a single 24-hour period. Everything is collecting data. Smart cities, smart cars, even smart doorbells. All these devices and sensors require processing and analysis to make them useful. When data sits unused, it costs businesses between $9.7 million and $14.2 million annually. This data explosion continues to provide challenges for the enterprise. Data, artificial intelligence (AI) and machine learning are all rapidly becoming intertwined. AI is fed by data, and making sense of data requires AI. Gartner Research predicts that 75% of enterprise-generated data will be ‘created and processed outside a traditional centralised data centre or cloud’ by 2025. To cope with that level of change, our industry must create the platform to support low-latency, dense compute capabilities within edge data centres. These platforms need to offer at least the same server resiliency and serviceability as those in larger scale sites if the expansion at the edge is to be effective.

Finding solutions

A clear solution to edge server environment design is chassis-level precision immersion liquid cooling. There are several variant solutions that address these edge conditions, and most offer a sealed chassis which creates a controlled environment that is impervious to dust, gases and humidity. These solutions are also able to maintain data centre compute density while offering improved energy efficiency; this allows high-speed, higher processing power servers to be efficiently cooled by liquid, compared to the inefficiency of air-cooling systems. Sealed chassis servers also ensure that external environmental factors do not affect the compute capability of the edge system. Autonomous vehicles are often cited as examples of high-performance compute at the edge. This is for good reason, as they are constantly generating data for predictive analytics and search patterns to keep drivers safe on the road. In a split second, the data needs to be filtered, analysed and moved. If you are trying to predict whether someone is going to have a car accident, latency becomes a critical issue when moving data back to a centralised data centre. The infrastructure needs to support the speeds and feeds of the data being generated, otherwise you can have a very serious problem on your hands. Consider, as well, the retail environment. Real-time data is used to improve the in-store customer experience. However, the equipment and servers that are needed for that capability have to be in a form factor


suited for a retail environment. Floor space is a premium asset, and any computing device reducing floor displays or stock room footprint is costing the retailer money. Liquid-cooled compute solutions come in form factors identical to air-cooled servers, with the benefit of greatly increased compute density in similar footprints, without the requirement for additional expensive air-cooling systems. As the move to the IoT and edge computing continues, colocation is becoming an option for organisations that don’t want to manage hundreds of distribution points. However, it is also likely to be a greater point of disruption from AI. Colocation facilities were designed for legacy, traditional, non-compute-intensive applications at 5 to 8 kW per rack. If multiple tenants are deploying AI and machine learning applications at 30 kW per rack, the power and cooling limitations within the data centre are quickly maxed out.

Energy efficiency at the edge

The good news is that, as an industry, we have developed solutions to address these issues in the data centre itself. Over the last couple of decades, there have been many studies addressing data centre energy consumption. Our industry has made massive moves on energy savings by focusing on best practices for optimising energy and newer technologies to increase capabilities for the same energy use. The shift to the edge will, however, disrupt these efforts. The economies of scale for infrastructure and solutions in a centralised data centre will not be easily reproduced at the edge, if at all. The question becomes: how do you maintain data centre density and improved energy efficiency while bridging the need for the ruggedised equipment required at the edge?
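The rack-density squeeze is simple arithmetic. The figures below are illustrative assumptions (a hypothetical 600 kW hall budget and the per-rack densities from the discussion above), not data from any facility:

```python
# How many racks a fixed power budget supports at different per-rack densities.
# All figures are illustrative assumptions for the sake of the calculation.

HALL_POWER_KW = 600  # assumed total IT power budget for one colocation hall

def racks_supported(power_per_rack_kw: float) -> int:
    """Whole racks a hall can power at the given per-rack density."""
    return int(HALL_POWER_KW // power_per_rack_kw)

legacy_racks = racks_supported(6)   # legacy workloads at ~5-8 kW per rack
ai_racks = racks_supported(30)      # AI/ML workloads at ~30 kW per rack
print(legacy_racks, ai_racks)       # the same budget powers far fewer AI racks
```

A hall sized for a hundred legacy racks supports only a fifth as many at AI densities, which is why cooling and power provisioning, not floor space, become the binding constraint.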

Edge locations contend with a variety of harsh IT environments. At one extreme you have the cold and damp of Scotland; at the other, the heat and humidity of India. There are also airborne contaminants, particles and corrosive gases to be aware of, all of which need to be closely monitored so that they do not impact the servers, regardless of location. ASHRAE outlines key considerations for the reliable operation of servers and equipment in edge locations. These range from checking IT specifications in order to understand the impact on equipment warranties, to servicing capability, corrosion limits, and the impact of air and temperature on equipment. New standards are likely to evolve as we see more deployments in unusual locations, from utility towers and light poles to perhaps even vaults beneath pavements.

‘Edge washing’

Until more solutions are developed, the industry runs the risk of ‘edge washing’. Being edge-ready will need to be about sustainability as much as being operationally resilient. New thinking and truly sustainable solutions will need to be developed and re-engineered. It won’t suffice to take a solution developed for inside the data centre, tweak it, and then place it at the edge. Solutions will come to market to test the parameters; many will not be successful because they have not used the right type of electronics or chips, or didn’t do something as simple as using conformal coating to protect the server boards. Enterprises are at the centre of an unprecedented data explosion. Data, AI and machine learning are becoming ubiquitous across multiple industries all over the world. Now more than ever, it is time for enterprises to have an edge transition plan in place. With the right preparation, organisations will be able to capitalise on real-time insights and create greater value for their business.



Distributed and dynamic

Image Credit: Shutterstock.com



Michael Cantor, CIO at Park Place Technologies, explores the edge – what it is, how it’s evolving, and what the future of the edge will look like.


Edge computing is set for exponential growth – but we still don’t have a singular understanding of what the edge is. That’s not a bad thing. It simply highlights the vast variety of ways in which the edge can support industries, the expansive cloud ecosystems they inhabit, and the ever-evolving nature of technological innovation. The edge means many things to many people, and that’s the way it will stay. IDC predicts spending on the edge will reach $40 billion in Europe in 2022 and significantly increase over the next five years, reaching nearly $64 billion by 2025. With performance, innovation and cost improvement cited as the top business goals driving adoption in Europe, one thing is clear: no one is disputing the need for, or potential of, edge technologies.

Uncovering the opportunity

Bringing compute power to the edge, where data is created and collected, massively reduces time to value, enabling speedier, smarter and compliant data processing, decisions and intelligence outside of the core IT environment. To some, it is simply a mobile computer in a box; to others, it’s the essential fourth pillar in the hybrid cloud mix. But the bottom line is that it’s already unlocking business opportunities across industries, from retail to manufacturing and healthcare. Edge computing enables lower-latency data processing, making IoT technologies even more efficient, accurate and dependable. In retail we see this already, with in-store checkouts becoming increasingly automated. Video cameras replace the beady eyes of staff, ensuring goods are paid for. This is only possible because of the low latency – with the video data being analysed on location and insights delivered in real-time. Similarly, vision-based IoT technologies are enabling greater precision in manufacturing thanks to faster response times, leading to a reduction in waste. This is happening today.
But edge technologies go much further than answering latency challenges, they provide solutions for data gravity and data sovereignty challenges too. With this in mind, we will see far more industries exploring the power of edge technologies. They want to keep data in a location close to where they need it to be, within their jurisdictions. They don’t want data flying around the world or region as it’s processed in multiple clouds. As data becomes increasingly integral to our everyday lives – from shopping to travelling, healthcare and agriculture – the data landscape will have to become increasingly distributed to enable real-time efficiencies and go beyond current innovations. But there is one big

elephant in the room: who will provide the unifying edge solutions that enable this distributed ecosystem to flourish?

Managed services scale edge transformations
In the next few years, we will see a massive movement towards edge computing, and this will drive an uptick in on-demand managed service offerings – covering everything from service environments and network environments to supply chains. This will take place across all verticals, but it will evolve around core sector challenges. For example, in the healthcare sector – where deep trust in data privacy is needed, as healthcare practitioners require instant access to sensitive patient information – data sovereignty challenges will shape the edge’s evolution. But there will be universal challenges that unite an otherwise vertical-centric edge; businesses will need to run hardware and software patches across the full ecosystem. For CIOs tasked with managing a distributed environment of largely unmanned edge servers, with fewer boots on the ground and more automated monitoring, visibility will be a challenge. More than ever, they will need trusted partners with deep vertical knowledge and edge expertise to help businesses uncover the full edge opportunity. As there are no end-to-end edge solutions, this means stitching together and tailoring bespoke ecosystems that centre around core business challenges.

The future is distributed, diverse and dynamic
Looking ahead, the growth in edge ecosystems is good news in a world beset by cyber threats. Edge technologies can help to solve data sovereignty challenges, reducing the need to share sensitive data beyond the point of collection, while also avoiding the risk of fines by enabling onsite compliance reports. This is also important as we adjust to new 5G speeds. Just as edge technologies make IoT technologies increasingly viable, 5G will put the wind in the sails of the edge revolution, powering the need and urgency of edge compute as new products and services require greater compute power to crunch an explosive growth in data volumes in real time. This will only enhance the demands of a distributed environment and turbocharge industries like manufacturing – with some preparing for private 5G networks. CIOs know what needs to be done, but they don’t know how to do it. Leaning on professional managed services that can be scaled up or down to meet demand makes unlocking the edge easy. Remembering that while all verticals stand to benefit from edge technologies, the solutions must be dressed around vertical-centric challenges – and require a portfolio of capabilities – is key. Close collaboration with experts will unleash the true value of edge compute power and transform industries.

Q2 2022 www.datacentrereview.com 17


Open should hold no fear Simon Ward, Director of Sales, UK & Ireland – Distech Controls, explains why an open Building Management System (BMS) is the way forward for data centres.


We are increasingly reliant on phones, computers and the applications that run on them. With the advancement of technologies like IoT, 5G and autonomous cars, we will only need more infrastructure to handle the data we and our machines create every day. A critical element of our digital infrastructure is the many data centres that process our data and connect our digital devices to the information they need. Today, the widespread demand for transparency, new data points and improved analytics in the data centre environment requires a controls approach that is secure, scalable, resilient and flexible. Data centre facilities typically manage their mechanical and electrical systems using either Programmable Logic Controllers (PLCs) or Direct Digital Control (DDC) systems. The functions of the two are very similar: each has digital inputs and outputs and analogue inputs and outputs for basic control operation. The choice of control approach comes down to the goals of the stakeholders and the conditions under which the systems are operated. So, let’s take a look at the differences between the two.

Programmable Logic Controllers (PLCs) vs Direct Digital Control (DDC) systems
Traditionally, PLCs have been popular for industrial and process applications, where fast, resilient and fully programmable controllers are required. Praised by many for their response time, PLCs respond in fractions of a second. This makes them ideal for near real-time actions, such as safety shutdowns or firing control. While fractions of a second are critical in manufacturing, such speed is not required


in building-related control response times. Temperature and humidity control of a facility does not rely on fractions of a second in logic response; it takes time to gather the data. DDC systems, on the other hand, have been used in a wide variety of Building Automation (BAS/BMS) applications. DDC became available around 1980 and was developed specifically for the control of building systems (HVAC, security and lighting). By leveraging standardised, commonly used technologies shared with IT systems, DDC systems have benefited from many of the same advancements as computers. The connectivity of DDC has helped create a foundation of open and easily integrated systems. By starting with an IP-based system, DDC systems easily incorporate new devices and data sources using open, standard protocols and approaches such as RESTful APIs. RESTful APIs are the preferred way to integrate the digital and physical worlds: they create a fast, secure, low-cost method of connection for devices and software that is native to the IT world. Data centre controls systems should be built to easily integrate new sensors and third-party software, including cloud services. BMS manufacturers and system integrators are a great resource for finding the system approach that best serves the goals of both the data centre and its customers. When discussing controls system options, it is important for clients to ask about the transparency, flexibility and future readiness of their choices. Though PLCs and DDC systems can both provide similar basic functionality for controlling a facility, each has unique attributes. Review the features that best match your business needs today, and in the future, before accepting a proposed solution.
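As a rough illustration of what such a RESTful exchange looks like in practice – the endpoint URL and JSON payload below are hypothetical examples for a generic BMS controller, not any vendor’s actual API:

```python
import json
from urllib.request import Request

# Hypothetical controller endpoint -- illustrative only, not a real product API
request = Request(
    "https://bms-controller.example/api/rest/v1/points/zone-temp",
    headers={"Accept": "application/json"},
)

# A JSON response from such an endpoint might look like this:
response_body = '{"name": "zone-temp", "value": 21.4, "units": "degC"}'
point = json.loads(response_body)
print(f"{point['name']} = {point['value']} {point['units']}")
# prints: zone-temp = 21.4 degC
```

Because the payload is plain JSON over HTTP, the same request can be issued by an energy dashboard, an analytics tool or a mobile application without any controller-specific driver – which is precisely why RESTful APIs lower the cost of integration.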

Selecting the right system for you – be open
In the past, building systems have traditionally been proprietary and lacked the flexibility of open systems. Proprietary systems speak different languages, resulting in incomplete visibility, fragmented data and reduced reliability, and leaving you tied to one, often expensive, service provider. However, that is changing: open systems are becoming ever more popular in commercial buildings and have numerous benefits for data centres.


Open systems offer monitoring and analytics at the local controller, reducing network complexity and increasing redundancy and security. Distech Controls was the first to create intelligent building solutions utilising artificial intelligence, enabling continuous learning for continuous optimisation. Open systems can bring everything together in a cohesive and centralised fashion, allowing users to visualise information, assess relationships,

establish benchmarks and then optimise energy efficiency accordingly. New open systems can meet even the most demanding data centre control requirements (even remotely) via fully programmable controls and advanced graphical configuration capabilities. For instance, the new Distech Controls ECLYPSE APEX is a powerful HVAC/IoT edge controller that offers enhanced performance and dedicated

spaces to IoT and AI developers. It facilitates HVAC system maintenance, increases equipment efficiency and optimises energy consumption by leveraging the latest technology available on-site. An embedded RESTful API exchanges data with different applications – such as energy dashboards, analytics tools and mobile applications – on the premises or from the cloud via the IoT Hub connector. Using a RESTful API interface makes integration easier for systems integrators by enabling IT web services to interact easily with software applications. RESTful APIs also provide flexibility, as the API can handle multiple types of input and return different data formats. In summary, it allows developers to meet the needs of data centre operators as well as facilities and energy managers.

The smarter buildings become, the more important cyber security is. There are some fundamentals that building owners and system integrators need to consider when it comes to the security of their BMS. As a starting point, the devices, or operational technology (OT), should be on a different network to the IT systems, as they have separate security requirements and different people need to access them. As an example, contractors overseeing BMS devices do not need access to HR information. Each device should be locked down securely so that it can only communicate in the way that is required; there should be no unnecessary inbound or outbound traffic from these devices. This links neatly to monitoring. It is vital to monitor the devices after installation and commissioning to ensure there is no untoward traffic to the devices that could threaten a building’s or company’s security. Some manufacturers, such as Distech Controls, are ensuring their products are secure straight out of the box, with security features built directly into hardware and software, such as 256-bit TLS encryption, a built-in HTTPS server and HTTPS certificates.
For instance, the ECLYPSE APEX incorporates secure boot and additional physical security measures to help overcome today’s security challenges. Data centres are unique buildings, and a BMS requires careful planning and implementation. An open system has many benefits and should hold no fear for data centre operators, facilities managers or system integrators.



The clock is ticking Can the sector respond to the climate crisis in time, asks Simon Harris, Head of Critical Infrastructure at BCS.



It has been beyond doubt for some time that, in our data-hungry, always-available world, the environmental performance of data centres has come into sharp focus. Operators around the globe are making commitments on energy and carbon performance as part of their Environmental, Social and Governance (ESG) programmes, and reporting on their achievements as they would their financial performance. The sector’s growth pathway serves to intensify interest in this area, a result of the unique characteristics of these facilities in terms of the intensity and scale of the power they consume. It is still a fact that, with the technology and resources available today, most businesses will struggle to become zero carbon entities through the elimination of the carbon emissions over which they have direct control. After the technically possible reductions have been achieved, there will still be emissions that require neutralising in order for the organisation to achieve a ‘no impact’ state as regards greenhouse gas emissions. The recently published Science Based Targets initiative (SBTi) Corporate Net-Zero Standard recognises this fact, providing a pathway to net zero that includes some element of capture and storage of the last elements of carbon not able to be removed by other means.

The challenges of offsetting
This is where we find the real challenge for businesses wanting to pursue an authentic, corporately verifiable net zero pathway. Data centre operators working to a net zero agenda need to make deep and meaningful cuts to their emissions before embracing the, at times, uncertain world of carbon neutrality through offsetting. The challenges of offsetting include:

• Reforestation/afforestation – a freshly planted tree takes years to take up meaningful amounts of carbon. These trees will require protection from the effects of fire, disease and deforestation for decades to achieve the promised carbon take-up. In short, trees can be a risky bet unless permanence can be guaranteed.
• Additionality – does buying a specific offset lead to a reduction in greenhouse gas emissions that would not have happened otherwise? Determining whether a project is really additional requires scrupulous and transparent accounting, and that’s difficult to do, which is why some offsets fail to deliver.
• Double counting – it is vital to ensure that a party claiming an offset has exclusive rights over that offset. Verifying this has historically been difficult and uncertain, especially in the world of international carbon accounting. It remains to be seen whether the Article 6 agreements reached at COP26 will deal with this challenge effectively and whether the public and private sectors will work coherently on this front.
• Performance – some forms of offset are sold with performance that is difficult to verify. In addition, some offsets are traded on an average carbon footprint basis, as opposed to a quantified footprint that some buyers see as time-consuming and expensive to ascertain.

Of course, given enough funding and corporate appetite, it is possible to execute initiatives that provide greater certainty of outcomes without the third-party verification that some of the arms-length offsets truly require. Microsoft’s investment in Climeworks’ direct air capture solution is a good example of this. However, smaller businesses may lack the muscle to participate in these types of technologies in the short to medium term.

A global problem
Going forward, the pathway to net zero becomes steeper when the future growth in data centre construction and operation is considered alongside the levels of deployment in territories with highly polluting coal-powered electricity grids, such as Eastern Europe and the Far East. China’s data centre market is anticipated to deliver a Compound Annual Growth Rate (CAGR) of over 19% in the period to 2026, for instance. Unless there is rapid decarbonisation, operators in these territories will continue with the questionable practice of paying for the right to pollute through the purchase of offsets.

The environmental management landscape has changed and there is much more change coming. Offsetting will be with us for the foreseeable future, but significant improvements need to be made internationally for it to function as the world requires. Data centre businesses will respond to the net zero agenda at different rates, and they will have to respond, whether because of legislation or wider market forces. Whilst the web has been an enabler of many things, including a general speeding up of many aspects of commerce and society, the sector will be judged on the speed with which it authentically responds to the climate crisis. If it does not, it faces being labelled one of the world’s dirty industries as the globe strives to stay within the IPCC’s carbon budget.




Simplify SOC 2 compliance with the right physical security systems Genetec explains how service providers can stay SOC 2 compliant when it comes to data centre security.


If the demand for cloud services was soaring before, it has sky-rocketed in the past few years. According to Gartner, by 2025, 85% of infrastructure strategies will integrate on-premises, colocation, cloud and edge delivery options, compared with 20% in 2020. This shift to online in business, government and education, as well as the introduction of GDPR and its various country-specific modifications in Europe, has given data centres a complicated new reality to navigate: how to stay compliant with evolving regulations. Companies transmit and store sensitive information online every day, which means it’s more important than ever to ensure partners are following best practices when it comes to cybersecurity. One way to make sure software vendors are taking this seriously is to check whether they are SOC 2 compliant.

SOC 2 in a nutshell
A System and Organisation Controls (SOC 2) report indicates that an organisation meets industry standards in managing information, as determined by an independent audit by a


certified public accountant. It defines criteria for service providers to securely manage data and protect the interests of their enterprise clients and the privacy of their customers. SOC 2 Type II offers proof that controls have been implemented properly over several months. In essence, it is a stamp of approval that an organisation is compliant with best practices in data protection and has all the appropriate safeguards and procedures in place to control who can access sensitive data.

The Genetec solution
Genetec has successfully completed a SOC 2 Type II audit for its portfolio of cloud solutions and the Information Security Management System that governs them. For Genetec customers in the data centre world, this means assurance that their security is built on independently audited controls. The Genetec data centre portfolio unifies all aspects of security within one solution, creating a holistic view of all locations and helping security personnel make better, more informed decisions. A unified system, weaving together video surveillance, access control and digital evidence management

to name but a few, is a sure way for operators to get a full view of their territory. SOC 2 Type II accreditation provides peace of mind that proper procedures are in place to ensure data stored within the Genetec system is secure, private and confidential. It also provides independent validation that controls are in place to deliver our service availability and processing integrity. What makes Genetec unified solutions essential to data centre security is that they are supremely effective at keeping track of who had access to what, when and why, greatly reducing the opportunity for human error that manual processes are so susceptible to. Making use of a physical identity and access management solution that bridges physical and IT security to automate the workflow will also drive down costs.

Centralisation and compliance
There are many other ways in which centralisation can enhance security and streamline compliance operations – for example, by making it easier to set expiry times for contractor passes, or by automating the generation and sharing of audit reports so that any irregular activity is quickly brought to light. The automation that Genetec can provide is key, as these activities are easy to specify but difficult to carry out consistently if manual intervention is required. SOC 2 Type II certification is the most comprehensive proof of a technology company’s commitment to safeguarding client data. Data centres must keep up with evolving regulations and security threats while ensuring their customers’ needs are always met. By choosing Genetec, you can be sure you have the physical security systems in place to keep your networks, devices and data safe. We help your team reduce security risks and improve decision-making, helping your data centre run at peak efficiency and giving you peace of mind. For more information on securing your data centre with SOC 2 compliant, scalable solutions, contact Genetec today: www.genetec.com/industries/data-centers


Preparing for a greener future Darren Watkins, Managing Director for Virtus Data Centres, explores how we can achieve greater efficiency and performance in data centre design and operation.


While many data centre operators strive to be more effective and efficient and to prepare for a greener future, not all are looking at optimising their entire data centre footprint. Some focus on discrete initiatives, but the most sustainable providers will manage the data centre lifecycle end-to-end to achieve the greatest efficiency and performance – from design and construction, through deployment, to operation and optimisation.


Performance Firstly, it is important to understand what is meant by ‘performance’. Availability is a key performance indicator since the IT loads that data centres support are mission-critical and any unplanned or unscheduled downtime can end up being costly for any organisation. For example, in February 2022, British Airways suffered an outage that prompted it to cancel short-haul flights out of London Heathrow Airport. According


to a report on the BBC News website, the fallout from the incident was wide-ranging, with numerous systems that BA relies on to conduct its operations at Heathrow affected. Availability comes hand in hand with uptime and scalability, which are equally vital for performance. This is particularly key for colocation providers, which are required to flex their provision alongside their multiple customers’ changing requirements. High-Performance Computing (HPC) environments need large amounts of power and the agility to rapidly change their consumption profile in line with demand. Other performance metrics include how cost-effective the data centre is in terms of CapEx, OpEx or total cost of ownership (TCO), how sustainable and environmentally compliant the design is, and how energy efficient the facility is. Even with increased demand, modern data centres are expected to keep power usage within environmental requirements. But performance isn’t just about defining what is required today. It’s important to continually review which techniques, technologies and strategies are performing as expected, and which need to be improved. For back-up power, the industry continues to investigate alternative, sustainable sources – fuel cells, for example, are being examined as a potential standby energy source. At present, this technology is not available at the scale required for large data centres, but research is ongoing.

Design and construction
One of the first considerations for data centre operators is likely to be location. From a country perspective, recent reports show that despite the continued growth in the five key FLAP+D locations (Frankfurt, London, Amsterdam, Paris and Dublin) – which account for 70% of data centre space – other areas (such as Zurich/Geneva, Warsaw and other large European metropolises which aren’t capital cities) stand out as hot spots for future investment.
Data centres can be built almost anywhere with enough suitable power and the right connectivity, but location has an impact on the quality of service a facility can provide to its customers. Many organisations choose providers that are close enough to business hubs and other data centre ecosystems to allow for mission-critical data replication services, but far enough from both to satisfy physical disaster recovery requirements. Being convenient for customers to access their systems is also a deciding factor in location. Today, the rising cost of constructing and operating a data centre, due to increases in fuel and raw material prices, calls for better approaches to design. A holistic, innovative design approach provides significant benefits to data centre managers and involves carefully considering all of the variables with an eye on OpEx over the life of the data centre, not just the initial CapEx investment to build the facility. However, innovation must always be aligned with ongoing sustainability, and it’s here that the BREEAM (Building Research Establishment Environmental Assessment Method) standards are important. These standards look at the green credentials of commercial buildings, verifying their performance and comparing them against sustainability benchmarks across the entire project lifecycle. As well as committing to meet BREEAM specifications, many providers also employ a modular build methodology to deploy capacity as and when required. This drives up utilisation and maximises efficiency from both an operational and a cost perspective.

Deployment, operation and optimisation
Power and cooling account for much of the operating cost of a data centre, so efficiency is paramount. When it comes to cooling, providers use a variety of innovative techniques, including indirect evaporative air cooling. This works by drawing air from two sources: first, outside air is drawn through the louvres on the side of the data centre and into the cooling unit; second, the hotter air from within the data hall is contained from the hot aisle of the IT equipment and enters the cooling unit. The cooler outside air is used to cool the hotter air from the data hall via a heat exchanger, before being returned to the data hall as cool supply air. Critically, the two air streams never mix inside the cooling unit, ensuring the environment inside the data hall is kept free of outside contaminants.
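One simple way to reason about this arrangement is the textbook sensible heat-exchanger effectiveness model. This is a sketch under simplifying assumptions – real indirect evaporative units also wet the outside-air stream, cooling it below the dry-bulb temperature used here, and the example figures are illustrative rather than drawn from any particular facility.

```python
def supply_air_temp(t_return: float, t_outside: float, effectiveness: float) -> float:
    """Supply temperature from a sensible air-to-air heat exchanger.

    effectiveness is the fraction of the return/outside temperature
    difference recovered (0 to 1); the two air streams never mix.
    """
    return t_return - effectiveness * (t_return - t_outside)

# e.g. 35 degC hot-aisle return air, 15 degC outside air, 75% effective exchanger
print(supply_air_temp(35.0, 15.0, 0.75))  # 20.0 degC supply air
```

The model makes the appeal of the technique obvious: whenever outside air is cooler than the hot-aisle return, the exchanger does most of the cooling work for free, with mechanical cooling needed only to trim the remainder.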

Other innovative operational techniques include using water sourced from a natural underground aquifer to minimise the use of mains water, rainwater harvesting, and the reuse of waste heat. In terms of power requirements, the uninterruptible power supply (UPS) will be determined by several factors, including the criticality of the systems under load, the quality of the existing power supply and, of course, the cost. When it comes to energy use, renewable energy is on the rise. Indeed, renewable energy projects are an area of continued success for the industry. Many providers are committed to using 100% renewable energy sources – helping them to meet environmental goals whilst also providing cost savings and increasing reliability.

Looking to the future
Data centres have become one of the most crucial pieces of business infrastructure in the modern world. They are responsible for storing and processing the vast amounts of information needed to run the digital economy – if they don’t work, businesses can’t operate. However, demand comes at a cost and brings sustainability pressures, so time and investment must be spent on research and development across every aspect of data centre solutions – from cooling systems to energy efficiency, security and monitoring – in a constant effort to improve performance and efficiency. Forward-looking data centre providers will work with supply partners and customers to innovate, enhance product development and ensure they are providing operational excellence to all their customers.



Going green with refurbished IT Nick Stapleton, Managing Director of ETB Technologies, discusses the role the refurbished IT market can take in supporting the green economy.


In 2019, Scotland became the first UK home nation to declare a climate emergency, citing the circular economy as one way to drive down emissions and contribute positively to the global climate agenda. The following year, the wider UK government set out its strategy for decarbonising all sectors of the economy and meeting its net zero target by 2050. As part of this strategy, the government sought to encourage individuals and organisations alike to recycle, reuse or refurbish, to reduce the amount of waste generated and sent to landfill. From an IT perspective, however, recycling is an energy-intensive process, and reusing equipment can mean sacrificing quality, speed and reliability. As members of the technology sector and responsible businesses, we must therefore look to the refurbished market to support the green economy. Landfills are full of computers, tablets and other electronic devices – also referred to as ‘e-waste’ – that have been discarded, but could have been made as good as new with a little work.


A complex challenge
E-waste is a global problem. It’s one of the fastest growing waste streams in the world and it often leads to contamination of soil and water. The UK is a big contributor, with households and businesses throwing away 300,000 tonnes of e-waste each year. But the need for a greener economy doesn’t just stem from waste caused by products reaching the end of their lifecycle. IT equipment also produces a significant amount of waste and emissions during production. In addition, much of the IT equipment we use is built with rare materials, the extraction of which is fuelling climate change and creating pollution. Put simply, the IT sector has historically been a negative contributor to the environment. We now have the capability to change that by increasing our use of refurbished products. In fact, a 2021 report from the Aldersgate Group found the circular economy could deliver 80% of the emissions reductions the UK needs to make to meet its goals for 2028-2032. Aiding this should be our goal. And while the environmental benefits are huge, there are other reasons for businesses to choose refurbished. For example, some enterprise manufacturers are quoting lead times of over 100 days to fulfil orders, leading to huge delays for businesses. Refurbished vendors, on the other hand, have items readily available, in some cases delivered within 24 hours of the order being placed. Businesses can also get more for their money by choosing refurbished equipment. For example, a business planning to buy a new server for £10,000 may be able to get a refurbished one with the same specs for half that – or one with more capacity for the same price.



Going refurbished
We know there is little downside for the end-user in using refurbished equipment; over the past two decades the industry has matured, and there are far higher professional standards than when it was in its relative infancy at the turn of the millennium, including warranties as good as buying new. In many cases, refurbished products are comparable to new ones in terms of performance. The challenge lies in changing perceptions among businesses, to encourage them to think of refurbished first when choosing new or upgrading IT equipment. The move to remote working, coupled with a global shortage of semiconductors and supply chain challenges last year, has undoubtedly made the circular economy more attractive to businesses looking to invest and stay ahead during the difficulties of the past few years. But offices throughout the UK are reopening and delays across the supply chain are coming to an end. As an industry, we can’t rely on these alone to drive forward the green economy. Government incentives and other subsidies will be key to ensuring businesses continue to consider refurbished. In the December budget, the Scottish Government allocated £43m towards supporting this sector; the wider UK government also needs to consider how to incentivise companies to reduce or remove emissions from their supply chains. A good place to start would be for the UK government to update the super-deduction allowance, which lets British businesses claim back up to 25p for every £1 they invest in ‘qualifying machinery and equipment’, before it ends in April 2023. Refurbished machinery isn’t currently

included in this, but – given the crucial role it has played in helping businesses access lower-cost, quality machinery while supporting carbon-friendly business decisions – it should be. However, the government did note in the March 2022 budget announcement that all cloud costs, including storage, will be included in the R&D tax relief system moving forward, which will help somewhat.

Businesses everywhere are looking at how to reduce their spend, while governments look to reduce waste and carbon emissions. The circular economy is the ideal marriage of these two goals. There is, of course, a time and place for new equipment, but if the UK wants to get serious about supporting the circular economy and meeting its net zero target by 2050, refurbished needs to be prioritised and support given to businesses to enable more carbon-friendly investment decisions.

Q2 2022 www.datacentrereview.com 27



Cloud, colocation – or both?

Are cloud and colocation complementary or competitive? Terry Storrar, Managing Director at Leaseweb UK, discusses this core debate, and explains how the latest advances in AI, machine learning and 5G are encouraging colocation to evolve.

Are cloud and colocation complementary or competitive solutions? It’s a debate that’s been raging in the wake of the rising dominance of cloud technologies over the past decade. Offering flexible, scalable and cost-effective infrastructure, cloud solutions saw demand jump as the pandemic hit and businesses moved at speed to support remote workforces and a raft of new digital operations and services. As organisations re-assess their priorities and cost bases, while evaluating how best to upscale infrastructure capacity for today’s increasingly data-centric and digitalised operations, the search for an optimised way to support mission-critical workloads, while reducing IT costs, is intensifying.



Choosing a middle path – hybrid

Unsurprisingly, this is refuelling discussions around the question of which is best – cloud or colocation – in terms of adapting to future challenges and an evolving IT landscape. The problem is that the decision isn’t quite as clear cut as it might seem. This is because data centres are evolving fast to become just one feature in a much larger and more complex environment that is typically distributed across multiple locations and contains both on and off-premises facilities.

This means that colocation providers are evolving fast too. No longer limited to the provision of data centre facilities that just provide floor space, electrical power and an internet connection, many now offer a host of services – from managed IT to hybrid cloud. And some offer direct connections to public cloud providers like Google, Amazon and Microsoft. In other words, today you can have both colocation and cloud. It’s no longer an either/or choice. Indeed, for many organisations, opting for a hybrid infrastructure is enabling them to create the right mix of cloud and traditional IT to suit their needs.

Banishing misconceptions

In essence, the cloud is a set of services, technologies and tools that are provided via a physical IT environment that sits in a data centre. This could be a cloud provider’s own data centres, but more typically, cloud providers use colocation facilities to house these services. So, regardless of whether an organisation is looking to use private, public or hybrid cloud, ultimately it all sits on physical infrastructure that resides somewhere in some form of data centre. Having reframed your thinking in this way, it’s easier to understand how the rise of cloud has impacted colocation in a positive way.
That’s because colocation providers have been able to take advantage of virtualised or cloud environments to run more services on less hardware and expand into managed services, where virtualised workloads are isolated in multi-tenanted server environments. It’s a move that reduces the footprint required in data centres while enabling customers to deploy their own private clouds from data centres that sit within a colocation facility. Indeed, many next generation colocation providers are now rebranding themselves as multi-cloud and hybrid cloud providers that offer fast data connectivity to public clouds.

As technology advances and future enhancements in AI and machine learning emerge, colocation providers will play a pivotal role in leveraging the cloud to enable more compute at optimum times and overcome the challenges of legacy and ageing hardware. In this way, colocation isn’t in competition with the cloud; it complements it. Indeed, state-of-the-art data centres are now finding innovative ways to reuse heat and power, turning this waste into a usable asset. All of which can only assist cloud providers when it comes to maintaining their commitments to tackling the challenges of climate change.

Colocation is evolving – fast

Colocation data centres are evolving fast to offer high-performance colocation facilities featuring bespoke solutions that are designed to address the changing demands of customers. Many enterprises are now looking to mix and match public and hybrid cloud models with their colocation data centre environments in order to balance the need for a secure and stable production infrastructure with a growing requirement for agility

and high-performance compute. For enterprises, there are numerous positive gains to be achieved by adopting a hybrid approach to their IT ecosystems. While many workloads may work best in the cloud, cloud isn’t necessarily always cost-effective for applications where demand is predictable and stable. Nor does cloud always provide the high performance needed for AI, machine learning or real-time data analytics. All of which makes colocation the ideal choice for applications that require compute-dense resources and low latency response times. Added to which, colocation also offers significant benefits where data sovereignty, security and privacy regulations are concerned.

As a result, enterprises are now turning to colocation as a means to locate and manage data closer to their cloud, network and security functions, while gaining access to public cloud and hyper-converged systems capacity that is both affordable and closely managed and controlled. This in turn means they’re increasingly relying on colocation providers to support and monitor their systems – and provisioning status – in real time.

Delivering against shifting customer needs

More and more IT leaders are looking to reap the benefits of combining colocation with cloud to select the best mix, from a multitude of infrastructure choices, to meet their evolving business and workload needs. This means that colocation providers now need to offer increasingly sophisticated network connectivity and service options to address the operational realities of these hybrid ecosystems. Along with enabling direct and secure connections to public clouds, this includes deploying tools that will deliver the intelligence needed to ensure all elements of their customer’s infrastructure are working in unison.

With Gartner predicting that by 2025, 85% of enterprise infrastructure strategies will integrate on-premises, colocation, cloud and delivery options – compared with 20% in 2020 – one thing is clear. Increasing interconnections with cloud services will be critical for speeding up access for businesses to the edge of their networks, so they can deliver faster services with lower latency. As colocation providers evolve to work with all cloud options, and the increasingly sophisticated data requirements of customers, connectivity services are set to become a defining competitive differentiator. With all to play for when it comes to providing the low-latency connectivity that enables enterprises to meet new business demands, the good news is that technologies like 5G are finally enabling the super-fast and affordable connections to colocation data centres that will be needed.



Proper preparation and planning




Jon Healy, Operations Director at Keysource, explores why IT directors are having to meet some complex challenges to ensure the success of company digitalisation plans.

The data centre sector continues to be at the heart of the global post-pandemic recovery, as for many organisations, the ability to bounce back or accelerate growth lies in their technology and continued digital transformation. In fact, digital transformation continues to be a mandatory strategic initiative for businesses and organisations across several sectors. This has many different drivers, such as the ability to provide a competitive edge, service quality, cost and compliance. Of course, for some it is about survival – but whatever the reason, the opportunities it opens up can be huge.

Achieving digital transformation usually involves very complex, multi-faceted projects, which can often be difficult to plan, let alone execute and manage. This is resulting in an increase in outsourcing, where an SLA and some KPIs can reassure and the problem is passed to someone else – potentially with some (or a lot) of pain along the way. However, the nature of IT outsourcing is changing, and IT directors are having to make difficult decisions to ensure that their company’s digitalisation plans can be delivered.


A solution in the cloud?

Over the past few years there has been a trend towards ‘migrating to the cloud’ and an increased uptake in cloud usage, which has been further expedited by the pandemic and subsequent lockdowns. Promised benefits, from cost savings to increased flexibility and agility, are a few valid reasons for its continued success. However, we are seeing some organisations that have discovered too late that the ‘cloud’ isn’t necessarily the best solution for them, certainly not on its own, or that their ability to get there is much harder than first thought. Worse still, they have found themselves tied into long-term, ‘inclusive’ agreements that cannot be fully utilised as intended or do not support all of their requirements.

Let’s make no mistake, the cloud has been, and continues to be, an absolute game changer and is often at the heart of the digital transformation plans that we help implement and support. But too often, the pace at which an organisation moves can be directly connected to its success, and it is fair to say those who don’t move quickly can often get left behind.

And speed is not the only issue. In our experience, organisations may have bespoke applications or utilise legacy platforms that cannot be supported by the cloud, in its current form, or they may require a disparate estate for regional or customer reasons. In addition, the potential utilisation of the service, given market trends and/or changes in technology, may require technical or commercial mechanisms for flexibility within a cloud service. Customers may be

unaware of this before they decide to engage with a cloud provider, as it can be a challenge for some organisations to establish their requirements in sufficient detail. Often the first time organisations become aware of any issues is when things quite simply can’t be moved. By then the contract is signed and IT service owners are keeping multiple environments operating to maintain service – probably the same environments whose closure the promised cost savings depended on.

A deeper look at IT requirements

This process is not helped by the fact that key information these service providers need to enable them to identify benefits or opportunities can be difficult to uncover without a certain level of detail. As a result, the proposal is often based on assumptions rather than fact, but can eventually become the basis of a deal. Companies can help to mitigate this risk by commissioning a full audit of their IT requirements – covering applications, hardware and supporting infrastructure, and including site visits and stakeholder interviews – to capture their requirements. During this process, dependencies and risks associated with the migration of a particular service can be established. This enables companies to define any enabling works which may need to take place, or to influence the sequence in which the migration is completed.

When deciding which elements of the IT estate are best suited to which service, there are other things to consider too. For example, some applications may be deemed business critical, and therefore resilience is a key factor; for other, less important applications, more cost-effective options may be considered. By undertaking this deep dive upfront, organisations can focus on anything which may impact a quotation from a supplier and ensure that the goals and objectives are clear and consistent.

It is at this stage that decisions should be taken about services that may not be suitable for the cloud and where they are in their operational lifecycle, which may influence how these are managed. This may involve hosting these on-premises or even replacing the service. It is also worth considering migrating these either first or last to further mitigate the risk. With significant cost at stake, it is details like these that don’t necessarily need a solution but do need to be built into the business case from the beginning. The devil is in the detail, and the level of that detail will dictate the success of any changes and enable organisations to make truly informed decisions.
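The dependency-and-sequencing exercise described above can be illustrated with a short sketch. Assuming the audit produces a simple map of which services depend on which (the service names below are invented for illustration), a topological sort yields a migration order in which nothing moves before the things it depends on:

```python
from graphlib import TopologicalSorter

# Hypothetical service-dependency map captured during the audit:
# each service lists the services it depends on.
dependencies = {
    "crm-frontend": {"crm-api"},
    "crm-api": {"customer-db"},
    "reporting": {"customer-db", "crm-api"},
    "customer-db": set(),
}

# static_order() raises CycleError if the audit uncovered a circular
# dependency - a risk worth surfacing before any contract is signed.
migration_order = list(TopologicalSorter(dependencies).static_order())
print(migration_order)  # dependencies always precede their dependants
```

The same map also answers the “first or last” question raised above: services with no dependants can safely move last, while shared foundations such as the database surface at the front of the order.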



In it together To avoid a data disaster, cloud and hardware backups must work together, says Jon Fielding, Managing Director at Apricorn.


When we talk about cybersecurity, words such as defence, prevention and mitigation spring to mind. It’s a mindset that forms the bulk or entirety of many organisations’ security efforts: if you stop an attacker in the first instance, there’s no chance that they can succeed. Unfortunately, this approach alone is no longer adequate in the face of today’s sophisticated threat landscape. In recent times, we’ve seen countless examples of companies that would have considered their security protocols to be watertight, leaking when put under pressure.

Why? The simple fact of the matter is that no organisation can ever be considered 100% secure. New vulnerabilities are emerging and being exploited all the time. Take Log4j as an example. Publicly disclosed as a vulnerability on December 8, 2021, it was given a rare 10 out of 10 vulnerability score by the National Institute of Standards and Technology (NIST), owing to its unique combination of being easily exploitable and highly damaging.

Once it was identified as a highly concerning and previously unknown weak point, attackers quickly set about attempting their exploits. The first attempt occurred just nine minutes after publication, rising to a total of 40,000 within 12 hours and 830,000 by the time a patch was released to the public three days later.

Of course, preventative methods like endpoint detection and response (EDR), zero trust access and the training of employees in detecting social engineering attacks, such as phishing, all have their place on the security roster. But no matter how prepared a company’s defences are, there’s always a possibility that malware will get in – or that a user will make a mistake resulting in the loss of vital data. Recovery is therefore just as important as mitigation in ensuring business continuity.


Why you need a multi-layered backup strategy

Thankfully firms are recognising this, establishing backups as a means of responding to attacks. Indeed, a 2020 report by Acronis revealed that as early as two years ago, nearly 90% of companies were backing up the IT components they were responsible for protecting. Of those companies that have formal data backup procedures, more than half (55%) rely on the cloud as their primary backup location, according to respondents to Apricorn’s latest Twitter poll. However, only 36% of respondents believe this is the most failsafe place to store data securely.

Businesses today depend heavily on cloud storage – and quite rightly, as it offers a convenient, fast and secure way to back up critical information off site. However, relying on this (or any other solution) on its own leaves organisations vulnerable to a data breach or loss. If a cloud provider experiences downtime or suffers a cyber-attack, for example, data is at risk whether an SLA is in place or not. For this reason, companies should look to develop a multi-layered backup strategy. By maintaining a physical backup location off-site that complements the use of the cloud, companies can retain an element of control, ensuring they can always recover and restore from a clean, protected data set.

Of course, this is easier said than done. So, what’s actually required to develop a 360-degree, layered backup strategy that not only incorporates offline and online backups but plays to the strengths of both to cover all eventualities?

Policy

First, you need to identify and implement a solid set of best practices that will form the foundations of any reliable and effective backup strategy. Companies should adopt complementary procedures, such as making multiple backups, multiple times a day, in an automated fashion to minimise the impact of any potential data loss.
Here, the 3-2-1 rule is an easy guiding principle in developing a resilient backup strategy. It stipulates the following: you need a minimum of three copies of data (one primary copy, and two backups); on at least two different media; and with one dataset stored off-site (and ideally offline). Ransomware attackers will typically target backups in order to stop companies from restoring the data that they exfiltrate and encrypt, forcing them to pay their ransom. By both geographically distributing backups and creating readily maintained offline and online versions, these threats are mitigated.

Knowledge

Of course, the benefits of creating backups are somewhat diminished if you can’t leverage them effectively in critical moments. Sophos estimates that the average cost to recover from a ransomware attack is $1.85 million. Yet this figure is not simply the typical cost of paying a ransom. It accounts for the downtime, people time, device
costs, network costs and other lost opportunities when a company struggles to recover from an attack quickly. To mitigate these costs, a playbook should be developed outlining the process of performing data backup – who is involved, which programs and products they use, and the location of the backups. It should also include the procedure for testing, reviewing and updating the process. Should any staff be absent in the event of an attack, or critical cogs in the recovery chain leave the company, the firm will still retain a step-by-step guide enabling them to respond effectively.

Technology and tools

Certain technologies and tools can enhance the recovery process, making it easier to achieve best practices. There are a variety of data backup and recovery software solutions on the market. Almost all will offer the ability to create multiple copies of key applications, documents, files and folders, housing these in different locations (in line with the 3-2-1 rule). However, they all differ slightly. Some will be able to perform backups of disk images, mailboxes or virtual machines, and databases on many data storage devices, for example. To differentiate between providers, consider your exact needs. Ask key questions. Is your business heavily reliant on email? Do your staff need to be able to access these contacts via their inbox? In doing so, you’ll ensure you have everything you need without paying for unwanted or unneeded extras.
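The 3-2-1 rule referenced above is simple enough to check automatically against a backup inventory. A minimal sketch, assuming a hypothetical inventory format in which each copy records its location, medium and whether it sits off-site:

```python
# Invented inventory: one entry per copy of a dataset.
backups = [
    {"location": "primary-san", "medium": "disk", "offsite": False},
    {"location": "cloud-bucket", "medium": "cloud", "offsite": True},
    {"location": "vault-tape", "medium": "tape", "offsite": True},
]

def satisfies_3_2_1(copies):
    """True if the copies meet the 3-2-1 rule:
    at least 3 copies, on at least 2 distinct media, with 1 off-site."""
    enough_copies = len(copies) >= 3
    enough_media = len({c["medium"] for c in copies}) >= 2
    offsite_copy = any(c["offsite"] for c in copies)
    return enough_copies and enough_media and offsite_copy

print(satisfies_3_2_1(backups))  # True for the inventory above
```

Running such a check on a schedule turns the rule from a policy statement into something the playbook can verify automatically.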

Use of encryption

That said, one solution we do recommend you tap into is encryption. Encryption of backups provides an additional security measure that can help to protect data should it be misplaced, stolen or compromised. Interestingly, IBM’s 2019 Cost of a Data Breach report pointed to the extensive use of encryption as having the greatest impact in reducing breach costs – ahead of data loss prevention, threat intelligence sharing and integrating security in the software development process (DevSecOps).

By providing employees with removable USBs and hard drives that automatically encrypt all data written to them, companies can give everyone the capability to securely store data offline. It’s also the perfect solution for remote working, allowing employees to move data to, and from, office to home safely. These devices can also be used to back up data locally, mitigating the risk of targeting in the cloud.

Hardware encryption offers much greater security than software encryption, and PIN pad authenticated, hardware-encrypted USB storage devices offer additional, significant benefits. Being software-free eliminates the risk of keylogging and doesn’t restrict usage to specific operating systems; all authentication and encryption processes take place within the device itself, so passwords and key data are never shared with a host computer.

Encryption is therefore critical. As well as helping to reduce the financial impact of a breach, it is a means of demonstrating your trustworthiness and reliability in the realm of data protection, providing your own data-anxious customers with complete peace of mind.
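As a practical complement to the encryption measures above, the ‘clean, protected data set’ that recovery depends on can be verified before any restore. A minimal sketch using standard-library hashing – the file name is invented for illustration – records a SHA-256 digest for each backup at creation time and flags any copy that no longer matches:

```python
import hashlib
import tempfile
from pathlib import Path

def build_manifest(files):
    """Record a SHA-256 digest for each backed-up file."""
    return {str(p): hashlib.sha256(Path(p).read_bytes()).hexdigest()
            for p in files}

def verify_manifest(manifest):
    """Return files whose current digest no longer matches the manifest:
    candidates for corruption or tampering, checked before any restore."""
    return [f for f, digest in manifest.items()
            if hashlib.sha256(Path(f).read_bytes()).hexdigest() != digest]

# Demo: a pristine backup verifies clean; a modified copy is flagged.
with tempfile.TemporaryDirectory() as tmp:
    backup = Path(tmp) / "payroll.db.bak"
    backup.write_bytes(b"backup contents")
    manifest = build_manifest([backup])
    assert verify_manifest(manifest) == []                  # clean
    backup.write_bytes(b"tampered contents")
    assert verify_manifest(manifest) == [str(backup)]       # flagged
```

Stored alongside the playbook (and ideally on a separate medium from the backups themselves), such a manifest gives the recovery team a quick way to confirm which copies are safe to restore from.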



Industry Insight: How tech leaders can support sustainability targets

Kerry Osborne, Sustainability Lead at Coeus Consulting, discusses the challenges posed by actioning sustainability initiatives in IT, and three potential areas on which to focus solutions.

Gone are the days when individual business functions could operate successfully in siloed parcels. For companies to meet the demands of today, from heightening regulatory pressures to changing consumer expectations, all internal facets need to be aligned and pulled in the same progressive direction. This is particularly critical in the context of sustainability.

If net zero targets are to be reached by 2050, then enterprises and businesses will have a huge part to play. For many, aligning with national and global climate goals will require a dramatic change in mindset and a holistic operational overhaul to ensure every process, from property management, to accounting, to manufacturing, is accountable for an organisation’s carbon reduction ambitions.

Numerous companies have already embarked on this journey, placing all aspects of their business under the sustainability microscope to view it through a distinctly green lens. A Coeus Consulting survey published in February 2022 highlighted a recognition of the need for more eco-conscious IT operations. The report found that 90% of IT leaders see sustainability as a key IT objective within their organisation, while 88% of organisations already have an IT sustainability strategy in place. Further, 85% agreed that their organisation needs to be doing more when it comes to IT sustainability, and 80% noted that IT and sustainability are intrinsically linked and that IT has a large impact on sustainability.

This is incredibly promising, yet turning sentiment into impact is often easier said than done. Indeed, the fact that half of respondents said their formal strategies were not defined in IT but elsewhere in the organisation, and seven in 10 agreed that IT sustainability is often viewed as a tick box activity to improve company reputation and justify government tax savings, is worrying. Despite the positives, there is clear room for improvement.
Critically, firms must move away from viewing sustainable IT practices as a burden and recognise them as an opportunity, addressing them more proactively and comprehensively. We outline three ways in which this can be achieved:

1. Broaden the scope of sustainability activities

In our survey, around 90% of respondents already considered IT sustainability to be a high priority. Interestingly, for over half of them, this was only a recent change, driven by a combination of IT leaders ensuring it is covered within IT leadership strategy, increased awareness and demands from senior stakeholders.

Organisations are taking the right steps towards IT sustainability by building strategies and investing in technologies like cloud platforms. However, IT has the potential to facilitate an organisation-wide shift to sustainability, which will require aligning the IT and business sustainability strategies. Teams must therefore look to expand the scope of their sustainability activities, allowing IT to contribute significantly towards sustainability targets.

This can be achieved in a variety of ways. Extending sustainability to the operating model using IT can increase efficiencies in energy consumption, resource utilisation and waste management, for example. Further, IT systems can also be used to measure current consumption levels, set sustainability baselines and targets, and monitor progress through the use of key data. Promoting sustainability culture within the organisation will enable employees to be engaged with the strategy and support the sustainability




drives introduced by the stakeholders. IT can again help here by working with the business to foster sustainable customer and provider behaviour, promoting sustainability outside the immediate organisation. All these ambitions will of course need planning and buy-in from senior leadership. However, succeeding here can help to elevate IT as a sustainability enabler instead of just a target for isolated sustainability activities.

2. Prioritise ‘sustainable’ over ‘lower cost’

For IT departments to help drive the sustainability agenda across the wider company, technology teams must be given the freedom to prioritise ‘sustainable’ solutions over ‘lower cost’ ones at every possible turn, including making decisions about which technology products to invest in. Responsibility for sustainability within IT should be formalised to ensure it is a key consideration in every technology-related purchasing decision the company makes.

Within larger organisations, it may be suitable to have a specific role devoted to this, such as a Head of IT Sustainability. At smaller organisations, someone within IT should be made responsible for checking that sustainability is given due consideration in all IT initiatives. By assigning a champion for such initiatives, organisations can ensure time is dedicated to considering the best options, enhancing best practices, measuring impact and developing accountability in sustainability initiatives. IT has most of the data required at its disposal today; by stepping up as a business partner, IT leaders and CIOs can help drive the pace of change.

For more ambitious organisations, there is evidently a large opportunity for IT to work with the business to develop new sustainable products and services and potentially spearhead a true ‘sustainable transformation’, providing digital and technology solutions to support changes in customer consumption and business needs.

3. Embed sustainability within the supply chain

Companies should also kickstart the process of embedding sustainability within the supply chain. Existing agreements should be reviewed
to understand what additions may need to be made to cover certain sustainability criteria. Service credit regimes should reflect these measures so that there are clear service-level agreements (SLAs) applied around sustainability that both penalise under-performance and reward over-performance. When drafting requests for proposals, there should be an explicit area that refers to required sustainability targets and their relative weighting when scoring a potential supplier. SLA targets should be both realistic and present stretch targets that show measurable increases in sustainability across the term of the contract. Overall, those who are proactive in embedding sustainability in their supply chains will be viewed positively, whilst those that treat sustainability as a low priority run the real risk of suffering reputationally.

Greater action is needed

While there is encouragement to be found in the acknowledgement of the importance of IT sustainability among organisations, it is vital to note that this is still a work in progress. Indeed, significant focus and action is needed if businesses are to move the needle in any meaningful way. Thankfully, innovation in this arena is advancing rapidly, and there is therefore significant untapped potential for organisations to use technology proactively to reach their sustainability goals, from taking the early steps of recycling and transitioning to the cloud, to reconfiguring their operating model along more sustainable lines.

Critically, IT must align with wider business sustainability goals and step up to the challenge of providing the solutions that drive digitalisation and efficiency. This isn’t simply a case of delivering reports and showcasing charts. It is about delivering measurable benefits both internally and throughout the supply chain.
If CIOs and IT leaders get this right, they have a massive opportunity to take a leading role in furthering the sustainability agenda for years to come, adding value to organisations in ways that would otherwise be unattainable.
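The relative weighting described for scoring potential suppliers can be sketched in a few lines. The criteria, weights and panel scores below are invented for illustration; the point is that an explicit sustainability weighting can change the ranking a lowest-cost evaluation would produce:

```python
# Hypothetical RFP criteria with relative weights (summing to 1.0);
# sustainability is scored explicitly alongside cost and capability.
weights = {"cost": 0.35, "capability": 0.40, "sustainability": 0.25}

# Each supplier's 0-10 score per criterion, as marked by the panel.
suppliers = {
    "Supplier A": {"cost": 9, "capability": 7, "sustainability": 4},
    "Supplier B": {"cost": 7, "capability": 7, "sustainability": 9},
}

def weighted_score(scores):
    """Weighted sum of a supplier's scores across all criteria."""
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(suppliers, key=lambda s: weighted_score(suppliers[s]),
                reverse=True)
for name in ranked:
    print(f"{name}: {weighted_score(suppliers[name]):.2f}")
```

With these invented numbers, the cheaper Supplier A would win a cost-only comparison, but the sustainability weighting lifts Supplier B to the top, which is exactly the behaviour a well-drafted RFP scoring scheme is meant to produce.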


Get ready to get Powered On LIVE!

The Powered On podcast, brought to you by our sister publication Electrical Review, is expanding its format to a two-day digital conference that will feature discussions that are sure to interest the readers of Data Centre Review. On 15 June and 16 June, Powered On Live will take place featuring a host of speakers that will cover important topics in the electrical industry, with many of the sessions likely to have a big impact on the data centre sector.

We’re giving our platform to some of the industry’s foremost experts to share their expertise about the big issues. These include Darren Jones, UK Technology Manager at Hitachi Energy, who will be on the panel discussing the benefits of microgrids, as well as sessions dedicated to data centre power and energy efficiency. Data centres are amongst the biggest consumers of electricity worldwide, so key sessions such as dealing with renewable intermittency and the transition to net zero are likely to be at the top of the list for those looking to tune in.

Additionally, we will have sessions from both Riello UPS and Megger, giants of the industry, who will also sponsor the event. There are plenty of sessions available, from a choice of big names, from Damien Kelly of Innovate UK to Simon Orr, Net Zero Plan Lead at National Grid. To get the latest speakers list, head on over to the Powered On Live website.

Finally, with Powered On Live suitably whetting your appetite for live, digital events, we’re pleased to confirm a second event will take place later on in the year dedicated to all things data centres – you won’t want to miss it. So why not come join us this 15 and 16 June? We’ll see you there.

Find out more at poweredonlive.co.uk

Join us for an exclusive gala dinner held at Christ Church near Spitalfields Market on 19 May 2022, where winners of each category will be announced.

Catch you there!


Entertainment Sponsors:

POWER

Power Product of the Year
Sponsored by Omicron
Shortlisted:
Riello UPS – Multi Power Modular UPS
Kohler Power – Tier EPA-Certified Tier 4 Final KD Series Generator
Bachmann – BlueNet Intelligent Power Distribution Unit
CENTIEL – Tier III Ready Data Centre by Cannon Technologies in Conjunction with CENTIEL
SolarEdge Critical Power – CyberSecure cloud monitoring for UPS systems

Power Project of the Year
Sponsored by Omicron
Shortlisted:
Vertiv – Installation of UPS and cooling technologies at Green Mountain’s DC1-Stavanger Data Center, Norway
Sunpower – Project Green Heat
CENTIEL – Keysource in association with CENTIEL, installation of UPS and Li-ion battery technology to the Cell and Gene Therapy Centre, UK

LIGHTING

Lighting – Product of the Year
Sponsored by Schneider Electric
Shortlisted:
Thorn Lighting – Plurio LED
Trilux – E-Line NEXT LED modular continuous line
Delmatic – DALI-2 SpaceApp Microsensor
PowerLed – BLADE2 M

Lighting – Project of the Year
Sponsored by Schneider Electric
Shortlisted:
Ecolighting – Birmingham Botanical Gardens
Zumtobel – University of Lincoln Medical School
Delmatic – University College Hospital, Grafton Way Building

FIRE SAFETY & SECURITY

Fire Safety & Security – Product or Project of the Year
Shortlisted:
Firexo – fx73 & fx51
Aico – Ei1000G SmartLINK Gateway
Protec Fire Detection PLC – Cirrus HYBRID
Danfoss Drives – development of the first certified fan/VSD packaged solution for smoke control in buildings, in partnership with Fläkt Woods

INNOVATIVE

Innovative Project of the Year
Sponsored by Aico
Shortlisted:
RiT Tech Ltd – Bezeq International Colocation Data Center
Innovolo – Clean Pig Robotic Pipe Cleaning and Testing Machine Project
CSC IT Center for Science – LUMI EuroHPC data centre project



Energy Efficiency – Product of the Year
Sponsored by Yuasa
Shortlisted:
Iceotope – Ku:l Extreme Micro Data Centre
Schneider Electric – Galaxy VL
Kohler Power – EPA-Certified Tier 4 Final KD Series Generator

Data Centre Cooling – Product of the Year
Sponsored by Bachmann
Shortlisted:
Iceotope – Ku:l Extreme Micro Data Centre
EkkoSense – Cooling Advisor
Aria – DENCO fan wall unit

Energy Efficiency – Project of the Year
Sponsored by Yuasa
Shortlisted:
Vertiv – Energy Savings as a Service (ESaaS) across Telefónica sites in Europe and LATAM
Vertiv – UPS and cooling technologies at Green Mountain's DC1-Stavanger Data Center, Norway
Thorn Lighting – Newport City Council
Delmatic – University College Hospital, Grafton Way Building


Data Centre Cooling – Project of the Year
Sponsored by Bachmann
Shortlisted:
Sudlows – HD cooling solution at Science Technology Facilities Council, Swindon
Vertiv – UPS and cooling technologies at Green Mountain's DC1-Stavanger Data Center, Norway
EkkoSense – Cooling solution and monitoring for Virgin Media O2


Sustainable – Project of the Year
Sponsored by Aico
Shortlisted:
Danfoss Drives – Aarhus Water Limited energy neutral project
Vertiv – Energy Savings as a Service (ESaaS) across Telefónica sites in Europe and LATAM
ServerFarm – TOR-1 Toronto Data Center, 300 Bartor Road
Vertiv – UPS and cooling technologies at Green Mountain's DC1-Stavanger Data Center, Norway

Data Centre Colocation Supplier of the Year
Sponsored by Vertiv
Shortlisted:
IP House
Green Mountain
Vantage Data Centers
Custodian Data Centres

DATA CENTRE DESIGN & BUILD

Data Centre Design & Build – Product of the Year
Sponsored by Centiel
Shortlisted:
Iceotope – Ku:l Extreme Micro Data Centre
Bachmann – BlueNet Intelligent Power Distribution Unit
CENTIEL – Tier III Ready Data Centre by Cannon Technologies in Conjunction with CENTIEL

Data Centre Design & Build – Project of the Year
Sponsored by EnerSys
Shortlisted:
Sudlows – Ashton Old Baths
Green Mountain – 2.5 MW Mountain Hall Data Center Expansion
Schneider Electric – Digital Transformation project for Newcastle City Council
Datacentre UK – Turnkey DC project
CENTIEL – Keysource in association with CENTIEL, installation of UPS and Li-ion battery technology to the Cell and Gene Therapy Centre, UK

OUTSTANDING ACHIEVEMENT

Technical Leader of the Year
Sponsored by EnerSys
Shortlisted:
Simon Binley, Head of Project Management for the Wellcome Sanger Institute
Jason Yates, Technical Services Manager, Riello UPS Ltd
Dave Breith, CEO, Firexo

Consultancy/Contractor of the Year
Sponsored by ECA
Shortlisted:
Slatters Electrical
Data Centre UK
Oper8 Global
2bm Limited

Outstanding Project of the Year
Sponsored by Riello UPS

Outstanding Product of the Year
Sponsored by Riello UPS



AKSA DCC Generators power Cyberfort Group’s The Bunker Facility


At AKSA Power Generation, we are very proud to be the manufacturer selected to meet the power needs of Cyberfort Group's The Bunker data centre. Cyberfort Group exists to give its clients peace of mind about the security of their data and the compliance of their businesses, and AKSA Power Generation became an important part of this data centre, providing 700 kVA generators.

Michael Watts, Director of Infrastructure & Technology at Cyberfort Group, said, "These units are taking pride of place, as a key part of our power infrastructure, assisting in securing the continuity of our services, adding layers of resilience to the power infrastructure at our Ash (Kent) facility, The Bunker. These units were built to a custom specification and design that meet our needs and exceed our expectations."

AKSA Power Generation offers 65 models in our DCC product range, rated between 550 and 3,000 kVA and suitable for the Tier III and Tier IV standards set by the Uptime Institute. Regardless of the power rating or the complexity of your data centre's power needs, we are able to provide a reliable power source. We manufacture and combine all the important components using the industry's highest level of design and performance control.

sales@aksaeurope.com • www.aksaeurope.com

Automated data centre inventory tracking with custom UHF RFID and NFC labels


A worldwide ICT company needed to automate how its servers were tracked and managed. With thousands of high-value ICT assets in play, the ability to report without error on real-time asset whereabouts proved essential for both commercial success and compliance. In addition, the company was looking for ways to improve the speed and accuracy of cable maintenance interventions.

Brady Corporation suggested the solution: automated, real-time asset tracking with passive, custom on-metal UHF RFID and NFC labels. Relevant asset locations, time-stamps and other data are available in real time at the click of a button. Staff no longer have to count assets manually and can assess a site's entire ICT inventory in a couple of hours instead of weeks. The data also enables the company to prevent errors in asset movement through automatic alerts generated by the supporting software, which increases overall efficiency and decreases labour costs. Additionally, compliance with various regulations worldwide is easier when the whereabouts of the entire ICT inventory are available almost immediately in a central location.

csuk@bradycorp.com • www.brady.co.uk
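To illustrate the kind of reconciliation such a system performs, the sketch below compares scanned tag locations against an asset register and raises alerts for moved, unknown or missing assets. The names, data and structure are purely hypothetical, not Brady's actual software.

```python
# Illustrative sketch only: reconciling RFID reader scans against an
# asset register. All names and data here are hypothetical.

# Expected location of each tagged asset, per the register
ASSET_REGISTER = {
    "tag-001": "rack-A1",
    "tag-002": "rack-A2",
    "tag-003": "rack-B1",
}

def reconcile(scans):
    """Compare scanned (tag, location) pairs with the register.

    Returns alert strings for tags seen away from their registered
    location, unknown tags, and registered tags not seen at all.
    """
    alerts = []
    seen = set()
    for tag, location in scans:
        seen.add(tag)
        expected = ASSET_REGISTER.get(tag)
        if expected is None:
            alerts.append(f"unknown tag {tag} at {location}")
        elif location != expected:
            alerts.append(f"{tag} moved: expected {expected}, seen at {location}")
    for tag in ASSET_REGISTER:
        if tag not in seen:
            alerts.append(f"{tag} missing from scan")
    return alerts

print(reconcile([("tag-001", "rack-A1"), ("tag-002", "rack-B3")]))
# → ['tag-002 moved: expected rack-A2, seen at rack-B3', 'tag-003 missing from scan']
```

In a real deployment this comparison runs continuously against reader feeds, which is what makes the "automatic alerts" described above possible.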

Uptime Institute approves range of innovative Tier III-ready data centre designs


A range of innovative Tier III-ready data centre designs has now been fully approved by the Uptime Institute. The team at Cannon Technologies, working in conjunction with several OEM manufacturers, including Centiel UK, has spent more than three years developing a pre-certified set of solutions, now available in ratings of 100 kW, 250 kW, 500 kW and 1 MW.

Mark Awdas, Engineering Director, Cannon Technologies, said, "The first 100kW data centre solution using a 2N design configuration was completed and approved in 2019, and since then we have worked closely with our industry partners to make various Tier III data centre designs available for facilities requiring different sized systems. This award now provides our customers with fact-based evidence that their final build will more easily obtain their final Tier Certification.

"We now have four Tier III-ready modular DC designs described as 'concurrently maintainable', which ensures that any component can be taken out of service without affecting production."

Centiel's leading three-phase, true modular UPS, CumulusPower, together with its technical support, was chosen for incorporation into the designs. The technology offers the highest levels of resilience, is flexible and robust, and has been tried and tested in many scenarios.

sales@centiel.co.uk • www.centiel.co.uk



Go beyond building automation


The ILC 2050 BI industrial controller from Phoenix Contact is ideal for the most demanding applications in buildings, infrastructure and data centres, using the Niagara 4 Framework. The integrated Niagara Framework enables IIoT-based automation through standardisation of various data types, making it easy to connect with various sensors and actuators regardless of manufacturer and communication protocol.

• Minimised commissioning costs thanks to different protocols on one controller
• Easy programming using drag-and-drop within the Niagara 4 Framework
• Real-time, on-premise analytic control thanks to integrated Niagara Analytics
• Cost-effective operation thanks to web-based maintenance, monitoring and programming
• A compact footprint in the panel, and therefore cost savings

Go beyond building automation with the ILC 2050 BI, providing industrially hardened control and modular I/O running the Niagara 4 Framework.

info@phoenixcontact.co.uk • www.phoenixcontact.co.uk

G-Core Labs introduces full-featured Managed Kubernetes service


G-Core Labs has announced Managed Kubernetes, a new full-featured, highly secure PaaS offering that allows enterprises to deploy Kubernetes clusters without the complexity of managing the control plane and containerised infrastructure. Available in more than 15 regions worldwide, Managed Kubernetes enables customers to quickly access ready-to-use Kubernetes clusters and automate application scaling and deployment in its secure cloud environment.

As containerised environments spread across multiple public cloud providers and data centre locations, the resulting complexity undermines the acceleration advantages that fuelled container adoption in the first place. G-Core Labs' availability in more than 15 regions eliminates this complexity by managing the secured service on users' behalf.

Managed Kubernetes enables customers to automate container management, scaling and updating. It accelerates and simplifies the development, testing and deployment of new application versions into G-Core Labs' highly secure cloud. Customers can get access to a new Kubernetes cluster in minutes, with one click via the dashboard or through APIs or HashiCorp Terraform. Additionally, Managed Kubernetes provides DDoS protection, giving all customers protection at the network and transport levels by default, plus a Web Application Firewall (WAF) to guard against attacks at the application level.

info@gcore.lu • www.gcorelabs.com

Neterra prevented more DDoS attacks in 2021

Neterra prevented nearly six times as many DDoS attacks on its customers in 2021 as in 2020, the company has said. Between January and December 2021, Neterra stopped a total of 1,105,456 DDoS attacks, compared to 197,870 in 2020. According to data from the company's monitoring system for DDoS protection services, this is an increase of 459%.

DDoS attacks are designed to disrupt access to the services or equipment of a specific, targeted company in order to steal data, money or intellectual property. Neterra offers a variety of solutions to prevent DDoS attacks, as well as a system that proactively monitors the state of the network.

Neterra customers who use the company's DDoS protection get access to a unified system for monitoring DDoS services. Through it, customers can see attack statistics in real time, generate periodic reports – for example, how many attacks were stopped over a given period – and change the defence mode.

contact@neterra.net • www.neterra.net
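The scale of the jump follows directly from the two annual totals; a quick check using the figures quoted above:

```python
# Year-on-year change in blocked DDoS attacks, from the article's totals
attacks_2021 = 1_105_456
attacks_2020 = 197_870

# 2021 as a multiple of 2020, and the percentage increase
ratio = attacks_2021 / attacks_2020
increase_pct = (attacks_2021 - attacks_2020) / attacks_2020 * 100

print(f"{ratio:.1f}x the 2020 total, a {increase_pct:.0f}% increase")
# → 5.6x the 2020 total, a 459% increase
```

Note the distinction: 2021's total is roughly 559% *of* the 2020 figure, which corresponds to an *increase* of about 459%.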




Equal footing

Gender equality is critical to the technology sector's continuing growth, says Lorraine Wilkinson, Regional Vice President of Sales UK at Equinix.


Gender equality within the workplace remains a long-standing issue, and the tech industry is no exception. Women in tech are largely underrepresented: as it stands, just 19% of the tech workforce is female, according to a recent report published by Tech Nation.

For many, International Women's Day in March presents a moment to reflect on and celebrate the contributions women have made in forging gender equality. It is also a time to recognise that our work is not done, and that we have a commitment to build on the efforts made so far, so we can continue to make a positive difference.

Women have historically been more hesitant to apply for tech roles, even when they do in fact have the transferable skills and capabilities needed for such positions. This is partially due to the lack of female role models, the portrayal of tech as a primarily male sphere, and the historically small proportion of women studying STEM subjects.

However, there is positive change in education. The number of women accepted onto full-time STEM undergraduate courses in the UK increased by 50.1% between 2011 and 2020, with women's share of full-time undergraduate STEM entrants rising from 33.6% to 41.4% over the same period. This is a big step in the right direction, but the advances come from a very low and disproportionate base.

So, what's holding them back? Word processing was once seen as women's work in the 1970s, but over time there has been a shift, with STEM subjects and the resulting careers being perceived as masculine pursuits. What's more, certain areas of the tech industry can demand long hours and increased commitment, which can pose more of a challenge for women, who may perform care work, such as childcare and informal adult care. In recent years, this was exacerbated by the Covid-19 pandemic.

The underrepresentation of women in tech is also reflected in the industry's leadership, as 77% of tech director roles are held by men. 
And these figures have remained almost unchanged since 2000, despite various drives to improve gender disparity. Given these statistics, many companies have prioritised bringing more women into the industry.

Improving diversity and inclusion is important at Equinix. We recognise that currently just 22% of our staff are women, with this figure at 19% in operations. We want to make a real difference in changing this imbalance and support women who have



the skills and capabilities to enter the tech industry, but might lack the confidence and support to do so.

To address this, Equinix recently launched an initiative focused on helping women return to work after a career break. The business is especially keen to support those whose jobs have been negatively affected by the Covid-19 pandemic. The 'I Am Remarkable' programme targets candidates from outside the industry by encouraging them to recognise the value of transferable skills, often considered irrelevant by those entering the tech sector for the first time. Successful applicants are then hired into full-time paid roles and given on-the-job skills training to become data centre technicians. The aim is to open up more opportunities for women to work within our own business, and to join the wider sector, without the need for previous technical training or a degree in a STEM subject.

If we are to turn the tide on the lack of female representation, we must encourage more women to join the industry by removing the outdated and counterproductive stereotype that tech is a more suitable career for men. Evidence shows that a gender-balanced organisation reaps far greater rewards, both financially and in productivity: far lower employee turnover, a greater pool of talent to choose from, and a more diverse set of skills within a thriving organisation. We know from our own experience that women often offer a different emotional perspective from men. Because of this, they can be very effective and empathetic communicators, which makes them excellent candidates for managing teams and customer relationships.

Over the past decade, the Equinix Women Leaders Network (EWLN) has grown globally to support women through gender equity and multiple education initiatives. EWLN has been promoting, connecting and empowering female leaders through ongoing programmes for professional growth, visibility and cross-functional networking. 
Whilst the tech industry is attempting to make strides towards closing the gender gap, progress is disappointingly slow. Companies must develop new and creative approaches to hiring more women in order to expedite gender parity, and ensure that more women, whatever their backgrounds, are encouraged to join the tech industry. We all need to commit to a more diverse employee profile that better reflects today's society. This approach will ultimately deliver far greater benefits to individual businesses and to the continuing growth of the sector.



