DCR 2021 Q1


Q1 2021

www.datacentrereview.com

DENCO® Products

Hydro-DENCO® Refrigerant Free Sensible Cooling

DATA CENTRE COOLING

Cooling
The perfect pair: Liquid cooling and the digital twin. 10

UPS/Standby Power
Is gas the future of data centre standby power? 16

Telecoms/5G
How edge colocation could be the answer to delivering 5G services. 20



Contents

News
04 • Editor’s Comment New year, new stuff.
06 • News The latest stories from the sector.

Features
10 • Cooling Dave King of Future Facilities explains how a combination of liquid cooling and a digital twin could help businesses achieve their green goals, alongside a raft of other worthwhile benefits.

16 • UPS/Standby Power Martin Byrne of Finning UK & Ireland explains why gas is the future of data centre standby power, and what operators must consider.

20 • Telecoms/5G John Hall, managing director at Proximity Data Centres, explores how edge colocation could be the answer to delivering 5G services.


24 • Cloud Computing Neville Louzado of Hyve Managed Hosting dispels the myth that companies need to choose between a public and private cloud, and espouses the potential benefits of a hybrid solution.

Regulars
28 • Industry Insight You’d think when it comes to cable specification, the trusty 18th edition would be a wealth of information and best practice, right? Well, apparently not. Alex Smith of Flexicon tells us more.


30 • Final Say Power Control explores the ever upward trajectory and resiliency of the data centre market and highlights the advances in UPS technology that will pave the way for further success, not only now, but in the future.




Editor’s Comment

It’s been a while, hasn’t it? I’ve not written one of these since November and sadly, not a great deal has happened since – is what you’d say if you didn’t work at Data Centre Review. Not only was I struck down for what seemed like an eternity with this sordid illness, I do happen to work at Data Centre Review, and there has been quite a bit going on – my near death aside.

First off, if you haven’t already had a chance to have a look, check out our brand-new DCR website. Last year, we revamped the Electrical Review website, so we felt it time its little sister got the same treatment. Got to keep the kids equal now, don’t you? So not only have we revamped both Electrical Review and Data Centre Review in print (and upped our output of DCR to four times annually), but they’ve now both had a digital makeover. They grow up so fast.

To accompany our digital progression, we fully intend on expanding our digital offerings further. With Covid as a catalyst, digital has certainly been the way forward for many companies. And if it hasn’t, unfortunately, many of those companies are no more. And although we quickly adapted in the wake of the pandemic, with many of our interviews taking place remotely (where you can witness my ever-expanding roots and general shortage of professional hair care), we felt the need for something a little more regular and editorially-filled.

This is why, hopefully in the very near future, we will be launching our very own DCR podcast. Said podcast will be hosted by myself, the more reluctant, sarcastic one (think Bernard Black of Black Books and you’re just about there), and my beautiful, far cheerier and more optimistic co-host, Jordan O’Brien. The yin to my yang. Jordan and I will generally be chewing the fat on all things tech: whether that’s current goings-on, the latest innovations, or just a good rant, there will (fingers crossed) be a segment for everyone.
We will also be welcoming guests, so if you fancy getting something off your chest, or have a professional opinion to share with our audience, we’d love to hear from you. So, if you have any suggestions surrounding any hot topics for discussion, please don’t hesitate to get in touch via clairef@datacentrereview.com.

Claire Fletcher, Editor

EDITOR

Claire Fletcher clairef@datacentrereview.com

CONTRIBUTING EDITOR

Jordan O’Brien jordano@sjpbusinessmedia.com

DESIGN & PRODUCTION

Alex Gold alexg@sjpbusinessmedia.com

GROUP ACCOUNT DIRECTOR

Sunny Nehru +44 (0) 207 062 2539 sunnyn@sjpbusinessmedia.com

ACCOUNT MANAGER

Kelly Baker +44 (0)207 0622534 kellyb@electricalreview.co.uk

PUBLISHER

Wayne Darroch

PRINTING BY Buxton

Paid subscription enquiries: subscriptions@electricalreview.co.uk
SJP Business Media, 2nd Floor, 123 Cannon Street, London, EC4N 5AU
Subscription rates: UK £221 per year, Overseas £262

Electrical Review is a controlled circulation monthly magazine available free to selected personnel at the publisher’s discretion. If you wish to apply for regular free copies then please visit: www.electricalreview.co.uk/register

Electrical Review is published by

2nd floor, 123 Cannon Street London EC4N 5AU 0207 062 2526 Any article in this journal represents the opinions of the author. This does not necessarily reflect the views of Electrical Review or its publisher – SJP Business Media ISSN 0013-4384 – All editorial contents © SJP Business Media

Follow us on Twitter @DCRmagazine

Join us on LinkedIn

4 www.datacentrereview.com Q1 2021




News

CLIMATE NEUTRAL DATA CENTRE PACT COMMITS INDUSTRY TO SERIOUS SUSTAINABILITY ACTION

The latest highlights from all corners of the tech industry.

After a decade in the works, Microsoft (almost) has all its own services on Azure

Ten years ago, Microsoft officials pledged to move Office 365 to Azure, and it looks like this goal of having all first-party services hosted on Microsoft’s own platform is soon to come to fruition. Even as recently as five years ago, Microsoft still wasn’t running some of its major services on its own Azure cloud. Since then, the company has made a concerted effort to change this, and it’s closing in on being able to claim that all its first-party services, including Office 365, Xbox Live and Bing, are running on Azure.

That said, the more services migrated, the more there is to keep an eye on, so hopefully Microsoft has some cracking managers in the pipeline. It was only back in April last year that Microsoft failed to notice a major Azure outage thanks to a sleeping manager.

Letting sleeping managers lie, however, there are a lot of reasons for any company, including Microsoft, to want to have all of its cloud services in one proverbial basket. By doing this, Microsoft and others can more quickly build new products; adhere to specific compliance needs; take advantage of cross-cloud underpinnings like the Microsoft Graph APIs; scale more quickly; use their own Azure-hosted services as proof-of-concept examples for customers; and, last but not least, save money.


Covid-19: The lasting legacy

BCS has announced the findings from its latest data centre survey, proving our industry is nothing short of resilient. A return to pre-Covid levels of confidence regarding future demand for data centres was an encouraging finding from the latest independent industry survey, which captures the views of over 3,000 senior data centre professionals across Europe, including owners, operators, developers, consultants and end users. Nearly two-thirds of respondents believe that 2021 will see an increase in demand, up on the 40% recorded last year and now back in line with the long-term trending average. Over the past six months, just over 90% of developer and investor/funder respondents reported an increase in their portfolio of technical real estate.

The Climate Neutral Data Centre Pact has committed the European cloud and data centre industry to ambitious sustainability measures, with 25 companies and 17 associations from across Europe having agreed to take specific steps to make data centres climate neutral by 2030. One year after the adoption of the European Green Deal, leading cloud infrastructure providers and data centre operators have created the Climate Neutral Data Centre Pact.

Apostolos Kakkos, chairman of the EUDCA (European Data Centre Association), commented, “Data centres are the supporting pillars of the fourth industrial revolution and, as seen during the Covid-19 pandemic, are essential infrastructure of not only the digital economy but of the entire global economy. It is our duty to commit to a self-regulatory initiative that will help to ensure the operational availability, sustainability and the future of our industry.”

The Climate Neutral Data Centre Pact establishes a Self-Regulatory Initiative which has been developed in cooperation with the European Commission. It supports both the European Green Deal, which aims to make Europe the world’s first climate neutral continent by 2050, and the European Data Strategy, by making EU data centres climate neutral by 2030.



NEWS

DIGIPLEX ACQUIRES LAND TO BUILD NEW DATA CENTRE OUTSIDE OF OSLO

DigiPlex, a Nordic leader for sustainable, innovative, and secure data centres, has secured a plot of 60,000 m2, with an option to purchase an additional 100,000 m2 in Treklyngen industrial park in Ringerike municipality outside of Oslo, from seller Follum Eiendom AS. The site has been prepared for the establishment of a data centre and an agreement with Ringerikskraft ensures the necessary power supply for the first development phase.

Blockchain for Covid vaccine storage implemented by UK hospitals in world first

Like many medications, the Covid-19 vaccine is temperature sensitive and, as such, must be stored under the right conditions. In one of the first such initiatives in the world, two UK hospitals are utilising blockchain not only to track and monitor the Covid-19 vaccine process, but to keep a watchful eye over the movement and storage of the drug, among others. Hospitals in Stratford-upon-Avon and Warwick are using the distributed ledger tech to track not only the fridges that store the Covid vaccine, but chemotherapy drugs too.

EQUINIX JOINS THE FIGHT TOWARD CLIMATE NEUTRALITY BY 2030

Surge of IT professionals seeking new jobs due to employer conduct during pandemic

Employer behaviour and work-life balance during the pandemic are driving IT professionals to look for a career move, despite fears of mass unemployment at the end of furlough. The research, which asked 2,000 people about their skills and job hunting over the last six months, revealed that even before November, 36% of IT candidates were looking to move compared to 27% before lockdown. Employees stated employer behaviour (38%), work-life balance (38%) and lockdown causing priorities to be re-evaluated as the main reasons for movement in the market.


Tech industry, assemble! In a historic pledge toward climate neutrality by 2030, Equinix has joined other European cloud and data centre providers, as well as European trade associations to form the Climate Neutral Data Centre Operator Pact and Self-Regulatory Initiative.



COOLING

The perfect pair: Liquid cooling and the digital twin



COOLING

The onus on data centre sustainability has never been higher, and the pressure is on for data centre operators to comply. Here, Dave King, product manager at Future Facilities, explains how a combination of liquid cooling and a digital twin could help businesses achieve their green goals, alongside a raft of other worthwhile benefits.

Despite all the pressures faced by businesses and governments during 2020, the focus on sustainability and environmental awareness took important steps forward. This was illustrated by the UK and incoming US governments’ promises to commit to impactful climate change policies, not to mention David Attenborough’s stark commentary on what will happen to the planet if we don’t change our collective behaviour.

As such, 2021 and beyond will see businesses push even further to achieve their sustainability goals, incentivised by government regulations, taxation and public pressure. For the data centre industry this is likely to include a tax based on energy usage and efficiency. We will likely see data centres forming new technology pairings – where separate technologies, each of which has a pre-existing function, combine to manage current and upcoming pressures. One example is the pairing of liquid cooling with the adoption of a digital twin. By uniting these two innovations, businesses will be able to achieve their sustainability goals while also improving data centre capacity and reducing operating costs. However, before we can explore how they can be used together, we must first ask why liquid cooling is the way forward.

Liquid cooling takes centre stage

Liquid cooling has long been tipped to eventually supersede air cooling, with the two co-existing in the meantime. But 2021 is the year that liquid cooling adoption is set to really take off, thanks to widespread acceptance of traditional air cooling’s deficiencies in the context of high-density systems, and of the benefits offered by liquid cooling. For example, in order to cool high-powered chips you need to move a large volume of air, which is challenging in the small confines of a server box. Direct chip liquid cooling does not share this challenge and keeps energy consumption down.

Therefore, liquid cooling is an ideal choice to combat the greater levels of heat produced by high-performance computing (HPC) and high-density workloads such as artificial intelligence (AI). From a sustainability point of view, another challenge with air is that it is difficult to transport effectively. As such, the waste heat from air cooling can usually only be used in neighbouring offices, meaning it can’t always be channelled productively to support sustainable practices by other businesses. Liquid cooling doesn’t share these limitations. While there are significant capital expenditure considerations and operational complexities attached to liquid cooling, it cuts operational costs long-term while supporting sustainability with a clear environmental benefit.
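The physical case for liquid over air rests on heat capacity and density: a litre of water carries vastly more heat than a litre of air for the same temperature rise. A minimal back-of-envelope sketch, using textbook property values and a hypothetical 30 kW rack load (not figures from the article):

```python
# Illustrative comparison of air vs. water as a heat-transport medium,
# using the sensible-heat relation Q = m_dot * cp * dT.
# Property values are textbook approximations; the 30 kW load is invented.

def flow_needed(heat_kw: float, density_kg_m3: float,
                cp_kj_per_kg_k: float, delta_t_k: float) -> float:
    """Volumetric flow (m^3/s) needed to carry `heat_kw` of heat away."""
    mass_flow = heat_kw / (cp_kj_per_kg_k * delta_t_k)  # kg/s
    return mass_flow / density_kg_m3

heat_kw = 30.0  # a single hypothetical high-density rack

# Air: density ~1.2 kg/m^3, cp ~1.005 kJ/(kg*K), 10 K temperature rise
air = flow_needed(heat_kw, density_kg_m3=1.2, cp_kj_per_kg_k=1.005, delta_t_k=10)

# Water: density ~1000 kg/m^3, cp ~4.18 kJ/(kg*K), same 10 K rise
water = flow_needed(heat_kw, density_kg_m3=1000.0, cp_kj_per_kg_k=4.18, delta_t_k=10)

print(f"Air:   {air:.2f} m^3/s")        # roughly 2.5 m^3/s of air
print(f"Water: {water * 1000:.2f} l/s")  # under a litre per second of water
```

The ratio of the two flows, several thousand to one by volume, is why moving enough air through a dense server box becomes impractical while a modest liquid loop copes easily.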

What’s more, businesses can reduce their need to invest in new infrastructure by using liquid cooling on chips and air cooling to remove the heat from the rack. Therefore, as high power densities boom over the next five years – research from MarketsandMarkets indicates the HPC market will grow by $11.6 billion between 2020 and 2025 – so too must the adoption of liquid cooling. This will be a key move for data centres striving to future-proof their infrastructure.

Unlocking the value of liquid cooling through the data centre digital twin

It’s evident that data centres can bolster their sustainability efforts by supplementing air with liquid systems, and that this is driving significant uptake. In fact, the market is set to reach a valuation of $3.2 billion by 2024, growing at a CAGR of 22.6%, according to MarketsandMarkets. However, if businesses want to achieve maximum financial and environmental return on their liquid cooling investments, they need to fully understand and manage its impact in their facility. By deploying a technology such as liquid cooling with clear performance targets and thresholds, and with a clear understanding of how these relate to other components within the infrastructure, the investment derives far greater value.

A digital twin – a 3D, virtual replica of a data centre that can simulate its physical behaviour under any operating scenario – provides this visibility and predictability. The replica allows designers and operators to test out different setups and scenarios in a safe environment ahead of real-world deployments, using science-based simulation. This ultimately enables them to make more efficient use of their data centres’ space while reducing risk. By using a digital twin to analyse the optimum set-up of a liquid cooling system and assess how it will work within the context of existing air-cooled infrastructure, businesses can minimise energy consumption. This will empower them to streamline costs, not to mention helping them reach their environmental goals.

A combined approach to meeting sustainability objectives

Building on the sustainability groundwork that was laid in 2020, governments, businesses and customers alike will strive for a more environmentally sound future in 2021. Rather than ever-increasing high-density technologies forming an obstacle to this goal, they will drive data centres to pursue new means of fulfilling their sustainability objectives. Chief among these will be the adoption of liquid cooling alongside traditional air systems, facilitated in its implementation by the digital twin. By combining these technologies, businesses will unlock cost efficiencies and make 2021 the year the data centre industry takes important steps towards greener practices.



COOLING

Climate change? Just chill

With sustainability now a major differentiator in the data centre market, organisations must find ways to ensure their facility’s carbon footprint is kept to a minimum. With cooling the second largest consumer of power within a data centre, Marc Garner, VP, secure power division, Schneider Electric UK&I, explores the crucial role of cooling in achieving data centre sustainability and why a more holistic approach could be the way forward.

Data centres are the foundational building blocks of today’s digital and electric world. In 2020, their importance was well and truly amplified, with demand for mission-critical applications and connectivity surging, and the growth of data that ensued. According to IDC’s Global DataSphere, more than 59 zettabytes (ZB) of data would be created, captured, copied and consumed across the world in 2020. This has been driven by an increase in the number of employees working from home, greater use of video communication and a surge in the consumption of downloaded and streamed video. IDC also states that the creation of new unique data, as well as the increased consumption of replicated data, has fuelled the growth of the DataSphere, which is forecast to continue at a five-year compound annual growth rate (CAGR) of 26% through 2024.

Today, data centres and the professionals that work within the digital infrastructure sector continue to play a crucial role in meeting connectivity and application demands. So significant has the data centre industry become that during the Coronavirus pandemic many engineers and service providers were named key workers, helping to ensure operational continuity for the many businesses transitioning to remote working. The sector itself is built upon a reputation for reliability, and there’s no doubt that key requirements of any data centre include performance, resiliency, security and efficiency. Yet looming large in the conscience of operators and customers is a factor that’s become more pivotal than any single facet of the technology realm. Now, with greater awareness of the impact of data centre carbon emissions on the environment, sustainability has become a key focus for the industry.

Data centre sustainability

Climate change and, in many respects, sustainability is now driving government and corporate agendas. Concern for the health of the planet is a matter of global importance and has become the responsibility of businesses and consumers alike. The Paris Climate Change Agreement



COOLING

calls for warming to be kept below 2°C, while the current projection stands at around +3°C. In January 2020, The Guardian reported that the climate crisis filled the top five places of the World Economic Forum’s risks report. By 2035 it is anticipated that IT will consume 8.5% of global electricity, up from 5% in 2018, and with data centres responsible for a large share, the industry will play a key role in driving more sustainable operations.

Looking inwardly, a recent survey of more than 800 global data centre providers by Schneider Electric and 451 Research found that nearly all respondents (97%) had customers who demanded contractual commitments to sustainable practices; a majority (57%) believed that efficiency and sustainability will be important competitive differentiators within three years – a large increase on the 26% who believed that was the case at the time of asking; and nearly half of respondents (43%) had already put in place strategic sustainability initiatives and efficiency improvements for their data centre infrastructure. Clearly market forces, as well as regulatory pressures, are encouraging data centre operators to take a strategic approach to being more efficient – but how, in practice, does the industry become more sustainable?

PUE and sustainability

A key mechanism for driving data centre efficiency is the type and management of the cooling architecture deployed. Typically, the cooling system is the second largest consumer of power after the IT equipment itself, meaning that any change here, positive or negative, can have major implications for the carbon footprint of a facility. PUE (Power Usage Effectiveness) has encouraged the adoption of strategies to improve the efficiency, as well as the effectiveness, of cooling, and today the reduction of this metric is one of the key ways that many organisations gauge their success.
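PUE itself is a simple ratio: total facility energy divided by the energy delivered to IT equipment, so a PUE of 1.0 would mean zero cooling and power-distribution overhead. A minimal worked sketch with an illustrative 1,000 kW IT load (the load figure is invented; the 2.5 and 1.6 averages are those quoted in the article):

```python
# PUE (Power Usage Effectiveness) = total facility power / IT equipment power.
# The 1,000 kW IT load is hypothetical; 2.5 and 1.6 are the industry
# averages for 2007 and today quoted in the article.

def pue(total_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: overhead-inclusive power per unit of IT power."""
    return total_kw / it_kw

it_load = 1000.0  # kW of IT equipment

old_total = it_load * 2.5  # a 2007-average facility drew 2,500 kW in total
new_total = it_load * 1.6  # a present-day facility draws 1,600 kW

print(pue(old_total, it_load))   # 2.5
print(pue(new_total, it_load))   # 1.6
print(old_total - new_total)     # 900.0 kW of overhead eliminated
```

For the same IT workload, the fall in average PUE from 2.5 to 1.6 represents 900 kW of continuous overhead removed, which is why cooling efficiency dominates the facility's carbon footprint.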
According to the Uptime Institute, “PUEs have fallen from an average of 2.5 in 2007 to around 1.6 today,” and furthermore, its 2020 annual survey found that around 95% of respondents said it is important that colocation companies have a low PUE. However, when this is compared with 451 Research’s survey, which states that only 43% of operators had a strategic sustainability programme to comprehensively improve the way they design, build and operate their infrastructure, it identifies a major gap in the sector – one that must be addressed quickly.

PUE and sustainability are clearly intertwined, but it’s becoming obvious that a more holistic approach to sustainable data centre deployment strategies is required. Today operators must also consider other factors to drive sustainability, including power purchase agreements (PPAs), with stringent site selection and planning processes where access to renewables is a deciding factor in location. Use of resource-efficient data centre designs will also be crucial, combining them with AI and vendor-agnostic management software as a mechanism to support lower PUE ratings. However, as facility designs continue to evolve and demands for more power and processing continue to drive rack densities, cooling loads will inevitably become greater, meaning an increase in potential emissions. How, then, can operators continue to feed the burgeoning appetite for digital services while maximising efficiency and minimising environmental impact?

The role of cooling in data centre energy efficiency

Traditional cooling strategies have used chillers to cool the ambient air

within a data centre, while fans or in-row air cooling can provide additional cooling to the racks. For more efficient operation, variable-speed fans can run at reduced speeds to match a lower cooling load with strategic use of containment to create ‘hot’ and ‘cold’ aisles that streamline the thermal profile and ensure efficient cooling. For years ‘air’ has been the go-to, but facility location has also become a deciding factor in cooling strategies. Based on the natural climate, further efficiencies can also be achieved by using free cooling, in which cold air coming from outside the data centre can be used to cool the interior so that chillers can be temporarily switched off or run down.
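The saving from variable-speed fans is larger than it first appears, because fan shaft power scales roughly with the cube of speed (the fan affinity laws), while airflow scales only linearly. A short illustrative sketch, with an invented 10 kW rated fan:

```python
# Fan affinity laws: flow scales linearly with speed, but shaft power
# scales with the cube of speed - which is why running variable-speed
# fans at part load saves so much energy. The 10 kW rating is invented.

def fan_power(rated_kw: float, speed_fraction: float) -> float:
    """Approximate shaft power at a given fraction of rated speed."""
    return rated_kw * speed_fraction ** 3

rated = 10.0  # kW at 100% speed

for pct in (100, 80, 50):
    kw = fan_power(rated, pct / 100)
    print(f"{pct}% speed -> {kw:.2f} kW")
# 100% speed -> 10.00 kW
# 80% speed  ->  5.12 kW
# 50% speed  ->  1.25 kW
```

Halving fan speed cuts airflow in half but cuts fan power by almost ninety per cent, so matching fan speed to the actual cooling load is one of the cheapest efficiency measures available.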

Modern operating conditions are also causing data centre operators to consider new variants of liquid cooling. With increasing power densities and concerns for the environment, the operating cost savings of liquid cooling can offset the capital costs required to prepare a data centre for its deployment. Today, for example, between 40 and 60% of the heat generated by a server can be absorbed using a Direct Liquid Cooling (DLC) approach, reducing the burden on air chiller systems and the accompanying costs. Research detailed in Schneider Electric White Paper #282, ‘Capital cost analysis of immersive liquid-cooled vs air-cooled large data centres’, found that liquid cooling can offer many tangible benefits, including higher energy efficiency, smaller footprint, lower noise pollution and up to 14% CapEx savings.

With the demand for digital services showing no signs of slowing, the need to drive sustainability while balancing greater requirements for resiliency is becoming paramount. Addressing one part of the data centre alone is not the answer, but any improvement in PUE or efficiency via cooling will amount to enormous savings – in energy usage, cost and carbon emissions alike.



Hydro-DENCO®

Sensible Cooler Range

Meeting the required ASHRAE server inlet conditions with reduced energy cost and without mechanical refrigeration.

Environmental legislation leading to the phase-down of HFC refrigerants initiated our latest development programme, giving rise to the new Hydro-DENCO® sensible cooler range. Apart from EC fans on both indoor and outdoor units, very high efficiency coils and touch-screen controls, further refinements mean that the required pump, expansion vessel and controls for the low-energy side can all be factory fitted, integral to the room unit. Hydraulic pressure losses are kept to an absolute minimum, there are no concerns with heat exchanger fouling, and the electronically commutated pump motor will have a much lower power consumption than with alternative systems. A nominal 300 kW cooling system would merely require the installation of 80 mm flow and return pipework. Units can be installed as autonomous, stand-alone systems or on a conventional pipework distribution system as required.

www.flaktgroup.com/uk appliedsystems.uk@flaktgroup.com


A very high efficiency indoor cooling coil allows the use of relatively high fluid temperatures, obtainable all year round from external evaporative coolers, particularly in temperate Northern European climates.

The design philosophy with our Hydro-Denco® systems is to achieve acceptable data centre server cooling requirements without using any mechanical refrigeration, or the high Global Warming Potential (GWP) refrigerants associated with some such systems. Units are designed to operate with low cost heat rejection such as evaporative fluid coolers, district cooling or heat reclaim systems, all with minimum complexity and risk.

• Reliable
• Energy Efficient
• Scalable
• Flexible
• Zero ODP/GWP
• Integrated Controls

Hydro-DENCO® Sensible Cooler Range

| Hydro-DENCO                    | Unit | HD 102 | HD 202 | HD 302 |
|--------------------------------|------|--------|--------|--------|
| Sensible Cooling – Hydro       | kW   | 100    | 150    | 200    |
| Sensible Cooling – CHW 10°C    | kW   | 250    | 350    | 500    |
| Control air temperature (Nom)  | °C   | 25     | 25     | 25     |
| Control air temperature (Max)  | °C   | 28     | 28     | 28     |
| Air volume flow                | m³/s | 8      | 12     | 16     |
| External static pressure       | Pa   | 50     | 50     | 50     |
| Fan motor power (Total)        | kW   | 5.0    | 8.0    | 10.0   |
| Total System Power Input       | kW   | 6.6    | 10.8   | 12.2   |
| Fluid Flow Rate (Max)          | l/s  | 5.0    | 7.5    | 10.0   |
| Fluid Temperature (Max)        | °C   | 27     | 27     | 27     |
| Hydro-DENCO Pressure Loss      | kPa  | 45     | 50     | 45     |
| Fluid Cooler Pressure Loss     | kPa  | 45     | 48     | 28     |
| Fluid Concentration            | %    | 25     | 25     | 25     |
| Connection Size (Nom)          | mm   | 65     | 65     | 80     |
| Hydro-DENCO Unit Total Height  | mm   | 3200   | 3200   | 3200   |
| Hydro Coil Section Height      | mm   | 2500   | 2500   | 2500   |
| Hydro Fan Section Height       | mm   | 700    | 700    | 700    |
| Hydro-DENCO Unit Width         | mm   | 1995   | 2500   | 3100   |
| Hydro-DENCO Unit Depth         | mm   | 900    | 900    | 900    |
| Hydro-DENCO Unit Weight        | kg   | 700    | 850    | 1400   |
| Matched Fluid Cooler Height    | mm   | 1690   | 1690   | 2521   |
| Matched Fluid Cooler Width     | mm   | 1130   | 1130   | 2260   |
| Matched Fluid Cooler Length    | mm   | 2066   | 3466   | 3970   |
| Fluid Cooler Weight (Wet)      | kg   | 640    | 1008   | 2460   |
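One way to read the spec sheet above is the ratio of sensible cooling delivered to total system power input, a rough efficiency figure for each unit size. A back-of-envelope sketch using only the table's own numbers (this ratio is an illustrative derived figure, not a vendor-published metric):

```python
# Rough cooling-delivered-per-kW-input ratio for the three Hydro-DENCO
# sizes, taken from the 'Sensible Cooling - Hydro' and 'Total System
# Power Input' rows of the table above. Back-of-envelope only; this is
# not an official manufacturer efficiency rating.

units = {
    "HD 102": {"cooling_kw": 100, "input_kw": 6.6},
    "HD 202": {"cooling_kw": 150, "input_kw": 10.8},
    "HD 302": {"cooling_kw": 200, "input_kw": 12.2},
}

for name, u in units.items():
    ratio = u["cooling_kw"] / u["input_kw"]
    print(f"{name}: {ratio:.1f} kW of cooling per kW of input power")
# HD 102: ~15.2, HD 202: ~13.9, HD 302: ~16.4
```

Ratios in the mid-teens, versus low single digits for a typical refrigeration-based system, are the arithmetic behind the "reduced energy cost without mechanical refrigeration" claim.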


UPS/STANDBY POWER

Why gas is the future of data centre standby power

Data centre expansion will add over one and a half million tonnes to Ireland’s carbon emissions by 2030, according to the Irish Academy of Engineering. With emissions on the rise, using gas to power standby gensets could help reduce harmful pollutants generated by data centres. Here, Martin Byrne, sales manager for Gas Power Solutions at energy and transportation expert Finning UK & Ireland, explains why gas is the future of data centre standby power and what operators must consider.

Critical facilities like data centres require optimal, uninterrupted power 24 hours a day. Any lapse in power is an issue – it can result in files being lost or corrupted, mainframes malfunctioning and money being lost. Backup generators provide vital standby power, keeping the site online during outages.

Data centres can produce power themselves using prime gas gensets, or as part of combined heat and power (CHP) systems. Generating electricity on site is preferable to relying entirely on what can be a flexible network connection. For these sites, gas is cheaper and more efficient than running a diesel prime generator. However, many data centre operators use diesel to power their standby gensets because these machines can accept significant load steps quickly after an outage. The priority is keeping the system running to prevent outages that cost customers money. Diesel generators offer long runtimes and require minimal maintenance because they don’t have carburettors or spark plugs, meaning that operators can be confident the site will keep running at minimal cost. Nonetheless, there are signs of a growing trend towards using natural gas to power backup generators as well as prime.

Reducing emissions

One reason gas is becoming a more attractive option for data centres is the demand for clean energy. The Irish Government has set itself the ambitious goal of cutting greenhouse gas emissions each year and becoming carbon neutral by 2050. For data centre operators, this may mean selecting a more environmentally-friendly fuel to power their generators. As previously mentioned, gas is already being used for prime and continuous power generation. In 2017, a major computing company announced it would install 16 gas-powered gensets to provide up to 18 MW of electricity to one of its sites in Dublin. However, these sites might still



UPS/STANDBY POWER

use diesel to power backup generators – emitting harmful pollutants. Installing gas generators for standby power will make it easier for companies to comply with emissions standards. For instance, under the UK climate change agreement (CCA) scheme, data centre operators have been expected to reduce their power usage effectiveness (PUE) by 15%. While natural gas is itself a greenhouse gas – methane – it produces fewer pollutants than diesel when burned. According to Clean Energy Fuels, natural gas produces 99% less sulphur oxide (SOx), 80% less nitrogen oxide (NOx) and 40% less carbon dioxide (CO2) than diesel.

Capacity market

The trend towards gas is also being driven by the grid. Data centres are power hungry, which can put significant pressure on the surrounding network. Because of growing power demand, companies and sites in the UK, including data centres, are being offered flexible contracts where they agree to provide supplementary power to the grid or self-generate their own power. In 2020, a UK Government consultation proposed making it easier for companies to bid for these contracts by reducing the minimum capacity threshold from 2 MW to 1 MW.
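The quoted Clean Energy Fuels percentages translate directly into estimated savings for any given diesel baseline. A small sketch applying them to invented annual baseline figures (the baselines are hypothetical for illustration; only the reduction percentages come from the article):

```python
# Applying the Clean Energy Fuels reduction figures quoted above
# (gas vs. diesel: 99% less SOx, 80% less NOx, 40% less CO2) to a
# hypothetical annual diesel-genset emissions baseline. The baseline
# quantities are invented for illustration, not measured data.

reductions = {"SOx": 0.99, "NOx": 0.80, "CO2": 0.40}

# Hypothetical annual emissions for a site's diesel standby fleet (kg)
diesel_baseline_kg = {"SOx": 10.0, "NOx": 120.0, "CO2": 50_000.0}

for pollutant, cut in reductions.items():
    gas_kg = diesel_baseline_kg[pollutant] * (1 - cut)
    print(f"{pollutant}: {diesel_baseline_kg[pollutant]:.0f} kg diesel "
          f"-> {gas_kg:.1f} kg gas")
```

The asymmetry matters for compliance: the near-total SOx and large NOx cuts address local air-quality limits, while the smaller CO2 cut is what counts toward carbon-reduction targets.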

Like diesel models, gas generators can synchronise with the grid, generating on-site power in conjunction with the incoming supply. They can do this by producing a constant output and providing a base load that is potentially cheaper than power from the network. If the site purchases its power from the grid, gas gensets can provide additional capacity when required. Either way, by generating supplementary power on-site, data centres can help balance out the grid when it reaches its limits. As the capacity market grows along with the incentives to generate power on-site, it will be easier for data centres to make the switch to gas standby systems, because sites will have already installed the required infrastructure, such as gas mains valves. Consequently, the jump will be far smaller than if operators were making a clean break from, say, diesel prime.

Engineering and maintenance

While it may be tempting to switch straight from diesel to gas, it’s not as simple as converting a generator. Operators will need a new machine with low hours that has been designed to run on gas. Another consideration is how the gas standby genset will fit into the data centre. It could require a different UPS design than a diesel generator, meaning that some ancillary parts may need replacing before the genset can function. There is also size to consider – the power-to-weight ratio is lower for

gas gensets, meaning that the machine will often be larger than a conventional diesel model. Sizing is critical because oversizing a system can increase costs. The genset will take up more space, have higher operating costs and consume more fuel than a smaller machine. Undersizing will result in inadequate backup power supply, meaning the genset could stall or fail to start. To address the balance between space and power, engine manufacturers have released products with high power densities. Power density measures the number of kilowatts produced in relation to size – the higher the power density, the higher the output from the plant room. The trend towards gas will gradually drive manufacturers to produce machines with good power-to-size balances, as supply catches up with demand. When designing a generator for a data centre, consider fuel alternatives early because the fuel will often determine maintenance regimes. For instance, servicing is crucial for gas generators because of the mains supply that is fed into the machine. Sometimes, these gensets might need to be checked weekly. If operators are unsure about servicing requirements, they can consult the manufacturer’s recommendations.
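The sizing trade-off described above can be sketched numerically. This is a minimal, hypothetical check – the 25% headroom margin and the 2x oversizing threshold are illustrative assumptions, not manufacturer guidance:

```python
def power_density(rated_kw: float, footprint_m2: float) -> float:
    """kW of output per square metre of plant-room floor space."""
    return rated_kw / footprint_m2

def sizing_check(rated_kw: float, peak_load_kw: float, margin: float = 0.25) -> str:
    """Classify a genset rating against peak site load plus headroom."""
    required = peak_load_kw * (1 + margin)
    if rated_kw < required:
        return "undersized"   # risks stalling or failing to start
    if rated_kw > 2 * required:
        return "oversized"    # extra space, fuel and running cost
    return "ok"

print(power_density(1000, 40))   # 25.0 kW per m2
print(sizing_check(1500, 1000))  # 'ok': 1500 kW covers 1250 kW required
```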

According to Clean Energy Fuels, natural gas emissions produce 99% less sulphur oxide (SOx), 80% less nitrogen oxide (NOx) and 40% less carbon dioxide (CO2) than diesel.

While Ireland's data centre expansion is forecast to contribute over one and a half million tonnes of carbon emissions, an early trend towards gas standby gensets looks promising for the reduction of harmful pollutants. If it is a trend, it's being driven both inside the data centres with the reduction of greenhouse gases and outside from the grid. Could gas be the answer for your data centre's power needs?

Q1 2021 www.datacentrereview.com 17


UPS/STANDBY POWER

Reaping the rewards of remote UPS monitoring

With Covid-19 restrictions likely to remain in place for weeks or even months to come, Chris Cutler, Riello UPS business development manager, explains why remote UPS monitoring remains more important than ever.


For many businesses, 2020 marked what is likely to become a permanent shift from analogue to digital, with the cloud becoming the key tool in keeping teams connected. Data centre operators have demonstrated incredible adaptability and resilience meeting this rising demand during the coronavirus crisis, as companies and communities have become increasingly reliant on online solutions in both their professional and personal lives. The sector has had to do all this while coping with reduced staffing levels, complying with ever-changing government guidance, and committing to making their facilities Covid-secure. It's likely that restricted site access, social distancing and other safety precautions are here to stay, in the short-to-medium term at the very least. Limited physical access to sites makes the ability to remotely monitor crucial infrastructure and equipment almost priceless. For the essential UPS systems safeguarding data centres from damaging downtime, cloud-based remote monitoring provides the equivalent protection of having a 'virtual' power engineer onsite round the clock, keeping watch of your ultimate insurance policy.

UPS monitoring: The basics
The most basic form of monitoring is based on voltage-free (also known as dry) contacts, in the guise of a set of terminals on the UPS or fitted using a slot-in accessory card. It provides binary 'true/not true' answers to simple statements. In terms of a UPS, that could be 'is there a mains failure?' or 'is the UPS running on battery?'. Whilst this may suffice for a small office or workshop, mission-critical settings like a data centre call for a far more sophisticated network-based approach, either locally using ethernet connections or over the internet. The first method incorporates an RS-232 connection to provide single-ended signalling, a relatively simple yet resilient approach. Then there's Modbus, the open protocol that's become the most common way of connecting industrial electronic devices. It enables serial communication from a single RS-232 or RS-485 connector and allows for the creation of a hierarchy on the network.
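The 'true/not true' nature of dry-contact monitoring can be sketched in a few lines. The contact names below are illustrative, not a standard:

```python
# Each voltage-free ('dry') contact answers one binary question about the UPS.
def active_alarms(contacts: dict) -> list:
    """Return the names of any closed (asserted) contacts."""
    return [name for name, closed in contacts.items() if closed]

state = {"mains_failure": True, "on_battery": True, "low_battery": False}
print(active_alarms(state))  # ['mains_failure', 'on_battery']
```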

Limited physical access to sites makes the ability to remotely monitor crucial infrastructure and equipment almost priceless.

Similarly, there's Profibus, a faster version of Modbus, which is increasingly used to monitor automation technology. The final option is ideal for advanced data centre UPS monitoring: you can equip your UPS systems with Simple Network Management Protocol (SNMP) capabilities. In layman's terms, this allows you to monitor and even control the UPS remotely from a central location. How does this work in practice? Each UPS on your network gets fitted with an adapter that connects it to management software. This means the UPS can 'talk' (send information and transmit data) and 'listen' (receive external commands). In the case of the former, that would be something like an alarm triggering a fault message, while the latter would be initiating system shutdown scripts when a power failure occurs.

Round the clock protection
A cloud-based SNMP remote monitoring service links into a remote service centre manned 24/7 by highly trained technical engineers. All communications between the UPSs on your network and the service centre are SSL encrypted for security. Most uninterruptible power supplies these days will perform a range of automated tests every 24 hours, while the service centre also remotely polls the UPS at regular intervals (e.g. daily or weekly). If there's a significant change to the operating conditions, for example a mains power failure, overload, or a fault with the UPS itself, alarms will trigger immediately. When this happens, the remote monitoring software automatically sends an email or SMS notification to key personnel and first responders, as well as alerting the service centre so engineers can carry out further remote diagnostics and drill deeper into the problem. If need be, the service centre arranges for field engineers to attend the site armed with the correct parts to fix the issue. And with the most serious faults, they can initiate emergency shutdown scripts for all connected devices. Under normal day-to-day operating conditions, the service centre and data centre staff can interrogate historical alarm logs and statuses to produce informative performance reports.

Remote monitoring rewards
For data centre operators, remote monitoring of their uninterruptible power supplies is a no-brainer, as it reduces the risk of a failure that could lead to damaging service downtime. The first benefit is obvious. If something goes wrong with your UPS, you know about it immediately thanks to the automated notifications that trigger when there's a fault. You aren't left waiting for onsite staff to see or hear the alarm. Just think: if something goes wrong outside of normal working hours or at an unmanned location, how long will it take for someone to physically notice the issue? Remote monitoring enables you – and your service centre – to quickly spring into action and get the fault fixed. This leads to the second advantage. You're aware there's an alarm, but what has really gone wrong? By remotely interrogating the UPS, engineers in the service centre can analyse valuable information that allows them to make an informed diagnosis of the problem. In turn, the engineer attending the site to conduct repairs has the information – and necessary spare parts – to give them the optimum chance of a speedy first-time fix. Of course, preventing faults in the first place is far better than the cure of having to fix them. And remote UPS monitoring helps data centre operators detect – and solve – many issues before they have the chance to grow into something more serious. Say the UPS alarms during an automated battery test, and further investigation suggests a weakening block of batteries is the reason. Promptly replacing those batteries ensures the entire set won't fail, eliminating the chance of a potentially catastrophic outcome. Cloud-based remote monitoring minimises the number of onsite service visits you'll need, reducing your maintenance costs, while it also allows staff to carry out many of their key day-to-day duties from the safety of their own home, without having to physically attend site. There's one final positive to consider – one that proves remote monitoring shouldn't just be seen as an additional insurance policy to mitigate against something going wrong. That's because it can actually boost day-to-day performance and improve operational efficiency too. How so? During 'normal' conditions, you can build up an insight into the UPS's performance over time by interrogating its historical alarm and status logs. Now, if you're a big data centre with lots of UPS systems on the same network, that's a huge amount of valuable information to examine. Crunching the data can help you to optimise load management and identify other areas for improvement. In increasingly uncertain times, remote UPS monitoring offers data centre operators the reassurance that they've always got a 'virtual' power engineer onsite 24 hours a day, making sure their essential standby power systems are in full working order.
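The monitoring loop described above – poll the UPS, compare readings against thresholds, and notify on change – can be sketched as follows. The status fields, thresholds and alert wording are hypothetical, not taken from any vendor's software:

```python
def check_ups(status: dict, previous: dict) -> list:
    """Compare a polled UPS status against the last poll; return new alarms."""
    alarms = []
    if status["on_battery"] and not previous.get("on_battery"):
        alarms.append("mains power failure")
    if status["load_pct"] > 100:
        alarms.append("overload")
    if status["battery_health"] < 0.8:
        alarms.append("weakening battery block")
    return alarms

def notify(alarms: list) -> list:
    """Stand-in for email/SMS dispatch to key personnel and the service centre."""
    return [f"ALERT: {a}" for a in alarms]

prev = {"on_battery": False, "load_pct": 45, "battery_health": 0.95}
now = {"on_battery": True, "load_pct": 45, "battery_health": 0.75}
print(notify(check_ups(now, prev)))
```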



TELECOMS/5G

Telecoms on the edge John Hall, managing director at Proximity Data Centres, explores how edge colocation could be the answer to delivering 5G services.


According to Gartner, edge computing will account for 75% of enterprise-generated data by 2025. Today it accounts for only 10%. The huge increase predicted is largely down to the IoT applications that 5G is expected to enable, which in turn demand huge volumes of data to be processed at the edge. 5G's small cell architecture and new mobile spectrums will deliver unprecedented low latency and increased bandwidth to support the addition of an increasing number of edge devices for driving compute and storage much closer to the user. The fortunes of 5G, the IoT and edge computing are therefore inextricably linked. With 5G's comparable performance to a physical broadband connection, mobile operators and carriers will be able to offer consumers much faster and more reliable applications, such as video streaming and gaming, as well as boosting the overall experience in terms of voice and data services. Businesses, industry and public services will also benefit, gaining the ability to wirelessly connect billions of remote edge devices worldwide for monitoring and controlling an array of IoT applications in real-time – for smart cities, smart motorways, factory floor machinery, medical imaging systems, not to mention smart fridges and, of course, the much-vaunted driverless vehicle. According to Gartner, last year 57% of businesses were looking to 5G to support their IoT communications.

Data centres are key
In practice, much of 5G's potential depends on data centres. It can only be fully realised by locating many more data centres much closer to edge devices and adapting them to 5G's short wavelength transmission frequencies, which depend on the deployment of multiple small cells and antennas. In reality, massive infrastructure is still to be deployed and developed in many countries; to support 5G networks, regional edge data centres are still to be connected to cell towers with fibre optic cabling, including many in the UK. This is absolutely necessary to support 5G's implementation.

According to Gartner, last year 57% of businesses were looking to 5G to support their IoT communications.

In some cases, this will lead to micro data centres being connected at the base of cell towers to ensure the critical sub-10 millisecond response times necessary for real-time applications, such as autonomous cars and remote surgery. At the same time, Tier III standard distributed edge data centres and mobile 5G cells/micro data centres will be necessary to support the low latency content delivery and IoT communications requirements of enterprise users and service providers. Equally, edge data centre proximity is a must to process the data volumes that 5G will create through the proliferation of edge devices and sensors. Long-haul networks will be hard-pressed to handle these data volumes from network traffic and congestion perspectives, meaning much more processing of critical operational data must happen in local data centres. Only the less time-sensitive data will be sent back to centralised data centres for further analysis and archiving. Furthermore, spreading the data traffic load by ensuring much shorter distances between edge devices or users and nearby data centres will greatly reduce data backhaul costs – making the difference between using 10 Gb/s circuits instead of the 100 Gb/s ones typically needed in centralised computing architectures. The decentralised approach also goes a long way to addressing the security concerns of companies considering edge computing who do not want their valuable IP being sent long distances via the public cloud. Moving data closer to the edge means internal IT teams can reduce the amount of data they have to store at a single point, which is crucial when addressing cybersecurity threats. By working in a more distributed way and utilising edge data centres, organisations will be better able to focus on stopping the next attack, rather than scrambling to recover from the last one.
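The sub-10 millisecond budget mentioned above is why physical proximity matters: light in optical fibre travels at roughly 200,000 km/s, so distance alone sets a hard floor on round-trip time before any switching, queuing or processing is added. A rough back-of-envelope sketch (the 200 km-per-millisecond figure is a common approximation, and this ignores all equipment delay):

```python
FIBRE_KM_PER_MS = 200.0  # ~200 km of fibre per millisecond, one way

def min_rtt_ms(distance_km: float) -> float:
    """Lower bound on round-trip time from fibre propagation delay alone."""
    return 2 * distance_km / FIBRE_KM_PER_MS

# A 100 km hop alone consumes ~1 ms of a sub-10 ms budget:
print(min_rtt_ms(100))   # 1.0
# A 1,000 km hop to a distant centralised facility blows the budget outright:
print(min_rtt_ms(1000))  # 10.0
```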
Move to regional colocation
Traditionally, many larger telecom providers have operated their own data centres while also offering up spare capacity within their facilities to third parties for colocation. But more recently, as the colocation data centre industry has continued to grow and competition from dedicated colo providers has increased, many telcos have divested large swathes of their estates. This is so they can focus on their core business and maximise the returns on the massive network investments necessary, not least in ongoing 5G rollouts. Recognising that data centres are expensive to build, own and operate, more telcos are turning to colo operators for data centre infrastructure to support their needs, including moving network operations towards the edge. The Huawei debacle and the subsequent delay in the rollout of 5G have also given CTOs of the big carrier networks more time to assess how they are going to deploy their networks in future. While often smaller than traditional facilities in terms of space and power capacity, these edge colocation facilities still have to be as reliable as larger centralised ones, bearing in mind the mission-critical applications they have to support. This will require Tier III redundancy and availability at a minimum, as well as highly scalable bandwidth to meet the intensive edge processing demands of new applications and technologies such as AI, AR and VR. They must also be able to support decentralised public, private and hybrid cloud infrastructures that can be distributed to the edge. Clearly, those data centres that aren't yet ready for 5G must adapt and refresh as necessary.

Edge data centres will play a pivotal role in ensuring the whole of the UK – not just its major cities – is reaping the rewards of 5G.

However, the actual location of data centres will determine their overall effectiveness in serving as low latency regional points of presence for centralised telco or indeed cloud scale facilities: being physically close enough to bring data closer to devices, users and customers, enabling real-time decision making, improved customer service and a competitive edge for businesses – and, for consumers, an enhanced user experience. For telcos, the dilemma will be whether to build larger and larger 'pipes' back to hyperscale data centres or use highly connected networks of regional edge data centres to store 'some' of this data, the pay-off being between the cost of computing and storage at the edge versus increasing the size of their networks. In summary, edge data centres will play a pivotal role in ensuring the whole of the UK – not just its major cities – is reaping the rewards of 5G. While centralised data centres still have a crucial role to play as the hubs of data distribution networks, it is fit-for-purpose, well-connected and energy-efficient edge data centres that will continue to act as the local depots of data and low latency applications for regions across the country.




That’s private! With social distancing on the cards for the foreseeable future, the implementation of telemedicine has snowballed. Unfortunately, despite the benefits, this also kicks up some serious privacy issues for patients. Alejandro Coca, co-head of TrueProfile.io, explores some of the challenges and solutions that will help build confidence in a now indispensable technology.

In the 1950s, the concept of telemedicine was first put into practice in the US, transmitting radiologic images over large distances via telephone. Originally a tool to reach patients living remotely, it has now become one of the fastest-growing areas of healthcare. The Covid-19 pandemic has done nothing but accelerate this growth. At the beginning of the outbreak, GP surgeries urged patients not to come into their practice to prevent the potential spread of the virus, and health secretary Matt Hancock advised that consultations should be done by telemedicine where possible. Since then, there's been a dramatic increase in the use of telemedicine, with a survey by the Royal College of GPs finding that six in ten appointments in mid-July were conducted by telephone. However, with rapid technological advancements come potential risks and barriers, such as connectivity limitations and data privacy. Providers must put patient safety first by ensuring they hire verified medical professionals and by investing in the right technology to facilitate telemedicine. We believe blockchain has a role to play in some of these key elements, helping ensure data privacy and patient safety.

What is telemedicine?
If you've yet to come across the term 'telemedicine', it's defined as follows: 'Telemedicine is the use of technology to virtually administer medical advice, prognosis and support from a qualified practitioner to a telehealth patient.' Telemedicine enables qualified healthcare professionals to safely and remotely provide their services to telehealth patients, eliminating in-person interactions and the chance of viral transmission.




Services offered via telemedicine include mental health support, chronic disease management, psychiatry, family planning, medication renewal, on-call visits and much more. Telemedicine hours often go beyond physicians' standard office hours, which is of particular benefit to those who live or work far away from the nearest healthcare centre. Further, telemedicine services can reach populations regardless of geography, while their ability to respond to medical emergencies where there are critical healthcare shortages is invaluable. Through this pandemic, the need to socially distance and protect the vulnerable has made telemedicine an essential service. The rise of telemedicine this year was not just a stop-gap while the world re-adjusted to the effects of the pandemic, but a solution that is very much here to stay. A Forrester report estimates that there will be 20 million telemedicine care visits in the UK alone by the end of 2020. These virtual care services have become, and will remain, an accepted alternative to the traditional methods of delivering healthcare.

With rapid technological advancements come potential risks and barriers, such as connectivity limitations and data privacy.

The connectivity challenge
There are clearly many benefits to the telehealth boom – increased convenience, the potential to limit overhead costs, and even the new insight doctors get into their patients' lives. Further, telemedicine has progressed past the traditional telephones of the 1950s and is now widely conducted over video. This delivers a better service, helping to improve patient outcomes. However, it's not all plain sailing and there are some barriers to adoption. Firstly, many clinicians working for telemedicine providers have reported experiencing video fatigue, longer workdays, and a loss of work-life balance. Further, for many patients, particularly those located in rural and less-developed regions, access to a reliable internet connection is arguably the biggest barrier to connecting with clinicians online. Without an accessible and reliable broadband connection, there is no way they can get the treatment they may need.

The privacy challenge
Outside of these barriers, there is the issue of data privacy. Although telemedicine has truly taken off over the last year, many will still be hesitant to adopt the technology on the grounds that they want to protect their health-related data. As evidenced in the national rollout of the test and trace app, widespread concerns around the privacy of health data have emerged throughout the pandemic, and can quickly turn into unfounded conspiracy theories. However, privacy concerns themselves are perfectly understandable, even more so when it comes to telemedicine. Some telemedicine services allow patients and doctors to share and store sensitive information such as test results and x-rays, meaning the right technology must be in place to ensure these records are protected. The topic of using blockchain technology to protect patients' medical records is not a new one, but it hasn't yet gained traction, as healthcare institutions struggle to digitise and keep pace with technology advancements. This is where we believe blockchain should be implemented: decentralising the storage of data so that no central party has control over its content, and nobody can tamper with the records, because every member has to agree to their validity and can check the history of record changes.

The recruitment challenge
From doctors and nurses to radiologists and psychologists, healthcare professionals are being hired to meet the constant demands of the fastest-growing area of healthcare. An inefficient hiring process can result in costly lawsuits, imprisonment, reputational damage to brands, casualties, and even loss of life. The true value of verification within the telemedicine industry is more important now than ever to alleviate recruiting challenges. Why is it important, though? As with a physical healthcare environment, a virtual one facilitated by telemedicine needs to provide patients with a safe environment. The key to achieving this is the right medical professionals. It's critical for healthcare professionals to be properly vetted before they are allowed to practice on a telemedicine platform, and therefore providers must ensure they source verified, credible healthcare professionals to be assured of their skills and qualifications. With this in mind, an innovation that is gathering pace is blockchain-powered verification. For example, a blockchain-enabled professional document verification platform can enable candidates to securely upload and verify private documents, such as passports or university certificates, providing them with a form of portable credentials.
NHS recruiters and healthcare regulators can then view and verify candidates' credentials against the blockchain. From a recruiter's standpoint, this can drastically streamline the verification process by eliminating the continual churn of verification requests on employers and educational institutions every time a healthcare professional applies for a new role. For candidates themselves, the process is also expedited, as their credentials only need to be verified once before being saved on the blockchain. They can then share these with potential employers at any point during their careers, rather than having to be verified each time they apply for a role. By using blockchain-enabled professional document verification, telemedicine providers can eliminate the risk of hiring unqualified, fraudulent individuals, ensuring that patients and co-workers are protected.

Building confidence
With telemedicine increasing at a rapid pace, it is critical that providers build confidence in any solution, as well as making it as accessible as possible. While the focus previously was all about getting a service up and running quickly, it now needs to be on implementing a long-term solution with the patient at the heart of the service. When it comes to healthcare, patient safety and privacy should always come first, and blockchain technology has a clear role to play in many aspects of telemedicine.
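The tamper-evidence property underpinning this rests on cryptographic hashing: a credential is verified by re-hashing the submitted document and comparing it with the digest recorded on-chain. A minimal sketch using Python's standard library – this illustrates the principle only, not TrueProfile.io's actual implementation:

```python
import hashlib

def fingerprint(document: bytes) -> str:
    """SHA-256 digest of a credential document."""
    return hashlib.sha256(document).hexdigest()

def verify(document: bytes, recorded_digest: str) -> bool:
    """A recruiter re-hashes the submitted file and checks it against the record."""
    return fingerprint(document) == recorded_digest

cert = b"BSc Nursing, University of X, 2015"
record = fingerprint(cert)   # verified once, then stored immutably
print(verify(cert, record))                                   # True
print(verify(b"BSc Nursing, University of X, 2005", record))  # False: tampered
```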



CLOUD

Benefits of a hybrid cloud




Neville Louzado, head of sales at Hyve Managed Hosting, dispels the myth that companies need to choose between a public and private cloud, and espouses the potential benefits of a hybrid solution.

In one form or another, digital transformation was already well underway for most businesses prior to the Covid-19 pandemic. For most, agile working was considered a luxury rather than a necessity, and moving operations into the cloud was a steady and gradual process. That's changed, with many businesses now accelerating their secure migration to the cloud in various ways in order to support the rapid increase in remote working. However, despite the new sense of urgency brought about by the pandemic, it's essential businesses maintain that same steady, careful attitude to cloud adoption in order to ensure a smooth and secure transition. Businesses approaching this decision at breakneck speed often find themselves cornered, having to choose between a private or public cloud solution. But the choice isn't as binary as it first seems. RightScale's 2019 'State of the Cloud' report found that more than 91% of businesses used a public cloud solution and 72% opted for the private cloud route. The reason those statistics don't quite add up is that there is significant overlap between the two, with more than two-thirds of businesses opting for a hybrid cloud solution that combines the advantages of both. Why? Because the versatility and flexibility offered by a hybrid solution is simply too good for modern businesses to pass up. Let's start by summarising both public and private cloud solutions.

The commonality of the public cloud
There's a reason more businesses choose the public cloud over any other cloud solution. Public cloud solutions, such as those offered by Amazon, Google and IBM, allow businesses of all shapes and sizes to tap into a centralised pool of resources as and when they're needed. From raw processing power to optimised software, and all of the power and cooling that goes with it, organisations can simply tap into what they need as part of a subscription model that can scale with their business.
It’s often the number one choice for smaller, fluid businesses that want to grow on a budget. While the pay-as-you-go aspect of the public cloud is attractive, it does come with some potential negatives depending on an organisation’s needs. Control over data is an obvious one, and as a knock-on effect, security can be more difficult to manage. Even though businesses are effectively outsourcing their computing, they’re still responsible for the security and integrity of their data to some extent, and usually have to meet the cloud provider halfway when it comes to things like security policies and protocols. The pay-as-you-go model can also backfire if an organisation experiences a sudden growth spurt or a burst of high-intensity computing activity, resulting in high, unexpected invoices.

The control offered by the private cloud
The private cloud is similar to the public cloud in many ways, but is better suited to larger businesses that are willing to make a long-term investment in their computing infrastructure. With a private cloud solution, a business might have slightly less financial flexibility, but will have dedicated computing resources that they – and only they – have access to. This results in better performance, as data is not shared with other users, and tighter security and easier compliance, as it is built behind a firewall. The private cloud gives companies complete control over data and applications, which is particularly appealing to heavily regulated industries like finance. The rigidity of fixed monthly bills also makes it easier for companies to plan budgets and forecast revenue. However, the company must be prepared to manage it extensively long-term or invest in a managed hosting provider.

The private cloud is similar to the public cloud in many ways, but is better suited to larger businesses willing to make a long-term investment in their computing infrastructure.

The best of both worlds
Reading the above, the advantages of both a public and private cloud should become readily apparent. Most businesses faced with this decision will be asking themselves why they can't have the advantages of one without missing out on the benefits of the other. In other words, they want to have their cake and eat it. The good news is that this is perfectly possible, and the majority of businesses who migrate to the cloud actually use a combination of the two – a hybrid cloud solution. A hybrid cloud solution will use a combination of the public and private cloud depending on the application and use-case. For departments or functions that require high security, such as the handling of sensitive customer data, accounting or HR, private cloud solutions can be used, allowing organisations to control the levels of security and the policies around the transfer of data. For less critical applications, such as day-to-day operations that might scale up or down from month-to-month or quarter-to-quarter, a public cloud solution will more than suffice, allowing businesses to only use what they need and save money where possible. In this way, a hybrid cloud solution can actually eliminate most of the shortcomings associated with both private and public clouds, as they tend to cancel each other out when applied effectively. As businesses continue to make progress towards cloud migration, particularly in light of the global pandemic, we're likely to see an increasing number of teams tap into the flexibility, security, scalability and affordability of a hybrid cloud solution.
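The split described above – sensitive workloads on the private side, elastic day-to-day workloads on the public side – can be expressed as a simple placement policy. The workload names and the single sensitivity flag are illustrative; a real policy would weigh compliance, cost and scaling needs together:

```python
def place(workload: str, sensitive: bool) -> str:
    """Route sensitive workloads to the private cloud, the rest to public."""
    return "private" if sensitive else "public"

# Hypothetical workloads tagged by sensitivity:
hybrid = {w: place(w, s) for w, s in [
    ("customer-data", True), ("hr-records", True),
    ("web-frontend", False), ("batch-analytics", False),
]}
print(hybrid)
```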




Cloud control Terry Storrar, managing director at Leaseweb UK, explores the complexity of implementing cloud solutions and how organisations can ensure they’re managing those assets more sustainably.

Cloud computing is one of the outstanding technology success stories of recent years. Not only has it radically changed the way organisations invest in IT, but it has provided the basis for entirely new and disruptive business models. But, as the market matures, technology and business leaders are now looking for additional advantages that address the complex issues of implementing cloud infrastructure and services. In the current uncertain economic climate, cost savings and business agility have taken centre stage in business planning, and cloud strategy has a central role to play. Take the management of cloud infrastructure, for example. IT teams need to move away from the notion that simply moving workloads to the cloud will offer automatic benefits. Instead, data, applications and services should be delivered via the most appropriate combination of technologies and service providers. Whether the objective is to control costs, to quickly adjust capacity and services, or to affordably add new infrastructure in a completely new territory to take advantage of a growing market, the power lies in the ability to choose, change and refine a strategy as required. As a result, today's cloud businesses look for flexibility in everything they do, including moving workloads, data and applications between different providers and infrastructure options in search of the best combination of price and performance. But that's only part of the picture. In the pursuit of innovation and value, many organisations have found themselves operating a multi-environment strategy – either by accident or by design – with services, data and assets split between on-premise systems, managed service providers, dedicated servers, a multitude of SaaS solutions and the public cloud. The list goes on, and serves to remind us that cloud computing is a strategic investment that requires careful management.
The amazing diversity of the 'as a Service' market has offered huge choice to business, but it also risks adding significant complexity to the average cloud strategy. To an extent, the cloud industry is a victim of its own success, with service providers happy to add the 'as a Service' suffix to just about anything, from infrastructure, disaster recovery and security to virtualisation and storage, among many others. The problem this creates is that organisations make strategic and ad hoc decisions about which to adopt, and on average use almost five different cloud platforms, according to industry research.

The key to sustainable success
This can create a significant management overhead and, in some cases, can be a serious barrier to effective delivery, return on investment and sustainability. Indeed, industry experts have argued that cloud complexity is not only inevitable, but is the 'number one reason enterprises experience failures with the cloud'.

Part of the problem is that procurement and ownership of cloud services has, to a significant degree, been left to the IT team. If, for instance, individuals working across different roles sign up for bespoke or disjointed cloud services, businesses might not even have a single view of their technology and spending commitments. They may also risk over-specifying or duplicating services that are not easily integrated with legacy technology or other cloud solutions.

Any organisation determined to balance its adoption of cloud infrastructure with its sustainability goals must first understand what the impact of that complexity might be. Are, for instance, service providers being evaluated and appointed based on their environmental credentials? Can IT teams assess the cumulative effect of cloud complexity on their sustainability performance? And how many organisations can accurately distil that information into a format that helps optimise their overall strategy?

The solution, and the true demonstration of an efficient, sustainable cloud strategy, lies in adopting a unified management process for the whole infrastructure stack. When looking for efficiencies, organisations can be forgiven for focusing on the technology, whereas the answer actually lies in building a strategy around a smaller number of managed service providers. In practice, this often means consolidating the number of partner relationships by working with a service provider that can integrate services such as dedicated servers, colocation, private cloud stacks and CDN in a way that gives organisations all the flexibility of modern cloud infrastructure.

With the right strategy, one that focuses on efficient long-term management, the cloud will reduce costs in most circumstances. But reaping the benefits of cost savings and increased agility through cloud computing needs to take centre stage in the quickly evolving world of IT-led business. Success depends not on opportunistic choices that ultimately fail to deliver a coherent approach, but on each organisation building its own sustainable cloud ecosystem of technologies, services and trusted partners.
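The 'single view' problem is easy to illustrate with a short sketch: given an ad hoc inventory of services signed up across different teams, even a simple aggregation reveals how many platforms and how much monthly spend an organisation is actually committed to. The service names, providers and figures below are purely hypothetical, for illustration only.

```python
from collections import defaultdict

# Hypothetical inventory of cloud services signed up across different teams.
# Fields: (service name, provider, monthly cost in GBP)
services = [
    ("object-storage",      "ProviderA", 1200.0),
    ("virtual-machines",    "ProviderA", 4800.0),
    ("backup-as-a-service", "ProviderB", 900.0),
    ("crm-saas",            "ProviderC", 650.0),
    ("cdn",                 "ProviderD", 300.0),
    ("disaster-recovery",   "ProviderB", 1100.0),
]

def spend_by_provider(inventory):
    """Aggregate monthly spend per provider, giving a single view of commitments."""
    totals = defaultdict(float)
    for _name, provider, cost in inventory:
        totals[provider] += cost
    return dict(totals)

def platform_count(inventory):
    """Number of distinct cloud platforms in use."""
    return len({provider for _name, provider, _cost in inventory})
```

Even this toy inventory spans four platforms, close to the industry average cited above, and the aggregation is the first step towards asking the harder questions about duplication and environmental credentials per provider.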


CLOUD




INDUSTRY INSIGHT

The bare minimum You'd think when it comes to cable specification, the trusty 18th edition would be a wealth of information and best practice, right? Well, apparently not. In this article, Alex Smith, technical director at Flexicon, explains why operators should base their cable protection specification on more than just halogen-free ratings, and why cable protection must meet all low fire hazard criteria, not just the (surprisingly sparse) guidance offered by the 18th edition wiring regulations.

For every data centre operator, it is essential that all cable installations meet the required low fire hazard specification by using appropriate cable protection, such as flexible conduit. Traditionally, halogen-free conduits have been specified, often based on the common misconception that they offer comprehensive fire protection performance. Although such a conduit may prevent the generation of toxic gases in some settings, it is not necessarily flame retardant or low smoke, and it may still be flammable if exposed to a heat source. This can give rise to a potentially dangerous situation where cables are laid in flexible conduit that, according to BS EN 61386, only needs to declare whether it is self-extinguishing and does not need to offer a comprehensive level of fire performance.

The background
Data and power cables should be suitably fire performance rated for the application and location, and should therefore comply with the European cable fire standard, BS EN 50575. This includes classifications of 'reaction to fire performance' and considers heat release, flame spread and propagation, smoke production, flaming droplets and acidity.

BS 7671, commonly referred to as the 18th edition wiring regulations, calls up EN 61386 for flexible conduit performance requirements, including fire. However, this standard for conduit systems, first published in 2008, only addresses non-flame propagation (self-extinguishing). It does not cover additional fire performance properties such as enhanced flame retardancy, smoke and toxic fume emission. Almost all applications will require non-flame propagation (self-extinguishing), as called for within the UK wiring regulations (BS 7671) and tested by means of the flame propagation test in EN 61386, as a bare minimum. Many operators will assume that this basic requirement will be met by any flexible conduit they specify, but this is not always the case.

Meeting low fire hazard specification
Low fire hazard classification requires a product to meet four clearly defined characteristics: low smoke emission, high flame retardancy, low toxic fumes and halogen-free properties. There is no single European classification standard for low fire hazard cable management products that defines terms, test methods and expected results, so it is clear why clarification is required. Tests undertaken by manufacturers to prove the fire performance of their flexible conduit products include:
• Low smoke emission: a sample of material is burnt under controlled conditions in a smoke chamber and the smoke obscuration of a defined beam of light is measured.
• Flame retardancy: flammability, the measure of how difficult it is to ignite the conduit when it is exposed to a heat source, is often cited here. The minimum requirement is that the product is self-extinguishing, according to conduit system standard EN 61386: a vertical sample of conduit is exposed to a 1 kW burner and must extinguish within 30 seconds of the removal of the flame, with no flaming droplets.
• Low toxicity: a sample of material is burnt under controlled conditions in a smoke chamber and the fumes are analysed for various gases. The concentration of each gas is then multiplied by its toxic potency to give a toxicity index.
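As a rough illustration of the low toxicity test, the index can be sketched as a weighted sum: each measured gas concentration multiplied by its toxic potency, then summed. The gases, concentrations and potency factors below are hypothetical placeholder values for illustration, not figures from any standard or test report.

```python
def toxicity_index(measurements):
    """Sum of (gas concentration x toxic potency) across the analysed gases.

    measurements: dict mapping gas name -> (concentration_ppm, potency_factor)
    Returns the combined toxicity index; higher means more toxic fumes.
    """
    return sum(conc * potency for conc, potency in measurements.values())

# Hypothetical smoke-chamber results (illustrative values only)
sample = {
    "CO":  (150.0, 0.02),   # carbon monoxide
    "HCl": (10.0,  0.50),   # hydrogen chloride (from chlorinated polymers)
    "HCN": (2.0,   1.00),   # hydrogen cyanide
}

index = toxicity_index(sample)
```

The weighting is the important idea: a gas present in small quantities, such as hydrogen cyanide here, can dominate the index because of its high potency, which is why halogen-, sulphur- and phosphorus-free materials tend to fare better in this test.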

NFR (non-flame retardant) polypropylene is a commonly used material for data centre cable protection as it is halogen-, sulphur- and phosphorus-free, so will not aid acid formation, but it is highly flammable and flame propagating. In contrast, PA6 (nylon) is self-extinguishing as well as halogen-, sulphur- and phosphorus-free. The specifier should look for independent test results to back up a supplier's claims rather than relying on unsubstantiated jargon.

If halogens, sulphur or phosphorus are present in a material, it is unlikely to pass the low toxicity tests.


Halogen-free
As we have learnt, the fact that a material is halogen-free does not mean that it is automatically a low fire hazard product. Without the accompanying low toxicity, low smoke and flame-retardant properties, it will not meet the full criteria. Typical halogens are fluorine, chlorine, bromine and iodine: chlorine is the most common in PVC, fluorine is present in fluoropolymers and bromine appears in flame retardants. All of these produce highly toxic fumes and thick smoke if exposed to a naked flame, which is another reason why operators may have tended to rate this area of performance above other fire hazard properties.

In conclusion
While many products may look the same, performance properties can vary greatly, so customers should always check suitability and compatibility for their application and consider the installation as a complete system. Moreover, specifying against all the recommended performance criteria provides full peace of mind that every action has been taken to protect all aspects of the data centre infrastructure, from plant and processes to the people responsible for ensuring that uptime and data resilience are maintained.



FINAL SAY

The power is in our hands The UK data centre market is tipped to be the largest in Europe, and despite predictions from Tariff Consultancy Ltd (TCL) a few years ago suggesting that rack space and square metre pricing may have hit a ceiling, the industry is still buoyant. Here, Power Control explores the ever upward trajectory of this resilient market and highlights the advances in UPS technology that will pave the way for further success, not only now, but in the future.

Originally identified as mainframes, the data centres of yesteryear were merely hubs of computing power used to process physical data. That was before wireless networking and even the internet, which now account for the mass uptake and rapid advancement of data centres.

What data centre operators were not prepared for at the time was the seemingly impulsive adoption of virtualisation. Driven by the need to address hardware utilisation, power and cooling efficiency, and reduced IT spend, virtualisation was the culprit for a significant dip in data centre requirements in the early 2000s. Undeterred, and spurred on by cloud computing and the Internet of Things, the data centre market met these challenges head on, embraced more efficient solutions and maintained its position at the heart of the digital economy.

It is the industry's willingness to embrace and invest in new equipment that has shaped our existing data centre landscape and helped it evolve from on-premise comms rooms to independent micro DCs, colocation sites and hyperscale data centres. Whilst all data centre components still exist to provide the same functions, they have all reformed to meet global pressures for improved connectivity, efficiency, resilience and sustainability.

Take UPS (uninterruptible power supply) solutions. These provide essential power backup, delivering clean, reliable protection against load disturbances. Central to power hungry facilities such as data centres, the principal role of UPS systems, to deliver unfailing emergency power, has not changed. What has altered is their capacity to significantly improve data centre efficiency and flexibility. Specialists in the field are now able to help owner operators realise


their true efficiency potential by selecting not just the correct emergency power solution, but also by considering all the other elements of the electrical infrastructure that contribute to the total cost of ownership.

Early data centres had to rely on large transformer-based UPS solutions to support their infrastructures. Once hailed as the far more resilient backup power system, a transformer-based UPS unit has its drawbacks, the most obvious being its large footprint and heavy weight. The very first transformer-based systems were also grossly inefficient. However, advances in technology have allowed newer models to offer far superior efficiency of up to 98%, whilst maintaining resilient performance. Another shortcoming that has more recently come to light in the wake of modular UPS solutions is the more complex installation and maintenance that transformer-based options require. Despite this, many data centre facilities still utilise transformer-based UPS solutions, reassured that improvements in the technology guarantee power protection whilst also meeting efficiency goals.

The launch of modular UPS systems not only reinvigorated the power protection market, but allowed the data centre industry to broaden its scope. Offering a flexible and scalable approach to UPS investment, modular UPS solutions deliver high efficiency even at low loads. This makes them ideal for micro data centres and colocation facilities, which rely on a 'scale as you grow' approach.

Modular UPS development has been rapid, and its boundaries are still being pushed, most recently with the introduction of lithium-ion batteries. With their cost barriers slowly eroding, and with the sector becoming more knowledgeable about the technology and its extended lifespan, lithium-ion solutions are predicted to become commonplace. Smaller, lighter and more temperature tolerant, they reduce the space needed to house power protection.
In addition, lithium-ion batteries contribute to a data centre's economic and operational efficiencies through peak shaving.

The global appetite for digital consumption is relentless, and dependence on a virtually connected world has never been greater. Covid-19 has had an enormous impact on the data centre market, but thanks to the associated industries' forward thinking, the market has boomed. Data centre growth is undisputed; big players including IONOS and Google have already made moves for potential data centre builds in the UK. Over the last decade we have seen UPS systems help shape the industry, with colocation data centres continuing to thrive thanks to modular UPS innovation. Continued research, development and investment will undoubtedly see power protection solutions impact the current hyperscale revolution, but to what extent is anyone's guess.
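To put the efficiency figures above into perspective, a back-of-the-envelope sketch: the gap between an older UPS at, say, 94% efficiency and a modern unit at 98% translates into a substantial annual energy saving, because everything the UPS draws beyond the IT load is lost in the unit itself. The 500 kW load and tariff below are illustrative assumptions, not figures from the article.

```python
def annual_ups_loss_kwh(load_kw, efficiency, hours=8760):
    """Energy lost in the UPS itself over a year at a constant load.

    Input power = load / efficiency, so losses = load * (1/eff - 1),
    multiplied by the number of hours of operation.
    """
    return load_kw * (1.0 / efficiency - 1.0) * hours

load_kw = 500.0  # assumed constant IT load
loss_old = annual_ups_loss_kwh(load_kw, 0.94)  # older transformer-based unit
loss_new = annual_ups_loss_kwh(load_kw, 0.98)  # modern high-efficiency unit

saving_kwh = loss_old - loss_new
saving_gbp = saving_kwh * 0.14  # assumed tariff of GBP 0.14/kWh
```

On these assumptions the four percentage points of efficiency are worth roughly 190,000 kWh a year, before even counting the knock-on reduction in cooling load, which is why efficiency at partial load is such a selling point for modular systems.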

