

FROM THE EDITOR'S DESK
At W.Media, we are all passionate about Sustainability. So when a data center Sustainability expert lamented at one of our Cloud & Datacenter Conventions that Sustainability had sadly become a “fashion statement” rather than a “passion statement”, it got us thinking: what can we do, as a publication, to reignite a genuine passion for Sustainability and encourage our thriving and vibrant Cloud and Data Center industry to truly walk the talk when it comes to adopting more “green” initiatives?
Thus was born this issue, with the theme “Greening the Grid”, and it has truly been a labour of love for our entire team. It reflects diverse voices from Australasia, Northeast Asia, Southeast Asia, and South Asia, as industry experts and veterans share the unique challenges they are facing vis-à-vis sustainability. They have also shared their wisdom and insights on innovative and exciting ways to overcome these challenges.
Finally, this issue also gives you a sneak peek into W.Media’s exciting SIJORI Week, during which we held a variety of conventions, networking events, site visits and even a golf tournament as part of our efforts to showcase the Cloud and Data Center industry in Singapore (SI), Johor (JO) in Malaysia, and the Riau Islands (RI), home to Batam, in Indonesia.
Deborah Grey

Editor-in-Chief

Hazel Moises
Senior Tech Journalist

Nick Parfitt
Research, Content & Production

POWERING THE CLOUD, PRESERVING THE PLANET: HOW GREEN DATA CENTERS ARE REDEFINING SUSTAINABILITY
Can We Achieve Net Zero? Debunking the Myths of Sustainable Data
The digital age is fueled by data, and data centers are the colossal engines keeping the information flowing. These massive facilities store and process the information that powers our online world, from social media feeds to streaming services to the complex algorithms driving artificial intelligence. But this vital infrastructure comes at a hefty price – a staggering energy footprint.
Data centers are estimated to consume around 3% of global electricity, a figure projected to rise as our reliance on digital services continues to grow. This translates to a significant environmental impact, with data center operations contributing to greenhouse gas emissions and placing a strain on energy resources. The sheer scale of the challenge has led some to believe that achieving net zero emissions for data centers is simply not feasible. This narrative, however, fails to acknowledge the innovative companies and technologies already redefining how data centers operate.
Stepping Inside the Green Revolution
Imagine a data center transformed. Gone are the noisy servers and energy-guzzling air conditioners. Instead, silent servers sit immersed in a cool, efficient bath of specially designed liquid. This liquid immersion cooling eliminates the need for bulky AC units, a major source of energy waste in traditional data centers. Studies show this approach can slash energy consumption by up to 40%, a significant leap towards sustainability.
Industry leaders are embracing the change. Shell's innovative cooling fluid is undergoing trials with tech giants, while companies like KT Cloud in South Korea are partnering with immersion cooling specialists to combat temperature imbalances. In Japan, KDDI's tests achieved a remarkable 94% reduction in cooling energy with immersion. The momentum is building further with companies like Iceotope Technologies receiving funding for their precision cooling solutions and partnerships like ResetData bringing immersion technology to Australia.

Even lubricant companies like Castrol and ENEOS are joining the movement by developing fluids specifically designed for efficient and safe immersion cooling. This revolution in data center cooling promises to transform them into sustainable powerhouses for the digital age.
Beyond Efficiency: Redefining the Grid
Efficiency is a must, but green data centers are going beyond, forging partnerships with communities to redefine sustainability. One key approach? Transforming a liability – excess heat from servers – into a valuable resource. Imagine data centers as virtual geothermal power plants, warming homes and fueling local food production, all while minimizing their environmental impact. Several companies are pioneering these methods. Microsoft in Finland built a data center that uses excess heat to warm homes and businesses through a district heating system.

Facebook in Denmark recycles heat from its data center to provide warmth for thousands of homes. White Data Center in Japan takes a unique approach, leveraging the abundant winter snow as a natural coolant for their servers.
The melted snow then gets warmed by waste heat, creating a closed-loop system for both cooling and heating nearby greenhouses growing mushrooms and other produce.
Green data centers aren't limited to warming homes. Deep Green, a UK startup, uses "digital boiler" technology. Its mini data centers capture server heat and convert it into hot water – for example, for a public swimming pool in Devon that no longer relies heavily on gas boilers. This technology can also be applied to bakeries, laundromats, and even urban farms, like the one built by Reid Brewin Architects on the outskirts of Paris. This rooftop farm utilizes waste heat from a nearby Equinix data center.
These examples showcase the exciting potential of green data centers. By turning waste heat into a valuable resource, they're fostering a more sustainable future for both technology and our communities.
The Skeptics and the Reality Check
While some express skepticism about achieving net zero in data centers due to ever-growing data, their concerns overlook the power of innovation. The very technology driving data growth is also fueling solutions for radical efficiency.
Artificial intelligence (AI) is a game-changer. AI-powered software analyzes server performance and workloads, dynamically allocating resources to minimize energy waste. Hardware manufacturers, too, are constantly innovating, developing new generations of low-power servers. These advancements work together to ensure efficiency gains outpace data growth.
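To make the idea concrete, here is a minimal, hypothetical sketch of the kind of logic such tools apply – consolidate workloads onto as few servers as possible so the rest can idle. It is an illustration only, not any vendor's actual software, and the workload figures are invented for the example.

```python
# Hypothetical sketch: pack workloads onto as few servers as possible,
# so idle machines can be powered down. Real AI-driven tools use far richer
# telemetry and prediction, but the energy-saving principle is similar.

def consolidate(workloads_cpu, server_capacity=1.0):
    """First-fit-decreasing bin packing: returns per-server CPU allocations."""
    servers = []
    for load in sorted(workloads_cpu, reverse=True):
        for srv in servers:
            if sum(srv) + load <= server_capacity:
                srv.append(load)
                break
        else:
            servers.append([load])  # keep one more server powered on
    return servers

# Example: 20 small workloads that would otherwise sit on 20 half-idle hosts
workloads = [0.15, 0.2, 0.1, 0.25, 0.3, 0.05, 0.4, 0.1, 0.2, 0.15,
             0.1, 0.3, 0.25, 0.2, 0.1, 0.35, 0.15, 0.05, 0.2, 0.1]
placement = consolidate(workloads)
print(f"Servers needed after consolidation: {len(placement)} instead of {len(workloads)}")
```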
For a real-world example, consider the W.Media Podcast episode "Achieving Sustainability Goals through Smarter Cooling, Better Design and AI." Ning Liu, from Shenzhen Envicool Technology, discusses how AI in intelligent data centers analyzes performance across a wider range of factors, pinpointing the optimal settings for maximum efficiency.
"AI will optimize energy usage across the board, from cooling systems to IT infrastructure," says Liu. "By combining automation, AI, and data analytics, organizations can finally leverage their data for better decision-making and proactive problem-solving, achieving true sustainability."
Collaboration is Key: The Road to Net Zero
The greening of data centers cannot be achieved by a single entity, it requires a collaborative effort. Governments are taking the lead with initiatives like Singapore's Green Data Center Roadmap. This plan aims to improve energy efficiency through hardware and software upgrades, while also promoting green energy sources like bioenergy and solar power. Other countries are following suit. Taiwan has its ambitious 20-30-50 plan for increasing renewable energy use in data centers, while Indonesia boasts a special economic zone powered by a renewable energy grid. The Philippines is calling for proposals from hyperscale data centers to explore renewable energy options.
Data center companies are also embracing sustainability. True IDC in Thailand is investing heavily in expanding its data center and cloud business with a focus on sustainability. Bridge Data Centres is partnering with a company specializing in AI-powered data center optimization for energy efficiency. AdaniConnex secured financing specifically tied to achieving sustainability goals. These are just a few examples of the industry moving towards a greener future.
The future of data centers is sustainable. Collaboration between governments, tech giants, and energy providers will be crucial for achieving this goal. By working together, we can ensure data centers are not only powerful but also environmentally responsible.
The Future is Green: A New Era for Data Centers
The story of data centers is no longer just about processing power – it's about environmental responsibility. By embracing groundbreaking technologies, fostering collaborative partnerships, and prioritizing renewable energy sources, the industry is proving that powering the future doesn't have to come at the expense of the planet.
The green revolution in data centers is here, fundamentally changing how we store and process information in a way that is both efficient and sustainable. This is not just a technological shift; it's a paradigm shift, one that ensures the digital age can flourish alongside a healthy planet. As we move forward, the focus will not just be on processing power, but on processing power for a greener tomorrow.



SIJORI Golf Open: Networking on the Green, with Sustainability in mind
Forget stuffy conference rooms and monotonous presentations. The data center industry is getting a breath of fresh air with a new breed of networking event: a round of golf with industry bigwigs, exemplified by the innovative SIJORI Golf Open held at the Padang Golf Sukajadi in Batam, Indonesia.
This format throws out the traditional playbook, opting for a relaxed outdoor setting that fosters genuine connections while subtly underlining the industry's commitment to Sustainability. While participants had been playing golf with their friends and colleagues for years, this was the first time that people from different organizations across the digital infrastructure industry teamed up, teed off together, and networked on such a large scale!
“The idea behind this event stems from the vital need for networking and collaboration in the digital infrastructure sector,” said Byron Cristol, Head of Global Sales at W.Media. “By bringing together industry leaders in a relaxed and engaging setting, we aim to foster connections and strengthen partnerships essential for driving technological advancements.”
The event, which attracted over 60 senior executives from leading enterprises, hyperscalers, colocation providers, and OTT companies, was the first such golf-based industry networking event in the Asia Pacific region.
Breaking the Mold
The beauty of a sun-kissed golf course in Batam provides a welcome contrast to the typical cold and sterile environment of data centers. Spending time outdoors amidst rolling hills and manicured greens creates a more relaxed atmosphere, encouraging conversation and relationship building. Imagine discussing the latest cooling technology while enjoying a cool drink on the clubhouse patio, a far cry from the formality of a banquet hall!
Beyond the Game: A Sustainable Statement
This shift towards outdoor networking at the SIJORI Golf Open isn't just about a change of scenery. It's a subtle nod to the data center industry's growing focus on Sustainability. By choosing an activity that inherently connects participants with nature, W.Media is sending a message about their commitment to environmental responsibility.

A Winning Formula on the Green
On the course, the competition was fierce. Benoit Boudreau from Amazon Web Services claimed the Longest Drive title, showcasing impressive power. The Beat the Star Challenge, a thrilling format, saw Theresia Diella, Koen Indarto Irawan, Yanto, Lay Khuan Soh, Herson Suindah, Avnish Patankar, Zurianto, Tresna, Steven Lee, Hendrikus Gozali, Calvin Siow, Dicky Setiawan, Arif Rosy, Yuki Ida, and Kenji Lee emerge victorious.
Yudhi Sukma from NeutraDC impressed with his accuracy, securing the Closest to Pin award. Chang Cho from Onion Technology clinched the Best Net Score, while Steven Lee from IX Technology demonstrated exceptional skill with a Gross Score of 80 to claim the Best Gross Score title. In the women’s division, Shannon Lum from Delta Electronics took home the Ladies Overall Winner trophy.
These outstanding performances added to the excitement and competitiveness of the SIJORI Golf Open. But beyond the sporting glory, this event offers a winning formula for data center leaders.
The relaxed atmosphere fosters deeper connections, paving the way for stronger business partnerships. Informal settings can spark the most productive conversations, leading to innovative ideas and collaborative projects. And the event itself underscores the industry's environmental consciousness, a growing priority for businesses and investors alike.
These events are sure to become a popular way for industry leaders to connect, collaborate, and drive the future of data center technology by combining business with a touch of leisure and a commitment to sustainability.






Data Centres and a ‘Green’ Grid
Any data centre facility worth its salt should have a roadmap and sustainability strategy in place. And as sustainability becomes a front-and-centre issue to any data centre conversation, there are several non-negotiable components in the green grid.
Core to any sustainability strategy is a clear understanding from the board and executive team of what should be done to achieve a science-based target of Net Zero by 2050. The said strategy should contain measurable goals, rather than broad sweeping statements. Typically, this should include how the organisation will reduce greenhouse gas emissions (GHG), reduce water usage and increase energy efficiency and the use of renewable energy.
As it currently stands, many data centre operators now have a sustainability team embedded within their workforce – this is naturally a positive development.
However, these teams need to work cohesively with other parts of the workforce, such as operations. Their work cannot be allowed to endanger reliability, resilience or performance – areas that are not typically their primary expertise.
Of course, one of the areas that needs to be considered in an energy-intensive data centre is the use of renewable power. This falls under Scope 1 and Scope 2 GHG emissions and can be quite easy to quantify. In practice, however, it presents several challenges.
PPAs
There has been widespread discussion about the legitimacy of organisations buying up power purchase agreements (PPAs). However, the industry’s cynics, of which I will freely admit to being one, might ask, ‘is this power actually available?’ and ‘is this power being injected into the grid that the DCs are fed from?’. Often the answer is no, the power is not being injected into the grid.
There is no way to demarcate green electrons from any other electrons. Power is power, and what is obtained from the grid is blended from a cocktail of sources. The difficulty arises when the grid generation is made up of carbon-heavy sources like coal and, to a greater extent, gas. If the grid is 75% carbon heavy, any importation of lower-carbon energy from a nearby country will only dilute a proportion of the grid by a few percentage points.
Solar is a sound renewable power source, with caveats. Any solar programme needs to be tied into an efficient battery energy storage system (BESS) to be of use. Solar does not generate at night, and that which is generated in the day may need to be stored – to charge electric vehicles overnight, for example. It is possible to use such power for ‘parasitic loads’ such as lighting and other non-critical uses. Even if this power is offered back to the grid, a consistent question we hear is whether the grid could have taken the additional capacity.
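As a rough illustration of why storage matters, the back-of-envelope sizing below uses invented figures (not from this article) to show how a BESS would need to be sized to carry a site's overnight parasitic loads on stored solar.

```python
# Illustrative only: sizing a battery energy storage system (BESS) so that
# daytime solar can carry a site's overnight parasitic loads.
# All figures are assumptions made up for this sketch.

parasitic_load_kw = 200          # lighting and other non-critical loads (assumed)
night_hours = 12                 # hours with no solar generation
daily_solar_yield_kwh = 6000     # assumed daily output of the on-site array
daytime_consumption_kwh = 3000   # portion of solar used directly during the day

overnight_demand_kwh = parasitic_load_kw * night_hours
surplus_kwh = daily_solar_yield_kwh - daytime_consumption_kwh

print(f"Overnight parasitic demand: {overnight_demand_kwh} kWh")
print(f"Daytime surplus available to store: {surplus_kwh} kWh")
if surplus_kwh >= overnight_demand_kwh:
    print(f"A BESS of at least {overnight_demand_kwh} kWh usable capacity "
          "(plus margin for losses and depth of discharge) would cover the night.")
else:
    print("The array cannot cover the overnight load; the grid must make up the gap.")
```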
Another way to use renewable power is to disconnect the data centre from the grid. Some have taken the option of becoming a stand-alone facility with a cogeneration plant onsite or using gas fired fuel cells.

The latter does require a connection to the gas grid, which is much more reliable than the power grid. There are examples in EMEA and USA of both approaches making use of the byproducts of the generation process with success.
Wind
There is such a thing as the wrong type of wind – consider Malaysia. The country's average wind speed is around 2m/second, while the minimum wind speed for wind turbines is 4m/second, and most coastal and offshore locations do not achieve a viable wind speed. The only exceptions to this might be the Kuala Terengganu, Mersing and Kudat areas of the country.
Can we make the facility more efficient?
Over the last few years, there have been undeniable improvements in data centre facility efficiency. The only way the facility will become more efficient is with the use of adaptive power provisioning that right-sizes capacity to the actual power draw.
As capital plant reaches the end of its service life, we recommend replacing it with more energy-efficient models that are sized to meet actual demand rather than over-estimations of what might be needed in the future. Ensure that you are provisioning biodiesel for emergency power generation – or ask whether you actually need generators at all. Can you live with 99.99% availability, or must you have 99.999% availability?
Does PUE need to be retired?
It has been said that you can’t manage what you can’t measure. This statement leads to another pressing question: what measures are in place to gauge data centre efficiency?
Recently, there has been more than one article stating that power usage effectiveness (PUE) has run its course as an efficiency metric. Let’s be clear about one thing: PUE was NEVER, is not and never will be an efficiency metric! PUE simply shows how much power the facility uses in addition to the IT power load, which forms the ‘1’ in the PUE ratio. The smaller the ratio, the less power the facility needs; however, it is a metric affected by a raft of factors, such as load and environmental conditions, to name but two.
To start with, PUE needs to be understood in the context of baseline reporting. It needs to be a 12-month trailing figure measured against a common international standard, such as the ISO/IEC 30134 series or the EN 50600-4-X series. Once you start to measure according to these standards, and are prepared to be audited against them, you will be on the way to baselining your facility and can then see how to improve.
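For readers who want the arithmetic, the minimal sketch below shows what a 12-month trailing PUE figure amounts to. The monthly readings are invented for illustration, and real reporting under the standards above defines measurement points and categories in far more detail.

```python
# Minimal sketch of a 12-month trailing PUE figure.
# PUE = total facility energy / IT equipment energy over the same period.
# The monthly readings below are invented for illustration only.

monthly_facility_kwh = [820_000, 790_000, 805_000, 760_000, 750_000, 770_000,
                        800_000, 815_000, 795_000, 780_000, 810_000, 825_000]
monthly_it_kwh       = [600_000, 590_000, 595_000, 580_000, 575_000, 585_000,
                        598_000, 602_000, 592_000, 588_000, 600_000, 605_000]

trailing_pue = sum(monthly_facility_kwh) / sum(monthly_it_kwh)
print(f"12-month trailing PUE: {trailing_pue:.2f}")
```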
There is nothing wrong with PUE; the issue lies with the people and organisations who either do not use it correctly or do not understand what it actually is.
It is one of a range of metrics that will help to give a holistic picture of your facility – not a means to compare it to another facility!
What is possible?
Can we green the grid or get our data centre facilities to be greener? Yes, we can, but it does require owners/operators to be creative in how they power their facilities and in the uptime that they might offer. It does not mean that virtual PPAs are the way to go; there is an argument that PPAs exist purely within the realms of self-serving greenwashing. Hydrogen produced from green sources is not currently available at scale, and the distribution methods will not be available for at least 10 years. Nuclear is always an option, and several countries in Southeast Asia are considering it, but the regulatory processes and other concerns may mean that it is much more of a medium-term option.


James Rix
Head of Data Centres and Industrial - Malaysia & Indonesia, JLL

Can Data Centers Really Achieve Carbon Neutrality/Net Zero?
In the last 12 months, I’ve seen a massive increase in the demand for data centre capacity. This has been fuelled by the explosion in the use of Artificial Intelligence (AI) applications, which are driving power consumption to new heights – to a level I don’t believe anyone even imagined 2-3 years ago. These changes in scale are making it even more important that the data centre industry plays its part to achieve a carbon neutral future – but the question is, can data centres ever really be carbon neutral?
The data centre building can’t be Carbon Neutral
The immediate challenge for any company that is building new infrastructure is that the building itself, in my opinion, can’t be carbon neutral. This is because of the amount of embodied carbon that exists in a data centre building, including raw material extraction, manufacturing and transportation. Of course, a lot can be done to minimise the building’s impact, but the construction involved, the steel and cement, plus the electrical and mechanical plant that is needed, will have a massive impact.
So, I believe we should change the question to “can the net impact of the data centre be carbon neutral over time and how quickly?”
Let’s consider how Carbon Neutrality is actually defined. In effect, it is the goal of reducing net human-caused greenhouse gas (GHG) emissions to zero. In the case of a data centre, you can’t achieve that in the building and infrastructure. The only way to reach the goal is by taking other actions that capture carbon or cause emissions to be avoided elsewhere.
The data centre operation can be carbon neutral in-life
To achieve carbon neutrality for the data centre operation, developers have to compensate for the carbon cost of its construction and its energy use in-life by preventing GHG generation elsewhere. Historically, data centre operators have tried to offset their carbon generation by purchasing carbon credits, but this practice has rightly been widely criticised. I think that as an industry we need to do much more to minimise our environmental impact than simply handing the problem to someone else.
Data centre operators and cloud companies are finding ways to make a positive environmental impact through investment in sustainable power; this also helps secure the power they need to run their data centres. At its most basic level, this can be done through Power Purchase Agreements (PPAs) for renewable sources. This means that a data centre operator makes a contractual commitment for renewable power which indirectly funds the development of more sustainable, additive energy (i.e. new supply), preventing the need for new generation from fossil fuels.
While this arrangement is perfectly sensible in principle, the reality is that by contracting for PPAs, data centre operators are actually taking green power from other uses, such as residential, because the rate of sustainable power development is generally struggling to keep up with demand. Therefore, while this is the right thing to do, it also has knock on implications, i.e. it is not the silver bullet on its own.
Hyperscalers will invest in generation and grid
Some hyperscalers and operators are going further by investing directly in renewable energy companies and projects, guaranteeing their future power by committing to purchase and consume some of that new capacity. This can, though not always, mean that generation projects closer to the data centre are more attractive, because the challenge of transmission infrastructure is reduced, as is cost. However, where the renewable supply is not close to the data centre, which is much more often than not, we inevitably see challenges in the transmission network. To overcome this, some of the very large CSPs are now lobbying governments to allow them to build their own transmission networks, Ireland being a good example. With “grids” often classed as a national security asset or critical infrastructure, it will be interesting to see how this develops.
Developing power infrastructure is highly capital intensive, but the hyperscalers get the additional benefit that they can control their supply chain from generation, through to transmission and consumption. However, is it realistic to expect data centre operators to have the resources (time, money and skill set) to influence Government policy in the same way that a few of the large CSPs can do? For certain, operators can’t add points to a country’s GDP, so I would suggest that no operator, or perhaps even group of operators, can realistically achieve the same level of influence as a major CSP.
Data centre developers must be careful not to make promises they can’t keep
As the focus on sustainability in data centres increases, operators and developers need to be able to say they will be carbon neutral, because they want to deliver for their hyperscale customers. But since they don’t typically have the financial resources to invest in generation, meeting these commitments may be difficult or even impossible as the scale of energy consumption increases.
I can see a future where hyperscale CSPs work differently with operators by letting them build the data centre and then the hyperscaler brings the renewable power, so that together they can make the entire project carbon neutral. My view is that this model is easier for the major CSPs, not only can they further control their supply chain, it is also much easier, and cleaner, to reflect that clean power in their own sustainability reporting.
Data centre efficiency is important, but it won’t deliver neutrality
For the last 20 years, data centres have made massive improvements in efficiency, to the point where approximately 13% of the energy required to run the IT is needed to cool it. The challenge now is that, even though greater efficiencies can be found, we are facing diminishing returns. Given the forecast of high data centre growth, as an industry we can’t rely on efficiency alone to curb energy usage. However, we could see additional benefits by simply increasing operating temperatures in the data hall. This will need leadership from the big cloud operators, and in my opinion there are energy savings to be made across all data centres if that can happen. The first rule of energy savings is “reduce”, so this would seem an easy win, yet today it hasn’t happened.
The Evolution approach to Sustainability
Evolution Data Centres was founded with the sole purpose of developing and operating large-scale sustainable data centres. But if you asked me, given the complexity, I would be reluctant to set a date for carbon neutrality. However, we do commit to three things: the first is to build as sustainably as commercially possible; the second is to closely measure our performance and transparently report against it; and the third is to continuously seek improvement until we get close to, or reach, carbon neutrality. Data centres have an important role to play in providing much needed digital infrastructure. However, we also have a responsibility to ensure that this is done with minimal environmental impact.
DATA CENTRES HAVE AN IMPORTANT ROLE TO PLAY IN PROVIDING MUCH NEEDED DIGITAL INFRASTRUCTURE. HOWEVER, WE ALSO HAVE A RESPONSIBILITY TO ENSURE THAT THIS IS DONE WITH MINIMAL ENVIRONMENTAL IMPACT.

Darren Webb
Co-Founder and CEO of Evolution Data Centres

How Data Centres are Changing to Tackle GHG Emissions

Data centres are essential for powering modern economies, yet these facilities have gained a reputation for high energy consumption, and concerns have been raised about their carbon cost.
Like all buildings, data centres have a carbon footprint. But what is often missed are the advances data centres are making in cutting embodied and operational carbon through innovative designs, advanced technologies, and sustainable practices aimed at reducing their GHG (Greenhouse gas) emissions.
Data centres are deploying clean energy storage systems and changing power generation infrastructure through the use of cleaner fuels.
Advances are opening up the prospect of data centres becoming support systems which help green traditional grids through the supply of on-site generated clean power.
The Cushman & Wakefield 2024 Global Data Center Comparison report stated that the data centre market reached new heights in 2023, responding to over 30GW of capacity demand. Power availability emerged as a critical issue, with major markets facing a shortage of large power blocks.
These constraints have prompted data centre operators to explore untapped and smaller markets globally. Power access and the growing demand for Artificial Intelligence (AI) are significantly influencing site selection strategies.
Clean Energy Storage
Whether for emergency back-up discharge in seconds, discharge over 15-30 minutes, or long-term discharge over days, weeks and months, how energy is stored on data centre sites and off-site has the potential to radically shake up power chain design and operation in pursuit of sustainability objectives.
In battery storage, traditional VRLA batteries are being phased out in favour of more compact, lightweight and sustainable equivalents. The first phase of change saw the adoption of Li-ion for its superior energy capacity, lower discharge losses, extended lifespan, software-driven optimisation and better remote management capability.
And while lithium remains a prevalent battery technology, more sustainable options with sufficient energy densities are emerging to become competitive as their costs decrease. These alternatives include flow batteries, zinc-nickel, and sodium-ion batteries.
Sodium-ion cells offer a cost-effective and sustainable alternative, utilising a more common and less expensive element than lithium, and can be recharged in about one-fifth of the time.
For short-term emergency power of 20 to 30 seconds, flywheels have been used to store energy in data centres across the world. Kinetic flywheels have seen success as energy storage components in the UPS power infrastructure.
Other emerging energy storage options include liquid air energy storage (LAES), also known as cryogenic energy storage (CES), in which air is liquefied and stored in a tank, then heated back to its gaseous form to drive a turbine.
Compressed gas systems have high reliability and a long life span that can extend to over 30 years. This is a long-duration, large-scale energy storage technology that can be located at the point of demand. The working fluid is liquefied air or liquid nitrogen (~78% of air).
The Grid and Gas Engines
There are many and significant implications for any sector seeking to shift from a reliable, proven energy solution.
This is as true for data centres as it is for traditional grid-based power systems.
As traditional power grids shift towards increasing use of renewables as their primary power source, data centres must ensure that their on-site equipment can handle the intermittent nature of this power generation cleanly and efficiently.


Data Centre Location Decisions
Grid stability, grid evolution and grid sustainability are key factors in where to locate a data centre. As grids change to renewables, data centre developers and operators need to understand the proportion of curtailed energy that might result.
In terms of distribution, ensuring continuous access to stable grid power must be considered within the context of grid modernization and restructuring. The question shifts from the current stability of the grid to its future stability.
Different regions have varying approaches to local power generation. A chosen location may have opportunities for power exports back to the grid, which can enable a potential revenue stream for data centre operators, albeit governed by local regulations concerning demand-side response (DSR).
Conclusion
Greenhouse gas abatement and net-zero operations require every component of the data centre power chain – grid connection, switchgear, battery, UPS and backup generator – to evolve.
To support this, a collaboration between i3 Solutions Group and EYP Mission Critical Facilities Inc., Part of Ramboll (EYP MCF), has released a series of white papers which provide detailed technical analysis to assist data centre operators in transitioning to carbon net-zero operations.
The series aims to offer vendor-neutral guidance and insights into the various technological options available for reducing the carbon footprint of data centre operations. A paper, titled "Infrastructure Sustainability Options and Revenue Opportunities for Data Centres," explores how the goals of reducing greenhouse gas emissions and increasing revenue-generating opportunities can be pursued simultaneously.
Data centres must be able to continue to provide traditional backup power, and they must adopt or adapt to modern clean power infrastructure that can also supply power back to the grid as part of demand-side response schemes.
In data centre on-site power generation, the transition to gas engines may become the preferred option. Parts of the industry are beginning to adopt gas as a replacement for diesel engines.
“Substituting diesel engines with low-carbon alternatives such as gas reciprocating engines or turbines, along with sustainable energy storage devices, will enable many data centre owners to reduce their carbon footprint and gain additional income from various grid support schemes. Gas-driven generators have low NOx and SOx emissions, allowing for unrestricted use. In contrast, standby diesel generators typically operate for only a few hours annually” (Source: Infrastructure Sustainability Options and Revenue Opportunities for Data Centres – i3 Solutions Group).
It is clear that in most locations the future of diesel as a standby power source for data centres will be constrained by increasing restrictions on use, tougher tax regimes, lower emissions targets, improved air quality requirements, and stricter noise regulations. All these factors point to a need for a complete re-evaluation of diesel.
These factors have significant implications for investments in power infrastructure and the design of new power chains within data centres.
In the age of energy transition data centres have the opportunity to be at the forefront of the green revolution, setting new standards for sustainability in the digital age.

Ed Ansett
Founder and Chairman, i3 Solutions Group



Mark Stickells
CEO of Pawsey Supercomputing Research Centre

Supercomputing Perspectives on Data at Scale, Future Technologies and Sustainability
The Pawsey Supercomputing Research Centre [Pawsey] is home to Australia’s research High Performance Computing system, Setonix. With 42 petaFLOPS of power, Setonix is the fastest research supercomputer in the Southern Hemisphere. It was also launched as the 4th ‘greenest’ supercomputer in the world.
Pawsey provides essential computing services to a growing range of industries and companies. It supports Australia’s radio astronomy programme, including the Square Kilometre Array (SKA) which is now under construction in Western Australia and South Africa.
Mark Stickells, the CEO of Pawsey, brings a diverse set of skills and qualifications to the role. He holds degrees in literature and management, and began his career in university admissions and research administration before moving into research management and leadership roles. His first CEO role was leading an energy R&D joint venture between some of the major players in WA resources, including Chevron, Woodside and Shell, together with local universities and the CSIRO.
He joined Pawsey in 2018 in order to work in the world of high performance computing and to be part of “a wave of opportunity around digital technology and innovation”. In that year the then Australian Prime Minister announced $70 million to refresh the foundational systems and architectures behind Pawsey: “And so I joined at a time when there was a positive opportunity to lead a transformation at a national research facility, supporting culture and performance improvements as well as a major capital technology program”. He considers that his variety of work experiences and skills means he is able to complement the more specific focus of the technical domain experts.
Pawsey’s Supercomputer, Setonix, comprises advanced computing infrastructure that works in massively parallel and accelerated ways to deal with solutions to questions and problems that can't be delivered through conventional means such as cloud or desktop computing. Setonix is founded on specific architectural requirements and offers a very specialised data environment to support the sort of data workflows and data generation required for the solution. Broadly, supercomputing is applied to large-scale computational modelling and simulation activities.
Stickells explains, “There are certain domains, industries, and indeed national interests that require very advanced and dedicated infrastructure to support areas of science and enterprise, national security and key services such as weather prediction and climate modelling. But increasingly, as computing power continues to grow exponentially, the models for delivering advanced computing are changing. In some cases, virtual access to smaller instances of hundreds of nodes of a computer to operate advanced work is now available via commercial providers or within the resources of CSIRO and universities. Commercial cloud-based service providers have entered the specialised world of the advanced supercomputing market over the past five to seven years and this reflects the growing demand for advanced computing, as well as advances in the underlying technology”.
Pawsey’s systems manage vast data flows and computational requirements, and work with experts to accelerate scientific discovery, to reach the meaning behind the data.
We didn't get through the pandemic and develop a vaccine after 200 days just by using petri dishes.
COMPUTATION ACTED AS A FORM OF NON-HUMAN TESTING, DEVELOPING PLANNING TOOLS THROUGH COMPUTER-GENERATED SCENARIOS AND ALTERNATIVES.
For example, radio astronomy uses antenna arrays aimed at different parts of the galaxy and beyond to pick up distant signals that are processed via a supercomputer. The vast volumes of data need to be turned into something that scientists and astrophysicists can use – for example, to determine that what they have detected is a supernova, the remnant of a supernova or a pulsar. But it requires advanced supercomputing operating as part of the scientific workflow to be able to arrive at that knowledge.
Pawsey therefore forms part of the scientific workflow and discovery process for a range of different domains. As more domains are becoming data rich, more organisations are becoming aware that they require systems like Pawsey to analyse the data to reach the insights. Mark illustrates with a recent example:
“We didn't get through the pandemic and develop a vaccine after 200 days just by using petri dishes. Computation acted as a form of non-human testing, developing planning tools through computer-generated scenarios and alternatives. They accelerated responses through using a number of national supercomputing resources to tackle what was a global challenge. And that's an illustration of the importance of this sort of infrastructure”.
Pawsey also strives to achieve sustainable supercomputing, through initial design and engineering and through the sustainable sourcing of power and optimised consumption.
It has operated with geothermal cooling and renewable energy infrastructure for a decade and is currently hosting thermal battery technology to improve operational capability. Pawsey is also exploring novel approaches to optimise applications and codes for scientific discovery, including accounting for computing resources in units of energy rather than in terms of compute 'cycles'.
“Sustainability and performance were included in our procurement strategies for our new systems. Setonix had performance as a requirement but it also had energy efficiency and sustainability as investment requirements. So we were effectively demanding that we wanted the most energy efficient approach, not just the most powerful approach”
Pawsey is supported by both the Australian and State Governments, and it also works with a number of significant public agencies including the CSIRO and universities. Mark considers it important to be accountable for that support, and he sees sustainability as inseparable from that accountability.
He also considers that sound sustainability practices are important in any business: “For us, it has helped us recruit and retain good staff”.
He adds a further dimension to the sustainability debate based on the reasons Pawsey was established and its delivery on that purpose. This provides a context for consumption rather than a focus on the consumption of resources as an end in itself.
“The system requires about 1.5 megawatts to operate, which is about one and a half times the previous system, but it generates 50 times the compute performance. If we're drawing that power from a grid that isn't predominantly renewable then we are making a significant contribution to carbon emissions, at an output of several thousand tonnes a year. It is important we understand this cost and environmental impact and strive to reduce it; however, it is also important to demonstrate the value of the science and research that we enable for the benefit of humanity.”
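The arithmetic behind that figure is straightforward. The sketch below is illustrative only: the grid emission factor is an assumed value, not one quoted by Pawsey.

```python
# Rough carbon arithmetic for a 1.5 MW continuous load.
# The grid emission factor is an assumed illustrative value, not Pawsey data.

power_mw = 1.5
hours_per_year = 8760
energy_mwh = power_mw * hours_per_year            # about 13,100 MWh per year

grid_emission_factor = 0.5                        # tonnes CO2e per MWh (assumed)
emissions_tonnes = energy_mwh * grid_emission_factor

print(f"Annual energy: {energy_mwh:,.0f} MWh")
print(f"Implied emissions at {grid_emission_factor} tCO2e/MWh: {emissions_tonnes:,.0f} tonnes")
# => several thousand tonnes a year unless the grid is largely renewable
```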
The inter-dependence of advances in astrophysics and advances in supercomputing creates a continuum of dynamic change for Pawsey:
“It's very challenging to align the long-term ambitions of various scientific disciplines with the fairly rapid and sustained uplift of advances in ICT – we see this most directly in radio astronomy. It may take a decade to design and construct a telescope array - how far computing will advance in that decade is also a factor in instrument design”.
The time to build a telescope or an array will see two or three uplifts in computing performance, or even in architecture types. Mark continues: “We're building supercomputers now that even five years ago would be designed quite differently. The component architecture, design, innovation, data and algorithms that are now working to advance generative AI and other machine learning approaches, are broader and demanding more compute and data power than systems in the late 2010s – following the COVID pandemic, the 2020s will probably be defined through major advances and wider adoption in AI into science, enterprise and communities world-wide”.
SUSTAINABILITY AND PERFORMANCE WERE INCLUDED IN OUR PROCUREMENT STRATEGIES FOR OUR NEW SYSTEMS. SETONIX HAD PERFORMANCE AS A REQUIREMENT BUT IT ALSO HAD ENERGY EFFICIENCY AND SUSTAINABILITY AS INVESTMENT REQUIREMENTS. SO WE WERE EFFECTIVELY DEMANDING THAT WE WANTED THE MOST ENERGY EFFICIENT APPROACH, NOT JUST THE MOST POWERFUL APPROACH.


Pawsey’s move from CPUs to a hybrid system that now draws 80% of its performance from GPUs has facilitated a significant improvement in compute performance per unit of energy. This transition meant that “Pawsey needed to work with stakeholders to help migrate their codes away from systems that were heavily CPU dependent to the coding environment that's needed to run off GPU architectures”.
Stickells’ view of the future in the advanced computing field looks first at the growth in computing power and capacity. “If I were to fast forward and look at what might happen in ten years’ time, I’d say we will have yet more performance per square metre of space and that this space would be predominately sustainable in its energy use. In ten years, I’d also offer that we may have quantum computing chips in the fabric of a heterogeneous system that unlock exponentially more powerful algorithms.”
As computing power increases Stickells considers that the focus on various elements of sustainability will increase also:
“I think we will focus on the entire footprint, looking at the carbon that goes into making what we use. I would imagine these systems will have been designed with that in mind, so that if they're generating heat or using water – and we recycle that in our case – how precious those resources are is taken into account. Those sorts of considerations will be headlines in the design of these flagship systems over the next decade. Arguably, 10 years ago, this consideration was a byline.
Major hyperscalers are now starting to report what their water consumption is as communities realise what goes into actually providing the digital capability at their fingertips, or in homes, schools and businesses. The impact of this, and the awareness and understanding of it globally, will be much more widespread than it is now.”
Stickells is also concerned about the availability of supercomputing facilities in younger nations given the increasing importance of that capacity to scientific, social and economic advancement:
“I think an interesting question for the future is how do nations invest in this? We see how major companies are investing in it. But how can nations and regions invest in this as it is becoming increasingly a part of science, in health care, disease, environment, climate needs …?”
He also questions how long the business model on which Pawsey was set up can be maintained as the demand for supercomputing ramps up:
“We aren't a commercial provider, so our partners understand the different levels of service capability and redundancy that we offer. Aspirationally, we are like CERN, a national advanced computing and scientific data facility. It will be interesting to see how long our model operates with different levels of public and private support when the demands for computing are becoming as intensive and widespread as they are.
I’m proud that Pawsey is a mission-led type of organisation and we're advancing science for the benefit of all the communities we serve. The research we support has widespread impact on our understanding of our climate, in environmental biodiversity protection, or in vital medical and health research, through to energy – in the latter, from the design or operation of our facility to using supercomputers to advance the design and function of novel batteries”.
Stickells ends the interview with a focus on the question of why he chooses the work he does. He sees this question being as important as the question of how he undertakes that work in ensuring Pawsey’s future focus and direction:
“Why we do this is really important because the ‘how’ is intensive – it is capital intensive, energy intensive, people intensive. Through our support of leading scientists and researchers we are tackling some of the most important human, environmental and economic challenges of our time. The power of supercomputers to accelerate discovery and provide insights in solving some of the world’s intractable problems, and leading a team that is committed in the way they are at Pawsey, is what motivates me every day. So we want to be part of the solution, not just part of the problem”.

Data Centers: Powering the Future, Sustainably

This Q&A dives into the world of sustainable data centers with Walt Coulston, Founder & CEO of Green Square DC, a company on the forefront of sustainable data center solutions. As data demands soar with the rise of AI, ML, and IoT, Walt sheds light on the challenges and opportunities of achieving sustainability in this ever-growing industry.
Emerging technologies like AI, ML, IoT etc. are placing an increasingly greater strain on data centers. Given how data centers are already huge power guzzlers, how is all this making it more challenging to achieve sustainability goals?
Emerging technologies are indeed intensifying the demand for data processing power, which in turn increases energy consumption. This challenge underscores the necessity for innovative solutions in energy efficiency and sustainability. Leading organizations are pioneering the use of scalable closed loop Direct Liquid Cooling systems with integrated renewable energy and storage solutions, ensuring that even as demand grows, our carbon footprint is minimized.
But can AI also help automate and streamline processes that can eventually lead to reducing a data centre’s carbon footprint?
Absolutely. AI has the potential to revolutionise data centre operations by enhancing energy efficiency through predictive analytics and automated management systems. By leveraging AI to optimise cooling, workload distribution, and power usage, data centres can further reduce their energy consumption. Google's DeepMind reported a 40% reduction in the energy used for cooling when it applied machine learning-based AI as far back as 2016. In our company, we're looking to mirror this and manage our operations in real time, ensuring optimal performance while minimising environmental impact.
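To give a flavour of how such optimisation works, here is a deliberately simplified sketch – not DeepMind's or Green Square DC's actual system. It fits a model of cooling power against historical setpoints (all figures invented) and picks the setpoint the model predicts to be most efficient within an assumed safe limit.

```python
# Simplified sketch of ML-assisted cooling optimisation (illustrative only).
# Fit a model of cooling power vs. chilled-water setpoint from historical
# samples, then recommend the setpoint predicted to use the least power
# within an assumed safe operating limit.
import numpy as np

# Hypothetical historical telemetry: (setpoint degC, cooling power kW)
setpoints = np.array([16, 17, 18, 19, 20, 21, 22, 23])
cooling_kw = np.array([410, 385, 362, 344, 331, 322, 318, 320])

model = np.polyfit(setpoints, cooling_kw, deg=2)   # simple quadratic fit

candidates = np.linspace(16, 24, 81)               # allowed setpoint range
safe = candidates <= 23                            # assumed safe upper limit
predicted = np.polyval(model, candidates)

best = candidates[safe][np.argmin(predicted[safe])]
print(f"Recommended setpoint: {best:.1f} degC, "
      f"predicted cooling power: {np.polyval(model, best):.0f} kW")
```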
When it comes to achieving energy efficiency via other means such as liquid cooling, what makes it challenging to walk the talk on sustainability?
While liquid cooling is a highly efficient solution, its implementation does pose challenges in terms of its wholesale adoption and understanding, its certification and operational demarcation. However, these challenges are surmountable with the right upfront commitment and of course, strategic partnerships. Our partnership with industry leaders such as NVIDIA and Schneider Electric enables us to design scalable and cost-effective technologies that align with our customers' sustainability and operational goals.
Some industry experts feel PUE isn't the best way to gauge energy efficiency. What are your views on PUE?
Power Usage Effectiveness (PUE) is a valuable metric, but it does have limitations. PUE focuses solely on the efficiency of energy use within the data centre without accounting for the source of that energy or the overall environmental impact. We believe a holistic approach is necessary, incorporating metrics that evaluate renewable energy use, carbon emissions, and resource utilisation. We also concur with NVIDIA who recently argued that instead of focusing on power in terms of watts, data centres should focus on energy measured in kilowatt-hours or joules, adding that any new benchmark should also measure advances in accelerated computing – using parallel hardware and software processing to speed up work on demanding applications, such as generative AI.
Net Zero" is all the rage these days, but do we really understand what it entails, or is it just a feel-good PR campaign?
"Net Zero" is a critical and a wholly worthwhile goal over the long term, but to even close the gap requires a deep understanding of the issues at hand and a genuine commitment to deal with them. As an industry, I don’t feel simply offsetting our emissions is the answer. In fact, as power becomes more of a premium, we feel renewable PPA’s will likely become prioritised, and Data Centres may not always be at the top of that list. Our innovative ‘Five Pillars to Sustainability™ model provides a pathway to a true net-zero future. By focusing on hybrid liquid cooling with denser white space, removing diesel fossil fuels from operation and a commitment to zero wastewater, we aim to set a new industry standard, which in time will become the new norm.
What kind of regulatory environment is required for fostering an environment where data centers can genuinely commit to and achieve realistic sustainability goals?
A supportive regulatory environment is crucial. This includes minimum operational guidelines, incentives for adopting renewable energy, and frameworks that encourage innovation and investment in green technologies. Policies should also facilitate collaboration between the private sector, government, and research institutions.

In our company, we advocate for regulations that support these principles and that are proactive in nature, not reactive – because by then it is too late and the damage is done.
What are some of these realistic sustainability goals?
Realistic sustainability goals can be achieved by making measurable improvements across various aspects of operations. For instance, transitioning to fossil-fuel-free backup generators, aiming for zero waste to landfill, and committing to 100% e-waste recycling are all significant steps.

Additionally, improving embodied carbon in construction by using sustainable materials like green concrete and timber, and fostering diversity with initiatives such as a 40/40/20 gender split in leadership roles, are essential. Each of these goals, whether related to energy, waste, or social impact, marks meaningful progress towards a sustainable future.
What drew you to working towards sustainability in data centers, and what advice would you give to young professionals interested in pursuing a career focused on sustainable AI and data centres?
My passion for sustainability stems from a profound respect for our environment and a true belief in the role we can all take in shaping a more sustainable future. Creating more sustainable solutions in the data centre allows me to merge these interests, driving meaningful change. To young professionals, I would advise focusing on continuous learning, thinking outside of the box and speaking out about your beliefs. AI and the data centre is such a dynamic field, and those who are knowledgeable, innovative, and committed to making a difference will find it incredibly rewarding.


Powering a Sustainable Digital Future: Huawei 2024 Summit Charts Course for Green Data Centers
Huawei’s Global Data Center Facility Summit 2024 was held in Singapore on May 17, 2024, and brought together a significant number of industry leaders, technical experts, and ecosystem partners. Over 600 attendees explored the newest opportunities and trends in the global data center industry under the theme "Power the Digital Era Forward," which signified a turning point in the intelligent computing era.
The transformational potential of intelligent computing and artificial intelligence (AI) was highlighted at the summit. AI is bringing about immense opportunities and rapid advancements, noted Charles Yang, Senior Vice-President, Huawei and President of Global Marketing, Sales, and Services, Huawei Digital Power.
Since ChatGPT first appeared, AI has driven a rapid expansion in processing capacity, resulting in unparalleled growth in data center infrastructure.
Over the next five years, the global data center capacity is expected to increase by 100 GW, with the market value surpassing $600 billion.
Yang also highlighted the difficulties that come with this expansion. Two important issues are energy usage and reliability. The potential impact of faults increases exponentially as data centers scale up from megawatt level to gigawatt-level, which can impact millions of users. More than ever, ensuring end-to-end reliability is essential. Furthermore, AI is consuming an exponential amount of energy; for example, training models such as GPT-4 requires 10 times more power than in previous versions.



In order to address these challenges, Huawei is building new energy infrastructure and promoting advancements in digital industries, power systems, and electric vehicles. Huawei wants to grow the industry and fully embrace the era of intelligent computing by collaborating with stakeholders and customers.
Huawei's Data Center Facility & Critical Power Business President, Sun Xiaofeng, outlined the company's strategy in his keynote address. He highlighted the importance of rapid deployment, flexible cooling, green energy, and maximum reliability for data centers. Huawei's solutions, which include prefabricated and modular data centers, advanced cooling technologies, and integrated green energy systems, are crafted to meet the demands of the intelligent computing era.
In an exclusive interview with W.Media, Sun Xiaofeng also discussed the impact of intelligent computing on data centers and detailed Huawei's approach to addressing power shortages and rising electricity costs through both supply-side and demand-side measures.
He emphasized Huawei's commitment to enhancing power distribution system security through product reliability and proactive intelligent prevention. Additionally, Huawei offers comprehensive capabilities, including products, consultation and design services, and ecosystem development, to build next-generation intelligent computing centers.
Moreover, a notable highlight of the summit was the release of the white paper "Building Next Generation Data Center Facility in ASEAN," co-developed by the ASEAN Centre for Energy (ACE) and Huawei. This document outlines the status, challenges, and trends in the ASEAN data center industry, emphasizing the need for efficient and energy-saving solutions to meet sustainability goals. The white paper provides policy recommendations to support the development of reliable, simplified, sustainable, and smart data centers.

Dr. Nuki Agya Utama, Executive Director of ACE, and Dr. Andy Tirta, also from ACE, spoke about the importance of renewable energy and energy efficiency in supporting the region's digital transformation. The white paper highlights the critical role of clean energy in reducing carbon emissions and operational costs for data centers, advocating for incentives such as discounted electricity rates and tax breaks for operators utilizing renewable energy.
The summit's exhibition area showcased Huawei's scenario-based solutions for various data center sizes, featuring collaborations with ecosystem partners like CIMC, Weichai, CSCEC, and Huashi. One of the standout exhibits was the Huawei Power POD, which made its global debut. This innovative solution offers a plug-and-play, high-protection power system per container, significantly reducing time-to-market and enhancing reliability.
Furthermore, the Global Data Center Facility Summit 2024 highlighted the importance of collaboration in driving the industry forward. By aggregating partner ecosystems and fostering innovations, Huawei aims to help customers build reliable computing infrastructure, accelerating the adoption of AI and powering the digital era.
As AI continues to present vast opportunities, Huawei's commitment to innovation and sustainability positions it as a leader in the intelligent computing era. Through collaborative efforts and forward-thinking solutions, Huawei and its partners are paving the way for a greener, more reliable future for data centers worldwide.


Innovation powers Filipino DC provider’s new facility
In the fast-evolving world of technology, change is inevitable, and only those who innovate survive in the long run. Two organisations that understand this best are Total Information Management (TIM) Corporation, a Philippines-based provider of Data Center and Managed Services, and Starline, a Legrand brand and renowned maker of power distribution equipment. The two have been working closely together and overcoming technological challenges as a team.
Founded in 1985 as an IT equipment and peripherals supplier, TIM is a homegrown Filipino company that has evolved into a leading technology solutions provider today, in a market dominated by multinationals. TIM specialises in data center services, Information Technology (IT) infrastructure, connectivity, cybersecurity, and managed services.
Meanwhile, Starline Track Busway, Starline Plug-In Raceway, and Starline Critical Power Monitor have earned the respect of industry professionals as economical, flexible and fast power distribution solutions for mission-critical, data center, healthcare, higher education and industrial environments.
Starline Enables Innovation at TIM’s C2 Data Center
When it was in the process of setting up its new data center, an 8MW high-density facility in Carmona City, Cavite, TIM chose Starline for the job.
Named C2, this new facility has been designed to house more than 500 racks with a maximum power loading of 20 kVA per rack, and is a TIA-942 Rated-3 Design and Facilities certified data center.
Instead of a traditional power cable set-up, TIM went with the Starline Track Busway System, deploying 250A and 400A busway with metering. These systems can be easily deployed by simply plugging them into the existing infrastructure.
The busway system is especially well suited for dynamic environments looking to expand or change layouts, or for those who would simply like to have these options available in the future.
Other main benefits of the Starline Track Busway product include reduced facility construction costs, faster installation and the ability to customize solutions to fit most facility needs.


Cost benefits and long-term savings
The switch to Starline’s Track Busway is also expected to provide cost benefits over time for TIM.
“The long-term cost savings can be significant. The ease of maintenance, scalability, and energy efficiency of their busway system can lead to lower operational costs over the lifespan of the data center,” said Robin A. Bernados, Vice President, DCO, CTO and CISO at TIM. He further explained, “Since it is an overhead system, it eliminates obstructing airflow under the raised floor. Not only does it significantly decrease installation and maintenance time and expense, it can also eliminate panel boards, long runs of conduit and wire, and costly installation and maintenance work. As such, it becomes very easy to expand, reconfigure, or relocate operations.”

Forward together
Impressed by Starline’s products, TIM has plans to partner with them on at least two more upcoming projects.
“TIM aims to continue to be an enabling company, an enabler of industries and capabilities that builds value in a secure, stable, and efficient way. Our team adopted specific approaches to solve pain points, which has nurtured a staunch and loyal community of clients and partners,” affirms Bernados, adding, “By continuing our partnership with Starline, the capabilities of TIM become more impactful. We believe this will grow and advance the businesses of our clients as well, and enable us to innovate new avenues to solve more problems for the businesses we do and will serve.”
Having a busway system stands out as a game-changer in power distribution for our operations. Unlike traditional power cable setups, it provides seamless installation, adaptability to evolving and various requirements, and efficient power delivery.
Its patented U-shaped copper busbars and continuous access slot design allow for seamless connection and easy layout changes without service interruption, which is a significant advantage over traditional pipe-and-wire systems.


Kim Yong Ng Regional Sales Manager, Starline
Robin A. Bernados Vice President, DCO, CTO and CISO, TIM



Achieving Sustainability Goals using Virtualization and Consolidation
Skeptics have often questioned if we can have truly “Green” data centers, but that shouldn’t stop us from innovating and making more sustainable choices now, should it? While there is no magic wand one can wave, Virtualization and Consolidation can help data centers walk the talk on Sustainability. Let’s take a closer look at what they entail.
Virtualization: Making Data Centers more Efficient and Sustainable
Data centers often run multiple applications on separate physical servers, leading to underutilization of computing power.
Rajat Srivastav (Director of Operations, Environmental, ESG and Climate Change Services-India, Jacobs) explains, “Common practice in data centers is to dedicate physical servers to a single application (or ‘workload’), typically three to five physical servers per application. This results in gross under-utilization of available computing capacity, mostly in the range of 12 to 18 percent of the total available computing power.”
Virtualization, however, enables multiple virtual machines (VMs) to run on a single physical server, so multiple applications can share the same physical hardware, significantly increasing server utilization rates. Fewer servers are needed to handle the same workload, which reduces energy consumption.
“In today’s era, where data center density is increasing (an 8kW PDU from a few years ago is 50kW now and heading to 100kW), it’s always wise to have compact machines, i.e. blades. And to cope with further logical segregation of hardware, virtualization is key,” says Gaurav Dixit (Service Delivery Manager, Strategic Alliances, Lefdal Mine Datacenter). He explains, “It helps with dynamic resource sharing, which is key in the cloud model, and in technical terms the server consolidation ratio can help reduce power consumption by 30 to 70 percent.”
“Virtualization allows for fewer physical servers in a data center, with each remaining physical host server operating at higher total utilization. This maximizes the use of computing power while eliminating redundant servers in favour of a single host, which reduces the overall operational footprint and results in significant Scope 1 and Scope 2 reductions, including energy consumption cuts mostly in the range of 52 to 65 percent,” says Srivastav.
Moreover, by minimizing the need for physical servers, virtualization also reduces floor space requirements, not to mention a reduction in cooling expenses, thus leading to significant cost savings.
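To make these ranges concrete, here is a rough back-of-the-envelope sketch in Python of how such a consolidation estimate might be run. Every input, from the workload count and servers per application to the per-server wattage and utilization targets, is an illustrative assumption loosely based on the figures quoted above, not data from any operator mentioned here.

```python
import math

# Rough virtualization-savings estimate. All inputs are illustrative
# assumptions, not measured data from any specific facility.
WORKLOADS = 100            # applications in the estate
SERVERS_PER_WORKLOAD = 4   # "three to five physical servers per application"
WATTS_PER_SERVER = 500     # assumed average draw per physical server
UTIL_BEFORE = 0.18         # "12 to 18%" utilization before virtualization
UTIL_AFTER = 0.50          # assumed target utilization after virtualization

servers_before = WORKLOADS * SERVERS_PER_WORKLOAD
# Work actually being done, expressed in fully busy "server equivalents".
useful_capacity = servers_before * UTIL_BEFORE
# Hosts needed to carry the same work at the higher target utilization.
servers_after = math.ceil(useful_capacity / UTIL_AFTER)

# Simplification: power is treated as proportional to host count, since
# lightly loaded servers still draw a large share of their rated power.
power_before_kw = servers_before * WATTS_PER_SERVER / 1000
power_after_kw = servers_after * WATTS_PER_SERVER / 1000
saving = 1 - power_after_kw / power_before_kw

print(f"Hosts: {servers_before} -> {servers_after}")
print(f"IT power: {power_before_kw:.0f} kW -> {power_after_kw:.0f} kW "
      f"({saving:.0%} reduction)")
```

With these assumed inputs the estimate lands at roughly a 64 percent reduction, comfortably within the ranges the experts cite; the point is the shape of the calculation rather than the exact figure.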
Virtualization also facilitates swift and efficient backup, replication, and migration of VMs which helps in smoother disaster recovery processes. In the event of a server failure or during maintenance, VMs can be seamlessly transferred to another server with minimal downtime. This capability ensures that business operations remain uninterrupted, maintaining productivity and minimizing the impact of hardware issues on overall operations.
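As an illustration of the migration capability described above, the minimal sketch below uses the libvirt Python bindings to live-migrate a VM between two KVM/QEMU hosts. The host URIs and VM name are hypothetical, and a production setup would also handle shared storage, authentication, and error recovery.

```python
import libvirt  # libvirt-python bindings; assumes a KVM/QEMU environment

# Hypothetical hosts and VM name, for illustration only.
SRC_URI = "qemu+ssh://host-a.example.com/system"
DST_URI = "qemu+ssh://host-b.example.com/system"
VM_NAME = "erp-app-01"

src = libvirt.open(SRC_URI)
dst = libvirt.open(DST_URI)

dom = src.lookupByName(VM_NAME)

# VIR_MIGRATE_LIVE keeps the guest running while memory pages are copied,
# so the workload sees only a brief pause at switchover.
dom.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)
print(f"{VM_NAME} is now running via {DST_URI}")

src.close()
dst.close()
```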
In the age of Artificial Intelligence (AI), Virtualization plays an even more important role.
“AI has the most dynamic load: during training it requires maximum resources, while during inference the demand is significantly lower, so the dynamic resource-sharing feature of virtualization is very helpful here,” says Dixit. “When more capacity is required during the training phase, it can be scaled without much impact on capital or operating expenditure. Microservices play a key role in the efficient deployment of modules, and microservices are strongly supported by virtualization technologies like containers,” he explains.
Srivastav concurs, saying, “AI applications, particularly machine learning workloads, require tremendous computing capacity even during the initial training phase. This can be optimized through virtualization, which reduces the dependency on multiple server configurations, especially for high-density AI applications. The load reduction can be as significant as 50 to 60 percent in terms of power consumption.”
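A similarly rough sketch illustrates the point both experts make about AI’s spiky load profile: if accelerator capacity is reserved statically for the training peak, most of it idles during inference, whereas dynamic resource sharing releases it. All node counts, wattages, and duty cycles below are assumptions chosen purely to show the arithmetic.

```python
# Illustrative comparison of static vs. dynamic resource allocation for an
# AI workload. All numbers are assumed for the example, not measured.
TRAIN_NODES = 40        # accelerator nodes needed at the training peak
INFER_NODES = 12        # nodes actually needed to serve inference
NODE_KW = 6.0           # assumed average draw per accelerator node (kW)
TRAIN_FRACTION = 0.20   # assumed share of the month spent training
HOURS = 730             # hours in an average month

# Static provisioning: the full training fleet stays powered all month.
static_kwh = TRAIN_NODES * NODE_KW * HOURS

# Dynamic sharing: the fleet scales with the phase of the workload, and the
# freed capacity is assumed to be reused elsewhere or powered down.
dynamic_kwh = (TRAIN_NODES * NODE_KW * HOURS * TRAIN_FRACTION
               + INFER_NODES * NODE_KW * HOURS * (1 - TRAIN_FRACTION))

saving = 1 - dynamic_kwh / static_kwh
print(f"Static:  {static_kwh:,.0f} kWh/month")
print(f"Dynamic: {dynamic_kwh:,.0f} kWh/month ({saving:.0%} lower)")
```

Under these assumptions the reduction works out to about 56 percent, in the neighbourhood of the 50 to 60 percent figure cited above; the real number depends entirely on the workload’s duty cycle.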
Virtualization is the foundation of Cloud technology. The inherent flexibility of virtualization supports easy scaling in response to changing business needs. Resources can be allocated or deallocated to VMs as application demands fluctuate, improving responsiveness and agility. Additionally, Cloud-based virtualization allows organizations to leverage the expansive resources of public Cloud providers.
WHEN AN ORGANIZATION ADOPTS A DATA CENTER CONSOLIDATION STRATEGY, THAT STREAMLINING ALSO INCREASES THE ORGANIZATION'S ENERGY EFFICIENCY, WHICH REDUCES THE COMPANY’S ENERGY CONSUMPTION AND LESSENS ITS CARBON FOOTPRINT.
___ Rajat Srivastav (Director of Operations, Environmental, ESG and Climate Change Services-India, Jacobs)
This integration facilitates even greater scalability and flexibility, enabling businesses to scale operations up or down efficiently, based on demand, without the need for significant investments in additional physical infrastructure.
Consolidation: Packing the 1-2 punch against energy wastage
Consolidation involves reducing the number of data centers or servers in use by combining workloads and resources. This strategy complements virtualization and amplifies its benefits. For example, by consolidating data centers, organizations can centralize their Information Technology (IT) management, thereby simplifying monitoring and optimization, allowing for better control over energy use and more efficient deployment of resources.
“When an organization adopts a data center consolidation strategy, that streamlining also increases the organization’s energy efficiency, which reduces the company’s energy consumption and lessens its carbon footprint,” says Srivastav.
Moreover, larger, consolidated data centers can implement advanced cooling technologies more effectively than smaller, dispersed facilities. Techniques such as hot and cold aisle containment, liquid cooling, and free cooling (using outside air) can be employed at scale, leading to significant energy savings. Improved cooling efficiency means less energy is required to maintain optimal operating temperatures for servers.
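One way to see why cooling efficiency matters so much is through Power Usage Effectiveness (PUE), the ratio of total facility power to IT power. The short sketch below compares a dispersed estate of small rooms with a consolidated site that can justify containment and free cooling; both PUE values are assumptions for illustration, not benchmarks.

```python
# Illustrative PUE comparison. PUE = total facility power / IT power,
# so (PUE - 1) is roughly the overhead spent on cooling, power losses, etc.
# The PUE values below are assumed for illustration only.
IT_LOAD_KW = 1_000            # same IT load in both scenarios

PUE_DISPERSED = 1.8           # assumed: several small, lightly optimized rooms
PUE_CONSOLIDATED = 1.3        # assumed: containment, liquid/free cooling at scale

total_dispersed = IT_LOAD_KW * PUE_DISPERSED
total_consolidated = IT_LOAD_KW * PUE_CONSOLIDATED

overhead_saved_kw = total_dispersed - total_consolidated
print(f"Facility power: {total_dispersed:.0f} kW -> {total_consolidated:.0f} kW")
print(f"Overhead avoided: {overhead_saved_kw:.0f} kW "
      f"({overhead_saved_kw / total_dispersed:.0%} of the dispersed total)")
```

Under these assumed PUEs, the same IT load needs roughly 28 percent less facility power once the cooling overhead shrinks.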
Consolidation also reduces the need for redundant systems. In dispersed data centers, redundancy is often required to ensure availability and reliability, leading to higher energy consumption. By consolidating, organizations can streamline their redundancy strategies, using fewer physical resources to achieve the same level of reliability, thus conserving energy.
Larger, consolidated data centers can also take advantage of economies of scale, investing in more efficient, large-scale infrastructure that would be cost-prohibitive for smaller facilities. This includes high-efficiency power supplies, advanced energy management systems, and bulk purchasing of renewable energy. These economies of scale contribute to a lower overall energy footprint.
But Dixit says there are a few factors one must be mindful of while contemplating consolidation. “A domain analysis is a must before going ahead with a data center plan,” he advises, also emphasizing the need for thorough technology analysis, procurement planning, risk assessment and proper research while putting together a Go-to-Market (GTM) strategy. “It should not be so fast that you miss due diligence, PoC planning and benchmarking, and it should not be so slow that you miss the industry’s pace and the chosen technology becomes obsolete before the end of its lifecycle,” he warns.









