CXO DX April 2024


ENABLING EFFECTIVE AI DEMOCRATIZATION

Democratization of AI is helping break down traditional hierarchical and functional barriers to technology access, which in turn opens up possibilities for enhanced innovation at work and better decision making across the workforce spectrum. Organizations that are able to empower employees with access to AI tools in their work and daily operations could also see better outcomes in terms of innovation and collaboration. For instance, there are AI-powered platforms that help automate data cleaning, preparation, and visualization, enabling employees to extract insights from complicated datasets without requiring specialized skills. Empowering employees with low-code/no-code AI tools and platforms will give them the resources to build customized applications.

The challenge is to ensure proper training and support when implementing AI democratization across the enterprise, so that employees are upskilled and empowered in their roles. On the other hand, the quality of the AI output that you may use for decision making will be proportionate, in large measure, to the quality and volume of data used to train the model, an AI forecasting model for instance. There is therefore a need to ensure high data quality standards, else the result could be inaccuracies, errors in judgement and discrimination. These are among the challenges of wider AI implementation in the enterprise.

According to Gartner, the democratization of access to AI makes the need for AI Trust, Risk and Security Management (TRiSM) more urgent to ensure AI model governance, trustworthiness, fairness, reliability, robustness, efficacy and data protection. Without these safeguards, AI models could have a negative effect instead of delivering positive outcomes.

In the final analysis, organizations must look at building a robust AI strategy and framework to leverage the benefits over the longer run, while also monitoring performance and ensuring ethical usage.

RAMAN NARAYAN

Co-Founder & Editor in Chief narayan@leapmediallc.com Mob: +971-55-7802403

Sunil Kumar Designer

SAUMYADEEP HALDER

Co-Founder & MD

saumyadeep@leapmediallc.com Mob: +971-54-4458401

Nihal Shetty Webmaster

MALLIKA REGO

Co-Founder & Director Client Solutions mallika@leapmediallc.com Mob: +971-50-2489676

PUBLISHED BY - Leap Media Solutions LLC REGISTERED OFFICE: Office 10, Sharjah Media City | www.cxodx.com

14 » NAVIGATING THE EVOLVING CYBER THREAT LANDSCAPE

The shifting terrain of cyber threats in an increasingly AI dominated world means that organizations need to adopt a proactive approach in embracing innovative threat management solutions and effectively protect their digital assets.

20 » BEST APPROACHES TO BACKUP

Fred Lherault, CTO Emerging, Pure Storage shares his view on best approaches to backup.

22 » BUILDING ROBUST APPROACHES

Ranjith Kaippada, Managing Director, Cloud Box Technologies discusses his insights around the changing backup and disaster recovery approaches.

24 » A SHIFTING LANDSCAPE

Ankit Satsangi, Director - Enterprise Development at Beeah discusses the evolving cyber threat landscape and cybersecurity approaches.

30 » WHY DNS IS A FAVOURITE ATTACK VECTOR

Terry Young, Director of Service Provider Product Marketing, A10 Networks discusses why DNS exploits continue to be a top attack vector in 2024.

18 » RETHINKING CYBERSECURITY APPROACHES

Michael Heering, Global Field Marketing Director, SANS Institute discusses the impact of AI on the evolving cybersecurity landscape and the need to focus on cybersecurity training.

26 » THE JOURNEY FROM MONITORING TO OBSERVABILITY

To realize the full potential of their full-stack observability solutions, IT leaders will need to ensure all stakeholders in the IT department are on board, says Gregg Ostrowski, CTO Advisor, Cisco Observability

28 » BUILDING FOR PORTABILITY IN THE CLOUD

Having data portability is crucial to be able to move things around as needed and to simply maintain data hygiene in the long term, writes Rick Vanover, Senior Director of Product Strategy, Veeam

32 » DEFEND YOUR DIGITAL ASSETS

Organizations need to employ a full range of security options to protect their APIs and ultimately, their business, writes Lori MacVittie, F5 Distinguished Engineer.

34 » HOW ENTERPRISES CAN AVOID TECH DEBT WHILE PIONEERING PROGRESS

Dinesh Varadharajan, Chief Product Officer (CPO) at Kissflow says technical debt can become an obstacle to the very transformation that was originally envisaged.

12 » REGULATION REMAINS THE STRONGEST MULTIPLIER TO CYBERSECURITY GROWTH, ACCORDING TO REPORT FROM FROST & SULLIVAN

06 » NEWS

36 » TECHSHOW

38 » TRENDS & STATS

MEET THE DYNAMIC DUO

Meet us at GISEC Global 2024, where Exclusive Networks and Fortinet come together in an unparalleled collaboration, setting new standards in cybersecurity excellence!

HALL 5 | BOOTH A100

DIMENSION DATA REBRANDS AS NTT DATA

The rebranding initiative marks a pivotal moment for the company, as it aligns its local focus with enhanced global expertise

Dimension Data announced that it will be rebranding as NTT DATA from April 1, 2024, in the Middle East and Africa (MEA) market. NTT DATA, a leading global IT infrastructure and services company, will introduce its global services to the MEA market through a full stack of services and innovation across consulting, applications, infrastructure, connectivity, and operations, offering clients the assurance of partnering with a globally recognised and respected technology provider.

"We're excited to embrace this next chapter as NTT DATA," says Alan Turnley-Jones, CEO of Dimension Data Middle East and Africa. "This transition not only signifies a continuation of our legacy and experience but also heralds a new era of growth and innovation which allows us to provide a differentiated, consistent, and better client experience through world-class platform-delivered managed services."

Under the NTT DATA brand, Dimension Data will maintain its commitment to local markets while gaining access to a broader range of global resources. By leveraging global best practice, NTT DATA will now offer a richer portfolio of services and tailored industry solutions from retail to financial services, manufacturing, pharmaceuticals, mining, and more. This includes an expanded portfolio of digital assets, innovative technologies, and industry-specific solutions, empowering the company to address evolving client needs with precision and agility.

The rebrand also opens the doors for increased growth and investment. “Our focus as a leadership team remains on developing the Middle East and Africa business, one of four NTT DATA regions around the world, and we’ve embarked on a bold new growth and investment strategy to match the global NTT DATA technology portfolio,” he adds.

CISCO COMPLETES ACQUISITION OF SPLUNK

The joint offerings will help companies harness data to connect and protect every aspect of their organizations

Cisco announced it completed the acquisition of Splunk, setting the foundation for delivering unparalleled visibility and insights across an organization’s entire digital footprint.

To thrive in the new digital era, organizations must connect and protect all that they do. They need to connect the people, places, applications, data, and devices that power their business while protecting their entire digital footprint from cybersecurity threats, downtime, and other critical business risks.

Cisco will bring the full power of the network, together with market-leading security and observability solutions, to deliver a real-time unified view of the entire digital landscape, helping teams proactively defend critical infrastructure, prevent outages, and refine the network experience.

“We are thrilled to officially welcome Splunk to Cisco,” said Chuck Robbins, Chair and CEO of Cisco. “As one of the world’s largest software companies, we will revolutionize the way our customers leverage data to connect and protect every aspect of their organization as we help power and protect the AI revolution.”

“Uniting Splunk and Cisco will bring tremendous value to our joint customers worldwide,” said Gary Steele, Executive Vice President, General Manager, Splunk. “The combination of Cisco and Splunk will provide truly comprehensive visibility and insights across an organization’s entire digital footprint, delivering an unprecedented level of resilience through the most extensive and powerful security and observability product portfolio on the market.”


REDINGTON JOINS THE ORACLE CLOUD DISTRIBUTION PROGRAM IN MEA REGION

The Cloud Distribution Program will enhance, diversify, and strengthen Redington's cloud footprint and offerings within its mutual partnership regions with Oracle

Redington, a leading technology distributor with a strong presence across the Middle East and Africa, and a member of Oracle PartnerNetwork (OPN), today announced that it is one of the first cloud distributors to join Oracle’s Cloud Distribution Program. Redington, with the support of Oracle, will focus on growing cloud consumption in the MEA region through its partner ecosystem.

The Cloud Distribution Program (CDP) is a global program that launched in 2023 to focus on strengthening partnerships with established cloud distributors to expand the reach of Oracle Cloud Infrastructure (OCI) into both new and existing markets, with a particular focus on the small and midsize business markets.

By working with Oracle, Redington will leverage the collective strength of both channel ecosystems to promote the adoption of OCI and drive long-term utilization growth, while delivering successful outcomes for customers.

“We are excited to join the Oracle Cloud Distribution Program and align our expertise with Oracle's vision for a diversified, innovative, and customer-centric OCI partner ecosystem. This partnership empowers partners to deliver optimal cloud solutions to their customers every step of the way. Channel partners will have simplified access to Oracle’s cloud services coupled with Redington’s in-depth technical support and training to best leverage these services, among other benefits," said Dharshana Kosgalage, Head of Technology Solutions Group, Redington MEA.

OCI is a deep and broad platform of cloud infrastructure services that enables customers to build and run a wide range of applications in a scalable, secure, highly available, and high-performance environment. From application development and business analytics to data management, integration, security, AI, and infrastructure services including Kubernetes and VMware, OCI delivers comprehensive security, performance, and cost savings.

MICROSOFT LAUNCHES COPILOT PRO

Copilot Pro enables smarter and faster assistance across Microsoft’s productivity apps, enhanced AI image generation, and the ability to build custom Copilot GPTs to achieve various tasks

Microsoft announced the expansion of its Copilot Pro subscription, empowering individuals across the Middle East to further enhance their productivity and creativity. Copilot Pro is the advanced version of Microsoft’s Copilot platform, designed for individuals, creators, and power users who want to take their Copilot experience to the next level. The premium subscription allows users access to numerous features such as priority access to the latest OpenAI models such as GPT-4 and GPT-4 Turbo, even during peak times for faster performance. Copilot Pro users will also be able to build and share their own Copilot GPTs that are tailored to their specific needs, as well as generate unique images, which can be enhanced with 100 daily boosts in Microsoft Designer.

Users will now be able to access Copilot within the free Microsoft 365 web apps such as Word, Excel, PowerPoint, and Outlook, without the requirement of an additional Microsoft 365 subscription. However, a Microsoft 365 Personal or Family subscription is still required to unlock Copilot in the desktop apps for PC and Mac.

Zubin Chagpar, Senior Director and Business Group Leader, Modern Work and Surface Devices, at Microsoft CEMA said, “From clearing up overflowing inboxes to catching up on missed meetings, finishing complicated reports, and even suggesting recipes to make a delicious dinner, Copilot is helping to drive a new era of efficiency. With Copilot Pro, individual users, in addition to business users, can take these experiences to the next level and even create their own personalized Copilots to assist them with various tasks.”


ZAINTECH AWARDED MICROSOFT AZURE EXPERT MSP STATUS

ZainTECH supports private and public organizations in regulated and non-regulated industries in leveraging the power of the cloud

ZainTECH, the integrated digital solution provider of Zain Group, has earned the Microsoft Azure Expert Managed Service Provider (MSP) status. This status is awarded to organizations that demonstrate exceptional capabilities in delivering comprehensive, end-to-end solutions on the Azure cloud platform.

ZainTECH's attainment of this status further solidifies its position as a trusted partner in the digital transformation journey of businesses across the Middle East, and signifies its proficiency in managing and optimizing Azure services to provide clients with reliable, scalable, and secure cloud solutions.

Andrew Hanna, ZainTECH CEO commented, “We are focused on delivering cutting-edge solutions that make it easier for businesses to transition to the cloud and deliver scalable and optimized workflows. Achieving the Azure Expert MSP status reinforces our commitment to cater for the requirements and challenges posed by our customers’ needs.”

ZainTECH supports private and public organizations in regulated and non-regulated industries in leveraging the power of the cloud to deliver transformational IT outcomes. Whether customers are focused on growth, driving down costs, or mitigating security risks, ZainTECH offers versatile cloud solutions that bring immense value as well as the power to scale alongside the business. With in-country datacenters that offer improved flexibility for scaling and costs, ZainTECH cloud solutions are proving to be essential for customers who are still in the early stages of cloud adoption.

PROVEN CONSULT AND SADQ SIGN PARTNERSHIP

PROVEN Consult is integrating its Sanad.ai Arabic OCR technology into the joint offerings

PROVEN Consult announced the signing of an MoU with Sadq, a leading digital signature company in Saudi Arabia. The MoU was signed between Hilel Baroud, CEO of PROVEN Consult, and Dr. Abdulla Allahuo, marking the beginning of a collaborative journey aimed at providing cutting-edge solutions to clients.

Through this partnership, PROVEN Consult and Sadq are committed to enhancing the client experience by offering access to Sadq's seamless e-signature solution. This integration will streamline document signing processes, ultimately boosting efficiency and productivity for clients across various industries.

In addition to the e-signature solution, PROVEN Consult is integrating its powerful Sanad.ai (https://sanad.ai/) Arabic OCR technology into the joint offerings. This tool, renowned for its accuracy and efficiency in text extraction, will further elevate the capabilities of collaborative solutions.

"We are thrilled to partner with Sadq to bring innovative solutions to our clients," said Hilel Baroud, CEO PROVEN Consult. "This collaboration underscores our dedication to providing innovative and comprehensive solutions to address the diverse needs of our clients."

The partnership between PROVEN Consult and Sadq reflects a shared vision of leveraging technology to empower businesses and enhance operational efficiency. Both parties are committed to leveraging their expertise and resources to deliver unparalleled value to clients.


VAST DATA UNVEILS NEW DATA PLATFORM ARCHITECTURE FOR THE AI FACTORY

Designed in collaboration with NVIDIA and being deployed at CoreWeave, the new infrastructure unleashes VAST’s parallel data services architecture on NVIDIA BlueField DPUs to simplify and scale AI

VAST Data, the AI data platform company, unveiled a new AI cloud architecture designed to deliver unprecedented levels of performance, quality of service, zero-trust security and space/cost/power efficiency for the AI factory. Building on NVIDIA BlueField-3 data processing unit (DPU) technology, VAST Data’s parallel system architecture makes it possible to disaggregate the entirety of VAST’s operating system natively into AI computing machinery, transforming supercomputers into AI data engines.

The NVIDIA BlueField networking platform combines robust compute power and integrated hardware accelerators to create secure and software-defined accelerated computing infrastructure for AI. By outfitting each GPU server with a dedicated NVIDIA BlueField DPU running a stateless container that powers the VAST parallel services operating system, this new architecture design embeds storage and database processing services directly into AI servers and delivers true linear data services designed to scale to hundreds of thousands of GPUs. Moreover, by removing multiple layers of x86 hardware and networking from VAST’s network-attached Data Platform infrastructure, this new AI factory architecture dramatically reduces the cost, footprint, and power associated with AI data services.

“We’re extremely proud to partner with NVIDIA to help industrialize AI computing,” said Jeff Denworth, co-founder at VAST Data. “This new architecture is the perfect showcase to express the parallelism of the VAST Data Platform. With NVIDIA BlueField-3 DPUs, we can now realize the full potential of our vision for disaggregated data centers that we’ve been working toward since the company was founded.”

This new VAST architecture – running VAST software on BlueField DPUs in the AI servers – is being tested and deployed first at CoreWeave, a leading specialized GPU cloud provider. VAST and CoreWeave began partnering in 2023 to build some of the world’s most scalable AI machinery and to help many of the world’s leading LLM builders and blue-chip enterprise customers build their own AI factories.

DATAIKU LAUNCHES GENERATIVE AI COST MONITORING

New feature within the Dataiku LLM Mesh creates standards for tracking and optimizing Generative AI use cases across the enterprise

In the rapidly advancing landscape of Generative AI, the rush to harness the power of large language models (LLMs) has led to a challenge for widespread enterprise adoption: the comprehensive understanding and management of associated costs. Responding to this challenge, Dataiku announced the launch of its dedicated cost monitoring solution, LLM Cost Guard, a new component of the Dataiku LLM Mesh.

LLM Cost Guard enables effective tracing and monitoring of enterprise LLM usage to better anticipate and control Generative AI costs. It provides visibility into costs attributed to specific applications, providers, and users for a fine-grained understanding of which LLM use cases are driving what costs. LLM Cost Guard is a feature within the Dataiku LLM Mesh, which provides a secure LLM gateway and allows customers to be agnostic when it comes to LLM providers, including integration of LLMs provided by OpenAI, Microsoft Azure, Amazon Web Services, Google Cloud Platform, Databricks, Anthropic, AI21 Labs, and Cohere.

"AI is a critical strategy for every enterprise’s innovation and growth, yet we constantly hear business leaders’ concerns not only about the potential costs of Generative AI projects but the inherent variability of that cost,” said Florian Douetteau, co-founder and CEO, Dataiku. “With LLM Cost Guard, we’re aiming to demystify these expenses. IT leaders will now not only have controlled, governed LLM access within the Dataiku LLM Mesh, but detailed, real-time control over spend, so that they can focus their time on building and innovating. By distinguishing between various costs and setting early warnings, we empower leaders to move forward with their Generative AI initiatives confidently."


GBM EXTENDS STRATEGIC PARTNERSHIP WITH LENOVO

The extended partnership includes the sale and support of Lenovo TruScale IaaS solution throughout the GCC region

Gulf Business Machines (GBM) announced the extension of its long-standing strategic partnership with Lenovo. The renewed expansion will include the sale and support of the groundbreaking Lenovo TruScale IaaS (infrastructure as a service) solution, throughout the GCC region.

Building upon years of successful collaboration between GBM and Lenovo, the expanded partnership will see organizations in the GCC benefit from a cutting-edge hybrid cloud solution that seeks to revolutionize the way businesses manage and deploy their IT infrastructure.

With Lenovo TruScale, businesses will be able to harness a pioneering on-premises, flexible, consumption-based model, allowing them to scale IT resources on-demand. This approach ensures adaptability to the dynamic modern business landscape, eliminating constraints associated with traditional infrastructure investments.

Businesses leveraging Lenovo TruScale can scale their IT infrastructure according to specific needs, promoting optimal resource utilization and cost efficiency. The consumption-based pricing model ensures cost predictability, empowering organizations to accurately manage IT expenses without the limitations of upfront capital investments. Key features also include streamlined IT management through a unified platform, prioritized security and compliance with robust data protection measures, and the delivery of a powerful hybrid cloud infrastructure for businesses of all sizes through Lenovo's state-of-the-art hardware and technologies.

Alaa Bawab, General Manager at Lenovo Infrastructure Solutions Group, said, "As we expand our strategic partnership with Gulf Business Machines (GBM), Lenovo is thrilled to announce the integration of our cutting-edge Infrastructure as a Service (IaaS) solution, TruScale, into the GCC region. Lenovo TruScale represents a revolutionary leap forward in hybrid cloud solutions, and we are excited to empower businesses in the GCC with unprecedented flexibility and control over their IT resources.”

NOZOMI NETWORKS LAUNCHES SAAS PLATFORM FOR OT AND IOT SECURITY

Expands global footprint with new Vantage region in the UAE

Nozomi Networks Inc, a leader in OT and IoT security, introduced a significant expansion of its global cloud footprint with the launch of a new Vantage region in the United Arab Emirates – part of Nozomi Networks’ continued commitment to the UAE and the Middle East.

Nozomi Vantage is a cloud-based cybersecurity management platform that provides critical infrastructure operators and cybersecurity teams with unified OT/IoT security monitoring and risk management across the Nozomi cybersecurity platform. Through a single pane of glass, Vantage powers asset management, vulnerability assessment, and threat detection and response.

A valuable resource in supporting the UAE’s Cyber Security Strategy to accelerate digital transformation and smart city initiatives, Vantage delivers the unmatched security and visibility expected from Nozomi Networks, with the addition of unlimited scalability, powered by SaaS. It makes it possible to protect any number of OT, IoT, IT, edge and cloud assets, located anywhere, with a single platform.

“We are very excited to be able to better serve our customers in UAE with this new Vantage region,” said Bachir Moussa, Nozomi’s Regional Vice President for MEA. “Nozomi Networks has always been committed to strengthening critical infrastructure cybersecurity in the Middle East and this new Vantage region is a testament to that commitment. Now our customers can not only reap the benefits of a cloud-based solution, but they will also be able to leverage Nozomi’s latest cybersecurity defenses for the best possible protection.”


REGULATION REMAINS THE STRONGEST MULTIPLIER TO CYBERSECURITY GROWTH

Report by Frost & Sullivan offers deep dive into current conditions of the region’s cybersecurity industry

In 2023, the United Arab Emirates (UAE) actively repelled more than 50,000 cyberattacks daily, according to the UAE Cybersecurity Council. In the first three quarters of the same year, the country successfully prevented over 71 million attempted attacks in total.

These findings, highlighted in a report from analysts Frost & Sullivan (F&S), show the exponential growth of the region’s cybersecurity landscape – and serve as a sobering reminder of the rising threats that accompany it.

As the GCC (Gulf Cooperation Council) cybersecurity industry continues to grow – with F&S estimating it to triple in value by 2030 to reach US$13.4 billion – countries like the UAE and Saudi Arabia continue to reduce their dependence on oil exports and are instead opting for digital tools and technologies.

This shift in economic agenda has made businesses increasingly prone to escalating cyber threats, with regional geopolitical instability further driving vulnerability across key sectors.

The detailed report, titled ‘Middle East Cybersecurity: Exploring the Middle East Cybersecurity Market Potential’, was released ahead of GISEC Global 2024 – the Middle East and Africa’s largest and most impactful cybersecurity super-connector, which returns to Dubai World Trade Centre from 23-25 April. In collaboration with Frost & Sullivan, it aims to identify the challenges and opportunities facing the region’s expanding industry.

The Middle East braces for escalating cyber threats

In the UAE and Saudi Arabia, specifically, there has been a dramatic uptick in the adoption of technology across the finance, healthcare, and manufacturing sectors, further boosting the need for cybersecurity and robust regulatory frameworks.

Contributing to the existing challenges with increased reliance on technology are issues around awareness and a scarcity of skilled professionals, as well as a lack of clarity among businesses regarding proactively combating cyberattacks.

In response to these industry-wide shortcomings, and as the region continues to navigate the global overhaul of technology, countries in the Middle East are taking measurable steps to enhance their cybersecurity posture.

Setting up cyber-specific departments and innovation centres, driving awareness through educational campaigns and training programmes, and promoting entrepreneurship through cybersecurity conferences are just some of the ways that the region is equipping the next generation and bridging the existing skills gap.

In fact, as per the ITU Global Cybersecurity Index 2020 highlighted in the report, Saudi Arabia has ranked second, and the UAE fifth, among 194 participating countries, indicating that both countries have taken extensive measures in terms of regulatory approaches.

As a result, they have become destinations of choice for academics, businesses, research, and innovation, with the UAE government launching the first national Cyber Pulse Innovation Centre aimed at upskilling professionals at Abu Dhabi Polytechnic.

As the Middle East continues to develop a robust cybersecurity infrastructure and economy, it remains one of the most promising global regions for industry growth; its commitment to regulation, cybersecurity training, and supply chain security set it apart as an industry leader with an ambitious vision to integrate technologies and meet evolving client needs.


www.alpha.ae

ACCELERATING CHANGE
Nobody ever said transformation was easy. That's why we're here to lead your business from where it is today, to where you want it to be tomorrow.

NAVIGATING THE EVOLVING CYBER THREAT LANDSCAPE

The shifting terrain of cyber threats in an increasingly AI dominated world means that organizations need to adopt a proactive approach in embracing innovative threat management solutions and effectively protect their digital assets

With each passing year, there have been significant shifts in the cybersecurity threat landscape. Cybercriminals are continuously innovating and leveraging new tools to exploit all available vulnerabilities of an ever-expanding attack surface and achieve their objectives. Organizations have no choice but to be ever vigilant and maintain an effective cybersecurity posture that provides them resilience against all possible breaches and attacks.

Aaron Bugal, Field CTO APJ, Sophos says, “Traditionally we’d associated cyber threats with malware, and while that is still true today – as an example, there’s a glut of ransomware – modern threats are turning to the abuse of tools and utilities that are packaged up as part of the operating systems we use. Many tools embedded within Windows, macOS and any distribution of Linux all have utilities that are being used to help cyber criminals enumerate systems, laterally move and exfiltrate data all without introducing any malware. This exemplifies the evolution of the threat landscape and the adaptability of cyber criminals.”

There has been an industry-wide shift-left in modern attack strategies over the last 12-24 months, says Christopher Hills, Chief Security Strategist, BeyondTrust.

He elaborates, “Where we are used to seeing Vulnerability and Exploit combos, the shift has been to what we are seeing as a Log-In, rather than a Hack-In, from threat actors, nation states, and cyber crime syndicates. These bad actors have shifted their focus to the user and identity infrastructure systems to compromise organizations. This major shift has uncovered major risk areas in Identity Infrastructure, because they are all siloed, managed independently, and there is no cohesiveness in finding faults and risks. We often talk about Identity as the new perimeter or even the new network, and that couldn’t be closer to the truth.”

Cyber threats and protection in the era of AI

The impact of AI has been double-edged when it comes to cybersecurity. While AI helps strengthen cybersecurity automation and tools to combat evolving threats, cybercriminals too are leveraging such tools to enhance their attacks. The evolution of LLMs (Large Language Models) that are trained on a large amount of data also creates the possible scenario of cybercriminals using these to launch more attacks.

Ezzeldin Hussein, Regional Senior Director, Solution Engineering, META, SentinelOne says, “AI has significantly transformed the threat landscape, both empowering cyber attackers and enhancing defensive capabilities in cybersecurity. Malicious actors utilize AI to automate attacks, evade detection, and launch targeted campaigns at scale, making them more sophisticated and difficult to thwart. AI-driven techniques, such as machine learning and natural language processing, enable attackers to adapt their strategies in real-time, exploit vulnerabilities, and bypass traditional security measures.”

He adds, “Conversely, AI plays a crucial role in cybersecurity automation and defense. AI-powered tools can analyze vast amounts of data, detect anomalies, and identify potential threats with greater accuracy and speed than human operators alone. By leveraging AI for threat intelligence, incident response, and predictive analytics, organizations can strengthen their defenses, detect threats in real-time, and respond more effectively to evolving cyber risks. Additionally, AI enables automated incident response, reducing response times and mitigating the impact of cyberattacks.”

According to Christopher, AI brings its own set of challenges to the table for organizations.

“Whether it be the free version of Generative AI or even the paid subscription versions of AI, they all present a problem, particularly from a privacy standpoint when it comes to end users leveraging these AI tools to put data into and get data out. The lack of control and not knowing what is being put into AI leaves a lot of risk potentially on the table.”

He adds, “On the other hand, the adoption and usage of AI into cybersecurity strategies is a newer approach that organizations are still learning to adopt. While vendors find ways to incorporate AI into their products and solution offerings by making their solution “smarter” with data, businesses are trying to find ways to modernize their business processes, analytics, and data consumption in ways that provide tangible and actionable data output. One of the biggest AI strategies in cybersecurity revolves around risk awareness, detection, and remediation — bringing quicker awareness around the discovery of potential risks, which should lead to faster remediation, and ultimately shrinking the potential of a compromise or breach.”

Ransomware as a persistent threat

Ransomware has continued to evolve through collaboration and partnerships between cybercriminal groups as they launch well-coordinated attacks. These attacks have been targeting high-profile institutions and critical infrastructure and continue as an ongoing problem for organizations worldwide.

Aaron says, “Ransomware has now reached homeostasis. It’s a commoditised technique that is typically used by an attacker as a means to drive awareness within an organisation that they’ve been victimised, proving the cybercriminal has accessed your networks and data stores and chosen to break the confidentiality, integrity and availability of them. It’s important that any ransomware attack, successful or prevented, is investigated to understand how that attack came to be. Remote ransomware is a common tactic whereby unprotected systems within an organisation are being used as launch pads to infect other systems with ransomware remotely. Defensive capabilities start with visibility into what systems and services you have, moving to fully protect the entire environment with proactive protection and solid detection and response functions. You can’t stop, let alone detect, things you cannot see; as such, insight into precursory, suspicious activity should be highlighted and investigated before it turns into an attack.”

Ransomware attacks represent a grave threat to organizations globally, with cybercriminals encrypting data and demanding ransom payments for decryption, says Ezzeldin. He adds, “To reduce the likelihood of falling victim and mitigate potential damages, several best practices are imperative. Regularly backing up critical data ensures recovery options without succumbing to ransom demands. Implementing robust security measures such as updating software, patching vulnerabilities, and deploying security solutions like firewalls and next-generation antivirus software with rollback capabilities is paramount for preventing infections. Educating employees on recognizing phishing attempts and other social engineering tactics aids in mitigating risks. Implementing access controls and least-privilege principles limits attackers' reach within the organization's network. Furthermore, having a comprehensive incident response plan enables swift and effective responses to ransomware incidents, including isolating affected systems, rolling back and restoring data from backups, and reporting incidents to authorities.”

Zero Trust and Automation

The Zero Trust approach and security automation are critical components of modern cybersecurity strategies today.

Ezzeldin says, “Zero Trust challenges the traditional notion of perimeter-based security by requiring continuous authentication and authorization for every user and device, regardless of their location. This approach reduces the risk of lateral movement by attackers and limits the potential impact of breaches. Security automation complements Zero Trust by streamlining security operations, automating routine tasks, and accelerating incident response. Automation tools can detect and remediate security vulnerabilities, enforce security policies, and orchestrate incident response workflows. By leveraging automation, organizations can improve operational efficiency, scale security operations, and respond quicker and more effectively.”

According to Christopher, adoption of Zero Trust faces challenges from an HR perspective in some organizations, which is confounding.

He says, “The interesting part about the Zero Trust approach, for me, honestly is that this has now become an HR issue, yes I said it, HR! End users are complaining to HR that they are not being seen as trusted employees and that Zero Trust is derogatory. I was shellshocked when I heard this, and I feel organizations are not doing a great job educating their users on what Zero Trust is, what it means, how it helps protect the organization against cyber criminals, and that it has nothing to do with them, or punishing them, but instead, protecting the business.”

Christopher adds, “There are some great Zero Trust approach strategies for many different use cases and even architectures. The NIST CSF 2.0 framework is a great example. Organizations need to implement varying Zero Trust controls to a level they feel comfortable with - “Risk Appetite.” And they need to educate their end users on what Zero Trust is and why it’s important to the organization, to the point of even explaining what it might mean to an end user if the organization was breached or compromised (reputation damage, loss of revenue, impact on stock price, etc.).”

Zero Trust also helps secure OT environments, which present their own set of challenges.

Christopher says, “Most OT environments cannot be configured with modern security protocols or principles. This is truly where I feel a Zero Trust Architecture approach makes the most sense. I use the example of a resource enclave that gates access IN and OUT, monitors it in real time, and has active alerting in place to ensure those that are supposed to be out or in the OT environment are, leveraging security principles and strategies that ensure the OT environment is secure.”

The road ahead

The need to drive cybersecurity awareness and to train cybersecurity experts is a key priority for organizations on the road ahead.

According to Aaron, “Security awareness, training and enablement are core pillars to driving better cyber security outcomes for every organisation. However, this should not be limited to just the technical people who have been made responsible for managing the cyber defensive tools. Security awareness and building a cyber security intelligence quotient within the executive and c-level teams is integral to ensuring better business outcomes via the ability to adapt to the pace of change that cyber issues present.”

“An accountable board and c-level team that understands business risk and the impact that a cyber-attack would have can best govern their organisation to better outcomes by enabling their cyber defenders with the right support and resources to realise ultimate cyber resiliency. Cyber security is a team sport; everyone should be included in educational strategies, inclusive of the board, executives, management, team leads and employees,” he adds.

Organizations need to focus on enhancing their cybersecurity posture and resilience against emerging threats.

Christopher says, “Organizations need to take a deep dive and look into their Identity Security posture. They need to re-think how they are going to solve for the modern attack landscape and look at areas they never thought of, such as Identity Infrastructure. They need to consider how they are granting permissions across siloed instances and infrastructures, and figure out how to better gain visibility into risks that can pose as threats. The problem is that in today’s current landscape, most tools on the market only cover their own areas: cloud, or on-prem, or MFA, or SSO, or IdP (Identity Provider), AD, or even Entra ID. None of them traverse across these multiple layers in the Identity Infrastructure, so it becomes hard to find, alert on, or even mitigate some of these risks that are in hiding.”

Cybersecurity budgets and investments in advanced technologies are expected to continue growing in response to evolving threats and priorities in 2024.

Ezzeldin says, “With cyber threats becoming more sophisticated and pervasive, organizations across various industries are expected to allocate more resources to cybersecurity initiatives. This includes investments in advanced security technologies such as artificial intelligence, machine learning, and automation to enhance threat detection and response capabilities.”

Additionally, as remote work continues to be prevalent, organizations are likely to prioritize investments in securing remote access infrastructure, endpoint security solutions, and cloud security platforms to mitigate the risks associated with distributed work environments. Moreover, compliance requirements and regulatory mandates are expected to drive increased investments in cybersecurity frameworks and governance mechanisms to ensure regulatory compliance and protect sensitive data.


RETHINKING CYBERSECURITY APPROACHES

Michael Heering, Global Field Marketing Director, SANS Institute discusses the impact of AI on the evolving cybersecurity landscape and the need to focus on cybersecurity training

Elaborate on the importance of driving cybersecurity awareness and training cybersecurity experts.

As the shortfall of cybersecurity professionals continues to grow, estimated at 3.5 million this year according to Cybersecurity Ventures, validating an employee's expertise through thorough accreditation will become increasingly necessary. Now, more than ever, it is important to build up cybersecurity awareness and upskill professionals. Organizations should implement comprehensive cybersecurity awareness programs that educate employees on the importance of security practices and the common tactics used by adversaries, balancing technical defenses alongside existing human resources. Regular training sessions, phishing simulations, and the promotion of a security-conscious culture are effective ways to reduce the likelihood of human error. Strategies such as gamification of training, personalized learning paths, and promoting security as a shared responsibility can also enhance engagement and awareness.

Discuss the impact of AI on the threat landscape and in terms of cybersecurity automation and tools to combat evolving threats.

Organizations are leveraging AI to address various cybersecurity threats through sophisticated means. For malware, AI is used to analyze patterns and quarantine new strains. In combating phishing, AI examines email content and sender behavior to detect attempts. Additionally, to mitigate insider threats, AI monitors user activities to identify unusual behaviors that may signify a threat, showcasing the versatility of AI in tackling diverse cybersecurity challenges.

Incorporating AI into cybersecurity strategies brings critical benefits, notably enhanced threat detection through vast data analysis, automated threat responses for quick mitigation, predictive analytics for forecasting breaches, and a reduction in false positives for more accurate threat identification. These advancements make AI a key component in modern cybersecurity efforts.

However, to harness these benefits fully, there's an essential need for training. Cybersecurity professionals must be skilled in AI technologies to implement and manage AI-enhanced security measures effectively. Training ensures teams can deploy AI tools efficiently and keep pace with the evolving landscape of cyber threats, making ongoing education and skills development a pivotal aspect of leveraging AI in cybersecurity.

How are compliance requirements playing a key role in cybersecurity investments and strategies?

We’re seeing an increased focus from governments across the world on regulations that push organizations across all sectors to take cybersecurity more seriously and make it an undeniable part of organizational culture and strategy. We’ve seen this in the US with the SEC ruling on Incident Reporting and Management oversight and with the DoD 8140.3; in Europe, the new NIS2 Directive also puts more accountability on executive leaders to ensure they’ve done all they can to strengthen their security posture; and we’re seeing a renewed focus on cybersecurity regulation in KSA as well.

Organizations, as a result, need to rethink their overall strategies around cybersecurity and recognize that this isn't just the sole responsibility of a CISO or security team. It is a cross-company responsibility that starts with strong security awareness across all employees, investments in risk assessments and skill assessments to determine where gaps might lie, and subsequent investments to close those gaps through training, exercises and certifications.


BEST APPROACHES TO BACKUP

Fred Lherault, CTO Emerging, Pure Storage shares his view on best approaches to backup

How has the strategy of backup evolved in the past few years given the changing landscape? How does it become part of a disaster recovery plan?

Backing up data remains critical for data protection, but it's not enough. Implementing advanced data protection capabilities helps companies better plan for — and recover quickly from — ransomware and cyberattacks. This essentially requires a two-pronged approach: taking regular, “immutable copies” of data, and having the necessary infrastructure to rapidly restore from backups at speed and scale.

What are some of the key best practices to follow for enterprises in terms of backup strategies?

As mentioned above, backing up data isn’t enough. Organisations need a two-pronged approach: taking regular, “immutable copies” of data, and having the necessary infrastructure to rapidly restore from backups at speed and scale.

In the event of a cyber attack or any other event that compromises data or disrupts operations, companies can recover critical data from their immutable copies so that they can restore operations quickly — without having to succumb to the demands of cyber criminals. Proper immutability means these copies cannot be encrypted, or even deleted, by attackers. Modifying them in any way, or changing the frequency at which they are taken, is protected with multi-factor authentication and therefore safe from hackers. This makes them far more resilient and reliable in the event of a cyber attack.

Next comes the ability to restore data as fast as possible, as reliable backups are limited in their effectiveness if operations cannot be restored quickly. Some of the most advanced flash-based storage solutions dramatically increase the speed of data restoration. The leading solutions boast a recovery performance of up to many hundreds of TBs per hour at scale, enabling organizations to restore systems in hours — rather than weeks, so they can get up and running again with minimal impact.
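
As a rough, back-of-the-envelope illustration of why restore throughput matters, the short sketch below compares restore times at two assumed throughputs; the data volume and throughput figures are illustrative assumptions, not Pure Storage benchmarks.

```python
# Back-of-the-envelope restore-time comparison. All figures are illustrative assumptions.
def restore_hours(data_tb: float, throughput_tb_per_hour: float) -> float:
    """Hours needed to restore a given volume of data at a given throughput."""
    return data_tb / throughput_tb_per_hour

DATASET_TB = 500  # assumed size of the environment to be restored

# Assumed throughputs: a legacy backup appliance vs. a modern flash-based platform.
legacy = restore_hours(DATASET_TB, 5)    # 100.0 hours, i.e. roughly 4 days
flash = restore_hours(DATASET_TB, 300)   # ~1.7 hours

print(f"Legacy restore: {legacy:.0f} h, flash-based restore: {flash:.1f} h")
```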

Furthermore, while protecting data should be a major concern, organisations need to be careful not to overlook other critical factors in the aftermath of a ransomware attack. For example, affected arrays could become off-limits and unusable. Following an attack, storage arrays are often locked down for forensic investigation by cyber insurance or law enforcement agencies, leaving organizations unable to recover data to infected arrays. Without a data storage infrastructure to get systems back up and running, organizations are stuck.

Thankfully, there are now solutions on the market that can mitigate this risk. Some vendors can offer ransomware recovery SLAs, on top of an existing STaaS subscription, to guarantee a clean storage environment with bundled technical and professional services in the wake of an attack. In practice this means being able to deliver a completely new storage environment in a matter of hours to recover from, should the original one be unavailable for any reason. This kind of guarantee can give businesses the peace of mind that they can recover safely, faster, even if arrays are locked down following an attack.

How do regulatory requirements and compliance standards affect data backup strategies, particularly in industries like healthcare and finance?

The ability to swiftly restore critical services is quickly becoming mandatory in some regulated industries. For example, the EU DORA (Digital Operational Resilience Act) regulation is geared to require that critical banking systems be recovered in less than 2 hours in case of disaster, something that is very difficult to achieve with legacy data protection solutions which were never designed with fast recovery in mind. It is likely that we will see more countries and industries mandating quick recovery of critical services.


BUILDING ROBUST APPROACHES

Ranjith Kaippada, Managing Director, Cloud Box Technologies discusses his insights around the changing backup and disaster recovery approaches

How has the strategy of backup evolved in the past few years given the changing landscape? How does it become part of a disaster recovery plan?

Over the years, backup strategies have evolved from being merely for data preservation to comprehensive disaster recovery plans today. Businesses now understand the value their data holds and how ever-evolving cybersecurity threats and potential system failures could disrupt their operations, leading to downtime and other negative impacts. The current backup strategy utilizes modern technologies such as automation, cloud storage, encryption, and beyond to maintain data integrity in the event of sudden disruptions or planned attacks.

Having a robust disaster recovery plan helps organizations identify weaknesses and potential vulnerabilities in their systems. It also helps them deploy advanced tools such as cloud-based backup solutions. They can implement real-time monitoring systems giving them an edge to withstand any attacks and recover as quickly as possible without putting their reputation or operations at risk.

What are some of the key best practices to follow for enterprises in terms of backup strategies?

It is crucial for companies to exercise an effective strategy to shield their business against data loss due to disasters, data breaches, or cyberattacks, etc.

Enterprises should opt for regular data backups tailored to their data volume, business priorities, and frequency of updates. Setting up automation in backup processes minimizes human errors while allowing employees to concentrate on the execution of other critical tasks. Encrypting data is of paramount importance as well, given that attackers can siphon data at rest or in transit by breaching the system through unauthorized access.

Another key practice is to regularly test your backup and recovery systems to ensure they are working optimally. Assessing the backup systems assists in identifying potential weaknesses that can be patched before it is too late. As a leading IT infrastructure solutions provider, we have an array of backup and recovery services at Cloud Box Technologies that can help you execute your backup strategies and defend against threats.

Do you see enterprises realizing the need to invest in proper backup strategies? Is the cost of backup a concern? Is there a need to drive more awareness?

Although enterprises have started recognizing how critical it is to protect their data against attackers and disasters, the cost of adopting backup strategies poses a significant challenge. The upfront costs of hardware, software, and maintenance, among others, add up and could break the back of SMBs that have limited resources to begin with. However, it is paramount to understand that these initial costs are outweighed by the financial and reputational implications that data breaches or data losses could trigger. Service providers can play a vital role in this venture by offering pay-as-you-go or subscription-based options to make these solutions affordable and accessible.

When it comes to educating and creating awareness, industry stakeholders including service providers, IT vendors, and regulatory bodies must take the initiative to demystify the topic and its myths and misconceptions, as well as highlight the value that such backup solutions offer.

How do regulatory requirements and compliance standards affect data backup strategies, particularly in industries like healthcare and finance?

Regulatory requirements influence data backup strategies greatly, especially in the finance and healthcare sectors. For instance, PCI DSS sets up measures to protect financial data and prevent fraud. GDPR in Europe and HIPAA in the United States are regulations that set rigorous guidelines and safeguards for data integrity, confidentiality, and availability.

Data backup strategies must align with the regulatory and compliance standards of the government in the region of business. This includes safeguarding user data against breaches and unauthorized access that could put users' privacy and security in jeopardy. It also includes assisting auditors with logs and audit trails as mandated.

Organizations must ensure their backup providers comply with pertinent regulations, failing which could trigger unwanted regulatory action. There are also backup strategies to ensure companies can recover from data breaches or losses and maintain business operations and continuity.


A SHIFTING LANDSCAPE

Ankit Satsangi, Director - Enterprise Development at Beeah discusses the evolving cyber threat landscape and cybersecurity approaches

How do you see the Cybersecurity landscape evolving in terms of threats?

The cybersecurity landscape is becoming increasingly complex, with swift advances in technology and smart strategies by cybercriminals. Adversaries are now using state-of-the-art generative AI to orchestrate sophisticated cyberattacks, including producing deepfakes and refining techniques that exploit social interactions for deception. The capacity of these technologies to autonomously generate harmful content, such as malware and phishing attacks, presents major challenges for current detection and response systems. Ransomware attacks have become more user-friendly and prevalent through what's known as Ransomware-as-a-Service (RaaS). This approach makes it easier for those with minimal technical knowledge to initiate attacks, using tools and services from experienced hackers and even hiring black hats from the dark web and many other channels. This democratization of cyberattack tools has led to a rise in ransomware incidents affecting various regions and sectors more broadly.

The misuse of generative AI for harmful purposes, combined with the spread of RaaS, underscores a move towards more accessible yet sophisticated methods of cyberattack. To counter these evolving threats, cybersecurity professionals are increasingly relying on proactive, advanced strategies. This includes the use of AI-driven defense systems, enhanced threat detection technologies and comprehensive incident response plans. Such efforts are crucial to safeguard against the growing complexity of the cyber threat landscape we currently face.

Discuss the importance of the zero trust approach and security automation in overall cybersecurity strategies.

In the current cybersecurity landscape, it's crucial to deploy strategies that don't just rely on traditional defenses but also incorporate advanced measures like the zero-trust model and security automation. The zero-trust model operates on a simple principle: trust no one and verify everyone. This means always checking who's trying to access your systems, no matter if they're inside or outside your network, ensuring they only have access to what they need. This helps prevent attackers from moving freely once they breach the initial defenses.
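As a rough, illustrative sketch of that principle (not any particular vendor's implementation; the tokens and role-to-permission map below are invented for the example), every request is authenticated and checked against least-privilege rules regardless of where it originates:

```python
from dataclasses import dataclass

# Hypothetical least-privilege policy: each role gets only the actions it needs.
ROLE_PERMISSIONS = {
    "finance-analyst": {"read:ledger"},
    "hr-admin": {"read:people", "write:people"},
}

# Stand-in for a real identity provider (e.g. validating a signed token).
KNOWN_TOKENS = {"token-abc": "finance-analyst"}

@dataclass
class Request:
    token: str
    action: str      # e.g. "read"
    resource: str    # e.g. "ledger"

def authorize(req: Request) -> bool:
    role = KNOWN_TOKENS.get(req.token)   # verify identity on every single call
    if role is None:
        return False                     # unauthenticated: deny by default
    # No implicit trust for "internal" callers: only explicit grants pass.
    return f"{req.action}:{req.resource}" in ROLE_PERMISSIONS.get(role, set())

print(authorize(Request("token-abc", "read", "ledger")))   # True
print(authorize(Request("token-abc", "write", "ledger")))  # False
```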

Security automation complements this by handling repetitive security tasks, like monitoring for threats and updating defenses, without human intervention. This not only speeds up response times but also reduces the chance of human error. When you combine the meticulous verification of zero trust with the efficiency of automation, you create a robust, responsive security environment. This dual approach not only tightens security across the board but also aligns with modern compliance demands, protecting both data integrity and organizational reputations.
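A toy sketch of what automating those repetitive tasks can look like; the alert fields and response actions below are hypothetical placeholders for what a real automated-response workflow would do:

```python
# Routine alerts are triaged and a containment action is chosen without
# waiting for a human; anything ambiguous is escalated to an analyst.
def respond(alert: dict) -> str:
    if alert["type"] == "impossible_travel" and alert.get("confidence", 0) > 0.9:
        return f"disable-account:{alert['user']}"   # containment playbook
    if alert["type"] == "malware_detected":
        return f"isolate-host:{alert['host']}"      # quarantine the endpoint
    return "queue-for-analyst"                      # human review for the rest

alerts = [
    {"type": "impossible_travel", "confidence": 0.95, "user": "a.smith"},
    {"type": "malware_detected", "host": "LAP-0042"},
    {"type": "port_scan", "confidence": 0.4},
]
for alert in alerts:
    print(respond(alert))
```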

How are cybersecurity budgets and investments expected to change in response to evolving threats and priorities in 2024?

Cybersecurity budgets and investments are expected to see a significant shift to address evolving threats and shifting priorities. As threats become more sophisticated and pervasive, companies are recognizing the need for increased investment in cybersecurity measures to protect their assets and minimize risks. There is a trend towards larger allocations for cybersecurity within IT budgets, driven by growing awareness of the financial and reputational risks associated with data breaches and cyber-attacks. Companies are not only increasing their spending on traditional security measures but are also investing in advanced technologies like artificial intelligence (AI) and machine learning (ML) for threat detection and response.

The rise of remote work and the expansion of digital infrastructure have broadened the attack surface that organizations need to gain visibility into and defend. This change necessitates investment in securing cloud environments, mobile devices, and remote access tools, which have become integral to maintaining business operations in a post-pandemic world. Companies are also focusing on enhancing their incident response capabilities and recovery plans to respond more effectively to cyber incidents.


THE JOURNEY FROM MONITORING TO OBSERVABILITY

To realize the full potential of their full-stack observability solutions, IT leaders will need to ensure all stakeholders in the IT department are on board, says Gregg Ostrowski, CTO Advisor, Cisco Observability

Across IT departments, we’re seeing a steady march toward full-stack observability. The Application Performance Management (APM) tools that teams once relied on are now limited when it comes to the monitoring needs of complex hybrid environments. Consequently, new Cisco research shows that observability is now a strategic priority for 85% of organizations around the world. Observability provides IT teams with full and unified visibility across all domains — whether on-premises or in the cloud — enabling them to identify, understand and resolve performance and security issues in a timely way.

With so much benefit to be had, it is tempting for organizations to jump in, both feet first. But maturing from traditional monitoring to observability isn’t an overnight process. With a considered, incremental approach, a two-to-three-year timeline is more realistic. After all, it’s about a lot more than simply implementing a new technology solution; the bigger challenge is driving the cultural and structural change that is essential for organizations to maximize the benefits of observability.

All aboard the observability train

To realize the full potential of their full-stack observability solutions, IT leaders will need to ensure all stakeholders in the IT department are on board. Resistance to change can encumber even the most impressive implementation, so it is key to demonstrate to all technologists (whether they’re developers, operations or security professionals, and whether they’re specialists in cloud native or on-premises environments) how full-stack observability will deliver benefits for them in their everyday work. And how each new capability will alleviate pressure, reduce firefighting and enable IT teams to focus on more fulfilling and high value work.

The observability imperative

Today’s tech-savvy consumers have little patience for brands that fail to deliver on the digital experience they expect. In an effort to keep innovating, organizations have rapidly grown their IT environments, which today can span multiple on-premises locations and cloud. In such a sprawling and volatile hybrid IT estate, traditional APM tools are no longer capable of providing technologists with a complete picture of the application and its underlying infrastructure. When issues arise, IT teams struggle to pinpoint root causes and understand dependencies. As a result, metrics such as Mean-Time-To-Resolution (MTTR) are negatively impacted, and the chances of organizations suffering a revenue and reputation-impacting incident are growing significantly.

Full-stack observability enables IT teams to cut through the complexity of their hybrid environments and provides the visibility and insights technologists need to ensure that applications and supporting infrastructure are operating at peak performance at all times.
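As an illustration of the kind of instrumentation that feeds such visibility, here is a minimal sketch using the open-source OpenTelemetry Python SDK. The service and span names are placeholders, the `opentelemetry-sdk` package is assumed to be installed, and a real deployment would export to an observability backend rather than the console; this is not a depiction of any specific vendor's platform.

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Emit spans to the console for demonstration purposes.
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
tracer = trace.get_tracer("checkout-service")  # placeholder service name

def place_order(order_id: str) -> None:
    # One trace ties the request to the downstream calls it triggers, so a
    # slow payment call is visible in the context of the whole order.
    with tracer.start_as_current_span("place_order") as span:
        span.set_attribute("order.id", order_id)
        with tracer.start_as_current_span("reserve_inventory"):
            pass  # call to the inventory service would go here
        with tracer.start_as_current_span("charge_payment"):
            pass  # call to the payment provider would go here

place_order("ORD-1001")
```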

While the benefits are apparent, IT leaders cannot discount the fact that no technologist ever likes to be told which tools they should and shouldn’t be using. Many feel that their current APM tools still serve the specific needs of their domain team and enable them to hit their KPIs.

Rather than aiming to rip and replace, IT leaders should ease the transition by choosing an extensible observability platform which embraces open standards. This will allow IT teams to continue to use their preferred tools. An open platform can bring in and correlate signals from any tool, and this makes it a much easier ‘sell’ for CIOs looking to introduce observability across the IT department. They can gradually add new capabilities to their observability platform based on the most pressing business needs and, with each step, prove the value that is being delivered.

How to showcase the benefits of observability to every IT team

To get IT teams on board and participating in the organization's move towards observability, IT leaders should emphasize the milestones on the journey that will most help these stakeholders perform their daily tasks more effectively. Here are three examples which can help drive buy-in and support:

1. Expanding visibility across domains

By integrating infrastructure visibility, including Kubernetes and hosted environments, along with network visibility, IT teams can extend their monitoring beyond the application layer. This enables swift identification of specific domain issues, bridging visibility gaps across hybrid environments where application components operate, leading to reduced MTTR.

For technologists working in DevOps, NetOps or InfraOps, domain visibility brings huge benefits. It puts an end to the constant firefighting — trying to understand the root cause of issues — and allows them to adopt a more proactive and strategic approach to their work. Any IT practitioner who has faced the challenge of troubleshooting under pressure would welcome this.

2. Building security into observability

Many IT professionals have had sleepless nights because of security incidents. So, highlighting how observability can enhance security is a sure way to garner support. By adding security monitoring into their observability capabilities, organizations can ensure complete protection for applications, from development through to production, across code, containers, and Kubernetes. This is becoming mission-critical for organizations in many sectors that are having to manage ever more sophisticated threats.

Importantly, the integration of security breaks down long-established silos, fostering increased collaboration between security and application teams. This paves the way for the adoption of DevSecOps methodologies. This new approach allows developers to embed robust security into every line of code, resulting in more secure applications and easier security management, before, during and after release.

3. Generating a customer perspective on digital experience

By implementing digital experience monitoring (DEM), organizations can analyze application performance through a customer lens, understanding and optimizing the experience that end users are encountering when interacting with applications and digital services. Functionality such as Session Replay enables operations teams to visualize how customers are behaving and interacting with their applications, and this insight is hugely valuable given the extent to which consumers are now demanding world-class digital experiences at all times.

Other milestones on the journey to observability include integrating cost insights, enabling CloudOps teams to analyze and optimize the costs of their cloud workloads, and automated rightsizing of cloud workloads to drive efficiency and digital experience.

With IT leaders now earning their rightful place at the decision-making table, it is also essential for them to add business context to their observability strategy in order to correlate IT data with real-time business metrics. This ensures that every IT team can identify the most important issues, based on potential impact to customers and the business, and prioritize their resources in these areas.

The move to observability is now inevitable, so the sooner IT leaders can get their teams on board, the faster they will realize benefits, both within the IT department and beyond. As well as ensuring that organizations can deliver seamless and secure digital experiences, observability also provides a platform for technologists to transform their careers. They can embrace new ways of working, learn new skills and forge new partnerships across the IT department, setting them up to thrive in a hybrid future.


BUILDING FOR PORTABILITY IN THE CLOUD

Having data portability is crucial to be able to move things around as needed and to simply maintain data hygiene in the long term, writes Rick

As businesses look to optimise their costs to weather economic downturns, rising cloud spend can cause some headaches. While there are lots of different options to help mitigate this, from moving workloads to a more cost-effective environment (or even back to on-premises) to re-architecting to save costs, organisations often lack the technical agility to make the most of them.

With modern businesses carrying so much data, and with legacy or homegrown applications that do not allow for easy transfer, plus cloud lock-in to contend with, it can quickly feel like trying to fit a thousand square pegs through a thousand round holes. All of this is against the backdrop of cyber threats like ransomware, so the right balance between cost and security needs to be found for every workload. To get ahead of this, IT teams are increasingly designing and adjusting their environments with portability in mind, but there are some questions to ask yourself first.


Why move data at all?

To state the obvious for a second, modern enterprise IT environments are vastly complex. They can be monolithic and highly dispersed, with the growing data gravity of some environments making many companies essentially “digital hoarders.” This is problematic in itself, as holding on to data you don’t need exposes you to unnecessary cybersecurity and compliance risks. But data bloat in the cloud also brings severe financial consequences and the dreaded “bill shock” when that invoice lands.

So, even though many companies moved to the cloud in the first place to optimise costs, the flexibility that the cloud gives businesses can be something of a double-edged sword. While the attractiveness of the cloud is that you only pay for what you need, the flip side is there is no “spending cap” so costs can easily get out of control. To solve this, better data hygiene can help, but for the data you do need, it's about picking the right platform for the workload. This may involve re-platforming or re-architecting to optimize costs. This is where data governance and hygiene come in - before looking to move data or improve processes, you need to know exactly what data you have, and where.
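As a rough sketch of that first inventory step, the few lines below walk a storage path and total up data that has not been modified for a long time. The path and threshold are placeholders, and real data governance tooling would also classify content, sensitivity and ownership:

```python
import os
import time

ROOT = "/data/archive"          # placeholder path to audit
STALE_AFTER_DAYS = 365          # placeholder staleness threshold

now = time.time()
stale_bytes = 0
for dirpath, _, filenames in os.walk(ROOT):
    for name in filenames:
        try:
            st = os.stat(os.path.join(dirpath, name))
        except OSError:
            continue
        if now - st.st_mtime > STALE_AFTER_DAYS * 86400:
            stale_bytes += st.st_size   # candidate for tiering, archiving or deletion

print(f"Not modified in {STALE_AFTER_DAYS}+ days: {stale_bytes / 1e9:.1f} GB")
```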

What data can we move?

So, once you’ve established what data you should think about moving, either to a different environment, server, or storage tier, the next, more difficult question is what data you can move. Unfortunately, this is where many organisations face challenges. Having data portability is crucial to be able to move things around as needed and to simply maintain data hygiene in the long term. But several factors can make it difficult to move or transfer workloads from one location to another. The first is “technical debt” - essentially the extra work and maintenance required to update older or scratch-built applications to get them to a point where they are transferable and compatible with other environments. The cause of these issues might be taking shortcuts, making mistakes, or simply not following standard procedures during software development. But leaving it unfixed makes it impossible to optimize environments and can cause additional problems for things like backup and recovery.

The other, perhaps more infamous, issue that can affect data portability is cloud lock-in. It is a well-known fact at this point that businesses can easily be locked into using specific cloud providers. This can be due to dependencies like integrations with services and APIs that can’t be replicated elsewhere, the sheer “data gravity” it might have in a single cloud, and a simple knowledge gap meaning teams know how to use their current cloud, but lack the expertise to work with a different provider. Of course, this will only affect moving workloads out of the cloud, so it's still possible to build for better portability to give you better storage options and promote better data hygiene. Essentially, where possible, businesses need to create some standardisation across their environments, making data more uniform and portable and mapping and categorizing it so they know what they have and what it's for.

The (constant) security question

Finally, it's crucial when building and capitalising on data portability that security is not left behind. Of course, improving security can (and should) be a motive for moving workloads in the first place, but if you’re migrating workloads to optimise costs this must be balanced against security considerations. Security needs to be part of the data hygiene process, so teams need to ask “What do we have?”, “What do we not need?” and “What are the critical workloads we absolutely cannot afford to lose?” Beyond this, continue to patch servers and, when moving data to colder storage, remove internet access when it is not needed.

Having backup and recovery processes in place is also key when moving workloads. To come full circle, having easy data portability is also important for disaster recovery. In a critical event like ransomware, the original environment, be it a cloud or on-premises server is often unavailable to recover damaged workloads (via a backup) as it is typically cordoned off as a crime scene, and the environment might still be compromised. In order to recover quickly and avoid costly downtime, workloads sometimes need to be recovered to a new temporary environment, like a different cloud for example.

As organisations strive to manage their IT environments and avoid financial and cyber security surprises, it's important to constantly assess what data and applications you have, and where they are kept. But to manage this and adjust as needed, businesses must build with portability in mind. By doing this, businesses can create a more agile and cost-effective cloud environment and will find it easier to bounce back and recover from disasters like ransomware.

"Having backup and recovery processes in place is also key when moving workloads. To come full circle, having easy data portability is also important for disaster recovery. In a critical event like ransomware, the original environment, be it a cloud or on-premises server is often unavailable to recover damaged workloads (via a backup) as it is typically cordoned off as a crime scene, and the environment might still be compromised."


WHY DNS IS A FAVOURITE ATTACK VECTOR

Terry Young, Director of Service Provider Product Marketing, A10 Networks discusses why DNS exploits continue to be a top attack vector in 2024

The world of IT security has become more sophisticated and complex; as threats have grown exponentially, they have also become more blended, obscure, and harder to remediate. Today, most organisations have experienced some kind of attack, with many experiencing multiple attacks, and it is no longer a matter of if, but when, an attack will take place.

The growth of cybercrime-as-a-service, especially DDoS-as-a-service, has enabled criminals to purchase or rent tools and services that let them carry out attacks without having to develop expertise themselves. Combine such tools with attractive financial incentives and a wide pool of ready-made victims, and it is easy to see why this is such a lucrative industry for criminals.

Top attack techniques

The cost of a network, website or service being down or unavailable can be prohibitive. The average cost of downtime across all industries has historically been about $5,600 per minute, but recent studies show this has grown to about $9,000 per minute. For higher-risk industries such as finance, government, healthcare, manufacturing, media, retail, and transportation, the average cost of downtime tends to be over $5 million per hour.
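Using the per-minute figure above, even a modest outage adds up quickly; a back-of-the-envelope calculation (the outage duration here is hypothetical):

```python
cost_per_minute = 9_000   # recent cross-industry estimate in USD, per the figure above
outage_minutes = 90       # hypothetical incident duration

print(f"Estimated downtime cost: ${cost_per_minute * outage_minutes:,}")  # $810,000
```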

One of the most popular attack techniques involves the domain name system (DNS). The DNS protocol is essential to every internet-based service and is used to translate alphabetic domain names into a set of numerical internet protocol addresses. DNS is one of the key protocols that makes the internet work.
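That translation is easy to see from any machine; a few lines of standard-library Python ask the configured resolver for the addresses behind a name (the domain used is just an example):

```python
import socket

# Resolve a domain name to IP addresses, as any DNS client would.
results = socket.getaddrinfo("www.example.com", 443, proto=socket.IPPROTO_TCP)
for family, _, _, _, sockaddr in results:
    print(sockaddr[0])   # one of the A/AAAA records returned by the resolver
```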

Why DNS is a favourite attack vector

Today, many organisations provision their own DNS infrastructure to ensure uninterrupted operations of their IT infrastructure and business applications. For example, in many organisations, work computers default to using the organisation’s own DNS servers. This helps internal users access internal websites while keeping such domain names confidential and secure. However, DNS still remains one of the favourite attack vectors for cyber criminals for two main reasons:

• It is an inherently insecure protocol, and easier to target.

• DNS is fundamental to the operations of the internet and applications, and therefore bringing it down can have a much greater impact compared to simply targeting individual applications or services.

30 CXO DX / APRIL 2024 » COLUMN

As more organisations rely on online applications, DNS exploits have become more common. In a 2023 IDC study, 88% of organisations have experienced one or more DNS attacks on their network, with an average of seven per year, and each successful attack costs the business, on average, $942,000.

Delving into DNS attack techniques

There are several different DNS-based attack techniques including: DNS tunneling, DNS phishing, DNS hijacking or credential attacks, DNS spoofing, and DNS malware. DNS attacks are also used as the basis for both DDoS and more advanced phishing attacks.

Many DDoS attacks rely on ways to abuse DNS protocols, including traffic amplification, subdomain attacks, DNS floods and DNS recursion attacks. DNS hijacking, for example, allows attackers to re-route queries from an organisation’s servers to destinations that they control, and it is often used to insert malware into endpoints. With DNS spoofing, malware is injected into DNS caches, or directly via DNS tunneling, so hackers can redirect DNS query traffic. DNS NXDomain flood attacks send spurious queries to nonexistent domain names with requests for invalid or non-existent records, tying up servers.
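To illustrate how one of these techniques might surface in practice, here is a naive sketch that flags clients generating an unusually high share of NXDOMAIN responses in resolver logs. The log format and thresholds are invented for the example, and production DNS protection relies on far richer signals than this:

```python
from collections import Counter

# Hypothetical resolver log entries: (client_ip, query_name, response_code)
log = [
    ("203.0.113.7", "login.example.com", "NOERROR"),
    ("198.51.100.9", "xk2q9.example.com", "NXDOMAIN"),
    ("198.51.100.9", "p0w3r.example.com", "NXDOMAIN"),
    ("198.51.100.9", "zzt0p.example.com", "NXDOMAIN"),
]

nxdomain = Counter(ip for ip, _, rcode in log if rcode == "NXDOMAIN")
total = Counter(ip for ip, _, _ in log)

for ip, count in total.items():
    if count >= 3 and nxdomain[ip] / count > 0.8:   # crude flood heuristic
        print(f"possible NXDOMAIN flood from {ip}: {nxdomain[ip]}/{count} failed lookups")
```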

All of these types of attacks can have short- and long-term implications. In the immediate aftermath of an attack, an organisation may experience downtime or loss of productivity as a result of systems being taken offline. This can lead to revenue loss, reputational damage, and regulatory fines. Long-term impacts include damage to brand reputation, loss of customers, and decreased market share.

The challenge with multiple products to protect DNS

With the emergence of each new threat and the technology to counter it, organisations have traditionally responded by deploying a new security product to remediate the immediate threat at hand. Over time, this has led to the deployment of numerous security devices in the network, resulting in the following challenges:

• Increased complexity: With many security devices in the network, the task of deploying, managing, and troubleshooting has become increasingly complex. Each device has its own separate management interface and configuration commands that require specialised knowledge to deploy and troubleshoot.

• Increased cost: Upgrading DNS infrastructure to meet growing traffic needs requires upgrading most, if not all devices. This results in the need to purchase multiple different products, resulting in high purchase and licensing costs.

• Slow performance: Some of the newer DNS technologies, such as DNS over HTTPS (DoH) and DNS over TLS (DoT), require TLS decryption/encryption processing, which is highly CPU-intensive. However, DNS servers were not originally designed for such processing, so adding DoH/DoT can lead to a severe slowdown in overall performance (see the sketch after this list).

• Unsuitable for hybrid cloud: All these problems are further compounded by the growing adoption of hybrid cloud. This is because many of the legacy security products that have been deployed in private data centres may either not be available or may not be optimally suited for such a deployment. This leads to adoption of cloud-specific offerings, adding to the complexity and cost of deployment.
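As an example of the DoH traffic mentioned above, the sketch below queries a public resolver's JSON interface over HTTPS. It assumes the third-party `requests` package and Cloudflare's public DoH endpoint, and every such lookup carries the TLS processing overhead described in the list:

```python
import requests

# A DNS-over-HTTPS (DoH) lookup: the query and answer travel inside TLS.
resp = requests.get(
    "https://cloudflare-dns.com/dns-query",
    params={"name": "www.example.com", "type": "A"},
    headers={"accept": "application/dns-json"},
    timeout=5,
)
for answer in resp.json().get("Answer", []):
    print(answer["data"])   # the resolved address records
```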

Securing and simplifying your DNS infrastructure

DNS is a critical component of the internet infrastructure, and it is important that DNS is always up and running to ensure normal business operations. However, DNS is also susceptible to a range of attacks and unfortunately no single security method can prevent all the different types of attacks. Therefore, an all-encompassing approach is required, including DNS load-balancing, DNSSEC, DoH/DoT, and DNS caching to ensure DNS infrastructure is constantly available and performing optimally.

Only with a comprehensive set of DNS security solutions can organisations secure and simplify their DNS infrastructure without compromising on performance or the user experience.


DEFEND YOUR DIGITAL ASSETS

Organizations need to employ a full range of security options to protect their APIs and ultimately, their business writes

Pop quiz time. Which of these endpoints belongs to an API and which one belongs to an app?

https://www.example.com/product
https://www.example.com/product

If you’re confused and can’t decide, that’s okay. That’s the point. App and API endpoints look pretty much the same. That’s because, in technical terms, if they’re RESTful (and most are), they are invoked in the same way, via HTTPS and usually with a GET method. What’s often different is the payload sent with the request. For APIs, that typically contains some data in a JSON or XML format, while web app requests may contain, well, nothing.
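A small sketch makes the point. Both calls below hit the same illustrative endpoint from the quiz, and only the headers and payload hint at whether an app or an API is on the other side (the `requests` package is assumed, and the URL is the article's placeholder):

```python
import requests

# A web app page view: a plain GET, often with no meaningful payload at all.
page = requests.get("https://www.example.com/product")

# An API call: same host, same verb, but structured data in and out.
api = requests.get(
    "https://www.example.com/product",
    headers={"accept": "application/json"},
    params={"sku": "AB-123"},   # hypothetical query data
)

print(page.headers.get("content-type"))  # e.g. text/html for the app
print(api.headers.get("content-type"))   # e.g. application/json for the API
```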

Still, one of the key findings from F5’s annual State of Application Strategy report implies that organizations treat APIs as different from applications when it comes to security. We infer this based on the finding that 41% of organizations have at least the same or greater number of APIs than they do applications and yet place a lesser value on the same security services that protect them.


You might wonder how organizations would end up with more APIs than apps. Thanks for asking! While APIs used for internal, service-to-service communication (a la microservices) are certainly tightly coupled to the service they support, this is not necessarily true when APIs are used to present external interfaces.

Where do APIs come from?

Consider that in our 2021 research, 61% of respondents told us they were “adding a layer of APIs to enable modern user interfaces” as a method of modernization. In 2022 that number was 45%. What that means is the APIs enabling modern user interfaces are not necessarily artifacts directly attached to applications. They might be façades that facilitate modern user interfaces and applications, like mobile apps and digital services, or they might be façades designed to enable partner and supply chain communications. These use cases are supported by API Gateways and layer 7 routing in load balancers, which often provide some level of transformation capabilities that allow them to translate from API endpoint to app endpoint, thus enabling an API façade like those that make old American west buildings appear much more impressive than they are.
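A toy illustration of such a façade: an incoming, versioned API path is rewritten to the legacy application endpoint behind it. Both URLs and the path pattern are invented for the example; a real gateway would also handle authentication, rate limiting and payload transformation:

```python
import re
from typing import Optional

def to_legacy(path: str) -> Optional[str]:
    """Map a modern API path onto the legacy app endpoint it fronts."""
    match = re.fullmatch(r"/api/v1/products/(?P<sku>[\w-]+)", path)
    if match:
        return f"https://legacy.internal/product.do?item={match['sku']}"
    return None   # unknown path: let the gateway return a 404

print(to_legacy("/api/v1/products/AB-123"))
# -> https://legacy.internal/product.do?item=AB-123
```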

And of course, a goodly number of APIs are public-facing entities attached to apps and accessed via the web (typically HTTPS).

Regardless of how they got there, public-facing APIs are subject to many of the same attacks as applications. This is especially true when bots are involved, as APIs with good documentation simply make it easy for attackers to script attacks at scale.

For example, just over 13% of transactions protected by F5 Distributed Cloud Bot Defense in 2023 were automated. That is, a script or software was used instead of a human using a web browser or mobile app. Those transactions occur via both APIs and apps. Some percentage of those automated transactions were certainly “bad bots” that the presence of our security service prevented from doing whatever bad thing they were trying to do.
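For a flavour of how that automation shows up in traffic, here is a deliberately naive sketch that flags scripted user agents and unusually chatty clients. The log entries and thresholds are invented, and commercial bot defence relies on far more sophisticated signals than this:

```python
from collections import Counter

# Hypothetical access-log entries: (client_ip, user_agent)
hits = [
    ("203.0.113.7", "Mozilla/5.0 (Windows NT 10.0) ..."),
    ("198.51.100.9", "python-requests/2.31"),
    ("198.51.100.9", "python-requests/2.31"),
    ("198.51.100.9", "python-requests/2.31"),
]

per_ip = Counter(ip for ip, _ in hits)
for ip, agent in set(hits):
    scripted = any(tool in agent for tool in ("python-requests", "curl", "wget"))
    if scripted or per_ip[ip] > 2:       # crude signals of automation
        print(f"flag for bot review: {ip} ({agent})")
```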

So, when we looked at how respondents perceive bot management based on their self-reported number of APIs, we were somewhat shocked to discover that bot management is pretty low on the importance scale.

While the importance placed on API Gateways appears to be appropriate to the number of APIs under management, the same is not true for bot management. In fact, it’s completely the opposite! As the number of APIs grows, the importance of bot management appears to decline rapidly.

It could certainly be the case that the bulk of those APIs are internal. That is, they are east-west APIs between microservices that are not exposed to external actors that might be bad bots with malicious intent.

But then again, they might be. Given the number of articles I’ve read in the past year about attackers gaining access via APIs, I’m going to guess there are a lot more external than we think.

So, it’s time to remind folks that while there are a number of annoying bots out there—grinch bots, sneaker bots, etc.—that disrupt business by gobbling up high-demand goods, there are also a significant number of bots whose only purpose is to sniff out vulnerabilities and attack them. In both APIs and applications.

Thus, it would be a good idea for organizations to employ a full range of security options to protect their APIs and ultimately, their business. Bot management is certainly one of those security options and should be considered a critical component of any security strategy.

At the end of the day, the bots don’t care whether that endpoint belongs to an app or an API. They’re going to attack both.

Which means organizations need to be protecting both apps and APIs by detecting bots and preventing them from doing whatever bad thing they’re trying to do.


HOW ENTERPRISES CAN AVOID TECH DEBT WHILE PIONEERING PROGRESS

Dinesh Varadharajan, Chief Product Officer (CPO) at Kissflow says technical debt can become an obstacle to the very transformation that was originally envisaged

In recent decades, the GCC has emerged as a trailblazer. Far from being content with just following the trend, countries in the region have aspired to set it – whether that be the UAE in being the first country in the world to appoint a Minister for Artificial Intelligence, or Saudi Arabia’s US$40 billion push to lead the ongoing AI revolution. Private enterprises throughout the region also move rapidly to mitigate risks or take advantage of opportunities.

Speed is everything, so many pioneering projects inevitably involve technology. And when an organisation gathers its stakeholders to discuss digital transformation, the roadmap becomes littered with deadlines. This is understandable. No business wants to be stuck in due diligence while its competitors are onboarding new customers. Nonetheless, quick fixes and fast-spun code can lead to a cycle of heavy maintenance burdens and spiralling costs – or as it has come to be known, technical debt.

This debt is inevitable when replacement of legacy systems becomes an urgent priority. For example, pre-2023, a GCC business may have been exploring AI use cases with a view to digitalising a core workflow. Then, when ChatGPT started grabbing everyone’s imagination, the business’s technical and business leaders may have felt a greater sense of urgency about adopting AI. They may have had concerns about more agile players entering their industry or an existing competitor, powered by generative AI, bringing new offerings to the marketplace.

Complex interdependencies

But in moving away from legacy systems and processes, it is critical that project leaders pay due attention to their complex interdependencies with other assets. If an organisation puts speed before quality, its short-term approach may lead to suboptimal coding decisions that dampen the effectiveness of the overall solution and call for ongoing fixes that accrue as technical debt. Such debt can become an obstacle to the very transformation that was originally envisaged.

Technical debt is easier to prevent than to eliminate. Technical leaders that have been primed to avoid it will be wary of practices such as taking shortcuts during software development. They can inform line-of-business executives of the future inefficiencies that are risked by prioritising short-term results in the present. If the DevOps team spends more and more time reworking code, that leaves less time to develop new digital experiences. Decision makers will be trapped by the choice of leaving suboptimal code in place, delaying new projects, or recruiting new developers. Every one of these options carries a significant cost.

A McKinsey report from 2022 likened technical debt to “dark matter” (“you know it exists, you can infer its impact, but you can’t see or measure it”). But CIOs surveyed by the analyst firm estimated that their technical debt was anywhere between 20% and 40% of the value of their digital estate. As we have seen, the debt flows directly from the costs associated with fixing issues. Resources are directed towards tasks that would not have been necessary if different coding decisions had been made.

Shortcut today, slowdown tomorrow

The lessons of technical debt are clear. If we take a shortcut today, we may face a slowdown or a standstill later in our digitalization journey. When DevOps leaders make the case for more careful development cycles, however, they must be able to frame the problem in a way that will resonate with non-technical executives. Tech-team leaders can point out that technical debt leads to less innovation, which inevitably impacts customer engagement. Developers that are fixing issues related to past shortcuts are not building new solutions. If a market opportunity arises and the development team is distracted or unavailable, then that opportunity will likely be missed. This also applies to the ability to react to external crises.

Some non-technical stakeholders may be interested in the escalating financial costs of maintaining and updating applications amid technical debt. They may also be alarmed to learn that the longer the debt goes unaddressed, the more expensive it will be to fix. Stakeholders of a financial mindset will likewise see a business risk in technical debt if they are told that it slows or prevents growth. This is also an easy case to make. Put simply, if the enterprise cannot bring new offerings to customers quickly enough, then a competitor captures that opportunity for growth.

Potentially the most damaging of all is the impact tech debt can have on cybersecurity. In the GCC’s highly regulated markets, private companies must master the vulnerabilities in their core business systems. On top of the risks posed by original shortcuts, each workaround, patch, and upgrade is a chance to introduce another vulnerability that threat actors can exploit. Most modern business leaders need no more information than that to recognize the severe risks involved.

Low-code to the rescue

When under pressure from development backlogs and tight deadlines, moving fast may be an unfortunate necessity. But thanks to low-code and no-code (LCNC) app development platforms, teams can now accelerate development without accumulating technical debt. LCNC platforms allow non-technical users to become citizen developers, alleviating the labour burden on IT and enabling solutions to go live much more quickly.

As development times shrink from months to weeks, quality does not suffer under low-code development. LCNC tools use out-of-the-box templates with best practice already baked in. Visually rich drag-and-drop editors allow business users to build useful apps intuitively, with most underlying code written automatically. These are not shortcuts because the optimal code base already exists and is imported automatically as the user builds their solution, thus avoiding technical debt. Moreover, with robust governance layers in place, IT teams can maintain their watchful eye, ensuring that quality and security are never compromised.

Agile, iterative development is crucial to competitiveness in a region where markets change rapidly. Low-code/no-code app development allows businesses to maintain quality, security, and efficiency while meeting the demand for new experiences. Technical debt becomes a thing of the past if DevOps teams carefully manage delivery expectations. Low-code/no-code platforms allow tighter deadlines to be more realistic because best-practice code has already been written and passed quality checks. With IT and non-IT leaders working together, LCNC adoption becomes the first step in eliminating technical debt and making sure it no longer stands in the way of progress.

"Technical debt is easier to prevent than to eliminate. Technical leaders that have been primed to avoid it will be wary of practices such as taking shortcuts during software development. They can inform line-of-business executives of the future inefficiencies that are risked by prioritising short-term results in the present."
35 APRIL 2024 / CXO DX
COLUMN
»

PNY RP60 PORTABLE SSD

The new RP60 Portable SSD was designed with creatives with wanderlust and on-the-go professionals in mind to provide a highly portable and extremely durable data storage solution. With a compact form factor and convenient clip loop, the RP60 can easily go wherever the adventure takes you. Travel safe in the knowledge that your data is secure and protected, with the RP60’s tough outer silicone shell, which makes the drive resistant to drops, water and dust. Plus, experience top-tier performance with read/write speeds of up to 2,000 MB/s and 1,800 MB/s, respectively, and broad compatibility across various devices, thanks to the USB 3.2 Gen 2x2 Type-C connector.

Offered in 1TB and 2TB capacities, the RP60 facilitates convenient storage and swift transfers of large multimedia files. Creators can access and edit high-resolution images and videos with ease for smooth editing and playback experiences, while gamers can enjoy lightning-fast installations and loading times for their extensive gaming libraries. Mainstream users too will find the RP60 Portable SSD to be an excellent choice as either a dedicated system backup drive or an everyday-use storage companion, perfect for securely storing and accessing critical work documents, projects, and large libraries of photos and movies.

Highlights:

• RUGGED DESIGN: Engineered and tested to protect your data against water, dust (IP65) and accidental drops (up to 3m/9.8ft)

• HIGH-SPEED TRANSFERS: Move large files at lightning-fast speeds, with sequential read/write speeds of up to 2,000 MB/s and 1,800 MB/s respectively

• COMPACT AND PORTABLE: With a compact and lightweight design featuring a clip loop, the RP60 can easily slide into your pocket or securely clip onto a beltloop, backpack or camera bag.

• USB Type-C COMPATIBILITY: The RP60 Portable SSD is compatible with a wide range of devices, including smartphones, tablets, laptops, and desktops. Effortlessly connect and transfer data across your tech ecosystem.

• DATA PROTECTION: Included with the RP60 is the Acronis True Image Data Protection software to help back up and recover all your important data.

DELL P-SERIES MONITORS

The Dell P-series monitors are targeted at business professionals who are looking to maximize their work productivity while the S-series monitors are meant for families, students and general consumers who are after entertainment, enjoyment and lasting comfort.

For working professionals who crave all day comfort and easy connectivity, the Dell 27 / 24 USB-C Hub Monitors (P2725HE / P2425HE / P2425E) offer enhanced eye comfort features as well as extensive connectivity ports.

The new P-series monitors are designed using 85% post-consumer recycled and closed-loop ITE-derived plastic, 50% recycled steel, 100% recycled aluminum and at least 20% recycled glass. Additionally, the P-series monitors also meet the latest environmental standards such as EnergyStar, EPEAT Gold, TCO and TCO Edge. With the new EPEAT Climate+ designation, these monitors have met the industry best practices around decarbonization.

Users can now conserve energy when the monitor is not in use with PowerNap mode in Dell Display Manager, which dims or puts the monitor to sleep and lowers energy use by up to 14.8%. Lastly, all the monitors also ship in boxes that are made from 100% renewable content and can be recycled, and they come with fiber-based cushioning that are more sustainable to the environment.


DGS-F1008P-2S

The DGS-F1008P-2S, an 8-port industrial Gigabit POE switch from D-Link, provides 8 Gigabit POE ports and 2 Gigabit optical ports with plug-and-play operation. The switch acts as power sourcing equipment (PSE), delivering up to 30W per port to IEEE 802.3af/at-compatible powered devices (PDs), and achieves EMC industrial Level 4 protection.

It effectively resists static, lightning and pulse interference. The corrugated, high-strength aluminum profile housing provides IP40-grade protection, while the low-power design, seismic rail mounting and wide-temperature design allow it to work in harsh environments. The series is especially suited to harsh industrial settings such as intelligent transportation, energy, electricity, and security, where it provides fast and reliable communication links.

Highlights:

• Features 8 Gigabit POE ports, 2 Gigabit optical ports

• Housing: IP40 grade protection, corrugated high strength metal housing

• Effectively resists static, lightning and pulse interference

• Three year warranty

Highlights:

• Improved Productivity: An improved 100Hz refresh rate on an ultrathin-bezel screen, with FHD/WUXGA (1920x1200) resolution featuring IPS technology for a color-consistent and smooth viewing experience.

• Comfortable Viewing: With TUV Rheinland® 4-star eye comfort certification, users can benefit from the improved ComfortView Plus feature that reduces harmful blue light to 35% or less to help reduce eye fatigue, along with flicker-free vivid visuals with a 1500:1 contrast ratio and 99% sRGB color coverage.

• Extensive Connectivity: Transform the monitor into a productivity hub with RJ45 for wired Ethernet connectivity and single cable solution USB-C (power delivery up to 90W) — all in a clutter-free setup. Extensive connectivity ports allow for easy connection to a variety of devices. Dell upgraded one USB-A port to USB-C to address the increasing demand for USB-C ports for easier connectivity to peripherals and fast charging.

• Convenient Design: Connect to your external devices easily with intuitively designed quick access ports on the bottom bezel or attach an optional snap-on soundbar (sold separately) for audio. The height adjustable stand with bi-directional tilt, pivot and swivel function provides a personalized setup.


42% OF UAE BUSINESSES EMBRACE AI, SAYS IBM STUDY

65% of IT professionals have accelerated AI rollout in the last 24 months

Amid growing AI adoption in personal and business life across the region, IBM has released the regional findings of two significant studies: the 2023 Global AI Adoption Index by Morning Consult with insights from businesses in the United Arab Emirates (UAE) and the biennial consumer study focused on the Middle East and Africa (MEA) by the IBM Institute for Business Value. These studies provide a comprehensive look into consumer behaviors and AI adoption trends by enterprises, highlighting the transformative impact of AI in retail and business operations.

UAE Leads in AI Adoption in Business Operations

The IBM Global AI Adoption Index 2023, which surveyed over 8,500 IT professionals, including those from the UAE, underscores the rapid adoption and integration of AI in various business sectors. The study reveals that 65% of IT professionals in the UAE have reported a significant acceleration in AI rollout over the past 24 months, reshaping the way businesses operate, driving efficiency, innovation, and competitiveness in the UAE market. Highlighting the UAE’s stride towards technological advancement, the country has emerged as a frontrunner in the adoption of AI with a remarkable 42% of companies having already actively deployed it in their business operations. This figure is particularly noteworthy as it reflects the proactive approach of UAE businesses in integrating AI into their core processes, setting a benchmark for AI implementation in the region.

Investments in AI are also seeing a strategic focus, with research and development, along with workforce upskilling, being identified as the top investment priorities. This emphasis on R&D and skill enhancement is crucial for sustaining AI growth and innovation, ensuring that the workforce is equipped to handle the evolving demands of AI technologies.

The study further highlights that 34% of UAE companies have a comprehensive AI strategy in place, while an additional 30% are in the process of developing one. The main drivers propelling AI adoption in the UAE include advances in AI technology, the increasing integration of AI in standard business applications, and the pressing need to reduce costs and automate key processes, collectively contributing to the growing reliance on AI as a critical tool for business efficiency and innovation.

Revolutionize retail with AI everywhere: Customers won’t wait

IBM’s study “Revolutionize retail with AI everywhere: Customers won’t wait,” surveyed nearly 20,000 global consumers, including from the MEA.

The study highlighted a widening gap between shopper demands and the current retail offering. Only 3% of MEA consumers are satisfied with the in-store experience, despite the majority of consumers globally (73%) having a preference for physical stores. At 10 points higher than the global figure, 75% of MEA consumers are supplementing their in-store experience by using mobile applications while shopping, demonstrating a trend towards a digitally integrated in-store experience. In terms of AI, 9 out of 10 MEA consumers who haven’t used AI for shopping are interested in its potential to enhance their shopping journey. Meanwhile, 75% of MEA respondents are eager for AI applications, and 67% for AI enhancements like virtual assistants.

Consumer expectations are rapidly changing, underscoring the urgent need for retailers to innovate and integrate advanced technologies, such as AI, to elevate the shopping experience. The study also serves as a resource for retailers, outlining actionable strategies to help these businesses meet changing consumer demands.



