Arm Tech Trends Q1 2021


ISSN 2634-9116

HOW WILL YOU HARNESS THE

POWER OF THE EVOLVING EDGE?

Vol. 3, No. 1

Report UK£99 but FREE for anyone who registers with IoT Global Network www.iotglobalnetwork.com SPONSORED BY

Tech Trends – your new inside track to the topics that really matter.


Unify the edge, transform the business

Pelion Edge unifies and simplifies the management of edge applications, gateway systems, and downstream IoT devices. Reinvigorate legacy OT infrastructure while delivering secure, seamless integration with the latest IoT innovations on offer.

Transform the edge from an operational burden to an autonomous powerhouse. Create an ecosystem of enabled applications, both in-house and third-party, to deliver localized, actionable real-time data that empowers business-critical use-cases such as:

Condition-based Monitoring

Environmental Monitoring and Safety Compliance

Asset Tracking and Inventory Management

Usage-based Insurance

Secure digitalization at the edge produces the data that forms the basis of powerful business transformation. Extending decision-making to the factory floor with Pelion Edge delivers improved responsiveness, operational efficiency, and enhanced security for your IoT deployment.

pelion.com/product/edge-applications-management


Contents

04 Editor’s overview
George Malim warns that it’s too simplistic to view the edge as a destination when, in fact, it will continue to evolve

06 Analyst report
Transforma Insights’ Jim Morrish explores how the evolution of the edge has created complex challenges for organisations to address at the same time as opening up enormous potential

14 Edge evolution
George Malim assesses on-site industrial deployment experiences of edge computing

18 Case study
Inside a building management specialist’s deployment of Pelion device management

19 Solution overview
How Pelion is enabling the edge applications ecosystem

IoT Global Tech Trends covers technological & business developments for businesses enabled by the Internet of Things (IoT). © Copyright WKM Ltd. All rights reserved. No part of this publication may be copied, stored, published or in any way reproduced without the prior written consent of the Publisher.


Managing editor: George Malim Tel: +44 (0) 1225 319566 g.malim@wkm-global.com

Sponsor: Founded initially as Arm's IoT incubation unit, Pelion is now at the forefront of designing, securing, and managing IoT, from devices and connectivity through to data delivery. Whether in the home, the workplace, or anywhere in-between, secure, scalable, robust, and dependable IoT is critical. Pelion is forging an independent path, building on a solid foundation of connectivity and device expertise, operating a global footprint, a strong base of more than 500 customers, and over 150 partners. With a unique combination of global IoT connectivity, device management, and edge applications enablement, the Pelion connected device platform breaks down barriers to IoT success. www.pelion.com

Editorial director & publisher: Jeremy Cowan Tel: +44 (0) 1420 588638 j.cowan@wkm-global.com

Business development: Cherisse Jameson Tel: +44 7950 279368 c.jameson@wkm-global.com

Digital services director: Nathalie Millar Tel: +44 (0) 1732 808690 n.millar@wkm-global.com

Designer: Jason Appleby Ark Design Consultancy Ltd Tel: +44 (0) 1787 881623

Published by: WeKnow Media Ltd. Suite 138, 80 Churchill Square, Kings Hill, West Malling, Kent ME19 4YU, UK Tel: +44 (0) 1732 807410

Evolving Edge I Vol. 3, No.1 I 03


Editor’s overview

THE EDGE IS EVOLVING SO TREAD CAREFULLY

There's a temptation to be simplistic and suggest that edge computing will replace the cloud when it comes to deciding where compute power will be located as enterprises evolve their IT architectures to better support, and more closely mirror, where their functions are performed and their business is done. It's easy to look at the latest figures and deduce that the cloud era is over and the edge now dominates technology decisions but, to arrive at that conclusion, you have to decide what and where the edge is. That's still open to debate because the edge continues to evolve, and new deployments, devices and data technologies continue to expand its scope, writes George Malim.

It's clear that the traditional model of collecting data, transmitting it to a centralised processing location via the cloud, processing it to determine an action or no action, and then transmitting an instruction back to the collection point or another location, is complex, costly and has the potential to cause delay. Certainly in IoT, as the number of connected devices grows into the billions, this hyperscale connectivity will be hard to sustain, placing huge burdens on networks, while many applications will demand low-latency connectivity. Edge computing, at its most simple, appears to offer a compelling answer because it enables data to be processed at or near the point of collection and action, potentially cutting out the traditional hub-and-spoke, back-and-forth traffic.

However, the edge is amorphous. For some, it's the telecoms network edge, for some it's an edge gateway device while for


others it is the sensor or device at the edge itself. Determining where the edge is depends on individual applications and business cases.

A light sensor in a smart city's streetlamp is an example of utilising edge intelligence at a basic level. The sensor recognises it's going dark and orders the light to switch on. It doesn't need to ping a message saying it is going dark to a centralised computing resource, which then instructs it to turn the light on. The system only needs connectivity to set parameters and to gather data about how long the lamp has been on, to aid predictive maintenance and to reconcile energy consumption with billing. This is the edge at its simplest: a straightforward decision tree that can be acted upon hyper-locally, with connectivity used to augment the performance of the system.

For smart city authorities, there's an improvement over arbitrarily setting an on and off time



because being able to be more precise and turn each lamp on or off at a specific light level saves wasted on-time in the form of energy consumption, while assuring lighting is provided when needed.

However, this basic example gets more complex when you start to add additional functions and new potential use cases. Solar panels on streetlamps can be used to power the lighting, but intelligence is needed to store the power safely, allow excess power to be sent to the grid, or to request grid power if solar collection has been inadequate. Add unpredictable new services, such as recharging electric cars at the streetlight, and suddenly the streetlamp has a lot of complex decisions to make. The initial on-off decision tree has become a multi-layered process with large numbers of variables to consider, and the basic computing power of that core process will be swamped by all the extra inputs of additional services and task load.

The edge is therefore always evolving and that is likely to continue for the foreseeable future, at least. Today, the edge is not the artificial intelligence (AI) and machine learning-enabled world in which every device from your smart watch to your home security,

your car and your child’s schoolbag is connected and equipped with compute power. Cost remains too high for the business case and applications that truly need edge computing everywhere are few and far between. Instead, edge computing and multi-access edge computing (MEC) in particular is proving its value in discrete deployments, typically in private, controlled environments, rather than those that utilise the public internet.
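The streetlamp decision tree described earlier can be sketched in a few lines. This is a minimal illustration of hyper-local edge logic; the threshold value and function names are assumptions for the example, not part of any real smart-city system:

```python
# Minimal sketch of a hyper-local streetlamp decision tree: the
# switching decision is made on-device, while connectivity is only
# used to set parameters and report usage for maintenance/billing.

DARKNESS_THRESHOLD_LUX = 10.0  # hypothetical configurable switch-on level


def lamp_should_be_on(ambient_lux: float) -> bool:
    """Decide locally, with no round trip to a central server."""
    return ambient_lux < DARKNESS_THRESHOLD_LUX


def usage_report(minutes_on: int) -> dict:
    """The only data that needs to travel upstream."""
    return {"minutes_on": minutes_on, "threshold_lux": DARKNESS_THRESHOLD_LUX}


print(lamp_should_be_on(3.0))       # dusk: lamp on
print(lamp_should_be_on(40_000.0))  # midday: lamp off
```

The point of the sketch is that the decision loop never leaves the device; only the summary in `usage_report` is transmitted.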

This issue of TechTrends sets out, in the Analyst Report on the following pages, how industrial organisations are deploying MEC in factories, industrial sites, campuses and other contained locations, such as mines. These deployments are far more than test beds; they are real, live projects delivering true business benefits. However, these on-site instances are secure, robust and controlled by their owner-operators.

These deployments are the reality today, and edge computing is being applied to high-value business processes whose economics make sense for their organisations. At the same time, data operations capabilities are improving, enabling more to be achieved at the edge. Hopefully, this represents a virtuous circle in which more is done at the edge so confidence increases, allowing even more to be done at the edge. There is relatively high risk here. Organisations have invested substantially in their cloud infrastructure, and further investment in MEC is needed, along with the adoption of new techniques and the addition of new skills. None of this can happen overnight against an unclear business environment. However, new use cases and practices will see cloud infrastructure come together with new technologies, such as AI and digital twins, at the edge.

This will see the continuation of centralised computing resources accessed via the cloud. These will become data repositories, processing centres and the keepers of the centralised, continuously updated data record. However, many routine functions will be performed more speedily by MEC capability, cutting centralised processing out of the routine process loop.

Everyone's idea of the edge is different and evolving as new possibilities crystallise. This is why we have focused this issue of TechTrends on how the edge is evolving. Importantly, this is not a revolution. The edge has been gaining real adoption for several years now, and it is the on-site deployments that are demonstrating how the edge is evolving and where true value can be exploited.



Analyst report

REPORT Jim Morrish Transforma Insights

THE EVOLVING EDGE ENABLES ON-CAMPUS INDUSTRIAL ELEMENTS

Edge computing makes processing and storage resources available in close proximity to edge devices or sensors, complementing centralised cloud resources and allowing for analytics close to those end devices. Jim Morrish, a founding partner of Transforma Insights, explains how the evolution of the edge has created complex challenges for organisations to address, at the same time as opening up enormous potential benefits.

SPONSORED REPORT




Figure 1: Various definitions of edge computing from different perspectives

Source: Transforma Insights, 2020

Edge computing has many benefits, but it also takes many forms, and different communities within the technology space tend to define edge in different ways. Figure 1 sets out what a range of different groups, including data centre providers, mobile operators, communications providers, corporate IT departments, IoT architects, and software architects, tend to regard as edge. The common thread to these different perspectives is that in each instance the adopted definition moves processing closer to the device, source data or users than might typically be the case for the community in question.

In the context of industrial IoT, it is the on-site edge elements that unlock the most potential. Examples of edge-enabled applications in an industrial context include:

• Quality control based on artificial intelligence (AI) applied to video streams of manufactured products on a production line, to check for any defects.
• Precision agriculture applications using essentially self-driving vehicles that plough, fertilise and maintain agricultural land based on AI-enabled surveys of the ground and precision location monitoring.
• An integrated airport building management system that links flight data to airport systems and building automation systems.

Benefits of edge computing

• (Near) real-time responsiveness. Analytics can be supported locally, avoiding the delays inherent in transferring data to a cloud location for processing.
• Improved device-to-device communications. Communications and the exchange of data between co-located devices can be routed more directly.
• Improved robustness, resilience and reliability. With more analytics undertaken locally to data sources, systems are less susceptible to disruption if a connection to a remote cloud location fails.
• Improved security and data protection. Security and privacy issues associated with transmitting data to cloud locations can potentially be mitigated.
• Regulatory compliance. Locally managed information potentially only needs to comply with local regulations, rather than the multi-jurisdiction regulations that might apply in a cloud environment.
• Reduced operating costs. Reducing the amount of data transferred to cloud locations for processing can reduce communications, cloud processing and data storage costs.
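The reduced-operating-costs point can be illustrated with a deliberately rough back-of-envelope sketch. All prices, volumes and the filter ratio below are hypothetical assumptions for the example, not market figures:

```python
# Rough illustration of cost reduction from filtering data at the edge
# before it is sent to the cloud. All figures are hypothetical.

RAW_GB_PER_DAY = 500.0     # data produced on site (assumed)
CLOUD_COST_PER_GB = 0.05   # assumed combined transfer + processing cost, USD
EDGE_FILTER_RATIO = 0.02   # assumed fraction still worth sending upstream


def monthly_cloud_cost(gb_per_day: float, filter_ratio: float = 1.0) -> float:
    """30-day cloud cost for the data that actually leaves the site."""
    return gb_per_day * filter_ratio * CLOUD_COST_PER_GB * 30


all_to_cloud = monthly_cloud_cost(RAW_GB_PER_DAY)
edge_filtered = monthly_cloud_cost(RAW_GB_PER_DAY, EDGE_FILTER_RATIO)
print(f"all-to-cloud: ${all_to_cloud:.2f}/month")
print(f"edge-filtered: ${edge_filtered:.2f}/month")
```

Under these assumed numbers, processing locally and forwarding only 2% of the data cuts the upstream bill from $750 to $15 a month; the real ratio depends entirely on the application.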




Figure 2: Edge compute capabilities vary between thick and thin

Source: Transforma Insights, 2020

Thick and thin variants of campus edge computing

Not all edge assets are the same, and in a campus context edge devices can range from small microcontrollers through to enterprise servers. Consequently, there is generally a wide variance of edge asset types within any given campus environment. Clearly, not all edge applications can be deployed to all edge assets. For example, it isn't possible to deploy an AI-enabled video stream processing application onto a small microcontroller. This variety of edge device types limits the potential for application deployment, particularly onto industrial machines, although other edge-type assets are essentially IT-domain equipment and can be dimensioned to match the tasks that need to be undertaken. From this perspective, it is worth calling out the edge gateway, which can potentially be collocated with an industrial machine. Unless an industrial machine has been designed

“...an edge gateway can often be the most effective way to get the maximum amount of actionable data from an industrial machine”


specifically to support higher-level edge computing, or can be upgraded with more onboard processing power, the edge gateway represents the closest that high-powered compute infrastructure can be located to an industrial machine. For many older industrial machines, then, the deployment of an edge gateway, collocated with the industrial machine, represents something of an optimal solution: getting edge computing any closer to the machine, i.e. on board the machine, could involve a substantial upgrade or replacement, while an edge gateway can offer very substantial processing power in very close proximity and directly connected to the machine in question. Essentially, an edge gateway can often be the most effective way to get the maximum amount of actionable data from an industrial machine without having to significantly upgrade that machine.

Figure 2 identifies different potential campus edge hardware types, and places them on a scale between 'thin' (low compute power) edge and 'thick' (high compute power) edge. Clearly, these different types of edge assets are better matched to different applications, with the smallest microcontrollers best suited to alarming, while resilient operations would depend on all on-campus edge devices. Compute power is not the only determinant of edge application location, though. For example, access control systems could be



For many potential end-user adopters of edge technologies, one of the most immediate challenges is simply connecting aging industrial machines to appropriate control infrastructures

supported by enterprise servers within an industrial facility so that access to different areas of a campus for individual staff can be most effectively managed, with reference to central databases of access privileges.

It is clear that on-campus edge infrastructure can become a relatively complex and diverse environment, with a range of different device types, with different processing power, available in different locations. The varying requirements as to the optimal location of different edge applications need to be considered against this potentially complex landscape.

DataOps at the edge

DataOps is a fast-emerging concept in the edge domain, allowing for the optimisation of data flows and of the location of application deployment within a local campus environment, including deployment to devices with limited on-board processing. DataOps at the edge might include configurable application-specific synchronisation policies between cloud and edge and within the on-campus edge, and data partitioning so that only the business data needed at a specific edge location is distributed and synchronised. This approach effectively renders all available compute-processing assets within a campus location

as a single, managed, distributed computing environment and has the potential to more effectively support a range of edge-type applications. In the case of a manufacturing facility, for example, these could include:

• On-device processing for monitoring and controlling the performance of an industrial robot.
• Device gateways for interworking between the data feeds from a production line machine and control messaging, including protocol translation.
• Production line management at the level of a single production line, potentially including video-based and AI-enabled quality control to identify specific remedial actions required for that line.
• Management of multiple production processes together, to support synchronisation among different processes, for example manufacturing, packing and parts delivery.
• Central campus edge infrastructure, which would be an appropriate location from which to optimise overall production at a manufacturing facility, taking into account production deadlines, stock levels of component parts, and supply chain information.
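The data-partitioning idea above can be sketched in a few lines: each edge location receives only the records its applications need. The policy fields, topic names and record shapes here are illustrative assumptions, not any particular DataOps product's API:

```python
# Sketch of edge data partitioning: a per-location policy decides which
# data streams are distributed and synchronised to that location.

from dataclasses import dataclass


@dataclass
class SyncPolicy:
    location: str
    topics: set          # data streams this edge location needs
    sync_to_cloud: bool  # whether results are also pushed upstream


def partition(records: list, policy: SyncPolicy) -> list:
    """Keep only the business data needed at this edge location."""
    return [r for r in records if r["topic"] in policy.topics]


records = [
    {"topic": "line1/vibration", "value": 0.7},
    {"topic": "line2/vibration", "value": 0.2},
    {"topic": "warehouse/stock", "value": 184},
]
line1_policy = SyncPolicy("line1-gateway", {"line1/vibration"}, sync_to_cloud=False)
print(partition(records, line1_policy))  # only the line1 record
```

A real system would layer synchronisation scheduling and conflict handling on top, but the core decision is this filter, applied per location.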




“...edge gateways can also support edge computing, and may be an appropriate location to which to deploy certain analytics”

Software containers

• Software containers function like individual computers from the perspective of the software programmes running in them.
• However, they are in fact isolated user-space instances, supported by an operating system kernel that allows multiple such instances to run on the same infrastructure.
• A computer programme running on an ordinary operating system can see all resources (connected devices, files and folders, network shares, CPU power, and other quantifiable hardware capabilities) of that computer. However, programmes running inside a container can only see the container's contents and the resources assigned to the container.

The general principles underlying this kind of distribution of applications within a campus edge environment are that application functionality that is mission-critical, time-critical, or reliant on high data volumes should be deployed as close to the relevant source(s) of data as possible, while recognising that an application can increase its scope, in terms of the variety of data it can ingest from different sources, as it moves further away from an edge data source.

Deploying edge

For many potential end-user adopters of edge technologies, one of the most immediate challenges is simply connecting aging industrial machines to appropriate control infrastructures and getting relevant data from the same machines for subsequent processing. Many industrial facilities that are currently operational will have been deployed years ago, and many of the industrial machines operating in these locations will have been designed before support for edge computing was a necessity. In addition, the data that such machines generate may only be available in proprietary formats, with protocols, configurations, frequency and meaning varying between different machines from different manufacturers, or even between machines of different ages from the same manufacturer.

This situation gives rise to two key elements that are often required to deploy edge capabilities into a brownfield environment. Firstly, potential end-user adopters of edge computing need to find ways to communicate in both directions with their legacy




A Kubernetes-type approach can help with the actual deployment, but will not ensure that one application cannot interfere with another

machinery and, secondly, they need to find a way to make such communications meaningful, so that machine information can be ingested into edge applications and machines recognise the commands that they receive.

Edge gateways collocated with industrial machines can solve many of these problems. Essentially, such devices have the potential to translate communications between the formats and protocols that a legacy device understands and the formats and protocols that the wider campus edge environment understands. Of course, edge gateways can also support edge computing, and may be an appropriate location to which to deploy certain analytics. From the perspective of an end-user, then, edge gateways can be a key armament in the arsenal of resources that can be deployed to fully enable edge computing in a campus location.

A similar tactic can potentially be game-changing for a vendor of industrial machinery. Such a vendor may have deployed industrial machinery at multiple locations, for multiple clients, over a period of decades. Their machinery in the field will have a wide range of capabilities in terms of the output data that can be generated, the control messages that can be received, and the communications formats supported. More than this though, many of that vendor's

clients will be facing various levels of legacy equipment challenges as they attempt to stitch that vendor's assets into their edge computing environments. A solution to this challenge could be for the vendor in question to retrofit each of their assets in the field with an edge gateway which can output information and receive instructions in standardised, well-documented formats and according to widely accepted communications protocols. The edge gateway would ideally, of course, also offer some level of standardised edge compute capability to the operators of the machinery in question.

In this way, a vendor of industrial machinery could effectively upgrade their estate of devices deployed with multiple clients so that all ages of machinery could support some level of edge computing capability, and in a reasonably consistent way. Ideally, this same vendor would be able to access the newly deployed edge gateways from a remote location, firstly to monitor machine performance, but also to support software upgrades and potentially remote maintenance and other applications.
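The gateway's translation role described above can be sketched as a small adapter: a legacy machine emits a proprietary frame, and the gateway republishes it in a standardised shape. The frame format, field names and machine identifier here are invented purely for illustration:

```python
# Sketch of an edge gateway translating a proprietary legacy frame
# (hypothetical 'ID;KEY;VALUE;KEY;VALUE' format) into standardised JSON
# for the wider campus edge environment.

import json


def parse_legacy_frame(frame: str) -> dict:
    """e.g. 'M42;TEMP;78.5;RPM;1200' -> normalised reading dict."""
    parts = frame.split(";")
    machine_id, pairs = parts[0], parts[1:]
    readings = {pairs[i].lower(): float(pairs[i + 1])
                for i in range(0, len(pairs), 2)}
    return {"machine_id": machine_id, "readings": readings}


def to_standard_json(frame: str) -> str:
    """What the gateway would publish upstream."""
    return json.dumps(parse_legacy_frame(frame))


print(to_standard_json("M42;TEMP;78.5;RPM;1200"))
```

The reverse direction, translating standard control messages into the legacy protocol, is the mirror image of the same adapter.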

Edge challenges

Whilst the advent of DataOps at the edge can bring significant benefits, there are also challenges inherent in deploying such a flexible system. One such challenge is edge orchestration, including the automated




The range of third parties that could potentially deploy applications to edge assets is not limited to those that might directly be included in day-to-day operations and production

configuration, coordination, and management of computer systems and software. What is needed is a software container orchestration system for automating application deployment, scaling, and management in an edge environment. Kubernetes, an open-source system that is often re-branded and distributed by vendors, is one of the main such solutions, although alternatives exist.

But software container orchestration is not the full story. Up to this point, this article has focussed on edge capabilities in an abstract sense. In a real-life deployment, it is likely that edge applications provided by different vendors, and intended to support different use cases, will need to be deployed alongside each other on the same infrastructure. A Kubernetes-type approach can help with the actual deployment, but will not ensure that one application cannot interfere with another. For instance, a badly written or malicious machine learning application might draw so much system resource that another, potentially mission-critical, application is unable to operate effectively.

Clearly, there is a need for some level of security in the edge environment, extending to ensuring trustworthy operations. One potential approach is to deploy a security 'wrapper' around any software containers that run applications at the edge. The purpose of the security wrapper is to ensure that individual software containers, and so the applications within them, do not operate in a way that might compromise other software containers and applications. Once such security measures are in place, the edge effectively becomes a much more open


environment, with potential for multiple parties to deploy applications to the edge. In turn, this opens up the potential for what could be termed application stores for edge applications.
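On the resource-exhaustion case specifically, Kubernetes-type platforms do provide per-container resource requests and limits that cap what any one edge application can consume, although, as noted above, limits alone do not amount to the broader trusted-execution 'wrapper' described. A minimal illustrative manifest, in which the application name and image are placeholders:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: ml-quality-check      # hypothetical third-party edge application
spec:
  containers:
  - name: inference
    image: example.com/quality-check:1.0   # placeholder image reference
    resources:
      requests:
        cpu: "500m"           # guaranteed share of a node's CPU
        memory: "256Mi"
      limits:
        cpu: "1"              # hard cap: cannot starve co-located apps
        memory: "512Mi"       # exceeding this terminates the container
```

Declarative caps of this kind are what stop a badly written application from draining a shared edge node, while the security wrapper addresses the wider trust questions.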

Application stores for the edge

Any industrial facility is likely to include machinery from a range of OEMs. This immediately highlights two potentially conflicting dynamics: the desire of the facility owner to deploy edge applications that support their overall smart factory vision, and the desire of any OEM to deploy applications that support their machinery. However, with flexible deployment of applications to edge assets, supported by a Kubernetes-type platform and enabled by appropriate security measures, this conflict is easily dealt with, and both stakeholders, the facility operator and the OEM, can potentially deploy their own applications alongside each other on the same edge assets.

There's no reason why the story should stop there, though. Both the facility owner and the OEM could maintain their own ecosystems of applications that can potentially be installed onto edge hardware. From an OEM perspective, this could include their own applications both to support ongoing operations, for instance remote monitoring, and to support any specific requirement, such as a diagnostic tool that could be installed temporarily to support the resolution of a fault. From a facility owner perspective, the application ecosystem could include applications required to support the needs of different clients, or production processes, particularly in a contract manufacturing context.



Both the OEM's and the facility manager's application ecosystems could potentially extend to include applications provided by third parties, used to support specific functionality. The range of third parties that could potentially deploy applications to edge assets is not limited to those that might directly be included in day-to-day operations and production. Providers of financial services, for instance, might be interested in deploying their own monitoring applications onto edge gateways to support condition-based insurance, or even secured finance in the case of a servitised asset-as-a-service proposition. In these cases, the financial services provider would benefit from a direct connection to the machines that are the source of their financial exposure. Such an approach ensures that the provider of financial services can benefit from the full range of information that a machine can provide, as opposed to needing to rely on information that has first been

collected and processed by either the OEM or the facility owner. Additionally, this kind of approach can help to address many issues around the trustworthiness of performance information. With a direct (and secure) connection to an edge gateway, a financial services provider can be sure that they are getting full, complete, and reliable information on the real-time performance of a machine. If appropriate controls are implemented at the edge gateway, it will also be possible to guarantee that any information reported has not been tampered with, that the configuration (and use) of a machine has not deviated from any pre-defined operating condition restrictions, and that maintenance routines have been adhered to. Effectively, this implements a 'root of trust' that extends through all relevant aspects of an industrial machine and reports directly to a financial services provider.
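The tamper-evidence part of this idea can be sketched with keyed message authentication: the gateway signs each reading with a key provisioned by the financial services provider, so any modification in transit is detectable. A real root of trust would anchor the key in hardware; the HMAC scheme, key and field names below are an illustrative stand-in:

```python
# Sketch of tamper-evident telemetry from an edge gateway: each reading
# is signed with a provisioned key so the recipient can verify it has
# not been modified. Hardware-backed keys would replace the constant.

import hashlib
import hmac
import json

PROVIDER_KEY = b"provisioned-secret"  # placeholder; hardware-backed in practice


def sign_reading(reading: dict) -> dict:
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return {"reading": reading, "sig": tag}


def verify(report: dict) -> bool:
    payload = json.dumps(report["reading"], sort_keys=True).encode()
    expected = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, report["sig"])


report = sign_reading({"machine_id": "M42", "hours_run": 1812})
print(verify(report))                # untouched report verifies
report["reading"]["hours_run"] = 10  # tampering with usage data...
print(verify(report))                # ...is detected
```

Signing alone does not prove the machine's configuration was honest at source, which is why the article's wider controls at the gateway still matter.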

Conclusions

The edge is one of the most exciting and fastest-evolving areas of the industrial IoT right now. The benefits of deploying edge technologies can be significant, both in an operational sense and in terms of the new business models that edge technologies, and particularly edge gateways, can enable. But the edge is a complex environment, and managing data flows within campus edge infrastructure, whilst ensuring an overall secure environment, is a critical challenge.



Edge evolution

EDGE EVOLUTIONS TAKE IN ON-SITE INDUSTRIAL DEPLOYMENT EXPERIENCES

On-site MEC deployments are addressing digital business challenges for industrial enterprises, while also familiarising the market with new technologies. These represent more than a sandbox for pilot projects to develop; they're real and they point the way to a mature industrial IoT, writes George Malim.

Multi-access edge computing (MEC) has rapidly become a hot technology trend, but it is not a new concept. The principle of hosting computing resources close to where processing needs to be performed has clear benefits in terms of reduced latency, and strong potential impacts on the cost and efficiency of operations. However, imagining that every device will have MEC capability embedded and that the days of centralised data storage and processing are gone is an over-simplification, and a red herring that distracts from the technology's real value.




The theoretical proposition of having MEC in every device is impractical and, certainly today, unnecessary. However, as understanding and capability at the edge have evolved, MEC deployments that deliver real business benefits right now are growing in number. These typically involve harnessing edge benefits in industrial machines, often in on-site scenarios, rather than a fantastical future in which every device has intelligence and is connected to everything else.

These real deployments are showing how the edge will be adopted and rolled out, and provide environments in which technologies can be deployed at scale and lessons learned from them. This is not to suggest that current edge deployments are restricted to pilot projects or trials, which is certainly not the case given the scale of projects and the size of investments made. Instead, the commercial reality of on-site MEC deployments is providing a means to rapidly mature approaches to security, application management, connectivity and resilience.

These slightly more contained deployments enable artificial intelligence (AI) and machine learning (ML) to be deployed and understood on-site before devices and machines move off campus into potentially more challenging operational environments. Issues such as security, coverage and device management can be more tightly controlled in on-site environments, enabling reduced-risk MEC deployment.

Industrial organisations are therefore leading the MEC adoption charge, partly because the business benefits are clearer and partly because infrastructure already exists that can be exploited. This is borne out by research from analyst firm Frost & Sullivan, which predicts that approximately 90% of industrial enterprises will utilise edge computing by 2022. That level of adoption by industrial enterprises represents immense growth prospects for MEC market participants, and the firm expects the value of the multi-access edge computing market to reach US$7.23 billion by 2024.


Industrial enterprises are a sweet spot for edge computing because of the clearly identifiable benefits the technology offers and the readiness of supporting infrastructure and technologies to enable industrial use cases. An important driver is the amount of computing power readily available at the enterprise edge at a sustainable cost. This resource is also scalable and flexible, allowing more compute power and storage to be added relatively simply as deployments scale up and system demand increases.

MEC also aids efficient and effective data handling for industrial organisations. In terms of augmenting data, the enterprise edge is an ideal location to bring together different data streams within a local facility. This data can range from machine performance information to building, supply chain or productivity information. The edge can also draw useful information from remote cloud locations, such as weather data, which can be used to optimise the performance of machines or their output.

However, MEC's impact on data goes beyond augmenting machine information: data can also be redacted. In on-site deployments this has less immediate value, because data transmission within a site is typically free; the network already exists and is owned by the enterprise, so the reduced-traffic benefits that redaction delivers in off-site deployments are largely negated. However, significant costs can still be saved in cloud data processing by redacting data nearer to end-devices.

Finally, on-site deployments benefit locally from the lower latency enabled by the high-speed local network, which allows analytics to be performed rapidly without the need to communicate information to cloud locations. Transmitting data for processing and then communicating an action back to an edge device delays the speed of response, and this can negatively affect industrial processes.
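The augment-and-redact pattern described above can be sketched in a few lines. This is an illustrative sketch only: the field names, the notion of "sensitive" fields, and the context data are all invented for the example and are not part of any Pelion API.

```python
# Illustrative sketch of edge-side data handling: augment a machine
# reading with locally available context, then redact fields that do
# not need to leave the site before forwarding to the cloud.
# All field names and rules here are hypothetical.

SENSITIVE_FIELDS = {"operator_id", "raw_waveform"}  # processed on-site only

def augment(reading: dict, local_context: dict) -> dict:
    """Merge a machine reading with locally available context data."""
    enriched = dict(reading)
    enriched.update(local_context)  # e.g. site identity, ambient conditions
    return enriched

def redact(reading: dict) -> dict:
    """Strip fields that should stay at the edge rather than go to the cloud."""
    return {k: v for k, v in reading.items() if k not in SENSITIVE_FIELDS}

reading = {"machine": "press-07", "temp_c": 81.4,
           "operator_id": "E1123", "raw_waveform": [0.1, 0.2]}
context = {"site": "plant-A", "ambient_temp_c": 19.0}

cloud_payload = redact(augment(reading, context))
# cloud_payload carries the enriched reading minus the on-site-only fields
```

The point of the sketch is the ordering: enrichment happens first, so local context can inform on-site analytics, and redaction happens last, so only the trimmed payload incurs cloud processing cost.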

Use-cases
The on-site industrial use-cases encompass everything from building management and controls to industrial IoT (IIoT) applications and manufacturing processes and systems. In building management, smart buildings rely on edge computing and the secure orchestration of applications run on edge gateways. This enables smooth operation of buildings and delivers cost savings or efficiency gains in energy consumption and more. When it comes to IIoT, large industrial companies are deploying cloud-based management of machines that relies on industrial-grade security and effective management of applications across the machine estate. Manufacturing companies utilise edge computing to report on machine usage, performance and data such as productivity, throughput and consumption of resources. All this leads to integrated, streamlined processes that improve performance, reduce environmental impact and close the feedback loop into product design and development. However, these benefits are not freely achieved, and they demand that industrial organisations address a series of challenges, both at launch and during the often long deployment lifecycle.

MEC challenges
That lengthy lifecycle is one of the main challenges facing the deployment of MEC. Internal research by Pelion, the wholly owned subsidiary of Arm that provides connected device platforms, IoT connectivity, and IoT device and edge applications management systems, has uncovered an expected 10-15 year lifespan for devices; in some industries, for specific machines, this can be even longer. This presents a significant challenge, as organisations today have to decide what to integrate into new devices. A parallel challenge is identifying how to retrofit functionality or upgrade existing equipment for the edge era. In some cases, the benefits of MEC could necessitate an early upgrade of machines, but this is a finely balanced equation with many edge benefits still uncertain. At the same time, the proliferation of IIoT devices is creating a wider deployed base of devices that can be, or already are, MEC-enabled. This device estate creates a greater number of data sources and enables much more of the value of MEC and IoT to be exposed to industrial organisations.

This increased data leads to more in-depth insights but also to larger data flows, placing greater burdens on organisations' data operations capabilities. To put it bluntly, more data means more processing, more insights and more actions to be taken. Operationally, organisations must therefore scale up their data operations to cope. Much of this will be automated, but skills must be grown and applied to ensure the relevance of the data that is processed and to maintain optimal performance.

The increase in IIoT devices on the shop floor has also created security loopholes: an influx of IIoT network points that lack in-built security features. These can appear insignificant and are easily overlooked, yet a little-used, esoteric system can be the point at which a breach occurs. Weak links in the IT ecosystem give attackers multiple access points to exploit, so MEC can play an important role in securing IIoT. The edge provides the isolation needed to make the network more fault-tolerant, self-healing and resilient, and the independence that edge computing enables can also be extended to legacy equipment. MEC creates an ecosystem of data management, analytics and local control that enhances security. This means IIoT and legacy equipment will capture, analyse and discard temporary data while sending only important or permanent data to a centralised server, thus reducing overall bandwidth costs, improving security and making networks more fault-tolerant.

A further headache for organisations that deploy MEC is the sheer number of different applications involved. Any industrial facility will involve machinery from multiple vendors, much of which will have proprietary systems and proprietary software. These need to interact to create a smart factory but also need to support the needs of the machine-maker. Often, these two approaches are not in alignment, and flexibility in the applications is required. Systems that enable both the customer and the device-maker to deploy their own applications alongside each other on the same edge device are starting to be used, enabled by technologies such as containerisation. Continuation down this path could see libraries or stores of MEC apps emerge that short-circuit the development cycle and accelerate time to operation. This area, although maturing rapidly, is relatively new.

A final, traditional concern is connectivity. The communications industry has made significant advances, especially in wireless technologies, to the extent that on-site deployments are well served by offerings such as LTE Cat-M1, private 5G or Wi-Fi 6. Further options exist for lower-bandwidth applications, such as narrowband IoT and

other low-power wide-area network (LPWAN) technologies. There is likely to be a connectivity option that suits the cost, bandwidth, security and ease-of-deployment requirements of most applications, so for many this will be one less complexity, although connectivity selection should not be trivialised.

The on-site environment provides a segregated area in which MEC-enabled devices can be configured to show their true business value to industrial organisations. Industrial machines are particularly well suited to showcase the benefits of the evolving edge, and the on-site market allows a 'safer' basis for deployment. The mass market, however, involves hyper-connected use-cases such as the tracking and monitoring of devices, cargos and machines as they traverse different locations. The security, connectivity, device management and other aspects of this are more complex, sophisticated and challenging. Costs must be balanced against business benefit, and clear calculations are harder to conclude at this stage. Potentially, the learning and familiarity achieved in on-site deployments can be transposed to off-campus deployments, and this is expected to feed through in the short to medium term. In the meantime, it's clear that the headroom for growth in uptake is enormous. Analyst firm Gartner reports

that 75% of data is still processed outside the cloud, while 85% of devices are currently deployed without cloud connectivity, which leaves huge scope for transformation. MEC is certain to play a significant role. In fact, the importance of the edge is being recognised to the extent that ABI Research expects AI chipset revenues from edge deployments to dethrone cloud as the leading market by 2025. At that point, edge AI chipsets will generate revenues of US$12 billion per year, outpacing the cloud AI chipset market, which will reach US$11.9 billion in 2025. Harnessing AI in concert with MEC will provide organisations with powerful tools as they engage in increased automation and robotics to achieve cyber-physical optimisation of their processes and move from being traditional product manufacturers to being providers of services. This servitisation will demand that products be tailored to users' requirements, transforming traditional facilities from high-volume production environments that churn out the same product into facilities that create high-value, use-case-optimised iterations in lower volumes. The business goal is increased margin through greater choice, but we are only at the start of this. It is the on-site deployments of MEC that are illustrating the art of the possible and providing a guide to what the connected, edge-enabled future may look like.



Case study

Building management specialist reduces deployment time from months to days with Pelion Device Management

When a well-known Fortune 100 corporation planned to significantly reduce the time required to deploy its building management systems (BMS) while simultaneously establishing a platform for additional functionality, it turned to Pelion to help implement the system quickly and more widely, and to further bolster the security of the ecosystem.

This organisation, a household name in building management and automation services, sought to introduce cloud connectivity and a frictionless device management experience. It also aimed to gain an edge over competitors by reducing the shipment and configuration time required to commission its industrial IoT analytics platform. Utilised in a range of applications, including aerospace, industrial environments, and buildings where HVAC and lighting systems are ubiquitous, these platforms are also often targeted as a means of gaining unauthorised access to an enterprise's infrastructure. Indeed, a stringent requirement for security as a foundational element was one of the organisation's main reasons for partnering with Pelion.

The Pelion team broke the project down into several key functional areas so the project would:
• Use secure device management for a frictionless deployment experience
• Deliver a Platform-as-a-Service model that replicates existing partner functionality
• Facilitate secure application enablement via containerised edge computing

The partner was already familiar with Pelion's lifecycle management architecture, anchored in a secure root-of-trust, and both organisations share a vision for computing at the edge. The Pelion team quickly got to work, creating containerised instances of the partner's cloud-native applications, enabling them to be rapidly and securely delivered as new gateways were deployed and activated.
The combination of container-based applications and cloud-based management provided all-important site-specific functionality, with the added advantage of autonomy, mitigating the impact of connectivity loss or hardware failure. It also allowed the partner to focus on delivering value to its platform's feature set, rather than wasting time and resources attempting to overcome the inefficiencies and complexities inherent in existing application deployment strategies.

By moving to an IoT-enabled model, organisations can take advantage of the edge gateway's protocol translation and remote management capabilities. These foundations deliver an integrated framework for the deployment and configuration of legacy devices, the ability to offer cloud services in remote, disparate locations, and the secure deployment of new applications and delivery of the data they gather. According to a senior manager in the organisation's product management group, remote management capabilities offered two distinct benefits:
• The ability to remotely create and deploy a fully functional IoT ecosystem in a matter of days gives a distinct competitive advantage at the tender stage.
• At the same time, the approach dramatically reduces development resource requirements for every solution deployed.

Through the partnership with Pelion, the organisation benefited from extensive collaboration in creating and optimising the application environment, maximising agility, and establishing


the solution's strong security foundation. A series of application programming interfaces (APIs) orchestrate the necessary interactions between the Pelion Edge Gateway, the Pelion IoT Device Management Cloud infrastructure, and the partner's cloud-based environment. The result provides a clear demarcation between the device management control plane and the data that drives the building management systems.

Ensuring frictionless service
Discussions with the Pelion team identified specific use-cases that threatened the service continuity and ease of deployment and management that the solution offered. As a result, the team utilised the cellular capabilities of Pelion IoT Connectivity to resolve three separate use-cases:
1. The partner's engineers could deploy their solution to construction sites and use a cellular connection to auto-discover downstream devices, even before wired or wireless network connectivity became available. This offered freedom for engineers at remote locations and continuity of service for facility managers.
2. Cellular backhaul offers continuity of service and communication with the cloud in an outage situation.
3. Cellular connectivity facilitated the remote management of unoccupied or sub-let premises, reaching gateways and resolving issues within the agreed service level agreement (SLA) without friction.

Futureproofing architecture
After a successful proof of concept piloting 5,200 gateways, the partner is now transitioning to full-scale production. While the solution initially used Mbed, Arm's real-time operating system, the partner has identified a preference for Linux in the medium term. "Our reference architecture uses an Arm-based processor that also supports Linux," said Deepak Poornachandra, Pelion's senior product manager for Cloud Gateways, confirming the solution's flexibility, "thereby allowing the partner to consolidate around one operating system should they choose to in the future."

To learn more about Pelion IoT Device Management, visit https://pelion.com/product/iot-device-management/


Solution overview

HOW PELION ENABLES THE EDGE APPLICATIONS ECOSYSTEM

Supporting scaled-out deployments that facilitate both legacy environments and innovative cloud-native applications on edge gateway systems is becoming vital. By extending decision-making to the on-premises edge, organisations see improved responsiveness, operational efficiency and increased security for IoT deployments.

Edge computing has emerged as a dominant industry trend, and the balance between edge and cloud computing is shifting. Meanwhile, IoT use-cases are becoming more expansive and sophisticated by the day. Deployments that allow applications to run on the edge rather than in the cloud improve latency, reduce bandwidth demands, and deliver robust privacy guarantees, ensuring that confidential data remains on-premises. Increasingly, IoT solutions based on high-performance embedded RISC processors and Linux-based systems are overshadowing the original operational technology (OT) and first-generation IoT deployments of the past. Dynamically managed from the cloud, these devices can be upgraded with new application software and optimised with tuned machine learning models. Over time, the operating system and application software require updating to fix bugs and patch vulnerabilities. As applications spread over a highly distributed and heterogeneous compute fabric, often resource-constrained, relatively insecure, and sometimes unreliable, application management and orchestration become a daunting challenge.

Pelion Device Management Edge addresses these challenges by offering an open, standards-based, and interoperable mechanism to seamlessly and securely package, distribute, and manage cloud-native IoT applications on edge gateways. In addition to extending full-featured IoT device management functionality to local devices, Edge also integrates support for legacy, non-IP endpoints through protocol translation.
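The protocol-translation idea can be sketched in miniature: raw register values from a legacy, non-IP endpoint (a Modbus-style device, say) are mapped into a normalised message a cloud service could consume. The register map, unit numbering and scaling factors below are invented for illustration; they are not Pelion's or any vendor's actual schema.

```python
# Hedged sketch of gateway protocol translation: convert raw register
# values from a legacy endpoint into named, scaled fields suitable for
# an IP/cloud pipeline. Register map and scales are hypothetical.

REGISTER_MAP = {
    0: ("temperature_c", 0.1),   # raw value assumed to be tenths of a degree
    1: ("pressure_kpa", 1.0),
    2: ("fault_code", 1.0),
}

def translate(unit_id: int, registers: dict) -> dict:
    """Map raw register readings to a normalised message."""
    msg = {"unit": unit_id}
    for addr, raw in registers.items():
        name, scale = REGISTER_MAP[addr]
        msg[name] = raw * scale
    return msg

msg = translate(7, {0: 815, 1: 101, 2: 0})
# msg["temperature_c"] is approximately 81.5; msg["unit"] == 7
```

In a real gateway this translation layer would sit between the fieldbus driver and the device management agent, so downstream systems only ever see the normalised representation.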
With Pelion Device Management Edge, operators create a unified environment for all on-premises requirements: compute, data acquisition and inference, and process automation. A unified solution is essential, because operators expect much of the gateway: in addition to its traditional OT/IoT device management role, it is now also the focal point for significant functionality. Operators, especially those in the industrial, manufacturing, and commercial sectors, need a solution that enables various use-case applications, interfaces with multiple systems, monitors for suspicious or

anomalous events, and maintains operating software across a range of downstream system entities.

Linux containers have become the standard mechanism to package and distribute portable applications, and they are a natural fit for gateway-based devices. For server-side cluster management and application deployment frameworks, Kubernetes, the open-source container-orchestration system, is a popular option. Pelion's view is that an IoT-optimised, Kubernetes-based container environment delivers the optimum strategy for edge computing. Fortunately, Kubernetes is an ideal platform for hosting and integrating both legacy and emerging applications. Extensibility and modularity ensure that operators can redeploy cloud-native applications, often without the need for extensive changes to core code or application programming interfaces (APIs).

As impressive as Kubernetes is, however, there's a vital need to complement its generic application deployment, scaling, and management automation functionality with a robust and extensible security framework that's fit for purpose. Considerations include hardening of the Linux OS, device identity protection and integration with the Trusted Platform Module, log integrity verification, network segmentation for individual container applications, authenticated image delivery via a secure registry, certificate status verification and revocation checking, and device attestation and integrity checking. Many of these considerations reflect the emergence of a standardisation framework for securing the systems and connectivity deployed in industrial networks. For example, the International Electrotechnical Commission has codified its approach to the requirement in the IEC 62443 specification, and Pelion Device Management Edge creates a compliance path for organisations seeking certification.

Pelion Device Management Edge offers a unique edge computing management platform.
It's based on open-source components, uses open APIs, and integrates well with the existing application management ecosystem and cloud-based

DevOps toolchains. Security for IoT devices is critical, from hardware to connectivity and into the cloud; end-to-end security is enabled from gateway provisioning to secrets management to application-level security. Pelion Edge provides a wide range of features that ensure device-to-data security, regardless of the industry, market, or application use-case, facilitating robust IoT solutions.

Crucially, Pelion Device Management Edge establishes a unified control plane for orchestrating hybrid cloud-edge applications. Capable of remotely deploying and managing microservices, Edge draws on a range of advances in cloud application management technology, such as rolling updates, containerisation, rollbacks, health monitoring and checkpointing. Based on a Kubernetes/Docker architecture, extensible containers support multiple backends such as virtual machines, unikernels, and higher-level artifacts.

Edge computing is entering the mainstream, and the emergence of diverse edge computing platforms enables a diverse range of use-case-enabling applications. Pelion is taking a progressive approach, crafted on a proven open-source architecture, to deliver an integrated, unified, and secure edge applications enablement ecosystem.
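The rolling-update-with-rollback pattern mentioned above can be illustrated with a minimal sketch. The `Gateway` class and its always-passing health check are stand-ins invented for the example; a real platform (Kubernetes or otherwise) supplies its own deployment and probing machinery.

```python
# Minimal sketch of a rolling update across a fleet of edge gateways,
# rolling every gateway back to the old version if any upgrade fails
# its health check. Gateway and healthy() are hypothetical stand-ins.

class Gateway:
    def __init__(self, name: str, version: str):
        self.name, self.version = name, version

    def deploy(self, version: str) -> None:
        self.version = version

    def healthy(self) -> bool:
        # Stand-in health check; a real one would probe the application.
        return True

def rolling_update(fleet: list, new_version: str, old_version: str) -> bool:
    """Upgrade gateways one at a time; roll back all on first failure."""
    updated = []
    for gw in fleet:
        gw.deploy(new_version)
        if not gw.healthy():
            for done in updated + [gw]:
                done.deploy(old_version)   # restore the known-good version
            return False
        updated.append(gw)
    return True

fleet = [Gateway(f"gw-{i}", "1.0") for i in range(3)]
ok = rolling_update(fleet, "1.1", "1.0")
# ok is True and every gateway now reports version "1.1"
```

Updating one gateway at a time keeps most of the fleet on a known-good version at every step, which is the property that makes rolling updates attractive for long-lived industrial deployments.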

EDGE APPLICATIONS MANAGEMENT Container Orchestration

EDGE DEVICE MANAGEMENT Protocol Translation for IoT endpoints

EDGE SYSTEMS MANAGEMENT Gateway Management



Optimize your business outcomes with IoT device management

Finding the right platform to manage your IoT deployment can be challenging, but it's the key to profitability and operational efficiency. Take control with Pelion IoT Device Management. Pelion provides everything you need to activate, manage and update your devices flexibly and securely from anywhere, including the cloud, on-premises or at the edge.

Greatly reduce your time to market: Quick start guides, standardized industry processes and easy deployment mean you can swiftly move from POC to profit.

Ensure your deployment stays relevant: Pelion's flexibility and wealth of supported protocols and devices allow you to keep up with technology as it develops.

Secure IoT from chip to cloud: Built-in access control, device authentication, certificate management and network security come standard for every customer.

Learn how Pelion simplifies the IoT device lifecycle and try it for free at pelion.com/product/iot-device-management

