

AI helps ease the coding load
Generative AI supports engineers by automating coding tasks and giving them time for higher-level programming









Inside the cabinet? Outside? You choose with our IPCs for edge computing and control

Measuring just 82 x 82 x 40 mm and with unparalleled mounting flexibility, the C6015 ultra-compact Industrial PC makes optimal use of even the smallest installation spaces. The particularly robust C7015 variant, on the other hand, delivers multi-core computing power directly to the machine to open up even more potential applications. The integrated Intel Atom® CPU with up to 4 cores enables simultaneous automation, visualization, and communication – even in demanding industrial IP65/67 applications. In addition to classic control tasks, both ultra-compact IPCs are particularly well-suited for use as a gateway for networking machines and system parts. Their exceptional computing power means this is even possible with complex pre-processing of large amounts of data. The integrated EtherCAT P connection on the C7015 provides additional flexibility for I/O expansions.




ACOPOStrak
ACOPOStrak is a revolution in adaptive manufacturing. This highly flexible transport system extends the economy of mass production down to batches of one. Parts and products are transported quickly and flexibly from processing station to processing station on independently controlled shuttles.
• Increase overall equipment effectiveness (OEE)
• Boost the return on investment in your machines (ROI)
• Shorten your time to market (TTM)
br-automation.com



cover story
AI helps ease the coding load
Generative AI supports engineers by automating coding tasks and giving them time for higher-level programming Anna Townshend, managing editor
machine input SDA brings scalability and flexibility
Software-defined automation reduces reliance on
hardware and supports machine learning
Mike Bacidore, editor in chief
motion control Hydraulic motion control revitalizes languishing legacy aerospace press
Macrodyne turns to Delta Motion for help with seemingly impossible press upgrade
Mike Dorian, Macrodyne Technologies
industrial networking What are the 7 layers of the OSI model?
How to understand the standards and protocols relevant to industrial control system networks
Patrick Bunn, Bunn Automation Consulting
product roundup Tool up for industrial networks
Gateways, modules, routers, switches, converters, I/O and terminals to transmit data
PHOTO: DEREK CHAMBERLAIN / SHUTTERSTOCK AI

Remote wireless devices connected to the Industrial Internet of Things (IIoT) run on Tadiran bobbin-type LiSOCl2 batteries.
Our batteries offer a winning combination: a patented hybrid layer capacitor (HLC) that delivers the high pulses required for two-way wireless communications; the widest temperature range of all; and the lowest self-discharge rate (0.7% per year), enabling our cells to last up to 4 times longer than the competition.
Looking to have your remote wireless device complete a 40-year marathon? Then team up with Tadiran batteries that last a lifetime.
Endeavor Business Media, LLC
30 Burton Hills Blvd, Ste. 185, Nashville, TN 37215 800-547-7377
CEO Chris Ferrell
COO Patrick Rains
CRO Paul Andrews
CDO Jacquie Niemiec
CALO Tracy Kane
CMO Amanda Landsaw
EVP Group Publisher Tracy Smith
VP/Market Leader - Engineering Design & Automation Group Keith Larson
editorial team
editor in chief Mike Bacidore mbacidore@endeavorb2b.com
managing editor Anna Townshend atownshend@endeavorb2b.com
digital editor Madison Ratcliff mratcliff@endeavorb2b.com
contributing editor Rick Rice rcrice.us@gmail.com
contributing editor Joey Stubbs
contributing editor Tobey Strauch tobeylstrauch@gmail.com
contributing editor Charles Palmer charles101143@gmail.com
columnist Jeremy Pollard jpollard@tsuonline.com
design/production
production manager Rita Fitzgerald rfitzgerald@endeavorb2b.com
ad services manager Jennifer George jgeorge@endeavorb2b.com
art director Derek Chamberlain
subscriptions
Local: 847-559-7598 • Toll free: 877-382-9187 email: ControlDesign@omeda.com
sales team
Account Manager
Greg Zamin gzamin@endeavorb2b.com 704/256-5433 Fax: 704/256-5434
Account Manager
Jeff Mylin jmylin@endeavorb2b.com
847/516-5879 Fax: 630/625-1124
Account Manager
Kurt Belisle kbelisle@endeavorb2b.com
815/549-1034



Mike Bacidore editor in chief mbacidore@endeavorb2b.com
Where to find innovative ideas
SPRING IS SPRUNG. Time to start making plans for upcoming conferences and trade shows. Two of my favorites—Automate and the CSIA Conference—are sure to be chock full of great ideas, innovative products and enlightening speakers.
The Association for Advancing Automation (A3) anticipates robust attendance, hundreds of exhibitors and speakers delivering informative conference sessions at Automate 2025, May 12-15 in Detroit.
Local hero Brad Holmes, executive vice president and general manager of the Detroit Lions, will be the opening-day keynote speaker, but the Automate Show and Conference is all about robotics and automation.
Holmes will talk about strategies and principles that can be used to transform an organization into a powerhouse, using the Lions’ inspiring resurgence as a case study.
“Brad Holmes has played an instrumental role in turning the Detroit Lions into one of the most exciting and respected professional football teams in the NFL,” said Jeff Burnstein, president of A3. “I cannot think of a person who has more to offer in terms of inspiration and insights into how an organization can tackle complicated and difficult challenges. We think the attendees of Automate 2025 can learn a lot about leadership, collaboration and how to develop talent from Holmes.”
Automate will take place at Detroit’s Huntington Place, and keynotes are free to all show and conference attendees.
While the 320,000-sq-ft show floor is accessible for free, there is a fee to attend the four-day conference portion of the event. In 2024, the conference featured more than 140 sessions across 20 topic paths for beginners and advanced professionals alike. Last year, the Automate Show and Conference took place in Chicago and attracted more than 40,000 registrants from states across the United States and countries around the world. Anyone ages 12 and older can attend the show to see the latest in robotics, vision, artificial intelligence and motion control.
The Control System Integrators Association (CSIA) will host its 2025 CSIA Conference June 2-5 in San Diego.
This year’s theme, “Elevate Your Game,” is an opportunity to level up—to grow, learn and lead in the system-integration and technology landscape.
System integrators will learn innovative strategies to onboard new engineers, empower emerging leaders and create a culture that prioritizes growth and adaptability. There will be insights into staying ahead through digital transformation and actionable ideas for building strong, resilient teams that drive success.
Optional workshops will dig deeper into CSIA best practices and topics such as business risk and marketing.
After 10 years at the helm of CSIA, Jose Rivera plans to step down from his duties as chief executive officer. He will give the opening address and prep the conference attendees for nonstop learning.
George Young, vice president, global digital and cyber services, at Rockwell Automation will then deliver a keynote on Industry 4.0 and how to avoid the potholes on a digital-transformation journey.
Alex Chausovsky, director of analytics and consulting at Bundy Group, will deliver his always-insightful annual update on the economy and labor market.
General sessions will include presentations on pricing power, digital transformation as a service, professional development, how to manage key performance indicators (KPIs), how to manage growth, practical artificial-intelligence (AI) applications, open process automation (OPA) opportunities and how to navigate mergers and acquisitions.

Jeremy Pollard jpollard@tsuonline.com

How to survive North American tariff talk
“YOU THINK YOU’RE gonna break up. Then she says she wants to make up.” Those lyrics from the 10cc song, “The Things We Do for Love,” seem appropriate to the North American and global trade environment. The proposed tariffs could affect projects now and in the future.
Most imported automation products will cost more due to potential import tariffs. Also remember that automation products must be certified by agencies like Underwriters Laboratories (UL) and the Canadian Standards Association (CSA).
Not too long ago, a chip maker discontinued a chip set used in a vendor’s drive family. A redesign had to be done, and the product had to be recertified. Creating a new product is not a simple process. Existing products must be used in the short term, and they have to be affordable and available for projects to proceed.
Partnerships have been crucial to the sustainability of products in our industry. Chips, boards and semiconductor devices have to be reliable and cost-effective for the vendors to use in their products. Costing for the user must be consistent.
Steel, aluminum, wiring/cabling and devices are all needed to make up the hardware of a machine. The cost of a machine, such as a headlight assembly-and-test line, has both hard and soft components.
I am reminded of a time in the early 1980s when our office received a shipment of analog input programmable-logic-controller (PLC) modules. It wasn’t supposed to come to us directly, but it did.
The invoice inside charged the Canadian entity something like $230. The customer was being charged in excess of $2,400 in Canadian dollars. Why was that? The answer is unclear, but it seemed that the Canadian entity took the tax hit on the profits instead of the U.S. vendor.
So, the gamesmanship of business has back doors and deals that most don’t know about. But what we do know is that, based on the amount of automation business, there isn’t a lot of room for errors in dollars.
Budgets and project costs have already been determined, and one wonders how current projects might be affected. Stellantis has already put the brakes on a $1.3 billion retooling project in Brampton, Canada, based on the disruptions that may be occurring.
So, stability of trade in project costs has to be top of mind. Every industrial project has components of automation and robotics. Any unforeseen cost increase can put the project in jeopardy. Moving the manufacturing of automation components back onshore will take a huge investment in dollars and time, which is not practical in the short term.
There is also the cost of labor, installation, commissioning and software that adds to the cost of the machine. There may be a workaround by elevating certain costs to soft costs, so the tariffed hard costs could be reduced.
Global companies that build catalog-type machines, such as a high-pressure injection-molding machine, would be tariffed on the catalog cost, which is like buying a light bulb. The cost is the cost.
That project cost would be negatively affected, and maybe that project loses its viability due to the increased costs.
Part of the argument against restrictions and disruption is that assembly plants on both sides of the border perform a singular function. From a piece of steel, a windshield wiper motor mount is made, and it may go back and forth across a border multiple times. Each time it crosses, the part’s increased value incurs additional tariff costs, which can ramp up the cost of this one component by orders of magnitude beyond a tariff on the finished good alone.
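The compounding effect described above can be sketched with a toy calculation. The 25% rate and the cost figures here are hypothetical, chosen only to show how a tariff levied on a part's full value at each crossing snowballs:

```python
# Illustrative only: a tariff charged on a part's accumulated value at
# every border crossing. Rates and dollar figures are hypothetical.

def landed_cost(base_cost, value_added_per_step, crossings, tariff_rate):
    """Apply value-add, then a tariff on the full accumulated value, per crossing."""
    cost = base_cost
    for _ in range(crossings):
        cost += value_added_per_step   # machining/assembly adds value
        cost *= (1 + tariff_rate)      # tariff hits the whole value again
    return cost

no_tariff = landed_cost(10.0, 5.0, 4, 0.0)      # $30 of actual content
with_tariff = landed_cost(10.0, 5.0, 4, 0.25)   # roughly double, ~$60.45
print(no_tariff, round(with_tariff, 2))
```

Four crossings at a hypothetical 25% roughly double the landed cost of a $30 part, far more than a single 25% tariff on the finished component would.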
Automation projects will be in for a cost rise based on these trade barriers, which may force the use of inferior products in applications they weren’t designed for. A “do with what you have” mindset may create more headaches. It’s all about the money.
So, break up already. Or don’t. We need stability in our automation world.
JEREMY POLLARD, CET, has been writing about technology and software issues for many years. Pollard has been involved in control system programming and training for more than 25 years.

Rick Rice contributing editor rcrice.us@gmail.com
How to remove servo phobia
NOT TOO LONG AGO, my employer was setting up a new production line. There was a lot of money involved with new-to-us vendors and technology. It’s been a bit of a journey, but our team has finally embraced servo technology, and we are reaping the benefits.
As a co-packer, we would traditionally just throw people at a challenge and do whatever we need to do to make product. With a fully automated line, we don’t have that option.
For this particular line, we needed to be able to erect cases, automatically, that could be left- or right-hand-oriented opening. This wouldn’t be an issue if we had people opening the cases, but to do this using automation was, as it turned out, a formidable challenge. In fact, for the case speeds we were facing, we were only able to find one vendor who produced a machine capable of doing this.
The challenge for us was how to fit that machine into our packaging line. As it turned out, we couldn’t. However, for the first time in our history, we decided to locate the case erector on the floor above packaging and send the cases down to the floor below. Our approach was to install a stainless-steel chute to transition the now-erected cases down to the case packer on the first level.
There is quite a challenge to create a steep enough angle to allow for the free flow of product while not having that product launch when it hits terminal velocity.
We didn’t have a lot of horizontal distance to provide a gradual slow-down section to the horizontal plane, and we had to ensure the empty cases didn’t just stall on the deceleration section and fail to make it to the case-packer infeed.
Our solution was to provide a means to both slow down the cases and power them onto the infeed of the case-packing machine. The concept was to mount a soft wheel on the side of the chute that had a bit of compressibility to its construction. We toyed with controlling this with a variable-frequency drive but, ultimately, elected to use a stand-alone servo motor with built-in drive.
Stand-alone drive/motor combinations have been around for a while, so it’s not necessarily a new technology, but what made this particular decision easier for us is the availability of smaller, fractional-horsepower solutions in this integrated package platform.
Our case packer would simply send a digital signal to indicate when the infeed belt was calling for more cases, and our metering system would cycle on and off to match. Since we now had a way to both hold back product and positively index individual cases out when we needed them, it was the perfect solution to our challenge. Servos have a lot of torque, and this gave us another advantage, in that we could use the vertical section of our chute to accumulate product and, using a photo eye, cycle the conveyor at the top of the chute to keep that accumulated section full.
The drop section, once the accumulated section was loaded, was shortened from 14 feet down to only 5 feet. The cases don’t even come close to reaching terminal velocity, so the transition is gentle with very little bounce back from empty cases colliding.
One of the more entertaining things that I get to do in my current position is give life to older machinery—for example, a machine that automatically dispenses cups used for microwavable products and then cycles those cups through filling and sealing stations before ejecting the finished product out onto a takeaway conveyor for further packaging.
When we first got into this type of business, we purchased a piece of used machinery with the thought that we would give this a try, and, if it wasn’t for us, then we weren’t out much of an investment. Well, we soon discovered that, while the machine came with a control system, it was older technology, and we needed to give it an upgrade to something newer.
This particular machine was completely mechanical in nature. The main drive was an indexing gear box with various pneumatic devices operating the cup dispenser, sealing and cup-eject stations.
We started with an older PLC platform, and that worked for us for the first year or so of operation. We started to get a little more sophisticated, especially with the peripheral devices, and set upon a goal of eventually replacing some of the mechanically cammed operations with servo versions. While we didn’t immediately jump into a servo system, we did recognize that our older PLC platform wouldn’t allow us to add a servo system. The first step in our journey was to replace that older PLC with a modern programmable automation controller (PAC) with motion capabilities.

One axis or fifty.
Servo hydraulic or servo electric. Position, velocity, or force control. Direct connection or through EtherCAT.
Delta RMC Motion Controllers and graphical RMCTools software make complex motion easier, smoother, and more precise.
Drive over to deltamotion.com or call 1-360-254-8688.
Get your next project moving forward more quickly than you thought possible!

The hardest part of the upgrade was converting the PLC application into the newer processor. We reassigned the human-machine interface (HMI) tags to mate up with the new processor and spent some time to completely re-commission the PAC and HMI off-line to make sure we were ready to go. The actual PLC swap out took less than two eight-hour days, and we were back in business.
The next step took a little longer than we had planned. In fact, we ended up hiring an independent machine builder to make us a similar machine, with learnings from our first experiences at cup filling/sealing.
The new machine included the same processor and HMI as our first machine so that we could include it in our plans for a fully operational servo machine. Included in this newer model was a servo main cycle drive with indexing capabilities to match the mechanical version on the first. It’s important to mention here that not all journeys need to be fully immersed. Sometimes, it’s better to just get one’s feet wet and figure out the process as you become more comfortable with it.
With cup-filling machine versions 1.0 and 1.5 in operation, we were excited to get on with the next part of the journey and started converting the air-operated seal heads to servo operation. We decided to do this operation next because the most important function of the whole machine is properly sealing the film to the top of the cup.
Sealing is dependent on two very important parameters—time and pressure. If either one is off, the seals are not sufficient. With a pneumatically operated plunge, we were limited
by the distance of the stroke on the cylinder and how much dwell time we had while the head was in contact with the cup. We were limited on how fast the machine could run because the two critical elements, time and pressure, were fixed due to the pneumatic operation. It was clear that servo motion, cammed to the main cycle shaft, would give us the flexibility to make good seals and increase the overall speed of the machine.
On our pneumatic version, film could easily melt if the machine paused with the head too close to the seal position, so the original machine builder specified a long stroke on the cylinder. Keeping the seal head farther away from the seal position adds extra time in the cycle that can’t be altered. The great thing about servo motion is we can break the functions up into different strokes, depending on where the machine is in the cycle.
At a full machine reset, the seal heads are positioned far above the sealing surface to allow for manual functions like threading the film through the machine and clearing machine jams. Upon resumption of cycle, the head can be advanced down to a dwell position that is just high enough to prevent damage to the film. Seal cycle times are significantly shortened, and we get the bonus of the torque that a servo can apply to the stop position. The two key elements are accomplished—time in the seal position and pressure on the film on top of the cup.
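The multi-stroke logic described above can be sketched roughly as follows. This is a hypothetical illustration; the state names and positions are invented for the example, not taken from the actual machine:

```python
# Hypothetical sketch of multi-stroke seal-head positioning. State names
# and positions (mm above the seal surface) are invented for illustration.

RESET_POS = 150.0   # full retract: room to thread film and clear jams
DWELL_POS = 8.0     # hover just above the film between seal cycles
SEAL_POS = 0.0      # head in contact with the film on the cup rim

def seal_head_target(state):
    """Return the commanded servo position for a given machine state."""
    targets = {
        "reset": RESET_POS,    # manual functions allowed
        "dwell": DWELL_POS,    # short stroke enables a fast seal cycle
        "sealing": SEAL_POS,   # servo torque holds seal pressure
    }
    return targets[state]

print(seal_head_target("dwell"))
```

The point of the design is visible in the numbers: the working stroke shrinks from the full retract distance to the short dwell-to-seal distance, which is where the cycle time is saved.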
We ended up buying a third machine from our machine-building partner before we got a chance to implement the changes to our version 1.5 machine.
Happily, we felt confident enough with our trials with that machine that we worked with our vendor to go full
servo on the third machine by adding not just the sealing operation, but the cup-dispensing and cup-eject stations, as well.
RICK RICE is a controls engineer at Crest Foods (www.crestfoods.com), a dry-foods manufacturing and packaging company in Ashton, Illinois.
component considerations
Tobey Strauch contributing editor tobeylstrauch@gmail.com

How PLCs and PACs are different and alike
PROGRAMMABLE LOGIC CONTROLLERS (PLCS) have been a mainstay in industrial automation since the 1960s. Many modern platforms are more like a computer that can run ladder logic as well as higher-level programming languages like C/C++ or Python scripts. Such a platform is called a programmable automation controller (PAC).
Some generalized comparisons between the PLC and PAC can be categorized under purpose, programming, memory, application and costs. As you might guess, the PAC has more memory and more versatility in applications and programming languages. PLCs cost less, but a PAC may be used to host the ladder logic, run the human-machine interface (HMI) and include a small historian for data forwarding. Which platform is the best choice for machine builders? That depends on the application and on discussions with the customer about which platforms they already use in their operations. However, there are options.
The other highlight of PACs is the increased remote input/output (I/O) capacity. PLCs may be single-racked and have limited I/O capacity, but PACs generally can handle 200 or more remote I/O nodes. This is dependent on which vendor system you choose, but the amount of I/O interface with PACs is higher than with PLCs. Much of this is due to networking, and, if the PAC has layered control, then it will have more network capacity.
If the application is relegated to one machine with limited I/O and a limited footprint, then I would choose a PLC. If you are working with a large machine that needs a group of controllers, a high-level HMI in more than one location around the machine, and feedback to a higher-level corporate system, then a PAC is a good idea. One difference between the PLC and the PAC is that the PAC may have more memory or dual cards/cores, which supports multiple applications or increased architecture complexity.
PACs are more integral in system safety, as well. Some control architectures set up a card in the rack to be dedicated to safety. Others modularize the software, so the safety is separated out modularly via memory on the controller. Simple PLCs would not have this capacity. In this way, PAC technology has allowed us to advance to smart relays and safety PLCs. One manufacturer has created a safety PLC that can be programmed with CoDeSys and meets safety-integrity-level (SIL) and performance-level (PL) requirements as high as SIL3/PLe.
Since CoDeSys can be programmed with C++, there is the capacity to do safety logic in C++. So, technically, this small PLC is a PAC because of its C++ support and higher memory capacity.
PACs can also be used with Ethernet-compatible instrumentation, skipping the remote I/O racks entirely. For instance, with IO-Link and a proper Ethernet server, a PAC can become an edge computer, taking in inputs, handling networking signals and controlling the machine. The debate among PLC, PAC and local vs. distributed control systems will continue, and it will keep changing. Why? Technology is advancing, and industrial automation is becoming more open, more modular and more integrable. Advancing the PLC to the PAC is allowing this versatility to grow.
Imagine using a Linux-based system with container software like Docker. Picture breaking out the control inputs, the control processing and then the outputs to the machine, and the overall-equipment-effectiveness (OEE) and enterprise-network type of outputs. Programmable automation controllers will allow this kind of activity.
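The modular split described above can be illustrated with a toy sketch. All names here are hypothetical, and simple queues stand in for the networking that containers would use in practice:

```python
# Toy illustration of decoupled control stages: acquisition, control
# logic and OEE reporting. In practice each stage could run in its own
# container; here, queues stand in for the container network.
from queue import Queue

inputs, outputs, oee = Queue(), Queue(), Queue()

def acquire(raw_samples):
    """Input stage: push raw sensor samples downstream."""
    for sample in raw_samples:
        inputs.put(sample)

def control_step(threshold=50):
    """Processing stage: threshold logic plus OEE data forwarding."""
    sensor = inputs.get()
    outputs.put(sensor > threshold)   # machine output decision
    oee.put({"sample": sensor})       # same data forwarded for OEE metrics

acquire([42, 77])
control_step()
control_step()
print(outputs.get(), outputs.get())   # False True
```

Because each stage only touches its queues, any one of them can be replaced or moved to different hardware without rewriting the others, which is the appeal of the container approach.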
The increased modularity pushes industrial automation forward into Industry 4.0 and more plug-and-play applications. Many controller manufacturers have offered container-type applications on their PACs. Others are following suit with Linux-based real-time control.
No matter the platform, machine builders have expanding options as PACs continue to develop. This is also exciting as far as adaptive automation. PACs allow the integration of AI modules and designs that can adapt dynamically.
Tobey Strauch is an independent principal industrial controls engineer.

Charles Palmer contributing editor
Safety sensors provide protection
INDUSTRIAL SAFETY SENSORS are critical for ensuring the safety and protection of workers, machinery and the overall production process in various industries. They use a combination of different technologies to monitor conditions and detect hazards.
Proximity sensors
These sensors comprise inductive, capacitive, ultrasonic and laser technologies and are used to detect the presence or absence of an object without physical contact. Inductive sensors work with metal objects; capacitive sensors detect a wide range of materials; and ultrasonic and laser sensors can detect the distance to objects. Applications include machine safeguarding, positioning and motion detection.
Pressure sensors
Typically, industrial pressure sensors have been centered around piezoelectric technology, which offers optimal linearity, accuracy and stability; capacitive, strain-gauge and optical pressure sensors make up the balance. Their function is to monitor pressure levels in systems such as pressure vessels, piping systems, hydraulic circuits, pneumatic equipment or tanks, ensuring that pressure stays within safe operating limits. Applications include gas and liquid pressure monitoring, pipeline safety and tank level monitoring.
Temperature sensors
In industrial applications, thermocouples, such as types K or J, are commonly used. The highest-temperature thermocouples are Type C, made from tungsten-rhenium alloys, and Types B, R and S, which are platinum-rhodium alloys, with capabilities in the 1,000 °C to 2,315 °C temperature range.
For high-accuracy temperature measurements below 400 °C, resistance temperature detectors (RTDs), typically Pt100 (100 Ω at 0 °C) and Pt1000 (1,000 Ω at 0 °C), are the most commonly used, the latter having 10 times the resolution of the former.
Thermistors and infrared sensors are additional alternatives. The very latest temperature-related sensors are known as thermochromic; in these products, a color change based on temperature ranges provides 24/7 safety monitoring to give a predictive warning on overheated equipment. The permanent quality of the change highlights issues in straight and non-continuous load applications, protecting against electrical fire or equipment failure.
All these are used to measure temperature in machines, equipment and processes. Overheating is a significant safety risk, so these sensors help prevent damage and hazards. Typical applications include over-temperature protection in machines, processes, motors and electrical systems.
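As a worked example of the RTD figures above, here is a short sketch converting a Pt100 resistance reading to temperature using the standard IEC 60751 Callendar-Van Dusen coefficients (valid above 0 °C; the helper function name is our own):

```python
# Convert Pt100/Pt1000 resistance to temperature using the IEC 60751
# Callendar-Van Dusen relation R(T) = R0 * (1 + A*T + B*T^2), T >= 0 degC.
import math

A = 3.9083e-3    # standard IEC 60751 coefficients for platinum RTDs
B = -5.775e-7

def rtd_temperature(resistance_ohm, r0=100.0):
    """Solve R = r0*(1 + A*T + B*T^2) for T in degC (0-400 degC range)."""
    # Rearranged quadratic: B*T^2 + A*T + (1 - R/r0) = 0
    c = 1.0 - resistance_ohm / r0
    return (-A + math.sqrt(A * A - 4 * B * c)) / (2 * B)

print(round(rtd_temperature(100.0), 2))    # 0.0 degC for a Pt100
print(round(rtd_temperature(138.51), 1))   # ~100 degC for a Pt100
```

The same function handles a Pt1000 by passing `r0=1000.0`, since the coefficients are properties of the platinum, not of the nominal resistance.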
Gas and chemical sensors
These safety sensors comprise electrochemical, catalytic, infrared (IR) and metal oxide semiconductor (MOS) technologies to detect hazardous gases like carbon monoxide (CO), methane (CH4), hydrogen sulphide (H2S), oxygen (O2), and toxic chemical vapors. They are essential for ensuring a safe working environment, especially in confined spaces. Their applications include leak detection, air quality monitoring and toxic gas detection in industrial plants, oil rigs and mining.
Smoke and fire sensors
In this application, ionization, photoelectric, thermistorbased and infrared technologies are used to detect smoke particles, heat and flames to prevent fire hazards. Applications include fire detection in manufacturing facilities, warehouses and high-risk environments.
Vibration sensors
Vibration sensors can be traced back to the 1920s and comprise piezoelectric, accelerometers and strain gauge technologies. In the 1960s, tunable analog filters were added to meters so that users could discriminate between frequencies. Around this same time, fast Fourier transform (FFT) vibration detection and analysis started increasing.
Fast Fourier transform is an algorithm for transforming a time-domain signal into a frequency-domain signal. These systems measure vibrations or oscillations in machinery.
Excessive vibration can indicate malfunction or risk of failure. Their applications include vibration monitoring for rotating equipment like motors, pumps and turbines.
Sensors and predictive analytics lead to an IIoT-enhanced, prescriptive-maintenance future, in which software will tell maintenance teams when, where and why to perform maintenance, providing advance safety warnings.
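A minimal sketch of the FFT analysis described above, using a synthetic signal (NumPy assumed available; the frequencies are made up for illustration):

```python
# Minimal sketch: finding the dominant vibration frequency with an FFT.
# Synthetic signal: a 50 Hz rotation component plus a smaller 120 Hz tone.
import numpy as np

fs = 1000                              # sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)          # 1 s of time-domain data
signal = np.sin(2*np.pi*50*t) + 0.4*np.sin(2*np.pi*120*t)

spectrum = np.abs(np.fft.rfft(signal))       # frequency-domain magnitudes
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
dominant = freqs[np.argmax(spectrum)]
print(dominant)                        # 50.0
```

In a monitoring system, a growing peak at an unexpected frequency (here, the 120 Hz component) is the kind of signature that flags a bearing or imbalance fault before failure.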
Motion sensors
There are three types of motion sensors that are used frequently. The function of passive infrared (PIR), microwave and dual tech/hybrid—ultrasonic, microwave and radar—is to detect motion of objects or people in the environment. These sensors can trigger alarms, stop machinery or activate safety features. Typical applications include safety light curtains, machine guarding, intrusion detection and employee safety monitoring.
Safety light curtains and grids
Infrared light beams are applied where a barrier of light beams is created across a defined area. If a person or object crosses the light curtain, the system triggers a safety shutdown or alert. Typical safety applications include machine safeguarding, robot protection and access control to hazardous zones.
Optical sensors
Typical optical sensors comprise photoelectric, laser and optical reflection devices and are used to detect the presence of objects or humans based on light interference or reflections. They can measure distance or detect motion. They are used as safety barriers, conveyor system monitoring and machine operation control.
Radar and lidar sensors
These make use of radio waves and light waves. Their function is the application of radar or laser signals to detect objects, measure distances and avoid collisions.
Lidar is often used for precise location mapping. They are typically used for automated vehicle safety, navigation in warehouses and collision avoidance in industrial robots.
Safety relays and controllers
These items comprise smart relays, electronic control and programmable logic controllers (PLCs), and their function is to integrate signals from various safety sensors and control safety mechanisms, like stopping machinery or activating alarms when unsafe conditions are detected. They are widely used in machine control systems, emergency shutdown systems, and safety interlocks.
Load and force sensors
This technology comprises strain gauges, piezoelectric and capacitive sensors, which are used to measure the force, load or weight applied to a machine or structure, ensuring it remains within safe operational limits.
When any of these load and force sensors are applied to applications such as load monitoring in cranes, lifting systems and mechanical equipment, they provide the necessary safety guarantees.
Machine vision systems
These systems include cameras, image processing and artificial intelligence (AI) algorithms, and they monitor and detect hazards in real time, such as detecting if an operator is too close to a dangerous machine. Industrial applications include visual inspection, hazardous zone detection and operator safety.
Environmental sensors
Environmental safety sensors protect workers and equipment from hazardous conditions.
In these applications, humidity, temperature and particulate-matter sensors detect changes in environmental conditions that may pose a risk to both people and machinery, such as extreme humidity or temperature levels.
Common applications include ensuring safety in environments such as clean rooms, data centers and chemical processing plants.
These technologies often work together to provide comprehensive safety solutions in industrial settings, offering early warnings, preventing accidents and minimizing damage to personnel and equipment.
Charles Palmer is a process control specialist and lecturer at Charles Palmer Consulting (CPC). Contact him at charles101143@gmail.com.
machine input
SDA brings scalability and flexibility
Software-defined automation reduces reliance on proprietary hardware and supports machine learning
by Mike Bacidore, editor in chief
DAVY DEMEYER IS founder of Acceleer, a Belgian company specializing in collaborative design specifications, code generation and automated deployment to test, staging and production environments. Demeyer has been talking about software-defined automation (SDA) for years. He launched Acceleer in 2024 specifically to help speed up the transition to SDA. Roy Krans is software development manager at ACS, a certified member of the Control System Integrators Association (CSIA). Headquartered in Verona, Wisconsin, ACS specializes in equipment, testing, process systems, automation and controls. Krans shared his insights on the rise of software-defined automation. Nathaniel Scroggins is product marketing manager, connection technology, at Balluff. Garrett Wagg is product manager, ctrlX Automation, at Bosch Rexroth. Andre Babineau is marketing director of the Next-Generation Industrial Automation Incubator at Schneider Electric.
What is the primary focus of software-defined automation (SDA)?

Andre Babineau, marketing director, Next-Generation Industrial Automation Incubator, Schneider Electric: The primary focus of software-defined automation (SDA) is to enable greater flexibility, interoperability and portability in industrial automation by decoupling software from hardware. This approach allows industries to design automation solutions without being constrained by specific hardware capabilities, fostering seamless integration across different vendors’ equipment and platforms. By relying on standardized interfaces and leveraging the UniversalAutomation.Org runtime, SDA enhances system scalability, efficiency and cost-effectiveness, while empowering businesses to respond more swiftly to changing operational needs and market demands. It also supports sustainable practices by optimizing energy use and minimizing operational complexity.

Garrett Wagg , product manager, ctrlX Automation, Bosch Rexroth : The primary focus of SDA is to improve and expand industrial or enterprise processes by using software to control and manage automated systems. This allows for optimal data collection and communication.

Nathaniel Scroggins, product marketing manager, connection technology, Balluff: The primary focus of software-defined automation is to automate processes with software infrastructure in mind. This is a mindset change for an automation world that has gotten used to designing with specific hardware as the primary focus. That approach can lead to a lack of flexibility and interchangeability within a system’s architecture.
The SDA approach emphasizes flexible, open-source, universal protocols and agnostic platforms that can use any vendor’s hardware, making it easier for users to manage and scale their systems over time. This approach also opens the door for more rapid deployment and troubleshooting of assets.
When designing with a hardware-defined approach, you are not always providing the user with the ability to make changes as their needs evolve or their systems need to scale. This can happen often when using proprietary protocols or hardware/software combined products that are unique to that vendor. We see examples of this when looking at programmable logic controllers (PLCs) that are locked down or don’t have the ability to connect to IT networks, causing the user to have to change PLCs and reprogram the machine to open it up to their network or even to just make simple changes to machine configuration.
Hardware-defined automation is rigid and requires manual/physical adjustments, whereas software-defined automation provides users with the ability to make more fluid and quick changes when necessary.

Roy Krans, software development manager, ACS: SDA focuses on software-driven automation versus hardware-driven automation. This moves some or all of the automation logic to cloud- or edge-based platforms versus local proprietary hardware, such as a programmable logic controller (PLC).
SDA systems have existed for some time but are now expanding their range and penetration to where even individual sensors are remote-accessible and previously test-lab-scoped systems are now accessible worldwide.

Davy Demeyer, founder, Acceleer: The two main targets are decoupling the software runtimes from the hardware, which is something that, for example, Open Process Automation (OPA) accomplishes, and making applications and data manageable through software instead of through graphical user interfaces (GUIs).
Most of the focus today is on the former, but the biggest gains will come from the latter, especially since we’re moving to a world where artificial-intelligence (AI) agents will help us manage both the engineering and production workflows.
Graphical user interfaces are not really a limit for AI agents, but they will just be too slow, so there will be a tendency to select applications that are highly performant in automated workflows.
What are the primary benefits of software-defined automation?

Davy Demeyer, founder, Acceleer : The main benefits are speed, scalability and flexibility. For scalability, it is not only the scalability of the automation layer itself, but also for applications that build on the automation layer, like data analytics, digital twins and the manufacturing execution system (MES).
One example benefit is that it will be very easy to quickly spin up testing and staging environments that are almost identical, like the production environment.
Another example is that, in a big organization, it will be much easier for the global engineering team to get a quick overview of all deployed control systems they have in their worldwide organization, even if it spans hundreds of plants.
A final example is that it will allow compressing the engineering workflows and time to market.

Andre Babineau , marketing director, Next-Generation Industrial Automation Incubator, Schneider Electric : Software-defined automation offers a range of
valuable benefits. One of the main advantages is its flexibility and portability, allowing automation applications to be deployed across different hardware platforms without requiring major reengineering or testing. This makes systems much more adaptable to changing needs. Additionally, it offers scalability and efficiency by minimizing downtime when making changes to automation systems, which helps operations run smoothly and makes it easier to integrate equipment from different vendors.
Cost efficiency is another significant benefit, as the vendor-agnostic nature of software-defined automation helps lower both capital and operational costs, while also eliminating the need for large hardware inventories. It also empowers the workforce by shifting the focus to broader skill sets rather than specialized knowledge, making it easier to attract and retain talent. Finally, software-defined automation enhances interoperability by ensuring seamless communication between systems from different vendors, cutting down on the complexity and cost of using gateways and improving overall multi-vendor compatibility.
Another benefit from SDA is the ability to get rid of obsolescence from a software and a hardware perspective contributing to lower total cost of ownership (TCO).

Garrett Wagg , product manager, ctrlX Automation, Bosch Rexroth : The primary benefit of SDA is detaching the control logic from the physical machinery and allowing you to control, monitor and adjust automated processes through software interfaces rather than rewiring or manually adjusting a piece of hardware.

Nathaniel Scroggins , product marketing manager, connection technology, Balluff : The primary benefits of an SDA approach lie in its ability to provide and maintain flexibility and interchangeability. Users’ system architectures are all continuously evolving and scaling as we push deeper into an Industry 4.0 landscape. This means that having the ability to swap out devices or systems as needed with minimal impact to the existing environment is important.

Roy Krans , software development manager, ACS : Benefits include less reliance on proprietary hardware; enabled remote monitoring and updates; quicker deployment and updates; and support of enterprise-level machine learning, maintenance and optimizations.
How does software-defined automation figure in the convergence of IT and OT?

Roy Krans, software development manager, ACS: SDA merges IT and OT. Cloud-based and edge systems require IT infrastructure to connect to OT resources, thereby mandating the convergence of these two technologies.

Davy Demeyer, founder, Acceleer : The meaning that’s most connected to IT-OT convergence is linking data between the two worlds. There is another meaning, and that is bringing benefits and best practices that we are used to in IT to our world of OT, while still taking into account the specific environments and requirements we have in OT. These benefits are the same—speed, scalability and flexibility.

Andre Babineau , marketing director, Next-Generation Industrial Automation Incubator, Schneider Electric : Software-defined automation allows a seamless integration. This decoupling of hardware and software allows for standardized automation applications to run across various hardware platforms, whether in the IT or OT space. It also breaks down the silos between IT and OT systems, facilitating digital collaboration, real-time data management and optimized control strategies.

Garrett Wagg , product manager, ctrlX Automation, Bosch Rexroth : SDA is the binding factor that allows IT and OT to integrate seamlessly. Software-defined automation takes the control of OT systems and puts it into software layers, which allows IT to easily interact.
Which standards and protocols will be affected most or increase/decrease in use because of software-defined automation?

Davy Demeyer, founder, Acceleer: A general concern and risk for many of the industrial standards is that they are not openly available. SDA will bring a faster pace and more innovation to the world of automation, and agentic AI will increasingly be used to help with this advanced pace; it also needs SDA to work efficiently. It is already becoming clear that standards that are not known by the main language models don’t get recommended. This is an important consideration that the main standards organizations in our field—the International Society of Automation (ISA) and the International Electrotechnical Commission (IEC), but also newer organizations such as the Open Process Automation Forum (OPAF)—need to start thinking about.
Staying with the theme of agentic AI, one of the standards that has a high chance of wrapping all our industrial engineering and production applications is the recently proposed model context protocol (MCP).
According to the MCP website introduction: “MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.”
A similar idea, but without the explicit AI focus, is the proposal for a common API by CESMII.
Another set of standards that will probably increase in use is the standards defined under the OPC UA nodesets.
Many of them are still not very commonly used, but the combination of SDA, agentic AI and network effects will probably increase their adoption.
When mentioning OPC UA, we should also mention message queuing telemetry transport (MQTT), which seems to be here to stay.
In process automation, two interesting standards are Data Exchange in the Process Industry (DEXPI), because piping and instrumentation diagrams (P&IDs) are the source of our engineering workflows, and Module Type Package (MTP), because integrating individual machines and skids into bigger control or distributed control systems is one of the key bottlenecks to be resolved.
Of course, ISA-95 and ISA-88 are here to stay, and ISA-106 is probably picking up more popularity for the continuous processes.
I should also mention the Asset Administration Shell (AAS) because some end users see it as part of the solution for an SDA-future.
For version control, everything will probably move to Git, just because it’s the de-facto standard in software development, and SDA will build on the tooling and workflows we have in the software world.
Also, definitely note Linux containers, which can run on Docker or Kubernetes; we already see organizations making vendor decisions based on this capability.

Roy Krans, software development manager, ACS: There are numerous OT protocols that lend themselves to machine-to-machine, or intranet, communication, such as Profinet, EtherCAT, EtherNet/IP, CAN and Modbus TCP/IP.
More remote SDA systems, spanning the internet, employ IT protocols including message queuing telemetry transport (MQTT) and data distribution service (DDS).
These protocols will generally see an increase in usage as more systems move toward SDA.
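As a rough illustration of how a sensor reading travels over a protocol like MQTT, here is a minimal Python sketch. The topic scheme and payload fields are hypothetical, and the actual network publish (for example via the Eclipse Paho client) is only indicated in a comment, since it requires a running broker.

```python
import json
import time

# Hypothetical topic scheme: plant/line/station/sensor-id
TOPIC = "plant1/line3/press/temp-01"

def make_payload(value_c: float) -> str:
    """Serialize one sensor reading as the JSON payload of an
    MQTT publish; timestamp is epoch seconds."""
    return json.dumps({"value": value_c,
                       "unit": "degC",
                       "ts": int(time.time())})

payload = make_payload(71.4)
# With a broker available, an MQTT client would then send:
#   client.publish(TOPIC, payload, qos=1)
print(payload)
```

The design point is that the payload is plain, self-describing JSON, which any subscriber on the IT side can parse without vendor-specific tooling.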

Nathaniel Scroggins , product marketing manager, connection technology, Balluff : We are seeing a rise in the use of OPC-UA, message queuing telemetry transport (MQTT) and representational state transfer (REST) application programming interfaces (APIs). These IoT protocols are universal and allow devices and systems to communicate throughout a system architecture more freely and easily. Specifically in regard to SDA and REST APIs, I am seeing a large increase in use as we continue to integrate, control and manage software systems on-premises and in the cloud.
It is likely that we will see a decrease in any proprietary protocols that box users into a specific vendor or hardware family. Users are more acutely aware of this now than ever as they strive for system environments that remain mostly agnostic, flexible and scalable.

Andre Babineau , marketing director, Next-Generation Industrial Automation Incubator, Schneider Electric : Several key standards and protocols are seeing a boost in adoption from the rise of software-defined automation. For example, the Open Process Automation Standard (O-PAS) is a de-facto standard for continuous process industries. The key benefits of this standard are achieving interoperability and portability while being cybersecure. A central standard in this shift is the OPC UA protocol, which enables secure and reliable communication across various systems, allowing multiple vendors’ solutions to communicate together to provide a flexible solution.
The IEC standards, particularly the IEC 61499 system-model language and the IEC 61131-3 programming languages, are part of O-PAS.
As modularity becomes more critical in manufacturing, protocols like Module Type Package (MTP) and the
NAMUR Open Architecture (NOA) will see more use. These standards enable easier integration of smart sensors, field devices and other technologies, further enhancing the flexibility of software-defined automation.

Garrett Wagg , product manager, ctrlX Automation, Bosch Rexroth : The most affected protocols and standards include OPC UA, message queuing telemetry transport (MQTT), Ethernet/IP, time-sensitive networking (TSN) and cybersecurity standards like IEC 62443.
Which components will see the biggest impact from software-defined automation?

Nathaniel Scroggins , product marketing manager, connection technology, Balluff : Edge devices will be most impacted from the use of SDA. Users are looking for ways to quickly get devices onto the network and integrated into their system architecture. Edge devices that utilize IoT protocols and open communication methods, like MQTT, OPC-UA and REST APIs, allow users to get data to parts of their architecture without needing to change much within a system. For example, an edge device could be connected to an open reporting software via MQTT or REST APIs—displaying real-time data from sensors monitoring the conditions of assets within a machine.
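The consuming side of such an edge-device interface can be sketched in Python. The response format and field names here are hypothetical, since the actual REST schema depends on the device; the sketch only shows how open, self-describing data lets a dashboard derive an alarm without vendor tooling.

```python
import json

def parse_condition(body: str) -> dict:
    """Extract the fields a dashboard needs from a (hypothetical)
    edge-device REST response and derive an alarm flag."""
    data = json.loads(body)
    return {"sensor": data["id"],
            "vibration_mm_s": data["vibration"],
            "alarm": data["vibration"] > data["limit"]}

# Example body such a device might return from a GET request:
sample = '{"id": "vib-07", "vibration": 4.2, "limit": 2.8}'
print(parse_condition(sample))
```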

Roy Krans , software development manager, ACS : Operational technology (OT) will provide increasing support for cloud-based and edge automation systems. For example, a sensor will no longer output an analog signal but provide a network-based signal. Similarly, controllers will provide remote automation and monitoring support via network-based signals.

Davy Demeyer, founder, Acceleer : Definitely the control modules/CPUs will be impacted. Controllers that are not agnostic will quickly disappear. Virtual runtimes and programmable logic controllers (PLCs) will run on open hardware. In Open Process Automation, these are called the decentralized computing nodes (DCNs). There might even be a trend to run virtual PLCs on classical server hardware. Anyway, everything becomes very flexible.
On the integrated development environments used to program the PLCs, we will see a shift to IDEs built around
the code, instead of the code being locked away inside the IDEs. This will even happen for distributed control systems.
In both cases, the GUIs will have a similar interface as before, still allowing graphical ways of programming like ladder, function block diagram (FBD), continuous function chart (CFC) and sequential function chart (SFC). Underlying everything, though, will be text files.
On the applications in production, interfaces will become open, and the speed of these interfaces will become very important. And everything will be running in containers.

Garrett Wagg, product manager, ctrlX Automation, Bosch Rexroth: Programmable logic controllers (PLCs), edge devices, gateways, servers, cloud infrastructure and legacy equipment are greatly impacted by software-defined automation. With a stronger focus on software, things like communication, data, flexibility and openness are prioritized. We will continue to see this trend reflected across new software and hardware releases.

Andre Babineau , marketing director, Next-Generation Industrial Automation Incubator, Schneider Electric : Software-defined automation will have a significant impact on several components within industrial systems. Automation software will be decoupled from hardware, becoming vendor-independent, and enabling modular, reusable applications that can be integrated into various systems.
Control systems will shift toward more modular designs, allowing components from multiple vendors to be seamlessly integrated, improving scalability and flexibility. We are at the infancy of software-defined automation and the primary component to leverage SDA principles is the controller. The other layers of the control solution will come next.
Communication infrastructure will benefit from standardized protocols, like OPC UA, simplifying connectivity and enhancing interoperability between IT and OT systems, while reducing reliance on proprietary methods. Additionally, field devices and sensors will be more easily integrated through standards, like NAMUR’s NOA and MTP, enabling real-time data collection for better monitoring and optimization. As automation systems become more interconnected, cybersecurity and safety measures will also play a critical role in ensuring the security and reliability of these systems.
In what ways does software-defined automation allow machine builders more flexibility in hardware selection and management?

Andre Babineau , marketing director, Next-Generation Industrial Automation Incubator, Schneider Electric : Software-defined automation offers machine builders greater flexibility in hardware selection by enabling them to choose the best hardware for their specific needs without being locked into a particular vendor’s ecosystem. By using open standards and modular systems, builders can integrate products from various vendors into one cohesive system, improving interoperability and adaptability across the board.

Garrett Wagg , product manager, ctrlX Automation, Bosch Rexroth : Software-defined automation allows for flexibility in managing brownfield applications with legacy hardware by using gateways or middleware. Builders can also mix and match different companies’ equipment by utilizing standard interfaces like OPC UA and MQTT.

Davy Demeyer, founder, Acceleer : One of the main reasons machine builders only propose one automation brand is because it is too much engineering effort to switch between and maintain multiple ways of programming.
In the software world, it would be like teams switching between Java and .NET: both solid programming standards, but no team that wants to be efficient considers switching between them.
By making the software runtimes hardware-agnostic, machine vendors will have much more flexibility.
One case will be where an end user will ask for a specific hardware brand, because they want to keep all hardware the same for maintenance reasons.
Another case will be where machine builders will be able to switch between hardware brands when there are supply issues, like what we saw during COVID.
Finally, hardware will become much more powerful than what’s available today for the same price, allowing it to run multiple workloads on it—for example, one PLC runtime and maybe a separate software application to help manage the machine.


Roy Krans , software development manager, ACS : Hardware management will be much improved as most devices will be intelligent and automatically provide a variety of status information to the automation system (Figure 1). In selecting hardware, builders will still need to be cognizant of compatibility, but that compatibility is simplified to supporting a communication protocol across all devices versus having additional hardware to support a large variety of physical signals.
Figure 1: Hardware management will be much improved with SDA, as most devices will be intelligent and automatically provide a variety of status information to the automation system.
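The kind of self-describing status record an intelligent device might report, in place of a bare analog signal, can be sketched as a small Python data structure. The field names are illustrative, not taken from any particular device profile.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class DeviceStatus:
    """Status record an intelligent device might publish over the
    network, replacing a bare analog signal (fields illustrative)."""
    device_id: str
    value: float
    unit: str
    health: str      # e.g. "ok", "degraded", "fault"
    firmware: str

status = DeviceStatus("flow-12", 3.7, "L/min", "ok", "1.4.2")
# Serialized, the record carries its own identity, units and health,
# so the automation system needs no per-signal wiring knowledge.
print(json.dumps(asdict(status)))
```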
How can machine builders prepare for and leverage software-defined automation?

Roy Krans , software development manager, ACS : Develop familiarity with remote protocols and standards that may be employed in SDA systems. Also, investigating currently available automation software, its capabilities and features will provide some insight into the overall power and advantages of SDA (Figure 2).
Figure 2: Investigating available automation software, its capabilities and features will provide some insight into the overall power and advantages of SDA. (PHOTO: ACS)


Andre Babineau, marketing director, Next-Generation Industrial Automation Incubator, Schneider Electric: To prepare for and leverage software-defined automation, machine builders should focus on embracing open standards
and modular architectures. These frameworks enable flexibility and ease of integration, allowing builders to adopt the best technologies and systems for their needs.

Garrett Wagg , product manager, ctrlX Automation, Bosch Rexroth : Machine builders can implement software-defined automation layers today and scale and add flexibility for the future. They can do this by setting up software layers and pushing updates and new features via software. The builders can continue to expand and update with new hardware that can utilize software-defined automation.

Davy Demeyer, founder, Acceleer : For hardware-software decoupling, first, have conversations with the main customers. Understand what their near and longer-term expectations are.
Second, have conversations with the main automation vendor. Ask what SDA-based products can be bought today and at least require the possibility to switch out the hardware by another vendor.
For open software applications, think about what software-based functionalities would make sense to add to the machine if doing so would not add repeated engineering effort. What often happens with machine vendors is that they have good ideas, but it makes no financial sense if they have to repeat the engineering effort for every individual machine.
If, instead, a configuration of each machine could be fed to an automated configuration/engineering/deployment workflow, then it could really help to drive innovation on the machines.
How does software-defined automation build on existing IT and network infrastructure in factories and plants?

Davy Demeyer, founder, Acceleer: Workloads will move toward containers and virtual runtimes. Windows systems will slowly disappear from the factory floors. And, since everything will become software-defined, it will become standard practice to have a staging environment that will be a full duplicate of the production environment.

Roy Krans , software development manager, ACS : SDA utilizes a variety of standard IT infrastructures such as Wi-Fi networks, wired networks and
routers. In some cases, this infrastructure will have to be updated to accommodate the increased bandwidth and desired connectivity, but, either way, an SDA system will use off-the-shelf IT hardware to enable its connectivity.

Andre Babineau, marketing director, Next-Generation Industrial Automation Incubator, Schneider Electric: Software-defined automation leverages standard IT software and network infrastructure and utilizes common standard communication protocol, like OPC UA, and event-driven architectures that facilitate seamless integration of real-time data with enterprise applications. This integration allows for smoother communication between automation systems and enterprise tools, such as analytics, asset management and resource-planning systems. By enabling modular and flexible systems, software-defined automation makes it easier to scale and adapt factory operations without needing a complete overhaul of existing IT infrastructure.

Garrett Wagg , product manager, ctrlX Automation, Bosch Rexroth: Software-defined automation taps into established IT networks like Ethernet or 5G using protocols like OPC UA or MQTT to enhance systems and handle real-time rugged demands of OT. This enables users to have a more seamless, data-driven manufacturing process.
Tell us about your company’s state-of-the-art product that involves software-defined automation.

Roy Krans, software development manager, ACS: Our software, Acselerant, provides configuration-based automation running on a Linux RT-based computer. Acselerant links remote and local data sources while supporting most OT communication protocols through IT infrastructure.
By utilizing dynamic configuration files, Acselerant empowers users to easily reconfigure their test systems to accommodate a variety of applications. Additionally, Acselerant can be configured to support cloud-based servers to receive and send commands and data (Figure 3).

Nathaniel Scroggins, product marketing manager, connection technology, Balluff: Balluff’s Condition Monitoring Tool Kit (CMTK) is an edge device with free, open-source software pre-loaded onto it that acts as a quick-start gateway to a user’s existing network architecture and provides a SCADA-like environment to deploy and monitor process sensors. The CMTK device utilizes the IO-Link protocol for sensor-level connections, which is globally accepted by automation users. It doesn’t tie users to proprietary protocols, meaning the device is sensor-agnostic. It acts as a gateway to a user’s network by offering MQTT, OPC-UA and REST API communication methods to connect to various other systems. The software is built on Grafana, offering quick deployment of customizable dashboards to display sensor information. It also provides access to Docker and Node-RED via the optional SD card slot for more advanced configuration.
This device is most impactful when used to step customers into SCADA-like monitoring functionality via the software, but it can easily be reconfigured to a data aggregation and monitoring gateway that can push data into a full SCADA or PLC system, which enables control and automation. The device remains completely open, and the user owns their own data, which can live on the device via InfluxDB or be passed to other user databases.

Andre Babineau, marketing director, Next-Generation Industrial Automation Incubator, Schneider Electric: Schneider Electric offers the first software-defined automation solution decoupling hardware from software, enabling greater portability and interoperability. This solution also enables the end user to select fit-for-purpose hardware to maximize efficiency while reducing overall operation cost.
As a leader in multi-vendor, software-defined automation, we provide tailored solutions, such as EcoStruxure Automation Expert and EcoStruxure Foxboro DCS. Our EcoStruxure platform provides a single user experience across our fleet of software solutions, minimizing training and maximizing uptime.
Figure 3: By utilizing dynamic configuration files, Acselerant empowers users to reconfigure their test systems to accommodate a variety of applications. (PHOTO: ACS)

Davy Demeyer, founder, Acceleer : Acceleer brings Design-Ops to the world of process automation engineering.
It simplifies and scales the automation engineering workflows, just like what DevOps has done for normal software development.
The big difference with DevOps is that we start from the design stage. The future of automation engineering is based on an open ecosystem, where different engineering applications can be linked together through flexible interfaces.
Upstream, most of the engineering vendors have agreed to export their P&IDs into the DEXPI XML format. Acceleer allows an automatic import from all the equipment defined on these P&IDs.
Next, process and automation engineers collaborate in defining the detailed functional specifications, defining how the code in the control system will work.
Once the functional specifications are ready, the code is automatically generated, using a template-based approach very common in software automation workflows.
The code is then automatically imported into the integrated engineering environment of the selected control system, either through the software development kit (SDK) or directly into the text files containing the code and configuration. From there, the users can complete any code that can’t be auto-generated and finally deploy to testing, staging and production environments.
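The template-based generation step can be illustrated with Python's standard-library Template class. The Structured Text fragment and the configuration fields below are hypothetical simplifications; a real generator would use a richer template engine and vendor-specific project formats.

```python
from string import Template

# Illustrative template for one IEC 61131-3 Structured Text program;
# placeholders are filled from a machine configuration record.
ST_TEMPLATE = Template(
    "PROGRAM ${name}\n"
    "  VAR\n"
    "    start AT ${start_addr} : BOOL;\n"
    "    motor AT ${motor_addr} : BOOL;\n"
    "  END_VAR\n"
    "  motor := start;\n"
    "END_PROGRAM\n")

def generate(config: dict) -> str:
    """Render one PLC program per device entry in the configuration."""
    return "\n".join(ST_TEMPLATE.substitute(d) for d in config["devices"])

config = {"devices": [
    {"name": "Conveyor1", "start_addr": "%IX0.0", "motor_addr": "%QX0.0"},
]}
print(generate(config))
```

Because the configuration, not the code, is hand-edited, adding a second machine means adding one more entry to the device list rather than repeating the engineering effort.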
The overall workflow stays as close as possible to the workflows our engineers are used to today, except it avoids any repeated data entry, which often takes up a big part of the engineering time.
Design-Ops allows acceleration of the overall engineering workflow, making the entire process more predictable and repeatable. It allows users to take back ownership of the functionality of the plants, today often locked away inside the PLCs and distributed control systems.
Design-Ops only works if everything becomes software-defined. Software-defined does not only mean decoupling the software runtimes from the hardware. It also means that applications and the data they contain become manageable through software instead of through GUIs.
Automation systems are becoming more open. We already see this in PLC vendors moving to the next generation of IDEs such as Siemens’ Simatic AX, CoDeSys go!, Beckhoff’s TwinCAT PLC++ and B&R Automation Studio Code.
The same is happening in the DCS world with Open Process Automation (OPA) being ready for deployment, supported by most of the main vendors.

Garrett Wagg, product manager, ctrlX Automation, Bosch Rexroth: The ctrlX Core line of controllers is a scalable, multicore industrial control system that utilizes a real-time Linux-based operating system, ctrlX OS, and modular app technology to handle any current or future task in automation. This includes a wide range of PLC functions, IoT integration and advanced motion control.
The ctrlX Automation product line utilizes an app-based, open approach to empower users to take a more software-focused approach to automation.












by Anna Townshend, managing editor
AI helps ease the coding load


Machine builders and system integrators are experimenting with the benefits of artificial intelligence (AI), just like many other industries. The popularity and ubiquity of generative AI have piqued the interest of OEM and integrator engineers, as well. Practical applications of AI for engineers writing code have real benefits and are being incorporated into daily workflows. AI tools are more advanced with certain programming languages, and true AI-enhanced programmable logic controller (PLC) code needs a highly customized generative AI.
Generative AI supports engineers by automating coding tasks and giving them time for higher-level programming
It’s hard to look anywhere and not be bombarded by advancements in artificial intelligence in media, politics and business. All our devices, appliances, utilities, vehicles, you name it, are enhanced with “smart” features and only getting smarter, all distributed and embedded at scale. As a consumer, it sometimes feels like you’re just along for the ride, but machine builders and system integrators have the opportunity to explore AI solutions to help advance automation and machine engineering. But the plethora of AI offerings makes it difficult to know where to start and what will work for you. Because the AI buzzword is everywhere, you have to be skeptical of its use, or approach decisions about its use with a clear knowledge of what artificial intelligence is and how it can benefit industry.
What are artificial intelligence, machine learning and deep learning?
What is artificial intelligence? It’s a hard concept to comprehend without also defining machine learning (ML), which is a subset of AI, and deep learning, which is a subset of ML.
“While AI captures much of the spotlight, understanding its interplay with machine learning (ML) is crucial. AI is a broad field aimed at creating systems capable of tasks requiring human intelligence,” says Milton Guerry, president of Schunk.
“Artificial intelligence is the overarching field dedicated to creating systems that can learn, reason and act autonomously. Machine learning is a subset of AI, providing the tools and techniques that make learning possible,” says Pradeep Paul, director of manufacturing intelligence at E Tech Group.
Artificial intelligence encompasses a broad field for creating many systems to perform tasks that require human intelligence. “ML, a subset of AI, involves training algorithms on data to improve task performance without explicit programming,” Guerry adds. AI and ML work together to develop smarter, adaptive systems.
“Deep learning, in turn, is a subfield of ML, focusing on complex models that drive many of today’s AI advancements,” Paul says. Deep learning models enable the algorithm training for machine learning.
“AI provides the intelligence framework, while ML enables continuous learning from data,” Guerry adds.
Data analytics: the role of smart equipment and IoT
Does that mean that all our smart devices and appliances are using AI? Mostly, the answer is no, but “smart” equipment is a necessary step toward the proliferation of AI. A smart device or piece of equipment is one that is connected to a network and can collect and analyze data from sensors. That network of equipment and data forms the Internet of Things (IoT).
Data analysis alone can perform some decision-making without human interaction and also without using artificial intelligence. Likewise, we also couldn’t have AI without smart devices networked together collecting massive amounts of data.
Many industries have been using machine learning for decades, and the use of algorithms to perform data analytics isn’t new. What has changed is that the computation power behind ML algorithms has increased rapidly, allowing ML to power AI features. But it’s important to remember: just because it’s ML doesn’t mean it’s AI.
As an example, E Tech Group’s manufacturing customers are typically asking for actionable production insights for their individual sites. Manufacturers want to understand asset performance, identify downtime drivers and optimize factors that influence yield and resource consumption. “While the potential of AI, particularly large language models, is significant, practical and deployable solutions for process automation and equipment integration are still emerging. Currently, machine learning models and analytics effectively address many of these needs,” Paul says.
He points to energy-intensive industries, such as mining and data centers, which benefit greatly from energy-management analytics. Process industries, like pulp and paper and plastics, leverage process models to improve yield and reduce resource consumption. None of those employs artificial intelligence, and not every application needs AI; often basic analytics will do the trick. However, AI does have potential for machine automation and the design of machines in manufacturing in the right applications.
What is generative AI? What are large language models? And how can they help controls programming?
Generative AI has the most potential for influencing the work of machine builders and system integrators. Whether experimenting with generative AI or thinking about and planning for its future potential in machines themselves, many are taking cues from the information technology (IT) sector.
IT has long been using AI to boost programmer productivity, says Chris Gibson, director of emerging technology growth at A&E Engineering, a system integrator and CSIA member. “We’re starting to see this trend extend into the controls world, as well,” he adds.
Generative AI should be thought of as an assistant, not a complete replacement for human intervention or programming engineers.
Aaron Dahlen, applications engineer at DigiKey, describes the relationship as that of a conductor and a musician. “Programming has become a hybrid activity with the programmer acting as the conductor and the AI as the musician,” Dahlen says.
“With regard to machine design, we see a continued trend to network the industrial controllers to collect data. We also see a tension as some designers move the data to the cloud or local servers, while others move the computational power to the edge of the machinery, leveraging the capability of modern PLCs,” Dahlen says. “At DigiKey, we have seen this trend reflected in our growing sales of industrial products.”
Chatbots and generative AI tools like OpenAI’s ChatGPT and Google’s Gemini are popular and familiar, but the term “generative AI” also covers content creation for images, music, videos and other audio. Large language models (LLMs) are a specific type of generative AI that are trained on large amounts of text data with deep learning models that use natural language processing (NLP), another subfield of AI, to produce text. Natural language processing allows LLMs to read human language by encoding and processing the data, which, in this case, is text.
Paul says LLMs can be trained and used to address complex programming issues, but using LLMs for controls programming will require vast quantities of code examples, code documentation and even natural language descriptions of the designed functionality. “This data allows the model to learn syntax, common patterns and the relationships between code and its purpose,” Paul says.
A general-purpose LLM must also be fine-tuned or customized on datasets for the specific automation vendor platform or protocols. “This fine-tuning adapts the model to the vendor’s unique instruction sets, libraries and best practices,” Paul adds. “This addresses the challenge of proprietary function libraries.”
Finally, reinforcement learning can further refine the model. “Engineers or automated systems can provide feedback on the generated code, rewarding the model for correct and efficient solutions, and penalizing errors. This iterative process improves the model’s accuracy and ability to handle complex scenarios,” Paul says.
Once trained, AI models can save significant development time in complex applications, performing tasks like code generation, automated documentation, code error detection and debugging, code optimization and test case generation. “Natural language prompts can generate functional code blocks, reducing the time spent writing code from scratch. This is especially helpful for repetitive tasks or complex logic,” Paul says.
E Tech Group has experimented with industry-specific generative AI tools like Rockwell Automation’s FactoryTalk Design Studio and other platforms, and its engineers are still learning how to best incorporate them into the workflow, but the potential is huge, Paul says. “We’ve played around with it, but it’s still very nascent,” he adds.
Right now, E Tech Group is working to incorporate generative AI into its standard coding workflow. “We maintain a robust, in-house code base that typically addresses approximately 80% of project requirements,” he adds. “To finalize deployment, we’ve created AI-driven internal tools.”
These tools excel at automating repetitive coding, such as templating programming for multiple tags. Instead of manual, tag-by-tag development, engineers can upload a CSV file, enabling the tools to rapidly generate and replicate the necessary programming logic, resulting in substantial time savings.
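As a rough illustration of the kind of tag templating such tools automate — the CSV columns and the output format here are hypothetical, not E Tech Group's actual tooling — a few lines of Python can already replicate logic per tag:

```python
import csv
import io

# Hypothetical tag list as it might arrive in a customer spreadsheet.
TAG_CSV = """tag,description,alarm_hi
TT101,Reactor temperature,150
PT205,Feed pressure,80
"""

# Hypothetical one-line rung template; a production tool would emit the
# vendor's actual import format rather than plain text like this.
TEMPLATE = "ALM_{tag} := {tag}.PV > {alarm_hi};  // {description}"

def build_logic(csv_text):
    """Generate one line of alarm logic per row of the tag list."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [TEMPLATE.format(**row) for row in rows]

for line in build_logic(TAG_CSV):
    print(line)
```

Scaling from two tags to 500 is then a matter of the spreadsheet's length, not the engineer's time — which is exactly the saving described above.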
Generative AI tools also have potential to help with generating function requirement specifications (FRS), which are
developed from the customer requirements for an automation project, and then E Tech Group builds out code from the FRS. However, defining detailed specifications often requires preliminary coding to solidify design elements.
“Sometimes it’s hard to build out your functional spec without doing some upfront coding to iron out design components,” Paul says. With an AI tool, it’s easy to give it the general inputs, and it will generate a framework for a function requirement specification with all the required components, without the need to do any sample coding.
“AI can also help generate test cases automatically, ensuring more comprehensive testing and reducing the time spent on manual test creation,” Paul says. After the FRS, engineers write the test protocols to test the functional requirements and all the features, and AI can help draft test protocols.
Documentation in general can be a tedious but necessary task for engineers, and AI can help generate the needed documentation from the code itself. E Tech Group also takes on projects started by other companies or projects that require integrating systems from different vendors and equipment, which might not be following the same programming practices as E Tech Group engineers.
The potential is there, Paul says, for generative AI to do some of the reverse-engineering of existing code, instead of its engineers spending hours trying to understand the intention behind old code. It could produce at least some documentation and a summary of the code’s intent, Paul says. Some engineers at E Tech Group are working with generative AI tools to try reverse-engineering code.
Once code is written, AI can step in again to assist. “AI models can be trained to identify potential errors in code, suggesting fixes or highlighting areas that need review. This can drastically reduce debugging time,” Paul says. “The AI can analyze existing code and suggest optimizations for performance, memory usage or readability.”
Generative AI platforms are also well-suited to work with more traditional programming languages like Python, SQL or .NET, Paul says. E Tech Group also uses these languages for building interfaces for historians and customer application programming interfaces (APIs), for example. The basic, free versions of ChatGPT or Gemini are great at finding flaws in code for those widely used languages, Paul says. He predicts that in the future those tools will go beyond error detection to suggest better code practices and improved formatting. Already, generative AI has cut down on the need for as many subject matter experts at E Tech Group and given younger engineers more tools to advance their coding skills faster.
Paul says generative AI is helping younger engineers hone their programming skills and expand their language knowledge. They can use Rockwell Automation’s FactoryTalk Design Studio to develop code structure based on specific requirements, then compare that to the in-house code base and learn how they differ and why one works better than the other.
“We use a lot of software and a lot of different platforms,” he says. “Every system is a little different.” This can make using generative AI for PLC programming more complicated. “With PLC programming, because every vendor has its own different methodology of program structure and code modules, it becomes hard to have a general-purpose tool for that,” Paul says. That’s where custom AI tools can come in, but those take significant time to develop. Generative AI will significantly change engineering workflows and employee hiring practices, as E Tech Group has already seen during the development of in-house AI tools.
What is retrieval-augmented generation (RAG)? What are program organization units (POUs)? How can AI simplify complex PLC programming?
For now, caution is still advised for any LLM tool, says Gibson. “Caution is essential when using generative AI for PLC programming because it can and will hallucinate, meaning it may generate incorrect or misleading information,” he adds. He recommends mitigating this risk through retrieval-augmented generation (RAG) AI systems for code creation. RAG is an AI framework that works with LLMs to make them more accurate and relevant, by searching external data sources and pre-processing information and prompts before they’re integrated into the LLM.
“RAG allows you to train AI with specific knowledge, essentially putting guardrails around its responses. By feeding it approved libraries and best practices, you can ensure that AI-generated code aligns with your standards,” Gibson says. “With RAG, AI can learn machine specifications and coding practices to assist in generating PLC, HMI and SCADA code. This emerging trend will only grow, significantly reducing tedious, error-prone and repetitive programming tasks.”
Traditional AI foundation models are pretrained offline and do not include data or information that came into existence after that training. RAG mitigates this shortcoming by retrieving external data and information and using it to enrich the prompt: relevant information and data update the original prompt, and the enriched prompt is passed to the LLM.
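The retrieve-then-enrich flow can be sketched in a few lines. This is a toy version: keyword overlap stands in for the vector search a production RAG system would use, the knowledge-base entries are invented for illustration, and the actual LLM call is left out entirely.

```python
# Invented "approved library" snippets standing in for a real knowledge base.
KNOWLEDGE_BASE = [
    "Approved library: use FB_ValveCtrl for all on/off valves.",
    "Best practice: every motor block requires an interlock input.",
    "Timers must use TON with a named preset constant.",
]

def retrieve(query, k=2):
    """Rank documents by how many words they share with the query."""
    words = set(query.lower().split())
    scored = sorted(KNOWLEDGE_BASE,
                    key=lambda doc: len(words & set(doc.lower().split())),
                    reverse=True)
    return scored[:k]

def enrich(prompt):
    """Prepend retrieved context so it reaches the LLM with the prompt."""
    context = "\n".join(retrieve(prompt))
    return f"Context:\n{context}\n\nTask: {prompt}"

print(enrich("Generate PLC code for an on/off valve"))
```

The enriched prompt now carries the approved function block name, so a code-generating model is steered toward the shop's standards instead of hallucinating its own.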
Complexity is also an important consideration for all programming languages, Dahlen says, and AI can assist here. “Parsing a program into smaller program organization units (POUs) is the gold standard for today’s PLC programmer. Instead of constructing a 100-line ladder logic diagram (LD) serpent, we break the code into smaller, more manageable pieces,” Dahlen says.
The long, serpent code is difficult to construct, troubleshoot and maintain. “It’s a poor programming practice that will cost you a considerable amount of money over the lifetime of the machine,” he adds.
Instead, if the code is broken into several smaller POUs, where each one performs a dedicated function, the code is easier to build and troubleshoot. “This is where the AI excels as a partner to help us explore the inner workings and the boundaries between POUs,” Dahlen says. “Knowing that any given POU is small, the AI can generally comprehend the POU’s function and purpose within the larger program.” With this capability, programmers can use generative AI to optimize individual POUs or the full program. They can clarify the POU’s purpose and optimize the scope, structure and name of the variables, Dahlen says. Using known programming metrics, AI can also estimate program complexity or identify methods for reducing POU complexity. For any given POU, complexity can be defined by measuring the number of decision points, nesting or hierarchical arrangement and total number of operators, Dahlen adds.
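Those three measures can be approximated mechanically. The sketch below scans IEC 61131-3 Structured Text with deliberately incomplete keyword lists; it is a rough illustration of the metric, not a tool from any vendor mentioned here.

```python
import re

# Rough complexity estimate for a Structured Text POU, following the
# three measures named above: decision points, nesting depth and
# operator count. Keyword and operator lists are illustrative only.
DECISIONS = re.compile(r"\b(IF|ELSIF|CASE|FOR|WHILE|REPEAT)\b")
OPERATORS = re.compile(r"(:=|[+\-*/<>=]|\bAND\b|\bOR\b|\bNOT\b)")

def complexity(st_source):
    decisions = len(DECISIONS.findall(st_source))
    operators = len(OPERATORS.findall(st_source))
    depth, max_depth = 0, 0
    for line in st_source.splitlines():
        # END_IF etc. don't match \bIF\b: the underscore blocks the boundary.
        if re.search(r"\b(IF|CASE|FOR|WHILE)\b", line):
            depth += 1
            max_depth = max(max_depth, depth)
        if re.search(r"\bEND_(IF|CASE|FOR|WHILE)\b", line):
            depth -= 1
    return {"decisions": decisions, "operators": operators,
            "max_nesting": max_depth}

POU = """IF Start AND NOT Fault THEN
    FOR i := 1 TO 10 DO
        Total := Total + Samples[i];
    END_FOR;
END_IF;"""
print(complexity(POU))
# {'decisions': 2, 'operators': 5, 'max_nesting': 2}
```

A POU whose numbers creep upward is a candidate for splitting — which is exactly the refactoring conversation an AI assistant can help start.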
AI can help programmers make code easier to read and maintain, by helping in refactoring or improving the code. “The depth of refactoring depends on the specific project. Sometimes, it’s as simple as changing the variable names for clarity. At other times, the programmer will make a key discovery that changes the structure of the entire project. This could be part of a formal code review or an individual programmer exploring the code. On a related note, refactoring is challenging in an industrial environment as a change in PLC code may require extensive verification tests to work out unintended bugs. Sometimes refactoring ends with a TODO statement to identify code that should be updated in the future,” Dahlen says. AI can assist along many steps of this process.
Dahlen notes that he prefers ladder-logic programming for PLCs, in part because ladder logic diagrams are one of the best PLC troubleshooting methods. However, these AI tools work better with structured text (ST). “Today, these tasks are easily accomplished by using ST, as the code may be copied and pasted between the AI and PLC development environment. In the future, we may be able to use LD,” Dahlen says. “Imagine the day when we can talk to the AI and then see the changes incorporated into the ladder logic.”


Other AI applications for machine builders: digital twins and vision systems
AI can also assist with automating the machine-design process and enabling rapid prototyping and development of complex applications. In combination with digital twins, together they can accurately replicate physical characteristics and the behaviors of machines. “An AI-enhanced digital twin allows for advanced simulations, testing and optimization without the need for physical prototypes,” Guerry says.
AI can also significantly enhance vision systems by improving image reliability and fostering adaptation and continuous learning. “AI allows these systems to refine their accuracy over time and adapt to new scenarios, making them more robust and versatile. Additionally, AI facilitates training with synthetic models, allowing vision systems to perform effectively under various conditions without the need for extensive real-world data,” Guerry says.
Schunk’s smart grasping technology uses the benefits of AI to enhance vision systems. “It goes beyond basic vision-based picking by incorporating AI-driven enhancements, resulting in more precise and adaptable grasping capabilities. This advancement demonstrates how AI optimizes vision-system design, providing smarter and more efficient solutions for complex applications,” Guerry says.
The future of AI and machine programming: E Tech Group engineers tackle new software projects
Many are experimenting with generative AI tools for backend development, code optimization and as an added feature on machines, but AI is not abundant in industry just yet. “While the potential of AI, particularly large language models, is significant, practical and deployable solutions for process automation and equipment integration are still emerging,” Paul says. And AI doesn’t belong where regular machine learning and data analytics will get the job done.
AI-generated code should also be used responsibly, Gibson says. “Generative AI-generated code should always be treated as a suggestion and reviewed by experienced professionals before being implemented in machine control or monitoring systems,” he adds.
In the future, AI may change, perhaps revolutionize, machine design, with more intelligent machines that will eventually become autonomous systems. “AI-driven design tools will allow for greater customization and
adaptability, leading to machines that can self-optimize and evolve based on operational data. The integration of AI into machine design will benefit all industries by reducing costs, improving product quality, and shortening time-to-market, ultimately driving innovation and competitiveness,” Gibson says.
With AI assistance to write and optimize code, engineers can focus on higher-level design and problem-solving and spend less time on repetitive coding and debugging, Paul says.
E Tech Group has seen other changes in its engineering staff, where those managing three or four projects used to be “stressed out,” Paul says. Generative AI has made engineers more productive, so they’re less pressured by the same amount of work, and it has also lowered the programming knowledge and experience level required for a new engineer. The staff used to focus more on core competencies, such as someone with computer science expertise vs. chemical engineering expertise. “That’s getting blended out. It’s becoming more even,” Paul says. “You don’t have to have as high a programming knowledge, so our hiring practices have also changed in that way.”
With all this added time from their AI assistants, E Tech Group engineers do have more time to focus on higher-level projects, such as an automated system to help create tags. Engineers are trying to build a system to make the time spent on creating tags a little easier. For example, a plant could have 500 tags that all have to be built out in the supervisory control and data acquisition (SCADA) system.
“We’d like to use tools to do it very quickly. Just put it in a spreadsheet and then import it, and it’s built out for us. It creates these tags in the right formatting and configuration,” Paul says. With many different systems and formats in play, they have also been using generative AI to help build the tool. It will eliminate the manual process of converting spreadsheets into tags and coding them one by one; the new software application will automatically process tags and code with minimal input. What the company might have allocated as 24 hours of engineering time for tag generation could be reduced to four hours with this new system, powered by generative AI and made possible by the time-saving nature of these AI tools.
This is the world where we all have capable and ever more competent assistants. Machine builders and system integrators are starting to explore the benefits of this new world, and it will have wide implications for machine development, deployment and operation.
Hydraulic motion control revitalizes languishing legacy aerospace press
Macrodyne turns to Delta Motion for help with seemingly impossible press upgrade
by Mike Dorian, Macrodyne Technologies
“IT’S A STRETCH forming press, a huge one. I mean, it can barely fit into an Olympic-size swimming pool, and it’s two or three times as deep. You look down from the edge and you feel dizzy.” Daniel Sion, senior programmer at hydraulic press manufacturer Macrodyne, based in Toronto, Canada, pauses to reflect on the seemingly impossible retrofit upgrade project that consumed weeks of his life (Figure 1).
“I heard that it cost $25 to $30 million back when it was commissioned in the 1980s, and it was controlled by this industrial PC with specialized I/O cards,” he continues. “But time went
on. The parent company that made the press stopped making compatible equipment, and eventually the spare parts ran out. The owner was able to get things fixed by an electronics repair center, but there was no way to improve the press’s features or functionality. Finally, it just sat there gathering dust until they called us. And then we called Delta Motion.”
Form(ing) and function
The press has a critical job: stretching metal alloy sheets to form parts for airplane wings. Workers keep parts in a freezer at a specific temperature, heat treat them for forming and then return them to the freezer. The entire process is complex and proprietary. Naturally, forming aerospace parts also demands a high level of precision, which played a crucial role in the upgrade’s planning.
“The press owner had been seeing an accuracy of a few millimeters in position deadband,” says Sion. “They wanted it to be a lot more accurate than that, at least keeping up with the press’s original specs. But that was a tough issue because of the technology they used, and the components are just getting old.”
In addition to accuracy, the owner also needed more force. Specifically, the machine’s light table, the component with the largest area and biggest cylinder mechanism, was designed to press 800 tons. With the existing pumps, the owner could only reach 540 tons. “The customer chose to keep the existing pumps,” explains Sion.
Ultimately, production had fallen behind, the press could no longer provide the necessary precision, speed or force, and presses at competing firms provided superior functionality. Moreover, the competitors’ presses could be serviced far more affordably.
Figure 1: The stretch forming press is almost the size of an Olympic-size swimming pool and more than twice as deep.

It was up to Macrodyne to figure out how to breathe new life into this antiquated relic, and much of that task fell to Sion. Fortunately, he knew who to call for help.
Press control extraordinaire
“My first exposure to Delta Motion was in late 2005,” says Sion. “I was only using the basic RMC functionality, but, year after year, each new project got more complicated, and I used more of Delta’s capacity. Now, I believe they have the best motion control. And that’s not just me; ask any Delta user.” Macrodyne certainly needed the best because the client’s press
involved a total of 53 axes. Such a monumental task required substantial preparation and considerable programming and tuning. However, the challenges began early, because Sion went into the design stage half-blind. All he had were “some pictures of the hydraulic schematics and their electrical drawings,” which might or might not have reflected current reality. The original press manufacturer couldn’t offer any help and hadn’t dealt with press setup in decades. All Sion could do was make educated guesses.
To get started, Sion purchased the Delta RMC200 motion controller he expected to use at the press site.
The RMC200 controls up to 50 axes of motion and comes loaded with a wealth of communication interfaces to different PCs, human-machine interfaces (HMIs) and programmable logic controllers (PLCs). Sion connected the RMC200 to a PLC and began programming. At the same time, the client maintained pressure on Macrodyne to complete it, as the company needed the press back in operation to fulfill pending orders.
Adding even more challenge, the customer requested adding a new press function that Macrodyne had never implemented before (Figure 2).
Figure 2: The customer requested adding a new press function that Macrodyne had never implemented before.
motion control
“The table control has two cylinders, but they treat the entire table as one axis,” says Sion. “They don’t care that there are two physical cylinders. My program has to control each cylinder and make sure the average position (virtual axis), which is in the center of the table, is the one being controlled and has the force applied to that point. There’s one position, one tilt angle, one force, even though the cylinders are acting independently. It’s like lying to yourself.”
Many more cylinder motions had to be accounted for, as well—large cylinders, near and far, pushing and pulling from the left and the right, both inward and outward. Each cylinder functioned independently but might have to cooperate and act as a single axis, depending on the motion. It essentially was a research project for Sion and his team, who had never programmed something like this before.
Deploying into action
Incredibly, Macrodyne arrived at a fully functional solution from the outset without having to redo any work. Sion confirmed this when he arrived at the client site and got hands-on with the actual press. While no one would call the effort easy, Sion says Delta Motion was critical to the success of the project. He notes, “If the Delta didn’t have the capacity to handle this, I didn’t have a solution. There was no second option. It had to work.”
With the motion control system installed, the first order of business was to tune the press’s 53 axes across the RMC200 and accompanying RMC150 controllers (Figure 3). Moreover, the press’s process spanned a sequence of roughly 100 motions. Taken together, this amounted to a genuinely Herculean programming task.
Compounding the challenge even further, the age of the client’s equipment meant that sensors couldn’t be mounted within the press cylinders. Instead, the retrofit team installed external wire-draw encoders on the cylinders. The approach is less precise, but Delta’s polling rate and software programmability helped mitigate most of the sensors’ drawbacks.
Delta Motion sent two application engineers, David McNichol and Sean O’Banion, to the client’s site to assist Sion in reducing a tuning job that he says would have taken him two weeks down to “about two days.” The Delta engineers also validated the programs Sion and his team wrote.
The graphical interface of Delta’s free RMCTools software, Sion adds, made configuration impressively quick and easy. Part of Delta’s advantage in this scenario is its application-specific focus. Programmable logic controllers are well-suited for managing overall machine control, and electric servo systems are highly effective in motor control. Delta Motion does not attempt to replace PLCs, but rather complements them to optimize motion control for specific applications. While the RMC controls electric motors very well, Delta’s primary focus is on hydraulic motion control, an area in which it demonstrates exceptional proficiency.
Outstanding outcomes
Throughout the weeks Macrodyne spent completing the press retrofit, technicians with the client worked and watched alongside Sion’s team. Substantial training would have been required in similar installations, but the Delta software’s intuitive nature allowed most training to be done unofficially during the installation.
Figure 3: With the motion control system installed, the first order of business was to tune the press’s 53 axes across the RMC200 and accompanying RMC150 controllers.

“From the first moment I started to jog the machine, people were with us,” says Sion. “I didn’t invite them, but they started to come and ask questions. Naturally, every answer led to more questions. It made my life a little more difficult, but it also turned into an advantage because the experience was very thorough for the client. In the end, they still asked for an official training period, even though they’d already been making parts for a month.”
In short, the aerospace manufacturer began this process with a machine that had been out of commission for six months, and no parts or expertise were available to fix it. Today, the
company has its machine back, and it is working better than ever.
Before, the press had lacked precision and could not operate at its full tonnage. Budget constraints meant keeping all the original pumps, piping and cylinders; only the seals and valves were updated. Adding Delta motion control allows the position to be measured every millisecond and managed with superior accuracy. Greater accuracy allows for increased speed because operators now know precisely where parts are, so they don't collide with other components and cause excess wear or damage to cylinders, valves or sensors.
The client now has its press operating at full capacity, with improved quality control and greater functionality (Figure 4). Placing leading-edge motion control at the center of the solution turned a seemingly impossible press upgrade into a satisfied, productive manufacturer.

Mike Dorian is senior electrical automation designer at Macrodyne Technologies in Concord, Ontario. Contact him at mdorian@macrodynepress.com.
Figure 4: The client now has its press operating at full capacity, with improved quality control and greater functionality.
by Patrick Bunn, Bunn Automation Consulting
What are the 7 layers of the OSI model?
How to understand the standards and protocols relevant to industrial control system networks
I HAVE BEEN WORKING with controls and automation for more than 20 years. I’ve witnessed the transition of control systems from primarily private, serial, mostly proprietary networks to more Ethernet-based networks.
Controls engineers need to build more knowledge of these types of systems. The Open Systems Interconnection (OSI) model is an important topic, as it relates to industrial control system (ICS) networking.
Let’s say we want to communicate a message to someone. What two fundamental elements are needed for that to take place? The first would be the format, and the second would be the media. What do I mean by that?
The format would be the specific language with which we are going to communicate, for example, American English. There are many languages, or formats, of communication that can be used, but unless there is a common, agreed-upon way to communicate, the message will never be understood.
The media would be the avenue through which we would deliver this
message. There are many avenues we can use to communicate, such as a phone call, text message, email or in-person conversation. This, too, needs to be agreed upon. We may both intend to communicate in English, but, if I am expecting a phone call and an email is sent instead, it might be a while before I realize I have a message. For us to be able to communicate, we would have to agree on both a format and media.
Industrial control system networks are similar. We just use different words to describe them.
Like format and media, we have protocols and standards that we use to help broker communication between devices. Think of protocols kind of like a specific language and standards as an agreed-upon way of formatting that communication.
When we want to send an email to someone, first we open an email application, like Microsoft Outlook, on our laptop. Next, we click the "new message" icon to create an email. Then we
put in an email address to designate the recipient. We fill out the subject line and body of the email. After we have proofread the message, we hit the send icon, and away it goes.
What about the person on the other side? That person gets a notification. After opening the email application, the person sees the unread email, clicks on it and reads it. This message went from one computer to another. Was it magic?
7 layers of the OSI model
To help see how that worked, let's take a look at the OSI model, which describes seven layers that computer systems use to communicate over a network (Figure 1).
Starting at Layer 7, the highest-level layer, and moving down, you have:
• application layer
• presentation layer
• session layer
• transport layer
• network layer
• data link layer
• physical layer.








In our email example, the message starts in Layer 7 of the source machine, and then it moves all the way down through the layers to Layer 1. Next, the message is transmitted to the destination machine. Finally, the message goes from Layer 1 of the destination machine up to Layer 7.
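This downward-then-upward journey can be sketched in a few lines of Python. The bracketed header strings below are purely illustrative stand-ins, not any real protocol's format:

```python
# Illustrative sketch of OSI-style encapsulation: each layer wraps the
# payload from the layer above with its own header on the way down, and
# the destination strips the headers again on the way back up.

LAYERS = ["application", "presentation", "session",
          "transport", "network", "data link", "physical"]

def send(message: str) -> str:
    """Walk down the stack, wrapping the message at each layer."""
    pdu = message
    for layer in LAYERS:
        pdu = f"[{layer}]{pdu}"
    return pdu  # what actually crosses the wire

def receive(pdu: str) -> str:
    """Walk up the stack, unwrapping in reverse order."""
    for layer in reversed(LAYERS):
        prefix = f"[{layer}]"
        assert pdu.startswith(prefix), f"missing {layer} header"
        pdu = pdu[len(prefix):]
    return pdu

wire = send("Hello")
print(receive(wire))  # prints Hello: the message survives the round trip
```

The outermost wrapper on the wire belongs to the physical layer, which is why the receiver must peel the layers off in the opposite order from the sender.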
OSI layers 4 through 7 are always implemented in software, while layers 2 and 3 are a combination of hardware and software. Layer 1 is almost completely hardware.
Layers 1, 2 and 3 are network support layers and happen to be the layers we are most concerned about from an automation network standpoint. They deal with physical aspects of moving data, such as electrical specifications, physical connections, physical addresses, transport time and reliability from one device to another. Layer 4 ensures reliable data transmission. Not all applications need to use all seven layers. The lower three layers are sufficient for most applications.
Layer 7: Application
At the very top of the OSI reference model stack, we find the application layer, which is implemented by the network applications. These applications produce the data that is to be transferred over the network. This layer also serves as a window for the application services to access
the network and for displaying the received information to the user.
The hardware associated with this layer is almost always the computer, and the information is communicated as data. For our email example, Microsoft Outlook operates in this layer.
Layer 6: Presentation
The presentation layer is also called the translation or syntax layer. The data from the application layer is extracted here and manipulated as needed to conform to the required format to transmit over the network.
The hardware associated with this layer is almost always the computer, and the implementation of this layer is done by network application software, such as web browsers and email clients. The information is communicated as data.
Figure 1: The OSI model describes seven layers that computer systems use to communicate over a network.
For our email example, the data gets converted into American Standard Code for Information Interchange (ASCII) format in this layer, and likely is encrypted using secure sockets layer (SSL).
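Both presentation-layer jobs named here, character encoding and encryption, can be glimpsed with Python's standard library. Note that modern mail clients actually use TLS, SSL's successor, and the context object below is only created, never connected to anything:

```python
import ssl

# Encoding: turn the human-readable text into ASCII bytes for transmission.
body = "Meet at 3 pm."
encoded = body.encode("ascii")
print(encoded)  # b'Meet at 3 pm.'

# Encryption: a TLS context like this is what an email client would hand
# its socket so that the encoded bytes travel encrypted on the wire.
context = ssl.create_default_context()
print(type(context).__name__)  # SSLContext

# Decoding on the receiving side reverses the presentation step.
print(encoded.decode("ascii"))  # Meet at 3 pm.
```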
Layer 5: Session
The session layer is responsible for establishment of connection, maintenance of sessions, authentication and ensuring security.
The hardware associated with this layer is almost always the computer and the implementation of this layer is done by network application software, such as web browsers and email clients. The information is communicated as data. For our email example, remote procedure call (RPC) is likely being used in this layer.
Layer 4: Transport
The transport layer provides services to the application layer and takes services from the network layer. It is responsible for the end-to-end delivery of the complete message. The transport layer also provides the acknowledgment of the successful data transmission and re-transmits the data if an error is found.
The hardware associated with this layer includes load balancers and firewalls, and the communicated information is broken down into segments. For our email example, transmission control protocol (TCP) is likely being used in this layer.
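TCP's reliable, ordered delivery can be demonstrated with a minimal loopback echo using Python's standard socket module; the segments, acknowledgments and retransmissions all happen below this API, and the port number is chosen automatically:

```python
import socket
import threading

# A tiny TCP echo server and client on the loopback interface. TCP, at the
# transport layer, gives both sides a reliable, ordered byte stream.

def echo_once(server: socket.socket) -> None:
    conn, _ = server.accept()
    with conn:
        conn.sendall(conn.recv(1024))  # echo back whatever arrives

server = socket.create_server(("127.0.0.1", 0))  # port 0 = pick a free port
port = server.getsockname()[1]
threading.Thread(target=echo_once, args=(server,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"new mail")
    reply = client.recv(1024)
server.close()

print(reply)  # b'new mail'
```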
Layer 3: Network
The network layer works for the transmission of data from one host to another located in different networks (internetworking). It also takes care of packet routing, which is the selection of the best path to transmit the packet from the routes available. To identify each device on the internetwork uniquely, the network layer defines an addressing scheme. The sender's and receiver's internet protocol (IP) addresses are placed in the header by the network layer.
The hardware associated with this layer includes routers, and the information is communicated in packets. For our email example, IP is likely being used in this layer.
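Python's standard ipaddress module makes the network layer's addressing scheme easy to poke at; the addresses below are familiar private-range examples, not anything from the article:

```python
import ipaddress

# The network layer identifies every host with an IP address and decides
# whether a destination is on the local network or must go via a router.

sender = ipaddress.ip_address("192.168.1.10")
receiver = ipaddress.ip_address("10.0.0.25")
local_net = ipaddress.ip_network("192.168.1.0/24")

print(sender in local_net)    # True: deliverable on the local network
print(receiver in local_net)  # False: the packet must be routed

# A /24 prefix means the first 24 bits name the network; routers pick a
# next hop by matching destinations against tables of networks like this.
print(local_net.num_addresses)  # 256 addresses in a /24
```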
Layer 2: Data link
The data link layer is responsible for the node-to-node delivery of the message. The main function of this layer is to make sure data transfer is error-free from one node to another, over the physical layer. When a packet arrives in a network, it is the responsibility of the data link layer (DLL) to transmit it to the host using its media access control (MAC) address.
The hardware associated with this layer includes switches, and information communicated is frames. For our email example, Ethernet is likely being used in this layer.
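A minimal Ethernet II header can be packed with Python's struct module to show where the MAC addresses mentioned above actually live in a frame; the MAC values are made up:

```python
import struct

# An Ethernet II frame starts with a 14-byte header: destination MAC,
# source MAC, then a 2-byte EtherType (0x0800 means an IPv4 packet follows).

def mac_bytes(mac: str) -> bytes:
    return bytes(int(part, 16) for part in mac.split(":"))

dst = mac_bytes("aa:bb:cc:dd:ee:ff")  # made-up destination MAC
src = mac_bytes("11:22:33:44:55:66")  # made-up source MAC
ETHERTYPE_IPV4 = 0x0800

header = struct.pack("!6s6sH", dst, src, ETHERTYPE_IPV4)
payload = b"...an IP packet would go here..."
frame = header + payload

print(len(header))         # 14, as the Ethernet II header layout requires
print(frame[:6].hex(":"))  # aa:bb:cc:dd:ee:ff - destination comes first
```

A switch reads only those first six bytes to decide which port should receive the frame, which is why data link hardware can forward at wire speed.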
Layer 1: Physical
The lowest layer of the OSI reference model is the physical layer. It is responsible for the actual physical connection between the devices. It controls the transmission of the individual bits from one node to the next. When receiving data, this layer takes the received signal, converts it into 0s and 1s and sends them to the data link layer, which puts the frame back together.
The hardware associated with this layer includes hubs, repeaters, modems and cables. Cables come in many different varieties. The information communicated is bits. For our email example, a combination of fiber optics, copper and Wi-Fi are likely being used in this layer.
Recap
To review, the application, presentation and session layers all primarily deal with data. At the transport layer, the data is broken into segments. In the network layer, those segments are packaged into packets. In the data link layer, those packets are framed, and, in the physical layer, those frames are converted to a binary transmission.

Patrick Bunn, owner of Bunn Automation Consulting in Birmingham, Alabama, will be discussing industrial network protocols, the OSI model and how to use Wireshark software for troubleshooting in his presentation, ICS Networking, at OT SCADA Con 2025 in Houston. Bunn will speak at the event on July 24 at 3 pm. Use the code PATRICK to receive 15% off your registration fee for OT SCADA Con 2025 (www.eventcreate.com/e/otscadacon25). You can also contact Bunn at patrick@bunnautomation.com.
Tool up for industrial networks
Gateways, modules, routers, switches, converters, I/O and terminals to transmit data
Beckhoff EtherCAT terminal

Beckhoff offers the EL8601-8411 EtherCAT Terminal, which packs flexibility into a compact, 12-mm-wide design. With up to 12 signal interfaces (8 x DI, 2 x DO, 1 x AI, 1 x AO) and nine signal types in one terminal, the multi-interface is ideal for numerous applications, including systems that require only a few complex signals, as well as custom machines that need highly flexible signal configuration without adding single-purpose hardware. The EL8601-8411 offers a large number of configurable combinations; in addition to the digital inputs and outputs, the analog input and analog output can each be configured as a current or voltage signal.
Beckhoff / www.beckhoff.com
AutomationDirect MQTT gateway

AutomationDirect offers the STRIDE MQTT gateway, which connects industrial Modbus devices to a message queuing telemetry transport (MQTT) cloud-based data logging solution. MQTT is a machine-to-machine IoT connectivity protocol that provides low power usage, minimized data packets and efficient distribution of information to one or many receivers. The STRIDE MQTT gateway provides easy hardware setup to add Modbus RTU/TCP devices to an existing MQTT cloud data collection platform. The MQTT gateway interfaces with up to 32 Modbus devices; wired and Wi-Fi gateway models are available. The gateway is the hardware component only; the user must have an existing cloud computing service account from a cloud services provider.
AutomationDirect / www.automationdirect.com
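The MQTT protocol such gateways speak is compact enough that its handshake can be built by hand. This sketch follows the MQTT 3.1.1 CONNECT packet layout; the client identifier is a made-up example, and the packet is assumed to be small enough that its remaining length fits in a single byte:

```python
import struct

# MQTT 3.1.1 CONNECT packet, built by hand to show why the protocol is so
# light on the wire. The client ID is a made-up example.

def mqtt_string(s: str) -> bytes:
    data = s.encode("utf-8")
    return struct.pack("!H", len(data)) + data  # 2-byte length prefix

CLEAN_SESSION = 0x02
variable_header = (
    mqtt_string("MQTT")       # protocol name
    + bytes([0x04])           # protocol level 4 = MQTT 3.1.1
    + bytes([CLEAN_SESSION])  # connect flags
    + struct.pack("!H", 60)   # keep-alive: 60 seconds
)
payload = mqtt_string("stride-gw-01")  # made-up client identifier

remaining = variable_header + payload
packet = bytes([0x10, len(remaining)]) + remaining  # 0x10 = CONNECT type

print(len(packet))  # 26 bytes for a complete CONNECT
```

That entire session-establishment packet is smaller than most protocols' headers, which is what makes MQTT attractive for low-power, high-device-count telemetry.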
IDEC coupler module
IDEC introduces the SX8R Bus Coupler module for deploying I/O modules, in support of the industry design trend of using smaller, decentralized control panels to simplify installations and reduce wiring complexity. The IDEC SX8R Bus Coupler can be used to design distributed remote I/O systems or to expand the I/O count for controllers with limited base unit I/O points. Each SX8R supports up to seven I/O modules on the base unit and up to eight additional modules, for a total of up to 15 I/O modules. A single SX8R can therefore support up to 480 discrete points (input and/or output), 120 analog inputs and/or 60 analog outputs.

IDEC / www.idec.com
Pepperl+Fuchs Ethernet switch
Pepperl+Fuchs’ RocketLinx series Ethernet switches are designed for mission-critical environments that require extended operating temperatures, rugged enclosures, high-performance communication and reliable data transfer. A variety of managed and unmanaged models are available with copper and SFP fiber ports, Gigabit speeds, alarm relays and redundant power inputs for the most demanding applications. RocketLinx unmanaged switches offer a plug-and-play solution for connecting Ethernet-enabled devices at the field level where reliability is critical. RocketLinx managed models provide advanced security, management and redundancy capabilities for establishing the more complex field-to-enterprise communication required by advanced networks and Industry 4.0 initiatives.

Pepperl+Fuchs / www.pepperl-fuchs.com
product roundup
Balluff networking block
The Balluff BNI004F networking block is a rugged, high-performance I/O solution for industrial EtherNet/IP systems. With eight M12 connection slots supporting 16 configurable digital inputs and outputs, it enables efficient device integration. Its robust die-cast zinc housing and IP67 rating ensure durability in harsh environments. Supporting up to 9 A for sensors and actuators, it provides reliable power distribution while maintaining flexible communication and control, making it ideal for industrial automation applications.
Galco / www.galco.com
Wago injector and converter

Wago’s power over Ethernet (PoE) injectors and media converters make communication fast and powerful. With a small footprint and wide temperature range, these products are ideal for any networking application. The 60- and 90-watt PoE injectors easily supply power and communications over a single RJ45 cable. These devices are based on the IEEE 802.3bt standard for 4PPoE that uses all eight wires in an Ethernet cable. The media converter provides extended Ethernet communication as it changes data transmission from copper to fiber optic.

Wago / www.wago.com
HMS Networks wireless 5G networking

HMS Networks offers the Anybus Wireless Bolt 5G, designed to connect industrial networks over 5G. Its unique bolt-on form factor, pre-directed antennas, IP67 protection and minimal in-cabinet footprint make it ideal for deployment across new and existing equipment. HMS Networks now offers routing behind mobile station (RBMS) as standard within the Anybus Bolt 5G and the 5G router. This feature allows the IP addresses of devices “behind” the Bolt 5G to be assigned from the 5G side, enabling seamless routing of messages to these devices. Machine builders can integrate 5G technology into their products as it becomes more prevalent in industrial settings.

HMS Networks / www.hms-networks.com

Fluke Networks network tester

Fluke Networks introduces the LinkIQ Duo Cable+WiFi+Network Tester, which combines the company’s cabling qualification with Wi-Fi 6E network testing and analysis in a single solution. LinkIQ Duo pairs Fluke Networks’ cable testing with active network and Wi-Fi testing to understand Wi-Fi environments quickly when installing and troubleshooting wireless networks. LinkIQ Duo advanced Wi-Fi testing can meet the challenges and requirements of the rapid adoption of Wi-Fi 6E, and it includes all the features necessary for troubleshooting Wi-Fi connection and performance issues and identifying the location, availability and configuration of access points. The LinkIQ Duo also flags common configuration errors.

Fluke Networks / www.flukenetworks.com

Lutze Ethernet switches

Lutze introduces unmanaged E-CO switches with 5, 8 or 16 ports for universal network communication. Quality of service (QoS) allows for data prioritization, autonegotiation and autocrossing. The switches offer a wide input voltage range of 12 to 48 V dc, as well as 24 V ac. A compact metal housing with IP30 protection and an extended temperature range of -40 °C to +75 °C makes them DIN-rail or panel mountable and suited for harsh industrial environments. The switches include intelligent energy management and improved electrostatic discharge (ESD) protection for energy-saving Ethernet networks. Energy Efficient Ethernet (EEE) is implemented in compliance with the IEEE 802.3az standard.

Lutze / www.lutze.com

Brainboxes Ethernet switch
Newark offers the Brainboxes SW-7717 (14x 10/100, 2x 1G and 1x SFP) Ethernet switch. This 16-port managed power over Ethernet (PoE+) switch delivers connectivity and centralized management for mission-critical industrial operations. Brainboxes’ range of Ethernet switches offers versatile connectivity options designed to meet the unique demands of many applications. These switches are built to last and ensure industrial networks operate seamlessly.
Newark / www.newark.com
Contemporary Controls IP router

Contemporary Controls’ Skorpion IP routers can integrate new machines or subsystems that have a fixed range of IP addresses that conflict with other existing plant addresses or the overall addressing policy already in place. Each machine connects to the LAN side of the router while keeping the same IP settings for its devices and the application. The IP address for the WAN port on the IP router is the only setting that requires modification to join the factory network. IP routers allow the machine builder to retain the same configuration used during factory acceptance testing when installing at the customer site.

Contemporary Controls / www.ccontrols.com

Mr. IIoT sensor adapter

Mr. IIoT offers the SHARC IoT sensor adapter, which streams data from industrial sensors via the message queuing telemetry transport (MQTT) protocol. The A-coded five-pin M12 connector accepts PNP, NPN, 0-10V and 4-20mA signal inputs in a single channel. The SHARC and sensor can be powered with power over Ethernet (PoE) or an existing 24VDC supply, including when intercepting an existing PLC output. Data is transmitted to business systems over wired Ethernet, Wi-Fi or Bluetooth. A range of onboard configurable signal-processing options accommodates encoder inputs, counters, switches and scaling of analog signals, allowing simple integration into the Unified Namespace.

Mr. IIoT / www.mriiot.com

Mencom receptacle

Mencom offers a range of connectors for sensors and actuators in industries such as automation, manufacturing, military, aerospace and transportation. The receptacles are engineered to withstand harsh environments with IP67/69-rated protection. M12 and ½”-20 receptacles are popular choices for connecting sensors, actuators and fieldbus devices. M8 receptacles are Mencom’s most compact connectors. M23 receptacles are designed for data, signal and power transmission, enduring vibration in motor drives and moving assemblies. The 6- and 8-pole MCVH receptacles handle higher electrical loads for high-power applications, such as servo motor systems and power transmission, with a maximum of 630 V and 30 A.

Mencom / www.mencom.com

Phoenix Contact Wi-Fi modules

The WLAN 1121 and WLAN 1021 from Phoenix Contact offer the latest Wi-Fi 6 technology (IEEE 802.11ax) for high performance, security and reliability. With its modern industrial Wi-Fi 6 board, the WLAN 1000 series has up to ten times more data throughput than the previous modules in the WLAN 1000 product family, which are based on Wi-Fi 4 (IEEE 802.11n). The Wi-Fi 6 devices provide data rates up to 2,402 Mbps gross (160 MHz channel). The new version retains the WLAN family’s compact housing without sacrificing performance. The new client modules are ideal for creating larger networks with many devices, such as automated guided vehicles (AGVs) or shuttle systems.

Phoenix Contact / www.phoenixcontact.com
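The 4-20 mA signal scaling mentioned in the Mr. IIoT entry is a simple linear map; the engineering range below (0-100 psi) is a made-up example:

```python
# Linear scaling of a 4-20 mA loop signal to engineering units, the same
# kind of onboard processing an IoT sensor adapter performs. The 0-100 psi
# span is a made-up example range.

def scale_4_20ma(ma: float, lo: float = 0.0, hi: float = 100.0) -> float:
    """Map 4 mA -> lo and 20 mA -> hi; out-of-range current signals a fault."""
    if not 4.0 <= ma <= 20.0:
        raise ValueError(f"loop current {ma} mA out of range (broken wire?)")
    return lo + (ma - 4.0) * (hi - lo) / 16.0

print(scale_4_20ma(4.0))   # 0.0   (bottom of range)
print(scale_4_20ma(12.0))  # 50.0  (midpoint)
print(scale_4_20ma(20.0))  # 100.0 (full scale)
```

The live-zero at 4 mA is the reason this signal standard endures: a reading of 0 mA is unambiguously a broken wire rather than a zero measurement.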
Joey Stubbs joey.stubbs@gmail.com

Software updates affect industrial PCs
REAL-TIME COMPUTING is particularly important in manufacturing. Many industrial PCs (IPCs) rely on a real-time operating system (RTOS) or real-time extensions to operating systems to guarantee predictable, deterministic behavior for controlling machinery and processes.
Real-time systems must respond to inputs and events within very tight time constraints, typically within milliseconds or microseconds, to ensure safe and efficient operations. A deviation from this expected behavior, even for a fraction of a second, can have severe consequences.
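The scale of timing jitter on a general-purpose OS, as opposed to an RTOS, can be glimpsed with a quick measurement; the 1 ms target period is an arbitrary example:

```python
import time

# Measure how far repeated 1 ms sleeps drift from their target on a
# general-purpose OS. An RTOS bounds this jitter; a desktop OS does not,
# which is exactly why treating an IPC like an office PC is risky.

TARGET_S = 0.001  # 1 ms target period (arbitrary example)
worst_jitter = 0.0

for _ in range(100):
    start = time.perf_counter()
    time.sleep(TARGET_S)
    elapsed = time.perf_counter() - start
    worst_jitter = max(worst_jitter, abs(elapsed - TARGET_S))

# On a typical desktop the overshoot is often tens to hundreds of
# microseconds, and worse under load; a hard real-time system instead
# guarantees an upper bound on this number.
print(f"worst observed jitter: {worst_jitter * 1e6:.0f} µs")
```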
However, because the hardware and upper-level software are similar to those of standard office computers, and because these IPCs are likely connected to a site's computer network, making them both targets of and conduits for hacking, it is not uncommon for the information-technology (IT) department to bring any computer, industrial or not, under its umbrella. In doing so, IT can do a disservice to the real-time capabilities of these systems by treating them the same as any other computer in the office.
I'm not trying to condemn IT departments, but I have seen this same story many times: the design engineer spends many hours researching available real-time systems and IPCs that can provide marked improvements to the manufacturing process, only to have the IT department set the IPC up for automatic updates from the cloud. This can erase any gains to the manufacturing process, result in additional rework or downtime, or even cause safety issues in the plant.
A significant risk posed by automatic software updates to industrial PCs is the potential for incompatibility between the updated software and the real-time extensions or operating systems. Real-time extensions provide the low-latency performance necessary for time-critical tasks in industrial settings. An update pushed without thorough compatibility testing may result in drivers, application programming interfaces (APIs) or other critical software components becoming incompatible with the real-time requirements of the IPC.
For example, an update to the operating-system kernel or a hardware driver may inadvertently affect how the real-time extensions interact with the hardware, causing delays or jitter in the execution of time-sensitive operations. This disruption could lead to processes being executed out of order, data being processed too late or even machinery being controlled incorrectly. Given that many industrial systems rely on precise timing and synchronization, even small disruptions can lead to system instability or failure.
Another key issue when updating industrial PCs automatically is the potential for new drivers or firmware to be incompatible with legacy hardware. In manufacturing environments, many IPCs are deployed with specialized, sometimes custom, hardware designed to work with specific machines or sensors. These components often require specialized drivers that may not be updated as frequently as general-purpose drivers.
Automatic updates are often deployed with the assumption that they will improve system functionality and security. However, they can just as easily introduce instability or new bugs into the environment. IPCs are often running specialized software tailored to the unique requirements of a given manufacturing process. These systems are frequently optimized for performance, and even minor changes to the software can disrupt the delicate balance that allows them to function reliably.
Take the quiz: How are industrial PCs affected by software updates? at www.controldesign.com.
Many industrial sectors are governed by strict regulatory standards that mandate specific operational conditions, software versions or configurations. Automatically pushing software updates can introduce significant compliance risks if the new software versions inadvertently cause systems to fail to meet these regulations. For tips on managing software updates, visit www.controldesign.com/55270924.
Joey Stubbs is a former Navy nuclear technician, holds a BSEE from the University of South Carolina, was a development engineer in the fiber optics industry and is the former head of the EtherCAT Technology group in North America.






















