
Why Edge Computing Is So Crucial for Industrial IoT

Michael Schuldenfrei, Chief Technology Fellow | OptimalPlus

The invention of the Printed Circuit Board (PCB) in the 1950s changed the world of automation. Prior to the PCB, electronic circuit boards were assembled exclusively by hand, a laborious process that greatly limited global production. Today, industry is experiencing yet another revolutionary leap with the introduction of instrumentation in the manufacturing process and the use of edge computing.

Instrumentation of the manufacturing process involves numerous sensors and microcontrollers that can subtly alter manufacturing conditions in response to what the sensors detect. The sensors produce large quantities of data, far more than the microcontrollers alone can analyze and respond to directly.

Both the sensors and microcontrollers used in manufacturing instrumentation are basically small networked computers. The sensors send their data to a central location where the data is then analyzed. These small, autonomous computers are not monitored by humans in real time and are part of the Internet of Things (IoT). More specifically, in a manufacturing context, they are Industrial IoT (IIoT) devices.

 

IIoT Use Case for Manufacturing Instrumentation

IIoT devices are used in any number of contexts to do jobs that would be difficult—if not impossible—for humans to do reliably and/or accurately every time. Consider, for example, weld inspection. Welding is an integral part of many electronics production lines and critical to the functionality and durability of the final product.

Unfortunately, manufacturers are being asked to perform welds on ever-smaller components, under ever-tighter constraints. In order to protect components, welds must be performed at the lowest possible heat and with the smallest possible electrical charge.

IIoT devices that might refine this process include heat, voltage, and pressure sensors that help determine the minimum amperage necessary to perform a weld under the current environmental conditions. IIoT cameras may also feed machine learning-based visual weld inspection systems to verify that welds are satisfactory, even when the welds are far too small for the human eye to see; and this is just for starters.
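To make the idea concrete, here is a minimal sketch of the kind of decision an edge controller might make from those readings. The sensor names, units, and thresholds are all invented for illustration; they are not taken from any real welding system.

```python
# Hypothetical illustration: choosing the lowest safe weld current from
# local sensor readings. The names, units, and thresholds are invented.

def choose_weld_current(temp_c: float, line_voltage: float, clamp_pressure_kpa: float) -> float:
    """Return the lowest amperage expected to produce a sound weld
    under the current (assumed) environmental conditions."""
    current = 40.0  # nominal amperage for this hypothetical joint

    # Cooler parts and a sagging supply voltage both call for a little more current.
    if temp_c < 20.0:
        current += 2.0
    if line_voltage < 228.0:
        current += 1.5

    # Weak clamping wastes energy at the joint, so compensate slightly.
    if clamp_pressure_kpa < 300.0:
        current += 1.0

    return current


print(choose_weld_current(temp_c=18.5, line_voltage=226.0, clamp_pressure_kpa=310.0))  # -> 43.5
```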

Manufacturing instrumentation can make any manufacturing, not just electronics manufacturing, more accurate, with fewer production errors and fewer people required. Unfortunately, this instrumentation isn't easy, especially given the complexities of the modern manufacturing supply chain.

 

Making Manufacturing Instrumentation Function

Information Technology (IT) teams have been making use of instrumentation for decades. It doesn't cost as much to build sensors into software as it does to build them into hardware. As a result, operating systems, applications, and IT equipment of all kinds are absolutely littered with sensors. Because of this, IT teams have been struggling with the amount of data those sensors produce since before the modern microcomputer existed.

 

So Much Data, So Little Time

In the real world, any instrumented infrastructure produces way more information than a single human can possibly process. Even large teams of humans cannot be expected to comb through all the data emitted by even a modest IT infrastructure. Entire disciplines exist within the IT field dedicated to making the data emitted by IT instrumentation understandable. The approaches range from simple filters to sophisticated artificial intelligence (AI) and machine learning (ML) techniques.
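At the "simple filter" end of that spectrum, a sketch like the one below is often enough to forward only the exceptional readings. The metric names and normal ranges here are assumptions made up for the example.

```python
# Minimal threshold filter: forward only readings outside assumed normal bounds.

NORMAL_RANGES = {            # invented example bounds
    "cpu_temp_c": (10.0, 75.0),
    "fan_rpm": (800, 3000),
}

def exceptions_only(readings: dict) -> dict:
    """Keep only the metrics whose values fall outside their normal range."""
    flagged = {}
    for name, value in readings.items():
        low, high = NORMAL_RANGES.get(name, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            flagged[name] = value
    return flagged

print(exceptions_only({"cpu_temp_c": 82.4, "fan_rpm": 1500}))  # -> {'cpu_temp_c': 82.4}
```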

Until recently, this was good enough for most IT teams. Information would be collected and sent to a central location, numbers would be crunched, and only the important data would be forwarded to systems administrators. If this took a few seconds or minutes, that was okay; a brief IT outage was generally acceptable.

But as organizations around the world became more and more dependent upon their IT, the acceptable amount of time it took to act on instrumentation decreased significantly. For many organizations, the acceptable reaction time today is far below what a human could possibly achieve. Modern IT systems in the most advanced organizations thus use powerful AI and ML suites to have their IT infrastructure react to changes reported by the sensor data before human administrators are even aware there's a problem.

Modern manufacturers, as one might imagine, look for manufacturing instrumentation solutions that are capable of also reacting faster than a human. While reading sensors and telling humans a problem has developed is helpful, it's nowhere near as helpful as responding to sensor data in real time.

 

IT Instrumentation vs. Manufacturing Instrumentation

The difference between the two is that IT instrumentation is comparatively easy: it collects data about IT infrastructure and applications from devices that are already fully digital. Manufacturing instrumentation is more challenging. IIoT devices used in manufacturing instrumentation collect data about the physical world. This means collecting analogue data and converting it into digital—and that's a whole other ball game. Physical sensors need to be calibrated, and over time they wear out. Physical sensors are also typically deployed in clusters so that quorum sensing is possible.

Quorum sensing uses multiple independent sensors in order to compensate for calibration drift or sensor malfunction. If one sensor in a cluster reports data that is divergent from its partners, it can be ignored and/or flagged for recalibration. This allows manufacturing to continue with known good sensors until the malfunctioning one can be recalibrated or replaced.
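A minimal sketch of quorum sensing, assuming every sensor in a cluster reports in the same units, might use the cluster median as the reference value and flag anything that strays too far from it:

```python
# Simple quorum check: flag sensors whose readings diverge from the
# cluster median by more than an assumed tolerance.
from statistics import median

def quorum_check(readings: dict, tolerance: float = 0.5):
    """Return (agreed_value, flagged_sensor_ids) for one sensor cluster."""
    ref = median(readings.values())
    flagged = [sensor_id for sensor_id, value in readings.items()
               if abs(value - ref) > tolerance]
    return ref, flagged

value, suspects = quorum_check({"s1": 21.4, "s2": 21.5, "s3": 24.9})
print(value, suspects)  # 21.5 ['s3']  -> s3 gets flagged for recalibration
```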

The complications of analogue sensing, combined with the pressing requirement for real-time responsiveness to sensor data, present real-world challenges for manufacturing instrumentation.

 

Can't Cloud Computing Fix Everything?

IT teams have had to deal with many different and difficult computational requirements. One example of a solution developed by IT vendors is cloud computing.

 

Cloud Computing & BDCA

Cloud computing allows organizations to access seemingly limitless IT infrastructure with the push of a button. While the reasons behind cloud computing are numerous and complex, perhaps the most important one is that cloud computing allows IT teams to operate IT workloads without having to manage or maintain the underlying IT infrastructure. The cloud provider handles that part for them.

Cloud computing has proven very useful for Bulk Data Computational Analysis (BDCA) workloads. There are many types of BDCA workloads, including AI, ML, Big Data, and more; anything where large quantities of data are collected and subsequently need to be analyzed is a BDCA workload. In the past few years, cloud computing has been the destination for the majority of new BDCA projects.

One of the reasons that cloud computing is used for BDCA workloads is the concept of cloud bursting. Cloud workloads—such as the computation workloads used to analyze large datasets—can be spun up only as needed and to whatever scale required. This suits BDCA workloads well because most BDCA workloads only need to generate analyses on a set schedule. End-of-month reports are a popular use case here.

Unfortunately, economies of scale mean that traditional public clouds are centrally located. This allows public cloud vendors to situate their data centers where costs are lowest and simply build really, really big data centers. While this is useful for batch-job style BDCA workloads that run on schedules, it is less than helpful for workloads that require real-time responsiveness.

In order to solve this, edge computing was developed.

 

Edge Computing

Edge computing can be thought of as cloud computing delivered in someone else's data center: the provider's cloud, running on servers located at the customer's own site. Edge computing evolved because IT teams had workloads that required low-latency responsiveness that traditional public cloud computing couldn't provide. Those teams were perfectly capable of creating such infrastructures themselves but simply didn't want the burden and hassle of dealing with them.

 

Meeting New Data Demands

After a meeting of minds, public cloud providers agreed to meet the needs of these customers by installing servers in those organizations' own data centers. This allowed the organizations' IT teams to run workloads on what looked, to them, identical to a region created just for them by the public cloud provider, but one located on the same Local Area Network (LAN) as the rest of their workloads.

These "edge computing" servers allow IoT sensor data to be processed and acted upon far faster than would be possible if that data had to traverse the internet to a public cloud data center, be processed, and then have the results travel back across the internet. Edge computing is enabling a number of new technologies, including driverless cars

 

Use Case: Real-Time Data for Driverless Cars

Driverless cars are a great example of a technology where waiting for data just isn't an option. Cloud computing could help driverless cars by collecting sensor information for all cars in a given area, crunching the data, and sending those cars a map of where everyone and everything is located inside a given radius. This could allow these cars to literally see around corners, making them even safer.

However, even at the speed of light, sending information from a car to the public cloud and back again can take up to a quarter of a second. People can die in a quarter of a second when cars are involved. So moving the processing closer to the cars—say by locating the relevant servers within a few blocks of where cars will be trying to navigate tricky urban environments—can enable technologies that otherwise wouldn't be possible.
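A rough back-of-the-envelope estimate shows why. The sketch below assumes a 1,500 km or 10,000 km path to a cloud data center, light in fiber traveling at roughly two-thirds of its speed in a vacuum, and invented but generous allowances for routing and processing.

```python
# Back-of-the-envelope latency estimate; all distances and overheads are assumed.

SPEED_OF_LIGHT_KM_S = 300_000
FIBER_FACTOR = 2 / 3          # light in fiber travels at roughly 2/3 c

def round_trip_ms(distance_km: float, routing_overhead_ms: float = 30.0,
                  processing_ms: float = 50.0) -> float:
    propagation = (2 * distance_km) / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR) * 1000
    return propagation + routing_overhead_ms + processing_ms

print(round_trip_ms(1_500))   # ~95 ms to a regional data center
print(round_trip_ms(10_000))  # ~180 ms to a distant one; congestion can push this far higher
```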

In the same way, manufacturing can make use of edge computing to enable needed instrumentation. As is usually the case, however, manufacturing has its own twists and turns that not only make edge computing more critical to the process but also present various challenges that have to be overcome.

 

Why Use Edge Computing in Manufacturing?

A common pitch for the relevance of edge computing to manufacturing companies revolves around the need for real-time responsiveness. When trying to keep manufacturing defects near zero on a fast-moving production line, it helps to be able to make use of sensor clusters. A sensor cluster can use quorum sensing to detect that an individual sensor is faulty, and then recalibrate it. However, recalibration must happen very quickly to avoid disrupting the production line.

If it takes 100 to 250 milliseconds to send sensor data over the internet and get a response, then products on the line could be lost, or equipment could be damaged. But if the data can be processed locally in approximately five milliseconds, then manufacturers can recalibrate sensors in real time and/or alter manufacturing equipment settings in response to environmental conditions.
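To see what those milliseconds mean physically, consider a hedged example: on a production line assumed to move at one metre per second, the sketch below compares how much product passes an inspection point before a correction can arrive.

```python
# How far does product travel before a correction can be applied?
# Line speed and latencies are assumed for illustration.

LINE_SPEED_M_PER_S = 1.0

for label, latency_ms in [("cloud round trip", 250), ("edge processing", 5)]:
    travel_mm = LINE_SPEED_M_PER_S * (latency_ms / 1000) * 1000
    print(f"{label}: {latency_ms} ms -> {travel_mm:.0f} mm of product passes uncorrected")
# cloud round trip: 250 ms -> 250 mm of product passes uncorrected
# edge processing: 5 ms -> 5 mm of product passes uncorrected
```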

 

Sensor Overload

Another reason for edge computing's usefulness, one that doesn't get discussed quite so readily, is that manufacturing instrumentation can involve unmanageably large numbers of sensors. These can not only overwhelm network capacity but also produce a huge collection of data, not all of which is actually required. Thus, it is useful to sift through the data locally and forward only what needs to be sent.

Data volumes are most likely to be overwhelming, or to require some form of filtering, precisely where sensors are used in quorums to overcome calibration or aging issues. Here, an individual sensor's readings may be rejected if the other nearby sensors in its quorum do not agree with them. A fully instrumented factory may contain millions of individual sensors grouped into only a few tens of thousands of sensor quorums, and the raw readings from all of those sensors are potentially far more than the local internet connection can reasonably be expected to handle.
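A hedged sketch of that reduction, assuming readings are already grouped by quorum, shows how millions of raw readings collapse into one agreed value per quorum before anything crosses the internet:

```python
# Collapse raw per-sensor readings into one agreed value per quorum before
# anything crosses the internet. Grouping and payload format are assumed.
from statistics import median

def summarize(quorums: dict) -> dict:
    """quorums maps quorum_id -> {sensor_id: reading}; returns one value per quorum."""
    return {quorum_id: median(readings.values())
            for quorum_id, readings in quorums.items()}

raw = {
    "oven_3_temp": {"s1": 181.2, "s2": 181.4, "s3": 188.0},   # s3 drifting
    "press_7_kpa": {"s4": 402.0, "s5": 401.5, "s6": 401.8},
}
print(summarize(raw))  # two numbers forwarded instead of six raw readings
```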

In other edge computing configurations for manufacturing, some sensors are only ever used locally. This could be because they support real-time responsiveness, or because they are only relevant locally, for example, as part of a security solution.

 

Contract Manufacturing

Edge computing is also useful in the increasingly common scenario of contract manufacturers (CMs). CMs have IT solutions independent of the Original Equipment Manufacturers (OEMs) that commission their work. However, many OEMs see benefits in instrumenting their entire supply chain, even those portions of it that have been contracted out.

In this case, OEMs may extend part of their network into the network of the CM using edge computing. The OEM's IT team might place servers into the CM's network that connect back to the OEM's private cloud. Combined with IIoT sensors, these edge computing servers would allow the CM to meet the OEM's instrumentation and supply chain integration goals without impinging upon the CM's own network or requiring radical changes to the CM's network design.
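As a minimal sketch of what that extension might look like, and assuming a purely hypothetical HTTPS endpoint exposed by the OEM's private cloud, an edge server in the CM's facility could forward only the locally summarized readings:

```python
# Hypothetical: an edge server in the CM's facility forwards locally
# summarized readings to the OEM's private cloud. The endpoint URL,
# token, and payload shape are all invented for illustration.
import json
import urllib.request

OEM_ENDPOINT = "https://edge.example-oem.com/api/v1/quorum-summaries"  # hypothetical

def forward_summaries(summaries: dict, api_token: str) -> int:
    body = json.dumps({"site": "cm-plant-12", "summaries": summaries}).encode()
    request = urllib.request.Request(
        OEM_ENDPOINT, data=body, method="POST",
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {api_token}"})
    with urllib.request.urlopen(request) as response:
        return response.status  # 200 on success

# forward_summaries({"oven_3_temp": 181.3}, api_token="...")
```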

Edge computing gives the OEM the ability to view their entire supply chain and manufacturing operation using a consistent interface and integrated set of applications, regardless of whether the individual components are being manufactured in the OEM's facilities or those of a CM. This consistency makes training and supporting CMs easier, as everyone is using the same toolchain.

 

Summary

Cloud computing, which has been around for more than a decade now, is often marketed as the solution to all IT ills. It's not. Cloud computing solves a great many problems, but the speed of light means that giant centralized server farms are only ever going to be so useful.

Edge computing serves two main purposes: extracting signal from noise by locally processing large volumes of data that would not be feasible to send across the internet, and processing specific things locally where and when latency is a concern. Both are useful to manufacturing companies that are increasingly dependent on instrumentation.

Manufacturing can't wait around for light to make it from A to B and back. There's too much on the line and no time for errors. Edge computing solves problems clouds can't, so it’s time to evolve or be left behind.

 

The content & opinions in this article are the author’s and do not necessarily represent the views of ManufacturingTomorrow

Comments (0)

This post does not have any comments. Be the first to leave a comment below.


Post A Comment

You must be logged in before you can post a comment. Login now.

Featured Product

Zeigo Activate by Schneider Electric: Energy efficiency software for manufacturing facilities.

Zeigo Activate by Schneider Electric: Energy efficiency software for manufacturing facilities.

Whether you're responding to new legislation and regulations or getting pressure from stakeholders and customers, Zeigo Activate empowers companies to effectively calculate, track, and reduce their carbon footprint and become more energy efficient. By providing valuable insights, actionable data, and intuitive tools, Zeigo Activate is tailored for businesses at any stage in their energy efficiency journey. Our easy-to-use software allows you to set your emissions baseline and target, receive a customizable project roadmap, and connect to a network of regional solution providers in energy efficiency and renewable energy so that you can put your ambitions into action.