Edge computing meets digital twins to boost IoT analytics and event processing

Edge computing and digital twins were hot topics at the recent AI World conference in Boston, highlighting the growing role of infrastructure and hardware architecture in advanced computing. The two concepts are themselves intimately related, to the extent that some pundits are betting on their combination yielding the next big tech giant.

That edge computing featured prominently at the conference was little surprise, given that AI and machine learning applications consume large amounts of data, often collected from distributed IoT devices, where network bandwidth and latency can be major constraints, along with, more arguably, data privacy.

Edge processing can meet these three challenges, although it also brings others, including potentially higher hardware costs because economies of scale can be lost. It is also true that the implementation, or even definition, of edge computing depends on the balance between latency, hardware costs and bandwidth availability, which will change over time. We have heard arguments that deployment of 5G cellular networks will shift the balance back towards centralized cloud computing, because 5G will reduce latency and increase network capacity.

That argument is a red herring, in the sense that 5G is just part of the overall improvement in both fixed and wireless network capacity, which is occurring in tandem with advances in hardware and AI, as well, of course, as growth in data volumes. As such, 5G will make little difference to the edge computing tradeoffs, because increases in capacity will be cancelled out by further growth in data volumes. If anything, 5G will encourage edge processing, because it will provide the connectivity for more localized processing while there will still be pressure to keep a lot of the data storage and analytics local.

However, edge computing poses software as well as hardware challenges, and that is where the digital twin comes in. The concept emerged originally from simulation in a variety of applications, including verification of chip designs and larger circuits; the idea was to replicate a system in software.

This has spawned a variety of additional applications in which AI and machine learning figure increasingly, such as predictive maintenance and optimization of oil and gas extraction plant, with great potential savings. General Electric subsidiary Baker Hughes is among the players here claiming extravagant improvements in maintenance capability and in training of the associated machine learning models, achieved by simulating the systems in a digital twin on a platform based on Nvidia GPUs.

But the digital twin concept has also morphed into software abstractions of IoT processes, including event tracking and device monitoring. Here the digital twins are not providing mere virtual simulations but running real-world functions.

Their role is focusing attention on matching software to the hierarchical hardware structure around edge computing, which could have three or even four levels radiating out from centralized cloud-based systems, through distributed devices close to the end points, and then perhaps gateways, before finally reaching the end devices themselves.

The objective is to organize the software enshrined in this digital twin model of abstraction such that higher-level twins represent subsystems, themselves possibly spanning several levels, that control the end devices, while lower-level twins represent the end points themselves. Higher-level twins receive data about events from lower-level twins, which in turn interact directly with the physical devices. Twins at all levels can receive data from and send messages to points lower in the hierarchy, gathering information for analytics while transmitting control signals downwards, ultimately to the devices.
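As a rough illustration of that hierarchy, the sketch below shows in Python how a higher-level twin might aggregate events from lower-level twins while passing control messages back down. The class and method names here are hypothetical, chosen for clarity rather than taken from any particular vendor's digital twin API.

```python
class DeviceTwin:
    """Lowest-level twin: mirrors a single physical end device."""

    def __init__(self, device_id, parent=None):
        self.device_id = device_id
        self.parent = parent
        self.last_reading = None

    def on_telemetry(self, reading):
        # Keep local state, then forward the event up the hierarchy.
        self.last_reading = reading
        if self.parent:
            self.parent.on_child_event(self.device_id, reading)

    def on_command(self, command):
        # In a real deployment this would be relayed over the network
        # to the physical device; here we just print it.
        print(f"{self.device_id}: applying command {command}")


class SubsystemTwin:
    """Higher-level twin: represents a subsystem spanning several devices."""

    def __init__(self, name):
        self.name = name
        self.children = {}

    def add_child(self, twin):
        self.children[twin.device_id] = twin
        twin.parent = self

    def on_child_event(self, device_id, reading):
        # Analytics at this level sees events in the context of the
        # whole subsystem, not just one device (threshold is invented).
        if reading.get("temperature", 0) > 90:
            self.send_command(device_id, {"action": "throttle"})

    def send_command(self, device_id, command):
        # Control signals flow back down to the end device.
        self.children[device_id].on_command(command)


# Wire up a two-level hierarchy and push one event through it.
subsystem = SubsystemTwin("cooling-loop")
pump = DeviceTwin("pump-01")
subsystem.add_child(pump)
pump.on_telemetry({"temperature": 95})
```

In practice the subsystem twins would run on gateways or cloud nodes and the device twins close to the edge, but the data and control flows are the same as in this toy version.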

The point of all this is to ensure that every place in the hardware hierarchy has all the information and context it needs to perform the required analytics there and generate effective real-time feedback where that is needed, without being burdened with or bombarded by large amounts of data.

One example we have seen cited is analyzing telemetry from the components of a windmill, where the system can zoom in on data from each component and combine this with relevant contextual data. The latter could be the component’s make, model and service history, which would help predict impending failures in light of the real-time data coming in.

The digital twin model would correlate telemetry data from the three principal components of a hypothetical windmill, the blades, generator and control panel, so that it can be delivered to associated objects within, say, an in-memory data grid, where event handlers would analyze the telemetry to generate feedback and any alerts.
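A minimal sketch of what such an event handler might look like is shown below, again in Python with invented names and thresholds rather than any specific data grid product's API: each component twin holds its contextual data alongside a rolling window of telemetry, and the handler combines the two to decide whether to raise an alert.

```python
from collections import deque


class ComponentTwin:
    """Twin for one windmill component, e.g. blades, generator or control panel."""

    def __init__(self, component_id, make, model, service_history):
        # Static context used to interpret incoming telemetry.
        self.component_id = component_id
        self.make = make
        self.model = model
        self.service_history = service_history       # list of past service dates
        self.recent_vibration = deque(maxlen=100)    # rolling telemetry window

    def handle_telemetry(self, reading):
        """Event handler: combine real-time data with context, alert if needed."""
        self.recent_vibration.append(reading["vibration"])
        avg = sum(self.recent_vibration) / len(self.recent_vibration)

        # Hypothetical rule: less frequently serviced units get a lower threshold.
        threshold = 0.8 if len(self.service_history) < 2 else 1.0
        if avg > threshold:
            return {
                "alert": f"{self.component_id} ({self.make} {self.model}) "
                         f"vibration trending high: {avg:.2f}"
            }
        return None


# Correlate telemetry for the three principal components.
twins = {
    "blades": ComponentTwin("blades", "AcmeBlades", "B-200", ["2021-05-01"]),
    "generator": ComponentTwin("generator", "GenCo", "G-9", ["2020-01-10", "2022-03-15"]),
    "control_panel": ComponentTwin("control_panel", "PanelCorp", "P-1", []),
}

for event in [{"component": "blades", "vibration": 0.9},
              {"component": "generator", "vibration": 0.4}]:
    alert = twins[event["component"]].handle_telemetry(event)
    if alert:
        print(alert["alert"])
```

In an in-memory data grid the twin objects and their handlers would be distributed across the grid's nodes so that analysis runs where the data lands, but the per-component logic would look much like this.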

This may all sound abstract, but it will feature increasingly in the design of edge-based analytics and monitoring systems, because without some such concept the benefits of a distributed or hierarchical architecture will still be overwhelmed by data.
