OpenFog unveils Reference Architecture to ram edge-compute down IoT throats

Network-edge evangelist the OpenFog Consortium has unveiled its Reference Architecture, aimed at providing a universal framework to enable the IoT, 5G, and AI applications that are coming to the fore. Calling it a significant first step towards creating the standards needed to manage these multi-faceted apps, OpenFog is looking to become the center-point of IoT data interoperability.

By the sounds of it, the new Reference Architecture is going to act as a foundation on which to build new standards and approaches – meaning that other standards bodies and industry organizations are going to have to get on board. However, the IoT is an enabling factor in many other potentially lucrative applications, so there’s a strong commercial incentive to ensure that your products and services play nice with the other components in those value chains.

The evolution of computing architectures needs to be quickly revisited in order to explain the ‘fog’ term. In the good old days, a central mainframe carried out all the processing, serving the results to terminals at the edge of the network – i.e. desks inside an office building.

The emergence of the PC meant that mainframes were no longer required to do as much processing, as those personal computers were able to run their own processes and applications at the edge. Cloud computing servers were born of the need for more powerful computing than was on offer in those PCs, and the rise of the cloud has parallels with the days of the mainframe.

However, with the rise of mobile networks, the edge has expanded from desks within a building to truly remote locations – enabling all kinds of wireless applications, with the trade-off for that extended range usually being less processing power on tap, in order to preserve battery life.

As such, those remote edge devices are typically pretty lightweight if they don’t have a wired power supply, and tend to simply send readings back to cloud applications – where the data they generate can be put to use by gargantuan banks of processors and storage, hidden somewhere in the clouds and out of sight of the user.

Edge-computing is the term used to describe moving some of that computational workload from the cloud to nearer the source (the ground) – and that’s where the term ‘fog’ arises from. So: like cloud, but closer to the ground. The main benefits that fog advocates point to are improvements in security, latency, agility, efficiency, and cognition. These edge devices are prime candidates for AI and machine-learning technologies, to process that data efficiently.

When it comes to security and privacy, processing data at the network-edge can ensure that it stays off the airwaves and out of networking infrastructure that could be compromised or leak. Similarly, keeping it at the edge would reduce the latency of applications that need to take action based on the data – as the data can be analyzed at the gateway, rather than having to travel all the way to a central cloud application that would then send a command back to the gateway.

On top of that, the costs of relaying that information to the cloud can be removed, which is especially beneficial to wireless applications that might have to rely on an MNO’s cellular network to collect information. The data bill for these gateways could be slashed, and for the user, the cloud storage and processing costs can also be significantly reduced – as long as you trust the network-edge boxes to be able to carry out the computation themselves.
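To make that concrete, here is a minimal sketch of the sort of gateway-side logic being described – the names and thresholds are hypothetical, not taken from the OpenFog spec – in which readings are analyzed locally so the gateway can act immediately, and only a compact summary is sent upstream:

```python
# Hypothetical fog-gateway sketch: act on sensor data locally, and only
# send an aggregate (plus any anomalies) to the cloud, cutting both
# round-trip latency and cellular data costs.
from statistics import mean

TEMP_LIMIT_C = 85.0  # assumed local threshold for immediate action

def process_batch(readings, actuate, send_to_cloud):
    """Handle one batch of sensor readings entirely at the edge."""
    anomalies = [r for r in readings if r > TEMP_LIMIT_C]
    if anomalies:
        # Act at the gateway instead of waiting on a cloud round-trip.
        actuate("shutdown_overheating_unit")
    # Upload a summary rather than every raw reading.
    send_to_cloud({
        "count": len(readings),
        "mean_temp_c": mean(readings),
        "anomalies": anomalies,
    })

# Example usage with stand-in callbacks:
process_batch(
    [72.1, 73.4, 90.2],
    actuate=lambda cmd: print("local action:", cmd),
    send_to_cloud=lambda summary: print("uplink:", summary),
)
```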

The consortium was founded back in November 2015, by ARM, Cisco, Dell, Intel, Microsoft, and Princeton University. It has since grown to 55 members, including AT&T, GE, Hitachi, Sakura Internet, Schneider Electric, and Shanghai Tech University (all Contributor members), as well as Hon Hai (Foxconn), Fujitsu, Mitsubishi, NEC, NTT, OSIsoft, PrismTech, relayr, and Toshiba – and a plethora of academic institutions.

PrismTech and its parent ADLink are good examples of the way that edge-computing is being pitched at customers. We covered a partnership that saw ADLink gateways running PrismTech software connected to a version of IBM’s Watson cloud platform designed to run at the network-edge.

The OpenFog Reference Architecture itself is built around a core set of pillars, which collectively form what the Consortium describes as a horizontal system-level architecture. The eight pillars are: Security, Scalability, Openness, Autonomy, RAS (reliability, availability, and serviceability), Agility, Hierarchy, and Programmability.

The document has a pretty sage way of summarizing fog computing’s purpose in turning data into actionable wisdom. Shortened to DIKW (Data, Information, Knowledge, Wisdom), it states: “Data gathered becomes Information when stored, and retrievable [information] becomes Knowledge. Knowledge enables Wisdom for autonomous IoT.”

For resiliency, fog nodes can be linked together as a mesh to provide load-balancing, fault tolerance, and data sharing, while minimizing communication with the cloud. On the hardware side, the document covers CPUs, GPU accelerators, and FPGAs, as well as RAM arrays, SSDs, and HDDs – plus Hardware Platform Management (HPM) devices.
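A toy sketch of that mesh behavior might look like the following – a hypothetical design for illustration, not drawn from the Reference Architecture itself – with each node running tasks locally, offloading to the least-loaded healthy peer when saturated, and escalating to the cloud only as a last resort:

```python
# Hypothetical fog-mesh sketch: local execution first, then failover to the
# least-loaded healthy peer, with the cloud as the option of last resort.
class FogNode:
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity
        self.load = 0
        self.healthy = True
        self.peers = []

    def submit(self, task):
        if self.healthy and self.load < self.capacity:
            self.load += 1
            return f"{task} ran on {self.name}"
        # Load-balancing / fault tolerance: pick the least-loaded healthy peer.
        candidates = [p for p in self.peers if p.healthy and p.load < p.capacity]
        if candidates:
            target = min(candidates, key=lambda p: p.load)
            target.load += 1
            return f"{task} offloaded to {target.name}"
        return f"{task} escalated to cloud"  # only when the whole mesh is busy or down

# Example: three meshed nodes, one of which has failed.
a, b, c = FogNode("node-a", 1), FogNode("node-b", 2), FogNode("node-c", 2)
a.peers, b.peers, c.peers = [b, c], [a, c], [a, b]
c.healthy = False
print(a.submit("analyze-frame-1"))  # runs locally on node-a
print(a.submit("analyze-frame-2"))  # node-a is full, so it offloads to node-b
```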

The document outlines the software side of things too, as well as a rather detailed use case examining how the spec would work in an end-to-end airport visual security system (vehicle arrival, baggage, security, transit), as it negotiates and manages the different interactions between the disparate systems.

As for next steps, the group will set about the gargantuan task of bringing other standards into the ecosystem. Backed by testbeds and APIs, OpenFog certification would then denote the systems and standards that are compatible with the spec.

“Just as TCP/IP became the standard and universal framework that enabled the internet to take off, members of OpenFog have created a standard and universal framework to enable interoperability for 5G, IoT, and AI applications,” said Helder Antunes, the chairman of the OpenFog Consortium, and Cisco’s senior director for the Corporate Strategic Innovation Group. “While fog computing is starting to be rolled out in smart cities, connected cars, drones, and more, it needs a common, interoperable platform to turbocharge the tremendous opportunity in digital transformation. The new Reference Architecture is an important giant step in that direction.”
