While the Industrial Internet of Things (IIoT) is maturing rapidly, it is increasingly dependent on Artificial Intelligence (AI) and Machine Learning (ML) tools to manage its data flows. This raises a host of challenges spanning connectivity, security, storage and modeling requirements. Designers of IIoT devices and larger deployments must build in contingencies for shifts in these areas or risk falling short.
While IIoT is gaining in maturity, the overall market still has many challenges to overcome. Some of these are technical barriers to IoT device operation, such as connectivity standards and battery technology, but the picture is wider than that. The world of IIoT not only requires IoT hardware to operate reliably and efficiently in all environments but also relies on supporting networks and services. The result is that IIoT deployments cannot be viewed as isolated technology upgrades or productivity drives, but as highly interconnected ecosystems sensitive to a wide range of variables – in particular the developing world of AI.
Although AI has long existed in a conceptual form, bringing the principles into the real world has proven a complicated business, but an essential one for many potential IIoT applications. Monitoring multiple data streams from a network of tiny sensors, detecting anomalies and spotting patterns that can be flagged up for preventative maintenance or as potential efficiencies is a vital part of IIoT – without it, the vast amounts of data generated are just noise.
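To illustrate the kind of anomaly detection described above, a minimal rolling z-score detector could run against each sensor stream; this is a sketch, and the window size and threshold are illustrative assumptions rather than recommended values:

```python
from collections import deque
import math

class RollingAnomalyDetector:
    """Flag readings that deviate sharply from a rolling-window baseline."""

    def __init__(self, window=50, threshold=3.0):
        self.window = deque(maxlen=window)
        self.threshold = threshold  # z-score cutoff; 3 sigma is an assumption

    def is_anomaly(self, reading):
        if len(self.window) < self.window.maxlen:
            self.window.append(reading)
            return False  # still building a baseline
        mean = sum(self.window) / len(self.window)
        var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
        std = math.sqrt(var) or 1e-9  # guard against flat signals
        z = abs(reading - mean) / std
        self.window.append(reading)
        return z > self.threshold
```

In a real deployment this per-stream check would be one small component feeding pattern-recognition models, but it captures the basic idea: without some such filter, the readings are indeed just noise.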
For example, a fully autonomous AI vehicle will generate approximately 40 terabytes of data for every eight hours of driving – a staggering volume that precludes any manual analysis. This is a challenge for IIoT and AI designers – ensuring that modeling and training datasets are highly accurate and field-tested before deployment.
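To put that figure in context, a quick back-of-the-envelope calculation (assuming decimal units) shows the sustained data rate such a vehicle produces:

```python
TERABYTE = 10 ** 12          # decimal terabyte, an assumption; use 2**40 for TiB
data_bytes = 40 * TERABYTE   # 40 TB per shift, per the figure above
shift_seconds = 8 * 3600     # eight hours of driving

rate_bps = data_bytes / shift_seconds   # bytes per second
rate_gbit = rate_bps * 8 / 10 ** 9      # gigabits per second

print(f"{rate_bps / 10 ** 9:.2f} GB/s, {rate_gbit:.1f} Gbit/s")
# prints "1.39 GB/s, 11.1 Gbit/s"
```

Sustaining roughly 1.4 GB/s rules out manual review and makes naive off-device streaming impractical, which is exactly the pressure behind the edge-processing approaches that follow.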
Transmitting and receiving the volumes of data generated by IIoT and needed for analysis by AI creates significant challenges in itself, which has led to the development of “Edge AI” to process as much as possible on the device before burdening the network. That network comes in a bewildering array of possibilities, from Wi-Fi to 4G to fiber, to the newer low-power LoRa and NB-IoT networks designed to run alongside 5G. Managing the inevitable outages and latencies in these networks is an ongoing challenge, which might be overcome with mesh-style network architectures. But even these can fail, especially in IIoT scenarios where there may not always be the option of avoiding a single point of failure.
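A minimal sketch of the Edge AI idea: the device batches raw readings locally and forwards only summary statistics plus outliers, so most bytes never touch the network. All names and thresholds here are illustrative assumptions, not a standard payload format:

```python
import statistics

def edge_summarize(readings, threshold=3.0):
    """Reduce a raw sensor batch to a compact payload before transmission.

    Only summary statistics and outlier readings leave the device; the
    3-sigma outlier cutoff is an assumption for illustration.
    """
    mean = statistics.fmean(readings)
    std = statistics.pstdev(readings) or 1e-9  # guard against flat batches
    outliers = [r for r in readings if abs(r - mean) / std > threshold]
    return {
        "count": len(readings),
        "mean": mean,
        "std": std,
        "outliers": outliers,
    }
```

For a batch of 100 readings, the payload above shrinks the transmission to a handful of numbers unless something genuinely unusual appears, which is the whole point of pushing processing to the edge.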
An autonomous AI vehicle sits at the higher end of the scale: it carries a wide array of different sensors to detect other vehicles, hazards and people, demands immediate responses to stimuli such as a sudden obstacle appearing, and is self-contained and mobile. Combinations of these requirements are relevant to most IIoT scenarios too. For example, acoustic sensors deployed to detect vibration in plant machinery need to respond to a sudden pitch change very rapidly or risk being no better than dumb sensors that merely chart the meltdown of a costly machine. Most IIoT scenarios will include multiple sensor types to enable false positives to be filtered out, and more extreme applications will involve some level of redundancy too.
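The false-positive filtering mentioned above is often implemented as a simple quorum across independent sensor types; a minimal sketch, where the quorum value of two is an illustrative assumption:

```python
def fused_alarm(sensor_flags, quorum=2):
    """Raise an alarm only when at least `quorum` independent sensors agree.

    A single tripped sensor (e.g. one acoustic spike) is treated as a
    possible false positive; agreement across sensor types is required
    before escalating. The quorum of 2 is an assumption.
    """
    return sum(bool(flag) for flag in sensor_flags) >= quorum
```

Real systems weight sensors by reliability rather than counting votes equally, but the quorum pattern shows why deploying multiple sensor types pays off.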
Arguably the biggest challenge is maintaining network security and ensuring that operational, personal or audit data cannot be leaked. While personal/customer data may not be an issue for most IIoT scenarios, the potential for commercially motivated attacks to obtain or corrupt operational or audit data is significant.
Keeping devices patched against the latest vulnerabilities is an ongoing battle, as any device owner or enterprise IT team knows, but IIoT devices aren’t so easily managed. Low downstream bandwidth can be a challenge for networks designed mainly to transmit data upstream, while local storage limitations and power restrictions can make regular updates problematic or impossible. In addition, leaving the bootloader unlocked to enable ad-hoc updates is potentially dangerous: an attacker who can introduce a rootkit or similar compromise gains a resilient foothold on the network for long periods.
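A hedged sketch of update verification under these constraints: here the device checks a SHA-256 digest received over an authenticated channel before applying an image. This is a simplification for illustration; a production design would verify an asymmetric signature against a key baked into the hardware root of trust rather than trust a bare digest:

```python
import hashlib
import hmac

def verify_firmware(image: bytes, expected_digest: str) -> bool:
    """Accept a firmware update only if its SHA-256 digest matches the
    value obtained over an authenticated channel (an assumption here;
    real devices verify a signature, not just a hash)."""
    actual = hashlib.sha256(image).hexdigest()
    # compare_digest avoids leaking information via timing differences
    return hmac.compare_digest(actual, expected_digest)
```

Even this minimal gate prevents a corrupted or truncated download from being flashed, which matters on networks where retransmission is expensive.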
Hardware Roots of Trust: TPM or FPGA?
Fortunately, the question of IIoT security has seen plenty of attention, with the Industrial Internet Consortium recently publishing the Data Protection Best Practices White Paper, a document intended to address the question of data security in IIoT networks. The core recommendation is that IIoT deployments should rely on hardware-based security (so-called Hardware-based Root of Trust or HRoT), which not only authenticates the device bootloader but establishes a chain of trust from that point up through the device OS, applications and then across the network, preventing bootloader or OS manipulation. Unfortunately, it’s here that standards and approaches can vary, with some manufacturers embedding TPM chips to handle the cryptographic requirements, and others using FPGA-based chips to similar effect.
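The chain of trust described above can be sketched in Python: each boot stage is measured and checked against values anchored in hardware before the next stage is allowed to run. The stage names and hash table here are hypothetical; a real HRoT performs this in silicon (a TPM extending PCRs, or equivalent FPGA logic), and this sketch only mirrors the control flow:

```python
import hashlib

# Hypothetical measurement table, standing in for values anchored in the
# hardware root of trust at manufacture time.
TRUSTED_HASHES = {
    "bootloader": hashlib.sha256(b"bootloader-v1").hexdigest(),
    "os":         hashlib.sha256(b"os-v1").hexdigest(),
    "app":        hashlib.sha256(b"app-v1").hexdigest(),
}

def verify_chain(stages):
    """Verify boot stages in order, halting at the first mismatch.

    `stages` maps stage name -> image bytes. Returns (ok, failed_stage)
    so the caller can refuse to boot and report where trust broke.
    """
    for name in ("bootloader", "os", "app"):
        digest = hashlib.sha256(stages[name]).hexdigest()
        if digest != TRUSTED_HASHES[name]:
            return False, name  # stop: nothing after this point is trusted
    return True, None
```

The key property, whether implemented via TPM or FPGA, is that each link only runs if the previous link vouched for it, so bootloader or OS manipulation breaks the chain rather than going unnoticed.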
The Road Ahead Clears
Although, at first sight, the challenges facing IIoT and AI are considerable, the hard work of early adopters, a maturing technology stack and simple innovation have solved many of the biggest issues, while others are awaiting a consensus. While AI offers the opportunity to manage and interpret the vast volumes of data in question, it’s the underlying architecture that must enforce security and enhance visibility, as well as future-proof deployments as much as possible. This will be an essential area of focus for IIoT designers.