The latest buzz in IIoT is edge computing, which is evolving in response to the demand to move processing closer to sensors to decrease latency and improve efficiency.
Edge computing is a mesh network of micro data centers that process and store critical data locally before transferring it to a central data center or cloud storage repository. Edge computing helps shield cloud computing systems from disruptions related to data transfer.
With a containerized microservice architecture, cloud servers become control nodes for intelligent edge devices, performing summary analytics while leaving real-time decision making to edge servers.
The IoT device at the edge must handle computing, storage and network connectivity, all in a small form factor. Depending on the requirement, processed data from the sensors can be sent to the cloud either in parts or all at once.
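As a rough sketch of that choice (the class name, batch size and uplink callback are hypothetical), an edge node can buffer processed readings and push them to the cloud either in parts, as the buffer fills, or all at once on demand:

```python
from typing import Callable, List

class EdgeBuffer:
    """Buffers processed sensor readings and sends them to the cloud
    either in parts (when the buffer fills) or all at once (on flush)."""

    def __init__(self, send: Callable[[List[float]], None], batch_size: int = 4):
        self.send = send              # uplink to the cloud (assumed to exist)
        self.batch_size = batch_size
        self.buffer: List[float] = []

    def add(self, reading: float) -> None:
        self.buffer.append(reading)
        if len(self.buffer) >= self.batch_size:  # send in parts
            self.flush()

    def flush(self) -> None:                     # or send everything at once
        if self.buffer:
            self.send(self.buffer)
            self.buffer = []
```

The same buffer serves both policies: a small `batch_size` approximates streaming, while calling `flush()` only at the end sends the data in one shot.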
Where Is Edge Computing Required?
Edge computing can be deployed in instances where:
- IoT devices have poor connectivity.
- The applications rely on machine learning over large amounts of data and require quick feedback.
- There’s a need to keep data within the factory premises, for security and privacy.
- Raw data at the edge needs preprocessing to reduce downstream computation.
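The preprocessing in the last point can be as simple as windowed averaging before data leaves the device. A minimal sketch using only the Python standard library (the window size is illustrative):

```python
from statistics import mean
from typing import List

def downsample(raw: List[float], window: int = 4) -> List[float]:
    """Average every `window` raw sensor samples into one value,
    shrinking the volume of data the edge node must process and transmit."""
    return [mean(raw[i:i + window]) for i in range(0, len(raw), window)]
```

For example, `downsample([1, 1, 3, 3, 5, 5], window=2)` collapses six raw readings into three averaged ones before anything is sent upstream.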
Typical applications for edge computing include facial recognition and intelligent navigation. It’s worth noting that edge computing is not efficient if the IoT device has to be constantly connected to a central cloud.
How Is Edge Computing Different From Fog Computing?
Fog computing refers to actions based on interactions between edge devices and the cloud. Edge computing refers to IoT devices with computational capability, which act as gateways between sensors and personnel within the factory. In a sense, edge computing is a subset of fog computing.
Edge computing brings the power closer to the end user application, so instead of devices needing to constantly connect to centralized cloud infrastructure for instructions or analysis, they’re given the ability to accomplish these tasks on their own.
Edge Computing Security
The level of security in edge computing is generally higher, since data isn’t sent to the cloud over the network. In edge computing, data is decentralized, which makes it harder to compromise.
Although edge computing is a relatively new technology, old problems remain, including weak login credentials, zero-day vulnerabilities, lack of updates and less than optimal network architecture.
On the other hand, edge devices on their own can prove to be prone to attacks and hacks. This is something that should be kept in mind while designing security architectures.
Together, Cloud and Edge computing architectures can store and process data more effectively.
The following security measures can be taken at the Edge:
Securing Sensor Data
- Inserting Gaussian noise into the data at a chosen variance to lower the chance of sniffing attacks.
- Splitting data into blocks and shuffling them to hinder Man-in-the-Middle (MITM) attacks.
- Implementing a Public Key Infrastructure to encrypt each data block.
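A minimal sketch of the first two measures using only the standard library (the variance, block size and shared seed are illustrative; on its own, block shuffling is obfuscation and no substitute for authenticated encryption):

```python
import random

def add_gaussian_noise(samples, sigma=0.1):
    """Perturb each reading with zero-mean Gaussian noise so that a
    sniffed stream reveals less about the true values."""
    return [x + random.gauss(0.0, sigma) for x in samples]

def shuffle_blocks(data: bytes, block_size: int, seed: int):
    """Split data into fixed-size blocks and permute them with a seed
    shared by both endpoints; the receiver inverts the permutation."""
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    order = list(range(len(blocks)))
    random.Random(seed).shuffle(order)
    return b"".join(blocks[i] for i in order), order

def unshuffle_blocks(shuffled: bytes, block_size: int, order):
    """Restore the original block order on the receiving side."""
    blocks = [shuffled[i:i + block_size] for i in range(0, len(shuffled), block_size)]
    restored = [b""] * len(blocks)
    for pos, original_index in enumerate(order):
        restored[original_index] = blocks[pos]
    return b"".join(restored)
```

In practice the permutation (or the seed that derives it) would be agreed over a secure channel, and each block would additionally be encrypted as the third bullet describes.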
IoT devices, especially in smart grids, are prone to data tampering and spoofing attacks; these can be mitigated with the help of a Public Key Infrastructure (PKI), Diffie-Hellman key exchange, intrusion detection techniques and monitoring for modified input values.
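The Diffie-Hellman exchange mentioned above reduces to modular exponentiation. A toy sketch (the prime and generator are illustrative; real deployments use standardized 2048-bit groups or elliptic curves):

```python
import secrets

# Toy finite-field Diffie-Hellman (illustrative parameters only).
p = 4294967291          # the largest prime below 2**32
g = 5                   # generator

a = secrets.randbelow(p - 2) + 1    # device A's private key
b = secrets.randbelow(p - 2) + 1    # device B's private key

A = pow(g, a, p)        # sent over the wire by device A
B = pow(g, b, p)        # sent over the wire by device B

shared_a = pow(B, a, p)  # A's view of the shared secret
shared_b = pow(A, b, p)  # B's view of the shared secret
assert shared_a == shared_b
```

An eavesdropper sees only `A` and `B`; recovering the shared secret requires solving the discrete logarithm problem, which is what makes the exchange safe to run over an untrusted network.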
For data at rest, the AES algorithm with a 256-bit key can be used to ensure privacy, while the Secure Sockets Layer (SSL) protocol, now superseded by Transport Layer Security (TLS), can be used to establish secure communication between a server and a client.
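On the transport side, Python's standard-library `ssl` module can build a hardened TLS client context for an edge device talking to a cloud endpoint. A minimal sketch:

```python
import ssl

def make_client_context() -> ssl.SSLContext:
    """Build a TLS client context with sane defaults: certificate
    verification on, hostname checking on, and TLS 1.2 as the floor."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx
```

`create_default_context()` already enables certificate and hostname verification; pinning the minimum version ensures legacy SSL/early-TLS handshakes are refused.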
Artificial Neural Networks (ANNs) and rule matching can be used for threat detection as a large number of heterogeneous IoT devices are transmitting and processing heterogeneous data on multiple levels (hypervisor, operating system and applications).
Machine learning techniques can be applied to identify zero-day attacks with higher accuracy. These techniques train algorithms such as support vector machines on a model of benign software, so that any abnormal behavior triggers a detection event. Apart from stealing data or modifying core system functionality, the presence of malware can degrade system performance.
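As a toy stand-in for that approach (a simple z-score baseline rather than an actual support vector machine), the idea of training only on benign behavior and flagging deviations can be sketched as:

```python
from statistics import mean, stdev
from typing import List

class BenignBaseline:
    """Fit on benign measurements only (e.g. syscalls/sec, CPU load);
    flag anything far from the learned baseline as potentially malicious.
    A statistical stand-in for the one-class SVM approach described above."""

    def __init__(self, threshold: float = 3.0):
        self.threshold = threshold  # how many standard deviations count as abnormal

    def fit(self, benign: List[float]) -> None:
        self.mu = mean(benign)
        self.sigma = stdev(benign) or 1e-9  # avoid division by zero

    def is_anomalous(self, x: float) -> bool:
        return abs(x - self.mu) / self.sigma > self.threshold
```

Because the model never sees malicious samples during training, it needs no signatures, which is exactly why this family of techniques can catch zero-day behavior.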
In healthcare, should a fog system become compromised, it is essential that critical data and functionality remain protected by strong integrity checks, and that the system be quarantined as soon as malicious activity appears within the host operating system.
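Those integrity checks can be realized with keyed hashes. A sketch using the standard library's `hmac` (the pre-shared key is hypothetical; real systems would provision it per device):

```python
import hashlib
import hmac

SECRET = b"device-provisioned-key"  # hypothetical pre-shared key

def sign(record: bytes) -> bytes:
    """Attach an HMAC-SHA256 tag so tampering with a patient record
    is detectable before the data is acted upon."""
    return hmac.new(SECRET, record, hashlib.sha256).digest()

def verify(record: bytes, tag: bytes) -> bool:
    # compare_digest avoids leaking information through timing side channels
    return hmac.compare_digest(sign(record), tag)
```

A compromised node that alters a record without the key cannot produce a valid tag, so downstream consumers can refuse the data and trigger quarantine.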
Edge Computing Converges IT and OT
IT consists of computing/processing systems and data storage. OT consists of the hardware and software required for running and monitoring production systems such as SCADA (Supervisory Control and Data Acquisition), DCS (Distributed Control Systems), and ICS (Industrial Control Systems). Recent technologies have aimed to converge IT and OT onto common ground for easy communication and action, and edge computing is accelerating this convergence.
Companies at the forefront of Industrial IoT (IIoT) have established common ground for IT and OT to function as one unified system; health monitors are an example of such converged systems. By performing computation closer to the sensors (the hardware), edge computing brings IT and OT together.
With IT (specifically data science and ML models), users can build algorithms that continuously learn and adapt in order to provide better outcomes. OT can then automate its workflow while providing insights for better monitoring and anomaly detection. Factories that have integrated their OT/IT teams have seen results like lower energy consumption, better product quality and asset health, and less downtime.
Here’s a comparison of standard IT and OT with respect to networks:
| Industrial OT Network | Enterprise IT Network |
| --- | --- |
| Standard IEEE 802.3 Ethernet and proprietary (non-standard) versions; standard IETF Internet Protocol (IPv4) and proprietary (non-standard) alternatives | Standard IEEE 802.3 Ethernet; standard IETF Internet Protocol (IPv4 and IPv6) |
| Switch-level and device-level topologies; Ring topology is predominant for both; Redundant Star for switch topologies is emerging; standard IEEE, IEC and vendor-specific Layer 2 resiliency protocols | Redundant Star topology is predominant; standard IEEE, IETF and vendor-specific Layer 2 and Layer 3 resiliency protocols |
| Service Level Agreement (SLA): mean time to recovery (MTTR) in minutes or hours | Service Level Agreement (SLA): mean time to recovery (MTTR) in hours or days |
| Primarily local traffic between local assets: information, control, safety, motion, time synchronization, energy management; smaller frames for control traffic; industrial application-layer protocols: CIP, PROFINET, IEC 61850, Modbus TCP, etc. | Primarily non-local traffic to remote assets: voice, video, data; larger packets and frames; standard application-layer protocols: HTTP, SNMP, DNS, RTP, SSH, etc. |
| Low latency and low jitter; data prioritization (QoS) at Layers 2 and 3 | Low latency and low jitter; data prioritization (QoS) at Layer 3 |
| Open by default; must be closed by configuration and architecture; industrial security standards (e.g. IEC, NIST); inconsistent deployment of security policies; no line-of-sight to policies, the Enterprise or the Internet | Pervasive enterprise security standards; strong security policies; line-of-sight across the Enterprise and to the Internet |
We can now see that edge computing is essential and that IoT has progressed far enough to welcome it. Proof of this is that big players like AWS (Greengrass), Google (Cloud IoT Edge) and Microsoft (Azure IoT Edge) have recognized the future of computing at the edge. Edge computing is also believed to have matured enough to open new avenues for analytics in IoT.
Here are some recent events in Edge computing:
AWS Snowball Edge is a data migration and edge computing device with 100 TB of capacity and support for computing tasks via Amazon EC2 applications or AWS Lambda functions. Customers use these devices for data collection, machine learning, processing and storage in environments with intermittent connectivity (such as manufacturing, industrial and transportation settings) or in extremely remote locations (such as military or maritime operations) before shipping them back to AWS.
Intel launched Myriad™ X, the world’s first vision processing unit (VPU) to ship with a dedicated Neural Compute Engine to deliver artificial intelligence (AI) capabilities to the edge in an incredibly low-power, high-performance package. This will give devices the ability to see, understand and interact with the world around them in real time.
Written by Bhavani Raman, IIoT Practice Head at Ideas2IT Technologies.