Whatever happened to our fourth industrial revolution? A combination of IoT, cloud computing, and big data analytics promised higher productivity, predictive maintenance, and end-to-end process automation for all industrial operators. And while those benefits are appearing in fits and starts, we have yet to see the holistic reimagining of processes and systems that is the ultimate promise of Industry 4.0.
There’s one major reason Industry 4.0 is off to a rocky start: IoT solutions remain siloed, monolithic, each trapped in its own narrow area of functionality. Maybe that was inevitable, given the organic, ad hoc nature of the market, but it’s ironic given that connectivity is one of the key values of IoT and Industry 4.0 writ large.
The fact is, siloed IoT solutions are hampering innovation. Luckily, new system architectures are offering distributed data collection, storage, and processing – along with a localized and secure central hub. If Industry 4.0 is broken, these edge-to-cloud platforms are the fix.
How Far Edge Computing Unifies Industrial IoT
Most industrial IoT offerings are full-stack solutions: They include sensors or devices, data storage, cloud applications, and, finally, the dashboard or user interface. Users don’t have control of the data that flows through the stack; they don’t even own the data, which typically belongs to the IoT manufacturer.
When you have a different full-stack IoT solution for every function of your business, the IT architecture becomes a nightmare to use, let alone maintain: You’ve got dozens of dashboards. You have to update devices on-site. There’s no communication between systems. The pace of innovation becomes too slow.
What’s missing is a system with distributed storage and processing. Edge computing is a start, but it takes a far edge platform to fully realize this vision.
In an edge computing architecture, data storage occurs close to the source (usually the sensor or device). Far edge computing takes this distributed approach a step further, processing data where it’s collected, i.e., on or near the devices that do the data collection.
Far edge computing lets you collect and process data that wasn't previously available, or that was available in such volume that it caused storage problems in the cloud. Those are exactly the problems that distributed, local-cloud storage solves.
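To make that concrete, here is a minimal sketch of far edge processing, assuming a hypothetical vibration sensor and node (the names and fields are ours, not any vendor's): raw samples are reduced to a compact summary on the device, and only the summary travels onward.

```python
# Hypothetical far edge node: aggregate raw readings locally and forward
# only a compact summary, instead of shipping every sample to the cloud.
from datetime import datetime, timezone
from statistics import mean


def summarize_window(readings: list, sensor_id: str) -> dict:
    """Reduce a window of raw samples to the handful of fields worth keeping."""
    return {
        "sensor_id": sensor_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "count": len(readings),
        "mean": mean(readings),
        "min": min(readings),
        "max": max(readings),
    }


# 1,000 raw vibration samples collected on the device...
raw_samples = [(0.02 * i) % 1.0 for i in range(1000)]

# ...become a single record sent onward to the local cloud.
summary = summarize_window(raw_samples, sensor_id="pump-07-vibration")
print(summary)
```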
Making Far Edge Computing Actionable
Far edge computing alone isn't enough. You also need a platform that creates a local cloud, combining multiple far edge data "nodes" into a single, easy-to-use system. A platform like that can make distributed data storage and processing available not just to users but to third-party applications, without another full-stack IoT system gumming up the works. Event-driven data logging makes such a platform possible.
Event-driven architecture creates an immutable event stream of data. That lets you see correlations between all the different data points and events, and add new applications, algorithms, and functionality as you see fit. You maintain ownership and control of all the data. When you have access, you can innovate.
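As a rough illustration, assuming nothing about any particular product's API, an immutable event stream can be thought of as an append-only log that any application is free to replay and correlate:

```python
# Illustrative append-only event log: events are added, never modified or
# deleted, so every application sees the same history.
from dataclasses import dataclass, field
from typing import Optional


@dataclass(frozen=True)
class Event:
    source: str                      # which device or application emitted it
    kind: str                        # e.g. "temperature" or "feed_waste"
    payload: dict = field(default_factory=dict)


class EventLog:
    def __init__(self) -> None:
        self._events = []

    def append(self, event: Event) -> None:
        self._events.append(event)

    def replay(self, kind: Optional[str] = None) -> list:
        """Any consumer can read the full history and draw its own correlations."""
        return [e for e in self._events if kind is None or e.kind == kind]


log = EventLog()
log.append(Event("water-sensor-3", "temperature", {"celsius": 11.2}))
log.append(Event("feed-camera-1", "feed_waste", {"pellets_lost": 42}))

# A new application added later can still correlate the full history.
print(log.replay("temperature"))
```

Because the log never changes, adding a new algorithm is a matter of replaying the stream, not re-instrumenting the plant.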
This sort of distributed platform creates a local cloud, separate from the broader internet, but with full connectivity for every necessary application. That leads to much more secure integrations with OT and control systems, a crucial consideration for critical infrastructure that you definitely don’t want exposed to external actors. We call a solution like this an edge micro cloud platform. This technology could be the key to breaking down the walls that limit IoT effectiveness, finally realizing the true promise of Industry 4.0.
How Edge Micro Cloud Platforms Work—and Achieve the Promise of Industry 4.0
The innovative companies that drive IoT implementation are the first to feel the sting of siloed data, the broken promise of Industry 4.0. Take the example of aquaculture: A fish farm might run more than 30 individual IoT solutions, one monitoring water conditions, another measuring feed waste, still another counting fish lice, and on, and on. How helpful is that when operators have to wrangle 30 separate dashboards, with no simple way to correlate data across the various systems?
Or imagine a cargo ship with 10 or 15 distinct digital systems (navigation, engine controls, cargo monitoring, etc.). Each has its own vertical architecture, and for security reasons, none of them can connect to the internet. When that ship comes into port, technicians have to manually update each of these siloed systems, costing shippers hundreds of thousands of dollars in downtime.
In these scenarios (and many, many more), an edge micro cloud platform implementation begins by setting up a local cloud on the asset, whether that's a building, a ship, a manufacturing line, or a coastal fish pen. Individual applications run locally, in a micro cloud with private connections to the asset's local cloud. For AI systems, you may occasionally need to connect to a broader commercial cloud to train the algorithms, but once your models are ready for use, you close that connection to ensure full security while the AI runs on your micro cloud.
Event-driven architecture maintains data integrity across applications and data sources. Crucially, this system separates the data collectors (sensors) from the compute layer (applications) and from the user interfaces (dashboards). That provides full functionality across discrete operations within a single platform, reducing maintenance costs and simplifying the addition of new applications. And once the edge micro cloud platform is set up, you can easily customize the system without worrying about security, which is built into the platform.
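Here is a purely illustrative sketch of that separation, using a simple in-memory publish/subscribe bus as a stand-in for the platform (topic names and handlers are hypothetical): sensors publish, applications compute, and dashboards subscribe, with none of the three coupled to the others.

```python
# In-memory stand-in for the platform's event bus, showing how collectors,
# compute, and dashboards stay decoupled.
from collections import defaultdict


class Bus:
    def __init__(self) -> None:
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(message)


bus = Bus()


# Compute layer: an application reacts to readings; it knows nothing about
# which dashboard, if any, will display the result.
def check_temperature(message: dict) -> None:
    if message["celsius"] > 15:
        bus.publish("alerts", {"msg": f"high temperature: {message['celsius']}"})


bus.subscribe("raw.temperature", check_temperature)

# UI layer: a dashboard subscribes to alerts, not to sensors directly.
bus.subscribe("alerts", lambda m: print("DASHBOARD:", m["msg"]))

# Collection layer: a sensor just publishes its readings.
bus.publish("raw.temperature", {"celsius": 16.4})
```

Swapping in a new dashboard or a new analytics application means adding a subscriber, not rewiring the sensors.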
Edge micro cloud platforms help every industrial operator achieve fact-based, cost-efficient operations that maximize output while keeping downtime to an absolute minimum. In short, they deliver on the promises of Industry 4.0.