When Klaus Schwab, founder and executive chairman of the World Economic Forum, coined the phrase “Fourth Industrial Revolution,” he had in mind a vision of the future marked by massive technological breakthroughs. Robotics, AI, extended reality, nanotech, decentralized decision-making and autonomous machines would fast become the buzzwords of Industry 4.0. Industry 4.0, and the pursuit of greater efficiency, automation and autonomy at scale, popularized a previously little-known architectural model: the Internet of Things, or IoT. Tied to IoT is a plethora of internet-connected devices, which SoftBank Group has estimated will reach a trillion by 2025.
What Is the IoT Architecture Framework?
There are different ways to describe the IoT architecture framework, but I like to break it into three logical steps:
The IoT framework begins with sensors and devices such as radar, lidar, cameras, and weather or thermal instruments that each produce raw data.
The second element, the processor, has been the subject of industry fervor in recent years under the banner of edge computing. Computing power needs to sit as close as possible to the things generating the data, both to satisfy ultra-low-latency requirements and to save the cost of data backhaul and needless storage. Software and related algorithms then process this raw data into actionable information: algorithms infer the present state and yield a decision or course of action.
Finally, that decision is used by the actuator, the third element in the framework. The actuator is the machine or thing taking the prescribed action, such as braking a car that is about to collide with an object or powering down a factory machine that is overheating.
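The three steps above can be sketched as a simple sense-process-actuate pipeline. This is an illustrative toy, not any vendor's API: the sensor reading, the 90 °C threshold, and all function names are hypothetical, and a real edge deployment would run this loop continuously on hardware near the machine.

```python
from dataclasses import dataclass

# Hypothetical raw data from a thermal sensor on a factory machine.
@dataclass
class Reading:
    sensor_id: str
    temperature_c: float

def sense() -> Reading:
    """Step 1: a sensor or device produces raw data (stubbed here)."""
    return Reading(sensor_id="press-04", temperature_c=97.5)

def process(reading: Reading, limit_c: float = 90.0) -> str:
    """Step 2: edge processing infers a present state and yields a decision.

    The 90 °C limit is an assumed threshold for illustration only.
    """
    return "power_down" if reading.temperature_c > limit_c else "continue"

def actuate(decision: str) -> str:
    """Step 3: the actuator carries out the prescribed action."""
    actions = {
        "power_down": "machine powered down",
        "continue": "no action taken",
    }
    return actions[decision]

if __name__ == "__main__":
    # One pass through the pipeline: raw data -> decision -> action.
    print(actuate(process(sense())))
```

Keeping `process()` on an edge node next to the sensor, rather than in a distant data center, is what satisfies the latency and backhaul-cost constraints described above.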
Sounds easy enough, right? Well, not exactly.
IoT 1.0 & the Early Years of IoT
The first years of IoT exploration can best be described as an extended experiment. It is a period I call IoT 1.0, because versioning presumes a natural course of improvement; indeed, there will be an IoT 2.0. IoT 1.0 is characterized by customized point solutions and a fatally fragmented vendor ecosystem: everybody, it seemed, was reinventing the same wheels. It’s not surprising that early surveys of IoT 1.0 by companies such as Cisco and Microsoft revealed that a staggering share of enterprise IoT projects, nearly 75 percent, failed to progress beyond the experimental stage.
The cost and complexity of IoT 1.0 isn’t a failure so much as a natural course of technology evolution. IoT 1.0 is merely the genesis stage of a massive technology paradigm. As the industrial strategist Simon Wardley once put it, “every activity, practice, data type, etc., starts in the Genesis stage [and] things evolve through vendors competing to improve the product (supply-side evolution) and through the market adapting to take better advantage of the thing (demand-side evolution).”
The “Rhyming” Nature of IoT 1.0 and IoT 2.0
“History never repeats itself, but it does rhyme,” goes a quip often attributed to Mark Twain. It applies to technology today. The rhyme we see is between today’s IoT and the client-server world of 1980s computing. In the client-server era, every enterprise environment was an island, scale was difficult to achieve, and commonality of implementation was complicated and cumbersome. Businesses had to hope they had a decent IT department to get the most out of their computing investment. These challenges should resonate with anyone implementing IoT today.
The shift from on-premises enterprise IT to IaaS and then to full cloud platforms led to a jump in the ROI of computing spend because of economies of scale, common platforms, and a dramatic reduction in in-house maintenance. Things got less complicated and less expensive, all abetted by vendors realizing that the future of commercial business would be driven by the web services developer, not some archaic central IT manager or even the CIO.
The green shoots of this pattern are emerging in IoT. As capabilities evolve and new thinking occurs, we’ll arrive at IoT 2.0. The IoT developer is to IoT 2.0 what the web services developer was to the pioneers of cloud computing, and IoT 2.0 will place that developer at the center of decision-making.
Are We Ready for IoT 2.0?
Here is the big newsflash: Neither the technologies nor the business models of today’s hyperscale cloud providers are designed for the world of IoT 2.0.
IoT 2.0 will be underpinned by an OS for the edge: a common platform designed to automatically deliver computing power where it needs to be to accommodate the imperatives of IoT. This OS will provide distributed computing that embraces a Schwabian decentralized framework driven by protocols rather than products. It will be designed for low latency and for deployment models that break developers free from the iron shackles of legacy compilers and orchestrators, which truly have no place at the edge. Perhaps most importantly, a common OS for the edge will embrace multi-tenancy as a business prerogative, thereby expanding the perimeter of the edge economy.
IoT 2.0 means content, code, and data delivery will be achieved at dramatically lower cost. But it also means a changing economic model for the supply chain: in a decentralized value chain, everyone from data center owners and tower companies to network providers and hardware OEMs can generate a scalable return on the edge.
The Future of Industry Growth & IoT 2.0
In the big picture, IoT 2.0 will usher in the evolutionary transition from custom-built systems to highly scalable and widely applicable product ecosystems. It will mark the inflection point, as Wardley would describe it, to the hockey stick of industry growth and prosperity. Industry 4.0 is about to come of age.