No Code facilitates the reuse of predefined components, typically through a drag-and-drop interface or a web form. Such platforms usually bundle common concerns like identity and access management. They target one particular task and audience, such as web development, spreadsheets, analytics, or marketing automation. Most importantly, they don’t require any code to stitch components together, reducing the need for engineers to spend time architecting databases, APIs, or internal workflows.
Low Code, on the other hand, has a different set of goals and user personas in mind. The major misconception is that the “low” in Low Code means the platform is meant for people with hardly any coding knowledge.
In my view, the goal of a Low Code platform is to enable developers to code and deploy their apps at a fast rate, with minimal setup effort and with the added confidence in what the platform provides out of the box. In that sense, a Low Code platform reduces the complexity of the application development process, shortening the time to market. The basic building block of the Low Code platform is usually a small snippet of code, wrapped as a reusable component that is applicable across different use cases, just like a Lego brick.
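As a sketch of that Lego-brick idea, a snippet becomes reusable once it declares its inputs and outputs so a platform engine can wire it to other bricks. The `Component` wrapper and all names here are hypothetical, not any particular platform’s API:

```python
# A hypothetical "Lego brick": a small code snippet wrapped as a
# reusable component with declared inputs and outputs, so a platform
# engine can connect it to other bricks without custom glue code.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Component:
    name: str
    inputs: list
    outputs: list
    run: Callable[..., Any]

def make_component(name, inputs, outputs):
    """Decorator that turns a plain function into a platform component."""
    def wrap(fn):
        return Component(name, inputs, outputs, fn)
    return wrap

@make_component("celsius_to_fahrenheit",
                inputs=["celsius"], outputs=["fahrenheit"])
def convert(celsius: float) -> float:
    return celsius * 9 / 5 + 32

print(convert.run(100))  # 212.0
```

The point is not the conversion itself but the metadata: because the brick declares what it consumes and produces, the platform can validate and compose it without the developer writing integration code.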
So, how does that differ from No Code, which is also about “stitching” components (code snippets) together? First, Low Code should allow developers to develop and publish new features, which requires coding. Second, the platform must allow these code snippets to be connected and synchronized asynchronously, which is not as trivial a problem as it sounds.
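To illustrate why the asynchronous stitching is the harder part, here is a minimal sketch in plain Python, with `asyncio` standing in for whatever engine a real platform would provide: two independent snippets run concurrently and must hand results to each other without blocking.

```python
import asyncio

# Minimal sketch (not any platform's actual API) of stitching two
# snippets asynchronously: they run concurrently and exchange
# results over a queue instead of calling each other directly.

async def producer(queue):
    for value in [1, 2, 3]:
        await queue.put(value)   # emit a result downstream
    await queue.put(None)        # sentinel: no more work

async def doubler(queue, results):
    while (item := await queue.get()) is not None:
        results.append(item * 2)

async def pipeline():
    queue, results = asyncio.Queue(), []
    await asyncio.gather(producer(queue), doubler(queue, results))
    return results

print(asyncio.run(pipeline()))  # [2, 4, 6]
```

Even in this toy version, questions of ordering, backpressure, and termination (the sentinel) appear; a real platform has to solve them generically for arbitrary graphs of components.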
There are two ways to look at the second challenge. The first is to create a vertical-specific Low Code platform, which limits what “Low Coders” can do by providing a safety net: a predefined way of how stitching should be done, leaving no room for mistakes (or for maneuvering, depending on how you look at it). Since this is driven by a “make the common case fast” philosophy, it also limits possible uses, as anything more challenging, or not envisioned by the tool provider, becomes hard, if not impossible, to do. If the purpose of the end application deviates even slightly from what the predefined components provide, one has to extend the code or rewrite the piece entirely. Most Low Code platforms require coding in specific languages, sometimes proprietary ones, making the end goal even harder to reach.
The second way is to provide a horizontal, general-purpose Low Code platform, which facilitates the creation of custom components (using code) and provides the engine, the APIs, and the user interfaces required to combine and execute them as part of a larger application. This approach offers much greater flexibility, with the caveat that vertical aspects need to be built on top with somewhat more effort (as the platform concepts are domain agnostic). In the next section, we will discuss why we believe this to be the better approach in the long run.
Turing Completeness and Its Relation to Low Code and RPA Tools
Experienced developers and system architects are always skeptical when they hear about yet another model-based Low Code platform. They feel that they will soon discover “gotchas”: instead of the framework helping them with implementation, they will need to work around its limitations.
There is a term in computability theory called Turing completeness. If somebody says, “My new thing is Turing complete,” that means that, in principle, it could be used to solve any computational problem. General-purpose programming languages are Turing complete. When serverless hit the mainstream, it was widely accepted as the best candidate for the “Low Code Lego brick” approach. And that brings us to the story of Turing-complete automation. Suppose we use code snippets to implement the application logic. In that case, we need a potent and flexible rules engine that can orchestrate them without resorting back to coding it all in a programming language. Otherwise, we would lose the Low Code benefits entirely.
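A toy illustration of that rules-engine idea (the snippet names and rule format are invented for this example): the snippets stay as plain functions, while the application logic lives in a declarative structure that the engine interprets, rather than in hand-written control flow.

```python
# Toy rules engine sketch: snippets are plain functions, and the
# "application logic" is a declarative rule list the engine walks.
# Real rules engines are far richer; this only shows the shape.
snippets = {
    "validate": lambda order: order["qty"] > 0,
    "price":    lambda order: order["qty"] * 4.5,
}

rules = [  # declarative wiring: run "price" only if "validate" passes
    {"when": "validate", "then": "price"},
]

def run_rules(order):
    for rule in rules:
        if snippets[rule["when"]](order):
            return snippets[rule["then"]](order)
    return None  # no rule fired

print(run_rules({"qty": 2}))  # 9.0
```

The trade-off is exactly the Turing-completeness question above: the simpler the rule format, the safer it is, but the fewer programs it can express.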
Frontend, Backend, and the Rise of Serverless and Integration Architects
To use a stereotype, front-end folks are all about React/Vue, webpack, CSS, and user experience. They are not necessarily interested in all the nitty-gritty details that happen under the hood. They work closely with business owners to make sure that everything looks pretty and usable. The Low Code platform means little to them, as they will interact with it through APIs that already provide a high-level abstraction. They dislike No Code even more, as an excellent user experience and a beautiful vertical app are impossible to deliver using No Code tools alone.
OT people know everything about SCADA, PLCs, machines, and industrial processes. These are the folks from the trenches. They often feel somewhat agitated when the other personas talk about IIoT, sensing trespassers in a realm of experience and deep knowledge built over decades, something you can’t grasp and “teach yourself” in 21 days. They prefer problems to be defined through more or less causal relationships: automation rules built on knowledge modeling techniques and, if needed, enhanced by ML. They resist the reverse, where black-box models go beyond anomaly detection and prediction to guide repair actions and the like.
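That rule-first, ML-second preference can be sketched in a few lines. Everything here is illustrative: the thresholds, the pump scenario, and the stand-in anomaly score are invented, not taken from any real system.

```python
# Sketch of the causal, rule-first style OT engineers prefer:
# explicit domain knowledge comes first, and an ML score (here a
# trivial stand-in) only enhances the rules, never replaces them.
def anomaly_score(vibration_mm_s):
    """Placeholder for a real ML anomaly model; returns 0.0-1.0."""
    return min(vibration_mm_s / 10.0, 1.0)

def check_pump(temp_c, vibration_mm_s):
    if temp_c > 80:                              # hard domain rule first
        return "shutdown: overtemperature"
    if anomaly_score(vibration_mm_s) > 0.7:      # ML as an enhancement
        return "inspect: abnormal vibration"
    return "ok"

print(check_pump(85, 2.0))  # shutdown: overtemperature
print(check_pump(60, 9.0))  # inspect: abnormal vibration
```

The causal rule stays readable and auditable, which is exactly what the black-box alternative lacks.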
Here I noticed a bit of a generation gap. Younger data scientists are very good at Python, as university curricula have been heavily influenced and improved over the last decade by the needs of the industry. Previous generations are still more comfortable with either Matlab or No Code tools alone.
Nevertheless, when data scientists work on a particular use case, such as anomaly detection or predictive modeling, they face other challenges that need to be addressed before even reaching the Low Code platform:
- Which type of ML algorithm should be used for this problem?
- Which ML platform fits the problem best?
- Is the quality of the data good enough to solve this problem?
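The third question is the one most amenable to a quick script. Here is a minimal sketch of a pre-modeling data-quality check; the field names, sample records, and any threshold you would apply are illustrative only:

```python
# Minimal pre-modeling data-quality check: how much of each field
# is missing? Records and field names are invented for illustration.
records = [
    {"temp": 21.5, "pressure": 1.01},
    {"temp": None, "pressure": 0.99},
    {"temp": 22.1, "pressure": None},
    {"temp": 23.0, "pressure": 1.02},
]

def missing_rate(rows, field):
    """Fraction of rows where `field` has no value."""
    return sum(r[field] is None for r in rows) / len(rows)

for field in ("temp", "pressure"):
    rate = missing_rate(records, field)
    print(f"{field}: {rate:.0%} missing")  # flag fields above, say, 20%
```

Checks like this come before any question of which platform or algorithm to use: if the answer is bad, the other two questions are moot.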
As mentioned earlier, No Code platforms enable non-technical users to build applications or ML models by dragging and dropping pieces of software or data onto a canvas. These ML/AI platforms allow users without previous coding experience, or even machine learning knowledge, to build a machine learning model from a dataset without writing any code.
With BigML, as one example of this kind of AI platform, you can create a machine learning model from scratch in a simple way, without needing to know much about coding, using either their dashboard (No Code) or their Python SDK (Low Code). The application makes it possible to experiment with a particular dataset, try out different ML algorithms, and fine-tune hundreds of hyperparameters.
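In miniature, the Low Code side of that workflow looks like the sketch below: algorithm selection and hyperparameter tuning as a plain-Python grid search. The toy threshold classifier, the tiny dataset, and the two-parameter grid are all invented for illustration; an SDK like BigML’s wraps far richer equivalents of these steps.

```python
# "Try out algorithms and fine-tune hyperparameters" in miniature:
# a grid search over a toy threshold classifier. Data and grid are
# invented for illustration, not from any real platform.
from itertools import product

data = [(0.2, 0), (0.4, 0), (0.6, 1), (0.9, 1)]  # (feature, label)

def accuracy(threshold, invert):
    correct = 0
    for x, y in data:
        pred = (x < threshold) if invert else (x >= threshold)
        correct += int(pred == y)
    return correct / len(data)

# "hundreds of hyperparameters" in miniature: sweep just two of them
best = max(product([0.3, 0.5, 0.7], [False, True]),
           key=lambda p: accuracy(*p))
print(best, accuracy(*best))
```

A Low Code platform’s value is that loops like this, plus data handling, evaluation, and deployment, collapse into a handful of SDK calls instead of bespoke scripts.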
Business owners don’t care about coding, and why should they? Still, the idea of the citizen developer attracts them. The notion that everyone in the organization is empowered to contribute and deliver working software in less time is every business owner’s dream. No more need to talk to those terrible software people or set up long and expensive integration projects!
In my mind, until we get NLP integrated into automation creation, this remains, to a certain extent, wishful thinking.
How About the Future? NLP to the Rescue?
Is it possible to solve the problem of No Code automation tools in a new, disruptive way? CodexAI is about to disrupt the software development industry: you type a sentence, and the code comes out.