What if managing change were as simple as manipulating different types of logical abstractions? In software engineering, this is the conceit behind model-driven software development (MDSD).
MDSD is not a new idea; however, thanks to the popularity of so-called low-code development platforms (LCDP), MDSD is in the midst of a moment. Proponents say the combination of LCDP and MDSD has the potential to insulate businesses against the effects of change. They argue, first, that MDSD tools position businesses to better accommodate change, and, second, that low-code tools permit businesses to pivot rapidly to exploit changing conditions.
Analyst Eric Kavanagh, CEO of the Bloor Group, highlighted one low-code product – Mendix – on a recent episode of DM Radio, the weekly data management-themed radio program he hosts.
LCDP tools such as Mendix are also MDSD platforms. This makes sense; the two complement one another, for the most part. For example, the purpose of MDSD is to encapsulate domain-specific business knowledge into reusable models; in theory, this should free developers to focus on the technical bits. And, in fact, LCDP tools like Mendix actually equip business people with a visual, drag-and-drop app-dev environment, effectively enlisting them in the software engineering process. Business “developers” build, test, and perfect functional applications that encapsulate the stuff they know best – viz., the objects, rules, processes, etc. that are specific to a business domain – and then hand these proto-applications off to skilled developers.
“They developed a visual programming language to allow business and technology people to sit down at the table and map out … new products [and] new workflows,” Kavanagh told listeners, adding that by abstracting business rules at a logical level, Mendix and similar products “allow … business people to sit down and see the workflow and understand” what is happening. In the same way, Kavanagh argued, low-code, MDSD-based tools such as Mendix aim to promote collaboration between the business and IT.
“The people in the company can actually sit down and map out how this stuff is supposed to work, and then that [conceptual model is] what gives you your digital transformation as opposed to just telling developers, ‘Look we want to be able to do this,’” he told listeners.
MDSD and LCDP explained
Think of MDSD as roughly analogous to logical data modeling in the world of relational databases. First, analysts or business subject matter experts create different types of models that encapsulate the objects, relationships, rules, etc. that are germane to each business domain. Basically, these models encapsulate the business logic of each domain. Then, an MDSD tool uses these models to generate code in a supported programming language. This is similar to how a data modeling tool uses a logical data model as a basis to generate code (DDL scripts) optimized for specific database systems. In both cases, modeling software translates from a higher (i.e., a logical model or data model) to a lower (a programming language or DDL) form of abstraction. LCDP technologies take this logic even further. Whereas MDSD tools and methods presuppose tight collaboration between business and IT stakeholders, low-code technologies enlist business power users to build and test-drive applications that they then hand off to skilled developers, who focus on coding application (as distinct from business) logic.
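To make the analogy concrete, here is a minimal, hypothetical sketch of that translation step: a logical model, expressed as plain data, is mechanically translated into a lower-level artifact (in this case, a DDL script). The model format, the type mapping, and the generate_ddl function are illustrative assumptions for this sketch, not the metamodel or code generator of Mendix or any other MDSD tool.

```python
# Illustrative sketch only: a toy "logical model" and a toy generator that
# translates it into SQL DDL. Real MDSD platforms use far richer metamodels.

# Hypothetical logical model for one business entity.
customer_model = {
    "entity": "Customer",
    "attributes": [
        {"name": "customer_id", "type": "identifier"},
        {"name": "full_name", "type": "text"},
        {"name": "region", "type": "text"},
        {"name": "created_on", "type": "date"},
    ],
}

# Mapping from logical attribute types to one concrete SQL dialect.
TYPE_MAP = {
    "identifier": "INTEGER PRIMARY KEY",
    "text": "VARCHAR(255)",
    "date": "DATE",
}


def generate_ddl(model: dict) -> str:
    """Translate the higher-level model into a lower-level artifact (DDL)."""
    columns = ",\n  ".join(
        f"{attr['name']} {TYPE_MAP[attr['type']]}" for attr in model["attributes"]
    )
    return f"CREATE TABLE {model['entity'].lower()} (\n  {columns}\n);"


if __name__ == "__main__":
    print(generate_ddl(customer_model))
```

A real MDSD tool generates full application scaffolding, not a single CREATE TABLE statement, but the direction of travel is the same: from a declarative model that business stakeholders can read to an executable artifact that machines can run.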
In this way, proponents claim, low-code tools not only help improve collaboration between the business and coders, but (eo ipso) have the potential to accelerate the development of new business applications, services, and workflows. For the same reasons, LCDP tools could make it easier for businesses to incorporate new functions or new data into existing apps, services, and workflows – and, by extension, the processes they power. Supporters say that low-code tools can simplify the creation of new customer-facing products and services, along with the workflows that support them. They believe LCDP tools can likewise make it easier to accommodate changes to business operations: for example, to support business expansion into new regions, reorganization in existing regions, or absorption of one region by another. Finally, they argue, low-code tools give enterprises a rapid way to accommodate business change, such as the system integration required to onboard a new supplier or to launch a new business partnership. In sum, the case for MDSD + low-code is that the combined technologies have the potential to simplify any type of app-dev work that (a) has to do with business functions, services, or workflows; (b) is recurrent; and (c) can be formalized and encapsulated into reusable components.
Sheryl Koenigsberg, head of global product marketing with Mendix, shared more background on the product, its history, and the evolution of low-code as a software development modus vivendi. To start with, she said, Mendix debuted in 2005 as a kind of WYSIWYG tool for business “developers.” “We … introduced a tool for business developers to create their [own] logic flow, where they could go on to a web-based app and start developing their app, and it wasn’t a prototype: it was an actual working app or database,” she explained.
The idea was not just to bootstrap the development of business apps, but (to switch metaphors) to ensure that developers and business people were reading from more or less the same script, she told Kavanagh. To this end, Mendix sought to put business people in the driver’s seat, equipping them with a low-code, WYSIWYG-like environment they could use to design apps that, though functional, were also intended as empirical proofs-of-concept of what the business needed. “They could bring [in] professional developers and say ‘This is what I want to do,’ and the professional developers could pick up right there,” she said.
Over time, she explained, Mendix evolved to address the needs of an emerging class of business power user.
“We’ve seen the rise of citizen developers … and those are people who … [have] all the Excel macros figured out, or the person on the team everybody goes to for like the PowerBI stuff,” she said. “Those guys are great candidates to … start building stuff in low-code. They have the mindset, they have the skill set.”
Low-code proponents point to what they claim is another distinct advantage: LCDP technologies help businesses do more with less. What is more, they promise to free skilled software engineers to focus on hard problems, on creative solutions, on what they (i.e., proponents) call “value-creating” work, as distinct from the types of recurrent, repeatable problems that MDSD and LCDP technologies aim to formalize and encapsulate in reusable applications and workflows. “We have four or five developers that … work in Mendix and they accomplish more than a team of, no lie, probably 15 to 20 developers,” Conway Solomon, CEO with Mendix customer WRSTBND, a company that provides event-management software and services, told Kavanagh. “So, what kind of cost savings is that? Especially as a small company that has a lot of ambitions, where you know, like, a lot of extra money has been [spent] on payroll, you can do it in a fraction of the cost and have the same outcome … if not better, and so we use that to our advantage.”
A useful tool, yes, but not a panacea
One problem is that MDSD and low-code technologies are most useful in greenfield scenarios: in businesses that do not already have substantial investments in software. In other words, in businesses that have not yet taken on large amounts of technical debt. In these greenfield environments, low-code and MDSD technologies give businesses useful tools with which to create and change apps, services, or workflows.
These businesses are fortunate: they can, in effect, model and code from scratch. Alas, in mature environments, as in most Global 2000 firms, low-code + MDSD is … not so much a no-go as a slow-go.
First, this approach ignores the problem of technical debt, which constrains the business's ability to realize the benefits of technologies such as MDSD and LCDP. Second, it presupposes detailed knowledge of the digital workflows and services that span business processes. (These workflows and services are not necessarily transparent – even to business analysts and subject matter experts.) Third, it presupposes insight into the (often poorly understood) IT systems, applications, services, etc. that power these workflows and services.
Like it or not, the sine qua non of model-driven development, like that of data modeling in data warehouse design, is accurate and detailed knowledge of the anatomy of the business – and, moreover, of its undergirding IT infrastructure. (In a greenfield environment, this is much less of a problem.) Absent this knowledge, modeling can only succeed on the basis of what amounts to, in effect, the prior reverse engineering of the business, its core processes and services, their real-world workflows, and the IT systems that support them. In large organizations that have taken on a significant amount of technical debt, the work of instantiating business logic in reusable code – i.e., the bulk of the work that MDSD and LCDP tools purport to simplify – is the low-hanging fruit.
More challenging and frustrating is the work of figuring out how to apply this logic to and across the IT assets that knit together workflows and sustain essential business services. The IT infrastructure of a large organization is a lot like an archaeological site, and the work of managing, classifying, and maintaining its hardware and software “artifacts” – some of them undocumented and/or black-box systems – is akin to the work of an archaeological dig. So it is that coders squander enormous amounts of time and energy digging through the historical strata of an IT infrastructure, laboring to determine how to access, exchange data with, and schedule tasks to execute in its more ancient artifacts – unearthing, along the way, all kinds of forgotten technical-debt Band-Aids, some of which will have metastasized over time.
This is not in any sense a niche problem. In data warehouse design, for example, it is taken as a given that top-level business representations will not always correspond to the structure of the business as it exists in reality: that both the business and IT are apt to misapprehend themselves. Nor is it uncommon for IT to possess imperfect information about what, exactly, older systems, applications, services, etc. actually do: e.g., what data lives in them, how it is created, their upstream and downstream dependencies, and other pertinent facts. In such cases, the usual recourse is for analysts, subject matter experts, and data modelers to attempt to reverse-engineer the relevant processes and their workflows from the bottom up. (This last is similar to the bottom-up approach employed in dimensional modeling, as against the top-down perspective used in entity-relationship modeling.)
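To give a flavor of what this bottom-up work can look like at the database layer, the sketch below uses SQLAlchemy's schema reflection to recover tables, columns, and foreign-key dependencies from an existing database. The connection string and the legacy.db file are assumptions for illustration; reverse-engineering business processes and workflows obviously involves far more than schema inspection.

```python
# Bottom-up reverse engineering at the database layer: introspect an existing
# (legacy) database and print what can be recovered of its structure.
# Assumes SQLAlchemy is installed and a SQLite file named "legacy.db" exists;
# substitute your own connection string for a real system.
from sqlalchemy import create_engine, inspect

engine = create_engine("sqlite:///legacy.db")
inspector = inspect(engine)

for table in inspector.get_table_names():
    print(f"Table: {table}")
    for column in inspector.get_columns(table):
        print(f"  column {column['name']}: {column['type']}")
    for fk in inspector.get_foreign_keys(table):
        # Foreign keys hint at upstream and downstream dependencies between entities.
        print(f"  references {fk['referred_table']} via {fk['constrained_columns']}")
```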
Nor is MDSD immune to the natural shocks that business is heir to. Imagine, for example, an M&A scenario between two businesses, one of which – the acquiring company – is an MDSD practitioner, the other of which is not. In this case, the acquiring company is arguably little better off for all of its MDSD-enforced discipline and transparency: at a minimum, it must reverse-engineer, map, and document the domain-specific objects, rules, workflows, and processes specific to the company that it is absorbing, along with the archaeology of that company’s IT artifacts. Ironically, then, some of the benefits the acquiring company might have expected to realize via its prior investments in MDSD and LCDP technologies – e.g., an improved ability to respond to or accommodate business change – will be partially offset by the acquired company’s technical debt.
On the other hand, the combination of MDSD + LCDP could provide a pragmatic framework – a method and a means – to direct the acquiring company as it undertakes the reengineering work that is a prerequisite to integrating its business operations with those of the company it has acquired. Business and IT people, iterating rapidly in a piecemeal agile process, could use model-driven, low-code tools to reengineer the acquired company’s digital workflows and services. This would not happen overnight, of course. And these projects would likely be less totalizing than tactical: i.e., low-code app-dev would complement conventional app-dev. But MDSD and LCDP tools could prove useful in accelerating chunks of this work.
The point is that the combination of MDSD and low-code technologies is not a universal prescription. To posit that it is challenging to design and maintain a modeling schema which is at once expressive and supple enough to encompass the totality of business experience is radically to understate the case. (The late Stanley Rosen put it best when – invoking Kurt Gödel’s first incompleteness theorem – he observed that “every attempt to systematize life or to govern it by a set of axioms rich enough to encompass the totality of experience results in a contradiction.” Thus: models and modeling.) The combination of LCDP + MDSD = a useful tool – one that, although suitable for a large number of users and use cases, is also likely to be misused, or misapplied, in others. A tool, at the risk of belaboring the point: not a prescriptive panacea.
The qualified case for MDSD + LCDP
Vendors such as Mendix seem to get this. They tend to emphasize a pragmatic, as against a formal or absolutist, case for low-code app-dev. Koenigsberg, for example, alluded not only to the well-known shortage of software engineers, but also to the frequency of burnout among skilled engineers in particular. One reason for this, she told Kavanagh, is that expert coders are asked to squander a portion of their ingenuity and creativity on rote, tedious, repetitive work – or, in the context of application maintenance, on tasks that amount to keeping the lights on.
Low-code tools and MDSD methods address this in a few ways, she argued. First, they provide a methodology and a software framework that businesses can use to create reusable design patterns. Second, they enlist business people themselves to perform the work of bootstrapping applications and workflows. These business “developers” work in a visual environment and assemble apps and workflows by manipulating pre-built components. Third, they free skilled software engineers to focus on innovative solutions, to tackle creatively taxing problems, etc.
“There’s the cost savings from a development perspective that is real for a company like Conway’s, but there’s also the developer shortage,” she told Kavanagh. “I read … the other day that if we took every computer science major, graduating … this year, there would still be seven times the number of openings for software developers in the United States. So … I think a big part of solving that talent gap is broadening our idea of who is a developer.”
About Vitaly Chernobyl
Vitaly Chernobyl is a technologist with more than 40 years of experience. Born in Moscow in 1969 to Ukrainian academics, Chernobyl solved his first differential equation when he was 7. By the early 1990s, Chernobyl, then in his early 20s, along with his oldest brother, Semyon, had settled in New Rochelle, NY. During this period, he authored a series of now-classic Usenet threads that explored the design of Intel’s then-new i860 RISC microprocessor. In addition to dozens of technical papers, he is the co-author, with Pavel Chichikov, of Eleven Ecstatic Discourses: On Programming Intel’s Revolutionary i860.