3 Ways to Keep Your Options Open

Posted February 3, 2021 | Technology |

As architects, our priority should be to gain an acceptable degree of confidence in key feasibility uncertainties, using the methods we outlined in a recent Cutter Business Technology Journal article. We must know whether something that can make or break our solution from a desirability or financial viability perspective is doable, and how. There's no two ways about it.

Yet, as always with architecture, solutions that keep our options as open as possible are better. This is particularly the case for high-liability feasibility uncertainties, provided the associated complexity or optimization tradeoff (there usually is one) is acceptable. The goal is to make hard-to-change decisions as late as possible in the lifecycle, only once they really must be made. If a feasibility uncertainty can be resolved in two ways with acceptable tradeoffs, choose the more open one.

We should also always try to keep our solutions adaptable to changes in what is desirable. Desirability uncertainties are the easiest to get wrong, even after testing, because they involve human beings. In this case, it is as important for the architect to understand the high-liability desirability uncertainties that have been addressed as the ones that remain question marks. These are likely to become the cause of pivots in the future.

While it is impossible to predict the future, the ability to adapt to the factors we already know are most likely to prompt a solution's future pivot can make the difference between an elegantly evolving architecture and one that must be thrown away. This is especially true for data architectures, where there is a strong need for quick turnaround to support exploratory work and new use cases in areas such as data science and artificial intelligence (AI) research.

In our experience, the following patterns and techniques are useful for keeping our digital and data architectures open to changes:

  1. Use domain-driven design and contracts. This technique is an evolution of the principle of decoupling in complex systems, almost as old as software development itself, and a clear pillar of an evolutionary architecture. It forces modularization and enables independent evolution and optimization of different parts of a system behind clear abstractions, with few to no ripple effects. The use of domain-driven design and contracts is a fundamental tenet of digital architectures.
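A minimal sketch of this idea in Python: a domain module depends only on a contract (here a `typing.Protocol`), so concrete implementations can evolve or be swapped independently. The names (`PaymentGateway`, `issue_invoice`) are hypothetical illustrations, not from the article.

```python
from typing import Protocol

# Hypothetical domain contract: billing logic depends only on this
# abstraction, never on a concrete payment implementation.
class PaymentGateway(Protocol):
    def charge(self, customer_id: str, amount_cents: int) -> str:
        """Charge the customer and return a transaction id."""
        ...

# One concrete implementation; it can be optimized or replaced
# behind the contract with no ripple effects on the billing domain.
class FakeGateway:
    def charge(self, customer_id: str, amount_cents: int) -> str:
        return f"txn-{customer_id}-{amount_cents}"

def issue_invoice(gateway: PaymentGateway, customer_id: str, amount_cents: int) -> str:
    # The domain logic sees only the contract, not the implementation.
    return gateway.charge(customer_id, amount_cents)

print(issue_invoice(FakeGateway(), "c42", 1999))  # txn-c42-1999
```

Because the dependency points at the abstraction, a production gateway, a stub for testing, or a future replacement all satisfy the same contract.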

  2. Use expand and contract data schemas. This technique applies to database schema design as well as interface design. When changing data structures, a staged expand-and-contract approach is much easier on the users of your data or the consumers of your contract. Instead of making the change in one step, you first make a nondestructive change, typically by adding (and potentially duplicating) data rather than renaming or removing structures, and then retire what needed to be removed at a later point in time.
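The staged approach can be sketched with an in-memory SQLite database. The scenario (renaming a `fullname` column to `display_name`) is a hypothetical example: instead of a destructive rename, we expand first, let old and new consumers coexist, and contract only later.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, fullname TEXT)")
con.execute("INSERT INTO users (fullname) VALUES ('Ada Lovelace')")

# Expand: add the new column alongside the old one and backfill it.
# Nothing is renamed or removed, so existing consumers keep working.
con.execute("ALTER TABLE users ADD COLUMN display_name TEXT")
con.execute("UPDATE users SET display_name = fullname")

# During the transition, both old and new readers get their data.
old_value, new_value = con.execute(
    "SELECT fullname, display_name FROM users").fetchone()
print(old_value, new_value)  # Ada Lovelace Ada Lovelace

# Contract (later, once every consumer has migrated off the old column):
#   ALTER TABLE users DROP COLUMN fullname   -- requires SQLite >= 3.35
```

The same two-phase discipline applies to API contracts: add the new field, run both in parallel, then retire the old one once all consumers have moved.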

  3. Don’t optimize too early. It’s often tempting to build abstraction mechanisms today to optimize hypothetical extension use cases for tomorrow. Always consider the level of certainty for such future extensions as well as the added complexity tradeoff to assess whether it should be done now. Beware of complex abstractions for things that might never happen. Instead, start thinking about high-level plans for what would have to happen if these extensions were indeed needed in the future.

    Similarly, when choosing technologies, always favor the most versatile one that fits your current and upcoming constraints. For example, we have increasingly found ourselves favoring modern RDBMS solutions with good JSON document storage and querying capabilities, to keep the number of moving parts and technologies down in our solutions. Of course, this might not work for extremely specialized, heavy querying scenarios that require purpose-built engines (e.g., graph-based AI). If such optimization is not required, though, why make things more complex than they need to be and close down your options?
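As a small illustration of document-style querying inside a general-purpose relational engine, the sketch below uses SQLite's built-in JSON functions (standing in for the JSON support of modern RDBMSs such as PostgreSQL). The `events`/`payload` schema is a hypothetical example; note that some SQLite builds omit the JSON extension, though recent Python releases bundle it.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
con.executemany(
    "INSERT INTO events (payload) VALUES (?)",
    [('{"type": "click", "x": 10}',), ('{"type": "scroll", "dy": -3}',)],
)

# Query inside the JSON documents without adding a separate document store.
rows = con.execute(
    "SELECT json_extract(payload, '$.x') FROM events "
    "WHERE json_extract(payload, '$.type') = 'click'"
).fetchall()
print(rows)  # [(10,)]
```

One versatile engine covers both the relational and the document-shaped use cases, keeping the option open to split them out later only if a specialized workload demands it.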

[For more from the authors on this topic, see “Designing Emerging, Adaptive Digital & Data Architectures.”]

About the Authors
Olivier Pilot
Olivier Pilot is a Senior Consultant with Cutter Consortium and a Senior Architect with Arthur D. Little’s UK Digital Problem Solving practice. He has broad experience across a range of projects involving enterprise and solution architecture. Mr. Pilot's focus areas include digital strategy, Agile digital solution delivery, design and architecture, and design thinking innovation. His recent sample engagements include the design and delivery of a…
Michael Papadopoulos
Michael Papadopoulos is a Cutter Expert, Chief Architect of Arthur D. Little’s (ADL's) UK Digital Problem Solving practice, and a member of ADL's AMP open consulting network. He is passionate about designing the right solutions using smart-stitching approaches, even when elegance and architectural purity are overshadowed by practicality. Mr. Papadopoulos leads the scaling of multidisciplinary organizations by focusing on continuous improvement,…
Michael Eiden
Michael Eiden is a Senior Consultant with Cutter Consortium. Dr. Eiden, who serves as Head of AI at Arthur D. Little, is an expert in machine learning (ML) and artificial intelligence (AI) with more than 15 years' experience across different industrial sectors. He has designed, implemented, and productionized ML/AI solutions for applications in medical diagnostics, pharma, biodefense, and consumer electronics. Dr. Eiden brings along deep…