Is Software Eating the World? — An Introduction
In 2011, Marc Andreessen, cocreator of the Netscape browser and cofounder of the Silicon Valley venture capital firm Andreessen Horowitz, stated in an article in the Wall Street Journal that “software is eating the world.” I remember thinking at the time that this was a memorable aphorism, but while it captured the increasing importance of software, it seemed somewhat cryptic or vague. Little did I realize that, over the next 10 or so years, it would come to articulate a profound transformation of the world we live in and, especially, the enterprises we lead and operate within.
Over the last 10 years we have seen a fundamental shift, whereby organizations that have spent decades developing and perfecting their business models and core capabilities have been outcompeted by organizations that have used software to disrupt existing models or establish wholly new models. Additionally, this “softwarization” of products, services, and experiences has, in many ways, only just started — especially if we consider artificial intelligence (AI) and machine learning to be a specialized class of software.
So what does this mean for your organization?
If the current capabilities, strategies, differentiation, or competitive advantages that define your organization are rooted in the physical world but can be replicated within software, then the challenge is clear. If you are creating value through mastery of the physical environment and your competitor can replicate this mastery in a software environment, then the outcome is inevitable, although the timeline might be variable.
Take the hypothetical example of two pharmaceutical companies in a race to develop a new blockbuster drug. One organization has optimized its R&D processes to have both the highest velocity and lowest cost in developing new formulations in the lab and assessing their potential efficacy. The second organization has invested in state-of-the-art simulation software, theoretically allowing it to formulate drugs and, just as importantly, eliminate noneffective options within the software environment. If the second organization can simulate and eliminate 50% of options within the software environment that previously would have had to be developed in the lab, then probability suggests there will be only one winner in the race, despite the advantage the first organization holds in physical R&D.
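The arithmetic behind this race can be made concrete with a back-of-envelope sketch. The model below is entirely hypothetical (the candidate count, the assumption that candidates are tested in random order, and the assumption that simulation only screens out noneffective options are all mine, not the author's), but it shows why halving the lab workload roughly halves the expected time to find the winning formulation:

```python
def expected_lab_experiments(n_candidates: int, screened_out_fraction: float) -> float:
    """Expected number of physical lab experiments before the one effective
    formulation is found, assuming candidates are tested in random order and
    in-silico screening removes only noneffective options up front."""
    remaining = n_candidates * (1.0 - screened_out_fraction)
    # With the single winner uniformly placed among `remaining` candidates,
    # its expected position in the test order is (remaining + 1) / 2.
    return (remaining + 1) / 2

N = 1000  # hypothetical number of candidate formulations

firm_a = expected_lab_experiments(N, 0.0)  # everything goes through the lab
firm_b = expected_lab_experiments(N, 0.5)  # software eliminates 50% first

print(firm_a)  # 500.5 expected lab experiments
print(firm_b)  # 250.5 expected lab experiments
```

Even if the first firm's lab is somewhat faster or cheaper per experiment, the second firm needs only about half as many experiments, which is why the text argues probability favors the simulating organization.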
I recently encountered an interesting example in the industrial refrigeration market. R&D in this industry has traditionally focused on physical engineering to optimize energy and product efficiency. However, a disruptive competitor was looking to enter the market with a way of driving efficiency based on applying fluid dynamics and optimizing airflow. The software necessary to achieve a reliable simulation of airflow must handle high complexity, is computationally intensive, and requires an understanding of mathematics far removed from typical engineering R&D. However, these capabilities are well established and available to aerodynamicists in motor racing, where the potential disruptor learned and perfected its capability. When physical engineering innovation competes with complex mathematical models that can optimize across a thousand iterations in an automated software simulation, there is likely to be only one long-term winner.
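To illustrate what “a thousand iterations in an automated software simulation” means in practice, here is a toy sketch. It is not real computational fluid dynamics: the efficiency function, its sweet spot, and the parameter ranges are all invented stand-ins, and random search stands in for a real optimizer. The point is only that a simulation loop evaluates a thousand full design candidates at a cost no physical prototyping cycle can approach:

```python
import random

def simulated_efficiency(vent_angle: float, fan_speed: float) -> float:
    """Stand-in for an airflow simulation: an assumed efficiency surface
    peaking at 0.0 around (30 degrees, 0.6 normalized fan speed)."""
    return -((vent_angle - 30.0) ** 2) / 100.0 - (fan_speed - 0.6) ** 2

def optimize(iterations: int = 1000, seed: int = 42):
    """Automated design search: each iteration is one simulated evaluation
    of a candidate design, the digital analogue of building a prototype."""
    rng = random.Random(seed)
    best_params, best_eff = None, float("-inf")
    for _ in range(iterations):
        candidate = (rng.uniform(0.0, 90.0), rng.uniform(0.0, 1.0))
        eff = simulated_efficiency(*candidate)
        if eff > best_eff:
            best_params, best_eff = candidate, eff
    return best_params, best_eff

params, eff = optimize()
print(params, eff)  # a near-optimal design found without a single prototype
```

A physical R&D team might manage a handful of prototype iterations per quarter; the loop above does a thousand in well under a second, which is the asymmetry the refrigeration example turns on.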
Fortunately, though, there is a better answer than physical engineering competing with sophisticated software in a dialectic battle, and this is to combine the two domains into one physical/digital innovation capability. All organizations will need to become equally skilled in both domains if they are to become leaders in their industry, but this introduces a big challenge.
The Fundamental Challenge Facing Organizations
For the last 30 years, most large, well-established organizations have followed industry “best practice” in their IT capability and platforms. They have implemented standard packaged applications, typically delivered by specialized system integrators and overseen by internal IT functions whose role has been limited to strategy, procurement, and delivery management. This has led to a situation where a large enterprise, with an internal IT function comprising several hundred people, might contain no specific roles focused on software creation and might not even possess an understanding of how software should be developed!
To put it simply: at the point where mastery of software is becoming critical to the success and ongoing survival of the enterprise, there is an absence of expertise and insight within the organization’s decision-making forums to represent the potential that software can unleash and the inevitable disruption that will be required to seize this potential.
In some ways this dichotomy is an updated manifestation of the “Two Cultures” C.P. Snow identified in British academia in the 1950s. Snow was perplexed that the scientific community and the arts and humanities community he encountered in leading universities could be so ignorant of each other’s domains, an ignorance that was especially confounding given that these were some of the brightest minds, drawn almost exclusively from the same backgrounds and demography. Two nonoverlapping cultures had emerged, in which almost all interactions and experiences served to reinforce the division and mutual antipathy.
The two cultures we experience in 2019 within our enterprises can be categorized as those schooled in the business school curriculum of case-study strategy, financial management, and corporatism versus those schooled in mathematics, software engineering, and algorithms.
The Start of an Answer
The good news is there are various patterns and approaches that can start to bridge the two cultures and unlock the value-creating potential of softwarization.
A great place to start is to fully embrace the Lean Startup principles set out by Eric Ries. The culture of experimentation, rapid iteration, and a single cross-functional team working in an accelerated, nonhierarchical way provides a great learning experience. It quickly exposes the team to both the software process and its potential and builds confidence through a “show me, don’t tell me” approach.
A second way is to focus ruthlessly on those areas where bespoke software can unlock a problem or drive competitive advantage. I believe in the principle of “build for competitive advantage but buy for competitive parity.” It is important, if the senior leadership is to start believing in the power of software, that that power be applied to the opportunities that will unlock substantial business value and where a real difference can be manifested.
Third, it is imperative that those who understand software find ways of communicating and evangelizing the opportunity it presents within their organization and to its leadership. This almost invariably involves patience in overcoming frustration, constant reframing, humility, and tenacity. How many of us tasked with creating understanding and enthusiasm for software within our organizations can genuinely say we embrace these virtues on a daily basis?
I am reminded of a saying that changed my personal approach over a decade ago: “Nobody ever changed their mind by being proven wrong!” If the opportunity presented by software is to be fully realized in the enterprise, then it is imperative that those who understand software create the bridge to those who need to understand.
In This Issue
In our first article, Cutter Consortium Fellow Steve Andriole examines the extent of software’s rule in the areas of process automation, privacy and security, enterprise software, intelligent software engineering, and converged convenience. For each area, he evaluates in what ways software’s reign is good (rewarding us), bad (punishing us), or ugly (threatening us). Andriole’s belief is that software’s rule is inevitable and will expand. It is our decision what to do about the “kingdom of software.”
In the next article, Joost Visser begins with an acceptance of software’s having “eaten the world” and the need, after your organization’s digital transformation, to master the evolution of software. Software evolves in the environment of the marketplace, where the forces of innovation, cost reduction, growth, regulation, and coevolution drive change. As with biological evolution, only the fittest will survive. For sustainable evolution — for organizations not to see their software eaten by the world — refactoring and commoditization are essential. After examining these internal changes, Visser discusses the essential capabilities organizations must possess in the areas of data, design, and decisions to master software evolution. He concludes with the critical questions organizations must answer to determine whether they are ready for the long haul.
Next, Sunil Mithas, Kaushik Dutta, and Cutter Consortium Senior Consultant San Murugesan intriguingly compare software to the ouroboros, the mythical serpent of the ancient world that eats its own tail and is reborn from itself. Like the ouroboros, software has cannibalized and transformed itself. In recent years, software has evolved toward autonomy. Autonomous software has the capability to change itself (as with automatic updates) and even to write itself (AI can write software code or even be the software). Software evolution and changes in software development imply that software will become ever more pervasive and affordable, that firms must master disciplined autonomy in order to follow dual strategies, and that the role of IT professionals is being redefined. The authors conclude with the steps that senior leaders and managers need to take for their organizations to transform and be reborn.
Paul Pagel next discusses the key importance of a modern software labor strategy for organizations hoping to remain competitive in today’s digital and innovative world. The right team is key to crafting software systems capable of supporting innovation. Software delivery talent, however, is extremely difficult to find for a multitude of reasons. The solution, according to Pagel, is to structure software teams to deal with fragility and to thrive on change.
In our final article, Michael Papadopoulos and Olivier Pilot examine how a limited view of digital transformation impedes organizations from fully benefiting from the new, Agile ways of working. Papadopoulos and Pilot attribute this failure, fundamentally, to reliance on traditional architectural stacks where multiple teams and products rely on large, shared layers, and a change in a layer to meet the needs of one product may inadvertently break other products. To support a feature team–based organization, each team must have full end-to-end ownership of its stack, which consists of smaller, decoupled parts — microservices — that are loosely bound together. The authors advocate domain-driven design and the atomic design principle as the basis for enabling reuse. A managed, messy architecture is the key to an organization structured around feature teams, which enable digital transformation.
As this issue makes clear, the rise of software represents both the single biggest hurdle and the single biggest opportunity for business. We hope the articles inspire you to conquer the fundamental challenges facing your organization today and help you unlock your full value-creating potential.