Advisor

5 Technologies to Automate Software Development

Posted June 20, 2019 | Technology

As expected, the road to automating software development is long and full of twists and turns. No doubt, there will be potholes and detours along the way. However, getting to the end goal now seems possible if we travel a short distance at a time. A few years ago, I wrote a Cutter Business Technology Journal article on the use of cognitive computing technology to automate software development. Since then, I have partnered with clients to establish a technical concept, devise a roadmap, and begin testing its feasibility and developing products. In my opinion, there are five technologies of immediate relevance that can be matured and put to work in the next two to five years:

  1. Agile-at-scale and DevOps working together in a lean and efficient fashion
     
  2. Sustainable designs with embedded self-diagnosis and repair abilities
     
  3. Knowledge oracles (or hubs) to provide developers with assistance
     
  4. Test avatars (a form of digital twin) that automatically generate and run tests at all levels
     
  5. Deep learning via machine automata for software development tasks

In the long term (next 10-12 years), I believe we can automate the entire process. That leads to thinking about what to do with the people we displace. This, to me, is the biggest problem: software as an industry, according to the Bureau of Labor Statistics, is currently one of the fastest-growing and highest-paying areas in the US. In this Advisor, we start our roadmap journey by looking at the status of each of the five technologies and discussing their potential.

Agile and DevOps

First on my list is the merger of Agile methods with DevOps. To me, this is a no-brainer because it is happening as you read this Advisor. If you look at social media and the press, the hype can quickly become overwhelming. So you need to ask yourself whether this transition is indeed occurring and, if it is, how fast. Let me set the stage. According to my assessment, Agile methods are the “new normal” for software development worldwide. What this means is that most firms are using some form of Agile methods as their primary means to develop software. For small-scale developments, Scrum dominates. For larger-scale development (or enterprise-wide use), organizations are deploying one of the many Agile-at-scale methods (e.g., LeSS and SAFe) or a hybrid approach (a combination of Agile, Agile-at-scale, traditional, and Lean/Kanban). Which method is best, you might ask? The answer revolves around what works best for the user/customer.

Like Agile methods, DevOps comes in many flavors. Whereas the goal for Agile methods is to continuously deliver working software products, DevOps places its emphasis on continuous deployment of hardened products. This changes the end game. To harden a product, we need additional effort to make the software work in its intended operational environment in the field. In response, we need to expend effort to package, document, distribute, and test the software, as well as to train and support users. To provide such a capability, some firms combine their software operations with other groups, such as business operations, customer service, field service, marketing, quality assurance, training, and others. Some include security; others do not. Does it make sense to combine them? Of course. Besides synergy, such combinations facilitate streamlining operations and eliminating redundancy. For example, firms can reduce time to market and achieve economies of scale by using compatible work processes supported by a single toolset rather than several toolsets. The challenge is how to do it in order to reap the most benefits. And then there are the infrastructure and readiness issues once you’ve started your journey. Needless to say, combining the two is still a work in progress for most firms. The task is more complicated than it seems, and there is no one formula that works.

Sustainable Designs

The second technology on my watch list is one that automatically embeds diagnostics and self-repair protocols into designs so as to make them more sustainable. Pattern-matching techniques, AI and rule sets, and ML concepts are the best way to accomplish this. We can use pattern matching to insert the initial instrumentation, and we can utilize AI to establish the initial conditions and rules for invoking them. Finally, ML can evolve the rules. For example, an initial set of exception conditions for the application domain in question could be identified and then expanded based on learning where the breakages occur as the code is continuously integrated and tested. In its next iteration, the design automata might be able to generate the code and the tests needed to validate functionality and performance by generating neural networks that search for discovered issues and provide experience-based advice on how to arrive at an optimal solution. Results are found iteratively using a “learn as you go” approach that converges on solutions employed to improve sustainability. For example, the application could be instrumented to trigger diagnostics that would automatically generate needed tests. One of my clients has already started the maturation process, implementing a maintenance protocol that automatically embeds self-diagnostics in the design using pattern-matching techniques. Optimization is performed using AI, and updates are automatically generated and implemented during development sprints using ML techniques.
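To make the idea more concrete, below is a minimal sketch, in Python, of what rule-driven embedded diagnostics might look like. The names used (DiagnosticRule, instrumented, record_breakage) are hypothetical, and the “learn as you go” step is reduced to manually recording a new breakage; a real system would insert the instrumentation via pattern matching and evolve the rules with ML.

```python
# Minimal sketch of embedded self-diagnosis driven by an evolving rule set.
# All names here are hypothetical illustrations, not a client's actual design.
from dataclasses import dataclass
from typing import Any, Callable
import functools
import logging

logging.basicConfig(level=logging.INFO)


@dataclass
class DiagnosticRule:
    """An exception condition to watch for, plus a suggested repair action."""
    name: str
    predicate: Callable[[Any], bool]   # True when the condition is violated
    repair_hint: str


# Initial rule set for the application domain (seeded by AI/rule sets).
RULES: list[DiagnosticRule] = [
    DiagnosticRule("null_result", lambda result: result is None,
                   "Return a default value or raise a domain-specific error."),
]


def instrumented(func: Callable) -> Callable:
    """Wrap a function with embedded diagnostics driven by the rule set."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        for rule in RULES:
            if rule.predicate(result):
                logging.warning("Diagnostic '%s' fired in %s: %s",
                                rule.name, func.__name__, rule.repair_hint)
        return result
    return wrapper


def record_breakage(name: str, predicate: Callable[[Any], bool], hint: str) -> None:
    """Expand the rule set when continuous integration uncovers a new breakage
    (the 'learn as you go' step; ML would generalize these observations)."""
    RULES.append(DiagnosticRule(name, predicate, hint))


@instrumented
def lookup_customer(customer_id: int):
    # Placeholder business logic, for illustration only.
    return None if customer_id < 0 else {"id": customer_id}


if __name__ == "__main__":
    lookup_customer(-1)  # triggers the 'null_result' diagnostic
    record_breakage("empty_record", lambda r: r == {}, "Validate upstream data feed.")
```

Keeping the diagnostics in a decorator, separate from the business logic, is what makes automatic insertion by pattern matching plausible in the first place.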

Knowledge Oracles/Hubs

Next, let’s peek at oracles and hubs that provide assistance rather than automation during software development. Such assistance is the precursor to automation in that it sets the stage for learning how best to achieve it. These assistants search for knowledge on your servers and the Web via the cloud using unique algorithms that provide insight and instructions to developers. In natural language, they ask the developer, “Did you consider x?” or “Wouldn’t doing it this other way be more efficient?” They then offer advice and examples. To support the firm’s design and coding standards and to tap its experience history, the assistants search the firm’s knowledge bases as well. Think of a device similar to Alexa or Google Home that would sit on every developer’s desk and could help them complete their work tasks and flows. The developer could say, “Hello software developer assistant, can you give me an example of how to do such and such a task?” In addition, it would be possible to hold virtual meetings using digital assistants that would talk and act like real people.
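As a rough illustration, the sketch below shows how a very simple oracle might answer a developer’s question from a keyword-indexed knowledge base. Everything here (KnowledgeEntry, the ask function, the sample entries) is a hypothetical stand-in; a real assistant would add cloud and web search, natural language understanding, and the voice interface described above.

```python
# Minimal sketch of a knowledge oracle over the firm's own knowledge base.
from dataclasses import dataclass


@dataclass
class KnowledgeEntry:
    topic: str
    advice: str
    example: str


KNOWLEDGE_BASE = [
    KnowledgeEntry(
        topic="retry logic",
        advice="Did you consider exponential backoff with jitter?",
        example="delay = base * 2 ** attempt + random.uniform(0, 1)",
    ),
    KnowledgeEntry(
        topic="logging",
        advice="Wouldn't structured logging be more efficient to search later?",
        example='logger.info("order_created", extra={"order_id": order_id})',
    ),
]


def ask(question: str) -> str:
    """Return advice and an example for the first entry whose topic appears
    in the developer's question, mimicking the assistant's dialogue."""
    lowered = question.lower()
    for entry in KNOWLEDGE_BASE:
        if entry.topic in lowered:
            return f"{entry.advice}\nFor example: {entry.example}"
    return "No guidance on that yet; logging the question for later analysis."


if __name__ == "__main__":
    print(ask("Can you give me an example of how to do retry logic?"))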

As mentioned, such assistants could serve as a first step toward automation. Gathering context-specific experience through trial and error, via interactions with these assistants during the software development process, yields the information needed to determine how to proceed using analytics and deep learning technology. You could then dig deep to learn how to further automate the workflows and processes that developers are currently performing and to discover where issues lie that require further exploration. Another of my clients has shown that such a network of assistants is feasible and is busy building a prototype. Her organization is using Alexa-like devices tied to its own servers to act as knowledge oracles. The hope is that the system will be operational early in 2020.
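A minimal sketch of that mining step, assuming each assistant interaction is logged with the task asked about and whether the assistant resolved it (both hypothetical fields), might look like this:

```python
# Sketch: mine assistant interaction logs to rank candidate tasks for automation.
from collections import Counter

interaction_log = [
    {"developer": "a", "task": "generate unit tests", "resolved": True},
    {"developer": "b", "task": "generate unit tests", "resolved": True},
    {"developer": "c", "task": "set up build pipeline", "resolved": False},
    {"developer": "a", "task": "generate unit tests", "resolved": True},
]

# Tasks developers ask about most often are the first candidates for automation;
# tasks the assistant could not resolve mark issues needing further exploration.
frequency = Counter(entry["task"] for entry in interaction_log)
unresolved = Counter(entry["task"] for entry in interaction_log if not entry["resolved"])

print("Automation candidates:", frequency.most_common(3))
print("Needs further exploration:", unresolved.most_common(3))
```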

Test Avatars

The fourth technology on my list involves avatars that perform testing automatically. Using the digital twin analogy, machines perform the functions of testers as part of the development team, keying on triggers embedded in specifications and on requirements posed by operating environments. This technology would extend that developed for design (as discussed above) to address requirements and their validation. Specifically, implementation of such technology would facilitate deployment of test-first concepts to ensure that the test criteria needed for validation are embedded as triggers in the specifications for the software (e.g., user stories, functional specs). Such an implementation would take advantage of a version-control system that automatically tests designs, code, and other artifacts when they are checked in and checked out at the unit and unit-integration levels. The concept would facilitate designing quality into the code rather than trying to test it in. In addition, complexity and other metrics could be checked, and advice could be offered, using the knowledge base, on how to make the artifact simpler. Code would then be tested at the unit level and next in combination according to work and process flows to ensure that threads function and perform correctly. Builds could be validated using the criteria taken from the specification, and products could be continuously delivered or deployed via the process. Finally, the concept could be extended to finish up testing during DevOps and to reduce the total effort needed to achieve the end game as the product is continuously deployed and fielded operationally.
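As a rough sketch of how test criteria embedded as triggers in a user story could be extracted and evaluated at check-in, consider the Python fragment below. The trigger syntax, metric names, and observed values are all invented for illustration; a real implementation would hook into the version-control system and the build’s actual test results.

```python
# Sketch: extract test triggers embedded in a user story and evaluate them,
# as a check-in hook in the version-control system might.
import re

USER_STORY = """
As a shopper, I want to check out quickly.
[test: response_time_ms <= 500]
[test: cart_total >= 0]
"""

# Observed values, e.g., collected from a build's unit/integration test run.
OBSERVATIONS = {"response_time_ms": 420, "cart_total": 37.50}

TRIGGER = re.compile(r"\[test:\s*(\w+)\s*(<=|>=|==)\s*([\d.]+)\]")
OPERATORS = {"<=": lambda a, b: a <= b,
             ">=": lambda a, b: a >= b,
             "==": lambda a, b: a == b}


def run_spec_tests(spec: str, observations: dict) -> bool:
    """Evaluate every embedded trigger against observed values."""
    all_passed = True
    for metric, op, threshold in TRIGGER.findall(spec):
        passed = OPERATORS[op](observations[metric], float(threshold))
        print(f"{metric} {op} {threshold}: {'PASS' if passed else 'FAIL'}")
        all_passed = all_passed and passed
    return all_passed


if __name__ == "__main__":
    # A failing result here would block the check-in or flag the build.
    run_spec_tests(USER_STORY, OBSERVATIONS)
```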

Deep Learning

The final technology builds on the first four and uses deep learning and automata to mechanize as much of the software development process as possible. As you can see, the other four initiatives address requirements, design, and testing (possibly deployment as well, although that may be a reach at the present time). The goal is to automate specific development functions and provide advice on others. At first, functionality would be tied together across the cloud via the IoT, using AI, deep learning, and other technology to automate as much of the process as possible. People might remain in the loop, but they would assume new roles. Rather than talking with other people, they would collaborate and talk with avatars acting as digital twins. At a minimum, business operations (i.e., product managers/owners and users) would be in the loop because their early contributions would be deemed important. For Agile-at-scale developments, people performing other roles might also be in the loop because some people outside of development provide needed input. The key to succeeding in such a venture is to attempt it incrementally, using an Agile approach to develop the knowledge base needed to piece the parts together and teach the system how to do the job.

So can we do this in the near term? The honest answer is: I don’t know. Making such a leap could take years. Nonetheless, we can certainly get closer to full automation, and that is worth a try. You are also probably asking, “Would maturing technology like this be better attempted in a university environment?” After all, academics can run controlled experiments and examine results much more empirically. My answer to this suggestion is an emphatic “No!” Students just do not have the skills, knowledge, and abilities to design the experiments necessary to pull this off, and professors sometimes seem to be more interested in experimenting with the technology than getting a working system operational. Thus, I believe that the future of such technology resides in experimentation and applied research done in an industrial setting.

Enablers

Are there other technologies that are maturing and will soon be useful and ready? Yes. However, the five highlighted here seem the most likely candidates at the moment, and clients of mine are banking on them. Are enablers needed to take advantage of these technologies and speed the maturation process? Again, the answer is yes. Let’s briefly look at two enablers that I believe you must put in place to ensure success: (1) infrastructure, decision support, and tools; and (2) top management support.

The infrastructure, decision support system, and tools that facilitate teamwork and collaboration and help the team get the job done are essential. They provide the backdrop that enables people to use innovations to get their work done efficiently, with a minimum of difficulty. Also essential are top management support and the establishment of a mindset supportive of the changes you are trying to introduce. Organizational transformation at any level is never easy and needs to be pursued in a systematic manner, which process models facilitate. Tactics need to be formulated to address the issues that change raises, many of which, as mentioned earlier, revolve around the affected people. With automation, the primary resistance often comes from those who will be displaced and those who fear change.

While new jobs and retraining will likely be options, the number of new jobs will normally be lower, meaning that a large percentage of staff will have to be let go. Some pundits even go so far as to say that the use of intelligent machines in the workplace will alter the nature of work, the design of organizations, and the execution of transactions. That is why Bill Gates suggested that if a robot (or, I assume, other automata) replaces a person who makes $50,000 annually, that robot should be taxed at the same level as the person’s income, and the revenue should be used to retrain the displaced worker. Just how we will deal with this displacement is still heavily debated, but a possible outcome is that one of the fastest-growing, highly paid professions, software development, may find its status changed as machines take the place of human workers.

[For more from the author on where AI and these other technologies take us as we travel the road toward software development automation, see “Software Development: Can Technology Help Us Automate the Work?”]

About The Author
Donald Reifer
Donald J. Reifer is recognized as one of the leading figures in the fields of software engineering and management, with more than 40 years of progressive management experience in both industry and government. He has built businesses, managed key programs, and led major R&D initiatives. Mr. Reifer is often called upon by clients to review troubled programs, examine red team proposals, and perform competitive assessments with an emphasis on…