Greener Software Development Strategy: Why, How, and Why Not?
Jacek Chmiel draws attention to the increasing energy consumption of the electronic devices integrated into our daily lives. In addition to the ubiquitous mobile devices we all carry around, there are billions of devices (employing the Internet of Things and the Internet of Everything), not to mention back-end servers, all of which consume energy. Exacerbating this problem are ML and distributed ledger technologies, such as blockchain, that require intensive computing cycles and use even more energy. The good news is that a greener software development strategy can have a significant impact on the energy usage of electronic devices. Chmiel explains the challenges to implementing this strategy and how organizations can overcome them.
Nowadays, software is everywhere. In fact, technology drives so many aspects of our private and professional lives that it’s easy to argue that software is ruling the world.
Considering how omnipresent software is, and how much has changed because of it during the last few decades, it’s pretty astonishing that when we hear the term “green computing,” we often think of cloud providers running data centers or big tech companies promising carbon neutrality faster than the competition.
However, an often-overlooked aspect in the global discussion about sustainability when it comes to information and communications technology is how software can be created and deployed to mitigate a company’s environmental footprint.
For this reason, let’s take a look at different aspects of software development and deployment (which, as we all well know, are bound together and should not be disconnected) and see what we can do to create a more positive environmental impact.
Mobile & Web Apps
Power optimizations of computing are as old as computing itself, but here we focus on the enterprise application development perspective. Thus, we begin our discussion with the apps themselves.
Mobile developers are familiar with the requirements to design and implement applications that follow best practices for battery savings. Nobody wants their phone’s battery life to deteriorate because one particular app is a battery hog. Battery life directly impacts our lives as we grow ever more dependent on mobile technology, especially when we are on the go. The more often we have to charge our phones, the more energy we use, so the benefits of long battery life are mutual for the users and the environment.
We all know the saga of battery life: a battery that is charged frequently degrades and eventually needs to be replaced by another battery, which is produced using energy, transported using energy, and must be disposed of safely. Or users decide to replace the entire device because of its deteriorating battery life.
Fortunately, battery life requirements are already part of the policy ecosystem enforced by the likes of Apple and Google on mobile developers. There are strict limits on what mobile apps can do — and how often they are allowed to do it. Developers must follow these requirements or risk their application being rejected from the application store.
On the plus side, mobile users can limit notifications to avoid waking apps on the phone and consuming energy. They can also disable background updates of applications to make mobile apps even more energy-efficient, at the expense of not being constantly updated.
Finally, dark modes for both entire operating systems and applications can substantially cut the power drawn by the screen, which is a smartphone’s most energy-consuming element (the savings are greatest on OLED displays, where black pixels are simply switched off). So it’s disappointing that we still observe applications that ignore the system-wide setting and display white screens during use.
Of course, there’s still plenty more for designers and developers to improve upon, but enforced requirements and best practices for mobile development are a good starting point. In this case, strict requirements from mobile platform owners perfectly align with customer experience benefits. Unfortunately, this is not always the case. I’ll get back to this point later.
Web apps are also mobile apps, but their creators all too often ignore this fact. Even though more than half of users access Web pages from mobile devices, battery optimization is regularly not even among the top 10 priorities when developing Web apps.
And let’s not forget that we spend much of our laptop/Mac/PC time using the browser and searching for information, as I did when writing this very article.
Whether or not laptops are plugged into the wall most of the time, their energy consumption translates directly into CO2 emissions. For mobile use, lower consumption means extended battery life and longer remote work sessions, so again, user experience (UX) and environmental friendliness go hand in hand.
For starters, online energy-efficiency assessment tools analyze Web pages, applications, and resources to identify energy-consumption bottlenecks that designers and developers should address. Since it’s clear energy consumption can be improved, why isn’t it done? Aside from the usual set of excuses (e.g., time, money), energy optimizations can impair UX and may require hard-to-define tradeoffs.
Let’s look at the example of a Web page that automatically refreshes certain information; for instance, the result of an ongoing basketball, football, or soccer match. Users would like those refreshes to come as quickly as possible, while the developers wish for them to be as rare as possible, as refreshes consume computing power and bandwidth. (Of course, newer techniques such as WebSockets avoid polling for the information altogether, but let’s keep it simple for a moment.) So a tradeoff must be struck between the frequency of updates on one side and performance, plus the cost of Web services, on the other.
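One common compromise is adaptive polling: poll eagerly while the score keeps changing, and back off exponentially during quiet stretches. Below is a minimal Python sketch of that policy; the function name and the interval bounds are invented for illustration:

```python
def next_interval(changed: bool, interval: float,
                  min_interval: float = 5.0,
                  max_interval: float = 60.0) -> float:
    """Return the next polling delay: reset on a change, back off otherwise."""
    if changed:
        return min_interval              # activity: poll eagerly again
    return min(interval * 2, max_interval)  # quiet: double the wait, capped

# Simulate a quiet stretch of a match: each unchanged poll doubles the wait.
delays = []
interval = 5.0
for changed in [True, False, False, False, False, True]:
    interval = next_interval(changed, interval)
    delays.append(interval)
print(delays)  # → [5.0, 10.0, 20.0, 40.0, 60.0, 5.0]
```

With this scheme, a page left open overnight makes a fraction of the requests a fixed-interval refresh would, while still reacting quickly when play resumes.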
Beyond that tradeoff, the need to push information to us, the users, is not only bad for our attention spans and our ability to focus on things that matter, it also consumes more energy on our personal devices, network devices, and data centers.
Personally, I predict that when it comes to mobile apps and Web apps, reducing the number and frequency of notifications and eliminating flashy banners, annoying popups, and animations are the way to go both for our sake as users and also for energy consumption.
Devices, Devices: IoT & IoE
Billions of devices are flooding our world. One particularly negative environmental impact is caused by their short-lived design, questionable quality, and lack of repairability, which generates tons of toxic electronic waste. But let’s focus on direct energy consumption and software.
Battery-operated devices are optimized by design because of business requirements. Many devices are supposed to last for months without recharging, or to recharge from tiny solar panels or wind energy. This changes everything on the software side as well: it means dedicated, more efficient operating systems; low-level software written in efficient languages, such as the old but very lean C; and more energy-efficient protocols (e.g., Zigbee instead of Wi-Fi).
Internet of Things (IoT) experts try to avoid Linux-based devices on common hardware, as they are much less efficient than dedicated devices with lower power consumption. Using more niche technologies, however, translates to difficulty in getting the right people to create those systems and applications. Still, both business and the environment benefit from energy-efficient devices.
Notifications and update frequency also matter, as network communications consume much power. So devices should be programmed to store data locally and only exchange small portions of data as rarely as possible. These are difficult design choices from a software architecture point of view, but fortunately, there’s already a relatively long tradition of Internet of Everything (IoE) and good patterns to follow.
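One such pattern is the batching buffer: accumulate readings locally and wake the radio only when a batch is full or the data is getting stale. The Python sketch below illustrates the idea; `send` stands in for whatever uplink the device actually uses (an MQTT publish, a LoRa packet, and so on), and the thresholds are arbitrary:

```python
import json
import time

class TelemetryBatcher:
    """Buffer sensor readings locally and transmit them in batches.

    Batching trades data freshness for far fewer energy-hungry
    transmissions; `send` is a hypothetical uplink callable.
    """
    def __init__(self, send, batch_size=32, max_age_s=300, clock=time.monotonic):
        self.send = send
        self.batch_size = batch_size
        self.max_age_s = max_age_s
        self.clock = clock
        self.buffer = []
        self.oldest = None

    def record(self, reading):
        if self.oldest is None:
            self.oldest = self.clock()
        self.buffer.append(reading)
        # Flush only when the batch is full or the oldest reading is stale.
        if (len(self.buffer) >= self.batch_size
                or self.clock() - self.oldest >= self.max_age_s):
            self.flush()

    def flush(self):
        if self.buffer:
            self.send(json.dumps(self.buffer))  # one radio wake-up per batch
            self.buffer, self.oldest = [], None
```

The `max_age_s` deadline bounds how stale data can get, which is exactly the kind of explicit architectural tradeoff the paragraph above describes.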
Data centers require lots of hardware and huge facilities with round-the-clock power and cooling to keep a single enterprise running. So let’s now take a look at what can be done on the server side.
In its original form, Moore’s law no longer holds, despite the common myth that it does. The speed of silicon CPUs has not been doubling every 18 months for a long time; physics limits how quickly faster chips can be developed. But there’s another version of Moore’s law, a green one (often called Koomey’s law), which observes that the energy efficiency of chips improves significantly every year.
This is great but cannot replace the efforts of engineers to create more efficient software. The energy-efficiency gains of CPU manufacturers are in the low tens of percent per generation of their chips, while software can be optimized more, and much faster. So while it’s nice to have Moore’s law for energy efficiency, it’s just one component in a green computing effort.
ARM vs. Intel?
In 2020, we witnessed a revolutionary move from Apple when it ditched Intel CPUs and the x86-64/AMD64 architecture and instead embraced the Apple M1, an ARM-based chip. The M1 is known for its speed, but it should be even more famous for its power efficiency, which is often several times better than that of its Intel counterparts. Amazon (with its Graviton processors) and Nvidia have made similar moves.
In layman’s terms, the same technology that has been driving our phones and tablets for years is becoming more present in our desktop computers, laptops, and data centers. What’s different about these newer architectures is that they can help data centers save a lot of power; the same business applications and data processing use less energy.
Most business software is still built to target Intel and AMD processors. Changing this would enable huge energy savings. For instance, containers, which are now the key building blocks of most business software, can be created to support multiple architectures, not just x86-64. This requires changes to DevOps pipelines and deployment routines to support more efficient CPU architectures. It’s happening, but very slowly.
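As one concrete, hedged illustration of such a pipeline change: Docker’s Buildx can emit a single multi-architecture image, so the same deployment works on x86-64 and ARM64 hosts. The registry and image name below are placeholders:

```shell
# Register QEMU emulators so non-native architectures can be built (Linux hosts).
docker run --privileged --rm tonistiigi/binfmt --install all

# Create and select a builder that supports multi-platform builds.
docker buildx create --name multiarch --use

# Build one image for both x86-64 and ARM64 and push it to a registry.
# "registry.example.com/app:latest" is a placeholder image name.
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  -t registry.example.com/app:latest \
  --push .
```

Once the registry holds a multi-architecture manifest, schedulers can place the workload on whichever (more power-efficient) nodes are available without any application change.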
Cloud as the Green Option
Cloud vendors promise carbon neutrality. They like to show that their energy management routines are driven by artificial intelligence (AI) — always good for marketing — and their impressive cooling systems. We cannot verify these claims, but let’s assume they are true.
Developing software using building blocks delivered by public cloud vendors can accelerate cloud adoption, but greener computing is another reason to turn to the cloud faster.
Hybrid cloud, a combination of local and public infrastructure, is very much a limiting factor for the full adoption of cloud-specific services. The cloud-agnostic compromise known as “cloud-native” is the most popular choice nowadays, yet not the most energy-efficient one. The future decline of local infrastructure for many organizations will result in more environmentally friendly software architectures and the ability to embrace newer patterns. For instance, serverless architectures focus on short-lived, ephemeral workloads instead of keeping the “digital engine” running all the time.
The cloud transformation traditionally has been slower than predicted, but it’s inevitable. In case of any doubt, you can add the environment as another excellent reason to do it faster.
Developers Like Inefficient Languages
Languages closer to machine code and assembly are much more energy-efficient, but they are losing popularity among developers; most business applications and even games are now created using higher-level languages.
C is still very popular, but newer generations of developers prefer to learn Python, Java, and C# instead. The resulting shortage of developers with the needed skillset will become a serious problem for many low-level programming needs, such as operating systems and, especially, devices. For business Web applications or mobile application back ends, energy inefficiency might be tolerable because end users don’t see it. But it won’t work for small devices, operating systems, and database systems, which are expected to use limited resources as efficiently as possible.
Optimal Code Is Not Green Code?
Regular developers write acceptable code; the best developers write optimized code, which is code that runs as fast as possible and consumes less memory. Usually, execution time is the priority, at the expense of memory usage and energy consumption. Modern CPUs have multiple cores, and fast code uses all of them, waking them up and creating a burst of activity to complete the operation as quickly as possible. Ironically, regular developers may create more energy-efficient code than their more ambitious colleagues. We want our apps to run as fast as possible, but ... servers sleep most of the time.
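A tiny Python sketch makes the difference concrete: a busy-wait loop reacts a hair faster but keeps a core spinning at full power, while a blocking wait lets the core sleep until there is real work. (This is an illustration, not a benchmark.)

```python
import threading
import time

done = threading.Event()

def busy_wait():
    """Fast to react, but keeps a core spinning (and burning watts)."""
    while not done.is_set():
        pass  # burns CPU cycles doing nothing

def green_wait():
    """Blocks in the kernel; the core can enter a low-power state."""
    done.wait()  # woken only when the event is set

# Run only the energy-friendly version here; busy_wait is shown for contrast.
worker = threading.Thread(target=green_wait)
worker.start()
time.sleep(0.1)   # simulate the producer doing real work
done.set()
worker.join()
print("finished without spinning")
```

The same principle scales up: event-driven servers that block on I/O let idle hardware idle, while tight polling loops keep it artificially busy.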
Energy-efficient coding techniques are less well known than traditional code-optimization techniques. To use a driving analogy, it’s the difference between getting from point A to point B in the shortest time and consuming as little fuel as possible along the way (eco-driving). For driving, the ways to save a lot of fuel are well known; for software, they are far harder to pin down.
Machine Learning (aka AI) for Business
Machine learning (ML) is all around us: our mobile phones, our laptops, our TVs. It’s already doing a good job of identifying usage and energy-consumption patterns to help optimize the energy consumption of our smallest and largest devices, from smartwatches to supercomputers to data centers.
But let me remind you that ML is extremely heavy computationally, consuming a lot of power to train neural networks to recognize images, learn grammar, and detect irregularities in data. At the enterprise scale, ML requires dedicated TPUs and GPUs that consume much energy to deliver useful results for businesses.
Even worse, ML is a set of experiments, model tuning, retraining, testing, and transferring tons of data, all of which take a vast amount of energy. This is not something that we typically read about when a headline announces another AI breakthrough. There’s a steep price for that progress, and it’s not “just” the privacy of our data (a good topic for another time) but also the enormous CO2 footprint of ML.
What can we do about it?
First of all, hardware and software are becoming more efficient, and the power limits of computers are forcing their creators to optimize energy usage in each iteration. ML workloads, especially training, are ephemeral: they may not require fixed infrastructure and can be run in the cloud.
Once models seem good enough, they can be retrained less often and on subsets of the data to minimize training time, with full-scale training only from time to time. This saves a lot of cost (e.g., data scientists’ time, computing resources) and can also reduce the carbon footprint by an order of magnitude.
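One way to picture such a schedule is the hypothetical Python sketch below (the policy and numbers are invented for illustration): a full retrain every tenth cycle, with cheap subset refreshes in between. Compute, and therefore energy, scales roughly with the number of examples processed.

```python
def training_plan(cycle: int, full_every: int = 10,
                  subset_fraction: float = 0.1,
                  dataset_size: int = 1_000_000) -> int:
    """Return how many examples to train on in a given retraining cycle."""
    if cycle % full_every == 0:
        return dataset_size                       # periodic full-scale retrain
    return int(dataset_size * subset_fraction)    # routine refresh on a sample

# Rough energy proxy: examples processed over 10 cycles vs. always-full training.
subset_cost = sum(training_plan(c) for c in range(10))
full_cost = 10 * 1_000_000
print(f"relative compute: {subset_cost / full_cost:.0%}")  # → 19%
```

Under these made-up parameters, the schedule does roughly a fifth of the work of retraining on the full dataset every cycle, and the real savings depend on how well the subset refreshes preserve model quality.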
Blockchain (aka DLTs) for Business
Distributed ledger technologies (DLTs), better known nowadays as blockchain, failed to gain as much attention and interest as anticipated a few years ago. Yet numerous new projects use blockchain technologies. Unfortunately, power consumption is one of the biggest disadvantages of the most popular blockchain technologies: they require hundreds or even thousands of times more energy than business transactions that don’t use them. This means projects should seek out blockchain technologies that consume less energy (e.g., proof-of-stake rather than proof-of-work consensus) instead of the most popular, but inefficient, ones.
In the recent past, it was quite common to see blockchain overused. Now it seems to be used only when it helps significantly to deliver business value. But it’s worth considering blockchain’s environmental impact as another factor when making architectural decisions about whether or not to use DLTs in the digital solution.
Why Don’t We Embrace Green Software Development?
Software developers are busy. Changing requirements, ambitious goals, and constant changes in technology make the resource shortage more painful than ever. Pressure for faster digitalization and resilience creates a lot of work in software architecture design, coding, testing, DevOps, and data processing, including ML.
Software development is hard enough already, even without considering the requirements of green computing. And it’s not getting any easier ... or cheaper. Staying competitive in business means delivering better digital experiences faster and making them more resilient. Adding green computing goals on top of all that may seem like overload on a developer’s plate. But ... “the times they are a-changin’.”
Let’s not forget that what works for the environment may very well work for your business efficiency. This has been proven already in the case of mobile apps and IoT devices, and to some extent in cloud computing. Web apps, their back-end services, and the entire data processing universe with its jewel in the crown, ML (a subset of AI), are expected to join this green computing movement slowly but surely.
Even though energy-efficient software development sometimes means functional and performance tradeoffs, it’s still worth trying to add an IT component to our global efforts to save humanity from humans. Everyone has a role to play. Software rules the world — let it rule a greener world for us and for generations to come.