Cutter Consortium Senior Consultant Curt Hall presents intriguing examples of how corporations and governments use AI and complementary technologies to move us toward a more sustainable world. He explains how many companies are using AI to reduce the energy consumption of … AI(!) and other digital technologies. He illustrates innovative technology-based solutions being developed for tracking carbon emissions and presents a selection of companies targeting key infrastructure areas for carbon reduction initiatives (e.g., data centers, transportation, waste management).
Governments and industries worldwide are accelerating their efforts to reduce the carbon footprint associated with their products and services, driven by evolving government regulations, corporate shareholder responsibility initiatives, grassroots environmental campaigns, and green-conscious consumers inspired by celebrity climate activists like Greta Thunberg. These efforts span areas ranging from consumer digital technologies, corporate data centers, and manufacturing operations to supply chains, resource extraction, shipping and transportation, and agriculture and waste management.
Today, organizations are using artificial intelligence (AI) to help achieve their goals of developing and promoting green technologies and managing environmental impacts. This includes applying AI in conjunction with complementary technologies to carry out carbon reduction strategies and to realize efficiency gains that can lead to cost savings via better energy management and successful decarbonization of key operations.
This article examines how AI and associated technologies can help companies with their net zero carbon and other sustainability efforts. Specifically, it examines how AI can support this vision today and in the not-too-distant future, offers examples of how companies are applying these technologies, and discusses available commercial products and services employing AI to assist companies in attaining their environmental goals.
How AI Can Help with Climate Change and Other Sustainability Efforts
The widespread availability of high-performance cloud computing services and the advent of deep learning neural networks have significantly impacted the development and application of AI technologies. This is especially true of machine learning (ML) systems, which have become quite popular due to their ability to identify patterns and trends in large volumes of data (both historical data and real-time streaming data). As a result of these developments, AI is playing an increasingly important role in helping create a greener, more sustainable world.
Currently, companies and government organizations are utilizing AI-based carbon intelligence platforms to analyze and report on their carbon emissions, while the Internet giants are applying neural networks and other ML algorithms to optimize energy usage in their massive data centers in order to decrease their carbon footprint.
Due to the increasing accuracy of machine vision systems (largely attributed to the use of deep learning neural net architectures), organizations are using smart cameras and ML to analyze video and still images of the insides of garbage bins, for example, to streamline their waste management and recycling operations.
It’s important to note, however, that although AI gets most of the attention, most of the AI applications used to solve real-world problems, including those covered here, do not operate standalone. Rather, they combine various AI techniques with other advanced supporting technologies. For example, big data management platforms, like Hadoop and NoSQL databases, are frequently used for managing and applying AI analytics to large amounts of data from Internet of Things (IoT) applications (like real-time data streaming from cameras and other sensors deployed on vehicles, machinery, equipment, and other connected devices).
A more recent development is digital twins, an advanced form of simulation and modeling for creating digital versions of machines, buildings, vehicles, infrastructure, equipment, processes, and other assets in software and for visualizing and analyzing these models as they run. These models are synchronized with the actual physical assets or processes they model. This means that changes simulated in a model can be applied to the assets they are modeling (e.g., to lessen loads carried in heavy haul trucks in order to reduce the carbon output of certain mining operations).
Digital twins applications often rely on edge computing, ML, IoT, and other AI technologies for assimilating and analyzing data from sensors used to monitor physical objects (e.g., building management systems, jet engines, pollution control devices) in order to identify patterns and anomalies and to predict outcomes based on these analyses.
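To make the pattern concrete, here is a minimal sketch of the kind of sensor-stream analysis a digital twin pipeline performs. The function, data, and thresholds are hypothetical (none come from any vendor described in this article); it simply flags readings that deviate sharply from the recent baseline, the basic anomaly-detection step that fancier ML models elaborate on.

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Flag sensor readings that deviate sharply from the recent baseline.

    A reading is anomalous if it lies more than `threshold` standard
    deviations from the mean of the preceding `window` readings.
    """
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append((i, readings[i]))
    return anomalies

# Simulated temperature stream from a monitored asset: steady around 70,
# with one injected spike that the detector should flag.
stream = [70.0 + 0.1 * (i % 5) for i in range(40)]
stream[30] = 95.0  # injected fault
print(detect_anomalies(stream))  # → [(30, 95.0)]
```

A production digital twin would replace the rolling z-score with learned models and feed confirmed anomalies back into the simulated asset, but the monitor-compare-flag loop is the same.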
These are just some examples of how organizations are leveraging AI and complementary technologies to support their decarbonization and other sustainability initiatives.
Carbon Intelligence Platforms & Services
One of the biggest problems companies confront in their sustainability efforts is developing an accurate representation of their carbon output that allows them to implement practical sustainability plans and meet their carbon reduction and other climate goals. Companies also struggle with managing and analyzing the vast amounts of carbon-related data necessary to model their carbon footprint. This includes a lack of scalable carbon reporting infrastructure as well as an absence of in-house experts who can analyze and apply the findings.
Technology providers see a viable market in helping such companies. They now offer comprehensive cloud-based carbon intelligence platforms and services employing machine vision, predictive modeling, and other ML techniques, as well as IoT and digital twins technologies. These platforms are designed to help organizations track, measure, and model their emissions in real or near real time and chart the progress of their sustainability initiatives. Some also implement market mechanisms for issuing, buying, and selling carbon offset credits.
CarbonChain
CarbonChain’s platform targets companies in the extraction industries, specifically for tracking and reducing the carbon emissions of their supply chains.1 This includes some of the world’s most polluting industries: oil and gas, metals, mining, minerals, and agriculture, which are estimated to be responsible for approximately 50% of global greenhouse gas emissions.2
The platform uses various technologies to develop models for accurately measuring carbon dioxide (CO2) emissions arising from supply chain operations in the global commodities sector, including ML, big data analytics, IoT, and digital twins. It also maintains a database of greenhouse gas emissions facts and figures reflective of the assets and operations of commodities companies around the world. This data is used in conjunction with digital twins to model equipment, processes, activities, and other assets utilized in the supply chains of commodities industries, ranging from excavators, earth movers, farming equipment, and refining plants (along with other machinery used in mining operations and specific pieces of manufacturing equipment and factories) to ships, trucks, rail, and other transport.
CarbonChain’s experts work with companies’ sustainability and risk management departments to integrate the models and analytics into organizations’ workflows for tracking carbon risk. This includes a customized dashboard for monitoring carbon risk and supporting carbon reduction and other sustainability initiatives.
ENGIE North America
ENGIE’s Ellipse is a dedicated carbon reduction platform that helps companies track their carbon emissions in real time for net zero carbon reduction initiatives.3 It is designed to support global carbon mitigation efforts by enabling companies to design decarbonization strategies, track emissions, and chart their progress. It integrates into a company’s existing digital environment.
Ellipse features advanced analytics for accurately modeling an organization’s carbon output to execute on climate goals. Its ML algorithms are trained on insights from more than 1 million facilities to support “carbon-first” decision making in response to quickly changing market conditions. The Ellipse platform also mitigates the issues of companies having to manage large volumes of carbon-related data. Employing ML and custom APIs, it can aggregate and analyze dynamic data streams to provide an accurate view of emissions across an organization’s entire portfolio and supply chain, allowing organizations to identify hot spots and model supplier-specific mitigation scenarios. This real-time view serves to measure carbon as a key performance indicator on an ongoing basis. In addition, reporting, analysis, and data visualization capabilities connect project performance to expected outcomes for measuring carbon impact and ROI.
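Ellipse’s internals are not public, but the core aggregation task such platforms perform can be sketched simply: roll up emission records arriving from many data streams and rank the largest contributors, the “hot spots” the platform surfaces. The function name, record schema, and supplier names below are all hypothetical.

```python
from collections import defaultdict

def emissions_hotspots(records, top_n=3):
    """Aggregate emission records (kg CO2e) by supplier and rank the
    largest contributors -- the 'hot spots' a carbon platform surfaces."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["supplier"]] += rec["kg_co2e"]
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

# Hypothetical emission records drawn from two data streams.
records = [
    {"supplier": "SmelterCo", "kg_co2e": 120_000.0},
    {"supplier": "FreightLine", "kg_co2e": 45_000.0},
    {"supplier": "SmelterCo", "kg_co2e": 80_000.0},
    {"supplier": "PackagingInc", "kg_co2e": 5_000.0},
]
print(emissions_hotspots(records, top_n=2))
# → [('SmelterCo', 200000.0), ('FreightLine', 45000.0)]
```

Once emissions are normalized into records like these, treating carbon as a key performance indicator reduces to ordinary analytics over the aggregates.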
According to climate scientists, reducing emissions is just one part of the solution to the climate change problem. Governments and industries must also focus on removing carbon from the air in order to make a real impact on the buildup of CO2 in the earth’s atmosphere.
Pachama
Pachama comprises scientists, entrepreneurs, and engineers who believe that forests are the most cost-effective and practical way to reduce atmospheric carbon.4 However, to date, forests have been underutilized due to a lack of reliable data on how much CO2 they can actually absorb. Moreover, manual tracking of forest carbon absorption is difficult and time-consuming, especially when attempting to apply it efficiently to support commercial carbon offset markets. Pachama’s technology provides a solution to this problem and offers a platform that simplifies how companies buy and sell carbon credits.
Pachama’s platform uses machine vision, predictive analytics, and other ML techniques for remote verification and monitoring of forests. Specifically, it analyzes satellite and LiDAR (light detection and ranging) imagery of forest growth to accurately predict the amount of carbon a particular forest is capturing from the atmosphere.
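Pachama’s actual models are proprietary, but the standard approach in forest remote sensing gives a feel for the calculation: estimate above-ground biomass from canopy height via an allometric relation, take carbon as roughly 47% of biomass, and multiply by 44/12 to convert carbon mass to CO2 equivalent. The power-law coefficients below are illustrative stand-ins, not calibrated values.

```python
def carbon_stock_tonnes(mean_canopy_height_m, area_ha,
                        a=0.2, b=1.7, carbon_fraction=0.47):
    """Rough forest carbon estimate from remote-sensing canopy height.

    Above-ground biomass density (tonnes/ha) is modeled with a power-law
    allometric relation a * height**b (illustrative coefficients).
    Carbon is ~47% of biomass; multiplying by 44/12 converts carbon
    mass to CO2 equivalent.
    """
    biomass_per_ha = a * mean_canopy_height_m ** b
    carbon = biomass_per_ha * area_ha * carbon_fraction
    return carbon * 44.0 / 12.0  # tonnes CO2e

# Hypothetical stand: 25 m mean canopy height measured by LiDAR over 100 ha.
print(carbon_stock_tonnes(25.0, 100.0))
```

In practice the ML is in the first step: learning the mapping from satellite and LiDAR imagery to biomass, which a single power law only crudely approximates.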
Pachama also implements a carbon offsets market that allows landowners involved in forest restoration and other conservation efforts to receive carbon credits for their work. Companies can purchase these credits to offset their CO2 emissions (one credit equals one ton of CO2 emissions). They can also choose to support a number of reforestation and conservation efforts taking place around the world, review these projects, and invest in those that align with their sustainability goals.
In short, Pachama’s platform serves to validate reforestation and other conservation efforts, ensuring accountability, integrity, and transparency in a carbon offsets market. According to the company, Pachama is protecting nearly 2 million hectares of forest to date.5
Infrastructure: Key Target for Carbon Reduction Initiatives
Infrastructure, which includes data centers, supply chains, transportation, manufacturing, waste management, and various industrial operations and processes, is a key area companies are targeting for carbon reduction. These areas tend to be high-carbon-output generators. In this section, we explore some infrastructure initiatives.
Data Center Optimization
AI is playing an important part in data center transformation, including making data centers more energy-efficient to address environmental and cost concerns. Data center infrastructure has undergone significant transformation over the last five years to support the immense growth in data traffic volumes that accompanies accelerating developments around digital transformation, e-commerce, outsourcing, and the move to cloud computing.
Major data center providers like Amazon, Facebook, Google, Huawei, and Microsoft use AI to optimize their data centers. This includes precisely managing and predicting cooling system demand, controlling energy consumption, and forecasting availability of electricity generated by green energy sources.
Google & DeepMind AI
One of the main uses of energy in a data center is cooling the large number of servers that make up a modern facility. But data centers are complex environments that are difficult to optimize using conventional energy management systems. To address this problem, Google turned to its AI subsidiary, DeepMind, to develop an ML application that could automate its data center cooling systems to reduce overall energy consumption.
DeepMind developers trained several deep learning neural net models using data streamed from the thousands of sensors deployed on servers and other data center equipment, including data pertaining to various parameters and operations (e.g., temperature, pump speeds/pressure, power consumption/fluctuation). The models detected trends and anomalies in power usage, which developers were able to apply to model data center dynamics and optimize efficiency. Developers also trained additional neural nets to predict future temperatures and pressure in the data center — information used to ensure that data center operators do not exceed operating constraints that could lead to abnormal energy usage. According to the company, deploying ML in its data centers allowed Google to reduce the amount of energy used for cooling by up to 40%.6
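DeepMind’s production models are deep neural networks trained on thousands of sensors; as a hedged, minimal analogue of the prediction step, the sketch below fits an ordinary least-squares line mapping one current sensor reading (power draw) to a future temperature. All data values are invented for illustration.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y ≈ w*x + b -- a tiny stand-in for the
    neural nets described above; same idea: learn to predict a future
    reading from current sensor values."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    w = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return w, my - w * mx

# Hypothetical training pairs: (current power draw in kW,
# server-aisle temperature one hour later in °C).
power = [200.0, 250.0, 300.0, 350.0, 400.0]
temp_next = [22.0, 23.5, 25.0, 26.5, 28.0]
w, b = fit_linear(power, temp_next)

# Predicted temperature at 320 kW: an operator can compare this against
# the facility's operating constraints before they are exceeded.
print(round(w * 320.0 + b, 2))  # → 25.6
```

The value of the prediction is exactly what the paragraph above describes: acting on a forecast before a constraint is violated, rather than reacting after the fact.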
Powering Data Centers with Renewable Energy
The most straightforward way to reduce data centers’ carbon footprint is to run them entirely on renewable energy. But this is not as easy as it sounds because energy from renewable sources such as wind and solar is intermittent. Consequently, green energy may simply not be available for purchase when needed.
Data center providers are using AI to address this problem. For example, Google used DeepMind to build a predictive modeling application that helps its data center operators predict how much energy a particular wind farm will generate in a specific period. This knowledge allows engineers to better plan data center energy consumption needs and has led to significant cost reductions.
AI for Traffic Optimization & Sustainable Flight Selection
Companies, government agencies, and other organizations, often in concert, are building AI systems that analyze geographic and other location data for a range of applications. Some of the more popular involve transportation optimization, such as traffic control and airline flight selection.
Google is developing AI applications for optimizing the efficiency of traffic signals to reduce congestion and CO2 emissions by limiting the amount of time vehicles spend idling in traffic. The goal is to make routes more efficient across an entire city or roadway network. An initial pilot conducted at four intersections in Israel used predictive analytics to forecast traffic conditions and other ML algorithms to improve the timing of traffic light changes. Google claims it is seeing a 10%-20% reduction in fuel consumption and delay time at intersections.7 A similar project is in the works to apply the technology in Rio de Janeiro, Brazil.
Google is also leveraging its search engine and mapping dominance to get more consumers involved in carbon reduction efforts. This includes fuel-efficient routing with Google Maps and providing information on the carbon footprint of flight options on Google Flights:
Google Maps and most fuel-efficient routing. Daily automobile travel is a carbon-intensive activity. Google has added a feature to Google Maps that lets users choose the most fuel-efficient driving route; the company estimates this has the potential to avoid over 1 million tons of carbon emissions per year (equal to removing more than 200,000 cars from the road). The feature should appeal to climate-conscious and non-climate-conscious consumers alike because it saves users money at the pump by reducing their vehicles’ fuel consumption. It became available in the US in October 2021 and is slated for Europe in early 2022.
Google Flights and carbon estimation. Currently, users can view carbon emissions information alongside the price and duration of flight information when they search Google Flights.8 To produce these estimates, Google Flights combines data from the European Environmental Agency (EEA) with flight-specific data provided by airlines and others, including type of aircraft, trip distance, and the number of seats in each seating class. Google/Alphabet CEO Sundar Pichai discusses this and other Google sustainability efforts on his blog.9
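Google Flights’ exact methodology is not fully public, but the shape of such an estimate can be sketched: total the flight’s CO2 from a per-aircraft fuel-burn rate and distance (using the standard factor of about 3.16 kg CO2 per kg of jet fuel), then apportion it across seats, weighting premium cabins more heavily because they occupy more floor space. The fuel-burn rates and class weights below are illustrative assumptions, not EEA or airline figures.

```python
# Illustrative whole-aircraft fuel-burn rates (kg fuel per km); real
# methodologies use per-aircraft-type data such as the EEA's.
FUEL_BURN_KG_PER_KM = {"narrow_body": 3.0, "wide_body": 7.5}
CO2_PER_KG_FUEL = 3.16  # kg CO2 emitted per kg of jet fuel burned
CLASS_WEIGHT = {"economy": 1.0, "business": 2.0}  # seat-area weighting

def per_passenger_co2_kg(aircraft, distance_km, seats, cabin="economy"):
    """Apportion a flight's total CO2 across its seats, weighting premium
    cabins more heavily because they occupy more aircraft floor space."""
    total_co2 = FUEL_BURN_KG_PER_KM[aircraft] * distance_km * CO2_PER_KG_FUEL
    weighted_seats = sum(CLASS_WEIGHT[c] for c in seats)
    return total_co2 / weighted_seats * CLASS_WEIGHT[cabin]

# Hypothetical 1,000 km narrow-body flight with 150 economy seats.
seats = ["economy"] * 150
print(round(per_passenger_co2_kg("narrow_body", 1000, seats), 1))  # → 63.2
```

The same total emissions figure thus yields different per-passenger numbers by cabin, which is why a business-class seat on the same flight shows a larger footprint.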
Government agencies, businesses, universities, and other organizations are working together to build AI systems that analyze geomapped and other location-based data. There is considerable innovation in this area, particularly involving the implementation of traffic planning and safety applications that deliver the added benefit of reducing vehicle carbon footprints by reducing unnecessary drive times and vehicles stuck idling in traffic.
TranSEC
TranSEC (which stands for transportation state estimation capability) is a big data and ML application that analyzes driver data to track and alleviate urban traffic congestion.10 It was developed by the US Department of Energy’s Pacific Northwest National Laboratory (PNNL) to provide urban transportation analysts and traffic engineers with detailed information about city traffic patterns in order to relieve bottlenecks and other problems.
Most publicly available traffic information at the street level is sparse and incomplete. Consequently, traffic engineers have had to rely on isolated traffic counts, collision statistics, and speed data to determine roadway conditions. In contrast, TranSEC integrates anonymized GPS traffic data sets sourced from hundreds of thousands of online Uber drivers with other publicly available traffic sensor data to map street-level traffic flow over time. This big data integration and ML analysis capability provides a more encompassing picture of city traffic, including a method detailing how traffic is actually moving around cities.
Basically, TranSEC provides street-level estimation over a large metropolitan area. Moreover, the tool (and its models) is portable, so cities can apply it to any urban area where aggregated traffic data is available. TranSEC differs from consumer smartphone traffic apps like Waze: while the latter is designed for individual drivers trying to get from point A to point B, city traffic engineers are concerned with helping all vehicles get to their destinations efficiently. This difference is key; a route that is efficient for an individual driver could lead to too many vehicles trying to access a road that was not built to handle high traffic volume. TranSEC also has a capability most other traffic-monitoring applications lack: it can analyze sparse and incomplete information.
In one pilot application, which used public data from the entire 1,200-square-mile Los Angeles, California, metropolitan area, the PNNL team reduced the time needed to create a traffic congestion model from hours to minutes.11 This is significant because it demonstrates that real-time (or near-real-time) traffic analysis is feasible.
TranSEC’s ML component improves the accuracy of analyses over time; as the tool ingests and processes more data, it becomes more refined. Here, the importance of ML analysis for traffic-monitoring applications shows its worth, by providing the ability to understand how disturbances spread across networks. Fed enough data, ML can predict impacts so that traffic engineers can create strategies to work around them.
PNNL researchers used a graph-based model combined with novel sampling methods and optimization engines to learn both the travel times and the routes. TranSEC has applications beyond roadway traffic optimization, including uses in analyzing transit and freight traffic. PNNL plans to make TranSEC available to municipalities nationwide. Researchers could also apply it to other scenarios, such as autonomous vehicle routes. TranSEC core technology is available for licensing for all fields of use. It is scalable and can be used with high-performance computing for building full-scale city models or desktop computers for smaller-scale modeling applications.
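PNNL’s implementation is far more sophisticated, but the core estimation idea, and the handling of sparse data that distinguishes TranSEC, can be sketched: estimate each road segment’s travel time from whatever GPS probe traversals exist, and fall back to the segment’s free-flow time where no probe data is available. Segment names and times below are hypothetical.

```python
def estimate_segment_times(observations, free_flow):
    """Street-level state estimation from sparse probe data.

    `observations` maps segment -> list of observed traversal times (s)
    from GPS traces; `free_flow` gives each segment's uncongested time,
    used as the fallback where no probe data exists -- i.e., the estimate
    degrades gracefully on sparse, incomplete input.
    """
    estimates = {}
    for seg, ff in free_flow.items():
        times = observations.get(seg, [])
        estimates[seg] = sum(times) / len(times) if times else ff
    return estimates

free_flow = {"A-B": 60.0, "B-C": 45.0, "C-D": 30.0}
observations = {"A-B": [90.0, 110.0, 100.0]}  # only one segment has probes
print(estimate_segment_times(observations, free_flow))
# → {'A-B': 100.0, 'B-C': 45.0, 'C-D': 30.0}
```

In the real system, a graph model propagates information between adjacent segments, so even unobserved streets are informed by nearby probes rather than by a flat free-flow default.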
Machine Vision & ML for Waste Management
Companies are using machine vision and ML to develop smart camera applications for optimizing waste management and recycling operations (e.g., to support smart cities projects, governments are equipping dumpsters with smart cameras that measure and track waste production).
By analyzing this waste output data, operators can optimize their waste collection services by eliminating unnecessary dumpster pickups, enabling cities and companies to shrink their carbon footprints and reduce costs. Additionally, AI-powered smart camera technology can identify contaminants in recycling streams for removal prior to entering a recycling facility.
Compology
Compology’s smart camera and complementary smart metering software are designed to modernize recycling infrastructure by identifying contaminants in recycling dumpsters before they contaminate loads in a processing facility. The software also measures and tracks waste production, providing users with the data they need to plan waste pickups effectively to reduce carbon emissions.
Compology’s smart cameras analyze dumpsters or recycling streams to identify four key data points: fullness, content, location, and activity. This allows garbage pickup or recycling centers to operate more efficiently. Specifically, Compology’s technology reduces the number of miles garbage trucks must drive and helps improve the amount of material that can be recycled by keeping it out of landfills.
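The scheduling logic behind those mileage savings can be sketched simply: once cameras report fullness, a threshold rule decides which dumpsters actually need a pickup this cycle and which can be skipped. The function, threshold, and site names below are hypothetical, not Compology’s actual rules.

```python
def pickups_needed(fullness_readings, threshold=0.75):
    """Return the dumpsters whose camera-measured fullness (0.0-1.0)
    warrants a pickup this cycle; the rest can safely be skipped."""
    return [d for d, fullness in fullness_readings.items()
            if fullness >= threshold]

# Hypothetical fleet of camera-monitored dumpsters.
readings = {"store_101": 0.9, "store_102": 0.3, "store_103": 0.8}
print(pickups_needed(readings))  # → ['store_101', 'store_103']
```

Every skipped pickup is a truck trip (and its fuel and emissions) avoided, which is the mechanism behind the savings described below.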
McDonald’s in Las Vegas, Nevada, teamed up with Global Trash Solutions (GTS) and Compology to use the latter’s smart cameras and waste-metering solution to take images inside McDonald’s dumpsters three times a day. By knowing just how full its dumpsters are and when they actually need to be emptied, McDonald’s was able to optimize its waste and recycling service levels to match its actual waste output, allowing the franchise to cut monthly waste spend by 31%.12
Optimizing dumpster pickup schedules enabled McDonald’s to save money and reduced the miles garbage trucks traveled to service its Las Vegas locations by approximately 8,000 miles, cutting carbon emissions by approximately 32 metric tons of CO2 per year. Waste metering also uncovered that 25% of McDonald’s Las Vegas dumpsters were contaminated with nonrecyclable materials.13 Compology’s smart metering system now auto-generates alerts notifying teams about specific dumpster contamination, allowing them to proactively remove the contaminants before they can foul recycling operations.
In October 2021, the US city of Miami, Florida, partnered with Compology to equip city dumpsters with smart cameras that measure how much and what types of waste are present. The city government is installing smart cameras in dumpsters in municipal buildings, police and fire stations, and parks, including the Miami Marlins’ Major League Baseball ballpark. This effort will help modernize the city’s recycling infrastructure and is expected to reduce waste collection costs by a projected 30%-40%.14
By implementing Compology’s technology, the city expects to reduce CO2 emissions, traffic congestion, noise pollution, and illegal dumping, while improving code compliance and enhancing urban development efforts. Data from this initial project will underpin the design of a waste-metering program for all municipal buildings and private businesses in Miami.
Zero-Emission & Autonomous Heavy Equipment for Mining and Industry
Mining operations are some of the most carbon-intensive in the world. Naturally, mining companies are seeking ways to reduce their carbon-heavy footprints. Heavy equipment manufacturers have responded and are developing new lines of zero-emission and autonomous bulldozers, extractors, heavy-haul trucks, and other equipment for mining and industrial operations.
These vehicles utilize AI in several ways. AI (ML in particular) and machine vision are crucial components of autonomous driving systems. Additionally, AI serves to analyze the extremely large volumes of data streaming from myriad sensors deployed on heavy equipment mining vehicles, ranging from real-time data from sensors measuring operating environmental conditions (e.g., temperature, humidity, road conditions) to the state of key onboard components like battery energy usage, engine and power train KPIs, wear-and-tear, load-bearing hydraulics, and so on.
Caterpillar has several zero-emission/autonomous earth-moving machine projects underway. One is the Cat D11 XE electric drive dozer prototype. Another is the Cat R1700 XE LHD underground loader, featuring full battery-electric propulsion, which generates significantly less heat and noise.15 It has a 16.5-ton payload and an 11.2 mph top speed and offers multiple levels of autonomy, including line-of-sight remote, tele-remote, and copilot (i.e., requiring only directional input from the operator) operation. Additional components offer fully autonomous operation.
Cat 793 220-ton zero-emissions autonomous haul trucks are currently under development. In September 2021, Caterpillar and mining conglomerate Rio Tinto signed a memorandum of understanding in which the latter will assist Caterpillar in developing these zero-emissions heavy trucks, including testing them in preproduction trials at Rio Tinto’s Western Australian mining operations.16
No definite date has been given for when Rio Tinto will deploy the trucks in production. However, Rio Tinto Chief Commercial Officer Alf Barrios stated, “We should be able to validate these zero-emissions haul trucks in just a few years’ time.”17 It is anticipated that the first operational deployment of the new zero-emissions trucks will consist of 35 vehicles at Rio Tinto’s Australian Gudai-Darri mine.
Conclusion
This article provides some examples of how companies and governments are applying AI and complementary technologies to help create a greener, more sustainable world. In reality, there are too many use cases to cover.
The most important takeaway from these findings is that the technologies are currently available and that innovative organizations are now applying them to make progress with their decarbonization and other sustainability initiatives.
Finally, having researched this article, I believe that what prevents the world from averting globally disruptive climate change will not be a lack of advanced technology (that exists today) but rather a lack of will on the part of our political leaders.
1“About Us.” CarbonChain, 2021.
2Oberle, Bruno, et al. “Global Resources Outlook 2019: Natural Resources for the Future We Want.” United Nations Environment Programme (UNEP), 2019.
3“ENGIE Launches Ellipse, World’s Most Comprehensive Carbon Intelligence Program.” Press release, ENGIE North America, 20 September 2021.
4“About Us.” Pachama, 2021.
5“Restoring Nature to Solve Climate Change.” Pachama, 2021.
6Evans, Richard, and Jim Gao. “DeepMind AI Reduces Google Data Centre Cooling Bill by 40%.” DeepMind, 20 July 2016.
7Pichai, Sundar. “Giving You More Sustainable Choices with Google.” The Keyword, Google, 6 October 2021.
8Holden, Richard. “Find Flights with Lower Carbon Emissions.” The Keyword, Google, 6 October 2021.
9Pichai (see 7).
10“TranSEC: Tracking and Alleviating Urban Congestion.” US Department of Energy/Pacific Northwest National Laboratory (PNNL), September 2021.
11US Department of Energy/PNNL (see 10).
12“Waste Metering Helps McDonald’s Cut Costs by 31% While Reducing Carbon Emissions.” Compology, accessed October 2021.
13Compology (see 12).
14“Live on ABC Miami: Dumpster Cameras May Be the Key to Cleaning Up Miami’s Trash Trouble.” Compology, 29 October 2021.
15Caterpillar Mining. “The Cat R1700 XE Battery Electric LHD.” YouTube, 22 September 2021.
16“Rio Tinto Teams Up with Caterpillar for Zero-Emissions Autonomous Trucks.” Press release, Rio Tinto, 14 September 2021.
17Rio Tinto (see 16).