
Reducing AI’s Carbon Footprint

Pursuing more efficient AI 

We must address climate change to secure a sustainable future. The scientific consensus, set out in Intergovernmental Panel on Climate Change (IPCC) reports, is clear: the world confronts an urgent greenhouse gas problem. Carbon dioxide (CO2) is the greenhouse gas that persists longest in the atmosphere, forming a blanket that traps heat and is changing the world's climate. The planet's temperature has already risen by 1 degree Celsius (1.8 degrees Fahrenheit). If emissions are not curbed and temperatures continue to climb, the science tells us the results will be catastrophic. With global emissions now exceeding 50 gigatons of CO2 annually, reaching net-zero in less than 30 years will require a challenging, systemic transformation of industries, infrastructures, economies, and societies around the world.

Artificial intelligence (AI) can help accelerate these necessary transformations. For example, AI-based systems can better integrate variable renewable energy into a stable electricity grid, and they can reduce the cost of carbon capture by accelerating the discovery of new materials with desired properties. At the same time, AI technology itself needs to be environmentally sustainable. As AI models based on deep learning have grown in both scale and breadth of application, research has increasingly focused on ensuring that AI models use computing resources more efficiently.

AI models run on hardware: computers that combine resources such as processors, memory, and networks. Computing gives rise to two forms of CO2 emissions. First, most computing resources are powered by electricity, and so-called operational carbon emissions arise when the source of that electricity is not carbon-free. Fortunately, all major cloud providers (including Microsoft) either already power their cloud computing datacenters with 100% carbon-free energy or have roadmaps to do so by 2030. Second, so-called embodied carbon emissions arise because the manufacturing processes that create computing hardware generate carbon emissions of their own, which are then attributed to the manufactured products. Reducing embodied carbon can be very challenging, as discussed in Microsoft's 2021 Sustainability Report.
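To make the two emission sources above concrete, here is a back-of-the-envelope sketch, not Microsoft's methodology: operational emissions scale with electricity consumed and the grid's carbon intensity, while embodied emissions can be attributed to a workload by amortizing the hardware's manufacturing footprint over its service lifetime. All numeric values in the example are illustrative assumptions.

```python
def operational_emissions_kg(energy_kwh: float, grid_intensity_kg_per_kwh: float) -> float:
    """Operational CO2: electricity consumed times the grid's carbon intensity.
    A carbon-free grid (intensity 0) yields zero operational emissions."""
    return energy_kwh * grid_intensity_kg_per_kwh

def amortized_embodied_kg(hardware_embodied_kg: float,
                          hardware_lifetime_hours: float,
                          job_hours: float) -> float:
    """Embodied CO2 attributed to one job: the hardware's manufacturing
    footprint, prorated by the job's share of the hardware's lifetime."""
    return hardware_embodied_kg * (job_hours / hardware_lifetime_hours)

# Illustrative example: a 24-hour training job on one server.
energy_kwh = 24 * 0.5  # 24 h at an assumed average draw of 0.5 kW -> 12 kWh
operational = operational_emissions_kg(energy_kwh, 0.4)  # assumed 0.4 kg CO2/kWh grid mix
embodied = amortized_embodied_kg(
    hardware_embodied_kg=1500,            # assumed manufacturing footprint of the server
    hardware_lifetime_hours=4 * 365 * 24,  # assumed 4-year service life
    job_hours=24,
)
total = operational + embodied
```

The sketch also shows why efficiency gains matter twice over: a job that finishes in fewer processor-hours draws less electricity and claims a smaller share of the hardware's embodied footprint.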

Research that makes AI run more efficiently on computing hardware, using less processor time, less memory, and so on, can reduce both the operational and embodied emissions associated with AI-based tasks. The challenge is to achieve this without sacrificing other desirable attributes, in particular the accuracy of the predictions that AI models are built to deliver. Click on the focus areas below to learn more about our research into AI efficiency gains across the machine learning life cycle (model selection, hyperparameter tuning, training, and inference), our efforts to extend those gains beyond the cloud to AI running on constrained edge hardware, and the tools we are developing to help AI developers make their models more sustainable.

Our areas of focus


Emit less carbon from AI

Make more efficient use of computational resources for AI while maintaining machine learning model accuracy


Improve edge-device AI efficiency

Ensure resource and energy efficiency on constrained hardware, such as smartphones and IoT devices


Empower AI developers

Provide tools that enable AI developers to find appropriate tradeoffs between performance and carbon emissions