Reducing AI’s Carbon Footprint

Empower AI developers

Figure: GPU energy usage graph showing 4 of 8 GPUs in use.

Progress in machine learning is measured in part through the constant improvement of performance metrics such as accuracy or latency. Carbon footprint metrics, though equally important, have not received the same degree of attention. With contributions from our research team, Azure ML now provides transparency around machine learning resource utilization, including GPU energy consumption and computational cost, for both training and inference at scale. This reporting can raise developers’ awareness of the carbon cost of their model development process and encourage them to optimize their experimentation strategies.
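To make the idea of carbon-cost reporting concrete, here is a minimal back-of-the-envelope sketch of how GPU energy consumption translates into emissions. The function name, power draw, runtime, and grid carbon intensity are all illustrative assumptions, not Azure ML's actual accounting methodology.

```python
# Rough energy/carbon estimate for a multi-GPU training job: a minimal
# sketch with illustrative numbers, not Azure ML's own reporting logic.

def training_carbon_kg(gpu_count, avg_power_watts, hours, grid_kg_co2_per_kwh):
    """Estimate CO2 emissions (kg) from GPU energy draw over a training run."""
    energy_kwh = gpu_count * avg_power_watts * hours / 1000.0
    return energy_kwh * grid_kg_co2_per_kwh

# Example: 8 GPUs drawing ~300 W each for 24 hours on a grid emitting
# 0.4 kg CO2 per kWh -> 57.6 kWh of energy, about 23 kg of CO2.
print(training_carbon_kg(8, 300, 24, 0.4))  # 23.04
```

Seeing even a rough number like this per experiment is what can prompt a developer to, say, stop low-promise runs early or batch experiments on lower-carbon grids.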

Read the blog >

An animated illustration of the neural architecture search platform Archai automatically identifying neural network architectures for a given dataset.

Archai, an open-source tool, can inform model development tradeoffs. In combination with a set of Neural Architecture Search (NAS) algorithms, Archai can perform a cost-aware architecture search, where “cost” can represent different resources of interest, such as compute time or peak memory footprint. Running Archai presents the model developer with the entire spectrum of cost vs. accuracy tradeoffs, letting them choose the tradeoff that best meets their needs.
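The "spectrum of tradeoffs" such a search produces is a Pareto frontier: the set of candidate architectures for which no other candidate is both cheaper and more accurate. The sketch below illustrates that selection step on hypothetical (cost, accuracy) pairs; it does not use Archai's actual API or data structures.

```python
# Sketch of extracting the cost-vs-accuracy Pareto frontier from a set of
# NAS candidates. The (cost, accuracy) tuples are illustrative.

def pareto_frontier(candidates):
    """Return candidates not dominated by any other candidate.

    A candidate is dominated if some other candidate has cost no higher
    and accuracy no lower (and is not an identical point).
    """
    frontier = []
    for cost, acc in candidates:
        dominated = any(
            c <= cost and a >= acc and (c, a) != (cost, acc)
            for c, a in candidates
        )
        if not dominated:
            frontier.append((cost, acc))
    return sorted(frontier)

# Hypothetical search results: (compute cost, validation accuracy).
models = [(10, 0.90), (12, 0.91), (15, 0.91), (20, 0.95), (25, 0.94)]
print(pareto_frontier(models))  # [(10, 0.9), (12, 0.91), (20, 0.95)]
```

Here (15, 0.91) is dropped because (12, 0.91) matches its accuracy at lower cost, and (25, 0.94) is dropped because (20, 0.95) is both cheaper and more accurate; the developer then picks a point on the remaining frontier.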

Finally, Accera (opens in new tab) is an open-source compiler that aggressively optimizes for AI workloads. The Accera compiler doesn’t change or approximate a model; rather, it finds the most efficient implementation of that model. For example, matrix multiplication with a ReLU activation is commonly used in machine learning algorithms; by optimizing its implementation, developers can reduce the computational intensity of running their models.
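For reference, the matmul-plus-ReLU pattern mentioned above can be written as a naive loop nest, which is the kind of kernel an optimizing compiler replaces with a tuned implementation (tiling, vectorization, fusion). This plain-Python version only defines the semantics; the shapes and values are illustrative, and it is not Accera code.

```python
# Naive loop-nest reference for matrix multiplication followed by ReLU,
# i.e. C[i][j] = max(sum_k A[i][k] * B[k][j], 0). An optimizing compiler
# keeps these semantics but emits a far more efficient implementation.

def matmul_relu(A, B):
    """Multiply A (m x k) by B (k x n), then apply ReLU elementwise."""
    m, k, n = len(A), len(B), len(B[0])
    C = [[0.0] * n for _ in range(m)]
    for i in range(m):
        for j in range(n):
            acc = 0.0
            for p in range(k):
                acc += A[i][p] * B[p][j]
            # Fusing ReLU into the same loop avoids a second pass over C.
            C[i][j] = max(acc, 0.0)
    return C

A = [[1.0, -2.0], [3.0, 4.0]]
B = [[1.0, 0.0], [0.0, 1.0]]  # identity, so ReLU just clamps A's negatives
print(matmul_relu(A, B))  # [[1.0, 0.0], [3.0, 4.0]]
```

The point of a compiler like Accera is that the developer writes (or imports) something equivalent to this reference, and the compiler searches for the most efficient schedule of the same computation, so the model's output is unchanged while its computational intensity drops.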