News & features
Improve edge-device AI efficiency
Machine learning models are increasingly running on edge hardware, such as mobile phones or Internet of Things (IoT) devices. Motivations include protecting private data and avoiding network latency, for example in applications that recognize speech. Ensuring efficient inference…
Emit less carbon from AI
Multiple activities are involved in developing and using machine learning models, including selection of model architectures and algorithms, hyperparameter tuning, training on existing datasets, and making predictions on new data (aka inference). Optimizing results across these activities involves many complex…
Empower AI developers
Progress in machine learning is measured in part through the constant improvement of performance metrics such as accuracy or latency. Carbon footprint metrics, though an equally important target, have not received the same degree of attention. With contributions from…
In the news | Microsoft Power Apps
Behind the Scenes – What it Takes to Teach GPT-3 How to Build Low-Code Apps
Last year, we announced the public preview of Power Apps Ideas, which enables Power Apps makers to take advantage of Microsoft AI technologies that make it easy to write Power Fx formulas with no code. In this article, we’re going to…
In the news | The Register
Microsoft, OpenAI method could make training large neural networks cheaper
The cost of tuning hyperparameters with µTransfer was just 7% of the cost of pre-training GPT-3. Companies scaling up their neural network models could cut expensive training costs by employing a technique developed by researchers at Microsoft and OpenAI.
µTransfer: A technique for hyperparameter tuning of enormous neural networks
| Edward Hu, Greg Yang, and Jianfeng Gao
Great scientific achievements cannot be made by trial and error alone. Every launch in the space program is underpinned by centuries of fundamental research in aerodynamics, propulsion, and celestial bodies. In the same way, when it comes to building large-scale…
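The recipe behind µTransfer is to tune hyperparameters on a small proxy model and reuse them, unchanged, on the full-size target. The sketch below illustrates only that tune-small-then-transfer workflow; the widths, toy MLP, random data, and learning-rate grid are illustrative assumptions, and the µP parameterization that actually keeps the optimum stable across width (released by the authors as the open-source `mup` package) is not shown.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical widths: a narrow proxy model for tuning and a wide target model.
SMALL_WIDTH, LARGE_WIDTH = 256, 4096

def make_model(width: int) -> nn.Sequential:
    # Toy stand-in for a large network; only the hidden width varies.
    return nn.Sequential(nn.Linear(32, width), nn.ReLU(), nn.Linear(width, 10))

def train_and_score(model: nn.Module, lr: float) -> float:
    # Placeholder training loop on random data; returns the final training loss.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    x, y = torch.randn(512, 32), torch.randint(0, 10, (512,))
    for _ in range(50):
        opt.zero_grad()
        loss = F.cross_entropy(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()

# 1) Sweep hyperparameters on the cheap, narrow proxy model.
candidate_lrs = [3e-4, 1e-3, 3e-3, 1e-2]
best_lr = min(candidate_lrs, key=lambda lr: train_and_score(make_model(SMALL_WIDTH), lr))

# 2) Transfer the winning hyperparameters to the wide target model and train it once.
#    Under the standard parameterization this optimum drifts as width grows;
#    µP rescales initializations and per-layer learning rates so it stays put.
final_loss = train_and_score(make_model(LARGE_WIDTH), best_lr)
print(f"best proxy lr: {best_lr:.0e}, target-model loss: {final_loss:.3f}")
```

Because the sweep runs only on the small proxy, the search itself is a small fraction of the cost of training the full model, which is where the 7% figure quoted above comes from.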
In the news | Microsoft Green Tech Blog
Charting the path towards sustainable AI with Azure Machine Learning resource metrics
Today, we are announcing a new set of resource metrics available in Azure Machine Learning to help customers understand the computational and energy costs of their AI workloads across the machine learning lifecycle. Azure Machine Learning is a platform that…
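As a rough illustration of what such resource metrics enable, the conversion from utilization data to energy and emissions estimates looks like the back-of-the-envelope sketch below. The power draw, job size, runtime, and grid carbon intensity are made-up example numbers, not Azure figures; the service reports its own measured metrics.

```python
# Hypothetical example: turning measured GPU utilization into energy and CO2 estimates.
avg_gpu_power_watts = 250        # assumed average power draw per GPU during the job
num_gpus = 8                     # assumed number of GPUs used by the job
runtime_hours = 36               # assumed wall-clock training time
grid_intensity_kg_per_kwh = 0.4  # assumed carbon intensity of the local grid

energy_kwh = avg_gpu_power_watts * num_gpus * runtime_hours / 1000.0
emissions_kg = energy_kwh * grid_intensity_kg_per_kwh

print(f"Estimated energy: {energy_kwh:.0f} kWh")          # 72 kWh for these numbers
print(f"Estimated emissions: {emissions_kg:.1f} kg CO2e")  # 28.8 kg CO2e for these numbers
```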
Factorized layers revisited: Compressing deep networks without playing the lottery
| Misha Khodak, Neil Tenenholtz, Lester Mackey, and Nicolo Fusi
From BiT (928 million parameters) to GPT-3 (175 billion parameters), state-of-the-art machine learning models are rapidly growing in size. With the greater expressivity and easier trainability of these models come skyrocketing training costs, deployment difficulties, and even climate impact. As…
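The compression idea the post revisits, factorizing a layer's weight matrix into two low-rank factors, can be sketched in a few lines of PyTorch. The class name, dimensions, and rank below are illustrative assumptions, and the sketch omits the initialization and regularization choices the post actually studies.

```python
import torch.nn as nn

class FactorizedLinear(nn.Module):
    """Low-rank replacement for nn.Linear: the (out x in) weight matrix W is
    approximated as U @ V with U (out x r) and V (r x in), cutting the parameter
    count from out*in to roughly r*(out + in)."""
    def __init__(self, in_features: int, out_features: int, rank: int):
        super().__init__()
        self.V = nn.Linear(in_features, rank, bias=False)  # r x in factor
        self.U = nn.Linear(rank, out_features)              # out x r factor (keeps the bias)

    def forward(self, x):
        return self.U(self.V(x))

# Example: a 1024 -> 1024 dense layer has ~1.05M parameters;
# at rank 64 the factorized version has ~132K.
dense = nn.Linear(1024, 1024)
compact = FactorizedLinear(1024, 1024, rank=64)
print(sum(p.numel() for p in dense.parameters()),
      sum(p.numel() for p in compact.parameters()))
```

The rank is the knob that trades model size for accuracy; how to initialize and regularize such factors so the compressed network trains well from scratch, rather than relying on lottery-ticket-style pruning, is the question the post takes up.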
In the news | Official Microsoft Blog
One year later: The path to carbon negative – a progress report on our climate ‘moonshot’
A year ago, we launched the biggest commitment in Microsoft’s history to focus on the climate crisis. As Satya Nadella, Amy Hood, and I announced last January, Microsoft committed to become carbon negative as a company by 2030 – meaning that by that date we will remove…