Thought leadership - Microsoft Industry Blogs
http://approjects.co.za/?big=en-us/industry/blog/content-type/thought-leadership/
Wed, 25 Sep 2024 22:39:21 +0000

Sustainable by design: Innovating for energy efficiency, part 2
http://approjects.co.za/?big=en-us/microsoft-cloud/blog/2024/09/26/sustainable-by-design-innovating-for-energy-efficiency-in-ai-part-2/
Thu, 26 Sep 2024 16:00:00 +0000

In this blog, I’d like to share a few examples of how we’re bringing promising efficiency research out of the lab and into commercial operations.

The post Sustainable by design: Innovating for energy efficiency, part 2 appeared first on Microsoft Industry Blogs.



Learn more about how we’re making progress towards our sustainability commitments in part 1 of this blog: Sustainable by design: Innovating for energy efficiency in AI, part 1.

As we continue to deliver on our customer commitments to cloud and AI innovation, we remain resolute in our commitment to advancing sustainability. A critical part of achieving our company goal of becoming carbon negative by 2030 is reimagining our cloud and AI infrastructure with power and energy efficiency at the forefront.

We’re pursuing our carbon negative goal through three primary pillars: carbon reduction, carbon-free electricity, and carbon removal. Within the pillar of carbon reduction, power efficiency and energy efficiency are fundamental to sustainability progress, for our company and for the industry as a whole.

Although the terms “power” and “energy” are often used interchangeably, they describe different things: power efficiency is about managing peaks in instantaneous power utilization, whereas energy efficiency is about reducing the total amount of energy consumed over time.

This distinction becomes important to the specifics of research and application because of the type of efficiency in play. For an example of energy efficiency, you might choose to explore small language models (SLMs) with fewer parameters that can run locally on your phone, using less overall processing power. To drive power efficiency, you might look for ways to improve the utilization of available power by improving predictions of workload requirements.  
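The distinction above can be made concrete with a small sketch (illustrative only, not a Microsoft tool): energy is the area under a workload's power curve over time, while power efficiency is judged by the worst-case instantaneous draw. The traces and numbers below are made up.

```python
# Power efficiency targets the peak draw; energy efficiency targets the
# area under the curve. Two plans can do the same total work (same energy)
# while one demands far more peak power.

def peak_power_kw(trace_kw):
    """Worst-case instantaneous draw: what power efficiency manages."""
    return max(trace_kw)

def energy_kwh(trace_kw, interval_hours=1.0):
    """Total consumption over time: what energy efficiency reduces."""
    return sum(p * interval_hours for p in trace_kw)

# Hourly power draw (kW) for two schedules of the same total work.
spiky = [2, 10, 2, 2]    # bursty: same energy, higher peak
smooth = [4, 4, 4, 4]    # smoothed: same energy, lower peak

assert energy_kwh(spiky) == energy_kwh(smooth) == 16.0
assert peak_power_kw(spiky) > peak_power_kw(smooth)
```

Smoothing workload peaks improves power efficiency without changing energy use; shrinking the model (as with SLMs) reduces the energy itself.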

From datacenters to servers to silicon and throughout code, algorithms, and models, driving efficiency across a hyperscale cloud and AI infrastructure system comes down to optimizing the efficiency of every part of the system and how the system works as a whole. Many advances in efficiency have come from our research teams over the years, as we seek to explore bold new ideas and contribute to the global research community. In this blog, I’d like to share a few examples of how we’re bringing promising efficiency research out of the lab and into commercial operations.

Silicon-level power telemetry for accurate, real-time utilization data

We’ve made breakthroughs in delivering power telemetry down to the level of the silicon, providing a new level of precision in power management. Power telemetry on the chip uses firmware to help us understand the power profile of a workload while keeping the customer workload and data confidential. This informs the management software that provides an air traffic control service within the datacenter, allocating workloads to the most appropriate servers, processors, and storage resources to optimize efficiency.
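As a rough illustration of how per-silicon power telemetry can inform that "air traffic control" placement, here is a hypothetical sketch: given live power readings per server and a power cap, place a new workload on the server with the most headroom. The server names, wattages, and policy are illustrative assumptions, not Microsoft's actual management software.

```python
# Hypothetical placement policy driven by power telemetry: a workload goes
# to the server with the most spare power headroom, and is deferred if no
# server can absorb its expected draw under the cap.

def place_workload(telemetry_w, workload_w, cap_w):
    """Return the server with the most headroom that can host the workload
    without exceeding the per-server power cap, or None if none can."""
    candidates = {srv: cap_w - draw for srv, draw in telemetry_w.items()
                  if cap_w - draw >= workload_w}
    if not candidates:
        return None  # no safe host; defer or split the workload
    return max(candidates, key=candidates.get)

telemetry = {"srv-a": 420.0, "srv-b": 310.0, "srv-c": 480.0}  # watts drawn
assert place_workload(telemetry, workload_w=150.0, cap_w=500.0) == "srv-b"
assert place_workload(telemetry, workload_w=150.0, cap_w=450.0) is None
```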

Working collaboratively to advance industry standards for AI data formats

Inside the silicon, algorithms are working to solve problems by taking some input data, processing that data through a series of defined steps, and producing a result. Large language models (LLMs) are trained using machine learning algorithms that process vast amounts of data to learn patterns, relationships, and structures in language.

Simplified example from Microsoft Copilot: Imagine teaching a child to write stories. The training algorithms are like the lessons and exercises you give the child. The model architecture is the child’s brain, structured to understand and create stories. Inference algorithms are the child’s thought process when writing a new story, and evaluation algorithms are the grades or feedback you give to improve their writing.1

One of the ways to optimize algorithms for efficiency is to narrow the precision of AI data floating-point formats, which are specialized numerical representations used to handle real numbers efficiently. Working with the Open Compute Project, we’ve collaborated with other industry leaders to form the Microscaling Formats (MX) Alliance with the goal of creating and standardizing next-generation 6- and 4-bit data types for AI training and inferencing. 

Narrower formats allow silicon to execute more efficient AI calculations per clock cycle, which accelerates model training and inference times. These models take up less space, which means they require fewer data fetches from memory, and can run with better performance and efficiency. Additionally, using fewer bits transfers less data over the interconnect, which can enhance application performance or cut network costs. 
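To illustrate why narrower formats save memory and bandwidth, here is a toy block quantizer in the general spirit of microscaling: a block of values shares a single scale factor and each element is stored on a narrow signed-integer grid. The real MX data types are more sophisticated than this sketch, and the numbers below are illustrative.

```python
# Toy block quantization: one shared scale per block, narrow integers per
# element. 4-bit elements take 8x less space than 32-bit floats (plus one
# shared scale), at the cost of bounded rounding error within the block.

def quantize_block(values, bits):
    qmax = 2 ** (bits - 1) - 1             # e.g. 7 for 4-bit signed
    scale = max(abs(v) for v in values) / qmax or 1.0
    return [round(v / scale) for v in values], scale

def dequantize_block(quantized, scale):
    return [q * scale for q in quantized]

block = [0.12, -0.50, 0.33, 0.07]
q4, scale = quantize_block(block, bits=4)
approx = dequantize_block(q4, scale)

# Each element is recovered to within half a quantization step.
assert all(abs(a - b) <= scale / 2 + 1e-9 for a, b in zip(block, approx))
```

Fewer bits per element means more elements per memory fetch and per interconnect transfer, which is the efficiency the MX work standardizes across vendors.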

Driving efficiency of LLM inferencing through phase-splitting

Research also shows promise for novel approaches to large language model (LLM) inference, essentially separating the two phases of LLM inference onto separate machines, each well suited to that specific phase. Given the differences in the phases’ resource needs, some machines can underclock their AI accelerators or even leverage older generation accelerators. Compared to current designs, this technique can deliver 2.35 times more throughput under the same power and cost budgets.2
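A minimal sketch of the phase-splitting idea: the compute-heavy prompt-processing (prefill) phase and the memory-bound token-generation (decode) phase are routed to separate machine pools, so each pool can be provisioned, or underclocked, for its phase. The pool contents and round-robin policy below are illustrative assumptions, not the Splitwise implementation.

```python
from collections import deque

# Each inference phase gets its own pool of machines suited to that phase:
# prefill on current-generation accelerators, decode on older or
# underclocked ones, since decode needs less compute per token.

class PhaseSplitRouter:
    def __init__(self):
        self.pools = {
            "prefill": deque(["pf-0", "pf-1"]),          # newest accelerators
            "decode": deque(["dc-0", "dc-1", "dc-2"]),   # older/underclocked
        }

    def route(self, phase):
        """Round-robin a request within the pool dedicated to its phase."""
        pool = self.pools[phase]
        machine = pool.popleft()
        pool.append(machine)
        return machine

router = PhaseSplitRouter()
# One request's two phases land on different, phase-specialized machines.
assert router.route("prefill").startswith("pf-")
assert router.route("decode").startswith("dc-")
```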

Learn more and explore resources for AI efficiency

In addition to reimagining our own operations, we’re working to empower developers and data scientists to build and optimize AI models that can achieve similar outcomes while requiring fewer resources. As mentioned earlier, small language models (SLMs) can provide a more efficient alternative to large language models (LLMs) for many use cases, such as fine-tuning experimentation on a variety of tasks or even grade school math problems.

In April 2024, we announced Phi-3, a family of open, highly capable, and cost-effective SLMs that outperform models of the same and larger sizes across a variety of language, reasoning, coding, and math benchmarks. This release expands the selection of high-quality models for customers, offering practical choices for composing and building generative AI applications. We then introduced new models to the Phi family, including Phi-3.5-MoE, a Mixture of Experts model that combines 16 smaller experts into one, and Phi-3.5-mini. Both of these models are multilingual, supporting more than 20 languages.

Learn more about how we’re advancing sustainability through our Sustainable by design blog series, starting with Sustainable by design: Advancing the sustainability of AI.


1. Excerpt from prompting Copilot with: please explain how algorithms relate to LLMs.

2. Splitwise: Efficient generative LLM inference using phase splitting, Microsoft Research.

Driving business value with ESG data readiness
http://approjects.co.za/?big=en-us/industry/blog/sustainability/2024/09/25/driving-business-value-with-esg-data-readiness/
Wed, 25 Sep 2024 15:00:00 +0000

Customers from many industries have been reaching out to us to discuss how to move from reporting to carbon reductions, and how to identify opportunities to drive sustainability progress in their organizations.

The post Driving business value with ESG data readiness appeared first on Microsoft Industry Blogs.

Customers from many industries have been reaching out to us to discuss how they can move beyond environmental, social, and governance (ESG) reporting to spotting opportunities to drive sustainability progress and reduce carbon across their value chains. At Microsoft, we’ve been tackling related questions for nearly two decades, and in 2020 we committed to becoming carbon negative, zero waste, and water positive, and to protecting more land than we use. We share our successes and setbacks annually in our Environmental Sustainability Report.

Now, we’re sharing additional learnings—from our efforts and those of our global ecosystem of partners and customers, and from advice we’ve received from external sustainability experts—in the Leader’s Guide to Sustainable Business Transformation. The guide offers practical tips to help business leaders consider what steps can help their organizations build a culture and infrastructure around ESG data, and provides context for taking the Microsoft ESG Data Readiness Assessment.

For quick examples of industry-specific considerations addressed in the guide, see the chart below.


Leveraging the Leader’s Guide

When we began our own sustainability journey, we learned that to make progress on our ESG goals we had to bring sustainability out of siloed reporting efforts and into the core of our business. The Leader’s Guide discusses approaches to this operational shift, including ways to facilitate cross-functional conversations within your organization around the value of harnessing ESG data.

The guide also shows how Microsoft customers in different industries are using sustainability solutions to transform their operations. For example, international forestry group Södra had been limited to a labor-intensive process—siloed within its sustainability unit—to answer routine inquiries about its environmental data. Since collaborating to adopt Microsoft Sustainability Manager in 2023, Södra’s IT and sustainability teams have been generating significant insights for stakeholders, including the organization’s estimate that its positive climate impact is equal to about one-fifth of Sweden’s reported carbon dioxide emissions.

We’re ready to partner with your organization so you can also use ESG data to uncover operational insights to support your sustainability progress.

How ESG data supports business resilience

As organizations worldwide try to predict and prepare for emerging ESG disclosure requirements, they also need to respond to expanding market expectations. Investors, consumers, and shareholders are tracking companies’ sustainability commitments, and they’re looking for products and solutions that champion those commitments.

Our customer King Steel, a Taiwan-based global shoe manufacturer, faced this situation when major brands like Nike and Adidas began expecting sustainable and recyclable products. To align with its customers, King Steel started collecting and digitizing its ESG data using Dynamics 365 and capabilities within Microsoft Cloud for Sustainability. This shift to a digitized data estate not only enabled King Steel to deliver transparent data to demanding customers, it also helped the company uncover insights into materials and operations that resulted in more sustainable production, reduction of waste, and innovation of new customizable products.

By implementing a robust ESG data infrastructure, your organization can also respond to sustainability-driven market demands with speed and insights.

From commitments to solutions

Early in our journey, we quickly realized our environmental commitments necessitated better tools to manage the increasing number, size, and complexity of our ESG datasets. We also needed to unify siloed data and ensure traceability and transparency—to execute our plan to publicly self-disclose our ESG progress, and to prepare for the evolution of sustainability reporting requirements. Simultaneously, we wanted to use our ESG data to identify opportunities to drive sustainability efficiencies and business growth.

This led us to Microsoft Fabric and AI tools built on Microsoft Azure. By adopting these capabilities, we’ve now integrated our ESG, operational, and financial data, empowering our employees to access timely data intelligence so they can contribute ideas and innovate.

But to meet our ambitious sustainability commitments, we must also drive change across our value chain. In 2023, we estimated that 75% or more of our carbon footprint was coming from indirect, or Scope 3 emissions, which organizations accrue from suppliers. To address this, the Microsoft Procurement team needed customized ESG inquiries, granular data, and flexible, collaborative reporting. The team partnered with Microsoft engineers to add new capabilities to our data technologies, including low-code customization and self-service features to help our value-chain partners find ways to reduce their environmental impact.

Then we carried these robust solutions forward to our customers, so organizations like US farming powerhouse Land O’Lakes can access their ESG data for day-to-day decision-making. For example, the company relies on Azure Data Manager for Agriculture to collect and unify data on weather, soil, and irrigation—freeing Land O’Lakes data scientists to help optimize planting decisions. The Azure-based ESG data infrastructure also boosts the Land O’Lakes competitive stance by providing consumers with visibility into the organization’s farming practices and environmental outcomes.

Exploring industry-specific considerations

With the right tools, ESG data can support each industry’s unique set of goals, challenges, and opportunities.

Here’s a look at some of the issues our Leader’s Guide and ESG Assessment can help you start exploring:

Visit the Leader’s Guide for in-depth information and resources.

Next steps

As you rethink operations to support your organization’s sustainability progress, we’re ready to share our learnings and continuous innovation to help advance your ESG priorities, accelerate your growth, and partner for a shared sustainable future.

Sustainable by design: Innovating for energy efficiency in AI, part 1
http://approjects.co.za/?big=en-us/microsoft-cloud/blog/2024/09/12/sustainable-by-design-innovating-for-energy-efficiency-in-ai-part-1/
Thu, 12 Sep 2024 15:00:00 +0000

Read some examples of how we’re advancing the power and energy efficiency of AI.

The post Sustainable by design: Innovating for energy efficiency in AI, part 1 appeared first on Microsoft Industry Blogs.

Learn more about how we’re making progress towards our sustainability commitments through the Sustainable by design blog series, starting with Sustainable by design: Advancing the sustainability of AI.

Earlier this summer, my colleague Noelle Walsh published a blog detailing how we’re working to conserve water in our datacenter operations, Sustainable by design: Transforming datacenter water efficiency, as part of our commitment to our sustainability goals of becoming carbon negative, water positive, and zero waste, and of protecting biodiversity.

At Microsoft, we design, build, and operate cloud computing infrastructure spanning the whole stack, from datacenters to servers to custom silicon. This creates unique opportunities for orchestrating how the elements work together to enhance both performance and efficiency. We consider the work to optimize power and energy efficiency a critical path to meeting our pledge to be carbon negative by 2030, alongside our work to advance carbon-free electricity and carbon removal.

The rapid growth in demand for AI innovation to fuel the next frontiers of discovery has provided us with an opportunity to redesign our infrastructure systems, from datacenters to servers to silicon, with efficiency and sustainability at the forefront. In addition to sourcing carbon-free electricity, we’re innovating at every level of the stack to reduce the energy intensity and power requirements of cloud and AI workloads. Even before the electrons enter our datacenters, our teams are focused on how we can maximize the compute power we can generate from each kilowatt-hour (kWh) of electric power.

In this blog, I’d like to share some examples of how we’re advancing the power and energy efficiency of AI. This includes a whole-systems approach to efficiency and applying AI, specifically machine learning, to the management of cloud and AI workloads.

Driving efficiency from datacenters to servers to silicon

Maximizing hardware utilization through smart workload management

True to our roots as a software company, one of the ways we drive power efficiency within our datacenters is through software that enables workload scheduling in real time, so we can maximize the utilization of existing hardware to meet cloud service demand. For example, we might see greater demand when people are starting their workday in one part of the world, and lower demand across the globe where others are winding down for the evening. In many cases, we can align availability for internal resource needs, such as running AI training workloads during off-peak hours, using existing hardware that would otherwise be idle during that timeframe. This also helps us improve power utilization.

We use the power of software to drive energy efficiency at every level of the infrastructure stack, from datacenters to servers to silicon.

Historically across the industry, executing AI and cloud computing workloads has relied on assigning central processing units (CPUs), graphics processing units (GPUs), and their processing power to individual teams or workloads, typically delivering CPU and GPU utilization rates of around 50% to 60%. This leaves some CPUs and GPUs with idle capacity that could be harnessed for other workloads. To address this utilization challenge and improve workload management, we’ve transitioned Microsoft’s AI training workloads into a single pool managed by a machine learning technology called Project Forge.

The Project Forge global scheduler uses machine learning to virtually schedule training and inferencing workloads so they can run during timeframes when hardware has available capacity, improving utilization rates to 80% to 90% at scale.

Currently in production across Microsoft services, this software uses AI to virtually schedule training and inferencing workloads, along with transparent checkpointing that saves a snapshot of an application or model’s current state so it can be paused and restarted at any time. Whether running on partner silicon or Microsoft’s custom silicon such as Maia 100, Project Forge has increased our efficiency across Azure to 80% to 90% utilization at scale.
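The two ideas in the paragraphs above, pooling jobs against shared hardware and transparent checkpointing so a job can be paused and resumed, can be sketched as follows. This is a hypothetical toy, not Project Forge's actual API; the job names and step counts are made up.

```python
# A toy pooled scheduler: jobs run for a bounded budget, save a snapshot
# of their state (the "transparent checkpoint"), and yield the hardware
# back to the pool so another job can use it.

class CheckpointedJob:
    def __init__(self, name, total_steps):
        self.name, self.total_steps, self.step = name, total_steps, 0

    def run(self, budget):
        """Run up to `budget` steps, then return a resumable snapshot."""
        self.step = min(self.step + budget, self.total_steps)
        return {"name": self.name, "step": self.step}

    @property
    def done(self):
        return self.step >= self.total_steps

def schedule(jobs, budget_per_turn):
    """Round-robin the shared pool until every job completes."""
    snapshots = []
    while not all(job.done for job in jobs):
        for job in jobs:
            if not job.done:
                snapshots.append(job.run(budget_per_turn))
    return snapshots

jobs = [CheckpointedJob("train-a", 5), CheckpointedJob("train-b", 3)]
history = schedule(jobs, budget_per_turn=2)
assert all(job.done for job in jobs)
assert history[0] == {"name": "train-a", "step": 2}  # preempted mid-run
```

Because every job can be paused at a snapshot, no accelerator has to sit idle waiting for one job to finish, which is what lifts pool-wide utilization.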

Safely harvesting unused power across our datacenter fleet

Another way we improve power efficiency involves placing workloads intelligently across a datacenter to safely harvest any unused power. Power harvesting refers to practices that maximize the use of our available power: if a workload is not consuming the full amount of power allocated to it, that excess can be borrowed by, or even reassigned to, other workloads. Since 2020, this work has recovered approximately 800 megawatts (MW) of electricity from existing datacenters, enough to power approximately 2.8 million miles of driving in an electric car.1
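The harvesting arithmetic is simple to sketch: sum the gap between what each workload reserved and what telemetry shows it actually drawing. The workload names and wattages below are illustrative assumptions.

```python
# Power harvesting in miniature: unused headroom across workloads is
# already-provisioned power that can be lent out without adding capacity.

def harvestable_power_w(allocations_w, measured_draw_w):
    """Sum each workload's gap between reserved and actual power draw."""
    return sum(allocations_w[w] - measured_draw_w[w] for w in allocations_w)

allocated = {"web": 300.0, "batch": 500.0, "cache": 200.0}  # watts reserved
drawing = {"web": 220.0, "batch": 350.0, "cache": 180.0}    # watts in use

# 250 W of provisioned power is sitting idle and can be safely borrowed.
assert harvestable_power_w(allocated, drawing) == 250.0
```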

Over the past year, even as customer AI workloads have increased, our rate of improvement in power savings has doubled. We’re continuing to implement these best practices across our datacenter fleet in order to recover and re-allocate unused power without impacting performance or reliability.

Driving IT hardware efficiency through liquid cooling

In addition to power management of workloads, we’re focused on reducing the energy and water requirements of cooling the chips and the servers that house these chips. With the powerful processing of modern AI workloads comes increased heat generation, and using liquid-cooled servers significantly reduces the electricity required for thermal management versus air-cooled servers. The transition to liquid cooling also enables us to get more performance out of our silicon, as the chips run more efficiently within an optimal temperature range.

A significant engineering challenge we faced in rolling out these solutions was how to retrofit existing datacenters designed for air-cooled servers to accommodate the latest advancements in liquid cooling. With custom solutions such as the “sidekick,” a component that sits adjacent to a rack of servers and circulates fluid like a car radiator, we’re bringing liquid cooling solutions into existing datacenters, reducing the energy required for cooling while increasing rack density. This in turn increases the compute power we can generate from each square foot within our datacenters.

Learn more and explore resources for cloud and AI efficiency

Stay tuned to learn more on this topic, including how we’re working to bring promising efficiency research out of the lab and into commercial operations. You can also read more about how we’re advancing sustainability through our Sustainable by design blog series, starting with Sustainable by design: Advancing the sustainability of AI and Sustainable by design: Transforming datacenter water efficiency.

For architects, lead developers, and IT decision makers who want to learn more about cloud and AI efficiency, we recommend exploring the sustainability guidance in the Azure Well-Architected Framework. This documentation set aligns to the design principles of the Green Software Foundation and is designed to help customers plan for and meet evolving sustainability requirements and regulations around the development, deployment, and operations of IT capabilities.   


1Equivalency assumptions based on estimates that an electric car c

Optimize supply chain resiliency by integrating diverse AI-powered solutions
http://approjects.co.za/?big=en-us/industry/blog/manufacturing-and-mobility/2024/08/29/optimize-supply-chain-resiliency-by-integrating-diverse-ai-powered-solutions/
Thu, 29 Aug 2024 17:00:00 +0000

How do you build resiliency in your supply chain? In a world where constant change is the norm, AI is emerging as a powerful differentiator that is helping organizations sustain operations on a global scale.

The post Optimize supply chain resiliency by integrating diverse AI-powered solutions appeared first on Microsoft Industry Blogs.

How do you build resiliency in your supply chain? In a world where constant change is the norm, AI is emerging as a powerful differentiator that is helping organizations sustain operations on a global scale. Forrester predicts that 2024 will see enterprises develop strategies around more AI use cases, from reducing risk to improving customer service and boosting working capital. Discover how Microsoft is accelerating this trajectory by equipping customers and platform providers with advanced and generative AI capabilities through Microsoft Azure AI and Dynamics 365 Supply Chain Management.

Organizations are looking for intelligent supply chain solutions

The COVID-19 pandemic didn’t so much create new challenges for supply chains as magnify problems that already existed. Lockdowns, for example, brought the fragility of many supply chains into stark relief, leading to shortages of raw materials and finished goods. These events underscored a crucial point: to handle disruptions efficiently and meet customer needs, organizations need real-time visibility into their supply chains. But visibility alone isn’t enough. Companies must not only monitor every supplier, process, and system supporting operations, but also contextualize this information to proactively mitigate risks and prepare for future scenarios. That alone requires serious coordination and a lot of brainpower.

This is where AI enters the scene. Enterprises that have relied on traditional paper-based systems, legacy tools, and on-premises databases to manage supply chains are looking for intelligent solutions to address the long-standing issues of disruption, visibility, and risk. AI, particularly generative AI, stands out as a viable way to query across data silos and provide meaningful insights that enhance the efficiency and effectiveness of supply chain operations.

Customizing generative AI for your supply chain operations

AI has been a part of supply chain management for decades, with its roots in traditional AI applications like computer algorithms and data analysis in logistics and inventory management. However, the recent shift to generative AI is transforming the landscape by leveraging powerful data infrastructures and cloud platforms. Unlike traditional AI, generative AI democratizes insights through natural language processing, making critical information accessible to everyone across an organization.

For example, tools like Microsoft Copilot highlight the disruptive potential of generative AI when seamlessly integrated with enterprise resource planning (ERP), supply chain planning, warehouse, transportation, customer relationship management (CRM), and many other business systems. Copilot’s integration with Dynamics 365 shows how AI-powered, interactive assistance can revolutionize supply chain management (SCM), driving efficiency, reducing costs, and enhancing customer satisfaction. Nevertheless, it’s important to recognize that generative AI alone cannot solve long-standing challenges like supply chain disruption and risk.

To truly harness AI’s potential, organizations must adopt a comprehensive approach, combining intelligent solutions to break down data silos and foster supply chain resilience. Generative AI tools such as Copilot deliver optimal outcomes when paired with platforms like Microsoft Fabric and Azure OpenAI Service, which contextualize data from many sources. This synergy boosts operational effectiveness with seamless orchestration to drive productivity and profitability. However, it’s essential to choose business applications that operate on centralized, cloud-based platforms with integrated AI and machine learning, connected workflows, and a unified database to fully unlock AI’s value for your supply chain.

Redefining business value through AI and data integration

AI is not just about improving productivity for specific roles; it’s a powerful tool for driving business value and enhancing outcomes across the entire organization. Take, for instance, Cemex, a global concrete manufacturer that once took up to an hour to confirm customer orders. That time went to analyzing fulfillment options, a necessary step to avoid revenue loss, customer dissatisfaction, and the environmental risk of wasted material: customer representatives would spend about an hour validating ingredient supply, equipment readiness, labor availability, and delivery logistics. By integrating AI into this daily work, those tasks can be handled in seconds, dramatically improving business performance.

This manufacturer aimed to enhance more than just speed—they wanted to transform the entire process with AI-powered insights. Leveraging Azure, Azure OpenAI Service, and Microsoft Teams, they achieved impressive results. Azure OpenAI Service utilized vast amounts of historical data stored in Microsoft’s secure cloud to analyze and contextualize information in real time. This contextualization was crucial in quickly assessing whether a new order could be fulfilled, driving faster, more informed decision making.

The time to fulfill customer orders dropped from an hour to just nine seconds, showcasing how integrating AI and data can drive substantial business outcomes—far beyond mere productivity gains. This transformation not only enhanced operational efficiency but also elevated customer satisfaction and overall business agility.

This demonstrates how AI innovation is not only feasible but also highly accessible, empowering organizations to embed intelligence into their operations and realize greater business value across the board.

Microsoft’s commitment to responsible, accurate, and trustworthy AI

When generative AI and machine learning are applied to centralized data models, they create opportunities to enhance supply chain efficiency and profitability. However, success lies in adopting the right technology and infrastructure that unify processes and data while prioritizing security, accessibility, and reliability.

Microsoft is uniquely equipped to drive AI advancements in supply chain management by promoting responsible AI practices. Its platforms are secure, extendable, and interoperable, ensuring seamless data and supply chain orchestration. Microsoft delivers AI-powered innovations such as Copilot across its entire platform, supporting a range of applications from data summarization to more critical, high-stakes decisions that directly impact supply chain operations.

Understanding that supply chain management demands precision and accountability, Microsoft enables organizations to refine use cases and add structure to Copilot models, allowing Supply Chain Management users to inquire about data points and track shifts in customer demand within defined timeframes, such as year-over-year or period-over-period. This level of specificity delivers granular insights, saving organizations substantial time otherwise spent on manual research.

Integrating AI into supply chain operations

Microsoft’s approach to integrating AI into supply chain operations focuses on grounding data within a relevant, secure framework to make sure that insights are reliable and credible. Its AI models are designed not only to analyze past actions but also to forecast what can be achieved next. By combining different AI techniques with Microsoft’s secure cloud platforms, organizations can build on this innovation using low-code and no-code tools, unlocking new use cases and driving trustworthy AI adoption across their operations.

Explore Microsoft Cloud for Manufacturing to see how you can accelerate your transformation.

How energy firms power the world with secure Microsoft technologies
http://approjects.co.za/?big=en-us/industry/blog/energy-and-resources/2024/08/29/how-energy-firms-power-the-world-with-secure-microsoft-technologies/
Thu, 29 Aug 2024 15:00:00 +0000

With AI advancements analyzing trillions of security signals daily, together we can build a safer, more resilient digital energy ecosystem.

The post How energy firms power the world with secure Microsoft technologies appeared first on Microsoft Industry Blogs.

In 2023, the Microsoft Digital Defense Report revealed that critical infrastructure remained a persistent target for cyberthreats, increasing again from the previous year.1 The interconnectivity of the power industry with global commerce makes its infrastructure both essential and vulnerable. Without it, we can no longer power hospitals, heat and cool homes, open schools, or produce food. Power supply is the lifeblood of the global economy, and our resilience depends on it. 

A growing need to transform security

Chief Information Security Officers (CISOs) at power companies know this reality well. They’re tasked with managing a complicated portfolio while protecting against cyber risks from both insiders and nation-state actors. Left unresolved, these challenges create a ripple effect across the enterprise and lead to issues like:   

  • Increasingly complex environments: Widespread digital adoption combined with evolving customer preferences, decentralized energy generation, and a changing workforce are driving utility providers to rethink their services and business models to help increase flexibility and maintain a resilient grid. In a recent survey conducted by Guidehouse and Public Utilities Fortnightly, 61% of respondents agreed that increasing flexibility to improve energy system resilience is the highest priority outcome for utility investments today.2
  • Tool fatigue: Many power companies work with hundreds of disparate management tools that are costly to manage and limited in cross-visibility. These tools must be integrated and maintained by teams with the right skillsets. As tools are added or replaced and personnel come and go, companies face the inevitable costs of re-skilling and new integrations.
  • Technical debt: While many utilities are designing new solutions in support of energy transition and the grid of the future, they still rely heavily on legacy infrastructures that carry significant tech debt. These legacy systems increase cybersecurity and operational risks as well as operational expenses through extended support costs, timelines, and integration complexities. Research shows companies pay an additional 10 to 20% to address tech debt on top of project base costs.3  

Modernizing infrastructure is costly and not easily adaptable as the risk landscape evolves. In fact, 59% of cybersecurity teams identify integration of legacy operational technology (OT) and modern information technology (IT) systems as their biggest challenge to securing OT.4 If you’re a CISO, how do you solve the challenge of securing both IT and OT against modern and fast-changing threats? 

The answer is to work with technology partners who not only understand threat actors around the world, but who also recognize the business risks and operational concerns across the industry. 

Increasing security and efficiency without sacrificing value 

With a unified security stack running on the Microsoft Cloud, utilities can significantly reduce the number of tools they manage every day for lower costs, time-savings, and better insight into IT and OT environments.  

For example, Turkish energy provider Enerjisa Üretim partnered with Senkron.Energy Digital Services to build Senkron ROC, a remote operations center that represents a critical piece of becoming cloud-native. Knowing that a single cyberthreat could shut down operations, Enerjisa Üretim also established its Operational Technology-Specific Security Operation Center (OT SOC), which relies on Microsoft Defender for IoT and Microsoft Sentinel to operate around the clock and process 3.3 million security events daily.   

The IBM Maximo Application Suite on Azure for asset operations and maintenance is another example. High performance and ultra-low latency combined with the multi-layered security capabilities of the Microsoft Azure stack provide a foundation for secure analytics that boost operational resiliency and reliability. With those advanced security features, utility providers can scale their operations to handle varying workloads without compromising operational security.  

Security solutions to meet your needs 

With Microsoft Security services, customers can leverage the latest technologies and deep industry understanding to enhance their security posture today. Microsoft Defender for IoT offers a complete inventory and continuous monitoring of connected assets across vendors and protocols; Microsoft Purview can secure and govern data across your entire estate while helping to reduce risk and meet compliance requirements; and Microsoft Sentinel provides enterprise-grade intelligent security analytics that help detect previously undetected threats and minimize false positives.  

Microsoft security solutions can also offer improvements across key use cases, including: 

  • Augmentation of security operations centers (SOCs): Microsoft security solutions empower SOCs with cloud-native capabilities that enable faster detection and response times—even automating entire responses to security events. Machine learning, AI, and advanced analytics perform the heavy lifting so SOC workers can clarify what’s happening in the SOC environment and focus on the highest-priority events. Our unified security platform eases tool fatigue in SOCs with solutions that work together seamlessly for optimal visibility and efficiency. Solutions such as Microsoft Defender Experts for XDR and Microsoft Incident Response allow for expanded capabilities to support the SOC analysts in their mission.
  • Business continuity and disaster recovery: Microsoft security solutions provide automated backup processes that are both scalable and cost-effective, and they can be integrated with on-premises data protection solutions. Our solutions include features like encryption and multi-factor authentication, which protect data during the backup and recovery process and help keep sensitive information secure. This holistic approach helps utility organizations quickly recover from data loss incidents, minimizing downtime and maintaining business continuity. 
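To make the automation idea concrete, here is a minimal sketch of the kind of alert prioritization an AI-assisted SOC performs so analysts see the highest-risk events first. The alert fields, weights, and scoring rule are illustrative inventions for this post, not the logic of any Microsoft product:

```python
# Illustrative alert triage: score alerts so analysts see the highest-risk first.
# Field names and weights are hypothetical, not tied to any SIEM schema.
from dataclasses import dataclass

SEVERITY_WEIGHT = {"low": 1, "medium": 3, "high": 7, "critical": 10}

@dataclass
class Alert:
    source: str             # e.g. IT endpoint vs. OT controller
    severity: str           # low / medium / high / critical
    asset_criticality: int  # 1 (lab device) .. 5 (grid control system)
    confirmed_ioc: bool     # matches a known indicator of compromise

def triage_score(a: Alert) -> int:
    score = SEVERITY_WEIGHT[a.severity] * a.asset_criticality
    if a.confirmed_ioc:
        score *= 2  # confirmed indicators jump the queue
    return score

alerts = [
    Alert("it-endpoint", "medium", 2, False),
    Alert("ot-plc", "high", 5, True),
    Alert("it-server", "critical", 3, False),
]
for a in sorted(alerts, key=triage_score, reverse=True):
    print(f"{a.source}: {triage_score(a)}")
```

In practice this kind of ranking is learned from telemetry rather than hand-weighted, but the effect is the same: machine scoring surfaces the highest-priority events so SOC workers spend their time where it matters.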

Supporting the energy customer and partner ecosystem for a secure future 

To support continued innovation in data security and cloud adoption, we collaborated with the Idaho National Laboratory (INL) and the Department of Energy’s Grid Deployment Office on an initiative for seamless integration of cloud technology into the grid of the future. Now in its pilot phase, the Cirrus cloud feasibility assessment tool (Cirrus) offers strategic guidance on how to prepare for, or deploy, a cloud solution responsibly, with the ultimate objective of strengthening the resilience and future adaptability of a decarbonized electric grid.  

Built on the security and reliability of Azure, the online version of Cirrus is also accessible through independent platforms with a license. The tool provides valuable insights to integrators, stakeholders, and operators by clarifying goals, future plans, and risk tolerance.  

With visual outputs like key performance indicator (KPI) graphs and consequence diagrams, Cirrus offers contextualized understanding, helping users prioritize critical systems and data based on potential benefits and risks associated with cloud disruptions. Additionally, Cirrus incorporates threat detection and alerts, leveraging Cyber-Informed Engineering (CIE) principles to empower organizations to make risk-informed decisions and address high-consequence events. 

Opportunities on the horizon with AI 

It’s an exciting time for the industry as AI creates tremendous potential for energy companies to increase their security posture.  

Imagine equipping workers with Microsoft Copilot for Security to help them identify threats earlier, build their risk mitigation skills, and respond to incidents faster. What took hours or days to complete can now be finished in minutes with AI. The efficiency gains are about more than labor costs: every minute that passes gives attackers more opportunity to do damage.  

With AI advancements analyzing trillions of security signals daily, together we can build a safer, more resilient digital energy ecosystem.  

Learn more with Microsoft for energy and resources 

Ready to dive deeper? Don’t miss our webinar, Rethinking cybersecurity in a renewable-powered energy system on October 10, 2024, where we will be sharing how leading energy companies are using the power of technology to safeguard their businesses. Read more about the webinar and sign up to attend.  


1 Microsoft Digital Defense Report, October 2023.

2 The Power Industry: Presently and Projected, Guidehouse, July 2024.

3 Breaking technical debt’s vicious cycle to modernize your business, McKinsey & Company, April 2023.

4 How is cyber innovation disrupting the energy sector and critical infrastructure?, World Economic Forum, October 2023.

The post How energy firms power the world with secure Microsoft technologies appeared first on Microsoft Industry Blogs.

]]>
Enabling carbon reduction in the energy industry http://approjects.co.za/?big=en-us/industry/blog/energy-and-resources/2024/08/21/enabling-carbon-reduction-in-the-energy-industry/ Wed, 21 Aug 2024 15:00:00 +0000 Led by the European Union (EU), the new global push toward improved industrial carbon management (ICM) requires sophisticated new support mechanisms, including the development of technologies capable of orchestrating the carbon capture and storage (CCS) process from early planning to operations.

The post Enabling carbon reduction in the energy industry appeared first on Microsoft Industry Blogs.

]]>
The ways the energy industry captures, transports, stores, and otherwise removes carbon dioxide (CO2) from the atmosphere are changing. Led by the European Union (EU), this new global push toward improved industrial carbon management (ICM) requires sophisticated new support mechanisms, including the development of technologies capable of orchestrating the carbon capture and storage (CCS) process from early planning to operations. Microsoft is committed to becoming carbon negative by 2030 and, by 2050, to removing from the environment all the carbon the company has emitted since it was founded in 1975. Our goal is to empower organizations worldwide to accelerate innovation across the entire end-to-end CCS value chain. By leveraging the standardized data model and secure data sharing in Microsoft Azure Data Manager for Energy and Microsoft Cloud for Sustainability, along with operations data management powered by Azure AI and Microsoft Copilot, we aim to help organizations achieve business goals of net zero, sustainability, and profitability.

Azure Data Manager for Energy

An energy employee working on a tablet

Enabling energy industry innovation through modern technology

The process of finding suitable CCS sites is costly and time consuming, and not without its own unique information security risks. Traditional energy industry technologies used during this process both increase in cost over time and contribute to the data silos that exist between the site selection process and operational concerns, like site-specific safe liquid CO2 injection speeds and storage capacities. These factors have made the commercial margins of CCS challenging, presenting a barrier to entry for many interested businesses.

Carbon management technologies set to soar in Europe

Read more

The process is not unfamiliar to Microsoft, which has already invested in multiple large-scale CCS projects around the world, including Northern Lights, a partnership between the Norwegian government and energy companies Equinor, Shell, and TotalEnergies. Northern Lights was created to help accelerate the decarbonization of European industry and mitigate its otherwise unavoidable emissions. The project facilitates the capture and transport of industrial CO2 emissions, which it then liquifies and stores safely in the pores of saline aquifers 2,600 meters below the seafloor.

By 2030, Microsoft plans to have an established system that removes five million metric tons of carbon from the atmosphere each year. With Azure Data Manager for Energy and operations data management powered by Azure AI and Microsoft Copilot, Microsoft aims to help increase the return on investment (ROI) of CCS projects, helping customers optimize their costs with AI, automation, and the discovery of new best practices. Additionally, organizations can employ Microsoft Cloud for Sustainability—a growing set of powerful data and AI capabilities designed to help businesses create more accurate and reliable data intelligence to drive impact reduction efforts and business transformation. These solutions help users gain actionable insights to drive sustainable practices, providing visibility into sustainability performance with advanced analytics and reporting. The global Microsoft partner network, with its industry specific expertise and highly targeted CCS solutions, further strengthens these capabilities, providing customers with valuable resources and support.

The path forward for carbon capture storage

There are two divergent paths ahead for the emerging CCS industry, both self-reinforcing in nature. On the first and more positive path, companies will see clear value in negating and offsetting their carbon emissions efficiently and effectively. On the other path, companies could lack the tools that efficiently connect the dots between carbon emissions and offsets, and so be left with a less clear value proposition. By underpinning the positive path with technology, Microsoft hopes to help industry and humanity at large meet their shared sustainability goals.

Azure Data Manager for Energy is aligned with the highly secure OSDU® and OPC Unified Architecture (OPC UA) data standards, which will ease the development of new services and workflows that transcend today’s data silos. This standardization also paves the way for the adoption of copilots and other time-saving AI solutions. Combining Azure Data Manager for Energy with other services, such as Microsoft Fabric, Environmental Credit Service, and Microsoft Sustainability Manager, helps organizations in the energy industry validate and demonstrate their CCS efforts and carbon credit purchases to regulators in the rapidly emerging and expanding ICM business.


Advance your carbon reduction strategy

Sustainability data solutions in Fabric offer unique capabilities that provide prebuilt and preconfigured Fabric resources. These resources include data stores in the form of data lakes, prebuilt notebooks, and dashboards to ingest, process, aggregate, and display data for various ESG scenarios. By combining and transforming disparate social and governance data into a standardized data lake, organizations can compute, analyze, and disclose social and governance metrics effectively. 
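As a rough illustration of what “combining and transforming disparate data into a standardized data lake” involves, the sketch below normalizes two hypothetical facility feeds, each with its own record shape, into one metric table. The field names and metric labels are invented for demonstration, not the Fabric schema:

```python
# Illustrative ESG data aggregation: two facilities report the same quantity
# (grid electricity) in different record shapes; normalize, then aggregate.
# Record shapes and metric names are hypothetical.
from collections import defaultdict

facility_a = [{"metric": "scope2_kwh", "value": 1200.0, "period": "2024-Q1"}]
facility_b = [{"kind": "electricity_kwh", "amount": 800.0, "quarter": "2024-Q1"}]

def normalize_a(r):
    # Facility A already uses the standard metric name.
    return ("scope2_kwh", r["period"], r["value"])

def normalize_b(r):
    # Facility B's "electricity_kwh" maps to the standard "scope2_kwh".
    return ("scope2_kwh", r["quarter"], r["amount"])

totals = defaultdict(float)
for metric, period, value in (
    [normalize_a(r) for r in facility_a] + [normalize_b(r) for r in facility_b]
):
    totals[(metric, period)] += value

print(dict(totals))  # {('scope2_kwh', '2024-Q1'): 2000.0}
```

The prebuilt notebooks and data stores described above package this normalize-then-aggregate pattern so organizations don't have to hand-write it for every data source.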

How scalability, standardization, and security contribute to sustainability

Data standardization and AI readiness are the first steps toward innovative capabilities, especially when paired with the hyper-scalability of Azure. During the process of identifying ideal sites for carbon storage, energy companies run multiple site-specific simulations that traditionally include the manual numerical simulation of seismic data. These simulations are time consuming, complex, and data intensive. They’re also critical to the site selection process, so when companies are given the opportunity to infuse them with AI and run them at scale, there’s massive potential for time savings and efficiency gains.

The ability to scale up the computing power required to run thousands of simulations against hundreds of potential sites when required could help shorten the CCS site selection process substantially. It could also help refine the simulations and their related data models and lead to further efficiency gains. Scaling compute back down after the simulations have been run can help energy companies not only reduce their costs, but also reduce the same carbon footprint the CCS process is helping to address. By running the simulations on Azure, energy companies are taking advantage of hyper-scalability on a cloud that has itself been carbon neutral since 2012.
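The scale-out pattern described above can be sketched with a standard worker pool: the same evaluation runs across many candidate sites concurrently, and capacity is released when the sweep finishes. The site-scoring function below is a stand-in invented for illustration; real reservoir simulations involve far heavier numerical workloads:

```python
# Illustrative scale-out of a site-selection sweep. The "simulation" here is a
# toy scoring function (fake capacity minus weighted fake risk), invented for
# demonstration only.
from concurrent.futures import ThreadPoolExecutor

def simulate_site(site_id: int) -> tuple[int, float]:
    # Stand-in for a seismic/injection simulation at one candidate site.
    capacity = (site_id * 37) % 100  # fake storage-capacity score
    risk = (site_id * 17) % 50       # fake leakage/seismic-risk score
    return site_id, capacity - 0.5 * risk

sites = range(200)
with ThreadPoolExecutor(max_workers=16) as pool:
    # On Azure, the worker pool would be a fleet of compute nodes scaled up
    # for the sweep and scaled back down afterward.
    results = list(pool.map(simulate_site, sites))

best = max(results, key=lambda r: r[1])
print(f"best candidate site: {best[0]} (score {best[1]:.1f})")
```

The design point is the elasticity: the pool (or, in the cloud, the node count) is a parameter, so thousands of simulations can burst across hundreds of sites and the capacity disappears, along with its cost and footprint, when the sweep completes.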

After months of work going into the selection and analysis of a proper CCS site, energy companies want to make sure their data is not just secure but fully under their own control. If that information were to leak to either the public or their competitors, all that effort and investment could be lost. For this reason, Microsoft is working toward enabling Azure Data Manager for Energy on customer cloud tenants, which will grant them the control they require as well as the layered security of Microsoft managed services in the cloud. For real-time CCS operations data, Microsoft is also developing a reference architecture and toolkit to enable partners to build ICM solutions to deliver value to our customers. 

Clearer skies ahead

With Azure Data Manager for Energy and the power of Azure AI, Microsoft Copilot, and capabilities from Microsoft Cloud for Sustainability, Microsoft hopes to give the energy industry the standardization and systemization that its past technologies may not have provided. To keep global warming within 1.5 degrees, the United States Department of Energy’s Pacific Northwest National Laboratory reports that the world needs to start removing 10 gigatons of CO2 from the atmosphere annually by 2050.1 To reach that important milestone in time, the energy industry needs a technological foundation to build its next wave of advancements upon.

If, as the Clean Air Task Force states, Europe alone has the storage capacity for 1,520 gigatons of carbon dioxide emissions, helping energy companies rapidly, cost-effectively identify and provision CCS sites is a big step in the right direction, and one which Microsoft hopes to help the energy industry take.2

Explore more on carbon management


1Diverse Approach Key to Carbon Removal, Pacific Northwest National Laboratory, 2023.

2Unlocking Europe’s CO2 Storage Potential, Clean Air Task Force, 2023.

The post Enabling carbon reduction in the energy industry appeared first on Microsoft Industry Blogs.

]]>
DAX Copilot: New customization options and AI capabilities for even greater productivity http://approjects.co.za/?big=en-us/industry/blog/healthcare/2024/08/08/dax-copilot-new-customization-options-and-ai-capabilities-for-even-greater-productivity/ Thu, 08 Aug 2024 16:00:00 +0000 DAX Copilot, part of the Microsoft Cloud for Healthcare ecosystem, offers advanced AI-powered capabilities—enhancing clinician and patient experiences by improving productivity and efficiency.

The post DAX Copilot: New customization options and AI capabilities for even greater productivity appeared first on Microsoft Industry Blogs.

]]>
Dragon Ambient eXperience (DAX) Copilot has contributed to better clinician and patient experiences—in some cases, even encouraging clinicians to keep practicing medicine and staying with their current healthcare organization.1 With DAX Copilot, clinicians are achieving new levels of productivity and efficiency with AI-based note creation.

In a Microsoft survey of 879 clinicians using DAX Copilot (July 2024)1:

  • 5 minutes saved per clinician per encounter on average
  • 77% say it improves documentation quality
  • 70% say it improves work-life balance, and reduces feelings of burnout and fatigue

In a survey2 of more than 400 patients whose clinicians are using DAX Copilot:

  • 93% say their clinician is more personable and conversational
  • 85% say their clinician is more focused
  • 90% say their clinician spends less time on the computer

Get documentation done your way

Like many clinicians, you have your own documentation style and preferences for organizing and communicating clinical information. Allowing you to choose your own format and customize your style helps cut down on documentation time by letting you input details in a way that makes the most sense for you and your practice. 

Style and formatting customizations—design and define your style

Clinicians spend 25% more time on documentation now than 10 years ago3

Whether you prefer your documentation to be concise or verbose, in bulleted or paragraph form, DAX Copilot lets you easily design and define your style, and apply these preferences automatically or on-demand, including adjusting pronouns with a single click. 

This “do it your way” approach makes documentation easier and expedites your ability to recognize and retrieve information later.  

More than AI notes

Beyond automating notes, DAX Copilot now supports a series of new advanced AI-powered capabilities including referral letters, summaries of evidence, after-visit summaries, encounter summaries, and coaching. Designed to further enhance, improve, or ease workflows and how clinicians capture and create patient-related documentation, these capabilities simplify and streamline processes—giving clinicians more time to focus on taking better care of their patients, and themselves.

Referral letters—more from your patient conversations without extra work

Clinicians spend 50% of their day on documentation4

The process is simple and seamless. Once DAX Copilot has drafted your clinical note, it can quickly and easily use the same information gathered during a patient encounter to create a referral letter. 

While a referral letter serves as a critical communication tool to ensure a smooth transition of care, creating one can be tedious and time-consuming. 

Copilot automatically extracts key information, medical history, and the requested services along with pertinent test or imaging results from notes. Information is promptly repurposed in the form of a referral letter and readily available to facilitate next steps.

Summarize diagnosis evidence—helps validate medical information in your note

Clinicians see up to 20 patients a day5

More than just linking notes to transcripts, DAX Copilot curates diagnosis evidence from subjective elements such as symptoms, objective elements including labs and imaging, as well as other relevant information shared during the encounter.

Clinicians ask patients questions to assess their symptoms, the severity of those symptoms, and any notable changes. But patients may not remember everything and are even less likely to know specific details about previous testing. Copilot’s automated summaries of evidence provide a quick, objective, and reliable way to validate a patient’s clinical history. Instead of just linking to a key word, it cites the origin and context of any included information—allowing clinicians to proceed with trust and confidence.

After visit summaries—empower patients without adding to your workload

Patients forget about 40 to 80% of medical information6

Between the stress of their situation and potential confusion around medical terminology, patients may have difficulty remembering or processing their doctor’s advice. To help them follow their care plan, DAX Copilot can create succinct and easy-to-understand summaries—highlighting key details and important instructions for patients and their caretakers. 

Most medical instructions are given to patients verbally even though written instructions help increase the likelihood of those instructions being followed properly—especially when a patient relies on others to assist with their care needs. Copilot converts clinical documentation from encounter visits into written patient-friendly after-visit summaries, providing an easy reference for key clinical highlights and important directions. It helps eliminate forgotten or ‘misremembered’ information and increases patient awareness of—and ability to adhere to—your care recommendations.  

Summarize encounter—get a quick refresher on a patient before finalizing the notes

2,500 average patient panel size over 12 to 18 months7

DAX Copilot displays a synopsis of the encounter that includes key facts and details in the mobile and desktop app instantly—streamlining workflow and reducing cognitive load.  

You can’t be expected to retain every detail of an encounter. To help remember key points before finalizing the note, DAX Copilot provides concise clinical summaries of information on demand. Now it’s easier to review an encounter quickly and efficiently before moving on to your next patient. 

Coaching—identifies areas where you can include more information verbally for more complete clinical notes

86% of denials could be avoided8

Using each encounter recording, DAX Copilot evaluates your patient notes to help you capture more appropriate details. Whether including family history or specific metrics, like body temperature or body mass index (BMI), it suggests areas where you can verbalize more information to create more complete notes.​  

Omitting pertinent information may affect your ability—and that of subsequent providers—to properly diagnose and treat patients, or it could result in a claim denial or improper reimbursement for care delivered. Copilot combs through encounter transcripts to check that important details—such as information necessary to identify the patient, relevant history, social and lifestyle factors, physical exam findings, and your assessment and treatment plan—are included. It also analyzes observations, findings, and diagnoses to help evaluate whether there are sufficient details to support medical coding. These suggestions promote more complete and accurate documentation, which benefits patient health as well as the financial health of healthcare organizations.

Why DAX Copilot? 

More than just AI notes, DAX Copilot offers a more capable and comprehensive experience

Our technological leadership, scale, and world-class infrastructure and support allow you to meet today’s demands while providing extensibility for the future. Better than typing, scribes, or other ambient solutions, DAX Copilot harnesses the power of AI to automate workflows that extend beyond clinical documentation. 

Backed by a proven track record and decades of clinical expertise, DAX Copilot is part of the Microsoft Cloud for Healthcare and extensive copilot ecosystem. With seamless integration with Dragon Medical One, users have access to hundreds of advanced features that increase productivity and efficiency before, during, and after each visit. Built on a secure, responsible AI framework and foundation of trust, these collective capabilities offer exceptional value and efficiencies across the entire clinical workflow.   

Doctor shaking hands with a patient in an office setting.

DAX Copilot

Your clinical documentation and workflow copilot


1 Microsoft survey of 879 clinicians across 340 healthcare organizations using DAX Copilot, July 2024.

2 Survey of 413 patients conducted by multiple healthcare organizations whose clinicians use DAX Copilot; June 2024.

3 Nuance, Assessing the burden of clinical documentation.

4 Advisory Board, Doctors spend 27% of the workday with patients, study finds. What do they do for the rest of it?

5 Statista, Number of patients that physicians in the U.S. saw per day from 2012 to 2018, November 30, 2023.

6 JRSM, Patients’ memory for medical information, May 2003.

7 Delaware Journal of Public Health, Considerations for Patient Panel Size, December 2022.

8 Becker’s Healthcare, 86% of denials are potentially avoidable: Strategies to better prevent, manage denials, November 2020.

The post DAX Copilot: New customization options and AI capabilities for even greater productivity appeared first on Microsoft Industry Blogs.

]]>
Sustainable by design: Transforming datacenter water efficiency http://approjects.co.za/?big=en-us/microsoft-cloud/blog/2024/07/25/sustainable-by-design-transforming-datacenter-water-efficiency/ Thu, 25 Jul 2024 16:00:00 +0000 In our datacenter operations, one of the essential engineering questions we ask each day is: how can we continue to conserve water while meeting growing customer demand for cloud and AI innovation?

The post Sustainable by design: Transforming datacenter water efficiency appeared first on Microsoft Industry Blogs.

]]>
Learn more about how we’re making progress towards our sustainability commitments through the Sustainable by design blog series, starting with Sustainable by design: Advancing the sustainability of AI.


Last month, we unveiled our Datacenter Community Pledge, emphasizing that datacenters are not only the backbone of modern technology but also a force for good in the communities they serve. As part of this commitment, at Microsoft we recognize our crucial role in protecting and replenishing freshwater resources both in the regions where we operate and around the world.

That’s why in our datacenter operations, one of the essential engineering questions we ask each day is: how can we continue to conserve water while meeting growing customer demand for cloud and AI innovation?

In datacenters, water is primarily used for cooling and humidification. As demand for high performance cloud and AI applications has grown over the past few years to fuel customer applications and enable a new frontier of discovery and innovation, so have the power requirements for silicon chips—the basic building blocks of cloud and AI computing—that sit within the racks and servers of datacenters. Because advanced chips typically utilize more power, they also generate more heat. To prevent the chips from malfunctioning, more intensive cooling is needed, and this has historically required consuming water.

To reduce the water required for operations, a critical path to our company goal of becoming water positive by 2030, we’re innovating everywhere from our datacenter buildings all the way to the chips. Collectively, this work is delivering substantial results. From our first generation of owned datacenters in the early 2000s to our current generation in 2023, we have reduced our water intensity (water consumed per kilowatt-hour) by over 80%. This shows that it’s possible to significantly reduce how much water our datacenters use per kilowatt of power even as our cloud infrastructure expands.
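Water intensity is a simple ratio, which makes the progress easy to reason about. The figures below are invented to illustrate the arithmetic of a roughly 80% reduction, not actual Microsoft measurements:

```python
# Illustrative water-intensity arithmetic. The consumption and energy figures
# are invented for demonstration, not Microsoft datacenter data.
def water_intensity(liters_consumed: float, energy_kwh: float) -> float:
    """Liters of water consumed per kilowatt-hour of energy used."""
    return liters_consumed / energy_kwh

early_gen = water_intensity(liters_consumed=2_000_000, energy_kwh=1_000_000)    # 2.0 L/kWh
current_gen = water_intensity(liters_consumed=380_000, energy_kwh=1_000_000)    # 0.38 L/kWh

reduction = 1 - current_gen / early_gen
print(f"intensity reduced by {reduction:.0%}")  # intensity reduced by 81%
```

Because the metric is normalized per kilowatt-hour, it can fall even while total fleet energy use grows, which is exactly the point made above about expanding cloud infrastructure.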

Today, we are sharing more about two focus areas for continuing to drive down water intensity: (1) conserving water at every stage of operations, and (2) innovative technologies that reduce the amount of water needed for cooling.

Conserving water at every stage of operations

At all locations, we work to minimize the amount of water we require for cooling. This includes operating our datacenters at a temperature that allows us to cool with outdoor air for the majority of the year, reducing the need for mechanical cooling, and conserving water at every stage of day-to-day operations.

In our datacenters, we work to minimize the amount of water we require from municipal water systems. This includes water conservation practices in existing datacenters and new datacenter designs that are optimized to support AI workloads and consume zero water for cooling.​

We conduct regular audits of our datacenters to identify inefficiencies and areas where design and day-to-day use don’t align. Our 2022 audit resulted in targeted improvements that eliminated 90% of the instances in which excess water was used. In addition, we’re building advanced prediction models that help us anticipate water requirements based on real-time weather and operational data. Comparing anticipated needs to actual consumption patterns enables us to quickly identify inefficiencies, such as water leaks that may otherwise go unnoticed.
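The leak-detection idea, comparing anticipated needs to actual consumption, can be sketched in a few lines. The hourly readings and the 20% tolerance below are illustrative assumptions, not parameters of our production models:

```python
# Illustrative leak detection: flag hours where metered water use exceeds the
# predicted baseline by more than a tolerance. All numbers are invented.
predicted = [10.0, 12.0, 11.0, 10.5, 12.5, 11.0]  # modeled liters/min, per hour
actual    = [10.2, 11.8, 11.1, 14.9, 16.0, 11.2]  # metered liters/min, per hour
TOLERANCE = 0.20  # flag when actual exceeds prediction by more than 20%

anomalies = [
    hour
    for hour, (p, a) in enumerate(zip(predicted, actual))
    if a > p * (1 + TOLERANCE)
]
print(anomalies)  # hours 3 and 4 look like a leak: [3, 4]
```

A real prediction model conditions on weather and workload rather than a fixed schedule, but the comparison step is the same: sustained excess over the expected baseline is what surfaces a leak that a raw consumption total would hide.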

To minimize freshwater requirements from municipal water systems, we employ conservation strategies that are tailored to the bioregion of the datacenters. For example, in Texas, Washington, California, and Singapore we’ve expanded our use of reclaimed and recycled water. In the Netherlands, Ireland, and Sweden we’re harvesting rainwater, and we’re also bringing this capability to new datacenters in Canada, the United Kingdom, Finland, Italy, South Africa, and Austria.

Innovative technologies that reduce the water needed for cooling

Advancing sustainability

Learn more

Innovative cooling technologies are essential to Microsoft’s water strategy, and we are rapidly expanding proven solutions across our datacenter portfolio. This includes solutions that bring cooling directly to the source of heat generation—the chip itself.

Cold plates are a prime example of this: a direct-to-chip cooling technology that provides heat exchange in a closed loop system. Cold plates dissipate heat more effectively than traditional air cooling, directly chilling the silicon and then recirculating the cooling fluid, like a car radiator. This solution significantly improves cooling efficiency and enables more precise temperature control compared to traditional methods.
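The engineering intuition behind a closed-loop cold plate is a simple heat balance: the coolant must carry heat away at the rate Q = ṁ × c_p × ΔT. The sketch below sizes a coolant flow from that relationship; the wattage and temperature rise are illustrative assumptions, not Microsoft hardware specifications:

```python
# Back-of-the-envelope sizing for a closed-loop cold plate:
# heat removed Q = m_dot * c_p * delta_T. All numbers are
# illustrative assumptions, not actual hardware specs.

def required_flow_lpm(chip_power_w, delta_t_c,
                      cp_j_per_kg_c=4186.0, density_kg_per_l=1.0):
    """Liters per minute of water-like coolant needed to absorb
    chip_power_w with a delta_t_c rise across the cold plate."""
    kg_per_s = chip_power_w / (cp_j_per_kg_c * delta_t_c)
    return kg_per_s / density_kg_per_l * 60.0

# A hypothetical 700 W accelerator with a 10 degree C coolant rise:
print(round(required_flow_lpm(700, 10), 2))  # ~1.0 L/min
```

The same relationship explains why liquid beats air: water's volumetric heat capacity is thousands of times that of air, so a modest closed-loop flow can remove heat that would otherwise require enormous airflow.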

To harness the increased efficiency cold plates offer, we’re developing a new generation of datacenter designs optimized for direct-to-chip cooling, which requires reinventing the layout of servers and racks to accommodate new methods of thermal and power management. In existing datacenters, we’re deploying innovations like the ‘sidekick,’ a liquid cooling system installed adjacent to racks of Microsoft Azure Maia AI Accelerator chips that circulates fluid to draw heat away from the cold plates attached to the surface of the chips.

We’re also evolving cold plate technologies through our work with microfluidics, a technology that brings cooling inside the silicon by integrating tiny fluid channels into chip designs. Embedding the liquid cooling inside the chip brings the coolant right next to the processors, resulting in even more efficiency and precision.

Our newest datacenter designs are optimized to support AI workloads and consume zero water for cooling. To achieve this, we’re transitioning to chip-level cooling solutions, providing precise temperature cooling only where it’s needed and without requiring evaporation. With these innovations, we can significantly reduce water consumption while supporting higher rack capacity, enabling more compute power per square foot within our datacenters.

Reducing global water use through partnership, investing to replenish water

Our water positive goal guides us to consider not only how we can shift our business practices to reduce our water footprint but also how this work can benefit customers and partners working toward similar goals. The five pillars of water positive (reduction, replenishment, access, innovation, and policy) all play important roles in our water positive journey.


Over the past year, we grew our water replenishment program significantly, nearly doubling our portfolio to more than 49 replenishment projects around the world. Together, these projects have the potential to replenish the equivalent of more than 24,000 Olympic-size swimming pools over their lifetimes. We also met our 2030 water access target to provide 1.5 million people with access to clean water and sanitation services.2

In addition, we’re working to reduce global water use by collaborating with customers, partners, local communities, and municipalities to advance water infrastructure and policy around the globe. Because corporate approaches to water management generally lag behind investments in carbon reduction,1 we’re taking an active role in championing effective and innovative water management practices and water policies. Some of our advocacy projects include: (1) serving on a coalition to increase water reuse and recycling across the United States, (2) funding projects that support Tribal Nations and state governments in increasing water security, and (3) supporting research, analysis, and advocacy on water in the European Commission.

Learn more about how Microsoft is advancing sustainability

Learn more about how we’re advancing sustainability through our Sustainable by design blog series, starting with Sustainable by design: Advancing the sustainability of AI. For more information on our progress towards our sustainability goals, read the Microsoft 2024 Environmental Sustainability Report.

Explore Microsoft’s approach to water replenishment


1Why investment in water is crucial to tackling the climate crisis, World Economic Forum, 2024.

2 2024 Environmental Sustainability Report, Microsoft.

The post Sustainable by design: Transforming datacenter water efficiency appeared first on Microsoft Industry Blogs.

Unlock generative AI value in private equity: AI use cases and prompts http://approjects.co.za/?big=en-us/industry/blog/financial-services/2024/07/17/unlock-generative-ai-value-in-private-equity-ai-use-cases-and-prompts/ Wed, 17 Jul 2024 16:00:00 +0000 Helping private equity firms to navigate effective strategies to adopt generative AI and modern cloud solutions is our primary focus and a tenet of our work with Microsoft Cloud for Financial Services.

The post Unlock generative AI value in private equity: AI use cases and prompts appeared first on Microsoft Industry Blogs.

With generative AI now a dominant topic in financial services, private equity leaders often regard the new technology’s potential with a mix of excitement and caution.

Private equity firms are enticed by its promise of transforming operations and driving innovation. However, in the heavily risk-aware business of acquiring, managing, and monetizing private companies, leaders are also keenly interested in use cases and user experiences that demonstrate value in both the near term and the long term.

Helping these companies to navigate effective strategies to adopt generative AI and modern cloud solutions is our primary focus and a tenet of our work with Microsoft Cloud for Financial Services. Private equity leaders recognize that business transformation also means disruption, which is why solid evidence of the near-term value and impact of generative AI is an important key to success.


The impact of AI in private equity transformation

Generative AI can transform the way private equity firms do business in a variety of ways. Here are just three areas in which it offers significant potential gains in return on investment (ROI):

  • Due diligence accelerated. Generative AI can help accelerate the process of investing in assets by automating and improving various aspects of the investigation and evaluation process. It can streamline information gathering from financial statements, identify trends in financial documents and reports, and analyze contracts, legal documents, and compliance records.
    • Generative AI use case: By parsing financial statements, legal documents, and industry reports, a copilot identifies hidden risks and anomalies related to a potential investment opportunity with the goal of avoiding costly mistakes.
  • Deal sourcing (origination) efficiency. Generative AI can help firms better identify and evaluate potential investment opportunities in ways that increase speed and create competitive advantages. It can retrieve, summarize, and extract insights from multiple unstructured data sources, automate the generation of documents like requests for proposals (RFPs) and contracts, and automate processes.
    • Generative AI use case: By quickly analyzing vast sets of data and documents, a copilot identifies potential investment targets faster and more accurately than before, resulting in a broader deal pipeline and better odds of finding lucrative opportunities.
  • Portfolio optimization. Generative AI can help enhance operational efficiency and decision-making related to maximizing the performance and value of a portfolio of investments. It can aid in asset allocation decisions through fast insights, improve efficiency by automating key processes, and improve information sharing by making institutional knowledge instantly available. For example, scenario analysis can help assess the impact of different market conditions on a portfolio.
    • Generative AI use case: A firm can analyze and prioritize investments based on potential returns and strategic fit.

An easy first step in AI innovation: Copilot for Microsoft 365

The scope of possibilities in deploying generative AI in private equity is so broad as to sometimes feel overwhelming.

Many firms are initiating long-term innovation with the goal of building customized AI services and products for both internal and customer-facing purposes. For example, Moody’s is now focused on developing generative AI-powered data, analytics, research, collaboration, and risk solutions for financial services, as part of a strategic partnership with Microsoft.

This level of innovation includes developing customized generative AI applications—chatbots and plugins, for example—built on rich enterprise cloud platforms such as Microsoft Fabric and Azure AI Studio that enable the development and deployment of responsible AI solutions.

Powerful as they are, these innovations require time and planning. In the meantime, many companies also want solutions that can be put to work right away to deliver immediate benefits. This is why Microsoft Copilot for Microsoft 365 is so compelling for many financial services organizations. Designed as a real-time, intelligent assistant integrated deeply into the Microsoft 365 applications that most employees use on a regular basis, Copilot for Microsoft 365 promises to deliver new value in the near term.

A brilliant digital assistant, or “AI-powered search engine”?

For companies now successfully deploying Copilot for Microsoft 365, the results are impressive. Microsoft research indicates that a company’s productivity and efficiency can be dramatically improved. For example, 70% of people using Copilot reported being more productive and 77% said that once they got up to speed with it, they didn’t want to give it up.1

The key to such success is getting up to speed—or, more precisely, understanding how to begin using Copilot.

When fully deployed in an enterprise environment, Copilot appears in many places across applications and even within documents, as a small icon integrated into a toolbar or workspace. Click it, and a window opens inviting you to engage—and away you go.


However, one bit of feedback we often hear is that users have only vague ideas of what Copilot can actually do. Lacking guidance and drawing on known paradigms, people often tend to use it like a search engine—in other words, they ask for information based on keywords. Copilot is adept at that, but only as an entry into the real power of AI. Copilot knows and understands huge swaths of data across the enterprise, and draws on it to conduct analysis, perform research, provide insights, explain things, and create entirely new artifacts.

To get such benefits, it is important to understand how to query the technology—in other words, to do what we call “prompt engineering.”

The key to unlocking value: A great prompt

The effectiveness of any natural language-based interaction with a generative AI application depends largely on the quality and specificity of the prompt you submit to it. A well-crafted prompt is essential to generating useful results. A prompt can be a question, a statement, a set of keywords, or even a more complex set of instructions.
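To make that concrete, the anatomy of a well-crafted prompt (task, data sources, output format, constraints) can be sketched in a few lines of code. This template is a common convention for illustration only, not an official Copilot schema:

```python
# Illustrative prompt template: task + data sources + output
# format + constraints. A common convention, not an official
# Copilot schema.

def build_prompt(task, sources, output_format, constraints=()):
    """Assemble a structured prompt string from its parts."""
    parts = [task,
             "Use: " + ", ".join(sources) + ".",
             "Format: " + output_format + "."]
    parts += [c + "." for c in constraints]
    return " ".join(parts)

prompt = build_prompt(
    task="Summarize my messages from the last workday and today.",
    sources=["email", "Teams messages", "channel messages"],
    output_format="table with columns Type | Topic | Summary | "
                  "Action Item | Follow Up",
    constraints=["Bold the topic where I am directly mentioned"],
)
print(prompt)
```

Whether you assemble the pieces in code or type them by hand, the point is the same: a prompt that names the task, the sources, the format, and the constraints reliably outperforms a bare keyword query.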

This is where a quick demonstration can highlight the true power of Copilot. Try it for yourself. If your company has licensed and enabled Copilot for Microsoft 365 across the business, click the Copilot icon in your browser (be sure to toggle it to Work mode) or in Copilot for Microsoft Teams, and copy and paste the following prompt:

  • “Summarize my emails, Teams messages, and channel messages from the last workday and today. List action items in a dedicated column. Suggest follow-ups if possible in a dedicated column. The table should look like this: Type (Mail/Teams /Channel) | Topic | Summarization | Action Item | Follow Up. If I have been directly mentioned, make the font of the topic bold.”

In a few moments, Copilot will generate a table that looks like this:

Type | Topic | Summarization | Action Item | Follow Up
Mail | Contoso deal report needs updating | Chris will provide missing data. Mark and Shauna need to know when he is finished to update CRM. | Chris needs to be done by the end of the week. | Chris to update Mark and Shauna when he is done.
Teams | Contoso – Internal Sales Enablement call | Updated Dynamics 365 engagement with Contoso on future sales engagement in Canada. | Continue sales engagement with Contoso. | Schedule Contoso meeting.
Channel | Contoso meeting 8/11 | Reviewed action plan with Contoso. | Review the meeting recording. | Share any relevant insights or action items with the team.

For more fun, experiment with prompts like these:

  • “Summarize where I was mentioned in email, Teams messages, and channel messages in the last 24 hours. Use that information to prioritize my top three action items for today.”
  • “Summarize my emails, Teams messages, and channel messages for the last six weeks about the <project or topic>. List action items in a dedicated column.”

This exercise, while enlightening, is just the tip of the iceberg in what Copilot can deliver. Functionally speaking, adding Copilot to private equity operations can assist a team with tasks such as researching, monitoring, structuring, supporting processes, and making informed recommendations.

Once a company’s users are fluent in prompt engineering, the door is open to realizing greater value from the firm’s investments in generative AI.

Learn more and move forward

Every AI journey is unique, and the best way to start is to engage with Microsoft or a global partner to explore options and opportunities. To learn more about how Microsoft can help financial services organizations unlock business value and deepen client relationships, see the Microsoft Cloud for Financial Services website. To get started with Copilot for Microsoft 365, contact your Microsoft support team or technology partner.


1What Can Copilot’s Earliest Users Teach Us About Generative AI at Work?, Microsoft, November 2023.

How Microsoft and AI work to reduce government benefit fraud and error http://approjects.co.za/?big=en-us/industry/blog/government/2024/07/15/how-microsoft-and-ai-work-to-reduce-government-benefit-fraud-and-error/ Mon, 15 Jul 2024 16:00:00 +0000 Governments are increasingly challenged to protect benefits programs from fraud, abuse, and waste. Here's how generative AI and Microsoft are helping agencies respond in ways that can mitigate loss and help improve trust in government.

The post How Microsoft and AI work to reduce government benefit fraud and error appeared first on Microsoft Industry Blogs.

]]>
An organized criminal group defrauds a national stimulus program through the use of elaborate fake identities. An 87-year-old woman receives maternity benefits, decades after giving birth. A newly unemployed worker unwittingly submits incorrect online forms and is approved for twice the benefits he qualifies for. These representative anecdotes are just a sampling of the myriad ways in which funding earmarked for government benefits can be lost to fraud, abuse, and waste.  

Governments do remarkable work to help people live with dignity and endure difficult circumstances by providing essential benefits and resources across a broad spectrum of programs. Yet the challenge of providing the right benefits at the right time to all qualified recipients while also minimizing the incidence of improper payments is a balancing act that many governments struggle to achieve. 

Helping agencies and organizations to address these types of challenges is central to our work in Microsoft for Government. We focus on enabling thriving communities and inclusive programs through technology. In the case of benefits protection, we work to help each government discover the right mix of technological and organizational innovation to mitigate risk within the parameters of their unique circumstances and requirements. 


The rising tide of improper payments

The loss of benefits funds to unauthorized payments has long been a problem for governments around the world. In recent years, the scale and complexity of the challenge has escalated. Fraud is a particularly expensive and multidimensional problem. To offer just one example, the US Government Accountability Office (GAO) estimated that the amount of fraud in US unemployment insurance programs during the COVID-19 pandemic was likely between $100 billion and $135 billion USD.1

The reasons for this are varied, and they parallel larger trends in society. A key driver is the ready availability of AI tools, which fraudsters have enthusiastically embraced to automate, scale, and generally uplevel the impact of their efforts. But it’s not always a matter of criminal intent. So-called “unintentional errors”—mistakes made without malicious intent, stemming from administrative errors or poorly designed solutions—also incur huge costs. According to the GAO, overpayments or payments that should not have been made (for example, to deceased people or those no longer eligible for government programs) totaled an estimated $247 billion during fiscal year 2022.2

Why governments struggle to ensure proper payments 

Unlike organizations in other global industry sectors, governments face unique challenges in administering complicated social benefits programs. The nature of distributing funds accurately and efficiently to constituents in all corners of society is inherently fraught. Governments must contend with a set of vexing factors, including: 

  • Growing demand for benefits funding—A mix of aging populations, increases in immigrant and refugee populations, and people seeking broader ranges of benefits is driving higher budgets and greater complexity. 
  • Mandates to “do more with less”—Austerity programs and budget cuts are colliding with rising public expectations for modern services and impacting the ability of security teams to procure top cybersecurity talent.  
  • Disconnected data silos—Across departments and agencies, teams often work with older tools and isolated systems that cannot share data easily or securely, making integrated new “tell us once” solutions difficult to implement. 
  • Ineffective legacy systems—A high “technical debt” of outdated computing environments and applications impedes innovation and threatens to hinder the most vulnerable people from accessing payments they’re entitled to.  

Beyond these costs and considerations, the larger risk here is the erosion of public trust. Especially in social benefits systems, which exist to ensure the welfare of people across all walks of life, government is obligated to demonstrate reliability and integrity. Widespread fraud and abuse do more than increase deficits and siphon off critical resources. They also undermine confidence in all aspects of government.  

A vision for modern benefits delivery and protection 

Imagine if governments were able to innovate on par with private industry. Imagine if people could easily access all the information they need through any channel (online, over phone, text, and more) and have their own virtual assistant to help them apply for benefits using natural, everyday language. Imagine if these systems inherently enforced rules and kept up with constituents’ changing lives. Imagine if customer service representatives had access to a complete, 360-degree view of the constituent, including their history and all their previous contacts with every social service agency. And imagine that such things could be accomplished without breaking the budget.

These are obviously lofty, aspirational visions. However, thanks to recent technological advancements, some innovative government social service organizations are doing just that. The advent of generative AI and the innovation it has inspired has already delivered promising results on many fronts of government operations, including benefits. To cite just one example, a generative AI-powered chatbot developed in India called Jugalbandi is helping people get assistance from any of 171 government programs, simply by conversing through mobile devices in 10 of India’s 22 official languages. “This is revolutionary for people who could not interact with tech because of language barriers,” said Abhigyan Raman, a project officer.

Ensuring the integrity of benefits payments means reevaluating a government’s existing slate of technology investments and charting a multistep course of transformation. Embracing AI is a common theme, but the destination in terms of functional outcomes and benefits will be unique to the organization.  

Three areas of generative AI impact in benefit payments protection

In our work with customers who are either early in the journey or simply just embarking, we identify three categories of solutions that hold the greatest promise for cloud and AI innovation:

  1. Boost workforce productivity. Governments can potentially save time and money using generative AI for scenarios such as live transcriptions and translations during benefit eligibility interviews. AI can also draw insights more easily from large volumes of data. One department, for example, identified GBP14 million in potentially fraudulent loans by analyzing a set of 250 networks of people, organizations, and places, processing 100 million data items.
  2. Embrace “Prevention by Design.” AI can integrate controls into systems to detect fraudulent or erroneous activities and enable real-time profiling and alerts. For example, a leading bank is using Voice ID analysis for incoming phone calls from 31 countries and 15 languages, checking more than 100 behavioral and physical vocal traits in a matter of seconds. Since launch, it has flagged more than 43,000 fraudulent calls and reduced fraud by 50%, preventing an estimated GBP981 million in losses.
  3. Enhance citizen engagement. When chatbots become knowledgeable assistants, when a person’s voice becomes their password, and when the disabled have equal access to resources, everyone will benefit. For example, a government department is developing an Intelligent Website AI Assistant to help taxpayers more easily process their tax returns. It uses natural, everyday language to address queries and disseminate timely and consistent information. This will help improve compliance and avoid revenue leakage from fraudulent activities.
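As a simplified illustration of what “prevention by design” checks could look like at payment time, consider the following sketch. The rules, thresholds, and field names are hypothetical examples, not any agency’s actual policy:

```python
# Illustrative "prevention by design" checks run before a benefit
# payment is released. Rules, thresholds, and fields are
# hypothetical examples, not any agency's actual policy.

def screen_claim(claim):
    """Return a list of reasons to hold a claim for manual review."""
    flags = []
    if claim["benefit"] == "maternity" and claim["age"] > 60:
        flags.append("implausible age for benefit type")
    if claim["amount"] > claim["entitlement"]:
        flags.append("amount exceeds computed entitlement")
    if claim["id"] in claim["already_paid_ids"]:
        flags.append("duplicate payment")
    return flags

# Echoes the anecdotes above: an 87-year-old receiving maternity
# benefits, paid double the computed entitlement.
claim = {"id": "C-102", "benefit": "maternity", "age": 87,
         "amount": 900, "entitlement": 450,
         "already_paid_ids": {"C-088"}}
print(screen_claim(claim))
```

Embedding checks like these in the payment path, rather than auditing after the fact, is what turns detection into prevention; AI extends the idea from hand-written rules to learned risk profiles evaluated in real time.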

Beyond technology: How Microsoft partners for the long term 

Embracing the paradigm shift of generative AI obviously begins with technology. The table stakes of modernization for government include migrating to a modern cloud platform and adopting a comprehensive AI development solution from a vendor who demonstrates a deep commitment to security and responsible AI practices.

Microsoft invests heavily in all these areas. However, success involves much more than technology. Governments also depend heavily on the contributions of trusted solution providers, and we believe our global partner ecosystem, with expertise in all corners of the world, sets us apart. Finally, we offer the deep experience of our industry advisors and the many government veterans on the Microsoft Government team.

Our job is to help build the bridge between the technical and the strategic, on realistic terms. When we sit down with customers, we help clarify challenges and goals, educate on important topics (for example, how governments can tackle cybersecurity and AI skilling), and share our experiences with other governments facing similar challenges (sometimes even connecting them to help foster learning). Then, we embark on identifying and exploring use cases, evaluating impact, and applying the knowledge gained to further innovation.

We are excited to work with governments to mitigate fraud and abuse in payments. To learn more about how Microsoft is helping to create opportunities that support vulnerable communities, see our Microsoft public health and social services website, and learn more about Microsoft for Government.


1Estimated Amount of Fraud during Pandemic Likely Between $100 Billion and $135 Billion, GAO. September 2023.

2Federal Payment Errors, Known As Improper Payments, Are A Continuing Concern, GAO. March 29, 2023.
