Data and AI Archives - Inside Track Blog http://approjects.co.za/?big=insidetrack/blog/tag/data-and-ai/ How Microsoft does IT Thu, 09 Apr 2026 16:34:58 +0000 en-US

Conditioning our unstructured data for AI at Microsoft http://approjects.co.za/?big=insidetrack/blog/conditioning-our-unstructured-data-for-ai-at-microsoft/ Thu, 09 Apr 2026 16:05:00 +0000

The post Conditioning our unstructured data for AI at Microsoft appeared first on Inside Track Blog.

Anyone who has ever stumbled across an old SharePoint site or outdated shared folder at work knows firsthand how quickly documentation can fall out of date and become inaccurate.

Humans can usually spot the signs of outdated information and exclude it when answering a question or addressing a work topic. But what happens when there’s no human in the loop?

At Microsoft, we’ve embraced the power and speed of agentic solutions across the enterprise. This means we’re at the forefront of developing and implementing innovative tools like the Employee Self-Service Agent, a chat-based solution that uses AI to address thousands of IT support issues and human resources (HR) queries every month—queries that used to be handled by humans. Early results from the tool show great promise for increased efficiency and time savings.

In developing tools like this agent, we were confronted with a challenge: How do we make sure all the unstructured data the tool was trained on is relevant and reliable?

Many organizations are facing this daunting task in the age of AI. Unlike structured data, which is well organized and more easily ingested by AI tools, the sprawling and unverified nature of unstructured data poses some tricky problems for agentic tool development. Tackling this challenge is often referred to as data conditioning.

Read on to see how we at Microsoft Digital—the company’s IT organization—are handling data conditioning across the company, and how you can follow our lead in your own organization.

How AI has changed the game

The power of AI and large language models has fundamentally changed the game for many work tasks. Employee support is no exception to this sweeping change.

A photo of Finney.

“A tool like the Employee Self-Service Agent doesn’t know if something is true or false—it only sees information it can use and present. That’s why stale or outdated information is such a risk, unless you manage it up front.”

David Finney, director of IT Service Management, Microsoft Digital

Instead of relying on human agents to answer employee questions or resolve issues, we now have AI agents trained on vast corpora of data that can find the answer to a complex question in seconds.

But in our drive to give these tools access to everything they might need, they sometimes end up consuming information that isn’t helpful.

“A tool like the Employee Self-Service Agent doesn’t know if something is true or false—it only sees information it can use and present,” says David Finney, director of IT Service Management. “That’s why stale or outdated information is such a risk, unless you manage it up front.”

Before AI, support teams didn't need to worry as much about buried issues in unstructured content because a human could generally spot them or filter them out manually. After we turned these tools loose, they began reading everything, including:

  • Older or hidden SharePoint content that humans would never find—but AI can
  • Large knowledge base articles with buried incorrect information
  • Region-specific content that’s not properly labeled

“For example, humans never saw the old, decommissioned SharePoint sites because they were automatically redirected,” says Kevin Verdeck, a senior IT service operations engineer. “But AI definitely could find them, and it surfaced ancient information that we didn’t even know was still out there.”

Data governance is the key

A major part of the solution to this problem is better governance. We had to get a handle on our data.

A photo of Cherel.

“We needed to determine the owners of the sites and then establish processes for reviewing content, updating it, and defining how it should be structured. I would highly encourage that our customers think about governance first when they are launching their own AI tools, because everything flows from it.”

Olivier Cherel, senior business process manager, Microsoft Digital

The first step was a massive cleanup effort, including removing decommissioned SharePoint sites and deleting references to retired programs and policies. The next step was making sure all content had ownership assigned to establish who would be maintaining it. This was followed by setting up schedules for regular content updates (lifecycle management).

Governance was the first priority for IT content, according to Olivier Cherel, a senior business process manager in Microsoft Digital.

“We had no governance in place for all the SharePoint sites, which were managed by the various IT teams,” Cherel says. “We needed to determine the owners of the sites and then establish processes for reviewing content, updating it, and defining how it should be structured. I would highly encourage that our customers think about governance first when they are launching their own AI tools, because everything flows from it.”

Content governance was also a huge challenge for other support areas, such as human resources. A coordinated approach was needed.

“HR content is vast, distributed across multiple SharePoint sites, and not everything has a clear owner,” says Shipra Gupta, an engineering PM lead in Human Resources who worked on the Employee Self-Service Agent project. “So, we collaborated with our content and People Operations teams to create a true content strategy: one source of truth, no duplication, with clear ownership and lifecycle management.”

Cherel observes that this process forces teams to think about their support content in a totally different way.

“People realize they need a new function on their team: content management,” he says. “You can’t simply rely on the knowledge found in the technicians’ heads anymore.”

Adding structure to the unstructured data

The simple truth is that part of what makes unstructured data so difficult for agentic AI tools to deal with is that it’s disorganized.

A photo of Gupta.

“Our HR Web content already had tagging for many policy documents, which helped us get started. But it wasn’t consistent across all content, so improved tagging became a big part of our governance effort.”

Shipra Gupta, engineering PM lead, Human Resources

AI works best with content that has as many of the following characteristics as possible:

  • Document structure, including:
    • Clear headers and sections
    • Page-level summaries
    • Ordered steps and lists
    • Explicit labels for processes
    • HTML tags (which AI can see, but humans can’t)
  • Structured metadata, including:
    • Region codes (e.g., US-only policies)
    • Device-specific tags
    • Secure device classification
    • Country-based hardware procurement policies and HR rules

This kind of formatting and metadata allows the AI tool to parse and sort the information more reliably, which means its answers will be substantially more accurate (even if they take slightly longer to return).
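To make the idea of structured metadata concrete, here is a minimal sketch of how a knowledge base article's metadata might be represented and checked before ingestion. The field names, region codes, and rules are our own illustration, not an actual Microsoft schema:

```python
# Hypothetical metadata schema for a knowledge base article.
# Field names and region codes are illustrative only.
REQUIRED_FIELDS = {"owner", "region", "last_reviewed", "summary"}
KNOWN_REGIONS = {"GLOBAL", "US", "EU", "APAC", None}

def validate_metadata(doc: dict) -> list[str]:
    """Return a list of problems that would make this document risky for AI ingestion."""
    problems = []
    missing = REQUIRED_FIELDS - doc.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if doc.get("region") not in KNOWN_REGIONS:
        problems.append(f"unknown region code: {doc.get('region')}")
    return problems

article = {
    "owner": "it-content-team",
    "region": "US",  # e.g., a US-only hardware procurement policy
    "last_reviewed": "2026-01-15",
    "summary": "How to request a secure device.",
}
print(validate_metadata(article))  # → []
```

A check like this can run as part of a content publishing pipeline, so documents without clear ownership or region labels never reach the agent's index in the first place.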

“A good example here is tagging,” Gupta says. “Our HR Web content already had tagging for many policy documents, which helped us get started. But it wasn’t consistent across all content, so improved tagging became a big part of our governance effort.”

Be sure that as part of your content review, you’re setting aside the time and resources to add this kind of structure to your unstructured data. The investment will pay off in the long run.

Using AI to help condition data for use

As AI tools grow more sophisticated, we're turning them on AI-related challenges themselves, including the problem of unstructured data.

“Right now, these efforts are primarily human-led, but we are applying AI to, for example, help write knowledge base articles,” Cherel says. “Also, we’re starting to use AI to determine where we have content gaps, and to analyze the feedback we’re getting on the tool itself. If we just rely on humans, it’s not going to scale. We need to leverage AI to stay on top of things and keep improving the tools.”

Essentially, the future of such technology is all about using AI to improve itself.

“We’re looking at building an agent to help validate content,” Finney says. “We can use it to check for outdated references, old processes, or abandoned terms that are no longer used. Essentially, we’ll have AI do a readiness check on the content that it is consuming.”

Ultimately, the better the data is conditioned, the more accurate and relevant the agent’s responses will be. And that will make the end user—the truly important human in the loop—much happier with the final outcome.

Key takeaways

We’ve highlighted some insights to keep in mind as you consider how to condition your own organization’s data for ingestion by AI tools:

  • Unstructured data becomes a business risk when AI is in the loop. AI agents consume everything they can access, including outdated, hidden, or conflicting content, making data conditioning a critical prerequisite for agentic solutions.
  • AI highlights content issues that were previously invisible. Decommissioned SharePoint sites, outdated policies, and region-specific content without proper labels all became visible after AI agents began scanning across systems.
  • Governance is a vital part of the conditioning process. Assigning clear content ownership and establishing lifecycle management are essential steps in ensuring the content being fed to AI tools is of high quality and is well managed.
  • Adding structure to data dramatically improves AI accuracy. Clear document formatting, consistent tagging, and rich metadata help AI agents return more relevant, reliable answers.
  • AI will increasingly be used to condition and validate the data it consumes. Microsoft is already exploring using AI to identify content gaps, analyze feedback, and flag outdated information, creating a continuous improvement loop that can scale faster than human review alone.

Harnessing AI: How a data council is powering our unified data strategy at Microsoft http://approjects.co.za/?big=insidetrack/blog/harnessing-ai-how-a-data-council-is-powering-our-unified-data-strategy-at-microsoft/ Thu, 09 Apr 2026 16:00:00 +0000

The post Harnessing AI: How a data council is powering our unified data strategy at Microsoft appeared first on Inside Track Blog.

Information technology is an ever-evolving landscape. Artificial Intelligence is accelerating that evolution, providing employees with unprecedented access to information and insights. Data-driven decision making has never been more critical for businesses to achieve their goals.

In light of this priority, we have established a Microsoft Digital Data Council to help accelerate our companywide AI-powered transformation.

Our data council is a cross-functional team with representation from multiple domains within Microsoft, including Microsoft Digital, the company’s IT organization; Corporate, External, and Legal Affairs (CELA); and Finance.

A photo of Tripathi.

“By championing robust data governance, literacy, and responsible data practices, our data council is a crucial part of our AI-powered transformation. It turns enterprise data into a strategic capability that fuels predictive insights and intelligent outcomes across the organization.”

Naval Tripathi, principal engineering manager, Microsoft Digital

Our data council’s mission is to drive transformative business impact by establishing a cohesive data strategy across Microsoft Digital, empowering interconnected analytics and AI at scale. Our vision is to guide our organization toward Frontier Firm maturity through a clear blueprint for high-quality, reliable, AI-ready data delivered on trusted, scalable platforms.

“By championing robust data governance, literacy, and responsible data practices, our data council is a crucial part of our AI-powered transformation,” says Naval Tripathi, principal engineering manager in Microsoft Digital. “It turns enterprise data into a strategic capability that fuels predictive insights and intelligent outcomes across the organization.”

Enterprise IT maturity

This article is part of a series on enterprise IT maturity in the era of agents. We recommend reading all four of these articles to gain a comprehensive view of how your organization can transform with the help of AI and become a Frontier Firm.

  1. Becoming a Frontier Firm: Our IT playbook for the AI era
  2. Enterprise AI maturity in five steps: Our guide for IT leaders
  3. The agentic future: How we’re becoming an AI-first Frontier Firm at Microsoft
  4. AI at scale: How we’re transforming our enterprise IT operations at Microsoft (this story)

Our evolving data strategy

Over the past two decades, we at Microsoft—along with other large enterprises—have continuously evolved our data strategies in search of the right balance between control and agility. Early approaches were highly decentralized, with different teams owning and managing their own data assets. While this enabled local optimization, it also resulted in inconsistent quality and limited enterprise-wide insight.

Our subsequent shift toward centralized data platforms brought much-needed standardization, security, and scalability. However, as data platforms grew more sophisticated, ownership often drifted away from the business domains closest to the data, slowing responsiveness and diluting accountability.

Today, we and other leading companies are embracing a more balanced, federated approach, often described as a data mesh. Rather than forcing all our data into a single centralized system or allowing unchecked decentralization, the data mesh formalizes domain ownership while embedding governance, quality, and interoperability directly into shared platforms.

With this approach, our domain teams publish data as well-defined, discoverable products, while common standards for security, metadata, and compliance are enforced through automation rather than manual processes. This model preserves enterprise trust and consistency without sacrificing speed or autonomy.

By adopting a data mesh mindset, we can scale analytics and AI more effectively across the organization while still keeping ownership closely connected to the business focus. The result is a system that supports innovation at the edges, strong governance at the core, and seamless collaboration across domains, enabling the transformation of data from a technical asset to a strategic, enterprise-wide capability.

Quality, accessibility, and governance

To scale enterprise data and AI, organizations must first ensure their data is trusted, discoverable, and responsibly governed. At Microsoft Digital, our data strategy is designed to create data foundations that power intelligent applications and effective decision making across the company.

A photo of Uribe.

“High-quality, well-governed data is essential to accelerate implementation and adoption of AI tools. Data quality, accessibility, and governance are imperatives for AI systems to function effectively, and recognizing that is propelling our data strategy.”

Miguel Uribe, principal PM manager, Microsoft Digital

By implementing a data mesh strategy at scale, we aim to unlock valuable data insights and analytics, enabling advanced AI scenarios. Our data council focuses on three core dimensions that make AI-ready data possible:

  • Quality: Making sure enterprise data is reliable and complete
  • Accessibility: Enabling secure and discoverable access to data
  • Governance: Protecting and managing our data responsibly

Together, these dimensions form the foundation for scalable innovation and AI-powered data use. They connect data silos and ensure consistent, high‑quality access across the enterprise—enabling both humans and AI systems to work from the same trusted data foundation. As AI use cases mature, this foundation allows AI agents to retrieve and reason over data through enterprise endpoints, while supporting advanced analytics, data science, and broader technology.

“High-quality, well-governed data is essential to accelerate implementation and adoption of AI tools,” says Miguel Uribe, a principal PM manager in Microsoft Digital. “Data quality, accessibility, and governance are imperatives for AI systems to function effectively, and recognizing that is propelling our data strategy.”

Quality

AI-ready data is available, complete, accurate, and high-quality. By adopting this standard, our data scientists, engineers, and even our AI agents are better able to locate, process, and govern the information needed to drive our organization and maximize AI efficiencies.

By utilizing Microsoft Purview, our data council can oversee the monitoring of data attributes to ensure fidelity. It also monitors parameters to enforce standards for accuracy and completeness.
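Purview provides these monitoring capabilities through its own tooling; purely as an illustration of the underlying idea (not Purview's API), a completeness check over a set of records could be sketched like this:

```python
def completeness(records: list[dict], required: list[str]) -> float:
    """Fraction of records with a non-empty value for every required attribute."""
    if not records:
        return 0.0
    complete = sum(
        1 for r in records
        if all(r.get(field) not in (None, "") for field in required)
    )
    return complete / len(records)

rows = [
    {"id": 1, "owner": "hr", "region": "EU"},
    {"id": 2, "owner": "", "region": "US"},  # incomplete: owner is empty
]
score = completeness(rows, required=["owner", "region"])
print(score)  # 0.5
```

Tracking a score like this over time, and alerting when it drops below an agreed threshold, is one simple way a data council can enforce standards for accuracy and completeness.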

Accessibility

Ensuring that our employees get access to the information they need while prioritizing security is a foundational element of our enterprise data strategy. Microsoft Fabric allows us to unify our organization's siloed data in a single "mesh" that enables advanced analytics, data science, data visualization, and other connected scenarios.

Microsoft Purview then gives us the ability to democratize that data responsibly. By implementing a data mesh architecture, our employees can work confidently, unencumbered by siloed or inaccessible data, and with the assurance that the data they’re working with is secure.

A graphic shows how the data mesh architecture allows employees to access data they need, with platform services and data management zones surrounding this architecture.
The data mesh architecture enables our employees to do their work efficiently while preventing the data they’re working on from becoming siloed.

The data mesh connects and distributes data products across domains, enabling shared data access and compute while scaling beyond centralized architectures.

Platform services are standardized blueprints that embed security, interoperability, policies, standards, and core capabilities—providing guardrails that enable speed without fragmentation.

Data management zones provide centralized governance capabilities for policy enforcement, lineage, observability, compliance, and enterprise-wide trust.  

Governance

As organizations scale AI capabilities, strong governance becomes essential to ensure security, compliance, and ethical data use. Data governance—which includes establishing data policies, ensuring data privacy and security, and promoting ethical AI usage—is critical, as is compliance with General Data Protection Regulation (GDPR) and Consumer Data Protection Act (CDPA) regulations, among others.

However, governance is not only a technical capability; it’s also a cultural commitment.

Responsible data use must be embedded into the way teams manage data and build AI solutions. Through Microsoft Purview, we implemented an end-to-end governance framework that automates the discovery, classification, and protection of sensitive data across the enterprise data landscape.

This unified approach allows teams to innovate confidently, knowing that the data powering their insights and AI systems is trusted and protected, as well as responsibly managed.

“AI systems are only as reliable as the data that powers them,” Uribe says. “By investing in trusted and well-managed data, we accelerate not only the adoption of AI tools but our ability to generate meaningful insights and intelligent outcomes.”

The data catalog as the discovery layer

By serving as a common discovery layer for humans and AI, the data catalog ensures that governance translates directly into speed, accuracy, and trust at scale.

A unified data strategy only succeeds if both people and AI systems can consistently find the right data. At Microsoft, this is enabled by our enterprise data catalog, which operationalizes the standards set by our data council. 

For business users, the catalog provides intuitive search, ownership transparency, and trust signals—enabling confident self‑service analytics. For AI agents, the same catalog exposes machine‑readable metadata, allowing agents to programmatically discover canonical datasets, validate schema and freshness, and respect governance constraints.
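As a rough sketch of what programmatic discovery can look like for an agent, consider filtering catalog entries by certification, schema coverage, and freshness. The entries, field names, and thresholds below are hypothetical, not the Purview Unified Catalog API:

```python
from datetime import date, timedelta

# Illustrative catalog entries; a real catalog exposes far richer
# metadata (lineage, classifications, trust signals) via its own APIs.
catalog = [
    {"name": "sales_orders", "owner": "finance",
     "schema": {"order_id", "amount"},
     "last_refreshed": date(2026, 4, 1), "certified": True},
    {"name": "sales_orders_legacy", "owner": None,
     "schema": {"order_id"},
     "last_refreshed": date(2023, 6, 1), "certified": False},
]

def discover(catalog, needed_columns, max_age_days=30, today=date(2026, 4, 9)):
    """Return certified, fresh datasets whose schema covers the needed columns."""
    fresh_cutoff = today - timedelta(days=max_age_days)
    return [
        d["name"] for d in catalog
        if d["certified"]
        and needed_columns <= d["schema"]
        and d["last_refreshed"] >= fresh_cutoff
    ]

print(discover(catalog, {"order_id", "amount"}))  # ['sales_orders']
```

The point is that the same governance signals humans read as trust badges become machine-checkable constraints: a stale or uncertified dataset simply never enters the agent's candidate set.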

Our role as Customer Zero

In Microsoft Digital, we operate as Customer Zero for the company's enterprise solutions, so that our customers don't have to be.

That means we do more than adopt new products early. We deploy them at enterprise-scale, operate them under real‑world constraints, and hold them to the same standards our customers expect. The result is more resilient, ready‑to‑use solutions and a higher quality bar for every enterprise customer we serve.

A photo of Baccino.

“When we engage product teams with real telemetry from how data is created, governed, and consumed at scale, we move the conversation from theory to execution. That’s how enterprise readiness becomes real.”

Diego Baccino, principal software engineering manager, Microsoft Digital

Our data council embodies this Customer Zero mindset through its Enterprise Readiness initiative. By engaging product engineering as a unified enterprise voice, the council drives strategic conversations that surface operational blockers, influence roadmap prioritization, and ensure new and existing data solutions are truly ready for enterprise use.

These learnings are then shared broadly across Microsoft Digital to accelerate adoption, reduce duplication, and scale proven patterns across teams.

“When we engage product teams with real telemetry from how data is created, governed, and consumed at scale, we move the conversation from theory to execution,” says Diego Baccino, a principal software engineering manager in Microsoft Digital and a member of the council. “That’s how enterprise readiness becomes real.”

This work is deeply integrated with our AI Center of Excellence (CoE), where Customer Zero principles are applied to accelerate AI outcomes responsibly. Together, the AI CoE and the data council focus on improving data documentation and quality—foundational capabilities that are required to make AI feasible, trustworthy, and scalable across the enterprise.

By grounding AI innovation in measurable data quality and governance standards, Microsoft Digital ensures that experimentation can safely mature into production‑ready solutions. The partnership between our data council, our AI CoE, and our Responsible AI (RAI) Council is essential to our broader data and AI strategy.

“AI readiness isn’t aspirational—it’s operational,” Baccino says. “By measuring the health of our data, setting clear quality baselines, and using those signals to guide product and platform decisions, we turn data into a strategic asset and AI into a repeatable capability.”

Together, these teams exemplify what it means to be Customer Zero: Transforming enterprise experience into action, governance into acceleration, and data into durable competitive advantage.

Advancing our data culture

Our data council plays a pivotal role in advancing the organization's transition from data literacy to enterprise data and AI capability. In conjunction with our AI CoE, it creates curricula and sponsors learning pathways, operational practices, and community programs to equip our employees with the skills and mindset required to thrive in a data- and AI-centric world.

While early efforts focused on improving data literacy, our data council's mission has evolved to enable data and AI capability at scale together with our AI CoE—where employees not only understand data but can effectively apply it to build, operate, and govern intelligent solutions.

“Our focus is not just teaching our teams about data. It is enabling employees to apply data to create AI-driven outcomes. When teams understand how data powers AI systems, they can make better decisions, design better products, and build more responsible AI experiences.”

Miguel Uribe, principal PM manager, Microsoft Digital

Our curriculum includes high-level courses on data concepts, applications, and extensibility of AI tools like Microsoft 365 Copilot, as well as data products like Microsoft Purview and Microsoft Fabric.

By facilitating AI and data training, offering internally focused data and AI certifications, and fostering internal community engagement, our council ensures that employees develop the capabilities required to responsibly build and operate AI-powered solutions. Achieving data and AI certifications not only promotes career development through improved data literacy, it also enhances the broader data-driven culture within our organization.

“We recognize that AI capability is built when data skills are applied directly to real AI scenarios and business outcomes—not when learning exists in isolation,” Uribe says. “Our focus is not just teaching our teams about data; it is enabling employees to apply data to create AI‑driven outcomes. When teams understand how data powers AI systems, they can make better decisions, design better products, and build more responsible AI experiences.”

Lessons learned

Our data council was created to develop and execute a cohesive data strategy across Microsoft Digital and to foster a strong data culture within our organization. Over time, several critical lessons have emerged.

Executive sponsorship enables transformation

Executive sponsorship is a key element to ensure implementation and adoption of a data strategy. Our leaders are committed to delivering and sustaining a robust data strategy and culture and have been effective champions of the council’s work.

“Leadership provides support and reinforcement of the council’s mission, as well as guidance and clarity related to diverse organizational priorities,” Baccino says.

Cross-functional collaboration accelerates impact

Our council’s work has also benefited from the diverse representation offered by different disciplines across our organization. Embracing diverse perspectives and understanding various organizational priorities is critical to implementing a successful data strategy and culture in a large and complex organization like Microsoft Digital.

Modern platforms allow for scalable AI productivity

Technology and architecture also play a critical role in enabling enterprise data and AI capability. Platforms like Microsoft Purview and Microsoft Fabric provide the governance, discovery, and analytics infrastructure required to create trusted, AI-ready data ecosystems.

Combined with strong leadership support and community engagement, these platforms allow our organization to move beyond isolated data projects toward connected, enterprise-wide intelligence.

As our organization continues to evolve, our data council’s strategic work and valuable insights will be crucial in shaping the future of data-driven decision making and AI transformation at Microsoft.

Key takeaways

Here are some things to keep in mind as you contemplate forming a data council to help you manage and scale AI impacts responsibly at your own organization:

  • A data mesh strikes the balance enterprises have been chasing. By formalizing domain ownership while enforcing standards through shared platforms, you avoid both chaotic decentralization and slow, over-centralized control.
  • Governance is an accelerator when it’s automated and embedded. Using platforms like Microsoft Purview and Microsoft Fabric, governance shifts from a manual gatekeeping function to a built‑in capability that enables faster, trusted analytics and AI.
  • AI systems are only as strong as their discovery layer. A unified enterprise data catalog allows both people and AI agents to find, trust, and use data consistently—turning standards into operational speed.
  • Customer Zero turns theory into enterprise‑ready execution. By operating its own data and AI platforms at scale, Microsoft Digital provides real telemetry and practical feedback that directly shapes product readiness.
  • Building AI capability is a cultural effort, not just a technical one. Our data council’s focus on applied learning, certification, and real-world AI scenarios ensures data skills translate into durable business outcomes.
  • AI scale exposes the cost of fragmented data ownership. A data council cuts through silos by aligning priorities, resolving tradeoffs, and concentrating investment on the data assets that matter most for AI impact.
  • Shared metrics create shared ownership. Publishing data quality and AI‑readiness scores at the leadership level reinforces accountability and positions data as a core enterprise asset.

Powering data governance at Microsoft with Purview Unified Catalog http://approjects.co.za/?big=insidetrack/blog/powering-data-governance-at-microsoft-with-purview-unified-catalog/ Thu, 05 Feb 2026 17:00:00 +0000

The post Powering data governance at Microsoft with Purview Unified Catalog appeared first on Inside Track Blog.

Data fuels everything that we do here at Microsoft, from the daily operations that keep the business running to the innovations that shape the future.

But as data sprawls across teams, systems, and borders, the task of ensuring that it remains secure, accurate, and well-governed is a daunting one. A sound approach to data governance is the backbone of responsible data use across the enterprise, creating clarity around data ownership and access.

In an organization the size of Microsoft, no single team can carry this responsibility on its own. Effective data governance must be a distributed effort across all departments and functions.

This story explains how our marketing organization uses the Microsoft Purview Unified Catalog to organize and standardize the data we rely on daily. By putting clear ownership, consistent definitions, and reliable governance in place, we’re turning fragmented, unreliable data into an advantage that supports faster decisions and more effective campaigns.

Data governance at scale

As companies grow, their data governance becomes increasingly complex, with different teams creating their own versions of key data concepts, often without realizing it. The complexity is most visible in the way users across an organization define foundational terms.

A photo of Doughty.

“We found adoption to be much easier when helping teams focus on building more value in their data instead of driving governance like a compliance effort.”

Nick Doughty, senior product manager, Microsoft Purview Unified Catalog

Examples in marketing include what counts as a customer (active vs. inactive, marketing- or sales-qualified), what constitutes sensitive data (personally identifiable information, behavioral data, partner data), and what a metric means (conversion, engagement, attribution windows).

When inconsistent practices take hold, ownership becomes murky. With the increasing demands that managing data quality and integrity puts on our leaders and their teams, effective data governance becomes one more hurdle to productivity.

“We started off implementing data governance like an issue register,” says Nick Doughty, a senior product manager within Microsoft Purview Unified Catalog. “Then we progressed to more of an enforcement method, similar to how we were doing security at the time. We found that when we started to push really hard on teams, similar to how we drove other compliance efforts, it was difficult for them to justify or understand why they would want the added governance.”

The introduction of Microsoft Azure Purview in 2020 marked a turning point.

A unified platform for data governance, security, and compliance, Purview helps organizations understand, protect, and manage data across environments. It also addresses fragmented data, lack of visibility into where sensitive data lives and how it moves, compliance complexity with regulations (including GDPR and HIPAA), and security risks.

A photo of Mathur.

“Our marketing teams used to spend hours hunting for the right customer list because multiple versions lived in different locations, each with unclear owners and inconsistent labels. Now our marketers can trust they are working from current information, while avoiding compliance risks associated with incorrect or unauthorized data.”

Sourabh Mathur, principal engineering lead, Global Marketing Engines and Experiences

The Purview Unified Catalog serves as the AI-powered backbone, automatically discovering, classifying, and organizing information so users can easily find and trust the data they need.

By launching the unified catalog, we gave our users a consistent way to understand and use their data, while reinforcing strong governance and compliance practices. The result is data that’s more discoverable, reliable, and actionable. (The product was renamed Microsoft Purview in 2022 and became part of Microsoft 365 compliance tools.)

“Our marketing teams used to spend hours hunting for the right customer list because multiple versions lived in different locations, each with unclear owners and inconsistent labels,” says Sourabh Mathur, a principal engineering lead in Global Marketing Engines and Experiences, who helped set up Purview for our marketing organization.

With the unified catalog in place, Purview surfaces the dataset, shows its lineage, and applies the correct sensitivity classifications.

“Now our marketers can trust they are working from current information, while avoiding compliance risks associated with incorrect or unauthorized customer data,” Mathur says.

Powering marketing at Microsoft with Purview

With more than 200 Microsoft Azure subscriptions, our marketing organization manages one of the largest data estates at the company. The team faces the constant challenge of scattered data, unclear data ownership, and inconsistent governance practices that slow down campaigns and increase compliance risk.

A photo of Biswal.

“Marketing can now scale governance across hundreds of data products, support self-service data collection with guardrails, automate access decisions, and enable AI workloads on trusted data.”

Deepak Kumar Biswal, principal software engineering lead, Global Marketing Engines and Experiences

By adopting Purview, our marketing team gained unified visibility, clearer classification standards, and smoother collaboration with other departments, like IT and legal. This reduces friction while strengthening data protection.

The result is an organization that moves faster with greater confidence in how it handles customer and campaign data.

Instead of relying on legacy knowledge, forcing users to dig through different servers and SharePoint sites, or constantly sending queries to the engineering teams, our marketing professionals can now explore the curated Purview Unified Catalog, making streamlined, efficient data discovery possible.

“Marketing can now scale governance across hundreds of data products, support self-service data collection with guardrails, automate access decisions, and enable AI workloads on trusted data,” says Deepak Kumar Biswal, a principal software engineering lead in Global Marketing Engines and Experiences. “Purview turns responsible data use into everyday practice, not extra work.”

Data governance and security: Two sides of the same coin

For our marketing organization, data governance and security are inseparable concepts. As soon as you have customer information, you need to make sure it’s secure—sensitive data must be carefully defined, consistently managed, and protected from misuse or breach.

Purview supports this goal by combining governance capabilities with security and compliance controls that provide added layers of protection.

Within marketing, the governance and security teams work closely together. Good governance measures ensure our data is properly defined and standardized, while strong security policies ensure it’s handled with proper safeguards. By pairing governance with strong security practices, our marketing team can remain compliant with data privacy laws, prevent misuse of sensitive information, and foster trust across their organization.

When our marketing team began its Purview journey five years ago, it adopted a centralized governance model. Much like the structure of a government—where federal, state, and local entities each play a role—our approach allows both centralized standards and local autonomy. This creates consistency across the organization without stifling agility.

Our Data Governance team took on the role of steward, defining standards, onboarding systems, and collaborating with its IT partners to connect data environments. Existing assets like data dictionaries and process flows were used to seed the catalog, ensuring the team started from known ground rather than reinventing definitions from scratch.

This deliberate, incremental approach allowed our marketing team to thoughtfully build out healthy governance practices. By moving slowly, the team learned from each step on its journey, refining processes and establishing consistent practices as it moved along.

For example, working closely with our team in Microsoft Digital allowed them to experiment with different ways of discovering and cataloging their data, taking time to learn and refine how Purview handled their data before rolling anything out broadly.

Our goal is to transition to a completely federated model in which responsibility shifts outward. Rather than the marketing governance team doing all the stewardship, individual groups will take ownership of their data within Purview. This shift distributes accountability, embeds governance deeper into daily operations, and makes it easier for teams to monitor data quality and enforce standards on their own.

Impact across the enterprise

Since adopting Purview Unified Catalog, we’ve seen tangible results across our data estate and our data governance practices, both in marketing and across all verticals within the company. Here are some companywide highlights:

  • Better consolidation: We’ve unified five catalogs into one.
  • Increased scale: We onboarded 250 data sources in six months, representing roughly 10 million assets.
  • Higher internal adoption: We set up more than 50 governance domains, an effort we supported with reusable training assets, guides, and onboarding materials.

The benefits include marketing but extend well beyond it:

  • Teams across the company are gaining increased confidence in their data definitions.
  • Compliance and privacy obligations are being met more effectively.
  • Business value is being generated through better, more trusted use of data.
  • Organizations are benefiting from faster time-to-insight.

Launching the marketing governance domain

We’re using Purview to combine essential capabilities like data governance, classification, and quality checks across our Microsoft services, which creates a unified foundation for our enterprise-wide metadata management. These unified capabilities make Purview an indispensable tool for us, and for large-scale enterprises.

A photo of Singh.

“With various role types like data curator and data reader, we can add more visibility into our data—where it lives, how it’s being used, and who are its primary owners. Clearly defining these parameters helps us use the data governance framework as a starting point and improve our data governance capabilities.”

Vinny Singh, principal program manager, Global Marketing Engines and Experiences

As early adopters of Purview Unified Catalog, our marketing organization launched the Marketing Governance domain, registering more than 200 data products using the Unified Catalog’s data map.

The products, spanning various datasets, are aligned with strict internal governance standards. This gives marketing the ability to govern, classify, and track data across its ecosystem—ensuring adherence to GDPR and other regulatory compliance measures.

“With various role types like data curator and data reader, we can add more visibility into our data—where it lives, how it’s being used, and who are its primary owners,” says Vinny Singh, a principal program manager in Global Marketing Engines and Experiences. “Clearly defining these parameters helps us use the data governance framework as a starting point and improve our data governance capabilities.”
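As a rough illustration of how role-based access shapes visibility, here’s a minimal Python sketch. The class, role names, and sample values are hypothetical and don’t reflect the actual Purview SDK; the point is that separating curator and reader roles makes ownership and access explicit:

```python
from dataclasses import dataclass, field

# Illustrative sketch only -- these names are invented, not the Purview API.
@dataclass
class DataProduct:
    name: str
    owner: str
    domain: str
    classification: str  # e.g. "Confidential"
    roles: dict = field(default_factory=dict)  # user -> "data curator" | "data reader"

    def grant(self, user: str, role: str) -> None:
        self.roles[user] = role

    def can_edit(self, user: str) -> bool:
        # Curators can edit metadata; readers get discovery-only access.
        return self.roles.get(user) == "data curator"

product = DataProduct(
    name="Customer Campaign List",
    owner="marketing-data-team",
    domain="Marketing Governance",
    classification="Confidential",
)
product.grant("alice", "data curator")
product.grant("bob", "data reader")
```

In practice the catalog tracks far more metadata (lineage, glossary terms, quality scores), but even this skeleton shows how defined roles answer “where does the data live, and who owns it.”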

Key takeaways

Our journey with Microsoft Purview Unified Catalog has generated key insights that you can apply to your own data governance efforts. These include:

  • Start small: Don’t try to “boil the ocean.” Begin with three to five governance domains and scale from there.
  • Leverage what you have: Data dictionaries, glossaries, and existing documentation provide a strong starting point for a governance platform founded on the Purview Unified Catalog.
  • Focus on value, not enforcement: Governance resonates when teams see how it helps them, not when it’s mandated.
  • Adapt to your organization: Each team at your company will use Purview differently. Flexibility helps encourage adoption.
  • Build community: Data governance is not a solo effort. Collaboration among stakeholders produces stronger standards and better results.

The post Powering data governance at Microsoft with Purview Unified Catalog appeared first on Inside Track Blog.

]]>
22272
Unleashing API-powered agents at Microsoft: Our internal learnings and a step-by-step guide http://approjects.co.za/?big=insidetrack/blog/unleashing-api-powered-agents-at-microsoft-our-internal-learnings-and-a-step-by-step-guide/ Thu, 02 Oct 2025 16:05:00 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=19793 Agentic AI is the frontier of the AI landscape. These tools show enormous promise, but harnessing their power isn’t always as straightforward as prompting a model or accessing data from Microsoft 365 apps. To reach their full potential in the enterprise, agents sometimes need access to data beyond Microsoft Graph. But giving them access to […]

The post Unleashing API-powered agents at Microsoft: Our internal learnings and a step-by-step guide appeared first on Inside Track Blog.

]]>
Agentic AI is the frontier of the AI landscape. These tools show enormous promise, but harnessing their power isn’t always as straightforward as prompting a model or accessing data from Microsoft 365 apps. To reach their full potential in the enterprise, agents sometimes need access to data beyond Microsoft Graph. But giving them access to that data relies on an extra layer of extensibility.

To meet these demands, many of our teams within Microsoft Digital, the company’s IT organization, have been experimenting with API-based agents. This approach combines the best of two worlds: accessing diverse apps and data repositories and eliminating the need to build an agent from the ground up.

We want to empower every organization to unlock the full power of agents through APIs. The lessons we’ve learned on our journey can help you get there.

The need for API-based agents

The vision for Microsoft 365 Copilot is to serve as the enterprise UX. Within that framework, agents serve as the background applications that streamline workflows and save our employees time.

For many users, the out-of-the-box access Copilot provides to Microsoft Graph is enough to support their work. It surfaces the data and content they need while providing a foundational orchestration layer with built-in capabilities around compliance, responsible AI, and more.

But there are plenty of scenarios that require access to other data sources.

“Copilot provides you with data that’s fairly static as it stands in Microsoft Graph,” says Shadab Beg, principal software engineering manager on our International Sovereign Cloud Expansion team. “If you need to query from a data store or want to make changes to the data, you’ll need an API layer.”

By using APIs to extend agents built on the Copilot orchestration layer, organizations can apply its reasoning capabilities to new data without the need to fine-tune their models or create new ones from scratch. The possibilities these capabilities unlock are driving a boom in API-based agents for key functions and processes.

“Cost is one of the most critical dimensions in how we design, deploy, and scale our solutions. Declarative API-driven agents in Microsoft 365 Copilot offer a path to unify agentic experiences while leveraging shared AI compute and infrastructure.”

A photo of Nasir.
Faisal Nasir, AI Center of Excellence and Data Council lead, Microsoft Employee Experience

In many ways, IT organizations like ours are the ideal places to implement API-based agents. Our teams are adept at creating and deploying internal solutions to solve technical challenges, and IT work is often about enablement and efficiency—exactly what agents do best.

“Cost is one of the most critical dimensions in how we design, deploy, and scale our solutions,” says Faisal Nasir, AI Center of Excellence and Data Council lead in Microsoft Employee Experience. “Declarative API-driven agents in Microsoft 365 Copilot offer a path to unify agentic experiences while leveraging shared AI compute and infrastructure. By aligning with core architectural principles such as efficiency, scalability, and sustainability, we can ensure these agents not only drive intelligent outcomes but also maximize value across service areas with minimal overhead.”

[Learn more about our vision and strategy around deploying agents internally at Microsoft.]

The Azure FinOps Budget Agent

Our Azure FinOps Budget Agent is a perfect example of a scenario for API-based agents.

The team responsible for managing our Microsoft Azure budget for IT services was looking for ways to reduce costs by 10–20 percent. To do that effectively, service and finance managers needed the ability to track their spending quickly, accurately, and easily.

The conventional approach to solving this problem would be creating a dashboard with access to the relevant data. The problem with a UI-based approach is that it tends to cater to specific personas, surfacing the data they need while oversaturating everyone else with information that’s irrelevant to their work.

“Azure spend is basically the lifeline for our services,” says Faris Mango, principal software engineering manager for infrastructure and engineering services within Microsoft Digital. “Getting the information you need in a concise format that provides a nice, holistic view can be challenging.”

With the advent of generative AI and Microsoft 365 Copilot, the team knew that a natural language interface would be much more intuitive. The result was the Azure FinOps Budget Agent.

The team created the agent and the necessary APIs using Microsoft Visual Studio Code. Its tables and functions run on Azure Data Explorer, allowing the APIs and their consumers to access data almost instantaneously, thanks to its low latency and rapid read speeds.

The tool retrieves data by running Azure Data Factory pipelines that pull and transform data from three sources:

  • Our SQL Server for service budget and forecast data
  • Azure Spend for the actual spending amounts
  • Projected spending, a separate service stored in other Azure Data Explorer tables

Processing the information relies on our business logic’s join operations, followed by aggregations by fiscal year and service tree levels. These summarize the data per service, team group, service group, and organization.

After the back end processes the day’s data, it ingests the information into our Azure Data Explorer tables, which the agent accesses by calling Kusto functions (stored queries written in KQL, the query language for Azure Data Explorer). The outcome is very low latency: typically, the agent returns results in under 500 milliseconds.
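To make that flow concrete, here’s a simplified pure-Python sketch of the join-and-aggregate step, using invented service names and numbers; the real pipeline runs in Azure Data Factory and Azure Data Explorer, not in Python:

```python
from collections import defaultdict

# Hypothetical sample rows from the three sources, keyed by (service, fiscal year).
budgets   = {("PortalX", "FY25"): 100_000, ("NetUI", "FY25"): 50_000}
actuals   = {("PortalX", "FY25"): 82_000,  ("NetUI", "FY25"): 61_000}
projected = {("PortalX", "FY25"): 97_000,  ("NetUI", "FY25"): 64_000}

# Invented service tree: service -> service group, mirroring the rollup levels.
service_tree = {"PortalX": "Infra Services", "NetUI": "Infra Services"}

def summarize():
    """Join the three sources per service, then roll up by service group."""
    per_service = {}
    for key, budget in budgets.items():
        actual = actuals.get(key, 0)
        per_service[key] = {
            "budget": budget,
            "actual": actual,
            "projected": projected.get(key, 0),
            "variance": actual - budget,  # positive means overspend
        }
    by_group = defaultdict(lambda: {"budget": 0, "actual": 0})
    for (service, fiscal_year), row in per_service.items():
        group = by_group[(service_tree[service], fiscal_year)]
        group["budget"] += row["budget"]
        group["actual"] += row["actual"]
    return per_service, dict(by_group)

per_service, by_group = summarize()
```

The per-service rows answer questions like “which service deviates most from forecast,” while the group rollup feeds the organization-level views FinOps managers rely on.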

For users, the tool is stunningly simple. They simply access Copilot and navigate to the Azure FinOps Budget Agent.

The agent provides three core prompts at the very top of the interface: “My budgets,” “Service budget information,” and “Service group budget information.” Clicking on one of these pre-loaded prompts returns role-specific information around budget, forecasts, actuals, projections, and variance, all at a single glance. The interface even includes graphs to help people track spending visually.

If users are looking for more specific information, they can input their own queries. For example:

  • “Get me the monthly breakdown of service Azure Optimization Assessment analytics.”
  • “Find me the service in this tree with the highest budget.”
  • “Show me the Azure budget for our facilities reporting portal.”
  • “Which service deviates most from its budget forecasts?”

The Azure FinOps Budget Agent primarily serves two groups: service managers who directly oversee spend for Azure-based services and FinOps managers responsible for larger budget silos.

Mango is responsible for the internal UI that helps network employees access parts of the Microsoft network. With 18–20K users per month, budgeting and forecasting are highly dynamic due to traffic fluctuations and the resourcing that supports them. He also oversees the internal portal that helps service engineers manage our networks. The tool is growing rapidly as we onboard more teams, so forecasting is anything but linear.

For both of these services, keeping close track of spending is essential. Mango finds himself checking the Azure FinOps Budget Agent about twice a month to gauge how his services are trending.

“It’s taking me less time to do analysis and come up with accurate numbers. And the enhanced user experience just feels more natural, like you’re asking questions conversationally rather than engaging with a dashboard.”

A photo of Mango.
Faris Mango, principal software engineering manager for infrastructure and engineering services, Microsoft Digital

For FinOps managers, the value is more high-level. They oversee dozens of services with vast volumes of Azure usage across storage and compute while managing strict budgets. That requires constant vigilance.

Switching context from one dashboard to another to track different Azure management groups was a constant hassle for them. Now, they use the Azure FinOps Budget Agent to get an up-to-date view of the overall spend picture. It gives them a place to start. From there, they can drill down if they see any abnormalities.

“It’s taking me less time to do analysis and come up with accurate numbers,” Mango says. “And the enhanced user experience just feels more natural, like you’re asking questions conversationally rather than engaging with a dashboard.”

The arrival of the Azure FinOps Budget Agent is just one example of how agents take your context and get your people the answers they care about faster at less cost.

Benefits like these are spreading across teams throughout Microsoft. Overall, we’ve been able to save 10–12 percent of our overall Azure cost footprint for Microsoft Digital, and individual users are thrilled at the amount of time and effort they’re saving.

“Now the info is at people’s fingertips. The advantage of an agent is that users don’t have to understand a complex UI, so they can get quick answers and get back to work.”

A photo of Beg.
Shadab Beg, principal software engineering manager, International Sovereign Cloud Expansion

Five key strategies for building an API-based agent

After seeing what we’ve accomplished with API-based agents, you might be wondering how to put them into action at your organization. This step-by-step guide can help you get there.

An API-based agent needs to fulfill multiple requirements: it has to expose the right APIs, align with real user needs, integrate seamlessly with Microsoft 365 Copilot, and run reliably, efficiently, and at scale. Achieving those outcomes depends on five key strategies.

Start with user intent, not the API

Start by asking a simple but powerful question: What will users actually ask your agent? Instead of designing the API first, flip the process:

  • Gather real user queries to understand actual use cases.
  • Refine the queries using prompt engineering techniques to align them with expected AI behavior.
  • Design the API to provide structured responses to those refined queries.

By starting with user intent, you ensure your agent answers real user questions directly, avoids over-engineering unnecessary endpoints, and delivers meaningful results without excessive back-end processing.
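One lightweight way to put this into practice is to map each collected user question to the data fields it needs; the fields shared across a cluster of questions become the response schema for a candidate endpoint. Everything in this sketch, including the queries and field names, is illustrative:

```python
# Map each real user question to the fields needed to answer it. Endpoints
# emerge from these clusters rather than being designed first.
user_queries = {
    "What is my service's budget this year?": {"budget"},
    "How much have we actually spent?": {"actual"},
    "Which service deviates most from forecast?": {"budget", "actual", "variance"},
}

def required_fields(queries: dict) -> set:
    """Union of fields across a query cluster = the minimal response schema
    one endpoint must return to cover those questions."""
    fields = set()
    for needed in queries.values():
        fields |= needed
    return fields

schema = required_fields(user_queries)
```

If a field appears in no query, it probably doesn’t belong in the response, which is exactly how this approach avoids over-engineering unnecessary endpoints.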

“Now the info is at people’s fingertips,” Beg says. “The advantage of an agent is that users don’t have to understand a complex UI, so they can get quick answers and get back to work.”

Key learning: An API that doesn’t align with user intent won’t be effective—even if you design it well.

Design APIs for Microsoft 365 Copilot integration

It’s important to build an API schema that returns precise and structured data to make it easy for Copilot to consume. This ensures your APIs return data in a format that directly answers user queries. Copilot expects responses in under three seconds, so focus on optimizing API responses for low latency.

Once you have your list of key questions, design your API schema to return the exact data you need to answer those questions. Your goal should be to ensure every API response has a structure that makes it easy for Copilot to understand.

Teach Microsoft 365 Copilot to call your API

Copilot needs to know how to call your API. Manifests and OpenAPI descriptions accomplish that training.

Create detailed OpenAPI documentation and plugin manifests so Copilot knows what your API does, how to invoke it, and what responses to expect. You’ll likely need to adjust these files through a process of trial and error.
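Here’s a minimal OpenAPI 3.0 description expressed as a Python dictionary. The path, operation ID, and fields are invented for illustration, but they show the level of descriptive detail that helps Copilot decide when to call an operation:

```python
# Illustrative OpenAPI 3.0 description as a Python dict. The path and
# operationId are hypothetical; the "description" text is what teaches
# Copilot *when* this operation is the right one to invoke.
openapi_spec = {
    "openapi": "3.0.0",
    "info": {"title": "Budget API", "version": "1.0"},
    "paths": {
        "/budget/{serviceId}": {
            "get": {
                "operationId": "getServiceBudget",
                "summary": "Get budget, actuals, and variance for a service",
                "description": (
                    "Use this when the user asks about a service's Azure "
                    "budget, spend, forecast, or variance."
                ),
                "parameters": [{
                    "name": "serviceId",
                    "in": "path",
                    "required": True,
                    "schema": {"type": "string"},
                }],
                "responses": {"200": {"description": "Budget summary"}},
            }
        }
    },
}

def operation_ids(spec: dict) -> list:
    """Collect every operationId so a plugin manifest can reference them."""
    return [
        op["operationId"]
        for path in spec["paths"].values()
        for op in path.values()
    ]
```

In the real workflow this description lives in a YAML or JSON file alongside the plugin manifest; iterating on the summary and description text is where most of the trial and error happens.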

Scale APIs for performance and reliability

Once you have your schema and integration in place, it’s time to move on to the primary engineering challenge: making your API scalable, efficient, and reliable.

Prioritize the following goals:

  • Fast response times: Copilot expects quick answers.
  • High scalability: This ensures seamless performance at scale.
  • Reliable uptime: The system needs to remain robust.

We recommend setting a very strict latency limit while implementing your API to retrieve data, since Copilot needs time to generate its response. Existing API endpoints often involve complex data joins rather than simply returning rows from data tables. This complexity can lead to longer processing times, particularly with intricate queries that involve multiple data stores.

To address these potential delays, pre-cache results to significantly enhance performance. This can help overcome the latency requirements imposed by Copilot.

At this point, you’ll see why starting with user intent and iteratively refining API design is important. By grounding your work in user behaviors, you’ll align with the following best practices:

  • Structure your response to directly address user queries.
    Instead of just returning raw data, the API should provide meaningful insights Copilot can interpret. Prompt engineering marries user intent with the most understandable API schema.
  • Keep your API flexible enough to adapt to evolving business needs.
    Real-world workflows change over time, and an API should be able to support those changes without massive refactoring.
  • Avoid performance bottlenecks caused by unnecessary complexity.
    Understanding the exact data requirements up front prevents heavy joins, excessive filtering, and inefficient data retrieval logic.
  • Optimize for Copilot’s real-time response constraints.
    With a strict limit on latency, consider pre-optimization techniques like pre-caching results and simplifying query logic from the very beginning of your API implementation.
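Pre-caching can start as simply as a time-to-live (TTL) wrapper around the expensive query. This sketch is illustrative rather than a production cache, and the service name and numbers are invented:

```python
import time

class TTLCache:
    """Serve precomputed results while fresh; recompute only after `ttl` seconds."""
    def __init__(self, compute, ttl=300.0):
        self.compute = compute
        self.ttl = ttl
        self._store = {}  # key -> (expiry, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry and entry[0] > time.monotonic():
            return entry[1]                       # cache hit: near-instant
        value = self.compute(key)                 # cache miss: the slow multi-store join
        self._store[key] = (time.monotonic() + self.ttl, value)
        return value

calls = []
def slow_budget_query(service):
    calls.append(service)  # stands in for a heavy join across data stores
    return {"service": service, "budget": 100_000}

cache = TTLCache(slow_budget_query, ttl=300)
cache.get("PortalX")
cache.get("PortalX")  # second call is served from the cache
```

Refreshing the cache on a schedule (for example, after the nightly pipeline run) rather than on demand keeps worst-case latency inside Copilot’s response window.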

If you attempt to build a scalable, reliable API without first understanding how users will interact with your agent, you’ll spend months reworking the schema, debugging inefficiencies, and struggling with integration challenges.

Key learning: A fast, scalable, and reliable API isn’t just about technical optimization. It starts with a deep understanding of the questions it needs to answer and how to structure responses so Copilot can interpret them correctly.

Consider compliance and responsible AI

Unlike custom agents or OpenAI API integrations, knowledge-only agents require far less effort to meet Microsoft’s Responsible AI Standard. Microsoft tools’ built-in compliance capabilities handle much of the complexity. As a result, you can focus on efficiency and optimization rather than regulatory hurdles.

“Agent-based automation must balance speed with responsibility,” Nasir says. “We embed compliance, cost control, and telemetry from the start, so our systems don’t just scale, they mature.”

Key learning: It’s helpful to revisit your existing compliance, governance, and responsible AI processes and policies before implementing AI solutions. Copilot adheres to protective structures within your Microsoft technology ecosystem, so this process will ensure you’re starting from the most secure position.

APIs and the agentic future

Building API-based agents is more than just an integration exercise. It’s about creating scalable, intelligent, and compliant AI-driven workflows. By aligning your API design with user intent, you set Microsoft 365 Copilot free to retrieve and interpret information accurately. That leads to a seamless AI experience for your employees.

Thanks to Copilot’s built-in security and compliance features, API-based Copilot agents are some of the most efficient, compliant, and enterprise-ready ways to deploy AI solutions. They represent another step into an AI-first future tailored to your employees’ and organization’s needs.

Tools like API-based agents democratize the information we all need to do our jobs better, because we’re all getting the same data from the same place. This is why an AI-first mindset is actually human-first.

Key takeaways

Here are some things to keep in mind when designing agent-powered experiences that are fast, reliable, and aligned with user expectations.

  • Response time is key. Choose low-latency APIs that satisfy both Copilot’s technical requirements and users’ expectations.
  • Consider the source. Data has to be high-quality on the backend. It’s worth reviewing your data and ensuring the hygiene is good.
  • Agents and APIs need to align. Design with task-centric, well-structured agents. Determine your high-level goals, then use the OpenAI standard, OpenAPI, or graph schemas to describe task endpoints. Define each API’s capability, input schema, and expected outcome very clearly.
  • Plan ahead to avoid surprises. Design your APIs to minimize potential side effects, and pay particular attention to enabling natural-language-to-API mapping, because that’s the biggest change in methodology.
  • Design for visibility. Agents need to be observable and explainable, so implement metrics-driven monitoring. Having API-level telemetry in addition to Copilot-level telemetry enables continuous improvement.
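API-level telemetry can begin with something as small as a timing decorator that records latency per endpoint; the endpoint and function names below are illustrative:

```python
import time
from collections import defaultdict
from functools import wraps

latencies = defaultdict(list)  # endpoint name -> list of call durations in seconds

def timed(endpoint):
    """Record how long each call takes so slow endpoints surface in metrics."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                latencies[endpoint].append(time.perf_counter() - start)
        return wrapper
    return decorator

@timed("get_budget")
def get_budget(service):
    # Stand-in for a real handler; a production version would also log
    # status codes and request metadata.
    return {"service": service}

get_budget("PortalX")
```

Pairing per-endpoint numbers like these with Copilot-level telemetry shows whether a slow answer comes from the API or from the orchestration layer, which is what makes continuous improvement possible.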

The post Unleashing API-powered agents at Microsoft: Our internal learnings and a step-by-step guide appeared first on Inside Track Blog.

]]>
19793
Embracing emerging technology at Microsoft with new AI certifications http://approjects.co.za/?big=insidetrack/blog/embracing-emerging-technology-at-microsoft-with-new-ai-certifications/ Wed, 16 Apr 2025 15:56:00 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=12532 This story reflects updated guidance from Microsoft Digital—it was first published in 2023. As the company’s IT organization, we at Microsoft Digital realized that advanced AI was going to create opportunities for our employees to increase their reach and impact. We knew we needed to move quickly to help them get ready for the moment. […]

The post Embracing emerging technology at Microsoft with new AI certifications appeared first on Inside Track Blog.

]]>
This story reflects updated guidance from Microsoft Digital—it was first published in 2023.

As the company’s IT organization, we at Microsoft Digital realized that advanced AI was going to create opportunities for our employees to increase their reach and impact. We knew we needed to move quickly to help them get ready for the moment.

Our response?

We assembled an ambitious data and AI curriculum through Microsoft Viva Learning that draws from Microsoft Learn and other content sources. This curriculum is now empowering our employees with the skills they need to harness these tools.

Microsoft Viva Learning and Microsoft Learn

Microsoft Viva Learning and Microsoft Learn are two distinct platforms that serve different purposes.

Microsoft Viva Learning is a centralized learning hub in Microsoft Teams that lets you seamlessly integrate learning and building skills into your day. With Viva Learning your team can discover, share, recommend, and learn from content libraries provided by both your organization and select partners. They can do all of this without leaving Microsoft Teams.

Microsoft Learn is a free online learning platform that provides interactive learning content for Microsoft products and services. It offers a wide range of courses, tutorials, and certifications to help users learn new skills and advance their careers. Microsoft Learn is accessible to anyone with an internet connection and is available in multiple languages.

It’s all part of our approach to infusing AI into everything we do to support the company. The more successful we are in Microsoft Digital, the better our team can deploy our new AI technologies to the rest of our colleagues across the organization.

A photo of MacDonald.

“We take a holistic approach. It’s not just about winning with technology—it’s about supporting the community and doing things the right way.”

Sean MacDonald, partner director of product management, Microsoft Digital

Infusing AI into Microsoft through a learn-it-all culture

Fully unleashing AI across Microsoft is a bold aspiration that will require plenty of guidance and support from our Microsoft Digital team. It’s both a technology and a people challenge that requires us to have more than IT knowledge to deliver.

“We take a holistic approach,” says Sean MacDonald, partner director of product management in Microsoft Digital. “It’s not just about winning with technology—it’s about supporting the community and doing things the right way.”

With our learn-it-all culture and Microsoft Viva Learning, Microsoft Learn, and other content sources at our disposal, a progressive curriculum was the natural choice for upskilling our technical professionals. Microsoft Viva Learning connects content from our organization’s internal learning libraries and third-party learning management systems.

As a result, it makes it easy for our team to develop learning paths with content from Microsoft Learn, LinkedIn Learning, and external providers like Pearson.

“As a tech company, we’re always encountering new concepts and new technologies,” says Miguel Uribe, principal product manager lead for Employee Experience Insights in Microsoft Digital. “It’s part of our culture to absorb technology and consume concepts very quickly, and AI is just the latest example.”

Building meaningful AI certifications for Microsoft employees

Our AI Center of Excellence (AI CoE)—the Microsoft Digital team tasked with designing and championing how our organization uses AI—is at the forefront of these efforts. They’re working to standardize how we leverage AI internally.

[Read our story on our CoE here: The AI Revolution: How Microsoft Digital is responding with an AI Center of Excellence.]

The AI CoE operates according to the principles of AI 4 ALL: Accelerate, Learn, and Land.

“Our first priority is creating a common understanding and language around these fairly new topics,” says Humberto Arias, senior product manager in Microsoft Digital. “The technology changes constantly, so you need to learn continually to keep up.”

Fortunately, enterprising employees within Microsoft have been laying the groundwork for this moment for years. Our Artificial Intelligence and Machine Learning (AI/ML) community had been working on their own time to deepen their knowledge through research and independent certifications.

When generative AI took off at the start of 2023, that community began partnering with the AI CoE and got serious about empowerment. They brought their knowledge. The AI CoE brought their organizational leadership.

“No other organization within Microsoft can provide such a clear picture of what you need for upskilling,” says Urvi Sengar, an AI/ML senior software engineer in Microsoft Digital. “Only our IT organization is functionally diverse enough.”

Their work is a testament to the power of trusting your technology champions to lead change.

A photo of Sengar.

“So much is changing that we don’t want to stop at just one static certification. We want to keep the learning going, along with everything new and relevant, so we can take this community forward.”

Urvi Sengar, senior software engineer, Microsoft Digital

In previous years, Sengar and her AI/ML community colleagues had already built a learning path focused on Azure AI Fundamentals. They relaunched the course in 2023 to represent the core of our AI certifications.

From there, a diverse group of technical and employee experience professionals collaborated to assemble, create, and structure a series of learning paths to launch our Microsoft Digital employees into the next level of AI expertise. That’s where Microsoft Viva Learning really shines. The platform makes it easy to curate our AI content actively as the technology landscape evolves.

“So much is changing that we don’t want to stop at just one static certification,” Sengar says. “We want to keep the learning going, along with everything new and relevant, so we can take this community forward.”

The result is a granular, multidisciplinary curriculum that gets Microsoft Digital employees leveled up not just to AI literacy, but to AI proficiency.

Innovative AI certifications designed for employee success

Our AI and Data Learning curriculum is divided into three distinct learning paths: basic, intermediate, and advanced.

  • AI Learning Basic gives beginners a ground-level, conceptual understanding of the technology. It builds familiarity with generative AI and no-code AI tools, as well as more theoretical frameworks and topics like the responsible AI principles, AI ethics, and how to align AI projects with our values.
  • AI Learning Intermediate is where things get more functional. Here, employees learn about natural language processing and prompt engineering, as well as several specific AI tools, including ChatGPT, AI Builder in Power Automate, Semantic Kernel (for building AI-based apps), Azure OpenAI generative models, and more.
  • AI Learning Advanced goes from function to innovation. This is where employees can dive deeper into working with large language models (LLMs), training neural networks, self-supervised machine learning, and other skills that will help them develop more advanced solutions and automations. Examples include units on Advanced Natural Language Processing with Python and UX for AI.

When employees complete each learning path, they receive a sharable badge. We used Credly, a digital credentialing solution created by Pearson, to design and manage those badges. We can then distribute them to our employees through Credly’s integration with Microsoft Viva Learning.

Microsoft Digital AI certification levels

The three AI certification badges available through Microsoft Viva Learning: beginner, intermediate, and advanced.
Microsoft employees can obtain three levels of AI certification: beginner, intermediate, and advanced.

Curating the curriculum is only one part of the AI CoE’s job. It’s also crucial to promote and socialize these learning opportunities internally. The wider Microsoft Viva employee experience suite takes care of that.

We actively socialize the AI certifications through Microsoft Viva Engage, our employee communication platform, but top-down promotion is only one component of their success. Microsoft Digital employees often share their certifications via LinkedIn or through Viva Engage. As a result, there’s an element of virality that leads even more of our employees to take these courses—even outside Microsoft Digital.

A photo of Pancholi.

“People are observing the work we do and looking for ways to bring it into their organizations. Even external customers are asking how they can set up their own centers of excellence and what to prioritize.”

Nitul Pancholi, principal product manager, Microsoft Digital

Our teams are clearly excited about their success. The share rate for AI Learning badges is 67 percent, well above Credly’s average of 47 percent.

Beyond Microsoft Digital, lines of business across Microsoft are adapting these certifications for their own needs.

“People are observing the work we do and looking for ways to bring it into their organizations,” says Nitul Pancholi, principal product manager in Microsoft Digital, who leads the AI CoE’s culture pillar. “Even external customers are asking how they can set up their own centers of excellence and what to prioritize.”

Freshly empowered AI practitioners, ready for the future

We’re still at the beginning of our internal AI adoption journey. But by raising the baseline of AI knowledge, these certifications ensure our technical professionals are ready to lead the rest of our organization.

“That’s one of the super cool things about Microsoft,” MacDonald says. “We have the playground at our fingertips, and we have the autonomy and opportunity to dream up whatever we want.”

The advent of advanced AI supported by thoughtful empowerment initiatives will only amplify our employees’ ability to experiment with emerging technologies. We’re confident that developing our own AI curriculum will help us work our way into a virtuous cycle of more learning, more creativity, and more business innovation.

Customers with access to Microsoft Viva Learning can start assembling their own AI curriculum, drawing from Microsoft Learn content, their own educational materials, and external providers and learning management systems. By unlocking AI for employees through education, organizations will be better positioned to ride the wave of the next digital revolution.

Key takeaways

Here are some things to consider as you think about launching an AI curriculum at your company:

  • Leverage your integrations with tools like Microsoft Viva Learning and LinkedIn Learning.
  • Actively curate your courses to keep your curriculum up to date.
  • Busy schedules get in the way: Build time for learning into your employees’ days, then support them with curriculum.
  • Leverage executive sponsorship, employee champions, and the social aspects of learning.
  • Incentivize and recognize progress through gamification, friendly competition, badges, and testimonials.
  • Build a diverse enablement team from across different disciplines, seniorities, and technical backgrounds.
  • Think about how to segment learners by level of expertise and learning style, then tailor the learning to those segments.

The post Embracing emerging technology at Microsoft with new AI certifications appeared first on Inside Track Blog.

]]>
12532
How Microsoft HR is using Viva and Microsoft 365 Copilot to empower our employees http://approjects.co.za/?big=insidetrack/blog/how-microsoft-hr-is-using-viva-and-copilot-for-microsoft-365-to-empower-our-employees/ Thu, 03 Apr 2025 16:11:00 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=16203 Technology is about people, and no Microsoft organization understands that better than Human Resources. When our HR team started rolling out Microsoft 365 Copilot to the global HR organization, a human-centered approach was a natural fit. And what better way to focus on the human side of adoption than Microsoft Viva? This story shares how […]

The post How Microsoft HR is using Viva and Microsoft 365 Copilot to empower our employees appeared first on Inside Track Blog.

]]>
Microsoft digital stories

Technology is about people, and no Microsoft organization understands that better than Human Resources. When our HR team started rolling out Microsoft 365 Copilot to the global HR organization, a human-centered approach was a natural fit.

And what better way to focus on the human side of adoption than Microsoft Viva?

This story shares how our HR team used the Viva suite to communicate, provide opportunities for skilling and development, and measure success. As a result, they’ve been able to craft and disseminate effective adoption content through several different channels, provide both centralized and peer-led learning opportunities, and effectively track their progress.

Applying HR expertise to Microsoft 365 Copilot

Individual images of Spahr, Owen, and Friedman are combined into a collage.
David Spahr (left to right), Chris Owen, and Liz Friedman are helping lead Microsoft 365 Copilot adoption efforts at Microsoft HR.

AI represents the greatest workplace shift in a generation, so Microsoft 365 Copilot adoption has been a priority across the company. The HR team has an especially important role in that process: They’re both practitioners supporting AI transformation across the rest of Microsoft and professionals who use this technology in their own roles.

“As champions for Responsible AI at Microsoft, we have a special duty to learn, experiment, and apply Copilot to the space where we work,” says Liz Friedman, senior director of HR AI Transformation. “To support the rest of the company on this journey, we have to understand it ourselves.”

Copilot brings an unprecedented solution to the table, so we’re applying new technology in innovative ways as we experiment with fresh approaches to adoption. And new technical capabilities aren’t the only aspect of Copilot that affects its rollout.

“In some ways, the adoption challenge with Copilot is flipped on its head,” says Caribay Garcia, a principal people scientist on the Microsoft Viva product team. “It’s not as much about building momentum as it is about guiding excitement to create effective usage and long-lasting change.”

Driving AI adoption is about harnessing the optimism and energy of employees. With its human-centered suite of apps designed for employee engagement, Microsoft Viva has been a natural fit for agile Copilot adoption efforts that respect how people assimilate new tools and processes.

Accelerating Microsoft 365 Copilot adoption with Viva

Viva Connections

Sharing key news related to deployment and enablement, generating “buzz,” and tying Copilot to Microsoft culture.

Viva Amplify

Producing and efficiently distributing employee communications to build awareness and excitement.

Viva Learning

Courses and training for our employees on how to maximize value from Copilot, inclusive of building effective prompts.

Viva Engage

Actively engaging employees, providing leader updates, listening to feedback, and enabling Champs community.

Viva Insights

Using the Microsoft Copilot Dashboard to identify actionable insights and usage trends.

Viva Pulse

Instant feedback from employees on their Copilot experience to fine-tune our landing and adoption approach.

Viva Glint

Understanding employee sentiment and gauging the overall effectiveness of our Copilot deployment effort.

With support from our team at Microsoft Digital, the company’s IT organization, HR is leaning into Viva as a vehicle for change. Their team recognizes its value across every component of the adoption journey, from communication and learning to engagement and feedback.

“Using Viva for our promotion, awareness, skilling, and reinforcement process is tremendously useful,” says Anand Shah, senior business program manager with Microsoft Digital. “It’s critical at scale because it captures so many more people than we could ever manage with just instructor-led trainings or other centralized efforts.”

HR and people science: A powerful pairing

Images of Garcia and Kalafut are combined into a composite photo.
Caribay Garcia (left) and Carolyn Kalafut are principal people scientists on the Microsoft Viva product team.

As their Microsoft 365 Copilot adoption unfolds, HR is taking a highly specialized approach that keeps Microsoft’s culture and priorities at its center. The team also benefits from the expertise of people science professionals in the Viva product group who have a background in organizational psychology, behavioral science, data science, and employee experience. These experts take a research-based and people-focused approach, infusing the Viva suite with insights from their discipline and ensuring every app effectively supports employee needs.

“People science helps us understand what’s important to employees to help them feel happy and successful,” says Carolyn Kalafut, a principal people scientist in the Microsoft Viva product group. “When employees experience rapid changes like AI transformation at work, our research shows that we can ease the process by encouraging them to share their voice, addressing their concerns, keeping them informed with key updates, and providing relevant skilling opportunities.”

To support their efforts, Microsoft Digital provides technical and change management expertise based on our experience with repeated deployment and adoption cycles.

The HR AI Transformation team built a strategy to address key challenges associated with AI adoption in HR that include:

  • HR practitioners’ professional caution in using AI tech in the context of constantly evolving guidance and laws around protecting employee confidentiality and privacy
  • Uncertainty about where and when HR and employees can responsibly use Copilot in their work
  • Questions about data security and the appropriate flow of information in the context of Copilot
  • The ongoing introduction of new features and capabilities in a rapidly evolving solution category
  • How to embrace elective technology that requires buy-in from employees rather than embedding itself into business functions by default

“We think of our efforts in terms of a cohesive strategy for driving change, from generating awareness and motivation to building knowledge and skills, then applying and tracking the behaviors we’ve enabled,” Friedman says. “Viva apps are really well designed for each of these steps.”

The resulting combination of HR’s organizational expertise, the product group’s people science insights, and Microsoft Digital’s time-tested change management processes helped the team develop an effective and multifaceted adoption strategy enhanced by Viva.

Laura Luethe

Change and adoption communication is a well-established discipline that relies on both centralized campaigns and more localized, intra-departmental efforts. Above and beyond Microsoft Digital’s company-wide Copilot communications, HR’s internal adoption leaders actively construct campaigns specific to their organization’s needs.

In addition to running these campaigns using Viva Amplify, the Viva suite’s organizational communications app, change leaders can deploy adoption material through HR’s own Copilot sponsors and champions. This gives them the opportunity to influence their communities on whatever channels feel most natural: Teams, Outlook, or Viva Engage. This approach captures the benefits of an organizationally aligned, carefully crafted narrative while capitalizing on the reach and trust that employees’ leaders and peers inspire.

The ongoing HR AI Roundup is one example of a Viva Amplify campaign. It provides a monthly update on HR’s goals and progress with Copilot adoption, shares new features and capabilities, and offers clear actions employees can take to further their usage.

As this work continues, the HR adoption team is learning from their initial experiments with Viva Amplify. The goal is to collaborate with sponsors and champions to disseminate customized campaigns that provide precise analytics, contributing to continuous improvement. “Everyone can be an effective communicator with Viva Amplify,” says Laura Luethe, a director of communications on the Microsoft HR AI Transformation team. “It combines the capabilities of corporate communicators with the ability to tailor messaging to uniquely relevant audiences.”


Amia Randazzo

There isn’t one right way to learn, so HR accommodates a diverse array of learning styles. Viva Learning provides opportunities for both self-directed and group learning as components of their Copilot skilling offerings.

Aside from providing multiple paths to learning and development, one of the principal upskilling challenges is the location and discoverability of content. In that context, Viva Learning’s flexibility and ability to pull learning material from multiple sources is a major asset.

“People are hungry for skilling opportunities for their specific disciplines,” says Amia Randazzo, a learning and development partner in Microsoft HR. “Viva Learning supports that desire by providing one central location for all their learning needs.”

The HR team assembled a set of discipline-specific learning materials to create the AI for HR Academy. It contains the mainstay AI for HR learning path, as well as other modules that include discipline-specific content, material on systems thinking, and more. For people who learn best alongside their peers, there are also opportunities for collaborative skilling activities like group trainings, community teams, or forums. With this academy, HR change leaders have a shareable core of learning material they can deploy alongside enablement communications, community blasts, or other activations.


Chris Owen

A human-centric approach to adoption respects that change is often community-based and meets people where they are—and that it works best when it’s fun. In an organization as large as Microsoft HR, with many people working in different locations or functioning on a hybrid model, a digital solution for community-based engagement is essential.

“There’s no watercooler big enough for our global community to gather around,” says Chris Owen, a senior program manager on the Microsoft HR AI Transformation team. “In that environment, Viva Engage becomes the one and only space where all of HR can meet regularly and exchange ideas.”

As a grassroots employee community app, Viva Engage enables peer-to-peer guidance and communication about Copilot. It also unlocks synergies between different Viva apps. Change leaders can introduce campaigns into Viva Engage by rallying employees to share pre-populated posts distributed through Viva Amplify.

HR’s Copilot Champs Community, a subset of our wider HR AI Community of Practice, is an essential part of those peer-to-peer change management initiatives. They share best practices for local outreach and support each other’s efforts to drive adoption within their specific disciplines, often through Viva Engage.

Although Viva Engage relies on community members to drive conversations forward, it also allows guidance from change leaders and community managers tasked with supporting the adoption. The tool provides just the right elevation for leaders to introduce change initiatives into the community, which members can adapt and socialize throughout their peer groups.

In one instance, the HR adoption team created a digital Copilot escape room designed to gamify learning about prompts for their HR AI Community of Practice members. Community members found the initiative so helpful that they recommended making it available to the wider organization, encouraging others to replicate the experience within their local teams.

One of the most valuable aspects of Viva Engage is the way it decentralizes the responsibility for leading change. By letting conversations unwind organically, it becomes a repository for knowledge sharing, content, and ongoing conversations about the best ways to use Copilot.

When employees learn together, share best practices, and ask questions, collective conversations unfold and people build valuable connections, which are especially helpful in a hybrid environment. That isn’t just a powerful adoption practice. It also boosts a sense of community and improves morale.

As one of the most effective vehicles for change management in HR, Viva Engage illustrates how social connections can be a powerful force for good.


David Spahr

Orderly data and the ability to process insights are essential to applying people science principles practically. We aren’t just interested in understanding what people are doing, but how they feel about it—uncovering the “why” behind adoption behavior. Employee signals are central to that goal.

As part of HR’s enterprise employee strategy, the team deploys biannual, organization-wide Signals surveys through Viva Glint to gauge sentiment across different topics. With the introduction of Copilot, they added questions about engagement with the new technology. HR’s AI adoption leaders work alongside the HR Business Intelligence team to use select data from the surveys to understand sentiment about AI and fine-tune their adoption approach.

The team has also worked to correlate results from questions about AI usage with metrics regarding “employee thriving,” the main statistic we use at Microsoft to understand the employee experience. To “thrive” is to be energized and empowered to do meaningful work. Based on recent survey results, the HR team is beginning to understand trends and sentiments around Copilot adoption and its impact on employee experiences.

By collating thriving metrics acquired through Viva Glint surveys with Copilot usage, the HR team has found that employees who use Copilot at least once every week are more likely to say they’re thriving and taking initiative to be productive.
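The kind of cohort comparison described above can be sketched in a few lines. This is a purely illustrative example: the record layout, field names, and numbers are invented, since the actual Viva Glint and Copilot usage schemas aren't described in the story.

```python
# Hypothetical sketch: comparing "thriving" survey scores between employees
# who use Copilot at least weekly and those who don't. All data is invented.
from statistics import mean

# Each record: (employee_id, copilot_sessions_per_week, thriving_score_1_to_5)
survey = [
    ("e1", 0, 3), ("e2", 1, 4), ("e3", 5, 5),
    ("e4", 0, 2), ("e5", 2, 4), ("e6", 3, 5),
]

# Split respondents into weekly users and everyone else.
weekly_users = [score for _, sessions, score in survey if sessions >= 1]
others = [score for _, sessions, score in survey if sessions < 1]

print(f"weekly users avg thriving: {mean(weekly_users):.2f}")
print(f"non-weekly avg thriving:   {mean(others):.2f}")
```

In practice this comparison would run over survey exports joined to usage telemetry, with proper controls, but the shape of the analysis (segment by usage, compare sentiment) is the same.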

That insight accomplishes two things: It validates further investment in Copilot adoption efforts, and it simultaneously demonstrates the value of Copilot to our employees.

“It’s all about energy, empowerment, and whether work is meaningful to employees,” says David Spahr, a director on the Microsoft HR AI Transformation team. “We aren’t just trying to understand the productivity and efficiency gains of AI—we want to see the ways it’s helping humans become more human.”


Microsoft 365 Copilot adoption is going strong in HR

Thanks to robust change management practices, cross-organizational collaboration, and Microsoft Viva initiatives grounded in people science and tailored to employee needs, HR’s adoption of Microsoft 365 Copilot has generated impressive results so far.

“HR has been a powerful adopter,” says Eric Wand, head of IT at Microsoft Canada. “This organization has been so committed to their Copilot journey; they’ve achieved many of their end-of-year adoption metrics in just the first few months.”

Microsoft Viva is a crucial part of HR’s process, and it continues to be a critical tool for advancing usage at scale. As they incorporate further Viva apps, HR will continue to fine-tune their adoption activities and find new ways to unlock even greater value for employees. The team is currently exploring ways to programmatize agile, decentralized Viva Pulse surveys according to people science principles and gain further visibility into employee sentiment and usage at the team level.

HR is also partnering with Microsoft Digital to track usage through the Copilot Dashboard in Viva Insights. The combination of qualitative and quantitative data will support even more effective change management as HR continues to deepen its Copilot adoption.

“Our job is to empower the people who empower the planet, and that means changing the way we change,” Friedman says. “The more we can do to meet people where they are and help enhance how they work, the more success we’ll have. So, the investments we’ve already made in employee experience areas—communications, skilling, and measurement—have given us a valuable head start at accelerating our own functional AI transformation efforts here in HR.”

Key takeaways

Here are some tips on how to use Microsoft Viva and Microsoft 365 Copilot to empower your employees: 

  • Build your change strategy first. Know what problems you’re trying to solve, what you’re trying to accomplish, and use Microsoft Viva as your partner in change management.
  • Don’t interrupt—enhance. Build change management activations into the flow of work.
  • Build a good virtual team that brings people together from different organizations, with different skill sets within your business.
  • Ensure sponsorship is active and visible by using Viva to communicate your strategy to your employees on a regular basis.
  • Start with the basics. Don’t feel like you need to be an expert in AI, and just land your foundational pieces around communications, skilling, and measurement.
  • Don’t be intimidated by the scope and possibility of the new product. Adopt a growth mindset and take bite-sized steps forward.
  • Community offers agility. Traditional learning and development may not be able to keep up with the pace of change, so let peer-to-peer enablement in different types of community settings take on some of the burden.

The post How Microsoft HR is using Viva and Microsoft 365 Copilot to empower our employees appeared first on Inside Track Blog.

]]>
16203
Transforming Microsoft’s enterprise IT infrastructure with AI http://approjects.co.za/?big=insidetrack/blog/transforming-microsofts-enterprise-it-infrastructure-with-ai/ Sun, 05 Jan 2025 21:58:00 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=13516 AI is changing everything at Microsoft, including our approach to core IT. We in Microsoft Digital, the company’s IT organization, are using the advent of generative AI to reexamine and transform our entire IT infrastructure. “We’ve crossed an important threshold with AI,” says Mark Sherwood, vice president of Infrastructure and Engineering Services in Microsoft Digital. […]

The post Transforming Microsoft’s enterprise IT infrastructure with AI appeared first on Inside Track Blog.

]]>
AI is changing everything at Microsoft, including our approach to core IT.

We in Microsoft Digital, the company’s IT organization, are using the advent of generative AI to reexamine and transform our entire IT infrastructure.

“We’ve crossed an important threshold with AI,” says Mark Sherwood, vice president of Infrastructure and Engineering Services in Microsoft Digital. “We’re now using it to transform all our core IT services, to make everything we do more efficient and secure.”

Sherwood and his team manage our core IT services, a massive enterprise IT estate that supports all of Microsoft’s business worldwide. Microsoft is an expansive universe of connected devices made up of hundreds of thousands of PCs and laptops, conference rooms, building IoT sensors, and personal devices—all dependent on a foundation of network connectivity and security to enable seamless access to the tools and services our employees rely on every day.

It’s clear that AI brings immense value to our IT infrastructure.

“This is a fascinating time to be working in IT,” Sherwood says. “We’re using AI across all of our services, and now we get to take that investment to the next level. Now it’s all about seeing what we can do with it.”

Aligning IT infrastructure innovation with the rest of the organization

The strategy for AI transformation in core IT infrastructure is one part of a larger vision for the impact of AI across all of Microsoft Digital.

“The potential for transformation through AI is nearly limitless,” says Natalie D’Hers, corporate vice president of Microsoft Digital. “We’re evaluating every service in our portfolio to consider how AI can improve outcomes, lower costs, and create a sustained competitive advantage for Microsoft and for our customers.”

We’re hyper-focused on our employee experience, and AI will be instrumental in shaping the future of how Microsoft employees interact with customers, the organization, and each other.

Transforming and securing our network and infrastructure

AI holds enormous potential across all of Microsoft Digital, but within IT infrastructure, the benefits of AI-enabled transformation play out across several specific pillars where we’re focusing our efforts: Device management, network infrastructure, tenant management, security, and the IT support experience.

Security

We can’t transform without adequate security. Properly implemented security controls and governance provide the secure foundation on which our engineering teams build solutions, and that security is especially relevant as we incorporate AI into our services and solutions.

Securing our network and endpoints is imperative, and our Zero Trust Networking efforts across our IT infrastructure provide essential protection against threats to our network security. AI will enhance the security and compliance of these efforts in our cloud and on-premises environments.

AI-based network assignment for devices will simplify network classification and provide more robust risk-based isolation, quarantining risky devices and reducing unwanted lateral movement across the network.

We’re automating access controls for our wired and wireless networks to improve security effectiveness. AI-infused processes for analyzing device vulnerabilities, detecting anomalous firewall traffic flow, and diagnosing other network incidents will play a critical role in our continued shift toward the internet as our primary network transport.
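The story doesn’t detail how anomalous traffic flows are detected, but a common starting point is comparing observed volumes against a historical baseline. Here’s a minimal z-score sketch under that assumption; the byte counts and threshold are invented for illustration:

```python
# Hypothetical sketch: flag a firewall traffic flow whose volume deviates
# sharply from its historical baseline. Data and threshold are illustrative.
from statistics import mean, stdev

# Invented hourly byte counts (in KB) for one firewall rule.
baseline = [1200, 1150, 1300, 1250, 1180, 1220, 1275, 1190]

def is_anomalous(observed: float, history: list, threshold: float = 3.0) -> bool:
    """Flag a flow more than `threshold` standard deviations from the
    historical mean."""
    mu, sigma = mean(history), stdev(history)
    return abs(observed - mu) > threshold * sigma

print(is_anomalous(1230, baseline))  # typical traffic
print(is_anomalous(9800, baseline))  # sudden spike
```

Production systems would use far richer features and learned models, but the principle of baselining normal behavior and flagging deviations is the same.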

We anticipate that AI-supplemented capability in Microsoft 365’s multi-tenant organization feature will help us meet our ever-changing network segmentation needs by maintaining tenant separation and enabling secure tenant cross-collaboration when required.

AI will help us manage third-party app access and revolutionize how we understand user interactions with applications across managed devices or SaaS platforms. We’ll increase access efficiency and reduce costs by capturing third-party app usage and needs more accurately, using AI to determine the how, why, and when of user access.

Intelligent infrastructure

Sherwood (left to right), Apple, Selvaraj, and Suver appear in a composite image.
Mark Sherwood (left to right), Pete Apple, Senthil Selvaraj, and Phil Suver were part of the team incorporating AI into Microsoft Digital’s vision for core IT.

Software-defined networking and infrastructure code are already transforming how we approach networking, but AI amplifies the benefits radically.

AI enables us to build data-driven intelligence into network infrastructure, engineering, and operations. AI-driven processes will help us eliminate configuration drift, comply with security policies, reduce operator errors, and efficiently respond to rapidly changing business needs.

We’re implementing AI-driven automation to simplify resource management and deployment, capitalizing on the flexibility provided by software-defined networking and infrastructure as code.

AI will assist with generating code designs, defining and managing network configurations, managing deployments, conducting pre- and post-deployment verifications, and assisting with change management over time. Near real-time streaming telemetry from network devices will form the foundation to guide operation and continuous improvement.
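As a toy illustration of the drift detection described above, a declarative desired configuration can be diffed against the state a device actually reports. The schema, keys, and values below are invented for the example, not our actual tooling:

```python
# Hypothetical sketch: detect configuration drift by diffing a desired
# (source-of-truth) network config against the state a device reports.
# All keys and values here are illustrative, not a real schema.

def find_drift(desired: dict, actual: dict) -> dict:
    """Return settings that differ, are missing, or are unexpected."""
    drift = {}
    for key, want in desired.items():
        have = actual.get(key)
        if have != want:
            drift[key] = {"expected": want, "found": have}
    # Settings present on the device but absent from the desired state.
    for key in actual.keys() - desired.keys():
        drift[key] = {"expected": None, "found": actual[key]}
    return drift

desired = {"vlan": 100, "acl": "corp-default", "snmp": "v3"}
actual = {"vlan": 100, "acl": "legacy-open", "ntp": "10.0.0.1"}

drift = find_drift(desired, actual)
# 'acl' differs, 'snmp' is missing, 'ntp' is unexpected; 'vlan' matches.
```

A real pipeline would feed findings like these into automated remediation and post-deployment verification rather than just reporting them.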

We’re improving network self-healing capabilities by using AI to detect and remediate network issues, creating a more reliable, resilient, and elastic network environment and reducing human intervention and potential for error.

One of our current projects is creating an AI-based assistant app for our direct engineering teams that mines and analyzes our current network infrastructure catalog, providing an advanced set of capabilities that supplement our engineers’ expertise in the field. The assistant app improves productivity and mitigation time for network infrastructure incidents. The AI component is trained on more than 200,000 prior incidents for anomaly detection and predictive analytics. We’re confident it will lead to a considerable reduction in network outages and maintenance costs.
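A simple stand-in for the kind of anomaly detection the assistant performs is a z-score check against a healthy baseline. The real system uses models trained on more than 200,000 incidents; the telemetry values here are invented:

```python
# Illustrative sketch only: flag anomalous network telemetry readings with
# a z-score threshold, a simple stand-in for trained anomaly-detection
# models. Baseline numbers are made up for the example.
from statistics import mean, stdev

def is_anomalous(history: list[float], reading: float, threshold: float = 3.0) -> bool:
    """Flag a reading more than `threshold` standard deviations from history."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return reading != mu
    return abs(reading - mu) / sigma > threshold

# Baseline latency samples (ms) from prior healthy periods.
baseline = [12.0, 11.5, 12.3, 11.8, 12.1, 11.9, 12.2, 12.0]
```

A 40 ms spike against this baseline would be flagged, while a 12.1 ms reading would not.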

Device management

With more than 1 million interconnected devices to manage, AI-powered capabilities will significantly improve our device management practices, with a focus on user and administrator workflows.

We’re implementing intelligent device recommendations to ensure our employees have the best tools to do their work. Building AI into a centralized device lifecycle management tool will create efficiencies in procurement, tracking, and responsible device recycling.

We’re designing AI-powered predictive maintenance and intelligent troubleshooting to reduce device-related issues significantly. AI-enabled device maintenance schedules and tasks will automate the device management process and reduce the load on our IT help desk by correcting device issues before they become user problems.

Across our vast scope of device management, many alerts and tickets contain information or fixes that our helpdesk engineers can use in other situations. We’re employing AI to generate device insights by analyzing a massive set of signals, including device configurations, network traffic, vulnerabilities, and user behavior. These insights will power more informed decisions across the device management portfolio, including device replacement, software updates, and capacity increases.
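To illustrate the idea of reusing fixes from prior tickets, a minimal sketch might match a new alert to past incidents by keyword overlap. The actual system uses far richer AI signal analysis, and the tickets below are made up:

```python
# Illustrative sketch: surface prior ticket fixes for a new alert by simple
# keyword overlap, a stand-in for AI-driven insight matching. The tickets
# and their fields are invented for the example.

def suggest_fixes(new_alert: str, tickets: list[dict], min_overlap: int = 2) -> list[str]:
    """Return fixes from past tickets sharing at least min_overlap keywords."""
    words = set(new_alert.lower().split())
    suggestions = []
    for t in tickets:
        overlap = words & set(t["summary"].lower().split())
        if len(overlap) >= min_overlap:
            suggestions.append(t["fix"])
    return suggestions

tickets = [
    {"summary": "docking station display flicker after update",
     "fix": "Roll back graphics driver to previous version"},
    {"summary": "vpn client fails on wireless network",
     "fix": "Reset network adapter profile"},
]
hits = suggest_fixes("external display flicker on docking station", tickets)
```

Here only the docking-station ticket shares enough keywords with the alert to have its fix suggested.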

We have more than 100,000 IoT devices on our corporate network. AI-automated IoT device registration will create more robust and efficient IoT device management, tracking, and security.

AI and machine learning will help us analyze aggregated meeting and call data for device monitoring across personal devices, Microsoft Teams meeting rooms, networks, IoT devices, and Microsoft 365, improving and safeguarding the user experience.

Tenant management

Our cloud tenants in Microsoft Azure, Microsoft 365, Dynamics 365, and the Power Platform are among those platforms’ largest and most complex implementations. Our internal implementation includes more than 205,000 Microsoft Teams, 534,000 SharePoint sites, 430,000 Microsoft Exchange mailboxes, 93,000 Power Apps, 5,000 Viva Engage communities, and a massive 25,000 Microsoft Azure subscriptions.

It’s a lot to manage, and AI will improve how we do it.

In tenants of our size, unmanaged assets can lead to unnecessary costs. Our asset compliance and lifecycle management processes will include an AI-powered compliance assistant that informs tenant users and owners, recommends assets for deletion, and proactively identifies areas of high risk for the tenant. Through the assistant, tenant admins gain an all-up view of compliance status and can investigate and resolve issues more granularly.
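As a rough sketch of what such a compliance assistant might do, stale, ownerless assets can be flagged for deletion review. The field names and thresholds here are assumptions for illustration, not the assistant's actual logic:

```python
# Hypothetical sketch: rank tenant assets for deletion review by staleness
# and ownership. Field names and the one-year threshold are assumptions.
from datetime import date

def recommend_for_deletion(assets: list[dict], today: date, stale_days: int = 365) -> list[str]:
    """Return names of unowned assets untouched for `stale_days` or more."""
    stale = []
    for a in assets:
        age = (today - a["last_modified"]).days
        if age >= stale_days and not a.get("owner"):
            stale.append(a["name"])
    return stale

assets = [
    {"name": "legacy-site", "last_modified": date(2023, 1, 10), "owner": None},
    {"name": "team-wiki", "last_modified": date(2025, 3, 2), "owner": "contoso\\amy"},
]
flagged = recommend_for_deletion(assets, today=date(2025, 6, 1))
```

The ownerless site untouched for over a year is flagged; the recently edited, owned wiki is not.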

AI is also simplifying and streamlining our license management processes. We adhere to precise rules and regulations, which result in complex access scenarios across different countries and regions. AI will bolster our ability to detect and remediate non-compliant tenants amidst this complexity.

IT support

We’re poised to transform how Microsoft employees interact with our support services using generative AI.

Our employees interact with Microsoft support services in a complex, global hybrid environment. Our self-help solution using Microsoft Azure OpenAI will enable contextual and human-like conversation and support in the employee’s local language. Our chat and incident summarization tools will use AI to summarize incidents and provide context when assisted support is necessary.

We’re infusing our support ticketing systems with AI capability for forecasting support requirements and proactively checking the health of devices to reduce issues and improve resource planning and response times.

Transforming our IT infrastructure as Customer Zero

As Customer Zero for Microsoft, we pilot and deploy new products and capabilities in our IT infrastructure before releasing them externally. Our scale, size, and knowledge of our products and services enable us to envision connected experiences across large enterprises, manage complex combinations of product use cases, and engineer solutions on top of our product platforms.

AI improves our role as Customer Zero by accelerating insights and improving time-to-value. We’re using AI capabilities to capture, review, analyze, and report on the most important and actionable insights from the Customer Zero experience. We’re also using AI to redevelop processes, regulatory compliance, security reviews, and deployment practices within the Customer Zero environment.

Looking forward

It’s almost impossible to envision a future for corporate IT infrastructure without AI. Our active planning for AI in our infrastructure is continually evolving, and we’ve only just begun our implementation. We’re positioning Microsoft to be a catalyst for innovation, and we’re committed to innovating with AI to streamline our IT operations.

“We will continue to infuse AI into every dimension of our enterprise portfolio,” Sherwood says. “We’ll continue to identify new opportunities for building AI-powered applications and services that improve how we deliver IT services to the company.”

By showcasing our progress with AI capabilities, we aim to transform our approach to AI internally here at Microsoft and to fuel a similar transformation across the IT sector.

Key Takeaways

Here are four important steps you can take to transform your IT infrastructure with AI:

  • Make device handling smarter with AI. Use AI to manage all devices better, helping to fix problems before they affect people and easing the workload for your IT team.
  • Use AI to improve the network. Integrate AI into the network system to make it more intelligent and more adaptable, which helps reduce downtime and facilitates faster and easier changes.
  • Manage cloud services better with AI. AI can help keep track of cloud services, ensuring everything is used properly and securely.
  • Boost security and helpdesk with AI. Enhance safety and helpdesk services using AI, leading to better network protection and quicker, more effective support for employees when they need it.

The post Transforming Microsoft’s enterprise IT infrastructure with AI appeared first on Inside Track Blog.

]]>
13516
Transforming our data culture with AI-ready data http://approjects.co.za/?big=insidetrack/blog/transforming-our-data-culture-with-ai-ready-data/ Thu, 05 Dec 2024 17:00:00 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=17761 IT organizations—at Microsoft and companies around the world—will never be the same thanks to AI. For all the benefits that AI and machine learning offer, one element we and companies like ours need to get right is data. After all, data is what’s powering the AI revolution. Here in Microsoft Digital, the company’s IT organization, […]

The post Transforming our data culture with AI-ready data appeared first on Inside Track Blog.

]]>
Microsoft Digital technical stories

IT organizations—at Microsoft and companies around the world—will never be the same thanks to AI.

For all the benefits that AI and machine learning offer, one element we and companies like ours need to get right is data. After all, data is what’s powering the AI revolution.

Here in Microsoft Digital, the company’s IT organization, getting our data to an AI-ready state is a fundamental imperative. As such, we’re focused on four key areas of data management: quality, governance, compliance, and infrastructure.

Understanding AI-ready data

AI-ready data is data that’s available, complete, accurate, and high quality. With AI-ready data, our data scientists and engineers are better equipped to locate, process, and govern the enterprise data that drives our organization.

A composite image of Pelland, Clement, and Dubuisson.
Our team that’s working to drive our adoption of AI with Microsoft 365 Copilot, Microsoft Purview, and Microsoft Fabric includes Patrice Pelland (left to right), Delphine Clement, and Edith Dubuisson.

Our days of assembling, cleaning, and massaging data each time we launch a data-driven project are gone. Using guidance from our Microsoft Digital Data Council, a multi-disciplinary team that’s responsible for defining data quality standards for Microsoft Digital, and our Microsoft Digital AI Center of Excellence (CoE), we enhance our data discoverability and documentation before we launch any new AI-powered product or experience.  

“Our customers understand that data is the fuel that powers IT,” says Patrice Pelland, partner engineering manager for Microsoft Digital. “By ensuring our employees have access to data that is complete and accurate and prioritizing good governance, Microsoft is embracing the generational change brought on by AI.”

AI is already having transformative impact globally. In Microsoft Digital, we’re driving internal adoption of Microsoft 365 Copilot in every division of the company to increase productivity, enhance creativity, and improve efficiency. The benefits are already being realized, but the fact remains that Copilot and other AI tools are only as good as the data that supports them. The last thing we want our employees to experience is inaccurate or incomplete answers from AI-generated content. Powering tools like Copilot with AI-ready data allows our employees to work confidently, knowing that they can trust the information they’re working with.

AI-ready data is all about ensuring secure access to the quality, accurate information employees need when they need it.

{Learn more about how we’re responding to the AI Revolution with an AI Center of Excellence.}

Enhancing data management with AI

Before we could truly realize the benefits of tools like Copilot, we needed to incorporate AI-ready data into the same data management and governance tools that many of you use: Microsoft Fabric and Microsoft Purview. For decades, the challenge for data analysts and engineers has been maintaining a consistently reliable "source of truth" despite inconsistent data quality, insufficient governance, and years of collecting data in silos. Fabric and Purview help resolve these issues.

Fabric is our unified data and AI platform that combines the best of Microsoft Power BI, Azure Synapse Analytics, and Azure Data Factory to create a single, unified software as a service (SaaS) solution. Part of our AI-ready data strategy includes embracing data-mesh architecture. By using Fabric’s data lake, OneLake, to connect to data from anywhere and work from the same copy across platforms, our data scientists and engineers are executing that strategy. Fabric’s ability to unify data sources provides data professionals with the AI-ready data they need, all in one SaaS experience.

“There is no good AI without a solid, curated data stack,” says Delphine Clement, a principal product manager for the Microsoft Purview product team. “Democratizing data unlocks the power of enterprise data by cataloging, curating, and certifying it, then making it available to employees.”

Purview is our primary tool for data governance and ensures the security and compliance of Microsoft’s data assets. Purview has been reimagined to provide an integrated SaaS solution to the practice of data governance for enterprise-wide users. Delivering AI-ready data is a priority for maximizing the effectiveness of Purview and tools like it.

In addition to providing a unified data catalog that helps us classify and identify defects in our enterprise data, Purview enables Microsoft Digital to safely manage our data estate by applying data sensitivity labels to all the digital assets that comprise our Microsoft 365 content estate. Copilot uses sensitivity labels, file permissions, and rights management services to ensure that private or sensitive data isn’t reasoned over and overexposed. Purview also helps us maintain an effective chain of custody for our digital assets with strong data loss protection (DLP) capabilities to help us catch the 1% case when sensitive data leaks from our environment. An effective data governance strategy powered by Microsoft Purview is essential to enabling Microsoft Digital to support Responsible AI at Microsoft.

Our everyday corporate functions like Microsoft HR and Corporate, External, and Legal Affairs (CELA) depend on Purview to provide accurate data to complete projects, whether they’re smaller in scope or large-scale initiatives. For example, the accuracy of legal data required to complete a brief for a court filing is essential. With Purview, our CELA teams know the information they’re working with is high quality, accurate, and complete.

{Explore how we’re transforming our data governance at Microsoft with Purview and Fabric.}

Accelerating time to value with powerful AI models

AI-ready data can fast-track value realization by leveraging powerful AI models. On Microsoft platforms, AI data model options for information retrieval and custom engine agents offer varying levels of flexibility and control.

Agents focused on knowledge or information retrieval are built using tools like Microsoft Copilot Studio and operate on our pre-configured AI models and orchestrators, which are the software layers that manage and coordinate the execution of tasks and services across multiple systems. This approach simplifies development by eliminating the need for organizations to manage their own AI infrastructure, as these agents utilize the Copilot engine to handle prompts and leverage foundational models. Additionally, retrieval agents have native access to indexed Microsoft Graph data, such as SharePoint and OneDrive files, enhancing their integration capabilities.

{Find out how we’re unlocking deeper AI value at Microsoft with Microsoft 365 Copilot extensibility.}

In contrast, custom agents provide organizations with the ability to integrate their own AI models, including models from Azure OpenAI or Azure AI Foundry. These agents—built using tools like the Teams Toolkit, Azure AI, and Microsoft Copilot Studio—can be tailored to specific domains or workflows. This approach allows for the use of custom foundational models and orchestrators, enabling specialized experiences that align closely with an organization’s unique requirements. However, this increased flexibility necessitates a greater level of security and compliance oversight, as organizations are responsible for managing and maintaining their custom AI infrastructure.

{Learn how we’re embracing this new ‘agentic’ moment at Microsoft.}

AI-ready data + Copilot

Microsoft Dynamics for Sales (MSX) and Microsoft Sales are our principal platforms for managing customer and sales data. MSX is the pipeline through which we manage the sales of Microsoft products. Microsoft 365 Copilot for Sales is already being used to improve the data quality and hygiene of MSX. Instead of sellers needing to manually update sales each month or clean up duplicate data, Copilot for Sales can do the work automatically, freeing employees to focus their time more strategically.

“There is a great opportunity for AI-ready data to help with data hygiene in tools like MSX and Microsoft Sales,” says Edith Dubuisson, senior business program manager for Employee Experience Success. “It can quickly organize account data to reflect the correct hierarchies and account parenting.”

Microsoft Sales is the database of all purchases from Microsoft. The amount of information is massive, and data quality is critical. Thanks to AI-ready data, in the future Copilot will assist with organizing the data associated with thousands of accounts, updating hierarchies and maintaining account contact information.

{See how we’re simplifying our sales with AI-powered Microsoft 365 Copilot for Sales.}

Accelerating corporate functions growth

All corporate functions are being asked to do more with less because they can no longer afford to grow operational costs linearly with top-line revenue or employee count. AI tools, powered by AI-ready data, will play a fundamental role in transforming corporate functions’ workflows while improving operational efficiency, user productivity, regulatory and corporate compliance, and data-driven decision making.

Human Resources agents will be empowered to summarize support cases, find answers to user inquiries, and craft email responses faster and more effectively using AI tools backed by AI-ready data. Legal professionals in CELA will draw on AI-ready data within CELA’s workflows to provide swift access to legal findings by consolidating trusted knowledge assets across diverse data sources. Global Workplace Services (GWS), our facilities management team, will use AI-ready data to forecast occupancy and make real-estate portfolio recommendations based on complete and accurate information.

{Learn how AI is revolutionizing the way we support corporate functions at Microsoft.}

Key Takeaways

Democratizing access to enterprise data, powered by AI, is a strategic imperative for Microsoft. We’re focused on delivering a strong data culture that prioritizes data quality, infrastructure, and governance. Emphasizing AI-ready data to power our data and AI solutions ensures that Microsoft meets the needs of the company, customers, and employees.

Here are some tips for getting started with getting your data AI-ready:

  • Identify and assign enterprise data owners to implement and oversee the processes that guarantee data quality.
  • Verify and document existing data sources to understand where datasets need to be connected across domains.
  • Ensure strategic governance by using tools like Microsoft Purview to focus on the origin, sensitivity, and lifecycle of your enterprise data.
  • Enterprise data is one of your most valuable assets. Form a data council to help promote a data culture to ensure your data is AI-ready.

The post Transforming our data culture with AI-ready data appeared first on Inside Track Blog.

]]>
17761
Boosting Microsoft 365 Copilot Chat with smart enterprise content management http://approjects.co.za/?big=insidetrack/blog/boosting-business-chat-in-microsoft-365-copilot-with-smart-enterprise-content-management/ Thu, 31 Oct 2024 16:00:00 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=17337 When our employees look for content internally here at Microsoft, they go to Microsoft 365 Copilot Chat first. With Copilot Chat, they can easily get answers to questions, catch up on missed work, generate new ideas, and more by tapping into the work data that they have access to via Microsoft Graph. An employee might […]

The post Boosting Microsoft 365 Copilot Chat with smart enterprise content management appeared first on Inside Track Blog.

]]>
Microsoft Digital technical stories

When our employees look for content internally here at Microsoft, they go to Microsoft 365 Copilot Chat first. With Copilot Chat, they can easily get answers to questions, catch up on missed work, generate new ideas, and more by tapping into the work data that they have access to via Microsoft Graph.

An employee might ask, “Can you tell me how I can learn more about AI in health care and who the experts are in the company?”

Whether they ask in Microsoft Teams or another Microsoft 365 app, or right in their browser, they likely will get a helpful, accurate response very specific to the health care sector. The answer could refer to an AI industry PowerPoint presentation, articles on responsible AI strategies, Microsoft Research publications, or a list of employees who are experts on the topic.

But how does Copilot know how to reference the AI industry PowerPoint presentation for health care? How does it know which versions of the responsible AI strategies for health care articles to use? How does it identify experts in the company?

It’s because Copilot connects to all the content on the topic available through Microsoft Graph.

“Our internal Microsoft content is the content Copilot uses to generate its results,” says Dodd Willingham, a principal program manager and internal search administrator in Microsoft Digital, the IT organization at Microsoft. “How Copilot consumes and uses our content determines the success—or failure—of Copilot for our employees.”

Enabling useful results

Johnson, Willingham, Sanchez Almaguer, and Liu appear in a composite image.
David Johnson (left to right), Dodd Willingham, Rene Sanchez Almaguer, and Stan Liu, are part of the team that’s responsible for ensuring Microsoft Digital’s content management capabilities are ready to efficiently support Microsoft 365 Copilot Chat.

When it comes to returning the right content with Copilot, context is key.

Copilot uses the capabilities of Microsoft Graph to power its AI-generated results. For Microsoft Digital—like most organizations—that includes the content our users store and work with in our Microsoft 365 tenant. Results from Copilot directly depend on the quality of the content it uses. There’s an enormous opportunity to increase the capabilities of Copilot-based solutions when the underlying content is of high quality, and we’re seizing that opportunity to get this right internally at Microsoft.

“You hear a lot of people talking about Copilot, but few address the importance of improving content quality,” says Stan Liu, a senior product manager and knowledge management lead with Microsoft Digital. “The quality of an organization’s content management must be considered in every implementation of Copilot, and we’re doing some great things at Microsoft Digital to ensure Copilot returns accurate, relevant, and up-to-date responses.”

It’s an exciting time to be in content management, and we’re excited to share how our team in Microsoft Digital has met and addressed the challenges of preparing our content for a bright future with Microsoft 365 Copilot.

Curating enterprise content for Microsoft 365 Copilot

There’s an urgency for organizations to bring advanced AI tools to their employees, but it must be done thoughtfully and with good intentions. One of the fundamental challenges in implementing generative AI technologies like Copilot is balancing the desire to move quickly against the need for caution with technology whose risks aren’t yet fully understood.

An infographic displaying relevant statistics about the Microsoft enterprise content management environment.
The enterprise content management landscape at Microsoft.

“A lot of what we do lies in managing our content in a way that aligns with company strategy, and Copilot isn’t any different in that respect,” says David Johnson, a tenant and compliance architect in Microsoft Digital who ensures that the company’s content is well governed. “It’s important that Microsoft employees understand why content management is important and how they can help do it well.”

To be effective, we must lean into our ongoing culture shift to embrace knowledge sharing. We’ve been fostering a knowledge-sharing culture at Microsoft for years, and our adoption of Copilot has magnified the importance of that culture and the need to continue driving awareness and education for Microsoft employees.

Liu and his team are prioritizing this culture transformation.

“You need to build and encourage a culture that embraces user-driven content management,” Liu says. “Teams that document their knowledge, follow a content lifecycle in their workflows, and manage content consistently across the company are contributing hugely to what we’re trying to accomplish.”

It’s a two-fold challenge that involves encouraging and supporting our employees in collaboration and sharing and ensuring that the tools they use—including Copilot—provide results they can trust and use confidently.

“We’ve set goals within our organization to make Copilot a daily habit,” Liu says. “Community engagement and participation is a significant part of Copilot adoption, and we’ve been identifying use cases and success stories across Microsoft to share, inspiring our employees and encouraging adoption and innovation.”

Next generation content management with SharePoint

Microsoft SharePoint is a critical part of our content management strategy to get the most out of Copilot. We’re using the AI capabilities in SharePoint to give employees access to simple and capable content management tools.

SharePoint helps our Microsoft Digital enterprise content team ensure the right capabilities are in place to help people manage content. Missing metadata is a common issue with content management, and SharePoint now makes it easier for users and administrators to add metadata and classify and organize content.

SharePoint now brings AI, automation, and added security to our content experiences, processing, and governance. The product delivers new ways to engage with our most critical content and prepare it for Copilot, helping us manage and protect it throughout its lifecycle.

Automating metadata extraction with document processing

The document processing capabilities in SharePoint simplify and automate the process of extracting important information from existing content. Liu’s team helped deploy the document processing capabilities across Microsoft to enable teams to automate processing of important content, such as contracts, invoices, and statements of work.

Document processing uses machine learning models to recognize documents and the structured information within them. Using optical character recognition (OCR) combined with deep learning models, it identifies and extracts predefined text and data fields common to specific document types, including contracts, invoices, and receipts. It also supports the detection and extraction of sensitive data such as personal and financial identification.
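As a simplified stand-in for that extraction step, predefined fields can be pulled from already-OCR'd text with regular expressions. The real pipeline uses OCR combined with deep learning models; the patterns and sample document below are illustrative only:

```python
# Minimal stand-in for document field extraction: pull common invoice
# fields from OCR'd plain text with regular expressions. The real system
# uses deep learning models; these patterns are illustrative only.
import re

def extract_invoice_fields(text: str) -> dict:
    """Extract invoice number, date, and total from plain text."""
    patterns = {
        "invoice_number": r"Invoice\s*#?\s*:?\s*(\w[\w-]*)",
        "date": r"Date\s*:?\s*(\d{4}-\d{2}-\d{2})",
        "total": r"Total\s*:?\s*\$?([\d,]+\.\d{2})",
    }
    fields = {}
    for name, pattern in patterns.items():
        m = re.search(pattern, text, re.IGNORECASE)
        fields[name] = m.group(1) if m else None
    return fields

sample = "Invoice #: INV-0042\nDate: 2024-05-01\nTotal: $1,250.00"
fields = extract_invoice_fields(sample)
```

The extracted values would then populate metadata columns in SharePoint, making the document searchable and usable by Copilot.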

Liu’s team is using prebuilt and custom document processing models to automatically populate metadata columns in SharePoint for many different document types. The metadata this processing provides improves search and creates a more complete understanding of what the files contain, so Copilot can recognize and return relevant information that was previously incomplete or unavailable.

“We’re capturing information across a plethora of documents automatically and almost none of it was recorded previously,” Liu says. “Some of our business groups were entering the metadata manually, but it was a time-consuming and expensive process. For most documents, it just wasn’t done. It’s a massive difference-maker in finding information about a specific contract or invoice that would have been close to impossible. By combining SharePoint with the power of Copilot, it’s a simple question away.”

Standardizing content creation with content assembly

Liu’s team enabled the content assembly feature of SharePoint across the company to simplify document creation and ensure that new documents follow metadata and structure guidelines.

Content assembly creates modern templates that can be easily maintained and used to generate repetitive documents quickly. This feature is particularly useful for departments like finance, where templates for partner letters or contracts are frequently needed. By using content assembly, teams can reduce the time spent on template management and document generation, as it allows for the creation of documents with just the key parts needing changes.

While the time-saving benefits of content assembly don’t directly affect Copilot results, they do encourage users to create better documents, eliminating the need to rewrite entire documents or repeatedly upload the same document. These documents—created using modern templates—are significantly more discoverable and classifiable and lead to more authoritative answers in Copilot.

Taxonomy tagging

Liu oversees a team that has been managing the company’s corporate taxonomy in the SharePoint term store for many years, maintaining a hierarchy of terms that can be reused throughout the SharePoint environment. The term store helps ensure that SharePoint metadata is consistent across the organization, and it provides employees with a standard set of choices when populating commonly used metadata such as products, job roles, or departments.

Taxonomy tagging in SharePoint automatically tags documents in SharePoint libraries with terms configured in the term store using AI. As at other companies, we face the ongoing challenge of getting employees to tag content. Even when managed metadata is set up in a document library, employees often don’t use it, which means the content is never enriched with that metadata.

With taxonomy tagging, you set it and forget it. Uploaded content is automatically tagged, which does the job that a person would typically do, but often never does. This automated process ensures that documents get one or more metadata columns populated with standard terms from the term store based on the document content. This leads to more complete metadata for documents and more authoritative results for Copilot results when referencing data in those documents.
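The mechanics of taxonomy tagging can be sketched as matching document text against a term store and returning the standard terms to apply. The term store fragment below is invented, and SharePoint's actual tagging uses AI rather than simple substring matching:

```python
# Simplified sketch of taxonomy tagging: match document text against a
# term-store hierarchy and return the standard terms to apply as metadata.
# The term store below is a made-up fragment, not a real taxonomy.

TERM_STORE = {
    "Products": ["SharePoint", "Teams", "Copilot"],
    "Departments": ["Finance", "Legal", "Human Resources"],
}

def tag_document(text: str) -> dict:
    """Return term-store terms found in the document, grouped by term set."""
    lowered = text.lower()
    tags = {}
    for term_set, terms in TERM_STORE.items():
        found = [t for t in terms if t.lower() in lowered]
        if found:
            tags[term_set] = found
    return tags

doc = "Quarterly Finance review of Copilot adoption across SharePoint sites"
tags = tag_document(doc)
```

Because the tags come from the term store, every document ends up labeled with the same standard terms, which is what makes the resulting metadata useful to Copilot.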

Using generative AI to help generative AI with autofill columns

Autofill columns in SharePoint takes content management to the next level by harnessing AI large language models (LLMs) to automatically extract, summarize, and generate content from files uploaded to your SharePoint document library. This feature allows users to set up a simple natural language prompt on a column in SharePoint that extracts specific information or generates content from files. The extracted information is then displayed in the columns of the library, making it easier to manage and analyze data.

Liu has a lot to say about how his team is transforming document processing with autofill columns in SharePoint.

“Autofill columns are a game-changer for enhancing productivity in Copilot,” Liu says. “They also ensure that our documents have the necessary context for efficient retrieval and use. Autofill’s impact on our metadata within SharePoint document libraries is huge.”

Teams within Microsoft have started setting up new and existing columns with prompts to identify the types of information to extract from a file. These prompts can be customized and tested to ensure that they provide the desired results. After the autofill columns are set up, any new files uploaded to the library are automatically processed (and existing documents can be manually processed), and the result of the prompt is saved to the corresponding columns.

This approach not only streamlines document processing workflows but also enhances the overall efficiency and accuracy of their data management practices, making Copilot even more powerful and effective.

Continuing to grow with SharePoint

Liu’s team continues to drive SharePoint as a crucial part of their content management toolkit.

“We’re seeing immediate and significant benefits from using SharePoint and its AI features to manage our content,” Liu says. “In the first half of 2024, we estimated that our employees saved more than 120,000 hours in processing documents, pages, and images across the company for over 1,000,000 content items in our environment. It’s a great start, and we’re targeting even greater improvements across more content soon.”

Protecting content with Microsoft Purview Information Protection

Microsoft Purview Information Protection provides a wide range of content governance capabilities that help us discover, classify, and protect sensitive information wherever it resides or moves in the Microsoft tenant.

We use Purview Information Protection tools to identify sensitive content using expressions, functions, and trainable classifiers. With these tools, our enterprise data teams and employees can use corroborative evidence like keywords, confidence levels, and proximity to identify sensitive information types. They can also use examples of sensitive content to train recognition engines on expected patterns. All of this helps to better inform Copilot regarding the relevance of our Microsoft 365 content.
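The corroborative-evidence approach, a pattern match confirmed by nearby keywords, can be sketched in a few lines. The pattern, keywords, and proximity window below are illustrative examples only, not Purview’s actual sensitive information type definitions.

```python
# Sketch of a sensitive information type: a pattern (here, a US
# SSN-like number) only counts as a match when a supporting keyword
# appears within a proximity window of surrounding characters.
import re

PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
KEYWORDS = ("ssn", "social security")
PROXIMITY = 60  # characters of context to inspect on each side

def find_sensitive(text: str) -> list:
    """Return pattern matches that have corroborating keyword evidence."""
    hits = []
    for m in PATTERN.finditer(text):
        window = text[max(0, m.start() - PROXIMITY): m.end() + PROXIMITY].lower()
        if any(k in window for k in KEYWORDS):
            hits.append(m.group())
    return hits
```

The proximity requirement is what keeps false positives down: a nine-digit tracking number with no nearby keyword evidence is ignored, while the same digits next to “SSN” are flagged.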

We use sensitivity labels in Purview to apply flexible protection actions that include encryption, access restrictions, and visual markings. This capability ties in nicely with SharePoint, which also uses and applies sensitivity labels.

Purview sensitivity labels provide a single labeling solution across apps, services, and devices to protect content as it travels inside and outside our organization. Purview sensitivity labels can be applied to Microsoft Office documents, third-party document types, meetings, chats, and the broader Microsoft 365 environment.

The sensitivity labels that we use to protect our content are recognized and used by Copilot to provide an extra layer of protection. For example, in Copilot Chat conversations, which can reference content from different types of items, the sensitivity label with the highest priority (typically, the most restrictive label) is visible to users. If the labels apply encryption from Microsoft Purview Information Protection, Copilot checks the usage rights for the user and only returns content that the user has access rights to.
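That behavior can be sketched as follows. The label names, priority ordering, and access model here are simplified assumptions for illustration, not Copilot’s internal logic.

```python
# Sketch of the behavior described above: among items referenced in a
# chat response, surface the highest-priority (most restrictive) label,
# and return only items the user holds usage rights to.

LABEL_PRIORITY = {"Public": 0, "General": 1, "Confidential": 2, "Highly Confidential": 3}

def answer_sources(items: list, user: str):
    """items: [{'name': ..., 'label': ..., 'readers': set_of_users}, ...]"""
    allowed = [i for i in items if user in i["readers"]]
    if not allowed:
        return [], None
    # The label shown for the conversation is the most restrictive one
    # among the items the user can actually access.
    shown_label = max((i["label"] for i in allowed), key=LABEL_PRIORITY.get)
    return [i["name"] for i in allowed], shown_label
```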

Looking forward

Our enterprise content management transformation is ongoing. Our teams are looking at new content management capabilities across the company to ensure Copilot continues to provide current, accurate, and relevant results for our employees.

We’re continually evaluating our enterprise content management to identify new ways to create a Copilot-assisted workday for Microsoft employees. We’re also evaluating new technology, organizational practices, and industry standards as we strive to set the standard for how an organization can capture maximum value from its content using Copilot.

We’re currently working with the SharePoint product team to grow the AI-based capabilities for content management and classification. We’re experimenting with our own solutions and capabilities in SharePoint that will lead to the next generation of AI-supporting features that drive innovation and creativity here at Microsoft and for our customers.

Key Takeaways

Are you looking to prepare your enterprise content for Copilot and AI? Here are a few suggestions:

  • Pursue content quality. Ensure that the content is current, accurate, and relevant. This is crucial for Copilot to provide authoritative answers and maintain user trust.
  • Promote knowledge sharing. Foster a culture of knowledge sharing within the organization. Encourage teams to document their knowledge, follow a content lifecycle in their workflows, and manage content consistently across the company.
  • Use SharePoint. The AI capabilities in SharePoint can help you simplify and automate content management processes.
  • Implement Purview Information Protection. Use Purview Information Protection tools to apply sensitivity labels to ensure content is protected as it travels inside and outside the organization.
  • Prepare for future enhancements. Stay updated with ongoing transformations in enterprise content management and Copilot capabilities.

The post Boosting Microsoft 365 Copilot Chat with smart enterprise content management appeared first on Inside Track Blog.

]]>
17337
Protecting against oversharing Power BI reports with Microsoft Sentinel http://approjects.co.za/?big=insidetrack/blog/protecting-against-oversharing-power-bi-reports-with-microsoft-sentinel/ Mon, 08 Jan 2024 23:12:52 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=12951 Microsoft Power BI is an essential tool for monitoring performance, identifying trends, and developing stunning data visualizations that many teams across Microsoft use every day. A well-built Power BI report can play a critical role in helping communicate business information efficiently and effectively. But with great Power BI reports comes great responsibility, which includes keeping […]

The post Protecting against oversharing Power BI reports with Microsoft Sentinel appeared first on Inside Track Blog.

]]>
Microsoft Power BI is an essential tool for monitoring performance, identifying trends, and developing stunning data visualizations that many teams across Microsoft use every day. A well-built Power BI report can play a critical role in helping communicate business information efficiently and effectively. But with great Power BI reports comes great responsibility, which includes keeping data and reports secure, and ensuring that only the right people have access to it.

Across Microsoft, we use Microsoft Purview Data Loss Prevention (DLP), which is now in general availability, to help secure our data. Purview DLP policies allow administrators to comply with governmental and industry regulations such as the European Union General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA), and automatically detect sensitive information to prevent data leaks. These policies can now also uncover data that might have accidentally been uploaded to Power BI without your knowledge.

While Purview’s controls ensure sensitive data is handled appropriately, we learned from customer research that sensitive data can be accidentally overshared with unauthorized individuals when large audience groups are inadvertently granted access to the report. This often happens when report owners grant access to Power BI reports without first checking who is authorized to view them—both inside and outside data boundaries.

We wanted to find a solution that would prevent this kind of unintentional oversharing and make it easy for Power BI administrators to set up, use, and configure.

— Prathiba Enjeti, senior program manager, Microsoft Digital Security and Resilience team

To address this problem, Microsoft Digital Security and Resilience collaborated with the Microsoft Sentinel product group to develop an out-of-the-box Microsoft Sentinel solution for Power BI reports to detect and respond to oversharing. Using the Power BI connector for Microsoft Sentinel, which is now available in preview, you can track user activity in your Power BI environment with Microsoft Sentinel using Power BI audit logs. This solution helps administrators to identify potential data leaks with automatically generated reports.

How it works

With Microsoft Sentinel playbook automation for Power BI detection, the SOC can achieve higher productivity and efficiency, saving analysts’ time and energy for investigative tasks.

— Prathiba Enjeti, senior program manager, Microsoft Digital Security and Resilience team

Prathiba Enjeti is a senior security program manager on the Microsoft Security Standards and Configuration team.

Our oversharing detection logic uses Power BI audit logs, which are cross-referenced against Microsoft Sentinel-generated watchlists that track high-risk security groups. When a report is shared with a group that exceeds a specified number of users, the detection is triggered. Thresholds can be adjusted by administrators to suit any organization’s needs and policies.
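In pseudocode terms, the detection reduces to a watchlist lookup plus a threshold check. The field names, watchlist shape, and default threshold below are illustrative, not the actual Sentinel schema (real analytics rules would express this in KQL against the audit log tables).

```python
# Sketch of the detection logic described above: a share event is
# flagged when the target group appears on the high-risk watchlist and
# its membership exceeds a configurable threshold.

def flag_oversharing(events: list, watchlist: dict, threshold: int = 100) -> list:
    """events: [{'report': ..., 'group': ...}]; watchlist: {group: member_count}."""
    return [
        e for e in events
        if e["group"] in watchlist and watchlist[e["group"]] > threshold
    ]
```

Because the threshold is a parameter rather than a constant, administrators can tune it to match their own policies, as the article notes.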

Additionally, we used the Microsoft Sentinel playbook to automate the remediation process. We configured it to automatically send email notifications containing remediation instructions to report owners. From our discussions with customers, we learned that some organizations preferred that accountability remain with the Power BI report owners for various periods of time to remediate, before escalating to the tenant administrators. To meet customer needs for flexibility, administrators can configure time spans ranging from instantaneous escalation, to hours, days, and weeks.
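The configurable escalation window can be sketched as a small state check run by the playbook. The default grace period and action names here are assumptions for illustration, not the playbook’s actual configuration.

```python
# Sketch of escalation timing: notify the report owner immediately on
# detection, then escalate to tenant administrators once a configurable
# grace period elapses without remediation.
from datetime import datetime, timedelta

def next_action(detected_at: datetime, now: datetime, remediated: bool,
                grace: timedelta = timedelta(hours=24)) -> str:
    if remediated:
        return "close"
    if now - detected_at >= grace:
        return "escalate-to-admin"
    return "await-owner-remediation"
```

Setting `grace=timedelta(0)` models the instantaneous-escalation option, while hours, days, or weeks model the accountability windows some customers asked for.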

“With Microsoft Sentinel playbook automation for Power BI detection, the SOC can achieve higher productivity and efficiency, saving analysts’ time and energy for investigative tasks,” Enjeti says.

Automating how cases of data oversharing are found and fixed will allow IT administrators to detect, notify, and limit access to Power BI reports in real time. We’re excited to bring this Microsoft Sentinel solution to our customers, which will be available for public release soon.

Key Takeaways

Here are some suggestions for tackling oversharing at your company:

  • Oversharing of data is a problem that many organizations face. They might not be aware of the magnitude of the problem. If you don’t already, consider auditing distribution and security groups used by employees to share information.
  • Understand where potential data loss issues might be occurring. Be sure to enable data loss prevention policies wherever possible.
  • Consider implementing detections and automated workflows solutions such as the Microsoft Sentinel solution for Power BI reports oversharing to reduce manual effort and reduce time to identify and remediate oversharing.

Try it out

Try Microsoft Sentinel at your company.

Related links

We'd like to hear from you!

Want more information? Email us and include a link to this story and we’ll get back to you.

Please share your feedback with us—take our survey and let us know what kind of content is most useful to you.

The post Protecting against oversharing Power BI reports with Microsoft Sentinel appeared first on Inside Track Blog.

]]>
12951
Transforming data governance at Microsoft with Microsoft Purview and Microsoft Fabric http://approjects.co.za/?big=insidetrack/blog/transforming-data-governance-at-microsoft-with-microsoft-purview-and-microsoft-fabric/ Tue, 19 Sep 2023 18:40:34 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=12172 Data is an invaluable asset for all businesses. Over recent years, the exponential growth of data collection and ingestion has forced most organizations to rethink their strategies for managing data. Increasing compliance requirements and ever-changing technology prevent anyone from simply leaving their enterprise data in its current state. We’re accelerating our digital transformation with an […]

The post Transforming data governance at Microsoft with Microsoft Purview and Microsoft Fabric appeared first on Inside Track Blog.

]]>
Data is an invaluable asset for all businesses. Over recent years, the exponential growth of data collection and ingestion has forced most organizations to rethink their strategies for managing data. Increasing compliance requirements and ever-changing technology prevent anyone from simply leaving their enterprise data in its current state.

We’re accelerating our digital transformation with an enterprise data platform built on Microsoft Purview and Microsoft Fabric. Our solution addresses three essential layers of data transformation:

  • Unifying data with an analytics foundation
  • Responsibly democratizing data with data governance
  • Scaling transformative outcomes with intelligent applications

As a result, we’re creating agile, regulated, and business-focused data experiences across the organization that accelerate our digital transformation.

[Unpack how we’re deploying a modern data governance strategy internally at Microsoft. Explore how we’re providing modern data transfer and storage service at Microsoft with Microsoft Azure. Discover how we’re modernizing enterprise integration services at Microsoft with Microsoft Azure.]

Accelerating responsible digital transformation

Digital transformation in today’s world is not optional. An ever-evolving set of customer expectations and an increasingly competitive marketplace prohibit organizations from operating with static business practices. Organizations must constantly adapt to create business resilience, improve decision-making, and increase cost savings.

Data is the fuel for digital transformation. The capability of any organization to transform is directly tied to how effectively they can generate, manage, and consume their data. These data processes—precisely like the broader digital transformation they enable—must also transform to meet the organization’s needs.

The Enterprise Data team at Microsoft Digital builds and operates the systems that power Microsoft’s data estate. We’re well on our way into a journey toward responsibly democratizing the data that drives global business and operations for Microsoft. We want to share our journey and give other organizations a foundation—and hopefully a starting point—for enabling their enterprise data transformation.

Seizing the opportunity for data transformation

Data transformation focuses on creating business value. Like any other organization, business value drives most of what we do. As Microsoft has grown and evolved, so has our data estate.

Our data was in silos. Various parts of the organization were managing their data in different ways, and our data wasn’t connected.

—Damon Buono, head of enterprise governance, Microsoft

At the genesis of our data transformation, we were in the same situation many organizations find themselves in. Digital transformation was a top priority for the business, and our data estate couldn’t provide the results or operate with the agility the business required.

We felt stuck between two opposing forces: maintaining controls and governance that helped secure our data and the pressure from the business to move fast and transform our data estate operations to meet evolving needs.

“Our data was in silos,” says Damon Buono, head of enterprise governance for Microsoft. “Various parts of the organization were managing their data in different ways, and our data wasn’t connected.”

As a result, a complete perspective on enterprise-wide data wasn’t readily available. It was hard to implement controls and governance across these silos, and governance always felt like it was slowing us down, preventing us from supporting digital transformation at Microsoft at the required pace.

“We needed a shared data catalog to democratize data responsibly across the company,” Buono says.

Transforming data: unify, democratize, and create value

Transforming our data estate fundamentally disrupted how we think about and manage data at Microsoft. With our approach, examining data at the top-level organization became the default, and we began to view governance as an accelerator of our transformation, not a blocker. As a result of these two fundamental changes, our data’s lofty, aspirational state became achievable, and we immediately began creating business value.

Our enterprise data platform is built on three essential layers of data transformation: unifying data with an analytics foundation, responsibly democratizing data with data governance, and scaling transformative outcomes with intelligent applications.

Unifying data with an analytics foundation

Establishing and adopting strong governance standards has helped Microsoft democratize access to data, says Damon Buono, head of enterprise governance for Microsoft. “When data is adequately democratized—safely accessible by everyone who should access it—transformation is accelerated,” Buono says.

Unified data is useful and effective data. Before our data transformation, we recognized the need to unify the many data silos present in the organization. Like many businesses, our data has evolved organically. Changes over the years to business practices, data storage technology, and data consumption led to increased inefficiencies in overall data use.

Analytics are foundational to the remainder of the data transformation journey. Without a solid and well-established analytics foundation, it’s impossible to implement the rest of the data transformation layers. A more centralized source of truth for enterprise data creates a comprehensive starting point for governance and creating business value with scalable applications.

With Microsoft Fabric at the core, our analytics foundation unifies data across the organization and allows us to do more with less, which, in turn, decreases data redundancy, increases data consistency, and reduces shadow IT risks and inefficiencies.

“It connects enterprise data across multiple data sources and internal organizations to create a comprehensive perspective on enterprise data,” Buono says.

Microsoft Fabric ensures that we’re all speaking the same data language. Whether we’re pulling data from Microsoft Azure, multi-cloud, or our on-premises servers, we can be confident that our analytics tools can interpret that data consistently.

Functionally, this reduces integration and operation costs and creates a predictable and transparent operational model. The unity and visibility of the analytics foundation then provide the basis for the rest of the transformation, beginning with governance.

Responsibly democratizing data with data governance

Data can be a transformative asset to the organization through responsible democratization. The goal is to accelerate the business through accessibility and availability. Democratizing data is at the center of our governance strategy. Data governance plays an active role in data protection and complements the defensive posture of security and compliance. With effective governance controls, all employees can access the data they need to make informed decisions regardless of their job function or level within the organization. Data governance is the glue that combines data discovery with the business value that data creates.

It’s critical to understand that in the modern data estate, governance accelerates digital transformation. Governance can seem like a burden and a blocker across data access and usage scenarios, but you cannot implement effective and efficient governance without a unified data strategy. That’s why many organizations approach data governance like a millstone around their necks: they struggle to harness the power of their data because they lack a data strategy and the leadership alignment needed to improve data culture.

In the Microsoft Digital data estate, governance lightens the load for our data owners, administrators, and users. Microsoft Purview helps us to democratize data responsibly, beginning with our unified analytics foundation in Microsoft Fabric. With a unified perspective on data and a system in place for understanding the entire enterprise estate, governance can be applied and monitored with Purview across all enterprise data, with an end-to-end data governance service that automates the discovery, classification, and protection of sensitive data across our on-premises, multi-cloud, and SaaS environments.

“The governance tools that protect and share any enterprise data are transparent to data creators, managers, and consumers,” Buono says. “Stakeholders can be assured that their data is being shared, accessed, and used how they want it to be.”

Our success begins with an iterative approach to data transformation. We started small, with projects that were simple to transform and didn’t have a critical impact on our business.

—Karthik Ravindran, general manager, data governance, Microsoft Security group

Responsible democratization encourages onboarding and breaks down silos. When data owners are confident in governance, they want their data on the platform, which drives the larger unification and governance of enterprise-wide data.

Scaling transformative outcomes with intelligent applications

The final layer of our data transformation strategy builds on the previous two to provide unified, democratized data to the applications and business processes used every day at Microsoft. These intelligent applications create business value. They empower employees, reduce manual efforts, increase operational efficiencies, generate increased revenue, and contribute to a better Microsoft.

How we transformed: iteration and progression

Microsoft Purview and Microsoft Fabric are enabling the company to rethink how we use data internally at Microsoft, says Karthik Ravindran, a general manager who leads data governance for the Microsoft Security group.

While the three layers provide a solid structure for building a modern data platform, they provide value only if implemented. Actual transformation happens in the day-to-day operations of an organization. We transformed by applying these layers to our business groups, data infrastructure, and even our cultural data approach at Microsoft Digital.

“Our success begins with an iterative approach to data transformation,” says Karthik Ravindran, a general manager who leads data governance for the Microsoft Security group. “We started small, with projects that were simple to transform and didn’t have a critical impact on our business.”

These early projects provided a testing ground for our methods and technology.

“We quickly iterated approaches and techniques, gathering feedback from stakeholders as we went,” Ravindran says. “The results and learnings from these early implementations grew into a more mature and scalable platform. We were able to adapt to larger, more complex, and more critical sections of our data estate, tearing down larger data silos as we progressed.”

To understand how this worked, consider the following examples of our transformation across the organization.

Transforming marketing

The Microsoft Global Demand Center supports Microsoft commercial operations, including Microsoft Azure, Microsoft 365, and Dynamics 365. The Global Demand Center drives new customer acquisition and builds the growth and adoption of Microsoft products.

The Global Demand Center uses data from a broad spectrum of the business, including marketing, finance, sales, product telemetry, and many more. The use cases for this data span personas from any of these areas. Each internal Microsoft persona—whether a seller, researcher, product manager, or marketing executive—has a specific use case. Each of these personas engages with different customers to provide slightly different outcomes based on the customer and the product or service. It’s an immense swath of data consumed and managed by many teams for many purposes.

The Global Demand Center can holistically manage and monitor how Microsoft personas engage with customers by converging tools into the Microsoft Digital enterprise data platform. Each persona has a complete picture of who the customer is and what interactions or engagements they’ve had with Microsoft. These engagements include the products they’ve used, the trials they’ve downloaded, and the conversations they’ve had with other internal personas throughout their lifecycle as a Microsoft customer.

The enterprise data platform provides a common foundation for insights and intelligence into global demand for our products. The platform’s machine learning and AI capabilities empower next actions and prioritize how the Global Demand Center serves personas and customers. Moving the Global Demand Center toward adopting the enterprise data platform is iterative. It’s progressive onboarding of personas and teams to use the toolset available.

The adoption is transforming marketing and sales across Microsoft. It’s provided several benefits, including:

  • More reliable data and greater data quality. The unification of data and increased governance over the data create better data that drives better business results.
  • Decreased data costs. Moving to the enterprise data platform has reduced the overall cost compared to managing multiple data platforms.
  • Increased agility. With current and actionable data, the Global Demand Center can respond immediately to the myriad of daily changes in sales and marketing at Microsoft.

Improving the employee experience

Employee experience is paramount at Microsoft. The Microsoft Digital Employee Experience team is responsible for all aspects of the employee experience. They’re using the enterprise data platform to power a 360-degree view of the employee experience. Their insights tool connects different data across Microsoft to provide analytics and actionable insights that enable intelligent, personalized, and interconnected experiences for Microsoft employees.

The employee experience involves many data points and internal departments at Microsoft. Previously, when data was managed and governed in silos, it was difficult to build data connections to other internal organizations, such as Microsoft Human Resources (Microsoft HR). With the enterprise data platform, the Employee Experience team can access the data they need within the controls of the platform’s governance capabilities, which gives the Microsoft HR department the stewardship and transparency they require.

The enterprise data platform creates many benefits for the Employee Experience team, including:

  • Coordinated feature feedback and implementation. All planned software and tools features across Microsoft align with employee feedback and practical needs obtained from the enterprise data platform.
  • Better detection and mitigation of issues. Intelligent insights help Employee Experiences team members identify new and recurring issues so they can be mitigated effectively.
  • Decreased costs. The efficiencies created by using the enterprise data platform reduce engineering effort and resource usage.

Creating greater sustainability in operations

Microsoft Sustainability Operations supports efforts to increase global sustainability for Microsoft and minimize environmental impact. Sustainability Operations is responsible for environmental efforts across the organization, including waste, water, and carbon management programs.

Their internal platform, the Microsoft Cloud for Sustainability, is built on the enterprise data platform. It leverages the unified analytics and governance capabilities to create important sustainability insights that guide Sustainability Operations efforts and programs.

These insights are combined in the Microsoft Environmental Sustainability Report. This report contains 20 sections detailing how Microsoft works to minimize environmental impact. The report includes sections for emissions, capital purchases, business travel, employee commuting, product distribution, and managed assets, among others.

To provide the data for this report, Sustainability Operations has created a data processing platform with the Microsoft Cloud for Sustainability that ingests and transforms data from Microsoft Operations into a data repository. The unified data enables the team to create reports from many different perspectives using a common data model that enables quick integration.

Governance is central to the effective democratization of data, and when data is adequately democratized—safely accessible by everyone who should access it—transformation is accelerated. Modern governance is achievable using automated controls and a self-service methodology, enabling immediate opportunity to create business value.

—Damon Buono, head of enterprise governance, Microsoft

The Microsoft Environmental Sustainability Report supports decision-making at the enterprise and business group level, which enables progress tracking against internal goals, forecasting and simulation, qualitative analysis of environmental impact, and compliance management for both perspectives. These tools allow Microsoft Sustainability Operations to discover and track environmental hotspots across the global enterprise with greater frequency and more precision. Using these insights, they can drive changes in operations that create more immediate and significant environmental impact reductions.

Implementing internal data governance

Governance has been a massive part of our journey. Realizing governance as an accelerator of transformation has radically changed our approach to governance. Understanding who is accessing data, what they’re accessing, and how they’re accessing is critical to ensuring controlled and measured access. It also creates the foundation for building transparency into the enterprise data platform, growing user confidence, and increasing adoption.

“Governance is central to the effective democratization of data, and when data is adequately democratized—safely accessible by everyone who should access it—transformation is accelerated,” Buono says. “Modern governance is achievable using automated controls and a self-service methodology, enabling immediate opportunity to create business value.”

Our governance strategy uses data standards and models with actionable insights to converge our entire data estate, which spans thousands of distinct data sources. We built our approach to data governance on some crucial learnings:

  • Evidence is critical to driving adoption and recruiting executive support.
  • Automated data access and a data catalog are critical to consolidating the data estate.
  • Data issue management can provide evidence, but it doesn’t scale well.
  • A centralized data lake, scorecards for compliance, and practical controls help create evidence for governance in large enterprises.

Key Takeaways

We continue to drive the adoption of the enterprise data platform at Microsoft. As we work toward 100 percent adoption across the enterprise, we generate efficiencies and reduce costs as we go. The iterative nature of our implementation means we’ve been able to move quickly and with agility, improving our processes along the way.

We’re really very excited about where we are now with Purview, Fabric, and the entire suite of tools we now have to manage our data here at Microsoft. They are helping us rethink how we use data internally here at Microsoft, and we’re just getting started.

—Karthik Ravindran, general manager, data governance, Microsoft Security group

We’re also supporting organizational alignment and advocacy programs that will increase adoption. These programs include an internal data governance management team to improve governance, an enterprise data education program, and a training program for the responsible use of AI.

As our enterprise data estates expand and diversify, tools like Microsoft Purview and Microsoft Fabric have become indispensable in ensuring that our data remains an asset, not a liability. These tools offer a compelling solution to the pressing challenges of governing and protecting the modern data estate through automated discovery, classification, and a unified approach to hybrid and multi-cloud deployments.

“We’re really very excited about where we are now with Purview, Fabric, and the entire suite of tools we now have to manage our data here at Microsoft,” Ravindran says. “They are helping us rethink how we use data internally here at Microsoft, and we’re just getting started.”


The post Transforming data governance at Microsoft with Microsoft Purview and Microsoft Fabric appeared first on Inside Track Blog.

Providing modern data transfer and storage service at Microsoft with Microsoft Azure http://approjects.co.za/?big=insidetrack/blog/microsoft-uses-azure-to-provide-a-modern-data-transfer-and-storage-service/ Thu, 13 Jul 2023 14:54:07 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=8732 Companies all over the world have launched their cloud adoption journey. While some are just starting, others are further along the path and are now researching the best options for moving their largest, most complex workflows to the cloud. It can take time for companies to address legacy tools and systems that have on-premises infrastructure […]

The post Providing modern data transfer and storage service at Microsoft with Microsoft Azure appeared first on Inside Track Blog.

Companies all over the world have launched their cloud adoption journey. While some are just starting, others are further along the path and are now researching the best options for moving their largest, most complex workflows to the cloud. It can take time for companies to address legacy tools and systems that have on-premises infrastructure dependencies.

Our Microsoft Digital Employee Experience (MDEE) team has been running our company as mostly cloud-only since 2018, and continues to design cloud-only solutions to help fulfill our Internet First and Microsoft Zero Trust goals.

In MDEE, we designed a Modern Data Transfer Service (MDTS), an enterprise-scale solution that allows the transfer of large files to and from partners outside the firewall and removes the need for an extranet.

MDTS makes cloud adoption easier for teams inside Microsoft and encourages the use of Microsoft Azure for all of their data transfer and storage scenarios. As a result, engineering teams can focus on building software and shipping products instead of dealing with the management overhead of Microsoft Azure subscriptions and becoming subject matter experts on infrastructure.

[Unpack simplifying Microsoft’s royalty ecosystem with connected data service. | Check out how Microsoft employees are leveraging the cloud for file storage with OneDrive Folder Backup. | Read more on simplifying compliance evidence management with Microsoft Azure confidential ledger.]

Leveraging our knowledge and experience

As part of Microsoft’s cloud adoption journey, we have been continuously looking for opportunities to help other organizations move data and remaining legacy workflows to the cloud. With more than 220,000 employees and over 150 partners with whom data is shared, not every team had a clear path for converting their transfer and storage patterns into successful cloud scenarios.

We have a high level of Microsoft Azure service knowledge and expertise when it comes to storage and data transfer. We also have a long history with legacy on-premises storage designs and hybrid third-party cloud designs. Over the past decade, we engineered several data transfer and storage services to facilitate the needs of Microsoft engineering teams. Those services traditionally leveraged either on-premises designs or hybrid designs with some cloud storage. In 2019, we began to seriously look at replacing our hybrid model, which included a mix of on-premises resources, third-party software, and Microsoft Azure services, with one modern service that would completely satisfy our customer scenarios using only Azure. New capabilities in Azure had made this possible, and the timing was right.

MDTS uses out-of-the-box Microsoft Azure storage configurations and capabilities to help us address legacy on-premises storage patterns and support Microsoft core commitments to fully adopt Azure in a way that satisfies security requirements. Managed by a dedicated team of service engineers, program managers, and software developers, MDTS offers performance and security, and is available to any engineering team at Microsoft that needs to move their data storage and transfer to the cloud.

Designing a Modern Data Transfer and Storage Service

The design goal for MDTS was to create a single storage service offering, built entirely in Microsoft Azure, that would be flexible enough to meet the needs of most engineering teams at Microsoft. The service needed to be sustainable as a long-term solution, continue to support ongoing Internet First and Zero Trust network security designs, and have the capability to adapt to evolving technology and security requirements.

Identifying use cases

First, we needed to identify the top use cases we wanted to solve and evaluate which combination of Microsoft Azure services would help us meet our requirements. The primary use cases we identified for our design included:

  • Sharing and/or distribution of complex payloads: We not only had to provide storage for corporate sharing needs, but also share those same materials externally. The variety of file sizes and different payload characteristics can be challenging because they don’t always fit a standard profile for files (e.g., Office documents).
  • Cloud storage adoption (shifting from on-premises to cloud): We wanted to ensure that engineering teams across Microsoft that needed a path to the cloud would have a roadmap. This need could arise because of expiring on-premises infrastructure, corporate direction, or other modernization initiatives like ours.
  • Consolidation of multiple storage solutions into one service, to reduce security risks and administrative overhead: Having to place data and content in multiple storage datastores for specific sharing or performance needs is cumbersome and can introduce additional risk. Because there wasn’t yet a single service that could meet all their sharing needs and performance requirements, employees and teams at Microsoft were using a variety of locations and services to store and share data.

Security, performance, and user experience design requirements

After identifying the use cases for MDTS, we focused on our primary design requirements. They fell into three high-level categories: security, performance, and user experience.

Security

The data transfer and storage design needed to follow our Internet First and Zero Trust network design principles. Accomplishing parity with Zero Trust meant leveraging best practices for encryption, standard ports, and authentication. At Microsoft, we already have standard design patterns that define how these pieces should be delivered.

  • Encryption: Data is encrypted both in transit and at rest.
  • Authentication: Microsoft Azure Active Directory supports corporate synced domain accounts, external business-to-business accounts, and corporate and external security groups. Leveraging Azure Active Directory allows teams to remove dependencies on corporate domain controllers for authentication.
  • Authorization: Microsoft Azure Data Lake Gen2 storage provides fine-grained access to containers and subfolders. This is possible because of many new capabilities, most notably the support for OAuth, hierarchical namespace, and POSIX permissions. These capabilities are necessities of a Zero Trust network security design.
  • No non-standard ports: Opening non-standard ports can present a security risk. Using only HTTPS and TCP 443 as the mechanisms for transport and communication prevents opening non-standard ports. This includes having software capable of transport that maximizes the ingress/egress capabilities of the storage platform. Microsoft Azure Storage Explorer, AzCopy, and Microsoft Azure Data Factory meet the no non-standard ports requirement.

Performance

Payloads can range from being comprised of one very large file, millions of small files, and every combination in between. Scenarios across the payload spectrum have their own computing and storage performance considerations and challenges. Microsoft Azure has optimized software solutions for achieving the best possible storage ingress and egress. MDTS helps ensure that customers know what optimized solutions are available to them, provides configuration best practices, and shares the learnings with Azure Engineering to enable robust enterprise scale scenarios.

  • Data transfer speeds: Having software capable of maximizing the ingress/egress capabilities of the storage platform is preferable for engineering-type workloads. It’s common for these workloads to have complex payloads: several large files (10-500 GB) or millions of small files.
  • Ingress and egress: Support for ingress upwards of 10 Gbps and egress upwards of 50 Gbps, with client and server software that can consume as much of the available bandwidth as the client and storage endpoints allow.

 

Data size | 50 Mbps      | 100 Mbps     | 500 Mbps    | 1 Gbps      | 5 Gbps       | 10 Gbps
1 GB      | 2.7 minutes  | 1.4 minutes  | 0.3 minutes | 0.1 minutes | 0.03 minutes | 0.01 minutes
10 GB     | 27.3 minutes | 13.7 minutes | 2.7 minutes | 1.3 minutes | 0.3 minutes  | 0.1 minutes
100 GB    | 4.6 hours    | 2.3 hours    | 0.5 hours   | 0.2 hours   | 0.05 hours   | 0.02 hours
1 TB      | 46.6 hours   | 23.3 hours   | 4.7 hours   | 2.3 hours   | 0.5 hours    | 0.2 hours
10 TB     | 19.4 days    | 9.7 days     | 1.9 days    | 0.9 days    | 0.2 days     | 0.1 days

Copy duration calculations based on data size and the bandwidth limit for the environment.
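The figures above follow from simple arithmetic on payload size and line rate. A minimal sketch of the calculation (decimal units, ignoring protocol overhead, so results differ slightly from the rounded published values):

```python
def transfer_minutes(size_gb: float, bandwidth_mbps: float) -> float:
    """Ideal copy duration in minutes for a payload of size_gb gigabytes
    over a link of bandwidth_mbps megabits per second (decimal units,
    no protocol overhead)."""
    bits = size_gb * 1e9 * 8              # payload size in bits
    seconds = bits / (bandwidth_mbps * 1e6)
    return seconds / 60

# 10 GB over a 100 Mbps link takes roughly 13 minutes
print(round(transfer_minutes(10, 100), 1))  # 13.3
```

Real transfers also depend on latency, concurrency, and storage throughput limits, which is why tools that saturate the available bandwidth (AzCopy, Azure Data Factory) matter so much for large payloads.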

User experience

Users and systems need a way to perform manual and automated storage actions with graphical, command line, or API-initiated experiences.

  • Graphical user experience: Microsoft Azure Storage Explorer provides Storage Admins the ability to graphically manage storage. It also has storage consumer features for those who don’t have permissions for administrative actions and simply need to perform common storage actions like uploading and downloading.
  • Command line experience: AzCopy provides developers with an easy way to automate common storage actions through CLI or scheduled tasks.
  • Automated experiences: Both Microsoft Azure Data Factory and AzCopy provide the ability for applications to use Azure Data Lake Gen2 storage as its primary storage source and destination.

Identifying personas

Because a diverse set of personas utilizes storage for different purposes, we needed to design storage experiences that satisfy the range of business needs. Through the process of development, we identified these custom persona experiences relevant to both storage and data transfer:

  • Storage Admins: The Storage Admins are Microsoft Azure subscription owners. Within the Azure subscription they create, manage, and maintain all aspects of MDTS: Storage Accounts, Data Factories, Storage Actions Service, and Self-Service Portal. Storage Admins also resolve requests and incidents that are not handled via Self-Service.
  • Data Owners: The Data Owner personas are those requesting storage who have the authority to create shares and authorize storage. Data Owners also perform the initial steps of creating automated distributions of data to and from private sites. Data Owners are essentially the decision makers of the storage following handoff of a storage account from Storage Admins.
  • Storage Consumers: At Microsoft, storage consumers represent a broad set of disciplines, from engineers and developers to project managers and marketing professionals. Storage Consumers can use Microsoft Azure Storage Explorer to perform storage actions to and from authorized storage paths (aka Shares). Within the MDTS Self Service Portal, a storage consumer can be given authorization to create distributions. A distribution can automate the transfer of data from a source to one or multiple destinations.

Implementing and enhancing the solution architecture

After considering multiple Microsoft Azure storage types and complementary Azure services, the MDTS team chose the following Microsoft Azure services and software as the foundation for offering a storage and data transfer service to Microsoft engineering groups.

  • Microsoft Azure Active Directory: Meets the requirements for authentication and access.
  • Microsoft Azure Data Lake Gen2: Meets security and performance requirements by providing encryption, OAuth, hierarchical namespace, fine-grained authorization to Azure Active Directory entities, and 10+ Gbps ingress and egress.
  • Microsoft Azure Storage Explorer: Meets security, performance, and user experience requirements by providing a graphical experience to perform storage administrative tasks and storage consumer tasks without needing a storage account key or role-based access control (RBAC) on an Azure resource. Azure Storage Explorer also has AzCopy embedded to satisfy performance for complex payloads.
  • AzCopy: Provides a robust and highly performant command line interface.
  • Microsoft Azure Data Factory: Meets the requirements for orchestrating and automating data copies between private networks and Azure Data Lake Gen2 storage paths. Azure Data Factory copy activities are equally as performant as AzCopy and satisfy security requirements.

Enabling Storage and Orchestration

As illustrated below, the first MDTS design was composed entirely of Microsoft Azure services, with no additional investment from us other than the people needed to manage the Microsoft Azure subscription and perform routine requests. MDTS was offered as a commodity service to engineering teams at Microsoft in January 2020. Within a few months we saw a reduction of third-party software and on-premises file server storage, which provided significant savings. This migration also contributed progress toward the company-wide objectives of Internet First and Zero Trust design patterns.

The first design of MDTS provides storage and orchestration using out-of-the-box Microsoft Azure services.

We initially onboarded 35 engineering teams which included 10,000 Microsoft Azure Storage Explorer users (internal and external accounts), and 600 TB per month of Microsoft Azure storage uploads and downloads. By offering the MDTS service, we saved engineering teams from having to run Azure subscriptions themselves and needing to learn the technical details of implementing a modern cloud storage solution.

Creating access control models

As a team, we quickly discovered that having specific, repeatable implementation strategies was essential when configuring public-facing Microsoft Azure storage. Our initial time investment was in standardizing an access control process that would reduce complexity and ensure a correct security posture before handing off storage to customers. To do this, we constructed onboarding processes for identifying the type of share, for which we standardized the implementation steps.

We implemented standard access control models for two types of shares: container shares and sub-shares.

Container share access control model

The container share access control model is used for scenarios where the data owner prefers users to have access to a broad set of data. As illustrated in the graphic below, container shares supply access to the root, or parent, of a folder hierarchy. The container is the parent. Any member of the security group will gain access to the top level. When creating a container share, we also make it possible to convert to a sub-share access control model if desired.

 

Microsoft Azure Storage Explorer grants access to the root, or parent, of a folder hierarchy using the container share access control model. Both engineering and marketing are containers. Each has a specific Microsoft Azure Active Directory Security group. A top-level Microsoft Azure AD Security group is also added to minimize effort for users who should get access to all containers added to the storage account.

This model fits scenarios where group members get Read, Write, and Execute permissions to an entire container. The authorization allows users to upload, download, create, and/or delete folders and files. Making changes to the Access Control restricts access. For example, to create access permissions for download only, select Read and Execute.

Sub-share access control model

The sub-share access control model is used for scenarios where the data owner prefers users have explicit access to folders only. As illustrated in the graphic below, folders are hierarchically created under the container. In cases where several folders exist, a security group access control can be implemented on a specific folder. Access is granted to the folder where the access control is applied. This prevents users from seeing or navigating folders under the container other than the folders where an explicit access control is applied. When users attempt to browse the container, authorization will fail.

 

Microsoft Azure Storage Explorer grants access to sub-folder only using the sub-share access control model. Members are added to the sub-share group, not the container group. The sub-share group is nested in the container group with execute permissions to allow for Read and Write on the sub-share.

This model fits scenarios where group members get Read, Write, and Execute permissions to a sub-folder only. The authorization allows users to upload, download, create folders and files, and delete folders and files. The access control is specific to the folder “project1.” In this model you can have multiple folders under the container, but only provide authorization to a specific folder.

The sub-share process is only applied if a sub-share is needed.

  • Any folder needing explicit authorization is considered a sub-share.
  • We apply a sub-share security group access control with Read, Write, and Execute on the folder.
  • We nest the sub-share security group in the parent share security group used for Execute only. This allows members who do not have access to the container enough authorization to Read, Write, and Execute the specific sub-share folder without having Read or Write permissions to any other folders in the container.
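The nesting pattern above can be pictured with a toy permission model: the sub-share group carries Read, Write, and Execute on its folder, and its nesting inside the container’s Execute-only group lets members traverse the container without reading it. This is an illustrative sketch, not the actual ADLS Gen2 ACL engine, and the group names are hypothetical examples:

```python
# Toy model of the container-share / sub-share access control pattern.
# ACLs map a path to {security group: permission string}.
container_acls = {
    "engineering": {"mdts-ac-storageacct1-rwe": "rwe",
                    "mdts-ac-storageacct1-x": "x"},
    "engineering/project1": {"mdts-ac-storageacct1-project1-rwe": "rwe"},
}

# Group nesting: the sub-share group is nested in the container's
# execute-only group, so its members can traverse to their folder.
nesting = {"mdts-ac-storageacct1-x": {"mdts-ac-storageacct1-project1-rwe"}}

def groups_of(user_groups):
    """Expand direct memberships through one level of group nesting."""
    expanded = set(user_groups)
    for parent, children in nesting.items():
        if expanded & children:
            expanded.add(parent)
    return expanded

def can_access(path, user_groups, needed="rwe"):
    """True if any expanded group grants every needed permission on path."""
    acls = container_acls.get(path, {})
    memberships = groups_of(user_groups)
    return any(g in memberships and all(p in perms for p in needed)
               for g, perms in acls.items())

# A sub-share member can read/write project1 but not the container root.
member = {"mdts-ac-storageacct1-project1-rwe"}
print(can_access("engineering/project1", member))     # True
print(can_access("engineering", member))              # False: no read/write
print(can_access("engineering", member, needed="x"))  # True: can traverse
```

The key property the model shows: removing the sub-share group from the nesting removes traversal too, so access can be revoked in one place.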

Applying access controls for each type of share (container and/or sub-share)

The parent share process is standard for each storage account.

  • Each storage account has a unique security group. This security group will have access control applied for any containers. This allows data owners to add members and effectively give access to all containers (current and future) by simply changing the membership of one group.
  • Each container will have a unique security group for Read, Write, and Execute. This security group is used to isolate authorization to a single container.
  • Each container will have a unique group for execute. This security group is needed in the event sub-shares are created. Sub-shares are folder-specific shares in the hierarchical namespace.
  • We always use the default access control option. This is a feature that automatically applies the parent permissions to all new child folders (sub-folders).
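The default access control behavior can be sketched with a small model: a “default” entry on a parent folder is applied to children created afterward, which is also why changes to an existing hierarchy still need an explicit propagation step. The class and names here are illustrative, not the real storage API:

```python
# Toy model of ADLS Gen2 "default" ACL behavior: a default entry on a
# parent is copied to newly created children, but changing the parent's
# defaults later does not rewrite existing children.
class Folder:
    def __init__(self, name, default_acl=None):
        self.name = name
        self.acl = dict(default_acl or {})          # effective ACL
        self.default_acl = dict(default_acl or {})  # applied to new children
        self.children = {}

    def mkdir(self, name):
        """New children inherit the parent's *current* default ACL."""
        child = Folder(name, self.default_acl)
        self.children[name] = child
        return child

container = Folder("engineering", {"mdts-ac-storageacct1-rwe": "rwe"})
before = container.mkdir("project1")

# Changing the default ACL only affects folders created afterward.
container.default_acl["mdts-ac-storageacct1-x"] = "x"
after = container.mkdir("project2")

print("mdts-ac-storageacct1-x" in before.acl)  # False: needs propagation
print("mdts-ac-storageacct1-x" in after.acl)   # True: inherited at creation
```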

The first design enabled us to offer MDTS while our engineers defined, designed, and developed an improved experience for all the personas. It quickly became evident that Storage Admins needed the ability to see an aggregate view of all storage actions in near real-time to successfully operate the service. It was important for our administrators to easily discover the most active accounts and which user, service principal, or managed service identity was making storage requests or performing storage actions. In July 2020, we added the Aggregate Storage Actions service.

Adding aggregate storage actions

For our second MDTS design, we augmented the out of the box Microsoft Azure Storage capabilities used in our first design with the capabilities of Microsoft Azure Monitor, Event Hubs, Stream Analytics, Function Apps, and Microsoft Azure Data Explorer to provide aggregate storage actions. Once the Aggregate Storage Actions capability was deployed and configured within MDTS, storage admins were able to aggregate the storage actions of all their storage accounts and see them in a single pane view.

 

The second design of MDTS introduces aggregate storage actions.

The Microsoft Azure Storage diagnostic settings in the Microsoft Azure portal make it possible for us to configure specific settings for blob actions. Combining this feature with other Azure services and some custom data manipulation gives MDTS the ability to see which users are performing storage actions, what those storage actions are, and when those actions were performed. The data visualizations are near real-time and aggregated across all the storage accounts.

Storage accounts are configured to route logs from Microsoft Azure Monitor to Event Hub. We currently have 45+ storage accounts that generate around five million logs each day. Data filtering, manipulation, and grouping is performed by Stream Analytics. Function Apps are responsible for fetching UPNs using Graph API, then pushing logs to Microsoft Azure Data Explorer. Microsoft Power BI and our modern self-service portal query Microsoft Azure Data Explorer and provide the visualizations, including dashboards with drill down functionality. The data available in our dashboard includes the following information aggregated across all customers (currently 35 storage accounts).

  • Aggregate view of the most active accounts based on log activity.
  • Aggregate total of GB uploaded and downloaded per storage account.
  • Top users by upload volume, showing the user principal name (both external and internal).
  • Top users by download volume, showing the user principal name (both external and internal).
  • Top accounts by data uploaded.
  • Top accounts by data downloaded.
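Conceptually, these dashboard views reduce to grouping log records by account or user and summing bytes. A minimal sketch, using an assumed log schema (the field names `account`, `user`, `op`, and `bytes` are illustrative, not the actual Azure Monitor schema):

```python
from collections import defaultdict

# Hypothetical, simplified storage-action log records.
logs = [
    {"account": "acct1", "user": "alice@contoso.com", "op": "upload",   "bytes": 5_000_000_000},
    {"account": "acct1", "user": "bob@partner.com",   "op": "download", "bytes": 2_000_000_000},
    {"account": "acct2", "user": "alice@contoso.com", "op": "upload",   "bytes": 1_000_000_000},
]

def gb_by_account(entries, op):
    """Total gigabytes per storage account for one operation type."""
    totals = defaultdict(int)
    for e in entries:
        if e["op"] == op:
            totals[e["account"]] += e["bytes"]
    return {acct: round(b / 1e9, 1) for acct, b in totals.items()}

def top_users(entries, op):
    """User principal names ranked by volume for one operation type."""
    totals = defaultdict(int)
    for e in entries:
        if e["op"] == op:
            totals[e["user"]] += e["bytes"]
    return sorted(totals, key=totals.get, reverse=True)

print(gb_by_account(logs, "upload"))  # {'acct1': 5.0, 'acct2': 1.0}
print(top_users(logs, "upload"))      # ['alice@contoso.com']
```

In the production pipeline, Stream Analytics performs this filtering and grouping continuously, and the results land in Azure Data Explorer for querying.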

The only setting required to onboard new storage accounts is to configure them to route logs to the Event Hub. Because we have an aggregate store of all the storage account activities, we are able to offer MDTS customers a view into their account-specific data.

Following the release of Aggregate Storage Actions, the MDTS team, along with feedback from customers, identified another area of investment—the need for storage customers to “self-service” and view account specific insights without having role-based access to the subscription or storage accounts.

Providing a self-service experience

To enhance the experience of the other personas, MDTS is now focused on the creation of a Microsoft Azure web portal where customers can self-service different storage and transfer capabilities without having to provide any Microsoft Azure role-based access control (RBAC) to the underlying subscription that hosts the MDTS service.

When designing MDTS self-service capabilities we focused on meeting these primary goals:

  • Make it possible for Microsoft Azure Subscription owners (Storage Admins) to provide the platform and services while not needing to be in the middle of making changes to storage and transfer services.
  • The ability to create custom persona experiences so customers can achieve their storage and transfer goals through a single portal experience in a secure and intuitive way. Some of the new enterprise scale capabilities include:
    • Onboarding.
    • Creating storage shares.
    • Authorization changes.
    • Distributions: automating the distribution of data from one source to one or multiple destinations.
    • Providing insights into storage actions (based on the data provided in Storage Actions enabled in our second MDTS release).
    • Reporting basic consumption data, like the number of users, groups, and shares on a particular account.
    • Reporting the cost of the account.
  • As Azure services and customer scenarios change, the portal can also change.
  • If customers want to “self-host” (essentially take our investments and do it themselves), we will easily be able to accommodate.

Our next design of MDTS introduces a self-service portal.

Storage consumer user experiences

After storage is created and configured, data owners can then share steps for storage consumers to start using storage. Upload and download are the most common storage actions, and Microsoft Azure provides software and services needed to perform both actions for manual and automated scenarios.

Microsoft Azure Storage Explorer is recommended for manual scenarios where users can connect and perform high speed uploads and downloads manually. Both Microsoft Azure Data Factory and AzCopy can be used in scenarios where automation is needed. AzCopy is heavily preferred in scenarios where synchronization is required. Microsoft Azure Data Factory doesn’t provide synchronization but does provide robust data copy and data move. Azure Data Factory is also a managed service and better suited in enterprise scenarios where flexible triggering options, uptime, auto scale, monitoring, and metrics are required.

Using Microsoft Azure Storage Explorer for manual storage actions

Developers and Storage Admins are accustomed to using Microsoft Azure Storage Explorer for both storage administration and routine storage actions (e.g., uploading and downloading). Non-storage admins, otherwise known as Storage Consumers, can also use Microsoft Azure Storage Explorer to connect and perform storage actions without needing any role-based access control or access keys to the storage account. Once the storage is authorized, members of authorized groups can follow routine steps to attach the storage they are authorized for, authenticating with their work email and using the options their authorization allows.

The processes for sign-in and adding a resource via Microsoft Azure Active Directory are found in the Manage Accounts and Open Connect Dialog options of Microsoft Azure Storage Explorer.

After signing in and selecting the option to add the resource via Microsoft Azure Active Directory, you can supply the storage URL and connect. Once connected, it only requires a few clicks to upload and download data.

 

Microsoft Azure Storage Explorer Local and Attached module. After following the add resource via Microsoft Azure AD process, the Azure AD group itshowcase-engineering is authorized to Read, Write, and Edit (rwe) and members of the group can perform storage actions.

To learn more about using Microsoft Azure Storage Explorer, see Get started with Storage Explorer. There are additional links in the More Information section at the end of this document.

Note: Microsoft Azure Storage Explorer uses AzCopy. Having AzCopy as the transport allows storage consumers to benefit from high-speed transfers. If desired, AzCopy can be used as a stand-alone command line application.

Using AzCopy for manual or automated storage actions

AzCopy is a command line interface used to perform storage actions on authorized paths. AzCopy is used in Microsoft Azure Storage Explorer but can also be used as a standalone executable to automate storage actions. It’s a multi-stream, TCP-based transport capable of optimizing throughput based on the bandwidth available. MDTS customers use AzCopy in scenarios that require synchronization, or in cases where Microsoft Azure Storage Explorer or the Microsoft Azure Data Factory copy activity doesn’t meet the requirements for data transfer. For more information about using AzCopy, please see the More Information section at the end of this document.

AzCopy is a great match for standalone and synchronization scenarios. It also has options that are useful when seeking to automate or build applications. Because AzCopy is a single executable running on either a single client or server system, it isn’t always ideal for enterprise scenarios. Microsoft Azure Data Factory is a more robust Microsoft Azure service that meets most enterprise needs.
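To see why a multi-stream transport helps with large payloads, here is a simplified sketch of the chunking arithmetic behind parallel transfers: the file is split into independent byte ranges that separate streams can move concurrently. The 8 MiB block size is an illustrative assumption, not necessarily AzCopy’s configured default:

```python
BLOCK = 8 * 1024 * 1024  # 8 MiB per block (illustrative)

def block_ranges(file_size: int):
    """Yield (offset, length) pairs covering the file in BLOCK-sized chunks."""
    offset = 0
    while offset < file_size:
        length = min(BLOCK, file_size - offset)
        yield (offset, length)
        offset += length

# A 100 MiB payload becomes 13 ranges: twelve full 8 MiB blocks plus a
# final 4 MiB remainder, each transferable by a separate stream.
ranges = list(block_ranges(100 * 1024 * 1024))
print(len(ranges))  # 13
```

Because each range is independent, throughput scales with the number of streams until the link or the storage endpoint saturates, which is how AzCopy-style tools approach the ingress/egress limits discussed earlier.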

Using Microsoft Azure Data Factory for automated copy activity

Some of the teams that use MDTS require the ability to orchestrate and operationalize storage uploads and downloads. Before MDTS, we would have either built a custom service or licensed a third-party solution, which can be expensive and/or time consuming.

Microsoft Azure Data Factory, a cloud-based ETL and data integration service, allows us to create data-driven workflows for orchestrating data movement. Including Azure Data Factory in our storage hosting service model provided customers with a way to automate data copy activities. MDTS’s most common data movement scenarios are distributing builds from a single source to multiple destinations (3-5 destinations are common).

Another requirement for MDTS was to leverage private data stores as a source or destination. Microsoft Azure Data Factory provides the capability to use a private system, also known as a self-hosted integration runtime. When configured, this system can be used in copy activities communicating with on-premises file systems. The on-premises file system can then be used as a source and/or destination datastore.

In the situation where on-premises file system data needs to be stored in Microsoft Azure or shared with external partners, Microsoft Azure Data Factory provides the ability to orchestrate pipelines that perform one or multiple copy activities in sequence. These activities result in end-to-end data movement from one on-premises file system to Microsoft Azure Storage, and then to another private system if desired.

The graphic below provides an example of a pipeline orchestrated to copy builds from a single source to several private destinations.

 

Microsoft Azure Data Factory pipeline example. Private site 1 is the build system source. The build system builds, loads the source file system, and then triggers the Microsoft Azure Data Factory pipeline. The build is then uploaded, and Private sites 2, 3, and 4 download it. Function apps are used for sending email notifications to site owners and for additional validation.

For more information on Azure Data Factory, please see Introduction to Microsoft Azure Data Factory. There are additional links in the More Information section at the end of this document.

If you are thinking about using Microsoft Azure to develop a modern data transfer and storage solution for your organization, here are some of the best practices we gathered while developing MDTS.

Close the technical gap for storage consumers with a white glove approach to onboarding

Be prepared to spend time with customers who are initially overwhelmed with using Azure Storage Explorer or AzCopy. At Microsoft, storage consumers represent a broad set of disciplines—from engineers and developers to project managers and marketing professionals. Azure Storage Explorer provides an excellent experience for engineers and developers but can be a little challenging for less technical roles.

Have a standard access control model

Use Microsoft Azure Active Directory security groups and group nesting to manage authorization. Microsoft Azure Data Lake Gen2 storage has a limit on the number of access controls you can apply. To avoid reaching this limit, and to simplify administration, we recommend using Microsoft Azure Active Directory security groups. We apply the access control to the security group only, and in some cases we nest other security groups within the access control group. Specifically, we nest Member Security Groups within Access Control Security Groups to manage access. These group types don’t exist in Microsoft Azure Active Directory; they are a convention within our MDTS service for differentiating the purpose of a group, and the differentiation is captured in the group’s name.

  • Access Control Security Groups: We use this group type for applying Access Control on ADLS Gen2 storage containers and/or folders.
  • Member Security Groups: We use these to satisfy cases where access to containers and/or folders will constantly change for members.

When there are large numbers of members, nesting prevents the need to add members individually to the Access Control Security Groups. When access is no longer needed, we can remove the Member Group(s) from the Access Control Security Group and no further action is needed on storage objects.
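
The nesting model above can be sketched in a few lines. This is a minimal in-memory illustration, not the Azure AD API; the group names and users are hypothetical, following the MDTS convention described in this article. Access controls are applied only to Access Control (ac) groups, and Member (mg) groups inherit access by being nested inside one:

```python
# Hypothetical nesting: ac group -> member groups nested inside it.
nesting = {
    "mdts-ac-storageacct1-rwe": {"mdts-mg-storageacct1-project1"},
}

# Hypothetical membership: member group -> users.
members = {
    "mdts-mg-storageacct1-project1": {"alice", "bob"},
}

def users_with_access(ac_group):
    """Resolve every user who inherits the ac group's permission."""
    users = set()
    for mg in nesting.get(ac_group, set()):
        users |= members.get(mg, set())
    return users

def revoke_group(ac_group, mg):
    """Removing the member group from the ac group revokes access
    for all its members; no storage objects need to be touched."""
    nesting[ac_group].discard(mg)
```

Revoking a whole project is one group-membership change rather than an edit to each access control entry, which is exactly why nesting pays off at scale.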

Along with using Microsoft Azure Active Directory security groups, make sure to have a documented process for applying access controls. Be consistent and have a way of tracking where access controls are applied.

Use descriptive display names for your Microsoft Azure AD security groups

Because Microsoft Azure AD doesn’t currently organize groups by owners, we recommend using naming conventions that capture the group’s purpose and type to allow for easier searches.

  • Example 1: mdts-ac-storageacct1-rwe. This name uses our service's standard convention for an Access Control group on Storage Account 1 granting Read, Write, and Execute access: mdts = service, ac = Access Control group type, storageacct1 = ADLS Gen2 storage account name, rwe = permission granted by the access control.
  • Example 2: mdts-mg-storageacct1-project1. This name uses our service's standard convention for a Member Group on Storage Account 1. The group has no explicit access control on storage, but because it's nested in mdts-ac-storageacct1-rwe, every member receives Read, Write, and Execute access to Storage Account 1.
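
A small helper can keep names like these consistent. This sketch assumes the four-field, hyphen-separated convention shown above (service, group type, account, suffix); the convention itself is MDTS-specific, so adapt the fields to your own service:

```python
def group_name(service, group_type, account, suffix):
    """Build a display name such as 'mdts-ac-storageacct1-rwe'."""
    return "-".join([service, group_type, account, suffix])

def parse_group_name(name):
    """Split a display name back into its four fields; the suffix may
    itself contain hyphens, so split at most three times."""
    service, group_type, account, suffix = name.split("-", 3)
    return {"service": service, "type": group_type,
            "account": account, "suffix": suffix}
```

Generating and parsing names through one helper makes the "searchable by name" property reliable: a directory search for the service or type prefix finds every group the convention produced.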

Remember to propagate any changes to access controls

Microsoft Azure Data Lake Gen2 storage doesn't automatically propagate access control changes by default. When you add, remove, or change an access control, you need to perform an additional step to propagate the access control list to existing child items. This option is available in Microsoft Azure Storage Explorer.
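
The behavior is easy to trip over, so here's a toy model of it: a directory tree as plain dictionaries, with no Azure calls. The paths and ACL strings are made up. Changing a folder's ACL touches only that folder, which is why the explicit propagation step exists:

```python
# Hypothetical folder tree: path -> its ACL and child paths.
tree = {
    "/data":      {"acl": "group1:rwe", "children": ["/data/2023", "/data/2024"]},
    "/data/2023": {"acl": "group1:rwe", "children": []},
    "/data/2024": {"acl": "group1:rwe", "children": []},
}

def set_acl(path, acl):
    """Mirrors ADLS Gen2's default: only the target path changes."""
    tree[path]["acl"] = acl

def propagate_acl(path):
    """Recursively copy the path's ACL onto its descendants, as the
    propagation option in Storage Explorer does."""
    acl = tree[path]["acl"]
    for child in tree[path]["children"]:
        tree[child]["acl"] = acl
        propagate_acl(child)
```

Until `propagate_acl` runs, the children keep their old ACL even though the parent was updated, which is the gap the additional step closes.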

Storage Consumers can attempt Administrative options

Storage Consumers use Microsoft Azure Storage Explorer and are authenticated with their Microsoft Azure Active Directory user profile. Because Azure Storage Explorer is primarily built for Storage Admin and Developer personas, all administrative actions are visible, and it's common for storage consumers to attempt them, like managing access or deleting a container. Those actions fail because consumers are granted access only through access control lists (ACLs), and there's no way to grant administrative actions via ACLs. If administrative actions are needed, the user must become a Storage Admin, which is granted through Azure role-based access control (RBAC).

Microsoft Azure Storage Explorer and AzCopy are throughput intensive

As stated above, Microsoft Azure Storage Explorer uses AzCopy for transfer actions. Both are optimized for transfer performance, so some clients or networks may benefit from throttling AzCopy. If you don't want AzCopy to consume too much network bandwidth, configuration options are available: in Microsoft Azure Storage Explorer, open Settings and select the Transfers section to configure Network Concurrency and/or File Concurrency (for Network Concurrency, Adjust Dynamically is the default). For AzCopy itself, flags and environment variables are available to tune performance.
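
As one possible approach for scripted transfers, the sketch below builds a throttled AzCopy invocation without executing it. `AZCOPY_CONCURRENCY_VALUE` and `--cap-mbps` are real AzCopy controls; the concurrency value, bandwidth cap, source path, and storage URL here are illustrative only:

```python
import os

def throttled_azcopy(concurrency=4, cap_mbps=100):
    """Return an environment and command line for a bandwidth-capped
    AzCopy upload. The command is constructed, not run."""
    env = dict(os.environ)
    # Limit the number of concurrent requests AzCopy opens.
    env["AZCOPY_CONCURRENCY_VALUE"] = str(concurrency)
    cmd = [
        "azcopy", "copy", "./local-data",
        "https://account.dfs.core.windows.net/container",  # example URL
        "--recursive",
        f"--cap-mbps={cap_mbps}",  # cap transfer rate in megabits/sec
    ]
    return env, cmd
```

A wrapper like this could be passed to `subprocess.run(cmd, env=env)`; starting with a conservative cap and raising it is gentler on shared links than tuning downward after complaints.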

For more information, visit Configure, optimize, and troubleshoot AzCopy.

Microsoft Azure Storage Explorer sign-in with MSAL

Microsoft Authentication Library (MSAL), currently in preview, provides enhanced single sign-on, multi-factor authentication, and conditional access support. In some situations, users can't authenticate unless MSAL is selected. To enable MSAL, select the Settings option from Microsoft Azure Storage Explorer's navigation pane, then, in the Application section, select the option to enable Microsoft Authentication Library.

B2B invites are needed for external accounts (guest user access)

When there is a Microsoft business need to work with external partners, leveraging guest user access in Microsoft Azure Active Directory is necessary. Once the B2B invite process is followed, external accounts can be authorized by managing group membership. For more information, read What is B2B collaboration in Azure Active Directory?

Key Takeaways

We used Microsoft Azure products and services to create an end-to-end modern data transfer and storage service that can be used by any group at Microsoft that desires cloud data storage. The release of Microsoft Azure Data Lake Gen2, Microsoft Azure Data Factory, and the improvements in the latest release of Azure Storage Explorer made it possible for us to offer MDTS as a fully native Microsoft Azure service.

One of the many strengths of using Microsoft Azure is the ability to use only what we needed, as we needed it. For MDTS, we started by simply creating storage accounts, requesting Microsoft Azure Active Directory Security Groups, applying an access control to storage URLs, and releasing the storage to customers for use. We then invested in adding storage actions and developed self-service capabilities that make MDTS a true enterprise-scale solution for data transfer and storage in the cloud.

We are actively encouraging the adoption of our MDTS storage design by all Microsoft engineering teams that still rely on legacy storage hosted on the Microsoft corporate network. We also encourage any Microsoft Azure consumers to consider this design when evaluating options for storage and file-sharing scenarios. Our design has proven to be scalable, compliant with the Microsoft Zero Trust security initiative, and performant, handling extreme payloads with high throughput and no constraints on the size or number of files.

By eliminating our dependency on third-party software, we have been able to eliminate third-party licensing, consulting, and hosting costs for many on-premises storage systems.

Are you ready to learn more? Sign up for your own Microsoft Azure subscription and get started today.

To receive the latest updates on Azure storage products and features to meet your cloud investment needs, visit Microsoft Azure updates.


The post Providing modern data transfer and storage service at Microsoft with Microsoft Azure appeared first on Inside Track Blog.
