Data and AI Archives - Inside Track Blog
How Microsoft does IT
http://approjects.co.za/?big=insidetrack/blog/tag/data-and-ai/
Thu, 31 Oct 2024

Embracing emerging technology at Microsoft with new AI certifications
http://approjects.co.za/?big=insidetrack/blog/embracing-emerging-technology-at-microsoft-with-new-ai-certifications/
Fri, 09 Aug 2024

As an organization, we immediately saw that advanced AI was going to create opportunities for our employees to increase their reach and impact. We knew we needed to move quickly to help them get ready for the moment.

Our response?

We assembled an ambitious data and AI curriculum through Microsoft Viva Learning that draws from Microsoft Learn and other content sources. This curriculum is empowering our employees at Microsoft Digital, the company’s IT organization, with the skills they need to harness these tools.

Microsoft Viva Learning and Microsoft Learn

Microsoft Viva Learning and Microsoft Learn are two distinct platforms that serve different purposes.

Microsoft Viva Learning is a centralized learning hub in Microsoft Teams that lets you seamlessly integrate learning and building skills into your day. In Viva Learning, your team can discover, share, recommend, and learn from content libraries provided by both your organization and partners. They can do all of this without leaving Microsoft Teams.

Microsoft Learn is a free online learning platform that provides interactive learning content for Microsoft products and services. It offers a wide range of courses, tutorials, and certifications to help users learn new skills and advance their careers. Microsoft Learn is accessible to anyone with an internet connection and is available in multiple languages.

It’s all part of our approach to infusing AI into everything we do to support the company. The more successful we are in Microsoft Digital, the better our team can deploy our new AI technologies to the rest of our colleagues across the organization.

Infusing AI into Microsoft through a learn-it-all culture

Fully unleashing AI across Microsoft is a bold aspiration that will require plenty of guidance and support from our Microsoft Digital team. It’s both a technology challenge and a people challenge, one that requires more than IT knowledge to deliver.

“We take a holistic approach,” says Sean MacDonald, partner director of product management in Microsoft Digital. “It’s not just about winning with technology—it’s about supporting the community and doing things the right way.”

With our learn-it-all culture and Microsoft Viva Learning, Microsoft Learn, and other content sources at our disposal, a progressive curriculum was the natural choice for upskilling our technical professionals. Microsoft Viva Learning connects content from our organization’s internal learning libraries and third-party learning management systems. As a result, it makes it easy for our team to develop learning paths with content from Microsoft Learn, LinkedIn Learning, and external providers like Pearson.

“As a tech company, we’re always encountering new concepts and new technologies,” says Miguel Uribe, principal product manager lead for Employee Experience Insights in Microsoft Digital. “It’s part of our culture to absorb technology and consume concepts very quickly, and AI is just the latest example.”

[Learn how we’re managing our response to the AI revolution internally at Microsoft with an AI Center of Excellence. See how we’re getting the most out of generative AI at Microsoft with good governance.]

Building meaningful AI certifications for Microsoft employees

The AI Center of Excellence (AI CoE), our Microsoft Digital team tasked with designing and championing how our organization uses AI, is at the forefront of these efforts. They’re working to standardize how we leverage AI internally.

Operational pillars of the AI Center of Excellence

We’ve created four pillars to guide our internal implementation of generative AI across Microsoft: Strategy, architecture, roadmap, and culture. Our AI certifications program falls under culture.

“Our first priority is creating a common understanding and language around these fairly new topics,” says Humberto Arias, senior product manager on the Frictionless Devices team in Microsoft Digital. “The technology changes constantly, so you need to learn continually to keep up.”

Fortunately, enterprising employees within Microsoft have been laying the groundwork for this moment for years. Our Artificial Intelligence and Machine Learning (AIML) community had been working on their own time to deepen their knowledge through research and independent certifications.

Sean MacDonald (left to right), Miguel Uribe, Humberto Arias, Urvi Sengar, Nitul Pancholi, Amy Ceurvorst, John Philpott, Sunitha Bodhanampati, Yannis Paniaras, and Dave Rodriguez (not pictured), are all part of a larger Microsoft Digital team implementing a new AI training curriculum for employees at Microsoft.

When generative AI took off at the start of 2023, that community began partnering with the AI CoE and got serious about empowerment. They brought their knowledge. The AI CoE brought their organizational leadership.

“No other organization within Microsoft can provide such a clear picture of what you need for upskilling,” says Urvi Sengar, AIML software engineer in Microsoft Digital. “Only our IT organization is functionally diverse enough.”

Their work is a testament to the power of trusting your technology champions to lead change. In previous years, Sengar and her AIML community colleagues had already built a learning path focused on AI-900, the Microsoft Azure AI Fundamentals certification. They relaunched the course in 2023 as the core of our AI certifications.

From there, a diverse group of technical and employee experience professionals collaborated to assemble, create, and structure a series of learning paths to launch Microsoft Digital’s employees into the next level of AI expertise. That’s where Microsoft Viva Learning really shines. The platform makes it easy to curate our AI content actively as the technology landscape evolves.

“So much is changing that we don’t want to stop at just one static certification,” Sengar says. “We want to keep the learning going along with everything new and relevant so we can take this community forward.”

The result is a granular, multidisciplinary curriculum that gets Microsoft Digital employees not just up to AI literacy, but AI proficiency.

Innovative AI certifications designed for employee success

Our AI and Data Learning curriculum breaks into three distinct learning paths: basic, intermediate, and advanced.

  • AI Learning Basic gives beginners a ground-level, conceptual understanding of the technology. It builds familiarity with generative AI, Azure OpenAI Service, and no-code AI, as well as more theoretical frameworks like the Responsible AI principles, AI ethics, and how to align AI projects with our values.
  • AI Learning Intermediate is where things get more functional. Here, employees learn about natural language processing and prompt engineering, as well as several specific AI tools, including ChatGPT, AI Builder in Power Automate, Semantic Kernel for building AI-based apps, Microsoft 365 Copilot Extensibility Framework, and more.
  • AI Learning Advanced goes from function to innovation. This is where employees can dive deeper into technologies like large language models (LLMs), training neural networks, self-supervised machine learning, and other skills that will help them develop more advanced solutions and automations.

When employees complete each learning path, they receive a sharable badge. We used Credly, a digital credentialing solution created by Pearson, to design and manage those badges. We can then distribute them to our employees through Credly’s integration with Microsoft Viva Learning.

Microsoft Digital AI certification levels

Microsoft employees can obtain three levels of AI certification: beginner, intermediate, and advanced.

Curating the curriculum is only one part of the AI CoE’s job. It’s also crucial to promote and socialize these learning opportunities internally. The wider Microsoft Viva employee experience suite takes care of that.

We actively socialize the AI certifications through Microsoft Viva Engage, our employee communication platform, but top-down promotion is only one component of their success. Microsoft Digital employees often share their certifications via LinkedIn or through Viva Engage. As a result, there’s an element of virality that leads even more of our employees to take these courses—even outside Microsoft Digital.

Our teams are clearly excited about their success. The share rate for AI Learning badges is 72 percent, well above Credly’s average of 47 percent.

Beyond Microsoft Digital, lines of business across Microsoft are even adapting these certifications for their own needs.

“People are observing the work we do and looking for ways to bring it into their organizations,” says Nitul Pancholi, engineering product manager in Microsoft Digital leading the AI CoE’s culture pillar. “Even external customers are asking how they can set up their own centers of excellence and what to prioritize.”

Freshly empowered AI practitioners, ready for the future

We’re still at the beginning of our internal AI adoption journey. But by raising the baseline of AI knowledge, these certifications ensure our technical professionals are ready to lead the rest of our organization.

“That’s one of the super cool things about Microsoft,” MacDonald says. “We have the playground at our fingertips, and we have the autonomy and opportunity to dream up whatever we want.”

The advent of advanced AI supported by thoughtful empowerment initiatives will only amplify our employees’ ability to experiment with emerging technologies. We’re confident that developing our own AI curriculum will help us work our way into a virtuous cycle of more learning, more creativity, and more business innovation.

Customers with access to Microsoft Viva Learning can start assembling their own AI curriculum today from Microsoft Learn content, their own organizations’ educational materials, and external providers and learning management systems. By unlocking AI for employees through education, organizations are positioned to ride the wave of the next digital revolution.

Read more about the AI CoE and how we’re responding to the AI revolution here.

Key Takeaways

Here are some things to consider as you think about launching an AI curriculum at your company:

  • Leverage your integrations with tools like Microsoft Viva Learning and LinkedIn Learning.
  • Actively curate your courses to keep your curriculum up to date.
  • Busy schedules get in the way: Build time for learning into your employees’ days, then support them with curriculum.
  • Leverage executive sponsorship, employee champions, and the social aspects of learning.
  • Incentivize and recognize progress through gamification, friendly competition, badges, and testimonials.
  • Build a diverse enablement team from across different disciplines, seniorities, and technical backgrounds.
  • Think about how to segment learners by level of expertise and learning style, then tailor the learning to those segments.

Transforming Microsoft’s enterprise IT infrastructure with AI
http://approjects.co.za/?big=insidetrack/blog/transforming-microsofts-enterprise-it-infrastructure-with-ai/
Wed, 21 Feb 2024


AI is changing everything at Microsoft, including our approach to core IT.

We in Microsoft Digital, the company’s IT organization, are using the advent of generative AI to reexamine and transform our entire IT infrastructure.

“We’ve crossed an important threshold with AI,” says Mark Sherwood, vice president of Infrastructure and Engineering Services in Microsoft Digital. “We’re now using it to transform all our core IT services, to make everything we do more efficient and secure.”

Sherwood and his team manage our core IT services, a massive enterprise IT estate that supports all of Microsoft’s business worldwide. Microsoft is an expansive universe of connected devices made up of hundreds of thousands of PCs and laptops, conference rooms, building IoT sensors, and personal devices—all dependent on a foundation of network connectivity and security to enable seamless access to the tools and services our employees rely on every day.

It’s clear that AI brings immense value to our IT infrastructure.

“This is a fascinating time to be working in IT,” Sherwood says. “We’re using AI across all of our services, and now we get to take that investment to the next level. Now it’s all about seeing what we can do with it.”

Aligning IT infrastructure innovation with the rest of the organization

The strategy for AI transformation in core IT infrastructure is one part of a larger vision for the impact of AI across all of Microsoft Digital.

“The potential for transformation through AI is nearly limitless,” says Natalie D’Hers, corporate vice president of Microsoft Digital. “We’re evaluating every service in our portfolio to consider how AI can improve outcomes, lower costs, and create a sustained competitive advantage for Microsoft and for our customers.”

We’re hyper-focused on our employee experience, and AI will be instrumental in shaping the future of how Microsoft employees interact with customers, the organization, and each other.

Transforming and securing our network and infrastructure

AI holds enormous potential across all of Microsoft Digital, but within IT infrastructure, the benefits of AI-enabled transformation play out across several specific pillars where we’re focusing our efforts: Device management, network infrastructure, tenant management, security, and the IT support experience.

Security

We can’t transform without adequate security. Properly implemented security controls and governance provide the secure foundation on which our engineering teams build solutions, and that security is especially relevant as we incorporate AI into our services and solutions.

Securing our network and endpoints is imperative, and our Zero Trust Networking efforts across our IT infrastructure provide essential protection against threats to our network security. AI will enhance the security and compliance of these efforts in our cloud and on-premises environments.

AI-based network assignment for devices will simplify network classification and provide more robust, risk-based isolation that quarantines risky devices and reduces unwanted lateral movement across the network.

We’re automating access controls for our wired and wireless networks to improve security effectiveness. AI-infused processes for analyzing device vulnerabilities, detecting anomalous firewall traffic flow, and diagnosing other network incidents will play a critical role in our continued shift toward the internet as our primary network transport.
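The post doesn’t detail how anomalous firewall traffic is detected, but one minimal illustration of the idea is flagging flows whose byte volume deviates sharply from a recent baseline. The sketch below is purely illustrative (the field names, threshold, and sample data are invented, and a production system would use far richer features than byte counts):

```python
from statistics import mean, stdev

def flag_anomalous_flows(baseline_bytes, new_flows, z_threshold=3.0):
    """Flag flows whose byte volume is a statistical outlier vs. a baseline.

    baseline_bytes: byte counts observed during normal traffic
    new_flows: list of (flow_id, byte_count) tuples to evaluate
    Returns the flow_ids whose z-score magnitude exceeds the threshold.
    """
    mu = mean(baseline_bytes)
    sigma = stdev(baseline_bytes)
    anomalies = []
    for flow_id, nbytes in new_flows:
        z = (nbytes - mu) / sigma if sigma else 0.0
        if abs(z) > z_threshold:
            anomalies.append(flow_id)
    return anomalies

# Hypothetical baseline and two candidate flows, one wildly oversized.
baseline = [1000, 1100, 950, 1050, 980, 1020, 1010, 990]
flows = [("flow-1", 1005), ("flow-2", 50000)]
print(flag_anomalous_flows(baseline, flows))  # ['flow-2']
```

Real AI-infused detection would learn per-device, per-time-of-day baselines rather than a single global one, but the core contrast between expected and observed behavior is the same.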

We anticipate that AI-supplemented capabilities in Microsoft 365’s multi-tenant organization feature will help us meet our ever-changing network segmentation needs by maintaining tenant separation and enabling secure cross-tenant collaboration when required.

AI will help us manage third-party app access and revolutionize how we understand user interactions with applications across managed devices or SaaS platforms. We’ll increase access efficiency and reduce costs by capturing third-party app usage and needs more accurately, using AI to determine the how, why, and when of user access.

Intelligent infrastructure

Sherwood (left to right), Apple, Selvaraj, and Suver appear in a composite image.
Mark Sherwood (left to right), Pete Apple, Senthil Selvaraj, and Phil Suver were part of the team incorporating AI into Microsoft Digital’s vision for core IT.

Software-defined networking and infrastructure code are already transforming how we approach networking, but AI amplifies the benefits radically.

AI enables us to build data-driven intelligence into network infrastructure, engineering, and operations. AI-driven processes will help us eliminate configuration drift, comply with security policies, reduce operator errors, and efficiently respond to rapidly changing business needs.
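At its simplest, eliminating configuration drift reduces to diffing a declared (intended) configuration against the state actually observed on a device, then remediating the differences. A hedged sketch of that diff step, with invented config keys:

```python
def detect_drift(desired: dict, actual: dict) -> dict:
    """Compare desired vs. actual config and report drifted settings.

    Returns a mapping of setting -> (desired_value, actual_value) for every
    setting that is missing, extra, or different on the device.
    """
    drift = {}
    for key in desired.keys() | actual.keys():
        want = desired.get(key)
        have = actual.get(key)
        if want != have:
            drift[key] = (want, have)
    return drift

# Hypothetical network-device settings.
desired = {"vlan": 42, "snmp": "disabled", "ntp_server": "10.0.0.1"}
actual = {"vlan": 42, "snmp": "enabled", "mtu": 9000}
print(detect_drift(desired, actual))
```

Infrastructure-as-code pipelines run this kind of comparison continuously; the AI layer described above adds judgment on top, such as prioritizing which drift to remediate first.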

We’re implementing AI-driven automation to simplify resource management and deployment, capitalizing on the flexibility provided by software-defined networking and infrastructure as code.

AI will assist with generating code designs, defining and managing network configurations, managing deployments, conducting pre- and post-deployment verifications, and assisting with change management over time. Near real-time streaming telemetry from network devices will form the foundation to guide operation and continuous improvement.

We’re improving network self-healing capabilities by using AI to detect and remediate network issues, creating a more reliable, resilient, and elastic network environment and reducing human intervention and potential for error.

One of our current projects is creating an AI-based assistant app for our direct engineering teams that mines and analyzes our current network infrastructure catalog, providing an advanced set of capabilities that supplement our engineers’ expertise in the field. The assistant app improves productivity and mitigation time for network infrastructure incidents. The AI component is trained on more than 200,000 prior incidents for anomaly detection and predictive analytics. We’re confident it will lead to a considerable reduction in network outages and maintenance costs.
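The article doesn’t describe the assistant’s retrieval mechanics, but a common building block for mining a catalog of prior incidents is similarity search: given a new incident description, surface the most similar past incident and its fix. A toy sketch using token-overlap (Jaccard) similarity — a real system would use learned embeddings — with invented incident data:

```python
def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two token sets (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def most_similar_incident(query: str, past_incidents: list) -> dict:
    """Return the past incident whose description best matches the query."""
    q_tokens = set(query.lower().split())
    return max(
        past_incidents,
        key=lambda inc: jaccard(q_tokens, set(inc["description"].lower().split())),
    )

# Hypothetical incident catalog entries.
past = [
    {"id": "INC-1", "description": "bgp session flapping on edge router", "fix": "stabilize peer link"},
    {"id": "INC-2", "description": "wireless controller certificate expired", "fix": "renew certificate"},
]
hit = most_similar_incident("edge router bgp flapping again", past)
print(hit["id"])  # INC-1
```

Retrieval like this is what lets an assistant ground its suggestions in the 200,000-incident history rather than answering from general knowledge alone.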

Device management

With more than 1 million interconnected devices to manage, AI-powered capabilities will significantly improve our device management practices, with a focus on user and administrator workflows.

We’re implementing intelligent device recommendations to ensure our employees have the best tools to do their work. Building AI into a centralized device lifecycle management tool will create efficiencies in procurement, tracking, and responsible device recycling.

We’re designing AI-powered predictive maintenance and intelligent troubleshooting to reduce device-related issues significantly. AI-enabled device maintenance schedules and tasks will automate the device management process and reduce the load on our IT help desk by correcting device issues before they become user problems, reducing device-related helpdesk incidents.

Across our vast scope of device management, many alerts and tickets contain information or fixes that our helpdesk engineers can use in other situations. We’re employing AI to generate device insights by analyzing a massive set of signals, including device configurations, network traffic, vulnerabilities, and user behavior. These insights will power more informed decisions across the device management portfolio, including device replacement, software updates, and capacity increases.
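As one illustration of turning device signals into prioritized insights, consider a scoring function that ranks devices for attention. The signal names and weights below are hypothetical; a production system would learn weights from historical outcomes rather than hard-code them:

```python
def risk_score(device: dict) -> float:
    """Combine a few device signals into a single attention-priority score.

    Weights are illustrative only.
    """
    score = 0.0
    score += 2.0 * device.get("open_vulnerabilities", 0)  # unpatched CVEs
    score += 0.5 * device.get("crash_events_30d", 0)      # recent instability
    if device.get("os_outdated", False):                  # overdue OS update
        score += 3.0
    return score

# Hypothetical fleet snapshot.
fleet = [
    {"name": "laptop-a", "open_vulnerabilities": 0, "crash_events_30d": 1, "os_outdated": False},
    {"name": "laptop-b", "open_vulnerabilities": 3, "crash_events_30d": 4, "os_outdated": True},
]
ranked = sorted(fleet, key=risk_score, reverse=True)
print([d["name"] for d in ranked])  # ['laptop-b', 'laptop-a']
```

The ranked list is what drives downstream decisions like device replacement, software updates, or capacity increases.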

We have more than 100,000 IoT devices on our corporate network. AI-automated IoT device registration will create more robust and efficient IoT device management, tracking, and security.

AI and machine learning will help us aggregate meeting and call data for device monitoring across personal devices, Microsoft Teams meeting rooms, networks, IoT devices, and Microsoft 365, improving and safeguarding the user experience.

Tenant management

Our cloud tenants in Microsoft Azure, Microsoft 365, Dynamics 365, and the Power Platform are among those platforms’ largest and most complex implementations. Our internal implementation includes more than 205,000 teams in Microsoft Teams, 534,000 SharePoint sites, 430,000 Microsoft Exchange mailboxes, 93,000 Power Apps, 5,000 Viva Engage communities, and a massive 25,000 Microsoft Azure subscriptions.

It’s a lot to manage, and AI will improve how we do it.

In tenants of our size, unmanaged assets can lead to unnecessary costs. Our asset compliance and lifecycle management processes will include an AI-powered compliance assistant that informs tenant users and owners, recommends assets for deletion, and proactively identifies areas of high risk for the tenant. Through the assistant, tenant admins gain an all-up view of compliance status and can investigate and resolve issues more granularly.
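One piece of such an assistant, recommending stale assets for deletion, reduces to flagging assets whose last activity falls outside a retention window. A minimal sketch, assuming a 180-day window and invented asset records:

```python
from datetime import date, timedelta

def stale_assets(assets, today, max_idle_days=180):
    """Return the names of assets idle longer than the retention window."""
    cutoff = today - timedelta(days=max_idle_days)
    return [a["name"] for a in assets if a["last_activity"] < cutoff]

# Hypothetical SharePoint-site records.
assets = [
    {"name": "team-site-old", "last_activity": date(2023, 1, 10)},
    {"name": "team-site-active", "last_activity": date(2024, 1, 20)},
]
print(stale_assets(assets, today=date(2024, 2, 1)))  # ['team-site-old']
```

The AI layer described above would add judgment on top of rules like this, for example weighing ownership, content sensitivity, and usage patterns before recommending deletion.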

AI is also simplifying and streamlining our license management processes. We adhere to precise rules and regulations, which result in complex access scenarios across different countries and regions. AI will bolster our ability to detect and remediate non-compliant tenants amidst this complexity.

IT support

We’re poised to transform how Microsoft employees interact with our support services using generative AI.

Our employees interact with Microsoft support services in a complex, global hybrid environment. Our self-help solution using Microsoft Azure OpenAI will enable contextual and human-like conversation and support in the employee’s local language. Our chat and incident summarization tools will use AI to summarize incidents and provide context when assisted support is necessary.

We’re infusing our support ticketing systems with AI capability for forecasting support requirements and proactively checking the health of devices to reduce issues and improve resource planning and response times.
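Forecasting support requirements can start as simply as a moving average over recent ticket history, which more sophisticated models then refine with seasonality and trend. A minimal sketch with invented daily counts:

```python
def moving_average_forecast(daily_tickets, window=3):
    """Forecast the next day's ticket volume as the mean of the last `window` days."""
    recent = daily_tickets[-window:]
    return sum(recent) / len(recent)

# Hypothetical daily ticket counts for the past week.
history = [120, 130, 125, 140, 135, 145]
print(moving_average_forecast(history))  # 140.0
```

Even a baseline this simple is useful for sizing helpdesk staffing; the AI capability described above improves on it by incorporating device-health signals to anticipate spikes before they happen.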

Transforming our IT infrastructure as Customer Zero

As Customer Zero for Microsoft, we pilot and deploy new products and capabilities in our IT infrastructure before releasing them externally. Our scale, size, and knowledge of our products and services enable us to envision connected experiences across large enterprises, manage complex combinations of product use cases, and engineer solutions on top of our product platforms.

AI improves our role as Customer Zero by accelerating insights and improving time-to-value. We’re using AI capabilities to capture, review, analyze, and report on the most important and actionable insights from the Customer Zero experience. We’re also using AI to rework our processes, regulatory compliance checks, security reviews, and deployment practices within the Customer Zero environment.

Looking forward

It’s almost impossible to envision a future for corporate IT infrastructure without AI. Our active planning for AI in our infrastructure is continually evolving, and we’ve only just begun our implementation. We’re positioning Microsoft to be a catalyst for innovation, and we’re committed to innovating with AI to streamline our IT operations.

“We will continue to infuse AI into every dimension of our enterprise portfolio,” Sherwood says. “We’ll continue to identify new opportunities for building AI-powered applications and services that improve how we deliver IT services to the company.”

By showcasing our progress with AI capabilities, we aim to transform our approach to AI internally here at Microsoft and to fuel a similar transformation across the IT sector.

Key Takeaways

Here are four important steps you can take to transform your IT infrastructure with AI:

  • Make device handling smarter with AI. Use AI to manage all devices better, helping to fix problems before they affect people and easing the workload for your IT team.
  • Use AI to improve the network. Integrate AI into the network system to make it more intelligent and more adaptable, which helps reduce downtime and facilitates faster and easier changes.
  • Manage cloud services better with AI. AI can help keep track of cloud services, ensuring everything is used properly and securely.
  • Boost security and helpdesk with AI. Enhance safety and helpdesk services using AI, leading to better network protection and quicker, more effective support for employees when they need it.

Deploying Microsoft 365 Copilot internally at Microsoft
http://approjects.co.za/?big=insidetrack/blog/deploying-copilot-for-microsoft-365-internally-at-microsoft/
Mon, 05 Feb 2024


We have deployed Microsoft 365 Copilot to nearly all our employees and vendors at Microsoft. As Customer Zero, we are the first enterprise to do so at scale. As of this week, nearly everyone at the company has it.

We’re seeing immediate and wide-ranging benefits.

“Our employees are putting it to work for them right away,” says Claire Sisson, a principal group product manager in Microsoft Digital, the internal IT organization at Microsoft. “It immediately starts giving you smart insights that help you get on top of your work.”

So, what is Microsoft 365 Copilot, exactly?

It’s our new productivity solution that uses the power of AI and large language models (LLMs) to help employees work more effectively. It’s part of the broader Microsoft Copilot offering that creates a single, seamless user experience across Microsoft 365, Copilot in Windows, Bing, the Microsoft Cloud, and other Microsoft applications.

Here at Microsoft, we got Microsoft 365 Copilot early as part of our role as the company’s Customer Zero, which allows us to test our products at enterprise scale.

“We’re pioneers in using AI in our products,” says Sisson, who is leading the rollout of Microsoft 365 Copilot internally at Microsoft. “We’re the first enterprise in the world to launch Copilot at scale, and we’re capturing a ton of insights.”

Sharing the feedback that we get from our early adopters is helping the company create the best possible version of Microsoft 365 Copilot for our customers.

“We’re learning so much,” Sisson says. “We’re learning how to deploy it at enterprise scale, how to make sure we put in the right protections for our employees and for Microsoft, and then ensuring our employees start using it in productive ways that reap all the new benefits of Copilot.”

Microsoft 365 Copilot gets its power from seamlessly integrating with data in Microsoft Graph and the various Microsoft 365 applications and services that we all—our employees and our customers—use to get work done, including Word, Excel, PowerPoint, Outlook, and Teams.

It helps you in numerous ways, ranging from summarizing action items from an important meeting, to helping you conduct on-the-spot analysis of data that has just come your way, to instantly crafting high quality content that helps you land an important project.

“Its capacity to boost productivity and produce high quality work for our employees enables them to dedicate their attention to their core work and big aspirations,” Sisson says. “And frankly, that’s what’s happening—our employees are getting time back to work on the things they love, and that’s the high value work that drives this company forward.”

[Learn how we’re getting the most out of generative AI at Microsoft with good governance. See how we’re responding to the AI revolution with an AI Center of Excellence. Check out how we’re embracing AI at Microsoft with new AI certifications. Find out how we’re enabling and securing Microsoft Teams meeting data retention at Microsoft.]

Succeeding as Customer Zero

We know that Microsoft 365 Copilot is evolving day by day. As the company’s IT organization, we work to understand those changes, document them, and translate them into communications and readiness assets for our employees. As we work to build a better product, the speed of these cycles accelerates awareness and improvement.

Using Microsoft 365 Copilot

Each of the main Microsoft 365 applications has a Copilot that you can use. Here are some of the ways you can use Copilot in each of the apps:

Copilot in Teams

  • Quickly recap, identify follow-up tasks, create agendas, and ask questions for more effective and focused meetings and calls.
  • Summarize key meeting takeaways, see what you might have missed, and pinpoint key people of interest in chat threads you were added to.

Business Chat in Microsoft 365 Copilot

  • Find and use info that’s buried in documents, presentations, emails, calendar invites, notes, and contacts.
  • Summarize information across multiple sources like recent customer interactions, meetings, shared content, and deliverables.
  • Prepare for a meeting based on topics and sources.
  • Create a status update from the day’s meetings, emails, and chat discussions.

Copilot in PowerPoint

  • Transform existing written content (such as documents, images, and graphs) into decks complete with speaker notes and sources.
  • Start a new presentation from a prompt or outline.
  • Condense and streamline presentations.
  • Use natural language commands to adjust layouts, reformat text, and time animations.

Copilot in Word

  • Write, edit, summarize, and create content in Word.
  • Create a first draft, bringing in information from across your organization as needed.
  • Add content to existing documents, summarize text, and rewrite sections or the entire document.
  • Incorporate suggested themes, styles, and tone.
  • Provide suggestions to strengthen arguments or smooth out inconsistencies.

Copilot in Excel

  • Query your data set in natural language, not just formulas.
  • Reveal correlations, propose what-if scenarios, and suggest new formulas based on your questions.
  • Generate models based on questions via natural language.
  • Identify trends, create visualizations, or ask for recommendations and insights.

Copilot in Outlook

  • Get help finding the best time for everyone to meet.
  • Compose more effective emails with suggestions from Copilot.
  • Analyze, summarize, and get action items from email threads.
  • Respond to existing emails with simple prompts and turn quick notes into new email messages.

Copilot in Loop

  • As Microsoft Loop pages get filled with ideas and content, ask Copilot to summarize themes and actions.
  • Edit or correct Copilot summaries, add additional details or context, and communicate with others.
  • Collaborate with colleagues in real time in the same Workspace.

Copilot in Whiteboard

  • Generate and refine new ideas on any topic.
  • Visualize any idea and use Microsoft Designer to generate designs.
  • Categorize unorganized bullet lists and notes, including sticky notes, into organized groups based on themes.
  • Summarize all note content on the board and create a summary output as a Loop component for further collaboration.

Microsoft 365 Copilot brings a robust list of new capabilities to each Microsoft 365 application.

“We’re constantly engaging our employees,” Sisson says. “Every application within Microsoft 365 has its own product team developing Copilot features and experiences. We’re working with these teams to provide valuable feedback, and while doing that, we, like our customers, benefit from the product getting better and better.”

As the company’s Customer Zero, it’s our job to make sure that our products are human-centric, and that using them makes sense in consistent and coherent ways. Culture, religion, and political beliefs are a few of the considerations we’re using to evaluate how Microsoft 365 Copilot works for and interacts with our employees. To help with this, Copilot is available in eight languages, with a plan to add more shortly.

“The level of complexity that we’re dealing with is very high,” Sisson says. “We’re capturing and responding to very diverse feedback from employees who are spread across the globe.”

We also prioritize trust, security, and compliance. We’re working with the product teams, a broad set of regulatory bodies, and regional works councils to ensure that we’re looking out for the best interests of our employees and our company.

“The true potential of Microsoft 365 Copilot lies not just in its current capabilities but in its future possibilities,” Sisson says.

By continuously integrating feedback, staying attuned to global best practices, and harnessing the latest in AI research, Microsoft 365 Copilot is poised to redefine the boundaries of workplace productivity.

Works council collaboration

One of the ways we provide valuable Customer Zero feedback to the product group is via our partnership with our works councils in the European Union. They’re currently helping us examine the regulatory and ethical boundaries for Copilot usage across the company. Our works councils and other employee representation groups are providing valuable feedback on how AI should work to augment human capabilities, not replace them. They’re evaluating our internal deployment and considering Microsoft 365 Copilot use cases to ensure compliance with local laws as well as how AI can increase productivity and job satisfaction among their members. They’ve also provided critical insights that help us respect cultural nuances and ensure language accessibility.

“Our strong relationship with our works council in Germany helped us get the approvals we needed to deploy Copilot quickly,” says Anna Kopp, director of business programs for Microsoft Digital in Germany. “Because of our proactive outreach, our works council almost immediately moved into a state of ‘tolerance’ for Copilot—which basically means that they haven’t blessed it, but they’re not blocking it either.”

It was about working together early and often.

“We gave our works council members early access to Microsoft 365 Copilot so they could test it and give feedback,” Kopp says. “Their feedback helped us improve the product in numerous ways, especially around potential uses for behavioral assessment, which is not allowed in Germany. Giving them early access enhanced trust and enabled them to help us improve the product for all our customers with works councils.”

Getting governance right

While AI technology has recently generated considerable attention and excitement, questions and concerns also exist. Our teams have strong governance measures in place that aim to protect our employees and the company while not getting in the way of our powerful new AI tools. We build trust and are transparent with our employees as part of adhering to our Responsible AI and governance guidelines and are committed to working with our global employees who are represented by labor unions and works councils.

“It’s important that our employees—and the larger Microsoft 365 Copilot user base—understand that we’re implementing AI as an enhancer and accelerator, not a replacer,” Sisson says. “Successful use of AI lies in helping humans make better decisions, not in replacing those decisions altogether.”

We’ve done a lot of work to put strong governance measures in place here at Microsoft, and that early work is helping us to deploy Microsoft 365 Copilot quickly.

Three tips for accelerating works council reviews of Microsoft 365 Copilot:

  1. Give your works council members early access to the product. This gives them time to understand how it works, and they’ll be able to give their seal of approval much more quickly if they can test it out themselves.
  2. Hold regular office-hours calls with specialists who can answer technical questions, working with your Microsoft account team and your own IT staff. Monitoring the conversation and staying close helps you resolve misconceptions early.
  3. Define a learning path using available materials, for example from Microsoft Viva Learning or LinkedIn Learning, that covers what generative AI is and how to prompt it properly. We all need to learn how to interact with this new technology to get the excellent results and time savings we want, and Microsoft provides plenty of free training and courses to help.

“Microsoft 365 Copilot governance and compliance lies within the underlying data that it uses to deliver its insights and take action,” says David Johnson, the principal PM architect in Microsoft Digital who leads our governance efforts. “The data discovery that Copilot does is built on a good data governance structure. We have strong permission systems in place that make sure our AI tools don’t share insights based on data that an employee shouldn’t have access to.”

Johnson and his team are working with the Microsoft 365 Copilot product teams to ensure that these data governance practices and standards are used across all Copilot deployments. This includes deploying governance measures—highlighted by our robust and matrixed labeling measures—designed to prevent oversharing of data, to educate employees on appropriate AI use cases, and to continually assess how Copilot uses our data and how we use the data that Copilot is serving up to us.

“It’s all working together to create an environment where Microsoft 365 Copilot can be a trusted, practical companion for productivity and empowerment for our employees,” Johnson says.

Showcasing the power of AI

Getting Microsoft 365 Copilot in the hands of our employees is only the first step—now we want to get them to start using it in the flow of their work. It’s about showing them that they can use it to do more high-level work and have more impact by doing things for them like analyzing Excel data sets and creating PowerPoint presentations from Word documents.

We use a robust, tried-and-true adoption approach built on three pillars: envisioning, where you build a strong plan for getting your employees to adopt a new product; onboarding, where you launch your adoption effort; and driving value, where you help your employees find value in the product.

Get your prompt on

Using the right prompt is the key to getting the most out of Microsoft 365 Copilot.

Prompting Microsoft 365 Copilot is the process of giving instructions or asking questions to Copilot in natural language. You can prompt Copilot by typing your request in the Copilot window. Copilot then responds with relevant output, such as text, images, graphs, or actions. To prompt Copilot effectively, follow these best practices:

Use the default prompts provided for better results

The prompts that Microsoft 365 Copilot suggests have been designed to provide clear instruction for Copilot to follow. Choose from a variety of prompts for different types of documents, such as reports, proposals, summaries, and more. You can also customize the prompts to suit your preferences and goals.

Use clear and specific language

This helps Copilot understand your request and provide a more accurate response. For example, instead of asking “How do I write a good email?”, you can ask “How do I write a formal email requesting a meeting with a client?”

Provide as much context as possible

The more information you provide, the better Copilot can tailor its response to your needs. For example, you can provide the purpose, audience, tone, and format of your document, as well as any relevant details or examples. You can also attach or link any existing documents or sources that you want Copilot to refer to. More details and examples are always better.

Experiment and use prompts that work best for you

Copilot is designed to learn from your interactions and improve over time. You can try different ways of phrasing your prompts or ask Copilot to generate multiple outputs for comparison. You can also provide feedback to Copilot by rating, editing, or commenting on its outputs, which helps make each engagement better than the last.

Review our prompt guidance

You can access a prompt-guidance infographic by selecting the Help icon in the Copilot window. The infographic explains the components and structure of a good prompt, along with tips and examples.

By following these best practices, new users can optimize their experience with Microsoft 365 Copilot and get the most out of its capabilities. See Microsoft’s prompt guidance for more information on Microsoft 365 Copilot prompting.

In the end, the best way to get your employees to start using a new product like Microsoft 365 Copilot is to show them how it can make their lives better.

“Copilot represents a profound shift in how our employees are working,” says Stephan Kerametlian, a director for employee experience in Microsoft Digital. “They’re seeing benefits immediately.”

For starters, they’re now using natural language to unlock tools that have been right in front of them for a long time—they just didn’t know that they were there, or if they did, how to use them.

“Most people only use a small fraction of the features built into Microsoft 365 apps,” Kerametlian says. “Copilot users don’t need to know precisely where a feature is in the app’s interface—they don’t even need to know that it’s there. They just have to imagine what they want to accomplish, and Copilot will surface the feature for them.”

As a result, our employees are discovering new and creative ways to work.

Microsoft 365 Copilot boosts our employees’ productivity by taking care of their mundane tasks and repetitive processes. “That lets them focus on ideation, creativity, and strategy, the tasks where they can genuinely add value,” Kerametlian says.

It’s also upleveling our employees’ skill sets.

“Talking to Copilot using intuitive natural language interaction leads to easier discovery and accelerated learning,” Kerametlian says. “This makes our employees better at what they’re good at and helps them quickly become experts at what they’re learning. They can rapidly get comfortable with the handful of commands that help them learn the technology and boost their skill.”

In truth, Kerametlian says, Microsoft 365 Copilot is so powerful that it is unleashing a new era of employee creativity and productivity. “It’s such a natural thing to do, to ask it for help,” he says. “The more interesting thing will be to see what our employees do with it. We expect that we’re going to see some very creative, and surprising results.”

Look to the future

Sisson, Kopp, Johnson, and Kerametlian appear in a collage of corporate photos that have been joined together in one image.
Claire Sisson (left to right), Anna Kopp, David Johnson, and Stephan Kerametlian are among those in Microsoft Digital helping deploy Microsoft 365 Copilot internally.

The future vision for Microsoft 365 Copilot is bright. The measure of a tool’s success isn’t just in its deployment but its adoption, utility, and the tangible impact it brings to everyday tasks. Our Customer Zero emphasis on capturing user feedback from internal teams and external customers will be instrumental in shaping how we grow into using Microsoft 365 Copilot over time.

“We’re committed to the evolution of Microsoft 365 Copilot at Microsoft,” Sisson says. “As we move forward, our mission continues to be to facilitate a seamless, productive, and enriched work experience for our employees.”

We’re charting a path where technology and human ingenuity converge through continuous feedback, rigorous refinement, and adherence to best practices.

“We’re very excited about where we are right now,” Sisson says. “Microsoft 365 Copilot isn’t just a tool for our employees to use—it’s a testament to the future of work, a future where technology augments human potential, empowering every individual to achieve more.”

Key Takeaways

Here are some things to think about as you consider getting started with Microsoft 365 Copilot:

  • Copilot is a powerful AI assistant that can help you generate text and images, summarize documents, create presentations, and more using natural language prompts.
  • Copilot works across various Microsoft 365 apps such as Word, Outlook, PowerPoint, Teams, and OneNote. As soon as you have it, you can access Copilot by selecting the Copilot icon in the app.
  • Copilot respects your privacy and security. It only accesses the information that you have permission to view and doesn’t store any of your data. You can also review and edit the content that Copilot generates before using it.
  • Copilot is designed to enhance your creativity and productivity, not replace it. You should always check the facts, accuracy, and quality of the content that Copilot produces and use your own judgment and expertise to decide whether to use it or not.
  • Microsoft is constantly learning and improving Copilot based on your feedback.

If you want to learn more about Copilot and how to use it, you can check out the following resources:

  • Microsoft 365 Copilot help and learning: This is the official support page for Copilot, where you can find tutorials, videos, FAQs, and tips on how to get the most out of Copilot.
  • How to get ready for Microsoft 365 Copilot: This is a video from Microsoft Mechanics that explains how to prepare your organization for Copilot, including the prerequisites, licensing, and best practices.
  • Get ready for Microsoft 365 Copilot: Implementation and best practices: This is an article from OnMSFT that provides more details on how to implement and optimize Copilot for your organization, including how to encourage collaboration, emphasize security, and stay updated.
  • Copilot in OneNote: This is a page that shows how Copilot can help you improve your notetaking and supercharge your productivity in OneNote.
  • 7 Things you can do with Microsoft 365 Copilot and why you should use it: This is another article from OnMSFT that showcases some of the amazing things that Copilot can do for you, such as writing emails, summarizing documents, creating presentations, and more.

The post Deploying Microsoft 365 Copilot internally at Microsoft appeared first on Inside Track Blog.

How automation is transforming revenue processing at Microsoft http://approjects.co.za/?big=insidetrack/blog/how-automation-is-transforming-revenue-collection-at-microsoft/ Fri, 26 Jan 2024 15:05:35 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=4788 The Microsoft partner and customer network brings in more than $100 billion in revenue each year, most of the company’s earnings. Keeping tabs on the millions of annual transactions is no small task—just ask Shashi Lanka Venkata and Mark Anderson, two company employees who are leading a bid to automate what historically has been a […]

The post How automation is transforming revenue processing at Microsoft appeared first on Inside Track Blog.

The Microsoft partner and customer network brings in more than $100 billion in revenue each year, most of the company’s earnings.

Keeping tabs on the millions of annual transactions is no small task—just ask Shashi Lanka Venkata and Mark Anderson, two company employees who are leading a bid to automate what historically has been a painstakingly manual revenue transaction process.

“We support close to 50 million platform actions per day,” says Venkata, a principal group engineering manager in Microsoft Digital. “For a quarter-end or a month-end, it can double. At June-end, we’re getting well more than 100 million transactions per day.”

That’s a lot, especially when there cannot be any mistakes and every transaction must be processed in 24 hours.

To wrangle that high-stakes volume, Venkata and Anderson, a director on Microsoft’s Business Operations team, teamed up to expand the capabilities of Customer Obsessed Solution Management and Incident Care (COSMIC), a Dynamics 365 application built to help automate Microsoft’s revenue transactions.

[Learn more about COSMIC including where to find the code here: Microsoft Dynamics 365 and AI automate complex business processes and transactions.]

First tested in 2017 on a small line of business, the solution expanded quickly and was handling the full $100 billion-plus workload within one year.

That said, the team didn’t try to automate everything at once—it has been automating the many steps it takes to process a financial transaction one by one.

Anderson sits at his desk in his office.
Mark Anderson (shown here) partnered with Shashi Lanka Venkata from Microsoft Digital to revamp the way the company processes incoming revenue. Anderson is a director on Microsoft’s Business Operations team.

“We’re now about 75 percent automated,” Anderson says. “Now we’re much faster, and the quality of our data has gone way up.”

COSMIC is saving Microsoft $25 million to $30 million over the next two to three years in revenue processing cost. It also automates the rote copy-and-paste kind of work that the company’s team of 3,800 revenue processing agents used to get bogged down on, freeing them up to do higher value work.

The transformation that Anderson, Venkata, and team have been driving is part of a larger digital transformation that spans all Microsoft Digital. Its success has led to a kudos from CEO Satya Nadella, a well-received presentation to the entire Microsoft Digital organization, and lots of interest from Microsoft customers.

“It’s been a fantastic journey,” Anderson says. “It’s quite amazing how cutting edge this work is.”

Unpacking how COSMIC works

Partners transact, purchase, and engage with Microsoft in over 13 different lines of business, each with its own set of requirements and rules for processing revenue transactions (many of which change from country to country).

To cope with all that complexity, case management and work have historically been handled separately to make it easier for human agents to stay on top of things.

That had to change if COSMIC was going to be effective. “When we started, we knew we needed to bring them together into one experience,” Venkata says.

Doing so would make transactions more accurate and faster, but there was more to it.

“The biggest reason we wanted to bring them together is so we could get better telemetry,” he says. “Connecting all the underlying data gives us better insights, and we can use that to get the AI and machine learning we need to automate more and more of the operation.”

Giving automation its due

The first thing the team decided to automate was email submissions, one of the most common ways transactions get submitted to the company.

“We are using machine learning to read the email and to automatically put it in the right queue,” Venkata says. “The machine learning pulls the relevant information out of the email and enters it into the right places in COSMIC.”

The team also has automated sentiment analysis and language translation.
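The routing step described above can be sketched as a classifier that assigns each incoming email to a processing queue. This is a minimal, hypothetical illustration: the queue names and keyword lists are invented for the example, and a simple keyword scorer stands in for the machine learning model COSMIC actually uses.

```python
# Hypothetical sketch: routing an incoming transaction email to a
# processing queue. A keyword scorer stands in for the trained model;
# queue names and keywords are invented for illustration.

QUEUE_KEYWORDS = {
    "invoice-processing": {"invoice", "billing", "payment"},
    "license-requests": {"license", "activation", "subscription"},
    "general-triage": set(),  # fallback queue when nothing matches
}

def route_email(subject: str, body: str) -> str:
    """Pick the queue whose keywords best match the email text."""
    words = set((subject + " " + body).lower().split())
    best_queue, best_score = "general-triage", 0
    for queue, keywords in QUEUE_KEYWORDS.items():
        score = len(words & keywords)
        if score > best_score:
            best_queue, best_score = queue, score
    return best_queue

print(route_email("Invoice overdue", "Please confirm payment status"))
# → invoice-processing
```

In production the scorer would be replaced by a trained model, but the surrounding contract stays the same: text in, queue name out, with a fallback queue for anything the model can’t place confidently.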

What’s next?

Using a bot to start mimicking the work an agent does, like automatic data entry or answering basic questions. “This is something that is currently being tested and will soon be rolled out to all our partners using COSMIC,” he says.

How does it work?

When a partner submits a transactional package to Microsoft, an Optical Character Recognition bot scans it, opens it, checks to see if everything looks correct, and makes sure business rules are applied correctly. “If all looks good, it automatically gets routed to the next step in the process,” Venkata says.

The Dynamics workflow engine also is taking on some of the check-and-balance steps that agents used to own, like testing to see if forms have been filled out correctly and if information extracted out of those forms is correct.

“Azure services handle whatever has to be done in triage or validation,” he says. “It can check to see if a submission has the right version of the document, or if a document is the correct one for a particular country. It validates various rules at each step.”

All of this is possible, Venkata says, because the data was automatically extracted. “If, at any point the automation doesn’t work, the transaction gets kicked back for manual routing,” he says.
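The validate-or-fall-back pattern described above can be sketched as a list of rule functions applied in order, where any failure sends the transaction back for manual routing. This is a hypothetical illustration: the rule names, form identifiers, and country codes below are invented for the example and are not COSMIC’s actual checks.

```python
# Hypothetical sketch of rule-based triage with an automation fallback.
# Each rule checks one aspect of a submission; any failure routes the
# transaction to a human agent instead of the next automated step.

def has_required_document(submission: dict) -> bool:
    """Every submission must include a document."""
    return "document" in submission

def document_matches_country(submission: dict) -> bool:
    """Each country requires its own form variant (invented values)."""
    allowed = {"DE": "form-de-v2", "US": "form-us-v3"}
    return submission.get("document") == allowed.get(submission.get("country"))

VALIDATION_RULES = [has_required_document, document_matches_country]

def triage(submission: dict) -> str:
    """Return the next routing step: automated only if every rule passes."""
    for rule in VALIDATION_RULES:
        if not rule(submission):
            return "manual-routing"   # automation fallback
    return "next-automated-step"

print(triage({"country": "DE", "document": "form-de-v2"}))  # next-automated-step
print(triage({"country": "DE", "document": "form-us-v3"}))  # manual-routing
```

The design choice worth noting is that the fallback is the default path: automation only proceeds when every validation rule explicitly passes, which matches the behavior described by Venkata.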

As for the agents? They are getting to shift to more valuable, strategic work.

“The system is telling them what the right next action is going to be,” Venkata says. “Before this, the agent had to remember what to do next for each step. Now the system is guiding them to the next best action—each time a step is completed, the automation kicks in and walks the agent through the next action they should take.”

Eventually the entire end-to-end process will be automated, and the agents will spend their time doing quality control checks and looking for ways to improve the experience. “We want to get to the point where we only need them to do higher level work,” he says.

Choosing Dynamics 365 and Microsoft Azure

There was lots of technology to choose from, but after a deep assessment of the options, the team chose Dynamics 365 and Microsoft Azure.

“We know many people thought Dynamics couldn’t scale to an enterprise the size of Microsoft, but that’s not the case anymore,” Venkata says. “It has worked very well for us. Based on our experience, we can definitively say it can cover Microsoft’s needs.”

The team also used Azure to build COSMIC—Azure Blob Storage for attachments, Azure Cosmos DB for data archival and retention, Azure SQL Database for reporting, and Microsoft Power BI for data reporting.

Anderson says it’s a major leap forward to be using COSMIC’s automation to seamlessly route customers to the right place, handing them off from experience to experience without disrupting them.

Another major improvement is how the team has gained an end-to-end view of customers (which means the company no longer must ask customers what else they’re buying from Microsoft).

“It’s been a journey,” Anderson says. “It isn’t something we’ve done overnight. At times it’s been frustrating, and at times it’s been amazing. It’s almost hard to imagine how far we’ve come.”

Related links


Protecting against oversharing Power BI reports with Microsoft Sentinel http://approjects.co.za/?big=insidetrack/blog/protecting-against-oversharing-power-bi-reports-with-microsoft-sentinel/ Mon, 08 Jan 2024 23:12:52 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=12951 Microsoft Power BI is an essential tool for monitoring performance, identifying trends, and developing stunning data visualizations that many teams across Microsoft use every day. A well-built Power BI report can play a critical role in helping communicate business information efficiently and effectively. But with great Power BI reports comes great responsibility, which includes keeping […]

The post Protecting against oversharing Power BI reports with Microsoft Sentinel appeared first on Inside Track Blog.

Microsoft Power BI is an essential tool for monitoring performance, identifying trends, and developing stunning data visualizations that many teams across Microsoft use every day. A well-built Power BI report can play a critical role in helping communicate business information efficiently and effectively. But with great Power BI reports comes great responsibility, which includes keeping data and reports secure, and ensuring that only the right people have access to it.

Across Microsoft, we use Microsoft Purview Data Loss Prevention (DLP), which is now in general availability, to help secure our data. Purview DLP policies allow administrators to comply with governmental and industry regulations such as the European Union General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA), and automatically detect sensitive information to prevent data leaks. These policies can now also uncover data that might have accidentally been uploaded to Power BI without your knowledge.

While Purview’s controls ensure sensitive data is handled appropriately, we learned from customer research that sensitive data can be accidentally overshared with unauthorized individuals when large audience groups are inadvertently granted access to the report. This often happens when report owners grant access to Power BI reports without first checking who is authorized to view them—both inside and outside data boundaries.

We wanted to find a solution that would prevent this kind of unintentional oversharing and make it easy for Power BI administrators to set up, use, and configure.

— Prathiba Enjeti, senior program manager, Microsoft Digital Security and Resilience team

To address this problem, Microsoft Digital Security and Resilience collaborated with the Microsoft Sentinel product group to develop an out-of-the-box Microsoft Sentinel solution for Power BI reports to detect and respond to oversharing. Using the Power BI connector for Microsoft Sentinel, which is now available in preview, you can track user activity in your Power BI environment with Microsoft Sentinel using Power BI audit logs. This solution helps administrators to identify potential data leaks with automatically generated reports.

How it works

With Microsoft Sentinel playbook automation for Power BI detection, the SOC can achieve higher productivity and efficiency, saving analysts’ time and energy for investigative tasks.

— Prathiba Enjeti, senior program manager, Microsoft Digital Security and Resilience team

Enjeti faces the camera standing outside in a natural area.
Prathiba Enjeti is a senior security program manager on the Microsoft Security Standards and Configuration team.

Our oversharing detection logic uses Power BI audit logs, which are cross-referenced against Microsoft Sentinel-generated watchlists that track high-risk security groups. When a report is shared with a group that exceeds a specified number of users, the detection is triggered. Thresholds can be adjusted by administrators to suit any organization’s needs and policies.
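A minimal sketch of that detection logic might look like the following. The watchlist contents, group names, and threshold are invented for illustration; the real solution reads them from Microsoft Sentinel watchlists and Power BI audit logs.

```python
# Hypothetical sketch of the oversharing detection described above:
# a share event is flagged when the target group appears on a
# high-risk watchlist AND its membership exceeds a configurable
# threshold. All values here are illustrative.

HIGH_RISK_WATCHLIST = {"all-employees", "contractors-global"}

def is_oversharing(group: str, member_count: int, threshold: int = 100) -> bool:
    """Return True when a share should trigger the remediation playbook."""
    return group in HIGH_RISK_WATCHLIST and member_count > threshold

print(is_oversharing("all-employees", 5000))   # True -> trigger playbook
print(is_oversharing("finance-leads", 12))     # False -> no action
```

Keeping the threshold as a parameter mirrors the point in the article: administrators can tune it to their organization’s policies rather than relying on a fixed value.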

Additionally, we used the Microsoft Sentinel playbook to automate the remediation process. We configured it to automatically send email notifications containing remediation instructions to report owners. From our discussions with customers, we learned that some organizations preferred that accountability remain with the Power BI report owners for various periods of time to remediate, before escalating to the tenant administrators. To meet customer needs for flexibility, administrators can configure time spans ranging from instantaneous escalation, to hours, days, and weeks.

“With Microsoft Sentinel playbook automation for Power BI detection, the SOC can achieve higher productivity and efficiency, saving analysts’ time and energy for investigative tasks,” Enjeti says.

Automating how cases of data oversharing are found and fixed will allow IT administrators to detect, notify, and limit access to Power BI reports in real time. We’re excited to bring this Microsoft Sentinel solution to our customers, which will be available for public release soon.

Key Takeaways

Here are some suggestions for tackling oversharing at your company:

  • Oversharing of data is a problem that many organizations face, and many aren’t aware of its magnitude. If you don’t already, consider auditing the distribution and security groups your employees use to share information.
  • Understand where potential data loss issues might be occurring. Be sure to enable data loss prevention policies wherever possible.
  • Consider implementing detection and automated workflow solutions, such as the Microsoft Sentinel solution for Power BI report oversharing, to reduce manual effort and shorten the time it takes to identify and remediate oversharing.

Try it out

Try Microsoft Sentinel at your company.

Related links

We'd like to hear from you!

Want more information? Email us and include a link to this story and we’ll get back to you.

Please share your feedback with us—take our survey and let us know what kind of content is most useful to you.


Revolutionizing the rollout: Using product adoption data to streamline deployments http://approjects.co.za/?big=insidetrack/blog/revolutionizing-the-rollout-using-product-adoption-data-to-streamline-deployments/ Fri, 10 Nov 2023 17:00:27 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=10231 How do you turn data into knowledge? Once you have it, how to you turn that knowledge into action? Adopting a new product across an organization as large as Microsoft is an enormous task, and we’re always expanding our knowledge about how to roll out a new tool or platform more effectively. Product adoption data […]

The post Revolutionizing the rollout: Using product adoption data to streamline deployments appeared first on Inside Track Blog.

How do you turn data into knowledge? Once you have it, how do you turn that knowledge into action?

Adopting a new product across an organization as large as Microsoft is an enormous task, and we’re always expanding our knowledge about how to roll out a new tool or platform more effectively. Product adoption data is at the core of how we build that knowledge — and how we implement the strategy that goes along with it.

“When it comes to driving adoption, the key to success is to identify the insights we need to deliver success upfront,” says Nathalie D’Hers, corporate vice president of employee experience at Microsoft. “We define those success measures then ensure we have the right data to meet our objectives.”

[Check out our Microsoft Viva adoption guide. Visit our Microsoft Viva content suite to learn how work life is better at Microsoft with Viva. Learn how we’re creating the digital workplace at Microsoft.]

D’Hers poses for a portrait photo.
“When it comes to driving adoption, the key to success is to identify the insights we need to deliver success upfront. We define those success measures then ensure we have the right data to meet our objectives,” says Nathalie D’Hers, corporate vice president of employee experience at Microsoft.

The discipline of deployment

The transition to a modern, hybrid workplace has introduced new challenges into the product adoption process.

“The first challenge in a remote or hybrid environment is awareness because it’s really difficult to get users’ attention,” says Amy Ceurvorst, senior business program manager and adoption specialist with Microsoft Digital (MSD). “Now you’re competing against messages, chats, and emails to grab your employees’ attention, so we’ve had to try a bunch of different strategies to help users engage with a new product.”

Over time, we’ve learned some key lessons about driving effective adoption:

  • Change doesn’t happen automatically—Employees need encouragement and guidance, or they’ll simply stick with existing technology.
  • Buy-in from leadership matters—Engaged executives are effective champions for wide-reaching rollouts.
  • Targets need to be measurable—If it doesn’t get measured, it won’t get done.
  • Learn from the process and each other—Every adoption is an opportunity to build knowledge and improve processes, and collaboration provides valuable insights.

These principles have helped us build out an adoption process that flows naturally through envisioning, onboarding, and driving value by measuring success. Each stage of that journey involves thoughtful consideration, collaboration, and preparation.

The product adoption process at Microsoft, including three steps: envision, onboard, and drive value.
Product adoption at Microsoft breaks down into three key processes: envisioning, onboarding, and driving value.

Capturing the right data

But how do we ensure we’re making the most of the adoption journey? The answer is in the data.

The first challenge is securing the right data to meet our goals for each stage of the adoption journey, whether we’re building out our vision for a new tool, generating buy-in from executive sponsors, or demonstrating the success of a rollout.

That’s where our Employee Experience Insights team comes in. They’re responsible for looking at the experiences we support at Microsoft and collecting that data. Through collaboration with technical teams and change managers, they source and collocate the information we need to inform our deployments.

“We start with operational data from different sources, including services, applications, and devices,” says Miguel Uribe, principal product manager for Employee Experience Insights. “We put that together with enterprise data like employee profiles, support tickets, and survey results, and then use it to discover knowledge about how the user is perceiving the experience.”

Turning subjective, written responses into useful data can be a challenge, but Miguel’s team leverages machine learning and natural language processing to summarize and aggregate large swathes of feedback into sentiment scores and other useful information.
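The rollup pattern the team describes can be sketched in miniature. This toy example swaps their machine learning and NLP models for a simple keyword lookup, so everything here (the word lists, the scoring rule, the sample comments) is illustrative rather than a description of the real pipeline:

```python
from collections import Counter

# Illustrative only: a toy keyword-based sentiment scorer. The actual
# pipeline uses ML/NLP models; this sketch just shows the aggregation
# shape (many free-text comments in, one sentiment summary out).
POSITIVE = {"love", "great", "helpful", "easy", "fast"}
NEGATIVE = {"slow", "confusing", "broken", "hate", "difficult"}

def score_comment(text: str) -> int:
    """Return +1, -1, or 0 for a single free-text comment."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return (len(words & POSITIVE) > 0) - (len(words & NEGATIVE) > 0)

def aggregate_sentiment(comments: list[str]) -> dict:
    """Collapse many comments into one sentiment summary."""
    scores = [score_comment(c) for c in comments]
    counts = Counter(scores)
    return {
        "positive": counts[1],
        "negative": counts[-1],
        "neutral": counts[0],
        "net_score": sum(scores) / len(scores) if scores else 0.0,
    }

feedback = [
    "I love the new dashboard, it is fast!",
    "Search feels slow and confusing.",
    "Works as expected.",
]
summary = aggregate_sentiment(feedback)
```

In practice a trained sentiment model replaces `score_comment`, but the rollup itself stays the same shape.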

To protect employee privacy and ensure compliance with confidentiality standards, our data professionals work with our Human Resources and Privacy teams as well as our Corporate, External, and Legal Affairs team. They provide the guidance to ensure that user data is anonymized and aggregated so it doesn’t violate our employees’ privacy.
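One common way to honor that guidance is an aggregation threshold: individual rows are never reported directly, and any group too small to hide an individual is suppressed. The sketch below illustrates the idea; the threshold of five and the field names are assumptions for the example, not Microsoft policy:

```python
# Illustrative aggregation threshold: per-user rows are only reported as
# group rollups, and groups smaller than MIN_GROUP_SIZE are suppressed
# so individuals can't be singled out. The value 5 is an assumption.
MIN_GROUP_SIZE = 5

def aggregate_usage(rows: list[dict]) -> dict:
    """Roll per-user usage rows up to team level, suppressing small teams."""
    teams: dict[str, list[int]] = {}
    for row in rows:
        teams.setdefault(row["team"], []).append(row["minutes_used"])
    report = {}
    for team, values in teams.items():
        if len(values) < MIN_GROUP_SIZE:
            continue  # suppress: too few members to report safely
        report[team] = {"users": len(values),
                       "avg_minutes": sum(values) / len(values)}
    return report

rows = [{"team": "A", "minutes_used": m} for m in (10, 20, 30, 40, 50)]
rows += [{"team": "B", "minutes_used": 99}]  # one member -> suppressed
report = aggregate_usage(rows)
```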

You need to give leaders a reason to buy into a tool’s value, articulate how you’ll measure success, develop targets to track against, and provide specific actions they can take to be successful.

—Marcus Young, employee experience director, Microsoft Digital

The goal is to enable a HEART analytics model that provides insights across five aspects of usage: happiness, engagement, adoption, retention, and task success. The HEART framework is a set of user-centered metrics used to evaluate the quality of a user experience and to help measure the impact of UX changes. All of that data provides rich context, not just for guiding future rollouts, but for engaging sponsors throughout the ongoing adoption.
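As a rough illustration of what a HEART rollup can look like, the sketch below computes one proxy metric per dimension (happiness, engagement, adoption, retention, task success) from synthetic per-user records. The field names and metric definitions are assumptions; real products define each dimension against their own telemetry and surveys:

```python
# Illustrative HEART rollup from synthetic per-user records. Field names
# and metric definitions are assumptions for this sketch only.
def heart_metrics(users: list[dict]) -> dict:
    """Compute one simple proxy metric per HEART dimension."""
    n = len(users)
    active_now = [u for u in users if u["sessions_this_week"] > 0]
    active_before = [u for u in users if u["active_last_week"]]
    retained = [u for u in active_before if u["sessions_this_week"] > 0]
    attempted = sum(u["tasks_attempted"] for u in users)
    return {
        # Happiness: average survey score (e.g., CSAT on a 1-5 scale)
        "happiness": sum(u["survey_score"] for u in users) / n,
        # Engagement: average sessions among users who showed up at all
        "engagement": sum(u["sessions_this_week"] for u in active_now) / len(active_now) if active_now else 0.0,
        # Adoption: share of the population using the product this week
        "adoption": len(active_now) / n,
        # Retention: share of last week's actives who came back
        "retention": len(retained) / len(active_before) if active_before else 0.0,
        # Task success: completed tasks over attempted tasks
        "task_success": sum(u["tasks_completed"] for u in users) / attempted if attempted else 0.0,
    }

users = [
    {"survey_score": 4, "sessions_this_week": 5, "active_last_week": True,
     "tasks_attempted": 10, "tasks_completed": 9},
    {"survey_score": 2, "sessions_this_week": 0, "active_last_week": True,
     "tasks_attempted": 4, "tasks_completed": 1},
    {"survey_score": 5, "sessions_this_week": 3, "active_last_week": False,
     "tasks_attempted": 6, "tasks_completed": 5},
]
scores = heart_metrics(users)
```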

Inspiring sponsors to drive Microsoft Viva adoption

Our recent Microsoft Viva rollout is a powerful example of good data’s effect on executive sponsorship. When Viva became available, MSD led the platform’s adoption across our more than 200,000 employees. Check out our Microsoft Viva adoption guide to get tips for deploying Viva at your company.

“We knew from experience that active and visible sponsorship is the number one factor for success when you roll out a change,” says Marcus Young, employee experience director with MSD. “You need to give leaders a reason to buy into a tool’s value, articulate how you’ll measure success, develop targets to track against, and provide specific actions they can take to be successful.”

To meet these needs, the team combined usage data with gamification, automated recommendations, and a stylish user interface to let leaders know where their organizations stand within our Microsoft Viva rollout. We called it the Microsoft Viva Leader Index (VLI).

The Viva Leader Index, featuring an individual leader’s scorecard for the adoption of different Microsoft Viva modules.
The Microsoft Viva Leader Index helped drive Microsoft Viva adoption by gamifying usage and providing recommendations to deepen employee engagement.

Through this tool, any leader managing more than 20 people can navigate to an internal website that automatically detects their role and organization. The VLI then delivers an all-up view of Microsoft Viva usage across that leader’s team, including information about individual modules, how their team stacks up against their peers and the entire organization, and automated recommendations for how they can advance adoption.

This simple and engaging tool provides everything a leader needs to gauge their organization’s adoption success and work toward their goals. It’s a powerful example of what data can do when paired with purpose and personalization.
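The comparison logic behind a scorecard like the VLI can be sketched simply: rank a team's per-module adoption against its peers and attach a recommendation to the weakest module. The module names, adoption rates, and recommendation text below are invented for the example:

```python
# Hypothetical sketch of leader-scorecard comparison logic. Module names,
# rates, and the canned recommendation are invented for illustration.
def build_scorecard(team: dict, peer_teams: list[dict]) -> dict:
    """team / peer_teams map module name -> adoption rate (0.0-1.0)."""
    card = {}
    for module, rate in team.items():
        peer_avg = sum(p[module] for p in peer_teams) / len(peer_teams)
        card[module] = {"team": rate, "peer_avg": round(peer_avg, 2),
                        "ahead_of_peers": rate >= peer_avg}
    # Attach an automated recommendation for the lowest-adoption module.
    weakest = min(team, key=team.get)
    card["recommendation"] = (
        f"Focus next on {weakest}: run a one-minute demo in your team meeting."
    )
    return card

team = {"Viva Insights": 0.62, "Viva Learning": 0.35}
peers = [{"Viva Insights": 0.50, "Viva Learning": 0.55},
         {"Viva Insights": 0.70, "Viva Learning": 0.45}]
card = build_scorecard(team, peers)
```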

Measuring impact and managing knowledge

All throughout the rollout process, adoption specialists from our Employee Experience Success team actively consider the cycle of envisioning, onboarding, and driving value. As a deployment matures, they can use data to inform the process of their next rollout.

Ceurvorst, Young, and Uribe pose for portraits in this composite image.
Amy Ceurvorst, Marcus Young, and Miguel Uribe (right) are part of a cross-team effort to leverage the power of product adoption data to enhance rollouts at Microsoft.

The Employee Experience Insights team collaborates with change management practitioners to incorporate adoption activities like localized comms, office hours, or sponsor influencing through our Early Adopter Program. That way, they can find correlations between adoption activities and spikes in usage to identify which methods drive the best results.

In Microsoft Viva’s case, we discovered that attention-grabbing, one-minute demos and direct-to-employee communications through Microsoft Teams were the most effective ways to boost adoption. Insights from this test-and-learn process mean our adoption specialists will be more prepared to support the next rollout.
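The correlation idea (did usage spike after an adoption activity?) can be approximated with a simple before-and-after comparison. This sketch ignores the trends and seasonality a real analysis would control for, and the numbers are synthetic:

```python
# Illustrative "which activity moved the needle?" check: compare average
# daily usage after an adoption activity against the baseline before it.
def activity_lift(daily_usage: list[int], activity_day: int,
                  window: int = 3) -> float:
    """Ratio of mean usage after the activity to mean usage before it."""
    before = daily_usage[max(0, activity_day - window):activity_day]
    after = daily_usage[activity_day:activity_day + window]
    return (sum(after) / len(after)) / (sum(before) / len(before))

# Synthetic daily active users; a one-minute demo went out on day 4.
usage = [100, 102, 98, 100, 140, 150, 145, 120]
lift = activity_lift(usage, activity_day=4)  # > 1.0 suggests a spike
```

A lift well above 1.0 across several rollouts is the kind of signal that flagged short demos and Teams communications as the most effective tactics.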

“It’s hard to compete for people’s time,” Ceurvorst says. “So we’ve learned that we need to work hard to get into the sessions and spaces where they already work.”

On the product side, getting granular also supports a better development process and user experience moving forward. That depends on asking the right questions.

“Is there something happening making usage surge, or do we have significant drop-offs at certain times of year?” Ceurvorst asks. “There’s so much value in data from feature requests, bugs, and anonymized usage behaviors, and it helps us come up with insights we can share back to product groups.”

When our teams work together to capture effective adoption data, it produces meaningful impacts on the employee experience — from change management best practices to executive sponsorship, right through to product development. That means being intentional about the kind of information we capture and how we put it into action for ongoing and future adoption programs.

It’s all about creating a virtuous cycle, driven by data.

Key Takeaways

Here are some tips that you can use to build a plan for driving Microsoft Viva adoption at your company:

  • Identify what success looks like, how to quantify it, and how to distribute accountability for that success to your leaders.
  • Have the right data because it makes ROI powerfully tangible.
  • Be diligent about advocating for your data needs with your data team.
  • Deliver data with a human touch to energize sponsors and stakeholders.
  • Ensure there is a plan and resourcing in place for collecting high-quality data signals.
  • Work with legal professionals to understand the security and privacy rules and regulations around data.
  • Identify who the primary users of your success measurement reporting will be, such as adoption specialists, and partner with them to define and design reports.

Related links

The post Revolutionizing the rollout: Using product adoption data to streamline deployments appeared first on Inside Track Blog.

]]>
Transforming data governance at Microsoft with Microsoft Purview and Microsoft Fabric http://approjects.co.za/?big=insidetrack/blog/transforming-data-governance-at-microsoft-with-microsoft-purview-and-microsoft-fabric/ Tue, 19 Sep 2023 18:40:34 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=12172 Data is an invaluable asset for all businesses. Over recent years, the exponential growth of data collection and ingestion has forced most organizations to rethink their strategies for managing data. Increasing compliance requirements and ever-changing technology prevent anyone from simply leaving their enterprise data in its current state. We’re accelerating our digital transformation with an […]

The post Transforming data governance at Microsoft with Microsoft Purview and Microsoft Fabric appeared first on Inside Track Blog.

]]>
Microsoft Digital technical stories

Data is an invaluable asset for all businesses. Over recent years, the exponential growth of data collection and ingestion has forced most organizations to rethink their strategies for managing data. Increasing compliance requirements and ever-changing technology prevent anyone from simply leaving their enterprise data in its current state.

We’re accelerating our digital transformation with an enterprise data platform built on Microsoft Purview and Microsoft Fabric. Our solution addresses three essential layers of data transformation:

  • Unifying data with an analytics foundation
  • Responsibly democratizing data with data governance
  • Scaling transformative outcomes with intelligent applications

As a result, we’re creating agile, regulated, and business-focused data experiences across the organization that accelerate our digital transformation.

[Unpack how we’re deploying a modern data governance strategy internally at Microsoft. Explore how we’re providing modern data transfer and storage service at Microsoft with Microsoft Azure. Discover how we’re modernizing enterprise integration services at Microsoft with Microsoft Azure.]

Accelerating responsible digital transformation

Digital transformation in today’s world is not optional. An ever-evolving set of customer expectations and an increasingly competitive marketplace prohibit organizations from operating with static business practices. Organizations must constantly adapt to create business resilience, improve decision-making, and increase cost savings.

Data is the fuel for digital transformation. The capability of any organization to transform is directly tied to how effectively they can generate, manage, and consume their data. These data processes—precisely like the broader digital transformation they enable—must also transform to meet the organization’s needs.

The Enterprise Data team at Microsoft Digital builds and operates the systems that power Microsoft’s data estate. We’re well on our way into a journey toward responsibly democratizing the data that drives global business and operations for Microsoft. We want to share our journey and give other organizations a foundation—and hopefully a starting point—for enabling their enterprise data transformation.

Seizing the opportunity for data transformation

Data transformation focuses on creating business value. Like any other organization, business value drives most of what we do. As Microsoft has grown and evolved, so has our data estate.

Our data was in silos. Various parts of the organization were managing their data in different ways, and our data wasn’t connected.

—Damon Buono, head of enterprise governance, Microsoft

At the genesis of our data transformation, we were in the same situation many organizations find themselves in. Digital transformation was a top priority for the business, and our data estate couldn’t provide the results or operate with the agility the business required.

We felt stuck between two opposing forces: maintaining controls and governance that helped secure our data and the pressure from the business to move fast and transform our data estate operations to meet evolving needs.

“Our data was in silos,” says Damon Buono, head of enterprise governance for Microsoft.  “Various parts of the organization were managing their data in different ways, and our data wasn’t connected.”

As a result, a complete perspective on enterprise-wide data wasn’t readily available. It was hard to implement controls and governance across these silos, and implementing governance always felt like it was slowing us down, preventing us from supporting digital transformation at Microsoft at the required pace.

“We needed a shared data catalog to democratize data responsibly across the company,” Buono says.

Transforming data: unify, democratize, and create value

Transforming our data estate fundamentally disrupted how we think about and manage data at Microsoft. With our approach, examining data at the top-level organization became the default, and we began to view governance as an accelerator of our transformation, not a blocker. As a result of these two fundamental changes, our data’s lofty, aspirational state became achievable, and we immediately began creating business value.

Our enterprise data platform is built on three essential layers of data transformation: unifying data with an analytics foundation, responsibly democratizing data with data governance, and scaling transformative outcomes with intelligent applications.

Unifying data with an analytics foundation

Buono smiles in a corporate photo.
Establishing and adopting strong governance standards has helped Microsoft democratize access to data, says Damon Buono, head of enterprise governance for Microsoft. “When data is adequately democratized—safely accessible by everyone who should access it—transformation is accelerated,” Buono says.

Unified data is useful and effective data. Before our data transformation, we recognized the need to unify the many data silos present in the organization. Like many businesses, our data has evolved organically. Changes over the years to business practices, data storage technology, and data consumption led to increased inefficiencies in overall data use.

Analytics are foundational to the remainder of the data transformation journey. Without a solid and well-established analytics foundation, it’s impossible to implement the rest of the data transformation layers. A more centralized source of truth for enterprise data creates a comprehensive starting point for governance and creating business value with scalable applications.

With Microsoft Fabric at the core, our analytics foundation unifies data across the organization and allows us to do more with less, which, in turn, decreases data redundancy, increases data consistency, and reduces shadow IT risks and inefficiencies.

“It connects enterprise data across multiple data sources and internal organizations to create a comprehensive perspective on enterprise data,” Buono says.

Microsoft Fabric ensures that we’re all speaking the same data language. Whether we’re pulling data from Microsoft Azure, multi-cloud, or our on-premises servers, we can be confident that our analytics tools can interpret that data consistently.

Functionally, this reduces integration and operation costs and creates a predictable and transparent operational model. The unity and visibility of the analytics foundation then provide the basis for the rest of the transformation, beginning with governance.

Responsibly democratizing data with data governance

Data can be a transformative asset to the organization through responsible democratization. The goal is to accelerate the business through accessibility and availability. Democratizing data is at the center of our governance strategy. Data governance plays an active role in data protection and complements the defensive posture of security and compliance. With effective governance controls, all employees can access the data they need to make informed decisions regardless of their job function or level within the organization. Data governance is the glue that combines data discovery with the business value that data creates.

It’s critical to understand that in the modern data estate, governance accelerates digital transformation. Governance can seem like a burden and a blocker across data access and usage scenarios, but effective and efficient governance is impossible without a unified data strategy. That’s why many organizations approach data governance as if it were a millstone around their necks: they struggle to harness the power of data because they lack a data strategy and the leadership alignment needed to improve their data culture.

In the Microsoft Digital data estate, governance lightens the load for our data owners, administrators, and users. Microsoft Purview helps us to democratize data responsibly, beginning with our unified analytics foundation in Microsoft Fabric. With a unified perspective on data and a system in place for understanding the entire enterprise estate, governance can be applied and monitored with Purview across all enterprise data, with an end-to-end data governance service that automates the discovery, classification, and protection of sensitive data across our on-premises, multi-cloud, and SaaS environments.
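To make the discovery-and-classification idea concrete, here's a deliberately simplified stand-in: scan sample values per column and tag columns whose contents match sensitive-data patterns. Microsoft Purview provides this as a managed service with far richer built-in classifiers; the regexes and labels below are illustrative only, not Purview's actual rules:

```python
import re

# Illustrative stand-in for automated data classification. These
# patterns and labels are simplified examples, not Purview classifiers.
CLASSIFIERS = {
    "email_address": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify_columns(table: dict[str, list[str]]) -> dict[str, list[str]]:
    """Scan sample values per column; return matched sensitivity labels."""
    findings = {}
    for column, samples in table.items():
        labels = [name for name, pattern in CLASSIFIERS.items()
                  if any(pattern.search(v) for v in samples)]
        if labels:
            findings[column] = labels
    return findings

table = {"contact": ["alice@example.com", "bob@example.com"],
         "notes": ["renewal due", "call back Friday"]}
findings = classify_columns(table)
```

A governance service runs this kind of scan continuously across the estate and attaches protection policies to whatever it finds.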

“The governance tools that protect and share any enterprise data are transparent to data creators, managers, and consumers,” Buono says. “Stakeholders can be assured that their data is being shared, accessed, and used how they want it to be.”

Our success begins with an iterative approach to data transformation. We started small, with projects that were simple to transform and didn’t have a critical impact on our business.

—Karthik Ravindran, general manager, data governance, Microsoft Security group

Responsible democratization encourages onboarding and breaks down silos. When data owners are confident in governance, they want their data on the platform, which drives the larger unification and governance of enterprise-wide data.

Scaling transformative outcomes with intelligent applications

The final layer of our data transformation strategy builds on the previous two to provide unified, democratized data to the applications and business processes used every day at Microsoft. These intelligent applications create business value. They empower employees, reduce manual efforts, increase operational efficiencies, generate increased revenue, and contribute to a better Microsoft.

How we transformed: iteration and progression

Ravindran smiles in a corporate portrait photo.
Microsoft Purview and Microsoft Fabric are enabling the company to rethink how we use data internally at Microsoft, says Karthik Ravindran, a general manager who leads data governance for the Microsoft Security group.

While the three layers provide a solid structure for building a modern data platform, they provide value only if implemented. Actual transformation happens in the day-to-day operations of an organization. We transformed by applying these layers to our business groups, data infrastructure, and even our cultural data approach at Microsoft Digital.

“Our success begins with an iterative approach to data transformation,” says Karthik Ravindran, a general manager who leads data governance for the Microsoft Security group. “We started small, with projects that were simple to transform and didn’t have a critical impact on our business.”

These early projects provided a testing ground for our methods and technology.

“We quickly iterated approaches and techniques, gathering feedback from stakeholders as we went,” Ravindran says. “The results and learnings from these early implementations grew into a more mature and scalable platform. We were able to adapt to larger, more complex, and more critical sections of our data estate, tearing down larger data silos as we progressed.”

To understand how this worked, consider the following examples of our transformation across the organization.

Transforming marketing

The Microsoft Global Demand Center supports Microsoft commercial operations, including Microsoft Azure, Microsoft 365, and Dynamics 365. The Global Demand Center drives new customer acquisition and builds the growth and adoption of Microsoft products.

The Global Demand Center uses data from a broad spectrum of the business, including marketing, finance, sales, product telemetry, and many more. The use cases for this data span personas from any of these areas. Each internal Microsoft persona—whether a seller, researcher, product manager, or marketing executive—has a specific use case. Each of these personas engages with different customers to provide slightly different outcomes based on the customer and the product or service. It’s an immense swath of data consumed and managed by many teams for many purposes.

The Global Demand Center can holistically manage and monitor how Microsoft personas engage with customers by converging tools into the Microsoft Digital enterprise data platform. Each persona has a complete picture of who the customer is and what interactions or engagements they’ve had with Microsoft. These engagements include the products they’ve used, the trials they’ve downloaded, and the conversations they’ve had with other internal personas throughout their lifecycle as a Microsoft customer.

The enterprise data platform provides a common foundation for insights and intelligence into global demand for our products. The platform’s machine learning and AI capabilities empower next actions and prioritize how the Global Demand Center serves personas and customers. Moving the Global Demand Center toward adopting the enterprise data platform is iterative. It’s progressive onboarding of personas and teams to use the toolset available.

The adoption is transforming marketing and sales across Microsoft. It’s provided several benefits, including:

  • More reliable data and greater data quality. The unification of data and increased governance over the data create better data that drives better business results.
  • Decreased data costs. Moving to the enterprise data platform has reduced the overall cost compared to managing multiple data platforms.
  • Increased agility. With current and actionable data, the Global Demand Center can respond immediately to the myriad of daily changes in sales and marketing at Microsoft.

Improving the employee experience

Employee experience is paramount at Microsoft. The Microsoft Digital Employee Experience team is responsible for all aspects of the employee experience. They’re using the enterprise data platform to power a 360-degree view of the employee experience. Their insights tool connects different data across Microsoft to provide analytics and actionable insights that enable intelligent, personalized, and interconnected experiences for Microsoft employees.

The employee experience involves many data points and internal departments at Microsoft. Previously, when data was managed and governed in silos, it was difficult to build data connections to other internal organizations, such as Microsoft Human Resources (Microsoft HR). With the enterprise data platform, the Employee Experiences team can access the data they need within the controls of the platform’s governance capabilities, which gives the Microsoft HR department the stewardship and transparency they require.

The enterprise data platform creates many benefits for the Employee Experiences team, including:

  • Coordinated feature feedback and implementation. All planned software and tools features across Microsoft align with employee feedback and practical needs obtained from the enterprise data platform.
  • Better detection and mitigation of issues. Intelligent insights help Employee Experiences team members identify new and recurring issues so they can be mitigated effectively.
  • Decreased costs. The efficiencies created by using the enterprise data platform reduce engineering effort and resource usage.

Creating greater sustainability in operations

Microsoft Sustainability Operations supports efforts to increase global sustainability for Microsoft and minimize environmental impact. Sustainability Operations is responsible for environmental efforts across the organization, including waste, water, and carbon management programs.

Their internal platform, the Microsoft Cloud for Sustainability, is built on the enterprise data platform. It leverages the unified analytics and governance capabilities to create important sustainability insights that guide Sustainability Operations efforts and programs.

These insights are combined in the Microsoft Environmental Sustainability Report. This report contains 20 sections detailing how Microsoft works to minimize environmental impact. The report includes sections for emissions, capital purchases, business travel, employee commuting, product distribution, and managed assets, among others.

To provide the data for this report, Sustainability Operations has created a data processing platform with the Microsoft Cloud for Sustainability that ingests and transforms data from Microsoft Operations into a data repository. The unified data enables the team to create reports from many different perspectives using a common data model that enables quick integration.
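The common-data-model step can be pictured as a per-source normalization function: each source's records are mapped into one shared schema before they land in the repository, so every report reads the same fields. The source names, field names, and units below are invented for the sketch:

```python
# Illustrative per-source normalization into a common data model.
# Source shapes, field names, and units are assumptions for the example.
COMMON_FIELDS = ("category", "kg_co2e", "period")

def to_common_model(source: str, record: dict) -> dict:
    """Normalize one source-specific record into the shared schema."""
    if source == "travel":
        return {"category": "business_travel",
                "kg_co2e": record["co2_kg"],
                "period": record["month"]}
    if source == "commute":
        return {"category": "employee_commuting",
                "kg_co2e": record["emissions_g"] / 1000,  # grams -> kg
                "period": record["period"]}
    raise ValueError(f"unknown source: {source}")

rows = [to_common_model("travel", {"co2_kg": 120.0, "month": "2023-06"}),
        to_common_model("commute", {"emissions_g": 45000,
                                    "period": "2023-06"})]
total = sum(r["kg_co2e"] for r in rows)  # one number across all sources
```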

Governance is central to the effective democratization of data, and when data is adequately democratized—safely accessible by everyone who should access it—transformation is accelerated. Modern governance is achievable using automated controls and a self-service methodology, enabling immediate opportunity to create business value.

—Damon Buono, head of enterprise governance, Microsoft

The Microsoft Environmental Sustainability Report supports decision-making at the enterprise and business group level, which enables progress tracking against internal goals, forecasting and simulation, qualitative analysis of environmental impact, and compliance management for both perspectives. These tools allow Microsoft Sustainability Operations to discover and track environmental hotspots across the global enterprise with greater frequency and more precision. Using these insights, they can drive changes in operations that create more immediate and significant environmental impact reductions.

Implementing internal data governance

Governance has been a massive part of our journey. Realizing governance as an accelerator of transformation has radically changed our approach to governance. Understanding who is accessing data, what they’re accessing, and how they’re accessing is critical to ensuring controlled and measured access. It also creates the foundation for building transparency into the enterprise data platform, growing user confidence, and increasing adoption.

“Governance is central to the effective democratization of data, and when data is adequately democratized—safely accessible by everyone who should access it—transformation is accelerated,” Buono says. “Modern governance is achievable using automated controls and a self-service methodology, enabling immediate opportunity to create business value.”

Our governance strategy uses data standards and models with actionable insights to converge our entire data estate, which spans thousands of distinct data sources. We built our approach to data governance on some crucial learnings:

  • Evidence is critical to driving adoption and recruiting executive support.
  • Automated data access and a data catalog are critical to consolidating the data estate.
  • Data issue management can provide evidence, but it doesn’t scale well.
  • A centralized data lake, scorecards for compliance, and practical controls help create evidence for governance in large enterprises.

Key Takeaways

We continue to drive the adoption of the enterprise data platform at Microsoft. As we work toward 100 percent adoption across the enterprise, we generate efficiencies and reduce costs along the way. The iterative nature of our implementation means we’ve been able to move quickly and with agility, improving our processes as we go.

We’re really very excited about where we are now with Purview, Fabric, and the entire suite of tools we now have to manage our data here at Microsoft. They are helping us rethink how we use data internally here at Microsoft, and we’re just getting started.

—Karthik Ravindran, general manager, data governance, Microsoft Security group

We’re also supporting organizational alignment and advocacy programs that will increase adoption. These programs include an internal data governance management team to improve governance, an enterprise data education program, and a training program for the responsible use of AI.

As our enterprise data estates expand and diversify, tools like Microsoft Purview and Microsoft Fabric have become indispensable in ensuring that our data remains an asset, not a liability. These tools offer a compelling solution to the pressing challenges of governing and protecting the modern data estate through automated discovery, classification, and a unified approach to hybrid and multi-cloud deployments.

“We’re really very excited about where we are now with Purview, Fabric, and the entire suite of tools we now have to manage our data here at Microsoft,” Ravindran says. “They are helping us rethink how we use data internally here at Microsoft, and we’re just getting started.”

Try it out

Related links


The post Transforming data governance at Microsoft with Microsoft Purview and Microsoft Fabric appeared first on Inside Track Blog.

]]>
Providing modern data transfer and storage service at Microsoft with Microsoft Azure http://approjects.co.za/?big=insidetrack/blog/microsoft-uses-azure-to-provide-a-modern-data-transfer-and-storage-service/ Thu, 13 Jul 2023 14:54:07 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=8732 Companies all over the world have launched their cloud adoption journey. While some are just starting, others are further along the path and are now researching the best options for moving their largest, most complex workflows to the cloud. It can take time for companies to address legacy tools and systems that have on-premises infrastructure […]

The post Providing modern data transfer and storage service at Microsoft with Microsoft Azure appeared first on Inside Track Blog.

]]>
Microsoft Digital technical stories

Companies all over the world have launched their cloud adoption journey. While some are just starting, others are further along the path and are now researching the best options for moving their largest, most complex workflows to the cloud. It can take time for companies to address legacy tools and systems that have on-premises infrastructure dependencies.

Our Microsoft Digital Employee Experience (MDEE) team has been running our company as mostly cloud-only since 2018, and continues to design cloud-only solutions to help fulfill our Internet First and Microsoft Zero Trust goals.

In MDEE, we designed a Modern Data Transfer Service (MDTS), an enterprise-scale solution that allows the transfer of large files to and from partners outside the firewall and removes the need for an extranet.

MDTS makes cloud adoption easier for teams inside Microsoft and encourages the use of Microsoft Azure for all of their data transfer and storage scenarios. As a result, engineering teams can focus on building software and shipping products instead of dealing with the management overhead of Microsoft Azure subscriptions and becoming subject matter experts on infrastructure.

[Unpack simplifying Microsoft’s royalty ecosystem with connected data service. | Check out how Microsoft employees are leveraging the cloud for file storage with OneDrive Folder Backup. | Read more on simplifying compliance evidence management with Microsoft Azure confidential ledger.]

Leveraging our knowledge and experience

As part of Microsoft’s cloud adoption journey, we have been continuously looking for opportunities to help other organizations move data and remaining legacy workflows to the cloud. With more than 220,000 employees and over 150 external partners with whom data is shared, not every team had a clear path for converting their transfer and storage patterns into successful cloud scenarios.

We have a high level of Microsoft Azure service knowledge and expertise when it comes to storage and data transfer. We also have a long history with legacy on-premises storage designs and hybrid third-party cloud designs. Over the past decade, we engineered several data transfer and storage services to facilitate the needs of Microsoft engineering teams. Those services traditionally leveraged either on-premises designs or hybrid designs with some cloud storage. In 2019, new capabilities in Azure made the timing right to seriously look at replacing our hybrid model, a mix of on-premises resources, third-party software, and Microsoft Azure services, with one modern service that would completely satisfy our customer scenarios using only Azure.

MDTS uses out-of-the-box Microsoft Azure storage configurations and capabilities to help us address legacy on-premises storage patterns and support Microsoft core commitments to fully adopt Azure in a way that satisfies security requirements. Managed by a dedicated team of service engineers, program managers, and software developers, MDTS offers performance and security, and it’s available to any engineering team at Microsoft that needs to move its data storage and transfer to the cloud.

Designing a Modern Data Transfer and Storage Service

The design goal for MDTS was to create a single storage service offering entirely in Microsoft Azure that would be flexible enough to meet the needs of most engineering teams at Microsoft. The service needed to be sustainable as a long-term solution, continue to support ongoing Internet First and Zero Trust network security designs, and have the capability to adapt to evolving technology and security requirements.

Identifying use cases

First, we needed to identify the top use cases we wanted to solve and evaluate which combination of Microsoft Azure services would help us meet our requirements. The primary use cases we identified for our design included:

  • Sharing and/or distribution of complex payloads: We not only had to provide storage for corporate sharing needs but also to share those same materials externally. The variety of file sizes and payload characteristics can be challenging because they don’t always fit a standard profile for files (e.g., Office documents).
  • Cloud storage adoption (shifting from on-premises to cloud): We wanted to ensure that engineering teams across Microsoft that needed a path to the cloud would have a roadmap. This need could arise because of expiring on-premises infrastructure, corporate direction, or other modernization initiatives like ours.
  • Consolidation of multiple storage solutions into one service, to reduce security risks and administrative overhead: Having to place data and content in multiple datastores for specific sharing or performance needs is cumbersome and can introduce additional risk. Because there wasn’t yet a single service that could meet all their sharing needs and performance requirements, employees and teams at Microsoft were using a variety of locations and services to store and share data.

Security, performance, and user experience design requirements

After identifying the use cases for MDTS, we focused on our primary design requirements. They fell into three high-level categories: security, performance, and user experience.

Security

The data transfer and storage design needed to follow our Internet First and Zero Trust network design principles. Accomplishing parity with Zero Trust meant leveraging best practices for encryption, standard ports, and authentication. At Microsoft, we already have standard design patterns that define how these pieces should be delivered.

  • Encryption: Data is encrypted both in transit and at rest.
  • Authentication: Microsoft Azure Active Directory supports corporate synced domain accounts, external business-to-business accounts, and both corporate and external security groups. Leveraging Azure Active Directory allows teams to remove dependencies on corporate domain controllers for authentication.
  • Authorization: Microsoft Azure Data Lake Gen2 storage provides fine-grained access to containers and subfolders. This is possible because of many new capabilities, most notably the support for OAuth, hierarchical namespace, and POSIX permissions. These capabilities are necessities of a Zero Trust network security design.
  • No non-standard ports: Opening non-standard ports can present a security risk, so we use only HTTPS over TCP 443 for transport and communication. Transport software must also be able to maximize the ingress/egress capabilities of the storage platform over these standard ports. Microsoft Azure Storage Explorer, AzCopy, and Microsoft Azure Data Factory all meet this requirement.

Performance

Payloads can range from being comprised of one very large file, millions of small files, and every combination in between. Scenarios across the payload spectrum have their own computing and storage performance considerations and challenges. Microsoft Azure has optimized software solutions for achieving the best possible storage ingress and egress. MDTS helps ensure that customers know what optimized solutions are available to them, provides configuration best practices, and shares the learnings with Azure Engineering to enable robust enterprise scale scenarios.

  • Data transfer speeds: Having software capable of maximizing the ingress/egress capabilities of the storage platform is preferable for engineering-type workloads. It’s common for these workloads to have complex payloads, whether several large files (10-500 GB) or millions of small files.
  • Ingress and egress: The service needed to support ingress upwards of 10 Gbps and egress of 50 Gbps, with client and server software able to consume as much bandwidth as the client and the storage platform allow.

 

| Data size / bandwidth | 50 Mbps | 100 Mbps | 500 Mbps | 1 Gbps | 5 Gbps | 10 Gbps |
|---|---|---|---|---|---|---|
| 1 GB | 2.7 minutes | 1.4 minutes | 0.3 minutes | 0.1 minutes | 0.03 minutes | 0.010 minutes |
| 10 GB | 27.3 minutes | 13.7 minutes | 2.7 minutes | 1.3 minutes | 0.3 minutes | 0.1 minutes |
| 100 GB | 4.6 hours | 2.3 hours | 0.5 hours | 0.2 hours | 0.05 hours | 0.02 hours |
| 1 TB | 46.6 hours | 23.3 hours | 4.7 hours | 2.3 hours | 0.5 hours | 0.2 hours |
| 10 TB | 19.4 days | 9.7 days | 1.9 days | 0.9 days | 0.2 days | 0.1 days |

Copy duration calculations based on data size and the bandwidth limit for the environment.
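The durations above follow from simple arithmetic: bits to move divided by link speed. This back-of-envelope helper is an illustration, not an AzCopy feature; it ignores protocol overhead and concurrency, so the rounded table values differ slightly:

```python
def copy_duration_seconds(data_gb: float, bandwidth_mbps: float) -> float:
    """Estimate transfer time using decimal units (1 GB = 8,000 megabits).

    Real transfers are slower due to protocol overhead, latency, and
    per-file costs, which multi-stream tools like AzCopy mitigate.
    """
    megabits_to_move = data_gb * 8_000
    return megabits_to_move / bandwidth_mbps

# 1 GB over a 50 Mbps link: 160 seconds, roughly the 2.7 minutes in the table.
seconds = copy_duration_seconds(1, 50)
```

The same formula explains why a 10 TB payload is a multi-day transfer on anything below 1 Gbps.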

User experience

Users and systems need a way to perform manual and automated storage actions with graphical, command line, or API-initiated experiences.

  • Graphical user experience: Microsoft Azure Storage Explorer gives Storage Admins the ability to graphically manage storage. It also has features for storage consumers who don’t have permissions for administrative actions and simply need to perform common storage actions like uploading and downloading.
  • Command line experience: AzCopy provides developers with an easy way to automate common storage actions through CLI or scheduled tasks.
  • Automated experiences: Both Microsoft Azure Data Factory and AzCopy make it possible for applications to use Azure Data Lake Gen2 storage as their primary storage source and destination.

Identifying personas

Because a diverse set of personas utilize storage for different purposes, we needed to design storage experiences that satisfy the range of business needs. Through the process of development, we identified these custom persona experiences relevant to both storage and data transfer:

  • Storage Admins: The Storage Admins are Microsoft Azure subscription owners. Within the Azure subscription they create, manage, and maintain all aspects of MDTS: Storage Accounts, Data Factories, Storage Actions Service, and Self-Service Portal. Storage Admins also resolve requests and incidents that are not handled via Self-Service.
  • Data Owners: The Data Owner personas are those requesting storage who have the authority to create shares and authorize storage. Data Owners also perform the initial steps of creating automated distributions of data to and from private sites. Data Owners are essentially the decision makers of the storage following handoff of a storage account from Storage Admins.
  • Storage Consumers: At Microsoft, storage consumers represent a broad set of disciplines, from engineers and developers to project managers and marketing professionals. Storage Consumers can use Microsoft Azure Storage Explorer to perform storage actions to and from authorized storage paths (aka Shares). Within the MDTS Self Service Portal, a storage consumer can be given authorization to create distributions. A distribution can automate the transfer of data from a source to one or multiple destinations.

Implementing and enhancing the solution architecture

After considering multiple Microsoft Azure storage types and complementary Azure services, the MDTS team chose the following Microsoft Azure services and software as the foundation for offering a storage and data transfer service to Microsoft engineering groups.

  • Microsoft Azure Active Directory: Meets the requirements for authentication and access.
  • Microsoft Azure Data Lake Gen2: Meets security and performance requirements by providing encryption, OAuth, hierarchical namespace, fine-grained authorization to Azure Active Directory entities, and 10+ Gbps ingress and egress.
  • Microsoft Azure Storage Explorer: Meets security, performance, and user experience requirements by providing a graphical experience to perform storage administrative tasks and storage consumer tasks without needing a storage account key or role-based access control (RBAC) on an Azure resource. Azure Storage Explorer also has AzCopy embedded to satisfy performance for complex payloads.
  • AzCopy: Provides a robust and highly performant command line interface.
  • Microsoft Azure Data Factory: Meets the requirements for orchestrating and automating data copies between private networks and Azure Data Lake Gen2 storage paths. Azure Data Factory copy activities are as performant as AzCopy and satisfy security requirements.

Enabling Storage and Orchestration

As illustrated below, the first MDTS design was composed entirely of Microsoft Azure services, with no additional investment from us other than the people needed to manage the Microsoft Azure subscription and perform routine requests. MDTS was offered as a commodity service to engineering teams at Microsoft in January 2020. Within a few months we saw a reduction of third-party software and on-premises file server storage, which provided significant savings. This migration also contributed progress toward the company-wide objectives of Internet First and Zero Trust design patterns.

The first design of MDTS provides storage and orchestration using out-of-the-box Microsoft Azure services.

We initially onboarded 35 engineering teams, which included 10,000 Microsoft Azure Storage Explorer users (internal and external accounts) and 600 TB per month of Microsoft Azure storage uploads and downloads. By offering the MDTS service, we saved engineering teams from having to run Azure subscriptions themselves and from needing to learn the technical details of implementing a modern cloud storage solution.

Creating access control models

As a team, we quickly discovered that having specific, repeatable implementation strategies was essential when configuring public-facing Microsoft Azure storage. Our initial time investment was in standardizing an access control process that would reduce complexity and ensure a correct security posture before handing off storage to customers. To do this, we constructed onboarding processes for identifying the type of share, for which we standardized the implementation steps.

We implemented standard access control models for two types of shares: container shares and sub-shares.

Container share access control model

The container share access control model is used for scenarios where the data owner prefers users to have access to a broad set of data. As illustrated in the graphic below, container shares supply access to the root, or parent, of a folder hierarchy. The container is the parent. Any member of the security group will gain access to the top level. When creating a container share, we also make it possible to convert to a sub-share access control model if desired.

 

Microsoft Azure Storage Explorer grants access to the root, or parent, of a folder hierarchy using the container share access control model. Both engineering and marketing are containers. Each has a specific Microsoft Azure Active Directory Security group. A top-level Microsoft Azure AD Security group is also added to minimize effort for users who should get access to all containers added to the storage account.

This model fits scenarios where group members get Read, Write, and Execute permissions to an entire container. The authorization allows users to upload, download, and create and/or delete folders and files. Adjusting the access control restricts what users can do. For example, to grant download-only access, select only Read and Execute.
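The Read/Write/Execute combinations can be reasoned about as set containment: an action is permitted only when the granted flags cover everything it needs. This toy model (illustrative names only, not an Azure API) captures the download-only case:

```python
# Which ACL flags each storage action needs; Execute is required to
# traverse the folder hierarchy at all (hypothetical mapping for illustration).
ACTIONS = {
    "download":      {"read", "execute"},
    "upload":        {"write", "execute"},
    "create_folder": {"write", "execute"},
}

def allowed(granted: set, action: str) -> bool:
    """An action is allowed when its required flags are a subset of the grant."""
    return ACTIONS[action] <= granted

download_only = {"read", "execute"}  # the Read + Execute example above
assert allowed(download_only, "download")
assert not allowed(download_only, "upload")
```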

Sub-share access control model

The sub-share access control model is used for scenarios where the data owner prefers users have explicit access to folders only. As illustrated in the graphic below, folders are hierarchically created under the container. In cases where several folders exist, a security group access control can be implemented on a specific folder. Access is granted to the folder where the access control is applied. This prevents users from seeing or navigating folders under the container other than the folders where an explicit access control is applied. When users attempt to browse the container, authorization will fail.

 

Microsoft Azure Storage Explorer grants access to sub-folder only using the sub-share access control model. Members are added to the sub-share group, not the container group. The sub-share group is nested in the container group with execute permissions to allow for Read and Write on the sub-share.

This model fits scenarios where group members get Read, Write, and Execute permissions to a sub-folder only. The authorization allows users to upload, download, and create and/or delete folders and files. The access control is specific to the folder “project1.” In this model you can have multiple folders under the container but only provide authorization to a specific folder.

The sub-share process is only applied if a sub-share is needed.

  • Any folder needing explicit authorization is considered a sub-share.
  • We apply a sub-share security group access control with Read, Write, and Execute on the folder.
  • We nest the sub-share security group in the parent share security group used for Execute only. This allows members who do not have access to the container enough authorization to Read, Write, and Execute the specific sub-share folder without having Read or Write permissions to any other folders in the container.

Applying access controls for each type of share (container and or sub-share)

The parent share process is standard for each storage account.

  • Each storage account has a unique security group. This security group will have access control applied for any containers. This allows data owners to add members and effectively give access to all containers (current and future) by simply changing the membership of one group.
  • Each container will have a unique security group for Read, Write, and Execute. This security group is used to isolate authorization to a single container.
  • Each container will have a unique group for execute. This security group is needed in the event sub-shares are created. Sub-shares are folder-specific shares in the hierarchical namespace.
  • We always use the default access control option. This is a feature that automatically applies the parent permissions to all new child folders (sub-folders).
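A minimal way to reason about this nesting scheme is to model groups that can contain both users and other groups, then check membership recursively. The sketch below is purely illustrative (the group names follow the convention shown later in this article); it shows how a sub-share member gains Execute on the container without gaining Read or Write there:

```python
from dataclasses import dataclass, field

@dataclass
class Group:
    """A security group that can hold users and nested groups."""
    name: str
    members: set = field(default_factory=set)
    nested: list = field(default_factory=list)

    def contains(self, user: str) -> bool:
        # Membership is effective if the user is direct or in any nested group.
        return user in self.members or any(g.contains(user) for g in self.nested)

container_rwe = Group("mdts-ac-storageacct1-rwe")          # Read/Write/Execute on container
container_x = Group("mdts-ac-storageacct1-x")              # Execute-only, for traversal
subshare_rwe = Group("mdts-ac-storageacct1-project1-rwe")  # Read/Write/Execute on sub-folder

# Nest the sub-share group in the execute-only container group.
container_x.nested.append(subshare_rwe)
subshare_rwe.members.add("partner@example.com")

# The partner can traverse the container and fully use the project1 folder...
assert container_x.contains("partner@example.com")
assert subshare_rwe.contains("partner@example.com")
# ...but has no Read or Write on the rest of the container.
assert not container_rwe.contains("partner@example.com")
```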

The first design enabled us to offer MDTS while our engineers defined, designed, and developed an improved experience for all the personas. It quickly became evident that Storage Admins needed the ability to see an aggregate view of all storage actions in near real-time to successfully operate the service. It was important for our administrators to easily discover the most active accounts and which user, service principal, or managed service identity was making storage requests or performing storage actions. In July 2020, we added the Aggregate Storage Actions service.

Adding aggregate storage actions

For our second MDTS design, we augmented the out-of-the-box Microsoft Azure Storage capabilities used in our first design with the capabilities of Microsoft Azure Monitor, Event Hubs, Stream Analytics, Function Apps, and Microsoft Azure Data Explorer to provide aggregate storage actions. Once the Aggregate Storage Actions capability was deployed and configured within MDTS, storage admins were able to aggregate the storage actions of all their storage accounts and see them in a single pane view.

 

The second design of MDTS introduces aggregate storage actions.

The Microsoft Azure Storage diagnostic settings in the Microsoft Azure portal make it possible for us to configure specific settings for blob actions. Combining this feature with other Azure services and some custom data manipulation gives MDTS the ability to see which users are performing storage actions, what those storage actions are, and when those actions were performed. The data visualizations are near real-time and aggregated across all the storage accounts.

Storage accounts are configured to route logs from Microsoft Azure Monitor to Event Hub. We currently have 45+ storage accounts that generate around five million logs each day. Data filtering, manipulation, and grouping are performed by Stream Analytics. Function Apps are responsible for fetching UPNs using Graph API, then pushing logs to Microsoft Azure Data Explorer. Microsoft Power BI and our modern self-service portal query Microsoft Azure Data Explorer and provide the visualizations, including dashboards with drill down functionality. The data available in our dashboard includes the following information aggregated across all customers (currently 35 storage accounts).

  • Aggregate view of most active accounts based on log activity.
  • Aggregate total of GB uploaded and downloaded per storage account.
  • Top users who uploaded, showing the user principal name (both external and internal).
  • Top users who downloaded, showing the user principal name (both external and internal).
  • Top accounts by data uploaded.
  • Top accounts by data downloaded.
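At a small scale, the grouping that Stream Analytics performs can be sketched in a few lines: fold each log record into per-account upload/download totals. The records below are fabricated samples, not real MDTS data:

```python
from collections import defaultdict

# Fabricated storage log entries standing in for Azure Monitor output.
logs = [
    {"account": "storageacct1", "user": "alice@contoso.com", "op": "upload", "bytes": 5_000_000_000},
    {"account": "storageacct1", "user": "bob@partner.com", "op": "download", "bytes": 2_000_000_000},
    {"account": "storageacct2", "user": "alice@contoso.com", "op": "upload", "bytes": 1_000_000_000},
]

# Sum bytes moved per account and operation.
totals = defaultdict(lambda: {"upload": 0, "download": 0})
for entry in logs:
    totals[entry["account"]][entry["op"]] += entry["bytes"]

# Per-account GB uploaded, the kind of figure surfaced on the dashboard.
gb_uploaded = {acct: t["upload"] / 1e9 for acct, t in totals.items()}
```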

The only setting required to onboard new storage accounts is to configure them to route logs to the Event Hub. Because we can have an aggregate store of all the storage account activities, we are able to offer MDTS customers an account view into their storage account specific data.

Following the release of Aggregate Storage Actions, the MDTS team, along with feedback from customers, identified another area of investment—the need for storage customers to “self-service” and view account specific insights without having role-based access to the subscription or storage accounts.

Providing a self-service experience

To enhance the experience of the other personas, MDTS is now focused on the creation of a Microsoft Azure web portal where customers can self-serve different storage and transfer capabilities without being granted any Azure role-based access control (RBAC) on the underlying subscription that hosts the MDTS service.

When designing MDTS self-service capabilities we focused on meeting these primary goals:

  • Make it possible for Microsoft Azure subscription owners (Storage Admins) to provide the platform and services without being in the middle of every change to storage and transfer services.
  • Create custom persona experiences so customers can achieve their storage and transfer goals through a single, secure, and intuitive portal. Some of the new enterprise-scale capabilities include:
    • Onboarding.
    • Creating storage shares.
    • Authorization changes.
    • Distributions: automating the distribution of data from one source to one or multiple destinations.
    • Insights into storage actions (based on the data provided by the Aggregate Storage Actions capability in our second MDTS release).
    • Reporting basic consumption data, like the number of users, groups, and shares on a particular account.
    • Reporting the cost of the account.
  • Adapt the portal as Azure services and customer scenarios change.
  • Accommodate customers who want to “self-host,” essentially taking our investments and running them themselves.

Our next design of MDTS introduces a self-service portal.

Storage consumer user experiences

After storage is created and configured, data owners can then share steps for storage consumers to start using storage. Upload and download are the most common storage actions, and Microsoft Azure provides software and services needed to perform both actions for manual and automated scenarios.

Microsoft Azure Storage Explorer is recommended for manual scenarios where users can connect and perform high-speed uploads and downloads. Both Microsoft Azure Data Factory and AzCopy can be used in scenarios where automation is needed. AzCopy is heavily preferred in scenarios where synchronization is required. Microsoft Azure Data Factory doesn’t provide synchronization but does provide robust data copy and data move capabilities. Azure Data Factory is also a managed service and better suited to enterprise scenarios where flexible triggering options, uptime, auto scale, monitoring, and metrics are required.

Using Microsoft Azure Storage Explorer for manual storage actions

Developers and Storage Admins are accustomed to using Microsoft Azure Storage Explorer for both storage administration and routine storage actions (e.g., uploading and downloading). Non-admins, otherwise known as Storage Consumers, can also use Microsoft Azure Storage Explorer to connect and perform storage actions without needing any role-based access control or access keys on the storage account. Once the storage is authorized, members of authorized groups can attach the storage, authenticate with their work email, and use the options their authorization allows.

The processes for sign-in and adding a resource via Microsoft Azure Active Directory are found in the Manage Accounts and Open Connect Dialog options of Microsoft Azure Storage Explorer.

After signing in and selecting the option to add the resource via Microsoft Azure Active Directory, you can supply the storage URL and connect. Once connected, it only requires a few clicks to upload and download data.
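The storage URL a consumer supplies is the account’s ADLS Gen2 (dfs) endpoint plus the container and, optionally, a sub-share folder. A small helper makes the shape explicit (the account and container names here are hypothetical):

```python
def adls_share_url(account: str, container: str, folder: str = "") -> str:
    """Build the Data Lake Gen2 URL that a data owner shares with consumers."""
    url = f"https://{account}.dfs.core.windows.net/{container}"
    return f"{url}/{folder.strip('/')}" if folder else url

# A container share and a sub-share under it.
container_url = adls_share_url("storageacct1", "engineering")
subshare_url = adls_share_url("storageacct1", "engineering", "project1")
```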

 

Microsoft Azure Storage Explorer Local and Attached module. After following the add resource via Microsoft Azure AD process, the Azure AD group itshowcase-engineering is authorized to Read, Write, and Edit (rwe) and members of the group can perform storage actions.

To learn more about using Microsoft Azure Storage Explorer, see Get started with Storage Explorer. There are additional links in the More Information section at the end of this document.

Note: Microsoft Azure Storage Explorer uses AzCopy. Having AzCopy as the transport allows storage consumers to benefit from high-speed transfers. If desired, AzCopy can be used as a stand-alone command line application.

Using AzCopy for manual or automated storage actions

AzCopy is a command line interface used to perform storage actions on authorized paths. AzCopy is used inside Microsoft Azure Storage Explorer but can also be used as a standalone executable to automate storage actions. It’s a multi-stream, TCP-based transport capable of optimizing throughput based on the bandwidth available. MDTS customers use AzCopy in scenarios that require synchronization, or in cases where Microsoft Azure Storage Explorer or a Microsoft Azure Data Factory copy activity doesn’t meet the requirements for data transfer. For more information about using AzCopy, please see the More Information section at the end of this document.

AzCopy is a great match for standalone and synchronization scenarios. It also has options that are useful when seeking to automate or build applications. Because AzCopy is a single executable running on a single client or server system, it isn’t always ideal for enterprise scenarios. Microsoft Azure Data Factory is a more robust Microsoft Azure service that meets most enterprise needs.
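For teams scripting AzCopy directly, assembling the invocation as a list keeps quoting safe. This sketch uses the documented `azcopy sync` flags; the paths, URL, and SAS token are placeholders:

```python
def azcopy_sync_cmd(source_dir: str, dest_url: str, sas_token: str = "") -> list:
    """Assemble an `azcopy sync` command for keeping a destination up to date."""
    return [
        "azcopy", "sync",
        source_dir,
        dest_url + sas_token,          # or authenticate first with `azcopy login`
        "--recursive=true",            # include sub-folders
        "--delete-destination=false",  # don't remove extra files at the destination
    ]

cmd = azcopy_sync_cmd(
    "/data/builds",
    "https://storageacct1.dfs.core.windows.net/engineering/builds",
)
# Run with subprocess.run(cmd, check=True) once AzCopy is installed and authenticated.
```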

Using Microsoft Azure Data Factory for automated copy activity

Some of the teams that use MDTS require the ability to orchestrate and operationalize storage uploads and downloads. Before MDTS, we would have either built a custom service or licensed a third-party solution, which can be expensive and/or time consuming.

Microsoft Azure Data Factory, a cloud-based ETL and data integration service, allows us to create data-driven workflows for orchestrating data movement. Including Azure Data Factory in our storage hosting service model provided customers with a way to automate data copy activities. MDTS’s most common data movement scenarios are distributing builds from a single source to multiple destinations (3-5 destinations are common).

Another requirement for MDTS was to leverage private data stores as a source or destination. Microsoft Azure Data Factory provides the capability to use a private system, also known as a self-hosted integration runtime. When configured, this system can participate in copy activities that communicate with on-premises file systems. The on-premises file system can then be used as a source and/or destination datastore.

In situations where on-premises file system data needs to be stored in Microsoft Azure or shared with external partners, Microsoft Azure Data Factory provides the ability to orchestrate pipelines that perform one or multiple copy activities in sequence. These activities result in end-to-end data movement from one on-premises file system to Microsoft Azure Storage, and then to another private system if desired.

The graphic below provides an example of a pipeline orchestrated to copy builds from a single source to several private destinations.

 

Microsoft Azure Data Factory pipeline example. Private site 1 is the build system source. The build system builds, loads the source file system, and then triggers the Microsoft Azure Data Factory pipeline. The build is uploaded, and private sites 2, 3, and 4 then download it. Function apps are used for sending email notifications to site owners and for additional validation.
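Conceptually, such a pipeline is one upload activity fanning out to several dependent download activities. The outline below is a simplified illustration of that shape, not the actual Data Factory JSON schema; the activity and runtime names are made up:

```python
# Simplified pipeline description: one upload, three dependent downloads.
pipeline = {
    "name": "DistributeBuild",
    "activities": [
        {
            "name": "UploadFromBuildShare",
            "type": "Copy",
            "source": {"store": "OnPremFileSystem", "runtime": "SelfHostedIR-Site1"},
            "sink": {"store": "AdlsGen2", "path": "builds/latest"},
        },
    ]
    + [
        {
            "name": f"DownloadToSite{n}",
            "type": "Copy",
            "dependsOn": ["UploadFromBuildShare"],  # runs only after the upload
            "source": {"store": "AdlsGen2", "path": "builds/latest"},
            "sink": {"store": "OnPremFileSystem", "runtime": f"SelfHostedIR-Site{n}"},
        }
        for n in (2, 3, 4)
    ],
}

fan_out = [a["name"] for a in pipeline["activities"] if a.get("dependsOn")]
```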

For more information on Azure Data Factory, please see Introduction to Microsoft Azure Data Factory. There are additional links in the More Information section at the end of this document.

If you are thinking about using Microsoft Azure to develop a modern data transfer and storage solution for your organization, here are some of the best practices we gathered while developing MDTS.

Close the technical gap for storage consumers with a white glove approach to onboarding

Be prepared to spend time with customers who are initially overwhelmed with using Azure Storage Explorer or AzCopy. At Microsoft, storage consumers represent a broad set of disciplines—from engineers and developers to project managers and marketing professionals. Azure Storage Explorer provides an excellent experience for engineers and developers but can be a little challenging for less technical roles.

Have a standard access control model

Use Microsoft Azure Active Directory security groups and group nesting to manage authorization. Microsoft Azure Data Lake Gen2 storage limits the number of access control entries you can apply. To avoid reaching this limit, and to simplify administration, we apply access controls to security groups only and, where membership changes frequently, nest Member Security Groups within Access Control Security Groups. These group types don’t exist in Microsoft Azure Active Directory, but within our MDTS service we use them as a process to differentiate the purpose of a group, which we can easily determine from the group’s name.

  • Access Control Security Groups: We use this group type for applying Access Control on ADLS Gen2 storage containers and/or folders.
  • Member Security Groups: We use these to satisfy cases where access to containers and/or folders will constantly change for members.

When there are large numbers of members, nesting prevents the need to add members individually to the Access Control Security Groups. When access is no longer needed, we can remove the Member Group(s) from the Access Control Security Group and no further action is needed on storage objects.

Along with using Microsoft Azure Active Directory security groups, make sure to have a documented process for applying access controls. Be consistent and have a way of tracking where access controls are applied.

Use descriptive display names for your Microsoft Azure AD security groups

Because Microsoft Azure AD doesn’t currently organize groups by owners, we recommend using naming conventions that capture the group’s purpose and type to allow for easier searches.

  • Example 1: mdts-ac-storageacct1-rwe. This group name uses our service standard naming convention for Access Control group type on Storage Account 1, with access control Read, Write, and Execute. mdts = Service, ac = Access Control Type, storageacct1 = ADLS Gen2 Storage Account Name, rwe = permission of the access control.
  • Example 2: mdts-mg-storageacct1-project1. This group name uses our service standard naming convention for Member Group type on Storage Account 1. This group does not have an explicit access control on storage, but it is nested in mdts-ac-storageacct1-rwe where any member of this group has the Read, Write, and Execute access to storage account1 because it’s nested in mdts-ac-storageacct1-rwe.
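Conventions like these are easy to enforce with a tiny helper that composes and parses display names (the `mdts` service prefix and group types come from the examples above):

```python
def group_name(service: str, group_type: str, account: str, suffix: str) -> str:
    """Compose a security group display name: service-type-account-suffix."""
    return "-".join([service, group_type, account, suffix])

def parse_group_name(name: str) -> dict:
    """Split a conforming display name back into its four fields."""
    service, group_type, account, suffix = name.split("-", 3)
    return {"service": service, "type": group_type, "account": account, "suffix": suffix}

assert group_name("mdts", "ac", "storageacct1", "rwe") == "mdts-ac-storageacct1-rwe"
assert parse_group_name("mdts-mg-storageacct1-project1")["type"] == "mg"
```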

Remember to propagate any changes to access controls

Microsoft Azure Data Lake Gen2 storage, by default, doesn’t automatically propagate any access control changes. As such, when removing, adding, or changing an access control, you need to follow an additional step to propagate the access control list. This option is available in Microsoft Azure Storage Explorer.

Storage Consumers can attempt Administrative options

Storage Consumers use Microsoft Azure Storage Explorer and authenticate with their Microsoft Azure Active Directory user profile. Because Azure Storage Explorer is primarily designed for Storage Admin and Developer personas, all administrative actions remain visible, and it is common for storage consumers to attempt them, like managing access or deleting a container. Those actions fail because consumers are granted access only through access control lists (ACLs), and ACLs can't confer administrative rights. If administrative actions are needed, the user must be made a Storage Admin, which grants access through Azure role-based access control (RBAC).

Microsoft Azure Storage Explorer and AzCopy are throughput intensive

As stated above, Microsoft Azure Storage Explorer uses AzCopy for transfer actions, and both are built for maximum transfer performance. Because of this, some clients and/or networks may benefit from throttling AzCopy. In circumstances where you don't want AzCopy to consume too much network bandwidth, configurations are available. In Microsoft Azure Storage Explorer, use the Settings option and select the Transfers section to configure Network Concurrency and/or File Concurrency; in the Network Concurrency section, Adjust Dynamically is the default. For AzCopy itself, flags and environment variables are available to tune performance.

For more information, visit Configure, optimize, and troubleshoot AzCopy.
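For instance, AzCopy's documented --cap-mbps flag and AZCOPY_CONCURRENCY_VALUE environment variable throttle bandwidth and parallelism. This sketch only assembles the invocation; the URLs and cap values are placeholders:

```python
import os

def throttled_azcopy(src_url, dst_url, cap_mbps=100, concurrency=4):
    """Build a bandwidth-capped AzCopy command plus its environment."""
    env = dict(os.environ, AZCOPY_CONCURRENCY_VALUE=str(concurrency))
    cmd = ["azcopy", "copy", src_url, dst_url,
           "--recursive", f"--cap-mbps={cap_mbps}"]
    return cmd, env
```

Once the URLs point at real containers, the command can be executed with `subprocess.run(cmd, env=env)`.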

Microsoft Azure Storage Explorer sign-in with MSAL

The Microsoft Authentication Library (MSAL), currently in preview in Microsoft Azure Storage Explorer, provides enhanced single sign-on, multi-factor authentication, and conditional access support. In some situations, users can't authenticate unless MSAL is selected. To enable MSAL, select the Settings option from Azure Storage Explorer's navigation pane, then, in the Application section, select the option to enable Microsoft Authentication Library.

B2B invites are needed for external accounts (guest user access)

When there is a Microsoft business need to work with external partners, leveraging guest user access in Microsoft Azure Active Directory is necessary. Once the B2B invite process is followed, external accounts can be authorized by managing group membership. For more information, read What is B2B collaboration in Azure Active Directory?

Key Takeaways

We used Microsoft Azure products and services to create an end-to-end modern data transfer and storage service that can be used by any group at Microsoft that desires cloud data storage. The release of Microsoft Azure Data Lake Gen2, Microsoft Azure Data Factory, and the improvements in the latest release of Azure Storage Explorer made it possible for us to offer MDTS as a fully native Microsoft Azure service.

One of the many strengths of using Microsoft Azure is the ability to use only what we needed, as we needed it. For MDTS, we started by simply creating storage accounts, requesting Microsoft Azure Active Directory Security Groups, applying an access control to storage URLs, and releasing the storage to customers for use. We then invested in adding storage actions and developed self-service capabilities that make MDTS a true enterprise-scale solution for data transfer and storage in the cloud.

We are actively encouraging the adoption of our MDTS storage design to all Microsoft engineering teams that still rely on legacy storage hosted in the Microsoft Corporate network. We are also encouraging any Microsoft Azure consumers to consider this design when evaluating options for storage and file sharing scenarios. Our design has proven to be scalable, compliant, and performant with the Microsoft Zero Trust security initiative, handling extreme payloads with high throughput and no constraints on the size or number of files.

By eliminating our dependency on third-party software, we have been able to eliminate third-party licensing, consulting, and hosting costs for many on-premises storage systems.

Are you ready to learn more? Sign up for your own Microsoft Azure subscription and get started today.

To receive the latest updates on Azure storage products and features to meet your cloud investment needs, visit Microsoft Azure updates.

Related links

 

The post Providing modern data transfer and storage service at Microsoft with Microsoft Azure appeared first on Inside Track Blog.

Transforming sales at Microsoft with AI-infused recommendations and customer insights http://approjects.co.za/?big=insidetrack/blog/transforming-sales-at-microsoft-with-ai-infused-recommendations-and-customer-insights/ Tue, 03 Mar 2020 17:13:16 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=5137 Peter Schlegel’s job is to build trust with Microsoft customers, and he’s using AI to do it. “Specialists are solution sellers,” says Schlegel, a data and AI specialist for Microsoft Digital Sales. “We help customers solve problems with an eye toward helping them move down the path of digital transformation. To do this, we also […]

Peter Schlegel’s job is to build trust with Microsoft customers, and he’s using AI to do it.

“Specialists are solution sellers,” says Schlegel, a data and AI specialist for Microsoft Digital Sales. “We help customers solve problems with an eye toward helping them move down the path of digital transformation. To do this, we also must develop high-quality relationships with them.”

Schlegel introduces customers to Microsoft technologies that can help them efficiently address their business needs. He says that he and other solution seller specialists can identify opportunities for sales based on customer purchase history, Microsoft Azure consumption levels, and workload usage.

However, it can be challenging for Microsoft sellers to holistically understand their customers because of the company’s scale and the broad set of rich products it offers to customers.

“I could do this manually, but it would consume most of my time,” Schlegel says. “If a tool gives me recommendations, I could spend more time with the customer.”

Enter Daily Recommender, an internal AI solution that uses Microsoft Dynamics 365, Azure, and an AI interface to provide data-driven recommendations to sellers based on over 1,000 data points per customer, including past purchases, marketing engagement, and digital and local event attendance.

“A lot of companies invest in AI solutions,” says Praveen Kumar, a principal program manager in Microsoft Digital. “The primary differentiator is that Daily Recommender presents specialists and account executives with meaningful data, insights, and artifacts so they can make the right decisions.”

[Learn more about how Microsoft Digital developed Daily Recommender. Learn how Microsoft Digital modernized the toolset Microsoft sellers use.]

Scoping customer conversations based on past engagements

Daily Recommender uses internal and external data points such as current consumption levels, licenses, customer interactions with marketing material, and machine learning techniques such as collaborative filtering and natural language processing to identify the next logical product recommendation for the customer.
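The collaborative-filtering idea, recommending products whose ownership pattern matches what a customer already uses, can be illustrated with a toy item-based filter. This is only a sketch with made-up account and product names; the production models, drawing on 1,000-plus signals, are far richer:

```python
from math import sqrt

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def recommend(purchases, customer):
    """Rank unowned products by ownership similarity to the customer's products."""
    customers = sorted(purchases)
    products = sorted({p for owned in purchases.values() for p in owned})
    # One column per product: which customers own it.
    column = {p: [int(p in purchases[c]) for c in customers] for p in products}
    owned = set(purchases[customer])
    scores = {p: sum(cosine(column[p], column[q]) for q in owned)
              for p in products if p not in owned}
    return sorted(scores, key=scores.get, reverse=True)

purchases = {
    "contoso": {"sql": 1, "powerbi": 1},
    "fabrikam": {"sql": 1, "powerbi": 1, "azure": 1},
    "adatum": {"sql": 1},
}
recommend(purchases, "adatum")  # -> ["powerbi", "azure"]
```

Here "powerbi" ranks first because its owners overlap most with the customers who, like adatum, own "sql".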

“We help customers achieve the solutions they intend to build in the most efficient way using Microsoft technologies,” says Siddharth Kumar, a principal machine learning scientist manager who works on the team that provides machine learning solutions to Daily Recommender. “With these curated recommendations, sellers can spend less time creating sales pitches and focus on having meaningful and useful discussions with customers.”

These recommendations and insights are presented in a curated dashboard, which is available to the entire Digital Sales team.

“Let’s say a Microsoft account team is responsible for over 100 customers,” says Salman Mukhtar, the director of business programs for the Digital Sales team. “Daily Recommender gives you access to product recommendations for the accounts across your solution areas. The app also provides a rationale for the recommendation, what material you can use, and a suggested action date. It takes the AI to the last mile.”

Using Daily Recommender, account executives and specialists work together to understand what may be top of mind for the customer, review product recommendations, identify the right customer contacts, and provide customer-centric recommendations based on the customer’s needs and interests.

For example, say a customer downloaded a piece of Microsoft content showcasing how to move legacy SQL servers to the cloud. Daily Recommender could prompt a specialist to provide that customer with resources for cloud migration and suggest that they unlock the advanced capabilities of the cloud by investing in a business intelligence tool like Microsoft Power BI.

“Within minutes, I have a clear picture of what’s currently driving the customer and how I can structure my conversations based on their current consumption and interest in Microsoft products,” says Alexander Mildner, an account executive for Microsoft Digital Sales. “If I had this two and a half years ago, my life would have been easier.”

Equipped with this data, sellers and account executives can collaborate and connect customers with Microsoft resources, products, and specialists to achieve their projects’ goals. Specialists can work with customers to create execution plans or discuss the technical details of implementation, often within their area of expertise.

“Collaboration is an essential part of an account team,” Mildner says. “The more insights you can use as a specialist or account executive, the better.”

Committing to continuous improvement over time

With Daily Recommender, one out of every three recommendations qualifies as a sales opportunity. This is almost four times higher than the industry average of 6 to 10 percent. The app becomes more intelligent over time as it continues to learn from seller actions and sales outcomes. The team also takes a hands-on approach to improving Daily Recommender by analyzing clickthrough and seller action data and soliciting feedback through in-person roadshows, emails, and community calls.

“I think we will look back in a year or two, and we won’t be able to imagine a time before this tool,” Mildner says. “I’ve already seen the progress that the tool has made in the past two years, which tells you how strong its AI is.”

Daily Recommender was built for the Microsoft Sales team by Microsoft Digital as part of an ongoing effort to transform the tools and processes that the company provides for its sales force.

“For a sales model that requires sellers to do active prospecting at scale, we needed a robust, AI-enabled solution that would help sellers quickly identify and actively engage with customers to make faster buying decisions,” says Hyma Davuluri, a principal program manager in Microsoft Digital. “This led to the development of Daily Recommender, which enabled sellers to identify and act on sales opportunities.”

The journey to create and improve Daily Recommender has been educational for Mukhtar and the team. They have learned that the best way to improve the experience is to create synergy across business groups, sellers, and AI experts.

The result?

The Digital Sales team was able to transform the sales process with AI.

Mukhtar says that supporting this collaboration took time, but it started with bringing people together to invest in changing the way the Microsoft sales teams organized and approached their customers for prospecting new business.

“Changing people’s behavior isn’t easy,” Mukhtar says. “We focused on bringing together different stakeholders to invest in changing our processes. We found that value is really unlocked by how well you bring together AI and the sales process, seller behavior, and customer needs and integrate into a modern app.”

Related links

The post Transforming sales at Microsoft with AI-infused recommendations and customer insights appeared first on Inside Track Blog.

Retooling how Microsoft sellers sell the company http://approjects.co.za/?big=insidetrack/blog/retooling-how-microsoft-sellers-sell-the-company/ Mon, 16 Sep 2019 22:35:17 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=4805 Selling Microsoft hasn’t been easy. Just ask the 25,000 people who pitch the complex array of products and services that the company sells to customers across the globe. Those sellers used to wade through more than 30 homegrown applications to get their jobs done, often spending more time filling out forms and cross referencing tools […]

Selling Microsoft hasn’t been easy.

Just ask the 25,000 people who pitch the complex array of products and services that the company sells to customers across the globe.

Those sellers used to wade through more than 30 homegrown applications to get their jobs done, often spending more time filling out forms and cross referencing tools than talking to customers.

“We needed to modernize our toolset,” says Kim Kunes, who leads the Microsoft Digital team that provides the tools and experiences that the sales and marketing organizations use to sell the company’s wares. “We needed to turn what had been our sellers’ biggest headache into an asset that would help them flourish in a connected, cloud world.”

A complete overhaul of the company’s tools and processes is fully underway, says Kunes, but it’s not complete.

“We’ve winnowed a disconnected, heavily seamed group of tools down to a core group of critical experiences connected in ways that make sense for our sellers and marketers, but there is still work to do,” she says. “We’re in year two of a multiyear journey to revamp our sellers’ toolset.”

Kunes says her team, like all of Microsoft Digital, is shifting its focus from working for internal partners in a traditional IT manner to building experiences in partnership with the business that make sense for users. In this case, those users are the heart and soul of the company’s revenue-generating selling community.

“Now, just like any other product team at Microsoft, we operate with a baseline budget that funds a group of FTE (full-time employee) engineers and a continuous prioritization and planning process to deliver functionality most critical to our users and businesses,” she says. “Now we’re thinking, ‘What should the seller experience be from start to finish? Are we doing everything we can to make their experience as seamless as possible?’”

This transformation has Microsoft Digital’s Commercial Sales and Marketing Engineering Team working in new ways. It’s centralizing and standardizing the many channels of feedback and data to derive a picture of users’ unmet, unarticulated needs. The shift is built around a new focus on how Microsoft Digital approaches customer research. It’s adopting a fluent, modern look and feel that’s consistent with how the rest of Microsoft is approaching design. It’s using DevOps and other agile engineering principles that keep the team focused on the user’s end-to-end experience as it moves fast and flexibly.

“All of our sellers’ regular tasks need to be in one place and arranged so that it’s efficient and virtually seamless to flow through them,” Kunes says. “Everything has to be intuitive. There should be no big learning curves. They shouldn’t have to figure out how to use a new application every time they want to get something done.”

This laser focus on the customer experience has required the team to think and work differently.

“I’ve seen our team’s culture shift,” Kunes says. “In the past we were focused on incremental improvements to make the process and tools better. Now we’re thinking bigger. For example, we’re beginning to use AI and machine learning to curate the gold mine of valuable data we have to surface critical next best action insights to our sellers and marketers.”

This transformation is driving results that are paying dividends, says Siew Hoon Goh, the Microsoft director of sales excellence in charge of making sure the tools and experience that Microsoft Digital is building meet the needs of the company’s digital sales force.

“Our sellers do recognize that there has been lots of progress,” Goh says. “Technology is one of the best enablers for us to scale to bigger and better things and increased revenue for the company.”

Microsoft’s umbrella tool for sales is Microsoft Sales Experience. Known as MSX, it’s an integrated solution built on Dynamics 365, Microsoft Azure products, Office 365 productivity and collaboration services, and Power BI. In July, MSX was upgraded to the new modern Microsoft Dynamics 365 for Sales user experience. It includes a simplified user experience and integration into LinkedIn, Microsoft Teams, and several internal tools.

“Our MSX instance is one of the largest implementations of Dynamics 365 in the company,” says Ismail Mohammed, a principal program manager on the Microsoft Digital team working to make life better for the company’s sales field. “Ultimately, we want to make our tools more intuitive and help our sellers get their time back so they can focus more of their time on selling.”

MSX is the gateway to several important seller experiences that you’ll read about here:

  • Portal, a second generation of MSX meant to be a true single pane of glass for sellers to work from
  • Account Based Marketing, a transformed approach to sifting through marketing sales leads to find the ones that are worth pursuing
  • Daily Recommender, a machine learning-based discovery engine that advises sellers on the specific leads they should pursue next
  • Account 360, an aggregated view of customer content that helps sellers find the right customer information before they reach out to leads

Charting the evolution of MSX

When MSX launched in 2015, it replaced eight on-premises instances of Dynamics CRM 2011, each of which was highly customized and complex. Built on Azure Cloud Services, MSX brought all those experiences into one cloud-based platform.

Though it was a big improvement, it was still just a beginning.

“For perspective, MSX started out as a collection of links,” Kunes says. “It was nice to have a place where you could get to everything, but it really wasn’t the seamless, single-pane-of-glass experience that we are working toward.”

The team has continued to refine MSX, pushing hard to evolve it into an experience in which sellers feel more productive. They felt less so when they pieced their sales story together with their own offline Excel spreadsheets, PowerPoint decks, and secret contact lists (the latter of which are no longer allowed anyway, because of GDPR, or General Data Protection Regulation).

The original MSX solution is gradually making way for MSX Portal, a new, transformative experience that is being rolled out to the company’s sellers role by role, says Steffie Hofmann, Microsoft Digital’s lead MSX Portal program manager. MSX Portal debuted in February, when the experience was provided to the company’s customer-success managers, a specialized sales role at Microsoft.

MSX Portal helps sellers figure out what the next best action they can take. They get suggestions on their homepage and in context-driven ways within their workflow.

“Each time we ship MSX Portal to a new group of sellers, the experience improves dramatically,” Hofmann says. “They no longer have to leave the tool to get their work done.”

The goal has long been to have MSX provide sellers with everything they need as they reach out to customers each day, says Steve Thomas, Microsoft Digital’s lead software engineering manager for MSX Portal.

“We built MSX Portal with the idea of making it a great place for our sellers to start their day, to get their work done,” Thomas says. “We wanted to get past the notion that it was something they had to work around.”

The rollout of MSX Portal is expected to be complete by the end of the 2019 calendar year.

Sifting through the noise

Sales leads pour into a company the size of Microsoft from all directions, at massive scale. After their interest gets piqued by the company’s wide-ranging marketing efforts, leaders at other businesses watch webinars and make decisions:

  • Should CIOs invest in Microsoft’s stack?
  • Should CEOs ask Microsoft to see how the company can help them digitally transform?
  • Should IT pros ask for Microsoft’s help via product websites and customer-service lines and at conferences?

[Read this case study on how Microsoft uses a bot to improve basic lead qualification to see how millions of potential sales leads received each year are qualified down to thousands. Read about how we use AI to serve up the next best lead to sellers.]

All those many thousands of leads get funneled into the Microsoft Global Demand Center.

“Before we move a lead to one of our sellers, we nurture them in the Global Demand Center,” says Prabhu Jayaraman, a group engineering manager who helps lead Microsoft Digital’s marketing effort. “They don’t go to our sellers right away—first we need to make sure our leads are high quality and have a high propensity to result in wins before we transfer them.”

It used to be that all those marketing-driven leads would get dumped on sellers, tossed over the fence with little vetting or insight.

“Sellers would look at these queues, they’d see 25 pages of leads, and randomly say, ‘This looks interesting, let me go talk to them,’” Jayaraman says. “The problem was the lead they picked out of the 10,000 options might not be the next best lead to pick.”

To help sellers get to the right lead, Microsoft adopted a new approach to how it markets to larger customers by infusing AI into its Account Based Marketing (ABM) program.

“ABM is not a tool, it’s a concept,” Jayaraman says. “It’s about stitching these opportunities together in ways that make sense—when one company contacts us in five different ways, we will connect those together into one opportunity.”

To Vinh Nguyen, ABM is about bringing marketing and sales closer together—something it does by weaving relevant contacts and insights together in ways that help sellers be more effective.

“It may sound simple, but it hasn’t been,” says Nguyen, the senior program manager leading Microsoft Digital’s efforts around Account Based Marketing. “We’re trying to use machine learning and automation to optimize when sellers should engage with a customer on products that their employees have shown interest in.”

The team has been working for more than a year on getting it right.

“We’re using Marketo marketing software to listen to our customer interaction signals,” Nguyen says. “When signals come into the Global Demand Center, we feed them into our machine-learning models.”

Those ML model-fueled recommendations are fed into the Daily Recommender, where sellers use them to decide which leads to pursue on a daily basis.

Finding the best leads with Daily Recommender

Until recently, Microsoft’s most successful sellers were those best at finding gold nuggets of customer information hidden in the company’s many sales tools. That was when star sellers were known for maintaining their own offline databases and sales pitches more than they were for building close relationships with customers.

“Why should our most successful sellers be the ones who are the best at navigating complex systems?” Kunes says. “Why shouldn’t success be about having intelligent, human connections with customers?”

This culture was fed by the fact that the company’s sales strategy was built around educated guesswork—each quarter, SWAT team-like groups of sellers would gather, discuss the indicators that each of them were seeing, talk it out, and use that war-room discussion to set sales targets for the upcoming quarter.

All of this made selling more art than science.

The team looked to change that when it developed Daily Recommender, a machine-learning tool that makes individualized recommendations for each seller, says Hyma Davuluri, principal program manager in Microsoft Digital.

“With Daily Recommender, we’re pushing the envelope on using AI to influence large-scale selling at Microsoft,” Davuluri says. “It’s also helping us accelerate our digital transformation journey across the company’s sales organization.”

Launched three years ago, Daily Recommender has been rolled out to about 1,000 sellers and, as it has learned and matured, is starting to show very promising results. So says Salman Mukhtar, the director of business programs who leads the selling community’s use of Daily Recommender.

“It’s Microsoft using Microsoft,” Mukhtar says. “We’re using SQL Server, Azure Fabric, Azure Machine Learning—we’re using a lot of our own technology together and connecting it on top of Dynamics.”

Microsoft started small with the intent to prove the value of an AI-enabled discovery engine that would improve targeting of new business while reducing the preparation efforts by sellers. So far, the results have been promising—one in four recommendations pursued by sellers result in a customer opportunity or engagement.

“Machine-to-human AI requires a mindset change,” Mukhtar says. “It requires legacy processes to be enhanced and new habits to be formed across the sales force.”

For example, sellers must give up their personalized Excel spreadsheets and PowerPoint decks. “The sponsors and developers of our legacy toolkits and processes need to be bold and decommission where necessary,” he says.

The needed changes are happening but are not complete yet.

“Digital transformation is a journey—for us it involves data, tools, processes, and people all enabled by AI,” Mukhtar says. “We are scaling up our enablement efforts to transform Daily Recommender into the primary discovery engine for the business.”

Account 360 stitches the customer story together

Historically, it has been a challenge for sellers, as they reach out, to understand what relationship a customer has with Microsoft.

“The key challenge for sellers was to gather consistent insights in order to have a productive conversation,” says Alioscha Leon, Microsoft Digital’s program manager for Account 360, a new MSX sales tool that seeks to stamp out that legacy of opaqueness. “They would have to go to several tools with different interfaces and search functionality in order to get the information required to have a productive conversation, and there still was no guarantee that they were getting the full picture.”

To change that, Microsoft Digital rolled out Account 360 in May 2019.

It was introduced in beta form to an initial wave of sellers from Microsoft Inside Sales. Built into MSX Portal, it aggregates multiple tools into one, with a consistent user interface, giving sellers a comprehensive view of their customers. More than 1,300 sellers volunteered to try out the tool, exceeding the goal of 800.

“We allow sellers to very quickly prepare for an interaction with a customer,” Leon says. “We’re making it easy for them to have relevant conversations without having to do huge amounts of research, increasing the seller productivity and interaction quality.”

Account 360 allows sellers to see Microsoft’s agreements across modern and legacy systems, revenue across products, marketing interactions, partner associations, and account profiles. It also shows what opportunities and leads are already being pursued, and what products and services the customer is already consuming. The insights are delivered quickly and consistently, through an interface tailor-made for sellers.

The goal is for the sellers to get all the info they need to enable a productive customer interaction in the Account 360 interface. But if they need to go deeper, a linking strategy allows them to navigate to additional resources.

A first version of Account 360 went live in July for all seller audiences. “We continue to have exponential growth in both monthly and weekly unique users, with 3,000 unique monthly users and a run rate of 1,300 weekly unique users in August,” Leon says.

Dynamics 365 is the backbone of selling at Microsoft

MSX’s heavy use of the Dynamics 365 platform is very helpful, says Linda Simovic, principal group program manager for the Dynamics 365 product group.

“I think the way we’re drinking our own champagne inside the company is amazing,” Simovic says. “With 25,000 sellers or more in the company, it gives us a lot of great ways to test out our products and services.”

Showcasing the way Microsoft uses Dynamics 365 products also helps other companies understand what they can do with the platform, she says.

Simovic says the Dynamics team continuously talks with the Microsoft selling community and Microsoft Digital, weaving their steady stream of feedback into Dynamics 365 as fully and quickly as possible.

“We actually say to the MSX team, ‘We’re thinking about building this—what do you think?’” she says. “We want them to use it and to let us know if it works. It’s a litmus test to see if what we’re thinking is a good idea or not.”

The recent decision to upgrade MSX to the latest version of Dynamics 365 helps with this—now the Microsoft Digital team can try out new features as soon as they’re ready for testing.

“We want to be able to cover their needs out of the box as much as possible,” Simovic says. “The better we can support the company’s complex sales motion, the better we can support our external customers.”

Mohammed agrees, calling out how the two teams have worked together to bring new enterprise-level capabilities into Dynamics 365.

In fact, he says, the teams are working so closely together that in some cases the Microsoft Digital Commercial Sales and Marketing Engineering Team is co-developing directly with the Dynamics team to add features that the sales teams need.

“That’s a big change from our historical approach of building in-house bridge software,” Mohammed says. “This is a pretty major leap forward for us—we’re working hand-in-hand with the product group to build new capabilities for customers.”

For Kunes and her Microsoft Digital team, the successful partnership with Dynamics is just one more signal that their new, transformed approach to supporting the company’s complex sales motion is working.

“We’ve laid the groundwork for us to finally get this right for our sellers,” Kunes says. “Now we just need to go finish what we started. It’s an exciting time to be working on this team.”

Related links

The post Retooling how Microsoft sellers sell the company appeared first on Inside Track Blog.

Tackling environmental sustainability from the inside out at Microsoft http://approjects.co.za/?big=insidetrack/blog/tackling-environmental-sustainability-from-the-inside-out-at-microsoft/ Tue, 13 Aug 2019 15:56:40 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=4775 Microsoft operates 100 percent carbon neutral. It also levies an internal carbon tax on its own business units to help pay for climate change and environmental sustainability initiatives, including giving technology grants to environmental projects outside of Microsoft. Both are part of being a good steward of the environment, says Elizabeth Willmott, carbon program manager […]

Microsoft operates 100 percent carbon neutral. It also levies an internal carbon tax on its own business units to help pay for climate change and environmental sustainability initiatives, including giving technology grants to environmental projects outside of Microsoft.

Both are part of being a good steward of the environment, says Elizabeth Willmott, carbon program manager at Microsoft, but they are only a beginning. There is a much bigger opportunity within reach.

“Microsoft is in a unique position with its enormous network of customers, partners, and suppliers,” Willmott says. “We have an incredible reach with our software and services, and with our devices. If we can use that reach to drive positive change for the environment, then we can really start to help the planet.”

[Read this case study on how Microsoft is using machine learning to minimize its carbon footprint and reduce its energy consumption.]

Employees are asking about and encouraging Microsoft’s efforts all the time, and the company’s leaders are also pushing to find ways to do more.

“This is definitely a year when we’re on the move in terms of doing a lot more on sustainability,” says Brad Smith, Microsoft’s president and chief legal officer, speaking to company employees at a recent internal event. “It starts with getting our house in order, but then it ultimately connects to how we can help everyone on the planet use technology to drive sustainability goals.”

To mitigate its impact on the climate, the company’s efforts have been in three main areas: decreasing its carbon emissions through energy efficiency and conservation; moving to renewable energy for its datacenters and buildings; and offsetting the carbon emissions of business air travel.

“We’re on a leadership path in these areas,” Willmott says. “Now we are leaning in to encourage and help our suppliers, our partners, and our customers to do the same.”

Levying a tax for good

Microsoft charges internal teams a tax of $15 per metric ton on all operational carbon emissions, a fee that recently went up from approximately $8 per metric ton. The money is paid into a sustainability fund that is used to achieve carbon neutrality via efficiency, renewable energy, and offsets. From there it is granted to Microsoft internal teams and external organizations to address climate change and other environmental sustainability priorities. This process is transparent, and the investment areas are tracked publicly on Microsoft's Sustainability Fund Power BI Dashboard.
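As a back-of-the-envelope illustration of how the fee works, the sketch below applies the $15-per-metric-ton rate from the article to a team's emissions. The function name and the example emissions figure are hypothetical; this is not Microsoft's internal tooling.

```python
CARBON_FEE_PER_METRIC_TON = 15.0  # USD, up from roughly $8 previously

def sustainability_fund_contribution(emissions_metric_tons: float) -> float:
    """Return a team's contribution to the sustainability fund, in USD."""
    return emissions_metric_tons * CARBON_FEE_PER_METRIC_TON

# A hypothetical business unit emitting 2,000 metric tons of CO2e
# would pay $30,000 into the fund.
print(sustainability_fund_contribution(2000))  # 30000.0
```
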

Among those investments to accelerate progress is Microsoft’s AI for Earth program, a commitment by Microsoft to spend $50 million over five years to support projects that use Microsoft’s AI and machine-learning technology to tackle environmental challenges, says Bonnie Lei, AI for Earth program manager at Microsoft.

“We’re supporting individuals and organizations that are building AI models that are broadly useful for the environment,” Lei says. “We want to provide exponential impact with our investment, and so we work with our partners to make these models available on the AI for Earth website to the wider public.”

For example, Microsoft is supporting SilviaTerra, a California company that is using an AI for Earth grant to create a national forest inventory, which is now being piloted in an effort to help people who own small private forest land receive payment for keeping that land forested.

“They created machine-learning algorithms that were scaled through Microsoft Azure to create the first map of every single tree in every forest in the continental US, down to the tree’s species and size,” Lei says.

Landowners can use those maps to better manage their land, and, more importantly, they are testing a new approach to qualify their lands as viable carbon offsets. This means that, for the first time, these owners can be paid to keep their land forested by companies looking to offset their carbon emissions.

“Previously, they were not able to enter the carbon market due to the high cost of overhead and monitoring,” Lei says. “Now they have more incentive and a way to value keeping their forest stands standing.”

Landowners are also leveraging the maps to improve how they manage their land, including better preparing themselves for fire danger and drought.

Easing the environmental cost of buildings

One of the core ways Microsoft aims to reduce its carbon footprint is by transforming how it constructs and manages its buildings.

The company is currently rebuilding part of its headquarters in Redmond, Washington, and it’s seeking to do so in ways that slash the amount of carbon released into the atmosphere that is typical of new construction, says Katie Ross, global sustainability program manager for Microsoft Real Estate and Security.

“Traditionally, the building sector has been focused on operational carbon,” says Ross, referring to the carbon associated with the energy used to run a building. “But that’s only half of the carbon problem in the building sector—the other half is embodied carbon, or the carbon that is emitted when building materials are manufactured.”

Think about the latter as “upfront carbon.”

“It’s the carbon you expend before you even flip the switch to turn on the building,” Ross says.

Unlike operational carbon, which you can reduce to zero over the life of the building by implementing energy-efficiency programs and by sourcing renewable energy, embodied carbon has already been spent to make materials like concrete: pulling the raw materials out of the ground, processing them, and so on.

“That’s a carbon footprint number you can only reduce when you pick the material, and once the building is built, you cannot change it,” Ross says.
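The split Ross describes can be sketched as a simple lifecycle sum: operational carbon accrues year by year and can be driven to zero, while embodied carbon is a fixed term locked in at construction. All figures below are invented for illustration, not actual numbers for Microsoft's campus.

```python
def lifetime_carbon(embodied_tons: float,
                    annual_operational_tons: list[float]) -> float:
    """Total lifecycle emissions: fixed embodied carbon plus
    operational carbon accumulated over the building's life."""
    return embodied_tons + sum(annual_operational_tons)

# Operational emissions decline to zero through efficiency programs
# and carbon-free electricity...
ops = [500, 300, 100, 0, 0]  # tons CO2e per year
# ...but the embodied carbon (say 10,000 tons) cannot be changed
# once the building is built.
print(lifetime_carbon(10_000, ops))  # 10900
```

This is why material selection is the only lever for the embodied term: once the concrete is poured, that part of the footprint is permanent.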

Microsoft is constructing 17 new buildings on its east campus—a total of 2.5 million square feet of new space.

“We knew we wanted to tackle both sides of the carbon equation, with the aim to build zero-carbon buildings,” Ross says. “To get there, we’re focusing on reducing our energy usage, we’re sourcing 100-percent carbon-free electricity, we’re removing natural gas—including for cooking—in our cafes, and we’re using a new tool, Embodied Carbon Calculator for Construction (EC3), to track and reduce our embodied carbon.”

Microsoft is partnering with the University of Washington’s Carbon Leadership Forum and the global project-development and construction company Skanska to pilot EC3 on its new campus. This open-source, free-to-use tool is helping the Microsoft Real Estate and Security construction team assess the embodied carbon within construction materials it considers for the project.

“A lot of this is uncharted territory,” Ross says.

Efforts by corporations to track and reduce the embodied carbon impact of their buildings are in their infancy, she says.

“So far we are on target to reduce our embodied carbon emissions by 15 to 30 percent,” she says. “We are learning a lot about what’s possible by piloting this tool and hope to create a roadmap to support the industry targeting embodied carbon reductions in future projects.”

Energy and airplanes

Willmott says that one of the most notable ways Microsoft has made an impact on the environmental sustainability side is by procuring renewable energy to power the company’s datacenters.

So far, Microsoft has procured enough renewable energy to power 60 percent of its datacenter load by the end of this calendar year. The goal is to continue on a path to power 100 percent of its datacenters with renewable energy.

One area the company is exploring to further shift behavior and reduce carbon emissions is by encouraging employees to skip carbon-intensive airline trips in favor of using Microsoft Teams to meet and collaborate.

When employees do fly, Microsoft offsets the associated emissions by investing in verified carbon-offset projects, such as a first-of-its-kind forest conservation project in King County, Washington. The company’s environmental sustainability team vets all offset projects closely to ensure that they are having a measurable impact.

“These investments have supported the protection of 5.1 million acres of sensitive land worldwide,” Willmott says. But even though scaling up the impact of these “natural climate solutions” investments is meaningful, the team is very aware that avoiding carbon emissions altogether is the first and best line of action.

It’s all part of Microsoft’s commitment to sustainability, Willmott says.

“We’re using Microsoft tools and purchasing decisions to prove what our research has told us—that AI and other technologies can help usher in a low-carbon transition and protect the planet from catastrophic degradation,” she says.

According to that research, significantly greater adoption of AI in key sectors could reduce greenhouse gas emissions enough to zero out the annual emissions of Australia, Canada, and Japan combined.

“There is so much we can do if we all work together on this,” Willmott says. “Let’s go do this.”

Read this case study on creating business intelligence with Azure SQL Database and this case study on using machine learning to develop smart energy solutions to see how Microsoft is using technology to minimize its carbon footprint and reduce its energy consumption.
