For years, our IT teams have been focused on scale, reliability, and operational excellence. Those priorities didn\u2019t change. What changed were the possibilities.<\/p>\n\n\n\n
Suddenly, engineers could draft code in seconds, summarize complex systems instantly, or automate work that had once consumed hours or days. It was an opportunity to take the skills and capabilities of our people and amplify them with AI.<\/p>\n\n\n\n
That realization forced us to step back and ask harder questions.<\/p>\n\n\n\n
How do you help thousands of engineers understand what AI can actually do to impact their day-to-day work? How do you move from experimentation to trust? And how do you adopt AI in a way that strengthens engineering fundamentals instead of eroding them?<\/p>\n\n\n\n
The answer came in the form of a phased journey grounded in people, culture, and continuous learning.<\/p>\n\n\n\n
Phase 1: Awareness and access<\/h2>\n\n\n\n It might sound surprising for a story about engineering processes, but our first challenge wasn\u2019t technology; it was understanding.<\/p>\n\n\n\n
When generative AI entered the conversation, most engineers saw the headlines and dabbled in various tools, but few understood fully what it meant for their work. Some were excited, others were wary. Many simply didn\u2019t know where to start. That gap between awareness and practical value was the first barrier we had to address.<\/p>\n\n\n\n
We realized early that top-down mandates wouldn\u2019t work. Telling engineers to \u201cuse AI\u201d without context or relevance would only deepen skepticism. Instead, we focused on something both simpler and more difficult: Exposure.<\/p>\n\n\n\n
We started by making AI visible and accessible in the tools engineers already used. GitHub Copilot. Microsoft 365 Copilot. Early copilots embedded directly into engineering workflows. The goal wasn\u2019t immediate productivity gains. It was familiarity. Letting engineers see, firsthand, what AI could and couldn\u2019t do.<\/p>\n\n\n\nJust as important, we talked openly about limitations.<\/p>\n\n\n\n
AI wasn\u2019t perfect. It hallucinated. It made confident mistakes. And that honesty mattered. By framing AI as an assistant, we reinforced the role of engineering judgment. Engineers didn\u2019t need to fear losing control. They needed to understand how to stay in control.<\/p>\n\n\n\n
We also made experimentation safe.<\/p>\n\n\n\n
No quotas. No forced adoption metrics. Engineers were encouraged to try AI on low\u2011risk tasks: summarizing documentation, generating test cases, or exploring unfamiliar codebases. Small wins built confidence, confidence built curiosity, and curiosity drove organic adoption.<\/p>\n\n\n\n
As that experimentation took hold, the mindset began to shift.<\/p>\n\n\n\n
\u201cWe encouraged tool usage and adoption so people would at least play around with AI,\u201d says Mukul Singhal, a partner group engineering manager in Microsoft Digital. \u201cAnd once they did, they started seeing the value. That\u2019s when the mindset shifted from \u2018AI might replace me\u2019 to \u2018AI can be my companion.\u2019\u201d<\/p>\n\n\n\n
Over time, conversations changed from \u2018Should we use AI?\u2019 to \u2018Where does AI help most?\u2019<\/p>\n\n\n\n
Engineers began sharing prompts, tips, and lessons learned with one another. What started as individual exploration turned into community learning. Awareness gave way to momentum.<\/p>\n\n\n\n
Phase one was about providing access to explore, to question, and to learn. And that foundation made everything that followed possible.<\/p>\n\n\n\n
Phase 2: Culture shift<\/h2>\n\n\n\n Access created awareness and awareness created curiosity.<\/p>\n\n\n\n
As more engineers began experimenting with AI, we noticed a pattern. Some teams were moving faster, learning faster, and reducing friction in their day\u2011to\u2011day work. Others stalled after initial trials. The difference wasn\u2019t technical skill or capability; it was mindset.<\/p>\n\n\n\nTo move forward, we had to shift how AI was perceived, from something optional or experimental to something that was simply part of how modern engineering gets done.<\/p>\n\n\n\n
That meant normalizing AI as a trusted partner in the engineering process.<\/p>\n\n\n\n
Leaders played a critical role in that shift. Rather than positioning AI as a productivity shortcut, they framed it as a way to strengthen engineering fundamentals: clearer design discussions, better documentation, faster feedback loops, and more time for deep problem\u2011solving. The message was intentional and consistent. Using AI wasn\u2019t about cutting corners; it was about reimagining how work gets done.<\/p>\n\n\n\n
We also had to address a fear that surfaced early: that AI adoption was a signal of replacement rather than empowerment.<\/p>\n\n\n\n
\u201cPeople started shifting from the mindset of \u2018Will AI work?\u2019 to \u2018AI is working for me,\u2019\u201d says Veera Mamilla, a principal group engineering manager in Microsoft Digital. \u201cI think that was a very transformational shift, to where I believe a lot of engineers in the organization started believing in AI.\u201d<\/p>\n\n\n\n
That framing mattered.<\/p>\n\n\n\n
As engineers incorporated AI into their workflows, success stopped being measured by output alone. The focus shifted to outcomes. Did AI help you understand a system faster? Did it surface risks earlier? Did it free up time to focus on higher\u2011value work?<\/p>\n\n\n\n
Over time, AI stopped feeling like a novelty. It became part of the engineering fabric. We reinforced it through leadership modeling, peer learning, and shared success stories. Teams no longer asked whether AI belonged in their workflows. They asked how to use it responsibly and effectively.<\/p>\n\n\n\n
Phase 3: Upskilling and role evolution<\/h2>\n\n\n\n Once AI moved from curiosity to expectation, the challenge of skill building became unavoidable.<\/p>\n\n\n\n
From the start, we made a deliberate choice: This would be an upskilling and reskilling journey, not a wholesale replacement of roles. The goal wasn\u2019t a new workforce. It was an investment in the one we had.<\/p>\n\n\n\n
That decision shaped everything that followed.<\/p>\n\n\n\n
Early upskilling efforts focused on practical entry points. Prompt engineering. Tool literacy. Understanding how copilots and early agents behaved in real engineering workflows. We treated these as something every engineer needed to experiment with, regardless of discipline.<\/p>\n\n\n\n
But it quickly became clear that skills alone weren\u2019t the full story. Roles themselves were starting to evolve.<\/p>\n\n\n\nAcross software development, service engineering, and cloud network engineering, the work was shifting from manual execution toward orchestration and oversight. Engineers were no longer expected to do every task end\u2011to\u2011end by hand. Instead, they were learning how to guide AI, review its output, and decide where automation made sense and where it didn\u2019t.<\/p>\n\n\n\n
As part of this shift, we began researching how the industry itself was redefining engineering roles. Leaders examined emerging job descriptions from across the market and compared them with Microsoft\u2019s own role frameworks. At the time, there was no formal \u201cAI engineer\u201d role in the internal job library. Rather than creating a new title, the focus stayed on evolving expectations within existing roles.<\/p>\n\n\n\n
The idea of an \u201cAI\u2011native engineer\u201d emerged not as a job description, but as a mindset.<\/p>\n\n\n\n
An AI\u2011native engineer still understands systems, architecture, and risk. What\u2019s different is how that expertise gets applied. Routine tasks are delegated to AI. Judgment, design, and accountability stay with the human. Engineers move from doing all the work themselves to supervising work done in partnership with AI.<\/p>\n\n\n\n
\u201cYour title might still be software engineer or principal engineer,\u201d says Ragini Singh, a partner group engineering manager in Microsoft Digital. \u201cBut if you\u2019re acting like an AI engineer, what does that actually mean? That question helped us start defining how these roles were evolving.\u201d<\/p>\n\n\n\n
This evolution looked different across disciplines. Software engineers focused on AI\u2011assisted coding, test generation, and spec\u2011driven development. Service engineers leaned into AI for incident response, knowledge capture, and operational decision support. Cloud network engineers began moving from manual intervention toward intelligent orchestration and agent\u2011assisted troubleshooting. The common thread wasn\u2019t identical tooling; it was a shared shift toward higher\u2011order work and reduced toil.<\/p>\n\n\n\n
Phase 4: Embedding AI across the engineering lifecycle<\/h2>\n\n\n\n By this phase, we knew individual productivity gains were simply the starting point for larger and broader benefits.<\/p>\n\n\n\n
Early on, most AI usage showed up in familiar places: Code suggestions, documentation summaries, quick answers. Useful, but fragmented. The bigger opportunity emerged when we stepped back and asked a harder question: What would it look like if AI were embedded across the entire engineering lifecycle, not just used at isolated moments?<\/p>\n\n\n\n
We stopped thinking in terms of tools and started thinking in terms of flow. Design. Build. Test. Deploy. Operate. Improve. AI needed to show up across all of it, in ways that reinforced how engineers already worked.<\/p>\n\n\n\nIn software engineering, that meant pulling AI earlier into the process. We began using it to help draft requirements, reason through design options, and review code with broader system context to accelerate how quickly we could get to informed decisions. Coding assistance mattered, but it was no longer the center of gravity.<\/p>\n\n\n\n
Testing and quality followed a similar pattern. AI supported test generation, defect analysis, and code review, reducing repetitive effort and helping issues surface sooner. That gave engineers more time to focus on quality and architecture instead of cleanup.<\/p>\n\n\n\n
In service engineering, we embedded AI into incident management and operational workflows. Engineers used it to summarize incidents, surface relevant knowledge, and analyze signals across systems. In cloud network engineering, AI helped shift work away from manual intervention toward orchestration and intelligent troubleshooting. Across disciplines, the principle stayed the same: AI should reduce friction, not introduce it.<\/p>\n\n\n\n
As we scaled this approach, one thing became clear. Embedding AI wasn\u2019t just a technical exercise. It was a systems change.<\/p>\n\n\n\n
\u201cIf AI is only showing up at one step, you don\u2019t get the full value,\u201d says Sudhakar Sadasivuni, a principal group engineering manager in Microsoft Digital. \u201cThe real impact comes when it\u2019s integrated across the lifecycle, where engineers can design, build, operate, and learn faster as a system.\u201d<\/p>\n\n\n\n
As AI became part of core workflows, engineers remained accountable for outcomes. AI output was reviewed, tested, and validated like any other engineering input. Embedding AI didn\u2019t lower the bar for rigor. It raised expectations around judgment, oversight, and data quality. We became more deliberate about responsibility and governance.<\/p>\n\n\n\n
Over time, these integrations created compound benefits.<\/p>\n\n\n\n
Faster design cycles reduced downstream rework. Better testing lowered operational noise. Improved operational insight shortened recovery times. AI stopped being something we used occasionally and became something the engineering system itself was built around.<\/p>\n\n\n\n
Phase 5: Eliminating toil and accelerating outcomes<\/h2>\n\n\n\n At some point, every AI story hits the same test. Does it actually make engineers\u2019 days better? For us, that proof showed up fastest in the elimination of toil.<\/p>\n\n\n\n
Across Microsoft Digital, engineers have always spent time on work that was necessary but draining: manual troubleshooting, repetitive diagnostics, log analysis, and routine operational tasks that kept systems running but didn\u2019t move the organization forward.<\/p>\n\n\n\n
AI gave us a chance to change that.<\/p>\n\n\n\nIn cloud network engineering, for example, troubleshooting used to require manually reconstructing what happened, such as logging into devices, chasing configurations, and piecing together context after the fact. As we began introducing agents and machine learning into these workflows, that work shifted. Instead of spending time assembling the picture, engineers could generate the views they needed faster and focus on resolving issues.<\/p>\n\n\n\n
The same shift showed up in how we used operational data.<\/p>\n\n\n\n
Rather than reacting to incidents after impact, we started using machine learning to analyze logs, identify patterns, and surface anomalies earlier. That moved teams from reactive response toward proactive monitoring and prevention.<\/p>\n\n\n\n
One thing became clear very quickly: Toil reduction wasn\u2019t just a benefit; it was the catalyst for adoption.<\/p>\n\n\n\n
\u201cToil reduction is the biggest thing. That\u2019s where engineers\u2019 eyes light up,\u201d says Beth Garrison, a principal cloud network engineer at Microsoft Digital. \u201cIf we can eliminate toil, engineers will flock to use AI. I really believe it.\u201d<\/p>\n\n\n\n
Service engineering followed a similar arc.<\/p>\n\n\n\n
Across governance, operations, productivity, and cost management, we began applying agents and automation to simplify complex work and reduce manual review cycles. Governance and compliance workflows became faster and more consistent. Operational processes benefited from guided remediation and earlier insight. Knowledge capture improved as documentation and remediation guidance could be generated and updated automatically.<\/p>\n\n\n\n
When we removed repetitive work such as manual triage, rote diagnostics, and endless documentation cleanup, we transformed how engineers spent their time. More focus on design. More proactive problem\u2011solving. More energy directed toward improving systems instead of just maintaining them.<\/p>\n\n\n\n
Toil reduction made the value of AI tangible. It was the moment AI stopped being interesting and became indispensable, and our engineering teams started asking where else we could apply it next.<\/p>\n\n\n\n
Measuring what matters<\/h2>\n\n\n\n By the time AI was embedded across our engineering lifecycle, a new question came into focus: \u201cHow do we know it\u2019s working?\u201d<\/p>\n\n\n\n
In the early days, we paid close attention to usage. Which tools engineers were trying, where adoption was growing, and where it stalled. Those signals mattered, and adoption was the leading indicator that people were getting comfortable and starting to integrate AI into real work.<\/p>\n\n\n\n
But using AI doesn\u2019t automatically mean better outcomes. So, we shifted the conversation and started asking, \u201cWhat\u2019s different now that our engineers are using AI?\u201d<\/p>\n\n\n\n
That change reframed how we thought about measurement. We began looking beyond tool activity to understand impact across the engineering system. Faster design cycles. Earlier defect detection. Reduced time spent on repetitive operational work. Shorter incident resolution. Clearer documentation. Fewer handoffs. Less rework.<\/p>\n\n\n\n
These weren\u2019t abstract metrics. They showed up in the flow of work.<\/p>\n\n\n\n
We were intentional about not forcing a single definition of value across every role. Software engineers, service engineers, and cloud network engineers experience impact differently. What mattered was that each team could point to tangible improvements in how work moved through the system.<\/p>\n\n\n\n
That perspective shaped how leadership talked about success.<\/p>\n\n\n\n
\u201cAdoption was always the starting point,\u201d says Ullas Kumble, a principal group software engineering manager at Microsoft Digital. \u201cBut we were clear from the beginning that usage isn\u2019t the destination. The real goal is impact: more time for engineers to focus on the work that truly matters.\u201d<\/p>\n\n\n\n
Over time, this approach changed the quality of our conversations. Instead of debating whether AI was worth the investment, teams talked about where it was removing friction and where it still wasn\u2019t delivering enough value. Measurement became a tool for learning and prioritization.<\/p>\n\n\n\n
Moving forward<\/h2>\n\n\n\n Looking ahead, one lesson stands out: This journey isn\u2019t complete.<\/p>\n\n\n\n
AI tools will continue to evolve. Agents will become more capable. Roles will keep shifting. What it means to be an engineer will continue to change. And that means our approach must stay grounded in the same principles that guided us from the start: invest in people, reinforce fundamentals, embed AI into real workflows, and stay honest about what\u2019s working and what isn\u2019t.<\/p>\n\n\n\n
We didn\u2019t set out to build an AI\u2011driven engineering organization overnight. We built it phase by phase.<\/p>\n\n\n\n
By meeting engineers where they were. By reshaping culture before redefining roles. By embedding AI across the lifecycle, not bolting it on. By reducing toil and measuring impact where it mattered most.<\/p>\n\n\n\n
The result is better engineering: powered by AI, guided by human judgment, and built to keep evolving.<\/p>\n\n\n\n