Monitoring Microsoft’s SAP Workload with Microsoft Azure
http://approjects.co.za/?big=insidetrack/blog/monitoring-microsofts-sap-workload-with-microsoft-azure/

At Microsoft, our Microsoft Digital Employee Experience (MDEE) team is using Microsoft Azure telemetry tools to get key insights on our business processes that flow through our SAP instance, one of the largest in the world. Our new platform provides our leadership with a comprehensive view of our business-process health and allows our engineering teams to create a more robust and efficient SAP environment.

Like many enterprises, we use SAP—the global enterprise resource planning (ERP) software solution—to run our various business operations. Our SAP environment is critical to our business performance, and we integrate it into most of our business processes. SAP offers functionality for enterprise services at Microsoft, such as human resources, finance, supply-chain management, and commerce. We use a wide variety of SAP applications, including:

  • SAP S/4HANA
  • ERP Central Component (ECC)
  • Global Trade Screening (GTS)
  • Business Integrity Screening (BIS) on S4
  • Master Data Governance (MDG) on S4
  • Governance, Risk, Compliance (GRC)
  • Revenue Management, Contract Accounting (RMCA)
  • OEM Services (OER)
  • SAP SaaS (Ariba, IBP, Concur, SuccessFactors)

Since 2018, Microsoft’s instance of SAP has been 100 percent migrated to Microsoft Azure. This project entailed moving all SAP assets to more than 800 Azure virtual machines and numerous cloud services.

We approached the migration by using both vertical and horizontal strategies.

From a horizontal standpoint, we migrated systems in our SAP environment that were low risk—training systems, sandbox environments, and other systems that weren’t critical to our business function. We also looked at vertical stacks, taking entire parts of our SAP landscape and migrating them as a unified solution.

We gained experience with both migration scenarios, and we learned valuable lessons in the early migration stages that helped us smoothly transition critical systems later in the migration process.

[Unpack how we’re optimizing SAP for Microsoft Azure. | Discover how we’re protecting Microsoft’s SAP workload with Microsoft Sentinel. | Explore how we’re unlocking Microsoft’s SAP telemetry with Microsoft Azure.]

Operating as Microsoft Azure-native

At Microsoft, we develop and host all new SAP infrastructure and systems on Microsoft Azure. We’re using Azure–based cloud infrastructure and SAP–native software as a service (SaaS) solutions to increase our architecture’s efficiency and to grow our environment with our business. The following graphic represents our SAP landscape on Azure.

Detailed illustration of SAP in Microsoft Azure listed by department: HR, Finance, SCM, Commerce, Enterprise services, SAP platform.
Microsoft’s SAP environment on Microsoft Azure.

The benefits of SAP on Microsoft Azure

SAP on Microsoft Azure provides several benefits to our business, many of which have resulted in significant transformation for our company. Some of the most important benefits include:

  • Business agility. With Microsoft Azure’s on-demand SAP–certified infrastructure, we’ve achieved faster development and test processes, shorter SAP release cycles, and the ability to scale instantaneously on demand to meet peak business usage.
  • Efficient insights. SAP on Microsoft Azure gives us deeper visibility across our SAP landscape. On Azure, our infrastructure is centralized and consolidated. We no longer have our SAP infrastructure spread across multiple on-premises datacenters.
  • Efficient real-time operations and integration. We can leverage integration with other Microsoft Azure technologies such as Internet of Things (IoT) and predictive analytics to enable real-time capture and analysis of our business environment, including areas such as inventory, transaction processing, sales trends, and manufacturing.
  • Mission-critical infrastructure. We run our entire SAP landscape—including our most critical infrastructure—on Microsoft Azure. SAP on Azure supports all aspects of our business environment.

Identifying potential for improved monitoring

As we examined our SAP environment on Microsoft Azure, we found several key areas where we could improve our monitoring and reporting experience:

  • Monitoring SAP from external business-process components. External business process components had no visibility into SAP. Our monitoring within individual SAP environments provided valuable insight into SAP processes, but we needed a more comprehensive view. SAP is just one component among many in our business processes, and the owners of those business processes didn’t have any way to track their processes after they entered SAP.
  • Managing and viewing end-to-end processes. It was difficult to manage and view end-to-end processes. We couldn’t capture the end-to-end process status to effectively monitor individual transactions and their progress within the end-to-end process chain. SAP was disconnected from end-to-end monitoring and created a gap in our knowledge of the entire process pipeline.
  • Assessing overall system health. We couldn’t easily assess overall system health. Our preexisting monitoring solution didn’t provide a holistic view of the SAP environment and the processes with which it interacted. The overall health of processes and systems was incomplete because of missing information for SAP, and issues that occurred within the end-to-end pipeline were difficult to identify and problematic to troubleshoot.

Our SAP on Microsoft Azure environment was like a black box to many of our business-process owners, and we knew that we could leverage Azure and SAP capabilities to improve the situation. We decided to create a more holistic monitoring solution for our SAP environment in Azure and the business processes that defined Microsoft operations.

Creating a telemetry solution for SAP on Microsoft Azure

The distributed nature of our business process environment led us to examine a broader solution—one that would provide comprehensive telemetry and monitoring for our SAP landscape and any other business processes that constituted the end-to-end business landscape at Microsoft. The following goals drove our implementation:

  • Integrate comprehensive telemetry into our monitoring.
  • Move toward holistic health monitoring of both applications and infrastructure.
  • Create a complete view of end-to-end business processes.
  • Create a modern, standards-based structure for our monitoring systems.

Guiding design with business-driven monitoring and personas

We adopted a business-driven approach to building our monitoring solution. This approach examines systems from the end-user perspective, and in this instance, the personas represented three primary business groups: business users, executives, and engineering teams. Using the synthetic method, we planned to build our monitoring results around what these personas wanted and needed to observe within SAP and the end-to-end business process, including:

  • Business users need visibility into the status of their business transactions as they flow through the Microsoft and SAP ecosystem.
  • Executives need to ensure that our business processes are flowing smoothly. If there are critical failures, they need to know before customers or partners discover them.
  • Engineers need to know about business-process issues before those issues affect business operations and lead to customer-satisfaction issues. They need end-to-end visibility of business transactions through SAP telemetry data in a common consumption format.

Creating end-to-end telemetry with our Unified Telemetry Platform

The MDEE team developed a telemetry platform in Microsoft Azure that we call the Unified Telemetry Platform (UTP). UTP is a modern, scalable, dependable, and cost-effective telemetry platform that’s used in several different business-process monitoring scenarios in Microsoft, including our SAP–related business processes.

UTP is built to enable service maturity and business-process monitoring across MDEE. It provides a common telemetry taxonomy and integration with core Microsoft data-monitoring services. UTP enables compliance with and maintenance of business standards for data integrity and privacy. While UTP is the implementation we chose, there are numerous ways to enable telemetry on Microsoft Azure. For additional considerations, see Best practices for monitoring cloud applications on the Azure documentation site.

Capturing telemetry with Microsoft Azure Monitor

To enable business-driven monitoring and a user-centric approach, UTP captures as many of the critical events within the end-to-end process landscape as possible. Embracing comprehensive telemetry in our systems meant capturing data from all available endpoints to build an understanding of how each process flowed and which SAP components were involved. Azure Monitor and its related Azure services serve as the core for our solution.

Microsoft Azure Application Insights

Application Insights provides a Microsoft Azure–based solution with which we can dig deep into our Azure–hosted SAP landscape and extract all necessary telemetry data. By using Application Insights, we can automatically generate alerts and support tickets when our telemetry indicates a potential error situation.
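To illustrate the pattern, here's a minimal sketch of emitting a custom business-process event to Application Insights from Python. It uses the OpenCensus Azure exporter; the connection string, event name, and properties are placeholders rather than the actual instrumentation our SAP landscape uses.

```python
import logging
from opencensus.ext.azure.log_exporter import AzureLogHandler

# Placeholder connection string; use your Application Insights resource's value.
logger = logging.getLogger("sap.businessprocess")
logger.setLevel(logging.INFO)
logger.addHandler(AzureLogHandler(connection_string="InstrumentationKey=<your-key>"))

# Emit a custom business-process event; custom_dimensions become queryable
# properties (customDimensions) in Application Insights.
logger.info(
    "InvoicePostingCompleted",
    extra={"custom_dimensions": {"processName": "InvoicePosting", "status": "Succeeded"}},
)
```

Alert rules and automated ticket creation can then be driven off queries over these events.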

Microsoft Azure Log Analytics

Infrastructure telemetry such as CPU usage, disk throughput, and other performance-related data is collected from Azure infrastructure components in the SAP environment by using Log Analytics.
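As a hedged illustration (not our production query), infrastructure counters collected into a Log Analytics workspace can be queried with KQL, for example through the azure-monitor-query library; the workspace ID is a placeholder.

```python
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Average CPU per computer over the last four hours, in 15-minute bins.
kql = """
Perf
| where ObjectName == 'Processor' and CounterName == '% Processor Time'
| summarize AvgCpu = avg(CounterValue) by Computer, bin(TimeGenerated, 15m)
"""

response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",  # placeholder
    query=kql,
    timespan=timedelta(hours=4),
)

for table in response.tables:
    for row in table.rows:
        print(row)
```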

Microsoft Azure Data Explorer

UTP uses Microsoft Azure Data Explorer as the central repository for all telemetry data sent through Application Insights and Microsoft Azure Monitor Logs from our application and infrastructure environment. Azure Data Explorer provides enterprise big-data interactive analytics; we use the Kusto query language to connect the end-to-end transaction flow for our business processes, for both SAP process and non–SAP processes.
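For illustration only, a Kusto query that stitches SAP and non-SAP events together on a shared correlation key might be run against Azure Data Explorer like this; the cluster, database, table, and column names are hypothetical.

```python
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Hypothetical cluster and database names.
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://<cluster>.kusto.windows.net"
)
client = KustoClient(kcsb)

# Join SAP process events to external (non-SAP) events on a shared correlation key.
query = """
SapProcessEvents
| join kind=inner (ExternalProcessEvents) on CorrelationId
| project CorrelationId, ProcessName, StepName, Status, Timestamp
| order by Timestamp asc
"""

result = client.execute("<telemetry-database>", query)
for row in result.primary_results[0]:
    print(row["CorrelationId"], row["Status"])
```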

Microsoft Azure Data Lake

UTP uses Microsoft Azure Data Lake for long-term cold-data storage. This data is taken out of the hot and warm streams and kept for reporting and archival purposes in Azure Data Lake to reduce the cost associated with storing large amounts of data in Microsoft Azure Monitor.
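As one possible sketch of the cold path (not our exact pipeline), aged telemetry could be written to Azure Data Lake Storage with the azure-storage-file-datalake library; the account URL, filesystem name, path, and payload are placeholders.

```python
import json
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholder storage account URL and filesystem (container) name.
service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
filesystem = service.get_file_system_client("telemetry-archive")

# Write a batch of aged telemetry events to a date-partitioned path.
aged_events = [{"processName": "InvoicePosting", "status": "Succeeded"}]
file_client = filesystem.get_file_client("sap/2024/08/events-0001.json")
file_client.upload_data(json.dumps(aged_events), overwrite=True)
```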

Diagram of UTP dataflow architecture for SAP on Microsoft Azure. Application and infrastructure telemetry are captured and evaluated.
A UTP data-flow architecture.

Constructing with definition using common keys and a unified platform

UTP uses Application Insights, Microsoft Azure Data Explorer, and Microsoft Azure Data Lake as the foundation for our telemetry data. This structure unifies our data by using a common schema and key structure that ties telemetry data from various sources together to create a complete view of business-process flow. This telemetry hub provides a central point where telemetry is collected from all points in the business-process flow—including SAP and external processes—and then ingested into UTP. The telemetry is then manipulated to create comprehensive business-process workflow views and reporting structures for our personas.

Common schema

UTP created a clearly defined common schema for business-process events and metrics based on a Microsoft-wide standard. That schema contains the metadata necessary for mapping telemetry to services and into processes, and it allows for joins and correlation across all telemetry.

Common key

As part of the common schema for business process events, the design includes a cross-correlation vector (XCV) value, common to all stored telemetry and transactions. By persisting a single value for the XCV and populating this attribute for all transactions and telemetry events related to a business process, we can connect the entire process chain related to an individual business transaction as it flows through our extended ecosystem.
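A simplified sketch of how such a schema and key might look in code follows; the field names here are illustrative, not the actual UTP schema.

```python
import uuid
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class BusinessProcessEvent:
    """Illustrative common-schema event; field names are hypothetical."""
    xcv: str            # cross-correlation vector shared by every hop in the process
    service_name: str   # emitting service (SAP or non-SAP)
    process_name: str   # end-to-end business process
    step_name: str      # step within the process
    status: str         # Succeeded, Failed, InProgress, ...
    timestamp: str      # ISO 8601 UTC timestamp

def start_process() -> str:
    """Mint one XCV at the start of a business transaction and reuse it everywhere."""
    return str(uuid.uuid4())

xcv = start_process()

# Every emitter along the chain stamps the same XCV onto its telemetry,
# so the events can later be joined into one end-to-end view.
event = BusinessProcessEvent(
    xcv=xcv,
    service_name="SAP-MDG",
    process_name="CustomerCreate",
    step_name="Validation",
    status="Succeeded",
    timestamp=datetime.now(timezone.utc).isoformat(),
)
print(asdict(event))
```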

Multilayer telemetry concept for SAP

For SAP on Microsoft Azure, our MDEE team focused on four specific areas for telemetry and monitoring:

  1. SAP Business Process layer
  2. SAP Application Foundation layer
  3. Infrastructure layer
  4. Surrounding API layer

The multilayered approach to SAP on Microsoft Azure.
Microsoft’s multilayer approach for its SAP instance.

The result was holistic telemetry and monitoring across these layers, a structure that leverages Microsoft Power BI as the engine behind our reporting and dashboarding functionality.

Our MDEE team created reporting around business-driven monitoring and constructed standard views and dashboards that offer visibility into important areas for each of the key business personas. Dashboards are constructed from Kusto queries, which are automatically translated into the Microsoft Power BI M formula language. For each persona, we’ve enabled a different viewpoint and altitude of our business process that allows the persona to view the SAP monitoring information that’s most critical to them.

Dashboard reporting views from the four SAP on Microsoft Azure layers.
Sample dashboard views for each layer.

Microsoft Azure Monitor for SAP Solutions

Microsoft previously announced the launch of Microsoft Azure Monitor for SAP Solutions (AMS) in public preview—an Azure-native monitoring solution for customers who run SAP workloads on Azure. With AMS, customers can view telemetry of their SAP landscapes within the Azure portal and efficiently correlate telemetry between various layers of SAP. AMS is available through Microsoft Azure Marketplace in the following regions: East US, East US 2, West US 2, West Europe, and North Europe. AMS doesn’t require a license fee.

Our MDEE team worked in close collaboration with Microsoft Azure product teams to build and release the SAP NetWeaver provider in Microsoft Azure Monitor for SAP Solutions.

  • The SAP NetWeaver provider in Microsoft Azure Monitor for SAP Solutions enables SAP on Microsoft Azure customers to monitor SAP NetWeaver components and processes on Azure in the Azure portal. The SAP NetWeaver provider includes default visualizations and alerts that can be used out of the box or customized to meet customer requirements.
  • SAP NetWeaver telemetry is collected by configuring the SAP NetWeaver provider within AMS. As part of configuring the provider, customers are required to provide the host name (Central, Primary, and/or Secondary Application server) of the SAP system and its corresponding Instance number, Subdomain, and System ID (SID).

For more information, go to AMS quick start video and SAP NetWeaver monitoring-Azure Monitoring for SAP Solutions.

AMS architecture diagram.
Microsoft’s AMS architecture.

Our telemetry platform provides benefits across our SAP and business-process landscape. We have created a solution that facilitates end-to-end SAP business-process monitoring, which in turn enables our key personas to do their jobs better.

Persona benefits

Benefits for each persona include the following:

  • Business users no longer need to create service tickets to get the status of SAP transaction flows. They can examine our business processes from end to end, including SAP transactions and external processes.
  • Executives can trust that their business processes execute seamlessly and that any errors are proactively addressed with no impact to customers or partners.
  • Engineers no longer need to check multiple SAP transactions to investigate business-process issues and identify in which step the business process failed. They can improve their time-to-detect and time-to-resolve numbers with the correct telemetry data and avoid business disruption for our customers.

Organization-wide benefits

The benefits of our platform extend across Microsoft by providing:

  • End-to-end visibility into business processes. Our Unified Telemetry Platform (UTP) provides visibility into business processes across the organization, which then facilitates better communication and a clearer understanding of all parts of our business. We have a more holistic view of how we’re operating, which helps us work together to achieve our business goals.
  • Decreased time to resolve issues. Our visibility into business processes informs users at all levels when an issue occurs. Business users can examine the interruption in their workflow, executives are notified of business-process delays, and engineers can identify and resolve issues. This activity all occurs before our customers are affected.
  • More efficient business processes. Greater visibility leads to greater efficiency. We can demonstrate issues to stakeholders quickly, everyone involved can recognize areas for potential improvement, and we can monitor modified processes to ensure that improvement is happening.

Key Takeaways

We learned several important lessons with our UTP implementation for SAP on Microsoft Azure. These lessons helped inform our progress of UTP development, and they’ve given us best practices to leverage in future projects, including:

  • Perform a proper inventory of internal processes. You must be aware of events within a process before you can capture them. Performing a complete and informed inventory of your business processes is critical to capturing the data required for end-to-end business-process monitoring.
  • Build for true end-to-end telemetry. Capture all events from all processes and gather telemetry appropriately. Data points from all parts of the business process—including external components—are critical to achieving true end-to-end telemetry.
  • Build for Microsoft Azure-native SAP. SAP is simpler to manage on Azure, and instrumenting SAP processes becomes more efficient and effective when SAP components are built for Azure.
  • Encourage data-usage models and standards across the organization. Data standards are critical for an accurate end-to-end view. If data is stored in different formats or instrumented differently in various parts of the business process, the end reporting results won’t accurately represent the state of the business process.

We’re continuing to evaluate and improve as we discover new and more efficient ways to track our business processes in SAP. Some of our current focus areas include:

  • Machine learning for predictive analytics. We’re using machine learning and predictive analytics to create deeper insights and more completely understand our current SAP environment. Machine learning also helps us anticipate growth and change in the future. We’re leveraging anomaly detection in Microsoft Azure Cognitive Services to track SAP business service-health outliers.
  • Actionable alerting. We’re using Microsoft Azure Monitor alerts to create service tickets, generate service-level agreement (SLA) alerts, and provide a robust notification and alerts system. We’re working toward linking detailed telemetry context into our alerting system to create intelligent alerting that enables us to more accurately and quickly identify potential issues within the SAP environment.
  • Telemetry-based automation. We’re using telemetry to enable automation and remediation within our environment. We’re creating self-healing scenarios to automatically correct common or easy-to-correct issues to create a more intelligent and efficient platform.

We’re continually refining and improving business-process monitoring of SAP on Microsoft Azure. This initiative has enabled us to keep key business users informed of business-process flow, provided a complete view of business-process health to our leadership, and helped our engineering teams create a more robust and efficient SAP environment. Telemetry and business-driven monitoring have transformed the visibility that we have into our SAP on Azure environment, and our continuing journey toward deeper business insight and intelligence is making our entire business better.

Related links

Upgrading Microsoft’s core Human Resources system with SAP SuccessFactors
http://approjects.co.za/?big=insidetrack/blog/upgrading-microsofts-core-human-resources-system-with-sap-successfactors/

Microsoft’s core Human Resources system was aging and needed to be replaced.

Thanks to its limitations, when a company employee transferred to a new job in a different country or region, it could seem like they were starting over.

“When employees moved country to country within the company, behind the scenes our HR operations teams had to manually move them to a different country code,” says Sruthi Annamaneni, a partner director of software engineering on the Microsoft Digital team deploying the company’s new HR system.

It was a matter of bringing the company’s HR data together in one place.

One of the reasons we’re rebuilding Microsoft’s Human Resources core is so we can unify the experience our employees have with us. We want it to feel like it’s the same Microsoft no matter which country or region someone works in.

—Sruthi Annamaneni, partner director of software engineering, Microsoft Digital

“It was about aggregating and having a single place to master all employee data across 109 countries,” Annamaneni says. “This would enable us to have a single global policy for all of Microsoft and to have all of our country-specific local policies implemented in one place. Reducing and improving error-prone, high-touch manual processes would help us keep our core HR systems running smoothly while improving our ability to support employees across the globe.”

For those reasons, Microsoft Digital—the organization that powers, protects, and transforms the company—has been upgrading Microsoft’s core Human Resources systems.

“One of the reasons we’re rebuilding Microsoft’s Human Resources core is so we can unify the experience our employees have with us,” Annamaneni says. “We want it to feel like it’s the same Microsoft no matter which country or region someone works in.”

Microsoft is wrapping up a multiyear effort to move its core Human Resources systems to SAP SuccessFactors. The makeover of Microsoft’s Human Resources core is largely complete with a last handful of external staff and newly acquired employees being upgraded this winter.

When we stood up our instance on Azure, that was a big, big milestone for us and for them. We’re a frontline user of their product in our cloud.

—Kerry Olin, Microsoft corporate vice president of Human Resources Services

“Our legacy system was not scaling to our global requirements and aspirations for a consistent employee experience,” says Kerry Olin, Microsoft’s corporate vice president of Human Resources Services. “We needed a more modern, flexible, and capable core HR system.”

Olin says the company reviewed many HR systems—it even considered working with Microsoft Digital to build an in-house system. In the end, the team decided to go with SAP SuccessFactors because it would play a foundational role in Microsoft’s bid to transform its vast array of secondary HR systems, such as improving mobility, supporting new acquisitions, or transforming payroll. It also helped that the cloud-based SAP SuccessFactors Human Experience Management (HXM) Suite runs on Microsoft Azure. SAP has a longstanding partnership with Microsoft as a preferred cloud provider.

Microsoft is one of the first large on-premises enterprises to move its HR systems to Microsoft Azure, a migration that is paving the way for other SAP SuccessFactors customers (and SAP SuccessFactors itself) to transition to the same cloud platform.

“When we stood up our instance on Azure, that was a big, big milestone for us and for them,” Olin says. “We’re a frontline user of their product in our cloud.”

Getting Microsoft’s Human Resources core to nearly-finished status hasn’t been easy—with HR systems in 109 countries and regions, the company’s core system is massive and complex. Until this overhaul, the HR data management approach varied considerably around the world, making it feel like the company had a separate HR system in every country and location. “This project was a great example of how people, process, and technology have to transform together to successfully land a big transformation in any enterprise,” Annamaneni says.

While complexity challenges like these force most companies of Microsoft’s size to start over when they upgrade their HR systems, Microsoft rejected that tradition in favor of keeping the lights on as it went about the four-year upgrade.

We didn’t want to disrupt anyone. We didn’t want to have our team or our employees have to learn an entirely new system.

—Rajamma Krishnamurthy, principal program manager, Human Resources Foundational Services team, Microsoft Digital

“This is like completely rebuilding a train while the train is running,” Annamaneni says. “First you change the wheels, then you swap out the engine, and you keep going until everything is new and updated—it’s not easy. There are a zillion things that can go wrong.”

Why not start fresh like everyone else?

Because the team wanted to minimize the impact on Microsoft’s 225,000 employees and external partners, and, importantly, on the hundreds of HR professionals who work in the system to sustain business continuity on a daily basis.

“We didn’t want to disrupt anyone,” says Rajamma Krishnamurthy, a principal program manager for the Human Resources Foundational Services team in Microsoft Digital. “We didn’t want to have our team or our employees have to learn an entirely new system.”

First, Microsoft flipped the switch on the new system in Canada, Norway, and Sweden.

“Many things went well, and a lot of things didn’t go well,” Krishnamurthy says. “We learned a lot, and we took what we learned to build a template that we used for the rest of the roll out.”

Next came India, which was the most complex country besides the US.

“The idea was the path to the United States was through India,” she says. “We had our governance ready—we knew where things could go wrong, we knew which stakeholders we would have to help get through it.”

India went well, which opened the door to tackle the US, which began in February 2020.

Krishnamurthy and Kalimuthu pose for photos joined together in a photo collage.
Rajamma Krishnamurthy and Suresh Kalimuthu are part of the team that transformed Microsoft’s core Human Resources system. (Photos by Rajamma Krishnamurthy and Suresh Kalimuthu)

“The United States was the biggest, scariest for us,” Krishnamurthy says. “We had a lot of people using the system who were not managers—we had a large admin population who used it every day. They had strong needs and desires on how the system should work.”

They used it heavily from Day One.

“We made sure their voices were heard,” she says. “Their work was not disrupted.”

The team commissioned a vendor to build a solution to bulk load employee data changes. It is also using a Microsoft Power Apps solution for access management while it works with a third party to build its own solution.

As for the HR specialists and admins using the new system? When issues flared up, Krishnamurthy and team funneled them into rapid response channels in Microsoft Teams, which allowed them to help each other work through it and gave them a place to share best practices.

“We were able to manage a lot of upheaval through these channels,” Krishnamurthy says. “They were a great change management channel.”

And there was a lot of volume. “We had close to 1,000 people using them to get answers on a daily basis,” she says. “It was awesome to see our community help each other like that.”

One of our biggest goals was to provide business agility. We’ve been able to do that in a way that sets us up well for the future.

—Suresh Kalimuthu, principal software engineering manager, Human Resources Foundational Services team, Microsoft Digital

Interestingly, the channels have turned into such a helpful resource that HR teams demanded that they live on past the system upgrade.

“With COVID, we need to continue that longer than we thought,” Krishnamurthy says. “Our exec admin professional community came together in the Teams support channel for the SAP SuccessFactors launch and continues to depend on and leverage each other for support on a wide range of issues.”

[Learn how Microsoft Dynamics 365 and AI automate complex business processes and transactions. Read about migrating critical financial systems to Microsoft Azure. Discover examining SAP transactions with Azure Anomaly Detector.]

Deploying Microsoft’s Human Resources core

The daunting challenge of deploying Microsoft’s new core HR system fell to Suresh Kalimuthu, a principal software engineering manager on Microsoft Digital’s HR Foundational Services team.

“One of our biggest goals was to provide business agility,” Kalimuthu says. “We’ve been able to do that in a way that sets us up well for the future.”

The technical challenge was tremendous—not only did the company move its HR system to a new platform while also moving to the cloud, it also adopted a new agile engineering method to do all the deployment work.

“We were taking on a lot all at once,” Kalimuthu says. “It has been an interesting, rewarding journey.”

And Krishnamurthy, Kalimuthu’s colleague, says the pressure was on to get it right.

“We have hundreds of stakeholders, finance, benefits, local HR,” Krishnamurthy says. “This is not glamorous. This is the basic running of our company. It takes a lot of effort to bring people along. People needed to understand the value of it.”

One of the big challenges the team had to account for when building the new system was how laws, rules, and systems in each country or region varied.

“We built a system that allows us to make local adjustments that don’t affect the larger system,” Kalimuthu says. “For example, if I want to create a new hire system for Canada, we can make those changes without disrupting anything that we deploy globally.”

Another challenge was the need to build custom solutions where SAP SuccessFactors’ out-of-the-box product stopped short.

“Our HR systems are very complex and matrixed, much more so than most enterprises,” Kalimuthu says. “In several cases, we needed to fill in gaps with our own solutions.”

In those cases, doing so was straightforward. “SAP SuccessFactors has an ability to allow custom integrations and extend their capability,” he says. “We saved big time by leveraging our own technology when we needed it—this gave us a lot of flexibility.”

We solved some of these challenges ourselves, but we did so in partnership with SAP SuccessFactors. They are addressing our concerns—there has been a good give and take, and SAP SuccessFactors and their other customers have benefitted.

—Suresh Kalimuthu, principal software engineering manager, Human Resources Foundational Services team, Microsoft Digital

For example, Microsoft HR wanted to be able to deliver data on hires, promotions, and so on in near real time. “We wanted to make sure the data was readily available within 30 minutes, but it was only available 24 hours later out of the box,” Kalimuthu says. “We built that capability ourselves.”

The Microsoft products the team used include Azure Functions, Azure Key Vault, Azure APIs, Azure Storage, Azure Service Bus, Azure Event Hubs, Azure Active Directory, and Azure Encryption.
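As a rough sketch of how an event-driven integration like this can be wired up (not the team's actual implementation), an Azure Function can react to HR change messages arriving on an Azure Service Bus queue; the queue name, connection setting, and message shape below are assumptions.

```python
import json
import logging
import azure.functions as func

app = func.FunctionApp()

# Hypothetical queue and connection-setting names.
@app.service_bus_queue_trigger(
    arg_name="msg", queue_name="hr-change-events", connection="ServiceBusConnection"
)
def propagate_hr_change(msg: func.ServiceBusMessage) -> None:
    """Fan an HR change event out to downstream consumers shortly after it occurs."""
    event = json.loads(msg.get_body().decode("utf-8"))
    logging.info("HR change received for worker %s", event.get("workerId"))
    # Downstream delivery (APIs, storage, notifications) would go here.
```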

A big shift was moving all HR data onto one connected platform. “We realized the value of having one data platform that stretches across all of Microsoft,” Kalimuthu says. “It takes our data from one end point to another.”

The team beefed up other areas as well, including SOX compliance, privacy, and security.

“We solved some of these challenges ourselves, but we did so in partnership with SAP SuccessFactors,” Kalimuthu says. “They are addressing our concerns—there has been a good give and take, and SAP SuccessFactors and their other customers have benefitted.”

Now that the team is winding down its upgrade of the core HR system, it is turning to a future where updates and changes become much easier.

“While we won’t light up all of its new capabilities today or tomorrow, there is functionality in the system which significantly expands and enhances what we can do next,” Olin says. “We have some terrific examples of business value realization from the new core system already—there’s more opportunity ahead.”

Key Takeaways

Here are some principles you can use to guide you as you consider upgrading your HR core systems:

  • Standardize how you will approach the upgrade before you start working through your various HR processes.
  • Focus on completeness and data quality from the start.
  • Think globally but act locally when it comes to data, privacy, and other requirements.
  • Recognize that it takes a village when it comes to a project as large as upgrading your core systems—do everything you can to get the village ready and to keep them informed along the way.
  • There will be multiple moving parts—focus on the critical ones and ensure they do not break when you release the product (for example, make triple sure payroll will work the day after launch).
  • You will need able and willing partners who both know the product you’re deploying and how to deploy it.
  • Do not try to boil the ocean—to be successful you will need to break the project into a series of well thought out steps.

Related links

Learn how Microsoft Dynamics 365 and AI automate complex business processes and transactions.

Read about migrating critical financial systems to Microsoft Azure.

Discover examining SAP transactions with Azure Anomaly Detector.

We'd like to hear from you!

Want more information? Email us and include a link to this story and we’ll get back to you.

Please share your feedback with us—take our survey and let us know what kind of content is most useful to you.

Simplifying external staff management by extending SAP SuccessFactors on SAP Business Technology Platform
http://approjects.co.za/?big=insidetrack/blog/simplifying-external-staff-management-by-extending-sap-successfactors-on-sap-business-technology-platform/

Our core system for managing our external workforce was showing its age in an increasingly demanding modern workplace environment—we needed to simplify our external staff management.

Here at Microsoft, our large and diverse external, non-FTE workforce plays a vital role in our success. For us, external staff includes contractors, consultants, freelancers, and business guests who provide services, expertise, and support for various projects and products. As a company, we partner with more than 110,000 external staff in 119 countries and regions, representing almost 40% of our workforce.

Too many of our onboarding processes involved manual tasks that created inefficiency and introduced the potential for error.

— Himanshu Wadhwa, senior product manager, HR Foundational Services team, Microsoft Digital

Managing such a large and dynamic population of external staff is a big challenge for our Human Resources Operations team, which handles onboarding, data management, compliance, and reporting tasks for external staff.

Until recently, we in Microsoft Digital (MSD), the company’s IT organization, used our own internally built solution to help HR handle these tasks. However, our application was showing the symptoms of a legacy application in a modern workplace.

“Too many of our onboarding processes involved manual tasks that created inefficiency and introduced the potential for error,” says Himanshu Wadhwa, a senior product manager on the HR Foundational Services team in MSD. His team is responsible for investigating and building a new solution for managing external staff at Microsoft.

Wadhwa lists several ways the previous solution was struggling to meet the dynamic and changing needs of the business and the external staff, including:

  • Real-time tracking and status visibility for external staff requests weren’t possible, leading to thousands of extra queries and delays.
  • It didn’t offer self-service functionality for external staff to update their personal information, resulting in heavy reliance on the operations team.
  • It didn’t integrate well with other systems, such as procurement, finance, and third-party vendors, causing data inconsistency and duplication.
  • It continually required development changes to support compliance with global and local regulations, exposing Microsoft to potential risks and penalties.

To address these challenges and achieve its vision of having a single HR source system for our entire workforce, the team decided to migrate external staff management to SAP SuccessFactors Employee Central, a cloud-based human experience management (HXM) solution that offers a comprehensive and flexible platform for employee and external staff lifecycle management.

[Discover upgrading Microsoft’s core Human Resources system with SAP SuccessFactors. Explore boosting employee engagement at Microsoft with Dynamics 365 and Power Platform. Unpack how we’re helping Microsoft employees understand their value with the Total Rewards Portal.]

Building on a solid HR operations core

We had already migrated the company’s core HR system to SuccessFactors, and for many of the same reasons, we were examining the external staff management solution. By rebuilding our HR core system at the company, our team provided a unified experience for all our employees.

In short, we wanted to extend the unified experience developed for our internal employees and managers to include the people and processes involved in managing our external staff.

Moving our contingent staff into the same solution as our full-time staff was a huge win for our HR team and it transformed how they can manage the employee experience for every person contributing to Microsoft’s success—internal or external.

— Kishore Rajaraman, principal product manager, HR Foundational Services team, Microsoft Digital

SAP SuccessFactors made perfect sense to our project teams. It’s already played a foundational role in our bid to transform our vast array of secondary HR systems, such as improving mobility, supporting new acquisitions, or transforming payroll. It also helped that the cloud-based SAP SuccessFactors HXM Suite runs on Microsoft Azure. SAP has a longstanding partnership with Microsoft as a preferred cloud provider.

“Moving our external staff into the same solution as our full-time staff was a huge win for our HR team, and it transformed how they can manage the employee experience for every person contributing to Microsoft’s success—internal or external,” says Kishore Rajaraman, a principal product manager for the HR Foundational Services team in the MSD team that implemented the SuccessFactors-based solution for external staff.

Implementation involved massive amounts of coordination from Rajaraman and his team. The core program team conducted a comprehensive study and analysis of user behavior, feedback, and pain points involving over 25 stakeholder groups. The team also performed multiple mock runs and data quality checks to ensure a successful data conversion from the previous solution to SAP SuccessFactors Employee Central, involving over 5 million rows of data for external staff.

We had many complex, unique requirements for our contingent staff. We wanted to create a solution that met those complexities with a consistent user experience and integrated with our other dependent systems.

— Kishore Rajaraman, principal product manager, HR Foundational Services team, Microsoft Digital

Creating integration with SAP BTP

The migration to SuccessFactors Employee Central was by no means simple.

“We had many complex, unique requirements for our external staff,” Rajaraman says. “We wanted to create a solution that met those complexities with a consistent user experience and integrated with our other dependent systems.”

The MSD Engineering Team used the SAP Business Technology Platform (BTP) and SAP Fiori to extend and customize SAP SuccessFactors Employee Central and to create a consistent and intuitive user experience.

SAP BTP is an open and integrated platform that enables customers to build, extend, and integrate applications in the cloud. SAP Fiori is a design system that provides a user-friendly and responsive interface for SAP applications. Using BTP and Fiori enabled us to take the broad functionality of SAP SuccessFactors Employee Central and build in the customization and specialized functionality that a business of our complexity required. By using SAP BTP and SAP Fiori, our team was able to:

  • Provide real-time integration with third-party systems, such as Adobe Sign, e-learning, and procurement, to ensure data accuracy and completeness and enable end-to-end automation.
  • Automate process checks that were previously done manually. This eliminated the need for human follow-up and reduced the number of requests managed by the External Operations team by approximately 65%.
  • Enable self-service functionality for external staff to access and update their information, such as name, personal email, and evacuation assistance, improving their satisfaction and engagement.
  • Provide dashboards and reports for managers, business admins, and operations teams to monitor and manage external staff requests, data changes, and compliance status, enhancing visibility and transparency.
  • Support global workforce enablement and compliance with local labor laws and regulations in more than 100 countries and regions, ensuring alignment with our company policies and best practices.

We had capability requirements specific to Microsoft that no out-of-the-box solution could account for. SAP BTP allowed us to extend the platform to support those requirements while still staying within SAP’s development stack. It was a huge part of why we chose SAP SuccessFactors Employee Central for this solution.

— Manoj Lakshmanan, principal software engineer, Microsoft Digital

“Approximately 40% of our final solution was built by extending SuccessFactors using BTP,” says Manoj Lakshmanan, a principal software engineer on the MSD engineering team. His team helped build extensibility solutions that fulfilled specific use-cases for Microsoft.

“We had capability requirements specific to Microsoft that no out-of-the-box solution could account for,” Lakshmanan says. “SAP BTP allowed us to extend the platform to support those requirements while still staying within SAP’s development stack. It was a huge part of why we chose SAP SuccessFactors Employee Central for this solution.”

The team also used the SAP Analytics Cloud solution to enable data-driven decision-making and taxonomy management for our external workforce. Analytics Cloud includes BTP integration components in its reporting data, so the custom capabilities and integration built with BTP are monitored by Analytics Cloud right alongside the built-in capabilities. With SAP BTP and Analytics Cloud, our team was able to create an end-to-end solution that monitors the user experience, and increases the efficiency of our HR operations, managers, and external staff.

Supporting SAP BTP integration with Microsoft Azure

Microsoft’s partnership with SAP makes Microsoft Azure a perfect fit for SAP BTP integration processes. Critical components hosted in Microsoft Azure support custom integration and functionality in BTP. The scalability and reliability of Azure make it simple and efficient for our engineering teams to create and maintain these integration points, regardless of workload scale and demand.

In the Microsoft process for external staff onboarding, we have rules we must validate to confirm information about external staff that comes in from partner systems and data sources. This information could be anything from an individual’s background check to purchase order and supplier information. Azure-based APIs enable us to connect to those partner systems, get the incoming data validated, and then respond with the details.

— Ramkumar Perumal, principal software engineer, Microsoft Digital

Microsoft Azure Functions hosts a large number of API-to-API integration points in the solution architecture that allow us to retrieve and store data from some of our internal business systems that are external to Employee Central, including CRM, e-signing, identity management, employee training, finance, purchasing, and many more. We can also perform real-time validation and—if necessary—data transformation from these systems as it passes in and out of Employee Central.

“In the Microsoft process for external staff onboarding, we have rules we must validate to confirm information about external staff that comes in from partner systems and data sources,” says Ramkumar Perumal, a principal software engineer in MSD. “This information could be anything from an individual’s background check to purchase order and supplier information. Azure-based APIs enable us to connect to those partner systems, get the incoming data validated, and then respond with the details.”
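A hedged sketch of what such a validation API might look like as an HTTP-triggered Azure Function follows; the route, field names, and rules are illustrative, not Microsoft's actual checks.

```python
import json
import azure.functions as func

app = func.FunctionApp()

@app.route(route="validate-onboarding", methods=["POST"])
def validate_onboarding(req: func.HttpRequest) -> func.HttpResponse:
    """Validate incoming external-staff onboarding data from a partner system."""
    record = req.get_json()
    errors = []

    # Illustrative rules only.
    if not record.get("purchaseOrderId"):
        errors.append("Missing purchase order")
    if record.get("backgroundCheckStatus") != "Cleared":
        errors.append("Background check not cleared")

    body = json.dumps({"valid": not errors, "errors": errors})
    return func.HttpResponse(
        body, status_code=200 if not errors else 422, mimetype="application/json"
    )
```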

The time saved by automating partner system integrations is a huge part of the cost savings realized from the new solution. Each integration point removes the need to manually track the progression of transactions, allowing the operations team to focus on the areas of the onboarding process where its attention is most required. API-based integration with partner systems has automated more than 65% of the 70,000 onboarding transactions that go through External Operations each year.

It’s a significant win for the External Operations team, MSD Engineering, Microsoft HR, and the entire organization.

Realizing the benefits of Employee Central across the enterprise

Successful migration has provided immediate benefits for our external staff, managers, business admins, HR operations team, and other dedicated teams involved in the external staff management process. Employee Central provides extensive automation for external staff policies.

External staff require access to buildings and technology that can change often, depending on the contract or project they’re involved with. Employee Central provides oversight of external staff from beginning to end. Managers and stakeholders in the onboarding process can use Employee Central to determine the status of any external staff member, and external staff members have that same visibility into their assignment with Microsoft.

Integrating our internal and external staff management solutions is a critical step for Microsoft HR. Centralizing HR data administration opens opportunities to manage Microsoft’s workforce more wholistically and to leverage shared support models.

— Cari Paddock, architect, HR Service Data Next Administration

“Integrating our internal and external staff management solutions is a critical step for Microsoft HR,” says Cari Paddock, an architect in HR Service Data Next Administration. Her team ensured that the Employee Central solution met the diverse requirements for managing external staff. “Centralizing HR data administration opens opportunities to manage Microsoft’s workforce more wholistically and to leverage shared support models.”

Lakshmanan, Paddock, Wadhwa, and Rajaraman appear in a composite image.
Manoj Lakshmanan, Cari Paddock, Himanshu Wadhwa, and Kishore Rajaraman are part of the team that’s simplifying and streamlining external staff management by extending SAP SuccessFactors on SAP Business Technology Platform.

“Improved efficiency and productivity are definitely at the top of the list,” Wadhwa says. “Reduced manual tasks, errors, and queries enable faster and more accurate responses for everyone involved in managing external staff.”

This integration enables External Operations support to enter the process at exactly the right time and to focus on the relevant transactions. Previously, Operations had to act on every onboarding transaction.

Some of the other notable benefits include:

  • Enhanced user experience and satisfaction by providing a modern, intuitive, consistent interface for accessing and updating external staff information and performing other HR-related tasks.
  • Increased data quality and consistency by eliminating data duplication and inconsistency and ensuring data integrity and completeness across different systems.
  • Strengthened compliance and risk management by adhering to global and local regulations and addressing audit and reporting requirements.
  • Enabled data-driven decision-making by providing a single source of truth for all workforce-related information and enabling advanced analytics and planning capabilities.

Our success is a testament to the power of collaboration and co-innovation between Microsoft and SAP, two industry leaders who share a vision of using technology to empower organizations and their employees. By working together, Microsoft and SAP have delivered a solution that highlights the best of both worlds: Microsoft’s engineering innovation and SAP’s HXM expertise.

— Himanshu Wadhwa, senior product manager, HR Foundational Services team, Microsoft Digital

By simplifying and streamlining the external staff management process with SAP SuccessFactors Employee Central and SAP BTP, we now have a single HR source system for both internal and external workforce while providing an excellent HR experience for our external staff and internal stakeholders.

Our solution also lays the foundation for future innovation and transformation, such as using artificial intelligence, machine learning, and predictive analytics to optimize and enhance the external staff management process.

“Our success is a testament to the power of collaboration and co-innovation between Microsoft and SAP, two industry leaders who share a vision of using technology to empower organizations and their employees,” Wadhwa says. “By working together, Microsoft and SAP have delivered a solution that highlights the best of both worlds: Microsoft’s engineering innovation and SAP’s HXM expertise.”

Key Takeaways

If you want to learn from our experience and improve your external staff management process with SAP SuccessFactors Employee Central and SAP BTP, consider the following:

  • Evaluate your current external staff management process and identify the pain points and opportunities for improvement. Consider aspects such as data quality, user experience, compliance, scalability, and integration.
  • Explore how SAP SuccessFactors Employee Central can help you streamline and standardize your external staff management process by providing a single source of truth, flexible workflows, and comprehensive functionality.
  • Investigate how SAP BTP and SAP Fiori can help you extend and customize SAP SuccessFactors Employee Central to meet your specific needs and preferences by enabling rapid application development, seamless integration, and user-centric design.
  • Examine how Microsoft Azure can support SAP BTP integration with data from external systems and data stores.

Try it out

SAP SuccessFactors Integration with Microsoft 365

As an admin of your organization’s SAP SuccessFactors system, you can configure it to collaborate with Microsoft 365 applications.

Related links

We'd like to hear from you!

Want more information? Email us and include a link to this story and we’ll get back to you.

Please share your feedback with us—take our survey and let us know what kind of content is most useful to you.

Examining Microsoft’s SAP transactions with Microsoft Azure Anomaly Detector
http://approjects.co.za/?big=insidetrack/blog/examining-microsofts-sap-transactions-with-microsoft-azure-anomaly-detector/

As part of our continuing digital transformation journey, our Microsoft Digital Employee Experience (MDEE) team is constantly looking for ways to improve our business processes and detect issues and anomalies before they become serious problems.

Sometimes failures happen sporadically within the application framework and often go undetected. Even if a user detects an anomaly, they need to decide how to react, which is time consuming.

To do this, we’re using Microsoft Azure Anomaly Detector to examine transactions across our SAP environment, which helps us identify issues before they become problems. In turn this enables us to proactively improve the performance, consistency, and reliability of our entire SAP landscape.

[Unpack how we’re optimizing SAP for Microsoft Azure. | Discover how we’re protecting Microsoft’s SAP workload with Microsoft Sentinel. | Explore how we’re upgrading Microsoft’s core Human Resources system with SAP SuccessFactors.]

Understanding the need for anomaly detection in SAP

At Microsoft, our SAP environment comprises many complex processes across multiple lines of business. To avoid having disparate environments and isolated monitoring and reporting data, we wanted to build a single codebase solution for monitoring and anomaly detection that each line-of-business can use with minimal code implementation.

We wanted to build intelligence to detect anomalies and inconsistencies in business process flow to improve platform health. Improved platform health improves engineering service-level agreements (SLAs) and reduces revenue loss by being proactive rather than reactive.

There were hundreds of areas that could benefit from anomaly detection in our SAP portfolio, but we wanted to identify a single area for our pilot project. In the Master Data Management (MDM) space, we create thousands of objects representing business entities such as customers and business partners.

Most of these objects are created by using an application programming interface (API), and no human interaction is needed. However, it’s extremely difficult to identify if issues related to MDM are occurring in upstream systems, so we needed a way to capture issues in advance, proactively and quickly.

In the MDM space, we have SAP Master Data Governance (MDG) background processes, such as Customer Master data creation, which run without any user interaction. Across various batch and scheduled jobs, process runtime varies based on data volume, time of day, time of year, and resource availability.

Understanding the potential for issues in each process and the larger process environment involves several challenging questions, including:

  • Is the transaction supposed to run that long?
  • Is there a problem in an upstream system?
  • Are there resources that reached their maximum capacity or that are creating a performance bottleneck?

Assessing Microsoft Azure Cognitive Services and Microsoft Azure Anomaly Detector

Detecting these issues by using human triage was difficult and time and resource intensive. Many issues went undetected, resulting in poor customer experience and the loss of potential revenue, in addition to lost capacity that could have been used for more productive purposes.

To solve this problem, we required a solution that was reliable, scalable, and easy to integrate with our SAP systems. The solution that we wanted would be process agnostic, implemented as a single codebase, and require no human intervention to detect issues.

The Microsoft Azure Anomaly Detector service, available within Microsoft Azure Cognitive Services, fits all our requirements.

The Anomaly Detector API enabled us to monitor and detect abnormalities in our data without having to know machine learning. The Anomaly Detector API’s algorithms adapt by automatically identifying and applying the best-fitting models to data, regardless of industry, scenario, or data volume, which greatly reduced our development efforts. Our primary steps were quite simple:

  1. Provision a service instance for Anomaly Detector in Microsoft Azure Cognitive Services.
  2. Start using the REST APIs in application code and interactions.
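
To make step 2 concrete, here's a minimal sketch of calling the univariate batch-detection REST endpoint with the Python requests library. The endpoint, key, and data values are placeholders rather than our production configuration.

```python
# Minimal sketch: send a univariate time series to the Anomaly Detector batch
# ("entire series") REST endpoint and print any flagged points.
# The endpoint, key, and values are placeholders; the API expects at least
# 12 data points per request.
import requests

ANOMALY_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
ANOMALY_KEY = "<your-key>"

# 24 hourly points with one injected spike so there's something to detect.
series = [
    {"timestamp": f"2023-01-01T{hour:02d}:00:00Z", "value": 120 + (hour % 3)}
    for hour in range(24)
]
series[18]["value"] = 400

response = requests.post(
    f"{ANOMALY_ENDPOINT}/anomalydetector/v1.0/timeseries/entire/detect",
    headers={"Ocp-Apim-Subscription-Key": ANOMALY_KEY},
    json={"granularity": "hourly", "series": series},
)
response.raise_for_status()
result = response.json()

# "isAnomaly" is a list of booleans, one per input point.
for point, flagged in zip(series, result["isAnomaly"]):
    if flagged:
        print(f"Anomaly at {point['timestamp']}: value {point['value']}")
```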

Using time-series data and data anomalies

For Anomaly Detector to identify anomalies, it requires time-series data, which is a series of data points indexed in time-based order.

For example, your car might have embedded sensors that send information regarding engine health, speed, tire pressure, and gasoline capacity. This information about your car is constantly updated over time and, as such, it can be used as time‑series data.

Most data received throughout time can be manipulated to be time-series data if it's a consistent data sequence with a time stamp. Time-series data with a single variable is considered a univariate series, while time-series data with more than one variable is considered a multivariate time series. Anomaly Detector supports both univariate and multivariate time series.
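
For illustration only, here's what those two shapes might look like as plain data structures. The field names in the multivariate example are hypothetical; the exact schema depends on the service or pipeline that consumes the data.

```python
# Illustration only: univariate versus multivariate time-series shapes.
univariate_series = [
    {"timestamp": "2023-01-01T00:00:00Z", "value": 120.0},
    {"timestamp": "2023-01-01T01:00:00Z", "value": 118.5},
]

# A multivariate series carries several variables per timestamp, for example
# the car-sensor scenario above. Field names here are made up.
multivariate_series = [
    {"timestamp": "2023-01-01T00:00:00Z", "speed_kmh": 92, "tire_pressure_psi": 33.1, "fuel_pct": 71},
    {"timestamp": "2023-01-01T01:00:00Z", "speed_kmh": 88, "tire_pressure_psi": 33.0, "fuel_pct": 64},
]
```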

A data anomaly is an outlying data point that doesn't fit within expected boundaries. The graphic below depicts the visual pattern of the time-series data, with the anomaly points highlighted.

Data should fall within minimum and maximum boundaries, shown in the figure as a light-colored band. Most of the data points are within the expected boundaries. However, some data points exceed the expected boundaries; those points, highlighted in red in the figure, are data anomalies.

Time-series data with anomaly data points, with some data points outside the expected limits of the graph.
Time-series data with anomaly data points.

For example, a stock price that drops below the expected limit is a data anomaly. If the temperature reading of a power plant core exceeds the acceptable limit, the reading is a data anomaly, and the technicians at the power plant should be immediately notified so that they can act based on the anomaly.

Not all data anomalies are negative.

For example, if you have an article on your website that’s trending and experiencing larger traffic volume than normal, you likely want to be notified about the anomaly.

Or, if you have an e‑commerce website and receive a sudden spike in product demand, you, as the product supplier, should be notified so that you can act immediately. The graphic below contains examples of inputs and results for the Anomaly Detector service.

Examples of inputs and results for the Anomaly Detector service.
Inputs and results for the Anomaly Detector service.

Using Microsoft Azure services to create a business solution

To enable integration with our SAP portfolio, we’ve implemented several decoupled software components. Each component has a specific use case, and we decouple business logic and the presentation layers to the extent possible. All application code is committed to a Microsoft Azure DevOps repository and is built as a Microsoft Azure-native solution.

  • Microsoft Azure Web Apps. We host the front-end (presentation layer) application in an Azure Web App, from which the user can call the anomaly-detection service by using the prepared time-series data. Microsoft Azure App Service gives our developers the option to work in their preferred language, which can be .NET, .NET Core, Java, Ruby, Node.js, PHP, or Python. We protect the application endpoint with Microsoft Azure Active Directory for user authentication and authorization.
  • Microsoft Azure Function Apps. We host all business-logic functionality in Azure Function Apps. We use two Azure Function Apps. The first is used to connect to Microsoft Azure Application Insights and capture SAP telemetry, such as customer or business-partner processes that need anomaly detection.
    The Function App transforms the data into JavaScript Object Notation (JSON) format with time-series subformatting. The second Function App captures the precompiled time-series data from the first Function App, makes a call to the Anomaly Detector service, and then retrieves the result. The Web App presentation layer displays the results in a graph format. Function App endpoints are protected with access tokens.
  • Application Insights. We store all SAP log data in Application Insights. This log data is posted from various business processes, including Customer Master Data creation, Business Partner Creation and updates, and batch program logs. These logs are the source for all anomaly detection.
  • Microsoft Azure Anomaly Detector. Anomaly Detector uses the Anomaly Detector API to detect and return all anomaly points based on time-series data that the Function Apps send. While there are two options for interacting with Anomaly Detector, our developers chose to call the HTTP REST API directly for the Anomaly Detector rather than use the client SDK to integrate Anomaly Detector directly with their application. Using the API removes the limitation of using a single codebase and enables simple integration with any modern language that supports calling REST APIs through HTTP.
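
To make the flow in the list above more concrete, here's a rough sketch of the kind of logic the first Function App performs: query Application Insights for hourly counts of an SAP business process and reshape the rows into the time-series JSON that the second Function App forwards to Anomaly Detector. The app ID, API key, and the customEvents name filter are illustrative assumptions, not our production code.

```python
# Sketch of the first Function App's core logic: pull hourly counts of an SAP
# business process from Application Insights and shape them as time-series JSON.
# The app ID, API key, and event name below are illustrative placeholders.
import requests

APPINSIGHTS_APP_ID = "<app-id>"
APPINSIGHTS_API_KEY = "<api-key>"

# Kusto query: hourly count of business-partner creation events over 3 days.
kusto = """
customEvents
| where name == "BusinessPartnerCreated"
| summarize value = count() by bin(timestamp, 1h)
| order by timestamp asc
"""

resp = requests.get(
    f"https://api.applicationinsights.io/v1/apps/{APPINSIGHTS_APP_ID}/query",
    headers={"x-api-key": APPINSIGHTS_API_KEY},
    params={"query": kusto, "timespan": "P3D"},
)
resp.raise_for_status()

table = resp.json()["tables"][0]
columns = [c["name"] for c in table["columns"]]
series = [
    {"timestamp": row[columns.index("timestamp")],
     "value": row[columns.index("value")]}
    for row in table["rows"]
]

# This payload is what gets handed to the Anomaly Detector call.
payload = {"granularity": "hourly", "series": series}
```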

Implementation architecture

As depicted in the graphic below, various SAP applications post their business-process logs into the Application Insights instance. The Web App hosts the core application, including the presentation layer and user interaction. The two Function Apps extract and process data from the Application Insights service and control interaction with the Anomaly Detector service. The Function Apps send the final results from the Anomaly Detector service for display and consumption in the Web App.

Diagram of Azure Anomaly Detector for an SAP architecture.
Microsoft Azure Anomaly Detector for an SAP architecture.

Business implementation and benefits

One of our key business processes that we onboarded to the Anomaly Detector–based solution was the Master Data Management (MDM) business-partner creation that uses SAP Master Data Governance (MDG).

We constantly create and update business-partner data in our SAP system via API calls from various upstream tenants and front-end systems. Based on incoming telemetry sources, the Anomaly Detector solution detects if there is a sudden drop in creation or update processes because of API failure or network issues.

The detection algorithm can detect these issues automatically, in real time, which helps our system users to take corrective action. This simple addition to the issue-detection process helps us supply a better customer experience and eliminates major negative effects on revenue.

Key Takeaways

We’re planning to implement the same solution design across many other business processes, such as batch-job monitoring.

Currently, we have several hundred batch jobs that range from a runtime of a few seconds to several hours. It’s extremely difficult to monitor them manually and individually.

Sometimes, due to system issues or transaction locking, these jobs take more time, further affecting downline processes. Anomaly detection will play a critical role in detecting those issues, creating automatic alerts, and reducing manual monitoring.

This application has many potential use cases across multiple business scenarios. We’re planning to explore several of these use cases, including:

  • SAP batch-job monitoring, evaluating long-running jobs and triggering alerts (see the sketch after this list).
  • Business-document processing and creation, such as sales orders, purchase orders, financial postings, and work orders.
  • Any set of data that has time-series patterns. Data sets such as these can be evaluated and monitored for anomaly detection on a case-by-case basis.
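
As a sketch of the first use case above, the Anomaly Detector "last point" mode can judge whether the most recent batch-job runtime is abnormal relative to recent history. The runtimes, endpoint, and key below are placeholders, not values from our environment.

```python
# Sketch: flag a long-running SAP batch job by asking Anomaly Detector whether
# the latest runtime is anomalous relative to recent history ("last point" mode).
# runtimes_minutes would come from SAP job logs; these values are illustrative.
import requests

ANOMALY_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
ANOMALY_KEY = "<your-key>"

runtimes_minutes = [42, 40, 45, 41, 44, 43, 39, 46, 40, 42, 44, 41, 43, 95]
series = [
    {"timestamp": f"2023-02-{day:02d}T02:00:00Z", "value": v}
    for day, v in enumerate(runtimes_minutes, start=1)
]

resp = requests.post(
    f"{ANOMALY_ENDPOINT}/anomalydetector/v1.0/timeseries/last/detect",
    headers={"Ocp-Apim-Subscription-Key": ANOMALY_KEY},
    json={"granularity": "daily", "series": series},
)
resp.raise_for_status()
result = resp.json()

if result["isAnomaly"]:
    # In a real pipeline this would raise an alert instead of printing.
    print(f"Latest runtime {runtimes_minutes[-1]} min is anomalous "
          f"(expected ~{result['expectedValue']:.0f} min).")
```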

Using Microsoft Azure Anomaly Detector has enabled us to quickly and efficiently build a solution to detect abnormalities in our SAP processes without having to know machine learning. The Anomaly Detector API’s algorithms help us to identify issues before they become problems, thereby proactively improving the performance, consistency, and reliability of our entire SAP landscape.

Related links

The post Examining Microsoft’s SAP transactions with Microsoft Azure Anomaly Detector appeared first on Inside Track Blog.

Revolutionizing SAP and ADP connectivity with Microsoft Azure VWAN and VPN http://approjects.co.za/?big=insidetrack/blog/revolutionizing-sap-and-adp-connectivity-with-microsoft-azure-vwan-and-vpn/ Thu, 22 Feb 2024 21:21:25 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=13507 Connecting line-of-business solutions into the Microsoft Azure networking environment is a critical part of enabling an efficient hybrid network environment and an important step in transforming enterprise networking using Azure. We’ve recently helped SAP, one of our key business partners, replace an outdated VPN connectivity solution with a highly secure, cost-efficient connection into Azure for […]

Connecting line-of-business solutions into the Microsoft Azure networking environment is a critical part of enabling an efficient hybrid network environment and an important step in transforming enterprise networking using Azure.

We’ve recently helped SAP, one of our key business partners, replace an outdated VPN connectivity solution with a highly secure, cost-efficient connection into Azure for integration with ADP payroll using Azure VWAN and Azure VPN.

As cloud networking has matured, so have the requirements for security and traffic control. SAP's pre-existing solution for connecting their VPN clients lacked support for critical IKEv2 security and traffic-control mechanisms such as AES256 encryption, SHA256 integrity hashing, and Border Gateway Protocol (BGP). To support these new requirements, the existing hardware used in SAP's VPN solution would have needed to be replaced to become compliant with IKEv2 standards.

Thammineni, Scheffler, and Dobler appear in a composite image.
Chakri Thammineni (left to right), Eric Scheffler, and Christian Dobler are part of a team at Microsoft Digital that created a Microsoft Azure-based VPN connectivity solution for SAP and ADP support.

Our cloud networking engineers at Microsoft Digital (MSD), the company’s IT organization, proposed a different solution: a cloud-first VPN solution using Azure VWAN and Azure VPN.

The cloud-first solution uses Azure VPN policies and tunneling to bypass the requirement for routing hardware in SAP’s environment. It also provides full support for IKEv2 and removes dependency on outdated VPN hardware and controls.

This is the first Azure-based solution using IKEv2 policy-based VPN connectivity for SAP and ADP support. This cutting-edge solution introduces a highly secure and cost-efficient way for business partners to connect with Azure while ensuring strict adherence to regulatory compliance.

The introduction of IKEv2 VPN connectivity allows partners’ systems to securely communicate with Microsoft’s SAP environment in Azure, fostering a seamless integration of services. It sets the stage for unprecedented connectivity, creating a robust ecosystem for business partners to collaborate and exchange data securely.

Azure VWAN provides native IKEv2 VPN connectivity support. By taking advantage of Azure’s powerful networking capabilities, the solution is instantly scalable and highly available. The Azure-native approach streamlines the entire connectivity process by simplifying set up, management, and monitoring.
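
As a rough illustration of what an IKEv2 policy-based connection looks like when defined in code, here's a sketch that assumes the azure-mgmt-network Python SDK. The resource names, IDs, and policy values are illustrative, not the production SAP/ADP configuration.

```python
# Sketch: define a custom IKEv2/IPsec policy on an Azure Virtual WAN VPN
# connection using the azure-mgmt-network SDK. All names, IDs, and policy
# values are illustrative assumptions.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import IpsecPolicy, SubResource, VpnConnection

network = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

policy = IpsecPolicy(
    sa_life_time_seconds=3600,
    sa_data_size_kilobytes=102400000,
    ipsec_encryption="AES256",   # AES256 payload encryption
    ipsec_integrity="SHA256",    # SHA256 integrity
    ike_encryption="AES256",
    ike_integrity="SHA256",
    dh_group="DHGroup14",
    pfs_group="PFS14",
)

connection = VpnConnection(
    remote_vpn_site=SubResource(id="<resource-id-of-partner-vpn-site>"),
    vpn_connection_protocol_type="IKEv2",
    ipsec_policies=[policy],
    use_policy_based_traffic_selectors=True,  # policy-based VPN behavior
)

poller = network.vpn_connections.begin_create_or_update(
    "<resource-group>", "<vwan-vpn-gateway-name>", "partner-connection", connection
)
poller.result()  # wait for provisioning to complete
```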

With IKEv2 VPN and Azure VWAN, all communications between partners’ or providers’ systems and Microsoft’s SAP environment are encrypted and authenticated, safeguarding sensitive information from unauthorized access. As a result, we can provide our partners with peace of mind and uphold the highest standards of data protection.

By eliminating the need for complex hardware and streamlining the set-up process, we enable our internal businesses and partners to achieve significant cost savings. This cost-effectiveness empowers them to invest resources strategically and fuel their growth.

Regulatory compliance is non-negotiable in today’s business landscape. Our VPN connectivity and native network solutions are designed with strict adherence to regulatory requirements in mind. By meeting industry standards and regulatory mandates, we ensure that our business partners can confidently operate within the bounds of compliance, mitigating potential risks and challenges.

The first IKEv2 policy-based VPN connectivity solution for SAP and ADP support through Azure native network solutions represents an entirely new way to approach SAP and ADP connectivity for Microsoft partners. With improved security, better cost efficiency, and support for regulatory compliance adherence, we’re reshaping the standards for business partner connectivity in the cloud. We’re continuing to innovate and explore new approaches to secure and efficient VPN connectivity to support SAP systems.

Key Takeaways

To navigate the integration of your network with Microsoft Azure networking solutions, consider the following:

  • Explore Azure VWAN and VPN. Evaluate your network for potential improvements by considering Azure Virtual WAN and VPN for enhanced security and efficiency.
  • Assess security and compliance. Review your network’s security protocols and compliance standards against Azure’s IKEv2 encryption and regulatory adherence.
  • Initiate a pilot project. Test the impact of Azure VWAN and VPN by launching a small-scale pilot within your network, focusing on performance and cost benefits.

Try it out

Here’s how you can create a peer-to-site VPN connection using Azure Virtual WAN.

Related links

We'd like to hear from you!
Want more information? Email us and include a link to this story and we’ll get back to you.

Please share your feedback with us—take our survey and let us know what kind of content is most useful to you.

The post Revolutionizing SAP and ADP connectivity with Microsoft Azure VWAN and VPN appeared first on Inside Track Blog.

Protecting Microsoft’s SAP workload with Microsoft Sentinel http://approjects.co.za/?big=insidetrack/blog/protecting-microsofts-sap-workload-with-microsoft-sentinel/ Thu, 25 Jan 2024 09:01:48 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=7827 For any large enterprise like Microsoft, monitoring threats to infrastructure and applications developing and maintaining an always-on Security Information and Event Management (SIEM) solution like Microsoft Sentinel that’s equipped to ward off threats isn’t only a weighty task but also a truly challenging undertaking. The threat landscape is constantly evolving, and data breaches—originating from outside […]

For any large enterprise like Microsoft, monitoring threats to infrastructure and applications, and developing and maintaining an always-on Security Information and Event Management (SIEM) solution like Microsoft Sentinel that's equipped to ward off those threats, isn't only a weighty task but also a truly challenging undertaking.

The threat landscape is constantly evolving, and data breaches—originating from outside or within organizations—are commonplace. Organizations now acknowledge that securing their digital perimeter is an insufficient and inherently reactive approach. That’s because modern security solutions, to be robust, must protect the entire enterprise environment, including core business processes, the sensitive data that those processes might expose, and the systems that support those processes. This recognition prompted Microsoft to seek a more complete, continuous, and dynamic solution to better protect its SAP assets—Microsoft Sentinel, a cloud-native SIEM platform that uses built-in AI to quickly help analyze large volumes of data across an enterprise. Microsoft’s SAP assets include applications that support Microsoft’s core business processes and combined, comprise an impressive 24 terabytes (TB) of data.

Microsoft had a two-fold rationale for developing its new Microsoft Sentinel SAP SIEM solution: to better detect suspicious activity and to fully document security incidents and how the organization resolves them. Enterprise resource planning (ERP) systems like SAP are facing increasing cybersecurity threats, across the industry spectrum, from healthcare and manufacturing, to finance, retail, and e-commerce. Threat actors, recognizing such systems’ vulnerabilities, have identified ERP systems as a prime target. SAP vulnerability is a critical concern for enterprise executives and senior security professionals, given that the average cost of an SAP breach is $5 million per attack. That cost doesn’t figure in the reputational damage a breach or attack might confer, which is often substantial and prolonged.

[Using Microsoft Azure AD MFA at Microsoft to enhance remote security. | Moving to next-generation SIEM with Microsoft Sentinel. | Using shielded virtual machines to help protect high-value assets.]

Identifying threats—and a solution

Microsoft Sentinel and Microsoft’s SAP Security teams defined a roadmap to address current challenges and chart a path to using the preventive and detective capabilities of Microsoft Sentinel and Microsoft Azure. Additionally, the collaborative efforts of SAP and Microsoft Azure increase end-to-end visibility across enterprise systems and applications and help bolster system resilience. Although Microsoft has tools in place to detect and log threats across its SAP landscape, the challenge was managing the number of tools, the data sources, and the effort required to analyze and help remediate a threat.

Five layers of Microsoft Sentinel monitoring shown in a graphic: SAP business logic, SAP application, database, OS, and network.
SAP security layers that Microsoft Sentinel monitors and some of the security-risk scenarios that Microsoft Sentinel addresses.

Considering that any breach in Microsoft SAP applications could have catastrophic consequences for us, we knew that we needed a solution that enabled rapid vulnerability detection and monitoring capabilities to reduce risks to the organization. We needed an internally managed and configured SIEM solution that could baseline user behaviors and detect anomalies across SAP to include the OS and network layer, the database layer, and the application and business logic layers.

—Kusuma Sri Veeranki, senior software engineer and SAP security lead, Microsoft Digital

At the same time, Microsoft also wanted to implement a centralized SIEM solution that detects and helps prevent threats. The SIEM tools in use were effective, but the monitoring structure was inherently reactive because it didn’t allow for real-time monitoring. When a potential threat or an active security incident was identified, an alert was generated. However, the time to assess and remediate threats was variable, and response lags were common. Further, the process for addressing some vulnerabilities involved patching, for example, so any exposure remained viable until patches were applied across the system. Finally, there wasn’t an easy way to follow an SAP security alert through the system to determine the remedial actions taken within Microsoft and by whom.

Microsoft also recognized that the existing SAP SIEM solution didn’t always meet its stringent compliance requirements and didn’t permit sufficient visibility into the entire threat environment. And from an enterprise-wide perspective, Microsoft was essentially operating separate corporate and SAP security solutions, an outdated model that the company sought to replace. The new monitoring solution, when operational, would include a separate Sentinel instance for SAP but would also fully integrate with the Microsoft corporate Security Operations Center (SOC).

We’re excited to be able to use the capabilities that Sentinel provides our customers out of the box along with SAP specific capabilities on an initiative as important as Microsoft SAP security. This represents a new approach in SIEM solutions.

—Yoav Daniely, principal group product manager, Microsoft Security, Compliance, Identity, and Management

“Considering that any breach in Microsoft SAP applications could have catastrophic consequences for us, we knew that we needed a solution that enabled rapid vulnerability detection and monitoring capabilities to reduce risks to the organization,” says Kusuma Sri Veeranki, a senior software engineer and SAP security lead for Microsoft Digital, the organization that powers, protects, and transforms the company. “We needed an internally managed and configured SIEM solution that could baseline user behaviors and detect anomalies across SAP to include the OS and network layer, the database layer, and the application and business logic layers.”

The ideal solution, Veeranki says, would also permit visibility into all other systems, products, and applications that interconnect with SAP.

SAP as Microsoft Sentinel ’customer zero’

To develop its new SIEM solution for SAP, the organization decided to use Microsoft Sentinel, a relatively new product, in conjunction with Microsoft's existing security orchestration, automation, and response (SOAR) platform. Developed initially for Microsoft Azure, Microsoft Sentinel is designed to collect data and monitor suspicious activities at cloud scale by using sophisticated analytics and threat intelligence. Recently cited in a Forrester Consulting study as an efficient, highly scalable, and flexible SIEM solution that incorporates Azure Log Analytics, Sentinel is also the first cloud-native SIEM on the market.

Our objective is to deliver a configurable solution that has the ability to monitor end-to-end processes and take the appropriate action as defined within the system, including those that should be stopped. Many of the current products on the market are SAP-centric but are limited in their integration capabilities. So, we’re customer zero for leveraging Microsoft Sentinel for SAP security and for enabling that cross-correlation capability.

—Aaron Hillard, principal software engineering manager and SAP security lead, Microsoft Digital

“We’re excited to be able to use the capabilities that Sentinel provides our customers out of the box along with SAP specific capabilities on an initiative as important as Microsoft SAP security,” says Yoav Daniely, principal group product manager on the Microsoft Security, Compliance, Identity, and Management (SCIM) team. “This represents a new approach in SIEM solutions.”

To configure Microsoft Sentinel to monitor the entire Microsoft SAP environment—it includes 15 SAP production systems, six of which are Sarbanes-Oxley (SOX) systems—the engineering team and the Microsoft Azure product group recognized that the solution also needed to provide cross-correlation coverage. Cross correlation is the ability to surveil the entire organization, including junctures where SAP integrates with other systems and applications such as Microsoft Dynamics 365. For example, Sentinel could detect a hypothetical scenario in which a user creates a new payee in Dynamics and then “pays” that customer in SAP, activity that would otherwise go undetected.

Microsoft Sentinel for SAP’s collect, detect, investigate, and respond elements shown in a graphic.
Microsoft Sentinel for SAP monitoring solution highlights.

“Our objective is to deliver a configurable solution that has the ability to monitor end-to-end processes and take the appropriate action as defined within the system, including those that should be stopped,” says Aaron Hillard, principal software engineering manager and SAP security lead in Microsoft Digital. “Many of the current products on the market are SAP-centric but are limited in their integration capabilities. So, we’re customer zero for leveraging Microsoft Sentinel for SAP security and for enabling that cross-correlation capability.”

Ultimately, the goal is to equip Microsoft Sentinel to assess and respond dynamically to all security threats across all enterprise hosts, platforms, applications, and business processes, and then provide automated remediation as feasible and appropriate. The risk scenarios that Microsoft Sentinel addresses will continue to expand as the product evolves.

“Sentinel gives us the ability to monitor the data and activities holistically, because Microsoft, like many other enterprises, uses numerous systems throughout the operations environment,” Veeranki says. “That’s a key differentiator of Sentinel compared to SIEM systems that are designed purely for SAP.”

Microsoft Sentinel incorporates advanced machine learning and AI capabilities that identify suspicious patterns and activities that previously defied detection. Additionally, it readily integrates with the numerous platforms and products that enterprise companies use, and it enables organizations to customize its configuration to meet their security-monitoring needs.

Managing massive inputs efficiently with an innovative data connector

To date, the Microsoft SAP and Microsoft Sentinel SAP threat monitoring engineering teams have identified an initial 27 high-risk scenarios that encompass a broad range of use cases. These use cases involve changes in system, client, or audit-log configuration, and suspicious or unauthorized user logins, data access, or role assignments. Monitoring also covers account-modification or password-change activities, and any audit-log manipulation or brute-force attack, among others. Other risk scenarios are being identified with respect to highly sensitive business and financial threats, and the teams are developing and completing proofs of concept for those scenarios. The SAP and Sentinel teams will continue to expand the threat-detection capabilities of Microsoft Sentinel and the risk scenarios that it addresses as the product evolves.

The Microsoft SAP footprint is massive and change management within the platforms is highly complex. Therefore, to prevent system overload because of memory requirements, the engineering team must deploy a robust yet nimble mechanism to accommodate the vast amount of data coming into Microsoft Sentinel. To that end, the engineering team developed a Microsoft Sentinel-specific data connector that manages SAP inputs in a manner that’s specific to the underlying applications. The connector facilitates a complete security solution to visualize, alert, and respond to threats, and it’s easily configurable through built-in watchlists that match specific environment needs.

The data connector extracts data for monitoring, stores it, and then moves it through Sentinel in an incremental manner that the system can “understand,” says Anirudh Dahuja, an SAP platform engineer in Microsoft Digital. “Otherwise, there’s a risk of overloading the system, an issue that we’ve encountered,” he says.

To accomplish efficient use of the new tool, the engineering team used indexing to accommodate unwieldy tables and expedite querying. The team also incorporated secrets connectivity by using the Microsoft Azure Key Vault, which provides a secure store to create, store, and maintain keys that access and encrypt cloud resources, apps, and solutions. To manage the requisite memory optimization, the team leveraged Docker containers to accommodate the data connector functions before moving data into Microsoft Sentinel using custom Microsoft Azure APIs.
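
To illustrate the pattern (not the connector's actual implementation), here's a sketch that assumes the azure-keyvault-secrets and azure-monitor-ingestion Python packages. The secret names, table name, and the read_next_audit_chunk helper are hypothetical stand-ins for the SAP-side extraction logic.

```python
# Sketch: pull SAP credentials from Key Vault, then move log records into a
# Log Analytics / Microsoft Sentinel custom table in small batches so memory
# stays bounded. Secret names, DCR IDs, and read_next_audit_chunk() are
# hypothetical stand-ins for the real connector's extraction logic.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
from azure.monitor.ingestion import LogsIngestionClient

credential = DefaultAzureCredential()

# Secrets connectivity: SAP credentials never live in the container image.
vault = SecretClient("https://<vault-name>.vault.azure.net", credential)
sap_user = vault.get_secret("sap-abap-user").value
sap_password = vault.get_secret("sap-abap-password").value

ingestion = LogsIngestionClient("https://<dce-name>.ingest.monitor.azure.com", credential)

BATCH_SIZE = 500  # keep each upload small to avoid overloading the pipeline

def read_next_audit_chunk(user, password, max_rows):
    """Hypothetical placeholder for the SAP-side incremental extraction."""
    return []  # a real connector reads audit-log rows since the last checkpoint

while True:
    rows = read_next_audit_chunk(sap_user, sap_password, BATCH_SIZE)
    if not rows:
        break
    ingestion.upload(
        rule_id="<data-collection-rule-immutable-id>",
        stream_name="Custom-SAPAuditLog_CL",
        logs=rows,
    )
```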

There’s another challenge that Microsoft Sentinel engineers are experiencing and working to remedy: how to reduce the “noise” in the monitoring system to differentiate between authorized, permissible activities and real threats that warrant action. Because Sentinel is designed to detect a very broad range of potentially suspicious or intentionally malicious activities, the number of alerts it raised initially produced many false positives.

“That’s something we’re working on now—improving alert fidelity and fine-tuning the system to produce fewer false positives,” Veeranki says.

That’s the biggest advantage of using Sentinel for SAP monitoring—the analytics. There are a lot of other tools in the market that alert you to SAP threats, but that’s where they stop. Microsoft Sentinel offers a scalable cross-platform solution to detect and mitigate threats in near real time. We’re not only detecting threats but also quickly responding to and remediating them.

—Anirudh Dahuja, SAP platform engineer, Microsoft Digital

She adds that Microsoft will continue to share the challenges and remedies that teams discover as the Microsoft Sentinel implementation proceeds. Customers then can accelerate their own implementations by using these learnings.

Key Takeaways

Tallying early Sentinel benefits and moving forward

Microsoft Sentinel allows for comprehensive cross correlation across enterprise resources, in addition to SAP, thereby helping identify known and previously difficult-to-detect security threats in near real time. That’s a capability high on the wish list for many of Microsoft’s existing enterprise customers. Interestingly, two other critical Sentinel benefits are emerging. Despite the initiative’s early development stage—it’s been less than a year since its inception—Microsoft Sentinel has proved highly scalable and customizable from the outset. It also promises to engender efficiencies generally for Microsoft security operations, by providing a single SIEM system and “pane of glass” through which to continuously view security logs, alerts, and incidents across the enterprise.

Further benefits, still in development, are the advanced analytics being integrated to help detect anomalies in activities involving SAP systems and the automated remediation that Microsoft Sentinel will eventually provide. That’s a winning combination, in Dahuja’s view.

“That’s the biggest advantage of using Sentinel for SAP monitoring—the analytics. There are a lot of other tools in the market that alert you to SAP threats, but that’s where they stop. Microsoft Sentinel offers a scalable cross-platform solution to detect and mitigate threats in near real time,” Dahuja says. “We’re not only detecting threats but also quickly responding to and remediating them.”

Related links

The post Protecting Microsoft’s SAP workload with Microsoft Sentinel appeared first on Inside Track Blog.

Optimizing Microsoft’s SAP environment with Microsoft Azure http://approjects.co.za/?big=insidetrack/blog/optimizing-microsofts-sap-environment-with-microsoft-azure/ Tue, 16 Jan 2024 09:04:09 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=8948 At Microsoft, we use SAP enterprise resource management software to run our mission-critical business functions like finance and human resources. In an on-premises model, physical computing resources are costly and may go unused. But by moving our SAP systems to Microsoft Azure, we avoid maintaining unused resources—we scale our systems up and down for current […]

At Microsoft, we use SAP enterprise resource management software to run our mission-critical business functions like finance and human resources.

In an on-premises model, physical computing resources are costly and may go unused. But by moving our SAP systems to Microsoft Azure, we avoid maintaining unused resources—we scale our systems up and down for current and short-term needs. By doing this, we’ve fine-tuned our capacity management processes for lower costs, and more agility, scalability, and flexibility.

Like many enterprises, our company uses SAP—the enterprise resource planning (ERP) software solution—to run most of our business operations. We’re running SAP on Microsoft Azure, the preferred platform for SAP and the optimal platform for digital transformation.

We’ve optimized our SAP on Microsoft Azure environment to gain business and operational benefits that make our SAP instance agile, efficient, and able to grow and change with our business needs.

Optimizing our Microsoft Azure environment has allowed us to:

  • Increase cost savings by using our Microsoft Azure infrastructure more efficiently.
  • Create a more agile, scalable, and flexible SAP on Microsoft Azure solution.

As of 2018, Microsoft’s instance of SAP is 100 percent migrated to Microsoft Azure. By optimizing SAP on Azure, we’re positioning our SAP environment to grow and change with our business needs. Additionally, we’re positioned to lead our digital transformation and empower everyone in our organization to achieve more. Azure makes SAP better.

[Discover how we’re protecting Microsoft’s SAP workload with Microsoft Sentinel. | Learn how Microsoft moved its SAP workload to the cloud. | Find out how we’re transforming how we monitor our SAP instance with Microsoft Azure.]

SAP at Microsoft

Each SAP system or app in our overall SAP landscape uses servers and hardware, computing resources (like CPU and memory), and storage resources. Each system also has separate environments, like sandbox and production. The resources required to run SAP can be costly in an on-premises model, where you have physical or virtualized servers that often go unused.

Consider a typical on-premises system.

The IT industry often sizes on-premises servers and storage infrastructure for the next three to five years, based on the expected maximum utilization and workload during the life span of an asset. But often, the full capacity of the hardware isn’t used outside of peak periods—or isn’t needed at all. Maintaining these on-premises systems is costly.

With Microsoft Azure, we avoid infrastructure underutilization and overprovisioning. We quickly and easily scale up and scale down our SAP systems for current and short-term needs, not for maximum load over the next three to five years.

Capacity management offers a boost

By managing capacity and sizing our SAP systems on Microsoft Azure, we’ve experienced improvements in several areas:

  • We have a lower total cost of ownership—we only pay for what we need, when we need it. We save on costs of unused hardware and ongoing server maintenance.
  • We cut the core counts (number of CPUs in a machine) nearly in half—from 64-core physical machines to 32-core virtual machines for almost every server that we moved.
  • We are much more agile. We size for our needs now, and easily add or change our setup as needed to accommodate new functionality.
  • For example, in a few minutes, we changed an 8-CPU virtual machine to 16 CPUs, doubled the memory, and added Microsoft Azure premium storage to meet our short-term needs. Later, to save costs, we easily reverted to the original setup.
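
As a sketch of what such a resize looks like in code, assuming the azure-mgmt-compute Python SDK, with illustrative resource and SKU names:

```python
# Sketch: scale an SAP app-server VM from 8 to 16 vCPUs by switching its size,
# using the azure-mgmt-compute SDK. Resource names and SKU names are
# illustrative; a resize restarts the VM, so schedule it in a maintenance window.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

vm = compute.virtual_machines.get("<resource-group>", "sap-app-01")
vm.hardware_profile.vm_size = "Standard_E16s_v3"  # up from an 8-vCPU size

compute.virtual_machines.begin_create_or_update(
    "<resource-group>", "sap-app-01", vm
).result()
```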

What does optimizing involve?

Optimizing involves calculating our hardware requirements like CPU resources, storage space, memory, input/output (I/O), and network bandwidth.

When we optimize, we size for today. We assess our infrastructure, resources, and costs, and then size our systems as small as possible. We also ensure that there's sufficient space to run business processes without causing performance issues during expected events like product releases or quarterly financial reporting.

This approach lets us optimize our storage and computing power, giving us flexibility and on-demand agility.

Tips for sizing

Sizing is an ongoing task because your load, business requirements, and behavior patterns can change at any time. The following are some considerations and tips, based on the process that we use:

  • Design for easy scale-up and scale-out. Upsize only when needed, rather than scaling up or out months or quarters ahead of an actual business need. Start with the smallest possible computing capacity.
    It’s easy to add capacity later and resize before business processes change or before new processes go live in the environment. Autoscaling up and out brings additional benefits because it’s an automatic response to current conditions and usage patterns.
    Designing for easy scale-up also includes configuring the SAP HANA database and the SAP instances to dynamically adjust the amount of memory used or number of work processes depending on available resources in the virtual machine (VM). Keep the instance design simple: One SAP instance per VM.
  • Figure out how many virtual machines a system needs. Our production and user acceptance testing (UAT) systems have multiple virtual machines, but for sandbox and quality assurance, we usually allocate single virtual machines. Sometimes our SAP app instances and database instance are on the same virtual machine.
  • Don’t size for only CPUs and memory. Size for storage I/O and throughput requirements, too.
  • Consider upstream and downstream dependencies in data movement and in app-to-app communication. Let’s say that you move an app into a public cloud. Adding 20 to 40 milliseconds in communication between on-premises and public cloud apps can affect dependencies and can also affect customers or your service-level agreements (SLAs) with business partners.
  • Decide whether all your systems need Microsoft Azure premium storage. It’s possible to change storage via a short downtime from Azure standard storage to premium storage without the need to manually copy the data.
    Here is an example of how we’re using this feature: For our archiving system, we used Microsoft Azure Standard Storage and small virtual machines. When we loaded data into the system, we temporarily doubled the memory and CPU, and added Microsoft Azure Premium Storage for log file drives in the database.
    To save costs, after we loaded the data, we made the virtual machine smaller again and removed the premium storage drives.
  • Decide if all apps must run continuously. Can some apps run eight hours a day, Monday through Friday? If so, you can save costs by snoozing. At night, or on weekends or holidays, you can often snooze development and sandbox systems if they aren’t in use.
    Also consider identifying test systems and potentially even some production systems for snoozing. Create a snooze schedule and provide key personnel with the ability to un-snooze systems on demand.
    If you have a separate business-continuity system and you snooze hardware for it, you pay only for storage, not for compute consumption. Also, consider using the smallest size feasible. If there’s a disaster, resize to a bigger size before you start the production system, such as the database server, for business continuity.
  • Keep monitoring and managing system and resource capacity. Make changes before issues occur. Monitor storage use, growth rates, CPU, network utilization, and memory resources that are used on virtual machines. Again, consider autoscaling up and out. If monitoring indicates that a system is consistently oversized, then adjust downward.

Strategies for sizing SAP systems

We used two common strategies for sizing SAP systems, each in a different way and at different points in the optimization process.

We used the SAP Quick Sizer at the start of our optimization process because it had a simple, web-based interface and it allowed us to prepare our sizing strategies from the start.

We used reference sizing later, after we determined some context around our virtual-machine sizing and could provide virtual machines in Microsoft Azure for reference.

SAP Quick Sizer

If you don’t yet have systems or workloads in Microsoft Azure, start with SAP Quick Sizer—it’s an online app that guides you on sizing requirements, based on your business needs.

Quick Sizer is helpful for capacity and budget planning. The app has a questionnaire where you indicate the number of SAP users, how many transactions you expect, and other details.

Quick Sizer then recommends a number of SAP Application Performance Standard (SAPS) units, a measurement of the processing capacity that you need, such as for a database server.

If the recommended number is 80,000, you need to leverage servers with SAPS that add up to 80,000.
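
The arithmetic is straightforward; in the sketch below the per-VM SAPS rating is a hypothetical figure, so substitute the published value for the Azure VM type you plan to use.

```python
# Sketch: turn a Quick Sizer SAPS recommendation into a rough VM count.
# The per-VM SAPS figure is hypothetical; use the values published for your
# chosen Azure VM type (see the SAP note referenced below).
import math

required_saps = 80_000   # Quick Sizer recommendation
saps_per_vm = 28_000     # hypothetical rating for the chosen VM size

vm_count = math.ceil(required_saps / saps_per_vm)
print(f"Plan for at least {vm_count} VMs of this size "
      f"({vm_count * saps_per_vm:,} SAPS total).")  # -> 3 VMs, 84,000 SAPS
```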

You can find more information about SAPS for Azure virtual machines in SAP Note #1928533 SAP Applications on Azure: Supported Products and Azure VM types (SAP logon required).

You should keep a few considerations in mind when you’re using SAP Quick Sizer.

There can be customization and variations of SAP systems, depending on business processes, which could change system behavior. Or you might have capabilities enabled for new SAP deployments or custom code for which no Quick Sizer exists.

Also, in the past, hardware vendors guided customers on the servers that they needed and how to install them. With Azure, customers make their own decisions—for example, how to grow storage as data volume grows, or how to adjust CPU compute resources.

Reference sizing

After systems are live in Microsoft Azure, reference sizing is the recommended method. With this approach, you need to look at the performance of systems you’ve already moved to Azure that have a similar load to the systems that you want to move.

This comparison helps you estimate your sizing requirements accurately. For example, if you have an on-premises system that you want to move to Microsoft Azure, and it’s three times larger than one of the systems that you already have on Azure, adjust the sizing based on systems you’ve already deployed in Azure, and then deploy the new system.

If it turns out that your estimate wasn’t accurate, it’s much easier and quicker to adjust CPU and memory resources in Microsoft Azure than on-premises, by switching to a different virtual machine size. Adjusting the database on-premises is more difficult because you might need to buy servers with more CPU and memory.

For on-premises, you must look at what you have, add a buffer, and consider the additional load that you’ll have in the next few years.

Technical considerations

When we integrate SAP with Windows Server and SQL Server, our main considerations are cost of ownership and low complexity. When you plan your integration and reference architecture, make sure that the technical landscape is easy and cost effective. With business-critical systems, it’s difficult to scale when you have an architecture whose maintenance requires highly skilled individuals, or when there are emergencies where you need business continuity.

For easy administration and operations, we use the same app design in all SAP production systems. We only adjust VM sizes and numbers based on the system-specific requirements.

Also, to avoid issues for customers who run SAP workloads on Azure, Microsoft certifies only certain Microsoft Azure VM types. These VMs must meet memory, CPU, and ratio requirements, and they must support defined throughputs. To learn more about Azure VM types certified for SAP, review SAP certifications and configurations running on Microsoft Azure.

Technical implementation and technical capabilities

The graphic below shows the Microsoft SAP ERP/ECC production system in Microsoft Azure. By moving to Azure, we’ve gained agility and scalability on the SAP Application layer.

We can scale the SAP Application layer up and down by increasing and decreasing the size and number of the VMs. The design and architecture have high-availability measures against single points of failure.

So, if we need to update Windows Server or Microsoft SQL Server, perform infrastructure maintenance, or make other system changes, it doesn’t require much, if any, downtime. We implement infrastructure in Azure for our production systems with standard SAP, SQL Server, SAP HANA, Windows Server, and SUSE Linux high-availability features.

Illustration of the current Microsoft SAP ERP/ECC production system in Azure with an example of how we use Azure Availability Zones for VMs
A typical SAP BW/HANA production system in Microsoft Azure.

High availability and scalability

To ensure high availability, we are leveraging Microsoft Azure Availability Zones: We distribute our VMs in multiple zones. The graphic above includes an example of how we use Azure Availability Zones. If a problem arises in one zone, the system is still available.

All single points of failure are secured with clustering: Windows Server Failover Cluster for the Windows operating system and Pacemaker cluster for SUSE Linux. For databases, we use SQL Server Always On and SAP HANA System Replication (HSR). The databases are configured for a synchronous commit on both local HA nodes (no data loss occurs, and automatic failover is possible) and an asynchronous commit to the remote disaster recovery node. If an issue arises with the main database server, SAP will automatically reconnect to the local high availability node.

Because we can use the secondary database, we can upgrade software and SQL Server, roll back to previous releases, and do automatic failovers with no or minimal risk.

For scalability and high availability of the SAP application layer, multiple SAP app server instances are assigned to SAP redundancy features such as logon groups and batch server groups. Those app server instances are configured on different Microsoft Azure virtual machines to ensure high availability. SAP automatically dispatches the workload to multiple app-server instances per the group definitions. If an app server instance isn’t available, business processes can still run via other SAP app server instances that are part of the same group.

Rolling maintenance

The scale-out logic of SAP app server instances is also used for rolling maintenance. We remove one virtual machine, and the SAP app server instances running on it, from the SAP system without affecting production. After we finish our work, we add back the virtual machine, and the SAP system automatically uses the instances again. If high load occurs and we need to scale out, we add additional virtual machines to our SAP systems.

Automated shutdown of SAP systems

To keep VM costs under control, we've established new policies for nonproduction SAP systems: the default setting for a system is that it's unavailable. Users can start systems on demand by using a snooze application built on top of Microsoft Power Apps.

The system will be available for 12 hours and then automatically shut down again unless the availability window has been extended. Additionally, systems that are used regularly are assigned to a fixed availability schedule. For example, the schedule might be that systems shut down Friday evening and start again on Monday morning without user interaction. If the system is needed over the weekend, users can start it by using the snooze application.
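
The underlying snooze operation is a deallocate-and-start pattern. Here's a minimal sketch, assuming the azure-mgmt-compute Python SDK; the VM names are illustrative, and the scheduling and Power Apps front end sit on top of calls like these.

```python
# Sketch: the deallocate/start pattern behind snoozing a nonproduction SAP
# system, using the azure-mgmt-compute SDK. Resource names are illustrative.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")
RESOURCE_GROUP = "<resource-group>"
SANDBOX_VMS = ["sap-sbx-app-01", "sap-sbx-db-01"]

def snooze(vm_names):
    # Deallocate (not just power off) so compute charges stop; storage is still billed.
    for name in vm_names:
        compute.virtual_machines.begin_deallocate(RESOURCE_GROUP, name).result()

def unsnooze(vm_names):
    for name in vm_names:
        compute.virtual_machines.begin_start(RESOURCE_GROUP, name).result()

snooze(SANDBOX_VMS)      # e.g., Friday evening
# unsnooze(SANDBOX_VMS)  # on demand, or Monday morning
```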

Telemetry and monitoring using Microsoft Azure Monitor

Moving SAP systems to Microsoft Azure enabled easy integration with various Microsoft Azure Monitor services for enhanced telemetry and monitoring. We approached telemetry by using a multilayer concept: 1) SAP Business Process layer, 2) SAP Application Foundational layer, 3) SAP Infrastructure layer, and 4) SAP Surrounding API layer.

Microsoft Azure Log Analytics offers many standard metrics for the SAP Infrastructure layer, and it’s easy to integrate with SAP to collect custom metrics from the Application Foundational layer.

Implementing telemetry for the SAP business process and API layers is relatively challenging because doing so requires custom development. We use tools in the ABAP SDK to integrate and export data from SAP to Microsoft Azure Monitor Application Insights.
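
The production export runs on the SAP side through the ABAP SDK, but as a language-neutral illustration of the idea, here's roughly what posting a custom business-process event to Application Insights looks like using the Python applicationinsights package. The instrumentation key and event fields are placeholders.

```python
# Illustration only: the shape of a custom business-process event posted to
# Application Insights. In our environment this export happens from SAP via
# the ABAP SDK; the instrumentation key and fields here are placeholders.
from applicationinsights import TelemetryClient

tc = TelemetryClient("<instrumentation-key>")

tc.track_event(
    "BusinessPartnerCreated",                      # business-process event name
    properties={"system": "MDG", "status": "OK"},  # dimensions for filtering
    measurements={"duration_ms": 840},             # numeric value for time series
)
tc.flush()  # push the buffered telemetry to Application Insights
```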

Learn more about end-to-end telemetry for SAP on Microsoft Azure and find out additional information about how Microsoft’s internal SAP workload has received a telemetry boost from Microsoft Azure.

Key Takeaways

We keep learning and iterating as we optimize SAP for Microsoft Azure in our environment. Here are some important lessons that we’ve learned:

  • Ensure that you don’t over-provision your virtual machines, but make sure that you provision sufficient resources to avoid having to keep increasing your system resources weekly.
  • Design and build your infrastructure and storage in Microsoft Azure so that it can scale. Even for our development and test systems, we decided to use Azure premium storage because it offers low latency. That approach is optimal, because during project implementation, there are often multiple developers simultaneously using the development systems.
  • The types of virtual machine storage and Microsoft Azure networking that we use are influenced by the lessons that we’ve learned about functionality. The Azure cloud platform is continually improved based on customer feedback and requirements.
  • Design for high availability in your production systems by using Windows Server Failover Clustering, SQL Server Always On, and SAP features like logon groups, remote function call groups, and batch server groups.

We’re excited about the decreased costs and increased agility that we’ve experienced so far in optimizing SAP for Microsoft Azure. In the future, we plan to share more lessons that we learn as we move forward with post-migration improvements.

For information on designing a migration initiative, review Strategies for migrating SAP systems to Microsoft Azure. Our future plans include:

  • Automating the sizing of our simpler systems and environments and developing autoscale. Automation and autoscale apply more to the middle tier—the SAP application layer—but we’d also like to autoscale up and down for the database layer and file servers. We want our systems to autoscale based on current conditions.
  • Adding more automation for business continuity. Right now, we use the same semiautomated business-continuity process in Microsoft Azure that we used on-premises. If there’s a disaster, production fails over to a different Azure region.
  • Exploring new business-continuity strategies and technology options as they apply to Microsoft Azure.
  • Helping our customers who run SAP adopt scenarios like Microsoft Azure Backup or Microsoft Azure data encryption at rest by addressing questions such as:
    • Which policies do I apply in the SAP landscape?
    • What do I encrypt? Do I use disk encryption or database encryption?
    • Do I need the same backup methods for a 50-gigabyte database that I require for a 10-terabyte database?
  • Adding and using new Microsoft Azure capabilities. We want to enable more SAP scenarios to run in Azure—better and faster storage, larger virtual machines, better network connectivity, and more Azure operational guidance.

Related links

The post Optimizing Microsoft’s SAP environment with Microsoft Azure appeared first on Inside Track Blog.

Hello Azure: Unpacking how Microsoft moved its SAP workload to the cloud http://approjects.co.za/?big=insidetrack/blog/hello-azure-unpacking-microsoft-moved-sap-workload-cloud/ Mon, 08 Jan 2024 16:05:49 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=3429 [Editor’s note: This story on how Microsoft moved its SAP workload to the cloud has been updated with new details and updated terminology. A new SAP on Microsoft Azure video has been added below. This content was written to highlight a particular event or moment in time. Although that moment has passed, we’re republishing it […]

[Editor’s note: This story on how Microsoft moved its SAP workload to the cloud has been updated with new details and updated terminology. A new SAP on Microsoft Azure video has been added below. This content was written to highlight a particular event or moment in time. Although that moment has passed, we’re republishing it here so you can see what our thinking and experience was like at the time.]

Microsoft is heavily invested in SAP applications—it uses them extensively to run finance, human resources, global trade, supply chain, and other parts of its $168.1 billion global business. That’s why it was a big deal when Microsoft moved its SAP workload to the cloud and Microsoft Azure.

Moving to the cloud saved us money, but this was really about becoming more agile and innovative.

—Krassimir Karamfilov, general manager, Microsoft SAP program, Microsoft Digital

In early 2018, the company finished moving its entire SAP landscape—an estimated 50 terabytes—to Microsoft Azure. It was at that time, when the last, most important systems were moved over one busy weekend, that the company was able to seize on a new opportunity.

“Moving to the cloud saved us money, but this was really about becoming more agile and innovative,” says Krassimir Karamfilov, general manager of the Microsoft SAP program in Microsoft Digital. “This allowed our teams to stop worrying about keeping our infrastructure up and running and to focus on innovating without a lot of heartburn. They can now run experiments, learn, and then use those learnings to take us in new directions—and if an experiment doesn’t work? They can easily shut it down and move on to something else.”

Moving from on-premises to Microsoft Azure slashed the Microsoft SAP budget by 20 percent to 25 percent, cost savings that came from fine-tuning usage, snoozing systems at night and on weekends, and leaving behind old processes that weren’t needed anymore.

“We’re 100 percent in the cloud, and since Azure is the trusted cloud provider, our entire SAP landscape is now more secure than ever,” says Karamfilov, explaining that the employees who manage the company’s SAP in many instances were given new tools to work with. “The shift to the cloud enabled us to start using machine learning and artificial intelligence to look at the data underneath our entire landscape, and to start learning from that.”


In this video, Hans Reutter answers questions about the company’s SAP on Microsoft Azure migration and lessons learned along the way.

At the time, the move to the cloud represented how committed Microsoft was and still is to its partnership with SAP, Karamfilov says. Officially and publicly, SAP has adopted a “multi-cloud strategy” where it supports all hyperscale cloud providers to enable customers to use their preferred cloud. Also at the time, SAP announced it had selected Microsoft Azure to run many of its internal mission-critical SAP business systems (since then SAP has moved a lot of those internal systems to the cloud on Microsoft Azure).

[Learn how Microsoft uses telemetry and monitoring on its SAP on Microsoft Azure workload. | Here’s how Microsoft is examining SAP transactions with Microsoft Azure Anomaly Detector. | Find out how Microsoft is optimizing SAP for Microsoft Azure.]

Making the move to Azure

Looking back at how Microsoft moved its SAP workload to the cloud, the Microsoft SAP team had been thinking about making the move since 2013, but it was only in 2017 that Azure added virtual machine SKUs big enough to handle a Microsoft-sized SAP ERP system, says Hans Reutter, the Microsoft Digital engineering manager who led the cloud migration.

“It wasn’t just the size, it was the complexity,” Reutter says, giving the example of several independent enterprise-scale purchasing processes needing to come together seamlessly at the point of purchase so the customer has a good buying experience. “All of these production landscapes were highly dependent on each other because of the traffic that flows back and forth between them.”

We started with the low-risk stuff—the low-hanging fruit. First we moved the sandbox systems because we knew that there wouldn’t be any impact on our customers if something went wrong.

—Hans Reutter, group engineering manager, Microsoft Digital

Microsoft Azure developed the M-Series specifically to address the enterprise market demand for larger virtual machines capable of running SAP landscapes at companies just like Microsoft.

“We just happened to be one of the enterprises waiting for the M-Series,” Reutter says. “It was exactly what we needed to make our move to the cloud.”

With Azure’s M64 and M128 SKUs in hand, Reutter’s team migrated the company’s largest and most complex SAP systems to Azure over one weekend (the team had used smaller virtual machines to move most of the Microsoft landscape during the prior year).

To understand how they did it, Reutter says, think horizontally and vertically.

“We started with the low-risk stuff—the low-hanging fruit,” Reutter says. “First, we moved the sandbox systems because we knew that there wouldn’t be any impact on our customers if something went wrong.”

That was the base layer. Next came the next least important layer, and the next, and the next. The team worked up layer by layer until it eventually got to the most important production systems. That was the horizontal approach. “If we missed anything, we found out really quickly—before we got to the really important systems,” Reutter says.

The team wrote migration scripts, found landmines, and worked out kinks while working in the base layers, which allowed it to get everything down pat by the time things got serious.
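
As a simplified illustration of that layered ordering, here’s a hypothetical sketch of deriving migration waves from a system inventory. The system names, roles, and risk tiers are invented for the example; they aren’t Microsoft’s actual inventory or migration tooling.

```python
# Hypothetical sketch: order SAP systems into migration waves by business risk,
# mirroring the "low-hanging fruit first" layering described above.
# System names, roles, and tiers are invented for the example.
from collections import defaultdict

# Lower tier number = lower business risk = migrated earlier.
RISK_TIERS = {"sandbox": 0, "training": 1, "development": 2, "test": 3, "production": 4}

systems = [
    {"name": "ERP-PRD", "role": "production"},
    {"name": "ERP-SBX", "role": "sandbox"},
    {"name": "HR-TRN", "role": "training"},
    {"name": "SCM-DEV", "role": "development"},
    {"name": "FIN-TST", "role": "test"},
]

def plan_waves(inventory: list) -> list:
    """Group systems into waves, lowest-risk tier first, so mistakes surface early."""
    waves = defaultdict(list)
    for system in inventory:
        waves[RISK_TIERS[system["role"]]].append(system["name"])
    return [waves[tier] for tier in sorted(waves)]

for number, wave in enumerate(plan_waves(systems), start=1):
    print(f"Wave {number}: {', '.join(wave)}")
```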

Now to the vertical.

The team also used an end-to-end vertical strategy, migrating a few systems—from development all the way to production—all at once. “We tested the whole stack to make sure there were no gaps or surprises,” Reutter says. “This vertical strategy allowed the team to accelerate learning and get a jump on production processes.”

By getting it right in both dimensions, they were able to migrate the mission-critical systems seamlessly. “It’s one thing to move some small stuff,” Reutter says. “You want to make sure the crown jewels land in a safe spot.”

And while the migration wasn’t without bumps, it went smoothly and the crown jewels were kept safe.

Getting it right at home

Juergen Thomas hears it all the time. When a customer is considering moving their SAP systems to Microsoft Azure, they always ask the same question.

The ask?

“Does Microsoft run its landscape in Azure?”

Thomas, a partner architect who manages a team in Microsoft Azure that helps SAP customers move to the cloud, was happy when he could start answering with a resounding “yes.” Even more importantly, he then had a good answer to their inevitable second question: “How did you do it?”

It’s proof that the Microsoft Azure platform can carry and sustain such a complex SAP platform on our Azure infrastructure. We talk about it in customer briefings and conferences—it is our poster child. Running our own backend business processes with SAP in Azure is a main differentiator between us and our competitors who don’t run SAP at all, or not in their own clouds.

—Juergen Thomas, partner architect, Microsoft Azure

“Then they immediately want to get into the details, which is a good thing,” Thomas says. “It was very important to be able to say that Microsoft is leading again and is ahead of the mass of companies.”

Thomas says the story of moving the Microsoft ERP systems to the cloud has become a key showcase for the Microsoft Azure team because it shows Microsoft Azure can handle the biggest and most complicated SAP instances.

“It’s proof that the Microsoft Azure platform can carry and sustain such a complex SAP platform on our Azure infrastructure,” he says. “We talk about it in customer briefings and conferences—it is our poster child. Running our own backend business processes with SAP in Azure is a main differentiator between us and our competitors who don’t run SAP at all, or not in their own clouds.”

Key takeaways

If you’re getting ready to move your SAP systems to the cloud, here are four quick things to think about to help you get started.

  • Clean out your closet: Moving to the cloud is an opportunity to throw out the stuff you’re not using. When you owned that old on-premises server, it didn’t matter how much old stuff you had buried in there. In the cloud, the cost of carrying around a lot of dead weight can add up fast.
  • Avoid disasters: When you move all your stuff to the cloud, the temptation is to move it and forget it. Not so fast: you need a backup plan. Microsoft Azure has more than 60 regions around the world; make sure your systems are backed up in a region that’s geographically separate from the one you run in. That way your systems keep running even if something goes wrong in your primary region.
  • Use the cloud you already pay for: As much as you can, move your systems into cloud products that you already pay for to keep your costs down, including Microsoft 365 and Microsoft Dynamics 365.
  • Snooze so you don’t lose: Slash your costs by taking advantage of one of the cloud’s best benefits, which is snoozing your Microsoft Azure usage when your teams are out of the office on nights and weekends. A minimal sketch of one way to automate this follows this list.
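
To make the snoozing idea concrete, here’s a hedged sketch using the Azure SDK for Python. The subscription ID, resource group, business-hours window, and the “snoozable” tag are illustrative assumptions for this example, not Microsoft’s actual automation.

```python
# A minimal sketch of the "snooze" idea: deallocate tagged, non-production VMs
# outside business hours and start them again in the morning.
# Requires the azure-identity and azure-mgmt-compute packages; the subscription ID,
# resource group, hours, and "snoozable" tag are hypothetical placeholders.
from datetime import datetime
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
RESOURCE_GROUP = "sap-nonprod-rg"      # placeholder

def snooze_or_wake(now: datetime) -> None:
    """Deallocate opted-in VMs on nights and weekends; start them during business hours."""
    client = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
    off_hours = now.weekday() >= 5 or not (7 <= now.hour < 19)

    for vm in client.virtual_machines.list(RESOURCE_GROUP):
        if (vm.tags or {}).get("snoozable") != "true":
            continue  # only touch VMs explicitly opted in via a tag
        if off_hours:
            client.virtual_machines.begin_deallocate(RESOURCE_GROUP, vm.name)
        else:
            client.virtual_machines.begin_start(RESOURCE_GROUP, vm.name)

if __name__ == "__main__":
    snooze_or_wake(datetime.now())
```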

Transforming how Microsoft connects with its 58,000 suppliers http://approjects.co.za/?big=insidetrack/blog/transforming-how-microsoft-connects-with-its-58000-suppliers/ Wed, 11 Oct 2023 16:00:57 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=6840 Anyone procuring goods or services for Microsoft needs to do their homework. Not only must you make sure the supplier can provide what you need at a reasonable price, you must also make sure they meet certain standards and security compliance and obtain their contact information. Tracking down all that information is where supplier information […]

Anyone procuring goods or services for Microsoft needs to do their homework. Not only must you make sure the supplier can provide what you need at a reasonable price, you must also make sure they meet certain standards and security compliance requirements, and you need their contact information. Tracking down all that information is where supplier information management comes in.

From cafeteria food and office chairs to professional services, marketing, and hardware, Microsoft works with more than 58,000 different suppliers. Previously, finding the relevant information needed to make an informed choice meant searching through at least 10 different databases or tools.

Whether that information would be current and accurate was another question. Updates made in one database might not make it into another. Only 65 percent of Microsoft’s regular suppliers had current contact information on file.

“There was no holistic data store where we could say, ‘here is a 360-degree view of a supplier,’” says Naveen Kumar Nooka, a senior program manager on Microsoft’s Procurement team.

Gathering the pieces to form that view was typically a four- or five-day process. The solution was often to create a support ticket with the Accounts Payable or Procurement teams—and there were a lot of support tickets.

Suchit Shah listened to the pain points expressed by Microsoft suppliers to improve the experience of providing supplier information to Microsoft.

These pain points weren’t only felt internally at Microsoft.

Suppliers also had a difficult time providing the needed information to partner with Microsoft. They had to visit up to 14 separate tools owned by various teams to enter different types of data, such as basic profile information and their rate cards, compliance documentation, and sourcing information. If they had questions, there was no centralized place to get help.

“We heard from the suppliers telling us that it’s difficult to do business with Microsoft because you have so many tools,” says Suchit Shah, a senior procurement operations manager for Microsoft. If they wanted to update any information, suppliers had to contact their business manager, and that person would then enter the updates manually on an as-needed basis.

“The result of that was that suppliers were missing compliance deadlines and missing critical data that they had to provide to us,” Shah says. “That’s disruptive to the business, because if they’re not compliant we have to block them, and they can’t provide services to Microsoft.”

The Procurement team within Microsoft Digital, the organization that powers, protects, and transforms Microsoft, envisioned a way to solve the company’s issues with supplier information management.

Beyond aiming to reduce overall risk by ensuring trusted and accurate data, the Procurement team sought to improve the experience for users on both sides of the business relationship while reducing the costs associated with manual, disconnected processes.

[Learn more about how Microsoft secures its supply chain with risk-based assessments. Find out how Microsoft designed a modern service architecture for its procurement and payment processes.]

Somebody stood up

Microsoft Consulting Services is highly dependent on supplier information.

As a senior business program manager overseeing master data within Microsoft’s Services business unit, Andreas Hart needs accurate information on subcontractors.

The biggest win of the project was that somebody stood up and said, “I’ll be that central reference point for everyone.”

—Andreas Hart, senior business program manager, Microsoft Services

To realize the full benefit of a new centralized supplier data repository, teams within Microsoft needed to trust the new source of truth before they could eliminate their now redundant processes. Microsoft Consulting Services was the first of many to stand up and partner with the SupplierWeb team to retire their separate processes, thereby driving efficiencies for their business.

“I think the biggest win of the project was that somebody stood up and said, ‘I’ll be that central reference point for everyone,’” Hart says. “Naveen and the Procurement engineering team volunteered to be the backbone, and then the consuming team saying, ‘we want to create a dependency as well.’ That trust on the dependency has to be established.”

For Microsoft suppliers, a single portal called SupplierWeb was created to replace the 14 existing tools. Designed as a self-service portal, suppliers can sign in and easily manage and update their own data, view their transactions, and get help through a digital assistant. Implementing data governance rules and best practices within SupplierWeb ensures that only valid data flows in.
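
To illustrate the kind of data-governance gate described above, here’s a small, hypothetical validation sketch in Python. The field names and rules are invented for the example and aren’t the actual SupplierWeb schema.

```python
# Illustrative sketch of a data-governance gate a portal like SupplierWeb might
# apply before accepting a supplier profile update. Field names and rules are
# invented for the example and aren't the actual SupplierWeb schema.
import re

REQUIRED_FIELDS = {"supplier_id", "legal_name", "contact_email", "country"}
EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_profile(profile: dict) -> list:
    """Return a list of validation errors; an empty list means the update is accepted."""
    errors = [f"missing field: {field}" for field in sorted(REQUIRED_FIELDS - profile.keys())]
    email = profile.get("contact_email", "")
    if email and not EMAIL_PATTERN.match(email):
        errors.append("contact_email is not a valid address")
    return errors

print(validate_profile({"supplier_id": "S-1001", "legal_name": "Contoso Ltd.",
                        "contact_email": "ap@contoso.com", "country": "US"}))  # []
print(validate_profile({"supplier_id": "S-1002", "contact_email": "not-an-email"}))
```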

Since its initial rollout in January 2020, SupplierWeb has served roughly 48,000 unique users a year, representing approximately 60 percent of the active supplier base.

For internal users at Microsoft who need to find a supplier, a new portal called ProcureWeb was created, providing the sought-after 360-degree view with all the necessary, validated information in one place. In addition to basic information, ProcureWeb also offers “surround data”: augmented data such as awards, skills, fact sheets, and special recognitions that help complete the picture.

We’ve built a seamless user interface on Microsoft technologies that is available to both suppliers as well as internal users.

—Naveen Kumar Nooka, senior program manager, Microsoft Procurement

Naveen Kumar Nooka helped redesign and improve the experience of procuring the right supplier at Microsoft.

The robust ProcureWeb database has another popular new feature: intelligent search insights that can help internal users find a supplier based on specific criteria, such as areas of expertise, level of experience, or even a supplier’s diversity rating, which is helpful as Microsoft works to diversify its supplier base to include more minority-owned businesses. The search function has logged more than 870,000 unique searches since July.

Built on Microsoft Azure, both portals use a micro front-end architecture, with a single service layer consistently powering both systems. Data is stored in Microsoft Azure Cosmos DB, Microsoft’s multi-model database service, and the service layer connects seamlessly to SAP on the back end.
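
As a rough sketch of what a shared service-layer read could look like, the snippet below uses the Azure Cosmos DB SDK for Python. The endpoint, key, database, container, and document fields are hypothetical placeholders, not the actual ProcureWeb implementation.

```python
# A hedged sketch of a single service layer fetching a supplier's 360-degree
# profile from Azure Cosmos DB for both ProcureWeb and SupplierWeb.
# The endpoint, key, database, container, and fields are hypothetical placeholders.
from azure.cosmos import CosmosClient

COSMOS_ENDPOINT = "https://<account>.documents.azure.com:443/"  # placeholder
COSMOS_KEY = "<account-key>"                                    # placeholder

client = CosmosClient(COSMOS_ENDPOINT, credential=COSMOS_KEY)
container = client.get_database_client("procurement").get_container_client("suppliers")

def get_supplier_view(supplier_id: str) -> dict:
    """Read one supplier document, assuming supplier_id doubles as the partition key."""
    return container.read_item(item=supplier_id, partition_key=supplier_id)

profile = get_supplier_view("S-1001")
print(profile.get("legal_name"), profile.get("diversity_rating"))
```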

“We’ve built a seamless user interface on Microsoft technologies that is available to both suppliers as well as internal users,” Nooka says. “We build once and make it available in multiple places, ensuring that there is the right level of authorization. That’s the whole suite of solutions we’ve built to make life easier for both suppliers and internal users.”

Tackling bigger questions

Microsoft isn’t alone in needing a robust solution for supplier information management that operates at enterprise scale. Designing one with solid data integrity, enhanced capabilities, and a smooth user experience was an opportunity to build something unique.

“If we look at the industry today, there are a lot of large enterprises who do business with a number of suppliers,” Nooka says. “Most of them have their own portals where they collect supplier information. So, we’re trying to be as innovative as possible in this space, optimizing the data we collect from suppliers, while generating all the insights for an internal user to ensure a seamless experience.”

The Procurement team is now looking to improve other aspects of the supplier lifecycle by harnessing the full potential of AI to create a seamless supplier management experience, decreasing processing time, improving supplier compliance, and helping suppliers get access to support faster.

Combining Fluent UI design principles with AI has the potential to transform the supplier management experience, making it more efficient, accurate, and proactive. Fluent UI provides visually appealing, user-friendly interfaces that keep interactions with supplier management platforms intuitive, while AI automates tasks such as supplier discovery, risk assessment, performance monitoring, and contract analysis, simplifying processes and freeing up time for strategic decision-making. The result is a streamlined and empowered experience that fosters stronger supplier relationships, mitigates risks, and drives better overall performance.

The supplier vetting process was the next big overhaul, aiming to eliminate duplicative work that happens when the same supplier is onboarded multiple times by different teams that have their own processes.

Collecting data at the right points in time and completing the vetting process before a supplier is entered into the system also ensured that only suppliers that meet Microsoft standards and requirements go through the whole onboarding process.

“It has helped speed up onboarding,” Shah says. “It’s a massive transformation for both sides, and it’s a multi-year journey.”

Rosalia Snyder, group procurement operations manager for Microsoft Procurement, says that as risks in managing suppliers have evolved, it’s increasingly critical to have agile solutions for supplier information management.

“Whether it be supporting company commitments around diversity or sustainability or adding mandatory statutory requirements, how do we ensure we have accurate supplier data to quickly adapt when we need to?” Snyder says.

It starts with better tools, but that’s just the beginning.

“We are breaking down silos, taking a lead across the enterprise to define how suppliers should do business with us, while creating the ecosystem to do it in,” Snyder says. “This has been part of our digital transformation journey at Microsoft.”

Transforming Microsoft’s corporate expense tools with Microsoft Azure and Microsoft Dynamics 365 http://approjects.co.za/?big=insidetrack/blog/transforming-microsofts-corporate-expense-tools-with-microsoft-azure-and-microsoft-dynamics-365/ Tue, 19 Sep 2023 16:12:16 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=6462 When it comes to employee expenses, Microsoft has quite a collection of corporate expense tools in its tool belt. Depending on your role, you might need different tools, all of which accomplish the same basic goal—reporting expenses and getting reimbursed—in job-specific ways. That may sound pretty handy, but all those tools can also be a […]

When it comes to employee expenses, Microsoft has quite a collection of corporate expense tools in its tool belt. Depending on your role, you might need different tools, all of which accomplish the same basic goal—reporting expenses and getting reimbursed—in job-specific ways.

That may sound pretty handy, but all those tools can also be a burden, especially for the engineers who have to keep them sharp. It can also be confusing for employees to have to switch between different tools and user interfaces for filing expenses.

Microsoft has built up its collection of corporate expense tools over the years, onboarding various internal and third-party expense management platforms through acquisitions and because of different business needs. As the weight grew heavier, so did the need to address their maintenance issues.

“We had multiple tools that needed streamlining,” says Amruta Anawalikar. She is a senior program manager for Microsoft Commerce Financial Services (CFS) in Finance Engineering, the team under Azure Cloud + AI that manages expenses at Microsoft. “We wanted to reduce employee productivity costs and improve operations, engineering capacity, and business capacity.”

[Find out how Microsoft is creating efficiencies in finance with Microsoft Dynamics 365 and machine learning. Learn more about migrating critical financial systems to Microsoft Azure.]

Flipping the design

These disconnected systems were working across 110 countries and regions with no synchronization, each with its own vertical infrastructure. The system to manage them all had been built up piece by piece.

Because of varying configurations specific to local environments, deployments were no small task. Expense categories, policies, and payment rules differed across tools. None of the systems even spoke the same language—“report name” in one tool might be called “report description” or “expense purpose” in another.

Solving for these issues through tool customizations wasn’t an option, either, because one of the primary tools (a third-party product) didn’t support the API integrations needed for that kind of flexibility.

Standardization and unification were badly needed.

We’ve flipped the whole design. By generalizing the entire ecosystem and focusing on a modern Azure-based design, it allows multiple tools to exist.

– Sumeet Deshpande, Microsoft CFS Finance Engineering

“Historically, the efforts were really focused locally within each tool and over the years, each ended up having its own over-customized, tool-centric designs built around them,” says Sumeet Deshpande. He is a principal software engineering manager for CFS Finance Engineering. “Even though the systems were very similar, we couldn’t leverage the same components.”

Deshpande’s team set out to create a unified expense management experience that put engineering before process.

“We’ve flipped the whole design,” Deshpande says. “By generalizing the entire ecosystem and focusing on a modern Azure-based design, it allows multiple tools to exist, and the problems we were having in over-aligning to the specific tools have gone away.”

The modernization journey

Microsoft’s journey to build a modern backend pipeline to unify its corporate expense tools began two years ago, with a milestone that became a catalyst for change.

In 2019, the legacy tool MSExpense 1.0 was being retired. Faced with 110 countries and regions to migrate (and myriad tax and statutory regulations to go with them), the team expected the migration to MSExpense 2.0 to take two to three years and cost an estimated $2 million. But then they decided to try a new strategy.

“That’s the point where we started thinking differently in how our tools need to be either onboarded or retired,” says Mohit Jain, a senior software engineer who led the retirement for CFS Finance Engineering.

They started breaking big problems down into smaller pieces, dividing countries and regions into buckets based on their level of complexity and tackling the migration one at a time.

The entire migration took just six months, and cost nothing.

“This was a really important part of our journey and how we approached problems going forward,” Jain says.

Building on that momentum, the team implemented a major overhaul of the user experience in 2020 and introduced OneExpense, automating much of the process with built-in machine learning to essentially eliminate the need for employees to file expense reports at all.

According to Deshpande, that’s what set the stage for remaking the back end.

“MSExpense 1.0 was retired at rocket speed,” he says. “We built on that and delivered end-to-end automation. That was a powerful story for leadership—people started to listen to us after that.”

Meet the hero: Microsoft Azure’s architecture

With a firm engineering mindset and funding to move forward, the mission to modernize was on.

Microsoft Azure provided the cloud base that would help the team achieve their internet-first, top-down goal. Where the structures of the third-party expense tools were locked, Microsoft Dynamics 365, the company’s powerful suite of business solutions software, provided the flexibility they needed.

Key to that flexibility was building a disconnected architecture that allowed the team to create a plug-and-play modular design that enabled any individual system to be swapped in or out.

“We can literally replace any component, with Dynamics 365 as a unified mechanism for any expense tool that exists,” Deshpande says. “Our primary hero is the modern Azure-based design, which really synergizes reusability across expense tools and allows multiple tools to exist because they have been there for a valid reason. The most important piece within it is Dynamics 365, but that’s just one part of the puzzle.”

Standardization was also a fundamental piece.

As part of the earlier automation project, Anawalikar had gathered extensive market research data to standardize policies. By reaching out to different markets as well as other large tech companies to investigate the sources of governance, they were able to trim a lot of fat and create a universal set of standards and definitions.

This enabled Jain to create what they called the OneExpense contract: a common data language that would allow the siloed systems to understand each other.

“Whatever expense tool you run your infrastructure on, eventually we converted that tool-specific data to a OneExpense contract,” Jain says. “It was a very important piece to make a contract that’s tool agnostic—that’s what we need for our downstream needs.”

Anawalikar calls it something else: “The holy grail.”

“Everything in this ecosystem, right from automation to expense reporting, is based on the expense contract,” she says. “We need to protect it at all costs.”
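
To illustrate the idea of a tool-agnostic contract, here’s a small, hypothetical sketch that maps tool-specific field names onto one canonical shape. The tool names and fields are invented for the example and aren’t the real OneExpense contract.

```python
# Illustrative sketch of a "common data language": each tool's field names map
# onto one canonical contract that downstream systems can rely on.
# Tool names and field names here are hypothetical, not the real OneExpense schema.
FIELD_MAPS = {
    "tool_a": {"report name": "report_title", "amount": "total_amount"},
    "tool_b": {"report description": "report_title", "total": "total_amount"},
    "tool_c": {"expense purpose": "report_title", "grand total": "total_amount"},
}

def to_canonical(tool: str, record: dict) -> dict:
    """Convert a tool-specific expense record into the canonical contract."""
    mapping = FIELD_MAPS[tool]
    return {mapping[field]: value for field, value in record.items() if field in mapping}

# Three tools, three naming conventions, one canonical result.
print(to_canonical("tool_a", {"report name": "Team offsite", "amount": 412.50}))
print(to_canonical("tool_b", {"report description": "Team offsite", "total": 412.50}))
print(to_canonical("tool_c", {"expense purpose": "Team offsite", "grand total": 412.50}))
```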

Those vertical structures in the architecture that made management of the corporate expense tools so labor-intensive were up for disruption as well, and were restructured horizontally.

This is where Dynamics 365 really has an edge. It gives us that flexibility of customization, which will make it extendable for any future needs as well.

– Mohit Jain, senior software engineer, Microsoft CFS Finance Engineering

“When they’re horizontal, they’re the same across different systems,” Jain says. “Even with different tools, the overall experience remains the same because the tool-specific data was converted to a generic OneExpense contract.”

At the core of the system, the team built its Data Integration Service: an orchestrator that controls various microservices for calculating taxes, sending statutory forms, and sending emails to approvers. Functioning much like an orchestra conductor, the orchestrator cues each system at the right time to perform the right function, sometimes each playing solo, and other times in harmony with other instruments.

“Adding up all of those microservices would have been really difficult in any other system,” Jain says. “This is where Dynamics 365 really has an edge. It gives us that flexibility of customization, which will make it extendable for any future needs as well.”
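
A minimal sketch of that conductor pattern might look like the following. The step implementations are stubs and the field names are hypothetical; this isn’t the actual Data Integration Service.

```python
# A minimal, hypothetical sketch of the conductor pattern: an orchestrator runs
# independent steps (tax calculation, statutory forms, approver notification)
# in sequence for each expense report. Step bodies are stubs for illustration.
from typing import Callable

def calculate_taxes(report: dict) -> None:
    report["tax"] = round(report["total_amount"] * 0.10, 2)  # illustrative flat rate

def send_statutory_forms(report: dict) -> None:
    print(f"Filing statutory forms for {report['id']} in {report['country']}")

def notify_approver(report: dict) -> None:
    print(f"Emailing approver about report {report['id']}")

# The "score": each microservice is cued in order by the orchestrator.
PIPELINE: list[Callable[[dict], None]] = [calculate_taxes, send_statutory_forms, notify_approver]

def orchestrate(report: dict) -> dict:
    """Run each step in sequence for one expense report and return the enriched report."""
    for step in PIPELINE:
        step(report)
    return report

print(orchestrate({"id": "EXP-42", "country": "US", "total_amount": 412.50}))
```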

The current transitional state of Microsoft’s OneExpense architecture features a new horizontal structure that facilitates deployment and orchestration.
The future “dream” state of the Microsoft OneExpense architecture has a lighter design that is more scalable and extendible.

The tool belt of the future

The Finance Digital team is already enjoying life in the modern expense management world.

Deployments are easy, having gone from 100 percent manual (with a lot of typos) to standardized configurations deployed with 90 percent automation.

Modular plug-and-play components include a tax calculation system and a centralized automated auditing system that’s more efficient at flagging errors. Plus, self-monitoring and self-healing tools detect and fix issues behind the scenes before they’re flagged by employees filing help tickets.

With all of this comes greatly reduced operational costs, higher productivity, and employees who no longer have to view expense filing as a laborious, confusing task that interrupts the main focus of their jobs.

As Anawalikar says, one of the primary goals achieved is to “make expenses less expensive.”

“This expense story today is a showcase of everything Azure Cloud + AI has to offer, namely the power of Azure Cloud, the use of AI and ML, and the use of Dynamics 365,” Deshpande says.

What gets Deshpande and his team really excited as they iterate further is the architecture’s modular ability to work with any corporate expense tool.

“If Dynamics 365 evolves and rolls out multiple advanced versions, it will still work,” Deshpande says.

Jain is also looking forward to a future state in which the architecture is even lighter-weight and Microsoft Dynamics 365 is truly functioning as the Swiss Army knife of all corporate expense tools.

Along the way, the team has experienced what it feels like to be part of a team that’s empowered by research, data, and its own supportive leadership.

“We were encouraged to challenge the status quo and ask questions,” Jain says. “We used disruption for a positive outcome.”

How auto-scaling SAP on Microsoft Azure is benefitting Microsoft http://approjects.co.za/?big=insidetrack/blog/how-auto-scaling-sap-on-microsoft-azure-is-benefitting-microsoft/ Thu, 11 Feb 2021 21:25:57 +0000 http://approjects.co.za/?big=insidetrack/blog/?p=6219 Microsoft has implemented auto-scaling SAP on Microsoft Azure to help its SAP workloads run more efficiently. Why? Like many enterprises, Microsoft runs on SAP. It uses the software to run everything from tracking servers in its supply chain to making sure the company’s 140,000 employees are paid on time. It has one of the largest […]

Microsoft has implemented auto-scaling SAP on Microsoft Azure to help its SAP workloads run more efficiently.

Why?

Like many enterprises, Microsoft runs on SAP.

It uses the software to run everything from tracking servers in its supply chain to making sure the company’s 140,000 employees are paid on time. It has one of the largest SAP deployments in the world.

In fact, Microsoft manages 50 terabytes of SAP data (enough to hold nearly 7 million digital photos) on 700 Microsoft Azure virtual machines. The company’s SAP usage is doubling year over year, and it conducts 300 million operations per month.

In 2017, the Microsoft Digital team orchestrated a massive lift-and-shift of the company’s SAP business process services, moving that data trove to 700 virtual machines.

The move was intended to save Microsoft operational costs, while also improving reliability.

And it succeeded.

Still, more could be done, says Sanoop Thrivikraman Nampoothiri, a senior software engineer in Microsoft Digital.

“We thought the benefits of moving to Azure could be even greater if we just took a few more steps,” he says. “So we designed a way to more efficiently manage the resources we use to power our SAP applications.”

The team had already made progress. Costs of managing the SAP workload dropped 18 percent during the first two years of Microsoft Azure operations, thanks to moving away from on-premises hardware and from waterfall-based practices for engineering and deploying upgrades.

Moreover, lifting-and-shifting Microsoft’s SAP workload to Azure allowed the company to easily scale its SAP application to keep up with the explosive growth in usage.

But Microsoft Digital engineers sought further savings. They saw pain points such as the rising costs of running SAP on virtual machines, the company’s one-size-fits-all approach to server configuration, and the lack of an out-of-the-box way to dynamically scale SAP workloads.

[Learn how Microsoft monitors SAP end to end. See how Microsoft monitors end-to-end enterprise health with Microsoft Azure. Check out how Microsoft migrated critical financial systems to Microsoft Azure.]

Adapting SAP for the cloud era

One of the challenges with managing SAP in the cloud is that even though it now has more than 220 million cloud-based users, it’s not fully optimized for today’s elastic cloud infrastructure.

“SAP (in its current form) was designed back in the 1990s,” Nampoothiri says. “It’s an older architecture, and it’s not as modern as some of our other Azure services—especially web services. At the same time, it’s one of our busiest services, managing everything from finances to supply chains. And a lot of our customers are in the same position as we are.”

One result is that engineers tend to be cautious when managing mission-critical applications such as SAP, building in plenty of capacity to ensure customers always have access.

“When you design a system like that, you always design for peak load,” Nampoothiri says. “Most of our customers do the same. But Azure has a lot of flexibility that allows you to right-size systems.”

This puts SAP application servers in a sweet spot for automation and optimization. Combining the oversight of Microsoft Azure Monitor with the power of Microsoft Azure Automation, these application servers can be scaled at will.

Microsoft Digital carefully monitors SAP usage and stability using Microsoft Azure Monitor and applies technologies such as predictive analytics to spot potential problems before they occur. That monitoring also measures the loads on SAP infrastructure, allowing engineers to clearly see usage patterns.

“The telemetry from Microsoft Azure Monitor helped us understand which workgroups have different loads,” says Karan Parseja, a Microsoft Digital software engineer in Hyderabad. “The next step was to build a solution that would decide which servers should run at lower load levels, and then automatically reduce the capacity for those servers. We also needed the solution to gracefully stop an application when needed.”

Enter auto-scaling, tight-sizing, and snoozing.

After the migration of 700-plus virtual machines (VMs) to Azure, we were constantly looking at the opportunities for further optimization of infrastructure resources.

– Santosh Rajput, senior software engineer in Microsoft Digital

Microsoft Azure runs SAP more efficiently with auto-scaling

The seed for auto-scaling SAP on Microsoft Azure came from a hackathon—an annual week-long event at Microsoft where everyone teams up with colleagues to work on ideas of their choice.

“After the migration of 700-plus VMs to Azure, we were constantly looking at the opportunities for further optimization of infrastructure resources,” says Santosh Rajput, a senior software engineer in Microsoft Digital. “During a hackathon, we came up with this idea of scaling in or out of SAP application servers automatically, in real time.”

Altogether, the team took three approaches to improve how Microsoft Azure runs SAP:

Auto-scaling. The team embraced an “infrastructure on demand” approach, in part because it’s easy to scale Microsoft Azure up as needed. Team members used the SAP Quick Sizer tool to estimate precisely how much VM capacity was needed, then scaled accordingly. And they shortened the planning horizon from several years to six months, enabling more precise adjustments to demand.

Tight-sizing. Most system demand peaks are predictable—quarter-end and year-end in particular. The Microsoft Digital team redesigned its VM array running SAP to correlate system capacity with anticipated peak demands.

Snoozing. Perhaps the biggest change was to move away from the always-on status of the original SAP setup. The Microsoft Azure team used Microsoft PowerShell to give the system the ability to “sleep” during quiet periods. But if someone is working on a weekend and needs access, the virtual machines rapidly come back online to do the work.
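
As a hedged sketch of how such a scaling decision could be automated, the snippet below reads recent CPU telemetry from Azure Monitor and starts or deallocates a standby application server. The resource names, thresholds, and standby-server approach are illustrative assumptions, not Microsoft’s production solution, which (as noted above) also stops the SAP application gracefully before a server goes to sleep.

```python
# Hedged sketch: read the last hour of CPU telemetry for a primary SAP application
# server from Azure Monitor, then start or snooze a standby application server.
# Requires azure-identity, azure-monitor-query, and azure-mgmt-compute; the
# subscription, resource group, VM names, and thresholds are hypothetical.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient
from azure.monitor.query import MetricsQueryClient, MetricAggregationType

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
RESOURCE_GROUP = "sap-app-rg"          # placeholder
PRIMARY_VM = "sap-app-01"              # placeholder
STANDBY_VM = "sap-app-02"              # placeholder
PRIMARY_RESOURCE_ID = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RESOURCE_GROUP}"
    f"/providers/Microsoft.Compute/virtualMachines/{PRIMARY_VM}"
)

credential = DefaultAzureCredential()
metrics = MetricsQueryClient(credential)
compute = ComputeManagementClient(credential, SUBSCRIPTION_ID)

def average_cpu_last_hour() -> float:
    """Average the 'Percentage CPU' metric over the last hour for the primary server."""
    result = metrics.query_resource(
        PRIMARY_RESOURCE_ID,
        metric_names=["Percentage CPU"],
        timespan=timedelta(hours=1),
        granularity=timedelta(minutes=5),
        aggregations=[MetricAggregationType.AVERAGE],
    )
    points = [point.average
              for series in result.metrics[0].timeseries
              for point in series.data
              if point.average is not None]
    return sum(points) / len(points) if points else 0.0

def rebalance(low: float = 15.0, high: float = 70.0) -> None:
    """Wake the standby server under heavy load; snooze it when the landscape is idle."""
    cpu = average_cpu_last_hour()
    if cpu > high:
        compute.virtual_machines.begin_start(RESOURCE_GROUP, STANDBY_VM)
    elif cpu < low:
        # In production you would first stop the SAP instance gracefully, as noted above.
        compute.virtual_machines.begin_deallocate(RESOURCE_GROUP, STANDBY_VM)

rebalance()
```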

The reconfigured SAP on Microsoft Azure system was also redesigned with fewer points of failure and a substantial degree of redundancy to guard against unexpected faults. It has dual databases, which provide automatic failover if one crashes; that also makes it easier to perform system upgrades without interfering with work demands.

Microsoft’s SAP infrastructure is based on servers, telemetry, SQL databases, and Microsoft Azure Logic Apps. This allows Microsoft Azure to scale SAP up or down, depending on demand.

Still, perhaps the biggest task was finding a way to deploy these improvements in a way that allowed Microsoft’s SAP infrastructure to keep working smoothly while auto-scaling SAP on Microsoft Azure changes were made. Think of it as repairing a jetliner mid-flight—from the outside.

“We had to convince our stakeholders that this would really work without having an impact on the availability of the system,” Rajput says. “Any customer running SAP and Azure would have that concern as well.”

When we moved, we didn’t want to take any chances. We wanted to show people the best possible way to run SAP on Azure. So we were conservative and focused on availability for peak loads. But now we’re confident that Azure can handle SAP workloads, so now we’re working on optimization.

– Niranjan Maski, senior program manager in Microsoft Digital

Niranjan Maski agrees. He is a senior program manager for Microsoft Digital in Hyderabad.

“When we moved, we didn’t want to take any chances,” Maski says. “We wanted to show people the best possible way to run SAP on Azure. So, we were conservative and focused on availability for peak loads. But now we’re confident that Azure can handle SAP workloads, so now we’re working on optimization.”

Empowering customers to do more with SAP

Overall, auto-scaling SAP on Microsoft Azure reduced the cost of running SAP by another 18 percent and created a more robust system in the process.

Now used internally, these improvements may be rolled out for customers using Microsoft Azure and SAP. That would be an important step in keeping Microsoft Azure abreast of or ahead of competitors, who also run SAP on their cloud services.

“With COVID-19, we’re seeing more enterprises moving their IT infrastructure to the cloud, so they have better resiliency and scalability,” Rajput says. “For a lot of our customers, their biggest workload is enterprise resource planning (ERP) performed on SAP. If we can show them that moving to the cloud saves them money, then that will drive more cloud adoption.”

Options include making this an add-on to Microsoft Azure, says Amit Ganguli, a Microsoft Digital program management director based in Hyderabad. That also might mean using Microsoft Azure Monitor, which is now in preview, or open-source code on GitHub.

For the team, making a big difference despite their few members has been a great source of satisfaction.

“I’m really proud of my team members,” Rajput says. “One of the strengths of Microsoft is it can quickly build teams that can solve big problems like this. I don’t feel like I’m just doing a job. What motivates me is that we’re having a positive impact on our customers.”
