Kartik Rao Polepalli, Microsoft Power Platform Blog

Advanced | Flow of the Week: Tracking changes in a deployment
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/tracking-changes-in-a-deployment/
Wed, 13 Sep 2017

In a previous blog, we saw how the Microsoft Flow team tracks Flow portal and backend deployments as they get deployed across regions. While knowing which builds are deployed to which region helps during a live site investigation, an equal if not more important piece of information is knowing what changes were deployed as part of a build. Knowing what changes comprise a build and where it is deployed helps assess the impact of a live site incident and take corrective actions accordingly. However, knowing which changes are being deployed also helps in the following ways:

  • Generate e-mail alerts when the current candidate build is deployed to the first release region. Engineers and Product Managers can check whether their bug fixes or features are making it to production and, if so, validate them end to end on production. This also serves as a cue for the test team to switch to testing the service on production, focusing especially on the new features or bug fixes being rolled out.
  • Generate e-mail alerts when the deployment to all regions completes. This is especially useful for Product Managers, who can publish blogs announcing new features. They can also work with partner teams to plan next steps for a big feature and update customers about blocker bug fixes and/or new features the customer was waiting for.
  • Track which particular change was deployed to which regions and when. This helps the engineering team ensure that a backend change on which a front-end feature depends has been deployed to all regions before enabling the corresponding front-end feature in production.

We will look at a Flow that enables each of the above scenarios and helps improve decision making across all roles on the team.

 

Solution

We used two SharePoint lists in the previous blog to determine when a deployment finished and what changes were deployed as part of the deployment. The lists were:

  • Current deployments snapshot: To record the branch and build that was deployed to each role in each region.
  • Deployments history: To record the history of branch and builds deployed to each role in each region.

We trigger our Flow when an item in the Current Deployments snapshot list is updated.

Once our flow is triggered, we need to check if the deployment was to a first release region or not. We do this by using an advanced condition to evaluate all possible combinations for a first release deployment.
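In the Flow designer, a condition like this can be written in advanced mode as a Workflow Definition Language expression. The sketch below is only an illustration: the region value and the `Region`/`Role` column names are hypothetical placeholders, not the team's actual list schema:

```
@or(
  and(equals(triggerBody()?['Region'], 'West US'),
      equals(triggerBody()?['Role'], 'Portal')),
  and(equals(triggerBody()?['Region'], 'West US'),
      equals(triggerBody()?['Role'], 'Backend')))
```

Each `and(...)` clause covers one role/region combination that counts as a first release deployment; adding more combinations just means adding more clauses to the `or(...)`.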

If this deployment is to a first release region, we send a mail with the log of changes contained in this build – we will look at how we get the log of changes later in the blog. However, if the deployment is to some other region, we check if this is the last deployment for this build. We do this by checking if the Current deployment snapshot list has any items that do not have the candidate build deployed on the same role across regions.
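One way to express that check is a SharePoint "Get items" action with an ODATA filter that looks for rows still on a different build, followed by a length check on the result. A sketch, assuming hypothetical `Role` and `Build` column names and a placeholder build number:

```
Filter Query:  Role eq 'Backend' and Build ne '20170913.1'

Condition:     @equals(length(body('Get_items')?['value']), 0)
```

If the filtered list is empty, every region is on the candidate build and the deployment has completed everywhere.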

If there are no items in the Current deployment snapshot list with a different build, it means that deployment to all regions has completed. At this point, we get the list of all regions to which the current candidate build was deployed and create a table with columns for the region name and the timestamp when deployment in that region completed. We then send the team a mail containing this table along with the change-log table.

 

Getting change logs in a build

Microsoft Flow’s product code is maintained in Visual Studio Team Services (VSTS, formerly Visual Studio Online). Microsoft Flow also has a first-class connector for VSTS. However, due to some limitations of the current connector implementation, we used a Custom Connector to connect to VSTS and get the required change-log data. The partner team is working hard on bringing the capabilities of the Custom Connector into the first-class connector. In the meantime, if you want to build similar capabilities into your Flow, it is simple to build a Custom Connector in Microsoft Flow, and the VSTS REST API makes it easy to get one up and running quickly.

Depending on whether we are trying to get the change log for the frontend or backend service, we compute the repository id using a simple IF expression. Once we know the repository id – and since we already know the branch name from the flow trigger – we can get the log of recent changes in the repository. The commit message that identifies a candidate build follows a particular pattern, so we filter the logs to find the corresponding commit. The output of the filter operation is an array containing a single matching commit log; we extract this object for use in subsequent actions by using a Compose action followed by a ParseJson action.
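As a sketch, the repository-id expression, the commit-log request, and the Filter array condition could look like the following. The repository GUIDs and the commit-message pattern are placeholders, not the team's actual values; the commits endpoint shown is the VSTS Git REST API:

```
Repository id:
  if(equals(triggerBody()?['Role'], 'Portal'),
     '<portal-repo-guid>', '<backend-repo-guid>')

Commit log request:
  GET https://{account}.visualstudio.com/DefaultCollection/_apis/git/repositories/{repositoryId}/commits?branch={branchName}&api-version=1.0

Filter array condition:
  @startswith(item()?['comment'], 'Candidate build')
```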

In order to find which commit log to read up to, we first need to find the previous deployment build information. We do this by searching the Deployments history list to find the deployment that preceded the just completed deployment. We list the top 2 items in the Deployments history list and filter out the item corresponding to the deployment that just completed. We then use the Compose and ParseJson actions to get the deployment item for the previous deployment, again, for use in subsequent actions.
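The "top 2 then filter" step can be sketched as a pair of action settings; the `Created` and `Build` column names here are assumptions about the list schema:

```
Get items:    Order By = Created desc, Top Count = 2

Filter array: @not(equals(item()?['Build'], triggerBody()?['Build']))
```

Of the two most recent history items, the one whose build differs from the build that just deployed is the previous deployment.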

We use the branch name from the previous deployment item to get the commit log details for the previous build that was deployed to the same role. Now that we have the previous and current commit log details, we get the list of change logs contained in the build and filter out some auto-generated commit logs to arrive at the list of changes contained in this build.

 

Pushing change log to Power BI

We use the change log not only to send the e-mail notifications mentioned earlier, but also to push the data into Power BI. We created a Streaming Dataset in Power BI with a schema specific to tracking the change-log information.

In the flow, once we get the change log information, we iterate over each change log record and push it into Power BI.
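The flow itself uses the Power BI connector for this, but the rows payload it produces has the same shape as the body of the Power BI streaming-dataset REST endpoint (POST …/rows). A minimal Python sketch of that payload, where the column names (`Build`, `CommitId`, `Comment`, `Region`) are hypothetical and stand in for whatever schema the streaming dataset defines:

```python
import json

def build_rows_payload(change_logs):
    """Shape change-log records into the {"rows": [...]} payload
    accepted by a Power BI streaming dataset."""
    return {
        "rows": [
            {
                "Build": log["build"],
                "CommitId": log["commitId"],
                "Comment": log["comment"],
                "Region": log["region"],
            }
            for log in change_logs
        ]
    }

# Two illustrative change-log records for one candidate build.
logs = [
    {"build": "1.0.1234", "commitId": "a1b2c3",
     "comment": "Fix retry policy", "region": "West US"},
    {"build": "1.0.1234", "commitId": "d4e5f6",
     "comment": "Add telemetry", "region": "West US"},
]

payload = json.dumps(build_rows_payload(logs))
print(payload)
```

Iterating over the records in an "Apply to each" block and adding one row per record, as the flow does, produces the same result as batching them into a single `rows` array.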

 

Outputs

Now that we have looked at how the flow works, let’s look at its different outputs. The first mail sent out notifies the team of a deployment to the first release region.

The next mail that gets generated is when deployment to all regions completes – this mail lists the regions where deployment finished along with deployment completion time and the changes that were deployed.

Finally, we can use the Power BI report to track which change was deployed to which regions.

 

Summary

Using Microsoft Flow, we were able to develop an end-to-end deployment e-mail notification system and generate a Power BI-based change-tracking report without writing a single line of code.

Flow of the Week: Tracking Deployments
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/tracking-deployments/
Thu, 29 Jun 2017

Microsoft Flow engineering team member Kartik shares a Flow that the team itself uses to make its work more efficient.

Hello Flow Community!

Today we’re bringing you a post from one of our internal engineers. This post is about a Flow that we, the team, use in our own environment and work day.

The Microsoft Flow portal and the backend service are deployed to multiple Azure regions. New features and bug fixes are deployed by the team at a regular cadence. The deployment follows a safe deployment sequence – an approach in which deployment proceeds from the regions with the least usage to those with the highest usage. During a deployment, the team may get an incident through automated runners or through a customer report. At that point, the team must investigate the incident and make several decisions. This involves:

  • Investigating the impact of the incident
  • Determining the cause of the incident
  • Determining which regions are impacted by the incident
  • Deciding whether to halt the current deployment sequence and roll back to previous stable bits
  • Deciding whether to roll out a hotfix

The time required to investigate and take corrective actions can be significantly reduced if we have all relevant information easily accessible.

 

Solution

We solve part of the problem by tracking current and historical information about deployments through Microsoft Flow. We do this by maintaining the current deployment snapshot and the log of previous deployments in two SharePoint lists. In this blog, we will walk through the flow that helps capture these data points for reference during investigations.


 

Getting the version of current deployed bits

Microsoft Flow portal and the backend service log information that helps monitor the health of the service. These logs move through a data pipeline and finally land in Kusto, an internal data warehouse. We have a custom connector for Kusto which allows us to query data in Kusto through a flow.
 
We can get the version of the currently deployed bits for both the portal and the backend service by querying Kusto. Further, since a deployment can complete at any time, we run the query every 5 minutes. Even though the flow runs at a 5-minute frequency, we look at the last hour of data to get the list of recent deployments. This accounts for latency in copying logs from the deployed service to Kusto, clock skew, and transient failures in downstream systems.
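A Kusto query of this shape could surface the recent deployments. The table and column names below are hypothetical stand-ins for the team's actual telemetry schema; only the `ago(1h)` look-back window comes from the description above:

```
DeploymentEvents
| where Timestamp > ago(1h)
| summarize DeployedAt = max(Timestamp) by Environment, Role, Region, BuildVersion
```

The `summarize ... by` collapses the raw events into one row per environment, role, and region, which maps directly onto the items the flow writes to SharePoint.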

Processing the list of recent deployments

Once we get the list of recent deployments from Kusto, we need to iterate over each deployment record to process it – so we add an “Apply to each” block. Since the flow runs at a frequency of 5 minutes but looks back at 1 hour of data, it is possible that some of the deployment information has already been processed. To handle this case, we check whether a corresponding item has already been created in the “Deployments history” list for the record. Since we know the exact record we are looking for, we can construct an ODATA filter query to get an exact match of this record – if it is in the list.
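Such an exact-match filter in the "Get items" action could look like the following sketch; the column names and values are hypothetical placeholders for the list's actual schema:

```
Environment eq 'Prod' and Role eq 'Backend' and Region eq 'West US' and Build eq '20170629.2'
```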
 
We determine whether a corresponding item is in the Deployment history list by checking the length of the item list returned by the above action. If the length is 0 – which means no corresponding item was found in the Deployment History list – we then try to find the corresponding item in the Current Snapshot list. This is again done by using an ODATA filter to find an exact match. Since the Current Snapshot list was initially created to have one item per Environment, Role, and Region, we will always find one unique record.

 

Create new deployment history item and update current snapshot

Since the filter on the Current Deployment snapshot list returns a list of matches – even though it will always contain a single item – we need to extract the first object from the list. We do this by using a Compose action followed by a ParseJson action. The output of the ParseJson action can then be used in subsequent actions.
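The same extraction can also be sketched as a single expression, assuming the preceding action is named `Get_items`; the Compose-plus-ParseJson pair in the flow achieves the same result while keeping the designer experience no-code:

```
first(body('Get_items')?['value'])
```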
 
Finally, we create a new item in the Deployment History list and update the corresponding item in the Current Deployment snapshot list.
 

 

Summary

Using Microsoft Flow, it was easy to build a no-code solution that tracks both deployment history and the current state of deployments across regions.
