Advanced Archives - Microsoft Power Platform Blog | Innovate with Business Apps

Advanced | Flow of the Week: Triage and Team Assignment with Microsoft Flow
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/advanced-flow-of-the-week-triage-and-team-assignment-with-microsoft-flow/ (published Wed, 29 May 2019)
Triage and Team Assignment Flow

What's up, Flow Friends!

This post is written by George Culler and Brian Osgood from the Azure team. They contacted me to show off a set of Flows they had made for their team, and after seeing it all in action, I asked them to share the post with all of you, as I know we can all use this same concept in many ways!

Anyway, onward and enjoy!

Overview:

Hi, Flow Community!

We are excited to share a customized Triaging and Team Assignment Flow which will help speed up your team's processing times and ensure more accurate work.

We built this Flow out of a need to quickly triage work to the right teammate and check for errors in tickets coming from a much larger customer-facing team. Below we will show you the steps taken to run a Flow through five different checks, as well as send TFS tickets to the right team member for processing. This Flow allows your organization to cut down on churn between teams, provide more accurate information, and improve the overall customer experience. It is assumed that you have an understanding of how Flow works, so we are going to dive right into creation.

Prereqs:

  • Basic Expression Function Knowledge
  • Advanced Flow Knowledge
  • SharePoint list creation
  • Intermediate TFS / DevOps Knowledge

Steps:

Create a new Flow with a DevOps “When Work Item is Created” trigger

Fill in your required fields and select your work item type. For our Flow, we are using our in-house “Credit Issue” work item. (this helps the Flow pull the correct data fields if you have multiple work items)

We are going to be checking for five “pain points” on each work item. These five points are:

  1. Date issue reported; we have a requirement that issues be reported within a certain number of days of the service start.
  2. Is a file attached? We require that a calculation file be attached to some issues so we will need to check the issue category and then check a field for its value.
  3. Is this issue a duplicate category for the same customer? We have a couple of one-time-use categories that we need to check aren't given multiple times.
  4. Is the dollar amount below a certain threshold? If so, it should be marked by the creator as Pre-Approved.
  5. Auto calculate the dollar amount based on the exchange rate.

We will be running our checks in parallel so let’s add some parallel conditions and get this Flow started!

Date Reported:

Starting with a condition action, we will use the dynamic content to pull in the “Date Issue Reported” field. Our conditional statement will check if the field is blank. We used an expression for this with the “empty” function.

empty(triggerBody()?['fields']?['Ops_DateIssuewasReported'])

This function returns true if the field is empty. If it's empty, we tag the ticket as "BOT: NODATE". If the field is populated, we perform some date calculations to check whether the reported time is within 120 days of the Date Issue Started. Using the "Date Time" connector's "Add to Time" action, we can add 120 days to the Date Issue Began field.


We now have two dynamic date values we can use. To compare them, we will use another expression function, "ticks", which we can use in a condition action to compare the two dates and check whether the issue was reported within 120 days of the date it started.

ticks(triggerBody()?['fields']?['Ops_DateIssuewasReported'])

ticks(body('120_Day_Addition'))

If the condition results in a “Yes” then we will tag the ticket as “BOT: 120Days” and leave a message in the Bot Box (this is a field we have added to our TFS template that automatically displays a message when a ticket is created incorrectly). If the condition results in a “No” we do nothing as the ticket is within our criteria.
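Putting this together, the condition in advanced mode might look like the following sketch. Note the comparison direction is our assumption here, chosen so that "Yes" means the report date falls outside the 120-day window described above:

@greater(ticks(triggerBody()?['fields']?['Ops_DateIssuewasReported']), ticks(body('120_Day_Addition')))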

Calc. file check:

Currently, our check for the file depends on whether the creator has marked "No" in a dropdown. If so, we tag the ticket "BOT: CF" and then update our custom "Bot Box" with a message.




Empowerment Check:

For this branch, we will use the SharePoint connector to pull from a custom list. We add the SharePoint "Get Items" action, fill in our site address, and select our list name. This returns an array that we need to iterate through using an "Apply to Each" action on the "value" dynamic variable from "Get Items".


We then run a condition on the ticket item, checking whether it meets our criteria: whether it's marked correctly and whether the dollar amount in USD is below the threshold.

If the condition returns a “Yes” we tag the ticket with “BOT: EMP” and leave a message in the Bot Box. We played with the idea of re-assigning the ticket to the creator but that would cause too many headaches for the small amount of time we might save.

Duplicate scenarios:

Since we have three one-time-use categories, we need to make sure we are not granting them to a customer more than once. However, we have thousands of customers, so we cannot just go look over every customer's requests and find certain categories every time. We built a custom query in TFS to pull all tickets where we approved a refund in those categories within the last two months, and then used the DevOps connector's "Get Query Results" action to find our custom query and pull from it.

This returns an array, so we need to use the "Apply to each" action again. Inside it, we need to get the details of each ticket; for this we use the DevOps "Get Work Item Details" action. Once we have the details, we run a condition that checks whether the customer number on the new ticket equals the customer number on an old ticket, and whether the category is the same for that ticket.

If the conditions are met, we update the ticket, and you get one guess on what we do to the ticket… did you say, “Tag it”? Congrats, we tag it!
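For illustration, that duplicate check could look something like the sketch below in advanced mode. The field names here are purely hypothetical placeholders for whatever your TFS template uses:

@and(
  equals(triggerBody()?['fields']?['Ops_CustomerNumber'], body('Get_work_item_details')?['fields']?['Ops_CustomerNumber']),
  equals(triggerBody()?['fields']?['Ops_Category'], body('Get_work_item_details')?['fields']?['Ops_Category'])
)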

Currency Calculation:

This one was tricky to do and involves A LOT of string parsing and variable initialization. Here is the branch overview.

We won't go through all the individual variables, but I will go over the FX Rate (Exchange Rate) and how we pull it from a list and parse the value out. Inside of DevOps, we have a tab that is manually updated by the finance group. It is a long list of semicolon-separated values holding that month's specific exchange rate for each currency. We will use the "split" expression function to split on a delimiter. The format of the list is:

USD-1; EUR-0.9;

So we need to split on the semicolon and then the dash. This is the expression we used.

float(split(split(split(variables('FX Rates For Current Month'),variables('currency value'))[1],';')[0],'-')[1])

It returns a float that corresponds to the currency used in the ticket and the month the ticket was created, which we then use for a local-currency-to-USD calculation.
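To make the nested splits easier to follow, here is a worked example, assuming the currency value is 'EUR' and the month's list is the sample shown above ('USD-1; EUR-0.9;'):

split('USD-1; EUR-0.9;', 'EUR')[1]  returns  '-0.9;'
split('-0.9;', ';')[0]              returns  '-0.9'
split('-0.9', '-')[1]               returns  '0.9'
float('0.9')                        returns  0.9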

Once all the variables are created, we can update the TFS work item with our tag, "BOT: FX". With that, all our branches are complete; if there is an issue with any of these checks, the ticket is tagged so the team member is made aware and can investigate.

So, let’s go to the final step.

Pre-Triage Tag:

Once all checks have been run, if the ticket passes all previous criteria, it gets tagged as BOT: Triage, signaling the system to send the ticket to its sister Flow, where we triage tickets out to individual team members. First, we must determine that this is a new case and is not coming from certain teams that are already approved. To do this we use a condition action, set so the ticket only proceeds when its state equals a new case. Further, we exclude certain other teams from this action using the "or" function.
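As a sketch, that condition in advanced mode could look like the following. The field names and team names here are illustrative assumptions, not the exact values we used:

@and(
  equals(triggerBody()?['fields']?['System_State'], 'New'),
  not(or(
    equals(triggerBody()?['fields']?['System_AreaPath'], 'Team A'),
    equals(triggerBody()?['fields']?['System_AreaPath'], 'Team B')
  ))
)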

When the parameters for the condition have been set, we use the "Update a work item" action in Azure DevOps for both the yes and no condition branches. Both, once again, only change the tags on the ticket, which will later trigger the team triage process.

If the conditions are not met, the Flow follows the no path and tags the ticket Bot: NA. This signals to the system that the appropriate conditions were not met and not to move forward with triaging to team members.

If the correct conditions are met, it marks the issue Bot: Triage, which gives the system the go-ahead to send the ticket to the triage Flow and distribute it to the appropriate team member.

This is the end of our initial Flow. To recap: a ticket has been checked for our five initial problems and tagged accordingly to alert the team member of any problem. Further, if a TFS issue has met all the necessary criteria, it is marked "Bot: Triage", signaling to the system that it is ready to be sent to a team member.

Queue Triage:

This will be a separate Flow in which we take what was done in the previous Flow and distribute TFS tickets to individual team members. As a brief overview, this is what the formatting will look like when done correctly. Below we will dig into what all of this is doing and walk through the process.

First, we need to set up our trigger. We are using a scheduled recurrence to run this Flow: every five minutes it runs and looks for updated tickets with the necessary tag from our previous section. Once it sees a TFS ticket with the tag, it moves on with the Flow.

We then built individual queries within TFS for each member of the team and the categories they own. A new action within Azure DevOps called "Get Query Results" allows us to check these queries quickly and stay flexible as team members come and go.

When you have built your query, use the "Apply to each" action and select "value" as the output from the query. This gives you access to all the data fields in the TFS query we built.

After we have identified the correct query, we add a separate condition that ensures the issue was tagged by the pre-check Flow discussed earlier.

Once the update condition has been met, we once again use the "Update a work item" action in Azure DevOps. This finally sends the TFS ticket to the correct team member, already checked for any of our known issues before being received.


Conclusion:

Overall, Flow has helped us cut our own triage processing time from over 6 hours to 1-2 minutes and given us the flexibility to focus on core business projects.

We know this post will help you think differently about more creative ways Flow can be used to reduce manual work, speed up your business, and solve redundant problems you run into on a daily basis.

If you have any suggestions or questions, feel free to comment on this post.

We’d love to hear your feedback!

Brian and George
Advanced | Flow of the Week: Build a Custom Connector for Microsoft Flow & Search Unified Audit Logs
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/custom-connector-microsoft-flow-search-unified-audit-log/ (published Fri, 03 May 2019)

In this post, Joao Lucindo, a Microsoft TSP from Brazil, shows you how to develop a Microsoft Flow custom connector to get audit logs from Office 365. This solution is based on the Search-UnifiedAuditLog cmdlet.

Step-by-Step

1) Access Microsoft Flow, log in and select the gear icon in the top right-hand corner and then click “Custom Connectors”.


2) Click “+Create custom Connector” and choose “Create from blank”


3) Choose a name for your custom connector

4) Fill in the fields as in the screenshot below, and then click "Security":

Icon background color: choose a color in hex format

Description: give a description for your custom connector

Host: outlook.office365.com

5) Choose “Basic authentication” for the Authentication Type, and fill the Parameter label field with “UserName” and “Password” like the print screen below (Do NOT enter secrets here. These fields are used to configure display names for connections). Finally click “Definition”

6) In the “Definition” step, click “New Action”

7) Fill all the fields with "GetLogs", like the image below, and then select "+ Import from sample"

8) Choose the verb "Get". In the URL field, paste: <https://outlook.office365.com/psws/service.svc/UnifiedAuditLog?StartDate={STARTDATE}&EndDate={ENDDATE}&RecordType={RECORDTYPE}&ResultSize={RESULTSIZE}>. Finally, click "Import"

9) In the query parameter “RecordType” click “Edit”

10) Change the "Is required" option to "Yes" and the "Dropdown type" to "Static", and then paste <AzureActiveDirectory, AzureActiveDirectoryAccountLogon, AzureActiveDirectoryStsLogon, ComplianceDLPExchange, ComplianceDLPSharePoint, Discovery, ExchangeAdmin, ExchangeAggregatedOperation, ExchangeItem, ExchangeItemGroup, MicrosoftTeams, MicrosoftTeamsAddOns, MicrosoftTeamsSettingsOperation, OneDrive, PowerBIAudit, SecurityComplianceAlerts, SecurityComplianceCenterEOPCmdlet, SecurityComplianceInsights, SharePoint, SharePointFileOperation, SharePointSharingOperation, SkypeForBusinessCmdlets, SkypeForBusinessPSTNUsage, SkypeForBusinessUsersBlocked, Sway, ThreatIntelligence, Yammer, MicrosoftStream> into the "Values" field

11) Repeat the same for the other query parameters (StartDate, EndDate, ResultSize), but this time only change the "Is required" field to "Yes"

12) Select "Create connector", wait a few seconds for the connector creation process to complete, and then click "Test"

13) Select “New connection”

14) Type the email and password for the Global admin account, and then select “Create connection”

15) If necessary, click the refresh icon to activate the new connection we just created. Fill in the "StartDate" and "EndDate" fields (YYYY-MM-DD)*. For the "RecordType" field, choose one of the options from step 10. For the "ResultSize" field, fill in 5000. Finally, click "Test operation"

*An audit record is generated and stored in the Office 365 audit log for your organization. The length of time that an audit record is retained (and searchable in the audit log) depends on your Office 365 subscription, and specifically on the type of license assigned to a specific user.

  • Office 365 E3 – Audit records are retained for 90 days. That means you can search the audit log for activities that were performed within the last 90 days.
  • Office 365 E5 – Audit records are retained for 365 days (one year). That means you can search the audit log for activities that were performed within the last year. Retaining audit records for one year is also available for users that are assigned an E3/Exchange Online Plan 1 license and have an Office 365 Advanced Compliance add-on license.

(https://docs.microsoft.com/en-us/office365/securitycompliance/search-the-audit-log-in-security-and-compliance)

16) You should receive a Response Status of 200; if not, please review all the steps.
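For reference, once the parameters are filled in, the request from step 8 resolves to a URL like the following (the dates and record type here are purely illustrative values):

https://outlook.office365.com/psws/service.svc/UnifiedAuditLog?StartDate=2019-04-01&EndDate=2019-04-30&RecordType=SharePoint&ResultSize=5000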

Now you can build a schedule-based Flow to save the logs to a SharePoint list or SQL table, for example:

For the Parse JSON action, you can use the following schema:

{
  "type": "object",
  "properties": {
    "odata.metadata": {
      "type": "string"
    },
    "value": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "Identity": { "type": "string" },
          "AuditData": { "type": "string" },
          "CreationDate": { "type": "string" },
          "IsValid": { "type": "boolean" },
          "ObjectState": { "type": "string" },
          "Operations": { "type": "string" },
          "RecordType": { "type": "string" },
          "ResultCount": { "type": "number" },
          "ResultIndex": { "type": "number" },
          "UserIds": { "type": "string" },
          "ObjectIds": {},
          "IPAddresses": {},
          "SiteIds": {}
        },
        "required": [
          "Identity",
          "AuditData",
          "CreationDate",
          "IsValid",
          "ObjectState",
          "Operations",
          "RecordType",
          "ResultCount",
          "ResultIndex",
          "UserIds",
          "ObjectIds",
          "IPAddresses"
        ]
      }
    }
  }
}

Advanced | Flow of The Week: Role Based Security in PowerApps using Flow & SharePoint Groups
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/advanced-flow-of-the-week-role-based-security-in-powerapps-using-flow-sharepoint-groups/ (published Wed, 30 Jan 2019)
What’s up Flow Fans!?

Our FOTW this week comes from Geetha Sivasailam.

Geetha is a Collaboration & Custom App Dev consultant with Artis Consulting (http://www.artisconsulting.com/), a Microsoft Partner, delivering business solutions leveraging O365, Microsoft Business Applications, custom app dev implementations, and various emerging technologies. She is a Power Platform enthusiast and is passionate about enabling others to expand their possibilities and act more effectively with emerging tools, trends and technologies. Follow her on Twitter (https://twitter.com/GSiVed) or on her blog.

With no further ado, here is her post!

Enabling role-based security in PowerApps controlled by SharePoint security groups has been a common customer ask. For example, can you make an Admin screen that is visible only to users who belong to a specific SharePoint security group? Yes, you can, and this is where Microsoft Flow comes to the rescue!
This blog post shares an approach for finding out the SharePoint group membership of a signed-in user and making certain features or screens of an app available to them.
Prerequisites
Create a SharePoint security group with the members you want to use for role-based security in your PowerApps app, and navigate to the group settings.


And enable “Everyone” access to view the members of this group

We will be creating a Flow to check group membership using a SharePoint HTTP REST call. Users triggering the Flow from your app may not necessarily have admin privileges to read group membership details. To enable all users to read group membership it is necessary to allow “Everyone” to view members of the security group.
Steps

  • Create a blank flow using a Site Collection Admin/Flow owner account.
  • Add a PowerApps Trigger step so you can call this Flow from the app. Then add an Initialize variable step to store a boolean value (IsAdministrator). Next add another Initialize variable step with a string variable (UserGroupInfo) to store the user group information we will be retrieving in the next step.

  • Search for ‘Send an HTTP request to SharePoint’ action under SharePoint actions and add it. Now let’s configure this action to make a SharePoint REST call to determine user group membership.

Site Address: Select the site collection where your SharePoint Security group exists
Method: Get
Uri: _api/web/sitegroups/getByName('SP Group Name')/Users?$filter=Email eq ''
Replace 'SP Group Name' with your group name. Place the cursor between the single quotes after $filter=Email eq and select 'Ask in PowerApps' under dynamic content. This will auto-generate a variable name that will be used as an input parameter for this Flow. The goal here is to pass the logged-in user's email address as a parameter from PowerApps to Flow.
Below is how the action looks after configuration. ‘RequestAdmins’ is the SharePoint User Group I used for this example.
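At run time, once the parameter is filled in, the Uri resolves to something like the following (the email address is illustrative):

_api/web/sitegroups/getByName('RequestAdmins')/Users?$filter=Email eq 'jdoe@contoso.com'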

This REST call will return an empty object if the user is not a member of the group.

It will return an object with User properties in the following format if the user is a member of the group.
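Abbreviated, the two response shapes look roughly like this (the user properties shown are illustrative; the part the next step relies on is the d.results array):

Not a member:  { "d": { "results": [] } }
A member:      { "d": { "results": [ { "Id": 12, "Title": "Jane Doe", "Email": "jdoe@contoso.com" } ] } }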

  • Now that we have an output from the REST call above, we can parse it to extract the results section. Add a 'Set variable' action to set the variable 'UserGroupInfo' created earlier, with the value set to the expression body('CheckUserGroup')['d']['results']


Here 'CheckUserGroup' is the name of the previous action. If your action name has spaces, replace the spaces with the underscore (_) character. At this stage, we have extracted the results, which we can use to determine if the user is a member of the group.

  • Add a ‘Condition’ step to evaluate the results value. If the results object is empty then the user is not a member.

Use the expression @not(equals(variables('UserGroupInfo'), '[]')) to evaluate the object.
'UserGroupInfo' is the variable used to store the object value, and '[]' compares it to an empty object.
Set the variable ‘IsAdministrator’ that was initialized earlier to true if the condition evaluated to be true. If not, set it to be false.

  • One final step in the flow would be to pass the results of our user group membership check back to PowerApps as an output parameter that can be used to enable certain features of the app for the logged in user. Add ‘Respond to PowerApps’ action and choose a text output. An output of type boolean would have been ideal but is not available at this time and we’ll stick to a text output. Provide a name for the text output parameter (I used ‘IsAdminUser’) and set the value to variable ‘IsAdministrator’.


Here's how the entire Flow looks; now it's time to save it and see it in action.

  • Now that we have Flow ready, let’s implement and test it on the app. Navigate to your PowerApps app, create a new blank screen, click on Properties dropdown and select the ‘OnVisible’ event of the screen. Next click on the ‘Action’ tab and select ‘Flows’. Select the Flow you created to add it to the formula bar to associate the Flow to the ‘OnVisible’ event of the screen. Type the below functions on the ‘OnVisible’ formula bar.


First we are creating a variable (isAdmin) to store the user group membership status in Boolean format and setting the default value to be false. Next we are triggering the Flow ( ‘CheckUserPermission’ is the name of the Flow I created) and passing in an encoded format of the current logged in user’s email as the input parameter.
The output returned from Flow is being stored in a variable ‘UserGroupInfo’. Lastly, we are validating the value of the output parameter ‘isadminuser’ we configured earlier and setting the ‘isAdmin’ variable with Boolean equivalent.
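As a sketch, the complete OnVisible formula might look like this, assuming the Flow is named CheckUserPermission and the text output is named isadminuser as configured above:

Set(isAdmin, false);
Set(UserGroupInfo, CheckUserPermission.Run(EncodeUrl(User().Email)));
Set(isAdmin, Lower(UserGroupInfo.isadminuser) = "true")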

  • Drop a button on the screen and set the 'Visible' property of the button to the variable 'isAdmin'. This will show/hide the button based on the value of the variable. You can set the 'OnSelect' event of the button to navigate to an Admin Only screen.

And we are done! Publish the app and run it. If all goes well, you should see that the Flow ran successfully.
On the app side, the Flow is triggered (when the screen becomes visible) with the current user's encoded email address as the input parameter, which in turn makes a SharePoint REST call to determine the user's membership. It returns a text output of "True" or "False". The app then validates the returned value and shows/hides the button based on the user's group membership.
Here's how my sample app looks if the current user is an admin:

If the current user is not an admin then the admin button is hidden.

Resources
Here are some resources I found to be very useful to help design this solution:

  • Doctor Flow's post on how you can use SharePoint REST APIs in Flow
  • For those interested in the SharePoint REST/OData API, you can find the complete set of REST/OData APIs here.

The Microsoft Flow Online Conference is Tomorrow 12/12
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/the-microsoft-flow-online-conference-is-tomorrow-12-12/ (published Tue, 11 Dec 2018)
Hello Microsoft Flow Fans!

We are SO Excited to have you join us tomorrow for the ALL DAY, Online, Free, Microsoft Flow Conference with some of the VERY BEST Speakers in the world on the topic!

Starting December 12, 2018 at 8 AM PST

To join the live webcast, please use the following link: https://aka.ms/FlowConf

Also, please join the live chat on the right side or, if you prefer, ask questions on Twitter with the hashtag #MSFlowConf

Advanced | Flow of The Week: Convert Office documents to PDF on the fly using Microsoft Flow
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/convert-pdf-with-microsoft-flow-nov-2018/ (published Wed, 14 Nov 2018)
What's up, Flow Fans!

This week we have a special FOTW from one of our MVPs, Paul Culmsee!

Paul Culmsee is a management and technology consultant, Microsoft MVP, and award-winning author from Perth, Western Australia. He co-founded Seven Sigma Business Solutions and specialises in sensemaking, helping organisations (re)discover their purpose, knowledge management, strategic planning, IT governance, facilitation, and all facets of SharePoint and Office 365 delivery.

 

Sympathy for the on-site technician

Hi everyone.

Imagine a company where service technicians perform on-site repair of equipment. Inevitably, in doing this sort of work, the technician will need to refer to equipment drawings, service history, past photos, specifications and/or operating manuals.

These days PowerApps is fast becoming a great option for such a scenario because many field workers prefer to use their phone or a tablet. I have made many apps like this, and PowerApps is a great solution for this use case. But PowerApps also has some limitations, and right now one of those is around the display of documents from SharePoint. For a start, it is impossible to display Office documents natively in PowerApps at this time, and there are authentication-related issues in certain circumstances when pulling content from SharePoint.

But fear not… with a 6-step flow, it is possible to solve this problem. This flow allows a remote user to securely request a document from SharePoint, but importantly, converts that document to a PDF on the fly.

There are two big benefits from this:

  1. A reduction in time and effort for document controllers. If a document frequently changes, it is most likely in Word, Excel or PowerPoint format, and they no longer have to worry about converting it to PDF.
  2. It allows the document to be viewed natively in PowerApps (and as a result, on top of some Flow kung-fu, we will learn some PowerApps tricks in this article too :-).

Now, in the (admittedly large) article to follow, I go into detail on how to set this up, but if you prefer to see this in video form, we also have you covered…

Step 1: Setting up a SharePoint library

Since this is a field-worker scenario, let’s make a SharePoint library and be good digital citizens by throwing in an extra couple of columns so we can tag documents by their type. I’ll keep it deliberately simple for the purpose of this post, but you can extend this type of library setup in whatever manner suits you.

I created a new SharePoint site, and added the following columns to the default document library:

  • Equipment – Single Line of text
  • DocumentType – Single Line of text

 

Now upload some documents and tag them by type and by equipment. In the diagram below, take note that I’ve also thrown in a folder with some documents inside. While I won’t win any information architecture awards for doing this, it is very much the reality for many organisations and it demonstrates that the flow we will build accommodates complex folder structures also.

 

Oh… also keep the documents relatively simple. I can’t guarantee this auto-pdf goodness works with crazy large documents or those with embedded bits and pieces.

Building the Flow

The flow we are going to build is going to be triggered from PowerApps. PowerApps will pass in the ID and folder path of a particular file, and the flow will do a small bit of data cleansing before using a very powerful action called Send an HTTP request to SharePoint to bring it back as a PDF.

Please note this flow is deceptively simple, requiring as few as 5 actions, but we are using some nice trickery, so take your time…

Step 1: Create a new Blank Flow, and choose the PowerApps trigger

Step 2: Add an Initialize Variable Action.

This variable will hold the ID of the file sent from PowerApps. For reasons that will become apparent later, rename this action to FileID, add a variable called ID, make it String format.

For the Value, click Ask in PowerApps from the Dynamic Content panel (click see more if this is missing)

If you do this right, you will see a FileID_Value parameter name in the textbox. Now when we invoke this flow from PowerApps, this is the name of the parameter that the user will see. Eg:

Why is this? Well, behind the scenes, PowerApps used the name of the action to generate this parameter name, hence why it made sense to rename it something simple so you don’t make the PowerApps view confusing.

Note: I tend to add comments to my flow actions to make the intention clear as shown below.

 

Step 3: Add another Initialize Variable Action.

This variable will hold the folder path of the file and will also be sent from PowerApps. Rename this action to FolderPath and add a variable of the same name. This time make it a string and, once again, use "Ask in PowerApps" to set a parameter called "FolderPath_Value" as shown below…

Step 3a. Sanity check interlude!

Please note this common trap for new players… if you clicked Ask in PowerApps more than once, then you might have some additional unwelcome parameters. I recommend being really (really) careful here to ensure you only have two parameters defined, otherwise it will make working with PowerApps a bit of a pain later. So, check that in Dynamic content, you only see the two parameters as shown below…

Step 4: Clean up the data.

A common occurrence in the world of messing with low-level stuff is that data is not always in the format we need it to be. In this case, it turns out that when PowerApps sends the folder path to Flow, it will send it in this format.

DocumentLibraryName/Folder/

Unfortunately, when we send this to SharePoint to get the PDF URL, it needs to be in a format that includes the site collection URL and the removal of the trailing slash.

/Sites/SiteName/DocumentLibraryName/Folder

Adding the site collection URL is easy, and we will do that later. But removing the trailing slash requires a Flow expression, which will look like this:

substring(<path>,0,sub(length(<path>),1))

If you are new to Flow expressions, then yes, I agree that they are ugly. But trust me, they become quite intuitive over time and can really bring your flows to life. For the record, what we are doing here is using a substring function to grab all except the final character of the FolderPath variable. To exclude the final character, I take the length of FolderPath and subtract 1.
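As a quick worked example (the folder name is assumed for illustration): if FolderPath is "Documents/test/", its length is 15, so the expression becomes substring("Documents/test/", 0, 14), which returns the first 14 characters, "Documents/test", dropping the trailing slash.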

Now we could sort this out by adding another initialize variable action to reformat the path in the way we want it. But that means an extra step and variable. So, let’s be cool and modify the previous step to do it in one hit. So, go back to your FolderPath action and delete the FolderPath_Value reference in the Value textbox.

Now click Expression in the Dynamic content panel and check out this little-known trick. Type in the word substring( and you will see a somewhat annoying pop-up like so…

Now, make sure your cursor position is in between the two brackets after the word substring! Then click the Dynamic Content tab, and you can now choose the FolderPath parameter. It will add it into your expression. Neat huh?

Now your expression will look like the following:

substring(triggerBody()['FolderPath_Value'])

 

This neat trick allows you to write flow expressions and refer to variables or parameters without manually having to type them in. Now all we need to do is fill out the rest of the expression using this reference to the FilePath parameter.

From substring(<path>,0,sub(length(<path>),1)), we replace <path> with triggerBody()['FolderPath_Value']:

 

substring(triggerBody()['FolderPath_Value'],0,sub(length(triggerBody()['FolderPath_Value']),1))

Your flow action will now look like the screenshot below… note that I completed this task by adding the above expression to the comment for the action to make it clear what is going on…

Step 5: Get the PDF Information from SharePoint

Now, if you want to just get down to business, feel free to skip to the bit where I show the next action to add. But if you want to know what we are going to do, read on!

This Flow leverages a little-known capability of SharePoint that, among other things, allows us to generate image thumbnails and PDFs of documents. This capability is an API with the nerdy name of RenderListDataAsStream. In a nutshell, you can pass it a reference to a document and it will dutifully spit out the URL to a PDF version.

To do this, we need to pass 4 things to the API.

  1. The document library where we want to get the file from
  2. The folder where this file resides in the library
  3. The SharePoint ID of the file we want to PDF
  4. A special code that tells the API to bring back a URL of the generated PDF

For reference, a sample API call for a file with an ID of 14 in the default SharePoint document library, in a folder called "test", would look like this.

https://culmsee.sharepoint.com/sites/flowoftheweek/_api/web/lists/GetbyTitle('Documents')/RenderListDataAsStream?FilterField1=ID&FilterValue1=14

Don’t try the above link in the browser, as it is a POST request. Additionally, we need to send some information in the request body too, namely the folder where the file resides and the code to get the PDF URL. The body looks like this:

{
    "parameters": {
        "RenderOptions": 4103,
        "FolderServerRelativeUrl": "/Sites/FlowOfTheWeek/Documents/test"
    }
}

Ok, so what is the deal with the RenderOptions number above? Well, this API does lots more than just generate a PDF, and that parameter allows you to specify what information you want back. The documentation includes a table of different interesting things you can return, which you can combine by adding the values together.

Label             | Description                                                                                              | Value
ContextInfo       | Return list context information                                                                          | 1
ListData          | Return list data                                                                                         | 2
ListSchema        | Return list schema                                                                                       | 4
EnableMediaTAUrls | Enables URLs pointing to Media TA service, such as .thumbnailUrl, .videoManifestUrl, .pdfConversionUrls. | 4096

So, we are asking this API not just to bring back the data associated with a list item, but also some additional useful stuff. The last entry is particularly interesting as it mentions a mysterious beast known as the Media TA service, which I assume means either "translation" or "totally awesome" :-). Basically, if we total the numbers listed in the above table (1 + 2 + 4 + 4096 = 4103), we will end up with all the data we need to do the PDF conversion.

Okay enough talk!

Add a SharePoint action called Send an HTTP request to SharePoint. Set the Site Address to the site that contains your document library and set the Method to POST. Set the URI to _api/web/lists/GetbyTitle('<docLib>')/RenderListDataAsStream?FilterField1=ID&FilterValue1=, where <docLib> is the name you specified for the document library

(for example, mine is _api/web/lists/GetbyTitle('Documents')/RenderListDataAsStream?FilterField1=ID&FilterValue1= ).

Finally, on the end of the URI, click Dynamic Content and choose the ID variable as shown below:

In the Body section, paste the following configuration (watch the quotes when pasting from this article):

{
    "parameters": {
        "RenderOptions": 4103,
        "FolderServerRelativeUrl": "/<your site collection URL>/"
    }
}

In my case the FolderServerRelativeURL was “/sites/flowoftheweek/” but if you use the root site collection, it will simply be a slash “/”.

Finally, place your cursor just after the slash in the FolderServerRelativeURL parameter and from Dynamic content, choose the FolderPath variable.

Step 6: Save and Test the Flow

At this point, click the Test icon in the top right of the screen. Choose the option I’ll perform the trigger action and click the Save & Test button. On the popup that follows, click the Continue button and on the next screen, type in the ID number of one of the documents in your library and the folder path the document resides in.

For example: the first document uploaded to the library will likely be ID 1, and if it is in the default SharePoint document library, the folder will be Shared Documents/

Click the Run Flow button. Your flow will start and you can click Done. Assuming it worked, you will see a green tick of happiness in the history.

Click on the Send an HTTP Request to SharePoint action to expand it. We need to grab the output from the API call for the next action. Find the OUTPUTS section and copy the entire contents to the clipboard….

Note: If you made an error (e.g., you asked for a file that does not exist or the document library is empty), then the subsequent steps will fail. You can quickly sense-check this by looking at the clipboard output in notepad.

If there is no file found, you will see a line “Row”: [] in the ListData section like so…

{
  "wpq": "",
  "Templates": {},
  "ListData": {
    "Row": []

But if you did retrieve a file, there will be heaps of data inside Row like so…

{
  "wpq": "",
  "Templates": {},
  "ListData": {
    "Row": [
      {
        "ID": "1",
        "PermMask": "0x7ffffffffffbffff",
        [snip a heap of stuff]
      }

Note: The output needs to have row data before moving to the next step

Step 6: Add an action to help us work with the API output

Go back to edit mode and add a Data Operations action called Parse JSON to your flow. This action will allow us to make use of the output of the API call in the subsequent flow step.

Click the Use sample payload to generate schema link, paste your clipboard contents into the window and click the Done button.

In the Content field, go to Dynamic content panel and choose Body from the Send an HTTP Request to SharePoint action.

 

Step 7: Construct the PDF URL

Now when this API is called, a lot of data is returned. The purpose of the Parse JSON action in step 6 was to process all this output and turn it into flow objects to make our life easier here. This allows us to quickly grab only certain data from the output of the previous flow step without having to parse the output ourselves via more complex expressions.

At this point, you might be thinking that one of those is a nicely formatted PDF URL all done for us. Unfortunately, this is not the case. Microsoft give you all the bits you need, but it is up to you to put it all together. As a result, we need to do some more work to create the URL by trawling through some of the data returned by the previous step.

Of all the output currently in your clipboard, the main one that interests us is this entry…

".pdfConversionUrl": "{.mediaBaseUrl}/transform/pdf?provider=spo&inputFormat={.fileType}&cs={.callerStack}&docid={.spItemUrl}&{.driveAccessToken}"

This parameter is basically a template for generating the PDF URL. All the stuff in curly braces are tokens that have to be replaced with actual values that are also returned as part of the API call. For example, if I search the clipboard content for the first token in .pdfConversionUrl, called {.mediaBaseUrl}, I find this entry…

".mediaBaseUrl": "https://australiasoutheast1-mediap.svc.ms"

Now go and look at .pdfConversionUrl again. Replace {.mediaBaseUrl} with https://australiasoutheast1-mediap.svc.ms and now we have:

".pdfConversionUrl": "https://australiasoutheast1-mediap.svc.ms/transform/pdf?provider=spo&inputFormat={.fileType}&cs={.callerStack}&docid={.spItemUrl}&{.driveAccessToken}"

Get the idea? We need to replace the remaining tokens ({.fileType}, {.callerStack}, {.spItemUrl} and {.driveAccessToken}) in the same way. Once we have done this, we have finally created our PDF URL. When we subsequently access that URL, we will receive the converted document in PDF format without needing to store the PDF anywhere. The source document can stay in its native Office format!

Phew! Now let’s get this done…

Add an Initialize Variable action to your flow, name the variable PDFURL (or something similar) and set its Type to String format.

Now we come to the most complex bit of the flow where we have to substitute the tokens we just examined. Be careful here as this is the most likely place to make an error. In the Value textbox, click the Dynamic content flyout and find .mediaBaseUrl from the Parse JSON action…

Next, add the following text to the Value textbox, taking care not to delete what you just added in the previous step.

/transform/pdf?provider=spo&inputFormat=

Now we come to a slightly tricky bit. The next bit of content we need is the file type of the document we are dealing with. The tricky part is that even though we are only asking for a single file when we call the API, it comes back as an array. Why? Well, this API allows you to process multiple files in one go, so it always returns an array in the output, even if you only requested a single file.

The offending bit of data returned by the API is shown below. Inside a ListData object, we have an array of Row objects:

  "ListData": {
    "Row": [
      {
        "ID": "1",

We need to get the file type of the document so the PDF converter knows what it is dealing with. Like the way we dealt with removing the trailing slash from the FolderPath variable in step 4, we can use another expression to handle it. Click the Expression tab and type in the following:

first(body('Parse_JSON')?['ListData']?['Row'])?['File_x0020_Type']

 

What this expression does is assume we are only handling one file at a time, grab the first element of the Row array, and then grab that item's File_x0020_Type property.

Next, add the following text to the Value textbox, taking care not to delete what you just added in the previous step.

&cs=

Also, be super careful here because at the time of writing, the cursor in this textbox can randomly move and wipe out your edits…

Now in the Value textbox, click the Dynamic content panel and find .callerStack from the Parse JSON action…

Next, add the following text to the Value textbox, taking care not to delete what you just added in the previous step.

&docid=

Now we come to another array that needs to be handled. This is the URL of the document we are dealing with. Click the Expression tab and type in the following:

first(body('Parse_JSON')?['ListData']?['Row'])?['.spItemUrl']

 

Okay we are almost done… Add an ampersand ( & ) to the Value textbox, and then click the Dynamic content panel and find .driveAccessToken from the Parse JSON action…

Whew! We are done. I realise that was a bit of effort for one flow action, but it will all be worth it in the end. As a final step, rename this task to “Generate PDF URL”.
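If you want to sanity-check the result, the assembled value is equivalent to a single expression along these lines (a sketch built from the pieces above, not the literal designer output):

concat(
  body('Parse_JSON')?['.mediaBaseUrl'],
  '/transform/pdf?provider=spo&inputFormat=',
  first(body('Parse_JSON')?['ListData']?['Row'])?['File_x0020_Type'],
  '&cs=',
  body('Parse_JSON')?['.callerStack'],
  '&docid=',
  first(body('Parse_JSON')?['ListData']?['Row'])?['.spItemUrl'],
  '&',
  body('Parse_JSON')?['.driveAccessToken']
)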

Step 8: Send the PDF URL back to PowerApps

Add the PowerApps "Respond to PowerApps" action to the flow. Click the Add an output icon and choose Text from the list of output types. Name the output PDFURL and set the value to the PDFURL variable you created in step 7.

Ok we completed the flow. Save it and give it a quick review. It should look something like this…

If you have not done so already, save your Flow and give it a short, sharp name like GetMyPDF.

Building a sample PowerApp

Now let's build a sample field-worker iPad app. This app will allow a field worker to choose an equipment type and then, based on the equipment type, choose the available document types for it. While we will not win any design awards for this proof of concept, the added bonus is that you can learn how to do cascading dropdowns using SharePoint metadata.

The concept is shown below. First our trusty technician picks the equipment they are dealing with (step 1), and then they further refine by document type (step 2). Since the dropdowns cascade, only document types that exist for a chosen piece of equipment will be selectable.

This in turn refines a list of matching documents, irrespective of which folder they actually live in. From there the user clicks the “go” button (step 3) to display the PDF version (step 4).

To create the app, in the PowerApps portal, make a canvas app from blank and choose the Tablet form factor…

Next, let’s connect the app to the SharePoint document library we used at the start of this article. From the View menu, choose Data Sources and click the Add data source button. Make or use a SharePoint connection and choose the site collection where you uploaded the documents.

Click the Go button and manually type in the name of your document library and click Connect (at the time of writing, you still have to link to SharePoint document libraries by typing in their name).

From the Insert menu, choose Gallery and add a Blank vertical gallery. Set the data source to the connection you just made…

Click on the pencil in the gallery, and then from the Insert menu, choose Label. Click on this label and in the Text property, change it to ThisItem.'{FilenameWithExtension}'. You should now see the files and folders from the document library listed.

Note: {FilenameWithExtension} is one of a number of properties of a SharePoint library that PowerApps provides. We will soon use another built-in one as well…

Now let’s make use of the SharePoint columns we created at the start of this article. Recall that I made a column called DocumentType and one called Equipment. We are going to use these to filter this gallery so that only certain files are shown. After all, our field workers want to quickly get to the documents they need…

Click outside of the gallery, and then from the Insert menu, choose Controls and then pick Dropdown from the list

Set the Items property for this dropdown to Distinct(Documents, Equipment). If you then hold the Alt key, you will be able to click the dropdown and see the equipment. This list has been built from the SharePoint document library. The distinct function returns all unique values for the specified column (Equipment) without hardcoding the values.

Note: We are seeing a blank value in the dropdown because of the folder I created in this library, which has not been tagged with an Equipment or DocumentType value. The empty value is seen as something unique by the Distinct function. If you want to make it go away, try this alternative setting for the Items property, which will remove the empty value…

Filter(Distinct(Documents, Equipment), !IsBlank(Result))

Insert another dropdown and place it under the first one. Set the Items property for this one to

Distinct(Filter(Documents, Equipment = Dropdown1.Selected.Value ), DocumentType )

What this does is filter the document library to only items that match the currently selected equipment. Then it uses distinct to get all the unique document types for that equipment. This pretty much means you now have a cascading set of dropdowns. The values in the second dropdown are based on what was selected in the first dropdown.

Now let’s return to our gallery and modify it so that it only shows items based on the dropdowns. In the Items property of the gallery, change it to:

Filter(Documents, Equipment=Dropdown1.Selected.Value, DocumentType=Dropdown2.Selected.Value)

Note that depending on the name of your dropdown controls, you might need to modify the text above.

Test by changing the values of the dropdowns and confirm that different documents are listed in the gallery.

Next let’s add a button to the gallery so a user can load the PDF. Select your gallery and click the pencil icon. From the Insert menu, add a Button and set the label to “go”. Position it to the right of the file name. Also double check that you placed the button inside the gallery and not on the screen by confirming the button sits as a child control of the gallery in the left hand navigation.

Click on the newly minted button and let’s now call our PDF generation flow. From the Action menu, click on Flows and in the data panel, find your Flow.

If this worked, the flow will be asking you for the two "Ask in PowerApps" parameters that you set up earlier. For the first parameter, enter Gallery1.Selected.ID and for the second, enter Gallery1.Selected.'{Path}'. Your formula should look something like this…

E.g.: GetMyPDF.Run(Gallery1.Selected.ID, Gallery1.Selected.'{Path}')

Note: I am assuming your Gallery is called Gallery1.

We are not quite done though, because we need to capture the output of running this flow in a variable. This will give us the PDF URL. To achieve this, modify your expression so that the flow call is inside the Set function…

Set(MyPDF, GetMyPDF.Run(Gallery1.Selected.ID, Gallery1.Selected.'{Path}'))

 

If you want to test things at this point, simply preview the app and press the button. If the flow has worked, you can view the variable via the View menu to see whether a URL has been created.

Finally, let’s display the PDF.

From the Insert menu, choose Controls and add a PDF Viewer control. Place it to the right of the gallery and set the Document property to MyPDF.pdfurl as shown below:

If all goes to plan, you should be viewing PDFs. Use the dropdowns to find a different document and click the button to see the PDF. It may take a few seconds to load the first time, particularly if the original document is large, but it should happily display.

Try some different document types… For example, below is a sample excel document that I loaded into the library…

Taking it further (and caveats)

In the real world, I have used this technique to deliver documents to field staff with much more sophisticated navigation and searching capability. It has been used on large construction sites to share safety bulletins and it has been integrated with more extensive PowerApps solutions for managing callouts/jobs and managing assets. Field staff were stoked that they were able to quickly call up and view key information without having to navigate through a file share or SharePoint.

By the way, this method can also be used to generate image thumbnails, making it excellent for apps that are photo-heavy and bandwidth-sensitive. I have utilised the thumbnail variation of this technique to add photos to activity feeds in apps, and have even used it to create rich action cards in Microsoft Teams.

Another use for this approach is as a free PDF conversion tool. Paul O'Flaherty pointed out to me that this is more elegant than the OneDrive Flow action that many people commonly use. Using a variation of the flow I outlined in this article, we can save an HTML file directly to a SharePoint library, and then use that file's ID and Path to get the PDF URL for it. Finally, we can use the Flow HTTP action to grab that PDF and save it into SharePoint. Neat, eh? We don't need the OneDrive connector anymore.

So what's the catch? The main one is the common caveat: potentially lots of flow runs. Remember that with this method, each time a user clicks a button, a flow run is used to generate the PDF URL. One can improve this by leveraging the fact that we can call the RenderListDataAsStream API and send it multiple documents to convert to PDF. Thus, a single flow run can actually generate a lot of PDF (and thumbnail) URLs.

Finally, Ashlee and I recorded a video of this technique so you can follow along with that over at YouTube.

Phew! We are finally done. Thanks for sticking with me, and please let us know your feedback and how you plan to utilise this approach. This feedback often gives me ideas and new directions to explore.

Till next time

Paul Culmsee

Company: www.sevensigma.com.au

Books: www.hereticsguidebooks.com

Blog: www.cleverworkarounds.com

Advanced | Flow of the Week: Creating an AtBot ChatBot connected to Dynamics 365
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/atbot-chatbot-connected-to-dynamics365/ (published Wed, 24 Oct 2018)

Creating Bots that integrate with Dynamics 365 has not been easy for non-developers in the past, because it required a lot of coding, and in order to change the conversation flows you needed to update and redeploy the Bot's code to make it available to the users.

Recently I was introduced to a partner solution called AtBot that allows us to create Bot services through the AtBot portal that links to LUIS and Azure Bot Services, allowing us to build conversation flows and dialogs using Flow as the authoring engine.

This allows us to build Bots with zero coding experience that also leverages the power of Flow to connect to other services seamlessly, allowing us to integrate Dynamics 365 using the standard entities.

In this walkthrough we will show you how to configure and build an AtBot bot that connects to Dynamics 365, using LUIS as the engine for discovering the user's intent, and deploy it out to chat platforms like Microsoft Teams.

This is gold I tell you, gold!


Hello, Flow Fans, and welcome to another Flow of the Week!

This Flow of the Week comes from Murray Fife, Global Black Belt for Microsoft! I'll let him introduce himself, and then we can get into the meat of the post!

Murray Fife is a Dynamics 365 Global Black Belt (GBB) at Microsoft, supporting national sales of the Dynamics 365 platform and evangelizing all of the Microsoft technologies. Murray has over 20 years of experience in the software industry working as a developer, an implementation consultant, a trainer, a public speaker, and a demo guy. As a bonus he is finally getting the chance to dust off his degree in Artificial Intelligence and put that to work with the Microsoft AI technologies.



In his spare time, he is also an author of over 50 books on Microsoft technologies, including the Bare Bones Configuration Guide series, a personal mission of his that introduces new users to the setup and configuration of Dynamics AX and Dynamics 365 using step-by-step visual walkthroughs. This set of 15 guides starts off with the Financial modules, progresses through the Operational and Distribution modules, and also includes specialized modules like Production, Warehouse Management, Human Resources and Project Accounting.

You can find all of his books on Amazon (www.amazon.com/author/murrayfife), and also on the Dynamics Companions site (www.dynamicscompanions.com) where he publishes resources for the Dynamics community.

Murray is a continual tinkerer and blogger, and you can follow all of his experiments and projects on his personal A Tinkerers Notebook blog site (www.atinkerersnotebook.com) and on Twitter (@murrayfife). 

Check out the video below as Murray introduces the solution and how he has connected Microsoft Flow + Dynamics using AtBot.

 

 

Now, to see how the full solution is built, click THIS LINK and download the full blog post walkthrough!

 

The post Advanced | Flow of the Week: Creating an AtBot ChatBot connected to Dynamics 365 appeared first on Microsoft Power Platform Blog.

]]>
Advanced | Flow of The Week: Record your travel mileage using Flow and Bing Maps http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/record-your-travel-mileage-using-flow-and-bing-maps/ Wed, 03 Oct 2018 14:43:26 +0000 Recording your travel mileage is a crucial, and currently manual, process for getting travel reimbursement. However, thanks to Flow and the Bing Maps connector, we can now automate this process as well. There are two Flows, one for departures and a second for arrivals, but both update the same single row in the Excel spreadsheet. The Flow is smart enough to find the last entry submitted by the user that has only the departure information and is pending arrival.

The post Advanced | Flow of The Week: Record your travel mileage using Flow and Bing Maps appeared first on Microsoft Power Platform Blog.

]]>
What's up Flow Fans?!

This week's Flow of the Week is written by Flow & PowerApps MVP Daniel Christian. Daniel is a rockstar community member, Lego superstar and all-around problem solver. Check out his YouTube Channel and follow him on Twitter HERE. Also, be sure to leave questions or comments below and he will be happy to answer them!

Introduction:

Recording your travel mileage is a crucial, and currently manual, process for getting travel reimbursement. However, thanks to Flow and the Bing Maps connector, we can now automate this process as well. There are two Flows, one for departures and a second for arrivals, but both update the same single row in the Excel spreadsheet. The Flow is smart enough to find the last entry submitted by the user that has only the departure information and is pending arrival.

Requirements:

 1. A Flow subscription
 2. A Bing Maps Key
 3. A location to save the data. In this scenario we are using Excel saved in OneDrive, but you can do the same using SharePoint lists, a Common Data Service entity or SQL tables.
 

Bing Maps Connector:

You need to first create an account with the Bing Maps Portal at https://www.bingmapsportal.com/ using an existing Microsoft account. You can then select the My Keys option in the My account drop-down list.
 
 

 
Copy the key
 

Next head over to https://Flow.Microsoft.com to add the Bing Maps connector which is currently in preview.
 

 
This is where you add the API key
 

Excel spreadsheet:

The Excel spreadsheet is located in OneDrive for Business and is shared with all the users who use this Flow.
 

 

Flow #1: Depart

This flow uses a virtual button as its trigger. Here is what the Flow looks like.
 

 
Here is the expression for the StartTime column, which subtracts 5 hours from UTC to get local time and then formats the result: formatDateTime(addHours(utcNow(), -5), 'MM-dd-yyyy HH:mm')
 

Flow #2: Arrive

When the flow is triggered, we first create two variables: one to save the email address of the user provided by the Flow button action, and the other to hold the distance. The FlowUserVar is required for a condition to work.
 

 
Next we Get all the rows from the Excel spreadsheet.
 

 
The following steps are required to first find the rows that contain the end user's email address, and within those, find the row that doesn't yet have the destination, end time or end address. Here we use the length expression to find the row where the end time is empty. Instead of the end time you could use the end address as well.
 
length(items('Apply_to_each')?['Endtime'])
 

 
As you can see below, if the length is equal to 0 then we update that row with the end time and date. We find that row by using the Row id.
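If you prefer a single expression for that check, a sketch using the empty function should behave the same way (the coalesce guards against the column coming back as null rather than an empty string; the action and column names match the ones above):

    empty(coalesce(items('Apply_to_each')?['Endtime'], ''))

This returns true only for the row that is still waiting on an arrival entry.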
 

Now that we have the start and end addresses, we can use Bing Maps to find the distance between them. To do that we add the Get route action. Waypoint 1 is the start address, which we get from the Excel spreadsheet, and Waypoint 2 is the end address, which we got from the Manually trigger a flow action.
 

 
Finally, we can take the total travel distance and update the row. For some reason Flow does not like to use the Travel Distance value as is and hence we have to wrap it in a variable and then save it to the row.
 

 
Now you have a single row which has recorded the start address with date and time, end address with date and time and the mileage.
 

This video demonstrates how the two Flow virtual buttons work and the logic behind the flow.
 

Conclusion:

The total mileage is provided by Bing Maps and might have subtle differences compared to other map services such as Google. Also, in this case I have only recorded the mileage, but you could record the total travel time by calculating the delta between the start and end times.
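If you do want to record the travel time, here is a sketch of that delta calculation, assuming the start and end timestamps have been read into variables (the variable names are hypothetical):

    div(sub(ticks(variables('EndTime')), ticks(variables('StartTime'))), 600000000)

The ticks function converts a timestamp into 100-nanosecond intervals, so dividing the difference by 600,000,000 yields the duration in whole minutes.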
 

The post Advanced | Flow of The Week: Record your travel mileage using Flow and Bing Maps appeared first on Microsoft Power Platform Blog.

]]>
Advanced | Flow of the Week: Sending Pull Request review reminders using MS Flows http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/sending-pull-request-review-reminders-using-ms-flows/ Wed, 12 Sep 2018 14:06:44 +0000 Hey Flow Fans! Have you wanted to send automated PR reminders out to your team? Look no further and follow along as Jyoti, one of the Microsoft Flow engineers, guides you through creating the Flow!

The post Advanced | Flow of the Week: Sending Pull Request review reminders using MS Flows appeared first on Microsoft Power Platform Blog.

]]>
Hello Flow Fans!

With the increasing number of code repositories, it was getting difficult to keep track of Pull Requests (PRs) that need to be reviewed. In an effort to improve dev productivity and reduce PR wait times, we created a simple flow using the Visual Studio Team Services (VSTS) connector to consolidate all PR links and send reminders to reviewers. Now, having a single email in their inbox every week with PRs to review saves everyone the hassle of navigating to each repository.

Overview of the flow

  1. Trigger the flow through Recurrence
  2. Get the desired PRs using VSTS REST APIs
  3. Parse response JSON
  4. Iterate over reviewers list for each PR and prepare a dictionary of reviewer-PRs
  5. Send emails to the reviewers with the list of PRs they need to review

The Flow in this article sends emails to a filtered list of users so as to not spam members from other teams. This can be enhanced to send emails to users in an alias (not in scope of this article).

 

First choose your Trigger

A recurrence trigger can be used, as the aim is to send an email at regular intervals. You can set yours up like mine, or choose a schedule of your own.

 

Now add an action to get Pull Requests from VSTS

The VSTS connector supports multiple actions; the one we used is called 'Send an HTTP request to VSTS'. VSTS exposes hundreds of REST APIs (https://docs.microsoft.com/en-us/rest/api/vsts/?view=vsts-rest-4.1) at our disposal. We will leverage the Pull Requests APIs here.

We use the HTTP GET method to retrieve all active PRs where a particular reviewer was added. The account name is the Azure DevOps organization name (e.g. for jorg.visualstudio.com the account name is jorg). Here, the query gets all the active pull requests where a particular reviewer has been added.
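As a rough sketch, the relative URI passed to the 'Send an HTTP request to VSTS' action looks something like the below. The project name and reviewer GUID are placeholders, and you should confirm the exact query parameters against the REST reference linked above for your API version:

    GET {project}/_apis/git/pullrequests?searchCriteria.status=active&searchCriteria.reviewerId={reviewer-guid}&api-version=4.1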

 

Next we will add an action to Parse the JSON

The output is then parsed using the 'Parse JSON' action. You can use a sample payload to generate the schema, as the response JSON structure can be complicated.

 

Iterate over PRs

Next, we iterate over all the PRs and reviewers to prepare a dictionary of users and PRs.

 

Iterate over reviewers for each PR

At this step, the reviewer list for each PR is iterated through and we prepare a Reviewer-PR map. E.g.

{
    "reviewer1_alias": [
        "<tr>\n<td>PR Title 1</td>\n<td>requester1</td>\n<td>Pending review</td>\n<td>https://url1\n</td>\n</tr>"
    ],
    "reviewer2_alias": [
        "<tr>\n<td>PR Title 2</td>\n<td>requester2</td>\n<td>Waiting for author</td>\n<td>https://url2\n</td>\n</tr>",
        "<tr>\n<td>PR Title 3</td>\n<td>requester2</td>\n<td>Pending review</td>\n<td>https://url3\n</td>\n</tr>"
    ]
}



An extension here could be to check if the comments from the reviewer have been resolved or not.

 

Send email

For each of the above reviewers in the dictionary, we can use the 'Send an email' action with the Outlook connector and pass in our parsed items. The email looks as follows.
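Since the dictionary values are already HTML table rows, the email body is essentially those rows wrapped in a table. A minimal sketch, assuming the rows for the current reviewer have been joined into a variable (the variable name is hypothetical):

    <table>
        <tr><th>Title</th><th>Requested by</th><th>Status</th><th>Link</th></tr>
        @{variables('ReviewerRows')}
    </table>

You may need to switch the action's body to HTML/code view (or set the Is HTML advanced option on older versions of the action) so the markup renders instead of being escaped.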

Thanks for reading! Hope this helps improve dev productivity in your team as well. Please leave comments with any improvements, enhancements or questions you may have!

 

The post Advanced | Flow of the Week: Sending Pull Request review reminders using MS Flows appeared first on Microsoft Power Platform Blog.

]]>
Power platform Security & Governance: Deploying a Defense in Depth Strategy http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/security-governance-strategy/ Thu, 30 Aug 2018 14:47:07 +0000 A common cyber security approach used by organizations to protect their digital assets is to leverage a defense-in-depth strategy. When customers ask how to best secure and govern their Microsoft Flow and PowerApps environments, we provide similar guidance. The following list represents different layers that you can use to protect your digital assets and apply governance to ensure your organization’s interests are met.

The post Power platform Security & Governance: Deploying a Defense in Depth Strategy appeared first on Microsoft Power Platform Blog.

]]>
A common cyber security approach used by organizations to protect their digital assets is to leverage a defense-in-depth strategy. The SANS Institute defines defense-in-depth as “protecting a computer network with a series of defensive mechanisms such that if one mechanism fails, another will already be in place to thwart an attack.”

When customers ask how to best secure and govern their Power platform environments (which includes Microsoft Flow and PowerApps), we provide similar guidance. The following list represents different layers that you can use to protect your digital assets and apply governance to ensure your organization’s interests are met.

  • Secure data at rest: Microsoft Flow does not provide users with access to any data assets that they don't already have access to. This means that users should only have access to data that they really require access to. It also means that if a user has access to this data through a web browser, then they likely have access to it through Microsoft Flow. A recommendation the Microsoft Flow team suggests is using a least-privilege approach to data access. The United States Computer Emergency Readiness Team refers to least privilege access as: "Every program and every user of the system should operate using the least set of privileges necessary to complete the job. Primarily, this principle limits the damage that can result from an accident or error." Deploying least privilege access is a good practice and a big part of an organization's overall security hygiene.
  • Network Access Control: The National Institute of Standards and Technology (NIST) encourages organizations to inspect "inbound and outbound network traffic for specific IP addresses and address ranges, protocols, applications, and content types based on the organization's information security policies." While Microsoft Flow is a cloud-based application, organizations have the ability to govern how connections are established when users are connected to the corporate network. For example, if an organization blocks access to a social media site from within their corporate network by blocking the sign-on page through their firewall, then when this same log-in page is launched from the Flow portal, the connection can also be blocked from being established.
  • Location-based Conditional Access: For organizations that want to govern where users can access the Microsoft Flow service from, they can set up Azure Active Directory Conditional Access policies that restrict which network addresses have access to the service. For additional information, please refer to the following presentation from the Microsoft Business Application Summit.
  • Data Loss Prevention: Data leakage can be avoided by configuring Data Loss Prevention (DLP) policies that allow an administrator to group connectors into Business data and Non-Business data groups. Connectors within each group can communicate with each other, but cannot be used within a flow if the connectors span these two data groups. There are both design-time and runtime checks that enforce these policies.
  • Anomaly Detection: Anomaly detection is another common strategy used by organizations to understand user behavior. For example, if an organization usually creates 5 new flows every day and there is an exponential spike in flows being created, then it may be worth understanding what is driving that growth. Is it legitimate usage, or is there a threat? How can this be detected? Microsoft recently released management connectors for Microsoft Flow, Microsoft PowerApps and the Microsoft Power platform. We also published a template that will automate the discovery of these assets.

  • Audit Trails: NIST classifies audit trails as "a record of system activity both by system and application processes and by user activity of systems and applications. In conjunction with appropriate tools and procedures, audit trails can assist in detecting security violations, performance problems, and flaws in applications." Microsoft Flow publishes audit trail events to the Office 365 Security and Compliance center related to:
    • Created flow
    • Edited flow
    • Deleted flow
    • Edited permissions
    • Deleted permissions
    • Started a paid trial
    • Renewed a paid trial

As part of these audit events, the user who was involved in the event will be captured and in the case of create flow and edit flow events, the connectors used in these flows will also be captured.

 

  • Alerting is another line of defense that should be used to inform stakeholders when corporate policies have been broken. Much like we want Microsoft Flow users to automate their business processes, we also want to provide administrators with this same level of automation. An example of alerting that can be implemented is subscribing to Office 365 Security and Compliance Audit Logs. This can be achieved through either a webhook subscription or a polling approach (a sketch of such a subscription follows after this list). However, by attaching Flow to these alerts, we can provide administrators with more than just email alerts. By leveraging the new Management Connectors or PowerShell Cmdlets, corrective action can be implemented, which allows administrators to remain productive as they protect their environment.
  • Education cannot be ignored as a layer of defense. Cybersecurity is more than just technology and processes; it is also highly dependent upon people. Phishing continues to be a popular avenue for hackers to exploit, in part due to users clicking on links that they shouldn't. In many circumstances, users are tricked into clicking on links by cleverly designed campaigns. End-user education continues to be another layer that organizations implement to prevent breaches. Microsoft Flow users should also be educated on company cyber security policies to ensure this security layer is not exploited.
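As mentioned in the alerting layer above, here is a minimal sketch of subscribing a webhook to the audit log feed, under the assumption that you are using the Office 365 Management Activity API (the tenant GUID and the receiving URL, which could be an HTTP-triggered flow, are placeholders):

    POST https://manage.office.com/api/v1.0/{tenant-guid}/activity/feed/subscriptions/start?contentType=Audit.General

    {
        "webhook": {
            "address": "https://<your-http-triggered-flow-url>"
        }
    }

Once the subscription is active, the service will POST notifications containing content URIs that your flow can fetch and filter for the Microsoft Flow audit events listed above.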

Additional Resources

In this blog post we discussed many security layers that organizations should implement as they seek to govern and protect their environment. In addition to what we have discussed in this blog post, we also have additional resources that organizations can leverage to protect their environments.

  • PowerShell Cmdlets for PowerApps and Microsoft Flow: In May, we introduced PowerShell cmdlets that provide both user and admin functions to automate Application Lifecycle Management (ALM) and administrative tasks. We continue to update these PowerShell cmdlets based upon customer feedback. Please find the latest release here.

  • The PowerApps and Microsoft Flow Governance and Deployment Whitepaper was released earlier this month and includes prescriptive guidance for deploying and managing the Power platform. Topics within the whitepaper focus on the following areas:

  • Data Loss Prevention (DLP) Policies
  • PowerApps and Microsoft Flow Access Management
  • Automating Governance
  • Deployment Scenarios
  • Office 365 Security and Compliance Center
  • Importing and Exporting application packages
  • Licensing
  • Power platform Admin Center (coming soon): At the Business Application Summit in July, we announced a unified experience for managing Dynamics 365, PowerApps, Microsoft Flow and CDS for Apps assets. One of the features of this new admin experience is Admin Analytics, which will give administrators insight into how flows and apps are used within their tenant.

The post Power platform Security & Governance: Deploying a Defense in Depth Strategy appeared first on Microsoft Power Platform Blog.

]]>
Advanced | Flow of The Week – Integrate Facebook Workplace with your SharePoint Intranet http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/connect-facebook-workplace-to-sharepoint/ Fri, 20 Jul 2018 11:30:24 +0000 In this entry of the Flow of the Week - Community Member Tomasz Poszytek builds a Flow that connects Facebook Workplace to a SharePoint intranet site!
Come and find out how he did this and how you can build it too!

The post Advanced | Flow of The Week – Integrate Facebook Workplace with your SharePoint Intranet appeared first on Microsoft Power Platform Blog.

]]>
Facebook @workplace integration with SharePoint using Microsoft Flow

Hello Flow Fans! 

This week's Flow of The Week comes from Tomasz Poszytek. It was originally posted on his blog and is re-posted here with permission.

This post will show how to connect your SharePoint intranet with @workplace by Facebook, a community area dedicated to enterprises. The aim of the flow is to automatically gather and store every new post, comment and other sort of activity that occurs on the @workplace side in a dedicated list within a SharePoint site.

Because there is no out-of-the-box solution to show a @workplace activity feed yet, this approach will let you have your data inside SharePoint and then show it to your users using a built-in list web part or a custom display web part. The choice is yours! Let's begin.

 

Setting up @workplace and Flow

 

Step no 1 – create custom integration in @workplace

The integration is done via a webhook configured on the @workplace side. To be able to use the webhook, it first needs to be defined. To do that, you need to go (as a @workplace administrator) to the "Integrations" page: https://[your-company].facebook.com/work/admin/?section=apps&ref=bookmarks and then create a new custom integration:

Then you need to define its name and optionally a description:

Now you need to select what permissions the integration needs (in my case this is only "Read group content"), then select the groups to which it will have access (in my case this is "All groups")

and finally configure the webhooks.

Remember that you can only subscribe one URL per webhook topic, but you may use the same URL for multiple topics.

In my case I only configured webhooks for the “Groups”. How? Read below.

 

Step 2 – webhook configuration and verification

 

To configure and save a webhook, it must have a verified connection with the callback URL. The callback URL is in fact the URL of the Microsoft Flow which is going to receive and parse the data.

The tricky thing is that the callback verification is done using a GET request, whereas further webhook requests are done using POST calls.

To do that simply create a Flow with a “Request” action as the trigger.

Then parse the header parameters, using the “Parse JSON” action:

  1. Content: triggerOutputs()['queries']
  2. Schema:
    { "type": "object", "properties": { "hub.mode": { "type": "string" }, "hub.challenge": { "type": "string" }, "hub.verify_token": { "type": "string" } } }

Finally, add the "Response" action, and use the "hub.challenge" value as the Body:
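For reference, a sketch of that Body value, assuming the Parse JSON action above is named 'Parse_JSON' (adjust to your action's actual name):

    body('Parse_JSON')?['hub.challenge']

Echoing this value back with a 200 status is what proves to Facebook that the endpoint really belongs to you.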

Do not forget to change how the Flow can be triggered: set the method to "GET" (1) in the trigger action (it's under the "advanced options"), publish your workflow and finally copy its URL (2):

Now paste the copied URL into the "Callback URL" field in the webhook configuration, type any "Verify Token" value (it can be used to check whether the GET request really comes from your webhook) and hit "Save":

Now you're ready to go: you will notice a new run in your Flow, completed successfully, and your new "Custom integration" will be saved.

Whenever you decide to change ANYTHING in your Custom Integration configuration, you will need to verify the Callback URL again, which means sending another GET request to your Flow.

Data structure

For saving comments I am using a single SharePoint list built from the following columns:

  1. Title – holds type of the information.
  2. Author – a text column to keep name and last name of an author.
  3. Date – when the event occurred (from webhook).
  4. Post/ Comment – a multiline text field, allowing HTML formatting, to keep body of the message.
  5. Image – again, a multiline text field allowing HTML formatting, to store the <img> tag with the URL of the image attached to a message. This is because Flow does not support "Picture/ Hyperlink" fields with the type set to "Picture".
  6. SourceURL – a “Picture/ Hyperlink” field with a type set to “Hyperlink”.
  7. AuthorPPL – again an author field, but this time as a "Person or group" field. My Flow tries to map the Author data to an existing SP user.
  8. ItemId – an identifier of the message: comment_id, post_id or a member_id.

The list, filled up with data, looks as below:

Building a Flow

Remember to keep the actions for the GET response within your workflow. I did it by adding a condition action with the condition "1 is equal to 2", so that the Flow always goes the other way. I change its behavior manually between receiving GET and POST requests, which is impossible to automate right now:

Whenever I modify my @workplace Custom integration settings, I change the method to "GET" and the condition to "1 is equal to 1". After verification passes, I turn them back to "POST" and "1 is equal to 2". Some kind of a lifehack!

Step 1 – Request body schema

In my "POST" branch, the first action is "Parse JSON", used to parse the request body. Before I arrived at a valid schema, I made dozens of calls from @workplace to Flow to see how specific actions are represented in JSON. I identified the following scenarios:

  1. Membership – when a new user joins a group
  2. Comment – when a new comment is created
  3. Post – when a new post is written
    1. Post with a single photo
    2. Post with multiple photos
    3. Post without photo (a status)
    4. Event
    5. all others

In the end, the schema looks like the below (it contains properties for all scenarios, none of which are set as mandatory):

{
    "type": "object",
    "properties": {
        "entry": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "changes": {
                        "type": "array",
                        "items": {
                            "type": "object",
                            "properties": {
                                "field": { "type": "string" },
                                "value": {
                                    "type": "object",
                                    "properties": {
                                        "from": {
                                            "type": "object",
                                            "properties": {
                                                "id": { "type": "string" },
                                                "name": { "type": "string" }
                                            }
                                        },
                                        "member": {
                                            "type": "object",
                                            "properties": {
                                                "id": { "type": "string" },
                                                "name": { "type": "string" }
                                            }
                                        },
                                        "update_time": { "type": "string" },
                                        "verb": { "type": "string" },
                                        "community": {
                                            "type": "object",
                                            "properties": {
                                                "id": { "type": "string" }
                                            }
                                        },
                                        "actor": {
                                            "type": "object",
                                            "properties": {
                                                "id": { "type": "string" },
                                                "name": { "type": "string" }
                                            }
                                        },
                                        "attachments": {
                                            "type": "object",
                                            "properties": {
                                                "data": {
                                                    "type": "array",
                                                    "items": {
                                                        "type": "object",
                                                        "properties": {
                                                            "url": { "type": "string" },
                                                            "subattachments": {
                                                                "type": "object",
                                                                "properties": {
                                                                    "data": {
                                                                        "type": "array",
                                                                        "items": {
                                                                            "type": "object",
                                                                            "properties": {
                                                                                "url": { "type": "string" },
                                                                                "media": {
                                                                                    "type": "object",
                                                                                    "properties": {
                                                                                        "image": {
                                                                                            "type": "object",
                                                                                            "properties": {
                                                                                                "src": { "type": "string" },
                                                                                                "width": { "type": "number" },
                                                                                                "height": { "type": "number" }
                                                                                            }
                                                                                        }
                                                                                    }
                                                                                },
                                                                                "type": { "type": "string" },
                                                                                "target": {
                                                                                    "type": "object",
                                                                                    "properties": {
                                                                                        "url": { "type": "string" },
                                                                                        "id": { "type": "string" }
                                                                                    }
                                                                                },
                                                                                "title": { "type": "string" }
                                                                            }
                                                                        }
                                                                    }
                                                                }
                                                            },
                                                            "media": {
                                                                "type": "object",
                                                                "properties": {
                                                                    "image": {
                                                                        "type": "object",
                                                                        "properties": {
                                                                            "src": { "type": "string" },
                                                                            "width": { "type": "number" },
                                                                            "height": { "type": "number" }
                                                                        }
                                                                    }
                                                                }
                                                            },
                                                            "type": { "type": "string" },
                                                            "description": { "type": "string" },
                                                            "target": {
                                                                "type": "object",
                                                                "properties": {
                                                                    "url": { "type": "string" },
                                                                    "id": { "type": "string" }
                                                                }
                                                            },
                                                            "title": { "type": "string" }
                                                        }
                                                    }
                                                }
                                            }
                                        },
                                        "type": { "type": "string" },
                                        "target_type": { "type": "string" },
                                        "comment_id": { "type": "string" },
                                        "post_id": { "type": "string" },
                                        "created_time": { "type": "string" },
                                        "message": { "type": "string" },
                                        "permalink_url": { "type": "string" }
                                    }
                                }
                            }
                        }
                    },
                    "id": { "type": "string" },
                    "time": { "type": "number" }
                },
                "required": [ "changes", "id", "time" ]
            }
        },
        "object": { "type": "string" }
    }
}

 

Step 2 – setting up actions

After parsing the request body using the schema, the Flow must do the following steps:

  1. Get the operation field – whether the entry is a "Comment", "Membership" or "Posts".
    I do it using the "Compose" action and the following expression:
    body('Parse_request_body')?['entry']?[0]?['changes']?[0]?['field']
     
  2. Switch branches based on the operation type:

  1. For each type I am getting the proper ID (e.g. comment_id for "Comments" or post_id for "Posts")
  2. Then it must query SharePoint to check whether there is already a record with that ID. If there is, then obviously the workflow ends its execution:

  1. If this is a "Post" type, then I look up the internal operation type property using the following expression:
    body('Parse_request_body')?['entry']?[0]?['changes']?[0]?['value']?['type']

  1. Based on the outcome, Flow switches between:
    1. Photo
    2. Status
    3. Event
    4. and other requests (haven’t seen anything “other” yet)
       
  2. Then, for the "Photo" type, it also checks whether there is a single photo or multiple photos (a gallery) by checking if the type is "album", evaluating the below expression:
    body('Parse_request_body')?['entry']?[0]?['changes']?[0]?['value']['attachments']['data']?[0]?['type']
     
  3. Finally, for each message type it also tries to match the user to an existing one, by using the "Office 365 Get user profile (V2)" action and concatenating the user's first and last name together with the company's domain as a UPN:
    concat(trim(replace(trim(body('Parse_request_body')?['entry']?[0]?['changes']?[0]?['value']['from']['name']), ' ', '.')), '@the-company.com')
     
  4. Obviously, if the user is not found, the action triggers an error, so for that case the next action, "Create list item", has its "run after" settings configured to include "has failed", so that no matter what happens, the Flow won't fail for that reason:

  1. In the end, the Flow creates a new item on the list, combining all the information together:

The structure of the Flow's actions, at a medium level of detail, looks as below for the described solution:

The icons (i) visible next to the arrows mark relations in which the next action is executed even if the previous one ends up with a failure.

The structure of a single block, used to save data from @workplace into a SharePoint list, looks as in the example below (it shows saving data for a new post):

Things to remember

During this project I learnt the following things. Please keep them in mind when trying to follow my steps:

  1. @workplace will send a GET request to Flow every time a "Custom integration" is modified and saved.
  2. @workplace will keep sending requests for a single event for as long as it doesn't receive a "200 OK" response – be prepared to verify whether the content has already been added (see the sketch after this list).
  3. @workplace does not always wait for the response. I saw dozens of failed flows simply because @workplace abandoned waiting for the response. I have no idea why; it seems to be really random. In any case, if it fails while waiting, it will try to send the same data again and again until it receives a "200 OK". Only the frequency will drop, and after a couple of tries it will stop.
  4. Using the "run after" configuration in Flow's actions can be really useful – both for overcoming missing information or actions which can end up with errors we don't really care about, and for building conditional blocks. In my Flow I used this configuration very often.

Thanks for reading this far. I hope you find it useful. Don't hesitate to contact me in case you have any questions, or leave your comment below!

 

The post Advanced | Flow of The Week – Integrate Facebook Workplace with your SharePoint Intranet appeared first on Microsoft Power Platform Blog.

]]>