Kent Weare, Author at Microsoft Power Platform Blog
http://approjects.co.za/?big=en-us/power-platform/blog

Introducing Mobile Application Management (MAM) support for Microsoft Flow Mobile Application
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/mam-flow-mobile/ | Thu, 08 Nov 2018

We have recently shipped a new version of the Microsoft Flow mobile application for Apple iOS and Android that supports Mobile Application Management (MAM) without device enrollment. Using MAM allows IT administrators to create and enforce mobile data policies to safeguard company data.

Why is this important?
Whether a customer has adopted a Bring Your Own Device (BYOD) strategy or is providing employees with a corporate phone, they are looking for more control over the data that resides on a mobile device. Organizations may want to restrict how data moves on the device and ensure the data is removed, should the employee leave the organization.

What is MAM?
MAM allows organizations to create policies that govern how an application is used within a tenant. This can include enforcing app data encryption, limiting the ability to copy or extract data to only approved applications, or requiring a PIN before the app can be used.

Does my device need to be enrolled?
Intune MAM without enrollment does not require a user to enroll their device in Intune MDM. However, the Company Portal application needs to be installed on the device to enforce policies. A user does not need to sign in to the Company Portal application for MAM to function. The Company Portal application can be downloaded from the Apple and Android app stores.

What version of the Microsoft Flow mobile app is required?
Version 2.31.0 of the app is required. Our iOS deployment has reached 100% coverage in all regions. For Android, we are staging our rollout, so there may be a delay before this version of the app is available.

How can I set up a MAM policy?
An administrator can create policies from the Azure portal. For the purpose of this blog post, we will create an App protection policy that requires a Flow user to enter a PIN when using the Microsoft Flow mobile application.

• From the Azure portal, navigate to Intune App Protection.
• Click on App protection policies – Create Policy.
• An Add a policy form will appear which requires a Name, Description and Platform.
• We now need to select an application that we want to manage. Currently, the Microsoft Flow application is identified by one of the following bundle IDs. 
   com.microsoft.procsimo  (iOS)
   com.microsoft.flow (Android)
 
Note: A more friendly “Microsoft Flow” display name will appear in this experience later this month.
 

• Ensure the appropriate application is selected based upon the platform that you are trying to target. If you do not find it in the list of apps, search for it by typing the appropriate value into the Bundle ID field. Click the Add button to add this application as a required app and then click Select to complete this configuration. 

  • We now need to define our policy that will impose specific application behaviors by clicking on Configure required settings.
  • Within the Configure required settings experience, there are 3 areas that we need to configure: Data relocation, Access requirements and Conditional launch.
  • Let’s start with the Data relocation settings. Since the flow app is not used to generate local data, we can use the default policy.

Note: This policy has been used as an example. Please modify to meet your organization’s needs. 

• Next, we are going to focus on Access requirements and can establish a policy like the one below. Once we are done configuring our Access requirements we can click on the Ok button.

Note: When testing you can lower the Recheck the access requirements after (minutes) setting to reduce the amount of time you need to wait for a prompt.

• In addition, we can also provide a Conditional launch configuration. For the purposes of this blog post we will keep the default policy and can click OK to complete this interaction.
• Click OK to close the Settings panel.
• Click Create to finalize the policy.
• Within our policy list we should now see the policy that we just created.
• We now need to assign the Azure AD groups to which this policy should apply. We can assign access by clicking on our policy and then clicking on Assignments.
 
 

To select one or more Azure AD groups, click on Select groups to include and then select the appropriate group. For this purpose, I have created an Azure AD group and included the members to whom I want these policies applied.
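For administrators who prefer to verify the result from a script, the following is a minimal sketch (not part of the original walkthrough) that lists app protection policies through the Microsoft Graph API and looks for the one created above. The policy display name and the token acquisition are assumptions; any Graph token with the DeviceManagementApps.Read.All permission should work.

# Minimal sketch: confirm the MAM policy exists by listing app protection policies in Graph.
# $accessToken is assumed to be a valid Microsoft Graph token for your tenant.
$accessToken = "<Microsoft Graph access token>"
$headers = @{ Authorization = "Bearer $accessToken" }

$policies = Invoke-RestMethod -Method Get `
    -Uri "https://graph.microsoft.com/v1.0/deviceAppManagement/managedAppPolicies" `
    -Headers $headers

# "Flow MAM Policy" is a hypothetical name; use whatever you entered in the Add a policy form
$policies.value |
    Where-Object { $_.displayName -eq "Flow MAM Policy" } |
    Select-Object displayName, id, createdDateTime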

Testing
We can now go ahead and test our MAM policy by signing in to the Microsoft Flow mobile app and following these instructions:
• Ensure you have the latest version of the iOS or Android app (version 2.31.0)
• Close the Microsoft Flow mobile app
• Launch Microsoft Flow mobile app
• You should be prompted with the following message indicating that “Your organization is now protecting its data in this app.”
• Since we opted to allow fingerprints when we created our policy, we can provide our fingerprint.
 
• Otherwise, a user will be required to set up and provide a PIN.
 
Conclusion
MAM support has been a key ask from our customers who are using Intune App Protection to manage company data on mobile devices. By providing this support, we are aligning with Microsoft's customer promises to ensure that organizations have a consistent way to manage their mobile data.

Solutions in Microsoft Flow
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/solutions-in-microsoft-flow/ | Mon, 05 Nov 2018

In a recent announcement, Microsoft shared news about a new Application Lifecycle Management (ALM) capability for PowerApps and Microsoft Flow. This new capability is built upon the Common Data Service solution system. 
In this blog post, we will share more details about how Microsoft Flow makers can use Solutions to bundle related flows (and apps) within a single deployable unit.

Application Lifecycle Management

Previously, we provided the ability to export and import a single flow from one environment to another environment. While this feature was useful for promoting individual flows between environments, it required many clicks to move multiple flows.

We have also received feedback from our customers about the ability to logically group related flows as they are built and managed. For example, you may have multiple flows that are part of a project you are working on. While working on or managing these flows, you don't want to scroll and search for them every time you need access.

Solutions address deployment needs by allowing you to export and import a set of flows (and apps). In addition, you can organize these flows within a single ‘container’ which simplifies navigating and managing these flows.

Accessing Solutions

For customers who meet the environment prerequisites (see more below), a Solutions link will appear within the left navigation of the Microsoft Flow maker portal.

The Solutions experience will be loaded, where we can find our deployed solutions. As part of this experience we can see the date each solution was installed, whether it is managed externally, its version, and the name of its publisher.

Creating a new Solution

From the Solutions experience, we have the ability to create a new solution by clicking on the New Solution button.

A new tab will open where we can provide the Display Name, Publisher and Version of our solution. Once we have provided these details, we can click Save and Close.

Adding flow(s) to a Solution

From the Solutions experience, we can click on our newly created solution and navigate to its default view. A context-aware menu will appear that allows us to add New and Existing assets to our solution. Click on New – Flow to add a new flow to our solution.

A new tab will open that will take us to the flow design surface where we can construct our flow and add our related trigger and action(s). Once we are done editing our flow, we can press the Save button to save our flow.

 Once we have saved our flow, this flow will be part of our solution.

We can repeat these same steps to add subsequent flows to our solution.

Exporting our Solution

After validating our flows work in our test environment, we now want to promote it to our production environment. We can export our solution by finding it in our Solutions experience, clicking on the …, selecting Export and then clicking on As unmanaged.

Once we click As unmanaged, a zip file will be made available for us that we can download and store locally.

Importing our Solution

With our solution exported, we can now import it into another environment, such as a production environment. With our production environment selected from the environment picker, we can now choose to import our package by clicking on the Import button.

We now need to browse to select our solution package and then complete the wizard to load the solution.

Once we complete the import process, we will find our solution deployed within our new environment.

Configuring and Enabling our Solution

With our solution imported, there are still a couple of activities that we must perform: 

  • For each flow that we have imported, we need to wire-up connections for our trigger(s) and action(s).

  • By default, when new flows are imported, they will be in a disabled state since connections still need to be established. Upon establishing connection(s) and saving the flow, the flow will become activated.

Note: You cannot save changes to a flow that is activated. You need to deactivate it first, either from the Solutions experience or from the flow maker portal, by turning the flow off.

Pre-requisites

The solutioning experience is available only online and for environment version 9.1.0.267 and later. To check your version, please go to … PowerApps admin center > Environments > select your environment > Details tab.

Future Investments

This is the initial release of solution support for PowerApps and Flow. This experience will be enhanced over time as new features are added and the process of deploying apps and flows is further streamlined.

Additional Resources

In this article we focused on including multiple flows within a solution. PowerApps can also be included in solutions, as can a mix of apps and flows. Please check out Linh Tran's post on the PowerApps blog regarding canvas apps in Solutions.

New Power platform Admin Analytics Reports: Sharing and Connectors
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/power-platform-analytics-connectors/ | Thu, 25 Oct 2018

In late September, we announced the public preview release of Power platform Admin Analytics. In that post we discussed a couple of upcoming reports including sharing and connectors. We have honored that commitment and I am happy to share that both the sharing and connectors reports are available in Microsoft Flow Admin Analytics.

Note: The pre-requisites for accessing these reports have not changed, but reducing them is something we are working on. Please review the requirements in our previous post.

Both reports provide insight into how users are using Microsoft Flow within your tenant. From a sharing perspective, you can understand who your champions are and then figure out how you can empower them to provide even more automated solutions for your organization! The connectors report will identify Microsoft, third-party and custom connectors that are in use within your organization. 

When you navigate to the Power platform admin center, you will find an Analytics menu where you can choose to browse analytics for the Common Data Service, Microsoft Flow and PowerApps. For the Sharing and Connectors reports, we will click on Microsoft Flow.

From within the Flow Analytics feature, we can click on Shared to access our sharing report. Within this report we will see visualizations that capture:
• The types of flows shared (System Events, Scheduled or Button clicked)
• The name of the flow that has been shared
• The number of shares that have taken place
• A trendline report of these share events.

In addition to the sharing report, we have also released a report that highlights connector usage. In the Connectors report we will provide:

• Two visualizations that display connector usage by:
o Flow runs
o Connector connections (calls to the connector)
• A table visualization that lists
o The name of the connector
o Number of connections 
o Number of flows involved
o Number of flow runs using that connector
 
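If you want to pull similar information outside of the embedded reports, the following is a rough sketch using the PowerApps/Flow admin PowerShell cmdlets mentioned elsewhere on this blog. It approximates the sharing view by counting owner roles per flow; the environment name is a placeholder, and the exact property names returned by the cmdlets may differ slightly between module versions.

# Rough sketch: approximate the Shared report with the admin cmdlets.
Install-Module -Name Microsoft.PowerApps.Administration.PowerShell -Scope CurrentUser
Add-PowerAppsAccount    # sign in with an account that has admin access

$environment = "Default-00000000-0000-0000-0000-000000000000"   # placeholder environment name

Get-AdminFlow -EnvironmentName $environment | ForEach-Object {
    $roles = Get-AdminFlowOwnerRole -EnvironmentName $environment -FlowName $_.FlowName
    [pscustomobject]@{
        Flow       = $_.DisplayName
        OwnerRoles = @($roles).Count    # more than one role suggests the flow has been shared
    }
} | Where-Object { $_.OwnerRoles -gt 1 } | Sort-Object OwnerRoles -Descending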

What’s coming?
We aren't done just yet. We are working on reducing the requirements for accessing these reports. We are also working on providing more details within these reports. What else is missing? We would love to hear from you in the comments below.

Announcing Power Query Online Integration for Microsoft Flow
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/powerquery-flow/ | Thu, 27 Sep 2018

We have recently shipped, in public preview, Power Query Online integration for Microsoft Flow. Using this capability allows flow makers to transform and shape their data coming from SQL Server using Power Query Online.

Why did we build this capability?

We built this capability for many reasons including:

  • An alternative to OData, which can be cumbersome for many of our users.
  • A simpler approach to joining tables than T-SQL or writing Stored Procedures.
  • Future opportunities to include additional data sources in mashups that allow makers to build data transformations across multiple data sources.

Licensing

The Power Query functionality will be included in our Premium Connector offering, which requires a Flow Plan 1 or higher. Currently this is not enforced as part of the preview, but will be in the future.

Scenario

Let’s take a closer look at how we can use this new capability to enhance our data extraction capabilities.

In this scenario, we have an Apartment Rental company called Contoso Apartments. Naturally, customers will inform the main office when there are maintenance issues across their many properties. Much like any Work Order system, data is organized in multiple tables including Customers, Apartments and Work Orders. The Customer Service representatives are very interested in keeping their customers happy and want to proactively ensure that customers are content. If they are not, they want to quickly course-correct to avoid them leaving to live somewhere else.

We can monitor customer sentiment by 'mashing up' our Customer, Apartment and Work Order data with Power Query. When we do this, we have aggregated data that can be passed to a sentiment analysis tool. When we detect that a customer is unhappy, we can then publish a message to Microsoft Teams where a customer service rep can follow up with the customer.

To build this solution we will perform the following:

  • We will add a Recurrence trigger that will run every day
  • Next, we will locate our Transform data using Power Query action which is provided as part of the SQL Server connector.

  • To connect to SQL Server, we need to create a Connection that includes a SQL Server name, Database name, Username and Password.

  • With our connection established, we can now create a Power Query query.

  • Next, we need to select the tables that we would like to include in our mash-up. In this case we are going to select Customers, Apartments and Work Orders.

  • We want to join these different tables so that we have enriched Work Order data that includes Customer and Apartment related data for our Work Orders. To do this, we will click on Combine tables  and then select Merge queries as new.

  • When we merge, we need to select the type of join that we want to use. In this case we will select a Left Join and declare our Work Order table as our core table that we want to enrich with Customer and Apartment data.

Note: In this scenario, I ran the Merge twice. Once with Work Orders and Customers and then once again with my (WorkOrders and Customers) + Apartment.

  • Once we have merged our tables, we can now trim our dataset by only including the columns that we need.

  • Before we configure the rest of our Flow, we do need to declare our new aggregated query as the query we will Enable Load for.

Note: At this time, only one query can be enabled to return results back to Flow. However, as we have discovered, we can merge multiple queries into a single query for our use.

  • With our Power Query query configured, we can now use the result set and dynamic content, much like we can do with other connectors. In our use case, what we will do with our result set is loop through each record returned and send the Work Order Comments from the customer through the Azure Cognitive Services sentiment analysis API.

 

  • Next, we will evaluate the sentiment returned to see if it is less than .4 (which is really bad). When this occurs, we will add the related Apartment, Customer and Work Order information along with this sentiment value to an array. After we have iterated through all of these recent Work Orders, we will then check the length of the array to see if we have records. If we do have records, we will convert the array to an HTML table which we can then publish to a Microsoft Teams channel.
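For reference, the sentiment check at the heart of this flow can also be exercised directly. The following is a hedged sketch of a call to the Azure Cognitive Services Text Analytics v2.0 sentiment endpoint; the region, subscription key and sample comment are placeholders, and the flow itself uses the built-in connector rather than this raw request.

# Sketch: score a work order comment and flag it if the sentiment is below 0.4.
$region = "westus"                                        # placeholder region
$key    = "<Text Analytics subscription key>"             # placeholder key
$uri    = "https://$region.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment"

$body = @{
    documents = @(
        @{ id = "1"; language = "en"; text = "The heating has been broken for two weeks." }
    )
} | ConvertTo-Json -Depth 3

$response = Invoke-RestMethod -Method Post -Uri $uri -Body $body `
    -ContentType "application/json" `
    -Headers @{ "Ocp-Apim-Subscription-Key" = $key }

# Scores range from 0 (negative) to 1 (positive)
$response.documents | Where-Object { $_.score -lt 0.4 }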

Testing

We can now go and run this flow from within the Microsoft Flow maker portal or by calling it from the Microsoft Teams Flow Bot. Once the flow runs, it will publish the results of our flow in Microsoft Teams. This allows the Customer Service channel to target customers who are unhappy without performing a lot of data exploration.

 

Features

  • Only SQL Server is supported as a data source. This is deliberate in our first release as we do not want to expose additional data sources that are not protected by Microsoft Flow Data Loss Prevention (DLP) policies. We do want to include additional data sources, but those will be future investments.
  • We do throttle Power Query Online usage based upon:
    • 2 Hours/Day
    • 10 Hours/Week

This is based upon the amount of time it takes for your Power Query queries to execute. If these values don’t work for you, we would love to hear what they need to be.

  • As described previously, we will only output 1 query. To avoid unexpected results, ensure that you Enable Load on the desired query.

What’s Next?

Working with the Power Query team to unlock this capability has been really exciting. Both teams see an opportunity to empower Power platform users to do more using these technologies. Since Power Query is a very rich and deep platform, we would love to hear more about Power Query + Flow use cases that you envision. This feedback will help us prioritize future investments. Please comment below.

Introducing Power platform Admin Analytics
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/admin-analytics/ | Tue, 25 Sep 2018

As part of the recent preview release of the Power platform Admin center, I am happy to announce that we have included Admin Analytics as part of this preview. The Admin Analytics feature includes reports for Common Data Services, Microsoft Flow and PowerApps.

For those of you who are familiar with the Microsoft Flow Maker Analytics that we previously shipped, the Admin Analytics for Microsoft Flow will look and feel very familiar. We are providing the same Power BI Embedded experience, but with an environment-wide viewpoint. 

Why did we build Microsoft Flow Admin Analytics?

We built Admin Analytics as a result of customer feedback. Our customers want more visibility into how their organization is using Microsoft Flow. They also want quick access to insights that allow them to govern and provide change management services to their users.

What is required to access Admin Analytics?

During this preview release, tenant administrator privileges are required. In addition, a Flow Plan 2 license is required.

Note: During this preview, you will also need to be part of the Environment Administrator role for the environment for which you wish to view analytics. This permission can be set within the existing Flow Admin center. See the Roadmap section of this post for additional information.

How long is data retained?

Data is retained for 28 days and never leaves the region that your environment is hosted in. However, we do provide filters that allow you to view 7 and 14 days' worth of data.

Can I export this data?

Much like other Power BI dashboards, yes you can export data by clicking on the Export data label within the visualization menu.

What is the scope of data presented?

We provide analytics from a per-environment viewpoint. You can select your environment by clicking on the Change Filters link or the Filter icon in the upper right-hand corner.

What reports are included in Microsoft Flow Admin Analytics?

As part of this initial release, we are including five Microsoft Flow reports:

  • Runs provides a Daily, Weekly and Monthly view of the Successful, Failed, Cancelled and Total flow runs within a specified environment.

  • Usage provides insights related to the types of flows that are in use. This includes Button, Scheduled and System Event flows, all broken down by the number of runs and the trend over a configured timeline.

  • Created provides insight into the different types of flows that have been created. This includes Button, Scheduled and System Event flows, all broken down by the number of runs and a trend over a configured timeline.

  • Errors provides insights into flows that may be experiencing issues. The errors will be broken down by error type so that you can look for common problems that may exist. In addition, we will provide the total number of errors that have occurred within your configured timeframe. We will also provide you with a Last occurred timestamp which will provide an indication of how recently the last error occurred.

  • Shared provides insight into the flows that have been shared within an environment. We will include the type of flow including Button, Scheduled or System Events. In addition, we will provide the name of the flow that has been shared and the number of shares that have taken place including a timeline of these share events.

What is coming next?

As mentioned earlier in this blog post, this is an initial preview of our Admin Analytics. Our team is already working on our next release which will include more details about who owns these flows that have been used, created, shared or have errors. In addition, we will be providing a Connectors report which will outline which connectors are being used within an environment.

Roadmap

In addition to the enhancements that will follow this initial release, we will also be working on more granular access to these reports so that Environment Admins have access to these reports without requiring tenant administration privileges.

We also want to provide aggregated tenant-level analytics so that you can see summary level information across your entire tenant.

What else would you like to see included in these analytics?

Please provide comments below on any features that you would love to see included in this analytics feature.

Power platform Security & Governance: Deploying a Defense in Depth Strategy
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/security-governance-strategy/ | Thu, 30 Aug 2018

A common cyber security approach used by organizations to protect their digital assets is to leverage a defense-in-depth strategy. The SANS Institute defines defense-in-depth as “protecting a computer network with a series of defensive mechanisms such that if one mechanism fails, another will already be in place to thwart an attack.”

When customers ask how to best secure and govern their Power platform environments (which includes Microsoft Flow and PowerApps), we provide similar guidance. The following list represents different layers that you can use to protect your digital assets and apply governance to ensure your organization’s interests are met.

  • Secure data at rest: Microsoft Flow does not provide users with access to any data assets that they don't already have access to. This means that users should only have access to data that they really require. It also means that if a user has access to this data through a web browser, then they likely have access to it through Microsoft Flow. A recommendation the Microsoft Flow team suggests is using a least-privilege approach to data access. The United States Computer Emergency Readiness Team refers to least privilege access as: "Every program and every user of the system should operate using the least set of privileges necessary to complete the job. Primarily, this principle limits the damage that can result from an accident or error." Deploying least privilege access is a good practice and a big part of an organization's overall security hygiene.
  • Network Access Control The National Institute of Standards and Technology (NIST) encourages organizations to inspect “inbound and outbound network traffic for specific IP addresses and address ranges, protocols, applications, and content types based on the organization’s information security policies.” While Microsoft Flow is a cloud-based application, organizations have the ability to govern how connections are established when users are connected to the corporate network. For example, if an organization blocks access to a social media site from within their corporate network by blocking the sign-on page through their firewall, then when this same log-in page is launched from the flow portal, the connection can also be blocked from being established.
  • Location-based Conditional Access: For organizations that want to govern where users can access the Microsoft Flow service from, they can set up Azure Active Directory Conditional Access policies that restrict which network addresses have access to the service. For additional information, please refer to the following presentation from the Microsoft Business Application Summit.
  • Data leakage can be avoided by configuring Data Loss Prevention (DLP) policies that allow an administrator to group connectors into Business data and Non-Business data groups. Connectors within each group can communicate with each other but cannot be used within a flow if the connectors span these two data groups. There are both design-time and runtime checks that will enforce these policies. A PowerShell sketch of configuring such a policy follows after this list.
  • Anomaly Detection is another common strategy used by organizations to understand user behavior. For example, if an organization usually creates 5 new flows every day and there is an exponential spike in flows being created, then it may be worth understanding what is driving that growth. Is it legitimate usage, or is there a threat? How can this be detected? Microsoft recently released management connectors for Microsoft Flow, Microsoft PowerApps and Microsoft Power platform. We also published a template that will automate the discovery of these assets.

  • NIST classifies Audit Trails as “a record of system activity both by system and application processes and by user activity of systems and applications.  In conjunction with appropriate tools and procedures, audit trails can assist in detecting security violations, performance problems, and flaws in applications.” Microsoft Flow publishes audit trail events to the Office 365 Security and Compliance center related to:
    • Created flow
    • Edited flow
    • Deleted flow
    • Edited permissions
    • Deleted permissions
    • Started a paid trial
    • Renewed a paid trial

As part of these audit events, the user who was involved in the event will be captured and in the case of create flow and edit flow events, the connectors used in these flows will also be captured.

 

  • Alerting is another line of defense that should be used to inform stakeholders when corporate policies have been broken. Much like we want Microsoft Flow users to automate their business processes, we also want to provide administrators with this same level of automation. An example of alerting that can be implemented is subscribing to Office 365 Security and Compliance Audit Logs. This can be achieved through either a webhook subscription or polling approach. However, by attaching Flow to these alerts, we can provide administrators with more than just email alerts. By leveraging the new Management Connectors or PowerShell Cmdlets corrective action can be implemented which allows administrators to remain productive as they protect their environment.
  • Education cannot be ignored as a layer of defense. Cybersecurity is more than just technology and processes; it is also highly dependent upon people. Phishing continues to be a popular avenue for hackers to exploit, in part because users click on links that they shouldn't. In many circumstances, users are tricked into clicking on links by cleverly designed campaigns. End-user education continues to be another layer that organizations implement to prevent breaches. Microsoft Flow users should also be educated on company cyber security policies to ensure this security layer is not exploited.
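As referenced in the Data Loss Prevention layer above, DLP policies can also be created from PowerShell. The following is a hedged sketch using the PowerApps admin cmdlets; the environment name and connector name are placeholders, and parameter names reflect the current release of the module, so check Get-Help for your installed version.

# Sketch: create a DLP policy and move a connector into the Business data group.
Add-PowerAppsAccount

$environment = "Default-00000000-0000-0000-0000-000000000000"   # placeholder environment name

$policy = New-AdminDlpPolicy -DisplayName "Contoso DLP Policy" -EnvironmentName $environment

# Connectors left in the other group cannot be combined with this one inside a single flow
Add-ConnectorToBusinessDataGroup -PolicyName $policy.PolicyName -ConnectorName "shared_sharepointonline"

Get-AdminDlpPolicy -PolicyName $policy.PolicyName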

Additional Resources

In this blog post we discussed many security layers that organizations should implement as they seek to govern and protect their environment. In addition to what we have discussed in this blog post, we also have additional resources that organizations can leverage to protect their environments.

• PowerShell Cmdlets for PowerApps and Microsoft Flow: In May, we introduced PowerShell cmdlets that provide both user and admin functions to automate Application Lifecycle Management (ALM) and administrative tasks. We continue to update these PowerShell cmdlets based upon customer feedback. Please find the latest release here. A small inventory sketch using these cmdlets follows after this list.

• PowerApps and Microsoft Flow Governance and Deployment Whitepaper was released earlier this month and includes prescriptive guidance for deploying and managing the Power platform. Topics within the whitepaper focus on the following areas:

  • Data Loss Prevention (DLP) Policies
  • PowerApps and Microsoft Flow Access Management
  • Automating Governance
  • Deployment Scenarios
  • Office 365 Security and Compliance Center
  • Importing and Exporting application packages
  • Licensing
• Power platform Admin Center (coming soon): At the Business Application Summit in July, we announced a unified experience for managing Dynamics 365, PowerApps, Microsoft Flow and CDS for Apps assets. One of the features of this new admin experience is Admin Analytics, which will give administrators insight into how these flows and apps are used within their tenant.
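Following on from the PowerShell cmdlets resource above, here is a small hedged sketch of the kind of discovery they enable: an inventory of flows and apps per environment. The property names on the returned objects are assumptions and may differ between module versions.

# Sketch: count flows and apps in every environment in the tenant.
Add-PowerAppsAccount

Get-AdminPowerAppEnvironment | ForEach-Object {
    $environment = $_
    [pscustomobject]@{
        Environment = $environment.DisplayName
        Flows       = @(Get-AdminFlow -EnvironmentName $environment.EnvironmentName).Count
        Apps        = @(Get-AdminPowerApp -EnvironmentName $environment.EnvironmentName).Count
    }
} | Format-Table -AutoSize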

Advanced | Flow of the Week: Automating Microsoft Flow Governance – Using Microsoft Flow
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/automate-flow-governance/ | Thu, 23 Aug 2018

Introduction

Enterprise Security and Governance is an important topic for many organizations. Microsoft continues to make investments that allow customers to implement PowerApps and Flow and be confident that they have their bases covered from a governance perspective. Much like Microsoft Flow empowers users to build powerful workflow and automation solutions, we want to empower administrators with the same capabilities to support their needs. In this blog post we are going to explore a scenario that describes how you can automate governance activities by taking advantage of the Office 365 Management API.

Overview

A scenario that we will walk through in this post is the ability to detect when specific events exist within a flow definition so that we can detect these events and provide pro-active governance against it. For example, some organizations would like to avoid users forwarding emails externally. Microsoft Exchange can block these scenarios through transport rules. But, using cloud workflow tools (including more than just Flow) you generally break down these actions into more discrete events. For example, I can receive an email and send an email within the same flow. Independently, these actions may not be perceived as forwarding an email, but from a functional perspective, they achieve the same result.

In order to detect these events, we will depend upon the Office 365 Security and Compliance logs which will capture events related to creating, editing or deleting a flow. In a previous blog post, we discussed how we can poll the Office 365 Security and Compliance PowerShell Webservice looking for these events. In this blog post, we are going to use an event-driven approach where we will create a webhook and have events sent to a Microsoft Flow endpoint. Once Microsoft Flow receives this event, we will go fetch additional details of the event. We will then parse these events and perform some logic to determine if a condition exists that warrants action, including stopping the flow that is a concern.

Pre-requisites

In this blogpost, we will be interacting with the Office 365 Management API and the Microsoft Flow Management connector. As a result, there are specific requirements for accessing these capabilities:

Office 365 Management API

  • Global Administrator Access
  • Azure AD Application
  • Get Office 365 tenant admin consent

Flow Management Connector

  • Global Administrator or Environment Admin
  • Microsoft Flow P2 license

Azure AD Application

The first thing that we need to do is create an Azure AD Application that we will use when calling the Office 365 Management API. For this blog post we are going to try to focus on the Microsoft Flow components as much as possible. For additional information on the Office 365 Management API, please see the following post.

To create an Azure AD Application:

  1. Navigate to the Azure Portal
  2. Select Azure Active Directory and then App registrations
  3. Create a New application registration
  4. Provide a Name for your application, Application type of Web app/API and a Sign-on URL.

Note: The Sign-on URL is an arbitrary value. You can even put a value of http://localhost

  1. Once the application has been created, you can click on Settings to further configure.
  2. Click on Properties and make a note of the Application ID as you will require it in a future step.
  3. While on the Properties screen, ensure the Multi-tenanted option is set to Yes.
  4. Click on Reply URLs and add a value. For this value you can provide an arbitrary URL, but having it resolve will simplify an upcoming step. For my example, I just used my blog http://www.middlewareinthecloud.com

  1. Next, click on Required permissions
  2. Click on Add – Select an API – Office 365 Management API
  3. Next, set the permissions as illustrated below.

  1. We now need to obtain a Key, which can be achieved by clicking on Keys.
  2. Provide a Description, Duration and click Save. Once you have done this, a Key Value will be generated. Copy this value for future use.
  3. Save and exit.

Note: If your key contains special characters like ‘/’ and ‘+’, you will get an invalid key error when you try to create a token in a subsequent step. These values need to be encoded and any online URL encoding website should be able to encode these values for you.

Get Office 365 tenant admin consent

In the Office 365 documentation, it calls out “a tenant admin must explicitly grant your application these permissions in order to access their tenant’s data by using the APIs”. As a result, a tenant admin must call the following URL in order to grant consent. In addition, the URL will return an authorization code that we will need in a future call.

Within this URL, there are two placeholders that we need to populate with information from our Azure AD application. The "{your_client_id}" placeholder refers to the Application ID that we recorded when creating our Azure AD application. The "{your_redirect_url}" placeholder refers to the Reply URL that we also provided when creating the Azure AD application.

https://login.windows.net/common/oauth2/authorize?response_type=code&resource=https%3A%2F%2Fmanage.office.com&client_id={your_client_id}&redirect_uri={your_redirect_url}

  1. With our URL formulated, we can use a web browser to make this call. Upon successfully calling this URL, you will be prompted with a consent dialog.

  1. Upon Accepting the terms, your Reply URL web page should be displayed.

Create Microsoft Flow Listener

With our Azure AD App created and consent granted to use the Office 365 Management API we are now going to create our webhook subscription within Office 365. But, before we do that we need to be able to provide a URL that can be called whenever there are events published from the O365 Management API. We will now create our flow and then we can use the URL that is provided as part of our HTTP trigger when configuring our webhook subscription.

  1. Create a Flow from blank and add an HTTP Trigger
  2. Since we want a typed message that we can use within our flow, we can provide a JSON schema payload of an event we can expect to receive from the O365 Security and Compliance Center.
{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "clientId": {
                "type": "string"
            },
            "contentCreated": {
                "type": "string"
            },
            "contentExpiration": {
                "type": "string"
            },
            "contentId": {
                "type": "string"
            },
            "contentType": {
                "type": "string"
            },
            "contentUri": {
                "type": "string"
            },
            "tenantId": {
                "type": "string"
            }
        },
        "required": [
            "clientId",
            "contentCreated",
            "contentExpiration",
            "contentId",
            "contentType",
            "contentUri",
            "tenantId"
        ]
    }
}


 

  1. Next, we will add 3 Compose actions where we will store our values for client id, client key and tenant. For both client id and client key you should have these values from when you created your Azure AD application. Your tenant id can be retrieved by following one of these approaches.

Note: We chose to use Compose actions instead of variables as there is less of a performance hit and these are values that we will not need to further manipulate.

  1. Our next step is to retrieve an auth token that we can use to retrieve event details from the O365 Security and Compliance Center. We will use the values that we captured in our Compose actions and construct a URI that includes our Tenant ID. Our Header will include a Content-Type of application/x-www-form-urlencoded. Lastly, we need to provide key/value pairs that include our Client ID, Client Secret, Resource and Grant Type.

 

  1. We need to use the token that is returned in downstream actions, so we will add a Parse JSON action that will use this HTTP response as an input. The following schema can be used to give our response a message shape.
{
 "type": "object",
 "properties": {
  "token_type": {
   "type": "string"
  },
  "expires_in": {
   "type": "string"
  },
  "ext_expires_in": {
   "type": "string"
  },
  "expires_on": {
   "type": "string"
  },
  "not_before": {
   "type": "string"
  },
  "resource": {
   "type": "string"
  },
  "access_token": {
   "type": "string"
  }
 }
}

 

  1. Our HTTP Trigger will only provide us with a message that describes the event that occurred inside the Office 365 Security and Compliance Center. It won't provide us with actual details about the event. To get those details we need to make a subsequent call to the Office 365 Management API. We will accomplish this by using the HTTP action and performing a GET request to the URI that was provided as part of the inbound message. The expression that we can use to retrieve this value is triggerBody()[0]?['contentUri']. We also need to provide an Authorization header that includes a Bearer token retrieved from our previous Parse Token Response action. In addition, we need to specify a Content-Type of application/json.

 

  1. We now need to parse our response from the Office 365 Management API so we can explore the results. Once again we will use the Parse JSON action and this time we will provide the following schema:
{
 "type": "array",
 "items": {
  "type": "object",
  "properties": {
   "CreationTime": {
    "type": "string"
   },
   "Id": {
    "type": "string"
   },
   "Operation": {
    "type": "string"
   },
   "OrganizationId": {
    "type": "string"
   },
   "RecordType": {
    "type": "integer"
   },
   "ResultStatus": {
    "type": "string"
   },
   "UserKey": {
    "type": "string"
   },
   "UserType": {
    "type": "integer"
   },
   "Version": {
    "type": "integer"
   },
   "Workload": {
    "type": "string"
   },
   "ObjectId": {
    "type": "string"
   },
   "UserId": {
    "type": "string"
   },
   "FlowConnectorNames": {
    "type": "string"
   },
   "FlowDetailsUrl": {
    "type": "string"
   },
   "LicenseDisplayName": {
    "type": "string"
   },
   "RecipientUPN": {
    "type": "string"
   },
   "SharingPermission": {
    "type": "integer"
   },
   "UserTypeInitiated": {
    "type": "integer"
   },
   "UserUPN": {
    "type": "string"
   }
  },
  "required": [
   "CreationTime",
   "Id",
   "Operation",
   "OrganizationId",
   "RecordType",
   "ResultStatus",
   "UserKey",
   "UserType",
   "Version",
   "Workload",
   "ObjectId",
   "UserId",
   "FlowConnectorNames",
   "FlowDetailsUrl",
   "LicenseDisplayName",
   "RecipientUPN",
   "SharingPermission",
   "UserTypeInitiated",
   "UserUPN"
  ]
 }
}

 

  1. The Parse Log Event can retrieve multiple events from Office 365. As a result, we need to loop through the Body that is returned from the Parse Log Event. This loop will get added as soon as we use a data element from the Parse Log Event output.
  2. Since Microsoft Flow events are captured within the Audit.General content type inside of the Office 365 Security and Compliance Center, we will now want to perform some logic that focuses on the Microsoft Flow CreateFlow and EditFlow events. To accomplish this, we will add an advanced condition that includes an or statement that looks for either CreateFlow or EditFlow events.

@or(equals(items('Apply_to_each_2')['Operation'], 'CreateFlow'),equals(items('Apply_to_each_2')['Operation'], 'EditFlow'))

  1. Next, we want to see if the Office 365 Outlook Connector is being used within this Flow that created the audit event. We can achieve this by seeing if the FlowConnectorNames attribute (within the Parse Log Event) contains Office 365 Outlook.

  1. If the list of connectors does include the Office 365 Outlook connector then we want to further explore whether the Forward Email action is being used since that is the action that we want to prevent our users from using. In order to determine if a Flow Definition does contain the ForwardEmail action we need to capture the Environment ID and Flow ID. To get the Environment ID we will use a Compose Action and use an expression to parse it from the FlowDetailsUrl attribute that can be found within Parse Log Event – Body array. The expression we want to use is:

substring(replace(item()?['FlowDetailsUrl'],'https://admin.flow.microsoft.com/environments/',''),0,indexOf(replace(item()?['FlowDetailsUrl'],'https://admin.flow.microsoft.com/environments/',''),'/'))

  1. We will use a similar approach to retrieve the Flow ID, but our expression will be:

replace(substring(item()?['FlowDetailsUrl'],lastIndexOf(item()?['FlowDetailsUrl'],'/'),sub(length(item()?['FlowDetailsUrl']),lastIndexOf(item()?['FlowDetailsUrl'],'/'))),'/','')

  1. In an upcoming step, we want to add our Principal ID as an owner of the flow that we want to inspect so that we can retrieve the flow definition. To obtain our Principal ID we can use the Office 365 Users connector and the Get my profile (V2) action to provide this attribute.
  2. We can use the Id returned from the Get my profile (V2) action with our outputs from the Get Environment and Get Flow ID compose actions to add our account as an owner of this flow.

 

  1. Being an owner of the flow is important so that we can retrieve the flow definition to determine whether or not the Forward Email action is being used. We can retrieve the flow definition by using the Flow Management connector and using the Get Flow action. Once again we need to use the outputs from the Get Environment and Get Flow ID compose actions as inputs to this action.

  1. We are going to inspect the flow definition for a swaggerOperationId that is equal to ForwardEmail, but before we do that we need to cast the JSON flow definition to a string. We can do this by using the following expression: string(body('Get_Flow')['properties']['definition']). Once we have it cast, we can see if it contains "swaggerOperationId":"ForwardEmail".

  1. If the flow definition does include the ForwardEmail action then we want to perform some additional steps in the If yes branch.
  2. As you have seen, the Environment ID is an attribute that we have used within this flow. But, we have not used the Environment Name, since it isn’t a data attribute that is available to us at this point. However, we can access this attribute by using the List My Environments action that is part of the Flow Management connector.

  1. By calling the List My Environments action, all of the environments that our user has access to will be returned. Since we cannot filter using the existing connector, we can add a Filter array action and filter on the Environment Name attribute by comparing it to the Environment ID that we have previously captured.

  1. Since the Filter array action will return a list of items that match our criteria, we will want to access the first instance using an expression body('Filter_array')[0]?['properties']?['displayName'] which will take the first index of our array. Since Environment IDs are unique, this approach is safe.
  2. With our Environment Display Name now available, we can pass this attribute and others into an approval that we will use to determine whether or not any corrective action is required. In addition, we will include the Flow Display Name, Environment ID, User UPN (from Parse Log Event) and Connectors Used (from Parse Log Event).

  1. Next, we will wait for an approval by adding a condition to our flow. Provided the Response is equal to Approve we will use the Stop Flow action that is part of the Flow Management connector to stop that flow.

To view the entire flow, please click on the following link.

 

Creating Office 365 Management API Webhook

With our flow now complete, there is something that we need to do before we create our Webhook subscription. We need the URL that is part of our HTTP Request Trigger which we can copy by clicking on the following icon.

To complete the next couple steps we are going to need to call the Office 365 Management APIs and as a result will benefit from a tool called Postman.

We need to generate an access token that we can use to create our Webhook subscription. To do this, we need access to our Code that is returned from our consent call.

  1. To obtain this code populate client_id and redirect_uri with your values and enter this into a web browser.

https://login.windows.net/common/oauth2/authorize?response_type=code&resource=https%3A%2F%2Fmanage.office.com&client_id={your_client_id}&redirect_uri={your_redirect_url}

  1. When the webpage resolves, there will be a query parameter called code returned in the URL. Copy this value for use in the next step.

Note: At the end of the URL returned from the web browser, there may be a session_state query parameter also returned. This value is not required and should not be included in the next step.

  1. We now need to construct an HTTP request that we will send to https://login.windows.net/common/oauth2/token that looks like the following image that will provide us with an access_token that we will use when creating our webhook. As part of this request we will need to provide data from our Azure AD application that we previously created including client_id, client_secret and our redirect_uri. In addition to these values, we also need to include a resource of https://manage.office.com, a grant_type of authorization_code and our code from our previous step.

  1. Next up is creating our Webhook subscription. To do this we will need to copy out the access_token from our response. Inside of Postman, open a new tab and construct a new POST request to https://manage.office.com/api/v1.0/{your_tenant_id}/activity/feed/subscriptions/start?contentType=Audit.General

Note: We are including a query parameter of contentType that has a value of Audit.General. As mentioned previously, the flow events show up under this content type.

The Headers that we need to include are an Authorization header with a value of Bearer <access_token>. Recall this is the access_token from our previous step. We also want to provide a Content-Type of application/json.

We aren’t quite done yet. We also need to provide a Body where we will include our Flow Request URL and a value for authId.

{
 "webhook" : {
  "address": "Enter your Flow Request URL here",
  "authId": "Enter an arbitrary value here",
  "expiration": ""
 }
}
  1. When we submit this request, we can expect to receive a response like the one below which indicates that our webhook has been created successfully.
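If you would rather script these two calls than use Postman, the following is a hedged sketch of the same token request and subscription request in PowerShell. All of the values at the top are placeholders that come from the earlier steps.

# Sketch: exchange the authorization code for a token, then start the Audit.General subscription.
$clientId     = "<Application ID>"
$clientSecret = "<Key value>"
$redirectUri  = "<Reply URL>"
$tenantId     = "<Tenant ID>"
$authCode     = "<code query parameter from the consent redirect>"
$flowUrl      = "<HTTP Request trigger URL copied from the flow>"

$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.windows.net/common/oauth2/token" `
    -Body @{
        grant_type    = "authorization_code"
        code          = $authCode
        client_id     = $clientId
        client_secret = $clientSecret
        redirect_uri  = $redirectUri
        resource      = "https://manage.office.com"
    }

$subscription = @{
    webhook = @{
        address    = $flowUrl
        authId     = "FlowGovernanceWebhook"    # arbitrary value
        expiration = ""
    }
} | ConvertTo-Json

Invoke-RestMethod -Method Post `
    -Uri "https://manage.office.com/api/v1.0/$tenantId/activity/feed/subscriptions/start?contentType=Audit.General" `
    -ContentType "application/json" `
    -Headers @{ Authorization = "Bearer $($tokenResponse.access_token)" } `
    -Body $subscription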

 

Testing

We are now ready to test! To test our new governance process, we will sign into Microsoft Flow with a different user account. We will subsequently create a new flow that includes an Outlook trigger and has a Forward Email action.

  1. Upon saving this flow, an event will be raised within the Office 365 Security & Compliance Center within approximately 20 minutes and our webhook subscription will be invoked.
  2. We should now have an approval waiting for us.

  1. We will go ahead and approve this request. When we do, we will see that this flow has been stopped from further processing.

 

Conclusion

In this blog post we explored some powerful capabilities that exist within the Office 365 Management APIs and the Flow Management Connector. Using the combination of these two platforms allows for a customized governance experience. This allows organizations to build governance solutions on top of what Microsoft already provides out of the box.

In addition to the scenario that we just built, this solution can be extended to support other scenarios that you want to govern, including other connectors or actions that you want to restrict.

 

Other Considerations

  • In this post we described how to receive events from the Office 365 Security and Compliance Center using a webhook approach. There is also the option to use a polling approach like we covered in a previous blog post; a sketch of polling the unified audit log from PowerShell follows after this list.
  • We only covered one scenario where we parsed our flow definition. If you wanted to build a more comprehensive parsing solution, you can build an Azure Function and pass the flow definition into the function where your logic is executed.
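To illustrate the polling alternative mentioned above, here is a hedged sketch that queries the unified audit log from the Security & Compliance (Exchange Online) PowerShell session instead of relying on a webhook. It assumes you have already connected to that PowerShell endpoint; the time window and result size are arbitrary.

# Sketch: poll the unified audit log for recent Microsoft Flow create/edit events.
$results = Search-UnifiedAuditLog `
    -StartDate (Get-Date).AddHours(-24) `
    -EndDate   (Get-Date) `
    -RecordType MicrosoftFlow `
    -Operations CreateFlow, EditFlow `
    -ResultSize 1000

# Each record carries the event payload as JSON in the AuditData property
$results | ForEach-Object {
    $data = $_.AuditData | ConvertFrom-Json
    [pscustomobject]@{
        Operation  = $data.Operation
        User       = $data.UserId
        Connectors = $data.FlowConnectorNames
    }
}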

Intermediate | Flow of the Week: Automating Change Management Processes
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/intermediate-flow-of-the-week-using-try-catch-to-build-robust-flows/ | Wed, 06 Jun 2018

The post Intermediate | Flow of the Week: Automating Change Management Processes appeared first on Microsoft Power Platform Blog.

]]>
Change Management processes are important to organizations to ensure work is scheduled, prioritized, repeatable and that oversight has been applied. However, a problem may arise if organizations layer too many levels of bureaucracy on top of these processes. But, the good news is that Microsoft Flow can automate many Change Management processes that allow organizations to achieve their Change Management goals, without accruing inefficiencies.

For many organizations, the IT Change Advisory Board (CAB) is a weekly event. The change board has many purposes, including:

  • Providing a communication vehicle for teams to discuss what activities they are about to perform
  • Prioritizing activities in the queue
  • Understanding impacts to dependent services
  • Applying governance to ensure regulatory and financial compliance
  • Monitoring and tracking the execution of changes

Unfortunately, some organizations start to layer on inefficiencies and red tape, which leads to a lot of bureaucracy. Inefficient CAB processes may lead to disengaged members as the value of change board meetings drops. These inefficiency losses only impair the original intents we described above and lead to more mistakes, outages and compliance exposure.

There are many opportunities to automate Change Management processes in ways that can actually improve the efficiency, engagement, compliance and success of changes. The scenario I am about to describe is one of many ways in which organizations can automate a change board process. To aid in the illustration of this scenario, we will use ServiceNow as our IT Service Management system and use the Microsoft Flow connector to interact with it.

Most CAB meetings occur at the same time every week. In order for people to know whether their change has been approved, they need to do one of the following:

  • Attend the CAB meeting to see if their change has been approved.
  • Wait for an email from their team’s representative indicating that their change has been approved or rejected.
  • Log in to the IT Service Management application and look for their ticket to see if it has been approved.

Instead, a scheduled process can be set up within Microsoft Flow that performs the following functions:

  • Add a Recurrence Trigger and have it configured to run at a relevant time after your CAB meeting.
  • Initialize an Array called Changes
  • Create a connection to your ServiceNow instance and select the List Records action. Select Record Type of Group and then provide a Query of name=Network. In this case, name represents the display name of an Assignment group in ServiceNow called Network. You can change this based upon the Assignment Group that you would like to target. 

Note: For greater clarity, I have renamed the List Records action to be more descriptive by calling it List Assignment Groups.

  • The List Assignment Groups action will return an array. In this case we expect only 1 record as we have provided a query. As a result, when we go to use the output of our previous action, an Apply to each loop is added. Once again, we want to use the List Records action when connecting to ServiceNow. This time we are going to target the Change Request table and look for records that have been approved, using a Query of assignment_group=<Sys ID returned from our List Assignment Groups call>^state=2, where a state of 2 in this case means scheduled.

Note: The ^ represents the logical & operator in ServiceNow’s query language.
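
If you want to sanity-check the encoded query outside of Flow, the same query string can be passed to ServiceNow's Table API; a rough illustration with placeholder values:

GET https://<your_instance>.service-now.com/api/now/table/change_request?sysparm_query=assignment_group=<sys_id>^state=2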

  • At this point, we will have all of the approved change requests for the Network team. But it would also be useful to know who the owner of the ticket is, to ensure there is clear accountability for who is responsible for the execution of the change. To accomplish this, we will add a Get Record action and pass in the Assigned to value that was returned from the List Approved Change Requests action. In this particular case, the Assigned to field contains a reference to the assigned user but does not include a user-friendly display name. By retrieving the User record we can access their user profile.

  • We can now assemble a composite message which will include the outputs from our List Approved Change Requests and Get (User) Record actions. To do this, we will use the Variables – Append to array variable action, which I have renamed for my scenario.

Note: This action takes place within our parent loop, as we need to do this for each change request that has been returned.
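
For illustration, each entry appended to the Changes array could look something like the following; the column names here are hypothetical and should be swapped for whichever ServiceNow fields matter to your team:

{
 "Number": "<change request number>",
 "Short description": "<short description>",
 "Planned start": "<planned start date>",
 "Assigned to": "<display name from the Get Record (User) action>"
}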

  • Our next step is to assemble an HTML table that we can use as part of our output.

  • When it comes to outputs, there are several options including posting to a Microsoft Teams Channel or sending an email.

Testing

If we go ahead and run this flow, we will see the following outputs: the first for Microsoft Teams and the second for email, which includes CSS markup.

Bonus Content

Since this is a Scheduled flow, this also means that we can invoke this flow from within Microsoft Teams itself through the Flow Bot for Microsoft Teams which we introduced this past January. This allows the owner of this flow to kick off the flow and communicate with the broader team with a few keystrokes.

Conclusion

This was just one example of a Change Management process that can be automated. We have created ServiceNow templates to help get you started on this and other scenarios which allow you to improve the efficiencies of both your Incident and Change Management processes.

What are some other Change Management processes that you would love to see automated? Please let us know in the comments below.

The post Intermediate | Flow of the Week: Automating Change Management Processes appeared first on Microsoft Power Platform Blog.

]]>
Advanced | Flow of the Week: Filtering Data with OData http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/advanced-flow-of-the-week-filtering-with-odata/ Wed, 28 Feb 2018 17:38:08 +0000 A question that we frequently receive is how can I filter out data before it gets to Microsoft Flow? The answer to this question is: OData filter queries. In this blog post we are going to cover some of the most popular OData filter queries using some of our most popular connectors including SQL Server, Dynamics 365 and SharePoint Online.

The post Advanced | Flow of the Week: Filtering Data with OData appeared first on Microsoft Power Platform Blog.

]]>
OData (Open Data Protocol) is an OASIS standard that establishes best practices for designing RESTful APIs. One of the capabilities of OData is the ability to filter data using a standardized method across RESTful APIs, regardless of whether they are vendor-provided or custom developed. Since Microsoft Flow’s connectors are built upon RESTful APIs, many of our connectors support the ability to filter datasets server-side using OData. The benefits of using OData include reducing the amount of data you bring into your flow, which in turn reduces the need to loop through a record set to find values of interest.

In this blog post we are going to explore some popular OData filter expressions that you can use with some of our most popular connectors including SQL Server, Dynamics 365 and SharePoint Online.

Scenario #1: Get Rows from SQL Server and filter on Customer Name

We have the following Azure SQL database with a table that contains many work orders. From Microsoft Flow, we want to return only rows where the Customer Name is equal to 'Contoso'.

Inside of Microsoft Flow, we can add a SQL Server – Get Rows action. After providing a Table name, we also have the ability to provide a Filter Query. Inside this textbox we will provide a statement of CustomerName eq 'Contoso'. The breakdown of this syntax is that we need to provide the name of the field in the source system (i.e. SQL Server), followed by an operator. In this case we want an equality comparison, which is represented as eq in OData; don't use the = symbol itself, otherwise you will get a runtime error. Lastly, we need to provide the value that we want to filter on, in this case Contoso. Since it is a string, we need to wrap it in single quotes ' '.
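
Under the covers, the Filter Query text is passed along as an OData $filter, so the request issued by the connector resembles something like the following (the exact URI is managed by the connector; this is only to show the mapping):

GET <table endpoint>?$filter=CustomerName eq 'Contoso'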

For the purposes of this blog post, we will wrap the results in HTML and send them via Office 365 Outlook connector so we can verify our results.

After the flow executes, we will see our results rendered successfully and only records with a Customer Name of Contoso are displayed.

Scenario #2: Get Rows from SQL Server and filter on date

In this scenario we want to filter out older records and only retrieve records that have a Work Order Create Date that is less than 30 days old. To accomplish this we will also use a flow expression that calculates the date 30 days ago. We will then look for any records that have a Work Order Create Date that is greater than this date. The complete expression is: WorkOrderCreatedTime gt addDays(utcnow('yyyy-MM-ddTHH:mm:ssZ'),-30). In this scenario, WorkOrderCreatedTime is our source field, gt represents our 'greater than' operator and addDays(utcnow('yyyy-MM-ddTHH:mm:ssZ'),-30) calculates the date 30 days prior.

The results only include records that are less than 30 days old.

Scenario #3: List Records from Dynamics 365 using an AND clause

We will now move on to the Dynamics 365 connector, where we can also use OData to filter out records. In this case we want to retrieve only records where the Account Name is Contoso Hospital AND the City is Phoenix.

To accomplish this we will use an AND clause that lets us join two statements: the first being our (Account) name being equal to 'Contoso Hospital', and the second being our address1_city being equal to 'Phoenix'. Our complete statement is name eq 'Contoso Hospital' and address1_city eq 'Phoenix'.

When we execute our flow, we will see results only related to the Contoso Hospital in Phoenix.

 

Scenario #4: List Records from SharePoint Online that Starts With

In our final scenario, we are going to filter records from a custom SharePoint list. In this particular example, we have 4 records within a SharePoint list and we want to filter on all sites that start with the word 'Contoso'.

From a flow perspective, we will include the following OData query within our SharePoint action: startswith(Title,'Contoso'), where Title is the name of the column that we want to filter on and Contoso is the value we want the column to start with.

When our flow runs, we will discover that only the Site Names that begin with the word Contoso are included in our results.

Conclusion

In this blog post we covered 4 different OData queries across 3 different connectors: SQL Server, Dynamics 365 and SharePoint Online. While the syntax is a little different from what you may be used to in T-SQL, it unlocks new ways to filter your data in Microsoft Flow. Using OData to filter at the data source reduces execution times, since it removes the need to loop through data sets in order to find specific records. So not only is this more efficient by sending smaller messages around, it will also allow your flows to run faster.

For more examples of OData filter expressions, please check out the following Microsoft page.

The post Advanced | Flow of the Week: Filtering Data with OData appeared first on Microsoft Power Platform Blog.

]]>
Advanced | Flow of the Week: What interests my boss, fascinates me! http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/flow-of-the-week-what-interests-my-boss-fascinates-me/ Thu, 01 Feb 2018 14:58:13 +0000 Wouldn’t it be great if there was a way for a cognitive engine to process your emails, pull out key phrases and then execute a search engine query and return the top 3 results? Well there is, and these steps can be orchestrated using Microsoft Flow which is the inspiration for this blog post.

The post Advanced | Flow of the Week: What interests my boss, fascinates me! appeared first on Microsoft Power Platform Blog.

]]>
Recently I spent some time at one of Microsoft Canada’s field offices and was talking with Mark Speaker, an Industry Solutions Executive in Calgary, about Microsoft Flow use cases. Mark has many progressive ideas about how Flow can be used within the asset-intensive industries (Oil & Gas, Mining, Utilities) that he focuses on, but he also had some ideas about how he could leverage Flow for personal productivity. Being an Industry Solutions Executive requires Mark to be on top of many trends that impact his customers and the Canadian Microsoft subsidiary.

When I was chatting with Mark, he brought up Microsoft’s abilities in cognitive computing and felt there must be a way “to leverage this technology in order to help him and customers scale.” One use case he identified was managing communication that he receives from his leader, Sarah Kennedy. There is an old saying, “What interests my boss fascinates me,” and that is the inspiration for this blog post.

Naturally, Mark pays close attention to his leader’s email, but between customer meetings and traveling across the Canadian subsidiary, Mark wants to ensure he can stay on top of the email that he receives. In the event there are emails that require further research, Mark wants quick access to information without suffering from search engine fatigue.

Wouldn’t it be great if there was a way for a cognitive engine to process your emails, pull out key phrases and then execute a search engine query and return the top 3 results? Well there is, and these steps can be orchestrated using Microsoft Flow. With this as our requirements, I set out to build such a flow.

There are a lot of steps in building this flow. In this blog post, I will focus on the high-level concepts, but you can download an exported version of this flow here.

Building the flow

  • We want to run this flow every day, so we will use the Recurrence trigger, which ensures it is instantiated each day.

  • The next step that we want to run is the Office 365 Outlook – Get emails action. This action will allow us to retrieve the last 10 emails. We can also provide a Search Query, much like we can in the Outlook client. This is where you can provide a from: parameter indicating the person whose emails you want to target.
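
For example, a Search Query along these lines targets the last emails from a specific sender (the address is a placeholder):

from:sarah.kennedy@contoso.com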

  • To simplify our processing, we will create 3 variables that we can use to store information like the News articles returned by Bing, our Key Phrases returned from Cognitive Services and a variable that will allow us to count our Key Phrases result.

  • Next, we will loop through each of the emails that we have received.

  • Since most of the emails received these days tend to be in HTML format, we can use the Html to text action to perform a conversion as we don’t want to pass HTML markup to the Key Phrases Cognitive API.

  • We are now ready to send our cleansed email body to the Key Phrases API. Optionally, we can also provide a targeted Language.

Note: you can obtain a free trial API key for the Text Analytics API (which includes Key Phrases) here. Alternatively, you can also obtain an API key from the Azure Portal if you have an Azure Subscription.
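
For context, the Key Phrases action calls the Text Analytics API; a request takes roughly this shape (the region, API version and key are placeholders and may differ for your deployment):

POST https://<region>.api.cognitive.microsoft.com/text/analytics/v2.0/keyPhrases
Ocp-Apim-Subscription-Key: <your Text Analytics API key>
Content-Type: application/json

{
 "documents": [
  { "id": "1", "language": "en", "text": "<cleansed email body>" }
 ]
}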

  • As you can imagine, large emails may contain many key phrases and passing in a large number of phrases may result in an inaccurate search result. To mitigate this, we will cap our key phrases at 10 to have a more relevant result set. But, until our keyPhraseCount reaches 10, we will append these key phrases together which will make up our query for Bing.

  • There is an out of box Bing API connector, but for this scenario I have opted to use the HTTP action so I can control additional parameters including the cc query parameter which will localize the results. For example, if Mark is interested in more information about a carbon tax, it is important to return results in his locale as opposed to another jurisdiction which won’t have the same level of impact. We are also going to include a count parameter and assign a value of 3 to limit the result set to 3.

Note: The Bing Search API also requires an API Key. You can sign up for a free trial here.
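
The HTTP action therefore ends up issuing a request along these lines (the query, market code and key are placeholders):

GET https://api.cognitive.microsoft.com/bing/v7.0/search?q=<key phrases>&cc=CA&count=3
Ocp-Apim-Subscription-Key: <your Bing Search API key>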

  • Since we are not using the out-of-box Bing connector, we need to use the Parse JSON action to have a typed message that we can use in downstream connectors. We also want to put in a defensive check to see if there is a response. If there are no results from our web search, we want to prevent a failure from occurring, so when there are no results we will construct a “No-Results-Found” message and add it to our news array so that Mark knows there was no result for a specific email. If results are returned, we will iterate through them and append them to our news array.
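
As a hedged illustration, the defensive check can be a Condition that tests whether the webPages collection of the parsed Bing response is empty, using an expression along these lines (the Parse JSON action name is whatever you called yours):

empty(body('Parse_JSON')?['webPages']?['value'])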

  • We don’t want to include all of the data (attributes) from our Bing search results so we will use the Select Columns action to include only Subject, Name and URL.

  • Next, we want to use the Create HTML table action that will convert our array into an HTML table.

  • Lastly, we will use the Send an email action to send our digest out to our interested user. We can also include formatting by embedding CSS selectors.

Testing

To test this flow, I did what any flow user would do…automate it using Microsoft Flow! By clicking a button, I can send 10 different emails that represent the types of emails that a person expects to receive.

  • As an example, there has been activity in Western Canada recently about Shale Gas projects that would be of interest to both Sarah and Mark, as they support this customer segment. Therefore, I have included an example of such a scenario. For American Football fans – I have you covered too!

  • To run our test harness, we simply click on the …More dropdown and then select Run now. Alternatively, we can also pull out our mobile device and open the Microsoft Flow mobile app and click our virtual button.

  • Within a few seconds, we will see our 10 emails show up in our inbox.

  • With our emails staged, we are now ready to launch our actual “What interests my boss fascinates me” flow. We can either wait for our schedule to be reached or can manually kick it off as well.

If we explore the run details of the flow we will discover a few interesting items:

  • For our quantum computing email, the Text Analytics API was able to detect the following as Key Phrases:

  • When we call Bing, we assemble a useful query string from the detected key phrases.

  • Bing, in turn, will provide valid results.

  • The complete output by Microsoft Flow looks like the following:

Other Opportunities

I hope this blog post gave you some ideas about how you can leverage Microsoft Flow and Azure Cognitive Services to scale yourself, and your team. Here are some additional ideas that we came up with that may also be of interest:

1. “Keep me updated on my Interests” (without me explicitly telling you my Interests) 

2. Of all of the news being released today, what would most interest my specific customer?

3. What LinkedIn article should I post that would attract the most interest?

4. Show me something interesting I likely have never seen before.

5. Of all the people who have asked me for something, who should I get back to first? Send me a note when I’m really late.

6. When a customer posts something on LinkedIn, update their list of key interests in CRM for me.

7. Tell me if someone has sent me an email or text and they are angry.

8. Match the top trends to the interests of my customers based on their LinkedIn posts and profiles. Send me updates when trends that impact these customers have something interesting.

9. Who in my organization, of thousands of people, would be best matched to speak to certain people at my customers? Send me those names on demand and their relevant topics.

10. Find the internal PowerPoint presentations that match the interests of my customers by topic and industry. Send me links to the presentations on demand and when I request for certain customers. As new presentations are created on the topics I need to know that.

11. What are some current competitive threats for my customers?

12. Who are my customers currently interacting with on social media?

Conclusion

I find that “Digital Transformation” is such a loaded term these days, with many organizations abusing it. For me, I look at Digital Transformation as a way to change the way people work. One of these ways is through the democratization of technology that allows employees to scale through access to tools. This is a great example of both democratized access to technology and improving the way people work through the automation of tasks. Only 2 years ago, this scenario was infeasible as it would have required a developer to write a lot of code. Using Microsoft Flow, we can take advantage of other Microsoft investments, such as Azure Cognitive Services and Bing, in order to build cutting-edge applications on a coffee-cup budget.

I want to thank Mark for bringing this scenario forward and identifying some very cool use cases for Microsoft Flow. I also want to thank his leader, Sarah, for being a good sport about being included in this blog post as part of our scenario. I look forward to continuing to work with Mark to unlock more really innovative ideas using Microsoft Flow.

P.S. To never miss another blog post from the Flow blog – Use This Flow

The post Advanced | Flow of the Week: What interests my boss, fascinates me! appeared first on Microsoft Power Platform Blog.

]]>