Governance Archives - Microsoft Power Platform Blog

Introducing the Automation Kit for Power Platform
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/introducing-the-automation-kit-for-power-platform/ (Tue, 20 Sep 2022)
The Automation Kit for Power Platform is designed to help organizations manage, govern, and scale automation platform adoption based on industry best practices. The Automation Kit for Power Platform is now available to the public as an open source GitHub project.

To establish a successful automation culture, you typically need to build an Automation Center of Excellence (CoE) to maximize your organization’s investments and define guardrails to develop RPA and other hyperautomation scenarios for digital transformation in a controlled manner.  Back in December 2021, we blogged about HEAT (Holistic Enterprise Automation Techniques) and automation adoption best practices.

We also released a private preview version of the Automation Kit (previously the Automation CoE Starter Kit), which has been implemented by many customers across the globe during the early stages of its development.

Today, we are happy to introduce the Automation Kit for Power Platform, now available to the public as an open source GitHub project.

 

What is the Automation Kit?

The Automation Kit for Power Platform is designed to help organizations manage, govern, and scale automation platform adoption based on industry best practices. The toolkit is a collection of components and tools based on HEAT concepts and built using Power Apps and Power Automate, so you can easily extend and customize the kit to your needs.

The Automation Kit for Power Platform helps you accelerate your organization’s automation CoE. It includes a set of Power Apps applications and Power Automate flows that provide ready-made solutions to manage your automation projects, capture near-real-time value tracking, and gain insights into your automation initiatives. The kit was built based on feedback from customers across the globe using Power Automate as their hyperautomation and RPA platform of choice.

 

Case study: Cineplex accelerates establishing their Automation Center of Excellence with Automation Kit

Cineplex Inc. is a leading media and entertainment company that welcomes millions of guests through its 170+ cinemas and entertainment venues.

Bo Wang, Vice President of Taxation & Treasury, started using Power Automate desktop flows back in September 2020 to automate business processes within his team. After realizing the benefits in process efficiencies and time savings, he decided to set up the Automation Center of Excellence so that he could further scale the use of Power Automate across the entire organization. To do so, Bo and his team adopted the Automation Kit to manage and have visibility across the automation development lifecycle. The Automation Kit helped Cineplex:

  • Accelerate the development of their Automation Center of Excellence
  • Centrally manage the automation lifecycle from ideation to production
  • Get insights for both leadership and operations teams about return on investment (ROI) in automation

What’s included in the Automation Kit?

Automation Project Management

Use this app to manage the automation initiatives in your organization and define the metrics to calculate ROI.

Projects are managed in the Automation Center

 

Solution Metering

The kit offers capabilities to link Automation Project definitions with their Power Automate cloud flows, desktop flows and other artifacts in order to automatically capture and calculate the contributions of an automation.

Artifacts are attached to solution metering for automatic metric captures

Automation CoE Dashboard

The dashboard provides a holistic view of your automation projects, ROI, and goals. It includes multiple views and metrics that are automatically calculated and refreshed based on what you have defined in the Automation Projects.

Your organization can identify which automation projects to work on by using the complexity score and estimated savings, ensuring that automation investments are prioritized correctly.

Once the automations are established, the organization can then track the realized savings against its goals.

Dashboard showing the complexity score and list of projects
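
The dashboard calculates these metrics for you, but as a rough illustration of the underlying idea (not the kit's actual formula), a prioritization score can be thought of as estimated savings weighted against complexity. The project names and numbers in this Python sketch are made up:

# Illustrative only: rank automation candidates by estimated savings vs. complexity.
# The Automation Kit dashboard computes its own metrics; this is not its formula.
candidates = [
    {"project": "Invoice processing", "est_annual_savings_hours": 1200, "complexity": 3},
    {"project": "Ticket triage", "est_annual_savings_hours": 400, "complexity": 1},
    {"project": "Vendor onboarding", "est_annual_savings_hours": 900, "complexity": 5},
]

for c in candidates:
    # Higher estimated savings and lower complexity -> higher priority.
    c["priority"] = c["est_annual_savings_hours"] / c["complexity"]

for c in sorted(candidates, key=lambda c: c["priority"], reverse=True):
    print(f"{c['project']}: priority {c['priority']:.0f}")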

Where to start

Now that you have had a sneak peek at all the cool features the Automation Kit has to offer, here are some resources to get you started.

What’s next

We’re continuing to evolve and expand Automation Kit features based on customer feedback to support your ability to grow your organization’s automation maturity to enterprise scale.

We will regularly publish a prioritized list of features from our open-source backlog that we will work on and release in our next monthly update.

We are also starting regular office hours on Tuesday, October 11th, 7:00 AM – 8:00 AM PDT; you can register at https://aka.ms/ak4ppofficehours. We will showcase new and planned features and hold an “ask me anything”-style conversation to gather feedback about your use of the Automation Kit and prioritize the areas where we can provide the most impact for you.

Disclaimer

Although the underlying features and components used to build the Automation Kit (such as Microsoft Dataverse, admin APIs, and connectors) are fully supported, the kit itself represents sample implementations of these features. Our customers and community can use and customize these features to implement admin and governance capabilities in their organizations.

If you face issues with:

  • Using the kit: Report your issue here: aka.ms/automation-kit-issues. Microsoft Support will not help you with issues related to this kit, but they will help with related, underlying platform and feature issues.
  • The core features in Power Platform: Use your standard channel to contact Support.

The post Introducing the Automation Kit for Power Platform appeared first on Microsoft Power Platform Blog.

Change owner of a solution flow
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/change-owner-of-a-solution-flow/ (Mon, 20 Jun 2022)
You can now change the owner of a solution flow from the Power Automate portal. This feature enables owners, co-owners, and admins to change the owner of a solution flow to enable business continuity when the original owner switches teams or leaves the organization.

You can now reassign a solution flow to a new owner from the Power Automate portal. This feature enables owners, co-owners, and admins to change the owner of a solution flow to enable business continuity when the original owner is switching teams or leaving the organization.

You can change the owner to an individual or an Azure Active Directory service account. If the flow is using a service account, see here for guidance on licensing service accounts.

To change the owner, first select a solution flow and edit the flow details section:

Next, remove the current owner and search for the new owner:


If the flow is a scheduled or automated flow, then once the owner is changed, the flow will run under the license of the new owner and count against their Power Platform request limits. If the flow is a manual flow, it will run under the license of the user who runs it. The Plan section shows whose license plan is used by the flow.

This change is limited to solution flows. To change the owner of a non-solution flow, the flow must be exported and then imported by the new owner. Check out this video to learn how to export and import a flow as the new owner.

For more details, see the documentation here.

Happy Automating!

 

The post Change owner of a solution flow appeared first on Microsoft Power Platform Blog.

Governing Your Digital Transformation with the Power Platform
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/governing-your-digital-transformation-with-the-power-platform/ (Mon, 29 Jun 2020)
Learn about governance capabilities with the Power Platform. Our new Microsoft Mechanics video covers Security, Analytics, and Administration of the Power Platform.

As organizations accelerate their digital transformation, the need to rapidly create applications is increasing. Microsoft Power Platform has emerged as a leading low-code development platform that enables developers of all backgrounds and skill levels to participate in their organization’s digital journey. Over 97% of the Fortune 500 benefit from the Power Platform’s ease of use, wide set of features, and expansive governance capabilities, which ensure that an organization’s digital transformation adheres to its protocols. As low-code application development evolves and, in many enterprises, becomes the de facto application platform, the need to govern the creation of new applications becomes increasingly important.
The Power Platform is built on a firm foundation of governance. Our colleagues on the Microsoft Mechanics team were keen to learn more about how the Power Platform helps admins govern these low-code application environments. To this end, Jeremy Chapman recently interviewed Julie Strauss, Director of the Power Platform Administration and Governance team. Check out the interview, where Julie highlights the following governance capabilities available with the Power Platform:

• Rich analytics capabilities in the Power Platform admin center
• Custom dashboards built with the Power Platform Center of Excellence Starter Kit
• Layers of security:
  • Tenant level
  • User level
  • How to set up an environment
  • Data access controls with the Common Data Service
  • Data loss prevention policies

Take a few minutes and check out the full video.

 

Video 1.  Microsoft Mechanics video on Power Platform Governance

 

Go Deeper on Power Platform Governance

You can go deeper on the Power Platform governance capabilities by checking out our recent sessions at the Microsoft Business Applications Virtual Summit:

 

Also, for a great technical deep dive into Power Platform governance, check out our recently updated whitepaper.  To get more details on the concepts that Julie discussed in her Mechanics interview, check out our documentation on Power Platform governance considerations.

 
Join Our Community and Get Started Today

Join the growing Power Platform Community so you can get the latest updates, join discussions and get ideas on how the Power Platform can help your organization.  Also, be sure to check out the Power Platform Center of Excellence Starter Kit to learn how you can expand and customize your Power Platform governance capabilities.

Also, check out these other great Microsoft Mechanics videos featuring the Power Platform:

The post Governing Your Digital Transformation with the Power Platform appeared first on Microsoft Power Platform Blog.

Power Platform Security & Governance: Deploying a Defense in Depth Strategy
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/security-governance-strategy/ (Thu, 30 Aug 2018)
A common cyber security approach used by organizations to protect their digital assets is to leverage a defense-in-depth strategy. When customers ask how to best secure and govern their Microsoft Flow and PowerApps environments, we provide similar guidance. The following list represents different layers that you can use to protect your digital assets and apply governance to ensure your organization’s interests are met.

A common cyber security approach used by organizations to protect their digital assets is to leverage a defense-in-depth strategy. The SANS Institute defines defense-in-depth as “protecting a computer network with a series of defensive mechanisms such that if one mechanism fails, another will already be in place to thwart an attack.”

When customers ask how to best secure and govern their Power Platform environments (which include Microsoft Flow and PowerApps), we provide similar guidance. The following list represents different layers that you can use to protect your digital assets and apply governance to ensure your organization’s interests are met.

  • Secure data at rest: Microsoft Flow does not provide users with access to any data assets that they don’t already have access to, which means users should only be granted access to data they genuinely require. It also means that if a user has access to data through a web browser, they likely have access to it through Microsoft Flow as well. The Microsoft Flow team recommends a least-privilege approach to data access. The United States Computer Emergency Readiness Team describes least privilege as: “Every program and every user of the system should operate using the least set of privileges necessary to complete the job. Primarily, this principle limits the damage that can result from an accident or error.” Deploying least-privilege access is a good practice and a big part of an organization’s overall security hygiene.
  • Network Access Control: The National Institute of Standards and Technology (NIST) encourages organizations to inspect “inbound and outbound network traffic for specific IP addresses and address ranges, protocols, applications, and content types based on the organization’s information security policies.” While Microsoft Flow is a cloud-based application, organizations can still govern how connections are established when users are on the corporate network. For example, if an organization blocks access to a social media site from within its corporate network by blocking the sign-on page at the firewall, then when that same log-in page is launched from the Flow portal, the connection can also be blocked from being established.
  • Location-based Conditional Access: Organizations that want to govern where the Microsoft Flow service can be accessed from can set up Azure Active Directory Conditional Access policies to restrict which network addresses have access to the service. For additional information, please refer to the following presentation from the Microsoft Business Application Summit.
  • Data leakage can be avoided by configuring Data Loss Prevention (DLP) policies that allow an administrator to group connectors into Business data and Non-Business data groups. Connectors within a group can communicate with each other, but a single flow cannot use connectors that span the two groups. There are both design-time and runtime checks that enforce these policies.
  • Anomaly Detection is another common strategy used by organizations to understand user behavior. For example, if an organization usually creates 5 new flows every day and there is an exponential spike in flows being created, then it may be worth understanding what is driving that growth: is it legitimate usage, or is it a threat? How can this be detected? Microsoft recently released management connectors for Microsoft Flow, Microsoft PowerApps, and Microsoft Power Platform, and we also published a template that will automate the discovery of these assets. (A minimal detection sketch follows this list.)

  • NIST classifies Audit Trails as “a record of system activity both by system and application processes and by user activity of systems and applications. In conjunction with appropriate tools and procedures, audit trails can assist in detecting security violations, performance problems, and flaws in applications.” Microsoft Flow publishes the following audit trail events to the Office 365 Security and Compliance Center:
    • Created flow
    • Edited flow
    • Deleted flow
    • Edited permissions
    • Deleted permissions
    • Started a paid trial
    • Renewed a paid trial

As part of these audit events, the user who was involved in the event will be captured and in the case of create flow and edit flow events, the connectors used in these flows will also be captured.

 

  • Alerting is another line of defense that should be used to inform stakeholders when corporate policies have been broken. Much like we want Microsoft Flow users to automate their business processes, we also want to provide administrators with the same level of automation. One example is subscribing to the Office 365 Security and Compliance audit logs, which can be achieved through either a webhook subscription or a polling approach. By attaching Flow to these alerts, we can provide administrators with more than just email alerts: by leveraging the new management connectors or PowerShell cmdlets, corrective action can be automated, which allows administrators to remain productive as they protect their environment.
  • Education cannot be ignored as a layer of defense. Cybersecurity is about more than technology and processes; it is also highly dependent upon people. Phishing continues to be a popular avenue for attackers to exploit, in part because users click on links that they shouldn’t, often as the result of cleverly designed campaigns. End-user education remains another layer that organizations implement to prevent breaches, and Microsoft Flow users should also be educated on company cyber security policies to ensure this layer is not exploited.
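
As a minimal sketch of the anomaly detection idea above (illustrative only: the daily counts and the threshold are assumptions, and in practice you would source the counts from the management connectors or the audit logs), a simple baseline check in Python might look like this:

from statistics import mean, pstdev

def is_anomalous(history, today, min_sigma=3.0):
    """Flag today's flow-creation count if it is far above the recent baseline."""
    baseline = mean(history)
    spread = pstdev(history) or 1.0  # avoid a zero threshold on a perfectly flat history
    return today > baseline + min_sigma * spread

# Daily counts of newly created flows (e.g., gathered via the Flow management connector).
recent_daily_creates = [5, 4, 6, 5, 5, 7, 5]
print(is_anomalous(recent_daily_creates, today=42))  # True -> worth investigating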

Additional Resources

In this blog post we discussed several security layers that organizations should implement as they seek to govern and protect their environment. Beyond these, there are additional resources that organizations can leverage to protect their environments.

• PowerShell Cmdlets for PowerApps and Microsoft Flow: In May, we introduced PowerShell cmdlets that provide both user and admin functions to automate Application Lifecycle Management (ALM) and administrative tasks. We continue to update these PowerShell cmdlets based upon customer feedback. Please find the latest release here.

• PowerApps and Microsoft Flow Governance and Deployment Whitepaper: Released earlier this month, this whitepaper includes prescriptive guidance for deploying and managing the Power Platform. Topics within the whitepaper focus on the following areas:

  • Data Loss Prevention (DLP) Policies
  • PowerApps and Microsoft Flow Access Management
  • Automating Governance
  • Deployment Scenarios
  • Office 365 Security and Compliance Center
  • Importing and Exporting application packages
  • Licensing
• Power Platform Admin Center (coming soon): At the Business Application Summit in July, we announced a unified experience for managing Dynamics 365, PowerApps, Microsoft Flow, and CDS for Apps assets. One of the features of this new admin experience is Admin Analytics, which will give administrators insight into how flows and apps are used within their tenant.

The post Power platform Security & Governance: Deploying a Defense in Depth Strategy appeared first on Microsoft Power Platform Blog.

Advanced | Flow of the Week: Automating Microsoft Flow Governance – Using Microsoft Flow
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/automate-flow-governance/ (Thu, 23 Aug 2018)
Enterprise Security and Governance is an important topic for many organizations. Microsoft continues to make investments that allow customers to implement PowerApps and Flow and be confident that they have their bases covered from a governance perspective. Much like Microsoft Flow empowers users to build powerful workflow and automation solutions, we want to empower administrators with the same capabilities to support their needs. In this blog post we are going to explore a scenario that describes how you can automate governance activities by taking advantage of the Office 365 Management API.

Introduction

Enterprise Security and Governance is an important topic for many organizations. Microsoft continues to make investments that allow customers to implement PowerApps and Flow and be confident that they have their bases covered from a governance perspective. Much like Microsoft Flow empowers users to build powerful workflow and automation solutions, we want to empower administrators with the same capabilities to support their needs. In this blog post we are going to explore a scenario that describes how you can automate governance activities by taking advantage of the Office 365 Management API.

Overview

A scenario that we will walk through in this post is the ability to detect when specific events exist within a flow definition so that we can detect these events and provide pro-active governance against it. For example, some organizations would like to avoid users forwarding emails externally. Microsoft Exchange can block these scenarios through transport rules. But, using cloud workflow tools (including more than just Flow) you generally break down these actions into more discrete events. For example, I can receive an email and send an email within the same flow. Independently, these actions may not be perceived as forwarding an email, but from a functional perspective, they achieve the same result.

In order to detect these events, we will depend upon the Office 365 Security and Compliance logs, which capture events related to creating, editing, or deleting a flow. In a previous blog post, we discussed how to poll the Office 365 Security and Compliance PowerShell web service looking for these events. In this blog post, we are going to use an event-driven approach: we will create a webhook and have events sent to a Microsoft Flow endpoint. Once Microsoft Flow receives an event, we will fetch additional details about it, parse them, and perform some logic to determine whether a condition exists that warrants action, including stopping the flow that is of concern.

Pre-requisites

In this blogpost, we will be interacting with the Office 365 Management API and the Microsoft Flow Management connector. As a result, there are specific requirements for accessing these capabilities:

Office 365 Management API

  • Global Administrator Access
  • Azure AD Application
  • Get Office 365 tenant admin consent

Flow Management Connector

  • Global Administrator or Environment Admin
  • Microsoft Flow P2 license

Azure AD Application

The first thing that we need to do is create an Azure AD Application that we will use when calling the Office 365 Management API. For this blog post we are going to try to focus on the Microsoft Flow components as much as possible. For additional information on the Office 365 Management API, please see the following post.

To create an Azure AD Application:

  1. Navigate to the Azure Portal
  2. Select Azure Active Directory and then App registrations
  3. Create a New application registration
  4. Provide a Name for your application, Application type of Web app/API and a Sign-on URL.

Note: The Sign-on URL is an arbitrary value. You can even put a value of http://localhost

  1. Once the application has been created, you can click on Settings to further configure.
  2. Click on Properties and make a note of the Application ID as you will require it in a future step.
  3. While on the Properties screen, ensure the Multi-tenanted option is set to Yes.
  4. Click on Reply URLs and add a value. For this value you can provide an arbitrary URL, but having it resolve will simplify an upcoming step. For my example, I just used my blog http://www.middlewareinthecloud.com

  1. Next, click on Required permissions
  2. Click on Add – Select an API – Office 365 Management API
  3. Next, set the permissions as illustrated below.

  1. We now need to obtain a Key, which can be achieved by clicking on Keys.
  2. Provide a Description, Duration and click Save. Once you have done this, a Key Value will be generated. Copy this value for future use.
  3. Save and exit.

Note: If your key contains special characters like ‘/’ and ‘+’, you will get an invalid key error when you try to create a token in a subsequent step. These values need to be encoded and any online URL encoding website should be able to encode these values for you.

Get Office 365 tenant admin consent

In the Office 365 documentation, it calls out “a tenant admin must explicitly grant your application these permissions in order to access their tenant’s data by using the APIs”. As a result, a tenant admin must call the following URL in order to grant consent. In addition, the URL will return an authorization code that we will need in a future call.

Within this URL, there are two placeholders that we need to populate with information from our Azure AD application. The “{your_client_id}” placeholder refers to the Application ID that we recorded when creating our Azure AD application. The “{your_redirect_url}” placeholder refers to the Reply URL that we also provided when creating the Azure AD application.

https://login.windows.net/common/oauth2/authorize?response_type=code&resource=https%3A%2F%2Fmanage.office.com&client_id={your_client_id}&redirect_uri={your_redirect_url}

  1. With our URL formulated, we can use a web browser to make this call. Upon successfully calling this URL, you will be prompted with a consent dialog.

  1. Upon Accepting the terms, your Reply URL web page should be displayed.

Create Microsoft Flow Listener

With our Azure AD App created and consent granted to use the Office 365 Management API we are now going to create our webhook subscription within Office 365. But, before we do that we need to be able to provide a URL that can be called whenever there are events published from the O365 Management API. We will now create our flow and then we can use the URL that is provided as part of our HTTP trigger when configuring our webhook subscription.

  1. Create a Flow from blank and add an HTTP Trigger
  2. Since we want a typed message that can be used within our flow, we can provide a JSON Schema payload of an event we can expect to receive from the O365 Security and Compliance Center.
{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "clientId": {
                "type": "string"
            },
            "contentCreated": {
                "type": "string"
            },
            "contentExpiration": {
                "type": "string"
            },
            "contentId": {
                "type": "string"
            },
            "contentType": {
                "type": "string"
            },
            "contentUri": {
                "type": "string"
            },
            "tenantId": {
                "type": "string"
            }
        },
        "required": [
            "clientId",
            "contentCreated",
            "contentExpiration",
            "contentId",
            "contentType",
            "contentUri",
            "tenantId"
        ]
    }
}


 

  1. Next, we will add 3 Compose actions where we will store our values for the client ID, client key, and tenant ID. You should have the client ID and client key from when you created your Azure AD application. Your tenant ID can be retrieved by following one of these approaches.

Note: We chose to use Compose actions instead of variables as there is less of a performance hit and these are values that we will not need to further manipulate.

  1. Our next step is to retrieve an auth token that we can use to retrieve event details from the O365 Security and Compliance Center. We will use the values that we captured in our Compose actions and construct a URI that includes our Tenant ID. Our Header will include a Content-Type of application/x-www-form-urlencoded. Lastly, we need to provide key/value pairs that include our Client ID, Client Secret, Resource and Grant Type.
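
Outside of Flow, the same token request can be made with a few lines of Python. This is a hedged sketch: it assumes the client credentials grant, and tenant_id, client_id, and client_secret are placeholders for the values stored in the Compose actions above.

import requests

tenant_id = "<your-tenant-id>"
client_id = "<your-application-id>"
client_secret = "<your-client-key>"

# POST to the Azure AD token endpoint; requests sends the data form-encoded,
# matching the Content-Type of application/x-www-form-urlencoded described above.
response = requests.post(
    f"https://login.windows.net/{tenant_id}/oauth2/token",
    data={
        "grant_type": "client_credentials",  # assumption: app-only access to the Management API
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": "https://manage.office.com",
    },
)
access_token = response.json()["access_token"]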

 

  1. We need to use the token that is returned in downstream actions, so we will add a Parse JSON action that will use this HTTP response as an input. The following schema can be used to give our response a message shape.
{
 "type": "object",
 "properties": {
  "token_type": {
   "type": "string"
  },
  "expires_in": {
   "type": "string"
  },
  "ext_expires_in": {
   "type": "string"
  },
  "expires_on": {
   "type": "string"
  },
  "not_before": {
   "type": "string"
  },
  "resource": {
   "type": "string"
  },
  "access_token": {
   "type": "string"
  }
 }
}

 

  1. Our HTTP Trigger will only provide us with a message that describes the event that occurred inside the Office 365 Security and Compliance Center. It won’t provide us with actual details about the event. To get the actual details, we need to make a subsequent call to the Office 365 Management API. We will accomplish this by using the HTTP action and performing a GET request to the URI that was provided as part of the inbound message. The expression that we can use to retrieve this value is triggerBody()[0]?['contentUri']. We also need to provide an Authorization header that includes a Bearer token retrieved from our previous Parse Token Response action. In addition, we need to specify a Content-Type of application/json.
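
The equivalent call outside of Flow is a simple authenticated GET against the contentUri from the notification. A minimal Python sketch; content_uri and access_token are placeholders carried over from the previous steps:

import requests

# Placeholders: the contentUri from the webhook notification (triggerBody()[0]?['contentUri'] in Flow)
# and the access_token obtained in the previous step.
content_uri = "<contentUri-from-the-notification>"
access_token = "<access-token-from-the-previous-step>"

details = requests.get(
    content_uri,
    headers={
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    },
)
events = details.json()  # an array of audit records matching the schema shown below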

 

  1. We now need to parse our response from the Office 365 Management API so we can explore the results. Once again we will use the Parse JSON action and this time we will provide the following schema:
{
 "type": "array",
 "items": {
  "type": "object",
  "properties": {
   "CreationTime": {
    "type": "string"
   },
   "Id": {
    "type": "string"
   },
   "Operation": {
    "type": "string"
   },
   "OrganizationId": {
    "type": "string"
   },
   "RecordType": {
    "type": "integer"
   },
   "ResultStatus": {
    "type": "string"
   },
   "UserKey": {
    "type": "string"
   },
   "UserType": {
    "type": "integer"
   },
   "Version": {
    "type": "integer"
   },
   "Workload": {
    "type": "string"
   },
   "ObjectId": {
    "type": "string"
   },
   "UserId": {
    "type": "string"
   },
   "FlowConnectorNames": {
    "type": "string"
   },
   "FlowDetailsUrl": {
    "type": "string"
   },
   "LicenseDisplayName": {
    "type": "string"
   },
   "RecipientUPN": {
    "type": "string"
   },
   "SharingPermission": {
    "type": "integer"
   },
   "UserTypeInitiated": {
    "type": "integer"
   },
   "UserUPN": {
    "type": "string"
   }
  },
  "required": [
   "CreationTime",
   "Id",
   "Operation",
   "OrganizationId",
   "RecordType",
   "ResultStatus",
   "UserKey",
   "UserType",
   "Version",
   "Workload",
   "ObjectId",
   "UserId",
   "FlowConnectorNames",
   "FlowDetailsUrl",
   "LicenseDisplayName",
   "RecipientUPN",
   "SharingPermission",
   "UserTypeInitiated",
   "UserUPN"
  ]
 }
}

 

  1. The Parse Log Event can retrieve multiple events from Office 365. As a result, we need to loop through the Body that is returned from the Parse Log Event. This loop will get added as soon as we use a data element from the Parse Log Event output.
  2. Since Microsoft Flow events are captured within the Audit.General content type inside of the Office 365 Security and Compliance Center, we now want to perform some logic that focuses on the Microsoft Flow CreateFlow and EditFlow events. To accomplish this, we will add an advanced condition that includes an or statement looking for either CreateFlow or EditFlow events.

@or(equals(items('Apply_to_each_2')['Operation'], 'CreateFlow'),equals(items('Apply_to_each_2')['Operation'], 'EditFlow'))

  1. Next, we want to see whether the Office 365 Outlook connector is being used within the flow that generated the audit event. We can achieve this by checking whether the FlowConnectorNames attribute (within the Parse Log Event output) contains Office 365 Outlook.
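
In code, the same two checks (operation type and connector usage) might look like the Python sketch below. The field names come from the audit-record schema above; the sample records and their values are made up.

# Sample audit records shaped like the schema above (values are made up).
events = [
    {"Operation": "CreateFlow", "UserUPN": "maker@contoso.com",
     "FlowConnectorNames": "Office 365 Outlook, SharePoint"},
    {"Operation": "DeleteFlow", "UserUPN": "maker@contoso.com",
     "FlowConnectorNames": "Twitter"},
]

TARGET_OPERATIONS = {"CreateFlow", "EditFlow"}

def needs_review(event):
    # Mirror the Flow logic: a create/edit event whose flow uses the Office 365 Outlook connector.
    return (
        event.get("Operation") in TARGET_OPERATIONS
        and "Office 365 Outlook" in event.get("FlowConnectorNames", "")
    )

flagged = [e for e in events if needs_review(e)]
print(len(flagged))  # 1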

  1. If the list of connectors does include the Office 365 Outlook connector, then we want to further explore whether the Forward Email action is being used, since that is the action we want to prevent our users from using. In order to determine whether a flow definition contains the ForwardEmail action, we need to capture the Environment ID and Flow ID. To get the Environment ID, we will use a Compose action and an expression to parse it from the FlowDetailsUrl attribute that can be found within the Parse Log Event body array. The expression we want to use is:

substring(replace(item()?['FlowDetailsUrl'],'https://admin.flow.microsoft.com/environments/',''),0,indexOf(replace(item()?['FlowDetailsUrl'],'https://admin.flow.microsoft.com/environments/',''),'/'))

  1. We will use a similar approach to retrieve the Flow ID, but our expression will be:

replace(substring(item()?['FlowDetailsUrl'],lastIndexOf(item()?['FlowDetailsUrl'],'/'),sub(length(item()?['FlowDetailsUrl']),lastIndexOf(item()?['FlowDetailsUrl'],'/'))),'/','')
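
The two expressions above are easier to follow when translated into ordinary string handling. Here is a minimal Python sketch; the example URL shape is an assumption based on the admin.flow.microsoft.com prefix used in the expressions:

# Assumed shape: https://admin.flow.microsoft.com/environments/{environmentId}/flows/{flowId}
flow_details_url = "https://admin.flow.microsoft.com/environments/Default-1111/flows/abcd-2222"

prefix = "https://admin.flow.microsoft.com/environments/"

# Mirrors the substring/indexOf expression: everything between the prefix and the next '/'.
environment_id = flow_details_url.replace(prefix, "").split("/")[0]

# Mirrors the lastIndexOf expression: everything after the final '/'.
flow_id = flow_details_url.rsplit("/", 1)[-1]

print(environment_id, flow_id)  # Default-1111 abcd-2222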

  1. In an upcoming step, we want to add our Principal ID as an owner of the flow that we want to inspect so that we can retrieve the flow definition. To obtain our Principal ID we can use the Office 365 Users connector and the Get my profile (V2) action to provide this attribute.
  2. We can use the Id returned from the Get my profile (V2) action with the outputs from the Get Environment and Get Flow ID Compose actions to add our account as an owner of this flow.

 

  1. Being an owner of the flow is important so that we can retrieve the flow definition to determine whether or not the Forward Email action is being used. We can retrieve the flow definition by using the Flow Management connector and using the Get Flow action. Once again we need to use the outputs from the Get Environment and Get Flow ID compose actions as inputs to this action.

  1. We are going to inspect the flow definition for a swaggerOperationId that is equal to ForwardEmail, but before we do that we need to cast the JSON flow definition to a string. We can do this by using the following expression: string(body('Get_Flow')['properties']['definition']). Once we have it cast, we can check whether it contains "swaggerOperationId":"ForwardEmail".
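
If you ever need to run the same check outside of Flow, a slightly more structural alternative to the string-contains approach is to walk the definition and look for the swaggerOperationId directly. A hedged Python sketch; flow_definition is a minimal stand-in for the object returned by Get Flow, not the real definition shape:

def uses_operation(node, operation_id):
    """Recursively search a flow definition for a given swaggerOperationId."""
    if isinstance(node, dict):
        if node.get("swaggerOperationId") == operation_id:
            return True
        return any(uses_operation(value, operation_id) for value in node.values())
    if isinstance(node, list):
        return any(uses_operation(item, operation_id) for item in node)
    return False

# Minimal stand-in for body('Get_Flow')['properties']['definition'].
flow_definition = {
    "actions": {
        "Forward_an_email": {"metadata": {"swaggerOperationId": "ForwardEmail"}}
    }
}
print(uses_operation(flow_definition, "ForwardEmail"))  # True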

  1. If the flow definition does include the ForwardEmail action then we want to perform some additional steps in the If yes branch.
  2. As you have seen, the Environment ID is an attribute that we have used within this flow. But, we have not used the Environment Name, since it isn’t a data attribute that is available to us at this point. However, we can access this attribute by using the List My Environments action that is part of the Flow Management connector.

  1. By calling the List My Environments action, all of the environments that our user has access to will be returned. Since we cannot filter using the existing connector, we can add a Filter array action and filter on the Environment Name attribute by comparing it to the Environment ID that we have previously captured.

  1. Since the Filter array action will return a list of items that match our criteria, we will want to access the first instance using the expression body('Filter_array')[0]?['properties']?['displayName'], which takes the first index of our array. Since Environment IDs are unique, this approach is safe.
  2. With our Environment Display Name now available, we can pass this attribute and others into an approval that we will use to determine whether or not any corrective action is required. In addition, we will include the Flow Display Name, Environment ID, User UPN (from Parse Log Event) and Connectors Used (from Parse Log Event).

  1. Next, we will wait for an approval by adding a condition to our flow. Provided the Response is equal to Approve we will use the Stop Flow action that is part of the Flow Management connector to stop that flow.

To view the entire flow, please click on the following link.

 

Creating Office 365 Management API Webhook

With our flow now complete, there is something that we need to do before we create our Webhook subscription. We need the URL that is part of our HTTP Request Trigger which we can copy by clicking on the following icon.

To complete the next couple of steps, we are going to need to call the Office 365 Management APIs, and as a result we will benefit from a tool called Postman.

We need to generate an access token that we can use to create our Webhook subscription. To do this, we need access to our Code that is returned from our consent call.

  1. To obtain this code, populate client_id and redirect_uri with your values and enter the following URL into a web browser.

https://login.windows.net/common/oauth2/authorize?response_type=code&resource=https%3A%2F%2Fmanage.office.com&client_id={your_client_id}&redirect_uri={your_redirect_url}

  1. When the webpage resolves, there will be a query parameter called code returned in the URL. Copy this value for use in the next step.

Note: At the end of the URL returned from the web browser, there may be a session_state query parameter also returned. This value is not required and should not be included in the next step.

  1. We now need to construct an HTTP request, sent to https://login.windows.net/common/oauth2/token and shaped like the following image, which will provide us with an access_token that we will use when creating our webhook. As part of this request we need to provide data from the Azure AD application that we previously created, including client_id, client_secret, and our redirect_uri. In addition to these values, we also need to include a resource of https://manage.office.com, a grant_type of authorization_code, and our code from the previous step.
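
If you prefer a script to Postman, the same token request can be issued from Python. A hedged sketch; auth_code, client_id, client_secret, and redirect_uri are placeholders for the values described above:

import requests

auth_code = "<code-from-the-consent-redirect>"
client_id = "<your-application-id>"
client_secret = "<your-client-key>"
redirect_uri = "<your-reply-url>"

token_response = requests.post(
    "https://login.windows.net/common/oauth2/token",
    data={
        "grant_type": "authorization_code",
        "code": auth_code,
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,
        "resource": "https://manage.office.com",
    },
)
access_token = token_response.json()["access_token"]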

  1. Next up is creating our Webhook subscription. To do this we will need to copy out the access_token from our response. Inside of Postman, open a new tab and construct a new POST request to https://manage.office.com/api/v1.0/{your_tenant_id}/activity/feed/subscriptions/start?contentType=Audit.General

Note: We are including a query parameter of contentType that has a value of Audit.General. As mentioned previously, the flow events show up under this content type.

The headers that we need to include are an Authorization header with a value of Bearer <access_token> (recall that this is the access_token from our previous step) and a Content-Type of application/json.

We aren’t quite done yet. We also need to provide a Body where we will include our Flow Request URL and a value for authId.

{
 "webhook" : {
  "address": "Enter your Flow Request URL here",
  "authId": "Enter an arbitrary value here",
  "expiration": ""
 }
}
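
For reference, the same subscription call expressed in Python (a hedged sketch: tenant_id, access_token, and the flow request URL are placeholders, and the authId value is arbitrary, as noted above):

import requests

tenant_id = "<your-tenant-id>"
access_token = "<access-token-from-the-previous-step>"
flow_request_url = "<HTTP-trigger-URL-copied-from-the-flow>"

subscription = requests.post(
    f"https://manage.office.com/api/v1.0/{tenant_id}/activity/feed/subscriptions/start",
    params={"contentType": "Audit.General"},  # flow events show up under this content type
    headers={"Authorization": f"Bearer {access_token}"},
    json={
        "webhook": {
            "address": flow_request_url,
            "authId": "flow-governance-webhook",  # arbitrary value, per the note above
            "expiration": "",
        }
    },
)
print(subscription.status_code, subscription.json())
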
  1. When we submit this request, we can expect to receive a response like the one below which indicates that our webhook has been created successfully.

 

Testing

We are now ready to test! To test our new governance process, we will sign into Microsoft Flow with a different user account and create a new flow that includes an Outlook trigger and a Forward Email action.

  1. Upon saving this flow, an event will be raised within the Office 365 Security & Compliance Center within approximately 20 minutes and our webhook subscription will be invoked.
  2. We should now have an approval waiting for us.

  1. We will go ahead and approve this request. When we do, we will see that this flow has been stopped from further processing.

 

Conclusion

In this blog post we explored some powerful capabilities that exist within the Office 365 Management APIs and the Flow Management connector. Combining the two allows for a customized governance experience, letting organizations build governance solutions on top of what Microsoft already provides out of the box.

In addition to the scenario that we just built, this solution can be extended to support other scenarios that you want to govern, including other connectors or actions that you want to restrict.

 

Other Considerations

  • In this post we described how to receive events from the Office 365 Security and Compliance Center using a webhook approach. There are also options to use a polling approach like we covered in a previous blog post.
  • We only covered one scenario where we parsed our flow definition. If you wanted to build a more comprehensive parsing solution, you can build an Azure Function and pass the flow definition into the function where your logic is executed.

The post Advanced | Flow of the Week: Automating Microsoft Flow Governance – Using Microsoft Flow appeared first on Microsoft Power Platform Blog.
