Sameer Chabungbam, Author at Microsoft Power Platform Blog

Easier deployments of Custom Connectors
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/environment-variables-in-custom-connectors/
Wed, 19 Jan 2022

In this blog, I highlight new features for custom connectors, including support for environment variables in custom connectors. These features provide a better experience and enable seamless ALM (application lifecycle management) for custom connectors.

We continue to see great momentum in our connector ecosystem for the Power Platform. Every week and every month, we release new connectors and updates to many existing connectors. We now have over 600 certified connectors. Apart from the certified connectors, many enterprises use custom connectors – to connect to their internal APIs, or simply to augment connectivity where the out-of-the-box certified connectors are not enough.

In this blog, I want to highlight some of the recent updates we are releasing in our platform for custom connectors. These changes are being rolled out and will provide a better experience, enabling seamless ALM (application lifecycle management) for custom connectors. In Power Platform, Solutions are used to package components and deploy them to various environments, customizing them as needed. The updates include:

  • The ability to add any existing custom connector to a Solution
  • Support for referencing Environment variables in custom connectors
  • Support for Environment variables backed by Azure Key Vault secrets

With the above features, you can now easily enable end-to-end ALM for custom connectors. Any existing custom connector can now be added to a Solution – you do not have to start building your custom connector anew from inside a Solution. Environment-specific values like API endpoints and OAUTH app details can now be externalized in Environment variables and referenced by custom connectors. You do not need to update your custom connectors in different environments, and apps and flows targeting the custom connectors do not have to be modified in different environments. Finally, you can store the secrets used in your custom connectors in Azure Key Vault and use them via an Environment variable in your custom connector.

Let’s walk through an example.

We will use the ServiceNow custom connector available in our GitHub repo to illustrate these features, but you can use any custom connector.

First, we need to create the ServiceNow custom connector. You can sign up for a developer instance from ServiceNow if you need an instance. The ServiceNow custom connector supports OAUTH-based authentication, and there are certain configuration steps you need to do to enable that. It is fairly straightforward though, and you can follow the documentation of the custom connector. You can also watch this short video on how to create the custom connector.

Next, we want to set up the connector so that we can externalize environment-specific settings. In our case, we want to externalize the ServiceNow instance and the OAUTH application details. Typically, while developing, you may be working against your own developer instance. In a production or UAT environment, you may want the connector to point to your production ServiceNow instance, and you may not have the necessary privilege to set up the OAUTH application.

To do that, let’s create a Solution called “Snow Connector” and add our custom connector. You can do that by selecting +Add Existing > Automation > Custom connector. You can see your custom connector in the “Outside Dataverse” category. Select the connector and click Add.

The next step is to create the Environment variables. To do that, select +New > More > Environment variable. Specify a display name, say SnowConnector_Instance, for the Environment variable, and add the value, which should be the instance name of your ServiceNow instance.

We need to add two other Environment variables for the OAUTH Client ID and OAUTH Client Secret. Because these are secrets, we will use the recently released feature of using Azure Key Vault as the secret store for these Environment variables. Create an Azure Key Vault, and add your secrets – one for the OAUTH Client ID and another for the OAUTH Client Secret. Then, follow the instructions in our documentation to specify an access policy that grants the Dataverse application read permission for secrets.

In your Solution, add a new Environment variable (called SnowConnector_ClientID) for the Client ID, specifying the Data type as “Secret”, the Secret Store as “Azure Key Vault”, and then add a reference to your secret specifying the Subscription Id, Resource Group name, Key Vault name and the Secret name.

Similarly, add another Environment variable (called SnowConnector_ClientSecret) for the Client Secret.

Because we are going to use the values of these Environment variables only in this environment, we want to make sure that the values are removed from the Solution. To do that, select the environment variable, scroll down to the Current value, and select … > Remove from this solution.

The next step for us is to use these environment variables in our custom connector. To do that, edit the custom connector and use the formula @environmentVariables("env_variable_name"), where env_variable_name is the Environment variable’s schema name – its name prefixed with the publisher’s customization prefix (“crd10” in my Solution). In our case, I updated the values for Host and the OAUTH security settings in the connector. For my Solution, this is:

Instance Name: @environmentVariables("crd10_SnowConnector_Instance")
Client ID: @environmentVariables("crd10_SnowConnector_ClientID")
Client Secret: @environmentVariables("crd10_SnowConnector_ClientSecret")

Make sure you replace the instance name in all the places it appears (Host, Authorization URL, Token URL and Refresh URL). The Client Secret value is masked in the UI.

Save your custom connector, and test to make sure that it is working.

Now, all you need to do is export the solution. For that, you go back to the Solutions list, select the “Snow Connector” solution, and click Export. Click Next > Export. Your Solution will now be exported and automatically downloaded as a .zip file.

You can then log in to your test environment and import the Solution. While importing the Solution in the new environment, you can specify the values for the Environment variables. These values need not be the same as the ones you used in the developer environment; they can be customized for the specific environment. For the Client ID and Client Secret, you can specify the value in the following format:

/subscriptions/{Subscription ID}/resourceGroups/{Resource Group Name}/providers/Microsoft.KeyVault/vaults/{Key Vault Name}/secrets/{Secret Name}
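
For instance, a filled-in reference would look something like this (every name below is a hypothetical placeholder):

/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/snow-rg/providers/Microsoft.KeyVault/vaults/snow-kv/secrets/SnowConnector-ClientSecret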

Once you are done with the import, you should be able to create a connection to your custom connector.

This walkthrough provides an overview of the end-to-end experience for packaging and deploying custom connectors across environments. You can also leverage other ALM features, like deploying the Solution from an Azure DevOps CI/CD pipeline by providing a deployment settings file.
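
For reference, the deployment settings file is a JSON document that carries the per-environment values for Environment variables and connection references. A minimal sketch for this walkthrough could look like the following (the schema names and values are illustrative; you can generate the real file with the pac solution create-settings command):

{
  "EnvironmentVariables": [
    {
      "SchemaName": "crd10_SnowConnector_Instance",
      "Value": "my-prod-instance"
    },
    {
      "SchemaName": "crd10_SnowConnector_ClientID",
      "Value": "/subscriptions/{Subscription ID}/resourceGroups/{Resource Group Name}/providers/Microsoft.KeyVault/vaults/{Key Vault Name}/secrets/{Secret Name}"
    }
  ],
  "ConnectionReferences": []
}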

There are still some limitations at the moment:

  • Any change to an Environment variable value is currently not reflected in the custom connector automatically. The change takes effect only after an update of the custom connector.
  • Environment variables in a custom connector can only be referenced from the Host/Base path and the security settings. Specifically, it is not supported in policies or within the operation definition.
  • [Edit 2/10/2022] Environment variables using Azure Key Vault secrets can now be deployed from an Azure DevOps or GitHub pipeline. Read the blog here.

Please try these features out, and do share any feedback you may have.

Happy Automating!

Automating document workflows with Power Automate and Adobe
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/automating-document-workflows-with-power-automate-and-adobe/
Wed, 05 May 2021

In this post, you can learn about the integration between Power Automate and Adobe and how it can help customers automate different scenarios around handling PDFs.

PDFs are core to how businesses get work done today. Businesses around the world can go paperless because of PDF, the primary archival format for documents. Whether it is contracts, offer letters, or brochures, many of the documents stored within Microsoft 365 are in fact PDFs. According to Adobe, there are over 2.5 trillion PDFs in the world today, and that number is growing day by day. PDF is the universal format everyone can open.

Unfortunately, documents are often the bottleneck of workforce productivity. People perform manual processes like printing and scanning when there are digital ways to do the same with PDFs. For example:

  • Sign them with an electronic signature
  • Combine them together into one document
  • Convert Word, PowerPoint, and other files into PDFs
  • OCR documents so that they can be searchable in SharePoint
  • If a PDF is your only copy of a document, convert it back into document formats like Word, PowerPoint, etc.

Fortunately, Adobe, as the inventor of PDF and a strategic partner of Microsoft, has brought many of these tools directly into Microsoft Power Automate to help you streamline and automate your document workflows. All the Adobe connectors are also available in Power Apps, Power Virtual Agents, and Azure Logic Apps, so you can build integrated custom document workflows within the Microsoft ecosystem.

Adobe connectors in Microsoft Power Automate

Adobe has offered integrations for Microsoft Power Automate since 2017 with tools to help with document automation and has continued to expand and update the connectors with new features and templates:

Adobe Sign

The Adobe Sign connector enables you to take your Word, PDF, PowerPoint, and other formats and route them for electronic signature. Using the Adobe Sign connector with Power Automate allows you to dynamically route documents for approval based on data.

Adobe Sign has over 30 templates and triggers pre-created for you to get started quickly.

Adobe PDF Tools

For people familiar with using Adobe Acrobat for creating, editing, and manipulating PDF documents, the new Adobe PDF Tools connector brings these tools to your flows, such as:

  • Convert your Word, Excel, PowerPoint, and other formats automatically into PDF
  • Convert PDFs back into editable formats like Microsoft Word, PowerPoint, Excel, and other formats
  • OCR your scanned PDFs
  • Convert HTML and data into PDF

The Adobe PDF Tools connector has over 30 templates and triggers pre-created for you to get started quickly.

Quick Start example

Let’s walk through a quick example of how you can use some of the different connectors. We will use this template to demonstrate how the Adobe Sign and PDF Tools connectors can be used together in the same workflow. In the example below, we will compress and optimize a PDF using the Adobe PDF Tools connector and then send it out for signature using one of the available templates.

Get your Adobe PDF Tools API credentials

If you haven’t already created credentials to use with the Adobe PDF Tools API, you can create them here. PDF Tools API provides a 6-month trial to get started for free.

Once you provide a name and description, your client credentials will be generated. Keep this window open; you will need this information to create a connection in Microsoft Power Automate.

IMPORTANT NOTE: Adobe PDF Tools is a Premium connector, so you will need the proper Microsoft Power Automate license to use it.

Get your Adobe Sign account

To use the Adobe Sign connector, you will need an Adobe Sign for enterprise subscription. If you don’t have one and just want to give it a try, sign up for a developer account.

Create a new flow from a template

In this scenario, the flow takes a selected file, compresses it, and sends it for signature. Because PDFs can sometimes be large with huge images, we compress the PDF first using Adobe PDF Tools and then send it for signature using Adobe Sign.

  1. Go to this template.
  2. Create a new connection for Adobe PDF Tools. You will need the Client ID, Client Secret, Organization ID, Technical Account ID, and Private Key from when you set up your key.
  3. When you have entered all your credentials, click Create.
  4. For Adobe Sign, create a new connection.
  5. You may be prompted to choose what authentication level you want to connect as. Choose whether you would like the scope to be for you as an individual Adobe Sign user, group admin (i.e. access agreements within your user group you are an admin for), or account admin (you have full privileges across users).
  6. You will then be prompted to log in to Adobe Sign via a pop-up using your credentials. Log in and allow access.
  7. When all your connections are created, click Continue to create your connection.

Configure SharePoint settings

Because the trigger for this template is a selected file in SharePoint, we need to configure each of the SharePoint actions.

For the For a selected file, Get file properties, Get file content and Get file metadata actions, set the Site Address and Library Name to your desired SharePoint site.

PDF Services connector action

The condition in the flow determines whether the document is a PDF or not, based on the file extension. If it is a PDF, the flow skips ahead; if it is not, the document is converted to a PDF.
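
Under the hood, a file-extension check like this can be written as a condition expression along the following lines (a sketch; the template’s exact expression may differ, and FileName stands in for the variable this template fills earlier in the flow):

endsWith(toLower(variables('FileName')), '.pdf')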

The “Convert document to PDF” action will take whatever document you have selected and convert it into a PDF.

You will notice that this action, like all the Adobe PDF Tools connector actions, has two main inputs:

  • File Name: The name of the input file. It is important to include a file extension so the action knows which format to convert from.
  • File Content: The file content of the document you want to convert to PDF.

The file content comes from the contents of the selected file. Earlier in the flow, the file content is stored in the FileContent variable. You don’t have to do it this way, but it is how this template is set up. You can also pass it directly from SharePoint’s Get file content action.

After the condition, you will see another action called Create Searchable PDF using OCR. This is used to OCR any PDF in case it is a scanned document that needs to be optimized. Like the previous action, it asks for the File Name and File Content. These values come from the output of the previous PDF step.

Why is it using variables?
The reason for the variables is the condition. If the document is a PDF, the flow skips the conversion; if it is not, the document gets converted, which changes the input going into the Create Searchable PDF using OCR action. Using variables makes it easy to change the values going into that action.

Adobe Sign connector actions

After the Create Searchable PDF using OCR action, there are two Adobe Sign connector actions. Let’s walk through what each of them does.

Upload a document and get a document ID
When you want to send a document for signature using Adobe Sign, you first upload it to Adobe Sign as what is called a transient document, which returns a Document ID. Among other things, this allows you to upload a document to Adobe Sign once and then send it multiple times, without having to upload it every single time.

Like Adobe PDF Tools, you need to provide a File Name and a File Content value. In this template, these are passed from the Create Searchable PDF using OCR action.


Create an agreement from an uploaded document and send for signature

Once the document is uploaded to Adobe Sign, you can now send an agreement for signature.

Document ID specifies the documents you would like to include in the agreement you send for signature. You can attach several by clicking “Add a new item”. The input is the Document ID from the Upload a document and get a document ID action.

Next, we need to add a recipient. Click Add new item under Signature Type. This will expose a few settings. First is the email address. For this example, we will set a static email address, but you can use Dynamic content to pass the email address from a previous action.


Participant Order specifies the order in which the recipients sign. You can specify recipients to sign in parallel or in sequence.

Participant Role allows you to specify the recipient’s role, such as Signer, Approver, Certified Recipient or Form Filler. If you want to learn more about the roles, you can read about them here.

The Message field is the message that will show up in the email to the recipient.

Expiration Time allows you to set a time limit for a recipient to sign a document. Once that time passes, the link to view and sign will no longer be active.

Reminder Frequency allows you to set how often Adobe Sign will send email reminders to the recipient.

Document Password allows you to encrypt the final PDF with a password.

Form Fields allow you to pass field values into a document. Adobe Sign can merge form field values into fields in a document. You need to specify the Name (which corresponds to the field name in the document) and the value you want to pass. This is also an array, so you can add many. For more information on adding form fields to your document, see the Adobe Sign Text Tag documentation.
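
For instance, a field could be defined in the source document with a text tag such as the following (a minimal sketch of the text tag syntax; the directive after _es_: assigns the field to the first signer):

{{companyName_es_:signer1}}

The flow would then pass companyName as the Name, along with the value to merge into that field.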

Redirect Delay and Redirect URL allow you to specify where to redirect a signer after they sign.

Once you trigger the agreement, the recipient(s) will receive an email with a link for them to view and sign an agreement.

Other Helpful Scenarios

There are several other scenarios that can be helpful for document processes:

Store signed agreements into SharePoint
OCR PDFs as they are imported into SharePoint
Password protect a new file in OneDrive and email as an attachment

What’s Coming Next

Adobe is only getting started with these connectors in Microsoft Power Automate. Coming soon, Adobe will bring additional features from Adobe Document Services to Power Automate. Here are a few of the highlights:

Adobe Document Generation API

Document Generation API enables you to dynamically generate documents based on JSON data. Document templates are created in Microsoft Word, and there is even an add-in for Microsoft Word that lets you easily tag your documents based on data. Whether it is data from Microsoft Dataverse, SharePoint, Excel, or other sources, you will be able to dynamically generate your documents (see the sketch after this list), including these features:

  • Simple and easy template creation in Microsoft Word
  • Support for complex data sets, so you can push data from your Power Automate actions directly in, even if there are complex data structures.
  • Dynamically change and import images.
  • Set conditional text and paragraphs based on data and conditions.
  • Numerical calculations, such as sum and average, computed inline from arrays of data in the template.
  • Dynamically populate tables with data values.
  • Integration with Adobe Sign to create tagged documents ready to be sent for signature.
  • Export in multiple file formats such as Word or PDF.
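
To make the tagging model concrete, consider a hypothetical fragment (the tag syntax here is illustrative; the Word add-in generates the exact tags for you). Given JSON data like:

{
  "customer": { "name": "Contoso" },
  "total": 1250.00
}

a template paragraph might read: Dear {{customer.name}}, your total is {{total}}.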

An overview of the highlights is available here.

Adobe PDF Extract API

A lot of data is stored inside PDFs, but it’s often difficult to reliably extract that data for use elsewhere. PDF Extract API converts PDFs into JSON data that reliably captures content structure, element positions, and relationships, which can then be parsed and placed into data systems, republished on other platforms, or used for search and analysis. Some of the key benefits are:

  • Reliable extraction of tables, even when they cross pages
  • Reliable reading of columns of data, with an understanding of when related paragraphs span columns or pages
  • Understanding and tagging of data into structures such as titles, headers, paragraphs and tables

Final Thoughts

Adobe has provided a number of great connectors to help with document automation in Microsoft Power Automate. In addition to the Power Automate connectors, Adobe PDF Tools and Adobe Sign also have add-ins for other Microsoft applications, such as Adobe Acrobat for Microsoft Teams, Adobe Sign for Microsoft Teams, Adobe Sign for Word, Adobe Sign for SharePoint, Adobe Creative Cloud for Word and PowerPoint, and many others. These can be a great addition to the automated flows you create in Power Automate.

Upcoming changes to the Gmail connector
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/upcoming-changes-to-the-gmail-connector/
Thu, 23 Apr 2020

Starting May 1, 2020, we’ll be rolling out changes that will potentially impact your flows that use the Gmail connector, due to policy changes by Google. Under this policy, the Gmail connector, when used with a Gmail consumer account, can only be used with a limited set of Google-approved services. We have rolled out an option to use your own Google application for personal or internal use in your enterprise in case your flows are impacted.

(Edit: A typo on the redirect URI is fixed.)

Starting May 1, 2020, we’ll be rolling out changes that will potentially impact your flows that use the Gmail connector, due to policy changes by Google. In line with Google’s data security and privacy policies, customers using a Gmail consumer account (email addresses ending with @gmail.com or @googlemail.com) will have certain limitations on the set of connectors that can be used along with the Gmail connector in a flow.

If you are using a G Suite business account (email addresses with a custom domain), your flows will not be impacted, and there is no restriction on the use of the Gmail connector.

Under this policy, the Gmail connector, when used with a Gmail consumer account, can only be used with a limited set of Google-approved services. We will continue to work with Google to add more services to this list. For now, the set of Google-approved connectors that can be used with the Gmail connector in the same flow includes:

    • Built-in actions and triggers: Control, AI Builder, Data operations, Date Time, Number Functions, Power Virtual Agents, Power Apps, Request, Schedule, Text Functions, Variables, Flow button, Location, Content Conversion Service
    • Google services: Gmail, Google Calendar, Google Contacts, Google Drive, Google Sheets, Google Tasks
    • Approved Microsoft Services: OneDrive, SharePoint Online, Excel Online, Dynamics 365, Microsoft Teams, Office 365, Planner, Outlook, OneNote, Word Online
    • Customer managed data sources: FTP, SFTP, SQL Server, HTTP, SMTP, RSS

NOTE: This list is subject to change. Going forward, we’ll post the complete and up-to-date list in our documentation.

If you have an existing flow that will be impacted by the changes, we will also notify you via email to let you know of the impacted flows. Starting June 9, 2020, any flow that is not compliant will be disabled. You will need to ensure that your flow uses only approved connectors before you can enable it again.

What can you do if your flow is impacted?

You may need to use the Gmail connector with a Gmail consumer account in a flow with some of the non-approved connectors.  In this case, we have rolled out an option to use your own Google application for personal or internal use in your enterprise.

To enable that, you will need to:

  • Create an OAuth client application using Google’s API Console
  • Use the settings of your client application in the Gmail connector

For details on the steps, you can read the instructions here in the Gmail connector reference documentation. Briefly, you can use Google’s setup tool and follow the steps to create an OAuth client application. Here is some useful information:

  • Add the Gmail scope (https://mail.google.com)
  • Add “azure-apim.net” as one of the authorized domains
  • Use “https://global.consent.azure-apim.net/redirect” for Redirect URI

[Screenshot: the finished OAuth client configuration]

To use the app in your Gmail action or trigger, select “…” > + Add new connection to create a new connection. Select “Bring your own application” as the Authentication Type and specify the values of “Client ID” and “Client secret” from your Google app.

When you click “Sign in”, you will see that the login screen reflects the app you created. If you are using a Gmail consumer account, you may see a warning that the app is not verified by Google.

Now you should be able to use the Gmail connector in your flows without restrictions.

Azure Active Directory authentication in the SQL Server connector
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/azure-active-directory-authentication-in-the-sql-server-connector/
Mon, 21 Oct 2019

We recently added support for Azure Active Directory authentication in the SQL Server connector. This has been one of the most requested features from our customers. In this post, I want to give an overview of how you can use this feature.

NOTE: You can read a longer version of this post in the PowerApps Blog here.  This is an abridged version.

We recently added support for Azure Active Directory authentication in the SQL Server connector. This has been one of the most requested features from our customers. In fact, a few customers have already noticed this rollout and started using it. In this post, I want to give an overview of how you can use this feature, and some of the underlying design changes we had to bring about in the platform.

Using Azure AD authentication for Azure SQL Database provides a lot of benefits when it comes to managing the security of your data. In the context of PowerApps and Flow, this feature enables each user to connect to the underlying databases with their own credentials. A SQL Server connection using Azure AD authentication will not be shared when an app is shared. This is similar to how authentication works for Office 365 Outlook, SharePoint and other Azure AD based services.

Using the feature in Microsoft Flow

In Microsoft Flow, this feature is available when you create a new SQL Server connection. When you create a new connection, you will be asked to choose an Authentication Type. Apart from SQL Server Authentication and Windows Authentication, you can now select “Azure AD Integrated (Preview)” authentication. Once you select that, you can sign in with your Azure AD account to create a connection. If you select any of the other authentication types, you will need to provide the appropriate details.
[Screenshot: Creating a SQL Server connection in Flow]

After you select a connection, you specify the server and database as part of the action or trigger you are using. This allows you to use a single connection – associated with your Azure AD credentials – across multiple SQL servers and databases. Note that we provide a dynamic drop-down for the Database parameter once you provide a valid SQL server. Once you provide the server and database, you can proceed to fill in the rest of the parameters required for your operation.

TIP: If your credentials do not have access to list the database, you can select “Enter custom value” and type in the database name.

[Screenshot: SQL connector in Flow – Database dropdown]

Upcoming changes

This feature is in Preview now. As we see usage of this feature, we will make it production-ready. Right now, there are a few areas we are still working on:

  • “App from data” experience in PowerApps
  • “Transform data using Power Query” experience in Microsoft Flow

If you use this feature, please do share your feedback. You can use our product forums or shoot us an email directly.

Thanks!
Sameer

Office 365 Outlook connector: Important upcoming changes
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/office-365-outlook-connector-important-upcoming-changes/
Wed, 21 Aug 2019

We are making a few important updates to the connector as we migrate the underlying API from the Outlook REST API to the Microsoft Graph API. These updates will unfortunately force us to deprecate most of the current set of actions and triggers. We will be rolling out these changes over the next couple of months, and we strongly encourage all users to start adopting the new actions and triggers.

The Office 365 Outlook connector is one of the most popular connectors in Microsoft Flow and PowerApps. It is also one of the most feature-rich connectors we have today. We are committed to providing the best experience for end users – and that includes not breaking any flow or app. From time to time though, there are changes in underlying services that necessitate pushing out breaking changes. When we do so, we usually update the versions of the actions and triggers in our connector so that users can start leveraging the new actions and triggers.

We are making a few important updates to the connector as we migrate the underlying API to use the Microsoft Graph API instead of Outlook REST API. These updates will unfortunately force us to deprecate most of the current set of actions and triggers. We will be rolling out these changes over the next couple of months and we strongly encourage all users to start adopting the new actions and triggers.

In this post, you can find the full list of operations that will be deprecated and the new actions and triggers that you can use instead.

Details of the Changes

The following actions will be deprecated soon and will be replaced by the corresponding newer version:

Current Action (to be deprecated) → New Action

Create contact → Create contact (V2)
Create event (V3) → Create event (V4) (see notes below)
Delete contact → Delete contact (V2)
Delete email → Delete email (V2)
Delete event → Delete event (V2)
Export email → Export email (V2)
Find meeting times → Find meeting times (V2)
Flag email → Flag email (V2) (see notes below)
Forward an email → Forward an email (V2)
Get attachment → Get Attachment (V2)
Get calendar view of events (V2) → Get calendar view of events (V3) (see notes below)
Get calendars → Get calendars (V2)
Get contact → Get contact (V2)
Get contact folders → Get contact folders (V2)
Get contacts → Get contacts (V2)
Get email → Get email (V2)
Get emails → Get emails (V3)
Get event (V2) → Get event (V3)
Get events (V3) → Get events (V4)
Get mail tips for a mailbox → Get mail tips for a mailbox (V2)
Get room lists → Get room lists (V2)
Get rooms → Get rooms (V2)
Get rooms in room list → Get rooms in room list (V2)
Mark as read → Mark as read or unread (V2) (see notes below)
Move email → Move email (V2)
Reply to email (V2) → Reply to email (V3)
Respond to an event invite → Respond to an event invite (V2)
Send an email → Send an email (V2) (see notes below)
Send an email from a shared mailbox → Send an email from a shared mailbox (V2)
Set up automatic replies → Set up automatic replies (V2) (see notes below)
Update contact → Update contact (V2)
Update event (V3) → Update event (V4) (see notes below)

The following list of triggers will be deprecated:

Current Trigger (to be deprecated) → New Trigger

When a new email arrives → When a new email arrives (V2)
When a new email arrives in a shared mailbox → When a new email arrives in a shared mailbox (V2)
When a new event is created (V2) → When a new event is created (V3) (see notes below)
When an event is added, updated or deleted → When an event is added, updated or deleted (V2) (see notes below)
When an event is modified (V2) → When an event is modified (V3) (see notes below)
When an upcoming event is starting soon (V2) → When an upcoming event is starting soon (V3) (see notes below)

The remaining actions and triggers will continue to work.

Detailed change notes

The main changes between the existing actions and triggers and the new ones are:

  1. All response properties are now camelCased – in line with the convention for Microsoft Graph
  2. For Calendar operations and the Set up automatic replies (V2) operation, the TimeZone properties are now separated from the Start and End fields. In general, the properties follow the dateTimeTimeZone resource type.
  3. The Flag email (V2) operation is enhanced to support set, clear or complete the flag.
  4. The Mark as read or unread (V2) operation is enhanced to support marking an email as read or unread.
  5. There is a limit on the message payload of 4MB. For Send an email (V2), in particular, there is a limit of 4MB for the message body and for each of the attachments. Additionally, there is an overall limit of 50MB for a message (which can be reduced by the Admin).
  6. Some property names have changed (see the illustration after this list). These include:
  • To >> toRecipients
  • Cc >> ccRecipients
  • Bcc >> bccRecipients
  • DateTimeReceived >> receivedDateTime
  • DateTimeCreated >> createdDateTime
  • DateTimeLastModified >> lastModifiedDateTime
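
As an illustration of what the renames mean for existing flows: an expression that reads a property by its old name will need updating along these lines (a hypothetical example; the exact expression depends on your flow):

Old: triggerOutputs()?['body/DateTimeReceived']
New: triggerOutputs()?['body/receivedDateTime']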

Why the change?

The connector today uses the Outlook REST API instead of the Microsoft Graph API. We are making the move to start using the Microsoft Graph API. In doing so, we are presented with two sets of challenges:

  1. Incompatibilities between the Outlook REST API and Microsoft Graph API
  2. Deprecation of the Outlook REST API v1.0 and the Office Discovery Endpoint API

The Outlook REST API v1.0 and the Office Discovery Endpoint API have been deprecated and are planned to be decommissioned by November 1, 2019. Today, when a connection is made, we rely on the Office Discovery Endpoint API to retrieve the right endpoint for the user’s mailbox. With the migration to the Microsoft Graph API, this is handled seamlessly by the Microsoft Graph endpoint. Before November 1, 2019, we will be rolling out an update to the connector so that we no longer rely on the Office Discovery API. We will migrate existing connections seamlessly so that end users are not broken.

That said, what really drove us to deprecate the existing set of actions and triggers is the incompatibilities between the two APIs. We could have tried to provide a backward-compatible mapping in the connector itself. However, based on our experience, we are usually able to provide the best user experience if we keep the connector layer ‘thin’. This allows us to surface almost all the functionality provided by the backend service.

Next Steps

To re-iterate, our commitment to provide the best experience for end users remains – and that includes not breaking any flow or app. To that effect, we plan to invest in providing backward compatibility for some of the more popular actions and triggers. However, they will eventually be deprecated and removed. We strongly encourage you to start using the new actions and triggers immediately.

Building an ecosystem of open source connectors
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/building-an-ecosystem-of-open-source-connectors/
Tue, 30 Apr 2019

We continue to release connectors for Microsoft PowerApps and Microsoft Flow, taking the number of connectors available to over 250. As the number grows, we would like to empower the wider community to maintain these connectors, so we now open source our connectors on GitHub. This is made possible through two new exciting features: the Microsoft Power Platform Connectors CLI, which can be used to download, create, or update any custom connector, and template-based policies, which can be used to modify the runtime behavior of custom connectors.

Over the last few months, we have released a lot of connectors for Microsoft PowerApps and Microsoft Flow, taking the number of connectors available to over 250. These connectors drive growth and adoption of the platform by enabling our customers to get started quickly building their apps and flows. It is also worth noting that all these connectors are built by partners – both internal partners within Microsoft and external partners. We have a process to certify these connectors.

Very often though, we get requests and feedback from our customers and community around features and gaps in many of these connectors. These range from the connector not exposing a certain field or property, to complex new triggers and actions that customers would like to see. This feedback is usually captured in our forum ideas, and then we go about triaging and fixing the connectors. Sometimes, based on support tickets, we end up updating the connectors. But, like any feature work, it takes time and we are limited by resources. We often end up recommending that our customers build a custom connector or use the HTTP connector to call the underlying API directly.

Open source connectors

As the number of connectors grows, we would like to empower the wider community. With that, I would like to announce that we are starting a new journey to open source our connectors. Through an open-source connector strategy, we hope to get more community developers involved in maintaining a large set of high-quality connectors. You can see how these connectors are authored and try them out yourself. You can also fork these connectors to fill feature gaps and fix bugs. In return, developers can submit those changes back so that the larger community can benefit.

As with any other feature, we like to make small but concrete steps. You can find our open source connector repository on GitHub. Today, it holds a small set of connectors, but we are fully committed to adding more in the coming months. I would also like to state that we are adopting an open-source-by-default strategy: all new connectors will be put up for open sourcing by default. While the ultimate decision to open source a connector lies with its owner, we strongly encourage all our partners to open source their connectors.

The repository also provides sample connectors to help developers get started. This is an area where we would like to see your contributions as well: if you have built interesting custom connectors, we would like you to share them as samples. In addition, we have updated our documentation with some new topics.

There are two platform features that enable us to open source our connectors. The sections below describe them, and also highlight some of the challenges we have had around open sourcing some of the connectors we have today.

Power Platform Connectors CLI

The CLI (Command Line Interface) is a powerful tool that can be used to download, create, or update any custom connector. It allows developers to save their custom connectors to local file storage and manage them with a source code management system. This workflow is not available in the portal – which is great for getting started, but not great for developers. The CLI also gives developers access to advanced features not available in the portal, like connection parameters.

To get started with the Power Platform connectors CLI, follow the instructions here. The CLI is a cross-platform utility based on the Azure CLI, so if you are familiar with that, you will find it straightforward to use. Essentially, you need to install the Python runtime, and then install the CLI from PyPI:

pip install paconn

Once installed, you can login and then start using the CLI to download, create or update a custom connector.

paconn login      # authenticate with your Power Platform credentials
paconn download   # download a custom connector's files to local storage
paconn create     # create a new custom connector from local files
paconn update     # update an existing custom connector

In the future, we will be adding functionality to validate connectors as well. Just like our open-source philosophy, the CLI itself is open sourced. You can see its source code in the same connector repository on GitHub.

Policy support for custom connectors

Building a custom connector for Microsoft PowerApps and Flow is very straightforward: you simply describe the underlying API operations in OpenAPI format, which can be done using our tooling in the portal. However, REST APIs are primarily designed to be used by developers who have intimate knowledge of the APIs. This can present a problem for custom connectors, where the shape of the operations exposed by the connector is determined directly by what the underlying APIs provide. This is where policies come in.

Policies can be used to modify the behavior of connectors at runtime. For example, policies are used to enforce throttling limits on API calls, to route calls to different endpoints, and so on. Policies are used extensively by many of the out-of-the-box connectors we have today, and we often end up writing custom policies to modify the API behavior so that the end user experience is great. The use of these custom policies has been a problem for us – and for our partners. These policies have not been available for custom connectors, which effectively means that partners cannot update their connectors or test the changes themselves. It also prevents us from open sourcing those connectors, as there is no way for anyone to test them.

That is, until now.

We now make template-based policies available to custom connectors as well. Policies are available in the Definition section of your custom connector.

We currently support the following templates:

  • Dynamic host url – This policy allows API calls to be routed to a host endpoint based on a connection parameter. This is useful if your connector targets different API URLs. For instance, the Azure Key Vault connector sample leverages this policy to set the host.
  • Route requests to a fixed endpoint: This policy can be used to route a request to another sub-path. Consider using this policy when there are multiple
  • Set HTTP header: This policy can be used to set a specific HTTP header.
  • Set query string parameter: This policy can be used to set a specific query parameter.

We will be adding more policy templates in future as we get more requirements.

Finally, putting it together – an example

In this example, the CLI and the policies are used to create a connector for Azure Key Vault that can dynamically route to different Key Vault endpoints based on a connection parameter. We believe there are many more examples like that, and we would love to hear them.
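
To give a flavor of what this looks like on disk, policy template instances are declared in the connector's apiProperties.json file, roughly as follows (a sketch loosely modeled on the Azure Key Vault sample; take the exact template IDs and parameter names from the sample in the repository):

"policyTemplateInstances": [
  {
    "templateId": "dynamichosturl",
    "title": "Route to the selected Key Vault",
    "parameters": {
      "x-ms-apimTemplateParameter.urlTemplate": "https://@connectionParameters('vaultName').vault.azure.net"
    }
  }
]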

This marks an important step in our journey to build a vibrant developer ecosystem. We would love to hear your feedback. Feel free to use our community forum or send an email directly to us at paconnfb@microsoft.com. We appreciate all feedback!

Introducing triggers in the SQL Connector
http://approjects.co.za/?big=en-us/power-platform/blog/power-automate/introducing-triggers-in-the-sql-connector/
Mon, 25 Sep 2017

The SQL Database Connector in Logic Apps and Flow is one of the most popular connectors, and we have just added a highly requested feature: triggers. Now, you can kick off your Flow or Logic Apps whenever a row is inserted or modified in a SQL Azure Database table.

The SQL Database Connector in Logic Apps and Flow is one of the most popular connectors, and we have just added a highly requested feature – triggers.  Now, you can kick off your Flow or Logic Apps whenever a row is inserted or modified in a SQL Azure Database table.
[Screenshot: Triggers for the SQL Connector]

Using the trigger is straightforward. Select the appropriate trigger, create your connection (if you have not already) or select an existing connection, and then select the table from the drop-down. (If you don’t see your table, see the notes below.) You can also choose to further limit the rows returned by specifying a filter.
[Screenshot: When an item is created – SQL trigger]

Once you configure the trigger, you can use its output in any action in your Flow or Logic App. The trigger makes available the columns of the selected table.
[Screenshot: The trigger provides dynamic schema]

You can now save the flow, and it will kick off whenever a row is added to (or modified in) the selected SQL Database table.
[Screenshot: The run output of the trigger]

That’s it. You now have a working flow that you can use to monitor and automate whenever rows are added or modified in your SQL table.

Limitations

The triggers do have the following limitations:

  • It does not work for on-premises SQL Server
  • Table must have an IDENTITY column for the new row trigger
  • Table must have a ROWVERSION (a.k.a. TIMESTAMP) column for the modified row trigger

A Brief note on the design and on the limitations

Some of you might have noticed that this feature has been available for some time as a limited preview feature of Logic Apps in the East US 2 region. Designing and implementing a trigger is more complex than adding actions, because the trigger needs to monitor and track changes. In the case of SQL databases, unfortunately, there is no mechanism for tracking changes that works for all tables. Therefore, tables must have specific column types which are designed for change tracking. To track changes like the addition or modification of rows in a table, the table must have a column whose value is unique and increases (or decreases) monotonically each time such a change is made. This is satisfied by having an IDENTITY column for tracking creation, and a ROWVERSION (a.k.a. TIMESTAMP) column for tracking modification.
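
For example, a table shaped like the following would work with both triggers (a minimal sketch; the table and column names are illustrative):

CREATE TABLE dbo.Orders (
    Id         INT IDENTITY(1,1) PRIMARY KEY, -- unique and monotonically increasing: enables the new-row trigger
    CustomerId INT NOT NULL,
    Amount     DECIMAL(10, 2) NOT NULL,
    RowVer     ROWVERSION                     -- updated automatically on every change: enables the modified-row trigger
);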

So, what happens when there is no such column in a table? Those tables will not be listed when you try to use the trigger, and the trigger will not work if you type the table name manually. The only workaround is to externalize the state yourself and use the “Get rows” action to query for the changes.

We would, of course, like to hear your feedback.

Connecting to Oracle Database from PowerApps, Flow and Logic Apps
http://approjects.co.za/?big=en-us/power-platform/blog/power-apps/connecting-to-oracle-database-from-powerapps-flow-and-logic-apps/
Wed, 01 Mar 2017

You can now connect to your Oracle Database from PowerApps, Flow and Logic Apps.  The Oracle Database connection allows you to list tables, and perform standard create, read, update and delete…

You can now connect to your Oracle Database from PowerApps, Flow and Logic Apps. The Oracle Database connection allows you to list tables, and to perform standard create, read, update and delete operations on rows in an Oracle database. In addition, it supports full delegation of PowerApps’ filtering, sorting and other functions. It does not support triggers or stored procedures yet.
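
Because filtering and sorting are delegated, you can, for example, set a gallery’s Items property to a formula like the one below and have the work happen on the Oracle side instead of pulling all rows to the device (the table and column names here are hypothetical):

SortByColumns(Filter('[HR].[EMPLOYEES]', SALARY > 50000), "LAST_NAME", Ascending)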

Supported versions: Oracle 9 and later, Oracle client software 8.1.7 and later.

Pre-requisites

The Oracle Database connection requires the installation of the on-premises data gateway and Oracle client SDK.

Install On-premises Data Gateway

The Oracle Database connection requires the on-premises data gateway to be installed. Based on the service that you want to connect to, follow the corresponding installation steps for that service.

A note about the on-premises data gateway:  The on-premises data gateway acts as a bridge, providing quick and secure data transfer between on-premises data (data that is not in the cloud) and the Power BI, Microsoft Flow, Logic Apps, and PowerApps services.  The same gateway can be used with multiple services and multiple data sources.  So, you will probably need to do this only once.  More about gateways can be found here (https://powerapps.microsoft.com/en-us/tutorials/gateway-reference/).

Install Oracle Client

To connect to Oracle, you will also need to install the Oracle client software on the machine where you have installed the on-premises data gateway. Specifically, the Oracle Database connection requires the 64-bit Oracle Data Provider for .NET, which you can download and install from Oracle’s website.

If you do not have the Oracle client installed, you will see an error when you try to create or use the connection (see the known issues below).

Create an app using Oracle Database connection

Depending on the service you are using, follow the appropriate steps to create a connection. Here, we will use PowerApps, but the steps for Flow and Logic Apps should be similar. Even within PowerApps, there are multiple ways to start using the connection. In this walkthrough, we will use the PowerApps feature of creating an app from an Oracle database table.

Follow the steps below to create an app from an Oracle database in PowerApps:

  1. Open PowerApps Studio, sign in if it prompts you to, and click New to create a new app.
  2. Under “Start with your data”, click on the arrow, which will show you a list of connections you already have. Click “+New connection” to specify that you want to create a new connection.
  3. From the list of connections, select “Oracle Database”.
  4. On the right side, specify the Oracle server name, username and password. For the server, if an SID is required, you can specify it in the format ServerName/SID.
  5. As mentioned above, you will need a gateway to use an Oracle Database connection. Select the appropriate gateway. If required, you can also install a gateway. Click “Refresh gateway list” if you do not see your gateway.
  6. Click “Create” to create the connection. If everything is fine, you will see the dataset selection screen with a single dataset called “default”.
  7. Select default.
  8. You will see a list of tables based on the Oracle database you are connecting to. Select the table that you want to use.
  9. Click “Connect” to create the app.

Based on the table that you selected, PowerApps will create an app for you with three screens:

  • A Browse screen that lists all the entries in the table
  • A Detail screen that provides additional info about an entry
  • An Edit screen that can be used for updating an existing entry or creating a new entry

[Screenshot: PowerApps Oracle screens]

This is just the starting point. You may want to customize these screens and add more tables to finish up your app. You can refer to the PowerApps documentation for more information.

Creating a flow using the Oracle Database connection

You can follow the steps for Flow and Logic Apps to create a flow or a logic app that uses the Oracle Database connection. I will not go into the details here, but the steps for creating the connection should be very similar to the steps listed above. The Oracle Database connection does not yet support any triggers, so use another trigger at the start of your workflow. In a logic app, for example, you can use the Request/Response trigger to start your logic app, and then add the Oracle Database connection to see its actions. Once you have a connection, you can use any of the actions listed below.

[Screenshot: Flow – Oracle Database actions]

Known Issues, Tips, and Troubleshooting

Here are some common issues, limitations and workarounds:

  1. Cannot reach the Gateway.
    If you get the error “Cannot reach the Gateway”, it means that the on-premises data gateway is not able to connect to the cloud. To check the status of your gateway, log in to powerapps.microsoft.com, click Gateways, and select the gateway you are trying to connect to.
    Mitigation: Make sure your gateway is running on the on-premises machine where it is installed, and that it can connect to the internet. We recommend against installing the gateway on a computer that may be turned off or asleep. Try restarting the on-premises data gateway service (PBIEgwService) as well.
  2. The provider being used is deprecated: ‘System.Data.OracleClient requires Oracle client software version 8.1.7 or greater.’. Please visit https://go.microsoft.com/fwlink/p/?LinkID=272376 to install the official provider.
    You get the above error if the Oracle client SDK is not installed on the machine where the on-premises data gateway is running. To fix this issue, download and install the Oracle client SDK as described above.
  3. Table ‘[Tablename]’ does not define any key columns
    This error message signifies that the table you are connecting to does not have a primary key. Currently, the Oracle Database connection requires that the table used have a primary key column.
  4. Stored procedures are currently not supported. We are looking at adding this functionality in the future. Please vote for it in the community forum if you are interested.
  5. Tables with composite keys are not supported. Please vote for it in the community forum if you are interested.
  6. Nested object types in tables are not supported. Please vote for it in the community forum if you are interested.

The Oracle Database connection is one of the connectors that many of our users have asked for.  We hope to get some usage and feedback on this.  Please feel free to use the community forums to provide us your bug reports, feedback and ideas.
