Control Access to Dataverse with IP Firewall: Secure Your Data with Ease http://approjects.co.za/?big=en-us/power-platform/blog/power-apps/announcing-public-preview-of-ip-internet-protocol-firewall-for-dataverse/ Thu, 21 Mar 2024 06:58:21 +0000

We’re excited to let you know that the IP firewall feature is now generally available for Power Platform environments across all regions. This feature allows you to control access to Dataverse and implement stricter security measures. With IP firewall, Power Platform administrators can configure IP restrictions on each Power Platform environment, allowing access to Dataverse only from allowed IP ranges. This helps mitigate the risk of insiders exfiltrating data and prevents token replay attacks from restricted IP ranges. We hope this feature will help you keep your organizational data secure and protected.

When you configure the IP firewall on a Power Platform environment, only requests from the configured IP ranges are allowed and all other requests are rejected, restricting access to Dataverse.
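To make the allow/deny behavior concrete, here is a minimal sketch of the kind of check the firewall performs. It is illustrative only, not the service’s actual implementation, and the CIDR ranges and addresses are made up: a request is accepted only if the client IP falls inside one of the configured ranges.

from ipaddress import ip_address, ip_network

# Illustrative allow list: CIDR ranges an admin might configure for an environment.
ALLOWED_RANGES = [ip_network("203.0.113.0/24"), ip_network("198.51.100.0/24")]

def is_request_allowed(client_ip: str) -> bool:
    # Return True only if the client IP falls inside a configured range.
    addr = ip_address(client_ip)
    return any(addr in allowed for allowed in ALLOWED_RANGES)

print(is_request_allowed("203.0.113.25"))  # True  - inside an allowed range
print(is_request_allowed("192.0.2.7"))     # False - the request would be rejected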

Get Started

Power Platform admins can enable IP restrictions on individual Power Platform environments (available per licensing requirements) through the Power Platform admin center. This feature is turned off by default.

To enable the IP firewall on a Power Platform environment, follow the configuration steps outlined in this article. You can also refer to this demo on the IP firewall.

Once configured, your environment’s IP firewall settings page will reflect the allowed IP ranges.

Call to Action:

  1. Enable IP firewall in audit-only mode: If you haven’t already, enable the IP firewall feature to protect your organizational data by limiting access to Dataverse to allowed IP ranges only. You can learn more about how to enable this feature by visiting the following link: IP firewall in Power Platform environments – Power Platform | Microsoft Learn
  2. Review firewall audit logs: Reviewing the audit logs is helpful when you’re configuring restrictions on a Power Platform environment. We recommend that you keep audit-only mode enabled for at least a week and disable it only after careful review of the audit logs. IP firewall in Power Platform environments – Power Platform | Microsoft Learn
  3. Enable IP firewall in enforcement mode: Once you have tested the IP firewall in audit-only mode and reviewed the audit logs, you can enable the IP firewall in enforcement mode.

IP based cookie binding in Dataverse is Generally Available. http://approjects.co.za/?big=en-us/power-platform/blog/power-apps/ip-based-cookie-binding-in-dataverse-is-generally-available/ Wed, 25 Jan 2023 11:00:00 +0000

We are pleased to announce that IP based cookie binding in Dataverse is generally available (GA) for all our customers. This security feature allows administrators to safeguard the Dataverse platform by blocking cookie replay attacks.

IP cookie binding in Dataverse

IP based cookie binding is a security technique that helps protect Dataverse against cookie replay attacks. A cookie replay attack occurs when an attacker intercepts a valid cookie and exploits it to impersonate the user who originally created the cookie. IP based cookie binding addresses this threat by evaluating the IP address associated with the cookie in the request. If the IP address in the request does not match the IP address of the device where the cookie was originally created, the Dataverse API will automatically reject the cookie and prompt the user with a message indicating that their session may have been compromised. This ensures that only the legitimate and authorized user is able to access the protected resources and prevents attackers from using stolen cookies to gain unauthorized access. IP based cookie binding is a real-time solution, which means it can detect and prevent cookie replay attacks as soon as they occur, providing an added layer of security for the customer’s organization.
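As a conceptual illustration only (Dataverse’s actual mechanism is internal to the service and not shown here), IP based cookie binding can be sketched as signing the session identifier together with the IP address it was issued to, then recomputing that signature on every request. The key, cookie format, and addresses below are made up for the example.

import hashlib
import hmac

SECRET_KEY = b"server-side-secret"  # illustrative only; a real service manages keys securely

def issue_cookie(session_id: str, client_ip: str) -> str:
    # Bind the session id to the IP it was issued from by signing both together.
    signature = hmac.new(SECRET_KEY, f"{session_id}|{client_ip}".encode(), hashlib.sha256).hexdigest()
    return f"{session_id}|{signature}"

def validate_cookie(cookie: str, request_ip: str) -> bool:
    # Reject the cookie if it is presented from a different IP than it was bound to.
    session_id, signature = cookie.split("|")
    expected = hmac.new(SECRET_KEY, f"{session_id}|{request_ip}".encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(signature, expected)

cookie = issue_cookie("session-123", "203.0.113.25")
print(validate_cookie(cookie, "203.0.113.25"))  # True  - same device/IP as when the cookie was issued
print(validate_cookie(cookie, "198.51.100.9"))  # False - a replayed cookie from another IP is rejected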

How can I enable this feature?

Power Platform administrators can enable this feature in their environments via the Power Platform admin center. This feature is turned off by default.

  • Select Environments in the left navigation pane and select the environment where you want to enable this feature.
  • Select Settings –> Product –> Privacy + Security.
  • Turn on Enable IP address-based cookie binding.

More details about this feature are available here

Do more with data – From Data Export Service to Azure Synapse Link for Dataverse http://approjects.co.za/?big=en-us/power-platform/blog/power-apps/do-more-with-data-from-data-export-service-to-azure-synapse-link-for-dataverse/ Wed, 24 Nov 2021 07:59:00 +0000

What is this announcement?

Today, we are announcing the deprecation of Data Export Service (DES), an add-on feature available via Microsoft AppSource that provides the ability to replicate data from Microsoft Dataverse to an Azure SQL store in a customer-owned Azure subscription. Data Export Service will continue to work and will be fully supported until it reaches end-of-support and end-of-life a year from this announcement date, in November 2022. This notice is to allow you sufficient time to plan and onboard to Azure Synapse Link for Dataverse, which is the successor to Data Export Service.

Why is Data Export Service being deprecated, and what is the replacement?

Data is the new oil: when it is collated, analyzed, and connected with relevant data efficiently, it can unfold insights that drive tomorrow’s business decisions. More and more customers using Dynamics 365 and modern business applications are generating massive amounts of business data that is securely stored and managed by Dataverse. Data not only helps organizations understand their current customers, the market, and competitors; with the right tooling it can also enable advanced scenarios that anticipate future behaviors, such as propensity modelling to predict the likelihood of customers performing certain actions. Harnessing insights from large volumes of data requires sophisticated end-to-end tooling.

Our first step in this journey was Data Export Service, and while it enabled exporting the data, it was just that: a vehicle to push data to a store. Our customers then had to search for tools that could analyze this data and, if possible, derive insights from it. All of this was not just unintuitive but also a time-consuming process with no assurance of success. And doing this on a continuous basis, with constantly changing data and metadata in the source, was becoming an increasingly challenging process.

In conversations with the community, many have asked for a single, end-to-end way to work with data in Microsoft Dataverse, from running AI and machine learning to integrating with external datasets and slicing and dicing large volumes of Dataverse data. Azure Synapse Link for Dataverse, made generally available at Ignite 2021, serves as a single comprehensive solution that can help you deliver on these goals end-to-end. (Note: The ability to export Dataverse data to Azure Data Lake has been generally available since February 2020.) Now, instead of using multiple tools to get the job done, you can accelerate time-to-insights using Azure Synapse Link for Dataverse, something that is built in and available out of the box.

 

What is Azure Synapse Link for Dataverse?

Azure Synapse Link for Dataverse enables you to get near real-time insights over your data in Microsoft Dataverse. With a few clicks, you can bring your Dataverse data to Azure Synapse, visualize data in your Azure Synapse workspace, and rapidly start processing the data to discover insights using advanced analytics capabilities for serverless data lake exploration, code-free data integration, data flows for extract, transform, load (ETL) pipelines, and optimized Apache Spark for big data analytics. Enterprise customers are able to use the familiarity of T-SQL to analyze big data and gain insights from it, while optimizing their data transformation pipeline to leverage the deep integration of Azure Synapse with other Azure services such as Power BI Embedded, Azure CosmosDB, Azure Machine Learning, and Azure Cognitive Services.
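To make the “a few clicks to data you can analyze” idea concrete: once a Synapse Link profile has exported a Dataverse table to your Azure data lake, the exported files can be explored with the tooling of your choice alongside the built-in Synapse experiences. The sketch below is illustrative only; it assumes you have downloaded an exported account table as a local CSV with a header row, and the file name and column names are made up rather than a prescribed format.

import pandas as pd

# Illustrative: a Dataverse account table exported by Azure Synapse Link, saved locally as account.csv.
accounts = pd.read_csv("account.csv")

# Quick exploration: total revenue and number of accounts per country (column names are assumptions).
summary = (
    accounts.groupby("address1_country")
    .agg(total_revenue=("revenue", "sum"), account_count=("name", "count"))
    .sort_values("total_revenue", ascending=False)
)
print(summary.head())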

Azure Synapse Link for Dataverse is the fastest path to success for our customers to analyze data and generate insights as our experiences are ready to run as soon as you launch your Azure Synapse workspace.

Built-in capabilities for data ingestion, data preparation and machine learning

How are customers using Azure Synapse Link for Dataverse today?

Our use cases span a wide variety of industries, including retail, healthcare, non-profit, telecommunications, and government. The following are some common usage patterns we have seen across these industries.

Our Dynamics 365 Sales customers use Azure Synapse Link for Dataverse to bring the Dataverse data into Azure Synapse Analytics and then leverage Azure Synapse Analytics Spark to load in external data along with Azure Synapse Analytics Serverless SQL to create custom views over both datasets. Once the data is prepared, they connect to their custom dataset with Power BI to create a YoY revenue report. Additionally, they leverage the AI integration built into Azure Synapse Analytics for custom forecast prediction over their exported Dataverse data.

Many customers around the world use Azure Synapse Link for Dataverse to bring COVID-19-related data from Dataverse to Azure Synapse Analytics. They leverage Azure Synapse Analytics pipelines to pull data from various lab facilities and use data flows to transform the data and derive Key Performance Indicators (KPIs) that are published to a Power BI report. This empowers them to create custom analysis and tracking of their local COVID-19 situation. Additionally, many governments use Azure Synapse Link for Dataverse to create a solution that tracks Personal Protective Equipment (PPE) inventory and ensures health providers have the necessary equipment and supplies to treat patients during the pandemic.

And the use cases aren’t just limited to creating reports or running AI and machine learning. There are also instances where customers combine Dataverse data in Azure Synapse Analytics with external data and integrate with Customer Insights, which enriches Dynamics 365 Marketing’s segmentation capabilities, allowing precise audience targeting and increased campaign effectiveness.

How is Microsoft helping customers to onboard to Azure Synapse Link for Dataverse?

We want to be sensitive to the fact that deprecation is hard and onboarding to newer features does not happen overnight. To this end, we have put together the Data Export Service Deprecation Playbook, which will help our customers plan their move to Azure Synapse Link for Dataverse. The playbook will guide them through the initialization, planning, and adoption phases. More importantly, it will help our customers assess their existing usage patterns and map them to equivalent scenarios in Azure Synapse Link for Dataverse with a step-by-step guide: https://aka.ms/DESDeprecationPlaybook.

Next Steps

While Data Export Service will continue to work and will be fully supported, we will not be adding any new features until it reaches end-of-support and end-of-life a year from this announcement date, in November 2022. Our ask to our customers is to ACT NOW and START PLANNING the onboarding process to Azure Synapse Link for Dataverse. Leverage our documentation and step-by-step guide to test Azure Synapse Link for Dataverse and see how your current scenarios can be transitioned to Synapse Link. If you don’t see a scenario that matches yours, or you wish to bounce a few questions off us, please reach out to des-synapselink@service.microsoft.com.

Please take advantage of this opportunity to embark on a journey to do more with data by identifying your organization’s analytical needs and turning data into insights for making effective business decisions.

Announcing Public Preview for modernize business units http://approjects.co.za/?big=en-us/power-platform/blog/power-apps/announcing-public-preview-for-modernize-business-units/ Tue, 02 Nov 2021 13:30:18 +0000 Dataverse security concepts - Modernize Business Units

We have modernized the business unit security concepts in Dataverse. Business units’ data access can now support a matrix data access structure. The matrix structure is typical for sales organizations where a regional sales manager can access data in multiple regions, or retail organizations where a store clerk works at multiple stores.

Users are no longer restricted to accessing or managing data in their own business unit. They can now access and own records across business units. Security roles from different business units can be assigned to users regardless of the business unit the users belong to. This feature enables the Owning Business Unit column of the record so that it can be set and updated by users. The Owning Business Unit column determines the business unit that owns the record.

This feature is currently being rolled out for public preview during November and December 2021. Power Platform admins can enable this feature in their environment via the Power Platform admin center.

  1. Select the Environments tab, and then choose the environment that you want to enable this feature for.
  2. Select Settings –> Product –> Features.
  3. Turn on Record ownership across business units (Preview).

Note: for the public preview, please enable this feature switch in your non-production environments.

See Modernize Business Unit video for a demo of this feature.

Full documentation of this capability can be found in the following topics:

  1. Dataverse security concept on Business Units
  2. Modernize business unit – Matrix data access structure
  3. Introducing Owning Business Units
  4. Assign security roles from different business units
  5. Manager hierarchy direct report’s business unit and manager’s business unit
  6. Change the business unit for a user
  7. Change the owner and/or business unit of a record
  8. Cascading effects for parental table relationship behaviors when owner and/or business unit of a record is changed

Try it out and let us know what you think!  And stay tuned for when it is released for General Availability.

Canvas App Troubleshooting – Part 1 http://approjects.co.za/?big=en-us/power-platform/blog/power-apps/canvas-app-troubleshooting-part-1/ Thu, 07 Oct 2021 16:00:00 +0000 Take the first steps in self-diagnosing and mitigating issues in your apps.

Overview

In this blog series, we’ll discuss some strategies our support team uses when troubleshooting issues in canvas apps. We’ll focus on strategies that any app maker can use to diagnose and fix issues in their apps. To illustrate, let’s start with a scenario.

Scenario: A Filter() statement that returns no items

Let’s say you have a gallery, and the Items property for the gallery is something like this:

Filter(MySharePointList, ContactEmail=cmbContacts.Selected.Email)

Where MySharePointList is a SharePoint list containing Departments in your company, and cmbContacts is a Combo box of possible department contacts, populated from an Excel spreadsheet.

Problem: the gallery isn’t showing any records. What could be causing this?

Break the formula into component parts

In the Filter() statement, we have three elements:

  • The SharePoint list “MySharePointList”
  • The column on that list “ContactEmail”
  • The selected record cmbContacts.Selected.Email

Let’s interrogate each of these elements.

Does the datasource have any rows that Power Apps can see?

This should be an easy test. In our example, we can simply set the gallery’s Items property to MySharePointList (no filter). If the unfiltered datasource populates some items in the gallery, then we know that there are rows present.

Looking good!

What data does the ContactEmail column contain?

It’s important to validate that the ContactEmail column has some data in it, and what kind of data is in the column. You can actually see this data in the previous screenshot, but sometimes it helps to go right to the datasource (SharePoint) and see what’s present.

Now we know that this ContactEmail column has three possible values: andrea@contoso.com, bill@contoso.com, or marie@contoso.com.

What is the value of cmbContacts.Selected.Email?

To check this, create a label and set its text property to cmbContacts.Selected.Email, so we can validate if that Email value looks correct.

Uh oh! That doesn’t look right. We have differing values for cmbContacts.Selected.Email and ContactEmail:

  • Andrea.R@contoso.com
  • andrea@contoso.com

This seems to be our problem. If we investigate further, we’ll find that, while the SharePoint list has recorded the email address of these users, the Excel spreadsheet I used to populate my Combo box actually has the users’ UPNs, and those are not the same!

Taking it one step further

Here I want to introduce one of the sharpest tools in our toolbelt: Power Apps Monitor. There are resources available on Monitor, so I won’t go into every detail of using it.

Debugging canvas apps with Monitor
Collaborative troubleshooting using Monitor
Advanced monitoring concepts

In our case, we want to capture a monitor trace while our gallery is loading, and try selecting some different filters via the Combo Box.

Once we have that captured, look for rows in the Monitor trace where the Control column matches our gallery’s name, and the Property column says “Items”. In my case, that would be the rows highlighted:

You will see that in my case these rows are coming in pairs. By clicking on a row with Operation “getRows”, I can see much more detail about the request to get this data from SharePoint.

I want to draw attention to the highlighted section on the right. This is the actual HTTP request the Power App is making to fetch relevant records from SharePoint. It looks a bit messy, but if we run it through a URL decoder a few times, we can get a legible result (I’ll be removing the SharePoint domain and replacing it with <MyDomain>):

https://unitedstates-002.azure-apim.net/apim/sharepointonline/<connectionId>/datasets/https://<MyDomain>.sharepoint.com/sites/TomJ/tables/949a20cb-5647-436f-a696-f609ea26a8f0/items?$filter=ContactEmail eq 'Andrea.R@contoso.com'&$top=100

This entire URL is packed with interesting information, but the part after “datasets/” is what concerns us right now.

https://<MyDomain>.sharepoint.com/sites/TomJ/tables/949a20cb-5647-436f-a696-f609ea26a8f0/items?$filter=ContactEmail eq 'Andrea.R@contoso.com'&$top=100

Here, you can see exactly the filter that is going to be applied when we query SharePoint: ContactEmail eq 'Andrea.R@contoso.com'. All it would take is a quick glance at SharePoint to confirm that there aren’t any items that will fulfill this condition. With a little knowledge of the SharePoint REST API, I can even turn this into an API query I can run in my browser:

https://<MyDomain>.sharepoint.com/sites/TomJ/_api/web/lists(guid'949a20cb-5647-436f-a696-f609ea26a8f0')/items?$filter=ContactEmail%20eq%20%27Andrea.R@contoso.com%27

And compare it to what the filter should be:

https://<MyDomain>.sharepoint.com/sites/TomJ/_api/web/lists(guid'949a20cb-5647-436f-a696-f609ea26a8f0')/items?$filter=ContactEmail%20eq%20%27Andrea@contoso.com%27
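If you prefer to do the repeated URL-decoding in code rather than in an online decoder, a short sketch using Python’s standard library handles it. The encoded string below is an illustrative re-encoding of the dataset portion of the captured request, not the exact bytes from Monitor.

from urllib.parse import unquote

def fully_decode(url: str) -> str:
    # Apply URL-decoding repeatedly until the string stops changing.
    decoded = unquote(url)
    while decoded != url:
        url, decoded = decoded, unquote(decoded)
    return decoded

captured = (
    "https://<MyDomain>.sharepoint.com/sites/TomJ/tables/"
    "949a20cb-5647-436f-a696-f609ea26a8f0/items%3F%24filter%3DContactEmail%20eq%20"
    "%27Andrea.R%40contoso.com%27%26%24top%3D100"
)  # illustrative encoded form of the request captured by Monitor

print(fully_decode(captured))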

Takeaways

This scenario is inspired by a real case that came to our support team recently. The customer found themselves in a scenario I’m sure many are familiar with – supporting an app that was authored by someone else, with little documentation on how the app works. Of course, it would be easier if the app had come with documentation or robust inline comments, but often that is not the case.

The solution to this issue was ensuring that both sides of the Filter() expression were using the Email address, rather than one using Email and the other using UPN. In the course of troubleshooting, there were other possible causes we had considered, including:

  • Maybe there were genuinely no records in the SharePoint list for the selected Contact
  • Such records existed, but the logged-in user didn’t have access to them
  • Perhaps ContactEmail and cmbContacts.Selected.Email were formatted differently and there are no exact matches

If the mismatch between ContactEmail and cmbContacts.Selected.Email had not been the problem, then we would be in the realm of “issues that may not have a simple explanation”. If you find yourself reaching that point, it would definitely make sense to reach out to us at the Support team!

Best security practices for Power Apps http://approjects.co.za/?big=en-us/power-platform/blog/power-apps/best-security-practices-for-power-apps/ Sat, 11 Sep 2021 00:24:25 +0000 A reminder on best practices when working with Power Apps and external data sources.

Best practices for securely using external data sources with Power Apps

We get questions from time to time about how our customers should work securely with Power Apps. Security and privacy are very important to us. For the most control over both security and privacy, we recommend Dataverse, which has best-in-class security and privacy features.

However, customers may not have their data in Dataverse, and it’s important for customers to be able to connect to data where it lives. Power Apps enables this scenario with a very rich set of connectors. As part of the deployment of your app, however, you should be clear about the security implications of how authentication to data is enabled for your app.

We talk about connections being “implicitly” or “explicitly” shared.   By this we mean that the authentication method used for the connection is either explicit or implicit.

An explicitly shared connection means that the end user of the application must authenticate to the back-end data source (for example, SQL Server) with their own explicit credentials. Usually this authentication happens behind the scenes as part of an Azure Active Directory or Windows authentication handshake. The user doesn’t even notice when the authentication takes place.

Explicitly shared connections are the most secure. They authenticate with the user’s identity on the server and then formulate queries (for example, filtering) on the server. For instance, to securely filter data on the server side for SQL Server, such an app uses built-in security features in SQL Server, such as row-level security and deny permissions on specific objects (such as columns) for specific users. This approach uses the Azure AD user identity to filter the data on the server.

An implicitly shared connection means that the user implicitly uses the credentials of the account that the app maker used to connect and authenticate to the data source while creating the app. The end user’s credentials are not used to authenticate. Each time the end user runs the app, they’re using the credentials the author created the app with.

An implicitly shared connection is the least secure. It has all of the risks associated with a connection made directly to a server or service. In particular, you cannot rely on filtering commands to be secure, and even the name of the database and other details can be discovered. Consequently, we actively discourage the use of implicitly shared connections except in narrow scenarios where the data and access are already public. If you have a connection of this type, we encourage you to consider a more secure connection type.

Connection choices

Some data sources (such as SQL Server) have multiple ways in which you can connect.   For example, the following four connection authentication types can be used with SQL Server for Power Apps:

Authentication Type                  | Power Apps connection method
Azure AD Integrated                  | Explicit
SQL Server Authentication            | Implicit
Windows Authentication               | Implicit
Windows Authentication (non-shared)  | Explicit

Future

We are always looking to improve our product and welcome feedback you may have.

See also

Use Microsoft SQL Server securely with Power Apps
Overview of connectors for canvas apps

Important Bug Fix Information: ActivityPointer Tables Cannot Add Custom Columns http://approjects.co.za/?big=en-us/power-platform/blog/power-apps/important-bug-fix-information-activitypointer-tables-cannot-add-custom-columns/ Fri, 30 Jul 2021 20:20:00 +0000 A bug fix to prevent users from adding columns to ActivityPointer is being deployed. If users added any custom columns to ActivityPointer they may need to take action.

We have fixed a recently discovered bug that allowed users to create custom columns on the ActivityPointer table.  Creating custom columns on the ActivityPointer table is not a supported feature.

Only organizations that have added a custom column to the ActivityPointer table, either manually or via a solution, will be impacted. The bug fix will prevent new custom columns from being added to the ActivityPointer table, either manually or through solutions. If a custom column was previously added to the ActivityPointer table and is part of an existing solution, it will need to be removed prior to importing the solution.

This bug was present in organizations created starting May 20, 2020. The fix for this bug is currently being deployed to all regions. The deployment began on July 16, 2021, and is scheduled to complete worldwide distribution to standard organizations on August 6, 2021.

Bug Fix Details:

In Dynamics 365, ActivityPointer is the root table for all activity tables. All columns of ActivityPointer are system columns and are inherited by all activity tables. Custom columns are not allowed because of the impact to other existing activities. If an activity requires a custom column for data, it can be created on individual custom activities directly.

Once the fix is applied to an organization, users will no longer be able to create custom columns on the ActivityPointer table.

If users have created custom columns on the ActivityPointer table while the bug was active, they will notice the following behaviors:

  • Attempting to create a new custom column on the ActivityPointer table will fail.
  • Importing a solution that includes a custom column on the ActivityPointer table will fail.

Call to Action:

If you have an environment that includes these columns, you will need to take some action to resolve them.

Importing a solution containing custom columns on ActivityPointer is blocked

If a solution contains custom columns on the ActivityPointer table it will be blocked from import. The following error will be provided:

The evaluation of the current component (name=Attribute, id={guid}) in the current operation (Create) failed during managed property evaluation of condition: Managed Property Name: iscomponentcreationenabled; Component Name: Attribute;

To resolve this issue, go to the source environment of the solution, delete all the custom columns from the ActivityPointer table, and then export the solution again. If you do not own the solution, the solution publisher will need to fix it and provide a new solution.

Clean Up Custom Columns on ActivityPointer

As part of the bug fix, Microsoft will run a job to remove all custom columns from unmanaged ActivityPointer tables. The ActivityPointer table does not store any data, so removing custom columns is nondestructive. Removing the column from the ActivityPointer table will not remove the cloned columns on the activities that store data.

If the custom columns were introduced by a managed solution import, those columns will not be deleted automatically. Customers need to follow the steps below to delete them:

  1. Remove the custom attribute from the solution in the source organization.
  2. Export the updated solution as an incremented version.
  3. Import the solution into the target organization with the ‘Upgrade’ option.

If the user does not own the solution, the solution publisher will need to fix and provide a new solution.

Clean Up Inherited Custom Columns From Individual Activity Tables

If custom columns were added to the ActivityPointer table, any activities added after the change will include the inherited custom columns. Users can choose to keep the columns on the activity, including their data, as long as the columns are removed from the ActivityPointer table.

If the custom activity was created using a Power Apps client UI or the import of an unmanaged solution, the custom columns cloned from ActivityPointer will be unmanaged. These columns can continue to be used on the Activity, or the user can choose to remove them if they no longer need them.

If the custom activity was created by importing a managed solution, the custom columns inherited from ActivityPointer are created on a managed solution layer even though they were not included in the custom activity initially.

Since these columns were not intended to be introduced by solutions, they need to be cleaned up by upgrading the managed solutions. If these columns are required by the target organization, customers should go to the source organization, include all the required custom columns of the individual activities in the solution, and upgrade the solution in the target organization.

Example:

  • Solution 1.0.0.0 includes the ActivityPointer table with a new column named ‘Likes’.
  • The user imports solution 1.0.0.1 which does not have the ActivityPointer table but it includes a new activity: ‘BlogPost’.
  • Upon import of Solution 1.0.0.1, ‘BlogPost’ will inherit ‘Likes’.

Users will need to fix Solution 1.0.0.0 to remove ‘Likes’ from the ActivityPointer table.

If users want to keep the ‘Likes’ column on ‘BlogPost’, they should update Solution 1.0.0.1 to include the ‘Likes’ column as part of ‘BlogPost’.

 

Rename your Power Apps action-based data sources http://approjects.co.za/?big=en-us/power-platform/blog/power-apps/rename-your-power-apps-action-based-data-sources/ Tue, 15 Jun 2021 18:14:21 +0000 Rename of connectors

We are happy to announce that Power Apps action-based data sources can be renamed. This feature can help customers avoid extensive manual renaming of formula-referenced data sources in a Power App. The feature will reach all regions by the end of this week, June 18, 2021.

How are data source names generated?

Data source names are generated from the display name of the connection they are based on. The first instance of a data source name in an app is typically exactly the same as the connection name. For instance, if I used an “AzureDevOps” connection as the basis for my data source, it would be named “AzureDevOps”. And if I add another “AzureDevOps”-based data source to the same application, the second data source will be named “AzureDevOps_1”.

Renamed connectors

Occasionally, a connector author will change the display name of the connector.  For instance, the author may change the display name from “OldConnector” to “NewConnector”. Your existing Power Apps will continue to work fine even though your data source names say “OldConnector” since your data sources still point (under the covers) to the correct connector type.

However, if for some reason you drop your existing data source and re-add it, the new data source will be named “NewConnector” while your formulas still reference “OldConnector”.  This will cause formula errors wherever these data source names exist.

Fixing broken formula references to data source names

Using the new “Rename” feature, you can rename your action-based connectors and fix up your formulas automatically.

To fix this kind of problem you:

  1. Rename the data source from the new display name back to the old display name.  For example, you would rename the data source display name from “NewConnector” back to “OldConnector”.  Once you do this, all of your formula references will reconnect and work as they did previously.  Your application should now work correctly, and you could stop at this step and be done.
  2. (Optional)  Rename your data source back to the new name.  Once your data source name and your formulas are in sync, you can rename your data source again to bring it in line with the new connector name.  For example, you can rename your data source to “NewConnector”, and the data source name and all of the formula references will be updated to “NewConnector”.  It’s a good idea to do this because if you ever have to drop your data source again, re-adding the data source will be based on the newer connector display name.  Performing this step will help you avoid future issues.

This works because the data source display name and the data source names referenced in formulas are in sync (that is, they have exactly the same display name). The rename cannot update the formulas before you rename your data source back, because until then the data source display name and the formula-referenced data source names are not the same.

Limitation: action-based connectors, not tabular connectors

This rename capability is limited to action-based connectors.  It does not work for tabular data sources.  Tabular data sources use the name of the table or entity they point at, not the general name of the service hosting them.

Introducing Command Checker for model-app ribbons http://approjects.co.za/?big=en-us/power-platform/blog/power-apps/introducing-command-checker-for-model-app-ribbons/ Mon, 30 Mar 2020 15:04:32 +0000

We’re excited to announce a new feature adding transparency to model-driven app ribbons in Power Apps!  Command Checker is an in-app tool designed to help users understand why buttons are hidden or shown and what command will run upon click.

For each button on a given ribbon, the tool will show its calculated visibility status, the evaluation result of each enable/display rule attached to the button, and the command to be executed when clicked.  In addition, we have built the ability to see solution layers contributing to the final set of rules and buttons.

If you’re experiencing issues with command button visibility or execution (or are just curious about ribbon behavior), try out our step-by-step troubleshooting guide.

Let’s see the tool in action!  To enable Command Checker, pass ribbondebug=true as a URL parameter (for example: https://myorg.crm.dynamics.com/main.aspx?appid=c26d1c44-e7c0-4c72-9d6d-0e82768cb5bd&ribbondebug=true).  You’ll see two new UI features light up.  The first is a new button in the top right of the header, which lets you inspect the global command bar.

Next, each command bar contains a new “Command checker” button.  Note that this shows up at the end of the ribbon, so you may have to click the overflow flyout.

Let’s explore the experience upon clicking the button.  For this example, we’ll use the account edit form.  The tool is overlaid on top of the page as seen below.

On the left, the context and entity name are displayed above a tree of tabs, groups, flyouts, and buttons.  On the right, the details of the currently selected button are shown.  The tree is collapsible at any tab, group, and flyout level for a cleaner experience when desired.  The tree contents reflect the current state of the inspected ribbon (of course, this is dependent on each solution layer installed).

Buttons and flyouts in italicized grey are hidden, and those in black text are visible.  Note a flyout will be hidden when all its children are hidden.  Likewise, all children will be hidden when a flyout evaluates to hidden.

Let’s drill down into one of the buttons to see why it’s hidden.  We’ll use the Mark Complete button as an example.  Clicking on this button shows the following breakdown.

You can see the name of the button and other properties from the ribbon XML displayed on the right.  Clicking on Command properties on the right pane header will show details on execution and rule evaluation.

You can quickly see that the Mscrm.PrimaryIsActivity display rule evaluated to false in this page load, and in model-app ribbons, the button is hidden if any rule evaluates to false.  Drilling down into the evaluation, you can see which rule was used to determine visibility.

The rule was an EntityPropertyRule which checks the IsActivity property of the PrimaryEntity (this being the entity used on the form).  Since the account entity has an IsActivity property of false, the rule itself evaluates to false, and the button is hidden.  This level of detail exists for all types of rules, namely the JavaScriptRule, which shows the exact function name and JS file used to determine visibility (example below).

This breakdown is intended to demystify the hiding and showing of buttons in any context, be it an edit form, subgrid, home page grid, or the global command bar.  The tool will also show the method to be executed upon click, including function name and JS file.

Let’s next look at the solution layer drilldown.  Within Command checker, you can view the solution layering, including diffing ability, for button, command, and rule definitions.  We’ll use the Save button on the opportunity form as an example.  In this example, Contoso has imported a rule to always show the Save button, regardless of auto-save status.  Let’s click on the button, navigate to Command properties, open the Mscrm.IsAutoSaveDisable rule and click View rule definition solution layers.

You can see all solutions which contribute to this command definition with publisher, order, and entity context displayed.  One common mystery of button visibility is when many solution layers add and remove rules, so it’s hard to determine which are in the final set.  Let’s compare the custom and base solutions to see how a rule was modified.  We’ll select the two records.  Then we’ll hit Compare.

In this view, you can select any two solutions for button, command, or rule definition comparison.  Here, we are comparing the top solution (ContosoCustomization) with the base solution (System).  You can see that the top solution (on the right) has overridden the base rule definition and instead uses the alwaysEnabled function to return true, meaning the Mscrm.IsAutoSaveDisable rule will always evaluate to true.  This should help clarify how solution layering has impact on the final rule set.

Feedback

We hope you’ll take some time to try out this new diagnostics feature, whether investigating a user-facing issue or simply exploring the buttons and rules present in the system.

Please help us improve the tool by providing feedback in our Power Apps community post.

What’s next

We’re excited for you to try out Command Checker, and we’re already working on improvements to make it more useful:

  • Allow Command Checker usage through a new privilege rather than URL parameter
  • Support for the HideCustomAction element
  • Visualize commands hidden server-side before they reach the client
  • Support for buttons hidden for reasons other than rule evaluation
  • Add detection for common ribbon mistakes and show them as errors and warnings
  • Show lock icons for system solutions that are not modifiable

PowerApps weekly release summaries are live! Tune in for new features and bug fixes! http://approjects.co.za/?big=en-us/power-platform/blog/power-apps/stay-tuned-with-the-latest-features-and-fixes-through-powerapps-weekly-release-notes/ Tue, 11 Dec 2018 21:34:55 +0000


Hey everyone! While you have been building stunning applications on PowerApps, you’ve requested a way to view the latest enhancements and new features that our team is releasing each week. Today we are launching PowerApps Weekly Release Notes. This will give you visibility into the latest features and optimizations that we’re shipping on a weekly basis.

Over the past few months we’ve listened to your needs and are formally launching Release Notes that will provide you with the following information:

  1. Preview vs. worldwide region availability – plan ahead and try out the latest in preview before it reaches your local region
  2. Versions for each region – troubleshoot issues and see if an update is on its way
  3. Item classification – drill down into each release to see what new features are available and what bug fixes are deployed
  4. Region-specific deployments – track the deployments across geographies
  5. Functional segregation of items – drill into each sub-product release:
    1. Common Data Service for Apps
    2. Web.powerapps.com
    3. PowerApps Studio & Mobile

How to navigate the Release Notes page?

  1. Latest version availability – track the incremental rollout across various regions
    1. Region – supported regions
    2. Common Data Service for Apps – CDS versions that have been deployed
    3. Web.PowerApps.com – Maker Portal versions that have been deployed
    4. PowerApps Studio and PowerApps Mobile – Studio and Player versions that have been deployed
  2. Preview region and worldwide availability of functional areas – you will know what’s in preview and what’s been rolled out globally
      1. Version – version number for the functional area
      2. Preview Region (First available) – releases over time for the preview region
      3. Worldwide availability –  releases over time in all regions

     

How is each release set up?

As you start exploring the content for each version, you’ll notice two sections.

  1. Features – shows the features that are shipped in this version (at the top of the page)
  2. Updates and Improvements – shows updates and optimizations that were introduced in this version (at the bottom of the page)

 

Weekly cadence for the release notes

Every week, we’ll publish the release notes on this site: https://docs.microsoft.com/business-applications-release-notes/powerplatform/released-versions/powerapps. We will also publish a blog every week summarizing the release notes, so you may also want to subscribe to the blog RSS feed.

We’ll continue to make incremental updates to the release notes so that they’re not only insightful but also easier for you to consume. We look forward to hearing your feedback so that we can continue optimizing the release notes.
