James Oleinik, Author at Microsoft Power Platform Blog
http://approjects.co.za/?big=en-us/power-platform/blog

Explore Ignite 2024: Unlock enterprise knowledge with Dataverse
http://approjects.co.za/?big=en-us/power-platform/blog/it-pro/explore-ignite-2024-unlock-enterprise-knowledge-with-dataverse/
Wed, 13 Nov 2024 17:23:18 +0000

Join us at Ignite 2024! See our listed sessions and unlock enterprise knowledge with Dataverse.

Join us in person in Chicago, or virtually, for Microsoft Ignite 2024 from November 18 to 22. We’re excited to share the latest Dataverse updates with the low-code maker community. Microsoft Ignite is your opportunity to discover solutions that help you build intelligent apps and agents, safeguard data, accelerate productivity, and expand your services, all while connecting with partners and growing your community or business.

Can’t make it in person? No problem: register as a digital attendee for Microsoft Ignite 2024.

Below is the lineup of sessions that will showcase the upcoming releases from Dataverse, along with samples, demos, and how-to guides for building effective and efficient low-code experiences. If you are a maker in the low-code space, make sure to add these to your Ignite backpack.

Breakout: In-person & Virtual 

BRK165: What’s new with Copilot Studio and agents 

Tues, Nov 19 from 2:45 PM to 3:30 PM CT | Omar Aftab, Ray Smith 

Abstract: Over the past six months, Copilot Studio has grown rapidly. Join us to hear the latest on how Copilot Studio is enabling organizations to build agents and realize value from Microsoft 365 Copilot. We’ll highlight major recent improvements and discuss what comes next. 

Breakout: In-person & Virtual 

BRK180: Enterprise Scale: The Future of Power Platform Governance + Security 

Tues, Nov 19 from 5:15 PM to 6:00 PM CT | Ryan Jones, Shawn Nandi 

Abstract: The Power Platform provides control, visibility, and efficiency for your low-code solutions across Power Apps, Power Automate, Power Pages, Dataverse, and Microsoft Copilot Studio. Learn about the latest innovations to govern, protect, monitor, and manage your large-scale deployments. Through real-world use cases, discover effective strategies to enhance your governance posture and safeguard the Power Platform at scale with new, powerful tools. Join us to revolutionize your low-code management. 

Breakout: In-person & Virtual 

BRK275: Ground Microsoft 365 Copilot in your business knowledge 

Wed, Nov 20 from 1:15 PM to 2:00 PM CT | Mike Bassani & Jenn Cockrell 

Abstract: Unlock the full potential of Microsoft 365 Copilot by grounding it in more of your enterprise data with Copilot connectors. This technical session will guide you through indexing your business data and content into Copilot and Copilot agents, with strict enforcement of your data governance policies. Join us to learn how to maximize the impact of your data, make Copilot smarter for every role in your company, and scale business value. 

Breakout: In-person & Virtual 

BRK182: Get started with best-in-class Copilot connectors in Copilot Studio 

Wed, Nov 20 from 5:00 PM to 5:45 PM CT | Sabin Nair, Nithin Ravindra 

Abstract: Explore Copilot connectors and revolutionize the way agents interact with business and productivity data. Learn how Copilot connectors enable you to effortlessly unify business data stored in Microsoft Dataverse with productivity data from Microsoft Graph, all within a single tool. No more switching between different platforms—Copilot connectors allow you to access, manage, and act on all your data from one interface, streamlining workflows and simplifying your daily tasks. 

Breakout: In-person & Virtual 

BRK181: Get the most of your enterprise knowledge with Copilot Studio 

Thurs, Nov 21 from 1:15 PM to 2:00 PM CT | Julie Koesmarno, Rakesh Krishnan 

Abstract: In this session, we will explore how to effectively leverage your enterprise knowledge within Microsoft Copilot Studio to enhance your agents with more relevant and contextual answers. You’ll discover the diverse types of knowledge available and learn how to optimize them to meet your specific needs. 

Breakout: In-person & Virtual 

BRK176: Automatically process e-mails, documents and images using GPT and Power Automate 

Thurs, Nov 21 from 3:45 PM to 4:30 PM CT | Gwenael Bego, Maria Remolina Gutierrez 

Abstract: Power Platform’s prompt builder has been enhanced to handle multi-modal content, now supporting both documents and images. When integrated with the structured outputs feature in a Power Automate flow, these capabilities allow you to automate the processing of incoming emails, documents, and images, leading to significant improvements in efficiency and productivity. 

Demo: In-person only 

THR540: Getting the most out of the new Power Platform Admin Center 

Wed, Nov 20 from 1:45 PM to 2:00 PM CT | Zohar Raz 

Abstract: Power Platform Admin Center has been updated to make administration of Power Platform even easier! Learn about all the changes and how Power Platform governance is ready for enterprise-scale. 

Demo: In-person only 

THR541: How to discover, install and publish your agents in Copilot Studio 

Thurs, Nov 21 from 8:45 AM to 9:00 AM CT | Yogi Naik 

Abstract: This session is for those who are building agents in Copilot Studio and wondering how a customer can use the agent for their business. The session will also showcase how out-of-the-box, Microsoft-built first-party agents can be customized for specific use cases to drive further business value! Join us to learn about the new discovery, acquisition, and customization flows for agents in Copilot Studio. 

Demo: In-person only 

THR643: Leveraging rich visibility to secure AI driven business applications 

Thurs, Nov 21 from 10:45 AM to 11:15 AM CT | Scott Magee, Jocelyn Panchal 

Abstract: Protect your Power Platform resources at each layer of the stack by leveraging actionable insights to make informed administrative decisions in the age of AI. We will discuss how to monitor utilization at scale with comprehensive visibility and detailed metrics. From there, we will craft a security strategy with thoughtful user provisioning, data exfiltration prevention, network isolation, and more to ensure that your organization and its data are safeguarded while adopting AI. 

Lab: In-person only 

LAB436: Build an agent with your enterprise knowledge in Copilot Studio  

Wed, Nov 20 from 10:15 AM to 11:30 AM CT | Nathan Helgren, Rama Krishnamoorthy 

Abstract: Calling all Microsoft Dynamics 365 customers! Learn how to build an agent in low code that will consume existing business data stored in Microsoft Dataverse, and add uploaded files, to provide quick access to important information for natural language queries. Then understand how to share the agent to ensure appropriate access control for intended users across your organization. 

We hope you join us for Microsoft’s flagship event. Let’s build the future together!  

Hyperscale Your Enterprise Business Applications with Microsoft Dataverse
http://approjects.co.za/?big=en-us/power-platform/blog/power-apps/hyperscale-your-enterprise-business-applications-with-microsoft-dataverse/
Fri, 04 Oct 2024 20:47:02 +0000

Over the past three years, the annual growth rate of enterprise data within Fortune 500 companies has been remarkable, driven by an exponential increase in data generation and the strategic imperative to leverage data analytics. The volume of data continues to grow as enterprises build and deploy new applications within their organizations. IDC has predicted that 1 billion applications will be built over the next 4 years, each of which will generate data that must be ingested, transformed, activated, and managed. 

The accelerating growth in data volume, coupled with its siloed nature, makes deriving insights costly and time-consuming. As a result, enterprises must improve data access to drive productivity and efficiency across different business teams. However, they face seemingly insurmountable hurdles when their data is contained in disparate sources. Beyond the inherent challenges posed by data silos, every enterprise is acutely aware of the security risks it faces due to the rise of active targeting by global threat actors, making security a non-negotiable priority. IT must safeguard data while enabling the enterprise to manage it at scale without redundancy or increased costs. 

Recent technological advancements in AI and machine learning, including data-grounded Copilots and context-rich analytical and recommendation systems, present enterprises with new ways to manage their data estates at scale using Microsoft Dataverse and deliver generative AI solutions with this knowledge using Microsoft Copilot Studio.  

Microsoft Dataverse offers a comprehensive solution by empowering low-code makers to accomplish more, while providing pro developers and IT teams with enhanced power and scope. This enables enterprises to unlock greater value from their data, while ensuring cost-efficiency, security, governance, and compliance across the entire application data lifecycle. 

[Graphic: Data lifecycle management with Dataverse]

With applications built on Dataverse, enterprises can improve productivity at every phase of the data lifecycle. Dataverse provides: 

  • Seamless ingestion of data with AI-assisted mapping; 
  • Business insights through direct SQL querying over the Tabular Data Stream (TDS) endpoint, along with Power BI capabilities and built-in Microsoft Fabric link capabilities (a minimal query sketch follows this list);  
  • Storage capacity tools to manage data at scale; and 
  • The enterprise-grade security of Microsoft Entra ID. 
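For makers who want to try the direct-query path, here is a minimal sketch of a read-only SQL query against the Dataverse TDS endpoint from Python. It assumes the Microsoft ODBC Driver for SQL Server, the pyodbc package, and an interactive Microsoft Entra ID sign-in; the organization name and the columns queried are placeholders to adapt to your own environment.

```python
# Minimal sketch: read-only SQL over the Dataverse TDS endpoint.
# Assumptions: ODBC Driver 18 for SQL Server is installed, pyodbc is available,
# and the signed-in user has Dataverse access. The org name is a placeholder.
import pyodbc

ORG = "contoso"  # hypothetical environment name

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    f"Server={ORG}.crm.dynamics.com,5558;"   # TDS endpoint listens on port 5558
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    # The TDS endpoint is read-only and supports a subset of T-SQL.
    cursor.execute("SELECT TOP 10 name, accountid FROM account ORDER BY name")
    for row in cursor.fetchall():
        print(row.name, row.accountid)
```

A query like this is also a quick way to sanity-check the data before building reports or dashboards on top of it.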

With the ever-growing volume of data generated by enterprises, it is critical that organizations find ways to ingest data at scale, but ingestion is only the first step in the process. Features like AI-assisted mapping can provide meaningful productivity gains through recommendations on Dataverse table selection for new data ingestion. Time spent on schema analysis and data ingestion can be further minimized by leveraging Power Platform dataflows: a self-service, cloud-based data preparation technology designed to simplify and accelerate these processes. 

AI-assisted dataflows enable enterprises to efficiently ingest, transform, and load data into Power Platform environments with greater precision and less guesswork. For instance, Dataverse’s AI-assisted mapping suggests column mappings when importing data into Dataverse tables. By leveraging these AI-driven recommendations, makers can effectively reuse the robust data schemas already in place within business applications, eliminating the need for manual data remapping. This not only improves data quality and consistency, but also enhances overall productivity. Additionally, solution-aware dataflows support the seamless migration of apps and components across environments (such as from test to production), addressing a critical IT requirement. 

When data migration is not the preferred solution due to challenges associated with managing large volumes of data across different IT systems, Dataverse virtual tables give IT teams the ability to access data from non-Dataverse sources in real time. Virtual tables provide read/write access to enterprise data without the need for ingestion into Dataverse. This capability creates a low-code pathway for enterprises to modernize legacy applications, enabling the development of automated flows, interactive Power Pages, and AI-driven knowledge bases, while avoiding the complexities of data replication and reducing the impact of API thresholds, throttling, and per-call costs. 
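As a rough illustration, the snippet below reads rows from a virtual table through the standard Dataverse Web API, exactly as it would for a native table. The environment URL, the entity set name new_externalorders, and the column names are hypothetical, and acquiring the Microsoft Entra ID access token (for example with MSAL) is left out of the sketch.

```python
# Minimal sketch: reading a virtual table through the Dataverse Web API.
# The entity set name and columns are hypothetical; the bearer token is assumed
# to have been acquired separately (for example via MSAL).
import requests

ENV_URL = "https://contoso.crm.dynamics.com"   # hypothetical environment
ACCESS_TOKEN = "<bearer token acquired via Microsoft Entra ID>"

headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Accept": "application/json",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
}

# A virtual table is addressed like any other table; rows are fetched from the
# external source at query time rather than from Dataverse storage.
resp = requests.get(
    f"{ENV_URL}/api/data/v9.2/new_externalorders",
    headers=headers,
    params={"$select": "new_ordernumber,new_amount", "$top": "10"},
)
resp.raise_for_status()

for row in resp.json()["value"]:
    print(row["new_ordernumber"], row["new_amount"])
```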

Customers across the globe, like Chevron, are using Dataverse virtual tables with SharePoint, Azure SQL, Fabric, and Salesforce to harness the capabilities of Power Platform and Power Pages. Dataverse plans to add support for more sources, leveraging additional connectors. Learn more about Dataverse virtual tables. 

By leveraging virtual tables, enterprises can create relationships between external data and data that exists natively in Dataverse. Once these connections are established, data can be seamlessly accessed from multiple data silos across the enterprise, enabling makers to effectively utilize this data in the applications they’re building. Learn more about creating virtual table relationships.

Enrich your data using new AI Functions  

Valuable enterprise data often languishes in an unstructured format. For example, a free-text comment field may contain key feedback with the potential to dramatically improve customer satisfaction. AI functions let enterprises use prompts to summarize, translate, and extract nuggets of insight from this data, which might otherwise go untapped. To drive action, the business can even use AI to draft customer emails and documents, adding significant value and increasing productivity.  
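To make this concrete, here is a hedged sketch of calling an AI summarization function over the Dataverse Web API on the text of a comment field. The action name AISummarize and its Text/SummarizedText parameters are assumptions about the current preview contract, as are the environment URL and token handling, so confirm them against the Dataverse documentation before relying on them.

```python
# Minimal sketch: summarizing free-text content with a Dataverse AI function.
# ASSUMPTIONS: the unbound action name (AISummarize) and its request/response
# fields (Text / SummarizedText) reflect the preview contract; verify against
# the current documentation. The environment and token are placeholders.
import requests

ENV_URL = "https://contoso.crm.dynamics.com"   # hypothetical environment
ACCESS_TOKEN = "<bearer token acquired via Microsoft Entra ID>"

headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Content-Type": "application/json",
    "Accept": "application/json",
}

comment = (
    "The new portal is much faster, but the invoice export still fails for "
    "orders placed before March; finance has flagged this twice."
)

resp = requests.post(
    f"{ENV_URL}/api/data/v9.2/AISummarize",
    headers=headers,
    json={"Text": comment},
)
resp.raise_for_status()
print(resp.json().get("SummarizedText"))
```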

While AI Functions provide powerful out-of-the-box capabilities, enterprises can take AI integrations even further by grounding custom AI prompts with Dataverse data. 

This allows the enterprise to: 

  • Link AI prompts directly to business data in Dataverse tables; 
  • Provide context-specific responses based on the organization’s information; and 
  • Improve accuracy by referencing up-to-date data from the internal environment. 

For example, an enterprise could create a custom prompt to extract key details from customer proposals, grounded in its Proposals table in Dataverse. This combines the power of large language models with proprietary business data. 

Support your business needs for large volume operations

In today’s world, increasingly driven by data and AI, Dataverse Elastic Tables, backed by Azure Cosmos DB, are a powerful option with practically unlimited storage. Makers can bulk-load large data volumes at high throughput and enable applications to scale up to 120 million writes per hour and 6,000 reads per second, while storing 3 billion records in a single table, all with low code. Elastic Tables also support business scenarios that require flexible schemas with JSON payloads, and they are already used extensively by Microsoft Dynamics 365 applications. They allow enterprises to optimize data capacity utilization with an auto-delete capability based on time-to-live (TTL) functionality, and, as always, data is protected by Dataverse security. Learn more about Dataverse Elastic Tables. 

With rapid business growth inevitably leading to large data volumes in existing standard Dataverse tables (which support up to 100 TB), enterprises need the ability to hyperscale write operations. Dataverse bulk operation APIs are designed for enterprise makers to support these high-throughput write scenarios. Bulk operation APIs like CreateMultiple, UpdateMultiple, and UpsertMultiple can provide throughput improvements of up to 5x, growing from 2 million records created per hour using ExecuteMultiple to the creation of 10 million records in less than an hour. Customers have saved up to 82% of the time spent in end-to-end scenarios using CreateMultiple, UpdateMultiple, and UpsertMultiple on Dataverse SQL tables. 
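To show the shape of these calls, here is a minimal sketch of a CreateMultiple request against the Web API. The table new_sensorreading and its columns are hypothetical and token acquisition is omitted; the Targets payload with an @odata.type annotation per record follows the documented bulk-operation pattern.

```python
# Minimal sketch: bulk-creating records with the CreateMultiple bound action.
# The table (new_sensorreading) and its columns are hypothetical; the token is
# assumed to be acquired separately. Each record in Targets carries an
# @odata.type annotation naming the table's type.
import requests

ENV_URL = "https://contoso.crm.dynamics.com"   # hypothetical environment
ACCESS_TOKEN = "<bearer token acquired via Microsoft Entra ID>"

headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Content-Type": "application/json",
    "Accept": "application/json",
}

# One request carries many records, which is where the throughput gain over
# record-by-record creation comes from.
targets = [
    {
        "@odata.type": "Microsoft.Dynamics.CRM.new_sensorreading",
        "new_deviceid": f"device-{i:04d}",
        "new_temperature": 20.0 + (i % 10),
    }
    for i in range(1000)
]

resp = requests.post(
    f"{ENV_URL}/api/data/v9.2/new_sensorreadings/Microsoft.Dynamics.CRM.CreateMultiple",
    headers=headers,
    json={"Targets": targets},
)
resp.raise_for_status()
print(f"Created {len(resp.json()['Ids'])} records")
```

Batch sizes and error-handling strategies vary by workload, so treat the single request above as the smallest useful unit rather than a tuned pattern.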

Drive action on your insights with Fabric from Dataverse 

Dataverse makes hyperscale data accessible and understandable, identifying insights and improving business outcomes. With Microsoft Dataverse Link to Fabric, enterprise data stays in Dataverse, without data copies, while authorized users work with it in Fabric and Power BI to unlock new insights. Using Fabric tools such as SQL, Spark, and dataflows, enterprises can combine, transform, and aggregate additional enterprise data with their Dataverse data, enabling near-real-time insights. 

Enterprises often need to leverage data across different lines of business (LOB) beyond Dataverse applications. Fabric mirroring makes it easy to move this enterprise data into Fabric, where Fabric tools generate insights by combining the enterprise LOB data with the Dataverse data already linked to Fabric. 
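For orientation, the fragment below sketches how linked Dataverse tables and mirrored LOB data might be combined from a Microsoft Fabric notebook. The item and table names (contoso_dataverse, erp_mirror, legacy_invoice) are hypothetical, and the snippet assumes the Spark session that Fabric notebooks provide, so treat it as an illustration rather than a prescribed pattern.

```python
# Minimal sketch: combining linked Dataverse data with mirrored LOB data in a
# Fabric notebook. Item and table names are hypothetical; in a Fabric notebook
# a Spark session already exists, and getOrCreate() simply reuses it.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Dataverse tables surfaced through Link to Fabric (hypothetical item name).
accounts = spark.sql("SELECT accountid, name FROM contoso_dataverse.account")

# A line-of-business table brought into Fabric, for example via mirroring
# (hypothetical item and table names).
invoices = spark.sql("SELECT accountid, totalamount FROM erp_mirror.legacy_invoice")

summary = (
    invoices.groupBy("accountid")
    .agg(F.sum("totalamount").alias("lifetime_invoiced"))
    .join(accounts, on="accountid", how="inner")
    .orderBy(F.desc("lifetime_invoiced"))
)

summary.show(10, truncate=False)
```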

When a business scenario requires LOB or other IT data (such as Azure SQL Database, Azure Cosmos DB, and Snowflake) to be used from within a Dataverse application to drive business outcomes, low-code makers can create Dataverse virtual tables over the data sitting in Fabric and build Power Apps within the Dataverse application. 

“To quickly answer questions, a client’s Accounts Receivable (AR) team will often desire historical sales invoice details in their Dynamics 365 Financials application, but this data could be in multiple systems and oftentimes is cost prohibitive to migrate into a central ERP.  Now with Fabric-based virtual tables, we will implement a Dynamics 365-embedded Power App sourced from legacy sales data housed in a Fabric data warehouse. We love it…same data, same answer, and delivered in less time!” 

Travis Christens, Director of Business Analytics and Azure, Armanino

Data capacity management 

With rapid enterprise digitization and business expansion driving exponential data growth, further accentuated by the prevalence of AI, enterprises need to ensure continued performance of live applications and optimize storage capacity consumption. This must be achieved while adhering to compliance and regulatory requirements and reducing the risks associated with historical data. Dataverse tools such as Power Platform capacity reports help organizations better manage application data by providing visibility into the storage capacity consumed by business applications. IT admins can further reduce unnecessary storage consumption with regularly scheduled bulk delete jobs.  

In many business scenarios (such as customer service case management, finance ledgers, and supply chain inventory), even as the data lifecycle of the business application moves from active to inactive over a defined period, inactive data must be retained for at least seven years for legal and regulatory compliance. While Dataverse places no set limit on the active data needed to support an enterprise’s unlimited business growth, database capacity consumption can be reduced by moving historical inactive data into Dataverse long-term retention, available for Dataverse and Dynamics 365 Finance and Operations applications. 

Dataverse enables an enterprise-grade security model for modern access requirements  

In today’s business environment, teams need to share and collaborate on data spread across multiple resources. Historically, enterprises have been forced to choose between fostering efficient, productive work and ensuring the security, governance, and regulatory compliance of their data and IT systems.  

Dataverse enables a more efficient business model by allowing data access for multiple users across different teams while providing enterprise-grade security backed by Microsoft Entra ID. Enterprises with matrix-based organizational structures can now enable users to own records across different business units, granting each user access control and deletion privileges for the specific records they own – thereby improving productivity. For example, an enterprise’s customer service associates can be granted access to customer accounts and emails by the sales team without violating access controls. 

Security is a huge concern for every organization, and Dataverse offers capabilities such as customer-managed keys, customer lockbox, environment security groups, Dataverse auditing, and Azure Virtual Network support to ensure enterprises can be confident that their data is secure both at rest and in transit. The Power Platform security hub allows administrators to assess the security posture of the tenant, identify and act on recommendations, and use its rich set of high-value tools to gain visibility, detect threats, and proactively set policies that safeguard against vulnerabilities and risks. 

Next Steps 

In summary, Microsoft Dataverse provides a powerful and scalable solution for enterprises seeking to manage their data efficiently and securely, empowering organizations to overcome the challenges of data silos, enhance productivity, and ensure compliance with security standards. Features such as AI-assisted mapping, virtual tables, and Elastic Tables enable businesses to ingest, transform, and activate data seamlessly, supporting large-scale operations without the need for extensive data migration. Moreover, Dataverse’s integration with Microsoft Fabric and its security features backed by Microsoft Entra ID ensure that enterprises can manage their data lifecycle effectively from ingestion to long-term retention, while maintaining robust security and governance. This comprehensive approach allows enterprises to unlock the full potential of their data, driving actionable insights and improving business outcomes. 

As the volume of enterprise data continues to grow, adopting a solution like Microsoft Dataverse becomes increasingly critical. It not only addresses the immediate needs of data management, but also positions businesses for future growth and innovation. Enterprises are encouraged to explore the capabilities of Dataverse to enhance their data strategies and achieve greater efficiency and security in their operations. To fully unlock the potential of your data and take your organization’s data strategy to the next level, consider diving deeper into the specific capabilities that Dataverse provides. 

AI-Assisted Mapping: Streamline your data ingestion processes and improve data consistency by leveraging AI-driven recommendations for seamless integration. 

Modernized Business Units: Empower your teams to collaborate effectively across different business units with enhanced data ownership and access control features. 

AI Functions: Harness the power of AI to derive actionable insights from your data, transforming it into a strategic asset. 

Elastic Tables: Support large-scale operations with the flexibility and scalability that Elastic Tables provides, ensuring your data infrastructure grows with your business needs. 

Fabric Link: Integrate Dataverse with Microsoft Fabric to enable seamless data flows across your organization, driving efficient data activation and business outcomes. 

Long-Term Data Retention: Implement robust data lifecycle management strategies with Dataverse’s long-term retention capabilities, ensuring compliance and governance over time. 

To learn more about managing security and governance within the Power Platform, visit the Power Platform Security Hub on Microsoft Learn.

By taking these next steps, you’ll be well on your way to optimizing your data management strategy, enhancing productivity, and ensuring the security and compliance of your enterprise’s data. Explore these features today and see how Microsoft Dataverse can help you achieve your business objectives. 

Chevron Leads the Way in Citizen Development by Extending Data Lake Accessibility with Microsoft Dataverse
http://approjects.co.za/?big=en-us/power-platform/blog/power-apps/chevron-leads-the-way-in-citizen-development-by-extending-data-lake-accessibility-with-microsoft-dataverse/
Fri, 02 Jun 2023 13:34:42 +0000

Chevron has been at the forefront of the energy industry for more than 140 years, and while there are many factors contributing to this success, it’s clear the ingenuity and creativity of its people continuously pave the way for innovation.

The global company has always empowered its workforce to solve business challenges. A solid example is how Chevron was an early adopter of the Microsoft Power Platform, which plays a crucial role in advancing how anyone at the company can make significant business impacts through technology—regardless of their discipline or function. Enrolling and cultivating a global force of citizen developers has required new approaches to generate, test, and replicate innovative ideas faster, at lower cost, and with less risk.

Recently, Chevron partnered with Microsoft to use low-code solutions to remove data access barriers. Easier access to data is just one of these successes, and the results are impressive: increased productivity across the organization.

Chevron has been a leader in embracing citizen development, teaming with Microsoft to advance low-code capabilities, governance, and training on a global enterprise scale. Their collaboration has been critical to the growth of the Power Platform, ensuring people with varying technical skill levels can build and maintain solutions that make important business impacts.

Removing data access barriers  

For any organization, easy and secure access to data is one of the greatest obstacles to building sustainable business applications—whether low code or not. The solution is to store the data as Parquet files in a secure data lake. Otherwise, applications end up relying on duplicative data repositories with varying levels of reliability and access risks that have traditionally been a headache for even the most seasoned professional developers. Because Chevron is well versed in data lake development, the company knew this was the approach it needed to take to better enable its rapidly growing citizen developer community.

It was not a simple undertaking. The enterprise scale of this challenge meant it didn’t need to be solved just once; the team needed to establish a repeatable pattern so that anyone could follow the same steps to access the data lake. Fidelity with respect to access rules also had to be maintained from user back to source. The team created a proof of concept based on Power Platform telemetry data, which is also being used to better understand and improve use of the platform.

Traditionally, citizen developers, or “makers,” had to spend days on a multi-step process of identifying the data folder or parquet file, requesting appropriate access, configuring a Power Apps connection, and ensuring proper permissions were in place and effective. Now, using Synapse Analytics with the virtual table connector in Dataverse, Chevron eliminated the “heavy lifting” steps, allowing makers to spend less time on data access and more time building innovative solutions.

[Screenshot: Create virtual tables from external data sources with Microsoft Dataverse]

Creating Dataverse virtual tables using Power Platform telemetry data stored in the data lake:

  • Enables direct and easy consumption of external data while abstracting away its complexity;
  • Shortens the time to develop a working application;
  • Prevents duplication or replication of external data; and
  • Increases adoption of virtual tables.

The resulting business value associated with the implementation of virtual tables and Synapse Analytics includes:

  • Simplified and secure access to data lake parquet files
  • Shortened time from ideation to design and ultimately a working application
  • Faster feedback to produce iterations that lead to a final, produced application

“We want to make it easy to do the right thing when it comes to data architecture and reducing data duplication, and Microsoft Dataverse helps make it possible. Not only does it eliminate complexity associated with virtual tables, it helps modernize data that needs a home by liberating and protecting stranded data traditionally stored in Excel files.”

— Tavia Prouhet, Product Owner, Low Code and Automation Catalysts, Chevron

The sustainable future of low code at Chevron

Chevron’s workforce relies on hundreds of business applications to perform their day-to-day roles as they solve some of the world’s most difficult energy challenges. Throughout the past decade, the company has blurred the lines between traditional IT and business, teaming to embrace cloud and SaaS, reduce application complexity, enhance agility, and scale solutions. The results are clear. And as these efforts advanced, the question became how to handle the unique needs within the organization—those that are not mission critical or connected to broader workflows already supported by the company’s impressive solution portfolio. The answer: low code, an approach that lets the people closest to the business problem develop a solution that follows company technical standards, regardless of their development skills.

Low code platforms provide an intuitive approach to software development with minimal or no coding to build applications and processes that help both the business and IT. Enterprise licensing enables every individual in the organization to have full capabilities and contribute to the continuous digital transformation.

Key to its low-code success, Chevron enrolled and trained a community of developers who create solutions for complex problems and lean on each other for support. The company cultivated a sense of community across low-code users, including more than 2,800 innovators. The apps empower employees to increase productivity, drive efficiency, and enhance reliability. Chevron also maintains a robust upskilling program for employees that ensures sustainable progress. As of mid-2023, over a hundred teams across the globe have participated in these programs, with over 500 participants supported by several hundred mentors and previous program graduates. Internally, the citizen developer movement is referred to as RADD, which stands for Rapidly Accelerating Democratized Development. Living up to that name requires an appropriate method to democratize data, a challenge Chevron willingly accepted, as exemplified by its workflow for accessing the data lake via virtual tables.

Chevron is known as “The Human Energy Company,” and low code and democratized development are powerful enablers in helping its workforce advance the future of energy.

Learn more about Microsoft Power Platform.

Export and import your apps across environments with packaging
http://approjects.co.za/?big=en-us/power-platform/blog/power-apps/powerapps-packaging/
Thu, 27 Jul 2017 18:30:46 +0000

PowerApps announces packaging, an easier way to move your apps between your UAT and PROD environments.

We’re very happy to announce that you no longer have to manually move your apps by locally saving .msapp files from your DEV/UAT environment and re-saving the apps in your production environment from PowerApps Studio.

The preview of packaging is now available on web.powerapps.com.  With packaging, you will be able to export an app as a package and import it into another environment.

NOTE: Once the preview is over, you will be required to have a PowerApps Plan 2 trial or PowerApps Plan 2 paid license in order to access the packaging feature.

Exporting an app

You can export an app via the following steps:

[Screenshot: Apps list]

  • Select Export (preview) for the app you want to export

[Screenshot: Export (preview) option]

  • Enter a Name and Description for the package

[Screenshot: Name and Description fields]

  • Within the ‘Review Package Content’ section, you can optionally add comments or notes, or change how each individual resource will be imported into the target environment during package import

[Screenshot: Review Package Content section]

  • When you are done, select Export; the package file will begin downloading within a few seconds

Importing an app package

You can import an app package via the following steps:

[Screenshot: Apps list]

  • Select Import package (preview)

[Screenshot: Import package (preview) option]

  • Select Upload and select the app package file that you want to import

[Screenshot: Upload package file]

  • Once the package has been uploaded, review the package contents and provide additional input for any item marked with a red icon by selecting the wrench icon for that item and entering the required information.

[Screenshot: Items requiring additional input]

  • Once you have provided all of the required information, select Import

[Screenshot: Import button]

  • When the import completes, you will be automatically redirected to a page (similar to the one below) that outlines whether or not the import operation was successful

NOTE: If you are importing an app and chose to Update an existing app, the new changes will be saved as a draft of the application. You will need to publish those changes in order for them to be available to all other users of the application.

[Screenshot: Import results page]

Which resources can be packaged?

When you export an app, the dependent resources for your app will also get exported into the package. Initially, only a subset of all possible resource types is supported, as outlined below.

App (supported). There are two options to import an app into an environment:

  1. Create new – the app will be created as a new app in the environment where the package is imported.
  2. Update – the app already exists in the environment and will be updated when this package is imported.

Flow (supported). There are two options to import a flow into an environment:

  1. Create new – the flow will be created as a new flow in the environment where the package is imported.
  2. Update – the flow already exists in the environment and will be updated when this package is imported.

NOTE: All resources that the flow depends on will also be included when the app package is exported and will need to be configured when it is imported.

CDS Entity Customizations and Picklists (supported). There are two options to import CDS entities or picklists into an environment:

  1. Overwrite – if there’s a resource with the same name, this import will replace it. If there isn’t a matching resource, a new resource will be created.
  2. Merge – if there’s an entity or picklist with the same name, new fields or entries will be added, but missing fields or entries won’t be removed.

Custom Connectors (not supported). If an app depends on a custom connector, we do not currently support exporting the connector as part of the package. If you have an app that relies on a custom connector, your only current option is to manually re-create or update the connector in your target environment and select that connector when you import the package.

Connections (not supported). If an app depends on a connection (such as a SQL connection with credentials), we do not currently support exporting the connection or credentials as part of the package. If you have an app that relies on a shared connection (like SQL), your only current option is to manually re-create that connection with the appropriate credentials in your target environment and select that connection when you import the package.

CDS Custom Roles and Permission Sets (not supported). Exporting custom CDS roles and/or permission sets is not currently supported.

Known limitations

  • Importing app packages that contain more than ~3 resources has been reported to take several minutes to complete. Status: we will be rolling out a fix for this within the next two weeks.
  • Ability to export/import custom connectors. Status: this work is on the backlog, and we are working to deliver it within the next 6 months.
  • Ability to re-configure the data sources for an app during import (for example, switching from one SharePoint list or SQL database to another). Status: this work is on the backlog, and we are working to deliver it within the next 6 months.
  • Ability to export/import CDS Custom Roles and Permission Sets. Status: this work is on the backlog, and we are working to deliver it within the next 6 months.
  • Ability to export/import CDS data (i.e., sample data rows). Status: this work is on the backlog, and we are working to deliver it within the next 12 months.
