European Fabric Community Conference 2024: Building an AI-powered data platform http://approjects.co.za/?big=en-us/microsoft-fabric/blog/2024/09/25/european-fabric-community-conference-2024-building-an-ai-powered-data-platform/ Wed, 25 Sep 2024 07:00:00 +0000 Get a firsthand look at the latest capabilities we are bringing to the Microsoft Fabric platform.

The post European Fabric Community Conference 2024: Building an AI-powered data platform appeared first on Microsoft AI Blogs.



Thank you to everyone joining us at the first annual European Microsoft Fabric Community Conference this week in Stockholm, Sweden! Besides seeing the beautiful views of Old Town, attendees are getting an immersive analytics and AI experience across 120 sessions, 3 keynotes, 10 workshops, an expo hall, community lounge, and so much more. They are seeing firsthand the latest capabilities we are bringing to the Fabric platform. For those unable to attend, this blog will highlight the most significant announcements that are already changing the way our customers interact with Fabric. 


Over 14,000 customers have invested in the promise of Microsoft Fabric to accelerate their analytics, including industry leaders like KPMG, Chanel, and Grupo Casas Bahia. For example, Chalhoub Group, a regional luxury retailer with over 750 experiential retail stores, used Microsoft Fabric to modernize its analytics and streamline its data sources into one platform, significantly speeding up its processes.

“It’s about what the technology enables us to achieve—a smarter, faster, and more connected operational environment.”

—Mark Hourany, Director of People Analytics, Chalhoub Group

Check out the myriad ways customers are using Microsoft Fabric to unlock more value from their data:

New capabilities coming to Microsoft Fabric

Since launching Fabric, we’ve released thousands of product updates to create a more complete data platform for our customers. And we aren’t slowing down anytime soon. We’re thrilled to share a new slate of announcements that are applying the power of AI to help you accelerate your data projects and get more done.

Specifically, these updates are focused on making sure Fabric can provide you with: 

  1. AI-powered development: Fabric can give teams the AI-powered tools needed for any data project in a pre-integrated and optimized SaaS environment.
  2. An AI-powered data estate: Fabric can help you access your entire multi-cloud data estate from a single, open data lake, work from the same copy of data across analytics engines, and use that data to power AI innovation.
  3. AI-powered insights: Fabric can empower everyone to better understand their data with AI-powered visuals and Q&A experiences embedded in the Microsoft 365 apps they use every day. 

Let’s look at the latest features and integrations we are announcing in each of these areas. 

AI-powered development

With Microsoft Fabric, you have a single platform that can handle all of your data projects with role-specific tools for data integration, data warehousing, data engineering, data science, real-time intelligence, and business intelligence. All of your data teams can work together in the same pre-integrated, optimized experience, and get started immediately with an intuitive UI and low-code tools. All the workloads access the same unified data lake, OneLake, and work from a single pool of capacity to simplify the experience and ease collaboration. With built-in security and governance, you can secure your data from intrusion and ensure only the right people have access to the right data. And as we continue to infuse Copilot and other AI experiences across Fabric, you can not only use Fabric for any application, but also accelerate time to production. In the video below, check out how users can take advantage of Copilot to create end-to-end solutions in Fabric: 

Today, I’m thrilled to share several new enhancements and capabilities coming to the platform and each workload in Fabric.

Fabric platform

We’re building platform-wide capabilities to help you more seamlessly manage DevOps and tackle projects of any scale and complexity. First, we’re updating the UI for deployment pipelines, now in preview, to be more focused, easier to navigate, and smoother to work through. Next, we’re introducing the Terraform provider for Fabric, in preview, to help customers ensure deployments and management tasks are executed accurately and consistently. The Terraform provider enables users to automate and streamline deployment and management processes using a declarative configuration language. We are also adding support for Azure service principals in Microsoft Fabric REST APIs to help customers automate the deployment and management of Fabric environments. You can manage principal permissions for Fabric workspaces, as well as the creation and management of Fabric artifacts like eventhouses and lakehouses.
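To make the declarative approach concrete, here is a minimal sketch of what a Terraform configuration for a Fabric workspace could look like. Treat the resource and attribute names as assumptions based on the provider's documented conventions; check the released provider schema before using this.

```terraform
terraform {
  required_providers {
    fabric = {
      # Provider source shown as documented at announcement time; verify before use.
      source = "microsoft/fabric"
    }
  }
}

# Hypothetical example: declares a Fabric workspace so that deployments are
# repeatable and reviewable, rather than clicked together in the portal.
resource "fabric_workspace" "analytics" {
  display_name = "sales-analytics"
  description  = "Workspace managed declaratively via Terraform"
}
```

Running `terraform plan` against such a configuration would show exactly what will change before anything is deployed, which is the consistency benefit the provider is aimed at.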

We’re excited to announce the general availability of Fabric Git integration. Sync Fabric workspaces with Git repositories, leverage version control, and collaborate seamlessly using Azure DevOps or GitHub. We are also extending our integration with Visual Studio Code (VS Code). You can now debug Fabric notebooks with the web version of VS Code and integrate Fabric environments as artifacts with the Synapse VS Code extension—allowing you to explore and manage Fabric environments from within VS Code. To learn more about these updates, read the Fabric September 2024 Update blog.

Security and governance

To help organizations govern the massive volumes of data across their data estate, we’re adding more granular data management capabilities, including item tagging and enhancements to domains—both now in preview. We’re introducing the ability to apply tags to Fabric items, helping users more easily find and use the right data. Once applied, data consumers can view, search, and filter by the applied tags across various experiences. We’re also enhancing domains and subdomains with more controls for admins, including the ability to define a default sensitivity label, domain-level export and sharing settings, and insights for admins on tenant domains. Finally, for data owners, we’re adding the ability to search for data by domain, to filter workspaces by domain, and to view domain details in a data item’s location.

Over the past year, we’ve launched a myriad of security features designed to secure your data at every step of the analytics journey. Two of our network security features, trusted workspace access and managed private endpoints, were previously only available in F64 or higher capacities. We’re excited to share that, based on your feedback, we are making these features available in all Fabric capacities. We’re also making managed private endpoints available in trial capacities as part of this release.

We’re also announcing deeper integration with Microsoft Purview, Microsoft’s unified data security, data governance, and compliance solution. Coming soon, security admins will be able to use Microsoft Purview Information Protection sensitivity labels to manage who has access to Fabric items with certain labels—similar to Microsoft 365. Also coming soon, we are extending support for Microsoft Purview Data Loss Prevention (DLP) policies, so security admins can apply DLP policies to detect the upload of sensitive data, like social security numbers, to a lakehouse in Fabric. If detected, the policy will trigger an automatic audit activity, can alert the security admin, and can even show a custom policy tip to data owners so they can remediate the issue themselves. These capabilities will be available at no additional cost during preview in the near term, but will be part of a new Purview pay-as-you-go consumption model, with pricing details to follow in the future. Learn more about how to secure your Fabric data with Microsoft Purview by watching the following video: 
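To illustrate the kind of detection a DLP policy performs, here is a toy sketch of scanning text for U.S. Social Security number patterns. This is a conceptual illustration only; Purview's real classifiers are far more sophisticated (checksums, context, confidence levels) than a single regular expression.

```python
import re

# Simplified pattern for SSN-like values (###-##-####). Illustrative only;
# production DLP classifiers combine patterns with contextual evidence.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def contains_sensitive_data(text: str) -> bool:
    """Return True if the text appears to contain an SSN-like value."""
    return bool(SSN_PATTERN.search(text))

print(contains_sensitive_data("customer_id,ssn\n1001,123-45-6789"))  # True
print(contains_sensitive_data("order number 123456789"))             # False
```

In the Fabric scenario described above, a match like this is what would trigger the audit activity, the admin alert, and the policy tip shown to the data owner.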

You can also complement and extend the built-in governance in Fabric by seamlessly connecting your Fabric data to the newly reimagined Purview Data Governance solution—now generally available. This new solution delivers an AI-powered, business-friendly, and unified solution that can seamlessly connect to data sources within Fabric and across your data estate to streamline and accelerate the activation of your modern data governance practice. Purview integrations enable Fabric customers to discover, secure, govern, and manage Fabric items from a single pane of glass within Purview for an end-to-end approach to their data estate. Learn more about these Microsoft Purview innovations.  

Workload enhancements and updates

We’re also making significant updates across the six core workloads in Fabric: Data Factory, Data Engineering, Data Warehouse, Data Science, Real-Time Intelligence, and Microsoft Power BI.

Data Factory

In the Data Factory workload, built to help you solve some of the most complex data integration scenarios, we are simplifying the data ingestion experience with copy job, transforming the dataflow capability, and releasing enhancements for data pipelines. With copy job, now in preview, you can ingest data at petabyte scale without creating a dataflow or data pipeline. Copy job supports full, batch, and incremental copy from any data source to any data destination. Next, we are releasing the Copilot in Fabric experience for Dataflows Gen2 into general availability—empowering everyone to design dataflows with the help of an AI-powered expert. We’re also releasing Fast Copy in Dataflows Gen2 into general availability, enabling you to ingest large amounts of data using the same high-performance backend for data movement used in Data Factory (e.g., the “copy” activity in data pipelines, or copy job). Lastly, for Dataflows Gen2, we are introducing incremental refresh into preview, allowing you to limit refreshes to just new or updated data to reduce refresh times.
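The idea behind incremental refresh can be sketched with a simple watermark filter: only rows modified since the previous refresh are reprocessed. This is a toy illustration of the concept, not the Dataflows Gen2 implementation, and the field names are made up for the example.

```python
from datetime import datetime

def incremental_refresh(rows: list, last_refresh: datetime) -> list:
    """Return only the rows modified after the previous refresh (the watermark)."""
    return [row for row in rows if row["modified"] > last_refresh]

rows = [
    {"id": 1, "modified": datetime(2024, 9, 1)},   # already ingested last time
    {"id": 2, "modified": datetime(2024, 9, 20)},  # new since the watermark
]
watermark = datetime(2024, 9, 10)  # timestamp of the previous successful refresh

print([row["id"] for row in incremental_refresh(rows, watermark)])  # [2]
```

Because only the post-watermark slice is refreshed, refresh time scales with the volume of new data rather than the full table size, which is where the reduction in refresh times comes from.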

Along with the dataflow announcements, we’re announcing an array of enhancements for data pipelines in Fabric, including the general availability of the on-premises data gateway integration, the preview of Fabric user data functions in data pipelines, the preview of invoke remote pipeline to call Azure Data Factory (ADF) and Synapse pipelines from Fabric, and a new session tag parameter for Fabric Spark notebook activity to enable high-concurrency Notebook runs. Additionally, we’ve made it easier to bring ADF pipelines into Fabric by linking your existing pipelines to your Fabric workspace. You’ll be able to fully manage your ADF factories directly from the Fabric workspace UI and convert your ADF pipelines into native Fabric pipelines with an open-source GitHub project. 

Data Engineering

For the Data Engineering workload, we’re updating the native execution engine for Fabric Spark and releasing the upgraded Fabric Runtime 1.3 into general availability. The native execution engine enhances Spark job performance by running queries directly on lakehouse infrastructure, achieving up to four times faster performance compared to traditional Spark based on the TPC-DS 1TB benchmark. The native execution engine can now, in preview, support Fabric Runtime 1.3, which together can further enhance the performance of Spark jobs and queries for both data engineering and data science projects. This engine has been completely rewritten to offer superior query performance across data processing, extract, transform, load (ETL), data science, and interactive queries. We are also excited to announce a new acceleration tab in the UI for enabling the native execution engine.

Additionally, we are announcing an extension of support in Spark to mirrored databases, providing a consistent and convenient way to access and explore databases seamlessly with the Spark engine. You can easily add data sources, explore data, perform transformations, and join your data with other lakehouses and mirrored databases. Finally, we are excited to launch T-SQL notebooks into public preview. The T-SQL notebook enables SQL developers to author and run T-SQL code against a connected Fabric data warehouse or SQL analytics endpoint, allowing them to execute complex T-SQL queries, visualize results in real time, and document the analytical process within a single, cohesive interface. 

Data Warehouse

We are excited to announce the Copilot in Fabric experience for Data Warehouse is now in preview. This AI assistant experience can help developers generate T-SQL queries for data analysis, explain and add in-line code comments for existing T-SQL queries, fix broken T-SQL code, and answer questions about general data warehousing tasks and operations. Learn more about the Copilot experience for Data Warehouse here. And as mentioned above, we are announcing T-SQL notebooks—allowing you to create a notebook item directly from the data warehouse editor in Fabric and use the rich capabilities of notebooks to run T-SQL queries.

Real-Time Intelligence

In May 2024, we launched a new workload called Real-Time Intelligence that combined Synapse Real-Time Analytics and Data Activator with a range of additional new features, currently in preview, to help organizations make better decisions with up-to-the-minute insights. We are excited to share new capabilities, all in preview, to help you better ingest, analyze, and visualize your real-time data.

First, we’re announcing the launch of the new Real-Time hub user experience: a redesigned and enhanced experience with a new left navigation, a new page called “My Streams” to create and access custom streams, and four new eventstream connectors: Azure SQL Managed Instance – change data capture (MI CDC), SQL Server on Virtual Machine – change data capture (VM CDC), Apache Kafka, and Amazon MSK Kafka. These new sources empower you to build richer, more dynamic eventstreams in Fabric. We’re also enhancing eventstream capabilities by supporting eventhouse as a new destination for your data streams. Eventhouses, equipped with KQL databases, are designed to analyze large volumes of data, particularly in scenarios that demand real-time insight and exploration.


We’re also pleased to announce an upgrade to the Copilot in Fabric experience in Real-Time Intelligence, which translates natural language into KQL, helping you better understand and explore your data stored in Eventhouse. Now, the assistant supports a conversational mode, allowing you to ask follow-up questions that build on previous queries within the chat. With the addition of multi-variate anomaly detection, it’s even easier to discover the unknowns in your high-volume, high-granularity data. You can also have Copilot create a real-time dashboard instantly based on the data in your table, providing immediate insights you can share in your organization.

Finally, we are upgrading the Data Activator experience to make it easier to define a variety of rules that act in response to changes in your data over time, and the richness of our rules has improved to include more complex time-window calculations and responding to every event in a stream. You can set up alerts from all your streaming data, Power BI visuals, and real-time dashboards, and now even set up alerts directly on your KQL queries. With these new enhancements, you can make sure action is taken the moment something important happens.

Learn more about all of these workload enhancements in the Fabric September 2024 Update blog.

Power BI

We’re thrilled to announce new capabilities across Power BI that will make it easier to track and use the KPIs that matter most to you, create organizational apps, and work with Direct Lake semantic models. 

First, we are announcing the preview of Metric sets, which will allow users to promote consistent and reliable metrics across large organizations in Fabric, making it easier for end users to discover and use standardized metrics from corporate models. With Metric sets, trusted creators within an organization can develop standardized metrics that incorporate essential business logic from Power BI. These creators can organize the metrics into collections, promote and certify them, and make them easily discoverable for end users and other creators. These endorsed and promoted metrics can then be used to build Power BI reports, improving data quality across the organization, and can also be reused in other Fabric solutions, such as notebooks.


We’re improving organizational apps in Power BI, a key tool for packaging and securely distributing Power BI reports to your organization. Now in preview, you can create multiple organizational apps in each workspace, and they can contain other Fabric items like notebooks and real-time dashboards. The app interface can even be customized, giving you more control over the color, navigation style, and landing experience.

We’re also making it easier to work with Direct Lake semantic models with new version history for semantic models, similar to the experience found across the Microsoft 365 apps. Power BI users can also now live edit Direct Lake semantic models right from Power BI Desktop. And we’re excited to announce a capability widely asked for by Power BI users: a dark mode in Power BI Desktop. 

Finally, we’re announcing the general availability of OneLake integration for semantic models in Import mode. OneLake integration automatically writes data imported into your semantic models to Delta Lake tables in OneLake so that you can enjoy the benefits of Fabric without any migration effort. Once added to a lakehouse in OneLake, you can use T-SQL, Python, Scala, PySpark, Spark SQL, or R on these Delta tables to consume this data and add business value. All of this value comes at no additional cost as data stored in OneLake for Power BI import semantic models is included in the price of your Power BI licensing.

Learn more about the Power BI announcements in the Power BI September 2024 Feature blog. Also see the AI-powered insights section below for new Copilot experiences for Power BI creators and consumers.

AI-powered data estate

With OneLake, Fabric’s unified data lake, you can create a truly AI-powered data estate to fuel your AI innovation and data culture. OneLake’s shortcuts and mirroring capabilities enable you to access your entire multi-cloud data estate from a single, intuitively organized data lake. With your data in OneLake, you can then work from a single copy across analytics engines, whether you are using Spark, T-SQL, KQL, or Analysis Services and even access that data from other apps like Microsoft Excel or Teams. Today, we are thrilled to share even more capabilities and enhancements coming to OneLake that can help you better connect to and manage your data estate.

One of the biggest benefits of OneLake is the ability to create shortcuts to your data sources, which virtualizes data in OneLake without moving or duplicating it. We are pleased to announce that shortcuts for Google Cloud Storage (GCS) and S3-compatible sources are now generally available. These shortcuts also support the on-premises data gateway, which you can use to connect to your on-premises S3-compatible sources as well as GCS buckets that are protected by a virtual private cloud. We’ve also made enhancements to the REST APIs for OneLake shortcuts, including adding support for all current shortcut types and introducing a new list operation. With these improvements, you can programmatically create and manage your OneLake shortcuts.
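As a sketch of what programmatic shortcut management looks like, the snippet below builds the request body for creating an S3-compatible shortcut. The payload shape follows the documented pattern for the shortcuts REST API, but treat the exact field names and the placeholder IDs as assumptions and confirm them against the API reference before use.

```python
import json

def build_shortcut_payload(name: str, bucket_url: str, connection_id: str) -> str:
    """Build an illustrative request body for the OneLake shortcuts REST API.

    The resulting JSON would be POSTed to the shortcuts endpoint of a
    lakehouse item; endpoint URL and auth are omitted here.
    """
    payload = {
        "name": name,
        "path": "Files",  # where the shortcut appears inside the lakehouse
        "target": {
            "s3Compatible": {
                "location": bucket_url,
                "subpath": "/data",
                "connectionId": connection_id,  # placeholder GUID of a saved connection
            }
        },
    }
    return json.dumps(payload, indent=2)

print(build_shortcut_payload(
    "sales-bucket",
    "https://s3.example.com",
    "00000000-0000-0000-0000-000000000000",
))
```

Because the shortcut only points at the bucket, creating it moves no data; subsequent reads through Fabric engines are served from the virtualized location.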

We’re also excited to announce further integration with Azure Databricks with the ability to access Databricks Unity Catalog tables directly from OneLake—now in preview. Users can just provide the Azure Databricks workspace URL and select the catalog, and Fabric creates a shortcut for every table in the selected catalog, keeping the data in sync in near real-time. Once your Azure Databricks Catalog item is created, it behaves the same as any other item in Fabric, so you can access the table through SQL endpoints, notebooks, or Direct Lake mode for Power BI reports. Learn more about the OneLake shortcut and Azure Databricks announcements in the Fabric September 2024 Updates blog.

At Microsoft Build last May, we announced an expanded partnership with Snowflake that gives our customers the flexibility to easily connect and work across our tools. Today, I’m excited to share progress on this partnership with the upcoming preview of shortcuts to Iceberg tables. In the coming weeks, Microsoft Fabric engines will be able to consume Iceberg data with no movement or duplication using OneLake shortcuts. Simply point to an Iceberg dataset from Snowflake or another Iceberg-compatible service, and OneLake virtualizes the table as a Delta Lake table for broad compatibility across Fabric engines. This means you can work with a single copy of your data across Snowflake and Fabric. With the ability to write Iceberg data to OneLake from Snowflake, Snowflake customers will have the flexibility to store Iceberg data in OneLake and use it across Fabric.

Finally, we’ve released mirroring support for Snowflake databases into general availability—providing a seamless, no-ETL experience for integrating existing Snowflake data with the rest of your data in Microsoft Fabric. With this capability, you can continuously replicate Snowflake data directly into Fabric OneLake in near real-time, while maintaining strong performance on your transactional workloads. Learn more about Snowflake mirroring in Fabric.

AI-powered insights

With your data teams using the AI-enhanced tools in Fabric to accelerate development of insights across your data estate, you then need to ensure these insights reach those who can use them to inform decisions. With easy-to-understand Power BI reports and AI-powered Q&A experiences, Fabric bridges the gap between data and business results to help you foster a culture that empowers everyone to find data-driven answers.

We’re announcing a richer Copilot experience in Power BI to help create reports in a clearer, more transparent way. This new experience, now in preview, includes improved conversational abilities between you and Copilot that make it easier to provide more context up front, so you can get the report you need on the first try. Copilot will even provide report outlines to improve transparency about the data fields being used. We are also releasing the ability to auto-generate descriptions for measures into general availability. Lastly, report viewers can now use Copilot to summarize a report or page right from the Power BI mobile app, now in preview.

We’re also enhancing email subscriptions for reports by extending dynamic per recipient subscriptions to include both paginated and Power BI reports. With dynamic subscriptions, you can set up a single email subscription that delivers customized reports to each recipient based on the data in the semantic model. For reports that are too large for email format, we are also giving you the ability to deliver Power BI and paginated report subscriptions to a OneDrive or SharePoint location for easy access. Finally, you can now create print-ready, parameterized paginated reports using the Get Data experience in Power BI Report Builder—accessing over 100 data sources.

Learn more about all of the Power BI announcements in the Power BI September 2024 Feature blog.

Start building your Fabric skills

We are grateful so many of you have decided to grow your skills with Microsoft Fabric. In the past six months alone, more than 17,000 individuals have earned the Fabric Analytics Engineer Associate certification, making it the fastest growing certification in Microsoft’s history. Today, we’re excited to announce a brand-new certification for data engineers coming in late October. The new Microsoft Certified: Fabric Data Engineer Associate certification will help you prove your skills with data ingestion, transformation, administration, monitoring, and performance optimization in Fabric. 

Our portfolio of Microsoft Credentials for Fabric also includes four Microsoft Applied Skills, which complement Microsoft certifications and are free of cost. Applied Skills test your ability to complete a real-world scenario in a lab environment and provide you with formal credentials that showcase your technical skills to employers. For Fabric, we have Applied Skills credentials covering the implementation of lakehouse, data warehouse, data science, and real-time intelligence solutions. 

Visit the Fabric Career Hub to get the best free resources to help you get certified and the latest certification exam discounts. Don’t forget to also join the vibrant Fabric community to connect with like-minded data professionals, get all your Fabric technical questions answered, and stay current on the latest product updates, training programs, events, and more. 

And if you want to test your skills, explore Fabric, and win prizes, you can also register for the Microsoft Fabric and AI Learning Hackathon. To learn more, you can join our Ask Me Anything event on October 8. 

Join us at Microsoft Ignite

We are excited to bring even more innovation to the Microsoft Fabric platform at Microsoft Ignite this year. Join us from November 19 through November 21, 2024 either in person in Chicago or online. You will see firsthand the latest solutions and capabilities across all of Microsoft and connect with experts, community leaders, and partners who can help you modernize and manage your own intelligent apps, safeguard your business and data, accelerate productivity, and so much more. 

Explore additional resources for Microsoft Fabric

If you want to learn more about Microsoft Fabric: 

Boost your AI with Azure’s new Phi model, streamlined RAG, and custom generative AI models https://azure.microsoft.com/en-us/blog/boost-your-ai-with-azures-new-phi-model-streamlined-rag-and-custom-generative-ai-models/ Thu, 22 Aug 2024 16:00:00 +0000 We're excited to announce several updates to help developers quickly create AI solutions with greater choice and flexibility leveraging the Azure AI toolchain.

As developers continue to build and deploy AI applications at scale across organizations, Azure is committed to delivering unprecedented choice in models as well as a flexible and comprehensive toolchain to handle the unique, complex, and diverse needs of modern enterprises. This powerful combination of the latest models and cutting-edge tooling empowers developers to create highly customized solutions grounded in their organization’s data. That’s why we are excited to announce several updates to help developers quickly create AI solutions with greater choice and flexibility leveraging the Azure AI toolchain:

  • Improvements to the Phi family of models, including a new Mixture of Experts (MoE) model and support for more than 20 languages.
  • AI21 Jamba 1.5 Large and Jamba 1.5 on Azure AI models as a service.
  • Integrated vectorization in Azure AI Search to create a streamlined retrieval augmented generation (RAG) pipeline with integrated data prep and embedding.
  • Custom generative extraction models in Azure AI Document Intelligence, so you can now extract custom fields for unstructured documents with high accuracy.
  • The general availability of Text to Speech (TTS) Avatar, a capability of Azure AI Speech service, which brings natural-sounding voices and photorealistic avatars to life, across diverse languages and voices, enhancing customer engagement and overall experience. 
  • The general availability of Conversational PII Detection Service in Azure AI Language.

Use the Phi model family with more languages and higher throughput 

We are introducing a new model to the Phi family, Phi-3.5-MoE, a Mixture of Experts (MoE) model. This new model combines 16 smaller experts into one, which delivers improvements in model quality and lower latency. While the model has 42B parameters, as an MoE model it only uses 6.6B active parameters at a time: it specializes subsets of the parameters (experts) during training and then, at runtime, routes each task to the relevant experts. This approach gives customers the benefit of the speed and computational efficiency of a small model with the domain knowledge and higher-quality outputs of a larger model. Read more about how we used a Mixture of Experts architecture to improve Azure AI translation performance and quality.
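The arithmetic behind "42B total, 6.6B active" can be sketched as follows. The expert and shared parameter counts below are illustrative numbers chosen to roughly reproduce the quoted figures, not Phi-3.5-MoE's actual layout.

```python
def active_fraction(total_experts: int, active_experts: int,
                    expert_params: float, shared_params: float) -> float:
    """Fraction of a MoE model's parameters used for any single token.

    Shared parameters (attention, embeddings, router) always run;
    only `active_experts` of the `total_experts` expert blocks fire per token.
    """
    total = shared_params + total_experts * expert_params
    active = shared_params + active_experts * expert_params
    return active / total

# Illustrative: 16 experts of 2.5B parameters each, 2 routed per token,
# plus 2B shared parameters -> 42B total, 7B active per token.
print(round(active_fraction(16, 2, 2.5e9, 2e9), 3))  # 0.167
```

Whatever the exact split, the per-token compute tracks the active parameters, which is why an MoE model can answer with small-model latency while drawing on large-model capacity.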

We are also announcing a new mini model, Phi-3.5-mini. Both the new MoE model and the mini model are multi-lingual, supporting over 20 languages. The additional languages allow people to interact with the model in the language they are most comfortable using.

Even with the new languages, the new mini model, Phi-3.5-mini, is still a tiny 3.8B parameters.

Companies like CallMiner, a conversational intelligence leader, are selecting and using Phi models for their speed, accuracy, and security.

“CallMiner is constantly innovating and evolving our conversation intelligence platform, and we’re excited about the value Phi models are bringing to our GenAI architecture. As we evaluate different models, we’ve continued to prioritize accuracy, speed, and security... The small size of Phi models makes them incredibly fast, and fine tuning has allowed us to tailor to the specific use cases that matter most to our customers at high accuracy and across multiple languages. Further, the transparent training process for Phi models empowers us to limit bias and implement GenAI securely. We look forward to expanding our application of Phi models across our suite of products.”

—Bruce McMahon, Chief Product Officer, CallMiner

To make outputs more predictable and define the structure needed by an application, we are bringing Guidance to the Phi-3.5-mini serverless endpoint. Guidance is a proven open-source Python library (with 18K-plus GitHub stars) that enables developers to express, in a single API call, the precise programmatic constraints the model must follow for structured output in JSON, Python, HTML, SQL, or whatever the use case requires. With Guidance, you can eliminate expensive retries and, for example, constrain the model to select from pre-defined lists (e.g., medical codes), restrict outputs to direct quotes from provided context, or follow any regex. Guidance steers the model token by token in the inference stack, producing higher-quality outputs and reducing cost and latency by as much as 30-50% in highly structured scenarios. 
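The core idea of constrained decoding can be shown with a toy sketch: at each step, only continuations from an allowed set are eligible, so the output is valid by construction and no retries are needed. This is a conceptual illustration of the mechanism, not the Guidance API, and the scores and medical codes below are made up.

```python
def constrained_choice(model_scores: dict, allowed: list) -> str:
    """Pick the highest-scoring option, considering allowed values only.

    A real constrained decoder masks disallowed tokens at every decoding
    step; this collapses that to a single whole-string choice.
    """
    candidates = {option: score for option, score in model_scores.items()
                  if option in allowed}
    return max(candidates, key=candidates.get)

# Unconstrained, the model prefers free-form text; the constraint forces
# a value from the pre-defined list (e.g., valid ICD-10-style codes).
scores = {"diabetes (type 2)": 0.55, "E11.9": 0.30, "I10": 0.15}
print(constrained_choice(scores, allowed=["E11.9", "I10"]))  # E11.9
```

Because invalid continuations are never sampled in the first place, the application can parse the output directly instead of validating and re-prompting, which is where the quoted cost and latency savings come from.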

We are also updating the Phi vision model with multi-frame support. This means that Phi-3.5-vision (4.2B parameters) allows reasoning over multiple input images, unlocking new scenarios like identifying differences between images.


At the core of our product strategy, Microsoft is dedicated to supporting the development of safe and responsible AI, and provides developers with a robust suite of tools and capabilities.  

Developers working with Phi models can assess quality and safety using both built-in and custom metrics in Azure AI evaluations, informing necessary mitigations. Azure AI Content Safety provides built-in controls and guardrails, such as prompt shields and protected material detection. These capabilities can be applied across models, including Phi, using content filters, or can be easily integrated into applications through a single API. Once in production, developers can monitor their application for quality and safety, adversarial prompt attacks, and data integrity, making timely interventions with the help of real-time alerts. 

Introducing AI21 Jamba 1.5 Large and Jamba 1.5 on Azure AI models as a service

Furthering our goal of providing developers with access to the broadest selection of models, we are also excited to announce two new open models, Jamba 1.5 Large and Jamba 1.5, available in the Azure AI model catalog. These models use the Jamba architecture, which blends Mamba and Transformer layers for efficient long-context processing.

According to AI21, the Jamba 1.5 Large and Jamba 1.5 models are the most advanced in the Jamba series. They use the hybrid Mamba-Transformer architecture, which balances speed, memory, and quality by employing Mamba layers for short-range dependencies and Transformer layers for long-range dependencies. As a result, this family of models excels at managing extended contexts, making it ideal for industries including financial services, healthcare and life sciences, and retail and CPG. 

“We are excited to deepen our collaboration with Microsoft, bringing the cutting-edge innovations of the Jamba Model family to Azure AI users…As an advanced hybrid SSM-Transformer (Structured State Space Model-Transformer) set of foundation models, the Jamba model family democratizes access to efficiency, low latency, high quality, and long-context handling. These models empower enterprises with enhanced performance and seamless integration with the Azure AI platform.”—Pankaj Dugar, Senior Vice President and General Manager of North America at AI21

Simplify RAG for generative AI applications

We are streamlining RAG pipelines with integrated, end-to-end data preparation and embedding. Organizations often use RAG in generative AI applications to incorporate knowledge from private, organization-specific data without having to retrain the model. With RAG, you can use strategies like vector and hybrid retrieval to surface relevant, grounded information for a query based on your data. However, performing vector search requires significant data preparation: your app must ingest, parse, enrich, embed, and index data of various types, often living in multiple sources, before it can be used in your copilot. 
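The preparation steps above can be sketched end to end in a few lines. The hash-based "embedding" below is a toy stand-in for a real embedding model and is used only to show how ingested text flows into an index and back out at query time:

```python
# Toy sketch of the RAG data-preparation flow: ingest -> embed -> index ->
# retrieve. The trigram-hashing "embedding" is a deterministic stand-in
# for a real embedding model.
import hashlib
import math

def embed(text, dim=64):
    """Toy embedding: hash character trigrams into a normalized vector."""
    vec = [0.0] * dim
    for i in range(len(text) - 2):
        bucket = int(hashlib.md5(text[i:i + 3].encode()).hexdigest(), 16) % dim
        vec[bucket] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a, b):
    return sum(x * y for x, y in zip(a, b))

docs = [
    "Azure AI Search supports vector and hybrid retrieval.",
    "Conference registration opens in September.",
]
index = [(doc, embed(doc)) for doc in docs]            # index step

query_vec = embed("How does vector retrieval work?")   # query embedding
best_doc, _ = max(index, key=lambda pair: cosine(query_vec, pair[1]))
```

Integrated vectorization collapses exactly these steps (plus parsing and enrichment of real document formats) into one managed flow, so the application never has to orchestrate them itself.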

Today we are announcing general availability of integrated vectorization in Azure AI Search. Integrated vectorization automates and streamlines these processes all into one flow. With automatic vector indexing and querying using integrated access to embedding models, your application unlocks the full potential of what your data offers.

In addition to improving developer productivity, integrated vectorization enables organizations to offer turnkey RAG systems as solutions for new projects, so teams can quickly build an application specific to their datasets and needs without having to build a custom deployment each time.

Customers like SGS & Co, a global brand impact group, are streamlining their workflows with integrated vectorization.

“SGS AI Visual Search is a GenAI application built on Azure for our global production teams to more effectively find sourcing and research information pertinent to their project… The most significant advantage offered by SGS AI Visual Search is utilizing RAG, with Azure AI Search as the retrieval system, to accurately locate and retrieve relevant assets for project planning and production”—Laura Portelli, Product Manager, SGS & Co

Extract custom fields in Document Intelligence 

You can now extract custom fields from unstructured documents with high accuracy by building and training a custom generative model within Document Intelligence. This new capability uses generative AI to extract user-specified fields from documents across a wide variety of visual templates and document types, and you can get started with as few as five training documents. While building a custom generative model, automatic labeling saves time and effort on manual annotation. Results are grounded where applicable, and confidence scores make it easy to filter high-quality extracted data for downstream processing and reduce manual review time.
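The confidence-score filtering described above might look like the following sketch, where the field names and scores are illustrative rather than actual Document Intelligence output:

```python
# Sketch of downstream triage on extracted fields: accept fields whose
# confidence clears a threshold, route the rest to manual review.
# Field names, values, and scores here are invented examples.

def triage(fields, threshold=0.85):
    """Split {name: (value, confidence)} into accepted and review sets."""
    accepted = {k: v for k, (v, c) in fields.items() if c >= threshold}
    review = {k: v for k, (v, c) in fields.items() if c < threshold}
    return accepted, review

extracted = {
    "invoice_number": ("INV-1042", 0.97),
    "total_amount": ("$1,284.00", 0.91),
    "purchase_order": ("PO-77", 0.62),   # low confidence -> human review
}
accepted, review = triage(extracted)
```

Tuning the threshold trades automation rate against review workload, which is the lever the confidence scores are designed to give you.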

Create engaging experiences with prebuilt and custom avatars 

Today we are excited to announce that Text to Speech (TTS) Avatar, a capability of Azure AI Speech service, is now generally available. This service brings natural-sounding voices and photorealistic avatars to life, across diverse languages and voices, enhancing customer engagement and overall experience. With TTS Avatar, developers can create personalized and engaging experiences for their customers and employees, while also improving efficiency and providing innovative solutions.

The TTS Avatar service provides developers with a variety of pre-built avatars, featuring a diverse portfolio of natural-sounding voices, as well as an option to create custom synthetic voices using Azure Custom Neural Voice. Additionally, the photorealistic avatars can be customized to match a company’s branding. For example, Fujifilm is using TTS Avatar with NURA, the world’s first AI-powered health screening center.

“Embracing the Azure TTS Avatar at NURA as our 24-hour AI assistant marks a pivotal step in healthcare innovation. At NURA, we envision a future where AI-powered assistants redefine customer interactions, brand management, and healthcare delivery. Working with Microsoft, we’re honored to pioneer the next generation of digital experiences, revolutionizing how businesses connect with customers and elevate brand experiences, paving the way for a new era of personalized care and engagement. Let’s bring more smiles together”—Dr. Kasim, Executive Director and Chief Operating Officer, Nura AI Health Screening

As we bring this technology to market, ensuring the responsible use and development of AI remains our top priority. Custom Text to Speech Avatar is a limited-access service with integrated safety and security features. For example, the system embeds invisible watermarks in avatar outputs; these watermarks allow approved users to verify whether a video was created using Azure AI Speech’s avatar feature. Additionally, we provide guidelines for the responsible use of TTS Avatar, including measures to promote transparency in user interactions, identify and mitigate potential bias or harmful synthetic content, and integrate with Azure AI Content Safety. In this transparency note, we describe the technology and capabilities of TTS Avatar, its approved use cases, considerations when choosing use cases, its limitations, fairness considerations, and best practices for improving system performance. We also require all developers and content creators to apply for access and comply with our code of conduct when using TTS Avatar features, including prebuilt and custom avatars.  

Use Azure Machine Learning resources in VS Code

We’re thrilled to announce the general availability of the VS Code extension for Azure Machine Learning. The extension allows you to build, train, deploy, debug, and manage machine learning models with Azure Machine Learning directly from your favorite VS Code setup, whether on desktop or web. With features like VNET support, IntelliSense and integration with Azure Machine Learning CLI, the extension is now ready for production use. Read this tech community blog to learn more about the extension.

Customers like Fashable have put this into production.

“We have been using the VS Code extension for Azure Machine Learning since its preview release, and it has significantly streamlined our workflow… The ability to manage everything from building to deploying models directly within our preferred VS Code environment has been a game-changer. The seamless integration and robust features like interactive debugging and VNET support have enhanced our productivity and collaboration. We are thrilled about its general availability and look forward to leveraging its full potential in our AI projects.”—Ornaldo Ribas Fernandes, Co-founder and CEO, Fashable

Protect users’ privacy 

Today we are excited to announce the general availability of the Conversational PII Detection Service in Azure AI Language, enhancing Azure AI’s ability to identify and redact sensitive information in conversations, starting with English. This service improves data privacy and security for developers building generative AI apps for their enterprises. Conversational PII redaction expands on the Text PII redaction service, which helps customers identify, categorize, and redact sensitive information such as phone numbers and email addresses in unstructured text. The Conversational PII model is specialized for conversational-style inputs, particularly speech transcriptions from meetings and calls. 
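As a rough local illustration of what such a service does, the sketch below redacts two common PII entity types with regexes. The real model-based service covers far more entity types and conversational nuances than these patterns; this is only a simplified stand-in:

```python
# Simplified stand-in for PII redaction: detect and mask phone numbers
# and email addresses in transcript text. The real Conversational PII
# service uses trained models, not regexes, and handles many more types.
import re

PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+(?:\.[\w-]+)+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def redact(text):
    """Replace each detected entity with its category label."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

line = "Call me at 425-555-0100 or email sam@contoso.com."
print(redact(line))
```

Replacing entities with category labels (rather than deleting them) preserves the structure of the transcript, which keeps the redacted text usable for downstream analytics.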

Self-serve your Azure OpenAI Service PTUs  

We recently announced updates to Azure OpenAI Service, including the ability to manage your Azure OpenAI Service quota deployments without relying on support from your account team, allowing you to request Provisioned Throughput Units (PTUs) more flexibly and efficiently. We also released OpenAI’s latest model on August 7, the day it became available, introducing Structured Outputs for the new GPT-4o and GPT-4o mini models. Structured Outputs are particularly valuable for developers who need to validate and format AI outputs into defined structures such as JSON Schemas. 
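A minimal sketch of validating a structured response follows; the schema and response here are invented examples. With Structured Outputs the service enforces the schema at generation time, so a check like this serves only as an application-side safety net:

```python
# Sketch of application-side validation of a structured model response.
# The schema and the sample response are illustrative, not real API output.
import json

SCHEMA = {
    "required": ["name", "quantity"],
    "types": {"name": str, "quantity": int},
}

def validate(payload, schema):
    """Parse a JSON payload and check required keys and their types."""
    data = json.loads(payload)
    ok = all(k in data for k in schema["required"]) and \
         all(isinstance(data[k], t) for k, t in schema["types"].items())
    return ok, data

response = '{"name": "widget", "quantity": 3}'   # e.g., a model reply
ok, data = validate(response, SCHEMA)
```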

We continue to invest across the Azure AI stack to bring state of the art innovation to our customers so you can build, deploy, and scale your AI solutions safely and confidently. We cannot wait to see what you build next.

Stay up to date with more Azure AI news 

The post Boost your AI with Azure’s new Phi model, streamlined RAG, and custom generative AI models appeared first on Microsoft AI Blogs.

]]>
Empowering partnerships: The Microsoft Fabric Conference—your gateway to AI innovation http://approjects.co.za/?big=en-us/microsoft-fabric/blog/2024/07/22/empowering-partnerships-the-microsoft-fabric-conference-your-gateway-to-ai-innovation/ Mon, 22 Jul 2024 15:00:00 +0000 The upcoming European Microsoft Fabric Community Conference 2024 in Stockholm, Sweden from September 24 to 27, 2024, is not just an event—it's a beacon for Microsoft partners who are steering the future of AI and analytics.

The post Empowering partnerships: The Microsoft Fabric Conference—your gateway to AI innovation appeared first on Microsoft AI Blogs.

]]>
In case you haven’t heard, building on the success of the inaugural Microsoft Fabric Community Conference in Las Vegas, Nevada earlier this year, the conference has expanded to Europe!  

The upcoming European Microsoft Fabric Community Conference 2024 in Stockholm, Sweden is not just an event—it’s a beacon for Microsoft partners who are steering the future of AI and analytics. The conference, set to take place from September 24 to 27, 2024, is a pivotal gathering for those at the forefront of deploying and adopting Microsoft Fabric’s transformative technologies.  

Stockholm, the heart of Scandinavian innovation, is the perfect backdrop for the Fabric Community Conference. Known for its vibrant tech scene and forward-thinking approach, Stockholm embodies the spirit of progress that Microsoft and its partners strive for. 

European Microsoft Fabric Community Conference 2024

A brand-new conference dedicated to Fabric

What to expect at the European Microsoft Fabric Community Conference 2024

Expect to be wowed. You’ll hear from leading Microsoft and community experts from around the world covering topics ranging from Retrieval-Augmented Generation (RAG) pattern applications and semantic modeling, to data governance and sustainability, to integrating applications into the Fabric framework. And if that isn’t enough, you’ll get to experience the latest features from Fabric, Power BI, Azure Databases, Azure AI, Microsoft Purview, and more, demonstrating how Fabric serves as a unified platform that empowers both data and business professionals across all industries.   

And as a prelude to the main conference, we invite you to a special Partner Pre-Day: a day dedicated to you, our partners, to ensure you’re equipped with the knowledge and connections to thrive. See more details below.

We’ve also planned a few other activities to connect with the community: 

  • Partner happy hour: Network with the Fabric leadership and product team. An invaluable opportunity to connect with the team bringing you Fabric.
  • One-on-one partner executive connections: Meet with our executives and Fabric partner team to discuss your priorities and needs and gain a better understanding of partner motions and resources.
  • Partner-to-partner connection: Connect with other partners to discuss joint business opportunities and share learnings. 

What’s in it for Microsoft partners 

For Microsoft partners, the conference is more than just a learning experience; it represents an amazing chance for partners to forge deeper connections with the minds behind the technology, to engage with customers eager to leverage your expertise to grow their business, and to network with peers and other partners who are equally passionate about driving adoption of these amazing technologies. Are you excited yet? 

And of course, let’s not forget the Partner Pre-Day, an invaluable opportunity for partners to delve into the latest Microsoft partner initiatives, resources, and strategies for focusing on how Microsoft Fabric drives business growth and innovation. 

Here’s a sneak peek into the Partner Pre-Day: 

  • Get inspired: Attend Ask Me Anything sessions with top Microsoft data, AI, and analytics leadership and this year’s Partner of the Year Award winners.
  • Learn: Gain insights on how best to take advantage of partner-only offerings and incentives, access to resources, and deep technical skilling customized for our Microsoft AI Cloud Partner Program ecosystem.
  • Share: Meet one-on-one with Microsoft executives and the Microsoft partner team to share what’s on your mind.
  • Connect: Forge new relationships and strengthen existing ones with your partner peers for joint business outcomes.  

If you’re as excited about the Fabric Conference in Stockholm as we are, you’ll want to stay connected for all the latest updates. Be sure to follow the event on Microsoft’s partner social media channels on LinkedIn, Fabric YouTube, and the Fabric Tech Community Blog. These platforms are your go-to for live updates, exclusive behind-the-scenes content, and a chance to network with fellow innovators before, during, and after the conference. 

And hey, while you’re at it, why not join the Fabric Partner Community? It’s a fantastic way to get involved with weekly engineering calls where you can dive deep into the tech, ask questions, and share your insights. It’s like having a backstage pass to the world of Microsoft Fabric.

Now let’s make some noise—take these next steps

  • Register for the Fabric Conference today. By registering early for the 3-day pass, you can take advantage of an exclusive €200 discount using the code MSCUST.
  • Check out sponsorship opportunities.
  • And of course, share this blog post with your network to start those pre-conference discussions online.

Start your Fabric journey today

Check out these additional resources to learn more about Fabric and prepare your organization for the next phase of your Fabric journey. 

  • Read this blog to learn how to enable your organization to help customers prepare their data for AI innovation with Microsoft Fabric.
  • Check out the new Fabric certification and Fabric Career Hub to get your team upskilled and let customers know you’re Fabric certified.
  • Join the Fabric Partner Community on Microsoft Teams, where you can attend the Fabric Engineering Connection (our weekly partner community calls with product engineering), stay connected with other partners, and learn of the latest resources, opportunities, and more.
  • Visit Azure Migrate and Modernize and Azure Innovate to learn more about Azure Innovate, our hero partner offering, and to access resources and funding for customer projects. 

Let’s get the buzz going and show the world what the Microsoft partner community is all about. 

I can’t wait to see you all in Stockholm, Sweden for an unforgettable experience. Let’s innovate, collaborate, and grow together!

Microsoft Fabric

Bring your data into the era of AI


In partnership with Microsoft, the European Microsoft Fabric Community Conference is brought to you by the team behind ESPC, Europe’s premier Microsoft 365 Conference and the European Power Platform Conference. 

The post Empowering partnerships: The Microsoft Fabric Conference—your gateway to AI innovation appeared first on Microsoft AI Blogs.

]]>
Microsoft is a Leader in the 2024 Gartner® Magic Quadrant™ for Data Science and Machine Learning Platforms  https://azure.microsoft.com/en-us/blog/microsoft-is-a-leader-in-the-2024-gartner-magic-quadrant-for-data-science-and-machine-learning-platforms/ Tue, 25 Jun 2024 20:00:00 +0000 Microsoft is a Leader in this year’s Gartner® Magic Quadrant™ for Data Science and Machine Learning Platforms. Azure AI provides a powerful, flexible end-to-end platform for accelerating data science and machine learning innovation.

The post Microsoft is a Leader in the 2024 Gartner® Magic Quadrant™ for Data Science and Machine Learning Platforms  appeared first on Microsoft AI Blogs.

]]>
Microsoft is a Leader in this year’s Gartner® Magic Quadrant™ for Data Science and Machine Learning Platforms. Azure AI provides a powerful, flexible end-to-end platform for accelerating data science and machine learning innovation while providing the enterprise governance that every organization needs in the era of AI. 

Magic Quadrant for Data Science and Machine Learning Platforms, showing Gartner results as of April 2024.

In May 2024, Microsoft was also named a Leader for the fifth year in a row in the Gartner® Magic Quadrant™ for Cloud AI Developer Services, where we placed furthest for our Completeness of Vision. We’re pleased by these recognitions from Gartner as we continue helping customers, from large enterprises to agile startups, bring their AI and machine learning models and applications into production securely and at scale. 

Azure AI is at the forefront of purpose-built AI infrastructure, responsible AI tooling, and helping cross-functional teams collaborate effectively using Machine Learning Operations (MLOps) for generative AI and traditional machine learning projects. Azure Machine Learning provides access to a broad selection of foundation models in the Azure AI model catalog—including the recent releases of Phi-3, JAIS, and GPT-4o—and tools to fine-tune or build your own machine learning models. Additionally, the platform supports a rich library of open-source frameworks, tools, and algorithms so that data science and machine learning teams can innovate in their own way, all on a trusted foundation. 

Accelerate time to value with Azure AI infrastructure

“We’re now able to get a functioning model with relevant insights up and running in just a couple of weeks thanks to Azure Machine Learning. We’ve even managed to produce verified models in just four to six weeks.”

—Dr. Nico Wintergerst, Staff AI Research Engineer at relayr GmbH

Azure Machine Learning helps organizations build, deploy, and manage high-quality AI solutions quickly and efficiently, whether building large models from scratch, running inference on pre-trained models, consuming models as a service, or fine-tuning models for specific domains. Azure Machine Learning runs on the same powerful AI infrastructure that powers some of the world’s most popular AI services, such as ChatGPT, Bing, and Azure OpenAI Service. Additionally, Azure Machine Learning’s compatibility with ONNX Runtime and DeepSpeed can help customers further optimize training and inference time for performance, scalability, and power efficiency.

Whether your organization is training a deep learning model from scratch using open source frameworks or bringing an existing model into the cloud, Azure Machine Learning enables data science teams to scale out training jobs using elastic cloud compute resources and seamlessly transition from training to deployment. With managed online endpoints, customers can deploy models across powerful CPU and graphics processing unit (GPU) machines without needing to manage the underlying infrastructure—saving time and effort. Similarly, customers do not need to provision or manage infrastructure when deploying foundation models as a service from the Azure AI model catalog. This means customers can easily deploy and manage thousands of models across production environments—from on-premises to the edge—for batch and real-time predictions.  

Streamline operations with flexible MLOps and LLMOps 

“Prompt flow helped streamline our development and testing cycles, which established the groundedness we required for making sure the customer and the solution were interacting in a realistic way.”

—Fabon Dzogang, Senior Machine Learning Scientist at ASOS

Machine learning operations (MLOps) and large language model operations (LLMOps) sit at the intersection of people, processes, and platforms. As data science projects scale and applications become more complex, effective automation and collaboration tools become essential for achieving high-quality, repeatable outcomes.  

Azure Machine Learning is a flexible MLOps platform, built to support data science teams of any size. The platform makes it easy for teams to share and govern machine learning assets, build repeatable pipelines using built-in interoperability with Azure DevOps and GitHub Actions, and continuously monitor model performance in production. Data connectors to Microsoft sources such as Microsoft Fabric and external sources such as Snowflake and Amazon S3 further simplify MLOps. Interoperability with MLflow also makes it seamless for data scientists to scale existing workloads from local execution to the cloud and edge, while storing all MLflow experiments, run metrics, parameters, and model artifacts in a centralized workspace. 

Azure Machine Learning prompt flow helps streamline the entire development cycle for generative AI applications with its LLMOps capabilities, orchestrating executable flows composed of models, prompts, APIs, Python code, and tools for vector database lookup and content filtering. Azure AI prompt flow can be used together with popular open-source frameworks like LangChain and Semantic Kernel, enabling developers to bring experimental flows into prompt flow to scale those experiments and run comprehensive evaluations. Developers can debug, share, and iterate on applications collaboratively, integrating built-in testing, tracing, and evaluation tools into their CI/CD system to continually reassess the quality and safety of their application. Then, developers can deploy applications when ready with one click and monitor flows for key metrics such as latency, token usage, and generation quality in production. The result is end-to-end observability and continuous improvement. 

Develop more trustworthy models and apps 

“The responsible AI dashboard provides valuable insights into the performance and behavior of computer vision models, providing a better level of understanding into why some models perform differently than others, and insights into how various underlying algorithms or parameters influence performance. The benefit is better-performing models, enabled and optimized with less time and effort.” 

—Teague Maxfield, Senior Manager at Constellation Clearsight 

AI principles such as fairness, safety, and transparency are not self-executing. That’s why Azure Machine Learning provides data scientists and developers with practical tools to operationalize responsible AI right in their flow of work, whether they need to assess and debug a traditional machine learning model for bias, protect a foundation model from prompt injection attacks, or monitor model accuracy, quality, and safety in production. 

The Responsible AI dashboard helps data scientists assess and debug traditional machine learning models for fairness, accuracy, and explainability throughout the machine learning lifecycle. Users can also generate a Responsible AI scorecard to document and share model performance details with business stakeholders, for more informed decision-making. Similarly, developers in Azure Machine Learning can review model cards and benchmarks and perform their own evaluations to select the best foundation model for their use case from the Azure AI model catalog. Then they can apply a defense-in-depth approach to mitigating AI risks using built-in capabilities for content filtering, grounding on fresh data, and prompt engineering with safety system messages. Evaluation tools in prompt flow enable developers to iteratively measure, improve, and document the impact of their mitigations at scale, using built-in metrics and custom metrics. That way, data science teams can deploy solutions with confidence while providing transparency for business stakeholders. 
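One concrete example of the kind of fairness metric such tooling surfaces is demographic parity difference: the gap in positive-prediction rates between groups. The sketch below computes it on synthetic data (the predictions and group labels are invented for illustration):

```python
# Sketch of a basic fairness check: demographic parity difference, the
# gap between the highest and lowest positive-prediction rates across
# groups. A value of 0 means all groups are selected at the same rate.

def demographic_parity_difference(predictions, groups):
    """Return max - min of per-group positive-prediction rates."""
    counts = {}
    for pred, group in zip(predictions, groups):
        total, pos = counts.get(group, (0, 0))
        counts[group] = (total + 1, pos + pred)
    rates = {g: pos / total for g, (total, pos) in counts.items()}
    return max(rates.values()) - min(rates.values())

preds  = [1, 0, 1, 1, 0, 1, 0, 0]          # synthetic binary predictions
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
gap = demographic_parity_difference(preds, groups)  # A: 3/4, B: 1/4
```

A large gap flags a model for debugging; whether it indicates unfairness still depends on the use case, which is why dashboards pair such metrics with explainability views.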

Read more on Responsible AI with Azure.

Deliver enterprise security, privacy, and compliance 

“We needed to choose a platform that provided best-in-class security and compliance due to the sensitive data we require and one that also offered best-in-class services as we didn’t want to be an infrastructure hosting company. We chose Azure because of its scalability, security, and the immense support it offers in terms of infrastructure management.”

—Michael Calvin, Chief Technical Officer at Kinectify

In today’s data-driven world, effective data security, governance, and privacy require every organization to have a comprehensive understanding of their data and AI and machine learning systems. AI governance also requires effective collaboration between diverse stakeholders, such as IT administrators, AI and machine learning engineers, data scientists, and risk and compliance roles. In addition to enabling enterprise observability through MLOps and LLMOps, Azure Machine Learning helps organizations ensure that data and models are protected and compliant with the highest standards of security and privacy.

With Azure Machine Learning, IT administrators can restrict access to resources and operations by user account or groups, control incoming and outgoing network communications, encrypt data both in transit and at rest, scan for vulnerabilities, and centrally manage and audit configuration policies through Azure Policy. Data governance teams can also connect Azure Machine Learning to Microsoft Purview, so that metadata on AI assets—including models, datasets, and jobs—is automatically published to the Microsoft Purview Data Map. This enables data scientists and data engineers to observe how components are shared and reused and examine the lineage and transformations of training data to understand the impact of any issues in dependencies. Likewise, risk and compliance professionals can track what data is used to train models, how base models are fine-tuned or extended, and where models are employed across different production applications, and use this as evidence in compliance reports and audits. 

Lastly, with the Azure Machine Learning Kubernetes extension enabled by Azure Arc, organizations can run machine learning workloads on any Kubernetes clusters, ensuring data residency, security, and privacy compliance across hybrid public clouds and on-premises environments. This allows organizations to process data where it resides, meeting stringent regulatory requirements while maintaining flexibility and control over their MLOps. Customers using federated learning techniques along with Azure Machine Learning and Azure confidential computing can also train powerful models on disparate data sources, all without copying or moving data from secure locations. 

Get started with Azure Machine Learning 

Machine learning continues to transform the way businesses operate and compete in the digital era—whether you want to optimize your business operations, enhance customer experiences, or innovate. Azure Machine Learning provides a powerful, flexible machine learning and data science platform to operationalize AI innovation responsibly.  


*Gartner, Magic Quadrant for Data Science and Machine Learning Platforms, By Afraz Jaffri, Aura Popa, Peter Krensky, Jim Hare, Raghvender Bhati, Maryam Hassanlou, Tong Zhang, 17 June 2024. 

Gartner, Magic Quadrant for Cloud AI Developer Services, Jim Scheibmeir, Arun Batchu, Mike Fang, Published 29 April 2024. 

GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally, Magic Quadrant is a registered trademark of Gartner, Inc. and/or its affiliates and is used herein with permission. All rights reserved. 

Gartner does not endorse any vendor, product or service depicted in its research publications and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s Research & Advisory organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose. 

This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from this link. 

The post Microsoft is a Leader in the 2024 Gartner® Magic Quadrant™ for Data Science and Machine Learning Platforms  appeared first on Microsoft AI Blogs.

]]>
Raise the bar on AI-powered app development with Azure Database for PostgreSQL https://azure.microsoft.com/en-us/blog/raise-the-bar-on-ai-powered-app-development-with-azure-database-for-postgresql/ Wed, 05 Jun 2024 15:00:00 +0000 By harnessing the might of PostgreSQL in the cloud—with all the scalability and convenience you expect—comes Microsoft Azure Database for PostgreSQL. This fully managed service takes the hassle out of managing your PostgreSQL instances, allowing you to focus on what really matters: building amazing, AI-powered applications.

The post Raise the bar on AI-powered app development with Azure Database for PostgreSQL appeared first on Microsoft AI Blogs.

]]>
Known for its reliability and versatility, PostgreSQL is a popular and powerful open-source database system with a wide array of features. By harnessing the might of PostgreSQL in the cloud—with all the scalability and convenience you expect—comes Microsoft Azure Database for PostgreSQL. This fully managed service takes the hassle out of managing your PostgreSQL instances, allowing you to focus on what really matters: building amazing, AI-powered applications.  

What is PostgreSQL?


Learn more

Azure Database for PostgreSQL

Innovate with a fully managed, AI-ready PostgreSQL database

To better get you acquainted with how Azure Database for PostgreSQL empowers users to migrate their PostgreSQL databases and build intelligent apps, this blog will introduce a roster of new learning paths and events, including a pair of Cloud Skills Challenges. As if that’s not exciting enough, completing one of the challenges automatically enters you in a drawing for a great prize. So, let’s get going!   

Seamless database migration and app creation   

Say goodbye to tedious maintenance tasks and hello to seamless deployments, automated patching, and built-in high availability. Azure Database for PostgreSQL is a fully managed service that simplifies the migration of existing PostgreSQL databases to the cloud. We handle the burdens of patching, backups, and scaling—allowing you to focus on your applications.  

Seamless compatibility with PostgreSQL minimizes code changes during the transition and caters to diverse needs and budgets. With migration tooling in Azure Database for PostgreSQL, transferring data and schemas to the cloud becomes a breeze. 

Beyond migration, Azure Database for PostgreSQL empowers the development of AI-powered applications. Its native support for the pgvector extension allows for efficient storage and querying of vector embeddings, essential for AI and machine learning tasks. The service seamlessly integrates with other Azure AI services, such as Azure Machine Learning, Azure OpenAI Service, Microsoft Azure AI Language, and Microsoft Azure AI Translator, providing developers with a rich toolkit for building intelligent applications.  
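As an illustration of the kind of similarity search pgvector enables, the sketch below builds the SQL a client might issue. The table and column names are hypothetical, and actually executing the query would require a live PostgreSQL connection (via a driver such as psycopg) with the pgvector extension enabled:

```python
# Sketch: build pgvector SQL for storing and querying embeddings.
# Table/column names are illustrative; run against a database with
# the pgvector extension enabled (CREATE EXTENSION vector;).

def to_vector_literal(embedding):
    """Format a Python list as a pgvector literal, e.g. '[1.0,2.0,3.0]'."""
    return "[" + ",".join(str(float(x)) for x in embedding) + "]"

def nearest_neighbors_sql(table, column, embedding, k=5):
    """Build a cosine-distance nearest-neighbor query (pgvector's <=> operator)."""
    return (
        f"SELECT id, {column} <=> '{to_vector_literal(embedding)}' AS distance "
        f"FROM {table} ORDER BY distance LIMIT {k};"
    )

sql = nearest_neighbors_sql("documents", "embedding", [0.1, 0.2, 0.3], k=3)
print(sql)
```

In a real application the embedding would come from a model such as one hosted in Azure OpenAI Service, and the query would be executed through a parameterized driver call rather than string formatting.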

Additionally, the service’s scalability ensures optimal performance as AI workloads grow, maintaining cost efficiency throughout the development process. Overall, Azure Database for PostgreSQL provides a comprehensive solution for both migrating to the cloud and building powerful AI applications. 

Here are some key features: 

  • High availability: Up to 99.99% uptime guaranteed with zone-redundant high availability, automated maintenance, patching, and updates.
  • Performance automation: Get analysis of your database workloads to identify opportunities to improve query performance with query store and index recommendations.
  • Security: Includes Microsoft Defender for open-source relational databases to protect your data, and Azure IP Advantage, which is designed to protect businesses and developers who build on Azure from intellectual property risks.
  • Azure AI extension: Generate and store vector embeddings, call Azure AI services, and build AI-powered apps directly within the database.
  • Migration support: Tools to migrate Oracle Database to Azure Database for PostgreSQL are available, making the transition smoother.
  • Cost-effective: Provides operational savings—up to 62% compared with on-premises—with comprehensive database monitoring and optimization tools, which can lead to a lower total cost of ownership. 

Learn at your own pace with curated lessons 

Now that you’ve gotten a primer on Azure Database for PostgreSQL, the next step is engaging with our curated learning paths. The collected modules in these two courses include readings, exercises, and knowledge checks.

  • Build AI Apps with Azure Database for PostgreSQL
    Designed for developers interested in harnessing AI within their PostgreSQL applications on Azure, this learning path explores how the Azure AI extension for Azure Database for PostgreSQL can be leveraged to incorporate AI capabilities into your apps.

    By completing this learning path, you’ll gain a solid understanding of the Azure AI extension and its various functionalities. Discover how to evaluate different summarization techniques available through Azure AI services and the azure_ai extension, explore the differences between extractive, abstractive, and query-focused summarization, and apply generative AI summarization techniques to data within a PostgreSQL database. This hands-on experience will prepare you to build intelligent applications that condense complex content into concise, informative summaries. 

  • Configure and migrate to Azure Database for PostgreSQL
    This learning path supplies you with the essential skills needed to effectively work with Azure Database for PostgreSQL. It begins with a foundational understanding of PostgreSQL architecture and core concepts, before delving into practical aspects such as connecting to the database, executing queries, and ensuring robust security measures.

    You’ll also learn how to create and manage databases, schemas, and tables, and how to leverage stored procedures and functions for code reusability. With insights into how Azure Database for PostgreSQL implements ACID transactions and write-ahead logging for data integrity and durability, you’ll gain confidence in configuring, managing, and migrating existing PostgreSQL databases to Azure.
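The ACID guarantees mentioned above are easiest to see in action. The sketch below uses Python’s built-in sqlite3 module purely as a stand-in for a PostgreSQL driver (such as psycopg); the accounts table is hypothetical, but the transaction semantics shown here carry over:

```python
# Demonstrates atomicity: a failed transaction leaves no partial writes.
# sqlite3 stands in for a PostgreSQL driver such as psycopg; the
# transactional behavior illustrated is the same.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INT)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 0)")
conn.commit()

try:
    with conn:  # opens a transaction; rolls back automatically on exception
        conn.execute(
            "UPDATE accounts SET balance = balance - 40 WHERE name = 'alice'")
        raise RuntimeError("simulated failure before the matching credit")
except RuntimeError:
    pass

balance = conn.execute(
    "SELECT balance FROM accounts WHERE name = 'alice'").fetchone()[0]
print(balance)  # the debit was rolled back, so the balance is unchanged
```

PostgreSQL achieves this durability with write-ahead logging: changes are recorded in the log before they are applied, so a crash mid-transaction never leaves half-applied writes behind.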

Complete timed challenges to win Azure prizes 

To go along with these learning paths, we’ve also assembled a pair of corresponding Azure Database for PostgreSQL Cloud Skills Challenges. While learning paths are usually self-paced, solitary activities, Cloud Skills Challenges are part interactive learning sprint, part good-natured tournament between you and thousands of your peers around the globe. They’re immersive, gamified learning experiences blending hands-on exercises, tutorials, and assessments to ensure a well-rounded learning experience. 

Complete at least one of these challenges before time runs out and you’ll be automatically entered into a drawing to win one of 20 awesome Azure prizes. Sign up when these challenges kick off on June 11, 2024, and start competing! 

Connect with PostgreSQL experts at POSETTE 2024 conference 

Hosted by Microsoft, POSETTE 2024 (formerly Citus Con) is an exciting developer event dedicated to all things PostgreSQL. The event is a unique opportunity to learn from experts, network with fellow Postgres enthusiasts, and delve into the latest innovations in database technology. 

As a key player in the PostgreSQL community, we’ll be showcasing our commitment to the open-source database system. Attendees can look forward to a session on the future of Azure Database for PostgreSQL, where our experts will share our vision for the service and its integration with other Azure offerings.  

Running June 11 to 13, 2024, POSETTE—which stands for Postgres Open Source Ecosystem Talks, Training, and Education—is a free, virtual event featuring four unique livestreams. Registration is optional, and all scheduled talks will be available to watch online immediately after the event ends. Don’t miss out on this chance to connect with the Microsoft team and learn how we’re advancing PostgreSQL in the cloud.

Take the next step on your Azure Database for PostgreSQL journey 

Whether you’re a seasoned developer or just starting out, PostgreSQL and Azure Database for PostgreSQL are a dream team for building modern, scalable, and AI-powered apps. By offering robust migration tools and seamless integration with AI and machine learning services, Azure Database for PostgreSQL helps users efficiently migrate to the cloud and build sophisticated AI applications.  

Get started today with our pair of learning paths and their respective Cloud Skills Challenges to be entered into a drawing for cool Azure prizes, then check out the POSETTE 2024 livestreams to learn more about everything you can do with the world’s most advanced open-source database.  

The post Raise the bar on AI-powered app development with Azure Database for PostgreSQL appeared first on Microsoft AI Blogs.

]]>
Unlock real-time insights with AI-powered analytics in Microsoft Fabric http://approjects.co.za/?big=en-us/microsoft-fabric/blog/2024/05/21/unlock-real-time-insights-with-ai-powered-analytics-in-microsoft-fabric/ Tue, 21 May 2024 15:30:00 +0000 With Microsoft Fabric, we are simplifying and future-proofing your data estate with an ever-evolving, AI-powered data analytics platform. Fabric will keep up with the trends for you and seamlessly integrate each new capability so you can spend less time integrating and managing your data estate and more time unlocking value from your data.  

The post Unlock real-time insights with AI-powered analytics in Microsoft Fabric appeared first on Microsoft AI Blogs.

]]>
The data and analytics landscape is changing faster than ever. From the emergence of generative AI to the proliferation of citizen analysts to the increasing importance of real-time, autonomous action, keeping up with the latest trends can feel overwhelming. Every trend requires new services that customers must manually stitch into their data estate—driving up both cost and complexity.  

With Microsoft Fabric, we are simplifying and future-proofing your data estate with an ever-evolving, AI-powered data analytics platform. Fabric will keep up with the trends for you and seamlessly integrate each new capability so you can spend less time integrating and managing your data estate and more time unlocking value from your data.  

Get started with Microsoft Fabric

Set up Fabric for your business and discover resources that help you take the first steps

Aurizon, Australia’s largest rail freight operator, turned to Fabric to modernize their data estate and analytics system.

“With Microsoft Fabric, we’ve answered many of our questions about navigating future growth, to remove legacy systems, and to streamline and simplify our architecture. A trusted data platform sets us up to undertake complex predictive analytics and optimizations that will give greater surety for our business and drive commercial benefits for Aurizon and our customers in the very near future.”

—Tammy Wigg, Chief Data Analytics Officer at Aurizon

Aurizon is just one among thousands of customers who have already used Fabric to revolutionize how they connect to and analyze their data. In fact, a 2024 commissioned Total Economic Impact™ (TEI) study conducted by Forrester Consulting found that Microsoft Fabric customers saw a three-year 379% return on investment (ROI) with a payback period of less than six months. We are thrilled to share a huge range of new capabilities coming to Fabric. These innovations will help you more effectively uncover insights and keep you at the forefront of the trends in data and analytics. Check out a quick overview of the biggest changes coming to Fabric.

Fabric is a complete data platform

Prepare your data for AI innovation with Microsoft Fabric—now generally available


Read the blog

Fabric is a complete data platform—giving your data teams the ability to unify, transform, analyze, and unlock value from data from a single, integrated software as a service (SaaS) experience. We are excited to announce additions to the Fabric workloads that will make Fabric’s capabilities even more robust and even customizable to meet the unique needs of each organization. These enhancements include: 

  1. A completely redesigned workload, Real-Time Intelligence, that brings together and enhances Synapse Real-Time Analytics and Data Activator to help you analyze and act on high-volume, high-granularity event streaming data and even explore your organization’s real-time data in the new Real-time hub.
  2. New tools like the Fabric Workload Development Kit, Application Programming Interface (API) for GraphQL™, and “user data functions” that can help developers build powerful solutions on the Fabric platform. 
  3. A new feature in the Microsoft Azure Data Factory experience called Data workflow, powered by the Apache Airflow runtime, that can help you author, schedule, and monitor workflows or data pipelines using Python. 
Chart showing the latest Microsoft Fabric additions.

Unlock continuous insights with Real-Time Intelligence and the Real-time hub

When we introduced Fabric, it launched with seven core workloads, including Synapse Real-Time Analytics for data streaming analysis and Data Activator for monitoring and triggering actions in real time. We are unveiling an enhanced workload called Real-Time Intelligence that combines these workloads and brings an array of additional new features, in preview, to help organizations make better decisions with up-to-the-minute insights. From ingestion to transformation, querying, and taking immediate action, Real-Time Intelligence is an end-to-end experience that enables seamless handling of real-time data without the need to land it first. With Real-Time Intelligence, you can ingest streaming data with high granularity, dynamically transform it, query it in real time for instant insights, and trigger actions like alerting a production manager when equipment is overheating or rerunning jobs when data pipelines fail. And with both simple, low-code or no-code, and powerful, code-rich interfaces, Real-Time Intelligence empowers every user to work with real-time data. 

Behind this powerful workload is the Real-time hub, a single place to discover, manage, and use event streaming data from Fabric and other data sources from Microsoft, third-party cloud providers, and other external data sources. Just like the OneLake data hub makes it easy to discover, manage, and use the data at rest, the Real-time hub can help you do the same for data in motion. All events that flow through the Real-time hub can be easily transformed and routed to any Fabric data store and users can create new streams that can be discovered and consumed. From the Real-time hub, users can gain insights through the data profile, configure the right level of endorsement, set alerts on changing conditions and more, all without leaving the hub. While the existing Real-Time Analytics capabilities are still generally available, the Real-time hub and the other new capabilities coming to the Real-Time Intelligence workload are currently in preview. Watch this demo video to check out the redesigned Real-Time Intelligence experience:  

Elcome, one of the world’s largest marine electronics companies, built a new service on Fabric called “Welcome” that helps maritime crews stay connected to their families and friends.

“Microsoft Fabric Real-Time Intelligence has been the essential building block that’s enabled us to monitor, manage, and enhance the services we provide. With the help of the Real-time hub for centrally managing data in motion from our diverse sources and Data Activator for event-based triggers, Fabric’s end-to-end cloud solution has empowered us to easily understand and act on high-volume, high-granularity events in real-time with fewer resources.”

—Jimmy Grewal, Managing Director of Elcome

Real-time insights are becoming increasingly critical across industries, powering route optimization in transportation and logistics, grid monitoring in energy and utilities, predictive maintenance in manufacturing, and inventory management in retail. And since Real-Time Intelligence comes fully optimized and integrated in a SaaS platform, adoption is seamless. Strathan Campbell, Channel Environment Technology Lead at One NZ—the largest mobile carrier in New Zealand—said they “…went from a concept to a delivered product in just two weeks.” To learn more about the Real-Time Intelligence workload, watch the “Ingest, analyze and act in real time with Microsoft Fabric” Microsoft Build session or read the Real-Time Intelligence blog.  

Extend Fabric with your own, custom workloads and experiences

Fabric was built from the ground up to be extensible, customizable, and open. Now, we are making it even easier for software developers and customers to design, build, and interoperate applications within Fabric with the new Fabric Workload Development Kit—currently in preview. Applications built with this kit will appear as a native workload within Fabric, providing a consistent experience for users directly in their Fabric environment without any manual effort. Software developers can publish and monetize their custom workloads through Azure Marketplace. And, coming soon, we are creating a workload hub experience in Fabric where users can discover, add, and manage these workloads without ever leaving the Fabric environment. We already have industry-leading partners building on Fabric, including SAS, Esri, Informatica, Teradata, and Neo4j.

You can also learn more about the Workload Development Kit by watching the “Extend and enhance your analytics applications with Microsoft Fabric” Microsoft Build session.

We are also excited to announce two new features, both in preview, created with developers in mind: API for GraphQL and user data functions in Fabric. API for GraphQL is a flexible and powerful API that allows data professionals to access data from multiple sources in Fabric with a single query endpoint. With API for GraphQL, you can streamline requests to reduce network overhead and accelerate response rates. User data functions are user-defined functions built for Fabric experiences across all data services, such as notebooks, pipelines, or event streams. These features enable developers to more easily build experiences and applications using Fabric data sources like lakehouses, data warehouses, mirrored databases, and more, with native code ability, custom logic, and seamless integration. You can watch these features in action in the “Introducing API for GraphQL and User Data Functions in Microsoft Fabric” Microsoft Build session.
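To illustrate the single-query idea, a GraphQL client sends one POST whose JSON body carries the query and its variables. The sketch below assembles such a payload; the schema (customers and orders) and field names are hypothetical, not the actual Fabric API:

```python
# Sketch of a GraphQL request body: one POST carries the whole query.
# The schema (customers/orders) is hypothetical -- consult the Fabric
# API for GraphQL documentation for real endpoint URLs and field names.
import json

def graphql_payload(query, variables=None):
    """Assemble the JSON body a GraphQL client POSTs to an endpoint."""
    body = {"query": query}
    if variables:
        body["variables"] = variables
    return json.dumps(body)

query = """
query CustomerOrders($minTotal: Int!) {
  customers {
    name
    orders(minTotal: $minTotal) { id total }
  }
}
"""
payload = graphql_payload(query, {"minTotal": 100})
print(payload)
```

Because the query names exactly the fields it needs across related entities, a single round trip replaces the several REST calls that would otherwise be stitched together client-side.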

You can also learn more about the Workload Development Kit, the API for GraphQL, user data functions, and more by reading the Integrating ISV apps with Microsoft Fabric blog.

Orchestrate complex data workflows in the Fabric Data Factory workload

We are also announcing the preview of Data workflows in Fabric as part of the Data Factory experience. Data workflows allow customers to define Directed Acyclic Graph (DAG) files for complex data workflow orchestration in Fabric. Data workflows are powered by the Apache Airflow runtime and designed to help you author, schedule, and monitor workflows or data pipelines using Python. Learn more by reading the data workflows blog.  
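Under the hood, a data workflow is a Directed Acyclic Graph of tasks executed in dependency order. The pure-Python sketch below illustrates that ordering concept with the standard library’s graphlib; the task names are illustrative, and this is the underlying idea rather than the Airflow API itself:

```python
# Pure-Python sketch of DAG ordering -- the concept behind Airflow-style
# orchestration. Task names are illustrative; a real data workflow would
# declare these dependencies in an Airflow DAG file instead.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "notify": {"load"},
}

# static_order yields every task only after all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # → ['extract', 'transform', 'load', 'notify']
```

An orchestrator like Airflow adds scheduling, retries, and monitoring on top of exactly this dependency-ordered execution.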

Fabric is lake-centric and open

The typical data estate has grown organically over time to span multiple clouds, accounts, databases, domains, and engines with a multitude of vendors and specialized services. OneLake, Fabric’s unified, multi-cloud data lake built to span an entire organization, can connect to data from across your data estate and reduce data duplication and sprawl.  

We are excited to announce the expansion of OneLake shortcuts to connect to data from on-premises and network-restricted data sources beyond just Azure Data Lake Storage Gen2, now in preview. With an on-premises data gateway, you can now create shortcuts to Google Cloud Storage, Amazon S3, and S3-compatible storage buckets that are either on-premises or otherwise network-restricted. To learn more about these announcements, watch the Microsoft Build session “Unify your data with OneLake and Microsoft Fabric.”  

Empower business users with Fabric

Insights drive impact only when they reach those who can use them to inform actions and decisions. Professional and citizen analysts bridge the gap between data and business results, and with Fabric, they have the tools to quickly manage, analyze, visualize, and uncover insights that can be shared with the entire organization. We are excited to help analysts work even faster and more effectively by releasing the model explorer and the DAX query view in Microsoft Power BI Desktop into general availability.

The model explorer in Microsoft Power BI provides a rich view of all the semantic model objects in the data pane—helping you find items in your data fast. You can also use the model explorer to create calculation groups and reduce the number of measures by reusing calculation logic and simplifying semantic model consumption. 


The DAX query view in Power BI Desktop lets users discover, analyze, and see the data in their semantic model using the DAX query language. Users working with a model can validate data and measures without having to build a visual or use an additional tool—similar to the Explore feature. Changes made to measures can be seamlessly updated directly back to the semantic model. 


To learn more about these announcements and others coming to Power BI, check out the Power BI blog.  

Fabric is AI-powered

When ChatGPT was launched, it had over 100 million users in just over two months—the steepest adoption curve in the history of technology.1 It’s been a year and a half since that launch, and organizations are still trying to translate the benefit of generative AI from novelty to actual business results. By infusing generative AI into every layer of Fabric, we can empower your data professionals to employ its benefits, in the right context and in the right scenario to get more done, faster.  

Use Copilot in Fabric, now generally available 

Copilot in Fabric was designed to help users unlock the full potential of their data by assisting data professionals to be more productive and business users to explore their data more easily. With Copilot in Fabric, you can use conversational language to create dataflows, generate code and entire functions, build machine learning models, or visualize results. We are excited to share that Copilot in Fabric is now generally available, starting with the Power BI experience. This includes the ability to create stunning reports and summarize your insights into narrative summaries in seconds. Copilot in Fabric is also now enabled by default for all eligible tenants, including the Copilot in Fabric experiences for Data Factory, Data Engineering, Data Science, Data Warehouse, and Real-Time Intelligence, which are all still in preview. The general availability of Copilot in Fabric for the Power BI experience will be rolling out over the coming weeks to all customers with Power BI Premium capacity (P1 or higher) or Fabric capacity (F64 or higher). 

We are also thrilled to announce a new Copilot in Fabric experience for Real-Time Intelligence, currently in preview, that enables users to explore real-time data with ease. Starting with a Kusto Query Language (KQL) Queryset connected to a KQL Database in an Eventhouse or a standalone Azure Data Explorer database, you can type your question in conversational language and Copilot will automatically translate it to a KQL query you can execute. This experience is especially powerful for users who are less familiar with writing KQL queries but still want to get the most from their time-series data stored in Eventhouse. 

Create custom Q&A experiences with your data with AI skills 

We are also thrilled to release a new AI capability in preview called AI skills—an innovative experience designed to provide any user with a conversational Q&A experience about their data. AI skills allow you to simply select the data source in Fabric you want to explore and immediately start asking questions about your data—even without any configuration. When answering questions, the generative AI experience will show the query it generated to find the answer and you can enhance the Q&A experience by adding more tables, setting additional context, and configuring settings. AI skills can empower everyone to explore data, build and configure AI experiences, and get the answers and insights they need.  

AI skills will honor existing security permissions and can be configured to respect the unique language and nuances of your organization, ensuring that responses are not just data-driven but steeped in the context of your business operations. And, coming soon, they can also enrich the creation of new copilots in Microsoft Copilot Studio and be accessed from Copilot for Microsoft 365. It’s about making your data not just accessible but approachable, inviting users to explore insights through natural dialogue, and shortening the time to insight.

New partnerships with Microsoft Fabric

Snowflake and Microsoft Fabric

With the launch of Fabric, we’ve committed to open data formats, standards, and interoperability with our partners to give our customers the flexibility to do what makes sense for their business. We are taking this commitment a step further by deepening our existing partnership with Snowflake to expand interoperability between Snowflake and Fabric’s OneLake. We are excited to announce future support for Apache Iceberg in Fabric OneLake and bi-directional data access between Snowflake and Fabric. This integration will enable users to analyze their Fabric and Snowflake data written in Iceberg format in any engine within either platform, and access data across apps like Microsoft 365, Microsoft Power Platform, and Microsoft Azure AI Studio.

With the upcoming availability of shortcuts for Iceberg in OneLake, Fabric users will be able to access all data sources in Iceberg format, including the Iceberg sources from Snowflake, and translate metadata between Iceberg and Delta formats. This means you can work with a single copy of your data across Snowflake and Fabric. Since all the OneLake data can be accessed in Snowflake as well as in Fabric, this integration will enable you to spend less time stitching together applications and your data estate, and more time uncovering insights. To learn more about this announcement, read the Fabric and Snowflake partnership blog.

Adobe and Microsoft Fabric 

We are also excited to announce we are expanding our existing relationship with Adobe. Adobe Experience Platform (AEP) and Adobe Campaign will have the ability to federate enterprise data from Fabric. Our joint customers will soon have the capability to connect to Fabric and use the Fabric Data Warehouse for query federation to create and enrich audiences for engagement, without having to transfer or extract the data from Fabric. 

Combine Fabric and Microsoft Azure Databricks to get the best of both worlds

We are excited to announce that we are expanding the integration between Fabric and Azure Databricks—allowing you to have a truly unified experience across both products and pick the right tools for any scenario. 

Azure Databricks Unity Catalog integration with Fabric 

Coming soon, you will be able to access Azure Databricks Unity Catalog tables directly in Fabric, making it even easier to unify Azure Databricks with Fabric. From the Fabric portal, you can create and configure a new Azure Databricks Unity Catalog item in Fabric with just a few clicks. You can link a full catalog, a schema, or even individual tables, and the management of this Azure Databricks item in OneLake—a shortcut connected to Unity Catalog—is handled automatically for you.  

This data acts like any other data in OneLake—you can write SQL queries or use it with any other workloads in Fabric, including Power BI through Direct Lake mode. When the data is modified or tables are added, removed, or renamed in Azure Databricks, the data in Fabric will always remain in sync. This new integration makes it simple to unify Azure Databricks data in Fabric and seamlessly use it across every Fabric workload. 

Federate OneLake as a Remote Catalog in Azure Databricks 

Also coming soon, Fabric users will be able to access Fabric data items like lakehouses as a catalog in Azure Databricks. While the data remains in OneLake, you can access and view data lineage and other metadata in Azure Databricks and leverage the full power of Unity Catalog. This includes extending Unity Catalog’s unified governance over data and AI into Azure Databricks Mosaic AI. In total, you will be able to combine this data with other native and federated data in Azure Databricks, perform analysis assisted by generative AI, and publish the aggregated data back to Power BI—making this integration complete across the entire data and AI lifecycle. 

Watch these announcements in action at Microsoft Build 2024

Join us at Microsoft Build from May 21 to 23, 2024 to see all of these announcements in action across the following sessions: 

You can also try out these new capabilities and everything else Fabric has to offer by signing up for a free 60-day trial—no credit card information required. To start your free trial, sign up for a free account (Power BI customers can use their existing account), and once signed in, select start trial within the account manager tool in the Fabric app. Existing Power BI Premium customers can already access Fabric by simply turning on Fabric in the Fabric admin portal. Learn more on the Fabric get started page.

Join us at the European Microsoft Fabric Community Conference 

We are excited to announce a European Microsoft Fabric Community Conference that will be held in Stockholm, Sweden, from September 23 to 26, 2024. You can see firsthand how Fabric and the rest of the data and AI products at Microsoft can help your organization prepare for the era of AI. You will hear from leading Microsoft and community experts from around the world and get hands-on experience with the latest features from Fabric, Power BI, Azure Databases, Azure AI, Microsoft Purview, and more. You will also have the opportunity to learn from top data experts and AI leaders while interacting with your peers and sharing your story. We hope you will join us and see how cutting-edge technologies from Microsoft can enable your business success with the power of Fabric.   

Explore additional resources for Microsoft Fabric

If you want to learn more about Microsoft Fabric: 

Microsoft Fabric

Experience the next generation in analytics 



1ChatGPT sets record for fastest-growing user base – analyst note, Reuters.

The post Unlock real-time insights with AI-powered analytics in Microsoft Fabric appeared first on Microsoft AI Blogs.

]]>
What’s new in Azure Data, AI, and Digital Applications: Harness the power of intelligent apps  https://azure.microsoft.com/en-us/blog/whats-new-in-azure-data-ai-and-digital-applications-harness-the-power-of-intelligent-apps/ Thu, 02 May 2024 16:00:00 +0000 Sharing insights on technology transformation along with important updates and resources about the data, AI, and digital application solutions that make Microsoft Azure the platform for the era of AI.

The post What’s new in Azure Data, AI, and Digital Applications: Harness the power of intelligent apps  appeared first on Microsoft AI Blogs.

]]>
As companies race to build or modernize intelligent apps that incorporate generative AI, many are finding gaps in their tech-stack modernization efforts as they explore and test use cases. 

Recently, Microsoft commissioned Forrester Consulting to evaluate modernization efforts at global organizations. The study found that nearly 90% of decision makers said they have a modern tech foundation and are ready for the future. Yet the study also uncovered a perception gap: many organizations aren’t as ready as they think they are. Some still deal with inflexible, outdated, and brittle legacy software systems that stifle innovation and leave substantial room for improvement. 

Some of this comes from a cloud approach focused on migration. There is a lot to gain simply by migrating to the cloud, but modernization is much more than lift and shift—it’s about what you can do once you get there. 

The study also found that one in five companies surveyed are overcoming foundational barriers in their application modernization journey to bring critical value back to the business. These companies are doing a few common things: implementing a clear technology strategy built around AI and training, integrating analytics and key metrics to connect with business outcomes, and upgrading technology stacks, supported by strategic partnerships with relevant suppliers. In adopting these practices, these companies can move rapidly to address internal gaps and overcome thorny challenges in the way of their modernization goals. 

Is your tech stack AI-ready? Are you looking for best practices to help execute your modernization strategy? Want to learn more about the Forrester study? Join us on May 8, 2024 for our next Microsoft Azure webinar: “Harness the Power of Intelligent Apps: Modernize with Azure.” In addition to hearing from me, you’ll hear from: 

  • Amanda Silver on “Powering Future Innovation with Azure Application Platform”
  • Ali Powell on “Vision to Value: Accelerating Modernization to Azure with Azure Customer Success”
  • Cyril Belikoff on “Empowering Modernization with Microsoft Programs, Offers, and Partner Ecosystem (Closing Keynote)”

And we’ll be joined by Bill Martorelli, Principal Analyst at Forrester, who will discuss the study. 


Join us for the Azure webinar series

"Harness the Power of Intelligent Apps: Modernize with Azure"

Next, here is the latest from Microsoft data, AI, and digital apps to help you with your modernization.

What’s new in data, AI, and digital applications 

Snowflake partnership extends Azure options for your data 

One of the most empowering things Microsoft does for customers is forge partnerships with companies that extend the value of our platform. Snowflake is a great example of this in action. I had the opportunity to join Snowflake Chief Marketing Officer Denise Persson to talk about how we work together to help customers make the most of their data to drive AI transformation. Partnerships like these are just one more reason why Azure is the best cloud for your data. 

More choice: the Microsoft Azure AI model catalog of open and frontier models just got bigger with additions from Snowflake, Microsoft Research, Meta, Databricks, and more 

The Azure AI model catalog now offers more than 1,600 foundation models, including large language models (LLMs) and small language models (SLMs), enabling Azure customers to choose the best model for their use case. Our catalog continues to grow with the addition of new cutting-edge models, including Snowflake Arctic, Meta Llama 3, databricks/dbrx-base and databricks/dbrx-instruct, and Microsoft’s Phi-3 family of open models, the most capable and cost-effective small language models available, outperforming models of the same size and the next size up across a variety of language, coding, and math benchmarks. These new models not only expand our catalog but also enhance our ability to meet diverse enterprise demands.  

Azure AI Search now has significantly increased storage capacity and vector index size at no additional cost. To support increased demand for retrieval augmented generation (RAG) and generative AI applications at scale, AI Search has drastically raised storage limits, giving customers more vectors per dollar without compromising high performance. With this change, customers can achieve more scalability at a lower cost, trust AI Search to handle their large RAG workloads, and apply advanced search strategies to navigate complex data to innovate in ways previously unimaginable. 
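To make the retrieval step behind RAG concrete, here is a minimal, self-contained sketch of vector search over toy embeddings. It is purely illustrative: the documents, vectors, and `top_k` helper are invented, and a real workload would use the Azure AI Search vector index with embeddings from an embedding model rather than hand-written three-dimensional vectors.

```python
import math

# Toy in-memory vector index illustrating the retrieval step of a RAG
# pipeline. Document IDs and embeddings are made up for illustration.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

index = {
    "return-policy": [0.9, 0.1, 0.0],
    "shipping-times": [0.1, 0.8, 0.2],
    "warranty-terms": [0.7, 0.2, 0.3],
}

def top_k(query_vec, k=2):
    # Rank every document by cosine similarity to the query vector.
    scored = sorted(index.items(), key=lambda kv: cosine(query_vec, kv[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

print(top_k([0.85, 0.15, 0.1]))  # ['return-policy', 'warranty-terms']
```

The raised storage limits matter precisely because each document in such an index carries a vector: more storage per dollar means more vectors, and therefore broader grounding data, for the same spend.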

Microsoft announces new industrial AI innovations from the cloud to the factory floor 

We announced new AI-driven solutions for manufacturing that aim to unify data estates, enhance operational resilience, and empower front-line workers with AI copilots for efficient data querying and issue resolution. The private preview includes Microsoft Fabric data solutions and the Microsoft Copilot for Azure AI template, promising to optimize operations and accelerate AI integration across global manufacturing sites. 

From cloud to edge, CPUs to GPUs, and application-specific integrated circuits (ASICs), the AI hardware and software landscape is expanding at an impressive rate. Our new “Microsoft Azure: The State of AI Infrastructure” report helps you keep up with the current state of AI, its trends and challenges, and learn about best practices for building and deploying scalable and efficient AI systems. Read the inaugural blog in our new Infrastructure for the Era of AI series for more and to download the report. 

Mirroring in Microsoft Fabric for Azure SQL Database and Azure Cosmos DB now in public preview 

Azure SQL Database Mirroring in Microsoft Fabric continuously replicates your SQL data into Delta tables in Microsoft Fabric’s OneLake in near real-time. This new feature helps avoid complex and time-consuming extract, transform, load (ETL) processes and enables faster time to insights. Mirroring further reduces your overall total cost of ownership with zero compute and storage costs to replicate, helping you quickly and cost-effectively respond to changes in your business. 

With Azure Cosmos DB Mirroring in Microsoft Fabric, you can seamlessly bring your Azure Cosmos DB data into OneLake in Microsoft Fabric with no-ETL and near real-time insights on your operational data, allowing you to react more quickly to changes in your business environment and market conditions. 
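The no-ETL model both Mirroring features share can be pictured as a change feed applied continuously to an analytics replica. The sketch below is purely conceptual: the event format and `apply_change` helper are invented for illustration, while the actual service replicates source changes into Delta tables in OneLake in near real-time.

```python
# Conceptual sketch of continuous replication: each change event from the
# source operational store is applied to the replica as it arrives, with no
# scheduled batch ETL job in between.
def apply_change(replica, event):
    op, key, row = event["op"], event["key"], event.get("row")
    if op in ("insert", "update"):
        replica[key] = row          # upsert the latest version of the row
    elif op == "delete":
        replica.pop(key, None)      # remove the row if present
    return replica

replica = {}
change_feed = [
    {"op": "insert", "key": 1, "row": {"customer": "Contoso", "status": "active"}},
    {"op": "insert", "key": 2, "row": {"customer": "Fabrikam", "status": "active"}},
    {"op": "update", "key": 2, "row": {"customer": "Fabrikam", "status": "churned"}},
    {"op": "delete", "key": 1},
]
for event in change_feed:
    apply_change(replica, event)

print(replica)  # {2: {'customer': 'Fabrikam', 'status': 'churned'}}
```

Because the replica is always a few moments behind the source rather than a day behind a nightly load, queries against it reflect the current state of the business.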

Microsoft Azure Database for PostgreSQL: New AI capabilities in Azure AI extension and new migration service 

The Azure AI extension in public preview now includes real-time predictions to invoke machine learning models hosted on Microsoft Azure Machine Learning online endpoints. This is extremely helpful for building fraud detection in banking, product recommendations in retail, patient predictions in healthcare, and more. Also new is real-time text translation using Microsoft Azure AI Translator to translate text from within SQL. This facilitates building intelligent multilingual applications on Azure Database for PostgreSQL. It also supports automatically detecting the language of the input text and filtering profanities.  
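As a rough illustration of the real-time prediction pattern, here is a toy per-row scoring sketch. `score_transaction` is a hypothetical stand-in for a call to an Azure Machine Learning online endpoint; in the extension itself the invocation happens from within SQL, and the features, weights, and threshold below are invented.

```python
# Mock of per-row model scoring for fraud detection. The risk formula is a
# placeholder for whatever the deployed model actually computes.
def score_transaction(amount, country_mismatch):
    risk = 0.4 * (amount > 5000) + 0.6 * country_mismatch
    return risk

transactions = [
    {"id": "t1", "amount": 120, "country_mismatch": 0},
    {"id": "t2", "amount": 9800, "country_mismatch": 1},
]

# Flag any transaction whose risk score crosses the (made-up) 0.5 threshold.
flagged = [
    t["id"]
    for t in transactions
    if score_transaction(t["amount"], t["country_mismatch"]) >= 0.5
]
print(flagged)  # ['t2']
```

The value of doing this in the database is latency: the prediction runs where the row lives, so a suspicious transaction can be flagged in the same request that records it.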

A new migration service now generally available simplifies the process of moving PostgreSQL databases from anywhere to Azure. The service offers both offline and online migration options from an on-premises server, AWS RDS for PostgreSQL, Azure Virtual Machines (VMs), and Azure Database for PostgreSQL—Single Server. The service helps customers move to Azure Database for PostgreSQL—Flexible Server with ease and confidence. 

Microsoft Defender for Cloud now supports Azure Database for MySQL—Flexible Server 

Protect databases from threats without affecting performance or availability, and lower the risk of data breaches, attacks, and unauthorized access with security monitoring of anomalous or suspicious activities. Customers can easily enable Defender for Cloud from the Azure Portal to start getting security alerts and insights for Azure Database for MySQL—Flexible Server and receive recommendations to mitigate potentially harmful threats. Learn more about Defender for Cloud.

Healthcare data solutions in Microsoft Fabric in public preview 

The healthcare data solutions provide data models and transformation activities that help customers create a multimodal warehouse. They enable customers to align with industry standards, such as Fast Healthcare Interoperability Resources (FHIR) and Digital Imaging and Communications in Medicine (DICOM), and support compliance with regulations, such as the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and the General Data Protection Regulation (GDPR). Learn more about healthcare data solutions.

Empowering AI globally with investments in Japan and United Arab Emirates, a new AI Hub in London, and a global partnership with Cognizant 

Microsoft will invest USD 2.9 billion over the next two years to increase our hyperscale cloud computing and AI infrastructure in Japan. We will also expand our digital skilling programs, open our first Microsoft Research Asia lab in Japan, and deepen our cybersecurity collaboration with the Government of Japan. 

We are investing USD 1.5 billion in Abu Dhabi’s G42 to accelerate AI development and global expansion, and to strengthen our collaboration on bringing the latest Microsoft AI technologies and skilling initiatives to the United Arab Emirates and other countries around the world. 

Microsoft AI Chief Executive Officer Mustafa Suleyman announced Microsoft AI is opening a new AI hub in the heart of London. Microsoft AI London will drive pioneering work to advance state-of-the-art language models and their supporting infrastructure, and to create world-class tooling for foundation models, collaborating closely with our AI teams across Microsoft and with our partners, including OpenAI. 

Cognizant and Microsoft are expanding our partnership to integrate Microsoft’s generative AI and Microsoft Copilot into Cognizant’s digital transformation services. With a significant investment in generative AI, the partnership is set to drive AI adoption and create industry-specific solutions, while upholding ethical AI standards. The initiative also includes extensive training for Cognizant developers and the deployment of Microsoft Copilot for Microsoft 365 to a vast user base.  

Learn something new 

Microsoft Learn helps address the AI skills gap 

With more than 6 million people globally engaged with AI learning, Microsoft Learn is your AI skill-building partner in addressing the AI skills gap. We designed a simple framework to help you chart your own course for building the necessary AI skills to realize the value of the Microsoft platform. Learn more with the “Accelerate AI transformation with skill building” position paper. For more, see the blog from Kim Akers, Corporate Vice President, Enablement and Operations for Microsoft Customer and Partner Solutions, on how to transform your business with AI skill building on Microsoft Learn.

How to customize your generative AI model 

There are three key techniques for customizing an LLM: prompt engineering, RAG, and fine-tuning. In this blog, you will learn how to:  

  • Optimize prompts to ensure a model produces more accurate responses.  
  • Retrieve information from external sources for better accuracy. 
  • Teach a model new skills and tasks. 
  • Teach a model new information from a custom dataset. 
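As a rough illustration of how the first two techniques meet at prompt time, here is a minimal sketch. The `retrieve` and `build_prompt` helpers are hypothetical, a production system would use embedding-based retrieval rather than word overlap, and fine-tuning happens offline so it does not appear in the prompt assembly at all.

```python
# Sketch of prompt assembly combining prompt engineering (a constraining
# system instruction) with RAG (grounding the answer in retrieved passages).
def retrieve(question, store):
    # Stand-in retriever: rank passages by shared words with the question.
    words = set(question.lower().split())
    return sorted(store, key=lambda p: -len(words & set(p.lower().split())))[:1]

def build_prompt(question, store):
    context = "\n".join(retrieve(question, store))     # RAG: ground with retrieved facts
    system = "Answer using only the provided context." # prompt engineering: constrain behavior
    return f"{system}\n\nContext:\n{context}\n\nQuestion: {question}"

store = [
    "Fabric shortcuts virtualize data in OneLake without copying it.",
    "Power BI supports paginated reports.",
]
prompt = build_prompt("How do shortcuts work in OneLake?", store)
print(prompt)
```

The assembled prompt would then be sent to a base or fine-tuned model; which of the three techniques you lean on hardest depends on whether the gap is behavior, knowledge, or skill.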

LLMOps Maturity Model Assessment and framework, plus new instructional video series 

LLMOps describes the operational practices and strategies for managing LLMs in production. Not sure what your organization’s LLMOps maturity level is? Take a 10-minute assessment to find out. Then get guidance on how to advance your capabilities in LLMOps based on your organization’s maturity level.  

There are a lot of intricacies involved in building systems that use LLMs: selecting just the right models, orchestrating LLM flows, and monitoring them using responsible AI toolchains. LLMOps is more than technology or product adoption. It’s a confluence of the people engaged, the processes used, and the products implemented. In a new instructional video series, we explore the concept of LLMOps in depth and introduce the latest Azure AI tools designed to help enterprises adopt robust LLMOps practices. 

New blog: Improving Azure Functions cold starts 

Our new blog, “How to Conquer Cold Starts for Better Performance,” is now live on The New Stack, the platform for the latest news and resources for cloud native technologies. Skill up for free on leveraging AI with cloud-native and serverless technologies on Azure.

Insights Tomorrow: A podcast for data enthusiasts 

Insights Tomorrow features in-depth conversations with data leaders and experts about the revolutionary journeys they’re taking in the world of data, analytics, and governance. Join host Patrick LeBlanc to explore how unlocking value from data is trailblazing the way to digital transformation. 

Customers innovating with generative AI  

MultiChoice boosts user satisfaction with Azure Machine Learning and Microsoft Azure Databricks 

Have you ever wished for a viewing experience that knows your taste better than you do? MultiChoice, Africa’s premier entertainment destination, has made that a reality with Azure Machine Learning. Users enjoy highly personalized recommendations that cater to their tastes, making every view count. As MultiChoice makes the viewing experience better, smarter, and more enjoyable, it has seen a remarkable uptick in user engagement and satisfaction. 

Nexi uses Microsoft Azure App Service to modernize its FinTech apps to provide a personalized payment solution to customers 

Nexi is a European PayTech company that helps its customers pay and accept digital payments through a complete and personalized range of simple, intuitive, and secure solutions. To simplify transactions and empower people and businesses to enjoy closer relationships and prosper together, Nexi needed a unified, more flexible solution that would let its developers spend less time on infrastructure and more time delivering product value. Working with Microsoft and cVation, Nexi moved forward with app modernization and transformation.

Coles accelerates from monthly to weekly application deployments with Azure 

Coles, which operates more than 800 stores in Australia, needed to modernize its technology to meet growing customer demand for an omnichannel experience that served them equally well in store and online. In the highly competitive grocery business, that means rapidly evolving in response to customer input. Coles gained amazing efficiency with Azure DevOps, shifting from monthly to weekly deployments, reducing build times, and enabling rapid deployment of changes to production. The customer experience is further optimized with Azure Cosmos DB, which provides an aggregated view across channels to enable further customer insights, allowing Coles to provide an even more customized experience. 

Opportunities to connect 

Build multimodal apps with Microsoft and win at Microsoft Generative AI Hackathon—now through May 6, 2024

Do you want to learn how to use Azure AI and GitHub Copilot to build multimodal apps that combine text, image, video, or voice inputs and outputs for a big impact? Do you want to win cash prizes, Azure credits, and recognition for your work? If you answered yes to any of these questions, then you should join us for the Microsoft Generative AI Hackathon, an online challenge where you can use your skills to create exciting generative AI solutions. 

Microsoft Build—May 21 to 23, 2024 

Develop the AI skills needed for tomorrow, today. At Microsoft Build 2024, you will learn from in-demand experts, experience the latest innovations in breakouts, and make connections with a community that can help you achieve more. Register for Build.

Microsoft Developers AI Learning Hackathon, win up to $10,000 in prizes! Now through June 17, 2024

This hackathon challenges you to push the boundaries of what’s possible by building your very own AI app using Azure Cosmos DB for MongoDB. Targeting both Node.js and Python developers, the hackathon guides you through comprehensive learning on the fundamentals of AI apps—and you might win up to $10,000 in prizes. By the end of the hackathon, you will know how to build your very own custom AI copilot with Azure Cosmos DB for MongoDB. Get started now.

Azure Cosmos DB Conf 2024 available on-demand 

Missed the excitement and innovation-packed Azure Cosmos DB Conference in April? No worries, we’ve got you covered! The event brought together the sharpest minds in the industry, including product leaders, chief technical officers, and community members who shared insights on building groundbreaking generative AI applications powered by Azure Cosmos DB. Whether you missed a little or missed it all, catch the replays on demand.

Microsoft’s AI Classroom Hackathon winners 

Microsoft’s AI Classroom Hackathon winners were chosen from more than 3,700 students from over 100 countries who answered Microsoft’s call to build the next generation of intelligent applications that reimagine the future of education using Azure AI and Azure databases. Among the winners is Dialogues Through Time, which lets you have interactive AI-driven dialogues with historical figures like Socrates and Leonardo da Vinci. Another winner, EquEdu, is an AI-powered content accessibility tool that makes online educational content accessible to the visually impaired. There are lots of inspirational ideas here—check out the winners and their projects. 


What’s new?  

Jessica shares insights on technology transformation along with important updates and resources about the data, AI, and digital application solutions that make Microsoft Azure the platform for the era of AI. Find Jessica’s blog posts here and be sure to follow Jessica on LinkedIn.

The post What’s new in Azure Data, AI, and Digital Applications: Harness the power of intelligent apps  appeared first on Microsoft AI Blogs.

]]>
Announcements from the Microsoft Fabric Community Conference http://approjects.co.za/?big=en-us/microsoft-fabric/blog/2024/03/26/announcements-from-the-microsoft-fabric-community-conference/ Tue, 26 Mar 2024 15:00:00 +0000 I’m thrilled so many of you could attend the Microsoft Fabric Community Conference this week in Las Vegas, Nevada. With more than 130 sessions from experts around the world, attendees are getting hands-on experience with everything Microsoft Fabric has to offer from data warehousing to data movement to AI, real-time analytics, and business intelligence.

The post Announcements from the Microsoft Fabric Community Conference appeared first on Microsoft AI Blogs.

]]>
I’m thrilled so many of you could attend the Microsoft Fabric Community Conference this week in Las Vegas, Nevada. With more than 130 sessions from experts around the world, attendees are getting hands-on experience with everything Microsoft Fabric has to offer from data warehousing to data movement to AI, real-time analytics, and business intelligence. For those who could not attend, however, I wanted to share all the announcements from the conference for Microsoft Fabric and the rest of the Microsoft data, AI, and security products in the Microsoft Intelligent Data Platform.

The Microsoft Intelligent Data Platform is our solution to help you create a powerful, agile, and secure data and AI foundation, made simple. The Microsoft Intelligent Data Platform is a suite of technologies with Microsoft Fabric at its heart that helps organizations harness the full power of their data. By natively integrating products across four workloads—AI, analytics, database, and security—organizations no longer have to bear the cost and burden of stitching together a complex set of disconnected services from multiple vendors themselves. Instead, they can focus on making bold, real-time decisions and empowering teams to create and innovate without limits. Learn more about the Microsoft Intelligent Data Platform and how Microsoft Fabric fits in by reading Jessica Hawk’s blog “The Microsoft Intelligent Data Platform: unleash your data to accelerate transformation.”

Many of our customers are already taking advantage of our focus on integration to accelerate their time to insight and empower more people to make data-backed decisions. For example, One NZ, one of the largest mobile carriers in New Zealand, wanted to provide nearly 1,000 users with a real-time, tailored view of customer data to deliver more timely customer service. But as Strathan Campbell, Channel Environment Technology Lead at One NZ, explains, “Our increasing data volumes started leading to delayed refresh rates in what should have been real-time Power BI dashboards.” They turned to Microsoft Fabric, and in particular its real-time analytics capabilities, to provide a seamless and easy-to-manage solution.

“What drew us to [Fabric] was that it was an all-in-one solution. Since we didn’t need to buy new components and were already embedded with Power BI, putting the architecture and security in place was quick and easy. Most dashboards are updated every 10 seconds now, which is six times faster than before.”

—Steven Easton, BI Channels Specialist at One NZ

Learn more about One NZ’s journey.


Join the thriving Microsoft Fabric community 

It’s been fantastic meeting so many of our most active community members at the Microsoft Fabric Community Conference this week. We launched Microsoft Fabric 10 months ago at Microsoft Build—a reimagining of analytics with a single, SaaS platform that could tackle every step of the analytics process, all on a multi-cloud data lake foundation. We were thrilled by the excitement across the millions of active Power BI, Data Factory, and Synapse community members who came together to answer thousands of questions, post ideas, join user groups, and help each other along their data journeys. Your feedback has helped us create and refine so much of what makes Microsoft Fabric great. Thank you all for your ideas and constant support. 

We have created new resources to help you ramp up on Microsoft Fabric and advance your career. First, visit the new Fabric Career Hub to access a comprehensive learning journey with free on-demand and live training, discounts on certification exams, career insights from community experts, and role guidance to understand how Fabric can open up new opportunities. You can also join the vibrant Fabric Community today and engage with a huge community of data professionals to get help when you’re stuck, learn from peers, showcase your work, and even suggest product improvements. 

We’ve also published an enhanced portfolio of Microsoft Credentials, including the new “Microsoft Certified: Fabric Analytics Engineer Associate” certification, along with several new Microsoft Applied Skills covering scenarios using Microsoft Fabric, like implementing lakehouses, data warehouses, real-time analytics and data science solutions—with more coming out over the next few months. Check them all out on the Microsoft Credentials homepage.

New capabilities coming to Microsoft Fabric 

We have been working tirelessly over the past year to create the richest and most intuitive analytics platform on the market. We are thrilled to share the latest in a long line of innovation that is helping us fulfill the four core promises of Fabric:

  1. Fabric is a complete platform
  2. Fabric is lake-centric and open
  3. Fabric can empower every business user
  4. Fabric is AI powered

Fabric is a complete platform

Our first promise is that Fabric is a complete analytics platform with every tool your data scientists, data engineers, data warehousing professionals, analysts, and business users need to unlock value from data in a single unified SaaS platform. It also has the end-to-end, industry-leading security, governance, and data management capabilities needed to protect and manage your data. Let’s take a look at the latest enhancements to the Fabric platform: 

Over the past few months, we have made significant updates to our platform to help you tackle projects of any scale and complexity. First, we are transforming Microsoft Fabric’s CI/CD experience. This transformation includes support for data pipelines and data warehouses in Fabric Git integration and deployment pipelines. Spark job definition and Spark environment will become available in Git integration. We are also giving you the ability to easily branch out a workspace integrated into Git with just a couple of clicks to help you reduce the time to code. Additionally, because many organizations already have robust CI/CD processes established in tools such as Azure DevOps, we will also support both Fabric Git integration APIs as well as Fabric deployment pipelines APIs, enabling you to integrate Fabric into these familiar CI/CD tools. All of these updates will be launched in a preview experience in early April.

Second, we are significantly updating our dataflows and data pipeline experience in Fabric to help customers more quickly ingest and transform their data. With Fast Copy in Dataflows Gen2, you can ingest a large amount of data using the same data movement backend as the “copy” activity in data pipelines. For data pipelines, you can now access on-premises data using the on-premises Data Gateway—the same gateway used with dataflows. We are also excited to add a new activity, semantic model refresh, that enables you to use Data Factory to orchestrate the refresh of semantic models in Microsoft Fabric. Finally, we are doubling the number of activities supported in a data pipeline from forty to eighty. All of these updates are now in preview and you can try them today. 

We have also listened to your feedback over the past few months and added some highly requested features to make working in Fabric even easier. We’ve released the ability to create folders in your workspaces, now in preview, and we are announcing the ability to create multiple apps in the same workspace, coming soon. We are also excited to share a feature coming soon that will give you the ability to add tags to Fabric items and manage them for enhanced compliance, discoverability, and reuse. Finally, we want to show you a sneak peek of a new feature we are bringing to Microsoft Fabric called task flows. Task flows can help you visualize a data project from end to end: 

This image shows the new capability in Microsoft Fabric, task flows, which helps you visualize a data project from end to end by mapping out each artifact in a visual map view.

Security in Microsoft Fabric 

With all your data flowing into the same platform, you need to be certain that data is secure at every step of the analytics journey. With that in mind, we have released a number of enterprise security features to better protect your data. We recently announced the preview of Azure Private Link support for Microsoft Fabric, which provides secure access to your sensitive data in Microsoft Fabric through network isolation and required controls on your inbound network traffic. We also announced the preview of Trusted Workspace Access and Managed Private Endpoints, which allow secure connections from Microsoft Fabric to data sources that are behind a firewall or not accessible from the public internet. Similarly, we released the VNET data gateway into general availability in February, which lets you connect your Azure and other data services to Microsoft Fabric and the Power Platform while ensuring no traffic is exposed to a public endpoint. We are thrilled to announce the expansion of the VNET data gateway to include on-premises data behind a VNET—now generally available.

We are also announcing deeper integration with Microsoft Purview’s industry-leading data security and compliance offerings to help you seamlessly secure data across your data estate. First, we are excited to announce that security admins will soon be able to define Purview Information Protection policies in Microsoft Fabric to automatically enforce access permissions to sensitive information in Fabric. Also coming soon is the extension of Purview Data Loss Prevention (DLP) policies to Fabric, enabling security teams to automatically identify the upload of sensitive information to Fabric and trigger automatic risk remediation actions. The DLP policies will initially work with Fabric Lakehouses with support for other Fabric workloads to follow. Finally, we are thrilled to announce the upcoming integration with Purview Insider Risk Management which will help you detect, investigate, and act on malicious and inadvertent data oversharing activities in your organization. Learn more about all of these upcoming integrations in the latest Microsoft Purview blog.

Governing data in Microsoft Fabric

With the massive growth in the volume of data, organizations are increasingly moving towards federated governance models where data is governed and managed according to the line of business needs. That is why, when we launched Fabric, we included the ability to create domains which allow tenant admins to delegate control to the domain level, enabling each business department to define its own rules and restrictions according to its specific business need. We have listened to your requests and have added, in preview, the ability for organizations to create subdomains to further refine the way your Fabric data estate is structured. Moreover, we are making it easier to create and manage domains with the ability to set default domains for security groups, the ability to use public admin APIs, and more. Learn more here.

This image shows the new capability in Microsoft Fabric, subdomains, which allows you to further refine the way your Fabric data estate is structured according to business needs.

To complement and extend the built-in data governance capability within Microsoft Fabric, we also natively integrate with the Microsoft Purview Data Governance solution. Today, Microsoft is announcing a reimagined data governance experience that offers sophisticated yet simple, business-friendly interaction for your multi-cloud, multi-source data estate governance practice. Informed by Microsoft’s own internal journey, this reimagined experience is purpose-built for federated data governance, offering efficient data curation, data quality, and data management backed by actionable insights that help you activate and nurture your governance practice. Microsoft Fabric’s built-in governance capabilities, such as item inventory, data lineage, and metadata, are reflected in Purview to accelerate your multi-cloud data estate governance practice. Learn more about the new Microsoft Purview experience by reading the latest blog.  

Fabric is lake-centric and open

Our second promise was to design Fabric to be lake-centric and open to help you establish a trusted data foundation for your entire data estate. With OneLake, you can connect data from anywhere into a single, multi-cloud data lake for the entire organization, and work from the same copy of data across analytics engines. Two key features in OneLake, Shortcuts and Mirroring, simplify how you bring data into OneLake.  

Shortcuts enable your data teams to virtualize data in OneLake without moving or duplicating it. We are thrilled to release the preview of shortcuts to the Google Cloud Platform. We are also announcing the ability to create shortcuts to cloud-based S3-compatible data sources, in preview, and on-premises S3-compatible data sources, coming soon. These sources include Cloudflare, Qumulo, MinIO, Dell ECS, and many more.

Last November, we shared Mirroring, a new, zero-ETL way of accessing and ingesting data seamlessly in near real-time from any database or data warehouse into the Data Warehousing experience in Fabric. We are thrilled to announce that Mirroring is now in preview, enabling Azure Cosmos DB, Azure SQL Database, and other database customers to mirror their data in OneLake and unlock all the capabilities of Fabric Data Warehouse, Direct Lake mode, notebooks, and much more. We are also offering a free terabyte of Mirroring storage for replicas for every capacity unit (CU) you have purchased and provisioned. For example, if you purchase F64, you will get sixty-four free terabytes of storage for your mirrored replicas. Learn more about these announcements by reading this blog. 

Finally, we are introducing an external data-sharing experience for Microsoft Fabric data and artifacts, helping make collaboration easier and more fruitful across organizations. Fabric external data sharing, coming soon, enables you to share data and assets with external organizations such as business partners, customers, and vendors in an easy, quick, and secure manner. Because this experience is built on top of OneLake’s shortcut capabilities, you can share data in place from OneLake storage locations without copying the data. External users can access it in their Fabric tenant, combine it with their data, and work with it across any Fabric experience and engine.

This image shows the new capability in Microsoft Fabric, external data sharing, which enables you to share data and assets with external organizations.

Fabric can empower every business user

The third promise we made was to empower every business user with approachable tools in Fabric to help turn data and insights into better decisions and more innovation. Power BI has been on the leading edge of helping every user access, explore, and take advantage of data with an intuitive interface and deep integration into the apps people use every day.

As part of our commitment to empowering every user, we are adding enhancements to the core Power BI visuals including more layout options for the matrix visual, additional formatting options for all cartesian charts, and new visual types like the button slicer and the new 100% stacked line area chart.

We are also introducing a metrics layer in Fabric, coming soon, which allows organizations to create standardized business metrics that are rooted in measures, discoverable, and intended for reuse. Trusted creators can select Power BI measures to promote to metrics and even include descriptions, dimensions, and other metadata to help users better understand how the metrics should be applied and interpreted. When browsing the metrics, users can preview and explore the simplified semantic model in a simple UI before using it in their solution. These metrics can be used not only in reports, scorecards, and Power BI solutions but also in other artifacts across Fabric, such as data science notebooks.

We are also making it easier to connect to your data no matter where you are working. Later in the year, we will release the ability to live edit Direct Lake semantic models in the Fabric service right from Power BI Desktop, so you can work with data directly from OneLake. We are also enabling you to connect to over a hundred data sources and create paginated reports right from Power BI Report Builder, now in preview. Also in preview is the ability to create Power BI reports in the Fabric web service by simply connecting to your Excel and CSV files, with relationship detection enabled. And to save you time when building reports, we have created new visuals for calculations and a new way to create and edit a custom date table without writing any Data Analysis Expressions (DAX) formulas, both in preview. Finally, you can generate mobile-optimized layouts for any report page, in preview, to help everyone view insights even on the go.

Fabric is AI-powered

Our fourth and final promise was to infuse generative AI capabilities into every layer of Fabric to help data teams accelerate their projects and focus on higher-value activities. With Copilot in Fabric, we are realizing that promise. The experiences currently in preview are already helping professionals go from raw data to insights in minutes. 

I am excited to share two important updates coming to Copilot in Fabric. In November, we announced a new feature called Explore that can help users learn more about their semantic model without building a report. We also announced another new feature called the DAX query view that helps you analyze and build your semantic model by running DAX queries. I’m excited to share we are making both of these capabilities even more powerful with Copilot. In Explore, we’ve added a new “Data overview” button which provides a summary, powered by Copilot, of the semantic model to help users get started. This feature will be released in preview in early April and will roll out to regions gradually. We are also adding the ability for Copilot to help you write and explain DAX queries in the DAX query view—now in preview.

Finally, we wanted to share a sneak peek of a new generative AI feature in Fabric that will enable custom Q&A experiences for your data. You can simply select the data source in Fabric you want to explore and immediately start asking questions about your data—even without any configuration. When answering questions, the generative AI experience will show the query it generated to find the answer and you can enhance the Q&A experience by adding more tables, setting additional context, and configuring settings. Data professionals can use this experience to learn more about their data or it could even be embedded into apps for business users to query.

Join us at Microsoft Build 

These announcements represent just the start of the innovation we are bringing to the Microsoft Fabric platform.

Join us at Microsoft Build from May 21st-23rd, 2024 either in person in Seattle, Washington, or online. You will hear and see our biggest announcements across the Microsoft Intelligent Data Platform and the rest of Microsoft.

Explore additional resources for Microsoft Fabric 

If you want to learn more about Microsoft Fabric, consider: 

The post Announcements from the Microsoft Fabric Community Conference appeared first on Microsoft AI Blogs.

Microsoft and NVIDIA partnership continues to deliver on the promise of AI https://azure.microsoft.com/en-us/blog/microsoft-and-nvidia-partnership-continues-to-deliver-on-the-promise-of-ai/ Mon, 18 Mar 2024 22:00:00 +0000 At NVIDIA GTC, Microsoft and NVIDIA are announcing new offerings across a breadth of solution areas from leading AI infrastructure to new platform integrations, and industry breakthroughs. The news expands our long-standing collaboration, which paved the way for revolutionary AI innovations that customers are now bringing to fruition.

The post Microsoft and NVIDIA partnership continues to deliver on the promise of AI appeared first on Microsoft AI Blogs.

At NVIDIA GTC, Microsoft and NVIDIA are announcing new offerings across a breadth of solution areas from leading AI infrastructure to new platform integrations, and industry breakthroughs. Today’s news expands our long-standing collaboration, which has paved the way for revolutionary AI innovations that customers are now bringing to fruition.

Microsoft and NVIDIA collaborate on Grace Blackwell 200 Superchip for next-generation AI models

Microsoft and NVIDIA are bringing the power of the NVIDIA Grace Blackwell 200 (GB200) Superchip to Microsoft Azure. The GB200 is a new processor designed specifically for large-scale generative AI workloads, data processing, and high-performance workloads, featuring up to a massive 16 TB/s of memory bandwidth and up to an estimated 30 times the inference performance on trillion-parameter models relative to the previous Hopper generation of servers.

Microsoft has worked closely with NVIDIA to ensure their GPUs, including the GB200, can handle the latest large language models (LLMs) trained on Azure AI infrastructure. These models require enormous amounts of data and compute to train and run, and the GB200 will enable Microsoft to help customers scale these resources to new levels of performance and accuracy.

Microsoft will also deploy an end-to-end AI compute fabric with the recently announced NVIDIA Quantum-X800 InfiniBand networking platform. By taking advantage of its in-network computing capabilities with SHARPv4, and its added support for FP8 for leading-edge AI techniques, NVIDIA Quantum-X800 extends the GB200’s parallel computing tasks into massive GPU scale.

Azure will be one of the first cloud platforms to deliver on GB200-based instances

Microsoft has committed to bringing GB200-based instances to Azure to support customers and Microsoft’s AI services. The new Azure instances, based on the latest GB200 and NVIDIA Quantum-X800 InfiniBand networking, will help accelerate the generation of frontier and foundational models for natural language processing, computer vision, speech recognition, and more. Azure customers will be able to use the GB200 Superchip to create and deploy state-of-the-art AI solutions that can handle massive amounts of data and complexity, while accelerating time to market.

Azure also offers a range of services to help customers optimize their AI workloads, such as Microsoft Azure CycleCloud, Azure Machine Learning, Microsoft Azure AI Studio, Microsoft Azure Synapse Analytics, and Microsoft Azure Arc. These services provide customers with an end-to-end AI platform that can handle data ingestion, processing, training, inference, and deployment across hybrid and multi-cloud environments.

Microsoft Azure AI solution stack

Delivering on the promise of AI to customers worldwide

With a powerful foundation of Azure AI infrastructure that uses the latest NVIDIA GPUs, Microsoft is infusing AI across every layer of the technology stack, helping customers drive new benefits and productivity gains. Now, with more than 53,000 Azure AI customers, Microsoft provides access to the best selection of foundation and open-source models, including both LLMs and small language models (SLMs), all integrated deeply with infrastructure data and tools on Azure.

The recently announced partnership with Mistral AI is also a great example of how Microsoft is enabling leading AI innovators with access to Azure’s cutting-edge AI infrastructure to accelerate the development and deployment of next-generation LLMs. Azure’s growing AI model catalog offers more than 1,600 models, letting customers choose from the latest LLMs and SLMs, including models from OpenAI, Mistral AI, Meta, Hugging Face, Deci AI, NVIDIA, and Microsoft Research. Azure customers can choose the best model for their use case.

“We are thrilled to embark on this partnership with Microsoft. With Azure’s cutting-edge AI infrastructure, we are reaching a new milestone in our expansion, propelling our innovative research and practical applications to new customers everywhere. Together, we are committed to driving impactful progress in the AI industry and delivering unparalleled value to our customers and partners globally.”

Arthur Mensch, Chief Executive Officer, Mistral AI

General availability of Azure NC H100 v5 VM series, optimized for generative inferencing and high-performance computing

Microsoft also announced the general availability of the Azure NC H100 v5 VM series, designed for mid-range training, inferencing, and high-performance computing (HPC) simulations; it offers high performance and efficiency.

As generative AI applications expand at incredible speed, the fundamental language models that power them will also expand to include both SLMs and LLMs. In addition, artificial narrow intelligence (ANI) models will continue to evolve, focused on precise predictions rather than the creation of novel data, continuing to enhance their use cases. Their applications include tasks such as image classification, object detection, and broader natural language processing.

Using the robust capabilities and scalability of Azure, we offer computational tools that empower organizations of all sizes, regardless of their resources. The Azure NC H100 v5 VM series, made generally available today, is yet another computational tool that will do just that.

The Azure NC H100 v5 VM series is based on the NVIDIA H100 NVL platform, which offers two classes of VMs, ranging from one to two NVIDIA H100 94GB PCIe Tensor Core GPUs connected by NVLink with 600 GB/s of bandwidth. This VM series supports PCIe Gen5, which provides the highest communication speeds (128 GB/s bidirectional) between the host processor and the GPU. This reduces the latency and overhead of data transfer and enables faster and more scalable AI and HPC applications.
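To put those link speeds in perspective, a lower bound on transfer time is simply payload size divided by peak bandwidth. Real transfers add protocol and software overhead, so treat this as a rough sketch; the 94 GB payload below is just the capacity of one H100 NVL GPU's memory, and the helper function is illustrative:

```python
def min_transfer_seconds(payload_gb: float, bandwidth_gb_per_s: float) -> float:
    """Lower bound on transfer time: payload divided by peak link bandwidth."""
    return payload_gb / bandwidth_gb_per_s

# Peak figures quoted above for the NC H100 v5 series.
PCIE_GEN5_BIDIRECTIONAL = 128.0  # GB/s, host <-> GPU
NVLINK = 600.0                   # GB/s, GPU <-> GPU

payload = 94.0  # GB, e.g. filling one H100 NVL GPU's memory
print(f"PCIe Gen5: {min_transfer_seconds(payload, PCIE_GEN5_BIDIRECTIONAL):.2f} s minimum")
print(f"NVLink:    {min_transfer_seconds(payload, NVLINK):.2f} s minimum")
```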

The VM series also supports NVIDIA multi-instance GPU (MIG) technology, enabling customers to partition each GPU into up to seven instances, providing flexibility and scalability for diverse AI workloads. This VM series offers up to 80 Gbps network bandwidth and up to 8 TB of local NVMe storage on full node VM sizes.

These VMs are ideal for training models, running inferencing tasks, and developing cutting-edge applications. Learn more about the Azure NC H100 v5-series.

“Snorkel AI is proud to partner with Microsoft to help organizations rapidly and cost-effectively harness the power of data and AI. Azure AI infrastructure delivers the performance our most demanding ML workloads require plus simplified deployment and streamlined management features our researchers love. With the new Azure NC H100 v5 VM series powered by NVIDIA H100 NVL GPUs, we are excited to continue to accelerate iterative data development for enterprises and OSS users alike.”

Paroma Varma, Co-Founder and Head of Research, Snorkel AI

Microsoft and NVIDIA deliver breakthroughs for healthcare and life sciences

Microsoft is expanding its collaboration with NVIDIA to help transform the healthcare and life sciences industry through the integration of cloud, AI, and supercomputing.

By using the global scale, security, and advanced computing capabilities of Azure and Azure AI, along with NVIDIA’s DGX Cloud and NVIDIA Clara suite, healthcare providers, pharmaceutical and biotechnology companies, and medical device developers can now rapidly accelerate innovation across the entire clinical research to care delivery value chain for the benefit of patients worldwide. Learn more.

New Omniverse APIs enable customers across industries to embed massive graphics and visualization capabilities

Today, NVIDIA’s Omniverse platform for developing 3D applications becomes available as a set of APIs running on Microsoft Azure, enabling customers to embed advanced graphics and visualization capabilities into existing software applications from Microsoft and partner ISVs.

Built on OpenUSD, a universal data interchange format, NVIDIA Omniverse Cloud APIs on Azure do the integration work for customers, giving them seamless physically based rendering capabilities on the front end. Demonstrating the value of these APIs, Microsoft and NVIDIA have been working with Rockwell Automation and Hexagon to show how the physical and digital worlds can be combined for increased productivity and efficiency. Learn more.

Microsoft and NVIDIA envision deeper integration of NVIDIA DGX Cloud with Microsoft Fabric

The two companies are also collaborating to bring NVIDIA DGX Cloud compute and Microsoft Fabric together to power customers’ most demanding data workloads. This means that NVIDIA’s workload-specific optimized runtimes, LLMs, and machine learning will work seamlessly with Fabric.

The NVIDIA DGX Cloud and Fabric integration includes extending the capabilities of Fabric by bringing in NVIDIA DGX Cloud’s large language model customization to address data-intensive use cases like digital twins and weather forecasting, with Fabric OneLake as the underlying data storage. The integration will also provide DGX Cloud as an option for customers to accelerate their Fabric data science and data engineering workloads.

Accelerating innovation in the era of AI

For years, Microsoft and NVIDIA have collaborated from hardware to systems to VMs, to build new and innovative AI-enabled solutions to address complex challenges in the cloud. Microsoft will continue to expand and enhance its global infrastructure with the most cutting-edge technology in every layer of the stack, delivering improved performance and scalability for cloud and AI workloads and empowering customers to achieve more across industries and domains.

Join Microsoft at the NVIDIA GTC AI Conference, March 18 through 21, at booth #1108 and attend a session to learn more about solutions on Azure and NVIDIA.

Learn more about Microsoft AI solutions

Introducing healthcare data solutions in Microsoft Fabric: A game-changer for healthcare data analysis http://approjects.co.za/?big=en-us/microsoft-fabric/blog/2024/03/11/introducing-healthcare-data-solutions-in-microsoft-fabric-a-game-changer-for-healthcare-data-analysis/ Mon, 11 Mar 2024 15:00:00 +0000 Learn how you can leverage the power of healthcare data solutions in Microsoft Fabric to break down data silos.

The post Introducing healthcare data solutions in Microsoft Fabric: A game-changer for healthcare data analysis appeared first on Microsoft AI Blogs.


Industry-specific capabilities within Microsoft Fabric play a pivotal role in addressing the diverse business challenges faced by organizations across various sectors. By tailoring data solutions to specific industries, we empower our customers to achieve optimal outcomes. Customers in healthcare, retail, and sustainability, among others, are interested in leveraging Microsoft Fabric with industry-specific capabilities so they can collaborate and build next-generation applications finely tuned to high-value vertical use cases. Microsoft aims to foster a robust partner ecosystem centered on industry standards through strategic investments in vertical capabilities.

Each industry contends with different data and regulatory frameworks. At HIMSS 2024, we are outlining some of these vertical investments. Healthcare data is complex, sensitive, and highly regulated. It requires a robust and secure platform that can handle the diverse and evolving needs of the healthcare industry. That’s why today we are excited to announce the public preview of healthcare data solutions in Microsoft Fabric, which will help you analyze healthcare data with ease and confidence.

Bring your data into the era of AI

Reshape how everyone accesses, manages, and acts on data with a single, AI-powered platform.

Support multi-modal data management with healthcare data solutions

Healthcare data solutions in Microsoft Fabric is an end-to-end analytics SaaS platform that enables you to ingest, store, and analyze healthcare data from various sources, such as electronic health records, picture archiving and communication systems, and more. The healthcare data solutions provide data models and transformation activities that help customers create a multi-modal warehouse. It enables customers to align with industry standards, such as FHIR (Fast Healthcare Interoperability Resources) and DICOM, and supports compliance with regulations such as HIPAA and GDPR. With healthcare data solutions in Microsoft Fabric, you can access and query your healthcare data using familiar experiences, such as Azure Synapse Analytics, Azure Data Factory, and Microsoft Power BI.

Public preview for ingesting clinical data available now

In October 2023, we announced the private preview for healthcare data solutions in Microsoft Fabric. Since then, customers have been testing it and giving us feedback. For example, the University of Wisconsin Madison School of Medicine and Public Health, which holds $524 million in annual total extramural research support and is home to more than 2,000 full-time faculty members, is exploring healthcare data solutions in Microsoft Fabric to power its Colorectal Cancer Multi-Modal Data Commons. Leveraging the capabilities of healthcare data solutions in Microsoft Fabric accelerates UW Madison’s path to unifying data across a variety of different sources, creating an ecosystem for secure, ethical, and reproducible data management and analytics that will further drive innovative clinical and translational research to promote health in Wisconsin and beyond.

Today we are excited to announce that the healthcare data solutions capability focused on ingesting clinical data is now in public preview, available in the Industry Solutions workload in Microsoft Fabric. Previews designated as production-ready employ privacy and security measures typically present in generally available online services, so it can be used to process data that is subject to HIPAA regulatory compliance requirements. The public preview enables customers to create a medallion architecture aligned with the FHIR standard by providing data models and transformation pipelines. These are the current capabilities: 

  • Healthcare data foundations—Set up your healthcare data estate to run solution capabilities and configure it to structure data for analytics and AI/ML modeling. 
  • FHIR data ingestion—Bring your FHIR data to OneLake from a FHIR service such as Azure Health Data Services. 
  • Unstructured clinical notes enrichment—Use Azure AI’s Text Analytics for health service to add structure to unstructured clinical notes for analytics. 
  • OMOP Analytics—Prepare data for standardized analytics through OMOP (Observational Medical Outcomes Partnership) open community standards. 
  • Data preparation for Dynamics 365 Customer Insights—Connect Dynamics 365 Customer Insights to your OneLake on Fabric for creating patient or member lists for your outreach. 

Imaging support for healthcare data solutions in Microsoft Fabric now available in private preview

Delivering on our vision for building a multi-model healthcare data estate and energized by the feedback from customers on our healthcare data solutions preview, we are extending our capabilities to support the complex and sensitive nature of imaging data.

With imaging support for healthcare data solutions in Microsoft Fabric, you will be able to ingest, store, and analyze imaging metadata across various modalities, such as X-rays, CT scans, and MRIs. The new imaging capabilities will enable collaboration, R&D, and AI innovation for a wide range of healthcare and life science use cases. Our customers and partners will be able to take DICOM images and bring them together with the clinical data already stored in FHIR. Making imaging pixel data and metadata available alongside the clinical history and laboratory data enables clinicians and researchers to interpret imaging findings in the appropriate clinical context, leading to higher diagnostic accuracy, more informative clinical decision making, and improved patient outcomes.

  • Unify your medical imaging and clinical data estate for analytics—Establish a regulated hub to centralize and organize all your multi-modal healthcare data, creating a foundation for predictive and clinical analytics. Built natively on well-established industry data models, including DICOM, FHIR, and OMOP. 
  • Build fit-for-purpose analytics models—Start constructing ML and AI models on a connected foundation of EHR (Electronic Health Record) and pixel data. Enable researchers, data scientists, and health informaticians to perform analysis on large volumes of multi-modal datasets to achieve higher accuracy in diagnosis and prognosis and improved patient outcomes. 
  • Advance research, collaboration, and sharing of de-identified imaging—Build longitudinal views of patients’ clinical history and related imaging studies, with the ability to apply complex queries to identify patient cohorts for research and collaboration. Apply text and imaging de-identification to enable in-place sharing of research datasets with role-based access control. 
  • Reduce the cost of archival storage and recovery—Take advantage of cost-effective and reliable cloud-based storage to help back up your medical imaging data from the redundant storage of on-premises PACS (picture archiving and communication system) and VNA (vendor-neutral archive) systems, and enhance your HIPAA compliance efforts. Improve your security posture with 100% off-site cloud archival of your imaging datasets in case of unplanned data loss. 
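The longitudinal-view idea above can be illustrated with a minimal sketch that groups imaging study metadata (DICOM-style fields) with clinical records by patient identifier. The records, field names, and helper function are hypothetical, not Fabric's actual schema:

```python
# Synthetic clinical records and imaging study metadata, keyed by patient ID.
clinical = [
    {"patient_id": "p1", "condition": "colorectal cancer", "diagnosed": "2022-06-01"},
    {"patient_id": "p2", "condition": "fracture", "diagnosed": "2023-01-15"},
]
imaging_studies = [
    {"patient_id": "p1", "modality": "CT", "study_date": "2022-06-03"},
    {"patient_id": "p1", "modality": "MR", "study_date": "2022-09-10"},
    {"patient_id": "p2", "modality": "CR", "study_date": "2023-01-15"},
]

def longitudinal_view(clinical: list[dict], studies: list[dict]) -> dict[str, dict]:
    """Group each patient's clinical history together with their imaging studies."""
    view: dict[str, dict] = {}
    for rec in clinical:
        view.setdefault(rec["patient_id"], {"history": [], "imaging": []})["history"].append(rec)
    for study in studies:
        view.setdefault(study["patient_id"], {"history": [], "imaging": []})["imaging"].append(study)
    return view

view = longitudinal_view(clinical, imaging_studies)
print(len(view["p1"]["imaging"]))  # p1 has two studies: CT and MR
```

A real cohort query would then filter this view, for example selecting patients whose imaging followed a diagnosis within a given window.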
Screenshot of the DICOM data ingestion deployment page, describing the feature and the two notebook artifacts to be deployed to the Fabric workspace

Try healthcare data solutions today

Try out the public preview for our clinical data capabilities by going to the industry solutions switcher in Fabric, selecting the healthcare data solutions tile, and exploring all the capabilities you need from there.

Screenshot of the healthcare data solutions workload in Fabric, showing sample data and six capabilities that can be deployed in the public preview

This preview for imaging is available to a select group of customers who are interested in exploring the potential of imaging data in Microsoft Fabric. If you are interested in participating, please contact your account executive for more information. 

We look forward to hearing your feedback and seeing the amazing solutions you will create with these offerings. To learn more, visit our documentation.
