Yitzhak Kesselman | Author at Microsoft Fabric Blog
http://approjects.co.za/?big=en-us/microsoft-fabric/blog

Microsoft Fabric Real-Time Intelligence: A Leader in the 2025 Forrester Streaming Data Wave
https://blog.fabric.microsoft.com/en-US/blog/microsoft-fabric-real-time-intelligence-a-leader-in-the-2025-forrester-streaming-data-wave/
Wed, 10 Dec 2025 23:00:00 +0000

The post Microsoft Fabric Real-Time Intelligence: A Leader in the 2025 Forrester Streaming Data Wave appeared first on Microsoft Fabric Blog.

Businesses and organizations are entering a new operational era defined by immediacy, intelligence, and continuous adaptation. AI is shifting expectations across every industry. Organizations now need to sense what is happening across their business the moment it occurs, understand its significance, and respond with confidence. Real-time data has become the foundation for how resilient, competitive organizations run. Enterprises also realize that fragmented data stacks cannot support modern AI or operational agility.

Microsoft anticipated this shift early on. We invested heavily in real-time services in Azure, including Event Hubs, Stream Analytics, and Data Explorer, powering mission-critical real-time workloads for years for both Microsoft and our customers and delivering proven reliability, performance, and planet scale.

But the decisive step was building Real-Time Intelligence into Microsoft Fabric on top of that mature foundation in Azure. Real-Time Intelligence unifies streaming, analytics, and action in one governed platform, bringing batch and streaming together in OneLake and Fabric.

Microsoft has been recognized as a Leader in The Forrester Wave™: Streaming Data Platforms, Q4 2025, which we view as a strong validation of our strategy and execution. This position as a leader is the result of our long-term conviction rather than short-term reaction. Microsoft invested early so organizations would have a mature, scalable real-time foundation exactly when the need became urgent, and that foresight is now paying off for customers.

Forrester Wave Streaming Data Platforms, Q4 2025

A Leader for the Real-Time Enterprise

Forrester’s 2025 evaluation confirms a clear market shift: enterprises are moving away from fragmented real-time architectures and toward unified platforms that can support AI-driven decisions at digital speed. As Forrester notes, “AI agents rely on seamless data flow across ingestion, transformation, and real-time insights to avoid bottlenecks and cascading errors. A robust platform unifies these workloads (messaging, processing, analytics), eliminating silos and latency that degrade decision quality.” In this environment, a fully integrated streaming platform has become essential rather than optional.

This is exactly where Real-Time Intelligence in Microsoft Fabric stands out. Forrester notes that Microsoft’s strategy is to bring dozens of services together under a single umbrella, making real-time development and event-driven analytics “second nature” within Fabric. Real-time data becomes a first-class citizen in Fabric’s unified data estate. Streaming signals land directly into OneLake, the same foundation that powers the lakehouse, warehouse, semantic models, governance, Power BI, and AI agents.

Forrester’s assessment is clear: “Microsoft excels at messaging, analytics, governance, developer experience, business user experience, and more, enabling robust performance for real-time analytics and event-driven applications. It provides seamless integration within the Fabric ecosystem to support enterprise use cases like predictive analytics and operational dashboards. It offers strong tooling for both technical and business users as well as deep integration with Azure services, empowering enterprises to build real-time solutions.”

This recognition reflects a platform designed from the start to work as one coherent system, not a set of loosely assembled services. It is an end-to-end real-time platform that strengthens the entire data estate and positions organizations to run AI-driven operations with clarity, speed, and confidence.

Why Enterprises Are Standardizing on Fabric Real-Time Intelligence

Real-Time Intelligence delivers a complete end-to-end platform for understanding and acting on what is happening across the enterprise in the moment. It unifies signals across time, space, and relationships to provide a connected operational picture rather than isolated dashboards or fragmented telemetry. Every stage of the real-time lifecycle (streaming, analyzing, modeling, visualizing, and acting) is integrated into one governed, AI-ready system. This coherence is what enables teams and AI agents to work from one live, trusted view of the business and make high-quality decisions at digital speed.

https://www.youtube.com/watch?v=bfXn21tqsAQ

The Real-Time Hub is the unified experience that makes every enterprise signal visible, governed, and ready for use. It brings all the capabilities of Real-Time Intelligence together into a coherent operational fabric, the live nervous system of the modern enterprise.

Across five major areas, these capabilities form a complete real-time platform:

  • Stream: Eventstream ingests, shapes, filters, and enriches data in motion, while Connectors pull data from dozens of streaming sources, including Kafka, MQTT, IoT systems, SaaS apps, and CDC feeds. Event Schema Set standardizes events to keep signals consistent, interoperable, and easy to govern.
  • Analyze: Real-time and historical insights converge in Eventhouse, a high-performance engine for interactive analytics over petabyte-scale data. Anomaly Detector highlights deviations and emerging risks the moment they appear.
  • Model: Fabric’s modeling layer enables unified operational awareness. Graph links signals to entities and relationships, Fabric Map situates them in physical space, and Digital Twin Builder models assets and environments over time.
  • Visualize: KQL Querysets enable fast, interactive analytics for exploring data and diagnosing problems, Graph Querysets for relational and causal patterns across the business, and Real-Time Dashboards for intuitive, no-code views of live conditions and trends.
  • Act: Activator detects patterns over time on a per-instance basis and triggers alerts or workflows through visual no-code business rules. Operations Agent monitors your operations, reasons about your business, and takes automated actions based on natural language instructions.
Fabric Real-Time Intelligence Components across Stream | Analyze | Model | Visualize | Act
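To make the lifecycle concrete, the Stream, Analyze, and Act stages above can be sketched as composable steps over a flow of events. This is plain Python as an illustration, not Fabric code; the device events, temperature field, and 90-degree threshold are all hypothetical.

```python
def stream(raw_events):
    """Stream: shape and filter events in motion (Eventstream-style)."""
    for e in raw_events:
        if "device" in e and "temp" in e:          # drop malformed events
            yield {"device": e["device"], "temp": float(e["temp"])}

def analyze(events, threshold=90.0):
    """Analyze: flag anomalous readings (Anomaly Detector-style)."""
    for e in events:
        e["anomaly"] = e["temp"] > threshold
        yield e

def act(events, alerts):
    """Act: trigger an alert for each anomalous event (Activator-style)."""
    for e in events:
        if e["anomaly"]:
            alerts.append(f"ALERT {e['device']}: {e['temp']}")
        yield e

raw = [{"device": "a", "temp": "70"}, {"bad": 1}, {"device": "b", "temp": "95"}]
alerts = []
processed = list(act(analyze(stream(raw)), alerts))
# one malformed event dropped; one alert raised for device "b"
```

Because each stage is a generator, events flow through one at a time rather than accumulating into a batch, which is the essential difference between this model and scheduled pipelines.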

How Fabric IQ Completes the Intelligence Layer

We recently introduced Fabric IQ (see blog post), the intelligence layer that transforms unified data into unified understanding for every team and every AI agent. IQ brings a semantic, reasoning-ready foundation to the entire Fabric platform, including Real-Time Intelligence, so organizations can interpret what is happening across their business, not just observe it. It augments human and AI workflows with natural language understanding, contextual reasoning, and a unified live view drawn from both streaming and historical data.

IQ enables natural-language exploration across all your data in Fabric, allowing users to ask questions, investigate anomalies, and understand relationships without writing code. It synthesizes patterns across time, space, and relationships, surfaces anomalies, explains correlations, and identifies root causes. This shift is what elevates Fabric from a data platform to an enterprise intelligence platform, one where insights are generated, connected, and immediately actionable.

https://www.youtube.com/watch?v=RjU0slwcZGs

IQ amplifies the power of Real-Time Intelligence. Streaming events stop being isolated signals and become part of a live coherent semantic picture: operators can ask why something happened, analysts can explore emerging risks and opportunities, and AI agents can reason over business meaning before taking action. Because IQ relies on the same governance, semantics, and security as the rest of Fabric, the insights it produces are consistent, trustworthy, and grounded in the organization’s shared data estate.

Together, Real-Time Intelligence and IQ create a unified real-time decision system. Real-Time Intelligence senses what is happening in the moment; IQ interprets it and drives the next action. This seamless integration enables enterprises and their AI agents to operate with precision even as conditions change second by second.

Fabric components across IQ – Act | Decide | Observe | Analyze

Strategic Takeaways for Enterprise Leaders

Real-time intelligence is becoming core to modern enterprise operations, no longer a specialist capability or an add-on component. AI requires up-to-date, contextualized data. Decision systems must unify batch and streaming data. And governance and semantics must extend across the entire data estate, from historical tables to real-time streams, to ensure trust, lineage, compliance, and safe AI behavior.

Microsoft Fabric, with Real-Time Intelligence and IQ, offers a decisive foundation for this new era. It brings together data, meaning, and action in one governed platform. It unifies time, space, and relationships to give enterprises a complete operational picture. And it supports AI-driven decisions at digital speed while strengthening governance, reliability, and trust.

As organizations modernize their data platforms and adopt AI at scale, Fabric provides the clarity, coherence, and confidence they need to run their business in real time and to thrive in the decade ahead.

Statement from Forrester

Forrester does not endorse any company, product, brand, or service included in its research publications and does not advise any person to select the products or services of any company or brand based on the ratings included in such publications. Information is based on the best available resources. Opinions reflect judgment at the time and are subject to change. For more information, read about Forrester’s objectivity here.

Learn More

Conferences

  • Join us at FabCon in Atlanta, Georgia, on March 16-20, 2026. Register and use code MSCATL for a $200 discount on top of the current Early Access pricing!


Your business doesn’t wait, why should your analytics? http://approjects.co.za/?big=en-us/microsoft-fabric/blog/2025/08/20/your-business-doesnt-wait-why-should-your-analytics/ Wed, 20 Aug 2025 15:00:00 +0000

The post Your business doesn’t wait, why should your analytics? appeared first on Microsoft Fabric Blog.

To stay competitive in a fast-moving world, organizations are replacing traditional batch processing with real-time streaming. Why? Because in today’s digital economy, speed is strategy. The faster you turn data into insight, the faster you can act and win.

A 2025 report from Gartner® highlighted the accelerating shift toward real-time data: “According to the Transforming Data With Intelligence (TDWI) survey, 38% of organizations plan to enable access to real-time data for driving operational use cases. In the 2024 Data Streaming Report, Confluent found that ‘51% of IT leaders cite data streaming as a top strategic priority for IT investments in 2024, compared to 44% in 2023.’ Additionally, 68% expect the use of data streaming technology to continue growing over the next two years.”1, 2

Think fraud detection: every second counts. Or live customer monitoring, where spotting a behavior shift in real time can mean the difference between a lost sale and a loyal customer. In logistics, delays in anticipating and even preventing disruptions can cascade into missed deadlines, higher costs, and lost customers. These aren’t futuristic scenarios. They’re happening now, powered by real-time decision-making.

At the core of this shift is AI. By making decisions on the most up-to-date data, AI can detect anomalies, uncover patterns, trigger alerts, and continuously optimize workflows as events unfold. No more waiting for yesterday’s reports to make tomorrow’s decisions.

Batch systems inherently operate on a delay, which is acceptable for historical reporting but insufficient for live operations. Streaming flips that model, delivering a continuous flow of insights when they matter most: now.
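The contrast can be shown in a few lines of Python. This is a hedged sketch, not a benchmark: batch recomputes an aggregate over the full history at interval end, while streaming maintains the same aggregate incrementally, so an up-to-date answer exists after every event. The values and the running-average metric are illustrative.

```python
def batch_average(history):
    """Batch: recompute over all accumulated records when the interval ends."""
    return sum(history) / len(history)

class StreamingAverage:
    """Streaming: update the running aggregate the moment an event arrives."""
    def __init__(self):
        self.count = 0
        self.total = 0.0

    def update(self, value):
        self.count += 1
        self.total += value
        return self.total / self.count   # insight available immediately

events = [10.0, 20.0, 30.0]
s = StreamingAverage()
live_views = [s.update(v) for v in events]      # a fresh answer per event
assert live_views[-1] == batch_average(events)  # same final answer, no waiting
```

The final numbers agree; what changes is when the answer becomes available, which is the entire point of the streaming model.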

Henry Ford once said, “If I’d asked people what they wanted, they’d have said faster horses.” We don’t need faster batch jobs. We need a new paradigm. So instead of trying to speed up a process that isn’t designed for real-time, start where the action begins: at the moment data is born.

The pace of business has changed. Has your analytics kept up?

How is batch processing used today?

Enterprises today often use a centralized analytics team to handle batch processing. This involves extracting data from various sources, processing it through multiple stages in a “medallion architecture” (bronze, silver, gold), loading that data into a compute engine, and finally delivering it to end-users via tools like Power BI. This process is often slow and cumbersome, leading to delays in decision-making. Batch processing works when you have a very predictable set of data that always needs the same processing structure, but it suffers from:

  • Delayed insights from reporting on stale data.
  • A need for centralized control and governance.
  • Pipeline failures affecting the whole batch.
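A minimal sketch of the medallion flow described above helps make the drawbacks concrete: bronze holds raw records, silver cleans them, gold aggregates for reporting. The field names and sample data are hypothetical; in a real pipeline these stages run as scheduled jobs, so one unexpected record can fail an entire batch run.

```python
def to_silver(bronze):
    """Silver: drop records missing required fields, normalize types.
    Note: one non-numeric amount raises here and fails the whole batch."""
    return [
        {"region": r["region"], "amount": float(r["amount"])}
        for r in bronze
        if r.get("region") and r.get("amount") is not None
    ]

def to_gold(silver):
    """Gold: total sales per region for the reporting layer."""
    gold = {}
    for r in silver:
        gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]
    return gold

bronze = [
    {"region": "east", "amount": "100"},
    {"region": None, "amount": "50"},     # malformed, dropped at silver
    {"region": "east", "amount": "25"},
]
gold = to_gold(to_silver(bronze))          # {"east": 125.0}
```

Nothing here is wrong as such; the issue is that the gold numbers only refresh when the whole chain reruns, which is exactly the latency the next section addresses.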

In today’s dynamic world, our expectations, and our customers’, have shifted toward a more agile, adaptable model. This is where real-time streaming comes into play.

What is real-time streaming?

Real-time streaming is a data processing paradigm that involves the continuous flow and analysis of data as it is generated. Unlike traditional batch processing methods that collect and process data at intervals, real-time streaming allows for instantaneous insights and responses, making it possible to act on information as events unfold.

This approach is increasingly vital in a range of applications, from fraud detection and live monitoring to dynamic decision-making in business processes. By leveraging real-time streaming solutions, organizations can enhance operational efficiency, improve responsiveness, and deliver more agile, data-driven experiences.
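The fraud-detection case mentioned above illustrates the paradigm well: the decision must be made at the moment the transaction arrives, not in the next batch run. The sketch below is plain Python with illustrative thresholds (a 60-second window, a 3-transaction limit); it flags an account the instant it exceeds the rate limit inside a sliding window.

```python
from collections import deque, defaultdict

class FraudDetector:
    """Per-event evaluation over a sliding time window (illustrative)."""
    def __init__(self, window_s=60, max_txns=3):
        self.window_s = window_s
        self.max_txns = max_txns
        self.recent = defaultdict(deque)   # account -> timestamps in window

    def on_event(self, account, ts):
        """Called per transaction; returns True if the account is flagged now."""
        q = self.recent[account]
        q.append(ts)
        while q and ts - q[0] > self.window_s:   # evict expired timestamps
            q.popleft()
        return len(q) > self.max_txns

d = FraudDetector()
flags = [d.on_event("acct-1", t) for t in (0, 10, 20, 30, 500)]
# the fourth transaction inside 60 seconds trips the flag; by t=500 the
# window has emptied and the account is clean again
```

The decision is made inside the event handler itself, which is what "act on information as events unfold" means in practice.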

How do you know you need real-time streaming?

The shift to real-time often starts with a question. When organizations begin asking for faster insights, smarter automation, and more responsive systems, it’s a clear sign they’ve outgrown batch processing. Here are common signs: “I want to…”

  • Respond to events in my business.
  • Receive instant alerts when something changes.
  • Monitor digital and physical assets continuously and in detail.
  • Generate real-time reports and dashboards.
  • Enable AI and machine learning to act in the moment.
  • Scale systems without sacrificing performance.
  • Build and deliver data products in real time.

If these needs resonate with you, your organization is ready to evolve. And your data infrastructure should, too.

What benefits does streaming data provide?

Streaming data enables faster and more responsive operations by delivering insights as events unfold. It eliminates the delays of batch processing and supports continuous awareness and action. With streaming data, businesses can:

  • Detect and respond to changes instantly.
  • Diagnose problems as they emerge and shorten the time-to-resolve.
  • Power real-time dashboards and decision-making.
  • Feed machine learning and generative AI with live, high-fidelity inputs.
  • Monitor digital and physical assets continuously.
  • Scale analytics without sacrificing performance.
  • Deliver real-time data products to internal and external consumers.

This shift turns data into a live operational asset, driving immediate action, automation, and continuous improvement.

What is Real-Time Intelligence in Microsoft Fabric?

Real-Time Intelligence in Microsoft Fabric is built for the now. It delivers a powerful, unified SaaS platform for continuously ingesting and processing streaming data, training machine learning and AI models, and transforming raw signals into actions in real time. It bridges the gap between data and action. It’s not just about seeing what’s happening; it’s about responding as it happens.

Schematic of Microsoft Fabric showing five workloads, including Real-Time Intelligence, with an underlying layer of AI, OneLake, and Governance.

Schematic of Real-Time Intelligence architecture, including a collection of icons representing different connectors for data ingestion, the items that make up Real-Time Intelligence, and two underlying layers of AI and the Real-Time hub.

Across industries, Real-Time Intelligence is already powering smarter, faster decision-making in thousands of organizations:

Schematic of different industry sectors that have adopted Real-Time Intelligence, such as Airlines, Retail, Health Care, Logistics, and Automotive (among others), with representative organizations for each sector.

Is Fabric Real-Time Intelligence cost-effective?

Microsoft Fabric Real-Time Intelligence is built for both performance and cost-efficiency. With a flexible, consumption-based pricing model, you only pay for what you use, whether it’s streaming ingestion, transformation, or event-driven processing, allowing you to start small and scale seamlessly. Unlike traditional batch systems, which often require heavy infrastructure and redundant data processing, it optimizes costs by processing only the most recent records in real time.

In many cases, it’s not just more responsive than batch. It’s more affordable. And compared to other streaming platforms, Real-Time Intelligence in Microsoft Fabric delivers powerful real-time capabilities at a lower total cost of ownership. Real-time insights don’t have to come at a premium.

The future is streaming, don’t let it pass you by

Real-time is no longer a luxury. It’s a competitive necessity. With Microsoft Fabric Real-Time Intelligence, you can harness streaming data with a streamlined, intuitive experience designed for scale, speed, and simplicity. It’s not about processing everything faster. It’s about processing the right data, as it happens, and turning it into action.

Say goodbye to outdated batch jobs, complex pipelines, and latency that holds your business back. With event-driven transformations and real-time ELT, you’re not just cleaning up data. You’re clearing the path to instant insight and smarter decisions.

The future of data is already in motion. Are you ready to move with it?

Check out the release plan.


1Gartner, Emerging Tech: Revolutionize Your Products With Real-Time Data and AI, Kevin Quinn, David Pidsley, 31 January 2025.

2Gartner is a registered trademark of Gartner, Inc. and/or its affiliates and is used herein with permission. All rights reserved. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

Microsoft is recognized as a Leader in The Forrester Wave™: Streaming Data Platforms http://approjects.co.za/?big=en-us/microsoft-fabric/blog/2024/02/28/microsoft-is-recognized-as-a-leader-in-the-forrester-wave-streaming-data-platforms/ Wed, 28 Feb 2024 17:00:00 +0000

The post Microsoft is recognized as a Leader in The Forrester Wave™: Streaming Data Platforms appeared first on Microsoft Fabric Blog.


In today’s rapidly evolving data and AI landscape, the demand for comprehensive event and streaming analytics solutions has never been more critical. Over the past few years, Microsoft has dedicated itself to investing in this area and addressing the needs of its customers by offering seamless and powerful options for stream processing. This includes Microsoft Fabric Real-Time Analytics, a robust solution built upon the scalable, fault-tolerant foundations of Azure Streaming Data Platform services. 

We’re excited to announce that our investments are paying off. Microsoft has been recognized as a Leader in The Forrester Wave™: Streaming Data Platforms, Q4 2023—a distinction based on Forrester’s evaluation of the advanced capabilities of Azure Event Hubs and Azure Stream Analytics services.   

Microsoft not only secured the highest score among all vendors in the “current offering” category, but was also given the top possible score of 5.0 across 13 distinct criteria. The Forrester report acknowledges Microsoft’s strengths in streaming analytics, low latency, fault tolerance, and developer tools. You can read the full report, The Forrester Wave™: Streaming Data Platforms, Q4 2023, to learn more about Microsoft’s position as a Leader in the evaluation.

The Forrester Wave™: Streaming Data Platforms, Q4 2023 figure

Why is stream processing important?

Stream processing has emerged as a crucial technology, revolutionizing the way data is collected and analyzed and the way insight-based decisions are made, all in real time. But the benefits of stream processing extend beyond just speed. It gives organizations a holistic, 360-degree view of real-time and batch data, empowering them to detect anomalies, identify patterns, and extract valuable insights from the continuous influx of data. This not only enhances decision-making processes but also enables businesses to deliver personalized and responsive experiences to their customers. With reduced latency and immediate feedback loops, stream processing fosters a more agile and adaptive approach to data-driven decision-making, ultimately contributing to increased efficiency, innovation, and a critical edge in today’s competitive market.

However, stream processing at a cloud scale can be challenging for businesses, both large and small. Integration complexities with existing services, the need for efficient resource scaling, ensuring reliability and fault tolerance, and addressing security concerns pose significant hurdles. Striking a balance between seamless integration, scalable infrastructure, and robust security measures is crucial for successful cloud-based stream processing implementations.   

Why do customers love Azure Streaming Data Platform?  

Azure Streaming Data Platform architecture

Microsoft Azure addresses these challenges with one powerful solution built on two products: Azure Event Hubs and Azure Stream Analytics. Azure Event Hubs provides a scalable and open event ingestion service, simplifying the integration of stream processing frameworks. Complementing this, Azure Stream Analytics enables users to build and deploy complex stream processing queries. Together, these services enable businesses to ingest, process, and analyze massive amounts of data in real time. With built-in security, reliability features, and developer tools, Microsoft Azure services ensure data protection, business continuity, and productivity, at a flexible price point that makes stream processing at scale affordable for all Azure customers.

  • Ingestion and stream analytics: Azure Event Hubs has native, on-by-default support for open standards such as Apache Kafka and AMQP, enabling customers to ingest data from a wide variety of data sources. Meanwhile, Azure Stream Analytics supports output data into multiple services including Microsoft Power BI, Azure Functions, Azure SQL, and Azure Data Lake Storage. The support for Delta Lake (Delta Parquet) enables customers to persist the output of the stream analytics process in a way that can then be consumed by popular analytics services like Fabric, Azure Synapse, and Azure Databricks. By facilitating seamless integration between best-in-class tools and frameworks, Azure Streaming Data Platform services ensure that data flows smoothly from ingestion through to analytics, optimizing the value Azure customers can derive from their data assets.
  • High scale (latency and throughput): Azure Streaming Data Platform services are the backbone for thousands of leading enterprises that have come to depend upon the proven performance at planet scale. Azure Event Hubs has consistently exceeded the OpenMessaging Benchmark standards demonstrating end-to-end latency of under 10 milliseconds. Additionally, Azure Streaming Analytics, built upon multiple patented optimizations, supports data processing at a throughput of several GBs per second. These two services are handling over 10 trillion requests and more than 13 PB data ingested per day. This combination of low latency and high throughput equips businesses with the capability to manage, process, and analyze vast volumes of data at a very high velocity. This enables businesses to gain a significant advantage by making quicker, more informed decisions, and staying ahead of their competitors in today’s fast-paced market.   
  • Fault tolerance and reliability: Built upon the robust, fault-tolerant foundation of Microsoft Azure and features such as availability zones, the Azure Streaming Data Platform is engineered for resilience. This ensures an unparalleled level of reliability and fault tolerance, providing our customers with peace of mind through a financially backed 99.99% uptime guarantee. Thousands of customers across various industries rely on our Streaming Data Platform services for their mission-critical applications, trusting in our platform’s consistent performance, reliability and the assurance of operational continuity, even in the face of unexpected challenges.   
  • Security and privacy: Security is paramount in the digital age, and Azure takes this seriously. Both Azure Event Hubs and Azure Stream Analytics come fortified with robust security capabilities. Features such as Microsoft Entra-based modern authentication, full compute isolation with dedicated clusters, native VNET integration, and end-to-end encryption using customer-managed keys (CMK) ensure that businesses can navigate the data landscape with confidence, knowing that their valuable information is not just processed but safeguarded with the highest standards of security and privacy.
  • Developer productivity and tools: Azure’s commitment to high productivity is reflected in the suite of developer tools like Visual Studio Code, user-friendly web interfaces, and the low-code/no-code experiences of both Azure Event Hubs and Stream Analytics. Features such as “capture data” and “process data” are designed to simplify routine tasks, enabling data engineers to quickly accomplish their objectives without the need to navigate through complex configurations, service integrations, or job scheduling and monitoring. This focus on streamlining the development experience not only boosts efficiency but also accelerates the deployment of data-driven solutions, enabling teams to focus on innovation rather than administrative overheads.   
  • Streaming in the age of AI: In this age of AI, the integration of machine learning features, including anomaly and fraud detection, amplifies the analytical capabilities of Azure’s streaming solutions. The ability to call out to real-time scoring APIs and Azure Cognitive APIs from Azure Stream Analytics jobs is just the start.
  • Pricing flexibility: Achieving business efficiency, customer satisfaction, and user productivity shouldn’t come at an exorbitant cost. To that end, Azure Event Hubs and Azure Stream Analytics offer a variety of flexible pricing options tailored to meet the needs of businesses and use cases of all sizes. Whether you’re a small startup or a large enterprise, our pricing models are designed to provide cost-effective access to advanced stream processing capabilities. This approach ensures that businesses can leverage the power of real-time data analytics without compromising on financial objectives, ultimately driving better business outcomes, enhancing customer satisfaction, and promoting overall productivity.   
  • Globally available: Azure Event Hubs and Azure Stream Analytics services are available in over 60 Azure regions around the globe. This number keeps growing to meet more customer demand. So, whether Azure customers need high-speed network access or have data sovereignty requirements, Stream Processing services are always in a region near them.  
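The kind of query Azure Stream Analytics expresses in SQL-like syntax, such as an average per device over fixed time windows, can be sketched in plain Python for intuition. This is an illustrative tumbling-window aggregate, not the service's implementation; window size and event shape are hypothetical, and a real job also handles late arrivals and watermarks.

```python
from collections import defaultdict

def tumbling_avg(events, window_s=10):
    """Average value per device per fixed, non-overlapping time window."""
    buckets = defaultdict(list)
    for ts, device, value in events:
        window_start = (ts // window_s) * window_s   # align to window boundary
        buckets[(device, window_start)].append(value)
    return {k: sum(v) / len(v) for k, v in buckets.items()}

events = [(1, "d1", 10.0), (4, "d1", 20.0), (12, "d1", 30.0), (3, "d2", 5.0)]
result = tumbling_avg(events)
# {("d1", 0): 15.0, ("d1", 10): 30.0, ("d2", 0): 5.0}
```

Each window is independent and closes as time advances, which is what lets a streaming engine emit results continuously instead of waiting for a batch boundary.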

If you are looking for a cloud-based streaming data platform that offers reliability, openness, performance, and affordability, look no further than Microsoft Azure. Azure Event Hubs and Azure Stream Analytics are the leading solutions in the market, enabling you to harness the power of stream processing for your business needs with ease and promoting a culture of innovation without unnecessary complexity. And we are not done! With Microsoft Fabric Real-Time Analytics and Data Activator, we have taken the next steps towards an even more accessible and seamless Software as a Service (SaaS) experience for you.  

The road ahead with Fabric Real-Time Analytics  

Microsoft Fabric Real-Time Analytics offers a powerful suite of features for handling real-time data, all seamlessly integrated under a consistent user experience, security model, and capacity model. The event streams feature enables ingestion and processing of streaming data from a wide variety of sources, including Kafka and Azure, into Fabric Lakehouse and Kusto Query Language (KQL) databases, enabling efficient analytics, real-time dashboarding, anomaly detection, and monitoring. Data Activator empowers users to define actionable patterns within their data, from simple thresholds to complex trends, driving informed decisions and alerts. Together, Fabric Real-Time Analytics and Data Activator provide a low-code/no-code experience for high-volume, high-granularity data within the unified AI-powered analytics platform of Microsoft Fabric.
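The threshold-style rules that Data Activator expresses visually can be sketched in plain Python. This is an illustrative analogy, not Data Activator's implementation; the item name and threshold below are hypothetical. The key idea is edge-triggered alerting: the rule fires once when a value crosses the threshold, not on every reading above it.

```python
class ThresholdRule:
    """Fire an alert only on a rising edge across the threshold."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.above = {}                    # per-item state between events

    def evaluate(self, item, value):
        """Return True only when `item` newly crosses the threshold."""
        was_above = self.above.get(item, False)
        is_above = value > self.threshold
        self.above[item] = is_above
        return is_above and not was_above

rule = ThresholdRule(threshold=100.0)
readings = [90.0, 105.0, 110.0, 99.0, 120.0]
fires = [rule.evaluate("sensor-1", v) for v in readings]
# fires on the first crossing (105) and again after dropping back below (120),
# but stays quiet while the value merely remains above the threshold
```

Tracking state per item is what keeps a rule like this from flooding users with duplicate alerts while a condition persists.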

Discover solutions

Learn more about streaming data platforms on Azure and Microsoft Fabric:


  • Real-Time Analytics in Fabric
  • Reduce complexity and simplify data integration

The post Microsoft is recognized as a Leader in The Forrester Wave™: Streaming Data Platforms appeared first on Microsoft Fabric Blog.