November 11, 2025

Processing sensor data in real time to detect abnormal conditions is a common requirement in industrial IoT applications. This tutorial demonstrates how to use Azure Stream Analytics to identify anomalies in streaming sensor data and visualize the results using Power BI dashboards.
We'll walk through the complete setup process, from configuring Azure services to creating live dashboards that display both sensor readings and anomaly alerts.
This implementation uses several Azure services working together:
Industrial Controller: A PLC or industrial device with a temperature sensor connected to its analog input module. This device sends temperature readings to Azure IoT Hub.
Azure IoT Hub: The cloud gateway that receives sensor data from industrial devices.
Azure Stream Analytics: A real-time analytics service that processes incoming data streams, applies machine learning functions to detect anomalies, and routes results to other services.
Power BI: A visualization platform that displays live dashboards showing sensor readings and anomaly detection signals.
The data flows from the temperature sensor through the industrial controller to IoT Hub, where Stream Analytics processes it in real time before sending results to Power BI for display.
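If you don't have a PLC and analog sensor on hand, a small simulator can stand in for the industrial controller. The following is a minimal sketch, assuming the azure-iot-device Python SDK and a placeholder device connection string (both assumptions, not part of this tutorial's hardware setup); the time and temperature field names match what the Stream Analytics query selects later.

import json
import random
import time
from datetime import datetime, timezone

from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder: the connection string of a device registered in your IoT Hub.
CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

def read_temperature() -> float:
    """Stand-in for the PLC's analog input: a steady reading around 22 degrees C."""
    return 22.0 + random.gauss(0, 0.3)

def main() -> None:
    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    try:
        while True:
            payload = {
                # Field names match the columns the Stream Analytics query selects.
                "time": datetime.now(timezone.utc).isoformat(),
                "temperature": read_temperature(),
            }
            message = Message(json.dumps(payload))
            message.content_type = "application/json"
            message.content_encoding = "utf-8"
            client.send_message(message)
            time.sleep(1)  # one reading per second
    finally:
        client.shutdown()

if __name__ == "__main__":
    main()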
Before configuring Azure Stream Analytics, set up a workspace in Power BI where your dashboards will live.
Log into your Power BI account and create a new workspace. Give it a descriptive name such as "stream_analytics_workspace" to identify its purpose. This workspace will store the datasets and dashboards created by your Stream Analytics job.
When multiple services read data from the same IoT Hub, consumer groups prevent them from interfering with each other. Each service should use its own consumer group for reliable data processing.
In your Azure portal, navigate to your IoT Hub and select "Built-in endpoints." Here you'll find the consumer groups section. Create a new consumer group specifically for your Stream Analytics job. Give it a meaningful name like "stream_analytics_group."
This ensures your Stream Analytics job processes all events independently of any other services consuming data from the same IoT Hub.
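Because the built-in endpoint is Event Hubs-compatible, you can also confirm that telemetry is flowing without disturbing the group reserved for Stream Analytics. Here is a minimal sketch, assuming the azure-eventhub Python package and the "Event Hub-compatible endpoint" connection string shown on the Built-in endpoints page; it reads on a separate consumer group ($Default here) so both readers coexist:

from azure.eventhub import EventHubConsumerClient

# Placeholder: the Event Hub-compatible connection string from the IoT Hub
# "Built-in endpoints" page. It includes EntityPath, so no eventhub_name
# argument is needed.
CONNECTION_STRING = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=service;SharedAccessKey=<key>;EntityPath=<hub-name>"

def on_event(partition_context, event):
    # Print each telemetry message with the partition it arrived on.
    print(partition_context.partition_id, event.body_as_str())

# Reading on $Default leaves "stream_analytics_group" free for the job.
client = EventHubConsumerClient.from_connection_string(
    CONNECTION_STRING,
    consumer_group="$Default",
)
with client:
    client.receive(on_event=on_event, starting_position="-1")  # read from the start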
Navigate to "Create a resource" in the Azure portal, select "Analytics," then choose "Stream Analytics job." Provide a name for your job that describes its function, such as "temperature_anomaly_detection."
Select your Azure subscription and resource group, then create the job. Wait for the deployment to complete, which typically takes a minute or two.
Once deployed, find your new Stream Analytics job under "All resources" in the Azure portal.
Stream Analytics jobs require at least one input source. In this case, the input is your Azure IoT Hub.
Click on "Inputs" in your Stream Analytics job, then select "Add stream input." Choose "IoT Hub" as the input type.
Configure the following settings:
Input Alias: A name you'll use to reference this input in your query, such as "IoTHub"
Subscription: Your Azure subscription containing the IoT Hub
IoT Hub: Select the correct IoT Hub from the dropdown
Consumer Group: Choose the consumer group you created earlier for this Stream Analytics job
Leave other settings at their default values and save the input configuration.
Next, configure where Stream Analytics sends processed data. For this tutorial, the output destination is Power BI.
Click on "Outputs," then select "Add output" and choose "Power BI."
Configure these settings:
Output Alias: The name you'll use in your query to reference this output, such as "PowerBI"
Dataset Name: The name that appears in Power BI when data arrives, such as "stream_analytics_dataset"
Table Name: The table name within the dataset where data is stored, such as "temperature_data"
Click "Authorize" to grant Stream Analytics permission to write data to your Power BI account. After successful authorization, you should see the workspace you created earlier. Select it and save the output configuration.
The query defines how Stream Analytics processes incoming data. Click "Edit query" to access the query editor.
The query for anomaly detection looks like this:
WITH AnomalyDetectionStep AS (
    SELECT time, temperature,
        AnomalyDetection_SpikeAndDip(temperature, 99, 500, 'spikesanddips')
            OVER (LIMIT DURATION(minute, 10)) AS scores
    FROM IoTHub
)
SELECT time, temperature,
    CAST(GetRecordPropertyValue(scores, 'IsAnomaly') AS BIGINT) AS is_anomaly
INTO PowerBI
FROM AnomalyDetectionStep
This query performs several operations:
Data Selection: Pulls time and temperature values from the IoT Hub input stream
Anomaly Detection: Calls the built-in AnomalyDetection_SpikeAndDip function with four arguments: the value to analyze (temperature), a confidence level (99), a history size (500 events), and a detection mode ('spikesanddips'). The OVER clause bounds the sliding window the algorithm examines, here the last 10 minutes.
Output Creation: The function returns a record containing IsAnomaly and Score fields. The outer SELECT extracts IsAnomaly, casts it to an integer named is_anomaly, and sends it to Power BI along with time and temperature.
The anomaly detection function identifies both spikes (sudden increases) and dips (sudden decreases) in the data stream. When an anomaly is detected, is_anomaly changes from 0 to 1, which you can use to trigger alerts or highlight unusual conditions on dashboards.
Save the query after entering it in the editor.
Return to your Stream Analytics job overview page and click "Start" to begin processing data. The job starts reading from IoT Hub, applying your query logic, and sending results to Power BI.
It takes a few moments for the job to initialize and begin processing events.
Navigate to your Power BI workspace in a web browser. Under "Datasets," you should see the dataset name you specified in the Stream Analytics output configuration.
Click "Create dashboard" and provide a name such as "temperature_dashboard."
Click "Add tile," select "Custom streaming data," then choose your Stream Analytics dataset.
Select "Line chart" as the visualization type. Configure these settings:
Axis: Select "time" to show data over timeValues: Select "temperature" to display temperature readingsTime window: Set to display the last 10 minutes of data
Add a title like "Temperature" and apply the settings.
Add another tile using custom streaming data from your dataset. This time, select "Card" as the visualization type and choose "temperature" as the value. This displays the current temperature reading as a single number.
Add a third tile with a line chart. Set "time" as the axis and "is_anomaly" as the value. This displays the anomaly detection signal, which shows 0 under normal conditions and jumps to 1 when an anomaly is detected.
This visual indicator makes it immediately obvious when the system detects unusual behavior in the data stream.
The machine learning algorithm needs time to establish a baseline before it can reliably detect anomalies. Allow at least 500 events to flow through the system (as specified in your query) before testing.
Once sufficient data has been collected, create an anomalous condition by applying heat to the temperature sensor. Watch the Power BI dashboard as the temperature begins rising. When the temperature increase exceeds normal patterns, the anomaly signal on your dashboard jumps from 0 to 1, indicating that Stream Analytics detected the abnormal condition.
This demonstrates the real-time nature of the system—anomalies are detected within seconds of occurring and immediately visible on the dashboard.
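If applying heat to the physical sensor isn't practical, you can inject an artificial spike into the simulated readings instead. Below is a hypothetical drop-in variant of the simulator's read_temperature function (the spike size and cadence are arbitrary illustration values; pass a running counter from the send loop):

import random

SPIKE_EVERY = 600  # arbitrary: inject one spike every 600 readings

def read_temperature(n: int) -> float:
    """Baseline around 22 degrees C with an occasional injected +15 degree spike.

    A stand-in for heating the sensor: the abrupt jump is what the
    spike-and-dip detector should flag as an anomaly.
    """
    baseline = 22.0 + random.gauss(0, 0.3)
    if n > 0 and n % SPIKE_EVERY == 0:
        return baseline + 15.0
    return baseline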
The AnomalyDetection_SpikeAndDip function uses two key parameters that affect detection sensitivity:
Confidence Level (99%): Higher percentages reduce false positives but may miss subtle anomalies. Lower percentages detect more anomalies but increase false alerts.
History Size (500): The number of events the algorithm analyzes to understand normal behavior patterns. Larger values provide more stable baselines but require more time before detection begins.
Adjust these parameters based on your specific application requirements and the characteristics of your data.
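Microsoft doesn't publish the internals of the built-in algorithm, but a toy rolling z-score detector, offered here only as an analogy and not as the actual Stream Analytics implementation, makes the tradeoff concrete: the window length plays the role of history size, and the z threshold plays the role of confidence level.

from collections import deque
from statistics import mean, stdev

def zscore_detector(values, history_size=500, threshold=3.0):
    """Toy anomaly detector: flags values far from the rolling mean.

    history_size mirrors the query's history parameter (larger = steadier
    baseline, slower start); threshold mirrors confidence (higher = fewer
    false alerts, more missed subtle anomalies).
    """
    window = deque(maxlen=history_size)
    for value in values:
        if len(window) >= 30 and stdev(window) > 0:  # wait for a baseline
            z = abs(value - mean(window)) / stdev(window)
            yield value, 1 if z > threshold else 0
        else:
            yield value, 0  # not enough history to judge yet
        window.append(value)

Raising threshold or enlarging history_size in this sketch moves the detector along the same sensitivity axes described above.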
Real-time anomaly detection of this kind applies to many industrial scenarios beyond temperature monitoring, wherever a sudden spike or dip in a sensor reading signals a condition worth flagging.
Azure Stream Analytics provides built-in machine learning functions that make real-time anomaly detection accessible without requiring custom model development. By configuring inputs from IoT Hub, writing simple queries, and routing results to Power BI, you can create live monitoring dashboards that automatically flag unusual conditions in your industrial data streams.
The key steps include creating consumer groups to isolate data streams, configuring Stream Analytics inputs and outputs, writing queries that apply anomaly detection functions, and building Power BI dashboards that visualize both raw data and detection results. Once configured, the system continuously monitors your data and alerts you to abnormal conditions as they occur.