We are living in the age of electric mobility. Globally, the adoption of electric cars and two-wheelers is rising steeply. Electric mobility devices rely on expensive rechargeable lithium-ion batteries for power. These batteries are integral to the fight against the harmful effects of fossil fuels, such as pollution and rising costs. But lithium-ion technology does come with some drawbacks, such as:
- Fire hazard and explosions resulting from temperature increases and short-circuits
- Decreased life-span due to over/under charging
- Decreased performance over time due to aging
The global lithium-ion battery market is estimated to more than double in size in just a few years – from $44 billion in 2020 to $94 billion by 2025.
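As a quick sanity check on that estimate, the implied compound annual growth rate can be computed directly (figures taken from the estimate above):

```python
# CAGR implied by the market estimate: $44B (2020) -> $94B (2025)
start, end, years = 44e9, 94e9, 5
cagr = ((end / start) ** (1 / years) - 1) * 100
print(f"Implied growth: {cagr:.1f}% per year")  # roughly 16.4% per year
```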
Right now, the costs related to procuring and maintaining the physical batteries are a major hurdle in the adoption of this technology. Several lithium-ion battery manufacturers are now coming up with smart battery management systems that help monitor and track the health of lithium-ion batteries. A battery management system has several goals:
- Preventative analysis (monitoring layer) – continuously monitor threats related to fire and explosions
- Predictive analysis (analytics layer) – predict failures before they damage the battery, thereby extending its life
Recently, I got a chance to create such a smart battery management system for a company that manufactures batteries for electric scooters. The following is the high-level architecture of the solution:

- Each battery pack is fitted with an electronic IoT sensor that measures voltage, current and temperature of the battery
- Sensor data is sent to Azure cloud (Azure Event Hubs or Azure IoT Hub) using Wi-Fi or Bluetooth
- Streaming data from Azure Event Hubs/Azure IoT Hub is processed using Azure Stream Analytics
- The Azure Stream Analytics job performs anomaly detection. If an anomaly is detected, an alert is generated and sent to the user using Azure Notification Hubs
In the following section I will show you how we performed anomaly detection using two machine-learning-based operators provided by Azure – AnomalyDetection_SpikeAndDip and AnomalyDetection_ChangePoint.
AnomalyDetection_SpikeAndDip detects temporary anomalies in a time-series event stream.
AnomalyDetection_ChangePoint detects persistent anomalies in a time-series event stream.
For this demonstration I used an Aqueouss 48V 25Ah Li-Ion battery. The following are the technical specifications of the battery.

For this battery, the following are the recommended watermarks:
- Upper point voltage – 54.6 V – anything higher could cause an explosion or fire
- Lower point voltage – 39 V – anything lower impacts the health of the battery
A key function of the battery management system is to monitor that these values stay within the recommended range. For the purposes of this demo I will concentrate on the upper point voltage. After all, who wants an explosion or a fire?
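A minimal sketch of this monitoring rule in Python (the thresholds come from the watermarks above; the function name and alert strings are my own):

```python
# Recommended operating window for the 48V pack described above
UPPER_VOLTAGE = 54.6  # V - anything higher risks fire or explosion
LOWER_VOLTAGE = 39.0  # V - anything lower degrades battery health

def classify_voltage(v: float) -> str:
    """Map a terminal-voltage reading to an alert level."""
    if v > UPPER_VOLTAGE:
        return "CRITICAL: over-voltage"
    if v < LOWER_VOLTAGE:
        return "WARNING: under-voltage"
    return "OK"

print(classify_voltage(55.1))  # CRITICAL: over-voltage
```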
Now let's get started on the implementation. Assuming you already have an Azure account, feel free to follow along using the steps below:
- Open a cloud shell https://portal.azure.com/#cloudshell/
- Invoke the command below to clone the code repository
Azure> git clone https://github.com/mkukreja1/blogs
Cloning into 'blogs'...
remote: Enumerating objects: 149, done.
remote: Counting objects: 100% (149/149), done.
remote: Compressing objects: 100% (90/90), done.
remote: Total 149 (delta 54), reused 135 (delta 40), pack-reused 0
Receiving objects: 100% (149/149), 1.67 MiB | 41.71 MiB/s, done.
Resolving deltas: 100% (54/54), done.
- Now let's create an Azure Event Hub. You may want to edit the values of RESOURCEGROUPNAME and LOCATION as per your preferences.
RESOURCEGROUPNAME="training_rg"
LOCATION="eastus"
TOPIC="iotbattery"
EVENTHUB_NAMESPACE="iotbatteryspace"
EVENTHUB_NAME="iotbatteryhub"
EVENT_SUBSCRIPTION="iotbatteryevent"
az provider register --namespace Microsoft.EventGrid
az provider show --namespace Microsoft.EventGrid --query "registrationState"
az eventgrid topic create --name $TOPIC -l $LOCATION -g $RESOURCEGROUPNAME
az eventhubs namespace create --name $EVENTHUB_NAMESPACE --resource-group $RESOURCEGROUPNAME
az eventhubs eventhub create --name $EVENTHUB_NAME --namespace-name $EVENTHUB_NAMESPACE --resource-group $RESOURCEGROUPNAME
- To verify the newly created Azure Event Hub navigate to Home > All Resources > iotbatteryspace. On the left menu click on Event Hubs. Now click on the newly created event hub iotbatteryhub.

- Click on Process Data followed by Explore. You should now be in a Query Editor window. We need to create a shared access policy for previewing data in Azure event hubs. Click on Create.

- At this point we are ready to send some events to the newly created event hub. Since you likely don't have an actual IoT sensor, we can use the commands below to emulate 100 events from five batteries being sent to Azure Event Hubs. Invoke the commands below in cloud shell.
RESOURCEGROUPNAME="training_rg"
TOPIC="iotbattery"
EVENTHUB_NAMESPACE="iotbatteryspace"
EVENTHUB_NAME="iotbatteryhub"
EVENTHUB_CONN_STR=$(az eventhubs namespace authorization-rule keys list --resource-group $RESOURCEGROUPNAME --namespace-name $EVENTHUB_NAMESPACE --name RootManageSharedAccessKey --query "primaryConnectionString" --output tsv)
for VARIABLE in {1..100}
do
echo "Sending event $VARIABLE"
python ~/blogs/eventhubs/iot/send_battery_events_normal_reading.py --EVENTHUB_NAME $EVENTHUB_NAME --EVENTHUB_CONN_STR $EVENTHUB_CONN_STR
done
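The repository's send_battery_events_normal_reading.py script is not reproduced here, but a sender like it might look roughly as follows using the azure-eventhub package (the payload field names are assumptions based on the queries later in this article, and the value ranges are illustrative):

```python
import json
import time
import random
import argparse

def build_payload(pack: int) -> str:
    """One JSON reading in the shape the later queries expect (assumed fields)."""
    return json.dumps({
        "Timestamp": int(time.time()),
        "Battery Pack": f"PACK-{pack}",
        "Terminal Voltage": round(random.uniform(46.0, 54.0), 2),
        "SoC": round(random.uniform(20.0, 100.0), 1),
    })

if __name__ == "__main__":
    # azure-eventhub is only needed when actually sending
    from azure.eventhub import EventHubProducerClient, EventData

    parser = argparse.ArgumentParser()
    parser.add_argument("--EVENTHUB_NAME", required=True)
    parser.add_argument("--EVENTHUB_CONN_STR", required=True)
    args = parser.parse_args()

    producer = EventHubProducerClient.from_connection_string(
        args.EVENTHUB_CONN_STR, eventhub_name=args.EVENTHUB_NAME)
    with producer:
        batch = producer.create_batch()
        for pack in range(1, 6):  # five batteries per run
            batch.add(EventData(build_payload(pack)))
        producer.send_batch(batch)
```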
- We can now verify that the event data appears in the Azure event hub. Go back to the Azure portal query editor. In the Input preview tab, press Refresh. You should now see the newly sent events in the query window.

- Check the readings received from the IoT events. Invoke the SQL below in the query editor.
SELECT data.Timestamp, data."Battery Pack", data."Terminal Voltage", data."Charging Current", data."Discharging Current",data."SoC", data."Charge Capacity", data."Charging Power", data."Discharging Power", data."Cycle Count"
FROM [iotbatteryhub]

- We are now ready to implement anomaly detection. Remember, we want to monitor the battery voltage for spikes because they could be dangerous. Invoke the SQL below to check how many spikes were received.
WITH batteryAnomalyDetection AS
(
SELECT
data.Timestamp AS time,
CAST(data."Terminal Voltage" AS float) AS Voltage,
AnomalyDetection_SpikeAndDip(CAST(data."Terminal Voltage" AS float), 70, 90, 'spikes')
OVER(LIMIT DURATION(second, 120)) AS Spikes
FROM
[iotbatteryhub]
)
SELECT
time,
Voltage,
CAST(GetRecordPropertyValue(Spikes, 'Score') AS float) AS SpikeAndDipScore,
CAST(GetRecordPropertyValue(Spikes, 'IsAnomaly') AS bigint) AS IsSpikeAnomaly
INTO battery
FROM batteryAnomalyDetection
WHERE CAST(GetRecordPropertyValue(Spikes, 'IsAnomaly') AS bigint)=1

In the SQL above we checked for anomalies using a confidence level of 70%: AnomalyDetection_SpikeAndDip(CAST(data."Terminal Voltage" AS float), 70, 90, 'spikes')
The image above shows us that the machine learning operator flagged 17 events as spikes.
- Let's retest the model, but this time with a 90% confidence level.
WITH batteryAnomalyDetection AS
(
SELECT
data.Timestamp AS time,
CAST(data."Terminal Voltage" AS float) AS Voltage,
AnomalyDetection_SpikeAndDip(CAST(data."Terminal Voltage" AS float), 90, 95, 'spikes')
OVER(LIMIT DURATION(second, 120)) AS Spikes
FROM
[iotbatteryhub]
)
SELECT
time,
Voltage,
CAST(GetRecordPropertyValue(Spikes, 'Score') AS float) AS SpikeAndDipScore,
CAST(GetRecordPropertyValue(Spikes, 'IsAnomaly') AS bigint) AS IsSpikeAnomaly
INTO battery
FROM batteryAnomalyDetection
WHERE CAST(GetRecordPropertyValue(Spikes, 'IsAnomaly') AS bigint)=1

The image above shows us that the machine learning operator flagged 15 events as spikes.
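The two runs illustrate the confidence tradeoff: a higher confidence level flags fewer, more extreme points. As a rough intuition only (this is not the model the Azure operator actually uses), a spike detector can be sketched as a z-score test over a sliding window, where a stricter confidence maps to a higher cutoff:

```python
import statistics

def detect_spikes(readings, confidence=90, window=20):
    """Toy spike detector: flag a reading that sits far above the
    mean of the preceding window. The confidence-to-cutoff mapping
    is invented for illustration, not taken from Azure."""
    cutoff = {70: 1.5, 90: 2.5, 95: 3.0}.get(confidence, 2.0)
    anomalies = []
    for i, v in enumerate(readings):
        history = readings[max(0, i - window):i]
        if len(history) < 5:
            continue  # not enough context yet
        mean = statistics.mean(history)
        stdev = statistics.pstdev(history)
        if stdev > 0 and (v - mean) / stdev > cutoff:
            anomalies.append(i)
    return anomalies

# A noisy ~48V signal with two injected spikes
signal = [48.0 + 0.05 * ((i % 5) - 2) for i in range(30)]
signal[10] = 56.0
signal[25] = 55.0
print(detect_spikes(signal, confidence=90))
```

Note that every point flagged at 90% confidence is also flagged at 70%, so lowering the confidence can only add anomalies, mirroring the 17-versus-15 result above.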
For preventative analysis, we can implement code that sends an alert for each spike received. This can be done easily using Azure Functions.
- Now let's do something even more fun. We will convert this query to a Stream Analytics job. The benefit of converting the query to a job is that data can be collected unattended. The collected data will help us with further analysis for dashboarding and decision making.
WITH batteryAnomalyDetection AS
(
SELECT
data.Timestamp AS time,
CAST(data."Terminal Voltage" AS float) AS Voltage,
AnomalyDetection_SpikeAndDip(CAST(data."Terminal Voltage" AS float), 90, 95, 'spikes')
OVER(LIMIT DURATION(second, 120)) AS Spikes
FROM
[iotbatteryhub]
)
SELECT
time,
Voltage,
CAST(GetRecordPropertyValue(Spikes, 'Score') AS float) AS SpikeAndDipScore,
CAST(GetRecordPropertyValue(Spikes, 'IsAnomaly') AS bigint) AS IsSpikeAnomaly
INTO battery
FROM batteryAnomalyDetection
Click on Create Stream Analytics Job. Fill in the details as below and click on Create.

- You will now be in a window that looks like this.

We need an output for data collection. Click on + Add and choose Azure Blob storage/ADLS Gen2.

- Now that the output of the job has been created we are ready to start the job. Click on Start.

- Once you receive the "Streaming job started successfully" message, send some more events to the event hub. Invoke the commands below in cloud shell.
RESOURCEGROUPNAME="training_rg"
TOPIC="iotbattery"
EVENTHUB_NAMESPACE="iotbatteryspace"
EVENTHUB_NAME="iotbatteryhub"
EVENTHUB_CONN_STR=$(az eventhubs namespace authorization-rule keys list --resource-group $RESOURCEGROUPNAME --namespace-name $EVENTHUB_NAMESPACE --name RootManageSharedAccessKey --query "primaryConnectionString" --output tsv)
for VARIABLE in {1..100}
do
echo "Sending event $VARIABLE"
python ~/blogs/eventhubs/iot/send_battery_events_normal_reading.py --EVENTHUB_NAME $EVENTHUB_NAME --EVENTHUB_CONN_STR $EVENTHUB_CONN_STR
done
- If everything worked well, the Stream Analytics job will output events to the storage location below in JSON format. Navigate to Home > All Resources > traininglakehouse > battery

- But how do we perform analysis on the event data? For this we can use Azure Synapse. Invoke the commands below in cloud shell to create a new Synapse workspace.
STORAGEACCOUNTNAME="traininglakehouse"
BATTERYDATALAYER="battery"
SYNAPSEWORKSPACENAME="batterytracking"
RESOURCEGROUPNAME="training_rg"
LOCATION="eastus"
SQLUSER="sqladminuser"
SQLPASSWORD="*********"
az synapse workspace create \
    --name $SYNAPSEWORKSPACENAME \
    --resource-group $RESOURCEGROUPNAME \
    --storage-account $STORAGEACCOUNTNAME \
    --file-system $BATTERYDATALAYER \
    --sql-admin-login-user $SQLUSER \
    --sql-admin-login-password $SQLPASSWORD \
    --location $LOCATION
az synapse workspace firewall-rule create --name allowAll --workspace-name $SYNAPSEWORKSPACENAME --resource-group $RESOURCEGROUPNAME --start-ip-address 0.0.0.0 --end-ip-address 255.255.255.255
- Now navigate to Home > All Resources > batterytracking and click on Open in the Open Synapse Studio section

- In the Synapse workspace window, click on Data using the menus in the left pane.

Enter details as below and click Create

- Once the database has been created open a new SQL script window.

- Paste the SQL below in the newly opened SQL window. This SQL will create a view over the battery events data.
CREATE view battery
AS
SELECT JSON_VALUE(doc, '$.time') AS time, JSON_VALUE(doc, '$.BatteryPack') AS BatteryPack, JSON_VALUE(doc, '$.Voltage') AS Voltage, JSON_VALUE(doc, '$.SpikeAndDipScore') AS SpikeAndDipScore, JSON_VALUE(doc, '$.IsSpikeAnomaly') AS IsSpikeAnomaly
FROM openrowset(
BULK 'https://traininglakehouse.blob.core.windows.net/battery/',
FORMAT= 'csv',
FIELDTERMINATOR='0x0b',
FIELDQUOTE= '0x0b'
) WITH(doc nvarchar(max)) as row
GO
SELECT CONVERT(TIME,DATEADD(s,cast(time as int),'19700101 0:00:00')) as time, BatteryPack, Voltage, IsSpikeAnomaly FROM battery
GO
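The extraction the view performs can be prototyped in Python before committing to SQL: parse one JSON line as the Stream Analytics job writes it, then convert the epoch timestamp the way the DATEADD expression does (the sample line below is illustrative, not captured output):

```python
import json
from datetime import datetime, timezone

# One output line as the Stream Analytics job might write it (illustrative values)
line = '{"time": "1655000000", "BatteryPack": "PACK-1", "Voltage": "56.2", "IsSpikeAnomaly": "1"}'

doc = json.loads(line)
# Equivalent of DATEADD(s, CAST(time AS int), '19700101 0:00:00')
event_time = datetime.fromtimestamp(int(doc["time"]), tz=timezone.utc)
print(event_time.isoformat(), doc["BatteryPack"], doc["Voltage"])
```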
- As you can see, we can now visualize event data as it flows into storage. The chart below shows the spikes in events over time. This information is critical for predictive analysis.

I hope this article was helpful. The topic is covered in greater detail in the Azure Data Analytics course offered by Datafence Cloud Academy. I teach the course online on weekends.