Background
Azure Event Hubs is a big-data streaming platform and event ingestion service that can receive and process thousands of events per second. Data from an event hub can be transformed and stored using a real-time analytics service. Azure Stream Analytics is a real-time analytics service designed to help analyze and process streams of data, which can be used to trigger alerts and actions. Power BI is a business analytics service that provides interactive visualizations.
Objective
Using Azure data ingestion and query services such as Event Hubs and Stream Analytics, successfully create a self-streaming, real-time Power BI dashboard.
Setting Up Azure
- To begin, set up your Azure free-trial subscription via the following link: https://azure.microsoft.com/en-us/free/
- Once your free-trial account has been activated, go to the Azure portal at https://portal.azure.com/#home. Double-check in the top right corner that you are signed in with your free-trial account. If you aren't, sign in before following the next steps.
- In the top left corner, under the title "Microsoft Azure", click Create a resource.

- In the "Search the Marketplace" search bar, type "Resource Group". Click the first result, titled "Resource Group", then hit Create.
- Once you hit Create, you will be taken to a page where you configure your new resource group. Type in any name for your new resource group, then hit the blue "Review + Create" button at the bottom. To open the resource group, click the notification icon in the top right corner of the portal, then click "Go to resource group".
- Now, once again click the Create a resource button in the top left corner of the portal.
- In the search bar, type "Event Hub". Click on the first result titled: "Event Hubs."

- Once again, this will lead you to a settings page, this time for your new Event Hubs namespace. Type in any name for the namespace. Set the pricing tier to Standard. Enable Kafka and make sure that "Make this namespace zone redundant" is turned OFF. Subscription should be set to Free Trial. Select the resource group that you created in the previous steps. Keep the location set to West US, set Throughput Units to 4, and enable Auto-Inflate as well.
- Once you hit Create, it will take a few minutes for your event hub to deploy. Click the notification icon in the top right corner to see the status of the deployment. Once it succeeds, click on your event hub's name, located in the description of the notification.

- Once clicked, it will take you to your deployment page. Simply click Go to resource to access your event hub namespace.
- Once you're inside the event hub namespace, look at the right-hand side column with the various resource sections and find the section titled "Event Hubs", under Entities.

- Once you are transferred to the Event Hubs page, click on the + icon adjacent to the words Event Hub. This takes you to the event hub creation settings page. Type a name for your new event hub (different from your namespace name) and set both partition count and message retention to 4. Set Capture to OFF, then hit Create. The event hub will be created immediately, and you will be transferred back to the Event Hubs page in your namespace. To access your new event hub instance, click on its name.

- Next, set up a Stream Analytics job. To do this, click the Create a resource button in the top left corner, then search for "Stream Analytics job". Click on the first result, then hit Create. Once again, you will be transferred to a settings page. Enter your ASA job name and select the resource group that you created. Keep the hosting environment set to Cloud and set streaming units to 4.
- It will take a few minutes for your ASA (Azure Stream Analytics) job to deploy. Once the deployment succeeds, click on Go to resource in the notifications bar.

- Once you're inside the Stream Analytics job, click on Inputs in the right-hand side column, under Job topology.

- On the Inputs page, click on New stream input at the top of the page. A dropdown will show three different stream input types to choose from. Click on "Event Hub".

- Once you click on Event Hub, the input settings page will appear. Set a name for your input alias and keep the rest of the settings as they are. Make sure that "Event serialization format" is set to JSON, then hit Save. This automatically starts testing the connection between ASA and your event hub; a notification will soon pop up saying that the connection was successful.
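For context, each event arriving at this input is a JSON document whose fields are your spreadsheet's column names. Using the example columns from the Python section later in this tutorial, a single event might look like the following (the values are made up purely for illustration):

```json
{
  "Yield": 0.97,
  "SKU": "SKU-1042",
  "SerialNumber": "SN-000123",
  "ProductWeight": 12.5,
  "BeltSpeed": 3.2
}
```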

- Next, head on over to the Outputs section, under "Job topology."

- On the Outputs page, click the Add button at the top of the page and scroll down to select Power BI as an output. In the Power BI settings page, set a name for your Power BI output alias, then click Authorize and sign in to your Power BI personal/work account. Once you are signed in, choose a name for your dataset; running a Stream Analytics job automatically creates a streaming dataset in Power BI, and the dataset name you enter here will be its name. Choose a table name as well.

- Once you click Save, it will automatically begin testing the connection to the output. It should say the connection was successful.
- Now, go back to your event hub namespace. Click the Home icon in the top left corner, under "Create a resource". Make sure to choose the Event Hubs namespace, NOT the instance; the namespace is the second resource you created in this tutorial. Once clicked, you will be inside the event hub namespace. In the right-hand side column, click on Shared access policies, under Settings. Open the default RootManageSharedAccessKey policy and copy the primary connection string; the Python script in the next section will need it to connect to your event hub.


Python Setup
We will be using Python to post data into our Azure event hub. Feel free to use any IDE or editor that best suits you; popular choices include PyCharm and Notepad++.
We'll be using pandas to read an Excel file for data, and then post the data values to Azure Event Hubs.
Set up a simple Excel table with rows and columns, with at least 15–20 rows of data for your Python script to read. My data will have columns such as Yield, SKU, SerialNumber, ProductWeight, Belt Speed, etc.
The payload dictionary should have key-value pairs of column name and row value, respectively.
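Putting those pieces together, here is a minimal sketch of such a script. The file name `data.xlsx`, the placeholder connection values, and the function names are assumptions for illustration; the connection string is the one copied from your namespace's Shared access policies, and the event hub name is your instance (not namespace) name:

```python
import json

import pandas as pd


def rows_to_payloads(df):
    """Turn each spreadsheet row into a {column name: row value} payload dict."""
    return [row.to_dict() for _, row in df.iterrows()]


def send_to_event_hub(payloads, conn_str, eventhub_name):
    """Post each payload to the event hub as a JSON-serialized event."""
    # pip install azure-eventhub  (imported here so the helper above
    # remains usable even without the SDK installed)
    from azure.eventhub import EventData, EventHubProducerClient

    producer = EventHubProducerClient.from_connection_string(
        conn_str=conn_str, eventhub_name=eventhub_name
    )
    with producer:
        batch = producer.create_batch()
        for payload in payloads:
            batch.add(EventData(json.dumps(payload)))
        producer.send_batch(batch)


# Example usage (fill in the connection string copied from Shared access
# policies and your event hub instance name, then uncomment):
# df = pd.read_excel("data.xlsx")  # reading .xlsx also requires openpyxl
# send_to_event_hub(rows_to_payloads(df),
#                   conn_str="<namespace connection string>",
#                   eventhub_name="<event hub name>")
```

Each event arrives at the event hub as a JSON document keyed by your column names, which is what the Stream Analytics query later in this tutorial expects.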
Make sure to install all the necessary packages (for this setup: pandas, openpyxl for reading .xlsx files, and azure-eventhub); otherwise, the program will not run.
Now your Python program is set up and ready to run.
Run the program and confirm that your results show up without any errors.
Stream Analytics and Power BI
- Now head on over to your Stream Analytics job in the Azure portal. Click on Query under "Job topology."
- From here, copy the following code into your Stream Analytics query:
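Below is a minimal sketch of a query of this shape. The column names come from this tutorial's example data, and the bracketed aliases are placeholders for the input and output aliases you created earlier:

```sql
SELECT
    CAST(SKU AS nvarchar(max)) AS SKU,
    CAST(Yield AS float) AS Yield,
    CAST(BeltEfficiency AS float) AS BeltEfficiency
INTO
    [your-powerbi-output-alias]
FROM
    [your-eventhub-input-alias]
```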
Replace SKU, Yield, BeltEfficiency, etc. with your own column names, and make sure to adjust the data types accordingly as well.
A complete list of Stream Analytics query data types can be found in the Azure Stream Analytics documentation.
- Now click on the ellipsis (...) located next to your input name, then click Sample data from input.

- Wait for your input to be sampled. Once it has been, you will receive a notification saying that the sampling succeeded. Now click Test at the top of the page.

- Once the test has succeeded, your results will appear at the bottom of the query page. Now, click on Overview in the right-hand side of the ASA job, then click START. This will start your streaming job.
- It takes a while for your ASA job to start streaming, but once it does, a new streaming dataset will be created in the Datasets section of your Power BI workspace. From here, you can create a new dashboard and build visuals with your streaming dataset.

Contact: