In this hack we're going to use a Thinxtra Xkit device to send temperature data over the Sigfox network into an Azure EventHub, feeding a PowerBI dashboard.
This is an extension of a hack doing the same with AWS services. There's a bit of overlap and repeated text, although I've tried to minimise it.
Prerequisites

For this hack, I assume that you have:
- An Azure Account, with a Resource Group that you have permissions to create services in. NOTE: you are solely responsible for all Azure charges that result from following this walk-through.
- A PowerBI Account
- Familiarity with compiling and installing Arduino Firmware.
- A Thinxtra Xkit with a Sigfox contract. (Any programmable Sigfox device will do but some specifics will vary)
- Worked through the "Xkit Overview and firmware update" section and the first section of "Sigfox Configuration" in this article
Unlike the AWS configuration, for Azure EventHub, we need to create the EventHub first prior to setting up the Sigfox callbacks.
Log into https://portal.azure.com and click on "Event Hubs" in the left-hand menu (it might be under "More Services"). Then click 'Add'; you should get the following screen.
Choose a name for the namespace; it has to be globally unique. I used 'sigfoxhubexp', but you will need to choose something else. The Pricing Tier can be set to 'Basic', which will help keep the costs down. Select an appropriate Resource Group and Location, and leave Throughput Units at 1.
Click OK, and wait for your EventHub to be created.
Once it's ready (keep an eye on the notifications at the top of the window), open the new namespace and click "+ Event Hub" at the top. Enter 'demo' in the Name field and click OK.
Now, back in the Event Hub screen for 'sigfoxhubexp' (your name will be different), click on 'Connection Strings' and select the 'RootManageSharedAccessKey' policy; it will be the only one there. Copy the "Connection string–primary key" value. I'll refer to this as the Connection String in a bit.
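The Sigfox wizard only needs the Connection String pasted in as-is, but it helps to know what's inside it: an endpoint, a policy name (SharedAccessKeyName) and a key, from which a SAS token can be signed if you ever want to talk to the EventHub yourself. Here's a minimal sketch using only the Python standard library; the URI and policy name in the docstring are the example values from this walk-through, and the key is whatever your Connection String contains:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def make_sas_token(uri, key_name, key, ttl=3600):
    """Build a Service Bus SAS token from a shared access key.

    uri      -- the resource URI, e.g. "https://sigfoxhubexp.servicebus.windows.net/demo"
    key_name -- the policy name, e.g. "RootManageSharedAccessKey"
    key      -- the primary key taken from the Connection String
    ttl      -- token lifetime in seconds
    """
    expiry = str(int(time.time()) + ttl)
    encoded_uri = urllib.parse.quote_plus(uri)
    # The signature covers the URL-encoded URI and the expiry, newline-separated.
    to_sign = (encoded_uri + "\n" + expiry).encode("utf-8")
    signature = base64.b64encode(
        hmac.new(key.encode("utf-8"), to_sign, hashlib.sha256).digest()
    )
    return "SharedAccessSignature sr={}&sig={}&se={}&skn={}".format(
        encoded_uri, urllib.parse.quote_plus(signature), expiry, key_name
    )
```

The resulting string goes into an HTTP `Authorization` header when posting to the hub's REST endpoint.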
Sigfox Configuration

Log into https://backend.sigfox.com.
Callbacks

When the Sigfox network receives a signal from your device, it uses the callbacks configured on this screen to send the data to your backend. In this case, we are going to use the Azure EventHub, for which Sigfox provide a wizard to get you going. Click on the 'new' button at the top of the screen and then click on the Azure EventHub Wizard line.
Enter the following:
- Custom Payload Config
temp::uint:16:little-endian pressure::uint:16:little-endian photo::uint:16:little-endian AccX::uint:16:little-endian AccY::uint:16:little-endian AccZ::uint:16:little-endian
- Connection String: Paste the Connection String we copied from Azure earlier
- URL: You have to manually create the url, for me it is: https://sigfoxhubexp.servicebus.windows.net/demo/messages
You need to change 'sigfoxhubexp' to whatever you used when creating the EventHub, and change 'demo' if you used a different value above.
- JSON Body (note, no line breaks)
{"device": "{device}","time": {time},"station": "{station}","snr": {snr},"rssi": {rssi},"data": "{data}","temp": {customData#temp},"pressure": {customData#pressure},"photo": {customData#photo},"AccX": {customData#AccX},"AccY": {customData#AccY},"AccZ": {customData#AccZ}}
Then click OK.
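For reference, the custom payload config above tells Sigfox how to slice up the 12-byte frame the Xkit transmits: six little-endian uint16 fields. The same decoding can be sketched locally in Python, which is handy when checking raw frames in the Sigfox backend (the example frame below is made up):

```python
import struct

# Field order matches the custom payload config:
# temp, pressure, photo, AccX, AccY, AccZ -- each a little-endian uint16.
FIELDS = ("temp", "pressure", "photo", "AccX", "AccY", "AccZ")

def decode_payload(hex_frame):
    """Decode a 12-byte Sigfox frame (as a hex string) into raw field values."""
    raw = bytes.fromhex(hex_frame)
    values = struct.unpack("<6H", raw)  # "<" = little-endian, "H" = uint16
    return dict(zip(FIELDS, values))

# Made-up example frame: temp = 0x08e4 = 2276 raw (22.76 degrees C after rescaling).
print(decode_payload("e40800000000000000000000"))
# -> {'temp': 2276, 'pressure': 0, 'photo': 0, 'AccX': 0, 'AccY': 0, 'AccZ': 0}
```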
Stream Analytics Configuration

Next we need to set up a Stream Analytics job to capture the data going into the EventHub and prepare it for PowerBI. Log back into the Azure Portal and find "Stream Analytics jobs" in the left menu; again, it might be under "More Services".
Click "+ Add" at the top, provide an arbitrary name, and appropriate values for Subscription, Resource Group and Location. Then click "Create" at the bottom.
Wait a minute before clicking the refresh button (the one within the Azure Portal, not the browser's, although it shouldn't matter) and then open the job you just created.
Go to Inputs, and click Add to open the following window.
Provide an input alias, and set the following
- Source Type -> Data Stream
- Source -> Event Hub
- Import option -> Use event hub from current subscription
- Service bus namespace -> Use the name you used earlier where I used "sigfoxhubexp" (it will be different)
- Event hub name -> Use the name you used where I used "demo" (it might be the same)
- Event hub policy name -> RootManageSharedAccessKey
- Event serialisation format -> JSON
- Encoding -> UTF-8
- Event Compression Type -> None
And click 'Save'.
Back in the Stream Analytics job definition, click on "Outputs" and then "+ Add" to get to the New Output window.
Provide a name for the output, for example "SigfoxOut", set Sink to "Power BI" and click "Authorize".
This will get you to log in to PowerBI and enable the Stream job to send data to PowerBI.
Once authorized, provide a dataset name "sigfoxDemo" and table name "Demo1", or values of your choice. Then click "Save".
Finally, select "Query" and paste the below into the query box on the right-hand side. You will need to change "sigFoxExp-PowerBi" to the name you used for your output, and "sigFoxHubExp" to the name you used for your input; you can see the names you used on the left side of the window.
SELECT
device, duplicate, snr, station, data, rssi,
(CAST(temp as float) / 100.0) AS temperature,
(CAST(pressure as float) * 3.0) AS pressure,
(CAST(photo as float) / 1000.0) AS photo,
(CAST(AccX as float) / 250.0) AS accx,
(CAST(AccY as float) / 250.0) AS accy,
(CAST(AccZ as float) / 250.0) AS accz,
DATEADD(hour,+10,DATEADD(SECOND,(CAST(time as bigint)), '1970-01-01 00:00:00')) AS realtime,
DATEADD(hour,+10, DATEADD(SECOND,LAST(time) OVER (PARTITION BY time LIMIT
DURATION(hour, 1) WHEN time IS NOT NULL), '1970-01-01 00:00:00')) AS lastevent
INTO
"sigFoxExp-PowerBi"
FROM
sigFoxHubExp
The query above takes the raw values supplied by the XKit, rescales them back into values we understand, and converts the epoch timestamps into date fields (the DATEADD(hour, +10, ...) applies a fixed timezone offset). Click 'Save'.
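If you want to sanity-check the scale factors without waiting on the pipeline, the query's rescaling can be mirrored in Python. This is just a sketch that applies the same arithmetic as the query; `raw` holds the uint16 payload values plus the epoch `time` from the callback:

```python
from datetime import datetime, timedelta

def rescale(raw, offset_hours=10):
    """Mirror the Stream Analytics query's rescaling for one decoded frame.

    offset_hours defaults to 10 to match the DATEADD(hour, +10, ...) above.
    """
    return {
        "temperature": raw["temp"] / 100.0,   # CAST(temp as float) / 100.0
        "pressure": raw["pressure"] * 3.0,    # CAST(pressure as float) * 3.0
        "photo": raw["photo"] / 1000.0,
        "accx": raw["AccX"] / 250.0,
        "accy": raw["AccY"] / 250.0,
        "accz": raw["AccZ"] / 250.0,
        "realtime": datetime(1970, 1, 1)
        + timedelta(seconds=raw["time"], hours=offset_hours),
    }
```

For example, a raw `temp` of 2276 comes out as 22.76 degrees.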
Critically, you must now "Run" the Stream Analytics job, or it won't do anything. Once you click "Run", you can't change any of its configuration until you stop it, and stopping it loses the existing data (in my experience).
Now go and generate some data with your XKit. You can't proceed without at least one data entry, but the more the merrier (you get a pretty boring graph with only one data point).
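If the device isn't transmitting yet, you can also push a synthetic event straight into the EventHub over its REST endpoint to exercise the pipeline. A sketch, again standard library only; `sas_token` is assumed to be a valid token signed with the policy key copied earlier, and the names in the usage note are the example values from this walk-through:

```python
import json
import urllib.request

def build_event_request(namespace, hub, sas_token, payload):
    """Build (but don't send) the POST request for one JSON event."""
    url = "https://{}.servicebus.windows.net/{}/messages".format(namespace, hub)
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        method="POST",
        headers={
            "Authorization": sas_token,  # SAS token for the namespace or hub
            "Content-Type": "application/json",
        },
    )
```

Calling `urllib.request.urlopen(build_event_request("sigfoxhubexp", "demo", sas_token, {"temp": 2276, ...}))` then sends the event; a 201 response means it landed.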
PowerBI Configuration

Finally, it's time to log into PowerBI and create our dashboard.
Once logged into PowerBI, expand "My Workspace" and under Datasets you should see "sigfoxDemo" (the dataset name we used in the Stream Analytics output configuration); click on it.
Now we can craft the dashboard. I'll walk through a simple temperature chart; the rest is up to you.
Click on the line graph icon to create a new graph, drag "realtime" to the "Axis" property of the graph, and "temperature" to "Values". Resize the chart a little and you should have something like this.
So, just to go over what we've done: we used the Xkit, from Thinxtra, to transmit information over the Sigfox network, which was then sent to an Azure EventHub. The EventHub feeds a Stream Analytics job, which rescales the data and sends it to a PowerBI dataset. Finally, PowerBI uses the data to create our dashboard.
As always, there are many ways of doing this. Thinxtra have a tutorial doing mostly the same thing, except that they use the IoT Hub, which I might have used myself except that I don't have sufficient permissions in the Azure account I was using.
Until next time,
Happy Hacking