Air Quality Monitoring Delivery

I created this solution as part of the Azure Blogathon 2022 event (Phase 3). For more information about the competition, please visit https://azureblogathon.com/

We know that air pollution is intensifying almost everywhere, yet we ignore this fact on the assumption that we are immune to it. Scientists have recently shown just how badly it can affect us: the health impacts extend well beyond respiratory illness. Research has linked rising air pollution to impaired foetal development, poorer mental health, and metabolic syndrome.

According to Dr. Sarath Guttikunda's article, “A general understanding is that an ambient monitoring station can represent an area covering 2 km radius, which translates to 15 sq.km (rounded off).” But the Chennai Metropolitan Area is spread across 1,189 sq km (planned to expand up to 8,878 sq km), while it had only 8 air quality monitoring stations as of 23rd January 2023. It is evident that we don't have enough stations to cover the entire area.

The proposed solution is to mount air quality sensors on BOV garbage collectors and on food-delivery and cab fleets (Ola, Uber, Swiggy, Zomato, etc.) to detect PM 2.5 and PM 10 concentrations in the air and visualize them as a live heat map. This is more effective than a fixed ambient monitoring station, as it shows exactly where air pollution is most intense on a street-by-street basis.


Before proceeding with the solution, a public survey was conducted among people who work for Uber, Ola, Zomato, Swiggy, etc.

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/o7veaglg77ii5xqwxc84.png



  • IoT devices push telemetry to the cloud gateway (IoT Hub).
  • The message routing feature of IoT Hub lets us forward telemetry to different store paths. In the proposed architecture, Azure Cosmos DB serves as the hot store path. Azure Cosmos DB is ideal for IoT workloads because it can ingest high volumes of data at low latency and high rates. IoT Hub can also filter which messages are pushed into Azure Cosmos DB.
  • The Hybrid Transactional and Analytical Processing (HTAP) capability of Azure Synapse Link for Azure Cosmos DB replicates data into an analytical store that is highly efficient for analytical queries.
  • Power BI visualizes the data on an Azure Maps layer. A heat map is used in the proposed architecture to better depict our scenario.
  • The Azure Cosmos DB change feed triggers an Azure Function.
  • The change feed trigger publishes the message to an Azure Web PubSub service.
  • As the presentation layer, a web app integrates Azure Maps and, via WebSockets, receives near real-time telemetry updates for visualization.
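To make the pipeline concrete, here is a hypothetical sample of a single telemetry document as a device would publish it (the field names match the device script shown later in this post; the values are made up):

```python
import json

# Message template used by the device script: PM readings plus GPS position.
MSG_PAYLOAD = '{{"pm_25": {pm_25},"pm_10": {pm_10},"lat": {lat},"lng": {lng}}}'

# Fill in example readings and parse the result back as JSON.
payload = MSG_PAYLOAD.format(pm_25=38.4, pm_10=61.2, lat=13.1365, lng=80.1684)
doc = json.loads(payload)
print(doc)
```

This is the document shape that Cosmos DB ingests and the map layer later reads (`pm_25`, `pm_10`, `lat`, `lng`).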

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q9d2mw0ob553wgjxuhic.png

  • I used a USB 2.0 to TTL UART serial converter module to connect the PM sensor to the Raspberry Pi; it plugs straight into a USB port.
  • The GPS module doesn't come with soldered header pins, so we need to solder a 4-pin berg strip male connector onto the GPS module before connecting it to the Raspberry Pi.
      a. Jumper wires are required to connect the Neo6M GPS module to the Raspberry Pi. Follow the pin-out diagram below to establish the connection.
      b. Optionally, you can skip the jumper wires if you have a spare USB 2.0 to TTL UART serial converter.
  • Set up the Raspberry Pi by following the official documentation here.
  • After the setup is done, run raspi-config from a terminal, or go to Preferences -> Raspberry Pi Configuration.
  • Enable the Serial Port interface.

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/c0ibzbds5yagpe3zzayn.png


  • An Azure subscription is required; you can create a free account here.
  • Create an IoT Hub by clicking the + Create a resource button or using the search bar.
  • Choose a subscription and create a resource group “rg-raspberrypi” if you haven't created one earlier. Provide an IoT hub name and choose the region closest to you.
  • For the tier, I started with the Standard tier, but the Free tier should be more than enough for testing and evaluating AQMD.
  • The daily message limit can be adjusted on higher tiers; on the Free tier we should go with the default quota of 8,000 messages per day.
  • Proceed to the Networking step and accept the defaults for now, then continue to the Management step, where you can optionally change the permission model and assign yourself the Contributor role.
  • Proceed to the final step and click the Create button.
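The same provisioning can be sketched with the Azure CLI; the hub name and region below are assumptions, so substitute your own:

```shell
# Create the resource group used throughout this post.
az group create --name rg-raspberrypi --location southindia

# Create the IoT Hub; the Free tier (F1) requires 2 partitions.
az iot hub create --name iot-raspberrypi \
  --resource-group rg-raspberrypi \
  --sku F1 --partition-count 2
```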

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/d366a7mdickaxd0z8583.png https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3dxtnoxqxol7kdcpyqwj.png

  • Under Device management in the left pane, click Devices, then select the Add Device button.
  • In the Create device pane, enter the device ID (Ola, Uber, Zepto, BOV, etc.) from which we will be collecting telemetry.
  • Leave the default authentication type and use auto-generated keys.
  • Once you click the Save button, the device is created along with an auto-generated connection string for you to connect with.
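Equivalently, the device identity can be registered from the Azure CLI (the hub name is an assumption; the azure-iot extension is required):

```shell
# One-time install of the IoT extension.
az extension add --name azure-iot

# Register a device identity with auto-generated symmetric keys.
az iot hub device-identity create --hub-name iot-raspberrypi --device-id ola

# Retrieve the device connection string used later by the Raspberry Pi script.
az iot hub device-identity connection-string show --hub-name iot-raspberrypi --device-id ola
```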

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/km5t6msew3xvm5yq7u25.png

  • For the hot store path we will use Cosmos DB. Before this step, make sure to create a Cosmos DB account in serverless mode; follow the steps here to create it.
  • Name the account cosmos-raspberrypi, the database airqualitymonitoringdelivery, and the collection telemetry, with synthetic_key as the partition key.
  • Now all telemetry from IoT Hub should be ingested into this hot store path. To do that, we create a message route that forwards all the data to Cosmos DB.
  • The first step is to create a custom endpoint, in our case a Cosmos DB endpoint (at the time of writing this blog, the feature is still in Preview).

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1xr0jfw3xezwsr62wwj1.png

  • Provide a name for the Cosmos DB endpoint.
  • Choose the Cosmos DB account and the collection into which the data should be ingested.
  • Choose a synthetic partition key for the messages.
  • Provide the partition key template.
  • Leave the default authentication type.
  • Click the Create button.

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/oqfmrtueq9626etotsxl.png

  • Click Add routes.
  • Provide a name for the route.
  • Select the Cosmos DB endpoint created in the previous step.
  • Choose Device Telemetry Messages as the data source.
  • Select Enable route.
  • Optionally, you can modify the routing query to push only specific messages to Cosmos DB; in our case it is set to true (push all messages to Cosmos DB).
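If you wanted to route only high readings, the routing query could filter on the message body. This works because the device script sets the content type to application/json and the encoding to utf-8; the threshold below is just an illustrative assumption:

```
$body.pm_25 > 50
```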

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/73abpwu400jspa0qentc.png


  • Now that data is being ingested into Azure Cosmos DB, the next step is to visualize and analyze it to extract meaningful insights from the telemetry.
  • Azure Synapse Link provides a cloud-native hybrid transactional and analytical processing (HTAP) capability that enables near real-time analytics over operational data in Azure Cosmos DB.
  • Under Integrations in the Azure Cosmos DB account, select Azure Synapse Link and click the Enable button.
  • Once Azure Synapse Link is enabled, make sure to choose the collection if it has already been created, and create a workspace on the next screen.

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wdiv09o3d2gemfpr6f40.png

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/eun8q3919cv82tx2cyyx.png

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/x2qwbry072lit7xymcbj.png

  • Enter the Azure Synapse Analytics workspace.
  • Select the database “airqualitymonitoringdelivery” in the top pane and proceed to create a view.
  • Create a view named “aggregatetelemetry” that aggregates the data by rounding latitude and longitude to 3 decimal places, to avoid discrepancies when visualizing the data as a heat map.
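Such a view can be defined with serverless SQL's OPENROWSET over the Cosmos DB analytical store. A sketch, assuming the account and collection names from earlier (the key placeholder and column types are assumptions; adjust to your schema):

```sql
CREATE VIEW aggregatetelemetry AS
SELECT
    ROUND(lat, 3) AS lat,
    ROUND(lng, 3) AS lng,
    AVG(pm_25)    AS pm_25,
    AVG(pm_10)    AS pm_10
FROM OPENROWSET(
    'CosmosDB',
    'Account=cosmos-raspberrypi;Database=airqualitymonitoringdelivery;Key=<account-key>',
    telemetry
) WITH (pm_25 float, pm_10 float, lat float, lng float) AS t
GROUP BY ROUND(lat, 3), ROUND(lng, 3);
```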

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9k1qpccoqa2px9hfwafe.png


  • Sign up for a free Power BI account here if you don't have one.
  • You can optionally download the Power BI Desktop tool from here, or use app.powerbi.com.
  • Select Import data from SQL Server.
  • Provide the connection details (the serverless SQL endpoint) in the Server text box and choose DirectQuery as the data connectivity mode.
  • Click the OK button.
  • On the next screen, sign in if prompted and then click the Connect button.
  • Now choose the view created in the earlier step and click the Load button.

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dv8s3357yxvynwxymjo6.png

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fokxpomn410w9mudhe0w.png

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q5uvvtrqvqbapn3u8oq9.png


#!/usr/bin/python
# -*- coding: UTF-8 -*-

# Requires pyserial, pynmea2 and azure-iot-device. Install via:
# pip install pyserial pynmea2 azure-iot-device

from __future__ import print_function
from serial import Serial, EIGHTBITS, STOPBITS_ONE, PARITY_NONE
import time, struct
import uuid

import pynmea2

from azure.iot.device import IoTHubDeviceClient, Message

port = "/dev/ttyUSB0" # Serial port of the SDS011 PM sensor.
baudrate = 9600

gpsPort = "/dev/ttyAMA0" # Serial port of the Neo6M GPS module.
gpsBaudrate = 9600


device_client = IoTHubDeviceClient.create_from_connection_string("YOUR_IOTHUB_CONNECTION_STRING")

device_client.connect()

MSG_PAYLOAD = '{{"pm_25": {pm_25},"pm_10": {pm_10},"lat": {lat},"lng": {lng}}}'

# Prepare serial connections; open the GPS port once, not on every reading.
ser = Serial(port, baudrate=baudrate, bytesize=EIGHTBITS, parity=PARITY_NONE, stopbits=STOPBITS_ONE)
ser.flushInput()
gpsSer = Serial(gpsPort, gpsBaudrate, timeout=0.5)

# SDS011 packet framing bytes.
HEADER_BYTE = b"\xAA"
COMMANDER_BYTE = b"\xC0"
TAIL_BYTE = b"\xAB"

byte, previousbyte = b"\x00", b"\x00"

while True:
    previousbyte = byte
    byte = ser.read(size=1)

    # We got a valid packet header.
    if previousbyte == HEADER_BYTE and byte == COMMANDER_BYTE:
        packet = ser.read(size=8) # Read the remaining 8 bytes.

        # Decode the packet - little endian: 2 shorts for PM2.5 and PM10,
        # 2 ID bytes, a checksum byte and the tail byte.
        readings = struct.unpack('<HHcccc', packet)

        # Measurements (the sensor reports tenths of µg/m³).
        pm_25 = readings[0] / 10.0
        pm_10 = readings[1] / 10.0

        # Latest NMEA sentence from the GPS module.
        newdata = gpsSer.readline().decode('utf-8', errors='ignore')
        lat = 0
        lng = 0
        print(newdata)

        # Sensor ID.
        id = packet[4:6]

        # Verify the checksum: low 8 bits of the sum of the data bytes.
        checksum = readings[4][0]
        calculated_checksum = sum(packet[:6]) & 0xFF
        checksum_verified = (calculated_checksum == checksum)

        # Message tail.
        tail = readings[5]
        if tail == TAIL_BYTE and checksum_verified:
            # Only send once we have a position fix from a $GPRMC sentence
            # (some modules emit $GNRMC instead; adjust if needed).
            if newdata[0:6] == "$GPRMC":
                newmsg = pynmea2.parse(newdata)
                lat = newmsg.latitude
                lng = newmsg.longitude
                payload = MSG_PAYLOAD.format(pm_25=pm_25, pm_10=pm_10, lat=lat, lng=lng)
                msg = Message(payload)
                msg.message_id = str(uuid.uuid4())
                msg.content_encoding = "utf-8"
                msg.custom_properties["deviceId"] = "ola"
                msg.content_type = "application/json"
                print("sending message: {}".format(msg))
                device_client.send_message(msg)
                time.sleep(2)
                print("PM 2.5:", pm_25, "µg/m³  PM 10:", pm_10, "µg/m³  ID:", bytes(id).hex())
            

  • An Azure subscription is required; you can create a free account here.
  • Create a Web PubSub instance by clicking the + Create a resource button or using the search bar.
  • Choose a subscription and create a resource group “rg-raspberrypi” if you haven't created one earlier.
  • Provide the resource name and choose the region.
  • For the tier I chose the Free tier, which should be more than enough for testing and evaluating AQMD.
  • Proceed to the final step and click the Create button.
  • After the resource is created, add a hub (AQMD) to the instance.
  • Copy the connection string from the Keys section.
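The steps above can also be sketched with the Azure CLI; the resource name below is an assumption, and the webpubsub extension must be installed:

```shell
# One-time install of the Web PubSub extension.
az extension add --name webpubsub

# Create a Free-tier Web PubSub instance.
az webpubsub create --name wps-raspberrypi \
  --resource-group rg-raspberrypi --sku Free_F1

# Retrieve the connection string used by the Azure Function.
az webpubsub key show --name wps-raspberrypi \
  --resource-group rg-raspberrypi --query primaryConnectionString
```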

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xoa3s9gzqbey587seml3.png

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7fgv9i6n4mdkjxpg7x1r.png


public class TelemetryTrigger
{
    private readonly IConfiguration _configuration;
    public TelemetryTrigger(IConfiguration configuration)
    {
        _configuration = configuration;
    }

    [FunctionName("TelemetryToPubSub")]
    public async Task TelemetryToPubSub([CosmosDBTrigger(
        databaseName: "airqualitymonitoringdelivery",
        collectionName: "telemetry",
        ConnectionStringSetting = "cosmosconnection",
        LeaseCollectionName = "leases",
        CreateLeaseCollectionIfNotExists = true)]IReadOnlyList<Document> input,
        ILogger log)
    {
        if (input != null && input.Count > 0)
        {
            log.LogInformation("Documents modified " + input.Count);
            log.LogInformation("First document Id " + input[0].Id);

            var serviceClient = new WebPubSubServiceClient(_configuration["WebPubSubConnectionString"], _configuration["HubName"]);
            await serviceClient.SendToAllAsync(JsonConvert.SerializeObject(input));
        }
    }
}
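The trigger above reads its settings from configuration; when running locally, that means a local.settings.json along these lines (all values below are placeholders, and the setting names match the ones referenced in the code):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "cosmosconnection": "AccountEndpoint=https://cosmos-raspberrypi.documents.azure.com:443/;AccountKey=<key>;",
    "WebPubSubConnectionString": "<webpubsub-connection-string>",
    "HubName": "AQMD"
  }
}
```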
public class Connection
{
    private readonly ILogger<Connection> _logger;

    public Connection(ILogger<Connection> log)
    {
        _logger = log;
    }

    [FunctionName("GetConnectionUrl")]
    [OpenApiOperation(operationId: "Run", tags: new[] { "name" })]
    [OpenApiParameter(name: "name", In = ParameterLocation.Query, Required = true, Type = typeof(string), Description = "The **Name** parameter")]
    [OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "text/plain", bodyType: typeof(string), Description = "The OK response")]
    public async Task<WebPubSubConnection> GetConnectionUrl(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = "negotiate")] HttpRequest req,
        [WebPubSubConnection(Hub = "AQMD", UserId = "{query.userid}")] WebPubSubConnection connection)
    {
        Console.WriteLine("login");
        return await Task.FromResult(connection);
    }
}

<!DOCTYPE html>
<html>
<head>
    <title>AQMD</title>

    <meta charset="utf-8" />
    <meta http-equiv="x-ua-compatible" content="IE=Edge" />
    <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no" />
    <!-- Add references to the Azure Maps Map control JavaScript and CSS files. -->
    <link rel="stylesheet" href="https://atlas.microsoft.com/sdk/javascript/mapcontrol/2/atlas.min.css" type="text/css" />
    <script src="https://atlas.microsoft.com/sdk/javascript/mapcontrol/2/atlas.min.js"></script>
    <script src="https://cdn.jsdelivr.net/npm/@aspnet/[email protected]/dist/browser/signalr.js"></script>
    <script src="https://cdn.jsdelivr.net/npm/[email protected]/dist/axios.min.js"></script>

    <script type='text/javascript'>

    function GetConnectionInfo() {
    return axios.get('https://func-raspberrypi.azurewebsites.net/api/negotiate?userid=test')
        .then(function (response) {
            return response.data
        }).catch(console.error)
}
GetConnectionInfo().then(info  =>{
        info.accessToken = info.accessToken || info.accessKey;
        info.url = info.url || info.endpoint;
        const options = {
            accessTokenFactory: () => info.accessToken
        };
        const connection = new WebSocket(info.url);

        connection.onopen = () => console.log('connected');
        connection.onmessage= (event) => newMessage(event);
        connection.onclose= () => console.log('disconnected');
        
    });

    let counter = 0;
    let datasource;
    let aqmd = [];
    function newMessage(message) {
        var parseData= JSON.parse(message.data);
        console.log(parseData);
        var newAqmdPin = new atlas.Shape(new atlas.data.Point([parseData[0].Body.lng, parseData[0].Body.lat]), parseData[0].id);
        newAqmdPin.addProperty('pm_25', parseData[0].Body["pm_25"]);
        newAqmdPin.addProperty('pm_10', parseData[0].Body["pm_10"]);
        aqmd[parseData[0].id] = newAqmdPin;
        datasource.setShapes(Object.values(aqmd));
    }

    function generateUser() {
            return 'User ' + Math.random().toString(36).replace(/[^a-z]+/g, '').substr(0, 5);
        }

    var map;
    function GetMap() {
            //Initialize a map instance.
            map = new atlas.Map('myMap', {
                center:  [80.1683535,13.136500833333333],
                style: 'light',
                zoom: 8,
                //Add your Azure Maps subscription key to the map SDK. Get an Azure Maps key at https://azure.com/maps
                authOptions: {
                    authType: 'subscriptionKey',
                    subscriptionKey: 'YOUR_SUBSCRIPTION_KEY'
                }
            });
            //Wait until the map resources are ready.
            map.events.add('ready', function () {
                datasource = new atlas.source.DataSource();
                map.sources.add(datasource);
                //Create a symbol layer using the data source and add it to the map
                map.layers.add(
                    new atlas.layer.SymbolLayer(datasource, null, {
                        iconOptions: {
                        ignorePlacement: true,
                        allowOverlap: true,
                        size:0.5
            },
                        textOptions: {
                            textField: ['concat', ['to-string', ['get', 'pm_25']], ',', ['get', 'pm_10']],
                            color: 'orange'
                        }
                        

                    })
                );  
            });
        }
    </script>
</head>
<body onload="GetMap()">
    <div id="myMap" style="position:relative;width:100%;min-width:290px;height:600px;"></div>
</body>
</html>

Buying a case for the Raspberry Pi is optional. In my case, I bought one that holds both the touch screen and the Raspberry Pi. My wife and I decided to take it for a ride in and around our residence (Ambattur Estate). The outcome of the ride can be seen in the next slide, and it feels accurate: it matches what we had already sensed while crossing those streets!

https://dev-to-uploads.s3.amazonaws.com/uploads/articles/o28dxn602tq652w9agsz.png


https://dev-to-uploads.s3.amazonaws.com/uploads/articles/b4r0h21ahyroq49n7lcs.png


https://github.com/Cloud-Jas/AQMD


  • LoRaWAN could make this an even more efficient solution.
  • Support from the public and the government is needed to implement this solution.