Registering New Sensors

Imagine a case where a brand new sensor is introduced into MINTS. Let's say this sensor is called ISG001 and has the following fields.

from collections import OrderedDict

ISG001 = OrderedDict([
    ("dateTime", str(dateTime)),  # dateTime is set by the reader script
    ("temperature", None),
    ("pressure", None),
    ("humidity", None),
    ("feelsLike", None)
])
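For reference, here is a minimal sketch of how a reader script might fill this dictionary and serialize it as JSON before publishing. The field names match the definition above; the `build_isg001_record` helper and the sample readings are illustrative placeholders, not part of the actual reader.

```python
import json
from collections import OrderedDict
from datetime import datetime, timezone

def build_isg001_record(temperature, pressure, humidity, feels_like):
    """Assemble an ISG001 record in the field order shown above.

    A real reader script would obtain these values from the sensor
    driver; here they are passed in directly.
    """
    return OrderedDict([
        ("dateTime", str(datetime.now(timezone.utc))),
        ("temperature", temperature),
        ("pressure", pressure),
        ("humidity", humidity),
        ("feelsLike", feels_like),
    ])

record = build_isg001_record(23.5, 1013.2, 41.0, 24.1)
payload = json.dumps(record)  # JSON string ready to publish over MQTT
```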

First, check that this sensor ID is not already taken using this file.

  1. Create a copy of bme280Reader.py on your node and name it isg001Reader.py:
cp bme280Reader.py isg001Reader.py
  2. Modify isg001Reader.py so that it reads a sensor named ISG001 with the dictionary suggested above, then run it:
python3 isg001Reader.py

At this point, check whether the new sensor ID is visible in InfluxDB.

The sensor data won't be available in InfluxDB yet, since the sensor itself is not registered in the workflow. The following steps explain how to register it.

  1. Open the Node-RED interface and navigate to the central node tab.
(screenshot)
  2. Copy an existing MQTT node + JSON node connection and paste it onto the flow canvas.
(screenshot)
  3. Double-click the duplicated MQTT node and add the ID of the new sensor.
(screenshot)
  4. Link the new sensor to the "parse for Mints Data" function and hit Deploy.
(screenshot)
  5. Export the current Node-RED instance.
(screenshots) Make sure to select **all flows** and hit Download.
  6. Upload the downloaded file to the Node-RED Docker folder of the Air Quality Analysis Workflows repository.
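After the new flow is in place, you can sanity-check it by publishing a single test reading over MQTT and watching it arrive in Node-RED. This is a sketch only: the broker hostname and the topic layout (`MINTS/<nodeID>/<sensorID>`) are assumptions for illustration, so substitute the values your deployment actually uses.

```python
# Sketch: publish one test reading for the new ISG001 sensor over MQTT.
# The topic convention ("MINTS/<nodeID>/<sensorID>") and the broker host
# below are assumed placeholders; replace them with your deployment's values.
import json

def sensor_topic(node_id, sensor_id, prefix="MINTS"):
    """Build the MQTT topic for a sensor (assumed topic convention)."""
    return f"{prefix}/{node_id}/{sensor_id}"

def publish_test_reading(broker_host, node_id, record):
    # paho-mqtt 1.x style client; pip install paho-mqtt
    import paho.mqtt.client as mqtt
    client = mqtt.Client()
    client.connect(broker_host, 1883)
    client.publish(sensor_topic(node_id, "ISG001"), json.dumps(record))
    client.disconnect()

if __name__ == "__main__":
    # "broker.example.edu" and "yourNodeID" are placeholders.
    publish_test_reading("broker.example.edu", "yourNodeID",
                         {"dateTime": "2024-01-01 00:00:00", "temperature": 23.5})
```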

Now update the Node-RED Docker container, as we did in EX02.

Step 2: Log In to the Cloud Server

Access the MDASH server from your PC using SSH:

ssh jxw190004@mdash.circ.utdallas.edu

⚠️ You must already have SSH access configured to MDASH.


Step 3: Update the Air Quality Analysis Workflow Repository

After logging in, navigate to the repository and pull the latest updates:

cd AirQualityAnalysisWorkflows/
git pull

Step 4: Restart the Node-RED Instance

Navigate to the influxdb folder and list the running containers:

cd influxdb
podman container ls

You should see output similar to this:

CONTAINER ID  IMAGE                            COMMAND  CREATED       STATUS           PORTS                                                                   NAMES
c3692f3ca93e  k8s.gcr.io/pause:3.2                      3 years ago   Up 3 years ago   0.0.0.0:1880->1880/tcp, 0.0.0.0:3000->3000/tcp, 0.0.0.0:8086->8086/tcp  06360efbd5a9-infra
cc581152869a  localhost/mints-grafana:latest            4 months ago  Up 4 months ago  0.0.0.0:1880->1880/tcp, 0.0.0.0:3000->3000/tcp, 0.0.0.0:8086->8086/tcp  influxdb_grafana_1
edd6ff8a8e49  localhost/mints-influxdb:latest  influxd  4 months ago  Up 4 months ago  0.0.0.0:1880->1880/tcp, 0.0.0.0:3000->3000/tcp, 0.0.0.0:8086->8086/tcp  influxdb_influxdb_1
33bad6576ec5  localhost/mints-nodered:latest            2 months ago  Up 2 months ago  0.0.0.0:1880->1880/tcp, 0.0.0.0:3000->3000/tcp, 0.0.0.0:8086->8086/tcp  influxdb_nodered_1

Locate the Node-RED container ID (e.g., 33bad6576ec5).
Stop and remove only the Node-RED container — do not remove the InfluxDB container.

podman stop 33bad6576ec5
podman rm 33bad6576ec5

Now, rebuild and restart the stopped containers:

podman-compose up --build -d

Step 5: Exit MDASH

Once the workflow is rebuilt and running, exit the server:

exit

Step 6: Verify Node Data

Finally, rerun the isg001Reader.py script on your test node and confirm that data from the new sensor ID is appearing in InfluxDB.
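One way to verify from your PC is to query InfluxDB for recent ISG001 points. The port (8086) matches the container listing above; the database name here is an assumption, so use whatever database your workflow writes to.

```python
# Sketch: check that ISG001 points are arriving in InfluxDB.
# Port 8086 matches the container listing above; the host and database
# name ("mintsData") are assumptions for illustration.

def recent_points_query(measurement, minutes=10):
    """InfluxQL query for points written in the last few minutes."""
    return f'SELECT * FROM "{measurement}" WHERE time > now() - {minutes}m'

def check_isg001(host="mdash.circ.utdallas.edu", database="mintsData"):
    from influxdb import InfluxDBClient  # pip install influxdb
    client = InfluxDBClient(host=host, port=8086, database=database)
    result = client.query(recent_points_query("ISG001"))
    return list(result.get_points())  # non-empty list => data is arriving
```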