
Greenery Scanner Platform (version "Nautilus")

Nautilus

A drive-by sensing platform for capturing RGN, LWIR, air quality, IMU and GPS data.

The thermal sensing capability (LWIR) mentioned in the remainder of this document is optional, currently in progress, and untested.


%%{
  init: {
    'theme': 'dark'
  }
}%%

flowchart LR

    subgraph Environment
        subgraph Phenomena
            things([Trees & Surfaces])
            air([Atmosphere])
        end

        subgraph Vehicle
            subgraph Device
                sensing([Sensing])-->pre(Pre-processing)
                positioning([Positioning])-->pre            
            end

            subgraph Users
                drive(Driving)
                command(Controlling)
            end
            
            command--App-->Device
        end
    end

    subgraph Software
        post(Post-processing)-->evaluation(Evaluation)
        external(External Data)-->evaluation
    end

    things--RGN---->sensing
    things--LWIR---->sensing
    air--AQ---->sensing

    subgraph Data
        RGN
        Thermal
        Numerical
    end

    pre-->Data-->post
    


Table Of Contents:

  1. Acquisition Instructions
    1.1. Device
    1.2. App
    1.3. Data
    1.4. Processing
  2. Technical Information
    2.1. Mechanics
    2.2. Electronics
    2.3. Fabrication
    2.4. Firmware
    2.5. Software
  3. Organization
    3.1. This Repository
    3.2. Raw Data
    3.3. Pre-Processed Data
    3.4. Post-Processed Data

1. Acquisition Instructions

The Greenery Scanner is composed of a vehicle-mountable device, the Greenery Scanner Sensing Node (GSSN), and an AI-assisted software pipeline for post-processing data.

The list below gives step-by-step instructions for its operation.

1.1. GSSN Device

photo

Start by making sure all physical components and accessories of the GSSN are available:

  • Device;
  • Lens Cover (transparent);
  • Calibration Plate(s);
  • Battery(ies);
  • Charger;
  • Outlet Adapter.

🔋 Battery + Charger

First, charge the battery by latching it onto the charger. While charging is in progress, the charger LED will be solid 🟥 red; it will stay this way until the battery is fully charged, when it becomes solid 🟩 green.

Full charge autonomy: around 2 hours of capturing, or 3 hours in standby.

Note

In case there is an issue with the battery or the charger, the light will blink 🟥🟩 red/green intermittently. Check that the battery is properly secured.

🔌 Outlet Adapter

Alternatively, it is recommended to use the wired outlet adapter, which provides unlimited autonomy, for critical or demanding tasks such as transferring data, testing and debugging. In such situations, always use the adapter instead of the battery to avoid data loss.

Caution

Never insert the outlet adapter into the battery charger.

⚡️ Turning it On

Begin by latching the battery or the outlet adapter in the Device's bottom compartment. Then, press the adjacent power button.

The fans will start spinning and, after a few moments, the button light will turn blue.

💡 Status light

The power button has an LED ring for feedback with four distinct states:

  • 🔵 blue – in standby; it has booted correctly, but there is no GPS signal or another condition prevents data capture;
  • 🟢 green – ready for capturing data with a good GPS signal;
  • 🔴 red – busy initializing components or processing files; most App actions will be unavailable;
  • 🔘 faint green – ready to remove the battery, or it did not boot correctly.

📷 Lens Protection

Place the GSSN with its feet on a flat surface and remove the lens protection. Do not touch the glass, but if it ever needs cleaning, please use a clean microfiber cloth.

Important

Whenever the Device is not in use the lens cap must be mounted.

Be careful not to bump the lens; it is delicate and easily scratched.

βš–οΈ Calibration Plate

Without touching the felt surface, snap one calibration plate around the lens and make sure it is secured magnetically. It will serve as a visual red, green and near-infrared ground truth during the capture.

Important

If the felt pads get dirty or damaged, they might need hyperspectral recalibration.

🚗 Mounting

Magnetic Feet (default)

Three magnetic feet are screwed into the Device base by default so it can be placed directly on top of a vehicle roof (which must be ferromagnetic).

ARCA-SWISS

Besides the three standard 1/4" holes (for the magnetic feet), there are two more with the correct dimensions to attach this 120mm ARCA-SWISS mount.

Since the ARCA-SWISS is a mounting standard, any other compatible accessories might be used to interface with other mounting solutions like tripods.

Below is a quick-release plate with the 120mm ARCA-SWISS above, used on a custom bike:

Tip

Depending on the mounting plate, the magnetic feet might need to be removed by unscrewing the magnets then disassembling the bottom piece and taking the feet off.

Elevated Mount

Depending on the vehicle and the data to be captured, the Device might need to be held in a higher position. It is unlikely that it needs to be more than 50cm above the vehicle roof, but it is recommended to run capture tests to check for any undesirable obstruction before the actual capture.

Mounts like this one are an option, though they should be used with caution since they have not been tested.

Caution

First, make sure the mount can withstand the device's weight. Follow all guidance from its manufacturer.

For proper mounting, there must be a way to keep the device from unscrewing with the vehicle vibration/rotation. Using the two-screw ARCA-SWISS adapter (above) can help with that.

Bike Handle Mount

For small and short tests a simple bike mount can be used.

The downside is that the rider will always be visible, interfering with the images.

Caution

Make sure both the mount and Device are very well secured: the Device is heavy.

Monitor it constantly to make sure it is not unscrewing or losing support over time.

✈️ Transportation

Box Package

Since the Device and its accessories are both delicate (especially the fisheye lens) and heavy, it is highly recommended to use self-expanding foam packets to ensure all parts are kept tight but cushioned inside the shipping box.

All parts will fit an 11"x11"x11" standard FedEx box, with three self-expanding packets, in the following order (top to bottom):

  • Box top
  • Foam;
  • Device;
  • Foam;
  • Accessories;
  • Foam;
  • Box bottom.

Pelican Case

If the Device must be carried around frequently, it is recommended to transport it in a dedicated Pelican case with custom-cut foam to accommodate all parts.

1.2. App

The GSSN is exclusively controlled through the App, which can be accessed on a mobile device (smartphone or notebook) while capturing, or on a desktop computer to retrieve data later.

📱 Accessing the App

Connect to the Nautilus Wi-Fi network on your mobile phone, notebook or computer.

The App is a webpage served by the Device using its own network. Internet access is needed only to preview or upload files.

To open it, use a web browser (Chrome preferred) and access https://10.42.0.1:3000.

Tip

In case the Nautilus Wi-Fi network is not available, check that the Device is turned on and working properly. The power button light must be 🔵 blue.

If the page stays on "Waiting for device" indefinitely, make sure the button light underneath is 🔵 blue or 🟢 green.

πŸŽ›οΈ Device Status

Inside the App, the "Device" module constantly streams feedback on the Device's vitals such as GPS signal quality, battery charge, storage availability, and internet connection.

Additionally, right below there is real-time information on sensors: IMU, ambient temperature, relative humidity, barometric pressure, particulate matter (PM1.0, 2.5, 4.0 and 10.0), VOC and NOx index points.

The timestamp in the module's footer indicates the last time information was received.

Tip

If the latest status does not update, try refreshing the page.

🌀 Help

If in doubt about how to operate the device, scroll down to "Utilities" and click on Help. This button leads to a web document with instructions (requires mobile internet access).

🌛 Reboot and Shutdown

The reboot and shutdown functions must happen exclusively via App commands.

Go to "Utilities" then press Reboot or Shutdown.

Rebooting the device will interrupt the "Nautilus" Wi-Fi, so make sure to connect to it again afterwards.

Caution

Specifically for the shutdown, make sure to remove the battery only after the Power LED becomes 🔘 faint green, otherwise the firmware and data might get corrupted.

🔭 Focusing Lens

Before starting to capture data, it is important to make sure the RGN camera is focused.

To do that, first walk with the Device to an outdoor area where trees are at least 10 meters away. In the App, scroll down to the "Utilities" tab and click on the "Adjust Focus" button.

A modal window will open inside the App and the GSSN will send a new photo every 15 seconds. After the initial beep, wait for the first photo to update, then turn the lens a tiny bit (clockwise or counter-clockwise), wait for the next photo and correct if needed. Repeat until satisfied.

Note

Please be patient; the slow time between refreshes is due to photos needing to be transferred between internal storage devices.

🚗 Capturing Data

In order to capture data, the GSSN must have reasonable GPS signal, enough battery and storage available. When one or more of these criteria are not satisfied, their respective symbols in the status bar will become red and the Capture will not be available.

If everything is set, start capturing by scrolling to the "Capture Data" section and clicking on Start Capture. The status right above will change accordingly, displaying how many samples have already been taken.

In parallel, the available storage shown in the "Device" module indicates how many captures, and consequently how much running time, are still available.

🛑 Stopping the Capture

To stop a capture, press Stop & Save Capture. Please be patient as the device will undergo a critical routine to wrap up data in the correct internal folders.

For data integrity reasons, the GSSN will automatically stop the capture if the battery goes below 25% or critical storage is being reached. This will allow it to wrap up the session accordingly.

Caution

If the device is switched off while capturing, or while stopping a capture and during the busy state that follows, data might be lost or spread across different folders. This will require manually gathering the data and removing incomplete folders from storage before the Device can run again.

🔎 Checking Data

Captured packages are named after the time they were started, following a YYYY-MM-DD_hh-mm-ss timestamp format.

In the App, go to the "Data Packages" section to view the scrollable list. Beside each package's name, a number in parentheses (NNN) indicates how many samples are stored inside.

πŸ—‘οΈ Deleting Data

To free up space, click on the X button located to the left of each package. Alternatively, all packages can be deleted at once by pressing Delete All Packages.

Caution

For any of the methods above, there is no way to undo. Act carefully.

📌 Previewing Data


Whenever the Device is connected to the internet (via RJ-45 cable), a preview button will be available to the right side of every package.

By clicking it, it is possible to see a map with all its captured data as dots. The following data is visible on a popup by clicking each plotted dot:

  • IMU;
  • Temperature, Humidity, Pressure;
  • Particulate Matter (1.0, 2.5, 4.0, 10.0), VOC, NOx;
  • Timestamp.

📂 Transferring Data

All data is stored in the external drive 0000-0001. It is always mounted and available as long as the Device is not capturing data or busy.

Currently, there are three ways to extract data:

A. Google Drive upload (slow, online):

  1. Make sure the Device is connected to the internet (RJ-45);
  2. Remotely access the Device using the Raspberry Pi Connect service;
  3. From inside the Device, open the Chromium browser;
  4. Upload the desired package to the folder;
  5. Access it from another device.

B. FTP transfer (fast, offline):

Using FileZilla, an open-source SFTP client:

  1. Make sure the device you will be accessing from is connected to the "Nautilus" Wi-Fi network;
  2. Enter the connection details:
    • Host: 10.42.0.1;
    • Username: TODO;
    • Password: TODO;
    • Port: TODO;
  3. Press Quickconnect;
  4. Go to the folder /media/nautilus/0000-0001;
  5. Copy the /data folder to your computer;
  6. Wait for the file transfer to finish.

C. Physical (slow, offline) – not recommended:

  1. Make sure the Device is off;
  2. Disassemble it and remove the MicroSD card from the MAPIR camera (which is attached to the Top piece);
  3. Transfer the files to a computer;
  4. Important: Do not format the card or it might not be readable by the MAPIR camera;
  5. Put the card back to the camera and reassemble the Device.

1.3. Data

📦 Packages

As a result of every capture session, Data Packages are created in the 0000-0001/data/ folder, with the following structure:

Sample of a Data Package

./timestamp/* # YYYY-MM-DD_hh-mm-ss timestamp
  β”œβ”€ rgn/
  β”‚  β”œβ”€ timestamp-001.jpg # incorrect timestamp (disregard)
  β”‚  β”œβ”€ timestamp-002.RAW # incorrect timestamp (disregard)
  β”‚  └─ ...
  β”œβ”€ thermal/
  β”‚  β”œβ”€ TH_timestamp.npy # epoch timestamp
  β”‚  β”œβ”€ ...
  β”‚  └─ image/
  β”‚     β”œβ”€ TH_timestamp.png # epoch timestamp
  β”‚     └─ ...
  └─ data.csv

🕥 Time Data

The timestamp is determined by the exact time the capture is started.

To ease readability of the Data Package folder, the timestamp is written as YYYY-MM-DD_hh-mm-ss.

For all other data (thermal and numerical), the same timestamp is stored as a Unix timestamp (epoch) for easier data handling.
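
For illustration, a minimal sketch (assuming the folder names follow exactly the format above and are expressed in UTC) of converting between the package folder name and the epoch timestamp:

from datetime import datetime, timezone

def folder_name_to_epoch(name: str) -> float:
    """Convert a package folder name (YYYY-MM-DD_hh-mm-ss, UTC) to a Unix epoch timestamp."""
    dt = datetime.strptime(name, "%Y-%m-%d_%H-%M-%S").replace(tzinfo=timezone.utc)
    return dt.timestamp()

def epoch_to_folder_name(epoch: float) -> str:
    """Convert a Unix epoch timestamp back to the package folder name format."""
    return datetime.fromtimestamp(epoch, tz=timezone.utc).strftime("%Y-%m-%d_%H-%M-%S")

print(folder_name_to_epoch("2023-01-01_00-03-55"))  # 1672531435.0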

πŸ‘οΈ Visual Data

RGN Images

These files contain Red, Green and Near-Infrared channels in a 4000 by 3000 pixel image.

The /rgn folder contains odd/even file pairs (.jpg & .RAW) of the same image.

For instance, this is a pair of the same image:

  • 2023_0101_000355_001.RAW
  • 2023_0101_000356_002.JPG

The .RAW contains the pixel information, while the .jpg holds the metadata and eases previewing.

The MAPIR software combines both into a .TIFF image. See Software for more information.

Important

These files are named with an incorrect timestamp, automatically written by the MAPIR camera because its internal clock is reset every time the device turns off, so please disregard it for timing purposes.

Thermal Images (THIS FUNCTIONALITY IS CURRENTLY WORK IN PROGRESS AND UNTESTED)

The /thermal folder has raw .npy files, saved directly from the OpenMV camera Serial output.

Their resolution is 120x160 pixels in portrait format, chosen to maximize the frustum intersection with the RGN camera. Every pixel stores the absolute radiometric temperature observed by the FLIR sensor in degrees Celsius.

Thermal Preview

To ease visualization, corresponding .png images are created inside the thermal/image folder, using the perceptually-uniform inferno sequential colormap provided by matplotlib.


The temperature scale (above) ranges from 0ºC (left, black) to 80ºC (right, yellow) on an absolute measurement scale.

This was designed to keep images consistent across all thermal cameras over time.
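
As an illustration, below is a minimal sketch of how such a preview could be rendered from a raw frame, assuming the .npy file stores a 120x160 array of temperatures in ºC; the file names are placeholders:

import numpy as np
import matplotlib.pyplot as plt

# Load the raw radiometric frame (temperatures in degrees Celsius).
frame = np.load("TH_1672531435.npy")  # placeholder filename

# Render with the inferno colormap on the fixed absolute scale: 0 C (black) to 80 C (yellow).
plt.imsave("image/TH_1672531435.png", frame, cmap="inferno", vmin=0, vmax=80)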

Overlaying Images

The RGN and thermal imaging outputs can be combined into a multilayered panoramic image.

The image below shows the frustums for both cameras: thermal (up to 6x) in purple and RGN in pink. The vehicle is a Jeep Renegade, and the Device's bottom is positioned 50cm above the roof.

TODO: overlaid data.

More info in the Software section.

3️⃣ Numeric Data

There are four types of data stored inside the data.csv file:

  • temporal: when;
  • spatial: where and with which orientation;
  • atmospheric: air characteristics;
  • device: readings that can be used to compensate for offsets caused by the device's own thermo-electric characteristics (noise).

The file contains the following payload for every line:

| # | Element | Type | Purpose | Source |
|---|---------|------|---------|--------|
| 0 | timestamp [s] | temporal | Combined epoch timestamp (UTC) | u-blox MAX-M10S |
| 1 | GPS UTC date [ddmmyy] | temporal | Human-readable date (UTC) | u-blox MAX-M10S |
| 2 | GPS UTC time [hhmmss:sss] | temporal | Human-readable time (UTC) | u-blox MAX-M10S |
| 3 | LAT | spatial | Latitude in decimal minutes | u-blox MAX-M10S |
| 4 | LAT hemi | spatial | North or south hemisphere | u-blox MAX-M10S |
| 5 | LON | spatial | Longitude in decimal minutes | u-blox MAX-M10S |
| 6 | LON hemi | spatial | West or east hemisphere | u-blox MAX-M10S |
| 7 | altitude | spatial | Altitude | u-blox MAX-M10S |
| 8 | HDOP | spatial | GPS precision factor | u-blox MAX-M10S |
| 9 | quat i | spatial | Quaternion rotation component (i) | CEVA BNO085 |
| 10 | quat j | spatial | Quaternion rotation component (j) | CEVA BNO085 |
| 11 | quat k | spatial | Quaternion rotation component (k) | CEVA BNO085 |
| 12 | quat r | spatial | Quaternion rotation component (r) | CEVA BNO085 |
| 13 | amb temp [C] | atmospheric | Ambient temperature | Bosch BME280 |
| 14 | humidity [%] | atmospheric | Relative humidity | Bosch BME280 |
| 15 | pressure [mb] | atmospheric | Barometric pressure | Bosch BME280 |
| 16 | pm1p0 | atmospheric | Particulate Matter (1 μm) | Sensirion SEN-55 |
| 17 | pm2p5 | atmospheric | Particulate Matter (2.5 μm) | Sensirion SEN-55 |
| 18 | pm4p0 | atmospheric | Particulate Matter (4 μm) | Sensirion SEN-55 |
| 19 | pm10p0 | atmospheric | Particulate Matter (10 μm) | Sensirion SEN-55 |
| 20 | voc | atmospheric | Volatile Organic Compounds | Sensirion SEN-55 |
| 21 | nox | atmospheric | Nitrogen Oxides | Sensirion SEN-55 |
| 22 | bat voltage [V] | device | Compensate sensing | Texas Instruments INA219 |
| 23 | bat current [mA] | device | Compensate sensing | Texas Instruments INA219 |
| 24 | proc temp. [C] | device | Compensate sensing | Raspberry Pi CPU |
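
As an illustration, the sketch below loads data.csv with pandas and converts the NMEA-style latitude/longitude (degrees plus decimal minutes) into decimal degrees. The column names used are assumptions based on the table above, so check the actual CSV header before relying on them:

import pandas as pd

df = pd.read_csv("data.csv")  # first line is the header

def nmea_to_decimal(value: float, hemisphere: str) -> float:
    """Convert ddmm.mmmm (degrees + decimal minutes) to signed decimal degrees."""
    degrees = int(value // 100)
    minutes = value - degrees * 100
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal

# Column names below are assumed; adjust them to the actual header in data.csv.
df["lat_dd"] = [nmea_to_decimal(v, h) for v, h in zip(df["LAT"], df["LAT hemi"])]
df["lon_dd"] = [nmea_to_decimal(v, h) for v, h in zip(df["LON"], df["LON hemi"])]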

Extra info

Pairing

To match all data together, follow the files' numeric sequence.

Example for a data package with 35 captures:

  • 1x numerical file (.csv) with 36 lines (1st for the header)
  • 70x RGN images (35x .jpg + 35x .RAW)
  • 70x thermal images (35x .png + 35x .npy)

1.4. Processing

Refer to the Software section for more details on the data processing pipeline.


2. Technical Information

2.1. Mechanics

Overview

The purpose of the device enclosure is to aggregate different cameras, sensors, a processing unit and power management into a sturdy and reliable package. Its shape is primarily defined by the panoramic arrangement of cameras, which enables a unique spatial drive-by sensing capability. It is also influenced by the placement of the other components, ensuring airflow both for sensing and for dissipating the heat from internal components.

There are up to seven cameras onboard, and the enclosure is divided into seven subsystems: top, ring, core (processor/battery/air sensing), bottom, and the removable battery (with its charger).

Detail

Top



On the top piece, the main custom MAPIR RGN camera is mounted facing upwards, parallel to the ground, so its 225º Field-of-View fisheye lens can capture all of the surroundings at once. Since this lens is custom-made and not the camera's standard one, sensor cropping occurs. Therefore, the MAPIR mount is oriented transversally (X) to the driving axis (Y) to make sure the areas where subjects are captured with the best resolution are not left out of frame.

On the outside there is a magnetic cradle for a calibration plate, which holds different grayscale felt samples responsible for the spectral calibration of the final RGN image. More information about it in the Software section.

Thermal Module(s) (THIS FUNCTIONALITY IS CURRENTLY WORK IN PROGRESS AND UNTESTED)



An intermediary outer ring holds up to six radial, horizontally-facing thermal cameras, which define the hexagonal symmetry of the device. This exact number and arrangement were determined by the need to overlap the Field-of-View of each of the 95º FoV FLIR Lepton thermal cameras to compose a thermal panoramic layer on top of the MAPIR RGN image, with their stitching eased by physically controlling the angles and positional offsets between the images.

Mechanically, each camera is mounted on its respective OpenMV board, which is then screwed into its own 3D-printed module. These modules are assembled onto the ring chassis and pressed against rubber O-rings and germanium lenses to form a barrier against the external weather.

Important

Notice that the hexagonal modular shape allows using only one camera pointing to either side of the road if a minimal setup is desired. In this case, install "blind" modules in place of the unused "thermal" ones.

Ring



The ring also serves as the main structural component of the device, to which all other external parts are attached. Its three internal core modules, however, simply slide in and are locked in place by the other parts. The battery holder sits in the center, adjacent to the processing unit on one side and the air sensing on the other. They are placed in opposite directions so the processor heat does not interfere with the sensors, and each has its own intake to ensure fresh airflow through its air tunnel.

Duct (processing)



Since processing can be intensive due to the number of concurrent tasks the device has to execute, the processor air tunnel has two 40x40mm fans to help cool the Raspberry Pi 4: one blows fresh air onto the processor, and the other pulls it outside.

Duct (air sensing)



In the other tunnel are the air quality sensors: the BME280 (ambient temperature, humidity and barometric pressure) and the Sensirion SEN-55 (particulate matter 10, 4, 2.5 and 1, VOC and NOx), which has a dedicated 3D-printed support to secure it in position while ensuring its sampled air is not recirculated.

Power + I/O



The last core module sits right in the middle and serves as a compartment for the battery, while offering cradles for electronic components such as the GPS, IMU and battery sensors. It also provides an I/O port with the illuminated power button and an RJ-45 port to connect the device to the internet.

Bottom



The bottom piece encloses the device and allows different types of mounting: with three 1/4" magnets that serve to deploy directly on a car roof or any other ferromagnetic surface, or by using a 120mm ARCA SWISS standard camera mount.

Also, it provides the opening and latching mechanism for the battery module, which is inserted from the bottom and secured by rotating it to the locking position. The same opening provides access to the I/O ports and enough space for the user to grab the device and carry it around.

Battery



To ensure the device is able to function during extended trips and minimize down time, a modular battery package was developed. Its 3D printed enclosure wraps the 7.2V Li-Ion pack that has its leads exposed through two flat metallic plates.

Charger



The charger wraps the battery manufacturer's original charger, while using the same mounting as the device. It works on 110-240V, 50-60Hz and has a modular mains cable that can be swapped for models serving other regions (US is the default provided with the kit).

Outlet Adapter

This adapter is useful to keep the Device powered indefinitely while plugged into a wall outlet. The electrical connection is the same as the battery's, but it is fed by an AC-to-DC power supply.

Caution

Do not place the outlet adapter into the battery charger; there is a high risk of disastrous consequences.

2.2. Electronics

The platform is equipped with multiple sensors capable of simultaneously collecting various types of environmental data, each tagged with GPS coordinates.


This section provides a detailed description of the hardware components used for data collection, their interconnections, and communication methods.

Connections Overview


%%{
  init: {
    'theme': 'dark'
  }
}%%

flowchart LR

    openMV--USB (UART)-->Raspi

    adapterSen55{{**Adapter:**<br>SEN-55 to Qwiic<br>*Adafruit*}}<--Qwiic<br>(I2C 0x81)-->hub
        
        subgraph Raspberry Pi
            Raspi[[**Processor:**</br>Raspberry Pi 4</br>8GB RAM]]
            sdCardRaspi
            firmwareRaspi
            Heatsink
            fan1(2x 40mm fans)
        end

        subgraph HATS     
            wittyPi
            shield{{**Shield:**<br>Pin Breakout<br>Qwiic HAT}}
        end
        
        subgraph I/O
            ethernet
            button
        end

        subgraph *POWER*
            ina219
            power
            battery
            outlet
        end        
        
    firmwareRaspi((**Firmware:**<br>Main))-->sdCardRaspi[(**MicroSD Card:**<br>32GB<br>Sandisk)]-->Raspi
    shield<-->wittyPi{{**Shield:**<br>Power Management<br>WittyPi 4}}<-->Raspi

    ethernet(**Ethernet:**<br>RJ-45)--RJ-45-->Raspi
    button(**Illuminated Button:**<br>RGB, N.O. Momentary)--R to D16-->shield
    button--G to D12-->shield
    button--B to D6-->shield
    button--PLUS to 3V3-->shield
    button--Switch 1st to SW-->wittyPi
    button--Switch 2nd to GND-->wittyPi     

    fan1--PIN: 5v-->wittyPi
    Heatsink-->Raspi
    
    shield--D21 to P7--->openMV

    subgraph *SENSORS*
        hub{{**Qwiic Hub:**<br>5x I/O<br>*Adafruit*}}
        GPS([**GPS:**<br>Advanced GPS V2<br>*MAPIR*])--UART<br>TX to RX-->shield
        bme280([**Temp, Hum. and Press:**<br>Bosch BME280<br>*Adafruit*])<--Qwiic<br>(I2C 0x77)-->hub
        sen55([**Particulate Matter:**<br>SEN-55<br>*Sensirion*])--JST GH-->adapterSen55
        bno08x([**IMU:**<br>CEVA BNO085<br>*Adafruit*])<--Qwiic<br>(I2C 0x4A)-->hub    
    end

    subgraph *THERMAL: up to 6x*        
        FLIR([**Thermal Sensor:**</br>Lepton 3.1R<br>*FLIR*])--SPI-->FLIRAdapter{{**Adapter:**<br>FLIR to OpenMV<br>*OpenMV*}}--SPI-->openMV[[**Thermal Processor:**<br>Open MV Cam H7 R1]]
        openMV<-->sdCardOpenMV[(**MicroSD Card:**<br>>512mb<br>*Sandisk*)]-->firmwareOpenMV((**Firmware:**<br>OpenMV))
    end

    subgraph *RGN*
        MAPIR<-->sdCardMAPIR[(**MicroSD Card:**<br>up to 512GB)]
        MAPIR--HDMI to D20-->shield
        Fisheye(**Lens:**<br>225º Fisheye)-->MAPIR([**RGN Sensor:**<br> MAPIR Survey 3 RGN])--USB (UART)-->Raspi
    end

    ina219([**Battery level:**<br> INA219])--Qwiic-Qwiic<br>(I2C 0x40)-->hub
    power{**Power Input:**<br>6 to 30 VDC}--( + )-->ina219--( + )-->wittyPi
    power--( - )-->wittyPi
    battery(**Battery:**<br>Li-ion 7.2VDC 7000Ah)
    battery-.->power   
    outlet(**Outlet Adapter:**<br>110-240VAC to 12VDC)   
    outlet-.->power
   

Shield Assembly: the Qwiic HAT goes on the Witty Pi 4 module, which is then mounted on the Raspberry Pi 4.

Qwiic HAT pins:

  • D20: trigger MAPIR camera (white wire);
  • D21: trigger OpenMV camera(s) (daisy chain from D21 to P7s);
  • D16: LED red;
  • D12: LED green;
  • D6: LED blue;
  • 3V3: LED common anode;
  • 5V: to GPS 5V;
  • GND: to GPS GND;
  • RX: to GPS TX;

WittyPi 4 pins (soldering required):

  • SW: to switch (1st terminal);
  • GND: switch (2nd terminal) + (2x) fan GND ;
  • VOUT: to (2x) fan 5V;

Component Highlights

Raspberry Pi 4B

The Raspberry Pi, running a Linux-based operating system (Raspbian), is the core computational module, able to communicate with different sensors using several interfaces such as I2C, serial and SPI. Moreover, its computational power is sufficient to handle basic data processing at the edge.


Witty PI 4 Shield

This shield provides simple power management for the entire device, making it possible to turn the device on with a push button.


RGN MAPIR Survey 3 Camera

The MAPIR camera is a multispectral camera which, depending on the camera firmware and lens filter, can acquire different spectral bands in the visible and near-infrared portion of the light spectrum. In this specific application the RED (660nm), GREEN (540nm), NIR (850nm) filter has been chosen in order to calculate the Normalized Difference Vegetation Index (NDVI), for which both the RED and NIR channels are required.


225ΒΊ Fisheye Lens

The fisheye lens is a custom lens from MAPIR and provides a 360º by 225º Field of View, increasing the captured picture area as much as possible.

From a data point of view, the camera is used in RAW+JPG mode, as this is required by the calibrated-image pipeline. The trigger cable is attached to the Raspberry Pi in order to control and synchronize the acquisition with all the other sensors. The images are then stored on the MAPIR SD card, which can be accessed by activating/deactivating the memory access following the PWM trigger rules.


MAPIR Advanced GPS V2 module

This module continually streams GPS data and it is connected to the Serial pin (RX) of the Raspberry.


BME280 Ambient Temperature, Humidity and pressure sensor

This MEMS sensor is a digital sensor controlled via the I2C communication protocol. It collects ambient temperature, humidity and pressure data.


Sensirion SEN55

The Sensirion sensor is used to capture:

  • particulate matter in the air: PM1.0, PM2.5, PM4.0 and PM10;
  • specific gasses: VOCs and NOx;
  • internal temperature and humidity (not used).

CEVA BNO085

This 9-axis IMU contains a sensor-fusion processor that provides absolute rotation values, delivered as quaternions.


OpenMV H7 R1 camera module + FLIR/OpenMV Adapter + FLIR 3.1R Thermal

The OpenMV camera module is a standalone electronic module which can be programmed in MicroPython to capture sensor data, depending on the module mounted on it. In this specific case it mounts a thermal camera from FLIR (the 3.1R Thermal Camera module). It sends data through the USB serial port to the Raspberry Pi when the trigger signal is received.

These modules are optional and are currently being tested for inclusion in the platform.


AF INA219 DC Current sensor

This digital current/voltage sensor is used to measure the battery voltage, and it communicates with the Raspberry Pi over the I2C protocol.


Li-Ion 7.2V 7000mAh 50.4 Wh battery pack


For the rest of components, check the Bill of Materials below.


2.3. Fabrication

Bill Of Materials

Electronics

Type Specification URL Quantity
Processing Raspberry Pi 4 8gb link 1
Cooling Heatsink (Rpi 4) link 1
Cooling 40mmx40mm Fan link 2
Power Witty Pi 4 Shield link 1
Power CR2025 Battery (RTC) link 1
Power Li-Ion 7.2V 7000 mAh 50.4Wh Battery Pack link 2
Power Tenergy TLP-4000 Battery Charger link 1
Power ALITOVE AC 100-240V to DC 12V 10A Adapter link 1
Sensor MAPIR Survey 3 -RGN spec link 1
Sensor MAPIR Advanced V2 GNSS Receiver link 1
Sensor INA219 High Side DC Current Sensor link 1
Sensor BME280 link 1
Sensor BNO085 link 1
Sensor Sensirion SEN55 link 1
Sensor OpenMV Camera H7R1 link 0 to 6 [a]
Sensor FLIR to OpenMV Adapter link 0 to 6 [a]
Sensor FLIR Lepton 3.1R Radiometric 95º FOV link 0 to 6 [a]
Harness MAPIR Trigger Cable link 1
Harness Qwiic SEN-55 Adapter link 1
Harness Battery spring contacts link 2 + 2 [b]
Harness Qwiic 5 Port Hub link 1
Harness Button 16mm, RGB, 3.3-6v link 1
Harness JST XH Extension (male to female) link 1
Harness JST XH Cables (matching pair) link 2
Harness Qwiic I2C Cables link 6
Harness USB-A - microUSB Cables, Down-angled, 0.2m link 1 to 6 [a]
Harness Dupont Cable, Female to Female link 1 to 6 [a]
Harness RJ-45 extension, Male-Female Panel Mount link 1
Harness 2.54 Male Connector (straight) link 1
Harness Qwiic HAT for Raspberry Pi link 1
Storage MicroSD Card - 32gb, for Rpi 4 + OpenMV(s) link 2 to 7 [a]
Storage MicroSD Card - 256gb, for MAPIR link 1

Mechanics

Type Specification URL Quantity
Structure Polymaker PLA PRO 1.75mm White 3D print filament link 1.5 kg
Optics MAPIR FishEye Lens custom order 1
Optics Germanium Glass ø 25.4mm x 2mm link 1 to 6 [a]
Optics Polyester Felt: Black, Dark Gray, Light Gray and White link < 1 [d]
Sealing O-ring Oil-Resistant Buna-N 3/32 FW Dash 116 link 7
Sealing O-ring Stock Oil-Resistant Soft Buna-N 3/32 FW link 1 meter
Fixation Neodymium Magnets ø 10x2mm (package with 50pcs) link 3 + 3 [b]
Fixation Epoxy glue link < 1 [d]
Fixation Teflon tape link < 1 [d]
Fixation Thumb Screw Camera 1/4" link 1
Fixation Screw M2x12, Stainless, Socket Head link 2 + 12 [a]
Fixation Screw M2x16, Stainless, Socket Head link 22
Fixation Screw M3X12, Stainless, Socket Head link 36 + 4 + 4 [c]
Fixation Threaded Insert M3, Brass, Tapered link 34 + 4 + 4 [c]
Fixation Threaded Insert M2, Brass, Flanged link 22
Fixation Threaded Insert 1/4", Brass link 5
Mounting Rigwheels - Long screw link 3
Mounting ARCA SWISS Adapter, 120mm, 2 screws link 1
Mounting Camera Screw 1/4" x 20mm link 2
Mounting ARCA SWISS Head, 90mm, with level link 1
Mounting Elevated Car mount (untested) link 1
Mounting Security straps link 3
Observations:
  • [a] Depending on how many thermal cameras will be mounted;
  • [b] X + Y, where X is for the device and Y is for each battery pack (or outlet adapter);
  • [c] X + Y + Z, where X is for the device, Y for the battery charger and Z for each battery pack (or outlet adapter);
  • [d] Less than one pack per device.

Fabrication

3D Printing

Part Filename Observations Quantity
Device calibration_panel.stl 1
Device top.stl use supports 1
Device ring.stl 1
Device module_blind.stl 1 to 6
Device module_thermal.stl 1 to 6
Device duct_processor_lower.stl use supports 1
Device duct_processor_upper.stl use supports 1
Device duct_sensor_lower.stl 1
Device duct_sensor_upper.stl 1
Device compartment.stl 1
Device bottom_foot.stl 3
Device bottom.stl 1
Battery, Outlet battery_cap.stl 1 [a]
Battery, Outlet battery_main.stl 1 [a]
Battery, Outlet battery_support.stl 1 [a]
Outlet battery_outlet_adapter.stl 2 [b]
Charger charger_lower.stl 1
Charger charger_upper.stl 1
Observations:
  • [a] Depending on how many batteries (or outlet adapters) will be made;
  • [b] For each outlet adapter, 2;
Information:
  • Material: PLA;
  • Layer height: 0.28mm (or less);
  • Infill: 15%;
  • Wall loops: 5 (min. 2mm);
  • Top shell layers: 5 (min. 1.4mm);
  • Bottom shell layers: 5 (min. 1.4mm).

Laser Cutting

Part Filename Observations Quantity
Device calibration_felt_template.svg 3
Information
  • Material: polyester felt;
  • Speed: by machine;
  • Power: by machine;

Caution

Never use wool felt in a laser cutter; it will catch fire.

Assembly Schemas

Calibration Pad


%%{
  init: {
    'theme': 'dark'
  }
}%%

flowchart BT

calibration(**calibration_panel.stl**)
calibration<--Glue-->black("Black Felt (3x)")
calibration<--Glue-->dark("Dark Felt (3x)")
calibration<--Glue-->light("Light Felt (3x)")
calibration<--Glue-->white("White Felt (3x)")
mag2("Magnet (3x)")<--Glue-->calibration


Device


%%{
  init: {
    'theme': 'dark'
  }
}%%

flowchart BT

top(**top.stl**)

ring(**ring.stl**)
thermal("**module_thermal.stl** (6x)")
processing("**duct_processor_lower.stl**")
processing2("**duct_processor_upper.stl**")
air("**duct_sensor_lower.stl**")
air2("**duct_sensor_upper.stl**")
compartment("**compartment.stl**")

thermal--M3 insert (18x)<br>M3x12 Screw (18x)-->ring
oring3("O-ring Torus (6x)")
lens("Germanium Lens (6x)")

mapir(MAPIR)
mapir--1/4'' Camera Screw-->top
fisheye(Fisheye)

oring1("O-ring Torus")
oring1<-.->fisheye
oring1-->top
fisheye--Teflon tape-->mapir
thermal-->lens-->oring3-.->ring

openmv("OpenMV (6x)")--M2x12 Screw (6x)-->thermal

air-->ring
air2-->sen55("SEN-55")-->air
bme280("bme280")--M2 Insert (2x)<br>M2x6 Screw (2x)-->air
hub("Qwiic Hub")--M2 Insert (4x)<br>M2x6 Screw (4x)-->air
adapt("SEN adapter")--M2 Insert (4x)<br>M2x6 Screw (4x)-->air

compartment-->ring
leaddevice("Lead (2x)")-->compartment
processing-->ring
processing2-->processing

button("RGB Button")-->compartment
gps("GPS")--Hellermann (2x)-->compartment
imu("bno085")--M2 Insert (4x)<br>M2x6 Screw (4x)-->compartment
ina("ina219")--M2 Insert (4x)<br>M2x6 Screw (4x)-->compartment
rj45("RJ-45")--M3x12 Screw (2x)-->compartment


bottom("**bottom.stl**")

oring2("O-ring Cord")

ring-->oring2<-..->top

ring--M3 insert (6x)<br>M3x12mm Screw (6x)-->top
ring

top<--Glue-->mag1("Magnet (3x)")
bottom--M3 insert (6x)<br>M3x12mm Screw (6x)-->ring
mount("Magnetic Foot (3x)")--1/4'' Insert (3x)<br>1/4'' Screw-->foot("**bottom_foot.stl** (3x)")-->bottom


Battery


%%{
  init: {
    'theme': 'dark'
  }
}%%

flowchart BT

batmain("**battery_main.stl**")
batcover("**battery_cover.stl**")
batsupport("**battery_support.stl**")
batcell("7.2V Battery")-->leadbat
batmain<-->batcell-->batsupport-->leadbat("Lead (2x)")-->batcover
batcover--M3 insert (2x)<br>M3x12 Screw (2x)-->batmain


Caution

Pay attention to the battery polarity indicated on the 3D Printed pieces. Check twice.


Outlet Adapter


%%{
  init: {
    'theme': 'dark'
  }
}%%

flowchart BT

batmain("**battery_main.stl**")
supply-->adapter
batcover("**battery_cover.stl**")
batsupport("**battery_support.stl**")
supply("AC-DC Adapter")-->leadbat
batmain<-->adapter("**battery_outlet_adapter.stl** (2x)")-->batsupport-->leadbat("Lead (2x)")-->batcover
batcover--M3 insert (2x)<br>M3x12 Screw (2x)-->batmain


Caution

Pay attention to the battery polarity indicated on the 3D Printed pieces. Check twice.


Charger


%%{
  init: {
    'theme': 'dark'
  }
}%%

flowchart BT

chargerlo(**charger_lower.stl**)
chargerup(**charger_upper.stl**)
batcharger("Battery Charger")
chargerlo-->leadcharger("Lead (2x)")-->batcharger

batcharger-->chargerlo
batcharger-->chargerup
chargerlo--M3 insert (4x)<br>M3x12 Screw (4x)-->chargerup


Caution

Pay attention to the battery polarity indicated on the 3D Printed pieces. Check twice.


2.4. Firmware

The data acquisition firmware is developed entirely in Python and implemented as a multithreaded application, handling both backend and frontend operations.

Below is an overview of the firmware's architecture and functionality:


%%{
  init: {
    'theme': 'dark'
  }
}%%

flowchart LR
    subgraph SENSORS
        subgraph I2C
            bme280([bme280])
            sen5x([sen5x])
            bno08x([bno08x])
            ina219([ina219])
        end

        subgraph Serial
            MAPIR([mapir])
            openMV([openMV])
            gps([gps])
        end
    end

    subgraph FRONTEND
        ui((User Interface))
        10.42.0.1:3004
        10.42.0.1:3005
    end

    subgraph BACKEND
        subgraph Data
            gps-->gpsDeque>gpsDeque]-->dataCollect
            openMV<-->openMVDeque>openMVDeque]-->dataCollect
            bme280-->bme280Deque>bme280Deque]-->dataCollect
            sen5x-->sensirionDeque>sensirionDeque]-->dataCollect
            bno08x-->bno08xDeque>bno08xDeque]-->dataCollect
        end

        bme280-->bme280DequeFrontend>bme280DequeFrontend]-->webSocketSend
        sen5x-->sensirionDequeFrontend>sensirionDequeFrontend]-->webSocketSend
        bno08x-->bno08xDequeFrontend>bno08xDequeFrontend]-->webSocketSend

        webSocketSend{{webSocketSend}}-.->10.42.0.1:3004{{10.42.0.1:3004}}
        10.42.0.1:3005{{10.42.0.1:3005}}-.-> webSocketReceive{{webSocketReceive}}
        
        10.42.0.1:3005<-->ui        
        10.42.0.1:3004<-->ui
        webSocketReceive-->systemStatus[\systemStatus/]-->...
        webSocketReceive-->uploadStatus[\uploadStatus/]-->...
        webSocketReceive-->emptyStorageStatus[\emptyStorageStatus/]-->...

        subgraph Package
            dataCollect[(dataCollect)]-->/timestamp(/timestamp)-->data.csv[[data.csv]]
            /timestamp-->TH_timestamp.npy[[TH_timestamp.npy]]
            /timestamp-->TH_timestamp.png[[TH_timestamp.png]]
            /timestamp-->RAW[[RAW]]
            /timestamp-->JPG[[JPG]]
            MAPIR<-->dataCollect
        end

    end   

Multithreading

Sensor Management

Each sensor operates at a unique sampling frequency and data transmission rate. To manage this, the firmware employs individual threads for each sensor. These threads continuously fetch data from their respective sensors and store the data in a FIFO queue, which is updated in real-time.

Code example in the following Python cells

import threading
import time
from collections import deque

# sensirionDevice is the SEN-55 driver instance, initialized elsewhere in the firmware.

def sensirionDataStream(sensirionDequeBackend, sensirionDequeFrontend):
    print("SEN-55: stream data")
    time.sleep(0.2)
    while True:
        sensirionValues = sensirionDevice.read_measured_values()
        sensirionDequeBackend.append(sensirionValues)
        sensirionDequeFrontend.append(sensirionValues)
        time.sleep(0.1)

# allocate an empty list of threads for the application
applicationThreads = []

# allocate queues
sensirionDequeBackend = deque(maxlen=5)  # FIFO queue containing sensirion data
sensirionDequeFrontend = deque(maxlen=5)  # FIFO queue containing sensirion data

# create the thread
thread = threading.Thread(target=sensirionDataStream, name="sens5x_data_stream", args=(sensirionDequeBackend, sensirionDequeFrontend))

# start the thread
thread.start()

# append the thread to the list of the application running threads
applicationThreads.append(thread)

Data Logging

  • Triggers the MAPIR and OpenMV cameras every 4 seconds (real measured interval is about 5s, limited by the MAPIR cadence);
  • Extracts sensor data from the FIFO queues;
  • Stores all collected data into a CSV file;
  • Manages Frontend I/O with WebSockets.
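
A simplified sketch of this logging loop is shown below; the trigger function and CSV layout are illustrative placeholders rather than the exact firmware implementation:

import csv
import time

CAPTURE_INTERVAL = 4  # seconds between triggers (measured cadence is closer to 5 s)

def trigger_cameras():
    # Placeholder: in the real firmware a GPIO pulse on the shield pins (D20/D21)
    # triggers the MAPIR and OpenMV cameras.
    pass

def data_logging_loop(csv_path, gps_deque, bme280_deque, sensirion_deque, bno08x_deque):
    with open(csv_path, "a", newline="") as f:
        writer = csv.writer(f)
        while True:
            trigger_cameras()
            # Take the most recent value from each FIFO queue (if available).
            row = []
            for dq in (gps_deque, bme280_deque, sensirion_deque, bno08x_deque):
                row.append(dq[-1] if dq else None)
            writer.writerow(row)
            f.flush()
            time.sleep(CAPTURE_INTERVAL)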

Frontend

To manage the user interface, the firmware utilizes WebSocket communication, supported by dedicated threads:

  • Incoming Messages: One thread manages incoming messages from the frontend application, which is built using JavaScript and React.
  • Outgoing Feedback: Another thread sends feedback to the frontend, providing updates on the acquisition status and sensor values which affect the reactive interface.

This multithreaded approach ensures efficient data acquisition and real-time communication with the frontend, maintaining a seamless operation for both sensor management and user interaction.
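
For reference, below is a minimal sketch of the outgoing-feedback side using the Python websockets library; the port matches the diagram above, but the payload format is illustrative and the handler signature may vary with the library version:

import asyncio
import json
import websockets

async def send_feedback(websocket):
    # Periodically push status updates to the connected frontend.
    while True:
        status = {"capturing": False, "samples": 0}  # illustrative payload
        await websocket.send(json.dumps(status))
        await asyncio.sleep(1)

async def main():
    # The frontend connects to this socket (outgoing data is served on port 3004).
    async with websockets.serve(send_feedback, "10.42.0.1", 3004):
        await asyncio.Future()  # run forever

asyncio.run(main())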

The frontend interface is developed with JavaScript and React, and is served by an npm server that runs automatically every time the Device is turned on.

Operating System

The firmware is compatible with the Raspbian 12 (bookworm) operating system.

Remote Access

Raspberry Pi Connect allows easy control from any other internet-connected computer without the need to disassemble the Device.

To configure it, follow these instructions (https://www.raspberrypi.com/documentation/computers/remote-access.html#raspberry-pi-connect) and give it a unique name like Nautilus_X, where X is a unique incremental number.

To connect:

  1. Plug an RJ-45 cable with internet access (preferably a direct connection from a router's WAN port) into the ethernet port of the Device;
  2. Turn the device on;
  3. In the App, check if there is internet connection available;
  4. Go to https://connect.raspberrypi.com/ and check to see if the Device appears;
  5. Click on the correct Device;
  6. Choose to access it either via shell or Graphic User Interface;
  7. A command window will open.

Alternatively, you can also access the Device via a local network with SSH, as in any other Raspberry Pi.

Services

To make sure the Device works automatically on boot, both firmware.service and frontend.service run by default on startup.

Firmware

Runs the nautilus_main.py.

To check if the firmware.service is running:

sudo systemctl status firmware.service 

To edit it, run:

sudo nano /etc/systemd/system/firmware.service  
Click to see the file content.
[Unit]
Description=Firmware Service
After=network.target

[Service]
WorkingDirectory=/home/nautilus/Documents/nautilus/firmware
ExecStart=/home/nautilus/Documents/nautilus/firmware/.venv/bin/python /home/nau$
Restart=always
User=nautilus

[Install]
WantedBy=multi-user.target

Frontend

This service runs the npm run start command directly from the ./firmware/frontend folder, starting a Node server to provide the App.

To check if the frontend.service is running:

sudo systemctl status frontend.service 

To edit it:

sudo nano /etc/systemd/system/frontend.service
Click to see the file content.
[Unit]
Description=Frontend Service
After=network.target

[Service]
WorkingDirectory=/home/nautilus/Documents/nautilus/firmware/frontend
ExecStart=/usr/bin/npm run start
Restart=always
User=nautilus
Environment=PATH=/usr/bin:/usr/local/bin
Environment=NODE_ENV=production

[Install]
WantedBy=multi-user.target

Tip

To temporarily disable the firmware.service, it is easier to rename the nautilus_main.py file and then reboot the device.

You can do the same for the frontend.service by renaming the /frontend folder to something else.

Be careful to rename everything back again, otherwise it will not start automatically the next time.

Configuration

Specific settings in the OS level are needed to support the firmware.

Network Settings

The Device must provide a Wi-Fi Access Point while also being able to connect to the WAN/LAN using the RJ-45 cable.

See the table below for an overview on how to setup both:

| Type | Method | Autoconnect Priority | ID | Password | Purpose |
|------|--------|----------------------|----|----------|---------|
| Cable | RJ-45 | 0 | Wired connection 1 | - | Connect to the internet: remote access, retrieve files |
| Access Point | Wi-Fi | 1 | Nautilus | must be set | Serve the App locally |

The priorities are set so the system will always try to access the internet through the wired connection. That way, internet will also be provided to any device connected to the Nautilus Wi-Fi Access Point.

To configure that you can either use the Raspbian GUI and navigate to the Network Settings in the top bar, or run the command below:

sudo nmtui

Alternatively, it is also possible to edit the configuration files manually, as noted below:

Wi-Fi Access Point

Edit using nano:

sudo nano "/etc/NetworkManager/system-connections/Nautilus.nmconnection"
Click to see the file content.
[connection]
id=Nautilus
uuid=86174878-8a1f-406b-ad8b-e0c3a59c0f23
type=wifi
autoconnect-priority=1
interface-name=wlan0

[wifi]
mode=ap
ssid=Nautilus

[wifi-security]
key-mgmt=wpa-psk
psk=*******

[ipv4]
method=shared

[ipv6]
addr-gen-mode=stable-privacy
method=shared

[proxy]
Wired connection

Edit using nano:

sudo nano "/etc/NetworkManager/system-connections/Wired connection 1.nmconnection"
Click to see the file content.
[connection]
id=Wired connection 1
uuid=bf183fb6-8a11-35ba-8471-fc295f87d160
type=ethernet
interface-name=eth0
timestamp=1724895920

[ethernet]

[ipv4]
method=auto

[ipv6]
addr-gen-mode=stable-privacy
method=auto

[proxy]

Hardware Settings

Most of this configuration relates to the hardware pin setup and the I2C bus speed needed to accommodate the sensors used.

To edit, use nano:

sudo nano /boot/firmware/config.txt
Click to see the file content.
# For more options and information see
# http://rptl.io/configtxt
# Some settings may impact device functionality. See link above for details

# Uncomment some or all of these to enable the optional hardware interfaces
#dtparam=i2c_arm=on
#dtparam=i2s=on
dtparam=spi=on

# Enable audio (loads snd_bcm2835)
dtparam=audio=on

# Additional overlays and parameters are documented
# /boot/firmware/overlays/README

# Automatically load overlays for detected cameras
camera_auto_detect=1

# Automatically load overlays for detected DSI displays
display_auto_detect=1

# Automatically load initramfs files, if found
auto_initramfs=1

# Enable DRM VC4 V3D driver
dtoverlay=vc4-kms-v3d
max_framebuffers=2

# Don't have the firmware create an initial video= setting in cmdline.txt.
# Use the kernel's default instead.
disable_fw_kms_setup=1

# Run in 64-bit mode
arm_64bit=1

# Disable compensation for displays with overscan
disable_overscan=1

# Run as fast as firmware / board allows
arm_boost=1

[cm4]
# Enable host mode on the 2711 built-in XHCI USB controller.
# This line should be removed if the legacy DWC2 controller is required
# (e.g. for USB device mode) or if USB support is not required.
otg_mode=1

[cm5]
dtoverlay=dwc2,dr_mode=host

[all]
#dtoverlay=w1-gpio
dtparam=i2c1=on
dtparam=i2c_arm=on
dtoverlay=miniuart-bt
enable_uart=1
dtparam=i2c_arm_baudrate=400000

# keep LED blue on startup meaning not ready
gpio=6=op,dl

Time settings

In order to keep the device synchronized to the same time provided by the GPS, the timezone must be set to GMT0 (UTC).

To do so, either:

  1. Using the Raspbian GUI, go to Preferences > Raspberry Pi Configuration > Localisation > Timezone
  2. Select the TZif2 locale;
  3. Select the GMT0 timezone.

Or, use the command line:

sudo timedatectl set-timezone GMT0

Virtual Environment

A virtual environment located at firmware/.venv is used to run the Python firmware. It contains all the libraries needed and is what the firmware.service runs from when booting.

To ease replication, the .gitignore file does not exclude its content from the repository (an exception to standard best practice). This ensures no manual installs are needed and keeps the setup stable against dependencies that might change over time.

Cache files, however, are excluded in the .gitignore file.

When adding or removing libraries from the code, it must be done using this procedure.

OpenMV Firmware

Each OpenMV module has a MicroSD card that holds the firmware responsible for receiving the trigger command and sending data back to the Raspberry Pi 4.

A file named main.py is located at the ./firmware/openmv folder and must be saved to the root of each OpenMV's Micro SD.

Updating the Firmware from the Device

To ease firmware development iteration, it is suggested to use Visual Studio Code installed on the Raspberry Pi and log in with a GitHub account.

The logged-in account must have read/write access to this repository. This way, changes can be instantly tested and committed.

2.5. Software

Data Processing

The pipeline can be divided into two phases.


%%{
  init: {
    'theme': 'dark'
  }
}%%

flowchart LR
    subgraph PRE
       pre(Pre Processing)-->data[(Data)]
    end

    subgraph POST
       data--Greenery Studies-->post1(Post Processing)-->results1(Results)
       data--Heat Studies-->post2(Post Processing)-->results2(Results)
       data--...-->post3(Post Processing)-->results3(Results)
    end


The single, common pre-processing pipeline shapes raw information into a processable format. First, it reduces the errors arising from the interaction between the electro-optical components and the observed phenomena, such as vignetting, chromatic aberration, sensor cropping and exposure fluctuation. Finally, it unwraps both RGN and thermal images to the same equirectangular topology.

The post-processing phase is where pre-processed data is interpreted with different intents and applications, such as Greenery applications (segmentation and NDVI extraction).

Pre-processing


%%{
  init: {
    'theme': 'dark'
  }
}%%


flowchart TD
    subgraph Data ["πŸ“¦ Data Package"]
        datargn("**🟒 RGN Data**<br>(.jpg + .RAW)")
        datathermal("**🟒 Thermal Data**<br>(.npy + .png)")
        datacsv("**🟒 Numeric Data**<br>(.csv)")-->interference("**🟒 Interference Offset**<br>(device temperature)")
    end

    datargn-->info

    subgraph MAPIR scripts
        info-.->devignetteMAPIR("**πŸ”΄ Devignette**")-.->whitebalanceMAPIR("**πŸ”΄  White Balance**")-.->calibrationMAPIR("**πŸ”΄ Reflectance Calibration**")
        info("**🟒 Convert (script)**")-->metadataMAPIR(🟒 Add Metadata)-->tiffRGN("**🟒 Converted RGN**<br>(.TIFF)")
    end

    datathermal-->offset
    interference-->offset

    subgraph RGN [πŸ› οΈ RGN Workflow]
        devignette("**🟑 Devignette**<br>(flat field,<br>fisheye lens)")-->defringe("**πŸ”΅ Defringe**<br>(chromatic aberration,<br>fisheye lens)")-->axial("**🟒 Align**<br>(center image axes,<br>fisheye lens)")-->crop("**🟒 Crop**<br>(inscribe circle,<br>fisheye lens)")-->unwrap("**🟒 Unwrap**<br>(polar to equirectangular)")-->calibrate("**🟑 Calibrate**<br>(channels to RGN truth)")
        reference1(**πŸ”΅ Reference**<br>Fisheye<br>Vignette)-->devignette
    end

    subgraph Thermal ["πŸ› οΈ Thermal Workflow"]
        offset("**πŸ”΅ Offset**<br>(thermal reference)")-->devignette2("**πŸ”΅ Devignette**<br>(germanium lens)")-->straighten("**πŸ”΅ Straighten**<br>(remove barrel)")-->stitch("**πŸ”΅ Stitch**<br>(panoramic topology)")
        reference2(**πŸ”΅ Reference**<br>Germanium<br>Vignette)-->devignette2
    end
    
    subgraph Merge [πŸ› οΈ Multispectral Workflow]
        match(Match RGN and Thermal)
    end

    subgraph numeric [πŸ› οΈ Numeric Workflow ]
        quaternion("**πŸ”΄ Quaternion**<br>(i, j, k, r)")-.->euler("**πŸ”΄ Absolute<br>Rotation**<br>(Euler XYZ)")
        GPS(**🟒 GPS**)-->decimal("**🟒 Latitude/Longitude**<br>(Decimal)")-->tangents("**🟑 Heading**<br>(implied from interpolate curve tangents)")
        timestamp("**🟒 timestamp**<br>(UTC)")-->localTime(**πŸ”΅ Local Time**)
        GPS-->localTime
        ready("**🟒 Original Payload**")-->export(**πŸ”΅ export**)
    end

    numeric--->processedCSV("**Processed Numerical CSV**<br>(1x .csv)")

    calibrate--->rgnPano("**Calibrated RGN Image**<br>(1x panorama)")
    calibrate-->Merge
    Merge-->multilayered("**Calibrated Multispectral Image**<br>(1x panorama)")

    straighten-->thermalSingle("**Calibrated Thermal Image**<br>(6x single)")
    stitch-->thermalPano("**Calibrated Thermal Image**<br>(1x panorama)")

    thermalPano-->Merge

    tiffRGN-->devignette

    datacsv--->numeric


Observations:

  • 🔵 to be implemented;
  • 🟢 implemented and working;
  • 🟡 implemented, but should be tested in depth to make sure it is working properly;
  • 🔴 implemented, but should not be used, otherwise it will introduce errors or interference into the process.

πŸ› οΈ RGN Workflow

The files for the steps for this workflow (together with MAPIR scripts) are in ./software/pre-processing/visual/rgn/. Check inside each folder for the scripts (instructions inside).

🟢 Step 1_convert (convert to TIFF)

This first procedure takes a pair of .RAW (pixel information) and .JPG (metadata) and converts them into a single 16-bit .TIF image for better dynamic range.

🔵 Step 2_correct (devignette & defringe) - to do

The devignette procedure should take a flat-field reference image as input to compensate for the darkening toward the image edges caused by the lens's internal absorptions. Read more about vignetting here.

It is important to note that a few flat-field images are available for the standard MAPIR lens, but not for the fisheye currently being used. These should strictly serve as a reference for a proper fisheye flat field to be produced, and should not be used for devignetting.

Then, the defringe procedure should align the R, G and N channels to compensate for chromatic aberration. More info on it here.

🟢 Step 3_unwrap (align, crop, unwrap)

This portion of code is used to convert the fisheye images to a panoramic view.

Start by placing the center of a circle right in the middle of the fisheye circular projection, then match the circle to its radius to make sure the crop area is correct.

The script will automatically unwrap the image to the equirectangular panorama and export the images to the folder pointed by the user.

Follow the instruction inside the code for more detail.
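
As an illustration, a minimal sketch of the polar-to-equirectangular unwrap using OpenCV; the circle center and radius below are placeholders for the values found in the step above:

import cv2
import numpy as np

img = cv2.imread("converted_rgn.tif", cv2.IMREAD_UNCHANGED)  # placeholder filename

# Center and radius of the fisheye circle, found manually as described above (placeholders).
center = (2000.0, 1500.0)
radius = 1450.0

# Unwrap radius/angle into a rectangular grid; width covers the radius, height covers 360 degrees.
pano = cv2.warpPolar(img, (int(radius), int(2 * np.pi * radius)), center, radius,
                     cv2.WARP_POLAR_LINEAR + cv2.INTER_LINEAR)

# Rotate so the panorama reads left-to-right with the horizon roughly horizontal.
pano = cv2.rotate(pano, cv2.ROTATE_90_COUNTERCLOCKWISE)
cv2.imwrite("unwrapped_rgn.tif", pano)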

🟡 Step 4_calibrate (calibrate) - to review

The calibration objective is to balance the R, G and N channels, similarly to a white-balance adjustment. The difference is that, in this case, the R, G and N values must match the hyperspectral references noted inside the code.

For that, the script looks to the lower strip of the unwrapped image that corresponds to the calibration target, and compares the samples to the reference reflectance values on R, G and N channels.

The corrected image should have the correct ratio between these channels, meaning the "white balance" is right. This is very important so that the R and N (from the RGN) follow physically-adjusted conditions. Since the NDVI is computed by relating R and N, if they are not correct the calculation will be completely off, even if all other factors are perfectly fine.
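
Below is a minimal sketch of such a per-channel gain correction, assuming the calibration strip occupies the bottom rows of the unwrapped image and an unsigned-integer (e.g. 16-bit TIFF) image; the reference reflectance values are placeholders, as the real ones are noted inside the script:

import numpy as np

def calibrate_rgn(image: np.ndarray, strip_rows: int, reference: dict) -> np.ndarray:
    """Scale each channel so the calibration strip matches its reference value."""
    strip = image[-strip_rows:, :, :].astype(np.float64)
    calibrated = image.astype(np.float64)
    for channel, ref_value in reference.items():
        measured = strip[:, :, channel].mean()
        gain = ref_value / measured  # simple ratio-based gain per channel
        calibrated[:, :, channel] *= gain
    return np.clip(calibrated, 0, np.iinfo(image.dtype).max).astype(image.dtype)

# Placeholder references per channel index (R=0, G=1, N=2); use the values noted in the script.
reference = {0: 12000.0, 1: 15000.0, 2: 20000.0}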

🔵 Step 5_deflare (de-flare) - to do

Flares are the unwanted internal light reflections and refractions inside the lens, which cause visual artifacts in the image that can block subjects, alter their colors and impact the calibration. Read about it here.

A script must be developed to locate and remove them from the picture or, in extreme cases, discard the whole image.

πŸ› οΈ Thermal Workflow

🔵 Thermal offset - to do

The thermal camera closest to the Raspberry Pi might be contaminated by its heat, so this is worth investigating further in the future.

If needed, the CPU temperature is available in the numerical payload and might be used to offset the thermal camera values.

🔵 Devignette - to do

In a similar fashion to the RGN camera, vignetting also occurs in the thermal images, mainly because of the germanium lenses.

A reference image (flat field) can be taken by aiming at a surface of uniform, known temperature. That image can then be used to compensate for the vignetting.
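
A minimal sketch of such a flat-field correction, assuming a reference frame of a uniform-temperature surface is available; the file names are placeholders:

import numpy as np

frame = np.load("TH_1672531435.npy")            # raw radiometric frame (degrees C), placeholder name
flatfield = np.load("flatfield_germanium.npy")  # frame of a uniform-temperature surface, placeholder name

# The deviation of each pixel from the flat-field mean approximates the lens vignetting offset.
vignette_offset = flatfield - flatfield.mean()
corrected = frame - vignette_offset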

🔵 Straighten - to do

There is a slight barrel distortion in the thermal image that must be compensated for when precision is needed or when composing several images together and reprojecting them into a panorama. Check here for more information.

πŸ› οΈ Multispectral Workflow

🔵 Match RGN and Thermal - to do

To create a multispectral panoramic image with RGN and thermal images:

  1. Obtain the panorama by unwrapping the fisheye image;
  2. Obtain straightened thermal image;
  3. Join all six thermal images with an image stitching method (like cell phone panorama image composition) OR distort a single image to a predicted position within the panoramic topological distortion.

Step 3 might be easier with all six thermal images available, but it should be doable with a single one. Anyway, once this is solved it will be possible to use the RGN layer to segment the thermal layer.

πŸ› οΈ Numerical Workflow

🔵 Timezone - to do

With the latitude and longitude provided precisely from the GPS (1.5 meters of precision) it is possible to derive important information.

Every time value provided is in UTC, but it is easy to convert it to the local timezone by using a library such as TimezoneFinder.

This is useful for use cases that involve local contexts, habits, and routines.
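
A minimal sketch of that conversion using TimezoneFinder together with Python's zoneinfo module; the coordinates and timestamp below are placeholders:

from datetime import datetime, timezone
from zoneinfo import ZoneInfo

from timezonefinder import TimezoneFinder

tf = TimezoneFinder()
lat, lon = 42.3601, -71.0942  # placeholder coordinates (decimal degrees)

tz_name = tf.timezone_at(lat=lat, lng=lon)            # e.g. "America/New_York"
utc_time = datetime.fromtimestamp(1672531435, tz=timezone.utc)
local_time = utc_time.astimezone(ZoneInfo(tz_name))   # same instant, local wall-clock time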

🔵 Heading - to do

Also, by using a sequence of points' latitude and longitude values, it is possible to obtain the heading of the device and establish the directionality of the RGN and thermal images. This allows connecting elements inside the image (pattern recognition) to geospatial elements on a map (external datasets).

This approach has been tested in this Colab Notebook (Python):

  1. Fetch all original points from a data package, and plot them on a map with their latitude and longitude;
  2. Create an interpolated curve through these original points with at least 5x their density (this factor is untested for cost-benefit);
  3. For every original point find the closest one in the interpolated curve, then trace a line between its previous and next points to obtain the tangent angle. This is the implied direction where the car and device were aiming when the picture was taken.

The precision of this method has not been tested, but it should be better than using the 9-axis IMU, which is currently not usable because calibrating it to overcome drift is very complex.
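
Below is a minimal sketch of that tangent-based heading estimate, densifying the track with linear interpolation and taking the angle between each point's neighbours; it is a simplification of the notebook and ignores the metric distortion of latitude/longitude, which is only acceptable for short segments:

import numpy as np

def estimate_headings(lats, lons, density=5):
    """Return a heading (degrees clockwise from north) for each original GPS point."""
    lats, lons = np.asarray(lats, float), np.asarray(lons, float)
    t = np.arange(len(lats))
    t_dense = np.linspace(0, len(lats) - 1, (len(lats) - 1) * density + 1)
    lat_d = np.interp(t_dense, t, lats)
    lon_d = np.interp(t_dense, t, lons)

    headings = []
    for i in t:
        k = np.argmin(np.abs(t_dense - i))        # closest densified point to the original one
        k = np.clip(k, 1, len(t_dense) - 2)
        dlat = lat_d[k + 1] - lat_d[k - 1]        # tangent from the previous/next densified points
        dlon = lon_d[k + 1] - lon_d[k - 1]
        headings.append(np.degrees(np.arctan2(dlon, dlat)) % 360)
    return headings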

Post-processing

The procedures below are application-specific.

Greenery


%%{
  init: {
    'theme': 'dark'
  }
}%%


flowchart TD
    RGN-->Segmentation-->NDVI


NDVI Data Analysis

This section describes the steps to obtain spatialized NDVI.

It is composed of the pre-processing, where the images are corrected and brought into the right format, and the actual processing to extract the NDVI value distribution and plot the average NDVI value per street segment on a map.

Image segmentation

To run the image segmentation and extract the tree foliage to perform the NDVI calculation on, it is necessary to open and run NDVI_calc.py. It fetches the .tif images in "../GS_Data/Panorama_Images" and stores the final masked images in .tif format in "../Image_Processing/Image_Masks". It stores the NDVI color plots in "../Image_Processing/Color_Plots".

The segmentation model used to generate the preliminary tree mask in our pipeline is the Dense Prediction Transformer model developed by Intel Labs1.

Compressed results and NDVI histogram matrix

Running NDVI_calc.py will also return a file compressed_results.csv which is stored in "../Image_Processing/". This file contains each image name with the corresponding average NDVI value, as well as the standard deviation, kurtosis and skewness.
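
For reference, NDVI is computed per pixel as (NIR - RED) / (NIR + RED). Below is a minimal sketch of the masked NDVI statistics, assuming an RGN image with R in channel 0 and NIR in channel 2 and a binary foliage mask; the file names, channel order and the use of the tifffile package are assumptions, and NDVI_calc.py remains the authoritative implementation:

import numpy as np
from scipy.stats import kurtosis, skew
import tifffile

rgn = tifffile.imread("panorama.tif").astype(np.float64)  # placeholder filename
mask = tifffile.imread("tree_mask.tif").astype(bool)      # binary foliage mask, placeholder filename

red, nir = rgn[:, :, 0], rgn[:, :, 2]
ndvi = (nir - red) / (nir + red + 1e-9)  # small epsilon avoids division by zero

values = ndvi[mask]
stats = {
    "mean": values.mean(),
    "std": values.std(),
    "kurtosis": kurtosis(values),
    "skewness": skew(values),
}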

Tree spatialization - to do

By using the previously calculated Heading information:

  1. Fetch all trees from a dataset and plot them into the same map as tree points.
  2. For every original point, determine the surrounding tree points within a radius (that is, trees that should be visually present in that specific image)
  3. Calculate the angle between that original point's device angle and each of the surrounding trees
  4. On that panoramic image, calculate the NDVI by following the already defined workflow
  5. Use tree recognition algorithms to detect trees on the panoramic image, mask the NDVI with it and then extract the average for that tree point by crossing the angles

There might be gaps or inaccuracies that can be improved by repeating the same process on sequential panoramic images (seeing trees from different angles and triangulating them into space), or even different passes from slightly different positions. After the procedure, it is expected that every tree detected on every panoramic image should have a latitude and longitude.

3. Organization

This section gives instructions on how to organize information, from this repository itself to the captured data.

3.1. This Repository

./nautilus/* # root
  β”œβ”€ data/
  β”‚  β”œβ”€ legacy/ # sample data captured with v0.5.0 and earlier
  β”‚  └─ v1/ # sample data captured with v1.0.0 (current)
  β”œβ”€ documentation/ # support files referenced in this README.md file
  β”œβ”€ firmware/ # this folder has everything needed by the Device
  β”‚  β”œβ”€ .venv/ # virtual environment for the firmware, contains libraries
  β”‚  β”œβ”€ frontend/ # workspace containing the React.js App
  β”‚  β”œβ”€ openmv/ # firmware that must be saved inside OpenMV's SD card
  β”‚  β”œβ”€ test/ # simple test scripts for isolated components
  β”‚  β”œβ”€ uwi/ # wizard for configuring Witty Pi 4
  β”‚  β”œβ”€ wittypi/ # configuration scripts and files for the WittyPi 4
  β”‚  └─ nautilus_main.py # firmware that is automatically run
  β”œβ”€ mechanics/ # files for 3d printing
  β”œβ”€ software/ # files for processing data
  β”‚  β”œβ”€ .venv # virtual environment for the software, contains libraries
  β”‚  β”œβ”€ pre-processing # common scripts
  β”‚  β”œβ”€ post-processing # specialized scripts
  β”‚  └─ evaluation # evaluation scripts
  β”œβ”€ ...
  β”œβ”€ package.json # manages the packages (firmware/frontend is a workspace)
  └─ README.md # this file
  

Footnotes

  1. Model architecture can be found in the corresponding publication and implementation can be found in Hugging Face documentation ↩
