A field-deployable multimodal data infrastructure that integrates autonomous sensing, edge computing, and drone-based platforms to address critical data acquisition challenges in animal ecology research.
- Demo Video: System Operations and Autonomous Pipeline Execution
- Parrot Olympe SDK: Drone Control Framework
- Docker Documentation: Container Deployment Guide
- MQTT Protocol: IoT Messaging Standard
- Camera-trap: Edge-based machine learning system for real-time animal detection using motion-triggered image capture
- UAS (Unmanned Aerial System): Autonomous drone platform for aerial video monitoring
- Detection-to-Documentation Pipeline: Automated workflow from ground-level detection to aerial observation
- Edge Computing: Local data processing at the point of collection without cloud dependency
- Multimodal Data: Synchronized datasets combining ground-level imagery with aerial videography
National Science Foundation (NSF)-funded AI Institute for Intelligent Cyberinfrastructure with Computational Learning in the Environment (ICICLE) (OAC 2112606)
This tutorial guides you through deploying the complete Smart Backpack system from camera-trap configuration to autonomous drone operations.
- Raspberry Pi 4B: 64GB Storage, 4GB RAM minimum
- Webcam: USB-compatible camera for animal detection
- WiFi Router: Local network infrastructure
- Power Station: Portable power supply for field deployment
- Parrot ANAFI Drone (optional): For full autonomous pipeline
- Docker Engine (v20.10+)
- Docker Compose (v2.0+)
- Python 3.9+
- Linux-based OS (tested on Ubuntu 20.04+)
- Clone the repository:

```bash
git clone <repository-url>
cd smartfield
```
- Navigate to the camera-trap configuration directory:

```bash
cd ct-config
```

- Edit `cameratrap-config.py` to set your controller IP:

```python
controller_ip = 'http://icicle-ct1.local:8080'
```
- Configure the camera-trap by running each section sequentially:

Health Check:

```python
response = requests.get(f'{controller_ip}/health')
print(response.json())
```

System Startup:

```python
response = requests.post(f'{controller_ip}/startup', json={},
                         headers={'Content-Type': 'application/json'})
```

Configure Detection Parameters:

```python
payload = {
    "gpu": "false",
    "ckn_mqtt_broker": "192.168.0.122",  # Your MQTT broker IP
    "ct_version": "test",
    "mode": "demo",
    "min_seconds_between_images": "5",
    "model": "yolov5nu_ep120_bs32_lr0.001_0cfb1c03.pt",
    "inference_server": "false",
    "detection_thresholds": "{\"animal\": \"0.4\", \"image_store_save_threshold\": \"0\", \"image_store_reduce_save_threshold\": \"0\"}"
}
response = requests.post(f'{controller_ip}/configure', json=payload)
```

Start Detection: Note: run one section at a time by commenting and uncommenting the relevant block and re-running the file; do not run the entire file at once.

```python
response = requests.post(f'{controller_ip}/run')
```
- Run the configuration:

```bash
python3 cameratrap-config.py
```
- Create required directories:

```bash
mkdir -p logs mission AnafiMedia
```
- Configure system settings in `config.toml`:

```bash
nano config.toml
```
IMPORTANT: Update the camera-trap location coordinates precisely. The drone uses these coordinates to navigate to detection sites. Incorrect values will cause the drone to fly to the wrong location.
```toml
# MQTT topic mapping for each Pi
[mqtt_topics."cameratrap/events"]
lat = 40.008278960212    # Replace with your camera-trap's exact latitude
lon = -83.0175149068236  # Replace with your camera-trap's exact longitude
camid = "pi-001"         # Unique identifier for this camera-trap
```
Also update MQTT broker addresses, network settings, and service endpoints as needed.
- Build and start the services:

```bash
docker-compose up -d
```
- Verify the deployment:

```bash
docker-compose ps
```
Check each service endpoint:

```bash
# OpenPassLite (Mission Planning)
curl http://localhost:2177/

# SmartField (Event Coordination)
curl http://localhost:2188/

# WildWings (Drone Control)
curl http://localhost:2199/
```

Access Grafana at http://localhost:3000:
- Username: admin
- Password: admin
Upon successful deployment, you will have:
- ✓ Autonomous camera-trap detecting animals in real-time
- ✓ MQTT-based event communication system
- ✓ Drone coordination ready for autonomous missions
- ✓ Real-time monitoring dashboard
- ✓ Centralized logging infrastructure
Adjusting detection sensitivity to balance false positives against missed detections in varying field conditions.
- Access the camera-trap configuration in `ct-config/cameratrap-config.py`

- Modify the `detection_thresholds` parameter:

```python
"detection_thresholds": "{\"animal\": \"0.4\", \"image_store_save_threshold\": \"0\", \"image_store_reduce_save_threshold\": \"0\"}"
```
- Threshold values range from 0.0 to 1.0:
  - 0.3-0.4: High sensitivity (more detections, more false positives)
  - 0.5-0.6: Balanced (recommended for most scenarios)
  - 0.7-0.8: High precision (fewer false positives, may miss detections)
- Apply the changes:

```bash
python3 cameratrap-config.py
```
- Test different thresholds during daytime before field deployment
- Monitor detection logs in `./logs/` to tune sensitivity
- Use higher thresholds in high-traffic areas to reduce false positives
- Too many false positives: Increase the threshold to 0.6+
- Missing obvious animals: Decrease the threshold to 0.3-0.4
- Camera not responding: Check network connectivity with `ping icicle-ct1.local`
Monitoring system operations and debugging issues during field deployment.
- View all service logs:

```bash
docker-compose logs -f
```
- View specific service logs:

```bash
docker-compose logs -f wildwings
docker-compose logs -f smartfield
docker-compose logs -f openpasslite
```
- Filter logs by time:

```bash
docker-compose logs --since 30m smartfield
```
- Export logs for analysis:

```bash
docker-compose logs > system-logs-$(date +%Y%m%d).txt
```
Python log reader:

```python
import subprocess

def tail_logs(service_name, lines=50):
    """Return the last `lines` log lines for a docker-compose service."""
    cmd = f"docker-compose logs --tail {lines} {service_name}"
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    return result.stdout
```

- Logs not appearing: Check whether the service is running with `docker-compose ps`
- Permission denied: Run with `sudo` or add your user to the docker group
- Disk space issues: Use the provided log-cleaner.sh script
Testing drone deployment without waiting for camera-trap detections.
- Ensure the drone is connected and the WildWings service is running:

```bash
docker-compose ps wildwings
```
- Access the SmartField API to publish a test detection:

```bash
curl -X POST http://localhost:2188/test-detection \
  -H "Content-Type: application/json" \
  -d '{"location": {"lat": 40.0, "lon": -83.0}, "confidence": 0.95}'
```
- Monitor mission execution in the logs:

```bash
docker-compose logs -f openpasslite wildwings
```
- Check mission status via OpenPassLite:

```bash
curl http://localhost:2177/mission/status
```
Edit `config.toml` to adjust mission parameters:

```toml
[drone]
mission_duration = 45   # seconds
altitude = 10           # meters
video_quality = "high"
```

- Drone not connecting: Check the USB connection with `lsusb | grep Parrot`
- Mission fails to start: Verify GPS lock and battery level
- Video not recording: Check storage space in `./AnafiMedia`
Extending the system to respond to custom detection events or external triggers.
- Create a new subscriber in `services/mqtt_subscriber/`
- Add your handler function:

```python
import json

def on_custom_event(client, userdata, message):
    payload = json.loads(message.payload)
    # Your custom logic here
    print(f"Custom event: {payload}")
```
- Subscribe to your topic:

```python
client.subscribe("smartfield/custom/events")
client.message_callback_add("smartfield/custom/events", on_custom_event)
```
- Rebuild and restart:

```bash
docker-compose up -d --build mqtt_subscriber
```
- Filter events by confidence threshold
- Forward events to external systems
- Aggregate multiple detections before triggering actions
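For instance, a handler combining the first and third patterns might look like the sketch below (the buffering scheme, window size, and field names are illustrative, not the project's actual logic):

```python
import json
from collections import deque

# Illustrative: keep a small buffer of high-confidence detections and
# trigger an action only once several of them have accumulated.
recent_detections = deque(maxlen=10)

def on_custom_event(client, userdata, message):
    payload = json.loads(message.payload)
    if payload.get("confidence", 0.0) < 0.5:  # filter low-confidence events
        return
    recent_detections.append(payload)
    if len(recent_detections) >= 3:           # aggregate before acting
        print(f"Triggering action for {len(recent_detections)} detections")
        recent_detections.clear()
```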
- Messages not received: Verify topic name matches publisher
- Connection refused: Check MQTT broker is running on port 1883
- Callback not firing: Ensure proper topic subscription syntax
Traditional animal ecology fieldwork faces fundamental operational constraints that limit research effectiveness:
- Manual Data Collection: Human-operated protocols introduce systematic quality degradation and require continuous field personnel deployment
- Network Limitations: Remote study sites operate beyond conventional network infrastructure, precluding real-time monitoring capabilities
- Single-Modal Systems: Existing sensing platforms remain predominantly single-modal, requiring multiple independent hardware deployments and increasing complexity
- Post-Processing Paradigm: Traditional methodologies sacrifice temporal resolution through delayed data processing workflows
- Operational Discontinuity: Repeated battery replacement, manual sensor maintenance, and physical data retrieval create vulnerability to data corruption and storage limitations
The Smart Backpack System addresses these challenges through an integrated autonomous monitoring framework that combines:
- Edge Computing Infrastructure: Local processing eliminates dependency on network connectivity
- Multimodal Sensing: Unified platform integrating ground and aerial observation capabilities
- Autonomous Coordination: Software-driven orchestration eliminates human intervention requirements
- Real-Time Processing: Immediate inference and decision-making at the point of data capture
```
┌─────────────────────────────────────────────────────────────┐
│                    Smart Backpack System                     │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│  ┌─────────────┐      ┌──────────────┐      ┌────────────┐  │
│  │ Camera-Trap │─────▶│  SmartField  │─────▶│ WildWings  │  │
│  │  (Detect)   │      │ (Coordinate) │      │ (Observe)  │  │
│  └─────────────┘      └──────────────┘      └────────────┘  │
│         │                    │                    │         │
│         └────────────────────┴────────────────────┘         │
│                              │                              │
│                         ┌────▼────┐                         │
│                         │  MQTT   │                         │
│                         │ Broker  │                         │
│                         └─────────┘                         │
│                                                             │
│  ┌───────────────────────────────────────────────────────┐  │
│  │            Monitoring Stack (Grafana/Loki)            │  │
│  └───────────────────────────────────────────────────────┘  │
└─────────────────────────────────────────────────────────────┘
```
Purpose: Real-time animal detection at the edge
Key Design Decisions:
- Motion-Triggered Capture: Reduces power consumption and storage requirements
- Edge ML Inference: YOLOv5 model runs locally, eliminating network dependency
- Threshold-Based Classification: Configurable confidence scoring balances sensitivity and precision
- MQTT Publishing: Lightweight event notification enables loose coupling
Why This Approach: Traditional camera-traps store all images for post-processing. Our edge inference approach provides immediate classification, reducing data volume by 95% and enabling real-time system response.
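For concreteness, the published event might look like the following minimal sketch (assuming the paho-mqtt client; the topic matches the config.toml mapping above, but the payload fields are illustrative rather than the project's exact schema):

```python
import json
import time
import paho.mqtt.client as mqtt

# Connect to the local broker configured during setup (default MQTT port 1883)
client = mqtt.Client()
client.connect("192.168.0.122", 1883)

# Illustrative detection event: location, timestamp, and classification
event = {
    "camid": "pi-001",
    "lat": 40.008278960212,
    "lon": -83.0175149068236,
    "timestamp": time.time(),
    "species": "animal",
    "confidence": 0.72,  # above the configured 0.4 threshold
}
client.publish("cameratrap/events", json.dumps(event), qos=1)
```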
Purpose: Central event coordination and mission orchestration
Key Design Decisions:
- Event-Driven Architecture: Reacts to detection events rather than polling
- Stateless Processing: Enables horizontal scaling and fault tolerance
- API-First Design: RESTful interfaces allow easy integration with external systems
Why This Approach: A centralized coordinator ensures system-wide state consistency while maintaining loose coupling between detection and response modules.
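Conceptually, the coordination step reduces to a small stateless handler: validate the incoming event, then hand it to the mission planner. The sketch below is illustrative only; the `/plan` endpoint name is hypothetical:

```python
import requests

def handle_detection(event: dict) -> None:
    # Stateless validation: drop malformed or low-confidence events
    if event.get("confidence", 0.0) < 0.4:
        return
    if "lat" not in event or "lon" not in event:
        return
    # Hand off to the mission planner (endpoint name is hypothetical)
    requests.post("http://localhost:2177/plan", json=event, timeout=5)
```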
Purpose: Autonomous drone mission planning
Key Design Decisions:
- Geospatial Planning: Converts detection coordinates to flight paths
- Temporal Optimization: Schedules missions to minimize flight time and battery usage
- Safety-First Logic: Pre-flight checks, no-fly zones, and emergency protocols
Why This Approach: Separating mission planning from execution allows sophisticated path optimization while keeping the drone control layer simple and responsive.
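For intuition, converting a detection coordinate into a flight leg comes down to the distance and initial bearing from the drone's position, for which the standard haversine formulation suffices (a generic sketch, not the project's actual planner):

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) between two points."""
    R = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    distance = 2 * R * math.asin(math.sqrt(a))
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return distance, bearing

# e.g. from a hypothetical home position to the camera-trap in config.toml
print(distance_and_bearing(40.0080, -83.0170, 40.008278960212, -83.0175149068236))
```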
Purpose: Drone control and video capture
Key Design Decisions:
- Olympe SDK Integration: Leverages Parrot's official Python SDK for reliable control
- Video Pipeline: Hardware-accelerated encoding reduces processing overhead
- Privileged Container: Direct hardware access for USB and networking
Why This Approach: Using an open-source autonomous UAS platform ensures reproducibility and allows researchers to modify flight behaviors for specific study requirements.
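Stripped to its essentials, an Olympe control sequence looks roughly like the sketch below (192.168.42.1 is the ANAFI's default access-point address; mission logic, video capture, and the safety checks described above are omitted):

```python
import olympe
from olympe.messages.ardrone3.Piloting import TakeOff, moveBy, Landing

drone = olympe.Drone("192.168.42.1")  # ANAFI default access-point address
drone.connect()

# Each command blocks until the drone acknowledges completion
assert drone(TakeOff()).wait().success()
assert drone(moveBy(10, 0, 0, 0)).wait().success()  # dX, dY, dZ (m), dPsi (rad)
assert drone(Landing()).wait().success()

drone.disconnect()
```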
The system orchestrates a comprehensive autonomous workflow:
```
1. Camera-Trap Detects Animal (Confidence > 0.4)
           ↓
2. MQTT Event Published (Location, Timestamp, Species)
           ↓
3. SmartField Receives Event & Validates
           ↓
4. OpenPassLite Plans Mission (Path, Altitude, Duration)
           ↓
5. WildWings Executes Flight (40-50 seconds)
           ↓
6. Video Captured & Stored (AnafiMedia/)
           ↓
7. OpenPassLite Plans Return-To-Home Mission (Path, Altitude, Duration)
           ↓
8. System Returns to Standby
```
Result: Synchronized multimodal dataset with ground-level detection paired with aerial behavioral observation—without human intervention.
Host Networking Mode: All services use `network_mode: host` for:
- Direct hardware access (USB devices, cameras)
- Low-latency inter-service communication
- Simplified port management in field deployments
Message-Oriented Middleware: MQTT broker provides:
- Publish-subscribe pattern for loose coupling
- Quality of Service (QoS) guarantees for critical messages
- Persistent sessions for network interruption resilience
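In paho-mqtt (1.x API) terms, these resilience features map onto a persistent session plus a QoS 1 subscription, roughly as follows (the client ID is illustrative; the broker IP and topic come from the configuration steps):

```python
import paho.mqtt.client as mqtt

# clean_session=False asks the broker to queue QoS >= 1 messages for this
# client ID across disconnects, so brief WiFi dropouts do not lose events.
client = mqtt.Client(client_id="smartfield-subscriber", clean_session=False)
client.connect("192.168.0.122", 1883)
client.subscribe("cameratrap/events", qos=1)  # at-least-once delivery
client.loop_forever()
```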
Loki + Promtail + Grafana Stack:
- Loki: Efficient log aggregation without indexing overhead
- Promtail: Scrapes logs from all services automatically
- Grafana: Real-time visualization with custom dashboards
Design Rationale: Field deployments require operational visibility without external dependencies. The embedded monitoring stack provides insights even when disconnected from the internet.
Testing Profile:
- Duration: 20 hours continuous operation
- Detections: 45+ animal detection events processed
- Missions: 38 autonomous drone deployments completed
- Success Rate: 92% mission completion (failures due to low battery)
Validation Scope:
- Complete detection-deployment-recovery cycles
- Network resilience testing (WiFi disconnections)
- Power management under field conditions
- Weather resistance (light rain, wind gusts)
- Event-Driven Architecture: Asynchronous communication enables independent module evolution
- Microservices: Containerized services allow independent scaling and deployment
- Publisher-Subscriber: MQTT decouples event producers from consumers
- Infrastructure as Code: Docker Compose enables reproducible deployments
- Centralized Logging: Observability without code instrumentation
The architecture supports:
- Multi-Camera Networks: MQTT topics can namespace multiple camera-traps (see the sketch after this list)
- Advanced ML Models: Swap detection models without changing pipeline
- Cloud Integration: Optional data sync when network available
- Additional Sensors: Acoustic, thermal, or environmental monitoring
- Multi-Drone Coordination: Scale to fleet operations
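For example, multi-camera namespacing can be as simple as one topic level per camera-trap plus a single wildcard subscription; the layout below is illustrative (the current config.toml maps a single `cameratrap/events` topic):

```python
import paho.mqtt.client as mqtt

# Illustrative topic tree: cameratrap/<camid>/events per camera-trap.
# The '+' wildcard matches exactly one level, so one subscriber covers them all.
client = mqtt.Client()
client.connect("192.168.0.122", 1883)
client.subscribe("cameratrap/+/events", qos=1)
client.loop_forever()
```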
- Edge Computing in Wildlife Monitoring - Survey of ML at the edge
- Autonomous UAS for Ecology - Drone applications in field research
- MQTT Protocol Specification - Understanding message patterns
- Docker Compose Best Practices - Production deployments
- YOLOv5 Documentation - Understanding object detection
- Processor: Dual-core ARM/x86 (Raspberry Pi 4B compatible)
- RAM: 4GB
- Storage: 64GB (32GB for OS/software, 32GB for data)
- Network: WiFi 802.11n or Ethernet
- OS: Linux kernel 5.x+ (Ubuntu 20.04+ recommended)
- Processor: Quad-core ARM64/x86_64
- RAM: 8GB
- Storage: 128GB+ (SSD preferred)
- GPU: Optional for accelerated ML inference
- Network: Dual-band WiFi for drone control separation
- Idle: ~15W (all services running, no drone)
- Active Detection: ~20W (camera + inference)
- Drone Mission: ~25W (includes video processing)
- Recommended Battery: 200Wh+ power station for 8+ hours (200 Wh ÷ 25 W ≈ 8 h at the peak mission draw)
Note: This system is designed for ecological research purposes. Ensure compliance with local regulations regarding drone operation and wildlife monitoring in your deployment area.