This directory contains the Python code that powers the brain-computer interface and hardware control system.
- 🧠 EEG Processing - Processes brain signals from the Emotiv headset
- 🤖 RNN Warning System - Predicts potential hazards using neural networks
- 🔄 Arduino Communication - Controls the wheelchair motors and reads sensors
- 📡 WebSocket Server - Sends real-time data to the frontend dashboard
- `wheelchair_drive.py` - Main program that coordinates all system components
- `cortex.py` - Interface with the Emotiv Insight headset
- `sub_data.py` - Data subscription and processing
- `send_websocket.py` - Communication with the web application
- `control.py` - Basic wheelchair control logic
- `ai_copypaste.py` - RNN model implementation
# Clone the repository
git clone https://github.com/mattenarle10/neurowarn.git
cd neurowarn/src/backend

# Windows
python -m venv .venv
.venv\Scripts\activate
# macOS/Linux
python3 -m venv .venv
source .venv/bin/activate

# Install required Python packages
pip install -r requirements.txt

# Make sure your virtual environment is activated
# Windows: .venv\Scripts\activate
# macOS/Linux: source .venv/bin/activate
# Go to the backend's neurowarn directory
cd neurowarn/src/backend/neurowarn
# Start the main program
python wheelchair_drive.py

The backend integrates with the LSTM neural network models to process EEG data and predict user intentions:
- Data Collection: The system connects to the Emotiv Insight headset via the Cortex API
- Data Processing: EEG data is collected in real-time and processed through the following steps:
- Raw EEG data is collected from 5 channels (AF3, T7, PZ, T8, AF4)
- Data is normalized using a pre-trained StandardScaler
- Processed data is organized into a sliding window of 5 time steps
- Prediction: The LSTM model predicts the user's intended movement direction
- Command Execution: Predicted commands are sent to the Arduino to control the wheelchair
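The normalization and sliding-window steps above can be sketched as follows. The channel order, the five-step window, and the use of a pre-fitted StandardScaler come from this README; the class name and the way the scaler statistics are passed in are illustrative assumptions, not the actual code in `sub_data.py`:

```python
# Sketch of the EEG preprocessing pipeline: normalize each raw
# 5-channel sample, then accumulate a sliding window of 5 time steps.
from collections import deque

import numpy as np

CHANNELS = ["AF3", "T7", "PZ", "T8", "AF4"]  # 5 Insight channels
WINDOW = 5  # sliding window of 5 time steps


class SlidingWindow:
    """Accumulates normalized samples into a (1, WINDOW, 5) model input."""

    def __init__(self, mean, scale):
        # mean/scale would come from the pre-trained StandardScaler
        # (e.g. scaler.mean_ and scaler.scale_ after loading it from disk)
        self.mean = np.asarray(mean, dtype=float)
        self.scale = np.asarray(scale, dtype=float)
        self.buf = deque(maxlen=WINDOW)

    def push(self, sample):
        """Normalize one raw 5-channel sample; return a full window or None."""
        self.buf.append((np.asarray(sample, dtype=float) - self.mean) / self.scale)
        if len(self.buf) < WINDOW:
            return None  # not enough history yet
        # shape (1, WINDOW, 5): a batch of one for the LSTM
        return np.stack(self.buf)[np.newaxis, :, :]
```

A full window would then be fed to the LSTM (e.g. `model.predict(window)`), with the highest-scoring output mapped to a movement direction.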
Emotiv Headset → Cortex API → Backend → LSTM Model → Command Prediction → Arduino → Wheelchair
↓
WebSocket
↓
Frontend
(Visualization)
The backend communicates with the hardware components through:
- Serial Communication: Commands are sent to the Arduino via serial connection
- Sensor Data: LiDAR sensor data is received from the Arduino for obstacle detection
- WebSocket: Real-time data is sent to the frontend for visualization
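The serial leg of this path can be sketched roughly as below. The single-letter command protocol is an assumed example for illustration only; the real byte format is defined in `wheelchair_drive.py` and the Arduino firmware:

```python
# Sketch of mapping a predicted direction label to a serial command.
# The protocol below (single letter + newline) is an ASSUMPTION.
COMMANDS = {
    "forward": b"F",
    "backward": b"B",
    "left": b"L",
    "right": b"R",
    "stop": b"S",
}


def encode_command(direction: str) -> bytes:
    """Map a predicted direction label to a newline-terminated command."""
    try:
        return COMMANDS[direction] + b"\n"
    except KeyError:
        # Fail safe: anything unrecognized stops the chair.
        return COMMANDS["stop"] + b"\n"


# With pyserial (port name is machine-specific; configure it first):
#   import serial
#   ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)
#   ser.write(encode_command("forward"))
```

Defaulting to the stop command on an unrecognized label is a deliberate fail-safe choice for a mobility device.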
Before running the system, you need to:
- Connect the Emotiv Insight headset and ensure it's paired
- Connect the Arduino to the computer via USB
- Configure the serial port in the wheelchair_drive.py script
- Create a profile in the Emotiv app for consistent EEG readings
To test the system without a physical headset:
- Use the test data provided in the `/src/models/testset.csv` file
- Run the test script: `python test_prediction.py`
- Monitor the console output for predicted commands
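Offline testing along these lines can be sketched as follows. The CSV column names and the in-memory sample data are illustrative assumptions; `test_prediction.py` remains the authoritative script:

```python
# Sketch of replaying recorded EEG samples from a CSV through the
# same 5-step sliding window the live pipeline uses.
import csv
import io

CHANNELS = ["AF3", "T7", "PZ", "T8", "AF4"]
WINDOW = 5


def load_windows(csv_text: str):
    """Yield lists of WINDOW consecutive 5-channel samples from CSV text."""
    reader = csv.DictReader(io.StringIO(csv_text))
    buf = []
    for row in reader:
        buf.append([float(row[ch]) for ch in CHANNELS])
        if len(buf) == WINDOW:
            yield list(buf)
            buf.pop(0)  # slide forward by one sample


# Tiny in-memory stand-in for testset.csv: 7 rows of 5 channels.
sample = "AF3,T7,PZ,T8,AF4\n" + "\n".join(
    ",".join(str(i + j) for j in range(5)) for i in range(7)
)
windows = list(load_windows(sample))
```

Each yielded window would be normalized and passed to the model exactly as in the live pipeline, printing one predicted command per window.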