ASPEN: Assisted-as-needed sEMG-triggered control of an upper limb exoskeleton for Poststroke home rEhabilitatioN. Development, verification and validation.
A compact, EMG-driven exoskeleton demo: real‑time streaming, calibration, control, and analysis — with a GUI for smooth operation and Arduino firmware for the exo control loop.
Click the image to open the PDF. If it does not load, use the direct link: docs/media/graphical-abstract.pdf
- Real‑time EMG: stream, record, and process biosignals.
- One‑click GUI: run the full demo from a user-friendly interface.
- Therapist mode: assisted workflows for clinician-driven sessions.
- Arduino firmware: exoskeleton motion/control via `ASPEN_CONTROL.ino`.
- Offline analysis: MATLAB tools and sample datasets included.
- Sample exercises: ready-to-stream examples for quick testing.
- Clone this repository.
- Set up the TMSi virtual environment (`tmsi_venv`) using the commands in the "Setup TMSi" section.
- Place the files as follows (per the original workflow):
  - Put `ASPEN_GUI.py` inside your `tmsi_venv` folder.
  - Put the remaining Python scripts inside `tmsi-python-interface-V5.3.0.0` (located within `tmsi_venv`).
- Activate the virtual environment and launch the GUI.
```bash
# macOS/Linux
source tmsi_venv/bin/activate
python ASPEN_GUI.py

# Windows (PowerShell)
# .\tmsi_venv\Scripts\Activate.ps1
# python ASPEN_GUI.py
```

Use the GUI to navigate through calibration, streaming, and control modes.
```
ASPEN_CONTROL/
├── ASPEN_CALIBRATE.py          # EMG calibration and thresholding
├── ASPEN_CONTROL.ino           # Arduino control logic for the exoskeleton
├── ASPEN_EMPOWER.py            # Real-time EMG-triggered exo movement
├── ASPEN_GUI.py                # Graphical interface to run the demo
├── ASPEN_RECORD.py             # Record EMG signals
├── ASPEN_STOP.py               # Safe stop and return to base position
├── ASPEN_STREAM.py             # Stream pre-recorded EMG data
├── ASPEN_THERAPIST.py          # Therapist-assisted workflow
├── arduino_secrets.h           # WiFi/connection configuration
│
├── STREAM_SAMPLE_EXERCISES/    # Example inputs for streaming
│   ├── Calibration_th1.txt
│   ├── Exercise_1.poly5
│   └── Exercise_2.poly5
│
└── DATA_ANALYSIS/              # MATLAB analysis tools and example datasets
```
- A working TMSi setup and drivers.
- Python environment compatible with the TMSi interface used in your lab setup.
- Arduino toolchain (if you plan to compile and upload `ASPEN_CONTROL.ino`).
- Activate `tmsi_venv`.
- Launch `ASPEN_GUI.py`:

  ```bash
  python ASPEN_GUI.py
  ```

- Follow the on-screen steps to calibrate, stream, or empower the exo.
- MATLAB scripts for analysis live in `ASPEN_CONTROL/DATA_ANALYSIS/`.
- Includes tests, helper functions, and raw example data for reproducible evaluation.
- Save your image at `docs/media/graphical-abstract.png` (create folders if needed).
- Reference it near the top of this README.
- TMSi drivers: ensure drivers and permissions are correctly installed.
- Serial/USB access (macOS/Linux): verify device permissions and correct port.
- Virtual environment: confirm the GUI runs from within `tmsi_venv`.
If this demo or its analysis tools help your work, please cite the repository:
```bibtex
@software{aspen_demo_2025,
  title  = {ASPEN Demo},
  author = {Simone Sgorbati},
  year   = {2025},
  url    = {https://github.com/simosgorby/ASPEN-DEMO}
}
```
ASPEN is an “assisted‑as‑needed” system for post‑stroke rehabilitation targeting mild upper‑limb motor impairments. It couples the Auxivo EduExo Pro exoskeleton with the TMSi SAGA device to detect muscle activation via surface EMG (sEMG) and translate user intention into elbow motion.
Key points:
- Hardware: Auxivo EduExo Pro (1‑DOF elbow) + TMSi SAGA.
- Signals & control: sEMG from biceps and triceps; binary controller with a double‑threshold intention‑detection algorithm.
- Motion: elbow flexion/extension along predefined trajectories at customizable speed.
- Modes:
- Empower: real‑time assistance triggered by muscle activation.
- Stream: passive, repeatable exercises using pre‑recorded EMG.
- Therapist: manual guidance during therapy sessions.
- Safety: physical emergency stop button; timeout for prolonged flexion.
- Usability: simple, intuitive GUI for setup and operation.
Results (preliminary):
- Intention detection accuracy > 90% with the double‑threshold algorithm.
- Higher repeatability vs therapist‑assisted movements:
- Exoskeleton: RMSE = 4.53°, IQR = 0.93°
- Therapist: RMSE = 12.94°, IQR = 6.13°
- Usability: setup < 10 minutes; median SUS score = 62.5/100 (n = 5 participants).
For details and methodology, see the scientific report in ASPEN_REPORT (included with the project).
Below are the commands to create the tmsi_venv environment and set up the TMSi interface.
macOS/Linux:
```bash
# 1) Create the virtual environment
python3 -m venv tmsi_venv

# 2) Activate
source tmsi_venv/bin/activate

# 3) Upgrade base tooling
pip install --upgrade pip setuptools wheel

# 4) Place the TMSi interface package inside the environment
# Extract/place the folder 'tmsi-python-interface-V5.3.0.0' inside 'tmsi_venv/'
# (e.g., tmsi_venv/tmsi-python-interface-V5.3.0.0)

# 5) Install dependencies (if a requirements.txt is present)
pip install -r tmsi-python-interface-V5.3.0.0/requirements.txt || true
```

Windows (PowerShell):

```powershell
# 1) Create the virtual environment
python -m venv tmsi_venv

# 2) Activate
.\tmsi_venv\Scripts\Activate.ps1

# 3) Upgrade base tooling
pip install --upgrade pip setuptools wheel

# 4) Place the TMSi interface package inside the environment
# Extract/place the folder 'tmsi-python-interface-V5.3.0.0' inside 'tmsi_venv\'
# (e.g., tmsi_venv\tmsi-python-interface-V5.3.0.0)

# 5) Install dependencies (if a requirements.txt is present)
pip install -r tmsi-python-interface-V5.3.0.0\requirements.txt
```

Notes:
- Ensure TMSi drivers and device permissions are correctly configured on your system.
- Place `ASPEN_GUI.py` inside `tmsi_venv/`, and the other scripts inside `tmsi-python-interface-V5.3.0.0`.
- If the TMSi package is distributed as an archive (zip), extract it before installing requirements.
This project is released under the license listed in LICENSE.md.
Developed at Politecnico di Milano, within the "Collaborative Robotics" course, A.Y. 2024-25.
Thanks to Prof. Emilia Ambrosini, Prof. Marta Gandolla and Dr. Luca Pozzi for supervising the project.
