A compact hardware and software stack for estimating the pose of a 3D‑printed pen tip.
An ESP32 inside the pen sends a BLE trigger; the PC listens for the event, grabs a frame from a camera and computes the pen pose from ArUco markers.
- ESP32 DevKit (BLE + two buttons) to trigger the acquisition
- Physical assembly: 3D-printed pen/cube with 3–4 ArUco markers attached
- Camera: Basler camera with Pylon drivers installed or a compatible UVC webcam
- Calibration target: ChArUco board or checkerboard pattern
- Python ≥3.10 and `pip` to install dependencies
- ESP32 firmware toolchain: Arduino IDE/CLI or PlatformIO (optional but recommended)
- Basler Pylon drivers (optional) for industrial camera integration
- Python libraries from `pc/app/requirements.txt`
- `esp32/` – BLE firmware for the pen, detailed in `esp32/README.md`
- `pc/` – PC-side application for capture and pose estimation, see `pc/README.md`
- `cube_minimal/` – pose estimation library with dataset and tests
- `cube_minimal/data/sample_dataset/` – sample images referenced in Quick start
- Press a button on the pen. The ESP32 publishes a BLE notification.
- The PC application receives the trigger and acquires an image from the selected camera.
- ArUco markers on the cube are detected and the cube pose is estimated.
- A fixed transform gives the pen‑tip pose, which can be logged or streamed.
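The last step is a plain rigid-body transform: if the cube pose in camera coordinates is `(R, t)` and the tip sits at a fixed offset in the cube frame, the tip position is `R @ offset + t`. A minimal sketch in pure Python; the offset value and the example pose below are placeholders, not the real pen geometry:

```python
import math

def apply_cube_pose(R, t, p):
    """Map a point p from the cube frame into the camera frame: R @ p + t."""
    return [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]

# Hypothetical fixed tip offset in the cube frame (metres) -- the real value
# comes from the printed geometry, not from this sketch.
TIP_OFFSET = [0.0, 0.0, -0.085]

# Example cube pose: rotated 90 degrees about the camera Z axis, 30 cm away.
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
R_cube = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
t_cube = [0.0, 0.0, 0.30]

tvec_tip = apply_cube_pose(R_cube, t_cube, TIP_OFFSET)
```

Because the offset lies on the rotation axis in this example, the tip simply ends up 8.5 cm closer to the camera than the cube origin.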
Each capture session produces both a JSON summary and a CSV table beside the
session folder (e.g. `session_<start>__<end>_pose.json` / `.csv`). The JSON file
stores a list of frame objects containing:
- `tvec_tip`: Cartesian tip position expressed in metres relative to the camera.
- `wand_direction`: Normalised tip direction vector in camera coordinates.
- `euler_tip`: Tip orientation expressed as Z‑Y‑Z intrinsic Euler angles `(rz1, ry, rz2)` in radians.
- `tip_pose`: Convenience tuple `(x, y, z, rz1, ry, rz2)` merging position and orientation. Downstream consumers should prefer this when available.
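If `wand_direction` is the tip's Z axis expressed in camera coordinates (an assumption, not stated above), it can be recovered from `euler_tip` as the third column of the Z‑Y‑Z rotation matrix. A self-contained sketch:

```python
import math

def zyz_to_matrix(rz1, ry, rz2):
    """Rotation matrix for intrinsic Z-Y-Z Euler angles: Rz(rz1) @ Ry(ry) @ Rz(rz2)."""
    c1, s1 = math.cos(rz1), math.sin(rz1)
    cy, sy = math.cos(ry), math.sin(ry)
    c2, s2 = math.cos(rz2), math.sin(rz2)
    Rz1 = [[c1, -s1, 0.0], [s1, c1, 0.0], [0.0, 0.0, 1.0]]
    Ry  = [[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]]
    Rz2 = [[c2, -s2, 0.0], [s2, c2, 0.0], [0.0, 0.0, 1.0]]
    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]
    return matmul(matmul(Rz1, Ry), Rz2)

def direction_from_euler(rz1, ry, rz2):
    """Third column of the rotation matrix: the rotated Z axis (a unit vector)."""
    R = zyz_to_matrix(rz1, ry, rz2)
    return [R[0][2], R[1][2], R[2][2]]
```

Note that the third column does not depend on `rz2`, since the final rotation is about the Z axis itself.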
The accompanying CSV mirrors the JSON content with one row per frame and the
columns `frame_index`, `timestamp`, `ok`, `tip_x`, `tip_y`, `tip_z`, `tip_rz1`,
`tip_ry`, `tip_rz2`. Numerical values are emitted in metres/radians to ease data
analysis in spreadsheets or plotting tools.
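A quick way to consume the CSV from Python, assuming the column layout above; the sample rows and timestamp values are illustrative only, and rows whose `ok` flag is false are skipped:

```python
import csv
import io

def load_tip_poses(csv_text):
    """Parse the per-frame CSV into (frame_index, (x, y, z, rz1, ry, rz2))
    pairs, keeping only frames flagged ok."""
    poses = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["ok"].strip().lower() not in ("1", "true"):
            continue  # frame failed; pose columns may be empty
        pose = tuple(float(row[k]) for k in
                     ("tip_x", "tip_y", "tip_z", "tip_rz1", "tip_ry", "tip_rz2"))
        poses.append((int(row["frame_index"]), pose))
    return poses

# Hypothetical two-frame session: one good frame, one failed detection.
sample = """frame_index,timestamp,ok,tip_x,tip_y,tip_z,tip_rz1,tip_ry,tip_rz2
0,1700000000.00,true,0.012,-0.034,0.301,0.10,0.20,0.30
1,1700000000.10,false,,,,,,
"""
poses = load_tip_poses(sample)
```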
Linux/macOS:

```bash
cd pc
python -m venv .venv
source .venv/bin/activate
pip install -r app/requirements.txt
```

Windows (PowerShell):

```powershell
cd pc
py -m venv .venv
. .\.venv\Scripts\Activate.ps1
pip install -r app\requirements.txt
```

Tip: keep the virtual environment active for the next steps (pytest, the CLI, and the PC application).
Follow the Getting Started section of the firmware:
- Install the NimBLE-Arduino, Adafruit SSD1306, and Adafruit GFX libraries via the Arduino Library Manager.
- Wire the components as described in `esp32/README.md`.
- Build and upload the project by selecting the ESP32 Dev Module board (Arduino IDE/CLI), or set up an equivalent PlatformIO project.
- Use the scripts in `pc/calib/` to produce `calib_data.npz` (camera intrinsics) from a ChArUco/checkerboard target.
- For quick trials, point `capture.test_source_dir` to `cube_minimal/data/sample_dataset/`, which contains the sample images described in its README.
Configure `config.yaml` before launching the application: set `ble.name_prefix` or `ble.addr`, choose the camera (`camera_type`, `camera_id`/`camera_serial`), and provide the calibration path (`pose.camera_calibration_npz`). Then:

```bash
cd pc/app
python app.py
```
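A minimal `config.yaml` sketch covering the keys just mentioned. All values are placeholders, and the nesting of the camera keys is an assumption; the authoritative schema is the one shipped with the application:

```yaml
ble:
  name_prefix: "PenTrigger"     # or pin a specific device: addr: "AA:BB:CC:DD:EE:FF"
camera:
  camera_type: basler           # or a UVC webcam
  camera_serial: "12345678"     # use camera_id instead for UVC devices
pose:
  camera_calibration_npz: ../calib/calib_data.npz
```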
For tests without physical hardware, set in `config.yaml`:

```yaml
capture:
  simulate_camera: true
  test_source_dir: "../cube_minimal/data/sample_dataset"
```

This mode replays sessions by reading the sample images through the same processing pipeline.
- Python tests (`cube_minimal`):

  ```bash
  cd pc
  source .venv/bin/activate   # Linux/macOS
  # PowerShell: . .\.venv\Scripts\Activate.ps1
  pytest ../cube_minimal/tests
  ```
- Single-image estimation CLI:

  ```bash
  python -m cube_minimal.cli.estimate_one \
    --image cube_minimal/data/sample_dataset/example_1.png \
    --camera cube_minimal/config/calib_data.npz \
    --aruco_dict 4X4_50 --marker_size 0.055 --cube_size 0.060 \
    --out overlay.png
  ```
- Simulated PC session: enable `capture.simulate_camera: true` as above and launch the application from `pc/app/` with `python app.py`.
- Static quality checks (run from the repository root after activating the virtualenv):

  ```bash
  ruff check pc/app
  mypy
  ```

  Both tools read their configuration from `pyproject.toml` and warn about style or typing regressions.
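For orientation, a hypothetical `pyproject.toml` excerpt with the kind of settings both tools pick up; the actual values live in the repository and may differ:

```toml
[tool.ruff]
line-length = 100
target-version = "py310"

[tool.mypy]
python_version = "3.10"
strict = true
```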
- Prerequisites and Repository structure above for a quick overview
- `pc/README.md` – deep dive into the PC pipeline configuration and Basler parameters
- `esp32/README.md` – hardware details, LEDs, and buttons
- `cube_minimal/README.md` – pose estimation API and CLI
- `cube_minimal/data/README.md` – structure of the sample dataset
- Mirror the incorrect marker on the pen tip; review the pen tip
- Dexter extrinsic calibration
- Development of a robot program to read the CSV and move the arm to the desired position
- Closing report for the work
All rights reserved.
This software and all associated files are the exclusive property of Angelo Milella - COMAU. Unauthorized copying, modification, distribution, or use of this software, via any medium, is strictly prohibited.
For inquiries about licensing, please contact: angelo_milella_dev@yahoo.com.


