Code for our paper "UMotion: Uncertainty-driven Human Motion Estimation from Inertial and Ultra-wideband Units"
```
uv venv --python 3.11
source .venv/bin/activate
uv pip install -r requirements.txt
```

Check `pyproject.toml` for dependencies.
- Download the SMPL model. You should download version 1.1.0 for Python 2.7 (female/male/neutral, 300 shape PCs), since UIP uses the neutral SMPL model.
- Put the model files into `models/smpl/`. The directory structure should look like the following:

```
models/smpl/basicModel_f_lbs_10_207_0_v1.1.0.pkl
models/smpl/basicModel_m_lbs_10_207_0_v1.1.0.pkl
models/smpl/basicModel_neutral_lbs_10_207_0_v1.1.0.pkl
```
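The v1.1.0 model files are Python 2 pickles, so loading them under Python 3 needs a `latin1` decoding hint. A minimal sketch (the helper name `load_smpl_pickle` is hypothetical, and the real model files may additionally require the `chumpy` package):

```python
import pickle

def load_smpl_pickle(path):
    """Load a legacy SMPL .pkl written by Python 2.

    encoding="latin1" is required because the v1.1.0 files were
    serialized by Python 2's pickler; without it, pickle.load raises
    a UnicodeDecodeError on the embedded byte strings.
    """
    with open(path, "rb") as f:
        return pickle.load(f, encoding="latin1")
```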
Change `base_dir` in `src/config` to your project directory.
Download the datasets and put them into `datasets/raw`. The directory structure should look like the following:

```
datasets/raw/AMASS/<subdata>/...
datasets/raw/DIP_IMU_and_Others/DIP_IMU/s_<subject_id>/...pkl
datasets/raw/TotalCapture_Real_60FPS/...pkl
datasets/raw/uip/[(test)(train)].pt
```
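Before preprocessing, it can help to verify the raw layout is in place. A small sketch (the helper name and the `EXPECTED_RAW` list are ours, derived from the layout above):

```python
from pathlib import Path

# Expected sub-paths under datasets/raw, taken from the layout above.
EXPECTED_RAW = [
    "AMASS",
    "DIP_IMU_and_Others/DIP_IMU",
    "TotalCapture_Real_60FPS",
    "uip",
]

def missing_raw_datasets(root="datasets/raw"):
    """Return the expected sub-paths that do not exist under `root`."""
    base = Path(root)
    return [p for p in EXPECTED_RAW if not (base / p).exists()]
```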
We provide a separate preprocessing script for each dataset; all of them convert the raw data into the same data structure (run them from the project root directory).
| Dataset | Download Link | Processing Script |
|---|---|---|
| AMASS | https://amass.is.tue.mpg.de/ (SMPL-H) | `python src/data/amass.py` |
| DIP-IMU | https://dip-imu.github.io/ (DIP IMU AND OTHERS - DOWNLOAD SERVER 1) | `python src/data/dipimu.py` |
| TotalCapture | https://dip-imu.github.io/ (ORIGINAL TotalCapture DATA W/ CORRESPONDING REFERENCE SMPL POSES) | `python src/data/totalcapture.py` |
| UIP | https://siplab.org/projects/UltraInertialPoser | `python src/data/uip.py` |
Make sure to have gdown installed:

```
pip install gdown
```

Then, run the following command line:

```
bash download_pretrained_models.sh
```

For the shape estimator:

```
python src/main_shape.py
```

For the pose estimator (DIP-IMU, TotalCapture):

```
python src/main_pose.py  # DIP-IMU, TotalCapture, without UKF (distance without noise)
python src/main_uip.py   # UIP, with UKF (noisy distance)
python src/main_ukf.py   # TotalCapture, with UKF (noisy distance)
```

Uncomment `do_train()` in `main_shape.py` and `main_pose.py` to train the model. The training parameters are set in `configs/config.yaml` and `configs/config_noise.yaml`.
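The "with UKF" variants fuse noisy UWB distances into the state estimate with an unscented Kalman filter. As a rough, self-contained illustration of that idea (not the paper's implementation; function names and parameter values are ours), here is an unscented update of a 2D position from a single noisy range measurement to a fixed anchor:

```python
import numpy as np

def sigma_points(mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
    """Merwe-scaled sigma points and weights for a Gaussian (mean, cov)."""
    n = mean.shape[0]
    lam = alpha**2 * (n + kappa) - n
    sqrt_cov = np.linalg.cholesky((n + lam) * cov)
    # Rows: the mean, then mean +/- each column of the scaled sqrt-covariance.
    pts = np.vstack([mean, mean + sqrt_cov.T, mean - sqrt_cov.T])
    wm = np.full(2 * n + 1, 0.5 / (n + lam))   # mean weights
    wc = wm.copy()                             # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha**2 + beta)
    return pts, wm, wc

def ukf_distance_update(mean, cov, anchor, z, r):
    """Correct a position estimate (mean, cov) with one noisy range
    measurement z to a fixed anchor; r is the measurement variance."""
    pts, wm, wc = sigma_points(mean, cov)
    z_sig = np.linalg.norm(pts - anchor, axis=1)  # h(x) = ||x - anchor||
    z_pred = wm @ z_sig                           # predicted range
    dz = z_sig - z_pred
    s = wc @ (dz * dz) + r                        # innovation variance
    cross = (pts - mean).T @ (wc * dz)            # state/measurement cross-cov
    k = cross / s                                 # Kalman gain
    new_mean = mean + k * (z - z_pred)
    new_cov = cov - np.outer(k, k) * s
    return new_mean, new_cov
```

Because the range measurement `h(x)` is nonlinear in the position, the sigma-point propagation avoids the explicit Jacobian an extended Kalman filter would need.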
If you find this code useful in your research, please cite:

```
@inproceedings{liu2025umotion,
  title={UMotion: Uncertainty-driven Human Motion Estimation from Inertial and Ultra-wideband Units},
  author={Liu, Huakun and Ota, Hiroki and Wei, Xin and Hirao, Yutaro and Perusquia-Hernandez, Monica and Uchiyama, Hideaki and Kiyokawa, Kiyoshi},
  booktitle={Proceedings of the Computer Vision and Pattern Recognition Conference},
  pages={7085--7094},
  year={2025}
}
```

The code for sensing the distance matrix using the DW3000 is available at DW3000.
If you encounter any issues or have questions, feel free to open an issue, or contact me by email at liu.huakun.li0@is.naist.jp.