Learning Correlation-aware Aleatoric Uncertainty for 3D Hand Pose Estimation (BMVC 2025)

📣 Getting started

Model checkpoints

You can download the model checkpoints for our method and the baseline parameterizations. After downloading, place them in ./checkpoints as shown below.

hand_uncertainty/
└── checkpoints/
    ├── hamer_ours.ckpt
    ├── hamer_diag.ckpt
    ├── hamer_full.ckpt
    └── hamer_ours_wo_linear.ckpt
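
To confirm that all expected checkpoints are in place, a minimal check such as the following can be run from the repository root (an illustrative script, not part of the repository):

# optional sanity check for the expected checkpoint files (illustrative, not part of the repo)
from pathlib import Path

expected = [
    "hamer_ours.ckpt",
    "hamer_diag.ckpt",
    "hamer_full.ckpt",
    "hamer_ours_wo_linear.ckpt",
]
ckpt_dir = Path("checkpoints")
missing = [name for name in expected if not (ckpt_dir / name).is_file()]
print("All checkpoints found." if not missing else f"Missing checkpoints: {missing}")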

Installation

Create and activate a virtual environment to work in:

conda create -n hand_uncertainty
conda activate hand_uncertainty
pip install -r requirements.txt
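
After installation, you can optionally verify that PyTorch sees your GPU. This assumes requirements.txt installs PyTorch, which the HaMeR-based training code relies on:

# quick GPU visibility check (optional; assumes torch is installed via requirements.txt)
import torch

print("torch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
print("GPU count:", torch.cuda.device_count())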

HaMeR data and model preparation

Follow the instructions in HaMeR to prepare the trained HaMeR models, the MANO model, and the HaMeR training and evaluation data.

🚀 Training

Before launching a run, set the model type model_type and the GPU device index devices in the code (a hypothetical sketch of these settings follows the model type list below).

[Model Type]

  • ours: our proposed correlation-aware uncertainty parameterization
  • diag: diagonal covariance parameterization
  • full: full covariance parameterization
  • ours_wo_linear: our parameterization with the linear layer removed
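
The exact file holding these settings depends on the training entry point; as a purely hypothetical illustration (variable names are assumptions, not the repository's actual code), the edit could look like:

# hypothetical illustration only -- variable names are assumptions, not the repo's actual code
model_type = "ours"   # one of: "ours", "diag", "full", "ours_wo_linear"
devices = [0]         # index (or list of indices) of the GPU device(s) to use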

You can pass the experiment name ${EXP_NAME} as an argument to the script:

python train.py exp_name=${EXP_NAME} experiment=hamer_vit_transformer trainer=gpu launcher=local

🧪 Testing

Prepare evaluation dataset

Download the FreiHAND and HO-3D evaluation sets from their respective project pages and place them in uncertainty_eval/freihand/gt/ and uncertainty_eval/ho3d/gt/:

hand_uncertainty/
└── uncertainty_eval/
    ├── freihand/
    │   └── gt/
    │       ├── evaluation_verts.json
    │       └── evaluation_xyz.json
    ├── ho3d/
    │   └── gt/
    │       ├── evaluation_verts.json
    │       └── evaluation_xyz.json
    ├── ...
    └── ...
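
To confirm the ground-truth files are in place before evaluating, a small check such as the following can be run from the repository root (an illustrative script, not part of the repository):

# verify that the FreiHAND / HO-3D ground-truth JSONs are where evaluation expects them (illustrative)
from pathlib import Path

for dataset in ("freihand", "ho3d"):
    for fname in ("evaluation_verts.json", "evaluation_xyz.json"):
        path = Path("uncertainty_eval") / dataset / "gt" / fname
        print(("ok     " if path.is_file() else "MISSING"), path)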

Evaluation

Run evaluation on the FreiHAND and HO-3D datasets as follows; the results are stored in results/.
Before running, set the model checkpoint path ckpt_path, the model type model_type, and the experiment name exp_name in the code (a hypothetical sketch of these settings follows the commands below).

python eval.py 
python eval_uncertainty.py 
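
As with training, these settings live in the code; a purely hypothetical example of the values (variable names and values are assumptions, check the actual evaluation scripts) is:

# hypothetical illustration only -- variable names and values are assumptions
ckpt_path = "checkpoints/hamer_ours.ckpt"   # path to the checkpoint to evaluate
model_type = "ours"                         # must match the parameterization of the checkpoint
exp_name = "hamer_ours"                     # determines the output folder results/${EXP_NAME}/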

After running the commands, the results/ directory should look like:

hand_uncertainty/
└── results/
    └── ${EXP_NAME}/
        ├── freihand-val.json
        ├── freihand-val_uncertainty.json
        ├── ho3d-val.json
        └── ho3d-val_uncertainty.json

Evaluate hand pose estimation performance

For FreiHAND and HO-3D, the freihand-val.json and ho3d-val.json prediction files stored in results/ can be evaluated with the corresponding official evaluation procedures.

Evaluate uncertainty estimation performance

Run the commands below to evaluate AUSC, AUSE, and Pearson correlation.
Pass the experiment name ${EXP_NAME} and the directory containing the .json prediction files ${PATH_TO_PRED_DIR} as arguments to the script.

cd uncertainty_eval
python eval_uncertainty.py --dataset freihand --exp ${EXP_NAME} --pred_file_dir ${PATH_TO_PRED_DIR}
python eval_uncertainty.py --dataset ho3d --exp ${EXP_NAME} --pred_file_dir ${PATH_TO_PRED_DIR}

Scores are saved in uncertainty_eval/save/${DATASET}/${EXP_NAME}/scores.txt.
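
For reference, the Pearson correlation reported here measures how well the per-sample predicted uncertainty tracks the per-sample pose error. A minimal, repository-independent sketch of that computation (with made-up array names and values) is:

# repository-independent sketch: Pearson correlation between predicted uncertainty and pose error
# (illustrative only; eval_uncertainty.py implements the full AUSC / AUSE / correlation protocol)
import numpy as np

uncertainty = np.array([0.12, 0.30, 0.08, 0.55])  # per-sample predicted uncertainty (made-up values)
error = np.array([1.1, 2.4, 0.9, 3.8])            # per-sample pose error, e.g. MPJPE in mm (made-up values)

pearson_r = np.corrcoef(uncertainty, error)[0, 1]
print(f"Pearson correlation: {pearson_r:.3f}")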

📚 Citation

If you find this code useful, please consider citing our paper:

@article{chae2025learning,
  title={Learning Correlation-aware Aleatoric Uncertainty for 3D Hand Pose Estimation},
  author={Lee, Chae-Yeon and Nam, Hyeon-Woo and Oh, Tae-Hyun},
  journal={arXiv preprint arXiv:2509.01242},
  year={2025}
}

🙏 Acknowledgement

We heavily borrow code from other projects, in particular HaMeR. We sincerely appreciate the authors of these projects for making their work publicly available.
