
# DexLearn

Learning-based grasp synthesis baselines (e.g., diffusion models and normalizing flows) for dexterous hands, used in BODex (ICRA 2025) and Dexonomy (RSS 2025).

## TODO list

  • Support BODex and Dexonomy datasets
  • Release grasp type classifier for Dexonomy

## Installation

```bash
git submodule update --init --recursive --progress

conda create -n dexlearn python=3.10
conda activate dexlearn

# pytorch
conda install -c conda-forge mkl=2020.2 -y
conda install pytorch==2.0.1 pytorch-cuda=11.7 -c pytorch -c nvidia

# pytorch3d
wget https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud/pytorch3d/linux-64/pytorch3d-0.7.8-py310_cu118_pyt210.tar.bz2
conda install -y --use-local ./pytorch3d-0.7.8-py310_cu118_pyt210.tar.bz2

# Diffusers
cd third_party/diffusers
pip install -e .
cd ../..

# MinkowskiEngine
cd third_party/MinkowskiEngine
sudo apt install libopenblas-dev
export CUDA_HOME=/usr/local/cuda-11.7
python setup.py install --blas=openblas
cd ../..

# nflows
cd third_party/nflows
pip install -e .
cd ../..

# dexlearn
pip install -e .
pip install numpy==1.26.4
pip install hydra-core
```

You may need to run the following command to avoid potential MKL-related errors such as `undefined symbol: iJIT_NotifyEvent`:

```bash
conda install -c conda-forge mkl=2020.2 -y
```
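After installation, a quick import check can catch a broken environment before training. This is a minimal sketch, not part of the repository; the module names below are assumptions based on the packages installed above, so adjust them if your environment differs.

```python
# Sanity check: report which core dependencies cannot be imported.
import importlib.util


def missing_modules(names):
    """Return the module names that cannot be found in the current environment."""
    return [n for n in names if importlib.util.find_spec(n) is None]


# Assumed import names for the packages installed above.
deps = ["torch", "pytorch3d", "diffusers", "MinkowskiEngine", "nflows", "hydra"]
print("missing:", missing_modules(deps))
```

An empty `missing` list means all dependencies are importable; otherwise, revisit the corresponding install step.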

## Getting Started
1. **Prepare assets**. Download the Dexonomy dataset from [Hugging Face](https://huggingface.co/datasets/JiayiChenPKU/Dexonomy). The folder should be organized as below:

```
DexLearn/assets
|- grasp
|  |_ Dexonomy_GRASP_shadow
|     |_ succ_collect
|_ object
   |- DGN_5k
   |  |- valid_split
   |  |- processed_data
   |  |- scene_cfg
   |  |_ vision_data
   |_ objaverse_5k
      |- valid_split
      |- processed_data
      |- scene_cfg
      |_ vision_data
```

Similarly, you can download the BODex dataset from [Hugging Face](https://huggingface.co/datasets/JiayiChenPKU/BODex).
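Before training, it can help to verify that the downloaded assets match the expected layout. This is an illustrative sketch, not part of the repository; the `EXPECTED_DIRS` paths are taken from the folder structure above, and the helper name `missing_asset_dirs` is hypothetical.

```python
# Check that the expected asset sub-directories exist under the given root.
from pathlib import Path

# Paths from the expected folder structure (extend as needed).
EXPECTED_DIRS = [
    "grasp/Dexonomy_GRASP_shadow/succ_collect",
    "object/DGN_5k/valid_split",
    "object/DGN_5k/processed_data",
    "object/DGN_5k/scene_cfg",
    "object/DGN_5k/vision_data",
    "object/objaverse_5k/valid_split",
]


def missing_asset_dirs(root="assets"):
    """Return the expected sub-directories that are absent under `root`."""
    root = Path(root)
    return [d for d in EXPECTED_DIRS if not (root / d).is_dir()]


if __name__ == "__main__":
    print("missing:", missing_asset_dirs("DexLearn/assets"))
```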

2. **Running**. 
```bash
CUDA_VISIBLE_DEVICES=0 python -m dexlearn.train exp_name=type1 algo=nflow data=dexonomy_shadow
CUDA_VISIBLE_DEVICES=0 python -m dexlearn.sample -e dexonomy_shadow_nflow_type1
```

If you want to use our pre-trained models, please download the checkpoints from Hugging Face and put them in the output folder:

```
DexLearn/output
|- dexonomy_shadow_nflow_cond
|   |- .hydra
|   |- ckpts
```

Then you can run the sampling command below:

```bash
CUDA_VISIBLE_DEVICES=0 python -m dexlearn.sample -e dexonomy_shadow_nflow_cond algo.model.grasp_type_emb.use_predictor=True
```

Training the grasp type predictor is not supported yet.

3. **Evaluation in simulation**. Please see DexGraspBench.

## License

This work and the dataset are licensed under CC BY-NC 4.0.


## Citation

If you find this work useful for your research, please consider citing:

```
@article{chen2024bodex,
  title={BODex: Scalable and Efficient Robotic Dexterous Grasp Synthesis Using Bilevel Optimization},
  author={Chen, Jiayi and Ke, Yubin and Wang, He},
  journal={arXiv preprint arXiv:2412.16490},
  year={2024}
}
@article{chen2025dexonomy,
  title={Dexonomy: Synthesizing All Dexterous Grasp Types in a Grasp Taxonomy},
  author={Chen, Jiayi and Ke, Yubin and Peng, Lin and Wang, He},
  journal={Robotics: Science and Systems},
  year={2025}
}
```

## Acknowledgment

Thanks to @haoranliu for his codebase and help with normalizing flows.
