ICRA 2025 [Paper]
Sim2real robot manipulation utilizing GS modeling
(Demo video: RL-GSBridge_pipe.mp4)
Currently released: GS and mesh reconstruction; code for policy training and GS rendering with PyBullet.
TODO: add running description; possible data; GS link?
Install SAM-Track as follows (only needed for reconstructing your own data):

```shell
git clone https://github.com/z-x-yang/Segment-and-Track-Anything.git
cd Segment-and-Track-Anything && bash script/install.sh
bash script/download_ckpt.sh
```

Install the required dependencies of soft mesh binding GS as follows:

```shell
cd soft-gaussian-mesh-splatting
pip install -r requirements.txt
pip install submodules/diff-gaussian-rasterization
pip install submodules/simple-knn
```

To run the sim2real policy, you additionally need the pybullet package:

```shell
pip install pybullet
```

We provide an example of GS training on the 'Banana' object from our paper. Download the data from this link, unpack it, and move it to the folder exp_obj_data.
The mesh model and masks have been generated and preprocessed in this example. For full reconstruction from raw image data, please refer to the next section.
Create the GS object folder and go to the folder soft-gaussian-mesh-splatting:

```shell
mkdir exp_obj_GS
cd soft-gaussian-mesh-splatting
```

For training, run:

```shell
python train.py -s ./exp_obj_data/banana/ -i mask_banana -m ./exp_obj_GS/banana_mesh_0 --gs_type gs_mesh_norm_aug --num_splats 2 --sh_degree 0
```

Here, 'gs_mesh_norm_aug' selects our soft mesh binding GS method. To run the raw GaMeS method instead, change --gs_type to 'gs_mesh'.
For evaluation of the training results, run:

```shell
# render the training set
python scripts/render.py --gs_type gs_mesh_norm_aug -m ./exp_obj_GS/banana_mesh_0
# calculate the metrics
python metrics.py --gs_type gs_mesh_norm_aug --model_paths ./exp_obj_GS/banana_mesh_0
```

The evaluation results are saved to the file 'results_$GS_TYPE.json' under the model path.

Our training and evaluation results for soft mesh binding GS and raw GaMeS are provided in this link, where '_noaug' marks the raw GaMeS training results.
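The metrics file can be inspected with a few lines of Python. A minimal sketch, assuming the layout used by the standard Gaussian Splatting evaluation script, where the JSON maps each checkpoint name to SSIM/PSNR/LPIPS values (the key names and the sample numbers below are assumptions, not real results):

```python
import json
from pathlib import Path

# Hypothetical example of the metrics layout written by metrics.py; the
# checkpoint name and metric keys are assumptions based on the standard
# Gaussian Splatting evaluation script.
sample = {"ours_30000": {"SSIM": 0.97, "PSNR": 32.1, "LPIPS": 0.04}}
path = Path("results_gs_mesh_norm_aug.json")
path.write_text(json.dumps(sample))

# Read the file back and print one line per evaluated checkpoint.
results = json.loads(path.read_text())
for ckpt, metrics in results.items():
    line = " ".join(f"{k}={v}" for k, v in sorted(metrics.items()))
    print(ckpt, line)
```

This makes it easy to compare the 'gs_mesh_norm_aug' and '_noaug' runs side by side from their respective JSON files.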
TODO
To create object masks and mesh models for your own data, SAM-Track and COLMAP are required, along with a recorded video of the object.
TODO
Follow the instructions in SAM-Track to segment each frame of your video. TODO: FILE NAME DEFINITION
We provide some existing GS object and background models for 'Banana Grasping' training in this link. The file contains GS models of a foam pad background and a cake; their physical parameters are already recorded in RLGS-bridge-pub/obj_trans.json. Unpack it and put the files into the exp_obj_GS folder.
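If you add your own GS objects, their physical parameters go into the same JSON file. A minimal sketch of reading such a file (the object names and the "scale"/"mass" fields below are assumptions for illustration, not the repo's actual schema):

```python
import json
from pathlib import Path

# Stand-in for RLGS-bridge-pub/obj_trans.json; the real field names are not
# documented here, so this schema is hypothetical.
Path("obj_trans.json").write_text(json.dumps({
    "cake": {"scale": 1.0, "mass": 0.15},
    "foam_pad": {"scale": 1.0, "mass": 0.5},
}))

# Load the per-object physical parameters for use in the simulator.
obj_trans = json.loads(Path("obj_trans.json").read_text())
for name, params in obj_trans.items():
    print(name, params)
```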
Go to the policy folder:

```shell
cd RLGS-bridge-pub
```

For policy training with GS rendering, run:

```shell
python learn_eih_SAC_meshGS.py -t 4 -q -b -c -i -l 'your training file path' -r -m mono --mesh --strain --color_refine --use_force
```

You can find your training logs under the folder saves. For training without rendering, simply run the command without -r. For an explanation of the remaining parameters, please refer to this repo.
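As a hedged illustration of how a toggle like -r is typically wired up, here is an argparse sketch; the long option names, defaults, and help text are assumptions, and the actual parser in learn_eih_SAC_meshGS.py may differ:

```python
import argparse

# Sketch of a renderer on/off switch; the flag letters mirror the commands
# above, but everything else about this parser is hypothetical.
parser = argparse.ArgumentParser()
parser.add_argument("-r", "--render", action="store_true",
                    help="enable GS rendering during training")
parser.add_argument("-l", "--load_path", default=None,
                    help="training file path")

args = parser.parse_args(["-r", "-l", "runs/banana"])   # rendering on
no_render = parser.parse_args(["-l", "runs/banana"])    # rendering off
print(args.render, no_render.render)
```

With action="store_true", omitting -r leaves rendering disabled, which matches the "run the command without -r" instruction above.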
For a policy test, run:

```shell
python test_eih_SAC_meshGS.py -t 4 -l 'your training file path' -b i -r --mesh
```

You can find the realistic rendering images under the folder test_out.