Report: more details are available in report.pdf
MVA Project - Robotics Extension of the CoRL 2025 paper by Hassidof et al.
While the original DiTree relies on global diffusion policies, exploring complex mazes with narrow passages remains challenging. In this fork, we implement a Guided Tree Search strategy to bias exploration.
- BFS Initialization: We introduced a discrete grid-based BFS pre-processing step (`generate_intermediate_goals`) that finds a coarse path skeleton avoiding static obstacles.
- Intermediate Goals: The planner now automatically generates a sequence of intermediate waypoints along the BFS path.
- Biased Sampling Strategy: We modified the `RRT.py` expansion loop. Instead of sampling purely at random or toward the final goal, the tree now progressively targets the next intermediate goal based on a proximity cost heuristic (`get_distance_to_goal`).
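The BFS pre-processing step can be sketched as follows. This is a hypothetical, simplified version of `generate_intermediate_goals`: the occupancy-grid representation (0 = free, 1 = obstacle), the 4-connected neighborhood, and the `stride` parameter are assumptions, not the fork's exact implementation.

```python
from collections import deque

def generate_intermediate_goals(grid, start, goal, stride=5):
    """Hypothetical sketch: BFS over a coarse occupancy grid.

    Returns every `stride`-th cell of the shortest grid path from
    `start` to `goal` as a list of intermediate goals (the last
    element is always the goal itself), or [] if no path exists.
    """
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            break
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in parent):
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    if goal not in parent:
        return []  # no collision-free grid path
    # Reconstruct the path by walking parents back from the goal.
    path, cell = [], goal
    while cell is not None:
        path.append(cell)
        cell = parent[cell]
    path.reverse()
    goals = path[stride::stride]
    if not goals or goals[-1] != goal:
        goals.append(goal)
    return goals
```

Because BFS explores cells in order of hop count, the reconstructed path is shortest in grid steps, so the resulting waypoints form a coarse but monotone skeleton toward the goal.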
Result: This approach significantly reduces the time to find a first solution in "Maze" environments by guiding the diffusion sampler through bottlenecks.
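The biased sampling strategy described above could be sketched roughly as below. This is an illustrative sketch, not the fork's code: the function name `sample_target`, the `bias` and `reach_tol` parameters, and the inline distance helper (standing in for `get_distance_to_goal`) are all assumptions.

```python
import random

def sample_target(waypoints, idx, tree_nodes, bounds, bias=0.7, reach_tol=0.5):
    """Hypothetical sketch of the biased expansion target.

    With probability `bias`, aim at the current intermediate goal;
    otherwise fall back to uniform sampling over `bounds`. The
    waypoint index advances once any tree node is within
    `reach_tol` of the current waypoint.
    """
    def distance(node, goal):  # stand-in for get_distance_to_goal
        return sum((a - b) ** 2 for a, b in zip(node, goal)) ** 0.5

    # Advance past waypoints the tree has already reached.
    while idx < len(waypoints) - 1 and any(
            distance(n, waypoints[idx]) < reach_tol for n in tree_nodes):
        idx += 1
    if random.random() < bias:
        target = waypoints[idx]  # pull expansion toward the skeleton
    else:
        target = tuple(random.uniform(lo, hi) for lo, hi in bounds)
    return target, idx
```

Keeping `bias` strictly below 1 preserves a uniform-sampling component, so the planner retains probabilistic completeness while the waypoints steer the diffusion sampler through bottlenecks.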
Official implementation of the paper "Train-Once Plan-Anywhere Kinodynamic Motion Planning via Diffusion Trees" [CoRL 2025]
This repository contains the code and experiment setup for "Train-Once Plan-Anywhere Kinodynamic Motion Planning via Diffusion Trees". It includes training configuration, experiment execution scripts, and pretrained model checkpoints.
The associated car dataset and model weights are available for download here.
We recommend creating a Python virtual environment before installing dependencies.
- Clone the repository
```
git clone https://github.com/Yanivhass/ditree.git
cd ditree
```
- Create a virtual environment
```
python3 -m venv venv
source venv/bin/activate   # On Linux/Mac
venv\Scripts\activate      # On Windows
```
- Install PyTorch
Visit PyTorch.org to find the correct installation command for your system (CPU or GPU). Example (CUDA 11.8):
```
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
```
- Install other dependencies
We provide a minimal list of dependencies for running the car experiments. Other functionality may require additional installation (e.g., MuJoCo for the Ant environment).
```
pip install -r requirements.txt
```
```
├── run_scenarios.py    # Runs the experiments
├── train_manager.py    # Configures and launches training sessions
├── checkpoints/        # Contains pretrained model weights
├── data/               # (optional) Local dataset storage
└── requirements.txt    # Python dependencies
```
Run the experiments:
```
python run_scenarios.py
```
Configure and launch training:
```
python train_manager.py
```
Pretrained model weights and the car dataset can be downloaded from the Google Drive link. The AntMaze dataset is available via Minari. Place the downloaded files into:
```
checkpoints/
data/
```
If you use this work, please cite:
```bibtex
@inproceedings{hassidof2025trainonce,
  title={Train-Once Plan-Anywhere Kinodynamic Motion Planning via Diffusion Trees},
  author={Yaniv Hassidof and Tom Jurgenson and Kiril Solovey},
  booktitle={9th Annual Conference on Robot Learning},
  year={2025},
  url={https://openreview.net/forum?id=lJWUourMTT}
}
```