3D LiDAR place recognition targeting the heterogeneous robots scenario

SYSU-RoboticsLab/EHPR


Effective Heterogeneous Point Cloud-Based Place Recognition and Relative Localization for Ground and Aerial Vehicles

Rui Mao and Hui Cheng

2025 IEEE International Conference on Robotics and Automation

Introduction

EHPR is a 3D LiDAR place recognition method tailored to heterogeneous-robot scenarios. We propose a pipeline based on bird's-eye-view (BEV) density images combined with an enhanced data structure, together with an efficient height-alignment algorithm for relative localization. We also show that our method can detect inter- and intra-robot loop closures in a ground–aerial multi-session SLAM system.
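To illustrate the BEV density-image idea, the sketch below projects a point cloud onto a top-down grid and counts the points in each cell. This is a minimal conceptual example, not the EHPR implementation; the grid size and resolution are arbitrary assumptions.

```python
import numpy as np

def bev_density_image(points, resolution=0.2, size=64):
    """Project an (N, 3) point cloud onto a top-down grid and count
    the points falling into each cell (a BEV density image).
    `resolution` is meters per cell; `size` is cells per side."""
    half = size * resolution / 2.0
    # Shift x/y so the sensor sits at the image center, then discretize.
    cols = np.floor((points[:, 0] + half) / resolution).astype(int)
    rows = np.floor((points[:, 1] + half) / resolution).astype(int)
    # Drop points outside the grid.
    keep = (cols >= 0) & (cols < size) & (rows >= 0) & (rows < size)
    img = np.zeros((size, size), dtype=np.uint32)
    # Unbuffered accumulation: cells hit repeatedly are counted correctly.
    np.add.at(img, (rows[keep], cols[keep]), 1)
    return img
```

Because the projection discards height, ground and aerial scans of the same place produce comparable images, which is what makes a BEV representation attractive for heterogeneous robots.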

If you find this work useful, please consider citing our paper:

@INPROCEEDINGS{11128163,
  author={Mao, Rui and Cheng, Hui},
  booktitle={2025 IEEE International Conference on Robotics and Automation (ICRA)}, 
  title={Effective Heterogeneous Point Cloud-Based Place Recognition and Relative Localization for Ground and Aerial Vehicles}, 
  year={2025},
  volume={},
  number={},
  pages={15828-15834},
  doi={10.1109/ICRA55743.2025.11128163}}

Installation

a. Ubuntu and ROS

We tested our code on Ubuntu 20.04 with ROS Noetic.

b. Packages from the Ubuntu repositories

sudo apt install libboost-dev libtbb-dev libgoogle-glog-dev

c. GTSAM for demo

Follow the official GTSAM installation instructions. We tested our code with GTSAM 4.2.

d. Build and compile

mkdir -p catkin_ws/src
cd catkin_ws/src
git clone https://github.com/SYSU-RoboticsLab/EHPR.git
cd ..
catkin_make

Example

This repository provides an implementation of EHPR, along with a simple demo of a ground–aerial multi-session SLAM system.

Demo Data

To run the demo, download the LiDAR data collected by our UAV and UGV:

heterogeneous data

We use a front-end odometry method (e.g., FAST-LIO2) to estimate poses, then save each sequence in the following format:

<sequence_name>/
├── pcd/
│   ├── 0.pcd                 # one scan per file
│   ├── 1.pcd
│   └── ...
├── raw_imu_data.txt          # timestamp + IMU measurements (e.g., ts qx qy qz qw ...)
├── register_lidar.txt        # timestamp + index + pointcloud filename (e.g., ts 0 0.pcd)
└── register_pose.txt         # timestamp + 3x4 pose matrix [R|t] (e.g., ts r11 r12 r13 tx r21 r22 r23 ty r31 r32 r33 tz)

Note: If you use your own dataset, you can either follow the same file organization above or modify the input parser in the code to match your data format.
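As a hedged illustration of the pose format above (not code from this repository), the helper below reads a `register_pose.txt`-style file, where each line holds a timestamp followed by a row-major 3x4 matrix [R|t], and returns 4x4 homogeneous poses:

```python
import numpy as np

def parse_register_pose(path):
    """Parse lines of the form
    'ts r11 r12 r13 tx r21 r22 r23 ty r31 r32 r33 tz'
    into (timestamps, list of 4x4 homogeneous pose matrices)."""
    timestamps, poses = [], []
    with open(path) as f:
        for line in f:
            vals = line.split()
            if len(vals) != 13:
                continue  # skip blank or malformed lines
            timestamps.append(float(vals[0]))
            T = np.eye(4)
            # The 12 values after the timestamp are the 3x4 [R|t] block, row-major.
            T[:3, :4] = np.array(vals[1:], dtype=float).reshape(3, 4)
            poses.append(T)
    return timestamps, poses
```

If your own front-end odometry writes a different layout (e.g., quaternion + translation), adapting this reshape step is usually all that is needed.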

Run the Demo

Edit the paths ground_root and air_root in config/multisession_video.yaml and launch the demo:

source devel/setup.bash
roslaunch ehpr multi_video.launch

Acknowledgments

In the development of EHPR, we stand on the shoulders of the following repositories:

  • Contour Context: Abstract Structural Distribution for 3D LiDAR Loop Detection and Metric Pose Estimation
  • MapClosures: Effectively Detecting Loop Closures using Point Cloud Density Maps
  • STD: A Stable Triangle Descriptor for 3D place recognition
  • HBST: Hamming Binary Search Tree

Contact Us

For any technical issues, please contact Rui Mao via email (maor8@mail2.sysu.edu.cn).
