This work deals with the localization of a mobile robot. ORB_SLAM3 pose estimates and IMU measurements are fused using an extended Kalman filter. The work is based on ORB_SLAM3, robot_pose_ekf, imu_calibration and unified_calibration.
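As a rough illustration of the loose-coupling scheme, the sketch below shows a toy 1-D Kalman filter (an illustration with assumed values, not the robot_pose_ekf code): the IMU drives the prediction step, while the ORB_SLAM3 pose serves as the measurement in the update step:

import numpy as np

# Toy 1-D illustration of loose coupling (hypothetical sketch, not the
# robot_pose_ekf implementation): IMU acceleration drives the prediction,
# the ORB_SLAM3 position corrects it in the update.
x = np.zeros(2)             # state: [position, velocity]
P = np.eye(2)               # state covariance
Q = np.diag([1e-4, 1e-2])   # process noise (assumed values)
R = np.array([[1e-3]])      # SLAM measurement noise (assumed value)
H = np.array([[1.0, 0.0]])  # the SLAM pose observes position only

def predict(accel, dt):
    # Propagate state and covariance with one IMU sample.
    global x, P
    F = np.array([[1.0, dt], [0.0, 1.0]])
    x = F @ x + np.array([0.5 * dt**2, dt]) * accel
    P = F @ P @ F.T + Q

def update(slam_pos):
    # Correct the prediction with an ORB_SLAM3 position estimate.
    global x, P
    S = H @ P @ H.T + R             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + (K @ (slam_pos - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P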
Ubuntu 18.04
ROS Melodic
RealSense D435i
install librealsense:
./install_script_chen
install Eigen3: (http://eigen.tuxfamily.org/index.php?title=Main_Page)
install Ceres Solver: (http://ceres-solver.org/installation.html)
create catkin workspace:
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
catkin_init_workspace
cd ~/catkin_ws
catkin_make
source ~/catkin_ws/devel/setup.bash
install dependencies for kalibr:
sudo apt-get install python-setuptools python-rosinstall ipython libeigen3-dev libboost-all-dev doxygen libopencv-dev ros-melodic-opencv ros-melodic-image-transport-plugins ros-melodic-cmake-modules python-software-properties software-properties-common libpoco-dev python-matplotlib python-scipy python-git python-pip ipython libtbb-dev libblas-dev liblapack-dev python-catkin-tools libv4l-dev
clone the repository:
cd ~/catkin_ws/src
git clone https://github.com/hustchenli/Visual_SLAM_IMU_loose_coupling.git
cd ..
catkin_make
build ORB_SLAM3:
cd ~/PATH/ORB_SLAM3
chmod +x build.sh
./build.sh
gedit ~/.bashrc
add the following line at the end of the file:
export ROS_PACKAGE_PATH=${ROS_PACKAGE_PATH}:/PATH/ORB_SLAM3/Examples/ROS
chmod +x build_ros.sh
./build_ros.sh
build robot_pose_ekf:
rosdep install robot_pose_ekf
roscd robot_pose_ekf
rosmake
Find the camera launch file rs_camera.launch and change the value of unite_imu_method to linear_interpolation:
<arg name="unite_imu_method" default="linear_interpolation"/>
Plug the camera into the computer and run:
roscore
roslaunch realsense2_camera rs_camera.launch
rostopic list
Check whether the topic /camera/imu is listed.
Then keep the camera still and record the data from the IMU for as long as possible:
roscore
roslaunch realsense2_camera rs_camera.launch
rosbag record -O imu_calibration /camera/imu
Change the value of max_time_min in ~/imu_utils/launch/d435i_imu_calibration.launch to the length of the recording (in minutes), then run:
roslaunch imu_utils d435i_imu_calibration.launch
rosbag play -r 50 imu_calibration.bag
The result of the IMU calibration is written to ~/imu_utils/data/d435i_imu_calibration_imu_param.yaml.
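The noise parameters from this file are needed later by kalibr: the imu.yaml passed to kalibr_calibrate_imu_camera below follows kalibr's documented IMU format, roughly as shown here; the numbers are placeholders to be replaced with your own calibration results:

#Accelerometers
accelerometer_noise_density: 1.86e-03   #noise density (continuous-time)
accelerometer_random_walk:   4.33e-04   #bias random walk

#Gyroscopes
gyroscope_noise_density:     1.87e-04   #noise density (continuous-time)
gyroscope_random_walk:       2.66e-05   #bias random walk

rostopic:                    /imu       #the IMU ROS topic
update_rate:                 200.0      #Hz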
Download and print the calibration target (https://github.com/ethz-asl/kalibr/wiki/downloads) and create a yaml file for it. If the target is printed on A4 paper, the yaml file should be:
target_type: 'aprilgrid' #gridtype
tagCols: 6 #number of apriltags
tagRows: 6 #number of apriltags
tagSize: 0.021 #size of apriltag, edge to edge [m]
tagSpacing: 0.3 #ratio of space between tags to tagSize
Point the camera at the calibration target and try to keep the target in the camera's field of view. Translate and rotate the camera and record image data for one to two minutes:
roscore
roslaunch realsense2_camera rs_camera.launch
rosrun topic_tools throttle messages /camera/color/image_raw 4.0 /color
rosbag record -O camd435i /color
Run the calibration program:
kalibr_calibrate_cameras --target /PATH/April.yaml --bag /PATH/camd435i.bag --bag-from-to 20 110 --models pinhole-radtan --topics /color --show-extraction
The calibration results are then written to yaml, txt and pdf files.
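The yaml file, used as camera.yaml in the camera-IMU calibration below, follows kalibr's camchain format, roughly as follows (the numbers are placeholders):

cam0:
  camera_model: pinhole
  intrinsics: [615.0, 615.0, 320.0, 240.0]    #fu, fv, pu, pv (placeholders)
  distortion_model: radtan
  distortion_coeffs: [0.01, -0.01, 0.0, 0.0]  #placeholders
  resolution: [640, 480]
  rostopic: /color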
Move the camera for one to two minutes, keeping the calibration target in the camera's field of view, and record data from the camera and IMU:
roscore
rosrun topic_tools throttle messages /camera/color/image_raw 20.0 /color
rosrun topic_tools throttle messages /camera/imu 200.0 /imu
rosbag record -O dynamic /color /imu
After recording, run:
kalibr_calibrate_imu_camera --target /PATH/April.yaml --cam /PATH/camera.yaml --imu /PATH/imu.yaml --bag /PATH/dynamic.bag --show-extraction
In the file camchain-imucam-dynamic.yaml you will find the transformation matrix between camera and IMU.
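The relevant entry has roughly the following shape (placeholder numbers); kalibr's T_cam_imu maps IMU-frame coordinates into the camera frame:

cam0:
  T_cam_imu:                 #4x4 homogeneous transform (IMU frame -> camera frame)
  - [1.0, 0.0, 0.0, 0.005]
  - [0.0, 1.0, 0.0, 0.011]
  - [0.0, 0.0, 1.0, 0.012]
  - [0.0, 0.0, 0.0, 1.0]
  timeshift_cam_imu: 0.002   #time offset between camera and IMU [s]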
Change the value of the parameter T_cam_imu in camimu_trans.py according to the result in camchain-imucam-dynamic.yaml, and change the camera intrinsics in ~/ORB_SLAM3/D435i.yaml according to the camera calibration results.
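For orientation, this is presumably the frame change that camimu_trans.py performs with T_cam_imu (a hypothetical sketch; the actual node also handles the ROS subscribing and publishing and may differ):

import numpy as np

# Hypothetical sketch of the frame change in camimu_trans.py (the actual
# script may differ). Since T_cam_imu maps IMU-frame coordinates into the
# camera frame, the IMU pose in the world frame is T_world_cam * T_cam_imu.
T_cam_imu = np.array([           # placeholder values; insert the kalibr result
    [1.0, 0.0, 0.0, 0.005],
    [0.0, 1.0, 0.0, 0.011],
    [0.0, 0.0, 1.0, 0.012],
    [0.0, 0.0, 0.0, 1.0],
])

def camera_pose_to_imu_pose(T_world_cam):
    # T_world_cam: 4x4 camera pose from ORB_SLAM3 in the world frame.
    return T_world_cam @ T_cam_imu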
Turn on the camera:
roscore
roslaunch realsense2_camera rs_camera.launch
Run:
python camimu_trans.py
Turn on the extended Kalman filter:
roslaunch robot_pose_ekf.launch
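The launch file configures which sensors the filter fuses. An assumed shape for this setup, using only parameters documented for robot_pose_ekf (the ORB_SLAM3 pose enters as visual odometry, wheel odometry is disabled):

<launch>
  <node pkg="robot_pose_ekf" type="robot_pose_ekf" name="robot_pose_ekf">
    <param name="output_frame" value="odom_combined"/>
    <param name="freq" value="30.0"/>
    <param name="sensor_timeout" value="1.0"/>
    <param name="odom_used" value="false"/>  <!-- no wheel odometry -->
    <param name="imu_used" value="true"/>
    <param name="vo_used" value="true"/>     <!-- visual pose, assumed to be fed by camimu_trans.py -->
  </node>
</launch>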
Turn on ORB_SLAM3:
rosrun ORB_SLAM3 RGBD /PATH/ORB_SLAM3/Vocabulary/ORBvoc.txt /PATH/ORB_SLAM3/D435i.yaml
The estimated pose from ORB_SLAM3 without fusion is published on:
rostopic echo /camera_pose
The estimated pose from the fusion of ORB_SLAM3 and IMU is published on:
rostopic echo /robot_pose_ekf/odom_combined
To test the localization accuracy, a FARO Vantage laser tracker is used. A spherically mounted retroreflector (SMR), which serves as the tracking target for the FARO, is attached to the camera. Since the FARO measures the position of the SMR with very high accuracy, its measurements can be regarded as the ground truth.
FARO measurements are saved in ~/Examples/ORB_SLAM_results/test09114.xlsx and ~/Examples/ORB_SLAM_results/test09118.xlsx. The results of ORB_SLAM3 without IMU are saved in ~/Examples/ORB_SLAM_results/cpose09114.bag and ~/Examples/ORB_SLAM_results/cpose09118.bag. The Matlab script ~/Examples/ORB_SLAM_results/Error_cal_orb.m calculates the absolute trajectory error (ATE). The results of the ORB_SLAM3/IMU fusion are saved in ~/Examples/EKF_results/ekf09114.bag and ~/Examples/EKF_results/ekf09118.bag. The Matlab script ~/Examples/EKF_results/Error_cal_ekf.m calculates the ATE of the fusion.
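For reference, ATE is the root-mean-square of the translational errors after rigidly aligning the estimated trajectory to the ground truth. A minimal Python equivalent of what the Matlab scripts compute (a sketch, assuming the two trajectories are already synchronized):

import numpy as np

def absolute_trajectory_error(est, gt):
    # est, gt: (N, 3) arrays of synchronized positions.
    # Align the estimate to the ground truth with a rigid transform
    # (Horn's / Umeyama's method), then take the RMSE of the residuals.
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    H = (est - mu_e).T @ (gt - mu_g)        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                      # optimal rotation
    t = mu_g - R @ mu_e                     # optimal translation
    aligned = est @ R.T + t
    return np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1)))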