FollowMe is a system designed at the C.A.R Lab at Wayne State University that allows one robot equipped with sensors to guide another, sensorless robot along a path. The system and our implementation are described in our paper. The framework consists of a state machine and an adaptive path segmentation algorithm that can be used in a variety of applications and situations. This repository contains a ROS implementation of the FollowMe system, tested on differential drive robots both in the real world and in simulation. Below is a series of steps to get FollowMe running on a similar system with two robots.
The full implementation is explained more thoroughly in the paper. In short, we have two identical front differential drive bases. One carries an NVIDIA Jetson Xavier NX, the other a Raspberry Pi 4. The Jetson robot is the guidance or master robot (both terms are used interchangeably) and has two USB cameras (one on the front and one on the back) as well as an RPLidar S1 mounted on top. The Raspberry Pi robot has no sensors and therefore no way of localizing itself, but it does have 4 AprilTags, one centered on each side, for the master robot's cameras to see and locate.
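If you want to confirm that the master robot can actually see the follower's AprilTags, one quick check is to echo the detection topic. This is a sketch only: it assumes an AprilTag detector such as `apriltag_ros` is running and publishing on `/tag_detections`; the detector node and topic names used by this repository may differ.

```bash
# Hypothetical check, assuming an apriltag_ros-style detector is already running.
# Adjust the topic name to whatever detector this repo actually launches.
rostopic echo /tag_detections
```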
- Install Ubuntu 18.04 Bionic
- Install ROS Melodic using these steps
- Clone this repo and build with `catkin_make` (see the sketch after this list)
  - If `catkin_make` fails due to a missing `Status.h` in `chip_bldc_driver`, run `catkin_make chip_bldc_driver_generate_messages` first, then rebuild the project
- In Ubuntu settings, enable auto-login
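A minimal sketch of the build sequence above, including the message-generation workaround. The paths and clone destination are assumptions; adjust them to your own workspace layout on the Jetson.

```bash
# Sketch only: assumes this repo is cloned into the src folder of a catkin
# workspace. If the repo itself is the workspace root, run the build commands
# from the repo root instead. <this-repo-url> is a placeholder.
cd ~/catkin_ws/src
git clone <this-repo-url>
cd ~/catkin_ws
catkin_make

# Workaround if the build fails with a missing Status.h in chip_bldc_driver:
catkin_make chip_bldc_driver_generate_messages
catkin_make
```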
- Using the Raspberry PI Imager utility, flash an SD card with Raspberry Pi OS (Legacy) Buster version. It will be near the bottom of the "other rpi os" section.
- You can also set up SSH passwords and auto-login when flashing the SD card
- Install ROS Melodic using these steps
- Clone this repo into a folder separate from the `catkin_ws` created in the previous step
- Copy ONLY the `follower_robot` and `follower_motor` directories from the cloned repo into the `catkin_ws` folder
- Build with `catkin_make`
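A minimal sketch of the copy-and-build step above. The clone location and workspace paths are placeholders; adjust them to where your workspace actually lives on the Raspberry Pi.

```bash
# Sketch only: paths are placeholders. Depending on your layout the packages
# may belong in catkin_ws/src rather than catkin_ws itself.
git clone <this-repo-url> ~/followme
cp -r ~/followme/follower_robot ~/followme/follower_motor ~/catkin_ws/src/
cd ~/catkin_ws
catkin_make
```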
- Follow the steps in `79-robot.rules.md` to enable the correct udev rules on both the follower and guidance robots
  - This only needs to be done once (reboot both robots after doing so)
- Run `source usbsetup.sh` on both machines (run this on every startup, or set it up as a cron job)
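If you want the script to run automatically at boot instead of sourcing it by hand, a cron `@reboot` entry is one option. This is a sketch under assumptions: the path to `usbsetup.sh` is a placeholder, and it assumes the script only needs to run once per boot (e.g. to fix device permissions) rather than export variables into your interactive shells.

```bash
# Open the current user's crontab for editing
crontab -e

# Then add a line like this (the path is a placeholder; point it at the real usbsetup.sh):
# @reboot /bin/bash -c 'cd /home/<user>/catkin_ws && source usbsetup.sh'
```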
SSH Setup: It is recommended not to have either the Jetson or the Raspberry Pi connected via HDMI while it is moving (for obvious reasons). You can use a separate computer (any OS), or a Linux machine with ROS installed (Melodic or Noetic) if you want to use RViz and other utilities.
1. Get the IP address of each machine and note it down by running `ifconfig` and checking the `inet` entry under `wlan0`. Make sure each robot's computer is set to auto-login, so that once you disconnect the display you do not have to connect one again.
2. Using the central computer (neither of the robots), edit `/etc/hosts` (or your OS's equivalent) to give a name to each robot's computer (I used `jetson` and `raspberrypi`).
3. Run `ssh [username]@[computer_name]` for each robot in a new terminal window (ex: `ssh nvidia@jetson`).
4. Set the environment variable `$ROS_IP` on ALL machines (including the remote computer) to each machine's own IP.
5. On all machines except the master/guidance robot, add the master robot to the list in `/etc/hosts`.
6. On all machines except the master/guidance robot, set the environment variable `$ROS_MASTER_URI` to `http://[computer_name]:11311` (ex: `http://jetson:11311`).
   - You should put the commands for steps 4 and 6 in `.bashrc` (or your shell's run configuration file) so they run every time a shell is opened, as sketched below.
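A minimal sketch of what the `.bashrc` additions from steps 4 and 6 might look like on the follower or the remote computer. The IP address and host name are placeholders; use each machine's own `wlan0` address and the name you gave the master robot in `/etc/hosts`.

```bash
# Append to ~/.bashrc on each non-master machine (values are placeholders).
export ROS_IP=192.168.1.42                 # this machine's own wlan0 address
export ROS_MASTER_URI=http://jetson:11311  # master/guidance robot's host name

# On the master robot itself, only ROS_IP is needed per the steps above;
# roscore runs locally there.
```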
- Complete all installation and common setup steps above.
- Run `roslaunch robot robot.launch` on the guidance robot and `roslaunch follower_robot robot.launch` on the follower robot
- In a separate terminal on the master robot, run `rosrun robot followme.py`
- Open RViz on the remote computer (you can use the config included in `rviz/main.rviz`), specify a Nav goal, and watch the master robot navigate while the follower robot follows it.
The following steps show how to run basic demos of the individual sensors and motors on the robots. They are outdated if you plan on using FollowMe and the setup described in `79-robot-rules.md`.
Start the LiDAR node:
`sudo chmod 666 /dev/ttyUSB0`
`roslaunch rplidar_ros rplidar_s1.launch`

Start the motor driver node:
`sudo chmod 777 /dev/ttyUSB0`
`roslaunch chip_bldc_driver example.launch`

Start the camera node:
`roslaunch realsense2_camera rs_camera.launch initial_reset:=true enable_gyro:=true enable_accel:=true align_depth:=true filters:=pointcloud`
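To verify that the camera node above is actually publishing before moving on to detection, you can check its image topics. The topic names below are the `realsense2_camera` defaults and are an assumption here.

```bash
# Confirm the RealSense node is up and streaming (default topic names assumed).
rostopic list | grep camera
rostopic hz /camera/color/image_raw
```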
Object Detection and Distance Estimation:
`rosrun ros_deep_learning detectnet /detectnet/image_in:=/camera/color/image_raw` (start the inferencing node first)
`rosrun camera detectdepth.py`

Heading Tracking with IMU:
`rosrun camera heading.py`
SLAM-type Mapping (doesn't work very well) using heading:
`rosrun camera mapper.py`
`rviz`

LiDAR demos. Start with:
`roslaunch rplidar_ros rplidar_s1.launch`

Single revolution plotted as a map:
`rosrun lidar single_map.py`

Multi Object Detection:
`rosrun lidar pointcloud.py`
`rosrun multi_object_tracking_lidar kf_tracker`
`rviz`

Motors:
`rosrun chip_bldc_driver bldc_driver_node`
`rosrun motor [stepper.py, rpm.py etc.]`
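If a demo appears to do nothing, a quick sanity check is to list the running nodes and topics and watch one of them. These are standard ROS command-line tools; the topic names and `grep` filters below are only examples, not names guaranteed by this repo.

```bash
# Generic sanity checks for any of the demos above.
rosnode list                                        # confirm the expected node started
rostopic list | grep -i -e scan -e bldc -e camera   # example filters only
rostopic echo -n 1 /scan                            # e.g. print one LiDAR message (topic name assumed)
```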