The murc_robot repo presents an image-based gripping strategy for cooperative object transport.
The strategy was developed in my master's thesis, "Development of an Image-Based Gripping Strategy for Cooperative Object Transport using Mobile Robots", at match. As a preliminary project toward cooperative object transport with several mobile robots, the thesis focuses on transporting an aluminium profile through the cooperation of one mobile robot and a passive flatbed trolley. The proposed strategy enables the robot to autonomously transport an object whose weight exceeds the maximum payload of the robot arm. For a demonstration, please see: https://youtu.be/GOlom8Huy4w.
The strategy guides a mobile robot through five steps of object transport:
- detect the object to be transported
- obtain its depth information
- estimate the 6D pose of the object
- plan a procedure to carry the object
- execute the procedure
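The five steps above form a sequential pipeline in which each step consumes the previous step's result. A minimal sketch of that control flow (pure Python; the step handlers here are placeholders, not the repo's actual API):

```python
# Illustrative sketch of the five-step strategy. In the repo, each step is
# realized by ROS nodes, services, and action servers rather than plain
# function calls.

STEPS = ("detect", "depth", "pose", "plan", "execute")

def run_pipeline(handlers, image):
    """Run each step in order, feeding each result into the next step."""
    data = image
    for step in STEPS:
        data = handlers[step](data)
    return data

# Dummy handlers that simply record the call order and pass data through:
calls = []
handlers = {s: (lambda name: lambda d: calls.append(name) or d)(s)
            for s in STEPS}
result = run_pipeline(handlers, "rgb_frame")
```

The point of the sketch is only the ordering: detection must precede depth lookup, which must precede pose estimation, planning, and execution.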
The MuR205 Robot is composed of a camera, a robot arm, a gripper, a mobile platform, and a computer. It combines the capabilities of a depth camera (Intel RealSense Depth Camera D435), a 6-axis lightweight robot arm (UR5), and a gripper (OnRobot RG2) as well as a mobile platform (MiR200).
Components of the robot communicate with each other via ROS; the programs (/scripts) are written in Python. Through ROS, the components share their states. In the following node diagram, the groups represent the camera, the robot arm, and the platform, respectively. The camera provides the images, and the position of the profile is obtained by image processing. The components can then detect the position of the aluminium profile and approach, grip, or deposit it accordingly. Because the PC on the robot shares its ROS master with a laptop (accessed via SSH), the operator can remotely run programs and control the robot from the laptop, and can also monitor the detected profile and the gripping process there.
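The topic-based data flow described above (the camera publishes images, a detector node derives and publishes the profile position, and other components subscribe to it) can be mimicked with a tiny publish/subscribe sketch. This is plain Python standing in for ROS topics, not rospy code, and the topic names and message fields are illustrative:

```python
class Topic:
    """Minimal stand-in for a ROS topic: every subscriber gets each message."""
    def __init__(self, name):
        self.name = name
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, msg):
        for cb in self.subscribers:
            cb(msg)

# Wiring analogous to the node diagram: camera -> detector -> consumers.
image_topic = Topic("/camera/color/image_raw")       # hypothetical names
position_topic = Topic("/profile_position")

def detector(image):
    # Pretend image processing yields the profile position from the frame.
    position_topic.publish({"x": 1.0, "y": 0.5, "source": image})

received = []
image_topic.subscribe(detector)
position_topic.subscribe(received.append)
image_topic.publish("frame_0")
```

In the real system, ROS provides this decoupling: the arm, gripper, and platform nodes only need to subscribe to the position topic, not to the camera directly.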
Before running the programs, the hardware must be configured.
- Update URCap from 1.0.1 to 1.10.1 (do this step if a "missing URCap" error comes up):
robot setting -- URCaps -- "+" -- open OnRobot/RG2-1.10.1_0/OnRobot-1.10.1.urcap -- restart. Then load the installation file (murc.installation) in the teach pendant of the UR5 and restart the UR5.
- Adjust the TCP coordinate system:
installation -- RG configuration -- rotate the gripper twice in the GUI, so that RZ becomes 0.
- Load the placing program:
program -- load program -- open /programms/Cen/Placing.urp
- Run the MiR driver on the PC:
```shell
roslaunch mir_driver mir.launch
```
- Run MiR navigation (Rviz) on the laptop:
```shell
roslaunch mir_navigation mir_start.launch
```
- Run the object detection program on the PC, including the camera driver and the UR driver as well as several servers for object detection, transformation between coordinate frames, and component control:
```shell
roslaunch murc_robot object_detector.py
```
If necessary, run the drivers separately:
```shell
roslaunch realsense2_camera rs_aligned_depth848x480.launch   # camera driver
roslaunch ur_modern_driver ur5_bringup.launch robot_ip:=192.168.12.90   # ur driver
```
- Visualize the camera view and the object detection result on the laptop:
```shell
rosrun murc_robot img_displayer.py
```
- Monitor the current gripping process in the state machine on the laptop:
```shell
rosrun smach_viewer smach_viewer.py
```
- Launch the gripping state machine:
```shell
rosrun murc_robot imagebased_grasping_smach4.py
```
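imagebased_grasping_smach4.py is built on SMACH, which needs a running ROS environment. The core idea of such a state machine, i.e. states that return outcomes and a transition table that maps each (state, outcome) pair to the next state, can be sketched in plain Python; the state names and transitions below are illustrative, not the repo's actual states:

```python
class StateMachine:
    """Minimal SMACH-style machine: each state returns an outcome string,
    and the transition table maps (state, outcome) to the next state."""
    def __init__(self, transitions, start, terminal):
        self.transitions = transitions
        self.start = start
        self.terminal = terminal

    def execute(self, states):
        history, current = [], self.start
        while current not in self.terminal:
            history.append(current)
            outcome = states[current]()          # run the state's action
            current = self.transitions[(current, outcome)]
        history.append(current)
        return history

# Illustrative gripping flow: approach -> grasp -> place -> done,
# with a retry loop on a failed grasp.
transitions = {
    ("APPROACH", "succeeded"): "GRASP",
    ("GRASP", "succeeded"): "PLACE",
    ("GRASP", "failed"): "APPROACH",
    ("PLACE", "succeeded"): "DONE",
}
states = {
    "APPROACH": lambda: "succeeded",
    "GRASP": lambda: "succeeded",
    "PLACE": lambda: "succeeded",
}
sm = StateMachine(transitions, start="APPROACH", terminal={"DONE"})
```

In the real node, each state typically wraps an action-server call (move_base, the arm, the gripper), and smach_viewer visualizes exactly this state/outcome graph.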
The strategy relies on the following ROS packages:

| Package | Description |
|---|---|
| mir_driver | Provides most of the important topics and functionalities for both odometry and SLAM navigation, including the move_base action server, which can be used to execute goal requests. |
| realsense2_camera | Camera driver that streams all camera sensors and publishes on the appropriate ROS topics. |
| ur_modern_driver | Updated version of ur_driver, needed for communication with the robot controller using URScript messages. The arm can be used in different modes: movel, movej, speedl, speedj. |
The repository provides the following nodes and modules:

| Node | Description |
|---|---|
| approach2object | Robot approaches the first end of the profile. |
| functions_module | Provides functions for image processing, including the transformation from uv coordinates to 3D coordinates in the camera coordinate system. |
| grasp_object | Robot arm grasps the object. |
| imagebased_grasping_smach4 | State machine that runs the whole gripping strategy. |
| img_displayer | Displays the result of object detection in the original image. |
| IO_test | Sets and reads the state of IO ports; can also start/stop the .urp programs indirectly by setting or resetting a port. |
| movealongobject | Robot moves along the profile to its second end; the action is executed via move_base (action server). |
| movealongobject_cmd | Robot moves along the profile to its second end; the action is executed via cmd_vel (ROS topic). |
| obrobot_rg2_gripper | Class for the RG2 gripper. |
| place_skateboard | Places the skateboard (the passive trolley) near the first end of the profile. |
| position_determination_server | Service for calculating the position of the object. |
| position_publisher | Invokes the service to calculate the position of the object and publishes the results. |
| profile_detection_v1_08 | Object detection algorithm and estimation of the object's 6D pose. |
| rg2_gripper_switcher | Opens/closes the gripper. |
| RG2Gripper_action_server | Action server of the gripper. |
| set_goal | Calculates goals for the mobile platform, robot arm, and gripper. |
| transformation_arm | Adds the transformation matrix camera_T_urtcp to the transformation chain. |
| transformation_handeye | Publishes the transformation matrix camera_T_urtcp, the result of hand-eye calibration. |
| ur_move_action_server | Action server of the robot arm. |
| ur_pose_maker | Changes the pose of the UR in joint space using movej. |
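The uv-to-3D transformation provided by functions_module, and the role of a transform such as camera_T_urtcp in the transformation chain, can be illustrated with the standard pinhole back-projection followed by a homogeneous transform. The intrinsics and the transform below are made-up values for illustration, not the calibration results from the repo:

```python
def backproject(u, v, depth, fx, fy, cx, cy):
    """Pinhole back-projection: pixel (u, v) with depth (m) -> camera frame."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

def transform(T, p):
    """Apply a 4x4 homogeneous transform (row-major nested lists) to a 3D point."""
    x, y, z = p
    return tuple(T[i][0] * x + T[i][1] * y + T[i][2] * z + T[i][3]
                 for i in range(3))

# Made-up intrinsics, roughly in the range of a D435 color stream at 848x480:
fx = fy = 600.0
cx, cy = 424.0, 240.0

# A pixel 300 px right of the principal point at 1 m depth:
p_cam = backproject(u=724.0, v=240.0, depth=1.0, fx=fx, fy=fy, cx=cx, cy=cy)
# (724 - 424) * 1.0 / 600 = 0.5, so p_cam == (0.5, 0.0, 1.0)

# Illustrative base_T_camera: camera mounted 1 m above the base, no rotation.
base_T_camera = [[1, 0, 0, 0.0],
                 [0, 1, 0, 0.0],
                 [0, 0, 1, 1.0],
                 [0, 0, 0, 1.0]]
p_base = transform(base_T_camera, p_cam)   # (0.5, 0.0, 2.0)
```

In the real system the chain is longer (e.g. base, platform, arm TCP, camera), and hand-eye calibration supplies the camera_T_urtcp link that ties the camera frame to the arm.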

