# Object grasping pipeline

Object grasping pipeline based on visual servoing - the `kinova_vs_manipulation` package.
For MoveIt! visualization, run the following (your local terminal must have the correct `ROS_HOSTNAME` and `ROS_IP`, set by running `export_mbot7`):

```sh
moveit_kinova_rviz
```
Launch the pipeline. This launches MoveIt!, the object localizer, and makes a service available to trigger an object grasp:

```sh
roslaunch kinova_vs_manipulation object-grasp.launch
```
Grasp an object (using a service call):

```sh
rosservice call /kinova_vs_manipulation/object_grasp_sm "object_class: 'orange'
reposition: false
target_position:
  x: 0.0
  y: 0.0
  theta: 0.0"
```
Grasp an object (using `mbot_class`):

```sh
mbot_class
# Grasp, lift, and lower the object:
> mbot.kinova_manipulation.grasp_object('orange')
# Grasp and move the object to a target position relative to the table origin:
> mbot.kinova_manipulation.grasp_object('cup', target_table_position=(0.3, 0.2))
```
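The `target_table_position` argument above is a 2-D offset expressed in the table's frame. A minimal sketch of how such a table-relative target could be mapped into the robot's base frame (the `table_to_base` helper and the example pose values are hypothetical, not part of the package):

```python
import math

def table_to_base(target_xy, table_origin_xy, table_yaw):
    """Convert a 2-D target given in the table frame into the base frame.

    target_xy:        (x, y) offset relative to the table origin [m]
    table_origin_xy:  table origin expressed in the base frame [m]
    table_yaw:        rotation of the table frame w.r.t. the base frame [rad]
    """
    c, s = math.cos(table_yaw), math.sin(table_yaw)
    tx, ty = target_xy
    # 2-D rotation into the base frame, then translation by the table origin
    return (table_origin_xy[0] + c * tx - s * ty,
            table_origin_xy[1] + s * tx + c * ty)

# Example: table origin 1 m in front of the base, table rotated 90 degrees
x, y = table_to_base((0.3, 0.2), (1.0, 0.0), math.pi / 2)
```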
To grasp using smach, the `kinova_manipulation_states` module in the `mbot_states` package has useful smach states. For an example of their use, check the `object_grasp_sm_service.py` node in the `kinova_vs_manipulation` package. For usage in a competition benchmark, check the `fbm6` package in `mbot_competitions`.
The main ROS parameters are in the `kinova_vs_manipulation/launch/object-grasp.launch` file.
The code for the ROS nodes is documented with top-of-file comments.
Main nodes:
- `pregrasp_service` advertises a service that selects a pregrasp pose relative to the target object and moves the arm to it, keeping the object visible to the camera
- `visual_servoing` advertises a service that starts moving the arm toward the object based on camera-frame error
- `joint_control` reads the end-effector velocity topic published by `visual_servoing`, computes the corresponding joint velocities using the inverse Jacobian, and publishes them to the arm
- `object_grasp_sm_service` advertises a service that runs the complete grasping pipeline, detecting the object and calling the services above
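The inverse-Jacobian step performed by `joint_control` can be sketched as follows. A common, numerically robust choice is the damped least-squares pseudo-inverse, which keeps joint velocities bounded near singular configurations; the 2-link planar Jacobian below is only a stand-in for the real arm's Jacobian, and the function name is illustrative, not the node's actual API:

```python
import numpy as np

def damped_pinv_velocities(J, twist, damping=0.01):
    """Map a commanded end-effector velocity to joint velocities.

    Uses the damped least-squares inverse J^T (J J^T + lambda^2 I)^-1,
    which stays well-behaved when J loses rank.
    """
    JJt = J @ J.T
    lam2 = damping ** 2
    return J.T @ np.linalg.solve(JJt + lam2 * np.eye(JJt.shape[0]), twist)

# Stand-in Jacobian: 2-link planar arm, link lengths 1 m, at q = (0, pi/2)
q1, q2, l1, l2 = 0.0, np.pi / 2, 1.0, 1.0
J = np.array([
    [-l1 * np.sin(q1) - l2 * np.sin(q1 + q2), -l2 * np.sin(q1 + q2)],
    [ l1 * np.cos(q1) + l2 * np.cos(q1 + q2),  l2 * np.cos(q1 + q2)],
])
qdot = damped_pinv_velocities(J, np.array([0.1, 0.0]))
# Sanity check: J @ qdot should approximately reproduce the commanded twist
```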