Complete ROS2 Humble Robotic Arm Control System
Features • Quick Start • Documentation • Contributing • License
An intelligent robotic arm system based on ROS2 Humble, implementing a complete control stack for the SO-101 arm (LeRobot SO-101) and the follower arm from the LeKiwi project (LeRobot LeKiwi). With an ultra-low hardware cost of approximately $110 per arm, the system integrates computer vision, motion planning, and voice control, and supports distributed deployment with a Raspberry Pi for hardware control and a PC for planning and perception. That makes it well suited to learning ROS2, basic robotic arm control, and educational demonstrations. The arm can be driven through MoveIt2 motion planning, controlled programmatically from Python, or operated autonomously with a camera via computer vision (some parameters may need tuning for your lighting conditions and hardware).
- 6-DOF Robotic Arm - Full motion control using Feetech ST3215 servos
- Gripper Control - Precision grasping with rotation support
- USB Camera - Computer vision for object detection and tracking
- Distributed Deployment - Raspberry Pi + PC architecture
- MoveIt2 Integration - Professional motion planning framework
- Collision Avoidance - Safe trajectory generation
- Inverse Kinematics - Automatic joint angle calculation
- Custom Move Groups - Flexible control configurations
- YOLOv8 Detection - Real-time object detection
- Cube Detection Mode - Optimized for cube grasping tasks
- Dual-View Triangulation - Enhanced 3D position estimation
- Depth Integration - Precise spatial localization
- Natural Language Commands - Vosk-based speech recognition
- Chinese/English Support - Multi-language interface
- Custom Command Mapping - Configurable voice actions
- Custom RViz Plugin - Interactive control panel
- Real-time Monitoring - System status display
- Debug Tools - Comprehensive logging and diagnostics
┌─────────────────────────────────┐ ┌──────────────────────────────────┐
│ Raspberry Pi (Robot Side) │ │ PC (Control Side) │
│ ─────────────────────────────── │ │ ──────────────────────────────── │
│ • arm_driver_node │ │ • move_group (MoveIt2) │
│ • robot_state_publisher │ <──DDS──> │ • arm_planning_py_node │
│ • usb_cam │ Network │ • yolo_perception_node │
│ • joint_state_publisher │ │ • rviz2 │
│ │ │ • arm_voice_node (optional) │
└─────────────────────────────────┘ └──────────────────────────────────┘
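The two sides communicate over DDS, so both machines must share the same ROS_DOMAIN_ID and sit on a multicast-friendly LAN. A minimal discovery probe, as a sketch (the probe node name is arbitrary, not part of this project):

```python
# Minimal DDS discovery probe: run on either machine; nodes launched on
# the other machine should appear in the printed list once discovery
# completes. Both sides must export the same ROS_DOMAIN_ID.
import os
import rclpy
from rclpy.node import Node

rclpy.init()
node = Node('network_probe')  # arbitrary probe node name
print('ROS_DOMAIN_ID =', os.environ.get('ROS_DOMAIN_ID', '0 (default)'))
rclpy.spin_once(node, timeout_sec=2.0)  # give discovery a moment
print('Visible nodes:', node.get_node_names())
node.destroy_node()
rclpy.shutdown()
```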
Package Structure:
├── arm_bringup # System integration & launch files
├── arm_description # URDF robot models & kinematics
├── arm_driver_node # Feetech servo hardware driver (C++)
├── arm_interfaces # Custom ROS2 messages & services
├── arm_moveit_config # MoveIt2 motion planning config
├── arm_perception_yolo # YOLOv8 vision perception system
├── arm_planning_py # Motion planning & grasping logic (Python)
├── arm_rviz_plugin # Custom RViz control panel (C++)
└── arm_voice_interface # Voice command interface (Python)
Minimum Configuration:
- 6-DOF robotic arm with Feetech ST3215 servos
- USB camera (standard webcam, 640×480 or higher resolution)
- Ubuntu 22.04 LTS
- 8GB RAM, 4-core CPU
Recommended Configuration (Distributed):
- Raspberry Pi 4 (4GB+) - Robot side hardware control
- PC with Ubuntu 22.04 - Planning and perception
- Same LAN network connection
Software Stack:
- ROS2 Humble - Robot Operating System 2
- Python 3.10+ - Main programming language
- MoveIt2 - Motion planning framework
- PyTorch - Deep learning (YOLO perception)
- OpenCV - Computer vision
- Ultralytics - YOLOv8 implementation
Installation:
# Create ROS2 workspace
mkdir -p ~/lododo-arm/src
cd ~/lododo-arm/src
# Clone repository (note the . at the end)
git clone https://github.com/harryzy/lododo-arm.git .
cd ~/lododo-arm
# Install ROS2 packages
sudo apt update
sudo apt install -y \
ros-humble-moveit \
ros-humble-usb-cam \
python3-pip
sudo apt install python3-colcon-common-extensions -y
# Install Python dependencies
pip3 install pyserial numpy scipy ultralytics opencv-python
cd ~/lododo-arm
# Source ROS2 Humble first
source /opt/ros/humble/setup.bash
# Build the workspace
colcon build
# Source the workspace overlay
source install/setup.bash
Single Machine (All-in-One):
# Launch complete system
ros2 launch arm_bringup real_bringup.launch.py
# Terminal 2 (YOLO perception)
ros2 run arm_bringup start_yolo_cube_detect.sh
Distributed Deployment (Recommended):
# On Raspberry Pi (robot side)
ros2 launch arm_bringup robot_side.launch.py
# On PC (control side) - Terminal 1
ros2 launch arm_bringup pc_side.launch.py
# On PC - Terminal 2 (YOLO perception)
ros2 run arm_bringup start_yolo_cube_detect.sh
Simulation:
ros2 launch arm_bringup sim_bringup.launch.py
For Cube Detection Mode:
ros2 run arm_bringup start_yolo_cube_detect.sh
The custom RViz plugin provides buttons for:
- Scan Front - Scan objects from front view
- Scan All - Multi-view scanning (15° intervals)
- Grasp Lift - Pick up detected object
- Deliver to Pose - Place object at target location
With voice control enabled, the following phrases are recognized (a mapping sketch follows the list):
- "扫描前面" (Scan front)
- "扫描" (Scan all)
- "抓起" (Grasp and lift)
- "递送" (Deliver to pose)
The same actions can be triggered by publishing directly on ROS2 topics:
# Publish detection command
ros2 topic pub /detection_command arm_interfaces/msg/DetectionCommand "{command: 'scan_front'}"
# Publish grasp command
ros2 topic pub /grasp_command arm_interfaces/msg/GraspCommand "{...}"
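Detection commands can also be sent programmatically from Python with rclpy. A minimal sketch, assuming DetectionCommand carries a single string field `command` as the CLI call above suggests:

```python
# Minimal rclpy publisher sketch for the detection command topic.
import rclpy
from rclpy.node import Node
from arm_interfaces.msg import DetectionCommand

def main():
    rclpy.init()
    node = Node('detection_commander')
    pub = node.create_publisher(DetectionCommand, '/detection_command', 10)
    msg = DetectionCommand()
    msg.command = 'scan_front'  # assumed field, mirroring the CLI example
    pub.publish(msg)
    node.get_logger().info('Sent: scan_front')
    rclpy.spin_once(node, timeout_sec=0.5)  # let the message flush
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```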
The system uses a novel rotational stereo vision approach (sketched below):
- Camera mounted on the robotic arm
- Rotating joint1 creates the stereo baseline (typically 15°)
- Triangulation recovers object depth and dimensions
- Depth calibration factor: 0.584
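The geometry reduces to intersecting two viewing rays whose camera poses differ by the joint1 rotation. A self-contained sketch using the ray midpoint method; the intrinsics, camera offset, and pixel coordinates below are made-up placeholders, and the view-1 camera frame is assumed aligned with the world frame (real calibration lives in config/measurement_params.yaml):

```python
# Two-view triangulation via the ray midpoint method. All numbers here
# are illustrative placeholders; the real calibration lives in
# config/measurement_params.yaml.
import numpy as np

def backproject(K, pixel):
    """Unit viewing ray in the camera frame for pixel (u, v)."""
    u, v = pixel
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    return ray / np.linalg.norm(ray)

def triangulate_midpoint(o1, d1, o2, d2):
    """Closest point between two rays (origin o, unit direction d)."""
    A = np.stack([d1, -d2], axis=1)              # solve s*d1 - t*d2 = o2 - o1
    s, t = np.linalg.lstsq(A, o2 - o1, rcond=None)[0]
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))

K = np.array([[600.0, 0, 320], [0, 600.0, 240], [0, 0, 1]])  # assumed intrinsics
theta = np.deg2rad(15.0)                         # joint1 rotation between views
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0,              0,             1]])
o1 = np.array([0.10, 0.0, 0.0])                  # assumed camera offset from joint1 axis (m)
o2 = Rz @ o1                                     # camera position after rotation
d1 = backproject(K, (350, 240))                  # object pixel, view 1 (camera ~ world)
d2 = Rz @ backproject(K, (290, 240))             # same point, view 2, rotated into world
point = triangulate_midpoint(o1, d1, o2, d2)
# The package additionally applies a depth calibration factor (0.584).
print('estimated 3D point:', point)
```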
A specialized mode detects 5 cm cubes using the following filters (see the sketch after this list):
- Edge length filtering (3-8cm)
- Position validation (within workspace)
- Shape scoring (bbox aspect ratio + 3D similarity)
- Automatic best candidate selection
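A toy version of that filtering-and-scoring pipeline, with thresholds taken from the list above; the data fields, workspace radius, and score weights are illustrative assumptions:

```python
# Toy cube-candidate filter/scorer mirroring the rules above. Field
# names, workspace radius, and score weights are assumptions.
from dataclasses import dataclass

@dataclass
class Candidate:
    edge_m: float        # estimated cube edge length (m)
    position: tuple      # (x, y, z) in the arm base frame (m)
    aspect_ratio: float  # bounding-box width / height

def score(c, workspace_radius=0.40, target_edge=0.05):
    if not 0.03 <= c.edge_m <= 0.08:                        # edge filter: 3-8 cm
        return None
    x, y, z = c.position
    if (x * x + y * y) ** 0.5 > workspace_radius or z < 0:  # workspace check
        return None
    bbox_score = 1.0 - min(abs(c.aspect_ratio - 1.0), 1.0)  # square-ish bbox
    size_score = 1.0 - min(abs(c.edge_m - target_edge) / target_edge, 1.0)
    return 0.5 * bbox_score + 0.5 * size_score

def best_candidate(candidates):
    scored = [(s, c) for c in candidates if (s := score(c)) is not None]
    return max(scored, key=lambda sc: sc[0])[1] if scored else None
```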
Configuration: config/measurement_params.yaml
Key configuration files:
- config/measurement_params.yaml - Vision and triangulation parameters
- config/perception_params.yaml - YOLO detection settings
- config/ompl_planning.yaml - MoveIt2 planning parameters
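At runtime these are ordinary YAML files. A minimal loading sketch; the key names shown are assumptions, not the package's actual schema:

```python
# Load vision/triangulation parameters. Key names are illustrative.
import yaml

with open('config/measurement_params.yaml') as f:
    params = yaml.safe_load(f)

baseline_deg = params.get('baseline_angle_deg', 15.0)         # assumed key
depth_factor = params.get('depth_calibration_factor', 0.584)  # assumed key
```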
We welcome contributions! Please read our Contributing Guidelines before submitting PRs.
- Fork the repository
- Create your feature branch (git checkout -b feature/AmazingFeature)
- Commit your changes (git commit -m 'Add some AmazingFeature')
- Push to the branch (git push origin feature/AmazingFeature)
- Open a Pull Request
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Copyright 2024 lododo
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
Main Developer:
- lododo - Initial work and maintenance
- GitHub: @harryzy
- Email: contect@lododo.org
See also the list of contributors who participated in this project.
- LeRobot - SO-101 and LeKiwi robotic arm hardware designs from Hugging Face
- SO-101 - Low-cost 6-DOF robotic arm design
- LeKiwi - Follower arm hardware platform
- ROS2 & MoveIt2 - Robot Operating System and motion planning
- Ultralytics - YOLOv8 object detection framework
- usb_cam - USB camera ROS2 driver for computer vision
- Vosk - Offline speech recognition
- Feetech - Servo motor SDK
- Open Robotics - ROS2 ecosystem and tools
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Email: contect@lododo.org
Active Development | Last Updated: November 2024
Current Version: v0.972
- ✅ Basic motion control
- ✅ Object detection and tracking
- ✅ Stereo vision triangulation
- ✅ Grasp planning and execution
- ✅ Voice control interface
- ✅ Cube detection mode
- 🚧 Multi-object manipulation
- 🚧 Deep learning grasp pose estimation
- 🚧 Adaptive gripper control
See Issues for a list of known issues and feature requests.
If you encounter any problems or have questions, please:
- Check the documentation
- Search existing issues
- Create a new issue with detailed information
⭐ If you find this project helpful, please consider giving it a star!