
Commit 8c0f6c7 (1 parent: 7e557aa)

Samples: Automatic updates to public repository

Remember to do the following:

1. Ensure that modified/deleted/new files are correct
2. Make this commit message relevant for the changes
3. Force push
4. Delete branch after PR is merged

If this commit is an update from one SDK version to another, make sure to create a release tag for the previous version.

File tree: 1 file changed, +199 −57 lines

  • source/applications/advanced/hand_eye_calibration
# Hand-Eye Calibration

This page provides an overview of how to **perform**, **verify**, and **use Hand–Eye Calibration** with Zivid cameras.

If you are new to Hand–Eye Calibration, start with [Hand–Eye Calibration – Concept & Theory][HandEyeTutorial-url], which explains:

- What Hand–Eye Calibration is
- The difference between **eye-in-hand** and **eye-to-hand**
- Best practices for acquiring the dataset (point clouds and robot poses)

If you already know what you are doing and just want to run calibration or browse our Hand-Eye Calibration code, continue reading.

<!-- Use the "Markdown All in One" plugin in VS Code to automatically generate and update the TOC. -->

- [Quick Start: Just Calibrate](#quick-start-just-calibrate)
- [Programmatic Hand–Eye Calibration](#programmatic-handeye-calibration)
- [Dataset Acquisition Samples](#dataset-acquisition-samples)
- [After Hand–Eye Calibration](#after-handeye-calibration)
- [Verifying Calibration Accuracy](#verifying-calibration-accuracy)
- [Summary: Which Tool Should I Use?](#summary-which-tool-should-i-use)

---

## Quick Start: Just Calibrate

If your goal is **only to compute the Hand–Eye Transformation Matrix**, use one of the tools below and follow Zivid’s [best-practice guide for capture poses][ZividHandEyeCalibration-url].

### Hand–Eye Calibration GUI (Recommended)

- Tutorial: [Hand–Eye GUI Tutorial][HandEyeCalibrationGUITutorial-url]
- Application: [HandEyeCalibration GUI][HandEyeCalibrationGUI-url]

Best choice if you:

- Want a guided, no-code workflow

---

## Programmatic Hand–Eye Calibration

The following applications produce a Hand–Eye Transformation Matrix from robot poses and calibration captures.

### Minimal Hand-Eye Calibration Code Example

- Sample: [HandEyeCalibration][HandEyeCalibration-url]
- Tutorial: [Integrating Zivid Hand-Eye Calibration][hand-eye-procedure-url]

Workflow:

1. User inputs the robot pose as a 4x4 transformation matrix (manual entry)
2. Camera captures the calibration object
3. User moves the robot to a new capture pose and enters the command to add another pose
4. Steps 1–3 are repeated (typically 10–20 pose pairs)
5. User enters the command to perform calibration, and the application returns a Hand-Eye Transformation Matrix
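Step 1 expects the robot pose as a 4x4 homogeneous transformation matrix. As a rough NumPy-only sketch (not part of the Zivid sample; the helper name and the Z-Y-X Euler convention are assumptions, so check your robot controller's convention), such a matrix could be assembled from a translation and roll-pitch-yaw angles like this:

```python
import numpy as np

def pose_matrix(translation_mm, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from a translation and Z-Y-X Euler angles (radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # Rotation = Rz(yaw) @ Ry(pitch) @ Rx(roll)
    rotation = np.array([
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp, cp * sr, cp * cr],
    ])
    transform = np.eye(4)
    transform[:3, :3] = rotation
    transform[:3, 3] = translation_mm
    return transform

# Example pose: tool at (400, -50, 600) mm, rotated 90° about the Y axis
robot_pose = pose_matrix([400.0, -50.0, 600.0], roll=0.0, pitch=np.pi / 2, yaw=0.0)
```

Zivid operates in millimeters, so keep the translation in mm when entering poses.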
Use this if you:

- Want the simplest integration example
- Are building your own calibration pipeline

---
### Hand-Eye Calibration CLI Tool

- Tutorial: [Zivid CLI Tool for Hand–Eye Calibration][CLI application-url]
- Installed with:
  - Windows Zivid installer
  - `tools` deb package on Ubuntu

Use this if you:

- Already have a dataset (robot poses and point clouds)
- Want a command-line, batch-style workflow

---
## Dataset Acquisition Samples

The samples below show how to acquire robot poses and point clouds, then compute the Hand–Eye Transformation Matrix.

### RoboDK-Based (Robot-Agnostic)

- Sample: [RoboDKHandEyeCalibration][RobodkHandEyeCalibration-url]
- Tutorial: [Any Robot + RoboDK + Python Hand–Eye Tutorial][RoboDKHandEyeTutorial-url]
- Supported robots: [RoboDK robot library][robodk-robot-library-url]

Features:

- Works with any RoboDK-supported robot
- Capture poses are manually defined in the `.rdk` file
- Fully automated robot control

---
### Universal Robots (e.g. UR5e)

- Sample: [UniversalRobotsPerformHandEyeCalibration][URhandeyecalibration-url]
- Tutorial: [UR5e + Python Hand–Eye Tutorial][URHandEyeTutorial-url]

Features:

- Designed specifically for UR robots
- Fully automated robot control

---
## After Hand–Eye Calibration

The following applications assume that a **Hand–Eye Transformation Matrix already exists**.

### Utilize Hand-Eye Calibration

- Sample: [UtilizeHandEyeCalibration][UtilizeHandEyeCalibration-url]
- Tutorial: [How To Use The Result Of Hand-Eye Calibration][UtilizeHandEyeCalibrationTutorial-url]

Demonstrates how to:

- Transform poses from camera coordinates to robot coordinates
- Use the transform in real applications (e.g., bin picking)

Example workflow:

1. Capture a point cloud with a Zivid camera
2. Find an object pick pose in the camera coordinate system
3. Transform the pose into the robot coordinate system
4. Plan and execute the robot motion
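The transform step above is a single matrix product. As a hedged, NumPy-only sketch (the matrices below are made-up illustration values, not the sample's API), an eye-to-hand setup could map a pick pose from camera to robot base coordinates like this:

```python
import numpy as np

# Hand-eye result for an eye-to-hand setup: camera pose in the robot base frame.
# The numbers are invented for illustration only.
base_T_camera = np.array([
    [0.0, -1.0, 0.0, 500.0],
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 800.0],
    [0.0, 0.0, 0.0, 1.0],
])

# Pick pose detected in the camera coordinate system (4x4 homogeneous matrix, mm).
camera_T_pick = np.eye(4)
camera_T_pick[:3, 3] = [100.0, 50.0, 700.0]

# Chain the transforms: the pick pose expressed in the robot base frame.
base_T_pick = base_T_camera @ camera_T_pick

print(base_T_pick[:3, 3])  # translation of the pick pose in base coordinates
```

For eye-in-hand, the chain gains one more factor: `base_T_pick = base_T_ee @ ee_T_camera @ camera_T_pick`, where `base_T_ee` is the robot pose at capture time.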
---

### Pose Conversions

- Sample: [PoseConversions][PoseConversions-url]
- Application: [PoseConversions GUI][PoseConversionsGUI-url]
- Theory: [Conversions Between Common Orientation Representations][PoseConversionsTheory-url]

Zivid primarily operates with a 4x4 **Transformation Matrix** (Rotation Matrix + Translation Vector). This example shows how to convert to and from:

- Axis–Angle
- Rotation Vector
- Roll–Pitch–Yaw
- Quaternion

Useful for integrating with robot controllers.
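As a hedged illustration of these conversions (using SciPy's `Rotation` class rather than the sample's own code, and assuming an extrinsic x-y-z Euler convention for roll-pitch-yaw):

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Rotation part of a 4x4 transformation matrix: 90° about the Z axis.
rotation_matrix = np.array([
    [0.0, -1.0, 0.0],
    [1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0],
])
rotation = Rotation.from_matrix(rotation_matrix)

rotation_vector = rotation.as_rotvec()     # axis * angle, radians
roll_pitch_yaw = rotation.as_euler("xyz")  # extrinsic roll-pitch-yaw, radians
quaternion = rotation.as_quat()            # scalar-last order: (x, y, z, w)

# Round-trip: any representation converts back to the same matrix.
restored = Rotation.from_quat(quaternion).as_matrix()
```

Note the conventions: SciPy uses scalar-last quaternions, while some robot SDKs use scalar-first, so always check before wiring a controller to the calibration output.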
---

## Verifying Calibration Accuracy

### Verify Hand-Eye With Visualization

- Sample: [VerifyHandEyeWithVisualization][VerifyHandEyeWithVisualization-url]

Validation approach:

- Loads the hand-eye dataset and the calibration output (transformation matrix)
- For each dataset pair:
  - Transforms the point cloud into a common coordinate system
  - Finds the cartesian coordinates of the checkerboard centroid
  - Removes the points outside the checkerboard ROI
- Overlaps the transformed point clouds
- Visualizes alignment accuracy
Best for:

- Visual verification
- Detecting systematic rotation/translation errors
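The principle behind the overlap check can be reduced to a few lines. As a NumPy-only sketch with synthetic data (not the sample's code): with a perfect calibration, the checkerboard centroid from every pose pair lands on the same point in the common frame, so the spread of the transformed centroids indicates the residual error:

```python
import numpy as np

# Synthetic check: a fixed point in the robot base frame, observed from several camera poses.
true_centroid = np.array([250.0, 0.0, 100.0, 1.0])

def rot_z(angle):
    """4x4 homogeneous rotation about the Z axis."""
    c, s = np.cos(angle), np.sin(angle)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

transformed = []
for angle in (0.0, 0.4, -0.7):
    base_T_camera = rot_z(angle)
    base_T_camera[:3, 3] = [50.0, -20.0, 300.0]
    camera_centroid = np.linalg.inv(base_T_camera) @ true_centroid  # what this camera pose sees
    transformed.append((base_T_camera @ camera_centroid)[:3])       # map back to the base frame

# Per-axis spread of the transformed centroids; ~0 mm for a perfect calibration.
spread = np.ptp(np.array(transformed), axis=0)
```

With a real (imperfect) hand-eye matrix, `spread` grows with the calibration error, which is exactly what the overlapped point clouds make visible.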
---

### RoboDK Touch Test Verification

- Script: [RobodkHandEyeVerification][RobodkHandEyeVerification-url]
- Tutorial: [Verify Hand-Eye Calibration Result Via Touch Test][RobodkHandEyeVerificationTutorial-url]

Verification steps:

1. Robot moves to a predefined capture pose
2. User places the calibration object in the FOV
3. Camera estimates a touch point
4. Robot physically touches the calibration object
5. User repeats the test at multiple locations
Best for:

- Physical validation
- Applications with high-accuracy requirements

---
## Summary: Which Tool Should I Use?

| Goal | Recommended Tool |
|------|------------------|
| Conceptual understanding | [Knowledge Base article][HandEyeTutorial-url] |
| Guided calibration | [Hand–Eye GUI][HandEyeCalibrationGUITutorial-url] |
| Minimal integration example | [HandEyeCalibration][HandEyeCalibration-url] |
| Existing dataset | [Hand–Eye GUI][HandEyeCalibrationGUITutorial-url] |
| UR robots | [Hand–Eye GUI][HandEyeCalibrationGUITutorial-url] or [UR Hand–Eye sample][URHandEyeTutorial-url] |
| Any robot | [Hand–Eye GUI][HandEyeCalibrationGUITutorial-url] or [RoboDK Hand–Eye sample][RoboDKHandEyeTutorial-url] |
| Use calibration result | [UtilizeHandEyeCalibration][UtilizeHandEyeCalibrationTutorial-url] |
| Verify visually | [Hand–Eye GUI][HandEyeCalibrationGUITutorial-url] or [VerifyHandEyeWithVisualization][VerifyHandEyeWithVisualization-url] |
| Verify physically | [Hand–Eye GUI][HandEyeCalibrationGUITutorial-url] or [RoboDK Touch Test][RobodkHandEyeVerification-url] |
[HandEyeTutorial-url]: https://support.zivid.com/latest/academy/applications/hand-eye.html
[HandEyeCalibration-url]: hand_eye_calibration.py
[HandEyeCalibrationGUI-url]: hand_eye_gui.py
[HandEyeCalibrationGUITutorial-url]: https://support.zivid.com/en/latest/academy/applications/hand-eye/hand-eye-gui.html
[UtilizeHandEyeCalibration-url]: utilize_hand_eye_calibration.py
[UtilizeHandEyeCalibrationTutorial-url]: https://support.zivid.com/en/latest/academy/applications/hand-eye/how-to-use-the-result-of-hand-eye-calibration.html
[VerifyHandEyeWithVisualization-url]: verify_hand_eye_with_visualization.py
[ZividHandEyeCalibration-url]: https://support.zivid.com/latest/academy/applications/hand-eye/hand-eye-calibration-process.html
[hand-eye-procedure-url]: https://support.zivid.com/en/latest/academy/applications/hand-eye/hand-eye-calibration-process.html#custom-integration
[PoseConversions-url]: pose_conversions.py
[PoseConversionsGUI-url]: pose_conversion_gui.py
[PoseConversionsTheory-url]: https://support.zivid.com/en/latest/reference-articles/pose-conversions.html
[CLI application-url]: https://support.zivid.com/latest/academy/applications/hand-eye/zivid_CLI_tool_for_hand_eye_calibration.html
[URhandeyecalibration-url]: ur_hand_eye_calibration/universal_robots_perform_hand_eye_calibration.py
[URHandEyeTutorial-url]: https://support.zivid.com/en/latest/academy/applications/hand-eye/ur5-robot-%2B-python-generate-dataset-and-perform-hand-eye-calibration.html
[RobodkHandEyeCalibration-url]: robodk_hand_eye_calibration/robodk_hand_eye_calibration.py
[RoboDKHandEyeTutorial-url]: https://support.zivid.com/en/latest/academy/applications/hand-eye/robodk-%2B-python-generate-dataset-and-perform-hand-eye-calibration.html
[RobodkHandEyeVerification-url]: robodk_hand_eye_calibration/robodk_verify_hand_eye_calibration.py
[RobodkHandEyeVerificationTutorial-url]: https://support.zivid.com/en/latest/academy/applications/hand-eye/hand-eye-calibration-verification-via-touch-test.html
[robodk-robot-library-url]: https://robodk.com/supported-robots
