A Unity-based UGV (Unmanned Ground Vehicle) simulator with ROS2 integration. Features Ackermann steering, sensor simulation (LiDAR, Camera, IMU, GPS), and deterministic rosbag recording.
| Unity | RViz2 |
|---|---|
| ![]() | ![]() |
- Ackermann Steering Model - Realistic vehicle dynamics with proper Ackermann geometry
- ROS2 Integration - Full bidirectional communication via ROS-TCP-Connector
- Deterministic Simulation - Reproducible sensor data for consistent rosbag recordings
- Sensor Suite
  - IMU (100 Hz) with configurable noise
  - GPS (10 Hz) with configurable origin coordinates
  - LiDAR (10 Hz) - 16-channel, 360° raycast-based point cloud
  - Camera (30 Hz) - RGB image with camera info
  - Wheel encoders and suspension sensors
- TF Broadcasting - Complete transform tree for RViz visualization
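The Ackermann geometry mentioned above can be sketched in a few lines: given a central (bicycle-model) steering command, the inner front wheel must turn tighter than the outer one so both wheel axes meet the rear-axle line at a common turning center. The wheelbase and track-width values below are hypothetical, not the package's actual parameters:

```python
import math

def ackermann_wheel_angles(steering_angle: float, wheelbase: float, track_width: float):
    """Convert a central steering angle into per-wheel front steering angles
    using the Ackermann condition (common turning center on the rear axle)."""
    if abs(steering_angle) < 1e-9:
        return 0.0, 0.0
    # Distance from the rear axle to the turning center, along the rear-axle line.
    turn_radius = wheelbase / math.tan(steering_angle)
    # The inner wheel (smaller lateral distance to the center) turns tighter.
    left = math.atan(wheelbase / (turn_radius - track_width / 2))
    right = math.atan(wheelbase / (turn_radius + track_width / 2))
    return left, right

# Hypothetical dimensions: 1.0 m wheelbase, 0.6 m track width, left turn of 0.3 rad.
left, right = ackermann_wheel_angles(0.3, wheelbase=1.0, track_width=0.6)
```

For a left turn the left (inner) wheel angle comes out larger than the commanded angle and the right (outer) wheel angle smaller, which is the behavior reported per-wheel on `/vehicle/output`.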
- Unity 2022.3 LTS or newer
- Universal Render Pipeline (URP)
- ROS-TCP-Connector 0.7.0-preview or newer
First, install the ROS-TCP-Connector package:
- Open your Unity project
- Go to Window → Package Manager
- Click the + button → Add package from git URL...
- Enter: `https://github.com/Unity-Technologies/ROS-TCP-Connector.git?path=/com.unity.robotics.ros-tcp-connector`
- Click Add
After ROS-TCP-Connector is installed:
- Go to Window → Package Manager
- Click the + button → Add package from git URL...
- Enter: `https://github.com/batuhanozer/ugv-unity-sim.git`
- Click Add
On your ROS2 machine, you need:
- ROS-TCP-Endpoint - TCP bridge for Unity communication
- Custom messages package (included in this repo under `ros2_ws/`)
```bash
# Terminal 1: Start the TCP endpoint
ros2 run ros_tcp_endpoint default_server_endpoint --ros-args -p ROS_IP:=0.0.0.0

# Terminal 2: Launch with rosbag recording (optional)
ros2 launch ugv_assignment_bringup record.launch.py
```

| Topic | Type | Hz | Description |
|---|---|---|---|
| `/clock` | `rosgraph_msgs/Clock` | 100 | Simulation time |
| `/vehicle/output` | `ugv_assignment_msgs/Output` | 50 | Vehicle state |
| `/imu/data` | `sensor_msgs/Imu` | 100 | IMU data |
| `/gps/fix` | `sensor_msgs/NavSatFix` | 10 | GPS coordinates |
| `/lidar/points` | `sensor_msgs/PointCloud2` | 10 | LiDAR point cloud |
| `/camera/image_raw` | `sensor_msgs/Image` | 30 | Camera image |
| `/camera/camera_info` | `sensor_msgs/CameraInfo` | 30 | Camera intrinsics |
| `/joint_states` | `sensor_msgs/JointState` | 50 | Wheel positions |
| `/wheel_suspensions` | `sensor_msgs/JointState` | 50 | Suspension state |
| `/tf` | `tf2_msgs/TFMessage` | 50 | Transform tree |
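The "deterministic simulation" feature hinges on reproducible sensor noise: if every noisy sensor draws from its own seeded RNG stream, replaying the simulation produces identical samples, and therefore identical rosbags. A minimal sketch of the idea, with hypothetical seed and noise parameters (not the simulator's actual implementation):

```python
import random

class SeededImuNoise:
    """Additive Gaussian sensor noise driven by a seeded RNG: the same seed
    yields the same noise sequence on every run, which is what makes
    recorded rosbags reproducible."""
    def __init__(self, seed: int = 42, stddev: float = 0.01):
        self.rng = random.Random(seed)  # private stream, unaffected by global random state
        self.stddev = stddev

    def apply(self, true_value: float) -> float:
        return true_value + self.rng.gauss(0.0, self.stddev)

# Two independent instances with the same seed produce identical readings.
run_a = SeededImuNoise(seed=7)
run_b = SeededImuNoise(seed=7)
samples_a = [run_a.apply(9.81) for _ in range(100)]
samples_b = [run_b.apply(9.81) for _ in range(100)]
```

Using a per-sensor `random.Random` instance (rather than the global RNG) keeps the streams independent, so adding or removing one sensor does not shift the noise seen by the others.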
| Topic | Type | Description |
|---|---|---|
| `/vehicle/input` | `ugv_assignment_msgs/Input` | Vehicle control commands |
`ugv_assignment_msgs/Input`:

```
float64 throttle        # 0.0 to 100.0 (percent)
float64 brake           # 0.0 to 100.0 (percent)
float64 steering_angle  # -0.5 to 0.5 (radians)
```
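Commands outside these ranges should be clamped before publishing. A small helper illustrating the declared limits (`make_input_command` is a hypothetical name, not part of the package; it returns a plain dict where a real node would fill the message fields):

```python
def clamp(value: float, lo: float, hi: float) -> float:
    """Constrain value to the closed interval [lo, hi]."""
    return max(lo, min(hi, value))

def make_input_command(throttle: float, brake: float, steering_angle: float) -> dict:
    """Clamp raw control values to the ranges declared in ugv_assignment_msgs/Input."""
    return {
        "throttle": clamp(throttle, 0.0, 100.0),             # percent
        "brake": clamp(brake, 0.0, 100.0),                   # percent
        "steering_angle": clamp(steering_angle, -0.5, 0.5),  # radians
    }

cmd = make_input_command(throttle=150.0, brake=-5.0, steering_angle=0.8)
# cmd == {"throttle": 100.0, "brake": 0.0, "steering_angle": 0.5}
```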
`ugv_assignment_msgs/Output`:

```
string state                        # "Parking", "Manual", "Auto"
float64 linear_velocity             # m/s
float64 angular_velocity            # rad/s
float64 front_left_steering_angle   # radians
float64 front_right_steering_angle  # radians
```
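The velocity fields together give the instantaneous turning radius, R = v / ω, which is handy when sanity-checking the steering model against recorded data. A quick sketch (hypothetical helper, not part of the package):

```python
import math

def turning_radius(linear_velocity: float, angular_velocity: float) -> float:
    """Instantaneous turning radius R = v / w implied by the Output fields.
    Returns infinity when the vehicle is driving straight (w ~ 0)."""
    if abs(angular_velocity) < 1e-9:
        return math.inf
    return abs(linear_velocity / angular_velocity)

# 2 m/s forward while yawing at 0.5 rad/s traces a 4 m radius circle.
radius = turning_radius(2.0, 0.5)
```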
The vehicle can be controlled in two modes:
- Manual Mode - Keyboard/joystick input directly controls the vehicle
  - `W`/`S` or `Up`/`Down` - Throttle/Brake
  - `A`/`D` or `Left`/`Right` - Steering
  - `Space` - Brake
- Auto Mode - Vehicle responds to the `/vehicle/input` topic from ROS2
The simulator uses Unity's `fixedDeltaTime` for physics and publishes `/clock` for ROS2 time synchronization. To record deterministic rosbags:

```bash
ros2 bag record --use-sim-time /clock /imu/data /gps/fix /lidar/points /camera/image_raw ...
```

MIT License - see LICENSE for details.

