RobotForge

Gazebo / Ignition: the ROS companion

URDF import, plugins, sensors, and the headless simulation workflow for CI. The simulator that ships with ROS 2 — slower than MuJoCo, but with the deepest ROS integration in the industry.

by RobotForge
#simulators #gazebo #ros2

Gazebo is what most people mean by "ROS simulator." It loads URDFs, simulates contacts, renders cameras and lidars, and publishes ROS topics. It's slower than MuJoCo and less photorealistic than Isaac, but no other simulator integrates with ROS as deeply. For ROS-based projects in 2026, Gazebo is still the practical default — and the naming mess is finally over: "Ignition" was renamed back to "Gazebo," and the legacy Gazebo Classic line has been retired.

The version question (settled)

For years the project was split in two:

  • Gazebo Classic (gazebo9, gazebo11): the legacy line. Compatible with ROS 1; end-of-life as of January 2025.
  • Ignition Gazebo: the ground-up rewrite. Modular plugin system; built for ROS 2.

The split is now resolved: in 2022 "Ignition" was renamed back to "Gazebo," and Classic has since been retired. Modern Gazebo is the rewrite, shipped as versioned releases (Garden, Harmonic, Ionic) with first-class ROS 2 support. Older tutorials may still say "Ignition." The current installed package is gz-sim, called from ROS 2 via the ros_gz packages (most importantly ros_gz_bridge).

Strengths over alternatives

  • ROS 2 native bridge: ros_gz_bridge exposes ROS topics for everything Gazebo simulates. Camera images, lidar scans, joint states — all auto-published.
  • Realistic sensor plugins: every common sensor has a Gazebo plugin (RealSense, Velodyne, IMU, GPS, contact). Configurable via SDF.
  • Large-world support: simulate kilometers of outdoor environment. Used heavily for field robotics since the DARPA challenge era.
  • Headless mode: run on a server without GPU; great for CI and large-scale rollout.

Weaknesses

  • Slower than MuJoCo for contact-rich simulation; quadruped locomotion in particular runs at a noticeably lower real-time factor.
  • Camera renders are functional but not photorealistic. Don't train perception networks on Gazebo camera output expecting transfer.
  • Plugin system is powerful but confusing. Plugins are C++ shared libraries loaded into Gazebo's runtime; not the most accessible model for newcomers.
  • Build complexity: Gazebo's CMake graph is intricate. From-source builds take a while.

The minimum viable simulation

Three files to launch a robot in Gazebo:

1. The world (.sdf)

<?xml version="1.0" ?>
<sdf version="1.10">
  <world name="empty">
    <physics name="default" type="ode">
      <max_step_size>0.001</max_step_size>
      <real_time_update_rate>1000</real_time_update_rate>
    </physics>
    <include><uri>model://ground_plane</uri></include>
    <include><uri>model://sun</uri></include>
  </world>
</sdf>

2. The robot URDF (with Gazebo extensions)

<robot name="my_robot">
  <link name="base_link">
    <visual>...</visual>
    <collision>...</collision>
    <inertial>...</inertial>
  </link>
  <!-- Gazebo extensions -->
  <gazebo>
    <plugin filename="gz-sim-diff-drive-system" name="gz::sim::systems::DiffDrive">
      <left_joint>left_wheel_joint</left_joint>
      <right_joint>right_wheel_joint</right_joint>
      <wheel_separation>0.32</wheel_separation>
      <wheel_radius>0.06</wheel_radius>
    </plugin>
  </gazebo>
</robot>
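Under the hood, the diff-drive plugin maps each incoming Twist onto left/right wheel angular velocities. A minimal sketch of the standard inverse kinematics, using the wheel_separation and wheel_radius values above (plain Python, independent of Gazebo; the function name is illustrative):

```python
def diff_drive_wheel_speeds(v, omega, wheel_separation=0.32, wheel_radius=0.06):
    """Map a body twist (v m/s forward, omega rad/s yaw) to left/right
    wheel angular velocities in rad/s (standard unicycle model)."""
    v_left = (v - omega * wheel_separation / 2.0) / wheel_radius
    v_right = (v + omega * wheel_separation / 2.0) / wheel_radius
    return v_left, v_right

# Driving straight at 0.3 m/s: both wheels spin at roughly 5 rad/s.
print(diff_drive_wheel_speeds(0.3, 0.0))
# Turning in place: wheels spin in opposite directions.
print(diff_drive_wheel_speeds(0.0, 1.0))
```

If the simulated robot spins when you command a straight line, these two parameters are the first thing to check against the real hardware.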

3. The launch file (Python)

import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource
from launch_ros.actions import Node

def generate_launch_description():
    gz_sim = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(os.path.join(
            get_package_share_directory('ros_gz_sim'),
            'launch', 'gz_sim.launch.py')),
        launch_arguments={'gz_args': '-r my_world.sdf'}.items(),
    )
    bridge = Node(
        package='ros_gz_bridge', executable='parameter_bridge',
        arguments=[
            '/cmd_vel@geometry_msgs/msg/Twist@gz.msgs.Twist',
            '/scan@sensor_msgs/msg/LaserScan@gz.msgs.LaserScan',
        ])
    return LaunchDescription([gz_sim, bridge])

Three files; ros2 launch my_robot bringup.launch.py; the robot drives in Gazebo and publishes ROS topics. That's the production starting point.

The bridge

ros_gz_bridge is the pivotal package. It maps Gazebo's internal pub/sub system (gz transport) to ROS 2 topics. Each topic mapping declares the type on each side; the bridge translates messages.

Standard mappings:

  • /cmd_vel: geometry_msgs/msg/Twist ↔ gz.msgs.Twist
  • /scan: sensor_msgs/msg/LaserScan ↔ gz.msgs.LaserScan
  • /odom: nav_msgs/msg/Odometry ↔ gz.msgs.Odometry
  • /camera/image_raw: sensor_msgs/msg/Image ↔ gz.msgs.Image
  • /tf: typically published directly by Gazebo's pose-publisher plugin; bridge passes through.
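The @-separated mapping strings are easy to typo. A tiny helper for building them, purely as an illustration (this function is not part of ros_gz_bridge):

```python
def bridge_arg(topic, ros_type, gz_type):
    """Format one ros_gz_bridge mapping string: topic@ROS_type@GZ_type."""
    return f"{topic}@{ros_type}@{gz_type}"

args = [
    bridge_arg('/cmd_vel', 'geometry_msgs/msg/Twist', 'gz.msgs.Twist'),
    bridge_arg('/scan', 'sensor_msgs/msg/LaserScan', 'gz.msgs.LaserScan'),
]
print(args[0])  # /cmd_vel@geometry_msgs/msg/Twist@gz.msgs.Twist
```

Note the fully qualified ROS type (with the msg/ segment) on the left and the gz.msgs type on the right; getting either wrong makes the bridge silently drop the topic.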

Sensors that ship with Gazebo

  • Camera: RGB, depth, RGB-D, multi-camera rig. Configurable resolution, FOV, distortion.
  • LiDAR: 2D and 3D; configurable beam count, angular range, noise.
  • IMU: 6-axis with configurable noise.
  • Force-torque: at any joint.
  • Contact sensor: detect contacts on a link.
  • GPS: simulated lat/lon from world position.

All publishable to ROS via the bridge. For most robotics projects, there's no need to write a custom sensor plugin.
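As an illustration, a 2D lidar attached to a link might look like this in SDF (element names follow the gz-sim sensor schema; the sample count, angles, and ranges are placeholder values to tune for your robot):

```xml
<sensor name="lidar" type="gpu_lidar">
  <update_rate>10</update_rate>
  <topic>scan</topic>
  <lidar>
    <scan>
      <horizontal>
        <samples>360</samples>
        <min_angle>-3.14159</min_angle>
        <max_angle>3.14159</max_angle>
      </horizontal>
    </scan>
    <range>
      <min>0.1</min>
      <max>12.0</max>
    </range>
  </lidar>
</sensor>
```

Bridge the scan topic as shown earlier and the data appears in ROS as a sensor_msgs/msg/LaserScan.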

Headless mode + CI

For CI testing, Gazebo runs without a display. The -s flag starts the server only (no GUI), and --headless-rendering lets camera and lidar sensors render offscreen:

gz sim -s -r --headless-rendering my_world.sdf

Combined with colcon test running ROS 2 launch files, this enables regression testing: every commit, drive the simulated robot through a 60-second test scenario; assert it didn't crash and reached the goal. Real teams catch real bugs this way.
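The pass/fail check at the end of such a scenario usually reduces to a tolerance test on the final pose. A sketch of that assertion logic in plain Python (in a real test suite this would read /odom via rclpy or launch_testing; the function name and tolerance are illustrative):

```python
import math

def reached_goal(pose_xy, goal_xy, tol=0.25):
    """True if the (x, y) pose is within tol meters of the goal."""
    return math.hypot(pose_xy[0] - goal_xy[0], pose_xy[1] - goal_xy[1]) <= tol

# Final odometry pose vs. the scenario's goal point.
assert reached_goal((2.1, 0.9), (2.0, 1.0))      # close enough: pass
assert not reached_goal((0.0, 0.0), (2.0, 1.0))  # never left the start: fail
```

Keeping the assertion this dumb is deliberate: flaky sim tests usually come from over-specified success criteria, not from Gazebo itself.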

Gazebo vs MuJoCo: when to use which

Need                           Pick
ROS-based, full sensor stack   Gazebo
RL training, contact-rich      MuJoCo
Outdoor large-world            Gazebo
Photorealistic perception      Isaac Sim

Most production robotics projects use Gazebo for development + integration tests, MuJoCo for training learned policies, Isaac for perception model training. Three sims, one project.

Common gotchas

  • Time scaling: Gazebo's real-time-factor (RTF) might drop below 1.0 on heavy scenes. Use use_sim_time: true consistently in ROS to avoid timestamp confusion.
  • Plugin compatibility: a plugin built for Gazebo Garden won't load in Harmonic. Match versions.
  • SDF version mismatch: the SDF format evolves, so old worlds may need migration. gz sdf -p <file> parses the file and surfaces errors.
  • Mesh paths: URDF using package:// URIs needs GZ_SIM_RESOURCE_PATH set correctly.
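For the mesh-path gotcha, the fix is pointing GZ_SIM_RESOURCE_PATH at the share/ directory containing your package, so package:// URIs resolve (the workspace path below is a placeholder; substitute your own colcon install prefix):

```shell
# Placeholder workspace path -- adjust to your own install layout.
export GZ_SIM_RESOURCE_PATH="$HOME/ros2_ws/install/my_robot/share:$GZ_SIM_RESOURCE_PATH"
echo "$GZ_SIM_RESOURCE_PATH"
```

Set it in the shell that launches Gazebo (or export it from the launch file); a missing entry typically shows up as invisible robots rather than an error.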

Exercise

Take a Pi4-based diff-drive robot you have a URDF for. Set up the Gazebo + ros_gz_bridge launch above. Drive it with a teleop keyboard node and watch it move in RViz. Then add a lidar plugin and visualize the scan. Then run Nav2 against the simulated robot. The pipeline from "URDF" to "Nav2 navigates a simulated robot" is a full afternoon — and once you've done it, you've internalized the ROS 2 simulation workflow.

Next

PyBullet — the friendly Python-first sim that beats Gazebo on prototyping speed.
