GPS, RTK, and outdoor state estimation
Localization when you don't need SLAM — and the IMU tricks that plug the gaps when the sky does. The honest tradeoffs between consumer GPS, RTK, and full sensor fusion.
Indoors, you need SLAM because the only way to know where you are is to map your surroundings. Outdoors with sky access, satellites can tell you. GPS is the cheapest localization sensor; RTK makes it centimeter-accurate; sensor fusion fills in the gaps when the sky disappears. Here's the working knowledge for outdoor robots.
What GPS actually gives you
A standard consumer GPS receiver:
- Latitude / longitude at ±2–5 m, 95% confidence.
- Altitude at ±5–15 m. Worst direction.
- Velocity at ±0.05 m/s. Surprisingly good.
- Time at ±10 ns. Excellent.
- Update rate: 1–10 Hz typical; up to 25 Hz on high-end consumer.
For a delivery drone navigating between waypoints 100 m apart, this is fine. For a self-driving car needing to stay in its lane (~30 cm), this is terrible.
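To put numbers on that: if the 95% error bound is ±2.5 m, the per-axis standard deviation is roughly 2.5/1.96 ≈ 1.3 m, and you can compute how often a single fix lands within a 30 cm lane tolerance (a back-of-envelope Gaussian model, not a claim about any specific receiver):

```python
import math

def p_within(tolerance_m: float, sigma_m: float) -> float:
    """Probability that a zero-mean Gaussian error falls within ±tolerance."""
    return math.erf(tolerance_m / (sigma_m * math.sqrt(2)))

sigma = 2.5 / 1.96  # 95% confidence at ±2.5 m implies sigma ≈ 1.3 m
print(f"P(within ±0.3 m lane tolerance) = {p_within(0.3, sigma):.0%}")
```

Roughly one fix in five lands inside the lane tolerance — hence "terrible" for lane-keeping.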
Why GPS is so noisy
Consumer GPS uses one signal: the L1 C/A code from each visible satellite. The position is computed by trilateration from signal travel times (pseudoranges). Sources of error:
- Atmospheric delay: ionosphere and troposphere bend signals. ±2–5 m typical.
- Multipath: signal reflects off buildings before reaching the antenna. Major source in urban environments.
- Satellite geometry: receiver in a "canyon" sees few satellites, all in similar directions. Geometric dilution of precision (GDOP) goes up.
- Clock error: receiver clocks are cheap. The offset is solved as a fourth unknown (which is why you need at least four satellites), but residual clock error still corrupts the travel-time measurements.
The fundamental limit on consumer GPS isn't the sensor — it's the signal model.
RTK: centimeter-accurate GPS
Real-Time Kinematic. The trick:
- A base station at a known location measures the GPS error in real time.
- The base station broadcasts a correction signal (over LoRa, cellular, NTRIP).
- The rover applies the correction to its own measurements.
Result: position error drops to ~1–5 cm. Velocity error to ~0.01 m/s.
Constraints:
- Base station within ~20 km of the rover (atmospheric error decorrelates with distance).
- Rover must receive the correction within ~1 second.
- Both base and rover need dual-frequency multi-constellation receivers (~$200+ in 2026; was thousands a few years ago).
Public RTK services (state-government NTRIP networks, paid services like Trimble VRS, SBAS) eliminate the need to deploy your own base in many regions.
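Connecting to an NTRIP caster is just an HTTP-style handshake. A minimal sketch of building the NTRIP v1 request (the mountpoint, username, and password here are placeholders, not a real service):

```python
import base64

def ntrip_request(mountpoint: str, user: str, password: str) -> bytes:
    """Build a minimal NTRIP v1 request for an RTCM correction stream."""
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()
    return (
        f"GET /{mountpoint} HTTP/1.0\r\n"
        "User-Agent: NTRIP simple-client/1.0\r\n"
        f"Authorization: Basic {auth}\r\n"
        "\r\n"
    ).encode()

req = ntrip_request("MOUNTPOINT_NEAR_ME", "user", "pass")
# Send this over a TCP socket to the caster (port 2101 by convention);
# the stream after the "ICY 200 OK" reply is raw RTCM3, which you forward
# to the rover receiver's UART.
```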
Practical RTK in 2026
Cheapest viable rovers:
- u-blox ZED-F9P: ~$200, dual-frequency, used in many open-source projects. The de-facto hobbyist standard.
- Septentrio Mosaic-X5: ~$1k, more accurate, used in commercial drones.
- Skydio / Wing internal modules: proprietary; production drone autopilots.
For ~$400 (rover + base in pair) you can build an RTK-enabled robot platform. ArduPilot, PX4, and ROS 2 all support RTK feeds natively.
The blind spots
GPS / RTK fails when:
- Indoors: no signal.
- Tunnels: signal blocked.
- Tree canopy: dense foliage attenuates and reflects.
- Urban canyons: tall buildings block half the sky; multipath dominates.
- Mountains: limited satellite visibility.
- Active jamming/spoofing: rare for hobbyists; serious threat for autonomous vehicles in some regions.
For a drone flying over a city or a delivery robot rolling through downtown, GPS-only localization breaks regularly. Sensor fusion required.
The IMU bridge
An IMU (gyroscope + accelerometer + sometimes magnetometer) provides high-rate motion data without external infrastructure. Integration:
- Orientation = ∫gyro dt.
- Velocity = ∫(acceleration rotated into the world frame, minus gravity) dt.
- Position = ∫velocity dt.
The catch: IMUs drift. Gyro biases drift slowly; accelerometer biases drift faster. After 30 seconds of pure-IMU integration, position is meters off.
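You can see the drift in a ten-line simulation. The robot below is stationary, so any reported position is pure error; the bias and noise figures are invented for illustration:

```python
import random

def dead_reckon(bias=0.02, noise=0.05, dt=0.01, seconds=30, seed=0):
    """Integrate a biased, noisy accelerometer twice; return position error (m).
    The true robot is stationary, so any position is pure drift."""
    rng = random.Random(seed)
    v = x = 0.0
    for _ in range(int(seconds / dt)):
        a = bias + rng.gauss(0.0, noise)  # measured accel; true accel is 0
        v += a * dt                       # velocity = ∫a dt
        x += v * dt                       # position = ∫v dt
    return x

print(f"drift after 30 s: {dead_reckon():.1f} m")
```

A constant 0.02 m/s² bias alone gives ½·b·t² ≈ 9 m after 30 s — bias, not noise, dominates, which is why fusion filters estimate the biases online.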
The pattern: GPS provides absolute position with low noise but gaps; IMU provides high-rate continuity but drifts. Fuse them.
Sensor fusion architectures
Loosely-coupled (the simplest)
The GPS provides a position fix; the IMU integrates between fixes. An EKF or UKF combines them: the GPS fixes correct the IMU's drift and estimated biases, and the IMU carries the estimate smoothly between fixes.
Used by most consumer drones, automotive lane-keeping, simple robotics. Implementation: robot_localization in ROS 2, ArduPilot's EKF3.
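A toy 1-D version of the loosely-coupled pattern (all numbers invented for illustration): predict at 100 Hz from noisy IMU acceleration, correct at 1 Hz from a noisy GPS fix.

```python
import random

dt, q, r = 0.01, 0.1, 2.5 ** 2      # timestep, process noise, GPS variance
x, v = 0.0, 1.0                     # estimated position (m), velocity (m/s)
P = 1.0                             # position variance (scalar for brevity)
rng = random.Random(1)
true_x = 0.0
for step in range(1, 1001):         # simulate 10 s
    true_x += 1.0 * dt              # robot really moves at 1 m/s
    v += rng.gauss(0.0, 0.2) * dt   # IMU accel integrates into velocity
    x += v * dt                     # predict: dead-reckon between fixes
    P += q * dt                     # uncertainty grows while dead-reckoning
    if step % 100 == 0:             # a GPS fix arrives at 1 Hz
        z = true_x + rng.gauss(0.0, 2.5)
        K = P / (P + r)             # Kalman gain: how much to trust the fix
        x += K * (z - x)            # correct the dead-reckoned position
        P *= 1 - K                  # uncertainty shrinks after correction
print(f"true position {true_x:.1f} m, fused estimate {x:.1f} m")
```

The estimate stays within roughly a meter of truth even though each individual GPS fix is ±2.5 m — the filter averages fixes while the IMU keeps it continuous in between.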
Tightly-coupled (precision)
Skip the GPS solution; fuse the raw satellite measurements (pseudoranges, carrier phases, Doppler) with the IMU directly. More information, better resilience to brief outages, more code complexity.
Used in survey-grade GNSS-INS units, autonomous vehicles, military systems.
Add wheel odometry
Ground robots get a third sensor: wheel encoders. Provides velocity in the body frame. Combined with IMU + GPS:
- Wheel odometry: locally accurate, drifts long-term.
- IMU: rapid corrections; drifts independently.
- GPS: absolute, but with gaps.
Triple fusion in robot_localization's EKF — covered in the EKF/UKF lesson.
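As a sketch, a robot_localization parameter file for this triple fusion might look like the following. Topic names are placeholders; each `_config` is the filter's 15-element boolean mask over (x y z, roll pitch yaw, vx vy vz, vroll vpitch vyaw, ax ay az):

```yaml
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true              # ground robot: ignore z, roll, pitch
    map_frame: map
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom

    odom0: /wheel/odometry        # wheel encoders: trust velocities only
    odom0_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,
                   false, false, true,
                   false, false, false]

    imu0: /imu/data               # IMU: yaw, yaw rate, forward accel
    imu0_config: [false, false, false,
                  false, false, true,
                  false, false, false,
                  false, false, true,
                  true,  false, false]

    odom1: /odometry/gps          # GPS, via navsat_transform_node
    odom1_config: [true,  true,  false,
                   false, false, false,
                   false, false, false,
                   false, false, false,
                   false, false, false]
```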
Add visual / visual-inertial odometry
For drones especially, add a downward camera + VIO. Provides velocity even in GPS-denied scenarios (e.g., flying under a bridge).
The full stack: GPS + IMU + visual + (LiDAR if available). Each compensates for the others' failures. Production autonomous-vehicle and drone systems all fuse three or more sensor modalities.
The lat/lon-to-meters problem
GPS gives WGS84 lat/lon. Robotics wants Cartesian (m, m). Convert via:
- Local tangent plane: linearize around a reference point. Good for <10 km regions.
- UTM: standard projected coordinate system, accurate up to a UTM zone (~6° wide).
- ECEF + manual conversion: full sphere geometry; needed for global-scale operations.
For most robots: pick a local origin, do tangent-plane linearization, never look back. The math is in geographiclib and ROS's geodesy package.
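A minimal tangent-plane conversion looks like this (the equirectangular approximation; the function name is ours, not geographiclib's API):

```python
import math

R_EARTH = 6_378_137.0  # WGS84 equatorial radius, metres

def latlon_to_local(lat, lon, lat0, lon0):
    """Local tangent-plane approximation around (lat0, lon0).
    Distortion is negligible over a few km, which is all most robots need."""
    d_lat = math.radians(lat - lat0)
    d_lon = math.radians(lon - lon0)
    x = R_EARTH * d_lon * math.cos(math.radians(lat0))  # east, metres
    y = R_EARTH * d_lat                                 # north, metres
    return x, y

# One millidegree of latitude is ~111 m anywhere on the globe:
x, y = latlon_to_local(37.001, -122.0, 37.0, -122.0)
```

Note the cos(lat0) factor: a degree of longitude shrinks toward the poles, which is exactly why you can't treat raw lat/lon as Cartesian.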
Coordinate frames in outdoor systems
Multiple coordinate frames matter:
- map: globally consistent, derived from GPS or SLAM.
- odom: locally continuous (no jumps), drifts. Wheel odom + IMU.
- base_link: robot body.
The map → odom transform is the localization estimate; it's updated by GPS/SLAM whenever a fix arrives. The odom → base_link transform is dead-reckoned and never jumps.
This split lets the robot operate smoothly between GPS fixes (using odom) and re-anchor to global truth periodically (via map).
The 2026 outdoor stack
For a typical outdoor delivery robot:
- RTK GPS (~$400 component cost): cm-accurate position when sky is visible.
- Industrial IMU ($200–2000): mid-grade tactical; less drift than consumer.
- Wheel encoders: standard.
- Camera or stereo VIO: covers GPS-denied stretches.
- EKF-based fusion: robot_localization.
For an autonomous truck: add LiDAR + HD maps. For a drone: add downward camera + barometer for altitude. Each environment dictates the sensor mix.
Exercise
Take a u-blox ZED-F9P module + matching antenna (~$300 total). Set up a rover against a base station (a second module, or a public NTRIP stream). Observe the position accuracy improvement from 2–3 m (consumer GPS) to a few centimeters (RTK). Then drive into a building; watch the position diverge as the IMU takes over without correction. The before/after demonstrates everything in this lesson.
That's the SLAM track done
You've covered the full progression: Bayes filter → Kalman → EKF/UKF → particle filters → factor graphs → occupancy grids → visual SLAM (ORB-SLAM3) → LiDAR SLAM (LOAM family) → modern learned methods → GPS / outdoor. With this and the Foundations / Kinematics / Control / ROS 2 / Manipulation / Learning tracks, you have seven complete tracks covering the working knowledge of robotics. The remaining tracks (Perception, Motion Planning, Mobile/Legged, Simulators, Embedded, Frontiers) build on this foundation.