Sensor Fusion-Based Navigation Systems for Autonomous Delivery Robots

Authors

  • Pooja Sharma, Independent Researcher, Dilsukhnagar, Hyderabad, India – 500060

Keywords:

autonomous delivery robots; sensor fusion; SLAM; factor graphs; visual–inertial odometry; LiDAR–inertial odometry; UWB; model-predictive control; dynamic obstacle avoidance; semantic mapping

Abstract

Autonomous delivery robots must navigate sidewalks, corridors, and mixed indoor–outdoor campuses while maintaining accuracy, safety, and efficiency under imperfect sensing. Single-sensor pipelines (e.g., wheel odometry or vision alone) degrade under wheel slip, poor lighting, occlusions, and multipath. This manuscript presents a sensor-fusion navigation architecture that integrates inertial measurement units (IMUs), wheel encoders, cameras, LiDAR, and optional ultra-wideband (UWB) anchors to achieve robust localization and motion planning in dynamic environments. We detail a modular stack: (1) time-synchronized preprocessing and calibration, (2) multi-rate odometry (wheel–IMU EKF, visual–inertial odometry, LiDAR–inertial odometry), (3) factor-graph smoothing with loop closures and UWB priors, (4) semantic mapping that separates static structure from dynamic obstacles, (5) dual-horizon planning with D*-Lite globally and model-predictive control (MPC) locally, and (6) a safety supervisor enforcing stop/slowdown under uncertainty spikes.
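To make the multi-rate odometry stage concrete, the following is a minimal 1-D sketch of the wheel–IMU EKF idea (not the paper's implementation): IMU acceleration drives the prediction step, and the wheel-encoder velocity serves as the measurement update. The state, noise variances, and time step are invented for illustration.

```python
# Minimal 1-D wheel–IMU EKF sketch. State x = (position, velocity);
# IMU acceleration drives prediction, wheel-encoder velocity updates.
# All noise parameters are assumed values, not the paper's tuning.

def ekf_step(x, P, accel, wheel_vel, dt,
             q_accel=0.5,    # assumed IMU acceleration noise variance
             r_wheel=0.04):  # assumed encoder velocity noise variance
    px, v = x
    # Predict with constant-acceleration model: F = [[1, dt], [0, 1]].
    px_p = px + v * dt + 0.5 * accel * dt * dt
    v_p = v + accel * dt
    p00, p01, p10, p11 = P
    # Covariance propagation P' = F P F^T + Q (Q from white accel noise).
    p00_p = p00 + dt * (p10 + p01) + dt * dt * p11 + q_accel * dt ** 4 / 4
    p01_p = p01 + dt * p11 + q_accel * dt ** 3 / 2
    p10_p = p10 + dt * p11 + q_accel * dt ** 3 / 2
    p11_p = p11 + q_accel * dt * dt
    # Update with wheel velocity measurement: H = [0, 1].
    s = p11_p + r_wheel                      # innovation variance
    k0, k1 = p01_p / s, p11_p / s            # Kalman gain
    innov = wheel_vel - v_p
    x_new = (px_p + k0 * innov, v_p + k1 * innov)
    P_new = (p00_p - k0 * p10_p, p01_p - k0 * p11_p,
             (1 - k1) * p10_p, (1 - k1) * p11_p)
    return x_new, P_new

# Drive the filter with a consistent constant-acceleration profile.
x, P, dt = (0.0, 0.0), (1.0, 0.0, 0.0, 1.0), 0.1
for k in range(1, 101):
    true_v = 0.2 * dt * k                    # encoder reading matches truth
    x, P = ekf_step(x, P, accel=0.2, wheel_vel=true_v, dt=dt)
```

When the measurements agree with the motion model, the estimate tracks the true trajectory exactly; in practice the gain trades off IMU drift against wheel slip through `q_accel` and `r_wheel`.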

A simulation campaign across campus-sidewalk and urban-alley scenes (varying lighting, ground friction, and pedestrian density) compares four configurations: baseline wheel–IMU EKF, VIO-aided EKF, LiDAR-inertial odometry, and full multimodal factor-graph fusion including UWB. The fused system reduces absolute trajectory error by ~82% and collision rate by ~89% relative to the baseline, while adding <16 ms average fusion latency. We report statistically significant gains (ANOVA, Tukey HSD, p < 0.01) in success rate, path efficiency, and energy per kilometer. Results suggest that tightly-coupled, uncertainty-aware fusion—combined with semantic dynamics handling—yields navigation resilience suitable for last-meter delivery. We conclude with deployment guidance and open problems in long-term calibration drift, low-texture scenes, and learning-enhanced fusion.
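For readers unfamiliar with the headline metric, absolute trajectory error (ATE) is the root-mean-square translational distance between aligned estimated and ground-truth poses. The sketch below illustrates the computation on invented 2-D trajectories; the numbers are hypothetical and unrelated to the paper's results.

```python
# Illustration of the ATE metric used to compare configurations.
# Poses are (x, y) pairs assumed to be pre-aligned; values are invented.
import math

def ate_rmse(est, gt):
    """Root-mean-square translational error over corresponding poses."""
    errs = [math.hypot(ex - gx, ey - gy)
            for (ex, ey), (gx, gy) in zip(est, gt)]
    return math.sqrt(sum(e * e for e in errs) / len(errs))

gt = [(float(i), 0.0) for i in range(10)]
baseline = [(i + 0.5, 0.3) for i in range(10)]    # drifting wheel–IMU odometry
fused = [(i + 0.05, 0.05) for i in range(10)]     # fused multimodal estimate
reduction = 1.0 - ate_rmse(fused, gt) / ate_rmse(baseline, gt)
```

The relative-reduction figure reported in the abstract is computed the same way: one minus the ratio of fused ATE to baseline ATE.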



Published

2025-11-05

How to Cite

Sharma, Pooja. “Sensor Fusion-Based Navigation Systems for Autonomous Delivery Robots”. International Journal of Advanced Research in Computer Science and Engineering (IJARCSE) 1, no. 4 (November 5, 2025): 69–76. Accessed January 22, 2026. https://ijarcse.org/index.php/ijarcse/article/view/92.
