Monocular Camera & UWB: A Low-Cost, Low-Power Mobile Robot Positioning Scheme (VR-SLAM) with Fast Recovery from Tracking Failure and Global Maps Without Loop Closure (NTU, Nanyang Technological University)

For autonomous mobile robots, achieving accurate, reliable, and consistent positioning over both the short and long term is a major challenge: at low altitudes, GPS is susceptible to interference and drift, while at high altitudes, vision and lidar lose their feature points. Combining different sensor modalities to exploit their complementary strengths is a common positioning solution. But how can reliability be improved while keeping cost and power consumption low, and why is UWB critical here? Built on monocular camera and UWB ranging and positioning technology (the LinkTrack UWB high-precision positioning system), the VR-SLAM algorithm and extensive real-world data presented by Dr. Thien Hoang Nguyen and Dr. Shenghai Yuan of Nanyang Technological University (NTU), Singapore, offer the industry a new approach (download link at the end of the article).
Comparison of positioning results with and without UWB:
VIRAL SLAM: Tightly Coupled Camera-IMU-UWB-Lidar SLAM
Among them, UWB sensors offer several key advantages:

  • Unlike visual odometry (VO), UWB distance measurement is drift-free and unaffected by visual conditions. Unlike GPS, UWB can be used both indoors and outdoors. UWB sensors are also smaller, lighter, and more affordable than lidar, and simpler to install on robotic platforms.
  • In multi-robot scenarios, UWB can also serve as a communication network, provide a variable baseline between two robots, or supply inter-robot constraints for pose graph optimization or formation control.
  • On the other hand, UWB sensors require good line-of-sight (LoS) for accurate measurements and cannot provide any perceptual information about the environment, and the performance of distance-based positioning methods depends on deploying enough UWB anchors in a non-degenerate configuration.
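To make the last point concrete, a minimal sketch of range-based positioning from UWB anchors is shown below, solving for the tag position by nonlinear least squares. The anchor layout, noise level, and all numbers are illustrative assumptions, not values from the paper; note how a collinear or coplanar-only anchor layout would make this problem ill-conditioned.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical anchor layout (meters). One anchor is elevated so the
# vertical coordinate is observable; a degenerate (e.g. collinear)
# configuration would make the estimate ill-conditioned.
anchors = np.array([[0.0, 0.0, 0.0],
                    [8.0, 0.0, 0.0],
                    [0.0, 8.0, 0.0],
                    [8.0, 8.0, 2.5]])

def residuals(p, anchors, ranges):
    """Predicted minus measured tag-to-anchor distances."""
    return np.linalg.norm(anchors - p, axis=1) - ranges

# Simulate noisy UWB ranges to an assumed true tag position.
rng = np.random.default_rng(0)
p_true = np.array([3.0, 5.0, 1.0])
ranges = np.linalg.norm(anchors - p_true, axis=1) + rng.normal(0, 0.05, 4)

sol = least_squares(residuals, x0=np.zeros(3), args=(anchors, ranges))
print(sol.x)  # estimated tag position, close to p_true
```

With good geometry, four ranges suffice to pin down a 3D position; the residual norm after convergence also gives a quick sanity check on measurement quality.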
Therefore, using UWB to complement other sensor modalities has great potential in practical applications. UWB sensors and monocular cameras are not only low-cost and low-power, but also mechanically flexible to install on robot platforms.
VR-SLAM is a simultaneous localization and mapping (SLAM) system using a monocular camera and ultra-wideband (UWB) sensors, structured as a multi-stage framework that fully exploits the strengths of each sensor and compensates for its shortcomings. From the paper (VR-SLAM: A Visual-Range Simultaneous Localization and Mapping System using Monocular Camera and Ultra-wideband Sensors):

  • First, a UWB-assisted 7-degree-of-freedom (scale factor, 3D position, and 3D orientation) global alignment module is introduced to initialize the visual odometry (VO) system in the world coordinate frame defined by the UWB anchors. This module loosely fuses up-to-scale VO and ranging data using either a quadratically constrained quadratic programming (QCQP) or a nonlinear least-squares (NLS) algorithm, depending on whether a good initial guess is available.
  • Second, an accompanying theoretical analysis is provided, including the derivation and interpretation of the Fisher Information Matrix (FIM) and its determinant.
  • Finally, UWB-aided bundle adjustment (UBA) and UWB-aided pose graph optimization (UPGO) modules are proposed to improve short-term odometry accuracy, reduce long-term drift, and correct any alignment and scale errors. Extensive simulations and experiments show that the solution outperforms UWB-only and camera-only approaches as well as previous methods, can quickly recover from tracking failures without relying on visual relocalization, and can easily obtain global maps even without loop closures.
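The 7-DoF global alignment step above can be sketched as an NLS problem: find the scale s, rotation R, and translation t that map the unscaled VO trajectory into the anchor-defined world frame so that predicted tag-anchor distances match the UWB ranges. This is our own illustration of the idea, not the authors' implementation; the anchor layout, trajectory, and all (s, R, t) values are assumed, and the initial guess is taken as given (in the paper, the QCQP branch handles the case without one).

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

rng = np.random.default_rng(1)

# Hypothetical setup: four non-coplanar UWB anchors (meters) and an
# unscaled VO trajectory; s_true, R_true, t_true are illustration values.
anchors = np.array([[0.0, 0.0, 0.0],
                    [10.0, 0.0, 0.0],
                    [0.0, 10.0, 0.0],
                    [10.0, 10.0, 3.0]])
s_true = 2.0
R_true = Rotation.from_rotvec([0.1, -0.2, 0.3])
t_true = np.array([4.0, -1.0, 0.5])

p_vo = rng.uniform(-2, 2, size=(40, 3))       # VO positions, arbitrary scale
p_w = s_true * R_true.apply(p_vo) + t_true    # true world positions
# Simulated noisy tag-anchor ranges, one row per pose.
d = np.linalg.norm(p_w[:, None, :] - anchors[None], axis=2)
d += rng.normal(0, 0.02, d.shape)

def residuals(x):
    """Range residuals for a candidate (scale, rotation vector, translation)."""
    s, rvec, t = x[0], x[1:4], x[4:7]
    pw = s * Rotation.from_rotvec(rvec).apply(p_vo) + t
    return (np.linalg.norm(pw[:, None, :] - anchors[None], axis=2) - d).ravel()

# NLS branch: assumes a coarse initial guess is available.
x0 = np.array([1.0, 0.0, 0.0, 0.0, 3.0, 0.0, 0.0])
sol = least_squares(residuals, x0)
print(sol.x[0], sol.x[4:7])  # recovered scale and translation, near the true values
```

Parameterizing orientation as a rotation vector keeps the problem at exactly 7 unknowns; with four non-coplanar anchors and a spread-out trajectory, the transform is observable from ranges alone.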
VIRAL-Fusion: A Visual-Inertial-Ranging-Lidar Sensor Fusion Approach

Resources