Bio-Inspired Hybrid Map: Spatial Implicit Local Frames and Topological Map for Mobile Cobot Navigation
All authors are with Learning and Adaptive Robotics Laboratory, Department of Computer Science and Engineering,
University of Texas at Arlington, Arlington, TX 76013, USA.
Navigation is a fundamental capability for mobile robots, enabling them to operate autonomously in complex and dynamic environments. Conventional approaches use probabilistic models to localize robots and build maps simultaneously from sensor observations. Recent approaches employ human-inspired learning, such as imitation and reinforcement learning, to navigate robots more effectively. However, these methods suffer from high computational costs, global map inconsistency, and poor generalization to unseen environments. This paper presents a novel method inspired by how humans perceive and navigate effectively in novel environments. Specifically, we first build local frames that mimic how humans represent essential spatial information in the short term. Points in local frames are hybrid representations that combine spatial information with learned features, so-called spatial-implicit local frames. We then integrate spatial-implicit local frames into a global topological map represented as a factor graph. Lastly, we develop a novel navigation algorithm based on Rapidly-Exploring Random Tree Star (RRT*) that leverages spatial-implicit local frames and the topological map to navigate effectively. To validate our approach, we conduct extensive experiments on real-world datasets and in in-lab environments.
A full demo is available on YouTube.
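The navigation algorithm described above builds on RRT*. As a rough illustration of the underlying tree-growing idea only (not this repo's implementation, which scores nodes using spatial-implicit local frames and the topological map), here is a minimal goal-biased RRT in an empty 2D workspace:

```python
import math
import random

def rrt(start, goal, step=0.5, goal_tol=0.5, max_iters=2000,
        bounds=(0.0, 10.0), seed=0):
    """Minimal goal-biased RRT in an obstacle-free 2D square.

    Illustrative sketch only: the paper's planner is an RRT* variant
    informed by spatial-implicit local frames; none of that is shown here.
    """
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(max_iters):
        # Sample the goal 10% of the time to bias growth toward it
        sample = goal if rng.random() < 0.1 else (
            rng.uniform(*bounds), rng.uniform(*bounds))
        # Find the nearest existing node to the sample
        i = min(range(len(nodes)), key=lambda j: math.dist(nodes[j], sample))
        nx, ny = nodes[i]
        d = math.dist((nx, ny), sample)
        if d == 0:
            continue
        # Steer one fixed step from the nearest node toward the sample
        new = (nx + step * (sample[0] - nx) / d,
               ny + step * (sample[1] - ny) / d)
        parent[len(nodes)] = i
        nodes.append(new)
        if math.dist(new, goal) < goal_tol:
            # Walk parent pointers back to the root to recover the path
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None
```

RRT* additionally rewires the tree so that each node keeps the lowest-cost parent within a neighborhood, which this sketch omits for brevity.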
conda create -n simn python=3.10.16 # create new environment
conda activate simn # activate the new environment
conda install pytorch==2.0.0 torchvision==0.15.0 torchaudio==2.0.0 pytorch-cuda=11.7 -c pytorch -c nvidia
pip install -r requirements.txt # install dependencies

Example: config/tum_fr1_desk.yaml
camera:
fx: 520.9
fy: 521.0
cx: 325.1
cy: 249.7
fps: 30

Configurations are stored as YAML files, then converted into a config object:
cfg = Config(path="config/tum_fr1_desk.yaml").config()
camera = cfg.camera # Access a nested property
fps = camera.fps # Access a single property

Run the system with:

python main.py -c config/tum_fr1_desk.yaml

Supported datasets:
- TUM
- RealSense camera (real-time, on device)
- UT Arlington
- KITTI
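The `Config` usage shown above can be sketched as follows. This is a hypothetical reconstruction (the actual `Config` class lives in this repo's source), assuming PyYAML and attribute access via nested namespaces:

```python
from types import SimpleNamespace
import yaml  # PyYAML; assumed to be in requirements.txt

def _to_namespace(d):
    # Recursively convert nested dicts into attribute-accessible objects
    if isinstance(d, dict):
        return SimpleNamespace(**{k: _to_namespace(v) for k, v in d.items()})
    return d

class Config:
    """Hypothetical sketch of the repo's Config loader (names assumed)."""

    def __init__(self, path):
        self.path = path

    def config(self):
        # Parse the YAML file and expose keys as attributes,
        # e.g. cfg.camera.fps instead of cfg["camera"]["fps"]
        with open(self.path, "r") as f:
            return _to_namespace(yaml.safe_load(f))
```

With a file like `config/tum_fr1_desk.yaml`, `Config(path=...).config().camera.fps` would then return `30`.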
To use a RealSense camera in real time, install librealsense following:
https://github.com/IntelRealSense/librealsense/blob/development/doc/distribution_linux.md
Then install the Python bindings:
pip install pyrealsense2
@inproceedings{dang2025bio,
title = {Bio-Inspired Hybrid Map: Spatial Implicit Local Frames and Topological Map for Mobile Cobot Navigation},
author = {Dang, Tuan and Huber, Manfred},
booktitle = {Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
year = {2025},
month = oct,
note = {Presented October 2025},
url = {https://arxiv.org/abs/2507.04649},
}

