
[IROS'25] Bio-Inspired Hybrid Map: Spatial Implicit Local Frames and Topological Map for Mobile Cobot Navigation


Authors: Tuan Dang and Manfred Huber

All authors are with the Learning and Adaptive Robotics Laboratory, Department of Computer Science and Engineering,
University of Texas at Arlington, Arlington, TX 76013, USA.

Abstract

Navigation is a fundamental capability for mobile robots, enabling them to operate autonomously in complex and dynamic environments. Conventional approaches use probabilistic models to localize robots and build maps simultaneously from sensor observations. Recent approaches employ human-inspired learning, such as imitation and reinforcement learning, to navigate robots more effectively. However, these methods suffer from high computational costs, global map inconsistency, and poor generalization to unseen environments. This paper presents a novel method inspired by how humans perceive and navigate novel environments effectively. Specifically, we first build local frames that mimic how humans represent essential spatial information in the short term. Points in these local frames are hybrid representations combining spatial information with learned features, hence called spatial-implicit local frames. We then integrate the spatial-implicit local frames into a global topological map represented as a factor graph. Lastly, we develop a novel navigation algorithm based on Rapidly-Exploring Random Tree Star (RRT*) that leverages the spatial-implicit local frames and the topological map to navigate effectively. To validate our approach, we conduct extensive experiments on real-world datasets and in in-lab environments.
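The navigation algorithm builds on RRT*. The repository's implementation is not excerpted here, but the base algorithm can be sketched for a 2D obstacle-free workspace; all names and parameters below are illustrative, and this omits the paper's spatial-implicit local frames and topological map:

```python
import math
import random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def steer(a, b, step):
    """Move from a toward b by at most `step`."""
    d = dist(a, b)
    if d <= step:
        return b
    t = step / d
    return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))

def is_ancestor(a, b, parent):
    """True if a lies on b's parent chain (guards against rewiring cycles)."""
    while b is not None:
        if b == a:
            return True
        b = parent[b]
    return False

def rrt_star(start, goal, bounds=10.0, iters=2000, step=0.5, radius=1.0, seed=0):
    random.seed(seed)
    nodes, parent, cost = [start], {start: None}, {start: 0.0}
    for _ in range(iters):
        # Sample uniformly, with a 10% bias toward the goal.
        sample = goal if random.random() < 0.1 else (
            random.uniform(0.0, bounds), random.uniform(0.0, bounds))
        nearest = min(nodes, key=lambda n: dist(n, sample))
        new = steer(nearest, sample, step)
        if new in cost:
            continue
        # RRT* step 1: choose the lowest-cost parent among nearby nodes.
        near = [n for n in nodes if dist(n, new) <= radius]
        best = min(near, key=lambda n: cost[n] + dist(n, new))
        parent[new], cost[new] = best, cost[best] + dist(best, new)
        nodes.append(new)
        # RRT* step 2: rewire neighbors through `new` when that is cheaper.
        # (Simplification: stale costs of rewired nodes' descendants are not propagated.)
        for n in near:
            c = cost[new] + dist(new, n)
            if c < cost[n] and not is_ancestor(n, new, parent):
                parent[n], cost[n] = new, c
        if goal not in cost and dist(new, goal) <= step:
            parent[goal], cost[goal] = new, cost[new] + dist(new, goal)
            nodes.append(goal)
    if goal not in parent:
        return None
    path = [goal]
    while parent[path[-1]] is not None:
        path.append(parent[path[-1]])
    return path[::-1]
```

In the full method, collision checks against the hybrid map would gate both the steer and rewire steps.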

Overview


A full demo is available on YouTube.


Environment and dependencies setup

conda create -n simn python=3.10.16 # create a new environment
conda activate simn # activate the new environment
conda install pytorch==2.0.0 torchvision==0.15.0 torchaudio==2.0.0 pytorch-cuda=11.7 -c pytorch -c nvidia
pip install -r requirements.txt # install dependencies

Configurations

Example: config/tum_fr1_desk.yaml

camera:
  fx: 520.9
  fy: 521.0
  cx: 325.1
  cy: 249.7
  fps: 30
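These are standard pinhole intrinsics: a point (X, Y, Z) in the camera frame projects to pixel coordinates (u, v) = (fx·X/Z + cx, fy·Y/Z + cy). A minimal sanity-check sketch (the `project` function is illustrative, not part of the repository):

```python
def project(point, fx=520.9, fy=521.0, cx=325.1, cy=249.7):
    """Project a 3D point (X, Y, Z) in the camera frame to pixel coordinates (u, v)."""
    X, Y, Z = point
    assert Z > 0, "point must lie in front of the camera"
    return fx * X / Z + cx, fy * Y / Z + cy

# A point on the optical axis maps to the principal point (cx, cy).
print(project((0.0, 0.0, 1.0)))  # -> (325.1, 249.7)
```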

Configurations are stored as YAML files, then converted into a configuration object:

cfg = Config(path="config/tum_fr1_desk.yaml").config()
camera = cfg.camera  # access a nested property
fps = camera.fps     # access a single property
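The Config class itself is not excerpted here; a plausible minimal sketch of what the conversion does is to recursively turn the parsed YAML dict into objects with attribute access (`to_namespace` and the inline dict are illustrative stand-ins):

```python
from types import SimpleNamespace

def to_namespace(obj):
    """Recursively convert nested dicts (e.g. from yaml.safe_load) into attribute access."""
    if isinstance(obj, dict):
        return SimpleNamespace(**{k: to_namespace(v) for k, v in obj.items()})
    return obj

# Stand-in for the parsed contents of config/tum_fr1_desk.yaml
raw = {"camera": {"fx": 520.9, "fy": 521.0, "cx": 325.1, "cy": 249.7, "fps": 30}}
cfg = to_namespace(raw)
print(cfg.camera.fps)  # -> 30
```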

Run

python main.py -c config/tum_fr1_desk.yaml

Datasets

  • TUM
  • RealSense camera (real-time on device)
  • UT Arlington
  • KITTI

RealSense Installation

Install the driver and tools:

https://github.com/IntelRealSense/librealsense/blob/development/doc/distribution_linux.md

Python Binding:

pip install pyrealsense2

Citing

@inproceedings{dang2025bio,
  title        = {Bio-Inspired Hybrid Map: Spatial Implicit Local Frames and Topological Map for Mobile Cobot Navigation},
  author       = {Dang, Tuan and Huber, Manfred},
  booktitle    = {Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year         = {2025},
  month        = oct,
  note         = {Presented October 2025},
  url          = {https://arxiv.org/abs/2507.04649},
}
