This is the official code release of A Hyperspectral Imaging Guided Robotic Grasping System.
[paper] [project] [code] [Datasets] [CAD files]
## Environment

The complete deployment of the project includes the following components:
- Model Training and Inference
- Robotic Manipulation
- PRISM Control (Hyperspectral Camera, Motors)
Because the hyperspectral camera control interface requires Windows, the project is developed on:
- Windows 10

However, model training and inference can run on any platform that supports PyTorch, such as Ubuntu 20.04 (tested).
### Create conda environment and install PyTorch

This code is tested with Python 3.10.14 on Ubuntu 20.04 and Windows 10.

```shell
conda create -n prism python=3.10
conda activate prism
# PyTorch with CUDA 11.8
pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
```
### Dependencies

Install the remaining dependencies:

```shell
pip install joblib tqdm tensorboard omegaconf opencv-python matplotlib scipy scikit-learn plantcv spectral h5py
pip install numpy==1.26.4
```
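After installation, a quick stdlib-only check can confirm that the packages above resolved. The list of import names below is an assumption based on the install commands (note that `opencv-python` imports as `cv2` and `scikit-learn` as `sklearn`):

```python
import importlib.util

# Import names for the packages installed above; opencv-python imports
# as `cv2` and scikit-learn as `sklearn`.
required = ["joblib", "tqdm", "tensorboard", "omegaconf", "cv2", "matplotlib",
            "scipy", "sklearn", "plantcv", "spectral", "numpy", "h5py", "torch"]

# find_spec returns None for any package that is not importable.
missing = [name for name in required if importlib.util.find_spec(name) is None]
print("missing packages:", missing or "none")
```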
The scripts have only been tested in the PyCharm environment. In the run configuration, untick the "Run with Python Console" option and disable "View > Scientific Mode".
## Usage

Run the command below to play the PRISM working animation:

```shell
python scripts/prism_animation.py
```

You can modify the config parameter `model_type` in `config/train.yaml` to train a specific model.
```shell
python scripts/train.py
```

You can also run the test script to evaluate the trained model:
```shell
python scripts/test.py
```

## Device Control

All C++ device control code is in the `c_device` folder. This includes control modules for Modbus devices, the Nachi robot, and the Specim line-scan camera:
```
c_device
├── libModbus
├── nachi
└── specim
```

## Citation

If you find this work helpful, please consider citing:
```bibtex
@ARTICLE{11020724,
  author={Sun, Zheng and Dong, Zhipeng and Wang, Shixiong and Chu, Zhongyi and Chen, Fei},
  journal={IEEE Robotics and Automation Letters},
  title={A Hyperspectral Imaging Guided Robotic Grasping System},
  year={2025},
  volume={},
  number={},
  pages={1-8},
  keywords={Hyperspectral imaging;Robots;Grasping;Robot sensing systems;Sorting;Service robots;Cameras;Robot vision systems;Nonlinear distortion;Servomotors;Perception for Grasping and Manipulation;Software-Hardware Integration for Robot Systems;Grasping},
  doi={10.1109/LRA.2025.3575654}}
```



