
NeuGrasp

Generalizable Neural Surface Reconstruction with Background Priors for Material-Agnostic Object Grasp Detection

ICRA 2025
Qingyu Fan1,2,3, Yinghao Cai1,2†, Chao Li3, Wenzhe He3,
Xudong Zheng3, Tao Lu1, Bin Liang3, Shuo Wang1,2

1 Institute of Automation, Chinese Academy of Sciences.   
2 School of Artificial Intelligence, University of Chinese Academy of Sciences.   
3 Qiyuan Lab.   
† Corresponding Author

Paper | Project Page | YouTube

📢 News

  • [05/09/2025]: The code of NeuGrasp is released.
  • [01/31/2025]: NeuGrasp is accepted to ICRA 2025.

📜 Introduction

[ICRA'25] This is the official repository of NeuGrasp: Generalizable Neural Surface Reconstruction with Background Priors for Material-Agnostic Object Grasp Detection.

In this paper, we introduce NeuGrasp, a neural surface reconstruction method that leverages background priors for material-agnostic grasp detection. NeuGrasp integrates transformers and global prior volumes to aggregate multi-view features with spatial encoding, enabling robust surface reconstruction in narrow and sparse viewing conditions. By focusing on foreground objects through residual feature enhancement and refining spatial perception with an occupancy-prior volume, NeuGrasp excels in handling objects with transparent and specular surfaces.

Please kindly star ⭐️ this project if it helps you 😁.

🛠️ Installation

✅ Prerequisites

  • ROS — Required for grasp and scene visualization.
  • Blender 2.93.3 — Used for simulation and synthetic data generation. You can download it from the official Blender website.

📦 Dependencies

To install the required Python packages, run:

pip install -r requirements.txt

⚠️ Blender Python Dependencies: Please ensure Blender is properly installed and accessible from the command line (i.e., the blender command works). Note that Blender uses its own bundled Python environment for simulation scripts, so the required packages must also be installed into that interpreter:

/path/to/blender-2.93.3-linux-x64/2.93/python/bin/python3.9 -m ensurepip --upgrade
/path/to/blender-2.93.3-linux-x64/2.93/python/bin/python3.9 -m pip install -r requirements.txt

🔁 Replace /path/to/ with your actual Blender installation path.
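Both pip commands must target the same bundled interpreter. The helper below is a hypothetical convenience (not part of this repository) that locates Blender's bundled Python given the path to the blender binary:

```python
import glob
import os

def bundled_python(blender_bin):
    """Find Blender's bundled Python interpreter next to the blender binary.

    Looks for <install_root>/<version>/python/bin/python3* and returns the
    first match, or None if the layout does not look like a Blender install.
    """
    root = os.path.dirname(blender_bin)  # e.g. .../blender-2.93.3-linux-x64
    hits = glob.glob(os.path.join(root, "*", "python", "bin", "python3*"))
    return hits[0] if hits else None
```

For Blender 2.93.3 this should resolve to the 2.93/python/bin/python3.9 path shown above.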

🚀 Quick Start

📝 Modify Paths

Before running the code, please ensure that all file and directory paths in the scripts (e.g., Blender path, dataset path, save directories) are correctly set according to your local environment.

For example, update:

BLENDER_PATH = "/path/to/blender-2.93.3-linux-x64/blender"
PYTHON_PATH = "/path/to/python"

🔁 Replace /path/to/ with the actual paths on your machine.
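Since several scripts read these constants, a quick sanity check can catch a misconfigured path before a long run fails. This is an illustrative sketch; `check_paths` is a hypothetical helper, not part of the repository:

```python
import os

# Hypothetical path settings, mirroring the variables shown above.
BLENDER_PATH = "/path/to/blender-2.93.3-linux-x64/blender"
PYTHON_PATH = "/path/to/python"

def check_paths(*paths):
    """Return the subset of paths that are missing or not executable."""
    missing = []
    for p in paths:
        if not (os.path.isfile(p) and os.access(p, os.X_OK)):
            missing.append(p)
    return missing

bad = check_paths(BLENDER_PATH, PYTHON_PATH)
if bad:
    print("Fix these paths before running:", bad)
```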

🧪 Data Generation

  1. Download and extract the required assets into data/.
  2. Download the ImageNet 2012 test set and organize it as src/assets/imagenet/images/test. This is used for randomized texture assignment during simulation.
  3. Download scene descriptor files from GIGA into data/NeuGraspData/data/.
  4. Download and extract the renderer module to data_generator/render/.

You can also download example generated data for reference.

Once all resources are prepared, run the following:

cd data_generator/render
bash run_packed_rand.sh

💡 For pile scenes and background data, follow the same procedure.

After rendering is complete, process the raw data using:

python data_generator/render/wash.py
python data_generator/render/depth2tsdf.py

These scripts will clean up the data and convert the rendered depth images into .npz TSDF files for downstream use.
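For intuition, a TSDF stores one value per voxel: the signed distance to the nearest observed surface, truncated and normalized to [-1, 1] (positive in front of the surface, negative behind). The sketch below shows a minimal single-view TSDF integration in NumPy; it is illustrative only and not the repository's depth2tsdf.py, whose exact conventions may differ:

```python
import numpy as np

def depth_to_tsdf(depth, K, T_cam_world, vol_origin, vox_size, vol_dim, trunc):
    """Illustrative single-view TSDF integration.

    depth: (H, W) depth image in meters; K: 3x3 camera intrinsics;
    T_cam_world: 4x4 world-to-camera transform; trunc: truncation distance (m).
    Assumes voxels lie in front of the camera (z > 0).
    """
    H, W = depth.shape
    # Voxel-center coordinates in the world frame.
    idx = np.indices(vol_dim).reshape(3, -1).T               # (N, 3)
    pts_w = vol_origin + (idx + 0.5) * vox_size              # (N, 3)
    pts_c = (T_cam_world[:3, :3] @ pts_w.T + T_cam_world[:3, 3:]).T
    z = pts_c[:, 2]
    # Project voxel centers into the depth image.
    uv = (K @ pts_c.T).T
    u = np.round(uv[:, 0] / z).astype(int)
    v = np.round(uv[:, 1] / z).astype(int)
    valid = (z > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
    d = np.zeros_like(z)
    d[valid] = depth[v[valid], u[valid]]
    obs = valid & (d > 0)                                    # pixels with a depth reading
    # Signed distance along the viewing ray, normalized and truncated to [-1, 1].
    tsdf = np.ones(idx.shape[0], dtype=np.float32)           # default: empty space
    sdf = (d - z) / trunc
    tsdf[obs] = np.clip(sdf[obs], -1.0, 1.0)
    return tsdf.reshape(vol_dim)
```

Fusing several views would average these per-view values per voxel with observation weights.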

🏋️ Training

Once the training data and configuration files are ready, run the following command:

bash train.sh GPU_ID

For example:

bash train.sh 0

🔍 Testing

We provide pretrained weights for direct evaluation. Please download the checkpoint and place it in:

src/neugrasp/ckpt/

To run simulation-based grasping:

bash run_simgrasp.sh

🔭 Visualization Options

  • RViz Visualization: To visualize the scene and predicted grasps in RViz, set RVIZ=1 in the configuration and run the following commands beforehand:

    roscore
    rviz

    Then, in the RViz interface, open the config file:

    src/gd/config/sim.rviz
  • PyBullet Execution: To visualize grasp execution in PyBullet, set GUI=1.

🔧 Finetuning with Real-World Data

Our method is robust even without geometry ground truth, and it can be further improved by finetuning on real-world data.

  • We provide our collected data and partial scripts for reference.
  • Due to the complexity of hardware setups, we only offer guidelines for real-world data collection.

🛠️ Real-World Data Collection Procedure

  1. The robot captures RGB-D images at the start of each scene.
  2. A human operator manually moves the arm to feasible grasp positions.
  3. Another person types yes, which triggers the gripper to close and saves the grasp pose and width.

We recommend rewriting a collection script tailored to your setup. Useful reference functions can be found in the VGN repository.
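The three steps above can be sketched as a simple collection loop. The camera, robot, and gripper objects and their methods (get_rgbd, get_tcp_pose, close) are hypothetical stand-ins that you would replace with your own hardware drivers:

```python
import json
import numpy as np

def collect_scene(scene_id, camera, robot, gripper, out_dir, ask=input):
    """Record one scene: an initial RGB-D capture plus operator-confirmed grasps."""
    rgb, depth = camera.get_rgbd()                      # step 1: capture at scene start
    np.savez(f"{out_dir}/scene_{scene_id}.npz", rgb=rgb, depth=depth)
    grasps = []
    # Steps 2-3: the operator moves the arm, then a second person types "yes".
    while ask("Record grasp? [yes/done] ").strip() == "yes":
        pose = robot.get_tcp_pose()                     # pose format is driver-specific
        width = gripper.close()                         # close and read the final width
        grasps.append({"pose": pose, "width": width})
    with open(f"{out_dir}/scene_{scene_id}_grasps.json", "w") as f:
        json.dump(grasps, f)
    return grasps
```

Passing the prompt function as a parameter keeps the loop testable without a terminal attached.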

📚 Citation

If you find our work helpful, please consider citing:

@INPROCEEDINGS{fan2025neugrasp,
  author={Fan, Qingyu and Cai, Yinghao and Li, Chao and He, Wenzhe and Zheng, Xudong and Lu, Tao and Liang, Bin and Wang, Shuo},
  booktitle={2025 IEEE International Conference on Robotics and Automation (ICRA)}, 
  title={NeuGrasp: Generalizable Neural Surface Reconstruction with Background Priors for Material-Agnostic Object Grasp Detection}, 
  year={2025},
  volume={},
  number={},
  pages={3197-3203},
  keywords={Surface reconstruction;Aggregates;Refining;Focusing;Grasping;Reconstruction algorithms;Transformers;Feature extraction;Encoding;Robots},
  doi={10.1109/ICRA55743.2025.11127348}
}

🙏 Acknowledgments

We would like to thank the authors of VGN and GraspNeRF for open-sourcing their codebases, which greatly inspired this work.

📬 Contact

For questions or feedback, feel free to open an issue or reach out:

  • Qingyu Fan: fanqingyu23@mails.ucas.edu.cn
  • Yinghao Cai: yinghao.cai@ia.ac.cn
