This package contains the grasp detection node for the Franka robot with a two-finger gripper. The node runs the DetectGrasps service and publishes the detected grasp pose in the world frame.
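As a quick illustration of what "publishing the grasp pose in the world frame" involves, the sketch below transforms a grasp position detected in the camera frame into the world frame with a homogeneous transform. The transform values and names are hypothetical examples, not part of this package:

```python
# Sketch: express a grasp position detected in the camera frame in the world
# frame. The 4x4 transform below is illustrative, not a real calibration.

def mat_vec(T, p):
    """Apply a 4x4 homogeneous transform T to a 3D point p."""
    x, y, z = p
    v = (x, y, z, 1.0)
    return tuple(sum(T[i][j] * v[j] for j in range(4)) for i in range(3))

# Hypothetical camera-to-world transform: camera 1 m above the table,
# axes aligned with the world frame for simplicity.
T_world_cam = [
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 0.0, 0.0, 1.0],
]

grasp_in_cam = (0.1, -0.2, 0.5)  # grasp position in the camera frame
grasp_in_world = mat_vec(T_world_cam, grasp_in_cam)
print(grasp_in_world)  # (0.1, -0.2, 1.5)
```

In the actual node, this transform would come from TF rather than a hard-coded matrix.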
For a specific/custom grasp model, please create a new ROS conda environment to avoid Python dependency conflicts. We recommend creating your conda env with:
```bash
mamba create -n grasp_env_<model> python=3.9
mamba activate grasp_env_<model>
conda config --env --add channels conda-forge
conda config --env --add channels robostack-staging
conda config --env --remove channels defaults
```

Before you install the full ROS dependencies, we recommend installing the grasp detection models first, because: 1) a full ROS environment setup adds isolated C/C++ compilers inside the mamba environment and overwrites the C/C++ environment variables, and 2) many detection models are built locally with the system g++/nvcc executables. Installing the detection models first therefore avoids many potential issues.
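To see why the installation order matters, the sketch below checks whether the `CC`/`CXX` compiler variables have been redirected into a conda/mamba environment, which is what happens once the ROS toolchain packages are installed. The function and the fake environment are illustrative, not part of this package:

```python
# Sketch: detect whether C/C++ compiler variables point inside a conda/mamba
# environment (a symptom of installing the ROS toolchain before your models).
import os

def compilers_redirected(env=None):
    """Return the names of CC/CXX variables that point inside the active conda env."""
    env = os.environ if env is None else env
    prefix = env.get("CONDA_PREFIX", "")
    hits = []
    for var in ("CC", "CXX"):
        path = env.get(var, "")
        if prefix and path.startswith(prefix):
            hits.append(var)
    return hits

# Example with a fake environment: CC points inside the env, CXX still points
# at the system compiler.
fake_env = {
    "CONDA_PREFIX": "/opt/mamba/envs/grasp_env_giga",
    "CC": "/opt/mamba/envs/grasp_env_giga/bin/x86_64-conda-linux-gnu-cc",
    "CXX": "/usr/bin/g++",
}
print(compilers_redirected(fake_env))  # ['CC']
```

If this reports `CC` or `CXX` after a full ROS install, builds that expect the system g++/nvcc may pick up the wrong compiler.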
Models including GIGA and AnyGrasp have been verified and integrated into the package. You can install any of them, or deploy your own model.
Please follow the installation instructions to install our fork of GIGA (https://github.com/SgtVincent/GIGA), which includes some minor fixes.
AnyGrasp is a SOTA grasp detection SDK library developed by GraspNet. You can get access to the library by applying for its license here.
Please follow the Installation to install AnyGrasp.
```bash
# Install ros-noetic into the environment (ROS1)
mamba install ros-noetic-desktop-full compilers cxx-compiler cmake pkg-config make ninja colcon-common-extensions catkin_tools
```

By default, one catkin workspace uses one mamba ROS environment to build its packages. To avoid conflicts, create a separate catkin workspace for your grasp detection models.
```bash
mamba activate grasp_env_<model>

# Create a catkin workspace
mkdir -p /path/to/grasp_<model>_ws/src
cd /path/to/grasp_<model>_ws
catkin init

# Link the grasp detection models to the new catkin workspace
ln -s /path/to/catkin_ws/src/grasp_detection /path/to/grasp_<model>_ws/src/grasp_detection

# Build the catkin workspace
catkin build
```

Note that you also need some geometric and perception tools from the GIGA repo. If you DO NOT need GIGA, you can run the following command to install ONLY these tools, without installing the GIGA network:

```bash
pip install git+https://github.com/SgtVincent/GIGA.git
```

Now you should be good to run the grasp detection node.
```bash
# Source the grasp detection workspace and launch the node
cd /path/to/grasp_<model>_ws
source devel/setup.bash
roslaunch grasp_detection run_node.launch
```

For custom model deployment and parameter settings, please refer to detectors.
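For orientation only, a detector configuration entry might look like the sketch below. Every key and value here is a hypothetical example, not the package's actual schema; consult the detectors documentation for the real parameters:

```yaml
# Hypothetical example only; the real schema is defined in the detectors docs.
detector:
  model: giga                       # which integrated model to load, e.g. giga or anygrasp
  checkpoint: /path/to/checkpoint   # weights for the chosen model
  frame_id: world                   # frame in which detected grasp poses are published
  max_grasps: 5                     # number of grasp candidates to return
```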