
[CVPR 2026] Proxy-GS: Unified Occlusion Priors for Training and Inference in Structured 3D Gaussian Splatting

arXiv | Project Page

Teaser image

This repo contains the official implementation of Proxy-GS. ⭐ us if you like it!

News

  • 🔥🔥 2026/2/26: Proxy-GS has been accepted to CVPR 2026.

Todo List

  • [✓] Release the training & inference code of Proxy-GS.
  • [ ] Release all model checkpoints.

Installation

We recommend using a dedicated conda environment:

conda create -n proxy-gs python=3.10 -y
conda activate proxy-gs

Install a CUDA-enabled PyTorch build that matches your local CUDA version first, then install the remaining dependencies:

pip install -r requirements.txt

# Install torch-scatter with the wheel matching your PyTorch/CUDA version.
# See: https://data.pyg.org/whl/
pip install torch-scatter

# Install local CUDA extensions
pip install ./submodules/diff-gaussian-rasterization
pip install ./submodules/simple-knn
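After installing, a quick sanity check can confirm that the main dependencies are visible to the active environment. The snippet below is a hypothetical helper, not part of the repo; it uses `find_spec` so nothing heavy is actually imported:

```python
# Sanity check (illustrative helper, not shipped with the repo): report
# whether each dependency from the installation steps is importable,
# without actually importing it (find_spec is cheap and side-effect free).
import importlib.util

packages = ["torch", "torch_scatter", "diff_gaussian_rasterization", "simple_knn"]
for name in packages:
    status = "OK" if importlib.util.find_spec(name) is not None else "MISSING"
    print(f"{name}: {status}")
```

If any line prints `MISSING`, revisit the corresponding install command above.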

Training and Inference

The following example uses the MatrixCity block_E scene. For reproducibility, we recommend using a dedicated output directory such as output/block_E.

SCENE=proxy-gs/MatrixCity/small_city/street/pose/block_E
IMAGES=proxy-gs/MatrixCity/small_city/street/train/small_city_road_horizon
MESH=cvpr/block_E_from_mesh.ply
POINTS=MatrixCity/small_city/aerial/small_city_pointcloud/point_cloud_ds20/aerial/Block_E.ply
DEPTH_DIR=mesh_depth_block_E
OUTPUT=output/block_E

1. Render mesh depth and save caches

Render the proxy mesh into per-view depth maps and save them as .npy files:

python mesh_render.py \
  -s ${SCENE} \
  -m ${OUTPUT} \
  -i ${IMAGES} \
  --ply_mesh ${MESH} \
  --depth_npy_dir ${DEPTH_DIR}

The rendered depth files will be written to ${DEPTH_DIR}.
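To double-check the cache, you can load one of the `.npy` files back with NumPy. The sketch below uses a synthetic depth map and an assumed per-view file name (`view_0000.npy`); the actual naming is determined by `mesh_render.py`:

```python
# Minimal round-trip sketch (file name and resolution are assumptions,
# standing in for one view rendered by mesh_render.py): save a depth map
# as .npy and re-load it to confirm the cache format round-trips.
import numpy as np
from pathlib import Path

depth_dir = Path("mesh_depth_demo")
depth_dir.mkdir(exist_ok=True)

# Synthetic per-view depth map in meters, float32 like a typical z-buffer dump.
depth = (np.random.rand(540, 960) * 100.0).astype(np.float32)
np.save(depth_dir / "view_0000.npy", depth)

loaded = np.load(depth_dir / "view_0000.npy")
print(loaded.shape, loaded.dtype)
```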

2. Train Proxy-GS

After the depth cache is ready, start training with the rendered mesh depth prior:

python train.py \
  -s ${SCENE} \
  -m ${OUTPUT} \
  -i ${IMAGES} \
  --ply_mesh ${MESH} \
  --depth_npy_dir ${DEPTH_DIR} \
  --ply_path ${POINTS}

Checkpoints will be saved under ${OUTPUT}/point_cloud.
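If training saves multiple checkpoints, you may want to pick the latest one programmatically. The sketch below assumes the common 3DGS-style layout of `point_cloud/iteration_<step>/` subfolders (an assumption for Proxy-GS; check your `${OUTPUT}` directory):

```python
# Sketch (assumes the common 3DGS convention of point_cloud/iteration_<step>/
# checkpoint subfolders): enumerate saved checkpoints and select the latest.
from pathlib import Path

output = Path("output_demo/block_E")
# Simulate two saved checkpoints purely for illustration.
for step in (7000, 30000):
    (output / "point_cloud" / f"iteration_{step}").mkdir(parents=True, exist_ok=True)

ckpts = sorted(
    (output / "point_cloud").glob("iteration_*"),
    key=lambda p: int(p.name.split("_")[1]),  # sort numerically, not lexically
)
latest = ckpts[-1]
print([p.name for p in ckpts], "latest:", latest.name)
```

Numeric sorting matters here: a lexical sort would place `iteration_7000` after `iteration_30000`.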

3. Test and evaluate

4. Build ProxyGS-Vulkan-Cuda-Interop

This optional Vulkan backend currently requires Ubuntu Linux and an NVIDIA RTX-series GPU.

If you want to use the Vulkan-CUDA interop backend used by render_real.py, build the Python extension in ProxyGS-Vulkan-Cuda-Interop first.

cd ProxyGS-Vulkan-Cuda-Interop

# Use the current conda environment
export PYTHON_EXECUTABLE=$(which python)

# Set this to your local Vulkan SDK path
export VULKAN_SDK_PREFIX=$HOME/VulkanSDK/1.4.321.1/x86_64

# Build the Python extension only
./build_pyext_only.sh build-py

After building, verify that the extension can be imported successfully:

cd ProxyGS-Vulkan-Cuda-Interop
python -c "
import sys
sys.path.insert(0, 'build-py/_bin/Release')
import vk2torch_ext
print('vk2torch_ext OK:', vk2torch_ext.__version__)
"

FPS Evaluation

Once the extension is built, go back to the project root and run render_real.py.

cd ..
python render_real.py \
  -m ${OUTPUT} \
  --scene_file /absolute/path/to/your_scene.glb

To reproduce the exact commands from our runs, replace ${OUTPUT} with output.
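For reference, `render_real.py` reports its own numbers; the pattern below is only a hypothetical illustration of how FPS is typically measured, with a stand-in for the real render call:

```python
# Hypothetical FPS measurement sketch (not the repo's implementation):
# time repeated render calls after a warm-up phase and report frames/sec.
import time

def render_frame():
    # Stand-in for one Vulkan-CUDA interop render call.
    time.sleep(0.001)

# Warm-up so one-time setup cost (shader compilation, caches) is excluded.
for _ in range(10):
    render_frame()

n_frames = 100
start = time.perf_counter()
for _ in range(n_frames):
    render_frame()
elapsed = time.perf_counter() - start
fps = n_frames / elapsed
print(f"{fps:.1f} FPS over {n_frames} frames")
```

Excluding warm-up iterations is the key detail: including startup cost in the timed window systematically understates steady-state FPS.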

Acknowledgements

This project builds upon several excellent open-source repositories. We sincerely thank the authors of:

  • Octree-GS: Towards Consistent Real-time Rendering with LOD-Structured 3D Gaussians
  • Scaffold-GS: Structured 3D Gaussians for View-Adaptive Rendering
  • 3D Gaussian Splatting for Real-Time Radiance Field Rendering
  • vk_lod_clusters: Sample for cluster-based continuous level of detail rasterization or ray tracing
