This project uses real-time webcam input to render an avatar in the Meta Quest 2 VR environment. It performs real-time avatar generation using VIBE for body pose and shape estimation, and renders meshes with the estimated parameters in Unity.
★ If you found this repository useful, please support it by giving a ⭐!
Make sure the following environments and packages are installed:
- Ubuntu 22.04
- CUDA 11.8
- Python 3.8
- Unity 2022.3.40
- Follow the dependency requirements of VIBE
✅ Ensure your environment satisfies the requirements of VIBE for proper functionality.
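As a quick sanity check before installing, a small script like the one below can confirm the Python version and whether the CUDA toolchain is visible. The function name and the specific checks are illustrative, not part of VIBE:

```python
import shutil
import subprocess
import sys

def check_environment(required_python=(3, 8)):
    """Run a few illustrative environment checks; not part of VIBE itself."""
    checks = {
        # VIBE's install scripts assume Python 3.8
        "python_3_8": sys.version_info[:2] == required_python,
        # nvcc on PATH is a rough proxy for a CUDA toolkit install
        "nvcc_found": shutil.which("nvcc") is not None,
    }
    if checks["nvcc_found"]:
        out = subprocess.run(["nvcc", "--version"], capture_output=True, text=True)
        # The last line of `nvcc --version` contains the release string
        checks["cuda_version_line"] = out.stdout.strip().splitlines()[-1]
    return checks
```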
The overall process of the program is illustrated in the example image below:
Download the VIBE source code from GitHub:
```bash
git clone https://github.com/mkocabas/VIBE.git
```

Create a new Conda environment with Python 3.8:

```bash
conda create -n <your_env_name> python=3.8
```

You can install the required packages using either pip or conda.

Using pip:

```bash
source scripts/install_pip.sh
```

Using conda:

```bash
source scripts/install_conda.sh
```

Download and place the required model files:

```bash
source scripts/prepare_data.sh
```

Activate the Conda environment:

```bash
conda activate <your_env_name>
```

There are two ways to run the demo. In both cases, start the client only after the server is running.
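Because the client must not start before the server, a small helper like the following can poll until the server socket accepts connections. The host and port here are assumptions; substitute the values from your own configuration:

```python
import socket
import time

def wait_for_server(host="127.0.0.1", port=5000, timeout=30.0):
    """Poll until a TCP server accepts a connection; port 5000 is an assumed default."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)
    return False
```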
Run the following command to generate output .obj files:
```bash
python VIBE_streaming.py --output_folder output/<folder_name> --save_obj
```

Then run the 'SampleScene' in the 'Scenes' folder.

Note: Make sure that both the Python server and the Unity client are set to use localhost as their IP address.
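For reference, each frame's mesh is written as standard Wavefront OBJ text. A minimal writer similar in spirit to what `--save_obj` produces (the function name is illustrative, not VIBE's actual code) looks like this:

```python
def write_obj(path, vertices, faces):
    """Write a mesh as Wavefront OBJ text; OBJ face indices are 1-based."""
    with open(path, "w") as f:
        for x, y, z in vertices:
            f.write(f"v {x} {y} {z}\n")
        for a, b, c in faces:
            # Convert 0-based triangle indices to OBJ's 1-based convention
            f.write(f"f {a + 1} {b + 1} {c + 1}\n")
```

The Unity side then only needs an OBJ parser to rebuild the mesh for each frame.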
Run the following command to generate output .obj files and send them using TCP/IP:
```bash
python VIBE_streaming.py --output_folder output/<folder_name> --send_obj
```

Similarly, run the 'SampleScene' in the 'Scenes' folder.
Note: Make sure that both the Python server and the Unity client are set to use IP addresses within the same network (use an actual network IP instead of localhost).
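One common way to move .obj text over TCP is to prefix each payload with its length, so the client knows where one frame ends and the next begins. The framing below is a sketch under that assumption, not necessarily the exact protocol used by `VIBE_streaming.py`:

```python
import socket
import struct

def send_frame(sock, payload: bytes):
    """Send one length-prefixed frame (4-byte big-endian length header)."""
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_frame(sock) -> bytes:
    """Receive one length-prefixed frame, looping until all bytes arrive."""
    header = _recv_exactly(sock, 4)
    (length,) = struct.unpack(">I", header)
    return _recv_exactly(sock, length)

def _recv_exactly(sock, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf
```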
- Modified the GRU to update its hidden state at every frame for real-time inference.
- Implemented real-time .obj rendering from webcam input, supporting both same-OS (Ubuntu) setups and cross-OS configurations where the model runs on an Ubuntu server and Unity runs on Windows, for integration with the Oculus Quest 2.
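The per-frame GRU change can be illustrated with a minimal GRU cell in NumPy: instead of processing a whole clip at once, the hidden state is kept between calls and updated one frame at a time. The weights and sizes below are random placeholders, not VIBE's:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class StreamingGRUCell:
    """Minimal GRU cell that carries its hidden state across frames."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        # One weight matrix per gate: update (z), reset (r), candidate (n)
        self.Wz = rng.standard_normal((hidden_size, input_size + hidden_size)) * 0.1
        self.Wr = rng.standard_normal((hidden_size, input_size + hidden_size)) * 0.1
        self.Wn = rng.standard_normal((hidden_size, input_size + hidden_size)) * 0.1
        self.h = np.zeros(hidden_size)  # persistent hidden state

    def step(self, x):
        """Consume one frame's features and update the hidden state in place."""
        xh = np.concatenate([x, self.h])
        z = sigmoid(self.Wz @ xh)                               # update gate
        r = sigmoid(self.Wr @ xh)                               # reset gate
        n = np.tanh(self.Wn @ np.concatenate([x, r * self.h]))  # candidate state
        self.h = (1.0 - z) * n + z * self.h
        return self.h
```

Calling `step()` once per webcam frame keeps latency at a single frame instead of a full sequence window.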
This work was supported by the Electronics and Telecommunications Research Institute (ETRI).