kimmyju/XR_Avatar
# XRAvatar: AI-Driven Real-Time Avatar on Unity and Oculus Quest 2

This project uses real-time webcam input to render an avatar in the Meta Quest 2 VR environment. It performs real-time avatar generation using VIBE for body pose and shape estimation, and renders meshes with the estimated parameters in Unity.

★ If you found this repository useful, please support it by giving a ⭐!

## 📦 Requirements

Make sure the following environments and packages are installed:

- Ubuntu 22.04
- CUDA 11.8
- Python 3.8
- Unity 2022.3.40
- The dependency requirements of VIBE

✅ Ensure your environment satisfies the requirements of VIBE for proper functionality.
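As a quick sanity check, a small script like the following (a hypothetical helper, not part of VIBE) can report whether the running interpreter matches the Python 3.8 target:

```python
import sys

def python_matches(expected=(3, 8)):
    """Compare the running interpreter against the Python version VIBE
    targets. Returns (ok, message); purely informational."""
    actual = sys.version_info[:2]
    msg = f"running Python {actual[0]}.{actual[1]}, VIBE targets {expected[0]}.{expected[1]}"
    return actual == expected, msg

ok, msg = python_matches()
print(("OK: " if ok else "WARNING: ") + msg)
```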


## 🔍 Overall Process

The overall process of the program is illustrated in the example image below:

*(Figure: Overall Process)*


## ⚙️ Environment Setup

### 1. Clone the Repository

Download the VIBE source code from GitHub:

```shell
git clone https://github.com/mkocabas/VIBE.git
```

### 2. Create a Conda Environment

Create a new Conda environment with Python 3.8:

```shell
conda create -n <your_env_name> python=3.8
```

### 3. Install Dependencies

You can install the required packages using either pip or conda.

Using pip:

```shell
source scripts/install_pip.sh
```

Using conda:

```shell
source scripts/install_conda.sh
```

### 4. Prepare Model Files

Download and place the required model files:

```shell
source scripts/prepare_data.sh
```

### 5. Set Up the Unity Project for Oculus Quest 2

### 6. Activate the Conda Environment

Activate the environment before running the server:

```shell
conda activate <your_env_name>
```

## 🧪 Demo

There are two ways to run the demo. In both cases, start the client only after the server is running.

### 1) Running on the Same OS (Ubuntu)

**Server**

Run the following command to generate output .obj files:

```shell
python VIBE_streaming.py --output_folder output/<folder_name> --save_obj
```

**Client**

Simply run the `SampleScene` scene in the `Scenes` folder.

> Note: Make sure that both the Python server and the Unity client are set to use `localhost` as their IP address.

### 2) Running on Different OSes (Ubuntu-Windows)

**Server**

Run the following command to generate output .obj files and send them over TCP/IP:

```shell
python VIBE_streaming.py --output_folder output/<folder_name> --send_obj
```

**Client**

Similarly, run the `SampleScene` scene in the `Scenes` folder.

> Note: Make sure that both the Python server and the Unity client are set to use IP addresses on the same network (use a reachable network IP instead of `localhost`).
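Under the hood, the `--send_obj` path is a TCP stream of mesh data from the Python server to the Unity client. The sketch below shows one common pattern for this kind of transport, length-prefixed .obj frames over a socket; the helper names and framing are illustrative assumptions, and the actual wire format is whatever `VIBE_streaming.py` and the Unity client agree on:

```python
import socket
import struct
import threading

def send_obj_frame(sock, obj_text):
    """Send one .obj mesh as a 4-byte big-endian length prefix + UTF-8 payload."""
    payload = obj_text.encode("utf-8")
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def _recv_exact(sock, n):
    """Read exactly n bytes (a plain recv may return a short read)."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf

def recv_obj_frame(sock):
    """Receive one length-prefixed .obj mesh."""
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return _recv_exact(sock, length).decode("utf-8")

# Loopback demo: a server thread sends a single triangle, the client reads it back.
obj_frame = "v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n"

server = socket.socket()
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def serve_once():
    conn, _ = server.accept()
    send_obj_frame(conn, obj_frame)
    conn.close()

t = threading.Thread(target=serve_once)
t.start()

client = socket.socket()
client.connect(("127.0.0.1", port))
received = recv_obj_frame(client)
t.join()
client.close()
server.close()
print(received == obj_frame)  # True: the mesh arrived intact
```

The length prefix is what lets the client split a continuous stream back into individual meshes, which matters here because a new .obj is produced every frame.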


## 🔧 Main Changes and Updates

- Modified the GRU to update its hidden state at every frame for real-time inference.
- Implemented real-time .obj rendering from webcam input, supporting both same-OS (Ubuntu) setups and cross-OS configurations where the model runs on an Ubuntu server and Unity runs on Windows, for integration with the Oculus Quest 2.
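The per-frame hidden-state update in the first bullet can be sketched with a toy GRU cell (a NumPy stand-in with made-up dimensions, not VIBE's actual model): keeping the state between calls lets each webcam frame be processed as it arrives, and the result matches folding the whole clip offline.

```python
import numpy as np

rng = np.random.default_rng(0)
H, D = 4, 3  # toy hidden and input sizes; VIBE's real GRU is far larger

# Fixed random weights for a single minimal GRU cell (illustration only).
Wz, Uz = rng.standard_normal((H, D)), rng.standard_normal((H, H))
Wr, Ur = rng.standard_normal((H, D)), rng.standard_normal((H, H))
Wh, Uh = rng.standard_normal((H, D)), rng.standard_normal((H, H))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h):
    """One GRU update: consumes one frame's features and the carried state."""
    z = sigmoid(Wz @ x + Uz @ h)              # update gate
    r = sigmoid(Wr @ x + Ur @ h)              # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))  # candidate state
    return (1 - z) * h + z * h_tilde

frames = rng.standard_normal((10, D))  # stand-in for per-frame webcam features

class StreamingGRU:
    """Persists the hidden state between calls so each incoming frame can be
    processed immediately (the per-frame update described above)."""
    def __init__(self):
        self.h = np.zeros(H)
    def __call__(self, x):
        self.h = gru_step(x, self.h)
        return self.h

# Offline reference: fold the whole clip at once, as a batch pipeline would.
h_ref = np.zeros(H)
for x in frames:
    h_ref = gru_step(x, h_ref)

# Streaming: one call per incoming frame, state carried internally.
stream = StreamingGRU()
for x in frames:
    h_live = stream(x)

assert np.allclose(h_ref, h_live)  # per-frame updates match the offline pass
```

The design point is that nothing about the math changes; only where the hidden state lives does, moving from a per-clip loop variable to state that survives between frames.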

## Acknowledgement

This work was supported by the Electronics and Telecommunications Research Institute (ETRI).
