From c71c529c56e08785e6b9ad90292d4541507eda45 Mon Sep 17 00:00:00 2001 From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com> Date: Wed, 21 Jan 2026 12:01:34 +0000 Subject: [PATCH 1/2] Initial plan From f91c8934d3b78d3dd50762bd7a789c4765478842 Mon Sep 17 00:00:00 2001 From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com> Date: Wed, 21 Jan 2026 12:15:05 +0000 Subject: [PATCH 2/2] Add wiki documentation content for all wiki pages Co-authored-by: Phoenix-Ra <53151806+Phoenix-Ra@users.noreply.github.com> --- docs/wiki/Basics.md | 85 +++++++++++++ docs/wiki/Input-Advanced.md | 213 ++++++++++++++++++++++++++++++++ docs/wiki/Input-Basics.md | 148 ++++++++++++++++++++++ docs/wiki/Miscellaneous.md | 188 ++++++++++++++++++++++++++++ docs/wiki/README.md | 32 +++++ docs/wiki/Rendering-Advanced.md | 133 ++++++++++++++++++++ docs/wiki/Rendering-Basics.md | 132 ++++++++++++++++++++ 7 files changed, 931 insertions(+) create mode 100644 docs/wiki/Basics.md create mode 100644 docs/wiki/Input-Advanced.md create mode 100644 docs/wiki/Input-Basics.md create mode 100644 docs/wiki/Miscellaneous.md create mode 100644 docs/wiki/README.md create mode 100644 docs/wiki/Rendering-Advanced.md create mode 100644 docs/wiki/Rendering-Basics.md diff --git a/docs/wiki/Basics.md b/docs/wiki/Basics.md new file mode 100644 index 0000000..a817ea8 --- /dev/null +++ b/docs/wiki/Basics.md @@ -0,0 +1,85 @@ +## Contents + +1. [How library is organized](#how-library-is-organized) +2. [Entry point](#entry-point) +3. [Lifecycle](#lifecycle) + +## How library is organized + +AtumVR is split into three Gradle modules: + +- **api** — Public-facing interfaces and structure. Also includes utility/helper classes used across the project. +- **core** — The actual VR implementation and runtime logic. +- **example** — A reference project to help you get started. You **should not** depend on this module. 
+
+## Entry point
+
+**VRProvider** is the entry point your application uses to manage VR.
+
+Use one of these abstract classes from the **core** module:
+
+- `XRProvider` - OpenXR
+- `OpenVRProvider` - OpenVR (planned in future releases)
+
+You will need to implement methods that create handlers for rendering, input and state.
+
+For OpenXR, use these abstract classes for the handlers:
+
+- `XRRenderer`
+- `XRInputHandler`
+- `XRState`
+
+Feel free to override any of their logic if needed.
+
+More details and integration examples are covered in the next pages of the **Start** section.
+
+## Lifecycle
+
+### VRState
+
+The VR provider has an associated VRState object:
+
+- `XRState` - OpenXR
+- `OpenVRState` - OpenVR (planned in future releases)
+
+`VRState` represents the current VR session status (_initialized_, _active_, _focused_).
+You'll use it often, especially in applications that support both VR and non-VR modes.
+
+### Setting up lifecycle
+
+1. **Initialize the provider**
+   Your `VRProvider` must be initialized before doing anything VR-related.
+
+2. **Sync state at the start of the app loop**
+   While VRState reports "_initialized_" as true, call `syncState()` at the **very beginning** of your main loop.
+
+3. **While VRState reports "_active_" as true, call these methods in the following order:**
+
+   1. `syncState()` - at the beginning of the main loop
+   2. `startFrame()` - at the beginning of the main loop
+   3. `render()` - during the rendering stage of your main loop (wherever it is convenient for your case)
+   4. `postRender()` - optional, up to you
+
+### Notes about `startFrame()`
+
+`startFrame()` will **pause your thread** until the next VR frame is available from the VR runtime.
+In other words, it effectively controls the refresh rate. If your application already has its own frame timing / limiter, you should **disable it while in VR mode** to avoid fighting the VR runtime.
+
+### Focus handling
+
+In addition to `active`, there is a separate flag: `focused`.
+
+When `true`, the user is interacting with your app in VR.
+
+If `false`, it may mean:
+
+- VR is not active
+- The user is interacting with the VR runtime menu or something else unrelated to your app in VR.
+
+It is **highly recommended** to limit rendering while `focused` is `false` (for example, don't render VR hands/controllers while unfocused).
+
+### Destroying the VR session
+
+It is pretty simple: call `destroy()` on the VRProvider **after `postRender()` or before `startFrame()`**.
+
+You are free to initialize the VRProvider again using the same instance.
diff --git a/docs/wiki/Input-Advanced.md b/docs/wiki/Input-Advanced.md
new file mode 100644
index 0000000..18d4ca8
--- /dev/null
+++ b/docs/wiki/Input-Advanced.md
@@ -0,0 +1,213 @@
+## Contents
+
+1. [Custom Action Sets](#custom-action-sets)
+2. [Profile Sets](#profile-sets)
+3. [Creating Custom Actions](#creating-custom-actions)
+4. [Action Listeners](#action-listeners)
+
+## Custom Action Sets
+
+If the built-in `ProfileSetHolder` doesn't fit your needs, you can create custom action sets.
+ +### Creating an action set + +Extend `XRActionSet`: + +```java +public class CustomActionSet extends XRActionSet { + private FloatButtonMultiAction triggerAction; + private Vec2MultiAction joystickAction; + + public CustomActionSet(XRProvider provider) { + super(provider, + "custom_actions", // internal name + "Custom Actions", // display name + 0 // priority + ); + } + + @Override + protected List loadActions(XRProvider provider) { + triggerAction = new FloatButtonMultiAction( + provider, this, + new ActionIdentifier("custom", "trigger"), + "Trigger" + ); + + joystickAction = new Vec2MultiAction( + provider, this, + new ActionIdentifier("custom", "joystick"), + "Joystick" + ); + + return List.of(triggerAction, joystickAction); + } + + public FloatButtonMultiAction getTriggerAction() { + return triggerAction; + } + + public Vec2MultiAction getJoystickAction() { + return joystickAction; + } +} +``` + +### Action types + +AtumVR provides these action types: + +- **FloatButtonMultiAction** - analog triggers (0.0 to 1.0 with button threshold) +- **BoolButtonMultiAction** - simple boolean buttons +- **Vec2MultiAction** - 2D inputs like joysticks +- **PoseMultiAction** - position/rotation data +- **HapticPulseAction** - output action for vibration + +The "Multi" variants support both left and right hand subactions. + +## Profile Sets + +Different VR controllers have different button layouts. Profile sets handle these differences. 
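The idea behind these profile-specific defaults, resolving an interaction profile to controller-specific binding paths, can be sketched in plain Java. This is an illustrative sketch only: the `Profile` enum and `primaryButtonPath` method below are hypothetical and not part of the AtumVR API, though the path strings follow the standard OpenXR binding convention used elsewhere on this page.

```java
// Illustrative sketch (hypothetical names, not AtumVR API):
// resolving an interaction profile + hand to an OpenXR binding path.
public class ProfileBindingSketch {

    public enum Profile { OCULUS_TOUCH, INDEX_CONTROLLER }

    // Returns the binding path of the "primary" face button for the
    // given profile and hand ("left" or "right").
    public static String primaryButtonPath(Profile profile, String hand) {
        return switch (profile) {
            // Touch controllers: X on the left hand, A on the right hand
            case OCULUS_TOUCH -> "/user/hand/" + hand + "/input/"
                    + ("left".equals(hand) ? "x" : "a") + "/click";
            // Index controllers expose an A button on both hands
            case INDEX_CONTROLLER -> "/user/hand/" + hand + "/input/a/click";
        };
    }

    public static void main(String[] args) {
        System.out.println(primaryButtonPath(Profile.OCULUS_TOUCH, "left"));
        // prints: /user/hand/left/input/x/click
    }
}
```

At runtime, a profile set plays a similar role: it groups the actions and default bindings that match one family of controllers.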
+ +### Using ProfileSetHolder + +`ProfileSetHolder` manages action sets for different controller profiles: + +```java +ProfileSetHolder profileSetHolder = new ProfileSetHolder(vrProvider); + +// Get the currently active profile set (based on connected controllers) +XRProfileSet activeProfile = profileSetHolder.getActiveProfileSet(); + +// Get specific profile set +XRProfileSet oculusProfile = profileSetHolder.getProfileSet( + XRInteractionProfile.OCULUS_TOUCH +); +``` + +### Supported profiles + +AtumVR includes built-in support for: + +- Oculus Touch controllers +- Valve Index controllers +- HTC Vive controllers +- HP Mixed Reality controllers +- And more... + +The `ProfileSetHolder` automatically detects which controllers are connected and uses the appropriate profile. + +### Shared actions + +Some actions are common across all controllers (like hand poses). These are in the `SharedActionSet`: + +```java +SharedActionSet shared = profileSetHolder.getSharedSet(); +PoseMultiAction aimPose = shared.getHandPoseAim(); +PoseMultiAction gripPose = shared.getHandPoseGrip(); +HapticPulseAction haptic = shared.getHapticPulse(); +``` + +## Creating Custom Actions + +### Single-hand action + +For actions that only apply to one hand: + +```java +public class CustomSingleAction extends XRSingleAction { + public CustomSingleAction(XRProvider provider, XRActionSet actionSet) { + super(provider, actionSet, + new ActionIdentifier("custom", "action"), + "Custom Action", + XRInputActionType.BOOLEAN + ); + } + + @Override + public String getDefaultBindings(XRInteractionProfile profile) { + return switch (profile) { + case OCULUS_TOUCH -> "/user/hand/left/input/x/click"; + case INDEX_CONTROLLER -> "/user/hand/left/input/a/click"; + default -> null; + }; + } +} +``` + +### Multi-hand action + +For actions that apply to both hands: + +```java +public class CustomMultiAction extends XRMultiAction { + public CustomMultiAction(XRProvider provider, XRActionSet actionSet) { + super(provider, 
actionSet, + new ActionIdentifier("custom", "grab"), + "Grab", + XRInputActionType.BOOLEAN + ); + } + + @Override + protected SubAction createSubAction(ControllerType type) { + return new BoolButtonSubAction(type) { + @Override + public String getDefaultBindings(XRInteractionProfile profile) { + String hand = type == ControllerType.LEFT ? "left" : "right"; + return switch (profile) { + case OCULUS_TOUCH -> "/user/hand/" + hand + "/input/squeeze/value"; + case INDEX_CONTROLLER -> "/user/hand/" + hand + "/input/squeeze/force"; + default -> null; + }; + } + }; + } +} +``` + +## Action Listeners + +You can register a listener to be notified when any action changes: + +```java +inputHandler.setActionListener(actionName -> { + System.out.println("Action triggered: " + actionName); +}); +``` + +### Custom update logic + +Override the `update()` method in your input handler for custom logic: + +```java +@Override +public void update() { + super.update(); // Important: call super first + + var profileSet = profileSetHolder.getActiveProfileSet(); + if (profileSet == null) return; + + // Check for specific input combinations + if (profileSet.getTriggerValue().getHandSubaction(ControllerType.LEFT).isPressed() && + profileSet.getTriggerValue().getHandSubaction(ControllerType.RIGHT).isPressed()) { + // Both triggers pressed + onBothTriggersPressed(); + } +} +``` + +### Action change detection + +To detect when a button state changes (not just when it's pressed): + +```java +VRActionDataButton button = profileSet.getButton("a_button"); +if (button.isButtonChanged()) { + if (button.isPressed()) { + // Button was just pressed + } else { + // Button was just released + } +} +``` diff --git a/docs/wiki/Input-Basics.md b/docs/wiki/Input-Basics.md new file mode 100644 index 0000000..3e3c619 --- /dev/null +++ b/docs/wiki/Input-Basics.md @@ -0,0 +1,148 @@ +## Contents + +1. [How input works](#how-input-works) +2. [Setting up handler](#setting-up-handler) +3. [Devices](#devices) +4. 
[Actions](#actions)
+
+## How input works
+
+AtumVR wires up most of the controller actions the VR runtime exposes, so you don't have to understand how the VR runtime handles input internally.
+
+Your part is to use the profile sets, the VRDevice objects and, of course, the input handler.
+
+The input system is based on OpenXR's action-based input model:
+- **Action Sets** - groups of related actions
+- **Actions** - individual inputs like button presses, triggers, joysticks
+- **Devices** - physical VR devices like the HMD and controllers
+
+## Setting up handler
+
+Create a class that extends `XRInputHandler`. You need to implement two methods:
+
+- `generateActionSets(MemoryStack stack)` - returns the list of action sets to use
+- `generateDevices(MemoryStack stack)` - returns the list of VR devices to track
+
+Example:
+```java
+public class ExampleVRInputHandler extends XRInputHandler {
+    private ProfileSetHolder profileSetHolder;
+
+    public ExampleVRInputHandler(XRProvider provider) {
+        super(provider);
+    }
+
+    @Override
+    protected List generateActionSets(MemoryStack stack) {
+        profileSetHolder = new ProfileSetHolder(getVrProvider());
+        return profileSetHolder.getAllSets();
+    }
+
+    @Override
+    protected List generateDevices(MemoryStack stack) {
+        return List.of(
+            new XRDeviceHMD(getVrProvider()),
+            new XRDeviceController(
+                getVrProvider(),
+                ControllerType.LEFT,
+                profileSetHolder.getSharedSet().getHandPoseAim(),
+                profileSetHolder.getSharedSet().getHandPoseGrip(),
+                profileSetHolder.getSharedSet().getHapticPulse()
+            ),
+            new XRDeviceController(
+                getVrProvider(),
+                ControllerType.RIGHT,
+                profileSetHolder.getSharedSet().getHandPoseAim(),
+                profileSetHolder.getSharedSet().getHandPoseGrip(),
+                profileSetHolder.getSharedSet().getHapticPulse()
+            )
+        );
+    }
+}
+```
+
+And in your VRProvider:
+```java
+@Override
+public @NotNull XRInputHandler createInputHandler() {
+    return new ExampleVRInputHandler(this);
+}
+```
+
+## Devices
+
+AtumVR provides these built-in device types:
+
+### VRDeviceHMD
+The head-mounted display. Access it using: +```java +VRDeviceHMD hmd = inputHandler.getDevice(VRDeviceHMD.ID, VRDeviceHMD.class); +``` + +Provides: +- `getPose()` - HMD position and orientation +- `isActive()` - whether HMD is being tracked + +### VRDeviceController +Left and right controllers. Access them using: +```java +VRDeviceController leftController = inputHandler.getDevice( + VRDeviceController.ID_LEFT, + VRDeviceController.class +); +VRDeviceController rightController = inputHandler.getDevice( + VRDeviceController.ID_RIGHT, + VRDeviceController.class +); +``` + +Each controller provides: +- `getPose()` - aim pose (pointing direction) +- `getGripPose()` - grip pose (where you hold the controller) +- `isActive()` - whether controller is being tracked +- `isGripActive()` - whether grip pose is valid +- `getType()` - `ControllerType.LEFT` or `ControllerType.RIGHT` +- `triggerHapticPulse(...)` - trigger vibration feedback + +### VRPose + +All devices provide a `VRPose` that contains: +- `position()` - Vector3fc with x, y, z coordinates +- `orientation()` - Quaternionfc rotation +- `matrix()` - 4x4 transformation matrix + +## Actions + +Actions represent user inputs. AtumVR uses `ProfileSetHolder` to manage actions for different controller types. 
+ +### Getting button states +```java +var profileSet = profileSetHolder.getActiveProfileSet(); +if(profileSet != null) { + // Check trigger press + if(profileSet.getTriggerValue().getHandSubaction(ControllerType.LEFT).isPressed()) { + // Left trigger is pressed + } +} +``` + +### Action data types + +- `VRActionDataButton` - for buttons and triggers + - `isActive()` - is the input available + - `isPressed()` - is button currently pressed + - `isButtonChanged()` - did state change this frame + +- `VRActionDataVec2` - for joysticks and touchpads + - `getX()`, `getY()` - axis values (-1 to 1) + +### Haptic feedback +```java +VRDeviceController controller = inputHandler.getDevice( + VRDeviceController.ID_RIGHT, + VRDeviceController.class +); +controller.triggerHapticPulse(0.1f); // 100ms pulse +// or with more control: +controller.triggerHapticPulse(160f, 1.0f, 0.1f); // frequency, amplitude, duration +``` diff --git a/docs/wiki/Miscellaneous.md b/docs/wiki/Miscellaneous.md new file mode 100644 index 0000000..f8c7009 --- /dev/null +++ b/docs/wiki/Miscellaneous.md @@ -0,0 +1,188 @@ +## Contents + +1. [Logging](#logging) +2. [Error Handling](#error-handling) +3. [State Change Callbacks](#state-change-callbacks) +4. [OpenXR Extensions](#openxr-extensions) +5. [Tips and Best Practices](#tips-and-best-practices) + +## Logging + +AtumVR uses `VRLogger` interface for logging. You need to provide an implementation when creating your VRProvider. 
+ +### Creating a logger + +```java +VRLogger logger = new VRLogger() { + @Override + public void logInfo(String message) { + System.out.println("[AtumVR INFO] " + message); + } + + @Override + public void logWarning(String message) { + System.out.println("[AtumVR WARN] " + message); + } + + @Override + public void logError(String message) { + System.err.println("[AtumVR ERROR] " + message); + } +}; + +ExampleVRProvider provider = new ExampleVRProvider(logger); +``` + +### Using the logger + +Access the logger anywhere via the provider: +```java +vrProvider.getLogger().logInfo("VR initialized successfully"); +``` + +## Error Handling + +### VRException + +AtumVR throws `VRException` for VR-related errors: + +```java +try { + vrProvider.initializeVR(); +} catch (VRException e) { + logger.logError("Failed to initialize VR: " + e.getMessage()); + // Handle gracefully - perhaps fall back to non-VR mode +} +``` + +### OpenXR error checking + +For direct OpenXR calls, use the built-in error checker: + +```java +vrProvider.checkXRError(xrResult, "operationName", "additional context"); +``` + +This will throw `VRException` if the result indicates an error. + +### Getting error descriptions + +```java +String description = vrProvider.getXRActionResult(xrResultCode); +``` + +## State Change Callbacks + +Override `onStateChanged()` in your VRProvider to respond to VR runtime state changes: + +```java +@Override +public void onStateChanged(XRSessionStateChange state) { + switch (state) { + case READY: + logger.logInfo("VR session ready"); + break; + case FOCUSED: + logger.logInfo("User is focused on VR app"); + onVRFocused(); + break; + case VISIBLE: + logger.logInfo("VR app is visible but not focused"); + break; + case STOPPING: + logger.logInfo("VR session stopping"); + onVRStopping(); + break; + case EXITING: + logger.logInfo("VR session exiting"); + break; + } +} +``` + +### Common state transitions + +1. 
**Initialization**: `IDLE` → `READY` → `SYNCHRONIZED` → `VISIBLE` → `FOCUSED` +2. **Dashboard opened**: `FOCUSED` → `VISIBLE` +3. **App backgrounded**: `VISIBLE` → `SYNCHRONIZED` +4. **Shutdown**: `FOCUSED/VISIBLE` → `STOPPING` → `EXITING` + +## OpenXR Extensions + +### Default extensions + +AtumVR enables these extensions by default: +- `XR_EXT_HP_MIXED_REALITY_CONTROLLER` - HP controller support +- `XR_HTC_VIVE_COSMOS_CONTROLLER` - Vive Cosmos controller support +- `XR_BD_CONTROLLER_INTERACTION` - Pico controller support + +### Adding custom extensions + +Override `getXRAppExtensions()` in your VRProvider: + +```java +@Override +public List getXRAppExtensions() { + List extensions = new ArrayList<>(super.getXRAppExtensions()); + extensions.add("XR_FB_hand_tracking"); + extensions.add("XR_EXT_eye_gaze_interaction"); + return extensions; +} +``` + +**Note**: Extensions must be supported by the VR runtime. Check availability before using extension features. + +## Tips and Best Practices + +### Performance + +1. **Keep frame times low** - VR requires consistent high framerates (72-144 FPS) +2. **Use hidden area mesh** - Skip rendering to areas the user can't see +3. **Limit draw calls** - Batch geometry where possible +4. **Avoid allocations in render loop** - Pre-allocate matrices and vectors + +### Comfort + +1. **Never take camera control away** - Always render from the HMD pose +2. **Avoid rapid movement** - Use teleportation or smooth locomotion with vignette +3. **Maintain consistent frame rate** - Stutters cause discomfort + +### Code organization + +```java +// Good: Check VR state before operations +if (vrProvider.getState().isActive()) { + vrProvider.startFrame(); + vrProvider.render(context); +} + +// Good: Proper cleanup +@Override +public void destroy() { + vrProvider.destroy(); // Releases all VR resources +} +``` + +### Debugging + +1. Check the VR runtime's own debug/log tools (SteamVR console, Oculus Debug Tool) +2. 
Use `VRLogger` extensively during development +3. Test on multiple headsets if possible - behavior can vary + +### Resource management + +```java +// Always destroy VR resources when done +@Override +public void onClose() { + if (vrProvider.getState().isInitialized()) { + vrProvider.destroy(); + } +} +``` + +### Thread safety + +- VR methods should be called from the main/render thread +- Don't call `startFrame()` or `render()` from background threads +- Input updates happen in `startFrame()` - read input after that call diff --git a/docs/wiki/README.md b/docs/wiki/README.md new file mode 100644 index 0000000..a096fb3 --- /dev/null +++ b/docs/wiki/README.md @@ -0,0 +1,32 @@ +# AtumVR Wiki Content + +This folder contains markdown files with content for the AtumVR GitHub Wiki pages. + +## How to use + +Copy the content from each `.md` file to the corresponding wiki page on GitHub: + +| File | Wiki Page | +|------|-----------| +| `Basics.md` | https://github.com/Phoenix-Ra/AtumVR/wiki/Basics | +| `Rendering-Basics.md` | https://github.com/Phoenix-Ra/AtumVR/wiki/Rendering-Basics | +| `Input-Basics.md` | https://github.com/Phoenix-Ra/AtumVR/wiki/Input-Basics | +| `Rendering-Advanced.md` | https://github.com/Phoenix-Ra/AtumVR/wiki/Rendering-Advanced | +| `Input-Advanced.md` | https://github.com/Phoenix-Ra/AtumVR/wiki/Input-Advanced | +| `Miscellaneous.md` | https://github.com/Phoenix-Ra/AtumVR/wiki/Miscellaneous | + +## Structure + +The wiki is organized as follows: + +### Start Section +- **Basics** - Library organization, entry point, and lifecycle +- **Rendering-Basics** - Setting up renderer and scenes +- **Input-Basics** - Input handler, devices, and actions + +### Advanced Section +- **Rendering-Advanced** - Hidden area mesh, custom textures, multi-pass rendering +- **Input-Advanced** - Custom action sets, profile sets, action listeners + +### Other +- **Miscellaneous** - Logging, error handling, tips and best practices diff --git a/docs/wiki/Rendering-Advanced.md 
b/docs/wiki/Rendering-Advanced.md new file mode 100644 index 0000000..dd33065 --- /dev/null +++ b/docs/wiki/Rendering-Advanced.md @@ -0,0 +1,133 @@ +## Contents + +1. [Hidden Area Mesh](#hidden-area-mesh) +2. [Custom Textures](#custom-textures) +3. [Resolution and Swapchain](#resolution-and-swapchain) +4. [Multi-pass Rendering](#multi-pass-rendering) + +## Hidden Area Mesh + +VR headsets have areas at the edges of each lens that the user cannot see. Rendering to these areas wastes GPU resources. AtumVR provides access to the hidden area mesh that you can use for stencil-based optimization. + +### Getting hidden area data + +```java +float[] leftHiddenArea = vrRenderer.getHiddenAreaVertices(EyeType.LEFT); +float[] rightHiddenArea = vrRenderer.getHiddenAreaVertices(EyeType.RIGHT); +``` + +The returned array contains triangle vertices in the format `[x1, y1, x2, y2, x3, y3, ...]` in pixel coordinates. These triangles represent the areas that should NOT be rendered. + +### Using hidden area for optimization + +You can use this data to create a stencil buffer that prevents rendering to hidden areas: + +1. At the start of each eye render, draw the hidden area triangles to the stencil buffer +2. Set stencil test to fail where hidden area was drawn +3. Render your scene normally - pixels in hidden area will be skipped + +This optimization can significantly improve performance, especially on complex scenes. + +## Custom Textures + +While `XRTexture` works for most cases, you can create custom texture classes for more control. 
+ +### Extending XRTexture + +```java +public class CustomVRTexture extends XRTexture { + private int depthBuffer; + + public CustomVRTexture(int width, int height, int textureId, int index) { + super(width, height, textureId, index); + } + + @Override + public XRTexture init() { + super.init(); + + // Add custom depth buffer + depthBuffer = GL30.glGenRenderbuffers(); + GL30.glBindRenderbuffer(GL30.GL_RENDERBUFFER, depthBuffer); + GL30.glRenderbufferStorage(GL30.GL_RENDERBUFFER, + GL30.GL_DEPTH24_STENCIL8, width, height); + GL30.glFramebufferRenderbuffer(GL30.GL_FRAMEBUFFER, + GL30.GL_DEPTH_STENCIL_ATTACHMENT, GL30.GL_RENDERBUFFER, depthBuffer); + + return this; + } +} +``` + +Then in your renderer: +```java +@Override +protected XRTexture createTexture(int width, int height, int textureId, int index) { + return new CustomVRTexture(width, height, textureId, index); +} +``` + +## Resolution and Swapchain + +### Getting render resolution + +```java +int width = vrRenderer.getResolutionWidth(); +int height = vrRenderer.getResolutionHeight(); +``` + +The resolution is determined by the VR runtime based on the headset capabilities. AtumVR automatically uses the recommended resolution. + +### Swapchain formats + +AtumVR requests swapchain images in this order of preference: +1. `GL_SRGB8_ALPHA8` - sRGB with alpha (best quality) +2. `GL_SRGB8` - sRGB without alpha +3. `GL_RGB10_A2` - 10-bit RGB +4. `GL_RGBA16F` - 16-bit float RGBA +5. `GL_RGBA8` - fallback + +You can customize this by overriding `getSwapChainFormats()` in your `XRProvider`: +```java +@Override +public List getSwapChainFormats() { + return List.of( + GL21.GL_SRGB8_ALPHA8, + GL11.GL_RGBA8 + ); +} +``` + +## Multi-pass Rendering + +For complex rendering pipelines, you may need multiple passes per eye. 
+
+### Rendering to intermediate buffers
+
+Instead of rendering directly to the swapchain framebuffer, render to your own framebuffer first:
+
+```java
+@Override
+public void updateEyeTexture(@NotNull EyeType eyeType) {
+    // Pass 1: Render scene to intermediate buffer
+    GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, myIntermediateFramebuffer);
+    renderScene(eyeType);
+
+    // Pass 2: Post-processing
+    GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, myPostProcessFramebuffer);
+    applyPostProcessing();
+
+    // Final pass: Copy to VR framebuffer
+    int vrFramebuffer = (eyeType == EyeType.LEFT)
+        ? getVrRenderer().getTextureLeftEye().getFrameBufferId()
+        : getVrRenderer().getTextureRightEye().getFrameBufferId();
+    GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, vrFramebuffer);
+    drawFullscreenQuad(myPostProcessTexture);
+}
+```
+
+### Important notes
+
+- Always end each eye's rendering with the final result in the VR framebuffer
+- The VR runtime expects the swapchain image to contain the final rendered frame
+- Be mindful of performance - VR requires high framerates (typically 72-120 FPS)
diff --git a/docs/wiki/Rendering-Basics.md b/docs/wiki/Rendering-Basics.md
new file mode 100644
index 0000000..16f57b3
--- /dev/null
+++ b/docs/wiki/Rendering-Basics.md
@@ -0,0 +1,132 @@
+## Contents
+
+1. [Setting up handler](#setting-up-handler)
+2. [Scene](#scene)
+3. [Eye Cameras](#eye-cameras)
+4. [Textures](#textures)
+
+## Setting up handler
+
+Let's make a renderer for your VRProvider.
+
+It's actually pretty simple.
+
+Create a class `ExampleVRRenderer` (name it however you want) and make it extend `XRRenderer`. Then create its instance inside the `createRenderer()` method of your VRProvider.
+
+`XRRenderer` already does most of the work; it only needs a Scene object from you that will actually render your VR scene. You also need to set up an OpenGL context. You can use the built-in `setupGLContext()` method to create an OpenGL context for you.
It has to be called inside the `createRenderer()` method of your VRProvider. Otherwise, you will have to override `getWindowHandle()` and provide your app's window handle.
+
+Example:
+```java
+public class ExampleVRRenderer extends XRRenderer {
+    private VRScene vrScene;
+
+    public ExampleVRRenderer(XRProvider provider) {
+        super(provider);
+        vrScene = new ExampleScene(this);
+    }
+
+    @Override
+    protected XRTexture createTexture(int width, int height, int textureId, int index) {
+        return new XRTexture(width, height, textureId, index);
+    }
+
+    @Override
+    public void onInit() {
+        vrScene.init();
+    }
+
+    @Override
+    public VRScene getCurrentScene() {
+        return vrScene;
+    }
+}
+```
+
+And in your VRProvider:
+```java
+@Override
+public @NotNull XRRenderer createRenderer() {
+    ExampleVRRenderer vrRenderer = new ExampleVRRenderer(this);
+    vrRenderer.setupGLContext();
+    return vrRenderer;
+}
+```
+
+## Scene
+
+Now let's make a VR scene.
+
+Our goal is to draw the scene onto the framebuffer texture of each eye (left and right).
+
+`XRScene` is the abstract class we are going to extend.
+
+It has some built-in logic, but I highly suggest overriding it to have full control over this part. Otherwise, you can just use the built-in logic by implementing:
+- `onInit()` - to initialize shaders, for example
+- `updateEyeTexture(EyeType)` - to render your scene for each eye
+
+Example:
+```java
+public class ExampleScene extends XRScene {
+    private VRShaderProgram shaderProgram;
+
+    public ExampleScene(XRRenderer vrRenderer) {
+        super(vrRenderer);
+    }
+
+    @Override
+    public void onInit() {
+        // Initialize your shaders
+        shaderProgram = new VRShaderProgram(getVrProvider());
+        shaderProgram.bindVertexShader("vertex.vsh");
+        shaderProgram.bindFragmentShader("fragment.fsh");
+        shaderProgram.finishShader();
+    }
+
+    @Override
+    public void updateEyeTexture(@NotNull EyeType eyeType) {
+        shaderProgram.useShader();
+
+        // Get the correct camera for this eye
+        Matrix4f projection = eyeType == EyeType.LEFT ?
+ getLeftEyeCamera().getProjectionMatrix() : + getRightEyeCamera().getProjectionMatrix(); + Matrix4f view = eyeType == EyeType.LEFT ? + getLeftEyeCamera().getViewMatrix() : + getRightEyeCamera().getViewMatrix(); + + // Render your objects here + // ... + + GL30.glUseProgram(0); + } + + @Override + public void destroy() { + // Release all resources attached to scene + } +} +``` + +## Eye Cameras + +`XRScene` provides built-in eye cameras (`leftEyeCamera` and `rightEyeCamera`) that are automatically updated each frame. + +Each camera provides: +- `getViewMatrix()` - view transformation matrix based on HMD position +- `getProjectionMatrix()` - projection matrix based on HMD field of view + +The cameras are updated in `setupMvp()` which is called at the start of each render call. You can override this method if you need custom near/far clip distances (default is 0.02f to 100f). + +## Textures + +AtumVR uses `XRTexture` for VR framebuffers. The renderer creates textures for both eyes automatically using swapchain images from the VR runtime. + +You can access the current eye textures via: +- `vrRenderer.getTextureLeftEye()` +- `vrRenderer.getTextureRightEye()` + +Each texture has: +- `getFrameBufferId()` - OpenGL framebuffer ID to bind for rendering +- `getTextureId()` - OpenGL texture ID + +The built-in `XRScene.render()` method already handles binding the correct framebuffer for each eye.
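The Eye Cameras section above notes that each eye camera exposes a projection matrix built from the HMD's field of view, with default clip planes of 0.02f to 100f. For reference, here is the standard asymmetric-FOV (OpenXR-style) projection written out in plain Java. This is a standalone math sketch with hypothetical names, not AtumVR's actual implementation:

```java
// Standalone math sketch: builds a column-major 4x4 projection matrix
// from the four FOV half-angles a VR runtime reports per eye.
// Mirrors the standard OpenXR reference math; hypothetical helper,
// not part of the AtumVR API.
public class EyeProjectionSketch {

    // Angles are in radians (left/down are typically negative);
    // returns the matrix as float[16] in column-major order.
    public static float[] projection(float angleLeft, float angleRight,
                                     float angleDown, float angleUp,
                                     float near, float far) {
        float tl = (float) Math.tan(angleLeft);
        float tr = (float) Math.tan(angleRight);
        float td = (float) Math.tan(angleDown);
        float tu = (float) Math.tan(angleUp);

        float[] m = new float[16];
        m[0]  = 2f / (tr - tl);                 // x scale
        m[5]  = 2f / (tu - td);                 // y scale
        m[8]  = (tr + tl) / (tr - tl);          // x offset (asymmetric FOV)
        m[9]  = (tu + td) / (tu - td);          // y offset
        m[10] = -(far + near) / (far - near);   // depth remap
        m[11] = -1f;                            // perspective divide term
        m[14] = -(2f * far * near) / (far - near);
        return m;
    }

    public static void main(String[] args) {
        // Symmetric 90-degree FOV with the default clip planes above
        float a = (float) Math.toRadians(45);
        float[] m = projection(-a, a, -a, a, 0.02f, 100f);
        System.out.println(m[0] + " " + m[5]); // both approximately 1.0
    }
}
```

With an asymmetric FOV (for example a larger outward angle per eye, as real headsets report), the offset terms `m[8]` and `m[9]` become non-zero, which is why a VR projection cannot be built from a single vertical FOV value.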