Kanda Players 0.7.0
XR Runtime

For XR to work correctly, a few project settings must be configured, and some quirks of Unity DOTS need to be taken into account. This guide walks through the essential setup for XR players.

1: Set up XR plugins

Supported XR runtimes need to be enabled in Project Settings > XR Plug-in Management. Enabling a provider imports the corresponding platform-specific XR packages into the project.

  • For all targets, set Initialize XR on Startup to false (the runtime is started explicitly at runtime instead; see the sketch after this list).
  • Enable required plug-in providers:
    • For Standalone, enable OpenXR.
    • For Android, enable OpenXR.
    • For visionOS, enable Apple visionOS.
  • Configure plug-in settings:
    • For OpenXR, use:
      • Render Mode: Single Pass Instanced
      • Depth Submission Mode: Depth 16 bit
      • Enabled Interaction Profiles:
        • Oculus Touch Controller Profile
        • Meta Quest Touch Pro Controller Profile
      • Features (Android): Meta Quest Support
        • If you get the following warning after enabling this setting, apply the suggested solution: [Meta Quest Support] Using the Screen Space Ambient Occlusion render feature results in significant performance overhead when the application is running natively on device. Disabling or removing that render feature is recommended.
    • For visionOS, use:
      • App Mode: Virtual Reality - Fully Immersive Space
      • Upper Limb Visibility: False
      • Foveated Rendering: True
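
Because Initialize XR on Startup is disabled, the XR loader has to be initialized and started explicitly at runtime. The package handles this for you through the VrRuntimeClientSystem (see step 3 below), but for reference, a minimal sketch of a manual start and stop using Unity's XR Plug-in Management API looks like this (ManualXrStart is just an illustrative name):

    using System.Collections;
    using UnityEngine;
    using UnityEngine.XR.Management;

    // Illustrative sketch only: starts the XR loader manually because
    // Initialize XR on Startup is disabled in XR Plug-in Management.
    public class ManualXrStart : MonoBehaviour
    {
        IEnumerator Start()
        {
            var manager = XRGeneralSettings.Instance.Manager;
            yield return manager.InitializeLoaderAsync();
            if (manager.activeLoader != null)
                manager.StartSubsystems();
        }

        void OnDestroy()
        {
            var manager = XRGeneralSettings.Instance.Manager;
            if (manager.isInitializationComplete)
            {
                manager.StopSubsystems();
                manager.DeinitializeLoader();
            }
        }
    }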

2: Set up main camera

At the time of writing, there is a restriction that the Unity main camera cannot be in an entity sub-scene. For this reason, we need to make sure the camera is placed in the main GameObjects scene.

You can use the prefab provided in Runtime/Prefabs/MainCamera.

Note that the template scenes come with a camera already, so you might not need to add one yourself.
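
If you do add a camera yourself, it only needs to follow the player's head pose, which lives on an entity. Purely as an illustration of the restriction above, a MonoBehaviour on the main-scene camera could copy the head entity's transform each frame. Everything below is a hypothetical sketch: HeadTag and FollowHeadEntity are not part of the package, and the provided MainCamera prefab presumably takes care of this already.

    using Unity.Entities;
    using Unity.Transforms;
    using UnityEngine;

    // Hypothetical marker: substitute whatever component identifies the
    // local player's head entity in your project.
    public struct HeadTag : IComponentData { }

    // Lives on the main camera in the GameObject scene and copies the head
    // entity's world transform every frame.
    public class FollowHeadEntity : MonoBehaviour
    {
        EntityQuery _headQuery;

        void Start()
        {
            var em = World.DefaultGameObjectInjectionWorld.EntityManager;
            _headQuery = em.CreateEntityQuery(typeof(HeadTag), typeof(LocalToWorld));
        }

        void LateUpdate()
        {
            if (_headQuery.CalculateEntityCount() != 1)
                return;

            var headTransform = _headQuery.GetSingleton<LocalToWorld>();
            transform.SetPositionAndRotation(headTransform.Position, headTransform.Rotation);
        }
    }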

3: Set up the local player entity

To request initialization of the VR runtime, make sure the player entity has the RequestVrRuntimeStarted component attached. This component is discovered by the VrRuntimeClientSystem, which starts the VR runtime and then removes the component.

You can use the prefab provided in Runtime/Prefabs/Players/Local/LocalPlayerVrHead.
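
The prefab already has the component baked in. If you instead want to request the runtime from script, for example on an already-spawned player entity, the request boils down to adding the component. The sketch below assumes RequestVrRuntimeStarted is an unmanaged tag component and that you already hold a reference to the local player entity; StartVrRuntime and playerEntity are illustrative names only.

    using Unity.Entities;
    using UnityEngine;

    // Illustrative sketch: request the VR runtime from script instead of
    // baking RequestVrRuntimeStarted onto the player prefab.
    public class StartVrRuntime : MonoBehaviour
    {
        public void Request(Entity playerEntity)
        {
            var em = World.DefaultGameObjectInjectionWorld.EntityManager;

            // VrRuntimeClientSystem discovers this component, starts the
            // VR runtime, and removes the component again.
            if (!em.HasComponent<RequestVrRuntimeStarted>(playerEntity))
                em.AddComponent<RequestVrRuntimeStarted>(playerEntity);
        }
    }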

4: OpenXR Input Collection

The OpenXrInputCollector collects input data from OpenXR devices. It is responsible for gathering head-mounted display (HMD) position and rotation data, which is needed for VR player movement and orientation.

To use OpenXR input:

  1. Ensure your player prefab includes the OpenXrInputAuthoring component.
  2. The OpenXrInputSystem will automatically populate OpenXrInputData components with the latest input data from OpenXR devices.
  3. Use the OpenXrHeadMovement component and OpenXrHeadMovementSystem to apply the collected input data to your VR player's head entity.
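
To give a feel for how the collected data flows through DOTS, here is a rough sketch of a system that applies an HMD pose to a head entity's transform. This is conceptually what OpenXrHeadMovementSystem already does for you; the OpenXrInputData field names used below (HeadPosition, HeadRotation) are assumptions, so check the package source for the real layout.

    using Unity.Entities;
    using Unity.Transforms;

    // Rough sketch of consuming collected OpenXR input in a DOTS system.
    // System ordering relative to OpenXrInputSystem is omitted here.
    public partial struct ApplyHeadPoseSystem : ISystem
    {
        public void OnUpdate(ref SystemState state)
        {
            foreach (var (input, transform) in
                     SystemAPI.Query<RefRO<OpenXrInputData>, RefRW<LocalTransform>>())
            {
                // HeadPosition and HeadRotation are assumed field names.
                transform.ValueRW.Position = input.ValueRO.HeadPosition;
                transform.ValueRW.Rotation = input.ValueRO.HeadRotation;
            }
        }
    }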

For more detailed information on input collection and usage, refer to the PlayerInput page.

Mocking XR in the Editor

To facilitate development and testing of VR functionality without always requiring an XR headset, this package provides options to mock XR input in the Unity editor.

Imitating an XR Device

To imitate an OpenXR device in the editor:

  1. Set PlatformSettings.MockDevice to true.
  2. Set PlatformSettings.DeviceToMock to AndroidOpenXr.

This configuration will:

  • Join you as an XR player in the session.
  • Start the XR runtime in the editor.
  • Allow you to use Meta Quest Link on compatible PCs.

Mocking XR Without a Headset

Since macOS does not support OpenXR, a different approach is used to mock XR input:

  1. Set PlatformSettings.MockDevice to true.
  2. Set PlatformSettings.DeviceToMock to AndroidOpenXr.
  3. Set KandaPlayerSettings.MockXr to true.

This configuration will:

  • Join you as an XR player in the session.
  • Replace the OpenXR input collector with a virtualized version.
  • Use keyboard and mouse input to populate OpenXR inputs.

With this virtual input collector:

  • Use WASD keys to move the virtual HMD position.
  • Use the mouse to control the virtual HMD rotation.
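
As a rough illustration (not the package's actual implementation), a virtual collector along these lines can derive an HMD pose from keyboard and mouse; MockHmdPose and its key and speed choices are assumptions.

    using UnityEngine;

    // Illustrative stand-in for a virtualized HMD input source driven by
    // keyboard and mouse in the editor.
    public class MockHmdPose : MonoBehaviour
    {
        public float moveSpeed = 2f;   // metres per second
        public float lookSpeed = 120f; // degrees per second

        public Vector3 Position { get; private set; }
        public Quaternion Rotation { get; private set; } = Quaternion.identity;

        float _yaw, _pitch;

        void Update()
        {
            // Mouse controls the virtual HMD rotation.
            _yaw += Input.GetAxis("Mouse X") * lookSpeed * Time.deltaTime;
            _pitch -= Input.GetAxis("Mouse Y") * lookSpeed * Time.deltaTime;
            _pitch = Mathf.Clamp(_pitch, -89f, 89f);
            Rotation = Quaternion.Euler(_pitch, _yaw, 0f);

            // WASD (the Horizontal/Vertical axes) moves the virtual HMD position
            // in the current look direction.
            var move = new Vector3(Input.GetAxis("Horizontal"), 0f, Input.GetAxis("Vertical"));
            Position += Rotation * move * moveSpeed * Time.deltaTime;
        }
    }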

These mocking options enable developers to test XR functionality and interactions without needing a physical VR device connected, making it easier to develop and debug XR features across different development environments.