Upload YomovSDK

This commit is contained in:
Sora丶kong
2026-03-03 03:15:46 +08:00
parent 9096da7e6c
commit eb97f31065
6477 changed files with 1932208 additions and 3 deletions


@@ -0,0 +1,21 @@
* [About Unity OpenXR](index.md)
* [Project Configuration](project-configuration.md)
* [OpenXR Features](features.md)
* [Meta Quest Support](features/metaquest.md)
* [Foveated Rendering](features/foveatedrendering.md)
* [XR Performance Settings](features/performance-settings.md)
* [Multiview Render Regions](features/multiviewrenderregions.md)
* [OpenXR Input](input.md)
* [Microsoft Mixed Reality Motion Controller Profile](features/microsoftmotioncontrollerprofile.md)
* [Oculus Touch Controller Profile](features/oculustouchcontrollerprofile.md)
* [HTC Vive Controller Profile](features/htcvivecontrollerprofile.md)
* [Valve Index Controller Profile](features/valveindexcontrollerprofile.md)
* [Khronos Simple Controller Profile](features/khrsimplecontrollerprofile.md)
* [Eye Gaze Interaction](features/eyegazeinteraction.md)
* [Microsoft Hand Interaction](features/microsofthandinteraction.md)
* [Meta Quest Touch Pro Controller Profile](features/metaquesttouchprocontrollerprofile.md)
* [HP Reverb G2 Controller Profile](features/hpreverbg2controllerprofile.md)
* [Hand Interaction Profile](features/handinteractionprofile.md)
* [DPad Interaction](features/dpadinteraction.md)
* [Palm Pose Interaction](features/palmposeinteraction.md)
* [Hand Common Poses Interaction](features/handcommonposesinteraction.md)


@@ -0,0 +1,208 @@
---
uid: openxr-features
---
# OpenXR Features
OpenXR is an extensible API that can be extended with new features. To facilitate this within the Unity ecosystem, the Unity OpenXR provider offers a feature extension mechanism.
## Feature Management
![feature-ui](images/openxr-features.png)
You can manage features from the **Project Settings > XR Plug-in Management > OpenXR** window.
To enable or disable a feature, select or clear the checkbox next to it. Unity doesn't execute disabled features at runtime, and doesn't deploy any of the feature's native libraries with the Player build. To configure additional build-time properties specific to each feature, click the gear icon to the right of the feature.
All of the information in this window is populated via the `OpenXRFeatureAttribute` described below.
**Feature Groups** group a number of different **features** together to allow you to configure them simultaneously, which might offer a better development experience. For more information, see the [Defining a feature group](#defining-a-feature-group) section on this page.
## Defining a feature
Unity OpenXR features are defined and executed in C#. C# can call to a custom native plugin if desired. The feature can live somewhere in the user's project or in a package, and it can include any Assets that a normal Unity project would use.
A feature must override the `OpenXRFeature` ScriptableObject class. There are several methods that you can override on the `OpenXRFeature` class in order to do things like intercepting native OpenXR calls, obtaining the `xrInstance` and `xrSession` handles, starting Unity subsystems, etc.
A feature can add public fields for user configuration at build time. Unity renders these fields via a `PropertyDrawer` in the feature UI, and you can override them. Your feature can access any of the set values at runtime.
A feature must also provide an `OpenXRFeature` attribute when running in the Editor.
```c#
#if UNITY_EDITOR
[UnityEditor.XR.OpenXR.Features.OpenXRFeature(UiName = "Example Intercept Create Session",
    BuildTargetGroups = new []{BuildTargetGroup.Standalone, BuildTargetGroup.WSA},
    Company = "Unity",
    Desc = "Example feature extension showing how to intercept a single OpenXR function.",
    DocumentationLink = "https://docs.unity3d.com/Packages/com.unity.xr.openxr@0.1/manual/index.html",
    OpenxrExtensionStrings = "XR_test", // this extension doesn't exist, a log message will be printed that it couldn't be enabled
    Version = "0.0.1",
    FeatureId = featureId)]
#endif
public class InterceptCreateSessionFeature : OpenXRFeature
{
    /// <summary>
    /// The feature id string. This is used to give the feature a well known id for reference.
    /// </summary>
    public const string featureId = "com.unity.openxr.feature.example.intercept";
}
```
Unity uses this information at build time, either to build the Player or to display it to the user in the UI.
## Defining a feature group
Unity OpenXR allows you to define a feature group you can use to enable or disable a group of features at the same time. This way, you don't need to access the **Feature** section in **Project Settings &gt; XR Plug-in Management &gt; OpenXR** window to enable or disable features.
Declare a feature group through the definition of one or more `OpenXRFeatureSetAttribute` declarations in your code. You can place the attribute anywhere because the feature group functionality only depends on the attribute existing and not on the actual class it's declared on.
```c#
[OpenXRFeatureSet(
    FeatureIds = new string[] { // The list of features that this feature group is defined for.
        EyeGazeInteraction.featureId,
        KHRSimpleControllerProfile.featureId,
        "com.mycompany.myprovider.mynewfeature",
    },
    UiName = "Feature_Set_Name",
    Description = "Feature group that allows for setting up the best environment for My Company's hardware.",
    // Unique ID for this feature group
    FeatureSetId = "com.mycompany.myprovider.mynewfeaturegroup",
    SupportedBuildTargets = new BuildTargetGroup[]{ BuildTargetGroup.Standalone, BuildTargetGroup.Android }
)]
class MyCompanysFeatureSet
{
}
```
You can configure feature groups in the **XR Plug-in Management** plug-in selection window. When you select the **OpenXR** plug-in from this window, the section under the plug-in displays the groups of features available. Not all feature groups are configurable. Some require you to install third-party definitions. The window displays information on where to get the required packages if needed.
### Enabling OpenXR spec extension strings
Unity will attempt to enable any extension strings listed in `OpenXRFeatureAttribute.OpenxrExtensionStrings` (separated via spaces) on startup. Your feature can check the enabled extensions in order to see if the requested extension was enabled (via `OpenXRRuntime.IsExtensionEnabled`).
```c#
protected override bool OnInstanceCreate(ulong xrInstance)
{
    if (!OpenXRRuntime.IsExtensionEnabled("XR_UNITY_mock_driver"))
    {
        Debug.LogWarning("XR_UNITY_mock_driver is not enabled, disabling Mock Driver.");
        // Return false here to indicate the system should disable your feature for this execution.
        // Note that if a feature is marked required, returning false will cause the OpenXRLoader to abort and try another loader.
        return false;
    }

    // Initialize your feature and check the extension version to make sure you are compatible with it.
    if (OpenXRRuntime.GetExtensionVersion("XR_UNITY_mock_driver") < 100)
        return false;

    return true;
}
```
### OpenXRFeature call order
The `OpenXRFeature` class has a number of methods that your feature can override. Implement overrides to receive callbacks at specific points in the OpenXR application lifecycle.
#### Bootstrapping
`HookGetInstanceProcAddr`
This is the first callback invoked, giving your feature the ability to hook native OpenXR functions.
#### Initialize
`OnInstanceCreate => OnSystemChange => OnSubsystemCreate => OnSessionCreate`
The initialize sequence allows features to initialize Unity subsystems in the Loader callbacks and execute them when specific OpenXR resources are created or queried.
#### Start
`OnFormFactorChange => OnEnvironmentBlendModeChange => OnViewConfigurationTypeChange => OnSessionBegin => OnAppSpaceChange => OnSubsystemStart`
The Start sequence allows features to start Unity subsystems in the Loader callbacks and execute them when the session is created.
#### Gameloop
* `OnSessionStateChange` (called several times as the session state changes)
* `OnSessionBegin`
* `OnSessionEnd` (only if the session ends during the gameloop)
Callbacks during the gameloop can react to session state changes.
#### Stop
`OnSubsystemStop => OnSessionEnd`
#### Shutdown
`OnSessionExiting => OnSubsystemDestroy => OnAppSpaceChange => OnSessionDestroy => OnInstanceDestroy`
### Build Time Processing
A feature can inject some logic into the Unity build process in order to do things like modify the manifest.
Typically, you can do this by implementing the following interfaces:
* `IPreprocessBuildWithReport`
* `IPostprocessBuildWithReport`
* `IPostGenerateGradleAndroidProject`
Features **should not** implement these interfaces directly, but should instead extend `OpenXRFeatureBuildHooks`, which only provides callbacks when the feature is enabled. For more information, see `OpenXRFeatureBuildHooks`.
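As a rough sketch of this pattern (the class and feature names here are hypothetical, and you should check the `OpenXRFeatureBuildHooks` API reference for the exact member signatures), a build hook class might look like this:

```c#
using System;
using UnityEditor.Build.Reporting;
using UnityEditor.XR.OpenXR.Features;

// Hypothetical build hooks for a feature class named MyFeature.
// These callbacks only run when MyFeature is enabled in the OpenXR settings.
public class MyFeatureBuildHooks : OpenXRFeatureBuildHooks
{
    public override int callbackOrder => 1;

    // Ties this hook to the feature it belongs to.
    public override Type featureType => typeof(MyFeature);

    protected override void OnPreprocessBuildExt(BuildReport report)
    {
        // Validate or prepare build settings here.
    }

    protected override void OnPostGenerateGradleAndroidProjectExt(string path)
    {
        // Modify the generated Android project (for example, its manifest) here.
    }

    protected override void OnPostprocessBuildExt(BuildReport report)
    {
        // Clean up after the build here.
    }
}
```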
### Build time validation
If your feature has project setup requirements or suggestions that require user acceptance, implement `GetValidationChecks`. Features can add to a list of validation rules which Unity evaluates at build time. If any validation rule fails, Unity displays a dialog asking you to fix the error before proceeding. Unity can also present warnings through the same mechanism. It's important to note which build target the rules apply to.
Example:
```c#
#if UNITY_EDITOR
protected override void GetValidationChecks(List<OpenXRFeature.ValidationRule> results, BuildTargetGroup targetGroup)
{
    if (targetGroup == BuildTargetGroup.WSA)
    {
        results.Add(new ValidationRule(this)
        {
            message = "Eye Gaze support requires the Gaze Input capability.",
            error = false,
            checkPredicate = () => PlayerSettings.WSA.GetCapability(PlayerSettings.WSACapability.GazeInput),
            fixIt = () => PlayerSettings.WSA.SetCapability(PlayerSettings.WSACapability.GazeInput, true)
        });
    }
}
#endif
```
![feature-validation](images/ProjectValidation/feature-validation.png)
### Custom Loader library
One and only one Unity OpenXR feature per BuildTarget can have a custom loader library. This must be named `openxr_loader` with native platform naming conventions (for example, `libopenxr_loader.so` on Android).
Features with a custom loader library must set the `OpenXRFeatureAttribute`: `CustomRuntimeLoaderBuildTargets` to a list of BuildTargets in which a custom loader library is expected to be used. Features that do not use a custom loader library do not have to set `CustomRuntimeLoaderBuildTargets` (or can set it to null or an empty list).
The custom loader library must be placed in the same directory or a subdirectory of the C# script that extends the `OpenXRFeature` class. When the feature is enabled, Unity will include the custom loader library in the build for the active BuildTarget, instead of the default loader library for that target.
### Feature native libraries
Any native libraries included in the same directory or a subdirectory of your feature will only be included in the built Player if your feature is enabled.
## Feature use cases
### Intercepting OpenXR function calls
To intercept OpenXR function calls, override `OpenXRFeature.HookGetInstanceProcAddr`. Returning a different function pointer allows intercepting any OpenXR method. For an example, see the `Intercept Feature` sample.
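The general shape of an intercept, sketched below (the native entry point name is hypothetical, and the wrapping logic typically lives in a native plugin), is to pass the original loader function pointer to your plugin and return a pointer to your own `xrGetInstanceProcAddr` implementation that substitutes the functions you want to intercept:

```c#
using System;
using System.Runtime.InteropServices;
using UnityEngine.XR.OpenXR.Features;

public class MyInterceptFeature : OpenXRFeature
{
    // Hypothetical native entry point: stores 'func' and returns a wrapper
    // that resolves most lookups through 'func' but substitutes the
    // functions this feature wants to intercept.
    [DllImport("MyNativePlugin")]
    static extern IntPtr intercept_xrGetInstanceProcAddr(IntPtr func);

    protected override IntPtr HookGetInstanceProcAddr(IntPtr func)
    {
        // Returning a different pointer here makes OpenXR resolve all
        // subsequent functions through the wrapper instead of the original loader.
        return intercept_xrGetInstanceProcAddr(func);
    }
}
```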
### Calling OpenXR functions from a feature
To call an OpenXR function within a feature, you first need to retrieve a pointer to the function. To do this, use the `OpenXRFeature.xrGetInstanceProcAddr` function pointer to request a pointer to the function you want to call. Using `OpenXRFeature.xrGetInstanceProcAddr` to retrieve the function pointer ensures that any intercepted calls set up by features using `OpenXRFeature.HookGetInstanceProcAddr` will be included.
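As a minimal sketch of that lookup (the target function `xrBeginSession` and the delegate signatures are illustrative; real marshalling depends on the function you call, so verify the signatures against the OpenXR specification):

```c#
using System;
using System.Runtime.InteropServices;
using UnityEngine.XR.OpenXR.Features;

public class CallingFeature : OpenXRFeature
{
    // Matches the native signature:
    // XrResult xrGetInstanceProcAddr(XrInstance, const char*, PFN_xrVoidFunction*)
    [UnmanagedFunctionPointer(CallingConvention.Cdecl)]
    delegate int GetInstanceProcAddrDelegate(ulong instance, string name, out IntPtr function);

    // Illustrative target: XrResult xrBeginSession(XrSession, const XrSessionBeginInfo*)
    [UnmanagedFunctionPointer(CallingConvention.Cdecl)]
    delegate int BeginSessionDelegate(ulong session, IntPtr beginInfo);

    BeginSessionDelegate xrBeginSession;

    protected override bool OnInstanceCreate(ulong xrInstance)
    {
        // OpenXRFeature exposes xrGetInstanceProcAddr as a raw function pointer.
        var getProcAddr = Marshal.GetDelegateForFunctionPointer<GetInstanceProcAddrDelegate>(xrGetInstanceProcAddr);

        // Resolving through this pointer ensures hooks installed by other
        // features via HookGetInstanceProcAddr are respected.
        if (getProcAddr(xrInstance, "xrBeginSession", out IntPtr fn) == 0 && fn != IntPtr.Zero)
            xrBeginSession = Marshal.GetDelegateForFunctionPointer<BeginSessionDelegate>(fn);

        return true;
    }
}
```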
### Providing a Unity subsystem implementation
`OpenXRFeature` provides several XR Loader callbacks where you can manage the lifecycle of Unity subsystems. For an example meshing subsystem feature, see the `Meshing Subsystem Feature` sample.
Note that a `UnitySubsystemsManifest.json` file is required in order for Unity to discover any subsystems you define. At the moment, there are several restrictions around this file:
* It must be only 1 subfolder deep in the project or package.
* The native library it refers to must be only 1-2 subfolders deeper than the `UnitySubsystemsManifest.json` file.


@@ -0,0 +1,88 @@
---
uid: openxr-composition-layers
---
# Composition Layers Support
The OpenXR Composition Layers feature provides support for rendering high quality images on layer types, such as cylinder, equirect, cube, and more, on platforms like the Quest headsets.
This functionality allows developers to manage visual layers in a flexible, performant way, tailored for XR applications.
Refer to the [Composition Layers](https://docs.unity3d.com/Packages/com.unity.xr.compositionlayers@latest) documentation for information about using composition layers in a scene.
## Installation
To use this feature, install the [Composition Layers package](https://docs.unity3d.com/Packages/com.unity.xr.compositionlayers@1.0/manual/index.html). Follow the steps in the package documentation for installation details.
## Enabling Composition Layers
Once the package is installed, you can enable OpenXR Composition Layers in your Unity project:
1. Open the **Project Settings** window (menu: **Edit > Project Settings**).
2. Select **XR Plug-in Management** from the list of settings on the left.
3. If necessary, enable **OpenXR** in the list of **Plug-in Providers**. Unity installs the OpenXR plug-in, if not already installed.
4. Select the **OpenXR** settings page under **XR Plug-in Management**.
5. Enable the **Composition Layer Support** option in the **OpenXR Feature Groups** area of the **OpenXR** settings.
![OpenXR feature options](../../images/openxr-features.png)
## Custom Layers
In addition to the built-in layer types supported by the Composition Layers feature, you can create and register your own custom composition layers. This is particularly useful when your project requires non-standard visual behavior, optimized layer rendering, or complex interactions in AR/VR environments.
### Custom Layer Example
The following example demonstrates how to create a custom layer handler for a custom layer type. This handler is responsible for managing the lifecycle of the layer, rendering, and other properties.
```c#
#if UNITY_EDITOR
[UnityEditor.XR.OpenXR.Features.OpenXRFeature(UiName = "OpenXR Custom Layer Handler Example",
    BuildTargetGroups = new[] { BuildTargetGroup.Standalone, BuildTargetGroup.WSA, BuildTargetGroup.Android },
    Company = "Unity",
    Desc = "An example to demonstrate how to enable a handler for a customized composition layer type.",
    DocumentationLink = "",
    FeatureId = "com.unity.openxr.features.customlayerexample",
    OpenxrExtensionStrings = "",
    Version = "1")]
#endif
public class CustomFeature : OpenXRFeature
{
    bool isSubscribed;

    protected override void OnEnable()
    {
        if (OpenXRLayerProvider.isStarted)
        {
            CreateAndRegisterLayerHandler();
        }
        else
        {
            OpenXRLayerProvider.Started += CreateAndRegisterLayerHandler;
            isSubscribed = true;
        }
    }

    protected override void OnDisable()
    {
        if (isSubscribed)
        {
            OpenXRLayerProvider.Started -= CreateAndRegisterLayerHandler;
            isSubscribed = false;
        }
    }

    protected void CreateAndRegisterLayerHandler()
    {
        if (enabled)
        {
            var layerHandler = new CustomLayerHandler();
            OpenXRLayerProvider.RegisterLayerHandler(typeof(CustomQuadLayerData), layerHandler);
        }
    }
}
```
This code can also be imported from the OpenXR Samples in the Package Manager.
### When to Use Custom Layers
You may want to create a custom layer in situations where:
- You need to implement a specific rendering technique not covered by the standard layer types (such as a specialized quad or cylinder layer).
- Performance optimizations for specific layer interactions are required.
- You want to create unique visual effects or layer behaviors in your XR application.
By registering custom layer handlers, you can gain full control over how the composition layers are processed and rendered, allowing for more tailored XR experiences.


@@ -0,0 +1,28 @@
---
uid: openxr-dpad-interaction
---
# D-Pad Interaction
Unity OpenXR provides support for the Dpad Binding extension specified by Khronos. Use this layout to retrieve the bindings data that the extension returns. This extension allows the application to bind one or more digital actions to a trackpad or thumbstick as though it were a d-pad, by defining additional component paths for suggested bindings.
Enables the OpenXR interaction profile for Dpad Interaction and exposes the `<DPad>` layout within the [Unity Input System](https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/).
For more information about the Dpad Binding extension, refer to the [OpenXR Specification](https://registry.khronos.org/OpenXR/specs/1.0/html/xrspec.html#XR_EXT_dpad_binding).
## Available controls
The following table outlines the mapping between the OpenXR paths and Unity's implementation:
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
| `/input/thumbstick/dpad_up` | thumbstickDpadUp | Boolean |
| `/input/thumbstick/dpad_down` | thumbstickDpadDown | Boolean |
| `/input/thumbstick/dpad_left` | thumbstickDpadLeft | Boolean |
| `/input/thumbstick/dpad_right` | thumbstickDpadRight | Boolean |
| `/input/trackpad/dpad_up` | trackpadDpadUp | Boolean |
| `/input/trackpad/dpad_down` | trackpadDpadDown | Boolean |
| `/input/trackpad/dpad_left` | trackpadDpadLeft | Boolean |
| `/input/trackpad/dpad_right` | trackpadDpadRight | Boolean |
| `/input/trackpad/dpad_center` | trackpadDpadCenter | Boolean |
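As an illustration (the action setup below is hypothetical; the control only exists when the active device layout exposes it), you could bind an Input System action to one of these controls:

```c#
using UnityEngine;
using UnityEngine.InputSystem;

public class DpadExample : MonoBehaviour
{
    InputAction dpadUp;

    void OnEnable()
    {
        // Bind to the d-pad "up" region of the right-hand thumbstick.
        dpadUp = new InputAction(binding: "<XRController>{RightHand}/thumbstickDpadUp");
        dpadUp.performed += ctx => Debug.Log("Thumbstick d-pad up pressed");
        dpadUp.Enable();
    }

    void OnDisable()
    {
        dpadUp.Disable();
    }
}
```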


@@ -0,0 +1,23 @@
---
uid: openxr-eye-gaze-interaction
---
# Eye Gaze Interaction
Unity OpenXR provides support for the Eye Tracking Interaction extension specified by Khronos. Use this layout to retrieve the pose data that the extension returns.
At present, this device does not appear in the Unity Input System drop-down menus. To bind to the gaze position and rotation, use the following binding paths.
|**Data**|**Binding Path**|
|--------|------------|
|Position|`<EyeGaze>/pose/position`|
|Rotation|`<EyeGaze>/pose/rotation`|
For more information about the Eye Gaze extension, refer to the [OpenXR Specification](https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#XR_EXT_eye_gaze_interaction).
## Available controls
The following table outlines the mapping between the OpenXR paths and Unity's implementation:
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
| `/input/gaze_ext/pose` | pose | Pose |
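For example (a minimal sketch, assuming a device and runtime that support this extension), you can read the gaze pose through Input System actions using the binding paths above:

```c#
using UnityEngine;
using UnityEngine.InputSystem;

public class GazeReader : MonoBehaviour
{
    InputAction gazePosition;
    InputAction gazeRotation;

    void OnEnable()
    {
        gazePosition = new InputAction(binding: "<EyeGaze>/pose/position");
        gazeRotation = new InputAction(binding: "<EyeGaze>/pose/rotation");
        gazePosition.Enable();
        gazeRotation.Enable();
    }

    void Update()
    {
        Vector3 position = gazePosition.ReadValue<Vector3>();
        Quaternion rotation = gazeRotation.ReadValue<Quaternion>();

        // Visualize the gaze ray in the Scene view.
        Debug.DrawRay(position, rotation * Vector3.forward, Color.green);
    }
}
```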


@@ -0,0 +1,206 @@
---
uid: openxr-foveated-rendering
---
# Foveated rendering in OpenXR
Foveated rendering is an optimization technique that can speed up rendering with little perceived impact on visual quality. Foveated rendering works by lowering the resolution of areas in the user's peripheral vision. On headsets that support eye-tracking as well as foveated rendering, the higher-resolution area can be centered where the user is currently looking. Without eye-tracking, the higher resolution area is fixed near the middle of the screen. Fixed foveated rendering can be more apparent to the user since they can shift their eyes to look at the peripheral areas.
OpenXR platforms use the variable shading rate (VSR) technique for foveated rendering, which does not require you to change custom shaders. (However, if you are creating assets that should work on all XR platforms, refer to the "Foveated rendering shaders" topic in [Foveated rendering](https://docs.unity3d.com/Manual/xr-foveated-rendering.html) to learn how to write shaders and shader graphs that work under all supported foveated rendering methods.)
OpenXR devices can implement VRS using a variety of techniques and a single device can support more than one implementation. The Unity OpenXR plug-in chooses from the following techniques, in this order, depending on what the current device supports:
1. Gaze-based Fragment Density Map (GFDM) from provider.
2. Fixed Fragment Density Map (FFDM) from provider.
3. Fragment Shading Rate (FSR) using a provider's texture.
4. Fragment Shading Rate using a compute shader calculated from the asymmetric FOVs the provider gives.
This topic covers aspects of foveated rendering specific to the Unity OpenXR provider plug-in:
* [Prerequisites](#prerequisites)
* [Configure foveated rendering](#configure-foveated-rendering)
* [Use the **SRP Foveation** API](#use-the-srp-foveation-api)
* [Use the **Legacy** API](#use-the-legacy-api)
* [Request eye-tracking permission on Android](#request-eye-tracking-permission)
For general information about foveated rendering in Unity XR, refer to [Foveated rendering](https://docs.unity3d.com/Manual/xr-foveated-rendering.html).
<a id="prerequisites"></a>
## Prerequisites
To use the foveated rendering feature in OpenXR, your project must meet the following prerequisites:
* Unity 6+
* Unity OpenXR plugin (com.unity.xr.openxr) 1.11.0+
* Universal Rendering Pipeline or High Definition Render Pipeline (HDRP not recommended for mobile XR platforms such as the Meta Quest family of devices)
Alternately, you can use the Meta Core XR SDK package to access the OpenXR foveated rendering feature on Quest devices:
* Unity 2022.2+ or Unity 6+
* Unity OpenXR plugin (com.unity.xr.openxr) 1.11.0+
* Meta Core XR SDK 68.0+
* Built-in or Universal Rendering Pipeline
> [!NOTE]
> The primary differences between the Unity **SRP Foveation** API and the Meta API for foveated rendering include:
>
> * They use different APIs and code paths to enable and control foveated rendering on a device.
> * The Unity **SRP Foveation** API works on all platforms that support foveated rendering, which allows you to share code across different types of devices.
> * The Unity **SRP Foveation** API does not support the Built-in Rendering Pipeline.
> * The Meta API, which only works on Quest devices, supports Unity 2022.3 and the Built-in Render Pipeline.
> * The Meta API does not support foveated rendering when you use intermediate render targets. Intermediate rendering targets are used by post-processing, tone mapping and camera stacking, for example, and may be used by other rendering features, too.
>
> Unity recommends that you use the Unity **SRP Foveation** API where possible for better compatibility. You can still use other features from the Meta Core XR SDK in conjunction with the **SRP Foveation** API, if desired.
<a id="configure-foveated-rendering"></a>
## Configure foveated rendering
You can configure foveated rendering in a project that meets the [Prerequisites](#prerequisites):
* [Configure SRP Foveation](#configure-srp-foveation)
* [Configure Legacy foveated rendering](#configure-legacy-foveated-rendering)
* [Configure gaze-based foveated rendering](#configure-gaze-based-foveated-rendering)
Once configured in settings, you must also turn on foveated rendering at runtime. By default, the foveated rendering strength or level is set to off. You must also set a runtime flag to use gaze-based foveated rendering.
Refer to the following topics for more information:
* [Use the SRP Foveation API](#use-the-srp-foveation-api)
* [Use the Legacy API](#use-the-legacy-api)
<a id="configure-srp-foveation"></a>
### Configure SRP Foveation
To enable the **SRP Foveation** API in Unity 6+:
1. Open the **Project Settings** window.
2. Under **XR Plug-in Management**, select the **OpenXR** settings.
3. Set the **Foveated Rendering API** option to **SRP Foveation**.
4. In the list of **OpenXR Feature Groups**, select **All Features**.
5. Enable the **Foveated Rendering** feature.
![SRP Foveation settings](../images/FoveatedRendering/xr-foveation-srp-api-settings.png)<br/>*Settings to enable foveated rendering with the **SRP Foveation** API*
After you have configured the settings, you must also turn on foveated rendering at runtime. Refer to [Use the SRP Foveation API](#use-the-srp-foveation-api) for more information.
> [!NOTE]
> You must configure the project to use either the [Universal Render Pipeline (URP)](https://docs.unity3d.com/Packages/com.unity.render-pipelines.universal@17.0/manual/InstallURPIntoAProject.html) or the [High Definition Render Pipeline (HDRP)](https://docs.unity3d.com/Packages/com.unity.render-pipelines.high-definition@17.0/manual/convert-project-from-built-in-render-pipeline.html), if you have not already done so.
<a id="configure-legacy-foveated-rendering"></a>
### Configure Legacy foveated rendering
In Unity 6+, set the OpenXR **Foveated Rendering API** option to **Legacy** to use the Meta Core XR SDK, which only supports the Meta Quest family of devices. In Unity 2022, there is no option to select a **Foveated Rendering API**; only the legacy API using the Meta Core XR SDK is supported.
1. Install the [Meta Core XR SDK](com.unity3d.kharma:upmpackage/com.meta.xr.sdk.core) package, if necessary. You can get this package from the [Unity Asset Store](https://assetstore.unity.com/packages/tools/integration/meta-xr-core-sdk-269169). The package adds the Meta OpenXR feature group to the OpenXR along with the associated features. Refer to the [Meta Developer site](https://developer.oculus.com/downloads/package/meta-xr-core-sdk/68.0) for more information.
2. Open the **Project Settings** window.
3. Select the **XR Plug-in Management** settings from the list on the left.
4. Select the **Android** tab.
5. Under **OpenXR**, enable the **Meta XR feature group**, which is added when you install the Meta Core SDK package. (You might be prompted to restart the Unity Editor, which you can do now or after you finish configuring these settings.)
![Enable Meta XR feature group](../images/FoveatedRendering/xr-meta-feature-group.png)
6. Select the **OpenXR** settings area (below **XR Plug-in Management**).
7. Set the **Foveated Rendering API** option to **Legacy**.
8. In the list of **OpenXR Feature Groups**, select **All Features**.
9. Disable the **Foveated Rendering** feature, if it is enabled.
10. Enable the **Meta XR Foveated Rendering** feature.
11. (Optional) Enable the **Meta XR Eye Tracked Foveation** feature.
![Legacy Foveation settings](../images/FoveatedRendering/xr-foveation-legacy-settings.png)<br/>*Settings to enable foveated rendering with the **Legacy** API*
After you have configured the settings, you must also turn on foveated rendering at runtime. Refer to [Use the Legacy API](#use-the-legacy-api) for more information.
> [!NOTE]
> The Meta Core XR SDK is a third-party package, which is not under Unity control. The OpenXR features and API it provides can change without notice.
<a id="configure-gaze-based-foveated-rendering"></a>
### Configure gaze-based foveated rendering
Devices that provide eye tracking can support gaze-based foveated rendering in which the highest resolution area is centered where the user is looking.
When using the Unity **SRP Foveation** API, you do not need to enable gaze-based foveated rendering in the OpenXR settings. You do need to [turn the feature on at runtime](#use-the-srp-foveation-api) and make sure any required permissions are enabled.
When using the **Legacy** API with the Meta Core XR SDK, you must enable the **Meta XR Eye Tracked Foveation** OpenXR feature.
To use eye-tracking data, you must [Request eye-tracking permission on Android](#request-eye-tracking-permission). Other platforms may have similar requirements.
<a id="use-the-srp-foveation-api"></a>
## Use the SRP Foveation API
After you have configured foveated rendering in the OpenXR settings, you must also turn the feature on at runtime. If you want to use gaze-based foveated rendering, you must set a runtime flag, which might require user permission.
To specify the amount, or *strength*, of the foveation effect, you must assign a value between 0 and 1 to the [XRDisplaySubsystem.foveatedRenderingLevel](xref:UnityEngine.XR.XRDisplaySubsystem.foveatedRenderingLevel) property. The default value of zero turns foveation off altogether. A value of one is the maximum strength. Different device types can interpret this value in the way that best suits their native API. Meta Quest devices, for example, have discrete levels for setting the foveation strength: if you assign a value of `0.5` to `foveatedRenderingLevel`, the provider plug-in sets the device's *medium* foveation level.
To specify that you want to use gaze-based foveated rendering, set [XRDisplaySubsystem.foveatedRenderingFlags](xref:UnityEngine.XR.XRDisplaySubsystem.foveatedRenderingFlags) to [FoveatedRenderingFlags.GazeAllowed](xref:UnityEngine.XR.XRDisplaySubsystem.FoveatedRenderingFlags.GazeAllowed). If you do not set this flag, if the device doesn't support gaze-based foveated rendering, or if the user turns off or denies permission to use eye-tracking, fixed foveated rendering is performed instead.
To set either of these foveated rendering APIs, you must first get a reference to the active [XRDisplaySubsystem](xref:UnityEngine.XR.XRDisplaySubsystem) from the Unity [SubsystemManager](xref:UnityEngine.SubsystemManager). Unity supports multiple subsystems of the same type, and returns a list when you get the subsystems of a given type. Ordinarily, only one `XRDisplaySubsystem` exists and you can use the lone subsystem in the `XRDisplaySubsystem` list returned by [SubsystemManager.GetSubsystems](https://docs.unity3d.com/6000.0/Documentation/ScriptReference/SubsystemManager.GetSubsystems.html).
The following code example illustrates how to set foveated rendering to full strength and enable gaze-based foveation after getting the instance of the active `XRDisplaySubsystem`:
```C#
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;
public class FoveationStarter : MonoBehaviour
{
    List<XRDisplaySubsystem> xrDisplays = new List<XRDisplaySubsystem>();

    void Start()
    {
        SubsystemManager.GetSubsystems(xrDisplays);
        if (xrDisplays.Count == 1)
        {
            xrDisplays[0].foveatedRenderingLevel = 1.0f; // Full strength
            xrDisplays[0].foveatedRenderingFlags
                = XRDisplaySubsystem.FoveatedRenderingFlags.GazeAllowed;
        }
    }
}
```
> [!NOTE]
> This code example relies on methods available in Unity 6+. It does not compile in earlier versions.
<a id="use-the-legacy-api"></a>
## Use the Legacy API
This API was built only for Quest headsets; it may not be supported on other devices. You must install Meta's Core XR SDK package for this API to be available.
After you configured foveated rendering in the OpenXR settings, you must also turn the feature on at runtime. If you want to use gaze-based foveated rendering, you must set a runtime flag, which might require user permission.
The following code example illustrates how to set foveated rendering to High and enable gaze-based foveation using the **Legacy API** and the Meta Core XR SDK package:
```C#
using UnityEngine;
using UnityEngine.XR;
public class FoveationStarter : MonoBehaviour
{
    private void Start()
    {
        OVRManager.foveatedRenderingLevel = OVRManager.FoveatedRenderingLevel.High;
        OVRManager.eyeTrackedFoveatedRenderingEnabled = true;
    }
}
```
Refer to Meta's [OVRManager Class Reference](https://developer.oculus.com/reference/unity/v67/class_o_v_r_manager#acbd6d504192d2a2a7461382a4eae0715a84ec48f67b50df5ba7f823879769e0ad) for more information.
<a id="request-eye-tracking-permission"></a>
## Request eye-tracking permission on Android
The Android platform requires the user to grant permission before your app can access eye-tracking data. Eye-tracking permission is required to use gaze-based foveated rendering.
To declare that your application uses eye tracking, you must add a `uses-feature` and a `uses-permission` element to your application's Android manifest file:
``` xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android" xmlns:tools="http://schemas.android.com/tools">
<uses-feature android:name="oculus.software.eye_tracking" android:required="false" />
<uses-permission android:name="com.oculus.permission.EYE_TRACKING" />
... the rest of the manifest elements ...
```
Refer to [Declare permissions for an application](xref:um-android-permissions-declare) for instructions about how to add these and other custom elements to the Android manifest.
If the user denies permission, your application uses fixed foveated rendering instead. Refer to [Request runtime permissions](xref:um-android-requesting-permissions) for more information about handling Android permissions, including how to handle cases where the user has denied permission.
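If you want to check and request the permission from a script instead, you can use Unity's Android `Permission` API. The following sketch (the class name is illustrative) requests the eye-tracking permission at startup:

```csharp
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

public class EyeTrackingPermission : MonoBehaviour
{
    // Same permission string as the uses-permission manifest entry.
    const string k_EyeTrackingPermission = "com.oculus.permission.EYE_TRACKING";

    void Start()
    {
#if UNITY_ANDROID
        if (!Permission.HasUserAuthorizedPermission(k_EyeTrackingPermission))
        {
            // Shows the system permission dialog. If the user denies the
            // request, the app falls back to fixed foveated rendering.
            Permission.RequestUserPermission(k_EyeTrackingPermission);
        }
#endif
    }
}
```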

---
uid: openxr-hand-common-poses-interaction
---
# Hand Common Poses Interaction
Unity OpenXR provides support for the Hand Interaction extension specified by Khronos. Use this layout to retrieve the binding data that the extension returns. This extension defines four commonly used action poses for all hand interaction profiles, including both hand-tracking devices and motion controller devices.
Enables the OpenXR interaction feature for Hand Common Poses Interaction and exposes the `<HandInteractionPoses>` layout within the [Unity Input System](https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/).
A link to the OpenXR Specification for the Hand Interaction extension will be added here when it becomes available.
## Available controls
The following table outlines the mapping between the OpenXR paths and Unity's implementation:
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
|`/input/grip/pose` | devicePose | Pose |
|`/input/aim/pose` | pointer | Pose |
|`/input/pinch_ext/pose` | pinchPose | Pose |
|`/input/poke_ext/pose` | pokePose | Pose |
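As an illustration of how these controls can be consumed, the following hypothetical component binds an Input System action to the left hand's pinch pose. The binding path combines the layout and control names from the table above with the standard `{LeftHand}` usage:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class PinchPoseReader : MonoBehaviour
{
    InputAction m_PinchPosition;

    void OnEnable()
    {
        // Bind to the position component of the left hand's pinch pose.
        m_PinchPosition = new InputAction(
            binding: "<HandInteractionPoses>{LeftHand}/pinchPose/position");
        m_PinchPosition.Enable();
    }

    void Update()
    {
        Vector3 position = m_PinchPosition.ReadValue<Vector3>();
        // Use the pinch position, for example to place a cursor.
    }

    void OnDisable() => m_PinchPosition.Disable();
}
```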

---
uid: openxr-hand-interaction-profile
---
# Hand Interaction Profile
The hand interaction profile is designed for runtimes that provide hand input using hand-tracking devices instead of controllers with triggers or buttons.
The hand interaction profile allows hand tracking devices to provide commonly used gestures and action poses. Enable this OpenXR interaction profile to expose the `<HandInteraction>` device layout within the [Unity Input System](xref:input-system-index).
A link to the OpenXR Specification for the Hand Interaction Profile will be added here when it becomes available.
## Available controls
The following table outlines the mapping between the OpenXR paths and Unity's implementation:
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
|`/input/grip/pose` | devicePose | Pose |
|`/input/aim/pose` | pointer | Pose |
|`/input/pinch_ext/pose` | pinchPose | Pose |
|`/input/poke_ext/pose` | pokePose | Pose |
|`/input/pinch_ext/value`| pinchValue | Float |
|`/input/pinch_ext/ready_ext` | pinchReady | Boolean|
|`/input/aim_activate_ext/value`| pointerActivateValue | Float |
|`/input/aim_activate_ext/ready_ext` | pointerActivateReady | Boolean|
|`/input/grasp_ext/value`| graspValue | Float |
|`/input/grasp_ext/ready_ext` | graspReady | Boolean|
| Unity Layout Only | isTracked | Flag Data |
| Unity Layout Only | trackingState | Flag Data |
| Unity Layout Only | devicePosition | Vector3 |
| Unity Layout Only | deviceRotation | Quaternion |
[!include[](snippets/unity-layout.md)]
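The gesture controls pair a `ready` flag with an analog `value`. The following hypothetical component sketches how you might detect a pinch by combining the two (control names follow the table above; the thresholds are arbitrary):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class PinchGestureReader : MonoBehaviour
{
    InputAction m_PinchValue;
    InputAction m_PinchReady;

    void OnEnable()
    {
        m_PinchValue = new InputAction(binding: "<HandInteraction>{RightHand}/pinchValue");
        m_PinchReady = new InputAction(binding: "<HandInteraction>{RightHand}/pinchReady");
        m_PinchValue.Enable();
        m_PinchReady.Enable();
    }

    void Update()
    {
        // pinchReady reports that the hand is in a position where a pinch
        // can be detected; pinchValue reports how closed the pinch is (0-1).
        if (m_PinchReady.ReadValue<float>() > 0.5f &&
            m_PinchValue.ReadValue<float>() > 0.9f)
        {
            // Treat as a pinch "click".
        }
    }

    void OnDisable()
    {
        m_PinchValue.Disable();
        m_PinchReady.Disable();
    }
}
```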

---
uid: openxr-hp-reverb-g2-controller-profile
---
# HP Reverb G2 Controller Profile
Enables the OpenXR interaction profile for the HP Reverb G2 Controller and exposes the `<ReverbG2Controller>` device layout within the [Unity Input System](xref:input-system-index).
For more information about the HP Reverb G2 interaction profile, refer to the [OpenXR Specification](https://registry.khronos.org/OpenXR/specs/1.0/html/xrspec.html#XR_EXT_hp_mixed_reality_controller).
## Available controls
The following table outlines the mapping between the OpenXR paths and Unity's implementation:
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
|`/input/x/click`| primaryButton (Left Hand Only) | Boolean |
|`/input/y/click`| secondaryButton (Left Hand Only) | Boolean |
|`/input/a/click`| primaryButton (Right Hand Only) | Boolean |
|`/input/b/click`| secondaryButton (Right Hand Only) | Boolean |
|`/input/menu/click` | menu | Boolean|
|`/input/squeeze/value`| grip | Float |
|`/input/squeeze/value`| gripPressed | Boolean (float cast to Boolean) |
|`/input/trigger/value`|trigger| Float |
|`/input/trigger/value`| triggerPressed | Boolean (float cast to Boolean) |
|`/input/thumbstick`| thumbstick | Vector2 |
|`/input/thumbstick/click`| thumbstickClicked | Boolean |
|`/input/grip/pose` | devicePose | Pose |
|`/input/aim/pose` | pointer | Pose |
|`/output/haptic` | haptic | Vibrate |
| Unity Layout Only | isTracked | Flag Data |
| Unity Layout Only | trackingState | Flag Data |
| Unity Layout Only | devicePosition | Vector3 |
| Unity Layout Only | deviceRotation | Quaternion |
[!include[](snippets/unity-layout.md)]
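As a usage sketch, the following hypothetical component reads the left thumbstick from the layout above to drive simple movement:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class ReverbG2Locomotion : MonoBehaviour
{
    InputAction m_Thumbstick;

    void OnEnable()
    {
        m_Thumbstick = new InputAction(
            binding: "<ReverbG2Controller>{LeftHand}/thumbstick");
        m_Thumbstick.Enable();
    }

    void Update()
    {
        // Map the 2D stick input onto the XZ plane.
        Vector2 move = m_Thumbstick.ReadValue<Vector2>();
        transform.position += new Vector3(move.x, 0f, move.y) * Time.deltaTime;
    }

    void OnDisable() => m_Thumbstick.Disable();
}
```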

---
uid: openxr-htc-vive-controller-profile
---
# HTC Vive Controller Profile
Enables the OpenXR interaction profile for the HTC Vive Controller and exposes the `<ViveController>` device layout within the [Unity Input System](xref:input-system-index).
For more information about the HTC Vive interaction profile, refer to the [OpenXR Specification](https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#_htc_vive_controller_profile).
## Available controls
The following table outlines the mapping between the OpenXR paths and Unity's implementation:
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
|`/input/system/click`| select | Boolean |
|`/input/squeeze/click`| grip | Float (Boolean value cast to float)|
|`/input/squeeze/click`| gripButton | Boolean |
|`/input/menu/click` | menu | Boolean|
|`/input/trigger/value`|trigger| Float |
|`/input/trigger/click`|triggerPressed| Boolean |
|`/input/trackpad`|trackpad| Vector2 |
|`/input/trackpad/click`|trackpadClicked| Boolean |
|`/input/trackpad/touch`|trackpadTouched| Boolean |
|`/input/grip/pose`| devicePose| Pose |
|`/input/aim/pose`|pointer| Pose |
|`/output/haptic` | haptic | Vibrate |
| Unity Layout Only | isTracked | Flag Data |
| Unity Layout Only | trackingState | Flag Data |
| Unity Layout Only | devicePosition | Vector3 |
| Unity Layout Only | deviceRotation | Quaternion |
[!include[](snippets/unity-layout.md)]
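To trigger the haptic output from the table above, you can route an action bound to the `haptic` control through `OpenXRInput.SendHapticImpulse`. The following sketch (class name and values are illustrative) plays a short pulse on the right controller:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.OpenXR.Input;

public class ViveHaptics : MonoBehaviour
{
    InputAction m_Haptic;

    void OnEnable()
    {
        m_Haptic = new InputAction(binding: "<ViveController>{RightHand}/haptic");
        m_Haptic.Enable();
    }

    public void Pulse()
    {
        // Amplitude is in the 0-1 range; duration is in seconds.
        OpenXRInput.SendHapticImpulse(m_Haptic, amplitude: 0.7f, duration: 0.1f);
    }

    void OnDisable() => m_Haptic.Disable();
}
```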

---
uid: openxr-khronos-simple-controller-profile
---
# Khronos Simple Controller Profile
Enables the OpenXR interaction profile for the Khronos Simple Controller and exposes the `<SimpleController>` device layout within the [Unity Input System](https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/).
For more information about the Khronos Simple Controller interaction profile, refer to the [OpenXR Specification](https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#_khronos_simple_controller_profile).
## Available controls
The following table outlines the mapping between the OpenXR paths and Unity's implementation:
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
|`/input/select/click`| select | Boolean |
|`/input/menu/click` | menu | Boolean |
|`/input/grip/pose` | devicePose | Pose |
|`/input/aim/pose` | pointer | Pose |
|`/output/haptic` | haptic | Vibrate |
| Unity Layout Only | isTracked | Flag Data |
| Unity Layout Only | trackingState | Flag Data |
| Unity Layout Only | devicePosition | Vector3 |
| Unity Layout Only | deviceRotation | Quaternion |
| Unity Layout Only | pointerPosition | Vector3 |
| Unity Layout Only | pointerRotation | Quaternion |
[!include[](snippets/unity-layout.md)]
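A minimal usage sketch for this profile binds a button action to the `select` control (the class name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class SimpleSelect : MonoBehaviour
{
    InputAction m_Select;

    void OnEnable()
    {
        m_Select = new InputAction(type: InputActionType.Button,
            binding: "<SimpleController>{LeftHand}/select");
        // performed fires when the button press is detected.
        m_Select.performed += _ => Debug.Log("Select pressed");
        m_Select.Enable();
    }

    void OnDisable() => m_Select.Disable();
}
```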

---
uid: openxr-meta-quest-support
---
# Meta Quest Support
Understand how to configure the OpenXR plug-in for Meta Quest devices.
The Meta Quest Support feature enables Meta Quest support for the OpenXR plug-in. The **Meta Quest Support** window provides configurable settings specific for Meta Quest devices.
Refer to the following sections to understand how to [Enable Meta Quest Support](#enable) and the [Settings](#settings) you can configure.
## Enable Meta Quest Support {#enable}
To deploy to Meta Quest, enable the Meta Quest Support feature on the Android build target as follows:
1. Open the Project Settings window (menu: **Edit > Project Settings**).
2. Select **XR Plug-in Management** from the list of settings on the left.
3. If necessary, enable **OpenXR** in the list of **Plug-in Providers**. Unity installs the OpenXR plug-in, if not already installed.
4. Select the OpenXR settings page under XR Plug-in Management.
5. Add the **Oculus Touch Controller Profile** to the **Interaction Profiles** list. (You can enable multiple profiles in the list; OpenXR chooses which one to use based on the current device at runtime.)
6. Enable **Meta Quest Support** under **OpenXR Feature Groups**.
When you enable Meta Quest Support, Unity produces Android application package (APK) files that you can run on the Quest family of devices.
## Open the Meta Quest Support window
The **Meta Quest Support** settings window provides the settings you can configure for OpenXR support on Meta Quest devices.
To open the **Meta Quest Support** window:
1. Open the **OpenXR** section of **XR Plug-in Management** (menu: **Edit** > **Project Settings** > **XR Plug-in Management** > **OpenXR**).
2. Under **OpenXR Feature Groups**, enable **Meta Quest Support**.
3. Click the gear icon next to **Meta Quest Support** to open **Meta Quest Support** settings.
![Meta Quest Support in the OpenXR settings.](../images/meta-quest-support.png)<br/>*Meta Quest Support feature.*
## Settings reference {#settings}
The following sections provide a reference of the settings you can configure in the **Meta Quest Support** window.
### Rendering Settings
Use the following settings to configure rendering in your project.
#### Symmetric Projection (Vulkan)
If enabled, the application creates a symmetric stereo view at startup and changes the eye buffer resolution based on the interpupillary distance (IPD). This provides a performance benefit across all IPD values.
#### Optimize Buffer Discards (Vulkan)
Enable this setting to allow 4x Multi Sample Anti Aliasing (MSAA) textures to be memoryless on Vulkan.
#### Optimize Multiview Render Regions (Vulkan)
Enable to activate Multiview Render Regions optimizations at application start (Unity 6.1 and newer).
To learn more about this feature, refer to [Multiview Render Regions](xref:openxr-multiview-render-regions).
#### Space Warp motion vector texture format
Choose the format the motion vector texture uses to store its values. The option you choose depends on whether you want to prioritize visual quality or reduced memory usage.
The options you can choose are:
| Option | Description |
| :---------- | :---------- |
| **RGBA16f** | Use this for more precise values. This can improve the visual quality of Space Warp. |
| **RG16f** | Use this option for reduced memory usage, but slightly less precision. |
To learn more about Space Warp, refer to [URP Application Spacewarp](xref:um-xr-application-spacewarp).
### Manifest Settings
Use the following settings to configure your project manifest.
#### Force Remove Internet Permission
Enable to force the removal of internet permissions added to the [Android App Manifest](xref:um-android-manifest).
#### System Splash Screen
Uses a `PNG` file in the `Assets` folder as the system splash screen image. If set, the OS displays the system splash screen as a high-quality compositor layer from the moment the app begins launching until the app submits its first frame.
Use the picker (&#8857;) to select an image from the `Assets` folder you want to use as the system splash screen image.
### Target Devices
Select the Quest devices your project targets.
### Experimental
The **Experimental** section contains experimental settings that are under active development.
> [!NOTE]
> Experimental settings might change before they're verified.
## Additional resources
* [Develop for Meta Quest](xref:um-xr-meta-quest-develop) (Unity manual)
* [Oculus All In on OpenXR: Deprecates Proprietary APIs](https://developer.oculus.com/blog/oculus-all-in-on-openxr-deprecates-proprietary-apis) (Meta developer blog)

---
uid: openxr-meta-quest-plus-touch-controller-profile
---
# Meta Quest Touch Plus Controller Profile
Enables the OpenXR interaction profile for Meta Quest Touch Plus controllers and exposes the `<QuestTouchPlusController>` device layout within the [Unity Input System](xref:input-system-index).
## Available controls
The following table outlines the mapping between the OpenXR paths and Unity's implementation:
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
|`/input/thumbstick`| thumbstick | Vector2 |
|`/input/squeeze/value`| grip | Float |
|`/input/squeeze/value`| gripPressed | Boolean (float cast to Boolean) |
|`/input/menu/click`| menu (Left Hand Only)| Boolean |
|`/input/system/click`| menu (Right Hand Only)| Boolean |
|`/input/a/click`| primaryButton (Right Hand Only) | Boolean |
|`/input/a/touch`| primaryTouched (Right Hand Only) | Boolean |
|`/input/b/click`| secondaryButton (Right Hand Only) | Boolean |
|`/input/b/touch`| secondaryTouched (Right Hand Only) | Boolean |
|`/input/x/click`| primaryButton (Left Hand Only) | Boolean |
|`/input/x/touch`| primaryTouched (Left Hand Only) | Boolean |
|`/input/y/click`| secondaryButton (Left Hand Only) | Boolean |
|`/input/y/touch`| secondaryTouched (Left Hand Only) | Boolean |
|`/input/trigger/value`| trigger | Float |
|`/input/trigger/value`| triggerPressed | Boolean (float cast to Boolean) |
|`/input/trigger/touch`| triggerTouched| Boolean (float cast to Boolean) |
|`/input/thumbstick/click`| thumbstickClicked | Boolean |
|`/input/thumbstick/touch`| thumbstickTouched | Boolean |
|`/input/thumbrest/touch`| thumbrestTouched | Boolean |
|`/input/grip/pose` | devicePose | Pose |
|`/input/aim/pose` | pointer | Pose |
|`/input/trigger/force` | triggerForce | Float |
|`/input/trigger/curl_meta` | triggerCurl | Float |
|`/input/trigger/slide_meta` | triggerSlide | Float |
|`/input/trigger/proximity_meta` | triggerProximity | Boolean |
|`/input/thumb_meta/proximity_meta` | thumbProximity | Boolean |
|`/output/haptic` | haptic | Vibrate |
| Unity Layout Only | isTracked | Flag Data |
| Unity Layout Only | trackingState | Flag Data |
| Unity Layout Only | devicePosition | Vector3 |
| Unity Layout Only | deviceRotation | Quaternion |
[!include[](snippets/unity-layout.md)]
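The Meta-specific trigger controls can be read like any other Input System control. The following hypothetical component combines `triggerProximity` and `triggerCurl` from the table above (the thresholds are arbitrary):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class TouchPlusTriggerReader : MonoBehaviour
{
    InputAction m_TriggerCurl;
    InputAction m_TriggerProximity;

    void OnEnable()
    {
        m_TriggerCurl = new InputAction(
            binding: "<QuestTouchPlusController>{RightHand}/triggerCurl");
        m_TriggerProximity = new InputAction(
            binding: "<QuestTouchPlusController>{RightHand}/triggerProximity");
        m_TriggerCurl.Enable();
        m_TriggerProximity.Enable();
    }

    void Update()
    {
        // triggerProximity reports the finger hovering near the trigger;
        // triggerCurl reports how far the index finger is curled (0-1).
        if (m_TriggerProximity.ReadValue<float>() > 0.5f)
        {
            float curl = m_TriggerCurl.ReadValue<float>();
            // Drive a hand-pose animation from the curl value.
        }
    }

    void OnDisable()
    {
        m_TriggerCurl.Disable();
        m_TriggerProximity.Disable();
    }
}
```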

---
uid: openxr-meta-quest-pro-touch-controller-profile
---
# Meta Quest Pro Touch Controller Profile
Enables the OpenXR interaction profile for Meta Quest Pro controllers and exposes the `<QuestProTouchController>` device layout within the [Unity Input System](xref:input-system-index).
## Available controls
The following table outlines the mapping between the OpenXR paths and Unity's implementation:
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
|`/input/thumbstick`| thumbstick | Vector2 |
|`/input/squeeze/value`| grip | Float |
|`/input/squeeze/value`| gripPressed | Boolean (float cast to Boolean) |
|`/input/menu/click`| menu (Left Hand Only)| Boolean |
|`/input/system/click`| menu (Right Hand Only)| Boolean |
|`/input/a/click`| primaryButton (Right Hand Only) | Boolean |
|`/input/a/touch`| primaryTouched (Right Hand Only) | Boolean |
|`/input/b/click`| secondaryButton (Right Hand Only) | Boolean |
|`/input/b/touch`| secondaryTouched (Right Hand Only) | Boolean |
|`/input/x/click`| primaryButton (Left Hand Only) | Boolean |
|`/input/x/touch`| primaryTouched (Left Hand Only) | Boolean |
|`/input/y/click`| secondaryButton (Left Hand Only) | Boolean |
|`/input/y/touch`| secondaryTouched (Left Hand Only) | Boolean |
|`/input/trigger/value`| trigger | Float |
|`/input/trigger/value`| triggerPressed | Boolean (float cast to Boolean) |
|`/input/trigger/touch`| triggerTouched| Boolean (float cast to Boolean) |
|`/input/thumbstick/click`| thumbstickClicked | Boolean |
|`/input/thumbstick/touch`| thumbstickTouched | Boolean |
|`/input/thumbrest/touch`| thumbrestTouched | Boolean |
|`/input/grip/pose` | devicePose | Pose |
|`/input/aim/pose` | pointer | Pose |
|`/input/stylus_fb/force` | stylusForce | Float |
|`/input/trigger/curl_fb` | triggerCurl | Float |
|`/input/trigger/slide_fb` | triggerSlide | Float |
|`/input/trigger/proximity_fb` | triggerProximity | Boolean |
|`/input/thumb_fb/proximity_fb` | thumbProximity | Boolean |
|`/output/haptic` | haptic | Vibrate |
|`/output/trigger_haptic_fb` | hapticTrigger | Vibrate |
|`/output/thumb_haptic_fb` | hapticThumb | Vibrate |
| Unity Layout Only | isTracked | Flag Data |
| Unity Layout Only | trackingState | Flag Data |
| Unity Layout Only | devicePosition | Vector3 |
| Unity Layout Only | deviceRotation | Quaternion |
[!include[](snippets/unity-layout.md)]
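The Quest Pro controller exposes localized haptic outputs in addition to the main `haptic` control. The following sketch (class name and amplitudes are illustrative) sends impulses to both, assuming `OpenXRInput.SendHapticImpulse` from the OpenXR plug-in input API:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.OpenXR.Input;

public class QuestProHaptics : MonoBehaviour
{
    InputAction m_MainHaptic;
    InputAction m_TriggerHaptic;

    void OnEnable()
    {
        m_MainHaptic = new InputAction(
            binding: "<QuestProTouchController>{RightHand}/haptic");
        m_TriggerHaptic = new InputAction(
            binding: "<QuestProTouchController>{RightHand}/hapticTrigger");
        m_MainHaptic.Enable();
        m_TriggerHaptic.Enable();
    }

    // Whole-controller vibration.
    public void PulseBody() =>
        OpenXRInput.SendHapticImpulse(m_MainHaptic, amplitude: 0.8f, duration: 0.1f);

    // Localized actuator under the trigger.
    public void PulseTrigger() =>
        OpenXRInput.SendHapticImpulse(m_TriggerHaptic, amplitude: 0.5f, duration: 0.05f);

    void OnDisable()
    {
        m_MainHaptic.Disable();
        m_TriggerHaptic.Disable();
    }
}
```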

---
uid: openxr-microsoft-hand-interaction
---
# Microsoft Hand Interaction
Unity OpenXR provides support for the HoloLens 2 hand interaction profile. This layout inherits from `<XRController>`, so bindings that use `<XRController>` paths available on this device (for example, `<XRController>/devicePosition`) will bind correctly.
This interaction profile does not provide hand mesh or hand rig data. These will be added in the future.
For more information about the Microsoft Hand Interaction extension, refer to the [OpenXR Specification](https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#XR_MSFT_hand_interaction).
## Available controls
The following table outlines the mapping between the OpenXR paths and Unity's implementation:
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
|`/input/select/click`| select | Boolean |
|`/input/squeeze/value` | squeeze | Float |
|`/input/grip/pose` | devicePose | Pose |
|`/input/aim/pose` | pointer | Pose |
| Unity Layout Only | isTracked | Flag Data |
| Unity Layout Only | trackingState | Flag Data |
| Unity Layout Only | devicePosition | Vector3 |
| Unity Layout Only | deviceRotation | Quaternion |
| Unity Layout Only | pointerPosition | Vector3 |
| Unity Layout Only | pointerRotation | Quaternion |
[!include[](snippets/unity-layout.md)]
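Because this layout inherits from `<XRController>`, you can bind against the base layout's paths. The following hypothetical component reacts to the air-tap `select` gesture and reads the `squeeze` value:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class HoloLensHandReader : MonoBehaviour
{
    InputAction m_Select;
    InputAction m_Squeeze;

    void OnEnable()
    {
        // Bindings against the base <XRController> layout resolve on this
        // device because the hand layout derives from it.
        m_Select = new InputAction(type: InputActionType.Button,
            binding: "<XRController>{RightHand}/select");
        m_Select.performed += _ => Debug.Log("Air tap");
        m_Squeeze = new InputAction(binding: "<XRController>{RightHand}/squeeze");
        m_Select.Enable();
        m_Squeeze.Enable();
    }

    void Update()
    {
        float grabStrength = m_Squeeze.ReadValue<float>();
        // Use the grab strength, for example to scale a held object.
    }

    void OnDisable()
    {
        m_Select.Disable();
        m_Squeeze.Disable();
    }
}
```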

---
uid: openxr-microsoft-mixed-reality-motion-controller-profile
---
# Microsoft Mixed Reality Motion Controller Profile
Enables the OpenXR interaction profile for the Microsoft Mixed Reality Motion controller and exposes the `<WMRSpatialController>` device layout within the [Unity Input System](xref:input-system-index).
For more information about the Microsoft Mixed Reality Motion Controller interaction profile, refer to the [OpenXR Specification](https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#_microsoft_mixed_reality_motion_controller_profile).
## Available controls
The following table outlines the mapping between the OpenXR paths and Unity's implementation:
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
|`/input/thumbstick`| joystick | Vector2 |
|`/input/trackpad`| touchpad | Vector2 |
|`/input/squeeze/click`| grip | Float (Boolean cast to float) |
|`/input/squeeze/click`| gripPressed | Boolean |
|`/input/menu/click`| menu | Boolean |
|`/input/trigger/value`| trigger | Float |
|`/input/trigger/value`| triggerPressed | Boolean (float cast to Boolean) |
|`/input/thumbstick/click`| joystickClicked | Boolean |
|`/input/trackpad/click`| touchpadClicked | Boolean |
|`/input/trackpad/touch`| touchpadTouched | Boolean |
|`/input/grip/pose` | devicePose | Pose |
|`/input/aim/pose` | pointer | Pose |
|`/output/haptic` | haptic | Vibrate |
| Unity Layout Only | isTracked | Flag Data |
| Unity Layout Only | trackingState | Flag Data |
| Unity Layout Only | devicePosition | Vector3 |
| Unity Layout Only | deviceRotation | Quaternion |
| Unity Layout Only | pointerPosition | Vector3 |
| Unity Layout Only | pointerRotation | Quaternion |
[!include[](snippets/unity-layout.md)]
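Note that this layout names the thumbstick `joystick` and the trackpad `touchpad`, unlike most other profiles. The following hypothetical component reads both from the left controller:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class WMRInputReader : MonoBehaviour
{
    InputAction m_Joystick;
    InputAction m_Touchpad;

    void OnEnable()
    {
        m_Joystick = new InputAction(binding: "<WMRSpatialController>{LeftHand}/joystick");
        m_Touchpad = new InputAction(binding: "<WMRSpatialController>{LeftHand}/touchpad");
        m_Joystick.Enable();
        m_Touchpad.Enable();
    }

    void Update()
    {
        Vector2 stick = m_Joystick.ReadValue<Vector2>();
        Vector2 pad = m_Touchpad.ReadValue<Vector2>();
        // Use the stick for locomotion and the pad for snap turns, for example.
    }

    void OnDisable()
    {
        m_Joystick.Disable();
        m_Touchpad.Disable();
    }
}
```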

---
uid: openxr-multiview-render-regions
---
# Multiview Render Regions
The Multiview Render Regions feature is an optimization technique that prevents processing on areas of the screen that are not visible to the user.
For a detailed explanation of Multiview Render Regions, refer to [Multiview Render Regions](https://docs.unity3d.com/6000.1/Documentation/Manual/xr-multiview-render-regions.html) in the Unity Manual.
## Prerequisites
To enable this feature, you will need the following:
* Unity 6.1 or newer.
* Ensure that the Vulkan API is enabled. This feature is not available on other graphics APIs at this point in time.
## Enable Multiview Render Regions
To enable the Multiview Render Regions feature:
1. Open the **OpenXR** section of **XR Plug-in Management** (menu: **Edit** > **Project Settings** > **XR Plug-in Management** > **OpenXR**).
2. Under **All Features**, enable **Meta Quest Support**.
3. Use the **Gear** icon to open **Meta Quest Support** settings.
4. Under **Rendering Settings**, enable **Optimize Multiview Render Regions**.
![The Optimize Multiview Render Regions feature is enabled in the OpenXR Rendering Settings.](../images/multiview-render-regions.png)<br/>*Enable the Optimize Multiview Render Regions feature in Rendering Settings.*

---
uid: openxr-oculus-touch-controller-profile
---
# Oculus Touch Controller Profile
Enables the OpenXR interaction profile for Oculus Touch controllers and exposes the `<OculusTouchController>` device layout within the [Unity Input System](https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/).
For more information about the Oculus Touch interaction profile, refer to the [OpenXR Specification](https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#_oculus_touch_controller_profile).
## Available controls
The following table outlines the mapping between the OpenXR paths and Unity's implementation:
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
|`/input/thumbstick`| thumbstick | Vector2 |
|`/input/squeeze/value`| grip | Float |
|`/input/squeeze/value`| gripPressed | Boolean (float cast to Boolean) |
|`/input/menu/click`| menu (Left Hand Only)| Boolean |
|`/input/system/click`| menu (Right Hand Only)| Boolean |
|`/input/a/click`| primaryButton (Right Hand Only) | Boolean |
|`/input/a/touch`| primaryTouched (Right Hand Only) | Boolean |
|`/input/b/click`| secondaryButton (Right Hand Only) | Boolean |
|`/input/b/touch`| secondaryTouched (Right Hand Only) | Boolean |
|`/input/x/click`| primaryButton (Left Hand Only) | Boolean |
|`/input/x/touch`| primaryTouched (Left Hand Only) | Boolean |
|`/input/y/click`| secondaryButton (Left Hand Only) | Boolean |
|`/input/y/touch`| secondaryTouched (Left Hand Only) | Boolean |
|`/input/trigger/value`| trigger | Float |
|`/input/trigger/value`| triggerPressed | Boolean (float cast to Boolean) |
|`/input/trigger/touch`| triggerTouched| Boolean (float cast to Boolean) |
|`/input/thumbstick/click`| thumbstickClicked | Boolean |
|`/input/thumbstick/touch`| thumbstickTouched | Boolean |
|`/input/grip/pose` | devicePose | Pose |
|`/input/aim/pose` | pointer | Pose |
|`/output/haptic` | haptic | Vibrate |
| Unity Layout Only | isTracked | Flag Data |
| Unity Layout Only | trackingState | Flag Data |
| Unity Layout Only | devicePosition | Vector3 |
| Unity Layout Only | deviceRotation | Quaternion |
[!include[](snippets/unity-layout.md)]
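As a usage sketch, the following hypothetical component reads the analog trigger and listens for the `primaryButton` control, which resolves to A on the right hand and X on the left hand per the table above:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class TouchControllerReader : MonoBehaviour
{
    InputAction m_Trigger;
    InputAction m_PrimaryButton;

    void OnEnable()
    {
        m_Trigger = new InputAction(
            binding: "<OculusTouchController>{RightHand}/trigger");
        m_PrimaryButton = new InputAction(type: InputActionType.Button,
            binding: "<OculusTouchController>{RightHand}/primaryButton");
        m_PrimaryButton.performed += _ => Debug.Log("A pressed");
        m_Trigger.Enable();
        m_PrimaryButton.Enable();
    }

    void Update()
    {
        float trigger = m_Trigger.ReadValue<float>();
        // Use the analog trigger value, for example to scale grab strength.
    }

    void OnDisable()
    {
        m_Trigger.Disable();
        m_PrimaryButton.Disable();
    }
}
```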

---
uid: openxr-palm-pose-interaction
---
# Palm Pose Interaction
Unity OpenXR provides support for the Palm Pose extension specified by Khronos. Use this layout to retrieve the pose data that the extension returns. This extension also adds a new input component path using the `palm_ext` pose identifier to existing interaction profiles when active.
Enables the OpenXR interaction feature for Palm Pose Interaction and exposes the `<PalmPose>` layout within the [Unity Input System](https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/).
For more information about the Palm Pose extension, refer to the [OpenXR Specification](https://registry.khronos.org/OpenXR/specs/1.0/html/xrspec.html#XR_EXT_palm_pose).
## Available controls
The following table outlines the mapping between the OpenXR paths and Unity's implementation:
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
| `/input/palm_ext/pose` | palmPose | Pose |
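As a usage sketch, the following hypothetical component attaches its GameObject to the left palm by reading the position and rotation components of `palmPose` (the standard `{LeftHand}` usage is assumed):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class PalmPoseReader : MonoBehaviour
{
    InputAction m_PalmPosition;
    InputAction m_PalmRotation;

    void OnEnable()
    {
        m_PalmPosition = new InputAction(binding: "<PalmPose>{LeftHand}/palmPose/position");
        m_PalmRotation = new InputAction(binding: "<PalmPose>{LeftHand}/palmPose/rotation");
        m_PalmPosition.Enable();
        m_PalmRotation.Enable();
    }

    void Update()
    {
        // Follow the palm, for example to anchor a hand menu.
        transform.SetPositionAndRotation(
            m_PalmPosition.ReadValue<Vector3>(),
            m_PalmRotation.ReadValue<Quaternion>());
    }

    void OnDisable()
    {
        m_PalmPosition.Disable();
        m_PalmRotation.Disable();
    }
}
```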

---
uid: openxr-performance-settings
---
# XR performance settings
The OpenXR performance settings feature lets you provide performance hints to an OpenXR runtime and allows you to get notification when an important aspect of device performance changes.
XR devices must balance power and thermal state against performance to achieve an optimal user experience. You can provide performance hints to an OpenXR runtime to help it make the best tradeoffs for the task your application is running. For example, if your application is displaying a scene consisting of a loading graphic or simple menu without significant rendering requirements, you could tell the runtime to operate at a lower power level to save energy and thermal capacity for a later, more complex scene. Likewise, if you have a sequence that is very demanding, you could tell the runtime to operate at max capacity for a short time. You can provide hints separately for the CPU and the GPU. Refer to [Performance level hints](#performance-settings-level-hints) for more information.
You can subscribe to notifications that report changes in important aspects of device performance. These aspects include compositing, rendering, and thermal capacity. The performance notifications inform you when one of these performance categories changes between the normal, warning, and impaired states. The OpenXR runtime reports performance changes separately for the CPU and GPU. Your application can respond to these notifications by reducing the application workload. Refer to [Performance notifications](#performance-settings-notifications) for more information.
Refer to [Enable the performance settings OpenXR feature](#enable-performance-feature) for instructions about enabling the feature in your project.
> [!NOTE]
> The target XR platform must support the OpenXR "XR\_EXT\_performance\_settings" extension. Otherwise, the XR Performance Settings won't be activated at runtime. Be sure to confirm that your target platform supports the extension.
For more information about the OpenXR Performance Settings extension, refer to the [OpenXR Specification](https://registry.khronos.org/OpenXR/specs/1.0/html/xrspec.html#XR_EXT_performance_settings).
<a name="enable-performance-feature"></a>
## Enable the performance settings OpenXR feature
To use the performance settings API, you must enable it in the **OpenXR Project Settings**:
1. Open the **Project Settings** window (menu: **Edit > Project Settings**).
2. Expand the **XR Plug-in Management** section, if necessary.
3. Select the **OpenXR** area under **XR Plug-in Management**.
4. Choose the **Platform Build Target** from the tabs along the top of the **OpenXR** settings page.
5. Check the box next to **XR Performance Settings** (in the **All Features** group) to enable the feature.
6. Repeat for other platforms, if desired.
![XR Performance Settings in the OpenXR feature list.](../images/performance-settings-feature.png)
<a name="performance-settings-level-hints"></a>
## Performance level hints
The performance level hints help an OpenXR runtime balance the tradeoffs it must make between energy usage, thermal state, and your application's performance needs. You can provide separate hints for the CPU and GPU hardware domains.
You can set the following levels as hints about your application's current performance needs:
- **Power Savings**: At this level, the OpenXR runtime prioritizes energy conservation over consistent frame rendering and latency.
- **Sustained Low**: At this level, the OpenXR runtime tries to balance resource usage and high performance, prioritizing the former. Energy savings are prioritized over low latency, but the runtime tries to maintain a relatively stable frame rendering and XR compositing flow, as long as it's thermally sustainable.
- **Sustained High**: This level is the default hint, if none is set by your application. At this level, the OpenXR runtime prioritizes consistent rendering, XR compositing and latency in a thermal range that is sustainable.
- **Boost**: At this level, the OpenXR runtime allows maximum resource usage without trying to keep the device operating in a thermally sustainable range. If the device hardware exceeds its thermal limits, the runtime must throttle performance, so you should limit use of this performance level to short durations (less than 30 seconds).
> [!NOTE]
> When setting the Performance Level Hint to **Sustained Low** or **Sustained High** levels, your application can still experience performance throttling. For example, a device might exceed thermal limits due to external circumstances, such as operating in a high-temperature environment or for an extended amount of time. You can use [Performance Notifications](#performance-settings-notifications) to detect such impending performance issues and lower application workload before the runtime must impose throttling.
Use the static method [XrPerformanceSettingsFeature.SetPerformanceLevelHint](xref:UnityEngine.XR.OpenXR.Features.Extensions.PerformanceSettings.XrPerformanceSettingsFeature.SetPerformanceLevelHint*) to set a performance level hint for a specific CPU or GPU performance domain.
By choosing the lowest level that gives your application good performance, you can extend the device's battery life and avoid overheating. For example, you could set the **Power Savings** hint during loading screens. For normal operation, you could set **Sustained Low** for less demanding sequences, and **Sustained High** for more complex scenes. When appropriate, you could set the **Boost** hint to achieve maximum performance for a brief time.
Here are some examples of how you might use performance level hints in different scenarios:
- The application shows static pictures without any interactive elements:
```c#
XrPerformanceSettingsFeature.SetPerformanceLevelHint(PerformanceDomain.CPU, PerformanceLevelHint.PowerSavings);
XrPerformanceSettingsFeature.SetPerformanceLevelHint(PerformanceDomain.GPU, PerformanceLevelHint.PowerSavings);
// Show a picture sequence and continue with app loading
```
- The application is in a normal state, with simple interactivity and low complexity scenes:
```c#
XrPerformanceSettingsFeature.SetPerformanceLevelHint(PerformanceDomain.CPU, PerformanceLevelHint.SustainedLow);
XrPerformanceSettingsFeature.SetPerformanceLevelHint(PerformanceDomain.GPU, PerformanceLevelHint.SustainedLow);
// Run regular app process
```
- The application needs to render a complex scene with multiple, interactive composition layers:
```c#
XrPerformanceSettingsFeature.SetPerformanceLevelHint(PerformanceDomain.GPU, PerformanceLevelHint.SustainedHigh);
// Load and process action scene
```
- The application needs to process a large amount of data on device:
```c#
XrPerformanceSettingsFeature.SetPerformanceLevelHint(PerformanceDomain.CPU, PerformanceLevelHint.Boost);
// Run complex logic
```
<a name="performance-settings-notifications"></a>
## Performance notifications
The OpenXR runtime can inform your application about performance changes, so you can react and adjust the current workload, if desired.
Performance is categorized as one of the following performance states:
- **Normal**: No performance degradation is expected and the app can run as usual.
- **Warning**: The OpenXR runtime is expecting performance degradation if device conditions do not improve.
- **Impaired**: The device is suffering performance degradation.
An OpenXR runtime monitors performance of the device CPU and GPU across a set of key performance areas, called subdomains. These subdomains include:
* [Compositing](#compositing-subdomain): A runtime task that takes all rendered layers and combines them for display.
* [Rendering](#rendering-subdomain): Your application's timely submission of frames to the compositor.
* [Thermal](#thermal-subdomain): The hardware device's temperature compared to its thermal limits.
If the performance state of a subdomain changes, the [XrPerformanceSettingsFeature](xref:UnityEngine.XR.OpenXR.Features.Extensions.PerformanceSettings.XrPerformanceSettingsFeature.OnXrPerformanceChangeNotification) object dispatches an event that describes the change. Refer to [Subscribe to performance notifications](#subscribe) for information about handling this event.
When the performance state of a domain and subdomain changes for the worse, you should take mitigation measures that reduce application workload in that area. If you do not, the OpenXR runtime might impose its own, more drastic, measures.
After a sufficient period (determined by the runtime) of improved performance, a performance state can change from impaired to warning, and from warning to normal. Leaving the impaired state means that the runtime has stopped its own mitigation measures.
> [!TIP]
> Performance impairment may not be caused directly by your application, but by situational causes, such as high room temperature or a large amount of background tasks running on the device. Even so, the better you can tune your app to avoid performance degradation, the better the user experience will be across variable operating conditions.
<a name="compositing-subdomain"></a>
### Compositing subdomain
Compositing is a task performed by the OpenXR runtime to combine submitted frames for final display. The runtime must share CPU and GPU resources with your application while performing this task. If the runtime can't complete compositing in time, visual artifacts can occur. When in the impaired state, the OpenXR runtime might take actions that interfere with your application, such as limiting frame rate, ignoring submitted layers, or even shutting down the application.
* In the **Normal** state, the compositor can consistently finish with sufficient margin.
* In the **Warning** state, the compositor is finishing its task in time, but the margin is considered insufficient.
* In the **Impaired** state, the compositor cannot finish its task in time.
<a name="rendering-subdomain"></a>
### Rendering subdomain
Your application's rendering pipeline must submit rendered layers to the compositor for display. If frames aren't submitted in time for the compositor to use them, dropped frames and other artifacts can occur. In the impaired state, the OpenXR runtime might take actions that interfere with your application, such as telling the user that the application is unresponsive or displaying a tracking environment to maintain user orientation.
* In the **Normal** state, your application is consistently submitting rendered frames to the compositor in time to be used.
* In the **Warning** state, at least one layer is regularly submitted past the compositor deadline.
* In the **Impaired** state, late submission of frames has reached a critical threshold.
<a name="thermal-subdomain"></a>
### Thermal subdomain
XR devices must stay within a safe operating temperature. When a device reaches its thermal limits, the OpenXR runtime must take drastic measures to lower heat generation. These mitigations can severely impact the user experience.
* In the **Normal** state, the device is operating within a sustainable thermal range.
* In the **Warning** state, the OpenXR runtime anticipates that the device will soon overheat under the current load.
* In the **Impaired** state, the OpenXR runtime is taking measures, such as throttling performance, to reduce the device temperature.
<a name="subscribe"></a>
### Subscribe to performance notifications
Subscribe to the [XrPerformanceSettingsFeature.OnXrPerformanceChangeNotification](xref:UnityEngine.XR.OpenXR.Features.Extensions.PerformanceSettings.XrPerformanceSettingsFeature.OnXrPerformanceChangeNotification) event to receive an event when the performance state of a domain and subdomain changes.
A notification event provides the following data (as a [PerformanceChangeNotification](xref:UnityEngine.XR.OpenXR.Features.Extensions.PerformanceSettings.PerformanceChangeNotification) struct):
* Old performance state (normal, warning, or impaired)
* New performance state (normal, warning, or impaired)
* Affected domain (CPU or GPU)
* Affected subdomain (compositing, rendering, or thermal)
The following code snippet illustrates how you can subscribe to the performance notification event and handle performance changes (by calling your own application-defined functions that modify performance):
```c#
// Subscribe to the performance notification event at runtime
XrPerformanceSettingsFeature.OnXrPerformanceChangeNotification += OnPerformanceChangeNotification;
// Process the notification when it happens
void OnPerformanceChangeNotification(PerformanceChangeNotification notification)
{
switch (notification.toLevel)
{
case PerformanceNotificationLevel.Normal:
            // Let the application run with normal execution
RestorePerformance(notification.domain, notification.subDomain);
break;
case PerformanceNotificationLevel.Warning:
// Reduce workload of low priority tasks
LimitPerformance(notification.domain, notification.subDomain);
break;
case PerformanceNotificationLevel.Impaired:
// Execute app with the minimum required processes for the given domain and subdomain
ReducePerformance(notification.domain, notification.subDomain);
break;
}
}
```

View File

@@ -0,0 +1,2 @@
> [!NOTE]
> Some Unity controls don't correspond to an OpenXR path. Unity expresses [Pose](https://docs.unity3d.com/Documentation/ScriptReference/Pose.html) data as individual elements, whereas OpenXR expresses poses as a group of data. These additional controls contain `Unity Layout Only` in the **OpenXR Path** column. You can use this additional Pose data for more fine-grained controls. To learn more, refer to [Pose data](xref:openxr-input#pose-data).

View File

@@ -0,0 +1,44 @@
---
uid: openxr-valve-index-controller-profile
---
# Valve Index Controller Profile
Enables the OpenXR interaction profile for the Valve Index controller and exposes the `<ValveIndexController>` device layout within the [Unity Input System](https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/).
For more information about the Valve Index interaction profile, refer to the [OpenXR Specification](https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#_valve_index_controller_profile).
## Available controls
The following table outlines the mapping between the OpenXR paths and Unity's implementation:
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
|`/input/system/click`| system | Boolean |
|`/input/system/touch`| systemTouched | Boolean |
|`/input/a/click`| primaryButton | Boolean |
|`/input/a/touch`| primaryTouched | Boolean |
|`/input/b/click`| secondaryButton | Boolean |
|`/input/b/touch`| secondaryTouched | Boolean |
|`/input/squeeze/value`| grip | Float |
|`/input/squeeze/value`| gripPressed | Boolean (cast from float) |
|`/input/squeeze/force`| gripForce | Float |
|`/input/trigger/click`| triggerPressed | Boolean |
|`/input/trigger/value`| trigger | Float |
|`/input/trigger/touch`| triggerTouched | Boolean |
|`/input/thumbstick`| thumbstick | Vector2 |
|`/input/thumbstick/click`| thumbstickClicked | Boolean |
|`/input/thumbstick/touch`| thumbstickTouched | Boolean |
|`/input/trackpad`| trackpad | Vector2 |
|`/input/trackpad/touch`| trackpadTouched | Boolean |
|`/input/trackpad/force`| trackpadForce | Float |
|`/input/grip/pose` | devicePose | Pose |
|`/input/aim/pose` | pointer | Pose |
|`/output/haptic` | haptic | Vibrate |
| Unity Layout Only | isTracked | Flag Data |
| Unity Layout Only | trackingState | Flag Data |
| Unity Layout Only | devicePosition | Vector3 |
| Unity Layout Only | deviceRotation | Quaternion |
| Unity Layout Only | pointerPosition | Vector3 |
| Unity Layout Only | pointerRotation | Quaternion |
[!include[](snippets/unity-layout.md)]

View File

@@ -0,0 +1,5 @@
apiRules:
- exclude:
uidRegex: ^UnityEngine\.XR\.OpenXR\.Samples\..*$
- exclude:
uidRegex: ^UnityEditor\.XR\.OpenXR\.Samples\..*$

BIN
Packages/com.unity.xr.openxr/Documentation~/images/gear.png (Stored with Git LFS) Normal file

Binary file not shown.


View File

@@ -0,0 +1,210 @@
---
uid: openxr-manual
---
# OpenXR Plugin
OpenXR is an open, royalty-free standard developed by Khronos that aims to simplify AR/VR development by allowing developers to seamlessly target a wide range of AR/VR devices.
## Requirements
This version of OpenXR is compatible with the following versions of the Unity Editor:
* 2021.3 LTS+
## Runtimes
Unity's OpenXR plug-in should work with any device that supports conformant OpenXR runtimes. The following is a list of known runtimes that you may want to target:
|**Runtime**|**Build target**|**Preferred Graphics API**|**Feature Parity**|**Known Limitations**|
|---|---|---|---|---|
|Windows Mixed Reality|Windows 64-bit|DX11|Full feature parity via [Mixed Reality OpenXR Plugin for Unity](https://learn.microsoft.com/en-us/windows/mixed-reality/develop/unity/mixed-reality-openxr-plugin)||
|HoloLens 2|UWP arm64|DX11|Full feature parity via [Mixed Reality OpenXR Plugin for Unity](https://learn.microsoft.com/en-us/windows/mixed-reality/develop/unity/mixed-reality-openxr-plugin)||
|Oculus PC + Link|Windows 64-bit|DX11|HMD + Controllers|Oculus Integration package features not available|
|Meta Quest|Android arm64|Vulkan|HMD + Controllers via [Meta Quest Support Feature](./features/metaquest.md)||
|Magic Leap 2|Android x64|Vulkan|Full feature parity via [Magic Leap Unity Overview OpenXR](https://developer-docs.magicleap.cloud/docs/guides/unity/unity-overview-openxr/)||
|All other conformant runtimes (e.g. SteamVR)|Windows 64-bit|DX11|HMD + Controllers|Given the unbounded combinations of possible hardware/software configurations, Unity is unable to test or guarantee that all configurations will work optimally.<br><br>SteamVR Plugin features not available|
To help the community as a whole, Unity will continue to submit any runtime issues, and contribute conformance tests and specification changes to the Khronos working group.
## OpenXR Loader
Each release of the OpenXR Plugin is linked against the Khronos Group [OpenXR-SDK](https://github.com/KhronosGroup/OpenXR-SDK/releases). This repository contains the authoritative public OpenXR headers, source code and build scripts used to generate the OpenXR Loader dll/so/libraries.
This release is linked against the OpenXR-SDK version [1.1.45](https://github.com/KhronosGroup/OpenXR-SDK/releases/tag/release-1.1.45).
Additionally, you can access the current OpenXR Runtime through the scripting API via `OpenXRRuntime.version`. This returns a string representing the semantic versioning of the current OpenXR Runtime.
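For example, a minimal sketch of logging the active runtime details at startup via the `OpenXRRuntime` static class (the component name is illustrative):

```c#
using UnityEngine;
using UnityEngine.XR.OpenXR;

public class RuntimeInfoLogger : MonoBehaviour
{
    void Start()
    {
        // OpenXRRuntime exposes details about the active runtime
        // once OpenXR has initialized.
        Debug.Log($"OpenXR runtime: {OpenXRRuntime.name}, " +
                  $"runtime version: {OpenXRRuntime.version}, " +
                  $"API version: {OpenXRRuntime.apiVersion}");
    }
}
```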
## Considerations before porting to OpenXR
Unity does not yet provide out-of-the-box solutions to the following when using OpenXR, however there may be platform-specific plugins or third party solutions available:
* Controller / hand models
* Finger tracking
* Augmented reality / mixed reality features
* Composition layers
* Overlays
* Foveated rendering
## Getting started
To enable OpenXR in your project, follow the steps below:
1. Open the **Project Settings** window (menu: **Edit &gt; Project Settings**), and select **XR Plug-in Management**.
2. Enable the **OpenXR** option and any **Feature Groups** for the runtimes you intend to target.
3. In the **OpenXR > Features** tab, select the interaction profile of the device you are testing with.
4. In the **OpenXR** tab, make sure the current active runtime is set to the hardware you are testing with. See [Choose an OpenXR runtime to use in Play mode](xref:openxr-project-config#openxr-runtime) for more information.
See [Project configuration](xref:openxr-project-config) for additional information about setting up your project to use an OpenXR plug-in.
## Project validation
Project validation is a feature of the Unity OpenXR package that assesses the configuration correctness of your project for the platform you plan to build for. Unity raises errors and warnings at build time if your project is not compatible with OpenXR.
For more information on how project validation works in OpenXR, see [Project validation](xref:openxr-project-config#project-validation).
## Troubleshooting
If you experience an issue, please [file a bug](https://unity3d.com/unity/qa/bug-reporting). When you do, please also check the [log file](https://docs.unity3d.com/2021.3/Documentation/Manual/LogFiles.html) to see if Unity supports the combination of OpenXR runtimes and features you are using. The log file will provide additional guidance.
Unity generates a diagnostic log in either the Player or Editor log, depending on where you run the application. The diagnostic log starts with `==== Start Unity OpenXR Diagnostic Report ====` and ends with `==== End Unity OpenXR Diagnostic Report ====` log entries. It contains information about your application, Unity version, OpenXR runtime, OpenXR Extensions, and other aspects that can help diagnose issues.
The most important part of the diagnostic log is the section marked `==== OpenXR Support Details ====`. This section provides some simple information on what parts of your application Unity supports, what it might not support, and what to do before submitting an issue or requesting assistance.
### Examples
#### Diagnostic log OpenXR Support Details section when running with Unity supported items
```
[XR] [58328] [12:54:14.788][Info ] ==== OpenXR Support Details ====
[XR] [58328] [12:54:14.788][Info ] OpenXR Runtime:
[XR] [58328] [12:54:14.788][Info ] <Some company>, which is a Unity supported partner
[XR] [58328] [12:54:14.788][Info ] Unity OpenXR Features:
[XR] [58328] [12:54:14.788][Info ] Android , ControllerSampleValidation Standalone, Standalone : Unity
[XR] [58328] [12:54:14.788][Info ] Unity Support:
[XR] [58328] [12:54:14.788][Info ] Unity supports the runtime and Unity OpenXR Features above. When requesting assistance, please copy the OpenXR section from ==== Start Unity OpenXR Diagnostic Report ==== to ==== End Unity OpenXR Diagnostic Report ==== to the bug or forum post.
```
#### Diagnostic log OpenXR Support Details section when running with items not supported by Unity
```
[XR] [58328] [12:54:14.788][Info ] ==== OpenXR Support Details ====
[XR] [58328] [12:54:14.788][Info ] OpenXR Runtime:
[XR] [58328] [12:54:14.788][Info ] <Some company>, which is not a Unity supported partner
[XR] [58328] [12:54:14.788][Info ] Unity OpenXR Features:
[XR] [58328] [12:54:14.788][Info ] Android , ControllerSampleValidation Standalone, Standalone : Unity
[XR] [58328] [12:54:14.788][Info ] Unity Support:
[XR] [58328] [12:54:14.788][Info ] Unity doesn't support some aspects of the runtime and Unity OpenXR Features above. Please attempt to reproduce the issue with only Unity supported aspects before submitting the issue to Unity.
```
## Known issues
* For projects targeting HoloLens 2 that are using Out of the Box Unity OpenXR support, **Project Settings &gt; Player &gt; Resolution and Presentation &gt; Run in Background** must be enabled. For projects that are using the Microsoft OpenXR extended support package this is not required.
* An issue with an invalid stage space during startup may cause problems with the XR Origin component from the `com.unity.xr.interaction.toolkit` package, or the camera offset component in the `com.unity.xr.legacyinputhelpers` package. These packages will be updated shortly to contain fixes for this issue. Until then the workaround is to use the `Floor` Device Tracking Option setting.
* OpenXR doesn't currently provide a way to get Acceleration and AngularAcceleration values on input devices, so these values will always be zero.
## Upgrading a project to use OpenXR
OpenXR is a plug-in in Unity's [XR plug-in architecture](https://docs.unity3d.com/2020.1/Documentation/Manual/XRPluginArchitecture.html). Unity recommends using the [XR Interaction Toolkit](https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@1.0/manual/index.html) for input and interactions. If you are using any platform-vendor specific toolkits, please see the platform-vendor specific documentation on how to integrate those toolkits with OpenXR. Vendors are still in the process of adding OpenXR support to their toolkits; please make sure you check supported status before enabling OpenXR in your project.
The core steps to upgrade a project to use OpenXR are the setup instructions at the top. In addition, you might need to do some of the following:
1. **Update to the latest [supported version](#requirements) of Unity before implementing OpenXR**. Some APIs that your project relies on might have been changed or removed, and this will let you easily distinguish which changes are a result of the new Unity version and which come from OpenXR.
2. **Disable XR SDK plug-ins that are also supported by OpenXR.** There is a good chance that OpenXR supports your target device, so enabling a competing XR SDK plug-in may cause unexpected behavior. In **Project Settings &gt; XR Plug-in Management**, uncheck the plug-in provider(s) for your target device(s).
3. **Update your input code** if your project doesn't use the new Input System. For more information, see the [Quick start guide](https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/QuickStartGuide.html) in the Input System package documentation.
4. **Change quality settings.** It is possible that switching over to the OpenXR provider will affect your application's visual quality. You can adjust the quality settings of your project (menu: **Edit &gt; Project Settings &gt; Player**, then select the **Quality** tab) to try getting your old visuals back.
## OpenXR concepts
### Understanding OpenXR's action-based input
See [OpenXR Input Documentation](./input.md).
<a name="openxr-features"></a>
### OpenXR features
Features allow third parties to extend Unity's base support for OpenXR. They bring the functionality of OpenXR spec extensions to the Unity ecosystem, but Unity is not involved in their development.
Features might integrate into the [Unity XR Plug-in framework](xref:XRPluginArchitecture) to provide data to other XR subsystems (for example, providing meshing or plane data for AR use cases which are not yet standardized in OpenXR 1.0).
Features are a collection of Unity Assets that can be distributed through the Package Manager, Asset Store, or any other mechanism.
#### General features
* Mock Runtime (**Note:** Enabling this will take over whatever current OpenXR runtime you might be using.)
* [XR performance settings](./features/performance-settings.md)
#### Interaction profile features
* [Eye Gaze Interaction](./features/eyegazeinteraction.md)
* [Microsoft Hand Interaction](./features/microsofthandinteraction.md)
* [HTC Vive Controller](./features/htcvivecontrollerprofile.md)
* [Khronos Simple Controller](./features/khrsimplecontrollerprofile.md)
* [Microsoft Motion Controller](./features/microsoftmotioncontrollerprofile.md)
* [Oculus Touch Controller](./features/oculustouchcontrollerprofile.md)
* [Valve Index Controller](./features/valveindexcontrollerprofile.md)
### Accessing features at runtime via script
You can also access all the settings in the **Features** window through script. The scripting API documentation in this package provides information on all APIs you can use for feature support. The code samples below illustrate some of the common tasks.
#### Iterating over all features
```c#
BuildTargetGroup buildTargetGroup = BuildTargetGroup.Standalone;
FeatureHelpers.RefreshFeatures(buildTargetGroup);
var features = OpenXRSettings.Instance.GetFeatures();
foreach (var feature in features)
{
// Toggle feature on/off
feature.enabled = ...;
}
```
#### Getting a specific feature by type
```c#
var feature = OpenXRSettings.Instance.GetFeature<MockRuntime>();
// Toggle feature on/off
feature.enabled = ...;
```
#### Iterating over all feature groups
Feature groups are an Editor-only concept and as such can only be accessed in the Unity Editor.
```c#
#if UNITY_EDITOR
BuildTargetGroup buildTargetGroup = BuildTargetGroup.Standalone;
var featureSets = OpenXRFeatureSetManager.FeatureSetsForBuildTarget(buildTargetGroup);
foreach(var featureSet in featureSets)
{
var featureSetId = featureSet.featureSetId;
// ...
}
#endif
```
#### Iterating over features in a feature group
Feature groups are an Editor-only concept and as such can only be accessed in the Unity Editor.
```c#
#if UNITY_EDITOR
var featureSet = OpenXRFeatureSetManager.GetFeatureSetWithId(buildTargetGroup, featureSetId); // featureSetId set earlier
var features = FeatureHelpers.GetFeaturesWithIdsForActiveBuildTarget(featureSet.featureIds);
foreach (var feature in features)
{
// ... Do something with the feature.
}
#endif
```
### Implementing a feature
Anyone can add new features. To learn more, see documentation on [how to add a feature to Unity's OpenXR support](features.md).
## References
* https://www.khronos.org/openxr/
* https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html

View File

@@ -0,0 +1,228 @@
---
uid: openxr-input
---
# Input in Unity OpenXR
This page details how to use and configure OpenXR input within Unity.
For information on how to configure Unity to use OpenXR input, see the [Getting Started](#getting-started) section of this document.
## Overview
Initially, Unity will provide a controller-based approach to interfacing with OpenXR. This will allow existing games and applications that are using Unity's [Input System](https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/) or the [Feature API](https://docs.unity3d.com/Manual/xr_input.html) to continue to use their existing input mechanisms.
The Unity OpenXR package provides a set of controller layouts for various devices that you can bind to your actions when using the Unity Input System. For more information, see the [Interaction profile features](./index.md#interaction-profile-features) section.
To use OpenXR Input, you must select the correct interaction profiles features to send to OpenXR. To learn more about OpenXR features in Unity, see the [Interaction profile features](./index.md#interaction-profile-features) page.
Future versions of the Unity OpenXR Package will provide further integration with the OpenXR Input system. For smooth upgrading, Unity recommends that you use the device layouts included with the OpenXR package. These have the '(OpenXR)' suffix in the Unity Input System binding window.
Unity will provide documentation on these features when they become available.
Interaction profiles manifest themselves as device layouts in the Unity [Input System](https://docs.unity3d.com/Packages/com.unity.inputsystem@latest/).
## Getting Started
### Run the sample
The OpenXR package contains a sample named `Controller` that will help you get started using input in OpenXR. To install the `Controller` sample, follow these steps:
1. Open the **Package Manager** window (menu: **Window &gt; Package Manager**).
2. Select the OpenXR package in the list.
3. Expand the **Samples** list on the right.
4. Click the **Import** button next to the `Controller` sample.
This adds a `Samples` folder to your project with a Scene named `ControllerSample` that you can run.
### Locking input to the game window
Versions V1.0.0 to V1.1.0 of the Unity Input System only route data to or from XR devices to the Unity Editor while the Editor is in the **Game** view. To work around this issue, use the [Unity OpenXR Project Validator](xref:openxr-project-config#project-validation) or follow these steps:
* Access the Input System Debugger window (menu: **Window &gt; Analysis &gt; Input Debugger**).
* In the **Options** section, enable the **Lock Input to the Game Window** option.
Unity recommends that you enable the **Lock Input to the Game Window** option from either the [Unity OpenXR Project Validator](xref:openxr-project-config#project-validation) or the Input System Debugger window.
![lock-input-to-game-view](images/lock-input-to-game-view.png)
### Recommendations
To set up input in your project, follow these recommendations:
* Bind to the OpenXR layouts wherever possible.
* Use specific control bindings over usages.
* Avoid generic "any controller" bindings if possible (for example, bindings to `<XRController>`).
* Use action references and avoid inline action definitions.
OpenXR requires that all bindings be attached only once, at application startup. Unity recommends using Input Action Assets, and Input Action References to actions within those assets, so that Unity can present those bindings to OpenXR at application startup.
## Using OpenXR input with Unity
Using OpenXR with Unity is the same as configuring any other input device using the Unity Input System:
1. Decide on what actions and action maps you want to use to describe your gameplay, experience, or menu operations.
2. Create an `Input Action` Asset, or use the one included with the [Sample](#run-the-sample).
3. Add the actions and action maps you defined in step 1 in the `Input Action` Asset you decided to use in step 2.
4. Create bindings for each action.
When using OpenXR, you must either create a "Generic" binding, or use a binding to a device that Unity's OpenXR implementation specifically supports. For a list of these specific devices, see the [Interaction bindings](#interaction-bindings) section.
5. Save your `Input Action` Asset.
6. Ensure your actions and action maps are enabled at runtime.
The [Sample](#run-the-sample) contains a helper script called `Action Asset Enabler` which enables every action within an `Input Action` Asset. If you want to enable or disable specific actions and action maps, you can manage this process yourself.
7. Write code that reads data from your actions.
For more information, see the [Input System](https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/) package documentation, or consult the [Sample](#run-the-sample) to see how it reads input from various actions.
8. Enable the set of Interaction Features that your application uses.
If you want to receive input from OpenXR, the Interaction Features you enable must contain the devices you've created bindings with. For example, a binding to `<WMRSpatialController>{leftHand}/trigger` requires the Microsoft Motion Controller feature to be enabled in order for that binding to receive input. For more information on Interaction Features, see the [Interaction profile features](./index.md#interaction-profile-features) section.
9. Run your application!
You can use the Unity Input System Debugger (menu: **Window &gt; Analysis &gt; Input Debugger**) to troubleshoot any problems with input and actions.
The Input System Debugger can be found under **Window &gt; Analysis &gt; Input Debugger**
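As a sketch of steps 6 and 7 above, the following component enables an action and reads a float value from it each frame (the action reference and its binding are assumptions — assign an action bound to, for example, a controller trigger in the Inspector):

```c#
using UnityEngine;
using UnityEngine.InputSystem;

public class TriggerReader : MonoBehaviour
{
    // Assumed to reference an action in your Input Action Asset,
    // bound to e.g. <XRController>{RightHand}/trigger.
    public InputActionReference triggerAction;

    void OnEnable() => triggerAction.action.Enable();
    void OnDisable() => triggerAction.action.Disable();

    void Update()
    {
        // Read the current trigger value as a float in [0..1].
        float trigger = triggerAction.action.ReadValue<float>();
        if (trigger > 0.5f)
            Debug.Log($"Trigger pulled: {trigger}");
    }
}
```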
## Detailed information
### Unity Input System
Unity requires the use of the Input System package when using OpenXR. Unity automatically installs this package when you install Unity OpenXR Support. For more information, see the Input System package [documentation](https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/Installation.html).
### Interaction Profile Features
Each Interaction Profile Feature contains both the device layout for creating bindings in the Unity Input System and a set of bindings that we send to OpenXR. The OpenXR Runtime will determine which bindings to use based on the set of Interaction Profiles that we send to it.
Unity recommends that developers select only the interaction profiles that they are able to test their experience with.
Selecting an interaction profile from the features menu adds that device to the bindable devices in the Unity Input System. It appears under the **XR Controller** section of the binding options.
![XR Controller Menu](images/xr-controller-input-menu.png)
See [Set the interaction profile](xref:openxr-project-config#interaction-profile) for instructions on setting a profile.
### Mapping between OpenXR paths and Unity bindings
The OpenXR specification details a number of `Interaction Profiles` that you can use to suggest bindings to the OpenXR runtime. Unity uses its own existing XRSDK naming scheme to identify controls and devices and map OpenXR action data to them.
The table below outlines the common mappings between OpenXR paths and Unity XRSDK Control names.
Which controls are available on which devices is covered in the specific device documentation.
| OpenXR Path | Unity Control Name | Type |
|----|----|----|
|`/input/system/click`| system | Boolean |
|`/input/system/touch`| systemTouched | Boolean |
|`/input/select/click`| select | Boolean |
|`/input/menu/click`| menu | Boolean |
|`/input/squeeze/value` | grip | Float |
|`/input/squeeze/click` | gripPressed | Boolean |
|`/input/squeeze/force` | gripForce | Float |
|`/input/trigger/value` | trigger | Float |
|`/input/trigger/click` | triggerPressed | Boolean |
|`/input/trigger/touch` | triggerTouched | Boolean |
|`/input/thumbstick`| joystick | Vector2 |
|`/input/thumbstick/touch`| joystickTouched | Boolean |
|`/input/thumbstick/click`| joystickClicked | Boolean |
|`/input/trackpad`| touchpad | Vector2 |
|`/input/trackpad/touch`| touchpadTouched | Boolean |
|`/input/trackpad/click` | touchpadClicked | Boolean |
|`/input/a/click` | primaryButton | Boolean |
|`/input/a/touch` | primaryTouched | Boolean |
|`/input/b/click` | secondaryButton | Boolean |
|`/input/b/touch` | secondaryTouched | Boolean |
|`/input/x/click` | primaryButton | Boolean |
|`/input/x/touch` | primaryTouched | Boolean |
|`/input/y/click` | secondaryButton | Boolean |
|`/input/y/touch` | secondaryTouched | Boolean |
The Unity controls `touchpad` and `trackpad` are used interchangeably, as are `joystick` and `thumbstick`.
<a id="pose-data"></a>
### Pose data
Unity expresses Pose data as individual elements (for example, position, rotation, velocity, and so on). OpenXR expresses poses as a group of data. Unity has introduced a new type to the Input System called a `Pose` that is used to represent OpenXR poses. The available poses and their OpenXR paths are listed below:
|Pose Mapping| |
|----|----|
|`/input/grip/pose`| devicePose |
|`/input/aim/pose` | pointerPose |
For backwards compatibility, the existing individual controls will continue to be supported when using OpenXR. The mapping between OpenXR pose data and Unity Input System pose data is found below.
|Pose | Pose Element| Binding| Type|
|---|---|---|---|
|`/input/grip/pose`| position | devicePosition | Vector3|
|`/input/grip/pose`| orientation | deviceRotation | Quaternion|
|`/input/aim/pose`| position | pointerPosition | Vector3|
|`/input/aim/pose`| orientation | pointerRotation | Quaternion|
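As a sketch of the backwards-compatible element bindings above, actions bound to `devicePosition` and `deviceRotation` can be read as plain Input System values (the action references and their bindings are assumptions):

```c#
using UnityEngine;
using UnityEngine.InputSystem;

public class DevicePoseReader : MonoBehaviour
{
    // Assumed to be bound to <XRController>{LeftHand}/devicePosition
    // and <XRController>{LeftHand}/deviceRotation respectively.
    public InputActionReference positionAction;
    public InputActionReference rotationAction;

    void OnEnable()
    {
        positionAction.action.Enable();
        rotationAction.action.Enable();
    }

    void Update()
    {
        // Drive this transform from the individual pose elements.
        Vector3 position = positionAction.action.ReadValue<Vector3>();
        Quaternion rotation = rotationAction.action.ReadValue<Quaternion>();
        transform.SetPositionAndRotation(position, rotation);
    }
}
```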
### HMD bindings
To read HMD data from OpenXR, use the existing HMD bindings available in the Unity Input System. Unity recommends binding the `centerEye` action of the `XR HMD` device for HMD tracking. The following image shows the use of `centerEye` bindings with the `Tracked Pose Driver`.
![hmd-config-tpd](images/hmd-config-tpd.png)
OpenXR HMD Data contains the following elements.
- Center Eye
- Device
- Left Eye
- Right Eye
All of the elements expose the following controls:
- position
- rotation
- velocity
- angularVelocity
These are exposed in the Unity Input System through the following bindings. These bindings can be found under the XR HMD menu option when binding actions within the Input System.
- `<XRHMD>/centerEyePosition`
- `<XRHMD>/centerEyeRotation`
- `<XRHMD>/devicePosition`
- `<XRHMD>/deviceRotation`
- `<XRHMD>/leftEyePosition`
- `<XRHMD>/leftEyeRotation`
- `<XRHMD>/rightEyePosition`
- `<XRHMD>/rightEyeRotation`
When using OpenXR the `centerEye` and `device` values are identical.
The HMD position reported by Unity when using OpenXR is calculated from the currently selected Tracking Origin space within OpenXR.
The Unity `Device Tracking Origin` is mapped to `Local Space`.
The Unity `Floor Tracking Origin` is mapped to `Stage Space`.
By default, Unity attempts to attach to the `Stage Space` where possible. To help manage the different tracking origins, use the `XR Origin` from the XR Interaction Toolkit package, or the `Camera Offset` component from the Legacy Input Helpers package.
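If you need to request a specific tracking origin from script rather than through those components, you can ask the XR input subsystem directly. This is a minimal sketch, assuming a Unity version where `SubsystemManager.GetSubsystems` is available; the class name is illustrative:

```csharp
// Hypothetical sketch: request the Floor tracking origin (OpenXR Stage space)
// at startup, falling back to Device (OpenXR Local space) if unavailable.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class TrackingOriginSetup : MonoBehaviour
{
    void Start()
    {
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        foreach (var subsystem in subsystems)
        {
            // Floor maps to the OpenXR Stage space; Device maps to Local space.
            if (!subsystem.TrySetTrackingOriginMode(TrackingOriginModeFlags.Floor))
                subsystem.TrySetTrackingOriginMode(TrackingOriginModeFlags.Device);
        }
    }
}
```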
### Interaction bindings
If you use OpenXR input with controllers or interactions such as eye gaze, Unity recommends that you use bindings from the Device Layouts available with the Unity OpenXR package. The Unity OpenXR package provides the following Layouts via features:
|Device|Layout|Feature|
|-----|--------|----|
|Generic XR controller|`<XRController>`|n/a|
|Generic XR controller w/ rumble support|`<XRControllerWithRumble>`|n/a|
|Windows Mixed Reality controller|`<WMRSpatialController>`|[MicrosoftMotionControllerProfile](./features/microsoftmotioncontrollerprofile.md)|
|Oculus Touch (Quest,Rift)|`<OculusTouchController>`|[OculusTouchControllerProfile](./features/oculustouchcontrollerprofile.md)|
|HTC Vive controller|`<ViveController>`|[HTC Vive Controller Profile](./features/htcvivecontrollerprofile.md)|
|Valve Index controller|`<ValveIndexController>`|[ValveIndexControllerProfile](./features/valveindexcontrollerprofile.md)|
|Khronos Simple Controller|`<KHRSimpleController>`|[KHRSimpleControllerProfile](./features/khrsimplecontrollerprofile.md)|
|Eye Gaze Interaction|`<EyeGaze>`|[EyeGazeInteraction](./features/eyegazeinteraction.md)|
|Microsoft Hand Interaction|`<HololensHand>`|[MicrosoftHandInteraction](./features/microsofthandinteraction.md)|
## Haptics
OpenXR controllers that support haptics contain a Haptic control that you can bind to an action in the [Unity Input System](https://docs.unity3d.com/Packages/com.unity.inputsystem@latest). Set the action type to Haptic when binding an InputAction to a haptic control. To send a haptic impulse, call the [OpenXRInput.SendHapticImpulse](xref:UnityEngine.XR.OpenXR.Input.OpenXRInput.SendHapticImpulse*) method and specify the InputAction that you bound to the Haptic control. You can cancel the haptic impulse by calling the [OpenXRInput.StopHaptics](xref:UnityEngine.XR.OpenXR.Input.OpenXRInput.StopHaptics*) method.
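The following is a minimal sketch of this pattern; the binding path and class name are illustrative:

```csharp
// Hypothetical sketch: trigger and cancel a rumble through an action bound
// to the haptic control of the right-hand controller.
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.OpenXR.Input;

public class HapticsExample : MonoBehaviour
{
    [SerializeField]
    InputAction haptic =
        new InputAction(binding: "<XRControllerWithRumble>{RightHand}/haptic");

    void OnEnable() => haptic.Enable();
    void OnDisable() => haptic.Disable();

    public void Rumble()
    {
        // Amplitude in the range [0..1], duration in seconds.
        OpenXRInput.SendHapticImpulse(haptic, 0.5f, 0.1f);
    }

    public void Stop() => OpenXRInput.StopHaptics(haptic);
}
```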
## Debugging
For more information on debugging OpenXR input, see the [Input System Debugging](https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/Debugging.html) documentation.
## Future plans
Looking ahead, we will work towards letting Unity users leverage more of OpenXR's input stack, including allowing the runtime to bind Unity actions to OpenXR actions. This will enable OpenXR runtimes to perform much more complex binding scenarios than are currently possible.

---
uid: openxr-project-config
---
# Project configuration
Use the **XR Plug-in Management** settings to configure the OpenXR plug-in for your project.
To get started, follow the instructions in [Enable the OpenXR plug-in](#enable-openxr). This also installs the OpenXR package, if needed. Once installed and enabled, you can configure your project settings as described in the [OpenXR Project settings](#project-settings) section.
You can review the [Project validation](#project-validation) section of the **XR Plug-in Management** settings to discover if any setting values are incompatible with OpenXR.
<a name="project-settings"></a>
## OpenXR Project settings
Some OpenXR features require specific Unity Project settings to function properly. The settings include:
* **[Enable the OpenXR XR plug-in](#enable-openxr)**: must be enabled to use OpenXR features.
* **[OpenXR features](#openxr-features)**: select the specific OpenXR features that you want to use.
* **[Render Mode](#render-mode)**: choose the rendering strategy.
* **[Color Submission Mode](#color-submission-mode)**: choose how color information is passed to the renderer.
* **[Depth Submission Mode](#depth-submission-mode)**: choose how depth information is passed to the renderer.
* **[Play Mode OpenXR Runtime](#openxr-runtime)** (Editor): choose which OpenXR plug-in to use when running in the Unity Editor Play mode.
* **[Interaction profiles](#interaction-profile)**: choose which OpenXR interaction profile to use for a platform.
* **[Color space](#color-space)**: When using the Open GL graphics API, you must set the **Color Space** to **Linear**.
<a name="enable-openxr"></a>
### Enable the OpenXR plug-in
To use OpenXR, you must enable the plug-in in your **XR Plug-in Management** settings. (Installing the package from the Package Manager does not automatically enable the plug-in.)
> [!NOTE]
> Enabling OpenXR also installs the package, if necessary. However, disabling OpenXR does not uninstall the package.
The **XR Plug-in Management** settings page displays a tab for each platform build target. Build support for a platform must be installed using the Unity Hub before you can enable an OpenXR plug-in for that platform. (Not every platform has a supported OpenXR plugin.) See the [Add modules](https://docs.unity3d.com/hub/manual/AddModules.html) section of the Unity Hub documentation for instructions.
To enable OpenXR:
1. Open the **Project Settings** window (menu: **Edit &gt; Project Settings**).
2. Select **XR Plug-in Management** to view the plug-in management settings.
3. Select the tab for a platform build target to view the settings for that target.
4. Enable the **OpenXR** option in the **Plug-in Providers** list.
![XR Plug-in Management](images/openxr-xrmanagement.png "OpenXR in XR Management")<br />*Enabling OpenXR*
> [!TIP]
> If your project uses OpenXR on more than one platform build target, you must enable the OpenXR plugin for each platform.
<a name="openxr-features"></a>
### Enable OpenXR features
OpenXR plug-ins and other packages can provide optional feature implementations that you can use with OpenXR itself. For example, the Unity OpenXR plug-in provides a **Runtime Debugger** and a **Mock Runtime** as OpenXR features. After you [enable the OpenXR plug-in](#enable-openxr), you can enable any available features.
Some features are organized as a *feature group*. You can enable a feature group to enable all the features in that group.
To enable an OpenXR feature:
1. Open the **Project Settings** window (menu: **Edit &gt; Project Settings**).
2. Click **XR Plug-in Management** to expand the plug-in section (if necessary).
3. Select **OpenXR** in the list of XR plug-ins.
4. Select the tab for a platform build target to view the features for that target.
![OpenXR features and feature groups](images/openxr-features.png "OpenXR features and feature groups")<br />*OpenXR features and feature groups*
5. Select the features and feature groups to enable.
6. Repeat for any other platform build targets your project supports.
If a feature has its own configuration options, you can click its gear icon (![](images/gear.png)) to open the feature's settings window. Some features provide an icon following their name that links to documentation.
See [OpenXR Features](index.md#openxr-features) for more information about features and groups.
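At runtime, a script can check whether a particular feature is present and enabled before relying on it. This is a minimal sketch; the feature class queried here (`EyeGazeInteraction`) is one example, and the surrounding class name is illustrative:

```csharp
// Hypothetical sketch: query the active OpenXR settings for a feature
// and check whether it was enabled in the project configuration.
using UnityEngine;
using UnityEngine.XR.OpenXR;
using UnityEngine.XR.OpenXR.Features.Interactions;

public class FeatureCheck : MonoBehaviour
{
    void Start()
    {
        var feature = OpenXRSettings.Instance.GetFeature<EyeGazeInteraction>();
        if (feature != null && feature.enabled)
            Debug.Log("Eye gaze interaction is available.");
    }
}
```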
<a name="render-mode"></a>
### Set the render mode
The **Render Mode** determines how OpenXR renders stereo graphics. Different plug-ins might support different modes, but the typical choices involve some form of the following:
|**Option** | **Description** |
| --------- | --------------- |
|**Multi-pass** | Each eye is rendered separately in independent *passes* from slightly different viewpoints. Because the entire scene is rendered twice, multi-pass rendering is typically slower than the other options. However, it is also the most similar to non-stereo rendering and does not require you to adapt your shader code. |
|**Single Pass Instanced** | Both eyes are rendered in one pass. Scene data is shared between the eyes using instanced draw calls, reducing the amount of data that must be transferred from the CPU to the GPU. Single-pass rendering is typically much faster than multi-pass, but requires compatible shaders. |
| **Single Pass Instanced / Multi-view** | Some GPU hardware supports instancing, while other hardware supports multi-view. The two render modes are otherwise very similar. If an OpenXR plug-in supports both types of hardware, you will see this option instead of the **Single Pass Instanced** option. |
To set the render mode:
1. Open the **Project Settings** window (menu: **Edit &gt; Project Settings**).
2. Click **XR Plug-in Management** to expand the plug-in section (if necessary).
3. Select **OpenXR** in the list of XR plug-ins.
4. Select the tab for a platform build target to view the features for that target.
5. Choose the desired **Render Mode**.
For more information see:
* [SinglePassStereoMode](xref:UnityEngine.Rendering.SinglePassStereoMode)
* [Single Pass Instanced rendering](xref:SinglePassInstancing)
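A script can read back the render mode the plug-in is configured with, which can be useful for diagnostics. This is a minimal sketch, assuming `OpenXRSettings.renderMode` is available in your package version; the class name is illustrative:

```csharp
// Hypothetical sketch: log the render mode configured for the OpenXR plug-in.
using UnityEngine;
using UnityEngine.XR.OpenXR;

public class RenderModeCheck : MonoBehaviour
{
    void Start()
    {
        // renderMode reflects the Render Mode chosen in Project Settings.
        Debug.Log($"OpenXR render mode: {OpenXRSettings.Instance.renderMode}");
    }
}
```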
<a name="color-submission-mode"></a>
### Set the color submission mode
Some OpenXR runtimes support rendering to additional swapchain formats, such as 10- or 16-bit high-dynamic range (HDR) formats. Alternatively, on some devices you might gain performance by choosing a lower-fidelity format as a trade-off. The available formats depend on the Unity version you are using as well as the device and runtime that the Player runs on.
**Auto Color Submission Mode** currently selects the platform default, which is typically an 8bpc RGBA/BGRA format.
|**Option**|**Description**|
|---|---|
|**8 bits per channel (LDR, default)**|The default 8bpc RGBA/BGRA format. Uses sRGB when it is supported and is either the default or selected in the Player graphics API options (for example, in GLES).|
|**10 bits floating-point per color channel, 2 bit alpha (HDR)**|Packed 10bpc unsigned normalized floating-point color with 2 bits of alpha depth.|
|**16 bits floating-point per channel (HDR)**|16bpc signed half-width floating point color/alpha.|
|**5,6,5 bit packed (LDR, mobile)**|Compact packed format typically only used on low performance mobile devices and low gamut displays.|
|**11,11,10 bit packed floating-point (HDR)**|Packed color-only format using 11bpc unsigned float for red and green channels and 10bpc for blue.|
The best choice depends on your use case, platform, and target devices. Larger HDR formats generally reduce performance, especially on lower-spec hardware, but provide better rendering fidelity in scenes with high dynamic range or luminance gradients (where banding may be noticeable in LDR formats).
Reasonable rules of thumb when choosing a setting:
* For PC XR devices, consider your target devices and choose a performant HDR setting if you need
HDR. This often depends on the graphics API, GPU, and XR device together, so it may require extra
performance testing.
* For mobile XR devices, HDR swapchains are generally unsupported. In most cases it's best to stick
to **Auto Color Submission Mode** or **8 bits per channel (LDR, default)**.
To set the color submission mode:
1. Open the **Project Settings** window (menu: **Edit &gt; Project Settings**).
2. Click **XR Plug-in Management** to expand the plug-in section (if necessary).
3. Select **OpenXR** in the list of XR plug-ins.
4. Select the tab for a platform build target to view the features for that target.
5. Uncheck **Auto Color Submission Mode**.
6. Choose the desired **Color Submission Mode**s and sort them by priority (the order of the list sets the priority; the actual selection depends on graphics API and hardware support).
**8 bits per channel (LDR, default)** can be reordered but cannot be removed; it is a safe fallback.
<a name="depth-submission-mode"></a>
### Set the depth submission mode
Many OpenXR runtimes can use depth information to perform more accurate and stable *reprojection* of content during rendering. The available **Depth Submission Modes** include:
|**Option**|**Description**|
|---|---|
|**None**|No depth submission support. Any depth-based stability or reprojection support that the platform might provide is not enabled.|
|**Depth 16 bit**|A shared depth buffer using 16 bits per pixel is used.|
|**Depth 24 bit**|A shared depth buffer using 24 bits per pixel is used.|
The best choice can depend on the platform and specific target devices. Depth can significantly reduce judder and other XR rendering artifacts, especially with mixed reality (MR) content that combines rendered graphics with real-world video. The 16-bit option uses less bandwidth to transfer data between the CPU and GPU, which can improve rendering performance and battery life on mobile-type devices. However, the 24-bit option can minimize sorting issues and "z-fighting".
A reasonable rule of thumb when choosing a setting:
* Use 24-bit depth on PC XR devices.
* Use 16-bit depth on battery-powered XR devices, unless judder or sorting issues occur.
* Use **None** if your target devices don't use depth information, or if you find that the benefits don't outweigh the extra rendering and battery life costs.
To set the depth submission mode:
1. Open the **Project Settings** window (menu: **Edit &gt; Project Settings**).
2. Click **XR Plug-in Management** to expand the plug-in section (if necessary).
3. Select **OpenXR** in the list of XR plug-ins.
4. Select the tab for a platform build target to view the features for that target.
5. Choose the desired **Depth Submission Mode**.
Setting the mode to anything other than **None** enables the OpenXR [XR_KHR_composition_layer depth extension](https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#XR_KHR_composition_layer_depth). Unity ignores the **Depth Submission Mode** for OpenXR plug-ins that do not support this extension.
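At runtime, a script can confirm that the depth extension was actually negotiated with the active OpenXR runtime. This is a minimal sketch; the class name is illustrative:

```csharp
// Hypothetical sketch: check whether the depth composition layer extension
// is enabled on the active OpenXR runtime.
using UnityEngine;
using UnityEngine.XR.OpenXR;

public class DepthExtensionCheck : MonoBehaviour
{
    void Start()
    {
        if (OpenXRRuntime.IsExtensionEnabled("XR_KHR_composition_layer_depth"))
            Debug.Log("Depth submission is active on this runtime.");
        else
            Debug.Log("Runtime does not support depth submission.");
    }
}
```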
<a name="openxr-runtime"></a>
### Choose an OpenXR runtime to use in Play mode
By default, Unity uses the OpenXR runtime that is set up as the active runtime for your computer. You can specify a different OpenXR runtime with the **Play Mode OpenXR Runtime** setting. Unity uses the selected runtime when you run a Scene using the Editor Play mode. See [Runtime Discovery](https://www.khronos.org/registry/OpenXR/specs/1.0/loader.html#runtime-discovery) for more information about how the active OpenXR runtime is determined.
> [!NOTE]
> The **Play Mode OpenXR Runtime** setting is not saved between Editor sessions. It reverts to the **System Default** option on exit.
To set the OpenXR runtime to use in Play mode:
1. Open the **Project Settings** window (menu: **Edit &gt; Project Settings**).
2. Click **XR Plug-in Management** to expand the plug-in section (if necessary).
3. Select **OpenXR** in the list of XR plug-ins.
4. Select the **Windows, Mac, Linux settings** (PC) tab.
5. Choose the desired **Play Mode OpenXR Runtime**.
The available options include the following choices:
|**Option**|**Description**|
|---|---|
|**System Default**| The active OpenXR runtime on your computer. Device makers who support OpenXR sometimes provide a utility to designate their OpenXR runtime as the active one.|
|**Windows Mixed Reality**| If available, sets the current OpenXR runtime to the Microsoft OpenXR runtime for Windows Mixed Reality.|
|**SteamVR**| If available, sets the current OpenXR runtime to the SteamVR OpenXR runtime.|
|**Oculus**| If available, sets the current OpenXR runtime to the Oculus OpenXR runtime.|
|**Other**| Specify a runtime by selecting its `json` config file on your hard drive. Choose this option to use an OpenXR runtime that Unity might not directly support or detect automatically.|
> [!TIP]
> When you hold your mouse over the drop-down control for the **Play Mode OpenXR Runtime** options, the tooltip that appears shows the path to the selected OpenXR runtime.
<a name="interaction-profile"></a>
### Set the interaction profile
When using an OpenXR plug-in, you must specify which interaction profile to use. You can choose a profile in the **OpenXR** section of the **XR Plug-in Management** settings.
To add an OpenXR interaction profile:
1. Open the **Project Settings** window (menu: **Edit &gt; Project Settings**).
2. Click **XR Plug-in Management** to expand the plug-in section (if necessary).
3. Select **OpenXR** in the list of XR plug-ins.
4. In the **Interaction Profiles** section, click the **+** button to add a profile.
5. Select the profile to add from the list.
![Choose Interaction Profile](images/openxr-choose-interaction-profile.png)<br />*Choose an interaction profile*
See [Input in OpenXR](xref:openxr-input) for more information.
<a name="color-space"></a>
### Set the rendering color space
When you use the OpenGL graphics API, you must set the Unity Editor to use the linear [color space](xref:LinearRendering-LinearOrGammaWorkflow).
To change the color space:
1. Open the **Project Settings** window (menu: **Edit &gt; Project Settings**).
2. Select the **Player** settings category.
3. Scroll to the **Other Settings** section. (Click **Other Settings** to open the section, if necessary.)
4. Under the **Rendering** area, choose a **Color Space**.
<a name="project-validation"></a>
## Project validation
The OpenXR package defines a set of rules for the Project Validation system. These rules check for possible incompatibilities between OpenXR and the project configuration.
Some of the rules serve as warnings for possible configuration problems; you are not required to fix these. Other rules flag configuration errors that would result in your app failing to build or not working once built. You must fix these errors.
If you have the [XR Core Utilities](https://docs.unity3d.com/Packages/com.unity.xr.core-utils@latest) 2.1.0 or later installed, you can access the Project Validation status in the **Project Settings** window under **XR Plug-in Management** (menu: **Edit &gt; Project Settings**). These results include the checks for all XR plug-ins that provide validation rules.
![project-validation](images/ProjectValidation/project-validation-core-utils.png)
You can also open a separate **OpenXR Project Validation** window for OpenXR (menu: **Window &gt; XR &gt; OpenXR &gt; Project Validation**). This window only shows the validation results related to the OpenXR plug-in and features.
![feature-validation](images/ProjectValidation/feature-validation.png)
Rules that pass validation are not shown unless you enable **Show all**.
Some rules provide a **Fix** button that updates the configuration so that the rule passes validation. Other rules provide an **Edit** button that takes you to the relevant setting so that you can make the necessary adjustments yourself.
You can enable **Ignore build errors** to bypass the pre-build validation check. However, any misconfigured features in your app might not work at runtime.
### Validation issues reported in XR Plug-in Management
![loader-with-issues](images/ProjectValidation/loader-with-issues.png)
Clicking on either the validation warning or the error icon brings up the Validation window.
### Validation issues reported in features pane
![features-with-issues](images/ProjectValidation/features-with-issues.png)
Clicking on either the validation warning or the error icon brings up the Validation window.
### Validation issues reported in build
![build-with-issues](images/ProjectValidation/build-with-issues.png)
Double-clicking on build warnings or errors from validation brings up the Validation window.

{
"hideGlobalNamespace": false,
"_noIndex": false,
"useMemberPages": false,
"showScriptRef": true,
"pmdt-additional-preprocessors": "XR_COMPOSITION_LAYERS"
}

---
uid: xr-plug-in-management-upgrade-guide
---
# Upgrade Guide to 0.1.2
This is a new package release. In future package versions, this page will display a list of the actions you need to take to upgrade your project to that version.

---
uid: xr-plug-in-management-whats-new
---
# What's new in version 0.1.2
This is a new package release. In future package versions, this page will display a summary of updates and changes for that version.