Port VR apps to Windows Mixed Reality

Windows 10 includes support for immersive and holographic headsets. Other devices like the Oculus Rift or HTC Vive have dependencies on libraries that exist above the operating system's platform API. To bring existing Win32 Unity VR apps over to Windows Mixed Reality, you need to retarget vendor-specific VR SDK usage to Unity's cross-vendor VR APIs and plugins.

Porting VR apps to Windows Mixed Reality involves the following high-level steps:

  1. Make sure your PC is running Windows 10, version 1709 (the Fall Creators Update) or newer, or Windows 11.
  2. Upgrade to the latest version of your graphics or game engine. Game engines must support the Windows 10 SDK version 10.0.15063.0 or higher.
  3. Upgrade middleware, plug-ins, and components. If your app contains any components, upgrade to the latest versions.
  4. Target the latest Unity version and OpenXR plugin. Remove dependencies on duplicate SDKs. Depending on which device your content targeted, remove or conditionally compile out that SDK.
  5. Work through build issues specific to your app, your engine, and your component dependencies.

Common porting steps

Start with the following common porting steps:

  1. Make sure you have the right development hardware. The VR enthusiast guide lists the recommended development hardware.

  2. Upgrade to the latest flight of Windows 10.

    1. Install the Windows 10 Creators Update.
    2. Join the Windows Insider Program.
    3. Enable Developer Mode.
    4. Switch to the Windows Insider Fast flight through the Settings > Update & Security section.

    Note

    The Windows Mixed Reality platform is still under active development. Join the Windows Insider Program to access the Windows Insider Fast flight. Don't get preview builds from the Insider Skip Ahead ring, because those builds aren't the most stable for mixed reality development.

  3. If you're using Visual Studio, upgrade to the most recent build. See Install the tools under Visual Studio 2022. Be sure to install the Game Development with Unity workload.

Unity porting steps

Review the common steps to make sure your development environment is set up correctly. To port your existing Unity content, follow these steps:

1. Upgrade to the latest public build of Unity with Windows MR support

  1. Save a copy of your project before you get started.
  2. Download the latest recommended public build of Unity with Windows Mixed Reality support.
  3. If your project was built on an older version of Unity, review the Unity Upgrade Guides.
  4. Follow the instructions for using Unity's automatic API updater.
  5. See if you need to make any other changes to get your project running, and work through any errors and warnings.

2. Upgrade your middleware to the latest versions

With any Unity update, you might need to update one or more middleware packages that your game or application depends on. Updating to the latest middleware increases the likelihood of success throughout the rest of the porting process.

3. Target your application to run on Win32

From inside your Unity application:

  1. Navigate to File > Build Settings.
  2. Select PC, Mac & Linux Standalone.
  3. Set target platform to Windows.
  4. Set architecture to x86.
  5. Select Switch Platform.

Note

If your application has any dependencies on device-specific services, such as matchmaking from Steam, disable them now. You can hook up the equivalent Windows services later.
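
If you prefer to script this switch, for example as part of a build pipeline, you can make the same change from an editor script. The following is a minimal sketch; the class name PortingBuildHelper and the menu path are illustrative, not part of any SDK:

// Place this file in an "Editor" folder so it is excluded from player builds.
using UnityEditor;

public static class PortingBuildHelper   // illustrative name for this sketch
{
    [MenuItem("Tools/Switch to Windows Standalone (x86)")]
    public static void SwitchToWindowsStandalone()
    {
        // Equivalent to File > Build Settings > PC, Mac & Linux Standalone > Windows, x86.
        // BuildTarget.StandaloneWindows is the 32-bit (x86) Windows target.
        EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.Standalone, BuildTarget.StandaloneWindows);
    }
}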

4. Add support for the Mixed Reality OpenXR Plugin

  1. Choose and install a Unity version and XR plugin. While Unity 2020.3 LTS with the Mixed Reality OpenXR plugin is the best choice for Mixed Reality development, you can also build apps with other Unity configurations.

  2. Remove or conditionally compile out any library support specific to another VR SDK. Those assets might change settings and properties on your project in ways that are incompatible with Windows Mixed Reality.

    For example, if your project references the SteamVR SDK, update your project to instead use Unity's common VR APIs, which support both Windows Mixed Reality and SteamVR. A conditional-compilation sketch follows this list.

  3. In your Unity project, target the Windows 10 SDK.

  4. For each scene, set up the camera.
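
The following sketch shows one way to compile out SDK-specific calls while you port. STEAMVR_PRESENT is a hypothetical scripting define used only for this example; you would add it under Project Settings > Player > Scripting Define Symbols for builds that still ship the other SDK, and leave it undefined for your Windows Mixed Reality build:

using UnityEngine;

public class ControllerHaptics : MonoBehaviour   // illustrative class name
{
    public void Pulse()
    {
#if STEAMVR_PRESENT
        // SteamVR-specific haptics call would go here. Types from that SDK only
        // exist in this branch, so the Windows Mixed Reality build never compiles them.
#else
        // Cross-vendor path: use Unity's common XR input APIs instead, which
        // work on both Windows Mixed Reality and SteamVR.
#endif
    }
}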

5. Set up your Windows Mixed Reality hardware

  1. Review steps in Immersive headset setup.
  2. Learn how to Use the Windows Mixed Reality simulator and Navigate the Windows Mixed Reality home.

6. Use the stage to place content on the floor

You can build Mixed Reality experiences across a wide range of experience scales. If you're porting a seated-scale experience, make sure Unity is set to the Stationary tracking space type:

XRDevice.SetTrackingSpaceType(TrackingSpaceType.Stationary);

This code sets Unity's world coordinate system to track the stationary frame of reference. In the Stationary tracking mode, content you place in the editor just in front of the camera's default location (forward is -Z) appears in front of the user when the app launches. To recenter the user's seated origin, you can call Unity's XR.InputTracking.Recenter method.
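
As a minimal sketch of that recentering call, using the same legacy XR API as the snippet above (the space-bar binding is an arbitrary choice for illustration):

using UnityEngine;
using UnityEngine.XR;

public class SeatedRecenter : MonoBehaviour   // illustrative name
{
    void Update()
    {
        // Example binding only; map this to whatever input your app already uses.
        if (Input.GetKeyDown(KeyCode.Space))
        {
            // Re-zero the seated origin at the user's current head position and orientation.
            InputTracking.Recenter();
        }
    }
}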

If you're porting a standing-scale experience or room-scale experience, you're placing content relative to the floor. You reason about the user's floor using the spatial stage, which represents the user's defined floor-level origin. The spatial stage can include an optional room boundary you set up during the first run.

For these experiences, make sure Unity is set to the RoomScale tracking space type. RoomScale is the default, but set it explicitly and ensure you get back true. This practice catches situations where the user has moved their computer away from the room they calibrated.

if (XRDevice.SetTrackingSpaceType(TrackingSpaceType.RoomScale))
{
    // RoomScale mode was set successfully. The app can now assume that y=0 in Unity world coordinates represents the floor.
}
else
{
    // RoomScale mode was not set successfully. The app can't make assumptions about where the floor plane is.
}

Once your app successfully sets the RoomScale tracking space type, content placed on the y=0 plane appears on the floor. The origin at (0, 0, 0) is the specific place on the floor where the user stood during room setup, with -Z representing the forward direction they faced during setup.

In script code, you can then call the TryGetGeometry method on the UnityEngine.Experimental.XR.Boundary type to get a boundary polygon, specifying a boundary type of TrackedArea. If the user defined a boundary, you get back a list of vertices. You can then deliver a room-scale experience to the user, where they can walk around the scene you create.
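
For example, a minimal sketch of that call (assuming the using directives noted in the comments):

// Assumes these directives at the top of the script:
// using System.Collections.Generic;
// using UnityEngine;
// using UnityEngine.Experimental.XR;

var boundaryVertices = new List<Vector3>();
if (Boundary.TryGetGeometry(boundaryVertices, Boundary.Type.TrackedArea))
{
    // The user defined a boundary during setup. boundaryVertices now holds the
    // floor-level polygon (y = 0) in Unity world coordinates.
}
else
{
    // No boundary was defined; fall back to a standing-scale layout.
}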

The system automatically renders the boundary when the user approaches it. Your app doesn't need to use this polygon to render the boundary itself.

For more information, see Coordinate systems in Unity.

7. Work through your input model

Each game or application built for an existing head-mounted display (HMD) handles a particular set of inputs, needs certain types of input for its experience, and calls specific APIs to get them. Windows Mixed Reality exposes the same kinds of input, so taking advantage of them is straightforward.

See the input porting guide for Unity for details about how Windows Mixed Reality exposes input, and how the input maps to what your application does now.

Important

If you use HP Reverb G2 controllers, see HP Reverb G2 Controllers in Unity for further input mapping instructions.

8. Test and tune performance

Windows Mixed Reality is available on many devices, ranging from high-end gaming PCs to broad-market mainstream PCs. These devices have significantly different compute and graphics budgets available for your application.

If you ported your app using a premium PC with significant compute and graphics budgets, be sure to test and profile your app on hardware that represents your target market. For more information, see Windows Mixed Reality minimum PC hardware compatibility guidelines.

Both Unity and Visual Studio include performance profilers, and both Microsoft and Intel publish guidelines on performance profiling and optimization.

For an extensive discussion of performance, see Understand performance for Mixed Reality. For specific details about Unity, see Performance recommendations for Unity.

See also