Native development overview


3D engines like Unity or Unreal aren't the only mixed reality development paths open to you. You can also create mixed reality apps that call the Windows Mixed Reality APIs directly with DirectX 11 or DirectX 12. By going straight to the platform source, you're essentially building your own middleware or framework.


If you have an existing WinRT project that you'd like to maintain, head over to our main WinRT documentation.

Development checkpoints

Use the following checkpoints to bring your native games and applications into the world of mixed reality.

1. Getting started

Windows Mixed Reality supports two kinds of apps:

  • UWP or Win32 Mixed Reality applications that use the HolographicSpace API or OpenXR API to render an immersive view that fills the headset display
  • 2D apps (UWP) that use DirectX, XAML, or another framework to render 2D views on slates in the Windows Mixed Reality home

The differences between DirectX development for 2D views and immersive views primarily concern holographic rendering and spatial input. Your UWP application's IFrameworkView or your Win32 application's HWND is still required and remains largely the same, as do the WinRT APIs available to your app. However, you use a different subset of those APIs to take advantage of holographic features. For example, the system manages the swap chain and frame presentation for holographic applications to enable a pose-predicted frame loop.
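In OpenXR terms, that pose-predicted frame loop means asking the runtime when to render and for which predicted display time, then rendering views located at that time. A minimal sketch of the core calls (assuming an already-created `XrSession` and reference `XrSpace`, with error handling omitted):

```cpp
// Sketch of a pose-predicted OpenXR frame loop. `session`, `appSpace`, and
// `running` are assumed to exist; swapchain handling is elided.
while (running) {
    XrFrameWaitInfo waitInfo{XR_TYPE_FRAME_WAIT_INFO};
    XrFrameState frameState{XR_TYPE_FRAME_STATE};
    xrWaitFrame(session, &waitInfo, &frameState);   // runtime returns a predicted display time

    XrFrameBeginInfo beginInfo{XR_TYPE_FRAME_BEGIN_INFO};
    xrBeginFrame(session, &beginInfo);

    // Locate the eye views at the predicted display time, then render into
    // the runtime-managed swapchain images (xrLocateViews, xrAcquire/Wait/
    // ReleaseSwapchainImage, D3D11/D3D12 draw calls).
    XrViewLocateInfo locateInfo{XR_TYPE_VIEW_LOCATE_INFO};
    locateInfo.viewConfigurationType = XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO;
    locateInfo.displayTime = frameState.predictedDisplayTime;
    locateInfo.space = appSpace;

    XrFrameEndInfo endInfo{XR_TYPE_FRAME_END_INFO};
    endInfo.displayTime = frameState.predictedDisplayTime;
    endInfo.environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_OPAQUE;
    // endInfo.layers / endInfo.layerCount point at your projection layers.
    xrEndFrame(session, &endInfo);
}
```

Because the runtime owns timing and presentation, rendering against `predictedDisplayTime` is what lets it reproject holograms to the user's actual head pose at display time.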

| Checkpoint | Outcome |
| --- | --- |
| What is OpenXR? | Begin your native development journey by getting acquainted with OpenXR and what it has to offer |
| Install the latest tools | Download and install the latest native development tools |
| Set up for HoloLens 2 | Configure your device and environment for HoloLens 2 development |
| Set up for immersive headsets | Configure your device and environment for Windows Mixed Reality development |
| Try a sample app | Explore a UWP and Win32 version of the same basic OpenXR app on your device |
| Take a tour of the OpenXR API | Watch a 60-minute walkthrough video that tours all key components of the OpenXR API in Visual Studio |
| Add the OpenXR loader | Add the OpenXR loader to an existing native project to get started developing |
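For the loader step, one common approach is pulling the Khronos OpenXR SDK into a CMake project so your app links against `openxr_loader`. A minimal sketch (the target name `my_app` and the pinned tag are illustrative; NuGet's OpenXR.Loader package is an alternative for Visual Studio projects):

```cmake
# Fetch the Khronos OpenXR SDK and link the loader into your app.
include(FetchContent)
FetchContent_Declare(
    openxr
    GIT_REPOSITORY https://github.com/KhronosGroup/OpenXR-SDK.git
    GIT_TAG        release-1.0.34   # pin to a release tag of your choice
)
FetchContent_MakeAvailable(openxr)

target_link_libraries(my_app PRIVATE openxr_loader)
```

At runtime the loader discovers the active OpenXR runtime (for example, the Windows Mixed Reality runtime) and dispatches your `xr*` calls to it.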

2. Core building blocks

Windows Mixed Reality applications use the following APIs to build mixed-reality experiences for HoloLens and other immersive headsets:

| Feature | Capability |
| --- | --- |
| Gaze | Let users target holograms by looking at them |
| Gesture | Add spatial actions to your apps |
| Holographic rendering | Draw a hologram at a precise location in the world around your users |
| Motion controller | Let your users take action in your mixed reality environments |
| Spatial mapping | Map your physical space with a virtual mesh overlay to mark the boundaries of your environment |
| Voice | Capture spoken keywords, phrases, and dictation from your users |
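In OpenXR, input features like gestures and motion controllers are surfaced through the action system: you declare abstract actions, then suggest bindings to concrete controller paths. A hedged sketch (assumes a created `XrInstance`; the action names are illustrative):

```cpp
// Sketch: declaring a boolean "select" action for motion-controller input.
// Error handling omitted; `instance` is assumed to exist.
XrActionSet actionSet{XR_NULL_HANDLE};
XrActionSetCreateInfo setInfo{XR_TYPE_ACTION_SET_CREATE_INFO};
strcpy(setInfo.actionSetName, "gameplay");
strcpy(setInfo.localizedActionSetName, "Gameplay");
xrCreateActionSet(instance, &setInfo, &actionSet);

XrAction selectAction{XR_NULL_HANDLE};
XrActionCreateInfo actionInfo{XR_TYPE_ACTION_CREATE_INFO};
actionInfo.actionType = XR_ACTION_TYPE_BOOLEAN_INPUT;
strcpy(actionInfo.actionName, "select");
strcpy(actionInfo.localizedActionName, "Select");
xrCreateAction(actionSet, &actionInfo, &selectAction);

// Next: suggest bindings for an interaction profile (for example,
// ".../input/select/click"), attach the action set to the session, and
// poll the action state each frame with xrGetActionStateBoolean.
```

Gaze, on the other hand, typically comes from locating the head or eye-gaze pose in your reference space at the frame's predicted display time.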


You can find upcoming and in-development core features in the OpenXR roadmap documentation.

3. Deploying and testing

You can develop on a desktop using OpenXR on a HoloLens 2 or Windows Mixed Reality immersive headset. If you don't have access to a headset, you can use the HoloLens 2 Emulator or the Windows Mixed Reality Simulator instead.

What's next?

A developer's job is never done, especially when learning a new tool or SDK. The following sections can take you into areas beyond the beginner-level material you've already completed. These topics and resources aren't in any sequential order, so feel free to jump around and explore!

Additional resources

If you're looking to level up your OpenXR game, check out the links below:

See also