Holographic Remoting Overview

You can use Holographic Remoting to stream holographic content to your HoloLens in real time. There are two main uses for Holographic Remoting, and it's important to understand the difference:

  1. (Unity or Unreal): You want to preview and debug your app during development. You can run your app locally in the Unity editor on your PC in Play Mode and stream the experience to your HoloLens. Holographic Remoting provides a way to quickly debug your app without building and deploying a full project. We call this type of app a Holographic Remoting Player app.

  2. (Unity, Unreal, or C++): You want the resources of a PC to power your app instead of relying on the HoloLens's on-board resources. You can create and build an app that has Holographic Remoting capability. The user experiences the app on the HoloLens, but the app actually runs on a PC, which lets it take advantage of the PC's more powerful resources. Holographic Remoting can be especially helpful if your app has high-resolution assets or models and you don't want the frame rate to suffer. We call this type of app a Holographic Remoting Remote app.

In either case, inputs from the HoloLens (gaze, gesture, voice, and spatial mapping) are sent to the PC, where content is rendered in a virtual immersive view; the rendered frames are then sent back to the HoloLens.


When developing a remote application, you must use either the Windows Mixed Reality API or the OpenXR API. Mixing both APIs within the same application isn't supported.


Holographic Remoting for HoloLens 2 is a major version change. Remote applications for HoloLens (1st gen) must use NuGet package version 1.x.x, and remote applications for HoloLens 2 must use version 2.x.x. This means that remote applications written for HoloLens 2 aren't compatible with HoloLens (1st gen), and vice versa.
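In practice, the major-version split comes down to which NuGet package version your project references. A minimal sketch of a `PackageReference` entry in a remote app's project file is shown below; the package id and version shown are illustrative assumptions, so check the actual Holographic Remoting package name and latest release for your target device:

```xml
<!-- Illustrative only: the package id and version here are assumptions.
     Pin to a 2.x.x release for HoloLens 2, or a 1.x.x release for
     HoloLens (1st gen); the two major versions are not interchangeable. -->
<ItemGroup>
  <PackageReference Include="Microsoft.Holographic.Remoting" Version="2.*" />
</ItemGroup>
```

Using a floating `2.*` version keeps the project on the HoloLens 2 line while picking up minor updates; pinning an exact version is equally valid if you need reproducible builds.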
