Azure Remote Rendering settings problem with HoloLens 2

EE NTU
2022-08-04T13:03:08.123+00:00

Our factory is located in Kaohsiung City, Taiwan. We are currently using HoloLens 2 with Azure Remote Rendering and have noticed some cases of model distortion. We also ran the Azure Latency Test and found that the latency to Japan East is at least 80 ms, so we have a few questions:

  1. Can we disable late stage reprojection?
  2. Besides network speed, what other factors affect latency (for example, remote pose mode, local pose mode, and the display viewport)?
  3. In remote pose mode, why is the video recording different from what we actually see (the model looks distorted on the device but not in the video recording), and how can we make the device view match the video recording?
Azure Remote Rendering
HoloLens Development

Accepted answer
  Christopher Manthei, Microsoft Employee
    2022-08-04T17:29:46.05+00:00

    Hi @EE NTU!

    Let me answer your questions:

    1. You can choose between depth reprojection and planar reprojection; see our late stage reprojection documentation.
    2. Network round-trip time and network jitter are the main contributors to display latency. For example, if there are lots of packet drops or periods where packets are delayed, ARR has to compensate by increasing latency. In addition, make sure your client application runs at 60 fps.
    3. The mixed reality capture always uses planar reprojection, so when using remote pose mode, only the planar reprojection is visible in the video. In local pose mode, ARR always reprojects the remote image into the local pose. This ensures that locally rendered content is not distorted and is also visible in the MRC video capture.
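
    To illustrate the pose-mode part of point 3: if you want the device view to match the MRC recording, local pose mode is the relevant setting. Below is a minimal sketch, assuming the ARR Unity bindings expose RemoteManagerUnity.CurrentSession and GraphicsBinding.SetPoseMode as described in the pose modes documentation; please verify the exact names against your SDK version.

    ```csharp
    using Microsoft.Azure.RemoteRendering;
    using Microsoft.Azure.RemoteRendering.Unity;
    using UnityEngine;

    public class PoseModeSwitcher : MonoBehaviour
    {
        // Assumption: RemoteManagerUnity.CurrentSession and GraphicsBinding.SetPoseMode
        // follow the ARR pose modes documentation; verify against your SDK version.
        public void UseLocalPoseMode()
        {
            RenderingSession session = RemoteManagerUnity.CurrentSession;
            if (session == null || session.GraphicsBinding == null)
            {
                Debug.LogWarning("No active Azure Remote Rendering session.");
                return;
            }

            // Local pose mode: ARR reprojects the remote image into the local pose,
            // so the device view and the MRC recording show the same result.
            Result result = session.GraphicsBinding.SetPoseMode(PoseMode.Local);
            Debug.Log($"SetPoseMode(Local) returned {result}");
        }
    }
    ```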

    Some notes on the different reprojection modes: As described here, you have the option of overriding the reprojection mode with the modes below in OpenXR (the code snippet to do so is in the first link, and a rough sketch also follows this list):

    1. Depth Reprojection: This is what you are currently using.
    2. Planar Reprojection: This is the standard planar mode, and it requires you to set a focus point around which the image is stabilized. If you don't set one, the image will be very unstable on the device. You can use the GetRemoteFocusPoint function to get a focus point in the middle of the remote content. However, you also need to combine this with a focus point for any local content (in case that content is closer), which is non-trivial and will require some experimentation to achieve good results for each use case.
    3. Automatic Planar Reprojection: This is the preferable option to try: the OS inspects the depth buffer and automatically chooses a good focus point for you. I would suggest trying that first.
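
    As a rough illustration of the override mentioned above this list, the sketch below switches a Unity app to automatic planar reprojection through the Mixed Reality OpenXR plugin. The type and member names (ViewConfiguration, ReprojectionSettings, ReprojectionMode.PlanarFromDepth) are assumptions based on that plugin's reprojection API; double-check them against the linked documentation for your plugin version.

    ```csharp
    using System.Linq;
    using Microsoft.MixedReality.OpenXR;
    using UnityEngine;

    public class AutomaticPlanarReprojection : MonoBehaviour
    {
        // Assumption: ViewConfiguration, ReprojectionSettings, and ReprojectionMode
        // come from the Microsoft.MixedReality.OpenXR plugin; check the late stage
        // reprojection documentation for the exact API of your plugin version.
        private void LateUpdate()
        {
            // Automatic planar reprojection: the OS derives the focus point from the
            // depth buffer, so no manual focus point is required.
            var settings = new ReprojectionSettings
            {
                ReprojectionMode = ReprojectionMode.PlanarFromDepth
            };

            foreach (ViewConfiguration viewConfiguration in ViewConfiguration.EnabledViewConfigurations)
            {
                if (viewConfiguration.IsActive &&
                    viewConfiguration.SupportedReprojectionModes.Contains(settings.ReprojectionMode))
                {
                    // Re-apply the override each frame for the active view configuration.
                    viewConfiguration.SetReprojectionSettings(settings);
                }
            }
        }
    }
    ```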

    Make sure to use Remote Pose mode with planar reprojection if you want to avoid all distortions in remote content.

    Cheers,
    Christopher

