Remote Rendering collisions in Unity / HoloLens

Tomasz Woźniak 0 Reputation points
2023-01-20T20:00:23.2733333+00:00

Hi,
Please tell me: is it possible to get the point of collision between a Remote Rendered model and a GameObject in a Unity scene?
How do I get the exact point(s) where the 'local' object collides with the 'remote' model?

A similar question for HoloLens: how do I get the exact point(s) where a 'remote' object collides with the 'local' spatial environment, such as the floor, walls, etc.?

In the image example, the blue table is colliding with the umbrella table. Is it possible to get the point of this collision from ARR (where one leg of the blue table intersects the round table)?

[Image attachment, no longer available: the blue table colliding with the round umbrella table]


1 answer


Jan Krassnigg 91 Reputation points
2023-01-23T11:08:55.65+00:00

Azure Remote Rendering does not have a physics simulation API. What it does have are the spatial queries, as you already discovered: https://learn.microsoft.com/en-us/azure/remote-rendering/overview/features/spatial-queries

The image that you linked is not available anymore; could you please upload it again so that I can see your use case?

Basically, what the spatial queries allow you to do is shoot rays into the remotely rendered scene and see where they hit the remote objects. With a bit of creativity, this functionality allows you to do all sorts of things. You can also ask ARR which objects overlap a given box or sphere; however, that doesn't give you an exact intersection point.
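
For illustration, a minimal ray-cast query could look roughly like this. This is an untested sketch: it assumes an already-connected RenderingSession, and the RayCast, HitCollectionPolicy, and RayCastQueryAsync names are taken from the ARR C# API as shown on the spatial-queries page above:

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.RemoteRendering;

public static class RemoteRayCast
{
    // Shoots a ray into the remotely rendered scene and returns the hits
    // (an empty array if nothing was hit). 'session' is assumed to be an
    // already-connected RenderingSession.
    public static async Task<RayCastHit[]> CastRayAsync(
        RenderingSession session, Double3 origin, Double3 direction, double maxDistance)
    {
        var rayCast = new RayCast(origin, direction, maxDistance);

        // Only ask for the closest intersection along the ray.
        rayCast.HitCollection = HitCollectionPolicy.ClosestHit;

        RayCastQueryResult result = await session.Connection.RayCastQueryAsync(rayCast);

        // Each hit carries HitPosition, HitNormal, and the HitEntity that was struck.
        return result.Hits;
    }
}
```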

Say you want to place an object onto a table: you would retrieve the bounds of your object (https://learn.microsoft.com/en-us/azure/remote-rendering/concepts/object-bounds) and then shoot a ray downwards from the lower center of those bounds to see where it hits the table. Then you can move the object that distance downwards to have it rest on the table.
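
Here is a sketch of that placement flow, building on the CastRayAsync helper above. QueryWorldBoundsAsync comes from the object-bounds page; the lowercase x/y/z components on Double3, the Min/Max members on Bounds, and the 100-unit ray length are assumptions on my part, so adjust to the actual SDK:

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.RemoteRendering;

public static class TablePlacement
{
    // Drops 'entity' straight down until its lower bounds face rests on
    // whatever remote geometry lies below it.
    public static async Task DropOntoSurfaceAsync(RenderingSession session, Entity entity)
    {
        // World-space bounding box of the remote entity.
        Bounds bounds = await entity.QueryWorldBoundsAsync();

        // Lower-center point of the bounding box (component names assumed).
        var lowerCenter = new Double3(
            (bounds.Min.x + bounds.Max.x) * 0.5,
            bounds.Min.y,
            (bounds.Min.z + bounds.Max.z) * 0.5);

        // Ray straight down; the 100-unit max distance is an arbitrary choice.
        RayCastHit[] hits = await RemoteRayCast.CastRayAsync(
            session, lowerCenter, new Double3(0, -1, 0), 100.0);

        if (hits.Length > 0)
        {
            // Move the entity down by the measured distance. This simple version
            // assumes the entity's parent is not rotated or scaled, so the
            // world-space drop maps directly onto the local Y axis.
            double drop = lowerCenter.y - hits[0].HitPosition.y;
            Double3 pos = entity.Position;
            entity.Position = new Double3(pos.x, pos.y - drop, pos.z);
        }
    }
}
```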

However, this only works if the object that you shoot your ray against is also a remotely rendered object, because only then can the ray see and hit it.

If you want to work with real-life geometry (such as an actual table that you want to place virtual objects on), you need other methods. In this case you would need to use the HoloLens "spatial mapping" functionality (https://learn.microsoft.com/en-us/windows/mixed-reality/design/spatial-mapping). This lets you ask the HoloLens for a mesh representation of the scanned environment. Once you have that mesh, you can do raycasts against it as well (though ARR would not be involved here).

Here is an article about using spatial mapping with Unity: https://learn.microsoft.com/en-us/windows/mixed-reality/develop/unity/spatial-mapping-in-unity?tabs=mrtk

I guess that Unity would let you do raycasts against that mesh, but I'm no expert there.
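
For what it's worth, a rough sketch of such a raycast with plain Unity physics, assuming the spatial-mapping mesh colliders sit on a layer named "Spatial Awareness" (the MRTK default; your project may differ):

```csharp
using UnityEngine;

public class SpatialMeshRaycaster : MonoBehaviour
{
    // Casts a ray from this object straight down against the spatial-mapping
    // mesh and reports the world-space hit point. Assumes the mesh colliders
    // live on a layer named "Spatial Awareness" (the MRTK default); adjust
    // the layer name to match your project.
    public bool TryHitSpatialMesh(out Vector3 hitPoint, out Vector3 hitNormal)
    {
        int spatialMeshMask = LayerMask.GetMask("Spatial Awareness");

        if (Physics.Raycast(transform.position, Vector3.down,
                            out RaycastHit hit, 10.0f, spatialMeshMask))
        {
            hitPoint = hit.point;    // exact collision point on the scanned mesh
            hitNormal = hit.normal;  // surface normal at that point
            return true;
        }

        hitPoint = Vector3.zero;
        hitNormal = Vector3.zero;
        return false;
    }
}
```

hit.point would then be the exact point where your ray meets the scanned environment (floor, walls, a real table), which is the "local" collision point you asked about.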