Azure Remote Rendering does not have a physics simulation API. What it does have are spatial queries, as you already discovered: https://learn.microsoft.com/en-us/azure/remote-rendering/overview/features/spatial-queries
The image that you linked is not available anymore. Could you please upload it again, so that I can see your use case?
Basically, spatial queries allow you to shoot rays into the remotely rendered scene and see where they hit the remote objects. With a bit of creativity, this functionality lets you do all sorts of things. You can also ask ARR which objects overlap with a given box or sphere; however, that doesn't give you an exact intersection point.
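To give you an idea, here is a minimal ray cast sketch along the lines of the sample on that spatial queries page (all types come from the ARR C# API; I'd double-check the exact member names against the SDK version you use):

```cs
using Microsoft.Azure.RemoteRendering;

// Shoot a ray from the origin along +z, up to 10 meters, and ask for the
// closest remote object that it hits.
async void CastRay(RenderingSession session)
{
    var rayCast = new RayCast(
        new Double3(0, 0, 0),           // ray start (world space)
        new Double3(0, 0, 1),           // ray direction
        10,                             // maximum distance in meters
        HitCollectionPolicy.ClosestHit);

    RayCastQueryResult result = await session.Connection.RayCastQueryAsync(rayCast);

    RayCastHit[] hits = result.Hits;
    if (hits.Length > 0)
    {
        var hitObject = hits[0].HitEntity;     // which remote entity was hit
        var hitPosition = hits[0].HitPosition; // exact intersection point
        var hitNormal = hits[0].HitNormal;     // surface normal at that point
    }
}
```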
Say you want to place an object onto a table: you would retrieve the bounds of your object (https://learn.microsoft.com/en-us/azure/remote-rendering/concepts/object-bounds), shoot a ray downwards from the lower center position of those bounds to see where it hits the table, and then move the object down by that distance to have it rest on the table.
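As a rough, untested sketch of that idea (assuming you already queried the object's world-space bounds as described on that page and have them as a UnityEngine.Bounds; Entity.Position and the lowercase Double3 field names are assumptions you should verify against the API reference):

```cs
using Microsoft.Azure.RemoteRendering;

// Drop 'entity' onto whatever remote surface lies below it.
async void DropOntoSurfaceBelow(RenderingSession session, Entity entity, UnityEngine.Bounds worldBounds)
{
    // Lower center of the object's bounding box.
    double originY = worldBounds.min.y;
    var origin = new Double3(worldBounds.center.x, originY, worldBounds.center.z);

    // Shoot a ray straight down, max. 10 meters.
    var rayCast = new RayCast(origin, new Double3(0, -1, 0), 10, HitCollectionPolicy.ClosestHit);
    RayCastQueryResult result = await session.Connection.RayCastQueryAsync(rayCast);

    if (result.Hits.Length > 0)
    {
        // Distance between the object's underside and the surface below it.
        double drop = originY - result.Hits[0].HitPosition.y;

        // Move the entity down by that distance so it rests on the surface.
        Double3 pos = entity.Position;
        entity.Position = new Double3(pos.x, pos.y - drop, pos.z);
    }
}
```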
However, this only works if the object that you shoot your ray against is also a remotely rendered object, because only then can the ray see and hit it.
If you want to work with real-life geometry (such as an actual table that you want to place virtual objects on), you need other methods. In that case you would need to use the HoloLens "spatial mapping" functionality (https://learn.microsoft.com/en-us/windows/mixed-reality/design/spatial-mapping), which lets you ask the HoloLens for a mesh representation of the scanned environment. Once you have that mesh, you can in principle do raycasts against it as well (though ARR would not be involved here).
Here is an article about using spatial mapping with Unity: https://learn.microsoft.com/en-us/windows/mixed-reality/develop/unity/spatial-mapping-in-unity?tabs=mrtk
I guess that Unity would let you do raycasts against that mesh, but I'm no expert there.
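From what I know, a standard Physics.Raycast should work, provided the spatial mapping meshes have colliders. A rough sketch (assumes MRTK's spatial awareness system is configured to generate MeshColliders and puts the meshes on their own layer; layer 31 is MRTK's default, adjust for your setup):

```cs
using UnityEngine;

// Unity-side ray cast against the real-world spatial mapping mesh.
public class SpatialMeshRaycaster : MonoBehaviour
{
    // Layer that the spatial awareness meshes live on (MRTK default: 31).
    [SerializeField] private LayerMask spatialMeshLayer = 1 << 31;

    void Update()
    {
        // Ray from the user's head (main camera) straight ahead, max. 10 m.
        var gaze = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        if (Physics.Raycast(gaze, out RaycastHit hit, 10f, spatialMeshLayer))
        {
            // hit.point is a point on the real-world surface, e.g. a spot
            // on a physical table where a virtual object could be placed.
            Debug.Log($"Hit real-world geometry at {hit.point}");
        }
    }
}
```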