Getting the mesh data is not possible, but ARR has other features that might be of help.
Some background: as you are certainly aware, ARR is used to render models that are too detailed for the HoloLens to render itself. Usually these meshes are too large even to load into its memory in the first place. Consequently, it would be impossible not only to render them, but also to do any kind of physics (even just raycasts) with those meshes on the device.
Therefore, the HoloLens never gets any of the mesh data. The model is only loaded on the server, the image is rendered there, and the device only receives a video stream and a scene graph with some high-level data so it can position the model etc. No triangle data is ever sent to the device. That means there is no way to retrieve the mesh and hand it to Unity.
However, it is obviously a very common use case that an app needs to detect which model a user is pointing at, or which piece of a model a user's hand is touching. Therefore, rather than doing such operations on the device, you can simply ask ARR to do them for you.
See this documentation: https://learn.microsoft.com/en-us/azure/remote-rendering/overview/features/spatial-queries
There are currently two types of queries:
* Ray casts
* Spatial (overlap) queries

With the first query you shoot a ray into the scene and get back which mesh was hit and where. This is typically used to determine what a user is pointing at.
With the second query you define a volume and ask which meshes overlap with that volume. This is often used to figure out which mesh to move when a user tries to grab something.
There are of course many other things that can be achieved with these queries.
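For illustration, a ray cast with the ARR Unity SDK looks roughly like the sketch below. This is based on the linked documentation; exact type and method names may differ between SDK versions, so treat it as a starting point rather than a definitive implementation.

```csharp
using Microsoft.Azure.RemoteRendering;

async void CastRay(RenderingSession session)
{
    // Shoot a ray from the origin into the +z direction, over a distance of
    // 10 units, and only collect the closest hit.
    RayCast rayCast = new RayCast(
        new Double3(0, 0, 0), new Double3(0, 0, 1), 10, HitCollectionPolicy.ClosestHit);

    // The query is executed on the server, against the full-resolution mesh.
    RayCastQueryResult result = await session.Connection.RayCastQueryAsync(rayCast);

    RayCastHit[] hits = result.Hits;
    if (hits.Length > 0)
    {
        var hitObject = hits[0].HitObject;     // which remote entity was hit
        var hitPosition = hits[0].HitPosition; // hit position in world space
        var hitNormal = hits[0].HitNormal;     // surface normal at the hit point
        // ... react to the hit, e.g. highlight hitObject
    }
}
```

Note that the query is asynchronous: the request travels to the server, is evaluated there, and the result comes back roughly one network round-trip later, so you should not rely on it within a single frame.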
Now of course you can't do a physics simulation with this, so you couldn't have an object rendered with ARR literally collide with other things and fall to the floor. That would generally not work anyway, because of the complexity of these objects. If you really want something like that, you need to have a very low-resolution (and ideally convex) representation of your mesh at hand, to use as a collision mesh instead. Then you could run the physics simulation on the device and synchronize the resulting position to ARR after every update.
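The proxy-collider approach could be sketched like this in Unity: a hypothetical component on an invisible low-resolution proxy that Unity's physics simulates locally, copying the resulting pose onto the GameObject bound to the remote entity each step. The names `remoteModelRoot` and `ProxyPhysicsSync` are my own for illustration; the binding between a GameObject and a remote entity (e.g. via `RemoteEntitySyncObject`) is described in the ARR Unity documentation.

```csharp
using UnityEngine;

// Hypothetical sketch: attach to an invisible proxy GameObject that has a
// Rigidbody and a low-resolution convex MeshCollider. Unity simulates this
// proxy; after each physics step we copy its pose onto the remote model's
// root, and ARR synchronizes that transform to the server.
public class ProxyPhysicsSync : MonoBehaviour
{
    // GameObject bound to the remotely rendered entity (assumed name).
    public GameObject remoteModelRoot;

    void FixedUpdate()
    {
        // The Rigidbody on this proxy has just been stepped by Unity's physics.
        // Mirror its position and rotation onto the remote model so the
        // server-rendered mesh follows the locally simulated proxy.
        remoteModelRoot.transform.SetPositionAndRotation(
            transform.position, transform.rotation);
    }
}
```

The key design point is that the full-resolution mesh never participates in the simulation: only the cheap convex proxy does, and the remote model is purely a visual follower.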