HoloLens Photo/Video Camera in Unreal
The HoloLens has a Photo/Video (PV) Camera on the visor that can be used for both Mixed Reality Capture (MRC) and locating objects in Unreal world space from pixel coordinates in the camera frame.
Important
The PV Camera isn't supported with Holographic Remoting, but it's possible to use a webcam attached to your PC to simulate the HoloLens PV Camera functionality.
PV Camera Feed Setup
Important
The PV camera is implemented in both the Windows Mixed Reality and OpenXR plugins, but OpenXR requires the Microsoft OpenXR plugin to be installed. OpenXR for Unreal 4.26 also has a limitation: the camera only works with the DirectX11 RHI. This limitation is fixed in Unreal 4.27.1 or later.
- In Project Settings > HoloLens, enable the Webcam capability:
- Create a new actor called “CamCapture” and add a plane to render the camera feed:
- Add the actor to your scene, then create a new material called CamTextureMaterial with a Texture Object Parameter and a Texture Sample. Send the texture’s RGB data to the output emissive color:
Rendering the PV Camera Feed
- In the CamCapture blueprint, turn on the PV Camera:
- Create a dynamic material instance from CamTextureMaterial and assign this material to the actor’s plane:
- Get the texture from the camera feed and assign it to the dynamic material if it's valid. If the texture isn't valid, start a timer and try again after the timeout (a C++ sketch of this retry pattern follows the list):
- Finally, scale the plane by the camera image’s aspect ratio:
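If you're working in C++ instead of Blueprints, the retry step can be expressed with a world timer. The following is a minimal sketch, assuming a hypothetical TryAssignCameraTexture method and a RetryHandle member of type FTimerHandle on the CamCapture actor described in the C++ section later in this article:

void ACamCapture::TryAssignCameraTexture()
{
    // Try to get the current camera frame as a texture.
    UARTexture* CameraTexture = UARBlueprintLibrary::GetARTexture(EARTextureType::CameraImage);
    if (CameraTexture != nullptr)
    {
        // The texture is valid; assign it to the dynamic material instance.
        DynamicMaterial->SetTextureParameterValue("Param", CameraTexture);
    }
    else
    {
        // The texture isn't valid yet; retry after a short timeout.
        // RetryHandle is an assumed FTimerHandle member on this actor.
        GetWorldTimerManager().SetTimer(RetryHandle, this, &ACamCapture::TryAssignCameraTexture, 0.5f, false);
    }
}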
Find Camera Positions in World Space
The camera on the HoloLens 2 is offset vertically from the device’s head tracking. A few functions are available to locate the camera in world space and account for this offset.
GetPVCameraToWorldTransform gets the PV Camera’s transform in world space, positioned at the camera lens:
GetWorldSpaceRayFromCameraPoint casts a ray in Unreal world space from the camera lens through a pixel in the camera frame to find that pixel's content in the scene:
GetPVCameraIntrinsics returns the camera intrinsic values, which can be used when doing computer vision processing on a camera frame:
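For example, the intrinsics can be combined with the camera-to-world transform to project a world-space point back to a pixel coordinate. The following is a simplified pinhole-projection sketch that ignores lens distortion; ProjectWorldPointToPixel is a hypothetical helper, CameraToWorld is assumed to come from GetPVCameraToWorldTransform, and the FocalLength and PrincipalPoint fields of FARCameraIntrinsics are used alongside the ImageResolution field shown later in this article:

FVector2D ProjectWorldPointToPixel(const FTransform& CameraToWorld, const FVector& WorldPoint)
{
    FARCameraIntrinsics Intrinsics;
    if (!UARBlueprintLibrary::GetCameraIntrinsics(Intrinsics))
    {
        return FVector2D::ZeroVector;
    }

    // Move the point into camera space. In Unreal, X is the camera's
    // forward (depth) axis, Y is right, and Z is up.
    const FVector CameraSpace = CameraToWorld.InverseTransformPosition(WorldPoint);
    if (CameraSpace.X <= 0.0f)
    {
        // The point is behind the camera.
        return FVector2D::ZeroVector;
    }

    // Standard pinhole projection: pixel = principal point + focal length * (lateral offset / depth).
    // Image Y grows downward, so the up axis is negated.
    const float PixelX = Intrinsics.PrincipalPoint.X + Intrinsics.FocalLength.X * (CameraSpace.Y / CameraSpace.X);
    const float PixelY = Intrinsics.PrincipalPoint.Y - Intrinsics.FocalLength.Y * (CameraSpace.Z / CameraSpace.X);
    return FVector2D(PixelX, PixelY);
}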
To find what exists in world space at a particular pixel coordinate, use a line trace with the world space ray:
Here we cast a 2-meter ray from the camera lens through the camera-space position ¼ of the way from the top left of the frame, then use the hit result to render something where the object exists in world space:
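In C++, the same lookup can be done with a standard line trace. The following is a minimal sketch, assuming RayStart and RayDirection were returned by GetWorldSpaceRayFromCameraPoint for that pixel and that TraceCameraPixel is a hypothetical method on the CamCapture actor; DrawDebugSphere requires DrawDebugHelpers.h:

void ACamCapture::TraceCameraPixel(const FVector& RayStart, const FVector& RayDirection)
{
    // 200 Unreal units = 2 meters.
    const FVector RayEnd = RayStart + RayDirection * 200.0f;

    FHitResult Hit;
    if (GetWorld()->LineTraceSingleByChannel(Hit, RayStart, RayEnd, ECC_Visibility))
    {
        // Render a small marker where the pixel's content exists in world space.
        DrawDebugSphere(GetWorld(), Hit.ImpactPoint, 2.0f, 12, FColor::Green);
    }
}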
When using spatial mapping, this hit position will match the surface that the camera is seeing.
Rendering the PV Camera Feed in C++
- Create a new C++ actor called CamCapture
- In the project’s build.cs, add “AugmentedReality” to the PublicDependencyModuleNames list:
PublicDependencyModuleNames.AddRange(
    new string[] {
        "Core",
        "CoreUObject",
        "Engine",
        "InputCore",
        "AugmentedReality"
    });
- In CamCapture.h, include ARBlueprintLibrary.h
#include "ARBlueprintLibrary.h"
- You also need to add private member variables for the mesh and material:
private:
    UStaticMesh* StaticMesh;
    UStaticMeshComponent* StaticMeshComponent;
    UMaterialInstanceDynamic* DynamicMaterial;
    bool IsTextureParamSet = false;
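Putting these pieces together, a minimal CamCapture.h might look like the following sketch. When you create the actor in the editor, Unreal generates this boilerplate for you, including your project’s module API macro on the class declaration:

#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "ARBlueprintLibrary.h"
#include "CamCapture.generated.h"

UCLASS()
class ACamCapture : public AActor
{
    GENERATED_BODY()

public:
    ACamCapture();
    virtual void Tick(float DeltaTime) override;

protected:
    virtual void BeginPlay() override;

private:
    UStaticMesh* StaticMesh;
    UStaticMeshComponent* StaticMeshComponent;
    UMaterialInstanceDynamic* DynamicMaterial;
    bool IsTextureParamSet = false;
};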
- In CamCapture.cpp, update the constructor to add a static mesh to the scene:
ACamCapture::ACamCapture()
{
    PrimaryActorTick.bCanEverTick = true;

    // Load a mesh from the engine to render the camera feed to.
    StaticMesh = LoadObject<UStaticMesh>(nullptr, TEXT("/Engine/EngineMeshes/Cube.Cube"), nullptr, LOAD_None, nullptr);

    // Create a static mesh component to render the static mesh.
    StaticMeshComponent = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("CameraPlane"));
    StaticMeshComponent->SetStaticMesh(StaticMesh);

    // Scale and add to the scene.
    StaticMeshComponent->SetWorldScale3D(FVector(0.1f, 1, 1));
    this->SetRootComponent(StaticMeshComponent);
}
In BeginPlay, create a dynamic material instance from the project’s camera material, apply it to the static mesh component, and start the HoloLens camera.
In the editor, right-click on the CamTextureMaterial in the content browser and select “Copy Reference” to get the string for CameraMatPath.
void ACamCapture::BeginPlay()
{
    Super::BeginPlay();

    // Create a dynamic material instance from the game's camera material.
    // Right-click on a material in the project and select "Copy Reference" to get this string.
    FString CameraMatPath("Material'/Game/Materials/CamTextureMaterial.CamTextureMaterial'");
    UMaterial* BaseMaterial = (UMaterial*)StaticLoadObject(UMaterial::StaticClass(), nullptr, *CameraMatPath, nullptr, LOAD_None, nullptr);
    DynamicMaterial = UMaterialInstanceDynamic::Create(BaseMaterial, this);

    // Use the dynamic material instance when rendering the camera mesh.
    StaticMeshComponent->SetMaterial(0, DynamicMaterial);

    // Start the webcam.
    UARBlueprintLibrary::ToggleARCapture(true, EARCaptureType::Camera);
}
In Tick, get the texture from the camera, set it to the texture parameter in the CamTextureMaterial material, and scale the static mesh component by the camera frame’s aspect ratio:
void ACamCapture::Tick(float DeltaTime)
{
    Super::Tick(DeltaTime);

    // The dynamic material instance only needs to be set once.
    if (IsTextureParamSet)
    {
        return;
    }

    // Get the texture from the camera.
    UARTexture* ARTexture = UARBlueprintLibrary::GetARTexture(EARTextureType::CameraImage);
    if (ARTexture != nullptr)
    {
        // Set the shader's texture parameter (named "Param") to the camera image.
        DynamicMaterial->SetTextureParameterValue("Param", ARTexture);
        IsTextureParamSet = true;

        // Get the camera intrinsics.
        FARCameraIntrinsics Intrinsics;
        UARBlueprintLibrary::GetCameraIntrinsics(Intrinsics);

        // Scale the camera mesh by the aspect ratio.
        float R = (float)Intrinsics.ImageResolution.X / (float)Intrinsics.ImageResolution.Y;
        StaticMeshComponent->SetWorldScale3D(FVector(0.1f, R, 1));
    }
}
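The camera can be stopped the same way it was started. A minimal sketch, assuming an EndPlay override is added to the actor (it would also need to be declared in CamCapture.h):

void ACamCapture::EndPlay(const EEndPlayReason::Type EndPlayReason)
{
    // Stop the webcam when the actor leaves the world.
    UARBlueprintLibrary::ToggleARCapture(false, EARCaptureType::Camera);

    Super::EndPlay(EndPlayReason);
}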
Next Development Checkpoint
If you're following the Unreal development journey we've laid out, you're in the midst of exploring the Mixed Reality platform capabilities and APIs. From here, you can continue to the next topic:
Or jump directly to deploying your app on a device or emulator:
You can always go back to the Unreal development checkpoints at any time.