The hand tracking system uses a person’s palms and fingers as input. Data on position and rotation of every finger, the entire palm, and hand gestures is available. Starting in Unreal 4.26, hand tracking is based on the Unreal HeadMountedDisplay plugin and uses a common API across all XR platforms and devices. Functionality is the same for both Windows Mixed Reality and OpenXR systems.
Hand pose
Hand pose lets you track and use the hands and fingers of your users as input, which can be accessed in both Blueprints and C++. The Unreal API sends the data as a coordinate system, with ticks synchronized with the Unreal Engine.

The hierarchy is described by the EHandKeypoint enum:

You can get all this data from a user’s hands using the Get Motion Controller Data function. That function returns an XRMotionControllerData structure. Below is a sample Blueprint script that parses the XRMotionControllerData structure to get hand joint locations and draws a debug coordinate system at each joint’s location.

It's important to check that the structure is valid and that it's a hand. Otherwise, you may get undefined behavior when accessing the positions, rotations, and radii arrays.
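For reference, below is a minimal C++ sketch of the same idea, assuming the GetMotionControllerData API from the HeadMountedDisplay plugin and the common FXRMotionControllerData member names; treat it as an illustration rather than a definitive implementation.

#include "HeadMountedDisplayFunctionLibrary.h"
#include "DrawDebugHelpers.h"
#include "GameFramework/Actor.h"

// Draw a debug coordinate system at every tracked hand joint (sketch, UE 4.26+ APIs assumed).
void DrawHandJoints(AActor* Owner, EControllerHand Hand)
{
    FXRMotionControllerData Data;
    UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(Owner, Hand, Data);

    // Always confirm the data is valid and represents a hand before reading the arrays.
    if (!Data.bValid || Data.DeviceVisualType != EXRVisualType::Hand)
    {
        return;
    }

    UWorld* World = Owner->GetWorld();
    for (int32 i = 0; i < Data.HandKeyPositions.Num(); ++i)
    {
        // One debug coordinate system per joint, scaled by the joint radius.
        DrawDebugCoordinateSystem(World, Data.HandKeyPositions[i],
            Data.HandKeyRotations[i].Rotator(), Data.HandKeyRadii[i] * 10.0f);
    }
}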
The EWMRHandKeypoint enum describes the hand's bone hierarchy. You can find each hand keypoint listed in your Blueprints:

The full C++ enum is listed below:
enum class EWMRHandKeypoint : uint8
{
Palm,
Wrist,
ThumbMetacarpal,
ThumbProximal,
ThumbDistal,
ThumbTip,
IndexMetacarpal,
IndexProximal,
IndexIntermediate,
IndexDistal,
IndexTip,
MiddleMetacarpal,
MiddleProximal,
MiddleIntermediate,
MiddleDistal,
MiddleTip,
RingMetacarpal,
RingProximal,
RingIntermediate,
RingDistal,
RingTip,
LittleMetacarpal,
LittleProximal,
LittleIntermediate,
LittleDistal,
LittleTip
};
You can find the numerical values for each enum case in the Windows.Perception.People.HandJointKind table.
Supporting Hand Tracking
You can use hand tracking in Blueprints by adding Supports Hand Tracking from Hand Tracking > Windows Mixed Reality:

This function returns true if hand tracking is supported on the device and false if hand tracking isn't available.

C++:
Include WindowsMixedRealityHandTrackingFunctionLibrary.h.
static bool UWindowsMixedRealityHandTrackingFunctionLibrary::SupportsHandTracking()
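As a brief usage sketch based on the signature above, you can gate hand-driven features behind this check:

#include "WindowsMixedRealityHandTrackingFunctionLibrary.h"

// Returns true when hand-driven input should be wired up on the current device.
bool ShouldEnableHandInput()
{
    return UWindowsMixedRealityHandTrackingFunctionLibrary::SupportsHandTracking();
}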
Getting Hand Tracking
You can use GetHandJointTransform to return spatial data from the hand. The data updates every frame, but if you're inside a frame the returned values are cached, so repeated calls are cheap. For performance reasons, it's not recommended to put heavy logic in this function.

C++:
static bool UWindowsMixedRealityHandTrackingFunctionLibrary::GetHandJointTransform(EControllerHand Hand, EWMRHandKeypoint Keypoint, FTransform& OutTransform, float& OutRadius)
Here's a breakdown of GetHandJointTransform's function parameters:
- Hand – the user's left or right hand.
- Keypoint – the bone of the hand.
- Transform – coordinates and orientation of the bone's base. You can request the base of the next bone to get the transform data for the end of a bone. A special Tip bone gives the end of the distal.
- Radius – radius of the base of the bone.
- Return Value – true if the bone is tracked this frame, false if the bone isn't tracked.
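For illustration, here's a minimal sketch that queries the right index fingertip using the signature above; the choice of hand and keypoint is arbitrary:

#include "WindowsMixedRealityHandTrackingFunctionLibrary.h"

// Query the right index fingertip; returns false if the bone isn't tracked this frame.
bool GetRightIndexTip(FVector& OutLocation, float& OutRadius)
{
    FTransform TipTransform;
    if (!UWindowsMixedRealityHandTrackingFunctionLibrary::GetHandJointTransform(
            EControllerHand::Right, EWMRHandKeypoint::IndexTip, TipTransform, OutRadius))
    {
        return false;
    }

    OutLocation = TipTransform.GetLocation();
    return true;
}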
Hand Live Link Animation
Hand poses are exposed to Animation using the Live Link plugin.
If the Windows Mixed Reality and Live Link plugins are enabled:
- Select Window > Live Link to open the Live Link editor window.
- Select Source and enable Windows Mixed Reality Hand Tracking Source.

After you enable the source and open an animation asset, expand the Animation section in the Preview Scene tab to see additional options.

The hand animation hierarchy is the same as in EWMRHandKeypoint. Animation can be retargeted using WindowsMixedRealityHandTrackingLiveLinkRemapAsset:

It can also be subclassed in the editor:

Hand Mesh
Important
Hand mesh requires OpenXR.
You'll need the Microsoft OpenXR plugin, available from the Unreal Marketplace or GitHub.
Hand Mesh as a Tracked Geometry
Important
Getting hand meshes as a tracked geometry in OpenXR requires you to call Set Use Hand Mesh with Enabled Tracking Geometry.
To enable that mode, call the function with Enabled Tracking Geometry as the parameter:

Note
It’s not possible for both modes to be enabled at the same time. If you enable one, the other is automatically disabled.
Accessing Hand Mesh Data

Before you can access hand mesh data, you'll need to:
- Select your ARSessionConfig asset, expand the AR Settings -> World Mapping settings, and check Generate Mesh Data from Tracked Geometry.
Below are the default mesh parameters:
- Use Mesh Data for Occlusion
- Generate Collision for Mesh Data
- Generate Nav Mesh for Mesh Data
- Render Mesh Data in Wireframe – debug parameter that shows generated mesh
These parameter values are used as the spatial mapping mesh and hand mesh defaults. You can change them at any time in Blueprints or code for any mesh.
C++ API Reference
Use the EARObjectClassification enum to find hand mesh values among all trackable objects.
enum class EARObjectClassification : uint8
{
// Other types
HandMesh,
};
The following delegates are called when the system detects any trackable object, including a hand mesh.
class FARSupportInterface
{
public:
// Other params
DECLARE_AR_SI_DELEGATE_FUNCS(OnTrackableAdded)
DECLARE_AR_SI_DELEGATE_FUNCS(OnTrackableUpdated)
DECLARE_AR_SI_DELEGATE_FUNCS(OnTrackableRemoved)
};
Make sure your delegate handlers follow the function signature below:
void UARHandMeshComponent::OnTrackableAdded(UARTrackedGeometry* Added)
You can access mesh data through UARTrackedGeometry::GetUnderlyingMesh:
UMRMeshComponent* UARTrackedGeometry::GetUnderlyingMesh()
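As a hedged illustration, the sketch below walks all tracked geometries, filters for hand meshes by classification, and pulls the underlying mesh component; it assumes the standard UARBlueprintLibrary::GetAllGeometries and UARTrackedGeometry::GetObjectClassification calls from Unreal's AR framework.

#include "ARBlueprintLibrary.h"
#include "ARTrackable.h"
#include "MRMeshComponent.h"

// Find every currently tracked hand mesh and fetch its mesh component.
void ProcessHandMeshes()
{
    for (UARTrackedGeometry* Geometry : UARBlueprintLibrary::GetAllGeometries())
    {
        if (Geometry && Geometry->GetObjectClassification() == EARObjectClassification::HandMesh)
        {
            if (UMRMeshComponent* HandMesh = Geometry->GetUnderlyingMesh())
            {
                // Apply a custom material, read vertex data, and so on.
            }
        }
    }
}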
Blueprint API Reference
To work with Hand Meshes in Blueprints:
- Add an ARTrackableNotify Component to a Blueprint actor

- Go to the Details panel and expand the Events section.

- Overwrite On Add/Update/Remove Tracked Geometry with the following nodes in your Event Graph:

Hand Mesh visualization in OpenXR
The recommended way to visualize hand mesh is to use Epic’s XRVisualization plugin together with the Microsoft OpenXR plugin.
Then, in the Blueprint editor, use the Set Use Hand Mesh function from the Microsoft OpenXR plugin with Enabled XRVisualization as the parameter:

To manage the rendering process, you should use Render Motion Controller from XRVisualization:

The result:

If you need anything more complicated, such as drawing a hand mesh with a custom shader, you need to get the meshes as a tracked geometry.
Hand rays
Getting hand pose works for close interactions like grabbing objects or pressing buttons. However, sometimes you need to work with holograms that are far away from your users. This can be accomplished with hand rays, which can be used as pointing devices in both C++ and Blueprints. You can draw a ray from your hand to a far point and, with some help from Unreal ray tracing, select a hologram that would otherwise be out of reach.
Important
Since all function results change every frame, they're all made callable. For more information about pure and impure (callable) functions, see the Blueprint user guide on functions.
To get the data for the hand rays, you should use the Get Motion Controller Data function from the previous section. The returned structure contains two parameters you can use to create a hand ray – Aim Position and Aim Rotation. These parameters form a ray directed by your elbow. You should take them and find the hologram they point at.
Below is an example of determining whether a hand ray hits a Widget and setting a custom hit result:
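The original sample is a Blueprint graph; a rough C++ equivalent is sketched below. It assumes a UWidgetInteractionComponent configured for a custom interaction source, uses a simple visibility line trace, and picks an arbitrary trace length.

#include "HeadMountedDisplayFunctionLibrary.h"
#include "Components/WidgetInteractionComponent.h"

// Trace a ray from the right hand's aim pose and feed any hit into the widget interaction component.
void TraceHandRay(UWidgetInteractionComponent* WidgetInteraction)
{
    FXRMotionControllerData Data;
    UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(
        WidgetInteraction, EControllerHand::Right, Data);
    if (!Data.bValid)
    {
        return;
    }

    // Build a ray from the aim pose; 500 units is an arbitrary length for this sketch.
    const FVector Start = Data.AimPosition;
    const FVector End = Start + Data.AimRotation.GetForwardVector() * 500.0f;

    FHitResult Hit;
    if (WidgetInteraction->GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility))
    {
        // Set the custom hit result so the widget reacts to the hand ray.
        WidgetInteraction->SetCustomHitResult(Hit);
    }
}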

To use Hand Rays in Blueprints, search for any of the actions under Windows Mixed Reality HMD:

To access them in C++, include WindowsMixedRealityFunctionLibrary.h at the top of your calling code file.
Enum
You also have access to input cases under EHMDInputControllerButtons, which can be used in Blueprints:

For access in C++, use the EHMDInputControllerButtons enum class:
enum class EHMDInputControllerButtons : uint8
{
Select,
Grasp,
//......
};
Below is a breakdown of the two applicable enum cases:
- Select - User triggered Select event.
- Triggered in HoloLens 2 by air-tap, gaze, and commit, or by saying “Select” with voice input enabled.
- Grasp - User triggered Grasp event.
- Triggered in HoloLens 2 by closing the user’s fingers on a hologram.
You can access the tracking status of your hands in C++ through the EHMDTrackingStatus enum shown below:
enum class EHMDTrackingStatus : uint8
{
NotTracked,
//......
Tracked
};
Below is a breakdown of the two applicable enum cases:
- NotTracked – the hand isn't visible
- Tracked – the hand is fully tracked
Struct
The PointerPoseInfo struct can give you information on the following hand data:
- Origin – origin of the hand
- Direction – direction of the hand
- Up – up vector of the hand
- Orientation – orientation quaternion
- Tracking Status – current tracking status
You can access the PointerPoseInfo struct through Blueprints, as shown below:

Or with C++:
struct FPointerPoseInfo
{
FVector Origin;
FVector Direction;
FVector Up;
FQuat Orientation;
EHMDTrackingStatus TrackingStatus;
};
Functions
All of the functions listed below can be called on every frame, which allows continuous monitoring.
- Get Pointer Pose Info returns complete information about the hand ray direction in the current frame.
Blueprint:

C++:
static FPointerPoseInfo UWindowsMixedRealityFunctionLibrary::GetPointerPoseInfo(EControllerHand hand);
- Is Grasped returns true if the hand is grasped in the current frame.
Blueprint:

C++:
static bool UWindowsMixedRealityFunctionLibrary::IsGrasped(EControllerHand hand);
- Is Select Pressed returns true if the user triggered Select in the current frame.
Blueprint:

C++:
static bool UWindowsMixedRealityFunctionLibrary::IsSelectPressed(EControllerHand hand);
- Is Button Clicked returns true if the event or button is triggered in the current frame.
Blueprint:

C++:
static bool UWindowsMixedRealityFunctionLibrary::IsButtonClicked(EControllerHand hand, EHMDInputControllerButtons button);
- Get Controller Tracking Status returns the tracking status in the current frame.
Blueprint:

C++:
static EHMDTrackingStatus UWindowsMixedRealityFunctionLibrary::GetControllerTrackingStatus(EControllerHand hand);
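To tie these together, here's a sketch that polls several of these calls each frame; the left-hand choice and the free-standing helper function are assumptions for illustration only.

#include "WindowsMixedRealityFunctionLibrary.h"

// Poll the left hand's pointer pose and input state; call once per frame (for example from Tick).
void PollLeftHandRay()
{
    // Only react while the hand is fully tracked.
    if (UWindowsMixedRealityFunctionLibrary::GetControllerTrackingStatus(EControllerHand::Left)
        != EHMDTrackingStatus::Tracked)
    {
        return;
    }

    const FPointerPoseInfo Pose = UWindowsMixedRealityFunctionLibrary::GetPointerPoseInfo(EControllerHand::Left);
    const bool bSelecting = UWindowsMixedRealityFunctionLibrary::IsSelectPressed(EControllerHand::Left);
    const bool bGrasping = UWindowsMixedRealityFunctionLibrary::IsGrasped(EControllerHand::Left);

    // Use Pose.Origin and Pose.Direction to aim a ray, and the booleans to drive interaction state.
}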
Gestures
The HoloLens 2 tracks spatial gestures, which means you can capture those gestures as input. Gesture tracking is based on a subscription model. You should use the “Configure Gestures” function to tell the device which gestures you want to track. You can find more details about gestures in the HoloLens 2 Basic Usage document.
Windows Mixed Reality

Then you should add code to subscribe to the following events:

OpenXR
In OpenXR, gesture events are tracked through the input pipeline. Using hand interaction, the device can automatically recognize Tap and Hold gestures, but not the others. They're exposed as the OpenXRMsftHandInteraction Select and Grip mappings. You don't need to enable a subscription; instead, declare the events in Project Settings > Engine > Input, like this:

You can find the Blueprint function under Windows Mixed Reality Spatial Input, and the C++ function by adding WindowsMixedRealitySpatialInputFunctionLibrary.h to your calling code file.

Enum
Blueprint:

C++:
enum class ESpatialInputAxisGestureType : uint8
{
None = 0,
Manipulation = 1,
Navigation = 2,
NavigationRails = 3
};
Function
You can enable and disable gesture capture with the CaptureGestures function. When an enabled gesture fires input events, the function returns true if gesture capture succeeded, and false if there's an error.
Blueprint:

C++:
static bool UWindowsMixedRealitySpatialInputFunctionLibrary::CaptureGestures(
bool Tap = false,
bool Hold = false,
ESpatialInputAxisGestureType AxisGesture = ESpatialInputAxisGestureType::None,
bool NavigationAxisX = true,
bool NavigationAxisY = true,
bool NavigationAxisZ = true);
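As a usage sketch based on the signature above, you might subscribe to Tap, Hold, and Manipulation in one call; the parameter choices here are only an example:

#include "WindowsMixedRealitySpatialInputFunctionLibrary.h"

// Subscribe to Tap and Hold events plus Manipulation updates on all navigation axes (the defaults).
const bool bCaptured = UWindowsMixedRealitySpatialInputFunctionLibrary::CaptureGestures(
    /*Tap=*/true,
    /*Hold=*/true,
    ESpatialInputAxisGestureType::Manipulation);

if (!bCaptured)
{
    // Gesture capture failed, so the corresponding input events won't fire.
}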
The following are key events, which you can find in Blueprints and C++:


const FKey FSpatialInputKeys::TapGesture(TapGestureName);
const FKey FSpatialInputKeys::DoubleTapGesture(DoubleTapGestureName);
const FKey FSpatialInputKeys::HoldGesture(HoldGestureName);
const FKey FSpatialInputKeys::LeftTapGesture(LeftTapGestureName);
const FKey FSpatialInputKeys::LeftDoubleTapGesture(LeftDoubleTapGestureName);
const FKey FSpatialInputKeys::LeftHoldGesture(LeftHoldGestureName);
const FKey FSpatialInputKeys::RightTapGesture(RightTapGestureName);
const FKey FSpatialInputKeys::RightDoubleTapGesture(RightDoubleTapGestureName);
const FKey FSpatialInputKeys::RightHoldGesture(RightHoldGestureName);
const FKey FSpatialInputKeys::LeftManipulationGesture(LeftManipulationGestureName);
const FKey FSpatialInputKeys::LeftManipulationXGesture(LeftManipulationXGestureName);
const FKey FSpatialInputKeys::LeftManipulationYGesture(LeftManipulationYGestureName);
const FKey FSpatialInputKeys::LeftManipulationZGesture(LeftManipulationZGestureName);
const FKey FSpatialInputKeys::LeftNavigationGesture(LeftNavigationGestureName);
const FKey FSpatialInputKeys::LeftNavigationXGesture(LeftNavigationXGestureName);
const FKey FSpatialInputKeys::LeftNavigationYGesture(LeftNavigationYGestureName);
const FKey FSpatialInputKeys::LeftNavigationZGesture(LeftNavigationZGestureName);
const FKey FSpatialInputKeys::RightManipulationGesture(RightManipulationGestureName);
const FKey FSpatialInputKeys::RightManipulationXGesture(RightManipulationXGestureName);
const FKey FSpatialInputKeys::RightManipulationYGesture(RightManipulationYGestureName);
const FKey FSpatialInputKeys::RightManipulationZGesture(RightManipulationZGestureName);
const FKey FSpatialInputKeys::RightNavigationGesture(RightNavigationGestureName);
const FKey FSpatialInputKeys::RightNavigationXGesture(RightNavigationXGestureName);
const FKey FSpatialInputKeys::RightNavigationYGesture(RightNavigationYGestureName);
const FKey FSpatialInputKeys::RightNavigationZGesture(RightNavigationZGestureName);
Next Development Checkpoint
If you're following the Unreal development journey we've laid out, you're in the midst of exploring the MRTK core building blocks. From here, you can continue to the next building block:
Or jump to Mixed Reality platform capabilities and APIs:
You can always go back to the Unreal development checkpoints at any time.