Getting started with eye tracking in MRTK2

This page covers how to set up your Unity MRTK scene to use eye tracking in your app. The following sections assume you're starting with a fresh scene. Alternatively, you can check out our already configured MRTK eye tracking examples, which include plenty of samples that you can build on directly.

Eye tracking requirements checklist

For eye tracking to work correctly, the following requirements must be met. If you're new to eye tracking on HoloLens 2 and to how eye tracking is set up in MRTK, don't worry! We'll go into detail on how to address each requirement in the following sections.

  1. An 'Eye Gaze Data Provider' must be added to the input system. This data provider provides eye tracking data from the platform.
  2. The 'GazeInput' capability must be enabled in the application manifest. This capability can be set in Unity 2019, but in Unity 2018 and earlier this capability is only available in Visual Studio and through the MRTK build tool.
  3. The HoloLens must be eye calibrated for the current user. Check out our sample for detecting whether a user is eye calibrated or not.

A note on the GazeInput capability

The MRTK-provided build tooling (Mixed Reality Toolkit -> Utilities -> Build Window) can automatically enable the GazeInput capability for you. To do this, make sure that the 'Gaze Input Capability' is checked on the 'Appx Build Options' tab:

MRTK Build Tools

This tooling will find the AppX manifest after the Unity build is completed and automatically add the GazeInput capability. Prior to Unity 2019, this tooling is NOT active when using Unity's built-in Build Window (File -> Build Settings).

Prior to Unity 2019, when using Unity's build window, the capability will need to be manually added after the Unity build, as follows:

  1. Open your compiled Visual Studio project and then open the 'Package.appxmanifest' in your solution.
  2. Make sure to tick the 'GazeInput' checkbox under Capabilities. If you don't see a 'GazeInput' capability, check that your system meets the prerequisites for using MRTK (in particular the Windows SDK version).

Note: You only have to do this if you build into a new build folder. If you've already built your Unity project and set up the appxmanifest before, and you now target the same folder again, you won't need to reapply your changes.
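For reference, the capability you tick in step 2 corresponds roughly to the following entry in 'Package.appxmanifest' (the other entries in your Capabilities element will vary by app):

```xml
<Capabilities>
  <!-- Your app's other capability declarations stay in place. -->
  <DeviceCapability Name="gazeInput" />
</Capabilities>
```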

Setting up eye tracking step-by-step

Setting up the scene

Set up the MixedRealityToolkit by clicking 'Mixed Reality Toolkit -> Configure…' in the menu bar.

MRTK configure

Setting up the MRTK profiles required for eye tracking

After setting up your MRTK scene, you'll be asked to choose a profile for MRTK. You can select DefaultMixedRealityToolkitConfigurationProfile and then select the 'Copy & Customize' option.

MRTK profile

Create an "eye gaze data provider"

  • Click on the 'Input' tab in your MRTK profile.
  • To edit the default one ('DefaultMixedRealityInputSystemProfile'), click the 'Clone' button next to it. A 'Clone Profile' menu appears. Click on 'Clone' at the bottom of that menu.
  • Double click on your new input profile, expand 'Input Data Providers', and select '+ Add Data Provider'.
  • Add the correct data provider:
    • For legacy Windows Mixed Reality
      • Under Type select 'Microsoft.MixedReality.Toolkit.WindowsMixedReality.Input' -> 'WindowsMixedRealityEyeGazeDataProvider'
    • For Windows XR Plugin
      • Under Type select 'Microsoft.MixedReality.Toolkit.XRSDK.WindowsMixedReality' -> 'WindowsMixedRealityEyeGazeDataProvider'
    • For OpenXR
      • Under Type select 'Microsoft.MixedReality.Toolkit.XRSDK.OpenXR' -> 'OpenXREyeGazeDataProvider'

MRTK data provider

Enable "Use Eye Tracking Data"

  • Navigate to your MRTK configuration profile -> 'Input' -> 'Pointers'
    • Clone the 'DefaultMixedRealityInputPointerProfile' to make changes to it.
  • Find 'Use Eye Tracking Data' in the settings and enable it.
  • A 'Set GazeInput capability' button may now appear; if it does, press it to enable the UWP capability.

MRTK use eye tracking data

Simulating eye tracking in the Unity Editor

You can simulate eye tracking input in the Unity Editor to ensure that events are correctly triggered before deploying the app to your HoloLens 2. The eye gaze signal is simulated by using the camera's location as the eye gaze origin and the camera's forward vector as the eye gaze direction. While this is great for initial testing, keep in mind that it doesn't imitate rapid eye movements well. For that reason, it's better to test your eye-based interactions frequently on the HoloLens 2.

  1. Enable simulated eye tracking:

    • Click on the 'Input' tab in your MRTK configuration profile.
    • From there, navigate to 'Input Data Providers' -> 'Input Simulation Service'.
    • Clone the 'DefaultMixedRealityInputSimulationProfile' to make changes to it.
    • Select the appropriate 'Default Eye Gaze Simulation Mode' setting.

    MRTK eyes simulate

  2. Disable default head gaze cursor: In general, we recommend that you avoid showing an eye gaze cursor or, if it's absolutely required, make it very subtle. We recommend that you hide the default head gaze cursor that's attached to the MRTK gaze pointer profile by default.

    • Navigate to your MRTK configuration profile -> 'Input' -> 'Pointers'
    • Clone the 'DefaultMixedRealityInputPointerProfile' to make changes to it.
    • At the top of the 'Pointer Settings', you should assign an invisible cursor prefab to the 'Gaze Cursor Prefab'. You can do this by selecting the 'EyeGazeCursor' prefab from the MRTK Foundation.

Enabling eye-based gaze in the gaze provider

In HoloLens v1, head gaze was used as the primary pointing technique. While head gaze is still available via the GazeProvider in MRTK, which is attached to your Camera, you can use eye gaze instead by selecting the 'IsEyeTrackingEnabled' checkbox in the gaze settings of the input pointer profile.


Developers can toggle between eye-based gaze and head-based gaze in code by changing the 'IsEyeTrackingEnabled' property of 'GazeProvider'.
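As a sketch, that toggle might look like the following (assuming MRTK 2's `CoreServices` API; the script and method names here are hypothetical):

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical example: switches between eye-based and head-based gaze at runtime.
public class GazeModeToggle : MonoBehaviour
{
    public void SetEyeGaze(bool enable)
    {
        // The GazeProvider attached to the camera exposes IsEyeTrackingEnabled in MRTK 2.
        var gazeProvider = CoreServices.InputSystem?.GazeProvider as GazeProvider;
        if (gazeProvider != null)
        {
            gazeProvider.IsEyeTrackingEnabled = enable;
        }
    }
}
```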


If any of the eye tracking requirements are not met, the application will automatically fall back to head-based gaze.

Accessing eye gaze data

Now that your scene is set up to use eye tracking, let's take a look at how to access it in your scripts: Accessing eye tracking data via EyeGazeProvider and eye-supported target selections.
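As a minimal sketch, reading the current gaze ray through MRTK 2's `EyeGazeProvider` could look like this (the script name is hypothetical):

```csharp
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

// Hypothetical example: logs the current eye gaze ray each frame.
public class EyeGazeLogger : MonoBehaviour
{
    private void Update()
    {
        var eyeGazeProvider = CoreServices.InputSystem?.EyeGazeProvider;
        if (eyeGazeProvider != null && eyeGazeProvider.IsEyeTrackingEnabledAndValid)
        {
            Debug.Log($"Gaze origin: {eyeGazeProvider.GazeOrigin}, " +
                      $"direction: {eyeGazeProvider.GazeDirection}");
        }
    }
}
```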

Testing your Unity app on a HoloLens 2

Building your app with eye tracking is similar to compiling other HoloLens 2 MRTK apps. Be sure that you've enabled the 'Gaze Input' capability as described above in the section A note on the GazeInput capability.

Ensure project is configured for HoloLens 2

Make sure you've properly configured your project by reviewing the configuration steps covered in Set up an OpenXR project with MRTK.

Key points to think about:

  • Review key settings in the Project Settings window.
    • Ensure Plugin Providers are set correctly
  • Resolve warning triangles.
    • Interaction profiles should be set
    • OpenXR Feature Groups should be properly selected

Eye calibration

Don't forget to run through the eye calibration on your HoloLens 2. The eye tracking system won't return any input if the user isn't calibrated. The easiest way to get to the calibration is by flipping the visor up and then back down. A system notification should appear that welcomes you as a new user and asks you to go through the eye calibration. Alternatively you can find the eye calibration in the system settings: Settings > System > Calibration > Run eye calibration.
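If you want to react to the calibration state in your app, a sketch along these lines is possible (assuming the `IsEyeCalibrationValid` property on MRTK 2's `EyeGazeProvider`; the script name is hypothetical):

```csharp
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

// Hypothetical example: warns if the current user hasn't run eye calibration.
public class EyeCalibrationCheck : MonoBehaviour
{
    private void Update()
    {
        // IsEyeCalibrationValid is a nullable bool: null while the status is still unknown.
        bool? isCalibrated = CoreServices.InputSystem?.EyeGazeProvider?.IsEyeCalibrationValid;
        if (isCalibrated.HasValue && !isCalibrated.Value)
        {
            Debug.LogWarning("Eye tracking is not calibrated for the current user.");
        }
    }
}
```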

Eye tracking permission

When you start the app on your HoloLens 2 for the first time, a prompt should pop up asking the user for permission to use eye tracking. If the prompt isn't showing up, that's usually an indication that the 'GazeInput' capability wasn't set.

Once the permission prompt has been shown, it won't appear again automatically. If you denied eye tracking permission, you can reset it in Settings -> Privacy -> Apps.

This should get you started with using eye tracking in your MRTK Unity app. Don't forget to check out our MRTK eye tracking tutorials and samples demonstrating how to use eye tracking input and conveniently providing scripts that you can reuse in your projects.

Back to "Eye tracking in the MixedRealityToolkit"