Extended eye tracking in Unity

To access the GitHub repository for the extended eye tracking sample:

Extended eye tracking is a new capability in HoloLens 2. It's a superset of the standard eye tracking, which provides only combined eye gaze data. Extended eye tracking also provides individual eye gaze data and allows applications to set different frame rates for the gaze data, such as 30, 60, and 90 FPS. Other features like eye openness and eye vergence aren't currently supported by HoloLens 2.

The Extended Eye Tracking SDK enables applications to access the data and features of extended eye tracking. It works with either OpenXR APIs or legacy WinRT APIs.

This article covers how to use the Extended Eye Tracking SDK in Unity, together with the Mixed Reality OpenXR plugin.

Project setup

  1. Set up your Unity project for HoloLens development.
    • Select the Gaze Input capability.
  2. Import the Mixed Reality OpenXR plugin from the MRTK Feature Tool.
  3. Import the Eye Tracking SDK NuGet package into your Unity project.
    1. Download and install the NuGetForUnity package.
    2. In the Unity editor, go to NuGet -> Manage NuGet Packages, and then search for Microsoft.MixedReality.EyeTracking.
    3. Click the Install button to import the latest version of the NuGet package.
      (Screenshot of the Eye Tracking SDK NuGet package.)
  4. Add the Unity helper script.
    1. Add the ExtendedEyeGazeDataProvider.cs script from here to your Unity project.
    2. Create a scene, and then attach the ExtendedEyeGazeDataProvider.cs script to any GameObject.
  5. Consume the functions of ExtendedEyeGazeDataProvider.cs and implement your logic.
  6. Build and deploy to HoloLens.

Consume the functions of ExtendedEyeGazeDataProvider

Note

The ExtendedEyeGazeDataProvider script depends on some APIs from the Mixed Reality OpenXR plugin to convert the coordinates of the gaze data. It can't work if your Unity project uses the deprecated Windows XR plugin or the legacy built-in XR available in older Unity versions. To make extended eye tracking work in those scenarios:

  • If you only need to access the frame rate settings, the Mixed Reality OpenXR plugin isn't required, and you can modify ExtendedEyeGazeDataProvider to keep only the frame-rate-related logic.
  • If you still need to access individual eye gaze data, you need to use WinRT APIs in Unity. To learn how to use the Extended Eye Tracking SDK with WinRT APIs, see the relevant section.

The ExtendedEyeGazeDataProvider class wraps the Extended Eye Tracking SDK APIs. It provides functions to get a gaze reading in Unity world space or relative to the main camera.

Here's a code sample that uses ExtendedEyeGazeDataProvider to get the gaze readings:

ExtendedEyeGazeDataProvider extendedEyeGazeDataProvider;

void Update()
{
    var timestamp = DateTime.Now;

    // Individual left and right eye gaze readings, in Unity world space.
    var leftGazeReadingInWorldSpace = extendedEyeGazeDataProvider.GetWorldSpaceGazeReading(ExtendedEyeGazeDataProvider.GazeType.Left, timestamp);
    var rightGazeReadingInWorldSpace = extendedEyeGazeDataProvider.GetWorldSpaceGazeReading(ExtendedEyeGazeDataProvider.GazeType.Right, timestamp);

    // Combined gaze reading, in world space and relative to the main camera.
    var combinedGazeReadingInWorldSpace = extendedEyeGazeDataProvider.GetWorldSpaceGazeReading(ExtendedEyeGazeDataProvider.GazeType.Combined, timestamp);
    var combinedGazeReadingInCameraSpace = extendedEyeGazeDataProvider.GetCameraSpaceGazeReading(ExtendedEyeGazeDataProvider.GazeType.Combined, timestamp);
}
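To act on a reading, for example to visualize where the user is looking, you can use its origin and direction. Here's a minimal, hypothetical sketch; the field names EyePosition and GazeDirection are assumptions about the reading type returned by the helper script, so verify them against your copy of ExtendedEyeGazeDataProvider.cs.

var reading = extendedEyeGazeDataProvider.GetWorldSpaceGazeReading(ExtendedEyeGazeDataProvider.GazeType.Combined, DateTime.Now);
if (reading != null)
{
    // Assumed fields: EyePosition (ray origin) and GazeDirection (normalized direction).
    Debug.DrawRay(reading.EyePosition, reading.GazeDirection * 2.0f, Color.green);
}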

When the ExtendedEyeGazeDataProvider script runs, it sets the gaze data frame rate to the highest available option, which is currently 90 FPS.
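If you consume the SDK directly instead of using the helper script, you choose the frame rate yourself. Here's a minimal sketch, assuming an already-opened EyeGazeTracker instance, that picks the highest supported rate through the SupportedTargetFrameRates and SetTargetFrameRate members documented in the API reference below.

using System.Linq;
using Microsoft.MixedReality.EyeTracking;

static void SelectHighestFrameRate(EyeGazeTracker tracker)
{
    // SupportedTargetFrameRates lists the rates the driver can deliver, for example 30, 60, and 90 FPS.
    var highest = tracker.SupportedTargetFrameRates.OrderByDescending(rate => rate.FramesPerSecond).First();
    tracker.SetTargetFrameRate(highest);
}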

API reference of the Extended Eye Tracking SDK

Besides using the ExtendedEyeGazeDataProvider script, you can also create your own script to consume the following SDK APIs directly.

namespace Microsoft.MixedReality.EyeTracking
{
    /// <summary>
    /// Allow discovery of Eye Gaze Trackers connected to the system.
    /// This is the only class from the Extended Eye Tracking SDK that the application instantiates;
    /// instances of the other classes are returned by method calls or properties.
    /// </summary>
    public class EyeGazeTrackerWatcher
    {
        /// <summary>
        /// Constructs an instance of the watcher
        /// </summary>
        public EyeGazeTrackerWatcher();

        /// <summary>
        /// Starts tracker enumeration.
        /// </summary>
        /// <returns>Task representing async action; completes when the initial enumeration is completed</returns>
        public System.Threading.Tasks.Task StartAsync();

        /// <summary>
        /// Stops listening to tracker additions and removals
        /// </summary>
        public void Stop();

        /// <summary>
        /// Raised when an Eye Gaze tracker is connected
        /// </summary>
        public event System.EventHandler<EyeGazeTracker> EyeGazeTrackerAdded;

        /// <summary>
        /// Raised when an Eye Gaze tracker is disconnected
        /// </summary>
        public event System.EventHandler<EyeGazeTracker> EyeGazeTrackerRemoved;        
    }

    /// <summary>
    /// Represents an Eye Tracker device
    /// </summary>
    public class EyeGazeTracker
    {
        /// <summary>
        /// True if Restricted mode is supported, which means the driver supports providing individual 
        /// eye gaze vector and frame rate 
        /// </summary>
        public bool IsRestrictedModeSupported;

        /// <summary>
        /// True if Vergence Distance is supported by tracker
        /// </summary>
        public bool IsVergenceDistanceSupported;

        /// <summary>
        /// True if Eye Openness is supported by the driver
        /// </summary>
        public bool IsEyeOpennessSupported;

        /// <summary>
        /// True if individual gazes are supported
        /// </summary>
        public bool AreLeftAndRightGazesSupported;

        /// <summary>
        /// Get the supported target frame rates of the tracker
        /// </summary>
        public System.Collections.Generic.IReadOnlyList<EyeGazeTrackerFrameRate> SupportedTargetFrameRates;

        /// <summary>
        /// NodeId of the tracker, used to retrieve a SpatialLocator or SpatialGraphNode to locate the tracker in the scene
        /// for the Perception API, use SpatialGraphInteropPreview.CreateLocatorForNode
        /// for the Mixed Reality OpenXR API, use SpatialGraphNode.FromDynamicNodeId
        /// </summary>
        public Guid TrackerSpaceLocatorNodeId;

        /// <summary>
        /// Opens the tracker
        /// </summary>
        /// <param name="restrictedMode">True if restricted mode active</param>
        /// <returns>Task representing async action; completes when the tracker has opened</returns>
        public System.Threading.Tasks.Task OpenAsync(bool restrictedMode);

        /// <summary>
        /// Closes the tracker
        /// </summary>
        public void Close();

        /// <summary>
        /// Changes the target frame rate of the tracker
        /// </summary>
        /// <param name="newFrameRate">Target frame rate</param>
        public void SetTargetFrameRate(EyeGazeTrackerFrameRate newFrameRate);

        /// <summary>
        /// Try to get tracker state at a given timestamp
        /// </summary>
        /// <param name="timestamp">timestamp</param>
        /// <returns>State if available, null otherwise</returns>
        public EyeGazeTrackerReading TryGetReadingAtTimestamp(DateTime timestamp);

        /// <summary>
        /// Try to get tracker state at a system relative time
        /// </summary>
        /// <param name="time">time</param>
        /// <returns>State if available, null otherwise</returns>
        public EyeGazeTrackerReading TryGetReadingAtSystemRelativeTime(TimeSpan time);

        /// <summary>
        /// Try to get the first tracker state after a given timestamp
        /// </summary>
        /// <param name="timestamp">timestamp</param>
        /// <returns>State if available, null otherwise</returns>
        public EyeGazeTrackerReading TryGetReadingAfterTimestamp(DateTime timestamp);

        /// <summary>
        /// Try to get the first tracker state after a system relative time
        /// </summary>
        /// <param name="time">time</param>
        /// <returns>State if available, null otherwise</returns>
        public EyeGazeTrackerReading TryGetReadingAfterSystemRelativeTime(TimeSpan time);
    }

    /// <summary>
    /// Represents a frame rate supported by an Eye Tracker
    /// </summary>
    public class EyeGazeTrackerFrameRate
    {
        /// <summary>
        /// Frames per second of the frame rate
        /// </summary>
        public UInt32 FramesPerSecond;
    }

    /// <summary>
    /// Snapshot of Gaze Tracker state
    /// </summary>
    public class EyeGazeTrackerReading
    {
        /// <summary>
        /// Timestamp of state
        /// </summary>
        public DateTime Timestamp;

        /// <summary>
        /// Timestamp of state as system relative time
        /// Its SystemRelativeTime.Ticks could provide the QPC time to locate tracker pose 
        /// </summary>
        public TimeSpan SystemRelativeTime;

        /// <summary>
        /// Indicates if user calibration is valid
        /// </summary>
        public bool IsCalibrationValid;

        /// <summary>
        /// Tries to get a vector representing the combined gaze relative to the tracker's node
        /// </summary>
        /// <param name="origin">Origin of the gaze vector</param>
        /// <param name="direction">Direction of the gaze vector</param>
        /// <returns></returns>
        public bool TryGetCombinedEyeGazeInTrackerSpace(out System.Numerics.Vector3 origin, out System.Numerics.Vector3 direction);

        /// <summary>
        /// Tries to get a vector representing the left eye gaze relative to the tracker's node
        /// </summary>
        /// <param name="origin">Origin of the gaze vector</param>
        /// <param name="direction">Direction of the gaze vector</param>
        /// <returns></returns>
        public bool TryGetLeftEyeGazeInTrackerSpace(out System.Numerics.Vector3 origin, out System.Numerics.Vector3 direction);

        /// <summary>
        /// Tries to get a vector representing the right eye gaze relative to the tracker's node position
        /// </summary>
        /// <param name="origin">Origin of the gaze vector</param>
        /// <param name="direction">Direction of the gaze vector</param>
        /// <returns></returns>
        public bool TryGetRightEyeGazeInTrackerSpace(out System.Numerics.Vector3 origin, out System.Numerics.Vector3 direction);

        /// <summary>
        /// Tries to read vergence distance
        /// </summary>
        /// <param name="value">Vergence distance if available</param>
        /// <returns>True if the value is valid</returns>
        public bool TryGetVergenceDistance(out float value);

        /// <summary>
        /// Tries to get left Eye openness information
        /// </summary>
        /// <param name="value">Eye Openness if valid</param>
        /// <returns>True if the value is valid</returns>
        public bool TryGetLeftEyeOpenness(out float value);

        /// <summary>
        /// Tries to get right Eye openness information
        /// </summary>
        /// <param name="value">Eye openness if valid</param>
        /// <returns>True if the value is valid</returns>
        public bool TryGetRightEyeOpenness(out float value);
    }
}
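To tie the pieces together, here's a minimal sketch of the direct consumption path: discover a tracker with EyeGazeTrackerWatcher, open it in restricted mode, then poll readings. The types and members come from the reference above; the class name and polling structure are illustrative.

using System;
using System.Threading.Tasks;
using Microsoft.MixedReality.EyeTracking;

public class DirectEyeGazeSample
{
    private EyeGazeTracker _tracker;

    public async Task InitializeAsync()
    {
        var watcher = new EyeGazeTrackerWatcher();
        watcher.EyeGazeTrackerAdded += async (sender, tracker) =>
        {
            // Restricted mode enables individual eye gaze and frame rate control.
            await tracker.OpenAsync(true);
            _tracker = tracker;
        };
        await watcher.StartAsync();
    }

    public void PollGaze()
    {
        // TryGetReadingAtTimestamp returns null when no state is available.
        EyeGazeTrackerReading reading = _tracker?.TryGetReadingAtTimestamp(DateTime.Now);
        if (reading != null && reading.TryGetCombinedEyeGazeInTrackerSpace(out var origin, out var direction))
        {
            // origin and direction are in the tracker's own space; locate the tracker with
            // TrackerSpaceLocatorNodeId (for example via SpatialGraphNode.FromDynamicNodeId)
            // to transform them into the scene's coordinate system.
        }
    }
}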

See also