

Extended eye tracking in Unity

To access the GitHub repository for the extended eye tracking sample:

Extended eye tracking is a new capability in HoloLens 2. It is a superset of standard eye tracking, which only provides combined eye gaze data. Extended eye tracking also provides individual eye gaze data and allows applications to set different frame rates for the gaze data, such as 30, 60, and 90 fps. Other features, such as eye openness and eye vergence, are not supported by HoloLens 2 at this time.

The Extended Eye Tracking SDK enables applications to access the data and features of extended eye tracking. It can be used together with the OpenXR APIs or the legacy WinRT APIs.

This article covers how to use the extended eye tracking SDK in Unity, together with the Mixed Reality OpenXR Plugin.

Project setup

  1. Set up the Unity project for HoloLens development.
    • Select the Gaze Input capability
  2. Import the Mixed Reality OpenXR Plugin from the MRTK Feature Tool.
  3. Import the Eye Tracking SDK NuGet package into your Unity project.
    1. Download and install the NuGetForUnity package.
    2. In the Unity editor, go to NuGet->Manage NuGet Packages, and then search for Microsoft.MixedReality.EyeTracking
    3. Click the Install button to import the latest version of the NuGet package.
      Screenshot of the Eye Tracking SDK NuGet package.
  4. Add the Unity helper scripts.
    1. Add the ExtendedEyeGazeDataProvider.cs script from here to your Unity project.
    2. Create a scene, then attach the ExtendedEyeGazeDataProvider.cs script to any GameObject.
  5. Consume the functions of ExtendedEyeGazeDataProvider.cs and implement your logic.
  6. Build and deploy to HoloLens.
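Steps 4 and 5 can be sketched as a minimal consumer script. This is an illustrative example, not part of the SDK: it assumes a scene where ExtendedEyeGazeDataProvider is already attached to some GameObject, and the class name GazeConsumer is hypothetical.

```csharp
using UnityEngine;

// Hypothetical consumer script: attach it to any GameObject in the same
// scene as the ExtendedEyeGazeDataProvider component.
public class GazeConsumer : MonoBehaviour
{
    private ExtendedEyeGazeDataProvider _provider;

    void Start()
    {
        // Locate the provider attached in step 4.2. FindObjectOfType is a
        // simple option when the scene contains a single provider instance.
        _provider = FindObjectOfType<ExtendedEyeGazeDataProvider>();
        if (_provider == null)
        {
            Debug.LogError("ExtendedEyeGazeDataProvider not found in the scene.");
        }
    }
}
```

This snippet depends on the Unity runtime and the imported helper script, so it only builds inside a configured Unity project.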

Consume functions of ExtendedEyeGazeDataProvider

Note

The ExtendedEyeGazeDataProvider script depends on some APIs from the Mixed Reality OpenXR plugin to convert the coordinates of the gaze data. It cannot work if your Unity project uses the deprecated Windows XR plugin or the legacy built-in XR of older Unity versions. To make extended eye tracking also work in those scenarios:

  • If you only need to access the frame rate settings, the Mixed Reality OpenXR plugin is not required, and you can modify ExtendedEyeGazeDataProvider to keep only the frame-rate-related logic.
  • If you still need to access individual eye gaze data, you need to use the WinRT APIs in Unity. To see how to use the extended eye tracking SDK with the WinRT APIs, refer to the "See Also" section.

The ExtendedEyeGazeDataProvider class wraps the extended eye tracking SDK APIs. It provides functions to get the gaze reading in Unity world space or relative to the main camera.

Here are code samples that consume ExtendedEyeGazeDataProvider to get the gaze data.

ExtendedEyeGazeDataProvider extendedEyeGazeDataProvider;
void Update() {
    DateTime timestamp = DateTime.Now;

    // GazeType is a nested type, so it is accessed through the class name,
    // not through the instance.
    var leftGazeReadingInWorldSpace = extendedEyeGazeDataProvider.GetWorldSpaceGazeReading(ExtendedEyeGazeDataProvider.GazeType.Left, timestamp);
    var rightGazeReadingInWorldSpace = extendedEyeGazeDataProvider.GetWorldSpaceGazeReading(ExtendedEyeGazeDataProvider.GazeType.Right, timestamp);
    var combinedGazeReadingInWorldSpace = extendedEyeGazeDataProvider.GetWorldSpaceGazeReading(ExtendedEyeGazeDataProvider.GazeType.Combined, timestamp);

    var combinedGazeReadingInCameraSpace = extendedEyeGazeDataProvider.GetCameraSpaceGazeReading(ExtendedEyeGazeDataProvider.GazeType.Combined, timestamp);
}

When the ExtendedEyeGazeDataProvider script runs, it sets the gaze data frame rate to the highest option, which is currently 90 fps.
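A returned reading can then drive scene logic. The sketch below assumes the helper script returns null when no valid gaze sample is available and that its reading type exposes EyePosition and GazeDirection Vector3 fields; check the ExtendedEyeGazeDataProvider.cs script you imported, since those member names may differ.

```csharp
using System;
using UnityEngine;

public class GazeRaycaster : MonoBehaviour
{
    public ExtendedEyeGazeDataProvider extendedEyeGazeDataProvider;

    void Update()
    {
        var reading = extendedEyeGazeDataProvider.GetWorldSpaceGazeReading(
            ExtendedEyeGazeDataProvider.GazeType.Combined, DateTime.Now);

        // The provider is assumed to return null when no valid sample exists
        // for the requested timestamp (for example, before calibration).
        if (reading != null)
        {
            // Assumed field names: EyePosition and GazeDirection (Vector3).
            if (Physics.Raycast(reading.EyePosition, reading.GazeDirection,
                    out RaycastHit hit, 10.0f))
            {
                Debug.Log($"User is looking at {hit.collider.name}");
            }
        }
    }
}
```

Raycasting against scene colliders is a common way to turn a gaze ray into a "what is the user looking at" answer; any other consumer of the origin/direction pair works the same way.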

Extended eye tracking SDK API reference

Besides using the ExtendedEyeGazeDataProvider script, you can also create your own script to consume the following SDK APIs directly.

namespace Microsoft.MixedReality.EyeTracking
{
    /// <summary>
    /// Allow discovery of Eye Gaze Trackers connected to the system
    /// This is the only class from the Extended Eye Tracking SDK that the application will instantiate, 
    /// other classes' instances will be returned by method calls or properties.
    /// </summary>
    public class EyeGazeTrackerWatcher
    {
        /// <summary>
        /// Constructs an instance of the watcher
        /// </summary>
        public EyeGazeTrackerWatcher();

        /// <summary>
        /// Starts tracker enumeration.
        /// </summary>
        /// <returns>Task representing async action; completes when the initial enumeration is completed</returns>
        public System.Threading.Tasks.Task StartAsync();

        /// <summary>
        /// Stops listening to tracker additions and removals
        /// </summary>
        public void Stop();

        /// <summary>
        /// Raised when an Eye Gaze tracker is connected
        /// </summary>
        public event System.EventHandler<EyeGazeTracker> EyeGazeTrackerAdded;

        /// <summary>
        /// Raised when an Eye Gaze tracker is disconnected
        /// </summary>
        public event System.EventHandler<EyeGazeTracker> EyeGazeTrackerRemoved;        
    }

    /// <summary>
    /// Represents an Eye Tracker device
    /// </summary>
    public class EyeGazeTracker
    {
        /// <summary>
        /// True if Restricted mode is supported, which means the driver supports providing individual 
        /// eye gaze vector and frame rate 
        /// </summary>
        public bool IsRestrictedModeSupported;

        /// <summary>
        /// True if Vergence Distance is supported by tracker
        /// </summary>
        public bool IsVergenceDistanceSupported;

        /// <summary>
        /// True if Eye Openness is supported by the driver
        /// </summary>
        public bool IsEyeOpennessSupported;

        /// <summary>
        /// True if individual gazes are supported
        /// </summary>
        public bool AreLeftAndRightGazesSupported;

        /// <summary>
        /// Get the supported target frame rates of the tracker
        /// </summary>
        public System.Collections.Generic.IReadOnlyList<EyeGazeTrackerFrameRate> SupportedTargetFrameRates;

        /// <summary>
        /// NodeId of the tracker, used to retrieve a SpatialLocator or SpatialGraphNode to locate the tracker in the scene
        /// for the Perception API, use SpatialGraphInteropPreview.CreateLocatorForNode
        /// for the Mixed Reality OpenXR API, use SpatialGraphNode.FromDynamicNodeId
        /// </summary>
        public Guid TrackerSpaceLocatorNodeId;

        /// <summary>
        /// Opens the tracker
        /// </summary>
        /// <param name="restrictedMode">True if restricted mode active</param>
        /// <returns>Task representing async action; completes when the tracker is opened</returns>
        public System.Threading.Tasks.Task OpenAsync(bool restrictedMode);

        /// <summary>
        /// Closes the tracker
        /// </summary>
        public void Close();

        /// <summary>
        /// Changes the target frame rate of the tracker
        /// </summary>
        /// <param name="newFrameRate">Target frame rate</param>
        public void SetTargetFrameRate(EyeGazeTrackerFrameRate newFrameRate);

        /// <summary>
        /// Try to get tracker state at a given timestamp
        /// </summary>
        /// <param name="timestamp">timestamp</param>
        /// <returns>State if available, null otherwise</returns>
        public EyeGazeTrackerReading TryGetReadingAtTimestamp(DateTime timestamp);

        /// <summary>
        /// Try to get tracker state at a system relative time
        /// </summary>
        /// <param name="time">time</param>
        /// <returns>State if available, null otherwise</returns>
        public EyeGazeTrackerReading TryGetReadingAtSystemRelativeTime(TimeSpan time);

        /// <summary>
        /// Try to get the first tracker state after a given timestamp
        /// </summary>
        /// <param name="timestamp">timestamp</param>
        /// <returns>State if available, null otherwise</returns>
        public EyeGazeTrackerReading TryGetReadingAfterTimestamp(DateTime timestamp);

        /// <summary>
        /// Try to get the first tracker state after a system relative time
        /// </summary>
        /// <param name="time">time</param>
        /// <returns>State if available, null otherwise</returns>
        public EyeGazeTrackerReading TryGetReadingAfterSystemRelativeTime(TimeSpan time);
    }

    /// <summary>
    /// Represents a frame rate supported by an Eye Tracker
    /// </summary>
    public class EyeGazeTrackerFrameRate
    {
        /// <summary>
        /// Frames per second of the frame rate
        /// </summary>
        public UInt32 FramesPerSecond;
    }

    /// <summary>
    /// Snapshot of Gaze Tracker state
    /// </summary>
    public class EyeGazeTrackerReading
    {
        /// <summary>
        /// Timestamp of state
        /// </summary>
        public DateTime Timestamp;

        /// <summary>
        /// Timestamp of state as system relative time
        /// Its SystemRelativeTime.Ticks could provide the QPC time to locate tracker pose 
        /// </summary>
        public TimeSpan SystemRelativeTime;

        /// <summary>
        /// Indicates if user calibration is valid
        /// </summary>
        public bool IsCalibrationValid;

        /// <summary>
        /// Tries to get a vector representing the combined gaze related to the tracker's node
        /// </summary>
        /// <param name="origin">Origin of the gaze vector</param>
        /// <param name="direction">Direction of the gaze vector</param>
        /// <returns></returns>
        public bool TryGetCombinedEyeGazeInTrackerSpace(out System.Numerics.Vector3 origin, out System.Numerics.Vector3 direction);

        /// <summary>
        /// Tries to get a vector representing the left eye gaze related to the tracker's node
        /// </summary>
        /// <param name="origin">Origin of the gaze vector</param>
        /// <param name="direction">Direction of the gaze vector</param>
        /// <returns></returns>
        public bool TryGetLeftEyeGazeInTrackerSpace(out System.Numerics.Vector3 origin, out System.Numerics.Vector3 direction);

        /// <summary>
        /// Tries to get a vector representing the right eye gaze related to the tracker's node position
        /// </summary>
        /// <param name="origin">Origin of the gaze vector</param>
        /// <param name="direction">Direction of the gaze vector</param>
        /// <returns></returns>
        public bool TryGetRightEyeGazeInTrackerSpace(out System.Numerics.Vector3 origin, out System.Numerics.Vector3 direction);

        /// <summary>
        /// Tries to read vergence distance
        /// </summary>
        /// <param name="value">Vergence distance if available</param>
        /// <returns>bool if value is valid</returns>
        public bool TryGetVergenceDistance(out float value);

        /// <summary>
        /// Tries to get left Eye openness information
        /// </summary>
        /// <param name="value">Eye Openness if valid</param>
        /// <returns>bool if value is valid</returns>
        public bool TryGetLeftEyeOpenness(out float value);

        /// <summary>
        /// Tries to get right Eye openness information
        /// </summary>
        /// <param name="value">Eye openness if valid</param>
        /// <returns>bool if value is valid</returns>
        public bool TryGetRightEyeOpenness(out float value);
    }
}
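As a sketch of how these APIs fit together, the following class wires up a watcher, opens the first tracker it finds in restricted mode, selects the highest supported frame rate, and reads the combined gaze. It uses only the members listed in the reference above; error handling and coordinate-space conversion via TrackerSpaceLocatorNodeId are omitted, and the class name DirectSdkSample is illustrative.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.MixedReality.EyeTracking;

public class DirectSdkSample
{
    private EyeGazeTrackerWatcher _watcher;
    private EyeGazeTracker _tracker;

    public async Task StartAsync()
    {
        _watcher = new EyeGazeTrackerWatcher();
        _watcher.EyeGazeTrackerAdded += OnTrackerAdded;
        // Completes when the initial enumeration is done.
        await _watcher.StartAsync();
    }

    private async void OnTrackerAdded(object sender, EyeGazeTracker tracker)
    {
        _tracker = tracker;

        // Restricted mode unlocks individual eye gaze vectors and frame rate
        // control (requires the Gaze Input capability and user consent).
        await _tracker.OpenAsync(true);

        // Pick the highest supported target frame rate (for example, 90 fps).
        EyeGazeTrackerFrameRate best = null;
        foreach (var rate in _tracker.SupportedTargetFrameRates)
        {
            if (best == null || rate.FramesPerSecond > best.FramesPerSecond)
            {
                best = rate;
            }
        }
        _tracker.SetTargetFrameRate(best);
    }

    public void ReadGaze()
    {
        // Null when no sample is available for the requested timestamp.
        var reading = _tracker?.TryGetReadingAtTimestamp(DateTime.Now);
        if (reading != null && reading.TryGetCombinedEyeGazeInTrackerSpace(
                out System.Numerics.Vector3 origin,
                out System.Numerics.Vector3 direction))
        {
            // origin/direction are in tracker space; use TrackerSpaceLocatorNodeId
            // with SpatialGraphNode.FromDynamicNodeId to locate them in the scene.
        }
    }
}
```

This code requires the Extended Eye Tracking SDK and a HoloLens 2 device, so it cannot run in a plain desktop environment.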

See also