# Migration guide from MRTK2 to MRTK3
As you begin using MRTK3, you'll notice that several concepts in MRTK v2 have been changed, replaced, or removed. This document helps bridge the gap between MRTK v2 concepts and their MRTK3 counterparts.
## Interactions
MRTK3 uses Unity's XR Interaction Toolkit (XRI) framework to handle interaction, and the Unity Input System plus OpenXR for input.
> **Important**
> For developers new to XRI, we recommend that you first review Unity's XRI architecture documentation. All XRI documentation also applies to MRTK3, as most interaction and input features are simply inherited from XRI.
### Terminology
| MRTK v2 term | MRTK3 term | Description |
| --- | --- | --- |
| Pointer | Interactor | Interactors perform interactions on Interactables. Some (but not all) Interactors are driven by Controllers, from which they receive input actions and poses; other Interactors operate independently of Controllers. MRTK ships several custom Interactors that provide useful mixed reality interactions on top of the basic Interactors Unity already provides. Custom Interactors can be built either through inheritance or by implementing the Interactor interfaces (`IXRHoverInteractor`, `IXRSelectInteractor`, and so forth). For more information, see the Interactor architecture documentation. |
| NearInteractionGrabbable, NearInteractionTouchable, IMixedRealityPointerHandler | Interactable | Interactables are the recipients of interactions. MRTK ships several custom Interactables that provide useful mixed reality interactions on top of the basic Interactables Unity already provides. Interactables can be built either through inheritance or by implementing the Interactable interfaces (`IXRHoverInteractable`, `IXRSelectInteractable`, and so forth). For more information on how MRTK extends XRI Interactables, see the Interactable architecture documentation. |
| Controller | Controller | An `ActionBasedController` is a collection of Unity Input Actions representing the bindings associated with a particular device. The collection of Input Actions can be derived from multiple devices, as there isn't a 1:1 relationship between `ActionBasedController`s and underlying input devices. (A `DeviceBasedController` is a 1:1 mapping of an input device, but we don't use them.) Many Interactors (`ControllerBasedInteractor`s, specifically) listen to Controllers for input actions; in other words, all `ControllerBasedInteractor`s underneath an XRController share the same select action. |
| Teleport System | Locomotion System | The Locomotion System allows the user to move about the scene during an XR experience. MRTK v2's system allows for basic teleportation and teleport hotspots, with a high degree of customizability for the teleport cursor and pointer behavior. XRI adds further locomotion capabilities, including providers for teleportation, snap turn, continuous turn, and continuous movement. |
| Focus Provider | XR Interaction Manager | The `XRInteractionManager` is the Unity mechanism that serves as the intermediary between the Interactors and Interactables in the scene. The Unity `XRInteractionManager` synchronizes and arbitrates all interactions between Interactors and Interactables, and allows for significantly greater flexibility than the legacy Focus Provider. |
| Pointer Mediator | Interaction Mode Manager | The new Interaction Mode Manager enables and disables sets of Interactors depending on context within the scene. See the mode manager documentation for more information. |
| SceneQueryMask | Interaction Layers | XRI interaction layers let developers filter which Interactors can act upon which Interactables. These layers are distinct from Unity physics layers. |
| Focus | Hover | Interactors issue Hovers on Interactables when the Interactable is a valid target for the Interactor. Generally, a Hover indicates intent from the Interactor, such as targeting with a ray, hand proximity for a grab, or the user looking at the object. |
| Select/Poke/Grab/Voice, etc. | Select | Interactors issue Selects on Interactables when the Interactable is both a valid target and the Interactor chooses to do so. `ControllerBasedInteractor`s generally emit Selects when their corresponding Controller's select input action fires. Other Interactors can have more complex logic for determining when Selects should be issued to the targeted Interactable. MRTK v2 handled different kinds of interactions with separate events and codepaths; in other words, a grab was a fundamentally different interaction than a ray click or poke, generated by separate systems. In MRTK3, all of these ways of "selecting" an object are unified under the same Select interaction. We strongly discourage developers from building interaction logic that relies on a specific type of interaction; instead, write generalizable code that responds generically to all Selects. This way, your interactions work across all input modalities, and even for types of interactions that have yet to be developed. See the Interactable architecture for further reading on why we discourage this course. |
| N/A | Activate | Activate is an extra action that can be raised on an object that has already been Selected. For instance, if a user Selects a squirt gun with the grip of the controller, the trigger fires it with an Activate action. |
| Data Provider | XRSubsystem + Provider | Most data providers are no longer necessary in MRTK3, since the Unity Input System and OpenXR handle most cross-platform input tasks. However, for some outliers that aren't yet covered by Unity, we provide `XRSubsystem`s that can supply data across different platforms, for example `HandsAggregatorSubsystem` and `SpeechSubsystem`. See the subsystems architecture documentation for more conceptual reading on our subsystems approach. |
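As a sketch of the inheritance approach described above, a custom Interactable can subclass an existing XRI interactable and override its interaction hooks. The class name and the highlight behavior here are hypothetical; the override points (`OnSelectEntered`/`OnSelectExited`) are standard XRI:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical example: reacts to any Select, regardless of which
// interactor (ray, grab, poke, gaze) issued it, following the
// modality-agnostic guidance above.
public class HighlightOnSelect : XRBaseInteractable
{
    [SerializeField] private Renderer targetRenderer;

    protected override void OnSelectEntered(SelectEnterEventArgs args)
    {
        base.OnSelectEntered(args);
        targetRenderer.material.color = Color.yellow;
    }

    protected override void OnSelectExited(SelectExitEventArgs args)
    {
        base.OnSelectExited(args);
        targetRenderer.material.color = Color.white;
    }
}
```

Because the logic lives in the generic Select hooks rather than a grab- or poke-specific handler, the same component works for rays, pokes, grabs, and gaze-pinch without modification.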
### Events
| MRTK v2 term | XRI term | Notes |
| --- | --- | --- |
| OnFocusEnter/Exit | `firstHoverEntered`, `lastHoverExited` | Note the "first" and "last" prefixes. These prefixes are included in the event names because any number of Interactors can simultaneously hover an Interactable. You can also listen to each individual hover enter/exit with `hoverEntered` and `hoverExited`, although that's less useful than monitoring the overall hover status. |
| OnPointerDown/Up | `firstSelectEntered`, `lastSelectExited` | Note the "first" and "last" prefixes. These prefixes are included in the event names because any number of Interactors can simultaneously select an Interactable (depending on the selection mode). You can also listen to each individual select enter/exit with `selectEntered` and `selectExited`, although that's less useful than monitoring the overall selection status. |
| OnPointerDragged | N/A | Poll the attach transforms of the `interactorsSelecting` list with `GetAttachTransform` during a selection. Bear in mind that, depending on the selection mode of the Interactable, an unbounded number of Interactors can select (and manipulate) an Interactable. |
| OnSourcePoseChanged, OnSourceDetected, OnSourceLost | N/A | XRI doesn't raise these events; they're handled by the XRController monitoring its associated Input Device. |
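The OnPointerDragged replacement described above, polling the selecting interactors' attach transforms, can be sketched as follows. The component and field names are illustrative; `interactorsSelecting` and `GetAttachTransform` are standard XRI members:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical sketch: follow the attach transform of the first
// selecting interactor while the interactable is selected.
public class FollowAttachTransform : MonoBehaviour
{
    [SerializeField] private XRBaseInteractable interactable;

    private void Update()
    {
        if (interactable.isSelected)
        {
            // Depending on the selection mode, any number of interactors
            // may be selecting; here we only read the first one.
            var interactor = interactable.interactorsSelecting[0];
            transform.position =
                interactor.GetAttachTransform(interactable).position;
        }
    }
}
```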
## UX Components
For full documentation on MRTK3 UX components, refer to the overviews for the UX packages: UX Core, UX Components, and UX Components (Non-Canvas). A major change in MRTK3 is the emphasis on Canvas UX components, which use Unity UI. There is also a package for Non-Canvas UX components. A comparison between Canvas and Non-Canvas UX can be found here.
> **Note**
> Hand Coach, Tooltips, Object Collection, AppBar, and Progress Indicator components do not yet exist in MRTK3. Additionally, Toolbox and optimized Text Prefabs are not implemented. This document will be updated as additional MRTK3 UX components are added.
| MRTK2 | MRTK3 | Notes |
| --- | --- | --- |
| Buttons | Canvas Button, Non-Canvas Button | In MRTK3, Unity UI-based buttons and Collider-based buttons are reworked as Canvas and Non-Canvas buttons. Built-in tools to group buttons in MRTK3 include Button Group and `ToggleCollection`. Samples can be found in the CanvasUITearsheet and NonCanvasUITearsheet scenes. |
| Slider | Canvas Slider, Non-Canvas Slider | A sample can be found in the HandInteractionExamples scene. |
| Dialog | Dialog, Dialog API | A sample can be found in the DialogExample scene. |
| Scrolling Collection | VirtualizedScrollRectList | A sample can be found in the VirtualizedScrollRectList scene. MRTK3 documentation is currently in progress. |
| Slate | Slate (Non-Canvas) | A sample can be found in the SlateDrawingExample scene. |
| See-it, Say-it Label | See-it, Say-it Label | A sample can be found in the SeeItSayItExample scene. |
| Hand Menu | Hand Menu | A sample can be found in the HandMenuExamples scene. |
| Near Menu | Near Menu | A sample can be found in the NearMenuExamples scene. |
| System Keyboard | System Keyboard | A sample can be found in the HandInteractionExamples scene. |
| Fingertip Visualization | Fingertip Visualization | The `FingerCursor` script and prefab are replaced in MRTK3; the index fingertip is visualized via the `MRTKPokeReticleVisual`, `RingReticle`, and `ReticleMagnetism` scripts. The MRTK LeftHand Controller prefab contains an example of how to use these components. |
| Constraint Manager | Constraint Manager | A sample can be found in the BoundsControlExamples scene. |
| Bounds Control, Bounding Box | Bounds Control | The `BoundingBox` script has been replaced. `BoundsControl` provides an automatically sized bounding box whose visuals can be customized. Several `BoundingBox` prefabs can be used for the visuals. A sample can be found in the BoundsControlExamples scene. |
| Object Manipulator, Manipulation Handler | Object Manipulator | Manipulation Handler is deprecated. Use Object Manipulator for the manipulation (move, rotate, scale) of an object by any interactor with a valid attach transform. A sample can be found in the HandInteractionExamples scene. |
| Interactable | StatefulInteractable | A sample can be found in the InteractableButtonExamples scene. |
| Dwell | InteractorDwellManager | In MRTK2, a `DwellHandler` was attached to objects and provided events for handling the start and end of dwell. In MRTK3, an `InteractorDwellManager` sits on the GazeInteractor and Far Rays in the MRTK XR Rig; it uses `StatefulInteractable` to determine whether the object enables dwell, and if so, it selects the object for the duration of the dwell. MRTK3 documentation is currently in progress. |
| Solvers | Solvers | MRTK3 sample scenes are currently in progress. |
| Visual Theming | Data Binding and Theming | The MRTK3 Data Binding and Theming framework is designed to make it easy to create visual elements that can be populated and updated dynamically at runtime. It is not yet integrated with Canvas UX. |
## Input Configurations
### Input Actions
MRTK 3 uses the new Unity Input System Package for input actions. Most settings can be configured through an Input Action asset.
| Task | MRTK 2 | MRTK 3 |
| --- | --- | --- |
| Create an Input Action | Input Actions Profile | Use an Action Map within the Input Action asset. |
| Bind an Input Action to a Controller | Controller Input Mapping Profile | Set the binding for an action within the Input Action asset. |
### Pointers
Pointers are attached to interactors in MRTK3. In the default MRTK XR Rig, the interactors are positioned underneath the MRTK RightHand Controller and MRTK LeftHand Controller.
| Task | MRTK 2 | MRTK 3 |
| --- | --- | --- |
| Set a visual prefab for a pointer | Pointer Prefab property in the MRTK 2 Pointer Configuration Profile. | MonoBehaviours on the MRTK RightHand Controller and MRTK LeftHand Controller in the MRTK XR Rig, for instance `MRTKPokeReticleVisual`, `MRTKLineVisual`, and `MRTKRayReticleVisual`. |
| Limit which layers can be interacted with | Pointing Raycast Layer Masks property in the MRTK Pointer Profile. This applies to all pointers. | `raycastMask` property on the Interactor script. |
| Set the extent of a pointer raycast | Pointing Extent property in the MRTK Pointer Profile. This applies to all pointers. | `maxRaycastDistance` property on the Interactor script. |
| Set the priority of pointers | Controlled by the `DefaultPointerMediator` or an override. | Configured through the `InteractionModeManager` (an MRTK3 MonoBehaviour). |
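Since these settings now live on individual interactors rather than a global profile, they can also be set per interactor in code. A minimal sketch, in which the component name and the chosen layers and distance are illustrative (`raycastMask` and `maxRaycastDistance` are standard `XRRayInteractor` properties):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical sketch: configure a single ray interactor at startup,
// replacing the global MRTK 2 Pointer Profile settings.
public class ConfigureRayInteractor : MonoBehaviour
{
    [SerializeField] private XRRayInteractor rayInteractor;

    private void Start()
    {
        // Counterpart of the MRTK 2 Pointing Raycast Layer Masks setting,
        // but scoped to this interactor only. Example layers.
        rayInteractor.raycastMask = LayerMask.GetMask("Default", "UI");

        // Counterpart of the MRTK 2 Pointing Extent setting. Example value.
        rayInteractor.maxRaycastDistance = 10f;
    }
}
```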
### Gestures
Input Actions can be assigned to various gesture input methods (currently only supported for Windows Gesture Recognition on HoloLens 2).
| Task | MRTK 2 | MRTK 3 |
| --- | --- | --- |
| Assign an action to a gesture | Assign gestures to Input Actions in the `MixedRealityGesturesProfile`. | Gestures on HoloLens 2 are now recognized through the OpenXR plugin. |
### Speech Commands
The `KeywordRecognitionSubsystem` can be enabled to allow speech commands in MRTK 3. More information can be found in the documentation on Speech Input.
| Task | MRTK 2 | MRTK 3 |
| --- | --- | --- |
| Map speech commands to Input Actions | Speech Commands Profile in the Input System Profile. | Call `CreateOrGetEventForKeyword` on the `KeywordRecognitionSubsystem` with your keyword and action. |
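Registering a keyword as described above can be sketched as follows. The component name, keyword, and listener body are illustrative, and the subsystem namespace may vary between MRTK3 releases; `XRSubsystemHelpers` and `CreateOrGetEventForKeyword` are MRTK3 APIs:

```csharp
using MixedReality.Toolkit;
using MixedReality.Toolkit.Subsystems;
using UnityEngine;

// Hypothetical sketch: register a keyword with the running
// KeywordRecognitionSubsystem and react when it is recognized.
public class SpeechCommandExample : MonoBehaviour
{
    private void Start()
    {
        var keywordSubsystem =
            XRSubsystemHelpers.GetFirstRunningSubsystem<KeywordRecognitionSubsystem>();

        if (keywordSubsystem != null)
        {
            // CreateOrGetEventForKeyword returns a UnityEvent fired on recognition.
            keywordSubsystem
                .CreateOrGetEventForKeyword("toggle menu")
                .AddListener(() => Debug.Log("Heard 'toggle menu'"));
        }
    }
}
```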
### Controller Configuration
| Task | MRTK 2 | MRTK 3 |
| --- | --- | --- |
| Configure controller button behavior | `ControllerMappingProfile` | Action Map within the Input Action asset. |
| Set a prefab for controller visualization | `ControllerMappingProfile` | Configured in the XRController settings, for instance the Model Prefab property in `ArticulatedHandController`. |