
Table of contents

  1. Overview
  2. Using with Ultraleap or Pimax devices
    1. Note for Varjo Aero users
  3. Using with Oculus Quest or Quest 2
  4. Customizing the tracking and gestures
    1. Per-application configuration file
      1. Naming of the configuration file
    2. Prerequisites
    3. Grip and aim poses (Offsets tab)
      1. Defaults
    4. Controller bindings (Bindings tab)
      1. Gestures definitions
      2. Input actions definitions
      3. Defaults
    5. Sensitivity (Gestures tab)
    6. Trigger-on-haptics (Haptics tab)

Overview

The OpenXR Toolkit does not add hand tracking capability to your headset; instead, it leverages the hand tracking capabilities of certain devices (such as the Leap Motion controller) and enables the tracked hands to act as VR controllers.

Hand tracking in Microsoft Flight Simulator 2020

Devices confirmed to work:

We would love to add more devices to this list! If you have a device that supports hand tracking via OpenXR that is not on this list, please contact us on Discord or submit an issue on GitHub.

Using with Ultraleap or Pimax devices

  1. Download and install the Leap Motion tracking software.

  2. Use the included Visualizer app to confirm that the Leap Motion Controller is properly set up and functional.

  3. Download and install the Ultraleap OpenXR Hand Tracking API Layer.

  4. With your game running, open the menu (Ctrl+F2), then navigate to the Input tab. Under Controller emulation, select either Both to use both hands, or Left/Right to use only one hand. Restart the VR session for the hand tracking to begin.

Enable hand tracking
Enable hand tracking in OpenXR Toolkit

Note for Varjo Aero users

In order to use an Ultraleap device with the Varjo Aero, please follow these additional steps (after step 3 above):

  1. Copy the OpenXR Toolkit hand configuration file for your application (e.g. FS2020.cfg) from the %ProgramFiles%\OpenXR-Toolkit folder to the %LocalAppData%\OpenXR-Toolkit\configs folder.

  2. Edit the copy of the hand configuration file for your application (e.g. %LocalAppData%\OpenXR-Toolkit\configs\FS2020.cfg), and replace the following content:

interaction_profile=/interaction_profiles/hp/mixed_reality_controller

with:

interaction_profile=/interaction_profiles/valve/index_controller

Using with Oculus Quest or Quest 2

  1. IMPORTANT: Create and log in with a developer account. You must be a registered developer for the Oculus runtime to expose the hand tracking features.

  2. Ensure that your Oculus software is up-to-date (v37 or later).

  3. Turn on hand tracking on the headset: https://support.oculus.com/articles/headsets-and-accessories/controllers-and-hand-tracking/index-controllers-hand-tracking

Enabling hand tracking

  4. Turn on Developer mode on the headset.
Enabling developer mode
  5. On your PC, open the Oculus App, and navigate to Settings and the Beta menu.

  6. Enable Developer Runtime Features.

Enabling hand tracking

  7. Copy the OpenXR Toolkit hand configuration file for your application (e.g. FS2020.cfg) from the %ProgramFiles%\OpenXR-Toolkit folder to the %LocalAppData%\OpenXR-Toolkit\configs folder.

  8. Edit the copy of the hand configuration file for your application (e.g. %LocalAppData%\OpenXR-Toolkit\configs\FS2020.cfg), and replace the following content:

interaction_profile=/interaction_profiles/hp/mixed_reality_controller

with:

interaction_profile=/interaction_profiles/oculus/touch_controller
  9. With your game running, open the menu (Ctrl+F2), then navigate to the Input tab. Under Controller emulation, select either Both to use both hands, or Left/Right to use only one hand. Restart the VR session for the hand tracking to begin.

Enable hand tracking
Enable hand tracking in OpenXR Toolkit

Customizing the tracking and gestures

Per-application configuration file

For each application, the hand tracking experience can be customized through a configuration file.

The OpenXR Toolkit may provide a default configuration specific to an app. The preset configuration files are stored in %ProgramFiles%\OpenXR-Toolkit, for example FS2020.cfg for Microsoft Flight Simulator 2020.

When a game or application starts with hand tracking enabled, the toolkit first attempts to load the configuration file from the %LocalAppData%\OpenXR-Toolkit\configs folder, then from the %ProgramFiles%\OpenXR-Toolkit folder, before falling back to the built-in defaults if no file is found.
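As an illustration of this lookup order, the following sketch (plain Python, for illustration only; this is not how the toolkit is implemented) shows how a configuration file would be resolved for an application named FS2020:

import os

# Illustration of the configuration lookup order described above
# (not the toolkit's actual implementation).
def find_hand_config(app_name):
    candidates = [
        # 1) User-provided configuration takes precedence.
        os.path.join(os.path.expandvars(r"%LocalAppData%\OpenXR-Toolkit\configs"), app_name + ".cfg"),
        # 2) Preset configuration shipped with the toolkit.
        os.path.join(os.path.expandvars(r"%ProgramFiles%\OpenXR-Toolkit"), app_name + ".cfg"),
    ]
    for path in candidates:
        if os.path.isfile(path):
            return path
    return None  # no file found: fall back to the built-in defaults

print(find_hand_config("FS2020"))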

The OpenXR Hand-to-Controller Configuration tool (found in the Start menu) can be used to create configuration files to customize the hand tracking for each game or application.

You may either start from the default configuration, or load one of the existing configurations from the %ProgramFiles%\OpenXR-Toolkit folder. Your configuration file can then be saved into the %LocalAppData%\OpenXR-Toolkit\configs folder.

The configuration tool also supports live configuration, where every change made in the tool will immediately be applied to the currently running application. Once satisfied with the configuration, do not forget to save the configuration file!

For the live configuration feature to work, you need to allow the OpenXR Toolkit through the local firewall when this screen is presented:

Allow network

Naming of the configuration file

The configuration file must use the OpenXR name for the application, which is not the same as the Windows application name or shortcut name. To determine the OpenXR name for the application, run the application with hand tracking enabled (using the in-headset menu), then inspect the log file (see Troubleshooting). In the log file, the name of the application will appear in one of the following messages:

Loading config for "FS2020"
Could not load config for "FS2020"

For this application (Microsoft Flight Simulator 2020), the name of the configuration file must be FS2020.cfg.

Prerequisites

The OpenXR Toolkit does not magically add hand tracking to any game or application. It leverages the application's existing support for VR controllers, translating hand tracking data into simulated VR controller poses (a position and rotation in 3D space) and button inputs.

In order to customize this translation behavior, it is important to understand how hand tracking input is represented. Each hand is decomposed into 26 individually articulated joints: 4 joints for the thumb, 5 joints for each of the other four fingers, plus the wrist and the palm of the hand. The name and exact position of these joints can be seen below (diagram courtesy of the Khronos Group):

Hand joints
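For reference, the joint names below follow the OpenXR XR_EXT_hand_tracking convention; this short Python snippet simply enumerates them and is only meant to make the decomposition above concrete:

# The 26 hand joints defined by OpenXR: palm and wrist, 4 thumb joints,
# and 5 joints for each of the other four fingers.
HAND_JOINTS = (
    ["palm", "wrist"]
    + ["thumb_" + j for j in ("metacarpal", "proximal", "distal", "tip")]
    + [
        finger + "_" + j
        for finger in ("index", "middle", "ring", "little")
        for j in ("metacarpal", "proximal", "intermediate", "distal", "tip")
    ]
)
assert len(HAND_JOINTS) == 26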

The process of translating hand tracking into VR controller input consists of:

  • Mapping one or more hand joint poses to the VR controller pose.
  • Recognizing gestures (based on the positions of the hand joints relative to each other) and converting them into button inputs.

Coming up with settings for the configuration tool greatly depends on your preferred experience and the specific mechanics of the game or application. It can be a long and tedious process.

Grip and aim poses (Offsets tab)

The application may use two reference poses for tracking the VR controller: the grip pose and/or the aim pose.

The OpenXR specification describes them as follows:

  • grip - A pose that allows applications to reliably render a virtual object held in the user’s hand, whether it is tracked directly or by a motion controller.
  • aim - A pose that allows applications to point in the world using the input source, according to the platform’s conventions for aiming with that kind of source.

They can be visualized below (diagram courtesy of the Khronos Group):

Grip and aim pose

Some applications use the grip pose, some use the aim pose, and some use both. You have to experiment.

When the OpenXR Toolkit translates the hand poses into controller poses, it must choose one of the 26 hand joint poses and optionally apply an additional translation and/or rotation for the resulting controller grip pose or aim pose to be usable.

Mapping tool Offsets tab The Offsets tab of the configuration tool

The configuration tool lets you choose which hand joint to use for the grip pose and the aim pose, and allows you to apply an additional translation and/or rotation to the final pose.
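The sketch below illustrates what such an offset does: it composes a translation and a rotation with the tracked joint pose to produce the simulated controller pose. This is a minimal numpy example with made-up numbers; whether the toolkit applies the translation in the joint's local frame (as assumed here) or in another frame is not specified, so treat it as a conceptual illustration only:

import numpy as np

def apply_offset(joint_pos, joint_rot, offset_translation, offset_rotation):
    # joint_rot and offset_rotation are 3x3 rotation matrices;
    # joint_pos and offset_translation are 3-element vectors, in meters.
    # The translation is assumed to be expressed in the joint's local frame.
    controller_pos = joint_pos + joint_rot @ offset_translation
    controller_rot = joint_rot @ offset_rotation
    return controller_pos, controller_rot

# Example: place the aim pose 2 cm "forward" of the chosen joint
# (OpenXR uses -Z as forward), with no additional rotation.
pos, rot = apply_offset(
    np.array([0.0, 1.2, -0.3]),   # tracked joint position (made-up)
    np.eye(3),                    # tracked joint orientation (identity)
    np.array([0.0, 0.0, -0.02]),  # 2 cm along -Z
    np.eye(3),                    # no rotation offset
)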

Defaults

When no configuration file is provided, the palm of the hand is used when representing the grip, and the “intermediate” joint of the index finger (see hand joint poses above) is used to represent the aim. No additional translation/rotation is applied.

Controller bindings (Bindings tab)

Gestures definitions

The following gestures are recognized and can be used to trigger the simulated input buttons.

Important: Some gestures may interfere with each other, and not all gestures should be bound to simulated inputs. For example, the wrist tap and palm tap gestures are fairly similar, and mapping a distinct input to each of them can lead to false triggering, or to both inputs being simulated at the same time. Care must be taken not to “overload” the gesture bindings.

Pinch (one-handed)

Bringing the tips of the thumb and index finger together.

Thumb press (one-handed)

Using the thumb to “press” onto the index finger. The target is the “intermediate” joint of the index finger (see hand joint poses above).

Index bend (one-handed)

Bending the index finger in a trigger-like motion.

Finger gun (one-handed)

Using the thumb to “press” into the middle finger. The target is the “intermediate” joint of the middle finger (see hand joint poses above). This gesture allows the index finger to be extended (pointing forward).

Squeeze (one-handed)

Bending the middle finger, ring finger and little finger in a trigger-like motion.

Wrist tap (two-handed)

Using the tip of the index finger from the opposite hand to press on the wrist.

Palm tap (two-handed)

Using the tip of the index finger from the opposite hand to press on the center of the palm.

Index tip tap (two-handed)

Bringing the tips of both index fingers together.

Input actions definitions

When translating hand tracking into VR controller inputs, the OpenXR Toolkit must simulate one of the supported controller types (also called “interaction profiles”). It is important to select a controller type that is supported by the game or application. The following types are supported and can be chosen in the configuration tool:

  • /interaction_profiles/microsoft/motion_controller: The first generation Windows Mixed Reality motion controllers, with a trackpad instead of the A/B/X/Y buttons.
  • /interaction_profiles/hp/mixed_reality_controller: The HP Reverb motion controllers, with the A/B/X/Y buttons instead of the trackpad.
  • /interaction_profiles/oculus/touch_controller: The Oculus Touch controller.

Controller inputs are also referred to as “actions”, with two main categories of actions:

  • value actions: the action corresponds to a trigger with a variable position, where the finger pressure on the trigger is reported as a decimal value between 0 and 1.
  • click actions: the action corresponds to a button that is either pressed or not pressed.

Each controller input is identified with an action path, which contains both a unique identifier for the input (such as a description of the input or the label on a button) and the type of input (value or click). The supported action paths are the following:

  • /input/menu/click: The menu button.
  • /input/trigger/value: The controller trigger.
  • /input/squeeze/value: The “grab” trigger. This action is only supported on HP Reverb motion controllers or Oculus Touch controllers.
  • /input/squeeze/click: The “grab” button (for controllers that do not have a grab trigger).
  • /input/x/click (left only): The X button.
  • /input/y/click (left only): The Y button.
  • /input/a/click (right only): The A button.
  • /input/b/click (right only): The B button.
  • /input/thumbstick/click: The thumbstick being pushed down.
  • /input/system/click: Emulates pressing the Windows button.
  • /input/trackpad/click: The trackpad being pushed down. This action is only supported on first generation Windows Mixed Reality motion controllers.

Mapping tool Bindings tab The Bindings tab of the configuration tool

The configuration tool lets you choose the interaction profile (type of controller) to simulate and bind gestures to the corresponding simulated button inputs.

Defaults

When no configuration file is provided, an HP Reverb motion controller is simulated, with the following gestures bound:

  • The index bend gesture on both hands is bound to the controller trigger.
  • The squeeze gesture on both hands is bound to the controller grab trigger or button.
  • Tapping the left wrist is bound to the left controller’s menu button.
  • The index tip tap gesture is bound to the right controller’s B button.
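For illustration, the same defaults can be written as a simple lookup table from gesture to action path. This is only a conceptual Python sketch of the bindings listed above, not the format of the .cfg files:

# Conceptual view of the default bindings (HP Reverb motion controller simulated).
DEFAULT_BINDINGS = {
    "left": {
        "index_bend": "/input/trigger/value",
        "squeeze": "/input/squeeze/value",
        "wrist_tap": "/input/menu/click",       # tap the left wrist -> menu button
    },
    "right": {
        "index_bend": "/input/trigger/value",
        "squeeze": "/input/squeeze/value",
        "index_tip_tap": "/input/b/click",      # both index tips together -> B button
    },
}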

Sensitivity (Gestures tab)

Each gesture described above produces a decimal “output value” between 0 and 1. This value is based on the distance between the hand joints involved in the gesture. When at rest, the value is 0. The closer the hand joints are to each other, the closer the value is to 1.

The “far distance” for a gesture corresponds to the distance at which the output value maps to 0. Any distance larger than the far distance will produce an output of 0.

The “near distance” for a gesture corresponds to the distance at which the output value maps to 1. Any distance smaller than the near distance will produce an output of 1.

Example: With a near distance of 10mm and a far distance of 60mm for pinching:

  • When the tip of the thumb and index fingers are 60mm or more apart, the output value will read 0 (equivalent to the controller’s trigger being at rest for example).
  • When the tip of the thumb and index fingers are 10mm or less apart, the output value will read 1 (equivalent to the controller’s trigger being fully pressed).
  • When the tip of the thumb and index finger are 35mm apart, this output value reads 0.5 (because 35 is half-way between 10 and 60, equivalent to the trigger being pressed half of the way).

Adjusting the near and far distances effectively lets you modify the sensitivity of each gesture.

Another parameter, the “click threshold”, lets you modify the sensitivity of the click actions. As described earlier, a click action corresponds to a button that is either pressed or not pressed. The click threshold determines the output value above which a gesture results in the click action being pressed: if the output value for a gesture is below the click threshold, the button is reported as not pressed; if the output value is greater than or equal to the click threshold, the button is reported as pressed.
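In other words, the output value ramps linearly between the far and near distances (which matches the 35mm → 0.5 example above), and a click action fires once that value reaches the click threshold. A minimal Python sketch of this mapping, for illustration only:

def gesture_output(distance_mm, near_mm, far_mm):
    # Map the joint-to-joint distance to a 0..1 output value (linear ramp).
    if distance_mm >= far_mm:
        return 0.0
    if distance_mm <= near_mm:
        return 1.0
    return (far_mm - distance_mm) / (far_mm - near_mm)

def is_clicked(output_value, click_threshold):
    # A click action is reported as pressed once the output reaches the threshold.
    return output_value >= click_threshold

# Pinch example from above: near distance = 10mm, far distance = 60mm.
print(gesture_output(70, 10, 60))                     # 0.0 (trigger at rest)
print(gesture_output(35, 10, 60))                     # 0.5 (trigger half-way pressed)
print(gesture_output(5, 10, 60))                      # 1.0 (trigger fully pressed)
print(is_clicked(gesture_output(35, 10, 60), 0.75))   # False: below the click threshold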

Mapping tool Gestures tab The Gestures tab of the configuration tool

The configuration tool lets you customize the near and far distance for each gesture, along with the click threshold.

The Detailed overlay can be used to view the current output value for each gesture, and should be used to tune the near and far distances and the click threshold.

Trigger-on-haptics (Haptics tab)

It is possible to program a chosen gesture to simulate an action (like pressing the trigger) upon haptics commands sent by the application. The haptics can be filtered by the frequency of the requested vibration (for example, some applications use different frequencies for different events).

This can be useful to simplify interactions. For example, the finger gun gesture can be used to simulate trigger input upon haptics: for applications that send haptics commands when the (simulated) VR controller approaches a (virtual) button, this can be used to simulate pressing the button without the need to initiate a gesture when near the button.

The Detailed overlay can be used to display when haptics commands are sent, and can be used to determine which frequency to filter on.