VR in Unity: Managing Controller Input and Hand Presence (Part 1: Controller Set Up)

Chase Mitchell
6 min read · May 12, 2021

In this guide I’ll cover tips and tricks for working with hand controllers in VR.

Even though controllers differ between headsets, the vast majority of input buttons and trackers work the same way, and you can easily map functionality across devices using Unity's input management system.

Input Mapping

A great place to start is to review the Unity XR Input Mappings documentation.

That documentation breaks down how each of the input settings within Unity maps to the various controllers.

Another useful tool when configuring input is the XR Interaction Debugger. Once you have your XR plugins installed from the Package Manager you can access this menu via Window -> Analysis -> XR Interaction Debugger. If you select the Input Devices tab and play your game with a headset connected, you will see how each input is tracking in real time as you use the VR rig.

Accessing Controller Input Data with Code

Let’s get into some scripting. Create an empty game object to hold our HandPresence behavior and add a new C# script to it called HandPresence. Since we are working with XR functionality inside the script, we need to add “using UnityEngine.XR” at the top.

Next we can create a list to keep track of our input devices:
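A minimal version of the script might look like this (the exact variable names are my own):

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class HandPresence : MonoBehaviour
{
    void Start()
    {
        // Ask the XR subsystem for every input device it currently tracks.
        List<InputDevice> devices = new List<InputDevice>();
        InputDevices.GetDevices(devices);

        // Print each device's name and its characteristics flags.
        foreach (InputDevice device in devices)
        {
            Debug.Log("Device found: " + device.name + " | characteristics: " + device.characteristics);
        }
    }
}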

When running the game this will now print out the devices that are available and tracked in our scene along with some characteristics that we can use to reference them.

Printout for an Oculus Quest 2

For example, if we wanted to access only the right controller, we could update the code to the following, using GetDevicesWithCharacteristics instead of GetDevices:
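A sketch of that change (the characteristics variable name is my own):

void Start()
{
    // Only ask for devices flagged as both Right and Controller.
    InputDeviceCharacteristics rightControllerCharacteristics =
        InputDeviceCharacteristics.Right | InputDeviceCharacteristics.Controller;

    List<InputDevice> devices = new List<InputDevice>();
    InputDevices.GetDevicesWithCharacteristics(rightControllerCharacteristics, devices);

    foreach (InputDevice device in devices)
    {
        Debug.Log("Device found: " + device.name + " | characteristics: " + device.characteristics);
    }
}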

The code above stipulates that we only want access to devices that are marked as “Right” and “Controller”, so on Play only our right controller will be printed to the console window.

We can also get a handle to a target device by referencing its index position in our new devices list and assigning it in void Start():
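For example (a sketch, with targetDevice as my own field name):

private InputDevice targetDevice;

void Start()
{
    InputDeviceCharacteristics rightControllerCharacteristics =
        InputDeviceCharacteristics.Right | InputDeviceCharacteristics.Controller;

    List<InputDevice> devices = new List<InputDevice>();
    InputDevices.GetDevicesWithCharacteristics(rightControllerCharacteristics, devices);

    // Grab the first matching device (index 0) as our target controller.
    if (devices.Count > 0)
    {
        targetDevice = devices[0];
    }
}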

With this setup we have now assigned a controller on which we want to listen for input. We can then check for input within the Update() method. We do this with “targetDevice.TryGetFeatureValue()”, which takes two parameters: the input feature we are listening for and an out variable that receives the value. These value types can be bool for a button (pressed or not), float for a trigger (ranging from 0 to 1 based on how far the trigger is depressed), or Vector2 for movement along two axes (thumbstick direction).

There is a library of CommonUsages that allows for easy mapping to the basic buttons, triggers, and thumbpads on each VR controller.

We create variables to store our return values and check if those values return a registered input, then debug the input value to the console.
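A sketch of that Update() method, checking the primary button, trigger, and thumbstick (the threshold values are my own):

void Update()
{
    // Button: bool value, true while pressed.
    targetDevice.TryGetFeatureValue(CommonUsages.primaryButton, out bool primaryButtonValue);
    if (primaryButtonValue)
        Debug.Log("Pressing primary button");

    // Trigger: float value from 0 to 1.
    targetDevice.TryGetFeatureValue(CommonUsages.trigger, out float triggerValue);
    if (triggerValue > 0.1f)
        Debug.Log("Trigger pressed " + triggerValue);

    // Thumbstick: Vector2 direction.
    targetDevice.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 primary2DAxisValue);
    if (primary2DAxisValue != Vector2.zero)
        Debug.Log("Primary joystick " + primary2DAxisValue);
}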

Basic input is now tracking for our right hand controller

There are some edge cases with different controllers where the common input you are checking for might not exist, so it is best practice to combine the TryGetFeatureValue check into your if statements as follows:
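A sketch of the combined version:

void Update()
{
    // Only react if the feature exists on this device AND it returns a meaningful value.
    if (targetDevice.TryGetFeatureValue(CommonUsages.primaryButton, out bool primaryButtonValue) && primaryButtonValue)
        Debug.Log("Pressing primary button");

    if (targetDevice.TryGetFeatureValue(CommonUsages.trigger, out float triggerValue) && triggerValue > 0.1f)
        Debug.Log("Trigger pressed " + triggerValue);

    if (targetDevice.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 primary2DAxisValue) && primary2DAxisValue != Vector2.zero)
        Debug.Log("Primary joystick " + primary2DAxisValue);
}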

Implementing Hand Models

Here are two Unity packages with controller and hand models.

VR Controller Unity Package

Oculus Hands Unity Package

Download the packages and then drag them into the project folder to install them. We can now set up control over which controller model to display based on the device name.

In the Hand Presence script create a public list to store the controller prefabs:
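For example:

// Assigned in the inspector with one prefab per supported controller model.
public List<GameObject> controllerPrefabs;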

Drag each of the controller prefabs into this list in the inspector.

To spawn the correct controller we will use the device name:
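One way to do that (a sketch, assuming the targetDevice and controllerPrefabs fields from earlier):

// Look for a prefab whose name matches the name the device reports.
GameObject prefab = controllerPrefabs.Find(controller => controller.name == targetDevice.name);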

If you set your controller prefab GameObject names to match the device names from the input list, the code above will pull in the correct controller prefab for the user’s device. We can store the spawned controller in a private GameObject variable for later reference and instantiate our prefab model if a match is found:
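Continuing the sketch, with a spawnedController field added:

private GameObject spawnedController;

// ... after looking up the matching prefab as above:
if (prefab != null)
{
    spawnedController = Instantiate(prefab, transform);
}
else
{
    Debug.LogError("Did not find corresponding controller model");
    // Fall back to the first prefab in the list as a default.
    spawnedController = Instantiate(controllerPrefabs[0], transform);
}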

If no match is found, we log an error and instantiate the default controller.

Now we can turn the HandPresence game object, with this script attached, into a prefab and set the Model Prefab field for each of our controllers to the HandPresence prefab. Delete the original from the hierarchy and you should spawn in the correct controllers.

In my current version of Unity the Quest 2 controllers are reported as Oculus Touch controllers, so in this particular case I will rename my Quest controller model prefabs to Touch, and rename the Touch controller prefabs to Touch2 (so they will not be found) for the purposes of my short bowling VR game.

Quest 2 controllers online

A discerning eye might notice that both of these controllers are right-hand controllers; we need to update our HandPresence script to track both hands instead of only the right controller.

We can delete the line of code hard-coding the right controller characteristics and create a new public InputDeviceCharacteristics variable, which we can assign in the inspector instead of in the script:
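Something along these lines (a sketch; the field name is my own):

// Set to Right + Controller or Left + Controller in the inspector.
public InputDeviceCharacteristics controllerCharacteristics;

void Start()
{
    List<InputDevice> devices = new List<InputDevice>();
    InputDevices.GetDevicesWithCharacteristics(controllerCharacteristics, devices);

    if (devices.Count > 0)
    {
        targetDevice = devices[0];
        // ... then spawn the matching controller model as before.
    }
}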

Inside the HandPresence prefab we can now assign the characteristics of “Right” and “Controller”. Then duplicate this prefab and do the same for “Left” and “Controller”. Rename the prefabs to match their associated characteristics and reassign the appropriate prefabs to the XR Controller component of each controller object.

Our controllers are now correctly designated to the appropriate left and right controller models. In the Part 2 guide for this series I’ll cover switching to animated hand models. See ya there!
