VR in Unity: Navigate UI Elements with the Ray Interactor

Chase Mitchell
4 min read · May 17, 2021

Learn how to create a UI in Virtual Reality and engage with it using your hand controllers.

This guide picks up from my VR Playground setup from the past few guides and uses the same rig detailed in my first VR post on controller setup. To get started, create a new Device-based Ray Interactor in the Hierarchy, drag it into the Camera Offset game object, and reset its position values. I will be creating this for both hands, so duplicate the object, update the Controller Node to the appropriate hand, and rename both Interactors to match.

Next, under both the Interaction Layer Mask and Raycast Mask properties of the XR Ray Interactor component, remove all layers except UI. This ensures our interactors can only engage with our UI elements and will not otherwise interfere with the scene.
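If you prefer to configure this from code rather than the Inspector, a minimal sketch looks like the following. It assumes XR Interaction Toolkit versions around 1.x, where `XRRayInteractor` exposes `interactionLayerMask` and `raycastMask` as `LayerMask` properties (later versions rename the interaction layer API):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Attach to each Ray Interactor to restrict both masks to the UI layer.
public class RestrictRayToUI : MonoBehaviour
{
    void Awake()
    {
        var ray = GetComponent<XRRayInteractor>();
        // LayerMask.GetMask returns a mask with only the named layers set.
        ray.interactionLayerMask = LayerMask.GetMask("UI");
        ray.raycastMask = LayerMask.GetMask("UI");
    }
}
```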

To make these lines look more like pointers, we can also increase the Line Width in the Line Visual component from 0.02 to 0.05.

We also want to hide the rays unless they are hitting an object we can interact with. In the Line Visual component, select the Invalid Color Gradient and reduce the alpha values of both the start and end keys to zero.
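Both Line Visual tweaks can also be done in a script. This is a sketch assuming the `XRInteractorLineVisual` component from XR Interaction Toolkit ~1.x, which exposes `lineWidth` and `invalidColorGradient`:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Attach next to the XR Interactor Line Visual on each ray.
public class ConfigureRayVisual : MonoBehaviour
{
    void Awake()
    {
        var visual = GetComponent<XRInteractorLineVisual>();
        visual.lineWidth = 0.05f;

        // Build an invalid-target gradient that is fully transparent, so the
        // ray only appears when it points at something interactable.
        var invisible = new Gradient();
        invisible.SetKeys(
            new[] { new GradientColorKey(Color.white, 0f), new GradientColorKey(Color.white, 1f) },
            new[] { new GradientAlphaKey(0f, 0f), new GradientAlphaKey(0f, 1f) });
        visual.invalidColorGradient = invisible;
    }
}
```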

We do not yet have any UI elements in our scene, so I will temporarily re-enable the Raycast Mask for our “Grab” objects to test this out:

The behavior is working as expected, so we can now remove the raycast mask interaction for Grab objects and build our UI.

Building The User Interface

Right-click in the Hierarchy and create a new UI -> Canvas. In the Scene view, select the 2D option to focus on it. Let’s create a few UI elements to set up a basic UI to interact with.

Now, the default canvas render mode is Screen Space - Overlay. While you can see your UI elements in the Game view, they will not render to a VR headset. You could change this to Screen Space - Camera and drag in our VR camera so the UI renders inside the headset view, but that layout looks cluttered, the elements can be hard to read, and an always-on UI in VR gets annoying quickly. Instead, when creating UI for VR it is best to use World Space. Adjust this setting and assign the VR Camera as the Event Camera.
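The same change can be made from a script. A minimal sketch, assuming you assign the rig's camera in the Inspector:

```csharp
using UnityEngine;

// Attach to the Canvas to switch it to World Space at startup.
public class MakeCanvasWorldSpace : MonoBehaviour
{
    [SerializeField] Camera vrCamera; // drag in the VR rig's camera

    void Awake()
    {
        var canvas = GetComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;
        // The Inspector's "Event Camera" field maps to Canvas.worldCamera.
        canvas.worldCamera = vrCamera;
    }
}
```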

To make our UI interactable, add the Tracked Device Graphic Raycaster component to the Canvas, and add the XRUI Input Module to the EventSystem. For aesthetics, we can also add a transparent background panel to help the UI stand out. If you do this, be sure to nest the panel as the first child under the Canvas so it renders behind the rest of your UI elements and does not interfere with selection.
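Both components can also be added at runtime. This is a sketch assuming the `UnityEngine.XR.Interaction.Toolkit.UI` namespace from XR Interaction Toolkit, with the Canvas and EventSystem references wired up in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Ensures the canvas and event system are wired up for XR ray input.
public class WireUpXRUI : MonoBehaviour
{
    [SerializeField] Canvas canvas;
    [SerializeField] EventSystem eventSystem;

    void Awake()
    {
        // The canvas needs a TrackedDeviceGraphicRaycaster so XR rays can hit it...
        if (canvas.GetComponent<TrackedDeviceGraphicRaycaster>() == null)
            canvas.gameObject.AddComponent<TrackedDeviceGraphicRaycaster>();

        // ...and the EventSystem needs the XR UI Input Module to route those hits
        // into normal UI events (hover, click, drag).
        if (eventSystem.GetComponent<XRUIInputModule>() == null)
            eventSystem.gameObject.AddComponent<XRUIInputModule>();
    }
}
```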

Testing shows that we can now interact with our UI, but our pointer is locking onto the elements on the canvas:

The fix for the locking behavior is simple: navigate to the VR Camera within our VR Rig and update its tag to MainCamera.
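The tag matters because Unity's `Camera.main` lookup, which the UI input code falls back on, only finds cameras tagged `MainCamera`. If you build the rig from a script, the same fix is one line (assuming a `vrCamera` reference to the rig's camera):

```csharp
// Camera.main only returns cameras tagged "MainCamera";
// without it, UI event math can latch onto the wrong camera.
vrCamera.tag = "MainCamera";
```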

As you can see, our pointers now disappear correctly when we exit the panel.

We still have one behavior left to fix: separating our teleport functionality from our UI interaction. Both currently use the trigger, so the player could accidentally teleport while engaging with the menu. To fix this, update the UI Press Usage button mapping on the XR Controller component for each ray, and our menu works as intended!
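In code, the remap is a single property on the device-based `XRController`. A sketch assuming XR Interaction Toolkit ~1.x, where `uiPressUsage` takes an `InputHelpers.Button` value; the grip button here is just an example choice:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Attach to each hand's ray controller to move UI clicks off the trigger.
public class RemapUIPress : MonoBehaviour
{
    void Awake()
    {
        var controller = GetComponent<XRController>();
        // Trigger stays reserved for teleport; UI presses now use the grip.
        controller.uiPressUsage = InputHelpers.Button.Grip;
    }
}
```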

Ta-Da! That’s it for this guide — I hope it helped simplify the process of creating navigable UI in your VR projects. Once you get the hang of the Ray Interactors it becomes a very straightforward process. See ya in the next guide!
