Core Principles for Hand Controller Design
Both hand controllers behave the same.
- Most actions can be performed with one hand alone.
- It is hard to remember which buttons are A/B and which are X/Y, so within the Morpheus XR metaverse we have configured both sets of buttons to work the same (see the sketch after this list).
- We don’t need to reconfigure controllers for left- and right-handed people.
- While holding something in one hand, users can still operate something else with the other.
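The Morpheus XR input stack is not specified here, but the symmetry principle is easy to illustrate. Below is a minimal TypeScript sketch assuming a WebXR-style input model; the `Action` names and the use of the `xr-standard` gamepad button indices are illustrative assumptions, not our actual code. Both input sources flow through one handler, with no left/right branching.

```typescript
// Minimal sketch of a handedness-agnostic input handler, assuming a
// WebXR-style input model. Both controllers share one code path.

type Action = "use" | "grab" | "mute" | "hud";

// xr-standard gamepad mapping: 0 = trigger, 1 = squeeze (grip),
// 4 = A/X, 5 = B/Y. A/X and B/Y trigger the same actions on both hands.
const BUTTON_ACTIONS: ReadonlyMap<number, Action> = new Map([
  [0, "use"],
  [1, "grab"],
  [4, "mute"],
  [5, "hud"],
]);

function pollControllers(
  session: XRSession,
  onAction: (action: Action, hand: XRHandedness) => void,
): void {
  for (const source of session.inputSources) {
    const gamepad = source.gamepad;
    if (!gamepad) continue;
    // Identical handling for "left" and "right": handedness only tells
    // the app *which* hand acted, never *how* to interpret the button.
    gamepad.buttons.forEach((button, index) => {
      const action = BUTTON_ACTIONS.get(index);
      if (action && button.pressed) onAction(action, source.handedness);
    });
  }
}
```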
Mapping of the Basic VR Controls
| Function | Control | Reasoning |
| --- | --- | --- |
| Jumping and rotating | Thumbstick | All spatial controls are tied to the thumbstick. The thumbstick is commonly used for movement in games, and also in some cars and motorized wheelchairs. |
| Picking up and holding | Grip button | When an object is selected, pressing and holding Grip always holds the object in the hand. Releasing Grip returns the object to its passive behaviour: falling down for Grabbable objects, floating for everything else. |
| Using and interacting | Trigger | We require each artifact to have exactly one function that is activated, toggled, triggered, or turned on/off by the Trigger (sketched after this table). E.g.: · the Microphone is turned on/off · the Timer is started/paused · the Spotlight shoots a ray of light · the Gong plays the ‘chime’ sound · the Content Display shows the HUD menu. |
| Mute on/off | A/X buttons | Mute operates directly on hardware (it turns the audio input on/off), so it has a hardware button of its own: the bottom one of the two round buttons. |
| HUD on/off | B/Y buttons | The HUD is a universal, omnipresent concept that introduces an additional ‘smart’ layer of objects before the user’s eyes. It is as close to hardware as it can be, so it also has a dedicated button. |
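To make the one-Trigger-function rule concrete, here is a hypothetical sketch; the `Artifact` interface and the class bodies are illustrative assumptions, not the actual Morpheus XR code. The point is that the selected artifact decides what Trigger means, so the controller code never special-cases artifacts.

```typescript
// Hypothetical shape of the "exactly one Trigger function" rule.

interface Artifact {
  primaryAction(): void;                 // activated / toggled by Trigger
  grab(hand: "left" | "right"): void;    // Grip pressed and held
  release(): void;                       // Grip released -> passive behaviour
}

class Microphone implements Artifact {
  private live = false;
  primaryAction(): void { this.live = !this.live; } // turn audio on/off
  grab(): void { /* attach to the gripping hand */ }
  release(): void { /* Grabbable: falls back down */ }
}

class Gong implements Artifact {
  primaryAction(): void { /* play the 'chime' sound */ }
  grab(): void { /* attach to the gripping hand */ }
  release(): void { /* non-Grabbable: floats in place */ }
}

// One dispatch path for either controller.
function onTriggerPressed(selected: Artifact | null): void {
  selected?.primaryAction();
}
```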
Advanced Controls in VR
| Function | Controls and examples | Reasoning |
| --- | --- | --- |
| Resizing objects | Grab an object with both hands and move them closer to or farther from each other (sketched after this table). | This behaviour mimics how ‘rubber’ objects are resized. |
| Gestures | E.g., raise both hands to cheer. | We keep the number of gestures to a minimum, make them distinctive, and mimic real-life gesturing. On mobile, we add on-screen buttons instead. |
| Context-dependent controls | Context-dependent controls are rendered in the HUD and allow for complex interactions such as: · flipping the slides of a presentation · arranging furniture · setting up timers · gathering people around. | The current context should be clearly visible, so users never need to remember which mode they are in. Being a visual layer, the HUD helps with that. Moving all smart context-dependent controls into the HUD also removes them from the ‘physical’ layer, which becomes more real and believable. |
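The two-hand ‘rubber’ resize reduces to scaling the object by the ratio of the current hand-to-hand distance to the distance at the moment the second grip started. A minimal sketch follows; the class name and the clamp limits are assumptions for illustration.

```typescript
// Two-hand resize: new scale = initial scale * (current distance / initial distance).

interface Vec3 { x: number; y: number; z: number; }

function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

class TwoHandResizer {
  private initialDistance = 1;
  private initialScale = 1;

  // Call once, when the second hand grips the already-held object.
  begin(left: Vec3, right: Vec3, currentScale: number): void {
    this.initialDistance = Math.max(distance(left, right), 1e-4); // avoid /0
    this.initialScale = currentScale;
  }

  // Call every frame while both grips are held; returns the uniform scale.
  update(left: Vec3, right: Vec3): number {
    const ratio = distance(left, right) / this.initialDistance;
    // Clamp so the object can neither collapse to a point nor grow unboundedly.
    return Math.min(Math.max(this.initialScale * ratio, 0.1), 10);
  }
}
```

Anchoring the scale to the grip moment (rather than integrating per-frame deltas) keeps the interaction stable: the object always returns to its original size when the hands return to their original separation.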