2020
Design & Code
Explorations of gesture-based UI. Made in Unity on an Oculus Quest.
I am not fond of large menu panels or over-relying on the 'menu' button in VR.
Gestures offer many opportunities to activate small, specific UI tools within arm's reach, exactly when you need them.
I made these prototypes with simple colliders on the fingertips, which is very quick for sketching ideas and can be refined for light playtesting.
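In Unity, that fingertip setup can be as small as a trigger collider plus a callback. This is a minimal sketch, not the exact prototype code: the IPokeable interface is a hypothetical name for whatever the touched UI element implements, and the kinematic Rigidbody is there so trigger events fire on an animated hand.

```csharp
using UnityEngine;

// Hypothetical interface for anything the fingertip can poke.
public interface IPokeable { void OnPoke(); }

// Attached to a fingertip bone: a tiny trigger collider reports touches.
// A kinematic Rigidbody on the same object lets trigger events fire
// even though the hand is animated rather than physics-driven.
[RequireComponent(typeof(SphereCollider), typeof(Rigidbody))]
public class FingerTipTrigger : MonoBehaviour
{
    void Reset()
    {
        var sphere = GetComponent<SphereCollider>();
        sphere.isTrigger = true;
        sphere.radius = 0.008f; // ~8 mm, roughly fingertip-sized

        GetComponent<Rigidbody>().isKinematic = true;
    }

    void OnTriggerEnter(Collider other)
    {
        // Any UI element carrying an IPokeable component reacts to the touch.
        foreach (var pokeable in other.GetComponents<IPokeable>())
            pokeable.OnPoke();
    }
}
```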
Gestures should be tied to the orientation of the hand.
The same action feels very different when your palm is facing you, and the palm's orientation also affects where the UI should be positioned.
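The palm check itself can be a single dot product. A minimal sketch, assuming a palm transform whose forward axis points out of the palm (axis conventions differ between hand-tracking rigs, so this may need flipping):

```csharp
using UnityEngine;

// Sketch: decide whether the palm faces the user, so the same gesture
// can map to a different tool and the UI can be placed accordingly.
public class PalmFacingCheck : MonoBehaviour
{
    public Transform palm;   // assumed: forward axis points out of the palm
    public Transform head;   // usually the centre-eye camera transform
    [Range(0f, 1f)] public float facingThreshold = 0.6f;

    public bool PalmFacesUser()
    {
        Vector3 toHead = (head.position - palm.position).normalized;
        // Dot product near 1 means the palm normal points at the head.
        return Vector3.Dot(palm.forward, toHead) > facingThreshold;
    }
}
```

When the palm faces the user, the UI can be anchored just above the palm; when it faces away, the same hand pose can trigger something else entirely.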
Very specific actions can be created, especially when using both hands.
We can also use comfortable hand positions for quick repeated actions, similar to a button press.
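An index pinch works well as that kind of "button": comfortable to repeat and easy to detect. A sketch using OVRHand from the Oculus Integration (IsTracked and GetFingerIsPinching are part of that API), with edge detection so holding the pinch doesn't re-fire:

```csharp
using UnityEngine;

// Sketch: treat an index-finger pinch like a button press.
// Edge detection ensures one press per pinch, not one per frame.
public class PinchButton : MonoBehaviour
{
    public OVRHand hand;
    bool wasPinching;

    void Update()
    {
        if (!hand.IsTracked) return;

        bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
        if (isPinching && !wasPinching)
            OnPress(); // fires once, on the pinch's rising edge
        wasPinching = isPinching;
    }

    void OnPress()
    {
        Debug.Log("Pinch press");
    }
}
```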
It's more reliable to use separate, distinct gestures as triggers for opening and closing UI.
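In practice that means each gesture guards only one state transition, so a flickering detection can't toggle the panel open and shut every frame. A sketch with hypothetical detector stubs standing in for the actual recognisers:

```csharp
using UnityEngine;

// Sketch: one gesture can only open the panel, a different gesture can
// only close it. The two detectors below are hypothetical stand-ins.
public class PanelGestureToggle : MonoBehaviour
{
    public GameObject panel;

    void Update()
    {
        if (!panel.activeSelf && OpenGestureDetected())
            panel.SetActive(true);
        else if (panel.activeSelf && CloseGestureDetected())
            panel.SetActive(false);
    }

    bool OpenGestureDetected()  { /* e.g. palm up, fingers spread */ return false; }
    bool CloseGestureDetected() { /* e.g. closing the hand into a fist */ return false; }
}
```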
We can choose our gestures based on how easily we want to access a feature.
A more specific gesture has higher friction to activate. In exchange, it can carry more detailed secondary features, like scaling.
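A two-handed pinch is a good example: it is deliberate enough to rarely fire by accident, and once active, the distance between the hands can drive a secondary parameter like scale. A sketch, again assuming OVRHand for pinch detection; the rest is plain Unity:

```csharp
using UnityEngine;

// Sketch: while both hands pinch, the distance between them scales a target.
// Assumes the target uses uniform scale (localScale.x == y == z).
public class TwoHandScale : MonoBehaviour
{
    public OVRHand leftHand, rightHand;
    public Transform target;          // object or UI panel being scaled
    float startDistance, startScale;
    bool scaling;

    void Update()
    {
        bool bothPinching =
            leftHand.GetFingerIsPinching(OVRHand.HandFinger.Index) &&
            rightHand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        float distance = Vector3.Distance(
            leftHand.transform.position, rightHand.transform.position);

        if (bothPinching && !scaling)
        {
            // Gesture starts: remember the reference distance and scale.
            scaling = true;
            startDistance = distance;
            startScale = target.localScale.x;
        }
        else if (!bothPinching)
        {
            scaling = false;
        }

        if (scaling && startDistance > 0.01f)
            target.localScale = Vector3.one * (startScale * distance / startDistance);
    }
}
```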