Hand Interactions for Microsoft Mesh
Along with a group of other designers, I was tasked with translating Microsoft Mesh's existing controller-based interactions to hand-tracking input in VR.
I created several prototypes in Unity for us to experiment with using hand gestures for teleportation, rotation, object interaction, and UI selection. We wanted to design gestures that were easy to learn, socially acceptable, and physically comfortable to perform repeatedly. Ultimately, we decided to leverage the standard "pinch" gesture as much as possible, due to its simplicity.
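The core of a pinch gesture is usually just the distance between the thumb tip and index fingertip, with a looser release threshold (hysteresis) so the gesture doesn't flicker at the boundary. As a rough illustration of that idea (the thresholds and class names here are hypothetical, not the actual Mesh implementation):

```python
import math

# Hypothetical thresholds, for illustration only -- not Mesh's shipped values.
PINCH_START = 0.015  # meters: fingertips closer than this begin a pinch
PINCH_END = 0.030    # meters: looser release threshold prevents flicker

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class PinchDetector:
    def __init__(self):
        self.pinching = False

    def update(self, thumb_tip, index_tip):
        # Called once per tracking frame with 3D fingertip positions.
        d = distance(thumb_tip, index_tip)
        if not self.pinching and d < PINCH_START:
            self.pinching = True
        elif self.pinching and d > PINCH_END:
            self.pinching = False
        return self.pinching
```

The gap between the start and release thresholds is what keeps a held pinch stable while the tracked fingertip positions jitter slightly.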
I also prototyped a "hand menu" which gives users easy access to quick actions when they glance at their open palm.
I then partnered with engineers and visual designers to integrate polished versions of these interactions into the product.
UI Manipulations
I designed improved interaction patterns for the core UI systems in Mesh for VR, to align with modern industry standards. Previously, UI panels in Mesh stayed head-locked near the center of the user's field of view, with a very tight leash. When open, the UI would almost always completely occlude the 3D space around users, preventing them from multitasking between UI menus and the immersive content.
In my design, the UI is loosely body-leashed, allowing users to move freely without the UI constantly following them. I also added a designated grab handle beneath the UI panel, which serves as the manipulation point. To improve readability, the panel shrinks as a user pulls it closer and expands as they push it away. The panel automatically rotates to face the user when moved along the X or Y axis.
The UI panel follows the user again if they move far enough away from it, and it resets to a default position and scale relative to the user each time it is reopened.
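The behavior described above reduces to a few small pieces of per-frame math: scale the panel proportionally with its distance from the user so its angular size (and readability) stays roughly constant, let it stay put until the user crosses a leash radius, and yaw it to face the user. A minimal sketch, with hypothetical constants and function names standing in for the Unity components:

```python
import math

# Illustrative constants -- not the tuned values used in the product.
DEFAULT_DISTANCE = 0.6  # meters: distance at which the panel opens
LEASH_RADIUS = 2.0      # meters: beyond this, the panel re-follows the user

def panel_scale(dist, base_scale=1.0):
    """Scale linearly with distance so the panel subtends roughly the
    same visual angle whether it is pulled close or pushed away."""
    return base_scale * (dist / DEFAULT_DISTANCE)

def should_follow(user_pos, panel_pos):
    """The panel stays in place until the user leaves the leash radius."""
    return math.dist(user_pos, panel_pos) > LEASH_RADIUS

def face_user_yaw(user_pos, panel_pos):
    """Yaw (radians, about the vertical axis) that turns the panel
    toward the user after it is moved in the horizontal plane."""
    dx = user_pos[0] - panel_pos[0]
    dz = user_pos[2] - panel_pos[2]
    return math.atan2(dx, dz)
```

Resetting on reopen then just means re-spawning the panel at `DEFAULT_DISTANCE` along the user's forward vector with `base_scale`.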
Using an assortment of MRTK, XRI, and custom Mesh components, I created a prototype of this behavior in Unity, which is now awaiting final implementation by engineers.
Through prototyping in Figma, I explored UX designs for multitasking across multiple UI panels at once in VR, trying several options for the interaction model and the required visual feedback.
VR Magnifier
To help users with low vision experience Microsoft Mesh in virtual reality, I partnered with engineers to design and prototype a magnifier lens that is locked to the center of the user's field of view.
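Head-locking the lens means re-anchoring it every frame a fixed distance along the user's gaze, so it always sits at the center of view regardless of where they look. A minimal sketch of that positioning step (the distance constant and function name are illustrative assumptions, not the actual implementation):

```python
# Hypothetical head-locking sketch: each frame, place the lens a fixed
# distance along the head's forward vector so it stays centered in view.
LENS_DISTANCE = 0.5  # meters in front of the eyes (illustrative)

def lens_position(head_pos, head_forward):
    # head_forward is assumed to be a unit vector from head tracking.
    return tuple(p + LENS_DISTANCE * f for p, f in zip(head_pos, head_forward))
```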