I am currently the Head of Design & Product at Avatour, a videoconferencing platform that allows remote users to watch live or pre-recorded 360° videos together while communicating with the 360° camera operator in real time. Each viewer has independent control of their view and can join via desktop, mobile, or VR headset.
Designing features that provide value across several use cases and device types is a wonderful challenge. Since joining, my team has developed the branding and redesigned the UI for every device. We’ve created new paradigms for how people interact with the Avatour platform and their content. We’ve patented methods for bringing individual remote users together into a shared virtual reality.
Common locomotion methods for VR include teleportation and joystick movement, but both break the illusion of immersion. What if we could create a more natural paradigm, where the intent of your movement is translated to the screen?
I’m currently working on a new paradigm for locomotion in VR in which the user holds a button and simply leans in the direction they would like to move, as if beginning to take a step in that direction. The button can then be released, returning the user to regular 6 DoF movement for further exploration of their new terrain.
With this method, movement can also be augmented with multipliers, allowing the user to run at high speed or leap over buildings without inducing motion sickness.
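The core of the idea can be sketched as a small function: capture the head position when the button is pressed, then map the horizontal lean away from that anchor to a movement velocity. This is a minimal illustration, not Avatour's implementation; the dead zone, gain, and multiplier values are made-up assumptions.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def lean_velocity(anchor: Vec3, head: Vec3,
                  dead_zone: float = 0.03,   # metres of sway to ignore (assumed)
                  gain: float = 8.0,         # lean-to-speed gain (assumed)
                  multiplier: float = 1.0) -> Vec3:
    """Map horizontal lean (head offset from the anchor captured at
    button press) to a movement velocity. The vertical component is
    ignored so that crouching or standing taller doesn't move the user."""
    dx = head.x - anchor.x
    dz = head.z - anchor.z
    mag = (dx * dx + dz * dz) ** 0.5
    if mag < dead_zone:
        # Small natural sways shouldn't cause movement.
        return Vec3(0.0, 0.0, 0.0)
    # Speed grows with lean distance past the dead zone; a large
    # multiplier gives the "run and leap over buildings" effect.
    scale = gain * multiplier * (mag - dead_zone) / mag
    return Vec3(dx * scale, 0.0, dz * scale)
```

Releasing the button simply stops calling this function, which is what returns the user to ordinary 6 DoF movement.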
Stereoscopic video in VR is incredibly immersive, but when you tilt your head, your eyes are no longer aligned with the camera’s lenses.
I built this stereoscopic pan & tilt camera using a couple of cheap webcams, an Arduino, some spare servos, and the Oculus Rift DK2. It showed that a single user could have live control of the camera and see a proper stereoscopic image, regardless of head orientation.
It was limited in terms of use cases, but a very fun proof of concept.
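At its heart, the rig is just a mapping from headset orientation to servo angles. A minimal sketch of that mapping (in Python rather than Arduino C, with assumed angle ranges — hobby servos typically sweep 0–180°, centered straight ahead at 90°) might look like:

```python
def head_to_servo(head_angle_deg: float,
                  servo_min: int = 0,
                  servo_max: int = 180) -> int:
    """Map a head yaw or pitch angle (degrees, 0 = straight ahead)
    to a hobby-servo angle, clamped to the servo's mechanical limits.
    The 0-180 range and 90-degree center are assumptions typical of
    cheap servos, not measured from the original rig."""
    servo = round(90 + head_angle_deg)      # center the servo at 90°
    return max(servo_min, min(servo_max, servo))
```

One such mapping per axis (pan for yaw, tilt for pitch) is enough to keep the camera’s lenses aligned with the viewer’s eyes as they look around.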