Applying Touchscreen-Based Navigation Techniques to Mobile Virtual Reality with Open Clip-On Lenses
Recently, a new breed of mobile virtual reality (dubbed "EasyVR" in this work) has appeared, in which non-isolating magnifying lenses are conveniently clipped onto the smartphone, still offering a reasonable level of immersion compared to using an isolating headset. Furthermore, such a form factor allows the fingers to touch the screen and select objects quite accurately, even though the finger(s) appear out of focus through the lenses. Many navigation techniques exist both for casual smartphone 3D applications using the touchscreen and for immersive VR environments using various controllers/sensors. However, no research has focused on the proper navigation interaction technique for a platform like EasyVR, which necessitates using the touchscreen while holding the display device to the head and looking through the magnifying lenses. To design and propose the most fitting navigation method(s) for EasyVR, we mixed and matched the conventional touchscreen-based and headset-oriented navigation methods to derive six viable navigation techniques, specifically for selecting the travel direction and invoking the movement itself, including the use of head rotation, on-screen keypads/buttons, one-touch teleport, drag-to-target, and finger gestures. These techniques were experimentally compared for their basic usability and the level of immersion while navigating 3D space with six degrees of freedom. The results provide a valuable guideline for designing or choosing the proper navigation method under the different navigational needs of a given VR application.