Authors: Qi Sun (Stony Brook University, NVIDIA, Adobe Research), Anjul Patney (NVIDIA), Li-Yi Wei (Adobe Research), Omer Shapira (NVIDIA), Jingwan Lu (Adobe Research), Paul Asente (Adobe Research), Suwen Zhu (Stony Brook University), Morgan McGuire, David Luebke, Arie Kaufman (Stony Brook University)
Redirected walking techniques can enhance the immersion and visual-vestibular comfort of virtual reality (VR) navigation, but are often limited by the size, shape, and content of the physical environment.
We propose a redirected walking technique that applies to small physical environments with static or dynamic obstacles. Using a head- and eye-tracking VR headset, our method detects saccadic suppression and redirects users during the resulting temporary blindness. Our dynamic path planning runs in real time on a GPU, and thus can avoid static and dynamic obstacles, including walls, furniture, and other VR users sharing the same physical space. To further enhance saccadic redirection, we propose subtle gaze direction methods tailored for VR perception.
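The core idea of saccadic redirection can be sketched as follows: detect a saccade from the eye tracker's gaze samples (saccades reach very high angular velocities, during which visual sensitivity is suppressed), and apply an amplified camera rotation gain only while the saccade is in flight. The sketch below is an illustrative assumption, not the paper's implementation; the velocity threshold, gain values, and function names are all hypothetical.

```python
# Hypothetical sketch of saccade-triggered rotation gain.
# The threshold and gain constants below are illustrative assumptions,
# not values from the paper.
import math

SACCADE_VELOCITY_DEG_PER_S = 180.0  # assumed saccade-detection threshold
BASE_GAIN = 1.0                      # no extra rotation during fixation
SACCADE_GAIN = 1.12                  # amplified gain while vision is suppressed

def gaze_angular_velocity(prev_dir, cur_dir, dt):
    """Angular speed (deg/s) between two unit gaze-direction vectors."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(prev_dir, cur_dir))))
    return math.degrees(math.acos(dot)) / dt

def rotation_gain(prev_dir, cur_dir, dt):
    """Camera rotation gain for this frame: amplified during a detected saccade."""
    if gaze_angular_velocity(prev_dir, cur_dir, dt) >= SACCADE_VELOCITY_DEG_PER_S:
        return SACCADE_GAIN
    return BASE_GAIN
```

In a real system the gain would multiply the virtual camera's yaw update each frame, steering the user along a planner-chosen physical path while the scene rotation remains imperceptible.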
We demonstrate that saccades can significantly increase rotation gains during redirection without introducing visual distortions or simulator sickness. This allows our method to support large open virtual spaces within small physical environments for room-scale VR. We evaluate our system via numerical simulations and real user studies.