When Vision Lies - Navigating Virtual Environments with Unreliable Visual Information

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Humans typically use vision as the dominant sense for navigation. But what happens when vision becomes actively unreliable? Will it impair user performance, be suppressed, or be used advantageously? While such scenarios are rare in the real world, this question has important implications for multisensory integration in extended reality applications - e.g. virtual walls that a user sees but can walk through. We created virtual mazes that could be solved via audio or visual cues. We then manipulated the reliability of these sensory channels by including invisible walls, which were not perceived but still blocked passage, and ghost walls, which were perceived but did not block participants. Participants navigated the exact same layouts under all conditions and could solve each level by ignoring the unreliable sensory modality and relying solely on the other. Participants easily completed these mazes using vision only, and with some difficulty using audition only. Partially unreliable vision degraded performance, though performance remained above the audio-only level, demonstrating that participants still utilized the unreliable visual cues. Mazes whose entire visual input was false degraded performance to the audio-only level, yet participants subjectively reported them as easier than audio-only and did not close their eyes, indicating that they still relied on vision. A control condition in which visual information was both false and constantly moving, preventing its use for landmarks or optic flow, did cause participants to close their eyes and disregard the false vision, but it was accompanied by confounding nausea. In parallel, auditory incongruencies were easily suppressed across all unreliable-audio conditions. This demonstrates human attachment to visual information, even when it is mostly or completely false, and the ability to glean practical advantages from it unless it is completely stripped of usability. More broadly, this work lays a foundation for testing multisensory integration of sustained false sensory channels and has implications for mixed reality design.
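To make the manipulation concrete, the sketch below (not from the paper; all names such as Wall, Maze, invisible_wall, and ghost_wall are hypothetical) shows one way to decouple a wall's rendered appearance from its collision behavior, so the same layout can present reliable, partially unreliable, or fully false visual information. Unreliable audio cues could be handled analogously with an audible flag.

```python
# Hypothetical sketch: decouple a wall's rendered appearance from its
# collision behavior, so vision and physics can disagree.

from dataclasses import dataclass

@dataclass(frozen=True)
class Wall:
    x: int
    y: int
    visible: bool = True  # drawn to the user
    solid: bool = True    # blocks passage

# An "invisible wall" is not perceived but still blocks passage;
# a "ghost wall" is perceived but does not block.
def invisible_wall(x, y):
    return Wall(x, y, visible=False, solid=True)

def ghost_wall(x, y):
    return Wall(x, y, visible=True, solid=False)

class Maze:
    def __init__(self, walls):
        # Collision queries consult only *solid* walls; rendering consults
        # only *visible* walls, so the two channels can diverge.
        self._solid = {(w.x, w.y) for w in walls if w.solid}
        self._visible = {(w.x, w.y) for w in walls if w.visible}

    def can_move_to(self, x, y):
        return (x, y) not in self._solid

    def should_render(self, x, y):
        return (x, y) in self._visible

if __name__ == "__main__":
    maze = Maze([Wall(0, 1), invisible_wall(1, 1), ghost_wall(2, 1)])
    assert not maze.can_move_to(1, 1)    # invisible wall still blocks...
    assert not maze.should_render(1, 1)  # ...but is never drawn
    assert maze.can_move_to(2, 1)        # ghost wall does not block...
    assert maze.should_render(2, 1)      # ...but is drawn
```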

Original language: American English
Title of host publication: Proceedings - 2024 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2024
Pages: 935-944
Number of pages: 10
ISBN (Electronic): 9798350374025
DOIs
State: Published - 1 Jan 2024
Event: 31st IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2024 - Orlando, United States
Duration: 16 Mar 2024 - 21 Mar 2024

Publication series

Name: Proceedings - 2024 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2024

Conference

Conference: 31st IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2024
Country/Territory: United States
City: Orlando
Period: 16/03/24 - 21/03/24

Keywords

  • Audition
  • Multimodal
  • Multisensory
  • Navigation
  • Sensory Substitution
  • Virtual Environment
  • Virtual Reality
  • Vision

All Science Journal Classification (ASJC) codes

  • Computer Graphics and Computer-Aided Design
  • Human-Computer Interaction
  • Media Technology
  • Modelling and Simulation
