TY - GEN
T1 - When Vision Lies - Navigating Virtual Environments with Unreliable Visual Information
AU - Or, Eden
AU - Maidenbaum, Shachar
N1 - Publisher Copyright: © 2024 IEEE.
PY - 2024/1/1
Y1 - 2024/1/1
AB - Humans typically utilize vision in a dominant role for navigation. However, what happens when vision becomes actively unreliable? Will it impair user performance, be suppressed, or be used advantageously? While such scenarios are rare in the real world, this question has important implications for multisensory integration in extended reality applications - e.g. virtual walls that a user sees but can walk through. We created virtual mazes which could be solved via audio or visual cues. We then manipulated the reliability of these sensory channels by including invisible walls, which were not perceived but still blocked passage, and ghost walls, which could be perceived but did not block participants. Participants navigated the exact same layouts under all conditions, and could solve these levels by ignoring the unreliable sensory modality and using only the other. Participants easily completed these mazes using vision only, and with some difficulty via audition only. Partially unreliable vision degraded performance, though still above audio-only, demonstrating utilization of the unreliable visual cues. Mazes whose entire visual input was false degraded performance to the level of audio-only, though participants subjectively reported them as easier than audio-only and did not close their eyes, indicating that they still relied on vision. A control condition in which visual information was both false and constantly moving, preventing its use for landmarks or optic flow, did cause participants to close their eyes and disregard the false vision, but was accompanied by confounding nausea. In parallel, auditory incongruencies were easily suppressed across all unreliable auditory conditions. This demonstrates human attachment to visual information, even when it is mostly or completely false, and the ability to glean practical advantages from it unless it is completely stripped of usability. More broadly, it lays a foundation for testing multisensory integration of sustained false sensory channels, and has implications for mixed reality design.
KW - Audition
KW - Multimodal
KW - Multisensory
KW - Navigation
KW - Sensory Substitution
KW - Virtual Environment
KW - Virtual Reality
KW - Vision
UR - http://www.scopus.com/inward/record.url?scp=85191449355&partnerID=8YFLogxK
U2 - 10.1109/VR58804.2024.00112
DO - 10.1109/VR58804.2024.00112
M3 - Conference contribution
T3 - Proceedings - 2024 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2024
SP - 935
EP - 944
BT - Proceedings - 2024 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2024
T2 - 31st IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2024
Y2 - 16 March 2024 through 21 March 2024
ER -