Posted by bioRxiv Subject Collection: Neuroscience ([info]syn_bx_neuro)
@ 2025-09-26 04:35:00


Predictive gaze orienting during navigation in virtual reality
Natural vision is an active, predictive process guided by expectations about when and where information will appear. Yet how gaze is shaped in dynamic, multisensory environments remains poorly understood. Using immersive virtual reality with eye-tracking, we examined oculomotor behavior during naturalistic navigation. Participants cycled through a virtual city while avatar cyclists, first heard overtaking them from behind via spatialized auditory cues, later became visible as they passed. Auditory cues triggered anticipatory gaze shifts to expected locations, indicating that eye movements were guided by auditory predictions rather than reactive visual responses. Violations of auditory-spatial expectations elicited longer fixations. Critically, removing auditory cues impaired predictive gaze orienting, delayed orienting responses, and increased collisions with obstacles. These findings demonstrate that auditory input fundamentally shapes the predictive models guiding visual exploration and adaptive behavior in dynamic environments, underscoring the multisensory basis of active perception in real-world interactions.
