Abstract
While the human auditory system is proficient on its own at discerning the direction of incoming sounds, it operates in concert with other sensory modalities to achieve accurate spatial awareness. Many studies have investigated the integration of auditory and visual information, but much less attention has been given to the role of proprioceptive and vestibular information in the localisation process. The vestibular and proprioceptive systems help to distinguish self-motion from source motion and can thereby stabilise perception and provide additional cues for sound localisation. The aim of this PhD thesis was to better understand how the estimation of head movement and position affects sound localisation. To this end, an ideal-observer model based on Bayesian principles was developed as a tool to predict dynamic sound localisation in humans and to test how performance is affected by the available information. Behavioural experiments were conducted in conjunction with model simulations to determine the acoustic cues and head motions that are relevant to dynamic sound localisation. The results of the psychoacoustic experiments were in general agreement with the model output, though some quantitative differences indicated that dynamic sound localisation may involve processes that can be considered non-ideal. These studies offer valuable insights for the field of psychoacoustics and for auditory engineering applications in modern technologies such as hearing aids and virtual or augmented reality.