Evidence for enhanced discrimination of virtual auditory distance among blind listeners using level and direct-to-reverberant cues

Andrew Kolarik, Silvia Cirstea, Shahina Pardhan

Research output: Contribution to journal › Article › peer-review

62 Citations (Scopus)

Abstract

Totally blind listeners often demonstrate better than normal capabilities when performing spatial hearing tasks. Accurate representation of three-dimensional auditory space requires the processing of available distance information between the listener and the sound source; however, auditory distance cues vary greatly depending upon the acoustic properties of the environment, and it is not known which distance cues are important to totally blind listeners. Our data show that totally blind listeners display better performance compared to sighted age-matched controls for distance discrimination tasks in anechoic and reverberant virtual rooms simulated using a room-image procedure. Totally blind listeners use two major auditory distance cues to stationary sound sources, level and direct-to-reverberant ratio, more effectively than sighted controls for many of the virtual distances tested. These results show that significant compensation among totally blind listeners for virtual auditory spatial distance leads to benefits across a range of simulated acoustic environments. No significant differences in performance were observed between listeners with partial non-correctable visual losses and sighted controls, suggesting that sensory compensation for virtual distance does not occur for listeners with partial vision loss.
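The two auditory distance cues named in the abstract, overall level and direct-to-reverberant ratio, can be illustrated with a short sketch. The following Python example is not the room-image simulation used in the study; it builds a toy impulse response (a direct impulse whose amplitude falls off as 1/distance plus a decaying noise tail standing in for reverberation) purely to show how both cues decrease as source distance increases. The sample rate, decay constant, tail amplitude, and 2.5 ms direct window are illustrative assumptions.

```python
import numpy as np

FS = 44_100           # sample rate in Hz (assumed for illustration)
SPEED_OF_SOUND = 343  # m/s

def toy_room_impulse_response(distance_m, rt60_s=0.5, tail_s=1.0, seed=0):
    """Crude impulse response: a direct impulse whose amplitude falls off
    as 1/distance, followed by an exponentially decaying noise tail that
    stands in for reverberation. Not the room-image method of the paper;
    it only serves to demonstrate the two distance cues."""
    rng = np.random.default_rng(seed)
    n = int(tail_s * FS)
    h = np.zeros(n)
    delay = int(distance_m / SPEED_OF_SOUND * FS)  # direct-path delay in samples
    h[delay] = 1.0 / distance_m                    # level cue: ~1/r amplitude falloff
    t = np.arange(n - delay - 1) / FS
    decay = np.exp(-6.9 * t / rt60_s)              # ~60 dB energy decay over rt60
    h[delay + 1:] = 0.01 * rng.standard_normal(n - delay - 1) * decay
    return h, delay

def direct_to_reverberant_db(h, direct_idx, window_ms=2.5):
    """Direct-to-reverberant ratio: energy in a short window around the
    direct arrival versus energy in the remaining (reverberant) tail."""
    w = int(window_ms / 1000 * FS)
    direct_energy = np.sum(h[direct_idx:direct_idx + w] ** 2)
    reverb_energy = np.sum(h[direct_idx + w:] ** 2)
    return 10 * np.log10(direct_energy / reverb_energy)

# Both cues fall with distance: level because of the 1/r direct path,
# D/R because the reverberant tail level is roughly distance-independent.
for d in (1.0, 2.0, 4.0, 8.0):
    h, idx = toy_room_impulse_response(d)
    level_db = 10 * np.log10(np.sum(h ** 2))
    drr_db = direct_to_reverberant_db(h, idx)
    print(f"{d:4.1f} m   level {level_db:6.1f} dB   D/R {drr_db:6.1f} dB")
```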
Original language: English
Pages (from-to): 623–633
Number of pages: 11
Journal: Experimental Brain Research
Volume: 224
Early online date: 22 Nov 2012
DOIs
Publication status: Published - Feb 2013