Task-irrelevant sounds influence visual attention through graded crossmodal semantic modulation

Kira Wegner-Clemens, George Malcolm, Dwight Kravitz, Sarah Shomstein

Research output: Contribution to journal › Article › peer-review

Abstract

The role of semantics in guiding visual attentional prioritization is well established, but how semantic information shapes attention in multisensory, rather than unisensory, contexts has been less well characterized. Task-irrelevant sounds can speed search for semantically matched visual targets; for example, a dog is found more quickly when barking is heard. However, matched sounds and images may be a unique case, with the attentional benefit arising from integration into a combined multisensory event. To establish whether semantic information influences attention in audiovisual contexts beyond direct matches, we systematically varied sound–image semantic relatedness while participants searched for target images. Search speed scaled with sound–image semantic relatedness, such that target images were found faster when the sound was more closely related to the image. To examine whether this semantic scaling hinges on task relevance, we conducted two follow-up experiments in which participants completed an orthogonal Gabor discrimination task. The results demonstrate that semantic knowledge guides audiovisual attention even when sounds are task-irrelevant and not directly matched, suggesting a robust underlying cognitive mechanism that processes semantics both within and across sensory modalities and in turn influences attentional allocation.
Original language: English
Article number: 86
Journal: Psychonomic Bulletin & Review
Volume: 33
Issue number: 2
DOIs
Publication status: Published - 18 Feb 2026

Keywords

  • Attention
  • Audiovisual
  • Multisensory
  • Semantics
