TY - JOUR
T1 - Task irrelevant sounds influence visual attention through graded crossmodal semantic modulation
AU - Wegner-Clemens, Kira
AU - Malcolm, George
AU - Kravitz, Dwight
AU - Shomstein, Sarah
N1 - Open practices statement:
The data and materials for all experiments are available on the Open Science Framework: https://osf.io/cwy8d/overview?view_only=5dcccf35c7fd4205afb069da2dae9246 . Experiment 1 was preregistered at osf.io/qxz6g/?view_only=f90b3e4802c04eb4a589dd9ba70addb3 . Experiments 2 and 3 were preregistered at osf.io/qrhe7/?view_only=539018c9aa8b4a008aee5cdf9554f4d7 .
PY - 2026/2/18
Y1 - 2026/2/18
N2 - The role of semantics in guiding visual attentional prioritization is well established, but how semantic information shapes attention in multisensory, rather than unisensory, contexts has been less well characterized. Task-irrelevant sounds can speed search for matched visual targets; e.g., you find a dog more quickly if you hear it barking. However, matched sounds and images may be a unique case, with the attentional benefit derived from integration into a combined multisensory event. To establish whether semantic information influences attention in audiovisual contexts beyond direct matches, we systematically varied sound–image semantic relatedness while participants searched for target images. Search speed scaled with sound–image semantic relatedness, such that target images were found faster when the sound was more closely related to the image. To examine whether sound–image semantic scaling hinges on task relevance, we conducted two follow-up experiments in which participants completed an orthogonal Gabor discrimination task. The observed results demonstrate that semantic knowledge guides audiovisual attention, even when task-irrelevant and not matched, suggesting a robust underlying cognitive mechanism that processes semantics both within and across sensory modalities and in turn influences attentional allocation.
AB - The role of semantics in guiding visual attentional prioritization is well established, but how semantic information shapes attention in multisensory, rather than unisensory, contexts has been less well characterized. Task-irrelevant sounds can speed search for matched visual targets; e.g., you find a dog more quickly if you hear it barking. However, matched sounds and images may be a unique case, with the attentional benefit derived from integration into a combined multisensory event. To establish whether semantic information influences attention in audiovisual contexts beyond direct matches, we systematically varied sound–image semantic relatedness while participants searched for target images. Search speed scaled with sound–image semantic relatedness, such that target images were found faster when the sound was more closely related to the image. To examine whether sound–image semantic scaling hinges on task relevance, we conducted two follow-up experiments in which participants completed an orthogonal Gabor discrimination task. The observed results demonstrate that semantic knowledge guides audiovisual attention, even when task-irrelevant and not matched, suggesting a robust underlying cognitive mechanism that processes semantics both within and across sensory modalities and in turn influences attentional allocation.
KW - Attention
KW - Audiovisual
KW - Multisensory
KW - Semantics
UR - http://www.scopus.com/inward/record.url?scp=105030499984&partnerID=8YFLogxK
U2 - 10.3758/s13423-025-02842-y
DO - 10.3758/s13423-025-02842-y
M3 - Article
SN - 1069-9384
VL - 33
JO - Psychonomic Bulletin & Review
JF - Psychonomic Bulletin & Review
IS - 2
M1 - 86
ER -