Breaking through suppression: Face expertise selectively modulates very early awareness of high level face properties

Michael Papasavva, Louise Ewing, Inês Mares, Marie L. Smith

Research output: Contribution to journal › Article › peer-review

Abstract

Neurotypical variability in face recognition abilities is known to be driven by differences present across multiple elements of an extended processing pathway, i.e., from early visual perception through to later explicit retrieval and recall. Here, across two experiments, we utilised breaking Continuous Flash Suppression (CFS) paradigms to explore the earliest stage of face encoding: the lead-up to conscious detection. We investigated whether faces selectively receive preferential access to awareness among participants with relatively stronger (cf. weaker) face recognition abilities, both at the categorical level (contrasting detection of faces with another object category) and at higher levels of face processing (exploring differences associated with orientation and attractiveness). Both experiments identified selectively faster access to awareness for faces over a non-face object control (houses) in better face recognisers, at both the group and individual level. Experiment two further clarified that these expertise-related effects are selective to upright (cf. inverted) faces, indicating that this link is unlikely to be driven solely by sensitivity to low-level visual cues. We also observed expertise-related modulation of attractiveness effects on CFS breakthrough, consistent with the possibility that individuals with higher face processing ability have accelerated early access to even this high-level stimulus dimension. Taken together, these experiments provide new insight into very early face perception and the extent to which expertise modulates this processing stage at both the group and individual level.
Original language: English
Article number: 109104
Journal: Neuropsychologia
Volume: 211
Early online date: 7 Mar 2025
DOIs
Publication status: E-pub ahead of print - 7 Mar 2025