A neurobehavioral model of flexible spatial language behaviors

John Lipinski, Sebastian Schneegans, Yulia Sandamirskaya, John P. Spencer, Gregor Schöner

Research output: Contribution to journal › Article › peer-review

33 Citations (Scopus)


We propose a neural dynamic model that specifies how low-level visual processes can be integrated with higher-level cognition to achieve flexible spatial language behaviors. This model uses real-world visual input that is linked to relational spatial descriptions through a neural mechanism for reference frame transformations. We demonstrate that the system can extract spatial relations from visual scenes, select items based on relational spatial descriptions, and perform reference object selection in a single unified architecture. We further show that the performance of the system is consistent with behavioral data in humans by simulating results from two independent empirical studies: a spatial-term rating task and a study of reference object selection behavior. The architecture we present thereby achieves a high degree of task flexibility under realistic stimulus conditions. At the same time, it also provides a detailed neural grounding for complex behavioral and cognitive processes.
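To make the core idea concrete, the sketch below illustrates what a reference frame transformation accomplishes in the simplest possible terms: a target's position in image (retinal) coordinates is re-expressed relative to a chosen reference object, and spatial-term templates are then evaluated on that relative vector. This is a hypothetical, deliberately simplified illustration; the function names, the cosine-similarity templates, and the vector-subtraction shortcut are assumptions for exposition, not the paper's actual neural field architecture, which performs the transformation with population-level neural dynamics.

```python
import numpy as np

# Hypothetical sketch of a reference frame transformation: re-express a
# target's image-coordinate position relative to a reference object, then
# rate cardinal spatial terms against the resulting relative vector.
# (y is taken to increase upward here, purely for illustration.)

def to_reference_frame(target_xy, reference_xy):
    """Shift the target position into a reference-object-centered frame."""
    return np.asarray(target_xy, dtype=float) - np.asarray(reference_xy, dtype=float)

def spatial_term_ratings(relative_xy):
    """Rate spatial terms by alignment with the relative position vector.

    Each rating is the cosine similarity between the relative vector and the
    term's canonical direction, clipped to [0, 1]; 0 means "does not apply".
    """
    directions = {
        "right of": np.array([1.0, 0.0]),
        "left of":  np.array([-1.0, 0.0]),
        "above":    np.array([0.0, 1.0]),
        "below":    np.array([0.0, -1.0]),
    }
    norm = np.linalg.norm(relative_xy)
    if norm == 0.0:
        return {term: 0.0 for term in directions}
    unit = relative_xy / norm
    return {term: float(max(0.0, unit @ d)) for term, d in directions.items()}

# Example: target at (8, 5), reference object at (3, 5) in image coordinates.
rel = to_reference_frame((8, 5), (3, 5))
ratings = spatial_term_ratings(rel)
# The target lies directly to the right of the reference, so "right of"
# receives the maximal rating and "left of" receives zero.
```

The same relative-coordinate representation supports the tasks the abstract lists: extracting a relation (find the best-rated term for a given target and reference), selecting an item (find the object whose relative position best matches a given term), or selecting a reference object (find the reference under which a term best describes the target).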
Original language: English
Pages (from-to): 1490-1511
Number of pages: 22
Journal: Journal of Experimental Psychology: Learning, Memory, and Cognition
Issue number: 6
Publication status: Published - Nov 2012