Abstract
Intelligent manipulation of handheld tools marks a major discontinuity between humans and our closest ancestors. Here we identified neural representations of how tools are typically manipulated within left anterior temporal cortex, by shifting a searchlight classifier through whole-brain real-action fMRI data while participants grasped 3D-printed tools in ways considered typical for use (i.e., by their handle). These neural representations were evoked automatically, as task performance did not require semantic processing. In fact, findings from a behavioural motion-capture experiment confirmed that actions with tools (relative to non-tool objects) incurred additional processing costs, as would be expected if semantic areas were automatically engaged. These results substantiate theories of semantic cognition that claim the anterior temporal cortex combines sensorimotor and semantic content for advanced behaviours like tool manipulation.
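The whole-brain searchlight decoding described above can be illustrated with a minimal sketch. The example below uses nilearn's SearchLight with a linear SVM and leave-one-run-out cross-validation; the file names, condition labels, and sphere radius are illustrative assumptions only, not the authors' actual analysis pipeline.

```python
# Minimal sketch of a whole-brain searchlight classification analysis,
# in the spirit of the approach described in the abstract. File names,
# labels, and parameter values are hypothetical placeholders.
import numpy as np
from nilearn.decoding import SearchLight
from nilearn.image import load_img
from sklearn.model_selection import LeaveOneGroupOut

# Single-trial beta images stacked into one 4D image, plus one condition
# label and one run index per volume (hypothetical files).
betas = load_img("trial_betas_4d.nii.gz")
labels = np.loadtxt("condition_labels.txt", dtype=str)  # e.g. grasp-type labels
runs = np.loadtxt("run_labels.txt", dtype=int)           # defines cross-validation folds

# Slide a small sphere across every voxel inside the brain mask and
# train/test a linear SVM on the voxels inside each sphere.
searchlight = SearchLight(
    mask_img="brain_mask.nii.gz",  # restrict the analysis to in-brain voxels
    radius=6.0,                    # sphere radius in mm (assumed value)
    estimator="svc",               # linear support vector classifier
    cv=LeaveOneGroupOut(),         # leave-one-run-out cross-validation
    n_jobs=-1,
)
searchlight.fit(betas, labels, groups=runs)

# searchlight.scores_ is a 3D map with one classification accuracy per voxel,
# which can then be tested against chance at the group level.
print(searchlight.scores_.shape)
```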
| Original language | English |
|---|---|
| Article number | 9042 |
| Journal | Scientific Reports |
| Volume | 12 |
| DOIs | |
| Publication status | Published - 5 Jun 2022 |
Projects
- 1 Finished
- Decoding neural representations of human tool use from fMRI response patterns
  31/10/15 → 30/10/18
  Project: Research
Research output
- 5 Citations (Scopus)
- 1 Article
- Hand-selective visual regions represent how to grasp 3D tools: Brain decoding during real actions
  Knights, E., Mansfield, C., Tonin, D., Saada, J., Smith, F. W. & Rossit, S., 16 Jun 2021, In: The Journal of Neuroscience, 41, 24, p. 5263-5273, 11 p.
  Research output: Contribution to journal › Article › peer-review
  Open Access · File · 16 Citations (Scopus) · 22 Downloads (Pure)
Datasets
- Hand-selective visual regions represent how to grasp 3D tools for use: brain decoding during real actions
  Knights, E. (Creator), Mansfield, C. (Creator), Tonin, D. (Creator), Saada, J. (Creator), Smith, F. (Creator) & Rossit, S. (Creator), OpenNeuro, 20 Oct 2020
  DOI: 10.18112/openneuro.ds003342.v1.0.0
  Dataset