Abstract

In everyday life we use visual search to locate people, places, and objects around us, and visual attention plays a pivotal role in efficient and successful visual search. Although visual attention operates to find stimuli in both near (within reach) and far (out of reach) space, most research in this area has been conducted in near space alone. Impairments in visual attention are very common following stroke, with visual neglect a classic manifestation. While visual neglect has been shown to dissociate between near and far space, there is currently no validated tool that measures visual attention in far space. We present a new, simple, portable, and open-source automated test of visual attention in far space: the Computerized Extrapersonal Neglect Test (CENT). The CENT consists of computerised versions of cancellation and line bisection tasks completed on a large screen in far space using a wireless remote. We tested 179 healthy controls (18-94 years old) and 55 stroke survivors using the CENT. Aging effects, normative data, and internal consistency were established from the healthy control data. Convergent and divergent validity and sensitivity were assessed in 55 stroke survivors (compared to 58 age-matched controls) who completed the CENT alongside gold-standard validated measures of visual neglect, cognition, and quality of life. Aging was accompanied by slower search speed and poorer quality of search, and both variables were significantly impaired in stroke survivors. The CENT demonstrated good internal consistency as well as convergent and divergent validity. Importantly, stroke survivors with neglect were specifically impaired on the CENT when compared with other stroke survivors. In fact, the CENT showed higher sensitivity to attentional deficits than gold-standard measures. The CENT is a brief, automated, easy-to-administer tool that is sensitive to age-related decline, brain injury, and attentional impairments.
Original language: English
Article number: 4877
Journal: Journal of Vision
Volume: 23
Publication status: Published - Aug 2023