Combining top-down processes to guide eye movements during real-world scene search

George L. Malcolm, John M. Henderson

Research output: Contribution to journal › Article › peer-review

157 Citations (Scopus)

Abstract

Eye movements can be guided by various types of information in real-world scenes. Here we investigated how the visual system combines multiple types of top-down information to facilitate search. We independently manipulated the specificity of the search target template and the usefulness of contextual constraint in an object search task. An eye tracker was used to segment search time into three behaviorally defined epochs so that influences on specific search processes could be identified. The results support previous studies indicating that the availability of either a specific target template or scene context facilitates search. The results also show that target template and contextual constraint combine additively in facilitating search. The results extend recent eye guidance models by suggesting the manner in which our visual system utilizes multiple types of top-down information.

Original language: English
Article number: 4
Number of pages: 11
Journal: Journal of Vision
Volume: 10
Issue number: 2
DOIs
Publication status: Published - Feb 2010

Keywords

  • visual cognition
  • search
  • eye movements
  • scene recognition
  • VISUAL-ATTENTION
  • TIME-COURSE
  • PERCEPTION
  • GUIDANCE
  • TARGET
  • ALLOCATION
  • DURATIONS
  • SELECTION
  • SALIENCE
  • VISION
