
2016-03-10 · Book chapter (Teil eines Buches) · DOI: 10.18452/19862
Perception of the visual environment
dc.contributor.author: Tatler, Benjamin W.
dc.date.accessioned: 2019-04-09T09:43:46Z
dc.date.available: 2019-04-09T09:43:46Z
dc.date.issued: 2016-03-10
dc.identifier.isbn: 9789027267481
dc.identifier.other: 10.1075/aicr.93.02tat
dc.identifier.uri: http://edoc.hu-berlin.de/18452/20660
dc.description.abstract: The eyes are the front end to the vast majority of the human behavioural repertoire. The manner in which our eyes sample the environment places fundamental constraints upon the information that is available for subsequent processing in the brain: the small window of clear vision at the centre of gaze can only be directed at an average of about three locations in the environment every second. We are largely unaware of these continual movements, making eye movements a valuable objective measure that can provide a window into the cognitive processes underlying many of our behaviours. The valuable resource of high-quality vision must be allocated with care in order to provide the right information at the right time for the behaviours we engage in. However, the mechanisms that underlie the decisions about where and when to move the eyes remain to be fully understood. In this chapter I consider what has been learnt about targeting the eyes in a range of different experimental paradigms, from simple stimulus arrays of only a few isolated targets, to complex arrays and photographs of real environments, and finally to natural task settings. Much has been learnt about how we view photographs, and current models incorporate low-level image salience, motor biases that favour certain ways of moving the eyes, higher-level expectations of what objects look like, and expectations about where we will find objects in a scene. Finally, in this chapter I will consider the fate of information that has received overt visual attention. While much of the detailed information from what we look at is lost, some remains; yet what we retain, and the factors that govern what is remembered and what is forgotten, are not well understood. It appears that our expectations about what we will need to know later in the task are important in determining what we represent and retain in visual memory, and that our representations are shaped by the interactions that we engage in with objects.
dc.language.iso: eng
dc.publisher: Humboldt-Universität zu Berlin
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject.ddc: 400 Language (Sprache)
dc.title: Perception of the visual environment
dc.type: bookPart
dc.identifier.urn: urn:nbn:de:kobv:11-110-18452/20660-8
dc.identifier.doi: http://dx.doi.org/10.18452/19862
dc.type.version: acceptedVersion
local.edoc.container-title: Visually Situated Language Comprehension
local.edoc.container-creator: Knoeferle, P., Pyykkönen-Klauck, P., & Crocker, M. W.
local.edoc.pages: 61
local.edoc.anmerkung: © 2016 John Benjamins Publishing Company. This is the accepted manuscript of a chapter published in Knoeferle, P., Pyykkönen-Klauck, P., & Crocker, M. W. (Eds.). (2016). Visually Situated Language Comprehension. Advances in Consciousness Research. John Benjamins Publishing Company. The final version is available at: https://doi.org/10.1075/aicr.93.02tat
local.edoc.type-name: Teil eines Buches (book chapter)
local.edoc.container-type: book
local.edoc.container-type-name: Buch (book)
local.edoc.container-publisher-name: John Benjamins Publishing Company
local.edoc.container-publisher-place: Amsterdam
local.edoc.container-year: 2016
local.edoc.container-firstpage: 31
local.edoc.container-lastpage: 66
