An active vision of visual perception

The visual system seems to give us a continuous view of the world and the objects around us, yet the eyes actually shift position three times a second through movements called saccades. This raises some interesting questions around how we achieve perceptual continuity, a topic of great interest to Professor Martin Rolfs.

Our eyes seem to give us a continuous, unbroken view of the world, yet in reality they shift position from one location to the next three times per second. These shifts are called saccades, and in a fraction of a second the eyes move to gather information at a new location. “This is quite different to what most people report about how they perceive the world,” says Martin Rolfs, a Professor of Experimental Psychology at Humboldt-Universität zu Berlin. This raises some interesting questions, particularly around perceptual continuity. “How do we perceive a stable, continuous world when the eye is jerking around and creating an image on the back of the eye, on the retina, that is displaced every 300 milliseconds?” asks Professor Rolfs.
Active perception and cognition
“How do we perceive a stable, continuous world when the eye is jerking around and creating an image on the back of the eye, on the retina, that is displaced every 300 milliseconds?”

and psychology. “We are trying to understand the brain and how it works with visual information. People have started looking at the interaction between eye movements and perception as well as cognitive processes like memory,” says Professor Rolfs. “We can probe what someone sees and what they ignore in an experimental setting.” A lot of information is typically available in a visual scene, whether it’s shop names, car registration numbers or petrol prices, and we all have to work out what we want to focus on at any given point in time. This is based on very crude information picked up in the periphery of the visual field, on the basis of which a decision is reached on where to move the eyes. “You then move them to a new location – but before you move, you shift attention there, and start analysing what’s going on in much more detail,” says Professor Rolfs. Before the eye moves to a new location, the brain starts processing that part of the scene much more rigorously than any other. “It starts ignoring other things and the part that you’re moving the eye towards becomes central to your awareness,” says Professor Rolfs. This happens every time that the eyes move to a new location.

So there is an ongoing interaction between this covert selection of certain parts of a visual scene, and the subsequent movement of the eyes, which allows us to gain more detail about a certain part of a scene. “When you place your fovea, the centre of your gaze, on a certain point in a scene, you have a much higher resolution at that point. You can see much finer details,” explains Professor Rolfs. An attention shift precedes this eye movement. “Before the
Understanding psychological disorders

The goal in research now is to build towards a theory of active vision, of how continuous perception is achieved despite the presence of all these movements and changes on the retina. This could also lead to new insights into how attention works in disorders like schizophrenia, autism or attention deficit hyperactivity disorder (ADHD), a condition characterised by difficulties in paying attention to a specific task, although research suggests this is accompanied by heightened perception elsewhere. “One of my students has shown that kids with ADHD are better at detecting unexpected changes somewhere in the visual field than other children. They had a broader focus of attention and were easily distracted,” explains Professor Rolfs. “If they are given a task that requires more distributed attention, they might actually perform better than other people.”

This is illustrated by the example of a well-known video in which viewers are asked to count the number of times players in a basketball game pass the ball to one another. Most people become so absorbed in the task that they fail to notice a gorilla walking across the screen. “This might seem surprising to many people, but it shows how much we ignore in everyday vision. We just focus on whatever we need to focus on at that particular moment,” says Professor Rolfs. By contrast, Professor Rolfs says that children with ADHD often noticed the gorilla straight
away. “This suggests that their attention system is much more excitable by external stimuli while they are engaged in a certain task, but more research is needed in this domain,” he explains.

Researchers have also looked at patients with other disorders, including schizophrenia. One theory about the cause of schizophrenia symptoms, such as hearing voices and experiencing hallucinations, relates to visual perception and cognition. “A lot of the processes that I’ve described are predictive. So you shift attention to the target of your upcoming eye movement – you’re predicting where the eye will be, and effectively anticipating the movement and gathering information at that location,” outlines Professor Rolfs. “One theory about schizophrenia is that patients’ brains might be making these predictions, but the parts of their brain that would normally deal with them are not receiving them in some way.”

An experiment has been developed in collaboration with Professor Katy Thakkar at Michigan State University to build on this research, with scientists looking to gain deeper insights into schizophrenia through tests on the eye movement system. “That could eventually lead to a diagnostic tool, and also help us understand where the symptoms of schizophrenia come from,” says Professor Rolfs. Research into the eye movement system could prove to be an effective way of learning about schizophrenia, and Professor Rolfs plans to pursue further research in this area in the coming years. “Together with Professor Thakkar we’re investigating predictive processes relating to the eye movements of patients with schizophrenia, and this is something we need to understand better in the future,” he says.
Experimental Psychology: Active Perception and Cognition

Project Objectives
When observers actively explore and manipulate their environment, the fundamental processes of perception, cognition, and movement control become intricately related. To understand perception and cognition in active observers, we leverage a broad range of methods including eye and motion tracking, psychophysics, computational modeling, EEG, robotics, and studies of clinical populations.
Martin Rolfs’ research is financially supported by the German Research Foundation (Deutsche Forschungsgemeinschaft, DFG; grants RO3579/8-1 & RO3579/9-1). The collaboration with Katherine Thakkar at Michigan State University is supported by the National Institutes of Health (NIH; grant 1 R01 MH112644-01A1).
Project Coordinator
Martin Rolfs, Prof. Dr. phil.
Heisenberg Professor für ‘Allgemeine Psychologie: Aktive Wahrnehmung und Kognition’
Institut für Psychologie | Humboldt-Universität zu Berlin
Unter den Linden 6 | 10099 Berlin, Germany
T: +49 (0)30 2093-6775
E: email@example.com
W: http://rolfslab.de
W: www.martinrolfs.de
Recommended Reading
Rolfs, M. (2015). Attention in active vision: A perspective on perceptual continuity across saccades. Perception, 44, 900-919.
Thakkar, K.N., Diwadkar, V.A., & Rolfs, M. (2017). Oculomotor prediction: a window into the psychotic mind. Trends in Cognitive Sciences, 21, 344-356.
Professor Martin Rolfs
Martin Rolfs is Heisenberg Professor for Experimental Psychology at Humboldt-Universität zu Berlin. His research focuses on dynamic processes of visual perception and cognition in active observers, looking at how the movements of our eyes, heads and bodies constrain what we perceive and how we perceive it.
Electrodes are attached to the EEG cap that will then be placed on the head of a participant in an experiment (Credit: Julius Krumbiegel).
Photo by Kopf & Kragen
This question is a central part of Professor Rolfs’ agenda, with researchers at his laboratory investigating the underlying processes behind active perception and cognition. Traditionally, perception was looked at in a passive way; for example, people participating in a study were asked to fixate on a point on a screen for a period of time. “They were presented with stimuli. People reported on these stimuli and reached a judgment about them,” outlines Professor Rolfs. But in daily life, we don’t typically fixate on certain objects for long periods of time, so researchers now embrace the fact that the visual system actively collects information about the world. “We move our eyes all the time, and these eye and body movements affect what we perceive,” explains Professor Rolfs.

The team at Professor Rolfs’ laboratory is now investigating this area in greater depth, using a variety of techniques to revise our picture of perception and cognition. This includes building on ideas from computer vision, where scientists have been working to develop cameras capable of recognising certain objects in images, which is crucial to some emerging technologies. “Self-driving cars need to recognise and track the things around them constantly,” points out Professor Rolfs. The brain does this on a continuous basis, so Professor Rolfs believes human vision and computer vision can learn from
each other. “When they move a camera, for instance, they learn more about the structure of a scene than from a static image,” he explains. This is because the movement itself creates relative motion of the objects in a scene: things that are further away move more slowly across the image, while those that are closer move more quickly. The idea of active vision became more prominent in the computer vision community as a result, with researchers starting to move the camera more to understand a scene better. A similar shift has happened in neuroscience
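The parallax cue described here can be put in rough numbers. The sketch below is a toy pinhole-camera model added purely for illustration – the function name and all values are invented, not taken from the lab’s work. For sideways camera motion, image-plane speed scales with camera speed divided by depth, so nearby objects sweep across the image faster than distant ones.

```python
# Toy illustration of motion parallax: under sideways camera (or eye)
# motion, the image of a nearby object moves across the sensor faster
# than the image of a distant one. Pinhole model; example numbers only.

def image_velocity(depth_m, cam_speed_mps, focal_px=500.0):
    """Image-plane speed (pixels/s) of a point at depth_m metres
    for a camera translating sideways at cam_speed_mps m/s."""
    return focal_px * cam_speed_mps / depth_m

near = image_velocity(depth_m=2.0, cam_speed_mps=1.0)   # 250 px/s
far = image_velocity(depth_m=50.0, cam_speed_mps=1.0)   # 10 px/s

# The ratio of image speeds directly encodes the ratio of depths,
# which is why moving the camera reveals the structure of the scene:
assert near / far == 50.0 / 2.0
```

Nothing in the model depends on the particular focal length: it cancels in the speed ratio, so relative depth falls out of relative motion alone.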
eyes move, the brain provides the mind with a preview of the target of your eye movement. Then the eyes move and you get a really crisp image on the fovea, the central, high-resolution part of the retina,” continues Professor Rolfs.
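The resolution advantage of placing the fovea on a target can be sketched with a standard textbook approximation, not a formula from this article: visual acuity falls off roughly inversely with eccentricity, halving every E2 degrees away from the centre of gaze, with E2 often taken to be around 2 degrees. The helper below is hypothetical and purely illustrative.

```python
# Sketch of why foveating a target reveals finer detail: acuity drops
# steeply away from the centre of gaze. A common first-order
# approximation is acuity(E) ~ acuity(0) / (1 + E / E2), where E is
# eccentricity in degrees and E2 ~ 2 degrees (rough textbook value).

def relative_acuity(eccentricity_deg, e2_deg=2.0):
    """Acuity at a given eccentricity, as a fraction of foveal acuity."""
    return 1.0 / (1.0 + eccentricity_deg / e2_deg)

print(relative_acuity(0.0))   # 1.0 -- fovea: full resolution
print(relative_acuity(10.0))  # ~0.17 -- 10 deg out: about one sixth
```

Under this approximation, a detail ten degrees into the periphery would need to be roughly six times larger to be resolved as well as at fixation, which is what makes the saccade to the target worthwhile.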
This image shows a participant in an experiment wearing an EEG cap. The EEG cap picks up the electrical activity of the brain while participants perform psychophysical tasks (Credit: Julius Krumbiegel).
A participant in an experiment is looking at – and reporting on – stimuli on a screen, while her eye movements are tracked (Credit: Julius Krumbiegel).