Werner Reichardt Centrum für Integrative Neurowissenschaften (CIN)

Vision and Cognition

Overview

We study high-level visual perception in the human brain, with a focus on ecologically relevant and meaningful vision: How do I know where I am in space? How fast do I and that ball move? What emotion does that person feel? Haven't I seen this place, object, or person before? Our studies include links to attention, memory, and social interactions. We collaborate with clinicians to gain insights into disorders such as autism, schizophrenia, and ADHD.

Methodologically, we use non-invasive brain imaging (3T and 9.4T fMRI), and we can perturb neural processing using transcranial magnetic stimulation (TMS), also concurrently with fMRI (using the latest hardware). For data analysis and modelling we use multivariate classifiers as well as classical statistics. Our current research questions include the following:

Illusions, Scene Segmentation, and Attention

Illusions allow us to separate visual processing from visual consciousness; they open windows onto scene segmentation, perceptual grouping, and Gestalt perception, and they reveal the influence of prior knowledge. We primarily use bi-stable stimuli and attentional manipulations to examine these processes.

Natural Scenes, Motion, and Space

Natural scenes (such as feature films) are fascinating: they contain the kind of visual input our brain evolved with. We study the interpretation of high-level motion in movies, but also of people, objects, and space. Using controlled paradigms, we examine how the brain integrates visual signals with body-related signals (efference copies of muscle movements, proprioceptive and vestibular signals) to provide perceptual stability. The aim is to understand how the brain encodes our position in the environment, and how it reconstructs the 3D space and objects around us from visual input. Motion, space, and memory are tightly interlinked.

Emotions and Social Interactions

How do we recognise emotional inflections in facial motion, in body posture, or in the way people interact with each other? We study how dynamic changes in facial expression and body posture are processed, and how visual and affective brain regions exchange information.

Methods

Stimuli:

We use anything from highly controlled stimuli (such as 3D dot fields) and virtual reality to natural movies. Special paradigms such as binocular rivalry and visual illusions allow us to dissociate pure stimulus processing from processing related to conscious perception, attentional control, and decision making.

Patients:

We are highly interested in understanding the mechanistic causes that can lead to neglect, autism, ADHD, or schizophrenia. We therefore collaborate with clinicians (neurologists and psychiatrists) and examine their patients using our paradigms, either purely behaviourally or also with fMRI.

Brain imaging:

fMRI (3T and 9.4T) and EEG. Analyses: we use standard statistics and multivariate approaches to gain insights into neural information content.
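
To give a sense of what such a multivariate analysis (MVPA) looks like in practice, here is a minimal sketch of cross-validated decoding of a stimulus condition from voxel response patterns. The data, variable names, and the use of scikit-learn are illustrative assumptions, not our actual pipeline; in a real analysis the pattern matrix would hold responses extracted from a region of interest.

  # Minimal MVPA sketch: cross-validated decoding of condition labels
  # from voxel response patterns (synthetic stand-in data).
  import numpy as np
  from sklearn.svm import LinearSVC
  from sklearn.pipeline import make_pipeline
  from sklearn.preprocessing import StandardScaler
  from sklearn.model_selection import cross_val_score, StratifiedKFold

  rng = np.random.default_rng(0)
  n_trials, n_voxels = 80, 200
  y = np.repeat([0, 1], n_trials // 2)          # two stimulus conditions
  X = rng.normal(size=(n_trials, n_voxels))     # voxel patterns (synthetic)
  X[y == 1, :20] += 0.5                         # weak condition-related signal

  # Linear classifier with voxel-wise standardisation, evaluated by
  # stratified cross-validation; chance accuracy is 0.5 for two classes.
  clf = make_pipeline(StandardScaler(), LinearSVC())
  scores = cross_val_score(clf, X, y, cv=StratifiedKFold(n_splits=5))
  print(f"Decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")

Above-chance decoding accuracy indicates that the voxel patterns carry information about the stimulus condition, which is the sense in which such analyses reveal "neural information content".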

Brain stimulation:

TMS. We use neuronavigated transcranial magnetic stimulation (TMS) to disturb perception, attentional processes, and associated decision making, in order to test the causal involvement of brain regions in a task. Simultaneous TMS-fMRI is used to examine the neural effects of various TMS protocols using the latest 7-channel surface coils.

Throughout most experiments we use eye tracking (EyeLink or Arrington).

Selected Publications

  • Zaretskaya N, Bause J, Polimeni JR, Grassi PR, Scheffler K, Bartels A (2020). Eye-selective fMRI activity in human primary visual cortex: Comparison between 3T and 9.4T, and effects across cortical depth. NeuroImage 220, 117078.
  • Bannert M, Bartels A (2018). Human V4 Activity Patterns Predict Behavioral Performance in Imagery of Object Color. Journal of Neuroscience 38(15), 3657-3668.
  • Grassi PR, Zaretskaya N, Bartels A (2018). A Generic Mechanism for Perceptual Organization in the Parietal Cortex. Journal of Neuroscience 38(32), 7158-7169.
  • Bannert M, Bartels A (2013). Decoding the yellow of a gray banana. Current Biology 23(22), 2268-2272.
  • Schindler A, Bartels A (2013). Parietal cortex codes for egocentric space beyond the field of view. Current Biology 23(2), 177-182.