Parallel Information Processing and Population Coding - Coordinator: Christian Leibold
To deal with the complexity of spatio-temporal stimuli, different stimulus attributes are often processed in parallel by distinct neural populations. This raises the questions of how the computational load is distributed among the elementary units of larger processing modules, how the results of these computations are integrated across space and time to create consistent neural representations, and how this is accomplished for dynamically varying input stimuli. The broad range of projects at the center allows us to compare population codes at various processing levels – from the sensory periphery (A2, B1, C1, C2, C4) and ascending pathways (A1, A3, B2, B3, B4) to central processing stages (A1, D2, PD) and (pre-)motor areas (C3, C4, D2). Similarly, we will be able to contrast coding principles across different auditory (A1, A3, B2, B4, C2), visual (A3, B1, B4, C3), vestibular (D2), somatosensory (A1) and electrosensory (A2) model systems. Joint research on the role of feed-forward versus feedback processing also touches upon another aspect of the center’s overarching research theme: how do neural systems synchronize the timing of internal information processing with the progression of time in the external world?
Overview of relevant projects
B1: Learning Invariant Representations From Retinal Activity (T. Wachtler & T. Gollisch)
B3: Auditory invariance against space-induced reverberation (L. Wiegrebe, B. Grothe & C. Leibold)
B4: Multisensory object representation and integration in space-time (H. Luksch & L. van Hemmen)
C4: How Neuronal Representations of Space-Time Lead to Action (L. van Hemmen, M. Buss & G. Cheng)
D2: Visual-vestibular and visuo-visual cortical interaction (T. Brandt, L. van Hemmen & S. Glasauer)