
Multimodal Interactions - Coordinators: Benedikt Grothe & Harald Luksch

Combining vision, audition, olfaction, and touch allows the brain to form a unified representation of the outside world that goes beyond a simple sum of the individual sensory channels. Because the different senses reach the brain with different external and internal delays, the nervous system must bring their signals into spatio-temporal register. A whole range of multimodal interactions will be studied in Projects A3, B4, B5, C3, C4, and D2, spanning both biological and man-made systems. Comparing the individual strategies for combining information from different modalities (including strategies for dealing with the respective uncertainties and potential cross-modal inconsistencies) may reveal general principles underlying neuronal representations of space-time. These investigations will profit substantially from the newly established Bernstein Virtual Reality Facility with its two parallel setups for animal neurophysiology and human psychophysics.
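As an illustrative sketch (not a model from any of the projects named above), reliability-weighted cue combination is one standard way to formalize how estimates from two modalities with different uncertainties can be fused: each cue is weighted by its inverse variance, and the fused estimate is more precise than either cue alone. All values below are hypothetical.

```python
def fuse_cues(mu_v, var_v, mu_a, var_a):
    """Combine two Gaussian position estimates by inverse-variance weighting."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_a)  # weight of the visual cue
    w_a = 1 - w_v                                # weight of the auditory cue
    mu = w_v * mu_v + w_a * mu_a                 # fused position estimate
    var = 1 / (1 / var_v + 1 / var_a)            # fused variance is always smaller
    return mu, var

# Hypothetical example: vision localizes a source at 10 deg (variance 1),
# audition at 14 deg (variance 4); the fused estimate lies closer to the
# more reliable visual cue.
mu, var = fuse_cues(10.0, 1.0, 14.0, 4.0)
print(round(mu, 3), round(var, 3))  # → 10.8 0.8
```

The same weighting scheme also predicts how the fused percept shifts when one modality is degraded, which is one way to probe cross-modal interactions psychophysically.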

Overview of relevant projects

A3: New methods to study visual influences on auditory processing (B. Grothe, A. Herz, C. Leibold & H. Luksch)

B4: Multisensory object representation and integration in space-time (H. Luksch & L. van Hemmen)