Integrating information across the senses is critical for effective interactions with the environment. Over the past decade, evidence has accumulated that multisensory integration is not deferred to later processing in association cortices but begins already in primary, putatively unisensory, areas. Given this multitude of multisensory integration sites, characterizing their functional similarities and differences is of critical importance. Combining psychophysics, functional imaging and effective connectivity analyses, our research demonstrates that multisensory integration emerges in a functional hierarchy, with temporal coincidence detection in primary sensory areas, informational integration in association areas and decisional interactions in prefrontal areas.

Audiovisual interactions in low-level sensory areas are mediated via multiple mechanisms, including feedforward thalamocortical connections, direct connections between sensory areas, and top-down influences from higher-order association areas. In addition to identifying where in the brain sensory information is integrated, we also aimed to provide insights into the underlying computational operations by combining multivariate pattern decoding with models of Bayesian Causal Inference. Our results demonstrate audiovisual influences already at the primary cortical level. However, only parietal cortices integrate spatial signals from vision and audition weighted by their bottom-up sensory reliability and top-down task relevance. Critically, in line with Bayesian Causal Inference, the intraparietal sulcus (IPS) arbitrates between information integration and segregation by taking into account the probabilities of the causal structure of the environment.
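
For readers unfamiliar with the model, the sketch below (Python is assumed here; it is not material from the talk) illustrates the standard Bayesian Causal Inference scheme for audiovisual spatial localisation: reliability-weighted fusion under a common cause, segregated estimates under independent causes, and arbitration between the two via the posterior probability of a common cause. The function name, parameter values and the model-averaging decision rule are illustrative assumptions, not details of the speaker's analysis.

import numpy as np

def gauss(x, mu, var):
    """Gaussian density N(x; mu, var)."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def bci_estimate(x_v, x_a, sigma_v=2.0, sigma_a=8.0,
                 sigma_p=20.0, mu_p=0.0, p_common=0.5):
    """Bayesian Causal Inference for audiovisual spatial localisation.

    x_v, x_a : noisy visual / auditory location samples (deg)
    sigma_*  : sensory and prior standard deviations (illustrative values)
    p_common : prior probability that both signals share one cause
    Returns model-averaged visual and auditory location estimates and
    the posterior probability of a common cause.
    """
    var_v, var_a, var_p = sigma_v ** 2, sigma_a ** 2, sigma_p ** 2

    # Likelihood of the signals under a common cause (C = 1):
    # the shared source location is integrated out (closed form for Gaussians).
    denom = var_v * var_a + var_v * var_p + var_a * var_p
    like_c1 = np.exp(-0.5 * ((x_v - x_a) ** 2 * var_p
                             + (x_v - mu_p) ** 2 * var_a
                             + (x_a - mu_p) ** 2 * var_v) / denom) \
              / (2 * np.pi * np.sqrt(denom))

    # Likelihood under independent causes (C = 2).
    like_c2 = gauss(x_v, mu_p, var_v + var_p) * gauss(x_a, mu_p, var_a + var_p)

    # Posterior probability of a common cause.
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # Reliability-weighted (forced-fusion) estimate under C = 1.
    w_v, w_a, w_p = 1 / var_v, 1 / var_a, 1 / var_p
    s_fused = (w_v * x_v + w_a * x_a + w_p * mu_p) / (w_v + w_a + w_p)

    # Segregated estimates under C = 2 (each signal combined with the prior only).
    s_v_seg = (x_v / var_v + mu_p / var_p) / (1 / var_v + 1 / var_p)
    s_a_seg = (x_a / var_a + mu_p / var_p) / (1 / var_a + 1 / var_p)

    # Model averaging: arbitrate between integration and segregation.
    s_v_hat = post_c1 * s_fused + (1 - post_c1) * s_v_seg
    s_a_hat = post_c1 * s_fused + (1 - post_c1) * s_a_seg
    return s_v_hat, s_a_hat, post_c1

# Example: a small audiovisual disparity, so integration dominates.
print(bci_estimate(x_v=5.0, x_a=8.0))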

Speaker: Uta Noppeney (Birmingham)

Date and time: 23 January 2014, 17:00 – 19:00

Venue: Room 243 (Senate House)

