Top-down attention regulates the neural expression of audiovisual integration

Citation

  • Morís Fernández L, Visser M, Ventura-Campos N, Ávila C, Soto-Faraco S. Top-down attention regulates the neural expression of audiovisual integration. NeuroImage. 2015;119:272-285. DOI: 10.1016/j.neuroimage.2015.06.052

Description

  • Abstract

    The interplay between attention and multisensory integration has proven to be a difficult question to tackle. There are almost as many studies showing that multisensory integration occurs independently from the focus of attention as studies implying that attention has a profound effect on integration. Addressing the neural expression of multisensory integration for attended vs. unattended stimuli can help disentangle this apparent contradiction. In the present study, we examine if selective attention to sound pitch influences the expression of audiovisual integration in both behavior and neural activity. Participants were asked to attend to one of two auditory speech streams while watching a pair of talking lips that could be congruent or incongruent with the attended speech stream. We measured behavioral and neural responses (fMRI) to multisensory stimuli under attended and unattended conditions while physical stimulation was kept constant. Our results indicate that participants recognized words more accurately from an auditory stream that was both attended and audiovisually (AV) congruent, thus reflecting a benefit due to AV integration. On the other hand, no enhancement was found for AV congruency when it was unattended. Furthermore, the fMRI results indicated that activity in the superior temporal sulcus (an area known to be related to multisensory integration) was contingent on attention as well as on audiovisual congruency. This attentional modulation extended beyond heteromodal areas to affect processing in areas classically recognized as unisensory, such as the superior temporal gyrus or the extrastriate cortex, and to non-sensory areas such as the motor cortex. Interestingly, attention to audiovisual incongruence triggered responses in brain areas related to conflict processing (i.e., the anterior cingulate cortex and the anterior insula). Based on these results, we hypothesize that AV speech integration can take place automatically only when both modalities are sufficiently processed, and that if a mismatch is detected between the AV modalities, feedback from conflict areas minimizes the influence of this mismatch by reducing the processing of the least informative modality.