Searching for audiovisual correspondence in multiple speaker scenarios

dc.contributor.author: Alsius, Agnès
dc.contributor.author: Soto-Faraco, Salvador, 1970-
dc.date.accessioned: 2015-11-16T10:47:09Z
dc.date.available: 2015-11-16T10:47:09Z
dc.date.issued: 2011
dc.description.abstract: A critical question in multisensory processing is how the constant information flow that arrives at our different senses is organized into coherent representations. Some authors claim that pre-attentive detection of inter-sensory correlations supports crossmodal binding, whereas other findings indicate that attention plays a crucial role. We used visual and auditory search tasks for speaking faces to address the role of selective spatial attention in audiovisual binding. Search efficiency amongst faces for the match with a voice declined with the number of faces being monitored concurrently, consistent with an attentive search mechanism. In contrast, search amongst auditory speech streams for the match with a face was independent of the number of streams being monitored concurrently, as long as localization was not required. We suggest that the fundamental differences in the way in which auditory and visual information is encoded play a limiting role in crossmodal binding. Based on these unisensory limitations, we provide a unified explanation for several previous, apparently contradictory findings.
dc.description.sponsorship: This work was supported by grants PSI2010-15426 and Consolider INGENIO CSD2007-00012 (MICINN), Generalitat de Catalunya (SRG2009-092), and European Research Council (StG-2010 263145).
dc.format.mimetype: application/pdf
dc.identifier.citation: Alsius A, Soto-Faraco S. Searching for audiovisual correspondence in multiple speaker scenarios. Exp Brain Res. 2011 Sep;213(2-3):175-83. DOI 10.1007/s00221-011-2624-0
dc.identifier.doi: http://dx.doi.org/10.1007/s00221-011-2624-0
dc.identifier.issn: 0014-4819
dc.identifier.uri: http://hdl.handle.net/10230/25098
dc.language.iso: eng
dc.publisher: Springer
dc.relation.ispartof: Experimental Brain Research. 2011 Sep;213(2-3):175-83.
dc.relation.projectID: info:eu-repo/grantAgreement/EC/FP7/263145
dc.relation.projectID: info:eu-repo/grantAgreement/ES/2PN/CSD2007-00012
dc.rights: © Springer (The original publication is available at www.springerlink.com)
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.subject.keyword: Multisensory integration
dc.subject.keyword: Audiovisual speech perception
dc.subject.keyword: Spatial attention
dc.subject.keyword: Visual search
dc.subject.keyword: Auditory search
dc.title: Searching for audiovisual correspondence in multiple speaker scenarios
dc.type: info:eu-repo/semantics/article
dc.type.version: info:eu-repo/semantics/acceptedVersion

Files (original bundle)

Name: Alsius_EXR_1254.pdf
Size: 407.59 KB
Format: Adobe Portable Document Format