Visual information constrains early and late stages of spoken-word recognition in sentence context

dc.contributor.author Brunellière, Angèle
dc.contributor.author Sánchez García, Carolina, 1984-
dc.contributor.author Ikumi Montserrat, Nara, 1986-
dc.contributor.author Soto-Faraco, Salvador, 1970-
dc.date.accessioned 2015-10-30T08:27:55Z
dc.date.available 2015-10-30T08:27:55Z
dc.date.issued 2013
dc.identifier.citation Brunellière A, Sánchez-García C, Ikumi N, Soto-Faraco S. Visual information constrains early and late stages of spoken-word recognition in sentence context. International Journal of Psychophysiology. 2013 Jun 22;89(1): 136-47. DOI 10.1016/j.ijpsycho.2013.06.016
dc.identifier.issn 0167-8760
dc.identifier.uri http://hdl.handle.net/10230/24964
dc.description.abstract Audiovisual speech perception has frequently been studied at the phoneme, syllable and word processing levels. Here, we examined the constraints that visual speech information might exert during the recognition of words embedded in a natural sentence context. We recorded event-related potentials (ERPs) to words that could be either strongly or weakly predictable on the basis of the prior semantic sentential context and whose initial phoneme varied in the degree of visual saliency from lip movements. When the sentences were presented audio-visually (Experiment 1), words weakly predicted from semantic context elicited a larger long-lasting N400 compared to strongly predictable words. This semantic effect interacted with the degree of visual saliency over a late part of the N400. When comparing audio-visual versus auditory-alone presentation (Experiment 2), the typical amplitude-reduction effect over the auditory-evoked N100 response was observed in the audiovisual modality. Interestingly, a specific benefit of high- versus low-visual saliency constraints occurred over the early N100 response and at the late N400 time window, confirming the result of Experiment 1. Taken together, our results indicate that the saliency of visual speech can exert an influence over both auditory processing and word recognition at relatively late stages, and thus suggest strong interactivity between audio-visual integration and other (arguably higher) stages of information processing during natural speech comprehension.
dc.description.sponsorship This research was supported by the Spanish Ministry of Science and Innovation (PSI2010-15426 and Consolider INGENIO CSD2007-00012), Comissionat per a Universitats i Recerca del DIUE-Generalitat de Catalunya (SGR2009-092), and the European Research Council (StG-2010263145).
dc.format.mimetype application/pdf
dc.language.iso eng
dc.publisher Elsevier
dc.relation.ispartof International Journal of Psychophysiology. 2013 Jun 22;89(1): 136-47.
dc.rights © Elsevier http://dx.doi.org/10.1016/j.ijpsycho.2013.06.016
dc.title Visual information constrains early and late stages of spoken-word recognition in sentence context
dc.type info:eu-repo/semantics/article
dc.identifier.doi http://dx.doi.org/10.1016/j.ijpsycho.2013.06.016
dc.subject.keyword Visual speech
dc.subject.keyword Semantic constraints
dc.subject.keyword Spoken-word recognition
dc.subject.keyword Event-related potentials
dc.relation.projectID info:eu-repo/grantAgreement/EC/FP7/263145
dc.relation.projectID info:eu-repo/grantAgreement/ES/3PN/PSI2010-15426
dc.relation.projectID info:eu-repo/grantAgreement/ES/2PN/CSD2007-00012
dc.rights.accessRights info:eu-repo/semantics/openAccess
dc.type.version info:eu-repo/semantics/acceptedVersion