Discriminating speech rhythms in audition, vision, and touch

dc.contributor.author: Navarra, Jordi
dc.contributor.author: Soto-Faraco, Salvador, 1970-
dc.contributor.author: Spence, Charles
dc.date.accessioned: 2015-10-07T07:35:56Z
dc.date.available: 2015-10-07T07:35:56Z
dc.date.issued: 2014
dc.description.abstract: We investigated the extent to which people can discriminate between languages on the basis of their characteristic temporal, rhythmic information, and the extent to which this ability generalizes across sensory modalities. We used rhythmic patterns derived from the alternation of vowels and consonants in English and Japanese, presented in audition, vision, audition and vision simultaneously, or touch. Experiment 1 confirmed that discrimination is possible on the basis of auditory rhythmic patterns, and extended this finding to vision, using 'aperture-close' mouth movements of a schematic face. In Experiment 2, language discrimination was demonstrated using visual and auditory materials that did not resemble spoken articulation. A combined analysis of the data from Experiments 1 and 2 also revealed a benefit when the auditory rhythmic information was available to participants. Although discrimination could be achieved using vision alone, auditory performance was nevertheless better. In a final experiment, we demonstrate that the rhythm of speech can also be discriminated successfully by means of vibrotactile patterns delivered to the fingertip. The results of the present study therefore demonstrate that discrimination between languages' syllabic rhythmic patterns is possible on the basis of visual and tactile displays.
dc.description.sponsorship: This research was supported by grants PSI2012-39149, PSI2009-12859 and RYC-2008-03672 from the Ministerio de Economía y Competitividad (MINECO, Spain) to J. Navarra, by the European COST action TD0904, and by grants PSI2010-15426 and CDS00012 from MINECO (Spain), 2009SGR-292 from the Comissionat per a Universitats i Recerca del DIUE-Generalitat de Catalunya, and the European Research Council (StG-2010 263145) to S. Soto-Faraco.
dc.format.mimetype: application/pdf
dc.identifier.citation: Navarra J, Soto-Faraco S, Spence C. Discriminating speech rhythms in audition, vision, and touch. Acta Psychol. 2014 Sep;151:197-205. DOI 10.1016/j.actpsy.2014.05.021
dc.identifier.doi: http://dx.doi.org/10.1016/j.actpsy.2014.05.021
dc.identifier.issn: 0001-6918
dc.identifier.uri: http://hdl.handle.net/10230/24809
dc.language.iso: eng
dc.publisher: Elsevier
dc.relation.ispartof: Acta Psychologica. 2014 Sep;151:197-205
dc.relation.projectID: info:eu-repo/grantAgreement/EC/FP7/263145
dc.relation.projectID: info:eu-repo/grantAgreement/ES/3PN/PSI2012-39149
dc.relation.projectID: info:eu-repo/grantAgreement/ES/3PN/PSI2009-12859
dc.relation.projectID: info:eu-repo/grantAgreement/ES/3PN/RYC2008-03672
dc.relation.projectID: info:eu-repo/grantAgreement/ES/3PN/PSI2010-15426
dc.rights: © Elsevier http://dx.doi.org/10.1016/j.actpsy.2014.05.021
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.subject.keyword: Speech rhythm
dc.subject.keyword: Speechreading
dc.subject.keyword: Audition
dc.subject.keyword: Vision
dc.subject.keyword: Touch
dc.subject.keyword: Discrimination
dc.title: Discriminating speech rhythms in audition, vision, and touch
dc.type: info:eu-repo/semantics/article
dc.type.version: info:eu-repo/semantics/acceptedVersion

Files

Original bundle

Name:
Navarra_AP_1265.pdf
Size:
303.31 KB
Format:
Adobe Portable Document Format