Grouping and segregation of sensory events by actions in temporal audio-visual recalibration
- dc.contributor.author Ikumi Montserrat, Nara, 1986-
- dc.contributor.author Soto-Faraco, Salvador, 1970-
- dc.date.accessioned 2018-02-07T15:18:23Z
- dc.date.available 2018-02-07T15:18:23Z
- dc.date.issued 2017
- dc.description.abstract Perception in multi-sensory environments involves both grouping and segregation of events across sensory modalities. Temporal coincidence between events is considered a strong cue to resolve multisensory perception. However, differences in physical transmission and neural processing times amongst modalities complicate this picture. This is illustrated by cross-modal recalibration, whereby adaptation to audio-visual asynchrony produces shifts in perceived simultaneity. Here, we examined whether voluntary actions might serve as a temporal anchor to cross-modal recalibration in time. Participants were tested on an audio-visual simultaneity judgment task after an adaptation phase where they had to synchronize voluntary actions with audio-visual pairs presented at a fixed asynchrony (vision leading or vision lagging). Our analysis focused on the magnitude of cross-modal recalibration to the adapted audio-visual asynchrony as a function of the nature of the actions during adaptation, putatively fostering cross-modal grouping or segregation. We found larger temporal adjustments when actions promoted grouping than segregation of sensory events. However, a control experiment suggested that additional factors, such as attention to planning/execution of actions, could have an impact on recalibration effects. Contrary to the view that cross-modal temporal organization is mainly driven by external factors related to the stimulus or environment, our findings add supporting evidence for the idea that perceptual adjustments strongly depend on the observer's inner states induced by motor and cognitive demands.
- dc.format.mimetype application/pdf
- dc.identifier.citation Ikumi N, Soto-Faraco S. Grouping and segregation of sensory events by actions in temporal audio-visual recalibration. Front Integr Neurosci. 2017;10:44. DOI: 10.3389/fnint.2016.00044
- dc.identifier.doi http://dx.doi.org/10.3389/fnint.2016.00044
- dc.identifier.issn 1662-5145
- dc.identifier.uri http://hdl.handle.net/10230/33824
- dc.language.iso eng
- dc.publisher Frontiers
- dc.relation.ispartof Front Integr Neurosci. 2017;10:44.
- dc.rights © 2017 Ikumi and Soto-Faraco. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
- dc.rights.accessRights info:eu-repo/semantics/openAccess
- dc.rights.uri https://creativecommons.org/licenses/by/4.0/
- dc.subject.keyword Action
- dc.subject.keyword Attention
- dc.subject.keyword Audio-visual
- dc.subject.keyword Cross-modal
- dc.subject.keyword Motor synchronization
- dc.subject.keyword Multisensory
- dc.subject.keyword Temporal recalibration
- dc.title Grouping and segregation of sensory events by actions in temporal audio-visual recalibration
- dc.type info:eu-repo/semantics/article
- dc.type.version info:eu-repo/semantics/publishedVersion