Brain songs framework used for discovering the relevant timescale of the human brain
- dc.contributor.author Deco, Gustavo
- dc.contributor.author Cruzat Grand, Josefina, 1983-
- dc.contributor.author Kringelbach, Morten L.
- dc.date.accessioned 2019-02-18T11:09:37Z
- dc.date.available 2019-02-18T11:09:37Z
- dc.date.issued 2019
- dc.description.abstract A key unresolved problem in neuroscience is to determine the relevant timescale for understanding spatiotemporal dynamics across the whole brain. While resting-state fMRI reveals networks at an ultraslow timescale (below 0.1 Hz), other neuroimaging modalities such as MEG and EEG suggest that much faster timescales may be equally or more relevant for discovering spatiotemporal structure. Here, we introduce a novel way to generate whole-brain neural dynamical activity at the millisecond scale from fMRI signals. This method allows us to study the different timescales through binning the output of the model. These timescales can then be investigated using a method (poetically named brain songs) to extract the space-time motifs at a given timescale. Using independent measures of entropy and hierarchy to characterize the richness of the dynamical repertoire, we show that both methods find a similar optimum at a timescale of around 200 ms in resting-state and in task data.
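The abstract describes binning fast activity at a candidate timescale and scoring the richness of the resulting pattern repertoire with entropy. The sketch below is an illustrative toy of that general idea only, not the authors' implementation: the surrogate signal, the mean-threshold binarization, and the function name `repertoire_entropy` are assumptions introduced for illustration.

```python
# Illustrative sketch (assumed, not the paper's code): entropy of the repertoire
# of binarized whole-brain patterns obtained after binning a fast signal at a
# given timescale. Signal, binarization rule, and sizes are toy assumptions.
import numpy as np

def repertoire_entropy(signal, bin_size):
    """Shannon entropy (bits) of the distribution of binned, binarized patterns.

    signal   : (n_timepoints, n_regions) array, assumed sampled at 1 ms resolution
    bin_size : number of samples (ms) averaged into one time bin
    """
    n_t, n_regions = signal.shape
    n_bins = n_t // bin_size
    # Average the fast signal within each bin to obtain activity at this timescale.
    binned = signal[: n_bins * bin_size].reshape(n_bins, bin_size, n_regions).mean(axis=1)
    # Binarize each region against its own mean to define discrete patterns.
    patterns = (binned > binned.mean(axis=0)).astype(np.uint8)
    # Count how often each distinct whole-brain pattern occurs.
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy surrogate for millisecond-scale activity: 60 s at 1 kHz, 20 regions.
    toy = rng.standard_normal((60_000, 20))
    for ms in (10, 50, 200, 1000):
        print(f"bin = {ms:4d} ms  entropy = {repertoire_entropy(toy, ms):.2f} bits")
```

On real model output, sweeping `bin_size` and locating where such repertoire measures peak would mirror the kind of timescale comparison the abstract describes; on white-noise surrogates as above, the values carry no neuroscientific meaning.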
- dc.description.sponsorship G.D. is supported by the Spanish Research Project PSI2016-75688-P (AEI/FEDER, EU), by the European Union’s Horizon 2020 Research and Innovation Programme under Grant Agreement Nos. 720270 (HBP SGA1) and 785907 (HBP SGA2), and by the Catalan AGAUR Programme 2017 SGR 1545. M.L.K. is supported by the ERC Consolidator Grant: CAREGIVING (No. 615539), and Center for Music in the Brain, funded by the Danish National Research Foundation (DNRF117).
- dc.format.mimetype application/pdf
- dc.identifier.citation Deco G, Cruzat J, Kringelbach ML. Brain songs framework used for discovering the relevant timescale of the human brain. Nat Commun. 2019 Feb 4;10:583-94. DOI: 10.1038/s41467-018-08186-7
- dc.identifier.doi http://dx.doi.org/10.1038/s41467-018-08186-7
- dc.identifier.issn 2041-1723
- dc.identifier.uri http://hdl.handle.net/10230/36610
- dc.language.iso eng
- dc.publisher Nature Research
- dc.relation.ispartof Nature Communications. 2019 Feb 4;10:583-94.
- dc.relation.projectID info:eu-repo/grantAgreement/ES/1PE/PSI2016-75688-P
- dc.relation.projectID info:eu-repo/grantAgreement/EC/H2020/720270
- dc.rights © Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/. © The Author(s) 2019
- dc.rights.accessRights info:eu-repo/semantics/openAccess
- dc.title Brain songs framework used for discovering the relevant timescale of the human brain
- dc.type info:eu-repo/semantics/article
- dc.type.version info:eu-repo/semantics/publishedVersion