Automatic characterization and generation of music loops and instrument samples for electronic music production
- dc.contributor.author Ramires, António
- dc.contributor.other Serra, Xavier
- dc.contributor.other Font Corbera, Frederic
- dc.contributor.other Universitat Pompeu Fabra. Departament de Tecnologies de la Informació i les Comunicacions
- dc.date.accessioned 2024-03-16T02:33:19Z
- dc.date.available 2024-03-16T02:33:19Z
- dc.date.issued 2023-02-15T11:11:12Z
- dc.date.issued 2023-02-08
- dc.date.modified 2024-03-15T10:58:06Z
- dc.description.abstract Repurposing audio material to create new music - also known as sampling - was a foundation of electronic music and remains a fundamental component of this practice. Currently, large-scale audio databases offer vast collections of material for users to work with. Navigation in these databases relies heavily on hierarchical tree directories. Consequently, sound retrieval is tiresome and often identified as an undesired interruption in the creative process. We address two fundamental methods for navigating sounds: characterization and generation. Characterizing loops and one-shots in terms of instruments or instrumentation allows unstructured collections to be organized and retrieved faster for music-making. Generating loops and one-shot sounds enables the creation of new sounds not present in an audio collection through interpolation or modification of the existing material. To achieve this, we employ deep-learning-based, data-driven methodologies for classification and generation.
- dc.description.abstract Doctoral programme in Information and Communication Technologies
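As a rough, hypothetical illustration of the characterization side described in the abstract (not the actual models, label set, or data used in the thesis), the sketch below computes a fixed-size log-mel spectrogram for a loop or one-shot with librosa and passes it through a tiny, untrained PyTorch CNN to obtain instrument probabilities. The INSTRUMENTS list, the file path, and the network architecture are all assumptions made for this example.

```python
# Minimal sketch of instrument characterization for loops/one-shots.
# Not the thesis's method: label set, path, and architecture are illustrative.
import librosa
import numpy as np
import torch
import torch.nn as nn

INSTRUMENTS = ["kick", "snare", "hi-hat", "bass", "synth lead"]  # hypothetical label set

def log_mel(path, sr=22050, n_mels=64, frames=128):
    """Load an audio file and return a fixed-size log-mel spectrogram of shape (1, n_mels, frames)."""
    y, _ = librosa.load(path, sr=sr, mono=True)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels)
    mel = librosa.power_to_db(mel, ref=np.max)
    # Pad or crop the time axis so every sample has the same shape.
    if mel.shape[1] < frames:
        mel = np.pad(mel, ((0, 0), (0, frames - mel.shape[1])))
    else:
        mel = mel[:, :frames]
    return torch.tensor(mel, dtype=torch.float32).unsqueeze(0)

class SampleClassifier(nn.Module):
    """Tiny CNN mapping a log-mel spectrogram to instrument logits."""
    def __init__(self, n_classes=len(INSTRUMENTS)):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x):
        return self.net(x)

if __name__ == "__main__":
    model = SampleClassifier().eval()            # untrained; weights would come from labelled samples
    x = log_mel("some_loop.wav").unsqueeze(0)    # hypothetical file -> tensor of shape (1, 1, 64, 128)
    probs = torch.softmax(model(x), dim=-1)[0]
    print({name: float(p) for name, p in zip(INSTRUMENTS, probs)})
```

In practice such a classifier would be trained on labelled one-shots and loops; the sketch only shows the overall pipeline from a fixed-size spectrogram input to per-instrument probabilities.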
- dc.format 182 p.
- dc.format application/pdf
- dc.identifier http://hdl.handle.net/10803/687697
- dc.identifier.uri http://hdl.handle.net/10230/55792
- dc.language.iso eng
- dc.publisher Universitat Pompeu Fabra
- dc.rights Access to the contents of this thesis is subject to acceptance of the terms of use established by the following Creative Commons license: http://creativecommons.org/licenses/by-nc-nd/4.0/
- dc.rights http://creativecommons.org/licenses/by-nc-nd/4.0/
- dc.rights info:eu-repo/semantics/openAccess
- dc.source TDX (Tesis Doctorals en Xarxa)
- dc.subject.keyword Electronic music production
- dc.subject.keyword Instrument classification
- dc.subject.keyword Percussive sound generation
- dc.subject.keyword Music information retrieval
- dc.subject.keyword Deep learning
- dc.subject.keyword Deep generative models
- dc.subject.keyword Producción de música electrónica
- dc.subject.keyword Clasificación de instrumentos
- dc.subject.keyword Generación de sonidos percusivos
- dc.subject.keyword Recuperación de la información musical
- dc.subject.keyword Aprendizaje profundo
- dc.subject.keyword Modelos generativos profundos
- dc.subject.keyword 62
- dc.title Automatic characterization and generation of music loops and instrument samples for electronic music production
- dc.type info:eu-repo/semantics/doctoralThesis
- dc.type info:eu-repo/semantics/publishedVersion