Music proofreading with RefinPaint: where and how to modify compositions given context
Citation
- Ramoneda P, Rocamora M, Akama T. Music proofreading with RefinPaint: where and how to modify compositions given context. In: Kaneshiro B, Mysore G, Nieto O, Donahue C, Huang CZA, Lee JH, McFee B, McCallum M, editors. Proceedings of the 25th International Society for Music Information Retrieval Conference (ISMIR2024); 2024 November 10-14; San Francisco, USA. p. 204-7. DOI: https://www.doi.org/10.5281/zenodo.14877318
Permanent link
Description
Abstract
Autoregressive generative transformers are key in music generation, producing coherent compositions but facing challenges in human-machine collaboration. We propose RefinPaint, an iterative technique that improves the sampling process. It does this by identifying the weaker music elements using a feedback model, which then informs the choices for resampling by an inpainting model. This dual-focus methodology not only enables the machine to improve its automatic inpainting generation through repeated cycles, but also offers a valuable tool for humans seeking to refine their compositions with automatic proofreading. Experimental results suggest RefinPaint’s effectiveness in inpainting and proofreading tasks, demonstrating its value for refining music created by both machines and humans. This approach not only facilitates creativity but also aids amateur composers in improving their work.
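To make the iterative feedback-and-resampling idea concrete, here is a minimal sketch of the loop described in the abstract. It is an illustration under stated assumptions, not the authors' implementation: the model interfaces (`feedback_model.score`, `inpainting_model.resample`) and the parameters `num_iterations` and `resample_fraction` are hypothetical placeholders.

```python
from typing import List


def refine(tokens: List[int],
           feedback_model,              # assumed: scores each token's quality in context
           inpainting_model,            # assumed: resamples masked positions given context
           num_iterations: int = 5,     # hypothetical number of refinement cycles
           resample_fraction: float = 0.2) -> List[int]:
    """Iteratively locate the weakest music tokens and resample them."""
    for _ in range(num_iterations):
        # 1. The feedback model rates every token; lower score = weaker element.
        scores = feedback_model.score(tokens)  # assumed method
        # 2. Select the weakest fraction of positions to revise this cycle.
        k = max(1, int(len(tokens) * resample_fraction))
        weakest = sorted(range(len(tokens)), key=lambda i: scores[i])[:k]
        # 3. The inpainting model resamples only the selected positions,
        #    conditioned on the surrounding, untouched context.
        tokens = inpainting_model.resample(tokens, positions=weakest)  # assumed method
    return tokens
```

In a proofreading setting, the same loop could expose `weakest` to a human composer, who decides which of the flagged positions to actually resample.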