In this paper, we present JamSketch, a real-time improvisation support system that automatically generates melodies from melodic outlines drawn by the user. The system generates improvised melodies based on (1) an outline sketched by the user with a mouse or a touch screen, (2) a genetic algorithm that draws on a dataset of existing music pieces as well as musical knowledge, and (3) an expressive performance model for timing and dynamics transformations. The aim of the system is to enable people with no prior musical knowledge to enjoy playing music by improvising melodies in real time.