MoMusic: A Motion-Driven Human-AI Collaborative Music Composition and Performing System

Weizhen Bian, Yijin Song, Nianzhen Gu, Tin Yan Chan, Tsz To Lo, Tsun Sun Li, King Chak Wong, Wei Xue*, Roberto Alonso Trillo*

*Corresponding author for this work

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

5 Citations (Scopus)

Abstract

The significant development of artificial neural network architectures has facilitated the increasing adoption of automated music composition models over the past few years. However, most existing systems rely on algorithmic generative structures built from hard-coded, predefined rules, generally excluding interactive or improvised behaviors. We propose MoMusic, a motion-based, real-time AI music generation system. MoMusic features a partially randomized harmonic sequencing model based on a probabilistic analysis of tonal chord progressions, mathematically abstracted through musical set theory. This model is mapped onto a two-dimensional grid that produces sound through a posture recognition mechanism: a camera captures the movements and trajectories of the user's fingers, creating coherent, partially improvised harmonic progressions. MoMusic integrates several timbral registers, from traditional classical instruments such as the piano to a new "human voice instrument" created using a voice conversion technique. Our research demonstrates MoMusic's interactivity, its ability to inspire musicians, and its capacity to generate coherent musical material across various timbral registers. MoMusic's capabilities could easily be expanded to incorporate different forms of posture-controlled timbral transformation, rhythmic transformation, dynamic transformation, or even digital sound processing techniques.
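
The abstract describes two core mechanisms: a probabilistic, partially randomized chord-progression model abstracted through musical set theory, and a two-dimensional grid that maps tracked fingertip positions to sound. The Python sketch below is a minimal illustration of how such a pipeline could be wired together; the chord vocabulary, transition probabilities, grid size, and position-dependent biasing are illustrative assumptions, not the implementation reported in the paper.

# Hypothetical sketch (not the authors' code): a partially randomized chord
# sequencer driven by a probabilistic transition table over tonal functions,
# with a 2-D grid cell (e.g. from a tracked fingertip position) biasing the choice.
import random

# Chords as pitch-class sets (musical set theory abstraction), C major context.
CHORDS = {
    "I":  frozenset({0, 4, 7}),    # C E G
    "ii": frozenset({2, 5, 9}),    # D F A
    "IV": frozenset({5, 9, 0}),    # F A C
    "V":  frozenset({7, 11, 2}),   # G B D
    "vi": frozenset({9, 0, 4}),    # A C E
}

# Illustrative first-order transition probabilities between tonal functions.
TRANSITIONS = {
    "I":  {"IV": 0.3, "V": 0.3, "vi": 0.2, "ii": 0.2},
    "ii": {"V": 0.6, "IV": 0.2, "vi": 0.2},
    "IV": {"V": 0.5, "I": 0.3, "ii": 0.2},
    "V":  {"I": 0.6, "vi": 0.3, "IV": 0.1},
    "vi": {"ii": 0.4, "IV": 0.4, "V": 0.2},
}

def grid_cell(x: float, y: float, cols: int = 4, rows: int = 4) -> tuple[int, int]:
    """Quantize a normalized fingertip position (0..1, 0..1) onto a grid."""
    return min(int(x * cols), cols - 1), min(int(y * rows), rows - 1)

def next_chord(current: str, cell: tuple[int, int]) -> str:
    """Sample the next chord; the grid column nudges the distribution toward
    dominant-leaning (right side) or subdominant-leaning (left side) moves."""
    probs = dict(TRANSITIONS[current])
    bias = {"V": 1.0 + 0.5 * cell[0], "IV": 1.0 + 0.5 * (3 - cell[0])}
    weighted = {c: p * bias.get(c, 1.0) for c, p in probs.items()}
    total = sum(weighted.values())
    return random.choices(list(weighted), [w / total for w in weighted.values()])[0]

# Example: a short progression driven by three hypothetical fingertip positions.
chord = "I"
for x, y in [(0.1, 0.5), (0.6, 0.2), (0.9, 0.8)]:
    chord = next_chord(chord, grid_cell(x, y))
    print(chord, sorted(CHORDS[chord]))
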
Original language: English
Title of host publication: Proceedings of the 37th AAAI Conference on Artificial Intelligence
Editors: Brian Williams, Yiling Chen, Jennifer Neville
Place of Publication: Washington, DC
Publisher: AAAI Press
Chapter: 37
Pages: 16057-16062
Number of pages: 6
ISBN (Electronic): 9781577358800
Publication status: Published - 6 Sept 2023
Event: 37th AAAI Conference on Artificial Intelligence, AAAI 2023 - Washington, United States
Duration: 7 Feb 2023 - 14 Feb 2023
https://ojs.aaai.org/index.php/AAAI/issue/view/553
https://aaai-23.aaai.org/

Publication series

Name: Proceedings of the AAAI Conference on Artificial Intelligence
Publisher: AAAI Press
Number: 13
Volume: 37
ISSN (Print): 2159-5399
ISSN (Electronic): 2374-3468

Conference

Conference: 37th AAAI Conference on Artificial Intelligence, AAAI 2023
Country/Territory: United States
City: Washington
Period: 7/02/23 - 14/02/23

Scopus Subject Areas

  • Music
  • Artificial Intelligence
  • Human-Computer Interaction

