MyoSpat is an interactive system that allows musicians to manipulate timbral and spatial properties of sound, as well as light projections, through hand gestures. The system aims to facilitate the creative use of audio processing during live musical performance through easily learnable hand gestures. MyoSpat is built from the Myo armband; Myo Mapper; machine learning models designed in Wekinator and implemented with ml.lib; and an audio-visual engine developed in Pure Data.
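The pipeline above can be illustrated with a minimal sketch: armband features come in (in the real system, sent by Myo Mapper over OSC), a classifier assigns a gesture class (handled by the Wekinator/ml.lib models in the real system), and the class is mapped to audio-engine parameters (handled by Pure Data in the real system). The function names, feature names, thresholds, and parameter mappings below are all hypothetical stand-ins, not MyoSpat's actual implementation.

```python
# Hypothetical sketch of the MyoSpat data flow; all names and values invented.

def classify_gesture(emg_energy, roll):
    """Toy rule-based stand-in for the trained gesture model."""
    if emg_energy > 0.7:
        return "grab"    # e.g. hold/freeze the current sound
    if roll > 0.5:
        return "rotate"  # e.g. move the sound around the listener
    return "rest"

def gesture_to_params(gesture):
    """Map a gesture class to (invented) audio-engine parameters."""
    mapping = {
        "grab":   {"freeze": 1, "spat_angle": 0.0},
        "rotate": {"freeze": 0, "spat_angle": 180.0},
        "rest":   {"freeze": 0, "spat_angle": 0.0},
    }
    return mapping[gesture]

# Parameters that would then be sent on to the audio-visual engine:
params = gesture_to_params(classify_gesture(0.9, 0.1))
print(params)
```

In the actual system this classification is learned from examples rather than hand-coded, which is what makes the gesture vocabulary easy to retrain for different performers.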
Beyond its use in performances of The Wood and The Water, MyoSpat has been used to deliver the HarpCI workshop at the University of Southampton and the MiXD workshop at Berklee College of Music, Valencia.
Di Donato, B., Dooley, J., Hockman, J., Bullock, J., and Hall, S. (2017) MyoSpat: a hand-gesture controlled system for sound and light projections manipulation. International Computer Music Conference (ICMC), Shanghai, China. (PDF)
Di Donato, B., and Dooley, J. (2017) MyoSpat: a system for manipulating sound and light projections through hand gestures. 3rd Workshop on Intelligent Music Production (WIMP), MediaCityUK, University of Salford, Manchester, UK. (PDF)
Di Donato, B., and Bullock, J. (2015) gSpat: Live sound spatialisation using gestural control. ICAD 2015 Student Think Tank, University of Technology, Graz, Austria. DOI: 10.13140/RG.2.1.2089.7128