Designing embodied human-computer interactions in music performance

A Thesis presented for the degree of Doctor of Philosophy at Royal Birmingham Conservatoire, Faculty of Art Design and Media, Birmingham City University, United Kingdom. February, 2020.

  • Thesis in PDF format
  • Citation:
    • Di Donato, B. (2020) Designing embodied human-computer interactions in music performance. PhD Thesis. Royal Birmingham Conservatoire (RBC), Birmingham City University (BCU). Birmingham, United Kingdom.
    • Bibtex file


Interfaces for musical expression are widely used for controlling and transforming sound in live performance. They aim to facilitate interaction with a computer and to give the performer more expressive control over the sound. However, the actions required to operate them can interfere with the musical performance, whether by conflicting with instrumental technique, with choreographic aspects, or with the physical characteristics of the instrument being played.

To address this issue, various modes of interaction and devices have been designed and used in conjunction with interactive audio and visual software to control and transform audiovisual media.
In particular, gesture-sensing technologies have been successfully applied in a range of musical contexts. However, they in turn raise questions such as: how can musicians most effectively control and transform auditory, visual and lighting effects through gesture during a live performance?
What interaction design considerations allow performers to interact simultaneously with an instrument and with audio-visual-lighting processing? How can the disruption that embodied human-computer interactions introduce into a live performance be reduced?

The work presented in this thesis investigates modes of interaction with sound, visual projection and lighting effects during a musical performance that feel natural and embodied, and that do not depend on a particular musical instrument, its sound or its instrumental technique. To this end, following a User-Centred Design method, I developed MyoSpat, grounded in Music and Human-Computer Interaction principles.
MyoSpat is an interactive system that embeds Inertial Measurement Unit (IMU) and Electromyography (EMG) technology for gesturally controlling audio and lighting processes during a musical performance. As part of this research, I also created Myo Mapper, an application that maps data from the Thalmic Labs Myo armband to Open Sound Control (OSC) messages.
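To illustrate the kind of mapping involved (this is an illustrative sketch, not the actual Myo Mapper implementation), the example below normalises IMU orientation angles into a 0-1 control range and pairs each with an OSC-style address. The address names and input ranges are assumptions for the sake of the example:

```python
import math

def scale(value, in_min, in_max, out_min=0.0, out_max=1.0):
    """Linearly map a sensor reading into a normalised control range."""
    t = (value - in_min) / (in_max - in_min)
    # Clamp so sensor noise outside the expected range cannot overshoot
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

def imu_to_osc(yaw, pitch, roll):
    """Pack Euler angles (radians, -pi..pi) into OSC-style (address, value) pairs."""
    return [
        ("/myo/orientation/yaw",   scale(yaw,   -math.pi, math.pi)),
        ("/myo/orientation/pitch", scale(pitch, -math.pi, math.pi)),
        ("/myo/orientation/roll",  scale(roll,  -math.pi, math.pi)),
    ]
```

In practice each (address, value) pair would be sent over the network as an OSC message to audio or lighting software; the clamping step reflects a common design choice when raw sensor data can exceed its nominal range.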

Outcomes of this research are presented in this thesis and through a portfolio of performances realised in collaboration with musicians.

Media appendix