Human-Sound Interaction (HSI)

The design of interactions with sound and audio processes is a central activity in creating a performance, an installation, a virtual sound environment, or an interface for musical expression. Too often, the interface fixes the interaction design without considering human factors and our diverse abilities to perceive sound and interface affordances. The Human-Sound Interaction (HSI) project investigates how people move and interact with sound, and with interfaces to control sound, taking into account their diverse abilities to perceive it.

With the term Human-Sound Interaction (HSI) we identify interactions between humans and sound (through an interface) that are direct, engaging, natural, and embodied [2]:

  • direct, as the impression or feeling about an interface capable of being described in terms of concrete actions [5];
  • engaging, as fostering the “feeling of directly manipulating the objects of interest”, where “the world of interest is explicitly represented and there is no intermediary between user and world” [5];
  • natural, “as being marked by spontaneity” [4];
  • embodied, as an extension and incorporation of human skills and abilities within the interaction design of a system [3].

Project activity

Phase 0: Exploratory activity

In the first phase of this project we explored HSI through the prototyping of an interface named Soundsculpt, based on these principles. This work is fully described in a conference paper published at the 7th International Conference on Movement and Computing (MOCO ’20) [2]. Read the article at this link. The presentation of this paper can be viewed in the video below.

Phase 1: Interaction design activities

The design of such interactions is now being explored in a series of workshops conducted at different venues. Please see the list of workshops and events below.

Phase 2: Data collection

Together with the series of workshops, we are now building the HSI-data repository [1]. The data include motion and sound features of music translated into sign language. This work is being conducted to model gesture-sound relationships in sign-language-translated music and, subsequently, to inform the design of HSIs that consider the abilities of aurally diverse musicians and audiences to perceive sound. A sketch of how such features might be aligned and compared is shown below the figure.

Example of audio and motion feature data visualisation.
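
As a rough illustration of how gesture-sound relationships might be modelled from such data, the Python sketch below aligns a motion feature with an audio feature and estimates their correlation and relative lag. The file names and column names (motion_features.csv, audio_features.csv, hand_speed, rms) are hypothetical placeholders, not the actual HSI-data schema.

    # Minimal sketch: aligning motion and audio feature streams and
    # estimating their relationship. File and column names are
    # illustrative assumptions, not the actual HSI-data schema.
    import numpy as np
    import pandas as pd

    # Load time-stamped feature streams (one row per analysis frame).
    motion = pd.read_csv("motion_features.csv")  # columns: time, hand_speed
    audio = pd.read_csv("audio_features.csv")    # columns: time, rms

    # Resample both streams onto a common 50 Hz time base so frames
    # can be compared one-to-one.
    step = 0.02
    t_end = min(motion["time"].iloc[-1], audio["time"].iloc[-1])
    t = np.arange(0.0, t_end, step)
    hand_speed = np.interp(t, motion["time"], motion["hand_speed"])
    rms = np.interp(t, audio["time"], audio["rms"])

    # Pearson correlation as a first, crude indicator of how strongly
    # the motion feature tracks the audio feature.
    r = np.corrcoef(hand_speed, rms)[0, 1]
    print(f"correlation between hand speed and loudness: {r:.2f}")

    # Cross-correlation of the mean-centred signals to estimate the lag
    # (in seconds) at which gesture and sound align best.
    xcorr = np.correlate(hand_speed - hand_speed.mean(),
                         rms - rms.mean(), mode="full")
    lags = np.arange(-len(t) + 1, len(t)) * step
    print(f"estimated gesture-sound lag: {lags[np.argmax(xcorr)]:.2f} s")

In practice, richer temporal models would replace this crude correlation, but the same load-align-compare pipeline applies.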

Phase ...: ...

Stay tuned! Please get in touch if you are interested in collaborating. :)

Workshops, talks and events

  • 12 May 2021. Food & Paper - RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo (Norway). Registration - Slides: Web, PDF
  • 2 October 2020. ArteScienza 2020. Online - Centro Ricerche Musicali, Rome, Italy. Registration and Programme
  • 14 September 2020. Audio Mostly 2020. Online - IEM / University of Music and Performing Arts, Graz, Austria. Registration - Programme
  • 21 July 2020. International Conference on New Interfaces for Musical Expression (NIME20). Online - Royal Birmingham Conservatoire, Birmingham City University, Birmingham, United Kingdom. Registration - Programme

Contributors

Balandino Di Donato, Creative Computing Research Group (CCRG), University of Leicester
Christopher Dewey, Department of Computer Science, University of Huddersfield
Tychonas Michailidis, DMT Lab, Birmingham City University
Alessio Gabriele, CRM - Centro Ricerche Musicali of Rome

Related publications

[1] Balandino Di Donato. 2021. HSI-data. GitHub. URL: https://github.com/balandinodidonato/HSI-data
[2] Balandino Di Donato, Christopher Dewey, and Tychonas Michailidis. 2020. Human-Sound Interaction: Towards a Human-Centred Sonic Interaction Design approach. In 7th International Conference on Movement and Computing (MOCO ’20), July 15–17, 2020, Jersey City/Virtual, NJ, USA. ACM, New York, NY, USA, 4 pages. DOI: https://doi.org/10.1145/3401956.3404233
[3] Paul Dourish. 2001. Where the Action Is: The Foundations of Embodied Interaction. MIT Press.
[4] Sukeshini A. Grandhi, Gina Joue, and Irene Mittelberg. 2011. Understanding naturalness and intuitiveness in gesture production. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’11), Vancouver, BC, Canada, 821–824.
[5] Donald A. Norman and Stephen W. Draper. 1986. User Centered System Design: New Perspectives on Human-Computer Interaction. Lawrence Erlbaum Associates, Hillsdale, NJ.