Exploring 'balance' in full-body contactless audio-visual interactions

B-AUDIO 


Full-body audio-visual interaction in a web browser


Tanay Chowdhury

India


Supervisor: Dr. Stephen Roddy


This project is an exploration of contactless full-body audio-visual interaction and an investigation of how such interaction can be enabled in a web browser. The design of the interface is motivated by cross-modal investigations of ‘stability’ or ‘balance’. Conceptual representations of human posture and music perception are correlated and empirically evaluated to inform a posture-to-sound mapping. A scalable, browser-compatible sonic interface is developed by leveraging machine learning in conjunction with a recent addition to the Max/MSP audio synthesis environment, namely Node for Max.

The interaction is then evaluated for its technical and artistic viability through short études. Design artefacts, functional code, experimental results and observations are documented for further application and reference.


Background

Digital Musical Instruments, sonic interfaces and live installations have explored the idea of contactless, body-based interaction with audio and visuals. At the core of these studies is an attempt to increase accessibility, facilitate intuitive learning and support engagement. This has been pursued largely by reviving the focus on aesthetics and undertaking a participant-sourced design process. With this intent, recent studies have adopted embodied design methods from Human-Computer Interaction (HCI) and electroacoustic music practice.

In addition to these empirical and design-based approaches, the grammar of electroacoustic music analysis has been studied for its strong embodied basis. This research aims to explore the embodied basis of sound through a design-specific evaluation, thereby bridging abstract sonic concepts and embodied experience in the context of cross-modal, interactive audio-visual interfaces.


Methodology

The design of the system adopted a bottom-up ‘conceptual blending’ approach, which deals with how multiple conceptual domains arising from different sensory stimuli can blend to create an emergent experience. The spectromorphology literature was consulted to extract conceptual descriptors for sound, and these were mapped to balance and spatiality schemas relating to the body. Participant-sourced empirical data were collected and analysed to identify prevalent sound-to-body conceptual relationships.

System development was carried out incrementally by integrating machine learning, client-server architecture and audio synthesis in stages. ml5.js and p5.js were used for pose recognition, data processing and visuals. Node for Max, Node.js, Express and socket.io were used for communication, and Max/MSP for audio synthesis.
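The browser side of this pipeline can be illustrated with a minimal sketch. The example below is an assumption-laden reconstruction rather than the project's actual code: it assumes a PoseNet-style ml5.js model, a p5.js canvas for visual feedback, and a local socket.io server on port 3000 (all hypothetical details not specified here).

```javascript
// Browser-side sketch (assumes p5.js, ml5.js and the socket.io client are loaded via <script> tags).
// Captures webcam video, runs pose estimation, and streams selected keypoints to a local server.
let video, poseNet, latestPose;
const socket = io('http://localhost:3000'); // hypothetical server address

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  poseNet = ml5.poseNet(video, () => console.log('PoseNet ready'));
  poseNet.on('pose', (results) => {
    if (results.length > 0) latestPose = results[0].pose;
  });
}

function draw() {
  image(video, 0, 0);
  if (latestPose) {
    // Send a reduced set of keypoints for mapping on the audio side.
    socket.emit('pose', {
      nose: latestPose.nose,
      leftWrist: latestPose.leftWrist,
      rightWrist: latestPose.rightWrist,
      leftAnkle: latestPose.leftAnkle,
      rightAnkle: latestPose.rightAnkle
    });
    // Simple visual feedback: draw the detected keypoints.
    for (const kp of latestPose.keypoints) {
      if (kp.score > 0.3) circle(kp.position.x, kp.position.y, 8);
    }
  }
}
```

In this arrangement the browser handles only recognition and visual feedback; the pose data is streamed onward for sonification.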

Instrument models (PeRColate), the Sound Design Toolkit (SDT) for ecological sound design, and the BEAP modular synthesizer were used for sound design. All mapping strategies implemented the conceptual relations identified in the design phase, and each mapping strategy was then individually evaluated via an étude.
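On the Max side, Node for Max can host the communication layer directly inside the patch. The sketch below is an illustration of that idea, assuming the pose message format of the previous example; the message names ('pose', 'wrists'), the port and the folder layout are hypothetical, not taken from the project.

```javascript
// Node for Max script (run inside a [node.script] object in a Max patch).
// Hosts an Express + socket.io server and forwards incoming pose data to the Max patch.
const maxApi = require('max-api');
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
app.use(express.static('public')); // serve the browser sketch (hypothetical folder)

const server = http.createServer(app);
const io = new Server(server, { cors: { origin: '*' } });

io.on('connection', (socket) => {
  maxApi.post('Browser client connected');
  socket.on('pose', (pose) => {
    // Flatten a pair of keypoints into a Max-friendly list,
    // e.g. "wrists <lx> <ly> <rx> <ry>".
    if (pose.leftWrist && pose.rightWrist) {
      maxApi.outlet('wrists',
        pose.leftWrist.x, pose.leftWrist.y,
        pose.rightWrist.x, pose.rightWrist.y);
    }
  });
});

server.listen(3000, () => maxApi.post('Pose server listening on port 3000'));
```

Lists emitted this way can be routed in the patch (for example with [route wrists]) into whatever PeRColate, SDT or BEAP patching carries the sound design.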


Conclusions

  • Analysis and filtering of data from 32 ‘sound shapes’ related to spatial and balance schemas revealed a set of four prominent metaphors: Order, Continuity, Acceleration and Density.

  • Timbre-based and soundscape-based mappings were both effective in indicating stability. Ecological mappings proved more transparent and intuitive. Continuity emerged as an effective auditory cue for stability (a sketch of such a mapping follows this list). Spectral density variations were found to be more relevant as spatial cues. Acceleration was partially effective in denoting verticality but proved constrained in further application.

  • Although realistic soundscape composition with precise low-level parameter control over individual constituent elements can be challenging, it can result in an immersive and familiar conceptual representation of the system.

  • Compositional flexibility and long-term engagement were both potentially higher when abstract mappings were applied. Abstract mappings were found to facilitate an exploratory approach to interaction. The compositional evaluation of the interface was crucial in identifying new avenues for sound design and analysis.
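As referenced in the second point above, one way a balance-to-continuity mapping could be expressed is sketched below. This is purely illustrative: the balance metric (nose offset from the ankle midpoint) and the continuity parameter are invented for the example and are not the project's documented mapping.

```javascript
// Hypothetical illustration of a balance-to-continuity mapping (not the project's exact code).
// A crude "balance" score derived from pose keypoints drives how continuous the sound remains.

// Horizontal offset of the nose from the midpoint of the ankles, normalised to [0, 1]:
// 0 = centred (stable), 1 = fully displaced (unstable).
function balanceScore(pose, canvasWidth) {
  const ankleMidX = (pose.leftAnkle.x + pose.rightAnkle.x) / 2;
  const offset = Math.abs(pose.nose.x - ankleMidX);
  return Math.min(offset / (canvasWidth / 2), 1);
}

// Map stability to a single continuity parameter (e.g. grain/event density or sustain level)
// that could be sent onward to the audio engine as a float.
function continuityFromBalance(score) {
  return 1 - score; // stable posture -> continuous sound; instability -> fragmentation
}
```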


Prominent references for this project include Spectromorphology: Explaining Sound-Shapes (Denis Smalley), Designing with Blends (Imaz and Benyon) and Embodied Sonification (Stephen Roddy).

Project Gallery
