Reflection, Refraction, Reaction:

An audio environment which mirrors the presence and movements of its participants


Raymond Hill

Supervisor: Dr. Stephen Roddy


Reflection, Refraction, Reaction is a sound art piece combining an interactive software system with pre-recorded music clips to generate new arrangements of the recorded music in reaction to the presence and movements of people within a physical space. The title was chosen as a metaphorical description of the intended experience:


Reflection: the system will ‘reflect’ activity within a space by outputting audio mapped to data captured via a camera. In addition, the system may provoke reflection by participants as they consider their influence on the resulting output (Kwastek, 2013).


Refraction: the incoming data will be split into multiple streams, ‘refracting’ it in multiple directions to control various aspects of the audio output. It is envisioned that the participants, in reacting to the experience and the changing environment, will also ‘refract’ these outputs into new data streams, driving the system to change further.


Reaction: the system will continuously react to the presence and movements of the participants. Participants may also react to factors within the environment: the physical space, the audio output of the system, other participants, etc.



BACKGROUND

“A sound art piece, like a visual artwork, has no specified timeline; it can be experienced over a long or short period of time, without missing the beginning, middle or end”. This is in contrast to a view of ‘music’ as having a fixed time duration, either as a live concert presentation or a recorded performance (Licht, 2009). This definition could be expanded to include a fixed arrangement: recorded music is fixed by the nature of the recording process, and, generally speaking, live performances follow an arrangement originally conceived by a composer. This project seeks to create a ‘non-fixed’ piece of music which evolves in relation to the presence and movement of its ‘audience’.

There have been a number of forays into this arena in recent years: for example, Brian Eno’s “77 Million Paintings” features an evolving arrangement of pre-recorded audio and visual elements. Eno has also developed a number of iPhone apps which output audio content in response to the on-screen actions of users. A similar iPhone app, “EóN”, was developed by Jean-Michel Jarre to explore new avenues of producing and delivering recorded music: the app produces a continuous, evolving, never-repeating stream of music from individual components pre-recorded by Jarre.

The work of experimentalist composers also informed the idea of treating the environmental data of a space as a Fluxus-style event score, in the spirit of Yoko Ono’s 1964 book “Grapefruit”, a collection of event scores intended to provoke active engagement and participation from its audience (Ono, 2000).



METHODOLOGY & IMPLEMENTATION

A series of musical cells was composed in Ableton Live to serve as the building blocks of the piece. The aim was to create relatively simple rhythmic and melodic elements which could be combined into a denser, more complex piece when mapped to the presence and movements of people within a physical space. The system was built in p5.js, a web-based version of the Processing coding environment, giving the flexibility to host instances of the installation online as well as in the public spaces for which it was originally intended.
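The sketch below is a minimal illustration of this kind of mapping in p5.js, not the installation itself: it assumes the p5.sound addon and two hypothetical clip files (cell1.mp3, cell2.mp3) standing in for the pool of Ableton-composed cells, and uses simple webcam frame differencing as a stand-in for the camera data.

// A minimal p5.js sketch: motion in the webcam image controls the loudness
// of looping musical cells. Clip names and scaling values are placeholders.
let cam;          // webcam capture
let prevFrame;    // previous frame, kept for simple frame differencing
let cells = [];   // pool of pre-recorded musical cells (p5.sound)

function preload() {
  cells.push(loadSound('cell1.mp3'));
  cells.push(loadSound('cell2.mp3'));
}

function setup() {
  createCanvas(320, 240);
  cam = createCapture(VIDEO);
  cam.size(320, 240);
  cam.hide();
  prevFrame = createImage(320, 240);
  // Start every cell looping silently; motion brings them in and out.
  cells.forEach(c => { c.setLoop(true); c.setVolume(0); c.play(); });
}

function mousePressed() {
  userStartAudio(); // browsers require a user gesture before audio can start
}

function draw() {
  image(cam, 0, 0);

  // Estimate overall activity by summing red-channel differences
  // between the current and previous webcam frames.
  cam.loadPixels();
  prevFrame.loadPixels();
  let motion = 0;
  for (let i = 0; i < cam.pixels.length; i += 4) {
    motion += abs(cam.pixels[i] - prevFrame.pixels[i]);
  }
  prevFrame.copy(cam, 0, 0, cam.width, cam.height, 0, 0, cam.width, cam.height);

  // Map the amount of motion onto the loudness of each looping cell
  // (the scaling constant is arbitrary), so a busier space produces
  // a denser, more complex arrangement.
  const level = constrain(motion / 2e6, 0, 1);
  cells.forEach((c, i) => c.setVolume(level / (i + 1)));
}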



CONCLUSION & FUTURE DEVELOPMENT

Unfortunately, the restrictions caused by the Covid-19 pandemic have limited real-world testing of the current system in public spaces. However, the simplicity of the system leaves plenty of room for future development. For example, further clips could be added to the audio pool, allowing for greater variation within the generated arrangement, and the addition of MIDI capabilities would open up possibilities even further by allowing different timbres, as well as different rhythms and melodies, to be mapped to the conditions of the presentation space.
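As a rough illustration of how MIDI output could be added from the browser, the following snippet uses the standard Web MIDI API; the note number, channel, and timing are placeholders rather than part of the current design.

// Hypothetical illustration only: send a note to the first available MIDI
// output, so that motion data could later choose pitches and timbres on
// external synthesizers.
function sendNote(output, pitch, velocity = 100, durationMs = 500) {
  output.send([0x90, pitch, velocity]);                        // note on, channel 1
  setTimeout(() => output.send([0x80, pitch, 0]), durationMs); // note off
}

navigator.requestMIDIAccess().then(access => {
  const out = access.outputs.values().next().value; // first connected MIDI output
  if (out) sendNote(out, 60);                        // middle C, as a placeholder
});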

The possibility of presenting the project in a public space would also create opportunities for adding expanded levels of visual and physical engagement: the use of lighting and/or visual projections in combination with the audio presentation would allow for a more engaging and immersive experience for participants. The audio system, as currently designed, could easily be combined with a variety of visual presentations: static artworks in a gallery, dance and theatre performances, film screenings, etc.
