RAVI (responsive audio-visual improvisation) is the collaboration of John Ferguson and Chris Stover. RAVI is a new initiative/work-in-progress that situates live improvisation between interactive multimedia technology and trombone within a responsive audio-visual environment created in TouchDesigner. Utilising sound and sensor data to generate real-time graphics, we seek to develop a system that provides sonic and visual catalysts ranging from the gentle steering of musical improvisation to more autonomous and potentially disruptive behaviour.
The project has two primary goals:
- to develop new creative works for public presentation and practice-based research outputs
- to create professional-quality artefacts to use in reflexive data analysis, research publication, and promotion of creative work.
It’s clear that ‘live visuals’ are increasingly commonplace in contemporary performance scenarios and the role of the musical score is evolving. However, with a few notable exceptions (Louise Harris, Myriam Bleau, Vicki Bennett, Nonotak Studio) there remains little artistic research or scholarly discourse regarding the situation of improvising musicians in responsive audio-visual environments; this is a gap that RAVI seeks to investigate.
Our primary research questions are:
- To what extent can improvised performance with interactive multimedia technology and trombone be successfully situated in a responsive audio-visual environment?
- What creative opportunities emerge when ‘musical improvisation’ and ‘live visuals’ combine in an environment where the line between cause and effect is blurred?
- In what ways does the very physicality of the trombone afford interactive stimuli for digital audiovisual resources?
The highlights video above was assembled from process-development footage for presentation at the IDEAS Symposium on Friday 25th November 2022 at the Creative Arts Research Institute (CARI), Griffith University.
- Tangible Media have been engaged to assist with AV documentation in early 2023;
- RAVI will perform at the Welcome to CMT concert @ QCGU on the evening of Monday 6th March 2023.
Process documentation 21st October 22
Process documentation August 22
The concept demonstration video above is a solo improvisation using amplitude tracking on my electronic sound world and on what I imagine would have been Chris’s trombone performance, captured via a microphone. The goal of this setup is to ensure that both of our sonic gestures impact the moving image. However, in this proof of concept I was basically hitting buttons with one hand while growling into a mic with the other, though only the direct audio is recorded (not the mic feed, thankfully). A couple of different AV ideas were explored:
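In TouchDesigner this amplitude tracking would normally be built from CHOPs, but the underlying idea can be sketched in plain Python as a simple one-pole envelope follower; the `attack` and `release` coefficients here are hypothetical values, not the ones used in the actual patch:

```python
def envelope(samples, attack=0.5, release=0.05):
    """One-pole envelope follower: tracks the amplitude of an audio
    signal, rising quickly (attack) when the level jumps and falling
    slowly (release) when it drops. Returns per-sample envelope values."""
    env = 0.0
    out = []
    for s in samples:
        level = abs(s)
        # Use the faster coefficient when the signal exceeds the envelope
        coeff = attack if level > env else release
        env += coeff * (level - env)
        out.append(env)
    return out
```

The asymmetric attack/release is what makes the control signal feel musical: the visuals react immediately to an onset but relax gradually after a note ends.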
2D particles are birthed at a high rate (100 per second) while my sound manipulates wind-like forces that create motion. However, whenever the microphone detects sufficient sound level, the birth rate is dramatically reduced, down to zero: if Chris is playing loudly or close to the microphone, no particles are birthed. As soon as Chris stops playing, they come swooping back in. There’s a sense here that the musician is being ‘mobbed’ by the visuals, but sustained notes also result in gentler interaction (which is at least as interesting as the reversed causal agenda).
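The amplitude-to-birth-rate mapping could be sketched as follows; this is a minimal plain-Python illustration, and the threshold and the linear roll-off are assumptions rather than the exact curve used in the patch:

```python
def birth_rate(mic_amplitude, base_rate=100.0, threshold=0.2):
    """Map a normalised microphone amplitude (0.0-1.0) to a particle
    birth rate. Below the threshold the full base rate applies; above
    it the rate falls linearly to zero, so loud/close playing
    suppresses particle births entirely."""
    if mic_amplitude <= threshold:
        return base_rate
    # Scale the excess amplitude (threshold..1.0) down to a 1.0..0.0 factor
    excess = min(mic_amplitude, 1.0) - threshold
    return base_rate * max(0.0, 1.0 - excess / (1.0 - threshold))
```

Because the envelope decays after a note ends, the rate climbs back toward the base value and the particles ‘swoop back in’ on silence.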
Applying noise data to a grid: one of us controls the shape and size of the grid (me in this case), and the other person (also me, on a microphone, today) impacts parameters of the applied noise. So when I’m not playing, the grid sometimes shrinks to a line (it could be fun to start with what appears 2D and then see this develop over time into 3D).
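A rough sketch of this division of control, again in plain Python rather than TouchDesigner operators: one performer’s signal drives the grid dimensions, the other’s drives the noise amplitude and frequency. The sin-based field below is a stand-in for a proper noise function and the parameter names are hypothetical:

```python
import math

def displaced_grid(rows, cols, noise_amp, noise_freq, phase=0.0):
    """Build a rows x cols grid of 3D points and displace each point's
    z coordinate by a cheap sin-based pseudo-noise field. rows == 1
    collapses the grid to a line; noise_amp == 0 leaves it flat."""
    points = []
    for r in range(rows):
        for c in range(cols):
            # Arbitrary constants decorrelate neighbouring points
            z = noise_amp * math.sin(noise_freq * (r * 12.9898 + c * 78.233) + phase)
            points.append((c, r, z))
    return points
```

Animating `phase` over time (e.g. per frame) keeps the surface in constant motion even when neither performer is playing.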
In the video above I’ve cut the first of the two recorded ideas in half and arranged it in ABA form, just to get into the habit of documenting things.
This is a work in progress and early stages proof of concept!
John Ferguson and Chris Stover have extensive backgrounds across a wide variety of improvisational music practices, but this project marks their first joint collaborative endeavour. The crux of this “New Initiative” is the formation of the RAVI duo project. In RAVI, technology and trombone work together within a responsive multimedia environment built in TouchDesigner. This work addresses a number of important questions about the nature of trans-media interaction, in particular what kinds of creative possibilities are afforded by an interactive medium that fundamentally blurs the distinction between cause and effect. One way this blurring takes place is the transformation of the trombone into an explicitly visual instrument, drawing upon its highly physical, motional nature via sensors that inscribe movement data across sonic and visual modalities.
A useful starting point is visual music. Oskar Fischinger’s (1938) An Optical Poem is an early example that William Moritz (1986) has used to suggest that ‘[s]ince ancient times, artists have longed to create with moving lights a music for the eye comparable to the effects of sound for the ear.’ However, while many early exemplars work with fixed musical and visual resources, RAVI focuses on improvised live performances in interactive environments. The real genesis might therefore be a work like John Cage’s Variations V (1965), which was created in collaboration with the Merce Cunningham Dance Company and made use of shadows cast on walls to trigger sound via light sensors. This resonates with what Simon Waters (2013) has termed ‘Touching at a Distance’ and might also be considered via John Whitney’s (1994) notion of ‘audio-visual complementarity’. These are all useful historical ideas to help develop this project. More recently, Cat Hope and Lindsay Vickery (2010) have furthered discussion of ‘The Aesthetics of the Screen-Score’ and it is clear their Decibel ensemble is a leader in this field. Similarly, Louise Harris (2021) has examined the nature of audiovisual experience with a particular emphasis on media hierarchy; our work builds upon her 2016 article ‘Audiovisual Coherence and Physical Presence’, which emphasises the relationship between audio-visual media and physical (human) presence.
This project will adopt an action-research methodology and adhere to ‘practice as research’ methods as outlined by Henk Borgdorff in The Debate on Research in the Arts (2006).
Borgdorff, H. (2006). The Debate on Research in the Arts. Focus on Artistic Research and Development, no. 02. Bergen: Bergen National Academy of the Arts.
Harris, L. (2016). Audiovisual Coherence and Physical Presence: I am there, therefore I am [?]. eContact! 18(2). Canadian Electroacoustic Community (CEC). Retrieved March 24, 2022, from https://econtact.ca/18_2/harris_audiovisualcoherence.html
Harris, L. (2021). Composing Audiovisually: Perspectives on audiovisual practices and relationships. Routledge.
Hope, C. and Vickery, L. (2010). The Aesthetics of the Screen-Score. Retrieved March 24 2022 from https://www.lindsayvickery.com/uploads/1/7/0/8/17081762/2010hopevicaestheticsscreenscore.pdf
Moritz, W. (1986). Towards an Aesthetics of Visual Music. Retrieved August 20, 2019, from http://www.centerforvisualmusic.org/TAVM.htm
Waters, S. (2013). Touching at a Distance: Resistance, tactility, proxemics and the development of a hybrid virtual/physical performance system. Contemporary Music Review, 32(2–3), ‘Resistant Materials in Musical Creativity’, 119–134.
Whitney, J. (1994). To Paint on Water: The Audiovisual Duet of Complementarity. Computer Music Journal, 18(3), 45–52. doi:10.2307/3681184