Ben Carey: “Exploring, Navigating, Responding, Listening”

BY SAM GILLIES

 

Sydney-based performer and improviser Ben Carey has made a name for himself through his distinctive musical language and unique approach to the use of electronics within an improvised performance practice. He has just released his debut album _derivations: Human-Machine Improvisations, which seeks to bridge the gap between the performer and the computer in live improvisation. Sam Gillies spoke to Ben about his music and the ideas that underpin it.

How would you characterise your music? 

The work I do mostly revolves around using the computer as an interactive partner in electro-acoustic performance. I build tools and performance systems that rely on improvisation for their realisation, but I have also written fixed-media electro-acoustic works and I improvise acoustically as well. Improvisation plays a large part in my work. First and foremost though, I’m a performer, so my music draws upon a sense of physicality and instrumental gesture.

What do you consider the relationship between composition and improvisation to be? 

I think it’s entirely personal but also multi-faceted. I like Richard Barrett’s characterisation of improvisation as a form of composition, not separate to composition but as a way of structuring music in performance. To my mind, composition is a process that defines the organisation of sounds outside of, and prior to, their performance, whilst improvisation is a way of organising through performance. In this way, improvisation is a way of exploring, navigating, responding, listening, and asking questions about music all at once.

You spent some time in France, studying at the Conservatoire de Bordeaux after finishing your Bachelor of Music in 2005. How did these studies help to lead you to the musical space you’re exploring now?

They had a profound effect on me. I was hot-housed in a pretty intense environment, focusing upon contemporary music for two years and doing little else but practicing and learning. I was studying saxophone and contemporary music with Marie-Bernadette Charrier, saxophonist and director of the ensemble Proxima Centauri. She was a taskmaster – very demanding and very inspiring. Although I was studying saxophone performance, I also maintained an interest in improvisation and electronic music. I got the chance to soak up the electro-acoustic music world France is known for, and began programming my own software tools in PureData and MaxMSP during the evenings. It was during this time I started to become excited about being able to affect electronics and to be affected by them – an excitement that’s stuck with me.

Has improvisation always been an important part of your musical practice? 

Improvisation was not initially a part of my relationship to the saxophone. However, I became fascinated with free improvisation after the end of university. Before Bordeaux, I’d begun improvising with electronics using quite simple setups. At first, I worked with feedback, using the instrument as a resonance chamber to coax different electronic tones from the speakers (the saxophonist John Butcher was a real inspiration in this area). That was wild, as it made me think about performing and controlling a sound source in a new and different way. I then became interested in sampling my sound and manipulating it live. This was a pretty fraught process for me, though, as it relegated my sax playing to something secondary. So, interactive music was a way to distance myself as an instrumentalist from control over the machine. I wanted to find a way to interact fully with electronic sound without having to be in control of it. I wanted it to surprise me, so this is how I began looking into ways of programming these kinds of scenarios for myself. This became the basis for _derivations.

Tell us a little bit about the _derivations project and what you were hoping to accomplish during this period of development.

_derivations is the product of a search for a way to interact with the computer as freely and intuitively as one would with an instrumental improviser. I’d been searching for a way to interact with electronics intuitively, to bounce off surprising ideas the computer might throw my way without having to explicitly trigger or control its output. At the same time, I was very interested in trying to coherently integrate the sound of the musician with the computer’s output – tying it to the timbral context given by the player and creating an organic electro-acoustic environment. _derivations is a software system built using a program called MaxMSP. It’s designed to listen to, record, and analyse the sound of a performer so it can act as a quasi-autonomous performance partner. Everything the performer plays is analysed using the kind of algorithms that you would find in speech recognition technology. It’s listening to aspects of the performer’s sound, and not to rhythmic, melodic or harmonic content. In this way, _derivations is built for a timbral mode of playing, and seeks to find connections between the improviser’s performance and a growing database of information relating to their sound. _derivations makes use of this information for its own contribution, processing and manipulating recordings in real-time to create an electro-acoustic environment the performer can then play off and respond to.
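To give a rough sense of what this kind of timbral listening involves, the sketch below is purely illustrative and is not Carey’s MaxMSP patch. It assumes Python with the librosa and numpy libraries, and shows one common way of summarising incoming audio with MFCC features (the kind of representation used in speech recognition) and matching it against a growing database of previously recorded segments.

```python
# Purely illustrative: _derivations itself is a MaxMSP patch, not Python.
# This sketch only gestures at the general idea of timbral listening:
# describe incoming audio with MFCC features (a representation borrowed
# from speech recognition) and match it against a growing database of
# previously recorded segments.

import numpy as np
import librosa


class TimbralDatabase:
    def __init__(self):
        self.features = []   # one summary feature vector per stored segment
        self.segments = []   # the corresponding audio (numpy arrays)

    def add(self, audio, sr):
        """Analyse a recorded segment and store it alongside its audio."""
        mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)
        self.features.append(mfcc.mean(axis=1))  # average timbre over time
        self.segments.append(audio)

    def closest(self, audio, sr):
        """Return the stored segment whose timbre best matches the input."""
        query = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13).mean(axis=1)
        distances = [np.linalg.norm(query - f) for f in self.features]
        return self.segments[int(np.argmin(distances))]
```

In a real-time setting, the matched material would then be processed and transformed before playback, which is the part of the exchange the performer responds to.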

What were some of the key relationships you wanted to build between improvisers in the _derivations project and how would you describe the musical relationship that takes place in a typical performance?

The project was born out of my own desires as a performer, though it didn’t take long to realise I was also developing the software for others. I was really interested in allowing a performer to have a say in the way the program interacted with them. It’s one thing to give a performer the freedom to affect and be affected by software on stage, but it’s another to allow them to curate the boundaries of their interaction. So, I wanted the software to be somewhat customisable, for musicians to be able to choose the kind of material the software would use in a performance. I think of _derivations as a ‘growing possibility’ space. The more material that is stored in its database, the more possibilities it has to navigate through in performance. Extending this past the boundaries of one performance, allowing the software to grow from performance to performance, is what really excites me.
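As a hypothetical illustration of that ‘growing possibility space’ persisting between performances (again, not how the MaxMSP system actually stores its material, and the file name here is invented), the database from the earlier sketch could simply be saved to disk after one session and reloaded before the next:

```python
# Hypothetical continuation of the earlier sketch: persist the analysis
# database so the 'possibility space' keeps growing between performances.
import pickle


def save_database(db, path="derivations_corpus.pkl"):
    """Serialise the feature/segment database after a performance."""
    with open(path, "wb") as f:
        pickle.dump({"features": db.features, "segments": db.segments}, f)


def load_database(db, path="derivations_corpus.pkl"):
    """Reload previously analysed material before the next performance."""
    with open(path, "rb") as f:
        stored = pickle.load(f)
    db.features.extend(stored["features"])
    db.segments.extend(stored["segments"])
```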

Your recently released album _derivations: Human-Machine Improvisations features some of Australia’s most notable improvisers performing with your interactive system. Did anything surprise you about the results of these performances?

There are always surprises, that’s what I love about it! The tracks on the album were collated from various performances recorded in different circumstances, both live and in the studio. What I found interesting was the even split between performers who had rehearsed intensively with the program (Alana Blackburn and Joshua Hyde) and those whose performances were on-the-spot interactions on stage (Antoine Läng and Evan Dorrian). To my ear, the performers who had interacted with the software previously didn’t lose any of the spontaneity of the on-the-spot interactions, and conversely the on-the-spot interactions didn’t lack any coherent sense of interactivity. The surprises often occur when a performer initiates an action in the computer without intending to do so, sending the system into a different space. These ruptures are fascinating, and are what keep the performances moving.

What do you see the role of electronics and software being in music today and what do you hope to accomplish by developing electronic systems for musical improvisation? 

Electronics and software are a ubiquitous part of music making today. Tools have always evolved alongside musicians, both as a product of musical developments and as defining whole styles – consider the forte-piano! Electronics are no different. Today, composers are finding ways of harnessing digital technology to expand their conceptions of what it means to compose. I think there’s a beautiful hybrid space where compositional and performance ideas are being developed and influenced by our machines. More and more, we’re living our lives as cyborgs: consuming, experiencing, and communicating through machines. It seems quite natural for our musical selves to be just as inseparable from technology. For me, programming my own software tools is a way of asking questions about what composition, and improvisation, is. I think you can learn a lot about your own assumptions by programming a computer to perform by itself, just as you learn a lot about interaction strategies by connecting with other musicians on stage.

Check out Ben Carey’s works at www.bencarey.net.

Image supplied.
