Ethno Tekh is a collaboration between Chris Vik (Kinectar) and Brad Hammond (Splash), creating interactive installations and motion-capture-based audio/visual performances. This video and write-up are for our very first Kinect-based A/V performance, which we did at Microsoft’s TechEd 2012 (Australia).
Our system is built primarily with Max and Unity3D, with the music and visuals running on separate computers. There’s a lot of communication between the two systems over OSC, including trigger messages for instrument changes and drops, as well as FFT audio analysis and MIDI-to-OSC conversion to tie the visuals closely to the audio. Read on for more detail or just enjoy the video.
Chris (Music / Performance / Code)
In this performance, we’re using completely custom-built software that forms part of a larger live interactive A/V framework we’re putting together. I’ve personally been performing music live with the Kinect for well over a year, but this was my first performance as a collaboration with visual artist and coder Brad Hammond. My approach was very similar to my past performances, in that I’m completely in control of the music creation, using a Kinect via my software Kinectar, which sends MIDI to Ableton Live. Basically, I control and loop everything in the set, 100% live, except for the drums. Due to the latency inherent in the Kinect hardware, triggering drums with tight timing and consistency is not possible. That latency has made for some interesting restrictions to work within, as you find with any particular instrument, and has in turn led to new and interesting ways to create music within the confines of the device’s abilities.
I have a number of virtual audio instruments I can swap between using a MIDI foot controller, all of which I control in some way with the Kinect and run as VSTi plugins hosted in an Ableton Live set. Each instrument is articulated in a different way, catered to getting the most expressive control out of each sound. I can loop each instrument as I play it, and form a piece of music live by layering the loops.
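Kinectar’s actual mapping logic isn’t shown here, but the basic idea of driving a VSTi from a tracked joint can be sketched as scaling a normalized joint coordinate into a 7-bit MIDI Control Change value. The function names, ranges and channel numbers below are illustrative assumptions, not Kinectar’s internals:

```python
def hand_to_cc(y, y_min=0.0, y_max=1.0):
    """Clamp and scale a normalized hand height into the 0-127 MIDI CC range."""
    t = max(0.0, min(1.0, (y - y_min) / (y_max - y_min)))
    return int(round(t * 127))

def cc_message(channel, controller, value):
    """Raw 3-byte MIDI Control Change message (status byte 0xB0 | channel)."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

# e.g. hand at mid-height on channel 1, mapped to the mod wheel (CC 1)
msg = cc_message(0, 1, hand_to_cc(0.5))
```

Bytes like these would then be sent to the MIDI port that Ableton Live listens on, where the CC can be mapped to any plugin parameter.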
The great thing about starting my work with Brad is that these sonic instruments I play with the Kinect are now brought into the visual world as well. It not only helps people connect to which aspect of the music I am controlling at any given moment, but also serves to create a sensory experience that is greater than the sum of its parts, bringing the audience further into the performance.
An example of one of the A/V instruments is the first one I control in the performance video. At TechEd, futurist and filmmaker Jason Silva was speaking before our performance and introducing us. I had an idea to help segue from Jason’s talk into our performance and hold the theme of the keynote. I have an instrument in my live arsenal which can scratch samples using the Kinect (you can see an example of that audio instrument here). I recorded Jason saying “technology extends our thought, reach and vision”, loaded it into the system, then scratched that sample of his voice to introduce our performance.
For this performance we had 4 A/V instruments:
- Granular Sample Scratcher (the large blue disc/ring that surrounds the character)
- Drum Loop Masher (the audio-reactive yellow “drum ball”)
- Melody (the green ring with the blocks shooting out)
- Wobble Bassline (the “bass balls” with lasers shooting out of them)
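The sample scratcher behaves like scratching on vinyl decks, so one plausible way to drive it, purely as an illustration and not necessarily how Kinectar does it, is to map horizontal hand velocity to a signed playback-rate multiplier, clamped so extreme gestures stay usable:

```python
def scratch_rate(hand_dx, dt, sensitivity=2.0, limit=4.0):
    """Map a horizontal hand movement (hand_dx over dt seconds) to a signed
    playback-rate multiplier; negative values play the sample backwards.
    The clamp keeps wild gestures from producing unusable speeds."""
    rate = sensitivity * hand_dx / dt
    return max(-limit, min(limit, rate))

# A slow rightward drag scrubs forward at double speed...
forward = scratch_rate(0.5, 0.5)
# ...while a fast leftward flick pins to the reverse limit.
backward = scratch_rate(-10.0, 0.1)
```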
After working with Brad on our interactive exhibition public override triChild(), we had a basic framework down for communicating between the sonic and visual components of our work. All of the information sent between our computers runs over OSC. We both share the skeleton data coming from the Kinect and do our own manipulation of it separately. I’ve also built some M4L plugins that send out OSC triggers to let Brad’s computer know which instrument I’m using, how far through the live set I am, and things like triggers for the screen shakes on the drops.
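An OSC trigger message of the kind described here is just a UDP datagram with a padded address string, a type-tag string, and big-endian arguments. The sketch below hand-encodes a basic message per the OSC 1.0 spec; the address pattern, IP and port are made-up placeholders, not our actual message scheme:

```python
import socket
import struct

def _pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as the OSC spec requires."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode a basic OSC message supporting int32, float32 and string args."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        else:
            tags += "s"
            payload += _pad(str(a).encode())
    return _pad(address.encode()) + _pad(tags.encode()) + payload

# Fire-and-forget trigger at the visuals machine (address/port are illustrative)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/instrument/change", 2), ("192.168.0.2", 9000))
```

Because the transport is plain UDP, triggers arrive with minimal overhead, which matters when a screen shake has to land exactly on a drop.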
We started out trying to run the FFT audio analysis on Brad’s machine, but this was problematic and basically unusable due to lag. To solve the problem I ended up building some M4L FFT plugins that analyse the audio on my machine, with one in each channel. The plugins then stream the analysis values to Brad’s computer over OSC. This reduced the lag from around 400ms all the way down to 30-50ms, which is basically a total win.
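The per-channel analysis our M4L plugins perform isn’t published, but the general shape of such a thing is: window a frame of audio, take the magnitude spectrum, and collapse it into a handful of band levels small enough to stream over OSC every frame. A minimal sketch, assuming 8 equal-width bands (the frame size and band count are arbitrary choices here):

```python
import numpy as np

def band_levels(frame, n_bands=8):
    """Collapse one frame of audio into n_bands average spectral magnitudes.
    A Hann window reduces spectral leakage before the real FFT."""
    mags = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    return [float(b.mean()) for b in np.array_split(mags, n_bands)]

# A low sine excites the bottom band; a high sine excites an upper band.
t = np.arange(64)
low = band_levels(np.sin(2 * np.pi * 2 * t / 64))
high = band_levels(np.sin(2 * np.pi * 28 * t / 64))
```

Eight floats per channel per frame is a tiny OSC payload, which is how streaming the pre-computed values beats streaming (or re-analysing) audio on the receiving machine.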
Our intention is to continue working on this framework for both performances and installations, and our performance at TechEd was a great way to kick our project off in the right direction.
Brad (Visuals / Code)
After seeing what Chris had been doing with his performances over the last year and a bit, it was obvious there was a lot of potential to add a visual component and help create an all-encompassing stage presence.
First of all we went through the process of rebuilding some fundamental back-end components to help distribute and share audio and skeletal data between both of our machines. This resulted in us building Ethno Tracker, a stand-alone OSC skeletal tracking tool which can fire the data out to multiple addresses and does some additional processing on the joints, such as velocity, deltas and rotations.
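Ethno Tracker’s joint processing isn’t detailed here, but deriving deltas and velocity from two successive skeleton frames is straightforward: difference the positions and divide by the frame interval. A hypothetical sketch (the tuple layout and function name are assumptions):

```python
def joint_velocity(prev, curr, dt):
    """Given two (x, y, z) positions of one joint dt seconds apart, return
    the per-axis delta and the scalar speed (delta magnitude per second)."""
    delta = tuple(c - p for p, c in zip(prev, curr))
    speed = (delta[0]**2 + delta[1]**2 + delta[2]**2) ** 0.5 / dt
    return delta, speed

# A hand moving 3 units right and 4 up in one second travels at speed 5.
d, s = joint_velocity((0.0, 0.0, 0.0), (3.0, 4.0, 0.0), 1.0)
```

Computing these derived values once in the tracker, rather than in every consumer, is what lets both machines work from the same enriched skeleton stream.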
Once most of the back-end was locked down, I started to flesh out the idea for the visuals. One main consideration for the overall composition was that the screen was massive and sat above Chris while he performed.
Getting started, I created a simple space-like environment and added some abstract geometric elements and a simple techy UI layer. This worked well with the futuristic, techy vibe Chris’s performances already had, and left us a lot of room to work with in terms of animating the performance. For the instrument creation process we analysed the gestures used to control the instruments, as well as the sonic output, to help break them down into a meaningful visual language. For example, the granular scratching instrument that Chris mentioned earlier has a gesture and sonic aesthetic similar to traditional scratching on vinyl decks, which you can see has influenced the design of the large rotating discs and planes for that instrument.
In reality, this performance is only scratching the surface of what we’ve got in mind for our work down the track.