New Gesture Experiments
While experimenting with a range of gesture-to-sound mappings, I’ve come across a little gem:
It was initially designed as an educational tool and is quite simple to use. The only limitation is that you need a current version of Max/MSP. It works with a variety of sensors, including the Kinect and Wii Remote. I’ve been using it to refine ideas for a movement-based vocal controller. Hopefully it can be extended to support more variations and functionality.
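To give a sense of the kind of mapping I’m experimenting with, here’s a minimal sketch in Python (not taken from the tool itself; the function name and parameter ranges are my own illustration) that scales normalized gesture coordinates into pitch and amplitude values:

```python
def map_gesture(x, y):
    """Map normalized gesture coordinates (0.0-1.0) to sound parameters.

    Here x controls pitch (MIDI notes 48-84, i.e. C3 to C6)
    and y controls amplitude (0.0-1.0). These ranges are
    arbitrary choices for the sake of the example.
    """
    if not (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0):
        raise ValueError("gesture coordinates must be normalized to [0, 1]")
    pitch = 48 + round(x * 36)  # linear scaling across three octaves
    amplitude = y
    return pitch, amplitude

# A hand at the centre of the sensor's range:
print(map_gesture(0.5, 0.5))  # (66, 0.5)
```

In practice the interesting work is in choosing these scalings; linear maps like this are a starting point, but curved or gesture-history-dependent mappings often feel more musical.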
I was quite excited by a program that offered artists a chance “to express themselves by ‘translating’ their sonic (instruments, singing, speech), visual and bodily/kinetic patterns or ‘narratives’ into parameters controlling audiovisual synthesis and processing modules.” However, I haven’t managed to get it working yet. It requires a lot of externals, which makes setup quite complicated, though I like the idea behind it.
So I’m back to recording the audio results of my experiments and seeing where that takes me. Here is a short snippet of the live piece I’m working on now: